Working data set

Working data set — Рабочий набор данных (Russian: "working data set").

Краткий толковый словарь по полиграфии (Concise Explanatory Dictionary of Printing), 2010.


See what "Working data set" means in other dictionaries:

  • Working Set Size — In computing, the working set size is the amount of memory needed to compute the answer to a problem. In any computing scenario, but especially high performance computing where mistakes can be costly, this is a significant design criterion for a… …   Wikipedia (a brief illustrative sketch follows this list)

  • Data transformation (statistics) — A scatterplot in which the areas of the sovereign states and dependent territories in the world are plotted on the vertical axis against their populations on the horizontal axis. The upper plot uses raw data. In the lower plot, both the area and… …   Wikipedia

  • Data Validation and Reconciliation — Industrial process data validation and reconciliation, or data validation and reconciliation (DVR) for short, is a technology which uses process information and mathematical methods in order to automatically correct measurements in industrial… …   Wikipedia

  • Data classification (business intelligence) — In business intelligence, data classification has close ties to data clustering, but where data clustering is descriptive, data classification is predictive.[1][2] In essence data classification consists of using variables with known values to… …   Wikipedia

  • Data-flow analysis — is a technique for gathering information about the possible set of values calculated at various points in a computer program. A program's control flow graph (CFG) is used to determine those parts of a program to which a particular value assigned… …   Wikipedia

  • Data center bridging — (DCB) refers to a set of enhancements to Ethernet local area networks for use in data center environments. Specifically, DCB goals are, for selected traffic, to eliminate loss due to queue overflow and to be able to allocate bandwidth on links.… …   Wikipedia

  • Data Intensive Computing — is a class of parallel computing applications which use a data parallel approach to processing large volumes of data, typically terabytes or petabytes in size, and typically referred to as Big Data. Computing applications which devote most of their …   Wikipedia

  • Data Encryption Standard — The Feistel function (F-function) of DES. [Infobox excerpt: General; Designers: IBM; First publis…] …   Wikipedia

  • Working memory — (also referred to as short-term memory, depending on the specific theory) is a theoretical construct within cognitive psychology that refers to the structures and processes used for temporarily storing and manipulating information. There are… …   Wikipedia

  • Data Format Description Language — (DFDL, often pronounced daff-o-dil) is a modeling language from the Open Grid Forum for describing general text and binary data. A DFDL model or schema allows any text or binary data to be read (or parsed) from its native format and to be… …   Wikipedia
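
The cross-references above are truncated excerpts from other dictionaries. As a rough illustration of the first of them ("Working Set Size"), the sketch below computes a working set size over a memory page-reference trace using the classic sliding-window model; the trace, window size, and function name are illustrative assumptions, not part of any entry above.

```python
# Minimal sketch of the working-set idea: the working set at time t is the set of
# distinct pages referenced in the last `window` references, and its size is the
# amount of memory the program "actively" needs at that moment.
# The trace and window size below are made-up illustrative values.

def working_set_size(trace, t, window):
    """Number of distinct pages referenced in the window ending at index t."""
    start = max(0, t - window + 1)
    return len(set(trace[start:t + 1]))

if __name__ == "__main__":
    trace = [1, 2, 1, 3, 2, 4, 4, 1, 5, 2]   # hypothetical page-reference string
    window = 4
    for t in range(len(trace)):
        size = working_set_size(trace, t, window)
        print(f"t={t}  page={trace[t]}  working set size={size}")
```

A larger window generally yields a larger working set size; choosing the window is the usual trade-off between keeping enough memory resident and not overestimating demand.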

