The conferences listed here are major conferences of the area; they have been selected using the following criteria: the notability of the conference has been confirmed by multiple independent sources; for example, it has been mentioned in textbooks [1][2][3] or other sources, [4][5] or it has received a high ranking. [6]
Distributed data processing [1] (DDP) [2] was the term IBM used for the IBM 3790 (1975) and its successor, the IBM 8100 (1979). Datamation described the 3790 in March 1979 as "less than successful." [3][4] IBM used the term to refer to two environments: IMS DB/DC and CICS/DL/I. [5][6]
Distributed Computing is a peer-reviewed scientific journal in the field of computer science, published by Springer. The journal covers the field of distributed computing, with contributions to the theory, specification, design, and implementation of distributed systems.
Distributed Data Management Architecture (DDM) is IBM's open, published software architecture for creating, managing and accessing data on a remote computer. DDM was initially designed to support record-oriented files; it was extended to support hierarchical directories, stream-oriented files, queues, and system command processing; it was further extended to be the base of IBM's Distributed Relational Database Architecture (DRDA).
Distributed Processing Technology Corporation (DPT) was an American computer hardware company active from 1977 to 1999. Founded in Maitland, Florida, DPT was an early pioneer in computer storage technology, popularizing the use of disk caching in the 1980s and 1990s.
Distributed Artificial Intelligence (DAI) is an approach to solving complex learning, planning, and decision-making problems. It is embarrassingly parallel, and thus able to exploit large-scale computation and spatial distribution of computing resources.
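As a minimal sketch of that embarrassingly parallel pattern (in Python, assuming nothing about any particular DAI system), the snippet below farms independent subproblems out to a pool of worker processes; solve_subproblem is a hypothetical stand-in for an agent-local computation.

    from multiprocessing import Pool

    # Hypothetical per-agent subproblem: each call is independent of the
    # others, which is what makes the workload embarrassingly parallel.
    def solve_subproblem(seed: int) -> int:
        return sum(i * i for i in range(seed * 1000))

    if __name__ == "__main__":
        with Pool() as pool:
            # Each subproblem runs in its own process; no coordination or
            # shared state is needed until the results are gathered.
            results = pool.map(solve_subproblem, range(8))
        print(results)

Because the subproblems never communicate, the same structure scales from one machine's process pool to a cluster simply by changing how the work is dispatched.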
Apache Hadoop (/həˈduːp/) is a collection of open-source software utilities for reliable, scalable, distributed computing. It provides a software framework for distributed storage and processing of big data using the MapReduce programming model.
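The MapReduce model is easy to show in miniature. The following single-process Python sketch illustrates the model only, not Hadoop's actual Java API: map_fn emits key-value pairs, a shuffle step groups the values by key, and reduce_fn aggregates each group; Hadoop distributes exactly this division of labor across a cluster.

    from collections import defaultdict

    # Map phase: emit a (word, 1) pair for every word in a line of input.
    def map_fn(line):
        for word in line.split():
            yield (word.lower(), 1)

    # Reduce phase: aggregate all counts observed for a single word.
    def reduce_fn(word, counts):
        return (word, sum(counts))

    def map_reduce(lines):
        # Shuffle: group mapped values by key, as the framework would do
        # across nodes before handing each key to one reducer.
        groups = defaultdict(list)
        for line in lines:
            for key, value in map_fn(line):
                groups[key].append(value)
        return [reduce_fn(key, values) for key, values in groups.items()]

    print(map_reduce(["the quick brown fox", "the lazy dog", "the fox"]))
    # [('the', 3), ('quick', 1), ('brown', 1), ('fox', 2), ('lazy', 1), ('dog', 1)]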
CADP [1] (Construction and Analysis of Distributed Processes) is a toolbox for the design of communication protocols and distributed systems. CADP is developed by the CONVECS team (formerly by the VASY team) at INRIA Rhône-Alpes and connected to various complementary tools.