Research into credible solutions within the Information Continuum has been a 17-year journey that began in the mid-1990s, when the authors of this book were designing new ways to capture, process, analyze, and disseminate high-volume, high-data-rate streams of information (what today would be called a “big data” problem). Data analysis, and the lack of quality user interaction within that process, are therefore not new problems. Users have continually struggled to keep up with the vast volumes and multiple streams of data that must be analyzed: by the time a user had grabbed a time-slice of data, plotted it, and analyzed it, hundreds of gigabytes had already passed through the system. To mount some semblance of a rational attack against this onslaught of data, we created what could be likened to a virtual window into the system, allowing analysts to “walk into the middle” of the data and examine it as it flowed through the system. Analysts could reach out and grab a set of data, rotate it through its axes, and perform automated analysis on it while remaining within the system’s data flow. In this way, analysts could intelligently and rapidly hop between portions of data within multiple data streams to gain pattern and association awareness.
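The idea of grabbing a time-slice from a stream without halting the flow can be sketched with a bounded ring buffer. This is a minimal illustration, not the authors' actual system; the class and method names (`StreamWindow`, `grab_slice`) are invented for the example.

```python
from collections import deque


class StreamWindow:
    """Ring buffer over a high-rate stream: an analyst can 'grab'
    a time-slice while the stream keeps flowing (illustrative only)."""

    def __init__(self, capacity=10_000):
        # Oldest samples fall off automatically once capacity is reached.
        self.buffer = deque(maxlen=capacity)

    def ingest(self, timestamp, value):
        self.buffer.append((timestamp, value))

    def grab_slice(self, t_start, t_end):
        # Copy out the requested window; ingestion is unaffected.
        return [(t, v) for t, v in self.buffer if t_start <= t <= t_end]


# Simulate a stream of 5000 samples; only the last 1000 are retained.
win = StreamWindow(capacity=1000)
for t in range(5000):
    win.ingest(t, t * 0.5)

snapshot = win.grab_slice(4500, 4509)
print(len(snapshot))  # -> 10
```

A real system would replace the list comprehension with an indexed store, but the interaction pattern — snapshot a window, analyze it, return to the flow — is the same.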
In this case, the decay represents the information’s relative value over time.
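The chapter does not specify a functional form for this decay; a common assumption, used purely for illustration here, is exponential decay, where information value halves every fixed interval:

```python
import math


def information_value(v0, decay_rate, t):
    """Illustrative exponential-decay model of information value over
    time (an assumed form, not one given in the text): the value
    halves every ln(2)/decay_rate time units."""
    return v0 * math.exp(-decay_rate * t)


# A report worth 1.0 at t=0, with an assumed half-life of 24 hours:
lam = math.log(2) / 24.0
print(round(information_value(1.0, lam, 24), 3))  # -> 0.5
print(round(information_value(1.0, lam, 48), 3))  # -> 0.25
```

Other monotone-decreasing curves (linear, stepwise) fit the same statement; the point is only that relative value is a function of elapsed time.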
- The Information Continuum
- James A. Crowder, John N. Carbone, Shelli A. Friess
- Springer New York
- Chapter 2