Big Data: Processing large volumes of data efficiently

Extracting additional information through parallelized computing operations for data analysis

As a result of the burgeoning use of sensors and of networked equipment with complex software systems, the flow of data into manufacturing is increasing rapidly. Merely recording and filing such large volumes of data in a structured manner takes considerable time and effort. Initially, instead of bringing about the intended transparency, this can lead to somewhat chaotic conditions. Only when suitable data processing systems are in place and the truly relevant information can be extracted from these expansive volumes of data can knowledge be acquired. The Fraunhofer IPT is therefore developing efficient concepts for rapid data processing and evaluation and is transforming them into applications with real-time capability.

One such example is adaptive optical systems for high-speed microscopy, which permit relevant information to be extracted quickly from copious amounts of measurement data. Parallelizing computing operations is one way of processing such large volumes of data: the fast processors of the graphics card (GPU) are used instead of the main processor (CPU) to evaluate the microscopy data.
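The idea of evaluating independent portions of a frame in parallel can be sketched as follows. This is a simplified illustration, not the Fraunhofer IPT implementation: the tile layout and the per-tile statistic are invented for the example, and a CPU thread pool stands in for the GPU kernels that would do this work in practice.

```python
from concurrent.futures import ThreadPoolExecutor

def evaluate_tile(tile):
    """Toy per-tile evaluation: mean intensity of one image tile.
    In a real system this would be a GPU kernel over raw microscopy data."""
    return sum(tile) / len(tile)

def evaluate_frame_parallel(tiles):
    """Evaluate all tiles of one frame concurrently.
    Each tile is independent of the others, so the workload maps
    directly onto parallel processing units; map() preserves tile order."""
    with ThreadPoolExecutor() as pool:
        return list(pool.map(evaluate_tile, tiles))

if __name__ == "__main__":
    frame = [[10, 20, 30], [40, 50, 60], [5, 15, 25]]
    print(evaluate_frame_parallel(frame))  # one mean value per tile
```

The key property exploited here is that the tiles share no state, so the same pattern scales from a handful of CPU threads to thousands of GPU cores.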

Graphics processors can perform many calculations simultaneously and independently of one another. They can also record the large volumes of data generated in wavefront metrology in real time. Adaptive optics compensate immediately for any interference in the imaging by analyzing the measured deviations and transforming them into specific actuator instructions. In this way, microscopy systems can be developed that adapt correctly during ongoing operation.
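The step from measured deviations to actuator instructions can be sketched as a simple proportional correction loop. This is a minimal sketch under assumed conditions, not the actual control law: it assumes one actuator per measurement zone and an ideal actuator response, and the gain value is illustrative.

```python
def actuator_commands(wavefront_deviation, gain=0.5):
    """Convert measured wavefront deviations (e.g. per-zone errors)
    into corrective actuator strokes. The negative sign applies the
    opposite deformation so that the optic cancels the measured error."""
    return [-gain * d for d in wavefront_deviation]

def closed_loop_step(deviation, gain=0.5):
    """One iteration of the correction loop: compute commands, then
    the residual error that remains after an (idealized) actuator
    response. Repeated iterations drive the residual toward zero."""
    commands = actuator_commands(deviation, gain)
    residual = [d + c for d, c in zip(deviation, commands)]
    return commands, residual

if __name__ == "__main__":
    cmds, res = closed_loop_step([0.4, -0.2])
    print(cmds, res)
```

A real system would replace the per-zone gain with an influence matrix relating actuator strokes to the measured wavefront, but the loop structure is the same.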

Further areas of application for large-scale calculations at the Fraunhofer IPT include signal processing in optical coherence tomography (OCT) and the so-called “Pyramidal View” used to view and analyze big image data.
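Pyramidal viewing of big image data generally rests on precomputing the image at progressively coarser resolutions, so a viewer can fetch only the level matching the current zoom. The following sketch illustrates that principle with simple 2x2 averaging; the function names and the downsampling scheme are illustrative assumptions, not the Fraunhofer IPT "Pyramidal View" implementation.

```python
def downsample(image):
    """Halve the resolution by averaging non-overlapping 2x2 blocks."""
    h, w = len(image), len(image[0])
    return [[(image[y][x] + image[y][x + 1]
              + image[y + 1][x] + image[y + 1][x + 1]) / 4
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

def build_pyramid(image, levels=3):
    """Return progressively coarser versions of the image, from full
    resolution down to a coarse overview level, stopping early once
    the image can no longer be halved."""
    pyramid = [image]
    for _ in range(levels - 1):
        if len(pyramid[-1]) < 2 or len(pyramid[-1][0]) < 2:
            break
        pyramid.append(downsample(pyramid[-1]))
    return pyramid

if __name__ == "__main__":
    img = [[i + j for j in range(4)] for i in range(4)]
    for level in build_pyramid(img):
        print(len(level), "x", len(level[0]))
```

Because each level holds a quarter of the pixels of the one below it, the whole pyramid costs only about a third more storage than the original image while making every zoom level cheap to display.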