Big Data

Everyone is talking about Big Data. Enterprises across the globe are spending significant sums on it and on strategies around it. Yet the people, processes, and technologies needed to understand and solve the complex problem of processing enormous amounts of data remain elusive. We are storing more and more data in staggering amounts. Our fast-paced lifestyles only add to the demand for ever-faster machines that endlessly gather all forms of unstructured, semi-structured, and structured data. We are simply hungry for data, without any foreseeable ability to satisfy our appetite. We are data rich and information poor.

The global data stores are growing exponentially. In 2013, Big Data was projected to produce annual revenue of $187 billion by 2015 (reference: annual report of IBM, 2013). Big Data is a Big Business. Big Data is also a Big Deal, a Big Financial Deal: it can offer big business gains, but hidden costs and complexity present barriers that organizations will struggle to mitigate.

Standards organizations like the National Institute of Standards and Technology (NIST) are working in collaboration with government and industry to define Big Data and the issues associated with it. One overarching issue is security, where significant challenges have yet to be solved. Information security concerns such as availability, back-ups, and disaster recovery are also a big deal in terms of hidden ongoing costs. And while technology firms and numerous vendors across the globe are developing new software tools and services to mine, query, and analyze data stores and make them meaningful, much remains unknown about the payoff of Big Data.

Now is the time to look for and apply proven methods from other sciences to the Big Data discussion. The science of 3D coordinate measurement metrology (3DCMM), a discipline of Dimensional Metrology (DM), has been practiced successfully for over 25 years and offers some of the answers. 3DCMM is the science of calibrating and using physical measurement equipment to quantify physical characteristics of any given object. It embraces and optimizes the value of data.

The purpose of this paper is to serve as a catalyst for a paradigm shift: although we have vast stores of data, analytical projects could instead use the 3DCMM approach, which first identifies the minimal dataset required and then uses an iterative control loop as a feedback mechanism to detect when to stop gathering data. Simply said, more is not always better. If this modular, structured 3DCMM approach were applied to a Big Data strategy, then just enough data, the right data, at the correct location would produce the right conclusions.
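
To illustrate the control-loop idea, here is a minimal sketch in Python. It is an illustration only: the Gaussian data source, the batch size, and the change-in-mean stopping rule are assumptions chosen for clarity, not part of any 3DCMM standard. The loop gathers data in batches and stops as soon as the quantity of interest has stabilized, rather than gathering everything available.

```python
import random
import statistics

def gather_batch(batch_size=50):
    # Hypothetical data source; stands in for any real acquisition step.
    return [random.gauss(100.0, 5.0) for _ in range(batch_size)]

def collect_until_stable(tolerance=0.05, max_batches=200):
    # Feedback loop: after each batch, check whether the quantity of
    # interest (here, the running mean) still moves by more than
    # `tolerance`. Once it stops moving, further data adds cost, not insight.
    samples = []
    previous_mean = None
    for _ in range(max_batches):
        samples.extend(gather_batch())
        current_mean = statistics.mean(samples)
        if previous_mean is not None and abs(current_mean - previous_mean) < tolerance:
            break  # estimate has stabilized: "just enough data"
        previous_mean = current_mean
    return samples, current_mean

data, estimate = collect_until_stable()
print(f"stopped after {len(data)} samples; estimate = {estimate:.2f}")
```

The simple change-in-mean threshold used here stands in for whatever convergence criterion a real project would derive from its measurement requirements; the point is that the stopping condition is defined before collection begins.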

The endpoint is then risk elimination, reduced financial costs, ensured customer satisfaction, and production of exactly the requested product.

Applying 3DCMM to Big Data ensures that data stewardship and management are interwoven into the company culture. These characteristics are embraced by outstanding companies that have zero tolerance for waste, errors, or defective outcomes.

Again, be careful: more datasets do not automatically produce a better result. The key is to adapt the best evaluation procedure to the identified requirements. People often assume that new approaches like Big Data automatically produce a better outcome; this is a misconception.
