10 Big Data Predictions for Automation and Control in 2017

This year will provide increasing clarity and direction on Big Data technologies guiding the Industrial Internet of Things.

The emergence and relevance of Big Data in the world of automation and control have been a work in progress for the last five years, but 2017 could represent a major step forward and a tipping point. Certainly, rosy predictions about more focus on Big Data, more projects and spending, and new cloud-based solutions aren't a big surprise. But the reality is that Big Data initiatives are no longer new technology, and the path toward their potential impact is becoming increasingly clear.

A Datamation blog offered an excellent summary of Big Data predictions for 2017 that can be expanded and focused to specifically reflect developments in the Industrial Internet of Things (IIoT).  
 

According to the Datamation article, “For 2017, the primary trends in Big Data will revolve around refining enterprises' core big data capabilities. They are looking for ways to analyze more data, more quickly. Having seen the payoff from their initial investments in big data technology, they are looking to expand their big data projects to achieve even greater financial results.”

 

Although the world of industrial automation and control has its own specific set of objectives and priorities, comparisons to general computing market trends are still valid. So here is an updated list of 10 Big Data predictions for automation and control in 2017:


1. More data than ever before
2. More projects and spending
3. Cloud-based solutions vs. on-premises
4. Rise of artificial intelligence and machine learning
5. Growth of predictive analytics
6. Increased focus on real-time analysis
7. More staff with Big Data skills, especially data scientists and database professionals
8. New tools that enable automation professionals to self-service their own needs
9. Increased focus on privacy and security
10. More productivity


None of the first three predictions is a big surprise. Automation and control vendors are reporting extremely high client interest in, and focus on, developing IIoT projects. Budgets reflect that commitment, backed by senior management, to create more data-driven business and manufacturing solutions.


Perhaps the biggest development of 2016, in terms of impact on industrial control, was a burst of activity and solutions in the area of cloud computing. OPC UA emerged as a connectivity standard, and vendors have responded by implementing its publish-subscribe (PubSub) model as an enterprise-wide connectivity solution.
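
To make the connectivity point concrete, here is a minimal sketch of a single tag read using the open-source python-opcua package. The endpoint URL and node id are hypothetical placeholders, and a full IIoT deployment would more likely rely on the PubSub profile mentioned above rather than one-off client reads.

    # Minimal sketch: read one machine tag over OPC UA with python-opcua.
    # The endpoint and node id below are hypothetical placeholders, not
    # references to any specific vendor product.
    from opcua import Client

    ENDPOINT = "opc.tcp://192.168.0.10:4840"      # hypothetical controller endpoint
    NODE_ID = "ns=2;s=Line1.Motor1.Temperature"   # hypothetical tag address

    client = Client(ENDPOINT)
    try:
        client.connect()
        node = client.get_node(NODE_ID)
        value = node.get_value()                  # single synchronous read
        print(f"Current value of {NODE_ID}: {value}")
    finally:
        client.disconnect()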


The next set of three predictions focuses on machine learning, predictive analytics and real-time analysis, areas where solutions specifically for automation and control are under active development. Predictive maintenance built on technologies such as condition monitoring continues to mature, but widespread adoption is still in the future. Machine learning and real-time analysis are also areas where vendors see considerable potential to deploy more advanced control algorithms and to make effective use of the increased processing power available in the newest machine controllers.
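
As one illustration of what predictive-maintenance analytics can look like in practice, the sketch below flags anomalous vibration readings with an unsupervised scikit-learn model. The data is synthetic, and the sensor, units and contamination setting are assumptions for illustration only; the article does not describe any particular vendor's implementation.

    # Illustrative condition-monitoring sketch: flag anomalous vibration
    # readings with an unsupervised model. Data here is synthetic; a real
    # deployment would train on historical sensor data from a historian.
    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(42)

    # Simulated "healthy" vibration RMS readings (mm/s) used for training
    healthy = rng.normal(loc=2.0, scale=0.3, size=(1000, 1))

    # New readings, a few of which drift upward as a bearing degrades
    new_readings = np.vstack([
        rng.normal(2.0, 0.3, size=(50, 1)),
        rng.normal(4.5, 0.5, size=(5, 1)),   # simulated fault signature
    ])

    model = IsolationForest(contamination=0.05, random_state=0).fit(healthy)
    flags = model.predict(new_readings)      # -1 = anomaly, 1 = normal

    for i, (reading, flag) in enumerate(zip(new_readings.ravel(), flags)):
        if flag == -1:
            print(f"Reading {i}: {reading:.2f} mm/s flagged for maintenance review")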


One of the keys to success is the development of IoT application solutions: software tools that enable better data analysis and the ability to create actionable information. One trend in automation and control software is the development of tools that make data analysis accessible to automation professionals themselves, so they can serve their own needs without waiting on dedicated data science resources.
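
To suggest what that kind of self-service analysis might look like, here is a small pandas sketch that rolls raw historian samples up into hourly attention flags. The file name, column names and one-minute sampling interval are hypothetical, included only to show how little code such a first pass can take.

    # Sketch of self-service data analysis on an exported historian file.
    # Assumes a hypothetical CSV with one-minute samples and these columns:
    # timestamp, throughput (units/min), downtime_flag (1 if the line was down).
    import pandas as pd

    df = pd.read_csv("line1_history.csv", parse_dates=["timestamp"])
    df = df.set_index("timestamp")

    # Roll the one-minute samples up to hourly figures
    hourly = df.resample("1H").agg({"throughput": "mean", "downtime_flag": "sum"})
    hourly = hourly.rename(columns={"throughput": "throughput_avg",
                                    "downtime_flag": "downtime_minutes"})

    # Surface only the hours that warrant a closer look
    attention = hourly[hourly["downtime_minutes"] > 10]
    print(attention.to_string())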

Comments

I see a big problem in the future with all of that data. Just imagine millions of billions of terabytes of data, somehow stored someplace. How would it be organized, and how would it ever be analyzed? It would be more like the individual grains of sand along the ocean. So while some are touting how useful it would be, they seldom are able to say more than that they will be happy to sell the software to attempt to make something out of it, maybe. The 500-word limit STINKS!!!!!

One big question is where are all of those millions of billions of gigabytes of data going to be stored? AND, what is going to provide the funding needed to pay for the processing of that data in the hopes of learning anything of value? Who is going to provide the artificial intelligence to do the analysis of this data, and where will the funding for that effort come from? Who will understand the process producing the data well enough to have the insight to know where to look for value?

Has anyone considered that restricting the length of comments is aimed at those with very short attention spans and the inability to focus for more than a few seconds. Most good engineers solve problems through extensive focus on analysis of the possible solutions. A short attention span is not part of good engineering.
