What Are Schneider Electric Doing With The IoT?

We speak to Stephen Dillon about his work at the utilities giant


Ahead of his presentation at the Big Data Innovation Summit in Boston on September 8 & 9, we spoke to Stephen Dillon about his work as a Sr Engineering Fellow/Big Data Architect at Schneider Electric.

Stephen is a senior member of Schneider Electric's engineering fellow program with over 17 years of experience in the data engineering field. He has been working with IoT since 2009, Big Data technologies since 2011, and Fast Data technologies since 2012, and has spoken at numerous conferences and events on the subjects of in-memory databases, MongoDB, and NoSQL. He is currently working on IoT innovation with a focus on Fast Data technologies for Schneider Electric's Global Solutions team, which is responsible for developing the company's IoT platform.

Innovation Enterprise: How do you see the use of data in companies developing over the next 5 years?

Stephen: We can expect streaming, in-memory databases, and graph analytics to become commonplace as Fast Data solutions are embraced in support of real-time decisioning. This will prepare companies to implement predictive analytics and deep learning, which will be expected to be part of enterprise architectures, especially in the IoT industry. Today, predictive analytics is the Holy Grail for companies, but most simply are not ready, for a multitude of reasons: their use cases are not understood well enough, their employees are not up to speed on the technical topics, they've not identified the value proposition for investing in this area, or they're still early in the adoption process of Big Data solutions. Of course, we see some companies at the forefront of IoT today that are utilizing predictive analytics, but the gap between these companies and the rest is wide.

What should companies do to improve their data security in the wake of the several widely reported hacks of the last 2 years?

Learn, adapt, and be aware. Companies must invest not only in a technological effort but also in social awareness. From the technology perspective, companies must be able to react faster to potential threats, especially as devices become more connected to networks and to other devices. It is fair to say that new, unforeseen threats will emerge as a result, which human intervention and today's reactionary responses alone cannot sufficiently handle. Machine learning is one area companies can look to that can play a role in preventing many such breaches, or at least in identifying threats and intervening as they happen, rather than after they've happened. But even these technical approaches only go so far. If people are not aware of the possible threats of social engineering, one employee can cause a multi-million dollar system's security to be breached.

What is your outlook on the amount of regulation of data from governments?

There are so many concerns to date regarding the security of data, especially in the IoT domain, that the amount of regulation can only increase. Where and how that regulation will be implemented is the real question the industry is watching. Simply locking data down, encrypting it in flight and at rest, and enforcing in-depth access control practices will not be sufficient. Even the greatest banks with the most secure of vaults are still guarded by other security measures. However, if data becomes too difficult to work with, due to too many or too harsh constraints, its full potential will not be achieved. Security must be fluid, and this is where artificial intelligence can play a role.

What advice would you give to a company making their first moves into implementing a new data program?

First and foremost, you must understand your use case(s). Your use cases will define the value for your business. If you cannot identify the value, you may want to question the purpose of your data program's initiative. Without understanding the value, you will have no way to measurably determine success. Knowing your use cases well will also help narrow down your technology choices. In most cases there is no absolutely correct answer regarding which of the myriad technologies will meet your needs, but there are absolutely incorrect answers if applied to the wrong use case. Be comfortable with the fact that there will be tradeoffs between different solutions, and only you can weigh the pros and cons against your needs.

Second, technology should not drive your business decisions, but eventually you must implement technologies that will support your business. There is no one-size-fits-all solution, and you must be ready to cultivate and acquire the skills necessary to support your data initiative with a mixture of specialists and generalists. These must be open-minded technologists who can adapt to evolving use cases and technologies. Otherwise you'll have a team ill-prepared to implement a solution when the technologies must change.

Finally, be willing to fail fast and fail forward. This is a practice I observe many in the industry are still afraid to embrace. If something is not working, or your use cases have evolved beyond the usefulness of the chosen technologies, it is a foolish and stubborn person who will refuse to make the necessary changes.

Stephen will be speaking at the Big Data Innovation Summit on September 8 & 9. 

