
When Security Fails, Data Integrity Has Your Back

For many good reasons, security is an underlying obsession in any connected digital system. Most security provisions attempt to tightly control access, resist illicit intrusion, and safeguard identities and valuable data from corruption and illegitimate extraction. In short, security keeps bad actors out while serving the needs of legitimate users. Except it doesn’t, as headlines vividly demonstrate, week after week, month after month, year after year. Just a few weeks ago, the New York Times and other media outlets described a breach affecting 50 million Facebook users as “…the largest in the company’s 14-year history,” as though it were simply one of many breaches to be expected. The Financial Times reported on October 16th how connected home appliances can be targeted en masse with instruction sets that spike electric power demand, inducing regional failures in the grid. How can systems know which instructions are genuine?

At Sixgill, we think security companies should continue to innovate, hardening and improving their systems. We think anyone responsible for operating IoT systems or shipping connected products into the marketplace should strive for better security. But we also know those efforts won’t succeed for long, and that the cat-and-mouse rivalry between security and intrusion will never be altogether won. With Sixgill providing the data services foundation for anything actionable in the Internet of Everything, we are especially aware of the vulnerabilities of action-triggering data traversing multi-owner networks where no one has responsibility for end-to-end security. To be blunt, sensor-populated networks are intrinsically insecure.

But there is an answer. Once it’s obvious that bad actors will, sooner or later, force their way into any system they want to, focus shifts to the integrity of the data in transit itself. At least it has for us. So, in critical sensor-dependent applications, where a plethora of networked emitters produces streams of data gathered and analyzed for automatic action, the question becomes: can people and systems trust that the emitted data, the transmitted data, the ingested data, and the acted-upon data are all the same? Today, you mostly cannot be sure. We want to change that.
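To make that question concrete, here is a minimal sketch (in Python, with hypothetical names, not our product’s API) of the simplest possible integrity check: fingerprint a reading when it is emitted and recompute the fingerprint at ingestion. A mismatch means the payload changed somewhere in between.

```python
# Minimal sketch (illustrative only): fingerprint a sensor payload at emission
# and re-check the fingerprint at ingestion to detect in-transit changes.
import hashlib
import json

def digest(payload: dict) -> str:
    # Canonical JSON (sorted keys) so the same content always hashes identically.
    canonical = json.dumps(payload, sort_keys=True, separators=(",", ":")).encode()
    return hashlib.sha256(canonical).hexdigest()

# Emitter side: send the payload together with its digest.
reading = {"device_id": "pump-17", "ts": 1539700000, "temp_c": 71.3}
message = {"payload": reading, "digest": digest(reading)}

# Ingestion side: recompute and compare before acting on the data.
received = message  # in reality this arrives over the network
assert digest(received["payload"]) == received["digest"], "payload changed in transit"
```

Of course, a bare digest only detects accidental corruption: an attacker who alters the payload can simply recompute the hash. The digest itself has to be protected somewhere tamper-proof, which is exactly where the rest of this post is headed.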

Distributed Ledger Technology (DLT), or the Kleenex-like shorthand “Blockchain,” gives us a database scheme for immutability by distributing data storage away from single-vulnerability repositories and shining sunlight on all the copies. But every DLT variant scales poorly for high-performance data management. In our world, sensor-dependent applications are mostly deployed to support real-time operations, sometimes on the basis of very high volumes and velocities of time-series data. No real-time operation can be supported by today’s DLTs. To solve this, Sixgill embarked on a data integrity initiative almost a year ago: apply the immutability of DLT to sensor-emitted data pipelines without precluding real-time operations performance.
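One common way to reconcile ledger immutability with high-volume time-series data, offered here as the general shape of the approach rather than our specific protocol, is to anchor a single Merkle root per batch of readings instead of writing every reading to the ledger. The sketch below uses illustrative names only:

```python
# Illustrative only: amortize on-chain cost by anchoring one Merkle root per
# batch of readings rather than one ledger write per reading.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    """Fold leaf hashes pairwise until a single root remains."""
    if not leaves:
        return h(b"")
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:               # duplicate last node on odd-sized levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

# One root covers thousands of time-series samples; only the root would be
# anchored on a ledger, while the raw readings stay in a high-throughput store.
batch = [f"pump-17,{ts},71.{ts % 10}".encode() for ts in range(1539700000, 1539701000)]
root = merkle_root(batch)
print(root.hex())
```

Any single reading can later be proven to belong to the anchored batch with a short inclusion proof, so the ledger stays small while the data pipeline keeps its throughput.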

I’ll go so far as to say that the future of the Internet of Everything (and its IoT subset) depends on data integrity. To be actionable, data must be truthful, accurate, complete, retrievable and verifiable. But there are many impediments to data integrity, chief among them enterprise reliance on centralized cloud resources. Centralized clouds present more opportunities for data interception and are subject to third-party vulnerabilities outside the enterprise’s control. They inhibit real-time access to data due to hard constraints on global bandwidth capacity. And they create a highly siloed data landscape that makes holistic data analysis difficult, if not impossible.

Edge computing certainly mitigates these vulnerabilities. Sixgill incorporates edge services in Sense for this and many other reasons we will write about shortly. For data integrity, IoT edge computing significantly reduces the opportunities for data interception. It works around bandwidth limitations and combats latency to maintain real-time performance. It enables distributed compute capacity that supports additional data integrity measures. But even that needs another ally: Distributed Ledger Technology. DLT can be applied at the edge to log, track and verify data generated by devices and sensors. The tamper-proof nature of distributed ledgers ensures that data emitted, transmitted and ingested remains unchanged and is auditable. For the first time we can ensure the integrity of data in intrinsically insecure systems, and we can do so without relying on trusted third parties or temporary gains in security.
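As an illustration of what “DLT at the edge” can look like (a generic sketch, not the Sense implementation), an edge gateway can keep a hash-chained, append-only log of the readings it forwards; any later edit to an earlier entry breaks the chain and is immediately detectable.

```python
# Sketch of a tamper-evident edge log (illustrative, not a product API):
# each entry commits to the previous entry's hash, so rewriting history
# breaks the chain and fails verification.
import hashlib
import json
import time

class EdgeLog:
    def __init__(self):
        self.entries = []
        self.prev_hash = "0" * 64          # genesis value

    def append(self, payload: dict) -> dict:
        entry = {"ts": time.time(), "payload": payload, "prev_hash": self.prev_hash}
        entry_hash = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        entry["hash"] = entry_hash
        self.entries.append(entry)
        self.prev_hash = entry_hash
        return entry

    def verify(self) -> bool:
        """Recompute every hash; any edit to an earlier entry is detected."""
        prev = "0" * 64
        for entry in self.entries:
            body = {k: v for k, v in entry.items() if k != "hash"}
            recomputed = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
            if entry["prev_hash"] != prev or recomputed != entry["hash"]:
                return False
            prev = entry["hash"]
        return True

log = EdgeLog()
log.append({"device_id": "pump-17", "temp_c": 71.3})
log.append({"device_id": "pump-17", "temp_c": 72.1})
assert log.verify()
```

Periodically anchoring the head of such a log to a distributed ledger is what turns a locally tamper-evident record into one that is auditable beyond the edge node itself.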

With DLT and edge computing combined to underpin our scheme for data integrity, what’s left is figuring out how to deliver it without disabling real-time operations performance. We use DLT and edge computing to verify device identity and sensor inputs, which boosts confidence in the authenticity and accuracy of the data. By focusing on this alpha problem, verifying identity and inputs at the source, we can turn the growing population of devices and sensors from a security liability into a data integrity asset. And as the network grows, we can leverage economies of scale to enable efficient data storage and real-time data processing. But all of this only counts for something if it works in the moment.
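Here is a hedged sketch of the device-identity half of that idea, using Ed25519 signatures from the widely available cryptography package; the device names and provisioning flow are illustrative, and the actual scheme may differ.

```python
# Illustrative device-identity check: a device signs each reading with its
# private key, and ingestion verifies the signature against the registered
# public key before trusting the data.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature
import json

# Provisioning: each device holds a private key; its public key is registered
# (for example, anchored on a ledger) as the device's identity.
device_key = Ed25519PrivateKey.generate()
registered_pubkey = device_key.public_key()

# Device side: sign every reading it emits.
reading = json.dumps({"device_id": "pump-17", "temp_c": 71.3}, sort_keys=True).encode()
signature = device_key.sign(reading)

# Ingestion side: accept the reading only if the signature matches the
# registered identity, tying the data to a known, authentic emitter.
try:
    registered_pubkey.verify(signature, reading)
    print("reading accepted: emitter identity verified")
except InvalidSignature:
    print("reading rejected: unknown or spoofed emitter")
```

Signature verification is cheap enough to run per message at the edge, which is why identity checking does not have to trade away real-time performance.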

Our answer is to implement an on-chain/off-chain schema in which our control layer is built on proven, secure public blockchains while the data layer is deployed off-chain. You will hear more specifics as we get closer to product launch, but the implementation has all the hallmarks of Sixgill’s intrinsic biases in favor of open architecture, portability, and data ecumenism. Our coming data integrity services will run on any public blockchain, or several at once, and will be freely movable between them, just as Sense can be deployed to any cloud service or enterprise data center without dependency on any specific cloud’s functions.
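In the abstract, the write path of such a schema looks like the sketch below; the in-memory store and the anchor_on_chain function are placeholders rather than any specific chain’s or cloud’s API.

```python
# Hedged sketch of the on-chain/off-chain split: raw data lives off-chain in
# whatever store fits, and only a compact control record (its hash) is anchored
# on a public chain. The anchoring call below is a placeholder.
import hashlib
import json

off_chain_store = {}          # stand-in for a distributed store, cloud bucket, or on-prem DB

def anchor_on_chain(control_record: dict) -> None:
    # Placeholder: in practice this would submit a transaction to whichever
    # public blockchain the control layer runs on.
    print("anchored:", control_record)

def write(payload: dict) -> str:
    blob = json.dumps(payload, sort_keys=True).encode()
    digest = hashlib.sha256(blob).hexdigest()
    off_chain_store[digest] = blob                              # data layer: off-chain
    anchor_on_chain({"digest": digest, "bytes": len(blob)})     # control layer: on-chain
    return digest

ref = write({"device_id": "pump-17", "ts": 1539700000, "temp_c": 71.3})
```

Because only digests and small control records go on-chain, the data layer keeps the throughput and privacy characteristics of a conventional store while inheriting the ledger’s immutability guarantees.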

To summarize: we see sensor networks as intrinsically insecure. We believe bad actors will breach any system, and that what can be protected is the integrity of transmitted and ingested data. You and your systems must be able to trust the immutability of the emitted, transmitted, ingested and acted-upon data, or be able to reliably recognize data that has been compromised and prevent action on corrupted versions of it. To ensure the integrity of sensor data driving operations, we leverage public blockchains for their strengths (decentralized, trustless, immutable) and rely upon off-chain mechanisms to address their limitations (storage, latency, privacy). Our Hybrid Protocol separates the control and data layers: the Control Layer is powered by a public blockchain, and the Data Layer is off-chain, with data storage that can be distributed (ideal), cloud-based, or on-premises.
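The corresponding read path, again as a generic sketch rather than our protocol, refuses to act on any data that no longer matches its anchor:

```python
# Hedged read-path sketch: recompute the digest of off-chain data and compare
# it against the digest anchored on-chain before any action is taken on it.
import hashlib
import json

def read_verified(blob: bytes, anchored_digest: str) -> dict:
    """Return the payload only if it still matches the digest anchored on-chain."""
    if hashlib.sha256(blob).hexdigest() != anchored_digest:
        raise ValueError("data does not match its on-chain anchor; refusing to act")
    return json.loads(blob)

# Example: a stored reading and the digest that was anchored when it was written.
blob = json.dumps({"device_id": "pump-17", "temp_c": 71.3}, sort_keys=True).encode()
anchored = hashlib.sha256(blob).hexdigest()
payload = read_verified(blob, anchored)   # a tampered blob would raise instead
```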

How is your sensor data services platform solving this problem? From Sixgill, more to come.