
The New Era of the Sensing Application

Statista reports there are over 23 billion emitting devices of all types connected to the internet today. Other estimates run both lower and higher, but even the lowest match or exceed the human population, and the upper estimates exceed Statista’s. What’s certain is that sensors are becoming smarter and cheaper, and more numerous for it. We’re headed from billions to trillions over the next decade or two.

Your mobile phone aggregates many sensors. The camera is a subsystem that includes an imaging sensor. The microphone, the accelerometer, and the various radios are all sensors. Smart watches can sense heartbeat and other biometrics. The industrial world is plastered with temperature, humidity, light, sound, location, vibration, infrared, ultraviolet and RF sensors, among others. The label “IoT” has been hung on this phenomenon, but we think software is what makes the Internet of Things valuable. From a software standpoint, where the emitted data from connected things gets its value, IoT is just an enabler for what I call the New Era of the Sensing Application. And it’s not things alone. To us, people are data generators too, which is why we speak of the Internet of Everything (IoE).

Broadly speaking, computer applications in business began with the first generation of Archival applications. “Let’s get off paper, for filing and retrieval.” The second generation of computer applications was Transactional. “Let’s track deals: record trades, sales, contracts, etc. as close to real-time as possible so we know where our business stands.” Even a week or a day was better than what manual systems and practices could deliver. Most people reading this experienced business computing this way for most of their working lives. But circa 2000, after Web 1.0 became established and Web 2.0 was on the horizon, the third great generation of computer applications emerged: the Social application. Social applications could also be Transactional and Archival, but they added a community context and user-generated content to doing business.

A few things are common to this evolution of application generations. 1/ Each new generation does not wipe away its predecessor(s), nor render them less important; 2/ each new generation tends to produce unprecedented amounts of data, exceeding the volume of prior generations; 3/ each new generation is enabled by new data sources; and 4/ each new generation more closely approximates real-time utility.

Which brings me to sensor data and the New Era of the Sensing Application for the IoE. Sensing applications are enabled by the explosive proliferation of sensors throughout our world. Because sensors generally emit data frequently and continually (whether that data is captured or not), they collectively “sniff” the world to see, hear, feel… sense. Sensors make known what they sense by emitting time-series data, and when those data streams are organized and analyzed by software, they enable us to understand circumstances that we would otherwise miss, or become aware of too late.
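To make “time-series data” concrete, here is a minimal sketch in Python (not Sixgill’s data model; the sensor name and values are invented) of what a sensor stream looks like to software: timestamped readings tied to a sensor, kept and analyzed in time order.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Reading:
    sensor_id: str       # e.g. "warehouse-7/temp-03" (hypothetical identifier)
    timestamp: datetime  # when the value was observed
    value: float         # the sensed quantity, in the sensor's native unit

# A stream is simply an ordered sequence of readings from one sensor.
stream = [
    Reading("warehouse-7/temp-03", datetime(2019, 5, 1, 12, 0, tzinfo=timezone.utc), 21.4),
    Reading("warehouse-7/temp-03", datetime(2019, 5, 1, 12, 1, tzinfo=timezone.utc), 21.5),
    Reading("warehouse-7/temp-03", datetime(2019, 5, 1, 12, 2, tzinfo=timezone.utc), 27.9),
]

# Software adds the value: analyzing the stream over time reveals change,
# e.g. a minute-over-minute delta that hints something in the environment shifted.
deltas = [round(b.value - a.value, 1) for a, b in zip(stream, stream[1:])]
print(deltas)  # [0.1, 6.4] -- the jump at 12:02 is the part worth noticing
```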

But the Sensing application has requirements. First, it needs underlying data services that are optimized, end-to-end, for efficient processing of time-series rather than record-oriented data. Sensing applications are closely associated with real-time operations, so the underlying data system cannot impose delay or aggravate latency in other parts of the data network. In fact, the underlying data services must be architected to actively attack latency and to preserve application functionality when connectivity is interrupted. Sensing applications will also generate a lot of “normal” data about normal things. The temperature is what it’s expected to be. Vibration is absent. Humidity isn’t spiking. Not many people have crossed a boundary. That drone hasn’t wandered into restricted airspace. With huge device counts and vast aggregate data volumes, most generated data will be unexceptional, so the decisive task for Sensing applications is governance. This requires a data system that can ingest and organize data in real time and then immediately identify the exception events and data intersections that demand action.
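As a rough illustration of that governance idea, the sketch below uses a deliberately simple threshold rule (not Sixgill Sense’s actual rule engine; the sensor names and bounds are invented) to ingest a stream of readings, let the unexceptional majority pass silently, and surface only the readings that demand action.

```python
from typing import Iterable, Iterator, NamedTuple

class Reading(NamedTuple):
    sensor_id: str
    timestamp: float   # seconds since epoch
    value: float

def exceptions(readings: Iterable[Reading],
               low: float, high: float) -> Iterator[Reading]:
    """Yield only readings that fall outside the expected band."""
    for r in readings:
        if r.value < low or r.value > high:
            yield r    # everything in-band is "normal" and never surfaced

# Usage: a stream where almost everything is unexceptional.
stream = [
    Reading("freezer-12", 1556712000.0, -18.2),
    Reading("freezer-12", 1556712060.0, -18.1),
    Reading("freezer-12", 1556712120.0, -4.5),   # the one event that matters
]
for event in exceptions(stream, low=-25.0, high=-15.0):
    print(f"ALERT {event.sensor_id} @ {event.timestamp}: {event.value}")
```

A real system would evaluate richer rules over many streams at once and at ingest time, but the principle is the same: organize the data as it arrives, then pass along only the exceptions.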

This is what Sixgill Sense is built for — to master the data wrangling for the New Era of the Sensing Application. Sense supports the full spectrum of sensor-informed applications; it scales and performs for real-time operations; it’s optimized for time-series data management through and through; and it automates governance in the data deluge of the Internet of Everything for everyone who needs it.