In today's connected environment, it is easy to fall into the trap of believing that every device, and every application, has access to a reliable, low-latency network connection.
While this may generally be true, for some applications it is not enough. Consider for a moment how often your 3G/4G mobile internet connection drops out. So, what happens when IoT-enabled devices suffer from the same lack of a robust network connection?
In most scenarios, built-in fault tolerance takes up the slack. But in some situations, such as high volumes of time-series data transferred at short intervals, even our best efforts are not enough: the time sequence can be broken.
Enter the Cloud
Time-series data normally needs to be received sequentially. For example, weather monitoring equipment might send current weather conditions once a minute. The entire sequence of time-series data can be used to visualize weather pattern history and potentially predict future weather.
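To make the idea concrete, here is a minimal sketch of such a feed. The field names and one-minute interval are assumptions for illustration; the point is that each record carries its own timestamp, so the integrity of the sequence can always be checked downstream.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical one-reading-per-minute weather feed.
start = datetime(2024, 1, 1, tzinfo=timezone.utc)
readings = [
    {"ts": start + timedelta(minutes=i), "temp_c": 20.0 + 0.1 * i}
    for i in range(5)
]

# The series is intact only if consecutive timestamps are one minute apart.
intact = all(
    b["ts"] - a["ts"] == timedelta(minutes=1)
    for a, b in zip(readings, readings[1:])
)
print(intact)  # True
```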
By sending this time-series data to the cloud first, it is possible to improve the reliability of the data stream.
This improves reliability in two main ways. Firstly, cloud providers are often much more robust, offering an always-on service with full fault tolerance and failover capabilities.
Secondly, an intelligent dataflow pipeline acts as a pre-processor for incoming time-series data before it is sent to cloud storage, actively monitoring the data stream and correcting errors in the time sequence.
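A minimal sketch of such a pre-processing step might look like the following. The function name, record layout, and one-minute interval are assumptions, not the actual pipeline implementation; it restores timestamp order, drops duplicates, and flags gaps for later repair.

```python
from datetime import datetime, timedelta, timezone

def repair_sequence(records, interval=timedelta(minutes=1)):
    """Hypothetical pre-processor: sort out-of-order records by
    timestamp, drop duplicate timestamps, and flag any gaps."""
    ordered = sorted(records, key=lambda r: r["ts"])
    deduped = []
    for r in ordered:
        if not deduped or r["ts"] != deduped[-1]["ts"]:
            deduped.append(r)
    # A gap exists wherever consecutive readings are further apart
    # than the expected reporting interval.
    gaps = [
        (a["ts"], b["ts"])
        for a, b in zip(deduped, deduped[1:])
        if b["ts"] - a["ts"] > interval
    ]
    return deduped, gaps

t0 = datetime(2024, 1, 1, tzinfo=timezone.utc)
# Records arrive out of order, with the t0 + 2 min reading missing.
raw = [
    {"ts": t0 + timedelta(minutes=1), "temp_c": 20.1},
    {"ts": t0, "temp_c": 20.0},
    {"ts": t0 + timedelta(minutes=3), "temp_c": 20.3},
]
clean, gaps = repair_sequence(raw)
print([r["ts"].minute for r in clean])  # [0, 1, 3]
print(len(gaps))  # 1
```

In a real deployment this step would run before writes to cloud storage, so that only an ordered, audited stream is persisted.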
Additional Benefits of the Cloud
Of course, once data is stored in the cloud, it can be accessed by a vast array of tools. For example, analytics applications can easily import the data, opening up the possibility of near real-time analytics on data sent from potentially hundreds or thousands of devices.
Additionally, the cloud platform itself will usually come with a full feature set for transaction logging. In this case, that provides the ability to check whether time-series data has been received correctly; if something is out of sequence, it can be repaired.
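Such an audit can be as simple as comparing the receipt log against the expected schedule. The function and log format below are illustrative assumptions, not a specific cloud provider's API:

```python
from datetime import datetime, timedelta, timezone

def missing_intervals(received, start, end, interval=timedelta(minutes=1)):
    """Hypothetical audit over a receipt log: list every expected
    timestamp between start and end that never arrived."""
    seen = set(received)
    missing = []
    t = start
    while t <= end:
        if t not in seen:
            missing.append(t)
        t += interval
    return missing

t0 = datetime(2024, 1, 1, tzinfo=timezone.utc)
# Receipt log shows the t0 + 2 min reading was never logged.
log = [t0, t0 + timedelta(minutes=1), t0 + timedelta(minutes=3)]
gaps = missing_intervals(log, t0, t0 + timedelta(minutes=3))
print([g.minute for g in gaps])  # [2]
```

Each missing timestamp can then be re-requested from the device or interpolated, depending on the application's tolerance for gaps.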
Importing Historical Data
Finally, we come to the concept of using a cloud platform to build a reliable silo of historical time-series data.
Historical data can be imported into the cloud by first being parsed by the dataflow pipeline, which effectively repairs any parts of the time series that are out of sequence. The historical data can then be forwarded to cloud storage for application access. What we have achieved in this case is a full import of historical time-series data whose time sequence has been intelligently repaired as part of the import process.
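The import step described above can be sketched as follows. The function name and record layout are assumptions for illustration: historical batches are merged, restored to timestamp order, and de-duplicated before being forwarded to storage.

```python
from datetime import datetime, timedelta, timezone

def import_historical(batches):
    """Hypothetical import step: merge historical batches, restore
    timestamp order, and drop duplicate readings before the result
    is forwarded to cloud storage."""
    merged = [r for batch in batches for r in batch]
    merged.sort(key=lambda r: r["ts"])
    unique = []
    for r in merged:
        if not unique or r["ts"] != unique[-1]["ts"]:
            unique.append(r)
    return unique

t0 = datetime(2024, 1, 1, tzinfo=timezone.utc)
# Two overlapping archive batches, delivered in no particular order.
batch_a = [{"ts": t0, "v": 1.0}, {"ts": t0 + timedelta(minutes=2), "v": 1.2}]
batch_b = [{"ts": t0 + timedelta(minutes=1), "v": 1.1}, {"ts": t0, "v": 1.0}]
series = import_historical([batch_a, batch_b])
print([r["ts"].minute for r in series])  # [0, 1, 2]
```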