Thanks for this editorial. About 10 years ago I found a nifty PowerPoint presentation that a physicist (or two) at CERN had put together about their data processing environments at a few of the big accelerators, and I found it fascinating. I always wondered how they filtered all that sensor data down to manageable chunks (the presentation didn't go into much detail). I had also wondered what kind of computing and storage hardware they used, and wasn't surprised to learn that it was mostly commodity stuff. Do you know if they still do that, or have they started pushing some of it to the cloud?
I bet the National Intelligence Agency could use this for nefarious and not-so-nefarious telecom monitoring purposes. I would think there could be lots of applications in the engineering field, too. Maybe an algorithm that evaluates electrical signals for imminent device failures? I seem to recall reading somewhere that that is possible, and ML would probably help there.
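Just to make the failure-detection idea concrete: a very crude version doesn't even need ML. A rolling z-score can flag samples that sit far outside the signal's recent baseline (a voltage transient, say). This is only a toy sketch; the window size, threshold, and signal here are all invented for illustration.

```python
# Toy sketch: flag samples in an electrical signal that deviate far
# from the recent baseline, using a rolling z-score.
from collections import deque
from statistics import mean, stdev

def anomalies(signal, window=20, threshold=3.0):
    """Yield (index, value) for samples far outside the recent baseline."""
    recent = deque(maxlen=window)  # sliding window of past samples
    for i, x in enumerate(signal):
        if len(recent) == window:
            mu, sigma = mean(recent), stdev(recent)
            # z-score of the new sample against the window's statistics
            if sigma > 0 and abs(x - mu) / sigma > threshold:
                yield i, x
        recent.append(x)

# A mostly flat signal with a small ripple and one sudden spike:
signal = [1.0 + 0.01 * (i % 5) for i in range(50)] + [9.0] + [1.0] * 10
print(list(anomalies(signal)))  # the spike at index 50 gets flagged
```

A real predictive-maintenance system would look at trends and spectral features rather than single outliers, and that's where ML models earn their keep.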
In the general business community, these might come in handy if you are networked into a bunch of third-party apps and want to filter the information down before committing it to storage or forwarding it any further. It's a little harder to see the uses there, but I'm sure they exist.
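The filter-before-store idea above is basically a predicate applied at the ingestion point. A minimal sketch, assuming invented field names (`severity`, `heartbeat`) on incoming events from a third-party feed:

```python
# Sketch: drop uninteresting third-party events before they hit storage.
# The event schema here is hypothetical.
def keep(event, min_severity=3):
    """Keep only events worth storing or forwarding."""
    return (event.get("severity", 0) >= min_severity
            and not event.get("heartbeat", False))

events = [
    {"id": 1, "severity": 5},                      # real alert: keep
    {"id": 2, "severity": 1},                      # noise: drop
    {"id": 3, "severity": 4, "heartbeat": True},   # keepalive: drop
]
stored = [e for e in events if keep(e)]
print([e["id"] for e in stored])
```

The interesting engineering question is where the filter runs: the closer to the source, the less you pay in bandwidth and storage downstream, which is exactly the trade-off the accelerator folks face at a much larger scale.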