Sensors Everywhere – Data Nowhere
Smartphones have become increasingly successful at monitoring everything around them. The sensors in a smartphone collect data that was unimaginable just months ago. Every smartphone carries a camera, a microphone, GPS, a gyroscope, an accelerometer, a compass, and proximity sensors, and it does not end there. Every day, apps use those sensors to collect data as well.
Sensor data is the output of a device that detects and responds to some type of input from the physical environment. The output may provide information directly, feed another system, or guide a process. Think of it as raw material for other systems.
In the early days of sensors, a pattern had to be recognized before the sensor would activate a process. Think of an elevator door that reopens the moment it senses an object in its path, or the oil light in your car that comes on when you are low on oil.
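The kind of simple trigger described above can be sketched in a few lines. This is a hypothetical illustration, not any real car's firmware: the function name and the threshold value are assumptions chosen for the example.

```python
# Hypothetical sketch of a simple threshold sensor, like the oil
# light described above: a raw physical reading becomes a yes/no
# decision that activates a process. All names and values are
# illustrative assumptions.

OIL_LEVEL_THRESHOLD = 0.25  # fraction of a full reservoir; assumed value

def oil_light_on(oil_level: float) -> bool:
    """Return True when the oil level reading falls below the threshold."""
    return oil_level < OIL_LEVEL_THRESHOLD

print(oil_light_on(0.10))  # low oil: light comes on
print(oil_light_on(0.80))  # plenty of oil: light stays off
```

The point is that an early sensor did exactly one thing with its reading: compare it against a fixed pattern and fire once.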
With nanotechnology come smaller sensors. Smarter sensors. Sensors that are able to collect data on just about everything. So not only will a heart sensor let a doctor know when the heart is beating, but the data collected from sensors can help predict when it may stop.
Just like the elevator door that stops closing because its sensor detected a person in its path, that same sensor can now count how many people rode the elevator, wirelessly diagnose its own trouble, and send data to the elevator maintenance company to schedule service.
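The contrast with the old threshold trigger can be sketched as well. This is a toy model of the "smart" elevator sensor described above, not any vendor's system: the class, thresholds, and report format are all hypothetical.

```python
# Hypothetical sketch of a "smart" elevator door sensor: besides
# reopening the door, it counts riders, runs a self-diagnostic, and
# builds a report that would be sent wirelessly to the maintenance
# company. All names and thresholds are illustrative assumptions.

class DoorSensor:
    def __init__(self):
        self.ride_count = 0
        self.faults = []

    def object_detected(self) -> str:
        """Reopen the door (the original job) and also count the rider."""
        self.ride_count += 1
        return "reopen door"

    def self_diagnose(self, motor_temp_c: float) -> None:
        """Record a fault when an internal reading looks abnormal."""
        if motor_temp_c > 70.0:  # assumed fault threshold
            self.faults.append(f"motor overheating: {motor_temp_c} C")

    def report(self) -> dict:
        """Payload the sensor would transmit to schedule service."""
        return {"rides": self.ride_count, "faults": self.faults}

sensor = DoorSensor()
sensor.object_detected()
sensor.object_detected()
sensor.self_diagnose(82.5)
print(sensor.report())
```

The design point is that the same physical detection event now feeds two consumers: the door motor, immediately, and a data stream for later analysis.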
How many sensors does a person deal with on an average day? Think about it.
Your smartphone is a sensor network. Depending on what apps you have installed on your smartphone, that phone is seeing, hearing, and reporting on its surroundings. But to whom?
We know of one big search engine company that uses data from phones to build semantic language skills. Did you ever use Google Voice? When someone left a voicemail, Google would attempt to transcribe it into text. When the user played the voicemail, Google would highlight in real time the words it was unsure of. So you corrected them, and taught Google how humans speak in different tones, timbres, and accents.
The little sensor in your phone called a microphone got what it needed.
Sensors have helped us get out of jams in the past.
But what do we do with this data? Throw it back into the cloud and use it only when we need it? There are many uses for this data, including but not limited to health and surveillance.
In 2012, Dilshan Silva proposed a place where data collected from sensors could be stored and studied, and also a place to collaborate, share, and even make new friends:
WikiSensing: A collaborative sensor management system with trust assessment for big data