Sensorization and machine learning
The news of the day leaves little doubt: we are heading towards a future in which we will live completely surrounded by sensors of all kinds. The earphones in the photo are the latest development from SMS Audio, the company created by rapper 50 Cent: built on Intel technology and designed to monitor physiological variables associated with physical exercise, a fit that may seem rather more natural for sport than wearing a bracelet, a chest strap, or a wristwatch.
But the earphones are only a tiny piece in the huge puzzle behind many of the recent developments and movements in the technology sector: yesterday also brought the announcement of Samsung's acquisition of SmartThings, two hundred million dollars that position the Korean giant in the world of home automation (lighting, humidity, locks... everything) and make millionaires of the founders of a company that started on Kickstarter. Clearly, the tendency is to sensorize our bodies, our environment, our homes, and our cars, even if it leaves us with no clear idea of who will be responsible when the information collected by these sensors triggers a bad decision.
Smartwatches, bracelets for monitoring older people, new battery developments designed specifically for such devices... and a real flood of data produced every time we move, exercise, or simply breathe. Data of all kinds, with uses ranging from the highly imaginative to the highly dangerous, that will shape new business rules and call into question even international agreements.
What do we do with so much data generated by so many sensors? We are already saturated, and we are analyzing only around 1% of the data generated. The logical thing, almost the only thing, we can do is... put other machines to work analyzing it. Machine learning is emerging as the great frontier, the only way to extract a minimum of meaning from such a constant stream of data. Training an algorithm with data from 133,000 patients from four Chicago hospitals between 2006 and 2011 produced diagnoses of emergency situations, such as cardiovascular or respiratory problems, four hours ahead of those made by physicians. A compilation of parameters from the patient's clinical history, combined with information about their age, family history, and certain lab results, analyzed by an algorithm, is likely to lead to a drastic reduction in deaths related to this type of situation, in which receiving medical assistance a few hours earlier may prove vital.
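To make the idea concrete, here is a minimal sketch of the kind of early-warning model described above: a classifier trained on tabular patient features to flag emergencies hours in advance. The feature names, the input file, and the alert threshold are hypothetical illustrations, not the method actually used in the Chicago study.

```python
# Minimal sketch of an early-warning classifier on patient data.
# Columns, file name, and threshold are hypothetical assumptions.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Hypothetical features: vital signs, age, family history flag, lab values,
# and a label marking whether an emergency occurred within the next 4 hours.
df = pd.read_csv("patient_records.csv")  # assumed data file
features = ["heart_rate", "respiratory_rate", "systolic_bp", "age",
            "family_history", "creatinine", "white_cell_count"]
X, y = df[features], df["emergency_within_4h"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

model = GradientBoostingClassifier()
model.fit(X_train, y_train)

# Estimated probability of an emergency within the next 4 hours.
scores = model.predict_proba(X_test)[:, 1]
print("ROC AUC:", roc_auc_score(y_test, scores))

# Patients above an alert threshold would be flagged for early review.
alerts = X_test[scores > 0.8]
```

The point is not the specific model but the workflow: sensor and clinical data arrive continuously, and an algorithm scores each patient early enough for the alert to matter.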
We are definitely experiencing a sensorization boom. But the next step, logical or even essential, is going to be the development of tools so that the immense amount of data generated by these sensors can be analyzed with a minimum of good judgment. A very interesting scenario, with enormous potential, in which we will certainly see some important moves soon...
(This article is also available in English on my Medium page, "Sensorization and Machine learning")