There has been a huge amount of discussion around how the IoT could have a significant impact on healthcare, from the ability of doctors to monitor patients at home through to improved disease diagnosis. However, it could arguably have an even larger impact on those living with disabilities.
We have seen a concerted effort to help those with disabilities through technology, from 3D printed prosthetics, through to the ability to understand what those who cannot communicate are saying through brain activity. It may well be the turn of the IoT to have an impact, but what is this likely to look like?
One of the most exciting and widely publicized uses of the IoT in people’s homes has been within devices like the Amazon Echo and Google Home. People no longer need switches and other traditional physical methods of interaction in order to use utilities and basic services within a house. For instance, these devices allow people to change the heating settings without needing to programme a controller, which is incredibly difficult for people with visual impairments to do unless the system is modified. Using voice recognition, they work more or less straight out of the box, without any kind of special setup needed.
Having the ability to control the major components of a house from one location is also very useful for those who find it harder to move around the house or who may even be bed-bound. Doors can be unlocked, food stocks checked, and items delivered to the house without the need for anything more than talking. It gives those with disabilities the opportunity to live a considerably more independent life, while also increasing self-esteem.
One of the most important uses of the IoT is in the development of the self-driving car, with thousands of different sensors communicating millions of times even on a short journey. It is also the element of the IoT that the public is most excited by, even though many don’t associate the two. More importantly, it will offer those with disabilities a level of mobility that was previously never possible.
When it comes to moving from one place to another, cars have traditionally needed to be expensively modified for specific disabilities, with hand-operated accelerators and paddle-style gear changes, for example. Many disabilities, such as visual impairment and some forms of paralysis, prevent people from driving completely, and according to the World Federation of the Deaf there are even 26 countries that do not allow deaf people to drive.
Self-driving cars remove this barrier, as they do not require a specific level of movement or sensory ability to get from one place to another. It means that those with a disability no longer need to rely on others or on public transport, which is often difficult even for those with slight impairments.
Many of those with disabilities require constant attention, whether through simple monitoring or treatment, which is both expensive and often demeaning for the person concerned. We have already seen impressive work that allows patients to be monitored by doctors outside the hospital, and something similar could help those with disabilities or long-term illnesses.
A prime example of this is the work currently being undertaken by the Michael J Fox Foundation (MJFF), which began working with Intel and Cloudera in 2014 to use IoT devices to monitor hundreds of people with Parkinson’s disease. The idea is to collect millions of different data points that may hold a clue to curing the disease. According to Vijay Raja, Senior Solutions Manager at Cloudera: ‘You are talking about a GB of data per patient per day. If you multiply that by 10,000 people, you have an enormous sample size of patients. You are going to have data equivalent to the Library of Congress.’
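To get a feel for the scale Raja describes, a rough back-of-envelope calculation is enough. This short sketch simply multiplies out the figures from the quote (about 1 GB per patient per day across 10,000 patients, both taken from the quote rather than any published dataset specification):

```python
# Back-of-envelope check of the data volumes described in the quote.
# Assumed inputs, taken from the quote itself: ~1 GB per patient per day,
# across 10,000 monitored patients.
GB_PER_PATIENT_PER_DAY = 1
PATIENTS = 10_000

daily_gb = GB_PER_PATIENT_PER_DAY * PATIENTS   # total GB generated per day
daily_tb = daily_gb / 1_000                    # decimal terabytes per day
yearly_tb = daily_tb * 365                     # terabytes per year

print(f"Daily volume:  {daily_gb:,} GB (~{daily_tb:.0f} TB)")
print(f"Yearly volume: {yearly_tb:,.0f} TB (~{yearly_tb / 1_000:.2f} PB)")
```

That works out to roughly 10 TB per day, or several petabytes per year, which gives some sense of why a big-data partner like Cloudera was involved.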
It is something that can be used not only for diseases with potential cures, but to help with almost every kind of disability. Sensors in prosthetics could identify how people actually use them, informing newer models that best cater to that usage. They could also identify how people move around their homes and which elements could be altered to make them more convenient.
It is something that Ford have already started to do to some extent, using sensors in cars to find snag points for users, which then allows designers to create cars that cater to specific needs. In a March 2016 article in Fast Company, Cliff Kuang describes his experience of testing a Ford F-150 whilst wearing an aging suit, so that he experienced using the car as a 70-year-old would. It is not difficult to envision similar tests being done with people with disabilities in order to create designs that work best for them.