Technology is developing at an incredibly fast pace, but what does that mean for people with deafness, tinnitus and hearing loss? Do we continue to create dedicated assistive technology solutions to help people? Or do we try to influence developers of mainstream technology to make their products more accessible?
‘Smart home’ technology refers to the ability to control various electronic devices in your home through the internet. The internet connection lets you use your smartphone or tablet as a remote control to do things like control your lights, operate your oven or keep an eye on what is in your fridge.
One particularly interesting product on show at CES was the Ring Video Doorbell 2. This smart doorbell is connected to an app so that you can easily see who is at the door, and it lets you have a conversation with callers if you can’t see them or don’t know who they are. It can also double up as a security feature. Most importantly for people with hearing, mobility or sight difficulties, you can be alerted to the doorbell through your smartphone, which offers a much wider range of alert types. You can also give access to a trusted family member or friend, who can see who is at the door and, if you are worried, help you decide whether or not to open it.
The world of robotics has come a long way over the past year. Improvements in cameras, speech recognition, and face and object recognition mean that domestic robots can now be genuinely useful. They currently range from the basic, such as vacuum robots that navigate around the room (some better than others) and do your vacuuming for you, to the more complex, where you can ask a robot to bring you something and it will find the object, pick it up and then find you to deliver it.
More development is also going into robots that can provide health and social care. Some have been designed to dispense your medication and do basic health checks, such as taking your blood pressure or blood sugar levels. Others are designed to be alerted when an incident such as a fall occurs, after which the robot will locate you, assess your needs and contact the relevant person accordingly.
Artificial intelligence (AI) is what allows computers and robots to make decisions in a human-like way. The recent boom in AI has been catalysed by significant strides in what is referred to as ‘machine learning’. Computers can now process enough information to learn from examples (machine learning), rather than following rules dictated step by step by a person, which was the traditional method.
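The contrast above can be sketched in a few lines of code. This is a deliberately simplified illustration, not a real machine-learning system: the task (labelling a sound "loud" or "quiet") and all the numbers are invented for the example.

```python
# Traditional approach: a person dictates the rule explicitly.
def rule_based(volume_db):
    return "loud" if volume_db > 70 else "quiet"

# Machine-learning approach, in its very simplest form: the program
# derives its own rule from labelled examples instead of being told it.
# Here it places the threshold midway between the average volume of
# each labelled group.
def learn_threshold(examples):
    loud = [v for v, label in examples if label == "loud"]
    quiet = [v for v, label in examples if label == "quiet"]
    return (sum(loud) / len(loud) + sum(quiet) / len(quiet)) / 2

# Invented labelled training examples: (volume in decibels, label).
training = [(40, "quiet"), (50, "quiet"), (85, "loud"), (95, "loud")]
threshold = learn_threshold(training)  # midway between 45 and 90 = 67.5

def learned(volume_db):
    return "loud" if volume_db > threshold else "quiet"
```

Real machine learning works with far more data and far more sophisticated models, but the principle is the same: the behaviour comes from examples, not from a hand-written rule.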
Basic applications of AI include things like the facial recognition that unlocks your smartphone. More complex and exciting developments in machine learning include automated vehicles, which can make sense of their surroundings and respond accordingly. In healthcare, automated machines could carry out diagnostic tests quickly and efficiently and refer people on to the relevant next stage, which is particularly useful in remote, hard-to-reach areas.
Automated cars are becoming a regular feature in the tech news. With the huge strides being made in AI, automated cars are almost ready to be tested on real roads. Automated vehicles could bring some great opportunities, such as improving road safety and helping people in rural and hard-to-reach areas access healthcare more easily. However, they also raise concerns about how well they will integrate with existing roads and human-driven cars, and whether the law can keep up with this kind of change.
Assistive technology was a very small part of CES. Currently, we are seeing less stand-alone assistive tech being developed and very little integration of assistive features into mainstream devices. With such a diverse set of users of assistive technology, it is helpful to have a mixture of the two. However, as tech grows, it is important that we capture users’ needs and influence the development of tech accordingly, to ensure that people with deafness, tinnitus and hearing loss are not left behind.