
Will the Computers of the Future Be Able to Communicate With Us?

 

Technology has advanced a great deal in the past few years, and the introduction of devices with voice command capabilities has made many people in the technology world wonder how far verbal commands can take us in how we interact with the technology around us.

A group of researchers and scientists from Colorado State University has started developing new technologies that will allow us to interact with our computers in the same way we do with other humans: through gestures, body language and facial expressions.

How Can a Computer Recognise and Communicate Using Gestures?

The researchers have titled this innovative project “Communication Through Gestures, Expression and Shared Perception”. It is led by Bruce Draper, a professor of computer science at the university, and aims to change the way we interact with our computers in everyday situations.


Professor Draper recognises that although technology has advanced dramatically, we are still quite limited in how we interact with it. Many people click, type, search and, more recently, speak to their computers to get them to perform the tasks they need, but what if computers could understand visual cues and commands too?

The team are working hard to move beyond the one-way nature of current interfaces and create a two-way communication system that allows our computers to communicate back to us and help with everyday tasks.

“First, they provide essentially one-way communication: users tell the computer what to do. This was fine when computers were crude tools, but more and more, computers are becoming our partners and assistants in complex tasks. Communication with computers needs to become a two-way dialogue,” says Professor Draper.

Creating a Gesture Library

So, what are the team doing to allow our computers to communicate back to us?

During their research, they have been compiling a library of small packets of information known as Elementary Composable Ideas (ECIs).


Each of these ECIs contains information about a particular facial expression or gesture, derived from human users, along with information that tells the computer how that gesture should be read. A rough sketch of what such a record might look like is shown below.
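To give a sense of the idea, here is a minimal, purely illustrative sketch of how an ECI could be represented as a small data record. The field names, feature vectors and library structure below are assumptions for illustration, not the project's actual format.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ElementaryComposableIdea:
    """Hypothetical sketch of an ECI: a small packet pairing an observed
    gesture or expression with how the computer should interpret it."""
    name: str                    # e.g. "stop" or "go"
    example_poses: List[list]    # sample pose/landmark features captured from human users
    meaning: str                 # what the gesture conveys in the dialogue
    typical_context: str         # when the gesture tends to be used

# A toy library of ECIs that a recogniser could match incoming gestures against.
gesture_library = [
    ElementaryComposableIdea(
        name="stop",
        example_poses=[[0.1, 0.9, 0.5]],   # placeholder feature vector
        meaning="halt the current action",
        typical_context="interrupting the computer mid-task",
    ),
    ElementaryComposableIdea(
        name="go",
        example_poses=[[0.7, 0.2, 0.4]],
        meaning="proceed with the proposed action",
        typical_context="confirming a suggestion",
    ),
]
```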

Using a Microsoft Kinect-based interface, a human user is invited to sit down and interact with the researchers via natural gestures such as “stop” or “go”.

Professor Draper explains that there is no cue as to what gesture should be used and that the reactions of the user should be completely natural.


“We don’t want to say what gestures you should use, we want people to come in and tell us what gestures are natural. Then, we take those gestures and say, ‘OK, if that’s a natural gesture, how do we recognize it in real time, and what are its semantics? What roles does it play in the conversation? When do you use it? When do you not use it?’”

The ultimate goal is for these computers to recognise non-verbal cues and react in a specific way. The research team hope that this technology could be used in scenarios where people have difficulty communicating, such as when a person is deaf or speaks another language.


Being able to understand your computer is an important part of owning one, and here at Tristar we focus our IT support in London on educating and inspiring people to get to know their computers better.

If you feel you could benefit from the trustworthy, reliable service we provide, don’t hesitate to contact us today on 01707 378453 for a chat with one of our friendly team members, who will be more than happy to help you.
