Published: 5 February 2020
This IIT Madras prof's AI tech can convert brain signals of the speech-impaired into English
We can now interpret what plants and nature are trying to communicate to us. This will help in predicting monsoons, earthquakes, floods, tsunamis and other natural disasters using AI, the professor said
A few years ago, during his time in the US, Dr Vishal Nandigana, now an Assistant Professor at the Fluid Systems Laboratory, Department of Mechanical Engineering, IIT Madras, went through a brief spell of extreme stress and pressure. Between finishing his Masters and deciding what he wanted to do next, Vishal said he had been pushed to the edge, so much so that he began to have hallucinations. Thankfully, the episode was treated quickly, and he returned to his regular life shortly after he got the healthcare that he needed. But it was that experience that planted an idea in his 'engineering' head, one that has materialised today.
Vishal asked his doctor what caused the hallucinations. He was told that they were brain signals that were probably evoking some old memory. Ever since, Vishal has been curious about how brain signals can be decoded. Today, he heads an IIT Madras research team that has developed AI technology that can convert the brain signals of speech-impaired humans into the English language. According to the 2011 Census, nearly 20 lakh people in India were found to be speech-impaired — a number that has likely risen considerably over the past nine years.
Before we get down to telling you how that would actually work, let's get to know Vishal a little better. Having been brought up on the IIT Madras campus (his father was an SSO Gr-1 there), it is no wonder that Vishal wanted to be a professor at the institute where his father was an employee. Even though his father wanted him to become an IAS officer, Vishal didn't feel it was something he was destined to do. So after an undergraduate course in mechanical engineering, he moved to the USA to pursue his Masters and PhD. Soon after he finished, he found himself a cushy job at Intel. "I eventually wanted to get into academics but wanted a taste of the industry as well. I also wanted to be an entrepreneur, so I figured it would be a good experience," he said. But Vishal stayed there for only about six months before deciding he was cut out for the academic life — though he remained keen on entrepreneurship as well.
He then decided to return to IIT Madras and took up entrepreneurship projects. There, he conducted research in subjects like membrane technology and developed something called 'blue energy' - creating renewable energy from seawater. He then delved into Computational Fluid Dynamics (CFD). It's safe to say he created some waves there. But how did he get around to developing AI to read brain signals? "It was after my work in CFD. Take for example, if you wanted to custom make a car and you gave the software all the details and the car is created. It would work like Amazon's Alexa. With things like 3D printing, we can actually create a car. So this software analyses your design needs and modifies the product accordingly," he explained.
"The algorithms were tight," he tells us. Definitively so.
Time out for the layman's brain. So how did this research lead to brain signal-interpreting AI? "There are two parts of the brain, the training set and the test set. So either the brain is trained to react in a certain way or if it is something new, it tests how it can respond and then does the action. Any signal will create waveforms. These signals can be read," he said. Electrical signals, brain signals or any signals in general are waveforms that can be decoded into meaningful information using physical laws or mathematical transforms such as the Fourier transform or the Laplace transform.
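To give a rough sense of what "decoding a waveform with a Fourier transform" means in practice — this is a generic illustration with synthetic data, not the team's actual pipeline — here is a minimal Python sketch using NumPy's FFT to find the dominant frequency components of a signal:

```python
import numpy as np

# A synthetic "signal": a 10 Hz and a 25 Hz component mixed together,
# standing in for a recorded waveform (illustrative data, not brain signals).
fs = 1000                      # sampling rate in Hz
t = np.arange(0, 1, 1 / fs)    # one second of samples
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 25 * t)

# The Fourier transform decomposes the waveform into frequency components.
spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)

# The dominant frequencies are where the spectrum's magnitude peaks.
magnitudes = np.abs(spectrum)
dominant = sorted(freqs[np.argsort(magnitudes)[-2:]])
print(dominant)                # the two strongest components: 10 Hz and 25 Hz
```

The frequency content recovered this way is the kind of "meaningful information" a downstream model could then learn to map to words.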
Once the waves have been recorded, the AI automatically matches them against pre-recorded meanings for specific waveforms and displays the word or sequence of words that the person is thinking. Can you imagine the possibilities of what this would mean for people who have never spoken a word in their lives?
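The matching step described above can be pictured as a nearest-neighbour lookup against a library of templates. The sketch below is a deliberately simplified stand-in — the templates, words and distance metric are all invented for illustration, not taken from the team's system:

```python
import numpy as np

# Hypothetical pre-recorded templates: each known waveform is paired with
# the word it was observed to correspond to (all data invented here).
templates = {
    "water": np.array([0.0, 1.0, 0.0, -1.0, 0.0]),
    "food":  np.array([1.0, 1.0, 0.0, 0.0, -1.0]),
}

def match_word(recorded, templates):
    """Return the word whose template is closest (Euclidean distance) to the recording."""
    return min(templates, key=lambda w: np.linalg.norm(recorded - templates[w]))

# A noisy recording that resembles the "water" template.
recording = np.array([0.1, 0.9, -0.1, -1.1, 0.0])
matched = match_word(recording, templates)
print(matched)   # → water
```

A real system would of course use far richer features and a trained model rather than raw Euclidean distance, but the principle — compare an incoming waveform to known patterns and emit the associated word — is the same.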
On how this can be applied to even natural processes, Vishal said, “The other major application of this field of research we see potentially is, can we interpret nature's signals, like plant photosynthesis or their response to external forces when we collect their real data signal. The data signal also, we believe, is going to be in some wave-like pattern with spikes, humps and crests. So the big breakthrough will be can we interpret what plants and nature are trying to communicate to us. This will help in predicting monsoons, earthquakes, floods, tsunamis and other natural disasters using our Artificial Intelligence and Deep Learning algorithms. Our objective from our laboratory is that if we understand nature's signals better, we can take care of it well.”
He goes on to give a better example, "From the olden days, we've been told that a koel sings just before it starts to rain. So now with the ability to read the waves created by the bird's brain signal, we can actually find out if it indeed does know when it's going to rain or if it's just a myth," he explained. He says it is the same with clouds, "There are plenty of sensors, plenty of data to find out how something works because it gives out signals. So we can find out from the cloud if it's going to rain or not. Similarly, we can predict rains, tsunamis and various other natural processes too," he adds.
Elaborating on how this could drastically impact the lives of the speech-impaired, Vishal said, “The output result is the ionic current, which represents the flow of ions, which are charged particles. These electrically driven ionic current signals are being worked on to be interpreted as human language, meaning speech. This would tell us what the ions are trying to communicate with us. When we succeed with this effort, we will get electrophysiological data from neurologists — brain signals of speech-impaired humans — to know what they are trying to communicate.”
Vishal and his team of students tested this concept by recording experimental electrical signals in the laboratory from nanofluidic transport inside nanopores. The nanopores were filled with saline solution, and ion transport was driven by an applied electric field.
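Before any such signal can be interpreted, discrete events first have to be picked out of the raw current trace. As a very rough illustration of that first stage — synthetic data and a naive threshold, not the lab's actual method — here is a short Python sketch:

```python
import numpy as np

# Synthetic stand-in for an ionic current trace from a nanopore experiment
# (illustrative only; not the lab's real data or analysis).
rng = np.random.default_rng(0)
trace = rng.normal(0.0, 0.05, size=500)   # background current noise
trace[100] += 1.0                         # three artificial current spikes,
trace[250] += 1.2                         # mimicking ion-transport events
trace[400] += 0.9

# A simple threshold separates the spikes from the noise floor; these
# "events" are what a later stage would try to map to meaning.
threshold = 0.5
events = np.flatnonzero(trace > threshold)
print(events)   # indices of the detected spikes
```

Real nanopore analysis involves baseline tracking, dwell-time measurement and much more careful statistics, but the idea of reducing a continuous current trace to interpretable events is the common starting point.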