Now an app that understands the language of the eyes
In association with software giant Microsoft, a team of researchers has created an app that helps people speak with their eyes. The app can understand eye signals. According to a report, the app, called GazeSpeak, can help people suffering from amyotrophic lateral sclerosis (ALS). ALS is a condition in which the affected person slowly loses the ability to speak and swallow, and the ability to move other muscles is also damaged. The researchers point out that existing eye-tracking input systems for ALS sufferers are very costly, perform poorly in sunlight, and need to be recalibrated repeatedly.
"We have made gyazepic in order to reduce the errors of conventional methods, it is an eye-catching communication device that can run on a smartphone," researchers said in a research paper organized in a conference called "Conference on Human Factors in Computing Systems" in Colorado, USA, this year.
"It is more capable, more portable, easier to learn and costs less." The British weekly New Scientist reports that artificial intelligence is used to transform the movement of the eyes into words.
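The article does not explain how the decoding works, but systems of this kind typically divide the alphabet into a few groups of letters, one per gaze direction, and use a dictionary to disambiguate the resulting gesture sequence into words, much like a T9 phone keypad. Below is a minimal sketch of that idea; the four-way letter grouping and the sample vocabulary are assumptions chosen purely for illustration, not the app's actual design.

```python
# A minimal sketch of T9-style eye-gesture decoding. Assumptions (not
# stated in the article): the alphabet is split into four groups, one per
# gaze direction, and a dictionary disambiguates the ambiguous gesture
# sequence into candidate words. The grouping and vocabulary are hypothetical.

GROUPS = {
    "up":    set("abcdef"),
    "down":  set("ghijkl"),
    "left":  set("mnopqr"),
    "right": set("stuvwxyz"),
}

def gesture_for(letter: str) -> str:
    """Return the gaze direction whose letter group contains `letter`."""
    for direction, letters in GROUPS.items():
        if letter in letters:
            return direction
    raise ValueError(f"unsupported character: {letter!r}")

def encode(word: str) -> tuple:
    """Encode a word as the sequence of gaze directions that spells it."""
    return tuple(gesture_for(ch) for ch in word.lower())

def decode(gestures: tuple, vocabulary: list) -> list:
    """Return every vocabulary word whose encoding matches the gestures."""
    return [w for w in vocabulary if encode(w) == gestures]

if __name__ == "__main__":
    vocab = ["cat", "fat", "bat", "hat"]
    seq = encode("cat")        # ('up', 'up', 'right')
    print(decode(seq, vocab))  # ['cat', 'fat', 'bat'] -- ambiguous matches
```

A real system would presumably rank the candidates with a language model rather than returning them all, but the dictionary lookup shows why a handful of coarse eye gestures is enough to spell arbitrary words.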
The app runs on the listener's device, so that the listener can understand what the other person is saying. The app is expected to be available in the Apple App Store before the conference in May.