Lip reading is the technique of understanding speech by interpreting the movement of a person's lips. It is a highly complex skill, prone to error, and can yield wildly unreliable results.
However, scientists at Oxford University have developed an artificial intelligence program known as LipNet, partially funded by Google's DeepMind, which can match lip movements to words with 93 percent accuracy, according to a report published on Futurism. This is an impressive result, since human experts in lip reading achieve an accuracy of only about 52 percent. The researchers caution, however, that the system must be tested in real-life situations before it can be used on a wide scale.
AI lip readers have massive potential, with applications in silent dictation in public places, hearing aids, speech recognition in noisy environments, covert conversations, silent-film processing, and biometric identification. According to the researchers, the AI is given whole sentences, so it can teach itself which letters correspond to which lip movements. The researchers fed the AI almost 29,000 videos, each lasting three seconds. On similar videos, human testers showed an error rate of 47.7 percent, while the figure for the AI was only 6.6 percent.
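Error rates like these are conventionally reported as word error rate (WER): the minimum number of word substitutions, insertions, and deletions needed to turn the predicted sentence into the reference sentence, divided by the reference length. The snippet below is a minimal, self-contained sketch of that metric for illustration only; it is not LipNet's code, and the example sentences are hypothetical.

```python
# Minimal sketch of word error rate (WER), the kind of metric behind
# the 47.7% (human) vs. 6.6% (AI) figures quoted above.
# Illustration only; this is not LipNet code.

def word_error_rate(reference: str, hypothesis: str) -> float:
    """Word-level edit distance, normalized by reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Standard dynamic-programming (Levenshtein) edit distance over words.
    dist = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dist[i][0] = i            # deleting every reference word
    for j in range(len(hyp) + 1):
        dist[0][j] = j            # inserting every hypothesis word
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = 0 if ref[i - 1] == hyp[j - 1] else 1
            dist[i][j] = min(dist[i - 1][j] + 1,        # deletion
                             dist[i][j - 1] + 1,        # insertion
                             dist[i - 1][j - 1] + sub)  # substitution/match
    return dist[len(ref)][len(hyp)] / len(ref)

# Hypothetical example: one wrong word out of six gives a WER of ~0.17.
print(word_error_rate("place blue at f two now",
                      "place blue by f two now"))  # -> 0.1666...
```

A single number like this makes human and machine performance directly comparable across the same set of test videos, which is what allows the 47.7-versus-6.6-percent comparison above.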
According to Jack Clark of OpenAI, deploying this software in the real world will require three fundamental improvements: feeding the AI a huge number of videos of people speaking in real-world conditions, enhancing its ability to read lips from various angles, and broadening the range of phrases the software can predict.
Writing on Twitter, however, Neil Lawrence criticized the system, stating that it is not groundbreaking. He added that the AI can only read a meaningless list of words drawn from an extremely constrained vocabulary.
Given that machines can now perceive the world more accurately than humans at a growing number of tasks, this new addition should certainly be welcomed.