The New ML Model Enhances Computer Perception of Human Emotions
The advanced machine learning model will significantly enhance human-robot interaction, enabling machines to better perceive human emotions.
FREMONT, CA – Artificial intelligence (AI) technology has dramatically progressed over the years. One of the most significant challenges faced by AI was its inability to comprehend human emotions. However, a group of researchers from MIT has developed a machine learning (ML) model designed to better understand human emotions.
The model will significantly influence the efforts of organizations developing technologies that analyze facial expressions, interpret emotions, and respond accordingly. It can be used to create devices for monitoring the health and well-being of patients, gauging student interest, diagnosing certain diseases, and more.
A significant challenge for AI systems is that emotional expressions vary with factors such as gender, culture, and age, and can even depend on the time of day or hours of sleep. Although deep learning strategies have been developed to help computers recognize these subtleties, they have not been accurate or adaptable across different populations.
The ML model developed by the MIT researchers offers enhanced capabilities compared to conventional models and can capture even the smallest change in facial expression to gauge mood. With additional training data, the model can be adapted to an entirely new group of people.
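The personalization idea described above, adapting a generic model to a new group with a small amount of extra training data, can be sketched roughly as follows. This is an illustrative toy (a linear model updated by gradient steps), not the researchers' actual procedure; the function name `adapt` and all parameters are assumptions.

```python
# Illustrative sketch of model personalization: start from generic,
# pre-trained weights and adapt them with a few samples from a new
# individual. This is NOT the MIT researchers' actual method.

def adapt(weights, samples, lr=0.1, epochs=50):
    """Fine-tune linear-model weights on a new person's data.

    samples: list of (features, target) pairs from the new individual.
    """
    w = list(weights)
    for _ in range(epochs):
        for x, y in samples:
            pred = sum(wi * xi for wi, xi in zip(w, x))
            err = pred - y
            # Gradient step on squared error, nudging the generic
            # weights toward the new person's expression patterns.
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
    return w
```

The key design point mirrored here is that adaptation reuses the generic weights as a starting point, so only a handful of new samples are needed rather than a full retraining set.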
It offers an unobtrusive approach to monitoring moods for the design of socially intelligent robots. Conventional affective-computing models are trained on a limited set of images depicting various facial expressions, optimized for features such as the curl of a lip; those general feature optimizations are then mapped across an entire collection of new images.
The researchers, by contrast, combined mixture-of-experts (MoE) and model-personalization techniques to mine fine-grained facial-expression data from individuals. The MoE technique employs several neural network models, each trained to specialize in a different processing task, to produce a single output. A gating network calculates which expert is best suited to detecting the mood of a given input, enabling the system to discriminate between individuals.
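The MoE combination step described above can be sketched as follows. Everything here, the two toy experts, the fixed gating scores, and the function names, is an illustrative assumption, not the researchers' actual architecture: a gating network produces softmax weights over the experts, and the final output is the weighted sum of the experts' outputs.

```python
import math

# Toy mixture-of-experts sketch (illustrative only, not MIT's model):
# several "experts" each score a feature vector, and a gating network
# decides how much to trust each expert for this particular input.

def softmax(scores):
    """Convert raw gating scores into weights that sum to 1."""
    exps = [math.exp(s - max(scores)) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def expert_a(x):
    # Stand-in expert: e.g. specialized in one population's expressions.
    return sum(x) / len(x)

def expert_b(x):
    # Stand-in expert specialized in a different population.
    return max(x)

def gate(x):
    # Toy gating network with fixed, assumed linear scores per expert.
    return softmax([0.5 * sum(x), 1.0 * x[0]])

def moe_predict(x):
    # Final output: gating weights applied to each expert's output.
    weights = gate(x)
    outputs = [expert_a(x), expert_b(x)]
    return sum(w * o for w, o in zip(weights, outputs))
```

In a real system the experts and the gate would be trained neural networks, and the gate's per-input weighting is what lets the system route each individual's face to the expert best suited to it.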
The ML model can be used to enhance the capabilities of computers and robots, enabling them to learn from small amounts of data. It can run in the background of a host device to track the facial expressions of target subjects, and it can also be applied to monitoring various health conditions, including depression and dementia. The new technology stands to revolutionize human-robot interaction, bringing enhanced capabilities to the field.