Researchers find machine learning can predict how we rate social interactions in videoconference conversations.
Videoconferencing is now a frequent mode of communication in both professional and informal settings, yet it often lacks the fluidity and enjoyment of in-person conversation.
Researchers at New York University have developed an artificial intelligence (AI) model that analyzes human behavior during videoconferences to assess the quality of interactions in real time. By examining factors such as turn-taking and facial expressions, the AI can predict whether participants perceive a meeting as smooth and enjoyable.
The study trained the AI on over 100 hours of Zoom recordings, using voice, facial cues, and body movements to identify moments when conversations became awkward or less engaging. Notably, the model found that prolonged silences disrupted the flow of meetings more than instances where participants spoke over each other.
To validate the AI's assessments, more than 300 human judges reviewed the same videoconference footage, and their evaluations closely matched the model's predictions. This advance could improve virtual meeting experiences by flagging conversational breakdowns as they emerge, before they derail a discussion.
News Source: EurekAlert!