Ememe

Text-to-Emotional Expression AI for Game NPCs

Idea in Artificial Intelligence

Introduction

Ememe is the most emotionally expressive AI for game NPCs: driven by generated dialogue text, it enables characters to make eye contact with and smile at human players.


Problem

For 40 years, there has been almost no change in how we communicate with AI characters. 

AI conversations are displayed in a text window, and the player selects from predetermined options. We have none of the non-verbal communication with AI characters, such as eye contact or laughter, that we rely on in everyday conversation.

With recent advances in LLM and 3DCG technologies, AI characters as realistic as humans have emerged. However, while an LLM such as ChatGPT can automatically generate dialogue with humans, it still cannot generate animated emotional expressions in response to the conversational text it produces.

The challenge is that creating such an AI requires a large, specialized dataset: motion data paired with text data, both dedicated to emotional expression.
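
As a rough illustration of that requirement, the sketch below shows what a single paired training record might look like. The schema is purely hypothetical: the field names, blendshape keys, and emotion labels are illustrative, not an actual Ememe data format.

    from dataclasses import dataclass


    @dataclass
    class MotionClip:
        fps: int                                # capture frame rate
        blendshapes: list[dict[str, float]]     # per-frame facial blendshape weights
        bone_rotations: list[dict[str, tuple]]  # per-frame bone rotations (e.g. quaternions)


    @dataclass
    class EmotePair:
        text: str           # the dialogue line the character speaks
        emotion: str        # coarse label such as "joy" or "anger"
        motion: MotionClip  # the captured emote that expresses the line


    # One hypothetical record: a single spoken line aligned with a short emote.
    example = EmotePair(
        text="You actually made it back alive!",
        emotion="joy",
        motion=MotionClip(
            fps=30,
            blendshapes=[{"smile": 0.8, "brow_raise": 0.4}],
            bone_rotations=[{"head": (0.0, 0.0, 0.05, 0.999)}],
        ),
    )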


Opportunity

Ememe is a text-to-emotional-expression AI that automatically generates human-like emotional expression animations from dialogue text, which is itself generated by an LLM. We also provide an Emote creation app to gather large amounts of training data for the AI.

The product consists of the following three components:

  1. Motion capture app: a B2C app that lets users capture their own movements and convert them into Emotes (a) - motion data for expressing emotions in games.
  2. Game Emote SDK: an SDK that lets game companies import the Emotes created in 1 and use them across multiple games. In exchange for providing the SDK to game companies free of charge, we collect in-game text/voice chat data (b).
  3. Text-to-emotional-expression AI: using the Emotes (a) and text data (b) collected in 1 and 2 as training data, we develop an AI that automatically generates emotional expression animations from text input, and provide it to game developers via the SDK (see the sketch after this list).
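
As a rough sketch of how a game might use the third component, the example below walks through one NPC turn: the LLM writes the line, the model turns that line into an emote clip, and the game plays both. The EmemeClient class and its generate_emote() method are hypothetical names, not a published API; the stub implementation returns a canned clip only so the sketch runs on its own.

    class EmemeClient:
        """Stand-in for the hypothetical text-to-emotional-expression SDK."""

        def generate_emote(self, text: str) -> dict:
            # The real product would run the trained model here; this stub
            # returns a placeholder clip so the sketch is self-contained.
            return {"emotion": "joy", "fps": 30, "blendshapes": [{"smile": 0.8}]}


    def npc_turn(llm_generate, ememe: EmemeClient, player_text: str) -> tuple[str, dict]:
        line = llm_generate(player_text)   # 1. LLM writes the NPC's dialogue line
        clip = ememe.generate_emote(line)  # 2. text -> emote animation clip
        return line, clip                  # 3. the game engine plays both together


    if __name__ == "__main__":
        # A trivial stand-in for the LLM: it always returns the same line.
        line, clip = npc_turn(lambda _: "You actually made it back alive!",
                              EmemeClient(), "I'm back from the ruins.")
        print(line, "->", clip["emotion"])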