BEAT: the Behavior Expression Animation Toolkit.

The Behavior Expression Animation Toolkit (BEAT) allows animators to input typed text that they wish to be spoken by an animated human figure, and to obtain as output appropriate and synchronized non-verbal behaviors and synthesized speech in a form that can be sent to a number of different animation systems. The non-verbal behaviors are assigned on the basis of actual linguistic and contextual analysis of the typed text, relying on rules derived from extensive research into human conversational behavior. The toolkit is extensible, so that new rules can be quickly added. It is designed to plug into larger systems that may also assign personality profiles, motion characteristics, scene constraints, or the animation styles of particular animators.
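The extensible rule-based design described above can be illustrated with a minimal sketch. This is not BEAT's actual API: the names (`Rule`, `annotate`, the behavior tags) and the toy heuristics are assumptions made for illustration. The point is the architecture: independently registered rules inspect an analyzed utterance and attach non-verbal behavior suggestions, so new rules can be added without touching the pipeline.

```python
# Illustrative sketch of a BEAT-style extensible rule pipeline.
# All names and heuristics here are hypothetical, not BEAT's real interface.
from dataclasses import dataclass, field
from typing import Callable, List, Tuple


@dataclass
class Annotated:
    """An utterance plus behavior annotations as (token index, behavior) pairs."""
    tokens: List[str]
    behaviors: List[Tuple[int, str]] = field(default_factory=list)


Rule = Callable[[Annotated], None]
RULES: List[Rule] = []


def rule(fn: Rule) -> Rule:
    """Register a behavior-suggestion rule; new rules plug in freely."""
    RULES.append(fn)
    return fn


@rule
def beat_gesture_on_emphasis(utt: Annotated) -> None:
    # Toy stand-in for linguistic analysis: emphasize long content words.
    for i, tok in enumerate(utt.tokens):
        if len(tok) > 6:
            utt.behaviors.append((i, "beat_gesture"))


@rule
def eyebrow_raise_on_question(utt: Annotated) -> None:
    # Toy contextual rule: raise eyebrows at the start of a question.
    if utt.tokens and utt.tokens[-1].endswith("?"):
        utt.behaviors.append((0, "eyebrow_raise"))


def annotate(text: str) -> Annotated:
    """Run every registered rule over the tokenized input text."""
    utt = Annotated(tokens=text.split())
    for r in RULES:
        r(utt)
    return utt


result = annotate("Did you see the spectacular animation?")
print(result.behaviors)
# e.g. a beat gesture on "spectacular" and an eyebrow raise for the question
```

In a full system the output annotations would then be scheduled against synthesized speech and handed to an animation backend, as the description above indicates.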
References in zbMATH (referenced in 8 articles)
- Xie, Zhibing; Guan, Ling: Multimodal information fusion of audio emotion recognition based on kernel entropy component analysis (2013)
- Gobron, Stephane; Ahn, Junghyun; Paltoglou, Georgios; Thelwall, Michael; Thalmann, Daniel: From sentence to emotion: a real-time three-dimensional graphics metaphor of emotions extracted from text (2010)
- Morency, Louis-Philippe; De Kok, Iwan; Gratch, Jonathan: A probabilistic multimodal approach for predicting listener backchannels (2010)
- Bickmore, Timothy W.; Mauer, Daniel; Brown, Thomas: Context awareness in a handheld exercise agent (2009)
- van Deemter, Kees; Krenn, Brigitte; Piwek, Paul; Klesen, Martin; Schröder, Marc; Baumann, Stefan: Fully generated scripted dialogue for embodied agents (2008)
- Egges, Arjan; Papagiannakis, George; Magnenat-Thalmann, Nadia: Presence and interaction in mixed reality environments (2007)
- Ishizuka, Mitsuru; Prendinger, Helmut: Describing and generating multimodal contents featuring affective lifelike agents with MPML (2006)
- Pelachaud, Catherine; Poggi, Isabella: Subtleties of facial expressions in embodied agents (2002)