University of Wollongong

Emotional states control for on-line game avatars

Conference contribution
Posted on 2024-11-14, 10:50. Authored by Ce Zhan, Wanqing Li, Farzad Safaei, Philip Ogunbona
Although detailed animation has already been achieved in a number of Multi-player On-line Games (MOGs), players still have to use text commands to control the emotional states of their avatars. Systems have been proposed that perform real-time automatic facial expression recognition of players; such systems can then control avatars' emotional states by driving the MOG's "animation engine" instead of relying on text commands. One of the challenges for such systems is detecting and recognizing facial components from low spatial resolution face images. In this paper, a system based on an improved version of the Viola and Jones face detection method is proposed to better serve MOGs. In addition, a robust coarse-to-fine facial landmark localization method is proposed. The proposed system was evaluated on a database different from the training database and achieved an 83% recognition rate over 4 emotional state expressions. The system is also able to operate over a wider range of user-to-camera distances.
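The control flow the abstract describes, replacing typed text commands with an automatic camera-driven update, could be sketched roughly as below. This is a minimal illustration only: every name here is hypothetical, the four state labels are placeholders (the abstract does not name them), and the classifier stub stands in for the paper's actual improved Viola-Jones detector, coarse-to-fine landmark localizer, and expression recognizer.

```python
# Illustrative sketch of the pipeline described in the abstract:
# camera frame -> face detection -> landmark localization ->
# expression classification -> avatar animation command.
# All identifiers are hypothetical, not from the paper.

EMOTIONAL_STATES = ("state_1", "state_2", "state_3", "state_4")  # 4 states, labels unspecified


def classify_expression(landmarks):
    """Stand-in for the paper's expression recognizer.

    `landmarks` would be facial-component positions produced by the
    coarse-to-fine localizer; here we return a fixed placeholder state.
    """
    # A real classifier maps landmark geometry to one of the four states.
    return EMOTIONAL_STATES[0]


def drive_animation_engine(avatar_id, state):
    """Issue an animation-engine update instead of a text command."""
    if state not in EMOTIONAL_STATES:
        raise ValueError(f"unknown emotional state: {state}")
    return {"avatar": avatar_id, "emotion": state}


# One iteration of the loop: landmarks in, animation command out.
command = drive_animation_engine("player42", classify_expression(landmarks=[]))
print(command)
```

The point of the design is the interface: the recognizer only needs to emit one of a small set of discrete emotional states, which the MOG's existing animation engine already knows how to render.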

History

Citation

Zhan, C., Li, W., Safaei, F. & Ogunbona, P. (2007). Emotional states control for on-line game avatars. NetGames '07: Proceedings of the 6th ACM SIGCOMM Workshop on Network and System Support for Games (pp. 31-36). Melbourne, Australia: ACM.

Parent title

Proceedings of the 6th ACM SIGCOMM Workshop on Network and System Support for Games, NetGames '07

Pagination

31-36

Language

English

RIS ID

22296
