

Student Brings Avatars to Life for ETS


Video games are known for being engaging, entertaining, and generally a lot of fun. But the technology and design strategies that make games so appealing can also be used to help users achieve practical goals in everything from physical therapy to building cultural awareness to language learning.

Kirby Cofino, a second-year student in American University's Game Design MA program, worked last summer as part of a paid internship with the DIAMONDS research group in the San Francisco office of Educational Testing Service (ETS) on a speech system called HALEF.

HALEF is known as a "multimodal dialogue system." Put simply, it is a computer program that allows a person to talk to a computer, and the computer responds. Cofino's role in the development of HALEF was to animate 3D virtual human avatars, James and Elena, and then add code to the avatars to allow them to be synced up with the computer's responses.

"I had no background in 3D art, so it [has] been great getting to develop that," Cofino says about the experience. "[At AU] a lot of what we do is quick sprints: we have a couple of weeks to do a project, then it's done. With this I really got to dig into something, and really learn the ins and outs of web development, coding, animating."

The software will be administered by ETS and used in testing environments. At the moment, it is primarily designed for simulating classroom settings, allowing teachers to converse directly with student avatars so that educators' teaching practices can be assessed. In the future, the technology could be used for other applications, including teaching, sales, and interviewing job applicants.

ETS found Cofino's work so valuable that they asked him to stay on through the fall 2017 semester. He is now working on a second generation of avatars and is really pleased with the project so far.

"I think [the] key to my experience was that there wasn't a set person telling me what to do. They give you a task and it's up to you to figure out how to achieve the goal. So, it's really about freedom and critical thinking," Cofino says.

ETS has also published several papers on the project, and Cofino is the first author of one: "A Modular Multimodal Open-Source Virtual Interviewer Dialog Agent." The paper discusses how HALEF has the potential "to serve as a job interviewer for workforce training applications" and describes the technologies used to create the system.

Interviewees would begin on a website, calling in with their laptop's camera and microphone. HALEF detects the call and responds with an audio prompt that is delivered to the web browser. The exchange is recorded and streamed continuously. Each interaction passes through several layers, including servers and APIs (sets of routines, protocols, and tools for building software applications). Throughout this process, additional code runs so that the avatar itself reacts, moves, and engages realistically with the user.
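The turn-taking loop described above (user speaks, the system picks a response, and the avatar's audio and animation are kept in sync) can be sketched roughly as follows. This is a minimal, hypothetical illustration, not code from the ETS system; all names, intents, and file references are invented for the example.

```javascript
// Hypothetical sketch of one HALEF-style dialogue turn: the system
// maps the user's recognized speech to a response, pairing an audio
// prompt with the avatar animation that must play in sync with it.
// (Illustrative only; none of these names come from the ETS codebase.)

const responses = {
  greeting: { audio: "welcome.wav", animation: "wave" },
  question: { audio: "tell_me_more.wav", animation: "nod" },
  fallback: { audio: "please_repeat.wav", animation: "tilt_head" },
};

// Classify the recognized transcript into a simple intent.
function classifyIntent(transcript) {
  const text = transcript.toLowerCase();
  if (text.includes("hello") || text.includes("hi")) return "greeting";
  if (text.length > 20) return "question";
  return "fallback";
}

// One dialogue turn: return the synchronized audio prompt and
// avatar animation cue for the user's utterance.
function dialogueTurn(transcript) {
  const intent = classifyIntent(transcript);
  const { audio, animation } = responses[intent];
  return { intent, audio, animation };
}
```

In a real deployment, the intent classification would come from a speech-recognition and dialogue-management backend rather than keyword matching, but the core idea is the same: every response bundles the spoken prompt with the animation cue so the avatar's movement matches what the user hears.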