Extended reality tool lets dancers analyze movement
By Tom Fleischman, Cornell Chronicle
It’s been said that “writing about music is like dancing about architecture.” Writing, or talking, about dancing can be similarly futile.
A Cornell doctoral student has helped develop a tool that lets dancers use video and extended reality (XR) headsets to create an immersive environment for analyzing and refining their movements.
In other words, dancers can actually dance about dancing.
“DanXeReflect” transforms two-dimensional video into a 3D virtual studio, where movements appear as interactive avatars. Users can reenact poses to search a catalog of sequences, perform alternative revisions alongside originals, and attach annotations directly to avatars’ body parts.
“We found that when dancers are talking to each other, they’re kind of demonstrating with their bodies to specify the movement,” said Hyunju Kim, a doctoral student in information science and part of the Siegel PiTech Ph.D. Impact Fellowship program at Cornell Tech. “So I wanted to use that idea in an augmented reality and virtual reality interface.”
Kim presented “DanXeReflect: Interacting with the Spatio-Temporal Past Movements for Embodied, Reflective Choreographic Collaboration,” at the Association for Computing Machinery’s Conference on Human Factors in Computing Systems (CHI ’26), April 13-17 in Barcelona. The paper earned honorable mention at the conference.
François Guimbretière, professor of information science in the Cornell Ann S. Bowers College of Computing and Information Science, is a co-author. The corresponding author, Bokyung Lee, is an assistant professor of interaction design and human-computer interaction at Yonsei University in Seoul, South Korea.
This work was a collaboration with Michael Byrne, creative lead for tech, arts and culture at Cornell Tech, as well as several students from the Martha Graham Dance Company of New York, the oldest professional dance troupe in the United States. The company, whose namesake was a pioneer of modern dance, is celebrating its 100th anniversary in 2026.
“As part of Hyunju’s PiTech fellowship,” Byrne said, “we worked together to find novel solutions for activating materials from the Graham archives, as well as new technical methods for engaging with 2D and 3D imagery. It provided an exciting opportunity to create a pop-up ‘dance laboratory’ on campus.”
Choreographers have long used notation systems for codifying movement, and recent advances in digital technology have given instructors better tools for teaching and demonstration. But the two-dimensionality of video limits the extent to which ideas can be translated into movement; some XR programs are available to dancers, but they are generally useful only for individual practice or studio rehearsal.
DanXeReflect takes this technology a step further, offering a more interactive, immersive environment for reviewing and refining a dancer’s movements.
Participants enter the DanXeReflect studio by donning a VR headset, which places them in front of a virtual mirror. The first step is to reenact a pose in front of the mirror; the system compares their posture to corresponding avatar sequences, finding the pose that best matches theirs. The avatar appears both in the virtual mirror and next to the user in the VR space.
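The pose-matching step described above can be sketched as a nearest-neighbor search over joint angles. This is a minimal illustration, not the paper's actual algorithm: the joint-angle representation, distance metric, and catalog format are all assumptions for the sake of the example.

```python
def pose_distance(a, b):
    """Mean absolute difference between two joint-angle vectors (radians)."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def best_match(user_pose, catalog):
    """Return the index of the catalog pose closest to the user's pose."""
    return min(range(len(catalog)),
               key=lambda i: pose_distance(user_pose, catalog[i]))

# Toy catalog of three poses, each described by four joint angles.
catalog = [
    [0.0, 0.1, 0.2, 0.3],
    [1.0, 1.1, 1.2, 1.3],
    [2.0, 2.1, 2.2, 2.3],
]
print(best_match([1.05, 1.0, 1.3, 1.2], catalog))  # prints 1
```

In a real system the angles would come from the headset's body tracking, and the search would run over whole movement sequences rather than single static poses.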
This work began with Kim conducting 90-minute Zoom interviews with six Martha Graham dance professionals, including a rehearsal director, a dance school director, advanced dancers and a former dancer. Kim asked them how they reflected on choreography with others, then showed them a video depicting a sequence of 3D avatars, similar to DanXeReflect, to gauge their receptiveness to the idea.
For the actual user study, the team recruited nine female dancers across genres, including street/urban, ballet, jazz and ballroom. The study involved three actions: reading notes from a partner on their own choreography; writing notes on their own performance; and writing notes on another’s performance.
In addition to refining movements, dancers can also leave time-stamped feedback notes on specific areas of the body.
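A note anchored to a timestamp and a body part might look like the following sketch. The field names and structure here are illustrative assumptions, not the system's actual schema.

```python
from dataclasses import dataclass

@dataclass
class FeedbackNote:
    timestamp_s: float  # position in the recorded sequence, in seconds
    body_part: str      # avatar body part the note is attached to
    author: str
    text: str

note = FeedbackNote(timestamp_s=12.5, body_part="left_arm",
                    author="rehearsal director",
                    text="Extend through the elbow.")
print(note.body_part)  # prints left_arm
```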
The participants generally viewed DanXeReflect as an extension of their post-rehearsal video review. One participant said they appreciated being able to “better understand and take notes about the 3D movements” in the review process.
“Our approach was trying to be more immersive,” Kim said, “so that the dancers can actually see the avatar close by, then reflect based on what they see. They’re sort of ‘in the room’ with the avatar, studying what it’s doing.
“What we were trying to do was what dancers usually do in rehearsal,” she said. “They’re talking to each other, and demonstrating with their bodies the specific movement.”
This work was supported by the National Research Foundation of Korea.