Imagine sitting on your couch at home and through the magic of technology, you are transported to the lobby of the American Museum of Natural History in Manhattan. You see and hear the people bustling about; then everything fades to white outlines and you hear the sounds of the natural world from the Upper West Side in the year 1609.
This is one of four scenes from a virtual reality project that replaces the sounds of today’s urban Manhattan with scientifically accurate audio representations of the island in 1609, re-creating what Henry Hudson might have heard when he first arrived.
The other three locations are Collect Pond Park at Lafayette and Franklin streets, the High Line and Inwood Hill Park. Each of the areas had a distinctive ecosystem, and the sounds reflect that biodiversity.
The project, Calling Thunder: The Unsung History of Manhattan, aims to remind people to notice nature, even in a densely populated city.
The project’s creators hope the interactive and immersive experience will draw users to the story of how Manhattan has changed in the last 400 years and will help people become more engaged with the natural world.
“Maybe the issues we are presenting – by comparing what Manhattan used to be to what it is now – can spark people to actually want to do something, not just get information but become involved in trying to figure out how can we find a better balance,” said Bill McQuay, an audio producer at the Cornell Lab of Ornithology and co-creator of the Calling Thunder project.
McQuay and co-creator David Al-Ibrahim, a graduate student at the School of Visual Arts’ Interaction Design Program, have used audio and visual technologies to transport users from four locations in modern-day Manhattan to those same spots as they sounded before the Dutch arrived, when Lenape Native Americans trod lightly on the island. At the time, Mannahatta, as the Lenape called it, had more ecological diversity per acre than Yellowstone National Park does today.
A mobile virtual reality headset, such as Google Cardboard, paired with a smartphone delivers a 360-degree view of modern-day New York at each location with a turn of the viewer’s head. At the same time, immersive sound through headphones starts with recent recordings of the metropolis before fading into an ecologically accurate representation of the area’s soundscape. Users look to the sky to transition to new locations, and the narrator (Emily Kron) acts as a guide for the virtual time-travel tour.
“There has been a loss of language around the natural world and a loss of words to describe natural things, and with that we’ve lost our ability to see and experience and notice them,” Al-Ibrahim said. “This is a way of getting people to pay attention to nature within urban environments and their everyday lives.”
He added that since working on the project he sees New York differently; he notices the plants growing in the street medians, and he wonders where the starlings flying by might live and what they eat and drink.
McQuay and Al-Ibrahim used sounds from the lab’s Macaulay Library – one of the largest scientific archives of natural history audio, video and photographs in the world – to re-create these immersive audio soundscapes of Manhattan in 1609.
They also consulted Eric Sanderson’s book, “Mannahatta: A Natural History of New York City,” and his related website, The Welikia Project, to identify species that would have lived in each location at that time. Sanderson’s materials provide block-by-block details of the ecology of Manhattan in 1609. By matching species lists with audio recordings from the Macaulay Library, and by studying the topographical features of each location, McQuay pieced together an audio landscape of the past.
“One of the challenges was trying to re-create how sound would behave in that environment and trying to imagine what that was like,” McQuay said. To accurately re-create these landscapes, McQuay consulted with colleagues at the lab before manipulating frequencies with equalizers and reverberation effects to simulate a 3-D environment.
The project does not re-create the 1600s visually; instead each scene dissolves into white outlines when the audio fades into the 1609 soundscape.
Al-Ibrahim is working to provide the experience in multiple formats. One is a downloadable smartphone app for an immersive experience with a virtual reality headset and earbuds, in which users turn their heads or bodies for 360-degree views and interactivity. Another format, without goggles, delivers the 360-degree experience through a web browser on a smartphone screen, where users view, turn and interact by moving the phone itself. There will also be a desktop experience that plays like video, with some interactivity. Shorter scenes will also be posted on Facebook and YouTube.
The project was supported by a fellowship to Al-Ibrahim from the Arts and Entertainment Network and by the Cornell Lab of Ornithology.