A collaborative group of researchers from the University of California San Diego recently traveled to Turin, Italy, to digitally map an entire portion of the city—complete with historic architecture, expansive murals and stunning works of art.

From left, Eric Lo and Dominique Rissolo of the UC San Diego Cultural Heritage Engineering Initiative with Polytechnic University students in Italy. Photo by Farshid Bazmandegan

Digital data will be used by students and researchers on campus to explore the site’s buildings and artifacts, ultimately recreating an interactive, virtual-reality experience. Through high-resolution images and 3D models, students can study all the pieces together without the difficulty of traveling to the site or the fear of losing objects to passing time.

“The idea is to create a model—a digital surrogate—of these structures that allows us to interact with them, to analyze them, to annotate them and to make measurements to really understand their state of health,” said Dominique Rissolo, an archaeologist and assistant research scientist at the UC San Diego Qualcomm Institute. “We have a unique capability here on campus … that allows us to go one step beyond the model to actually create the digital surrogate.”

Trip findings are an extension of Division of Arts and Humanities Dean Cristina Della Coletta’s research on the historical significance of the 1911 Turin International Exposition, the world’s fair that took place in the city’s Parco del Valentino. With the goal of visually recreating the 1911 fair, the team digitally captured a majority of the park, which includes a medieval village and the Castello del Valentino, or “Valentino Castle”—home to Polytechnic University’s architecture department.

“For the first time, we really have a wide array of expertise that is brought to bear on the project: we have engineers, working with architects, working with cultural historians,” Della Coletta said. “This is a winning combination.”

Researchers digitally mapped a portion of Turin’s Parco del Valentino, the location of the 1911 World’s Fair. The park includes a medieval village called Borgo Medievale, pictured here. Photo by Farshid Bazmandegan

Led by Della Coletta, Rissolo, Falko Kuester and Vid Petrovic, the Turin fieldwork brings together engineering students and arts and humanities students for the betterment of both. Cross training these students, Kuester said, creates empowered and dynamic learning: just as the cultural preservation work is important, honing and testing new technical skills in the field and classroom is equally important.

“The exciting part for us, as educators, is that we get to work with talented students from a broad range of disciplines—disciplines that historically do not really collaborate as much,” said Kuester, the Kinsella Heritage Engineering Director at Qualcomm Institute and professor at the UC San Diego Jacobs School of Engineering.

“By doing this, and being able to put on these different lenses, our students get to learn to speak each other’s language [and] to communicate in ways we were never required to do before,” he said. “Our students, in the process, are becoming more complete human beings and innovators.”

Rissolo said the team collected field data, the initial step in recreating the structures of the 1911 Turin World’s Fair, using terrestrial laser scanners, structure-from-motion photogrammetry and stereo spherical giga-pixel imaging. Taking several scans from many different perspectives in the park allows the team to process the data quickly and with a high degree of certainty. The team collected thousands of images and used advanced software on campus to create digital models of the structures.

“The most important takeaway for our students is the ability to connect theory and practice, whether it is in the archives or in the digital lab,” said Della Coletta, who participated in the data gathering during fall quarter. “What makes the project meaningful is the ability of our students to learn by doing.”

With a first round of digital information collected, the researchers have been busy on campus recreating a 3D model of the buildings, both inside and out. Viewing the models at the university’s WAVE lab, Qualcomm Institute research and development engineer Eric Lo said the results were a good representation of what he saw on the ground in Italy, but some data was missing. Once the 3D models are created, gaps appear, giving the team an overview of what to record on future trips.

“Ultimately, it’s a map to the site, but a map that also tells us what else to map in order to first create the best possible digital surrogate, combining site geometry, building materials and overall state of health, as well as its art and history,” Kuester said.

The Turin team also included Department of Visual Arts alumnus Farshid Bazmandegan and students at Italy’s Polytechnic University, headed by geomatics professor Filiberto Chiabrando. The Polytechnic team continues to digitally document historic structures in the park.

“The approach that we will follow in this project is very interesting, since it’s connected to the cultural heritage, [and] it’s connected to the humanities as well. This is a fruitful collaboration that we have started,” Chiabrando said. “The most important thing is to merge different backgrounds, and different experiences. A multidisciplinary approach is the best way.”

The images will ultimately recreate a “digital map” of the buildings, both outside and in—as shown here in the researcher’s planning book. Photo by Farshid Bazmandegan

The Cultural Heritage Engineering Initiative at UC San Diego brings the power of student-driven engineering to the study and preservation of historic structures, archaeological sites, art and other artifacts. There are multiple projects underway, from visualizing shipwrecks near Bermuda to the Hearing Seascapes initiative with Department of Music professor Lei Liang.

“It’s a phenomenal opportunity to have the Division of Arts and Humanities partner with the Cultural Heritage Engineering Initiative, the Jacobs School of Engineering and, on top of that, the Polytechnic University in Turin,” said Della Coletta. “What better place to engage in a project that connects engineering and technology with the humanities.”

The remains were discovered by lead diver Alberto Nava and his colleagues on the floor of a flooded pit 130 feet from the surface. Among them were parts of more than 30 animal skeletons, including the nearly intact skull and skeleton of a girl of about 16, later named Naia by researchers, who had fallen to her death in the pit at the end of the last Ice Age some 13,000 years ago. Also found were fossils of Ice Age megafauna such as saber-toothed cats and huge Shasta ground sloths.

The researchers used the state-of-the-art SunCAVE (Cave Automatic Virtual Environment) at QI, which allowed scientists associated with a NOVA documentary to interact with, map and measure the fossils, as well as plan future diving missions. QI and CHEI are an integral part of the Hoyo Negro Project – working with the technical dive team to develop optimal image acquisition strategies, creating the high-resolution digital models, and powering the visual analytics necessary to bring this remote site to the scientific community.

The QI effort is being led by Rissolo, an archaeologist who has been working in the Yucatan for 25 years, as well as cultural heritage engineering specialists Falko Kuester, Vid Petrovic and Eric Lo. Many of the researchers studying the site’s diverse Ice Age fauna will never have a chance to go there. Not only has the virtual “twin” of the site enabled paleontologists to study the bones remotely; they are also making discoveries in the data – bones and tell-tale features that have eluded detection by divers at the bottom of the deep, dark pit.

Exploring an environment that only a few have seen before, and being the first to see it as a whole and in its full beauty, combining site-scale context with the finest possible details captured by its digital twin for in-depth analysis, is truly transformative, says Falko Kuester, professor of visualization and virtual reality at CHEI.

“Not only is the virtual cave essential for a comprehensive fossil inventory,” writes NOVA Next writer Evan Hadingham, “it enables the team to take measurements and print accurate 3D replicas of specific bones, including Naia’s skull.” Explorers on the most recent National Geographic-funded mission to Hoyo Negro used the virtual SunCAVE to plan their excursion in detail, which ultimately allowed them to bring up parts of eleven ancient animals, some of them previously unknown.

San Diego, February 6, 2018 – “Lost Treasures of the Maya Snake Kings,” a new one-hour National Geographic special premiering today at 9 p.m. ET/8 p.m. Central, shows how LiDAR laser imaging technology is revolutionizing archaeology and features the WAVE data visualization technology created by researchers at the University of California San Diego Qualcomm Institute (QI). Albert Yu-Min Lin, an affiliate of QI, is the host of the program.

The documentary explores what’s being hailed as a “major breakthrough” in Maya archaeology: the identification of ruins of more than 60,000 houses, palaces, elevated highways, and other human-made features that have been hidden for centuries under the jungles of northern Guatemala. The work was conducted by researchers of the PACUNAM LiDAR initiative.

Using a powerful technology known as LiDAR (short for “Light Detection And Ranging”), scholars digitally removed the tree canopy from aerial images of the now-unpopulated landscape, revealing the ruins of a sprawling pre-Columbian civilization that was far more complex and interconnected than most Maya specialists had supposed.
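The "digital removal" of the canopy hinges on separating ground returns from vegetation returns in the point cloud. As a rough illustration only (the PACUNAM pipeline uses far more sophisticated ground-classification algorithms), a minimal bare-earth filter can keep just the lowest return in each grid cell:

```python
import numpy as np

def bare_earth_grid_min(points, cell=1.0):
    """Crude bare-earth filter: keep only the lowest-elevation return
    in each horizontal grid cell. A sketch of the idea, not a
    production LiDAR classifier."""
    ij = np.floor(points[:, :2] / cell).astype(int)
    best = {}  # maps (i, j) cell -> index of its lowest-z point so far
    for idx, (key, z) in enumerate(zip(map(tuple, ij), points[:, 2])):
        if key not in best or z < points[best[key], 2]:
            best[key] = idx
    return points[sorted(best.values())]

# Two cells, each with a high canopy return above a low ground return
pts = np.array([[0.2, 0.3, 12.0],   # canopy
                [0.4, 0.1,  0.5],   # ground
                [1.5, 0.2,  9.0],   # canopy
                [1.7, 0.6,  0.4]])  # ground
ground = bare_earth_grid_min(pts, cell=1.0)
print(ground[:, 2])  # lowest return per cell: [0.5 0.4]
```

Real surveys add slope-aware filtering and morphological operations so that buildings and steep terrain are not misclassified, but the grid-minimum step conveys why multiple laser returns per spot make canopy removal possible at all.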

“The LiDAR images make it clear that this entire region was a settlement system whose scale and population density had been grossly underestimated,” said Thomas Garrison, an Ithaca College archaeologist and National Geographic Explorer who specializes in using digital technology for archaeological research.

Garrison is part of a consortium of researchers who are participating in the project, which was spearheaded by the PACUNAM Foundation, a Guatemalan nonprofit that fosters scientific research, sustainable development, and cultural heritage preservation.

QI’s Cultural Heritage Engineering Initiative (CHEI), which works closely with National Geographic Explorers, was launched in 2007 and has since created a comprehensive toolbox and talent pool that brings the power of student-driven science and engineering to the study and preservation of archaeological sites, monuments, historic structures and other artifacts. QI researchers have a history of working on various ground-based and drone-based LiDAR imaging projects in Guatemala. Lin, along with his collaborators in the QI Engineers for Exploration program — co-directors Ryan Kastner and Curt Schurgers — collaborated with Garrison to lead teams of students on expeditions over the past four years to the jungles of Guatemala to test out various platforms for mapping and imaging. QI Staff Engineer Eric Lo and Ph.D. student Dominique Meyer were also instrumental in these field expeditions.

Researchers examine LiDAR imagery from Guatemala on two of the visualization displays (including the WAVE, at right) in the lab of the QI Cultural Heritage Engineering Initiative.

“Engineering and exploration go hand in hand — National Geographic was co-founded by Alexander Graham Bell, the inventor of the telephone,” said Lin. “The things we create allow us to go further, and with the exponential rate of innovation today this truly is the new golden age of exploration.”

The Qualcomm Institute is a leader in visualization technologies that make it possible to look at data at a massive scale. Researchers at CHEI — including Falko Kuester, Vid Petrovic, Eric Lo, Christopher McFarland, Jurgen Schulze, Greg Dawe, Joel Polizzi, Joe Keefe and Tom DeFanti — played a primary role in developing CHEI’s hardware and software toolbox, including the 70-megapixel Wide Angle Virtual Environment (WAVE), which is featured heavily in the documentary. The team at CHEI also developed the VisCore visual analytics engine that allows archaeologists to use virtual reality to literally walk into the arena of data-enabled scientific discovery, as featured in the “Lost Treasures of the Maya Snake Kings.”

“Turning big data into insights and action is one of the truly transformative elements that our team enables,” says CHEI Director Kuester. “Lots of data is being acquired and simulated these days, but making sense of it all is a completely different story. The opportunity to work in highly interdisciplinary teams that change the state of knowledge is where it gets truly exciting.”

An advanced civilization

The PACUNAM project mapped more than 800 square miles (2,100 square kilometers) of the Maya Biosphere Reserve in the Petén region of Guatemala, producing the largest LiDAR data set ever obtained for archaeological research.

The results suggest that Central America supported an advanced civilization that was, at its peak some 1,200 years ago, more comparable to sophisticated cultures such as ancient Greece or China than to the scattered and sparsely populated city states that ground-based research had long suggested.

The ancient Maya never used the wheel or beasts of burden, yet “this was a civilization that was literally moving mountains,” said Marcello Canuto, a Tulane University archaeologist and National Geographic Explorer who participated in the project.

“We’ve had this western conceit that complex civilizations can’t flourish in the tropics, that the tropics are where civilizations go to die,” said Canuto, who conducts archaeological research at a Guatemalan site known as La Corona. “But with the new LiDAR-based evidence from Central America and [Cambodia’s] Angkor Wat, we now have to consider that complex societies may have formed in the tropics and made their way outward from there.”

Surprising insights

“LiDAR is revolutionizing archaeology the way the Hubble Space Telescope revolutionized astronomy,” said Francisco Estrada-Belli, a Tulane University archaeologist and National Geographic Explorer. “We’ll need 100 years to go through all [the data] and really understand what we’re seeing.”

Already, though, the survey has yielded surprising insights into settlement patterns, inter-urban connectivity, and militarization in the Maya Lowlands. At its peak in the Maya classic period (approximately A.D. 250–900), the civilization covered an area about twice the size of medieval England, but it was far more densely populated.

“Most people had been comfortable with population estimates of around 5 million,” said Estrada-Belli, who directs a multi-disciplinary archaeological project at Holmul, Guatemala. “With this new data it’s no longer unreasonable to think that there were 10 to 15 million people there—including many living in low-lying, swampy areas that many of us had thought uninhabitable.”

Virtually all the Maya cities were connected by causeways wide enough to suggest that they were heavily trafficked and used for trade and other forms of regional interaction. These highways were elevated to allow easy passage even during rainy seasons. In a part of the world where there is usually too much or too little precipitation, the flow of water was meticulously planned and controlled via canals, dikes, and reservoirs.

Among the most surprising findings was the ubiquity of defensive walls, ramparts, terraces, and fortresses. “Warfare wasn’t only happening toward the end of the civilization,” said Garrison. “It was large-scale and systematic, and it endured over many years.”

The survey also revealed thousands of pits dug by modern-day looters. “Many of these new sites are only new to us; they are not new to looters,” said Marianne Hernandez, president of the PACUNAM Foundation.

Environmental degradation is another concern. Guatemala is losing more than 10 percent of its forests annually, and habitat loss has accelerated along its border with Mexico as trespassers burn and clear land for agriculture and human settlement.

“By identifying these sites and helping to understand who these ancient people were, we hope to raise awareness of the value of protecting these places,” Hernandez said.

The survey is the first phase of the PACUNAM LiDAR Initiative, a three-year project that will eventually map more than 5,000 square miles (14,000 square kilometers) of Guatemala’s lowlands, part of a pre-Columbian settlement system that extended north to the Gulf of Mexico.

“The ambition and the impact of this project is just incredible,” said Kathryn Reese-Taylor, a University of Calgary archaeologist and Maya specialist who was not associated with the PACUNAM survey. “After decades of combing through the forests, no archaeologists had stumbled across these sites. More importantly, we never had the big picture that this data set gives us. It really pulls back the veil and helps us see the civilization as the ancient Maya saw it.”

Performances of Erasure and Hearing Seascapes set for 5-7pm on Feb. 8 at UC San Diego’s Qualcomm Institute

In spring 2017, UC San Diego music professor and former Qualcomm Institute (QI) composer in residence Lei Liang, and Falko Kuester, the institute’s professor of visualization and virtual reality, organized a unique seminar course on “Hearing Seascapes: A Collaborative Seminar on the Sonification of Coral Reefs.” It provided graduate students, primarily from the Music department, with an opportunity to develop multimedia projects to highlight the dangers facing coral reefs in many parts of the world.

Scene from Hearing Seascapes

Out of that seminar course emerged two performance-and-installation works accepted into the Qualcomm Institute’s Initiative for Digital Exploration of Arts & Sciences (IDEAS) 2017-2018 season. The two works will premiere simultaneously in QI’s Atkinson Hall on the UC San Diego campus. The immersive works include:

Erasure, an ambitious multimedia installation produced by a robust collaboration among three UC San Diego Music Ph.D. students: Jacob Sundstrom in Computer Music, Fiona Digney in Music Performance, and Anthony Vine in Musical Composition, together with Computer Science and Engineering Ph.D. student Vid Petrovic; and

Hearing Seascapes, which combines coral-reef imagery and audio data to generate sound based on the location and viewpoints of endangered coral reefs. The work was co-developed by Lauren Jones, a Music graduate student in Vocal Performance, and Music Ph.D. student Eunjeong Stella Koh, both at UC San Diego.

Hearing Seascapes will be staged in the SunCAVE virtual-reality (VR) room in the Immersive Visualization Lab, and Erasure in the Reconfigurable Media Lab, both on the first floor of QI’s Atkinson Hall. The works will premiere Thursday, February 8, 2018 from 5-7 p.m. in Atkinson Hall, followed by a public reception. The schedule:

5:00pm Erasure and Hearing Seascapes installations open to visitors

5:30pm Music Prof. Lei Liang introduces both teams of artists, followed by artist talks

6:00pm Both installations reopen for viewing; public reception begins in entry hall in front of the Calit2 Theater.

“Over the last several decades, coral-reef ecosystems have suffered significant impacts from both local and global factors,” said Koh. “We aim to convey an important message to the audience and illuminate data that show the declining health of ocean coral reefs.”

ERASURE

Through an interconnected network of three-dimensional (3D) photomosaic models of coral reefs and spatially and electronically processed percussion sounds, a metaphorical ecosystem forms and responds directly to human presence, and to the temporal history of that presence, throughout the work’s run.

Audio-visual cues label types of coral with unique sounds and painted overlays on top of 3D photomosaic models from the 100 Island Challenge.

Erasure responds negatively to human presence in the installation environment. As more and more people enter the space, the installation begins to break down: the sonic tapestry of percussion sounds contorts and particulates, the synthetic biome of coral visualizations begins to morph into unnatural forms, and the entire system mutates – all with the presence of the audience – and members of that audience are aesthetically confronted with their impact on these remote and fragile ecosystems. The transformation, however, is neither irreversible nor unidirectional. “As viewers leave, the piece rebounds, but more slowly than the rate at which it broke down,” explained Ph.D. student Sundstrom. “The system bounces back from the immediate and long-term human impact, and it reflects the resilience of the reefs to withstand and adapt to global shifts in climate and the ecosystem.”
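The asymmetry described above, fast breakdown under a crowd and slow recovery once it leaves, can be sketched as a disturbance value chasing a crowd-driven target at two different rates. This is an illustrative sketch only; the occupancy normalization and rate constants are invented for the example, not taken from the installation's code:

```python
def update_disturbance(d, occupancy, break_fast=0.5, recover_slow=0.05):
    """One simulation tick: disturbance rises quickly toward the crowd
    level, and falls back slowly once the room empties.
    Rates and the /10 normalization are hypothetical."""
    target = min(occupancy / 10.0, 1.0)
    rate = break_fast if target > d else recover_slow
    return d + rate * (target - d)

d = 0.0
for _ in range(10):        # a crowd of 10 enters the space
    d = update_disturbance(d, 10)
peak = d                   # near 1.0: system has broken down
for _ in range(10):        # the room empties
    d = update_disturbance(d, 0)
print(peak, d)             # rebound is much slower than the breakdown
```

The two rate constants directly encode Sundstrom's point: the piece rebounds, but more slowly than it broke down.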

The visual component of Erasure consists of 3D photomosaic models of coral reefs taken from the 100 Island Challenge, based at the Scripps Institution of Oceanography. These digital reproductions were created by the Challenge’s technical visualization advisors from the Qualcomm Institute’s Cultural Heritage Engineering Initiative (CHEI), jointly with Scripps Oceanography. By rigorously photographing and collecting data from reef sites and rendering the data into 3D computer models using custom software developed by CHEI’s Falko Kuester and Vid Petrovic, students from the Music department were able to observe the reefs from various angles, light levels, and distances.

“These coral reef constructions ebb and flow between their natural state—meticulously constructed synthetic ecosystems—and transformative states: from granulations of the stony corals and polyps into whirling cascades of particles, to fissions of vast reef colonies into splintered slabs that recede in and out of focus,” said Music’s Anthony Vine. “Lingering traces of the piece in its untouched state float among the remains, and the metaphorical ecosystem appears to be dreaming and longing to return to an undisturbed state.”

The sound-space of Erasure is created from a reservoir of percussion improvisations that reflect both the sounds that might be found in and around a coral-reef environment and poetic expansions that reach beyond the palette of oceanic utterances: scraped and struck limestone tiles, sweeping washes of hands streaked across a bass drum, and the murky drones of rolled bell plates. By manipulating these samples in a simple causal network, emergent behavior materializes to constitute a lush atmosphere of sound. In this way, the behavior of the sound world is not unlike the behavior found in reef ecology: masses of small units combining to create a complex and rich environment.

HEARING SEASCAPES

“Music can bring an image to life, and by giving a voice to the coral reefs, we can help the audience make an emotional connection to ecology and realize the fragility of these reefs,” said Lauren Jones. “The goal of this project is to create an immersive experience for the viewer that allows them to submerge themselves in the world of the coral.”

3D laser scan of coral reef featured in Hearing Seascapes.

Viewers use a joystick to ‘dive’ and explore the reef, controlling the location, viewpoint, depth and speed of navigation. Audience members hear different sounds that represent different species in the data set. Each species has its own specific personality, represented by different sounds. “Coral reefs are living, breathing organisms that are vulnerable to small changes of the surrounding environment and climate,” explained Stella Koh. “We assign the coral reefs a distinct personality by examining certain characteristics such as texture, habit, origin and growth. To convey messages through music, we’ve recorded an underwater dialogue of voices.” The installation was designed to stage a conversation between coral reefs and a fish maneuvering through the reefs, and the sound becomes louder or softer depending on how far the listener is from the coral.
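Loudness that falls off with the listener's distance from a sound source is commonly implemented as an inverse-distance gain, as in standard spatial-audio distance models. A minimal sketch of that idea (the function name and parameters are hypothetical, not the installation's actual code):

```python
import math

def gain_from_distance(listener, coral, ref=1.0, rolloff=1.0):
    """Inverse-distance attenuation: full volume within `ref` units
    of the source, then gain shrinks as 1/(1 + rolloff * excess)."""
    d = math.dist(listener, coral)
    return ref / (ref + rolloff * max(d - ref, 0.0))

print(gain_from_distance((0, 0, 0), (0, 0, 0.5)))  # inside ref: 1.0
print(gain_from_distance((0, 0, 0), (9, 0, 0)))    # far away: much quieter
```

In a navigable scene like the SunCAVE reef, such a gain would be recomputed each frame from the joystick-driven viewer position and applied per coral voice.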

Jones and Koh set for themselves three goals with Hearing Seascapes: to display experiments with different aspects of sound and innovative graphic design to create an enjoyable environment for the audience; to tell an effective and interactive story invoking concepts of adventure, imagination and humor to motivate people to recognize environmental health; and to create an inviting seascape with a synergy of voices, images, synthesized sounds and human emotion.

“Based on the notion of acoustic ecology, we want to bring out the positive aspects of sound in the ocean environment,” they noted. “We hope to highlight the importance of engaging in the soundscapes of coral reefs in hopes that our musical voice can make scientific results more accessible to society.”