Three Tandon Teams Debut VR/AR Projects from NYC Media Lab Program

If you’re looking for the next big idea in emerging media, such as virtual and augmented reality (VR/AR) and artificial intelligence (AI), there is no better place to look than universities like NYU. This reality is one that NYC Media Lab understands quite well. Tapping into New York’s student talent pool, the organization helps students develop prototypes that will disrupt and advance sectors such as media, technology, healthcare, retail, advertising, and more.

“Through our Verizon Connected Futures program, we have a dozen new ideas at various stages of realization,” Justin Hendrix, Executive Director of NYC Media Lab, shared at the Demo Day. “We’re experiencing what’s possible right now with new technology.” The 12 teams designed products and services that span various industries, including education, transportation, retail, music, psychology, and more.

Driven by a passion for innovation and for using technology to serve society, three NYU Tandon student teams presented their prototypes to a rapt audience of industry experts at the Demo Day.

CitytravelAR

Baris Siniksaran and Subigya Basnet with their mobile app CitytravelAR

Navigating the New York City subway system can be difficult, so the team behind CitytravelAR created an augmented reality wayfinding platform that lets you use your smartphone to easily get around the city. “Using subway signage as AR markers, users would be able to not only find out the best train route for their journey, but also learn about local communities in and around the station area,” Subigya Basnet said. Basnet and his teammates, fellow Integrated Digital Media graduate students Vhalerie Lee and Baris Siniksaran, hope their application can contribute to building smart cities, using AR technology to merge commuting and community. “For all of us, NYU Tandon gave us the platform to begin this dive into VR/AR production and development,” Basnet shared; he, Lee, and Siniksaran have all created projects like the “Game of Thrones” AR narrative experience and WAVR. “The opportunity to experiment with our ideas encouraged us to take up opportunities, like the Verizon Connected Futures Challenge.”

ARSL

Heng Li and Mingfei Huang demonstrate ARSL

Computer science students Zhongheng (Heng) Li, Jacky Chen, and Mingfei Huang debuted their mobile app, which provides real-time sign language interpretation to help people communicate with each other more easily. Combining computer vision, cloud computing, and AR, ARSL uses a smartphone’s camera to capture sign language and translate it into the other user’s native spoken language, and instantaneously records spoken language and translates it into sign. “This accessible solution empowers people to get connected,” Li said, adding that his team shares a passion for using emerging tech for social good. For the millions of people around the world who are deaf or hearing impaired, this app could help them book appointments, explain their symptoms to doctors, or ask for directions, all through a translator in the palm of their hands.

Vrbal

Olivia Cabello, far right, was part of the multi-school team Vrbal

Olivia Cabello, a graduate student in Integrated Digital Media, has extensive experience in human-centered design, which she brought to the interdisciplinary, multi-school team that developed Vrbal. The VR training experience allows people with social anxiety or communication disorders to practice for situations such as public speaking and interviews. Using smart, adaptable technology such as IBM’s Watson, Vrbal helps users gain confidence and comfort in those situations through personalized environments and an interactive AI assistant that guides them along. Cabello drew on her user experience background to design Vrbal’s interface.

Camila Ryder
Graduate School of Arts and Science
Master of Arts in English Literature, Class of 2018