Datavized is working on a social impact project that explores how virtual reality technology can be used to communicate insights from data faster and more effectively.

Data visualization has moved from simple charts and graphs to sophisticated interactive projects, and VR allows us to go further: adding a new dimension to the flat 2D screen and entering a world where the user can be fully immersed in 3D 360° data, with new navigation possibilities such as gaze, voice and gesture.

Immersive 2030 is focused on the UN SDGs and allows users to view global data sets on an interactive globe, both in virtual reality and online on any web-connected device.

Debra Anderson, Co-founder, Chief Strategy Officer at Datavized Technologies Inc: “Big data keeps getting bigger and we need better ways to keep up. The real world is in 3D but for decades we have viewed the digital world behind 2D screens. We are just beginning to scratch the surface of what’s possible to push the boundaries with data that matters.

Data visualization projects that use VR are usually restricted to looking around at the data in 3D space. We want to move beyond this and examine which practical applications of data are most suited to presenting insights when viewed in an immersive environment. This demo is a first step in a broader project.”

“Immersive 2030” is a Datavized Technologies Data Solutions for Good initiative designed to apply WebXR technologies as inclusive and connected data systems for sustainable development. The initiative aims to address the problem of data accessibility and to bridge gaps between data producers and users through accessible, web-based data presentation, analysis and visualization solutions. Through data-driven immersive communications tools accessible to all, we can make better decisions together to implement the Sustainable Development Goals. Immersive 2030 was recognized as a Finalist in the 2nd annual Global WebXR Hackathon.

News Release:

Morph – Make Art With Data

This Free Google-backed Tool Lets You Build Eye-Catching Graphics From Spreadsheets in Minutes Without Code or Design Skills

Morph is made by Datavized in partnership with Google News Initiative

Datavized and Google News Initiative today announce the online launch of Morph – a free and open source web browser tool to make animations, graphic design and interactive visualizations from data. Morph is made by Brooklyn-based software company Datavized in partnership with Google News Initiative. The software uses a generative algorithm to create graphics based on data from a spreadsheet, and the user designs by guiding the evolution of their artwork through simple taps or clicks.

Morph is built to be fast, fun and easy to use, but allows for output to industry-standard formats so that users can share their creations to social media or download them for use in professional design projects or presentations. A progressive web app, it allows users to install the app to their device directly from the browser. Morph works on mobile, tablet and desktop, and aims to bring data and design capabilities to a wider audience.

Hugh McGrory, Co-Founder and CEO at Datavized: “There’s a lot of great tools available for serious data analysts and scientists. We wanted to make something creative for non-technical people who are often intimidated by data and design software. Morph works great in a classroom setting where beginners can make artworks in minutes, but also professional users like it for the randomness and speed it offers them for rapid-prototyping ideas.”

Simon Rogers, Data Editor at Google: “We are working constantly to develop new tools to really push the boundaries of what journalists can do with data by creating a way that anyone can experiment with generative art, regardless of your level of experience with coding or design. The result is a beautiful tool that will help newsrooms really think about new and experimental ways to display numbers.”

Alberto Cairo, Knight Chair in Visual Journalism, University of Miami, and Consultant on Morph: “This is a great tool for individuals and organizations without the resources to hire in-house developers or design teams. Data-driven art projects usually require a lot of money, people and time to produce, but Morph now lets anyone create something great in minutes with free software, even if they only have a smartphone.”

Morph is the first of two web-based data tools Datavized has built in partnership with Google News Initiative. The second will be released in early 2019. Users who voluntarily choose to sign up on the Morph website will be given free Beta access to this new tool when it becomes available.

Datavized is also a recent winner of the Online News Association ‘Journalism360 Challenge’, announced in September 2018. The team will be presenting their WebXR technology this week in New York City at UN headquarters during the 73rd session of the UN General Assembly (UNGA 73) in the SDG Media Zone, September 24-28. To learn more about Datavized software tools, visit datavized.com.

About Datavized Technologies

Datavized Technologies, Inc. is a software company based in NYC specializing in data-driven products and services. Datavized develops 100% web-based tools that enable individuals and businesses to effortlessly create and distribute compelling digital content from data. The company’s suite of products includes immersive audio and visual applications that work cross-browser and cross-platform, including desktop, mobile, tablet and with virtual and mixed reality headsets.

About Google News Initiative

The Google News Initiative is Google’s effort to work with the news industry to help journalism thrive in the digital age. The GNI brings together everything Google does in collaboration with the industry—across products, partnerships, and programs—to help build a stronger future for news. It is focused on three objectives: elevate and strengthen quality journalism, evolve business models to drive sustainable growth and empower news organizations through technological innovation.

Electronic Press Kit

Key Links

Morph lets you make art from data. It’s free and open source, works 100% in the web browser and runs on desktops, tablets and phones. It’s designed to be fast and easy to use. It’s focused more on aesthetics than analytics. It takes away the fear of working with data and makes it fun.

Morph is a progressive web app built with JavaScript. Here’s a quick overview of how it got made – with original sketches, links to early demos and interface design storyboards.


ABOVE IMAGE: The Morph Editor allows for manual or random changes to the image.

Datavized has been working for three years reimagining how data can be represented beyond static charts and graphs. Our thinking goes like this…

In digital form, data is ones and zeros

Everything else in the digital world is also made from ones and zeros

This means that data as a raw material can be turned into any other media using code, without losing the integrity of the information

…it’s 21st Century alchemy.

We have focused primarily on WebXR to allow users to be fully immersed in 3D 360° experiences where they can walk around inside their data and use gesture or gaze for navigation. We’ve made serious things like this:

ABOVE IMAGE: Geometric by Datavized. A WebXR platform that allows users to experience their data interactively on all web-connected devices, including immersive virtual reality headsets, without writing code.

And fun things like this:

ABOVE IMAGE: WebXR Interactive Basketball Demo. Every shot attempted by every NBA player in the 2015-16 season, Datavized

Datavized CSO Debra Anderson teaches classes in WebXR and Data and is often asked by students which software to use to get started. Our goal is to open up data for everyone by providing simple tools. We want to make working with data ordinary. We want to bring the process of creating art from data to the place where photography is now; something you can do in seconds without thinking about it too much, something so easy that it’s not worth listing as a skill anymore since everyone can do it. Democratizing creativity makes things better for both amateurs and professionals: to strengthen the top of a pyramid, widen its base.

Simon Rogers from Google News Initiative allowed us to take a step back from virtual reality headsets and think exclusively about the 2D screens on the devices that the vast majority of people still use to connect to the web today, and to solve the problem of making art from data without asking the user to write any code. Our mentor throughout the process was Alberto Cairo (data journalist, information designer, author and teacher), who challenged and inspired us on weekly video chats. It’s worth mentioning here that we looked at some really great web-based data visualization tools like Flourish and RawGraphs, and would highly recommend these to users who are more interested in serious data analytics use cases rather than creative artistic expression.

From Concept To Art

At the start of the process we reached out to Belfast-based artist Glenn Marshall, who has spent decades using generative animation techniques inspired by science and nature to make art from algorithms. Glenn took the initial concept in two directions:

1. Darwinian Evolution

2. The Tree of Life

The user would drive a process of evolution where random mutation would be guided by human decision making. The final design or animation would result from aesthetic choices made each time a node was selected. The user directs everything based on how it looks.

“The algorithms themselves are not fixed. The user can randomly mutate, evolve and generate new algorithms creating new visuals, encouraging the sense of exploration and discovery. The algorithms are dynamic in this sense – the underlying code and math will actually be changed by the user, without them having to know any code – but rather interact with an intuitive UI to change parameters.” – Glenn Marshall

Glenn’s demos illustrated the concept from a generative art point of view and laid out a great framework to begin to map out the tool for coding.
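In code, the guided-evolution loop described above can be sketched simply: a design is a bag of numeric parameters that drive the generative drawing, each generation offers the user a handful of random mutations, and whichever candidate they tap becomes the next parent. This is only an illustrative JavaScript sketch (the language Morph is built with); the parameter names and mutation scheme are invented, not Morph's actual code.

```javascript
// The "genome" of a design is a set of numeric parameters.
// Mutation nudges each parameter by a small random amount.
function mutate(parent, strength, rand = Math.random) {
  const child = {};
  for (const key of Object.keys(parent)) {
    child[key] = parent[key] + (rand() - 0.5) * strength;
  }
  return child;
}

// Offer the user several mutated candidates of the current design.
function nextGeneration(parent, count = 4, strength = 0.2) {
  return Array.from({ length: count }, () => mutate(parent, strength));
}

// The user taps one candidate; it becomes the new parent and the
// cycle repeats -- aesthetics drive selection, not a fitness function.
let design = { petalCount: 6, twist: 0.5, hue: 0.1 };
design = nextGeneration(design)[2]; // pretend the user picked the third
```

Note there is no objective fitness function anywhere: selection is purely the user's aesthetic choice, which is what distinguishes this from classic genetic algorithms.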

We Code So You Don’t Have To

Datavized CTO Brian Chirls was challenged as follows: take the generative animation concepts as a guide and build a self-service tool that runs 100% in a web browser (on desktop, tablet and phone), without the ability to save anything to a server, and do all of these things:

Allow a user to upload their own data set

Allow review of the data

Automate mapping and grouping of fields and variables

Design multiple chart types

Allow the tree to evolve shapes using math

Make sure one variable never changes, to keep integrity of the data

Run and render everything interactively in realtime

Build an editor

Allow for export as still image or animation

Allow for export to social media
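One requirement in the list above deserves a concrete illustration: keeping one variable fixed to preserve the integrity of the data. The idea is that visual style parameters may evolve freely, but the value bound to a data column is never mutated, so the finished artwork still reads true. A minimal hypothetical sketch of that separation (names invented, not Morph's source):

```javascript
// Style parameters are free to evolve; the data-bound value is not.
function renderSlice(row, dataField, style) {
  return {
    value: row[dataField],        // locked: comes straight from the data
    angle: row[dataField] * 360,  // derived from the data, never mutated
    color: style.color,           // free to evolve
    wobble: style.wobble,         // free to evolve
  };
}

// Mutation only ever touches the style object, never the data row.
function mutateStyle(style) {
  return {
    ...style,
    wobble: style.wobble + (Math.random() - 0.5) * 0.1,
  };
}

const row = { country: 'Kenya', share: 0.25 };
const slice = renderSlice(row, 'share', { color: '#ff00aa', wobble: 0 });
// slice.value stays 0.25 no matter how far the style evolves
```

Separating the mutable style from the immutable data binding is what lets the artwork mutate endlessly while the underlying numbers remain trustworthy.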

There were a lot of leaps required, in both coding and imagination, to take Morph from an artistic prototype to a usable tool. The full list of hacks and workarounds is too long to delve into here but one area merits an explanation…

Q: How do you render the visual elements of the interactive tree in realtime in a mobile web browser? Let’s break the problem down like this – to run on a phone we need to keep things under a few hundred interactive objects. Imagine a user keeps generating pie charts until there are 100, of which you can see maybe 50 on the screen at any time depending on how zoomed in or out you are. If each slice of each pie chart is a separate interactive object then we quickly get to thousands of objects to be rendered in the browser simultaneously, which may work on a high-end desktop but certainly not on a phone.

A: At any given moment we get a list of all of the objects visible on the screen. Each time we render a chart (leaf) in the tree, we take a snapshot of it at a series of different resolutions. When a user pans around, it’s these snapshots that they see until they zoom in. The act of zooming in makes fewer objects appear on the screen (by moving some out of view) until they arrive at one chart (leaf), and then we magically convert back from a high-resolution snapshot into separate interactive slices. Zooming out reverses the process and converts slices back into snapshots.
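The snapshot approach is essentially a level-of-detail scheme: while zoomed out, each chart is drawn from a pre-rendered snapshot at the nearest captured resolution, and a chart is "hydrated" back into live interactive slices only once it is effectively alone on screen. Here is a rough sketch of the selection logic, with invented thresholds and snapshot sizes (not Morph's actual implementation):

```javascript
// Each chart is either a flat snapshot (cheap) or a set of live
// interactive slices (expensive). Snapshot resolutions are captured
// at render time; sizes and thresholds here are illustrative.
const SNAPSHOT_SIZES = [64, 128, 256, 512]; // pixels

function snapshotFor(screenSize) {
  // Smallest captured resolution that still covers the on-screen size.
  return SNAPSHOT_SIZES.find(s => s >= screenSize)
      ?? SNAPSHOT_SIZES[SNAPSHOT_SIZES.length - 1];
}

function representChart(chart, zoom, visibleCount) {
  const screenSize = chart.baseSize * zoom;
  // Hydrate to live slices only when one chart fills the view.
  if (visibleCount === 1 && zoom >= 4) {
    return { kind: 'live', objects: chart.sliceCount };
  }
  return { kind: 'snapshot', resolution: snapshotFor(screenSize) };
}

const pie = { baseSize: 40, sliceCount: 12 };
representChart(pie, 1, 50); // zoomed out among 50 charts: a snapshot
representChart(pie, 4, 1);  // zoomed all the way in: live slices
```

The payoff is that the interactive object count stays bounded by whichever single chart is hydrated, instead of growing with the total number of charts in the tree.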

Buttons

The interface design for Morph presented significant challenges, since the user has a lot of options to control the visualization creation process and all of these steps need to be actionable across devices, from phones and tablets to desktops. The first big decision was where to put everything.

For users on a phone held vertically, the menu has to appear only when needed, expanding and collapsing on demand.

Colors

Now that we had a rough idea of where to put the buttons, the conversation moved to the look and feel. Morph had a different use case from other visualization tools, with a goal to be playful and fun, but it still looked quite serious and scientific, with a clean white background and muted colors. We talked with a few early test users – our mentor Alberto Cairo, Simon Rogers from Google, UI/UX Designer Beth Wegner, and even the kids in a 6th Grade Class in Manhattan – and the response was unanimous: everyone wanted bright colors on a black background.

Formats

Morph runs 100% in the web browser without saving anything to a server, but is coded to allow for export to a range of standard file formats.

Datavized is a Winner of the 2018 Journalism360 Challenge

Thu, 13 Sep 2018

We are pleased to announce today that Datavized Technologies is a winner of the 2018 Journalism360 Challenge! The challenge, which supports projects that seek to help develop and expand best practices in immersive journalism, including virtual, augmented and mixed reality, is a joint initiative of the John S. and James L. Knight Foundation, Google News Initiative and Online News Association. Our project, the “Interdimensional Audio Editor”, led by Datavized CEO Hugh McGrory, is being funded with a $20,000 grant. The Interdimensional Audio Editor is an intuitive tool that allows journalists to quickly and easily assemble and share sound, exporting it as stereo (2D) for traditional use, such as in video storytelling, or as spatialized (3D) audio for virtual, augmented and mixed reality. We look forward to sharing our process and learnings as we develop this audio tool in the coming months. If you are not already signed up to our newsletter, sign up to stay informed about our upcoming product releases, including opportunities to access tools in beta. Congratulations to all the winners! Find out more about the initiative and winners in the press release below.


WASHINGTON, D.C. — Eleven projects led by creators from around the world that seek to help develop and expand best practices in immersive journalism, including virtual, augmented and mixed reality, will receive a share of $195,000 as winners of the 2018 Journalism 360 Challenge. The challenge is a joint initiative of the John S. and James L. Knight Foundation, Google News Initiative and Online News Association.

The 11 winning projects were selected from more than 400 applications that addressed the question: How might we experiment with immersive storytelling to advance the field of journalism? Each winner will receive grants ranging from $4,600 to $20,000 to test, refine and build out an early-stage idea.

Winners will cover complex issues such as privacy and surveillance, race issues and domestic violence in new ways. Reflecting emerging trends and challenges in immersive storytelling, the projects will explore best practices in key areas such as audio tools, volumetric tools and on-the-go production.

Journalism 360 opened the call for ideas in May 2018. Now in its second year, the challenge helps to fulfill Journalism 360’s mission to support news organizations and individuals to explore, learn and share new ways to use immersive storytelling.

“The winning projects focused on advancing new models and techniques in immersive storytelling. They will work to innovate journalism — providing news organizations and reporters with insights into new forms of storytelling and audience engagement techniques that can shine a light on the issues that matter most to communities,” said Paul Cheung, Knight Foundation director for journalism and technology innovation.

“We’re proud to support the second class of Journalism 360 grant winners. We’re eager to see these projects brought to life and become points of reference and education for the rest of the journalism industry,” said Erica Anderson, U.S. Partnerships Lead, Google News Initiative.

“The diversity of voices and projects represented by this year’s challenge participants provides an optimistic outlook for journalism. We heard from creators around the world who are expanding the boundaries of immersive storytelling and experimenting with more varied tools, with some building on inspiration from the 2017 challenge winners. We’re also excited to see journalists pitching innovative ways to help people in their communities use immersive technology to tell their own stories,” said Laura Hertzfeld, Director of Journalism 360.

Bristol, UK, March 21, 2018 – The United Nations Environment Programme (UNEP), Datavized Technologies, Brookline Interactive Group (BIG) and The Public VR Lab are pleased to announce the public launch, online and in virtual reality, of “There’s Something in the Air”, a VR data visualization experience exploring air pollution data around the globe over time. The public launch coincides with the official launch of Datavized’s closed beta platform at the Data for Development Festival, the inaugural gathering of the Global Partnership for Sustainable Development Data, in Bristol, March 21-23. The experience will be on view in the VR Data Play Space, along with access to Datavized software tools, at both the festival and the Bristol Data Dive on March 23.

The visualization, powered by Datavized WebVR software, was originally presented by the Public VR Lab’s team at the third session of the United Nations Environment Assembly, “Towards a Pollution-Free Planet”, of the United Nations Environment Programme (UN Environment) in Nairobi, Kenya, December 4-6, 2017, to 800-1,000 UNEA delegates, volunteers, NGOs, students, businesses, activists and world leaders, helping them to experience and understand environmental data stories in a new way.

The collaborative project was spearheaded by the UNEP, Datavized Technologies, Brookline Interactive Group, The Public VR Lab and The EcoLearn Project to demonstrate how VR can create a paradigm shift toward a more hands-on, visceral understanding of environmental issues through immersive data storytelling and the physical sense of presence and increased empathy that VR provides. “There’s Something in the Air” presents statistical estimates of air pollution based on fine particulate matter (PM2.5) and mean annual exposure by country, every five years from 1990 to 2010 and yearly between 2010 and 2015. Data sources include the Health Effects Institute – State of Global Air.

“There’s Something in the Air” features a customized visualization of the Datavized software tools scheduled to be released publicly in 2018. Datavized immersive visualization technology, built on the WebVR API, gives users powerful, easy-to-use, three-dimensional geospatial templates for mapping global, national and city data visualizations. The startup, headquartered in New York, is launching its closed beta program working with pilot partners in industries including government, business, education, transportation, mapping, statistics and sustainable development. Business and individual users can sign up to request access to the beta platform through datavized.com.

The experience is viewable on any connected device at https://demo.datavized.com/somethingintheair, including mobile, desktop and tablet, and in VR through WebVR browsers on headsets including the HTC Vive, Oculus Rift, Samsung Gear VR, Google Daydream, Google Cardboard and Microsoft Windows Mixed Reality headsets. For details on WebVR browsers and supported platforms, visit webvr.rocks.

Quotes

“This experience demonstrates how big data and VR can be used together to create an immersive environment for increased understanding and enhanced communication of real world challenges. We are delighted Datavized geospatial software products and mapping technologies are being used as powerful tools for environmental education, awareness and impact.” Debra Anderson, co-founder and chief strategy officer, Datavized Technologies Inc.

“For us, it was amazing to collaborate with UNEP and Datavized on this project and put our VR-in-the-Public Interest Tools to good use to create a customized global air pollution VR experience for UNEA,” shared Kathy Bisbee, co-founder and executive director at the Public VR Lab.

“The experience we created represents the new frontier in environmental education, uniting the possibilities of new technologies with the urgency of crises such as air pollution. The leaders exposed to this tool gain a new understanding of the issue, and also a new understanding of how emerging technologies can add a vital dimension to education.” Nir Darom, lead creative designer at the Public VR Lab.

“Datavized has been coded from the ground up for optimal performance across devices. We gave the same attention to the mobile web experience as we did to room-scale VR to allow users to move seamlessly from 2D to 3D, and doing this involved solving a series of technical challenges that would not have been possible using a game engine.”

“We are excited to see how people take advantage of another dimension with their data visualizations. We are keen to learn from our Beta users and will use the feedback to guide further development. The platform is initially limited to global datasets with geolocation but you can expect to get more updates from us this year that extend the reach of our tools.”

Hugh McGrory, CEO/Co-Founder, Datavized

Priorities as a partner of the Global Partnership for Sustainable Development Data

“As a leader in immersive visualization technology, Datavized will provide platform tools for monitoring and visualization of sustainable development goals data. Global, country, and city geospatial templates will address the needs of developing countries and provide global partners with capacity-building tools to map datasets on the open web and strengthen interoperability standards and protocols. Datavized cross-platform geospatial data and analysis tools will allow new ways of experiencing data through accessible web-based collaboration, and contribute to innovative data adoption and methodologies using VR, augmented reality and mixed reality to help better understand and communicate the most pressing issues.

To achieve the goals, from building sustainable cities and communities to economic growth and climate action, Datavized immersive web visualization tools will build new behaviors in users, moving from merely analyzing data to experiencing information in more visceral ways to better understand the complexities, ask better questions and make better decisions. Datavized is committed to sharing our expertise and solving real-world problems to support the Global Partnership for Sustainable Development Data.”

Debra Anderson, Co-founder, Chief Strategy Officer, Datavized

About Datavized Technologies

Datavized is an immersive visualization platform that makes it easy to turn complex data into fully interactive web experiences. Datavized’s geodata software products provide users with web-based drag and drop tools to effortlessly turn location data in spreadsheets into fully interactive 3D maps for enhanced spatial analysis, visualization and decision making. Datavized works on all platforms and connected devices; including desktop, mobile, tablet, and with virtual reality, augmented reality and mixed reality headsets, enabling users to tell immersive data-driven stories. Datavized Technologies Inc is currently in closed beta and is headquartered in New York. datavized.com

About Brookline Interactive Group (BIG):

Brookline Interactive Group (BIG) is an integrated media and technology education center and a community media hub for Brookline, MA and the region. BIG facilitates diverse community dialogue, incubates and funds hyperlocal storytelling, arts, journalism and technology projects, and serves over 500 youth and adults annually through innovative classes and partnerships. BIG offers extensive multimedia training, VR, AR and 360-video cameras and training, access to high quality filmmaking equipment, production grants and artists’ residencies, and provides low-cost professional media services to non-profit organizations, education partners, businesses and local government.

About The Public VR Lab

The Public VR Lab, a project of Brookline Interactive Group, is building a global network for a Community VR movement that facilitates public dialogue; provides professional training; empowers community knowledge and creation of 360, virtual and augmented content; offers access to tools, headsets, arcades, toolkits, and professional expertise; and generates locally-focused, broadly impactful, XR experiences in the public interest. www.publicvrlab.com

About EcoLearn

EcoLearn is an educational research group at the Harvard Graduate School of Education that explores the use of immersive technologies to support learning about the complexity of ecosystems. EcoMUVE and EcoMOBILE are two products that are freely available for download and use. EcoMOBILE uses mobile devices and augmented reality to infuse real environments with digital resources that engage, inspire and educate people about the complexities of the natural systems that sustain us. ecolearn.gse.harvard.edu

Datavized is a VR startup with a data visualization platform designed to display across all devices using WebVR. Read full article here.

Datavized at UN Environment Assembly

Sun, 03 Dec 2017

We’re excited to participate this week at the United Nations Environment Assembly of the United Nations Environment Programme in Nairobi, Kenya, to demonstrate how our immersive visualization software tools can be used to address the most pressing environmental issues. “There’s Something in the Air”, a VR experience exploring air pollution and global data over time, powered by Datavized, launched the Assembly’s “Towards a Pollution-Free Planet” program on December 3rd at the UN Science-Policy-Business Forum on the Environment, in the lead-up to the third session of the UN Environment Assembly, December 4-6. The project, in partnership with the Public VR Lab, a Datavized beta partner, will also be showcased in the Assembly’s Sustainable Innovation Expo, December 4-6 from 9am-5pm, a platform presenting innovative technology solutions that address the world’s environmental challenges while protecting the ecological boundaries of the planet for future generations.

Check our blog in January for additional updates on the project. To become a partner and access our software tools in closed beta, contact us.

]]>https://datavized.com/2017/12/03/datavized-at-un-environment-assembly-nairobikenya/feed/0Datavized at Made in NY Talk Serieshttps://datavized.com/2017/12/02/datavized-at-lehman-college/
https://datavized.com/2017/12/02/datavized-at-lehman-college/#respondSat, 02 Dec 2017 22:06:42 +0000http://datavized.com/?p=1072On December 5, 2017 Datavized Co-founder and CEO Hugh McGrory will join NYC Media Lab, The Knowledge House and A+E Networks on a “Breaking inot the VR/AR Industry” panel, a Made in NY Talk series hosted by NYC Mayor’s Office of Media & Entertainment presented in partnership with CUNY Lehman College. Photos of the event can be found here, and video will be shared soon.