During the summer of 2016, I designed the initial mobile concept for Eyewire, a game to map the brain played by over 200,000 people in 150 countries.

As one of the world’s largest and fastest-growing citizen science projects, Eyewire (founded by neuroscientist Sebastian Seung) is a web game that enlists players to map neurons in 3D, making up for the shortfalls of current AI reconstruction.

In designing Eyewire Mobile, I was challenged to bring the game to a wider audience through streamlined mobile gameplay. To do so, I synthesized the needs of casual mobile players with those of the neuroscience community, creating a game concept that takes advantage of new reconstruction methods to alleviate data-creation bottlenecks.

My two-month process began with exploratory interviews and concluded with a fully interactive prototype.

Final Design

Misty the AI, illustrated by Tyler Scagliarini

Hello, Player! Meet Misty.

In order to learn more about the brain, scientists need to turn thousands of cell slides into 3D maps. Misty is an AI who’s good at finding the “segments” of brain cells but often makes mistakes when trying to piece them into whole cells. That’s where you can help. Tell Misty when it’s wrong by shooting segments that don’t fit. Join thousands of players on a quest to map the brain.

The Game



Character as feedback

Misty provides a friendly face, and its reactions build camaraderie with the player



Yes or no

User testing showed that yes/no is the best judgment mechanism



Shoot-’em-up Tetris

Throwback gameplay balances complex science



60 seconds

Rounds easily fit into pockets of spare time



Gestures

Swipe left and right to rotate. Swipe down to increase speed. All accessible with just the thumb.



Segment Accept Delay

A 2-second grace period after “touching” a segment, before auto-accept, improves judgment accuracy
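The grace-period mechanic can be reduced to a few lines of state plus timestamps. This is a hypothetical sketch, not the game’s actual code; `GRACE_MS` and all names are my own:

```javascript
// Hedged sketch of the segment-accept delay: a touched segment only
// auto-accepts once the 2-second grace window elapses without the player
// shooting it. Timestamps are passed in so the logic stays testable.
const GRACE_MS = 2000;

function touchSegment(now) {
  return { touchedAt: now, shot: false };
}

function shootSegment(segment, now) {
  // Shooting inside the grace window rejects the segment before auto-accept.
  if (now - segment.touchedAt < GRACE_MS) segment.shot = true;
  return segment;
}

function isAutoAccepted(segment, now) {
  return !segment.shot && now - segment.touchedAt >= GRACE_MS;
}
```

Injecting `now` instead of calling the system clock keeps the rule easy to unit-test and to tune against playtest data.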

The Application



Visual Design

Iconography and visual system adapted from Eyewire’s web interface for consistency and legibility



Front page design

Focus on starting gameplay, with minimal distractions



Profile Elements

User research revealed that players highly value badges (a current feature) and rank

Explore Eyewire

Imagine studying human anatomy without knowing where the heart is or what our skeleton looks like. That’s the state of neuroscience right now. We still know very little about the structure of brain cells (neurons) and their millions of connections (synapses).

Using electron microscopes, neuroscientists have recently been able to produce cross-sectional “cube” images of brain tissue with every neuron and connection visible. To reconstruct these neural circuits would allow scientists to test theories about how the brain works. But doing so is like tracing individual noodles through an 80-million-strand bowl of spaghetti.

Reconstructing neuron cell structure from electron microscopy

Credit: Alex Norton, Eyewire

Fortunately, AI programs can do most of the heavy lifting. But computers still need human help because humans remain far better at seeing patterns. In 2012, Seung launched Eyewire, an online game that asks the public to reconstruct neural networks by connecting computer-recognized “segments.” The tracing work of thousands of players forms accurate models through consensus.

Screencap of the Eyewire web game, showing AI-generated segments being connected by the player

Credit: Seung Lab

Research: What does science need from Eyewire Mobile?

Recent advances in AI technology mean that computers can now “trace” as well as Eyewire’s best players by rating the likelihood that detected segments fit together and “accepting” segments that reach a certain confidence threshold.

Demonstration of AI “autospreading” feature, which traces segments together automatically based on affinity scores

Credit: Ignacio Tartavull, Seung Lab
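The thresholding idea described above can be sketched in a few lines. This is an illustration only; the 0.95 cutoff and all names are my own assumptions, not values from the lab:

```javascript
// Illustrative triage of AI affinity scores: segment pairs above the
// confidence threshold are auto-accepted, the rest are queued for human
// review. The threshold value and data shape are assumptions for this sketch.
const ACCEPT_THRESHOLD = 0.95;

function triageSegmentPairs(pairs) {
  const accepted = [];
  const needsReview = [];
  for (const pair of pairs) {
    (pair.affinity >= ACCEPT_THRESHOLD ? accepted : needsReview).push(pair);
  }
  return { accepted, needsReview };
}
```

The interesting design consequence is the `needsReview` queue: everything below the threshold is exactly the material a mobile reaping game would feed to players.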

I discovered through talks with Eyewire’s Director and game administrators that the bottleneck is now in the checking process, or “reaping” in game parlance. When piecing segments together, players and AI alike are prone to marking segments of adjacent cells as part of the same cell (creating “mergers”), so someone must review the reconstruction and correct these errors.

Manually reviewing an AI-generated reconstruction and identifying a segment that’s actually part of an adjacent cell

Credit: Ignacio Tartavull, Seung Lab

After speaking with game admins and top players who use the current reaping interface, I discovered that the biggest challenges related to reaping are:



Speed

It takes 30 paid admin hours to reap each neuron ($$)



Non-scalability

The process requires mastery of a complicated interface built on experience and expertise; only admins and a handful of top players currently participate

Based on my findings, I decided that the goal of Eyewire Mobile was to open the current bottleneck by transforming the process of “reaping” into a game that anyone could learn.

PROJECT GOAL

Eyewire Mobile transforms “reaping” into a game that anyone can learn.

Research: What do players want from Eyewire Mobile?

I then interviewed some of Eyewire’s top players to discover what motivates them to play. With this knowledge, I created frameworks for developing the components of Eyewire Mobile with a sensitivity to player motivations.

Design Process

My design process consisted of observation, research, brainstorming, prototyping, and gathering user feedback. Here’s a peek into the process I used to develop Eyewire Mobile’s core game mechanics and information architecture.

I spoke with Ignacio, a computational neuroscientist at the Seung Lab, who shared some insights from the lab that could be repurposed as gameplay mechanisms.



Subtract > add

It is demonstrably easier for people to remove extra segments than it is to fill in gaps



Binary choices are a good thing

Yes/no checking strategies akin to Tinder (where the AI suggests matches and the human agrees/disagrees) have been observed to be up to 5X faster than Eyewire’s current tracing method

Ignacio’s beta “Tinder Mode” where users are shown two segments at a time and decide yes or no

Credit: Ignacio Tartavull, Seung Lab

In order to better understand user scenarios for mobile gaming, I did some secondary research on gaming trends.



74%

of mobile game users play to kill time while waiting (on public transport, in line, etc.)



77%

of the most popular mobile games are Casual/Social or Puzzle/Board games

Observing mobile phone users on the Boston MBTA

At the same time, I used my daily commutes to observe how people use their mobile phones, taking note of body positions, surroundings, and behaviors.

From there, I defined the following user and game profile. Since the game is meant to be accessible to a general audience, I focused my use cases around time, gestural ability, and attention span instead of specific user needs. I then thought about how the game could optimize for those situations.

Player profile



In Public

either in transit (subway, bus) or waiting (in line, for appointment)



One free hand

while the other one holds bags, children, and subway handles



Easily distracted

by arrival at their transit stop/appointment, or other factors in their environment

Game profile

Quick

One-handed gestures

Casual and Simple

Focused, simple game rounds with an uncomplicated storyline

Based on these parameters, I jumped straight into competitive research, analyzing and brainstorming game flows, using frameworks like Bartle’s Taxonomy of Players to rapidly ideate.

Researching and analyzing similar games (Tetris, Prune, Space Invaders, Scoops), brainstorming game flows, and ideating game mechanisms.

I quickly realized that the unusually technical nature of Eyewire’s game material made it crucial to use real 3D neurons in the prototype to get accurate feedback. So I set up a prototyping strategy that used Framer.js for touch controls, interface animations, and mobile mirroring; three.js to handle the neuron meshes; and Sketch-to-Framer imports to iterate interface layouts. With this setup, I was able to quickly produce realistic prototypes.

Using Framer and three.js to experiment with game modes, gestures, and beyond.

Using Sketch to iterate game controls and information interface

Conducting user tests
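As one small example of how the prototype’s pieces fit together, the swipe-to-rotate mapping can be reduced to pure math that a three.js render loop would then apply to the neuron mesh. The constants and names here are illustrative guesses, not values from the actual prototype:

```javascript
// Maps a swipe gesture to game state: horizontal swipe distance rotates the
// neuron about its vertical axis; downward swipe distance raises playback
// speed. In the prototype, this state would drive a three.js mesh each frame.
// RADIANS_PER_PIXEL and SPEED_PER_PIXEL are illustrative, not tuned values.
const RADIANS_PER_PIXEL = 0.01;
const SPEED_PER_PIXEL = 0.005;

function applySwipe(state, dx, dy) {
  return {
    rotationY: state.rotationY + dx * RADIANS_PER_PIXEL,
    // Only downward swipes (positive dy) increase the speed.
    speed: state.speed + Math.max(0, dy) * SPEED_PER_PIXEL,
  };
}
```

Keeping the gesture math separate from the rendering made it cheap to retune sensitivities between user-testing sessions.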

Next Steps

In the short run, these are the most important design problems for Eyewire Mobile.



Onboarding

How do we initially explain the science and context behind Eyewire? How can we teach players to differentiate a merger from a legitimate segment through a clear, well-designed tutorial?



Feedback

Unlike regular games, Eyewire can’t give immediate feedback because points and accuracy are determined retroactively by consensus. Given this limitation, how can we design feedback loops that are understandable, timely, and motivating?
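To make the constraint concrete, retroactive consensus scoring might look something like this. This is a hedged sketch of the general idea; the simple majority rule and all names are my assumptions, not Eyewire’s actual scoring:

```javascript
// Sketch of consensus-based scoring: a segment's final verdict is the
// majority of player judgments, and a player's accuracy is their agreement
// rate with those verdicts, computable only after enough votes arrive.
function consensusVerdict(votes) {
  // votes: array of booleans (true = "segment belongs to this cell")
  const yes = votes.filter(Boolean).length;
  return yes > votes.length / 2;
}

function playerAccuracy(judgments, verdicts) {
  // Parallel arrays: one player's calls vs. the eventual consensus verdicts.
  const agreed = judgments.filter((j, i) => j === verdicts[i]).length;
  return agreed / judgments.length;
}
```

The delay is built in: `playerAccuracy` can only run once other players have voted, which is exactly why immediate per-round feedback is hard to design.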

In the long run, the game should also explore creating a sense of community among players and conveying how each user makes tangible contributions to science.

Reflections

Eyewire Mobile was one of the most challenging and rewarding projects I’ve ever tackled. Eyewire’s raw material was a huge challenge, made harder by the fact that I’d never delved seriously into game design. On the prototyping side, I spent a good chunk of time learning three.js and figuring out how to use it with CoffeeScript in Framer.

Shoutout to my brilliant mentor, Alex Norton, for pushing me to learn more about problem solving and prototyping in two months than I ever thought I could.