Aira

Aira is an on-demand, autonomous passenger drone service. The Aira app allows users to book electric vertical take-off and landing (eVTOL) vehicles, and provides clear multi-modal booking and navigation features to seamlessly guide users to vertiports and on to their final destination.

OVERVIEW OF PROCESS

PROCESS IN NUMBERS

DESIGN PROCESS & CONCEPT VIDEO

DISCOVER

OVERVIEW

For our project, we decided to focus on the opportunity area of commercial autonomous aerial vehicles as a means of multi-modal transportation. These “Air Taxis” utilise electric vertical take-off and landing (eVTOL) vehicles to easily manoeuvre cityscapes while reducing noise and air pollution. Through our user research, we wanted to understand the relationship between lengthy commuting and quality of life, in order to uncover commuters’ needs and pain points. We then used the insights gained to develop a set of UX principles to inform a solution that would improve not only users’ commuting experience but, beyond that, their lives as a whole.

RESEARCH METHODS

SYNTHESIS METHODS

UX PRINCIPLES

Be Reliable - Don’t make me late with delayed or cancelled services or unexpected changes.

Time is Precious - Get me where I need to be as quickly as possible so I can spend more of my time doing things that I enjoy.

Manage Expectations - If there are any changes to my commute, update me immediately so I can alter my plans if required.

Value Me - Some form of reward or compensation will make my commute more enjoyable and increase my patience.

Respect Me - I deserve to be comfortable and treated with respect throughout my commute.

REFRAMING THE PROBLEM

DESIGN

OVERVIEW

We then began rapidly ideating a number of concepts before narrowing in on a select few and creating user flows, user stories and storyboards to define user requirements. We felt that the three chosen concepts had enough breadth for us to experiment with a variety of user interfaces and interactions, which would later inform our design choices for our final chosen concept.

AiraSolo — An on-demand autonomous passenger drone service.

AiraShare — An on-demand autonomous rideshare drone service.

*Although AiraScenic does not directly address the problem identified in our research, we saw an opportunity to learn by designing for a different problem space during the design round.

Overview of user flows, user stories and storyboards developed for each concept.

OVERVIEW OF DESIGN PROCESS (TESTING IN LOW FIDELITY AND REFINING)

INITIAL WIREFRAMES

AiraSolo Initial Wireframes

AiraShare Initial Wireframes

AiraScenic Initial Wireframes

PAPER PROTOTYPES V1

AiraSolo

AiraShare

AiraScenic

USER TESTING ROUND 1

Think Alouds, Interviews, Observation

4 Participants

After testing each paper prototype, we decided to continue iterating on AiraSolo and AiraShare based on our users’ feedback and recommendations.

REFLECTION

This was our initial round of think aloud testing, in which we tested a variety of interface styles across the different concepts to gain a high-level understanding of what types of interfaces users found intuitive. We conducted relatively unstructured think aloud tests with only a small number of set tasks, placing emphasis on trialling the concepts with participants. Each test was followed by a short interview to understand not only participants’ pain points but also what they wanted to see. After synthesising the results from our think aloud testing, we decided to eliminate AiraScenic, as we felt we had utilised this concept as far as we could to experiment with different interactions and interface styles.

PAPER PROTOTYPES V2

AiraSolo

AiraShare

USER TESTING ROUND 2

Think Alouds, Interviews, Observation, A/B Testing

5 Participants

By comparing screens across the two different interfaces through A/B testing, we were able to understand the differences between a universal card-based interface and an innovative circular interface.

REFLECTION

This round of usability testing was a lot more streamlined as we were only testing two concepts. The goal was to test these two prototypes, which had similar functionality but vastly different interface styles, to learn how users interact with the different interaction models and how to apply them in the next round of iteration. We did this by conducting A/B testing on the screens with overlapping functionality, e.g. booking, route overview and navigation. AiraShare focused on a more universal card system, whilst AiraSolo explored a circular interface with an emphasis on its compass wayfinding functionality. We created a set of concrete tasks and a series of think aloud tables to make comparative analysis easier. By comparing the consistent screens across the two interfaces, we were able to understand users’ mental models and explore bringing more intuitive card features into AiraSolo to make content more digestible.

DIGITAL PROTOTYPES V3

We scaled up from paper prototypes to digital prototypes in this round, as we found it difficult for participants to understand the circular interface on paper. As we iterated this interface we ended up creating two versions of the low-fidelity digital prototype; because the circular design was so unusual, there weren’t really any existing interfaces or familiar systems we could reference.

AiraSolo V3A

AiraSolo V3B

USER TESTING ROUND 3

We used a variety of qualitative and quantitative methods to learn which aspects were most successful. Whilst there were similarities between the two versions, it was clear from our results that V3B was the more intuitive interface and had greater potential.

REFLECTION

Since the circular design was so unusual, there weren’t really any existing interfaces or familiar systems we could refer to, which is why we chose to iterate two variations. We used a variety of qualitative and quantitative methods to learn which aspects were more successful: in addition to the think aloud testing, we used the SUS and UEQ questionnaires to collect quantitative results that supported our qualitative think alouds and aided our A/B analysis. Whilst there were similarities between both, it was clear that V3B was the more intuitive interface and had greater potential. V3B demonstrated a more cohesive interface by combining card elements with our circular layout, which helped present information more intuitively than V3A.
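For reference, the SUS questionnaire has a fixed scoring formula: ten 1–5 Likert items, with odd-numbered items positively worded and even-numbered items negatively worded, summed and scaled to 0–100. A minimal sketch of that calculation (the function name is ours, and it assumes one participant’s raw responses as a list):

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from ten
    1-5 Likert responses for a single participant.

    Odd-numbered items (index 0, 2, ...) are positively worded,
    so they contribute (response - 1); even-numbered items are
    negatively worded, contributing (5 - response). The summed
    contributions (0-40) are scaled by 2.5 to the 0-100 range.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for i, r in enumerate(responses):
        total += (r - 1) if i % 2 == 0 else (5 - r)
    return total * 2.5
```

Per-participant scores like these can then be averaged per prototype variant to support an A/B comparison such as V3A vs V3B.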

DIGITAL PROTOTYPE V4

Based on the results from our prior round of testing, we iterated V3B into V4. Because V4 was so similar to the previous version, we only tweaked and refined some of the interactions between V3B and V4 before moving into higher fidelity.

With user research and extensive testing across 44 walkthroughs, we progressed from low-fidelity to high-fidelity prototyping. We aimed to take our user-driven concept to a professional product with the testing methods found below.

DELIVER

OVERVIEW

Think Alouds, Observation, Interviews, Contextual Walkthroughs, Expert Critiques

13 Participants

In our final round of user testing, our participants accomplished tasks with ease and found the experience intuitive and visually appealing, displaying the success of a user-driven iterative design process. This then allowed us to finalise our interface.

REFLECTION

This round of testing was interesting as we had a mix of regular users and expert users. The regular users had very little difficulty interacting with our interface and found it overall to be very intuitive and aesthetically pleasing. Whilst the expert users could also successfully navigate the interface, they had more concerns regarding the interaction models and their affordances, and a number of them also criticised the information architecture of some screens.

Whilst the mix of participants created more disparity when synthesising results, it was highly beneficial to us; in particular, the expert users’ feedback helped us address concerns that may have taken many more rounds of testing with regular users to uncover. Since we did not have that time, it was great to be able to accelerate our usability testing. Overall it was a successful round of testing, although we did have a few issues with InVision that made some parts of the prototype quite buggy, particularly interactions such as swiping, which was frustrating but did not have a major effect.

ITERATION OVERVIEW

FINAL UI DESIGN

INTERACTIVE PROTOTYPE

Aira was built to meet user needs and address current pain points with transit. Using a holistic and iterative design process, we were able to narrow our design down to the critical features. We've used Principle, After Effects, and Framer to demonstrate some of the functionality.

Trip Selection

Want to get home with the fastest transit option? We speak your language. Transit destinations and route options are broken down in an easy-to-read manner.

End of Trip

Finish your trip with more transparency. We empower users with opportunities for feedback and upfront comparisons of time saved over other modes of transit.

SOME CHALLENGES ALONG THE WAY

Managing conflicting feedback from users between different iterations. Does giving users what they want always fix the problems they are having? We had to strike a balance between using user-driven insights for our design decisions and keeping heuristics and the overall flow in mind. Being so close to the project, the last thing we wanted was to introduce our own bias into the design.

Communicating high fidelity interaction concepts during the low fidelity phase. From wayfinding compasses that change direction with you to bespoke transitions between screens, the signifiers of actions and abilities in the app were sometimes confusing for users. As designers, our ability to think in dynamic concepts is often stronger than our users’, but how can we still get good feedback on high fidelity ideas without going into high fidelity?

Turning smaller samples of data collected in user testing rounds into informed design decisions. Each round we carefully selected our methods to get the best qualitative and quantitative data from the user testing. Sometimes with limited participants in a usability study, though, there are outliers that don’t represent the whole demographic. How does one identify those outliers effectively?
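One common rule of thumb for the question above is Tukey’s IQR fences: flag any value more than 1.5 × IQR below the first quartile or above the third. A minimal sketch (the function name and the interpolation helper are ours), which could be applied to per-participant SUS scores or task-completion times:

```python
def iqr_outliers(values):
    """Return the values lying outside Tukey's fences:
    below Q1 - 1.5 * IQR or above Q3 + 1.5 * IQR.

    A simple heuristic for small usability samples; flagged
    values warrant a closer look, not automatic exclusion.
    """
    xs = sorted(values)
    n = len(xs)

    def quantile(q):
        # Linear interpolation between the two nearest order statistics.
        pos = q * (n - 1)
        lo = int(pos)
        frac = pos - lo
        if lo + 1 >= n:
            return xs[lo]
        return xs[lo] * (1 - frac) + xs[lo + 1] * frac

    q1, q3 = quantile(0.25), quantile(0.75)
    iqr = q3 - q1
    lo_fence, hi_fence = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [x for x in values if x < lo_fence or x > hi_fence]
```

With tiny samples the fences are crude, so a flagged participant is best treated as a prompt to revisit their session notes rather than as grounds to discard the data point.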

REFLECTION

In our final round of user testing, our 8 participants accomplished tasks with ease and found the experience intuitive and visually appealing, displaying the success of a user-driven iterative design process. Our team learned a great deal about our target demographic, what drives positive experiences for them, and how to take those insights from synthesis to a fully realised product. We hope you enjoy Aira as much as we and our users do.