Human Machine Interaction (HMI) with Augmented Reality in a Family Vehicle 


Human Machine Interaction (HMI) Lab

What is the future of integrated augmented reality systems in vehicles? The Human Machine Interaction (HMI) Lab at the Georgia Institute of Technology sponsored us to research how windows with integrated augmented reality could benefit people. We aimed to enhance family interactions through a family car that uses augmented reality windshields and windows. Our focus was on encouraging engagement between passengers and drivers through an infotainment system in the context of a family trip.

I developed this project with Beatrice Adebisi, Samira Bandaru, Iris Colendrino, and Abigail Maeder 

My personal role included leading design research, interviews, task analysis, observational analysis, usability testing, journey mapping, insight generation, and UX/UI design.

*NDAs and confidentiality agreements were signed during the course of this project. Not all project and personal details are shown.

Discovery Research

We began this project by conducting stakeholder interviews to generate foundational research questions.

Initial Research Questions:​

  • What does a typical family road trip entail?

  • What are the influencing factors for family dynamics?

  • How do family interactions change based on their environment?

  • What are passenger interactions inside an automobile?

  • ​How can we test a technology that doesn’t exist yet?

  • How can we gain recognition for the HMI lab to attract future partnerships and sponsorships?

We implemented the following methodologies to begin answering these questions.


User Interviews

Screening yielded 11 families and individuals whom we interviewed. Interviews were recorded, transcribed, and coded.


Video Diary Study

We distributed GoPro cameras to six families who agreed to film their family road trips.


Journey Maps and Task Analysis

We compiled our interview and observational data to create several different journey maps to visualize participant pain points.



Our observations yielded information about the individual roles played by each family member. Developing these roles helped us build empathy for each type of user.


Insights and Design Recommendations

Our efforts yielded 3 major insights that directly influenced design decisions and focus.


Emotions and connections developed between family members are influenced by activities unique to the environment.

“If we wanted human interaction we would play the ABC game, license plates, yellow car, the cow game.”

- Participant 1, Parent

“We put a mattress in the back and the kids would play games, they fought, there wasn’t any screen opportunity.”

- Participant 2, Parent


Family hierarchy, or roles, determines the individual’s perception of what a “road trip” is.

“The hierarchy changes, my dad is in charge of directions and driving. If we tell him to go a different way, he won’t.”

- Participant 3, Child


People’s most lasting memories are formed during the formative years (ages 4-16).

“We have the best talks in the car... they ask the most insane questions. Do alligators live in the trees?”

- Participant 4, Parent

“We have these really long and great discussions while we are in the car.”  

- Participant 5, Parent

We translated these insights directly into imperative statements to generate design criteria and recommendations.


How can we foster old memories and activities while creating and encouraging new memories and experiences?


Connection between all users


Complete interactivity for users


How can we provide the choice to all family members to experience a road trip from a different perspective?


Safety/Trust features for parents


Autonomous Driving Capabilities


How can we facilitate conversation, engagement, and create memories during family interaction in the car?


System Mode Options


Adaptable Learning System


Quick card sorting with participants provided us with user-focused categorical structures for the elements in our interface.
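A common way to turn card-sort sessions into a categorical structure is to count how often participants place the same pair of cards in one group. The sketch below shows this aggregation; the card names and sorts are hypothetical examples, not our study data:

```python
from collections import Counter
from itertools import combinations

def co_occurrence(sorts):
    """Count how often each pair of cards was grouped together.

    `sorts` is a list of card sorts, one per participant;
    each sort is a list of groups (sets of card names).
    """
    counts = Counter()
    for sort in sorts:
        for group in sort:
            # every unordered pair within a group co-occurs once
            for pair in combinations(sorted(group), 2):
                counts[pair] += 1
    return counts

# Hypothetical sorts from two participants
sorts = [
    [{"Music", "Podcasts"}, {"Maps", "Traffic"}],
    [{"Music", "Podcasts", "Games"}, {"Maps", "Traffic"}],
]
pairs = co_occurrence(sorts)
# pairs[("Maps", "Traffic")] == 2  (both participants grouped them together)
```

Pairs with high counts suggest cards that belong under the same interface category.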


With screen flows solidified, we experimented with different UI designs for testing.

When designing UI for transparent screens:

  • White renders as opaque

  • Black renders as transparent

Initial Interface Concepts


Testable Interface


We created an immersive test environment using a transparent touch OLED screen, mounted in a car simulation buck, to serve as the car window.

Through detailed screeners and outreach to family communities, we recruited several families to test our designs.

Each session began with a control condition without any external stimuli; we then introduced our interface so the two experiences could be compared.

Tobii Eye-Tracking equipment allowed us to evaluate our primary user's focus during the testing.
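Focus time per area of interest (AOI) can be derived from fixed-rate gaze samples by counting how many samples fall on each AOI. This is a minimal sketch of that calculation; the AOI labels and sample stream are hypothetical, not our exported Tobii data:

```python
from collections import Counter

def focus_distribution(samples):
    """Given gaze samples recorded at a fixed rate, each labeled with the
    AOI it fell on, return each AOI's share of total focus time."""
    counts = Counter(samples)
    total = sum(counts.values())
    return {aoi: n / total for aoi, n in counts.items()}

# Hypothetical sample stream (equal time per sample)
samples = ["road", "road", "screen", "screen", "screen", "passenger"]
dist = focus_distribution(samples)
# dist["screen"] == 0.5  (half of the recorded focus time)
```

Plotting these shares per participant produces a focus-time distribution like the one shown below.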


Testing Set-up




Participant 3 - Focus Time Distribution


While our system was very effective in mitigating the pain points identified in our research, it did not encourage interaction between family members; in fact, it reduced the number of interactions.

The proposed solution was a family mode that consisted of our original base system but added the option of connecting with all other screens to play a game or complete an activity.

Adding family mode increased the number of interactions to a level statistically similar to the control condition without the interface.
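A comparison like this can be sketched with a Welch's t statistic on per-session interaction counts. The counts below are hypothetical placeholders, not our study data:

```python
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two independent samples with unequal variances."""
    va, vb = variance(a), variance(b)  # sample variances
    na, nb = len(a), len(b)
    return (mean(a) - mean(b)) / ((va / na + vb / nb) ** 0.5)

# Hypothetical interactions per session
control = [14, 11, 16, 12, 15, 13]      # no interface
family_mode = [13, 15, 12, 14, 16, 11]  # interface with family mode
t = welch_t(control, family_mode)
# a |t| near zero gives no evidence that the interaction counts differ
```

In practice the statistic would be compared against a t distribution (or bootstrapped) to get a p-value.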


The main differentiator became the quality of the interactions between family members. The new family mode introduced an opportunity for users to develop unique environment-driven experiences that were shared with all members of their family including the driver. 



We interviewed groups that reflected our potential users, developed an ethnographic film, and defined our design and mechanical specifications. We then built low-fidelity prototypes and evaluated them to narrow our focus, and after many iterations, developed our interface.

Our team compiled a detailed report and slide deck which we presented to our stakeholders and potential sponsors. 

We successfully created an educational dialogue between our stakeholders and potential sponsors that resulted in talks of future partnerships and the extension of the project research.

Our work was packaged and delivered as foundational research and prototypes that the Human Machine Interaction Lab at the Georgia Institute of Technology will continue to build upon.

A huge thank you to Director Wayne Li and the GM HMI Lab. 

If you're interested in this project and their amazing facilities, you can learn more here: