Hyundai Hy: Creating a modern conversational experience for hyundaiusa.com to improve the purchase and post-purchase experiences.
I led the initial design of the entire product and experience: GUI, VUI, and multimodal explorations.
Once the project was awarded to us, I oversaw a team of 4 designers (research, visual, UX) to evolve the service, address customer pain points, and improve the overall experience.
Backed by research or not, we all know how exhausting and daunting the car-purchasing experience is. From deciding what you like, or what upgraded car your newly divorced neighbor just bought, to what your friends think you should get, to how much you want to spend - and don’t even get me started on the actual test drive, the dealership negotiation show, or the issues with the service manager later on. We have all been there and been scarred by it - no need for a journey map here!
We wanted this experience to be different - as simple as calling up your car-head friend. We think of the experience of “Hy” as sharing a conversation with a close friend who just happens to know everything about Hyundai and Hyundai’s sedan vehicle lineup. We envision a personal, engaging experience that is both informative and fun.
We developed user stories and personas to help guide our VUI/GUI journey, taking cues from Google’s conversational design guidelines and Amazon’s “Designing a Voice Experience” guidance for Alexa. This allowed us to focus on a user problem and how to navigate through the experience.
To help define the journey, we created sample dialogs revolving around each persona and their particular needs.
To comply with my non-disclosure agreement, I have removed and obfuscated confidential information in this work. The information here is my own and does not necessarily reflect the views of anyone else.
Some research was provided by the client; some was done by the research team. Various methodologies were used to give us as much ammunition as possible to validate our hypotheses about the overall experience. Testing started post-prototype to improve usability and was conducted throughout the development process, including benchmarking and measuring system usability scores after major updates.
Since a voice interface was not new for Hyundai (it had been piloted in Korea on another model), we had no problem recruiting participants to assist in testing. We conducted qualitative and benchmark tests with a mix of internal Hyundai methodologies. Some of the insights gained from the tests were:
What can you help me with? (finding key utterances)
How can you add value to my buying process? (comparisons - understanding features users want)
Ask me about X,Y or Z (Welcome - Testing help & welcome messages)
Is there a human behind this? (User data collection - AI match with a sales rep)
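The “finding key utterances” work above can be sketched as a simple keyword-based intent matcher. This is a hypothetical illustration only (the intent names and keywords are my assumptions, not the production model, which used a natural-language platform):

```python
import re

# Hypothetical sketch: mapping sample buyer utterances to intents.
# Intent names and keywords are illustrative, not the shipped model.
INTENTS = {
    "GetHelp": ["help", "what can you do"],
    "CompareVehicles": ["compare", "difference", "versus"],
    "Welcome": ["hello", "hi", "hey"],
    "TalkToHuman": ["human", "person", "sales rep"],
}

def match_intent(utterance: str) -> str:
    """Return the first intent whose keyword appears as a whole
    word (or phrase) in the utterance; otherwise fall back."""
    text = utterance.lower()
    for intent, keywords in INTENTS.items():
        for kw in keywords:
            if re.search(r"\b" + re.escape(kw) + r"\b", text):
                return intent
    return "Fallback"
```

A real skill would hand this off to the platform’s NLU, but even this toy version makes the testing questions concrete: each question a participant asked maps to an intent the bot must cover.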
What I was trying to solve for
My challenge was to take the user through the purchasing funnel - or at least create a ‘guided experience’ that would take them through the funnel with minimal friction and ultimately get them into a car of their choice (or the closest to it) - all from the bot experience. This was the user’s first experience with a Hyundai bot, so we had to guide them along this path before they learned Hy’s skills and capabilities.
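A guided funnel like this can be sketched as an ordered state machine that always nudges the buyer one stage forward. The stage names below are my assumptions for illustration, not Hyundai’s actual funnel definitions:

```python
# Hypothetical sketch of the guided purchase funnel as an ordered
# sequence of stages; the bot advances the buyer one stage at a time.
FUNNEL = [
    "welcome",     # introduce Hy and its capabilities
    "discover",    # learn driver needs and preferences
    "compare",     # narrow down to candidate sedans
    "select",      # settle on a vehicle and trim
    "test_drive",  # schedule a test drive with a dealer
]

def next_stage(current: str) -> str:
    """Advance the buyer to the next funnel stage (stay put at the end)."""
    i = FUNNEL.index(current)
    return FUNNEL[min(i + 1, len(FUNNEL) - 1)]
```

The design point is the ordering itself: a first-time user never has to know Hy’s full skill set, because the bot always knows the next step to offer.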
I had two weeks to pull this off - yes, this is what happens when you say YES! A lot (see what I did there). So here are some of the challenges I wanted to overcome.
Customer Insights & Ideation
I partnered with two project managers and one part-time researcher (bless his heart) to uncover insights and translate concepts into features that address customer behaviors and motivations.
Experience Strategy & Vision
I created frameworks and prototypes to share the vision, design principles and content, GUI and VUI strategies. This helped to evangelize ideas, gain alignment and drive decision making.
Planning & Scope Definition
I defined the product with my project manager partners. I evangelized customer goals and balanced business goals. I prioritized and conceptualized features for launch and beyond.
Oversight & Coordination
I designed across and collaborated with Inn Ocean Art Directors, Producers, Platform Designers and their PM partners to translate product features for each platform context and the Hyundai brand.
Design Execution & Validation
I designed down on Web, Mobile, and Virtual Assistants (Alexa mainly). I designed multimodal experiences for Alexa. I executed conversations, wireframes, prototypes across multimodal platforms.
I designed up and presented works to gain buy‐in from executives, senior stakeholders and many other Hyundai, Inn Ocean and Neudesic teams throughout the project lifecycle.
“HY” - Starting a conversation
When a user reaches the Sedan category page on hyundaiusa.com they will be greeted with a digital assistant, or a friend, that knows everything about Hyundai including vehicles, dealers, and sales people. We call this friend “Hy”…
Once engaged, a buyer can ask “Hy” anything; leveraging connections back to vehicle and dealer information, “Hy” gets the buyer answers quickly through a natural language interface.
Finding the right vehicle for you
The primary objective of “Hy” will be to help the buyer make the best possible vehicle decision for themselves and their family. We want “Hy” to know every aspect of the type of driver “Hy” is talking to and make the recommendations by taking into consideration everything we know. We want to create an experience that meets and exceeds the best possible sales interaction.
Talking to your Hyundai vehicle expert
More and more, we are becoming accustomed to interacting with our devices by simply talking rather than typing. “Hy” should be fully integrated with a device’s capability to capture voice and translate it to text.
Take the next step
Once a buyer has selected the perfect vehicle, “Hy” will help them take the next step and schedule a test drive. While linking to existing capability might be sufficient, we strongly believe that creating a cohesive, complete interaction is critical.
I extended the voice experience into the post-launch phase and created a multimodal experience on Amazon Alexa. I recreated the conversation flows in VoiceFlow, a tool I often use for Alexa and Google voice prototypes, then started prototyping the experience just to sell the concept to the client - staying away from over-designing and solving the most important challenge at hand (the 80/20 rule, or Pareto principle).
I used the same research material as the basis for persona building and conversation flows, keeping changes of intention, context of use, and possible changes of action in mind. The overall goal was to:
Craft conversations that are natural and intuitive for the user to understand, and for Alexa to take appropriate actions on
Create a multimodal experience to scale my conversations across all devices, helping users wherever they are, in whatever context
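Scaling one conversation across voice-only and screen devices can be sketched as a response builder that always produces speech and only attaches a visual payload when the device has a screen. This is a hypothetical Python illustration of the pattern, not the actual Alexa skill code:

```python
def build_response(speech: str, card_title: str = None,
                   card_text: str = None, has_screen: bool = False) -> dict:
    """Build one logical response: speech for every device, plus an
    optional display card only when the device reports a screen."""
    response = {"speech": speech}
    if has_screen and card_title:
        response["display"] = {"title": card_title, "text": card_text}
    return response
```

Keeping speech as the primary channel means the same conversation flow works on an Echo Dot and an Echo Show; the screen content is additive, never required.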
Later on, we expanded the VUI to Sales and Services (in progress and under NDA).
Alexa Multimodal Experience
Here are some screen grabs of the process, as well as a recording of the Alexa interactions.