DARI

PRODUCT DESIGN

UX STRATEGY

RESEARCH

INTERACTION

Dari is an AI-powered assistive technology that bridges the communication gap between Deaf and hearing individuals. It features the Woori armband, which translates ASL gestures into speech, and the Sari smart glasses, which provide real-time captions and environmental awareness. By combining gesture recognition, AI translation, and contextual alerts, Dari enables natural, casual communication, fostering inclusivity and seamless interactions.

Specifications

Role: Team Lead, UX Strategy

Team:

Suji Kim, Visual Lead, UI Design Lead

Lukas Wiesner, Research Lead, UX Engineer

Lara Kurt, Branding Lead, Marketing Lead

Duration: 26 Weeks

Tools: Figma, Rhino 3D, KeyShot, Blender, Adobe Suite, After Effects, DaVinci Resolve

Why

Despite major technological advances, the Deaf community still encounters significant communication barriers, hindering their ability to engage in seamless everyday interactions.

How

Understand the nature of American Sign Language and the culture of the Deaf community to enable more fluent communication between Deaf and hearing individuals.

What

Through an ecosystem of wearables, an app, and a community website, we provide essential information seamlessly during casual conversations.


Background

Uncovering the unnatural

Existing technologies primarily support one-way communication unless interpreters are involved. Significant gaps remain in ASL technology: ASL gloves, for example, fail to do justice to the language and fall short of truly seamless interaction.

Research

Secondary Research

We needed to understand the Deaf community and the industry in general before getting into the details of how American Sign Language works.


I began by collecting key information that explains why this project is needed.

  • 360 million people are considered Deaf

  • More than 90% of Deaf children are born to hearing parents.

  • More than 70 million Deaf individuals use sign language as their primary language

  • ASL is NOT a word-for-word translation of English

Research

Stakeholder Map

I created a stakeholder map to understand the people and services closely tied to the lives of Deaf individuals. This helped us visualize the daily interactions that Deaf individuals typically have.

Research

Market Research

Video Relay Service

Allows Deaf and hard-of-hearing individuals to communicate with hearing people via a sign language interpreter over video in real time.


Pros

Provides real-time, professional ASL interpretation; accurately conveys facial expressions and body language.

Cons

Requires a stable internet connection and an interpreter; not ideal for private or emergency situations.

ASL Sign Gloves

Wearable devices that translate American Sign Language (ASL) into text or speech using sensors to detect hand movements.

Pros

Portable and can potentially translate ASL into text or speech without needing an interpreter.

Cons

Often struggle with accuracy and cannot capture facial expressions and body language, which are crucial in ASL communication; they do not do the language justice.

Research

Primary Research

Now that we had a clear understanding of the scope and industry, I recruited Subject Matter Experts from across the United States who are involved in the Deaf and Hard of Hearing community.


We first spoke to experts to validate the information and gather a holistic view before engaging with Deaf individuals, to prevent any miscommunication or misunderstanding.

Research

Semi-structured Interview

  • Interviews were held on Zoom

  • Topic guides were developed from the primary research questions

  • Interviews were recorded and transcribed

Research

Analysis

Using the transcription in Google Docs, we highlighted key quotes that provided important data points, either as validation or as insights to help us better understand the culture of the Deaf community.

Research

Synthesis

We used a reflexive thematic approach to analyze transcripts, engaging closely with the data to identify patterns. First, we extracted key excerpts individually, then assigned codes inductively as a group. Using reflexive notes and thematic maps in NVIVO, we refined themes and visualized the relationships, deepening our understanding and insights.

The team gained a clear understanding of the experiences of using English, text-based technology, ASL interpreters, and interactions with the hearing community.


Most importantly, we built strong relationships with experts throughout the project, which motivated us to maintain an empathetic perspective on the experiences of Deaf individuals.

Research

Insights


Moving on from our analysis, we considered how the insights developed from our refined themes not only defined our problem space but also introduced opportunities we could take advantage of in designing a solution that truly addresses the needs of Deaf individuals when conversing with the hearing population. From the opportunities detailed in our themes, we synthesized our insights into four actionable criteria to guide the development and initial ideation of our solution.

Understanding

POV Statement

Understanding

Key Performance Indicator

Ideation

Ecosystem

After extensive discussions, we developed an ecosystem for seamless interaction across various situations. Wearables will be the primary interface, enabling screen-free communication, while a mobile app and online community will handle the data needed for fluid and diverse interactions; otherwise, the wearables would have too much data to process in real time.

Ideation

Logistics

I explored potential AI models to enable seamless interaction. The team chose a semantic engine to match predefined ASL terms with English sentences for faster processing. Sentiment analysis will enhance communication by recognizing emotions, while gesture recognition will retrieve ASL terms linked to specific gesture data.
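The semantic-engine idea described above can be sketched in a few lines: score an incoming English sentence against a set of predefined ASL terms and return the closest match. Note that this is an illustrative sketch, not the production engine; the `ASL_TERMS` glosses are hypothetical, and a real system would use trained sentence embeddings rather than the toy bag-of-words vectors shown here.

```python
import math
from collections import Counter

def vectorize(text: str) -> Counter:
    # Toy bag-of-words vector; a production engine would use sentence embeddings.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse word-count vectors.
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Hypothetical predefined ASL terms, each paired with an English gloss.
ASL_TERMS = {
    "HELLO": "hello hi greetings",
    "THANK-YOU": "thank you thanks",
    "HELP": "help assist me please",
}

def match_term(sentence: str) -> str:
    # Return the predefined ASL term whose gloss best matches the sentence.
    query = vectorize(sentence)
    return max(ASL_TERMS, key=lambda t: cosine(query, vectorize(ASL_TERMS[t])))

print(match_term("thanks so much"))  # → THANK-YOU
```

Matching against a fixed vocabulary of predefined terms, rather than translating open-ended sentences, is what makes the faster processing mentioned above plausible.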

We recognize that American Sign Language is highly expressive and ever-evolving. To embrace this, our platform includes a community-driven ASL repository where users can share unique ASL terms. If a term gains recognition and community support, experts will review it, and upon approval, it can become an official ASL term for use in wearables.

Ideation

Branding

We established a branding guideline to keep the brand consistent across mobile and desktop.

Prototype

Physical Prototype

The team collaborated with an industrial design student to develop the wearables, from sketching and modeling to initial 3D printing.

Prototype

UI Development

We prioritized finishing the mobile app first so we could plan the user testing. We progressed from wireframes to lo-fi, then applied our branding in the mid-fi before preparing for user testing.

User Testing

Atlanta Area School for the Deaf

The Atlanta Area School for the Deaf kindly let us present our initial products to students in the age group of our target audience. Everything had to be very intentional, since we communicated through interpreters who conveyed our message to the students.

  • The interpretation solution was very well received, with excitement and many suggested practical applications.

  • Environmental sound awareness works well within participants' field of vision and is applicable to targeted contexts.

  • Students suggested LED color indications that accommodate color-blind users.

  • Legibility concerns, including for Deaf-blind users, about low contrast between text and background in the mobile app.

User Testing

Three Rivers Association of the Deaf

We were honored to be part of the Valentine's event conducted by the Three Rivers Association of the Deaf at Cave Springs, Atlanta. This time there was a wide range of age groups, and the adults validated our concept with very technical and detailed examples of how it could impact the lives of Deaf individuals.

  • Averaged 4.6 overall in usefulness, usability, effectiveness, and satisfaction with the solution.

  • Received detailed contexts where our solution could be applied.

  • Deaf-blind concerns about the contrast between text and background in the mobile app.

  • Even with good English proficiency, applying saved ASL terms is faster and easier to recall than typing.

Final Design

Wearables

Our wearables enhance our app’s capabilities. Woori, our smart ASL armband, uses IMU and EMG data to let users respond by signing, paired with saved ASL terms. Sari, our smart glasses, keeps conversations phone-free with an embedded display and discreetly indicates sound sources for environmental awareness.
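The gesture recognition that pairs Woori's IMU and EMG data with saved ASL terms can be sketched as a nearest-template lookup. This is an assumption-laden illustration: the `TEMPLATES` feature vectors are invented placeholders, and a shipping armband would run a trained classifier over real sensor streams rather than this distance comparison.

```python
import math

# Hypothetical gesture templates: averaged IMU/EMG feature vectors per saved ASL term.
TEMPLATES = {
    "YES": [0.9, 0.1, 0.2],
    "NO": [0.1, 0.8, 0.3],
    "WATER": [0.2, 0.2, 0.9],
}

def classify(sample: list[float]) -> str:
    # Return the saved ASL term whose template is closest to the live sensor sample.
    # A real pipeline would use a trained model, not Euclidean nearest-template.
    return min(TEMPLATES, key=lambda term: math.dist(sample, TEMPLATES[term]))

print(classify([0.85, 0.15, 0.25]))  # → YES
```

Keeping the on-device step to a lookup against a small set of saved terms is consistent with the ecosystem decision above to offload heavier data processing to the app and community platform.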

Final Design

MyDARI

MyDARI enables seamless conversation support. With one touch, users get live English transcription paired with an AI-generated ASL signer. They can respond by typing in English or selecting saved ASL terms, which are instantly spoken aloud.

Final Design

Website

The desktop experience lets users manage their account, review saved ASL terms, and access past conversations. Most importantly, they can join our Community, a user-driven forum to request, suggest, and discuss ASL terms by typing in English or recording themselves signing. This ensures our platform evolves to support users' needs.

Sejoon Kim

Product Designer

UX Researcher
