Of the 330 million people in the United States, roughly 48 million are hard of hearing and nearly half a million are deaf. This community faces a real challenge when it comes to drive-thrus. To better understand the problem, we reached out to members of the deaf and hard of hearing community, as well as Starbucks employees, to learn which systems need addressing.
While we were designing with the deaf and hard of hearing community in mind, we knew there was an opportunity to create something that would benefit everyone. It was also important to us to design a drive-thru experience that is not only replicable for Starbucks but also scalable. That meant exploring other drive-thru systems and journey-mapping the experiences of three separate audiences: the deaf and hard of hearing community, non-native English speakers, and ourselves.
Our AI-powered, talk-to-text design helps customers not only visualize their order as they place it, but also expedite the process and confirm that the order is accurate. A default language setting, along with options to initiate ASL AI or non-native-language AI, lets customers adjust easily when necessary, without the stress and delay of a hearing or language barrier.
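To make the mode-selection idea concrete, here is a minimal sketch of how such a session might be modeled in software. Everything here is hypothetical: the mode names, the `OrderSession` class, and the pass-through `transcribe` method all stand in for a real sign-recognition or speech-to-text service, which this concept does not specify.

```python
from dataclasses import dataclass, field
from enum import Enum, auto


class Mode(Enum):
    """Hypothetical accessibility modes selectable at the order screen."""
    DEFAULT_AUDIO = auto()   # standard voice ordering (the default setting)
    ASL_AI = auto()          # camera-based sign-language recognition
    NON_NATIVE_AI = auto()   # speech-to-text plus translation


@dataclass
class OrderSession:
    """Sketch of one drive-thru session: capture, display, confirm."""
    mode: Mode = Mode.DEFAULT_AUDIO
    items: list = field(default_factory=list)

    def transcribe(self, utterance: str) -> str:
        # A real system would route this through a recognition model
        # chosen by `self.mode`; here we just normalize the text.
        return utterance.strip().lower()

    def add_item(self, utterance: str) -> str:
        item = self.transcribe(utterance)
        self.items.append(item)
        # Echo the item back on-screen so the driver can verify it
        # before paying -- the "order validity" step in the concept.
        return f"Added: {item}"

    def confirm(self) -> str:
        return "Order: " + ", ".join(self.items)


# A driver switches from the default to ASL mode, then orders.
session = OrderSession(mode=Mode.ASL_AI)
session.add_item("Grande latte")
print(session.confirm())
```

The key design point the sketch illustrates is that the accessibility mode is a per-session setting chosen by the customer, while the capture-display-confirm loop stays the same for everyone.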
So many drive-thru experiences today are the same: you drive up, say your order, pull around to the window, pay, receive your order and, well, that's pretty much it. Because drive-thru systems are so similar across the U.S., we tested our design against chains like McDonald's and Wendy's to see if our concept was scalable (spoiler alert: it is). We hope to see this concept come to life in the near future.