Inclusive Design Challenge Semifinalists
The U.S. Department of Transportation is proud to announce the Inclusive Design Challenge Semifinalists. These researchers and innovators have proposed hardware and software solutions addressing a wide range of physical, sensory, and cognitive disabilities, all designed to integrate with Automated Driving System-Dedicated Vehicles (ADS-DVs). The listing below provides a summary of each Semifinalist’s proposal, along with contact information. Check back for future updates, and sign up to receive notifications about ways to stay engaged.
USDOT Inclusive Design Challenge Semifinalists
- AbleLink Smart Living Technologies
- Boston University
- Carnegie Mellon University Human-Computer Interaction Institute
- Clemson University
- Foresight Augmented Reality
- May Mobility & University of Michigan Transportation Research Institute (UMTRI)
- Purdue University
- University of Kansas
- University of Maine
- Waymo
AbleLink Smart Living Technologies
The research team will lay the foundation for accessibility to Level 4 and 5 ADS-DVs by developing the WayFinder ADS system, a comprehensive mobile application designed to support independent access to ADS-DVs by individuals with cognitive disabilities and others with special needs. The team will also develop a WayFinder ADS Dashboard that allows caregivers to set up secure connections to companies providing reservation access to ADS-DVs and to set predetermined destinations for the user. The overall goal is to provide cognitively accessible interfaces and app navigation features, and to reduce the overall cognitive load associated with interacting with an ADS-DV.
Project Contact: Dan Davies
Boston University
Project Contact: Eshed Ohn-Bar
Carnegie Mellon University Human-Computer Interaction Institute
This team will design and develop an inclusive, generalized smartphone-based interface design system, building on modern smartphone accessibility systems. The interface will allow a user to send messages to and receive messages from the vehicle and to control the vehicle’s physical interfaces. The team’s prototype will consist of an app that demonstrates the capabilities of the design system, a set of design components tested and validated with end users through simulation, and documentation of accessible design guidelines that designers can use to quickly build inclusive ADS-DV communication and control apps.
Project Contact: Nikolas Martelaro
Clemson University
ATLAS II is a Distributed Automated Vehicle Human-Machine Interface (DAVHMI) consisting of an in-vehicle receiver within a Wi-Fi- and Bluetooth-enabled infotainment system, a mobile device, a cloud-based back end, and optional physiological sensing devices. This will enable an ADS-DV user with physical disabilities to interact with the vehicle using familiar technologies and provide an optimal user experience. For Stage II, the team will develop a prototype integrated into a conventional motor vehicle configured to appear and function as a capable ADS-DV.
Project Contact: Julian Brinkley
Foresight Augmented Reality
This team will address the problems of locating an ADS-DV, interacting with it in routine and emergency situations, and improving the safety and security of people with disabilities. They will develop an accessible app with a vehicle-agnostic interface (i.e., compatible with all ADS-DVs), ultra-wideband (UWB) technology for locating the vehicle, and advanced mapping technology to deliver accurate destination information. These three components will function together fluidly to solve major issues without human intervention. Stage II activities include development of the UWB anchor and tag prototypes and of the app itself.
Project Contact: Chris Webb
May Mobility & University of Michigan Transportation Research Institute (UMTRI)
For this project, the research team will install an automated wheelchair docking system meeting specifications for a universal docking interface geometry (UDIG). This concept allows any wheelchair with attachment hardware meeting the UDIG specifications to dock with any vehicle equipped with anchoring hardware meeting those same specifications. The team will also integrate an automated seatbelt-donning system with its docking system. The research team intends to present their automated wheelchair docking and restraint system in a wheelchair-accessible hybrid-electric minivan for their Stage II demonstration.
Project Contact: Tara Lanigan
Purdue University
The team will work to develop the Efficient, Accessible and Safe Interaction in a Real Integrated Design Environment for Riders with disabilities (EASI RIDER). The team will develop an in-floor ADS-DV ramp design, an automatically deploying “Smart Ramp,” an automated wheelchair securement system, and an onboard user interface providing accessibility features that cater to people with a wide range of disabilities. During Stage II of the competition, the team plans to develop a life-sized, operational demonstration platform with these features using an existing ADS-DV shuttle.
Project Contact: Bradley Duerstock
University of Kansas
The goal of this project is to develop innovative ADS designs that operate at Levels 4 and 5 for people with mild cognitive impairment (MCI) and mild to moderate dementia. The team proposes an integrated system solution composed of a secure mobile app, a traveler monitoring system, an automated in-vehicle agent, and a cloud framework for real-time analytics. The solution is vehicle-agnostic and can be integrated as original equipment. In Stage II, the team will develop a physical prototype consisting of a full-size vehicle equipped with the developed interfaces and sensors.
Project Contact: Alexandra Kondyli
University of Maine
Waymo
Waymo’s existing ride-hailing service already integrates inclusive design features, including the option to minimize walking time when ordering a trip, turn-by-turn walking navigation, the ability to honk the car’s horn remotely, and the ability to contact support staff who have access to the vehicle’s cameras. In Stage II, the team will prototype additional features to be integrated into their ADS-DV ride-hailing application: adaptive app navigation (visual, audio, and haptic cues to help riders navigate to the vehicle), purpose-built car sounds for wayfinding, locating the vehicle with its headlights, hands-free car communications, and video chat support.
Project Contact: Clement Wright
For more information about Stage II activities, please visit our Stage II Instructions and FAQs pages. If you would like to speak to us directly about the Inclusive Design Challenge, please email inclusivedesign@dot.gov.