
Beyond the Tap: Rethinking Touchscreen Usability for Neurodiverse Users in Healthcare

Inclusive User Experiences

Designing for neurodiversity

A column by Yuri Shapochka
July 21, 2025

In today’s healthcare environments, touchscreen user interfaces are everywhere—from patient check-in kiosks to bedside tablets to medication-dispensing units. These digital tools promise convenience and efficiency, yet for many users, especially those who are neurodiverse, they can introduce confusion, anxiety, or even outright barriers to access.

I first became aware of this tension in a waiting room at a busy clinic. A touchscreen check-in station stood near the entrance, with bright icons inviting interaction. I watched as a woman stood hesitantly in front of the screen, her body stiff with uncertainty. After several failed attempts to navigate a series of unclear prompts and becoming visibly flustered, she stepped aside. A staff member intervened, but by then, the system had already failed in its core mission: making access easier.


This is not an isolated example. Neurodiverse individuals—including those with autism, attention deficit hyperactivity disorder (ADHD), sensory-processing disorders, or anxiety—often struggle with user interfaces that assume a singular cognitive or sensory mode. Even simple interactions such as "Tap here to begin" can become overwhelming when they are compounded by the stress and urgency of a healthcare visit. Such systems often rely on one-size-fits-all logic: large buttons, basic navigation, and fast-completion flows. But larger touch targets alone don't equal accessibility, and fast is not always friendly.

Designing touchscreen user experiences for healthcare demands more than just responsiveness and compliance with the Americans with Disabilities Act (ADA). It requires nuanced interactions, empathy, and a deeper understanding of how real people—not ideal users—encounter technology when under stress. In this column, I’ll explore how touch-based healthcare user interfaces can better support neurodiverse users by rethinking interaction patterns, reducing friction, and embracing inclusive-design strategies.

We’ll look beyond value contrast and font size and question how microinteractions, pacing, affordances, and feedback can either build trust or trigger discomfort. Because, when users are already anxious or overstimulated, the last thing they need is to decipher a blinking button or guess the meaning of a vague icon.

Touchscreens are here to stay. Let’s make them better for everyone.

Beyond the Swipe—When Touch Goes Wrong

Touch user interfaces promise simplicity. A tap, a swipe, a pinch—all can potentially provide greater accessibility to users. However, in real-world use, especially in high-stakes or unpredictable environments, a clunky button or complex navigation system on a touchscreen can become an obstacle instead of an aid. What’s marketed as easy to use often fails under pressure.

Consider trying to silence a ringing phone during a medical consultation, with gloves on or slightly wet hands. Or navigating a touchscreen-based kiosk in a hospital with trembling fingers. The sleek user interface that worked perfectly in the design lab becomes frustrating and sometimes inaccessible in less-than-ideal conditions.


A key problem with touch interactions is the need for precision. Touchscreens often expect users to hit small targets, such as buttons, links, and sliders, with millimeter accuracy. But people don't always have steady hands. They may be older, distracted, in a rush, or dealing with temporary impairments. When a user interface punishes imprecision, the experience quickly unravels.

Touch-based gestures also carry cognitive weight. Users are expected to remember what different swipes and pinches do, often with no visual indication of their functionality. Worse, these gestures vary across apps and platforms, leaving users guessing whether swiping right deletes an item or saves it. The result is hesitation and anxiety—both of which are enemies of usability.

The problem magnifies in shared environments such as public kiosks, healthcare devices, or industrial touch panels. Screen glare, fingerprints, smudges, and varying lighting conditions all contribute to misfires. And let’s not forget winter gloves, bandaged hands, or simply users with long fingernails. These are not edge cases—they’re everyday realities for many people.

When touch fails, users have no backup. There’s rarely a tactile clue or audible feedback to tell them what went wrong. The sleekness of a flat glass surface comes at a price: the complete absence of physical guidance. To move forward, we must question whether touch alone is ever enough. More importantly, we must ask what happens when it isn’t.

Designing Redundancy into Touch User Experiences

The elegance of a touchscreen should never come at the expense of usability. In interaction design, redundancy means offering more than one way to accomplish a task. When designing for touch, redundancy is not a flaw—it’s a lifeline. In critical contexts, especially healthcare, users need options. If a swipe fails, a button should still work. If a tap misfires, a voice command or physical control could close the gap.

Why is redundancy necessary? Not because users are lazy or inattentive, but because real life can interfere. A clinician might be rushing. A patient could be in pain. A caregiver might be wearing gloves. In all these scenarios, user interfaces that depend on a single mode of interaction risk becoming barriers instead of bridges.

Good redundancy is not clutter—it’s thoughtful support. For example, combining touchscreen controls with hardware buttons on a medical device allows fast, error-proof operation. Including an on-screen keyboard alongside voice input gives users the freedom to switch input modes based on their current context. Even something as simple as placing a Cancel link near a swipe-to-delete gesture gives users an out if they misstep.

It’s tempting to view touch as a complete solution. After all, it’s clean, scalable, and modern. But systems that rely solely on touch often reveal their limitations when people use them outside the ideal lab setting. Adding layers of support—voice, tactile feedback, keyboard access, or alternative input modes—ensures broader usability without sacrificing visual polish.

In healthcare and other high-stakes environments, the cost of interaction failure is not just user frustration. It can lead to real-world consequences. A mistyped dosage, a delayed alert, or a navigation error in an emergency user interface is not a design issue—it’s a safety issue.

By designing for touch plus something else, we can build resilience into our user interfaces. And resilience is the hallmark of a good user experience.

The Role of Feedback and Forgiveness

Touch user interfaces lack the tactile resistance of physical buttons, making feedback essential. Without clear responses, users are left wondering whether their tap registered, the system is processing their request, or they made a mistake. This uncertainty builds frustration and, in critical environments such as healthcare, can cause dangerous delays.

Effective feedback doesn't just confirm users' input; it reinforces trust. Visual cues, such as button color changes or slight animations, and tactile cues, such as vibration, can signal acknowledgment. Subtle sounds or haptics can further enhance confirmation, particularly for users with vision impairments or those under cognitive stress. The goal is to remove ambiguity and make users feel confident that the system is listening and acting on their commands.

Equally important is forgiveness—the ability of a user interface to tolerate users’ mistakes. Touch interactions are prone to error: a wrong swipe, a missed tap, or even performing a gesture too slowly. We must design systems to absorb such imperfections. Buttons should be generous tap targets. Undo functions should be easily accessible. Destructive actions—such as deleting a record or canceling an appointment—should display confirmation dialog boxes or other recovery options.
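As one way of sketching that forgiveness, the following TypeScript example (a hypothetical record store, not any real product's API) separates the request to delete from its confirmation and keeps an undo path open:

```typescript
// Forgiving destructive actions, sketched against a hypothetical record
// store: deletion requires an explicit confirmation step, nothing is
// destroyed until then, and the last deletion can always be undone.

class ForgivingRecordStore<T> {
  private records = new Map<string, T>();
  private pendingDelete: string | null = null;
  private lastDeleted: { id: string; value: T } | null = null;

  add(id: string, value: T): void {
    this.records.set(id, value);
  }

  has(id: string): boolean {
    return this.records.has(id);
  }

  // First call only marks the record for deletion; nothing is destroyed yet.
  requestDelete(id: string): "needsConfirmation" | "notFound" {
    if (!this.records.has(id)) return "notFound";
    this.pendingDelete = id;
    return "needsConfirmation";
  }

  // Deletion happens only after explicit confirmation of the same record.
  confirmDelete(id: string): boolean {
    if (this.pendingDelete !== id || !this.records.has(id)) return false;
    this.lastDeleted = { id, value: this.records.get(id)! };
    this.records.delete(id);
    this.pendingDelete = null;
    return true;
  }

  // Undo restores the most recently deleted record.
  undo(): boolean {
    if (!this.lastDeleted) return false;
    this.records.set(this.lastDeleted.id, this.lastDeleted.value);
    this.lastDeleted = null;
    return true;
  }
}

// Usage: canceling an appointment takes two deliberate steps, and a
// mistaken cancellation is recoverable.
const store = new ForgivingRecordStore<string>();
store.add("apt-1", "Annual checkup");
store.requestDelete("apt-1"); // "needsConfirmation"; record still present
store.confirmDelete("apt-1"); // now it is gone
store.undo();                 // and back again
```

The same two-step-plus-undo pattern applies whether the confirmation surface is a dialog box, a second tap, or a voice prompt.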

In medical settings, where gloves, fatigue, and time pressure can affect performance, forgiving design is not just a bonus; it's a requirement. Systems must accommodate imperfect user inputs without punishing the user. Designing for forgiveness means building slack into the system: accepting slightly off-target taps, recognizing gestures even if users don't perform them perfectly, and supporting error recovery with minimal friction.
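Accepting slightly off-target taps can be as simple as resolving each tap to the nearest target within a forgiveness margin instead of silently dropping it. Here is a minimal sketch, with illustrative coordinates and a hypothetical `resolveTap` helper:

```typescript
// Tolerant hit testing: a tap that lands slightly outside every target
// still resolves to the nearest target within a forgiveness margin,
// rather than being silently dropped. Coordinates are illustrative.

interface Target {
  id: string;
  x: number;      // center x, in pixels
  y: number;      // center y, in pixels
  radius: number; // nominal touch-target radius, in pixels
}

// Resolves a tap to a target id, or null if nothing is close enough.
function resolveTap(
  tapX: number,
  tapY: number,
  targets: Target[],
  slop = 12 // extra forgiveness, in pixels, beyond each target's edge
): string | null {
  let best: { id: string; distance: number } | null = null;
  for (const t of targets) {
    const distance = Math.hypot(tapX - t.x, tapY - t.y);
    // Accept taps within the target plus the slop margin; prefer the nearest.
    if (distance <= t.radius + slop) {
      if (!best || distance < best.distance) {
        best = { id: t.id, distance };
      }
    }
  }
  return best ? best.id : null;
}

// Usage: a tap 5 pixels outside the "confirm" button still registers,
// while a tap far from everything resolves to nothing.
const screenTargets: Target[] = [
  { id: "confirm", x: 100, y: 100, radius: 24 },
  { id: "cancel", x: 200, y: 100, radius: 24 },
];
resolveTap(129, 100, screenTargets); // "confirm": 29px out, within 24 + 12
resolveTap(150, 300, screenTargets); // null: too far from any target
```

The margin should be tuned to context: a bedside tablet used with gloves warrants more slop than a desktop kiosk, and destructive targets may deserve less.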

Too often, digital products assume optimal user conditions. But UX professionals know the real world is messy. People use touchscreens in cars, at hospital bedsides, in noisy clinics, or with shaky hands. User interfaces that acknowledge and accommodate this reality perform better, inspire confidence, and reduce the stress of interactions.

When users are unsure whether a system is working—or worry that it will punish them for a slip—their anxiety rises. When they’re confident a user interface will confirm, guide, and help them recover from mistakes, their anxiety diminishes. That’s the power of feedback and forgiveness. And in touch user experiences, these are non-negotiable qualities.

Designing for Touch in High-Stakes Environments

Nowhere are touch user experiences more critical than in environments where precision, clarity, and speed can impact lives—environments such as healthcare, transportation, and emergency services. In such settings, users do not have time to second-guess a user interface. Every interaction must be intentional, fast, and error resistant.

For example, healthcare professionals often interact with touchscreens while wearing gloves, standing, or moving quickly between tasks. A poorly placed button or a delayed response could mean the difference between catching a critical alert and missing it entirely. These are not theoretical concerns—they’re design realities that we must address from the outset.

In high-stakes settings, gesture-based controls can backfire. While complex swipe patterns or hidden navigation layers may be elegant in theory, they can introduce risk when the user is under pressure. The better approach is often the simpler one: visible buttons, minimal steps, and clear outcomes. Predictability becomes more valuable than novelty.

In such environments, touch user-interface design also demands attention to visibility and legibility. Glare, changing lighting conditions, and screen angles can all interfere with usability. Therefore, designers must ensure strong contrast, scalable text, and easy-to-follow layouts that maintain clarity under a variety of physical conditions.

Moreover, systems within critical environments must prioritize responsiveness. A delayed response—whether the result of poor performance or server lag—feels like a failure. Users must know immediately whether their input has registered, especially when they’re multitasking or moving between screens.

Context matters, too. A touchscreen user interface in an ambulance has vastly different constraints from one in a quiet office. Designers must ask: Who is using this? Under what stress? In what conditions? Then they can design accordingly—by providing larger tap targets, simplified workflows, and clear redundancies.

Touchscreens have become the default user interface in modern systems, but in high-stakes environments, they must earn their place. Designing for touch in such contexts is about more than aesthetics; it’s about trust, safety, and performance under pressure.

UX professionals carry a responsibility here. We’re not just designing user interfaces; we’re shaping how people interact with technology when it matters most. And when lives are on the line, usability is not optional—it’s essential.

Conclusion: Rethinking Touch User Experiences with Precision and Empathy

As touchscreens continue to dominate digital interactions across industries, it’s easy to overlook how much nuance they require. The gestures we take for granted—taps, swipes, and pinches—are not universally easy to use, nor are they always practical. Designing a touch user experience isn’t about simply making things larger or more responsive; it’s about understanding the human contexts in which these gestures occur.

From accessibility challenges to accidental interactions, from neurodiverse perspectives to high-pressure environments, touch user interfaces demand careful, deliberate thinking. The best touch experiences aren't just usable; they're also forgiving, inclusive, and adaptable. They anticipate the uncertainty of fingers in motion and environments in flux.

As UX professionals, we must move beyond a one-size-fits-all mentality. Touch is a human interface. It carries intent, hesitation, and emotion. Our job is to meet users where they are, whether they're tapping with confidence or navigating with uncertainty. When we get touch user-interface design right, we create systems that not only respond to the user's hand but also understand the person behind it.

UX Visual Designer at Illumina

San Diego, California, USA

Yuri Shapochka is an experienced design leader with expertise in the design and development of engaging user experiences. He has more than 20 years of experience, working within fast-paced, innovative development environments, including in the highly regulated healthcare industry. Yuri has a deep understanding of contemporary user-centered design methods, as well as a working knowledge of regulations and best practices for medical devices and human factors. He has a proven ability to oversee the entire design process, from concept to implementation, ensuring that the design intent is maintained at launch. Yuri holds a Master of Science from Donetsk National Technical University and a Master of Arts from Donetsk National University, in Ukraine.
