Tactile UI for blind users, designed with haptic technology.

*Concept project*


I'd like to be able to use a computer without speech. Do you know what I mean? I'd like to be able to look at the whole screen and see a bunch of stuff all at once and just pop, pop, pop. I mean, sighted people whip through the computer so fast, it blows my mind. I'd like to be able to do that.

Tommy Edison, YouTuber, The Blind Film Critic [1]

When I heard this, I was touched and knew I would try to build it for him and the other 36 million blind people [2] who live among us. We take graphical user interfaces (GUIs) and the speed at which we browse them on our phones and computers for granted. I believe that with technology developing so fast, and with the brainpower available, we have a great opportunity to become a more inclusive society than ever before.

I started thinking about how I could contribute and help people like Tommy have a better experience using technology.



UX research
UX design
Tactile UI design

Self-directed exploratory research project


A haptic screen provides a realistic sense of touch, enabling dynamic textures that can be felt with the swipe of a finger.

A couple of months ago I heard about haptic screen technology for the first time. One of the companies working on it is Tanvas, based in Chicago, IL, USA. This is how they describe their product:

"We use electrostatics to control friction and create a virtual touch. The applications for this technology are endless – feel the edges of keys, the snap of a toggle switch, the swipe of a turned page, the direction and magnitude of impacts in a game." [3]

This technology is not yet available on the market; however, that doesn't prevent us from thinking about the potential it opens up for the future.

Experience the haptic screen by Tanvas


To understand how blind people use their technology, I watched plenty of YouTube videos in which blind YouTubers share their experiences. Here are the biggest takeaways, which shaped my project during the design process:

  • The screen reader is a voice user interface that allows blind people to use their phones or laptops. It reads aloud everything the user highlights or selects on the screen, and also announces the function of each section. This sectioning became the primary basis of my design.

  • Mobile first. The YouTubers mostly used their phones:

    "When I'm using something like Facebook, Twitter, or posting on YouTube, I'm typically using either my iPad or my iPhone for it, because I like just the control of having my finger around the screen, just because I find navigating on an actual computer more difficult. But in terms of writing longer documents out or checking emails and replying to emails, I tend to do that on my actual laptop itself."
    Molly Burke, YouTuber [4]


  • iPhone users. All of the blind users I encountered were using Apple products.

  • They often use their phones for social media apps such as Facebook, Twitter, Instagram, or YouTube.

  • They rely on memory or muscle memory, and are therefore able to find the placement of objects or interface elements directly.

  • When browsing their phone, they move a finger from one place to another to find the desired item in the interface.

  • When they remember where particular interface elements are, they tap that place directly.

Christine Ha shows how she uses her Apple devices


The next step in my design was to understand how we receive and process tactile information. Here are the core findings:

  • People who are blind from birth are able to detect tactile information faster than people with normal vision. [5]

    "Our findings reveal that one way the brain adapts to the absence of vision is to accelerate the sense of touch. The ability to quickly process non-visual information probably enhances the quality of life of blind individuals who rely to an extraordinary degree on the non-visual senses."
     Daniel Goldreich, PhD, of McMaster University [6]


  • People with smaller fingers have a finer sense of touch; this is why women, who on average have smaller fingers, tend to have better tactile acuity than men. [7]

  • To feel finer textures, people need to move their fingertips across them: the receptors in the skin must sense vibrations to be able to process the texture. [8]

  • People find embossed elements easier to feel.

  • Braille readers are more proficient in recognizing textures. [9]


I decided to analyze three popular social media platforms and read the Apple Human Interface Guidelines in order to categorize the most crucial elements of an app's interface:

  • Menu

  • Button and specific buttons

  • Text

  • Input field (search)

  • Image



  • Twitter

  • YouTube


What about Braille?

At the beginning of the project, I thought of incorporating Braille symbols, even displaying text in Braille; at first, I considered designing the interface so that a blind user would not have to use VoiceOver at all. However, my research showed that not only is Braille not used by all blind people [9], but Braille cells are also quite big [10] and would take up too much real estate on a mobile phone's small screen.
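To get a feel for the numbers, here is a rough back-of-the-envelope estimate. The cell spacing values are approximate standard Braille dimensions and the screen size is roughly that of the iPhone 7 display, so treat the result as an order of magnitude, not a spec:

```python
# Rough estimate: how much Braille text could fit on an iPhone 7 screen?
# All dimensions below are approximate assumptions.

CELL_WIDTH_MM = 6.2      # typical horizontal cell-to-cell spacing in standard Braille
LINE_HEIGHT_MM = 10.0    # typical line-to-line spacing
SCREEN_WIDTH_MM = 58.5   # iPhone 7 display width, approx.
SCREEN_HEIGHT_MM = 104.0 # iPhone 7 display height, approx.

cells_per_line = int(SCREEN_WIDTH_MM // CELL_WIDTH_MM)
lines = int(SCREEN_HEIGHT_MM // LINE_HEIGHT_MM)

print(cells_per_line, lines, cells_per_line * lines)  # 9 10 90
```

Roughly nine cells per line and ten lines, so under a hundred characters on the entire screen, which is why I dropped the idea of Braille text output.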


After this research and preparation, I decided to build a low-fidelity prototype of a tactile interface based on the VoiceOver sections of Facebook. I chose the iPhone 7, a small and popular size.

To build the prototype I used Play-Doh clay: it is medium in hardness and relatively easy to shape, which made it possible to build the first prototype quickly.


I had to find five distinguishable textures that are easy to detect on initial touch (without rubbing). I chose simple geometric shapes with wide textures, which are easier to detect.


After testing the textures, I noticed that the menu texture (vertical embossed lines) made it impossible to perceive the borders of a menu element. Since the edges of elements carry important feedback, I decided to change the texture.


I switched to horizontal lines for the menu, which made the gaps between elements much easier to feel.



One drawback of making the whole phone from clay was that I could not hold it and truly test how it would feel in the hand. That is why I used a phone cover as the base of the second prototype. I also made the design modular, with a flat base and building blocks (patterns): to build the phone, I add the chosen textures on top of the base.

Facebook's Graphical UI interpreted in Tactile UI



For the menu, I chose a single big punched hole, which gives the clearest, most immediate tactile feedback. I chose a separate shape for the menu (different from the button) for several reasons:

  1. It helps to define the edges of the screen; a blind user doesn't know where the screen ends and the bezel starts.

  2. The menu is one of the most important UI elements, so it deserves a clear indication.

  3. The menu, especially a bottom menu, usually has a squarish shape that is perfect for this kind of texture.



The input field has a distinctive texture of embossed dots (I couldn't clearly show this pattern on the prototype). I chose this pattern for two reasons:

  1. It is different from other textures.

  2. It uses embossed dots that echo the Braille system.



Buttons take yet another approach to texture. Instead of hard geometric shapes, I propose fabric textures that are richer in form:

  1. This texture is the most delicate on first touch, yet very distinguishable against the strong textures of the other elements.

  2. Embossed elements can be added on top of this texture to help recognize the button's function, for instance a back, add, or play button.

  3. For an OK button the texture would not change, but for cancel or delete it could be rougher and more unpleasant to the touch.

  4. I decided not to use Braille symbols as embossed indications for special buttons (for instance, x or +) because the labeling would be inconsistent and Braille takes up too much space.

Different styles of buttons for tactile UI


For images, I chose vertical embossed lines, and for text, diagonal embossed lines. These patterns are easy to recognize on both big and small elements.

I find the "white space" between elements crucial for their recognition, which is why I propose making tactile buttons smaller than their visual UI counterparts. However, for better readability and to avoid cluttering the interface, elements that lie next to each other and share the same function should be grouped under one tactile button.
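The element-to-texture mapping described above can be summarized as a simple lookup table. This is only a sketch of my own design decisions; the category names and function below are working labels, not part of any real screen-reader or haptics API:

```python
# Hypothetical mapping of UI element categories to tactile textures,
# summarizing the design decisions described in the text above.
TEXTURES = {
    "menu": "single big punched hole",   # clearest, most immediate feedback
    "input_field": "embossed dots",      # echoes the Braille system
    "button": "fabric texture",          # delicate; can carry embossed icons
    "image": "vertical embossed lines",
    "text": "diagonal embossed lines",
}

def texture_for(element_category: str) -> str:
    """Return the tactile texture for a UI element category."""
    # Unmapped elements stay flat, preserving the "white space"
    # between textured elements that makes recognition possible.
    return TEXTURES.get(element_category, "flat (no texture)")

print(texture_for("menu"))    # single big punched hole
print(texture_for("avatar"))  # flat (no texture)
```

A real implementation would walk the accessibility tree (for instance, the VoiceOver sections of an app) and assign each node a texture from a table like this, grouping adjacent same-function nodes under one tactile button.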


Understanding the blind user without ever interacting with one in person.


Finding five easily recognizable, distinctive textures

Lack of a haptic screen

Designing without the haptic screen: I sent an inquiry about the screen, but unfortunately received no reply from Tanvas.


Less VoiceOver output, less noise

A faster way of using the apps

Maximizing the feedback

Possibility of “touch scanning” the UI of the app



This project is exploratory in character, looking at the future of the user interface.

I just wanted to start a discussion about what technology gives us today, and what it may mean for users tomorrow. My passion, and one of my main aims, is to use technology to make other people's lives easier. I hope that one day tactile interfaces will be commonly used not only by blind users but by all of us.


This project is just the beginning, a proof of concept. In the future I would like to make a durable physical prototype, maybe 3D printed, and test it with blind users to keep iterating on the textures.


How Tommy Edison uses Instagram:

How Apple Saved My Life - Story of a blind film director James Rath


Lilly Spirkovska
Summary of Tactile User Interfaces Techniques and Systems
NASA Ames Research Center


1. The quote is from this video:


2. Data from WHO website


3. Tanvas website


4. Molly Burke talks about how she uses the technology

5. Research:

A. Bhattacharjee, A. J. Ye, J. A. Lisak, M. G. Vargas, D. Goldreich.
Vibrotactile Masking Experiments Reveal Accelerated Somatosensory Processing in Congenitally Blind Braille Readers.
Journal of Neuroscience, 2010; 30 (43): 14288-14298. DOI: https://doi.org/10.1523/JNEUROSCI.1447-10.2010


6. Quote:

7. Research

Ryan M. Peters, Erik Hackeman and Daniel Goldreich.
Diminutive Digits Discern Delicate Details: Fingertip Size and the Sex Difference in Tactile Spatial Acuity.
Journal of Neuroscience, 2009; 29 (50): 15756-15761.


8. Research

Justin D. Lieber and Sliman J. Bensmaia.
High-dimensional representation of texture in somatosensory cortex of primates.
PNAS, February 19, 2019 116 (8) 3268-3277; first published February 4, 2019 https://doi.org/10.1073/pnas.1818501116


9. Research

Michael Wong, Vishi Gnanakumaran and Daniel Goldreich.
Tactile Spatial Acuity Enhancement in Blindness: Evidence for Experience-Dependent Mechanisms.

Journal of Neuroscience, 11 May 2011; 31 (19): 7028-7037.


10. Quora
