Inclusive design has recently come to the fore for many businesses, programmers and designers around the world. Designing to include as many people as possible is a common goal, but it is much easier said than done. Assumptions based on how non-disabled people live are widespread, and they make it much harder to solve the problems that disabled people face.
In particular, Nullspeak aims to aid communication between people who are deaf and/or blind and those who are not.
My solution addresses a small section of the larger problem of communication between differently abled people, through the use of gestures and visual displays. Users of the product can simply perform a gesture above Nullspeak to trigger a message on the screen. Gestures include swipes, hand rotations, fist clenches and more, up to a total of 16 motions.
Each gesture-and-message combination is designed to be customised, so that every individual can make the product their own. The product aims to help blind, deaf and mute users get through their day with a little more independence, hopefully improving their typical day.
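The customisable mapping described above can be sketched as a simple lookup table. The gesture names, default phrases and helper functions below are illustrative only, not Nullspeak's actual bindings:

```python
# Sketch of the customisable gesture-to-message mapping.
# Gesture names and phrases are hypothetical examples.

DEFAULT_BINDINGS = {
    "swipe_left": "Hello, nice to meet you.",
    "swipe_right": "Thank you!",
    "rotate_clockwise": "Could you please repeat that?",
    "fist": "I need some help.",
}

MAX_GESTURES = 16  # total number of supported motions


def bind_gesture(bindings, gesture, message):
    """Assign (or reassign) a message to a gesture, within the 16-slot limit."""
    if gesture not in bindings and len(bindings) >= MAX_GESTURES:
        raise ValueError("All %d gesture slots are in use." % MAX_GESTURES)
    bindings[gesture] = message
    return bindings


def message_for(bindings, gesture):
    """Look up the phrase to display for a recognised gesture."""
    return bindings.get(gesture, "")
```

Keeping the bindings in a plain dictionary is what makes per-user customisation cheap: replacing a phrase is a single reassignment rather than a code change.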
This project began with research into the products currently available to differently abled users, including those with physical impairments. Each product was assessed on how effectively it solved its users' problems, in an attempt to find the market with the largest gap.
The blind, deaf and mute communities all had few affordable yet portable products that solved their problems easily, and this led to the start of the development phase. Nullspeak began with an initial prototype built from a BBC Microbit, some sensors and LEDs. The concept was to use sensors integrated with the Microbit to display text in a small, legible form, using a gesture sensor and the onboard 5x5 LED array.
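On the real device, scrolling text across the 5x5 array is handled by the Microbit's own display routines; the plain-Python sketch below only illustrates the underlying sliding-window idea, with the frame width and padding chosen as assumptions:

```python
def scroll_frames(text, width=5):
    """Yield successive 'windows' of text visible on a width-column display.

    Padding with blanks on both sides lets the message scroll fully onto
    and off of the screen, as a 5x5 LED matrix does with longer phrases.
    """
    padded = " " * width + text + " " * width
    for i in range(len(padded) - width + 1):
        yield padded[i:i + width]
```

Each yielded frame corresponds to one step of the scroll animation, so the display only ever needs to render five characters' worth of columns at a time.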
Figure 1: An example of the BBC Microbit's onboard 5x5 LED array in action. The text scrolling across is: "My Name" (makecode.microbit.org/projects/name-tag, 2020).
Using some sample tutorials and videos, I learnt how the Microbit and its connected devices communicate with each other in Python. I tested each set of gestures for reliability and consistency of recognition, and with the help of some focus groups decided which phrase to assign to each motion.
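The reliability testing described above can be summarised as a per-gesture recognition rate. The helper below is a sketch of that analysis; the trial data format and any example inputs are hypothetical:

```python
from collections import defaultdict


def recognition_rate(trials):
    """Compute per-gesture recognition accuracy from trial data.

    trials: list of (intended_gesture, recognised_gesture) pairs, one per
    attempt. Returns a dict mapping each intended gesture to the fraction
    of attempts that were recognised correctly, which is the figure used
    to weed out unreliable motions.
    """
    hits = defaultdict(int)
    total = defaultdict(int)
    for intended, recognised in trials:
        total[intended] += 1
        if intended == recognised:
            hits[intended] += 1
    return {g: hits[g] / total[g] for g in total}
```

A motion whose rate falls well below the others is a candidate for removal from the 16-gesture set, or for reassignment to a less critical phrase.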
Testing with mid-fidelity prototypes revealed additional features that could be of great importance to potential users. Firstly, the onboard buttons, although hidden when the product is in its normal state, were repurposed for navigating the settings menu for further customisation. Secondly, LEDs were installed along the side of the product to give more feedback on system status.
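Two-button menu navigation of the kind described above is commonly built as a small state machine: one button cycles through the options, the other selects the highlighted one. The sketch below assumes that split, and the menu item names are placeholders rather than Nullspeak's actual settings:

```python
class SettingsMenu:
    """Sketch of two-button settings navigation.

    press_a cycles to the next menu item (wrapping around);
    press_b returns the currently highlighted item as the selection.
    Item names are illustrative only.
    """

    def __init__(self, items=("Brightness", "Scroll speed", "Edit bindings")):
        self.items = items
        self.index = 0

    def press_a(self):
        # Advance the highlight, wrapping back to the first item.
        self.index = (self.index + 1) % len(self.items)
        return self.items[self.index]

    def press_b(self):
        # Confirm the highlighted item.
        return self.items[self.index]
```

Keeping the menu logic separate from the hardware buttons like this also makes it easy to unit-test without a physical device.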
Once the Microbit software was written and tested, the physical housing of the device needed to be developed. I compared two options for constructing the housing: wood fabrication and 3D printing.
Figure 2: Modelling the housing of the Microbit within 3DS Max. This was an initial prototype, with fewer curves and smaller spacing for the processing indicators.
To begin with, a sketch was made to identify the style and shape of the housing. It needed to be small yet comfortable to hold. Additionally, there needed to be affordances indicating the directionality of the device (i.e. which side the text is read from).
Wood fabrication would require specialised processing techniques to achieve the intended housing shape. These techniques were extremely intricate and would introduce a much higher cost, working against the initial aim of the project: an affordable solution. However, a wooden construction would look more professional from the outset and feel nicer in the hand.
Conversely, 3D printing was much cheaper and easier. The housing could be modelled in software and just as easily sent to the 3D printer to produce.
Figure 3: A mid-fidelity prototype of the housing. The side indicator spacing can be seen here, as well as a small hole for the gesture to be sensed.
After the first few stages of development, testing with users was conducted to see what aspects of the housing could be improved.
To begin with, a motif was developed to serve both as the logo and as a viewing window into the Microbit. It took the shape of a small chat bubble, signifying that people can communicate through the device. Secondly, the spacing for the indicator was lengthened further, with a small diffusing sheet of plastic placed along the sides. This softened the bright side lights while also improving the perceived build quality, as users could no longer see inside the housing.
Figure 4: The first stage of the fabrication process.
Finally, the model was sanded, painted and finished appropriately. The final product is a high-quality device that helps bridge communication between people who are hard of hearing, sight or speech and those who are not.
Figure 5: Some of the renders used in the final display of the product. The central motif of the speech bubble also gives a sense of direction to the product.