Want to Create Touchscreen Layouts that Feel Great? Embrace Human Movement Patterns
I’ve observed that people using large touchscreens experience varying degrees of fatigue and discomfort from the physical effort of swiping, tapping, and zooming. The placement of interface elements is one contributor to that discomfort: repeatedly executing a gesture that is at odds with the body’s natural movement patterns is fatiguing. Hands and arms naturally move in curves and arcs, yet screen layouts are predominantly rectangular and right-angled.
Why do designers create visual layouts that are less than comfortable for users? The answer: it’s what they know. For more than four decades, designers have been creating mouse- and pointer-driven graphical user interfaces (GUIs). These GUIs borrow heavily from graphic design methodology and are founded on grid theory, which is rooted in mathematics, specifically the ideas espoused by Pythagoras. Following this theory, composition elements are laid out on a grid composed of rectangles and right angles that have aesthetically pleasing proportions. The theory is based on established mathematical ratios, not human movement patterns.
Does this grid-theory approach – visual logic that has long ruled the design of GUIs – make sense for touch devices? Is there another way designers can create touchscreen layouts that are more aligned with the way the human body moves?
In my opinion, the answer is most definitely yes. But it requires that designers expand their focus, see beyond the screen they’re designing for, and look at the larger world. Say you’re designing a new bicycle. You’ll no doubt consider how the human body moves and how a rider will fit on the bike. Can the rider comfortably reach the handlebars and pedals with the prescribed seat placement? Are the handlebar grips molded to the contours of the human hand so they are comfortable during long rides? Can the bicycle move in a way that feels natural, especially over time and with physical exertion? In other words, does the bicycle work with the body, rather than against it?
Swipe is the New Click
Just as enhancing comfort and moderating fatigue are critical in bicycle design, so too are they in the design of touch-enabled devices. This is where designers need to step up their game. Touch interaction is often assumed to be simply a variation on mouse interaction – finger replaces mouse pointer. It’s not. Mouse interaction requires minimal physical effort, but touch gestures involve a variety of hand and arm movements, which can make them more taxing to perform. Designers who become students of human movement can create more enjoyable touch interactions. By applying to touch design the same attention to ergonomics and kinetics that goes into bicycle design, you can deliver a superior user experience.
For instance, if you stand in front of a large touchscreen and just move your hands around the screen, you will find that your hands naturally move in curves or arcs, and each hand makes arcs in a different direction. If you reach up and around, you trace an arc on the screen. If you reach in front of you, you trace an arc in the air that swipes the screen horizontally. These movements feel natural.
Researchers Kutliroff and Yanai(1) explored movement patterns like these. They observed that when a user interacts with a touchscreen device on a desktop, the elbow tends to rest on the desk and the hand naturally pivots in an arc. With this insight, they created a menu that originated in the upper-right corner of a screen and could be swiped open to reveal a series of buttons that arced toward the middle of the screen. They concluded that this shape was successful in creating a natural feel and in mitigating arm fatigue.
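To make the geometry concrete, here is a minimal TypeScript sketch of the arc-menu idea. It is not the researchers’ code: the pivot point, radius, sweep angle, and button count are illustrative assumptions that simply place buttons along an arc a desk-resting forearm could sweep comfortably.

```typescript
// A minimal sketch of the arc-menu idea, not the researchers' implementation.
// Buttons are placed along a circular arc whose pivot sits at the lower-right
// corner of the screen, roughly where a desk-resting elbow would be. The
// radius, sweep angle, and button count are illustrative assumptions.

interface Point {
  x: number;
  y: number;
}

function arcMenuPositions(
  screenWidth: number,
  screenHeight: number,
  buttonCount: number,
  sweepRadians: number = Math.PI / 3 // how far the menu fans toward mid-screen
): Point[] {
  const pivot = { x: screenWidth, y: screenHeight }; // approximate elbow position
  const radius = screenHeight * 0.9;                 // forearm-length sweep, in pixels
  const positions: Point[] = [];

  for (let i = 0; i < buttonCount; i++) {
    // Angle 0 points straight up toward the upper-right corner; larger angles
    // rotate the hand's path toward the middle of the screen.
    const angle = buttonCount === 1 ? 0 : (i / (buttonCount - 1)) * sweepRadians;
    positions.push({
      x: pivot.x - radius * Math.sin(angle),
      y: pivot.y - radius * Math.cos(angle),
    });
  }
  return positions;
}

// Example: five buttons on a 1920 x 1080 display, fanning from the
// upper-right corner toward the center.
arcMenuPositions(1920, 1080, 5).forEach((p, i) =>
  console.log(`button ${i}: left=${Math.round(p.x)}px, top=${Math.round(p.y)}px`)
);
```

Laying buttons out this way means a single pivot of the forearm can reach every item, rather than forcing the hand to travel along a straight row that cuts across its natural path.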
Much of the research in the ergonomics of touch has been done in the mobile realm, but designing for large touchscreens is a new frontier. Large touchscreens require more effort to use and don’t have the screen real estate constraints of mobile, so designers tend to think they can use this space to create desktop-style GUIs – GUIs that require users to interact rather mechanically, as if the finger were a mouse pointer.
Finding Solutions
The problem of creating touch layouts that feel natural is complex, and there are many variables at play. It turns out that good solutions go beyond static layouts to dynamic forms that the user can manipulate. Here’s an example: the content of an interface can have a three-dimensional shape that is only revealed when it moves. Think about a swipeable carousel. Although each individual carousel element (such as a group of images) can be right-angled, the carousel itself has a visible curvilinear shape that feels very comfortable to swipe. The arm can move in that natural arc as it swipes from one item to the next and back again.
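To illustrate, here is a minimal TypeScript sketch of one way a curvilinear carousel could be laid out. It assumes a simple DOM-based carousel in which each item receives a CSS transform; the spacing, dip, and tilt values are illustrative assumptions, not taken from any particular product.

```typescript
// A minimal sketch, assuming a DOM-based carousel where each item receives a
// CSS transform. Items drop and tilt slightly as they move away from the
// center, so the row forms a gentle arc that a sweeping arm can follow.
// Spacing, dip, and tilt values are illustrative assumptions.

interface ItemTransform {
  translateX: number; // horizontal offset, in pixels
  translateY: number; // vertical drop, in pixels
  rotate: number;     // tilt, in degrees
}

function curvedCarouselTransform(
  itemIndex: number,
  scrollPosition: number,    // current position in item units (0 = first item centered)
  itemSpacing: number = 320, // pixels between item centers
  dipPerItem: number = 24,   // vertical drop per item of distance from center
  tiltPerItem: number = 4    // degrees of tilt per item of distance from center
): ItemTransform {
  const offset = itemIndex - scrollPosition;
  return {
    translateX: offset * itemSpacing,
    translateY: dipPerItem * offset * offset, // parabolic dip approximates an arc
    rotate: tiltPerItem * offset,
  };
}

function toCss(t: ItemTransform): string {
  return `translate(${t.translateX}px, ${t.translateY}px) rotate(${t.rotate}deg)`;
}

// Example: lay out five items with the third item (index 2) centered.
for (let i = 0; i < 5; i++) {
  console.log(`item ${i}: transform: ${toCss(curvedCarouselTransform(i, 2))}`);
}
```

Because the dip and tilt grow with distance from the center, the items trace a shallow arc across the screen, and a swipe that follows the arm’s natural sweep keeps the user’s finger on the content rather than fighting a straight horizontal track.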
One of the guiding principles of good UX design is that an interface should correspond to people’s natural behaviors rather than forcing users to adapt to the limitations of the technology. When designing touch interaction, aim to maximize user comfort and minimize fatigue. That means you need to consider the body itself as much as you consider the visual logic of the layouts. Analyze how the hand and arm naturally move while gesturing, and design with those movement patterns in mind rather than creating interfaces that are built solely on visual logic.
Here’s why. While the traditional visual approach may initially attract users, as soon as they begin to interact with the screen, touch fatigue can set in. So when designing for touch gestures, ask yourself: if the interface’s content were tangible, how would I interact with it? Is that interaction natural and comfortable? This perspective puts the focus back on the user and helps you resist the compulsion to adhere strictly to visual logic, creating instead visual shapes that work with human gestures.
1. Chapter 3: Taking Control of Gesture Interaction, by Gershom Kutliroff and Yaron Yanai, in Designing for Emerging Technologies, Jonathan Follett, editor. O’Reilly Media, 2015.