Consumers may be slowly warming to the emergence of wearable technology, but that doesn’t make wearables any easier to operate.
The dominance of touchscreen user interfaces will decline over the next five years as more sensors are built into mainstream products and entirely new product form factors emerge, enabling, and necessitating, new user interfaces such as voice, gesture, eye-tracking, and neural control.
That’s the prediction from ABI Research’s recent report, which examines popular user interface (UI) methods as well as the natural sensory technologies transitioning from research labs into future products.
“Touch got mobile device usability to where it is today, but touch will become one of many interfaces for future devices as well as for new and future markets,” says Senior Practice Director Jeff Orr. “The really exciting opportunity arrives when multiple user interfaces are blended together for entirely new experiences.”
As mobile applications integrate more technology, the UI must be kept simple enough to be intuitive, ABI says in a press release detailing its new report.
“Packing a mobile device with sensors goes little beyond being a novelty,” adds Orr. “Complexity contradicts good UI design, and a critical mass of engaging mobile applications is required for mainstream adoption.”