We Rep [Ideas]

From Touch to Feel: Part 2

A continuation of a series I wrote for IC’s noodleplay some years back…

So what do we (and the brands we live by) gain by replacing our buttons with pixels and graphics? The answer is: possibilities.

Touch opens up a wider variety of interface and application options, freed from the physical constraints of older forms of interaction. It has improved the accessibility and experience of websites, video and gaming. It has created sentimental consumer demand for a new retro paradigm by transferring analogue artifacts (e.g. rotary phone interfaces, compasses) into the digital realm, and it offers a relatively low-cost, efficient way to try, fail, and improve upon even more new ideas.

[Image: haptic-feedback]

What do we lose by replacing our buttons with pixels and graphics though?

For many of us, sensuality. Touch eliminates the familiar tactile feedback associated with the push, click, and resistance of a button. Like somatosensory wastelands, flat touchscreen devices lack the stimulating vibro-electro-mechanical feedback of past interfaces, which has sparked criticism from both consumers and proponents of universal design principles. For fewer of us, but equally if not more important when considering the “unmet and unarticulated consumer needs” that many of us say should drive design thinking, touch threatens accessibility. Designers of touch interfaces have yet to seriously consider how, for example, a person with a visual disability will interact with current and future products.

Steps towards feeling

Ongoing, innovative work is being done to recapture, improve upon, and amplify the tactile and multi-sensorial qualities of future interfaces. Much of this work points to emerging transitions in the first wave of feel and feelback systems. For example, a recent project entitled “Dynamically Changeable Physical Buttons on a Visual Display,” conducted by Chris Harrison and Scott Hudson at Carnegie Mellon University, exemplifies an incremental push towards more tactile forms of touch-based interaction. In a more radical fashion, the “Airborne Ultrasound Tactile Display,” a new holographic display system developed by researchers at the University of Tokyo, enables users to experience tactile feedback through focused ultrasound waves that produce vibrations felt on the skin. Other new interface and interaction modalities serve as starting points for further thought and discussion about the ongoing shift from touch to feel.

[Image: philips_skin]

Haptic Technology has evolved well beyond the Rumble Pak video game controller and the iPhone’s turn-to-view or shake-to-shuffle interactions towards more sophisticated forms of input/output. Novint’s newly released Falcon gaming controller is an excellent example of a design evolution enabling entirely new gaming experiences. Philips’ force-feedback jacket and similar projects by the United States Department of Defense aim at increasing a grunt’s situational awareness and ability to feel their way around the battlefield.
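To make the basic input/output loop concrete, here is a minimal sketch using the web Vibration API — the most widely available programmable haptic today, and a far cry from the Falcon’s force feedback. The element id is invented for illustration.

```typescript
// Minimal sketch: touch in, tactile pattern out.
// Uses the web Vibration API where available; a rough stand-in for
// richer force feedback like the Falcon's.
function buzzOnTouch(element: HTMLElement): void {
  element.addEventListener("touchstart", () => {
    if ("vibrate" in navigator) {
      // 40 ms pulse, 30 ms pause, 40 ms pulse -- a crude imitation
      // of a physical button's press-and-release click.
      navigator.vibrate([40, 30, 40]);
    }
  });
}

// "soft-button" is a hypothetical element id for this example.
buzzOnTouch(document.getElementById("soft-button")!);
```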

Tele-presence and Tele-intimacy are pushing the boundaries in more personal and intimate ways. Consider teledildonics: Internet-connected and mobile sex toys that enable direct feelback stimulation between partners. Alternatively, Mutsugoto, a project developed at Distance Lab, uses computer vision and a projection system to “allow users to draw on each other’s bodies – enabling a different kind of synchronous communication that leverages the emotional quality of physical gesture.”
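Strip away the hardware and the core of such tele-haptic systems is a small relay: a touch event on one end becomes a tactile event on the other. A hedged sketch, with an invented server URL and message shape:

```typescript
// Tele-haptic relay sketch: forward local touches over a WebSocket
// and render the partner's touches as local vibration.
// The URL and message format are hypothetical.
const channel = new WebSocket("wss://example.com/touch-channel");

// Outbound: send each local touch to the remote partner.
document.addEventListener("touchstart", () => {
  if (channel.readyState === WebSocket.OPEN) {
    channel.send(JSON.stringify({ type: "touch", durationMs: 150 }));
  }
});

// Inbound: play the partner's touch as a vibration pulse.
channel.addEventListener("message", (event: MessageEvent) => {
  const msg = JSON.parse(event.data);
  if (msg.type === "touch" && "vibrate" in navigator) {
    navigator.vibrate(msg.durationMs);
  }
});
```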

[Image: havesexwithyourcomputer]

In the realm of Gestural Interface, Nokia has been exploring how pointing, waving, and flipping a phone over to silence it can enhance mobile experiences. These natural gestures and spur-of-the-moment emotional responses to a disruptive incoming call create the illusion that the device can see, sense and feel its user. Microsoft’s Project Natal for Xbox 360 pushes this apparent feelback even further by mixing computer vision with an avatar (read: an intelligent agent) that can recognize a user’s facial features and, to some degree, displayed emotions to deliver a more ‘natural’ interaction and compelling experience.
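As an illustration (not Nokia’s actual firmware), flip-to-silence can be approximated in a browser by watching the accelerometer: when gravity registers as strongly negative along the device’s z-axis, the screen is facing down. The silenceRinger hook is hypothetical.

```typescript
// Sketch of "flip over to silence": a device lying flat screen-up
// reads accelerationIncludingGravity.z near +9.8; screen-down reads
// near -9.8. Crossing most of the way to -9.8 means it was flipped.
const GRAVITY = 9.81;

window.addEventListener("devicemotion", (e: DeviceMotionEvent) => {
  const g = e.accelerationIncludingGravity;
  if (!g || g.z === null) return;
  if (g.z < -0.8 * GRAVITY) {
    silenceRinger(); // hypothetical hook into the phone's audio stack
  }
});

function silenceRinger(): void {
  console.log("Incoming call muted: device flipped face-down.");
}
```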

Also known as brain-machine interfaces (BMIs), Neural Interfaces employ non-invasive fMRI and EEG signal scanning techniques to enable mind-to-machine interaction. Applications include the control of robotic limbs, gaming, therapeutic exercises for treating ADHD, communication and art. One example comes from NeuroSky, which has developed a ‘MindSet’ application where users visualize their brainwaves as they listen to music – described on the company website as the ability to “translate feelings into actions.” InteraXon is another key player in this thought-controlled computing game.
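What might “translating feelings into actions” look like in code? A loose sketch: consume a stream of attention scores (NeuroSky-style headsets report a 0–100 value roughly once a second) and cross a threshold into an action. The data here is canned; a real headset streams over Bluetooth or serial.

```typescript
// Toy mapping from an EEG-derived attention score to an action.
type AttentionSample = { timestamp: number; attention: number };

function actOnFocus(
  samples: Iterable<AttentionSample>,
  threshold = 70 // arbitrary cutoff for this illustration
): void {
  for (const s of samples) {
    if (s.attention >= threshold) {
      console.log(`${s.timestamp}: focused (${s.attention}) -> raise volume`);
    } else {
      console.log(`${s.timestamp}: drifting (${s.attention}) -> soften music`);
    }
  }
}

// Canned data standing in for a live EEG stream.
actOnFocus([
  { timestamp: 1, attention: 42 },
  { timestamp: 2, attention: 78 },
]);
```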

[Image: muse]

Bio-emotional Interfaces harness emotions, cognitive states and physiological states as input/output modalities. According to Philips, SKIN, one of their many inspirational design explorations, signifies a shift from ‘intelligent’ to ‘sensitive’ products and technologies by integrating new materials into the area of emotional sensing. Although we have yet to experience widespread intelligent products, the promise of sensitive ones is certainly alluring.

Affective Computing suggests markets emerging around new experiences with sensitive products and services based on the softer side of input – mood, feeling and emotion. MIT’s Affective Computing group is conducting a wide range of research and design that focuses on, among other things, new affective sensing techniques, machine learning algorithms, technologies that help people become more aware of their emotional states and communicate them, and the ethics of Affective Computing. Applications in this domain range from serving people with autism to gathering customer experience data to mobile health applications like outpatient monitoring.
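To ground “the softer side of input,” here is a toy sketch that reduces two physiological signals to a crude arousal estimate. Real affective-computing systems train machine-learning models over many channels; the baselines and thresholds below are invented for illustration.

```typescript
// Toy affective sensing: two signals in, one coarse arousal label out.
interface PhysioReading {
  heartRateBpm: number;        // beats per minute
  skinConductanceUs: number;   // microsiemens (electrodermal activity)
}

function estimateArousal(r: PhysioReading): "low" | "medium" | "high" {
  // Normalize each signal to a rough 0-1 scale (assumed resting baselines).
  const hr = Math.min(Math.max((r.heartRateBpm - 60) / 60, 0), 1);
  const eda = Math.min(Math.max((r.skinConductanceUs - 2) / 10, 0), 1);
  const score = 0.5 * hr + 0.5 * eda;
  return score > 0.66 ? "high" : score > 0.33 ? "medium" : "low";
}

console.log(estimateArousal({ heartRateBpm: 110, skinConductanceUs: 9 })); // "high"
```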

[Image: milo]

Research being conducted on Artificial Intelligence and Assistants could well lead to the emergence of working relationships between people and their intelligent assistants: A.I. entities that understand and help us satisfy our needs. DARPA-funded projects like PAL (Personalized Assistant that Learns) and SRI International’s CALO (Cognitive Assistant that Learns and Organizes) are pushing the boundaries of what these intelligent agents might do to help us maximize our individual and collective potential. Applications include managing tasks, social networks and interactions, as well as gathering, organizing and preparing information. Note: these are not the friendly paper clips or wizards we’ve grown accustomed to on Windows machines.

These are all steps in the shift from touch to feel, a transition that might eventually combine potentials and characteristics to enable entirely new, sense-based forms of interaction, communication and exchange. The move from disparate, single-modality interaction towards multi-modal interaction will be slow, but when (and if) it occurs, the illusion of predictive modeling and suggestion will be shattered by a new reality in which our products, objects and devices, over time and through new forms of usage intimacy, get to know us, feel us and learn how to better meet our needs.

How will this shift change the way products are defined, shaped, and made? Will it make products or services easier, better, more enjoyable, more intuitive or more meaningful to use?

Stay tuned for Part 3…
