when humans become integrated with machines?
What new considerations will we designers have to take into account when our users interact with devices that are plugged directly into their nervous systems, or installed in their bodies?
The good folks at Superflux have been thinking about this in the context of curing blindness. Here’s what they have to say:
What if we could change our view of the world with the flick of a switch? The emerging field of optogenetics combines genetic engineering and electronics to manipulate individual nerve cells with light. With this technology, scientists are developing a new form of retinal prostheses. Using a virus to infect the degenerate eye with a light-sensitive protein, wearable optoelectronics can establish a direct optical link with the brain. Song of the Machine explores the possibilities of this new, modified – even enhanced – vision, where wearers adjust for a reduced resolution by tuning into streams of information and electromagnetic vistas, all inaccessible to the ‘normally’ sighted.
I’ve been fascinated with this concept – the idea of enhancing human experience through designed objects that integrate with our bodies – ever since I watched this TED talk by Aimee Mullins, who sees her prosthetic legs as a desirable enhancement rather than simply a replacement:
Aimee doesn’t want her old legs back. She considers herself better off being able to change her body to suit her mood and her desires.
It got me thinking: how does the job of a UX designer change when we can help users meet their needs not just by designing digital interfaces – on computers and handheld devices – that are easy to understand, learn, and use, but by changing a user’s body? What new needs can we help people meet? What ethics should we follow? What should we do, for example, when a perfectly sighted person wants to blind himself to take advantage of an enhanced visual system?
What do you think?