Work from Hiroshi Ishii's Tangible Media Group at MIT. From left: the Glume modular modeling medium, the Senspectra modeling toolkit, and the Topobo programmable robot toy.
Last week, Boston's Hynes Convention Center hosted the CHI conference, an annual event that showcases the world's best and brightest in human-computer interaction design. Though by its nature the conference is often very software-focused, this year's edition was deeply concerned with the physical: mobile devices, wearable computing and tabletop surface displays played a larger role than ever before.
Amongst the many presentations of hybrid software-hardware experiments and studies into the practicalities of interface development was one highly conceptual presentation/panel entitled "Eek! A Mouse! Organic User Interfaces: Tangible, Transitive Materials and Programmable Reality," where heavy hitters presented their visions of how computing devices will move away from the keyboard and mouse and manifest in unexpected forms. "Industrial design is the new interface design" was the mantra of the week, and this panel was composed of researchers whose passion lies in the tangible manifestation of dynamic data. According to the panel, which included famed researchers Hiroshi Ishii and Pattie Maes from the MIT Media Lab, along with Seth Goldstein of Carnegie Mellon University, Sony's Jun Rekimoto and media artist Sachiko Kodama, data-laden, sentient, computational devices will be embedded in the very fabric of everyday objects.

First up was Hiroshi Ishii, the master of tangible bits himself. As Director of the Tangible Media Group at the MIT Media Lab, he has been instrumental in the development of many projects that embed digital technology in physical objects. He began his presentation with a historical overview of his lab's work as a way to introduce his vision for the "next 100 to 200 years." With an accompanying video entitled "Perfect Red," he described how the "Radical Atoms" project, in conjunction with nanotechnology research, plans to create "physical pixels": individual particles that can sense and respond to real-time input. In the video, a designer grabs some "radical atom" matter and, in a process Ishii termed "kinetic sketchup," records the motion of the modeling activity as it shapes the material into the form of a child's rattle. The form definition is then used to manufacture the thing itself.
Later, Jun Rekimoto, Director of the Sony Interaction Lab, described "interfaces as skin," explaining that he foresees surfaces with a level of response and feedback sophisticated enough that a keyboard not only senses finger presses but knows how much pressure is being applied to which part of the keyboard at any given moment. All of this dynamic input can then be used to create appropriate product behaviors, so that objects automatically sense exactly how they are being held.
Sachiko Kodama's ferrofluid sculpture
Next up was media artist Sachiko Kodama, who uses ferrofluid (a viscous, flowing substance containing tiny iron particles that become polarized in the presence of a magnetic field) to create choreographed moving sculptures. In a video entitled "Morpho Towers," masses of dark grey matter undulate, protrude and reconfigure themselves into an animated landscape of solid, three-dimensional curved patterns. Imagine a pile of sand that rises up from the ground to form solid volumes moving in smooth, organic paths, and you have some sense of how mesmerizing these sculptures really are.
Student projects from the Fluid Interfaces Group at the MIT Media Lab include interactive shutters, dynamic carpets, wirelessly paired pillows that transmit LED lit messages, urban displays, paper with embedded circuits and flexible input foam structures.
Following Kodama was Pattie Maes, who heads the MIT Media Lab's Fluid Interfaces Group. She called for a focus on object qualities and a need to create technological objects that we can cherish, objects that evoke the kinds of emotional responses we are accustomed to with non-technological products. This kind of directive is par for the course for most designers, but her lab takes on the challenge of creating emotional attachment through digital behaviors that are integral to a product's identity. Amongst the projects shown in her part of the presentation were Relational Pillows that wirelessly communicate lighted patterns in real time, and I/O Paper with circuits embedded into the pulp.
The "Claytronics" vision from Carnegie Mellon University
The final presentation was delivered by Carnegie Mellon's Seth Goldstein. In his lab's vision of "Claytronics," illustrated through a video showing a fictional group of designers discussing the form of a new car, masses of particles imbued with sensing abilities can be handled directly and used as an input medium. In other words, instead of using an input device such as a mouse or keyboard and an output device such as a screen, his "digital clay" can specify and communicate three-dimensional information just by being molded with your hands. Parts of the domestic landscape, such as chairs and tables, can be formed and reformed at will, composed of particles that sense, transmit and manipulate data on demand.
While we found the panel incredibly inspirational, we couldn't help but wonder how close the projects showcased are to coming out of the lab and into users' hands. Seth Goldstein boldly proclaimed 2015 as a "conservative estimate," while Hiroshi Ishii reiterated his estimate of the one-to-two-hundred-year timeline required to make the technology a reality. While it all seems very promising and prescient, none of the panelists could describe a clear vision for power management (with all these advances, will we still have to lug around batteries and power cords?), admitting that this is a tough problem the physicists in the labs next door are tackling. Regardless of the time frame, every panelist expressed confidence in the ability to produce the future as described, stating that the technology is essentially in the works in their respective labs. Though the researchers envision a fascinating future of possibilities, it's clear that designers will be needed more than ever to act as mediators, determining appropriate and meaningful ways to embrace these new ways of relating to our synthetic world.