In anticipation of the upcoming IxDA Interaction12 Conference taking place in Dublin, Ireland February 1-4, Core77 will be bringing you a preview of this year's event. Follow us as we chat with keynote speakers, presenters and workshop leaders to give you a sneak peek at some of the ideas and issues to be addressed at this year's conference. Come by and say hello to us at the Coroflot Connects recruiting event and don't miss out on our live coverage as we report from the ground in Dublin!
* * *
19th century culture was defined by the novel, 20th century culture by cinema, the culture of the 21st century will be defined by the interface.
-Lev Manovich, Media Theorist
What will the future of interaction look like in 10 years? 20 years? According to Pete Denman, we will see personalized reactions to a singularity of "vibrant data." Pete is a Portland, Oregon-based interaction designer for Intel Labs, a job title that barely existed a few years ago. But in a digital lifestyle where our mobile phones, streets, televisions and even our shoes are a nest of analog sensors, these ubiquitous interactions are increasingly governed by digital information transforming the physical into data.
This data will revolutionize how we act and interact: Biomimetic infographics will help us to interpret data, according to Denman. These biomimic techniques tell a story—whether it's the rings on a tree, the petals of a flower or the depth of clutter on your desk—as Denman will discuss at his presentation, "Biomimic Infographic," during IxDA's Interaction12 conference taking place in Dublin this week.
What exciting things are Intel Labs currently working on?
Pete Denman: There are some great things being done in Intel Labs. The projects and ideas produced by the group I work in—Interaction and Experience Research—range from 'revolutionary' to 'thrilling.' The Labs is a crazy place where ideas are currency, and we are purchasing the passion/excitement/talent of our co-workers. Everything from automobile technology to sensing, data, mobile and cloud tech. I'm sorry I can't be more specific. The things we are working on are in the delicate process of becoming "real" and are not ready to be shared yet.
How does Intel Labs make use of Rapid Prototyping in your everyday work? Specifically, what tools and processes are you finding most helpful to experience design and why?
Everyone has their favorite tools. I do my prototyping for UI in Flash/ActionScript. It gives me rich visual capabilities and a vast array of interaction tools and content connections. I am able to generate these prototypes quickly, and I do my testing mostly on iOS. Yup, on the iPhone/iPad making and testing prototypes is pretty easy. In the past months I have begun to explore other options since Adobe seems to be abandoning my favorite tool. Recently I have taken Android and iOS training, but programming natively in either of those environments seems to limit me to one platform, and while knowing the structure and capabilities is great, it commits me to that camp. I've decided to give up my heretic ways and make the agnostic choice: HTML5 and CSS. Now I just need to become as fast with them.

We seem headed towards an integration between the physical world and the digital cloud with devices like Intel's gesture-control ultrabooks, Berg's Little Printer, Microsoft's Kinect and even the rise of the Internet of Things. What are some future trends in interaction that you see in your own work?
We will see more personalized reactions to data. It's not the same as personalized content! Personalized reactions are a much bigger idea and happen everywhere, and in ways you might never see. We call it "vibrant data." Over the last few decades we have let technology into our lives, and this technology bristles with sensors. These sensors monitor where we go, how we get there, what the temperature is, how much we eat, what our activity levels are and what condition we're in. The thing is, this data is largely thrown away or unused. We are working on ways to wake this data up and have it work for you. We see a future when this data enriches your life in extraordinary ways while staying secure and anonymous. As ordinary as it seems, information like this can be filtered to create a holistic view of a person, their environment and activities. When compared against other people, suddenly abstract connections emerge.
Imagine sharing resources with people you've never met. You might pick up a stranger's laundry as a different stranger is dropping off those organic tomatoes you love. Abstract connections like the examples we can think of now are just the tip; opening this data up for people to play with will produce connections we never imagined. As our tools to look into the depths of the data improve, connections will begin to emerge. It seems silly to think that the sleeping patterns of my cat will help a 15-year-old boy in China develop an app that compares "cat naps" to hurricanes and somehow cures foot fungus, but without the data we'd never know if there were a connection. Data will revolutionize how we act, and interact.
How can biomimicry techniques help us make sense of the big data sets we encounter and generate everyday?
It's really about seeing the forest and not the trees: abstracting the details and organizing them the way nature would. We have a built-in affordance for the information nature gives us. Inherently we understand that the yellow and black markings of a bumblebee are a warning, while the thickness of a tree limb indicates its age. We are sitting on a foundation of 200,000 years of human experience, pre-programmed to understand nature's data, and without effort we make sense of it. Sharp/dull, hot/cold, bright/dark, color, texture, smell, sound: data brought back to its essence, back to nature's constructs, and not forced into a pie or bar or bubble chart.
We see industrial designers and architects embracing this nexus of biomimicry, data visualization and digital fabrication. Can you speak more of this apparent cross-over of disciplines in the industry and how it relates to your work?
At first I was looking at biomimicry as an aesthetic solution to give interest to data sets, which it did very well. People in user tests understood the visuals, gravitated to the graphics and began to play with the filtering tools. Based on that, I'm now taking a broader biomimetic approach by letting the data dictate structure and appearance using nature's favorite building block: the fractal. I suppose I could draw correlations to other efforts in biomimicry, but really, my main inspiration was the need for a different way to show data, specifically lots of data. Bar charts, pie charts, sparklines, scatter charts, treemaps... even the infographic with its wry gags and arrows, all give some understanding of a given data set. The trouble is that they are focused, a 'snapshot' look at something specific, or in the case of the infographic, delivering data to us in stylistic metaphors. With a biomimic approach the idea is getting the data out there so the user can see it all in an organic form, allowing our senses to take over and enabling discovery. That's where this idea shines in its abstractness. It goes back to data reacting for us; this is the kind of tool to help that happen, to let our senses wash across something.
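To make "letting the data dictate structure and appearance" concrete, here is a minimal illustrative sketch (not Denman's actual implementation) of a fractal-tree layout in which each node's value sets its branch length and its children fan out as sub-branches, so the shape of the tree is generated by the data itself. The function name, the (value, children) data shape and the shrink/spread parameters are all assumptions chosen for the example.

```python
import math

def fractal_layout(node, x=0.0, y=0.0, angle=90.0, scale=1.0, spread=40.0):
    """Turn nested data into fractal-tree line segments.

    `node` is a (value, children) pair: the numeric value scales the
    branch length, and each child becomes a sub-branch fanned out
    around the parent's direction, so the data dictates the tree's
    structure and appearance. Returns a flat list of (x0, y0, x1, y1)
    segments that any plotting library could draw.
    """
    value, children = node
    length = value * scale
    x1 = x + length * math.cos(math.radians(angle))
    y1 = y + length * math.sin(math.radians(angle))
    segments = [(x, y, x1, y1)]
    n = len(children)
    for i, child in enumerate(children):
        # Fan children evenly across `spread` degrees; shrink each
        # generation by 0.7 so the tree tapers like a natural one.
        offset = spread * (i - (n - 1) / 2) / max(n - 1, 1) if n > 1 else 0.0
        segments += fractal_layout(child, x1, y1, angle + offset,
                                   scale * 0.7, spread)
    return segments

# A toy dataset, e.g. activity totals grouped into sub-categories.
data = (1.0, [(0.8, [(0.5, []), (0.9, [])]),
              (0.6, [(0.4, [])])])
segments = fractal_layout(data)
print(len(segments))  # one segment per node in the data
```

Richer datasets would map further attributes onto the form the same way, for instance value variance onto branch angle or recency onto thickness, which is the sense in which the data, not a fixed chart template, decides what the visualization looks like.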
The central component of this interactive environment is hand gesture. What comes next? What are some of your favorite examples of touch-less technology or gestural interfaces you've seen recently?
Here's the thing: I'm disabled. Actually, I'm really hard-core disabled. I'm a quadriplegic after a spinal injury 23+ years ago. I have very limited use of one arm and no use of my fingers or wrist. I bring this up because I have a unique view on interaction. To make things work for me I have had to invent tools, dissect devices and repurpose things that were never meant to go together. A few years ago I began to notice something interesting about new user interfaces. It was as if they were being designed for me, as if the user interface designer had someone with physical limitations in mind. Touch screens, soft keyboards, word prediction and touch-less input were all used for years in yesterday's cutting-edge handicap user interfaces. Twenty years ago I had a $7,000 voice-recognition hardware/software package called DragonDictate. Today we still have Dragon, but it's common software used by millions. Apple has taken voice input even further, weaving Siri into key elements of iOS. No-contact motion interaction has been used for decades to control wheelchairs. Today Kinect brings the no-contact interface into everyone's living room. The disabled need the cutting-edge interface because they can't use the standard one. But they aren't the only ones using tomorrow's technology... when you find an intersection of "need" and "motivation," people do amazing things.
About Marko Manriquez
Marko Manriquez (@markolicious) is a new media artist and maker of multidisciplinary mythologies based in Brooklyn, NY. His current work explores paper art, biomimetic-inspired digital fabrication, data representation and moss graffiti/post-graffiti. Marko is currently a graduate student at New York University's Interactive Telecommunications Program (ITP), where his thesis is the world's first burrito-making 3D printer, a commentary on food politics and sustainability. He prefers riding a bike to riding the subway, even in winter.