
UX

The Core77 Design Blog

Posted by hipstomp / Rain Noe  |  26 Jan 2015  |  Comments (2)

0movieUI001.jpg

Looking at Dino Ignacio's work got me thinking about fantasy-based UI design. The first time I really became aware of motion graphics houses cooking up digital UIs was probably way back during Minority Report or one of the Matrix movies. Now that it's more than ten years old, you can see how primitive it looks:

It's obvious the operator isn't really doing anything, unless there's some value to aimlessly moving an on-screen tile back and forth. But the first time I saw it, it was fairly mind-blowing, monochromatic though it was.

Fast-forward to today, and sci-fi movie UI is nothing short of jaw-dropping.

0movieUI002.jpg

0movieUI003.jpg

0movieUI004.jpg

0movieUI005.jpg

0movieUI006.jpg

In the theatre we see these interfaces flash across the screen in instances too brief to let us appreciate them. But thankfully the motion graphics houses that create them turn them into "sizzle reels" readily found on YouTube and Vimeo, where we can freeze-frame them and pore over them at will. Here are Territory Studio's stunningly beautiful Guardians of the Galaxy interfaces:

continued...

Posted by hipstomp / Rain Noe  |  23 Jan 2015  |  Comments (0)

0dignacio001.jpg

We all know that your average, workaday industrial designer's work often goes unsung. The same could be said of the guys who design UI for videogames. And when those games are designed with an efficient UI, literally millions of players work through those games in smooth immersion, never considering the pains a designer took to make it so.

Dino Ignacio is one such designer. He's the lead for UI Design over at Visceral Games, and a Kill Screen article called "How Dead Space 3 Pulled from Dieter Rams and Instagram" highlights what Ignacio does, like ensuring that game interfaces are designed properly for the hardware they're running on:

"The problem is that most games design thinking they'll have dropdown menus," he says. It reflects a fundamental disconnect between what game designers want and what the players need. Designers suddenly realize the freedom of motion on the PC isn't available on game consoles. "A lot of UI is designed with the mouse in mind. It never translates."

These types of decisions unwittingly doom many games before they've even started. It's Ignacio's job to make sure that doesn't happen. As the user interface design lead for the survival horror game Dead Space 3, he's tasked with designing all the elements that a player might need to navigate and manipulate this virtual world. His fingerprints are all over what you see on screen. To be more specific, it's what you don't see.

0dignacio002.jpg

In the video below, Ignacio walks you through the weapons crafting interface he designed for the Dead Space series, and shows you how it evolved through the games. (Warning: Potentially NSFW, contains gory action footage.)

continued...

Posted by Anki Delfmann  |  22 Jan 2015  |  Comments (0)

passagen2015_cologne_labor_klanglichter_kreter_hekimoglu2.jpg

Klanglichter by Onat Hekimoglu and Tobias Kreter

Our first stop during Cologne's design week is Passagen, a collection of 190 exhibitions scattered throughout the northern part of the city. Off the beaten path for those more used to strolling through established hubs and brands, the chilly walk led us to some unusual venues and reused spaces. Our favorite exhibition was held in an empty, glass-fronted shop space in the brutalist concrete underground station of Ebertplatz. LABOR: Design n+1 by Köln International School of Design showed experimental objects and lighting, exploring the boundaries of art, design and research.

Klanglichter, above, is a laser harp that combines gamification and music-making. The Arduino-based audiovisual interactive installation was designed by Onat Hekimoglu and Tobias Kreter. Fueled by the will to hit targets on a projection on the wall, visitors play the laser harp to create new compositions.

passagen2015_cologne_labor_binary_talk_isselburg_kilian.jpg

passagen2015_cologne_labor_binary_talk_isselburg_kilian2.jpg

Binary Talk by Niklas Isselburg and Jakob Kilian transforms the ASCII data of a word into binary code, which is then translated into a smoke signal sent off through the air by a subwoofer. We loved this experimental approach to uncovering the hidden processes of modern communication. The project combines advanced technology with one of the oldest forms of long-distance transmission, the smoke signal. Light sensors in the recipient module detect the binary smoke puffs, which are translated back into ASCII code on a second computer. Mistakes in interpretation caused by a breeze in the room remind us of the telephone game—and of the accuracy we have come to expect from modern means of communication.
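
For the curious, here's a minimal Python sketch of the encoding idea described above—each character's ASCII code becomes eight bits, each bit a puff or a pause. It's purely an illustration, not the designers' actual code:

```python
# Illustrative sketch of the Binary Talk idea (not the installation's code):
# each character becomes its 8-bit ASCII code, each bit a "puff" (1) or a
# "pause" (0), and the receiver reverses the process.

def encode_to_puffs(word: str) -> list[int]:
    """Turn a word into a flat list of bits, one 8-bit frame per character."""
    bits = []
    for char in word:
        code = ord(char)  # ASCII code point
        bits.extend((code >> i) & 1 for i in range(7, -1, -1))  # MSB first
    return bits

def decode_from_puffs(bits: list[int]) -> str:
    """Reassemble 8-bit frames back into characters."""
    chars = []
    for i in range(0, len(bits), 8):
        code = 0
        for bit in bits[i:i + 8]:
            code = (code << 1) | bit
        chars.append(chr(code))
    return "".join(chars)

if __name__ == "__main__":
    signal = encode_to_puffs("HI")
    print(signal)                     # [0, 1, 0, 0, 1, 0, 0, 0,  0, 1, 0, 0, 1, 0, 0, 1]
    print(decode_from_puffs(signal))  # HI
```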

continued...

Posted by Kill Screen  |  20 Jan 2015  |  Comments (0)
This post originally appeared on Kill Screen, a videogame arts and culture website.
Sunset_header.png

Story by Jess Joho for Kill Screen

Tale of Tales' upcoming game Sunset has beguiled us since we first saw it—a vision altogether more assured, colorful and inviting than the vast majority of games we come across. Last week, when the first real look at the game arrived in the form of screenshots, we took the opportunity to discuss the game a little further with the creators.

Gaming's favorite (and only) Belgian power-couple, Auriea Harvey and Michaël Samyn, first began thinking of what would eventually become Sunset years ago. Inspired by films like Wong Kar Wai's Chungking Express, they envisioned an exploration of romance through space, in a relationship between a cleaning lady and the apartment's inhabitant. But as time went on and the game evolved, their focus turned from pure romance to a more pressing issue.

s_16_big.png

"How do you get on with your day-to-day while living in a world or atmosphere filled with violence?" Auriea asks. "I think we're all experiencing that to some degree now, with the constant wars, terrorism, everyday another bombing, another shooting. Michaël and I at least feel this need to step back, to think about what it all means and how to deal with it. We thought if we needed an experience like that, maybe other people did too. And since games can be such a great tool for examining the world around us, maybe something like Sunset could be an opportunity to explore that atmosphere in a controlled environment."

Sunset follows the story of Angela Burnes, who emigrates from the revolutionary climate of 1970s America into the revolution of a war-torn South American country. As housekeeper to Gabriel Ortega, a highly cultured (perhaps even pretentious) native of the country, your relationship to both your employer and his country develops through how you choose to interact with his apartment.

"A lot of Tale of Tales games has ourselves in it," Auriea explains. "In this case, my experience of being an American expatriate is definitely part of Sunset. I want to give people that experience: of at first feeling completely alien to a city, and then eventually developing a sense of place there, a real stake in your new country."

continued...

Posted by hipstomp / Rain Noe  |  16 Jan 2015  |  Comments (0)

0saitekknobfeel.jpg

I loved KnobFeel from the moment we first covered the site. To refresh your memory, it's a guy in the UK who provides succinct, non-verbal video reviews of knobs, like so:

Tells you all you need to know in just a few seconds.

And while knobs are fairly straightforward, the most recent KnobFeel review tackles something a good deal more complex: Saitek's X52 Control System, a pair of sprung joysticks bristling with multiple knobs, dials, lights and switches. Ex-videogame-tester and video editor Drew Scanlon provides the special guest review in the proper style, though with a rather KnobFeel-atypical ending:

continued...

Posted by hipstomp / Rain Noe  |  15 Jan 2015  |  Comments (6)

0dillonmarkey001.jpg

Dillon Markey is a Los Angeles-based stop-motion animator who works for Robot Chicken and the film director PES. And as we saw in the Boxtrolls video, stop-motion work requires making thousands upon thousands of minute adjustments. But what we didn't see in that video was the animator stepping away from the stage after each adjustment—the constant back-and-forth dance the artist must do to interact with his capture equipment.

Markey, tired of this dance, sought to create a body-mounted remote control solution that would allow him to remain within arm's reach of the stage. With no such product existing on the market, Markey hacked one up himself with a little help from an electrical engineer. What's most impressive is what they used: A Nintendo Power Glove, a failed game accessory product from 1989.

0dillonmarkey002.jpg

0dillonmarkey003.jpg

Here's how they did it. And be sure to pay close attention around 4:25 in the video to check out Markey's brilliant integration of a self-parking tweezer dock.

continued...

Posted by Kill Screen  |  13 Jan 2015  |  Comments (0)
This post originally appeared on Kill Screen, a videogame arts and culture website.
killscreen_alvarez-maze-2.jpg

Story by Chris Priestman for Kill Screen.

Corridors are a significant architectural space in a lot of science fiction films. Perhaps you haven't considered this before, given the brevity of their screen time. And that's largely due to the corridor's purpose as an interstitial space that connects rooms, meaning they are usually walked or run through, not dwelled upon.

One of the first lessons in filmmaking is to cut out any unnecessary footage when editing, and the example used is often a shot of a character walking between locations. It's a lesson that teaches its students that the corridor is, for the most part, a waste of time. Hence, when a corridor does make an appearance in a film, it is never happenstance; it's always for effect.

killscreen_alvarez-maze-1.jpg

Take the Death Star in Star Wars Episode IV: A New Hope, which is depicted almost entirely as a labyrinth of imposing corridors to be run through. This builds the idea that the space station is a hive of militaristic activity with soldiers constantly striding through to deliver messages to their superiors. Everyone is walking and talking; people in a constant state of transit in front of the camera.

Later on, the same corridors play host to Han Solo and Chewbacca's escape, forcing them to run and shoot half-backwards at the pursuing Stormtroopers for lack of cover. Similarly, in Alien, Ellen Ripley alternately dashes and creeps through the hissing smoke and spinning red alarms of the Nostromo's corridors as it counts down to self-destruction. These narrow, dark passages are the veins and arteries of this enormous star freighter, which has become host to a deadly creature. The corridors provide nowhere for Ripley to hide, but are the quickest way for her to reach the evacuation shuttle she has prepared for herself.

continued...

Posted by hipstomp / Rain Noe  |   5 Jan 2015  |  Comments (1)

0montblancestrap001.jpg

The Apple Watch launches this year, which means we'll soon see loads of imitators and competitors. The 2015 CES is bound to be smartwatch-crazy. But one prominent watch manufacturer is taking a different tack: Offering a smart watch strap that can be attached to different watches.

The Montblanc E-Strap is an Italian-made piece of leather texturized to look like carbon fiber (?!?) connected to a tiny screen that sits on the inside of the wearer's wrist. This OLED display is actually a touchscreen, and while I can't imagine it allows a wide variety of gestures given its diminutive size (0.9 inch) and resolution of just 128x36, the company reckons it will be good enough to provide notifications and some remote control functions. According to watch news site A Blog to Watch,

In addition to basic calls, texts, e-mails, calendars, social media, and reminder notifications, the e-Strap will function as an activity monitor/tracker with a pedometer and accelerometer to measure data that feeds into an included iPhone or Android phone app.

continued...

Posted by hipstomp / Rain Noe  |  31 Dec 2014  |  Comments (1)

0michellevandy001.jpg

What would you do if you couldn't work a mouse?

Michelle Vandy was an architecture student who developed a repetitive strain injury in first one arm, then the other. Obviously this was a rather crippling setback. "All my interests and hobbies revolved around my arms and my studies and future career depended on them too," she writes. "But now all I had was a pair of useless extremities causing me pain."

While she could still use her arms to perform basic tasks, interacting with the computer long-term was proving impossible. She began experimenting with non-arm-based interface methods, like using a Leap Motion sensor in conjunction with a foot-powered keyboard device. That didn't work out. Neither did a stylus held in her mouth ("Observation: Too much saliva"). Things didn't look good, until:

I was sitting in my room late one evening fiddling around with this external touchpad I had lying on my desk and without thinking, lifted it up to eye level and touched it with my nose. "Click". I tried swiping too - it worked! I opened up photoshop with shaking fingers, hadn't opened it in months! I had a few more goes holding the trackpad to my nose and swiping left and right, up and down and the movements felt strangely natural to me.

Vandy began practicing drawing with her nose on the touchpad, and eventually even swipe-typing. "I wrote a large chunk of my Bachelors thesis on the iPad with my nose," she says.

Now a nose-using veteran, Vandy works from a rig consisting of a desktop Manfrotto tripod, Apple's Magic Trackpad, a tripod adapter plate and Velcro strips.

0michellevandy002.jpg

She uses Illustrator and Photoshop at her current internship. Here's an example of the work she produces and how she does it:


Posted by Teshia Treuhaft  |  22 Dec 2014  |  Comments (0)

Senicwooddesk.jpg

Chances are, if you're a designer, artist or musician, or simply use a computer daily, you have encountered that fateful moment when your mouse keeps you from making the perfect color selection or nudging a layer into exact position in Photoshop. While most computer-aided drawing and modeling programs account for clumsy hardware (thanks, magnetic lasso), isn't it about time we demanded better hardware? The fact is—from fancy Wacom tablets to every incarnation of touchscreen and foldable keyboard—UI tools still fall into the uninspired categories of keyboard, tablet and mouse.

Recently, however, the Y Combinator alumnus and Berlin-based startup Senic has tackled this exact issue of high-precision interfaces with a wireless device aptly named 'Flow.' The freely programmable controller is not only compatible with most computer-based applications but also has potential integrations for connected home objects and even Internet-enabled microprocessors.

The sleek aluminum, stainless steel and polycarbonate casing pays not-too-subtle homage to Dieter Rams-ian simplicity. At just under 2.75 inches, Flow boasts 360-degree angular positioning, capacitive touch and infrared-based hand gesture recognition. Additionally, with 3,600 values in just one rotation of Flow, exact manipulation of brush sizes, color selection and anything else is right at your fingertips.
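
To make that resolution figure concrete, here's a rough Python sketch of how 3,600 steps per rotation might be mapped onto a parameter like brush size. The function, ranges and one-turn-sweeps-the-full-range behavior are our own assumptions for illustration, not Senic's actual software:

```python
# Hypothetical mapping of Flow-style encoder ticks onto a parameter range.
# The 3,600 steps-per-rotation figure comes from the article; everything else
# (ranges, clamping, turns-per-full-sweep) is an assumption for the sketch.

STEPS_PER_REVOLUTION = 3600

def ticks_to_value(ticks, current, lo, hi, revs_for_full_range=1.0):
    """Nudge `current` through [lo, hi] by a rotation of `ticks` steps."""
    span = hi - lo
    delta = ticks / (STEPS_PER_REVOLUTION * revs_for_full_range) * span
    return min(hi, max(lo, current + delta))

# One full clockwise turn sweeps a brush size from 1 px to 500 px...
print(ticks_to_value(3600, 1.0, 1.0, 500.0))         # 500.0
# ...while a single step nudges it by about 0.14 px, far finer than
# dragging an on-screen slider with a mouse.
print(round(ticks_to_value(1, 1.0, 1.0, 500.0), 2))  # 1.14
```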

SenicGif.gif

The four co-founders represent a breadth of skills and a media prowess that most start-ups launching a crowdfunding campaign would envy. We caught up with CEO Tobias Eichenwald to discuss the campaign, the frustration that gave birth to Flow and the future of UI.

C77: How did Senic start? What first put you on the path to designing a tool like Flow?

Tobias Eichenwald: We're three friends and co-founders from Germany and we use digital tools like Photoshop, Illustrator, Premiere, Rhino or Eagle on a daily basis. We need to be fast and we need to be good at what we do. Browsing through menus and pulling a fake slider with a mouse didn't feel that way. Existing interfaces don't give us the pixel-precision we need; they are time consuming and interrupt our workflow.

We found similar problems in other fields like controlling our connected devices for example. We grew up with the assumption that you turn on a light by hitting a big white button on the wall without thinking about it. Now that smart devices are replacing traditional devices and the market for connected homes is exploding, we are expected to browse through apps and spend time waiting in a hallway, just to turn on a light.

continued...

Posted by Sam Dunne  |  12 Dec 2014  |  Comments (0)

PizzaHut_SubconsciousMenu8_Dec14.jpg

Just look at those vacant expressions—if only there was an easier way

Something's definitely been cooking in the R&D department at Pizza Hut this year. In a market trending toward polarization—the rise of high-end, handmade, hipster-friendly, small-batch, sourdough pizza-craft on one hand, and the quick, easy, cheap, delivered-to-your-door stuff still going strong on the other—the middle-of-the-road pizza chain has been struggling with a lack of relevance in recent years. Moderately priced, average pizza (to be kind?) and '80s salad bars are clearly doing it for nobody in the 2010s. And by the looks of things, they know it.

Earlier this year, we reported on the Hut's first foray into interactive ordering technology with the release of their concept touchscreen tabletop for (playing at) designing your own pizza (with some games and phone interconnectivity thrown in for good measure). Last month, the chain announced a total revamp, launching both an attempt at a bold and contemporary new menu—whipping out on-trend big guns like Sriracha sauce, Buffalo drizzle, "Skinny Slice" and more premium toppings, all under a pretty nauseating (and fairly offensive to Italians) campaign, "The Flavor of Now" (I'm not linking to that shit)—and a big identity update, the company's fourth refresh in 15 years.

As if Sriracha, touchscreen tables and insulting geriatric Italians (OK, here's the video) weren't enough innovation for one year, Pizza Hut have released a new concept that claims to be "the future of dining"...

continued...

Posted by Sam Dunne  |   1 Dec 2014  |  Comments (1)

HashKeyNov2014_2.jpg

On first glimpse of the HashKey—perhaps as a result of Kickstarter overexposure—my heart sank under the weight of my tumbling faith in humanity and fear for its future. Fortunately, on closer inspection, I found sweet salvation in the realization this was, of course, a product conceived with an eyebrow raised and a tongue in cheek (see also).

HashKeyNov2014_3.jpg

Whilst, of course, nobody in their right mind is going to dedicate one of their ever-decreasing number of USB ports to such a device, the HashKey makes an amusing observation about the low prioritization of the hashtag key on traditional QWERTY keyboards, especially in contrast to their mobile equivalents—I dread to think of the number of fledgling Twitter adopters copying and pasting the symbol. Whether the (overhyped?) hashtag will ever be promoted to higher prominence in keyboard culture is yet to be seen, but I can already hear the cogs turning in the brains of Microsoft's and Samsung's innovation teams.

Posted by Ray  |  19 Nov 2014  |  Comments (0)

MaxKessler-CoinFlipper-1.jpg

Well, here's a rather fun self-proclaimed "stupid pet project"—a literal brief if there ever was one—by SVA IXD student Max Kessler. As a kind of analog random number generator, "Coin Flip" is rather more purposeful than this brilliant gizmo, and the drinking-bird-meets-desktop-trebuchet invariably offers a more delightful user experience than, say, a web app. "The programming and robotics were built with an Arduino One, photocell, and Jameco 12V DC Motor," Kessler writes. "All the prototyping was built with MDF."

MaxKessler-CoinFlipper-2.jpg

continued...

Posted by Ray  |  14 Nov 2014  |  Comments (7)

fuseproject-Fluidigm_Juno-QuirkyGE-Tripper.jpg

L: The Fluidigm Juno, designed by fuseproject; R: Quirky+GE's "Tripper" sensor

As an editor at Core77, I often find myself attempting to explain what industrial design is, and I'm sure those of you who are actually practicing designers often find yourselves in the same position. It's regrettable that ID is a widely unsung (if not outright overlooked) force in the world, to the effect that it falls on a precious few star designers such as Karim Rashid and Jony Ive to speak for the profession. The latter made a rare public appearance at the Design Museum this week in a conversation with museum director Deyan Sudjic, making a strong case for a design-led business model (perhaps RE: suggestions to the contrary) and hands-on education, and maintaining that failure is part of the design process.

If Apple represents the paragon of industrial design in the post-industrial age—hardware that is as much a vessel/vehicle for digital UX (i.e. a screen) as it is a beautiful artifact—so too are we always curious to see new developments on other frontiers of design. A colleague mentioned offhand that insofar as space exploration is constrained by the logistics of astrophysics itself, there isn't exactly a 'design angle' to the Philae lander that, um, rocketed into headlines this week. (That said, we have reported on design at NASA, where problem-solving is paramount... whether you call it design thinking or not.)

fuseproject-Fluidigm.jpg

Which brings us to fuseproject's recent work for fellow SFers Fluidigm, a B2B life sciences company that called on Yves Béhar—a star designer in his own right—for a complete design overhaul in a traditionally un-(or at least under-)designed category. From the now-dynamic logo to the genre-busting form factor, the entrepreneurial design firm has risen to the challenge of expressing the genuine technological innovation behind the Juno "single-cell genomic testing machine" with equally revolutionary design.

The shape is sculptural and practical; a delicate balance between a futuristic piece of machinery and something more familiar. The aluminum enclosure is machined at high speed and the rough cuts visible and used as finished surfaces, which is a cost saving. The resultant ridges run along the exterior in a fluid, yet pronounced way, and resemble the miniature functional traces on the cell sample cartridge that enable single cell manipulations.

continued...

Posted by core jr  |  10 Nov 2014  |  Comments (0)

Causelabs-MoneythinkforiOS-0.jpg

By Sheryle Gillihan

Every organization that is developing an app has hopes of becoming the next viral hit, but even great apps compete for attention amongst the distractions.

Moneythink, the established and growing financial literacy program for urban, low-income high school students, also had high expectations of adoption for its app. After completing the Moneythink Mobile pilot last spring, our organization, CauseLabs, reviewed the quantitative and qualitative results with Moneythink. Their initial response? Disappointment. But the CauseLabs team saw something different. We saw the pilot program as a success.

Analyzing the Right Results

It happens all the time. Organizations that are building apps for the first time set the bar too high. Having years of app development experience, CauseLabs knew better than to expect astronomical results and saw success where Moneythink did not.

Consider some of the challenges of mobile engagement: Mobile users are selective when downloading apps due to limited device space. Moreover, studies have shown that, no matter how many apps are installed on a device, users only open an average of 20-30 of them per month. Even useful apps fail to hit 100% engagement.

Take email as an example of a useful app. For some of us, checking email is a daily to-do and behavior that has caused mobile email clients to far surpass their desktop counterparts. Yet, despite prevalent email use, fewer than 20% of email client users were active in the last 12 months, according to an IT Business Net article from earlier this year. Gmail, topping the charts for email usage, showed that only 11% of users were active over a 90-day period.

Causelabs-MoneythinkforiOS-4.jpg

We don't expect to put a number as low as 11% on reports when we set new app engagement goals, but perhaps it is more realistic. Active use of an app from any user group (outside of paid staff) indicates that it offers something of value. When evaluating our work at CauseLabs, we look through the lens of the 1% difference. Not everything we build will be the next overnight hit, but we build tools to create impact. In the case of Moneythink Mobile, we are introducing financial literacy to the next generation of leaders. If we get Moneythink Mobile in front of 100 students and impact 1% of them in two months, what happens when we reach 1,000 students over a year?

Of the students in the Moneythink Mobile pilot group, 80% downloaded the app and 34% interacted with it, while 4% completed all nine challenges. This percentage may seem small, but these are power users. Moneythink can grow their user group and start to see the 4% impact increase over time.
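
To put those figures side by side, here's a quick back-of-the-envelope funnel in Python. Treating the cohort sizes as exactly 100 and 1,000 students is our own simplification, borrowed from the what-if two paragraphs up, not actual Moneythink enrollment data:

```python
# Back-of-the-envelope funnel using the percentages quoted in the article.
# Cohort sizes of 100 and 1,000 are illustrative, echoing the article's own
# hypothetical, not real Moneythink figures.

RATES = {
    "downloaded": 0.80,
    "interacted": 0.34,
    "completed all nine challenges": 0.04,
    "impacted (the 1% lens)": 0.01,
}

def funnel(cohort):
    return {stage: round(cohort * rate) for stage, rate in RATES.items()}

print(funnel(100))    # pilot scale: 80 downloads, 34 active, 4 power users, 1 impacted
print(funnel(1000))   # year-long reach: 800 downloads, 340 active, 40 power users, 10 impacted
```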

continued...

Posted by hipstomp / Rain Noe  |  23 Oct 2014  |  Comments (1)

0hpmagic.jpg
HP making magic?

Earlier this month it was reported that Hewlett-Packard was breaking up into two companies. While one half, Hewlett-Packard Enterprise, will focus on boring stuff like corporate computing, the other half, HP Inc., sounds a little sexier with its emphasis on 3D printing and "new computing experiences."

It didn't take long after that announcement for HP Inc. to arrange an event to show what that new experience might be. The new organization plans to hold a press event next week, where they'll pull the sheets off a new type of computer called Sprout. The all-in-one PC will reportedly feature not only a flatscreen, but also a touch-sensitive, flat horizontal area over which will be mounted both a projector and a 3D scanner.

No one knows what the thing looks like (in case our visual atop this entry didn't tip you off) or how the interaction will work, but it seems likely that it's similar to the Fujitsu FingerLink Interaction System we showed you last year, which features components similar to what the Sprout is described as having:

continued...

Posted by hipstomp / Rain Noe  |  16 Oct 2014  |  Comments (4)

0multilayerinter.jpg

We assume that gesture control will be the wave of the future, if you'll pardon the pun. And we had also assumed it would be perfected by developers tweaking camera-based information. But now Elliptic Labs, a spinoff from a research outfit at Norway's University of Oslo, has developed technology that reads gestures via sound. Specifically, ultrasound.

In a weird way this is somewhat tied to Norway's oil boom. In addition to the medical applications of ultrasound, Norwegian companies have been using it for seismic applications, like scouring the coastline for oil deposits. Elliptic Labs emerged from the Norwegian "ultrasonics cluster" that popped up to support those industrial needs, and its eggheads subsequently figured out how to use echolocation on a micro scale to read your hand's position in space.

With Elliptic Labs' gesture recognition technology the entire zone above and around a mobile device becomes interactive and responsive to the smallest gesture. The active area is 180 degrees around the device, and up to 50 cm with precise distance measurements made possible by ultrasound... The interaction space can also be customized by device manufacturers or software developers according to user requirements.

Using a small ultrasound speaker, a trio of microphones and clever software, a smartphone (or anything larger) can be programmed to detect your hand's location in 3D space with a higher "resolution" (read: accuracy) than cameras, while using only a minuscule amount of power. And "Most manufacturers only need to install the ultrasound speaker and the software in their smartphones," reckons the company, "since most devices already have at least 3 microphones."
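
To get a feel for the geometry involved, here's a toy Python/NumPy trilateration sketch: given estimated distances from the hand to three microphones lying in the device's plane, it recovers a 3D position, assuming the hand is above the screen. The microphone layout and numbers are invented, and this is emphatically not Elliptic Labs' algorithm:

```python
import numpy as np

# Toy trilateration, not Elliptic Labs' method: assume the software can
# estimate the distance from the hand to each of three microphones lying in
# the phone's plane (z = 0), then solve for the hand's position, keeping the
# solution above the screen (z >= 0).

MICS = np.array([[0.00, 0.00, 0.0],    # assumed microphone positions (metres)
                 [0.07, 0.00, 0.0],
                 [0.00, 0.14, 0.0]])

def locate_hand(distances):
    d = np.asarray(distances, dtype=float)
    p0, p1, p2 = MICS
    # Subtracting the sphere equations pairwise yields a linear system in (x, y).
    A = 2 * np.array([(p1 - p0)[:2], (p2 - p0)[:2]])
    b = np.array([
        d[0]**2 - d[1]**2 + p1.dot(p1) - p0.dot(p0),
        d[0]**2 - d[2]**2 + p2.dot(p2) - p0.dot(p0),
    ])
    xy = np.linalg.solve(A, b)
    # Recover height from the first sphere; clamp tiny negatives from noise.
    z_sq = max(d[0]**2 - np.sum((xy - p0[:2])**2), 0.0)
    return np.array([xy[0], xy[1], np.sqrt(z_sq)])

# A hand hovering 20 cm above the middle of the device:
true_pos = np.array([0.035, 0.07, 0.20])
dists = np.linalg.norm(MICS - true_pos, axis=1)
print(np.round(locate_hand(dists), 3))   # -> [0.035 0.07  0.2  ]
```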

The demo of the technology, which they're calling Multi Layer Interaction, looks pretty darn cool:

continued...

Posted by hipstomp / Rain Noe  |   8 Oct 2014  |  Comments (0)

0flexsense.jpg

Once upon a time, industrial designers, animators, graphic designers and illustrators physically used acetate or mylar sheets as overlays on drawings. Newer generations of creatives now understand this concept as Photoshop layers, which can easily be clicked on and off digitally. But now a team of researchers has combined the physical and digital with "a new thin-film, transparent sensing surface" they're calling FlexSense.

Developed in collaboration between two Austria-based outfits—the Media Interaction Lab, which researches human-computer interaction, and the Institute for Surface Technologies and Photonics—and Microsoft Research, FlexSense appears to be nothing more than a good ol' acetate overlay, albeit embedded with thin sensors. But since the sheet can precisely sense the manner in which the user is deforming it, coupling it with clever software can lead to some interesting interactions. You can skip the first half of the video below, which is mostly egghead-speak, but be sure to tune in at 2:05 to see the proposed applications:

While the interface is probably too abstruse for your average consumer, it's easy to see applications that would be perfect for ID and other creative fields. I'd love to see Wacom buy this technology and incorporate it into their stuff.

Posted by hipstomp / Rain Noe  |  29 Sep 2014  |  Comments (1)

0learntodial.jpg

If you want to call your friend Jim, you can say "Call Jim" into your phone and it dials him. Five years ago you'd click on the name "Jim" in your phone and it would dial him. Twenty-five years ago, you'd call Jim by punching his number into a touch-tone phone. Fifty years ago you'd dial Jim's number on a rotary dial.

Before that is where it gets interesting.

Sixty years ago, you'd lift your telephone receiver and be met with silence. (There was no such thing as "dial tone" yet.) You'd tap the hang-up mechanism a few times and an operator—an actual human being sitting in a room waiting for just this moment—would come on the line. You'd then say "Please connect me to [two-letter district code followed by five-digit phone number]." The operator would then plug freaking wires into a switchboard and connect you to Jim.

So when the Bell System started incorporating this amazing new interface called a "rotary dial" into their telephones, they needed to show consumers how to use it. Watch and be amazed:

continued...

Posted by hipstomp / Rain Noe  |  19 Sep 2014  |  Comments (1)

0hurffthumbzone-001.jpg

The iPhone 6 and 6 Plus roll out today, and uptake will be massive. In addition to the insane sidewalk lines you'll shortly see on the news, Apple has racked up a staggering 4 million pre-orders. iOS app developers who upgrade their offerings will have a ready market, but they "can't just treat screens in the 5.5-inch range simply as a scaled-up version of a smaller phone," writes mobile products developer Scott Hurff, citing basic ergonomics. "[With the larger sizes] grips completely change, and with that, your interface might need to do so, as well."

To help app developers who haven't already made their bones on large Android devices, Hurff has released "Thumb Zone" maps on his blog. Research from Steven Hoober, author of Designing Mobile Interfaces, concluded that the majority of users prefer to use their smartphones one-handed, and Hurff used Hoober's data to create visual representations of where your thumb can, can't, and can kind of reach on various models of iPhone:

0hurffthumbzone-002.jpg

Then he puts Thumb Zones for the 6 and 6 Plus side-by-side:

0hurffthumbzone-003.jpg

This is where you start to see a sharp difference brought about by a much larger screen size. The sheer width of the 6 Plus means the thumb can no longer naturally reach all the way to the left edge, while the different grip required to support the larger device also changes the shape of the "Natural" area.
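
As a crude illustration of why the extra width matters, here's a toy Python comparison of how far the left edge sits from an assumed thumb pivot on each phone. The screen widths in points are real; the pivot inset and reach radius are placeholder assumptions, not Hurff's maps or Hoober's data:

```python
# Toy one-handed reach check. Screen widths are in iOS points (375 pt for the
# iPhone 6, 414 pt for the 6 Plus); the pivot inset and comfortable-reach
# radius are invented for illustration only.

SCREEN_WIDTH_PT = {"iPhone 6": 375, "iPhone 6 Plus": 414}
PIVOT_INSET_PT = 20       # assumed inset of the thumb's pivot from the right edge
ASSUMED_REACH_PT = 370    # assumed comfortable sweep of the thumb

for name, width in SCREEN_WIDTH_PT.items():
    distance = width - PIVOT_INSET_PT
    verdict = "reachable" if distance <= ASSUMED_REACH_PT else "out of natural reach"
    print(f"{name}: left edge is {distance} pt from the pivot -> {verdict}")
# iPhone 6: left edge is 355 pt from the pivot -> reachable
# iPhone 6 Plus: left edge is 394 pt from the pivot -> out of natural reach
```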

continued...

Posted by hipstomp / Rain Noe  |   8 Sep 2014  |  Comments (6)

0galaxynoteedge-001.jpg

As designers, we find it amusing that there are Apple lovers who hate Samsung and vice versa. What the layperson doesn't seem to grasp is that the rivalry is good for the advancement of UI design. While Apple typically marches to the beat of their own drum, and reportedly had no interest in producing a smartphone with a larger screen, Samsung's dominance in that area has driven Cupertino to increase the size of the new iPhones they'll be announcing tomorrow; and in desperate anticipation of that event, Samsung has attempted to steal a march by announcing their new Galaxy Note Edge last week.

0galaxynoteedge-002.jpg

At first glance the unusual, asymmetrical, curved-glass design of the Galaxy Note Edge just seems plain weird. But look at this video by Marques Brownlee demonstrating the intended functionality:

continued...

Posted by hipstomp / Rain Noe  |  30 Jun 2014  |  Comments (0)

0googlematerial.jpg

An interface design is not successful just because you can figure out how to work it. The true test is whether you can explain to your parents, over the phone, how to work it. For any of you who have served as de facto tech support for your folks in this manner, this spot-on video by comedian Ronnie Chieng will be the funniest thing you'll see all week:

YouTube is of course a Google product, and they've got a lot more to worry about than how to delete comments—namely, their Android mobile OS intended for the next generation of smartphones, tablets, smartwatches and Glass. To that end, the Google Design site aims to spread the gospel of their design approach while laying down guidelines for those looking to operate within the Googleverse.

They've coined their approach to interface design "Material Design." By this they mean that interface design ought to mimic the behavior of a physical material. This does not refer to skeuomorphism, like Apple's scuttled faux-stitched leather, etc.; rather, they mean that physical materials have easily comprehensible properties and that this predictability ought to be emulated. You can pick a piece of paper up, flip it over, fold it in half, write on one side, write on the other. It does not zoom around your desk on its own nor spontaneously change color, but instead obeys the laws of physics and your physical manipulations.

continued...

Posted by Christie Nicholson  |  25 Jun 2014  |  Comments (1)

0PHOTO1_samsungcurvedtv.jpg

There's a good reason we are experiencing the rise of the so-called "visual web." Our minds are drawn to visuals over text, since a great deal of our brain's real estate is devoted to sight—by some estimates, roughly a third of the cortex is involved in processing visual information. And the emerging trend of curved screens for smartphones and TVs feeds right into our desire for awesome images.

There are a few concave screens already on the market, and some say the iPhone 6 will show up with a curved bend in its screen. Market research may have found that users feel curvature makes for a more immersive experience, but there are also scientific studies showing that we have a desire for curved things.

Such reports come from a relatively new scientific field: neuroaesthetics, where neuroscience (the study of the brain) meets our appreciation of art and beauty.

A group from the University of Toronto recently studied how our brains react to rooms in a house. They had subjects look at photos of rooms while their brains were scanned in an fMRI (functional magnetic resonance imaging) machine.

0PHOTO2_curvedroom.jpg

And the scans revealed that the pleasure centers of their brains "lit up" when they looked at rooms with curved features as opposed to the more typical sharp edges. The latter type of room actually lit up areas of the brain normally associated with detecting threats.

The curved screens for digital hardware have been constrained by manufacturing—but no longer.

continued...

Posted by hipstomp / Rain Noe  |   5 May 2014  |  Comments (1)

0whipsawnod-001.jpg

One can't help but notice all of the experimentation going on in the wearable devices field. Nothing has gained ubiquitous traction, but that's not for lack of trying; the field includes Google Glass, Nike's Fuelband (R.I.P.), Jawbone's Up, a variety of Bluetooth earpieces, Samsung's Galaxy Gear smartwatch, and whatever Apple's forthcoming iWatch will be, to name a few.

There is of course a real estate issue with the human body, as there are only so many places you can park a device. With the eyes, ears and wrists already being targeted, industrial design firm Whipsaw (like Autodesk before them) is looking to the fingers. Their Nod is a touchless gesture controller meant to be worn as a ring:

continued...

Posted by Christie Nicholson  |  23 Apr 2014  |  Comments (0)

PHOTO_mistableproject1.jpg

I'm definitely among those who have been waiting for Minority Report-like gesturing to become a reality. While light beams projected on desks and walls seem close, that's not our hands manipulating objects in thin air. But now researchers at the University of Bristol have developed a starting point, called MisTable. And they're doing it with mist.

Words fail to properly describe the look of this thing: a tabletop computer system projects images onto a thick blanket of fog, where they appear as ghostly apparitions, much like R2-D2's projection of Princess Leia.

We can interact with the 3D images by sticking our hands into the 'objects' and moving them—maybe to the person sitting next to us. At this point it's simple stuff, but it still means moving something as if it were actually tangible. Check out the video:

continued...

Posted by Ray  |  14 Apr 2014  |  Comments (0)

ECAL-DeliriousHome_HERO.jpg

It's an increasingly pressing question in this day and age, and one that has certainly seen some interesting responses—including this interdepartmental collaboration from Swiss design school ECAL—as an evolving dialectic between two closely related design disciplines. Exhibited in Milan's Brera District during the Salone del Mobile last week, "Delirious Home" comprises ten projects that explore the relationship between industrial design and interaction design. (Naoto Fukasawa, for one, believes that the former will eventually be subsumed into the latter as our needs converge into fewer objects thanks to technology.)

ECAL-DeliriousHome-exterior.jpg

Both the Media & Interaction Design and the Industrial Design programs at the Lausanne-based school are highly regarded, and the exhibition at villa-turned-gallery Spazio Orso did not disappoint. In short, professors Alain Bellet and Chris Kabel wanted to riff on with the "smart home" concept—the now-banal techno-utopian prospect of frictionless domesticity (à la any number of brand-driven shorts and films). But "Delirious Home" transcends mere parody by injecting a sense of humor and play into the interactions themselves. In their own words:

Technology—or more precisely electronics—is often added to objects in order to let them sense us, automate our tasks or to make us forget them. Unfortunately, until now technology has not become a real friend. Technology has become smart but without a sense of humor, let alone quirky unexpected behavior. This lack of humanness became the starting point to imagine a home where reality takes a different turn, where objects behave in an uncanny way. After all, does being smart mean that you have to be predictable? We don't think so! These apparently common objects and furniture pieces have been carefully concocted to change and question our relationship with them and their fellows.
Thanks to the development of easily programmable sensors, affordable embedded computers and mechanical components, designers can take control of a promised land of possibilities. A land that until now was thought to belong to engineers and technicians. With Delirious Home, ECAL students teach us to take control of the latest techniques and appliances we thought controlled us. The students demonstrate their artful mastery of electronics, mechanics and interaction, developing a new kind of esthetic which goes further than just a formal approach.
The ultimate object—still missing in the delirious home—would be an object able to laugh at itself.

ECAL-DeliriousHome-COMP.jpg

Photos courtesy of ECAL / Axel Crettenand & Sylvain Aebischer

"Delirious Home" was easily a highlight of this year's Fuorisalone and was duly recognized with a Milano Design Award. The video, which features all of the projects, is well worth watching in full:

Additional details and images of each project below.

continued...