
The Core77 Design Blog

Posted by Sam Dunne  |  12 Dec 2014  |  Comments (0)

Just look at those vacant expressions—if only there was an easier way

Something's definitely been cooking in the R&D department at Pizza Hut this year. In a market trending toward polarization—the rise of high-end, handmade, hipster-friendly, small-batch, sourdough pizza-craft on one hand, and the quick, easy, cheap, delivered-to-your-door stuff still going strong on the other—the middle-of-the-road pizza chain has been struggling with a lack of relevance in recent years. Moderately priced, average pizza (to be kind?) and '80s salad bars are clearly doing it for nobody in the 2010s. And by the looks of things, they know it.

Earlier this year, we reported on the Hut's first foray into interactive ordering technology with the release of their concept touchscreen tabletop for (playing at) designing your own pizza (with some games and phone interconnectivity thrown in for good measure). Last month, the chain announced a total revamp, launching both an attempt at a bold and contemporary new menu—whipping out on-trend big guns like Sriracha sauce, Buffalo drizzle, "Skinny Slice" and more premium toppings, all under a pretty nauseating (and fairly offensive-to-Italians) campaign, "The Flavor of Now" (I'm not linking to that shit)—and a big identity update, the company's fourth refresh in 15 years.

As if Sriracha, touchscreen tables and insulting geriatric Italians (OK, here's the video) weren't enough innovation for one year, Pizza Hut has released a new concept that claims to be "the future of dining"...


Posted by Sam Dunne  |   1 Dec 2014  |  Comments (1)


On first glimpse of the HashKey—perhaps as a result of Kickstarter overexposure—my heart sank under the weight of my tumbling faith in humanity and fear for its future. Fortunately, on closer inspection, I found sweet salvation in the realization this was, of course, a product conceived with an eyebrow raised and a tongue in cheek (see also).


Whilst, of course, nobody in their right mind is going to dedicate one of their ever-decreasing number of USB ports to such a device, the HashKey makes an amusing observation about the low priority given to the hashtag key on traditional QWERTY keyboards, especially in contrast to their mobile equivalents—I dread to think of the number of fledgling Twitter adopters copying and pasting the symbol. Whether the (overhyped?) hashtag will ever be promoted to greater prominence in keyboard culture remains to be seen, but I can already hear the cogs turning in the brains of Microsoft's and Samsung's innovation teams.

Posted by Ray  |  19 Nov 2014  |  Comments (0)


Well, here's a rather fun self-proclaimed "stupid pet project"—a literal brief if there ever was one—by SVA IXD student Max Kessler. As a kind of analog random number generator, "Coin Flip" is rather more purposeful than this brilliant gizmo, and the drinking-bird-meets-desktop-trebuchet invariably offers a more delightful user experience than, say, a web app. "The programming and robotics were built with an Arduino Uno, photocell, and Jameco 12V DC motor," Kessler writes. "All the prototyping was built with MDF."



Posted by Ray  |  14 Nov 2014  |  Comments (7)

L: The Fluidigm Juno, designed by fuseproject; R: Quirky+GE's "Tripper" sensor

As an editor at Core77, I often find myself attempting to explain what industrial design is, and I'm sure those of you who are actually practicing designers often find yourselves in the same position. It's regrettable that ID is a widely unsung (if not outright overlooked) force in the world, to the effect that it falls on a precious few star designers such as Karim Rashid and Jony Ive to speak for the profession. The latter made a rare public appearance at the Design Museum this week in a conversation with museum director Deyan Sudjic, making a strong case for a design-led business model (perhaps re: suggestions to the contrary) and hands-on education, and maintaining that failure is part of the design process.

If Apple represents the paragon of industrial design in the post-industrial age—hardware that is as much a vessel/vehicle for digital UX (i.e. a screen) as it is a beautiful artifact—we are always curious to see new developments on other frontiers of design as well. A colleague mentioned offhand that insofar as space exploration is constrained by the logistics of astrophysics itself, there isn't exactly a 'design angle' to the Philae lander that, um, rocketed into headlines this week. (That said, we have reported on design at NASA, where problem-solving is paramount... whether you call it design thinking or not.)


Which brings us to fuseproject's recent work for fellow SFers Fluidigm, a B2B life sciences company that called on Yves Béhar—a star designer in his own right—for a complete design overhaul in a traditionally un-(or at least under-)designed category. From the now-dynamic logo to the genre-busting form factor, the entrepreneurial design firm has risen to the challenge of expressing the genuine technological innovation behind the Juno "single-cell genomic testing machine" with equally revolutionary design.

The shape is sculptural and practical: a delicate balance between a futuristic piece of machinery and something more familiar. The aluminum enclosure is machined at high speed, and the rough cuts are left visible and used as finished surfaces, which saves cost. The resultant ridges run along the exterior in a fluid yet pronounced way, resembling the miniature functional traces on the cell sample cartridge that enable single-cell manipulations.


Posted by core jr  |  10 Nov 2014  |  Comments (0)


By Sheryle Gillihan

Every organization developing an app hopes to become the next viral hit, but even great apps compete for attention amid the distractions.

Moneythink, the established and growing financial literacy program for urban, low-income high school students, also had high expectations of adoption for its app. After completing the Moneythink Mobile pilot last spring, our organization, CauseLabs, reviewed the quantitative and qualitative results with Moneythink. Their initial response? Disappointment. But the CauseLabs team saw something different. We saw the pilot program as a success.

Analyzing the Right Results

It happens all the time. Organizations that are building apps for the first time set the bar too high. Having years of app development experience, CauseLabs knew better than to expect astronomical results and saw success where Moneythink did not.

Consider some of the challenges of mobile engagement: Mobile users are selective when downloading apps due to limited device space. Moreover, studies have shown that, no matter how many apps are installed on a device, users only open an average of 20-30 of them per month. Even useful apps fail to hit 100% engagement.

Take email as an example of a useful app. For some of us, checking email is a daily to-do and behavior that has caused mobile email clients to far surpass their desktop counterparts. Yet, despite prevalent email use, fewer than 20% of email client users were active in the last 12 months, according to an IT Business Net article from earlier this year. Gmail, topping the charts for email usage, showed that only 11% of users were active over a 90-day period.


We don't expect to put a number as low as 11% on reports when we set new app engagement goals, but perhaps it is more realistic. Active use of an app from any user group (outside of paid staff) indicates that it offers something of value. When evaluating our work at CauseLabs, we look through the lens of the 1% difference. Not everything we build will be the next overnight hit, but we build tools to create impact. In the case of Moneythink Mobile, we are introducing financial literacy to the next generation of leaders. If we get Moneythink Mobile in front of 100 students and impact 1% of them in two months, what happens when we reach 1,000 students over a year?

Of the students in the Moneythink Mobile pilot group, 80% downloaded the app and 34% interacted with it, while 4% completed all nine challenges. This percentage may seem small, but these are power users. Moneythink can grow their user group and start to see the 4% impact increase over time.
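The extrapolation behind that funnel is simple enough to sketch. Here's a hypothetical illustration using the percentages quoted in this article (the function name and cohort sizes are mine, not CauseLabs'):

```python
# Funnel rates from the Moneythink Mobile pilot quoted above
DOWNLOADED = 0.80   # 80% downloaded the app
INTERACTED = 0.34   # 34% interacted with it
COMPLETED = 0.04    # 4% finished all nine challenges

def impacted(students, rate):
    """Number of students reaching a given stage of the funnel."""
    return round(students * rate)

# Power users in a 100-student pilot...
print(impacted(100, COMPLETED))    # 4
# ...and the same 4% rate applied to 1,000 students over a year
print(impacted(1000, COMPLETED))   # 40
```

Ten times the reach at the same completion rate means ten times the power users, which is the scaling argument the article makes.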


Posted by hipstomp / Rain Noe  |  23 Oct 2014  |  Comments (1)

HP making magic?

Earlier this month it was reported that Hewlett-Packard was breaking up into two companies. While one half, Hewlett-Packard Enterprise, will focus on boring stuff like corporate computing, the other half, HP Inc., sounds a little sexier with its emphasis on 3D printing and "new computing experiences."

Since that announcement, it didn't take long for HP Inc. to arrange an event to show what that new experience might be. The new organization plans to hold a press event next week, where they'll pull the sheets off of a new type of computer called Sprout. The all-in-one PC will reportedly feature not only a flatscreen, but a touch-sensitive flat horizontal area over which will be mounted both a projector and a 3D scanner.

No one knows what the thing looks like (in case our visual atop this entry didn't tip you off) or how the interaction will work, but it seems likely that it's similar to the Fujitsu FingerLink Interaction System we showed you last year, which features components similar to what the Sprout is described as having:


Posted by hipstomp / Rain Noe  |  16 Oct 2014  |  Comments (4)


We assume that gesture control is the wave of the future, if you'll pardon the pun. And we had also assumed it would be perfected by developers tweaking camera-based input. But now Elliptic Labs, a spinoff from a research outfit at Norway's University of Oslo, has developed technology that reads gestures via sound. Specifically, ultrasound.

In a weird way this is somewhat tied to Norway's oil boom. In addition to the medical applications of ultrasound, Norwegian companies have been using ultrasound for seismic applications, like scouring the coastline for oil deposits. Elliptic Labs emerged from the Norwegian "ultrasonics cluster" that popped up to support industrial needs, and the eggheads at Elliptic subsequently figured out how to use echolocation on a micro scale to read your hand's position in space.

With Elliptic Labs' gesture recognition technology the entire zone above and around a mobile device becomes interactive and responsive to the smallest gesture. The active area is 180 degrees around the device, and up to 50 cm with precise distance measurements made possible by ultrasound... The interaction space can also be customized by device manufacturers or software developers according to user requirements.

Using a small ultrasound speaker, a trio of microphones and clever software, a smartphone (or anything larger) can be programmed to detect your hand's location in 3D space with a higher "resolution" (read: accuracy) than cameras, while using only a minuscule amount of power. And "Most manufacturers only need to install the ultrasound speaker and the software in their smartphones," reckons the company, "since most devices already have at least 3 microphones."
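Elliptic Labs hasn't published its signal-processing pipeline, but the underlying echolocation geometry can be sketched: a ping leaves the speaker, bounces off your hand, and arrives at each of the three microphones; the three round-trip times pin down the hand's position. The sketch below is purely illustrative—the mic layout, the single clean echo, and the brute-force grid search (in place of real acoustic DSP) are all my assumptions, not the company's method:

```python
import itertools
import math

C = 343.0  # speed of sound in air at room temperature, m/s

# Hypothetical layout (metres): speaker at the origin, three mics
# spaced around a phone-sized device. Real devices will differ.
MICS = [(-0.03, -0.06, 0.0), (0.03, -0.06, 0.0), (0.0, 0.06, 0.0)]
SPEAKER = (0.0, 0.0, 0.0)

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def echo_times(hand):
    """Round-trip time of a ping: speaker -> hand -> each microphone."""
    outbound = dist(SPEAKER, hand)
    return [(outbound + dist(hand, m)) / C for m in MICS]

def locate(times, step=0.02, span=0.3):
    """Coarse grid search for the hand position whose predicted echo
    times best match the measured ones (z >= 0: above the device)."""
    rng = [i * step for i in range(round(span / step) + 1)]
    sym = sorted(set([-r for r in rng] + rng))
    best, best_err = None, float("inf")
    for x, y in itertools.product(sym, sym):
        for z in rng:
            guess = (x, y, z)
            err = sum((a - b) ** 2
                      for a, b in zip(echo_times(guess), times))
            if err < best_err:
                best, best_err = guess, err
    return best
```

With a 2 cm grid over a 30 cm cube this recovers a simulated hand position to within one grid step; a real implementation would measure arrival times from the microphone signals and solve the same geometry far more cleverly (and cope with noise, multiple reflections and motion).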

The demo of the technology, which they're calling Multi Layer Interaction, looks pretty darn cool:


Posted by hipstomp / Rain Noe  |   8 Oct 2014  |  Comments (0)


Once upon a time, industrial designers, animators, graphic designers and illustrators physically used acetate or Mylar sheets as overlays on drawings. Newer generations of creatives now understand this concept as Photoshop layers, which can easily be toggled on and off digitally. But now a team of researchers has combined the physical and digital with "a new thin-film, transparent sensing surface" they're calling FlexSense.

Developed in collaboration between two Austria-based outfits—the human-computer interaction researching Media Interaction Lab and the Institute for Surface Technologies and Photonics—and Microsoft Research, the FlexSense appears to be nothing more than a good ol' acetate overlay, albeit embedded with thin sensors. But since this sheet can precisely sense the manner in which the user is deforming it, when coupled with clever software this can lead to some interesting interactions. You can skip the first half of the video below, which is mostly egghead-speak, but be sure to tune in at 2:05 to see the proposed applications:

While the interface is probably too abstruse for your average consumer, it's easy to see applications that would be perfect for ID and other creative fields. I'd love to see Wacom buy this technology and incorporate it into their stuff.

Posted by hipstomp / Rain Noe  |  29 Sep 2014  |  Comments (1)


If you want to call your friend Jim, you can say "Call Jim" into your phone and it dials him. Five years ago you'd click on the name "Jim" in your phone and it would dial him. Twenty-five years ago, you'd call Jim by punching his number into a touch-tone phone. Fifty years ago you'd dial Jim's number on a rotary dial.

Before that is where it gets interesting.

Sixty years ago, you'd lift your telephone receiver and be met with silence. (There was no such thing as "dial tone" yet.) You'd tap the hang-up mechanism a few times and an operator—an actual human being sitting in a room waiting for just this moment—would come on the line. You'd then say "Please connect me to [two-letter district code followed by five-digit phone number]." The operator would then plug freaking wires into a switchboard and connect you to Jim.

So when the Bell System started incorporating this amazing new interface called a "rotary dial" into its telephones, it needed to show consumers how to use them. Watch and be amazed:


Posted by hipstomp / Rain Noe  |  19 Sep 2014  |  Comments (1)


The iPhone 6 and 6 Plus roll out today, and uptake will be massive. In addition to the insane sidewalk lines you'll shortly see on the news, Apple has racked up a staggering 4 million pre-orders. iOS app developers who upgrade their offerings will have a ready market, but they "can't just treat screens in the 5.5-inch range simply as a scaled-up version of a smaller phone," writes mobile products developer Scott Hurff, citing basic ergonomics. "[With the larger sizes] grips completely change, and with that, your interface might need to do so, as well."

To help app developers who haven't already made their bones on already-large Android devices, Hurff has released "Thumb Zone" maps on his blog. Research by Steven Hoober, author of Designing Mobile Interfaces, concluded that the majority of users prefer to operate smartphones with one hand, and Hurff used Hoober's data to create visual representations of where your thumb can, can't, and can kind of reach on various models of iPhone:


Then he puts Thumb Zones for the 6 and 6 Plus side-by-side:


This is where you start to see a sharp difference brought about by a much larger screen size. The sheer width of the 6 Plus means the thumb can no longer naturally reach all the way to the left edge, while the different grip required to support the larger device also changes the shape of the "Natural" area.


Posted by hipstomp / Rain Noe  |   8 Sep 2014  |  Comments (6)


As designers, we find it amusing that there are Apple lovers who hate Samsung and vice versa. What the layperson doesn't seem to grasp is that the rivalry is good for the advancement of UI design. While Apple typically marches to the beat of their own drum, and reportedly had no interest in producing a smartphone with a larger screen, Samsung's dominance in that area has driven Cupertino to increase the size of the new iPhones they'll be announcing tomorrow; and in desperate anticipation of that event, Samsung attempted to steal a march by announcing their new Galaxy Note Edge last week.


At first glance the unusual, asymmetrical, curved-glass design of the Galaxy Note Edge just seems plain weird. But look at this video by Marques Brownlee demonstrating the intended functionality:


Posted by hipstomp / Rain Noe  |  30 Jun 2014  |  Comments (0)


An interface design is not successful just because you can figure out how to work it. The true test is whether you can explain to your parents, over the phone, how to work it. For any of you who have served as de facto tech support for your folks in this manner, this spot-on video by comedian Ronnie Chieng will be the funniest thing you'll see all week:

YouTube is of course a Google product, and they've got a lot more to worry about than how to delete comments—namely, their Android mobile OS intended for the next generation of smartphones, tablets, smartwatches and Glass. To that end, the Google Design site aims to spread the gospel of their design approach while laying down guidelines for those looking to operate within the Googleverse.

They've coined their approach to interface design "Material Design." By this they mean that interface design ought to mimic the design of something involving a physical material. This does not refer to skeuomorphism, like Apple's scuttled faux-stitched leather; rather, they mean that physical materials have easily comprehensible properties, and that this predictability ought to be emulated. You can pick a piece of paper up, flip it over, fold it in half, write on one side, write on the other. It does not zoom around your desk on its own nor spontaneously change color, but instead obeys the laws of physics and your physical manipulations.


Posted by Christie Nicholson  |  25 Jun 2014  |  Comments (1)


There's a good reason we are experiencing the rise of the so-called "visual web." Our minds are predisposed to favor visuals over text—most of our brain's real estate is devoted to sight, with the visual cortex making up a third of the brain. And the emerging trend of curved screens for smartphones and TVs feeds right into our desire for awesome images.

There are a few concave screens already on the market, and some say the iPhone 6 will show up with a curve in the screen. Market research may have found that users feel curves make for a more immersive experience, but there are also scientific studies showing that we have a desire for curved things.

Such reports come from a relatively new field of science: neuroaesthetics, where neuroscience (the study of the brain) meets our appreciation of art and beauty.

A group from the University of Toronto recently studied how our brains react to rooms in a house. They had subjects look at photos of rooms while their brains were scanned in an fMRI (functional magnetic resonance imaging) machine.


And the scans revealed that the pleasure centers of their brains "lit up" when they looked at rooms that had curved features as opposed to having the more typical sharp edges. The latter type of rooms actually lit up areas of the brain normally associated with detecting threats.

The curved screens for digital hardware have been constrained by manufacturing—but no longer.


Posted by hipstomp / Rain Noe  |   5 May 2014  |  Comments (1)


One can't help but notice all of the experimentation going on in the wearable devices field. Nothing has gained ubiquitous traction, but that's not for lack of trying; the field includes Google Glass, Nike's Fuelband (R.I.P.), Jawbone's Up, a variety of bluetooth earpieces, Samsung's Galaxy Gear Smartwatch, and whatever Apple's forthcoming iWatch will be, to name a few.

There is, of course, a real estate issue with the human body, as there are only so many places you can park a device. With the eyes, ears and wrists already being targeted, industrial design firm Whipsaw (like Autodesk before them) is looking to the fingers. Their Nod device is a touchless gesture controller meant to be worn as a ring:


Posted by Christie Nicholson  |  23 Apr 2014  |  Comments (0)


I'm definitely among those who have been waiting for Minority Report-style gesturing to become a reality. While light beams projected on desks and walls seem close, that's still not our hands manipulating objects in thin air. But now researchers at the University of Bristol have developed a starting point, called MisTable. And they're doing it with mist.

Words alone fail to properly describe the look of this thing: a tabletop computer system projects images onto a thick blanket of fog, where they appear as ghostly apparitions, much like R2-D2's projection of Princess Leia.

We can interact with the 3D images by sticking our hands into the 'objects' and moving them—maybe to the person sitting next to us. At this time it's simple stuff, but it still means moving something as if it were actually tangible. Check out the video:


Posted by Ray  |  14 Apr 2014  |  Comments (0)


It's an increasingly pressing question in this day and age, and one that has certainly seen some interesting responses—including this interdepartmental collaboration from Swiss design school ECAL—as an evolving dialectic between two closely related design disciplines. Exhibited in Milan's Brera District during the Salone del Mobile last week, "Delirious Home" comprises ten projects that explore the relationship between industrial design and interaction design. (Naoto Fukasawa, for one, believes that the former will eventually be subsumed into the latter as our needs converge into fewer objects thanks to technology.)


Both the Media & Interaction Design and the Industrial Design programs at the Lausanne-based school are highly regarded, and the exhibition at villa-turned-gallery Spazio Orso did not disappoint. In short, professors Alain Bellet and Chris Kabel wanted to riff on the "smart home" concept—the now-banal techno-utopian prospect of frictionless domesticity (à la any number of brand-driven shorts and films). But "Delirious Home" transcends mere parody by injecting a sense of humor and play into the interactions themselves. In their own words:

Technology—or more precisely electronics—is often added to objects in order to let them sense us, automate our tasks or to make us forget them. Unfortunately until now technology has not become a real friend. Technology has become smart but without a sense of humor, let alone quirky unexpected behavior. This lack of humanness became the starting point to imagine a home where reality takes a different turn, where objects behave in an uncanny way. After all; does being smart mean that you have to be predictable? We don't think so! These apparently common objects and furniture pieces have been carefully concocted to change and question our relationship with them and their fellows.
Thanks to the development of easily programmable sensors, affordable embedded computers and mechanical components, designers can take control of a promised land of possibilities. A land that until now was thought to belong to engineers and technicians. With Delirious Home, ECAL students teach us to take control of the latest techniques and appliances we thought controlled us. The students demonstrate their artful mastery of electronics, mechanics and interaction, developing a new kind of esthetic which goes further than just a formal approach.
The ultimate object—still missing in the delirious home—would be an object able to laugh at itself.

Photos courtesy of ECAL / Axel Crettenand & Sylvain Aebischer

"Delirious Home" was easily a highlight of this year's Fuorisalone and was duly recognized with a Milano Design Award. The video, which features all of the projects, is well worth watching in full:

Additional details and images of each project below.


Posted by hipstomp / Rain Noe  |  20 Mar 2014  |  Comments (2)


Volvo recently introduced a trio of concept cars: the Concept Coupe, the Concept XC Coupe and the Concept Estate. It is the latter that has most caught our eye because it is, quite oddly to us Yanks, a two-door station wagon. In America, the station wagon has always been about families, but by omitting rear doors, Volvo seems to be aiming this concept at the childless couple that likes to ski.


The Concept Estate brings with it Volvo's bold new styling direction, both inside and out, that's a million miles (er, kilometers) away from the Swedish carmaker's designed-by-Etch-a-Sketch look that we grew up with:



Posted by hipstomp / Rain Noe  |   6 Mar 2014  |  Comments (9)


It's been over a year since we've seen interactive restaurant tables in the news, but here comes a new one from Pizza Hut. Yes, the American fast food joint is hoping that if their deep-dish pizzas aren't enough to get you inside, perhaps their fee-yancy touchscreen table will be. Have a look:

What's interesting about this, from a business perspective, is that Pizza Hut is owned by Yum! Brands, which also owns KFC and Taco Bell. While the last interactive restaurant table we looked at was integrated into a one-off restaurant, Yum! Brands (God I hate typing that stupid exclamation point in their name) has some 40,000 restaurants in over 125 countries.

As for the actual interface design (which was done by creative firm Chaotic Moon), it still seems a bit cutesy to me; I'm not confident that people will want to do a two-finger drag to choose a pie size, for instance—I suspect they'd rather just hit an S, M or L button. But the visual representation of how large something is will probably prove popular. And once the balance between what the technology can do and what people actually want has been worked out, if Y!B decides to move ahead with this concept, we could see mass uptake in a relatively short time period, on account of their size. Presumably they've got the juice to require individual franchisees to integrate these units, handily spreading the costs out.

Posted by hipstomp / Rain Noe  |  27 Feb 2014  |  Comments (1)


Someone has finally taken note that throughout the day, we use our smartphones in at least two different ways. There's the active way, where you're futzing around with an app and your thumbs are flying across the screen. Then there's the passive way, where you're glancing at it to reference some piece of information you need. And with that latter usage, it would be better if the information was persistently presented, not something you had to call up by doing a home-button-press/swipe/access-code-enter/app-button-press.

Thus Russian tech manufacturer Yota Devices produces the Yota Phone, billed as "The world's first dual-screen, always-on smartphone." One side has the familiar color touchscreen; flip the thing over and there's a black-and-white, EPD electronic-ink display that draws no power once its pixels are in place. (The image or text will stay "burned" there even if the phone's battery dies.) In other words, you send whatever data you want to that second screen and it stays there, ready for immediate viewing when you pull the phone out of your bag, no button presses necessary. If I owned this phone I'd constantly avail myself of the convenience of having a grocery list, boarding pass, map snippet, reference dimensions, addresses and appointment times, etc.


First gen: Square-ish


Posted by Gloria Suzie Kim  |  21 Oct 2013  |  Comments (0)

Woman shopping for groceries in South Korea at a HomePlus display using her mobile phone

Earlier this month, Adaptive Path held the Service Experience conference in San Francisco, CA. The conference invited designers and business leaders who are out there 'in the trenches' to share insights, tips, and methods from their case studies in service design.

Service Design is an emergent area of design thinking that's been percolating in design circles for many years. Though corporate brands like Apple, Nike, P&G and Starbucks have built their success on the principles of good service design, it's an approach getting more serious consideration in countries like the U.S. after years of being developed in Europe.

Service Design, Service Experience, or Consumer Experience is a design approach that understands that the process by which a product is made, and the organization that produces it, not only affect the product but also define the experience of it. Service Design spans many ecosystems, including a company's own internal culture, its approach to production and development, and the context of the product as it exists in the day-to-day life of its users. Think about how Apple represents not only the product but also customer service, combined with the branded architectural experience of the Apple Store. Or how Tesla Motors is not only considering the product (an electric vehicle) but also mapping out a plan for a network of charging stations in California.

Service Design is a holistic approach that takes into consideration the end-to-end experience of a product, whether it be a car, a computer, a trip, or a book. It is invested in creating infrastructure that supports and empathizes with human needs by prioritizing people and experiences over technology during the design process, and it can be applied across fields.

Swimming in Culture

A key perspective of Service Design is the ability to grasp organizational culture. Ever wonder why you had a great time working for one company and another company, not so much? Maybe it's not all 'in your head': According to keynote speaker David Gray of Limnl, culture is the summation of a group's habits, and "people swim in culture the way fish swim in water." He illustrated the point with an analogy of dolphins and sharks.

Illustration from David Gray's presentation. (People may prefer to self-identify as a dolphin rather than a shark.)

In order to change a culture, you must first find its foundation. Ask dumb questions, talk to the newbies, and gather evidence. The evidence (what you see) usually leads to levers (how and why decisions are made, and the protocols used), which lead to the company's values (the underlying priorities and what's considered important), which in turn uncover foundational assumptions (how the company believes the world works, and the reasoning behind those values).

Sketchnote courtesy of Kate Rutter


Posted by Ray  |  20 Sep 2013  |  Comments (0)


Although it launched nearly a year ago, I'm surprised that an app called How.Do didn't turn up on our radar—after all, an app for making quick'n'dirty how-to tutorials is right up our alley. Thankfully, co-founder Emma Rose Metcalfe reached out to us on the occasion of the launch of How.Do Two.Oh (Version 2.0, that is), which was released yesterday, timed to coincide with iOS 7 and the World Maker Faire this weekend. (Supported by venture capital, her fellow co-founders Nils Westerlund and Edward Jewson round out the Berlin-based team.)

Viewable both through the free app and online, the Micro Guides are concise user-generated slideshows with audio, an ideal format for step-by-step tutorials and on-the-go reference guides. Insofar as the app hits a sweet spot in the maker/fixer/lifehacking movement, the How.Do team will be reporting from World Maker Faire tomorrow and Sunday, offering a unique window into the festivities at the New York Hall of Science—follow them on Twitter @HowDo_ to get the scoop!

As busy as they are this weekend, Metcalfe took a few moments to share her thoughts at this exciting time for the growing company.

Core77: What inspired you to create How.Do in the first place?

Emma Rose Metcalfe: How.Do is the intersection of my MFA research in sharing and distributing meaningful experiences and Nils' interest in the challenges of scaling projects for large communities. He had left SoundCloud to finish his studies at the Stockholm School of Entrepreneurship, where the two of us met. Long story short, we came home from a design bootcamp in India wanting to work on something together. We shared the belief that knowledge is deeply personal. The space created between the emotional power of sound and the fantasy of image is incredibly profound—we wanted to harness that to make sharing and learning feel good.


Posted by hipstomp / Rain Noe  |  15 Aug 2013  |  Comments (0)


As we recently saw, Ford has been experimenting with ways for drivers to use real-time vehicle information. Now competitor Chevrolet is also throwing their hat into this ring with a new, configurable dashboard display in the 2014 Corvette Stingray.

For the Fast & Furious set, the Stingray's dash can display acceleration and lap timers, as well as surprisingly techie stuff like a "friction bubble" displaying cornering force and a gauge showing you how hot the tires are. (Hot tires have better grip, which is why you see F1 drivers violently zigzagging on their way to the starting line; they're trying to get some heat on.)


For drivers in less of a rush, the dash can be set to display more practical information, like fuel economy, what the stereo's playing or navigational details. Navigation in particular strikes me as a good move, as having route guidance graphics front and center behind the steering wheel is a lot better than having to shift your gaze to the center of the dashboard.


There are 69 different pieces of information the system can display, divided into three main themes: Tour, aimed at commuters and long-distance driving; Sport, which provides a pared-down, classic-looking radial tachometer; and Track, which gives you the hockey-stick tach, shift lights and an enlarged gear indicator. "Each of these three themes," says Jason Stewart, General Motors interaction designer, "can also be configured so that drivers can personalize their experience in the Stingray."

Here's a video look at the system:


Posted by hipstomp / Rain Noe  |   5 Aug 2013  |  Comments (3)


It's very strange that Google Glass is not mentioned once in this news segment. Researchers at Taiwan's Industrial Technology Research Institute (ITRI) have developed this eyeglass-based display, below, that uses images projected onto the lenses, and depth cameras focusing beyond the lenses, to create the functional illusion of operating a "floating touchscreen":

ITRI is simply the latest research group to use depth cameras to track our fingers, which then triggers a microprocessor to recognize that as an actionable "touch." Most recently we saw this with Fujitsu Labs' FingerLink Interaction System. So you might wonder why we're looking at this—isn't this just a combination of existing technologies that we've all seen before? It is, but so was the iPod, the iPhone and the iPad when they first came out.
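The underlying trick is simpler than it sounds: a depth camera reports the distance to every pixel it sees, so if the system knows where the virtual "screen" floats in space, a fingertip counts as a touch whenever its measured depth lands on that plane. Here's a minimal sketch of that idea in Python; the plane depth, tolerance and function names are illustrative assumptions, not ITRI's or Fujitsu's actual implementation.

```python
import numpy as np

# Hypothetical sketch: register a "touch" when a pixel's depth lands on a
# virtual plane floating at a known distance from the camera. Real systems
# add calibration and fingertip tracking; these names and numbers are invented.

PLANE_DEPTH_MM = 400      # distance of the virtual touchscreen from the camera
TOUCH_TOLERANCE_MM = 15   # how close a point must be to the plane to count

def detect_touches(depth_frame: np.ndarray) -> list:
    """Return (row, col) pixels whose depth lies on the virtual plane."""
    on_plane = np.abs(depth_frame.astype(np.int32) - PLANE_DEPTH_MM) <= TOUCH_TOLERANCE_MM
    rows, cols = np.nonzero(on_plane)
    return list(zip(rows.tolist(), cols.tolist()))

# A fingertip measured 398 mm away at pixel (120, 200) registers as a touch,
# while the far-away background does not.
frame = np.full((240, 320), 2000, dtype=np.uint16)  # background ~2 m away
frame[120, 200] = 398
print(detect_touches(frame))  # [(120, 200)]
```

In practice the hard part isn't this comparison but everything around it: segmenting the hand, finding the fingertip, and keeping the virtual plane registered to the projected image as the wearer's head moves.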


Posted by frog  |  22 Jul 2013  |  Comments (4)


Our friends at frog design recently released a short documentary on Industrial Design in the Modern World, a kind of iterative manifesto (the consultancy's first but certainly not their last), featuring several key players of the design team. We had a chance to catch up with Creative Director Jonas Damon on the broader message of the piece, as well as his thoughts on user experience and a possible revision to Dieter Rams' canonical principles of design.

Core77: Can you elaborate on the points you touch on in the opening monologue? Specifically, to what degree do 'traditional' (or outdated) forms and materials embody value or character? For example, I recently came across an iPod speaker in which the dock opens like a cassette tape deck, evoking a certain nostalgic charm despite being rather impractical (it was difficult to see the screen behind the plastic).

Jonas Damon: The opening monologue is about the physical constraints that have guided forms in the past vs. forms today, and the opportunities that arise from the absence of these constraints. 'Honesty' in design is a widely admired quality, and in the past that honesty was expressed by skillfully sculpting with and around a given product's physical conditions, rather than just hiding or disguising these. So when products were more mechanical, they had a more imposing DNA that informed their character; their mechanics largely defined their identities. Many product types came preconditioned with an iconic, unmistakable silhouette.

Today, most products in the consumer electronics space can be made with a rectangular circuit board, a rectangular screen, and a rectangular housing. Therefore, the natural expression of these products today is limited to a rectangle—not really a unique identity. Expression of character becomes more nuanced and malleable. With that newfound freedom, we have to be more sensitive, judicious and inventive. These days, 'honesty' is more complex and difficult to design for, as it's about the intangible aspects of the brand the product embodies.

Traditional forms and materials have cultural value because of their iconic, built-in character. The starting point for many contemporary consumer electronics forms is generic and sterile, so historical forms are often tapped to artificially trigger our memory-based emotions. It's been a popular fallback that we may be a little tired of these days, but on occasion it's been well executed, and even that can have merit.

Of course, the 'flat black rectangle' effect also implies a shift from traditional form-follows-function I.D. to a broader, UX-centric approach to design (i.e. some argue that Apple's focus on iOS7 is simply a sign that they've shifted from hardware innovation to the UX/software experience). What is the relationship between hardware and UX?

Hardware is an integral part of UX. A true "user experience" is multi-sensory: when you engage with something, don't you see, feel, hear, maybe even smell that which you are engaging with? (I'm not sure why anybody refers to solely screen-based interactions as "UX"; that notion is outdated) As an Industrial Designer, I am a designer of User Experience. ID has gotten richer since we've started considering "living technology" as a material. By "living technology," I mean those elements that bring objects to life, that make them animate and tie them to other parts of the world around us: sensors, screens, haptics, connectivity, software, etc. By claiming these elements as part of our domain (or by tightly embedding their respective expert designers/engineers in our teams), we are able to create holistic designs that are greater than the sums of their parts.


Posted by hipstomp / Rain Noe  |  11 Jun 2013  |  Comments (9)


In addition to unveiling their redesigned Mac Pro, yesterday Apple also previewed their forthcoming iOS 7. This is the one many an industrial designer has been waiting to see; we all know Jonathan Ive can do hardware, but iOS 7 will be the first real indication of what software will look like under Ive's rule—and whether he'd be given free rein. Former Apple executive Scott Forstall was famously a proponent of skeuomorphism, the inclusion of real-world elements—stitched leather, lined legal pads, spiral bindings—that many in the design community found tacky and backwards-looking. Following his ouster, Ive was placed in charge of iOS design, and he's made it no secret that he intended to Think Different.

Well, based on what we're seeing, we're happy to report that it seems Ive's creative control is complete.

The first thing users will likely note is the change in typography. Just as Forstall's beloved word "skeuomorphism" has an unusual sequence of three vowels in a row, Ive has switched the font to what looks to be Helvetica Neue Ultra Light, which has an equally foreign sequence of contiguous vowels. The resultant look is undoubtedly more modern (though your correspondent prefers thicker fonts for legibility's sake).



"Flatness" is the adjective of the day, and the new iOS has it in spades. In the past decade-and-a-half icons have spun steadily out of control; what were once simple representations of objects, necessarily drawn in low-res due to computing constraints, unpleasantly evolved into overcomplicated, miniaturized portraits. Ive's flat design approach returns to the roots of the graphic icon, eschewing 3D shading and instead using line to tell the tale. With the exception of a couple of icons—the Settings gears and Game Center's balloons—shading is completely absent. The cartoonish highlights on the text message word bubbles are gone. Background gradations are the only non-flat visual variation allowed.


Interestingly enough, the keypad now looks like something graphically designed by the Braun of yore...


Posted by Ray  |   6 Jun 2013  |  Comments (1)


Among the many criticisms leveled against New York City's new bikeshare program, I'm particularly perplexed by the notion that the stations are a blight upon Gotham's otherwise pristine streetscapes—at worst, they're conspicuously overbranded, but, as many proponents have pointed out, they're no worse than any other curbside eyesore. Although the city is making a conscious effort to reduce the visual overstimuli at street level, it's only a matter of time before static signage simply won't suffice.

While Maspeth Sign Shop continues to crank out aluminum signs, BREAKFAST proposes an entirely novel concept for interactive, real-time wayfinding fixtures. "Points" is billed as "the most advanced and intelligent directional sign on Earth," featuring three directional signs with LEDs to dynamically display relevant information. However, "it's when the arms begin to rotate around towards new directions and the text begins to update that you realize you're looking at something much more cutting-edge. You're looking at the future of how people find where they're headed next."

See "Points" in action: