Technology

The Core77 Design Blog

Posted by Ray  |  20 Oct 2014  |  Comments (0)

At top: Designed by Snøhetta, the reverse sides of all five denominations of Norwegian kroner form a continuous pattern; above: A proposed redesign of the U.S. Dollar

By now, you've probably caught a glimpse of what were widely hailed as "the most beautiful banknotes ever." Somewhat less widely reported, at least in the first wave of press, is the fact that the Snøhetta-designed reverse side of the new Norwegian kroner is based on the Beaufort Scale for wind speed, or the fact that the jury actually selected Enzo Finger as the winner but that Norges Bank overruled their judgment and, um, split the bill between runner-up Metric System—who, in fairness, received credit for the obverse—and the architecture firm's PR-friendly abstraction. (A curiously contrarian interview with Snøhetta's Matthias Frodlund in Creators Project is perhaps the most interesting window into the process behind the pieces; "[Since] this might be the last [paper] money to be produced in Norway, [it's like] giving the digital world a little sneer—look we can be like you, digital and pixelated, just much more beautiful.")

The front and back of the 100kr note, designed by the Metric System and Snøhetta

In fact, all of the entries are available for viewing in the exegetical catalogue [PDF] (published with the October 7 press release), which elaborates on requirements such as standardized dimensions and colors of the notes—these properties remain consistent with extant currency for easy identification by both blind and sighted users—and judging criteria. Within the overall theme of "The Sea," each denomination was required to express a subtheme, e.g. "Sea that brings us into the world" (100kr); "Sea that brings us further" (1,000kr). Other considerations include acceptance by the general public, aesthetic longevity, and, interestingly, the fact that the notes will represent the national identity as "a business card for Norway."

The front and back of the 1000kr note, designed by the Metric System and Snøhetta

That much I gleaned from some de rigueur Google translating; the 64-page document (only about 15 of the pages have text) is a fairly straightforward outline of the competition, but I won't deny you the surprise of seeing Aslak Gurholt Rønsen's entry (pp. 16–21)...

continued...

Posted by Ray  |  17 Oct 2014  |  Comments (0)

L: ABC Dataset Samples; R: Photo credit: NOAA, Vancouver Aquarium.

We've long been enamored with the Eames' Powers of Ten short film, which is as much an introduction to aerial photography as it is to the math behind astronomy and biology. Just as everyone now takes beautiful images (and the retina displays to view them on) for granted, there is also a sense in which we are collectively GPS-enabled: After all, digital cartography is perhaps the most practical application of constant connectivity, and we can thank one company for the ability to zoom out to god's-(or satellites'-)eye view with a pinch of the fingers.


Benedikt Groß & Joey Lee take it even further with Aerial Bold, the "first map and typeface of the earth."

The project is literally about "reading" the earth for letterforms, or alphabet shapes, "written" into the topology of buildings, roads, rivers, trees, and lakes. To do this, we will traverse the entire planet's worth of satellite imagery and develop the tools and methods necessary to map these features hiding in plain sight.
The entire letterform database will be made available as a "usable" dataset for any of your art/design/science/textual projects and selected letterforms will be made into a truetype/opentype font format that can be imported to your favorite word processor.
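If you're wondering what "reading" the earth for letterforms might look like in practice, here's a toy sketch of the core idea in Python: threshold a tile into blobs, trace their outlines, and keep the ones whose shape resembles a reference letter. The synthetic tile, thresholds and single-letter template below are my own placeholders (and the code assumes OpenCV 4.x); the actual Aerial Bold pipeline is far more involved.

    # Toy sketch of the Aerial Bold idea: find blobs whose outlines resemble a letter.
    # The synthetic "tile" and all thresholds are invented for illustration only.
    import cv2
    import numpy as np

    def letter_contour(ch, size=200):
        """Render a reference letter and return its outer contour (OpenCV 4.x)."""
        canvas = np.zeros((size, size), np.uint8)
        cv2.putText(canvas, ch, (20, size - 40), cv2.FONT_HERSHEY_SIMPLEX, 6, 255, 20)
        contours, _ = cv2.findContours(canvas, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        return max(contours, key=cv2.contourArea)

    def find_letterforms(mask, ch="O", max_score=0.5):
        """Return (score, bounding box) for blobs shaped like the reference letter."""
        ref = letter_contour(ch)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        hits = []
        for c in contours:
            if cv2.contourArea(c) < 300:                       # ignore speckle
                continue
            score = cv2.matchShapes(c, ref, cv2.CONTOURS_MATCH_I1, 0.0)
            if score < max_score:                              # lower = more letter-like
                hits.append((round(score, 3), cv2.boundingRect(c)))
        return sorted(hits)

    # Synthetic stand-in for a satellite tile: a ring-shaped "building" and a road-like bar.
    # The ring should come out as the better "O".
    tile = np.zeros((400, 400), np.uint8)
    cv2.circle(tile, (120, 120), 60, 255, 25)
    cv2.rectangle(tile, (250, 260), (380, 290), 255, -1)
    print(find_letterforms(tile))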

continued...

Posted by hipstomp / Rain Noe  |  16 Oct 2014  |  Comments (4)


We've assumed that gesture control will be the wave of the future, if you'll pardon the pun. And we also assumed it would be perfected by developers tweaking camera-based information. But now Elliptic Labs, a spinoff company from a research outfit at Norway's University of Oslo, has developed the technology to read gestures via sound. Specifically, ultrasound.

In a weird way this is somewhat tied to Norway's oil boom. In addition to the medical applications of ultrasound, Norwegian companies have been using ultrasound for seismic applications, like scouring the coastline for oil deposits. Elliptic Labs emerged from the Norwegian "ultrasonics cluster" that popped up to support industrial needs, and the eggheads at Elliptic subsequently figured out how to use echolocation on a micro scale to read your hand's position in space.

With Elliptic Labs' gesture recognition technology the entire zone above and around a mobile device becomes interactive and responsive to the smallest gesture. The active area is 180 degrees around the device, and up to 50 cm with precise distance measurements made possible by ultrasound... The interaction space can also be customized by device manufacturers or software developers according to user requirements.

Using a small ultrasound speaker, a trio of microphones and clever software, a smartphone (or anything larger) can be programmed to detect your hand's location in 3D space with a higher "resolution" (read: accuracy) than cameras, while using only a minuscule amount of power. And "Most manufacturers only need to install the ultrasound speaker and the software in their smartphones," reckons the company, "since most devices already have at least 3 microphones."
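To make the echolocation idea concrete, here's a toy sketch: one speaker pings, three microphones time the echo off your hand, and the three path-length constraints pin down a 3D position. The geometry, timings and least-squares solver below are invented for illustration and are not Elliptic Labs' algorithm.

    # Toy ultrasonic echolocation: recover a hand position from three echo delays.
    # Layout, numbers and solver are made up for this example.
    import numpy as np
    from scipy.optimize import least_squares

    SPEED_OF_SOUND = 343.0                         # m/s in room-temperature air
    speaker = np.array([0.0, 0.0, 0.0])            # hypothetical layout, metres
    mics = np.array([[0.06, 0.00, 0.0],
                     [0.00, 0.12, 0.0],
                     [0.06, 0.12, 0.0]])

    def echo_times(hand):
        """Round-trip time of a ping: speaker -> hand -> each of the three mics."""
        out = np.linalg.norm(hand - speaker)
        back = np.linalg.norm(hand - mics, axis=1)
        return (out + back) / SPEED_OF_SOUND

    def locate_hand(measured_times):
        """Least-squares fit of a hand position to the three echo delays."""
        residual = lambda p: echo_times(p) - measured_times
        fit = least_squares(residual, x0=[0.0, 0.0, 0.2],
                            bounds=([-1, -1, 0.0], [1, 1, 1]))  # hand is above the device
        return fit.x

    # Simulate a hand 25 cm above the screen and recover its position from the echoes.
    true_hand = np.array([0.03, 0.05, 0.25])
    print(locate_hand(echo_times(true_hand)))      # ~ [0.03 0.05 0.25]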

The demo of the technology, which they're calling Multi Layer Interaction, looks pretty darn cool:

continued...

Posted by hipstomp / Rain Noe  |  13 Oct 2014  |  Comments (8)


They are the first to market, but they certainly won't be the last: Power tool manufacturer Bosch has rolled out wireless charging for 18-volt cordless tools before any of their competitors. An inductive charger transmits electricity to the battery placed atop it, meaning for the first time one doesn't have to disconnect the battery to juice it up.


The productivity gains spread across the entire body of users should be enormous. I can't tell you how many times I've been using my drill and impact driver in concert, and invariably one or the other will run out of juice, meaning I've got to go back and forth with one battery on both units while I charge the other battery up. Arguably this wouldn't happen if I had the discipline to disconnect both batteries after every job and pop them back on the charger, but I just don't. With a charger frame like Bosch's, I could simply dock the entire tool after each gig and come back to 100% battery life, checking the little LED indicator on the base to be sure.

Check it out:

The company reports that the new 18V batteries are backwards-compatible, so legacy Bosch users won't be left in the wired-up cold.

Posted by hipstomp / Rain Noe  |   8 Oct 2014  |  Comments (2)


Bruno Francois is a clever man. Back in 2012 he figured out how to harness the vibrating function in an iPhone 5, combined with data from the gyro and compass, to make the iPhone rotate precisely in place when stood up on its edge. The resulting app, Cycloramic, could then shoot hands-free panoramic photos and video:
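For the mechanically curious, the control loop implied by that description might look roughly like the sketch below: pulse the motor, watch the heading, and grab a frame every time the view has turned by about one camera field of view. The vibrate/heading/photo functions are hypothetical stand-ins for platform APIs, and this is emphatically not Cycloramic's actual code.

    # Rough sketch of a vibrate-and-shoot panorama loop. The "phone" here is a
    # crude simulation; real sensor and camera calls are hypothetical stand-ins.
    import time

    CAMERA_FOV_DEG = 60.0      # assumed horizontal field of view
    OVERLAP_DEG = 15.0         # overlap between neighbouring shots for stitching

    def shoot_panorama(vibrate, read_heading, capture_photo):
        """Pulse the motor, track degrees swept, shoot a frame every ~45 degrees."""
        swept = 0.0
        since_last_shot = 0.0
        prev = read_heading()
        shots = [capture_photo()]
        while swept < 360.0:
            vibrate(0.05)                      # short pulse nudges the phone around
            time.sleep(0.001)                  # let it settle (shortened for the demo)
            now = read_heading()
            delta = (now - prev) % 360.0       # degrees turned since the last reading
            prev = now
            swept += delta
            since_last_shot += delta
            if since_last_shot >= CAMERA_FOV_DEG - OVERLAP_DEG:
                shots.append(capture_photo())
                since_last_shot = 0.0
        return shots

    # Fake phone standing in for the real hardware: every pulse turns it 2 degrees.
    state = {"heading": 0.0, "frames": 0}
    def fake_vibrate(seconds): state["heading"] = (state["heading"] + 2.0) % 360.0
    def fake_heading(): return state["heading"]
    def fake_photo(): state["frames"] += 1; return state["frames"]

    print(len(shoot_panorama(fake_vibrate, fake_heading, fake_photo)), "frames captured")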

This was good enough to garner Francois some 600,000-plus downloads, and with a $0.99 retail price, he presumably recouped whatever investment of time and money he put into developing the app. But earlier this year he appeared on the competitive "Shark Tank" TV show, where entrepreneurs compete to gain financial backing from Mark-Cuban-level big dogs, to see if he could go next-level. The clip was riveting:

continued...

Posted by hipstomp / Rain Noe  |  30 Sep 2014  |  Comments (4)


This is a fascinating idea that was developed by a research group at Japan's Keio University. By applying optical camouflage technology and using recursive reflectors, which "[reflect] light back in the direction of incidence," the researchers were essentially able to render the back of a Toyota Prius invisible, at least from the driver's point of view. Take a look:

What we found fascinating is their proposal that this could be applied to all 360 degrees. And aside from average motorists trying to back passenger cars into parking spaces, imagine what a boon this would be to folks driving delivery trucks, tractor-trailers, construction machinery and other bulky, blind-spot-laden vehicles.

Unfortunately, the technology may never come to pass. The concept was put forth in 2011, and there's been no word on an update since the video above was released in 2012. But tell me this thing wouldn't get Kickstarted in a heartbeat.

Via DigInfo TV

Posted by hipstomp / Rain Noe  |  30 Sep 2014  |  Comments (0)


I'm not looking forward to winter, because the ex-manufacturing space I moved into last year is brutally cold and drafty. I spent last winter making futile attempts to caulk this and shrink-wrap that, only to achieve zero perceptible gains in thermal efficiency; the space is simply too deteriorated on all six sides for me to determine where I can best make a dent.

What I need is a focused plan, a way of determining where the largest heat leaks are so I can tackle those first. And I think I've found my solution in this awesome-looking Seek Thermal Smartphone Infrared Camera.


The tiny, three-inch, half-ounce, $199 device brings something close to military- or industrial-grade thermal imaging to the common man with the common paycheck. (A commercial infrared camera would run you four figures.) You plug it into the bottom of your smartphone and bang, you've got an image on your screen that can accurately display a range of temperatures from -40° Celsius (-40° Fahrenheit) up to 330° C (626° F).

Here's a demo of it in action from Android Police's David Ruddock, and you can skip the first 30 seconds of pitch-blackness:

continued...

Posted by hipstomp / Rain Noe  |  30 Sep 2014  |  Comments (2)


We've seen drones used or proposed for package delivery, elaborate selfies, action sports capture, movie promotion, and even weather control. But a recent creative collaboration points to the possibility of a more domestic usage that we think could be the killer drone app of the future: How about floating lamps? Which is to say, just the lampshade and a light source, no stem, no cable, hovering in mid-air, able to follow you around the room if need be.


In the video below you'll see what it would look like, but before it becomes domesticated, there are just a few (completely solvable) technological hurdles to clear:

Noise. To cancel out the incessant whining of a hovering drone, a small on-board speaker could project a noise-cancellation signal (a toy sketch of the idea follows this list).

Power. During the daytime, the drone could dock itself, perhaps to something attached to the ceiling, where it would recharge the batteries required for both the light and its own sustained flight. (Ideally the power would come from solar, so you're not wasting a bunch of coal-fired juice on an admittedly frivolous technology.)

User Interaction. Remote control, gesture control or voice activation could turn it on and off, adjust the brightness and hue, and ask the lamp to follow you around or focus light on a particular area.
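As for that noise-cancellation sketch promised above: the idea is simply to play the rotor whine back phase-inverted so the two signals cancel where they overlap. The toy example below fakes a rotor tone and flips its sign; real active noise control on a moving drone would need adaptive filtering, not a simple sign flip.

    # Toy illustration of anti-noise: a phase-inverted copy cancels the original.
    # The "rotor whine" below is synthetic; real ANC needs adaptive filters.
    import numpy as np

    SAMPLE_RATE = 44_100
    t = np.linspace(0, 1.0, SAMPLE_RATE, endpoint=False)

    # Fake rotor whine: a few harmonics of a 180 Hz blade-pass tone.
    rotor = sum(a * np.sin(2 * np.pi * f * t)
                for f, a in [(180, 1.0), (360, 0.5), (540, 0.25)])

    anti_noise = -rotor            # phase-inverted copy played by the on-board speaker
    residual = rotor + anti_noise  # what a listener in the cancelled zone would hear

    print(f"rotor peak {np.abs(rotor).max():.2f}, residual peak {np.abs(residual).max():.2f}")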

At any rate, a floating lamp would give you one less thing to vacuum around, if replacing a floor lamp, and free up some table space if replacing a desk lamp.

Maybe it sounds silly but it looks beautiful in practice. Check out this sweet video created in a collaboration between performance group Cirque du Soleil, the Swiss Federal Institute of Technology Zurich and drone developer Verity Studios:

continued...

Posted by hipstomp / Rain Noe  |  23 Sep 2014  |  Comments (1)


When modern warplanes have missiles fired at them, they deploy flares or chaff to lead those missiles off-target. The magnesium-containing flares are designed to burn hotter than the airplane's exhaust, drawing heat-seeking missiles to the flare rather than the plane. Meanwhile the reflectiveness of chaff—typically small pieces of aluminum or reflective plastic—is meant to dazzle and confuse radar-guided missiles. This overly dramatic video of a Eurofighter Typhoon shows you how it's supposed to work (at least with flares):

continued...

Posted by hipstomp / Rain Noe  |  12 Sep 2014  |  Comments (1)


If you've ever passed a park in Chinatown and seen the older folks playing Mahjong, you've undoubtedly seen them manually "shuffle" the tiles between games before rearranging them into fresh rows. This is how it's been done for generations, but in the past few decades, Mahjong tile shuffling and dealing has received a rather awesome upgrade:

continued...

Posted by Sam Dunne  |   9 Sep 2014  |  Comments (2)


Earlier today, connected device guru Matt Webb announced the closure of cloud service developer Berg Cloud in a blog post, citing difficulties in finding a sustainable business model for the innovative venture since moving away from the agency model back in 2013. Conceived after the team experienced the difficulties of making connected products first hand, the ambitious Berg Cloud's vision had been to create cloud services that would make life easier for hardware innovators—effectively serving as the missing link between the wireless chip in a new connected device and a user-facing website or smartphone app.

The Little Printer—perhaps the most iconic of Berg's creations—also looks to be implicated in the sad news. Closure of the service behind the product, the release reports, could come as soon as March 2015 unless a suitable buyer can be found. Fortunately for fans and owners of the diminutive device and the wider technologist community, the announcement also suggests that the code for the Little Printer could be opened up—no doubt to the delight of legions of hackers and tinkerers.


continued...

Posted by hipstomp / Rain Noe  |   4 Sep 2014  |  Comments (7)


With a press release this morning, Dyson tidily answered some questions that have been in the back of our minds. Questions like:

1. What are they going to use those tiny, powerful digital motors they developed for?
2. What weren't they showing us in that Dyson Proving Grounds video from last year?
3. What does a company that spends £3 million on research every week put that money towards?
4. Why on earth does a company with just three product categories employ 2,000 engineers?
5. Why hasn't Dyson, with its expertise in vacuums, yet moved into the robot vacuum space?

With this morning's announcement of the Dyson 360 Eye, all of the answers have fallen into place. Their forthcoming robot vacuum has been in development for a staggering 16 years, and a close look at the thing explains what all of those R&D eggheads have been working on.

First off, the "most powerful suction of any robot vacuum" claim is what you'd expect with Dyson, particularly if you've used any of their products. (Look for our upcoming "Living With..." product review on the Dyson DC59.) As you'll see in the demo video below, the 360 Eye with that digital motor was designed to suck up way more dust than the competing robot vacs, including from the crevices between floorboards, in a single pass. This comes as no surprise.


What is surprising is the wayfinding technology they've come up with. Existing robot vacuums have sensors and algorithms that they use to bounce around rooms seemingly randomly, relying on multiple passes and path redundancy to get a room clean. Dismissing this method as primitive and inadequate, Dyson opted to go with vision.


They developed a 360-degree camera that shoots 30 frames per second and actually sees the room, selecting high-contrast points—the edge of a picture frame, the corner of a bookshelf, for instance—to triangulate its location in space.
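The "pick high-contrast points" step is bread-and-butter computer vision; below is a minimal sketch using OpenCV's Shi-Tomasi corner detector on a synthetic frame (assuming OpenCV 4.x). The harder part, tracking those landmarks across 30 fps panoramic frames to triangulate the robot's pose, is Dyson's secret sauce and isn't shown here.

    # Minimal sketch of picking high-contrast landmark points in a camera frame.
    # The "room" is synthetic; this is not Dyson's code.
    import cv2
    import numpy as np

    def landmark_points(gray, max_points=200):
        """Pick the strongest high-contrast corners in a frame (Shi-Tomasi)."""
        corners = cv2.goodFeaturesToTrack(gray,
                                          maxCorners=max_points,
                                          qualityLevel=0.05,   # keep only strong contrast
                                          minDistance=15)      # spread points around the room
        return np.empty((0, 2)) if corners is None else corners.reshape(-1, 2)

    # Synthetic stand-in for a camera frame: a picture frame and a bookshelf edge.
    frame = np.full((480, 640), 180, np.uint8)
    cv2.rectangle(frame, (100, 80), (260, 200), 30, 4)
    cv2.rectangle(frame, (400, 150), (600, 170), 60, -1)
    print(len(landmark_points(frame)), "candidate landmarks")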


continued...

Posted by Ray  |   3 Sep 2014  |  Comments (8)


Cupertino seems to have sprung a few leaks lately, from the iCloud celeb photo hack to a drone-eye view of the spaceship construction site. Of course, insofar as Apple is known for its secrecy as much as its industry leadership, the company has long been a target for another reason: speculation about new products.

Hot on the heels of Feld & Volk's hands-on teaser, above, Russian tech reviewer Rozetked brings us a fully assembled iPhone 6 a week prior to its official unveiling. Reportedly sourced from various factories that are supplying parts for the sixth-generation iPhone, the product walkthrough imparts a strong sense of the larger, thinner smartphone's features, most notably its rounded edges and protruding camera. Other notable details include unibody construction with the signature plastic bands for the antenna, while the 4.7" screen is reportedly not made of solid sapphire (the material we'd previously seen introduced for the home button of the iPhone 5S).


Check out the video:

continued...

Posted by Ray  |  28 Aug 2014  |  Comments (0)


"I hear you're buying a synthesizer and an arpeggiator and are throwing your computer out the window because you want to make something real."

–LCD Soundsystem, 'Losing My Edge'

Well, this is weird and fun: The data wizards at IBM have partnered with the U.S. Open and James Murphy of LCD Soundsystem / DFA Records fame to create real-time musical interpretations of tennis matches throughout the tournament. The premise of the U.S. Open Sessions is simple: IBM processes millions of data points via cloud-based algorithms to generate synth tones that represent the gameplay, complemented by Platonic shapes in the browser window. Developer Patrick Gunderson of digital production company Tool does the heavy lifting while Murphy transposes the progress of the match from groundstrokes to keystrokes; from playing the baseline to, um, playing the bassline.
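To give a flavor of how match data can become music, here's a toy sonification in Python: each point becomes a short tone whose pitch tracks the rally length and whose stereo channel tracks who won it. The event schema and the mapping are invented for illustration; IBM and Murphy's system is, needless to say, far richer.

    # Toy data-to-sound mapping: rally length sets pitch, point winner sets panning.
    # The (rally_length, winner) schema is made up for this example.
    import numpy as np
    from scipy.io import wavfile

    SAMPLE_RATE = 44_100

    def tone(freq, seconds=0.25):
        t = np.linspace(0, seconds, int(SAMPLE_RATE * seconds), endpoint=False)
        return 0.3 * np.sin(2 * np.pi * freq * t)

    def sonify(points, out_path="match.wav"):
        """points: (rally_length, winner) tuples, winner in {'A', 'B'}."""
        left, right = [], []
        for rally_length, winner in points:
            freq = 220 * 2 ** (min(rally_length, 24) / 12)      # longer rally, higher pitch
            note = tone(freq)
            left.append(note if winner == "A" else 0.2 * note)  # pan toward the point winner
            right.append(note if winner == "B" else 0.2 * note)
        stereo = np.stack([np.concatenate(left), np.concatenate(right)], axis=1)
        wavfile.write(out_path, SAMPLE_RATE, (stereo * 32767).astype(np.int16))

    sonify([(3, "A"), (7, "B"), (12, "A"), (5, "B")])   # invented rally data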

continued...

Posted by Kat Bauman  |  26 Aug 2014  |  Comments (1)


This is the second half of a two-part conversation with Geoff Baldwin, head of the new-ish Industrial Design department at Code and Theory. Read the first part here.

Core77: Outernet is technically involved, it's mechanically involved, and it's got a big dream behind it. How did C+T come to the project?

Geoff Baldwin: This whole industry, everywhere I've been, it's all about good people. Sayed [Karim], the founder of Outernet, I used to work with him at IDEO, where he was our tech guy, which is the best story ever: The IT guy at IDEO is trying to win a Nobel Peace prize! He was the best IT guy, he'd fix your computer super fast and was so responsive, because he just wanted to get done being an IT guy so he could go back to the shop and build shit.

I kept in touch with him, he went from IDEO to NPR to an investment firm that invested in news and information startups in developing countries. He was living all over the world and saw this problem: Yes, people need the Internet, but maybe they just need information. Back in March or February, I got an email asking if I knew anyone who could help him out with hardware. What he needed was a concept car and a vision. He was starting to get funding, but needed something tangible that people could hold onto and believe in. That's where it started.

Did C+T do the entire physical development of the Outernet?

Yes, and I think it should be understood that the project is still at a very gestational stage. [Sayed] is trying to do something incredibly ambitious that requires tons of capital and people being interested, so what we were doing here was creating that concept car and vision—if people can't get it in two minutes, they're not going to get it. But in order to create that concept car, we had to do some intense nuts and bolts engineering. We did this incredibly rapidly, as a six week project. Instead of staging it as the product, then the story and then... it was all at the same time. We brought Sayed in for a week, he was basically living with us. Sometimes we'd kick him out and he'd sit in the hallway and do... whatever he did.

He had collected so much knowledge about satellites and how they work, a ton of work on that back end, so we got him in and got his input on the technical basics and problems to solve and constraints. But in addition to getting a lot of hard information we also got the basis for the story. In a way the story, the fluffy-message fun part started to drive the really hard, critical engineering.


continued...

Posted by Kat Bauman  |  25 Aug 2014  |  Comments (0)


If you care about the power of free information, you may have heard of Outernet, an ambitious new project aimed at ending information inequality, eliminating censorship, and bringing data to distant places. The world may be well into the Information Age, but less than 40% of the global population has Internet access! That's bad for democracy, bad for innovation, and bad for business worldwide. What Outernet proposes is to take the heart of the Internet—free information—offline, and deliver it to anyone with a satellite dish using small cheap satellites, the existing geostationary satellite network, and simple hardware. It's a lot like a radio-transmitted library. As they put it, they offer "information for all from outer space. Unrestricted, globally accessible, broadcast data. Quality content from all over the Internet. Available to all of humanity. For free." For more details, check out LA Times' excellent infographics.

To get the project out of technical start-up talk and into people's hands, Outernet reached out to Code and Theory to help them level up. Why would an international humanitarian tech project work with a digital agency on prototyping? It's a good question, with an I.D. twist. To learn about building the Outernet satellite receiver and how Code and Theory helped, I spoke with Geoff Baldwin, who heads their new but busy Industrial Design group.

Core77: Tell me about what you do at Code and Theory.

Geoff Baldwin: I'm the director of Industrial Design. Code and Theory is known for its long history of doing digital design and interactive experiences. In the last 5 or 6 years we've become better known as a digital agency of record for major brands, for doing social campaigns and different digital advertising-ish things, so the idea was to build an Industrial Design team inside of this existing digital creative culture to do everything at once. To be able to design the thing, the interactions around it, and the story about it all from the same point of inspiration.

All smiles

continued...

Posted by Moa Dickmark  |  25 Aug 2014  |  Comments (1)


A few months ago, I was contacted by an organization called Women Engineers Pakistan, which introduces girls to the field of engineering and technology. Just reading the name made me curious. For those of you who don't know, I'm an architect, and I come from a family full of engineers and tech-heads. In other words, my choice of becoming an architect has never, at any point of my life, ever been questioned. I went to a technical high school in Uppsala, Sweden, always with the support of mom and dad, brothers and sister, my grandmother, aunts, uncle and most of all my wonderful grandfather. With 26 boys and 5 girls in my class, the male-to-female ratio was rather high, but my knowledge and competence was never questioned by anyone of the male gender. Not by teachers, nor by fellow students.

Hearing about an organization like this and its origins was inspiring, and it takes more than a bit of willpower and skin on the nose (a Swedish expression) to start something this groundbreaking and controversial in a country where female students are told that they should reconsider their choice to study engineering and start studying something more suitable for women...

In this interview, I've had the great pleasure of talking directly with Ramla Quershi, the co-founder of Women Engineers Pakistan. She recently moved to the U.S. to study engineering on a full Fulbright scholarship. So even though she's busy with the big move and getting her bearings, she set aside some time for this interview. I hope you get as inspired by reading this as I did from writing it.


Core77: Tell us a bit about the organization and the thoughts behind it.

Ramla Quershi: The organization is a budding startup, which looks to increase participation by Pakistani women in engineering. Women have always been by and large in domestic and agricultural jobs in Pakistan, and their participation in science and technology has been minimal. We realize that women make up over half the Pakistani population and we're working to prevent that potential talent for technical prowess from going to waste. We're working with young girls at high schools to encourage them towards science and math.

When did you start working on getting Women Engineers Pakistan up and running?

It started with a Facebook page last August. But it wasn't until six months ago that we started working as an organization.

Why did you decide on starting WEP?

Throughout my engineering degree, I felt a nagging lack of women in this field. Our professors often discouraged us, telling us that engineering is a 'big boy' area. It was disheartening to realize that there weren't many role models set out for us. So I created this organization to give women engineers a platform to represent themselves.

When the professors talked about it being a "big boy" profession, how did your fellow male students react to those sorts of comments?

My fellow males knew that I was good at my studies, so they would often turn up for a group study option and ask me to explain things to them. So they had found out that the women in their class were just as good (some even better) engineers. Barring a few, many were courteous and encouraging. However, there were some 'go make a sandwich' sort of comments—but not many.

There must have been many ideas and incentives to make it go from a concept into reality. What were they?

Oh yes, there were. Initially it was just a Facebook page, but then it started getting attention, and I realized that I had hit a niche. We were contacted by the U.S. Embassy through the Facebook page for a meeting with a NASA engineer coming to Pakistan. And I thought, 'Oh wow, not much representation for the women in engineering crowd.'


continued...

Posted by hipstomp / Rain Noe  |  21 Aug 2014  |  Comments (0)


So your kid draws all over your expensive carpet with a handful of Sharpies. You're so infuriated that after giving him a time-out, you need to go outside to carve some sand patterns in your Zen garden just to cool off. Well, if only you had access to the designs of Yuta Sugiura, a professor at Keio University's Graduate School of Media Design, you could cleanly ameliorate both situations.

Sugiura headed up the research team that produced "Graffiti Fur: Turning Your Carpet Into a Computer Display." Three clever devices can put images that you've either drawn or captured onto a plain ol' carpet, Sharpie-free and completely reversible:

Sugiura's team—which included researchers not only from Keio, but from the Nagoya Institute of Technology and The University of Tokyo—presented "Graffiti Fur" at this month's SIGGRAPH in the Emerging Technologies & Studio Collaboration category.


Posted by hipstomp / Rain Noe  |  19 Aug 2014  |  Comments (0)


Carlos Tomas dropped his Mazda 6 off for a detailing appointment at a shop in Toronto. When he returned to pick it up, he noticed some cosmetic damage to the front of the car that he swore wasn't there before. But the body shop denied responsibility. A suspicious Tomas brought another car to the shop the following week, a sportier RX-8, and this time he secretly photographed the odometer before handing over the keys.

When Tomas picked the RX-8 up five days later, he noticed an extra 449 kilometers had been racked up on it. And amazingly, he received a CAD $45.60 bill in the mail from the local automatic toll collection agency.

We're guessing the designers and engineers over at Chevy have heard stories like this once too often, as they've actually cooked up a feature to solve this with their 2015 Corvette:

What's interesting is that the technology already existed as part of the Corvette's Performance Data Recorder package, which uses a small camera to shoot HD footage from the driver's POV, while a mic records the in-cabin audio and a computer records the vehicle data and telemetric info. The PDR was originally designed for track-heads who wanted to improve their lap times, but "We soon realized the system could have many more applications," Corvette product manager Harlan Charles said in a press release issued yesterday, "such as recording a scenic drive up Highway 101, or recording when the Valet Mode is activated."

The info and video can be viewed in-car immediately after recording, and they're also saved to an SD card if you want to take the proof to the cops or just upload it to YouTube. "Think of it," says Charles, "as a baby monitor for your car."

Posted by Kat Bauman  |  14 Aug 2014  |  Comments (0)


If you're anything like me, you've hit midsummer without noticing, your thighs are sweat-glued to your chair, and you aren't taking a tropical vacation within the next decade. If the heat and business as usual are getting you down, join me on a virtual tour of a cool and exotic locale on the cusp of hitting it big: Mars. And what better to inspire your fantasy travel than a cool map of the region and its history?

After centuries of squinting at that tiny red blip on our starry radar, we've gathered some increasingly good intel on the terrain and climate of Mars. The high tech ex-pats living there have been sightseeing for over ten years and the planet has a host of new snooping satellites. However, we haven't had new maps made since the late '80s! While everyone loves a good old map, old maps won't do you much good on the ground. Thankfully the United States Geological Survey has finally crammed nearly 30 years of new data and imaging into a beautiful new map [PDF].

While the last large Martian mapping effort predated digital techniques and required hand-plotting, the new one has a lot of high-tech heft behind it. This geological map was compiled using satellite photos, topographic and soil information from rovers, and super-detailed laser altimeter data. The result is a map that clearly breaks down the surface composition, topography, and (maybe most importantly) age of the red planet's diverse regions. As the map abstract puts it, the work is "based on unprecedented variety, quality, and quantity of remotely sensed data acquired since the Viking Orbiters." As I'd put it: It leaves previous maps in the dust.

continued...

Posted by hipstomp / Rain Noe  |   6 Aug 2014  |  Comments (1)


First thing I had to check was that this wasn't released on April 1st. But no, in a research paper titled "The Visual Microphone: Passive Recovery of Sound from Video" submitted for the upcoming SIGGRAPH 2014, a team of researchers have allegedly discovered how to extract sound from video images.

I'm still waiting for Snopes to debunk this, but this research collaboration from MIT's Computer Science and Artificial Intelligence Laboratory, Microsoft Research and Adobe Research makes the following claim:

When sound hits an object, it causes small vibrations of the object's surface. We show how, using only high-speed video of the object, we can extract those minute vibrations and partially recover the sound that produced them, allowing us to turn everyday objects--a glass of water, a potted plant, a box of tissues, or a bag of chips--into visual microphones.

Sounds crazy, no? Watch this and see (er, hear):

As you can see, the technology is predicated on using high-speed, high-resolution video. But just imagine if it was possible to apply this to old, audio-free archival footage.
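As a vastly simplified stand-in for what the researchers do, you can treat frame-to-frame brightness fluctuations of a patch in high-speed footage as an audio signal, as in the sketch below. The actual paper recovers motion from local phase changes across a steerable pyramid, so this crude version would only pick up very strong vibrations; the file name and patch coordinates are placeholders.

    # Crude brightness-wobble "audio" from high-speed video. This is a simplified
    # proxy for the paper's method, not a reimplementation of it.
    import cv2
    import numpy as np
    from scipy.io import wavfile

    def recover_audio(video_path, out_path="recovered.wav",
                      patch=(slice(100, 200), slice(100, 200))):
        """Turn brightness wobble of a patch in high-speed footage into a WAV file."""
        cap = cv2.VideoCapture(video_path)
        fps = int(cap.get(cv2.CAP_PROP_FPS))       # e.g. 2,000+ fps for real recovery
        samples = []
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            samples.append(gray[patch].mean())     # one "audio sample" per frame
        cap.release()
        if not samples:
            raise ValueError("couldn't read any frames from " + video_path)
        signal = np.array(samples)
        signal -= signal.mean()                    # drop the DC offset
        signal /= max(np.abs(signal).max(), 1e-9)  # normalise to [-1, 1]
        wavfile.write(out_path, fps, (signal * 32767).astype(np.int16))

    recover_audio("chip_bag_highspeed.mp4")        # placeholder file name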

Posted by Kat Bauman  |   5 Aug 2014  |  Comments (0)


Further developments from the eerie future we're inhabiting: Roambotics wants a wheeling robot to patrol your home. Their first concept robot "Jr." is the winner of the Proto Labs Cool Idea! Award, and is a pretty good-looking object. Designed as an autonomous, self-leveling, wheel-shaped robot with video and audio monitoring abilities, it will identify intruders or changes within the home. Data is wirelessly streamed to the cloud and the Roambotics network, and will update the homeowner via an app or email. The software uses machine learning to better understand the environment over time, and they hope to integrate 3D mapping of spaces. According to their description on Crunchbase, "Jr. features a base station with inductive charging, multi-surface slip and cliff detection, self-stabilization, Bluetooth 4.0, 802.11 A-N, and a NVIDIA tegra 4 microcontroller." That's right, we're one (wheeled) step closer to Robocop.


continued...

Posted by erika rae  |  25 Jul 2014  |  Comments (0)

An artist's rendition of Hitchbot on the road

Seeing as self-driving cars won't be a reality any time soon, robots need to find an alternate means of travel for the time being. Case in point, the HitchBOT, a tiny, rainboot-wearing robot who is looking to travel across Canada by (you guessed it) the time-honored tradition of hitchhiking. Even so, he's probably less eccentric than your average itinerant: With bright yellow Wellies strapped to his feet and a cake-saver helmet, HitchBOT has a bucket for a body and pool noodles for limbs, and looks like something out of a child's storybook or TV show... which is kind of the point.

The brainchild of Dr. Frauke Zeller of Ryerson University and Dr. David Harris Smith of McMaster University, the HitchBOT was designed to travel some 6,000+ km from the Nova Scotia College of Art and Design to Victoria, British Columbia, by means of friendly strangers. He comes fully equipped with 3G, GPS, WiFi and solar panels, so the hitchBOT team can track and receive texts/photos highlighting the droid's progress and deliver the updates to his quickly growing circle of fans. The robot will depend on the goodwill of travelers when he runs out of juice—once the energy from the solar panels is used up, all it takes is a simple connection to a car's cigarette lighter to reboot.


The droid is pretty limited when it comes to mobility—his only moving part is his arm—but can sit thanks to a retractable tripod. A car seat is attached to his torso for easy buckling. HitchBOT can also speak English (along with a few sentences of French) and has access to Wikipedia for small talk topics. You could do worse when it comes to a road-trip buddy.

continued...

Posted by hipstomp / Rain Noe  |  25 Jul 2014  |  Comments (5)


Sure, smartphones allow us to communicate with anyone in the world at any time and provide access to a global network of knowledge and entertainment, but it's not like we can just pull the things out of our pockets and start using them. No. Instead we are forced to type in a four-digit security code!

This provides a unique set of physical challenges. For example, let's say that your security code is 1-9-8-2. This means you have to send your thumb up to the "1" at the top left of the screen, then move it all the way down to the bottom right to press the "9!" Then you have a little break moving it over to the "8," but that's temporary, because guess what, then you have to move your thumb all the way up to the top again to hit "2!" What are we, slaves?

Thankfully, for those of us who weren't born with Arnold Schwarzenegger's thumbs, help is here in the form of Digital Tattoos. These NFC-based skin stickers come in packs of ten. You stick one onto your body and tap your phone against it to "accept" it, which should be easier than getting your parents to accept that tribal/Celtic/Chinese character tattoo. From then on, you just tap your smartphone (it can be any smartphone in the world, as long as it's a Moto X) against the sticker and boom, the phone is unlocked, no Gatorade breaks required.

The adhesive "lasts for five days, and is made to stay on through showering, swimming, and vigorous activities like jogging," making this ideal for those who like to shower, swim, and/or jog vigorously.

Digital Tattoos aren't free, of course, they're $10 per pack. But that's no problem, because when you run out, you just pay them another ten dollars and then they give you another pack. In other words, you can just keep buying them!

Posted by Kat Bauman  |  23 Jul 2014  |  Comments (0)


Regardless of whether you're in the Invasion of My Space camp or the Well That's How Business Works camp, Facebook has been playing games with your heart. As we all now ought to know, Facebook has admitted to experimentally filtering feed results to test emotional response and behavior in users. While it's hard to consider experimentation without informed consent to be anything less than blatantly shady, it's also well within their legal rights. Ethical it ain't, but then again deskchair epidemiology has never had the luxury of such self-selecting scale.

But the biggest bummer—other than seeing an upswing in pictures of your exes and their stupid beautiful lives—is that we didn't get to see the results! Not so any longer. Artist Lauren McCarthy created the Mood Manipulator, a browser extension that allows you the gratification of choosing your own digitally devised mood swings.

Now you can choose your own emotional filtering rather than passively interacting with a pre-adjusted feed, filtered by unseen researchers without enough scruple to feel weird about studying emotional effects in people who haven't been notified. These tasteful opt-in controls give you four tonal "channels" with three positions each: Positive, Emotional, Aggressive and Open (in other four-metric psych news, the Myers-Briggs test is totally meaningless). Just download the extension and toggle your way to psycho-social harmony.
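Stripped to its essence, that kind of filtering is just scoring each post against a word list and keeping (or hiding) the matches. The real Mood Manipulator is a browser extension rewriting Facebook's feed in place; the Python toy below, with made-up word lists and posts, only illustrates the idea.

    # Toy feed filter: keep or suppress posts that match a mood "channel".
    # Word lists and posts are invented; this is not the extension's code.
    CHANNEL_WORDS = {
        "positive":   {"love", "great", "happy", "beautiful", "win"},
        "aggressive": {"hate", "fight", "angry", "worst", "rage"},
    }

    def score(post, channel):
        """Count how many channel keywords appear in the post."""
        words = {w.strip(".,!?").lower() for w in post.split()}
        return len(words & CHANNEL_WORDS[channel])

    def filter_feed(posts, channel, boost=True):
        """Keep posts matching the channel (boost=True) or suppress them (False)."""
        matches = [p for p in posts if score(p, channel) > 0]
        return matches if boost else [p for p in posts if p not in matches]

    feed = ["What a beautiful morning, I love this city!",
            "Traffic makes me so angry, worst commute ever.",
            "New paper out today."]
    print(filter_feed(feed, "positive"))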

Always with the babies

continued...

Posted by Teshia Treuhaft  |  15 Jul 2014  |  Comments (0)


It should come as no surprise that the marriage of art and technology has had some difficulty finding a place in the institutional white cube exhibition spaces of most contemporary galleries and museums—after all, many practitioners reject the traditional art-object format on principle. Indeed, the incorporation of technology in art has vastly expanded the realm of creative possibilities, both aesthetically and with respect to distribution—auction house Phillips recently held the second edition of its forward-looking "Paddles On!" digital art auction—yet the modes by which it is bought, sold or displayed continue to shift and evolve.

The recent Kickstarter campaign for Electric Objects marks a noteworthy attempt to streamline the presentation of Internet and digital art into more conventional means. Electric Objects' first major product run, the EO1, is essentially a wall-mountable, high-definition screen with a Wi-Fi connection for control from the company's handy mobile app. The EO1, framed in your choice of white, black or wood, displays your collection of Internet art without drawing your attention away from daily activities. The EO1 supports static images, animated GIFs and JavaScript-based visualizations.


continued...