It's very strange that Google Glass is not mentioned once in this news segment. Researchers at Taiwan's Industrial Technology Research Institute (ITRI) have developed an eyeglass-based display, shown below, that combines images projected onto the lenses with depth cameras focused beyond them to create the functional illusion of operating a "floating touchscreen":
ITRI is simply the latest research group to use depth cameras to track our fingers, whose movements a microprocessor then registers as actionable "touches." Most recently we saw this with Fujitsu Labs' FingerLink Interaction System. So you might wonder why we're looking at this: isn't it just a combination of existing technologies we've all seen before? It is, but so were the iPod, the iPhone and the iPad when they first came out. What is significant here, from an interface design perspective, is the likelihood that in the future we'll use the touch interface without actually needing touch-enabled glass screens. It's strange to think that the touchscreen technology behind the iDevices, which seemed so groundbreaking at launch, might prove short-lived; historically the touchscreen may be seen as the "gateway drug," in a manner of speaking, that gets us hooked on this form of interaction before depth cameras take over and we no longer need a physical surface.
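To make the mechanism concrete, here is a minimal sketch of what that depth-camera "virtual touch" logic could look like. Everything here is an illustrative assumption, not ITRI's or Fujitsu's actual implementation: we pretend the camera reports a fingertip as an (x, y, depth) reading, and we register a touch whenever the fingertip lands close enough to a virtual screen plane floating at a fixed distance from the glasses.

```python
# Hypothetical sketch: register a "touch" when a depth-tracked fingertip
# crosses a virtual screen plane. Names and thresholds are illustrative.

SURFACE_DEPTH_MM = 400     # assumed virtual screen plane, 40 cm from the glasses
TOUCH_TOLERANCE_MM = 15    # assumed slack for how close counts as "contact"

def detect_touch(fingertip_xyz):
    """Map an (x, y, depth_mm) fingertip reading to a 2D touch point or None."""
    x, y, depth = fingertip_xyz
    if abs(depth - SURFACE_DEPTH_MM) <= TOUCH_TOLERANCE_MM:
        return (x, y)      # fingertip is "on" the plane: project to screen coords
    return None

# A finger moving toward the plane across four frames: only the frame
# near the 400 mm plane registers as a touch.
frames = [(120, 80, 520), (122, 81, 460), (125, 83, 405), (125, 83, 350)]
touches = [detect_touch(f) for f in frames]
# touches -> [None, None, (125, 83), None]
```

A real system would add noise filtering and debouncing so a trembling fingertip doesn't fire repeated touch events, but the core idea is just this depth comparison.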
At a minimum, once the resolution and accuracy issues are worked out and the depth cameras are further miniaturized, the technology could yield iPads of unlimited screen size, ones that don't physically exist, of course. But as long as our eyes think they exist, that's all that matters.