Continuing the theme of future mobile design, stimulated by this link from MIT's Fluid Interfaces Group, this post is a quick look at some ideas for possible futures in machine-to-human communications. The screen of the future, if you like.
First off, a confession. I love my iPhone. It's a great piece of technology that just works; well enough to get me excited about handsets for the first time in years. It's also led to a spate of people wandering around, heads down in their smartphones, writing emails, tweeting, surfing the web, playing games, or doing another of the tens of thousands of things you can get an app for. This is a shame, because while the iPhone has lots of next-generation functionality that's exciting to use, it has lost some of the heads-up mobility that made cellphones compelling in the first place.
The Fluid Interfaces Group device gets around this "heads down" problem by using a small projector to throw an interactive screen onto a nearby surface (presumably with a camera of some sort to watch the user interacting and feed that back). It's a neat solution, but the thought of thousands of people wandering around projecting images onto each other and every spare surface seems a bit far-fetched (sorry - I don't like to be negative about new ideas, and it really is neat!).
Here are a couple of technologies making their way through the research pipeline that may offer a new way of viewing content sometime in the next 5-10 years. The first option is a bit conventional, in that it's a screen, but rather unconventional, in that it takes the form of a contact lens. In summary, researchers at the University of Washington created a thin-film, biologically safe contact lens containing the lights and circuitry required for a display. They then put it in a rabbit's eye. Quite what the temporarily bionic bunny thought of the new tech is unclear, but a human-suitable equivalent offers fascinating possibilities by building on the capabilities of known technology - how many hundreds of millions of people globally already wear contact lenses?
The image from such a display would probably resemble a head-up display in a car or aeroplane, overlaying information on the wearer's line of sight. If contextual tagging of objects were included, via a wearable camera, then people would instantly (and discreetly) be able to access information about anything they're looking at, surf the web, instant message, or whatever.
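To make the contextual-tagging idea a little more concrete, here's a minimal sketch in Python of what the software loop might look like. Everything in it is hypothetical - the camera, the recogniser, and the lens display are stand-in stubs for hardware and services that don't exist yet - but it shows how thin the software layer on top of them could be.

# A hypothetical sketch of the contextual-tagging loop described above.
# The camera, recogniser, and lens display are stand-in stubs for
# hardware and services that don't exist yet.

import time

class WearableCamera:
    def read(self):
        # A real device would return an image frame here.
        return "frame"

class LensDisplay:
    def overlay(self, text):
        # A real lens would render this in the wearer's field of view.
        print(f"[lens overlay] {text}")

def recognise(frame):
    # Stand-in for an object-recognition model, on-device or in the cloud.
    return "landmark"

def lookup_info(label):
    # Stand-in for a web search or knowledge-base query.
    return f"{label}: relevant information would appear here"

def run(camera, display, cycles=3, interval=0.5):
    # Look, recognise, look up, overlay - repeated.
    for _ in range(cycles):
        label = recognise(camera.read())
        if label:
            display.overlay(lookup_info(label))
        time.sleep(interval)

if __name__ == "__main__":
    run(WearableCamera(), LensDisplay())

The hard parts, obviously, are the hardware and the recognition model, not the glue code in between.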
There are hurdles, of course, primarily the lens's need for a power supply suitable for a device mounted directly in the eye. Nanotechnology and micro-mechanics researchers are beginning to come up with generators that harvest kinetic energy to provide electricity to small devices, but a fully transparent version seems a little far off at this point (I found no patents or papers on such a thing in a brief scan). Induction is another possibility, offering a way of wirelessly powering the device from a power supply worn elsewhere on the body. Incidentally, induction (which is also the means by which near-field payment technologies work) is the most likely way the phone would communicate with the lens. Finally, the body itself is a reasonable conductor of electricity - could we become the copper cables of the future?
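For a feel of what induction is up against, here's a rough back-of-the-envelope sketch. It uses the textbook best-case efficiency for a two-coil inductive link with an optimally matched load, and the coupling coefficient and coil quality factors are purely illustrative guesses, not figures from any real lens.

# Rough back-of-the-envelope for the induction idea. The formula is the
# standard best-case efficiency of a two-coil inductive link with an
# optimally matched load; the k and Q values below are illustrative
# guesses, not measurements.

from math import sqrt

def max_link_efficiency(k, q_tx, q_rx):
    # k: coupling coefficient between the coils (0..1)
    # q_tx, q_rx: quality factors of the transmit and receive coils
    fom = (k ** 2) * q_tx * q_rx                  # link figure of merit
    return fom / (1 + sqrt(1 + fom)) ** 2

# Guesses: very weak coupling to a tiny coil in the lens, modest coil Qs.
k, q_tx, q_rx = 0.01, 100, 50
print(f"Best-case link efficiency: {max_link_efficiency(k, q_tx, q_rx):.1%}")
# With these guesses only around a tenth of the transmitted power reaches
# the lens, so it only ever sees a small fraction of whatever the
# body-worn transmitter sends.

That's not necessarily a deal-breaker - contactless payment cards already run entirely on power harvested from the reader's field - but it does suggest the lens electronics would need to be extremely frugal.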
So, on a rudimentary first glance at least, a contact lens screen looks possible - there are significant hurdles, but they all seem resolvable given time and a bit of ingenuity to integrate existing or imminent technology. This post is getting rather long, so I'll break here, with the promise that next time I'll post about non-screen solutions to machine-human communication. Hope the above was interesting - any comments or thoughts greatly appreciated.