… taking the accelerometer (the portrait/landscape detector) and the proximity sensor a bit further and having a Nintendo Wii remote-like capacity to have the phone be fully aware of its 3d orientation and motion vectors. Though I can’t really imagine an application and use for it other than, say, theft detector.
Here you go, Jaanus, some potential applications:
A couple of years ago, I had the pleasure of hearing a presentation by Simon Jones, former Managing Director of the former MIT Media Lab Europe. He spoke on their liminal (marginally perceptible) devices research, among other things.
(As a sidenote, I’m not quite sure what caused MIT to shut their European lab down, but their current website, with all those “former” labels and a notice that the site is of “historic value only,” is really sad. Check out their project listings; a lot was happening… and now it is quite hard to even find publications of their work.)
I remember three practical directions they were playing with in the mobile handset space; this is my subjective interpretation:
- Gesture-based user interfaces, or in other words, complementing what the user does with the phone’s software with what they do with the phone itself. Changing the Apple iPhone’s screen orientation by physically turning the handset, and software responding to the user bringing the phone next to their ear, sort of fall into this category. But you can be more creative with interpreting real-life metaphoric gesture sequences. If you put your phone into your back pocket (as if it were a wallet) and then take it back out, it could automatically log you into your internet bank or launch a mobile payment service. Or, if you want to delete the SMS you have just read on the screen, you could gesture as if tossing away a piece of notepaper.
- “Touch” over distance. If two mobile devices were equipped with pressure-sensitive as well as pressure-emitting sensors around their casings, the communication between their users could include the sense of touch. Gripping a phone more tightly when angry or stressed sends a powerful message. Touching the cheek of your loved one sends something far gentler across. Or use a handshake to start a call. Note that this sort of technology would also dictate the industrial design of mobiles, probably moving from candybars to something more like thin mobile gloves with a soft screen on the back of your hand.
- “Hand-led” navigation. Think of a tourist-filled street in Paris in the summer. An amazing share of those tourists today neglect the opportunity to experience their surroundings in first person. Some look at their mobile, PDA, Blackberry or GPS screens while walking around. A far larger group looks at the world through their photo or video camera’s LCD screen. And if you go for a walk (or use a hop-on-hop-off tour bus) equipped with an audio travel guide, you miss the sounds of birds, emotionally honking traffic and loud Mid-Western American tourists (well, maybe not all that bad?). A prototype gadget the MIT guys built was a GPS-enabled PDA with gyroscopic additions that could make the device generate motoric inertia in any direction, thus effectively “dragging” you by the hand in the right direction on your selected scenic walking route. You go the right way with your eyes and ears still free for the real world.
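To make the gesture idea a bit more concrete: at its simplest, recognising a “toss away” flick comes down to watching for a spike in acceleration magnitude. Here is a minimal sketch; the sample data and the threshold are invented for illustration, and a real handset would of course read live values from its platform’s sensor API rather than a hard-coded list.

```python
import math

# Hypothetical accelerometer samples (x, y, z) in units of g.
# On a real device these would stream in from the sensor API.
samples = [
    (0.0, 0.0, 1.0),   # phone at rest, gravity on the z axis
    (0.1, 0.0, 1.0),
    (1.9, 0.3, 1.2),   # sudden sideways flick
    (2.4, 0.5, 1.1),
    (0.2, 0.1, 1.0),
]

FLICK_THRESHOLD = 1.8  # g; a made-up value tuned by hand for this sketch

def magnitude(sample):
    """Euclidean magnitude of one (x, y, z) acceleration sample."""
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

def detect_flick(samples, threshold=FLICK_THRESHOLD):
    """True if any sample's acceleration magnitude exceeds the threshold."""
    return any(magnitude(s) > threshold for s in samples)

if detect_flick(samples):
    print("flick detected: toss the SMS away")
```

A real recogniser would look at gesture *sequences* over time (and subtract gravity first), but even this crude threshold check shows how little extra hardware the “tossing away a note” interaction actually needs.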
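And for the “hand-led” navigation prototype, the core computation is working out which way the device should tug: the great-circle bearing from the current GPS fix to the next waypoint, compared against the user’s current heading. A rough sketch (the Paris coordinates are only approximate, and the actual haptic nudge is left abstract):

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees from north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360

def nudge_direction(heading_deg, target_bearing_deg):
    """Signed turn the device should 'drag' the user through.

    Negative means tug left, positive means tug right; always in [-180, 180].
    """
    return (target_bearing_deg - heading_deg + 180) % 360 - 180

# Walking near Notre-Dame, next waypoint at the Louvre (rough coordinates).
bearing = bearing_deg(48.8530, 2.3499, 48.8606, 2.3376)
turn = nudge_direction(90.0, bearing)  # the user is currently heading due east
```

The gyroscopic inertia trick from the prototype would then translate that signed turn into a physical pull, leaving the tourist’s eyes free for Paris instead of a map screen.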
The iPhone version one is far from liminal in its current state, but it is a tiny step in that direction. It is still a device you mostly use in an artificial, pre-learned way. (And it is the human user who does the learning.)
It is just really cool that, looking at the iPhone, the Wii and the many force-feedback gaming devices (almost) all on the market today, the futuristic concepts above are just a matter of great integration with each other and with the underlying software. The necessary components are already getting out there.