Touchscreen moves out of the Screen

While the world was still ooh-ing and ah-ing over Microsoft Surface for its engaging and intuitive interaction, researchers at Microsoft Research are moving on to yet another interesting interface – touch control, but outside the screen.

Called SideSight, the interface lets you control a phone placed on a table by wiggling your fingers in the space around it. This sidesteps a fundamental constraint of touchscreens – fingers have to fit on the display, which limits how small the screen can shrink before it becomes unusable.

Personally, I see applications for this beyond the phone – how often do you actually set a phone down on a table to use it? But think about things like ultramobile laptops – a virtual trackpad, if you will – and things start getting more interesting.

[via New Scientist]

