Ubuntu is creating an interface that responds to user movements in front of the webcam

Original author: Christian Giordano
  • Translation
This article is a translation of a post from the Canonical design blog.

Introduction


With the introduction of products such as the Nintendo Wii, the Apple iPhone, and Microsoft Kinect, developers have finally begun to realize that there are many ways to control a computer besides keyboards, mice, and touch screens. There are plenty of alternatives these days, all of them based on hardware sensors; the main difference lies in how much they depend on software. Solutions based on computer vision (like Microsoft Kinect) rely on software to analyze the images captured by one or more cameras.
If you are interested in the technical side of this, we recommend taking a look at the following projects: Arduino and OpenFrameworks.
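As a rough illustration of what such software has to do, here is a minimal sketch of webcam face detection. It uses OpenCV and its bundled Haar cascade, which are assumptions of this example rather than tools mentioned in the post:

```python
# A minimal sketch of camera-based presence detection with OpenCV's
# bundled Haar cascade (OpenCV is an assumption; the post only points
# to OpenFrameworks and Arduino as starting points).
import cv2

# Load the frontal-face classifier shipped with OpenCV.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    # Each detection is (x, y, w, h); the face width is a rough
    # proxy for how close the user is to the screen.
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("presence", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```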

Use with Ubuntu


During some small research a few months ago, we were thinking about how Ubuntu could behave if it knew more about its physical context: not only by detecting the tilt of the device (as an iPhone application does), but also by analyzing the user's presence.
This was not really a new concept for us: back in 2006 we experimented with user proximity. We believe it is important to adapt the content on the screen to the presence of the person watching it.
We came up with a few scenarios that are still far from being fully developed and defined, but we hope they will at least open up some discussion or, even better, help launch some initiatives.

Full screen mode


If the user moves away from the screen while a video is playing, the video could automatically switch to full-screen mode.
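A crude way to prototype this is to treat the width of the detected face as a proxy for distance: the smaller the face in the frame, the further away the user is. In the sketch below the pixel threshold and the `player` object with its `set_fullscreen` method are hypothetical:

```python
# Sketch: using the detected face width as a crude distance proxy to
# decide when a playing video should go full screen. The threshold and
# the `player` object (with `set_fullscreen`) are hypothetical.
FAR_FACE_WIDTH_PX = 90   # faces narrower than this ~ user sat back or walked away

def update_fullscreen(player, faces):
    """faces: list of (x, y, w, h) boxes from the detector above."""
    if not faces:
        return  # nobody visible; leave the player as it is
    widest = max(w for (_, _, w, _) in faces)
    # A small face in the frame means the user is far away -> go full screen.
    player.set_fullscreen(widest < FAR_FACE_WIDTH_PX)
```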



Full Screen Notifications


If the user is not in front of the screen, notifications could be shown in full-screen mode, so they can be read from various places in the room, even far away from the monitor.
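One way to sketch this behaviour is to track how long it has been since a face was last detected and escalate notifications once the user appears to be away. The timeout below is an assumption, and the full-screen rendering itself is not shown; `notify-send` is only used as a stand-in for the real notification path:

```python
# Sketch: escalating notifications when no face has been seen for a
# while. The timeout is an assumption; notify-send is the standard
# Ubuntu CLI and stands in for a real full-screen notification.
import subprocess
import time

ABSENT_AFTER_S = 10          # treat the user as away after 10 s without a face
last_seen = time.monotonic()

def notify(summary, body, faces):
    global last_seen
    if faces:
        last_seen = time.monotonic()
    away = time.monotonic() - last_seen > ABSENT_AFTER_S
    if away:
        # User is away: make the notification as prominent as possible
        # (a real implementation would draw it full screen instead).
        subprocess.run(["notify-send", "--urgency=critical", summary, body])
    else:
        subprocess.run(["notify-send", summary, body])
```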


Parallax


Since this is the year of 3D screens, we could not leave out a parallax effect on the windows. It could also be triggered by a user gesture.
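For a head-tracking parallax, the horizontal position of the face can be mapped to a per-layer offset, with closer window layers shifting more than distant ones. The frame width, layer depths, and maximum shift in the sketch below are all assumptions:

```python
# Sketch: turning the horizontal face position into a parallax offset
# for window layers. Frame width, depths, and max shift are assumptions.
FRAME_W = 640          # capture width used by the detector above
MAX_SHIFT_PX = 30      # how far the front-most layer may slide

def parallax_offsets(faces, depths=(1.0, 0.6, 0.3)):
    """Return a horizontal pixel shift for each window layer.

    depths: 1.0 = closest layer (moves most), smaller = further back.
    """
    if not faces:
        return [0 for _ in depths]
    x, _, w, _ = max(faces, key=lambda f: f[2])      # largest face
    # Head position relative to the frame centre, normalised to [-1, 1].
    offset = ((x + w / 2) - FRAME_W / 2) / (FRAME_W / 2)
    return [int(-offset * MAX_SHIFT_PX * d) for d in depths]
```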

Prototype



