
How I wanted to make Ambilight for a mobile phone and what came of it
This is the story of how I tried to make Ambilight for my phone (an HTC Hero with Android on board). Just in case, a reminder: Ambilight is a TV backlighting technology driven by the picture on screen. According to Philips, which promotes it, Ambilight improves the perception of films in the dark and reduces eye strain.
jeck_landin
All we need is a phone and a little programming. First of all, we learn to grab a screenshot from the device using ddmlib.jar from $SDK_PATH/tools:

// Initialize ADB and take the first attached device
AndroidDebugBridge.init(false);
AndroidDebugBridge bridge = AndroidDebugBridge.createBridge();
IDevice[] devices = bridge.getDevices();
IDevice device = devices[0];
RawImage rawImage = device.getScreenshot();

Here lies the first pitfall: very often this works poorly, and the best we can hope for is 2-3 frames per second.
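The RawImage holds the raw framebuffer bytes, and how to decode them depends on rawImage.bpp (ddmlib's RawImage.getARGB can also do this for you). Assuming a 16-bit RGB565 framebuffer, which was common on devices of that era, a minimal decoder might look like this sketch (the class name and byte order are my assumptions, not part of the original code):

```java
public class Rgb565Decoder {
    // Decode one 16-bit RGB565 pixel (two little-endian bytes) into 0xAARRGGBB.
    // Assumption: the framebuffer really is RGB565 -- check rawImage.bpp first.
    public static int toArgb(byte lo, byte hi) {
        int v = ((hi & 0xff) << 8) | (lo & 0xff);
        int r = (v >> 11) & 0x1f;
        int g = (v >> 5) & 0x3f;
        int b = v & 0x1f;
        // Expand 5/6-bit channels to 8 bits by bit replication
        r = (r << 3) | (r >> 2);
        g = (g << 2) | (g >> 4);
        b = (b << 3) | (b >> 2);
        return 0xff000000 | (r << 16) | (g << 8) | b;
    }

    public static void main(String[] args) {
        // 0xF800 is pure red in RGB565; bytes arrive low byte first
        System.out.println(Integer.toHexString(toArgb((byte) 0x00, (byte) 0xF8)));
        // prints ffff0000
    }
}
```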
Next, we follow a simple algorithm:
1) Build a quantized color palette of the current frame; for every color we apply the conversion:

int color = ...
int r = (color >> 16) & 0xff;
int g = (color >> 8) & 0xff;
int b = color & 0xff;
// Snap each channel down to the nearest multiple of 16,
// collapsing 24-bit color into at most 4096 buckets
r -= (r % 16);
g -= (g % 16);
b -= (b % 16);
Color quantedColor = new Color(r, g, b);

2) Find the most frequently used color.
3) By rights there should be a long story here about how I hooked up all sorts of LEDs and killed a weekend on it, but there won't be one :) Instead I used the backlight closest to hand: a laptop screen. So, here is what came out in the end:
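Putting steps 1 and 2 together, the dominant-color computation fits in a couple of dozen lines. Here is a self-contained sketch (the class name and hard-coded pixel values are mine, for illustration only):

```java
import java.util.HashMap;
import java.util.Map;

public class DominantColor {
    // Quantize a 24-bit color by snapping each channel down to a multiple of 16
    static int quantize(int color) {
        int r = ((color >> 16) & 0xff) / 16 * 16;
        int g = ((color >> 8) & 0xff) / 16 * 16;
        int b = (color & 0xff) / 16 * 16;
        return (r << 16) | (g << 8) | b;
    }

    // Step 1: build the quantized palette with counts; step 2: take the maximum
    static int dominant(int[] pixels) {
        Map<Integer, Integer> counts = new HashMap<>();
        for (int p : pixels) {
            counts.merge(quantize(p), 1, Integer::sum);
        }
        int best = 0, bestCount = -1;
        for (Map.Entry<Integer, Integer> e : counts.entrySet()) {
            if (e.getValue() > bestCount) {
                best = e.getKey();
                bestCount = e.getValue();
            }
        }
        return best;
    }

    public static void main(String[] args) {
        // Three nearly identical reds and one blue: the red bucket wins
        int[] pixels = {0xC81010, 0xC81515, 0xCF1F1F, 0x0000FF};
        System.out.printf("#%06X%n", dominant(pixels));
        // prints #C01010
    }
}
```

Because the quantization folds similar shades into one bucket, small noise in the frame does not flip the chosen backlight color from frame to frame.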
The practicality of such a solution is somewhat questionable :), but the main goal, to make a cool thing, has been achieved. If you like, you can take the idea further: for example, split the image into 4-6 parts and light each segment independently. You could adapt an Arduino board to drive the LEDs, or build a Lego Mindstorms robot to wave them around. In short, there are plenty of possibilities.
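The multi-segment idea could be sketched like this: split the frame into vertical strips and compute one color per strip (I use a simple per-channel average here instead of the palette approach, purely to keep the sketch short; the class name and test frame are mine):

```java
public class SegmentColors {
    // Average color of each of n vertical strips of a width*height RGB frame
    static int[] stripColors(int[] pixels, int width, int height, int n) {
        int[] result = new int[n];
        for (int s = 0; s < n; s++) {
            int x0 = s * width / n, x1 = (s + 1) * width / n;
            long r = 0, g = 0, b = 0, count = 0;
            for (int y = 0; y < height; y++) {
                for (int x = x0; x < x1; x++) {
                    int p = pixels[y * width + x];
                    r += (p >> 16) & 0xff;
                    g += (p >> 8) & 0xff;
                    b += p & 0xff;
                    count++;
                }
            }
            result[s] = (int) ((r / count) << 16 | (g / count) << 8 | (b / count));
        }
        return result;
    }

    public static void main(String[] args) {
        // A 4x2 frame: left half red, right half blue, split into 2 strips
        int[] frame = {
            0xFF0000, 0xFF0000, 0x0000FF, 0x0000FF,
            0xFF0000, 0xFF0000, 0x0000FF, 0x0000FF,
        };
        int[] strips = stripColors(frame, 4, 2, 2);
        System.out.printf("#%06X #%06X%n", strips[0], strips[1]);
        // prints #FF0000 #0000FF
    }
}
```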
At this point it suddenly dawned on me that I was fussing over trifles: manufacturers could implement a real Ambilight for mobile phones themselves by giving some experimental model a translucent matte case. The same Hero might then look something like this:

The benefit is obvious: a phone with such an unusual feature would certainly attract attention and find plenty of new uses in everyday life. Whoever is in charge of innovation out there, go for it :)
