CES 2015 through the eyes of a programmer

    The Consumer Electronics Show (CES) attracts 3,000+ exhibiting companies and 150-160 thousand visitors every year, spread across several venues. The Las Vegas Convention Center (LVCC) is considered the main one: corporations like Samsung and Sony occupy several floors there. Yet what struck me at CES this year was not the huge curved mirrors (sorry, TVs), not the flocks of self-coordinating drones, not the weightless laptops from the big corporations, but the small companies doing small things with great potential, tucked into the (compared to the LVCC) modest Sands expo hall.



    I don't think it's an exaggeration to say that an IoT explosion happened this year. Where IoT used to huddle modestly on a few scattered stands, it now took over almost an entire exhibition complex, with dozens of smart watches, robots that can run the house and even cook, 3D printers, fitness devices, meters for muscle mass, body fat and bone density, smart beds, all kinds of wearables and even hearables. IDC predicts that by 2020 the IoT market will already be measured in trillions.

    All this is, of course, very interesting to us as consumers, but what does it mean for us as programmers?


    1. Data from all these devices will need to be collected and sent to a server. Over BLE, ZigBee, AllJoyn™, etc., the data will first reach some smarter device with a bigger battery (or mains power) or a phone - let's call it a gateway - which in turn talks to the server over the Internet. Frameworks like DeviceHive (shameless plug) and openHAB will help by simplifying communication with devices, but programmers will still have plenty of work to do.
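    To make the gateway role concrete, here is a minimal sketch of a gateway loop that polls local sensors and forwards batched readings to a server. It is not tied to DeviceHive or openHAB; the endpoint URL, device names and payload format are made-up placeholders.

```python
import json
import time
import urllib.request

SERVER_URL = "https://example.com/api/telemetry"  # hypothetical ingestion endpoint


def read_local_sensors():
    """Stand-in for BLE/ZigBee polling; a real gateway would talk to the radio stack here."""
    return [
        {"device_id": "thermo-livingroom", "metric": "temperature", "value": 21.4},
        {"device_id": "fridge-1", "metric": "temperature", "value": 4.2},
    ]


def forward(readings):
    """POST one batch of readings to the server as JSON and return the HTTP status."""
    payload = json.dumps({"ts": int(time.time()), "readings": readings}).encode("utf-8")
    req = urllib.request.Request(
        SERVER_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status


if __name__ == "__main__":
    while True:
        print("server replied:", forward(read_local_sensors()))
        time.sleep(60)  # batch and send once a minute to spare the gateway's battery
```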

    2. The data will need to be stored and analyzed somewhere, and there will be not just a lot of it, but a LOT. Today a couple of billion people are connected to the Internet, and the number of tweets, posts, photos and site clicks each of them can generate is limited.

    Devices, however, will number in the tens of billions (each of us will have several), and they will stream to the "clouds" the temperature in every room and in the refrigerator, air humidity, car metrics, cardiograms, blood pressure, oxygen saturation, glucose, hemoglobin, not to mention the banal step counts and other activity. And that is only what comes to mind and what is being done already; this list will undoubtedly keep growing.

    As a result, the very notion of Big Data becomes commonplace and dissolves - everything turns out to be Big Data. Data will arrive as streams, and users will want to see results in real time. Grandfather Hadoop, already on his last legs, will step aside for technologies like Spark + Spark Streaming; queues like Kafka and RabbitMQ will become more relevant, as will asynchronous technologies in general, such as Akka and Node.js.
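    For a feel of what that looks like in code, here is a minimal Spark Streaming sketch that computes per-device temperature averages over a sliding window. The input format is an assumption (one "device_id value" line per reading arriving on a TCP socket); in a real deployment the source would more likely be a Kafka topic.

```python
from pyspark import SparkContext
from pyspark.streaming import StreamingContext

sc = SparkContext(appName="SensorAverages")
ssc = StreamingContext(sc, batchDuration=5)  # 5-second micro-batches

# Hypothetical source: lines like "thermo-livingroom 21.4" arriving on a socket.
lines = ssc.socketTextStream("localhost", 9999)

# Parse into (device_id, (value, count)) pairs.
pairs = lines.map(lambda line: line.split()) \
             .map(lambda parts: (parts[0], (float(parts[1]), 1)))

# Sum values and counts over a 60-second window, sliding every 10 seconds.
windowed = pairs.reduceByKeyAndWindow(
    lambda a, b: (a[0] + b[0], a[1] + b[1]),
    None, 60, 10)

# Turn the (sum, count) pairs into per-device averages and print each batch.
averages = windowed.mapValues(lambda s: s[0] / s[1])
averages.pprint()

ssc.start()
ssc.awaitTermination()
```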

    NoSQL databases like Cassandra will spread even further, and DBAs and scaling specialists will have plenty to do. If you envy the engineers at Apple (75,000 Cassandra nodes, 10 PB of data) or Netflix (2,500 nodes, 420 TB), that kind of power will soon be in your hands too. Data scientists, already in short supply, will be needed even more. Some of my friends over 30 have dusted off their schoolbags and gone back for a second degree. You can start with Coursera at the very least.
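    As an illustration of what the storage side might look like, here is a sketch using the DataStax Python driver: a time-series table partitioned by device and day, which is the usual Cassandra pattern for sensor data. The keyspace, table and column names are made up for the example.

```python
from datetime import datetime
from cassandra.cluster import Cluster

# Hypothetical single-node local cluster; production would list real contact points.
session = Cluster(["127.0.0.1"]).connect()

session.execute("""
    CREATE KEYSPACE IF NOT EXISTS iot
    WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}
""")

# Partition by (device, day) so one partition never grows unbounded;
# cluster by timestamp so the newest readings come back first.
session.execute("""
    CREATE TABLE IF NOT EXISTS iot.readings (
        device_id text,
        day       text,
        ts        timestamp,
        metric    text,
        value     double,
        PRIMARY KEY ((device_id, day), ts)
    ) WITH CLUSTERING ORDER BY (ts DESC)
""")

now = datetime.utcnow()
session.execute(
    "INSERT INTO iot.readings (device_id, day, ts, metric, value) VALUES (%s, %s, %s, %s, %s)",
    ("thermo-livingroom", now.strftime("%Y-%m-%d"), now, "temperature", 21.4),
)
```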

    3. Naturally, we will be asked to develop new applications for managing this zoo of devices and for visualizing the data. Success, it seems, awaits those who figure out how to orchestrate and integrate everything so that users don't have to juggle dozens of separate apps, one per smart ashtray and toaster, and who can offer capabilities that no single device or app provides on its own. For example, when I toss and turn in the morning, I want the curtains to open and gently wake me during a phase of light sleep within the last 30 minutes before my alarm is due...
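    That curtain example is really just a rule over two data sources. A minimal sketch of such a rule is below; every device call here is a hypothetical placeholder, since the point is the orchestration logic, not any particular vendor's API.

```python
from datetime import datetime, timedelta

ALARM = datetime(2015, 1, 9, 7, 30)   # when the alarm is due to ring
WINDOW = timedelta(minutes=30)        # allow a gentle wake-up within the last 30 minutes


def current_sleep_phase():
    """Placeholder for a sleep-tracker reading; would return 'light', 'deep' or 'rem'."""
    return "light"


def open_curtains():
    """Placeholder for a curtain-controller command sent via the home gateway."""
    print("curtains: open")


def check_wakeup_rule(now=None):
    """Open the curtains if we are in the pre-alarm window and sleep is light."""
    now = now or datetime.now()
    if ALARM - WINDOW <= now <= ALARM and current_sleep_phase() == "light":
        open_curtains()


# A real orchestrator would evaluate this rule on a schedule, e.g. once a minute.
check_wakeup_rule()
```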

    We've drifted a little from the exhibition itself, but it was important to convey how pivotal the moment is. What was dreamed up back in the 19th century and has been talked and written about so much lately - robots, smart homes and so on - is now finally becoming reality. Don't believe the 19th-century part? See Nikola Tesla's 1898 patent for remote control :). And now, in 2015, at the show a BMW representative used a Samsung Gear S to order the new i3 to drive forward, and then fearlessly jumped in front of it to demonstrate the collision-avoidance system. Video here.

    But back to CES. Here is a list of companies and products that impressed me or that I simply liked.

    LifeQ





    LifeQ spent four years quietly working on its technology and only came out of the shadows at CES. By combining continuous monitoring of physiological parameters with biomathematical modeling (also known as computational systems biology), the LifeQ folks intend to completely change what we know about ourselves and how we make decisions.

    I'm only superficially familiar with systems biology and naively believed that this young field was still in its infancy, and that applying it in real life, let alone in commercial products, was not yet possible. Apparently, though, "the technology is classified, the scientists may simply not have known." I was also amazed to learn that Cape Town, South Africa, where half of LifeQ's scientists work, is a major research hub for computational systems biology. Live and learn.

    Here is how it will work:

    1. Device manufacturers integrate LifeQ LENS technology into various fitness devices. Using a multi-wavelength optical sensor, the device takes primary measurements from the body.

    2. LifeQ CORE - the company's main innovation - takes those primary readings, models the physiological systems and computes additional indicators that cannot be measured directly (without laboratory analysis); a toy sketch of this idea follows after the list.

    3. Through the open LifeQ LINK platform, developers integrate the technology into existing applications or build new ones, giving ordinary people insight into what is really going on in their bodies.
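    To illustrate step 2 in the crudest possible way (this has nothing to do with LifeQ's actual models), here is a toy example of turning a raw optical PPG trace into a derived metric: counting pulse peaks to estimate heart rate. All signal values are synthetic.

```python
import math


def estimate_heart_rate(ppg, sample_rate_hz):
    """Toy stand-in for deriving a metric from a raw optical signal:
    count upward mean-crossings in a PPG trace and convert to beats per minute."""
    mean = sum(ppg) / len(ppg)
    peaks = 0
    above = False
    for sample in ppg:
        if sample > mean and not above:   # rising edge crossing the mean
            peaks += 1
            above = True
        elif sample <= mean:
            above = False
    duration_s = len(ppg) / sample_rate_hz
    return 60.0 * peaks / duration_s


# 10 seconds of a fake 1.2 Hz pulse sampled at 50 Hz -> about 72 bpm
fake_ppg = [math.sin(2 * math.pi * 1.2 * t / 50) for t in range(500)]
print(round(estimate_heart_rate(fake_ppg, 50)))
```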

    Ozobot




    This two-centimeter baby drew more attention at CES than robots the size of a teenager. And for good reason: not only is it cute, it also tackles an important problem - making programming more accessible. The "programming skills for everyone!" theme has become incredibly popular lately. Simplified languages like Scratch and visual environments like Blockly are popping up like mushrooms after rain, online courses and "academies" are multiplying, and Barack Obama writes JavaScript during the Hour of Code.

    And then Evollve, the company behind Ozobot, did something brilliant: they figured out how to program by drawing! Now not only Barack Obama but even George W. Bush will be able to do it, to say nothing of primary-school children.

    How it works: you draw lines with a marker on paper (or with a finger on an iPad); solid lines form the road that Ozobot will follow, and combinations of colors give it various commands such as "stop", "go faster", "turbo mode", and so on. Once you have drawn the road-program and placed the robot on it, it starts executing the instructions on its own.
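    Conceptually, the color sequences along the line are just an instruction stream that the robot decodes as it drives over them. Here is a small sketch of that decoding idea; the color-to-command table below is invented for illustration and is not Ozobot's real code set.

```python
# Hypothetical mapping from a short color sequence to a command; Ozobot's real
# code table differs, this only illustrates the decoding idea.
COLOR_CODES = {
    ("red", "black", "red"):   "stop",
    ("blue", "green", "blue"): "go faster",
    ("green", "red", "green"): "turbo mode",
}


def decode(colors, window=3):
    """Slide over the colors seen along the line and emit a command for each match."""
    commands = []
    for i in range(len(colors) - window + 1):
        cmd = COLOR_CODES.get(tuple(colors[i:i + window]))
        if cmd:
            commands.append(cmd)
    return commands


print(decode(["black", "blue", "green", "blue", "black", "red", "black", "red"]))
# -> ['go faster', 'stop']
```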


    (The original post included an image listing all of the supported color-code commands.)

    JINS MEME



    Until recently the Japanese company JINS made "designer" frames and excellent aspherical (reality-non-distorting) lenses. But that wasn't enough for them, and they decided to make their glasses smart.

    By adding three-point electrooculography and a six-axis accelerometer and gyroscope, they got the smart JINS MEME glasses, which can not only determine gaze direction and blink timing but also tell whether we are focused or sleepy and how steadily we stand and walk.



    The most interesting part for us, of course, is the API and SDK that JINS MEME plans to release in March of this year. According to their Developer Portal, the API will support two modes: "real-time" and "standard".

    In real time, we get:

    • Eyes: eye movements, blinking times, blinking speed, blinking power (wow!).
    • Body: position / balance, speed, acceleration.
    • Device: battery information, rating, how correctly the glasses are worn, error information.

    In standard mode, statistics on the same metrics will be available every minute, plus:

    • Drowsiness level.
    • Level of focus.
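    The SDK isn't out yet, so purely as a sketch of what a consuming app might do with such a stream, here is a toy drowsiness check. The field names and threshold values are guesses of mine, not the actual JINS MEME API.

```python
# Hypothetical shape of one real-time sample from the glasses; field names are
# placeholders, the real SDK may expose different ones.
sample = {
    "blink_speed": 0.4,      # blinks tend to get slower when you're drowsy
    "blink_strength": 0.2,
    "head_pitch": -12.0,     # head nodding forward, in degrees
}


def looks_drowsy(s):
    """Crude rule of thumb: slow, weak blinks plus a drooping head."""
    return s["blink_speed"] < 0.5 and s["blink_strength"] < 0.3 and s["head_pitch"] < -10


if looks_drowsy(sample):
    print("Take a break before you drive!")
```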




    Opticwash



    To round off the glasses topic, let me present a brilliant creation from Opticwash - a "car wash" for glasses!



    They have no SDK or API :), but if your glasses, like mine, are constantly getting dirty, this washer is a very useful device. And it's a pleasure to watch, too. I risked my glasses and didn't regret it - you just can't get them that perfectly clean by hand. Watch the video:



    It can also wash jewelry, but I was somehow afraid to test it on my wedding ring. If something went wrong, what would I say at home?! "Imagine, dear, there was this machine at the show that washes rings, and it swallowed mine..."
