Oriense: developing devices to help the blind and visually impaired

The history of the project goes back to 2006, when an active member of the blind community approached a St. Petersburg research institute with a proposal to create a device to help the blind and visually impaired.
One of the founders of what is now Oriense headed the department there, which worked on robot vision and had developed its own stereo camera. It was decided to base the device on it: a wearable computer processes data from two cameras and two ultrasonic sonars and delivers guidance to headphones through a speech synthesizer.



A separate Altera-based stereo processor was supposed to build the depth map, but that never materialized; the chip only handled stereo capture. The on-board software built a point cloud, segmented it into objects, classified them in a simple way, assessed how dangerous each one was, and issued warnings.
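As an illustration, here is a minimal sketch (not Oriense's actual code) of the pipeline just described: back-project a depth map into a point cloud, segment it with a crude ground-plane grid, rank the resulting objects by distance, and turn the nearest one into a spoken warning. The camera intrinsics, grid size, and distance thresholds are assumed values chosen for illustration.

    import numpy as np

    DANGER_M = 1.0  # assumed distance below which an obstacle is urgent
    WARN_M = 2.5    # assumed distance below which an obstacle is mentioned at all

    def depth_to_points(depth, fx=575.8, fy=575.8, cx=319.5, cy=239.5):
        """Back-project a depth image (meters) into a 3-D point cloud,
        using typical PrimeSense-class intrinsics (assumed values)."""
        h, w = depth.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        x = (u - cx) * depth / fx
        y = (v - cy) * depth / fy
        pts = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
        return pts[pts[:, 2] > 0]  # drop pixels with no depth reading

    def segment_objects(points, cell=0.25, min_pts=50):
        """Crude segmentation: bucket points into cells of a horizontal grid
        (x = left/right, z = forward); each populated cell is one object."""
        cells = np.floor(points[:, [0, 2]] / cell).astype(int)
        objects = []
        for key in {tuple(c) for c in cells}:
            mask = np.all(cells == key, axis=1)
            if mask.sum() >= min_pts:
                objects.append(points[mask])
        return objects

    def warn(objects):
        """Pick the nearest object, grade its danger by distance, and return
        the phrase the speech synthesizer would read into the headphones."""
        for obj in sorted(objects, key=lambda o: o[:, 2].min()):
            dist = obj[:, 2].min()
            side = "left" if obj[:, 0].mean() < 0 else "right"
            if dist < DANGER_M:
                return f"Stop! Obstacle {dist:.1f} meters ahead on the {side}."
            if dist < WARN_M:
                return f"Obstacle {dist:.1f} meters on the {side}."
        return None

    # Synthetic test frame: a flat wall 2 meters in front of the camera.
    depth_map = np.full((480, 640), 2.0)
    print(warn(segment_objects(depth_to_points(depth_map))))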



A mock-up was built and the project attracted a fair amount of attention, but the execution and functionality fell short, momentum was lost, and the project froze without ever reaching users.

In 2010 work resumed, but in the end we barely moved forward, only slightly modernizing the mock-up. It became completely clear that bringing such a device to the point of real usefulness would require a great deal of time and money, and that the approach had to change: like many teams that tried to build something similar, we had started from the available technologies rather than from the real needs of users. As a result, the project was shelved again, apart from my experiments with image recognition as part of my dissertation, supported by graduate grants from the St. Petersburg government.



In 2012 we made a third attempt, this time in the form of bachelor's and master's thesis projects entered in the Microsoft Imagine Cup. The stereo camera was replaced with a Kinect (attached to a construction helmet with tape), and the overhauled software was extended with recognition of objects, text, and faces.



We won the St. Petersburg stage, and at the national finals had to settle for a special prize from Intel. Then came BIT (the "Business of Innovative Technologies" competition), where we took second place plus a special prize, a trip to Slush. Then the SumIT startup school, where we caught the attention of Sergey Fradkov and Mikhail Averbakh, managing partners of the iDealMachine startup accelerator. We got into the accelerator, received $20,000 in pre-seed funding from RSV Venture Partners, and became the startup Oriense. That was November 2012.

iDealMachine gave us a proper development process: we finally went directly to blind and visually impaired people with our questions and defined the problem and the required functionality. The prototype was brought to a demo-ready state, with the Kinect replaced by a chest-mounted PrimeSense Carmine. The unit was finally tested by blind users at the center for medical and social rehabilitation of the visually impaired.



Then we plunged into the usual startup chores: looking for financing for the next stage, resolving legal and organizational issues, and pitching a lot (including at DEMO Europe and the hardware VC Day at Ingria). In June 2013 we received a state grant under the START program of the Assistance Fund, expanded the team a bit, purchased components, and finally got down to serious development. We are now completing the first version of the product, for which we have a small pre-order from the Edukor educational center, which already has experience deploying technology for people with disabilities: a classroom equipped with devices for hands-free computer control.

Oriense-1 is still based on the PrimeSense sensor, so it only works indoors, but it lets us road-test the local navigation software and the design of the wearable device. An Odroid-U2 serves as the on-board computer; the case is now being finalized.

In parallel, we are working on a stereo-camera-based 3D sensor. We had planned to use Etron's stereo-vision chip, but negotiations with them reached an impasse. Based on the Hardkernel board with the Exynos 4412 SoC, we are developing our own single-board computer that will integrate all the sensors: stereo cameras, GPS/GLONASS, a 9-axis orientation sensor, and an optional 3G modem. The stereo matching may be computed on Adapteva's new 16-core Epiphany processors, the ones used in the Parallella board. That would let us build a single-module device with full functionality: local navigation, global navigation, text reading, object recognition, and so on.

We also wanted to release the GPS navigator as a separate device, based on a Raspberry Pi and a sensor board that is currently being laid out. The software was planned to be open source, built on OpenStreetMap data, running on Linux, and later folded into the full-featured device. Admittedly, there is no progress on this software yet.
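For a sense of what that navigator's core logic would involve, here is a minimal sketch, assuming waypoints derived from OpenStreetMap: it computes the haversine distance and bearing from the user's position to a waypoint, then converts them into the clock-face phrasing commonly used when giving directions to blind users. The coordinates, heading, and waypoint name are hypothetical.

    import math

    def distance_bearing(lat1, lon1, lat2, lon2):
        """Great-circle distance (meters) and initial bearing (degrees)
        from point 1 to point 2, via the haversine formula."""
        R = 6371000.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        dist = 2 * R * math.asin(math.sqrt(a))
        y = math.sin(dl) * math.cos(p2)
        x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
        bearing = (math.degrees(math.atan2(y, x)) + 360) % 360
        return dist, bearing

    def announce(lat, lon, heading, wp_lat, wp_lon, wp_name):
        """Build the phrase a speech synthesizer would read: directions are
        given on a 12-hour clock face relative to the user's heading."""
        dist, bearing = distance_bearing(lat, lon, wp_lat, wp_lon)
        rel = (bearing - heading + 360) % 360
        hour = round(rel / 30) % 12 or 12
        return f"{wp_name}: {dist:.0f} meters at {hour} o'clock."

    # Example: walking north in St. Petersburg toward a hypothetical waypoint.
    print(announce(59.9343, 30.3351, 0.0, 59.9360, 30.3330, "Entrance"))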

The most popular GPS navigation tool for the blind used to be the LoadStone GPS application for Symbian, but with the death of that platform its users are being forced onto smartphones. Based on the OsmAnd application for Android, a version for the blind called OsmAnd Access has been developed in Russia, incorporating part of LoadStone's functionality. A big drawback of a smartphone is the lack of a physical keyboard, so we are now talking with the OsmAnd Access developers about building a dedicated navigator for it that has one.



On November 1-2 we will be in Moscow at the Open Innovations Forum (thanks to the Russian Startup Rating giving us its highest grade, AAA), and on November 6-7 at the Webit Congress in Istanbul, with our own stand.

We are based in St. Petersburg. Our site is oriense.com, our email is info@oriense.com, and we also have a Twitter account, a YouTube channel, and the latest version of our presentation online.

Thank you for your attention; we will be glad to hear your feedback.
