Virtual Reality on Windows Phone with Unity3d

    At the last Game Developers Conference there was a lot of news related to virtual reality: new devices such as Microsoft HoloLens, the rivalry between Oculus Rift and Project Morpheus, the announcement of SteamVR. All of this suggests that the topic is very interesting and hot. Although the concept of virtual reality covers a lot of things, technologically it is first of all glasses or a helmet that shows a stereoscopic image and reacts to head movement. Many of us would like to experiment in this area, but without a device such as the Oculus Rift this is difficult. Fortunately, there are technologies that can turn your Windows Phone into a virtual reality helmet.



    Where to begin


    The most important thing is, of course, a helmet, or glasses if you like. They need to hold the phone in front of your eyes. You can use Google Cardboard or make a pair yourself out of cardboard, acrylic and plastic.
    Once you have made the glasses or helmet, in short, something that holds the phone in front of your eyes, the question arises of what needs to be done to immerse yourself in that very virtual reality:
    1. Create a stereoscopic picture
    2. Change it depending on the position of the head
    3. Interact with the virtual world
    Unity3d 4.6 and Windows Phone will be used to create our application. In the finished example, all settings have already been made; within this article I would like to point out the most important steps.

    Creating a Unity3d Project


    Create an empty Unity3d project, add any 3D objects to your taste, and add lighting. Alternatively, you can use the ready-made free Medieval Home package: import it and open the example scene (Scene / Sample Scene).


    Next, configure the export of the project. To do this, go to the Edit / Project Settings / Player menu and select the Windows Store icon.


    The most important item is the “TestCertificate” section. Click on the Create button and create a new certificate. Unfortunately, there is a small bug in Unity: the Create dialog does not work in some cases. To work around this problem, start Visual Studio 2013, create a new Windows Phone 8.1 project, open the folder where that project is located and find the *.pfx file. Rename it to WSATestCertificate.pfx and copy it to the Assets folder of the current Unity project.
    Next, configure the build. Go to the File / Build Settings menu, switch the platform to Windows Store and set the build values for Windows Phone 8.1 (the target of this article):


    Do not forget to first save the current scene (File / Save Scene) and click the “Add Current” button in this dialog so that the starting scene is included in the resulting project. After you click the Build button, a Visual Studio 2013 project will be created; to verify it, open it in Visual Studio, compile it and run it on the device.

    Create a stereoscopic picture


    The first thing to do in the project is to create a stereoscopic image. This is very easy to do, since we have a three-dimensional scene that can be rendered separately for each eye.
    Locate the First Person Controller in the project and duplicate the main camera (right mouse button, Duplicate).
    You should end up with the following hierarchy:
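    The eye names below are assumptions used by the scripts later in the article (the code only looks up "LeftEye" by name), so rename the cameras accordingly:

    First Person Controller
        LeftEye      (the original Main Camera, renamed)
        RightEye     (the duplicated camera, renamed)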

    Next, for each camera you must set the properties so that it renders into only half of the screen, the so-called viewport: the left half for the left eye and the right half for the right eye:
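    If you prefer to set this up from code rather than in the Inspector, the same split can be expressed through Camera.rect; a minimal sketch, assuming the camera names above:

    // Viewport rectangles are normalized: x, y, width, height in the 0..1 range.
    var left = GameObject.Find("LeftEye").GetComponent<Camera>();
    var right = GameObject.Find("RightEye").GetComponent<Camera>();
    left.rect = new Rect(0.0f, 0.0f, 0.5f, 1.0f);  // left half of the screen
    right.rect = new Rect(0.5f, 0.0f, 0.5f, 1.0f); // right half of the screen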

    But of course this is not all: a stereo image cannot be obtained this way yet, because the pictures displayed by the two cameras are identical.
    For the stereo effect to appear, parallax must be introduced between the cameras.
    Add a new C# script to the project:

    Name it FPSVRBehavior.cs and, after it is created, drag it onto the “First Person Controller” so that the script is attached to this object.

    Now you can get on with creating parallax between the cameras. In the Start() method of the FPSVRBehavior class we simply offset the “left eye” from the main “parent” object by a small amount:
    // Shift the left-eye camera slightly off the parent
    // so the two cameras see the scene from different positions.
    var leftEye = GameObject.Find("LeftEye");
    leftEye.transform.Translate(Vector3.left * 0.1f);
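
    For reference, the whole script at this stage is tiny. A minimal sketch of FPSVRBehavior.cs, assuming the camera was renamed “LeftEye” as described above:

    using UnityEngine;

    public class FPSVRBehavior : MonoBehaviour
    {
        void Start()
        {
            // 0.1 world units of eye separation is a magic value;
            // tune it to the scale of your scene.
            var leftEye = GameObject.Find("LeftEye");
            leftEye.transform.Translate(Vector3.left * 0.1f);
        }
    }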
    


    The stereo image is ready: create a new build, deploy it to your phone and enjoy the stereo picture in your virtual glasses! True, the image is static, whereas real helmets track the position of the head and change the picture on the screen accordingly.

    We take into account the position of the head


    Windows Phone devices have a gyro sensor, and it can be used for our purposes. It only remains to learn how to use the data this sensor returns.

    A small detour from the main topic: writing a plugin for Unity


    As you know, Unity3d is a cross-platform application development tool. For all its versatility, some device-specific APIs are not available from it directly. To get to the gyro sensor readings, we will have to prepare a plugin. Without going into details, a plugin is two DLLs with the same name, located in the Assets/Plugins and Assets/Plugins/Metro directories.

    The WindowsPhoneVRController.dll file located in the /Assets/Plugins directory is a regular .NET Framework 3.5 assembly, used while working in the Unity editor.
    The file of the same name located in the /Assets/Plugins/Metro directory is a Windows Store 8.1 class library; the Unity environment uses it when generating the Visual Studio solution, so that our project gets the necessary functionality on the target platform. You can read more about how to create plugins on my blog.
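    The resulting layout inside the Unity project therefore looks like this:

    Assets/
        Plugins/
            WindowsPhoneVRController.dll      (.NET 3.5, used in the Unity editor)
            Metro/
                WindowsPhoneVRController.dll  (Windows Store 8.1, used on the device)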
    The source code for both DLLs is the same:
    #if NETFX_CORE
    using System;                   // TimeSpan
    using Windows.Devices.Sensors;  // Gyrometer
    #endif

    namespace WindowsPhoneVRController
    {
        public class Controller
        {
    #if NETFX_CORE
            // The gyrometer is only available when compiled for the Windows Store target.
            Gyrometer gyro;
    #endif
            public Controller()
            {
    #if NETFX_CORE
                gyro = Gyrometer.GetDefault();
    #endif
            }

            // Short default pulse used for haptic feedback.
            public static void Vibrate()
            {
                Vibrate(8);
            }

            // Angular velocities reported by the gyrometer, in degrees per second.
            // In the .NET 3.5 build these properties are stubs that return 0.
            public double AngularVelocityX
            {
                get
                {
    #if NETFX_CORE
                    return gyro.GetCurrentReading().AngularVelocityX;
    #else
                    return 0;
    #endif
                }
            }
            public double AngularVelocityY
            {
                get
                {
    #if NETFX_CORE
                    return gyro.GetCurrentReading().AngularVelocityY;
    #else
                    return 0;
    #endif
                }
            }
            public double AngularVelocityZ
            {
                get
                {
    #if NETFX_CORE
                    return gyro.GetCurrentReading().AngularVelocityZ;
    #else
                    return 0;
    #endif
                }
            }
            public static void Vibrate(int milliseconds)
            {
    #if NETFX_CORE
                var vibrationDevice = Windows.Phone.Devices.Notification.VibrationDevice.GetDefault();
                vibrationDevice.Vibrate(TimeSpan.FromMilliseconds(milliseconds));
    #endif
            }
        }
    }
    


    Depending on the NETFX_CORE compilation symbol, we get a “stub” when compiling for the .NET Framework 3.5 and the working logic when compiling for the Windows Store. In the finished example, the link to which is below, a Visual Studio solution has been prepared that compiles both DLLs and copies them into the corresponding Assets/Plugins directories.

    We take into account the sensor values



    As the source code makes clear, working with the gyro sensor is very simple. We poll the phone’s angular velocity values, and then it only remains to apply this data to the image on the screen.
    To do this, open the FPSVRBehavior.cs file again and add the following code to the Update method:
    void Update()
    {
        // 0.05f is a magic coefficient that controls how quickly
        // the camera reacts to the rotation of the phone.
        float vertical_angle_delta = (float)gyroscopePlugin.AngularVelocityY * 0.05f;
        float horizontal_angle_delta = (float)gyroscopePlugin.AngularVelocityX * 0.05f;
        transform.localEulerAngles = new Vector3(
            transform.localEulerAngles.x + vertical_angle_delta,
            transform.localEulerAngles.y - horizontal_angle_delta,
            transform.localEulerAngles.z);
    }
    


    gyroscopePlugin is an instance of our plugin; do not forget to declare the field in this class and create the object in the Start() method.
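    A minimal sketch of that wiring, with names matching the code above:

    WindowsPhoneVRController.Controller gyroscopePlugin;

    void Start()
    {
        gyroscopePlugin = new WindowsPhoneVRController.Controller();
        // ...the parallax setup shown earlier also lives here.
    }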
    As the code shows, we simply poll the sensor data and rotate the object, in our case the First Person Controller. The magic coefficient 0.05f determines how quickly the image responds to the rotation of the phone.
    After you make a build and run the application on your phone, the camera comes to life and now tracks the position of your head!

    Movement is life



    Now we have an almost full-fledged virtual reality application, but in this virtual reality you can only turn your head. To move forward, some kind of mechanism must be provided. There is no way to touch the screen (the phone is in front of our eyes), and connecting additional devices to the phone is difficult and inconvenient. Therefore, we will provide an elegant and easy way to walk inside our virtual world.
    The method is very simple: if you tilt your head (the phone) down a bit, so that the virtual camera looks at the floor at a certain angle, the First Person Controller moves forward. Naturally, collisions in our virtual world are respected, so we do not pass through walls.
    Add the following code to the Update method of the FPSVRBehavior class:
    // CharacterMotor is assumed to be the motor script from the standard
    // Character Controller package used by the First Person Controller.
    var motor = GetComponent<CharacterMotor>();
    // Looking down at a pitch between 30 and 40 degrees moves us forward.
    if (transform.localEulerAngles.x > 30 && transform.localEulerAngles.x < 40)
    {
        motor.inputMoveDirection = transform.rotation * (Vector3.forward * 0.1f);
        // A short vibration lets the user feel that they are in the "walk" zone.
        WindowsPhoneVRController.Controller.Vibrate();
    }
    

    The code is self-explanatory: if we tilt our head down, somewhere between 30 and 40 degrees there is a zone that triggers forward movement. To help the user “feel” where this zone is, the phone vibrates while in it.
    Download the finished project and immerse yourself in virtual reality!
