UI research in Unity 4.6 beta

The other day I started learning the new UI in Unity 4.6 beta. Naturally, I watched all the video tutorials on the official website, but they say nothing about how the new UI actually works. I could not find any docs either, and of course I wanted to figure out how it all works on my own. So, briefly, here is what I understood.

Based on my trial-and-error experiments, the main object without which you cannot build a UI is the Canvas. It is responsible for rendering interface elements and forwarding events to them. Canvas has 3 render modes for the UI: ScreenSpace - Overlay, ScreenSpace - Camera and WorldSpace.

ScreenSpace - Overlay and ScreenSpace - Camera

As the names suggest, these modes work with screen coordinates. This lets you build a pixel-perfect interface, but if you want the interface to look the same at all resolutions, or you want to create a 3D UI, these modes will not suit you.

WorldSpace
This mode draws elements in world space, so if there is no camera whose view they fall into, you may not see them at all. This mode interested me because it allows you to create a UI that looks the same regardless of screen resolution. The only small problem is that, unlike in the previous modes, the Canvas does not react to changes in the screen aspect ratio. But this is solved by a simple script that adjusts the width/height of the canvas at startup.

using UnityEngine;
using System.Collections;

public class CanvasHelper : MonoBehaviour
{
	private const float ETHALON_2x3_LANDSCAPE = 1.3333333333333f;
	private const float ETHALON_2x3_PORTRAIT = 0.666666666666f;

	public Canvas canvas;

	// Use this for initialization
	void Start ()
	{
		var ethalon = Screen.orientation == ScreenOrientation.Landscape
			? ETHALON_2x3_LANDSCAPE
			: ETHALON_2x3_PORTRAIT;
		var cam = canvas.worldCamera;
		var rectTransform = canvas.transform as RectTransform;
		var delta = rectTransform.sizeDelta;
		// Stretch the canvas along the long axis so it matches the
		// camera's actual aspect ratio instead of the 2:3 reference.
		if (Screen.orientation == ScreenOrientation.Landscape)
			delta.x *= cam.aspect / ethalon;
		else
			delta.y *= cam.aspect / ethalon;
		rectTransform.sizeDelta = delta;
	}
}

I will not describe the individual elements; you can get acquainted with them by trial and error anyway. But the message system is worth a closer look.

For your UI to work, you must have an EventSystem in the scene. EventSystem is a component that handles user input events and passes them to the UI. The actual event processing happens in InputModule components. I have come across two of them: StandaloneInputModule for PC, consoles and the web, and TouchInputModule for phones and tablets. Moreover, judging by their settings, they are at least partially interchangeable.
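To make this concrete, here is a minimal sketch (the class name EventSystemBootstrap is my own) that creates an EventSystem with a StandaloneInputModule from code if the scene does not already contain one:

	using UnityEngine;
	using UnityEngine.EventSystems;

	// If the scene has no EventSystem, create one at startup and
	// attach a StandaloneInputModule so the UI starts receiving input.
	public class EventSystemBootstrap : MonoBehaviour
	{
		void Awake()
		{
			if (FindObjectOfType<EventSystem>() == null)
			{
				var go = new GameObject("EventSystem");
				go.AddComponent<EventSystem>();
				go.AddComponent<StandaloneInputModule>();
			}
		}
	}

Normally the editor adds this object for you when you create your first UI element, so a script like this is only needed for scenes assembled entirely from code.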

An InputModule catches user events and passes them to the EventSystem, which then routes them to the UI. But how does it determine whether an interactive element was hit? GraphicRaycaster is responsible for that.

GraphicRaycaster
This component lives on the Canvas and, in response to a mouse click or touch, determines which object the event should be sent to. In total, the new UI has 3 types of raycasters: a raycaster for 2D physics, a raycaster for 3D physics and a raycaster for graphic elements. The last one is added to the Canvas game object by default.

This raycaster has one big drawback: for an object to receive an event, it must have a graphic component. In other words, if you want a transparent screen area that triggers some action on click, you have to create a component with graphics and make it completely transparent. In my opinion, this is very inconvenient; fortunately, it is possible to extend this system with new raycasters.
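One workaround I can sketch (the class name EmptyGraphic is my own, not part of the official API) is to subclass Graphic and emit no geometry at all: the element stays registered with the GraphicRaycaster but draws nothing. In the 4.6-era API the mesh override is OnFillVBO; later Unity versions replaced it with OnPopulateMesh.

	using System.Collections.Generic;
	using UnityEngine;
	using UnityEngine.UI;

	// A Graphic that generates no vertices: invisible on screen,
	// but still hit by the GraphicRaycaster, so it can receive
	// pointer events for a "transparent" clickable area.
	public class EmptyGraphic : Graphic
	{
		protected override void OnFillVBO(List<UIVertex> vbo)
		{
			vbo.Clear(); // no geometry to draw
		}
	}

This avoids the fill-rate cost of actually rendering a fully transparent Image.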

A bit about the code

The UI system can be divided into 3 parts: the event system (UnityEngine.Events), an innovation in Unity 4.6 that affects not only the UI but also the physics and rendering systems; the event capture system (UnityEngine.EventSystems); and the UI logic itself (UnityEngine.UI). The first is part of the engine's main library, while the other two belong to the UI library.

UnityEngine.Events
This namespace contains the classes that describe the basic structure of events. There are two key types: UnityEvent (the event itself, to which several listeners can be attached) and UnityAction (the listener delegate). They can take up to 4 parameters of the following types: EventDefined, Void, Object, Int, Float, String, Bool (based on the description in the PersistentListenerMode enum).
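A quick sketch of how the two types fit together (the class and field names here are my own, not from any Unity API):

	using UnityEngine;
	using UnityEngine.Events;

	public class ScoreEvents : MonoBehaviour
	{
		// A concrete UnityEvent with one int parameter; marking it
		// [Serializable] lets listeners be wired up in the Inspector.
		[System.Serializable]
		public class IntEvent : UnityEvent<int> { }

		public IntEvent onScoreChanged = new IntEvent();

		void Start()
		{
			// UnityAction<int> is the listener delegate type.
			UnityAction<int> listener = score => Debug.Log("Score: " + score);
			onScoreChanged.AddListener(listener);
			onScoreChanged.Invoke(42); // logs "Score: 42"
		}
	}

The same mechanism backs the OnClick list you see on a Button in the Inspector.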

UnityEngine.EventSystems

This namespace contains the classes and interfaces that provide event handling for UI elements; each event has its own interface. It also holds the raycasters for physics, the input modules, and UIBehaviour, the base class that describes the behaviour of UI elements.
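For example, to react to clicks a component simply implements the corresponding interface (a minimal sketch; the class name ClickLogger is my own):

	using UnityEngine;
	using UnityEngine.EventSystems;

	// Implementing IPointerClickHandler is enough for the EventSystem
	// to deliver click events to this component, as long as the object
	// can be hit by one of the raycasters.
	public class ClickLogger : MonoBehaviour, IPointerClickHandler
	{
		public void OnPointerClick(PointerEventData eventData)
		{
			Debug.Log("Clicked at " + eventData.position);
		}
	}

There are similar interfaces for drag, drop, hover, select and other events.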

UnityEngine.UI
This namespace holds the classes of the UI elements themselves, including GraphicRaycaster and a number of event-related interfaces. I have only just started exploring it; this is where the key to writing your own UI elements lies.

I will write the next part once I figure out how to create my own controls. Thank you all for reading; if you have similar experience, I would be glad to read about it in the comments.
