The future of technology: AR / VR in engineering and design

    In 2017, Dell and AMD conducted a survey of 147 creative professionals from a range of fields. The companies wanted to understand the technical problems designers and engineers face, to look into the near future, and to form a picture of what the coming year might bring.

    Below we share the results of this survey along with the views of leading industry experts. We also discuss new technologies that are gradually making their way into large engineering and design companies.



    VR & AR: new technologies at the service of business


    According to the survey, 41% of respondents are already implementing virtual (VR) or augmented (AR) reality as part of their business strategy. Another 53% plan to do so within the next year and a half.

    In practice, VR has already moved beyond being a toy for a narrow circle of specialists, and some companies are gaining a competitive advantage from adopting it.

    One striking example is Area SQ, a well-known British company that designs and refurbishes office spaces. Virtual reality has allowed it to give its customer service a “new dimension” and to present its work in a way that would have been hard to imagine a few years ago.

    “VR has a huge impact on how we work throughout a project’s life cycle,” explains Daniel Calgary, Design Director at Area SQ (London). “We are proud of how Area SQ is advancing in design technology. Innovation is one of the foundations of our strategy, and VR fits into it perfectly.” Calgary notes that it is important for Area SQ to be able to “overcome existing and future challenges,” and virtual reality has given the company a great opportunity to introduce new approaches.



    The company uses virtual reality at different levels depending on the type of project. Sometimes it is a static 360-degree panorama, sometimes an entire virtual space that can be walked through (in that case the Unreal Engine is used). But technology will never replace the creative process.

    “It’s important for us to have VR-ready systems,” says Gary Hunt, visualization manager at Area SQ. “That way we can treat the hardware simply as a tool for creative work. We don’t want to think about what is inside the system or how it functions; we want to plug in our VR accessories and get to work.”



    Ultra HD shooting


    With the ever-increasing resolution of monitors and TVs, creative professionals constantly have to rethink their workflows and the quality of their output. As the survey showed, 8K photo and video is already part of the business strategy for 40% of respondents, and another 51% plan to adopt it within the next year and a half.

    However, as resolution grows, so do the demands on graphics (GPU) and central (CPU) processors: the software must cope with larger projects, and larger monitors call for more powerful graphics cards.
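
    To give a rough sense of the jump in pixel counts behind those demands, here is a minimal back-of-the-envelope sketch in Python (the resolutions are the standard 16:9 figures; none of these numbers come from the survey):

```python
# Rough comparison of raw pixel counts for common 16:9 resolutions.
# Illustrative only; actual workload also depends on bit depth, frame
# rate, codec, and the effects applied.
resolutions = {
    "Full HD (1920x1080)": 1920 * 1080,
    "4K UHD (3840x2160)": 3840 * 2160,
    "8K UHD (7680x4320)": 7680 * 4320,
}

base = resolutions["Full HD (1920x1080)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels ({pixels / base:.0f}x Full HD)")

# Full HD (1920x1080): 2,073,600 pixels (1x Full HD)
# 4K UHD (3840x2160): 8,294,400 pixels (4x Full HD)
# 8K UHD (7680x4320): 33,177,600 pixels (16x Full HD)
```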

    40% of respondents said that 8K video is part of their business strategy


    Here is what Paul Wyatt, a director of short documentaries and commercials, says about the challenges he faces when working with high-resolution video.



    What do you see as the most significant technological advances in video production in the next year and a half?

    A couple of years ago, 4K video at a decent bitrate and a normal frame rate was out of reach for most professionals, let alone ordinary consumers. You needed an expensive external recorder and plenty of workarounds to get the footage to play nicely with a non-linear editing program. Now cameras like the Panasonic Lumix DMC-GH5 and the LUMIX DMC-FZ2500 really raise the bar when you need to shoot 4K video without recording time limits. This makes life far easier for a filmmaker or videographer: you carry less equipment and do not sacrifice the quality of the material. We are also seeing 10-bit 4:2:2 color appear in the latest cameras. The video carries more information about color, which leaves much more room for color correction in post-production.
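
    As a rough illustration of why 10-bit 4:2:2 footage carries more color information than the 8-bit 4:2:0 typical of consumer formats, here is a small sketch based on the standard chroma-subsampling arithmetic (the comparison itself is not from the interview):

```python
# Bits per pixel under J:a:b chroma subsampling. A reference block is
# J pixels wide and 2 rows tall: it holds 2*J luma samples, plus `a`
# chroma samples per channel in the first row and `b` in the second
# (two chroma channels: Cb and Cr).
def bits_per_pixel(bit_depth, a, b, j=4):
    luma_samples = 2 * j
    chroma_samples = 2 * (a + b)          # Cb + Cr
    return bit_depth * (luma_samples + chroma_samples) / (2 * j)

consumer = bits_per_pixel(8, 2, 0)        # 8-bit 4:2:0 -> 12.0 bits/pixel
pro = bits_per_pixel(10, 2, 2)            # 10-bit 4:2:2 -> 20.0 bits/pixel
print(f"8-bit 4:2:0:  {consumer} bits per pixel")
print(f"10-bit 4:2:2: {pro} bits per pixel ({pro / consumer:.2f}x the data)")
```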

    What challenges do you face?

    The biggest problem with all these advances is that they require reorganizing your workflow and adding capacity for processing high-resolution content. Fortunately, non-linear editors like Adobe Premiere Pro support 4K natively. They can even generate low-resolution proxy clips if the system is not powerful enough to edit at full 4K resolution.
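
    Premiere Pro generates its proxies internally, but purely as an illustration of what a proxy workflow boils down to, here is a minimal sketch that uses ffmpeg (assumed to be installed separately) to create a low-resolution working copy of a clip; the file names and encoder settings are hypothetical:

```python
# Minimal proxy-generation sketch: transcode a 4K source into a small,
# easy-to-decode working copy, keeping the original for final output.
# Requires ffmpeg to be available on the PATH.
import subprocess
from pathlib import Path

def make_proxy(source: Path, height: int = 1080) -> Path:
    proxy = source.with_name(f"{source.stem}_proxy.mp4")
    subprocess.run(
        [
            "ffmpeg", "-y",
            "-i", str(source),
            "-vf", f"scale=-2:{height}",    # downscale, keep aspect ratio
            "-c:v", "libx264",
            "-preset", "fast",
            "-crf", "23",                   # "good enough" working quality
            "-c:a", "copy",                 # keep the original audio as-is
            str(proxy),
        ],
        check=True,
    )
    return proxy

# Example (hypothetical file name):
# make_proxy(Path("interview_4k.mov"))
```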

    How has your work changed over the past few years from a technical and creative point of view?

    Technology has freed up creative thought, and it has become possible to do more with a smaller budget. I still like working with a crew and using high-end cameras (a Sony FS7 or Canon C400, for example), but that kind of equipment is usually expensive and time-consuming to set up. Compact mirrorless cameras let me, as a director, squeeze the most out of a limited budget. I can use a Sony A7S II, an RX10 III or a Panasonic GH4 and know that I get the 4K resolution and the focus and exposure tools I need: no extra equipment is required. I am tired of seeing huge cameras and bulky rigs with piles of cables, monitors and external recorders. Much of that is no longer necessary, which frees up the producer's and director's time. So creatively you can do a lot more. It's that simple.



    What problems does the increase in video resolution cause? How do you plan to solve them?

    This year there is real demand for 4K, so my system had to be upgraded to handle footage at bitrates of up to 250 Mbps. Some non-linear editing systems cope with 4K better than others. If I am cutting an assembly on a laptop, I use the proxy workflow in Premiere Pro: it creates lower-resolution working files, which reduces the load on the processor. Then, when I move everything to my Dell workstation, I can switch the working clips back to the full-resolution versions.
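
    As a quick sanity check on what a 250 Mbps acquisition bitrate means for storage, a few lines of arithmetic (decimal units; the figures are derived, not quoted from the interview):

```python
# What a 250 Mbps camera bitrate means in practice (decimal units).
bitrate_mbps = 250
bytes_per_second = bitrate_mbps * 1_000_000 / 8      # 31.25 MB/s
gb_per_minute = bytes_per_second * 60 / 1e9          # ~1.9 GB per minute
gb_per_hour = gb_per_minute * 60                     # ~112.5 GB per hour

print(f"{bytes_per_second / 1e6:.2f} MB/s of footage")
print(f"{gb_per_minute:.2f} GB per minute of recording")
print(f"{gb_per_hour:.1f} GB per hour of recording")
```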

    It is also important not to lose sight of the delivery stage. I am usually asked to deliver the result in 1080p, so the 4K material is typically downscaled to 1080p for output. The upside is that you can later reopen the project, relink the 1080p timeline to the 4K source and scale the video back up to 4K. In Premiere and Final Cut this is very easy.

    51% of respondents plan to start working with 8K video in the next 18 months


    When shooting 4K on camera, focus-assist tools are very important, because the slightest flaw in the image will be immediately visible to viewers. At 4K it really pays to have tools that let you punch in and magnify the image in the viewfinder to check focus, and that extra level of control matters for a home audience watching the material on a 4K TV.



    What technologies are you using now?

    For many years I worked in creative studios where video was always a secondary concern: the technology we used always lagged behind, and rendering was painfully slow. When I started making my own films, I did not want those trade-offs to get in the way of the creative work, so I bought a Dell XPS 8300. At the time it was equipped with an Intel Core i7-2600 processor, a 1 TB SATA drive and a 28-inch Full HD Dell monitor. I worked on that system (with a few upgrades) for five years while shooting HD films, and even made a half-hour documentary for television on it.

    Eventually I needed to make films at higher resolutions and bitrates, so I turned to Dell again; I like the way the company works with its customers. Their experts gave practical advice on the configuration I would need for 4K, and I chose a Dell Precision Tower 7000 with extra memory and an UltraSharp 27 monitor. It is a serious investment, but I am confident that with a few upgrades the workstation will last as long as the previous one did.

    Interactive touch displays


    Touch displays have spread from mobile phones and tablets to hybrid laptops, and large-format interactive devices are now becoming viable tools for creative professionals in many industries.

    Most studios use tablets to showcase their work. But companies such as Adobe, Autodesk and Avid now build touch-oriented versions of their applications, so on a touch-screen device you can not only view content but also create it.



    70% of respondents hope to include interactive devices in their business processes over the next year and a half


    Area SQ builds its business on innovation and regards it as a key success factor. Analyzing what competitive advantage a new technology can provide is an important part of the company's philosophy.

    “We believe touch interfaces will be a breakthrough,” explains Andrea Williams-Vedberg, Creative Director at Area SQ. “When you need to show customers something new, a product like the Dell Canvas comes in handy. You can easily move the image around the screen, zoom in on CAD models, and review the files you are working on together with colleagues.”

    Virtual reality and ever-higher video resolutions are forcing workflows to change, and there is a constant search for compromise between adopting technical innovations and allocating budgets sensibly. A majority (63%) of survey participants said that GPU rendering plays a role in their current business approach, and 80% of those respondents said it cut their rendering time by more than half. Still, it is important to strike a balance between price and performance.

    The future of the special effects industry


    MPC (Moving Picture Company) is one of the world's leading visual effects studios. It has worked on films such as The Revenant, The Jungle Book, and the Harry Potter series.

    To create its Oscar-winning effects, MPC relies on a range of hardware, including Dell mobile workstations. It is interesting to hear about the technical problems a world leader in visual effects is facing. Here is what Damien Fagnou, Chief Technical Officer at MPC Film, has to say.



    What are the main technical problems that you are already facing or will face in the coming year?

    It is not easy for a large studio like ours to keep up with the growing demands for data storage and rendering, so we continue to invest heavily in optimization. Technologies such as Universal Scene Description and VR help us stay at the forefront.

    How significant a role will VR play in the special effects industry next year?

    We are already experimenting with very interesting workflows in which VR serves as an attractive platform for reviewing and even creating virtual environments. At the same time, there is growing demand for fully interactive virtual reality experiences that complement the films the studio works on.

    And what needs to change for you to work with VR?

    To use virtual reality in our industry, several problems have to be solved. Take game engines: although they have improved over the past few years, they still cannot really handle film-quality effects. In addition, most visual effects work is done on Linux, where few game engines are available, and the same goes for the VR headsets they need to integrate with. We are investing in various workflows and doing research, and we are confident the situation will improve in the coming year.



    What problems does high-resolution video create? How do you overcome them?

    Higher resolution affects the amount of detail and texture we have to produce, but on the whole we have found good solutions and have been using them for many years. The hardest part is the final render. With ray tracing, its cost grows almost linearly with the number of pixels, so rendering at 4K takes roughly four times as long as rendering the same frame at 2K. That demands efficiency both in the workflow and in raw speed.
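
    A worked example of that scaling, under the stated assumption that ray-tracing cost grows roughly linearly with pixel count; the 30-minute baseline per 2K frame is purely hypothetical:

```python
# If ray-tracing cost is roughly linear in pixel count, render time scales
# with resolution. The 30-minute baseline per 2K frame is hypothetical.
baseline_pixels = 2048 * 1080          # 2K DCI frame
baseline_minutes = 30                  # hypothetical time per 2K frame

for name, w, h in [("2K", 2048, 1080), ("4K", 4096, 2160), ("8K", 8192, 4320)]:
    scale = (w * h) / baseline_pixels
    print(f"{name} ({w}x{h}): ~{baseline_minutes * scale:.0f} min per frame "
          f"({scale:.0f}x the pixels)")

# 2K (2048x1080): ~30 min per frame (1x the pixels)
# 4K (4096x2160): ~120 min per frame (4x the pixels)
# 8K (8192x4320): ~480 min per frame (16x the pixels)
```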

    We are also working with our partners in Technicolor's research team on denoising and upscaling algorithms specially adapted for visual effects.

    Which department faces the most serious technical challenges?

    Honestly, all of them. Animation needs more pixels on screen and more detail; the effects department wants more powerful simulation, faster rendering at higher detail, and so on. Everyone doing creative work wants technology to free them from as many constraints as possible.

    From a technical point of view, what excites you most when you look ahead over the next year or two?

    Various technologies, cloud computing in particular, are reaching a level of maturity where they can transform the visual effects industry. It is no longer just about getting more compute or servers "on demand"; it is about a more distributed and dynamic workflow. At the same time, technologies like deep learning are changing how we approach problems.

    What were the main difficulties in working on The Jungle Book, and what has changed over the past 15 years?

    For The Jungle Book we had to create a huge number of photorealistic environments, incredibly complex and rich in detail: more than 80 minutes of effects work. The world was populated with hundreds of realistic animals that had to move and behave naturally. That would have been almost impossible in 2001. Over 15 years, computing and data storage have advanced to the point where we can fully simulate an environment with ray tracing, handling billions of polygons and curves. The Pixar and Autodesk software running on these platforms has also evolved enormously.

    Survey results


    Overcoming Barriers: AMD Transforms Entertainment with SSG and WX Series Technology


    The latest survey highlights the ever-growing demand for GPUs, and the new Radeon Pro product line demonstrates AMD’s commitment to setting new benchmarks. “We announced a partnership with AMD on Radeon technology to optimize Nuke for OpenCL and enable users to work with the new product,” said Alex Mahon, CEO of Foundry.



    With the Radeon Pro line, AMD offers a range of tools, from WX Series graphics cards to SSG technology, that covers the full spectrum of creative needs. “The goal of the WX Series is to create the most accessible workstation for producing virtual reality content,” said Raja Koduri, senior vice president and chief architect of the Radeon Technologies Group (RTG). “We have supplemented the GPU with terabytes of memory, and with our SSG technology you can edit 8K video in real time: it lets you work at 90 frames per second and 45 Gb/s directly on the Radeon Pro SSG.”
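
    For scale, here is a rough estimate of what uncompressed 8K footage demands in raw bandwidth; the pixel format and frame rates below are illustrative assumptions, not AMD's test conditions:

```python
# Rough data-rate estimate for uncompressed 8K UHD footage.
# The pixel format (10-bit 4:2:2, ~20 bits/pixel) and the frame rates
# are illustrative assumptions, not AMD's test conditions.
width, height = 7680, 4320
bits_per_pixel = 20                      # 10-bit 4:2:2

for fps in (24, 60, 90):
    bits_per_second = width * height * bits_per_pixel * fps
    print(f"8K at {fps} fps: {bits_per_second / 1e9:.1f} Gb/s "
          f"({bits_per_second / 8 / 1e9:.1f} GB/s) uncompressed")

# 8K at 24 fps: 15.9 Gb/s (2.0 GB/s) uncompressed
# 8K at 60 fps: 39.8 Gb/s (5.0 GB/s) uncompressed
# 8K at 90 fps: 59.7 Gb/s (7.5 GB/s) uncompressed
```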

    For 63% of respondents, GPU rendering is part of their business strategy.


    Every company working with visual effects is experimenting with VR, and that places certain demands on the GPU. “Game engines have become more powerful and have spread into many industries; in fact, you can hardly call them 'game' engines anymore,” said Roy Taylor, a vice president at AMD. “We are building tools that help artists and directors come up with their story and tell it.”

    The top model in the line, the Radeon Pro WX7100, has 2048 stream processors and delivers more than 5 teraflops of performance. The card carries 8 GB of GPU memory on a 256-bit bus, with an effective memory frequency of 7 GHz.
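
    Those headline figures follow from the card's specifications. The sketch below assumes a boost clock of about 1.25 GHz (not stated in the text) to show where the "over 5 teraflops" figure comes from; the memory bandwidth uses only the quoted bus width and effective frequency:

```python
# How the headline numbers follow from the card's specs. The ~1.25 GHz
# boost clock is an assumption; each stream processor is counted as one
# fused multiply-add (2 FLOPs) per clock.
stream_processors = 2048
boost_clock_ghz = 1.25            # assumed, not stated in the article
fp32_tflops = stream_processors * 2 * boost_clock_ghz / 1000

bus_width_bits = 256              # from the text
effective_mem_ghz = 7.0           # from the text
bandwidth_gbs = bus_width_bits * effective_mem_ghz / 8

print(f"FP32 throughput: ~{fp32_tflops:.1f} TFLOPS")    # ~5.1 TFLOPS
print(f"Memory bandwidth: {bandwidth_gbs:.0f} GB/s")    # 224 GB/s
```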

    A GPU is built to process huge amounts of data, but it cannot instantly access large datasets that sit at the system level. The Radeon Pro SSG (Solid State Graphics) solution is a step toward removing that bottleneck.



    The card has two M.2 slots for SSDs with a PCIe 3.0 x4 interface; the drives can store data being processed by the GPU. The SSDs are connected to the PCIe bus through a PEX8747 bridge.
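
    The upper bound on what those on-card drives can deliver follows from standard PCIe 3.0 figures (8 GT/s per lane with 128b/130b encoding); real SSDs will come in below this ceiling:

```python
# Theoretical throughput of the on-card SSD links. The PCIe 3.0 figures
# are standard: 8 GT/s per lane with 128b/130b line encoding. Real drives
# will deliver less than this upper bound.
gt_per_s = 8.0                    # PCIe 3.0 transfer rate per lane
encoding = 128 / 130              # 128b/130b encoding overhead
lanes = 4                         # each M.2 slot is PCIe 3.0 x4

gbs_per_drive = gt_per_s * encoding * lanes / 8
print(f"Per drive:  ~{gbs_per_drive:.2f} GB/s")        # ~3.94 GB/s
print(f"Two drives: ~{2 * gbs_per_drive:.2f} GB/s")    # ~7.88 GB/s
```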

    SSG is a good fit for designers and developers, since the card can significantly boost the performance of heavily loaded systems. Beyond design, it can help in fields that require real-time rendering: in medicine for 3D animation of a patient's heart, in the oil and gas industry, and elsewhere.
