WaveEngine 3.0 Preview
Today we release the first preview of Wave Engine 3.0. This version is the result of more than a year of research and development, and of a major effort invested in reinventing this technology. Here’s a summary of what Wave Engine 3.0 brings.
And if you want to know more about our newest graphic engine, discover Evergine!
In recent years, technologies related to computer graphics have evolved very quickly. New graphics APIs have appeared, such as Microsoft’s DirectX 12, Khronos Group’s Vulkan, and Apple’s Metal, each a radical departure from the previous generation (DirectX 10/11 and OpenGL), giving developers greater control over the driver in order to reach performance levels never seen before.
More than a year ago, the Wave Engine team analyzed these new APIs so that the engine could support them. At that time, Wave Engine 2.x versions were already published with great results in the industrial sector, but those versions used a low-level layer that served as an abstraction over the DirectX 10/11 and OpenGL/ES graphics APIs.
However, the changes posed by these new graphics APIs (DirectX 12, Vulkan, and Metal) were so important that they required a deeper effort than simply adding functionality to the low-level layer, as had been done in previous versions.
The changes went further: they practically rewrote the rules of the game. After extensive investigation, and following NVIDIA’s technical recommendations, we determined that the old low-level layer, built around DirectX 10/11 and OpenGL/ES, could not be adapted to the new APIs. The best way to get the most out of current and upcoming graphics hardware was to develop a completely new layer on the fundamentals of DirectX 12, Vulkan, and Metal, and then adapt and emulate certain concepts on DirectX 10/11 and OpenGL/ES to preserve backward compatibility on older devices.
DirectX 12, Vulkan, and Metal are not identical, but they share several important similarities aimed at minimizing communication between the CPU and GPU. Concepts such as GraphicsPipeline, ComputePipeline, ResourceLayout, ResourceSet, RenderPass, CommandBuffer, and CommandQueue had to be supported by the new Wave Engine low-level layer.
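To give an idea of how these shared concepts fit together, here is a rough sketch of the pattern; every type and member name below is a generic illustration, not the actual Wave Engine low-level API:

```csharp
// Generic illustration of the modern graphics API model; all names here
// are placeholders, not Wave Engine's real low-level API.
var layout = device.CreateResourceLayout(layoutDescription);       // resources a shader expects
var resources = device.CreateResourceSet(layout, constants, texture, sampler);
var pipeline = device.CreateGraphicsPipeline(pipelineDescription); // state baked and validated once

var commands = commandQueue.CreateCommandBuffer();
commands.Begin();
commands.BeginRenderPass(framebuffer);
commands.SetGraphicsPipeline(pipeline);  // one cheap bind instead of many state changes
commands.SetResourceSet(resources);      // bind resources as a pre-validated group
commands.Draw(vertexCount);
commands.EndRenderPass();
commands.End();
commandQueue.Submit(commands);           // minimal CPU-GPU chatter at submission time
```

Because pipelines and resource sets are built and validated up front, the per-frame work reduces to cheap binds and submissions, which is exactly where these APIs gain their performance.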
Wave Engine 3.0 was born from the need to support these new graphics APIs, and to be lighter and more efficient than previous versions thanks to advances in Microsoft’s .NET technology, such as .NET Standard libraries and the new .NET Core runtime.
So we rebuilt Wave Engine 3.0 from the bottom up. All libraries now target .NET Standard 2.0 so they can run on .NET Core across multiple platforms. This change meant an in-depth revision of every Wave Engine library, with performance as one of the most important goals, so from the beginning many benchmarks were run to compare results. You can dig deeper into these results below, but this image shows the performance improvement of Wave Engine 3.0 over Wave Engine 2.5 (the last stable release), using DirectX 11 in both:
After a lot of work this is the new architecture of Wave Engine 3.0:
In this diagram, you can see all the technologies used by Wave Engine 3.0 to provide the greatest flexibility and adaptability to any type of industrial project.
To explain it, we can divide it vertically into 4 sections, starting from the bottom:
Wave Engine can now render on DirectX 12/11, OpenGL/ES, Vulkan, Metal, and WebGL. This versatility allows us to get the best possible performance on each platform and architecture on the market.
The default runtime is .NET Core 3.0 on Windows, Linux, and macOS, although the engine can also be compiled natively for a target platform using AOT. AOT is the only option on platforms such as iOS and Android (via Xamarin’s AOT) and on the Web, where Mono-Wasm AOT is used.
At this level we have the Wave Engine low-level layer that connects to the platform, not only for graphics but also for sound, file access, device sensors, and input. On top of this sit the Wave Engine framework, components, and extensions, all distributed as NuGet packages.
These are the new pillars of Wave Engine 3.0, but the news does not end there. The team keeps improving each and every element that makes up the graphics engine using the latest tools available. We review these advances below:
Major features:
- New WaveEditor
- New Effect Editor
- XR Ready
- New extensible Render Pipeline
- The new life cycle of entities
- New Web project support
- New serialization system based on YAML
- New HoloLens 2.0 support
New launcher and update system
WaveEngine 3.0 comes with a launcher app, separate from the editor, which allows you to create new projects, open existing ones, update the WaveEngine NuGet version, and even manage and download different engine versions. These versions are installed on your machine in different folders, so you can work on several projects and use a different engine version for each of them.
When a new engine version is available, the launcher will automatically notify you and let you download it, as well as update any existing projects you are working on to the latest version.
The launcher app has been integrated into the Windows OS taskbar so that users can quickly open recent projects without having to run the launcher.
New WaveEditor
The new editor has been completely rewritten, raising its functionality to levels never seen before in Wave Engine. The new editor is faster, more efficient, easier to use, and more powerful in every respect.
Why a new editor? The Wave Engine 2.x editor was developed using GTKSharp 2.42, a multiplatform technology that allowed us to create interfaces on Windows, Linux, and macOS. GTK kept evolving, but Xamarin’s C# wrapper (GTKSharp) did not, and this started to cause problems with modern operating system features such as DPI management and x64 support. We then considered creating our own GTK 3 wrapper and porting the whole editor, but a study of our users revealed that 96% of them used Wave Engine on Windows. Given that, continuing to support Linux and macOS was very expensive for the team, so we decided to concentrate all our efforts on developing a new WPF editor for Windows, which gives us better integration with the operating system used by the vast majority of our users.
The new 3.0 editor is more solid: it consists of two independent processes, one that manages the rendering of the 3D views and another that manages the UI and the different layouts. This way the editor does not close or lock up when an error occurs while rendering views; we can even recover the render by restarting the rendering process, which has increased the stability of the new editor.
The new Wave Engine 3.0 editor is completely flexible, allowing the user layout to be customized for each project. All icons have been redesigned as vectors, so we will never see pixelated icons on high-density screens again. It includes a theme manager that lets you switch between dark and light themes to suit different screens. Application content is even easier to create and edit: you can visualize all the content and modify it in real time. A new synchronization system for external changes means that if you modify an asset externally, the editor is notified and kept up to date.
New viewers, more complete and more functional, have been implemented for each type of asset. These viewers are not external processes as in the previous version of the editor; they run inside the new editor, in the same context, which greatly increases loading speed.
New Effects Viewer
Allows you to write your own effects. An extension for HLSL has been designed to help automatically port your shaders to all the platforms supported by Wave Engine. This extension consists of metadata that easily defines and models properties, shader passes, default values, and more.
New Materials Viewer
Shows a material on a 3D surface that we can modify while seeing the changes in real time.
New Render Layer Viewer
Entity drawing is grouped by render layers. In this viewer, layers are created and modified, allowing you to change the sorting direction, the rasterizer configuration, color blending (BlendState), and depth control (DepthStencilState).
New Sampler Viewer
This viewer allows you to create and edit Sampler assets, which let you modify the way Wave Engine will sample the textures each Sampler is applied to.
New Textures Viewer
As in the other viewers, a new texture viewer has been included where you can set the output format, the scaling percentage, whether the texture is a NinePatch, whether MipMaps must be generated, and whether the texture includes a pre-multiplied alpha channel, as well as the Sampler that will be used to draw it.
New Model Viewer
Model visualization has improved with the new viewer and the incorporation of the glTF format. The viewer lets you inspect models and adjust the lighting to bring out detail. You can access each animation included in the model and assign key events to specific times within each animation. These events can be used to invoke methods in your code, play sounds, trigger effects, and anything else you can imagine.
New Audio Viewer
Another improvement in this version is the new audio file viewer, where you can see the waveform of the file and configure output characteristics such as the sample rate or the number of channels.
New Scene Viewer
This viewer is the center of the editor. It unifies and uses all the content so that the user can create and modify scenes, the fundamental pieces of any application. The scene viewer controls and organizes the entities associated with the scene, as well as the components, behaviors, and drawables associated with each entity. Entities can be grouped hierarchically, which allows an easy and logical organization, and can be filtered by name within the tree. The viewer allows direct manipulation of entities in the preview through so-called manipulators (translation, rotation, scale) and, as in 3D design and CAD programs, all these changes are immediately reflected in the scene.
New Effect Editor
One of the weaknesses of Wave Engine 2.x was creating your own materials. Although possible, it required a completely manual and very tedious process. Wave Engine 3.0 introduces a new effects editor that allows you to write your own effects and then use them as materials in your scenes.
This editor allows you to write your own shaders and group them into effects. Shaders are defined in HLSL (DirectX) with the engine’s own metadata, which makes parameter passing and integration with the system easier. When defining shaders, the editor offers syntax highlighting, IntelliSense, automatic compilation, and inline error highlighting to make shaders easier to create and define.
The shaders are composed of 2 blocks:
- Resource Layout: Where the list of parameters that the shader will receive is defined.
- Pass collection: A list of passes that integrate perfectly with the new render system, in which you can define your own Render Pipeline, as well as the passes that materials must define.
It also has a 3D viewer where the shader you are defining is compiled and executed in real time. It lets you dynamically test different values for all the parameters declared in the Resource Layout section, resulting in a fully interactive editing experience.
Effects are composed of multiple shaders, which are defined through compilation directives. The new effects editor allows you to define your own compilation directives and to compile and visualize the different shaders generated by activating one set of directives or another.
In addition, for very complex effects with thousands of shader variants resulting from multiple compilation directives, an analyzer is included that compiles all possible combinations of your effect and reports whether every combination compiled successfully; for any that failed, it lets you navigate to them and repair the errors they contain.
Once your shader is written, it is automatically translated into the language of each supported graphics technology (OpenGL/ES, Metal, Vulkan). To help debug this automatic process, the editor includes a viewer that shows the translations generated from your shader for each language.
Finally, the editor can automatically generate the effect asset, as well as add a decorator material (a C# class) to the user’s solution that makes it very convenient to create that material from code and to assign parameters to the effect.
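For example, using such a generated decorator from code could look like the sketch below. The class name, its properties, and the load call are hypothetical, since the real names are generated from your specific effect asset:

```csharp
// Hypothetical usage of a generated material decorator; all names
// (MyGlowMaterial, its properties, the load call) are illustrative.
var material = new MyGlowMaterial(assetsService.Load<Effect>(MyGlowEffectId))
{
    // Strongly-typed access to the parameters declared in the effect's
    // Resource Layout, instead of filling constant buffers by hand.
    BaseColor = new Vector3(1.0f, 0.5f, 0.0f),
    GlowIntensity = 0.8f,
};

// Assign the decorated material to an entity through its material component.
entity.FindComponent<MaterialComponent>().Material = material.Material;
```

The point of the decorator is exactly this: effect parameters become ordinary C# properties, so typos and type mismatches are caught at compile time.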
XR (Extended Reality) is a term that encompasses applications such as Virtual Reality (VR), Mixed Reality (MR), and Augmented Reality (AR).
Wave Engine 3.0 has been designed with XR in mind.
Single Pass (Instanced) Stereo Rendering
Rendering in an XR application usually requires drawing the scene twice, once for the left eye and once for the right eye. Traditionally, the image for each eye is rendered in a separate pass, so rendering time doubles.
In Wave Engine 3.0, the rendering time has been optimized using the Single Pass (Instanced) Stereo Rendering technique. This is how it is broken down:
- Single Pass: The scene is rendered in a single pass, sharing a lot of processing between each eye (culling, sorting, batching…)
- Instanced: Each object is rendered twice in a single draw call via instancing, with each instance drawn to its corresponding texture (left or right eye). This way, a lot of per-element processing is shared (binding material parameters, updating material buffers, etc.).
- Stereo Rendering: The result is a stereo image (TextureArray2D), which will be provided to the headset so that it can present each eye independently.
All effects provided by default in Wave Engine 3.0 support Single Pass Instanced Stereo Rendering. Additionally, with the new effects editor, it is easy to develop an effect that supports it.
As a result, and together with the improvements introduced in the RenderPipeline, a remarkable performance improvement has been achieved, which makes it possible to increase the complexity of our scenes.
Using the feedback gathered while developing XR applications with previous WaveEngine versions, all the components and services offered to users have been reimplemented to simplify development as much as possible.
XRPlatform is the base service that manages all communication with the underlying XR platform. Each platform is supported through an implementation of that service (MixedRealityPlatform, OpenVRPlatform, etc.). Points to bear in mind:
The old CameraRig component has been removed. In WaveEngine 3.0 you just need a Camera3D in your scene, without any additional components. The XRPlatform service will take care of making this camera render in stereo in the headset.
XRPlatform exposes different properties that allow access to different interesting areas, in case the underlying implementation supports it:
- InputTracking: In charge of providing the positioning and status of each of the devices of the XR platform (controllers, base stations, hands, headset…).
- GestureInput: Offers access to complex spatial gestures on those platforms that support it (HoloLens and HoloLens 2).
- RenderableModels: Provides access to 3D models of the different devices.
- SpatialMapping: Provides a dynamic mesh of the environment in which the device is (HoloLens, HoloLens 2, Magic Leap, ARKit, ARCore).
- SpatialAnchorStore: Allows you to add Spatial Anchors in the space and store them to be retrieved in the next session.
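As a sketch of how these areas might be consumed from application code (the property names come from the list above, but the service lookup and the null-check pattern are assumptions about the API shape):

```csharp
// Sketch: consuming XRPlatform from application code. The lookup call
// and the availability checks are assumptions about the API shape.
XRPlatform xr = Application.Current.Container.Resolve<XRPlatform>();

// Each area may be unsupported by the underlying implementation,
// so test for availability before using it.
if (xr.InputTracking != null)
{
    // Read poses and states of controllers, hands, headset, etc.
}

if (xr.SpatialMapping != null)
{
    // Consume the dynamic environment mesh (HoloLens, Magic Leap, ARKit/ARCore).
}

if (xr.SpatialAnchorStore != null)
{
    // Persist anchors now and restore them in the next session.
}
```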
New extensible Render Pipeline
In Wave Engine 3.0, the way objects in a scene (meshes, sprites, lights, etc…) are processed and represented on screen can be adapted to the needs of our application.
A render pipeline is a fundamental element in Wave Engine 3.0, responsible for controlling the entire rendering process of the scene (sorting, culling, passes, etc.). By default, an implementation called DefaultRenderPipeline is provided, which implements all the functions required for proper operation.
However, users can now provide a customized render pipeline implementation that fits their needs. Implementing a custom RenderPipeline gives us greater granularity and control, allowing us to eliminate unnecessary processes or add tasks not previously contemplated.
Each scene has an associated render pipeline, which is responsible for:
Collecting all the necessary elements to render the scene:
- Lights: Lights of our scene (DirectionalLight, PointLight, and SpotLight)
- Cameras: The list of cameras from our scene.
- RenderObjects: Any object that needs to be rendered, can be meshes, sprites, billboards, lines, etc…
Preparing the elements to render for each camera:
- Culling: Rendering only those objects that are visible to the camera.
- Sorting: Sorting objects for optimal rendering.
- Batching: Grouping of the objects to be painted to minimize the number of draw calls, and thus improve performance by reducing CPU consumption.
Rendering the scene elements already processed. To do this, a render pipeline offers different rendering modes, called RenderPaths. A RenderPath takes care of:
- Managing how lighting is processed (Forward, Light Pre-Pass, etc.)
- Exposing properties and textures that can be automatically injected into materials.
- Defining the series of passes needed to paint the scene and executing them sequentially to obtain the final result.
- Each pass affects an object’s draw call: the object’s material must offer an implementation for each pass, otherwise the object will not be rendered.
It is now also possible to implement our own fully customized RenderPath and register it in our render pipeline, so that our effects and materials can use it in the scene.
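A custom render path could be sketched as follows. The base-class name, the override, and the registration call are assumptions modeled on the concepts described above, not confirmed engine API:

```csharp
// Sketch of a custom RenderPath; type names, the override, and the
// registration call are assumptions based on the concepts above.
public class OutlineRenderPath : RenderPath
{
    public override void Render(DrawContext drawContext)
    {
        // Pass 1: draw the culled, sorted, and batched opaque objects.
        // Pass 2: re-draw selected meshes with an outline effect.
        // A material must implement a pass to be rendered by it.
    }
}

// Register the render path in the scene's render pipeline so that
// effects and materials can reference its passes:
scene.RenderPipeline.RegisterRenderPath(new OutlineRenderPath());
```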
The new life cycle of entities
Wave Engine 3.0 has redefined, simplified, and standardized the way entities, components, services, and scene managers work, allowing them to be attached, enabled, disabled, and detached. It also keeps the old dependency injector and improves it.
Component life cycle diagram
The new component lifecycle is explained in the next diagram:
We control the behavior of our component by implementing these methods:
- Loaded: Called once, when the component has been deserialized. Used to initialize all variables and properties that do not depend on external components.
- Attached: Invoked when the component has been added to an entity, or when the associated entity changes its parent. All the dependencies of the component have been resolved at this point, although they may not be initialized yet (they can be in the detached state). This method can be used to establish relations with other elements of the scene.
- Activated: Called when the component is activated (its IsEnabled property is set to true), and during startup if the component is enabled by default. Used to perform all the tasks required when the entity is activated.
- Deactivated: Invoked when an activated component is deactivated (its IsEnabled property is set to false). Used to make all the changes needed to prepare the component for the deactivated state.
- Detached: Invoked when the component is removed from its owner entity, or when the owner entity is removed from the scene. Mainly used to remove all external references.
- Destroyed: Invoked when the component is destroyed. Called only once; once disposed, the component cannot be used again. Used to dispose all resources associated with the component.
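A minimal component following this life cycle could look like the sketch below. The override names and signatures are assumptions based on the states described above, so check them against the actual Component base class:

```csharp
// Sketch of a component hooking into the life cycle described above.
// Method names and signatures are assumptions, not confirmed engine API.
public class PulseLight : Component
{
    protected override bool OnAttached()
    {
        // Dependencies are resolved here; establish relations with
        // other elements of the scene.
        return base.OnAttached();
    }

    protected override void OnActivated()
    {
        // IsEnabled just became true (or the component started enabled):
        // begin the work this component performs while active.
    }

    protected override void OnDeactivated()
    {
        // Prepare the component for the deactivated state.
    }

    protected override void OnDetach()
    {
        // The component left its entity (or the entity left the scene):
        // drop external references.
    }

    protected override void OnDestroy()
    {
        // Called once; dispose every resource this component owns.
    }
}
```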
We’ve also redefined the way dependencies are injected into a component. All dependencies are resolved before the component is attached.
Wave Engine 3.0 allows these attributes as element bindings:
- BindComponent: Injects a component of a specified type from the same entity.
- BindEntity: Injects an entity reference by its tag.
- BindService: Injects a service of the Wave application of a specified type.
- BindSceneManager: Injects a scene manager of a specified type.
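Put together, a component using these bindings might look like the following sketch. The attribute names come from the list above, but the exact attribute arguments (such as how a tag is passed to BindEntity) and the bound types are illustrative assumptions:

```csharp
// Sketch: dependency injection via binding attributes; attribute
// arguments and bound types are illustrative, not confirmed API.
public class FollowTarget : Behavior
{
    [BindComponent]
    private Transform3D transform = null;       // component on the same entity

    [BindService]
    private AssetsService assetsService = null; // application-level service

    [BindEntity(tag: "target")]
    private Entity target = null;               // entity looked up by its tag

    protected override void Update(TimeSpan gameTime)
    {
        // All bindings are resolved before the component is attached,
        // so they can be used safely here.
        var targetPosition = this.target.FindComponent<Transform3D>().Position;
        this.transform.LookAt(targetPosition);
    }
}
```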
New Web project Support
From the first releases of Wave Engine, we have covered most mainstream devices: phones, tablets, desktops, headsets, etc. However, the Web was simply not reachable: the state of the art in .NET did not allow a feasible way to bridge the browser and our C# code.
The momentum WebAssembly has gained lately, and the efforts made by Mono to run the CLR on top of it, have opened a new window to seriously consider bringing Wave Engine 3.0 apps to the Web.
In late 2018 it became possible to run .NET Standard libraries in the browser. At the same time, Uno’s Wasm bootstrap NuGet allowed us to quickly jump into our first tests consuming WebGL.
Nowadays, our performance analysis comes from running our samples under different scenarios: browsers combined with devices. However, we have not yet stressed the runtime by enabling JIT or AOT: currently, the IL is purely interpreted. Meanwhile, Mono keeps working on its Wasm tooling, mostly improving performance.
All this reinforces our belief that this route is worth continued investment, following the steps WebAssembly itself is taking over time, such as running outside the browser or enabling multithreading scenarios.
We have already started building initial Wave Engine 3.0 apps for the Web with this technology and expect to ship them soon, although there is no estimate currently.
New serialization system based on yaml
With Wave Engine 3.0 we decided to take more control over how scene entities and components are stored and edited. That’s why we left behind XML DataContract serialization and fully embraced the lighter and more customizable SharpYaml serialization. This change brought improvements in several key areas:
The same scene stored as a YAML file tends to be more readable, while also using less space than the XML DataContract version.
We now have much deeper error control over the scene. A scene is deserialized even if it contains components of an unknown type. This is important during development because we can keep editing the Wave scene even when there are issues deserializing a component of your application.
Easy customization of the serialization process allows us to control how specific types are serialized. This is crucial because it lets the scene detect all asset references and inject their Ids, so you can declare properties of the base type (Texture, Model, etc.) instead of variables holding their paths.
Members serialized by default
Now all properties are serialized except those marked with the WaveIgnore attribute. This makes creating new components in code much simpler, because we no longer have to add all the DataContract and DataMember attributes.
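As an illustration of the resulting format, a serialized entity could look roughly like the fragment below. The layout, type tags, and Ids are made up for the example and will differ from the engine’s actual output:

```yaml
# Hypothetical scene fragment; structure, type tags, and Ids are illustrative.
Entities:
  - Name: sphere
    Components:
      - !MyGame.Components.PulseBehavior
        IsEnabled: true          # plain properties are serialized by default
        Speed: 1.5
      - !WaveEngine.Framework.Graphics.MaterialComponent
        Material: 8f3b2c1d-4a6e-4f10-9b21-d2f4a1c00001   # asset reference stored by Id
```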
The best is yet to come
This is the first step toward the overall quality we want for WaveEngine 3.0. We intend to keep customizing the serialization/deserialization process and make it much more flexible.
New HoloLens 2.0 support
We have been working with Microsoft to support all the new features this device brings, which represent a great evolution in interaction compared to the first version.
This second version adds hand-tracking support for both hands with 21 tracking points, allowing users to interact with 3D elements in a more natural way, without needing to learn specific gestures to use the apps. The new API provides individual information for each finger, which can be used in new interfaces to offer a more agile way of entering data.
It also includes a new eye-tracking API that lets developers know not only which direction the user’s head is pointing, but also the distance between the eyes and where each eye is pointing at all times.
The environment-tracking API has also improved, which helps represent environments with higher precision, enabling more realistic occlusion of 3D objects or even using these representations as a functional part of the app.
This new device brings an important architectural change: HoloLens 1 ran on x86, whereas the new version runs on ARM64. Since WaveEngine 3.0 already uses the new .NET Core 3.0 runtime, it can target this new architecture natively and take full advantage of the device.
Today we are releasing a preview of Wave Engine 3.0 that only supports creating projects for Windows, UWP, and HoloLens. Over the coming weeks, we will continue publishing new versions to release all the work done. Stay tuned!