Pixel Streaming allows users to interact with an Unreal Engine 3D application through their web browser. Mouse and keyboard inputs are sent back to the remote server and processed with low latency, creating a user experience almost exactly as if the application were running on their own computer or mobile device.
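To make the input path concrete, the sketch below (TypeScript, browser side) shows the general idea of forwarding mouse events to the remote renderer over a WebRTC data channel. The channel label and JSON payload here are illustrative assumptions; the actual Pixel Streaming plugin defines its own input message format.

```typescript
// Illustrative sketch only: forward normalized mouse input to the remote
// renderer over a WebRTC data channel. The "input" label and JSON payload are
// assumptions, not the Pixel Streaming plugin's real message format.
function forwardMouseInput(pc: RTCPeerConnection, video: HTMLVideoElement): void {
  const input = pc.createDataChannel("input"); // hypothetical channel label

  video.addEventListener("mousemove", (e: MouseEvent) => {
    if (input.readyState !== "open") return;
    // Normalize to the video element so the server can map coordinates to its
    // own render resolution, which may differ from the client's viewport.
    const rect = video.getBoundingClientRect();
    input.send(JSON.stringify({
      type: "mouseMove",
      x: (e.clientX - rect.left) / rect.width,
      y: (e.clientY - rect.top) / rect.height,
    }));
  });

  video.addEventListener("mousedown", (e: MouseEvent) => {
    if (input.readyState === "open") {
      input.send(JSON.stringify({ type: "mouseDown", button: e.button }));
    }
  });
}
```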
Although pixel streaming offers significant advantages in cloud rendering and cross-device access, it still faces several limitations in practice, primarily in technical implementation, feature extensibility, and adaptability to different scenarios.
Pixel streaming relies on real-time video streaming, making it highly sensitive to network fluctuations. Any instability can cause blurriness, stuttering, or even disconnection. This issue is particularly evident in weak network conditions (e.g., mobile networks), where rapid camera movements or high-motion scenes lead to a significant drop in visual quality. Users have reported issues such as screen interference when multiple people access the service simultaneously, noticeable loading delays, and frequent crashes after prolonged operation.
Official solutions, such as Unreal Engine’s Pixel Streaming plugin, struggle with high concurrency on a single server. Traditional configurations launch a separate signaling service for each user, which consumes numerous ports and leaves GPUs underutilized and unevenly loaded. As a result, a single server typically supports only 3-4 concurrent sessions, making it difficult to meet large-scale user demands.
Each user requires a dedicated port, leading to a sharp increase in port demand. This poses significant deployment challenges in environments with strict port management policies, such as government and healthcare applications. While some commercial solutions, such as DRTStreamer, employ port multiplexing to mitigate this issue, the bottleneck persists in native implementations.
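A minimal sketch of the multiplexing idea is shown below (Node.js with the ws package); it is an assumption about how such routing could work, not DRTStreamer's actual implementation. All clients share one signaling port, and messages are relayed only between peers that present the same session ID in the connection URL.

```typescript
// Hedged sketch of port multiplexing for signaling: all clients connect to one
// port and are routed to the right streamer session by an ID in the URL
// (e.g. ws://host:8080/signal/<sessionId>). Not DRTStreamer's actual design.
import { WebSocketServer, WebSocket } from "ws";
import type { IncomingMessage } from "http";

const sessions = new Map<string, Set<WebSocket>>(); // sessionId -> connected peers

const wss = new WebSocketServer({ port: 8080 }); // one port for every user

wss.on("connection", (socket: WebSocket, request: IncomingMessage) => {
  const sessionId = (request.url ?? "/").split("/").pop() || "default";
  const peers = sessions.get(sessionId) ?? new Set<WebSocket>();
  peers.add(socket);
  sessions.set(sessionId, peers);

  // Relay signaling messages (SDP offers/answers, ICE candidates) only to
  // peers in the same session, so sessions never interfere with each other.
  socket.on("message", (data) => {
    for (const peer of peers) {
      if (peer !== socket && peer.readyState === WebSocket.OPEN) {
        peer.send(data.toString());
      }
    }
  });

  socket.on("close", () => peers.delete(socket));
});
```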
Pixel streaming exhibits poor compatibility across browsers and devices. Different hardware, especially mobile devices, may experience black screens or input delays due to variations in decoding capabilities. Furthermore, the official plugin exclusively supports Unreal Engine, making it incompatible with Unity, Vulkan, O3DE, Cesium, and other engines. It also lacks direct support for non-3D applications such as design software, limiting its application scope.
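One way to soften the compatibility problem on the client side is to probe decoding support before starting a stream. The sketch below uses the standard RTCRtpReceiver.getCapabilities() browser API; the fallback behavior is left as an assumption.

```typescript
// Hedged sketch: probe which video codecs this browser/device can decode
// before streaming, one way to catch the black-screen case where a device
// lacks a suitable decoder.
function supportedVideoCodecs(): string[] {
  const caps = RTCRtpReceiver.getCapabilities("video");
  return caps ? caps.codecs.map((c) => c.mimeType) : [];
}

const codecs = supportedVideoCodecs();
if (!codecs.includes("video/H264") && !codecs.includes("video/VP8")) {
  // Fall back or warn instead of leaving the user with an empty <video> element.
  console.warn("No commonly used WebRTC video codec is decodable on this device:", codecs);
}
```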
As an official Unreal Engine plugin, pixel streaming requires frequent adjustments with each engine update, resulting in high maintenance costs. Development teams must allocate substantial resources to resolve compatibility and stability issues, such as crashes after extended operation. Native implementations also demand manual configuration of signaling servers, STUN/TURN servers, and other components, requiring a deep understanding of network architecture and operational expertise. For example, deploying across networks involves handling NAT traversal issues, making it challenging for standard development teams to manage independently.
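For context on the NAT traversal point, the WebRTC side of that manual setup boils down to telling each peer connection where its STUN/TURN servers live. The sketch below uses placeholder server URLs and credentials.

```typescript
// Hedged sketch of the manual network setup referred to above: the client must
// be configured with STUN/TURN servers so connections can cross NAT boundaries.
// The server URLs and credentials are placeholders.
const pc = new RTCPeerConnection({
  iceServers: [
    { urls: "stun:stun.example.com:3478" }, // STUN: discover the public address
    {
      urls: "turn:turn.example.com:3478",   // TURN: relay media when direct paths fail
      username: "streaming-user",
      credential: "replace-me",
    },
  ],
});

pc.oniceconnectionstatechange = () => {
  // A "failed" state here typically means NAT traversal did not succeed and a
  // reachable TURN relay is required.
  console.log("ICE connection state:", pc.iceConnectionState);
};
```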
The native pixel streaming solution lacks a commercial-grade management dashboard for features like load monitoring, access control, and data analytics, requiring developers to build these functionalities from scratch. In contrast, mature solutions such as DRTStreamer provide an all-in-one management platform that offers real-time GPU server monitoring, concurrency settings, bitrate adjustments, and more.
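The server-side dashboards described above are out of scope here, but as a rough sketch of the kind of per-session metric such a platform aggregates, a client can sample inbound bitrate and frame rate through the standard WebRTC statistics API:

```typescript
// Hedged sketch: sample inbound video bitrate and frame rate once per second,
// the kind of per-session health metric a management dashboard would collect.
function sampleStreamHealth(pc: RTCPeerConnection): void {
  let lastBytes = 0;
  setInterval(async () => {
    const report = await pc.getStats();
    report.forEach((stat) => {
      if (stat.type === "inbound-rtp" && stat.kind === "video") {
        const bytes = stat.bytesReceived ?? 0;
        const kbps = ((bytes - lastBytes) * 8) / 1000; // bits over a 1 s window
        lastBytes = bytes;
        console.log(`video: ${kbps.toFixed(0)} kbps, ${stat.framesPerSecond ?? "?"} fps`);
      }
    });
  }, 1000);
}
```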
Moreover, most pixel streaming solutions have limited extensibility. They lack advanced features like integrated cloud-based audio/video calls and group collaboration, which are essential for complex use cases such as industrial simulations and remote training. Additionally, when multiple users operate independently, resource contention often occurs, preventing true parallel interaction.
| Features | Pixel Streaming | Real-time Cloud Rendering |
| --- | --- | --- |
| Rendering location | Server side | Server side |
| Streaming type | UE projects | All 3D projects and software |
| Protocol | WebRTC | WebRTC, DLCA (Dolit’s self-developed low-latency transmission protocol), RTMP, SRT |
| Client platforms | Browser (H5) | Browsers, mobile devices, desktop applications |
| Latency | 50ms-200ms | 10ms-60ms |
| Performance scalability | Single-server deployment with limited scalability | Supports cluster deployment with strong scalability |
| Convenience | Low | Easy to use; zero-code deployment |
| Use case | Basic experience; requires further development for commercial use | Cloud gaming, digital twins, virtual simulation, Metaverse, large-scale concurrency |
Real-time cloud rendering is a broader concept that goes beyond pixel streaming’s video-based approach, incorporating technologies such as network transmission optimization and automatic load balancing. Compared to pixel streaming, professional real-time cloud rendering services like DRTStreamer offer lower latency, high-quality real-time interaction, and dynamic GPU resource allocation, enabling “multi-instance per GPU” deployments that significantly increase concurrency. Stability issues that arise during long-term operation are also mitigated, allowing for extended, uninterrupted performance. Additionally, these solutions support multiple engines and protocols, provide APIs and SDKs, and offer private deployment options for seamless integration.
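To illustrate what “multi-instance per GPU” with dynamic allocation can look like, here is a deliberately simplified scheduler sketch; it is an assumption about one possible strategy, not a description of DRTStreamer's internal scheduler.

```typescript
// Hedged sketch of dynamic session-to-GPU allocation: place each new session on
// the GPU with the fewest active instances, capped per GPU. A real scheduler
// would also weigh VRAM, encoder sessions, and load history.
interface Gpu {
  id: string;
  activeInstances: number;
  maxInstances: number; // how many renderer instances one GPU is allowed to host
}

function allocateSession(gpus: Gpu[]): Gpu | null {
  const candidates = gpus.filter((g) => g.activeInstances < g.maxInstances);
  if (candidates.length === 0) return null; // cluster is full; queue or scale out
  const target = candidates.reduce((a, b) =>
    a.activeInstances <= b.activeInstances ? a : b
  );
  target.activeInstances += 1;
  return target;
}

// Example: three GPUs, four instances each -> up to 12 concurrent sessions per node.
const pool: Gpu[] = [
  { id: "gpu-0", activeInstances: 3, maxInstances: 4 },
  { id: "gpu-1", activeInstances: 1, maxInstances: 4 },
  { id: "gpu-2", activeInstances: 4, maxInstances: 4 },
];
console.log(allocateSession(pool)?.id); // "gpu-1"
```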