
Virtual Production

Stewart Platform
Digital Twin & Control for Unreal Engine

Introduction

I’m passionate about building tools that blend robust engineering with smooth user experiences. This project showcases my journey developing an interface plugin that bridges the gap between digital environments and physical motion systems: a seamless Unreal Engine-based control and visualisation system for Stewart motion platforms.

Context
  • What’s the Stewart Platform?
    A 6-DOF motion platform used in flight simulators, vehicle testing, and entertainment.

  • Why is integration hard?

    • Real-time communication is tricky (UDP handling, data rates).

    • Visualisation and control need precise sync.

    • Exporting motion data from Previz for analysis or playback adds extra complexity.

  • Why does it matter?
    Mismatches between physical movement and virtual state lead to unsafe tests, inaccurate simulations, or broken user experiences.

Problem Setting / Research Question
  1. How can we integrate real-time hardware control inside Unreal?

  2. What communication challenges arise (latency, packet drops)?

  3. How can we visualise live motion states (digital twin)?

  4. What kinds of data do users want to export for testing and analysis?

  5. What existing tools are used, and where do they fall short?

  6. What features matter most for VP supervisors, VFX supervisors, DoPs, and operators?

Secondary Research
Understanding typical Stewart Platform workflows (motion scripts, real-time sync, safety cutoffs)

1. Existing motion-control ecosystems

I dug through the catalogues of leading Stewart Platform suppliers: VHT and Symetrie, whose 6-axis controller units drive flight simulators, and classroom-scale rigs such as Acrome’s desktop Stewart Platform. All of them share three design pillars:

  • Parallel kinematics with six linear actuators for compact stiffness and high payload-to-weight ratios.

  • Dedicated real-time servo electronics that expose a proprietary command set (often over UDP or EtherCAT).

  • Safety interlocks and limit-checking baked in at the drive level, so the host PC can focus on the trajectory instead of watchdogs.

These commercial platforms are robust but also isolated. None of them can talk natively to a real-time game engine, and their “visualisers / digital twins” are 1990s OpenGL windows at best. Most units, for instance, ship with a simple MATLAB demo but no immersive preview.

2. Why plugging real hardware into a game engine is harder than it looks

Typical simulators render at 60–120 fps, but a Stewart Platform’s servo loop runs at 500–1000 Hz. That mismatch surfaces three problems:

  • Latency & Determinism (Smooth UDP Packets): Unreal’s fluctuating frame budget made our UDP packet timing jitter. My solution was to push these packets from a dedicated, high-priority thread to ensure consistent delivery.
     

  • Coordinate-Frame Gymnastics (Unit & Rotation Headaches): The hardware expects positions in millimetres and Euler angles relative to its base, while the engine uses centimetres and often quaternions for rotation. One tiny rounding error in these conversions could make the rig slam a strut, so precision was critical here.
     

  • Haptic Realism (Ultra-Fast Motion Cues): For truly immersive experiences, simulators must deliver haptic onset cues in under 50 milliseconds, a key benchmark noted by pilots. Existing literature suggests that classic wash-out filter algorithms often fall short with modern agile aircraft models, leading recent research to focus on model-predictive control techniques implemented on the platform side.
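The unit and rotation conversions described above can be sketched as follows. This is a minimal illustration, not the plugin's actual conversion code: the function names, the ZYX Euler convention, and the assumption of a unit quaternion are mine.

```python
import math

UE_CM_TO_MM = 10.0  # Unreal works in centimetres; the rig expects millimetres


def position_to_rig(x_cm, y_cm, z_cm):
    """Convert an engine-space position (cm) to rig units (mm)."""
    return (x_cm * UE_CM_TO_MM, y_cm * UE_CM_TO_MM, z_cm * UE_CM_TO_MM)


def quat_to_euler_zyx(w, x, y, z):
    """Convert a unit quaternion to (yaw, pitch, roll) in radians, ZYX order."""
    # Roll about X
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    # Pitch about Y, clamped to avoid math domain errors near gimbal lock
    s = max(-1.0, min(1.0, 2.0 * (w * y - z * x)))
    pitch = math.asin(s)
    # Yaw about Z
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return yaw, pitch, roll
```

Clamping the pitch argument before `asin` is exactly the kind of defensive detail the bullet above warns about: a value of 1.0000001 from floating-point drift would otherwise raise an exception mid-stream.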

Operational Excursion Limits Table

Excursions are measured relative to the Motion Reference Point with the motion system in its neutral (home) position. From this neutral position, the system supports both non-simultaneous and simultaneous maximum excursions.

"Non-simultaneous" refers to motion along one degree of freedom while all others are at zero. "Simultaneous" refers to motion where other degrees of freedom are unconstrained.

Under normal operating conditions, actuators remain outside their cushioning stroke, and the system is capable of achieving its full operational excursion limits. Values provided in the table may vary by up to ±1%.

[Table: per-axis operational excursion limits]
Primary Research

The goal of the primary research is to understand the day-to-day patterns of typical Stewart Platform workflows.

Problem Statement

Virtual-production and VFX teams lack an integrated, low-latency way to drive and visualise a physical VHT Stewart Platform directly inside Unreal Engine. Existing motion-control software runs as separate standalone applications, offers only rudimentary 2-D previews, and cannot share live kinematic data with the real-time engine. This fragmentation forces crews to juggle scripts, safety consoles, and match-move checks across multiple systems, leading to pose drift, missed limits, time-consuming re-takes and, in the worst cases, safety risks.

Solution

Identified Needs

  • A unified, in-engine tool that streams real-time control

  • A 1-to-1 digital twin that captures synchronised motion data

  • Restored confidence: setup time is cut and the physical and virtual worlds stay perfectly in step

Overview

My Unreal-Engine plugin unifies control, visualisation, and data capture for the VHT Stewart Platform so that filmmakers and VFX teams can work from a single, live source of truth.

Core Components
  • StewartPlatformComponent (C++ / Blueprint-exposed) - handles network sockets, safety filtering, and command generation.

  • DigitalTwinActor - renders the live pose, limit volumes, and ghost overlay.

  • DataRecorder - buffers high-rate telemetry, then flushes synced CSV/JSON/FBX on cut.

  • Operator UI (UMG)
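The DataRecorder idea above, buffer cheaply at servo rates and only serialise on cut, can be sketched like this. The field layout and method names are illustrative assumptions, not the plugin's real interface.

```python
import csv
import io


class DataRecorder:
    """Buffers high-rate pose telemetry in memory, flushes as CSV on 'cut'.

    A minimal sketch of the recorder concept; the plugin's actual
    class also emits JSON and FBX and timestamps against engine time.
    """

    FIELDS = ("t", "x", "y", "z", "roll", "pitch", "yaw")

    def __init__(self):
        self._buffer = []

    def record(self, t, x, y, z, roll, pitch, yaw):
        # Appending to a list is O(1) amortised, so recording stays
        # cheap even at servo-loop rates; no I/O happens here.
        self._buffer.append((t, x, y, z, roll, pitch, yaw))

    def flush_csv(self):
        """Return the buffered samples as CSV text and clear the buffer."""
        out = io.StringIO()
        writer = csv.writer(out)
        writer.writerow(self.FIELDS)
        writer.writerows(self._buffer)
        self._buffer.clear()
        return out.getvalue()
```

Deferring serialisation to the cut keeps disk and string-formatting costs out of the hot recording path, which matters when telemetry arrives far faster than the render loop.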

Plugin Feature Summary

1. Core System Control & State Management

The plugin will provide functions to manage the motion platform's operational state.

  • GoToEngaged(): Activates the platform to follow motion cues.

  • GoToNeutral(): Commands the platform to its central, neutral position.

  • GoToSettled() / GoToOff(): Commands the platform to its settled or off state.

  • AcknowledgeFault(): Resets system errors after they have been resolved.

  • FreezeMotion() / UnfreezeMotion(): Commands the platform to hold its current position.
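The state commands above imply a small state machine in which each command is only legal from certain states. The sketch below illustrates guarding commands against the current state; the specific transition rules are my assumption, not the motion computer's documented behaviour.

```python
from enum import Enum, auto


class PlatformState(Enum):
    OFF = auto()
    SETTLED = auto()
    NEUTRAL = auto()
    ENGAGED = auto()
    FAULT = auto()


# Allowed transitions -- an assumed ruleset, sketched only to show the
# idea of rejecting commands that are illegal from the current state.
ALLOWED = {
    PlatformState.OFF:     {PlatformState.SETTLED},
    PlatformState.SETTLED: {PlatformState.NEUTRAL, PlatformState.OFF},
    PlatformState.NEUTRAL: {PlatformState.ENGAGED, PlatformState.SETTLED},
    PlatformState.ENGAGED: {PlatformState.NEUTRAL},
    PlatformState.FAULT:   {PlatformState.SETTLED},  # after AcknowledgeFault()
}


def try_transition(current, target):
    """Return the new state if the transition is legal, else raise."""
    if target not in ALLOWED[current]:
        raise ValueError(f"illegal transition {current.name} -> {target.name}")
    return target
```

Validating transitions on the plugin side gives the operator UI immediate feedback instead of waiting for the motion computer to reject a command.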

2. Real-Time Motion Control (via UDP)

The plugin will offer two distinct methods for controlling platform movement in real-time:

Indirect Cueing (Using Washout Filters):
This method is for realistic simulation cueing. The plugin will send vehicle physics data to the motion computer, and the internal washout filters will translate it into motion.
Inputs: specific forces (or total accelerations), rotational accelerations, and rotational velocities.
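The principle behind washout filtering can be illustrated with a first-order high-pass filter on a single acceleration channel: transient onsets pass through to the platform, while sustained input decays so the platform drifts back toward neutral. This is a toy sketch; the filters on the actual motion computer are higher-order and carefully tuned.

```python
import math


class HighPassWashout:
    """First-order high-pass washout for one acceleration channel.

    Illustrative only: real washout algorithms are higher-order and
    coordinate translational with rotational (tilt) cueing.
    """

    def __init__(self, cutoff_hz, dt):
        rc = 1.0 / (2.0 * math.pi * cutoff_hz)
        self.alpha = rc / (rc + dt)  # filter coefficient for step interval dt
        self.prev_in = 0.0
        self.prev_out = 0.0

    def step(self, sample):
        # Standard discrete first-order high-pass update
        out = self.alpha * (self.prev_out + sample - self.prev_in)
        self.prev_in = sample
        self.prev_out = out
        return out
```

Feeding a constant acceleration in shows the defining behaviour: the first output is large (the onset cue), then the output decays toward zero, freeing excursion budget for the next cue.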

Direct Platform Control (Using _EXTRA Setpoints):
This method provides direct, responsive control by bypassing the main washout filters. The plugin will send explicit commands for the platform's target state in all six degrees of freedom.

Inputs:

  • Position: _EXTRA setpoints.

  • Velocity: _EXTRA_DOT setpoints.

  • Acceleration: _EXTRA_DDOT setpoints.
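Packing those three setpoint groups into a UDP datagram might look like the sketch below. The wire layout here (18 little-endian floats) is purely hypothetical; the real VHT interface defines its own datagram format, which the plugin must follow exactly.

```python
import struct

# Hypothetical layout: position, velocity, and acceleration setpoints
# for all six DOF, as 18 little-endian 32-bit floats. The actual
# _EXTRA / _EXTRA_DOT / _EXTRA_DDOT datagram format is defined by the
# motion computer's interface documentation, not by this sketch.
SETPOINT_FORMAT = "<18f"


def pack_setpoints(extra, extra_dot, extra_ddot):
    """Pack three 6-DOF tuples (surge, sway, heave, roll, pitch, yaw)."""
    assert len(extra) == len(extra_dot) == len(extra_ddot) == 6
    return struct.pack(SETPOINT_FORMAT, *extra, *extra_dot, *extra_ddot)


def unpack_setpoints(payload):
    """Inverse of pack_setpoints, useful for loopback testing."""
    values = struct.unpack(SETPOINT_FORMAT, payload)
    return values[0:6], values[6:12], values[12:18]
```

Fixing the byte order explicitly ("<") is important: the host PC and the motion computer may disagree on native endianness, and a silent byte swap would send the rig somewhere unintended.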

3. Special Effects Control (via UDP)

The plugin will provide functions to trigger and control all the available special effects to enhance realism:

  • Frequency Spectrum: Control the 20 sine wave generators to create effects like turbulence, buffets, and runway roughness.

  • Bumps: Trigger up to 5 pre-stored waveforms with a specified amplitude to create effects like landing bumps or track-switching jolts.

  • Direct Setpoints: Send small, direct position offsets (X_DIRECT, Y_DIRECT, Z_DIRECT) to create sharp cues like runway ruts or track joints.

  • Road Rumble: A function to switch the road rumble effect on or off, which is then automatically scaled by vehicle speed on the motion computer.

  • Cockpit Shaker Compensation: A flag to activate a compensation routine to reduce wear when a physical cockpit shaker is active.
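The frequency-spectrum effect is essentially a bank of sine generators summed into a position offset. The sketch below shows that composition; the generator parameters and function name are illustrative, since the real generators run on the motion computer.

```python
import math


def spectrum_effect(t, generators):
    """Sum a bank of sine generators at time t (seconds).

    `generators` is a list of (amplitude, frequency_hz, phase) tuples --
    a stand-in for the motion computer's 20 generators, used here only
    to show how turbulence- or rumble-like offsets can be composed.
    """
    return sum(a * math.sin(2.0 * math.pi * f * t + p)
               for a, f, p in generators)
```

Low-frequency, high-amplitude generators give slow swells like turbulence; high-frequency, low-amplitude ones give buffets and runway roughness, so one mechanism covers several of the effects listed above.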
