
Virtual Production
Stewart Platform
UDP Control
eMotion-2700-6DOF-650-MK1
Aditya Sisodiya
Introduction
I’m passionate about building tools that blend robust engineering with smooth user experiences. This project showcases my journey developing an interface plugin that bridges the gap between digital environments and physical motion systems, creating a seamless Unreal Engine-based control and visualisation system for Stewart motion platforms.
Platform Reference Data
MRP: 153.7 mm below the top, 148 mm in front of the geometric centre
Height from ground: 1078 mm (1103 mm) / 1449 mm (+25)
Settled height: 1231 mm (+25)
Neutral height: 1603 mm (+25)
Guide to Motion System Tx Packet
Let's outline the structure of the 508-byte data packet, showing how human-readable inputs are converted into their 4-byte hexadecimal representation for transmission.
Data Format: Little-Endian
Example Scenario: A command to "Engage" and "Heave Up" by 0.2 meters.
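Every numeric field in the packet follows the same recipe: take the human-readable value and encode it as a 4-byte little-endian IEEE 754 float. A minimal sketch of the 0.2 m heave value from the scenario above:

```python
import struct

# Encode a human-readable value as the 4-byte little-endian float
# used for the numeric fields of the 508-byte Tx packet.
heave_up = 0.2  # meters
raw = struct.pack("<f", heave_up)  # little-endian 32-bit float
print(raw.hex())  # -> 'cdcc4c3e'
```

Note the byte order: the float 0.2 is 0x3E4CCCCD, but on the wire the least-significant byte (0xCD) goes first.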
Section 1: Packet Header (Items 1-2)
This section is the mandatory "envelope" for every command. In every scenario, the host must populate these fields so the Motion Computer can sequence the incoming data (COUNT) and confirm that the system is in the correct operational state (CMND). For example, before any motion can occur, a packet must be sent with the CMND field set to 7 to "Engage" the system.
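The engage step can be sketched as follows. This assumes Items 1-2 are 4-byte little-endian integers at the very start of the packet (COUNT, then CMND); the byte offsets are illustrative, not taken from the interface document.

```python
import struct

PACKET_SIZE = 508  # total Tx packet size in bytes

def build_engage_packet(frame_count: int) -> bytes:
    """Minimal sketch: a header-only packet that engages the system.

    Assumes Items 1-2 are consecutive 4-byte little-endian integers
    at offset 0 (COUNT) and offset 4 (CMND); all other fields are
    left zero-filled.
    """
    buf = bytearray(PACKET_SIZE)
    struct.pack_into("<I", buf, 0, frame_count)  # Item 1: COUNT
    struct.pack_into("<I", buf, 4, 7)            # Item 2: CMND = 7 ("Engage")
    return bytes(buf)

pkt = build_engage_packet(1)
```

The host would send a packet like this (and keep COUNT incrementing on every frame) before issuing any motion commands.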
Section 2: Primary Motion Inputs (Items 3-23)
This section is designed for full-fidelity flight or driving simulators. In this scenario, the host computer is a powerful machine running a detailed physics model of a vehicle. It continuously calculates the forces and rotations the vehicle is experiencing. It then sends these values (Specific Forces, Rotational Velocities, etc.) in these fields to the motion system. The motion system's internal "washout filters" then interpret this data to create realistic motion cues that mimic the feel of flying the plane or driving the car. For simple, direct movements, these fields are not used.
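As a sketch of how a physics model would populate this section, the helper below packs per-item floats into the buffer. The item layout (consecutive 4-byte little-endian floats starting right after an 8-byte header) is an assumption for illustration; the real offsets come from the interface document.

```python
import struct

ITEM_SIZE = 4    # assumed: each item is a 4-byte little-endian float
HEADER_SIZE = 8  # assumed: Items 1-2 occupy the first 8 bytes

def pack_item(buf: bytearray, item_no: int, value: float) -> None:
    """Write one primary motion input (Items 3-23) into the packet."""
    offset = HEADER_SIZE + (item_no - 3) * ITEM_SIZE
    struct.pack_into("<f", buf, offset, value)

buf = bytearray(508)
pack_item(buf, 3, 0.0)    # e.g. longitudinal specific force (m/s^2)
pack_item(buf, 4, 0.0)    # e.g. lateral specific force
pack_item(buf, 5, -9.81)  # e.g. vertical specific force (1 g, level flight)
```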
Section 3: Flags & Special Effect Triggers (Items 24-29)
This section is for controlling discrete, event-based effects. For the Flags field, a scenario would be a flight simulator's aircraft touching down; the host sets a "Ground Mode" bit, and the motion system can switch to a different tuning profile better suited for ground handling. For the Bump fields, a scenario would be a train simulator crossing a track switch; the host sends 80.0 to Bump #1 to trigger a pre-recorded lateral jolt at 80% intensity, making the event feel authentic.
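The track-switch example can be sketched as a single field write. The byte offset of the Bump #1 field is a hypothetical placeholder here; the real position of Items 24-29 must come from the interface document.

```python
import struct

BUMP1_OFFSET = 100  # hypothetical byte offset of Bump #1 in the Tx packet

def set_bump(buf: bytearray, intensity_pct: float) -> None:
    """Trigger the pre-recorded Bump #1 effect at the given intensity."""
    struct.pack_into("<f", buf, BUMP1_OFFSET, intensity_pct)

buf = bytearray(508)
set_bump(buf, 80.0)  # fire the lateral jolt at 80% intensity
```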
Bump Information
Section 4: Frequency Spectrum for Vibrations (Items 30-109)
This large section is used to add layers of "texture" and vibration to the motion. The primary scenario is simulating air turbulence or road roughness. For example, to simulate light turbulence, the host computer would activate several of the 20 available sine-wave generators by sending non-zero YGAIN and ZGAIN values for various frequencies. The motion computer then sums these sine waves together to create a complex, random-feeling buffeting motion.
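The summing described above can be modelled as a simple superposition of sine waves. The generator tuples below (frequency, gain, phase) are illustrative; only the YGAIN/ZGAIN naming comes from the text.

```python
import math

def buffet(t: float, generators) -> float:
    """Sum of active sine-wave generators at time t.

    generators: list of (freq_hz, gain, phase_rad) tuples, standing in
    for the up-to-20 generators the motion computer can run.
    """
    return sum(g * math.sin(2 * math.pi * f * t + p)
               for f, g, p in generators)

# Three generators with small gains produce a light, random-feeling buffet.
gens = [(2.0, 0.02, 0.0), (5.5, 0.01, 1.3), (9.0, 0.005, 2.1)]
z_offset = buffet(0.1, gens)
```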
Section 5: Additional Setpoints for Direct Control (Items 110-127)
The primary use case for this section is diagnostics, testing, and special non-physics-based effects. For example, a maintenance engineer needs to verify the full travel of the heave (up/down) axis. Instead of running a complex simulator, they use a simple script (like motiontester6.py) to send packets where only Item #112 (Z_EXTRA) is active, commanding the platform to move from its bottom limit to its top limit. This allows for direct, isolated testing of the hardware.
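A motiontester6.py-style heave test packet might be built like this. Only the header and Item #112 (Z_EXTRA) are populated; the Z_EXTRA byte offset below is an assumption, not taken from the interface document.

```python
import struct

Z_EXTRA_OFFSET = 444  # hypothetical byte offset of Item #112 (Z_EXTRA)

def heave_packet(count: int, z_m: float) -> bytes:
    """Direct heave command: header plus a single active setpoint."""
    buf = bytearray(508)
    struct.pack_into("<I", buf, 0, count)             # Item 1: COUNT
    struct.pack_into("<I", buf, 4, 7)                 # Item 2: CMND (engaged)
    struct.pack_into("<f", buf, Z_EXTRA_OFFSET, z_m)  # Item 112: Z_EXTRA
    return bytes(buf)

# A travel-verification sweep would send these packets over UDP at the
# control rate, e.g. socket.sendto(heave_packet(i, z), (ip, port)),
# stepping z from the bottom limit to the top limit.
```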
Subsection 5.1: Position Commands
Subsections 5.2 & 5.3: Velocity and Acceleration Commands
Guide to Motion System Rx Packet
Let's outline the structure of the 152-byte data packet sent from the Motion Computer back to the Host Computer. This packet provides critical feedback on the system's status, its internal targets, and its actual, physically measured state.
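Decoding starts by validating the size and pulling out the header. The sketch below assumes Frame Count and Motion Status are the first two 4-byte little-endian integers of the reply; the actual offsets come from the interface document.

```python
import struct

RX_PACKET_SIZE = 152

def parse_rx_header(data: bytes):
    """Return (frame_count, motion_status) from an Rx packet.

    Assumes Items 1-2 are consecutive 4-byte little-endian unsigned
    integers at the start of the packet.
    """
    if len(data) != RX_PACKET_SIZE:
        raise ValueError(f"unexpected Rx packet size: {len(data)}")
    frame_count, motion_status = struct.unpack_from("<II", data, 0)
    return frame_count, motion_status
```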
Section 1: Packet Header & Status (Items 1-2)
This section is the mandatory header for every reply packet and is the first thing a host application should decode. The Frame Count is used to ensure no reply packets have been lost in transmission. The Motion Status is the single most important field for diagnostics; it's a 32-bit integer packed with information about the system's operational state (e.g., Neutral, Engaged, Error), specific warning codes, and the status of safety circuits. The host must continuously monitor the Motion Status to understand the system's health.
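Unpacking the Motion Status word might look like the sketch below. The bit layout here (a low state nibble plus error and safety flags) is entirely hypothetical; the real bit assignments must be read from the interface document.

```python
# Hypothetical bit layout for the 32-bit Motion Status word.
STATE_MASK = 0x0000000F  # assumed: low bits hold the state (Neutral, Engaged, ...)
ERROR_BIT  = 1 << 8      # assumed: error flag
ESTOP_BIT  = 1 << 9      # assumed: safety-circuit (e-stop) flag

def decode_status(word: int) -> dict:
    """Split a Motion Status word into state and flag fields."""
    return {
        "state": word & STATE_MASK,
        "error": bool(word & ERROR_BIT),
        "estop": bool(word & ESTOP_BIT),
    }
```

A host application would call this on every received packet and raise an alert the moment `error` or `estop` goes true.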
Section 2: Commanded Setpoints (Items 3-14)
This block of data reveals the motion computer's internal targets. The Platform setpoint represents the overall position/orientation that the system's main filters are commanding. The Special effect setpoint isolates the portion of that command coming only from effects like bumps or direct inputs. A key scenario is diagnostics and visual compensation. A developer can compare the Platform setpoint to the Actual platform position (from Section 4) to analyze how well the physical system is tracking the commands. The Special effect setpoint is used to synchronize the visual display with a physical jolt, preventing motion sickness.
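The tracking-analysis scenario reduces to a per-axis difference. The 6-tuple layout (x, y, z, roll, pitch, yaw) is an assumption for illustration; the field names follow the text.

```python
def tracking_error(setpoint, actual):
    """Per-axis difference between the commanded Platform setpoint
    (Section 2) and the Actual platform position (Section 4).

    setpoint/actual: (x, y, z, roll, pitch, yaw) tuples.
    """
    return tuple(a - s for s, a in zip(setpoint, actual))

# Commanded 0.2 m heave, measured 0.18 m: the platform lags by 2 cm.
err = tracking_error((0, 0, 0.2, 0, 0, 0), (0, 0, 0.18, 0, 0, 0))
```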
Section 3: Actual Actuator Positions (Items 15-20)
This data is primarily for low-level diagnostics and maintenance. It provides the real, measured extension of each individual actuator (leg) of the motion platform. A maintenance engineer would use this data to verify that all actuators are responding correctly to commands. If the platform wasn't moving as expected, they could check these values to see if a specific actuator is stuck or lagging behind the others.
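Reading the six actuator extensions can be sketched as one unpack. The byte offset of Item 15 within the 152-byte reply is a hypothetical placeholder here.

```python
import struct

ACTUATOR_OFFSET = 56  # hypothetical byte offset of Item 15 in the Rx packet

def actuator_positions(data: bytes):
    """Return the six measured actuator extensions (Items 15-20),
    assuming consecutive 4-byte little-endian floats."""
    return struct.unpack_from("<6f", data, ACTUATOR_OFFSET)
```

Plotting these six values side by side during a maneuver makes a stuck or lagging actuator immediately visible.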
Section 4: Actual Platform State (Position, Velocity, & Acceleration) (Items 21-38)
This section provides the "ground truth" of the platform's motion, measured by physical sensors and calculated by the motion computer.
- Actual Position (Items 21-26): The primary scenario is view compensation for simulators where the screen is not mounted on the moving platform (e.g., a projection dome). The host uses the platform's actual position to adjust the rendered image, ensuring the horizon appears stable to the user even as the platform under them moves.
- Actual Velocity & Acceleration (Items 27-38): The scenario for this data is performance analysis and data logging. An engineer can record this data during test maneuvers to verify that the platform is meeting the required velocity and acceleration specifications for certification or for fine-tuning the system's performance.
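The specification-check described above can be sketched as a per-axis peak scan over logged samples. The 6-tuple axis ordering is assumed for illustration.

```python
def peak_values(samples):
    """Per-axis peak magnitude over logged velocity or acceleration
    samples, e.g. to verify the platform meets its specification.

    samples: iterable of 6-tuples (surge, sway, heave, roll, pitch, yaw).
    """
    peaks = [0.0] * 6
    for s in samples:
        for i, v in enumerate(s):
            peaks[i] = max(peaks[i], abs(v))
    return peaks
```

After a test maneuver, each peak is compared against the certified limit for that axis.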