3D View is a Studio plugin designed to visualize and explore data from various sensing modalities. It can be used to visualize most types of both live and recorded sensor data.
By default, sensor data is visualized in the vehicle-centered coordinate frame, with the origin at the center of the rear axle, on the ground.
The 3D View plugin can be loaded from the plugin selector on the right-hand sidebar in Studio.
The perspective in 3D View is initialized at the origin of the coordinate frame in perspective mode. Tilt and pan the perspective by clicking and dragging in the plugin window. Preset perspective views, such as "Birds Eye" and "Perspective", can be toggled from the control panel or by right-clicking in the 3D View plugin.
There are a few keyboard commands that offer control over the perspective of the camera.
|Key|Action|
|---|---|
|Shift|Pan view position|
1. Drawable primitive types
The left-hand sidebar has settings for controlling how data is drawn in the 3D View plugin.
1.1 Lane model
ps_lane_model_msg is always drawn as yellow lines with the curvature and distance-from-origin as defined in the incoming message.
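As a sketch of how a lane line defined by curvature can be traced out from the origin, the function below uses the common cubic (clothoid) approximation of a lane model. The coefficient names here are illustrative assumptions, not the actual `ps_lane_model_msg` field names.

```cpp
#include <cmath>

// Lateral offset y at longitudinal distance x ahead of the origin, using the
// common cubic approximation of a clothoid lane model:
//   y(x) = offset + heading*x + (curvature/2)*x^2 + (curvatureRate/6)*x^3
// Parameter names are illustrative, not ps_lane_model_msg fields.
double laneLateralOffset(double x, double offset, double heading,
                         double curvature, double curvatureRate) {
    return offset + heading * x
         + 0.5 * curvature * x * x
         + curvatureRate * x * x * x / 6.0;
}
```

Evaluating this at increasing `x` yields the polyline points that a renderer could draw as the yellow lane lines.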
1.2 LiDAR points
LiDAR points are represented as small dots, and come from the ps_lidar_points_msg.
1.3 Objects
Objects are drawn as rectangles with the width, height, and length as defined in the ps_objects_msg.
Objects have many classification types, which can be toggled from the left-hand control panel.
1.4 RADAR targets
RADAR targets are drawn as spheres, or orbs, within 3D View. By default, each RADAR source (sensor) produces spheres of a different color.
Targets in a ps_radar_targets_msg with any of the following range type, track status, or quality values are not drawn in 3D View:

target.getRangeType() == RANGE_INVALID or
trackStatus == TRACK_STATUS_INVALID or
trackStatus == TRACK_STATUS_NO_TRACK or
trackStatus == TRACK_STATUS_INVALID_COASTED or
target.getQuality() == QUALITY_INVALID
By default, RADAR target sizes within 3D View are scaled by their amplitude. The larger the amplitude returned, the larger the sphere is drawn.
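The draw filter and amplitude scaling above can be sketched as a self-contained predicate. The `RadarTarget` struct and the linear `sphereRadius` scaling below are illustrative assumptions that mirror the fields the plugin checks; the real values come from a ps_radar_targets_msg via the PolySync API.

```cpp
// Hypothetical mirror of the fields 3D View checks on each RADAR target.
enum RangeType { RANGE_INVALID, RANGE_VALID };
enum TrackStatus { TRACK_STATUS_INVALID, TRACK_STATUS_NO_TRACK,
                   TRACK_STATUS_INVALID_COASTED, TRACK_STATUS_ACTIVE };
enum Quality { QUALITY_INVALID, QUALITY_VALID };

struct RadarTarget {
    RangeType rangeType;
    TrackStatus trackStatus;
    Quality quality;
    double amplitude;  // return amplitude; drives sphere size
};

// A target is skipped when any of the invalid conditions from the docs hold.
bool isDrawn(const RadarTarget& t) {
    return t.rangeType != RANGE_INVALID
        && t.trackStatus != TRACK_STATUS_INVALID
        && t.trackStatus != TRACK_STATUS_NO_TRACK
        && t.trackStatus != TRACK_STATUS_INVALID_COASTED
        && t.quality != QUALITY_INVALID;
}

// Sphere radius grows with amplitude (illustrative linear scaling; the
// plugin's actual scaling function is not documented here).
double sphereRadius(const RadarTarget& t, double base = 0.1, double gain = 0.05) {
    return base + gain * t.amplitude;
}
```

A target that fails any single check is dropped entirely rather than drawn at a reduced size.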
Zones are rendered as polygons using data defined in
ps_zones_msg. They are rendered in red and orange (think proximity sensor). Several supported sensors output this message type.
polysync::datamodel::ZonesMessage contains an array of
PsZone objects that contain the data used to render zones. Read and write data from and to the message using its getter and setter methods.
When an application is publishing the
ps_zones_msg, it is important to understand what to expect when using the 3D View plugin to visualize the data. Below is a list of rules the plugin uses when drawing zone data.
- The zone quality is a ps_quality_kind and must not be QUALITY_INVALID.
- A std::array<double, 2> that represents the start and stop points of the zone along the x-axis, relative to the zone orientation. Index 1 of the array must be greater than 0 and not equal to PSYNC_RANGE_NOT_AVAILABLE.
- A std::array<double, 2> that represents the start and stop field-of-view angles, in radians, about the z-axis.
- A std::array<double, 2> that represents the start and stop field-of-view angles, in radians, about the y-axis.
- No angle value can equal PSYNC_ANGLE_NOT_AVAILABLE.
- It does not matter which index contains the start or stop angle.
- Drivers generally provide one positive angle (0 → π, left of orientation) and one negative angle (0 → −π, right of orientation).
- Another option is to provide two positive values from 0 to 2π, following the right-hand rule (π/2 (90°) on the left side of the circle, 3π/2 (270°) on the right side).
- The field of view, or difference between the start and stop angles, is assumed to be less than π (180°).
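The angle rules above can be sketched as a small validity check. This is an illustrative interpretation, not the plugin's actual code: it accepts either convention (one positive and one negative angle, or two positives in [0, 2π)), treats index order as irrelevant, and requires the resulting span to be less than π.

```cpp
#include <array>
#include <cmath>

constexpr double kPi = 3.14159265358979323846;

// Field of view spanned by a start/stop angle pair, in radians.
// Order does not matter. A pair given as two positives in [0, 2π) may wrap
// through 0, in which case the shorter arc is taken.
double fieldOfView(const std::array<double, 2>& angles) {
    double diff = std::fabs(angles[0] - angles[1]);
    // A raw difference above π means the pair wraps around the circle.
    if (diff > kPi) diff = 2.0 * kPi - diff;
    return diff;
}

// A pair is drawable only when its span is less than π (180°).
bool isValidFov(const std::array<double, 2>& angles) {
    return fieldOfView(angles) < kPi;
}
```

For example, a zone from 0.5 rad left to 0.5 rad right of the orientation spans 1.0 rad, well under the π limit.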