Vehicle Sensor Alignment Tutorial

An autonomous vehicle needs situation awareness: an accurate, continuously updated picture of the moving objects around it. This article describes the steps used to align the LiDAR, RADAR, GPS/IMU, camera, ultrasonic, and V2X sensors connected to PolySync Core so that the data they publish share a single vehicle coordinate frame.

Sensor alignment consists of the following steps:

  • Mount the sensors on the vehicle
  • Calculate the mount positions
  • Update the Core configuration
  • Use software to align the data from each sensor with the data from the other sensors on the vehicle

1. Requirements

  • The sensors are mounted on the vehicle
  • The sensors are publishing data to the PolySync Core bus
    • Visit the supported sensors page to connect and configure each sensor in the Core runtime

1.1 Tools

  • Digital angle finder
  • Tape measure
  • Sensor targets, preferably mobile
    • A box, trash can, or other small- to medium-sized object that can be easily detected with camera, LiDAR, and ultrasonic sensors
    • A dense metallic or otherwise RADAR-reflective object for RADAR sensors
  • A parking lot light pole is typically detected by both LiDAR and RADAR sensors, and can be used for final alignment validation at longer distances of 100-200 m

2. Active Coordinate Frame

PolySync Core dynamic drivers default to the standard right-handed platform coordinate frame, which is a fixed body coordinate system called the Vehicle Centered Reference Coordinate Frame (VCRCF).

Parameter ID 800 is the Active Coordinate Frame Identifier, which corresponds to the values described in the coordinate frames article.

Ensure the sensor Active Coordinate Frame Identifier value is set to 4 (representing the VCRCF) in the SDF Configurator Node Configuration section for each sensor in the runtime.

Active Coordinate Frame Identifier
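
This check happens in the SDF Configurator GUI rather than in code. Purely to make the expectation concrete, the Python sketch below (with made-up node names and values) applies the same rule: every sensor node's parameter 800 should read 4.

    # Hypothetical snapshot of each node's Active Coordinate Frame Identifier
    # (parameter ID 800). Node names and values are illustrative, not read from
    # a real SDF.
    VCRCF = 4
    node_coordinate_frames = {
        "lidar-front": 4,
        "radar-front-left": 4,
        "camera-forward": 0,  # not the VCRCF, so this node needs correcting
    }

    for node, frame_id in node_coordinate_frames.items():
        if frame_id != VCRCF:
            print(f"{node}: parameter 800 is {frame_id}, expected {VCRCF} (VCRCF)")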

3. Aligning the sensor

Each sensor node has X/Y/Z position and roll/pitch/yaw orientation parameters to represent where the hardware is mounted on the vehicle. Calculate the mount position of each sensor and update the Configurator node entries.
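
These six values define a rigid transform from each sensor's own frame into the vehicle frame. The Python sketch below illustrates the kind of math the parameters drive; it assumes a common Z-Y-X (yaw, pitch, roll) rotation order, which should be confirmed against the coordinate frames article, and every numeric value in it is made up.

    import math

    def rotation_matrix(roll, pitch, yaw):
        """Rotation from sensor frame to vehicle frame, assuming a Z-Y-X
        (yaw, then pitch, then roll) order. Angles are in radians."""
        cr, sr = math.cos(roll), math.sin(roll)
        cp, sp = math.cos(pitch), math.sin(pitch)
        cy, sy = math.cos(yaw), math.sin(yaw)
        return [
            [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
            [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
            [-sp,     cp * sr,                cp * cr],
        ]

    def sensor_to_vehicle(point_sensor, mount_xyz, mount_rpy):
        """Rotate a sensor-frame point by the mount orientation, then offset
        it by the mount position to get its vehicle-frame (VCRCF) location."""
        R = rotation_matrix(*mount_rpy)
        rotated = [sum(R[i][j] * point_sensor[j] for j in range(3)) for i in range(3)]
        return [rotated[i] + mount_xyz[i] for i in range(3)]

    # Illustrative mount: 1.2 m ahead of the origin along the forward (+Y) axis,
    # 1.5 m up, yawed 0.08727 rad (about 5 degrees). Not real measurements.
    mount_xyz = (0.0, 1.20, 1.50)
    mount_rpy = (0.0, 0.0, 0.08727)

    # A detection 10 m straight ahead of the sensor, in the sensor's own frame.
    print(sensor_to_vehicle([0.0, 10.0, 0.0], mount_xyz, mount_rpy))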

3.1 Add sensor measurements to SDF

Using the diagram below as a reference, make accurate measurements of each mounted sensor's location and orientation.

Right Handed Coordinate Frame

The (0,0,0) origin is the center of the rear axle on the ground.

Use a digital angle finder to measure roll, pitch, and yaw angles for your mounted sensors.

Open the SDF Configurator and select the sensor node to be aligned.

  • In the Sensor Configuration section
    • Add the sensor's X/Y/Z measurements in meters
    • Add the roll/pitch/yaw measurements in radians
      • Use at least 5 digits of precision (more if possible) when entering angles in radians; see the conversion sketch after this list
  • Disable other nodes in the SDF
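
Most digital angle finders read in degrees, while the SDF expects radians. A minimal conversion sketch, using made-up readings:

    import math

    # Angle finder readings for one sensor, in degrees (illustrative values only).
    measured_deg = {"roll": 0.4, "pitch": -1.2, "yaw": 2.5}

    # Convert to radians and print with 5 decimal places for entry into the
    # SDF Configurator, matching the precision recommended above.
    for name, deg in measured_deg.items():
        print(f"{name}: {math.radians(deg):.5f} rad")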

3.2 Verify sensor alignment

Using a sensor target and the 3D View plugin in PolySync Core Studio, verify the sensor is aligned in the PolySync runtime.

  • Open PolySync Core Studio
  • Put Studio in Hardware mode
  • For a forward-facing sensor, place the sensor target 10 m in front of the vehicle, at X/Y/Z (0,10,0)
    • Verify you can see the sensor target at 0 degrees
    • If the sensor target is not at 0 degrees, open the System Hierarchy plugin and select the sensor node being aligned
      • Change the orientation of the sensor until the sensor target is on the Y axis in the 3D View plugin (a sketch of this correction follows the list)
      • Update the orientation value in the System Hierarchy and press Enter
      • Repeat the last two steps until the sensor target is on the Y axis
      • Update the orientation value for this sensor node in the SDF Configurator
    • If the sensor target is not displayed on the 10 m grid line
      • Verify your X/Y/Z position measurements
      • Update the position values in the System Hierarchy and press Enter
      • Repeat the last two steps until the sensor target is on the 10 m grid line
      • Update the position value for this sensor node in the SDF Configurator
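
To estimate how large a yaw change to try, compute the bearing of the displayed target off the forward axis. This sketch follows the forward-target convention used above (straight ahead is +Y) with made-up observed coordinates; the sign of the correction depends on the sensor's frame, so iterate as described in the list.

    import math

    # Where the target was physically placed: 10 m straight ahead, on the +Y axis
    # per the forward-facing example above.
    expected = (0.0, 10.0)

    # Where the target currently appears in Studio's 3D View (made-up X/Y values).
    observed = (0.6, 9.98)

    # Bearing of each point off the forward (+Y) axis.
    expected_bearing = math.atan2(expected[0], expected[1])  # 0.0 rad
    observed_bearing = math.atan2(observed[0], observed[1])  # about 0.060 rad

    # Size of the yaw adjustment to try. The sign depends on the sensor's frame,
    # so apply it in the System Hierarchy, re-check the 3D View, and repeat.
    yaw_error = observed_bearing - expected_bearing
    print(f"try adjusting yaw by about {abs(yaw_error):.5f} rad")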

4. Adding additional sensors

Use the steps in the Aligning the sensor section as a guide.

Recommended adjustments:

  • For rear-facing sensors, place the sensor target 10 m behind the vehicle, at X/Y/Z (0,-10,0)
  • When adding a front-left RADAR, set the Sensor 0 Yaw Orientation to 0.79000 radians

4.1 View object by multiple sensors

After two sensors have been aligned, verify they can see the same object at the same X/Y/Z position in PolySync Core Studio’s 3D View.

  • If they are visualizing the same object in slightly different locations, pick one sensor to be the primary sensor
    • Adjust the orientation of the secondary sensor until the object aligns in the 3D View plugin (a sketch of this adjustment follows the list)
    • Update the orientation value in the System Hierarchy and press Enter
    • Update the orientation value for this sensor node in the SDF Configurator
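
A rough way to size the adjustment is to compare where the two sensors place the target. The sketch below uses made-up positions and measures the bearing difference from the vehicle origin; strictly speaking the secondary sensor rotates about its own mount point, so treat the result as a starting value and iterate in the 3D View.

    import math

    # The same physical target as reported by two sensors, already placed in the
    # vehicle frame by the runtime (made-up X/Y values in meters).
    primary = (0.0, 10.0)     # reference sensor, treated as ground truth
    secondary = (0.35, 9.95)  # sensor being adjusted

    # Straight-line disagreement between the two reports.
    offset = math.dist(primary, secondary)

    # Bearing difference seen from the vehicle origin: a first guess at how far
    # to nudge the secondary sensor's yaw before re-checking the 3D View.
    bearing_delta = math.atan2(secondary[0], secondary[1]) - math.atan2(primary[0], primary[1])

    print(f"positions disagree by {offset:.2f} m")
    print(f"try a yaw adjustment of roughly {abs(bearing_delta):.5f} rad")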

5. Final alignment

Repeat the View object by multiple sensors steps using a sensor target at a distance of 100-200 m.

A parking lot light pole is typically detected by both LiDAR and RADAR sensors.
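
The long-range check matters because angular errors that are invisible at 10 m grow with distance. As a back-of-the-envelope sketch, a hypothetical residual yaw error of half a degree produces under 0.1 m of offset at 10 m but roughly 1.7 m at 200 m:

    import math

    # A hypothetical residual yaw error of half a degree, checked at increasing range.
    yaw_error_deg = 0.5
    for rng in (10, 100, 200):
        lateral = rng * math.tan(math.radians(yaw_error_deg))
        print(f"{rng:>3} m range -> {lateral:.2f} m lateral offset")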