PolySync Overview

This article was written for version 2.0.9 of Core.

This article will walk you through the key concepts and components of the PolySync platform.

Thanks for joining us

We built the PolySync platform after working with the world’s most advanced autonomous driving teams and realizing the need for faster app creation. We wanted autonomy system development to feel more like mobile app development: fast backend assembly that leaves you time to focus on your algorithms.

1. The basics

PolySync is a platform, or autonomy operating system, consisting of middleware and a set of software API services that are common across autonomous driving applications.

By separating the application layer from the underlying operating system (OS) and hardware, PolySync makes autonomous vehicle applications far more consistent and portable to build.


The diagram above introduces various applications that can utilize the PolySync middleware layer. These applications perform important functions for autonomous driving, like perception and control, with a hardware-agnostic approach.

Applications access hardware and sensor data through PolySync’s software APIs. Applications and logfiles are portable across any PolySync ECU, even across different architectures, thanks to the abstraction provided by the middleware layer.

Let’s take a closer look at the PolySync layer to understand what type of functionality it provides.

1.1 A deeper look

PS Functionality

As you can see from the above diagram, at the lowest level there are three types of inter-process communications (IPC):

  • Publisher/Subscriber
    • A single node can broadcast messages to many listeners simultaneously, and a single node can listen to one or many broadcasters simultaneously
  • Get/Set
    • Query a node for a specific parameter at run-time
    • Change a node’s parameter value at run-time
  • Shared Memory
    • Save bandwidth by keeping large data (for example, uncompressed image data) off the wire using shared memory segments

In addition to IPC, PolySync provides a variety of helpful services, including:

  • Hardware abstraction
    • Sensors and their recorded logfiles are portable across all supported architectures
  • Host abstraction
    • Applications written using the APIs are portable across all supported architectures
  • Time services
    • Unified time-domain across all hosts and nodes
  • Coordinate frame transformation
    • By default, all incoming sensor data is translated to the vehicle-centered coordinate frame for easy reference
  • Record and replay
    • Record raw sensor data and recreate the environment by replaying PolySync logfile sessions
  • Diagnostics
    • Fully traceable diagnostic trace for each node through the PolySync Diagnostic Trouble Codes (DTCs)
  • Analytics
    • Host and node level analytics, for example messages published per second

Now it’s time to dive into some of the finer details of PolySync communication.

2. Bus and runtime communication

The PolySync bus and runtime work together to keep your sensors and software talking.

To start this process, a PolySync runtime has to be established within the boundaries of the System Design File (SDF), which provides a convenient way to reference the full software and hardware system. The SDF represents the static runtime configuration, defining each host, every node, and all sensors connected to the autonomous vehicle system.

A runtime won’t exist until one or more nodes are executing as valid domain participants, which is done by calling the psync_init function. All nodes within the runtime communicate on a shared PolySync bus.

The PolySync bus exists on the Ethernet wire. On a distributed system, all hosts would be connected to the same Gigabit Ethernet switch.

System Architecture

In the diagram, the hardware-dependent nodes (the Dynamic Driver Interfaces) have been logically separated from the pure-software application nodes. All nodes use the PolySync C and C++ APIs to access the PolySync bus.

Information and sensor data are passed around in pre-defined PolySync messages, which are published to and subscribed from the bus using the various APIs.

The PolySync Node Scaffolding Generator can be used to generate a node to subscribe to the desired message(s).

3. Get/set and parameters

All of the provided Dynamic Driver interfaces ship with a complete parameter set, which includes parameters like mounting position, operating states, and which filters are active. This gives you the ability to manipulate these features at runtime.

Any node on the PolySync bus can utilize the get/set framework to easily implement runtime configurability. This is done on a node-by-node basis through publishing and subscribing to the ps_parameters_msg, as well as maintaining a real-time database of the node’s supported PolySync parameters.

PolySync also includes some useful tools to help you create, build, debug, and manage runtime parameters.

  • SDF Configurator
    • A utility program that defines the runtime system, including its hosts, nodes, and node parameters
  • Studio System Hierarchy plugin
    • Real-time get/set interface for active runtime nodes
    • Read/write parameters can be updated in this plugin
  • Data Model Generator
    • A utility to generate custom message types and new parameters that can be used by the publisher/subscriber network

4. Abstracted data model

PolySync’s main form of communication is a set of common messages that are broadcast and received on the PolySync bus through a publisher/subscriber model. PolySync ships with a core data model, as well as other data model modules. These cover the typical types of data seen on autonomous vehicles, plus PolySync-specific messages that enable some of the API functionality.

Big Picture

The data model provides an abstraction layer that separates data consumers (application nodes) from data producers (dynamic drivers). By default, the data within each message is in the vehicle-centered reference coordinate frame and is timestamped with UTC time. This makes it easy to swap out hardware and software without breaking the system or the developer’s code.

4.1 Example of sensor abstraction

Architecturally, the Dynamic Driver for the Lux looks like this:

Lux Node

The Dynamic Driver abstracts the “native bus” of the sensor (Ethernet and CAN) into a set of generalized data types, or messages, on the PolySync bus. Once these messages are published, they are available to any other participant on the bus. The same message types also carry data generated by other sensors or processes.

For instance, the ps_lidar_points_msg contains the LiDAR points generated by each of the LiDAR sensors on the bus. Each LiDAR point contains generalized data:

  • Sensor description
    • String name for the source sensor
  • Native timestamp
    • Some sensors provide their own timestamps
  • Echo number
    • Used for LiDAR that supports multiple returns (for example, returns through transparent objects)
  • Echo pulse width
    • Duration of energy return
  • Layer
    • Scanning LiDAR usually returns multiple “layers” of data in horizontal lines
  • Positions
    • The position of the point, returned in the vehicle-centered reference frame

There is also a header that contains:

  • Message type
    • A string identifying the message type
  • Timestamp
    • Usually a UTC timestamp that comes from GPS
  • Source GUID
    • A unique identifier for the node, generated by PolySync at runtime

5. PolySync Components

Listed below are the core tools provided with every PolySync release.

  • APIs
    • Set of APIs to enable the development of Level 4 and Level 5 autonomous vehicle applications
  • Dynamic Driver
    • Dynamically loads the .so interfaces written to communicate directly with supported sensors and actuators
    • Interfaces are provided by PolySync for each supported sensor, and they can also be developed in-house
  • Manager
    • Entry point to the PolySync runtime
    • Spawns nodes that are defined in the SDF Configurator
    • Manages the runtime wall-clock and replay clock
  • SDF Configurator
    • Defines the PolySync runtime, including all hosts and nodes in the system
    • Specifies which sensors are connected to the ECU, or which sensors will replay logged data
  • Studio
    • Plugin based visualization node
    • Subscribes to all PolySync message types
    • Data capture and logfile management


Congratulations! You now have a background in all of the core concepts and components of the PolySync platform. Head over to Getting Started to set up PolySync, from installation to visualizing data in Studio.