simulation_brain_link¶
Simulating a vehicle’s behavior involves several vehicle-specific tasks: the sensor suite varies from vehicle to vehicle, the dynamics may differ considerably, and the code under test may be tightly interconnected with the actual hardware. To cope with these challenges, the simulation_brain_link package contains the nodes necessary to integrate KITcar_brain into this simulation.
simulation/src/simulation_brain_link
├── launch
│ ├── test
│ │ ├── sensor_camera.test
│ │ └── vehicle_simulation_link.test
│ ├── brain.launch
│ ├── jetson.launch
│ ├── sensor_camera_node.launch
│ ├── sensor_tof_node.launch
│ ├── vehicle_link.launch
│ └── vehicle_simulation_link.launch
├── msg
│ ├── MissionMode.msg
│ └── State.msg
├── param
│ ├── sensor_camera
│ │ ├── default.yaml
│ │ └── topics.yaml
│ ├── sensor_tof
│ │ └── default.yaml
│ └── vehicle_simulation_link
│ ├── default.yaml
│ └── topics.yaml
├── scripts
│ ├── sensor_camera_node
│ ├── sensor_tof_node
│ └── vehicle_simulation_link_node
├── src
│ ├── sensor_camera
│ │ ├── __init__.py
│ │ └── node.py
│ ├── sensor_tof
│ │ ├── __init__.py
│ │ └── node.py
│ ├── vehicle_simulation_link
│ │ ├── __init__.py
│ │ └── node.py
│ └── __init__.py
├── test
│ ├── test_sensor_camera
│ └── test_vehicle_simulation_link_node
├── CMakeLists.txt
├── __init__.py
├── package.xml
└── setup.py
13 directories, 31 files
Packages and Modules
Sensors¶
First and foremost, the sensor data generated by Gazebo must be accessible to our code pipeline just as it would be on the real Dr. Drift. To ensure that, the sensor_camera_node preprocesses the camera image before passing it along:
The camera used in Dr. Drift does some internal precropping to achieve a higher frame rate, which shifts the camera’s center of view. To account for this and to enable other features (e.g. noise), the camera image is preprocessed in the SensorCameraNode before it is passed to KITcar_brain.
roslaunch simulation_brain_link sensor_camera_node.launch
See simulation.src.simulation_brain_link.src.sensor_camera.node for more.
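As an illustration, here is a minimal Python sketch of such a preprocessing step; the topic names and crop window are placeholders for this example, not the node’s actual parameters:

#!/usr/bin/env python3
"""Minimal sketch of cropping the simulated camera image; topic names
and crop offsets are placeholders, not the real node's parameters."""
import rospy
from cv_bridge import CvBridge
from sensor_msgs.msg import Image


class SensorCameraSketch:
    def __init__(self):
        self.bridge = CvBridge()
        # Hypothetical crop window mimicking the real camera's precropping.
        self.crop = rospy.get_param("~crop", {"x": 0, "y": 100, "w": 1280, "h": 650})
        self.pub = rospy.Publisher("camera/image_cropped", Image, queue_size=1)
        rospy.Subscriber("camera/image_raw", Image, self.callback, queue_size=1)

    def callback(self, msg):
        img = self.bridge.imgmsg_to_cv2(msg)
        c = self.crop
        cropped = img[c["y"]:c["y"] + c["h"], c["x"]:c["x"] + c["w"]]
        out = self.bridge.cv2_to_imgmsg(cropped, encoding=msg.encoding)
        out.header = msg.header  # keep the original timestamp and frame id
        self.pub.publish(out)


if __name__ == "__main__":
    rospy.init_node("sensor_camera_sketch")
    SensorCameraSketch()
    rospy.spin()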
Dr. Drift is also equipped with time-of-flight distance sensors. However, Gazebo does not provide distance sensors out of the box. Instead, the SensorTofNode converts the output of a depth camera into a distance measurement by publishing the distance to the closest object.
roslaunch simulation_brain_link sensor_tof_node.launch name:=NAME_OF_SENSOR topic:=OUTPUT_TOPIC
See simulation.src.simulation_brain_link.src.sensor_tof.node for implementation details.
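The conversion itself can be sketched as below; the topic names and the sensor’s maximum range are assumptions for illustration:

#!/usr/bin/env python3
"""Sketch of converting a depth image into a single closest-object
distance; topic names and sensor limits are placeholders."""
import numpy as np
import rospy
from cv_bridge import CvBridge
from sensor_msgs.msg import Image, Range


class SensorTofSketch:
    def __init__(self):
        self.bridge = CvBridge()
        self.pub = rospy.Publisher("tof/range", Range, queue_size=1)
        rospy.Subscriber("depth_camera/image_raw", Image, self.callback, queue_size=1)

    def callback(self, msg):
        depth = self.bridge.imgmsg_to_cv2(msg)  # 32FC1 depth image in meters
        valid = depth[np.isfinite(depth)]       # drop NaN/inf (no return) pixels
        self.pub.publish(
            Range(
                header=msg.header,
                radiation_type=Range.INFRARED,
                min_range=0.0,
                max_range=2.0,  # placeholder sensor limit
                range=float(valid.min()) if valid.size else float("inf"),
            )
        )


if __name__ == "__main__":
    rospy.init_node("sensor_tof_sketch")
    SensorTofSketch()
    rospy.spin()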
Dynamics¶
The car needs to move when KITcar_brain decides to start driving. Instead of reinventing the wheel, the control team’s (Regelungsteam) vehicle simulation is used to simulate the driving behavior. To do so, the command
roslaunch simulation_brain_link vehicle_link.launch
launches the vehicle simulation together with the vehicle_simulation_link_node, which controls the vehicle’s movement by passing the twist calculated by the vehicle simulation to the car’s twist service. The link node can also be launched on its own:
roslaunch simulation_brain_link vehicle_simulation_link.launch
See simulation.src.simulation_brain_link.src.vehicle_simulation_link.node for implementation details.
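The forwarding idea can be sketched as below. The real node calls the car’s twist service; since that service definition is package-internal, this self-contained sketch shows the forwarding as a plain topic relay with placeholder topic names:

#!/usr/bin/env python3
"""Sketch of the relay performed by vehicle_simulation_link_node.
The real node hands the twist to the car's twist service; here the
forwarding is shown as a topic relay with placeholder topic names."""
import rospy
from geometry_msgs.msg import Twist


def main():
    rospy.init_node("vehicle_simulation_link_sketch")
    # Placeholders: input from the vehicle simulation, output to the car.
    pub = rospy.Publisher("simulation/vehicle_twist", Twist, queue_size=1)
    rospy.Subscriber("vehicle_simulation/twist", Twist, pub.publish, queue_size=1)
    rospy.spin()


if __name__ == "__main__":
    main()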
Coordinate Frames¶
Dr. Drift usually has multiple coordinate frames, such as vehicle, ir_ahead, and world, which are all subjective from the car’s perspective. However, the simulation needs an objective coordinate frame in which the simulated position can be evaluated. To allow conversions between the vehicle’s coordinate frames and the simulation frame, a transformation is published by the simulation.src.simulation_brain_link.src.vehicle_simulation_link.
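Publishing such a transformation can be sketched with tf2_ros; the frame names and the pose source (an Odometry topic) are assumptions for illustration:

#!/usr/bin/env python3
"""Sketch of broadcasting the transform between an objective simulation
frame and the vehicle frame; frame names and pose source are placeholders."""
import rospy
import tf2_ros
from geometry_msgs.msg import TransformStamped
from nav_msgs.msg import Odometry


def publish_transform(odom, broadcaster):
    t = TransformStamped()
    t.header.stamp = odom.header.stamp
    t.header.frame_id = "simulation"  # objective simulation frame (placeholder)
    t.child_frame_id = "vehicle"      # car-relative frame (placeholder)
    t.transform.translation.x = odom.pose.pose.position.x
    t.transform.translation.y = odom.pose.pose.position.y
    t.transform.translation.z = odom.pose.pose.position.z
    t.transform.rotation = odom.pose.pose.orientation
    broadcaster.sendTransform(t)


if __name__ == "__main__":
    rospy.init_node("frame_link_sketch")
    br = tf2_ros.TransformBroadcaster()
    rospy.Subscriber("simulation/odom", Odometry,
                     lambda msg: publish_transform(msg, br), queue_size=1)
    rospy.spin()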