Commit f75b5753 authored by Qi Luo, committed by Dong Li

Docs: a couple of fixes in 2.0 software architecture / security introduction (#2220)

Parent b6cad6d3
@@ -6,7 +6,7 @@ Core software modules running on the Apollo 2.0 powered autonomous vehicle inclu
* Routing: the routing module is to tell the autonomous vehicle how to reach its destination via a series of lanes or roads
* Planning: the planning module plans the spatio-temporal trajectory for the autonomous vehicle to take
* Control: the control module executes the planned spatio-temporal trajectory by generating control commands such as throttle, brake and steering
* Canbus: the interface where control commands are passed to the vehicle hardware. It is also where chassis information gets passed through to the software system
* HD-Map: this module works more like a library. Instead of publishing/subscribing messages, it is frequently used as a query engine to provide ad-hoc structured information regarding the roads
* Localization: the localization module leverages various information sources such as GPS, Lidar and IMU to estimate where the autonomous vehicle is located.
@@ -17,23 +17,21 @@ Every module is running as a separate baidu CarOS based ROS node. Each module no
## Perception
Obviously, perception strictly depends on raw sensor data such as Lidar point clouds and camera images. Besides these raw sensor inputs, however, traffic light detection also depends on localization data as well as the HD-Map. Since detecting traffic lights in an ad-hoc fashion in real time is computationally infeasible, traffic light detection needs localization to determine when and where to start detecting traffic lights in the camera-captured images.
**Perception is quite different from modules like prediction, routing, planning, control, and canbus, and its data interfaces are hard to describe precisely, so the description here is intentionally kept a little blurry.**
## Prediction
The prediction module predicts the future motion trajectories for all perceived obstacles, and the output prediction message also wraps the perception information. Prediction subscribes to both the localization and perception obstacle messages, as shown below.
![Prediction](images/prediction.png)
When a localization update is received, the prediction module updates its internal status. The actual prediction is triggered when perception publishes its perception obstacle message.
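To make this flow concrete, below is a minimal sketch of such a node in Python, assuming rospy; the topic names and the `String` placeholder message type are illustrative stand-ins, not Apollo's actual message definitions.
```python
import rospy
from std_msgs.msg import String

class PredictionNode(object):
    def __init__(self):
        self.latest_localization = None
        self.pub = rospy.Publisher('/apollo/prediction', String, queue_size=1)
        rospy.Subscriber('/apollo/localization/pose', String, self.on_localization)
        rospy.Subscriber('/apollo/perception/obstacles', String, self.on_perception)

    def on_localization(self, msg):
        # A localization update only refreshes internal status.
        self.latest_localization = msg

    def on_perception(self, msg):
        # The perception obstacle message is the actual trigger:
        # predict trajectories and wrap the perception info in the output.
        self.pub.publish(String(data='predicted trajectories for: %s' % msg.data))

if __name__ == '__main__':
    rospy.init_node('prediction')
    PredictionNode()
    rospy.spin()
```
Keeping the localization callback side-effect-free except for caching makes the perception message the single trigger, which matches the behavior described above.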
## Localization
The localization module fuses various information sources to locate where the autonomous vehicle is. There are two localization modes. The first is an RTK-based method, with a timer-based callback function “OnTimer”.
![Localization](images/localization.png)
The other localization method is the “Multiple Sensor Fusion” (MSF) method, where a number of event-triggered callback functions are registered, such as:
![Localization](images/localization_2.png)
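The distinction between the two modes is mainly in how their callbacks are registered. The sketch below contrasts a timer-based callback with an event-triggered one, assuming rospy; the sensor topic name is a hypothetical example, not Apollo's actual topic.
```python
import rospy
from std_msgs.msg import String

def on_timer(event):
    # RTK-style: runs periodically, polling inputs and publishing a pose.
    pass

def on_sensor(msg):
    # MSF-style: one of several event-triggered callbacks, fired per message.
    pass

rospy.init_node('localization')
# Timer-based registration ("OnTimer"), here at 100 Hz:
rospy.Timer(rospy.Duration(0.01), on_timer)
# Event-triggered registration, one subscriber per sensor source:
rospy.Subscriber('/apollo/sensor/gnss/odometry', String, on_sensor)
rospy.spin()
```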
## Routing
The routing module also needs to know the autonomous vehicle's location, as well as the destination, in order to compute the passage lanes and roads towards the destination. The important data interface is an event-triggered function called “OnRoutingRequest”, in which the “RoutingResponse” is computed and published.
![Routing](images/routing.png)
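A minimal sketch of this “OnRoutingRequest” pattern, again assuming rospy with illustrative topic names and a `String` placeholder message type:
```python
import rospy
from std_msgs.msg import String

rospy.init_node('routing')
response_pub = rospy.Publisher('/apollo/routing_response', String, queue_size=1)

def on_routing_request(msg):
    # Compute the passage lanes and roads to the destination,
    # then publish the RoutingResponse.
    response_pub.publish(String(data='route for request: %s' % msg.data))

rospy.Subscriber('/apollo/routing_request', String, on_routing_request)
rospy.spin()
```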
## Planning
@@ -44,17 +42,17 @@ The data dependency such as chassis, localization, traffic light, prediction, et
![Planning](images/planning_2.png)
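As a rough illustration of these data dependencies, a planning-style node can cache the latest message from each upstream source and produce a trajectory on a timer. The topic names, the `String` placeholder type, and the 10 Hz period below are assumptions, not Apollo's actual configuration.
```python
import rospy
from std_msgs.msg import String

class PlanningNode(object):
    def __init__(self):
        self.latest = {}  # latest message from each upstream dependency
        self.pub = rospy.Publisher('/apollo/planning', String, queue_size=1)
        for name, topic in [('chassis', '/apollo/canbus/chassis'),
                            ('localization', '/apollo/localization/pose'),
                            ('prediction', '/apollo/prediction')]:
            rospy.Subscriber(topic, String, self.make_callback(name))
        rospy.Timer(rospy.Duration(0.1), self.on_timer)  # plan at 10 Hz

    def make_callback(self, name):
        def callback(msg):
            self.latest[name] = msg
        return callback

    def on_timer(self, event):
        if len(self.latest) == 3:  # all dependencies seen at least once
            self.pub.publish(String(data='planned spatio-temporal trajectory'))

if __name__ == '__main__':
    rospy.init_node('planning')
    PlanningNode()
    rospy.spin()
```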
## Control
Control takes the planned trajectory as input and generates the control command to pass to Canbus. Its main data interfaces are OnPad, OnMonitor, OnTimer, and the localization and Canbus messages.
![Control](images/control_1.png)
“OnPad” and “OnMonitor” handle the control interaction with the PAD-based human interface and with simulations. The main data interface is the “OnTimer” interface, which periodically produces the actual control commands.
![Control](images/control_2.png)
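A sketch of this timer-driven pattern, with the same illustrative topic names and placeholder message type as above:
```python
import rospy
from std_msgs.msg import String

class ControlNode(object):
    def __init__(self):
        self.trajectory = None
        self.pub = rospy.Publisher('/apollo/control', String, queue_size=1)
        rospy.Subscriber('/apollo/planning', String, self.on_planning)
        # "OnTimer" is the main interface: it periodically produces commands.
        rospy.Timer(rospy.Duration(0.01), self.on_timer)

    def on_planning(self, msg):
        self.trajectory = msg

    def on_timer(self, event):
        if self.trajectory is not None:
            # Track the planned trajectory: throttle / brake / steering.
            self.pub.publish(String(data='throttle=0.2 brake=0.0 steering=0.1'))

if __name__ == '__main__':
    rospy.init_node('control')
    ControlNode()
    rospy.spin()
```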
## Canbus
The Canbus module has two data interfaces, as shown below:
![Canbus](images/canbus_1.png)
The first data interface is a timer-based publisher with the callback function “OnTimer”. This interface periodically publishes the chassis information, as well as chassis details if enabled.
![Canbus](images/canbus_2.png)
The second data interface is an event-based one, with the callback function “OnControlCommand”, which is triggered when the Canbus module receives control commands.
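Both interfaces fit naturally in one node. The following sketch uses the same illustrative topic names and placeholder message type as the sketches above.
```python
import rospy
from std_msgs.msg import String

class CanbusNode(object):
    def __init__(self):
        self.chassis_pub = rospy.Publisher('/apollo/canbus/chassis', String, queue_size=1)
        # Interface 1: timer-based chassis publisher ("OnTimer").
        rospy.Timer(rospy.Duration(0.01), self.on_timer)
        # Interface 2: event-based control command handler ("OnControlCommand").
        rospy.Subscriber('/apollo/control', String, self.on_control_command)

    def on_timer(self, event):
        # Periodically publish chassis information read from the vehicle.
        self.chassis_pub.publish(String(data='chassis status'))

    def on_control_command(self, msg):
        # Forward the received control command to the vehicle hardware.
        pass

if __name__ == '__main__':
    rospy.init_node('canbus')
    CanbusNode()
    rospy.spin()
```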
@@ -51,14 +51,14 @@ SDK contains four directories:
This function should be called before using secure upgrade APIs.
```
init_secure_upgrade(root_config_path)
input para:
root_config_path root configuration file path
```
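A hedged usage sketch, assuming the SDK is exposed as a Python module; the module name `secure_upgrade_sdk` and the config path are hypothetical.
```python
# Hypothetical module name; the real SDK import path may differ.
from secure_upgrade_sdk import init_secure_upgrade

# Must run once, before any other secure upgrade API is called.
init_secure_upgrade('/etc/secure_upgrade/root_config')
```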
#### b) Device token generation
This function is used to generate the device token.
```
sec_upgrade_get_device_token()
Output para:
return code: true generating device token successfully
false generating device token failed
Device_token: device token (string format)
```
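Assuming the same hypothetical Python binding, and reading the output parameters above as a (return code, token) pair, usage might look like:
```python
from secure_upgrade_sdk import sec_upgrade_get_device_token  # hypothetical module

ok, device_token = sec_upgrade_get_device_token()
if not ok:
    raise RuntimeError('generating device token failed')
# Persist the token; it is consumed later by authorization token generation.
with open('/tmp/device_token', 'w') as f:
    f.write(device_token)
```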
@@ -69,12 +69,12 @@ This function is used to generate secure upgrade package and package token.
```
sec_upgrade_get_package(original_package_path,
secure_package_path,
package_token_path)
input para:
original_package_path original upgrade package file path
secure_package_path secure upgrade package file path
package_token_path secure package token file
output para:
return code:
true generating secure upgrade package successfully
false generating secure upgrade package failed
```
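A usage sketch under the same hypothetical Python binding; the file paths are placeholders.
```python
from secure_upgrade_sdk import sec_upgrade_get_package  # hypothetical module

ok = sec_upgrade_get_package('/tmp/apollo_release.tar.gz',  # original package
                             '/tmp/apollo_release.sec',     # secure package to write
                             '/tmp/package_token')          # package token to write
if not ok:
    raise RuntimeError('generating secure upgrade package failed')
```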
@@ -83,11 +83,11 @@ This function is used to generate a device’s authorization token based on devi
```
sec_upgrade_get_authorization_token(package_token_path,
device_token_path)
input para:
package_token_path secure package token file path
device_token_path device token file path
output para:
return code:
true generating authorization token successfully
false generating authorization token failed
authorization_token authorization token buffer (string format)
```
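Again assuming a (return code, token) pair under the hypothetical Python binding, with placeholder paths:
```python
from secure_upgrade_sdk import sec_upgrade_get_authorization_token  # hypothetical module

ok, auth_token = sec_upgrade_get_authorization_token('/tmp/package_token',
                                                     '/tmp/device_token')
if not ok:
    raise RuntimeError('generating authorization token failed')
# auth_token is then delivered to the vehicle to verify the secure package.
print(auth_token)
```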
@@ -97,7 +97,7 @@ This function is used to verify the downloaded secure package with the authoriza
```
sec_upgrade_verify_package(authorization_token_buffer,
secure_package_path)
input para:
authorization_token_buffer authorization token buffer(string format)
secure_package_path secure upgrade package file path
output para:
......