Commit 7c6f2d7b authored by Natasha Dsouza, committed by Calvin Miao

Update in How-to - Missing links and Reference section now added (#4641)

* Update in How-to

* Missing link in readme - perception visualizer

* updating how-to

* updating how-to

* updating how-to

* updating how-to
Parent bbc3be28
@@ -25,7 +25,7 @@
- [How to add a new predictor in prediction module](how_to_add_a_new_predictor_in_prediction_module.md)
- [How to add a new vehicle](how_to_add_a_new_vehicle.md)
- [How to add an external dependency](how_to_add_an_external_dependency.md)
- [How to run Offline Perception Visualizer for Apollo 2.5](how_to_run_apollo_2.5_perception_visualizer.md)
- [How to Run MSF Localization Module On Your Local Computer](how_to_run_MSF_localization_module_on_your_local_computer.md)
- [How to train Prediction's MLP model](how_to_train_prediction_mlp_model.md)
- [How to use the navigation mode of Apollo 2.5](how_to_use_apollo_2.5_navigation_mode.md)
# How to Add a New Vehicle to Apollo
## Introduction
The instructions below demonstrate how to add a new vehicle to Apollo.
```
Note: The Apollo control algorithm is configured for the default vehicle, which is a Lincoln MKZ.
```
When adding a new vehicle, if your vehicle requires different attributes from those offered by the Apollo control algorithm, consider:
- Using a different control algorithm that is appropriate for your vehicle.
- Modifying the existing algorithm's parameters to achieve better results.
## Adding a New Vehicle
Complete the following task sequence to add a new vehicle:
* Implement the new vehicle controller.
## How to Add a New External Dependency
The design and implementation goal is to minimize the dependencies that must be pre-installed in the system. If your target depends on a module for which you have to `apt-get install` first, consider using **Bazel** as the package/dependency management system.
For example, if you want to add a workspace rule `foo` that is not originally built with Bazel, do the following:
- Add a workspace rule named `foo` in the WORKSPACE file.
- Specify the source of `foo` (usually a URL), and the version (usually a commit hash or a git tag).
- Write a `foo.BUILD` under the third_party directory to build it. The BUILD file will be similar to any other Bazel BUILD file of your own targets.
- In your target that depends on `foo`, put `@foo//:<foo_target>` in its dependencies.
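As a minimal sketch of these steps (the name `foo`, the URL, version, and checksum are placeholders, not a real dependency), the workspace rule and its BUILD file might look like:
```
# In WORKSPACE -- hypothetical rule; substitute foo's real URL, version, and sha256.
new_http_archive(
    name = "foo",
    url = "https://example.com/foo/archive/v1.2.3.tar.gz",
    sha256 = "<sha256 of the downloaded tarball>",
    strip_prefix = "foo-1.2.3",
    build_file = "third_party/foo.BUILD",
)
```
```
# third_party/foo.BUILD -- a sketch; adjust srcs/hdrs to foo's actual layout.
cc_library(
    name = "foo",
    srcs = glob(["src/*.cc"]),
    hdrs = glob(["include/**/*.h"]),
    includes = ["include"],
    visibility = ["//visibility:public"],
)
```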
### Use Bazel to Add an External Dependency
If you add a workspace rule `foo` and your target depends on it, Bazel pulls the source code of `foo` from the specified source and builds it with `foo.BUILD`. If `foo` was originally built with Bazel, then only the workspace rule is needed.
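For example, a target that depends on `foo` simply lists it in its `deps`; the target below is hypothetical:
```
# In your package's BUILD file -- "my_tool" is a made-up example target.
cc_binary(
    name = "my_tool",
    srcs = ["my_tool.cc"],
    deps = [
        "@foo//:foo",  # resolved via the workspace rule and foo.BUILD above
    ],
)
```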
### References
For a detailed description on adding a dependency with Bazel, refer to the following:
* [Workspace Rules](https://bazel.build/versions/master/docs/be/workspace.html)
* [Working with external dependencies](https://docs.bazel.build/versions/master/external.html)
## How to Prepare a CentOS(7)-based GPU-enabled Image for the Perception Module
## Set up a Docker Image in a CentOS-based System
@@ -2,23 +2,23 @@
## 1. Preparation
- Download the source code of Apollo from [GitHub](https://github.com/ApolloAuto/apollo)
- Follow the tutorial to set up [docker environment](https://github.com/ApolloAuto/apollo/blob/master/docs/howto/how_to_build_and_release.md) and [build Apollo](https://github.com/ApolloAuto/apollo/blob/master/docs/howto/how_to_launch_Apollo.md).
- Download localization data from [Apollo Data Open Platform](http://data.apollo.auto/?name=sensor%20data&data_key=multisensor&data_type=1&locale=en-us&lang=en) (US only).
## 2. Configuring Parameters
Assume that the path of the downloaded localization data is DATA_PATH.
### 2.1. Configure Sensor Extrinsics
```
cp DATA_PATH/params/ant_imu_leverarm.yaml /apollo/modules/localization/msf/params/gnss_params/
cp DATA_PATH/params/velodyne64_novatel_extrinsics_example.yaml /apollo/modules/localization/msf/params/velodyne_params/
cp DATA_PATH/params/velodyne64_height.yaml /apollo/modules/localization/msf/params/velodyne_params/
```
The meaning of each file:
- **ant_imu_leverarm.yaml**: Lever arm value
- **velodyne64_novatel_extrinsics_example.yaml**: Transform from IMU coord to LiDAR coord
- **velodyne64_height.yaml**: Height of the LiDAR relative to the ground
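For orientation, extrinsics files of this kind typically hold a translation vector plus a rotation quaternion. The sketch below is illustrative only; the values are placeholders, and the real calibration numbers ship with the downloaded data:
```
# Illustrative structure only -- placeholder values, not a real calibration
transform:
  translation:    # offset, in meters
    x: 0.0
    y: 0.0
    z: 0.0
  rotation:       # orientation as a quaternion
    x: 0.0
    y: 0.0
    z: 0.0
    w: 1.0
```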
### 2.2. Configure Map Path
Add the map path config in `/apollo/modules/localization/conf/localization.conf`:
```
# Redefine the map_dir in global_flagfile.txt
--map_dir=DATA_PATH
```

@@ -48,7 +48,7 @@
In the /apollo/data/log directory, you can see the localization log files.
```
cd DATA_PATH/bag
rosbag play *.bag
```
The localization module will finish initialization and start publishing localization results after around 30 seconds.
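To confirm that results are being published, you can, for example, echo the localization pose topic from another terminal inside the dev docker:
```
rostopic echo /apollo/localization/pose
```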
## 5. Record and Visualize localization result (optional)
### Record localization result
@@ -59,7 +59,7 @@
```
./scripts/localization_online_visualizer.sh
```
First, the visualization tool will generate a series of cache files from the localization map, which will be stored in the apollo/data/map_visual directory.
Then it will receive the topics below and draw them on the screen.
- /apollo/sensor/velodyne64/compensator/PointCloud2
@@ -99,16 +99,16 @@
And we can get the statistical results like this:
The first table shows the statistical results of fusion localization, the second table those of LiDAR localization, and the third table those of GNSS localization.
The meaning of each row in the table:
- **error**: the plane (horizontal) error, in meters
- **error lon**: the error along the car's heading direction, in meters
- **error lat**: the error along the car's lateral direction, in meters
- **error roll**: the roll angle error, in degrees
- **error pit**: the pitch angle error, in degrees
- **error yaw**: the yaw angle error, in degrees
The meaning of each column in the table:
- **mean**: the mean value of the error
- **std**: the standard deviation of the error
- **max**: the maximum value of the error
- **< xx**: the percentage of frames whose error is smaller than the indicated range
- **con_frame()**: the maximum number of consecutive frames that satisfy the condition in parentheses
@@ -4,8 +4,8 @@
For Apollo 2.5, we provide an offline visualization tool based on OpenGL and the PCL library.
Below are the detailed steps to build and run the offline visualizer in docker:
### Build The Perception Lowcost
We use Bazel to build the offline perception visualizer:
```
cd /apollo
./apollo.sh build_opt_gpu
```
@@ -25,8 +25,8 @@
This config contains both image and radar with fusion:
```
--dag_config_path=conf/dag_camera_obstacle_offline_fusion_sync.config
```
### Run The Visualizer With Collected ROS Bag for Apollo 2.5
Please double check that your collected rosbag contains topics including camera images, radar, as well as localization results. The 3 corresponding topics are shown below:
```
/apollo/sensor/camera/obstacle/front_6mm
```
@@ -34,24 +34,29 @@
```
/apollo/localization/pose
```
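A quick way to double check is `rosbag info`, which lists every topic recorded in the bag along with its message count:
```
rosbag info <bag file>
```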
After that, you can start perception and play bags. As done before, remember to bootstrap first:
```
./scripts/bootstrap.sh
```
Then you have 3 options to run:
* Run perception with camera subnode only + visualization
* Run full low-cost perception of camera, radar, lane markings + visualization
* Run full low-cost perception of camera, radar, lane markings, and no visualization
Let's look at how they can be run:
### Run perception with camera subnode only + visualization
This option requires `/apollo/sensor/camera/obstacle/front_6mm` only.
```
./scripts/perception_offline_visualizer.sh
```
### Run full low-cost perception of camera, radar, lane markings + visualization
```
./scripts/perception_lowcost_vis.sh
```
### Run full low-cost perception of camera, radar, lane markings, and no visualization
```
./scripts/perception_lowcost.sh
```
@@ -63,4 +68,4 @@
```
rosbag play <bag file> [-l] [--clock]
```
You can also see obstacles and lane markings in Dreamview with `navigation` and `simcontrol` enabled.
If running with visualization, you will see a pop-up window showing the perception result with images frame-by-frame. Top level panes display image detections in both 2D and 3D. The bottom left display is the bird view visualization showing image obstacle tracks, radar obstacle tracks as well as fused tracks. You may switch on/off by pressing `O` (image track), `F` (fused track) and `D` (radar track) on the fly.
### How to Set up the Network
### Helpful hints before you begin:
* The IPC that is running the Apollo software must access the Internet to acquire the Real Time Kinematic (RTK) data for accurate localization. A mobile device also needs to connect to the IPC to run the Apollo software.
* It is recommended that you configure a fixed IP instead of using DHCP on the IPC to make it easier to connect to from a mobile terminal.
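As a sketch only, on an Ubuntu-based IPC a fixed IP can be configured in `/etc/network/interfaces`; the interface name and addresses below are placeholders that must be adapted to your router's subnet:
```
# /etc/network/interfaces -- example static IP (placeholder values)
auto eth0
iface eth0 inet static
    address 192.168.10.6
    netmask 255.255.255.0
    gateway 192.168.10.1
    dns-nameservers 8.8.8.8
```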
@@ -8,7 +8,7 @@
![4G_network_setup](https://github.com/tc87/apollo/blob/master/docs/quickstart/images/4G_network_setup.png)
### Setting up the network:
1. Install and configure a 4G LTE router with Wi-Fi Access Point (AP) capability and Gigabit Ethernet ports.
## How to Train the MLP Deep Learning Model
### Prerequisites
There are 2 prerequisites to training the MLP Deep Learning Model:
#### Download and Install Anaconda
* Please download and install Anaconda from its [website](https://www.anaconda.com/download)
#### Install Dependencies
Run the following commands to install the necessary dependencies:
* **Install numpy**: `conda install numpy`
* **Install tensorflow**: `conda install tensorflow`
* **Install keras (version 1.2.2)**: `conda install -c conda-forge keras=1.2.2`
* **Install h5py**: `conda install h5py`
* **Install protobuf**: `conda install -c conda-forge protobuf`
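After installation, a quick sanity check (assuming the conda environment is active) is to import the packages and print the Keras version, which should be 1.2.2 per the list above:
```
python -c "import numpy, tensorflow, h5py, keras; print(keras.__version__)"
```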
### Train the Model
Follow the steps below to train the MLP model using the released demo data. For convenience, we denote `APOLLO` as the path of the local apollo repository, for example, `/home/username/apollo`.
1. Create a folder to store offline prediction data using the command `mkdir APOLLO/data/prediction` if it does not exist
2. Open `apollo/modules/prediction/conf/prediction.conf`. Turn on the off-line mode by changing `--noprediction_offline_mode` to `--prediction_offline_mode`
3. Start dev docker using `bash docker/scripts/dev_start.sh` under the apollo folder
4. Enter dev docker using `bash docker/scripts/dev_into.sh` under the apollo folder
5. In docker, under `/apollo/`, run `bash apollo.sh build` to compile
6. In docker, under `/apollo/`, download the demo ROSbag by `python docs/demo_guide/rosbag_helper.py demo_2.0.bag`
7. In docker, under `/apollo/`, run the prediction module using `bash scripts/prediction.sh start_fe`
8. Open a new terminal window, enter the apollo dev docker using Step 4
9. In the new terminal window, under `/apollo/`, play the demo rosbag using `rosbag play demo_2.0.bag`
10. After the demo ROSbag finishes running in the new terminal window, go to the old terminal window and stop the prediction module by pressing `Ctrl + C`
11. Check if there is a file called `feature.0.bin` under the folder `/apollo/data/prediction/`
12. In docker, go to `/apollo/modules/tools/prediction/mlp_train/`, label the data using
`python generate_labels.py -f /apollo/data/prediction/feature.0.bin`. Then check if there is a file called `feature.0.label.bin` under the folder `/apollo/data/prediction/`
13. In docker, under `/apollo/modules/tools/prediction/mlp_train/`, generate H5 files using `python generate_h5.py -f /apollo/data/prediction/feature.0.label.bin`. Then check if there is a file called `feature.0.label.h5` created
14. Exit dev docker
15. Go to the folder `APOLLO/modules/tools/prediction/mlp_train/proto/`, and run `protoc --python_out=./ fnn_model.proto` to generate `fnn_model_pb2.py`
16. Go to the folder `APOLLO/modules/tools/prediction/mlp_train/`, run the training model using `python mlp_train.py APOLLO/data/prediction/feature.0.label.h5`
17. The model's evaluation report will be in the file `APOLLO/modules/tools/prediction/mlp_train/evaluation_report.log`
18. The model will be stored in the binary file `APOLLO/modules/tools/prediction/mlp_train/mlp_model.bin`, which can replace the old model in `APOLLO/modules/prediction/data/mlp_vehicle_model.bin` if you think that's better
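If the new model does evaluate better and you decide to adopt it, the swap in step 18 is a plain file copy; backing up the shipped model first is a sensible precaution:
```
cp APOLLO/modules/prediction/data/mlp_vehicle_model.bin APOLLO/modules/prediction/data/mlp_vehicle_model.bin.bak
cp APOLLO/modules/tools/prediction/mlp_train/mlp_model.bin APOLLO/modules/prediction/data/mlp_vehicle_model.bin
```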