OpenHarmony provides a comprehensive automated test framework for designing test cases. Detecting defects early in the development process improves code quality.
This document describes how to use the OpenHarmony test framework.
## Setting Up the Environment
The test framework depends on the Python running environment. Before using the test framework, set up the environment as follows:
- For details about how to obtain the source code, see [Obtaining Source Code](../get-code/sourcecode-acquire.md).
### Environment Configuration
#### Basic Test Framework Environment
|Environment|Version|Description|
|------------|------------|------------|
|Operating system|Ubuntu 18.04 or later|Provides the build environment.|
|Linux extended component|libreadline-dev|Allows users to edit command lines.|
|Python|3.7.5 or later|Provides the programming language for the test framework.|
|Python plug-ins|pyserial 3.3 or later<br>paramiko 2.7.1 or later<br>setuptools 40.8.0 or later<br>RSA 4.0 or later|- pyserial: supports serial port communication in Python.<br>- paramiko: allows SSH connections in Python.<br>- setuptools: allows creation and distribution of Python packages.<br>- RSA: implements RSA encryption in Python.|
|NFS server|haneWIN NFS Server 1.2.50 or later, or NFS v4 or later|Supports devices that output test results only through the serial port.|
|HDC|1.1.0 or later|Allows devices to be connected by using the OpenHarmony Device Connector (HDC).|
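As a quick sanity check, the minimum versions in the table can be compared programmatically. The sketch below is illustrative only; the `REQUIRED` table and helper names are not part of the framework:

```python
# Minimal sketch: compare dotted version strings against the minimum
# versions required by the test framework (this table is illustrative).
REQUIRED = {
    "python": "3.7.5",
    "pyserial": "3.3",
    "paramiko": "2.7.1",
    "setuptools": "40.8.0",
    "rsa": "4.0",
}

def version_tuple(version):
    """Convert a dotted version string such as '3.7.5' to a comparable tuple."""
    return tuple(int(part) for part in version.split("."))

def meets_requirement(name, installed):
    """Return True if the installed version satisfies the required minimum."""
    return version_tuple(installed) >= version_tuple(REQUIRED[name])

print(meets_requirement("pyserial", "3.4"))   # True: 3.4 >= 3.3
```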
#### Installation Process
1. Run the following command to install the Linux extended component libreadline:
```
sudo apt-get install libreadline-dev
```
The installation is successful if the following information is displayed:
```
Reading package lists... Done
Building dependency tree
Reading state information... Done
libreadline-dev is already the newest version (7.0-3).
0 upgraded, 0 newly installed, 0 to remove and 11 not upgraded.
```
2. Run the following command to install the setuptools plug-in:
```
pip3 install setuptools
```
The installation is successful if the following information is displayed:
```
Requirement already satisfied: setuptools in d:\programs\python37\lib\site-packages (41.2.0)
```
3. Run the following command to install the paramiko plug-in:
```
pip3 install paramiko
```
The installation is successful if the following information is displayed:
4. Run the following command to install the rsa plug-in:
```
pip3 install rsa
```
The installation is successful if the following information is displayed:
```
Installing collected packages: pyasn1, rsa
Successfully installed pyasn1-0.4.8 rsa-4.7
```
5. Run the following command to install the pyserial plug-in:
```
pip3 install pyserial
```
The installation is successful if the following information is displayed:
```
Requirement already satisfied: pyserial in d:\programs\python37\lib\site-packages\pyserial-3.4-py3.7.egg (3.4)
```
6. Install the NFS server if the device outputs results only through the serial port.
- For Windows, install haneWIN NFS Server 1.2.50 or later.
- For Linux, run the following command to install the NFS server:
```
sudo apt install nfs-kernel-server
```
The installation is successful if the following information is displayed:
```
Reading package lists... Done
Building dependency tree
Reading state information... Done
nfs-kernel-server is already the newest version (1:1.3.4-2.1ubuntu5.3).
0 upgraded, 0 newly installed, 0 to remove and 11 not upgraded.
```
7. Install the HDC tool if the device supports HDC connections.
For details, see the [HDC README](https://gitee.com/openharmony/developtools_hdc_standard/blob/master/README.md).
## Checking the Installation Environment
| Check Item|Operation |Requirements |
| --- | --- | --- |
| Check whether Python is installed successfully.|Run the **python --version** command. |The Python version is 3.7.5 or later.|
| Check whether Python plug-ins are successfully installed.|Go to the **test/developertest** directory and run **start.bat** or **start.sh**.| The **>>>** prompt is displayed.|
|Check the NFS server status (for the devices that support only serial port output). |Log in to the development board through the serial port and run the **mount** command to mount the NFS. |The file directory can be mounted. |
|Check whether the HDC is successfully installed. |Run the **hdc_std -v** command.|The HDC version is 1.1.0 or later.|
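Most of the checks above boil down to reading a version number out of a command's output and comparing it with a minimum. A rough illustration; the output strings shown are assumed formats, not guaranteed ones:

```python
import re

def extract_version(output):
    """Pull the first dotted version number (e.g. '3.7.5') out of a
    command's output line; returns None if no version is found."""
    match = re.search(r"\d+(?:\.\d+)+", output)
    return match.group(0) if match else None

def at_least(output, minimum):
    """Check that the version reported in `output` is >= `minimum`."""
    found = extract_version(output)
    if found is None:
        return False
    to_tuple = lambda v: tuple(int(x) for x in v.split("."))
    return to_tuple(found) >= to_tuple(minimum)

print(at_least("Python 3.7.5", "3.7.5"))  # True
```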
## Directory Structure
The directory structure of the test framework is as follows:
```
test                          # Test subsystem
├── developertest             # Developer test module
│   ├── aw                    # Static library of the test framework
│   ├── config                # Test framework configuration
│   │   ...
│   │   └── user_config.xml   # User configuration
│   ├── examples              # Examples of test cases
│   ├── src                   # Source code of the test framework
│   ├── third_party           # Adaptation code for third-party components on which the test framework depends
│   ├── reports               # Test reports
│   ├── BUILD.gn              # Build entry of the test framework
│   ├── start.bat             # Test entry for Windows
│   └── start.sh              # Test entry for Linux
└── xdevice                   # Modules on which the test framework depends
```
## Writing Test Cases
### Designing the Test Case Directory
Design the test case directory as follows:
```
subsystem                               # Subsystem
├── partA                               # Part A
│   ├── moduleA                         # Module A
│   │   ├── include
│   │   ├── src                         # Service code
│   │   └── test                        # Test directory
│   │       ├── unittest                # Unit test
│   │       │   ├── common              # Common test cases
│   │       │   │   ├── BUILD.gn        # Build file of test cases
│   │       │   │   └── testA_test.cpp  # Source code of unit test cases
│   │       │   ├── phone               # Test cases for mobile phones
│   │       │   ├── ivi                 # Test cases for head units
│   │       │   └── liteos-a            # Test cases for IP cameras using the LiteOS kernel
│   │       └── moduletest              # Module test
│   │           ├── phone               # Test cases for mobile phones
│   │           ├── ivi                 # Test cases for head units
│   │           └── liteos-a            # Test cases for IP cameras using LiteOS
```
> **CAUTION**<br>Test cases are classified into common test cases and device-specific test cases. You are advised to place common test cases in the **common** directory and device-specific test cases in the directories of the related devices.
### Writing Test Cases
This test framework supports test cases written in multiple programming languages and provides different templates for different languages.
...
...
- Test case example
```
/*
* Copyright (c) 2022 XXXX Device Co., Ltd.
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
...
...
/**
* @tc.name: integer_sub_001
* @tc.desc: Verify the subfunction.
* @tc.type: FUNC
* @tc.require: Issue Number
*/
...
...
{
// Step 1 Call the function to obtain the result.
int actual = Sub(4, 0);
// Step 2 Use an assertion to compare the obtained result with the expected result.
EXPECT_EQ(4, actual);
}
...
...
1. Add comment information to the test case file header.
```
/*
* Copyright (c) 2022 XXXX Device Co., Ltd.
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
...
...
2. Add the test framework header file and namespace.
```
#include <gtest/gtest.h>
using namespace testing::ext;
```
3. Add the header file of the test class.
...
...
void SetUp();
void TearDown();
};
void CalculatorSubTest::SetUpTestCase(void)
{
// Set a setup function, which will be called before all test cases.
}
void CalculatorSubTest::TearDownTestCase(void)
{
// Set a teardown function, which will be called after all test cases.
}
void CalculatorSubTest::SetUp(void)
{
// Set a setup function, which will be called before each test case.
}
void CalculatorSubTest::TearDown(void)
{
// Set a teardown function, which will be called after each test case.
}
```
> **CAUTION**<br>When defining a test suite, ensure that the test suite name is the same as the target to build and uses the upper camel case style.
5. Add implementation of the test cases, including test case comments and logic.
```
/**
* @tc.name: integer_sub_001
* @tc.desc: Verify the subfunction.
* @tc.type: FUNC
* @tc.require: Issue Number
*/
...
...
{
// Step 1 Call the function to obtain the test result.
int actual = Sub(4, 0);
// Step 2 Use an assertion to compare the obtained result with the expected result.
EXPECT_EQ(4, actual);
}
```
The following test case templates are provided for your reference.
| Type | Description |
| ------------ | ------------ |
| HWTEST(A,B,C) | Use this template if the test case execution does not depend on setup or teardown. |
| HWTEST_F(A,B,C) | Use this template if the test case execution (excluding parameters) depends on setup and teardown. |
| HWTEST_P(A,B,C) | Use this template if the test case execution (including parameters) depends on setup and teardown. |
In the template names:
- *A* indicates the test suite name.
- *B* indicates the test case name, which is in the *Function*\_*No.* format. The *No.* is a three-digit number starting from **001**.
- *C* indicates the test case level. There are five test case levels: guard-control level 0 and non-guard-control level 1 to level 4. Of levels 1 to 4, a smaller value indicates a more important function verified by the test case.
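The *Function*_*No.* rule can be checked mechanically. A minimal sketch; the helper and pattern are illustrative, not part of the framework:

```python
import re

# Illustrative check for the Function_No. naming rule: a function name,
# an underscore, and a three-digit sequence number starting from 001.
CASE_NAME = re.compile(r"^[A-Za-z][A-Za-z0-9]*(?:_[A-Za-z0-9]+)*_(\d{3})$")

def is_valid_case_name(name):
    match = CASE_NAME.match(name)
    return bool(match) and match.group(1) != "000"

print(is_valid_case_name("integer_sub_001"))  # True
print(is_valid_case_name("integer_sub_1"))    # False: No. must be three digits
```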
> **CAUTION**<br>
> - The expected result of each test case must have an assertion.
> - The test case level must be specified.
> - It is recommended that the test be implemented step by step according to the template.
> - The comment must contain the test case name, description, type, and requirement number, which are in the @tc.*xxx*: *value* format. The test case type **@tc.type** can be any of the following:
| Test Case Type | Code |
| ------------ | ------------ |
| Function test | FUNC |
| Performance test | PERF |
| Reliability test | RELI |
| Security test | SECU |
| Fuzz test | FUZZ |
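The @tc.*xxx* comment fields lend themselves to automated checking. The following sketch is hypothetical (the framework does not necessarily validate comments this way) and simply shows the expected shape of the fields:

```python
import re

# Hypothetical helper: extract the @tc.* fields from a test case comment
# block and check that the mandatory ones are present and well formed.
TC_FIELD = re.compile(r"@tc\.(\w+):\s*(.+)")
MANDATORY = {"name", "desc", "type", "require"}
VALID_TYPES = {"FUNC", "PERF", "RELI", "SECU", "FUZZ"}

def parse_tc_comment(comment):
    fields = dict(TC_FIELD.findall(comment))
    missing = MANDATORY - fields.keys()
    if missing:
        raise ValueError("missing @tc fields: %s" % sorted(missing))
    if fields["type"].strip() not in VALID_TYPES:
        raise ValueError("unknown @tc.type: %s" % fields["type"])
    return fields

comment = """
 * @tc.name: integer_sub_001
 * @tc.desc: Verify the subfunction.
 * @tc.type: FUNC
 * @tc.require: Issue Number
"""
print(parse_tc_comment(comment)["name"])  # integer_sub_001
```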
**JavaScript Test Case Example**
...
...
- Test case example
```
/*
* Copyright (C) 2022 XXXX Device Co., Ltd.
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
...
...
it("appInfoTest001", 0, function () {
// Step 1 Call the function to obtain the test result.
var info = app.getInfo()
// Step 2 Use an assertion to compare the obtained result with the expected result.
expect(info != null).assertEqual(true)
})
...
...
1. Add comment information to the test case file header.
```
/*
* Copyright (C) 2022 XXXX Device Co., Ltd.
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
...
...
2. Import the APIs and JSUnit test library to test.
```
import app from '@system.app'
import {describe, beforeAll, beforeEach, afterEach, afterAll, it, expect} from 'deccjsunit/index'
```
3. Define the test suite (test class).
...
...
it("appInfoTest001", 0, function () {
// Step 1 Call the function to obtain the test result.
var info = app.getInfo()
// Step 2 Use an assertion to compare the obtained result with the expected result.
expect(info != null).assertEqual(true)
})
...
...
The following provides templates for different languages for your reference.
- **Test case build file example (C++)**
```
# Copyright (c) 2022 XXXX Device Co., Ltd.
import("//build/test.gni")
...
...
> **NOTE**<br>The output path is ***Part_name*/*Module_name***.
4. Set the output build file for the test cases.
...
...
ohos_js_unittest("GetAppInfoJsTest") {
}
```
> **NOTE**<br>
>- Use the **ohos\_js\_unittest** template to define the JavaScript test suite. Pay attention to the difference between JavaScript and C++.
>- The file generated for the JavaScript test suite must be in .hap format and named after the test suite name defined here. The test suite name must end with **JsTest**.
5. Configure the **config.json** file and signature file, which are mandatory.
...
...
**config.json** is the configuration file required for HAP build. You need to set **target** based on the tested SDK version. Default values can be retained for other items. The following is an example:
```
{
...
...
deps = [ ":GetAppInfoJsTest" ]
}
```
> **NOTE**<br>Grouping test cases by test type allows you to execute a specific type of test cases when required.
#### Configuring ohos.build
...
...
Configure the part build file to associate with specific test cases.
]
}
```
> **NOTE**<br>**test_list** contains the test cases of the corresponding module.
### Configuring Test Case Resources
Test case resources include external file resources, such as image files, video files, and third-party libraries, required for test case execution.
...
...
</target>
</configuration>
```
3. In the build file of the test cases, configure **resource_config_file** to point to the resource file **ohos_test.xml**.
> **NOTE**
>- **target_name** indicates the test suite name defined in the **BUILD.gn** file in the **test** directory.
>- **preparer** indicates the action to perform before the test suite is executed.
>- **src="res"** indicates that the test resources are in the **resource** directory under the **test** directory.
>- **src="out"** indicates that the test resources are in the **out/release/$(*part*)** directory.
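To see how **target_name**, **preparer**, and **src** fit together, the fragment below parses a trimmed-down, hypothetical **ohos_test.xml** with the standard library (the file contents are illustrative, not copied from a real project):

```python
import xml.etree.ElementTree as ET

# Hypothetical, trimmed-down ohos_test.xml used only to illustrate how the
# target_name / preparer / src settings described above fit together.
OHOS_TEST = """
<configuration ver="2.0">
  <target name="CalculatorSubTest">
    <preparer>
      <option name="push" value="test.jpg -> /data/test/resource" src="res"/>
    </preparer>
  </target>
</configuration>
"""

root = ET.fromstring(OHOS_TEST)
target = root.find("target")
option = target.find("preparer/option")
print(target.get("name"))  # CalculatorSubTest
print(option.get("src"))   # res
```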
## Executing Test Cases
Before executing test cases, you need to modify the configuration based on the device used.
...
...
<testcase>true</testcase>
</build>
<environment>
<!-- Configure the IP address and port number of the remote server to support connection to the device through the HDC.-->
<device type="usb-hdc">
<ip></ip>
<port></port>
...
...
</NFS>
</user_config>
```
> **NOTE**<br>If HDC is connected to the device before the test cases are executed, you only need to configure the device IP address and port number, and retain the default settings for other parameters.
### Executing Test Cases on Windows
#### Building Test Cases
...
...
Test cases cannot be built on Windows. You need to run the build command on Linux to generate them.
>- **product-name**: specifies the name of the product to build, for example, **hispark_taurus_standard**.
>- **build-target**: specifies the test case to build. **make_test** indicates all test cases. You can specify the test cases based on requirements.
When the build is complete, the test cases are automatically saved in **out/hispark_taurus/packages/phone/tests**.
#### Setting Up the Execution Environment
1. On Windows, create the **Test** directory in the test framework and then create the **testcase** directory in the **Test** directory.
2. Copy **developertest** and **xdevice** from the Linux environment to the **Test** directory on Windows, and copy the test cases to the **testcase** directory.
> **NOTE**<br>Port the test framework and test cases from the Linux environment to the Windows environment for subsequent execution.
3. Modify the **user_config.xml** file.
```
<build>
...
...
<testcase>false</testcase>
</build>
<test_cases>
<!-- The test cases are copied to the Windows environment. Change the test case output path to the path of the test cases in the Windows environment.-->
<dir>D:\Test\testcase\tests</dir>
</test_cases>
```
> **NOTE**<br>`<testcase>` indicates whether to build test cases. `<dir>` indicates the path for searching for test cases.
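Settings such as `<testcase>` and `<dir>` can be read with a few lines of standard-library XML parsing. A minimal sketch over a trimmed-down, hypothetical **user_config.xml**:

```python
import xml.etree.ElementTree as ET

# A trimmed-down, hypothetical user_config.xml used for illustration only.
CONFIG = """
<user_config>
  <build>
    <testcase>false</testcase>
  </build>
  <test_cases>
    <dir>D:\\Test\\testcase\\tests</dir>
  </test_cases>
</user_config>
"""

root = ET.fromstring(CONFIG)
build_testcases = root.findtext("build/testcase") == "true"
testcase_dir = root.findtext("test_cases/dir")
print(build_testcases)  # False: the test cases were prebuilt on Linux
print(testcase_dir)
```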
#### Executing Test Cases
1. Start the test framework.
```
...
...
```
2. Select the product.
After the test framework starts, you are asked to select a product. Select the development board to test, for example, **Hi3516DV300**.
3. Execute test cases.
...
...
-tm [TESTMODULE]: specifies the module to test. This parameter must be specified together with -tp.
-ts [TESTSUITE]: specifies the test suite. This parameter can be used independently.
-tc [TESTCASE]: specifies the test case. This parameter must be specified together with -ts.
You can run h to display help information.
```
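The dependencies between parameters (-tm with -tp, -tc with -ts) are easy to encode. The sketch below is hypothetical and not the framework's actual implementation:

```python
# Hypothetical sketch of the parameter constraints described above:
# -tm must be combined with -tp, and -tc must be combined with -ts.
REQUIRES = {"-tm": "-tp", "-tc": "-ts"}

def check_flags(flags):
    """Return a list of constraint violations for the given flag set."""
    present = set(flags)
    return [
        "%s requires %s" % (flag, needed)
        for flag, needed in REQUIRES.items()
        if flag in present and needed not in present
    ]

print(check_flags(["-ts", "-tc"]))  # []
print(check_flags(["-tc"]))         # ['-tc requires -ts']
```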
### Executing Test Cases on Linux
#### Mapping the Remote Port
...
...
To enable test cases to be executed on a remote Linux server or a Linux VM, map the remote port as follows:
1. On the HDC server, run the following commands:
```
hdc_std kill
hdc_std -m -s 0.0.0.0:8710
```
> **NOTE**<br>The IP address and port number are default values.
2. On the HDC client, run the following command:
```
hdc_std -s xx.xx.xx.xx:8710 list targets
```
> **NOTE**<br>Enter the IP address of the device to test.
#### Executing Test Cases
1. Start the test framework.
```
...
...
```
2. Select the product.
After the test framework starts, you are asked to select a product. Select the development board to test, for example, **Hi3516DV300**.
3. Execute test cases.
...
...
-tm [TESTMODULE]: specifies the module to test. This parameter must be specified together with -tp.
-ts [TESTSUITE]: specifies the test suite. This parameter can be used independently.
-tc [TESTCASE]: specifies the test case. This parameter must be specified together with -ts.
You can run h to display help information.
```
## Viewing the Test Report
...
...
You can obtain the test result in the following directory:
```
test/developertest/reports/xxxx_xx_xx_xx_xx_xx
```
> **NOTE**<br>The folder for test reports is automatically generated.
The folder contains the following files:
| Type | Description |
| Type| Description|
| ------------ | ------------ |
| result/ | Test cases in standard format. |
| log/plan_log_xxxx_xx_xx_xx_xx_xx.log | Test case logs.|
| summary_report.html | Test report summary.|
| details_report.html | Detailed test report.|
| result/ |Test cases in standard format.|
| log/plan_log_xxxx_xx_xx_xx_xx_xx.log | Test case logs.|
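The report folder name encodes the execution time. Assuming **xxxx_xx_xx_xx_xx_xx** stands for year, month, day, hour, minute, and second (an inference from the placeholder, not a documented guarantee), the path can be reproduced as follows:

```python
from datetime import datetime

def report_dir(timestamp):
    """Build the reports path for a given execution time, assuming the
    xxxx_xx_xx_xx_xx_xx placeholder means year_month_day_hour_minute_second."""
    return "test/developertest/reports/" + timestamp.strftime("%Y_%m_%d_%H_%M_%S")

print(report_dir(datetime(2022, 3, 1, 9, 5, 30)))
# test/developertest/reports/2022_03_01_09_05_30
```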