# Device Test
- [Development Self-Test Framework User Guide](developer_test.md)
- [xDevice User Guide](xdevice.md)
- [XTS Test Case Development Guide](xts.md)
# Development Self-Test Framework User Guide
## Overview
OpenHarmony provides a comprehensive development self-test framework, **developer_test**. As part of the OpenHarmony test toolset, the framework enables you to test your own development work. You can develop test cases based on your test requirements to discover defects at the development phase, greatly improving code quality.
This document describes how to use the development self-test framework of OpenHarmony.
### Introduction
After adding or modifying code, OpenHarmony developers want to quickly verify that the modified code functions properly, and the system already provides a large number of automated test cases for existing functions, such as TDD cases and XTS cases. The development self-test framework helps you improve your self-test efficiency so that you can quickly execute the specified automated test cases and conduct development tests at the development phase.
### Constraints
When executing test cases using the framework, you must connect to the OpenHarmony device in advance.
## Environment Preparations
The development self-test framework depends on the Python environment and requires Python 3.7.5 or later. Before using the framework, configure the environment as described below.
Click [here](https://gitee.com/openharmony/docs/blob/master/en/device-dev/get-code/sourcecode-acquire.md) to obtain the source code.
### Basic Self-Test Framework Environment
| Environment Dependency | Version | Description |
| ----------------- | ------------------------------------------------------------ | ------------------------------------------------------------ |
| Operating system | Ubuntu 18.04 or later | Code compilation environment. |
| Linux extension component| libreadline-dev | Plugin used to read commands. |
| python | 3.7.5 or later | Language used by the test framework. |
| Python plugins | pyserial 3.3 or later, paramiko 2.7.1 or later, setuptools 40.8.0 or later, and rsa 4.0 or later| - **pyserial**: supports Python serial port communication.<br>- **paramiko**: allows Python to use SSH. <br>- **setuptools**: allows Python packages to be created and distributed easily. <br>- **rsa**: implements RSA encryption in Python.|
| NFS server | haneWIN NFS Server 1.2.50 or later, or NFS v4 or later | Used when devices are connected through serial ports (mini- and small-system devices). |
| HDC | 1.1.0 | A tool that connects devices through the OpenHarmony Device Connector (HDC). |
1. Run the following command to install the Linux extended component readline:
```bash
sudo apt-get install libreadline-dev
```
The installation is successful if the following information is displayed:
```
Reading package lists... Done
Building dependency tree
Reading state information... Done
libreadline-dev is already the newest version (7.0-3).
0 upgraded, 0 newly installed, 0 to remove and 11 not upgraded.
```
2. Run the following command to install the **setuptools** plugin:
```bash
pip3 install setuptools
```
The installation is successful if the following information is displayed:
```
Requirement already satisfied: setuptools in d:\programs\python37\lib\site-packages (41.2.0)
```
3. Run the following command to install the **paramiko** plugin:
```bash
pip3 install paramiko
```
The installation is successful if the following information is displayed:
```
Installing collected packages: pycparser, cffi, pynacl, bcrypt, cryptography, paramiko
Successfully installed bcrypt-3.2.0 cffi-1.14.4 cryptography-3.3.1 paramiko-2.7.2 pycparser-2.20 pynacl-1.4.0
```
4. Run the following command to install the **rsa** plugin:
```bash
pip3 install rsa
```
The installation is successful if the following information is displayed:
```
Installing collected packages: pyasn1, rsa
Successfully installed pyasn1-0.4.8 rsa-4.7
```
5. Run the following command to install the **pyserial** plugin:
```bash
pip3 install pyserial
```
The installation is successful if the following information is displayed:
```
Requirement already satisfied: pyserial in d:\programs\python37\lib\site-packages\pyserial-3.4-py3.7.egg (3.4)
```
6. Install the NFS server if the device outputs results only through the serial port.
> This step is applicable to small-system or mini-system devices.
- Windows OS: Install the **haneWIN NFS Server 1.2.50** package.
- Linux OS: Run the following command to install the NFS server:
```bash
sudo apt install nfs-kernel-server
```
The installation is successful if the following information is displayed:
```
Reading package lists... Done
Building dependency tree
Reading state information... Done
nfs-kernel-server is already the newest version (1:1.3.4-2.1ubuntu5.3).
0 upgraded, 0 newly installed, 0 to remove and 11 not upgraded.
```
7. If the device supports HDC connection, install the HDC tool. For details about the installation process, see [HDC-OpenHarmony device connector](https://gitee.com/openharmony/developtools_hdc_standard/blob/master/README.md).
### Environment Dependency Check
| Check Item | Operation | Requirement |
| -------------------------------------------------- | --------------------------------------------------- | ------------------------- |
| Check whether Python is installed successfully. | Run the **python --version** command. | The Python version is 3.7.5 or later. |
| Check whether Python plugins are successfully installed. | Go to the **test/developertest** directory and run **start.bat** or **start.sh**.| The **>>>** prompt is displayed.|
| Check the NFS server status (for the devices that support only serial port output).| Log in to the development board through the serial port and run the **mount** command to mount the NFS. | The file directory can be mounted. |
| Check whether the HDC is successfully installed. | Run the **hdc_std -v** command. | The HDC version is 1.1.0 or later. |
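The version checks in the table can be scripted. The following is a minimal sketch; the commands and minimum versions come from the table above, while the sample output strings (including the HDC output format) are illustrative assumptions:

```python
import re

def version_ok(version_output, minimum):
    """Extract a dotted version number from a command's output and
    compare it against the required minimum version tuple."""
    match = re.search(r"(\d+(?:\.\d+)+)", version_output)
    if not match:
        return False
    found = tuple(int(p) for p in match.group(1).split("."))
    return found >= minimum

# `python --version` must report 3.7.5 or later.
print(version_ok("Python 3.8.10", (3, 7, 5)))  # True
# `hdc_std -v` must report 1.1.0 or later (output format assumed).
print(version_ok("Ver: 1.1.0", (1, 1, 0)))     # True
```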
## Test Case Preparation
The test framework supports multiple types of tests and provides different test case templates for them.
**TDD Test (C++)**
Naming rules for source files
The source file name of a test case must be the same as the test suite name. The file name must use lowercase letters and be in the *Function*_*Sub-function*_**test** format. More specific sub-functions can be added as required.
Example:
```
calculator_sub_test.cpp
```
Test case example
```c++
/*
* Copyright (c) 2021 XXXX Device Co., Ltd.
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
#include "calculator.h"
#include <gtest/gtest.h>
using namespace testing::ext;
class CalculatorSubTest : public testing::Test {
public:
static void SetUpTestCase(void);
static void TearDownTestCase(void);
void SetUp();
void TearDown();
};
void CalculatorSubTest::SetUpTestCase(void)
{
// Set a setup function, which will be called before all test cases.
}
void CalculatorSubTest::TearDownTestCase(void)
{
// Set a teardown function, which will be called after all test cases.
}
void CalculatorSubTest::SetUp(void)
{
// Set a setup function, which will be called before each test case.
}
void CalculatorSubTest::TearDown(void)
{
// Set a teardown function, which will be called after each test case.
}
}
/**
* @tc.name: integer_sub_001
* @tc.desc: Verify the sub function.
* @tc.type: FUNC
* @tc.require: issueNumber
*/
HWTEST_F(CalculatorSubTest, integer_sub_001, TestSize.Level1)
{
// Step 1 Call the function to obtain the test result.
int actual = Sub(4, 0);
// Step 2 Use an assertion to compare the obtained result with the expected result.
EXPECT_EQ(4, actual);
}
```
The procedure is as follows:
1. Add comment information to the test case file header.
```
/*
* Copyright (c) 2021 XXXX Device Co., Ltd.
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
```
2. Add the test framework header file and namespace.
```c++
#include <gtest/gtest.h>
using namespace testing::ext;
```
3. Add the header file of the test class.
```c++
#include "calculator.h"
```
4. Define the test suite (test class).
```c++
class CalculatorSubTest : public testing::Test {
public:
static void SetUpTestCase(void);
static void TearDownTestCase(void);
void SetUp();
void TearDown();
};
void CalculatorSubTest::SetUpTestCase(void)
{
// Set a setup function, which will be called before all test cases.
}
void CalculatorSubTest::TearDownTestCase(void)
{
// Set a teardown function, which will be called after all test cases.
}
void CalculatorSubTest::SetUp(void)
{
// Set a setup function, which will be called before each test case.
}
void CalculatorSubTest::TearDown(void)
{
// Set a teardown function, which will be called after each test case.
}
```
> **NOTE**
>
> When defining a test suite, ensure that the test suite name is the same as the target to build and uses the upper camel case style.
5. Add implementation of the test cases, including test case comments and logic.
```c++
/**
* @tc.name: integer_sub_001
* @tc.desc: Verify the sub function.
* @tc.type: FUNC
* @tc.require: issueNumber
*/
HWTEST_F(CalculatorSubTest, integer_sub_001, TestSize.Level1)
{
// Step 1 Call the function to obtain the test result.
int actual = Sub(4, 0);
// Step 2 Use an assertion to compare the obtained result with the expected result.
EXPECT_EQ(4, actual);
}
```
> **NOTE**
>
> @tc.require: The value must start with **AR/SR** or **issue**, for example, **issueI56WJ7**.
The following test case templates are provided for your reference.
| Type | Description |
| --------------- | ------------------------------------------------ |
| HWTEST(A,B,C) | Use this template if the test case execution does not depend on setup or teardown. |
| HWTEST_F(A,B,C) | Use this template if the test case execution (excluding parameters) depends on setup or teardown.|
| HWTEST_P(A,B,C) | Use this template if the test case execution (including parameters) depends on setup or teardown. |
In the template names:
- **A** indicates the test suite name.
- **B** indicates the test case name, which is in the *Function*_*No.* format. The *No.* is a three-digit number starting from **001**.
- **C** indicates the test case level. There are five test case levels: guard-control level 0 and non-guard-control levels 1 to 4. Among levels 1 to 4, a smaller value indicates a more important function verified by the test case.
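As a sketch of the naming rule for **B** above, a case name such as **integer_sub_001** can be checked against the *Function*_*No.* pattern with a regular expression (a hypothetical helper, not part of the framework):

```python
import re

# Lowercase function-name segments joined by underscores, followed by
# a three-digit case number starting from 001 (000 is not allowed).
CASE_NAME = re.compile(r"^[a-z]+(?:_[a-z]+)*_(?!000)\d{3}$")

def is_valid_case_name(name):
    return bool(CASE_NAME.match(name))

print(is_valid_case_name("integer_sub_001"))  # True
print(is_valid_case_name("IntegerSub001"))    # False: not lowercase
```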
**NOTE**
- The expected result of each test case must have an assertion.
- The test case level must be specified.
- It is recommended that the test be implemented step by step according to the template.
- The comment must contain the test case name, description, type, and requirement number, which are in the @tc.*xxx*: *value* format. The test case type @**tc.type** can be any of the following:
| Test Case Type| Code|
| ------------ | -------- |
| Function test | FUNC |
| Performance test | PERF |
| Reliability test | RELI |
| Security test | SECU |
| Fuzzing test | FUZZ |
**TDD Test (JS)**
- Naming rules for source files
The source file name of a test case must be in the *Function*+*Sub-function*+**Test** format with no separators, and each part must use the upper camel case style. More specific sub-functions can be added as required.
Example:
```
AppInfoTest.js
```
- Test case example
```js
/*
* Copyright (C) 2021 XXXX Device Co., Ltd.
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
import app from '@system.app'
import {describe, beforeAll, beforeEach, afterEach, afterAll, it, expect} from 'deccjsunit/index'
describe("AppInfoTest", function () {
beforeAll(function() {
// Set a setup function, which will be called before all test cases.
console.info('beforeAll called')
})
afterAll(function() {
// Set a teardown function, which will be called after all test cases.
console.info('afterAll called')
})
beforeEach(function() {
// Set a setup function, which will be called before each test case.
console.info('beforeEach called')
})
afterEach(function() {
// Set a teardown function, which will be called after each test case.
console.info('afterEach called')
})
/*
* @tc.name:appInfoTest001
* @tc.desc:verify app info is not null
* @tc.type: FUNC
* @tc.require: issueNumber
*/
it("appInfoTest001", 0, function () {
// Step 1 Call the function to obtain the test result.
var info = app.getInfo()
// Step 2 Use an assertion to compare the obtained result with the expected result.
expect(info != null).assertEqual(true)
})
})
```
The procedure is as follows:
1. Add comment information to the test case file header.
```
/*
* Copyright (C) 2021 XXXX Device Co., Ltd.
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
```
2. Import the APIs to test and the JSUnit test library.
```js
import app from '@system.app'
import {describe, beforeAll, beforeEach, afterEach, afterAll, it, expect} from 'deccjsunit/index'
```
3. Define the test suite (test class).
```js
describe("AppInfoTest", function () {
beforeAll(function() {
// Set a setup function, which will be called before all test cases.
console.info('beforeAll called')
})
afterAll(function() {
// Set a teardown function, which will be called after all test cases.
console.info('afterAll called')
})
beforeEach(function() {
// Set a setup function, which will be called before each test case.
console.info('beforeEach called')
})
afterEach(function() {
// Set a teardown function, which will be called after each test case.
console.info('afterEach called')
})
```
4. Add implementation of the test cases.
```JS
/*
* @tc.name:appInfoTest001
* @tc.desc:verify app info is not null
* @tc.type: FUNC
* @tc.require: issueNumber
*/
it("appInfoTest001", 0, function () {
// Step 1 Call the function to obtain the test result.
var info = app.getInfo()
// Step 2 Use an assertion to compare the obtained result with the expected result.
expect(info != null).assertEqual(true)
})
```
> **NOTE**
>
> @tc.require: The value must start with **issue**, for example, **issueI56WJ7**.
**Fuzzing Test**
[Fuzzing case specifications](https://gitee.com/openharmony/test_developertest/blob/master/libs/fuzzlib/README_zh.md)
**Benchmark Test**
[Benchmark case specifications](https://gitee.com/openharmony/test_developertest/blob/master/libs/benchmark/README_zh.md)
## **Test Case Building**
When a test case is executed, the test framework searches for the build file of the test case in the test case directory and builds the test cases it finds. The following describes how to write build files (GN files) for different programming languages.
**TDD Test**
The following provides templates for different languages for your reference.
- **Test case build file example (C++)**
```
# Copyright (c) 2021 XXXX Device Co., Ltd.
import("//build/test.gni")
module_output_path = "developertest/calculator"
config("module_private_config") {
visibility = [ ":*" ]
include_dirs = [ "../../../include" ]
}
ohos_unittest("CalculatorSubTest") {
module_out_path = module_output_path
sources = [
"../../../include/calculator.h",
"../../../src/calculator.cpp",
]
sources += [ "calculator_sub_test.cpp" ]
configs = [ ":module_private_config" ]
deps = [ "//third_party/googletest:gtest_main" ]
}
group("unittest") {
testonly = true
deps = [":CalculatorSubTest"]
}
```
The procedure is as follows:
1. Add comment information for the file header.
```
# Copyright (c) 2021 XXXX Device Co., Ltd.
```
2. Import the build template.
```
import("//build/test.gni")
```
3. Specify the file output path.
```
module_output_path = "developertest/calculator"
```
> **NOTE**<br>The output path is ***Part name*/*Module name***.
4. Configure the directories for dependencies.
```
config("module_private_config") {
visibility = [ ":*" ]
include_dirs = [ "../../../include" ]
}
```
> **NOTE**
>
> Generally, the dependency directories are configured here and directly referenced in the build script of the test case.
5. Set the output build file for the test cases.
```
ohos_unittest("CalculatorSubTest") {
}
```
6. Write the build script (add the source file, configuration, and dependencies) for the test cases.
```
ohos_unittest("CalculatorSubTest") {
module_out_path = module_output_path
sources = [
"../../../include/calculator.h",
"../../../src/calculator.cpp",
]
sources += [ "calculator_sub_test.cpp" ]
configs = [ ":module_private_config" ]
deps = [ "//third_party/googletest:gtest_main" ]
}
```
> **NOTE**
>
> Set the test type based on actual requirements. The following test types are available:
>
> - **ohos_unittest**: unit test
> - **ohos_moduletest**: module test
> - **ohos_systemtest**: system test
> - **ohos_performancetest**: performance test
> - **ohos_securitytest**: security test
> - **ohos_reliabilitytest**: reliability test
> - **ohos_distributedtest**: distributed test
7. Group the test case files by test type.
```
group("unittest") {
testonly = true
deps = [":CalculatorSubTest"]
}
```
> **NOTE**
>
> Grouping test cases by test type allows you to execute a specific type of test cases when required.
- **Test case build file example (JavaScript)**
```
# Copyright (C) 2021 XXXX Device Co., Ltd.
import("//build/test.gni")
module_output_path = "developertest/app_info"
ohos_js_unittest("GetAppInfoJsTest") {
module_out_path = module_output_path
hap_profile = "./config.json"
certificate_profile = "//test/developertest/signature/openharmony_sx.p7b"
}
group("unittest") {
testonly = true
deps = [ ":GetAppInfoJsTest" ]
}
```
The procedure is as follows:
1. Add comment information for the file header.
```
# Copyright (C) 2021 XXXX Device Co., Ltd.
```
2. Import the build template.
```
import("//build/test.gni")
```
3. Specify the file output path.
```
module_output_path = "developertest/app_info"
```
> **NOTE**
>
> The output path is ***Part name*/*Module name***.
4. Set the output build file for the test cases.
```
ohos_js_unittest("GetAppInfoJsTest") {
}
```
> **NOTE**
> - Use the **ohos_js_unittest** template to define the JavaScript test suite. Pay attention to the difference between JavaScript and C++.
> - The file generated for the JavaScript test suite must be in .hap format and named after the test suite name defined here. The test suite name must end with **JsTest**.
5. Configure the **config.json** file and signature file, which are mandatory.
```
ohos_js_unittest("GetAppInfoJsTest") {
module_out_path = module_output_path
hap_profile = "./config.json"
certificate_profile = "//test/developertest/signature/openharmony_sx.p7b"
}
```
**config.json** is the configuration file required for HAP build. You need to set **target** based on the tested SDK version. Default values can be retained for other items. The following is an example:
```json
{
"app": {
"bundleName": "com.example.myapplication",
"vendor": "example",
"version": {
"code": 1,
"name": "1.0"
},
"apiVersion": {
"compatible": 4,
"target": 5 // Set it based on the tested SDK version. In this example, SDK5 is used.
}
},
"deviceConfig": {},
"module": {
"package": "com.example.myapplication",
"name": ".MyApplication",
"deviceType": [
"phone"
],
"distro": {
"deliveryWithInstall": true,
"moduleName": "entry",
"moduleType": "entry"
},
"abilities": [
{
"skills": [
{
"entities": [
"entity.system.home"
],
"actions": [
"action.system.home"
]
}
],
"name": "com.example.myapplication.MainAbility",
"icon": "$media:icon",
"description": "$string:mainability_description",
"label": "MyApplication",
"type": "page",
"launchType": "standard"
}
],
"js": [
{
"pages": [
"pages/index/index"
],
"name": "default",
"window": {
"designWidth": 720,
"autoDesignWidth": false
}
}
]
}
}
```
6. Group the test case files by test type.
```
group("unittest") {
testonly = true
deps = [ ":GetAppInfoJsTest" ]
}
```
> **NOTE**
>
> Grouping test cases by test type allows you to execute a specific type of test cases when required.
**Fuzzing Test**
[Fuzzing case specifications](https://gitee.com/openharmony/test_developertest/blob/master/libs/fuzzlib/README_zh.md)
**Benchmark Test**
[Benchmark case specifications](https://gitee.com/openharmony/test_developertest/blob/master/libs/benchmark/README_zh.md)
**Configuring ohos.build**
Configure the part build file to associate with specific test cases.
```
"partA": {
"module_list": [
],
"inner_list": [
],
"system_kits": [
],
"test_list": [ // Test cases of the calculator module
"//system/subsystem/partA/calculator/test:unittest",
"//system/subsystem/partA/calculator/test:fuzztest",
"//system/subsystem/partA/calculator/test:benchmarktest"
]
}
```
> **NOTE**<br>**test_list** contains the test cases of the corresponding module.
## Configuring Test Resources
Test resources include external file resources, such as image files, video files, and third-party libraries, required for test case execution.
Perform the following steps:
1. Create a **resource** directory under the **test** directory of the part, create a corresponding module directory under the **resource** directory, and store the resource files required in this module directory.
2. In the module directory under **resource**, create the **ohos_test.xml** file in the following format:
```xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration ver="2.0">
<target name="CalculatorSubTest">
<preparer>
<option name="push" value="test.jpg -> /data/test/resource" src="res"/>
<option name="push" value="libc++.z.so -> /data/test/resource" src="out"/>
</preparer>
</target>
</configuration>
```
3. In the build file of the test cases, configure **resource_config_file** to point to the resource file **ohos_test.xml**.
```
ohos_unittest("CalculatorSubTest") {
resource_config_file = "//system/subsystem/partA/test/resource/calculator/ohos_test.xml"
}
```
>**NOTE**
>- **target_name** indicates the test suite name defined in the **BUILD.gn** file in the **test** directory. **preparer** indicates the action to perform before the test suite is executed.
>- **src="res"** indicates that the test resources are in the **resource** directory under the **test** directory. **src="out"** indicates that the test resources are in the **out/release/$(*part*)** directory.
## Test Case Execution
### Configuration File
Before executing test cases, you need to modify the configuration based on the device used.
#### Modifying user_config.xml
```xml
<user_config>
<build>
<!-- Whether to build a demo case. The default value is false. If a demo case is required, change the value to true. -->
<example>false</example>
<!-- Whether to build the version. The default value is false. -->
<version>false</version>
<!-- Whether to build the test cases. The default value is true. If the build is already complete, change the value to false before executing the test cases.-->
<testcase>true</testcase>
<!-- Specify whether the target CPU for building test cases is 64-bit or 32-bit. The default value is empty (32-bit). You can set it to arm64. -->
<parameter>
<target_cpu></target_cpu>
</parameter>
</build>
<environment>
<!-- Configure the IP address and port number of the remote server to support connection to the device through the OpenHarmony Device Connector (HDC).-->
<device type="usb-hdc">
<ip></ip>
<port></port>
<sn></sn>
</device>
<!-- Configure the serial port information of the device to enable connection through the serial port.-->
<device type="com" label="ipcamera">
<serial>
<com></com>
<type>cmd</type>
<baud_rate>115200</baud_rate>
<data_bits>8</data_bits>
<stop_bits>1</stop_bits>
<timeout>1</timeout>
</serial>
</device>
</environment>
<!-- Configure the test case path. If the test cases have not been built (<testcase> is true), leave this parameter blank. If the build is complete, enter the path of the test cases.-->
<test_cases>
<dir></dir>
</test_cases>
<!-- Configure the coverage output path.-->
<coverage>
<outpath></outpath>
</coverage>
<!-- Configure the NFS mount information when the tested device supports only the serial port connection. Specify the NFS mapping path. host_dir indicates the NFS directory on the PC, and board_dir indicates the directory created on the board. -->
<NFS>
<host_dir></host_dir>
<mnt_cmd></mnt_cmd>
<board_dir></board_dir>
</NFS>
</user_config>
```
>**NOTE**
>
>If HDC is connected to the device before the test cases are executed, you only need to configure the device IP address and port number, and retain the default settings for other parameters.
### Command Description
1. Start the test framework.
```
start.bat
```
2. Select the product.
After the test framework starts, you are asked to select a product. Select the development board to test.
If you need to manually add a product, add it within the **\<productform\>** tag in **config/framework_config.xml**.
3. Execute test cases.
Run the following command to execute test cases:
```
run -t UT -ts CalculatorSubTest -tc integer_sub_001
```
In the command:
- **-t [TESTTYPE]**: specifies the test type, which can be **UT**, **MST**, **ST**, **PERF**, **FUZZ**, or **BENCHMARK**. This parameter is mandatory.
- **-tp [TESTPART]**: specifies the part to test. This parameter can be used independently.
- **-tm [TESTMODULE]**: specifies the module to test. This parameter must be specified together with **-tp**.
- **-ts [TESTSUITE]**: specifies a test suite. This parameter can be used independently.
- **-tc [TESTCASE]**: specifies a test case. This parameter must be specified together with **-ts**.
- **-h**: displays help information.
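The parameter dependencies described above (**-tc** requires **-ts**, and **-tm** requires **-tp**) can be sketched as a small validation helper. This is a hypothetical illustration of the rules, not part of the framework:

```python
def check_run_options(opts):
    """Validate option combinations for the run command:
    -t is mandatory, -tc requires -ts, and -tm requires -tp."""
    if "-t" not in opts:
        return "error: -t is mandatory"
    if "-tc" in opts and "-ts" not in opts:
        return "error: -tc must be specified together with -ts"
    if "-tm" in opts and "-tp" not in opts:
        return "error: -tm must be specified together with -tp"
    return "ok"

print(check_run_options({"-t": "UT", "-ts": "CalculatorSubTest", "-tc": "integer_sub_001"}))  # ok
print(check_run_options({"-t": "UT", "-tc": "integer_sub_001"}))  # error: -tc must be specified together with -ts
```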
#### Executing Test Cases on Windows
Test cases cannot be built on Windows. You need to run the following command to build test cases on Linux:
```
./build.sh --product-name {product_name} --build-target make_test
```
>**NOTE**
>- **product-name**: specifies the name of the product to be compiled.
>- **build-target**: specifies the test case to build. **make_test** indicates all test cases. You can specify the test cases based on requirements.
After the build is complete, the test cases are automatically saved in **out/ohos-arm-release/packages/phone/tests**.
##### Setting Up the Execution Environment
1. On Windows, create the **Test** directory in the test framework and then create the **testcase** directory in the **Test** directory.
2. Copy **developertest** and **xdevice** from the Linux environment to the **Test** directory on Windows, and copy the test cases to the **testcase** directory.
>**NOTE**
>
>Port the test framework and test cases from the Linux environment to the Windows environment for subsequent execution.
3. Modify the **user_config.xml** file.
```xml
<build>
<!-- Because the test cases have been built, change the value to false. -->
<testcase>false</testcase>
</build>
<test_cases>
<!-- The test cases are copied to the Windows environment. Change the test case output path to the path of the test cases in the Windows environment.-->
<dir>D:\Test\testcase\tests</dir>
</test_cases>
```
>**NOTE**
>
>**\<testcase>** indicates whether to build test cases. **\<dir>** indicates the path for searching for test cases.
#### Executing Test Cases on Linux
If the Linux host is directly connected to the device, you can run commands to execute test cases directly.
##### Mapping the Remote Port
To enable test cases to be executed on a remote Linux server or a Linux VM, map the port to enable communication between the device and the remote server or VM. Configure port mapping as follows:
1. On the HDC server, run the following commands:
```
hdc_std kill
hdc_std -m -s 0.0.0.0:8710
```
>**NOTE**
>
>The IP address and port number are default values.
2. On the HDC client, run the following command:
```
hdc_std -s xx.xx.xx.xx:8710 list targets
```
>**NOTE**
>
>Enter the IP address of the device to test.
## Viewing the Test Result
### Test Report Logs
After the test cases are executed, the test result will be automatically generated. You can view the detailed test result in the related directory.
### Test Result
You can obtain the test result in the following directory:
```
test/developertest/reports/xxxx_xx_xx_xx_xx_xx
```
>**NOTE**
>
>The folder for test reports is automatically generated.
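The folder name is a timestamp. Assuming the **xxxx_xx_xx_xx_xx_xx** fields are the year, month, day, hour, minute, and second of the run (an assumption about the pattern, for illustration only), the path can be sketched as:

```python
from datetime import datetime

def report_dir(when):
    # test/developertest/reports/<year>_<month>_<day>_<hour>_<minute>_<second>
    return "test/developertest/reports/" + when.strftime("%Y_%m_%d_%H_%M_%S")

print(report_dir(datetime(2021, 1, 2, 3, 4, 5)))
# test/developertest/reports/2021_01_02_03_04_05
```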
The folder contains the following files:
| Type | Description |
| ------------------------------------ | ------------------ |
| result/ | Test cases in standard format.|
| log/plan_log_xxxx_xx_xx_xx_xx_xx.log | Test case logs. |
| summary_report.html | Test report summary. |
| details_report.html | Detailed test report. |
### Test Framework Logs
```
reports/platform_log_xxxx_xx_xx_xx_xx_xx.log
```
### Latest Test Report
```
reports/latest
```
# xDevice User Guide
## Overview
As an open-source OS, OpenHarmony supports product development in many chip scenarios. To ensure compatibility of the OpenHarmony ecosystem, OpenHarmony provides the [compatibility test service](https://www.openharmony.cn/certification/document/guid). For related products, API tests are required for verification. However, executing a large number of automated test cases requires a scheduling and execution framework that supports capabilities such as generating visual test reports. Therefore, we designed and developed the xDevice test scheduling and execution framework.
### Introduction
The xDevice test scheduling and execution framework is a core component of the test infrastructure of OpenHarmony. It provides related services required for scheduling and executing automated test cases, supports scheduling and execution of a large number of automated test cases, as well as supports the generation of visual test reports. The binary package of xDevice will be compiled together with the XTS suite of OpenHarmony. You can obtain the xDevice tool from the XTS archiving path.
Based on the device type, xDevice executes test tasks in the following scenarios:
- Perform XTS tests for mini-system devices (such as the Hi3861 development board).
- Perform XTS tests for small-system devices (such as the Hi3516 development board).
- Perform XTS tests for standard-system devices (such as the RK3568 development board).
### Implementation Principles
The xDevice tool includes the following functional modules:
- **command**: enables command-based interactions between users and the test platform. It parses and processes user commands.
- **config**: sets test framework configurations and provides different configuration options for the serial port connection and USB connection modes.
- **driver**: functions as a test case executor, which defines main test steps, such as test case distribution, execution, and result collection.
- **report**: parses test results and generates test reports.
- **scheduler**: schedules various test case executors in the test framework.
- **environment**: configures the test framework environment, enabling device discovery and device management.
- **testkit**: provides test tools to implement JSON parsing, network file mounting, etc.
- **log**: records task logs and device logs.
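The flow through these modules can be pictured with a purely illustrative sketch. None of the class or method names below are part of the actual xDevice API; they only mirror the command-to-scheduler-to-driver-to-report pipeline described above:

```python
# Illustrative only: a toy pipeline mirroring xDevice's module roles.
# None of these classes exist in xDevice itself.

class Driver:
    """driver: executes one test suite and collects raw results."""
    def run(self, suite):
        return {"suite": suite, "passed": 1, "failed": 0}

class Report:
    """report: turns raw results into a human-readable summary."""
    def render(self, results):
        total = sum(r["passed"] + r["failed"] for r in results)
        return f"{total} case(s) executed in {len(results)} suite(s)"

class Scheduler:
    """scheduler: dispatches suites to drivers, then hands off to report."""
    def execute(self, suites):
        results = [Driver().run(s) for s in suites]
        return Report().render(results)

# command: the user-facing entry point parses e.g. "run -l A;B" into suites.
print(Scheduler().execute(["ActsWifiServiceTest", "ActsLwipTest"]))
```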
In addition to the preceding functional modules, the framework depends on user-defined configuration files, which are classified into two types:
**Test Task Configuration File**
The test task configuration file provided by the framework is **user_config.xml**. You can modify the configuration file based on your environment information.
The environment configuration is as follows:
```xml
<environment>
<!-- Large-system device configuration -->
<device type="usb-hdc"> <!-- type: device connection mode. usb-hdc (default) indicates to control devices using the HDC. Currently, the framework supports only one USB device. -->
<ip></ip> <!-- ip: remote device IP address. The local device is used if ip and port are empty, and the remote device is used if ip and port are not empty. -->
<port></port> <!-- port: remote device port number. -->
<sn></sn> <!-- sn: device SN list. SNs are separated using semicolons (;). All local devices are used when the SN is empty. The device with the specified SN will be used if the SN is not empty. -->
</device>
<!-- L0 device configuration -->
<device type="com" label="wifiiot"> <!-- type: device connection mode. com indicates the serial port connection mode. label indicates the device type, for example, wifiiot. -->
<serial> <!-- serial: serial port definition. -->
<com></com> <!-- com: serial port of the local connection, for example, COM20. -->
<type>cmd</type> <!-- type indicates the serial port type. cmd is the command serial port. -->
<baud_rate>115200</baud_rate> <!-- baud_rate, data_bits, stop_bits, timeout: serial port parameters. Generally, the default values are used. -->
<data_bits>8</data_bits>
<stop_bits>1</stop_bits>
<timeout>20</timeout>
</serial>
<serial>
<com></com>
<type>deploy</type> <!-- type indicates the serial port type. deploy indicates the deployment serial port. -->
<baud_rate>115200</baud_rate>
</serial>
</device>
<!-- L1 device local connection configuration -->
<device type="com" label="ipcamera">
<serial>
<com></com>
<type>cmd</type>
<baud_rate>115200</baud_rate>
<data_bits>8</data_bits>
<stop_bits>1</stop_bits>
<timeout>1</timeout>
</serial>
</device>
<!-- L1 device remote connection configuration. Multiple connections can be configured. -->
<device type="com" label="ipcamera">
<ip></ip>
<port></port>
</device>
</environment>
```
Set the test case directory.
```xml
<testcases>
<!-- If both dir and server are configured, only one of them takes effect. -->
<!-- Specify the test case directory. If this parameter is empty, the testcase directory in the current project will be used. -->
<dir></dir>
<!-- NFS mounting configuration. Set the value to NfsServer. -->
<server label="NfsServer">
<ip></ip> <!-- Mounting environment IP address. -->
<port></port> <!-- Mounting environment port. -->
<dir></dir> <!-- Mounted external path. -->
<username></username> <!-- Login user name. -->
<password></password> <!-- Login user password. -->
<remote></remote> <!-- remote: whether the NFS server and the xDevice executor are deployed on different devices. If yes, set this parameter to true. Otherwise, set it to false. -->
</server>
</testcases>
```
Set the resource directory.
```xml
<resource>
<!-- Specify the resource directory. If this parameter is empty, the resource directory in the current project will be used. -->
<dir></dir>
</resource>
```
Set the log level.
```xml
<!-- The default level is INFO. For more detailed information, change the level to DEBUG. -->
<loglevel>INFO</loglevel>
```
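Because **user_config.xml** is plain XML, its settings can also be inspected with standard tooling. The following sketch reads the device SN list and log level from a trimmed-down sample; the element names are taken from the examples above, and the parsing rules (semicolon-separated SNs, empty meaning "all local devices") follow the inline comments:

```python
import xml.etree.ElementTree as ET

# A trimmed-down user_config.xml using only elements shown above.
SAMPLE = """
<user_config>
  <environment>
    <device type="usb-hdc">
      <ip></ip>
      <port></port>
      <sn>sn1;sn2</sn>
    </device>
  </environment>
  <loglevel>INFO</loglevel>
</user_config>
"""

root = ET.fromstring(SAMPLE)
# SNs are separated by semicolons; an empty element means "all local devices".
sn_text = root.findtext("environment/device/sn") or ""
sn_list = [sn for sn in sn_text.split(";") if sn]
log_level = root.findtext("loglevel")
print(sn_list, log_level)
```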
**Test Suite Configuration File**
The test support suite executed by the device is specified by the test configuration file.
Each test suite has a test configuration file, which mainly specifies the test support suites (kits) that need to be used. In addition, the setup and teardown operations are supported.
The following is a configuration file example:
```json
{
// Description of the test support suite.
"description": "Configuration for aceceshi Tests",
// Specify the device for executing the current test support suite.
"environment": {
"type": "device",
"label": "wifiiot"
},
// Specify the driver executed by the device.
"driver": {
"type": "OHJSUnitTest",
"test-timeout": "700000",
"bundle-name": "com.open.harmony.acetestfive",
"package-name": "com.open.harmony.acetestfive",
"shell-timeout": "700000"
},
// The kit is mainly used to support test execution activities, including the setup operation before the test and the teardown operation after the test.
"kits": [
{
"type": "ShellKit",
"run-command": [
"remount",
"mkdir /data/data/resource",
"chmod -R 777 /data/data/resource",
"settings put secure adb_install_need_confirm 0"
],
"teardown-command": [
"remount",
"rm -rf /data/data/resource"
]
},
]
}
```
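Note that the sample above uses //-style comments (and a trailing comma) for annotation, which strict JSON parsers reject. If you want to load such an annotated file yourself, a naive cleanup sketch is shown below; the stripping rules are an assumption for illustration, not part of xDevice, and would also mangle "//" occurring inside string values:

```python
import json
import re

def load_annotated_json(text):
    """Strip //-comments and trailing commas, then parse as JSON."""
    text = re.sub(r"//[^\n]*", "", text)        # drop line comments
    text = re.sub(r",\s*([}\]])", r"\1", text)  # drop trailing commas
    return json.loads(text)

sample = """
{
    // Specify the driver executed by the device.
    "driver": {"type": "OHJSUnitTest"},
    "kits": [
        {"type": "ShellKit"},
    ]
}
"""
config = load_annotated_json(sample)
print(config["driver"]["type"])
```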
### Test Commands
Test commands can be classified into three groups: **help**, **list**, and **run**. Among them, **run** commands are the most commonly used.
------
You can run **help** commands to obtain help information about the test framework commands.
```text
help:
use help to get information.
usage:
run: Display a list of supported run command.
list: Display a list of supported device and task record.
Examples:
help run
help list
```
**NOTE**
**help run**: displays the description of **run** commands.
**help list**: displays the description of **list** commands.
------
You can run **list** commands to display device information and related task information.
```text
list:
This command is used to display device list and task record.
usage:
list
list history
list <id>
Introduction:
list: display device list
list history: display history record of a serial of tasks
list <id>: display history record about task what contains specific id
Examples:
list
list history
list 6e****90
```
**NOTE**
**list**: displays device information.
**list history**: displays historical task information.
**list *\<id>***: displays historical information about tasks with specified IDs.
------
Run the **run** commands to execute test tasks.
```text
run:
This command is used to execute the selected testcases.
It includes a series of processes such as use case compilation, execution, and result collection.
usage: run [-l TESTLIST [TESTLIST ...] | -tf TESTFILE
[TESTFILE ...]] [-tc TESTCASE] [-c CONFIG] [-sn DEVICE_SN]
[-rp REPORT_PATH [REPORT_PATH ...]]
[-respath RESOURCE_PATH [RESOURCE_PATH ...]]
[-tcpath TESTCASES_PATH [TESTCASES_PATH ...]]
[-ta TESTARGS [TESTARGS ...]] [-pt]
[-env TEST_ENVIRONMENT [TEST_ENVIRONMENT ...]]
[-e EXECTYPE] [-t [TESTTYPE [TESTTYPE ...]]]
[-td TESTDRIVER] [-tl TESTLEVEL] [-bv BUILD_VARIANT]
[-cov COVERAGE] [--retry RETRY] [--session SESSION]
[--dryrun] [--reboot-per-module] [--check-device]
[--repeat REPEAT]
action task
Specify tests to run.
positional arguments:
action Specify action
task Specify task name,such as "ssts", "acts", "hits"
```
The table below describes how to use **run** commands.
| xDevice Command | Function | Example |
| :----------: | :----------------------------------------------------------: | :----------------------------------------------------------: |
| run xts | Runs all XTS modules of the specified type, for example, **acts**, **hits**, and **ssts**. | run acts |
| run -l xxx | Runs specified module test suites. Multiple module test suites are separated using semicolons (;). | run -l ActsWifiServiceTest;ActsLwipTest |
| run -sn | Specifies the SNs of the executed devices. Multiple SNs are separated using semicolons (;). | run acts -sn 10.117.183.37:17001<br>run acts -sn 88Y02******57723;VEG02******16642 |
| run -rp | Specifies the report generation path. By default, subdirectories will be created under **reports** of the work directory using the timestamp or task ID.| run acts -rp /suites/hits/xdevice_reports/2020.09.28-14.21.26 |
| run -respath | Specifies the test resource path. The default value is **resource**. | run acts -respath /cloud/zidane/xts/release/suites/resource |
| run -ta | Specifies module running parameters. You can specify test cases in the running module. Multiple cases are separated using commas (,). Currently, the JS driver test suite is supported.| run acts -ta class:ohos.hardware.soundtrigger.SoundTriggerTest#testKeyphraseParcelUnparcel_noUsers |
| run --retry | Executes failed test cases of the previous task to generate a new test report. | run --retry<br>run --retry --session 2020-10-30-17-15-11 (task directory name)|
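For example, a typical session might run selected module test suites and then retry the failed cases (the session directory name is illustrative):

```text
run -l ActsWifiServiceTest;ActsLwipTest
run --retry --session 2020-10-30-17-15-11
```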
### Test Report
After the test framework executes the **run** commands, the console outputs the corresponding logs, and the execution report is generated. The report is generated in the path specified by the **-rp** parameter if set. If the parameter is not set, the report will be generated in the default directory.
```text
Structure of the report directory (the default or the specified one)
├── result (test case execution results of the module)
│ ├── *Module name*.xml
│ ├── ... ...
├── log (running logs of devices and tasks)
│ ├── *Device 1*.log
│ ├── ... ...
│ ├── *<Task>*.log
├── summary_report.xml (task summary report)
├── summary_report.html (task summary visual report)
├── details_report.html (case execution visual report)
├── failures_report.html (failed case visual report, which will not be generated if no case fails)
├── summary.ini (records information such as the used device, start time, and end time)
├── task_info.record (records the executed commands and the list of failed cases)
├── xxxx.zip (compression file generated by compressing the preceding files)
├── summary_report.hash (file obtained by encrypting the compression file using SHA-256)
└── ... ...
```
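Since **summary_report.hash** holds the SHA-256 digest of the compressed report, the archive can be checked offline. A minimal sketch follows; the assumption here is that the hash file stores the plain hex digest, and the demo file names merely stand in for **xxxx.zip** and **summary_report.hash**:

```python
import hashlib

def report_hash_matches(zip_path, hash_path):
    """Return True if the zip's SHA-256 digest equals the recorded one."""
    with open(zip_path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    with open(hash_path, "r") as f:
        recorded = f.read().strip()
    return digest == recorded

# Demo with throwaway files standing in for the real report artifacts.
with open("demo.zip", "wb") as f:
    f.write(b"report-bytes")
with open("demo.hash", "w") as f:
    f.write(hashlib.sha256(b"report-bytes").hexdigest())
print(report_hash_matches("demo.zip", "demo.hash"))
```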
## Environment Preparations
### Environment Requirements
- Python version: 3.7 or later
- pyserial: 3.3 or later
- paramiko: 2.7.1 or later
- rsa: 4.0 or later
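The minimum versions above can be checked programmatically. The sketch below does a simple numeric comparison of dotted version strings (no pre-release or suffix handling):

```python
def meets_minimum(installed, minimum):
    """Compare dotted version strings numerically, e.g. '2.7.1' >= '2.7'."""
    as_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return as_tuple(installed) >= as_tuple(minimum)

print(meets_minimum("3.7.5", "3.7"))   # Python requirement
print(meets_minimum("2.7.1", "2.7"))   # paramiko requirement
print(meets_minimum("3.6.9", "3.7"))   # below the Python minimum
```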
### Installing the xDevice
- Install the basic framework of xDevice.
1. Go to the root directory of xDevice.
```bash
cd testfwk_xdevice
```
2. Open the console and run the following command:
```bash
python setup.py install
```
- Install the OpenHarmony driver plugin **ohos**.
1. Go to the **plugin/ohos** directory.
```bash
cd testfwk_xdevice/plugin/ohos
```
2. Open the console and run the following command as the current user:
```bash
python setup.py install
```
### Verifying the Environment
Check whether xDevice is installed successfully.
1. Go to the root directory of xDevice.
```bash
cd testfwk_xdevice
```
2. Open the console and run the following command:
```bash
python -m pip list
```
3. Check whether the **xdevice** and **xdevice-ohos** libraries are successfully installed.
```text
xdevice 0.0.0
xdevice-ohos 0.0.0
```
Check whether xDevice runs properly.
1. Go to the root directory of xDevice.
```bash
cd testfwk_xdevice
```
2. Open the console and run the following command:
```bash
python -m xdevice
```
3. Check whether the following information is displayed on the console:
```text
[2022-10-13 15:43:31,284] [30076] [Main] [INFO] [*************** xDevice Test Framework 2.11.0.1091 Starting ***************]
[2022-10-13 15:43:31,286] [30076] [ManagerLite] [WARNING] [wifiiot local com cannot be empty, please check]
[2022-10-13 15:43:31,286] [30076] [ManagerLite] [WARNING] [ipcamera local com cannot be empty, please check]
[2022-10-13 15:43:31,287] [30076] [ManagerLite] [WARNING] [device com or ip cannot be empty, please check]
>>>
```
## Mini-System Device XTS Test Guide (wifiiot)
1. Identify the serial port usage and modify the **user_config.xml** file in the root directory.
The COM port whose **type** is **cmd** corresponds to the AT command serial port on the board. The port is used to send commands to the device. In the example, the **ChA(COM20)** serial port is used.
The COM port whose **type** is **deploy** corresponds to the log output serial port on the board. The port is used to burn the image and print logs. In the example, the **ChB(COM18)** serial port is used.
If the AT command serial port is the same as the log output serial port, the serial ports can be set to the same port. That is, in the **user_config** file, the COM port whose **type** is **cmd** and the COM port whose **type** is **deploy** can be set to the same port, for example, **COM18**.
![L0-1](figures/L0-1.PNG)
The following is an example of the modified **user_config.xml** file:
```xml
<user_config>
<environment>
<device type="com" label="wifiiot">
<serial>
<com>com20</com>
<type>cmd</type>
<baud_rate>115200</baud_rate>
<data_bits>8</data_bits>
<stop_bits>1</stop_bits>
<timeout>20</timeout>
</serial>
<serial>
<com>com18</com>
<type>deploy</type>
<baud_rate>115200</baud_rate>
</serial>
</device>
</environment>
<testcases>
<dir></dir>
<server label="NfsServer">
<ip></ip>
<port></port>
<dir></dir>
<username></username>
<password></password>
<remote></remote>
</server>
</testcases>
<resource>
<dir></dir>
</resource>
<loglevel>DEBUG</loglevel>
</user_config>
```
2. Create a **testcase** directory in the root directory of xDevice to store test suite files. XTS test suites are obtained from the daily builds of the system.
Daily builds: http://ci.openharmony.cn/dailys/dailybuilds
The following is an example of the test suite configuration file in JSON format:
```json
{
"description": "Config for ActsAllTest test cases",
"environment": [
{
"type": "device",
"label": "wifiiot"
}
],
"kits": [
{
"type": "DeployKit",
"timeout": "20000",
"burn_file": "acts/Hi3861_wifiiot_app_allinone.bin"
}
],
"driver": {
"type": "CTestLite"
}
}
```
3. Execute test cases.
Go to the root directory of xDevice, open the xDevice console, and run the following command:
```bash
python -m xdevice
```
Run the test suite command.
```text
run -l ActsAllTest
```
The command output is as follows.
![result-1](figures/result-1.PNG)
## Small-System Device XTS Test Guide (ipcamera)
1. Identify the serial port usage.
The COM port whose **type** is **cmd** corresponds to the AT command serial port on the board. The port is used to send commands to the device. In the example, the **ChA(COM20)** serial port is used.
<img src="figures/L0-1.PNG" alt="L0-1" style="zoom:67%;" />
IP camera devices have two connection modes. One is to connect through the local serial port, and the other is to connect through the IP address of the local area network.
2. Configure the NFS server.
There are two NFS mounting modes. One is to mount through the remote PC, and the other is to mount through the local area network.
To configure the NFS service on the local area network, perform the following steps:
1. Download and install the NFS server. Download address: https://www.hanewin.net/nfs-e.htm
2. Configure output and edit the output table file.
Add an NFS sharing path, for example, **D:\HS\NFS_Share_File -public -alldirs**. Note that 192.168.1.10 is the IP address of the development board.
<img src="figures/NFS-2.PNG" style="zoom:75%;" />
3. Stop the NFS server and restart the NFS server to make the added sharing path take effect.
4. Find the mapped network port of the IP camera device on the PC, and manually set the IP address to 192.168.1.11 on the PC.
3. Modify the **user_config.xml** file in the root directory. The following is an example:
```xml
<user_config>
<environment>
<device type="com" label="ipcamera"> <!--local connection mode-->
<serial>
<com>com20</com>
<type>cmd</type>
<baud_rate>115200</baud_rate>
<data_bits>8</data_bits>
<stop_bits>1</stop_bits>
<timeout>1</timeout>
</serial>
</device>
<device type="com" label="ipcamera"> <!--local area network connection mode-->
<ip>10.176.49.47</ip>
<port>10003</port>
</device>
</environment>
<testcases>
<dir></dir>
<server label="NfsServer"> <!--remote mounting mode-->
<ip>10.176.48.202</ip>
<port>1022</port>
<dir>/data/data/local/</dir>
<username>root</username>
<password>xxx</password>
<remote>true</remote>
</server>
<server label="NfsServer"> <!--local area network mounting mode-->
<ip>192.168.1.11</ip>
<port>2049</port>
<dir>D:\test</dir>
<remote>false</remote>
</server>
</testcases>
<resource>
<dir></dir>
</resource>
<loglevel>DEBUG</loglevel>
</user_config>
```
4. Create a **testcase** directory in the root directory of xDevice to store test suite files. XTS test suites are obtained from the daily builds of the system.
Daily builds: http://ci.openharmony.cn/dailys/dailybuilds
The following is an example of the test suite configuration file in JSON format:
```json
{
"description": "Config for kernel test cases",
"environment": [
{
"type": "device",
"label": "ipcamera"
}
],
"kits": [
{
"type": "MountKit",
"server": "NfsServer",
"mount": [
{
"source": "testcases/kernel",
"target": "/test_root/kernel"
}
]
}
],
"driver": {
"type": "CppTestLite",
"excute": "/test_root/kernel/ActsKernelIPCTest.bin"
}
}
```
5. Execute test cases.
Go to the root directory of xDevice, open the xDevice console, and run the following command:
```bash
python -m xdevice
```
Run the test suite command.
```text
run -l kernel
```
The command output is as follows.
![result-1](figures/result-1.PNG)
## Standard-System Device XTS Test Guide (RK3568)
1. Configure the HDC tool, and download the latest **ohos_sdk** from daily builds.
Daily builds: http://ci.openharmony.cn/dailys/dailybuilds
After downloading the tool, configure HDC in the environment variables on the PC. To do so, right-click the **Computer** or **My Computer** desktop icon and select **Properties**. Choose **Advanced system settings**, select the **Advanced** tab, and click **Environment Variables**. In the **Environment Variables** dialog box, add the HDC tool path to the **Path** variable.
2. Run the following command to check whether the device is properly connected:
```bash
hdc_std list targets
```
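If the device is properly connected, the command prints the SN of each connected device, one per line; for example (the SN is illustrative):

```text
88Y02******57723
```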
3. Modify the **user_config.xml** file. The following is an example:
```xml
<user_config>
<environment>
<device type="usb-hdc">
<ip></ip>
<port></port>
<sn>xxx;xxx</sn> <!--SNs of multiple connected devices are separated using semicolons.-->
</device>
</environment>
<testcases>
<dir></dir>
</testcases>
<resource>
<dir></dir>
</resource>
<loglevel>DEBUG</loglevel>
</user_config>
```
4. Create a **testcase** directory in the root directory of xDevice to store test suite files. XTS test suites are obtained from the daily builds of the system.
Daily builds: http://ci.openharmony.cn/dailys/dailybuilds
The following is an example of the test suite configuration file in JSON format:
```json
{
"description": "Configuration for hjunit demo Tests",
"driver": {
"type": "OHJSUnitTest",
"test-timeout": "180000",
"bundle-name": "ohos.acts.bundle.stage.test",
"module-name": "phone",
"shell-timeout": "600000",
"testcase-timeout": 70000
},
"kits": [
{
"test-file-name": [
"ActBmsStageEtsTest.hap"
],
"type": "AppInstallKit",
"cleanup-apps": true
},
{
"type": "ShellKit",
"teardown-command":[
"bm uninstall -n ohos.acts.bundle.stage.test"
]
}
]
}
```
5. Execute test cases.
Go to the root directory of xDevice, open the xDevice console, and run the following command:
```bash
python -m xdevice
```
Run the test suite command.
```text
run -l ActBmsStageEtsTest
```
The command output is as follows.
![result-1](figures/result-1.PNG)
## FAQs
### The **hdc list targets** command can find a device, but xDevice cannot identify the device.
**Issue Description**
The following error information is displayed.
![FAQ-1](figures/FAQ-1.PNG)
**Possible Causes**
The **HDC_SERVER_PORT** variable has been set and the HDC port has been modified. By default, xDevice uses port **8710**. If the port has been modified, the xDevice framework cannot identify the device.
**Solution**
Check whether the **HDC_SERVER_PORT** variable is set. If it is, change the port number to **8710** and restart xDevice.
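A quick way to confirm whether the variable is interfering is sketched below, using only the standard library; the 8710 default comes from the description above:

```python
import os

def hdc_port_ok(env=os.environ):
    """True if HDC uses the default port 8710 that xDevice expects."""
    port = env.get("HDC_SERVER_PORT")
    return port is None or port == "8710"

print(hdc_port_ok({"HDC_SERVER_PORT": "9999"}))  # False: xDevice cannot identify devices
print(hdc_port_ok({}))                           # True: default port in use
```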
# XTS Test Case Development Guide
## Introduction
The X test suite (XTS) subsystem contains a set of OpenHarmony compatibility test suites, including the currently supported application compatibility test suite (ACTS) and the device compatibility test suite (DCTS) that will be supported in the future.
This subsystem contains the ACTS and **tools** software package.
- The **acts** directory stores the source code and configuration files of ACTS test cases. The ACTS helps device vendors detect the software incompatibility as early as possible and ensures that the software is compatible with OpenHarmony during the entire development process.
- The **tools** software package stores the test case development framework related to **acts**.
## System Types
The following system types are supported:
- Mini system
The mini system fits into the devices that come with MCU processors, such as Arm Cortex-M and 32-bit RISC-V, and memory greater than or equal to 128 KiB. This system provides a variety of lightweight network protocols, a lightweight graphics framework, and a wide range of read/write components with the Internet of Things (IoT) bus. Typical products include connection modules, sensors, and wearables for smart home.
- Small system
The small system fits into the devices that come with application processors, such as Arm Cortex-A, and memory greater than or equal to 1 MiB. This system provides higher security capabilities, a standard graphics framework, and video encoding and decoding capabilities. Typical products include smart home IP cameras, electronic cat eyes, routers, and event data recorders (EDRs) for easy travel.
- Standard system
The standard system fits into the devices that come with application processors, such as Arm Cortex-A, and memory greater than or equal to 128 MiB. This system provides a complete application framework supporting enhanced interaction, 3D GPU, hardware composer, diverse components, and rich animations. The standard system applies to high-end refrigerator displays.
## Contents
```
/test/xts
├── acts # Test code
│ └── subsystem # Source code of subsystem test cases for the standard system
│ └── subsystem_lite # Source code of subsystem test cases for mini and small systems
│ └── BUILD.gn # Build configuration of test cases for the standard system
│ └── build_lite
│ └── BUILD.gn # Build configuration of test cases for mini and small systems
└── tools # Test tool code
```
## Constraints
Test cases for the mini system must be developed based on C, and those for the small system must be developed based on C++.
## How to Use
**Table 1** Case levels
| Level| Basic Definition| Test Scope|
| -------- | -------- | -------- |
| Level0 | Smoke| Verifies basic functionalities of key features and basic DFX attributes with the most common input. The pass result indicates that the features are runnable.|
| Level1 | Basic| Verifies basic functionalities of key features and basic DFX attributes with common input. The pass result indicates that the features are testable.|
| Level2 | Major| Verifies basic functionalities of key features and basic DFX attributes with common input and errors. The pass result indicates that the features are functional and ready for beta testing.|
| Level3 | Minor| Verifies functionalities of all key features, and all DFX attributes with common and uncommon input combinations or normal and abnormal preset conditions.|
| Level4 | Rare| Verifies functionalities of key features under extremely abnormal presets and uncommon input combinations.|
**Table 2** Case scales
| Case Scale| Test Object| Test Environment|
| -------- | -------- | -------- |
| LargeTest | Service functionalities, all-scenario features, and mechanical power environment (MPE) and scenario-level DFX| Devices close to real devices|
| MediumTest | Modules, subsystem functionalities after module integration, and DFX| Single device that is actually used. You can perform message simulation, but do not mock functions.|
| SmallTest | Modules, classes, and functions| Local PC. Use a large number of mocks to replace dependencies with other modules.|
**Table 3** Test types
| Test Type| Definition|
| -------- | -------- |
| Function | Tests the correctness of both service and platform functionalities provided by the tested object for end users or developers.|
| Performance | Tests the processing capability of the tested object under specific preset conditions and load models. The processing capability is measured by the service volume that can be processed in a unit time, for example, call per second, frame per second, or event processing volume per second.|
| Power | Tests the power consumption of the tested object in a certain period of time under specific preset conditions and load models.|
| Reliability | Tests the service performance of the tested object under common and uncommon input conditions, or specified service volume pressure and long-term continuous running pressure. The test covers stability, pressure handling, fault injection, and Monkey test items.|
| Security | Tests the capability of defending against security threats, including but not limited to unauthorized access, use, disclosure, damage, modification, and destruction, to ensure information confidentiality, integrity, and availability. Tests the privacy protection capability to ensure that the collection, use, retention, disclosure, and disposal of users' private data comply with laws and regulations. Tests the compliance with various security specifications, such as security design, security requirements, and security certification of the Ministry of Industry and Information Technology (MIIT).|
| Global | Tests the internationalized data and localization capabilities of the tested object, including multi-language display, various input/output habits, time formats, and regional features, such as currency, time, and culture taboos.|
| Compatibility | Tests backward compatibility of an application with its own data, the forward and backward compatibility with the system, and the compatibility with different user data, such as audio file content of the player and smart SMS messages. Tests system backward compatibility with its own data and the compatibility of common applications in the ecosystem. Tests software compatibility with related hardware.|
| User | Tests user experience of the object in real user scenarios. All conclusions and comments should come from the users, which are all subjective evaluation in this case.|
| Standard | Tests the compliance with industry and company-specific standards, protocols, and specifications. The standards here do not include any security standards that should be classified into the security test.|
| Safety | Tests the safety property of the tested object to avoid possible hazards to personal safety, health, and the object itself.|
| Resilience | Tests the resilience property of the tested object to ensure that it can withstand and maintain the defined running status (including downgrading) when being attacked, and recover from and adapt defense to the attacks to approach mission assurance.|
## Test Case Development Guide
Select the programming language and test framework appropriate for your target system when developing test cases.
**Table 4** Test frameworks and test case languages for different systems
| System| Test Framework| Language|
| -------- | -------- | -------- |
| Mini system| hctest | C |
| Small system| hcpptest | C++ |
| Standard system| HJSUnit and HCPPTest| JavaScript and C++|
### C-based Test Case Development and Compilation (for the Mini System)
**Developing test cases for the mini system**
The HCTest framework is used to support test cases developed with the C language. HCTest is enhanced and adapted based on the open-source test framework Unity.
1. Access the **test/xts/acts** repository where the test cases will be stored.
```
├── acts
│ └──subsystem_lite
│ │ └── module_hal
│ │ │ └── BUILD.gn
│ │ │ └── src
│ └──build_lite
│ │ └── BUILD.gn
```
2. Write the test case in the **src** directory.
1. Import the test framework header file.
```
#include "hctest.h"
```
2. Use the **LITE_TEST_SUIT** macro to define names of the subsystem, module, and test suite.
```
/**
* @brief register a test suite named "IntTestSuite"
* @param test subsystem name
* @param example module name
* @param IntTestSuite test suite name
*/
LITE_TEST_SUIT(test, example, IntTestSuite);
```
3. Define SetUp and TearDown.
Format: *test suite name*+SetUp and *test suite name*+TearDown (for this example, **IntTestSuiteSetUp** and **IntTestSuiteTearDown**).
The SetUp and TearDown functions must exist, but function bodies can be empty.
4. Use the **LITE_TEST_CASE** macro to write the test case.
Three parameters are involved: test suite name, test case name, and test case properties (including type, granularity, and level).
```
LITE_TEST_CASE(IntTestSuite, TestCase001, Function | MediumTest | Level1)
{
//do something
};
```
5. Use the **RUN_TEST_SUITE** macro to register the test suite.
```
RUN_TEST_SUITE(IntTestSuite);
```
3. Create a configuration file (**BUILD.gn**) of the test module.
Create a **BUILD.gn** build file in each test module directory. In it, specify the name of the built static library and the header files and libraries it depends on. The format is as follows:
```
import("//test/xts/tools/lite/build/suite_lite.gni")
hctest_suite("ActsDemoTest") {
suite_name = "acts"
sources = [
"src/test_demo.c",
]
include_dirs = [ ]
cflags = [ "-Wno-error" ]
}
```
4. Add build options to the **BUILD.gn** file in the **acts** directory.
You need to add the test module to the **test/xts/acts/build_lite/BUILD.gn** script in the **acts** directory.
```
lite_component("acts") {
...
if(board_name == "liteos_m") {
features += [
...
"//xts/acts/subsystem_lite/module_hal:ActsDemoTest"
]
}
}
```
5. Run build commands.
Test suites are built along with the version build. The ACTS is built together with the debug version.
> **NOTE**
> The ACTS build middleware is a static library, which will be linked to the image.
### C-based Test Case Execution (for the Mini System)
**Executing test cases for the mini system**
Burn the image into the development board.
**Executing the test**
1. Use a serial port tool to log in to the development board and save information about the serial port.
2. Restart the device and view serial port logs.
**Analyzing the test result**
View the serial port logs, whose format is as follows:
The log for each test suite starts with **Start to run test suite:** and ends with **xx Tests xx Failures xx Ignored**.
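The tallies in that closing line can be extracted mechanically when post-processing serial logs; the sketch below assumes only the line format stated above:

```python
import re

def parse_suite_summary(line):
    """Extract (tests, failures, ignored) from a suite's closing summary line."""
    match = re.search(r"(\d+) Tests (\d+) Failures (\d+) Ignored", line)
    return tuple(int(n) for n in match.groups()) if match else None

print(parse_suite_summary("12 Tests 1 Failures 2 Ignored"))
print(parse_suite_summary("no summary on this line"))
```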
### C++-based Test Case Development and Compilation (for Standard and Small Systems)
**Developing test cases for small-system devices** (For examples of the standard system, go to the **global/i18n_standard** directory.)
The HCPPTest framework is enhanced and adapted based on the open-source framework Googletest.
1. Access the **test/xts/acts** repository where the test cases will be stored.
```
├── acts
│ └──subsystem_lite
│ │ └── module_posix
│ │ │ └── BUILD.gn
│ │ │ └── src
│ └──build_lite
│ │ └── BUILD.gn
```
2. Write the test case in the **src** directory.
1. Import the test framework header file.
The following statement includes **gtest.h**:
```
#include "gtest/gtest.h"
```
2. Define SetUp and TearDown.
```
using namespace std;
using namespace testing::ext;
class TestSuite: public testing::Test {
protected:
// Preset action of the test suite, which is executed before the first test case
static void SetUpTestCase(void){
}
// Test suite cleanup action, which is executed after the last test case
static void TearDownTestCase(void){
}
// Preset action of the test case
virtual void SetUp()
{
}
// Cleanup action of the test case
virtual void TearDown()
{
}
};
```
3. Use the **HWTEST** or **HWTEST_F** macro to write the test case.
**HWTEST**: defines a common test case that does not use the SetUp and TearDown fixtures.
**HWTEST_F**: defines a test case that runs with the SetUp and TearDown fixtures.
Both macros take three parameters: the test suite name, the test case name, and the test case properties (type, granularity, and level).
```
HWTEST_F(TestSuite, TestCase_0001, Function | MediumTest | Level1) {
// do something
}
```
3. Create a configuration file (**BUILD.gn**) of the test module.
Create a **BUILD.gn** build file in each test module directory. In the build file, specify the name of the build target and the header files and libraries it depends on. Each test module is independently built into a **.bin** executable file, which can be directly pushed to the development board for testing.
Example:
```
import("//test/xts/tools/lite/build/suite_lite.gni")
hcpptest_suite("ActsDemoTest") {
suite_name = "acts"
sources = [
"src/TestDemo.cpp"
]
include_dirs = [
"src",
...
]
deps = [
...
]
cflags = [ "-Wno-error" ]
}
```
4. Add build options to the **BUILD.gn** file in the **acts** directory.
Add the test module to the **test/xts/acts/build_lite/BUILD.gn** script in the **acts** directory.
```
lite_component("acts") {
...
else if (board_name == "liteos_a") {
features += [
...
"//xts/acts/subsystem_lite/module_posix:ActsDemoTest"
]
}
}
```
5. Run build commands.
Test suites are built as part of the overall version build. The ACTS is built together with the debug version.
> **NOTE**
>
> The ACTS for the small system is independently built to an executable file (.bin) and archived in the **suites\acts** directory of the build result.
### C++-based Test Case Execution (for Standard and Small Systems)
**Executing test cases for the small system**
Currently, test cases are shared over NFS and mounted to the development board for execution.
**Setting up the environment**
1. Use a network cable or wireless network to connect the development board to your PC.
2. Configure the IP address, subnet mask, and gateway for the development board. Ensure that the development board and the PC are on the same network segment.
3. Install and register the NFS server on the PC and start the NFS service.
4. Run the **mount** command for the development board to ensure that the development board can access NFS shared files on the PC.
Format: `mount <NFS server IP address>:/<NFS shared directory> /<development board directory> nfs`
Example:
```
mount 192.168.1.10:/nfs /nfs nfs
```
**Executing test cases**
Execute **ActsDemoTest.bin** to trigger test case execution, and analyze serial port logs generated after the execution is complete.
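For example, assuming the NFS share is mounted at **/nfs** as shown above, the test suite can be started from the device shell as follows (the binary name matches the sample module built earlier):
```
cd /nfs
./ActsDemoTest.bin
```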
### JavaScript-based Test Case Development (for the Standard System)
The HJSUnit framework is used to support automated tests of OpenHarmony applications that are developed using the JavaScript language based on the JS application framework.
**Basic syntax of test cases**
The test cases are developed with the JavaScript language and must meet the programming specifications of the language.
**Table 7** Basic syntax of test cases
| Syntax| Description| Requirement|
| -------- | -------- | -------- |
| beforeAll | Presets a test-suite-level action executed only once before all test cases are executed. You can pass the action function as the only parameter.| Optional|
| afterAll | Presets a test-suite-level clear action executed only once after all test cases are executed. You can pass the clear function as the only parameter.| Optional|
| beforeEach | Presets a test-case-level action executed before each test case is executed. The number of execution times is the same as the number of test cases defined by **it**. You can pass the action function as the only parameter.| Optional|
| afterEach | Presets a test-case-level clear action executed after each test case is executed. The number of execution times is the same as the number of test cases defined by **it**. You can pass the clear function as the only parameter.| Optional|
| describe | Defines a test suite. You can pass two parameters: test suite name and test suite function. **describe** supports embedding. **beforeAll**, **beforeEach**, **afterEach**, and **afterAll** can be defined in each **describe**.| Mandatory|
| it | Defines a test case. You can pass three parameters: test case name, filter parameter, and test case function.<br>**NOTE**<br>**Filter parameter**: The filter parameter is a 32-bit parameter of the **Int** type. **1** of bit 0 indicates not to filter. **1** of bits 0-10 indicates the test case type. **1** of bits 16-18 indicates the test case scale. **1** of bits 24-28 indicates the test level.<br>**Test case type**: Bits 0-10 indicate the following respectively: FUNCTION test, PERFORMANCE test, POWER test, RELIABILITY test, SECURITY test, GLOBAL test, COMPATIBILITY test, USER test, STANDARD test, SAFETY test, and RESILIENCE test.<br>**Test case scale**: Bits 16-18 indicate the following respectively: SMALL test, MEDIUM test, and LARGE test.<br>**Test level**: Bits 24-28 indicate the following respectively: LEVEL0-0 test, LEVEL1-1 test, LEVEL2-2 test, LEVEL3-3 test, and LEVEL4-4 test.| Mandatory|
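To illustrate how the bit fields of the filter parameter combine, the following sketch computes the value for a function test of medium scale at level 1. The constant names and values here are assumptions derived from the bit layout described in the table, not the framework's published constants:

```javascript
// Hypothetical constants derived from the bit layout in the table above.
const FUNCTION = 1 << 0;     // test case type: bits 0-10, FUNCTION is bit 0
const MEDIUM_TEST = 1 << 17; // test case scale: bits 16-18, MEDIUM is bit 17
const LEVEL1 = 1 << 25;      // test level: bits 24-28, LEVEL1 is bit 25

// Compose the filter parameter passed as the second argument of it().
const filter = FUNCTION | MEDIUM_TEST | LEVEL1;
console.log(filter); // 33685505
```

A value of **0**, as in the sample case later in this section, applies no filtering.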
Use the standard syntax of Jasmine to write test cases. The ES6 specification is supported.
1. Store the test cases in the **entry/src/main/js/test** directory, whose structure is as follows:
```
├── BUILD.gn
│ └──entry
│ │ └──src
│ │ │ └──main
│ │ │ │ └──js
│ │ │ │ │ └──default
│ │ │ │ │ │ └──pages
│ │ │ │ │ │ │ └──index
│ │ │ │ │ │ │ │ └──index.js # Entry file
│ │ │ │ │ └──test # Test code
│ │ │ └── resources # HAP resources
│ │ │ └── config.json # HAP configuration file
```
2. Start the JS test framework and load test cases. The following is an example for **index.js**:
```
// Start the JS test framework and load test cases.
import {Core, ExpectExtend} from 'deccjsunit/index'
export default {
data: {
title: ""
},
onInit() {
this.title = this.$t('strings.world');
},
onShow() {
console.info('onShow finish')
const core = Core.getInstance()
const expectExtend = new ExpectExtend({
'id': 'extend'
})
core.addService('expect', expectExtend)
core.init()
const configService = core.getDefaultService('config')
configService.setConfig(this)
require('../../../test/List.test')
core.execute()
},
onReady() {
},
}
```
3. Write a unit test case by referring to the following example:
```
// Use HJSUnit to perform the unit test.
describe('appInfoTest', function () {
it('app_info_test_001', 0, function () {
var info = app.getInfo()
expect(info.versionName).assertEqual('1.0')
expect(info.versionCode).assertEqual('3')
})
})
```
### JavaScript-based Test Case Packaging (for the Standard System)
For details about HAP package compilation, see [JS application development guide for the standard system](https://developer.harmonyos.com/en/docs/documentation/doc-guides/build_overview-0000001055075201).
## Full Compilation (for the Standard System)
1. Perform full compilation.
Command:
```
./build.sh suite=acts system_size=standard
```
Test case output directory: **out/release/suites/acts/testcases**
Test framework and case output directory: **out/release/suites/acts** (The test suite execution framework is compiled during case compilation.)
## Full Test Case Execution (for Small and Standard Systems)
**Setting up a test environment**
Install Python 3.7 or later in a Windows environment and ensure that the environment is properly connected to the test device.
**Test execution directory** (corresponding to the **out/release/suites/acts** directory generated during compilation)
```
├── testcases # Directory for storing test suite files
│ └──xxx.hap # HAP file executed by the test suite
│ └──xxx.json # Execution configuration file of the test suite
├── tools # Test framework tool directory
├── run.bat # File for starting the test suite on the Windows platform
├── report # Directory for storing the test reports
```
**Executing test cases**
1. On the Windows environment, locate the directory in which the test cases are stored (**out/release/suites/acts**, copied from the Linux server), go to the directory in the Windows command line interface (CLI), and run **acts\run.bat**.
2. Enter the command for executing the test case.
- Execute all test cases.
```
run acts
```
**Figure 1** Running process
![en-us_image_0000001200230833](figures/en-us_image_0000001200230833.gif)
- Execute the test cases of a module (view specific module information in **\acts\testcases\**).
```
run -l ActsSamgrTest
```
**Figure 2** Viewing the running command
![en-us_image_0000001154351160](figures/en-us_image_0000001154351160.jpg)
Wait until the execution is complete.
3. View test reports.
Go to **acts\reports\**, obtain the current execution record, and open **summary_report.html** to view the test report.