# Test
OpenHarmony provides a comprehensive automated test framework for designing test cases. Detecting defects early in the development process improves code quality.
This document describes how to use the OpenHarmony test framework.
## Setting Up the Environment
- The test framework depends on Python. Before using the test framework, you need to set up the environment.
- For details about how to obtain the source code, see [Obtaining Source Code](../get-code/sourcecode-acquire.md).
### Environment Configuration
#### Basic Test Framework Environment
|Environment|Version|Description|
|------------|------------|------------|
|Operating system|Ubuntu 18.04 or later|Provides the build environment.|
|Linux extended component|libreadline-dev|Allows users to edit command lines.|
|Python|3.7.5 or later|Provides the programming language for the test framework.|
|Python plug-ins|pyserial 3.3 or later<br>paramiko 2.7.1 or later<br>setuptools 40.8.0 or later<br>RSA 4.0 or later|- pyserial: supports serial port communication in Python.<br>- paramiko: allows SSH in Python.<br>- setuptools: allows creation and distribution of Python packages.<br>- RSA: implements RSA encryption in Python.|
|NFS server|haneWIN NFS Server 1.2.50 or later or NFS v4 or later|Provides file sharing for devices that are connected only over a serial port.|
|HDC|1.1.0 or later|Allows devices to be connected by using the OpenHarmony Device Connector (HDC).|
#### Installation Process
1. Run the following command to install the Linux extended component libreadline:
```
sudo apt-get install libreadline-dev
```
The installation is successful if the following information is displayed:
```
Reading package lists... Done
Building dependency tree
Reading state information... Done
libreadline-dev is already the newest version (7.0-3).
0 upgraded, 0 newly installed, 0 to remove and 11 not upgraded.
```
2. Run the following command to install the setuptools plug-in:
```
pip3 install setuptools
```
The installation is successful if the following information is displayed:
```
Requirement already satisfied: setuptools in d:\programs\python37\lib\site-packages (41.2.0)
```
3. Run the following command to install the paramiko plug-in:
```
pip3 install paramiko
```
The installation is successful if the following information is displayed:
```
Installing collected packages: pycparser, cffi, pynacl, bcrypt, cryptography, paramiko
Successfully installed bcrypt-3.2.0 cffi-1.14.4 cryptography-3.3.1 paramiko-2.7.2 pycparser-2.20 pynacl-1.4.0
```
4. Run the following command to install the rsa plug-in:
```
pip3 install rsa
```
The installation is successful if the following information is displayed:
```
Installing collected packages: pyasn1, rsa
Successfully installed pyasn1-0.4.8 rsa-4.7
```
5. Run the following command to install the pyserial plug-in:
```
pip3 install pyserial
```
The installation is successful if the following information is displayed:
```
Requirement already satisfied: pyserial in d:\programs\python37\lib\site-packages\pyserial-3.4-py3.7.egg (3.4)
```
6. Install the NFS server if the device outputs results only through the serial port.
- For Windows, install, for example, haneWIN NFS Server 1.2.50.
- For Linux, run the following command to install the NFS server:
```
sudo apt install nfs-kernel-server
```
The installation is successful if the following information is displayed:
```
Reading package lists... Done
Building dependency tree
Reading state information... Done
nfs-kernel-server is already the newest version (1:1.3.4-2.1ubuntu5.3).
0 upgraded, 0 newly installed, 0 to remove and 11 not upgraded.
```
7. Install the HDC tool if the device supports HDC connections.
For details, see https://gitee.com/openharmony/developtools_hdc_standard/blob/master/README.md
## Checking the Installation Environment
| Check Item|Operation |Requirements |
| --- | --- | --- |
| Check whether Python is installed successfully.|Run the **python --version** command. |The Python version is 3.7.5 or later.|
| Check whether Python plug-ins are successfully installed.|Go to the **test/developertest** directory and run **start.bat** or **start.sh**.| The **>>>** prompt is displayed.|
|Check the NFS server status (for the devices that support only serial port output). |Log in to the development board through the serial port and run the **mount** command to mount the NFS. |The file directory can be mounted. |
|Check whether the HDC is successfully installed. |Run the **hdc_std -v** command.|The HDC version is 1.1.0 or later.|
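For example, a quick pass over the version checks from a shell might look as follows (output varies by environment; the versions shown are the minimum requirements from the table above):
```
python --version    # Expect Python 3.7.5 or later.
hdc_std -v          # Expect HDC 1.1.0 or later.
```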
## Directory Structure
The directory structure of the test framework is as follows:
```
test                        # Test subsystem
├── developertest           # Developer test module
│   ├── aw                  # Static library of the test framework
│   ├── config              # Test framework configuration
│   │   │ ...
│   │   └── user_config.xml # User configuration
│   ├── examples            # Examples of test cases
│   ├── src                 # Source code of the test framework
│   ├── third_party         # Adaptation code for third-party components on which the test framework depends
│   ├── reports             # Test reports
│   ├── BUILD.gn            # Build entry of the test framework
│   ├── start.bat           # Test entry for Windows
│   └── start.sh            # Test entry for Linux
└── xdevice                 # Modules on which the test framework depends
```
## Writing Test Cases
### Designing the Test Case Directory
Design the test case directory as follows:
```
subsystem                                # Subsystem
├── partA                                # Part A
│   ├── moduleA                          # Module A
│   │   ├── include
│   │   ├── src                          # Service code
│   │   └── test                         # Test directory
│   │       ├── unittest                 # Unit test
│   │       │   ├── common               # Common test cases
│   │       │   │   ├── BUILD.gn         # Build file of test cases
│   │       │   │   └── testA_test.cpp   # Source code of unit test cases
│   │       │   ├── phone                # Test cases for mobile phones
│   │       │   ├── ivi                  # Test cases for head units
│   │       │   └── liteos-a             # Test cases for IP cameras using LiteOS
│   │       ├── moduletest               # Module test
│   │       ...
│   │
│   ├── moduleB                          # Module B
│   ├── test
│   │   └── resource                     # Dependency resources
│   │       ├── moduleA                  # Module A
│   │       │   ├── ohos_test.xml        # Resource configuration file
│   │       ... └── 1.txt                # Resource file
│   │
│   ├── ohos_build                       # Build entry configuration
│   ...
...
```
> **CAUTION**<br>Test cases are classified into common test cases and device-specific test cases. You are advised to place common test cases in the **common** directory and device-specific test cases in the directories of the related devices.
### Writing Test Cases
This test framework supports test cases written in multiple programming languages and provides different templates for different languages.
**C++ Test Case Example**
- Naming rules for source files
The source file name of a test case must be the same as the test suite name. File names must use lowercase letters and be in the [Function]\_[Sub-function]\_**test** format. More specific sub-functions can be added as required.
Example:
```
calculator_sub_test.cpp
```
- Test case example
```
/*
 * Copyright (c) 2022 XXXX Device Co., Ltd.
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
#include "calculator.h"
#include <gtest/gtest.h>

using namespace testing::ext;

class CalculatorSubTest : public testing::Test {
public:
    static void SetUpTestCase(void);
    static void TearDownTestCase(void);
    void SetUp();
    void TearDown();
};

void CalculatorSubTest::SetUpTestCase(void)
{
    // Set a setup function, which will be called before all test cases.
}

void CalculatorSubTest::TearDownTestCase(void)
{
    // Set a teardown function, which will be called after all test cases.
}

void CalculatorSubTest::SetUp(void)
{
    // Set a setup function, which will be called before each test case.
}

void CalculatorSubTest::TearDown(void)
{
    // Set a teardown function, which will be called after each test case.
}

/**
 * @tc.name: integer_sub_001
 * @tc.desc: Verify the sub function.
 * @tc.type: FUNC
 * @tc.require: Issue Number
 */
HWTEST_F(CalculatorSubTest, integer_sub_001, TestSize.Level1)
{
    // Step 1 Call the function to obtain the result.
    int actual = Sub(4, 0);
    // Step 2 Use an assertion to compare the obtained result with the expected result.
    EXPECT_EQ(4, actual);
}
```
The procedure is as follows:
1. Add comment information to the test case file header.
```
/*
 * Copyright (c) 2022 XXXX Device Co., Ltd.
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
```
2. Add the test framework header file and namespace.
```
#include <gtest/gtest.h>
using namespace testing::ext;
```
3. Add the header file of the test class.
```
#include "calculator.h"
```
4. Define the test suite (test class).
```
class CalculatorSubTest : public testing::Test {
public:
    static void SetUpTestCase(void);
    static void TearDownTestCase(void);
    void SetUp();
    void TearDown();
};

void CalculatorSubTest::SetUpTestCase(void)
{
    // Set a setup function, which will be called before all test cases.
}

void CalculatorSubTest::TearDownTestCase(void)
{
    // Set a teardown function, which will be called after all test cases.
}

void CalculatorSubTest::SetUp(void)
{
    // Set a setup function, which will be called before each test case.
}

void CalculatorSubTest::TearDown(void)
{
    // Set a teardown function, which will be called after each test case.
}
```
> **CAUTION**:<br>When defining a test suite, ensure that the test suite name is the same as the build target name and uses the upper camel case style.
5. Add implementation of the test cases, including test case comments and logic.
```
/**
 * @tc.name: integer_sub_001
 * @tc.desc: Verify the sub function.
 * @tc.type: FUNC
 * @tc.require: Issue Number
 */
HWTEST_F(CalculatorSubTest, integer_sub_001, TestSize.Level1)
{
    // Step 1 Call the function to obtain the test result.
    int actual = Sub(4, 0);
    // Step 2 Use an assertion to compare the obtained result with the expected result.
    EXPECT_EQ(4, actual);
}
```
The following test case templates are provided for your reference.
| Type| Description|
| ------------| ------------|
| HWTEST(A,B,C)| Use this template if the test case execution does not depend on setup or teardown.|
| HWTEST_F(A,B,C)| Use this template if the test case execution (excluding parameters) depends on setup and teardown.|
| HWTEST_P(A,B,C)| Use this template if the test case execution (including parameters) depends on setup and teardown.|
In the template names:
- *A* indicates the test suite name.
- *B* indicates the test case name, which is in the *Function*\_*No.* format. The *No.* is a three-digit number starting from **001**.
- *C* indicates the test case level. There are five test case levels: guard-control level 0 and non-guard-control level 1 to level 4. Of levels 1 to 4, a smaller value indicates a more important function verified by the test case.
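For reference, the following is a minimal sketch of a parameterized case. It assumes that **HWTEST_P** follows the gtest **TEST_P** pattern (a fixture derived from **testing::TestWithParam** plus an instantiation macro, whose exact name depends on the gtest version); **Add** is a hypothetical function used only for illustration:
```
// Hypothetical parameterized suite; the fixture supplies an int parameter.
class CalculatorAddTest : public testing::TestWithParam<int> {};

HWTEST_P(CalculatorAddTest, integer_add_001, TestSize.Level1)
{
    int value = GetParam();           // Obtain the current parameter value.
    EXPECT_EQ(value, Add(value, 0));  // Adding zero must not change the value.
}

// Instantiate the suite with several parameter values (macro name varies by gtest version).
INSTANTIATE_TEST_SUITE_P(AddZeroValues, CalculatorAddTest, testing::Values(0, 1, 100));
```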
**CAUTION**<br>
- The expected result of each test case must have an assertion.
- The test case level must be specified.
- It is recommended that the test be implemented step by step according to the template.
- The comment must contain the test case name, description, type, and requirement number, which are in the @tc.*xxx*: *value* format. The test case type (**@tc.type**) can be any of the following:

| Test Case Type|Code|
| ------------|------------|
|Function test|FUNC|
|Performance test|PERF|
|Reliability test|RELI|
|Security test|SECU|
|Fuzzing test|FUZZ|
**JavaScript Test Case Example**
- Naming rules for source files
The source file name of a test case must be in the [Function]\[Sub-function]Test format, and each part must use the upper camel case style. More specific sub-functions can be added as required.
Example:
```
AppInfoTest.js
```
- Test case example
```
/*
 * Copyright (C) 2022 XXXX Device Co., Ltd.
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
import app from '@system.app'
import {describe, beforeAll, beforeEach, afterEach, afterAll, it, expect} from 'deccjsunit/index'

describe("AppInfoTest", function () {
    beforeAll(function() {
        // Set a setup function, which will be called before all test cases.
        console.info('beforeAll called')
    })

    afterAll(function() {
        // Set a teardown function, which will be called after all test cases.
        console.info('afterAll called')
    })

    beforeEach(function() {
        // Set a setup function, which will be called before each test case.
        console.info('beforeEach called')
    })

    afterEach(function() {
        // Set a teardown function, which will be called after each test case.
        console.info('afterEach called')
    })

    /*
     * @tc.name:appInfoTest001
     * @tc.desc:verify app info is not null
     * @tc.type: FUNC
     * @tc.require: Issue Number
     */
    it("appInfoTest001", 0, function () {
        // Step 1 Call the function to obtain the test result.
        var info = app.getInfo()
        // Step 2 Use an assertion to compare the obtained result with the expected result.
        expect(info != null).assertEqual(true)
    })
})
```
The procedure is as follows:
1. Add comment information to the test case file header.
```
/*
 * Copyright (C) 2022 XXXX Device Co., Ltd.
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
```
2. Import the APIs to be tested and the JSUnit test library.
```
import app from '@system.app'
import {describe, beforeAll, beforeEach, afterEach, afterAll, it, expect} from 'deccjsunit/index'
```
3. Define the test suite (test class).
```
describe("AppInfoTest", function () {
    beforeAll(function() {
        // Set a setup function, which will be called before all test cases.
        console.info('beforeAll called')
    })

    afterAll(function() {
        // Set a teardown function, which will be called after all test cases.
        console.info('afterAll called')
    })

    beforeEach(function() {
        // Set a setup function, which will be called before each test case.
        console.info('beforeEach called')
    })

    afterEach(function() {
        // Set a teardown function, which will be called after each test case.
        console.info('afterEach called')
    })
```
4. Add implementation of the test cases.
```
/*
 * @tc.name:appInfoTest001
 * @tc.desc:verify app info is not null
 * @tc.type: FUNC
 * @tc.require: Issue Number
 */
it("appInfoTest001", 0, function () {
    // Step 1 Call the function to obtain the test result.
    var info = app.getInfo()
    // Step 2 Use an assertion to compare the obtained result with the expected result.
    expect(info != null).assertEqual(true)
})
```
### Writing the Build File for Test Cases
When a test case is executed, the test framework searches the test case directory for the build file and builds the located test cases. The following describes how to write build files (GN files) for test cases in different programming languages.
#### Writing Build Files for Test Cases
The following provides templates for different languages for your reference.
- **Test case build file example (C++)**
```
# Copyright (c) 2022 XXXX Device Co., Ltd.

import("//build/test.gni")

module_output_path = "subsystem_examples/calculator"

config("module_private_config") {
  visibility = [ ":*" ]
  include_dirs = [ "../../../include" ]
}

ohos_unittest("CalculatorSubTest") {
  module_out_path = module_output_path
  sources = [
    "../../../include/calculator.h",
    "../../../src/calculator.cpp",
  ]
  sources += [ "calculator_sub_test.cpp" ]
  configs = [ ":module_private_config" ]
  deps = [ "//third_party/googletest:gtest_main" ]
}

group("unittest") {
  testonly = true
  deps = [ ":CalculatorSubTest" ]
}
```
The procedure is as follows:
1. Add comment information for the file header.
```
# Copyright (c) 2022 XXXX Device Co., Ltd.
```
2. Import the build template.
```
import("//build/test.gni")
```
3. Specify the file output path.
```
module_output_path = "subsystem_examples/calculator"
```
> **NOTE**<br>The output path is ***Part_name*/*Module_name***.
4. Configure the directories for dependencies.
```
config("module_private_config") {
  visibility = [ ":*" ]
  include_dirs = [ "../../../include" ]
}
```
```
> **NOTE**<br>Generally, the dependency directories are configured here and directly referenced in the build script of the test case.
5. Set the output build file for the test cases.
```
ohos_unittest("CalculatorSubTest") {
}
```
6. Write the build script (add the source file, configuration, and dependencies) for the test cases.
```
ohos_unittest("CalculatorSubTest") {
  module_out_path = module_output_path
  sources = [
    "../../../include/calculator.h",
    "../../../src/calculator.cpp",
  ]
  sources += [ "calculator_sub_test.cpp" ]
  configs = [ ":module_private_config" ]
  deps = [ "//third_party/googletest:gtest_main" ]
}
```
> **NOTE**<br>Set the test type based on actual requirements. The following test types are available:
> - **ohos_unittest**: unit test
> - **ohos_moduletest**: module test
> - **ohos_systemtest**: system test
> - **ohos_performancetest**: performance test
> - **ohos_securitytest**: security test
> - **ohos_reliabilitytest**: reliability test
> - **ohos_distributedtest**: distributed test
7. Group the test case files by test type.
```
group("unittest") {
  testonly = true
  deps = [ ":CalculatorSubTest" ]
}
```
> **NOTE**<br>Grouping test cases by test type allows you to execute a specific type of test cases when required.
- **Test case build file example (JavaScript)**
```
# Copyright (C) 2022 XXXX Device Co., Ltd.

import("//build/test.gni")

module_output_path = "subsystem_examples/app_info"

ohos_js_unittest("GetAppInfoJsTest") {
  module_out_path = module_output_path
  hap_profile = "./config.json"
  certificate_profile = "//test/developertest/signature/openharmony_sx.p7b"
}

group("unittest") {
  testonly = true
  deps = [ ":GetAppInfoJsTest" ]
}
```
The procedure is as follows:
1. Add comment information for the file header.
```
# Copyright (C) 2022 XXXX Device Co., Ltd.
```
2. Import the build template.
```
import("//build/test.gni")
```
3. Specify the file output path.
```
module_output_path = "subsystem_examples/app_info"
```
> **NOTE**<br>The output path is ***Part_name*/*Module_name***.
4. Set the output build file for the test cases.
```
ohos_js_unittest("GetAppInfoJsTest") {
}
```
> **NOTE**<br>
>- Use the **ohos\_js\_unittest** template to define the JavaScript test suite. Pay attention to the difference between JavaScript and C++.
>- The file generated for the JavaScript test suite must be in .hap format and named after the test suite name defined here. The test suite name must end with **JsTest**.
5. Configure the **config.json** file and signature file, which are mandatory.
```
ohos_js_unittest("GetAppInfoJsTest") {
  module_out_path = module_output_path
  hap_profile = "./config.json"
  certificate_profile = "//test/developertest/signature/openharmony_sx.p7b"
}
```
**config.json** is the configuration file required for HAP build. You need to set **target** based on the tested SDK version. Default values can be retained for other items. The following is an example:
```
{
  "app": {
    "bundleName": "com.example.myapplication",
    "vendor": "example",
    "version": {
      "code": 1,
      "name": "1.0"
    },
    "apiVersion": {
      "compatible": 4,
      "target": 5     // Set it based on the tested SDK version. In this example, SDK5 is used.
    }
  },
  "deviceConfig": {},
  "module": {
    "package": "com.example.myapplication",
    "name": ".MyApplication",
    "deviceType": [
      "phone"
    ],
    "distro": {
      "deliveryWithInstall": true,
      "moduleName": "entry",
      "moduleType": "entry"
    },
    "abilities": [
      {
        "skills": [
          {
            "entities": [
              "entity.system.home"
            ],
            "actions": [
              "action.system.home"
            ]
          }
        ],
        "name": "com.example.myapplication.MainAbility",
        "icon": "$media:icon",
        "description": "$string:mainability_description",
        "label": "MyApplication",
        "type": "page",
        "launchType": "standard"
      }
    ],
    "js": [
      {
        "pages": [
          "pages/index/index"
        ],
        "name": "default",
        "window": {
          "designWidth": 720,
          "autoDesignWidth": false
        }
      }
    ]
  }
}
```
6. Group the test case files by test type.
```
group("unittest") {
  testonly = true
  deps = [ ":GetAppInfoJsTest" ]
}
```
> **NOTE**<br>Grouping test cases by test type allows you to execute a specific type of test cases when required.
#### Configuring ohos.build
Configure the part build file to associate with specific test cases.
```
"partA": {
  "module_list": [
  ],
  "inner_list": [
  ],
  "system_kits": [
  ],
  "test_list": [
    "//system/subsystem/partA/calculator/test:unittest"  // Configure test under calculator.
  ]
}
```
> **NOTE**<br>**test_list** contains the test cases of the corresponding module.
### Configuring Test Case Resources
Test case resources include external file resources, such as image files, video files, and third-party libraries, required for test case execution.
Perform the following steps:
1. Create the **resource** directory in the **test** directory of the part, and create a directory for the module in the **resource** directory to store resource files of the module.
2. In the module directory under **resource**, create the **ohos_test.xml** file in the following format:
```
<?xml version="1.0" encoding="UTF-8"?>
<configuration ver="2.0">
    <target name="CalculatorSubTest">
        <preparer>
            <option name="push" value="test.jpg -> /data/test/resource" src="res"/>
            <option name="push" value="libc++.z.so -> /data/test/resource" src="out"/>
        </preparer>
    </target>
</configuration>
```
3. In the build file of the test cases, configure **resource_config_file** to point to the resource file **ohos_test.xml**.
```
ohos_unittest("CalculatorSubTest") {
  resource_config_file = "//system/subsystem/partA/test/resource/calculator/ohos_test.xml"
}
```
>**NOTE**
>- **target_name** indicates the test suite name defined in the **BUILD.gn** file in the **test** directory.
>- **preparer** indicates the action to perform before the test suite is executed.
>- **src="res"** indicates that the test resources are in the **resource** directory under the **test** directory.
>- **src="out"** indicates that the test resources are in the **out/release/$(*part*)** directory.
## Executing Test Cases
Before executing test cases, you need to modify the configuration based on the device used.
### Modifying user_config.xml
```
<user_config>
  <build>
    <!-- Whether to build a demo case. The default value is false. If a demo case is required, change the value to true. -->
    <example>false</example>
    <!-- Whether to build the version. The default value is false. -->
    <version>false</version>
    <!-- Whether to build the test cases. The default value is true. If the build is already complete, change the value to false before executing the test cases. -->
    <testcase>true</testcase>
  </build>
  <environment>
    <!-- Configure the IP address and port number of the remote server to support connection to the device through the HDC. -->
    <device type="usb-hdc">
      <ip></ip>
      <port></port>
      <sn></sn>
    </device>
    <!-- Configure the serial port information of the device to enable connection through the serial port. -->
    <device type="com" label="ipcamera">
      <serial>
        <com></com>
        <type>cmd</type>
        <baud_rate>115200</baud_rate>
        <data_bits>8</data_bits>
        <stop_bits>1</stop_bits>
        <timeout>1</timeout>
      </serial>
    </device>
  </environment>
  <!-- Configure the test case path. If the test cases have not been built (<testcase> is true), leave this parameter blank. If the build is complete, enter the path of the test cases. -->
  <test_cases>
    <dir></dir>
  </test_cases>
  <!-- Configure the coverage output path. -->
  <coverage>
    <outpath></outpath>
  </coverage>
  <!-- Configure the NFS mount information when the tested device supports only the serial port connection. Specify the NFS mapping path. host_dir indicates the NFS directory on the PC, and board_dir indicates the directory created on the board. -->
  <NFS>
    <host_dir></host_dir>
    <mnt_cmd></mnt_cmd>
    <board_dir></board_dir>
  </NFS>
</user_config>
```
>**NOTE**<br>If HDC is connected to the device before the test cases are executed, you only need to configure the device IP address and port number, and retain the default settings for other parameters.
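For example, for a device connected through HDC, only the **usb-hdc** device block needs to be filled in. The IP address below is a placeholder; 8710 is the default HDC port used later in this document:
```
<device type="usb-hdc">
  <ip>192.168.1.100</ip>
  <port>8710</port>
  <sn></sn>
</device>
```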
### Executing Test Cases on Windows
#### Building Test Cases
Test cases cannot be built on Windows. You need to run the following command to build test cases on Linux:
```
./build.sh --product-name hispark_taurus_standard --build-target make_test
```
>**NOTE**
>
>- **product-name**: specifies the name of the product to build, for example, **hispark_taurus_standard**.
>- **build-target**: specifies the test case to build. **make_test** indicates all test cases. You can specify the test cases based on requirements.
When the build is complete, the test cases are automatically saved in **out/hispark_taurus/packages/phone/tests**.
#### Setting Up the Execution Environment
1. On Windows, create the **Test** directory in the test framework and then create the **testcase** directory in the **Test** directory.
2. Copy **developertest** and **xdevice** from the Linux environment to the **Test** directory on Windows, and copy the test cases to the **testcase** directory.
>**NOTE**<br>Port the test framework and test cases from the Linux environment to the Windows environment for subsequent execution.
3. Modify the **user_config.xml** file.
```
<build>
  <!-- Because the test cases have been built, change the value to false. -->
  <testcase>false</testcase>
</build>
<test_cases>
  <!-- The test cases are copied to the Windows environment. Change the test case output path to the path of the test cases in the Windows environment. -->
  <dir>D:\Test\testcase\tests</dir>
</test_cases>
```
>**NOTE**<br>**<testcase>** indicates whether to build test cases. **<dir>** indicates the path for searching for test cases.
#### Executing Test Cases
1. Start the test framework.
```
start.bat
```
2. Select the product.
After the test framework starts, you are asked to select a product. Select the development board to test, for example, **Hi3516DV300**.
3. Execute test cases.
Run the following command to execute test cases:
```
run -t UT -ts CalculatorSubTest -tc integer_sub_001
```
In the command:
```
-t [TESTTYPE]: specifies the test case type, which can be UT, MST, ST, or PERF. This parameter is mandatory.
-tp [TESTPART]: specifies the part to test. This parameter can be used independently.
-tm [TESTMODULE]: specifies the module to test. This parameter must be specified together with -tp.
-ts [TESTSUITE]: specifies the test suite. This parameter can be used independently.
-tc [TESTCASE]: specifies the test case. This parameter must be specified together with -ts.
You can run h to display help information.
```
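For example, to run all unit test cases of one module, combine **-tp** and **-tm** (the part and module names below are placeholders):
```
run -t UT -tp partA -tm moduleA
```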
### Executing Test Cases on Linux
#### Mapping the Remote Port
To enable test cases to be executed on a remote Linux server or a Linux VM, map the port to enable communication between the device and the remote server or VM. Configure port mapping as follows:
1. On the HDC server, run the following commands:
```
hdc_std kill
hdc_std -m -s 0.0.0.0:8710
```
>**NOTE**<br>The IP address and port number are default values.
2. On the HDC client, run the following command:
```
hdc_std -s xx.xx.xx.xx:8710 list targets
```
>**NOTE**<br>Enter the IP address of the device to test.
#### Executing Test Cases
1. Start the test framework.
```
./start.sh
```
2. Select the product.
After the test framework starts, you are asked to select a product. Select the development board to test, for example, **Hi3516DV300**.
3. Execute test cases.
The test framework locates the test cases based on the command, and automatically builds and executes the test cases.
```
run -t UT -ts CalculatorSubTest -tc integer_sub_001
```
In the command:
```
-t [TESTTYPE]: specifies the test case type, which can be UT, MST, ST, or PERF. This parameter is mandatory.
-tp [TESTPART]: specifies the part to test. This parameter can be used independently.
-tm [TESTMODULE]: specifies the module to test. This parameter must be specified together with -tp.
-ts [TESTSUITE]: specifies the test suite. This parameter can be used independently.
-tc [TESTCASE]: specifies the test case. This parameter must be specified together with -ts.
You can run h to display help information.
```
## Viewing the Test Report
After the test cases are executed, the test result will be automatically generated. You can view the detailed test result in the related directory.
### Test Result
You can obtain the test result in the following directory:
```
test/developertest/reports/xxxx_xx_xx_xx_xx_xx
```
>**NOTE**<br>The folder for test reports is automatically generated.
The folder contains the following files:
| Type| Description|
| ------------ | ------------ |
| result/ |Test case results in standard format.|
| log/plan_log_xxxx_xx_xx_xx_xx_xx.log | Test case logs.|
| summary_report.html | Test report summary.|
| details_report.html | Detailed test report.|
### Test Framework Logs
```
reports/platform_log_xxxx_xx_xx_xx_xx_xx.log
```
### Latest Test Report
```
reports/latest
```
# XTS Test Case Development
## Introduction
The X test suite (XTS) subsystem contains a set of OpenHarmony compatibility test suites, including the currently supported application compatibility test suite (ACTS) and the device compatibility test suite (DCTS) that will be supported in the future.
This subsystem contains the ACTS and **tools** software package.
- The **acts** directory stores the source code and configuration files of ACTS test cases. The ACTS helps device vendors detect software incompatibilities as early as possible and ensures that the software remains compatible with OpenHarmony throughout the development process.
- The **tools** software package stores the test case development framework related to **acts**.
## System Types
OpenHarmony supports the following systems:
- Mini system
A mini system runs on a device that comes with memory greater than or equal to 128 KiB and an MCU such as ARM Cortex-M or a 32-bit RISC-V processor. It provides multiple lightweight network protocols and graphics frameworks, and a wide range of read/write components for the IoT bus. Typical products include connection modules, sensors, and wearables for smart home.
- Small system
A small system runs on a device that comes with memory greater than or equal to 1 MiB and application processors such as ARM Cortex-A. It provides higher security capabilities, standard graphics frameworks, and video encoding and decoding capabilities. Typical products include smart home IP cameras, electronic cat eyes, and routers, as well as event data recorders (EDRs) for smart travel.
- Standard system
A standard system runs on a device that comes with memory greater than or equal to 128 MiB and application processors such as ARM Cortex-A. It provides a complete application framework supporting the enhanced interaction, 3D GPU, hardware composer, diverse components, and rich animations. This system applies to high-end refrigerator displays.
## Directory Structure
```
/test/xts
├── acts                 # Test code
│   ├── subsystem        # Source code of subsystem test cases for the standard system
│   ├── subsystem_lite   # Source code of subsystem test cases for mini and small systems
│   ├── BUILD.gn         # Build configuration of test cases for the standard system
│   └── build_lite       # Build configuration of test cases for mini and small systems
│       └── BUILD.gn     # Build configuration of test cases for mini and small systems
└── tools                # Test tool code
```
## Constraints
Test cases for the mini system must be developed in C, and those for the small system must be developed in C++.
## Usage Guidelines
**Table 1** Test case levels
| Level | Definition | Scope |
| ----- | ----------- | ------- |
| Level0 | Smoke | Verifies basic functionalities of key features and basic DFX attributes with the most common input. The pass result indicates that the features are runnable. |
| Level1 | Basic | Verifies basic functionalities of key features and basic DFX attributes with common input. The pass result indicates that the features are testable. |
| Level2 | Major | Verifies basic functionalities of key features and basic DFX attributes with common input and errors. The pass result indicates that the features are functional and ready for beta testing. |
| Level3 | Regular | Verifies functionalities of all key features, and all DFX attributes with common and uncommon input combinations or normal and abnormal preset conditions. |
| Level4 | Rare | Verifies functionalities of key features under extremely abnormal presets and uncommon input combinations. |
**Table 2** Test case granularities
| Test Scale | Test Objects | Test Environment |
| ----- | ----------- | ------- |
| LargeTest | Service functionalities, all-scenario features, and mechanical power environment (MPE) and scenario-level DFX | Devices close to real devices. |
| MediumTest | Modules, subsystem functionalities after module integration, and DFX | Single device that is actually used. You can perform message simulation, but do not mock functions. |
| SmallTest | Modules, classes, and functions | Local PC. Use a large number of mocks to replace dependencies with other modules. |
**Table 3** Test types
| Type | Definition |
| ----------- | ------- |
| Function | Tests the correctness of both service and platform functionalities provided by the tested object for end users or developers. |
| Performance | Tests the processing capability of the tested object under specific preset conditions and load models. The processing capability is measured by the service volume that can be processed in a unit time, for example, call per second, frame per second, or event processing volume per second. |
| Power | Tests the power consumption of the tested object in a certain period of time under specific preset conditions and load models. |
| Reliability | Tests the service performance of the tested object under common and uncommon input conditions, or specified service volume pressure and long-term continuous running pressure. The test covers stability, pressure handling, fault injection, and Monkey test times. |
| Security | Tests the capability of defending against security threats, including but not limited to unauthorized access, use, disclosure, damage, modification, and destruction, to ensure information confidentiality, integrity, and availability.<br/>Tests the privacy protection capability to ensure that the collection, use, retention, disclosure, and disposal of users' private data comply with laws and regulations.<br/> Tests the compliance with various security specifications, such as security design, security requirements, and security certification of the Ministry of Industry and Information Technology (MIIT). |
| Global | Tests the internationalized data and localization capabilities of the tested object, including multi-language display, various input/output habits, time formats, and regional features, such as currency, time, and culture taboos. |
| Compatibility | Tests backward compatibility of an application with its own data, the forward and backward compatibility with the system, and the compatibility with different user data, such as audio file content of the player and smart SMS messages.<br/>Tests system backward compatibility with its own data and the compatibility of common applications in the ecosystem.<br/>Tests software compatibility with related hardware. |
| User | Tests user experience of the object in real user scenarios. All conclusions and comments should come from the users; the evaluation in this case is entirely subjective. |
| Standard | Tests the compliance with industry and company-specific standards, protocols, and specifications. The standards here do not include any security standards that should be classified into the security test. |
| Safety | Tests the safety property of the tested object to avoid possible hazards to personal safety, health, and the object itself. |
| Resilience | Tests the resilience property of the tested object to ensure that it can withstand and maintain the defined running status (including downgrading) when being attacked, and recover from and adapt defense to the attacks to approach mission assurance. |
## Test Case Development Guidelines
The test framework and programming language vary with the system type.
**Table 4** Test frameworks and test case languages for different systems
| System | Test Framework | Language |
| ----- | ----------- | ------- |
| Mini | HCTest | C |
| Small | HCPPTest | C++ |
| Standard | HJSUnit and HCPPTest | JavaScript and C++ |
### Developing Test Cases in C (for the Mini System)
**Developing Test Cases for the Mini System**
Test cases are developed in C using the HCTest framework, which is enhanced and adapted from the open-source test framework Unity.
1. Define the test case directory. The test cases are stored in **test/xts/acts**.
```
├── acts
│   ├── subsystem_lite
│   │   └── module_hal
│   │       ├── BUILD.gn
│   │       └── src
│   └── build_lite
│       └── BUILD.gn
```
2. Write the test case in the **src** directory.
(1) Include the test framework header file.
```
#include "hctest.h"
```
(2) Use the **LITE_TEST_SUIT** macro to define names of the subsystem, module, and test suite.
```
/**
 * @brief register a test suite named "IntTestSuite"
 * @param test subsystem name
 * @param example module name
 * @param IntTestSuite test suite name
 */
LITE_TEST_SUIT(test, example, IntTestSuite);
```
(3) Define Setup and TearDown.

Format: *Test suite name*+Setup, *Test suite name*+TearDown.

The Setup and TearDown functions must exist, but their function bodies can be empty.
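For example, empty hook functions for **IntTestSuite** could look as follows (a sketch based on the naming rule above; it assumes the HCTest hooks return **BOOL** and **TRUE** on success):
```
// Setup function of the test suite; an empty body is allowed.
static BOOL IntTestSuiteSetUp(void)
{
    return TRUE;
}

// Teardown function of the test suite; an empty body is allowed.
static BOOL IntTestSuiteTearDown(void)
{
    return TRUE;
}
```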
(4) Use the **LITE_TEST_CASE** macro to write the test case.

Three parameters are involved: test suite name, test case name, and test case properties (including type, granularity, and level).
```
LITE_TEST_CASE(IntTestSuite, TestCase001, Function | MediumTest | Level1)
{
    // Do something.
};
```
(5) Use the **RUN_TEST_SUITE** macro to register the test suite.
```
RUN_TEST_SUITE(IntTestSuite);
```
3. Create the configuration file (**BUILD.gn**) of the test module.
Create a **BUILD.gn** (example) file in each test module directory, and specify the name of the built static library and its dependent header files and libraries.
The format is as follows:
```
import("//test/xts/tools/lite/build/suite_lite.gni")

hctest_suite("ActsDemoTest") {
  suite_name = "acts"
  sources = [
    "src/test_demo.c",
  ]
  include_dirs = [ ]
  cflags = [ "-Wno-error" ]
}
```
4. Add build options to the **BUILD.gn** file in the **acts** directory.
You need to add the test module to the **test/xts/acts/build\_lite/BUILD.gn** script in the **acts** directory.
```
lite_component("acts") {
  ...
  if(board_name == "liteos_m") {
    features += [
      ...
      "//xts/acts/subsystem_lite/module_hal:ActsDemoTest"
    ]
  }
}
```
5. Run build commands.
Test suites are built along with the OS version. The ACTS is built together with the debug version.
>![](../public_sys-resources/icon-note.gif) **NOTE**<br/> The ACTS build middleware is a static library, which will be linked to the image.
### Executing Test Cases in C (for the Mini System)
**Executing Test Cases for the Mini System**
Burn the image into the development board.
**Executing the Test**
1. Use a serial port tool to log in to the development board and save information about the serial port.
2. Restart the device and view serial port logs.
**Analyzing the Test Result**
View the serial port logs in the following format:
The log for each test suite starts with "Start to run test suite:" and ends with "xx Tests xx Failures xx Ignored".
### Developing Test Cases in C++ (for Standard and Small Systems)
**Developing Test Cases for Small-System Devices** (for the standard system, see the **global/i18n_standard** directory)
The HCPPTest framework, an enhanced version based on the open-source framework Googletest, is used.
1. Define the test case directory. The test cases are stored in **test/xts/acts**.
```
├── acts
│   ├── subsystem_lite
│   │   └── module_posix
│   │       ├── BUILD.gn
│   │       └── src
│   └── build_lite
│       └── BUILD.gn
```
```
2. Write the test case in the **src** directory.
(1) Include the test framework.
Include **gtest.h**.
```
#include "gtest/gtest.h"
```
(2) Define Setup and TearDown.
```
using namespace std;
using namespace testing::ext;

class TestSuite: public testing::Test {
protected:
    // Preset action of the test suite, which is executed before the first test case
    static void SetUpTestCase(void) {
    }
    // Test suite cleanup action, which is executed after the last test case
    static void TearDownTestCase(void) {
    }
    // Preset action of the test case
    virtual void SetUp()
    {
    }
    // Cleanup action of the test case
    virtual void TearDown()
    {
    }
};
```
(3) Use the **HWTEST** or **HWTEST_F** macro to write the test case.
**HWTEST**: definition of common test cases, including the test suite name, test case name, and case annotation.
**HWTEST_F**: definition of SetUp and TearDown test cases, including the test suite name, test case name, and case annotation.
Three parameters are involved: test suite name, test case name, and test case properties (including type, granularity, and level).
```
HWTEST_F(TestSuite, TestCase_0001, Function | MediumTest | Level1) {
    // Do something
}
```
3. Create a configuration file (**BUILD.gn**) of the test module.
Create a **BUILD.gn** file in each test module directory, and specify the name of the built static library and its dependent header files and libraries. Each test module is independently built into a **.bin** executable file, which can be directly pushed to the development board for testing.
Example:
```
import("//test/xts/tools/lite/build/suite_lite.gni")

hcpptest_suite("ActsDemoTest") {
  suite_name = "acts"
  sources = [
    "src/TestDemo.cpp"
  ]
  include_dirs = [
    "src",
    ...
  ]
  deps = [
    ...
  ]
  cflags = [ "-Wno-error" ]
}
```
4. Add build options to the **BUILD.gn** file in the **acts** directory.
Add the test module to the **test/xts/acts/build_lite/BUILD.gn** script in the **acts** directory.
```
lite_component("acts") {
  ...
  else if(board_name == "liteos_a") {
    features += [
      ...
      "//xts/acts/subsystem_lite/module_posix:ActsDemoTest"
    ]
  }
}
```
5. Run build commands.
Test suites are built along with the OS version. The ACTS is built together with the debug version.
>![](../public_sys-resources/icon-note.gif) **NOTE**
>
>The ACTS for the small system is independently built to an executable file (.bin) and archived in the **suites\acts** directory of the build result.
### Executing Test Cases in C++ (for Standard and Small Systems)
**Executing Test Cases for the Small System**
Currently, test cases are shared through NFS and mounted to the development board for execution.
**Setting Up the Environment**
1. Use a network cable or wireless network to connect the development board to your PC.
2. Configure the IP address, subnet mask, and gateway for the development board. Ensure that the development board and the PC are in the same network segment.
3. Install and register the NFS server on the PC and start the NFS service.
4. Run the **mount** command for the development board to ensure that the development board can access NFS shared files on the PC.
Format: **mount** _NFS server IP address_**:/**_NFS shared directory_ **/**_development board directory_ **nfs**
Example:
```
mount 192.168.1.10:/nfs /nfs nfs
```
**Executing Test Cases**
Execute **ActsDemoTest.bin** to trigger test case execution, and analyze serial port logs generated after the execution is complete.
### Developing Test Cases in JavaScript (for the Standard System)
The HJSUnit framework is used to support automated test of OpenHarmony apps that are developed using the JavaScript language based on the JS application framework.
**Basic Syntax of Test Cases**
The test cases are developed with the JavaScript language and must meet the programming specifications of the language.
**Table 5** Basic syntax of test cases
| Syntax | Description | Mandatory |
| ------- | ------------- | ------------ |
| beforeAll | Presets a test-suite-level action executed only once before all test cases are executed. You can pass the action function as the only parameter. | No |
| afterAll | Presets a test-suite-level clear action executed only once after all test cases are executed. You can pass the clear function as the only parameter. | No |
| beforeEach | Presets a test-case-level action executed before each test case is executed. The number of execution times is the same as the number of test cases defined by it. You can pass the action function as the only parameter. | No |
| afterEach | Presets a test-case-level clear action executed after each test case is executed. The number of execution times is the same as the number of test cases defined by it. You can pass the clear function as the only parameter. | No |
| describe | Defines a test suite. You can pass two parameters: test suite name and test suite function. The describe statement supports nesting. You can use beforeAll, beforeEach, afterEach, and afterAll in each describe statement. | Yes |
| it | Defines a test case. You can pass three parameters: test case name, filter parameter, and test case function. <br>**Filter parameter:** <br/>The value is a 32-bit integer. Setting different bits to 1 means different configurations.<br/> - Setting bit 0 to **1** means bypassing the filter. <br>- Setting bits 0-10 to **1** specifies the test case type, which can be FUNCTION (function test), PERFORMANCE (performance test), POWER (power consumption test), RELIABILITY (reliability test), SECURITY (security compliance test), GLOBAL (integrity test), COMPATIBILITY (compatibility test), USER (user test), STANDARD (standard test), SAFETY (security feature test), and RESILIENCE (resilience test), respectively.<br>- Setting bits 16-18 to **1** specifies the test case scale, which can be SMALL (small-scale test), MEDIUM (medium-scale test), and LARGE (large-scale test), respectively.<br>- Setting bits 24-28 to **1** specifies the test level, which can be LEVEL0 (level-0 test), LEVEL1 (level-1 test), LEVEL2 (level-2 test), LEVEL3 (level-3 test), and LEVEL4 (level-4 test), respectively.<br> | Yes |
Use the standard syntax of Jasmine to write test cases. The ES6 specification is supported.
1. Define the test case directory. The test cases are stored in the **entry/src/main/js/test** directory.
```
├── BUILD.gn
│   └── entry
│       └── src
│           └── main
│               ├── js
│               │   ├── default
│               │   │   └── pages
│               │   │       └── index
│               │   │           └── index.js   # Entry file
│               │   └── test                   # Test code directory
│               ├── resources                  # HAP resources
│               └── config.json                # HAP configuration file
```
2. Start the JS test framework and load test cases.
The following is an example for **index.js**.
```
// Start the JS test framework and load test cases.
import {Core, ExpectExtend} from 'deccjsunit/index'

export default {
    data: {
        title: ""
    },
    onInit() {
        this.title = this.$t('strings.world');
    },
    onShow() {
        console.info('onShow finish')
        const core = Core.getInstance()
        const expectExtend = new ExpectExtend({
            'id': 'extend'
        })
        core.addService('expect', expectExtend)
        core.init()
        const configService = core.getDefaultService('config')
        configService.setConfig(this)
        require('../../../test/List.test')
        core.execute()
    },
    onReady() {
    },
}
```
3. Write a unit test case.
The following is an example:
```
// Example 1: Use HJSUnit to perform a unit test.
describe('appInfoTest', function () {
    it('app_info_test_001', 0, function () {
        var info = app.getInfo()
        expect(info.versionName).assertEqual('1.0')
        expect(info.versionCode).assertEqual('3')
    })
})
```
### Packaging Test Cases in JavaScript (for the Standard System)
For details about how to build a HAP, see the JS application development guide of the standard system [Building and Creating HAPs](https://developer.harmonyos.com/en/docs/documentation/doc-guides/build_overview-0000001055075201).
## Performing a Full Build (for the Standard System)
Run the following command:
```
./build.sh suite=acts system_size=standard
```
Test case directory: **out/release/suites/acts/testcases**
Test framework and test case directory: **out/release/suites/acts** \(the test suite execution framework is compiled during the build process)
## Executing Test Cases in a Full Build (for Small and Standard Systems)
**Setting Up a Test Environment**
Install Python 3.7 or a later version in the Windows environment, and ensure that the Windows environment is properly connected to the test device.
**Test execution directory** \(corresponding to the **out/release/suites/acts** directory generated in the build)
```
├── testcase # Directory for storing test suite files
│ └──xxx.hap # HAP file executed by the test suite
│ └──xxx.json # Execution configuration file of the test suite
├── tools # Test framework tool directory
├── run.bat # File for starting the test suite on the Windows platform
├── report # Directory for storing the test reports
```
**Executing Test Cases**
1. On the Windows environment, locate the directory in which the test cases are stored \(**out/release/suites/acts**, copied from the Linux server), go to the directory in the Windows command window, and run **acts\\run.bat**.
2. Enter the command for executing the test case.
- Execute all test cases.
```
run acts
```
![](figure/en-us_image_0000001119924146.gif)
- Execute the test cases of a module \(view specific module information in **\acts\testcases\**).
```
run -l ActsSamgrTest
```
![](figure/en-us_image_0000001166643927.jpg)
Wait until the test cases are complete.
3. View the test report.
Go to **acts\reports**, obtain the current execution record, and open **summary_report.html** to view the test report.