From 5793e75caaec7aa4657dd164ed193a03e49f1b69 Mon Sep 17 00:00:00 2001
From: annie_wangli
Date: Mon, 1 Nov 2021 15:42:08 +0800
Subject: [PATCH] update docs

Signed-off-by: annie_wangli
---
 .../subsystems/subsys-testguide-envbuild.md |   83 +
 .../subsystems/subsys-testguide-test.md     | 1697 ++++++++---------
 2 files changed, 859 insertions(+), 921 deletions(-)
 create mode 100644 en/device-dev/subsystems/subsys-testguide-envbuild.md

diff --git a/en/device-dev/subsystems/subsys-testguide-envbuild.md b/en/device-dev/subsystems/subsys-testguide-envbuild.md
new file mode 100644
index 0000000000..7e81b6cb4b
--- /dev/null
+++ b/en/device-dev/subsystems/subsys-testguide-envbuild.md
@@ -0,0 +1,83 @@

# Setting Up the Environment
## Basic Test Framework Environment

|Environment|Operating System|Linux Extended Component|Python|Python Plug-ins|NFS Server|HDC|
|------------|------------|------------|------------|------------|------------|------------|
|Version|Ubuntu 18.04 or later|libreadline-dev|3.7.5 or later|pyserial 3.3 or later, paramiko 2.7.1 or later, setuptools 40.8.0 or later, and rsa 4.0 or later|haneWIN NFS Server 1.2.50 or later, or NFS v4 or later|1.1.0 or later|
|Description|Provides the code build environment.|Plug-in used to read commands.|Language used by the test framework.|pyserial: supports Python serial port communication.<br />paramiko: allows Python to use SSH.<br />setuptools: allows Python packages to be created and distributed easily.<br />rsa: implements RSA encryption in Python.|Enables devices to be connected through the serial port.|A tool that enables devices to be connected through the HarmonyOS Device Connector (HDC).|

## Installation Process
1. Run the following command to install the Linux extended component libreadline:
   ```
   sudo apt-get install libreadline-dev
   ```
   The installation is successful if the following information is displayed:
   ```
   Reading package lists... Done
   Building dependency tree
   Reading state information... Done
   libreadline-dev is already the newest version (7.0-3).
   0 upgraded, 0 newly installed, 0 to remove and 11 not upgraded.
   ```
2. Run the following command to install the setuptools plug-in:
   ```
   pip3 install setuptools
   ```
   The installation is successful if the following information is displayed:
   ```
   Requirement already satisfied: setuptools in d:\programs\python37\lib\site-packages (41.2.0)
   ```
3. Run the following command to install the paramiko plug-in:
   ```
   pip3 install paramiko
   ```
   The installation is successful if the following information is displayed:
   ```
   Installing collected packages: pycparser, cffi, pynacl, bcrypt, cryptography, paramiko
   Successfully installed bcrypt-3.2.0 cffi-1.14.4 cryptography-3.3.1 paramiko-2.7.2 pycparser-2.20 pynacl-1.4.0
   ```
4. Run the following command to install the rsa plug-in:
   ```
   pip3 install rsa
   ```
   The installation is successful if the following information is displayed:
   ```
   Installing collected packages: pyasn1, rsa
   Successfully installed pyasn1-0.4.8 rsa-4.7
   ```
5. Run the following command to install the pyserial plug-in:
   ```
   pip3 install pyserial
   ```
   The installation is successful if the following information is displayed:
   ```
   Requirement already satisfied: pyserial in d:\programs\python37\lib\site-packages\pyserial-3.4-py3.7.egg (3.4)
   ```
6. Install the NFS server if the device outputs results only through the serial port.
+ - In Windows, install, for example, haneWIN NFS Server 1.2.50. + - In Linux, run the following command to install the NFS server: + ``` + sudo apt install nfs-kernel-server + ``` + The installation is successful if the following information is displayed: + ``` + Reading package lists... Done + Building dependency tree + Reading state information... Done + nfs-kernel-server is already the newest version (1:1.3.4-2.1ubuntu5.3). + 0 upgraded, 0 newly installed, 0 to remove and 11 not upgraded. + ``` +7. Install the HDC tool if the device supports HDC connections. For details about the installation process, see: + + https://gitee.com/openharmony/developtools_hdc_standard/blob/master/README.md + +## Checking the Environment + +| Check Item|Operation|Requirements| +| --- | --- | --- | +| Check whether Python is installed successfully.|Run the **python --version** command.|The Python version is 3.7.5 or later.| +| Check whether Python plug-ins are successfully installed.|Go to the **test/developertest** directory and run **run.bat** or **run.sh**.| The **>>>** prompt is displayed.| +|Check the NFS server status (for the devices that support only serial port output).|Log in to the development board through the serial port and run the **mount** command to mount the NFS.|The file directory can be mounted.| +|Check whether the HDC is successfully installed.|Run the **hdc_std -v** command.|The HDC version is 1.1.0 or later.| diff --git a/en/device-dev/subsystems/subsys-testguide-test.md b/en/device-dev/subsystems/subsys-testguide-test.md index 8f09862822..acc8daa6ec 100644 --- a/en/device-dev/subsystems/subsys-testguide-test.md +++ b/en/device-dev/subsystems/subsys-testguide-test.md @@ -1,282 +1,80 @@ -# Testing - -- [Overview](#section12403172115920) - - [Basic Concepts](#section53632272090) - - [Working Principles](#section2394431106) - -- [Limitations and Constraints](#section2029921310472) -- [Setting Up a Test Environment](#section175012297491) - - [Environment 
Requirements](#section935055691014) - - [Installing the Environment](#section6511193210111) - - [Verifying the Test Environment](#section1899144517117) - -- [Development Guidelines](#section16741101301210) - - [When to Use](#section93782214124) - - [Available APIs](#section54131732101218) - - [How to Develop](#section53541946111218) - -- [Development Example](#section7477121918136) -- [How to Use the Test Platform](#section76401945124810) -- [Directory Structure](#section1875515364133) - -## Overview - -### Basic Concepts - -The testing subsystem provides a one-click Python-based self-test platform for developers. It supports cross-platform tests and extension to third-party testing frameworks. The subsystem consists of modules for compiling, managing, scheduling and distributing, and executing test cases, collecting test results, generating test reports, creating test case templates, managing test environments, and many others. - -Before development using the testing subsystem, you need to understand the following concepts: - -- Test case compilation - - This operation compiles the source code of test cases into binary files that can be executed on the tested device. - -- Test case scheduling & distributing - - This operation distributes test cases to different tested devices through the network port or serial port, and allocates a specific executor for each test case. - -- Test case executor - - A test case executor defines the execution logic of each test case, such as its pre-processing, execution, and result recording. - -- Test case template - - A test case template defines respective unified formats for test cases and for GN files. - -- Test platform kits - - The test platform provides common methods to be used during the running of the test tool, for example, providing the test case directory to mount the file system to a tested device, distributing test cases to the tested device, or obtaining test results from the tested device. 
- -- Test report generation - - This operation defines a template for generating self-test reports and web test reports. - -- Test environment management - - The tested devices can be managed through the USB port or serial port, including discovering a device and querying the device status. - - -### Working Principles - -- The following figure shows the architecture of the test platform. - -**Figure 1** Platform architecture -![](figure/platform-architecture.png "platform-architecture") - -- The following figure shows the running sequence diagram of the test platform. - -**Figure 2** Running sequence of the test platform -![](figure/running-sequence-of-the-test-platform.png "running-sequence-of-the-test-platform") - -- Working principle of the test platform - -The test platform is started using a shell script. It executes a series of testing commands entered on the command line interface \(CLI\) and prints the command output. - -## Limitations and Constraints - -- The self-test platform supports only code-level test case development and verification, such as unit testing and module testing. -- Currently, the testing framework supports only white-box testing. -- Only one test platform can be started on a testing device. - -## Setting Up a Test Environment - -### Environment Requirements - -**Table 1** Environment requirements - - - - - - - - - - - - - - - - -

Item

-

Testing Device

-

Tested Device

-

Hardware

-
  • Memory: 8 GB or above
  • Hard disk space: 100 GB or above
  • Hardware architecture: x86 or ARM64
-
  • Hi3516D V300 development board
  • Hi3518E V300 development board
-

Software

-
  • OS: Windows 10 (64-bit) or Ubuntu 18.04

    System component (Linux): libreadline-dev

    -
  • Python: 3.7.5 or later
  • Python plug-ins: pySerial 3.3 or later, Paramiko 2.7.1 or later, Setuptools 40.8.0 or later, and RSA 4.0 or later
  • NFS server: haneWIN NFS Server 1.2.50 or later, or NFSv4 or later
-
  • OS: OpenHarmony 1.0 or later
  • Kernel: LiteOS Cortex-A or Linux kernel
-
- -### Installing the Environment - -1. \(Optional\) If the test environment runs Linux, run the following command to install system component Readline: - - ``` - sudo apt-get install libreadline-dev - ``` - - If the installation is successful, the following prompts are displayed: - - ``` - Reading package lists... Done - Building dependency tree - Reading state information... Done - libreadline-dev is already the newest version (7.0-3). - 0 upgraded, 0 newly installed, 0 to remove and 11 not upgraded. - ``` - -2. Install Python extension plug-ins Setuptools. Install RSA, Paramiko, and pySerial if the device supports the serial port only. - - 1. Run the following command to install Setuptools: - - ``` - pip install setuptools - ``` - - If the installation is successful, the following prompts are displayed: - - ``` - Requirement already satisfied: setuptools in d:\programs\python37\lib\site-packages (41.2.0) - ``` - - 2. Run the following command to install RSA: - - ``` - pip install rsa - ``` - - If the installation is successful, the following prompts are displayed: - - ``` - Installing collected packages: pyasn1, rsa - Successfully installed pyasn1-0.4.8 rsa-4.7 - ``` - - 3. Run the following command to install Paramiko: - - ``` - pip install paramiko - ``` - - If the installation is successful, the following prompts are displayed: - - ``` - Installing collected packages: pycparser, cffi, pynacl, bcrypt, cryptography, paramiko - Successfully installed bcrypt-3.2.0 cffi-1.14.4 cryptography-3.3.1 paramiko-2.7.2 pycparser-2.20 pynacl-1.4.0 - ``` - - 4. \(Optional\) Run the following command to install pySerial. This step is mandatory for tested devices that support serial ports only. - - ``` - pip install pyserial - ``` - - If the installation is successful, the following prompts are displayed: - - ``` - Requirement already satisfied: pyserial in d:\programs\python37\lib\site-packages\pyserial-3.4-py3.7.egg (3.4) - ``` - -3. \(Optional\) Install the NFS server. 
This step is mandatory for tested devices that support serial ports only. - - **Windows OS** - - Download and install **haneWIN NFS Server 1.2.50** at https://www.hanewin.net/nfs-e.htm. - - **Linux OS** - - ``` - sudo apt install nfs-kernel-server - ``` - - After the environment is installed, you can conduct coding and debugging for a test platform in an integrated development environment \(IDE\) \(DevEco Studio is recommended\). - - -### Verifying the Test Environment - -**Table 2** Environment verification - - - - - - - - - - - - - - - - - - - - -

Item

-

Operation

-

Requirement

-

Check that a compliant Python version has been installed.

-

Run the python --version command.

-

The Python version must be 3.7.5 or later.

-

Check that Python extension plug-ins have been installed.

-

Access the test/xdevice directory and run run.bat or run.sh.

-

The >>> prompt is displayed.

-

Check that the NFS server has been started (for tested devices that support serial ports only).

-

Log in to the development board through the serial port and run the mount command to mount the NFS server.

-

The file directory can be mounted properly.

-
- -## Development Guidelines - -### When to Use - -You can call the APIs to conduct white box tests of service code. - -### Available APIs - -The testing framework integrates the open-source unit testing framework and expands the macros of the test cases. For details about the framework, see the official open-source documentation. - -**Table 3** Expanded macros of the framework - - - - - - - - - - - - - - - - -

Macro

-

Description

-

HWTEST

-

The execution of test cases does not rely on setup and teardown execution. Based on the TEST macro, this macro has added the TestSize.Level1 parameter to specify the test case level, for example, HWTEST(CalculatorAddTest, TestPoint_001, TestSize.Level1).

-

HWTEST_F

-

The execution of test cases (without parameters) depends on setup and teardown execution. Based on the TEST_F macro, this macro has added the TestSize.Level1 parameter to specify the test case level, for example, HWTEST_F(CalculatorAddTest, TestPoint_001, TestSize.Level1).

-

HWTEST_P

-

The execution of test cases (with parameters) depends on setup and teardown execution. Based on the TEST_P macro, this macro has added the TestSize.Level1 parameter to specify the test case level, for example, HWTEST_P(CalculatorAddTest, TestPoint_001, TestSize.Level1).

-
- -### How to Develop - -1. Define a test suite file based on the test case directory, for example, **test/developertest/examples/lite/cxx\_demo/test/unittest/common/calc\_subtraction\_test.cpp**. The class in this test suite should be inherited from the **testing::Test** class and named in the format of "_Tested feature_\_**Test**". - +# Test Subsystem +OpenHarmony provides a comprehensive auto-test framework for designing test cases. Detecting defects in the development process can improve code quality. + +This document describes how to use the OpenHarmony test framework. +## Setting Up the Environment +The test framework depends on the Python running environment. Before using the test framework, set up the environment as follows: + - [Setting Up the Environment](subsys-testguide-envbuild.md) + - [Obtaining Source Code](../device-dev/get-code/sourcecode-acquire.md) + + +## Directory Structure +The directory structure of the test framework is as follows: +``` +test # Test subsystem +├── developertest # Developer test module +│ ├── aw # Static library of the test framework +│ ├── config # Test framework configuration +│ │ │ ... 
+│ │ └── user_config.xml # User configuration +│ ├── examples # Examples of test cases +│ ├── src # Source code of the test framework +│ ├── third_party # Adaptation code for third-party components on which the test framework depends +│ ├── reports # Test reports +│ ├── BUILD.gn # Build entry of the test framework +│ ├── start.bat # Test entry for Windows +│ └── start.sh # Test entry for Linux +└── xdevice # Modules on which the test framework depends +``` +## Writing Test Cases +### Designing the Test Case Directory +Design the test case directory as follows: +``` +subsystem # Subsystem +├── partA # Part A +│ ├── moduleA # Module A +│ │ ├── include +│ │ ├── src # Service code +│ │ └── test # Test directory +│ │ ├── unittest # Unit test +│ │ │ ├── common # Common test cases +│ │ │ │ ├── BUILD.gn # Build file of test cases +│ │ │ │ ├── testA_test.cpp # Source code of unit test cases +│ │ │ ├── phone # Test cases for mobile phones +│ │ │ ├── ivi # Test cases for head units +│ │ │ └── liteos-a # Test cases for the IP cameras that use the LiteOS kernel +│ │ └── resource # Dependency resources +│ │ └── ohos_test.xml +│ ├── moduleB # Module B +│ ├── test +│ │ └── moduletest # Module test +│ │ ├── common +│ │ ├── phone +│ │ ├── ivi +│ │ └── liteos-a +│ │ ... +│ └── ohos_build # Build entry configuration +... +``` +> **Note:** Test cases are classified into common test cases and device-specific test cases. You are advised to place common test cases in the **common** directory and device-specific test cases in the directories of the related devices. + +### Writing Test Cases +This test framework supports test cases written in multiple programming languages and provides different templates for different languages. + +**C++ Test Case Example** + +- Naming rules for source files + + The source file name of test cases must be the same as that of the test suite. The file names must use lowercase letters and in the [Function]\_[Sub-function]\_**test** format. 
More specific sub-functions can be added as required. +Example: + ``` + calculator_sub_test.cpp + ``` + +- Test case example ``` /* - * Copyright (c) 2020 OpenHarmony. + * Copyright (c) 2021 XXXX Device Co., Ltd. * Licensed under the Apache License, Version 2.0 (the "License"); * you may not use this file except in compliance with the License. * You may obtain a copy of the License at @@ -289,692 +87,749 @@ The testing framework integrates the open-source unit testing framework and expa * See the License for the specific language governing permissions and * limitations under the License. */ + + #include "calculator.h" #include - using namespace std; using namespace testing::ext; - class CalcSubtractionTest : public testing::Test { + class CalculatorSubTest : public testing::Test { public: static void SetUpTestCase(void); static void TearDownTestCase(void); void SetUp(); void TearDown(); }; - ``` - - >![](../public_sys-resources/icon-note.gif) **NOTE:** - >You must write test cases by observing the following specifications: - >- Naming - > The source file name of a test case must be consistent with the test suite content. Each test suite has multiple test cases and a test source file that is globally unique and named in \[Feature\]\_\[Function\]\_\[Subfunction 1\]\_\[Subfunction 1.1\] format \(subfunctions can be further divided\). - > The file name can contain only lower-case letters and underscores \(\_\) and must end with **test**, for example, **developertest/examples/lite/cxx\_demo**. - >- Coding - > The test cases must comply with the coding specifications for feature code. In addition, case descriptions are required for further clarification. For details, see [Test case template](#li2069415903917). - >- Compilation and configuration - > The test cases must be compiled using GN, and the configurations must comply with the compilation guide of this open-source project. 
For details, see [Compilation and Building Subsystem - Lightweight and Small-Scale Systems](subsys-build-mini-lite.md). - >- Test case template - > For details, see the example test case **developertest/examples/lite/cxx\_demo/test/unittest/common/calc\_subtraction\_test.cpp**. - -2. Implement the preprocessing \(via the **SetUp** function\) and postprocessing \(via the **TearDown** function\) operations required by the execution of the test suite. - - ``` - void CalcSubtractionTest::SetUpTestCase(void) + + void CalculatorSubTest::SetUpTestCase(void) { - // step 1: input testsuite setup step + // Set a setup function, which will be called before all test cases. } - void CalcSubtractionTest::TearDownTestCase(void) + void CalculatorSubTest::TearDownTestCase(void) { - // step 2: input testsuite teardown step + // Set a teardown function, which will be called after all test cases. } - void CalcSubtractionTest::SetUp(void) + void CalculatorSubTest::SetUp(void) { - // step 3: input testcase setup step + // Set a setup function, which will be called before each test case. } - void CalcSubtractionTest::TearDown(void) + void CalculatorSubTest::TearDown(void) { - // step 4: input testcase teardown step + // Set a teardown function, which will be called after each test case. } - ``` - -3. Compile a test case based on the feature to be tested. The following code uses **HWTEST\_F** as an example: - - ``` + /** * @tc.name: integer_sub_001 - * @tc.desc: Test Calculator + * @tc.desc: Verify the sub-function. * @tc.type: FUNC - * @tc.require: AR00000000 SR00000000 + * @tc.require: Issue Number */ - HWTEST_F(CalcSubtractionTest, integer_sub_001, TestSize.Level1) + HWTEST_F(CalculatorSubTest, integer_sub_001, TestSize.Level1) { - EXPECT_EQ(0, Subtraction(1, 0)); - } - ``` + // Step 1 Call the function to obtain the result. 
+ int actual = Sub(4, 0); - >![](../public_sys-resources/icon-note.gif) **NOTE:** - >- **@tc.name**: test case name, which briefly describes the test purpose - >- **@tc.desc**: detailed description of the test case, including the test purpose, test procedure, and expected result - >- **@tc.type**: test type, which can be **FUNC**, **PERF**, **SECU**, or **RELI**. - >- **@tc.require**: requirement ID or issue ID, which is used to associate the modification with the test case - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

SN

-

Test Type

-

Code

-

Description

-

1

-

Functionality test

-

FUNC

-

Verifies that each functionality of the software complies with the function design and specifications.

-

2

-

Performance test

-

PERF

-

Verifies that the software meets the performance requirements. Performance tests include load tests, capacitance tests, and pressure tests.

-

3

-

Security test

-

SECU

-

Verifies that the software complies with security requirements and related laws and regulations within the software lifecycle.

-

4

-

Reliability test

-

RELI

-

Verifies the probability that the software does not cause system failures within a specified period of time and under given conditions. Software stability is also involved in the test.

-
- -4. Compile the GN file of the test case, including defining the compilation target, adding compilation dependencies, and setting the source file. - - Example file path: **test/developertest/examples/lite/cxx\_demo/test/unittest/common/BUILD.gn** - - ``` - import("//build/lite/config/test.gni") - - unittest("CalcSubTest") { - output_extension = "bin" - sources = [ - "calc_subtraction_test.cpp" - ] - include_dirs = [] - deps = [] + // Step 2 Use an assertion to compare the obtained result with the expected result. + EXPECT_EQ(4, actual); } ``` + The procedure is as follows: + 1. Add comment information to the test case file header. + ``` + /* + * Copyright (c) 2021 XXXX Device Co., Ltd. + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ + ``` + 2. Add the test framework header file and namespace. + ``` + #include + + using namespace testing::ext; + ``` + 3. Add the header file of the test class. + ``` + #include "calculator.h" + ``` + 4. Define the test suite (test class). + ``` + class CalculatorSubTest : public testing::Test { + public: + static void SetUpTestCase(void); + static void TearDownTestCase(void); + void SetUp(); + void TearDown(); + }; + + void CalculatorSubTest::SetUpTestCase(void) + { + // Set a setup function, which will be called before all test cases. + } + + void CalculatorSubTest::TearDownTestCase(void) + { + // Set a teardown function, which will be called after all test cases. 
+ } + + void CalculatorSubTest::SetUp(void) + { + // Set a setup function, which will be called before each test case. + } + + void CalculatorSubTest::TearDown(void) + { + // Set a teardown function, which will be called after each test case. + } + ``` + > **Note**: When defining a test suite, ensure that the test suite name is the same as the target to build and uses the upper camel case style. + + 5. Add implementation of the test cases, including test case comments and logic. + ``` + /** + * @tc.name: integer_sub_001 + * @tc.desc: Verify the sub function. + * @tc.type: FUNC + * @tc.require: Issue Number + */ + HWTEST_F(CalculatorSubTest, integer_sub_001, TestSize.Level1) + { + // Step 1 Call the function to obtain the test result. + int actual = Sub(4, 0); + + // Step 2 Use an assertion to compare the obtained result with the expected result. + EXPECT_EQ(4, actual); + } + ``` + The following test case templates are provided for your reference. + + | Template| Description| + | ------------| ------------| + | HWTEST(A,B,C)| Use this template if the test case execution does not depend on setup or teardown.| + | HWTEST_F(A,B,C)| Use this template if the test case execution (excluding parameters) depends on setup and teardown.| + | HWTEST_P(A,B,C)| Use this template if the test case execution (including parameters) depends on setup and teardown.| + + In the template names: + - *A* indicates the test suite name. + - *B* indicates the test case name, which is in the *Function*\_*No.* format. The *No.* is a three-digit number starting from **001**. + - *C* indicates the test case level. There are five test case levels: guard-control level 0 and non-guard-control level 1 to level 4. Of levels 1 to 4, a smaller value indicates a more important function verified by the test case. + + **Note**: + - The expected result of each test case must have an assertion. + - The test case level must be specified. 
+ - It is recommended that the test be implemented step by step according to the template. + - The comment must contain the test case name, description, type, and requirement number. The test case description must be in the @tc.xxx format. The test case type @tc.type can be any of the following: + + | Test Case Type|Function test|Performance test|Reliability test|Security test|Fuzz test| + | ------------|------------|------------|------------|------------|------------| + | Code|FUNC|PERF|RELI|SECU|FUZZ| + -5. Add the compilation target to the subsystem compilation configuration to ensure that the test case is compiled with the version distribution. The following is an example: - 1. For devices that support connection to the Harmony device connector \(hdc\), the example compilation configuration directory is **test/developertest/examples/ohos.build**. - - ``` - { - "subsystem": "subsystem_examples", - "parts": { - "subsystem_examples": { - "module_list": [ - "//test/developertest/examples/detector:detector", - ... - ], - "test_list": [ - "//test/developertest/examples/detector/test:unittest", - ... - ] - }, - ... - } - ``` - - 2. For devices that support serial ports only, the example compilation configuration directory is **test/developertest/examples/lite/BUILD.gn**. - - ``` - import("//build/lite/config/test.gni") - - subsystem_test("test") { - test_components = [] - if(ohos_kernel_type == "liteos_riscv") { - features += [ - ] - }else if(ohos_kernel_type == "liteos_a") { - test_components += [ - "//test/developertest/examples/lite/cxx_demo/test/unittest/common:CalcSubTest" - ] - } - } - ``` - -6. Create a resource configuration file for the test case to use static resources. - 1. Create the **resource** directory in the **test** directory of a component or module. - 2. Create a directory for a device type, for example, **phone**, in the **resource** directory. - 3. Create a folder named after the module in the device type directory, for example, **testmodule**. 
- 4. Create the **ohos\_test.xml** file in the folder named after the module. The file content is in the following format: - - ``` - - - - - - - - ``` - - 5. Define **resource\_config\_file** in the compilation configuration file of the test case to specify the resource file **ohos\_test.xml**. - - >![](../public_sys-resources/icon-note.gif) **NOTE:** - >The resource file is used to push the **test.txt** file in the **resource** directory to the **/data/test/resource** directory of the device to test. To do so, run the **hdc push** command. - - -7. Execute the test case after it is compiled \(the preceding steps are complete\). - - >![](../public_sys-resources/icon-note.gif) **NOTE:** - >- For devices that support connection to the hdc, test cases can be compiled separately. - >- For devices that support serial ports only, to compile the test case, run the commands in the root directory for compiling the debug code. - > For details about how to execute a test case, see [How to Use the Test Platform](#section76401945124810). - - -## Development Example - -The code repository of the testing subsystem provides complete demo cases, which are available in the **test/developertest/examples/** directory. The following is an example of compiling a test case for a subtraction function: +**JavaScript Test Case Example** -- The tested code is as follows: +- Naming rules for source files + The source file name of a test case must be in the [Function]\[Sub-function]Test format, and each part must use the upper camel case style. More specific sub-functions can be added as required. +Example: ``` - static int Subtraction(int a, int b) - { - return a - b; - } + AppInfoTest.js ``` -- The test case code is as follows: - +- Test case example ``` - /** - * @tc.name: integer_sub_002 - * @tc.desc: Verify the Subtraction function. - * @tc.type: FUNC - * @tc.require: AR00000000 SR00000000 + /* + * Copyright (C) 2021 XXXX Device Co., Ltd. 
* Licensed under the Apache License, Version 2.0 (the "License");
     * you may not use this file except in compliance with the License.
     * You may obtain a copy of the License at
     *
     *     http://www.apache.org/licenses/LICENSE-2.0
     *
     * Unless required by applicable law or agreed to in writing, software
     * distributed under the License is distributed on an "AS IS" BASIS,
     * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
     * See the License for the specific language governing permissions and
     * limitations under the License.
     */
    import app from '@system.app'

    import {describe, beforeAll, beforeEach, afterEach, afterAll, it, expect} from 'deccjsunit/index'

    describe("AppInfoTest", function () {
        beforeAll(function() {
            // Set a setup function, which will be called before all test cases.
            console.info('beforeAll called')
        })

        afterAll(function() {
            // Set a teardown function, which will be called after all test cases.
            console.info('afterAll called')
        })

        beforeEach(function() {
            // Set a setup function, which will be called before each test case.
            console.info('beforeEach called')
        })

        afterEach(function() {
            // Set a teardown function, which will be called after each test case.
            console.info('afterEach called')
        })

        /*
         * @tc.name: appInfoTest001
         * @tc.desc: verify app info is not null
         * @tc.type: FUNC
         * @tc.require: Issue Number
         */
        it("appInfoTest001", 0, function () {
            // Step 1 Call the function to obtain the test result.
            var info = app.getInfo()

            // Step 2 Use an assertion to compare the obtained result with the expected result.
            expect(info != null).assertEqual(true)
        })
    })
    ```
    The procedure is as follows:
    1. Add comment information to the test case file header.
    ```
    /*
     * Copyright (C) 2021 XXXX Device Co., Ltd. 
* Licensed under the Apache License, Version 2.0 (the "License");
     * you may not use this file except in compliance with the License.
     * You may obtain a copy of the License at
     *
     *     http://www.apache.org/licenses/LICENSE-2.0
     *
     * Unless required by applicable law or agreed to in writing, software
     * distributed under the License is distributed on an "AS IS" BASIS,
     * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
     * See the License for the specific language governing permissions and
     * limitations under the License.
     */
    ```
    2. Import the APIs to be tested and the JSUnit test library.
    ```
    import app from '@system.app'

    import {describe, beforeAll, beforeEach, afterEach, afterAll, it, expect} from 'deccjsunit/index'
    ```
    3. Define the test suite (test class).
    ```
    describe("AppInfoTest", function () {
        beforeAll(function() {
            // Set a setup function, which will be called before all test cases.
            console.info('beforeAll called')
        })

        afterAll(function() {
            // Set a teardown function, which will be called after all test cases.
            console.info('afterAll called')
        })

        beforeEach(function() {
            // Set a setup function, which will be called before each test case.
            console.info('beforeEach called')
        })

        afterEach(function() {
            // Set a teardown function, which will be called after each test case.
            console.info('afterEach called')
        })
    ```
    4. Add implementation of the test cases.
    ```
    /*
     * @tc.name: appInfoTest001
     * @tc.desc: verify app info is not null
     * @tc.type: FUNC
     * @tc.require: Issue Number
     */
    it("appInfoTest001", 0, function () {
        // Step 1 Call the function to obtain the test result.
        var info = app.getInfo()

        // Step 2 Use an assertion to compare the obtained result with the expected result. 
+            expect(info != null).assertEqual(true)
+        })
+        ```
+
+### Writing the Build File for Test Cases
+When a test case is executed, the test framework searches the test case directory for the build file of the test case and builds the test cases it finds. The following describes how to write build files (GN files) in different programming languages.
+
+#### Writing Build Files for Test Cases
+The following provides templates for different languages for your reference.
+
+- **Test case build file example (C++)**
+    ```
+    # Copyright (c) 2021 XXXX Device Co., Ltd.
+
+    import("//build/test.gni")
+
+    module_output_path = "subsystem_examples/calculator"
+
+    config("module_private_config") {
+      visibility = [ ":*" ]
+
+      include_dirs = [ "../../../include" ]
+    }
+
+    ohos_unittest("CalculatorSubTest") {
+      module_out_path = module_output_path
+
+      sources = [
+        "../../../include/calculator.h",
+        "../../../src/calculator.cpp",
+      ]
+
+      sources += [ "calculator_sub_test.cpp" ]
+
+      configs = [ ":module_private_config" ]
+
+      deps = [ "//third_party/googletest:gtest_main" ]
+    }
+
+    group("unittest") {
+      testonly = true
+      deps = [ ":CalculatorSubTest" ]
+    }
+    ```
-
-
-## How to Use the Test Platform
-
-1.  \(Optional\) Install the XDevice component. XDevice can be used as a Python extension package.
-
-    Go to the installation directory **test/xdevice** and run the following command:
-
-    ```
-    python setup.py install
-    ```
-
-    If the installation is successful, the following prompts are displayed:
+    The build file is configured as follows:
+
+    1. Add comment information for the file header.
+        ```
+        # Copyright (c) 2021 XXXX Device Co., Ltd.
+        ```
+    2. Import the build template.
+        ```
+        import("//build/test.gni")
+        ```
+    3. Specify the file output path.
+        ```
+        module_output_path = "subsystem_examples/calculator"
+        ```
+        > **Note**: The output path is ***Part name*/*Module name***.
+
+    4. Configure the directories for dependencies.
+
+        ```
+        config("module_private_config") {
+          visibility = [ ":*" ]
+
+          include_dirs = [ "../../../include" ]
+        }
+        ```
+        > **Note**: Generally, the dependency directories are configured here and directly referenced in the build script of the test case.
+
+    5. Set the output build file for the test cases.
+
+        ```
+        ohos_unittest("CalculatorSubTest") {
+        }
+        ```
+    6. Write the build script (add the source file, configuration, and dependencies) for the test cases.
+        ```
+        ohos_unittest("CalculatorSubTest") {
+          module_out_path = module_output_path
+          sources = [
+            "../../../include/calculator.h",
+            "../../../src/calculator.cpp",
+          ]
+          sources += [ "calculator_sub_test.cpp" ]
+          configs = [ ":module_private_config" ]
+          deps = [ "//third_party/googletest:gtest_main" ]
+        }
+        ```
+
+        > **Note:** Set the test type based on actual requirements. The following test types are available:
+        > - **ohos_unittest**: unit test
+        > - **ohos_moduletest**: module test
+        > - **ohos_systemtest**: system test
+        > - **ohos_performancetest**: performance test
+        > - **ohos_securitytest**: security test
+        > - **ohos_reliabilitytest**: reliability test
+        > - **ohos_distributedtest**: distributed test
+
+    7. Group the test case files by test type.
+
+        ```
+        group("unittest") {
+          testonly = true
+          deps = [ ":CalculatorSubTest" ]
+        }
+        ```
+        > **Note**: Grouping test cases by test type allows you to execute a specific type of test cases when required.
+
+- **Test case build file example (JavaScript)**
+    ```
-    ...
-    Installed d:\programs\python37\lib\site-packages\xdevice-0.0.0-py3.7.egg
-    Processing dependencies for xdevice==0.0.0
-    Searching for pyserial==3.4
-    Best match: pyserial 3.4
-    Processing pyserial-3.4-py3.7.egg
-    pyserial 3.4 is already the active version in easy-install.pth
-    Installing miniterm.py script to D:\Programs\Python37\Scripts
-
-    Using d:\programs\python37\lib\site-packages\pyserial-3.4-py3.7.egg
-    Finished processing dependencies for xdevice==0.0.0
+    # Copyright (C) 2021 XXXX Device Co., Ltd.
+
+    import("//build/test.gni")
+
+    module_output_path = "subsystem_examples/app_info"
+
+    ohos_js_unittest("GetAppInfoJsTest") {
+      module_out_path = module_output_path
+
+      hap_profile = "./config.json"
+      certificate_profile = "//test/developertest/signature/openharmony_sx.p7b"
+    }
+
+    group("unittest") {
+      testonly = true
+      deps = [ ":GetAppInfoJsTest" ]
+    }
+    ```
-2.  Modify the **developertest/config/user\_config.xml** file to configure the Developertest component.
-    1.  Modify basic configuration parameters.
-
-        \[build\] \# Set build parameters of the test case.
-
-        ```
-        <build>
-            <example>false</example>
-            <version>false</version>
-            <testcase>true</testcase>
-            ... ...
-        </build>
-        ```
-
-        >![](../public_sys-resources/icon-note.gif) **NOTE:**
-        >**example**: whether to build the test case example. The default value is **false**.
-        >**version**: whether to build the test version. The default value is **false**.
-        >**testcase**: whether to build the test case. The default value is **true**.
-
-    2.  For devices that support connection to the hdc, modify the configuration file as follows:
-
-        Between the **device** tags with the **"usb-hdc"** attribute, modify the IP address of the device and the port number matching the HDC connection. For example:
-
-        ```
-        <device type="usb-hdc">
-            <ip>192.168.1.1</ip>
-            <port>9111</port>
-            <sn></sn>
-        </device>
-        ```
-
-    3.  For devices that support serial ports only, modify the configuration file as follows:
-
-        \[board\_info\] \# Configure development board information.
-
-        ```
-        <board_info>
-            <board_series>hispark</board_series>
-            <board_type>taurus</board_type>
-            <board_product>ipcamera</board_product>
-            <build_command>hb build</build_command>
-        </board_info>
-        ```
-
-        >![](../public_sys-resources/icon-note.gif) **NOTE:**
-        >**board\_series**: development board series. The default value is **hispark**.
-        >**board\_type**: development board type. The default value is **taurus**.
-        >**board\_product**: target product. The default value is **ipcamera**.
-        >**build\_command**: command used for building the test version and test case. The default value is **hb build**.
-
-        Between the **device** tags with the **"ipcamera"** attribute, modify the serial port information, including the COM port and baud rate. For example:
-
-        ```
-        <device type="com" label="ipcamera">
-            <serial>
-                <com>COM1</com>
-                <type>cmd</type>
-                <baud_rate>115200</baud_rate>
-                <data_bits>8</data_bits>
-                <stop_bits>1</stop_bits>
-                <timeout>1</timeout>
-            </serial>
-        </device>
-        ```
-
-3.  \(Optional\) Modify the Developertest configuration. If a test case has been compiled, specify the compilation output path of the test case. In this case the test platform will not recompile the test case.
-
-    Modify the **config/user\_config.xml** file.
-
-    1.  Specify the output path of the test case, that is, the compilation output directory, between the **test\_cases** tags. Example:
-
-        ```
-        <test_cases>
-            <dir>/home/opencode/out/release/tests</dir>
-        </test_cases>
-        ```
-
-    2.  For devices that support serial ports only, specify the NFS directory on the PC \(**host\_dir**\) and the corresponding directory on the board \(**board\_dir**\) between the **NFS** tags. For example:
-
-        ```
-        <NFS>
-            <host_dir>D:\nfs</host_dir>
-            <board_dir>user</board_dir>
-        </NFS>
-        ```
-
-4.  \(Optional\) Prepare the test environment. If devices to be tested support only serial ports, check whether the environment is ready:
-    -   The system image and file system have been burnt into the development board and are running properly on it. For example, in system mode, if the prompt **OHOS\#** is displayed when you log in with the shell, the system is running properly.
-    -   The development host has been connected to the serial port of the development board and the network port.
-    -   IP addresses of the development host and development board are in the same network segment and can ping each other.
- - An empty directory has been created on the development host for mounting test cases through NFS, and the NFS service has been started properly. - -5. Start the test platform and execute the test case. - - Start the test framework, go to the **test/developertest** directory, and execute the startup script. - 1. Run the following command to start the test framework in Windows: - - ``` - start.bat - ``` - - 2. Run the following command to start the test framework in Linux: - - ``` - ./start.sh - ``` - - - Select a device type. + The procedure is as follows: - Configure the device type based on the development board in the configuration file, for example, **developertest/config/framework\_config.xml**. - - - Run test commands. - 1. To query the subsystems, modules, product form, and test types supported by test cases, run the **show** commands. - - ``` - Usage: - show productlist Query supported product forms - show typelist Query the supported test type - show subsystemlist Query supported subsystems - show modulelist Query supported modules - ``` - - 2. Run test commands. **-t** is mandatory, and **-ss** and **-tm** are optional. The following is an example: - - ``` - run -t ut -ss subsystem_examples -tm calculator - ``` - - 3. Specify the arguments to execute the test suite for a specific feature or module. - - ``` - usage: run [-h] [-p PRODUCTFORM] [-t [TESTTYPE [TESTTYPE ...]]] - [-ss SUBSYSTEM] [-tm TESTMODULE] [-ts TESTSUIT] - [-tc TESTCASE] [-tl TESTLEVEL] - - Optional arguments: - -h, --help Show this help message and exit. 
-    -p PRODUCTFORM, --productform PRODUCTFORM  Specified product form
-    -t [TESTTYPE [TESTTYPE ...]], --testtype [TESTTYPE [TESTTYPE ...]]
-                        Specify test type(UT,MST,ST,PERF,ALL)
-    -ss SUBSYSTEM, --subsystem SUBSYSTEM  Specify test subsystem
-    -tm TESTMODULE, --testmodule TESTMODULE  Specified test module
-    -ts TESTSUIT, --testsuite TESTSUIT  Specify test suite
-    -tc TESTCASE, --testcase TESTCASE  Specify test case
-    -tl TESTLEVEL, --testlevel TESTLEVEL  Specify test level
-    ```
-
-    -   View the test framework help if needed.
-
-        Run the following command to query the test commands that are supported by the test platform:
-
-        ```
-        help
-        ```
-
-    -   Exit the test platform.
-
-        Run the following command to exit the test platform:
-
-        ```
-        quit
-        ```
-
-6.  View the test result and logs. The test logs and reports are generated in the **developertest/reports** directory after you run the test commands.
-    -   The test result is displayed on the console. The root path of the test result is as follows:
-
-        ```
-        reports/xxxx-xx-xx-xx-xx-xx
-        ```
-
-    -   The test case formatting result is stored in the following directory:
-
-        ```
-        result/
-        ```
-
-    -   The test logs are stored in the following directory:
-
-        ```
-        log/plan_log_xxxx-xx-xx-xx-xx-xx.log
-        ```
-
-    -   The report summary file is as follows:
-
-        ```
-        summary_report.html
-        ```
-
-    -   The report details file is as follows:
-
-        ```
-        details_report.html
-        ```
-
-    -   The log directory of the test platform is as follows:
-
-        ```
-        reports/platform_log_xxxx-xx-xx-xx-xx-xx.log
-        ```
-
-
-## Directory Structure
-
-The source code of XDevice is stored in the **test/xdevice** directory. The following table describes the **xdevice** directory structure.
-
-**Table 4**  XDevice structure

-
-| Directory | Description |
-| -------- | -------- |
-| xdevice | Basic components of the test platform |
-| xdevice/src/xdevice | Source code for the basic test framework |
-| xdevice/config | Configuration file of the basic test framework |
-| xdevice/src/xdevice/__main__.py | Internal entrance to the basic test framework |
-| xdevice/src/xdevice/__init__.py | Package and plug-in dependencies |
-| xdevice/src/xdevice/variables.py | Global variables |
-| xdevice/src/xdevice/_core/command | Commands input by test cases |
-| xdevice/src/xdevice/_core/config | Configuration management of the basic test framework |
-| xdevice/src/xdevice/_core/environment | Environment management of the basic test framework, including device management |
-| xdevice/src/xdevice/_core/executor | Scheduling and distribution of test cases |
-| xdevice/src/xdevice/_core/driver | Test executor for the basic test framework |
-| xdevice/src/xdevice/_core/resource | Resource files and test report templates for the basic test framework |
-| xdevice/src/xdevice/_core/testkit | Common operations for the basic test framework, including NFS mounting |
-| xdevice/src/xdevice/_core/logger.py | Log management of the basic test framework |
-| xdevice/src/xdevice/_core/plugin.py | Plug-in management of the basic test framework |
-| xdevice/src/xdevice/_core/interface.py | Interfaces for plug-ins of the basic test framework |
-| xdevice/setup.py | Installation script of the basic test framework |
-| xdevice/run.bat | Startup script of the basic test framework (Windows) |
-| xdevice/run.sh | Startup script of the basic test framework (Linux) |
-
-The source code of Developertest is stored in the **test/developertest** directory. The following table describes the **developertest** directory structure.
-
-**Table 5**  Developertest structure
-
-| Directory | Description |
-| -------- | -------- |
-| developertest | Development test framework |
-| developertest/src | Test framework source code |
-| developertest/src/core | Test executor |
-| developertest/src/core/build | Test case compilation |
-| developertest/src/core/command | Processing of command lines entered by users |
-| developertest/src/core/config | Test framework configuration management |
-| developertest/src/core/driver | Test framework driver executor |
-| developertest/src/core/resource | Test framework configuration file |
-| developertest/src/core/testcase | Test case management |
-| developertest/src/core/common.py | Common operations on the test framework |
-| developertest/src/core/constants.py | Global constants of the test framework |
-| developertest/src/core/exception.py | Test framework exceptions |
-| developertest/src/core/utils.py | Test framework tools and methods |
-| developertest/src/main | Test framework platform |
-| developertest/src/main/__main__.py | Internal entrance of the test framework |
-| developertest/examples | Test framework demo cases |
-| developertest/third_party | Third-party components |
-| developertest/BUILD.gn | Compilation configuration of the subsystem |
-| developertest/start.bat | Developer test entry (Windows) |
-| developertest/start.sh | Developer test entry (Linux) |
+ 1. Add comment information for the file header. + + ``` + # Copyright (C) 2021 XXXX Device Co., Ltd. + ``` + 2. Import the build template. + + ``` + import("//build/test.gni") + ``` + 3. Specify the file output path. + + ``` + module_output_path = "subsystem_examples/app_info" + ``` + > **Note**: The output path is ***Part name*/*Module name***. + + 4. Set the output build file for the test cases. + + ``` + ohos_js_unittest("GetAppInfoJsTest") { + } + ``` + > **Note:** + >- Use the **ohos\_js\_unittest** template to define the JavaScript test suite. Pay attention to the difference between JavaScript and C++. + >- The file generated for the JavaScript test suite must be in .hap format and named after the test suite name defined here. The test suite name must end with **JsTest**. + + 5. Configure the **config.json** file and signature file, which are mandatory. + + ``` + ohos_js_unittest("GetAppInfoJsTest") { + module_out_path = module_output_path + + hap_profile = "./config.json" + certificate_profile = "//test/developertest/signature/openharmony_sx.p7b" + } + ``` + **config.json** is the configuration file required for HAP build. You need to set **target** based on the tested SDK version. Default values can be retained for other items. The following is an example: + + ``` + { + "app": { + "bundleName": "com.example.myapplication", + "vendor": "example", + "version": { + "code": 1, + "name": "1.0" + }, + "apiVersion": { + "compatible": 4, + "target": 5 // Set it based on the tested SDK version. In this example, SDK5 is used. 
+            }
+        },
+        "deviceConfig": {},
+        "module": {
+            "package": "com.example.myapplication",
+            "name": ".MyApplication",
+            "deviceType": [
+                "phone"
+            ],
+            "distro": {
+                "deliveryWithInstall": true,
+                "moduleName": "entry",
+                "moduleType": "entry"
+            },
+            "abilities": [
+                {
+                    "skills": [
+                        {
+                            "entities": [
+                                "entity.system.home"
+                            ],
+                            "actions": [
+                                "action.system.home"
+                            ]
+                        }
+                    ],
+                    "name": "com.example.myapplication.MainAbility",
+                    "icon": "$media:icon",
+                    "description": "$string:mainability_description",
+                    "label": "MyApplication",
+                    "type": "page",
+                    "launchType": "standard"
+                }
+            ],
+            "js": [
+                {
+                    "pages": [
+                        "pages/index/index"
+                    ],
+                    "name": "default",
+                    "window": {
+                        "designWidth": 720,
+                        "autoDesignWidth": false
+                    }
+                }
+            ]
+        }
+    }
+    ```
+    6. Group the test case files by test type.
+        ```
+        group("unittest") {
+          testonly = true
+          deps = [ ":GetAppInfoJsTest" ]
+        }
+        ```
+        > **Note**: Grouping test cases by test type allows you to execute a specific type of test cases when required.
+
+#### Configuring ohos.build
+
+Configure the part build file to associate with specific test cases.
+```
+"partA": {
+    "module_list": [
+
+    ],
+    "inner_list": [
+
+    ],
+    "system_kits": [
+
+    ],
+    "test_list": [
+      "//system/subsystem/partA/calculator/test:unittest"  // Configure test under calculator.
+    ]
+}
+```
+> **Note**: **test_list** contains the test cases of the corresponding module.
+
+### Configuring Test Case Resources
+Test case resources include external file resources, such as image files, video files, and third-party libraries, required for test case execution.
+
+Perform the following steps:
+1. Under the **test** directory of a part or module, create the **resource** directory to store resource files.
+
+2. In the **resource** directory, create the **ohos_test.xml** file in the following format:
+    ```
+    <?xml version="1.0" encoding="UTF-8"?>
+    <configuration ver="2.0">
+        <target name="CalculatorSubTest">
+            <preparer>
+                <option name="push" value="test.jpg -> /data/test/resource" src="res"/>
+                <option name="push" value="libc++.z.so -> /data/test/resource" src="out"/>
+            </preparer>
+        </target>
+    </configuration>
+    ```
+3. In the build file of the test cases, configure **resource_config_file** to point to the resource file **ohos_test.xml**.
+    ```
+    ohos_unittest("CalculatorSubTest") {
+      resource_config_file = "//system/subsystem/partA/calculator/test/resource/ohos_test.xml"
+    }
+    ```
+    >**Note:**
+    >- **target_name** indicates the test suite name defined in the **BUILD.gn** file in the **test** directory.
+    >- **preparer** indicates the action to perform before the test suite is executed.
+    >- **src="res"** indicates that the test resources are in the **resource** directory under the **test** directory.
+    >- **src="out"** indicates that the test resources are in the **out/release/$(component)** directory.
+
+## Executing Test Cases
+Before executing test cases, you need to modify the configuration based on the device used.
+
+### Modifying user_config.xml
+```
+<user_config>
+  <build>
+    <!-- Whether to build the demo case. The default value is false. -->
+    <example>false</example>
+    <!-- Whether to build the version. The default value is false. -->
+    <version>false</version>
+    <!-- Whether to build the test cases. The default value is true. -->
+    <testcase>true</testcase>
+  </build>
+  <environment>
+    <!-- For a device connected through HDC, configure the IP address and port number. -->
+    <device type="usb-hdc">
+      <ip></ip>
+      <port></port>
+      <sn></sn>
+    </device>
+    <!-- For a device connected through the serial port, configure the COM port and baud rate. -->
+    <device type="com" label="ipcamera">
+      <serial>
+        <com></com>
+        <type>cmd</type>
+        <baud_rate>115200</baud_rate>
+        <data_bits>8</data_bits>
+        <stop_bits>1</stop_bits>
+        <timeout>1</timeout>
+      </serial>
+    </device>
+  </environment>
+  <!-- Test case directory. Leave it empty if the test cases are built together with the version. -->
+  <test_cases>
+    <dir></dir>
+  </test_cases>
+  <!-- NFS mount information for devices that support serial ports only. -->
+  <NFS>
+    <host_dir></host_dir>
+    <board_dir></board_dir>
+  </NFS>
+</user_config>
+```
+>**Note**: If HDC is connected to the device before the test cases are executed, you only need to configure the device IP address and port number, and retain the default settings for other parameters.
+
+### Executing Test Cases on Windows
+#### Building Test Cases
+
+Test cases cannot be built on Windows. You need to run the following command to build test cases on Linux:
+```
+./build.sh --product-name Hi3516DV300 --build-target make_test
+```
+>**Note:**
+> - --**product-name**: specifies the name of the product to build, for example, **Hi3516DV300**.
+> - --**build-target**: specifies the target to build. It is optional. **make_test** indicates all test cases. You can set the build options based on requirements.
+
+When the build is complete, the test cases are automatically saved in the **out/ohos-arm-release/packages/phone/images/tests** directory.
+
+#### Setting Up the Execution Environment
+1. On Windows, create the **Test** directory for the test framework and then create the **testcase** directory in the **Test** directory.
+
+2. 
Copy **developertest** and **xdevice** from the Linux environment to the **Test** directory on Windows, and copy the test cases to the **testcase** directory.
+    >**Note**: Port the test framework and test cases from the Linux environment to the Windows environment for subsequent execution.
+
+3. Modify the **user_config.xml** file.
+    ```
+    <build>
+      <testcase>false</testcase>
+    </build>
+    <test_cases>
+      <dir>D:\Test\testcase\tests</dir>
+    </test_cases>
+    ```
+    >**Note**: `<testcase>` indicates whether to build test cases. `<dir>` indicates the path for searching for test cases.
+
+#### Executing Test Cases
+1. Start the test framework.
+    ```
+    start.bat
+    ```
+2. Select the product.
+
+    After the test framework starts, you are asked to select a product. Select the development board to test, for example, **Hi3516DV300**.
+
+3. Execute test cases.
+
+    Run the following command to execute test cases:
+    ```
+    run -t UT -ts CalculatorSubTest -tc integer_sub_001
+    ```
+    In the command:
+    ```
+    -t [TESTTYPE]: specifies the test case type, which can be UT, MST, ST, or PERF. This parameter is mandatory.
+    -tp [PARTNAME]: specifies a part. It can be used independently.
+    -tm [MODULENAME]: specifies a module. This parameter must be specified together with -tp.
+    -ts [SUITENAME]: specifies a test suite. It can be used independently.
+    -tc [CASENAME]: specifies a test case. This parameter must be specified together with -ts.
+    You can run -h to display help information.
+    ```
+### Executing Test Cases on Linux
+#### Mapping the Remote Port
+To enable test cases to be executed on a remote Linux server or a Linux VM, map the port to enable communication between the device and the remote server or VM. Configure port mapping as follows:
+1. On the HDC server, run the following commands:
+    ```
+    hdc_std kill
+    hdc_std -m -s 0.0.0.0:8710
+    ```
+    >**Note**: The IP address and port number are default values.
+
+2. On the HDC client, run the following command:
+    ```
+    hdc_std -s xx.xx.xx.xx:8710 list targets
+    ```
+    >**Note**: Enter the IP address of the device to test.
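+The two mapping steps above can also be collected in a small script. The sketch below is an assumption, not part of the test framework: it only prints the `hdc_std` commands for the given device IP so that they can be reviewed before being run on the server and the client.

```shell
#!/bin/sh
# Hypothetical helper (not part of the test framework): prints the hdc_std
# commands for the two port-mapping steps above. The device IP is a
# placeholder you supply; 8710 is the default port from the note above.
map_hdc_port() {
    device_ip="$1"
    port="${2:-8710}"
    # Step 1: commands to run on the HDC server.
    echo "hdc_std kill"
    echo "hdc_std -m -s 0.0.0.0:${port}"
    # Step 2: command to run on the HDC client to verify the device is visible.
    echo "hdc_std -s ${device_ip}:${port} list targets"
}
```

+For example, `map_hdc_port 192.168.1.10` prints the three commands with the default port filled in.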
+
+#### Executing Test Cases
+1. Start the test framework.
+    ```
+    ./start.sh
+    ```
+2. Select the product.
+
+    After the test framework starts, you are asked to select a product. Select the development board to test, for example, **Hi3516DV300**.
+
+3. Execute test cases.
+
+    The test framework locates the test cases based on the command, and automatically builds and executes the test cases.
+    ```
+    run -t UT -ts CalculatorSubTest -tc integer_sub_001
+    ```
+    In the command:
+    ```
+    -t [TESTTYPE]: specifies the test case type, which can be UT, MST, ST, or PERF. This parameter is mandatory.
+    -tp [PARTNAME]: specifies a part. It can be used independently.
+    -tm [MODULENAME]: specifies a module. This parameter must be specified together with -tp.
+    -ts [SUITENAME]: specifies a test suite. It can be used independently.
+    -tc [CASENAME]: specifies a test case. This parameter must be specified together with -ts.
+    You can run -h to display help information.
+    ```
+
+## Viewing the Test Report
+After the test cases are executed, the test results are generated automatically. You can view the detailed results in the related directory.
+
+### Test Result
+You can obtain the test result in the following directory:
+```
+test/developertest/reports/xxxx_xx_xx_xx_xx_xx
+```
+>**Note**: The folder for test reports is automatically generated.
+
+The folder contains the following files:
+| Type | Description |
+| ------------ | ------------ |
+| result/ | Test results in standard format |
+| log/plan_log_xxxx_xx_xx_xx_xx_xx.log | Test case logs |
+| summary_report.html | Test report summary |
+| details_report.html | Detailed test report |
+
+### Test Framework Logs
+```
+reports/platform_log_xxxx_xx_xx_xx_xx_xx.log
+```
+
+### Latest Test Report
+```
+reports/latest
+```
-- GitLab
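The timestamped report folders described above sort chronologically by name, so the newest run can also be found in a script. The helper below is a sketch under that assumption; it is not shipped with the framework.

```shell
#!/bin/sh
# Hypothetical helper (not part of the test framework): print the newest
# timestamped report directory (xxxx_xx_xx_xx_xx_xx) under a reports root.
# Timestamped names sort chronologically, so the largest name is the newest.
latest_report_dir() {
    ls -d "$1"/*_*_*_*_*_* 2>/dev/null | sort | tail -n 1
}
```

For example, `latest_report_dir test/developertest/reports` prints the folder that holds the most recent `summary_report.html`.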