diff --git a/en/device-dev/subsystems/subsys-testguide-test.md b/en/device-dev/subsystems/subsys-testguide-test.md index 031361ae8970c62ef4e3063b8f605dff8dd4c2f0..40d4905715e2a555ef38f298fec411cd5cb82e3a 100644 --- a/en/device-dev/subsystems/subsys-testguide-test.md +++ b/en/device-dev/subsystems/subsys-testguide-test.md @@ -1,64 +1,151 @@ -# Test Case Development +# Test OpenHarmony provides a comprehensive auto-test framework for designing test cases. Detecting defects in the development process can improve code quality. This document describes how to use the OpenHarmony test framework. ## Setting Up the Environment -The test framework depends on the Python running environment. Before using the test framework, set up the environment as follows: - - [Obtaining Source Code](../get-code/sourcecode-acquire.md) +- The test framework depends on Python. Before using the test framework, you need to set up the environment. +- For details about how to obtain the source code, see [Obtaining Source Code](../get-code/sourcecode-acquire.md). +### Environment Configuration +#### Basic Test Framework Environment + +|Environment|Version|Description| +|------------|------------|------------| +|Operating system|Ubuntu 18.04 or later|Provides the build environment.| +|Linux extend component|libreadline-dev|Allows users to edit command lines.| +|Python|3.7.5 or later|Provides the programming language for the test framework.| +|Python Plug-ins|pyserial 3.3 or later
paramiko 2.7.1 or later
setuptools 40.8.0 or later
RSA 4.0 or later|- pyserial: supports serial port communication in Python.
- paramiko: allows SSH in Python.
- setuptools: allows creation and distribution of Python packages.
-RSA: implements RSA encryption in Python.| +|NFS Server|haneWIN NFS Server 1.2.50 or later or NFS v4 or later|Allows devices to be connected over a serial port.| +|HDC|1.1.0 or later|Allows devices to be connected by using the OpenHarmony Device Connector (HDC).| + + +#### Installation Process + +1. Run the following command to install the Linux extended component libreadline: + ``` + sudo apt-get install libreadline-dev + ``` + The installation is successful if the following information is displayed: + ``` + Reading package lists... Done + Building dependency tree + Reading state information... Done + libreadline-dev is already the newest version (7.0-3). + 0 upgraded, 0 newly installed, 0 to remove and 11 not upgraded. + ``` +2. Run the following command to install the setuptools plug-in: + ``` + pip3 install setuptools + ``` + The installation is successful if the following information is displayed: + ``` + Requirement already satisfied: setuptools in d:\programs\python37\lib\site-packages (41.2.0) + ``` +3. Run the following command to install the paramiko plug-in: + ``` + pip3 install paramiko + ``` + The installation is successful if the following information is displayed: + ``` + Installing collected packages: pycparser, cffi, pynacl, bcrypt, cryptography, paramiko + Successfully installed bcrypt-3.2.0 cffi-1.14.4 cryptography-3.3.1 paramiko-2.7.2 pycparser-2.20 pynacl-1.4.0 + ``` +4. Run the following command to install the rsa plug-in: + ``` + pip3 install rsa + ``` + The installation is successful if the following information is displayed: + ``` + Installing collected packages: pyasn1, rsa + Successfully installed pyasn1-0.4.8 rsa-4.7 + ``` +5. Run the following command to install the pyserial plug-in: + ``` + pip3 install pyserial + ``` + The installation is successful if the following information is displayed: + ``` + Requirement already satisfied: pyserial in d:\programs\python37\lib\site-packages\pyserial-3.4-py3.7.egg (3.4) + ``` +6. Install the NFS server if the device outputs results only through the serial port. + - For Windows, install, for example, haneWIN NFS Server 1.2.50. + - For Linux, run the following command to install the NFS server: + ``` + sudo apt install nfs-kernel-server + ``` + The installation is successful if the following information is displayed: + ``` + Reading package lists... Done + Building dependency tree + Reading state information... Done + nfs-kernel-server is already the newest version (1:1.3.4-2.1ubuntu5.3). + 0 upgraded, 0 newly installed, 0 to remove and 11 not upgraded. + ``` +7. Install the HDC tool if the device supports HDC connections. + + For details, see https://gitee.com/openharmony/developtools_hdc_standard/blob/master/README.md + +## Checking the Installation Environment + +| Check Item|Operation |Requirements | +| --- | --- | --- | +| Check whether Python is installed successfully.|Run the **python --version** command. |The Python version is 3.7.5 or later.| +| Check whether Python plug-ins are successfully installed.|Go to the **test/developertest** directory and run **start.bat** or **start.sh**.| The **>>>** prompt is displayed.| +|Check the NFS server status (for the devices that support only serial port output). |Log in to the development board through the serial port and run the **mount** command to mount the NFS. |The file directory can be mounted. | +|Check whether the HDC is successfully installed. 
|Run the **hdc_std -v** command.|The HDC version is 1.1.0 or later.| + ## Directory Structure The directory structure of the test framework is as follows: ``` test # Test subsystem -├── developertest # Developer test module -│ ├── aw # Static library of the test framework -│ ├── config # Test framework configuration +├── developertest # Developer test module +│ ├── aw # Static library of the test framework +│ ├── config # Test framework configuration │ │ │ ... │ │ └── user_config.xml # User configuration -│ ├── examples # Examples of test cases -│ ├── src # Source code of the test framework -│ ├── third_party # Adaptation code for third-party components on which the test framework depends -│ ├── reports # Test reports -│ ├── BUILD.gn # Build entry of the test framework -│ ├── start.bat # Test entry for Windows -│ └── start.sh # Test entry for Linux -└── xdevice # Modules on which the test framework depends +│ ├── examples # Examples of test cases +│ ├── src # Source code of the test framework +│ ├── third_party # Adaptation code for third-party components on which the test framework depends +│ ├── reports # Test reports +│ ├── BUILD.gn # Build entry of the test framework +│ ├── start.bat # Test entry for Windows +│ └── start.sh # Test entry for Linux +└── xdevice # Modules on which the test framework depends ``` ## Writing Test Cases ### Designing the Test Case Directory Design the test case directory as follows: ``` -subsystem # Subsystem -├── partA # Part A -│ ├── moduleA # Module A +subsystem # Subsystem +├── partA # Part A +│ ├── moduleA # Module A │ │ ├── include -│ │ ├── src # Service code -│ │ └── test # Test directory -│ │ ├── unittest # Unit test -│ │ │ ├── common # Common test cases -│ │ │ │ ├── BUILD.gn # Build file of test cases +│ │ ├── src # Service code +│ │ └── test # Test directory +│ │ ├── unittest # Unit test +│ │ │ ├── common # Common test cases +│ │ │ │ ├── BUILD.gn # Build file of test cases │ │ │ │ └── testA_test.cpp # Source code of unit test cases -│ │ │ ├── phone # Test cases for mobile phones -│ │ │ ├── ivi # Test cases for head units -│ │ │ └── liteos-a # Test cases for the IP cameras that use the LiteOS kernel -│ │ ├── moduletest # Module test +│ │ │ ├── phone # Test cases for mobile phones +│ │ │ ├── ivi # Test cases for head units +│ │ │ └── liteos-a # Test cases for IP cameras using LiteOS +│ │ ├── moduletest # Module test │ │ ... │ │ -│ ├── moduleB # Module B +│ ├── moduleB # Module B │ ├── test -│ │ └── resource # Dependency resources -│ │ ├── moduleA # Module A -│ │ │ ├── ohos_test.xml # Resource configuration file -│ │ ... └── 1.txt # Resources +│ │ └── resource # Dependency resources +│ │ ├── moduleA # Module A +│ │ │ ├── ohos_test.xml # Resource configuration file +│ │ ... └── 1.txt # Resource file │ │ -│ ├── ohos_build # Build entry configuration +│ ├── ohos_build # Build entry configuration │ ... │ ... ``` -> **NOTE**
-> Test cases are classified into common test cases and device-specific test cases. You are advised to place common test cases in the **common** directory and device-specific test cases in the directories of the related devices. +> **CAUTION**
Test cases are classified into common test cases and device-specific test cases. You are advised to place common test cases in the **common** directory and device-specific test cases in the directories of the related devices. ### Writing Test Cases This test framework supports test cases written in multiple programming languages and provides different templates for different languages. @@ -76,7 +163,7 @@ Example: - Test case example ``` /* - * Copyright (c) 2021 XXXX Device Co., Ltd. + * Copyright (c) 2022 XXXX Device Co., Ltd. * Licensed under the Apache License, Version 2.0 (the "License"); * you may not use this file except in compliance with the License. * You may obtain a copy of the License at @@ -125,7 +212,7 @@ Example: /** * @tc.name: integer_sub_001 - * @tc.desc: Verify the sub-function. + * @tc.desc: Verify the sub function. * @tc.type: FUNC * @tc.require: Issue Number */ @@ -133,7 +220,7 @@ Example: { // Step 1 Call the function to obtain the result. int actual = Sub(4, 0); - + // Step 2 Use an assertion to compare the obtained result with the expected result. EXPECT_EQ(4, actual); } @@ -142,7 +229,7 @@ Example: 1. Add comment information to the test case file header. ``` /* - * Copyright (c) 2021 XXXX Device Co., Ltd. + * Copyright (c) 2022 XXXX Device Co., Ltd. * Licensed under the Apache License, Version 2.0 (the "License"); * you may not use this file except in compliance with the License. * You may obtain a copy of the License at @@ -159,7 +246,7 @@ Example: 2. Add the test framework header file and namespace. ``` #include - + using namespace testing::ext; ``` 3. Add the header file of the test class. @@ -175,35 +262,34 @@ Example: void SetUp(); void TearDown(); }; - + void CalculatorSubTest::SetUpTestCase(void) { // Set a setup function, which will be called before all test cases. } - + void CalculatorSubTest::TearDownTestCase(void) { // Set a teardown function, which will be called after all test cases. } - + void CalculatorSubTest::SetUp(void) { // Set a setup function, which will be called before each test case. } - + void CalculatorSubTest::TearDown(void) { // Set a teardown function, which will be called after each test case. } ``` - > **NOTE**
-    > When defining a test suite, ensure that the test suite name is the same as the target to build and uses the upper camel case style.
+    > **CAUTION**<br>
When defining a test suite, ensure that the test suite name is the same as the target to build and uses the upper camel case style. 5. Add implementation of the test cases, including test case comments and logic. ``` /** * @tc.name: integer_sub_001 - * @tc.desc: Verify the sub-function. + * @tc.desc: Verify the sub function. * @tc.type: FUNC * @tc.require: Issue Number */ @@ -211,37 +297,37 @@ Example: { // Step 1 Call the function to obtain the test result. int actual = Sub(4, 0); - + // Step 2 Use an assertion to compare the obtained result with the expected result. EXPECT_EQ(4, actual); } ``` The following test case templates are provided for your reference. - - | Type | Description | - | ------------ | ------------ | - | HWTEST(A,B,C) | Use this template if the test case execution does not depend on setup or teardown. | - | HWTEST_F(A,B,C) | Use this template if the test case execution (excluding parameters) depends on setup and teardown. | - | HWTEST_P(A,B,C) | Use this template if the test case execution (including parameters) depends on setup and teardown. | + + | Type| Description| + | ------------| ------------| + | HWTEST(A,B,C)| Use this template if the test case execution does not depend on setup or teardown.| + | HWTEST_F(A,B,C)| Use this template if the test case execution (excluding parameters) depends on setup and teardown.| + | HWTEST_P(A,B,C)| Use this template if the test case execution (including parameters) depends on setup and teardown.| In the template names: - *A* indicates the test suite name. - *B* indicates the test case name, which is in the *Function*\_*No.* format. The *No.* is a three-digit number starting from **001**. - *C* indicates the test case level. There are five test case levels: guard-control level 0 and non-guard-control level 1 to level 4. Of levels 1 to 4, a smaller value indicates a more important function verified by the test case. - Note the following: - - The expected result of each test case must have an assertion. - - The test case level must be specified. - - It is recommended that the test be implemented step by step according to the template. - - The comment must contain the test case name, description, type, and requirement number, which are in the @tc.*xxx*: *value* format. The test case type @**tc.type** can be any of the following: + **CAUTION**
+ - The expected result of each test case must have an assertion. + - The test case level must be specified. + - It is recommended that the test be implemented step by step according to the template. + - The comment must contain the test case name, description, type, and requirement number, which are in the @tc.*xxx*: *value* format. The test case type @**tc.type** can be any of the following: - | Test Case Type | Code | - | ------------ | ---------- | - | Function test | FUNC | - | Performance test | PERF | - | Reliability test | RELI | - | Security test | SECU | - | Fuzz test | FUZZ | + | Test Case Type|Code| + | ------------|------------| + |Function test |FUNC| + |Performance Test |PERF| + |Reliability test |RELI| + |Security test |SECU| + |Fuzzing |FUZZ| **JavaScript Test Case Example** @@ -257,7 +343,7 @@ Example: - Test case example ``` /* - * Copyright (C) 2021 XXXX Device Co., Ltd. + * Copyright (C) 2022 XXXX Device Co., Ltd. * Licensed under the Apache License, Version 2.0 (the "License"); * you may not use this file except in compliance with the License. * You may obtain a copy of the License at @@ -304,7 +390,7 @@ Example: it("appInfoTest001", 0, function () { // Step 1 Call the function to obtain the test result. var info = app.getInfo() - + // Step 2 Use an assertion to compare the obtained result with the expected result. expect(info != null).assertEqual(true) }) @@ -314,7 +400,7 @@ Example: 1. Add comment information to the test case file header. ``` /* - * Copyright (C) 2021 XXXX Device Co., Ltd. + * Copyright (C) 2022 XXXX Device Co., Ltd. * Licensed under the Apache License, Version 2.0 (the "License"); * you may not use this file except in compliance with the License. * You may obtain a copy of the License at @@ -331,7 +417,7 @@ Example: 2. Import the APIs and JSUnit test library to test. ``` import app from '@system.app' - + import {describe, beforeAll, beforeEach, afterEach, afterAll, it, expect} from 'deccjsunit/index' ``` 3. Define the test suite (test class). @@ -368,7 +454,7 @@ Example: it("appInfoTest001", 0, function () { // Step 1 Call the function to obtain the test result. var info = app.getInfo() - + // Step 2 Use an assertion to compare the obtained result with the expected result. expect(info != null).assertEqual(true) }) @@ -382,7 +468,7 @@ The following provides templates for different languages for your reference. - **Test case build file example (C++)** ``` - # Copyright (c) 2021 XXXX Device Co., Ltd. + # Copyright (c) 2022 XXXX Device Co., Ltd. import("//build/test.gni") @@ -417,25 +503,18 @@ The following provides templates for different languages for your reference. The procedure is as follows: 1. Add comment information for the file header. - - ``` - # Copyright (c) 2021 XXXX Device Co., Ltd. + ``` + # Copyright (c) 2022 XXXX Device Co., Ltd. ``` - 2. Import the build template. - - ``` + ``` import("//build/test.gni") ``` - 3. Specify the file output path. - ``` module_output_path = "subsystem_examples/calculator" ``` - - > **NOTE**
- > The output path is ***Part name*/*Module name***. + > **NOTE**
The output path is ***Part_name*/*Module_name***. 4. Configure the directories for dependencies. @@ -446,8 +525,7 @@ The following provides templates for different languages for your reference. include_dirs = [ "../../../include" ] } ``` - > **NOTE**
- > Generally, the dependency directories are configured here and directly referenced in the build script of the test case. + > **NOTE**
Generally, the dependency directories are configured here and directly referenced in the build script of the test case. 5. Set the output build file for the test cases. @@ -470,15 +548,14 @@ The following provides templates for different languages for your reference. } ``` - > **NOTE**
- > Set the test type based on actual requirements. The following test types are available:
- > - **ohos_unittest**: unit test
- > - **ohos_moduletest**: module test
- > - **ohos_systemtest**: system test
- > - **ohos_performancetest**: performance test
- > - **ohos_securitytest**: security test
- > - **ohos_reliabilitytest**: reliability test
- > - **ohos_distributedtest**: distributed test
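+    The **ohos_unittest** template above declares a unit test suite. If the same cases were packaged as, say, a module test, only the template name would change. A minimal hypothetical sketch (the suite and source file names are made up for illustration):
+
+    ```
+    ohos_moduletest("CalculatorSubModuleTest") {
+      sources = [ "calculator_sub_module_test.cpp" ]
+      configs = [ ":module_private_config" ]
+      deps = [ "//third_party/googletest:gtest_main" ]
+    }
+    ```
+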
+ > **NOTE**
Set the test type based on actual requirements. The following test types are available: + > - **ohos_unittest**: unit test + > - **ohos_moduletest**: module test + > - **ohos_systemtest**: system test + > - **ohos_performancetest**: performance test + > - **ohos_securitytest**: security test + > - **ohos_reliabilitytest**: reliability test + > - **ohos_distributedtest**: distributed test 7. Group the test case files by test type. @@ -488,13 +565,12 @@ The following provides templates for different languages for your reference. deps = [":CalculatorSubTest"] } ``` - > **NOTE**
- > Grouping test cases by test type allows you to execute a specific type of test cases when required. - -- **Test case build file example (JavaScript)** + > **NOTE**
Grouping test cases by test type allows you to execute a specific type of test cases when required. + +- Test case build file example (JavaScript) ``` - # Copyright (C) 2021 XXXX Device Co., Ltd. + # Copyright (C) 2022 XXXX Device Co., Ltd. import("//build/test.gni") @@ -518,7 +594,7 @@ The following provides templates for different languages for your reference. 1. Add comment information for the file header. ``` - # Copyright (C) 2021 XXXX Device Co., Ltd. + # Copyright (C) 2022 XXXX Device Co., Ltd. ``` 2. Import the build template. @@ -530,9 +606,7 @@ The following provides templates for different languages for your reference. ``` module_output_path = "subsystem_examples/app_info" ``` - - > **NOTE**
- > The output path is ***Part name*/*Module name***. + > **NOTE**
The output path is ***Part_name*/*Module_name***. 4. Set the output build file for the test cases. @@ -540,10 +614,9 @@ The following provides templates for different languages for your reference. ohos_js_unittest("GetAppInfoJsTest") { } ``` - > **NOTE**
- > - Use the **ohos\_js\_unittest** template to define the JavaScript test suite. Pay attention to the difference between JavaScript and C++. - > - The file generated for the JavaScript test suite must be in .hap format and named after the test suite name defined here. The test suite name must end with **JsTest**. + >- Use the **ohos\_js\_unittest** template to define the JavaScript test suite. Pay attention to the difference between JavaScript and C++. + >- The file generated for the JavaScript test suite must be in .hap format and named after the test suite name defined here. The test suite name must end with **JsTest**. 5. Configure the **config.json** file and signature file, which are mandatory. @@ -555,7 +628,7 @@ The following provides templates for different languages for your reference. certificate_profile = "//test/developertest/signature/openharmony_sx.p7b" } ``` - **config.json** is the configuration file required for HAP build. You need to set **target** based on the tested SDK version. Default values can be retained for other items. The following is an example: + **config.json** is the configuration file required for HAP build. You need to set **target** based on the tested SDK version. Default values can be retained for other items. The following is an example: ``` { @@ -625,9 +698,7 @@ The following provides templates for different languages for your reference. deps = [ ":GetAppInfoJsTest" ] } ``` - - > **NOTE**
- > Grouping test cases by test type allows you to execute a specific type of test cases when required. + > **NOTE**
Grouping test cases by test type allows you to execute a specific type of test cases when required. #### Configuring ohos.build @@ -648,9 +719,7 @@ Configure the part build file to associate with specific test cases. ] } ``` - -> **NOTE**
-> **test_list** contains the test cases of the corresponding module. +> **NOTE**
**test_list** contains the test cases of the corresponding module. ### Configuring Test Case Resources Test case resources include external file resources, such as image files, video files, and third-party libraries, required for test case execution. @@ -670,15 +739,17 @@ Perform the following steps: ``` -3. In the build file of the test cases, configure **resource\_config\_file** to point to the resource file **ohos\_test.xml**. +3. In the build file of the test cases, configure **resource_config_file** to point to the resource file **ohos_test.xml**. ``` ohos_unittest("CalculatorSubTest") { resource_config_file = "//system/subsystem/partA/test/resource/calculator/ohos_test.xml" } ``` - >**NOTE**
- >- **target_name** indicates the test suite name defined in the **BUILD.gn** file in the **test** directory.**preparer** indicates the action to perform before the test suite is executed. - >- **src="res"** indicates that the test resources are in the **resource** directory under the **test** directory. **src="out"** indicates that the test resources are in the **out/release/$(*part*)** directory. + >**NOTE** + >- **target_name** indicates the test suite name defined in the **BUILD.gn** file in the **test** directory. + >- **preparer** indicates the action to perform before the test suite is executed. + >- **src="res"** indicates that the test resources are in the **resource** directory under the **test** directory. + >- **src="out"** indicates that the test resources are in the **out/release/$(*part*)** directory. ## Executing Test Cases Before executing test cases, you need to modify the configuration based on the device used. @@ -695,7 +766,7 @@ Before executing test cases, you need to modify the configuration based on the d true - + @@ -729,8 +800,7 @@ Before executing test cases, you need to modify the configuration based on the d ``` -> **NOTE**
-> If HDC is connected to the device before the test cases are executed, you only need to configure the device IP address and port number, and retain the default settings for other parameters. +>**NOTE**
If HDC is connected to the device before the test cases are executed, you only need to configure the device IP address and port number, and retain the default settings for other parameters. ### Executing Test Cases on Windows #### Building Test Cases @@ -739,20 +809,20 @@ Test cases cannot be built on Windows. You need to run the following command to ``` ./build.sh --product-name hispark_taurus_standard --build-target make_test ``` -> **NOTE**
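+To build only one suite rather than all test cases, the suite target defined in its **BUILD.gn** file can be passed in place of **make_test**. A hypothetical sketch using the sample calculator suite from this guide (the exact target name depends on your build files):
+
+```
+./build.sh --product-name hispark_taurus_standard --build-target CalculatorSubTest
+```
+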
+>**NOTE** +> >- **product-name**: specifies the name of the product to build, for example, **hispark_taurus_standard**. >- **build-target**: specifies the test case to build. **make_test** indicates all test cases. You can specify the test cases based on requirements. -After the build is complete, the test cases are automatically saved in **out/hispark_taurus/packages/phone/tests**. +When the build is complete, the test cases are automatically saved in **out/hispark_taurus/packages/phone/tests**. #### Setting Up the Execution Environment 1. On Windows, create the **Test** directory in the test framework and then create the **testcase** directory in the **Test** directory. 2. Copy **developertest** and **xdevice** from the Linux environment to the **Test** directory on Windows, and copy the test cases to the **testcase** directory. - > **NOTE**
- > Port the test framework and test cases from the Linux environment to the Windows environment for subsequent execution. - + >**NOTE**
Port the test framework and test cases from the Linux environment to the Windows environment for subsequent execution. + 3. Modify the **user_config.xml** file. ``` @@ -760,14 +830,12 @@ After the build is complete, the test cases are automatically saved in **out/his false - + D:\Test\testcase\tests ``` - - > **NOTE**
- > `` indicates whether to build test cases. `` indicates the path for searching for test cases. - + >**NOTE**
**** indicates whether to build test cases. **** indicates the path for searching for test cases. + #### Executing Test Cases 1. Start the test framework. ``` @@ -775,7 +843,7 @@ After the build is complete, the test cases are automatically saved in **out/his ``` 2. Select the product. - After the test framework starts, you are asked to select a product. Select the development board to test, for example, **hispark_taurus_standard**. + After the test framework starts, you are asked to select a product. Select the development board to test, for example, **Hi3516DV300**. 3. Execute test cases. @@ -790,7 +858,7 @@ After the build is complete, the test cases are automatically saved in **out/his -tm [TESTMODULE]: specifies the module to test. This parameter must be specified together with -tp. -ts [TESTSUITE]: specifies the test suite. This parameter can be used independently. -tc [TESTCASE]: specifies the test case. This parameter must be specified together with -ts. - -You can run h to display help information. + You can run h to display help information. ``` ### Executing Test Cases on Linux #### Mapping the Remote Port @@ -800,18 +868,14 @@ To enable test cases to be executed on a remote Linux server or a Linux VM, map hdc_std kill hdc_std -m -s 0.0.0.0:8710 ``` - - > **NOTE**
- > The IP address and port number are default values. - + >**NOTE**
The IP address and port number are default values. + 2. On the HDC client, run the following command: ``` hdc_std -s xx.xx.xx.xx:8710 list targets ``` - - > **NOTE**
- > Enter the IP address of the device to test. - + >**NOTE**
Enter the IP address of the device to test. + #### Executing Test Cases 1. Start the test framework. ``` @@ -819,7 +883,7 @@ To enable test cases to be executed on a remote Linux server or a Linux VM, map ``` 2. Select the product. - After the test framework starts, you are asked to select a product. Select the development board to test, for example, **hispark_taurus_standard**. + After the test framework starts, you are asked to select a product. Select the development board to test, for example, **Hi3516DV300**. 3. Execute test cases. @@ -834,7 +898,7 @@ To enable test cases to be executed on a remote Linux server or a Linux VM, map -tm [TESTMODULE]: specifies the module to test. This parameter must be specified together with -tp. -ts [TESTSUITE]: specifies the test suite. This parameter can be used independently. -tc [TESTCASE]: specifies the test case. This parameter must be specified together with -ts. - -You can run h to display help information. + You can run h to display help information. ``` ## Viewing the Test Report @@ -845,16 +909,15 @@ You can obtain the test result in the following directory: ``` test/developertest/reports/xxxx_xx_xx_xx_xx_xx ``` -> **NOTE**
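+For example, a test task started at 09:30:05 on June 1, 2023 would produce a folder similar to the following (assuming the placeholder above encodes the start time as year, month, day, hour, minute, and second):
+
+```
+test/developertest/reports/2023_06_01_09_30_05
+```
+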
-> The folder for test reports is automatically generated. +>**NOTE**
 The folder for test reports is automatically generated.

 The folder contains the following files:
-| Type | Description |
+| Type| Description|
 | ------------ | ------------ |
-| result/ | Test cases in standard format. |
-| log/plan_log_xxxx_xx_xx_xx_xx_xx.log | Test case logs. |
-| summary_report.html | Test report summary. |
-| details_report.html | Detailed test report. |
+| result/ |Test case results in standard format.|
+| log/plan_log_xxxx_xx_xx_xx_xx_xx.log | Test case logs.|
+| summary_report.html | Test report summary.|
+| details_report.html | Detailed test report.|

 ### Test Framework Logs
 ```