# X Test Suite

- [Introduction](#section465982318513)
- [Devices](#section125090457443)
- [Directory Structure](#section161941989596)
- [Constraints](#section119744591305)
- [Usage Guidelines](#section137768191623)
- [Test Case Development Guidelines](#section3695134065513)
  - [C-based Test Case Development and Compilation (for Mini-System Devices)](#section1551164914237)
  - [C-based Test Case Execution (for Mini-System Devices)](#section10100701294)
  - [C++-based Test Case Development and Compilation (for Small-, Standard-, and Large-System Devices)](#section5714177113113)
  - [C++-based Test Case Execution (for Small-, Standard-, and Large-System Devices)](#section42281924184)
- [Repositories Involved](#section1371113476307)

## Introduction

The X test suite (XTS) subsystem contains a set of OpenHarmony certification test suites, including the currently supported application compatibility test suite (ACTS) and the device compatibility test suite (DCTS) that will be supported in the future.

This subsystem contains the **acts** directory and the **tools** software package.

- The **acts** directory stores the source code and configuration files of ACTS test cases. The ACTS helps device vendors detect software incompatibilities as early as possible and ensures that the software remains compatible with OpenHarmony throughout the development process.
- The **tools** software package stores the test case development framework related to **acts**.

## Devices

OpenHarmony supports the following device types:

- **Mini-System Devices (reference memory ≥ 128 KB)**

  Such devices are equipped with MCU processors such as ARM Cortex-M and 32-bit RISC-V. They provide rich short-distance connection and peripheral bus access capabilities. Typical products include LinkIoT module devices and sensors in the smart home field. The LinkIoT module is usually used in smart Internet of Things (IoT) devices as the hardware module that implements connectivity functions.
  In the smart home field, the LinkIoT module is integrated into devices by vendors. For example, a LinkIoT module provides WLAN/Bluetooth access and data connection, and it usually communicates with the chip of smart home devices via a universal asynchronous receiver-transmitter (UART) or general-purpose input/output (GPIO) interface.

- **Small-System Devices (reference memory ≥ 1 MB)**

  Such devices are equipped with application processors such as ARM Cortex-A. They provide higher security capabilities, a standard graphics framework, and multimedia capabilities for video encoding and decoding. Typical products include IP cameras, electronic cat eyes, and routers in the smart home field, as well as event data recorders (EDRs) in the smart travel field.

- **Standard-System Devices (reference memory ≥ 128 MB)**

  Such devices are equipped with application processors such as ARM Cortex-A. They provide a complete application framework supporting enhanced interaction, 3D GPU, hardware composer, diverse components, and rich animations. Typical products include high-end refrigerator displays.

- **Large-System Devices (reference memory ≥ 1 GB)**

  Such devices are equipped with application processors such as ARM Cortex-A and provide a complete compatible application framework. Typical products include smart TVs and smart watches.

## Directory Structure

```
/test/xts
├── acts                # Test code
│   └── subsystem       # Source code of subsystem test cases for large-system devices
│   └── subsystem_lite  # Source code of subsystem test cases for mini- and small-system devices
│   └── BUILD.gn        # Build configuration of test cases for large-system devices
│   └── build_lite      # Build configuration of test cases for mini- and small-system devices
│       └── BUILD.gn    # Build configuration
└── tools               # Test tool code
```

## Constraints

Test cases for mini-system devices must be developed in C, and those for small-system devices must be developed in C++.
## Usage Guidelines

**Table 1** Test case levels

| Level | Definition | Scope |
| ------ | ---------- | ----- |
| Level0 | Smoke | Verifies basic functionalities of key features and basic DFX attributes with the most common input. The pass result indicates that the features are runnable. |
| Level1 | Basic | Verifies basic functionalities of key features and basic DFX attributes with common input. The pass result indicates that the features are testable. |
| Level2 | Major | Verifies basic functionalities of key features and basic DFX attributes with common input and errors. The pass result indicates that the features are functional and ready for beta testing. |
| Level3 | Regular | Verifies functionalities of all key features, and all DFX attributes with common and uncommon input combinations or normal and abnormal preset conditions. |
| Level4 | Rare | Verifies functionalities of key features under extremely abnormal presets and uncommon input combinations. |

**Table 2** Test case granularities

| Test Scale | Test Objects | Test Environment |
| ---------- | ------------ | ---------------- |
| LargeTest | Service functionalities, all-scenario features, and mechanical power environment (MPE) and scenario-level DFX | Devices close to real devices |
| MediumTest | Modules, subsystem functionalities after module integration, and DFX | Single device that is actually used. You can perform message simulation, but do not mock functions. |
| SmallTest | Modules, classes, and functions | Local PC. Use a large number of mocks to replace dependencies on other modules. |

**Table 3** Test types

| Type | Definition |
| ---- | ---------- |
| Function | Tests the correctness of both service and platform functionalities provided by the tested object for end users or developers. |
| Performance | Tests the processing capability of the tested object under specific preset conditions and load models. The processing capability is measured by the service volume that can be processed in a unit time, for example, calls per second, frames per second, or events processed per second. |
| Power | Tests the power consumption of the tested object in a certain period of time under specific preset conditions and load models. |
| Reliability | Tests the service performance of the tested object under common and uncommon input conditions, or specified service volume pressure and long-term continuous running pressure. The test covers stability, pressure handling, fault injection, and Monkey tests. |
| Security | • Tests the capability of defending against security threats, including but not limited to unauthorized access, use, disclosure, damage, modification, and destruction, to ensure information confidentiality, integrity, and availability.<br>• Tests the privacy protection capability to ensure that the collection, use, retention, disclosure, and disposal of users' private data comply with laws and regulations.<br>• Tests compliance with various security specifications, such as security design, security requirements, and security certification of the Ministry of Industry and Information Technology (MIIT). |
| Global | Tests the internationalized data and localization capabilities of the tested object, including multi-language display, various input/output habits, time formats, and regional features, such as currency, time, and cultural taboos. |
| Compatibility | • Tests backward compatibility of an application with its own data, the forward and backward compatibility with the system, and the compatibility with different user data, such as audio file content of the player and smart SMS messages.<br>• Tests system backward compatibility with its own data and the compatibility of common applications in the ecosystem.<br>• Tests software compatibility with related hardware. |
| User | Tests user experience of the object in real user scenarios. All conclusions and comments should come from the users, and are subjective evaluations in this case. |
| Standard | Tests compliance with industry and company-specific standards, protocols, and specifications. The standards here do not include any security standards, which belong to the security test. |
| Safety | Tests the safety property of the tested object to avoid possible hazards to personal safety, health, and the object itself. |
| Resilience | Tests the resilience property of the tested object to ensure that it can withstand attacks and maintain the defined running status (including downgrading), and can recover from attacks and adapt its defenses to approach mission assurance. |

## Test Case Development Guidelines

Select the test framework and programming language appropriate to the type of device to be tested.

**Table 4** Test frameworks and test case languages for different devices

| Device Type | Test Framework | Language |
| ----------- | -------------- | -------- |
| Mini-system devices | HCTest | C |
| Small-system devices | HCPPTest | C++ |
| Standard-system devices | HJUnit and HCPPTest | Java and C++ |
| Large-system devices | HJUnit and HCPPTest | Java and C++ |

### C-based Test Case Development and Compilation (for Mini-System Devices)

**Developing test cases for mini-system devices**

The HCTest framework is used to support test cases developed in C. HCTest is enhanced and adapted based on the open-source test framework Unity.

1. Store the test cases in the **test/xts/acts** repository.

   ```
   ├── acts
   │   └── subsystem_lite
   │   │   └── module_hal
   │   │   │   └── BUILD.gn
   │   │   │   └── src
   │   └── build_lite
   │   │   └── BUILD.gn
   ```

2. Write the test case in the **src** directory.

   1. Import the test framework header file.

      ```
      #include "hctest.h"
      ```

   2. Use the **LITE_TEST_SUIT** macro to define names of the subsystem, module, and test suite.

      ```
      /**
       * @brief Registers a test suite named IntTestSuite.
       * @param test Subsystem name
       * @param example Module name
       * @param IntTestSuite Test suite name
       */
      LITE_TEST_SUIT(test, example, IntTestSuite);
      ```

   3. Define Setup and TearDown.

      Format: test suite name + Setup, test suite name + TearDown. The Setup and TearDown functions must exist, but their function bodies can be empty.

   4. Use the **LITE_TEST_CASE** macro to write the test case.

      Three parameters are involved: test suite name, test case name, and test case properties (type, granularity, and level).

      ```
      LITE_TEST_CASE(IntTestSuite, TestCase001, Function | MediumTest | Level1)
      {
          // Do something
      };
      ```

   5. Use the **RUN_TEST_SUITE** macro to register the test suite.

      ```
      RUN_TEST_SUITE(IntTestSuite);
      ```

3. Create the configuration file (**BUILD.gn**) of the test module.

   Create a **BUILD.gn** compilation file in each test module directory. Specify the name of the compiled static library and its dependent header files and libraries in the compilation file. The format is as follows:

   ```
   import("//test/xts/tools/lite/build/suite_lite.gni")
   hctest_suite("ActsDemoTest") {
       suite_name = "acts"
       sources = [
           "src/test_demo.c",
       ]
       include_dirs = [ ]
       cflags = [ "-Wno-error" ]
   }
   ```

4. Add compilation options to the **BUILD.gn** file in the **acts** directory.

   Add the test module to the **test/xts/acts/build_lite/BUILD.gn** script in the **acts** directory.

   ```
   lite_component("acts") {
       ...
       if (board_name == "liteos_m") {
           features += [
               ...
               "//xts/acts/subsystem_lite/module_hal:ActsDemoTest"
           ]
       }
   }
   ```

5. Run compilation commands.

   Test suites are compiled along with version compilation. The ACTS is compiled together with the debug version.

   >![](public_sys-resources/icon-note.gif) **NOTE:**
   >The ACTS compiles middleware as a static library, which will be linked to the image.

### C-based Test Case Execution (for Mini-System Devices)

**Executing test cases for mini-system devices**

Burn the image into the development board.

**Executing the test**

1. Use a serial port tool to log in to the development board and save information about the serial port.
2. Restart the device and view the serial port logs.

**Analyzing the test result**

View the serial port logs. The log for each test suite starts with **Start to run test suite:** and ends with **xx Tests xx Failures xx Ignored**.

### C++-based Test Case Development and Compilation (for Small-, Standard-, and Large-System Devices)

**Developing test cases for small-system devices**

The HCPPTest framework is enhanced and adapted based on the open-source framework Googletest.

1. Store the test cases in the **test/xts/acts** repository.

   ```
   ├── acts
   │   └── subsystem_lite
   │   │   └── module_posix
   │   │   │   └── BUILD.gn
   │   │   │   └── src
   │   └── build_lite
   │   │   └── BUILD.gn
   ```

2. Write the test case in the **src** directory.

   1. Import the test framework header file. The following statement includes **gtest.h**.

      ```
      #include "gtest/gtest.h"
      ```

   2. Define Setup and TearDown.
      ```
      using namespace std;
      using namespace testing::ext;

      class TestSuite : public testing::Test {
      protected:
          // Preset action of the test suite, which is executed before the first test case
          static void SetUpTestCase(void) {}
          // Cleanup action of the test suite, which is executed after the last test case
          static void TearDownTestCase(void) {}
          // Preset action of the test case
          virtual void SetUp() {}
          // Cleanup action of the test case
          virtual void TearDown() {}
      };
      ```

   3. Use the **HWTEST** or **HWTEST_F** macro to write the test case.

      **HWTEST**: defines a common test case, including the test suite name, test case name, and case annotation.

      **HWTEST_F**: defines a test case that uses SetUp and TearDown, including the test suite name, test case name, and case annotation.

      Three parameters are involved: test suite name, test case name, and test case properties (type, granularity, and level).

      ```
      HWTEST_F(TestSuite, TestCase_0001, Function | MediumTest | Level1)
      {
          // Do something
      }
      ```

3. Create a configuration file (**BUILD.gn**) of the test module.

   Create a **BUILD.gn** compilation file in each test module directory. Specify the name of the compiled static library and its dependent header files and libraries in the compilation file. Each test module is independently compiled into a **.bin** executable file, which can be directly mounted to the development board for testing. Example:

   ```
   import("//test/xts/tools/lite/build/suite_lite.gni")
   hcpptest_suite("ActsDemoTest") {
       suite_name = "acts"
       sources = [
           "src/TestDemo.cpp"
       ]
       include_dirs = [
           "src",
           ...
       ]
       deps = [ ... ]
       cflags = [ "-Wno-error" ]
   }
   ```

4. Add compilation options to the **BUILD.gn** file in the **acts** directory.

   Add the test module to the **test/xts/acts/build_lite/BUILD.gn** script in the **acts** directory.

   ```
   lite_component("acts") {
       ...
       else if (board_name == "liteos_a") {
           features += [
               ...
               "//xts/acts/subsystem_lite/module_posix:ActsDemoTest"
           ]
       }
   }
   ```

5. Run compilation commands.

   Test suites are compiled along with version compilation. The ACTS is compiled together with the debug version.

   >![](public_sys-resources/icon-note.gif) **NOTE:**
   >The ACTS for a small-system device is independently compiled to an executable file (.bin) and archived in the **suites\acts** directory of the compilation result.

### C++-based Test Case Execution (for Small-, Standard-, and Large-System Devices)

**Executing test cases for small-system devices**

Currently, test cases are shared by the NFS and mounted to the development board for execution.

**Setting up the environment**

1. Use a network cable or wireless network to connect the development board to your PC.
2. Configure the IP address, subnet mask, and gateway for the development board. Ensure that the development board and the PC are on the same network segment.
3. Install and register the NFS server on the PC and start the NFS service.
4. Run the **mount** command on the development board to ensure that the development board can access NFS shared files on the PC.

   Format: **mount** _NFS server IP address_**:/**_NFS shared directory_ **/**_development board directory_ **nfs**

   Example:

   ```
   mount 192.168.1.10:/nfs /nfs nfs
   ```

**Executing test cases**

Execute **ActsDemoTest.bin** to trigger test case execution, and analyze the serial port logs generated after the execution is complete.

## Repositories Involved

[X Test Suite](https://gitee.com/openharmony/docs/blob/master/en/readme/x-test-suite.md)

**xts_acts**

[xts_tools](https://gitee.com/openharmony/xts_tools/blob/master/README.md)