Unverified commit 66db3f8e authored by Yaqiang Li, committed by GitHub

Merge pull request #22416 from taosdata/dclow/docs-cloud

docs: add documents for pi data source
---
title: Install Connection Agent
sidebar_label: Connection Agent
description: This document describes how to install the connection agent to ingest data into TDengine.
---
import Tabs from "@theme/Tabs";
import TabItem from "@theme/TabItem";
You can install the connection agent to ingest PI System data into TDengine.
## Prerequisites
- Ensure that your local machine is located on the same network as your PI Data Archive and PI AF Server (optional).
- Ensure that your local machine is running Linux or Windows.
## Procedure
<Tabs>
<TabItem label="Windows" value="windowsagent">
1. In TDengine Cloud, open **Data In**.
2. On the **Data Sources** tab, click **Create New Agent** in the **Connection Agents** section.
3. Click **Windows** to download the connection agent.
4. On your local machine, run the connection agent installer and follow the prompts.
5. In TDengine Cloud, click **Next**.
6. Enter a unique name for your agent and click **Next** to generate an authentication token.
7. On your local machine, open the `C:\Program Files\taosX\config\agent.toml` file.
8. Copy the values of `endpoint` and `token` displayed in TDengine Cloud into the `agent.toml` file, as shown in the sketch after this procedure.
9. In TDengine Cloud, click **Next** and click **Finish**.
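
The following is a minimal sketch of what the `agent.toml` file might look like after step 8. The placeholder values are illustrative only; use the actual `endpoint` and `token` displayed in TDengine Cloud, and leave any other settings in the file unchanged.

```toml
# C:\Program Files\taosX\config\agent.toml (placeholder values for illustration)
endpoint = "<endpoint displayed in TDengine Cloud>"
token = "<token displayed in TDengine Cloud>"
```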
</TabItem>
<TabItem label="Linux" value="linuxagent">
1. In TDengine Cloud, open **Data In**.
2. On the **Data Sources** tab, click **Create New Agent** in the **Connection Agents** section.
3. Click **Linux** to download the connection agent.
4. On your local machine, decompress the installation package and run the `install.sh` file.
5. In TDengine Cloud, click **Next**.
6. Enter a unique name for your agent and click **Next** to generate an authentication token.
7. On your local machine, open the `/etc/taos/agent.toml` file.
8. Copy the values of `endpoint` and `token` displayed in TDengine Cloud into the `agent.toml` file, as shown in the sketch after this procedure.
9. In TDengine Cloud, click **Next** and click **Finish**.
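
The following is a minimal sketch of what the `agent.toml` file might look like after step 8. The placeholder values are illustrative only; use the actual `endpoint` and `token` displayed in TDengine Cloud, and leave any other settings in the file unchanged.

```toml
# /etc/taos/agent.toml (placeholder values for illustration)
endpoint = "<endpoint displayed in TDengine Cloud>"
token = "<token displayed in TDengine Cloud>"
```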
</TabItem>
</Tabs>
## What to Do Next
<Tabs>
<TabItem label="Windows" value="windowsnext">
1. Run the `sc start taosx-agent` command to start the connection agent as a service on your local machine, as shown in the sketch after this list.
2. [Ingest Data from PI System](../pi-system/).
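
The following is a minimal sketch of starting the service and confirming that it is running. The `sc query` status check is a standard Windows command and is not part of this procedure; it is shown only for convenience.

```shell
rem Start the connection agent service, then confirm that it is running.
sc start taosx-agent
sc query taosx-agent
```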
</TabItem>
<TabItem label="Linux" value="linuxnext">
1. Run the `systemctl start taosx-agent` command to start the connection agent as a service on your local machine, as shown in the sketch after this list.
2. [Ingest Data from PI System](../pi-system/).
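
The following is a minimal sketch of starting the service and confirming that it is running. The `systemctl status` check is a standard systemd command and is not part of this procedure; it is shown only for convenience.

```shell
# Start the connection agent service, then confirm that it is running.
systemctl start taosx-agent
systemctl status taosx-agent
```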
</TabItem>
</Tabs>
---
title: Ingest Data from PI System
sidebar_label: PI System
description: This document describes how to establish a connection with your PI System deployment and ingest data from PI System into TDengine.
---
You can ingest data from PI System into TDengine Cloud with the **Data Sources** feature. This integration uses the AF SDK to stream buffered data and query historical data from the PI Data Archive, set up PI and AF data pipes for streaming data, and connect to PI AF to query the AF structure. It also creates the corresponding tables and writes this data to TDengine Cloud over a secure RESTful API.
For more information about this solution, see [TDengine for PI System](https://tdengine.com/pi-system/).
:::note IMPORTANT
There is an additional charge for ingesting PI System data. The charge depends on your TDengine pricing plan. For more information, [contact our team](https://tdengine.com/contact/) or your account representative.
:::
## Prerequisites
- Create an empty database to store your PI System data, as shown in the SQL sketch after this list. For more information, see [Database](../../programming/model/#create-database).
- Ensure that the connection agent is running on a machine located on the same network as your PI Data Archive and PI AF Server (optional). For more information, see [Install Connection Agent](../install-agent/).
- Obtain the name of your PI Data Archive server.
- (Optional) If you want to use PI AF, obtain the name of your AF database.
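
The following is a minimal SQL sketch for creating an empty database. The database name `pi_data` is a hypothetical example, and you may want to specify retention and other options to suit your workload.

```sql
-- Example only: replace pi_data with the name you want to use for your database.
CREATE DATABASE IF NOT EXISTS pi_data;
```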
## Procedure
1. In TDengine Cloud, open **Data In**.
2. On the **Data Sources** tab, click **Add Data Source**.
3. In the **Name** field, enter a name for your data source.
4. From the **Type** menu, select **PI**.
5. From the **Agent** menu, select the connection agent for your PI System.
If you have not created a connection agent, see [Install Connection Agent](../install-agent/).
6. In the **Target DB** field, enter the empty database that you created to store PI System data.
If you have not created a database, see [Database](../../programming/model/#create-database).
7. In the **Server** field, enter the name of your PI Data Archive server.
8. (Optional) In the **AFDatabaseName** field, enter the name of your AF database.
9. In the **PI System Name** field, enter the name of your AF server.
10. In the **Data Queue** section, configure the parameters as desired. You can also retain the default values.
- **Max Wait Length** indicates the maximum number of rows of data to send in each HTTPS insert request.
- **Update Interval** indicates how often data is pulled from PI System.
- **Max Backfill Range (in days)** indicates the maximum number of days of data that is automatically backfilled when the connection is reestablished.
11. In the **Data Sets** section, select the ingestion method:
- Select **Point List** to ingest PI Points.
- Select **Template for PI Point** to ingest PI Points based on AF element templates.
- Select **Template for AF Elements** to ingest elements from AF.
12. Enter a regular expression in the search field to find the desired PI Points or AF templates.
13. Select each desired PI Point or AF template and click **Add** under the selected item.
14. After adding all desired items, click **Add** under the **Data Sets** section.
15. Review the pricing notice and click **Confirm**.
The selected PI Points or AF templates begin streaming data to TDengine, and tables are created automatically in TDengine to match the point names or the AF template structure.
## What to Do Next
1. In TDengine Cloud, open **Explorer** and click the database that you created to store PI System data.
2. Verify that the specified data is being ingested into TDengine Cloud. Sample queries are shown in the sketch after this list.
- A supertable is created that acts as a schema for your PI System data.
- Subtables are created that contain your PI System data and tags.
- When **Template for AF Elements** mode is used, the AF tree is copied to a single metadata tag called `location`, and any static attributes in the AF elements are copied to corresponding tags in the supertable.
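
The following is a minimal sketch of queries that you might run to verify the ingested data. The database name `pi_data` and the subtable name `my_pi_point` are hypothetical; substitute the names that apply to your deployment.

```sql
-- Switch to the database that stores your PI System data (example name).
USE pi_data;

-- List the supertables created from your PI Points or AF templates.
SHOW STABLES;

-- Inspect a few ingested rows. Replace my_pi_point with an actual subtable name.
SELECT * FROM my_pi_point LIMIT 10;
```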