Commit 74d5e17c authored by sangshuduo

Merge branch 'docs-cloud' into docs/sangshuduo/cloud-doc-rlang

---
title: Install Connection Agent
sidebar_label: Connection Agent
description: This document describes how to install the connection agent to ingest data into TDengine.
---
import Tabs from "@theme/Tabs";
import TabItem from "@theme/TabItem";
You can install the connection agent to ingest PI System data into TDengine.
## Prerequisites
- Ensure that your local machine is located on the same network as your PI Data Archive and, optionally, your PI AF Server.
- Ensure that your local machine is running Linux or Windows.
## Procedure
<Tabs>
<TabItem label="Windows" value="windowsagent">
1. In TDengine Cloud, open **Data In**.
2. On the **Data Sources** tab, click **Create New Agent** in the **Connection Agents** section.
3. Click **Windows** to download the connection agent.
4. On your local machine, run the connection agent installer and follow the prompts.
5. In TDengine Cloud, click **Next**.
6. Enter a unique name for your agent and click **Next** to generate an authentication token.
7. On your local machine, open the `C:\Program Files\taosX\config\agent.toml` file.
8. Copy the values of `endpoint` and `token` displayed in TDengine Cloud into the `agent.toml` file, as shown in the sample after this procedure.
9. In TDengine Cloud, click **Next** and click **Finish**.
</TabItem>
<TabItem label="Linux" value="linuxagent">
1. In TDengine Cloud, open **Data In**.
2. On the **Data Sources** tab, click **Create New Agent** in the **Connection Agents** section.
3. Click **Linux** to download the connection agent.
4. On your local machine, decompress the installation package and run the `install.sh` file.
5. In TDengine Cloud, click **Next**.
6. Enter a unique name for your agent and click **Next** to generate an authentication token.
7. On your local machine, open the `/etc/taos/agent.toml` file.
8. Copy the values of `endpoint` and `token` displayed in TDengine Cloud into the `agent.toml` file, as shown in the sample after this procedure.
9. In TDengine Cloud, click **Next** and click **Finish**.
</TabItem>
</Tabs>
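Regardless of platform, the `agent.toml` file after step 8 contains the endpoint and token for your TDengine Cloud instance. The sketch below only illustrates that step: the `endpoint` and `token` keys are the ones named above, the placeholder values stand in for the values displayed in TDengine Cloud, and the file may contain additional settings that should be left unchanged.

```toml
# Values copied from TDengine Cloud when creating the agent (placeholders shown here)
endpoint = "<endpoint displayed in TDengine Cloud>"
token = "<token displayed in TDengine Cloud>"
```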
## What to Do Next
<Tabs>
<TabItem label="Windows" value="windowsnext">
1. Run the `sc start taosx-agent` command to start the connection agent as a service on your local machine.
2. [Ingest Data from PI System](../pi-system/).
</TabItem>
<TabItem label="Linux" value="linuxnext">
1. Run the `systemctl start taosx-agent` command to start the connection agent as a service on your local machine.
2. [Ingest Data from PI System](../pi-system/).
</TabItem>
</Tabs>
---
title: Ingest Data from PI System
sidebar_label: PI System
description: This document describes how to establish a connection with your PI System deployment and ingest data from PI System into TDengine.
---
You can ingest data from PI System into TDengine Cloud with the **Data Sources** feature. This integration uses the AF SDK to stream buffered data and query historical data from the PI Data Archive, set up PI and AF data pipes for streaming data, and connect to PI AF to query the AF structure. It also creates the corresponding tables and writes this data to TDengine Cloud over a secure RESTful API.
For more information about this solution, see [TDengine for PI System](https://tdengine.com/pi-system/).
:::note IMPORTANT
There is an additional charge for ingesting PI System data. The charge depends on your TDengine pricing plan. For more information, [contact our team](https://tdengine.com/contact/) or your account representative.
:::
## Prerequisites
- Create an empty database to store your PI System data. For more information, see [Database](../../programming/model/#create-database).
- Ensure that the connection agent is running on a machine located on the same network as your PI Data Archive and, optionally, your PI AF Server. For more information, see [Install Connection Agent](../install-agent/).
- Obtain the name of your PI Data Archive server.
- (Optional) If you want to use PI AF, obtain the name of your AF database.
## Procedure
1. In TDengine Cloud, open **Data In**.
2. On the **Data Sources** tab, click **Add Data Source**.
3. In the **Name** field, enter a name for your data source.
4. From the **Type** menu, select **PI**.
5. From the **Agent** menu, select the connection agent for your PI System.
If you have not created a connection agent, see [Install Connection Agent](../install-agent/).
6. In the **Target DB** field, enter the empty database that you created to store PI System data.
If you have not created a database, see [Database](../../programming/model/#create-database).
7. In the **Server** field, enter the name of your PI Data Archive server.
8. (Optional) In the **AFDatabaseName** field, enter the name of your AF database.
9. In the **PI System Name** field, enter the name of your AF server.
10. In the **Data Queue** section, configure the parameters as desired. You can also retain the default values.
    - **Max Wait Length** indicates the maximum number of rows to send in each HTTPS insert request.
    - **Update Interval** indicates how often data is pulled from PI System.
    - **Max Backfill Range (in days)** indicates the maximum number of days of data that are automatically backfilled when the connection is reestablished.
11. In the **Data Sets** section, select the ingestion method:
- Select **Point List** to ingest PI Points.
- Select **Template for PI Point** to ingest PI Points based on AF element templates.
- Select **Template for AF Elements** to ingest elements from AF.
12. Enter a regular expression in the search field to find the desired PI Points or AF templates.
13. Select each desired PI Point or AF template and click **Add** under the selected item.
14. After adding all desired items, click **Add** under the **Data Sets** section.
15. Review the pricing notice and click **Confirm**.
The selected PI Points or AF templates start streaming data to TDengine, with tables in TDengine automatically created to match point names or AF template structure.
## What to Do Next
1. In TDengine Cloud, open **Explorer** and click the database that you created to store PI System data.
2. Verify that the specified data is being ingested into TDengine Cloud.
    - A supertable is created that acts as a schema for your PI System data.
    - Subtables are created that contain your PI System data and tags.
    - When the **Template for AF Elements** method is used, the AF tree is copied to a single metadata tag named `location`, and any static attributes in the AF elements are copied to corresponding tags in the supertable.
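You can also perform this check from code. The following is a minimal sketch that uses the Python WebSocket connector (`taosws`) and the `TDENGINE_CLOUD_URL` and `TDENGINE_CLOUD_TOKEN` environment variables described later in this document; the database name `pi_data` is a placeholder for the database you entered in the **Target DB** field.

```python
import os

import taosws  # TDengine Python WebSocket connector

url = os.environ["TDENGINE_CLOUD_URL"]
token = os.environ["TDENGINE_CLOUD_TOKEN"]
conn = taosws.connect("%s?token=%s" % (url, token))

# List the supertables that the PI System data source created in the target database.
# Replace "pi_data" with the name of your own database.
result = conn.query("SHOW pi_data.STABLES")
for row in result:
    print(row)
```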
To obtain the value of the cloud token and URL, please log in to TDengine Cloud.
Copy the code below into your editor, then run it. If you are using Jupyter and have followed the Jupyter setup guide, you can copy the code into the Jupyter editor in your browser.
<Tabs defaultValue="rest">
<TabItem value="rest" label="REST">
```python
{{#include docs/examples/python/develop_tutorial.py:connect}}
```
</TabItem>
<TabItem value="websocket" label="WebSocket">
```python
{{#include docs/examples/python/develop_tutorial_ws.py:connect}}
```
</TabItem>
</Tabs>
For information about writing and querying data, see [Data In](https://docs.tdengine.com/cloud/data-in/) and [Data Out](https://docs.tdengine.com/cloud/data-out/).
For more details about writing or querying data via the REST API, see [REST API](https://docs.tdengine.com/cloud/programming/connector/rest-api/).
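If you prefer plain HTTP over a connector, a query can also be sent directly to the REST endpoint. The sketch below is an illustration rather than part of the tutorial: it assumes the third-party `requests` package, the same `TDENGINE_CLOUD_URL` and `TDENGINE_CLOUD_TOKEN` environment variables used in the tutorial code that follows, and the `/rest/sql?token=...` pattern described in the REST API reference linked above.

```python
import os

import requests  # third-party HTTP client, assumed to be installed

url = os.environ["TDENGINE_CLOUD_URL"]
token = os.environ["TDENGINE_CLOUD_TOKEN"]

# POST a SQL statement to the cloud REST endpoint; the response body is JSON.
# power.meters is the supertable created in the tutorial code below.
response = requests.post(
    "%s/rest/sql?token=%s" % (url, token),
    data="SELECT ts, current FROM power.meters LIMIT 2",
)
print(response.json())
```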
from dotenv import load_dotenv
# read .env file from current working directory
load_dotenv()
# ANCHOR: connect
import taosws
import os
url = os.environ["TDENGINE_CLOUD_URL"]
token = os.environ["TDENGINE_CLOUD_TOKEN"]
try:
    # Connect over WebSocket; the TDengine Cloud token is passed as a DSN query parameter
    conn = taosws.connect("%s?token=%s" % (url, token))
except Exception as e:
    print(str(e))
# ANCHOR_END: connect
# ANCHOR: insert
# create super table
conn.execute("CREATE STABLE IF NOT EXISTS power.meters (ts TIMESTAMP, current FLOAT, voltage INT, phase FLOAT) TAGS (location BINARY(64), groupId INT)")
# insert multiple rows into multiple tables at once. subtables will be created automatically.
affected_row = conn.execute("""INSERT INTO power.d1001 USING power.meters TAGS('California.SanFrancisco', 2) VALUES ('2018-10-03 14:38:05.000', 10.30000, 219, 0.31000) ('2018-10-03 14:38:15.000', 12.60000, 218, 0.33000) ('2018-10-03 14:38:16.800', 12.30000, 221, 0.31000)
power.d1002 USING power.meters TAGS('California.SanFrancisco', 3) VALUES ('2018-10-03 14:38:16.650', 10.30000, 218, 0.25000)
""")
print("affected_row", affected_row) # 4
# ANCHOR_END: insert
# ANCHOR: query
result = conn.query("SELECT ts, current FROM power.meters LIMIT 2")
# ANCHOR_END: query
# ANCHOR: fields
print(result.fields)
# output: [{'name': 'ts', 'type': 9, 'bytes': 8}, {'name': 'current', 'type': 6, 'bytes': 4}]
# ANCHOR_END: fields
# ANCHOR: rows
print(result.rows)
# output: 2
# ANCHOR_END: rows
# ANCHOR: iter
for row in result:
    print(row)
# output:
# [datetime.datetime(2018, 10, 3, 14, 38, 5, tzinfo=datetime.timezone.utc), 10.3]
# [datetime.datetime(2018, 10, 3, 14, 38, 15, tzinfo=datetime.timezone.utc), 12.6]
$env:TDENGINE_CLOUD_URL='<url>'
Copy the code below into your editor, then run it. If you are using Jupyter and have set up the environment by following its guide, you can copy the code into the Jupyter editor in your browser.
<Tabs defaultValue="rest">
<TabItem value="rest" label="REST">
```python
{{#include docs/examples/python/develop_tutorial.py:connect}}
```
</TabItem>
<TabItem value="websocket" label="WebSocket">
```python
{{#include docs/examples/python/develop_tutorial_ws.py:connect}}
```
</TabItem>
</Tabs>
For information about writing and querying data, see <https://docs.taosdata.com/cloud/data-in/insert-data/> and <https://docs.taosdata.com/cloud/data-out/query-data/>.
For more details about writing data via the REST API, see [REST API](https://docs.taosdata.com/cloud/programming/connector/rest-api/).