TDengine (forked from taosdata/TDengine)

Commit 25bddc0c, authored on Oct 10, 2021 by sangshuduo:
merge with develop branch.
Parents: 48dae824, 11e28e3d
Showing 113 changed files with 1243 additions and 730 deletions.
Jenkinsfile  +6 -3
deps/cJson/inc/cJSON.h  +1 -1
deps/cJson/src/cJSON.c  +1 -1
src/client/inc/tscParseLine.h  +12 -1
src/client/src/TSDBJNIConnector.c  +3 -2
src/client/src/tscParseLineProtocol.c  +52 -7
src/client/src/tscParseOpenTSDB.c  +73 -39
src/connector/python/README.md  +2 -2
src/connector/python/examples/insert-lines.py  +2 -2
src/connector/python/taos/__init__.py  +2 -2
src/connector/python/taos/cinterface.py  +6 -23
src/connector/python/taos/connection.py  +25 -24
src/connector/python/taos/error.py  +2 -12
src/connector/python/tests/test_lines.py  +5 -5
src/inc/taos.h  +1 -5
src/inc/taoserror.h  +1 -0
src/kit/taosdump/taosdump.c  +32 -20
src/os/src/detail/osTime.c  +3 -1
src/util/src/terror.c  +1 -0
tests/examples/c/apitest.c  +8 -8
tests/examples/c/schemaless.c  +2 -2
tests/pytest/crash_gen/valgrind_taos.supp  +17 -0
tests/pytest/fulltest.sh  +3 -2
tests/pytest/insert/insertJSONPayload.py  +105 -128
tests/pytest/insert/insertTelnetLines.py  +26 -26
tests/pytest/insert/line_insert.py  +10 -10
tests/pytest/insert/openTsdbTelnetLinesInsert.py  +88 -88
tests/pytest/insert/schemalessInsert.py  +96 -96
tests/pytest/tools/insert-interlace.json  +1 -1
tests/pytest/tools/insert-tblimit-tboffset-createdb.json  +1 -1
tests/pytest/tools/insert-tblimit-tboffset-insertrec.json  +1 -1
tests/pytest/tools/insert-tblimit-tboffset.json  +1 -1
tests/pytest/tools/insert-tblimit-tboffset0.json  +1 -1
tests/pytest/tools/insert-tblimit1-tboffset.json  +1 -1
tests/pytest/tools/schemalessInsertPerformance.py  +10 -10
tests/pytest/tools/taosdemoAllTest/insert-1s1tnt1r.json  +1 -1
tests/pytest/tools/taosdemoAllTest/insert-1s1tntmr.json  +1 -1
tests/pytest/tools/taosdemoAllTest/insert-allDataType.json  +88 -0
tests/pytest/tools/taosdemoAllTest/insert-disorder.json  +1 -1
tests/pytest/tools/taosdemoAllTest/insert-drop-exist-auto-N00.json  +1 -1
tests/pytest/tools/taosdemoAllTest/insert-drop-exist-auto-Y00.json  +1 -1
tests/pytest/tools/taosdemoAllTest/insert-illegal.json  +1 -1
tests/pytest/tools/taosdemoAllTest/insert-interlace-row.json  +1 -1
tests/pytest/tools/taosdemoAllTest/insert-interval-speed.json  +1 -1
tests/pytest/tools/taosdemoAllTest/insert-newdb.json  +1 -1
tests/pytest/tools/taosdemoAllTest/insert-newtable.json  +1 -1
tests/pytest/tools/taosdemoAllTest/insert-nodbnodrop.json  +1 -1
tests/pytest/tools/taosdemoAllTest/insert-offset.json  +1 -1
tests/pytest/tools/taosdemoAllTest/insert-renewdb.json  +1 -1
tests/pytest/tools/taosdemoAllTest/insert-sample.json  +1 -1
tests/pytest/tools/taosdemoAllTest/insert-timestep.json  +1 -1
tests/pytest/tools/taosdemoAllTest/insertBinaryLenLarge16374AllcolLar49151.json  +1 -1
tests/pytest/tools/taosdemoAllTest/insertChildTab0.json  +1 -1
tests/pytest/tools/taosdemoAllTest/insertChildTabLess0.json  +1 -1
tests/pytest/tools/taosdemoAllTest/insertColumnsAndTagNum4096.json  +1 -1
tests/pytest/tools/taosdemoAllTest/insertColumnsAndTagNumLarge4096.json  +1 -1
tests/pytest/tools/taosdemoAllTest/insertColumnsNum0.json  +1 -1
tests/pytest/tools/taosdemoAllTest/insertInterlaceRowsLarge1M.json  +1 -1
tests/pytest/tools/taosdemoAllTest/insertMaxNumPerReq.json  +1 -1
tests/pytest/tools/taosdemoAllTest/insertNumOfrecordPerReq0.json  +1 -1
tests/pytest/tools/taosdemoAllTest/insertNumOfrecordPerReqless0.json  +1 -1
tests/pytest/tools/taosdemoAllTest/insertRestful.json  +1 -1
tests/pytest/tools/taosdemoAllTest/insertSigcolumnsNum4096.json  +1 -1
tests/pytest/tools/taosdemoAllTest/insertTagsNumLarge128.json  +1 -1
tests/pytest/tools/taosdemoAllTest/insertTimestepMulRowsLargeint16.json  +2 -1
tests/pytest/tools/taosdemoAllTest/insert_5M_rows.json  +1 -1
tests/pytest/tools/taosdemoAllTest/manual_block1_comp.json  +1 -1
tests/pytest/tools/taosdemoAllTest/manual_block2.json  +1 -1
tests/pytest/tools/taosdemoAllTest/moredemo-offset-limit1.json  +1 -1
tests/pytest/tools/taosdemoAllTest/moredemo-offset-limit5.json  +1 -1
tests/pytest/tools/taosdemoAllTest/moredemo-offset-limit94.json  +1 -1
tests/pytest/tools/taosdemoAllTest/moredemo-offset-newdb.json  +1 -1
tests/pytest/tools/taosdemoAllTest/query-interrupt.json  +1 -1
tests/pytest/tools/taosdemoAllTest/queryInsertdata.json  +1 -1
tests/pytest/tools/taosdemoAllTest/queryInsertrestdata.json  +1 -1
tests/pytest/tools/taosdemoAllTest/stmt/1174-large-stmt.json  +2 -1
tests/pytest/tools/taosdemoAllTest/stmt/1174-large-taosc.json  +2 -1
tests/pytest/tools/taosdemoAllTest/stmt/1174-small-stmt-random.json  +2 -1
tests/pytest/tools/taosdemoAllTest/stmt/1174-small-stmt.json  +2 -1
tests/pytest/tools/taosdemoAllTest/stmt/1174-small-taosc.json  +2 -1
tests/pytest/tools/taosdemoAllTest/stmt/insert-1s1tnt1r-stmt.json  +1 -1
tests/pytest/tools/taosdemoAllTest/stmt/insert-1s1tntmr-stmt.json  +1 -1
tests/pytest/tools/taosdemoAllTest/stmt/insert-allDataType-stmt.json  +88 -0
tests/pytest/tools/taosdemoAllTest/stmt/insert-disorder-stmt.json  +1 -1
tests/pytest/tools/taosdemoAllTest/stmt/insert-drop-exist-auto-N00-stmt.json  +1 -1
tests/pytest/tools/taosdemoAllTest/stmt/insert-drop-exist-auto-Y00-stmt.json  +1 -1
tests/pytest/tools/taosdemoAllTest/stmt/insert-interlace-row-stmt.json  +1 -1
tests/pytest/tools/taosdemoAllTest/stmt/insert-interval-speed-stmt.json  +1 -1
tests/pytest/tools/taosdemoAllTest/stmt/insert-newdb-stmt.json  +1 -1
tests/pytest/tools/taosdemoAllTest/stmt/insert-newtable-stmt.json  +1 -1
tests/pytest/tools/taosdemoAllTest/stmt/insert-nodbnodrop-stmt.json  +1 -1
tests/pytest/tools/taosdemoAllTest/stmt/insert-offset-stmt.json  +1 -1
tests/pytest/tools/taosdemoAllTest/stmt/insert-renewdb-stmt.json  +1 -1
tests/pytest/tools/taosdemoAllTest/stmt/insert-sample-stmt.json  +1 -1
tests/pytest/tools/taosdemoAllTest/stmt/insert-timestep-stmt.json  +1 -1
tests/pytest/tools/taosdemoAllTest/stmt/insertBinaryLenLarge16374AllcolLar49151-stmt.json  +1 -1
tests/pytest/tools/taosdemoAllTest/stmt/insertChildTab0-stmt.json  +1 -1
tests/pytest/tools/taosdemoAllTest/stmt/insertChildTabLess0-stmt.json  +1 -1
tests/pytest/tools/taosdemoAllTest/stmt/insertColumnsAndTagNum4096-stmt.json  +1 -1
tests/pytest/tools/taosdemoAllTest/stmt/insertColumnsNum0-stmt.json  +1 -1
tests/pytest/tools/taosdemoAllTest/stmt/insertInterlaceRowsLarge1M-stmt.json  +1 -1
tests/pytest/tools/taosdemoAllTest/stmt/insertMaxNumPerReq-stmt.json  +1 -1
tests/pytest/tools/taosdemoAllTest/stmt/insertNumOfrecordPerReq0-stmt.json  +1 -1
tests/pytest/tools/taosdemoAllTest/stmt/insertNumOfrecordPerReqless0-stmt.json  +1 -1
tests/pytest/tools/taosdemoAllTest/stmt/insertSigcolumnsNum4096-stmt.json  +1 -1
tests/pytest/tools/taosdemoAllTest/stmt/insertTagsNumLarge128-stmt.json  +1 -1
tests/pytest/tools/taosdemoAllTest/stmt/nsertColumnsAndTagNumLarge4096-stmt.json  +1 -1
tests/pytest/tools/taosdemoAllTest/subInsertdata.json  +1 -1
tests/pytest/tools/taosdemoAllTest/subInsertdataMaxsql100.json  +1 -1
tests/pytest/tools/taosdemoAllTest/taosdemoTestInsertAllType.py  +106 -0
tests/pytest/tools/taosdumpTest3.py  +200 -0
tests/script/api/openTSDBTest.c  +80 -130
tests/tsim/src/simExe.c  +2 -2
Jenkinsfile

@@ -114,6 +114,7 @@ def pre_test(){
   }
 }
 def pre_test_win(){
     bat '''
+    taskkill /f /t /im python.exe
     cd C:\\
     rd /s /Q C:\\TDengine
     cd C:\\workspace\\TDinternal
@@ -180,9 +181,9 @@ def pre_test_win(){
     cd debug
     call "C:\\Program Files (x86)\\Microsoft Visual Studio 14.0\\VC\\vcvarsall.bat" amd64
     cmake ../ -G "NMake Makefiles"
-    nmake
-    nmake install
-    xcopy /e/y/i/f C:\\workspace\\TDinternal\\debug\\build\\lib\\taos.dll C:\\Windows\\System32
+    nmake || exit 8
+    nmake install || exit 8
+    xcopy /e/y/i/f C:\\workspace\\TDinternal\\debug\\build\\lib\\taos.dll C:\\Windows\\System32 || exit 8
     cd C:\\workspace\\TDinternal\\community\\src\\connector\\python
     python -m pip install .
@@ -484,10 +485,12 @@ pipeline {
     catchError(buildResult: 'FAILURE', stageResult: 'FAILURE') {
         pre_test_win()
+        timeout(time: 20, unit: 'MINUTES'){
         bat '''
             cd C:\\workspace\\TDinternal\\community\\tests\\pytest
             .\\test-all.bat Wintest
             '''
+        }
     }
     script {
         win_stop = 1
deps/cJson/inc/cJSON.h

@@ -73,7 +73,7 @@ typedef struct cJSON
     char *string;
     //Keep the original string of number
-    char numberstring[13];
+    char numberstring[64];
 } cJSON;

 typedef struct cJSON_Hooks
deps/cJson/src/cJSON.c

@@ -290,7 +290,7 @@ loop_end:
     input_buffer->offset += (size_t)(after_end - number_c_string);
-    strncpy(item->numberstring, (const char *)number_c_string, 12);
+    strncpy(item->numberstring, (const char *)number_c_string, strlen((const char *)number_c_string));
     return true;
 }
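Note: the two cJson changes above exist so that the original text of a numeric token survives parsing; a 13-byte buffer cannot hold a nanosecond-precision timestamp or a full 64-bit integer with its terminator. A minimal standalone sketch (example values assumed, not part of this patch) illustrates the sizing:

/* Sketch only: shows how many bytes the original number text actually needs. */
#include <stdio.h>
#include <string.h>
#include <stdint.h>
#include <inttypes.h>

int main(void) {
    char buf[32];
    int64_t ts_ns = 1626006833639000000LL;  /* ns-precision timestamp used in the connector examples */
    int n = snprintf(buf, sizeof(buf), "%" PRId64, ts_ns);
    /* 19 digits + '\0' already overflows numberstring[13]; 64 bytes is ample */
    printf("timestamp needs %d digits\n", n);
    /* worst case for a signed 64-bit value: 20 characters plus '\0' */
    printf("INT64_MIN as text needs %zu bytes\n", strlen("-9223372036854775808") + 1);
    return 0;
}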
src/client/inc/tscParseLine.h

@@ -47,6 +47,12 @@ typedef enum {
   SML_TIME_STAMP_NANO_SECONDS
 } SMLTimeStampType;

+typedef enum {
+  SML_LINE_PROTOCOL = 0,
+  SML_TELNET_PROTOCOL = 1,
+  SML_JSON_PROTOCOL = 2,
+} SMLProtocolType;
+
 typedef struct {
   uint64_t id;
   SHashObj* smlDataToSchema;
@@ -57,7 +63,7 @@ bool checkDuplicateKey(char *key, SHashObj *pHash, SSmlLinesInfo* info);
 bool isValidInteger(char *str);
 bool isValidFloat(char *str);

-int32_t isValidChildTableName(const char *pTbName, int16_t len);
+int32_t isValidChildTableName(const char *pTbName, int16_t len, SSmlLinesInfo* info);

 bool convertSmlValueType(TAOS_SML_KV *pVal, char *value,
                          uint16_t len, SSmlLinesInfo* info);
@@ -66,6 +72,11 @@ int32_t convertSmlTimeStamp(TAOS_SML_KV *pVal, char *value,
 void destroySmlDataPoint(TAOS_SML_DATA_POINT* point);

+int taos_insert_sml_lines(TAOS* taos, char* lines[], int numLines);
+int taos_insert_telnet_lines(TAOS* taos, char* lines[], int numLines);
+int taos_insert_json_payload(TAOS* taos, char* payload);
+
 #ifdef __cplusplus
 }
 #endif
src/client/src/TSDBJNIConnector.c

@@ -17,6 +17,7 @@
 #include "taos.h"
 #include "tlog.h"
 #include "tscUtil.h"
+#include "tscParseLine.h"

 #include "com_taosdata_jdbc_TSDBJNIConnector.h"
@@ -1070,7 +1071,7 @@ JNIEXPORT jlong JNICALL Java_com_taosdata_jdbc_TSDBJNIConnector_insertLinesImp(J
     c_lines[i] = (char *)(*env)->GetStringUTFChars(env, line, 0);
   }

-  int code = taos_insert_lines(taos, c_lines, numLines);
+  int code = taos_schemaless_insert(taos, c_lines, numLines, SML_LINE_PROTOCOL);

   for (int i = 0; i < numLines; ++i) {
     jstring line = (jstring)((*env)->GetObjectArrayElement(env, lines, i));
@@ -1084,4 +1085,4 @@ JNIEXPORT jlong JNICALL Java_com_taosdata_jdbc_TSDBJNIConnector_insertLinesImp(J
     return JNI_TDENGINE_ERROR;
   }
   return code;
 }
\ No newline at end of file
src/client/src/tscParseLineProtocol.c

@@ -1811,8 +1811,8 @@ static int32_t parseSmlKey(TAOS_SML_KV *pKV, const char **index, SHashObj *pHash
     return TSDB_CODE_TSC_LINE_SYNTAX_ERROR;
   }

   while (*cur != '\0') {
-    if (len > TSDB_COL_NAME_LEN) {
-      tscError("SML:0x%"PRIx64" Key field cannot exceeds 65 characters", info->id);
+    if (len >= TSDB_COL_NAME_LEN - 1) {
+      tscError("SML:0x%"PRIx64" Key field cannot exceeds %d characters", info->id, TSDB_COL_NAME_LEN - 1);
       return TSDB_CODE_TSC_INVALID_COLUMN_LENGTH;
     }
     //unescaped '=' identifies a tag key
@@ -1898,8 +1898,8 @@ static int32_t parseSmlMeasurement(TAOS_SML_DATA_POINT *pSml, const char **index
   }

   while (*cur != '\0') {
-    if (len > TSDB_TABLE_NAME_LEN) {
-      tscError("SML:0x%"PRIx64" Measurement field cannot exceeds 193 characters", info->id);
+    if (len >= TSDB_TABLE_NAME_LEN - 1) {
+      tscError("SML:0x%"PRIx64" Measurement field cannot exceeds %d characters", info->id, TSDB_TABLE_NAME_LEN - 1);
       free(pSml->stableName);
       pSml->stableName = NULL;
       return TSDB_CODE_TSC_INVALID_TABLE_ID_LENGTH;
@@ -1917,7 +1917,7 @@ static int32_t parseSmlMeasurement(TAOS_SML_DATA_POINT *pSml, const char **index
     if (*cur == '\\') {
       escapeSpecialCharacter(1, &cur);
     }
-    pSml->stableName[len] = *cur;
+    pSml->stableName[len] = tolower(*cur);
     cur++;
     len++;
   }
@@ -1929,7 +1929,11 @@ static int32_t parseSmlMeasurement(TAOS_SML_DATA_POINT *pSml, const char **index
 }

 //Table name can only contain digits(0-9),alphebet(a-z),underscore(_)
-int32_t isValidChildTableName(const char *pTbName, int16_t len) {
+int32_t isValidChildTableName(const char *pTbName, int16_t len, SSmlLinesInfo* info) {
+  if (len > TSDB_TABLE_NAME_LEN - 1) {
+    tscError("SML:0x%"PRIx64" child table name cannot exceeds %d characters", info->id, TSDB_TABLE_NAME_LEN - 1);
+    return TSDB_CODE_TSC_INVALID_TABLE_ID_LENGTH;
+  }
   const char *cur = pTbName;
   for (int i = 0; i < len; ++i) {
     if (!isdigit(cur[i]) && !isalpha(cur[i]) && (cur[i] != '_')) {
@@ -1975,12 +1979,13 @@ static int32_t parseSmlKvPairs(TAOS_SML_KV **pKVs, int *num_kvs,
     }

     if (!isField &&
         (strcasecmp(pkv->key, "ID") == 0) && pkv->type == TSDB_DATA_TYPE_BINARY) {
-      ret = isValidChildTableName(pkv->value, pkv->length);
+      ret = isValidChildTableName(pkv->value, pkv->length, info);
       if (ret) {
         goto error;
       }
       smlData->childTableName = malloc(pkv->length + 1);
       memcpy(smlData->childTableName, pkv->value, pkv->length);
+      strntolower_s(smlData->childTableName, smlData->childTableName, (int32_t)pkv->length);
       smlData->childTableName[pkv->length] = '\0';
       free(pkv->key);
       free(pkv->value);
@@ -2184,3 +2189,43 @@ cleanup:
   return code;
 }
+
+/**
+ * taos_schemaless_insert() parse and insert data points into database according to
+ * different protocol.
+ *
+ * @param $lines input array may contain multiple lines, each line indicates a data point.
+ *               If protocol=2 is used input array should contain single JSON
+ *               string(e.g. char *lines[] = {"$JSON_string"}). If need to insert
+ *               multiple data points in JSON format, should include them in $JSON_string
+ *               as a JSON array.
+ * @param $numLines indicates how many data points in $lines.
+ *                  If protocol = 2 is used this param will be ignored as $lines should
+ *                  contain single JSON string.
+ * @param $protocol indicates which protocol to use for parsing:
+ *                  0 - influxDB line protocol
+ *                  1 - OpenTSDB telnet line protocol
+ *                  2 - OpenTSDB JSON format protocol
+ * @return return zero for successful insertion. Otherwise return none-zero error code of
+ *         failure reason.
+ *
+ */
+int taos_schemaless_insert(TAOS* taos, char* lines[], int numLines, int protocol) {
+  int code;
+
+  switch (protocol) {
+    case SML_LINE_PROTOCOL:
+      code = taos_insert_lines(taos, lines, numLines);
+      break;
+    case SML_TELNET_PROTOCOL:
+      code = taos_insert_telnet_lines(taos, lines, numLines);
+      break;
+    case SML_JSON_PROTOCOL:
+      code = taos_insert_json_payload(taos, *lines);
+      break;
+    default:
+      code = TSDB_CODE_TSC_INVALID_PROTOCOL_TYPE;
+      break;
+  }
+  return code;
+}
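Note: the doc comment above describes the one public entry point this change adds. A hedged usage sketch for protocol 0 (influxDB line protocol) follows; the connection parameters, the database name "test", and the sample line are placeholders taken from the connector examples elsewhere in this commit, not part of the patch itself.

/* Sketch only: call the new API through the public client library. */
#include <stdio.h>
#include "taos.h"

int main(void) {
    TAOS *taos = taos_connect("localhost", "root", "taosdata", NULL, 0);
    if (taos == NULL) return 1;

    TAOS_RES *res = taos_query(taos, "create database if not exists test");
    taos_free_result(res);
    res = taos_query(taos, "use test");
    taos_free_result(res);

    char *lines[] = {
        "st,t1=3i64,t2=4f64,t3=\"t3\" c1=3i64,c3=L\"pass\",c2=false,c4=4f64 1626006833639000000ns"
    };
    /* protocol 0 = influxDB line protocol (SML_LINE_PROTOCOL) */
    int code = taos_schemaless_insert(taos, lines, 1, 0);
    printf("taos_schemaless_insert returned %d\n", code);

    taos_close(taos);
    return code;
}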
src/client/src/tscParseOpenTSDB.c

@@ -37,7 +37,7 @@ static int32_t parseTelnetMetric(TAOS_SML_DATA_POINT *pSml, const char **index,
   const char *cur = *index;
   uint16_t len = 0;

-  pSml->stableName = tcalloc(TSDB_TABLE_NAME_LEN + 1, 1);    // +1 to avoid 1772 line over write
+  pSml->stableName = tcalloc(TSDB_TABLE_NAME_LEN, 1);
   if (pSml->stableName == NULL) {
     return TSDB_CODE_TSC_OUT_OF_MEMORY;
   }
@@ -48,8 +48,8 @@ static int32_t parseTelnetMetric(TAOS_SML_DATA_POINT *pSml, const char **index,
   }

   while (*cur != '\0') {
-    if (len > TSDB_TABLE_NAME_LEN) {
-      tscError("OTD:0x%"PRIx64" Metric cannot exceeds 193 characters", info->id);
+    if (len >= TSDB_TABLE_NAME_LEN - 1) {
+      tscError("OTD:0x%"PRIx64" Metric cannot exceeds %d characters", info->id, TSDB_TABLE_NAME_LEN - 1);
       tfree(pSml->stableName);
       return TSDB_CODE_TSC_INVALID_TABLE_ID_LENGTH;
     }
@@ -62,7 +62,7 @@ static int32_t parseTelnetMetric(TAOS_SML_DATA_POINT *pSml, const char **index,
     if (*cur == '.') {
       pSml->stableName[len] = '_';
     } else {
-      pSml->stableName[len] = *cur;
+      pSml->stableName[len] = tolower(*cur);
     }
     cur++;
@@ -171,7 +171,7 @@ static int32_t parseTelnetMetricValue(TAOS_SML_KV **pKVs, int *num_kvs, const ch
 static int32_t parseTelnetTagKey(TAOS_SML_KV *pKV, const char **index, SHashObj *pHash, SSmlLinesInfo* info) {
   const char *cur = *index;
-  char key[TSDB_COL_NAME_LEN + 1];  // +1 to avoid key[len] over write
+  char key[TSDB_COL_NAME_LEN];
   uint16_t len = 0;

   //key field cannot start with digit
@@ -180,8 +180,8 @@ static int32_t parseTelnetTagKey(TAOS_SML_KV *pKV, const char **index, SHashObj
     return TSDB_CODE_TSC_LINE_SYNTAX_ERROR;
   }
   while (*cur != '\0') {
-    if (len > TSDB_COL_NAME_LEN) {
-      tscError("OTD:0x%"PRIx64" Tag key cannot exceeds 65 characters", info->id);
+    if (len >= TSDB_COL_NAME_LEN - 1) {
+      tscError("OTD:0x%"PRIx64" Tag key cannot exceeds %d characters", info->id, TSDB_COL_NAME_LEN - 1);
       return TSDB_CODE_TSC_INVALID_COLUMN_LENGTH;
     }
     if (*cur == ' ') {
@@ -276,13 +276,14 @@ static int32_t parseTelnetTagKvs(TAOS_SML_KV **pKVs, int *num_kvs,
       return ret;
     }

     if ((strcasecmp(pkv->key, "ID") == 0) && pkv->type == TSDB_DATA_TYPE_BINARY) {
-      ret = isValidChildTableName(pkv->value, pkv->length);
+      ret = isValidChildTableName(pkv->value, pkv->length, info);
       if (ret) {
         return ret;
       }
       *childTableName = malloc(pkv->length + 1);
       memcpy(*childTableName, pkv->value, pkv->length);
       (*childTableName)[pkv->length] = '\0';
+      strntolower_s(*childTableName, *childTableName, (int32_t)pkv->length);
       tfree(pkv->key);
       tfree(pkv->value);
     } else {
@@ -311,7 +312,7 @@ static int32_t parseTelnetTagKvs(TAOS_SML_KV **pKVs, int *num_kvs,
   return ret;
 }

-int32_t tscParseTelnetLine(const char* line, TAOS_SML_DATA_POINT* smlData, SSmlLinesInfo* info) {
+static int32_t tscParseTelnetLine(const char* line, TAOS_SML_DATA_POINT* smlData, SSmlLinesInfo* info) {
   const char* index = line;
   int32_t ret = TSDB_CODE_SUCCESS;
@@ -354,7 +355,7 @@ int32_t tscParseTelnetLine(const char* line, TAOS_SML_DATA_POINT* smlData, SSmlL
   return TSDB_CODE_SUCCESS;
 }

-int32_t tscParseTelnetLines(char* lines[], int numLines, SArray* points, SArray* failedLines, SSmlLinesInfo* info) {
+static int32_t tscParseTelnetLines(char* lines[], int numLines, SArray* points, SArray* failedLines, SSmlLinesInfo* info) {
   for (int32_t i = 0; i < numLines; ++i) {
     TAOS_SML_DATA_POINT point = {0};
     int32_t code = tscParseTelnetLine(lines[i], &point, info);
@@ -438,15 +439,15 @@ int taos_telnet_insert(TAOS* taos, TAOS_SML_DATA_POINT* points, int numPoint) {
 /* telnet style API parser */
-int32_t parseMetricFromJSON(cJSON *root, TAOS_SML_DATA_POINT* pSml, SSmlLinesInfo* info) {
+static int32_t parseMetricFromJSON(cJSON *root, TAOS_SML_DATA_POINT* pSml, SSmlLinesInfo* info) {
   cJSON *metric = cJSON_GetObjectItem(root, "metric");
   if (!cJSON_IsString(metric)) {
     return TSDB_CODE_TSC_INVALID_JSON;
   }

   size_t stableLen = strlen(metric->valuestring);
-  if (stableLen > TSDB_TABLE_NAME_LEN) {
-    tscError("OTD:0x%"PRIx64" Metric cannot exceeds 193 characters in JSON", info->id);
+  if (stableLen > TSDB_TABLE_NAME_LEN - 1) {
+    tscError("OTD:0x%"PRIx64" Metric cannot exceeds %d characters in JSON", info->id, TSDB_TABLE_NAME_LEN - 1);
     return TSDB_CODE_TSC_INVALID_TABLE_ID_LENGTH;
   }
@@ -462,19 +463,20 @@ int32_t parseMetricFromJSON(cJSON *root, TAOS_SML_DATA_POINT* pSml, SSmlLinesInf
   }

   //convert dot to underscore for now, will be removed once dot is allowed in tbname.
-  for (int i = 0; i < strlen(metric->valuestring); ++i) {
+  for (int i = 0; i < stableLen; ++i) {
     if (metric->valuestring[i] == '.') {
       metric->valuestring[i] = '_';
     }
   }
   tstrncpy(pSml->stableName, metric->valuestring, stableLen + 1);
+  strntolower_s(pSml->stableName, pSml->stableName, (int32_t)stableLen);

   return TSDB_CODE_SUCCESS;
 }

-int32_t parseTimestampFromJSONObj(cJSON *root, int64_t *tsVal, SSmlLinesInfo* info) {
+static int32_t parseTimestampFromJSONObj(cJSON *root, int64_t *tsVal, SSmlLinesInfo* info) {
   int32_t size = cJSON_GetArraySize(root);
   if (size != OTD_JSON_SUB_FIELDS_NUM) {
     return TSDB_CODE_TSC_INVALID_JSON;
@@ -490,7 +492,7 @@ int32_t parseTimestampFromJSONObj(cJSON *root, int64_t *tsVal, SSmlLinesInfo* in
     return TSDB_CODE_TSC_INVALID_JSON;
   }

-  *tsVal = value->valueint;
+  *tsVal = strtoll(value->numberstring, NULL, 10);
   //if timestamp value is 0 use current system time
   if (*tsVal == 0) {
     *tsVal = taosGetTimestampNs();
@@ -526,7 +528,7 @@ int32_t parseTimestampFromJSONObj(cJSON *root, int64_t *tsVal, SSmlLinesInfo* in
   return TSDB_CODE_SUCCESS;
 }

-int32_t parseTimestampFromJSON(cJSON *root, TAOS_SML_KV **pTS, int *num_kvs, SSmlLinesInfo* info) {
+static int32_t parseTimestampFromJSON(cJSON *root, TAOS_SML_KV **pTS, int *num_kvs, SSmlLinesInfo* info) {
   //Timestamp must be the first KV to parse
   assert(*num_kvs == 0);
   int64_t tsVal;
@@ -538,7 +540,8 @@ int32_t parseTimestampFromJSON(cJSON *root, TAOS_SML_KV **pTS, int *num_kvs, SSm
     if (timestamp->valueint == 0) {
       tsVal = taosGetTimestampNs();
     } else {
-      tsVal = convertTimePrecision(timestamp->valueint, TSDB_TIME_PRECISION_MICRO, TSDB_TIME_PRECISION_NANO);
+      tsVal = strtoll(timestamp->numberstring, NULL, 10);
+      tsVal = convertTimePrecision(tsVal, TSDB_TIME_PRECISION_MICRO, TSDB_TIME_PRECISION_NANO);
     }
   } else if (cJSON_IsObject(timestamp)) {
     int32_t ret = parseTimestampFromJSONObj(timestamp, &tsVal, info);
@@ -567,7 +570,7 @@ int32_t parseTimestampFromJSON(cJSON *root, TAOS_SML_KV **pTS, int *num_kvs, SSm
 }

-int32_t convertJSONBool(TAOS_SML_KV *pVal, char* typeStr, int64_t valueInt, SSmlLinesInfo* info) {
+static int32_t convertJSONBool(TAOS_SML_KV *pVal, char* typeStr, int64_t valueInt, SSmlLinesInfo* info) {
   if (strcasecmp(typeStr, "bool") != 0) {
     tscError("OTD:0x%"PRIx64" invalid type(%s) for JSON Bool", info->id, typeStr);
     return TSDB_CODE_TSC_INVALID_JSON_TYPE;
@@ -580,7 +583,7 @@ int32_t convertJSONBool(TAOS_SML_KV *pVal, char* typeStr, int64_t valueInt, SSml
   return TSDB_CODE_SUCCESS;
 }

-int32_t convertJSONNumber(TAOS_SML_KV *pVal, char* typeStr, cJSON *value, SSmlLinesInfo* info) {
+static int32_t convertJSONNumber(TAOS_SML_KV *pVal, char* typeStr, cJSON *value, SSmlLinesInfo* info) {
   //tinyint
   if (strcasecmp(typeStr, "i8") == 0 ||
       strcasecmp(typeStr, "tinyint") == 0) {
@@ -623,14 +626,19 @@ int32_t convertJSONNumber(TAOS_SML_KV *pVal, char* typeStr, cJSON *value, SSmlLi
   //bigint
   if (strcasecmp(typeStr, "i64") == 0 ||
       strcasecmp(typeStr, "bigint") == 0) {
-    if (!IS_VALID_BIGINT(value->valueint)) {
-      tscError("OTD:0x%"PRIx64" JSON value(%"PRId64") cannot fit in type(bigint)", info->id, value->valueint);
-      return TSDB_CODE_TSC_VALUE_OUT_OF_RANGE;
-    }
     pVal->type = TSDB_DATA_TYPE_BIGINT;
     pVal->length = (int16_t)tDataTypes[pVal->type].bytes;
     pVal->value = tcalloc(pVal->length, 1);
-    *(int64_t *)(pVal->value) = (int64_t)(value->valueint);
+    /* cJSON conversion of legit BIGINT may overflow,
+     * use original string to do the conversion.
+     */
+    errno = 0;
+    int64_t val = (int64_t)strtoll(value->numberstring, NULL, 10);
+    if (errno == ERANGE || !IS_VALID_BIGINT(val)) {
+      tscError("OTD:0x%"PRIx64" JSON value(%s) cannot fit in type(bigint)", info->id, value->numberstring);
+      return TSDB_CODE_TSC_VALUE_OUT_OF_RANGE;
+    }
+    *(int64_t *)(pVal->value) = val;
     return TSDB_CODE_SUCCESS;
   }
   //float
@@ -665,7 +673,7 @@ int32_t convertJSONNumber(TAOS_SML_KV *pVal, char* typeStr, cJSON *value, SSmlLi
   return TSDB_CODE_TSC_INVALID_JSON_TYPE;
 }

-int32_t convertJSONString(TAOS_SML_KV *pVal, char* typeStr, cJSON *value, SSmlLinesInfo* info) {
+static int32_t convertJSONString(TAOS_SML_KV *pVal, char* typeStr, cJSON *value, SSmlLinesInfo* info) {
   if (strcasecmp(typeStr, "binary") == 0) {
     pVal->type = TSDB_DATA_TYPE_BINARY;
   } else if (strcasecmp(typeStr, "nchar") == 0) {
@@ -680,7 +688,7 @@ int32_t convertJSONString(TAOS_SML_KV *pVal, char* typeStr, cJSON *value, SSmlLi
   return TSDB_CODE_SUCCESS;
 }

-int32_t parseValueFromJSONObj(cJSON *root, TAOS_SML_KV *pVal, SSmlLinesInfo* info) {
+static int32_t parseValueFromJSONObj(cJSON *root, TAOS_SML_KV *pVal, SSmlLinesInfo* info) {
   int32_t ret = TSDB_CODE_SUCCESS;
   int32_t size = cJSON_GetArraySize(root);
@@ -728,7 +736,7 @@ int32_t parseValueFromJSONObj(cJSON *root, TAOS_SML_KV *pVal, SSmlLinesInfo* inf
   return TSDB_CODE_SUCCESS;
 }

-int32_t parseValueFromJSON(cJSON *root, TAOS_SML_KV *pVal, SSmlLinesInfo* info) {
+static int32_t parseValueFromJSON(cJSON *root, TAOS_SML_KV *pVal, SSmlLinesInfo* info) {
   int type = root->type;
   switch (type) {
@@ -746,7 +754,16 @@ int32_t parseValueFromJSON(cJSON *root, TAOS_SML_KV *pVal, SSmlLinesInfo* info)
       pVal->type = TSDB_DATA_TYPE_BIGINT;
       pVal->length = (int16_t)tDataTypes[pVal->type].bytes;
       pVal->value = tcalloc(pVal->length, 1);
-      *(int64_t *)(pVal->value) = (int64_t)(root->valuedouble);
+      /* cJSON conversion of legit BIGINT may overflow,
+       * use original string to do the conversion.
+       */
+      errno = 0;
+      int64_t val = (int64_t)strtoll(root->numberstring, NULL, 10);
+      if (errno == ERANGE || !IS_VALID_BIGINT(val)) {
+        tscError("OTD:0x%"PRIx64" JSON value(%s) cannot fit in type(bigint)", info->id, root->numberstring);
+        return TSDB_CODE_TSC_VALUE_OUT_OF_RANGE;
+      }
+      *(int64_t *)(pVal->value) = val;
     } else if (isValidFloat(root->numberstring)) {
       pVal->type = TSDB_DATA_TYPE_DOUBLE;
       pVal->length = (int16_t)tDataTypes[pVal->type].bytes;
@@ -790,7 +807,7 @@ int32_t parseValueFromJSON(cJSON *root, TAOS_SML_KV *pVal, SSmlLinesInfo* info)
   return TSDB_CODE_SUCCESS;
 }

-int32_t parseMetricValueFromJSON(cJSON *root, TAOS_SML_KV **pKVs, int *num_kvs, SSmlLinesInfo* info) {
+static int32_t parseMetricValueFromJSON(cJSON *root, TAOS_SML_KV **pKVs, int *num_kvs, SSmlLinesInfo* info) {
   //skip timestamp
   TAOS_SML_KV *pVal = *pKVs + 1;
   char key[] = OTD_METRIC_VALUE_COLUMN_NAME;
@@ -813,7 +830,9 @@ int32_t parseMetricValueFromJSON(cJSON *root, TAOS_SML_KV **pKVs, int *num_kvs,
 }

-int32_t parseTagsFromJSON(cJSON *root, TAOS_SML_KV **pKVs, int *num_kvs, char **childTableName, SSmlLinesInfo* info) {
+static int32_t parseTagsFromJSON(cJSON *root, TAOS_SML_KV **pKVs, int *num_kvs, char **childTableName,
+                                 SHashObj *pHash, SSmlLinesInfo* info) {
   int32_t ret = TSDB_CODE_SUCCESS;

   cJSON *tags = cJSON_GetObjectItem(root, "tags");
@@ -825,16 +844,19 @@ int32_t parseTagsFromJSON(cJSON *root, TAOS_SML_KV **pKVs, int *num_kvs, char **
   cJSON *id = cJSON_GetObjectItem(tags, "ID");
   if (id != NULL) {
     size_t idLen = strlen(id->valuestring);
-    ret = isValidChildTableName(id->valuestring, (int16_t)idLen);
+    ret = isValidChildTableName(id->valuestring, (int16_t)idLen, info);
     if (ret != TSDB_CODE_SUCCESS) {
       return ret;
     }
     *childTableName = tcalloc(idLen + 1, sizeof(char));
     memcpy(*childTableName, id->valuestring, idLen);
-    //remove all ID fields from tags list no case sensitive
-    while (id != NULL) {
-      cJSON_DeleteItemFromObject(tags, "ID");
-      id = cJSON_GetObjectItem(tags, "ID");
-    }
+    strntolower_s(*childTableName, *childTableName, (int32_t)idLen);
+
+    //check duplicate IDs
+    cJSON_DeleteItemFromObject(tags, "ID");
+    id = cJSON_GetObjectItem(tags, "ID");
+    if (id != NULL) {
+      return TSDB_CODE_TSC_DUP_TAG_NAMES;
+    }
   }
@@ -853,8 +875,16 @@ int32_t parseTagsFromJSON(cJSON *root, TAOS_SML_KV **pKVs, int *num_kvs, char **
     if (tag == NULL) {
       return TSDB_CODE_TSC_INVALID_JSON;
     }
+    //check duplicate keys
+    if (checkDuplicateKey(tag->string, pHash, info)) {
+      return TSDB_CODE_TSC_DUP_TAG_NAMES;
+    }
     //key
     size_t keyLen = strlen(tag->string);
+    if (keyLen > TSDB_COL_NAME_LEN - 1) {
+      tscError("OTD:0x%"PRIx64" Tag key cannot exceeds %d characters in JSON", info->id, TSDB_COL_NAME_LEN - 1);
+      return TSDB_CODE_TSC_INVALID_COLUMN_LENGTH;
+    }
     pkv->key = tcalloc(keyLen + 1, sizeof(char));
     strncpy(pkv->key, tag->string, keyLen);
     //value
@@ -864,13 +894,14 @@ int32_t parseTagsFromJSON(cJSON *root, TAOS_SML_KV **pKVs, int *num_kvs, char **
     }
     *num_kvs += 1;
     pkv++;
   }

   return ret;
 }

-int32_t tscParseJSONPayload(cJSON *root, TAOS_SML_DATA_POINT* pSml, SSmlLinesInfo* info) {
+static int32_t tscParseJSONPayload(cJSON *root, TAOS_SML_DATA_POINT* pSml, SSmlLinesInfo* info) {
   int32_t ret = TSDB_CODE_SUCCESS;

   if (!cJSON_IsObject(root)) {
@@ -910,17 +941,20 @@ int32_t tscParseJSONPayload(cJSON *root, TAOS_SML_DATA_POINT* pSml, SSmlLinesInf
   tscDebug("OTD:0x%"PRIx64" Parse metric value from JSON payload finished", info->id);

   //Parse tags
-  ret = parseTagsFromJSON(root, &pSml->tags, &pSml->tagNum, &pSml->childTableName, info);
+  SHashObj *keyHashTable = taosHashInit(128, taosGetDefaultHashFunction(TSDB_DATA_TYPE_BINARY), true, false);
+  ret = parseTagsFromJSON(root, &pSml->tags, &pSml->tagNum, &pSml->childTableName, keyHashTable, info);
   if (ret) {
     tscError("OTD:0x%"PRIx64" Unable to parse tags from JSON payload", info->id);
+    taosHashCleanup(keyHashTable);
     return ret;
   }
   tscDebug("OTD:0x%"PRIx64" Parse tags from JSON payload finished", info->id);
+  taosHashCleanup(keyHashTable);

   return TSDB_CODE_SUCCESS;
 }

-int32_t tscParseMultiJSONPayload(char* payload, SArray* points, SSmlLinesInfo* info) {
+static int32_t tscParseMultiJSONPayload(char* payload, SArray* points, SSmlLinesInfo* info) {
   int32_t payloadNum, ret;
   ret = TSDB_CODE_SUCCESS;
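Note: for the telnet and JSON code paths touched above, a hedged C sketch that mirrors the sample data in the Python connector docstrings of this commit; `taos` is assumed to be an already-open connection with a database selected, and the helper name is illustrative only.

/* Sketch only: protocol 1 (OpenTSDB telnet) and protocol 2 (OpenTSDB JSON). */
#include "taos.h"

int insert_opentsdb_samples(TAOS *taos) {
    /* telnet line: metric timestamp value tagk=tagv ... */
    char *telnet_lines[] = {
        "cpu_load 1626056811855516532ns 2.0f32 id=\"tb1\",host=\"host0\",interface=\"eth0\""
    };
    int code = taos_schemaless_insert(taos, telnet_lines, 1, 1);
    if (code != 0) return code;

    /* JSON payload: a single JSON string; numLines is ignored for protocol 2 */
    char *json_payload[] = {
        "{\"metric\": \"cpu_load_0\", \"timestamp\": 1626006833610123, \"value\": 55.5,"
        " \"tags\": {\"host\": \"ubuntu\", \"interface\": \"eth0\", \"Id\": \"tb0\"}}"
    };
    return taos_schemaless_insert(taos, json_payload, 1, 2);
}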
src/connector/python/README.md

@@ -404,13 +404,13 @@ lines = [
     'st,t1=4i64,t3="t4",t2=5f64,t4=5f64 c1=3i64,c3=L"pass it again",c2=true,c4=5f64,c5=5f64,c6=7u64 1626006933640000000ns',
     'stf,t1=4i64,t3="t4",t2=5f64,t4=5f64 c1=3i64,c3=L"pass it again_stf",c2=false,c5=5f64,c6=7u64 1626006933641000000ns',
 ]
-conn.insert_lines(lines)
+conn.schemaless_insert(lines, 0)
 print("inserted")

 lines = [
     'stf,t1=5i64,t3="t4",t2=5f64,t4=5f64 c1=3i64,c3=L"pass it again_stf",c2=false,c5=5f64,c6=7u64 1626006933641000000ns',
 ]
-conn.insert_lines(lines)
+conn.schemaless_insert(lines, 0)

 result = conn.query("show tables")
 for row in result:
src/connector/python/examples/insert-lines.py

@@ -9,10 +9,10 @@ conn.select_db(dbname)
 lines = [
     'st,t1=3i64,t2=4f64,t3="t3" c1=3i64,c3=L"pass",c2=false,c4=4f64 1626006833639000000ns',
 ]
-conn.insert_lines(lines)
+conn.schemaless_insert(lines, 0)
 print("inserted")

-conn.insert_lines(lines)
+conn.schemaless_insert(lines, 0)
 result = conn.query("show tables")
 for row in result:
src/connector/python/taos/__init__.py

@@ -406,13 +406,13 @@ lines = [
 'st,t1=4i64,t3="t4",t2=5f64,t4=5f64 c1=3i64,c3=L"passitagin",c2=true,c4=5f64,c5=5f64,c6=7u64 1626006933640000000ns',
 'stf,t1=4i64,t3="t4",t2=5f64,t4=5f64 c1=3i64,c3=L"passitagin_stf",c2=false,c5=5f64,c6=7u64 1626006933641000000ns',
 ]
-conn.insert_lines(lines)
+conn.schemaless_insert(lines, 0)
 print("inserted")
 lines = [
 'stf,t1=5i64,t3="t4",t2=5f64,t4=5f64 c1=3i64,c3=L"passitagin_stf",c2=false,c5=5f64,c6=7u64 1626006933641000000ns',
 ]
-conn.insert_lines(lines)
+conn.schemaless_insert(lines, 0)
 result = conn.query("show tables")
 for row in result:
src/connector/python/taos/cinterface.py

@@ -809,40 +809,23 @@ def taos_stmt_use_result(stmt):
     return result

 try:
-    _libtaos.taos_insert_lines.restype = c_int
-    _libtaos.taos_insert_lines.argstype = c_void_p, c_void_p, c_int
+    _libtaos.taos_schemaless_insert.restype = c_int
+    _libtaos.taos_schemaless_insert.argstype = c_void_p, c_void_p, c_int
 except AttributeError:
-    print("WARNING: libtaos(%s) does not support insert_lines" % taos_get_client_info())
+    print("WARNING: libtaos(%s) does not support schemaless_insert" % taos_get_client_info())


-def taos_insert_lines(connection, lines):
+def taos_schemaless_insert(connection, lines, protocol):
     # type: (c_void_p, list[str] | tuple(str)) -> None
     num_of_lines = len(lines)
     lines = (c_char_p(line.encode("utf-8")) for line in lines)
     lines_type = ctypes.c_char_p * num_of_lines
     p_lines = lines_type(*lines)
-    errno = _libtaos.taos_insert_lines(connection, p_lines, num_of_lines)
+    errno = _libtaos.taos_schemaless_insert(connection, p_lines, num_of_lines, protocol)
     if errno != 0:
-        raise LinesError("insert lines error", errno)
-
-
-def taos_insert_telnet_lines(connection, lines):
-    # type: (c_void_p, list[str] | tuple(str)) -> None
-    num_of_lines = len(lines)
-    lines = (c_char_p(line.encode("utf-8")) for line in lines)
-    lines_type = ctypes.c_char_p * num_of_lines
-    p_lines = lines_type(*lines)
-    errno = _libtaos.taos_insert_telnet_lines(connection, p_lines, num_of_lines)
-    if errno != 0:
-        raise TelnetLinesError("insert telnet lines error", errno)
-
-
-def taos_insert_json_payload(connection, payload):
-    # type: (c_void_p, list[str] | tuple(str)) -> None
-    payload = payload.encode("utf-8")
-    errno = _libtaos.taos_insert_json_payload(connection, payload)
-    if errno != 0:
-        raise JsonPayloadError("insert json payload error", errno)
+        raise SchemalessError("schemaless insert error", errno)


 class CTaosInterface(object):
     def __init__(self, config=None):
src/connector/python/taos/connection.py

@@ -117,9 +117,10 @@ class TaosConnection(object):
         stream = taos_open_stream(self._conn, sql, callback, stime, param, callback2)
         return TaosStream(stream)

-    def insert_lines(self, lines):
+    def schemaless_insert(self, lines, protocol):
         # type: (list[str]) -> None
-        """Line protocol and schemaless support
+        """
+        1.Line protocol and schemaless support

         ## Example
@@ -131,34 +132,31 @@ class TaosConnection(object):
         lines = [
             'ste,t2=5,t3=L"ste" c1=true,c2=4,c3="string" 1626056811855516532',
         ]
-        conn.insert_lines(lines)
+        conn.schemaless_insert(lines, 0)
         ```

-        ## Exception
-
-        ```python
-        try:
-            conn.insert_lines(lines)
-        except SchemalessError as err:
-            print(err)
-        ```
-        """
-        return taos_insert_lines(self._conn, lines)
-
-    def insert_telnet_lines(self, lines):
-        """OpenTSDB telnet style API format support
+        2.OpenTSDB telnet style API format support

         ## Example
-        cpu_load 1626056811855516532ns 2.0f32 id="tb1",host="host0",interface="eth0"
+        import taos
+        conn = taos.connect()
+        conn.exec("drop database if exists test")
+        conn.select_db("test")
+        lines = [
+            'cpu_load 1626056811855516532ns 2.0f32 id="tb1",host="host0",interface="eth0"',
+        ]
+        conn.schemaless_insert(lines, 1)

-        """
-        return taos_insert_telnet_lines(self._conn, lines)
-
-    def insert_json_payload(self, payload):
-        """OpenTSDB HTTP JSON format support
+        3.OpenTSDB HTTP JSON format support

         ## Example
-        "{
+        import taos
+        conn = taos.connect()
+        conn.exec("drop database if exists test")
+        conn.select_db("test")
+        payload = ['''
+        {
             "metric": "cpu_load_0",
             "timestamp": 1626006833610123,
             "value": 55.5,
@@ -168,10 +166,13 @@ class TaosConnection(object):
             "interface": "eth0",
             "Id": "tb0"
           }
-        }"
+        }
+        ''']
+        conn.schemaless_insert(lines, 2)

         """
-        return taos_insert_json_payload(self._conn, payload)
+        return taos_schemaless_insert(self._conn, lines, protocol)

     def cursor(self):
         # type: () -> TaosCursor
src/connector/python/taos/error.py

@@ -80,17 +80,7 @@ class ResultError(DatabaseError):
     pass


-class LinesError(DatabaseError):
-    """taos_insert_lines errors."""
-
-    pass
-
-
-class TelnetLinesError(DatabaseError):
-    """taos_insert_telnet_lines errors."""
-
-    pass
-
-
-class JsonPayloadError(DatabaseError):
-    """taos_insert_json_payload errors."""
+class SchemalessError(DatabaseError):
+    """taos_schemaless_insert errors."""

     pass
src/connector/python/tests/test_lines.py

@@ -13,10 +13,10 @@ def conn():
     return connect()


-def test_insert_lines(conn):
+def test_schemaless_insert(conn):
     # type: (TaosConnection) -> None
-    dbname = "pytest_taos_insert_lines"
+    dbname = "pytest_taos_schemaless_insert"
     try:
         conn.execute("drop database if exists %s" % dbname)
         conn.execute("create database if not exists %s precision 'us'" % dbname)
@@ -27,13 +27,13 @@ def test_insert_lines(conn):
             'st,t1=4i64,t3="t4",t2=5f64,t4=5f64 c1=3i64,c3=L"passitagin",c2=true,c4=5f64,c5=5f64,c6=7u64 1626006933640000000ns',
             'stf,t1=4i64,t3="t4",t2=5f64,t4=5f64 c1=3i64,c3=L"passitagin_stf",c2=false,c5=5f64,c6=7u64 1626006933641000000ns',
         ]
-        conn.insert_lines(lines)
+        conn.schemaless_insert(lines, 0)
         print("inserted")
         lines = [
             'stf,t1=5i64,t3="t4",t2=5f64,t4=5f64 c1=3i64,c3=L"passitagin_stf",c2=false,c5=5f64,c6=7u64 1626006933641000000ns',
         ]
-        conn.insert_lines(lines)
+        conn.schemaless_insert(lines, 0)
         print("inserted")
         result = conn.query("select * from st")
         print(*result.fields)
@@ -54,4 +54,4 @@ def test_insert_lines(conn):
 if __name__ == "__main__":
-    test_insert_lines(connect())
+    test_schemaless_insert(connect())
src/inc/taos.h
...
@@ -187,11 +187,7 @@ DLL_EXPORT void taos_close_stream(TAOS_STREAM *tstr);

 DLL_EXPORT int taos_load_table_info(TAOS *taos, const char* tableNameList);

-DLL_EXPORT int taos_insert_lines(TAOS* taos, char* lines[], int numLines);
-
-DLL_EXPORT int taos_insert_telnet_lines(TAOS* taos, char* lines[], int numLines);
-
-DLL_EXPORT int taos_insert_json_payload(TAOS* taos, char* payload);
+DLL_EXPORT int taos_schemaless_insert(TAOS* taos, char* lines[], int numLines, int protocol);

 #ifdef __cplusplus
 }
...
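The Python connector reaches this symbol through ctypes. The following is only an illustrative sketch of what such a declaration could look like (the real binding lives in src/connector/python/taos/cinterface.py and may differ in detail; the helper name and the assumption that libtaos.so is on the loader path are hypothetical):

```python
import ctypes

libtaos = ctypes.CDLL("libtaos.so")  # assumption: client library is installed and resolvable

# mirror the C prototype shown above
libtaos.taos_schemaless_insert.restype = ctypes.c_int
libtaos.taos_schemaless_insert.argtypes = (
    ctypes.c_void_p,                  # TAOS* connection handle
    ctypes.POINTER(ctypes.c_char_p),  # char* lines[]
    ctypes.c_int,                     # numLines
    ctypes.c_int,                     # protocol: 0 line, 1 telnet, 2 JSON
)

def schemaless_insert(conn_handle, lines, protocol):
    # pack the Python strings into a char*[] before crossing into C
    arr = (ctypes.c_char_p * len(lines))(*[l.encode("utf-8") for l in lines])
    return libtaos.taos_schemaless_insert(conn_handle, arr, len(lines), protocol)
```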
src/inc/taoserror.h
...
@@ -112,6 +112,7 @@ int32_t* taosGetErrno();
 #define TSDB_CODE_TSC_INVALID_JSON_TYPE         TAOS_DEF_ERROR_CODE(0, 0x0222)  //"Invalid JSON data type")
 #define TSDB_CODE_TSC_INVALID_JSON_CONFIG       TAOS_DEF_ERROR_CODE(0, 0x0223)  //"Invalid JSON configuration")
 #define TSDB_CODE_TSC_VALUE_OUT_OF_RANGE        TAOS_DEF_ERROR_CODE(0, 0x0224)  //"Value out of range")
+#define TSDB_CODE_TSC_INVALID_PROTOCOL_TYPE     TAOS_DEF_ERROR_CODE(0, 0x0225)  //"Invalid line protocol type")

 // mnode
 #define TSDB_CODE_MND_MSG_NOT_PROCESSED         TAOS_DEF_ERROR_CODE(0, 0x0300)  //"Message not processed")
...
src/kit/taosdump/taosdump.c
...
@@ -1020,25 +1020,25 @@ static void dumpCreateMTableClause(
                 strcasecmp(tableDes->cols[counter].type, "nchar") == 0) {
             //pstr += sprintf(pstr, ", \'%s\'", tableDes->cols[counter].note);
             if (tableDes->cols[counter].var_value) {
-                pstr += sprintf(pstr, ", %s",
+                pstr += sprintf(pstr, ", \'%s\'",
                         tableDes->cols[counter].var_value);
             } else {
-                pstr += sprintf(pstr, ", %s", tableDes->cols[counter].value);
+                pstr += sprintf(pstr, ", \'%s\'", tableDes->cols[counter].value);
             }
         } else {
-            pstr += sprintf(pstr, ", %s", tableDes->cols[counter].value);
+            pstr += sprintf(pstr, ", \'%s\'", tableDes->cols[counter].value);
         }
     } else {
         if (strcasecmp(tableDes->cols[counter].type, "binary") == 0 ||
                 strcasecmp(tableDes->cols[counter].type, "nchar") == 0) {
             //pstr += sprintf(pstr, "\'%s\'", tableDes->cols[counter].note);
             if (tableDes->cols[counter].var_value) {
-                pstr += sprintf(pstr, "%s", tableDes->cols[counter].var_value);
+                pstr += sprintf(pstr, "\'%s\'", tableDes->cols[counter].var_value);
             } else {
-                pstr += sprintf(pstr, "%s", tableDes->cols[counter].value);
+                pstr += sprintf(pstr, "\'%s\'", tableDes->cols[counter].value);
             }
         } else {
-            pstr += sprintf(pstr, "%s", tableDes->cols[counter].value);
+            pstr += sprintf(pstr, "\'%s\'", tableDes->cols[counter].value);
         }
         /* pstr += sprintf(pstr, "%s", tableDes->cols[counter].note); */
     }
...
@@ -1149,6 +1149,10 @@ static int64_t dumpNormalTable(
         colCount = getTableDes(dbName, tbName, tableDes, false);

         if (colCount < 0) {
+            errorPrint("%s() LN%d, failed to get table[%s] schema\n",
+                    __func__, __LINE__, tbName);
             free(tableDes);
             return -1;
         }
...
@@ -1160,6 +1164,10 @@ static int64_t dumpNormalTable(
         colCount = getTableDes(dbName, tbName, tableDes, false);

         if (colCount < 0) {
+            errorPrint("%s() LN%d, failed to get table[%s] schema\n",
+                    __func__, __LINE__, tbName);
             free(tableDes);
             return -1;
         }
...
@@ -1172,20 +1180,21 @@ static int64_t dumpNormalTable(
     if (g_args.avro) {
         if (0 != convertTbDesToAvroSchema(
                     dbName, tbName, tableDes, colCount, &jsonAvroSchema)) {
             errorPrint("%s() LN%d, convertTbDesToAvroSchema failed\n",
                     __func__, __LINE__);
             freeTbDes(tableDes);
             return -1;
         }
     }

-    tfree(tableDes);
-
     int64_t ret = 0;
     if (!g_args.schemaonly) {
         ret = dumpTableData(fp, tbName, dbName, precision,
                 jsonAvroSchema);
     }

     tfree(jsonAvroSchema);
+    freeTbDes(tableDes);
     return ret;
 }
...
@@ -1282,20 +1291,23 @@ static void *dumpNtbOfDb(void *arg) {
         return NULL;
     }

+    int64_t count;
     for (int64_t i = 0; i < pThreadInfo->tablesOfDumpOut; i++) {
         debugPrint("[%d] No.\t%" PRId64 " table name: %s\n",
                 pThreadInfo->threadIndex, i,
                 ((TableInfo *)(g_tablesList + pThreadInfo->tableFrom+i))->name);
-        dumpNormalTable(
+        count = dumpNormalTable(
                 pThreadInfo->dbName,
                 ((TableInfo *)(g_tablesList + pThreadInfo->tableFrom+i))->stable,
                 ((TableInfo *)(g_tablesList + pThreadInfo->tableFrom+i))->name,
                 pThreadInfo->precision,
                 fp);
+        if (count < 0) {
+            break;
+        }
     }

     fclose(fp);
     return NULL;
 }
...
@@ -1341,16 +1353,20 @@ static void *dumpNormalTablesOfStb(void *arg) {
     TAOS_ROW row = NULL;
     int64_t i = 0;
+    int64_t count;
     while((row = taos_fetch_row(res)) != NULL) {
         debugPrint("[%d] sub table %" PRId64 ": name: %s\n",
                 pThreadInfo->threadIndex, i++, (char *)row[TSDB_SHOW_TABLES_NAME_INDEX]);
-        dumpNormalTable(
+        count = dumpNormalTable(
                 pThreadInfo->dbName,
                 pThreadInfo->stbName,
                 (char *)row[TSDB_SHOW_TABLES_NAME_INDEX],
                 pThreadInfo->precision,
                 fp);
+        if (count < 0) {
+            break;
+        }
     }

     fclose(fp);
...
@@ -2007,9 +2023,9 @@ static int getTableDes(
         if (row[TSDB_SHOW_TABLES_NAME_INDEX] == NULL) {
             sprintf(tableDes->cols[i].note, "%s", "NUL");
+            sprintf(tableDes->cols[i].value, "%s", "NULL");
             taos_free_result(res);
             res = NULL;
+            taos_close(taos);
             continue;
         }
...
@@ -2051,26 +2067,22 @@ static int getTableDes(
             int len = strlen((char *)row[0]);
             // FIXME for long value
             if (len < (COL_VALUEBUF_LEN - 2)) {
-                tableDes->cols[i].value[0] = '\'';
                 converStringToReadable(
                         (char *)row[0],
                         length[0],
-                        tableDes->cols[i].value + 1,
+                        tableDes->cols[i].value,
                         len);
-                tableDes->cols[i].value[len+1] = '\'';
             } else {
-                tableDes->cols[i].var_value = calloc(1, len + 2);
+                tableDes->cols[i].var_value = calloc(1, len * 2);
                 if (tableDes->cols[i].var_value == NULL) {
                     errorPrint("%s() LN%d, memory alalocation failed!\n",
                             __func__, __LINE__);
                     taos_free_result(res);
                     return -1;
                 }
-                tableDes->cols[i].var_value[0] = '\'';
                 converStringToReadable((char *)row[0],
                         length[0],
-                        (char *)(tableDes->cols[i].var_value + 1), len);
-                tableDes->cols[i].var_value[len+1] = '\'';
+                        (char *)(tableDes->cols[i].var_value), len);
             }
             break;
...
src/os/src/detail/osTime.c
...
@@ -411,8 +411,10 @@ int64_t convertTimePrecision(int64_t time, int32_t fromPrecision, int32_t toPrec
                     return time;
                 }
             } //end from nano
-        default:
+        default: {
             assert(0);
+            return time;  // only to pass windows compilation
+        }
     } //end switch fromPrecision
 }
...
src/util/src/terror.c
...
@@ -120,6 +120,7 @@ TAOS_DEFINE_ERROR(TSDB_CODE_TSC_INVALID_JSON,       "Invalid JSON format")
 TAOS_DEFINE_ERROR(TSDB_CODE_TSC_INVALID_JSON_TYPE,       "Invalid JSON data type")
 TAOS_DEFINE_ERROR(TSDB_CODE_TSC_INVALID_JSON_CONFIG,     "Invalid JSON configuration")
 TAOS_DEFINE_ERROR(TSDB_CODE_TSC_VALUE_OUT_OF_RANGE,      "Value out of range")
+TAOS_DEFINE_ERROR(TSDB_CODE_TSC_INVALID_PROTOCOL_TYPE,   "Invalid line protocol type")

 // mnode
 TAOS_DEFINE_ERROR(TSDB_CODE_MND_MSG_NOT_PROCESSED,       "Message not processed")
...
tests/examples/c/apitest.c
...
@@ -980,40 +980,40 @@ int32_t verify_schema_less(TAOS* taos) {
         "stf,t1=4i64,t3=\"t4\",t2=5f64,t4=5f64 c1=3i64,c3=L\"passitagin_stf\",c2=false,c5=5f64,c6=7u64 1626006933641000000ns"
     };
-    code = taos_insert_lines(taos, lines, sizeof(lines)/sizeof(char*));
+    code = taos_schemaless_insert(taos, lines, sizeof(lines)/sizeof(char*), 0);

     char* lines2[] = {
         "stg,t1=3i64,t2=4f64,t3=\"t3\" c1=3i64,c3=L\"passit\",c2=false,c4=4f64 1626006833639000000ns",
         "stg,t1=4i64,t3=\"t4\",t2=5f64,t4=5f64 c1=3i64,c3=L\"passitagin\",c2=true,c4=5f64,c5=5f64 1626006833640000000ns"
     };
-    code = taos_insert_lines(taos, &lines2[0], 1);
-    code = taos_insert_lines(taos, &lines2[1], 1);
+    code = taos_schemaless_insert(taos, &lines2[0], 1, 0);
+    code = taos_schemaless_insert(taos, &lines2[1], 1, 0);

     char* lines3[] = {
         "sth,t1=4i64,t2=5f64,t4=5f64,ID=\"childtable\" c1=3i64,c3=L\"passitagin_stf\",c2=false,c5=5f64,c6=7u64 1626006933641ms",
         "sth,t1=4i64,t2=5f64,t4=5f64 c1=3i64,c3=L\"passitagin_stf\",c2=false,c5=5f64,c6=7u64 1626006933654ms"
     };
-    code = taos_insert_lines(taos, lines3, 2);
+    code = taos_schemaless_insert(taos, lines3, 2, 0);

     char* lines4[] = {
         "st123456,t1=3i64,t2=4f64,t3=\"t3\" c1=3i64,c3=L\"passit\",c2=false,c4=4f64 1626006833639000000ns",
         "dgtyqodr,t2=5f64,t3=L\"ste\" c1=tRue,c2=4i64,c3=\"iam\" 1626056811823316532ns"
     };
-    code = taos_insert_lines(taos, lines4, 2);
+    code = taos_schemaless_insert(taos, lines4, 2, 0);

     char* lines5[] = {
         "zqlbgs,id=\"zqlbgs_39302_21680\",t0=f,t1=127i8,t2=32767i16,t3=2147483647i32,t4=9223372036854775807i64,t5=11.12345f32,t6=22.123456789f64,t7=\"binaryTagValue\",t8=L\"ncharTagValue\" c0=f,c1=127i8,c2=32767i16,c3=2147483647i32,c4=9223372036854775807i64,c5=11.12345f32,c6=22.123456789f64,c7=\"binaryColValue\",c8=L\"ncharColValue\",c9=7u64 1626006833639000000ns",
         "zqlbgs,t9=f,id=\"zqlbgs_39302_21680\",t0=f,t1=127i8,t11=127i8,t2=32767i16,t3=2147483647i32,t4=9223372036854775807i64,t5=11.12345f32,t6=22.123456789f64,t7=\"binaryTagValue\",t8=L\"ncharTagValue\",t10=L\"ncharTagValue\" c10=f,c0=f,c1=127i8,c12=127i8,c2=32767i16,c3=2147483647i32,c4=9223372036854775807i64,c5=11.12345f32,c6=22.123456789f64,c7=\"binaryColValue\",c8=L\"ncharColValue\",c9=7u64,c11=L\"ncharColValue\" 1626006833639000000ns"
     };
-    code = taos_insert_lines(taos, &lines5[0], 1);
-    code = taos_insert_lines(taos, &lines5[1], 1);
+    code = taos_schemaless_insert(taos, &lines5[0], 1, 0);
+    code = taos_schemaless_insert(taos, &lines5[1], 1, 0);

     char* lines6[] = {
         "st123456,t1=3i64,t2=4f64,t3=\"t3\" c1=3i64,c3=L\"passit\",c2=false,c4=4f64 1626006833639000000ns",
         "dgtyqodr,t2=5f64,t3=L\"ste\" c1=tRue,c2=4i64,c3=\"iam\" 1626056811823316532ns"
     };
-    code = taos_insert_lines(taos, lines6, 2);
+    code = taos_schemaless_insert(taos, lines6, 2, 0);
     return (code);
 }
...
tests/examples/c/schemaless.c
...
@@ -77,9 +77,9 @@ int main(int argc, char* argv[]) {
     }

     //shuffle(lines, numSuperTables * numChildTables * numRowsPerChildTable);

-    printf("%s\n", "begin taos_insert_lines");
+    printf("%s\n", "begin taos_schemaless_insert");
     int64_t begin = getTimeInUs();
-    int32_t code = taos_insert_lines(taos, lines, numSuperTables * numChildTables * numRowsPerChildTable);
+    int32_t code = taos_schemaless_insert(taos, lines, numSuperTables * numChildTables * numRowsPerChildTable, 0);
     int64_t end = getTimeInUs();
     printf("code: %d, %s. time used: %" PRId64 "\n", code, tstrerror(code), end-begin);
...
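The same kind of timing measurement can be reproduced from the Python connector against the renamed method. A small sketch (not part of this commit; database name, table name, and row count are illustrative):

```python
import time
import taos

conn = taos.connect()
conn.execute("drop database if exists sml_perf")
conn.execute("create database if not exists sml_perf precision 'us'")
conn.select_db("sml_perf")

# 100 line-protocol rows on one child table, each with a distinct ns timestamp
lines = [
    'perf_stb,t1=3i64,t2=4f64 c1=3i64,c2=false 16260068336390000{:02d}ns'.format(i)
    for i in range(100)
]

begin = time.time()
code = conn.schemaless_insert(lines, 0)  # protocol 0: line protocol
print("code: {}, time used: {:.3f}s".format(code, time.time() - begin))
```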
tests/pytest/crash_gen/valgrind_taos.supp
...
@@ -18230,4 +18230,21 @@
    fun:__pyx_pw_5numpy_6random_13bit_generator_12BitGenerator_1__init__
    obj:/usr/bin/python3.8
    obj:/usr/bin/python3.8
 }
+{
+   <insert_a_suppression_name_here>
+   Memcheck:Leak
+   match-leak-kinds: definite
+   fun:malloc
+   obj:/usr/bin/python3.8
+   fun:_PyObject_MakeTpCall
+   fun:_PyEval_EvalFrameDefault
+   obj:/usr/bin/python3.8
+   fun:_PyObject_MakeTpCall
+   fun:_PyEval_EvalFrameDefault
+   obj:/usr/bin/python3.8
+   fun:_PyEval_EvalFrameDefault
+   obj:/usr/bin/python3.8
+   fun:_PyEval_EvalFrameDefault
+   fun:_PyEval_EvalCodeWithName
+}
\ No newline at end of file
tests/pytest/fulltest.sh
...
@@ -48,7 +48,7 @@ python3 ./test.py -f table/del_stable.py

 #stable
 python3 ./test.py -f stable/insert.py
-# python3 test.py -f tools/taosdemoAllTest/taosdemoTestInsertWithJsonStmt.py
+python3 test.py -f tools/taosdemoAllTest/taosdemoTestInsertWithJsonStmt.py

 # tag
 python3 ./test.py -f tag_lite/filter.py
...
@@ -217,8 +217,9 @@ python3 ./test.py -f perfbenchmark/bug3433.py
 python3 ./test.py -f perfbenchmark/taosdemoInsert.py

 #taosdemo
-# python3 test.py -f tools/taosdemoAllTest/taosdemoTestInsertWithJson.py
+python3 test.py -f tools/taosdemoAllTest/taosdemoTestInsertWithJson.py
 python3 test.py -f tools/taosdemoAllTest/taosdemoTestQueryWithJson.py
+python3 test.py -f tools/taosdemoAllTest/taosdemoTestInsertAllType.py

 #query
 python3 test.py -f query/distinctOneColTb.py
...
tests/pytest/insert/insertJSONPayload.py
...
@@ -33,7 +33,7 @@ class TDTestCase:
         ### Default format ###
         ### metric ###
         print("============= step0 : test metric ================")
-        payload = '''
+        payload = ['''
         {
             "metric": ".stb.0.",
             "timestamp": 1626006833610123,
...
@@ -45,16 +45,16 @@ class TDTestCase:
             "t4": "123_abc_.!@#$%^&*:;,./?|+-=()[]{}<>"
             }
         }
-        '''
-        code = self._conn.insert_json_payload(payload)
-        print("insert_json_payload result {}".format(code))
+        ''']
+        code = self._conn.schemaless_insert(payload, 2)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("describe _stb_0_")
         tdSql.checkRows(6)

         ### metric value ###
         print("============= step1 : test metric value types ================")
-        payload = '''
+        payload = ['''
         {
             "metric": "stb0_0",
             "timestamp": 1626006833610123,
...
@@ -66,14 +66,14 @@ class TDTestCase:
             "t4": "123_abc_.!@#$%^&*:;,./?|+-=()[]{}<>"
             }
         }
-        '''
-        code = self._conn.insert_json_payload(payload)
-        print("insert_json_payload result {}".format(code))
+        ''']
+        code = self._conn.schemaless_insert(payload, 2)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("describe stb0_0")
         tdSql.checkData(1, 1, "BIGINT")

-        payload = '''
+        payload = ['''
         {
             "metric": "stb0_1",
             "timestamp": 1626006833610123,
...
@@ -85,14 +85,14 @@ class TDTestCase:
             "t4": "123_abc_.!@#$%^&*:;,./?|+-=()[]{}<>"
             }
         }
-        '''
-        code = self._conn.insert_json_payload(payload)
-        print("insert_json_payload result {}".format(code))
+        ''']
+        code = self._conn.schemaless_insert(payload, 2)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("describe stb0_1")
         tdSql.checkData(1, 1, "BOOL")

-        payload = '''
+        payload = ['''
         {
             "metric": "stb0_2",
             "timestamp": 1626006833610123,
...
@@ -104,14 +104,14 @@ class TDTestCase:
             "t4": "123_abc_.!@#$%^&*:;,./?|+-=()[]{}<>"
             }
         }
-        '''
-        code = self._conn.insert_json_payload(payload)
-        print("insert_json_payload result {}".format(code))
+        ''']
+        code = self._conn.schemaless_insert(payload, 2)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("describe stb0_2")
         tdSql.checkData(1, 1, "BOOL")

-        payload = '''
+        payload = ['''
         {
             "metric": "stb0_3",
             "timestamp": 1626006833610123,
...
@@ -123,14 +123,14 @@ class TDTestCase:
             "t4": "123_abc_.!@#$%^&*:;,./?|+-=()[]{}<>"
             }
         }
-        '''
-        code = self._conn.insert_json_payload(payload)
-        print("insert_json_payload result {}".format(code))
+        ''']
+        code = self._conn.schemaless_insert(payload, 2)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("describe stb0_3")
         tdSql.checkData(1, 1, "BINARY")

-        payload = '''
+        payload = ['''
         {
             "metric": "stb0_4",
             "timestamp": 1626006833610123,
...
@@ -142,14 +142,14 @@ class TDTestCase:
             "t4": "123_abc_.!@#$%^&*:;,./?|+-=()[]{}<>"
             }
         }
-        '''
-        code = self._conn.insert_json_payload(payload)
-        print("insert_json_payload result {}".format(code))
+        ''']
+        code = self._conn.schemaless_insert(payload, 2)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("describe stb0_4")
         tdSql.checkData(1, 1, "DOUBLE")

-        payload = '''
+        payload = ['''
         {
             "metric": "stb0_5",
             "timestamp": 1626006833610123,
...
@@ -161,9 +161,9 @@ class TDTestCase:
             "t4": "123_abc_.!@#$%^&*:;,./?|+-=()[]{}<>"
             }
         }
-        '''
-        code = self._conn.insert_json_payload(payload)
-        print("insert_json_payload result {}".format(code))
+        ''']
+        code = self._conn.schemaless_insert(payload, 2)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("describe stb0_5")
         tdSql.checkData(1, 1, "DOUBLE")
...
@@ -171,7 +171,7 @@ class TDTestCase:
         print("============= step2 : test timestamp ================")
         ### timestamp 0 ###
-        payload = '''
+        payload = ['''
         {
             "metric": "stb0_6",
             "timestamp": 0,
...
@@ -183,37 +183,14 @@ class TDTestCase:
             "t4": "123_abc_.!@#$%^&*:;,./?|+-=()[]{}<>"
             }
         }
-        '''
-        code = self._conn.insert_json_payload(payload)
-        print("insert_json_payload result {}".format(code))
+        ''']
+        code = self._conn.schemaless_insert(payload, 2)
+        print("schemaless_insert result {}".format(code))

         print("============= step3 : test tags ================")
-        ### ID ###
-        payload = '''
-        {
-            "metric": "stb0_7",
-            "timestamp": 0,
-            "value": 123,
-            "tags": {
-                "ID": "tb0_7",
-                "t1": true,
-                "iD": "tb000",
-                "t2": false,
-                "t3": 10,
-                "t4": "123_abc_.!@#$%^&*:;,./?|+-=()[]{}<>",
-                "id": "tb555"
-            }
-        }
-        '''
-        code = self._conn.insert_json_payload(payload)
-        print("insert_json_payload result {}".format(code))
-        tdSql.query("select tbname from stb0_7")
-        tdSql.checkData(0, 0, "tb0_7")

         ### Default tag numeric types ###
-        payload = '''
+        payload = ['''
         {
             "metric": "stb0_8",
             "timestamp": 0,
...
@@ -222,14 +199,14 @@ class TDTestCase:
             "t1": 123
             }
         }
-        '''
-        code = self._conn.insert_json_payload(payload)
-        print("insert_json_payload result {}".format(code))
+        ''']
+        code = self._conn.schemaless_insert(payload, 2)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("describe stb0_8")
         tdSql.checkData(2, 1, "BIGINT")

-        payload = '''
+        payload = ['''
         {
             "metric": "stb0_9",
             "timestamp": 0,
...
@@ -238,14 +215,14 @@ class TDTestCase:
             "t1": 123.00
             }
         }
-        '''
-        code = self._conn.insert_json_payload(payload)
-        print("insert_json_payload result {}".format(code))
+        ''']
+        code = self._conn.schemaless_insert(payload, 2)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("describe stb0_9")
         tdSql.checkData(2, 1, "DOUBLE")

-        payload = '''
+        payload = ['''
         {
             "metric": "stb0_10",
             "timestamp": 0,
...
@@ -254,9 +231,9 @@ class TDTestCase:
             "t1": 123E-1
             }
         }
-        '''
-        code = self._conn.insert_json_payload(payload)
-        print("insert_json_payload result {}".format(code))
+        ''']
+        code = self._conn.schemaless_insert(payload, 2)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("describe stb0_10")
         tdSql.checkData(2, 1, "DOUBLE")
...
@@ -265,7 +242,7 @@ class TDTestCase:
         print("============= step4 : test nested format ================")
         ### timestamp ###
         #seconds
-        payload = '''
+        payload = ['''
         {
             "metric": "stb1_0",
             "timestamp": {
...
@@ -280,15 +257,15 @@ class TDTestCase:
             "t4": "123_abc_.!@#$%^&*:;,./?|+-=()[]{}<>"
             }
         }
-        '''
-        code = self._conn.insert_json_payload(payload)
-        print("insert_json_payload result {}".format(code))
+        ''']
+        code = self._conn.schemaless_insert(payload, 2)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("select ts from stb1_0")
         tdSql.checkData(0, 0, "2021-07-11 20:33:53.000000")

         #milliseconds
-        payload = '''
+        payload = ['''
         {
             "metric": "stb1_1",
             "timestamp": {
...
@@ -303,15 +280,15 @@ class TDTestCase:
             "t4": "123_abc_.!@#$%^&*:;,./?|+-=()[]{}<>"
             }
         }
-        '''
-        code = self._conn.insert_json_payload(payload)
-        print("insert_json_payload result {}".format(code))
+        ''']
+        code = self._conn.schemaless_insert(payload, 2)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("select ts from stb1_1")
         tdSql.checkData(0, 0, "2021-07-11 20:33:53.610000")

         #microseconds
-        payload = '''
+        payload = ['''
         {
             "metric": "stb1_2",
             "timestamp": {
...
@@ -326,19 +303,19 @@ class TDTestCase:
             "t4": "123_abc_.!@#$%^&*:;,./?|+-=()[]{}<>"
             }
         }
-        '''
-        code = self._conn.insert_json_payload(payload)
-        print("insert_json_payload result {}".format(code))
+        ''']
+        code = self._conn.schemaless_insert(payload, 2)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("select ts from stb1_2")
         tdSql.checkData(0, 0, "2021-07-11 20:33:53.610123")

         #nanoseconds
-        payload = '''
+        payload = ['''
         {
             "metric": "stb1_3",
             "timestamp": {
-                "value": 1.6260068336101233e+18,
+                "value": 1626006833610123321,
                 "type": "ns"
             },
             "value": 10,
...
@@ -349,16 +326,16 @@ class TDTestCase:
             "t4": "123_abc_.!@#$%^&*:;,./?|+-=()[]{}<>"
             }
         }
-        '''
-        code = self._conn.insert_json_payload(payload)
-        print("insert_json_payload result {}".format(code))
+        ''']
+        code = self._conn.schemaless_insert(payload, 2)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("select ts from stb1_3")
         tdSql.checkData(0, 0, "2021-07-11 20:33:53.610123")

         #now
         tdSql.execute('use test')
-        payload = '''
+        payload = ['''
         {
             "metric": "stb1_4",
             "timestamp": {
...
@@ -373,12 +350,12 @@ class TDTestCase:
             "t4": "123_abc_.!@#$%^&*:;,./?|+-=()[]{}<>"
             }
         }
-        '''
-        code = self._conn.insert_json_payload(payload)
-        print("insert_json_payload result {}".format(code))
+        ''']
+        code = self._conn.schemaless_insert(payload, 2)
+        print("schemaless_insert result {}".format(code))

         ### metric value ###
-        payload = '''
+        payload = ['''
         {
             "metric": "stb2_0",
             "timestamp": {
...
@@ -396,14 +373,14 @@ class TDTestCase:
             "t4": "123_abc_.!@#$%^&*:;,./?|+-=()[]{}<>"
             }
         }
-        '''
-        code = self._conn.insert_json_payload(payload)
-        print("insert_json_payload result {}".format(code))
+        ''']
+        code = self._conn.schemaless_insert(payload, 2)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("describe stb2_0")
         tdSql.checkData(1, 1, "BOOL")

-        payload = '''
+        payload = ['''
         {
             "metric": "stb2_1",
             "timestamp": {
...
@@ -421,14 +398,14 @@ class TDTestCase:
             "t4": "123_abc_.!@#$%^&*:;,./?|+-=()[]{}<>"
             }
         }
-        '''
-        code = self._conn.insert_json_payload(payload)
-        print("insert_json_payload result {}".format(code))
+        ''']
+        code = self._conn.schemaless_insert(payload, 2)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("describe stb2_1")
         tdSql.checkData(1, 1, "TINYINT")

-        payload = '''
+        payload = ['''
         {
             "metric": "stb2_2",
             "timestamp": {
...
@@ -446,14 +423,14 @@ class TDTestCase:
             "t4": "123_abc_.!@#$%^&*:;,./?|+-=()[]{}<>"
             }
         }
-        '''
-        code = self._conn.insert_json_payload(payload)
-        print("insert_json_payload result {}".format(code))
+        ''']
+        code = self._conn.schemaless_insert(payload, 2)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("describe stb2_2")
         tdSql.checkData(1, 1, "SMALLINT")

-        payload = '''
+        payload = ['''
         {
             "metric": "stb2_3",
             "timestamp": {
...
@@ -471,14 +448,14 @@ class TDTestCase:
             "t4": "123_abc_.!@#$%^&*:;,./?|+-=()[]{}<>"
             }
         }
-        '''
-        code = self._conn.insert_json_payload(payload)
-        print("insert_json_payload result {}".format(code))
+        ''']
+        code = self._conn.schemaless_insert(payload, 2)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("describe stb2_3")
         tdSql.checkData(1, 1, "INT")

-        payload = '''
+        payload = ['''
         {
             "metric": "stb2_4",
             "timestamp": {
...
@@ -496,14 +473,14 @@ class TDTestCase:
             "t4": "123_abc_.!@#$%^&*:;,./?|+-=()[]{}<>"
             }
         }
-        '''
-        code = self._conn.insert_json_payload(payload)
-        print("insert_json_payload result {}".format(code))
+        ''']
+        code = self._conn.schemaless_insert(payload, 2)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("describe stb2_4")
         tdSql.checkData(1, 1, "BIGINT")

-        payload = '''
+        payload = ['''
         {
             "metric": "stb2_5",
             "timestamp": {
...
@@ -521,14 +498,14 @@ class TDTestCase:
             "t4": "123_abc_.!@#$%^&*:;,./?|+-=()[]{}<>"
             }
         }
-        '''
-        code = self._conn.insert_json_payload(payload)
-        print("insert_json_payload result {}".format(code))
+        ''']
+        code = self._conn.schemaless_insert(payload, 2)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("describe stb2_5")
         tdSql.checkData(1, 1, "FLOAT")

-        payload = '''
+        payload = ['''
         {
             "metric": "stb2_6",
             "timestamp": {
...
@@ -546,14 +523,14 @@ class TDTestCase:
             "t4": "123_abc_.!@#$%^&*:;,./?|+-=()[]{}<>"
             }
         }
-        '''
-        code = self._conn.insert_json_payload(payload)
-        print("insert_json_payload result {}".format(code))
+        ''']
+        code = self._conn.schemaless_insert(payload, 2)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("describe stb2_6")
         tdSql.checkData(1, 1, "DOUBLE")

-        payload = '''
+        payload = ['''
         {
             "metric": "stb2_7",
             "timestamp": {
...
@@ -571,14 +548,14 @@ class TDTestCase:
             "t4": "123_abc_.!@#$%^&*:;,./?|+-=()[]{}<>"
             }
         }
-        '''
-        code = self._conn.insert_json_payload(payload)
-        print("insert_json_payload result {}".format(code))
+        ''']
+        code = self._conn.schemaless_insert(payload, 2)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("describe stb2_7")
         tdSql.checkData(1, 1, "BINARY")

-        payload = '''
+        payload = ['''
         {
             "metric": "stb2_8",
             "timestamp": {
...
@@ -596,16 +573,16 @@ class TDTestCase:
             "t4": "123_abc_.!@#$%^&*:;,./?|+-=()[]{}<>"
             }
         }
-        '''
-        code = self._conn.insert_json_payload(payload)
-        print("insert_json_payload result {}".format(code))
+        ''']
+        code = self._conn.schemaless_insert(payload, 2)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("describe stb2_8")
         tdSql.checkData(1, 1, "NCHAR")

         ### tag value ###
-        payload = '''
+        payload = ['''
         {
             "metric": "stb3_0",
             "timestamp": {
...
@@ -655,9 +632,9 @@ class TDTestCase:
             }
             }
         }
-        '''
-        code = self._conn.insert_json_payload(payload)
-        print("insert_json_payload result {}".format(code))
+        ''']
+        code = self._conn.schemaless_insert(payload, 2)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("describe stb3_0")
         tdSql.checkData(2, 1, "BOOL")
...
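Outside the test harness, the protocol-2 path can be exercised directly from the connector. A minimal sketch (not part of this commit; metric and tag names are illustrative, and the `test` database is assumed to exist):

```python
import taos

conn = taos.connect()
conn.select_db("test")  # assumes the database was created beforehand

# schemaless_insert expects a list of JSON documents when protocol is 2
payload = ['''
{
    "metric": "cpu_load_json",
    "timestamp": {"value": 1626006833610123, "type": "us"},
    "value": 55.5,
    "tags": {
        "host": "ubuntu",
        "interface": "eth0"
    }
}
''']
code = conn.schemaless_insert(payload, 2)  # protocol 2: OpenTSDB JSON
print("schemaless_insert result {}".format(code))
```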
tests/pytest/insert/insertTelnetLines.py
...
@@ -39,8 +39,8 @@ class TDTestCase:
             ".stb0.3. 1626006833639000000ns 4i8 host=\"host0\" interface=\"eth0\"",
         ]
-        code = self._conn.insert_telnet_lines(lines0)
-        print("insert_telnet_lines result {}".format(code))
+        code = self._conn.schemaless_insert(lines0, 1)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("show stables")
         tdSql.checkRows(4)
...
@@ -68,8 +68,8 @@ class TDTestCase:
             "stb1 0 6i8 host=\"host0\"",
         ]
-        code = self._conn.insert_telnet_lines(lines1)
-        print("insert_telnet_lines result {}".format(code))
+        code = self._conn.schemaless_insert(lines1, 1)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("select * from stb1")
         tdSql.checkRows(6)
...
@@ -82,8 +82,8 @@ class TDTestCase:
             "stb2_0 1626006833651ms -127i8 host=\"host0\"",
             "stb2_0 1626006833652ms 127i8 host=\"host0\""
         ]
-        code = self._conn.insert_telnet_lines(lines2_0)
-        print("insert_telnet_lines result {}".format(code))
+        code = self._conn.schemaless_insert(lines2_0, 1)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("select * from stb2_0")
         tdSql.checkRows(2)
...
@@ -97,8 +97,8 @@ class TDTestCase:
             "stb2_1 1626006833651ms -32767i16 host=\"host0\"",
             "stb2_1 1626006833652ms 32767i16 host=\"host0\""
         ]
-        code = self._conn.insert_telnet_lines(lines2_1)
-        print("insert_telnet_lines result {}".format(code))
+        code = self._conn.schemaless_insert(lines2_1, 1)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("select * from stb2_1")
         tdSql.checkRows(2)
...
@@ -113,8 +113,8 @@ class TDTestCase:
             "stb2_2 1626006833652ms 2147483647i32 host=\"host0\""
         ]
-        code = self._conn.insert_telnet_lines(lines2_2)
-        print("insert_telnet_lines result {}".format(code))
+        code = self._conn.schemaless_insert(lines2_2, 1)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("select * from stb2_2")
         tdSql.checkRows(2)
...
@@ -130,8 +130,8 @@ class TDTestCase:
             "stb2_3 1626006833662ms 9223372036854775807 host=\"host0\""
         ]
-        code = self._conn.insert_telnet_lines(lines2_3)
-        print("insert_telnet_lines result {}".format(code))
+        code = self._conn.schemaless_insert(lines2_3, 1)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("select * from stb2_3")
         tdSql.checkRows(3)
...
@@ -154,8 +154,8 @@ class TDTestCase:
             "stb2_4 1626006833710ms -3.4E38f32 host=\"host0\""
         ]
-        code = self._conn.insert_telnet_lines(lines2_4)
-        print("insert_telnet_lines result {}".format(code))
+        code = self._conn.schemaless_insert(lines2_4, 1)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("select * from stb2_4")
         tdSql.checkRows(10)
...
@@ -179,8 +179,8 @@ class TDTestCase:
             "stb2_5 1626006833710ms 3.15 host=\"host0\""
         ]
-        code = self._conn.insert_telnet_lines(lines2_5)
-        print("insert_telnet_lines result {}".format(code))
+        code = self._conn.schemaless_insert(lines2_5, 1)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("select * from stb2_5")
         tdSql.checkRows(11)
...
@@ -203,8 +203,8 @@ class TDTestCase:
             "stb2_6 1626006833700ms FALSE host=\"host0\""
         ]
-        code = self._conn.insert_telnet_lines(lines2_6)
-        print("insert_telnet_lines result {}".format(code))
+        code = self._conn.schemaless_insert(lines2_6, 1)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("select * from stb2_6")
         tdSql.checkRows(10)
...
@@ -220,8 +220,8 @@ class TDTestCase:
             "stb2_7 1626006833630ms \"binary_val.()[]{}<>\" host=\"host0\""
         ]
-        code = self._conn.insert_telnet_lines(lines2_7)
-        print("insert_telnet_lines result {}".format(code))
+        code = self._conn.schemaless_insert(lines2_7, 1)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("select * from stb2_7")
         tdSql.checkRows(3)
...
@@ -236,8 +236,8 @@ class TDTestCase:
             "stb2_8 1626006833620ms L\"nchar_val数值二\" host=\"host0\""
         ]
-        code = self._conn.insert_telnet_lines(lines2_8)
-        print("insert_telnet_lines result {}".format(code))
+        code = self._conn.schemaless_insert(lines2_8, 1)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("select * from stb2_8")
         tdSql.checkRows(2)
...
@@ -254,8 +254,8 @@ class TDTestCase:
             "stb3_0 1626006833610ms 2 t1=-127i8 t2=-32767i16 t3=-2147483647i32 t4=-9223372036854775807i64 t5=-3.4E38f32 t6=-1.7E308f64 t7=false t8=\"binary_val_2\" t9=L\"标签值2\""
         ]
-        code = self._conn.insert_telnet_lines(lines3_0)
-        print("insert_telnet_lines result {}".format(code))
+        code = self._conn.schemaless_insert(lines3_0, 1)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("select * from stb3_0")
         tdSql.checkRows(2)
...
@@ -298,8 +298,8 @@ class TDTestCase:
             "stb3_1 1626006833610ms 3 ID=\"child_table3\" host=\"host3\""
         ]
-        code = self._conn.insert_telnet_lines(lines3_1)
-        print("insert_telnet_lines result {}".format(code))
+        code = self._conn.schemaless_insert(lines3_1, 1)
+        print("schemaless_insert result {}".format(code))
         tdSql.query("select * from stb3_1")
         tdSql.checkRows(3)
...
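For the telnet-style path the call shape is identical to the other protocols; only the protocol code and the line layout (metric, timestamp, value, then tag pairs) change. A minimal sketch (not part of this commit; metric, host, and tag values are illustrative, and the `test` database is assumed to exist):

```python
import taos

conn = taos.connect()
conn.select_db("test")  # assumes the database was created beforehand

lines = [
    'cpu_load 1626006833639000000ns 2.0f32 host="host0" interface="eth0"',
    'cpu_load 1626006833640000000ns 3.0f32 host="host1" interface="eth0"',
]
code = conn.schemaless_insert(lines, 1)  # protocol 1: OpenTSDB telnet lines
print("schemaless_insert result {}".format(code))
```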
tests/pytest/insert/line_insert.py
...
@@ -42,18 +42,18 @@ class TDTestCase:
                    "stf,t1=4i64,t3=\"t4\",t2=5f64,t4=5f64 c1=3i64,c3=L\"passitagin_stf\",c2=false,c5=5f64,c6=7u64 1626006933641000000ns"
                    ]
-        code = self._conn.insert_lines(lines)
-        print("insert_lines result {}".format(code))
+        code = self._conn.schemaless_insert(lines, 0)
+        print("schemaless_insert result {}".format(code))

         lines2 = [  "stg,t1=3i64,t2=4f64,t3=\"t3\" c1=3i64,c3=L\"passit\",c2=false,c4=4f64 1626006833639000000ns",
                     "stg,t1=4i64,t3=\"t4\",t2=5f64,t4=5f64 c1=3i64,c3=L\"passitagin\",c2=true,c4=5f64,c5=5f64 1626006833640000000ns"
                     ]
-        code = self._conn.insert_lines([lines2[0]])
-        print("insert_lines result {}".format(code))
-        self._conn.insert_lines([lines2[1]])
+        code = self._conn.schemaless_insert([lines2[0]], 0)
+        print("schemaless_insert result {}".format(code))
+        self._conn.schemaless_insert([lines2[1]], 0)
+        print("schemaless_insert result {}".format(code))

         tdSql.query("select * from st")
         tdSql.checkRows(4)
...
@@ -73,10 +73,10 @@ class TDTestCase:
         tdSql.query("describe stf")
         tdSql.checkData(2, 2, 14)

-        self._conn.insert_lines([
+        self._conn.schemaless_insert([
             "sth,t1=4i64,t2=5f64,t4=5f64,ID=\"childtable\" c1=3i64,c3=L\"passitagin_stf\",c2=false,c5=5f64,c6=7u64 1626006933641ms",
             "sth,t1=4i64,t2=5f64,t4=5f64 c1=3i64,c3=L\"passitagin_stf\",c2=false,c5=5f64,c6=7u64 1626006933654ms"
-            ])
+            ], 0)
         tdSql.execute('reset query cache')

         tdSql.query('select tbname, * from sth')
...
tests/pytest/insert/openTsdbTelnetLinesInsert.py
(diff collapsed, not shown)
tests/pytest/insert/schemalessInsert.py
(diff collapsed, not shown)
tests/pytest/tools/insert-interlace.json
...
@@ -22,7 +22,7 @@
             "cache": 16,
             "blocks": 8,
             "precision": "ms",
-            "keep": 365,
+            "keep": 36500,
             "minRows": 100,
             "maxRows": 4096,
             "comp": 2,
...
tests/pytest/tools/insert-tblimit-tboffset-createdb.json
...
@@ -21,7 +21,7 @@
             "cache": 16,
             "blocks": 8,
             "precision": "ms",
-            "keep": 365,
+            "keep": 36500,
             "minRows": 100,
             "maxRows": 4096,
             "comp": 2,
...
tests/pytest/tools/insert-tblimit-tboffset-insertrec.json
...
@@ -21,7 +21,7 @@
             "cache": 16,
             "blocks": 8,
             "precision": "ms",
-            "keep": 365,
+            "keep": 36500,
             "minRows": 100,
             "maxRows": 4096,
             "comp": 2,
...
tests/pytest/tools/insert-tblimit-tboffset.json
...
@@ -21,7 +21,7 @@
             "cache": 16,
             "blocks": 8,
             "precision": "ms",
-            "keep": 365,
+            "keep": 36500,
             "minRows": 100,
             "maxRows": 4096,
             "comp": 2,
...
tests/pytest/tools/insert-tblimit-tboffset0.json
...
@@ -21,7 +21,7 @@
             "cache": 16,
             "blocks": 8,
             "precision": "ms",
-            "keep": 365,
+            "keep": 36500,
             "minRows": 100,
             "maxRows": 4096,
             "comp": 2,
...
tests/pytest/tools/insert-tblimit1-tboffset.json
...
@@ -21,7 +21,7 @@
             "cache": 16,
             "blocks": 8,
             "precision": "ms",
-            "keep": 365,
+            "keep": 36500,
             "minRows": 100,
             "maxRows": 4096,
             "comp": 2,
...
tests/pytest/tools/schemalessInsertPerformance.py
浏览文件 @
25bddc0c
...
@@ -14,7 +14,7 @@
...
@@ -14,7 +14,7 @@
import
traceback
import
traceback
import
random
import
random
import
string
import
string
from
taos.error
import
Line
sError
from
taos.error
import
Schemales
sError
import
datetime
import
datetime
import
time
import
time
from
copy
import
deepcopy
from
copy
import
deepcopy
...
@@ -172,28 +172,28 @@ class TDTestCase:
...
@@ -172,28 +172,28 @@ class TDTestCase:
def
perfTableInsert
(
self
):
def
perfTableInsert
(
self
):
table_generator
=
self
.
tableGenerator
()
table_generator
=
self
.
tableGenerator
()
for
input_sql
in
table_generator
:
for
input_sql
in
table_generator
:
self
.
_conn
.
insert_lines
([
input_sql
]
)
self
.
_conn
.
schemaless_insert
([
input_sql
],
0
)
# for i in range(10):
# for i in range(10):
# self._conn.
insert_lines([input_sql]
)
+            # self._conn.schemaless_insert([input_sql], 0)

    def perfDataInsert(self, count=4):
        table_generator = self.tableGenerator(count=count)
        ts = int(time.time())
        for input_sql in table_generator:
            print("input_sql-----------", input_sql)
-           self._conn.insert_lines([input_sql])
+           self._conn.schemaless_insert([input_sql], 0)
        for i in range(100000):
            ts -= 1
            input_sql_new = self.replaceLastStr(input_sql, str(ts)) + 's'
            print("input_sql_new---------", input_sql_new)
-           self._conn.insert_lines([input_sql_new])
+           self._conn.schemaless_insert([input_sql_new], 0)

    def batchInsertTable(self, batch_list):
        for insert_list in batch_list:
            print(threading.current_thread().name, "length=", len(insert_list))
            print(threading.current_thread().name, 'firstline', insert_list[0])
            print(threading.current_thread().name, 'lastline:', insert_list[-1])
-           self._conn.insert_lines(insert_list)
+           self._conn.schemaless_insert(insert_list, 0)
            print(threading.current_thread().name, 'end')

    def genTableThread(self, thread_count=10):
...
@@ -218,7 +218,7 @@ class TDTestCase:
    def createStb(self, count=4):
        input_sql = self.getPerfSql(count=count, init=True)
-       self._conn.insert_lines([input_sql])
+       self._conn.schemaless_insert([input_sql], 0)

    def threadInsertTable(self, end_list, thread_count=10):
        threads = list()
...
@@ -238,7 +238,7 @@ class TDTestCase:
    # def createTb(self, count=4):
    #     input_sql = self.getPerfSql(count=count)
    #     for i in range(10000):
-   #         self._conn.insert_lines([input_sql])
+   #         self._conn.schemaless_insert([input_sql], 0)

    # def createTb1(self, count=4):
    #     start_time = time.time()
...
@@ -273,8 +273,8 @@ class TDTestCase:
    # def test(self):
    #     sql1 = 'stb,id="init",t0=14865i32,t1="tvnqbjuqck" c0=37i32,c1=217i32,c2=3i32,c3=88i32 1626006833640ms'
    #     sql2 = 'stb,id="init",t0=14865i32,t1="tvnqbjuqck" c0=38i32,c1=217i32,c2=3i32,c3=88i32 1626006833641ms'
-   #     self._conn.insert_lines([sql1])
+   #     self._conn.schemaless_insert([sql1], 0)
-   #     self._conn.insert_lines([sql2])
+   #     self._conn.schemaless_insert([sql2], 0)

    def run(self):
        print("running {}".format(__file__))
...
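Every call site above swaps insert_lines(lines) for schemaless_insert(lines, protocol), with the protocol flag fixed at 0 in this test. A minimal sketch of the new call, assuming a locally running server and the updated taos Python connector; the database name "perf_demo" is illustrative only, not part of the commit:

    # Minimal sketch, not part of the commit: exercises the renamed schemaless API.
    # Assumes a local TDengine server and the updated `taos` Python connector;
    # the database name "perf_demo" is hypothetical.
    import taos

    conn = taos.connect(host="127.0.0.1", user="root", password="taosdata")
    cursor = conn.cursor()
    cursor.execute("CREATE DATABASE IF NOT EXISTS perf_demo")
    cursor.execute("USE perf_demo")

    line = 'stb,id="init",t0=14865i32,t1="tvnqbjuqck" c0=37i32,c1=217i32,c2=3i32,c3=88i32 1626006833640ms'
    # Old API: conn.insert_lines([line])
    # New API: the second argument selects the schemaless protocol (0 in these tests).
    conn.schemaless_insert([line], 0)
    conn.close()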
tests/pytest/tools/taosdemoAllTest/insert-1s1tnt1r.json
@@ -22,7 +22,7 @@
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/insert-1s1tntmr.json
@@ -22,7 +22,7 @@
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,
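The taosdemo configs below all receive the same one-line change: the database keep window is raised from 365 (or 3650) days to 36500, presumably so that rows generated from the fixed 2020-10-01 start timestamps stay inside the retention period regardless of when the test runs. The "keep" field corresponds to the KEEP clause of CREATE DATABASE; a rough equivalent via the Python connector, shown only for illustration (the database name "demo_db" is hypothetical):

    # Illustrative only: "keep" in the taosdemo config maps onto KEEP (in days) here.
    import taos

    conn = taos.connect(host="127.0.0.1", user="root", password="taosdata")
    conn.cursor().execute("CREATE DATABASE IF NOT EXISTS demo_db KEEP 36500")
    conn.close()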
tests/pytest/tools/taosdemoAllTest/insert-allDataType.json
(new file)
{
  "filetype": "insert",
  "cfgdir": "/etc/taos",
  "host": "127.0.0.1",
  "port": 6030,
  "user": "root",
  "password": "taosdata",
  "thread_count": 4,
  "thread_count_create_tbl": 4,
  "result_file": "./insert_res.txt",
  "confirm_parameter_prompt": "no",
  "insert_interval": 0,
  "interlace_rows": 10,
  "num_of_records_per_req": 1000,
  "max_sql_len": 1024000,
  "databases": [{
    "dbinfo": {
      "name": "db",
      "drop": "yes",
      "replica": 1,
      "days": 10,
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
      "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,
      "walLevel": 1,
      "cachelast": 0,
      "quorum": 1,
      "fsync": 3000,
      "update": 0
    },
    "super_tables": [{
      "name": "stb0",
      "child_table_exists": "no",
      "childtable_count": 1000,
      "childtable_prefix": "stb00_",
      "auto_create_table": "no",
      "batch_create_tbl_num": 1,
      "data_source": "rand",
      "insert_mode": "taosc",
      "insert_rows": 100,
      "childtable_limit": 0,
      "childtable_offset": 0,
      "multi_thread_write_one_tbl": "no",
      "interlace_rows": 0,
      "insert_interval": 0,
      "max_sql_len": 1024000,
      "disorder_ratio": 0,
      "disorder_range": 1000,
      "timestamp_step": 1,
      "start_timestamp": "2020-10-01 00:00:00.000",
      "sample_format": "csv",
      "sample_file": "./sample.csv",
      "tags_file": "",
      "columns": [{"type": "INT"}, {"type": "TIMESTAMP"}, {"type": "BIGINT"}, {"type": "FLOAT"}, {"type": "DOUBLE"}, {"type": "SMALLINT"}, {"type": "TINYINT"}, {"type": "BOOL"}, {"type": "NCHAR", "len": 16, "count": 1}, {"type": "UINT"}, {"type": "UBIGINT"}, {"type": "UTINYINT"}, {"type": "USMALLINT"}, {"type": "BINARY", "len": 16, "count": 1}],
      "tags": [{"type": "INT"}, {"type": "BIGINT"}, {"type": "FLOAT"}, {"type": "DOUBLE"}, {"type": "SMALLINT"}, {"type": "TINYINT"}, {"type": "BOOL"}, {"type": "NCHAR", "len": 16, "count": 1}, {"type": "UINT"}, {"type": "UBIGINT"}, {"type": "UTINYINT"}, {"type": "USMALLINT"}, {"type": "BINARY", "len": 16, "count": 1}]
    },
    {
      "name": "stb1",
      "child_table_exists": "no",
      "childtable_count": 1000,
      "childtable_prefix": "stb01_",
      "auto_create_table": "no",
      "batch_create_tbl_num": 10,
      "data_source": "rand",
      "insert_mode": "taosc",
      "insert_rows": 200,
      "childtable_limit": 0,
      "childtable_offset": 0,
      "multi_thread_write_one_tbl": "no",
      "interlace_rows": 0,
      "insert_interval": 0,
      "max_sql_len": 1024000,
      "disorder_ratio": 0,
      "disorder_range": 1000,
      "timestamp_step": 1,
      "start_timestamp": "2020-10-01 00:00:00.000",
      "sample_format": "csv",
      "sample_file": "./sample.csv",
      "tags_file": "",
      "columns": [{"type": "INT"}, {"type": "DOUBLE", "count": 1}, {"type": "BINARY", "len": 16, "count": 1}, {"type": "BINARY", "len": 32, "count": 1}],
      "tags": [{"type": "TINYINT", "count": 2}, {"type": "BINARY", "len": 16, "count": 1}]
    }]
  }]
}
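The new config above drives taosdemo's taosc insert mode across every supported column and tag type, including the unsigned integer types. A minimal sketch of invoking it from a test script, assuming the taosdemo binary is on the PATH (the wrapper itself is illustrative, not part of the commit):

    # Illustrative sketch: run taosdemo with the all-data-type insert config.
    import subprocess

    cfg = "tests/pytest/tools/taosdemoAllTest/insert-allDataType.json"
    subprocess.run(["taosdemo", "-f", cfg], check=True)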
tests/pytest/tools/taosdemoAllTest/insert-disorder.json
@@ -22,7 +22,7 @@
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/insert-drop-exist-auto-N00.json
@@ -21,7 +21,7 @@
      "cache": 16,
      "blocks": 8,
      "precision": "ms",
-     "keep": 3650,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/insert-drop-exist-auto-Y00.json
@@ -21,7 +21,7 @@
      "cache": 16,
      "blocks": 8,
      "precision": "ms",
-     "keep": 3650,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/insert-illegal.json
@@ -22,7 +22,7 @@
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/insert-interlace-row.json
@@ -22,7 +22,7 @@
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/insert-interval-speed.json
@@ -22,7 +22,7 @@
      "cache": 16,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/insert-newdb.json
@@ -22,7 +22,7 @@
      "cache": 16,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/insert-newtable.json
@@ -22,7 +22,7 @@
      "cache": 16,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/insert-nodbnodrop.json
@@ -22,7 +22,7 @@
      "cache": 16,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/insert-offset.json
@@ -22,7 +22,7 @@
      "cache": 16,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/insert-renewdb.json
@@ -22,7 +22,7 @@
      "cache": 16,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/insert-sample.json
@@ -22,7 +22,7 @@
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/insert-timestep.json
@@ -22,7 +22,7 @@
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,
tests/pytest/tools/taosdemoAllTest/insertBinaryLenLarge16374AllcolLar49151.json
@@ -22,7 +22,7 @@
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/insertChildTab0.json
@@ -22,7 +22,7 @@
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/insertChildTabLess0.json
@@ -22,7 +22,7 @@
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/insertColumnsAndTagNum4096.json
@@ -22,7 +22,7 @@
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/insertColumnsAndTagNumLarge4096.json
@@ -22,7 +22,7 @@
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/insertColumnsNum0.json
@@ -22,7 +22,7 @@
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/insertInterlaceRowsLarge1M.json
@@ -22,7 +22,7 @@
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/insertMaxNumPerReq.json
@@ -22,7 +22,7 @@
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
-     "keep": 3650,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/insertNumOfrecordPerReq0.json
@@ -22,7 +22,7 @@
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/insertNumOfrecordPerReqless0.json
@@ -22,7 +22,7 @@
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/insertRestful.json
@@ -22,7 +22,7 @@
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/insertSigcolumnsNum4096.json
@@ -22,7 +22,7 @@
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/insertTagsNumLarge128.json
@@ -22,7 +22,7 @@
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/insertTimestepMulRowsLargeint16.json
@@ -14,7 +14,8 @@
     {
       "dbinfo": {
         "name": "blf",
-        "drop": "yes"
+        "drop": "yes",
+        "keep": 36500
       },
       "super_tables": [
       {
tests/pytest/tools/taosdemoAllTest/insert_5M_rows.json
@@ -21,7 +21,7 @@
      "cache": 16,
      "blocks": 8,
      "precision": "ms",
-     "keep": 3650,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/manual_block1_comp.json
@@ -21,7 +21,7 @@
      "cache": 16,
      "blocks": 3,
      "precision": "ms",
-     "keep": 3650,
+     "keep": 36500,
      "minRows": 1000,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/manual_block2.json
@@ -21,7 +21,7 @@
      "cache": 16,
      "blocks": 8,
      "precision": "ms",
-     "keep": 3650,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/moredemo-offset-limit1.json
@@ -23,7 +23,7 @@
      "cache": 16,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/moredemo-offset-limit5.json
@@ -23,7 +23,7 @@
      "cache": 16,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/moredemo-offset-limit94.json
@@ -23,7 +23,7 @@
      "cache": 16,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/moredemo-offset-newdb.json
@@ -22,7 +22,7 @@
      "cache": 16,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/query-interrupt.json
@@ -22,7 +22,7 @@
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/queryInsertdata.json
@@ -22,7 +22,7 @@
      "cache": 16,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/queryInsertrestdata.json
@@ -22,7 +22,7 @@
      "cache": 16,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,
tests/pytest/tools/taosdemoAllTest/stmt/1174-large-stmt.json
@@ -14,7 +14,8 @@
     {
       "dbinfo": {
         "name": "gdse",
-        "drop": "yes"
+        "drop": "yes",
+        "keep": 36500
       },
       "super_tables": [{
         "name": "model_1174",

tests/pytest/tools/taosdemoAllTest/stmt/1174-large-taosc.json
@@ -14,7 +14,8 @@
     {
       "dbinfo": {
         "name": "gdse",
-        "drop": "yes"
+        "drop": "yes",
+        "keep": 36500
       },
       "super_tables": [{
         "name": "model_1174",

tests/pytest/tools/taosdemoAllTest/stmt/1174-small-stmt-random.json
@@ -14,7 +14,8 @@
     {
       "dbinfo": {
         "name": "gdse",
-        "drop": "yes"
+        "drop": "yes",
+        "keep": 36500
       },
       "super_tables": [{
         "name": "model_1174",

tests/pytest/tools/taosdemoAllTest/stmt/1174-small-stmt.json
@@ -14,7 +14,8 @@
     {
       "dbinfo": {
         "name": "gdse",
-        "drop": "yes"
+        "drop": "yes",
+        "keep": 36500
       },
       "super_tables": [{
         "name": "model_1174",

tests/pytest/tools/taosdemoAllTest/stmt/1174-small-taosc.json
@@ -14,7 +14,8 @@
     {
       "dbinfo": {
         "name": "gdse",
-        "drop": "yes"
+        "drop": "yes",
+        "keep": 36500
       },
       "super_tables": [{
         "name": "model_1174",

tests/pytest/tools/taosdemoAllTest/stmt/insert-1s1tnt1r-stmt.json
@@ -22,7 +22,7 @@
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/stmt/insert-1s1tntmr-stmt.json
@@ -22,7 +22,7 @@
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,
tests/pytest/tools/taosdemoAllTest/stmt/insert-allDataType-stmt.json
(new file)
{
  "filetype": "insert",
  "cfgdir": "/etc/taos",
  "host": "127.0.0.1",
  "port": 6030,
  "user": "root",
  "password": "taosdata",
  "thread_count": 4,
  "thread_count_create_tbl": 4,
  "result_file": "./insert_res.txt",
  "confirm_parameter_prompt": "no",
  "insert_interval": 0,
  "interlace_rows": 10,
  "num_of_records_per_req": 1000,
  "max_sql_len": 1024000,
  "databases": [{
    "dbinfo": {
      "name": "db",
      "drop": "yes",
      "replica": 1,
      "days": 10,
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
      "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,
      "walLevel": 1,
      "cachelast": 0,
      "quorum": 1,
      "fsync": 3000,
      "update": 0
    },
    "super_tables": [{
      "name": "stb0",
      "child_table_exists": "no",
      "childtable_count": 1000,
      "childtable_prefix": "stb00_",
      "auto_create_table": "no",
      "batch_create_tbl_num": 1,
      "data_source": "rand",
      "insert_mode": "stmt",
      "insert_rows": 100,
      "childtable_limit": 0,
      "childtable_offset": 0,
      "multi_thread_write_one_tbl": "no",
      "interlace_rows": 0,
      "insert_interval": 0,
      "max_sql_len": 1024000,
      "disorder_ratio": 0,
      "disorder_range": 1000,
      "timestamp_step": 1,
      "start_timestamp": "2020-10-01 00:00:00.000",
      "sample_format": "csv",
      "sample_file": "./sample.csv",
      "tags_file": "",
      "columns": [{"type": "INT"}, {"type": "TIMESTAMP"}, {"type": "BIGINT"}, {"type": "FLOAT"}, {"type": "DOUBLE"}, {"type": "SMALLINT"}, {"type": "TINYINT"}, {"type": "BOOL"}, {"type": "NCHAR", "len": 16, "count": 1}, {"type": "UINT"}, {"type": "UBIGINT"}, {"type": "UTINYINT"}, {"type": "USMALLINT"}, {"type": "BINARY", "len": 16, "count": 1}],
      "tags": [{"type": "INT"}, {"type": "BIGINT"}, {"type": "FLOAT"}, {"type": "DOUBLE"}, {"type": "SMALLINT"}, {"type": "TINYINT"}, {"type": "BOOL"}, {"type": "NCHAR", "len": 16, "count": 1}, {"type": "UINT"}, {"type": "UBIGINT"}, {"type": "UTINYINT"}, {"type": "USMALLINT"}, {"type": "BINARY", "len": 16, "count": 1}]
    },
    {
      "name": "stb1",
      "child_table_exists": "no",
      "childtable_count": 1000,
      "childtable_prefix": "stb01_",
      "auto_create_table": "no",
      "batch_create_tbl_num": 10,
      "data_source": "rand",
      "insert_mode": "stmt",
      "insert_rows": 200,
      "childtable_limit": 0,
      "childtable_offset": 0,
      "multi_thread_write_one_tbl": "no",
      "interlace_rows": 0,
      "insert_interval": 0,
      "max_sql_len": 1024000,
      "disorder_ratio": 0,
      "disorder_range": 1000,
      "timestamp_step": 1,
      "start_timestamp": "2020-10-01 00:00:00.000",
      "sample_format": "csv",
      "sample_file": "./sample.csv",
      "tags_file": "",
      "columns": [{"type": "INT"}, {"type": "DOUBLE", "count": 1}, {"type": "BINARY", "len": 16, "count": 1}, {"type": "BINARY", "len": 32, "count": 1}],
      "tags": [{"type": "TINYINT", "count": 2}, {"type": "BINARY", "len": 16, "count": 1}]
    }]
  }]
}
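This stmt variant matches insert-allDataType.json except that insert_mode is "stmt", so the same rows are written through taosdemo's prepared-statement (parameter binding) path rather than plain SQL over taosc. Running it only means pointing the tool at the other file; an illustrative one-liner, again assuming taosdemo is on the PATH:

    # Illustrative: same dataset, written through the stmt (prepared statement) path.
    import subprocess

    subprocess.run(["taosdemo", "-f", "tests/pytest/tools/taosdemoAllTest/stmt/insert-allDataType-stmt.json"], check=True)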
tests/pytest/tools/taosdemoAllTest/stmt/insert-disorder-stmt.json
@@ -22,7 +22,7 @@
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/stmt/insert-drop-exist-auto-N00-stmt.json
@@ -21,7 +21,7 @@
      "cache": 16,
      "blocks": 8,
      "precision": "ms",
-     "keep": 3650,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/stmt/insert-drop-exist-auto-Y00-stmt.json
@@ -21,7 +21,7 @@
      "cache": 16,
      "blocks": 8,
      "precision": "ms",
-     "keep": 3650,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/stmt/insert-interlace-row-stmt.json
@@ -22,7 +22,7 @@
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/stmt/insert-interval-speed-stmt.json
@@ -22,7 +22,7 @@
      "cache": 16,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/stmt/insert-newdb-stmt.json
@@ -22,7 +22,7 @@
      "cache": 16,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/stmt/insert-newtable-stmt.json
@@ -22,7 +22,7 @@
      "cache": 16,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/stmt/insert-nodbnodrop-stmt.json
@@ -22,7 +22,7 @@
      "cache": 16,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/stmt/insert-offset-stmt.json
@@ -22,7 +22,7 @@
      "cache": 16,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/stmt/insert-renewdb-stmt.json
@@ -22,7 +22,7 @@
      "cache": 16,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/stmt/insert-sample-stmt.json
@@ -22,7 +22,7 @@
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/stmt/insert-timestep-stmt.json
@@ -22,7 +22,7 @@
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/stmt/insertBinaryLenLarge16374AllcolLar49151-stmt.json
@@ -22,7 +22,7 @@
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/stmt/insertChildTab0-stmt.json
@@ -22,7 +22,7 @@
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/stmt/insertChildTabLess0-stmt.json
@@ -22,7 +22,7 @@
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/stmt/insertColumnsAndTagNum4096-stmt.json
@@ -22,7 +22,7 @@
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,

tests/pytest/tools/taosdemoAllTest/stmt/insertColumnsNum0-stmt.json
@@ -22,7 +22,7 @@
      "cache": 50,
      "blocks": 8,
      "precision": "ms",
-     "keep": 365,
+     "keep": 36500,
      "minRows": 100,
      "maxRows": 4096,
      "comp": 2,
tests/pytest/tools/taosdemoAllTest/stmt/insertInterlaceRowsLarge1M-stmt.json (diff collapsed, not shown)
tests/pytest/tools/taosdemoAllTest/stmt/insertMaxNumPerReq-stmt.json (diff collapsed, not shown)
tests/pytest/tools/taosdemoAllTest/stmt/insertNumOfrecordPerReq0-stmt.json (diff collapsed, not shown)
tests/pytest/tools/taosdemoAllTest/stmt/insertNumOfrecordPerReqless0-stmt.json (diff collapsed, not shown)
tests/pytest/tools/taosdemoAllTest/stmt/insertSigcolumnsNum4096-stmt.json (diff collapsed, not shown)
tests/pytest/tools/taosdemoAllTest/stmt/insertTagsNumLarge128-stmt.json (diff collapsed, not shown)
tests/pytest/tools/taosdemoAllTest/stmt/nsertColumnsAndTagNumLarge4096-stmt.json (diff collapsed, not shown)
tests/pytest/tools/taosdemoAllTest/subInsertdata.json (diff collapsed, not shown)
tests/pytest/tools/taosdemoAllTest/subInsertdataMaxsql100.json (diff collapsed, not shown)
tests/pytest/tools/taosdemoAllTest/taosdemoTestInsertAllType.py (new file, diff collapsed, not shown)
tests/pytest/tools/taosdumpTest3.py (new file, diff collapsed, not shown)
tests/script/api/openTSDBTest.c (diff collapsed, not shown)
tests/tsim/src/simExe.c (diff collapsed, not shown)