apache / DolphinScheduler
Commit 822addac
Authored June 10, 2020 by eights

add sqoop task UT to pom

Parent: 5d83d797
2 changed files with 5 additions and 3 deletions:

dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/task/sqoop/SqoopTaskTest.java  +4 -3
pom.xml  +1 -0
dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/task/sqoop/SqoopTaskTest.java
@@ -83,12 +83,13 @@ public class SqoopTaskTest {
         TaskExecutionContext mysqlTaskExecutionContext = getMysqlTaskExecutionContext();

         //sqoop TEMPLATE job
-        //import mysql to HDFS
-        String mysqlToHdfs = "{\"jobName\":\"sqoop_import\",\"jobType\":\"TEMPLATE\",\"concurrency\":1,\"modelType\":\"import\",\"sourceType\":\"MYSQL\",\"targetType\":\"HDFS\",\"sourceParams\":\"{\\\"srcDatasource\\\":2,\\\"srcTable\\\":\\\"person_2\\\",\\\"srcQueryType\\\":\\\"0\\\",\\\"srcQuerySql\\\":\\\"\\\",\\\"srcColumnType\\\":\\\"0\\\",\\\"srcColumns\\\":\\\"\\\",\\\"srcConditionList\\\":[],\\\"mapColumnHive\\\":[],\\\"mapColumnJava\\\":[]}\",\"targetParams\":\"{\\\"targetPath\\\":\\\"/ods/tmp/test/person7\\\",\\\"deleteTargetDir\\\":true,\\\"fileType\\\":\\\"--as-textfile\\\",\\\"compressionCodec\\\":\\\"\\\",\\\"fieldsTerminated\\\":\\\"@\\\",\\\"linesTerminated\\\":\\\"\\\\\\\\n\\\"}\",\"localParams\":[]}";
+        //import mysql to HDFS with hadoo
+        String mysqlToHdfs = "{\"jobName\":\"sqoop_import\",\"hadoopCustomParams\":[{\"prop\":\"mapreduce.map.memory.mb\",\"direct\":\"IN\",\"type\":\"VARCHAR\",\"value\":\"4096\"}],\"sqoopAdvancedParams\":[{\"prop\":\"--direct\",\"direct\":\"IN\",\"type\":\"VARCHAR\",\"value\":\"\"}]," +
+                "\"jobType\":\"TEMPLATE\",\"concurrency\":1,\"modelType\":\"import\",\"sourceType\":\"MYSQL\",\"targetType\":\"HDFS\",\"sourceParams\":\"{\\\"srcDatasource\\\":2,\\\"srcTable\\\":\\\"person_2\\\",\\\"srcQueryType\\\":\\\"0\\\",\\\"srcQuerySql\\\":\\\"\\\",\\\"srcColumnType\\\":\\\"0\\\",\\\"srcColumns\\\":\\\"\\\",\\\"srcConditionList\\\":[],\\\"mapColumnHive\\\":[],\\\"mapColumnJava\\\":[]}\",\"targetParams\":\"{\\\"targetPath\\\":\\\"/ods/tmp/test/person7\\\",\\\"deleteTargetDir\\\":true,\\\"fileType\\\":\\\"--as-textfile\\\",\\\"compressionCodec\\\":\\\"\\\",\\\"fieldsTerminated\\\":\\\"@\\\",\\\"linesTerminated\\\":\\\"\\\\\\\\n\\\"}\",\"localParams\":[]}";
         SqoopParameters mysqlToHdfsParams = JSON.parseObject(mysqlToHdfs, SqoopParameters.class);
         SqoopJobGenerator generator = new SqoopJobGenerator();
         String mysqlToHdfsScript = generator.generateSqoopJob(mysqlToHdfsParams, mysqlTaskExecutionContext);
-        String mysqlToHdfsExpected = "sqoop import -D mapred.job.name=sqoop_import -m 1 --connect jdbc:mysql://192.168.0.111:3306/test --username kylo --password 123456 --table person_2 --target-dir /ods/tmp/test/person7 --as-textfile --delete-target-dir --fields-terminated-by '@' --lines-terminated-by '\\n' --null-non-string 'NULL' --null-string 'NULL'";
+        String mysqlToHdfsExpected = "sqoop import -D mapred.job.name=sqoop_import -D mapreduce.map.memory.mb=4096 --direct -m 1 --connect jdbc:mysql://192.168.0.111:3306/test --username kylo --password 123456 --table person_2 --target-dir /ods/tmp/test/person7 --as-textfile --delete-target-dir --fields-terminated-by '@' --lines-terminated-by '\\n' --null-non-string 'NULL' --null-string 'NULL'";
         Assert.assertEquals(mysqlToHdfsExpected, mysqlToHdfsScript);

         //export hdfs to mysql using update mode
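The new assertion above expects each entry of `hadoopCustomParams` to surface in the generated command as a `-D key=value` flag right after the job name. A minimal standalone sketch of that rendering step (this is an illustration, not DolphinScheduler's actual `SqoopJobGenerator`; the class and method names here are hypothetical):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class SqoopFlagSketch {
    // Renders custom Hadoop properties as "-D key=value" flags appended after
    // the fixed "-D mapred.job.name=..." flag, as the test's expected string shows.
    static String buildCommand(Map<String, String> hadoopCustomParams) {
        StringBuilder sb = new StringBuilder("sqoop import -D mapred.job.name=sqoop_import");
        for (Map.Entry<String, String> e : hadoopCustomParams.entrySet()) {
            sb.append(" -D ").append(e.getKey()).append('=').append(e.getValue());
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        Map<String, String> params = new LinkedHashMap<>();
        params.put("mapreduce.map.memory.mb", "4096");
        // prints: sqoop import -D mapred.job.name=sqoop_import -D mapreduce.map.memory.mb=4096
        System.out.println(buildCommand(params));
    }
}
```

The `LinkedHashMap` preserves insertion order, so flags appear in the order the params were declared in the job JSON, which is what makes a fixed expected string comparable in the unit test.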
pom.xml

@@ -821,6 +821,7 @@
                         <include>**/server/worker/task/spark/SparkTaskTest.java</include>
                         <include>**/server/worker/task/EnvFileTest.java</include>
                         <include>**/server/worker/task/spark/SparkTaskTest.java</include>
+                        <include>**/server/worker/task/sqoop/SqoopTaskTest.java</include>
                         <include>**/server/worker/EnvFileTest.java</include>
                         <include>**/service/quartz/cron/CronUtilsTest.java</include>
                         <include>**/service/zk/DefaultEnsembleProviderTest.java</include>