doujutun3207 / flink
Commit e4b56950

Authored on Jun 12, 2015 by Fabian Hueske

[FLINK-2207] Fix Table API conversion documentation and further renamings for consistency.

This closes #829

Parent: e45c5dc5
Showing 7 changed files with 19 additions and 19 deletions (+19, −19)
docs/libs/table.md (+4, −4)
flink-staging/flink-table/src/main/scala/org/apache/flink/api/scala/table/TableConversions.scala (+2, −2)
flink-staging/flink-table/src/main/scala/org/apache/flink/api/table/Table.scala (+1, −1)
flink-staging/flink-table/src/main/scala/org/apache/flink/examples/scala/PageRankTable.scala (+1, −1)
flink-staging/flink-table/src/main/scala/org/apache/flink/examples/scala/StreamingTableFilter.scala (+1, −1)
flink-staging/flink-table/src/test/scala/org/apache/flink/api/scala/table/test/FilterITCase.scala (+3, −3)
flink-staging/flink-table/src/test/scala/org/apache/flink/api/scala/table/test/JoinITCase.scala (+7, −7)
docs/libs/table.md

@@ -52,7 +52,7 @@ import org.apache.flink.api.scala.table._
 case class WC(word: String, count: Int)
 val input = env.fromElements(WC("hello", 1), WC("hello", 1), WC("ciao", 1))
 val expr = input.toTable
-val result = expr.groupBy('word).select('word, 'count.sum as 'count).toSet[WC]
+val result = expr.groupBy('word).select('word, 'count.sum as 'count).toDataSet[WC]
 {% endhighlight %}

 The expression DSL uses Scala symbols to refer to field names and we use code generation to

@@ -69,7 +69,7 @@ case class MyResult(a: String, d: Int)
 val input1 = env.fromElements(...).toTable('a, 'b)
 val input2 = env.fromElements(...).toTable('c, 'd)
-val joined = input1.join(input2).where("b = a && d > 42").select("a, d").toSet[MyResult]
+val joined = input1.join(input2).where("b = a && d > 42").select("a, d").toDataSet[MyResult]
 {% endhighlight %}

 Notice, how a DataSet can be converted to a Table by using `as` and specifying new

@@ -108,14 +108,14 @@ DataSet<WC> input = env.fromElements(
     new WC("Ciao", 1),
     new WC("Hello", 1));
-Table table = tableEnv.toTable(input);
+Table table = tableEnv.fromDataSet(input);
 Table filtered = table
     .groupBy("word")
     .select("word.count as count, word")
     .filter("count = 2");
-DataSet<WC> result = tableEnv.toSet(filtered, WC.class);
+DataSet<WC> result = tableEnv.toDataSet(filtered, WC.class);
 {% endhighlight %}

 When using Java, the embedded DSL for specifying expressions cannot be used. Only String expressions
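The renamed Scala conversions can be exercised end to end. Below is a minimal sketch assembled from the word-count snippet in the docs; it assumes a Flink 0.9-era project with `flink-scala` and `flink-table` on the classpath, so it is illustrative rather than a drop-in program.

```scala
import org.apache.flink.api.scala._
import org.apache.flink.api.scala.table._

// Field names of the case class become Table column names.
case class WC(word: String, count: Int)

object WordCountTable {
  def main(args: Array[String]): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment
    val input = env.fromElements(WC("hello", 1), WC("hello", 1), WC("ciao", 1))

    // toTable lifts the DataSet into the Table API;
    // toDataSet (formerly toSet) converts the result back.
    val result = input.toTable
      .groupBy('word)
      .select('word, 'count.sum as 'count)
      .toDataSet[WC]

    result.print()
  }
}
```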
flink-staging/flink-table/src/main/scala/org/apache/flink/api/scala/table/TableConversions.scala

@@ -33,14 +33,14 @@ class TableConversions(table: Table) {
   /**
    * Converts the [[Table]] to a [[DataSet]].
    */
-  def toSet[T: TypeInformation]: DataSet[T] = {
+  def toDataSet[T: TypeInformation]: DataSet[T] = {
     new ScalaBatchTranslator().translate[T](table.operation)
   }

   /**
    * Converts the [[Table]] to a [[DataStream]].
    */
-  def toStream[T: TypeInformation]: DataStream[T] = {
+  def toDataStream[T: TypeInformation]: DataStream[T] = {
     new ScalaStreamingTranslator().translate[T](table.operation)
   }
 }
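The hunk above renames methods on a wrapper class that Flink applies via an implicit conversion. The mechanism itself is plain Scala: an implicit class grafts an explicitly named conversion method onto another type. A self-contained toy sketch of that pattern, where `ToyTable` and `ToyDataSet` are hypothetical stand-ins, not Flink's types:

```scala
object ConversionPatternDemo {
  // Hypothetical stand-ins for illustration only (not Flink's types).
  case class ToyTable(rows: Seq[Int])
  case class ToyDataSet[T](elems: Seq[T])

  // Plays the role of TableConversions: an implicit wrapper that adds
  // a clearly named toDataSet method to ToyTable.
  implicit class ToyTableConversions(val table: ToyTable) extends AnyVal {
    def toDataSet: ToyDataSet[Int] = ToyDataSet(table.rows)
  }

  def main(args: Array[String]): Unit = {
    // The implicit class makes toDataSet appear as a method on ToyTable.
    val ds = ToyTable(Seq(1, 2, 3)).toDataSet
    println(ds.elems.sum) // 6
  }
}
```

Because the method lives on the wrapper rather than on `Table` itself, a rename like `toSet` → `toDataSet` touches only the wrapper definition and its call sites.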
flink-staging/flink-table/src/main/scala/org/apache/flink/api/table/Table.scala

@@ -39,7 +39,7 @@ import org.apache.flink.api.table.plan._
  *   val table = set.toTable('a, 'b)
  *   ...
  *   val table2 = ...
- *   val set = table2.toSet[MyType]
+ *   val set = table2.toDataSet[MyType]
  * }}}
  */
 case class Table(private[flink] val operation: PlanNode) {
flink-staging/flink-table/src/main/scala/org/apache/flink/examples/scala/PageRankTable.scala

@@ -101,7 +101,7 @@ object PageRankTable {
       val newRanks = currentRanks.toTable
         // distribute ranks to target pages
         .join(adjacencyLists).where('pageId === 'sourceId)
-        .select('rank, 'targetIds).toSet[RankOutput]
+        .select('rank, 'targetIds).toDataSet[RankOutput]
         .flatMap {
           (in, out: Collector[(Long, Double)]) =>
             val targets = in.targetIds
flink-staging/flink-table/src/main/scala/org/apache/flink/examples/scala/StreamingTableFilter.scala

@@ -42,7 +42,7 @@ object StreamingTableFilter {
     val cars = genCarStream().toTable
       .filter('carId === 0)
       .select('carId, 'speed, 'distance + 1000 as 'distance, 'time % 5 as 'time)
-      .toStream[CarEvent]
+      .toDataStream[CarEvent]

     cars.print()
flink-staging/flink-table/src/test/scala/org/apache/flink/api/scala/table/test/FilterITCase.scala

@@ -61,7 +61,7 @@ class FilterITCase(mode: TestExecutionMode) extends MultipleProgramsTestBase(mode)
     val filterDs = ds.filter( Literal(false) )
-    filterDs.toSet[Row].writeAsCsv(resultPath, writeMode = WriteMode.OVERWRITE)
+    filterDs.toDataSet[Row].writeAsCsv(resultPath, writeMode = WriteMode.OVERWRITE)
     env.execute()
     expected = "\n"
   }

@@ -76,7 +76,7 @@ class FilterITCase(mode: TestExecutionMode) extends MultipleProgramsTestBase(mode)
     val filterDs = ds.filter( Literal(true) )
-    filterDs.toSet[Row].writeAsCsv(resultPath, writeMode = WriteMode.OVERWRITE)
+    filterDs.toDataSet[Row].writeAsCsv(resultPath, writeMode = WriteMode.OVERWRITE)
     env.execute()
     expected = "1,1,Hi\n" + "2,2,Hello\n" + "3,2,Hello world\n" + "4,3,Hello world, " +
       "how are you?\n" + "5,3,I am fine.\n" + "6,3,Luke Skywalker\n" + "7,4," + ...

@@ -109,7 +109,7 @@ class FilterITCase(mode: TestExecutionMode) extends MultipleProgramsTestBase(mode)
     val filterDs = ds.filter( 'a % 2 === 0 )
-    filterDs.toSet[Row].writeAsCsv(resultPath, writeMode = WriteMode.OVERWRITE)
+    filterDs.toDataSet[Row].writeAsCsv(resultPath, writeMode = WriteMode.OVERWRITE)
     env.execute()
     expected = "2,2,Hello\n" + "4,3,Hello world, how are you?\n" + "6,3,Luke Skywalker\n" +
       "8,4," + "Comment#2\n" + "10,4,Comment#4\n" + "12,5,Comment#6\n" + "14,5,Comment#8\n" +
       "16,6," + ...
flink-staging/flink-table/src/test/scala/org/apache/flink/api/scala/table/test/JoinITCase.scala

@@ -57,7 +57,7 @@ class JoinITCase(mode: TestExecutionMode) extends MultipleProgramsTestBase(mode)
     val joinDs = ds1.join(ds2).where('b === 'e).select('c, 'g)
-    joinDs.toSet[Row].writeAsCsv(resultPath, writeMode = WriteMode.OVERWRITE)
+    joinDs.toDataSet[Row].writeAsCsv(resultPath, writeMode = WriteMode.OVERWRITE)
     env.execute()
     expected = "Hi,Hallo\n" + "Hello,Hallo Welt\n" + "Hello world,Hallo Welt\n"
   }

@@ -70,7 +70,7 @@ class JoinITCase(mode: TestExecutionMode) extends MultipleProgramsTestBase(mode)
     val joinDs = ds1.join(ds2).where('b === 'e && 'b < 2).select('c, 'g)
-    joinDs.toSet[Row].writeAsCsv(resultPath, writeMode = WriteMode.OVERWRITE)
+    joinDs.toDataSet[Row].writeAsCsv(resultPath, writeMode = WriteMode.OVERWRITE)
     env.execute()
     expected = "Hi,Hallo\n"
   }

@@ -83,7 +83,7 @@ class JoinITCase(mode: TestExecutionMode) extends MultipleProgramsTestBase(mode)
     val joinDs = ds1.join(ds2).filter('a === 'd && 'b === 'h).select('c, 'g)
-    joinDs.toSet[Row].writeAsCsv(resultPath, writeMode = WriteMode.OVERWRITE)
+    joinDs.toDataSet[Row].writeAsCsv(resultPath, writeMode = WriteMode.OVERWRITE)
     env.execute()
     expected = "Hi,Hallo\n" + "Hello,Hallo Welt\n" + "Hello world,Hallo Welt wie gehts?\n" +
       "Hello world,ABC\n" + "I am fine.,HIJ\n" + "I am fine.,IJK\n"

@@ -97,7 +97,7 @@ class JoinITCase(mode: TestExecutionMode) extends MultipleProgramsTestBase(mode)
     val joinDs = ds1.join(ds2).where('foo === 'e).select('c, 'g)
-    joinDs.toSet[Row].writeAsCsv(resultPath, writeMode = WriteMode.OVERWRITE)
+    joinDs.toDataSet[Row].writeAsCsv(resultPath, writeMode = WriteMode.OVERWRITE)
     env.execute()
     expected = ""
   }

@@ -110,7 +110,7 @@ class JoinITCase(mode: TestExecutionMode) extends MultipleProgramsTestBase(mode)
     val joinDs = ds1.join(ds2).where('a === 'g).select('c, 'g)
-    joinDs.toSet[Row].writeAsCsv(resultPath, writeMode = WriteMode.OVERWRITE)
+    joinDs.toDataSet[Row].writeAsCsv(resultPath, writeMode = WriteMode.OVERWRITE)
     env.execute()
     expected = ""
   }

@@ -123,7 +123,7 @@ class JoinITCase(mode: TestExecutionMode) extends MultipleProgramsTestBase(mode)
     val joinDs = ds1.join(ds2).where('a === 'd).select('c, 'g)
-    joinDs.toSet[Row].writeAsCsv(resultPath, writeMode = WriteMode.OVERWRITE)
+    joinDs.toDataSet[Row].writeAsCsv(resultPath, writeMode = WriteMode.OVERWRITE)
     env.execute()
     expected = ""
   }

@@ -136,7 +136,7 @@ class JoinITCase(mode: TestExecutionMode) extends MultipleProgramsTestBase(mode)
     val joinDs = ds1.join(ds2).where('a === 'd).select('g.count)
-    joinDs.toSet[Row].writeAsCsv(resultPath, writeMode = WriteMode.OVERWRITE)
+    joinDs.toDataSet[Row].writeAsCsv(resultPath, writeMode = WriteMode.OVERWRITE)
     env.execute()
     expected = "6"
   }
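Taken together, the hunks are a mechanical rename with no behavior change. For code written against the old API, migration is one substitution per call site; a before/after summary, with call sites taken from the hunks above:

```scala
// Scala Table API conversions:
//   table.toSet[T]      ->  table.toDataSet[T]
//   table.toStream[T]   ->  table.toDataStream[T]
//
// Java TableEnvironment:
//   tableEnv.toTable(input)            ->  tableEnv.fromDataSet(input)
//   tableEnv.toSet(filtered, WC.class) ->  tableEnv.toDataSet(filtered, WC.class)
```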