doujutun3207 / flink
Commit adeee2b9
Authored July 14, 2014 by Márton Balassi
Committed by Stephan Ewen, August 18, 2014
[streaming] StreamRecord refactor
Parent: 05c5b48a
Showing 2 changed files with 31 additions and 30 deletions (+31 −30)
flink-addons/flink-streaming/src/main/java/eu/stratosphere/streaming/api/streamrecord/StreamRecord.java (+30 −29)
flink-addons/flink-streaming/src/test/java/eu/stratosphere/streaming/api/streamrecord/StreamRecordTest.java (+1 −1)
flink-addons/flink-streaming/src/main/java/eu/stratosphere/streaming/api/streamrecord/StreamRecord.java
```diff
@@ -62,11 +62,10 @@ import eu.stratosphere.types.Value;
  * objects in Stratosphere stream processing. The elements of the batch are
  * Value arrays.
  */
-public class StreamRecord<T extends Tuple> implements IOReadableWritable, Serializable {
+public class StreamRecord implements IOReadableWritable, Serializable {
 	private static final long serialVersionUID = 1L;

-	private List<T> recordBatch;
+	private List<? extends Tuple> recordBatch;
 	private StringValue uid = new StringValue("");
 	private int numOfFields;
 	private int numOfRecords;
```
```diff
@@ -82,23 +81,9 @@ public class StreamRecord<T extends Tuple> implements IOReadableWritable,
 	// TODO implement equals, clone
 	/**
-	 * Creates a new empty batch of records and sets the field number to one
+	 * Creates a new empty instance for read
 	 */
 	public StreamRecord() {
-		this.numOfFields = 1;
-		recordBatch = new ArrayList<T>();
 	}
-
-	/**
-	 * Creates a new empty batch of records and sets the field number to the
-	 * given number
-	 *
-	 * @param length
-	 *            Number of fields in the records
-	 */
-	public StreamRecord(int length) {
-		numOfFields = length;
-		recordBatch = new ArrayList<T>();
-	}
```
```diff
@@ -111,9 +96,11 @@ public class StreamRecord<T extends Tuple> implements IOReadableWritable,
 	 * @param batchSize
 	 *            Number of records
 	 */
-	public StreamRecord(int length, int batchSize) {
-		numOfFields = length;
-		recordBatch = new ArrayList<T>(batchSize);
+	public StreamRecord(Tuple tuple, int batchSize) {
+		numOfFields = tuple.getArity();
+		Class<?> tupleClass = CLASSES[tuple.getArity() - 1];
+		tuple = (tupleClass) tuple;
+		recordBatch = new ArrayList<>(batchSize);
 	}
```
```diff
@@ -328,17 +315,18 @@ public class StreamRecord<T extends Tuple> implements IOReadableWritable,
 	}

 	private void writeTuple(Tuple tuple, DataOutput out) {
-		// Class basicType = CLASSES[tuple.getArity()-1];
-		// TypeInformation<? extends Tuple> typeInfo =
-		// TupleTypeInfo.getBasicTupleTypeInfo(basicType);
+		Class[] basicTypes = new Class[tuple.getArity()];
+		StringBuilder basicTypeNames = new StringBuilder();
+		for (int i = 0; i < basicTypes.length; i++) {
+			basicTypes[i] = tuple.getField(i).getClass();
+			basicTypeNames.append(basicTypes[i].getName() + ",");
+		}
+		TypeInformation<? extends Tuple> typeInfo = TupleTypeInfo.getBasicTupleTypeInfo(basicTypes);
+		StringValue typeVal = new StringValue(basicTypeNames.toString());
 		@SuppressWarnings("unchecked")
 		TupleSerializer<Tuple> tupleSerializer = (TupleSerializer<Tuple>) typeInfo.createSerializer();
```
```diff
@@ -346,6 +334,7 @@ public class StreamRecord<T extends Tuple> implements IOReadableWritable,
 				tupleSerializer);
 		serializationDelegate.setInstance(tuple);

 		try {
+			typeVal.write(out);
 			serializationDelegate.write(out);
 		} catch (IOException e) {
 			// TODO Auto-generated catch block
```
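The added `typeVal.write(out)` makes the on-wire record self-describing: the type string is written before the serialized tuple, so a reader can recover the field classes from the stream itself. A minimal sketch of that layout using only `java.io` (the class and method names here are invented for illustration; this is not the Flink serializer stack):

```java
import java.io.*;

// Sketch: a self-describing record writes a type header first, then the
// payload, mirroring typeVal.write(out) followed by serializationDelegate.write(out).
public class SelfDescribingRecord {
	static byte[] write(int value) throws IOException {
		ByteArrayOutputStream bytes = new ByteArrayOutputStream();
		DataOutputStream out = new DataOutputStream(bytes);
		out.writeUTF("java.lang.Integer,"); // type header, like typeVal.write(out)
		out.writeInt(value);                // payload, like serializationDelegate.write(out)
		return bytes.toByteArray();
	}

	static Object read(byte[] data) throws IOException, ClassNotFoundException {
		DataInputStream in = new DataInputStream(new ByteArrayInputStream(data));
		String[] types = in.readUTF().split(",");   // read the header first
		Class<?> type = Class.forName(types[0]);    // recover the field class
		if (type == Integer.class) {
			return in.readInt();
		}
		throw new IOException("unsupported type: " + type.getName());
	}

	public static void main(String[] args) throws Exception {
		Object roundTripped = read(write(7));
		System.out.println(roundTripped); // 7
	}
}
```

Because the header travels with the data, the reader needs no out-of-band knowledge of the schema, which is exactly what lets `readTuple` drop its `Class...` parameter below.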
```diff
@@ -353,7 +342,21 @@ public class StreamRecord<T extends Tuple> implements IOReadableWritable,
 		}
 	}

-	public Tuple readTuple(DataInput in, Class... basicTypes) throws IOException {
+	private Tuple readTuple(DataInput in) throws IOException {
+		StringValue typeVal = new StringValue();
+		typeVal.read(in);
+		// TODO: use Tokenizer
+		String[] types = typeVal.getValue().split(",");
+		Class[] basicTypes = new Class[types.length];
+		for (int i = 0; i < types.length; i++) {
+			try {
+				basicTypes[i] = Class.forName(types[i]);
+			} catch (ClassNotFoundException e) {
+				// TODO Auto-generated catch block
+				e.printStackTrace();
+			}
+		}
 		TypeInformation<? extends Tuple> typeInfo = TupleTypeInfo.getBasicTupleTypeInfo(basicTypes);
```
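The refactored `readTuple` reverses what `writeTuple` did: it splits the comma-separated class-name string and rebuilds the `Class` objects with `Class.forName` instead of taking them as parameters. A standalone sketch of that round trip in plain Java (hypothetical helper names, no Flink types):

```java
import java.util.Arrays;

// Sketch: encode a tuple's field types as a comma-separated class-name string
// (as writeTuple does) and recover the Class objects (as readTuple does).
public class TypeStringRoundTrip {
	// Encode the runtime classes of the fields, mirroring the writeTuple loop.
	static String encode(Object... fields) {
		StringBuilder names = new StringBuilder();
		for (Object field : fields) {
			names.append(field.getClass().getName()).append(",");
		}
		return names.toString();
	}

	// Recover the Class objects, mirroring the readTuple loop.
	static Class<?>[] decode(String typeString) throws ClassNotFoundException {
		String[] types = typeString.split(",");
		Class<?>[] basicTypes = new Class<?>[types.length];
		for (int i = 0; i < types.length; i++) {
			basicTypes[i] = Class.forName(types[i]);
		}
		return basicTypes;
	}

	public static void main(String[] args) throws Exception {
		String encoded = encode(42, "hello");
		System.out.println(encoded); // java.lang.Integer,java.lang.String,
		System.out.println(Arrays.toString(decode(encoded)));
	}
}
```

Note that `String.split(",")` drops the trailing empty token produced by the final comma, so the loop sees exactly one entry per field; the `// TODO: use Tokenizer` comment in the diff suggests the authors considered this parsing provisional.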
```diff
@@ -400,9 +403,7 @@ public class StreamRecord<T extends Tuple> implements IOReadableWritable,
 		recordBatch = new ArrayList<T>();
 		for (int k = 0; k < numOfRecords; ++k) {
-			T tuple = null;
-			readTuple(tuple, in, numOfFields);
-			recordBatch.add(tuple);
+			recordBatch.add((T) readTuple(in));
 		}
 	}
```
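The deserialization loop now rebuilds the batch by calling the self-describing `readTuple` once per record, so the stream only needs a record count up front. A toy illustration of that count-then-records framing with `java.io`, using ints in place of tuples (all names invented):

```java
import java.io.*;
import java.util.*;

// Sketch: write numOfRecords followed by the records, then read them back
// with one read call per loop iteration, as the refactored loop does.
public class BatchRead {
	static byte[] writeBatch(int... values) throws IOException {
		ByteArrayOutputStream bytes = new ByteArrayOutputStream();
		DataOutputStream out = new DataOutputStream(bytes);
		out.writeInt(values.length);      // numOfRecords
		for (int v : values) {
			out.writeInt(v);              // one record per iteration
		}
		return bytes.toByteArray();
	}

	static List<Integer> readBatch(byte[] data) throws IOException {
		DataInputStream in = new DataInputStream(new ByteArrayInputStream(data));
		int numOfRecords = in.readInt();
		List<Integer> recordBatch = new ArrayList<>(numOfRecords);
		for (int k = 0; k < numOfRecords; ++k) {
			recordBatch.add(in.readInt()); // like recordBatch.add((T) readTuple(in))
		}
		return recordBatch;
	}

	public static void main(String[] args) throws Exception {
		System.out.println(readBatch(writeBatch(1, 2, 3))); // [1, 2, 3]
	}
}
```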
flink-addons/flink-streaming/src/test/java/eu/stratosphere/streaming/api/streamrecord/StreamRecordTest.java

```diff
@@ -149,7 +149,7 @@ public class StreamRecordTest {
 			Tuple2<Integer, String> tuple = new Tuple2<Integer, String>();
 			StreamRecord<Tuple2<Integer, String>> newRec = new StreamRecord<Tuple2<Integer, String>>(2);
-			Tuple2<Integer, String> tupleOut = (Tuple2<Integer, String>) newRec.readTuple(in, Integer.class, String.class);
+			Tuple2<Integer, String> tupleOut = newRec.read(in);
 			assertEquals(tupleOut.getField(0), tuple.getField(0));
 		} catch (IOException e) {
```