Commit 76dd218d — doujutun3207/flink

[streaming] performance test scripts updated

Authored on Jul 14, 2014 by gyfora
Committed by Stephan Ewen on Aug 18, 2014
Parent: ce548603
4 changed files, +218 additions, −13 deletions:

- flink-addons/flink-streaming/src/test/resources/Performance/PerformanceTracker.py (+15 −13)
- flink-addons/flink-streaming/src/test/resources/Performance/WordCountTopology.java (+160 −0)
- flink-addons/flink-streaming/src/test/resources/Performance/copy-files.sh (+27 −0)
- flink-addons/flink-streaming/src/test/resources/Performance/remove-files.sh (+16 −0)
flink-addons/flink-streaming/src/test/resources/Performance/PerformanceTracker.py
```diff
@@ -9,8 +9,10 @@ import matplotlib.pyplot as plt
 import pandas as pd
+import os

 linestyles = ['_', '-', '--', ':']
-markers = ['x', 'o', '^', '+']
+markers = ['D', 's', '|', '', 'x', '_', '^', ' ', 'd', 'h', '+', '*', ',', 'o', '.', '1', 'p', 'H', 'v', '>'];
 colors = ['b', 'g', 'r', 'c', 'm', 'y', 'k']

 def readFiles(csv_dir):
     dataframes = []
```
```diff
@@ -36,9 +38,9 @@ def plotCounter(csv_dir, smooth=5):
     plt.title('Counter')
     for dataframe in dataframes:
-        if len(markers) > dataframe[1]:
-            m = markers[dataframe[1]]
-        else:
-            m = '*'
+        m = markers[dataframe[1] % len(markers)]
         dataframe[2].ix[:, 0].plot(marker=m, markevery=10, markersize=10)
     plt.legend([x[0] for x in dataframes])
```
```diff
@@ -46,9 +48,9 @@ def plotCounter(csv_dir, smooth=5):
     plt.title('dC/dT')
     for dataframe in dataframes:
-        if len(markers) > dataframe[1]:
-            m = markers[dataframe[1]]
-        else:
-            m = '*'
+        m = markers[dataframe[1] % len(markers)]
         pd.rolling_mean(dataframe[2].speed, smooth).plot(marker=m, markevery=10, markersize=10)
     plt.legend([x[0] for x in dataframes])
```
```diff
@@ -61,9 +63,9 @@ def plotTimer(csv_dir,smooth=5,std=50):
     plt.title('Timer')
     for dataframe in dataframes:
-        if len(markers) > dataframe[1]:
-            m = markers[dataframe[1]]
-        else:
-            m = '*'
+        m = markers[dataframe[1] % len(markers)]
         pd.rolling_mean(dataframe[2].ix[:, 0], smooth).plot(marker=m, markevery=10, markersize=10)
     plt.legend([x[0] for x in dataframes])
```
```diff
@@ -71,8 +73,8 @@ def plotTimer(csv_dir,smooth=5,std=50):
     plt.title('Standard deviance')
     for dataframe in dataframes:
-        if len(markers) > dataframe[1]:
-            m = markers[dataframe[1]]
-        else:
-            m = '*'
+        m = markers[dataframe[1] % len(markers)]
         pd.rolling_std(dataframe[2].ix[:, 0], std).plot(marker=m, markevery=10, markersize=10)
     plt.legend([x[0] for x in dataframes])
```
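The recurring change in every hunk above replaces an if/else fallback (any series beyond the marker list drew as `'*'`) with modulo indexing, so plots cycle through the marker list instead. A minimal sketch of the pattern, with the marker list copied from the new code (`pick_marker` is a hypothetical helper name, not part of the commit):

```python
# Marker list as introduced by this commit (matplotlib marker codes).
markers = ['D', 's', '|', '', 'x', '_', '^', ' ', 'd', 'h',
           '+', '*', ',', 'o', '.', '1', 'p', 'H', 'v', '>']

def pick_marker(index):
    # Old behavior: markers[index] when index < len(markers), else '*'.
    # New behavior: wrap around, so every series keeps a usable marker.
    return markers[index % len(markers)]
```

For example, series index 21 now wraps to `markers[1]` (`'s'`) rather than collapsing to the shared `'*'` fallback.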
flink-addons/flink-streaming/src/test/resources/Performance/WordCountTopology.java (new file, mode 100644)
```java
package storm.starter;

import java.io.BufferedReader;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.FileReader;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

import storm.starter.spout.RandomSentenceSpout;
import backtype.storm.Config;
import backtype.storm.LocalCluster;
import backtype.storm.StormSubmitter;
import backtype.storm.spout.SpoutOutputCollector;
import backtype.storm.task.ShellBolt;
import backtype.storm.task.TopologyContext;
import backtype.storm.topology.BasicOutputCollector;
import backtype.storm.topology.IRichBolt;
import backtype.storm.topology.OutputFieldsDeclarer;
import backtype.storm.topology.TopologyBuilder;
import backtype.storm.topology.base.BaseBasicBolt;
import backtype.storm.topology.base.BaseRichSpout;
import backtype.storm.tuple.Fields;
import backtype.storm.tuple.Tuple;
import backtype.storm.tuple.Values;
import backtype.storm.utils.Utils;

/**
 * This topology demonstrates Storm's stream groupings and multilang
 * capabilities.
 */
public class WordCountTopology {

	public static class HamletSpout extends BaseRichSpout {

		SpoutOutputCollector _collector;
		final static String path = "/home/hermann/asd.txt";
		final static int emitHamlets = 5;
		BufferedReader br = null;
		int numberOfHamlets;

		@Override
		public void open(Map conf, TopologyContext context, SpoutOutputCollector collector) {
			_collector = collector;
			try {
				br = new BufferedReader(new FileReader(path));
			} catch (FileNotFoundException e) {
				e.printStackTrace();
			}
			numberOfHamlets = 0;
		}

		@Override
		public void nextTuple() {
			Utils.sleep(100);
			String line = "";
			line = getLine();
			if (line != null) {
				_collector.emit(new Values(line));
			}
		}

		public String getLine() {
			String line;
			try {
				line = br.readLine();
				if (line == null && numberOfHamlets < emitHamlets) {
					numberOfHamlets++;
					br = new BufferedReader(new FileReader(path));
					line = br.readLine();
				}
				return line;
			} catch (IOException e) {
				e.printStackTrace();
				return null;
			}
		}

		@Override
		public void ack(Object id) {
		}

		@Override
		public void fail(Object id) {
		}

		@Override
		public void declareOutputFields(OutputFieldsDeclarer declarer) {
			declarer.declare(new Fields("word"));
		}
	}

	public static class SplitSentence extends ShellBolt implements IRichBolt {

		public SplitSentence() {
			super("python", "splitsentence.py");
		}

		@Override
		public void declareOutputFields(OutputFieldsDeclarer declarer) {
			declarer.declare(new Fields("word"));
		}

		@Override
		public Map<String, Object> getComponentConfiguration() {
			return null;
		}
	}

	public static class WordCount extends BaseBasicBolt {
		Map<String, Integer> counts = new HashMap<String, Integer>();

		@Override
		public void execute(Tuple tuple, BasicOutputCollector collector) {
			String word = tuple.getString(0);
			Integer count = counts.get(word);
			if (count == null)
				count = 0;
			count++;
			counts.put(word, count);
			collector.emit(new Values(word, count));
		}

		@Override
		public void declareOutputFields(OutputFieldsDeclarer declarer) {
			declarer.declare(new Fields("word", "count"));
		}
	}

	public static void main(String[] args) throws Exception {

		TopologyBuilder builder = new TopologyBuilder();

		builder.setSpout("spout", new HamletSpout(), 1);
		builder.setBolt("split", new SplitSentence(), 1).shuffleGrouping("spout");
		builder.setBolt("count", new WordCount(), 1).fieldsGrouping("split", new Fields("word"));

		Config conf = new Config();
		conf.setDebug(true);

		if (args != null && args.length > 0) {
			conf.setNumWorkers(3);
			StormSubmitter.submitTopology(args[0], conf, builder.createTopology());
		} else {
			conf.setMaxTaskParallelism(3);
			LocalCluster cluster = new LocalCluster();
			cluster.submitTopology("word-count", conf, builder.createTopology());
			Thread.sleep(10000);
			cluster.shutdown();
		}
	}
}
```
flink-addons/flink-streaming/src/test/resources/Performance/copy-files.sh (new file, mode 100755)
```bash
#!/bin/bash
toDir=$1
if [ -d "${toDir}" ]; then
	ssh strato@dell150.ilab.sztaki.hu '
	for j in {101..142} 144 145;
	do
		for i in $(ssh dell$j "ls stratosphere-distrib/log/counter/");
			do scp strato@dell$j:stratosphere-distrib/log/counter/$i stratosphere-distrib/log/all_tests/counter/$i-$j.csv;
		done
		for i in $(ssh dell$j "ls stratosphere-distrib/log/timer/");
			do scp strato@dell$j:stratosphere-distrib/log/timer/$i stratosphere-distrib/log/all_tests/timer/$i-$j.csv;
		done
		for i in $(ls stratosphere-distrib/log/counter/);
			do cp stratosphere-distrib/log/counter/$i stratosphere-distrib/log/all_tests/counter/$i-150.csv;
		done
		for i in $(ls stratosphere-distrib/log/timer/);
			do cp stratosphere-distrib/log/timer/$i stratosphere-distrib/log/all_tests/timer/$i-150.csv;
		done
	done
	'
	scp strato@dell150.ilab.sztaki.hu:stratosphere-distrib/log/all_tests/counter/* $toDir/counter/
	scp strato@dell150.ilab.sztaki.hu:stratosphere-distrib/log/all_tests/timer/* $toDir/timer/
else
	echo "USAGE:"
	echo "run <directory>"
fi
```
(no newline at end of file)
flink-addons/flink-streaming/src/test/resources/Performance/remove-files.sh (new file, mode 100755)
```bash
#!/bin/bash
ssh strato@dell150.ilab.sztaki.hu '
for j in {101..142} 144 145;
do
	$(ssh dell$j 'rm stratosphere-distrib/log/counter/*');
	$(ssh dell$j 'rm stratosphere-distrib/log/timer/*');
done
rm stratosphere-distrib/log/counter/*
rm stratosphere-distrib/log/timer/*
rm stratosphere-distrib/log/all_tests/counter/*
rm stratosphere-distrib/log/all_tests/timer/*
'
```