apache / SkyWalking
Commit 1a84431f
Authored Feb 18, 2016 by wu-sheng
Parent: 500183d5

1. Move the parameter-initialization method into setup().

Showing 2 changed files with 92 additions and 66 deletions:
- skywalking-analysis/src/main/java/com/ai/cloud/skywalking/analysis/categorize2chain/Categorize2ChainMapper.java (+85, -63)
- skywalking-analysis/src/main/java/com/ai/cloud/skywalking/analysis/categorize2chain/Categorize2ChainReducer.java (+7, -3)
skywalking-analysis/src/main/java/com/ai/cloud/skywalking/analysis/categorize2chain/Categorize2ChainMapper.java @ 1a84431f

In this file the imports are reorganized and the `ConfigInitializer.initialize()` call moves out of `map()` into a new `setup()` override. After the commit the file reads as follows (the HBase/Hadoop imports at the top sit outside the shown hunk `@@ -17,68 +17,90 @@` and are reconstructed from the classes the code uses):

```java
package com.ai.cloud.skywalking.analysis.categorize2chain;

import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapper;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.Text;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import java.io.IOException;
import java.util.*;

import com.ai.cloud.skywalking.analysis.categorize2chain.filter.SpanNodeProcessChain;
import com.ai.cloud.skywalking.analysis.categorize2chain.filter.SpanNodeProcessFilter;
import com.ai.cloud.skywalking.analysis.categorize2chain.model.ChainInfo;
import com.ai.cloud.skywalking.analysis.categorize2chain.model.ChainNode;
import com.ai.cloud.skywalking.analysis.categorize2chain.util.HBaseUtil;
import com.ai.cloud.skywalking.analysis.config.ConfigInitializer;
import com.ai.cloud.skywalking.protocol.Span;

public class Categorize2ChainMapper extends TableMapper<Text, ChainInfo> {
    private Logger logger = LoggerFactory.getLogger(Categorize2ChainMapper.class.getName());

    @Override
    protected void setup(Context context) throws IOException, InterruptedException {
        // New in this commit: one-time configuration loading, instead of
        // calling ConfigInitializer.initialize() on every map() invocation.
        ConfigInitializer.initialize();
    }

    @Override
    protected void map(ImmutableBytesWritable key, Result value, Context context)
            throws IOException, InterruptedException {
        List<Span> spanList = new ArrayList<Span>();
        ChainInfo chainInfo = null;
        try {
            for (Cell cell : value.rawCells()) {
                Span span = new Span(Bytes.toString(cell.getValueArray(),
                        cell.getValueOffset(), cell.getValueLength()));
                spanList.add(span);
            }
            chainInfo = spanToChainInfo(Bytes.toString(key.get()), spanList);
            logger.info("Success convert span to chain info...." + chainInfo.getCID());
            context.write(new Text(chainInfo.getUserId() + ":" + chainInfo.getEntranceNodeToken()),
                    chainInfo);
        } catch (Exception e) {
            logger.error("Failed to mapper call chain[" + key.toString() + "]", e);
        }
    }

    public static ChainInfo spanToChainInfo(String key, List<Span> spanList) {
        SubLevelSpanCostCounter costMap = new SubLevelSpanCostCounter();
        ChainInfo chainInfo = new ChainInfo();
        Collections.sort(spanList, new Comparator<Span>() {
            @Override
            public int compare(Span span1, Span span2) {
                String span1TraceLevel = span1.getParentLevel() + "." + span1.getLevelId();
                String span2TraceLevel = span2.getParentLevel() + "." + span2.getLevelId();
                return span1TraceLevel.compareTo(span2TraceLevel);
            }
        });

        Map<String, SpanEntry> spanEntryMap = mergeSpanDataSet(spanList);
        for (Map.Entry<String, SpanEntry> entry : spanEntryMap.entrySet()) {
            ChainNode chainNode = new ChainNode();
            SpanNodeProcessFilter filter = SpanNodeProcessChain
                    .getProcessChainByCallType(entry.getValue().getSpanType());
            filter.doFilter(entry.getValue(), chainNode, costMap);
            chainInfo.addNodes(chainNode);
        }

        chainInfo.generateChainToken();
        HBaseUtil.saveCidTidMapping(key, chainInfo);
        return chainInfo;
    }

    private static Map<String, SpanEntry> mergeSpanDataSet(List<Span> spanList) {
        Map<String, SpanEntry> spanEntryMap = new LinkedHashMap<String, SpanEntry>();
        for (int i = spanList.size() - 1; i >= 0; i--) {
            Span span = spanList.get(i);
            SpanEntry spanEntry = spanEntryMap.get(span.getParentLevel() + "." + span.getLevelId());
            if (spanEntry == null) {
                spanEntry = new SpanEntry();
                spanEntryMap.put(span.getParentLevel() + "." + span.getLevelId(), spanEntry);
            }
            spanEntry.setSpan(span);
        }
        return spanEntryMap;
    }
}
```
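The reason this move matters is the Hadoop task lifecycle: the framework calls `setup()` once per task attempt but `map()` once per input record, so one-time work such as configuration loading belongs in `setup()`. A minimal sketch of that lifecycle, in plain Java with no Hadoop dependency (the `MiniMapper` class and its counters are illustrative, not from this commit):

```java
import java.util.Arrays;
import java.util.List;

// Illustrative stand-in for a Hadoop Mapper: the framework calls setup()
// once per task attempt, then map() once per input record.
class MiniMapper {
    int initCalls = 0;   // counts one-time initialization work
    int mapCalls = 0;    // counts per-record work

    void setup() {
        // One-time work such as ConfigInitializer.initialize() belongs here.
        initCalls++;
    }

    void map(String record) {
        // Per-record work only; no initialization.
        mapCalls++;
    }

    // Mimics the framework's run loop: setup once, then map per record.
    void run(List<String> records) {
        setup();
        for (String r : records) {
            map(r);
        }
    }

    public static void main(String[] args) {
        MiniMapper m = new MiniMapper();
        m.run(Arrays.asList("a", "b", "c"));
        // Before this commit the initialize() call sat inside map(), so it
        // would have executed once per record instead of once per task.
        System.out.println("init=" + m.initCalls + " map=" + m.mapCalls);
    }
}
```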
skywalking-analysis/src/main/java/com/ai/cloud/skywalking/analysis/categorize2chain/Categorize2ChainReducer.java @ 1a84431f

The same change is applied to the reducer: the `ConfigInitializer` import moves to the project-import block (hunks `@@ -3,8 +3,6 @@` and `@@ -13,13 +11,19 @@`), and the `ConfigInitializer.initialize()` call moves from `reduce()` into a new `setup()` override. The changed portion of the file after the commit:

```java
package com.ai.cloud.skywalking.analysis.categorize2chain;

import java.io.IOException;
import java.util.Iterator;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.ai.cloud.skywalking.analysis.categorize2chain.model.ChainInfo;
import com.ai.cloud.skywalking.analysis.categorize2chain.util.HBaseUtil;
import com.ai.cloud.skywalking.analysis.config.ConfigInitializer;

public class Categorize2ChainReducer extends Reducer<Text, ChainInfo, Text, IntWritable> {
    private static Logger logger = LoggerFactory.getLogger(Categorize2ChainReducer.class.getName());

    @Override
    protected void setup(Context context) throws IOException, InterruptedException {
        // New in this commit: one-time configuration loading, instead of
        // calling ConfigInitializer.initialize() on every reduce() invocation.
        ConfigInitializer.initialize();
    }

    @Override
    protected void reduce(Text key, Iterable<ChainInfo> values, Context context)
            throws IOException, InterruptedException {
        int totalCount = reduceAction(key.toString(), values.iterator());
        context.write(new Text(key.toString()), new IntWritable(totalCount));
    }

    // ... (reduceAction and the rest of the class are unchanged and elided in the diff)
}
```
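Since both the mapper's and the reducer's `setup()` now call `ConfigInitializer.initialize()`, and `setup()` can run in several tasks within one JVM (e.g. with JVM reuse), such an initializer is typically written to be idempotent. A hypothetical sketch of that pattern, assuming semantics similar to the project's `ConfigInitializer` (this is not the project's actual code):

```java
// Hypothetical idempotent one-time initializer: repeated calls, whether from
// a Mapper.setup() or a Reducer.setup(), do the expensive work only once.
final class OnceInitializer {
    private static volatile boolean initialized = false;
    static int loadCount = 0;   // exposed only so the demo can observe the work

    static synchronized void initialize() {
        if (initialized) {
            return;             // already done; safe to call again
        }
        loadCount++;            // stands in for loading real configuration
        initialized = true;
    }

    public static void main(String[] args) {
        OnceInitializer.initialize();   // e.g. from the mapper's setup()
        OnceInitializer.initialize();   // e.g. from the reducer's setup()
        System.out.println("loads=" + OnceInitializer.loadCount);
    }
}
```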
...
...
编辑
预览
Markdown
is supported
0%
请重试
或
添加新附件
.
添加附件
取消
You are about to add
0
people
to the discussion. Proceed with caution.
先完成此消息的编辑!
取消
想要评论请
注册
或
登录