apache / SkyWalking
Commit 020ce1c3
Authored on Oct 24, 2017 by peng-yongsheng
Commit message: no message
Parent: 50faf615

Showing 31 changed files with 210 additions and 250 deletions (+210 −250)
apm-collector/apm-collector-agentjvm/src/main/java/org/skywalking/apm/collector/agentjvm/worker/cpu/dao/CpuMetricH2DAO.java (+4 −4)
apm-collector/apm-collector-agentjvm/src/main/java/org/skywalking/apm/collector/agentjvm/worker/gc/dao/GCMetricH2DAO.java (+1 −2)
apm-collector/apm-collector-agentjvm/src/main/java/org/skywalking/apm/collector/agentjvm/worker/heartbeat/dao/InstanceHeartBeatH2DAO.java (+3 −3)
apm-collector/apm-collector-agentjvm/src/main/java/org/skywalking/apm/collector/agentjvm/worker/memory/dao/MemoryMetricH2DAO.java (+1 −2)
apm-collector/apm-collector-agentjvm/src/main/java/org/skywalking/apm/collector/agentjvm/worker/memorypool/dao/MemoryPoolMetricH2DAO.java (+1 −2)
apm-collector/apm-collector-agentregister/src/main/java/org/skywalking/apm/collector/agentregister/worker/application/ApplicationH2TableDefine.java (+2 −0)
apm-collector/apm-collector-agentregister/src/main/java/org/skywalking/apm/collector/agentregister/worker/application/dao/ApplicationEsDAO.java (+1 −1)
apm-collector/apm-collector-agentregister/src/main/java/org/skywalking/apm/collector/agentregister/worker/application/dao/ApplicationH2DAO.java (+10 −4)
apm-collector/apm-collector-agentregister/src/main/java/org/skywalking/apm/collector/agentregister/worker/instance/InstanceH2TableDefine.java (+1 −0)
apm-collector/apm-collector-agentregister/src/main/java/org/skywalking/apm/collector/agentregister/worker/instance/dao/InstanceH2DAO.java (+3 −2)
apm-collector/apm-collector-agentregister/src/main/java/org/skywalking/apm/collector/agentregister/worker/servicename/ServiceNameH2TableDefine.java (+1 −0)
apm-collector/apm-collector-agentregister/src/main/java/org/skywalking/apm/collector/agentregister/worker/servicename/dao/ServiceNameEsDAO.java (+1 −1)
apm-collector/apm-collector-agentregister/src/main/java/org/skywalking/apm/collector/agentregister/worker/servicename/dao/ServiceNameH2DAO.java (+1 −0)
apm-collector/apm-collector-agentstream/src/main/java/org/skywalking/apm/collector/agentstream/worker/global/dao/GlobalTraceH2DAO.java (+4 −4)
apm-collector/apm-collector-agentstream/src/main/java/org/skywalking/apm/collector/agentstream/worker/instance/performance/dao/InstPerformanceH2DAO.java (+7 −5)
apm-collector/apm-collector-agentstream/src/main/java/org/skywalking/apm/collector/agentstream/worker/node/component/dao/NodeComponentH2DAO.java (+4 −6)
apm-collector/apm-collector-agentstream/src/main/java/org/skywalking/apm/collector/agentstream/worker/node/mapping/dao/NodeMappingH2DAO.java (+7 −5)
apm-collector/apm-collector-agentstream/src/main/java/org/skywalking/apm/collector/agentstream/worker/noderef/dao/NodeReferenceH2DAO.java (+7 −5)
apm-collector/apm-collector-agentstream/src/main/java/org/skywalking/apm/collector/agentstream/worker/segment/cost/dao/SegmentCostH2DAO.java (+6 −4)
apm-collector/apm-collector-agentstream/src/main/java/org/skywalking/apm/collector/agentstream/worker/segment/origin/dao/SegmentH2DAO.java (+1 −1)
apm-collector/apm-collector-agentstream/src/main/java/org/skywalking/apm/collector/agentstream/worker/service/entry/dao/ServiceEntryH2DAO.java (+3 −3)
apm-collector/apm-collector-agentstream/src/main/java/org/skywalking/apm/collector/agentstream/worker/serviceref/dao/ServiceReferenceH2DAO.java (+5 −5)
apm-collector/apm-collector-boot/src/main/resources/application.yml (+5 −1)
apm-collector/apm-collector-storage/src/main/java/org/skywalking/apm/collector/storage/h2/SqlBuilder.java (+6 −10)
apm-collector/apm-collector-ui/src/main/java/org/skywalking/apm/collector/ui/dao/CpuMetricH2DAO.java (+17 −23)
apm-collector/apm-collector-ui/src/main/java/org/skywalking/apm/collector/ui/dao/GCMetricH2DAO.java (+31 −52)
apm-collector/apm-collector-ui/src/main/java/org/skywalking/apm/collector/ui/dao/InstPerformanceH2DAO.java (+31 −43)
apm-collector/apm-collector-ui/src/main/java/org/skywalking/apm/collector/ui/dao/InstanceH2DAO.java (+2 −2)
apm-collector/apm-collector-ui/src/main/java/org/skywalking/apm/collector/ui/dao/MemoryMetricH2DAO.java (+21 −29)
apm-collector/apm-collector-ui/src/main/java/org/skywalking/apm/collector/ui/dao/MemoryPoolMetricH2DAO.java (+22 −30)
apm-collector/apm-collector-ui/src/main/java/org/skywalking/apm/collector/ui/dao/SegmentH2DAO.java (+1 −1)
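The recurring change across these H2 DAOs is replacing hard-coded "id" column literals with the `*Table.COLUMN_ID` constants when building SQL. The template strings in the diff ("select * from {0} where {1} = ?") suggest MessageFormat-style substitution; the sketch below illustrates that pattern with a hypothetical stand-in for `SqlBuilder.buildSql` (the real SqlBuilder internals are not shown in this commit, so this is an assumption):

```java
import java.text.MessageFormat;

public class SqlBuilderSketch {
    // Hypothetical stand-in for SqlBuilder.buildSql: substitutes table and
    // column names into a MessageFormat-style template, leaving the JDBC
    // "?" placeholder for the parameter value.
    static String buildSql(String template, Object... args) {
        return MessageFormat.format(template, args);
    }

    public static void main(String[] args) {
        String GET_SQL = "select * from {0} where {1} = ?";
        // Before this commit: column name hard-coded as a string literal.
        String before = buildSql(GET_SQL, "inst_performance", "id");
        // After: column name taken from the table-define constant
        // (e.g. InstPerformanceTable.COLUMN_ID in the real code; the
        // constant's value "id" here is assumed for illustration).
        String COLUMN_ID = "id";
        String after = buildSql(GET_SQL, "inst_performance", COLUMN_ID);
        System.out.println(before);
        System.out.println(after);
    }
}
```

Both calls produce the same SQL text; the point of the change is that the column name is now defined once, next to the table definition, instead of being repeated as a literal in each DAO.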
apm-collector/apm-collector-agentjvm/src/main/java/org/skywalking/apm/collector/agentjvm/worker/cpu/dao/CpuMetricH2DAO.java

@@ -18,6 +18,8 @@
 package org.skywalking.apm.collector.agentjvm.worker.cpu.dao;

+import java.util.HashMap;
+import java.util.Map;
 import org.skywalking.apm.collector.core.stream.Data;
 import org.skywalking.apm.collector.storage.define.DataDefine;
 import org.skywalking.apm.collector.storage.define.jvm.CpuMetricTable;
@@ -28,14 +30,12 @@ import org.skywalking.apm.collector.stream.worker.impl.dao.IPersistenceDAO;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
-import java.util.HashMap;
-import java.util.Map;

 /**
  * @author peng-yongsheng, clevertension
  */
 public class CpuMetricH2DAO extends H2DAO implements ICpuMetricDAO, IPersistenceDAO<H2SqlEntity, H2SqlEntity> {
     private final Logger logger = LoggerFactory.getLogger(CpuMetricH2DAO.class);

     @Override
     public Data get(String id, DataDefine dataDefine) {
         return null;
     }
@@ -43,7 +43,7 @@ public class CpuMetricH2DAO extends H2DAO implements ICpuMetricDAO, IPersistence
     @Override
     public H2SqlEntity prepareBatchInsert(Data data) {
         H2SqlEntity entity = new H2SqlEntity();
         Map<String, Object> source = new HashMap<>();
-        source.put("id", data.getDataString(0));
+        source.put(CpuMetricTable.COLUMN_ID, data.getDataString(0));
         source.put(CpuMetricTable.COLUMN_INSTANCE_ID, data.getDataInteger(0));
         source.put(CpuMetricTable.COLUMN_USAGE_PERCENT, data.getDataDouble(0));
         source.put(CpuMetricTable.COLUMN_TIME_BUCKET, data.getDataLong(0));
apm-collector/apm-collector-agentjvm/src/main/java/org/skywalking/apm/collector/agentjvm/worker/gc/dao/GCMetricH2DAO.java

@@ -20,7 +20,6 @@ package org.skywalking.apm.collector.agentjvm.worker.gc.dao;
 import java.util.HashMap;
 import java.util.Map;
 import org.skywalking.apm.collector.core.stream.Data;
 import org.skywalking.apm.collector.storage.define.DataDefine;
 import org.skywalking.apm.collector.storage.define.jvm.GCMetricTable;
@@ -40,7 +39,7 @@ public class GCMetricH2DAO extends H2DAO implements IGCMetricDAO, IPersistenceDA
     @Override
     public H2SqlEntity prepareBatchInsert(Data data) {
         H2SqlEntity entity = new H2SqlEntity();
         Map<String, Object> source = new HashMap<>();
-        source.put("id", data.getDataString(0));
+        source.put(GCMetricTable.COLUMN_ID, data.getDataString(0));
         source.put(GCMetricTable.COLUMN_INSTANCE_ID, data.getDataInteger(0));
         source.put(GCMetricTable.COLUMN_PHRASE, data.getDataInteger(1));
         source.put(GCMetricTable.COLUMN_COUNT, data.getDataLong(0));
apm-collector/apm-collector-agentjvm/src/main/java/org/skywalking/apm/collector/agentjvm/worker/heartbeat/dao/InstanceHeartBeatH2DAO.java

@@ -24,7 +24,6 @@ import java.util.ArrayList;
 import java.util.HashMap;
 import java.util.List;
 import java.util.Map;
 import org.skywalking.apm.collector.client.h2.H2Client;
 import org.skywalking.apm.collector.client.h2.H2ClientException;
 import org.skywalking.apm.collector.core.framework.UnexpectedException;
@@ -44,10 +43,11 @@ import org.slf4j.LoggerFactory;
 public class InstanceHeartBeatH2DAO extends H2DAO implements IInstanceHeartBeatDAO, IPersistenceDAO<H2SqlEntity, H2SqlEntity> {
     private final Logger logger = LoggerFactory.getLogger(InstanceHeartBeatEsDAO.class);
     private static final String GET_INSTANCE_HEARTBEAT_SQL = "select * from {0} where {1} = ?";

     @Override
     public Data get(String id, DataDefine dataDefine) {
         H2Client client = getClient();
         String sql = SqlBuilder.buildSql(GET_INSTANCE_HEARTBEAT_SQL, InstanceTable.TABLE, InstanceTable.COLUMN_INSTANCE_ID);
-        Object[] params = new Object[]{id};
+        Object[] params = new Object[] {id};
         try (ResultSet rs = client.executeQuery(sql, params)) {
             if (rs.next()) {
                 Data data = dataDefine.build(id);
@@ -69,7 +69,7 @@ public class InstanceHeartBeatH2DAO extends H2DAO implements IInstanceHeartBeatD
         H2SqlEntity entity = new H2SqlEntity();
         Map<String, Object> source = new HashMap<>();
         source.put(InstanceTable.COLUMN_HEARTBEAT_TIME, data.getDataLong(0));
-        String sql = SqlBuilder.buildBatchUpdateSql(InstanceTable.TABLE, source.keySet(), InstanceTable.COLUMN_APPLICATION_ID);
+        String sql = SqlBuilder.buildBatchUpdateSql(InstanceTable.TABLE, source.keySet(), InstanceTable.COLUMN_INSTANCE_ID);
         entity.setSql(sql);
         List<Object> params = new ArrayList<>(source.values());
         params.add(data.getDataString(0));
apm-collector/apm-collector-agentjvm/src/main/java/org/skywalking/apm/collector/agentjvm/worker/memory/dao/MemoryMetricH2DAO.java

@@ -20,7 +20,6 @@ package org.skywalking.apm.collector.agentjvm.worker.memory.dao;
 import java.util.HashMap;
 import java.util.Map;
 import org.skywalking.apm.collector.core.stream.Data;
 import org.skywalking.apm.collector.storage.define.DataDefine;
 import org.skywalking.apm.collector.storage.define.jvm.MemoryMetricTable;
@@ -40,7 +39,7 @@ public class MemoryMetricH2DAO extends H2DAO implements IMemoryMetricDAO, IPersi
     @Override
     public H2SqlEntity prepareBatchInsert(Data data) {
         H2SqlEntity entity = new H2SqlEntity();
         Map<String, Object> source = new HashMap<>();
-        source.put("id", data.getDataString(0));
+        source.put(MemoryMetricTable.COLUMN_ID, data.getDataString(0));
         source.put(MemoryMetricTable.COLUMN_APPLICATION_INSTANCE_ID, data.getDataInteger(0));
         source.put(MemoryMetricTable.COLUMN_IS_HEAP, data.getDataBoolean(0));
         source.put(MemoryMetricTable.COLUMN_INIT, data.getDataLong(0));
apm-collector/apm-collector-agentjvm/src/main/java/org/skywalking/apm/collector/agentjvm/worker/memorypool/dao/MemoryPoolMetricH2DAO.java

@@ -20,7 +20,6 @@ package org.skywalking.apm.collector.agentjvm.worker.memorypool.dao;
 import java.util.HashMap;
 import java.util.Map;
 import org.skywalking.apm.collector.core.stream.Data;
 import org.skywalking.apm.collector.storage.define.DataDefine;
 import org.skywalking.apm.collector.storage.define.jvm.MemoryPoolMetricTable;
@@ -40,7 +39,7 @@ public class MemoryPoolMetricH2DAO extends H2DAO implements IMemoryPoolMetricDAO
     @Override
     public H2SqlEntity prepareBatchInsert(Data data) {
         H2SqlEntity entity = new H2SqlEntity();
         Map<String, Object> source = new HashMap<>();
-        source.put("id", data.getDataString(0));
+        source.put(MemoryPoolMetricTable.COLUMN_ID, data.getDataString(0));
         source.put(MemoryPoolMetricTable.COLUMN_INSTANCE_ID, data.getDataInteger(0));
         source.put(MemoryPoolMetricTable.COLUMN_POOL_TYPE, data.getDataInteger(1));
         source.put(MemoryPoolMetricTable.COLUMN_INIT, data.getDataLong(0));
apm-collector/apm-collector-agentregister/src/main/java/org/skywalking/apm/collector/agentregister/worker/application/ApplicationH2TableDefine.java

@@ -18,6 +18,7 @@
 package org.skywalking.apm.collector.agentregister.worker.application;

+import org.skywalking.apm.collector.storage.define.global.GlobalTraceTable;
 import org.skywalking.apm.collector.storage.define.register.ApplicationTable;
 import org.skywalking.apm.collector.storage.h2.define.H2ColumnDefine;
 import org.skywalking.apm.collector.storage.h2.define.H2TableDefine;
@@ -32,6 +33,7 @@ public class ApplicationH2TableDefine extends H2TableDefine {
     }

     @Override
     public void initialize() {
+        addColumn(new H2ColumnDefine(GlobalTraceTable.COLUMN_ID, H2ColumnDefine.Type.Varchar.name()));
         addColumn(new H2ColumnDefine(ApplicationTable.COLUMN_APPLICATION_CODE, H2ColumnDefine.Type.Varchar.name()));
         addColumn(new H2ColumnDefine(ApplicationTable.COLUMN_APPLICATION_ID, H2ColumnDefine.Type.Int.name()));
     }
apm-collector/apm-collector-agentregister/src/main/java/org/skywalking/apm/collector/agentregister/worker/application/dao/ApplicationEsDAO.java

@@ -47,7 +47,7 @@ public class ApplicationEsDAO extends EsDAO implements IApplicationDAO {
     @Override
     public void save(ApplicationDataDefine.Application application) {
         logger.debug("save application register info, application id: {}, application code: {}", application.getApplicationId(), application.getApplicationCode());
         ElasticSearchClient client = getClient();
-        Map<String, Object> source = new HashMap();
+        Map<String, Object> source = new HashMap<>();
         source.put(ApplicationTable.COLUMN_APPLICATION_CODE, application.getApplicationCode());
         source.put(ApplicationTable.COLUMN_APPLICATION_ID, application.getApplicationId());
apm-collector/apm-collector-agentregister/src/main/java/org/skywalking/apm/collector/agentregister/worker/application/dao/ApplicationH2DAO.java

@@ -18,6 +18,8 @@
 package org.skywalking.apm.collector.agentregister.worker.application.dao;

+import java.util.HashMap;
+import java.util.Map;
 import org.skywalking.apm.collector.client.h2.H2Client;
 import org.skywalking.apm.collector.client.h2.H2ClientException;
 import org.skywalking.apm.collector.storage.define.register.ApplicationDataDefine;
@@ -32,7 +34,6 @@ import org.slf4j.LoggerFactory;
  */
 public class ApplicationH2DAO extends H2DAO implements IApplicationDAO {
     private final Logger logger = LoggerFactory.getLogger(ApplicationH2DAO.class);
-    private static final String INSERT_APPLICATION_SQL = "insert into {0}({1}, {2}) values(?, ?)";

     @Override
     public int getMaxApplicationId() {
@@ -47,9 +48,14 @@ public class ApplicationH2DAO extends H2DAO implements IApplicationDAO {
     @Override
     public void save(ApplicationDataDefine.Application application) {
         H2Client client = getClient();
-        String sql = SqlBuilder.buildSql(INSERT_APPLICATION_SQL, ApplicationTable.TABLE, ApplicationTable.COLUMN_APPLICATION_ID, ApplicationTable.COLUMN_APPLICATION_CODE);
-        Object[] params = new Object[] {application.getApplicationId(), application.getApplicationCode()};
+        Map<String, Object> source = new HashMap<>();
+        source.put(ApplicationTable.COLUMN_ID, application.getApplicationId());
+        source.put(ApplicationTable.COLUMN_APPLICATION_CODE, application.getApplicationCode());
+        source.put(ApplicationTable.COLUMN_APPLICATION_ID, application.getApplicationId());
+        String sql = SqlBuilder.buildBatchInsertSql(ApplicationTable.TABLE, source.keySet());
+        Object[] params = source.values().toArray(new Object[0]);
         try {
             client.execute(sql, params);
         } catch (H2ClientException e) {
apm-collector/apm-collector-agentregister/src/main/java/org/skywalking/apm/collector/agentregister/worker/instance/InstanceH2TableDefine.java

@@ -32,6 +32,7 @@ public class InstanceH2TableDefine extends H2TableDefine {
     }

     @Override
     public void initialize() {
         addColumn(new H2ColumnDefine(InstanceTable.COLUMN_ID, H2ColumnDefine.Type.Varchar.name()));
         addColumn(new H2ColumnDefine(InstanceTable.COLUMN_APPLICATION_ID, H2ColumnDefine.Type.Int.name()));
         addColumn(new H2ColumnDefine(InstanceTable.COLUMN_AGENT_UUID, H2ColumnDefine.Type.Varchar.name()));
         addColumn(new H2ColumnDefine(InstanceTable.COLUMN_REGISTER_TIME, H2ColumnDefine.Type.Bigint.name()));
apm-collector/apm-collector-agentregister/src/main/java/org/skywalking/apm/collector/agentregister/worker/instance/dao/InstanceH2DAO.java

@@ -38,7 +38,7 @@ public class InstanceH2DAO extends H2DAO implements IInstanceDAO {
     private final Logger logger = LoggerFactory.getLogger(InstanceH2DAO.class);
     private static final String GET_INSTANCE_ID_SQL = "select {0} from {1} where {2} = ? and {3} = ?";
-    private static final String UPDATE_HEARTBEAT_TIME_SQL = "updatte {0} set {1} = ? where {2} = ?";
+    private static final String UPDATE_HEARTBEAT_TIME_SQL = "update {0} set {1} = ? where {2} = ?";

     @Override
     public int getInstanceId(int applicationId, String agentUUID) {
         logger.info("get the application id with application id = {}, agentUUID = {}", applicationId, agentUUID);
@@ -67,6 +67,7 @@ public class InstanceH2DAO extends H2DAO implements IInstanceDAO {
     @Override
     public void save(InstanceDataDefine.Instance instance) {
         H2Client client = getClient();
         Map<String, Object> source = new HashMap<>();
         source.put(InstanceTable.COLUMN_ID, instance.getId());
+        source.put(InstanceTable.COLUMN_INSTANCE_ID, instance.getInstanceId());
         source.put(InstanceTable.COLUMN_APPLICATION_ID, instance.getApplicationId());
         source.put(InstanceTable.COLUMN_AGENT_UUID, instance.getAgentUUID());
@@ -85,7 +86,7 @@ public class InstanceH2DAO extends H2DAO implements IInstanceDAO {
     @Override
     public void updateHeartbeatTime(int instanceId, long heartbeatTime) {
         H2Client client = getClient();
-        String sql = SqlBuilder.buildSql(UPDATE_HEARTBEAT_TIME_SQL, InstanceTable.TABLE, InstanceTable.COLUMN_HEARTBEAT_TIME, InstanceTable.COLUMN_INSTANCE_ID);
+        String sql = SqlBuilder.buildSql(UPDATE_HEARTBEAT_TIME_SQL, InstanceTable.TABLE, InstanceTable.COLUMN_HEARTBEAT_TIME, InstanceTable.COLUMN_ID);
         Object[] params = new Object[] {heartbeatTime, instanceId};
         try {
             client.execute(sql, params);
apm-collector/apm-collector-agentregister/src/main/java/org/skywalking/apm/collector/agentregister/worker/servicename/ServiceNameH2TableDefine.java

@@ -32,6 +32,7 @@ public class ServiceNameH2TableDefine extends H2TableDefine {
     }

     @Override
     public void initialize() {
         addColumn(new H2ColumnDefine(ServiceNameTable.COLUMN_ID, H2ColumnDefine.Type.Varchar.name()));
         addColumn(new H2ColumnDefine(ServiceNameTable.COLUMN_APPLICATION_ID, H2ColumnDefine.Type.Int.name()));
         addColumn(new H2ColumnDefine(ServiceNameTable.COLUMN_SERVICE_NAME, H2ColumnDefine.Type.Varchar.name()));
         addColumn(new H2ColumnDefine(ServiceNameTable.COLUMN_SERVICE_ID, H2ColumnDefine.Type.Int.name()));
apm-collector/apm-collector-agentregister/src/main/java/org/skywalking/apm/collector/agentregister/worker/servicename/dao/ServiceNameEsDAO.java

@@ -47,7 +47,7 @@ public class ServiceNameEsDAO extends EsDAO implements IServiceNameDAO {
     @Override
     public void save(ServiceNameDataDefine.ServiceName serviceName) {
         logger.debug("save service name register info, application id: {}, service name: {}", serviceName.getApplicationId(), serviceName.getServiceName());
         ElasticSearchClient client = getClient();
-        Map<String, Object> source = new HashMap();
+        Map<String, Object> source = new HashMap<>();
         source.put(ServiceNameTable.COLUMN_SERVICE_ID, serviceName.getServiceId());
         source.put(ServiceNameTable.COLUMN_APPLICATION_ID, serviceName.getApplicationId());
         source.put(ServiceNameTable.COLUMN_SERVICE_NAME, serviceName.getServiceName());
apm-collector/apm-collector-agentregister/src/main/java/org/skywalking/apm/collector/agentregister/worker/servicename/dao/ServiceNameH2DAO.java

@@ -50,6 +50,7 @@ public class ServiceNameH2DAO extends H2DAO implements IServiceNameDAO {
         logger.debug("save service name register info, application id: {}, service name: {}", serviceName.getApplicationId(), serviceName.getServiceName());
         H2Client client = getClient();
         Map<String, Object> source = new HashMap<>();
         source.put(ServiceNameTable.COLUMN_ID, serviceName.getId());
+        source.put(ServiceNameTable.COLUMN_SERVICE_ID, serviceName.getServiceId());
         source.put(ServiceNameTable.COLUMN_APPLICATION_ID, serviceName.getApplicationId());
         source.put(ServiceNameTable.COLUMN_SERVICE_NAME, serviceName.getServiceName());
apm-collector/apm-collector-agentstream/src/main/java/org/skywalking/apm/collector/agentstream/worker/global/dao/GlobalTraceH2DAO.java

@@ -18,6 +18,8 @@
 package org.skywalking.apm.collector.agentstream.worker.global.dao;

+import java.util.HashMap;
+import java.util.Map;
 import org.skywalking.apm.collector.core.framework.UnexpectedException;
 import org.skywalking.apm.collector.core.stream.Data;
 import org.skywalking.apm.collector.storage.define.DataDefine;
@@ -29,14 +31,12 @@ import org.skywalking.apm.collector.stream.worker.impl.dao.IPersistenceDAO;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
-import java.util.HashMap;
-import java.util.Map;

 /**
  * @author peng-yongsheng, clevertension
 */
 public class GlobalTraceH2DAO extends H2DAO implements IGlobalTraceDAO, IPersistenceDAO<H2SqlEntity, H2SqlEntity> {
     private final Logger logger = LoggerFactory.getLogger(GlobalTraceH2DAO.class);

     @Override
     public Data get(String id, DataDefine dataDefine) {
         throw new UnexpectedException("There is no need to merge stream data with database data.");
     }
@@ -48,7 +48,7 @@ public class GlobalTraceH2DAO extends H2DAO implements IGlobalTraceDAO, IPersist
     @Override
     public H2SqlEntity prepareBatchInsert(Data data) {
         Map<String, Object> source = new HashMap<>();
         H2SqlEntity entity = new H2SqlEntity();
-        source.put("id", data.getDataString(0));
+        source.put(GlobalTraceTable.COLUMN_ID, data.getDataString(0));
         source.put(GlobalTraceTable.COLUMN_SEGMENT_ID, data.getDataString(1));
         source.put(GlobalTraceTable.COLUMN_GLOBAL_TRACE_ID, data.getDataString(2));
         source.put(GlobalTraceTable.COLUMN_TIME_BUCKET, data.getDataLong(0));
apm-collector/apm-collector-agentstream/src/main/java/org/skywalking/apm/collector/agentstream/worker/instance/performance/dao/InstPerformanceH2DAO.java

@@ -24,7 +24,6 @@ import java.util.ArrayList;
 import java.util.HashMap;
 import java.util.List;
 import java.util.Map;
 import org.skywalking.apm.collector.client.h2.H2Client;
 import org.skywalking.apm.collector.client.h2.H2ClientException;
 import org.skywalking.apm.collector.core.stream.Data;
@@ -43,10 +42,11 @@ import org.slf4j.LoggerFactory;
 public class InstPerformanceH2DAO extends H2DAO implements IInstPerformanceDAO, IPersistenceDAO<H2SqlEntity, H2SqlEntity> {
     private final Logger logger = LoggerFactory.getLogger(InstPerformanceH2DAO.class);
     private static final String GET_SQL = "select * from {0} where {1} = ?";

     @Override
     public Data get(String id, DataDefine dataDefine) {
         H2Client client = getClient();
-        String sql = SqlBuilder.buildSql(GET_SQL, InstPerformanceTable.TABLE, "id");
-        Object[] params = new Object[]{id};
+        String sql = SqlBuilder.buildSql(GET_SQL, InstPerformanceTable.TABLE, InstPerformanceTable.COLUMN_ID);
+        Object[] params = new Object[] {id};
         try (ResultSet rs = client.executeQuery(sql, params)) {
             if (rs.next()) {
                 Data data = dataDefine.build(id);
@@ -62,10 +62,11 @@ public class InstPerformanceH2DAO extends H2DAO implements IInstPerformanceDAO,
         }
         return null;
     }

     @Override
     public H2SqlEntity prepareBatchInsert(Data data) {
         Map<String, Object> source = new HashMap<>();
         H2SqlEntity entity = new H2SqlEntity();
-        source.put("id", data.getDataString(0));
+        source.put(InstPerformanceTable.COLUMN_ID, data.getDataString(0));
         source.put(InstPerformanceTable.COLUMN_APPLICATION_ID, data.getDataInteger(0));
         source.put(InstPerformanceTable.COLUMN_INSTANCE_ID, data.getDataInteger(1));
         source.put(InstPerformanceTable.COLUMN_CALLS, data.getDataInteger(2));
@@ -76,6 +77,7 @@ public class InstPerformanceH2DAO extends H2DAO implements IInstPerformanceDAO,
         entity.setParams(source.values().toArray(new Object[0]));
         return entity;
     }

     @Override
     public H2SqlEntity prepareBatchUpdate(Data data) {
         Map<String, Object> source = new HashMap<>();
         H2SqlEntity entity = new H2SqlEntity();
@@ -85,7 +87,7 @@ public class InstPerformanceH2DAO extends H2DAO implements IInstPerformanceDAO,
         source.put(InstPerformanceTable.COLUMN_COST_TOTAL, data.getDataLong(0));
         source.put(InstPerformanceTable.COLUMN_TIME_BUCKET, data.getDataLong(1));
         String id = data.getDataString(0);
-        String sql = SqlBuilder.buildBatchUpdateSql(InstPerformanceTable.TABLE, source.keySet(), "id");
+        String sql = SqlBuilder.buildBatchUpdateSql(InstPerformanceTable.TABLE, source.keySet(), InstPerformanceTable.COLUMN_ID);
         entity.setSql(sql);
         List<Object> values = new ArrayList<>(source.values());
         values.add(id);
apm-collector/apm-collector-agentstream/src/main/java/org/skywalking/apm/collector/agentstream/worker/node/component/dao/NodeComponentH2DAO.java

@@ -24,13 +24,11 @@ import java.util.ArrayList;
 import java.util.HashMap;
 import java.util.List;
 import java.util.Map;
 import org.skywalking.apm.collector.client.h2.H2Client;
 import org.skywalking.apm.collector.client.h2.H2ClientException;
 import org.skywalking.apm.collector.core.stream.Data;
 import org.skywalking.apm.collector.storage.define.DataDefine;
 import org.skywalking.apm.collector.storage.define.node.NodeComponentTable;
-import org.skywalking.apm.collector.storage.define.serviceref.ServiceReferenceTable;
 import org.skywalking.apm.collector.storage.h2.SqlBuilder;
 import org.skywalking.apm.collector.storage.h2.dao.H2DAO;
 import org.skywalking.apm.collector.storage.h2.define.H2SqlEntity;
@@ -48,8 +46,8 @@ public class NodeComponentH2DAO extends H2DAO implements INodeComponentDAO, IPer
     @Override
     public Data get(String id, DataDefine dataDefine) {
         H2Client client = getClient();
-        String sql = SqlBuilder.buildSql(GET_SQL, ServiceReferenceTable.TABLE, "id");
-        Object[] params = new Object[]{id};
+        String sql = SqlBuilder.buildSql(GET_SQL, NodeComponentTable.TABLE, NodeComponentTable.COLUMN_ID);
+        Object[] params = new Object[] {id};
         try (ResultSet rs = client.executeQuery(sql, params)) {
             if (rs.next()) {
                 Data data = dataDefine.build(id);
@@ -70,7 +68,7 @@ public class NodeComponentH2DAO extends H2DAO implements INodeComponentDAO, IPer
     public H2SqlEntity prepareBatchInsert(Data data) {
         Map<String, Object> source = new HashMap<>();
         H2SqlEntity entity = new H2SqlEntity();
-        source.put("id", data.getDataString(0));
+        source.put(NodeComponentTable.COLUMN_ID, data.getDataString(0));
         source.put(NodeComponentTable.COLUMN_COMPONENT_ID, data.getDataInteger(0));
         source.put(NodeComponentTable.COLUMN_COMPONENT_NAME, data.getDataString(1));
         source.put(NodeComponentTable.COLUMN_PEER_ID, data.getDataInteger(1));
@@ -93,7 +91,7 @@ public class NodeComponentH2DAO extends H2DAO implements INodeComponentDAO, IPer
         source.put(NodeComponentTable.COLUMN_PEER, data.getDataString(2));
         source.put(NodeComponentTable.COLUMN_TIME_BUCKET, data.getDataLong(0));
         String id = data.getDataString(0);
-        String sql = SqlBuilder.buildBatchUpdateSql(NodeComponentTable.TABLE, source.keySet(), "id");
+        String sql = SqlBuilder.buildBatchUpdateSql(NodeComponentTable.TABLE, source.keySet(), NodeComponentTable.COLUMN_ID);
         entity.setSql(sql);
         List<Object> values = new ArrayList<>(source.values());
         values.add(id);
apm-collector/apm-collector-agentstream/src/main/java/org/skywalking/apm/collector/agentstream/worker/node/mapping/dao/NodeMappingH2DAO.java

@@ -24,7 +24,6 @@ import java.util.ArrayList;
 import java.util.HashMap;
 import java.util.List;
 import java.util.Map;
 import org.skywalking.apm.collector.agentstream.worker.node.component.dao.NodeComponentH2DAO;
 import org.skywalking.apm.collector.client.h2.H2Client;
 import org.skywalking.apm.collector.client.h2.H2ClientException;
@@ -44,10 +43,11 @@ import org.slf4j.LoggerFactory;
 public class NodeMappingH2DAO extends H2DAO implements INodeMappingDAO, IPersistenceDAO<H2SqlEntity, H2SqlEntity> {
     private final Logger logger = LoggerFactory.getLogger(NodeComponentH2DAO.class);
     private static final String GET_SQL = "select * from {0} where {1} = ?";

     @Override
     public Data get(String id, DataDefine dataDefine) {
         H2Client client = getClient();
-        String sql = SqlBuilder.buildSql(GET_SQL, NodeMappingTable.TABLE, "id");
-        Object[] params = new Object[]{id};
+        String sql = SqlBuilder.buildSql(GET_SQL, NodeMappingTable.TABLE, NodeMappingTable.COLUMN_ID);
+        Object[] params = new Object[] {id};
         try (ResultSet rs = client.executeQuery(sql, params)) {
             if (rs.next()) {
                 Data data = dataDefine.build(id);
@@ -62,10 +62,11 @@ public class NodeMappingH2DAO extends H2DAO implements INodeMappingDAO, IPersist
         }
         return null;
     }

     @Override
     public H2SqlEntity prepareBatchInsert(Data data) {
         Map<String, Object> source = new HashMap<>();
         H2SqlEntity entity = new H2SqlEntity();
-        source.put("id", data.getDataString(0));
+        source.put(NodeMappingTable.COLUMN_ID, data.getDataString(0));
         source.put(NodeMappingTable.COLUMN_APPLICATION_ID, data.getDataInteger(0));
         source.put(NodeMappingTable.COLUMN_ADDRESS_ID, data.getDataInteger(1));
         source.put(NodeMappingTable.COLUMN_ADDRESS, data.getDataString(1));
@@ -76,6 +77,7 @@ public class NodeMappingH2DAO extends H2DAO implements INodeMappingDAO, IPersist
         entity.setParams(source.values().toArray(new Object[0]));
         return entity;
     }

     @Override
     public H2SqlEntity prepareBatchUpdate(Data data) {
         Map<String, Object> source = new HashMap<>();
         H2SqlEntity entity = new H2SqlEntity();
@@ -84,7 +86,7 @@ public class NodeMappingH2DAO extends H2DAO implements INodeMappingDAO, IPersist
         source.put(NodeMappingTable.COLUMN_ADDRESS, data.getDataString(1));
         source.put(NodeMappingTable.COLUMN_TIME_BUCKET, data.getDataLong(0));
         String id = data.getDataString(0);
-        String sql = SqlBuilder.buildBatchUpdateSql(NodeMappingTable.TABLE, source.keySet(), "id");
+        String sql = SqlBuilder.buildBatchUpdateSql(NodeMappingTable.TABLE, source.keySet(), NodeMappingTable.COLUMN_ID);
         entity.setSql(sql);
         List<Object> values = new ArrayList<>(source.values());
         values.add(id);
apm-collector/apm-collector-agentstream/src/main/java/org/skywalking/apm/collector/agentstream/worker/noderef/dao/NodeReferenceH2DAO.java
View file @ 020ce1c3
@@ -24,7 +24,6 @@ import java.util.ArrayList;
 import java.util.HashMap;
 import java.util.List;
 import java.util.Map;
 import org.skywalking.apm.collector.client.h2.H2Client;
 import org.skywalking.apm.collector.client.h2.H2ClientException;
 import org.skywalking.apm.collector.core.stream.Data;
...
@@ -43,10 +42,11 @@ import org.slf4j.LoggerFactory;
 public class NodeReferenceH2DAO extends H2DAO implements INodeReferenceDAO, IPersistenceDAO<H2SqlEntity, H2SqlEntity> {
     private final Logger logger = LoggerFactory.getLogger(NodeReferenceH2DAO.class);
     private static final String GET_SQL = "select * from {0} where {1} = ?";

     @Override
     public Data get(String id, DataDefine dataDefine) {
         H2Client client = getClient();
-        String sql = SqlBuilder.buildSql(GET_SQL, NodeReferenceTable.TABLE, "id");
-        Object[] params = new Object[]{id};
+        String sql = SqlBuilder.buildSql(GET_SQL, NodeReferenceTable.TABLE, NodeReferenceTable.COLUMN_ID);
+        Object[] params = new Object[] {id};
         try (ResultSet rs = client.executeQuery(sql, params)) {
             if (rs.next()) {
                 Data data = dataDefine.build(id);
...
@@ -67,10 +67,11 @@ public class NodeReferenceH2DAO extends H2DAO implements INodeReferenceDAO, IPer
         }
         return null;
     }

     @Override
     public H2SqlEntity prepareBatchInsert(Data data) {
         Map<String, Object> source = new HashMap<>();
         H2SqlEntity entity = new H2SqlEntity();
-        source.put("id", data.getDataString(0));
+        source.put(NodeReferenceTable.COLUMN_ID, data.getDataString(0));
         source.put(NodeReferenceTable.COLUMN_FRONT_APPLICATION_ID, data.getDataInteger(0));
         source.put(NodeReferenceTable.COLUMN_BEHIND_APPLICATION_ID, data.getDataInteger(1));
         source.put(NodeReferenceTable.COLUMN_BEHIND_PEER, data.getDataString(1));
...
@@ -87,6 +88,7 @@ public class NodeReferenceH2DAO extends H2DAO implements INodeReferenceDAO, IPer
         entity.setParams(source.values().toArray(new Object[0]));
         return entity;
     }

     @Override
     public H2SqlEntity prepareBatchUpdate(Data data) {
         Map<String, Object> source = new HashMap<>();
         H2SqlEntity entity = new H2SqlEntity();
...
@@ -101,7 +103,7 @@ public class NodeReferenceH2DAO extends H2DAO implements INodeReferenceDAO, IPer
         source.put(NodeReferenceTable.COLUMN_ERROR, data.getDataInteger(7));
         source.put(NodeReferenceTable.COLUMN_TIME_BUCKET, data.getDataLong(0));
         String id = data.getDataString(0);
-        String sql = SqlBuilder.buildBatchUpdateSql(NodeReferenceTable.TABLE, source.keySet(), "id");
+        String sql = SqlBuilder.buildBatchUpdateSql(NodeReferenceTable.TABLE, source.keySet(), NodeReferenceTable.COLUMN_ID);
         entity.setSql(sql);
         List<Object> values = new ArrayList<>(source.values());
         values.add(id);
...
apm-collector/apm-collector-agentstream/src/main/java/org/skywalking/apm/collector/agentstream/worker/segment/cost/dao/SegmentCostH2DAO.java
View file @ 020ce1c3
@@ -18,6 +18,8 @@
 package org.skywalking.apm.collector.agentstream.worker.segment.cost.dao;

+import java.util.HashMap;
+import java.util.Map;
 import org.skywalking.apm.collector.core.stream.Data;
 import org.skywalking.apm.collector.storage.define.DataDefine;
 import org.skywalking.apm.collector.storage.define.segment.SegmentCostTable;
...
@@ -28,22 +30,21 @@ import org.skywalking.apm.collector.stream.worker.impl.dao.IPersistenceDAO;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;

-import java.util.HashMap;
-import java.util.Map;
-
 /**
  * @author peng-yongsheng, clevertension
  */
 public class SegmentCostH2DAO extends H2DAO implements ISegmentCostDAO, IPersistenceDAO<H2SqlEntity, H2SqlEntity> {
     private final Logger logger = LoggerFactory.getLogger(SegmentCostH2DAO.class);

     @Override
     public Data get(String id, DataDefine dataDefine) {
         return null;
     }

     @Override
     public H2SqlEntity prepareBatchInsert(Data data) {
         logger.debug("segment cost prepareBatchInsert, id: {}", data.getDataString(0));
         H2SqlEntity entity = new H2SqlEntity();
         Map<String, Object> source = new HashMap<>();
-        source.put("id", data.getDataString(0));
+        source.put(SegmentCostTable.COLUMN_ID, data.getDataString(0));
         source.put(SegmentCostTable.COLUMN_SEGMENT_ID, data.getDataString(1));
         source.put(SegmentCostTable.COLUMN_APPLICATION_ID, data.getDataInteger(0));
         source.put(SegmentCostTable.COLUMN_SERVICE_NAME, data.getDataString(2));
...
@@ -59,6 +60,7 @@ public class SegmentCostH2DAO extends H2DAO implements ISegmentCostDAO, IPersist
         entity.setParams(source.values().toArray(new Object[0]));
         return entity;
     }

     @Override
     public H2SqlEntity prepareBatchUpdate(Data data) {
         return null;
     }
...
apm-collector/apm-collector-agentstream/src/main/java/org/skywalking/apm/collector/agentstream/worker/segment/origin/dao/SegmentH2DAO.java
View file @ 020ce1c3
@@ -44,7 +44,7 @@ public class SegmentH2DAO extends H2DAO implements ISegmentDAO, IPersistenceDAO<
     @Override
     public H2SqlEntity prepareBatchInsert(Data data) {
         Map<String, Object> source = new HashMap<>();
         H2SqlEntity entity = new H2SqlEntity();
-        source.put("id", data.getDataString(0));
+        source.put(SegmentTable.COLUMN_ID, data.getDataString(0));
         source.put(SegmentTable.COLUMN_DATA_BINARY, data.getDataBytes(0));
         logger.debug("segment source: {}", source.toString());
...
apm-collector/apm-collector-agentstream/src/main/java/org/skywalking/apm/collector/agentstream/worker/service/entry/dao/ServiceEntryH2DAO.java
View file @ 020ce1c3
@@ -45,7 +45,7 @@ public class ServiceEntryH2DAO extends H2DAO implements IServiceEntryDAO, IPersi
     @Override
     public Data get(String id, DataDefine dataDefine) {
         H2Client client = getClient();
-        String sql = SqlBuilder.buildSql(GET_SERVICE_ENTRY_SQL, ServiceEntryTable.TABLE, "id");
+        String sql = SqlBuilder.buildSql(GET_SERVICE_ENTRY_SQL, ServiceEntryTable.TABLE, ServiceEntryTable.COLUMN_ID);
         Object[] params = new Object[] {id};
         try (ResultSet rs = client.executeQuery(sql, params)) {
             if (rs.next()) {
...
@@ -66,7 +66,7 @@ public class ServiceEntryH2DAO extends H2DAO implements IServiceEntryDAO, IPersi
     @Override
     public H2SqlEntity prepareBatchInsert(Data data) {
         H2SqlEntity entity = new H2SqlEntity();
         Map<String, Object> source = new HashMap<>();
-        source.put("id", data.getDataString(0));
+        source.put(ServiceEntryTable.COLUMN_ID, data.getDataString(0));
         source.put(ServiceEntryTable.COLUMN_APPLICATION_ID, data.getDataInteger(0));
         source.put(ServiceEntryTable.COLUMN_ENTRY_SERVICE_ID, data.getDataInteger(1));
         source.put(ServiceEntryTable.COLUMN_ENTRY_SERVICE_NAME, data.getDataString(1));
...
@@ -87,7 +87,7 @@ public class ServiceEntryH2DAO extends H2DAO implements IServiceEntryDAO, IPersi
         source.put(ServiceEntryTable.COLUMN_REGISTER_TIME, data.getDataLong(0));
         source.put(ServiceEntryTable.COLUMN_NEWEST_TIME, data.getDataLong(1));
         String id = data.getDataString(0);
-        String sql = SqlBuilder.buildBatchUpdateSql(ServiceEntryTable.TABLE, source.keySet(), "id");
+        String sql = SqlBuilder.buildBatchUpdateSql(ServiceEntryTable.TABLE, source.keySet(), ServiceEntryTable.COLUMN_ID);
         entity.setSql(sql);
         List<Object> values = new ArrayList<>(source.values());
         values.add(id);
...
apm-collector/apm-collector-agentstream/src/main/java/org/skywalking/apm/collector/agentstream/worker/serviceref/dao/ServiceReferenceH2DAO.java
View file @ 020ce1c3
@@ -24,7 +24,6 @@ import java.util.ArrayList;
 import java.util.HashMap;
 import java.util.List;
 import java.util.Map;
 import org.skywalking.apm.collector.client.h2.H2Client;
 import org.skywalking.apm.collector.client.h2.H2ClientException;
 import org.skywalking.apm.collector.core.stream.Data;
...
@@ -43,11 +42,12 @@ import org.slf4j.LoggerFactory;
 public class ServiceReferenceH2DAO extends H2DAO implements IServiceReferenceDAO, IPersistenceDAO<H2SqlEntity, H2SqlEntity> {
     private final Logger logger = LoggerFactory.getLogger(ServiceReferenceH2DAO.class);
     private static final String GET_SQL = "select * from {0} where {1} = ?";

     @Override
     public Data get(String id, DataDefine dataDefine) {
         H2Client client = getClient();
-        String sql = SqlBuilder.buildSql(GET_SQL, ServiceReferenceTable.TABLE, "id");
-        Object[] params = new Object[]{id};
+        String sql = SqlBuilder.buildSql(GET_SQL, ServiceReferenceTable.TABLE, ServiceReferenceTable.COLUMN_ID);
+        Object[] params = new Object[] {id};
         try (ResultSet rs = client.executeQuery(sql, params)) {
             if (rs.next()) {
                 Data data = dataDefine.build(id);
...
@@ -77,7 +77,7 @@ public class ServiceReferenceH2DAO extends H2DAO implements IServiceReferenceDAO
     public H2SqlEntity prepareBatchInsert(Data data) {
         H2SqlEntity entity = new H2SqlEntity();
         Map<String, Object> source = new HashMap<>();
-        source.put("id", data.getDataString(0));
+        source.put(ServiceReferenceTable.COLUMN_ID, data.getDataString(0));
         source.put(ServiceReferenceTable.COLUMN_ENTRY_SERVICE_ID, data.getDataInteger(0));
         source.put(ServiceReferenceTable.COLUMN_ENTRY_SERVICE_NAME, data.getDataString(1));
         source.put(ServiceReferenceTable.COLUMN_FRONT_SERVICE_ID, data.getDataInteger(1));
...
@@ -119,7 +119,7 @@ public class ServiceReferenceH2DAO extends H2DAO implements IServiceReferenceDAO
         source.put(ServiceReferenceTable.COLUMN_TIME_BUCKET, data.getDataLong(7));
         String id = data.getDataString(0);
-        String sql = SqlBuilder.buildBatchUpdateSql(ServiceReferenceTable.TABLE, source.keySet(), "id");
+        String sql = SqlBuilder.buildBatchUpdateSql(ServiceReferenceTable.TABLE, source.keySet(), ServiceReferenceTable.COLUMN_ID);
         entity.setSql(sql);
         List<Object> values = new ArrayList<>(source.values());
         values.add(id);
...
apm-collector/apm-collector-boot/src/main/resources/application.yml
View file @ 020ce1c3
@@ -33,4 +33,8 @@ collector_inside:
 #  cluster_transport_sniffer: true
 #  cluster_nodes: localhost:9300
 #  index_shards_number: 2
-#  index_replicas_number: 0
\ No newline at end of file
+#  index_replicas_number: 0
+storage:
+  h2:
+    url: jdbc:h2:tcp://localhost/~/test
+    user_name: sa
\ No newline at end of file
apm-collector/apm-collector-storage/src/main/java/org/skywalking/apm/collector/storage/h2/SqlBuilder.java
View file @ 020ce1c3
@@ -23,21 +23,19 @@ import java.util.List;
 import java.util.Set;

 public class SqlBuilder {
-    public static final String buildSql(String sql, Object... args) {
+    public static String buildSql(String sql, Object... args) {
         return MessageFormat.format(sql, args);
     }

-    public static final String buildSql(String sql, List<Object> args) {
+    public static String buildSql(String sql, List<Object> args) {
         MessageFormat messageFormat = new MessageFormat(sql);
         return messageFormat.format(args.toArray(new Object[0]));
     }

-    public static final String buildBatchInsertSql(String tableName, Set<String> columnNames) {
+    public static String buildBatchInsertSql(String tableName, Set<String> columnNames) {
         StringBuilder sb = new StringBuilder("insert into ");
         sb.append(tableName).append("(");
-        columnNames.forEach((columnName) -> {
-            sb.append(columnName).append(",");
-        });
+        columnNames.forEach((columnName) -> sb.append(columnName).append(","));
         sb.delete(sb.length() - 1, sb.length());
         sb.append(") values(");
         for (int i = 0; i < columnNames.size(); i++) {
...
@@ -48,12 +46,10 @@ public class SqlBuilder {
         return sb.toString();
     }

-    public static final String buildBatchUpdateSql(String tableName, Set<String> columnNames, String whereClauseName) {
+    public static String buildBatchUpdateSql(String tableName, Set<String> columnNames, String whereClauseName) {
         StringBuilder sb = new StringBuilder("update ");
         sb.append(tableName).append(" set ");
-        columnNames.forEach((columnName) -> {
-            sb.append(columnName).append("=?,");
-        });
+        columnNames.forEach((columnName) -> sb.append(columnName).append("=?,"));
         sb.delete(sb.length() - 1, sb.length());
         sb.append(" where ").append(whereClauseName).append("=?");
         return sb.toString();
...
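After this cleanup the builders are plain static helpers: `buildSql` delegates to `java.text.MessageFormat`, so `{0}`, `{1}`, ... are positional template slots filled before the statement ever reaches JDBC, while `?` survives as a bound parameter; the batch builders emit one `?` per column. A self-contained sketch of these behaviors (the body of `buildBatchInsertSql` between the two hunks is collapsed in the diff, so its placeholder loop here is a reconstruction; table and column names are illustrative):

```java
import java.text.MessageFormat;
import java.util.LinkedHashSet;
import java.util.Set;

public class SqlBuilderSketch {
    static String buildSql(String sql, Object... args) {
        // {0}, {1}, ... are replaced now; "?" stays for PreparedStatement binding.
        return MessageFormat.format(sql, args);
    }

    static String buildBatchInsertSql(String tableName, Set<String> columnNames) {
        StringBuilder sb = new StringBuilder("insert into ");
        sb.append(tableName).append("(");
        columnNames.forEach(columnName -> sb.append(columnName).append(","));
        sb.delete(sb.length() - 1, sb.length());
        sb.append(") values(");
        for (int i = 0; i < columnNames.size(); i++) {
            sb.append("?,"); // reconstructed: one placeholder per column
        }
        sb.delete(sb.length() - 1, sb.length());
        sb.append(")");
        return sb.toString();
    }

    static String buildBatchUpdateSql(String tableName, Set<String> columnNames, String whereClauseName) {
        StringBuilder sb = new StringBuilder("update ");
        sb.append(tableName).append(" set ");
        columnNames.forEach(columnName -> sb.append(columnName).append("=?,"));
        sb.delete(sb.length() - 1, sb.length());
        sb.append(" where ").append(whereClauseName).append("=?");
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(buildSql("select * from {0} where {1} = ?", "node_mapping", "id"));
        Set<String> cols = new LinkedHashSet<>();
        cols.add("id");
        cols.add("address");
        System.out.println(buildBatchInsertSql("node_mapping", cols));
        System.out.println(buildBatchUpdateSql("node_mapping", cols, "id"));
        // → select * from node_mapping where id = ?
        // → insert into node_mapping(id,address) values(?,?)
        // → update node_mapping set id=?,address=? where id=?
    }
}
```

Dropping the `final` modifier is the whole signature change: `final` on a `static` method is redundant, since static methods cannot be overridden.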
apm-collector/apm-collector-ui/src/main/java/org/skywalking/apm/collector/ui/dao/CpuMetricH2DAO.java
View file @ 020ce1c3
@@ -18,11 +18,11 @@
 package org.skywalking.apm.collector.ui.dao;

+import com.google.gson.JsonArray;
 import java.sql.ResultSet;
 import java.sql.SQLException;
 import java.util.ArrayList;
 import java.util.List;
 import org.skywalking.apm.collector.client.h2.H2Client;
 import org.skywalking.apm.collector.client.h2.H2ClientException;
 import org.skywalking.apm.collector.core.util.Const;
...
@@ -33,20 +33,18 @@ import org.skywalking.apm.collector.storage.h2.dao.H2DAO;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;

-import com.google.gson.JsonArray;
-
 /**
  * @author peng-yongsheng, clevertension
  */
 public class CpuMetricH2DAO extends H2DAO implements ICpuMetricDAO {
     private final Logger logger = LoggerFactory.getLogger(InstanceH2DAO.class);
     private static final String GET_CPU_METRIC_SQL = "select * from {0} where {1} = ?";
     private static final String GET_CPU_METRICS_SQL = "select * from {0} where {1} in (";

     @Override
     public int getMetric(int instanceId, long timeBucket) {
         String id = timeBucket + Const.ID_SPLIT + instanceId;
         H2Client client = getClient();
-        String sql = SqlBuilder.buildSql(GET_CPU_METRIC_SQL, CpuMetricTable.TABLE, "id");
-        Object[] params = new Object[]{id};
+        String sql = SqlBuilder.buildSql(GET_CPU_METRIC_SQL, CpuMetricTable.TABLE, CpuMetricTable.COLUMN_ID);
+        Object[] params = new Object[] {id};
         try (ResultSet rs = client.executeQuery(sql, params)) {
             if (rs.next()) {
                 return rs.getInt(CpuMetricTable.COLUMN_USAGE_PERCENT);
...
@@ -59,7 +57,7 @@ public class CpuMetricH2DAO extends H2DAO implements ICpuMetricDAO {
     @Override
     public JsonArray getMetric(int instanceId, long startTimeBucket, long endTimeBucket) {
         H2Client client = getClient();
-        String sql = SqlBuilder.buildSql(GET_CPU_METRICS_SQL, CpuMetricTable.TABLE, "id");
+        String sql = SqlBuilder.buildSql(GET_CPU_METRIC_SQL, CpuMetricTable.TABLE, CpuMetricTable.COLUMN_ID);
         long timeBucket = startTimeBucket;
         List<String> idList = new ArrayList<>();
...
@@ -70,24 +68,20 @@ public class CpuMetricH2DAO extends H2DAO implements ICpuMetricDAO {
         }
         while (timeBucket <= endTimeBucket);

-        StringBuilder builder = new StringBuilder();
-        for (int i = 0; i < idList.size(); i++) {
-            builder.append("?,");
-        }
-        builder.delete(builder.length() - 1, builder.length());
-        builder.append(")");
-        sql = sql + builder;
-        Object[] params = idList.toArray(new String[0]);
         JsonArray metrics = new JsonArray();
-        try (ResultSet rs = client.executeQuery(sql, params)) {
-            while (rs.next()) {
-                double cpuUsed = rs.getDouble(CpuMetricTable.COLUMN_USAGE_PERCENT);
-                metrics.add((int)(cpuUsed * 100));
+        idList.forEach(id -> {
+            try (ResultSet rs = client.executeQuery(sql, new String[] {id})) {
+                if (rs.next()) {
+                    double cpuUsed = rs.getDouble(CpuMetricTable.COLUMN_USAGE_PERCENT);
+                    metrics.add((int)(cpuUsed * 100));
+                } else {
+                    metrics.add(0);
+                }
+            } catch (SQLException | H2ClientException e) {
+                logger.error(e.getMessage(), e);
             }
-        } catch (SQLException | H2ClientException e) {
-            logger.error(e.getMessage(), e);
-        }
+        });
         return metrics;
     }
 }
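The range-query rewrite above replaces one `in (?,?,...)` statement with one lookup per second-bucket id, appending `0` whenever a bucket has no row, so the returned series always has exactly one point per bucket. The trade-off (N queries instead of one, but a gapless series aligned to the time axis) can be sketched without an H2 instance by standing in a map for the metric table; the ids and values below are illustrative:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class ZeroFilledSeries {
    public static void main(String[] args) {
        // Stand-in for the cpu_metric table: id -> usage_percent.
        Map<String, Double> table = new HashMap<>();
        table.put("20171024120001_5", 0.42);
        table.put("20171024120003_5", 0.10); // bucket ...0002 has no row

        List<String> idList = List.of(
            "20171024120001_5", "20171024120002_5", "20171024120003_5");

        // One lookup per id; a missing bucket contributes 0 instead of
        // silently shrinking the series, as the "in (...)" query did.
        List<Integer> metrics = new ArrayList<>();
        idList.forEach(id -> {
            Double cpuUsed = table.get(id);
            metrics.add(cpuUsed != null ? (int)(cpuUsed * 100) : 0);
        });
        System.out.println(metrics); // [42, 0, 10]
    }
}
```

With the original `in (...)` form, the three-bucket window above would have produced only two values, and the UI chart could not tell which bucket was missing.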
apm-collector/apm-collector-ui/src/main/java/org/skywalking/apm/collector/ui/dao/GCMetricH2DAO.java
View file @ 020ce1c3
@@ -18,11 +18,12 @@
 package org.skywalking.apm.collector.ui.dao;

+import com.google.gson.JsonArray;
+import com.google.gson.JsonObject;
 import java.sql.ResultSet;
 import java.sql.SQLException;
 import java.util.ArrayList;
 import java.util.List;
 import org.skywalking.apm.collector.client.h2.H2Client;
 import org.skywalking.apm.collector.client.h2.H2ClientException;
 import org.skywalking.apm.collector.core.util.Const;
...
@@ -34,9 +35,6 @@ import org.skywalking.apm.network.proto.GCPhrase;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;

-import com.google.gson.JsonArray;
-import com.google.gson.JsonObject;
-
 /**
  * @author peng-yongsheng, clevertension
  */
...
@@ -44,12 +42,13 @@ public class GCMetricH2DAO extends H2DAO implements IGCMetricDAO {
     private final Logger logger = LoggerFactory.getLogger(GCMetricH2DAO.class);
     private static final String GET_GC_COUNT_SQL = "select {1}, sum({0}) as cnt, {1} from {2} where {3} = ? and {4} in (";
     private static final String GET_GC_METRIC_SQL = "select * from {0} where {1} = ?";
     private static final String GET_GC_METRICS_SQL = "select * from {0} where {1} in (";

     @Override
     public GCCount getGCCount(long[] timeBuckets, int instanceId) {
         GCCount gcCount = new GCCount();
         H2Client client = getClient();
         String sql = GET_GC_COUNT_SQL;
         StringBuilder builder = new StringBuilder();
         for (int i = 0; i < timeBuckets.length; i++) {
             builder.append("?,");
         }
...
@@ -57,7 +56,7 @@ public class GCMetricH2DAO extends H2DAO implements IGCMetricDAO {
         builder.append(")");
         sql = sql + builder + " group by {1}";
-        sql = SqlBuilder.buildSql(sql, GCMetricTable.COLUMN_COUNT, GCMetricTable.COLUMN_PHRASE,
-            GCMetricTable.TABLE, GCMetricTable.COLUMN_INSTANCE_ID, "id");
+        sql = SqlBuilder.buildSql(sql, GCMetricTable.COLUMN_COUNT, GCMetricTable.COLUMN_PHRASE,
+            GCMetricTable.TABLE, GCMetricTable.COLUMN_INSTANCE_ID, GCMetricTable.COLUMN_ID);
         Object[] params = new Object[timeBuckets.length + 1];
         for (int i = 0; i < timeBuckets.length; i++) {
             params[i + 1] = timeBuckets[i];
...
@@ -83,9 +82,9 @@ public class GCMetricH2DAO extends H2DAO implements IGCMetricDAO {
     @Override
     public JsonObject getMetric(int instanceId, long timeBucket) {
         JsonObject response = new JsonObject();
         H2Client client = getClient();
-        String sql = SqlBuilder.buildSql(GET_GC_METRIC_SQL, GCMetricTable.TABLE, "id");
+        String sql = SqlBuilder.buildSql(GET_GC_METRIC_SQL, GCMetricTable.TABLE, GCMetricTable.COLUMN_ID);
         String youngId = timeBucket + Const.ID_SPLIT + GCPhrase.NEW_VALUE + instanceId;
-        Object[] params = new Object[]{youngId};
+        Object[] params = new Object[] {youngId};
         try (ResultSet rs = client.executeQuery(sql, params)) {
             if (rs.next()) {
                 response.addProperty("ygc", rs.getInt(GCMetricTable.COLUMN_COUNT));
...
@@ -94,7 +93,7 @@ public class GCMetricH2DAO extends H2DAO implements IGCMetricDAO {
             logger.error(e.getMessage(), e);
         }
         String oldId = timeBucket + Const.ID_SPLIT + GCPhrase.OLD_VALUE + instanceId;
-        Object[] params1 = new Object[]{oldId};
+        Object[] params1 = new Object[] {oldId};
         try (ResultSet rs = client.executeQuery(sql, params1)) {
             if (rs.next()) {
                 response.addProperty("ogc", rs.getInt(GCMetricTable.COLUMN_COUNT));
...
@@ -109,67 +108,47 @@ public class GCMetricH2DAO extends H2DAO implements IGCMetricDAO {
     @Override
     public JsonObject getMetric(int instanceId, long startTimeBucket, long endTimeBucket) {
         JsonObject response = new JsonObject();
         H2Client client = getClient();
-        String sql = SqlBuilder.buildSql(GET_GC_METRICS_SQL, GCMetricTable.TABLE, "id");
+        String sql = SqlBuilder.buildSql(GET_GC_METRIC_SQL, GCMetricTable.TABLE, GCMetricTable.COLUMN_ID);
         long timeBucket = startTimeBucket;
-        List<String> idList = new ArrayList<>();
+        List<String> youngIdsList = new ArrayList<>();
         do {
             timeBucket = TimeBucketUtils.INSTANCE.addSecondForSecondTimeBucket(TimeBucketUtils.TimeBucketType.SECOND.name(), timeBucket, 1);
             String youngId = timeBucket + Const.ID_SPLIT + instanceId + Const.ID_SPLIT + GCPhrase.NEW_VALUE;
-            idList.add(youngId);
+            youngIdsList.add(youngId);
         }
         while (timeBucket <= endTimeBucket);

-        StringBuilder builder = new StringBuilder();
-        for (int i = 0; i < idList.size(); i++) {
-            builder.append("?,");
-        }
-        builder.delete(builder.length() - 1, builder.length());
-        builder.append(")");
-        sql = sql + builder;
-        Object[] params = idList.toArray(new String[0]);
         JsonArray youngArray = new JsonArray();
-        try (ResultSet rs = client.executeQuery(sql, params)) {
-            while (rs.next()) {
-                youngArray.add(rs.getInt(GCMetricTable.COLUMN_COUNT));
-            }
-            if (youngArray.size() == 0) {
-                youngArray.add(0);
-            }
-        } catch (SQLException | H2ClientException e) {
-            logger.error(e.getMessage(), e);
-        }
+        forEachRs(client, youngIdsList, sql, youngArray);
         response.add("ygc", youngArray);

-        List<String> idList1 = new ArrayList<>();
+        List<String> oldIdsList = new ArrayList<>();
         timeBucket = startTimeBucket;
         do {
             timeBucket = TimeBucketUtils.INSTANCE.addSecondForSecondTimeBucket(TimeBucketUtils.TimeBucketType.SECOND.name(), timeBucket, 1);
             String oldId = timeBucket + Const.ID_SPLIT + instanceId + Const.ID_SPLIT + GCPhrase.OLD_VALUE;
-            idList1.add(oldId);
+            oldIdsList.add(oldId);
         }
         while (timeBucket <= endTimeBucket);

-        String sql1 = SqlBuilder.buildSql(GET_GC_METRICS_SQL, GCMetricTable.TABLE, "id");
-        StringBuilder builder1 = new StringBuilder();
-        for (int i = 0; i < idList1.size(); i++) {
-            builder1.append("?,");
-        }
-        builder1.delete(builder1.length() - 1, builder1.length());
-        builder1.append(")");
-        sql1 = sql1 + builder1;
-        Object[] params1 = idList.toArray(new String[0]);
-        JsonArray oldArray = new JsonArray();
-        try (ResultSet rs = client.executeQuery(sql1, params1)) {
-            while (rs.next()) {
-                oldArray.add(rs.getInt(GCMetricTable.COLUMN_COUNT));
-            }
-            if (oldArray.size() == 0) {
-                oldArray.add(0);
-            }
-        } catch (SQLException | H2ClientException e) {
-            logger.error(e.getMessage(), e);
-        }
+        JsonArray oldArray = new JsonArray();
+        forEachRs(client, oldIdsList, sql, oldArray);
         response.add("ogc", oldArray);
         return response;
     }

+    private void forEachRs(H2Client client, List<String> idsList, String sql, JsonArray metricArray) {
+        idsList.forEach(id -> {
+            try (ResultSet rs = client.executeQuery(sql, new String[] {id})) {
+                if (rs.next()) {
+                    metricArray.add(rs.getInt(GCMetricTable.COLUMN_COUNT));
+                } else {
+                    metricArray.add(0);
+                }
+            } catch (SQLException | H2ClientException e) {
+                logger.error(e.getMessage(), e);
+            }
+        });
+    }
 }
apm-collector/apm-collector-ui/src/main/java/org/skywalking/apm/collector/ui/dao/InstPerformanceH2DAO.java
View file @ 020ce1c3
@@ -18,11 +18,11 @@
 package org.skywalking.apm.collector.ui.dao;

+import com.google.gson.JsonArray;
 import java.sql.ResultSet;
 import java.sql.SQLException;
 import java.util.ArrayList;
 import java.util.List;
 import org.skywalking.apm.collector.client.h2.H2Client;
 import org.skywalking.apm.collector.client.h2.H2ClientException;
 import org.skywalking.apm.collector.core.util.Const;
...
@@ -33,8 +33,6 @@ import org.skywalking.apm.collector.storage.h2.dao.H2DAO;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;

-import com.google.gson.JsonArray;
-
 /**
  * @author peng-yongsheng, clevertension
  */
...
@@ -42,7 +40,7 @@ public class InstPerformanceH2DAO extends H2DAO implements IInstPerformanceDAO {
     private final Logger logger = LoggerFactory.getLogger(InstPerformanceH2DAO.class);
     private static final String GET_INST_PERF_SQL = "select * from {0} where {1} = ? and {2} in (";
     private static final String GET_TPS_METRIC_SQL = "select * from {0} where {1} = ?";
     private static final String GET_TPS_METRICS_SQL = "select * from {0} where {1} in (";

     @Override
     public InstPerformance get(long[] timeBuckets, int instanceId) {
         H2Client client = getClient();
         logger.info("the inst performance inst id = {}", instanceId);
...
@@ -74,8 +72,8 @@ public class InstPerformanceH2DAO extends H2DAO implements IInstPerformanceDAO {
     @Override
     public int getTpsMetric(int instanceId, long timeBucket) {
         logger.info("getTpMetric instanceId = {}, startTimeBucket = {}", instanceId, timeBucket);
         H2Client client = getClient();
-        String sql = SqlBuilder.buildSql(GET_TPS_METRIC_SQL, InstPerformanceTable.TABLE, "id");
-        Object[] params = new Object[]{instanceId};
+        String sql = SqlBuilder.buildSql(GET_TPS_METRIC_SQL, InstPerformanceTable.TABLE, InstPerformanceTable.COLUMN_ID);
+        Object[] params = new Object[] {instanceId};
         try (ResultSet rs = client.executeQuery(sql, params)) {
             if (rs.next()) {
                 return rs.getInt(InstPerformanceTable.COLUMN_CALLS);
...
@@ -89,7 +87,7 @@ public class InstPerformanceH2DAO extends H2DAO implements IInstPerformanceDAO {
     @Override
     public JsonArray getTpsMetric(int instanceId, long startTimeBucket, long endTimeBucket) {
         logger.info("getTpsMetric instanceId = {}, startTimeBucket = {}, endTimeBucket = {}", instanceId, startTimeBucket, endTimeBucket);
         H2Client client = getClient();
-        String sql = SqlBuilder.buildSql(GET_TPS_METRICS_SQL, InstPerformanceTable.TABLE, "id");
+        String sql = SqlBuilder.buildSql(GET_TPS_METRIC_SQL, InstPerformanceTable.TABLE, InstPerformanceTable.COLUMN_ID);
         long timeBucket = startTimeBucket;
         List<String> idList = new ArrayList<>();
...
@@ -100,31 +98,26 @@ public class InstPerformanceH2DAO extends H2DAO implements IInstPerformanceDAO {
         }
         while (timeBucket <= endTimeBucket);

-        StringBuilder builder = new StringBuilder();
-        for (int i = 0; i < idList.size(); i++) {
-            builder.append("?,");
-        }
-        builder.delete(builder.length() - 1, builder.length());
-        builder.append(")");
-        sql = sql + builder;
-        Object[] params = idList.toArray(new String[0]);
         JsonArray metrics = new JsonArray();
-        try (ResultSet rs = client.executeQuery(sql, params)) {
-            while (rs.next()) {
-                int calls = rs.getInt(InstPerformanceTable.COLUMN_CALLS);
-                metrics.add(calls);
+        idList.forEach(id -> {
+            try (ResultSet rs = client.executeQuery(sql, new Object[] {id})) {
+                if (rs.next()) {
+                    int calls = rs.getInt(InstPerformanceTable.COLUMN_CALLS);
+                    metrics.add(calls);
+                } else {
+                    metrics.add(0);
+                }
+            } catch (SQLException | H2ClientException e) {
+                logger.error(e.getMessage(), e);
             }
-        } catch (SQLException | H2ClientException e) {
-            logger.error(e.getMessage(), e);
-        }
+        });
         return metrics;
     }

     @Override
     public int getRespTimeMetric(int instanceId, long timeBucket) {
         H2Client client = getClient();
-        String sql = SqlBuilder.buildSql(GET_TPS_METRIC_SQL, InstPerformanceTable.TABLE, "id");
-        Object[] params = new Object[]{instanceId};
+        String sql = SqlBuilder.buildSql(GET_TPS_METRIC_SQL, InstPerformanceTable.TABLE, InstPerformanceTable.COLUMN_ID);
+        Object[] params = new Object[] {instanceId};
         try (ResultSet rs = client.executeQuery(sql, params)) {
             if (rs.next()) {
                 int callTimes = rs.getInt(InstPerformanceTable.COLUMN_CALLS);
...
@@ -139,7 +132,7 @@ public class InstPerformanceH2DAO extends H2DAO implements IInstPerformanceDAO {
     @Override
     public JsonArray getRespTimeMetric(int instanceId, long startTimeBucket, long endTimeBucket) {
         H2Client client = getClient();
-        String sql = SqlBuilder.buildSql(GET_TPS_METRICS_SQL, InstPerformanceTable.TABLE, "id");
+        String sql = SqlBuilder.buildSql(GET_TPS_METRIC_SQL, InstPerformanceTable.TABLE, InstPerformanceTable.COLUMN_ID);
         long timeBucket = startTimeBucket;
         List<String> idList = new ArrayList<>();
...
@@ -150,25 +143,20 @@ public class InstPerformanceH2DAO extends H2DAO implements IInstPerformanceDAO {
         }
         while (timeBucket <= endTimeBucket);

-        StringBuilder builder = new StringBuilder();
-        for (int i = 0; i < idList.size(); i++) {
-            builder.append("?,");
-        }
-        builder.delete(builder.length() - 1, builder.length());
-        builder.append(")");
-        sql = sql + builder;
-        Object[] params = idList.toArray(new String[0]);
         JsonArray metrics = new JsonArray();
-        try (ResultSet rs = client.executeQuery(sql, params)) {
-            while (rs.next()) {
-                int callTimes = rs.getInt(InstPerformanceTable.COLUMN_CALLS);
-                int costTotal = rs.getInt(InstPerformanceTable.COLUMN_COST_TOTAL);
-                metrics.add(costTotal / callTimes);
+        idList.forEach(id -> {
+            try (ResultSet rs = client.executeQuery(sql, new Object[] {id})) {
+                if (rs.next()) {
+                    int callTimes = rs.getInt(InstPerformanceTable.COLUMN_CALLS);
+                    int costTotal = rs.getInt(InstPerformanceTable.COLUMN_COST_TOTAL);
+                    metrics.add(costTotal / callTimes);
+                } else {
+                    metrics.add(0);
+                }
+            } catch (SQLException | H2ClientException e) {
+                logger.error(e.getMessage(), e);
            }
-        } catch (SQLException | H2ClientException e) {
-            logger.error(e.getMessage(), e);
-        }
+        });
         return metrics;
     }
 }
apm-collector/apm-collector-ui/src/main/java/org/skywalking/apm/collector/ui/dao/InstanceH2DAO.java
View file @ 020ce1c3
@@ -44,7 +44,7 @@ public class InstanceH2DAO extends H2DAO implements IInstanceDAO {
     private static final String GET_INST_LAST_HEARTBEAT_TIME_SQL = "select {0} from {1} where {2} > ? and {3} = ? limit 1";
     private static final String GET_INSTANCE_SQL = "select * from {0} where {1} = ?";
     private static final String GET_INSTANCES_SQL = "select * from {0} where {1} = ? and {2} >= ?";
-    private static final String GET_APPLICATIONS_SQL = "select {3}, count({0}) as cnt from {1} where {2} >= ? and {2} <= ? group by {3} limit 100";
+    private static final String GET_APPLICATIONS_SQL = "select {3}, count({0}) as cnt from {1} where {2} >= ? group by {3} limit 100";

     @Override
     public Long lastHeartBeatTime() {
...
@@ -87,7 +87,7 @@ public class InstanceH2DAO extends H2DAO implements IInstanceDAO {
         JsonArray applications = new JsonArray();
         String sql = SqlBuilder.buildSql(GET_APPLICATIONS_SQL, InstanceTable.COLUMN_INSTANCE_ID, InstanceTable.TABLE, InstanceTable.COLUMN_HEARTBEAT_TIME, InstanceTable.COLUMN_APPLICATION_ID);
-        Object[] params = new Object[] {startTime, endTime};
+        Object[] params = new Object[] {startTime};
         try (ResultSet rs = client.executeQuery(sql, params)) {
             while (rs.next()) {
                 Integer applicationId = rs.getInt(InstanceTable.COLUMN_APPLICATION_ID);
...
apm-collector/apm-collector-ui/src/main/java/org/skywalking/apm/collector/ui/dao/MemoryMetricH2DAO.java (View file @ 020ce1c3)

@@ -18,11 +18,12 @@
 package org.skywalking.apm.collector.ui.dao;

+import com.google.gson.JsonArray;
+import com.google.gson.JsonObject;
 import java.sql.ResultSet;
 import java.sql.SQLException;
+import java.util.ArrayList;
+import java.util.List;
 import org.skywalking.apm.collector.client.h2.H2Client;
 import org.skywalking.apm.collector.client.h2.H2ClientException;
 import org.skywalking.apm.collector.core.util.Const;
...
@@ -33,21 +34,18 @@ import org.skywalking.apm.collector.storage.h2.dao.H2DAO;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
-import com.google.gson.JsonArray;
-import com.google.gson.JsonObject;

 /**
  * @author clevertension
  */
 public class MemoryMetricH2DAO extends H2DAO implements IMemoryMetricDAO {
     private final Logger logger = LoggerFactory.getLogger(InstanceH2DAO.class);
     private static final String GET_MEMORY_METRIC_SQL = "select * from {0} where {1} =?";
-    private static final String GET_MEMORY_METRICS_SQL = "select * from {0} where {1} in (";

     @Override
     public JsonObject getMetric(int instanceId, long timeBucket, boolean isHeap) {
         H2Client client = getClient();
         String id = timeBucket + Const.ID_SPLIT + instanceId + Const.ID_SPLIT + isHeap;
-        String sql = SqlBuilder.buildSql(GET_MEMORY_METRIC_SQL, MemoryMetricTable.TABLE, "id");
-        Object[] params = new Object[]{id};
+        String sql = SqlBuilder.buildSql(GET_MEMORY_METRIC_SQL, MemoryMetricTable.TABLE, MemoryMetricTable.COLUMN_ID);
+        Object[] params = new Object[] {id};
         JsonObject metric = new JsonObject();
         try (ResultSet rs = client.executeQuery(sql, params)) {
             if (rs.next()) {
...
@@ -67,7 +65,7 @@ public class MemoryMetricH2DAO extends H2DAO implements IMemoryMetricDAO {
     @Override
     public JsonObject getMetric(int instanceId, long startTimeBucket, long endTimeBucket, boolean isHeap) {
         H2Client client = getClient();
-        String sql = SqlBuilder.buildSql(GET_MEMORY_METRICS_SQL, MemoryMetricTable.TABLE, "id");
+        String sql = SqlBuilder.buildSql(GET_MEMORY_METRIC_SQL, MemoryMetricTable.TABLE, MemoryMetricTable.COLUMN_ID);
         List<String> idList = new ArrayList<>();
         long timeBucket = startTimeBucket;
         do {
...
@@ -77,30 +75,24 @@ public class MemoryMetricH2DAO extends H2DAO implements IMemoryMetricDAO {
         }
         while (timeBucket <= endTimeBucket);
-        StringBuilder builder = new StringBuilder();
-        for (int i = 0; i < idList.size(); i++) {
-            builder.append("?,");
-        }
-        builder.delete(builder.length() - 1, builder.length());
-        builder.append(")");
-        sql = sql + builder;
-        Object[] params = idList.toArray(new String[0]);
         JsonObject metric = new JsonObject();
         JsonArray usedMetric = new JsonArray();
-        try (ResultSet rs = client.executeQuery(sql, params)) {
-            while (rs.next()) {
-                metric.addProperty("max", rs.getLong(MemoryMetricTable.COLUMN_MAX));
-                metric.addProperty("init", rs.getLong(MemoryMetricTable.COLUMN_INIT));
-                usedMetric.add(rs.getLong(MemoryMetricTable.COLUMN_USED));
-            }
-            if (usedMetric.size() == 0) {
-                metric.addProperty("max", 0);
-                metric.addProperty("init", 0);
-                usedMetric.add(0);
-            }
-        } catch (SQLException | H2ClientException e) {
-            logger.error(e.getMessage(), e);
-        }
+        idList.forEach(id -> {
+            try (ResultSet rs = client.executeQuery(sql, new String[] {id})) {
+                if (rs.next()) {
+                    metric.addProperty("max", rs.getLong(MemoryMetricTable.COLUMN_MAX));
+                    metric.addProperty("init", rs.getLong(MemoryMetricTable.COLUMN_INIT));
+                    usedMetric.add(rs.getLong(MemoryMetricTable.COLUMN_USED));
+                } else {
+                    metric.addProperty("max", 0);
+                    metric.addProperty("init", 0);
+                    usedMetric.add(0);
+                }
+            } catch (SQLException | H2ClientException e) {
+                logger.error(e.getMessage(), e);
+            }
+        });
         metric.add("used", usedMetric);
         return metric;
...
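The range-query `getMetric` above first builds one id per time bucket (`timeBucket + Const.ID_SPLIT + instanceId + ...`) in a do/while loop before querying each id. A hedged sketch of that id assembly, assuming `ID_SPLIT` is `"_"` and substituting a naive `++` for the collector's own time-bucket increment helper:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative only: mirrors the do/while id construction shown in the diff.
// ID_SPLIT is an assumption standing in for Const.ID_SPLIT, and timeBucket++
// stands in for whatever bucket-advancing utility the collector really uses.
public class IdListSketch {
    static final String ID_SPLIT = "_";

    static List<String> buildIds(long startTimeBucket, long endTimeBucket, int instanceId, boolean isHeap) {
        List<String> idList = new ArrayList<>();
        long timeBucket = startTimeBucket;
        do {
            // same shape as the DAO: timeBucket + split + instanceId + split + isHeap
            idList.add(timeBucket + ID_SPLIT + instanceId + ID_SPLIT + isHeap);
            timeBucket++;  // assumption; real code advances to the next time bucket
        } while (timeBucket <= endTimeBucket);
        return idList;
    }

    public static void main(String[] args) {
        System.out.println(buildIds(201710240900L, 201710240902L, 3, true));
        // [201710240900_3_true, 201710240901_3_true, 201710240902_3_true]
    }
}
```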
apm-collector/apm-collector-ui/src/main/java/org/skywalking/apm/collector/ui/dao/MemoryPoolMetricH2DAO.java (View file @ 020ce1c3)

@@ -18,11 +18,12 @@
 package org.skywalking.apm.collector.ui.dao;

+import com.google.gson.JsonArray;
+import com.google.gson.JsonObject;
 import java.sql.ResultSet;
 import java.sql.SQLException;
+import java.util.ArrayList;
+import java.util.List;
 import org.skywalking.apm.collector.client.h2.H2Client;
 import org.skywalking.apm.collector.client.h2.H2ClientException;
 import org.skywalking.apm.collector.core.util.Const;
...
@@ -33,21 +34,18 @@ import org.skywalking.apm.collector.storage.h2.dao.H2DAO;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
-import com.google.gson.JsonArray;
-import com.google.gson.JsonObject;

 /**
  * @author clevertension
 */
 public class MemoryPoolMetricH2DAO extends H2DAO implements IMemoryPoolMetricDAO {
     private final Logger logger = LoggerFactory.getLogger(InstanceH2DAO.class);
-    private static final String GET_MEMORY_POOL_METRIC_SQL = "select * from {0} where {1} =?";
-    private static final String GET_MEMORY_POOL_METRICS_SQL = "select * from {0} where {1} in (";
+    private static final String GET_MEMORY_POOL_METRIC_SQL = "select * from {0} where {1} = ?";

     @Override
     public JsonObject getMetric(int instanceId, long timeBucket, int poolType) {
         H2Client client = getClient();
         String id = timeBucket + Const.ID_SPLIT + instanceId + Const.ID_SPLIT + poolType;
-        String sql = SqlBuilder.buildSql(GET_MEMORY_POOL_METRIC_SQL, MemoryPoolMetricTable.TABLE, "id");
-        Object[] params = new Object[]{id};
+        String sql = SqlBuilder.buildSql(GET_MEMORY_POOL_METRIC_SQL, MemoryPoolMetricTable.TABLE, MemoryPoolMetricTable.COLUMN_ID);
+        Object[] params = new Object[] {id};
         JsonObject metric = new JsonObject();
         try (ResultSet rs = client.executeQuery(sql, params)) {
             if (rs.next()) {
...
@@ -67,7 +65,7 @@ public class MemoryPoolMetricH2DAO extends H2DAO implements IMemoryPoolMetricDAO
     @Override
     public JsonObject getMetric(int instanceId, long startTimeBucket, long endTimeBucket, int poolType) {
         H2Client client = getClient();
-        String sql = SqlBuilder.buildSql(GET_MEMORY_POOL_METRICS_SQL, MemoryPoolMetricTable.TABLE, "id");
+        String sql = SqlBuilder.buildSql(GET_MEMORY_POOL_METRIC_SQL, MemoryPoolMetricTable.TABLE, MemoryPoolMetricTable.COLUMN_ID);
         List<String> idList = new ArrayList<>();
         long timeBucket = startTimeBucket;
         do {
...
@@ -77,30 +75,24 @@ public class MemoryPoolMetricH2DAO extends H2DAO implements IMemoryPoolMetricDAO
         }
         while (timeBucket <= endTimeBucket);
-        StringBuilder builder = new StringBuilder();
-        for (int i = 0; i < idList.size(); i++) {
-            builder.append("?,");
-        }
-        builder.delete(builder.length() - 1, builder.length());
-        builder.append(")");
-        sql = sql + builder;
-        Object[] params = idList.toArray(new String[0]);
         JsonObject metric = new JsonObject();
         JsonArray usedMetric = new JsonArray();
-        try (ResultSet rs = client.executeQuery(sql, params)) {
-            while (rs.next()) {
-                metric.addProperty("max", rs.getLong(MemoryPoolMetricTable.COLUMN_MAX));
-                metric.addProperty("init", rs.getLong(MemoryPoolMetricTable.COLUMN_INIT));
-                usedMetric.add(rs.getLong(MemoryPoolMetricTable.COLUMN_USED));
-            }
-            if (usedMetric.size() == 0) {
-                metric.addProperty("max", 0);
-                metric.addProperty("init", 0);
-                usedMetric.add(0);
-            }
-        } catch (SQLException | H2ClientException e) {
-            logger.error(e.getMessage(), e);
-        }
+        idList.forEach(id -> {
+            try (ResultSet rs = client.executeQuery(sql, new String[] {id})) {
+                if (rs.next()) {
+                    metric.addProperty("max", rs.getLong(MemoryPoolMetricTable.COLUMN_MAX));
+                    metric.addProperty("init", rs.getLong(MemoryPoolMetricTable.COLUMN_INIT));
+                    usedMetric.add(rs.getLong(MemoryPoolMetricTable.COLUMN_USED));
+                } else {
+                    metric.addProperty("max", 0);
+                    metric.addProperty("init", 0);
+                    usedMetric.add(0);
+                }
+            } catch (SQLException | H2ClientException e) {
+                logger.error(e.getMessage(), e);
+            }
+        });
         metric.add("used", usedMetric);
         return metric;
...
apm-collector/apm-collector-ui/src/main/java/org/skywalking/apm/collector/ui/dao/SegmentH2DAO.java (View file @ 020ce1c3)

@@ -40,7 +40,7 @@ public class SegmentH2DAO extends H2DAO implements ISegmentDAO {
     @Override
     public TraceSegmentObject load(String segmentId) {
         H2Client client = getClient();
-        String sql = SqlBuilder.buildSql(GET_SEGMENT_SQL, SegmentTable.COLUMN_DATA_BINARY, SegmentTable.TABLE, "id");
+        String sql = SqlBuilder.buildSql(GET_SEGMENT_SQL, SegmentTable.COLUMN_DATA_BINARY, SegmentTable.TABLE, SegmentTable.COLUMN_ID);
         Object[] params = new Object[] {segmentId};
         try (ResultSet rs = client.executeQuery(sql, params)) {
             if (rs.next()) {
...
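Across all of these DAOs the SQL is assembled by SqlBuilder.buildSql from templates with {0}, {1}, ... placeholders, and this commit consistently replaces the literal "id" argument with each table's COLUMN_ID constant. Assuming buildSql fills placeholders the way java.text.MessageFormat does (an assumption about the helper, not confirmed by this diff), a minimal sketch:

```java
import java.text.MessageFormat;

// Hedged stand-in for SqlBuilder.buildSql: substitutes {0}, {1}, ... with the
// given table and column names, leaving the JDBC "?" bind markers untouched.
public class SqlBuildSketch {
    static String buildSql(String template, Object... args) {
        return MessageFormat.format(template, args);
    }

    public static void main(String[] args) {
        // Same shape as SegmentH2DAO after the change: a column constant, not "id".
        String sql = buildSql("select * from {0} where {1} = ?", "segment", "id");
        System.out.println(sql);  // select * from segment where id = ?
    }
}
```

Passing the COLUMN_ID constant instead of a bare "id" string keeps the query in sync with the table definition if the column is ever renamed.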