Unverified commit 30ebe1ca, authored by 静夜思朝颜, committed by GitHub

provide thread stack analyzes (#4308)

* provide thread analyzes
Co-authored-by: 吴晟 Wu Sheng <wu.sheng@foxmail.com>
Parent: ef60cb27
......@@ -142,6 +142,11 @@ miss any newly-added dependency:
- Check the LICENSE's and NOTICE's of those dependencies, if they can be included in an ASF project, add them in the `apm-dist/release-docs/{LICENSE,NOTICE}` file.
- Add those dependencies' names to the `tools/dependencies/known-oap-backend-dependencies.txt` file (**alphabetical order**), the next run of `check-LICENSE.sh` should pass.
## Profile
The performance profile is an enhancement feature of the APM system. We use thread dumps to estimate method execution time, rather than adding many local spans. This way, the resource cost is much lower than using distributed tracing to locate slow methods, which makes the feature suitable for production environments. The following documents help developers understand the key parts of this feature:
- [Profile data report protocol](https://github.com/apache/skywalking-data-collect-protocol/tree/master/profile) is provided, like other trace and JVM data, through gRPC.
- [Thread dump merging mechanism](backend-profile.md) introduces the merging mechanism, which helps end users understand the profile report.
## For release
[Apache Release Guide](How-to-release.md) walks the committer team through making an official Apache release, to avoid
breaking any Apache rules. The Apache license allows everyone to redistribute the software, provided our licenses and NOTICE files are kept.
......
# Thread dump merging mechanism
The performance profile is an enhancement feature of the APM system. We use thread dumps to estimate method execution time, rather than adding many local spans. This way, the resource cost is much lower than using distributed tracing to locate slow methods, which makes the feature suitable for production environments. This document introduces how thread dumps are merged into the final report as one or more stack trees.
## Thread analyst
### Read data and transform
Read data from the database and convert it into the gRPC data structure.
```
st=>start: Start
e=>end: End
op1=>operation: Load data using paging
op2=>operation: Transform data using parallel
st(right)->op1(right)->op2
op2(right)->e
```
Copy the code and paste it into this [link](http://flowchart.js.org/) to generate the flow chart.
1. Use a stream to read the data page by page (50 records per page).
2. Convert the records into gRPC data structures using a parallel stream.
3. Merge them into a single list.
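The three steps above can be sketched as follows. This is a minimal, self-contained sketch, not the actual SkyWalking code: `queryPage`, `PAGE_SIZE`, and the plain `String` record type are illustrative stand-ins for the real storage DAO and the proto-to-`ProfileStack` conversion.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class PagedLoader {
    // 50 records per page, as described above
    static final int PAGE_SIZE = 50;

    // Simulated storage query; in the real code this is one database query per page.
    static List<String> queryPage(List<String> all, int page) {
        int from = page * PAGE_SIZE;
        if (from >= all.size()) {
            return Collections.emptyList();
        }
        return all.subList(from, Math.min(from + PAGE_SIZE, all.size()));
    }

    // Load page by page until an empty page is returned, transforming each
    // page's records in parallel (stand-in for the record-to-gRPC conversion).
    public static List<String> loadAll(List<String> all) {
        List<String> result = new ArrayList<>();
        for (int page = 0; ; page++) {
            List<String> records = queryPage(all, page);
            if (records.isEmpty()) {
                break;
            }
            result.addAll(records.parallelStream()
                    .map(String::toUpperCase)
                    .collect(Collectors.toList()));
        }
        return result;
    }

    public static void main(String[] args) {
        List<String> data = IntStream.range(0, 120)
                .mapToObj(i -> "rec" + i)
                .collect(Collectors.toList());
        System.out.println(loadAll(data).size()); // 120 records survive the round trip
    }
}
```

Note that `collect(Collectors.toList())` on a parallel stream still preserves encounter order, so the merged list keeps the records in page order.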
### Data analysis
Use the group-by and collector features of the Java parallel stream to group the database records by the first stack element,
and use the collector to aggregate the data, generating a multi-root tree.
```
st=>start: Start
e=>end: End
op1=>operation: Group by first stack element
sup=>operation: Generate empty stack tree
acc=>operation: Accumulator data to stack tree
com=>operation: Combine stack trees
fin=>operation: Calculate durations and build result
st(right)->op1->sup(right)->acc
acc(right)->com(right)->fin->e
```
Copy the code and paste it into this [link](http://flowchart.js.org/) to generate the flow chart.
- **Group by first stack element**: Group by the first (top-level) element of each stack, ensuring that all stacks in a group share the same root node.
- **Generate empty stack tree**: Generate multiple top-level empty trees in preparation for the following steps.
The reason for generating multiple top-level trees is that the original data can then be added in parallel without locking.
- **Accumulator data to stack tree**: Add every thread dump into the generated trees.
    1. Iterate through the elements in the thread dump and look for a child of the current parent element with the same code signature at the same stack depth.
    If none exists, add the element as a new child.
    2. Keep the dump sequence numbers and timestamps from the source in each node.
- **Combine stack trees**: Combine all tree structures into one, using the same rules as the `Accumulator` step.
    1. Traverse the tree nodes in LDR (in-order) fashion, using a `Stack` data structure to avoid recursive calls; each stack element represents a pair of nodes that need to be merged.
    2. Merging two nodes means merging their lists of child nodes. If two children have the same code signature and the same parent, save the dump sequences and timestamps into the target node; otherwise, add the child to the target node as a new child.
- **Calculate durations and build result**: Calculate the relevant statistics and generate the response.
    1. Use the same traversal logic as in the `Combine stack trees` step. Convert each node to a GraphQL data structure and put all nodes into a list for the subsequent duration calculations.
    2. Calculate each node's duration in parallel. For each node, sort the dump sequences; whenever two sequences are consecutive, add the time between their two timestamps to the node's duration.
    3. Calculate each node's self time in parallel. For each node, subtract the time consumed by all of its children from the node's own duration.
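The per-node duration rule in step 2 can be sketched as follows, assuming a hypothetical `Dump` type that holds just a sequence number and a dump timestamp (a stand-in for the real `ProfileStack`): sort by sequence, then sum the time span of each run of consecutive sequences.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

public class DurationSketch {
    // Hypothetical stand-in for ProfileStack: a sequence number plus dump timestamp.
    static class Dump {
        final int sequence;
        final long dumpTime;

        Dump(int sequence, long dumpTime) {
            this.sequence = sequence;
            this.dumpTime = dumpTime;
        }
    }

    static long duration(List<Dump> dumps) {
        if (dumps.size() <= 1) {
            return 0;
        }
        dumps.sort(Comparator.comparingInt(d -> d.sequence));
        long total = 0;
        Dump windowStart = dumps.get(0);
        Dump windowEnd = dumps.get(0);
        for (Dump current : dumps.subList(1, dumps.size())) {
            // a gap in sequence numbers closes the current time window
            if (windowEnd.sequence + 1 != current.sequence) {
                total += windowEnd.dumpTime - windowStart.dumpTime;
                windowStart = current;
            }
            windowEnd = current;
        }
        // close the last open window
        total += windowEnd.dumpTime - windowStart.dumpTime;
        return total;
    }

    public static void main(String[] args) {
        // sequences 0 and 1 are consecutive (10 ms apart); sequence 3 is isolated
        List<Dump> dumps = new ArrayList<>(Arrays.asList(
                new Dump(0, 0), new Dump(1, 10), new Dump(3, 30)));
        System.out.println(duration(dumps)); // prints 10
    }
}
```

An isolated sequence contributes nothing (a window of one dump has zero span), which is why a node seen in only one snapshot gets duration 0.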
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*
*/
package org.apache.skywalking.oap.server.core.profile.analyze;
import org.apache.skywalking.oap.server.core.query.entity.ProfileStackElement;
import java.util.Collections;
import java.util.EnumSet;
import java.util.Set;
import java.util.function.BiConsumer;
import java.util.function.BinaryOperator;
import java.util.function.Function;
import java.util.function.Supplier;
import java.util.stream.Collector;
/**
* Work for {@link ProfileAnalyzer} to analyze.
*/
public class ProfileAnalyzeCollector implements Collector<ProfileStack, ProfileStackNode, ProfileStackElement> {
@Override
public Supplier<ProfileStackNode> supplier() {
return ProfileStackNode::newNode;
}
@Override
public BiConsumer<ProfileStackNode, ProfileStack> accumulator() {
return ProfileStackNode::accumulateFrom;
}
@Override
public BinaryOperator<ProfileStackNode> combiner() {
return ProfileStackNode::combine;
}
@Override
public Function<ProfileStackNode, ProfileStackElement> finisher() {
return ProfileStackNode::buildAnalyzeResult;
}
@Override
public Set<Characteristics> characteristics() {
return Collections.unmodifiableSet(EnumSet.of(Characteristics.CONCURRENT, Characteristics.UNORDERED));
}
}
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*
*/
package org.apache.skywalking.oap.server.core.profile.analyze;
import org.apache.skywalking.oap.server.core.query.entity.ProfileAnalyzation;
import org.apache.skywalking.oap.server.core.query.entity.ProfileStackElement;
import org.apache.skywalking.oap.server.library.util.CollectionUtils;
import java.util.*;
import java.util.stream.Collectors;
/**
* Analyze {@link ProfileStack} data to {@link ProfileAnalyzation}
*
* See: https://github.com/apache/skywalking/blob/docs/en/guides/backend-profile.md#thread-analyst
*/
public class ProfileAnalyzer {
private static final ProfileAnalyzeCollector ANALYZE_COLLECTOR = new ProfileAnalyzeCollector();
/**
* analyze the given thread dump records and merge them into stack trees
*
* @param stacks thread dumps loaded from storage
* @return the analyzed result, or null when there is no data
*/
public static ProfileAnalyzation analyze(List<ProfileStack> stacks) {
if (CollectionUtils.isEmpty(stacks)) {
return null;
}
// using parallel stream
Map<String, ProfileStackElement> stackTrees = stacks.parallelStream()
// stack list cannot be empty
.filter(s -> CollectionUtils.isNotEmpty(s.getStack()))
.collect(Collectors.groupingBy(s -> s.getStack().get(0), ANALYZE_COLLECTOR));
ProfileAnalyzation analyzer = new ProfileAnalyzation();
analyzer.setStack(new ArrayList<>(stackTrees.values()));
return analyzer;
}
}
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*
*/
package org.apache.skywalking.oap.server.core.profile.analyze;
import com.google.common.primitives.Ints;
import com.google.protobuf.InvalidProtocolBufferException;
import lombok.Data;
import org.apache.skywalking.apm.network.language.profile.ThreadStack;
import org.apache.skywalking.oap.server.core.profile.ProfileTaskSegmentSnapshotRecord;
import java.util.List;
/**
* Deserialize from {@link ProfileTaskSegmentSnapshotRecord}
*/
@Data
public class ProfileStack implements Comparable<ProfileStack> {
private int sequence;
private long dumpTime;
private List<String> stack;
public static ProfileStack deserialize(ProfileTaskSegmentSnapshotRecord record) {
ThreadStack threadStack = null;
try {
threadStack = ThreadStack.parseFrom(record.getStackBinary());
} catch (InvalidProtocolBufferException e) {
throw new IllegalArgumentException("wrong stack data");
}
// build data
ProfileStack stack = new ProfileStack();
stack.sequence = record.getSequence();
stack.dumpTime = record.getDumpTime();
stack.stack = threadStack.getCodeSignaturesList();
return stack;
}
@Override
public int compareTo(ProfileStack o) {
return Ints.compare(sequence, o.sequence);
}
}
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*
*/
package org.apache.skywalking.oap.server.core.profile.analyze;
import com.google.common.base.Objects;
import org.apache.skywalking.oap.server.core.query.entity.ProfileStackElement;
import java.util.*;
import java.util.function.Consumer;
import java.util.stream.Collectors;
/**
* Work for profiling stacks, intermediate state of the {@link ProfileStackElement} and {@link ProfileStack}
*/
public class ProfileStackNode {
private String codeSignature;
private List<ProfileStack> detectedStacks;
private List<ProfileStackNode> children;
private int duration;
/**
* create a new empty, uninitialized node
*
* @return an empty node
*/
public static ProfileStackNode newNode() {
ProfileStackNode emptyNode = new ProfileStackNode();
emptyNode.detectedStacks = new LinkedList<>();
emptyNode.children = new ArrayList<>();
return emptyNode;
}
/**
* accumulate a {@link ProfileStack} into this tree; invoked on the tree root node
*
* @param stack the thread dump to add
*/
public void accumulateFrom(ProfileStack stack) {
List<String> stackList = stack.getStack();
if (codeSignature == null) {
codeSignature = stackList.get(0);
}
// add detected stack
this.detectedBy(stack);
// handle stack children
ProfileStackNode parent = this;
for (int depth = 1; depth < stackList.size(); depth++) {
String elementCodeSignature = stackList.get(depth);
// find same code signature children
ProfileStackNode childElement = null;
for (ProfileStackNode child : parent.children) {
if (Objects.equal(child.codeSignature, elementCodeSignature)) {
childElement = child;
break;
}
}
if (childElement != null) {
// add detected stack
childElement.detectedBy(stack);
parent = childElement;
} else {
// add children
ProfileStackNode childNode = newNode();
childNode.codeSignature = elementCodeSignature;
childNode.detectedBy(stack);
parent.children.add(childNode);
parent = childNode;
}
}
}
/**
* combine another {@link ProfileStackNode} tree into this one
*
* @param node the tree being merged
* @return this node, with the other tree merged in
*/
public ProfileStackNode combine(ProfileStackNode node) {
// combine this node
this.combineDetectedStacks(node);
// merge tree using LDR to traversal tree node
// using stack to avoid recursion
// merge key.children <- value.children
LinkedList<Pair<ProfileStackNode, ProfileStackNode>> stack = new LinkedList<>();
stack.add(new Pair<>(this, node));
while (!stack.isEmpty()) {
Pair<ProfileStackNode, ProfileStackNode> needCombineNode = stack.pop();
// merge value children to key
// add to stack if need to keep traversal
combineChildrenNodes(needCombineNode.key, needCombineNode.value, stack::add);
}
return this;
}
/**
* merge all children of the node being merged into the target node
*
* @param targetNode the node that receives the children
* @param beingMergedNode the node whose children are merged
* @param continueChildrenMerging callback used to keep traversing matched child pairs
*/
private void combineChildrenNodes(ProfileStackNode targetNode, ProfileStackNode beingMergedNode, Consumer<Pair<ProfileStackNode, ProfileStackNode>> continueChildrenMerging) {
if (beingMergedNode.children.isEmpty()) {
return;
}
for (ProfileStackNode childrenNode : targetNode.children) {
// find node from being merged node children
for (ListIterator<ProfileStackNode> it = beingMergedNode.children.listIterator(); it.hasNext();) {
ProfileStackNode node = it.next();
if (node != null && node.matches(childrenNode)) {
childrenNode.combineDetectedStacks(node);
continueChildrenMerging.accept(new Pair<>(childrenNode, node));
it.set(null);
break;
}
}
}
for (ProfileStackNode node : beingMergedNode.children) {
if (node != null) {
targetNode.children.add(node);
}
}
}
/**
* build the GraphQL result; durations and counts are calculated in parallel
*
* @return the root element of the analyzed tree
*/
public ProfileStackElement buildAnalyzeResult() {
// all nodes add to single-level list (such as flat), work for parallel calculating
LinkedList<Pair<ProfileStackElement, ProfileStackNode>> nodeMapping = new LinkedList<>();
ProfileStackElement root = buildElement();
nodeMapping.add(new Pair<>(root, this));
// same with combine logic
LinkedList<Pair<ProfileStackElement, ProfileStackNode>> stack = new LinkedList<>();
stack.add(new Pair<>(root, this));
while (!stack.isEmpty()) {
Pair<ProfileStackElement, ProfileStackNode> mergingPair = stack.pop();
ProfileStackElement respElement = mergingPair.key;
// generate children node and add to stack and all node mapping
respElement.setChildren(mergingPair.value.children.stream().map(c -> {
ProfileStackElement element = c.buildElement();
Pair<ProfileStackElement, ProfileStackNode> pair = new Pair<>(element, c);
stack.add(pair);
nodeMapping.add(pair);
return element;
}).collect(Collectors.toList()));
}
// calculate durations
nodeMapping.parallelStream().forEach(t -> t.value.calculateDuration(t.key));
nodeMapping.parallelStream().forEach(t -> t.value.calculateDurationExcludeChild(t.key));
return root;
}
private void detectedBy(ProfileStack stack) {
this.detectedStacks.add(stack);
}
private void combineDetectedStacks(ProfileStackNode node) {
this.detectedStacks.addAll(node.detectedStacks);
}
private ProfileStackElement buildElement() {
ProfileStackElement element = new ProfileStackElement();
element.setCodeSignature(this.codeSignature);
element.setChildren(new LinkedList<>());
element.setCount(this.detectedStacks.size());
return element;
}
/**
* calculate duration to {@link ProfileStackElement#getDuration()}
*/
private void calculateDuration(ProfileStackElement element) {
if (this.detectedStacks.size() <= 1) {
element.setDuration(0);
return;
}
Collections.sort(this.detectedStacks);
// calculate time windows duration
ProfileStack currentTimeWindowStartStack = detectedStacks.get(0);
ProfileStack currentTimeWindowEndStack = detectedStacks.get(0);
long duration = 0;
for (ListIterator<ProfileStack> it = detectedStacks.listIterator(1); it.hasNext(); ) {
ProfileStack currentStack = it.next();
// when the sequence is not continuous, close the current time window and open a new one
if (currentTimeWindowEndStack.getSequence() + 1 != currentStack.getSequence()) {
duration += currentTimeWindowEndStack.getDumpTime() - currentTimeWindowStartStack.getDumpTime();
currentTimeWindowStartStack = currentStack;
}
currentTimeWindowEndStack = currentStack;
}
// close the last time window
duration += currentTimeWindowEndStack.getDumpTime() - currentTimeWindowStartStack.getDumpTime();
this.duration = Math.toIntExact(duration);
element.setDuration(this.duration);
}
/**
* calculate duration to {@link ProfileStackElement#getDurationChildExcluded()}, depends on {@link #calculateDuration(ProfileStackElement)}
*
* @param element the element to fill with the child-excluded duration
*/
private void calculateDurationExcludeChild(ProfileStackElement element) {
element.setDurationChildExcluded(element.getDuration() - children.stream().mapToInt(t -> t.duration).sum());
}
private boolean matches(ProfileStackNode node) {
return Objects.equal(this.codeSignature, node.codeSignature);
}
private static class Pair<K, V> {
private final K key;
private final V value;
public Pair(K key, V value) {
this.key = key;
this.value = value;
}
}
}
......@@ -43,6 +43,6 @@ public class ProfileStackElement {
private int count;
// children of this stack code sign
private List<ProfileStackElement> childs;
private List<ProfileStackElement> children;
}
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*
*/
package org.apache.skywalking.oap.server.core.profile;
import org.apache.skywalking.oap.server.core.profile.bean.ProfileStackAnalyze;
import org.apache.skywalking.oap.server.core.profile.bean.ProfileStackAnalyzeHolder;
import org.junit.Test;
import org.yaml.snakeyaml.Yaml;
import java.io.InputStream;
public class ProfileAnalyzerTest {
@Test
public void testAnalyze() {
ProfileStackAnalyzeHolder holder = loadYaml("thread-snapshot.yml", ProfileStackAnalyzeHolder.class);
for (ProfileStackAnalyze analyze : holder.getList()) {
analyze.analyzeAndAssert();
}
}
private <T> T loadYaml(String file, Class<T> cls) {
InputStream expectedInputStream = Thread.currentThread().getContextClassLoader().getResourceAsStream(file);
return new Yaml().loadAs(expectedInputStream, cls);
}
}
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*
*/
package org.apache.skywalking.oap.server.core.profile.bean;
import lombok.Data;
import org.apache.skywalking.oap.server.core.profile.analyze.ProfileAnalyzer;
import org.apache.skywalking.oap.server.core.profile.analyze.ProfileStack;
import org.apache.skywalking.oap.server.core.query.entity.ProfileAnalyzation;
import java.util.List;
import static org.junit.Assert.assertEquals;
@Data
public class ProfileStackAnalyze {
private ProfileStackData data;
private List<ProfileStackElementMatcher> expected;
public void analyzeAndAssert() {
List<ProfileStack> stacks = data.transform();
ProfileAnalyzation analyze = ProfileAnalyzer.analyze(stacks);
assertEquals(analyze.getStack().size(), expected.size());
for (int i = 0; i < analyze.getStack().size(); i++) {
expected.get(i).verify(analyze.getStack().get(i));
}
}
}
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*
*/
package org.apache.skywalking.oap.server.core.profile.bean;
import lombok.Data;
import java.util.List;
@Data
public class ProfileStackAnalyzeHolder {
private List<ProfileStackAnalyze> list;
}
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*
*/
package org.apache.skywalking.oap.server.core.profile.bean;
import com.google.common.base.Splitter;
import lombok.Data;
import org.apache.skywalking.oap.server.core.profile.analyze.ProfileStack;
import java.util.ArrayList;
import java.util.List;
@Data
public class ProfileStackData {
private int limit;
private List<String> snapshots;
public List<ProfileStack> transform() {
ArrayList<ProfileStack> result = new ArrayList<>(snapshots.size());
for (int i = 0; i < snapshots.size(); i++) {
ProfileStack stack = new ProfileStack();
stack.setSequence(i);
stack.setDumpTime(i * limit);
stack.setStack(Splitter.on("-").splitToList(snapshots.get(i)));
result.add(stack);
}
return result;
}
}
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*
*/
package org.apache.skywalking.oap.server.core.profile.bean;
import lombok.Data;
import org.apache.skywalking.oap.server.core.query.entity.ProfileStackElement;
import org.apache.skywalking.oap.server.library.util.CollectionUtils;
import org.junit.Assert;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import static org.junit.Assert.*;
@Data
public class ProfileStackElementMatcher {
private static final Pattern DURATION_PATTERN = Pattern.compile("(\\d+)\\:(\\d+)");
private String code;
private String duration;
private int count;
private List<ProfileStackElementMatcher> children;
public void verify(ProfileStackElement element) {
// analyze duration
Matcher durationInfo = DURATION_PATTERN.matcher(duration);
Assert.assertTrue("duration field pattern not match", durationInfo.find());
int duration = Integer.parseInt(durationInfo.group(1));
int durationExcludeChild = Integer.parseInt(durationInfo.group(2));
// assert
assertEquals(code, element.getCodeSignature());
assertEquals(duration, element.getDuration());
assertEquals(durationExcludeChild, element.getDurationChildExcluded());
assertEquals(count, element.getCount());
if (CollectionUtils.isEmpty(children)) {
children = Collections.emptyList();
}
if (CollectionUtils.isEmpty(element.getChildren())) {
element.setChildren(Collections.emptyList());
}
assertEquals(children.size(), element.getChildren().size());
// children code signature not sorted, need sort it, then verify
Collections.sort(children, Comparator.comparing(c -> c.code));
Collections.sort(element.getChildren(), Comparator.comparing(c -> c.getCodeSignature()));
for (int i = 0; i < children.size(); i++) {
children.get(i).verify(element.getChildren().get(i));
}
}
}
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# verify: data used to verify the stack analysis
# limit: dump interval of each snapshot; timestamps are generated as index * limit
# snapshots: each entry is one snapshot; stack elements are separated by "-"
# expected: the expected analysis result
# duration: follows the pattern: duration:durationExcludeChild
# children: all child nodes
list:
# case 1
- data:
limit: 10
snapshots:
- A-B-C
- A-B
- A-B-C-D
expected:
- code: A
count: 3
duration: 20:0
children:
- code: B
count: 3
duration: 20:20
children:
- code: C
count: 2
duration: 0:0
children:
- code: D
count: 1
duration: 0:0
# case 2
- data:
limit: 10
snapshots:
- A-B-C
- B-C-D
expected:
- code: A
count: 1
duration: 0:0
children:
- code: B
count: 1
duration: 0:0
children:
- code: C
count: 1
duration: 0:0
- code: B
count: 1
duration: 0:0
children:
- code: C
count: 1
duration: 0:0
children:
- code: D
count: 1
duration: 0:0
# case 3
- data:
limit: 10
snapshots:
- A-B-C-D
- A-B
- A-B-C
- A-B
- A-B-C-D
expected:
- code: A
count: 5
duration: 40:0
children:
- code: B
count: 5
duration: 40:40
children:
- code: C
count: 3
duration: 0:0
children:
- code: D
count: 2
duration: 0:0
# case 4:
- data:
limit: 10
snapshots:
- A-B-C
- A-B-C-A
- A-C-A
- A-B-C-B
expected:
- code: A
count: 4
duration: 30:20
children:
- code: B
count: 3
duration: 10:0
children:
- code: C
count: 3
duration: 10:10
children:
- code: A
count: 1
duration: 0:0
- code: B
count: 1
duration: 0:0
- code: C
count: 1
duration: 0:0
children:
- code: A
count: 1
duration: 0:0
# case 5:
- data:
limit: 10
snapshots:
- A-B-C
- A-B-B-C
- A-B-B-B
- A-C-B
expected:
- code: A
count: 4
duration: 30:10
children:
- code: B
count: 3
duration: 20:10
children:
- code: C
count: 1
duration: 0:0
- code: B
count: 2
duration: 10:10
children:
- code: C
count: 1
duration: 0:0
- code: B
count: 1
duration: 0:0
- code: C
count: 1
duration: 0:0
children:
- code: B
count: 1
duration: 0:0
# case 6:
- data:
limit: 10
snapshots:
- A-B-C
- A-B
- D-E
expected:
- code: A
count: 2
duration: 10:0
children:
- code: B
count: 2
duration: 10:10
children:
- code: C
count: 1
duration: 0:0
- code: D
count: 1
duration: 0:0
children:
- code: E
count: 1
duration: 0:0
Subproject commit 138a0573cfcc5a44a99d3063b43d314faee8654a
Subproject commit f4e314de312a5bc053c6dbd6a07b561885b33888