Commit dba2946f authored by szape, committed by mbalassi

[FLINK-2243] [storm-compat] Demonstrating finite Storm spout functionality on exclamation example

-minor renaming
-improving JavaDocs

Closes #853
Parent 6d9eeb55
......@@ -167,6 +167,57 @@ The input type is `Tuple1<String>` and `Fields("sentence")` specify that `input.
See [BoltTokenizerWordCountPojo](https://github.com/apache/flink/tree/master/flink-contrib/flink-storm-compatibility/flink-storm-compatibility-examples/src/main/java/org/apache/flink/stormcompatibility/wordcount/BoltTokenizerWordCountPojo.java) and [BoltTokenizerWordCountWithNames](https://github.com/apache/flink/tree/master/flink-contrib/flink-storm-compatibility/flink-storm-compatibility-examples/src/main/java/org/apache/flink/stormcompatibility/wordcount/BoltTokenizerWordCountWithNames.java) for examples.
# Flink Extensions
## Finite Storm Spouts
In Flink streaming, sources can be finite, i.e., they emit a finite number of records and stop after emitting the last record. Storm spouts, on the other hand, always emit infinite streams.
The bridge between the two approaches is the `FiniteStormSpout` interface which, in addition to `IRichSpout`, contains a `reachedEnd()` method where the user can specify a stopping condition.
The user can create a finite Storm spout by implementing this interface instead of `IRichSpout` and implementing the additional `reachedEnd()` method.
When used as part of a Flink topology, a `FiniteStormSpout` should be wrapped in a `FiniteStormSpoutWrapper` class.
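The interface essentially just adds an end-of-stream check on top of `IRichSpout`. A minimal sketch of the contract (the actual interface resides in the `org.apache.flink.stormcompatibility.wrappers` package):
<div class="codetabs" markdown="1">
<div data-lang="java" markdown="1">
~~~java
// Sketch of the contract: a regular Storm spout plus an end-of-stream check.
public interface FiniteStormSpout extends IRichSpout {

	/**
	 * When this method returns true, the wrapping Flink source stops calling
	 * nextTuple() and terminates.
	 */
	public boolean reachedEnd();
}
~~~
</div>
</div>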
Although finite Storm spouts are not necessary to embed Storm spouts into a Flink streaming program or to submit a whole Storm topology to Flink, there are cases where they may come in handy:
* to make a native Storm spout behave the same way as a finite Flink source, with minimal modifications
* to process a stream only for a certain period of time, after which the spout stops automatically
* to read a file into a stream
* for testing purposes
A `FiniteStormSpout` can still be used as a normal, infinite Storm spout by changing its wrapper class to `StormSpoutWrapper` in the Flink topology (see the sketch at the end of this section).
An example of a finite Storm spout that emits records for 10 seconds only:
<div class="codetabs" markdown="1">
<div data-lang="java" markdown="1">
~~~java
public class TimedFiniteStormSpout extends AbstractStormSpout implements FiniteStormSpout {
	[...]
	private long starttime = System.currentTimeMillis();

	@Override
	public boolean reachedEnd() {
		return System.currentTimeMillis() - starttime > 10000L;
	}
	[...]
}
~~~
</div>
</div>
Using a `FiniteStormSpout` in a Flink topology:
<div class="codetabs" markdown="1">
<div data-lang="java" markdown="1">
~~~java
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
DataStream<String> rawInput = env.addSource(
		new FiniteStormSpoutWrapper<String>(new TimedFiniteStormSpout(), true),
		TypeExtractor.getForClass(String.class));
// process data stream
[...]
~~~
</div>
</div>
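If the same spout should run as a normal, infinite source instead, only the wrapper changes. A minimal sketch, assuming `StormSpoutWrapper` provides a constructor that takes just the spout:
<div class="codetabs" markdown="1">
<div data-lang="java" markdown="1">
~~~java
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

// Only the wrapper changes; reachedEnd() is ignored and the spout runs indefinitely.
// (Sketch: assumes a StormSpoutWrapper constructor taking just the spout.)
DataStream<String> infiniteInput = env.addSource(
		new StormSpoutWrapper<String>(new TimedFiniteStormSpout()),
		TypeExtractor.getForClass(String.class));
~~~
</div>
</div>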
# Storm Compatibility Examples
You can find more examples in the Maven module `flink-storm-compatibility-examples`.
......
......@@ -20,15 +20,32 @@ package org.apache.flink.stormcompatibility.excamation;
import org.apache.flink.examples.java.wordcount.util.WordCountData;
import org.apache.flink.stormcompatibility.api.FlinkTopologyBuilder;
import org.apache.flink.stormcompatibility.excamation.stormoperators.ExclamationBolt;
import org.apache.flink.stormcompatibility.util.FiniteStormFileSpout;
import org.apache.flink.stormcompatibility.util.FiniteStormInMemorySpout;
import org.apache.flink.stormcompatibility.util.OutputFormatter;
import org.apache.flink.stormcompatibility.util.RawOutputFormatter;
import org.apache.flink.stormcompatibility.util.SimpleOutputFormatter;
import org.apache.flink.stormcompatibility.util.StormBoltFileSink;
import org.apache.flink.stormcompatibility.util.StormBoltPrintSink;
import org.apache.flink.stormcompatibility.util.StormFileSpout;
import org.apache.flink.stormcompatibility.util.StormInMemorySpout;
/**
* This is a basic example of a Storm topology.
* Implements the "Exclamation" program that attaches five exclamation mark to every line of a text
* files in a streaming fashion. The program is constructed as a regular {@link StormTopology}.
* <p/>
* <p/>
* The input is a plain text file with lines separated by newline characters.
* <p/>
* <p/>
* Usage: <code>StormExclamation[Local|RemoteByClient|RemoteBySubmitter] &lt;text path&gt;
* &lt;result path&gt;</code><br/>
* If no parameters are provided, the program is run with default data from
* {@link WordCountData}.
* <p/>
* <p/>
* This example shows how to:
* <ul>
* <li>construct a regular Storm topology as Flink program</li>
* <li>make use of the FiniteStormSpout interface</li>
* </ul>
*/
public class ExclamationTopology {
......@@ -36,7 +53,7 @@ public class ExclamationTopology {
public final static String firstBoltId = "exclamation1";
public final static String secondBoltId = "exclamation2";
public final static String sinkId = "sink";
private final static OutputFormatter formatter = new RawOutputFormatter();
private final static OutputFormatter formatter = new SimpleOutputFormatter();
public static FlinkTopologyBuilder buildTopology() {
final FlinkTopologyBuilder builder = new FlinkTopologyBuilder();
......@@ -46,9 +63,9 @@ public class ExclamationTopology {
// read the text file from given input path
final String[] tokens = textPath.split(":");
final String inputFile = tokens[tokens.length - 1];
builder.setSpout(spoutId, new StormFileSpout(inputFile));
builder.setSpout(spoutId, new FiniteStormFileSpout(inputFile));
} else {
builder.setSpout(spoutId, new StormInMemorySpout(WordCountData.WORDS));
builder.setSpout(spoutId, new FiniteStormInMemorySpout(WordCountData.WORDS));
}
builder.setBolt(firstBoltId, new ExclamationBolt(), 3).shuffleGrouping(spoutId);
......@@ -59,9 +76,11 @@ public class ExclamationTopology {
// read the text file from given input path
final String[] tokens = outputPath.split(":");
final String outputFile = tokens[tokens.length - 1];
builder.setBolt(sinkId, new StormBoltFileSink(outputFile, formatter)).shuffleGrouping(secondBoltId);
builder.setBolt(sinkId, new StormBoltFileSink(outputFile, formatter))
.shuffleGrouping(secondBoltId);
} else {
builder.setBolt(sinkId, new StormBoltPrintSink(formatter), 4).shuffleGrouping(secondBoltId);
builder.setBolt(sinkId, new StormBoltPrintSink(formatter), 4)
.shuffleGrouping(secondBoltId);
}
return builder;
......@@ -84,13 +103,17 @@ public class ExclamationTopology {
textPath = args[0];
outputPath = args[1];
} else {
System.err.println("Usage: StormExclamation* <text path> <result path>");
System.err.println(
"Usage: StormExclamation[Local|RemoteByClient|RemoteBySubmitter] <text " +
"path> <result path>");
return false;
}
} else {
System.out.println("Executing StormExclamation* example with built-in default data");
System.out.println("Executing StormExclamation example with built-in default data");
System.out.println(" Provide parameters to read input data from a file");
System.out.println(" Usage: StormExclamation* <text path> <result path>");
System.out.println(
" Usage: StormExclamation[Local|RemoteByClient|RemoteBySubmitter] <text path>" +
" <result path>");
}
return true;
......
......@@ -26,7 +26,25 @@ import org.apache.flink.stormcompatibility.wrappers.StormBoltWrapper;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
public class StormBoltExclamation {
/**
* Implements the "Exclamation" program that attaches five exclamation mark to every line of a text
* files in a streaming fashion. The program is constructed as a regular {@link StormTopology}.
* <p/>
* <p/>
* The input is a plain text file with lines separated by newline characters.
* <p/>
* <p/>
* Usage: <code>ExclamationWithStormBolt &lt;text path&gt; &lt;result path&gt;</code><br/>
* If no parameters are provided, the program is run with default data from
* {@link WordCountData}.
* <p/>
* <p/>
* This example shows how to:
* <ul>
* <li>use a Storm bolt within a Flink Streaming program</li>
* </ul>
*/
public class ExclamationWithStormBolt {
// *************************************************************************
// PROGRAM
......@@ -90,13 +108,13 @@ public class StormBoltExclamation {
textPath = args[0];
outputPath = args[1];
} else {
System.err.println("Usage: StormBoltExclamation <text path> <result path>");
System.err.println("Usage: ExclamationWithStormBolt <text path> <result path>");
return false;
}
} else {
System.out.println("Executing StormBoltExclamation example with built-in default data");
System.out.println("Executing ExclamationWithStormBolt example with built-in default data");
System.out.println(" Provide parameters to read input data from a file");
System.out.println(" Usage: StormBoltExclamation <text path> <result path>");
System.out.println(" Usage: ExclamationWithStormBolt <text path> <result path>");
}
return true;
}
......
......@@ -21,13 +21,32 @@ package org.apache.flink.stormcompatibility.excamation;
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.java.typeutils.TypeExtractor;
import org.apache.flink.examples.java.wordcount.util.WordCountData;
import org.apache.flink.stormcompatibility.util.StormFileSpout;
import org.apache.flink.stormcompatibility.util.StormInMemorySpout;
import org.apache.flink.stormcompatibility.wrappers.StormFiniteSpoutWrapper;
import org.apache.flink.stormcompatibility.util.FiniteStormFileSpout;
import org.apache.flink.stormcompatibility.util.FiniteStormInMemorySpout;
import org.apache.flink.stormcompatibility.wrappers.FiniteStormSpoutWrapper;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
public class StormSpoutExclamation {
/**
* Implements the "Exclamation" program that attaches five exclamation mark to every line of a text
* files in a streaming fashion. The program is constructed as a regular {@link StormTopology}.
* <p/>
* <p/>
* The input is a plain text file with lines separated by newline characters.
* <p/>
* <p/>
* Usage: <code>ExclamationWithStormSpout &lt;text path&gt; &lt;result path&gt;</code><br/>
* If no parameters are provided, the program is run with default data from
* {@link WordCountData}.
* <p/>
* <p/>
* This example shows how to:
* <ul>
* <li>use a Storm spout within a Flink Streaming program</li>
* <li>make use of the FiniteStormSpout interface</li>
* </ul>
*/
public class ExclamationWithStormSpout {
// *************************************************************************
// PROGRAM
......@@ -89,13 +108,14 @@ public class StormSpoutExclamation {
textPath = args[0];
outputPath = args[1];
} else {
System.err.println("Usage: StormSpoutExclamation <text path> <result path>");
System.err.println("Usage: ExclamationWithStormSpout <text path> <result path>");
return false;
}
} else {
System.out.println("Executing StormSpoutExclamation example with built-in default data");
System.out.println("Executing ExclamationWithStormSpout example with built-in default " +
"data");
System.out.println(" Provide parameters to read input data from a file");
System.out.println(" Usage: StormSpoutExclamation <text path> <result path>");
System.out.println(" Usage: ExclamationWithStormSpout <text path> <result path>");
}
return true;
}
......@@ -106,12 +126,14 @@ public class StormSpoutExclamation {
final String[] tokens = textPath.split(":");
final String localFile = tokens[tokens.length - 1];
return env.addSource(
new StormFiniteSpoutWrapper<String>(new StormFileSpout(localFile), true),
new FiniteStormSpoutWrapper<String>(new FiniteStormFileSpout(localFile), true),
TypeExtractor.getForClass(String.class)).setParallelism(1);
}
return env.addSource(new StormFiniteSpoutWrapper<String>(new StormInMemorySpout(WordCountData.WORDS), true),
TypeExtractor.getForClass(String.class));
return env.addSource(
new FiniteStormSpoutWrapper<String>(
new FiniteStormInMemorySpout(WordCountData.WORDS), true),
TypeExtractor.getForClass(String.class)).setParallelism(1);
}
......
......@@ -21,6 +21,27 @@ import backtype.storm.utils.Utils;
import org.apache.flink.stormcompatibility.api.FlinkLocalCluster;
import org.apache.flink.stormcompatibility.api.FlinkTopologyBuilder;
/**
* Implements the "Exclamation" program that attaches five exclamation mark to every line of a text
* files in a streaming fashion. The program is constructed as a regular {@link StormTopology} and
* submitted to Flink for execution in the same way as to a Storm {@link LocalCluster}.
* <p/>
* This example shows how to run the program directly within Java, thus it cannot be used to
* submit a {@link StormTopology} via Flink command line clients (i.e., bin/flink).
* <p/>
* <p/>
* The input is a plain text file with lines separated by newline characters.
* <p/>
* <p/>
* Usage: <code>StormExclamationLocal &lt;text path&gt; &lt;result path&gt;</code><br/>
* If no parameters are provided, the program is run with default data from {@link WordCountData}.
* <p/>
* <p/>
* This example shows how to:
* <ul>
* <li>run a regular Storm program locally on Flink</li>
* </ul>
*/
public class StormExclamationLocal {
public final static String topologyId = "Streaming Exclamation";
......@@ -43,10 +64,6 @@ public class StormExclamationLocal {
cluster.submitTopology(topologyId, null, builder.createTopology());
Utils.sleep(10 * 1000);
// TODO kill does not do anything so far
cluster.killTopology(topologyId);
cluster.shutdown();
}
}
......@@ -25,6 +25,28 @@ import backtype.storm.utils.Utils;
import org.apache.flink.stormcompatibility.api.FlinkClient;
import org.apache.flink.stormcompatibility.api.FlinkTopologyBuilder;
/**
* Implements the "Exclamation" program that attaches five exclamation mark to every line of a text
* files in a streaming fashion. The program is constructed as a regular {@link StormTopology} and
* submitted to Flink for execution in the same way as to a Storm cluster similar to
* {@link NimbusClient}. The Flink cluster can be local or remote.
* <p/>
* This example shows how to submit the program via Java, thus it cannot be used to submit a
* {@link StormTopology} via Flink command line clients (i.e., bin/flink).
* <p/>
* <p/>
* The input is a plain text file with lines separated by newline characters.
* <p/>
* <p/>
* Usage: <code>StormExclamationRemoteByClient &lt;text path&gt; &lt;result path&gt;</code><br/>
* If no parameters are provided, the program is run with default data from {@link WordCountData}.
* <p/>
* <p/>
* This example shows how to:
* <ul>
* <li>submit a regular Storm program to a local or remote Flink cluster.</li>
* </ul>
*/
public class StormExclamationRemoteByClient {
public final static String topologyId = "Streaming Exclamation";
......
......@@ -22,6 +22,27 @@ import org.apache.flink.stormcompatibility.api.FlinkClient;
import org.apache.flink.stormcompatibility.api.FlinkSubmitter;
import org.apache.flink.stormcompatibility.api.FlinkTopologyBuilder;
/**
* Implements the "Exclamation" program that attaches five exclamation mark to every line of a text
* files in a streaming fashion. The program is constructed as a regular {@link StormTopology} and
* submitted to Flink for execution in the same way as to a Storm cluster similar to
* {@link StormSubmitter}. The Flink cluster can be local or remote.
* <p/>
* This example shows how to submit the program via Java as well as Flink's command line client (i.e., bin/flink).
* <p/>
* <p/>
* The input is a plain text file with lines separated by newline characters.
* <p/>
* <p/>
* Usage: <code>StormExclamationRemoteBySubmitter &lt;text path&gt; &lt;result path&gt;</code><br/>
* If no parameters are provided, the program is run with default data from {@link WordCountData}.
* <p/>
* <p/>
* This example shows how to:
* <ul>
* <li>submit a regular Storm program to a local or remote Flink cluster.</li>
* </ul>
*/
public class StormExclamationRemoteBySubmitter {
public final static String topologyId = "Streaming Exclamation";
......
/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.flink.stormcompatibility.util;
import backtype.storm.spout.SpoutOutputCollector;
import backtype.storm.task.TopologyContext;
import backtype.storm.tuple.Values;
import org.apache.flink.stormcompatibility.wrappers.FiniteStormSpout;
import java.io.BufferedReader;
import java.io.FileNotFoundException;
import java.io.FileReader;
import java.io.IOException;
import java.util.Map;
/**
* Implements a Storm Spout that reads data from a given local file. The spout stops automatically
* when it reaches the end of the file.
*/
public class FiniteStormFileSpout extends AbstractStormSpout implements FiniteStormSpout {
private static final long serialVersionUID = -6996907090003590436L;
private final String path;
private BufferedReader reader;
private String line;
private boolean newLineRead;
public FiniteStormFileSpout(final String path) {
this.path = path;
}
@SuppressWarnings("rawtypes")
@Override
public void open(final Map conf, final TopologyContext context,
final SpoutOutputCollector collector) {
super.open(conf, context, collector);
try {
this.reader = new BufferedReader(new FileReader(this.path));
} catch (final FileNotFoundException e) {
throw new RuntimeException(e);
}
newLineRead = false;
}
@Override
public void close() {
if (this.reader != null) {
try {
this.reader.close();
} catch (final IOException e) {
throw new RuntimeException(e);
}
}
}
@Override
public void nextTuple() {
this.collector.emit(new Values(line));
newLineRead = false;
}
/**
* Can be called any number of times (including zero) before nextTuple().
*/
public boolean reachedEnd() {
try {
readLine();
} catch (IOException e) {
throw new RuntimeException("Exception occured while reading file " + path);
}
return line == null;
}
private void readLine() throws IOException {
if (!newLineRead) {
line = reader.readLine();
newLineRead = true;
}
}
}
......@@ -18,15 +18,31 @@
package org.apache.flink.stormcompatibility.util;
import backtype.storm.tuple.Tuple;
import backtype.storm.tuple.Values;
import org.apache.flink.stormcompatibility.wrappers.FiniteStormSpout;
public class RawOutputFormatter implements OutputFormatter {
private static final long serialVersionUID = 8685668993521259832L;
/**
* Implements a Storm Spout that reads String[] data stored in memory. The spout stops
* automatically when it has emitted all of the data.
*/
public class FiniteStormInMemorySpout extends AbstractStormSpout implements FiniteStormSpout {
private static final long serialVersionUID = -4008858647468647019L;
private String[] source;
private int counter = 0;
public FiniteStormInMemorySpout(String[] source) {
this.source = source;
}
@Override
public String format(final Tuple input) {
assert (input.size() == 1);
return input.getValue(0).toString();
public void nextTuple() {
this.collector.emit(new Values(source[this.counter++]));
}
public boolean reachedEnd() {
return counter >= source.length;
}
}
......@@ -18,12 +18,19 @@
package org.apache.flink.stormcompatibility.util;
import java.io.Serializable;
import backtype.storm.tuple.Tuple;
import java.io.Serializable;
public interface OutputFormatter extends Serializable {
/**
* Converts a Storm {@link Tuple} to a string. This method is used for formatting the output
* tuples before writing them out to a file or to the console.
*
* @param input The tuple to be formatted
* @return The string result of the formatting
*/
public String format(Tuple input);
}
......@@ -23,9 +23,20 @@ import backtype.storm.tuple.Tuple;
public class SimpleOutputFormatter implements OutputFormatter {
private static final long serialVersionUID = 6349573860144270338L;
/**
* Converts a Storm {@link Tuple} with a single field to a string by retrieving the value of
* that field. This method is used for formatting raw outputs wrapped in tuples, before writing
* them out to a file or to the console.
*
* @param input
* The tuple to be formatted
* @return The string result of the formatting
*/
@Override
public String format(final Tuple input) {
return input.getValues().toString();
if (input.getValues().size() != 1) {
throw new RuntimeException("The output is not raw");
}
return input.getValue(0).toString();
}
}
......@@ -40,7 +40,7 @@ import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
* <p/>
* This example shows how to:
* <ul>
* <li>use a Storm bolt within a Flink Streaming program.
* <li>use a Storm bolt within a Flink Streaming program.</li>
* </ul>
*/
public class BoltTokenizerWordCount {
......
......@@ -43,7 +43,7 @@ import org.apache.flink.util.Collector;
* <p/>
* This example shows how to:
* <ul>
* <li>use a Storm bolt within a Flink Streaming program.
* <li>use a Storm spout within a Flink Streaming program.</li>
* </ul>
*/
public class SpoutSourceWordCount {
......@@ -145,7 +145,7 @@ public class SpoutSourceWordCount {
}
return env.addSource(new StormFiniteSpoutWrapper<String>(new StormInMemorySpout(WordCountData.WORDS), true),
TypeExtractor.getForClass(String.class));
TypeExtractor.getForClass(String.class)).setParallelism(1);
}
......
......@@ -42,7 +42,7 @@ import org.apache.flink.stormcompatibility.api.FlinkTopologyBuilder;
* <p/>
* This example shows how to:
* <ul>
* <li>run a regular Storm program locally on Flink
* <li>run a regular Storm program locally on Flink</li>
* </ul>
*/
public class StormWordCountLocal {
......
......@@ -46,7 +46,7 @@ import org.apache.flink.stormcompatibility.api.FlinkTopologyBuilder;
* <p/>
* This example shows how to:
* <ul>
* <li>submit a regular Storm program to a local or remote Flink cluster.
* <li>submit a regular Storm program to a local or remote Flink cluster.</li>
* </ul>
*/
public class StormWordCountRemoteByClient {
......
......@@ -42,7 +42,7 @@ import org.apache.flink.stormcompatibility.api.FlinkTopologyBuilder;
* <p/>
* This example shows how to:
* <ul>
* <li>submit a regular Storm program to a local or remote Flink cluster.
* <li>submit a regular Storm program to a local or remote Flink cluster.</li>
* </ul>
*/
public class StormWordCountRemoteBySubmitter {
......
......@@ -47,7 +47,7 @@ import org.apache.flink.stormcompatibility.wordcount.stormoperators.StormBoltTok
* <p/>
* This example shows how to:
* <ul>
* <li>how to construct a regular Storm topology as Flink program
* <li>how to construct a regular Storm topology as Flink program</li>
* </ul>
*/
public class WordCountTopology {
......
......@@ -18,12 +18,12 @@
package org.apache.flink.stormcompatibility.exclamation;
import org.apache.flink.stormcompatibility.excamation.StormSpoutExclamation;
import org.apache.flink.stormcompatibility.excamation.ExclamationWithStormBolt;
import org.apache.flink.stormcompatibility.exclamation.util.ExclamationData;
import org.apache.flink.streaming.util.StreamingProgramTestBase;
import org.apache.flink.test.testdata.WordCountData;
public class StormSpoutExclamationITCase extends StreamingProgramTestBase {
public class ExclamationWithStormBoltITCase extends StreamingProgramTestBase {
protected String textPath;
protected String resultPath;
......@@ -41,7 +41,7 @@ public class StormSpoutExclamationITCase extends StreamingProgramTestBase {
@Override
protected void testProgram() throws Exception {
StormSpoutExclamation.main(new String[]{this.textPath, this.resultPath});
ExclamationWithStormBolt.main(new String[]{this.textPath, this.resultPath});
}
}
......@@ -18,12 +18,12 @@
package org.apache.flink.stormcompatibility.exclamation;
import org.apache.flink.stormcompatibility.excamation.StormBoltExclamation;
import org.apache.flink.stormcompatibility.excamation.ExclamationWithStormSpout;
import org.apache.flink.stormcompatibility.exclamation.util.ExclamationData;
import org.apache.flink.streaming.util.StreamingProgramTestBase;
import org.apache.flink.test.testdata.WordCountData;
public class StormBoltExclamationITCase extends StreamingProgramTestBase {
public class ExclamationWithStormSpoutITCase extends StreamingProgramTestBase {
protected String textPath;
protected String resultPath;
......@@ -41,7 +41,7 @@ public class StormBoltExclamationITCase extends StreamingProgramTestBase {
@Override
protected void testProgram() throws Exception {
StormBoltExclamation.main(new String[]{this.textPath, this.resultPath});
ExclamationWithStormSpout.main(new String[]{this.textPath, this.resultPath});
}
}