- 14 Feb 2020, 1 commit
  Committed by Chesnay Schepler

- 10 Dec 2019, 1 commit
  Committed by Gary Yao

- 31 Aug 2019, 1 commit
  Committed by Chesnay Schepler

- 12 Jul 2019, 1 commit
  Committed by Kurt Young

- 04 May 2019, 1 commit
  Committed by Chesnay Schepler

- 25 Feb 2019, 1 commit
  Committed by Aljoscha Krettek

- 14 Dec 2018, 1 commit
  Committed by Gary Yao

- 03 Nov 2018, 1 commit
  Committed by Till Rohrmann

- 17 Jul 2018, 1 commit
  Committed by Till Rohrmann

- 27 Feb 2018, 1 commit
  Committed by Till Rohrmann

- 09 Jan 2018, 1 commit
  Committed by Till Rohrmann
  The flip6 build profile runs only the Flip-6 related test cases; conversely, all Flip-6 related test cases are excluded when the flip6 profile is not active. This should reduce testing time as more and more Flip-6 test cases are added. Includes flink-test-utils-junit in all submodules to make the Category marker interfaces Flip6 and OldAndFlip6 available. This closes #4889.

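A Maven profile that selects tests by JUnit 4 category could look roughly like the following sketch. The profile id and category names follow the commit description, but the category package and the exact surefire configuration in Flink's pom are assumptions here:

```xml
<!-- Sketch of a category-based test profile; package name is hypothetical. -->
<profile>
  <id>flip6</id>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <configuration>
          <!-- run only tests annotated with @Category(Flip6.class) -->
          <groups>org.apache.flink.testutils.category.Flip6</groups>
        </configuration>
      </plugin>
    </plugins>
  </build>
</profile>
```

Outside the profile, the same `maven-surefire-plugin` configuration would use `<excludedGroups>` with the same category class, so the Flip-6 tests run only when the profile is active.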
- 09 Nov 2017, 1 commit
  Committed by Piotr Nowojski
  Disable it in most modules.

- 07 Nov 2017, 1 commit
  Committed by Aljoscha Krettek

- 27 Sep 2017, 1 commit
  Committed by Aljoscha Krettek
  This also makes them optional in flink-runtime, which is enabled by the previous changes to only use Hadoop dependencies if they are available. It also requires adding a few explicit dependencies in other modules, because they were relying on transitive dependencies of the Hadoop deps; the most common such dependency is commons-io.

- 10 Aug 2017, 1 commit
  Committed by zentol
  This closes #4494.

- 07 Aug 2017, 1 commit
  Committed by zentol
  This closes #4453.

- 31 Jul 2017, 1 commit
  Committed by zentol

- 25 Jul 2017, 1 commit
  Committed by Dawid Wysakowicz
  This closes #4384.

- 13 Jul 2017, 1 commit
  Committed by zentol

- 08 May 2017, 1 commit
  Committed by Robert Metzger

- 21 Dec 2016, 2 commits
  Committed by Robert Metzger

  Committed by Stephan Ewen
  Currently, every project in Flink has a hard (compile-scope) dependency on the jsr305, slf4j, and log4j artifacts, so they are pulled into every fat jar, including user fat jars, as soon as the user refers to a connector or library. This commit changes the behavior in two ways:
  1. It removes the concrete logger dependencies from the root pom file and instead adds them to the 'flink-core' project. All modules that refer to 'flink-core' will have those dependencies as well, but projects that have 'flink-core' as provided (connectors, libraries, user programs, etc.) will have those dependencies transitively as provided, too.
  2. It overrides the slf4j and jsr305 dependencies in the parents of 'flink-connectors', 'flink-libraries', and 'flink-metrics' and sets them to 'provided'. That way all core projects pull in the logger classes, but projects that are not part of flink-dist (and are instead bundled into fat jars) will not bundle these dependencies again.
  flink-dist puts the dependencies into the fat jar (slf4j, jsr305) or the lib folder (log4j).

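The scope override described in point 2 could look roughly like this in a parent pom (a sketch; the slf4j and jsr305 coordinates are the usual published ones, but the exact section of Flink's poms is assumed):

```xml
<!-- e.g. in the parent pom of flink-connectors: mark logging/annotation
     dependencies as provided so fat jars built on top do not bundle them -->
<dependencies>
  <dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
    <scope>provided</scope>
  </dependency>
  <dependency>
    <groupId>com.google.code.findbugs</groupId>
    <artifactId>jsr305</artifactId>
    <scope>provided</scope>
  </dependency>
</dependencies>
```

With `provided` scope the classes are on the compile and test classpath but are excluded from transitive packaging, which is exactly the "available but not bundled" behavior the commit message describes.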
- 30 Nov 2016, 1 commit
  Committed by Robert Metzger
  This closes #2850.

- 03 Nov 2016, 1 commit
  Committed by Evgeny_Kincharov
  [FLINK-4315] [dataSet] [hadoopCompat] Annotate Hadoop-related methods in ExecutionEnvironment as @Deprecated.
  - Preparation for removing the Hadoop dependency from flink-java
  - Alternatives for the deprecated functionality are provided in flink-hadoop-compatibility via HadoopInputs
  This closes #2637.

- 05 Aug 2016, 1 commit
  Committed by Stephan Ewen
  This moves the API compatibility checks into the API projects that use stability annotations. Previously, every project ran the tests, regardless of whether it contained public API classes or not. This closes #2334.

- 03 Aug 2016, 1 commit
  Committed by Marton Balassi
  This closes #2324.

- 05 Jul 2016, 1 commit
  Committed by Stephan Ewen
  Makes the JUnit test utils (TestLogger, retry rules, ...) properly available to other projects without the 'flink-core' test-jar, via the 'flink-test-utils-junit' project. Makes the ForkableMiniCluster, TestEnvironment, and other test utilities available in the 'main' scope of the 'flink-test-utils' project. Creates a 'flink-test-utils-parent' project that holds the 'flink-test-utils-junit' and 'flink-test-utils' projects. Also moves some tests between projects and inlines some very simple utility functions to simplify test-jar dependencies.

- 11 May 2016, 1 commit
  Committed by zentol
  - replaced Charsets with StandardCharsets
  - added checkElementIndex to Flink Preconditions
  - replaced Guava Preconditions with Flink Preconditions
  - removed single usages of Ints.max() and Joiner()
  This closes #1938.

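The `checkElementIndex` helper mentioned above follows the Guava-style precondition contract: return the index if it is valid for a sequence of the given size, otherwise throw. A minimal self-contained sketch of that contract (an illustration, not Flink's actual `org.apache.flink.util.Preconditions` code):

```java
// Sketch of a Guava-style checkElementIndex precondition helper.
public final class Preconditions {

    private Preconditions() {}

    /** Returns {@code index} if it is a valid element index for a sequence of {@code size} elements. */
    public static int checkElementIndex(int index, int size) {
        if (size < 0) {
            throw new IllegalArgumentException("size must be non-negative, was " + size);
        }
        if (index < 0 || index >= size) {
            throw new IndexOutOfBoundsException(
                    "index " + index + " is out of bounds for size " + size);
        }
        return index;
    }

    public static void main(String[] args) {
        int[] values = {10, 20, 30};
        // valid index: returned unchanged, so it can be used inline
        System.out.println(values[checkElementIndex(2, values.length)]);
        try {
            checkElementIndex(3, values.length); // invalid: throws
        } catch (IndexOutOfBoundsException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```

Returning the checked value is the design point: the check can wrap an array or list access without a separate statement.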
- 29 Feb 2016, 1 commit
  Committed by Robert Metzger

- 05 Feb 2016, 1 commit
  Committed by Robert Metzger
  This closes #1428.

- 02 Feb 2016, 1 commit
  Committed by Stephan Ewen

- 01 Feb 2016, 1 commit
  Committed by Stephan Ewen
  The auto-magic for Joda Time was limited to very few classes, and it was not transparent which cases would be handled.

- 21 Jan 2016, 1 commit
  Committed by Maximilian Michels
  This closes #1535.

- 01 Dec 2015, 1 commit
  Committed by Fabian Hueske
  This closes #1331.

- 23 Oct 2015, 1 commit
  Committed by Maximilian Michels
  0.10-SNAPSHOT continues on branch release-0.10.

- 01 Oct 2015, 1 commit
  Committed by Stephan Ewen

- 28 Aug 2015, 1 commit
  Committed by Stephan Ewen
  [FLINK-2584] [java api] Downgrade the version of the javakaffee kryo serializers for compatibility with kryo 2.4.

- 26 Aug 2015, 1 commit
  Committed by Robert Metzger
  This is needed because the Hadoop IF/OFs use Hadoop's FileSystem stack, which uses the security credentials passed in the JobConf / Job class in the getSplits() method. Note that access to secured Hadoop 1.x via Hadoop IF/OFs is not possible with this change; this limitation is due to missing methods in the old APIs.
  - Add some comments & change dependency scope to test

- 21 Aug 2015, 1 commit
  Committed by chengxiang li
  [FLINK-1901] [core]
  - enable sampling with a fixed size on the whole dataset
  - add more comments for RandomSamplerTest
  - refactor the PoissonSampler output Iterator
  - move the sample/sampleWithSize operators to DataSetUtils
  Adds notes for commons-math3 to the LICENSE and NOTICE files. This closes #949.

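Fixed-size sampling over an input of unknown length, as in `sampleWithSize` above, is classically done with reservoir sampling (Algorithm R). A minimal single-pass, non-distributed sketch of the technique (an illustration only, not Flink's DataSetUtils/sampler implementation):

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.Random;

// Reservoir sampling: draws a uniform random sample of at most k elements
// from an input of unknown length in a single pass.
public final class ReservoirSample {

    public static <T> List<T> sample(Iterator<T> input, int k, Random rnd) {
        List<T> reservoir = new ArrayList<>(k);
        long seen = 0;
        while (input.hasNext()) {
            T element = input.next();
            seen++;
            if (reservoir.size() < k) {
                // fill the reservoir with the first k elements
                reservoir.add(element);
            } else {
                // replace a random slot with probability k / seen,
                // which keeps every element equally likely to survive
                long j = (long) (rnd.nextDouble() * seen);
                if (j < k) {
                    reservoir.set((int) j, element);
                }
            }
        }
        return reservoir;
    }

    public static void main(String[] args) {
        List<Integer> data = new ArrayList<>();
        for (int i = 0; i < 1000; i++) {
            data.add(i);
        }
        List<Integer> sample = sample(data.iterator(), 10, new Random(42));
        System.out.println(sample.size() + " elements sampled");
    }
}
```

In a distributed setting each partition would keep its own reservoir and the partial reservoirs would then be merged, which is the part a DataSet operator has to add on top of this single-pass core.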
- 18 Jun 2015, 1 commit
  Committed by mbalassi
  Closes #851.