- 23 Feb 2016, 14 commits
-
-
Committed by Till Rohrmann
Add test case. This closes #1643.
-
Committed by zentol
This closes #1666
-
Committed by Stefano Baghino
[FLINK-3438] ExternalProcessRunner fails to detect ClassNotFoundException because of locale settings. [FLINK-3438] Improved solution, no workaround. [FLINK-3438] Change: a faulty process now causes a RuntimeException to be thrown. This closes #1665.
-
Committed by Till Rohrmann
[FLINK-3460] [build] Set Flink dependencies in flink-streaming-connectors, flink-batch-connectors, cep, gelly and flink-ml modules to provided. The flink-streaming-connectors all depend on flink-streaming-java in compile scope, so that dependency is always pulled in when you build a fat jar. Setting the dependency to provided avoids this. Furthermore, the connectors are always used in conjunction with flink-streaming-java, which means your build script always has an explicit dependency on flink-streaming-java. This lets you run your program locally, and you can just as easily keep the dependency out of the fat jar by setting it to provided as well. This closes #1683.
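For illustration, the provided-scope setup described above looks roughly like this in a user's pom.xml (the connector artifact ID and the version below are placeholders, not taken from the commit):

```xml
<!-- The connector is bundled into the fat jar. -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-connector-kafka</artifactId>
  <version>1.0.0</version> <!-- placeholder version -->
</dependency>
<!-- flink-streaming-java is needed to compile and to run locally,
     but 'provided' keeps it out of the fat jar. -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-streaming-java</artifactId>
  <version>1.0.0</version> <!-- placeholder version -->
  <scope>provided</scope>
</dependency>
```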
-
Committed by Till Rohrmann
[FLINK-3459] [build] Fix conflicting dependencies commons-collections, commons-beanutils and commons-beanutils-core. The Hadoop dependencies depend on commons-configuration, which transitively pulls in the commons-collections, commons-beanutils and commons-beanutils-core dependencies. Commons-beanutils and commons-collections contain classes that live in the same namespace; they are binary compatible but not binary identical. This is a problem for the sbt assembly plugin, which checks for binary identity. To solve the problem, we bump the commons-configuration version to 1.7 so that only commons-beanutils is pulled in. This is necessary because the shade plugin's transitive promotion of dependencies excludes the commons-beanutils dependency only from the directly depending artifact; the parent dependencies won't have the exclusion, so SBT would still pull the dependency in through one of the parents. Moreover, we replace commons-beanutils with commons-beanutils-bean-collections, which contains only the classes that do not conflict with commons-collections. This closes #1682.
-
Committed by Till Rohrmann
The shade-flink execution of the parent pom caused the Guava dependencies to be relocated twice in the flink-shaded-hadoop jar. To avoid this, this patch disables the shade-flink execution. This closes #1681.
-
Committed by zentol
This closes #1674.
-
Committed by Ufuk Celebi
-
Committed by Ufuk Celebi
Adds further log statements to improve the debuggability of JobManagerHAJobGraphRecoveryITCase. These should help with debugging:
- https://s3.amazonaws.com/archive.travis-ci.org/jobs/110095304/log.txt
- https://s3.amazonaws.com/archive.travis-ci.org/jobs/110085371/log.txt
-
Committed by Robert Metzger
This closes #1692
-
Committed by Kostas Kloudas
This closes #1588
-
Committed by Greg Hogan
This closes #1680
-
Committed by Greg Hogan
-
Committed by Robert Metzger
This closes #1684
-
- 22 Feb 2016, 2 commits
-
-
Committed by Aljoscha Krettek
The StateDescriptor can be serialized asynchronously in the case of asynchronous checkpoints. In that case two threads would try to use the TypeSerializer concurrently: the normal state-updating thread and the checkpoint serialization thread. If the TypeSerializer is a KryoSerializer this can lead to problems. Therefore the serializer needs to be duplicated before being used in writeObject().
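The duplicate-before-use pattern behind this fix can be sketched with a minimal stand-in serializer (this is not Flink's TypeSerializer API; the class below is invented for illustration): each thread that might serialize concurrently first gets its own copy via duplicate() instead of sharing one stateful instance.

```java
import java.nio.charset.StandardCharsets;

// Minimal sketch of the "duplicate before concurrent use" pattern.
// StatefulSerializer mimics a serializer with internal mutable state
// (like Kryo's buffers); it is not safe to share across threads.
class StatefulSerializer {
    // Mutable scratch state that makes concurrent sharing unsafe.
    private final StringBuilder scratch = new StringBuilder();

    byte[] serialize(String value) {
        scratch.setLength(0); // reset per-call state
        scratch.append(value.length()).append(':').append(value);
        return scratch.toString().getBytes(StandardCharsets.UTF_8);
    }

    // Each user that may run on another thread gets its own copy.
    StatefulSerializer duplicate() {
        return new StatefulSerializer();
    }
}

public class DuplicateBeforeUse {
    public static void main(String[] args) throws Exception {
        StatefulSerializer original = new StatefulSerializer();

        // The checkpointing thread must not share 'original' with the
        // state-updating thread; it works on a duplicate instead.
        StatefulSerializer forCheckpoint = original.duplicate();

        Thread checkpoint = new Thread(() -> System.out.println(
            new String(forCheckpoint.serialize("state"), StandardCharsets.UTF_8)));
        checkpoint.start();

        // Meanwhile the main thread keeps using its own instance safely.
        System.out.println(
            new String(original.serialize("update"), StandardCharsets.UTF_8));
        checkpoint.join();
    }
}
```

Duplicating is cheap relative to the cost of a corrupted checkpoint; the real fix applies the same idea to the KryoSerializer inside StateDescriptor's writeObject().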
-
Committed by Aljoscha Krettek
-
- 21 Feb 2016, 4 commits
-
-
Committed by Ufuk Celebi
-
Committed by Maximilian Michels
-
Committed by Maximilian Michels
This adds more default constructor parameters to the RMQSource. In addition, users may override the setupConnectionFactory() method to return their own configured factory. This closes #1670.
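The override hook mentioned here follows the template-method pattern. A library-free sketch of the idea (the class names below are invented; only the method name setupConnectionFactory mirrors the real RMQSource hook):

```java
// Sketch: a source obtains its connection configuration through a
// method that subclasses may override to return their own factory.
class ConnectionSettings {
    String host = "localhost";
    int heartbeatSeconds = 0; // 0 = library default
}

class SketchSource {
    // The hook: the base class provides a default configuration.
    protected ConnectionSettings setupConnectionFactory() {
        return new ConnectionSettings();
    }
}

class TunedSketchSource extends SketchSource {
    @Override
    protected ConnectionSettings setupConnectionFactory() {
        // Start from the default, then apply custom tuning that the
        // base class's constructor parameters do not expose.
        ConnectionSettings settings = super.setupConnectionFactory();
        settings.heartbeatSeconds = 30;
        return settings;
    }
}
```

The base class only ever calls setupConnectionFactory(), so subclasses can swap in arbitrary connection tuning without the base class needing a constructor parameter for every option.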
-
Committed by Ufuk Celebi
This closes #1678.
-
- 20 Feb 2016, 5 commits
-
-
Committed by Aljoscha Krettek
-
Committed by Ufuk Celebi
This closes #1673.
-
Committed by Ufuk Celebi
Squashes the following commits:
- [tests] Wait for task managers in JobManagerFailsITCase. Possible fix for: https://s3.amazonaws.com/archive.travis-ci.org/jobs/110235128/log.txt
- [tests] Move ITCase from runtime to tests
- [tests] Merge abstract and sub type class
- [tests] Determine JobManagerProcess ports from logs
This closes #1676.
-
Committed by Ufuk Celebi
-
Committed by Robert Metzger
- Remove Guava dependency
- Remove unused methods
- Move ZKString Serializer
- Add user-friendly error messages when parsing arguments
This closes #1623 and closes #1672
-
- 19 Feb 2016, 4 commits
-
-
Committed by Stephan Ewen
-
Committed by Robert Metzger
-
Committed by Robert Metzger
-
Committed by vasia
This closes #1663
-
- 18 Feb 2016, 5 commits
-
-
Committed by Stephan Ewen
-
Committed by zentol
This closes #1653
-
Committed by zentol
This closes #1650
-
Committed by Aljoscha Krettek
-
Committed by Aljoscha Krettek
They are not specific to RocksDB; they are just utilities for copying local folders to/from HDFS. Moving them to flink-streaming-java means that they are always on the TaskManager's classpath, not only in the user-code jar when RocksDB is used. If they are only in the user-code jar, the external process runner cannot find the class files, leading to ClassNotFoundExceptions.
-
- 17 Feb 2016, 6 commits
-
-
Committed by Aljoscha Krettek
This closes #1655
-
Committed by Stephan Ewen
-
Committed by Stephan Ewen
-
Committed by Stephan Ewen
-
Committed by Stephan Ewen
[FLINK-3413] [streaming] Make implicit conversions from Java DataStream to Scala DataStream explicit. This also cleans up a lot of JavaDocs in various Scala DataStream API classes.
-
Committed by Stephan Ewen
-