Commit e1e7d7f7
Authored May 04, 2020 by Robert Metzger

[FLINK-11086] Replace flink-shaded-hadoop-2 dependency by vanilla Hadoop dependency

Parent: 6a6a4395

Showing 24 changed files with 390 additions and 168 deletions (+390 −168)
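Across all 24 files the edit follows one recurring shape: the single shaded bundle `flink-shaded-hadoop-2` is dropped and the individual vanilla Hadoop artifacts are declared in its place. As a condensed illustration (a sketch distilled from the hunks in this commit, not taken verbatim from any one module):

```xml
<!-- Before: one shaded bundle pulled in all Hadoop classes -->
<dependency>
	<groupId>org.apache.flink</groupId>
	<artifactId>flink-shaded-hadoop-2</artifactId>
	<scope>provided</scope>
</dependency>

<!-- After: each required vanilla Hadoop artifact is declared explicitly -->
<dependency>
	<groupId>org.apache.hadoop</groupId>
	<artifactId>hadoop-common</artifactId>
	<scope>provided</scope>
</dependency>
<dependency>
	<groupId>org.apache.hadoop</groupId>
	<artifactId>hadoop-hdfs</artifactId>
	<scope>provided</scope>
</dependency>
```

Which artifacts replace the bundle varies per module (hadoop-hdfs, hadoop-mapreduce-client-core, hadoop-yarn-common, hadoop-yarn-client), as does the scope (`provided`, `test`, or `<optional>true</optional>`).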
flink-connectors/flink-connector-filesystem/pom.xml  +14 −2
flink-connectors/flink-connector-hive/pom.xml  +92 −3
flink-connectors/flink-hadoop-compatibility/pom.xml  +8 −2
flink-connectors/flink-hbase/pom.xml  +2 −2
flink-connectors/flink-hcatalog/pom.xml  +8 −2
flink-dist/pom.xml  +2 −46
flink-dist/src/main/assemblies/hadoop.xml  +0 −41
flink-end-to-end-tests/flink-bucketing-sink-test/pom.xml  +7 −2
flink-end-to-end-tests/pom.xml  +15 −0
flink-filesystems/flink-hadoop-fs/pom.xml  +15 −2
flink-filesystems/flink-mapr-fs/pom.xml  +8 −2
flink-filesystems/flink-s3-fs-hadoop/pom.xml  +11 −0
flink-filesystems/flink-swift-fs-hadoop/pom.xml  +22 −2
flink-formats/flink-compress/pom.xml  +8 −2
flink-formats/flink-orc-nohive/pom.xml  +25 −4
flink-formats/flink-orc/pom.xml  +19 −11
flink-formats/flink-parquet/pom.xml  +8 −2
flink-formats/flink-sequence-file/pom.xml  +7 −2
flink-fs-tests/pom.xml  +24 −2
flink-runtime/pom.xml  +14 −2
flink-tests/pom.xml  +3 −5
flink-yarn-tests/pom.xml  +0 −24
flink-yarn/pom.xml  +23 −2
pom.xml  +55 −6
flink-connectors/flink-connector-filesystem/pom.xml

@@ -51,8 +51,20 @@ under the License.
 		</dependency>
 		<dependency>
-			<groupId>org.apache.flink</groupId>
-			<artifactId>flink-shaded-hadoop-2</artifactId>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-common</artifactId>
 			<scope>provided</scope>
 		</dependency>
+		<dependency>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-hdfs</artifactId>
+			<scope>provided</scope>
+		</dependency>
+		<dependency>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-mapreduce-client-core</artifactId>
+			<scope>provided</scope>
+		</dependency>
flink-connectors/flink-connector-hive/pom.xml

@@ -41,6 +41,91 @@ under the License.
 		<derby.version>10.10.2.0</derby.version>
 	</properties>
+	<!-- Overwrite hadoop dependency management from flink-parent to use locally defined Hadoop version -->
+	<dependencyManagement>
+		<dependencies>
+			<dependency>
+				<groupId>org.apache.hadoop</groupId>
+				<artifactId>hadoop-common</artifactId>
+				<version>${hivemetastore.hadoop.version}</version>
+				<exclusions>
+					<exclusion>
+						<groupId>log4j</groupId>
+						<artifactId>log4j</artifactId>
+					</exclusion>
+					<exclusion>
+						<groupId>org.slf4j</groupId>
+						<artifactId>slf4j-log4j12</artifactId>
+					</exclusion>
+				</exclusions>
+			</dependency>
+			<dependency>
+				<groupId>org.apache.hadoop</groupId>
+				<artifactId>hadoop-hdfs</artifactId>
+				<version>${hivemetastore.hadoop.version}</version>
+				<exclusions>
+					<exclusion>
+						<groupId>log4j</groupId>
+						<artifactId>log4j</artifactId>
+					</exclusion>
+					<exclusion>
+						<groupId>org.slf4j</groupId>
+						<artifactId>slf4j-log4j12</artifactId>
+					</exclusion>
+				</exclusions>
+			</dependency>
+			<dependency>
+				<groupId>org.apache.hadoop</groupId>
+				<artifactId>hadoop-mapreduce-client-core</artifactId>
+				<version>${hivemetastore.hadoop.version}</version>
+				<exclusions>
+					<exclusion>
+						<groupId>log4j</groupId>
+						<artifactId>log4j</artifactId>
+					</exclusion>
+					<exclusion>
+						<groupId>org.slf4j</groupId>
+						<artifactId>slf4j-log4j12</artifactId>
+					</exclusion>
+				</exclusions>
+			</dependency>
+			<dependency>
+				<groupId>org.apache.hadoop</groupId>
+				<artifactId>hadoop-yarn-common</artifactId>
+				<version>${hivemetastore.hadoop.version}</version>
+				<exclusions>
+					<exclusion>
+						<groupId>log4j</groupId>
+						<artifactId>log4j</artifactId>
+					</exclusion>
+					<exclusion>
+						<groupId>org.slf4j</groupId>
+						<artifactId>slf4j-log4j12</artifactId>
+					</exclusion>
+				</exclusions>
+			</dependency>
+			<dependency>
+				<groupId>org.apache.hadoop</groupId>
+				<artifactId>hadoop-yarn-client</artifactId>
+				<version>${hivemetastore.hadoop.version}</version>
+				<exclusions>
+					<exclusion>
+						<groupId>log4j</groupId>
+						<artifactId>log4j</artifactId>
+					</exclusion>
+					<exclusion>
+						<groupId>org.slf4j</groupId>
+						<artifactId>slf4j-log4j12</artifactId>
+					</exclusion>
+				</exclusions>
+			</dependency>
+		</dependencies>
+	</dependencyManagement>
 	<dependencies>
 		<!-- core dependencies -->

@@ -126,12 +211,16 @@ under the License.
 			thus override the default hadoop version from 2.4.1 to 2.7.5
 		-->
 		<dependency>
-			<groupId>org.apache.flink</groupId>
-			<artifactId>flink-shaded-hadoop-2-uber</artifactId>
-			<version>${hivemetastore.hadoop.version}-${flink.shaded.version}</version>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-common</artifactId>
+			<scope>provided</scope>
+		</dependency>
+		<dependency>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-mapreduce-client-core</artifactId>
 			<scope>provided</scope>
 		</dependency>
 		<!-- Hive dependencies -->
 		<!-- Note: Hive published jars do not have proper dependencies declared.
 		We need to push for HIVE-16391 (https://issues.apache.org/jira/browse/HIVE-16391) to resolve this problem. -->
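The `<dependencyManagement>` block added above works because Maven resolves managed versions nearest-first: a declaration in the module's own pom shadows the one inherited from flink-parent. A minimal sketch of the effect, under that assumption — elsewhere in the same module a dependency can then omit `<version>` and still resolve to the locally managed Hive-metastore Hadoop version:

```xml
<!-- Sketch: no <version> here; Maven fills in the version managed above,
     i.e. ${hivemetastore.hadoop.version}, not the flink-parent default. -->
<dependency>
	<groupId>org.apache.hadoop</groupId>
	<artifactId>hadoop-common</artifactId>
	<scope>provided</scope>
</dependency>
```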
flink-connectors/flink-hadoop-compatibility/pom.xml

@@ -61,8 +61,14 @@ under the License.
 		</dependency>
 		<dependency>
-			<groupId>org.apache.flink</groupId>
-			<artifactId>flink-shaded-hadoop-2</artifactId>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-common</artifactId>
 			<scope>provided</scope>
 		</dependency>
+		<dependency>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-mapreduce-client-core</artifactId>
+			<scope>provided</scope>
+		</dependency>
flink-connectors/flink-hbase/pom.xml

@@ -101,8 +101,8 @@ under the License.
 		</dependency>
 		<dependency>
-			<groupId>org.apache.flink</groupId>
-			<artifactId>flink-shaded-hadoop-2</artifactId>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-common</artifactId>
 			<scope>provided</scope>
 		</dependency>
flink-connectors/flink-hcatalog/pom.xml

@@ -82,8 +82,14 @@ under the License.
 		</dependency>
 		<dependency>
-			<groupId>org.apache.flink</groupId>
-			<artifactId>flink-shaded-hadoop-2</artifactId>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-common</artifactId>
 			<scope>provided</scope>
 		</dependency>
+		<dependency>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-mapreduce-client-core</artifactId>
+			<scope>provided</scope>
+		</dependency>
flink-dist/pom.xml

@@ -137,8 +137,8 @@ under the License.
 			<version>${project.version}</version>
 			<exclusions>
 				<exclusion>
-					<groupId>org.apache.flink</groupId>
-					<artifactId>flink-shaded-hadoop-2</artifactId>
+					<groupId>org.apache.hadoop</groupId>
+					<artifactId>*</artifactId>
 				</exclusion>
 			</exclusions>
 		</dependency>

@@ -460,50 +460,6 @@ under the License.
 			</dependencies>
 		</profile>
-		<profile>
-			<!-- Copies that shaded Hadoop uber jar to the dist folder. -->
-			<id>include-hadoop</id>
-			<activation>
-				<property>
-					<name>include-hadoop</name>
-				</property>
-			</activation>
-			<dependencies>
-				<!--
-					The Hadoop 2 Uber jar should not go into the Flink dist jar, but
-					sit next to it. Hence, we set it to 'provided' here.
-				-->
-				<dependency>
-					<groupId>org.apache.flink</groupId>
-					<artifactId>flink-shaded-hadoop-2-uber</artifactId>
-					<scope>provided</scope>
-				</dependency>
-			</dependencies>
-			<build>
-				<plugins>
-					<plugin>
-						<artifactId>maven-assembly-plugin</artifactId>
-						<executions>
-							<execution>
-								<id>hadoop</id>
-								<phase>package</phase>
-								<goals>
-									<goal>single</goal>
-								</goals>
-								<configuration>
-									<descriptors>
-										<descriptor>src/main/assemblies/hadoop.xml</descriptor>
-									</descriptors>
-									<finalName>flink-${project.version}-bin</finalName>
-									<appendAssemblyId>false</appendAssemblyId>
-								</configuration>
-							</execution>
-						</executions>
-					</plugin>
-				</plugins>
-			</build>
-		</profile>
 		<profile>
 			<!-- Creates/Removes the 'build-target' symlink in the root directory (only Unix systems) -->
 			<id>symlink-build-target</id>
flink-dist/src/main/assemblies/hadoop.xml (deleted, 100644 → 0)

-<!--
-Licensed to the Apache Software Foundation (ASF) under one
-or more contributor license agreements. See the NOTICE file
-distributed with this work for additional information
-regarding copyright ownership. The ASF licenses this file
-to you under the Apache License, Version 2.0 (the
-"License"); you may not use this file except in compliance
-with the License. You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing,
-software distributed under the License is distributed on an
-"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-KIND, either express or implied. See the License for the
-specific language governing permissions and limitations
-under the License.
--->
-<assembly
-	xmlns="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.0"
-	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
-	xsi:schemaLocation="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.0 http://maven.apache.org/xsd/assembly-1.1.0.xsd">
-	<id>hadoop</id>
-	<formats>
-		<format>dir</format>
-	</formats>
-	<includeBaseDirectory>true</includeBaseDirectory>
-	<baseDirectory>flink-${project.version}</baseDirectory>
-	<dependencySets>
-		<dependencySet>
-			<outputDirectory>lib/</outputDirectory>
-			<useTransitiveDependencies>true</useTransitiveDependencies>
-			<scope>provided</scope>
-			<includes>
-				<include>org.apache.flink:flink-shaded-hadoop-2-uber</include>
-			</includes>
-		</dependencySet>
-	</dependencySets>
-</assembly>
flink-end-to-end-tests/flink-bucketing-sink-test/pom.xml

@@ -47,8 +47,13 @@
 			<version>${project.version}</version>
 		</dependency>
 		<dependency>
-			<groupId>org.apache.flink</groupId>
-			<artifactId>flink-shaded-hadoop-2</artifactId>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-common</artifactId>
 			<scope>provided</scope>
 		</dependency>
+		<dependency>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-hdfs</artifactId>
+			<scope>provided</scope>
 			<exclusions>
 				<!-- Needed for proper dependency convergence -->
flink-end-to-end-tests/pom.xml

@@ -255,6 +255,21 @@ under the License.
 				</execution>
 			</executions>
 		</plugin>
+		<plugin>
+			<groupId>org.apache.maven.plugins</groupId>
+			<artifactId>maven-enforcer-plugin</artifactId>
+			<executions>
+				<execution>
+					<id>dependency-convergence</id>
+					<goals>
+						<goal>enforce</goal>
+					</goals>
+					<configuration>
+						<skip>true</skip>
+					</configuration>
+				</execution>
+			</executions>
+		</plugin>
 	</plugins>
 	<pluginManagement>
flink-filesystems/flink-hadoop-fs/pom.xml

@@ -43,10 +43,23 @@ under the License.
 		<!-- pulling in Hadoop by default -->
 		<dependency>
-			<groupId>org.apache.flink</groupId>
-			<artifactId>flink-shaded-hadoop-2</artifactId>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-common</artifactId>
 			<optional>true</optional>
 		</dependency>
+		<dependency>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-hdfs</artifactId>
+			<optional>true</optional>
+		</dependency>
+		<dependency>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-mapreduce-client-core</artifactId>
+			<optional>true</optional>
+		</dependency>
 		<!-- for the behavior test suite -->
 		<dependency>
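Marking the Hadoop artifacts `<optional>true</optional>` means they stay off downstream classpaths: optional dependencies are not propagated transitively by Maven. A hypothetical consumer that actually needs HDFS would therefore have to declare Hadoop itself — a sketch, with the version numbers purely illustrative:

```xml
<!-- Hypothetical consumer pom: depending on flink-hadoop-fs alone does NOT
     pull in Hadoop, because the Hadoop dependencies above are <optional>. -->
<dependency>
	<groupId>org.apache.flink</groupId>
	<artifactId>flink-hadoop-fs</artifactId>
	<version>1.11.0</version><!-- illustrative -->
</dependency>
<!-- ...so the consumer declares the Hadoop artifacts it needs explicitly. -->
<dependency>
	<groupId>org.apache.hadoop</groupId>
	<artifactId>hadoop-common</artifactId>
	<version>2.8.5</version><!-- illustrative -->
</dependency>
```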
flink-filesystems/flink-mapr-fs/pom.xml

@@ -47,8 +47,14 @@ under the License.
 		</dependency>
 		<dependency>
-			<groupId>org.apache.flink</groupId>
-			<artifactId>flink-shaded-hadoop-2</artifactId>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-common</artifactId>
 			<optional>true</optional>
 		</dependency>
+		<dependency>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-hdfs</artifactId>
+			<optional>true</optional>
+		</dependency>
flink-filesystems/flink-s3-fs-hadoop/pom.xml

@@ -32,6 +32,17 @@ under the License.
 	<packaging>jar</packaging>
+	<!-- Override the flink-parent dependencyManagement definition for hadoop-common to ensure
+		${fs.hadoopshaded.version} is used for this file system -->
+	<dependencyManagement>
+		<dependencies>
+			<dependency>
+				<groupId>org.apache.hadoop</groupId>
+				<artifactId>hadoop-common</artifactId>
+				<version>${fs.hadoopshaded.version}</version>
+			</dependency>
+		</dependencies>
+	</dependencyManagement>
 	<dependencies>
 		<!-- Flink's file system abstraction (compiled against, not bundled) -->
flink-filesystems/flink-swift-fs-hadoop/pom.xml

@@ -37,6 +37,26 @@ under the License.
 		<openstackhadoop.hadoop.version>2.8.1</openstackhadoop.hadoop.version>
 	</properties>
+	<!-- Overwrite hadoop dependency versions inherited from the parent pom dependencyManagement -->
+	<dependencyManagement>
+		<dependencies>
+			<dependency>
+				<groupId>org.apache.hadoop</groupId>
+				<artifactId>hadoop-client</artifactId>
+				<version>${openstackhadoop.hadoop.version}</version>
+			</dependency>
+			<dependency>
+				<groupId>org.apache.hadoop</groupId>
+				<artifactId>hadoop-common</artifactId>
+				<version>${openstackhadoop.hadoop.version}</version>
+			</dependency>
+			<dependency>
+				<groupId>org.apache.hadoop</groupId>
+				<artifactId>hadoop-hdfs</artifactId>
+				<version>${openstackhadoop.hadoop.version}</version>
+			</dependency>
+		</dependencies>
+	</dependencyManagement>
 	<dependencies>
 		<!-- Flink core -->

@@ -59,8 +79,8 @@ under the License.
 				because the optional Hadoop dependency is also pulled in for tests -->
 			<exclusions>
 				<exclusion>
-					<groupId>org.apache.flink</groupId>
-					<artifactId>flink-shaded-hadoop-2</artifactId>
+					<groupId>org.apache.hadoop</groupId>
+					<artifactId>hadoop-hdfs</artifactId>
 				</exclusion>
 			</exclusions>
 		</dependency>
flink-formats/flink-compress/pom.xml

@@ -44,8 +44,14 @@ under the License.
 		</dependency>
 		<dependency>
-			<groupId>org.apache.flink</groupId>
-			<artifactId>flink-shaded-hadoop-2</artifactId>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-common</artifactId>
 			<scope>provided</scope>
 		</dependency>
+		<dependency>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-hdfs</artifactId>
+			<scope>provided</scope>
+		</dependency>
flink-formats/flink-orc-nohive/pom.xml

@@ -63,7 +63,7 @@ under the License.
 			<version>${orc.version}</version>
 			<classifier>nohive</classifier>
 			<exclusions>
-				<!-- Exclude ORC's Hadoop dependency and pull in Flink's shaded Hadoop. -->
+				<!-- Exclude ORC's Hadoop dependency and pull in Flink's Hadoop. -->
 				<exclusion>
 					<groupId>org.apache.hadoop</groupId>
 					<artifactId>hadoop-common</artifactId>

@@ -81,12 +81,17 @@ under the License.
 		<!-- Replacement for ORC's Hadoop dependency. -->
 		<dependency>
-			<groupId>org.apache.flink</groupId>
-			<artifactId>flink-shaded-hadoop-2</artifactId>
-			<version>${hadoop.version}-${flink.shaded.version}</version>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-common</artifactId>
 			<scope>provided</scope>
 		</dependency>
+		<dependency>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-hdfs</artifactId>
+			<scope>provided</scope>
+		</dependency>
 		<!-- test dependencies -->
 		<dependency>
 			<groupId>org.apache.flink</groupId>

@@ -98,6 +103,22 @@ under the License.
 	</dependencies>
+	<profiles>
+		<profile>
+			<!-- This profile adds dependencies needed to execute the tests with Hadoop 3 -->
+			<id>hadoop3-tests</id>
+			<dependencies>
+				<dependency>
+					<groupId>org.apache.hadoop</groupId>
+					<artifactId>hadoop-hdfs-client</artifactId>
+					<version>${hadoop.version}</version>
+					<scope>test</scope>
+				</dependency>
+			</dependencies>
+		</profile>
+	</profiles>
 	<build>
 		<plugins>
 			<!-- skip dependency convergence due to Hadoop dependency -->
flink-formats/flink-orc/pom.xml

@@ -75,15 +75,6 @@ under the License.
 			<artifactId>orc-core</artifactId>
 			<version>${orc.version}</version>
 			<exclusions>
-				<!-- Exclude ORC's Hadoop dependency and pull in Flink's shaded Hadoop. -->
-				<exclusion>
-					<groupId>org.apache.hadoop</groupId>
-					<artifactId>hadoop-common</artifactId>
-				</exclusion>
-				<exclusion>
-					<groupId>org.apache.hadoop</groupId>
-					<artifactId>hadoop-hdfs</artifactId>
-				</exclusion>
 				<exclusion>
 					<groupId>javax.activation</groupId>
 					<artifactId>javax.activation-api</artifactId>

@@ -97,8 +88,8 @@ under the License.
 		<!-- Replacement for ORC's Hadoop dependency. -->
 		<dependency>
-			<groupId>org.apache.flink</groupId>
-			<artifactId>flink-shaded-hadoop-2</artifactId>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-common</artifactId>
 			<scope>provided</scope>
 		</dependency>

@@ -147,6 +138,23 @@ under the License.
 	</dependencies>
+	<profiles>
+		<profile>
+			<!-- This profile adds dependencies needed to execute the tests with Hadoop 3 -->
+			<id>hadoop3-tests</id>
+			<dependencies>
+				<dependency>
+					<groupId>org.apache.hadoop</groupId>
+					<artifactId>hadoop-hdfs-client</artifactId>
+					<version>${hadoop.version}</version>
+					<scope>test</scope>
+				</dependency>
+			</dependencies>
+		</profile>
+	</profiles>
 	<build>
 		<plugins>
 			<!-- skip dependency convergence due to Hadoop dependency -->
flink-formats/flink-parquet/pom.xml

@@ -97,8 +97,14 @@ under the License.
 		<!-- Hadoop is needed by Parquet -->
 		<dependency>
-			<groupId>org.apache.flink</groupId>
-			<artifactId>flink-shaded-hadoop-2</artifactId>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-common</artifactId>
 			<scope>provided</scope>
 		</dependency>
+		<dependency>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-mapreduce-client-core</artifactId>
+			<scope>provided</scope>
+		</dependency>
flink-formats/flink-sequence-file/pom.xml

@@ -45,11 +45,16 @@ under the License.
 		<!-- Hadoop is needed for SequenceFile -->
 		<dependency>
-			<groupId>org.apache.flink</groupId>
-			<artifactId>flink-shaded-hadoop-2</artifactId>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-common</artifactId>
 			<scope>provided</scope>
 		</dependency>
+		<dependency>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-hdfs</artifactId>
+			<scope>provided</scope>
+		</dependency>
 		<!-- test dependencies -->
 		<dependency>
flink-fs-tests/pom.xml

@@ -37,8 +37,14 @@ under the License.
 	-->
 	<dependencies>
 		<dependency>
-			<groupId>org.apache.flink</groupId>
-			<artifactId>flink-shaded-hadoop-2</artifactId>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-common</artifactId>
 			<scope>test</scope>
 		</dependency>
+		<dependency>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-hdfs</artifactId>
+			<scope>test</scope>
+		</dependency>

@@ -130,6 +136,22 @@ under the License.
 		</dependency>
 	</dependencies>
+	<profiles>
+		<profile>
+			<!-- This profile adds dependencies needed to execute the tests with Hadoop 3 -->
+			<id>hadoop3-tests</id>
+			<dependencies>
+				<dependency>
+					<groupId>org.apache.hadoop</groupId>
+					<artifactId>hadoop-hdfs-client</artifactId>
+					<version>${hadoop.version}</version>
+					<scope>test</scope>
+				</dependency>
+			</dependencies>
+		</profile>
+	</profiles>
 	<build>
 		<plugins>
 			<plugin>
flink-runtime/pom.xml

@@ -68,8 +68,20 @@ under the License.
 		<!-- optional dependency on Hadoop, so that Hadoop classes are not always pulled in -->
 		<dependency>
-			<groupId>org.apache.flink</groupId>
-			<artifactId>flink-shaded-hadoop-2</artifactId>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-common</artifactId>
 			<optional>true</optional>
 		</dependency>
+		<dependency>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-hdfs</artifactId>
+			<optional>true</optional>
+		</dependency>
+		<dependency>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-mapreduce-client-core</artifactId>
+			<optional>true</optional>
+		</dependency>
flink-tests/pom.xml

@@ -58,12 +58,10 @@ under the License.
 			<scope>test</scope>
 		</dependency>
 		<dependency>
-			<groupId>org.apache.flink</groupId>
-			<artifactId>flink-shaded-hadoop-2</artifactId>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-common</artifactId>
 			<scope>test</scope>
 		</dependency>
 		<dependency>
 			<groupId>org.apache.flink</groupId>
 			<artifactId>flink-shaded-jackson</artifactId>
flink-yarn-tests/pom.xml

@@ -90,14 +90,6 @@ under the License.
 			<artifactId>flink-yarn_${scala.binary.version}</artifactId>
 			<version>${project.version}</version>
 			<scope>test</scope>
-			<exclusions>
-				<exclusion>
-					<!-- prevent flink-shaded-hadoop from being on the test classpath
-						to avoid conflicts with other dependencies -->
-					<groupId>org.apache.flink</groupId>
-					<artifactId>flink-shaded-hadoop-2</artifactId>
-				</exclusion>
-			</exclusions>
 		</dependency>
 		<dependency>

@@ -106,14 +98,6 @@ under the License.
 			<version>${project.version}</version>
 			<type>test-jar</type>
 			<scope>test</scope>
-			<exclusions>
-				<exclusion>
-					<!-- prevent flink-shaded-hadoop from being on the test classpath
-						to avoid conflicts with other dependencies -->
-					<groupId>org.apache.flink</groupId>
-					<artifactId>flink-shaded-hadoop-2</artifactId>
-				</exclusion>
-			</exclusions>
 		</dependency>
 		<dependency>

@@ -399,14 +383,6 @@ under the License.
 			</executions>
 			<configuration>
 				<artifactItems>
-					<artifactItem>
-						<groupId>org.apache.flink</groupId>
-						<artifactId>flink-shaded-hadoop-2-uber</artifactId>
-						<version>${hadoop.version}-${flink.shaded.version}</version>
-						<type>jar</type>
-						<overWrite>true</overWrite>
-						<outputDirectory>${project.build.directory}/shaded-hadoop</outputDirectory>
-					</artifactItem>
 					<artifactItem>
 						<groupId>org.apache.flink</groupId>
 						<artifactId>flink-examples-batch_${scala.binary.version}</artifactId>
flink-yarn/pom.xml

@@ -54,9 +54,30 @@ under the License.
 		</dependency>
 		<dependency>
-			<groupId>org.apache.flink</groupId>
-			<artifactId>flink-shaded-hadoop-2</artifactId>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-common</artifactId>
+		</dependency>
+		<dependency>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-hdfs</artifactId>
+		</dependency>
+		<dependency>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-yarn-common</artifactId>
+		</dependency>
+		<dependency>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-yarn-client</artifactId>
+		</dependency>
+		<dependency>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-mapreduce-client-core</artifactId>
 		</dependency>
 		<!-- test dependencies -->
pom.xml

@@ -308,9 +308,9 @@ under the License.
 			</dependency>
 			<dependency>
-				<groupId>org.apache.flink</groupId>
-				<artifactId>flink-shaded-hadoop-2</artifactId>
-				<version>${hadoop.version}-${flink.shaded.version}</version>
+				<groupId>org.apache.hadoop</groupId>
+				<artifactId>hadoop-common</artifactId>
+				<version>${hadoop.version}</version>
 				<exclusions>
 					<exclusion>
 						<groupId>log4j</groupId>

@@ -324,9 +324,57 @@ under the License.
 			</dependency>
 			<dependency>
-				<groupId>org.apache.flink</groupId>
-				<artifactId>flink-shaded-hadoop-2-uber</artifactId>
-				<version>${hadoop.version}-${flink.shaded.version}</version>
+				<groupId>org.apache.hadoop</groupId>
+				<artifactId>hadoop-hdfs</artifactId>
+				<version>${hadoop.version}</version>
+				<exclusions>
+					<exclusion>
+						<groupId>log4j</groupId>
+						<artifactId>log4j</artifactId>
+					</exclusion>
+					<exclusion>
+						<groupId>org.slf4j</groupId>
+						<artifactId>slf4j-log4j12</artifactId>
+					</exclusion>
+				</exclusions>
+			</dependency>
+			<dependency>
+				<groupId>org.apache.hadoop</groupId>
+				<artifactId>hadoop-mapreduce-client-core</artifactId>
+				<version>${hadoop.version}</version>
+				<exclusions>
+					<exclusion>
+						<groupId>log4j</groupId>
+						<artifactId>log4j</artifactId>
+					</exclusion>
+					<exclusion>
+						<groupId>org.slf4j</groupId>
+						<artifactId>slf4j-log4j12</artifactId>
+					</exclusion>
+				</exclusions>
+			</dependency>
+			<dependency>
+				<groupId>org.apache.hadoop</groupId>
+				<artifactId>hadoop-yarn-common</artifactId>
+				<version>${hadoop.version}</version>
+				<exclusions>
+					<exclusion>
+						<groupId>log4j</groupId>
+						<artifactId>log4j</artifactId>
+					</exclusion>
+					<exclusion>
+						<groupId>org.slf4j</groupId>
+						<artifactId>slf4j-log4j12</artifactId>
+					</exclusion>
+				</exclusions>
+			</dependency>
+			<dependency>
+				<groupId>org.apache.hadoop</groupId>
+				<artifactId>hadoop-yarn-client</artifactId>
+				<version>${hadoop.version}</version>
 				<exclusions>
 					<exclusion>
 						<groupId>log4j</groupId>

@@ -1482,6 +1530,7 @@ under the License.
 						<forkNumber>0${surefire.forkNumber}</forkNumber>
 						<log4j.configuration>${log4j.configuration}</log4j.configuration>
 						<jobmanager.scheduler>${test.scheduler.type}</jobmanager.scheduler>
+						<hadoop.version>${hadoop.version}</hadoop.version>
 					</systemPropertyVariables>
 					<argLine>-Xms256m -Xmx2048m -Dmvn.forkNumber=${surefire.forkNumber} -XX:+UseG1GC</argLine>
 				</configuration>
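With the shaded bundle gone, the root pom manages the vanilla Hadoop artifacts through the `${hadoop.version}` property (visible throughout the pom.xml hunks). As an illustrative sketch only — profile id and version number are hypothetical, the property name comes from the diff — a build could pin a different Hadoop release by overriding that property in a profile:

```xml
<!-- Hypothetical profile: overriding ${hadoop.version} repins every
     managed org.apache.hadoop artifact at once. -->
<profile>
	<id>hadoop-2.8</id>
	<properties>
		<hadoop.version>2.8.5</hadoop.version>
	</properties>
</profile>
```

The same effect could be had per build invocation by setting the property on the command line, since Maven properties given there take precedence over pom defaults.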