doujutun3207 / flink
Unverified commit 853fc3c5
Authored Apr 01, 2020 by Jingsong Lee; committed via GitHub on Apr 01, 2020

[FLINK-16858][table] Expose partitioned by grammar

This closes #11559

Parent: 15d42c87

9 changed files with 10 additions and 67 deletions (+10 −67)
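With the conformance guard removed, a CREATE TABLE statement with a PARTITIONED BY clause parses under the default dialect as well as HIVE. A minimal DDL sketch of the now-universal grammar (the table name, columns, and connector option below are illustrative placeholders, not taken from this commit):

```sql
-- Before this commit, parsing this under SqlDialect.DEFAULT failed with
-- "Creating partitioned table is only allowed for HIVE dialect."
CREATE TABLE log_sink (
  a BIGINT,
  b VARCHAR
) PARTITIONED BY (a) WITH (
  'connector' = 'filesystem'  -- placeholder connector option
);
```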
flink-connectors/flink-connector-hive/src/test/java/org/apache/flink/table/catalog/hive/HiveTestUtils.java (+0 −2)
flink-table/flink-sql-parser/src/main/codegen/includes/parserImpls.ftl (+1 −6)
flink-table/flink-sql-parser/src/main/java/org/apache/flink/sql/parser/utils/ParserResource.java (+0 −3)
flink-table/flink-sql-parser/src/main/java/org/apache/flink/sql/parser/validate/FlinkSqlConformance.java (+0 −7)
flink-table/flink-sql-parser/src/main/resources/org.apache.flink.sql.parser.utils/ParserResource.properties (+0 −1)
flink-table/flink-sql-parser/src/test/java/org/apache/flink/sql/parser/FlinkSqlParserImplTest.java (+1 −39)
flink-table/flink-table-planner-blink/src/test/java/org/apache/flink/table/planner/operations/SqlToOperationConverterTest.java (+4 −4)
flink-table/flink-table-planner/src/test/java/org/apache/flink/table/sqlexec/SqlToOperationConverterTest.java (+4 −4)
flink-table/flink-table-planner/src/test/scala/org/apache/flink/table/runtime/batch/sql/PartitionableSinkITCase.scala (+0 −1)
flink-connectors/flink-connector-hive/src/test/java/org/apache/flink/table/catalog/hive/HiveTestUtils.java

@@ -19,7 +19,6 @@
 package org.apache.flink.table.catalog.hive;

 import org.apache.flink.table.api.EnvironmentSettings;
-import org.apache.flink.table.api.SqlDialect;
 import org.apache.flink.table.api.TableEnvironment;
 import org.apache.flink.table.catalog.CatalogTest;
 import org.apache.flink.table.catalog.exceptions.CatalogException;
@@ -114,7 +113,6 @@ public class HiveTestUtils {
 		EnvironmentSettings settings = EnvironmentSettings.newInstance().useBlinkPlanner().inBatchMode().build();
 		TableEnvironment tableEnv = TableEnvironment.create(settings);
 		tableEnv.getConfig().getConfiguration().setInteger(TABLE_EXEC_RESOURCE_DEFAULT_PARALLELISM.key(), 1);
-		tableEnv.getConfig().setSqlDialect(SqlDialect.HIVE);
 		return tableEnv;
 	}
flink-table/flink-sql-parser/src/main/codegen/includes/parserImpls.ftl

@@ -601,12 +601,7 @@ SqlCreate SqlCreateTable(Span s, boolean replace) :
     }]
     [
         <PARTITIONED> <BY>
-        partitionColumns = ParenthesizedSimpleIdentifierList() {
-            if (!((FlinkSqlConformance) this.conformance).allowCreatePartitionedTable()) {
-                throw SqlUtil.newContextException(getPos(),
-                    ParserResource.RESOURCE.createPartitionedTableIsOnlyAllowedForHive());
-            }
-        }
+        partitionColumns = ParenthesizedSimpleIdentifierList()
     ]
     [
         <WITH>
flink-table/flink-sql-parser/src/main/java/org/apache/flink/sql/parser/utils/ParserResource.java

@@ -35,7 +35,4 @@ public interface ParserResource {
 	@Resources.BaseMessage("OVERWRITE expression is only used with INSERT statement.")
 	Resources.ExInst<ParseException> overwriteIsOnlyUsedWithInsert();
-
-	@Resources.BaseMessage("Creating partitioned table is only allowed for HIVE dialect.")
-	Resources.ExInst<ParseException> createPartitionedTableIsOnlyAllowedForHive();
 }
flink-table/flink-sql-parser/src/main/java/org/apache/flink/sql/parser/validate/FlinkSqlConformance.java

@@ -155,11 +155,4 @@ public enum FlinkSqlConformance implements SqlConformance {
 	public boolean allowQualifyingCommonColumn() {
 		return true;
 	}
-
-	/**
-	 * Whether to allow "create table T(i int, j int) partitioned by (i)" grammar.
-	 */
-	public boolean allowCreatePartitionedTable() {
-		return this == FlinkSqlConformance.HIVE;
-	}
 }
flink-table/flink-sql-parser/src/main/resources/org.apache.flink.sql.parser.utils/ParserResource.properties

@@ -18,4 +18,3 @@
 #
 MultipleWatermarksUnsupported=Multiple WATERMARK statements is not supported yet.
 OverwriteIsOnlyUsedWithInsert=OVERWRITE expression is only used with INSERT statement.
-CreatePartitionedTableIsOnlyAllowedForHive=Creating partitioned table is only allowed for HIVE dialect.
flink-table/flink-sql-parser/src/test/java/org/apache/flink/sql/parser/FlinkSqlParserImplTest.java

@@ -21,19 +21,14 @@ package org.apache.flink.sql.parser;
 import org.apache.flink.sql.parser.ddl.SqlCreateTable;
 import org.apache.flink.sql.parser.error.SqlValidateException;
 import org.apache.flink.sql.parser.impl.FlinkSqlParserImpl;
-import org.apache.flink.sql.parser.validate.FlinkSqlConformance;

-import org.apache.calcite.avatica.util.Casing;
-import org.apache.calcite.avatica.util.Quoting;
 import org.apache.calcite.sql.SqlNode;
 import org.apache.calcite.sql.parser.SqlParseException;
 import org.apache.calcite.sql.parser.SqlParser;
 import org.apache.calcite.sql.parser.SqlParserImplFactory;
 import org.apache.calcite.sql.parser.SqlParserTest;
-import org.apache.calcite.sql.validate.SqlConformance;

 import org.hamcrest.BaseMatcher;
 import org.hamcrest.Description;
-import org.junit.Before;
 import org.junit.Test;

 import java.io.Reader;
@@ -45,7 +40,6 @@ import static org.junit.Assert.fail;
 /** FlinkSqlParserImpl tests. **/
 public class FlinkSqlParserImplTest extends SqlParserTest {
-	private SqlConformance conformance0;

 	@Override
 	protected SqlParserImplFactory parserImplFactory() {
@@ -54,25 +48,7 @@ public class FlinkSqlParserImplTest extends SqlParserTest {
 	protected SqlParser getSqlParser(Reader source,
 			UnaryOperator<SqlParser.ConfigBuilder> transform) {
-		if (conformance0 == null) {
-			return super.getSqlParser(source, transform);
-		} else {
-			// overwrite the default sql conformance.
-			return SqlParser.create(source,
-					SqlParser.configBuilder()
-							.setParserFactory(parserImplFactory())
-							.setQuoting(Quoting.DOUBLE_QUOTE)
-							.setUnquotedCasing(Casing.TO_UPPER)
-							.setQuotedCasing(Casing.UNCHANGED)
-							.setConformance(conformance0)
-							.build());
-		}
-	}
-
-	@Before
-	public void before() {
-		// clear the custom sql conformance.
-		conformance0 = null;
+		return super.getSqlParser(source, transform);
 	}

 	@Test
@@ -224,7 +200,6 @@ public class FlinkSqlParserImplTest extends SqlParserTest {
 	@Test
 	public void testCreateTable() {
-		conformance0 = FlinkSqlConformance.HIVE;
 		final String sql = "CREATE TABLE tbl1 (\n" +
 				"  a bigint,\n" +
 				"  h varchar, \n" +
@@ -258,7 +233,6 @@ public class FlinkSqlParserImplTest extends SqlParserTest {
 	@Test
 	public void testCreateTableWithComment() {
-		conformance0 = FlinkSqlConformance.HIVE;
 		final String sql = "CREATE TABLE tbl1 (\n" +
 				"  a bigint comment 'test column comment AAA.',\n" +
 				"  h varchar, \n" +
@@ -294,7 +268,6 @@ public class FlinkSqlParserImplTest extends SqlParserTest {
 	@Test
 	public void testCreateTableWithPrimaryKeyAndUniqueKey() {
-		conformance0 = FlinkSqlConformance.HIVE;
 		final String sql = "CREATE TABLE tbl1 (\n" +
 				"  a bigint comment 'test column comment AAA.',\n" +
 				"  h varchar, \n" +
@@ -565,7 +538,6 @@ public class FlinkSqlParserImplTest extends SqlParserTest {
 	@Test
 	public void testCreateInvalidPartitionedTable() {
-		conformance0 = FlinkSqlConformance.HIVE;
 		final String sql = "create table sls_stream1(\n" +
 				"  a bigint,\n" +
 				"  b VARCHAR,\n" +
@@ -578,16 +550,6 @@ public class FlinkSqlParserImplTest extends SqlParserTest {
 			.fails("Partition column [C] not defined in columns, at line 6, column 3"));
 	}

-	@Test
-	public void testNotAllowedCreatePartition() {
-		conformance0 = FlinkSqlConformance.DEFAULT;
-		final String sql = "create table sls_stream1(\n" +
-				"  a bigint,\n" +
-				"  b VARCHAR\n" +
-				") PARTITIONED BY (a^)^ with ( 'x' = 'y', 'asd' = 'dada')";
-		sql(sql).fails("Creating partitioned table is only allowed for HIVE dialect.");
-	}
-
 	@Test
 	public void testCreateTableWithMinusInOptionKey() {
 		final String sql = "create table source_table(\n" +
flink-table/flink-table-planner-blink/src/test/java/org/apache/flink/table/planner/operations/SqlToOperationConverterTest.java

@@ -246,8 +246,8 @@ public class SqlToOperationConverterTest {
 			"  'connector' = 'kafka', \n" +
 			"  'kafka.topic' = 'log.test'\n" +
 			")\n";
-		FlinkPlannerImpl planner = getPlannerBySqlDialect(SqlDialect.HIVE);
-		final CalciteParser parser = getParserBySqlDialect(SqlDialect.HIVE);
+		FlinkPlannerImpl planner = getPlannerBySqlDialect(SqlDialect.DEFAULT);
+		final CalciteParser parser = getParserBySqlDialect(SqlDialect.DEFAULT);
 		Operation operation = parse(sql, planner, parser);
 		assert operation instanceof CreateTableOperation;
 		CreateTableOperation op = (CreateTableOperation) operation;
@@ -265,8 +265,8 @@ public class SqlToOperationConverterTest {
 	@Test(expected = SqlConversionException.class)
 	public void testCreateTableWithPkUniqueKeys() {
-		FlinkPlannerImpl planner = getPlannerBySqlDialect(SqlDialect.HIVE);
-		final CalciteParser parser = getParserBySqlDialect(SqlDialect.HIVE);
+		FlinkPlannerImpl planner = getPlannerBySqlDialect(SqlDialect.DEFAULT);
+		final CalciteParser parser = getParserBySqlDialect(SqlDialect.DEFAULT);
 		final String sql = "CREATE TABLE tbl1 (\n" +
 			"  a bigint,\n" +
 			"  b varchar, \n" +
flink-table/flink-table-planner/src/test/java/org/apache/flink/table/sqlexec/SqlToOperationConverterTest.java

@@ -245,8 +245,8 @@ public class SqlToOperationConverterTest {
 			"  'connector' = 'kafka', \n" +
 			"  'kafka.topic' = 'log.test'\n" +
 			")\n";
-		final FlinkPlannerImpl planner = getPlannerBySqlDialect(SqlDialect.HIVE);
-		SqlNode node = getParserBySqlDialect(SqlDialect.HIVE).parse(sql);
+		final FlinkPlannerImpl planner = getPlannerBySqlDialect(SqlDialect.DEFAULT);
+		SqlNode node = getParserBySqlDialect(SqlDialect.DEFAULT).parse(sql);
 		assert node instanceof SqlCreateTable;
 		Operation operation = SqlToOperationConverter.convert(planner, catalogManager, node).get();
 		assert operation instanceof CreateTableOperation;
@@ -306,8 +306,8 @@ public class SqlToOperationConverterTest {
 			"  'connector' = 'kafka', \n" +
 			"  'kafka.topic' = 'log.test'\n" +
 			")\n";
-		final FlinkPlannerImpl planner = getPlannerBySqlDialect(SqlDialect.HIVE);
-		SqlNode node = getParserBySqlDialect(SqlDialect.HIVE).parse(sql);
+		final FlinkPlannerImpl planner = getPlannerBySqlDialect(SqlDialect.DEFAULT);
+		SqlNode node = getParserBySqlDialect(SqlDialect.DEFAULT).parse(sql);
 		assert node instanceof SqlCreateTable;
 		SqlToOperationConverter.convert(planner, catalogManager, node);
 	}
flink-table/flink-table-planner/src/test/scala/org/apache/flink/table/runtime/batch/sql/PartitionableSinkITCase.scala

@@ -62,7 +62,6 @@ class PartitionableSinkITCase extends AbstractTestBase {
 	def before(): Unit = {
 		batchExec.setParallelism(1)
 		tEnv = BatchTableEnvironment.create(batchExec)
-		tEnv.getConfig.setSqlDialect(SqlDialect.HIVE)
 		registerTableSource("nonSortTable", testData.toList)
 		registerTableSource("sortTable", testData1.toList)
 		PartitionableSinkITCase.init()