apache / Shardingsphere
Commit 464b5882
Authored May 12, 2017 by haocao

Add scan nchars test cases for TokenizerTest.

Parent: f65638bc
Showing 1 changed file with 38 additions and 16 deletions (+38 −16):

sharding-jdbc-core/src/test/java/com/dangdang/ddframe/rdb/sharding/parsing/lexer/analyzer/TokenizerTest.java
sharding-jdbc-core/src/test/java/com/dangdang/ddframe/rdb/sharding/parsing/lexer/analyzer/TokenizerTest.java

```diff
@@ -17,9 +17,7 @@
 package com.dangdang.ddframe.rdb.sharding.parsing.lexer.analyzer;
 
-import com.dangdang.ddframe.rdb.sharding.parsing.lexer.token.Literals;
-import com.dangdang.ddframe.rdb.sharding.parsing.lexer.token.Token;
-import com.dangdang.ddframe.rdb.sharding.parsing.lexer.token.TokenType;
+import com.dangdang.ddframe.rdb.sharding.parsing.lexer.token.*;
 import org.junit.Test;
 import org.mockito.internal.matchers.apachecommons.ReflectionEquals;
@@ -28,9 +26,9 @@ import static org.hamcrest.core.Is.is;
 import static org.junit.Assert.assertTrue;
 
 public final class TokenizerTest {
     
     private final Dictionary dictionary = new Dictionary();
     
     @Test
     public void assertSkipWhitespace() {
         String sql = "SELECT *\tFROM\rTABLE_XXX\n";
@@ -43,14 +41,14 @@ public final class TokenizerTest {
             assertThat(tokenizer.skipWhitespace(), is(expected));
         }
     }
     
     @Test
     public void assertSkipCommentWithoutComment() {
         String sql = "SELECT * FROM XXX_TABLE";
         Tokenizer tokenizer = new Tokenizer(sql, dictionary, sql.indexOf("_"));
         assertThat(tokenizer.skipComment(), is(sql.indexOf("_")));
     }
     
     @Test
     public void assertSkipSingleLineComment() {
         String singleLineCommentWithHyphen = "--x\"y`z\n";
@@ -63,7 +61,7 @@ public final class TokenizerTest {
         expected = sql.indexOf("/") + singleLineCommentWithSlash.length();
         assertThat(slashTokenizer.skipComment(), is(expected));
     }
     
     @Test
     public void assertSkipSingleLineMySQLComment() {
         String comment = "#x\"y`z\n";
@@ -72,7 +70,7 @@ public final class TokenizerTest {
         int expected = sql.indexOf("#") + comment.length();
         assertThat(tokenizer.skipComment(), is(expected));
     }
     
     @Test
     public void assertSkipMultipleLineComment() {
         String comment = "/*--xyz \n WHERE XX=1 //xyz*/";
@@ -81,7 +79,7 @@ public final class TokenizerTest {
         int expected = sql.indexOf("/") + comment.length();
         assertThat(tokenizer.skipComment(), is(expected));
     }
     
     @Test(expected = UnterminatedCharException.class)
     public void assertSkipMultipleLineCommentUnterminatedCharException() {
         String comment = "/*--xyz \n WHERE XX=1 //xyz";
@@ -89,7 +87,7 @@ public final class TokenizerTest {
         Tokenizer tokenizer = new Tokenizer(sql, dictionary, sql.indexOf("/"));
         tokenizer.skipComment();
     }
     
     @Test
     public void assertSkipHint() {
         String comment = "/*--xyz \n WHERE XX=1 //xyz*/";
@@ -98,7 +96,7 @@ public final class TokenizerTest {
         int expected = sql.indexOf("/") + comment.length();
         assertThat(tokenizer.skipHint(), is(expected));
     }
     
     @Test(expected = UnterminatedCharException.class)
     public void assertSkipHintUnterminatedCharException() {
         String comment = "/*--xyz \n WHERE XX=1 //xyz";
@@ -106,7 +104,7 @@ public final class TokenizerTest {
         Tokenizer tokenizer = new Tokenizer(sql, dictionary, sql.indexOf("/"));
         tokenizer.skipHint();
     }
     
     @Test
     public void assertScanVariable() {
         String sql = "SELECT * FROM XXX_TABLE %s WHERE YY>2";
@@ -119,7 +117,7 @@ public final class TokenizerTest {
         Tokenizer tokenizer = new Tokenizer(formatSql, dictionary, formatSql.indexOf("@"));
         assertTrue(new ReflectionEquals(tokenizer.scanVariable()).matches(new Token(Literals.VARIABLE, literals, formatSql.indexOf("WHERE") - 1)));
     }
     
     @Test
     public void assertScanNumber() {
         String sql = "SELECT * FROM XXX_TABLE WHERE XX=%s";
@@ -142,16 +140,40 @@ public final class TokenizerTest {
         assertScanHexDecimal(sql, "0x1e", Literals.HEX);
         assertScanHexDecimal(sql, "0x-1e", Literals.HEX);
     }
     
     private void assertScanNumber(final String sql, final String literals, final TokenType type) {
         String formatSql = String.format(sql, literals);
         Tokenizer tokenizer = new Tokenizer(formatSql, dictionary, sql.indexOf("=") + 1);
         assertTrue(new ReflectionEquals(tokenizer.scanNumber()).matches(new Token(type, literals, formatSql.length())));
     }
     
     private void assertScanHexDecimal(final String sql, final String literals, final TokenType type) {
         String formatSql = String.format(sql, literals);
         Tokenizer tokenizer = new Tokenizer(formatSql, dictionary, sql.indexOf("=") + 1);
         assertTrue(new ReflectionEquals(tokenizer.scanHexDecimal()).matches(new Token(type, literals, formatSql.length())));
     }
     
+    @Test
+    public void assertScanNChars() {
+        String sql = "SELECT * FROM ORDER, XX_TABLE AS `table` WHERE YY=N'xx' And group =-1 GROUP BY YY";
+        Tokenizer tokenizer = new Tokenizer(sql, dictionary, sql.indexOf("ORDER"));
+        assertTrue(new ReflectionEquals(tokenizer.scanIdentifier()).matches(new Token(Literals.IDENTIFIER, "ORDER", sql.indexOf(","))));
+        tokenizer = new Tokenizer(sql, dictionary, sql.indexOf("GROUP"));
+        assertTrue(new ReflectionEquals(tokenizer.scanIdentifier()).matches(new Token(DefaultKeyword.GROUP, "GROUP", sql.indexOf("BY") - 1)));
+        tokenizer = new Tokenizer(sql, dictionary, sql.indexOf("`"));
+        assertTrue(new ReflectionEquals(tokenizer.scanIdentifier()).matches(new Token(Literals.IDENTIFIER, "`table`", sql.indexOf("WHERE") - 1)));
+        tokenizer = new Tokenizer(sql, dictionary, sql.indexOf("YY"));
+        assertTrue(new ReflectionEquals(tokenizer.scanIdentifier()).matches(new Token(Literals.IDENTIFIER, "YY", sql.indexOf("="))));
+        tokenizer = new Tokenizer(sql, dictionary, sql.indexOf("=-"));
+        assertTrue(new ReflectionEquals(tokenizer.scanSymbol()).matches(new Token(Symbol.EQ, "=", sql.indexOf("=-") + 1)));
+        tokenizer = new Tokenizer(sql, dictionary, sql.indexOf("'"));
+        assertTrue(new ReflectionEquals(tokenizer.scanChars()).matches(new Token(Literals.CHARS, "xx", sql.indexOf("And") - 1)));
+    }
+    
     @Test(expected = UnterminatedCharException.class)
     public void assertScanChars() {
         String sql = "SELECT * FROM XXX_TABLE AS `TEST";
         Tokenizer tokenizer = new Tokenizer(sql, dictionary, sql.indexOf("`"));
         tokenizer.scanChars();
     }
 }
```
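The skip-comment tests expect a single-line comment (`--`, `//`, or `#`) to consume everything through the terminating newline, which is why `expected` is the comment's start index plus the comment string's length (the string includes the trailing `\n`). A minimal hypothetical sketch of that rule, with illustrative names rather than the project's actual Tokenizer code:

```java
// Hypothetical sketch of single-line comment skipping; the real
// sharding-jdbc Tokenizer is structured differently.
public final class CommentSkipSketch {
    
    // Returns the index just past the newline that ends a single-line
    // comment starting at offset, or the end of input if none follows.
    public static int skipSingleLineComment(final String input, final int offset, final int commentFlagLength) {
        int pos = offset + commentFlagLength;
        while (pos < input.length() && input.charAt(pos) != '\n') {
            pos++;
        }
        // consume the newline itself, clamped to the input length
        return Math.min(pos + 1, input.length());
    }
    
    public static void main(final String[] args) {
        String sql = "SELECT * FROM T --x\"y`z\nWHERE 1=1";
        // resumes exactly at the start of "WHERE"
        System.out.println(skipSingleLineComment(sql, sql.indexOf("--"), 2));
    }
}
```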
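The headline addition, assertScanNChars, checks among other things that `scanChars` extracts the quoted content `xx` from the MySQL-style national-character literal `N'xx'`. As a rough sketch of that behavior (hypothetical names, not the project's Tokenizer implementation), scanning such a literal means skipping the optional `N` prefix, then reading up to the closing quote, and failing on unterminated input much as the `UnterminatedCharException` tests expect:

```java
// Hypothetical sketch of N'...' literal scanning; illustrative only.
public final class NCharsScanSketch {
    
    // Returns the characters between the quotes of an (optionally
    // N-prefixed) char literal starting at offset.
    public static String scanNChars(final String input, final int offset) {
        int pos = offset;
        if (input.charAt(pos) == 'N') {
            pos++; // skip the national-charset prefix
        }
        if (input.charAt(pos) != '\'') {
            throw new IllegalArgumentException("not a char literal at offset " + offset);
        }
        int start = ++pos; // first character inside the quotes
        while (pos < input.length() && input.charAt(pos) != '\'') {
            pos++;
        }
        if (pos == input.length()) {
            // mirrors the unterminated-literal failure mode in the tests
            throw new IllegalStateException("unterminated char literal");
        }
        return input.substring(start, pos);
    }
    
    public static void main(final String[] args) {
        String sql = "SELECT * FROM XX_TABLE WHERE YY=N'xx' AND ZZ=1";
        System.out.println(scanNChars(sql, sql.indexOf("N'"))); // prints "xx"
    }
}
```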