Commit 8923a734 — "docs:更新文档" (docs: update the documentation)
Author: 小傅哥 · January 21, 2024 · Parent: aa0cfb88
Showing 1 changed file with 277 additions and 123 deletions: README.md (+277 −123)
@@ -26,7 +26,7 @@
<dependency>
    <groupId>cn.bugstack</groupId>
    <artifactId>chatglm-sdk-java</artifactId>
-   <version>1.1</version>
+   <version>2.0</version>
</dependency>
```
...
@@ -38,150 +38,304 @@
### 2.1 Code Execution
```java
private OpenAiSession openAiSession;

@Before
public void test_OpenAiSessionFactory() {
    // 1. Configuration
    Configuration configuration = new Configuration();
    configuration.setApiHost("https://open.bigmodel.cn/");
    configuration.setApiSecretKey("62ddec38b1d0b9a7b0fddaf271e6ed90.HpD0SUBUlvqd05ey");
    configuration.setLevel(HttpLoggingInterceptor.Level.BODY);
    // 2. Session factory
    OpenAiSessionFactory factory = new DefaultOpenAiSessionFactory(configuration);
    // 3. Open a session
    this.openAiSession = factory.openSession();
}
```
- Before running the tests, apply for your own ApiKey and pass it to setApiSecretKey (see the sketch below for reading it from an environment variable instead of hard-coding it).
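To keep a real key out of source control, you can read it from an environment variable rather than hard-coding it. A minimal sketch, assuming a hypothetical variable name `CHATGLM_API_KEY`:

```java
// Read the secret key from the environment; CHATGLM_API_KEY is a hypothetical variable name.
String apiSecretKey = System.getenv("CHATGLM_API_KEY");

Configuration configuration = new Configuration();
configuration.setApiHost("https://open.bigmodel.cn/");
configuration.setApiSecretKey(apiSecretKey);
```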
#### 2.1.1 Streaming Chat - Legacy-Compatible Mode

<details><summary><a>👉 View code</a></summary></br>

```java
/**
 * @author 小傅哥 (WeChat: fustack)
 * @description Apply for an ApiSecretKey on the official site: <a href="https://open.bigmodel.cn/usercenter/apikeys">ApiSecretKey</a>
 * @github https://github.com/fuzhengwei
 * @Copyright 公众号:bugstack虫洞栈 | Blog: https://bugstack.cn - accumulate, share, and grow, so that you and others all gain something!
 * Streaming chat:
 * 1. By default isCompatible = true, which keeps both the old and the new response data formats working.
 * 2. GLM_3_5_TURBO and GLM_4 support plugins such as web search.
 */
@Slf4j
public class ApiTest {

    @Test
    public void test_completions() throws Exception {
        CountDownLatch countDownLatch = new CountDownLatch(1);
        // Request parameters: model and prompt
        ChatCompletionRequest request = new ChatCompletionRequest();
        request.setModel(Model.GLM_3_5_TURBO); // chatGLM_6b_SSE、chatglm_lite、chatglm_lite_32k、chatglm_std、chatglm_pro
        request.setIncremental(false);
        // Whether to adapt the response data: the GLM_3_5_TURBO and GLM_4 models released in January 2024
        // return a different structure than earlier models; set true to stay compatible.
        request.setIsCompatible(true);
        // glm-3-turbo and glm-4 (released in January 2024) support functions, knowledge bases, and web search
        request.setTools(new ArrayList<ChatCompletionRequest.Tool>() {
            private static final long serialVersionUID = -7988151926241837899L;

            {
                add(ChatCompletionRequest.Tool.builder()
                        .type(ChatCompletionRequest.Tool.Type.web_search)
                        .webSearch(ChatCompletionRequest.Tool.WebSearch.builder().enable(true).searchQuery("小傅哥").build())
                        .build());
            }
        });
        request.setPrompt(new ArrayList<ChatCompletionRequest.Prompt>() {
            private static final long serialVersionUID = -7988151926241837899L;

            {
                add(ChatCompletionRequest.Prompt.builder()
                        .role(Role.user.getCode())
                        .content("小傅哥的是谁")
                        .build());
            }
        });
        // Send the request
        openAiSession.completions(request, new EventSourceListener() {
            @Override
            public void onEvent(EventSource eventSource, @Nullable String id, @Nullable String type, String data) {
                ChatCompletionResponse response = JSON.parseObject(data, ChatCompletionResponse.class);
                log.info("测试结果 onEvent:{}", response.getData());
                // Event types: add = incremental, finish = done, error = error, interrupted = interrupted
                if (EventType.finish.getCode().equals(type)) {
                    ChatCompletionResponse.Meta meta = JSON.parseObject(response.getMeta(), ChatCompletionResponse.Meta.class);
                    log.info("[输出结束] Tokens {}", JSON.toJSONString(meta));
                }
            }

            @Override
            public void onClosed(EventSource eventSource) {
                log.info("对话完成");
                countDownLatch.countDown();
            }

            @Override
            public void onFailure(EventSource eventSource, @Nullable Throwable t, @Nullable Response response) {
                log.info("对话异常");
                countDownLatch.countDown();
            }
        });
        // Wait for the conversation to finish
        countDownLatch.await();
    }

}
```
</details>
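The test above runs with `setIncremental(false)`. A minimal sketch of stitching the streamed fragments together yourself, assuming incremental mode (`request.setIncremental(true)`, so each event carries only the newly generated fragment) and assuming the `EventType` enum exposes an `add` constant matching the event-type comment in the test above:

```java
// Sketch: accumulate incremental fragments into the full answer.
// Assumptions: request.setIncremental(true) was set on the request built above,
// and EventType.add exists alongside EventType.finish.
StringBuilder answer = new StringBuilder();

openAiSession.completions(request, new EventSourceListener() {
    @Override
    public void onEvent(EventSource eventSource, @Nullable String id, @Nullable String type, String data) {
        ChatCompletionResponse response = JSON.parseObject(data, ChatCompletionResponse.class);
        if (EventType.add.getCode().equals(type)) {
            answer.append(response.getData());   // append this fragment
        }
        if (EventType.finish.getCode().equals(type)) {
            log.info("完整结果:{}", answer);       // full answer assembled from the fragments
        }
    }
});
```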
#### 2.1.2 Streaming Chat - New-Format Invocation
<details><summary><a>👉 View code</a></summary></br>

```java
/**
 * Streaming chat:
 * 1. Compared with test_completions, the only change is isCompatible = false, which switches to the new data
 *    structure; handling the received data in onEvent differs accordingly.
 * 2. Without legacy compatibility only GLM_3_5_TURBO and GLM_4 are supported; other models will fail to parse.
 */
@Test
public void test_completions_new() throws Exception {
    CountDownLatch countDownLatch = new CountDownLatch(1);
    // Request parameters: model and messages
    ChatCompletionRequest request = new ChatCompletionRequest();
    request.setModel(Model.GLM_3_5_TURBO); // GLM_3_5_TURBO、GLM_4
    request.setIsCompatible(false);
    // glm-3-turbo and glm-4 (released in January 2024) support functions, knowledge bases, and web search
    request.setTools(new ArrayList<ChatCompletionRequest.Tool>() {
        private static final long serialVersionUID = -7988151926241837899L;

        {
            add(ChatCompletionRequest.Tool.builder()
                    .type(ChatCompletionRequest.Tool.Type.web_search)
                    .webSearch(ChatCompletionRequest.Tool.WebSearch.builder().enable(true).searchQuery("小傅哥").build())
                    .build());
        }
    });
    request.setMessages(new ArrayList<ChatCompletionRequest.Prompt>() {
        private static final long serialVersionUID = -7988151926241837899L;

        {
            add(ChatCompletionRequest.Prompt.builder()
                    .role(Role.user.getCode())
                    .content("小傅哥的是谁")
                    .build());
        }
    });
    // Send the request
    openAiSession.completions(request, new EventSourceListener() {
        @Override
        public void onEvent(EventSource eventSource, @Nullable String id, @Nullable String type, String data) {
            if ("[DONE]".equals(data)) {
                log.info("[输出结束] Tokens {}", JSON.toJSONString(data));
                return;
            }
            ChatCompletionResponse response = JSON.parseObject(data, ChatCompletionResponse.class);
            log.info("测试结果:{}", JSON.toJSONString(response));
        }

        @Override
        public void onClosed(EventSource eventSource) {
            log.info("对话完成");
            countDownLatch.countDown();
        }

        @Override
        public void onFailure(EventSource eventSource, @Nullable Throwable t, @Nullable Response response) {
            log.error("对话失败", t);
            countDownLatch.countDown();
        }
    });
    // Wait for the conversation to finish
    countDownLatch.await();
}
```
</details>
#### 2.1.3 Streaming Chat - Multimodal Image Recognition, 4v (vision)
<details><summary><a>👉 View code</a></summary></br>

```java
@Test
public void test_completions_4v() throws Exception {
    CountDownLatch countDownLatch = new CountDownLatch(1);
    // Request parameters: model and messages
    ChatCompletionRequest request = new ChatCompletionRequest();
    request.setModel(Model.GLM_4V); // GLM_3_5_TURBO、GLM_4
    request.setStream(true);
    request.setMessages(new ArrayList<ChatCompletionRequest.Prompt>() {
        private static final long serialVersionUID = -7988151926241837899L;

        {
            // content as a plain string
            add(ChatCompletionRequest.Prompt.builder()
                    .role(Role.user.getCode())
                    .content("这个图片写了什么")
                    .build());

            // content as an object
            add(ChatCompletionRequest.Prompt.builder()
                    .role(Role.user.getCode())
                    .content(ChatCompletionRequest.Prompt.Content.builder()
                            .type(ChatCompletionRequest.Prompt.Content.Type.text.getCode())
                            .text("这是什么图片")
                            .build())
                    .build());

            // content as an object carrying an image; images can be passed as a URL or Base64
            add(ChatCompletionRequest.Prompt.builder()
                    .role(Role.user.getCode())
                    .content(ChatCompletionRequest.Prompt.Content.builder()
                            .type(ChatCompletionRequest.Prompt.Content.Type.image_url.getCode())
                            .imageUrl(ChatCompletionRequest.Prompt.Content.ImageUrl.builder().url("https://bugstack.cn/images/article/project/chatgpt/chatgpt-extra-231011-01.png").build())
                            .build())
                    .build());
        }
    });

    openAiSession.completions(request, new EventSourceListener() {
        @Override
        public void onEvent(EventSource eventSource, @Nullable String id, @Nullable String type, String data) {
            if ("[DONE]".equals(data)) {
                log.info("[输出结束] Tokens {}", JSON.toJSONString(data));
                return;
            }

            ChatCompletionResponse response = JSON.parseObject(data, ChatCompletionResponse.class);
            log.info("测试结果:{}", JSON.toJSONString(response));
        }

        @Override
        public void onClosed(EventSource eventSource) {
            log.info("对话完成");
            countDownLatch.countDown();
        }

        @Override
        public void onFailure(EventSource eventSource, @Nullable Throwable t, @Nullable Response response) {
            log.error("对话失败", t);
            countDownLatch.countDown();
        }
    });
    // Wait for the conversation to finish
    countDownLatch.await();
}
```
</details>
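The example above passes the image as a URL; the comment notes that Base64 is also supported. A minimal sketch for sending a local file, assuming the Base64 string is accepted through the same `ImageUrl.url` field and using a hypothetical local path:

```java
// Sketch: send a local image as Base64.
// Assumptions: the Base64 payload goes through the same url field as the URL variant,
// and "docs/images/demo.png" is a hypothetical path to replace with your own file.
byte[] imageBytes = java.nio.file.Files.readAllBytes(java.nio.file.Paths.get("docs/images/demo.png"));
String base64Image = java.util.Base64.getEncoder().encodeToString(imageBytes);

ChatCompletionRequest.Prompt imagePrompt = ChatCompletionRequest.Prompt.builder()
        .role(Role.user.getCode())
        .content(ChatCompletionRequest.Prompt.Content.builder()
                .type(ChatCompletionRequest.Prompt.Content.Type.image_url.getCode())
                .imageUrl(ChatCompletionRequest.Prompt.Content.ImageUrl.builder().url(base64Image).build())
                .build())
        .build();
// Add imagePrompt to request.setMessages(...) and send the request exactly as in the test above.
```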
#### 2.1.4 Synchronous Request - Future Mode
<details><summary><a>👉 View code</a></summary></br>

```java
@Test
public void test_completions_future() throws Exception {
    // Request parameters: model and prompt
    ChatCompletionRequest request = new ChatCompletionRequest();
    request.setModel(Model.CHATGLM_TURBO); // chatGLM_6b_SSE、chatglm_lite、chatglm_lite_32k、chatglm_std、chatglm_pro
    request.setPrompt(new ArrayList<ChatCompletionRequest.Prompt>() {
        private static final long serialVersionUID = -7988151926241837899L;

        {
            add(ChatCompletionRequest.Prompt.builder()
                    .role(Role.user.getCode())
                    .content("1+1")
                    .build());
        }
    });

    CompletableFuture<String> future = openAiSession.completions(request);
    String response = future.get();

    log.info("测试结果:{}", response);
}
```
</details>
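Because `completions(request)` in this mode returns a `CompletableFuture<String>`, you can also attach callbacks instead of blocking on `get()`. A minimal sketch:

```java
// Non-blocking variant: react to the result when it arrives instead of calling future.get().
CompletableFuture<String> future = openAiSession.completions(request);
future.thenAccept(result -> log.info("测试结果:{}", result))
      .exceptionally(t -> {
          log.error("对话失败", t);
          return null;
      });
```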
#### 2.1.5 Synchronous Request - Standard Mode
<details><summary><a>👉 View code</a></summary></br>

```java
@Test
public void test_completions_sync() throws Exception {
    // Request parameters: model and prompt
    ChatCompletionRequest request = new ChatCompletionRequest();
    request.setModel(Model.GLM_4V); // chatGLM_6b_SSE、chatglm_lite、chatglm_lite_32k、chatglm_std、chatglm_pro
    request.setPrompt(new ArrayList<ChatCompletionRequest.Prompt>() {
        private static final long serialVersionUID = -7988151926241837899L;

        {
            add(ChatCompletionRequest.Prompt.builder()
                    .role(Role.user.getCode())
                    .content("小傅哥是谁")
                    .build());
        }
    });
    // glm-3-turbo and glm-4 (released in January 2024) support functions, knowledge bases, and web search
    request.setTools(new ArrayList<ChatCompletionRequest.Tool>() {
        private static final long serialVersionUID = -7988151926241837899L;

        {
            add(ChatCompletionRequest.Tool.builder()
                    .type(ChatCompletionRequest.Tool.Type.web_search)
                    .webSearch(ChatCompletionRequest.Tool.WebSearch.builder().enable(true).searchQuery("小傅哥").build())
                    .build());
        }
    });

    ChatCompletionSyncResponse response = openAiSession.completionsSync(request);

    log.info("测试结果:{}", JSON.toJSONString(response));
}
```
</details>
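`completionsSync` blocks the calling thread until the full answer is returned. A minimal sketch that bounds the wait with a timeout by running the call on a worker thread (plain `java.util.concurrent`; the 60-second timeout is just an example):

```java
// Sketch: run the blocking sync call on a worker thread and bound the wait with a timeout.
// Assumes java.util.concurrent imports (ExecutorService, Executors, Future, TimeUnit).
ExecutorService executor = Executors.newSingleThreadExecutor();
Future<ChatCompletionSyncResponse> task = executor.submit(() -> openAiSession.completionsSync(request));
ChatCompletionSyncResponse response = task.get(60, TimeUnit.SECONDS); // example timeout
executor.shutdown();
log.info("测试结果:{}", JSON.toJSONString(response));
```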
#### 2.1.6 Text-to-Image
<details><summary><a>👉 View code</a></summary></br>

```java
@Test
public void test_genImages() throws Exception {
    ImageCompletionRequest request = new ImageCompletionRequest();
    request.setModel(Model.COGVIEW_3);
    request.setPrompt("画个小狗");
    ImageCompletionResponse response = openAiSession.genImages(request);
    log.info("测试结果:{}", JSON.toJSONString(response));
}
```
- These examples live in a single unit-test class; the streaming chat shown above is the mode used most often.
</details>
### 2.2 Script Testing
...