Unverified commit ee813e34 authored by J jakpiase, committed by GitHub

Reupload: Added numpy bf16 datatype support via custom pip package (#38703)

* reuploaded files

* Changed year from 2021 to 2022

* minor change

* fixed requirements.txt file
Parent 747000dd
# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import unittest

import numpy as np
from paddle_bfloat import bfloat16


class TestBF16DataType(unittest.TestCase):
    def test_matmul(self):
        # Run the same matmul in bf16 and fp32 and compare the results.
        a_bf16 = np.random.random((6, 7)).astype(bfloat16)
        b_bf16 = np.random.random((7, 8)).astype(bfloat16)
        c_bf16 = np.matmul(a_bf16, b_bf16)

        a_fp32 = a_bf16.astype(np.float32)
        b_fp32 = b_bf16.astype(np.float32)
        c_fp32 = np.matmul(a_fp32, b_fp32)

        self.assertTrue(np.allclose(c_bf16, c_fp32))


if __name__ == "__main__":
    unittest.main()
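For background on what the `bfloat16` dtype from `paddle_bfloat` represents: bf16 keeps float32's sign and 8 exponent bits but only the top 7 mantissa bits, so a float32 value can be reduced to bf16 precision by zeroing its low 16 bits. A minimal numpy-only sketch of that truncation (`to_bf16_bits` is a hypothetical helper, and it truncates toward zero rather than rounding to nearest, which a real bf16 conversion would typically do):

```python
import numpy as np

def to_bf16_bits(x):
    """Reduce float32 values to bfloat16 precision by zeroing the low 16 bits.

    Note: this is plain truncation, not round-to-nearest-even.
    """
    bits = np.asarray(x, dtype=np.float32).view(np.uint32)
    return (bits & np.uint32(0xFFFF0000)).view(np.float32)

# 1 + 2**-8 needs an 8th fractional mantissa bit, which bf16 lacks:
# truncation collapses it back to 1.0. 1.5 (binary 1.1) survives exactly.
print(to_bf16_bits(np.float32(1.0 + 2**-8)))  # 1.0
print(to_bf16_bits(np.float32(1.5)))          # 1.5
```

This precision loss is why the test above compares the bf16 and fp32 matmul results with `np.allclose` rather than exact equality.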
@@ -5,3 +5,4 @@ Pillow
six
decorator
astor
paddle_bfloat==0.1.2