[code-reflection] RFR: [hat] Proposal for bfloat16

Juan Fumero jfumero at openjdk.org
Mon Dec 1 15:59:19 UTC 2025


This PR introduces the type [`bfloat16`](https://cloud.google.com/blog/products/ai-machine-learning/bfloat16-the-secret-to-high-performance-on-cloud-tpus) for HAT.
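For context, bfloat16 keeps the same 8-bit exponent as IEEE float32 but truncates the mantissa to 7 bits, so a bfloat16 value is effectively the upper 16 bits of a float32. A minimal Java sketch of the bit-level relationship (helper names are hypothetical, not HAT's actual API):

    // Hypothetical helpers for illustration only.
    static short floatToBFloat16Bits(float f) {
        // Truncate: keep the upper 16 bits (sign, 8-bit exponent, top 7 mantissa bits).
        return (short) (Float.floatToRawIntBits(f) >>> 16);
    }

    static float bfloat16BitsToFloat(short bits) {
        // Widen: shift back into the upper half of a float32; dropped mantissa bits become zero.
        return Float.intBitsToFloat((bits & 0xFFFF) << 16);
    }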

Testing for OpenCL:


HAT=SHOW_CODE java -cp hat/job.jar hat.java test ffi-opencl hat.test.TestBFloat16Type


Testing for CUDA:


HAT=SHOW_CODE java -cp hat/job.jar hat.java test ffi-cuda hat.test.TestBFloat16Type


Some tests are expected to fail due to precision errors. We will need to improve the type conversion, for example with round-to-nearest-even.
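A common round-to-nearest-even scheme for float32-to-bfloat16 conversion adds a rounding bias before truncating, breaking ties toward the even mantissa. A minimal Java sketch under that assumption (method name is hypothetical, not part of this PR):

    // Hypothetical round-to-nearest-even conversion, for illustration only.
    static short floatToBFloat16RNE(float f) {
        int bits = Float.floatToRawIntBits(f);
        if (Float.isNaN(f)) {
            // Fall back to truncation so NaNs stay NaNs.
            return (short) (bits >>> 16);
        }
        // Add half of the dropped mantissa range, plus the LSB of the
        // kept part so that exact ties round to an even result.
        int rounded = bits + 0x7FFF + ((bits >>> 16) & 1);
        return (short) (rounded >>> 16);
    }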

-------------

Commit messages:
 - Merge branch 'code-reflection' into hat/type/bfloat16
 - [hat] remove custom Op for bfloat
 - single line imports in hat code builders
 - single line imports in hat code builders
 - [hat] test matmul with bfloat16
 - [hat] dialect for bfloat16 removed
 - [hat] new test file included in the hat test list
 - [hat] OpenCL handler for bfloat16 via float convs
 - Merge branch 'code-reflection' into hat/type/bfloat16
 - [hat][wip] bfloat16 conversions for OpenCL
 - ... and 10 more: https://git.openjdk.org/babylon/compare/76e4d3b7...60f7140c

Changes: https://git.openjdk.org/babylon/pull/716/files
  Webrev: https://webrevs.openjdk.org/?repo=babylon&pr=716&range=00
  Stats: 1517 lines in 29 files changed: 1399 ins; 44 del; 74 mod
  Patch: https://git.openjdk.org/babylon/pull/716.diff
  Fetch: git fetch https://git.openjdk.org/babylon.git pull/716/head:pull/716

PR: https://git.openjdk.org/babylon/pull/716
