[code-reflection] RFR: Integrate Java Triton example with Intel Triton Backend

hanklo6 duke at openjdk.org
Wed Oct 9 21:44:26 UTC 2024


The Babylon Java Triton example uses code reflection to translate Java source code written against the Java Triton API into a code model.

In this PR, we traverse the given code model and emit the Triton MLIR dialect in its generic form, then inject the generated MLIR into the Intel Triton backend, which compiles it into a SPIR-V module. We use `jextract` to create Java bindings for the Intel Level Zero runtime and use them to launch the given kernel function on Intel GPUs.
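For illustration only, MLIR in generic form spells out every op, region, and attribute explicitly; a Triton function emitted this way looks roughly like the sketch below (the kernel name and op details are made up for this example, not output of this PR):

```mlir
"builtin.module"() ({
  "tt.func"() ({
    %0 = "tt.get_program_id"() {axis = 0 : i32} : () -> i32
    "tt.return"() : () -> ()
  }) {function_type = () -> (), sym_name = "add_kernel"} : () -> ()
}) : () -> ()
```

Emitting the generic form avoids depending on each dialect's custom pretty-printed syntax, so the text can be parsed by the backend's stock MLIR parser.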

## Usage
Navigate to the `cr-example/triton` directory and run `mvn clean test`. This generates multiple MLIR files in the `result` directory, ready to be processed by the Triton backend.

Next, patch the `compiler.py` file within the `intel-xpu-backend-for-triton` project by applying `git apply add-mlir-insertion.patch`. Then run the Triton backend with `python3 translate.py`.

The Triton backend will generate SPIR-V files, which will be located under `~/.triton/cache/{hash_value}/{kernel_name}/{kernel_name}.spv`.
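As a small self-contained sketch, the generated modules can be located from Java with a directory walk over the cache layout. The class and helper names here are hypothetical, and the demo runs against a temporary directory that mimics the `{hash_value}/{kernel_name}/{kernel_name}.spv` structure rather than the real `~/.triton/cache`:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.stream.Stream;

// Hypothetical helper: find compiled SPIR-V modules under a Triton-style cache.
public class SpvLocator {
    static List<Path> findSpvFiles(Path cacheDir) throws IOException {
        try (Stream<Path> walk = Files.walk(cacheDir)) {
            return walk.filter(p -> p.toString().endsWith(".spv")).toList();
        }
    }

    public static void main(String[] args) throws IOException {
        // Demo against a temporary directory mimicking {hash}/{kernel}/{kernel}.spv.
        Path cache = Files.createTempDirectory("triton-cache");
        Path kernelDir = Files.createDirectories(cache.resolve("0a1b2c").resolve("add_kernel"));
        Files.createFile(kernelDir.resolve("add_kernel.spv"));
        for (Path p : findSpvFiles(cache)) {
            System.out.println(cache.relativize(p)); // prints 0a1b2c/add_kernel/add_kernel.spv
        }
    }
}
```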

To create bindings for Level Zero, run the following commands:

```shell
$JEXTRACT_DIR/bin/jextract --output src/gen/java -I /usr/include -t oneapi.levelzero level-zero/include/ze_api.h
$JAVA_HOME/bin/javac -cp target/classes -d target/classes src/gen/java/oneapi/levelzero/*.java
$JAVA_HOME/bin/jar cf levelzero.jar -C target/classes/ .
```

This will generate `levelzero.jar` in the current directory.

After obtaining the JAR files for Level Zero and `JSON-java`, compile and run the launcher `LevelZero.java` with the following commands:

```shell
babylon/build/linux-x86_64-server-release/jdk/bin/javac -cp .:levelzero.jar:json-java.jar LevelZero.java
babylon/build/linux-x86_64-server-release/jdk/bin/java -ea -cp .:levelzero.jar:json-java.jar LevelZero
```
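Before a kernel can be launched, the launcher has to read the compiled SPIR-V binary into memory so it can be handed to the Level Zero bindings (module creation). A minimal, self-contained sketch of that step, using a stand-in temporary file in place of a real cache entry (the class name is hypothetical):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class SpvReader {
    // Read a .spv module into a byte array, as a launcher would before
    // passing it to the Level Zero bindings.
    static byte[] readSpv(Path spvPath) throws IOException {
        return Files.readAllBytes(spvPath);
    }

    public static void main(String[] args) throws IOException {
        // Demo with a stand-in file; a real launcher would read
        // ~/.triton/cache/{hash}/{kernel}/{kernel}.spv instead.
        Path tmp = Files.createTempFile("add_kernel", ".spv");
        // SPIR-V modules begin with the magic number 0x07230203,
        // stored little-endian on disk.
        Files.write(tmp, new byte[] {0x03, 0x02, 0x23, 0x07});
        byte[] module = readSpv(tmp);
        System.out.println(module.length);        // 4
        System.out.printf("0x%02x%n", module[3]); // 0x07
    }
}
```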


Ensure the hash values in `~/.triton/cache` match those used in `LevelZero.java`.

## Dependencies
- [intel-xpu-backend-for-triton](https://github.com/intel/intel-xpu-backend-for-triton)
- [intel-extension-for-pytorch](https://github.com/intel/intel-extension-for-pytorch)
- [Intel oneAPI base Toolkit](https://www.intel.com/content/www/us/en/developer/tools/oneapi/base-toolkit-download.html)
- [Jextract](https://github.com/openjdk/jextract)
- [Level Zero loader](https://github.com/oneapi-src/level-zero)
- [compute-runtime](https://github.com/intel/compute-runtime/releases)
- [JSON-java](https://github.com/stleary/JSON-java)

-------------

Commit messages:
 - Fix trailing and leading whitespaces
 - Use system home in the path
 - Add launcher code and translate code
 - Update translate.py
 - Add compile patch
 - Update the generated file names
 - Add triton translator
 - Update test cases
 - Add MLIR generator
 - Add attributes name support and enhance dot and load op

Changes: https://git.openjdk.org/babylon/pull/241/files
  Webrev: https://webrevs.openjdk.org/?repo=babylon&pr=241&range=00
  Stats: 1926 lines in 12 files changed: 1775 ins; 4 del; 147 mod
  Patch: https://git.openjdk.org/babylon/pull/241.diff
  Fetch: git fetch https://git.openjdk.org/babylon.git pull/241/head:pull/241

PR: https://git.openjdk.org/babylon/pull/241

