[code-reflection] RFR: 1D Matrix Multiplication example for HAT [v2]

Juan Fumero duke at openjdk.org
Mon Apr 7 16:23:04 UTC 2025


On Mon, 7 Apr 2025 15:17:37 GMT, f <duke at openjdk.org> wrote:

>> Pending for review
>
> @jjfumero Hi, can we use Babylon to implement the operations needed for llama inference now? Does Babylon have all the basic ops for at least one platform, i.e. CUDA, yet?

Hi @SidneyLann, I am not the core maintainer of Babylon; Gary Frost can probably help with your questions. From my view, you would need access to shared memory and some synchronisation primitives to be able to perform reductions. I am not sure whether this is implemented in HAT yet.
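For context, the reduction pattern I mean has each work-item combining partial results in a scratch area visible to its group, with a barrier between steps. A minimal CPU-side sketch of that shape in plain Java (using a shared array and a `CyclicBarrier` in place of GPU shared memory and a work-group barrier; this is illustrative only, not HAT or Babylon API):

```java
import java.util.concurrent.BrokenBarrierException;
import java.util.concurrent.CyclicBarrier;

public class TreeReduction {
    // Sum `data` in place with a tree reduction: at each step, thread i
    // adds the element `stride` away, then all threads synchronise at a
    // barrier before the stride is halved -- the same structure a GPU
    // kernel would use with shared memory and a work-group barrier.
    static long reduce(long[] data) throws InterruptedException {
        int n = data.length;                  // assumed a power of two for brevity
        CyclicBarrier barrier = new CyclicBarrier(n / 2);
        Thread[] threads = new Thread[n / 2];
        for (int t = 0; t < n / 2; t++) {
            final int tid = t;
            threads[t] = new Thread(() -> {
                try {
                    for (int stride = n / 2; stride >= 1; stride /= 2) {
                        if (tid < stride) data[tid] += data[tid + stride];
                        barrier.await(); // without this barrier the steps race
                    }
                } catch (InterruptedException | BrokenBarrierException e) {
                    throw new RuntimeException(e);
                }
            });
            threads[t].start();
        }
        for (Thread th : threads) th.join();
        return data[0];
    }

    public static void main(String[] args) throws InterruptedException {
        long[] data = {1, 2, 3, 4, 5, 6, 7, 8};
        System.out.println(reduce(data)); // prints 36
    }
}
```

The key point is the barrier between strides: drop it and threads read partial sums that other threads are still writing, which is exactly the kind of synchronisation primitive a GPU backend would need to expose.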

-------------

PR Comment: https://git.openjdk.org/babylon/pull/276#issuecomment-2783916596
