<!DOCTYPE html>
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
</head>
<body>
<p>Hi Paul, <br>
I think this is a great initiative, and one that is much needed in
the Java world. I have a few questions. <br>
<br>
1) <br>
<i>> Babylon will ensure that code reflection is fit for
purpose by creating a GPU programming model for Java that
leverages code reflection and is implemented as a Java library.</i><br>
<br>
Does this mean that one of the goals of the project is to define
how GPUs should be programmed using the Code Reflection API, or
is the aim to define this for Java in general? Is Babylon limited
to GPUs? Are you also considering other types of accelerators
(e.g., AI accelerators, RISC-V accelerators, etc.)? <br>
<br>
We have other programming models, such as TornadoVM [1], which can
be programmed using different styles (e.g., loop-parallel programs
and kernel APIs; a brief sketch of the loop-parallel style follows
below). How will the new model(s) accommodate existing solutions?
Is this still to be defined?<br>
<br>
2) <br>
<i>> We do not currently plan to deliver the GPU programming
model into the JDK. However, work on that model could identify
JDK features and enhancements of general utility which could be
addressed in future work.</i><br>
<br>
Does this mean that the GPU programming model will be used only as
a motivation to develop the Code Reflection APIs for different use
cases?<br>
</p>
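<p>For reference, here is a minimal sketch of the loop-parallel style
I mentioned above, based on my reading of the TornadoVM documentation
[1] (the class and method names are mine, and I have left out the
task-graph setup needed to actually run the kernel on a device):</p>
<pre>
import uk.ac.manchester.tornado.api.annotations.Parallel;

public class VectorAdd {

    // Loop-parallel style: @Parallel marks the loop as a candidate
    // for parallelization when the method is compiled for the device.
    public static void add(float[] a, float[] b, float[] c) {
        for (@Parallel int i = 0; i &lt; c.length; i++) {
            c[i] = a[i] + b[i];
        }
    }
}
</pre>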
<p>3) Is there any intent to support other JVM languages with these
models (e.g., R, Scala, etc.), or will they be specific to the Java
language? <br>
<br>
4) I believe we also need new types. As we discussed at JVMLS this
year, we will need NDArray and Tensor types, Vector types, and
Panama-based types for AI and heterogeneous computing. This is
aligned with Gary's talk at JVMLS [2], in which he proposed the
HAT initiative (Heterogeneous Accelerator Toolkit) and
Panama-based types. Will this also be part of the Babylon
project? </p>
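<p>To make the kind of type I have in mind more concrete, here is a
purely hypothetical sketch of a Panama-based NDArray built on the
Foreign Function and Memory API (the class name FloatNDArray and its
methods are my own illustration, not a proposal for a concrete API):</p>
<pre>
import java.lang.foreign.Arena;
import java.lang.foreign.MemorySegment;
import java.lang.foreign.ValueLayout;

// Hypothetical 2D float array backed by off-heap memory via the FFM API.
public final class FloatNDArray {

    private final MemorySegment segment; // flat, row-major storage
    private final int rows;
    private final int cols;

    public FloatNDArray(Arena arena, int rows, int cols) {
        this.rows = rows;
        this.cols = cols;
        this.segment = arena.allocate(ValueLayout.JAVA_FLOAT, (long) rows * cols);
    }

    public float get(int row, int col) {
        return segment.getAtIndex(ValueLayout.JAVA_FLOAT, (long) row * cols + col);
    }

    public void set(int row, int col, float value) {
        segment.setAtIndex(ValueLayout.JAVA_FLOAT, (long) row * cols + col, value);
    }

    // The backing segment could be handed to an accelerator runtime
    // and mapped to device buffers without copying through the heap.
    public MemorySegment segment() {
        return segment;
    }
}
</pre>
<p>The point is that such a type keeps its data in native memory with a
known layout, which is exactly what a GPU or AI accelerator runtime
needs in order to avoid marshalling on every kernel launch.</p>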
<p>[1]
<a class="moz-txt-link-freetext" href="https://tornadovm.readthedocs.io/en/latest/programming.html#core-programming">https://tornadovm.readthedocs.io/en/latest/programming.html#core-programming</a></p>
<p>[2] <a class="moz-txt-link-freetext" href="https://www.youtube.com/watch?v=lbKBu3lTftc">https://www.youtube.com/watch?v=lbKBu3lTftc</a><br>
</p>
<p><br>
Thanks<br>
Juan<br>
</p>
<p><br>
</p>
<div class="moz-cite-prefix">On 13/09/2023 01:37, Paul Sandoz wrote:<br>
</div>
<blockquote type="cite"
cite="mid:EFFC47D1-408D-4BD6-9316-294F8A9BCCAB@oracle.com">
Hi Ethan,
<div><br>
</div>
<div>Current/prior work includes Mojo, MLIR, C# LINQ, Julia [1],
Swift for TensorFlow [2], Haskell [3].</div>
<div><br>
</div>
<div>In the context of lunch and Python what I had in mind is
machine learning and all those frameworks, and I was also
thinking about introspection of Python code which IIUC is what
TorchDynamo [4] does. </div>
<div><br>
</div>
<div>Paul. </div>
<div>
<div><br>
</div>
<div>[1] <a href="https://arxiv.org/abs/1712.03112"
moz-do-not-send="true" class="moz-txt-link-freetext">https://arxiv.org/abs/1712.03112</a></div>
<div><br>
</div>
<div>[2] <a
href="https://llvm.org/devmtg/2018-10/slides/Hong-Lattner-SwiftForTensorFlowGraphProgramExtraction.pdf"
moz-do-not-send="true" class="moz-txt-link-freetext">https://llvm.org/devmtg/2018-10/slides/Hong-Lattner-SwiftForTensorFlowGraphProgramExtraction.pdf</a></div>
<div><br>
</div>
<div>[3] <a
href="http://conal.net/papers/essence-of-ad/essence-of-ad-icfp.pdf"
moz-do-not-send="true" class="moz-txt-link-freetext">http://conal.net/papers/essence-of-ad/essence-of-ad-icfp.pdf</a></div>
<div><br>
</div>
<div>[4] <a
href="https://pytorch.org/docs/stable/dynamo/index.html"
moz-do-not-send="true" class="moz-txt-link-freetext">https://pytorch.org/docs/stable/dynamo/index.html</a></div>
<div><br>
<blockquote type="cite">
<div>On Sep 12, 2023, at 12:31 PM, Ethan McCue
<a class="moz-txt-link-rfc2396E" href="mailto:ethan@mccue.dev"><ethan@mccue.dev></a> wrote:</div>
<br class="Apple-interchange-newline">
<div>
<div dir="ltr">Can you elaborate more on prior work / the
state of affairs in other language ecosystems? In the
talk you reference Python "eating Java's lunch" - do
they have a comparable set of features or some mechanism
that serves the same goal (write code in Python, derive
GPU kernel/autodiffed/etc. code)?</div>
<br>
<div class="gmail_quote">
<div dir="ltr" class="gmail_attr">On Wed, Sep 6, 2023 at
12:44 PM Paul Sandoz <<a
href="mailto:paul.sandoz@oracle.com"
moz-do-not-send="true" class="moz-txt-link-freetext">paul.sandoz@oracle.com</a>>
wrote:<br>
</div>
<blockquote class="gmail_quote"
style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">
I hereby invite discussion of a new Project, Babylon,
whose primary goal<br>
will be to extend the reach of Java to foreign
programming models such as <br>
SQL, differentiable programming, machine learning
models, and GPUs.<br>
<br>
Focusing on the last example, suppose a Java developer
wants to write a GPU <br>
kernel in Java and execute it on a GPU. The
developer’s Java code must, <br>
somehow, be analyzed and transformed into an
executable GPU kernel. A Java <br>
library could do that, but it requires access to the
Java code in symbolic <br>
form. Such access is, however, currently limited to
the use of non-standard <br>
APIs or to conventions at different points in the
program’s life cycle <br>
(compile time or run time), and the symbolic forms
available (abstract <br>
syntax trees or bytecodes) are often ill-suited to
analysis and transformation.<br>
<br>
Babylon will extend Java's reach to foreign
programming models with an <br>
enhancement to reflective programming in Java, called
code reflection. This <br>
will enable standard access, analysis, and
transformation of Java code in a <br>
suitable form. Support for a foreign programming model
can then be more <br>
easily implemented as a Java library.<br>
<br>
Babylon will ensure that code reflection is fit for
purpose by creating a <br>
GPU programming model for Java that leverages code
reflection and is <br>
implemented as a Java library. To reduce the risk of
bias we will also <br>
explore, or encourage the exploration of, other
programming models such as <br>
SQL and differentiable programming, though we may do
so less thoroughly.<br>
<br>
Code reflection consists of three parts:<br>
<br>
1) The modeling of Java programs as code models,
suitable for access,<br>
analysis, and transformation.<br>
2) Enhancements to Java reflection, enabling access to
code models at compile<br>
time and run time.<br>
3) APIs to build, analyze, and transform code models.<br>
<br>
For further details please see the JVM Language Summit
2023 presentations <br>
entitled "Code Reflection" [1] and "Java and GPU … are
we nearly there yet?" <br>
[2].<br>
<br>
I propose to lead this Project with an initial set of
Reviewers that<br>
includes, but is not limited to, Maurizio Cimadamore,
Gary Frost, and<br>
Sandhya Viswanathan.<br>
<br>
For code reflection this Project will start with a
clone of the current JDK <br>
main-line release, JDK 22, and track main-line
releases going forward.<br>
For the GPU programming model this Project will create
a separate repository,<br>
that is dependent on code reflection features as they
are developed.<br>
<br>
We expect to deliver Babylon over time, in a series of
JEPs that will likely<br>
span multiple feature releases.<br>
We do not currently plan to deliver the GPU
programming model into the JDK.<br>
However, work on that model could identify JDK
features and enhancements of <br>
general utility which could be addressed in future
work.<br>
<br>
Comments?<br>
<br>
Paul.<br>
<br>
[1] <a
href="https://cr.openjdk.org/~psandoz/conferences/2023-JVMLS/Code-Reflection-JVMLS-23-08-07.pdf"
rel="noreferrer" target="_blank"
moz-do-not-send="true" class="moz-txt-link-freetext">
https://cr.openjdk.org/~psandoz/conferences/2023-JVMLS/Code-Reflection-JVMLS-23-08-07.pdf</a><br>
<a href="https://youtu.be/xbk9_6XA_IY"
rel="noreferrer" target="_blank"
moz-do-not-send="true">
https://youtu.be/xbk9_6XA_IY</a><br>
<br>
[2] <a href="https://youtu.be/lbKBu3lTftc"
rel="noreferrer" target="_blank"
moz-do-not-send="true">
https://youtu.be/lbKBu3lTftc</a><br>
<br>
</blockquote>
</div>
</div>
</blockquote>
</div>
<br>
</div>
</blockquote>
</body>
</html>