RFR[XS] 8249809 avoid calling DirectiveSet::clone(this) in compilecommand_compatibility_init

Liu, Xin xxinliu at amazon.com
Tue Jul 28 03:39:08 UTC 2020


Hi Volker, 

Thank you for reviewing my patch. 

1. Yes. I guess nodes are the major memory consumers, but compiler directives apply to both C1 and C2, 
so it can still reduce the memory footprint a little. 

2. The previous code set the flag here. Both ControlIntrinsic and DisableIntrinsic belong to compilerdirectives_common_flags.

 327 #define init_default_cc(name, type, dvalue, cc_flag) { type v; if (!_modified[name##Index] && CompilerOracle::has_option_value(method, #cc_flag, v) && v != this->name##Option) { set->name##Option = v; changed = true;} }
 328     compilerdirectives_common_flags(init_default_cc)

When method-level directives override the global directives, this code snippet sets 'changed'. 
Even though I removed the flag 'changed', the same logic lives in the smart pointer, as sketched below. 
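
For illustration, the macro body after the patch could look roughly like this (a sketch, not the exact webrev; writing through (*set) is what triggers the lazy clone):

#define init_default_cc(name, type, dvalue, cc_flag)              \
  { type v;                                                       \
    if (!_modified[name##Index] &&                                \
        CompilerOracle::has_option_value(method, #cc_flag, v) &&  \
        v != this->name##Option) {                                \
      /* writing through the smart pointer clones _origin once */ \
      (*set).name##Option = v;                                    \
    } }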

3. The smart pointer DirectiveSetPtr needs to provide two kinds of access to the underlying pointer: 
one read-only, the other mutable. 

Ideally, it would have the following two operator->() overloads:
DirectiveSet* operator->(); 
DirectiveSet const* operator->(); 

AFAIK, C++ doesn't allow overloading on the return type alone, so we need a workaround. 
The reason I provide an overloaded operator*() is that it returns a reference to the object. Users who want to modify the pointee have to dereference the smart pointer explicitly: 
(*set).member = newvalue; 
set->member = newvalue; // compiler error: the pointee is const. 
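
To make this concrete, here is a minimal sketch of the copy-on-write idea (simplified; the constructor and exact signatures are my assumption, the actual webrev may differ):

class DirectiveSetPtr {
 private:
  DirectiveSet* _origin;  // shared, read-only DirectiveSet
  DirectiveSet* _clone;   // private copy, created lazily

 public:
  DirectiveSetPtr(DirectiveSet* origin) : _origin(origin), _clone(NULL) {}

  // Read access never clones. Returning a pointer-to-const is what
  // makes 'set->member = newvalue;' a compiler error.
  const DirectiveSet* operator->() {
    return (_clone == NULL) ? _origin : _clone;
  }

  // Mutable access: clone _origin on the first write, then mutate the clone.
  DirectiveSet& operator*() {
    if (_clone == NULL) {
      _clone = DirectiveSet::clone(_origin);
    }
    return *_clone;
  }
};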

Your approach also works, but you need to invoke const_cast<> everywhere you want to read. 
I think my approach results in shorter code. 


I just came up with a new idea: how about providing a method cloned(), which returns the non-const pointer?

+  DirectiveSet* cloned() {
+    if (!_clone) {
+      _clone = DirectiveSet::clone(_origin);
+    }
+    return _clone;
+  }
+
   DirectiveSet* transfer() {
     assert(_origin != NULL, "_origin is NULL! transfer() can only be invoked once.");
     if (_clone != NULL) {
@@ -340,7 +347,7 @@

     if (CompilerOracle::should_print(method)) {
       if (!_modified[PrintAssemblyIndex]) {
-        (*set).PrintAssemblyOption = true;
+        set.cloned()->PrintAssemblyOption = true;
       }
     }
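
Under this scheme, the call sites would look like this (a hypothetical usage sketch; the name 'ds' for the original DirectiveSet is made up):

DirectiveSetPtr set(ds);                    // wrap the original set
bool on = set->PrintAssemblyOption;         // read: never clones
set.cloned()->PrintAssemblyOption = true;   // write: clones on first call
DirectiveSet* result = set.transfer();      // yield ownership to the caller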

thanks,
--lx

________________________________________
From: Volker Simonis <volker.simonis at gmail.com>
Sent: Monday, July 27, 2020 8:51 AM
To: Liu, Xin
Cc: hotspot-compiler-dev at openjdk.java.net
Subject: RE: [EXTERNAL] RFR[XS] 8249809 avoid calling DirectiveSet::clone(this) in compilecommand_compatibility_init

Hi Xin,

I'm not sure if saving the allocation of a DirectiveSet has any
visible effect compared to the much larger allocations required for
the method compilation itself.

Apart from that, I must confess that I don't totally understand the
original logic. From what I see, it sets "changed" to true in the case
where it changes the cloned DirectiveSet. But it doesn't do that in
the cases where it only changes the clone's control word:

 341         set->_intrinsic_control_words.fill_in(TriBool());
...
 348           set->_intrinsic_control_words[id] = iter.is_enabled();
...
 361         set->_intrinsic_control_words.fill_in(TriBool());
...
 368           set->_intrinsic_control_words[id] = false;

Why don't these mutations count as "changing" the cloned DirectiveSet?

After your patch, you've changed the above lines such that they will
always create a clone, which seems different from the initial
behaviour.

Which of the two behaviours is correct here: the original one, the
new one after your change, or doesn't it matter for reasons I don't
understand?


I also wonder why you need to overload both "operator*()" and
"operator->()". It seems a little bit arbitrary (and hard to
understand for people reading the code) that "operator*()" clones the
underlying DirectiveSet while "operator->()" uses the original one.
Why not just define two versions of "operator->()" and let the
compiler choose the right one, like so:

DirectiveSet const* operator->() const {
  return !_clone ? _origin : _clone;
}

DirectiveSet* operator->() {
  if (!_clone) {
    _clone = DirectiveSet::clone(_origin);
  }
  return _clone;
}
...
if (!_modified[LogIndex]) {
  bool log = CompilerOracle::should_log(method);
  if (log != const_cast<const DirectiveSetPtr&>(set)->LogOption) {
    set->LogOption = log;
  }
}

Thank you and best regards,
Volker

On Mon, Jul 27, 2020 at 1:47 AM Liu, Xin <xxinliu at amazon.com> wrote:
>
> hi, Reviewers,
>
> Could you review this simple patch?
> bug: https://bugs.openjdk.java.net/browse/JDK-8249809
> webrev: https://cr.openjdk.java.net/~xliu/8249809/00/webrev/
>
> When the users specify a method-level compiler directive, the DirectiveSet is cloned for every single method being compiled. That's expensive but rarely hit; actually, only the user-specified methods need to clone the DirectiveSet. I introduce a smart pointer, DirectiveSetPtr. Its operator->() returns a pointer to a constant DirectiveSet, which is read-only. It doesn't clone _origin until C2 needs to update its members. transfer() yields ownership of the pointer.
>
> Test:
> Manual tests with different CompileCommand options.
> hotspot:tier1 and gtest:all.
>
> thanks,
> --lx
>

