dlsym(RTLD_DEFAULT, "getentropy") returns non-NULL on Mac

Wang Weijun weijun.wang at oracle.com
Sat Nov 7 02:51:34 UTC 2015


I've run into something strange.

Background: a new function, getentropy(), is available on OpenBSD [1] and Solaris, and it has also been proposed for other OSes.
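
For reference, the shape of the API as documented in the OpenBSD man page is roughly the following; this standalone snippet is just my reading of that page, not JDK code:

    #include <unistd.h>   /* getentropy() is declared here on OpenBSD; other OSes may differ */
    #include <stdio.h>

    int main(void) {
        unsigned char buf[16];
        /* Fills buf with high-quality random bytes; buflen must be <= 256.
           Returns 0 on success, -1 (with errno set) on failure. */
        if (getentropy(buf, sizeof(buf)) != 0) {
            perror("getentropy");
            return 1;
        }
        for (int i = 0; i < (int)sizeof(buf); i++)
            printf("%02x", buf[i]);
        printf("\n");
        return 0;
    }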

Therefore, inside the JDK I wrote a piece of native code to detect it at runtime, something like:

    typedef int (*GETENTROPY_FN)(char* buffer, int len);

    getentropy = (GETENTROPY_FN)dlsym(RTLD_DEFAULT, "getentropy");
    if (getentropy) {
        return 1;
    } 
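
The idea is that when the lookup succeeds, the function pointer is called later with a caller-supplied buffer. A self-contained sketch of the whole probe-and-call sequence looks roughly like this (I've used the void*/size_t signature from the man page here; buffer size and error handling are illustrative, not the exact JDK code):

    #include <dlfcn.h>
    #include <stdio.h>
    #include <stddef.h>

    typedef int (*GETENTROPY_FN)(void* buffer, size_t len);

    int main(void) {
        /* Clear any stale error state; dlerror() after dlsym() is the
           reliable success indicator, since NULL could in theory be a
           valid symbol address. */
        dlerror();
        GETENTROPY_FN fn = (GETENTROPY_FN)dlsym(RTLD_DEFAULT, "getentropy");
        const char* err = dlerror();
        if (fn == NULL || err != NULL) {
            printf("getentropy not available: %s\n", err ? err : "(NULL symbol)");
            return 1;
        }
        unsigned char buf[16];
        if ((*fn)(buf, sizeof(buf)) != 0) {
            perror("getentropy");
            return 1;
        }
        printf("got %d random bytes, first byte 0x%02x\n", (int)sizeof(buf), buf[0]);
        return 0;
    }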

On Mac, dlsym() returns non-NULL, but a later call to (*getentropy)(cbuffer, length) crashes with

  #  SIGBUS (0xa) at pc=0x0000000103bfa030, pid=22434, tid=5891
  ...
  # Problematic frame:
  # C  [libj2rand.dylib+0x1030]  getentropy+0x0

However, "man getentropy" does not show anything, and the following simple program also prints out 0x0

#include <dlfcn.h>
#include <stdio.h>

int main(void) {
   void* g = dlsym(RTLD_DEFAULT, "getentropy");
   printf("%p\n", g);
   return 0;
}

What does this mean? Is the JDK code picking up another getentropy() from somewhere else? How do I detect that, and what should I do to avoid it?
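
One idea I am considering is asking dladdr() which loaded image the address returned by dlsym() actually belongs to, along these lines (a quick diagnostic sketch, not JDK code):

    #include <dlfcn.h>
    #include <stdio.h>

    int main(void) {
        void* sym = dlsym(RTLD_DEFAULT, "getentropy");
        if (sym != NULL) {
            Dl_info info;
            /* dladdr() reports the pathname of the image containing the
               address and the nearest symbol name, if any. */
            if (dladdr(sym, &info) != 0) {
                printf("getentropy -> %p in %s (symbol %s)\n",
                       sym, info.dli_fname,
                       info.dli_sname ? info.dli_sname : "?");
            }
        } else {
            printf("getentropy not found\n");
        }
        return 0;
    }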

Thanks
Max

[1] http://www.openbsd.org/cgi-bin/man.cgi/OpenBSD-current/man2/getentropy.2

