[security-dev 00226]: X509KeyManager alias choice based on temporary socket
Bruno Harbulot
Bruno.Harbulot at manchester.ac.uk
Wed Jul 9 13:02:00 UTC 2008
Hello,
I'm trying to use an X509KeyManager to choose which certificate a server
presents depending on which IP address the socket is listening on.
Let's suppose I have two certificates (+private keys) for
host1.example.org (10.0.0.1) and host2.example.com (10.0.0.2), and that
the clients are going to check the hostname against the name
in the certificate. This should work fine since there are two distinct
IP addresses, one for each certificate.
The Java server I'm running is configured with a single SSLContext,
which is set up with a KeyStore that contains two pairs of private keys
and certificates. This SSLContext is also set up to use a custom
X509KeyManager, which I was planning to use to choose which of the two
aliases (and therefore certificates) should be used depending on the socket.
This server starts two SSLServerSockets, one bound to local address
10.0.0.1 and the other to 10.0.0.2 (same port, but that shouldn't
really matter).
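For reference, this is roughly what the setup looks like (a simplified
sketch: the keystore path and passwords are placeholders, and
SelectingKeyManager is the custom key manager sketched further down):

import java.io.FileInputStream;
import java.net.InetAddress;
import java.security.KeyStore;
import javax.net.ssl.*;

public class TwoAddressServer {
    public static void main(String[] args) throws Exception {
        // Keystore containing the two private keys + certificates.
        KeyStore ks = KeyStore.getInstance("JKS");
        ks.load(new FileInputStream("server.jks"), "password".toCharArray());

        KeyManagerFactory kmf = KeyManagerFactory.getInstance(
                KeyManagerFactory.getDefaultAlgorithm());
        kmf.init(ks, "password".toCharArray());

        // Wrap the default X509KeyManager in the custom one.
        X509KeyManager defaultKm = (X509KeyManager) kmf.getKeyManagers()[0];
        SSLContext sslContext = SSLContext.getInstance("TLS");
        sslContext.init(new KeyManager[] { new SelectingKeyManager(defaultKm) },
                null, null);

        // Two server sockets sharing the same SSLContext, bound to
        // different local addresses (same port).
        SSLServerSocketFactory ssf = sslContext.getServerSocketFactory();
        SSLServerSocket ss1 = (SSLServerSocket) ssf.createServerSocket(
                443, 50, InetAddress.getByName("10.0.0.1"));
        SSLServerSocket ss2 = (SSLServerSocket) ssf.createServerSocket(
                443, 50, InetAddress.getByName("10.0.0.2"));
        // ... accept() loops omitted ...
    }
}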
I initially thought that "chooseServerAlias(String keyType, Principal[]
issuers, Socket socket)" would give me the server socket used and thus I
would be able to pick the alias based on "socket.getLocalAddress()".
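Something along these lines (a sketch; the alias names "host1" and
"host2" are made up):

import java.net.Socket;
import java.security.Principal;
import java.security.PrivateKey;
import java.security.cert.X509Certificate;
import javax.net.ssl.X509KeyManager;

public class SelectingKeyManager implements X509KeyManager {
    private final X509KeyManager delegate;

    public SelectingKeyManager(X509KeyManager delegate) {
        this.delegate = delegate;
    }

    // Pick the alias from the local address of the socket -- this is
    // where I expected to see the address the server socket is bound to.
    public String chooseServerAlias(String keyType, Principal[] issuers,
            Socket socket) {
        if ("10.0.0.1".equals(socket.getLocalAddress().getHostAddress())) {
            return "host1";
        } else {
            return "host2";
        }
    }

    // Everything else is delegated unchanged.
    public String chooseClientAlias(String[] keyType, Principal[] issuers,
            Socket socket) {
        return delegate.chooseClientAlias(keyType, issuers, socket);
    }
    public String[] getServerAliases(String keyType, Principal[] issuers) {
        return delegate.getServerAliases(keyType, issuers);
    }
    public String[] getClientAliases(String keyType, Principal[] issuers) {
        return delegate.getClientAliases(keyType, issuers);
    }
    public X509Certificate[] getCertificateChain(String alias) {
        return delegate.getCertificateChain(alias);
    }
    public PrivateKey getPrivateKey(String alias) {
        return delegate.getPrivateKey(alias);
    }
}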
Unfortunately, it turns out that the socket passed to chooseServerAlias
is not the socket that is actually used: its local address is always
0.0.0.0, regardless of the IP address the actual listening socket has
been bound to.
I've traced the call to chooseServerAlias to:
com.sun.net.ssl.internal.ssl.ServerHandshaker.setupPrivateKeyAndChain(ServerHandshaker.java:843)
    at com.sun.net.ssl.internal.ssl.ServerHandshaker.trySetCipherSuite(ServerHandshaker.java:686)
    at com.sun.net.ssl.internal.ssl.SSLServerSocketImpl.checkEnabledSuites(SSLServerSocketImpl.java:292)
    at com.sun.net.ssl.internal.ssl.SSLServerSocketImpl.accept(SSLServerSocketImpl.java:253)
Looking at com.sun.net.ssl.internal.ssl.SSLServerSocketImpl (in
openjdk-6-src-b10_30_may_2008.tar.gz and
openjdk-7-ea-src-b30-03_jul_2008.zip), it turns out that the choice of
certificate is based on a temporary socket, which is initialised with
settings copied from the actual socket (cipher suites, etc.), but not
its local address or port. (Code fragment at the end of this e-mail.)
Is this done by design? If so, is there another way to choose a
certificate based on which local IP address is in use?
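(The only workaround I can think of so far is to drop the single
SSLContext and create one context per address, each with a key manager
pinned to a single alias, along these lines -- makeContextForAlias
being a hypothetical helper that builds such a context:

SSLContext ctx1 = makeContextForAlias("host1");
SSLContext ctx2 = makeContextForAlias("host2");
ServerSocket ss1 = ctx1.getServerSocketFactory()
        .createServerSocket(443, 50, InetAddress.getByName("10.0.0.1"));
ServerSocket ss2 = ctx2.getServerSocketFactory()
        .createServerSocket(443, 50, InetAddress.getByName("10.0.0.2"));

but that rather defeats the point of having a single configurable
X509KeyManager.)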
Best wishes,
Bruno.
====== This is a short extract of
com.sun.net.ssl.internal.ssl.SSLServerSocketImpl (line numbers will
differ depending on the version, but it's towards the end of the file) ======
    /*
     * This is a sometimes helpful diagnostic check that is performed
     * once for each ServerSocket to verify that the initial set of
     * enabled suites are capable of supporting a successful handshake.
     */
    private void checkEnabledSuites() throws IOException {
        //
        // We want to report an error if no cipher suites were actually
        // enabled, since this is an error users are known to make.  Then
        // they get vastly confused by having clients report an error!
        //
        synchronized (this) {
            if (checkedEnabled) {
                return;
            }
            if (useServerMode == false) {
                return;
            }

            SSLSocketImpl tmp = new SSLSocketImpl(sslContext,
                useServerMode,
                enabledCipherSuites, doClientAuth,
                enableSessionCreation, enabledProtocols);
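            // [my note, not in the original source: the local address
            // and port of the actual server socket are never passed to
            // this temporary socket, so its local address stays 0.0.0.0]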
            ServerHandshaker handshaker = tmp.getServerHandshaker();

            for (Iterator t = enabledCipherSuites.iterator();
                    t.hasNext(); ) {
                CipherSuite suite = (CipherSuite) t.next();
                if (handshaker.trySetCipherSuite(suite)) {
                    checkedEnabled = true;
                    return;
                }
            }
            // [extract truncated]