[RFC][Icedtea-web]: Changed how cache works.

Denis Lila dlila at redhat.com
Fri Apr 15 10:45:46 PDT 2011


> I think it might be best to hold this off for now, and hack at it later.
> This was an attempt to make it so that we don't have a bunch of folders
> with just another folder in it.

I agree.

Well, assuming the patch is now the same as the version from before the
URL hashCode file names, plus my suggestions, it's OK with me.
I would be more comfortable if Omair took a look at it first, though.
It's a big patch and it changes important functionality.

Regards,
Denis.

----- Original Message -----
> ----- Original Message -----
> > From: "Denis Lila" <dlila at redhat.com>
> > To: "Andrew Su" <asu at redhat.com>
> > Cc: "IcedTea" <distro-pkg-dev at openjdk.java.net>, "Omair Majid"
> > <omajid at redhat.com>
> > Sent: Thursday, April 14, 2011 1:54:01 PM
> > Subject: Re: [RFC][Icedtea-web]: Changed how cache works.
> > Hi.
> >
> > You forgot to hg add CacheLRUWrapper. Assuming it was just a
> > rename of the corresponding file in the previous version of
> > your patch, I added it to your patch, and attached it. The
> > attached patch should compile.
> 
> Oops, thanks for doing that!
> 
> >
> > Here are some things that jump out at me (more comments may come
> > later on):
> >
> > In getCacheFileIfExists, can you put the lastPath computation
> > in its own function (similar to getIdForCacheFolder)? That way
> > it'll be a bit easier to figure out what's going on.
> 
> Sure.
> 
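Just so we're picturing the same thing, here is roughly the shape I had in
mind; the class name, method name and body below are only placeholders to
show the refactor, not code from your patch:

    import java.io.File;
    import java.net.URL;

    class CachePathSketch {
        // Pull the lastPath computation out of getCacheFileIfExists into its
        // own helper, the same way getIdForCacheFolder is factored out.
        static String computeLastPath(File cacheDir, URL source) {
            // e.g. <cacheDir>/<host>/<path of the URL>; the real layout in
            // the patch will differ.
            return new File(new File(cacheDir, source.getHost()),
                            source.getPath()).getPath();
        }
    }
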
> >
> > In removeUntrackedDirectories you do
> >
> > + for (File f : temp.listFiles()) {
> > +     String path = f.getPath();
> > +     temp = new File(path);
> > +     if (temp.isDirectory() && !keep.contains(temp.getPath())) {
> > +         try {
> > +             FileUtils.recursiveDelete(temp, temp);
> >
> > Wouldn't it be the same (and simpler) to do:
> >
> > + for (File f : temp.listFiles()) {
> > +     if (f.isDirectory() && !keep.contains(f.getPath())) {
> > +         try {
> > +             FileUtils.recursiveDelete(f, f);
> >
> 
> Yes it would; residue from moving things around.
> 
> > Also, the HashSet<String> argument of that function is called
> > "keep", which is a bit peculiar, since you're deleting everything
> > in it. Can you also make that a Set<String> instead of a HashSet?
> 
> Changed.
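To be concrete, by Set<String> I just meant declaring the parameter as the
interface type, roughly like this (class name and body are placeholders,
not patch code):

    import java.util.Set;

    class SignatureSketch {
        void removeUntrackedDirectories(Set<String> keep) {
            // body omitted -- same deletion logic as in the patch
        }
    }
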
> 
> >
> >
> > Most importantly:
> > > I have applied your patch and made some adjustments, such as using
> > > the hash of the URL as the file name instead of having a really
> > > long directory name.
> >
> > Isn't this broken? I mean, what happens when two different URLs
> > hash to the same code?
> 
> >
> > To me, it seems like you're just going to create a file when a URL
> > is first hashed, and when a different URL has the same hash,
> > getCacheFileIfExists will return the first cached file instead of
> > creating a new one.
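To make that concern concrete, here is a small standalone example. The URLs
are invented, and I'm assuming the file name comes from the URL string's
hashCode(), but the hashCode equality itself is real: "Aa" and "BB" hash to
the same value, so the two full URL strings collide as well.

    public class HashCollision {
        public static void main(String[] args) {
            String first  = "http://example.com/Aa.jar";
            String second = "http://example.com/BB.jar";
            // Both print the same value; the comparison prints "true".
            System.out.println(first.hashCode());
            System.out.println(second.hashCode());
            System.out.println(first.hashCode() == second.hashCode());
            // A cache that names files String.valueOf(url.hashCode()) would
            // hand back whichever of the two was downloaded first, for both.
        }
    }
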
> 
> -- snip --
> Cheers,
> Andrew


