Using Ant’s FTP Task

I’m going to be doing AudioMan’s builds on one machine and uploading them to a web server with FTP so people can download them. The easiest way to do this was with Ant, the tool I use to build AudioMan. After building the project, Ant already knows all of the details of the build it just did: where it is, what all of the files are, and so on.

One small problem though: <ftp> is an optional Ant task and requires an external library. What the documentation doesn’t tell you is that Ant 1.6 requires a different external library file than Ant 1.5 does. That’s what happens when you read the docs for 1.6 while using 1.5, right? 🙂 It would still be nice to know it changed. A quick Google search turned up the details.

So that’s handy, I was using the wrong JAR file. Just for the record:

Ant 1.5 requires NetComponents.jar which you can get here.
Ant 1.6 requires commons-net.jar which you can get here. It also requires the Jakarta ORO JAR file, which you can get here.

Here’s a tip for the ftp task: If your Ant file is in a publicly accessible place like an open source CVS repository, you probably shouldn’t put your password right in the Ant task like:

<ftp server="" userid="user" password="god">
    <fileset file="" />
</ftp>

because people will know your FTP password! You’re better off using a property, which you can leave blank in the Ant build.xml file and specify at the command line instead. Then the checked-in file won’t have a password in it and only the people who know the password can use this task. Here’s the new Ant file:

<property name="ftp.password" value="" />

<ftp server="" userid="user" password="${ftp.password}">
    <fileset file="" />
</ftp>

and here’s how to use the command line to specify the password and run the ftp task:

ant -Dftp.password=god ftp

Java Microbenchmarks are Evil

I tried to make a benchmark to compare returning objects vs. throwing exceptions but the Java virtual machine is a very hard thing to benchmark because of the optimizations it does. See this old Q+A for more information, and optimizations have probably improved since then.

I wanted to compare the numbers to Andrew’s numbers from .NET that he wrote in my comments but they are probably skewed/optimized too.

For example, I wrote two functions:

private static Exception returnException() {
    return new Exception();
}

private static void throwsException() throws Exception {
    throw new Exception();
}

and called them a million times in a loop. With the first one I assigned the result to an Exception variable inside the loop. The second I put in a try/catch block inside the loop and caught the exception. When I timed both I got around the same time (I did about 10 runs of each and recorded the high and low):

returnException: 3755-3766 ms
throwsException: 4005-4046 ms
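A harness along these lines reproduces the setup. This is a reconstruction rather than my exact code, so the timing helpers and loop structure are my own framing:

```java
// Reconstructed benchmark sketch: compare returning an Exception object
// with throwing and catching one, a million times each.
class ExceptionBench {

    private static Exception returnException() {
        return new Exception();
    }

    private static void throwsException() throws Exception {
        throw new Exception();
    }

    // Time n calls that return an Exception object.
    static long timeReturns(int n) {
        long start = System.currentTimeMillis();
        Exception e = null;
        for (int i = 0; i < n; i++) {
            e = returnException();
        }
        return System.currentTimeMillis() - start;
    }

    // Time n calls that throw and catch an Exception.
    static long timeThrows(int n) {
        long start = System.currentTimeMillis();
        for (int i = 0; i < n; i++) {
            try {
                throwsException();
            } catch (Exception expected) {
                // expected: we only care about the cost of the throw/catch
            }
        }
        return System.currentTimeMillis() - start;
    }

    public static void main(String[] args) {
        int n = 1000000;
        System.out.println("returnException: " + timeReturns(n) + " ms");
        System.out.println("throwsException: " + timeThrows(n) + " ms");
    }
}
```

Keep in mind that numbers from a loop like this mostly measure what the optimizer chose to do, not the raw cost of the two mechanisms.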

So this can make it look like there is no performance difference between throwing an exception and returning a value. Nope, not so fast. The interpreter/compiler is optimizing the returnException() call. Because it’s such a small function, it’s just inlining it into the loop itself and removing the overhead of having a function call (using the call stack). The second function that throws an exception is likely inlined as well.

The compiler is apparently also smart enough not to generate code for variables that aren’t used, like my Exception variable that holds the result of the method call, which gets optimized out. But that made me wonder: what’s taking so long then? All you would have is an empty loop. I compared returning new Exception() to returning a plain boolean true, and the Exception version was about 1000 times slower. Apparently the allocation done by new can’t be optimized away, even though the variable is never used.

The whole point of testing this was to show that when an exception is thrown it has to navigate back up the call stack (cleaning up the stack as it goes) to find the correct catch block. This is what is expensive about throwing exceptions compared to calling functions, which only have to push and pop a few values on and off the stack to return and don’t have to manage the try/catch logic. If the compiler optimizes the code you can’t compare them fairly.

So beware of microbenchmarks, which is exactly what the five-year-old Q+A linked above said. The only way to fairly test the speed of code generated by an optimizing compiler is to test it in a large product. I wonder if anyone has done a return Object versus throw Exception comparison on a larger scale.


I’m writing this in my blog so it will get Googled. I tried searching for a solution to this problem and came up empty.

If you are using the Ant Eclipse plug-in and get the following error message

[javac] BUILD FAILED: file:C:/[ECLIPSE_DIR]/workspace/[PROJECT_DIR]/build.xml:32: Unable to find a javac compiler; is not on the classpath.
Perhaps JAVA_HOME does not point to the JDK

you are using the wrong Java Virtual Machine (JVM) with Eclipse. Ant goes ahead and uses the javac from the JVM Eclipse is using no matter what you put in the compiler attribute. This is bad because Eclipse uses the first Java VM it finds on your computer’s PATH variable.

In my case I installed the Java Runtime Environment (JRE) before the Java Software Development Kit (SDK) and Eclipse was using the JRE by default. Ant can’t compile under the JRE because it doesn’t include the javac compiler.
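A quick sanity check (my own suggestion, not something from the error message) is to print which JVM installation your code is actually running under. If java.home points into a JRE directory rather than the SDK, Ant won’t find javac:

```java
// Prints the installation directory and version of the JVM that is
// currently running. Under a bare JRE there is no javac alongside it,
// which is exactly what trips up Ant's <javac> task.
class WhichJvm {
    public static void main(String[] args) {
        System.out.println(System.getProperty("java.home"));
        System.out.println(System.getProperty("java.version"));
    }
}
```

Run it from the same JVM Eclipse launches to see which installation is really in play.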

So you have to tell Eclipse to run using the SDK so that Ant can use the SDK. To do this make a shortcut to eclipse.exe and change the target to:

[ECLIPSE_DIR]\eclipse.exe -vm [SDK_DIR]\bin\javaw.exe

Where [ECLIPSE_DIR] and [SDK_DIR] are the full paths to the Eclipse and Java SDK directories respectively.

He’s a Microsoft Human Aggregator … Deal With It

Robert Scoble’s been catching a lot of flak lately. OK, he’s an opinionated guy so obviously he’ll rile some people but that’s why we read his blog. His opinions and links are worth reading … sometimes. I don’t think there’s any blogger out there who I’d say is interesting all of the time, even the often hilarious Rory Blyth.

It’s getting pretty hard to pick and choose what to read these days. I had a list of about 100 blogs I was reading and found after about a month it was just too much to digest. Now I just wait until someone else links to interesting posts. I might be missing around half of the good stuff, but I’ll trade that for my sanity thank you very much. So that’s the function that Robert Scoble’s blog is serving for me: he’s a Microsoft human aggregator.

I have half a brain — sometimes two halves. I know he’s going to be biased and I want that spin. I get enough anti-spin on Slashdot to counter it 10 times over. When I’m curious about what’s going on at Microsoft I want to read it from a human being, not a press release.

But lately Scoble’s been doing a lot of rear-view mirror watching. He thinks that linking to people that insult/disagree with him will make things better. I don’t know about that — it doesn’t take much to write a disagreeing blog post these days. Maybe if they had a good point and it got Scoble reflecting about something and he blogged about it, it would be worth the link back. Otherwise he’s just linking to trolls.

He also thinks that linking to competing products and companies will make things better. I disagree there too. Unless he has something better to say than “see? I’m linking to them too” it’s not really worth the time to write it and it’s definitely not worth the time to read it.

I pick on Scoble because he’s one of the few bloggers who’s getting a lot of attention AND he works for a software company with a bulls-eye on it the size of Redmond, Washington (I hear it’s the only thing you can see from space other than the Great Wall). It’s an interesting push and pull of opinions and politics that make for interesting blog reading. It’s like watching a guy try to cross the Niagara Falls on a high wire … on his hands. So it’s kind of a shame to see him giving into his readers so much. Feedback is good but ultimately Robert has to stick to his guns and say “I like Microsoft people and products and damn the man if people complain about it.”

Overall I think that Scoble’s readers have to grow up. They have to realise what they are reading and why they are reading it. Scoble is not a local newspaper and he doesn’t owe you anything. You’re perfectly free to have an opinion on it, of course, but if you don’t like his blog why waste your time complaining about it? Go read something else. You’re screwing up a perfectly good thing for the rest of us.

Adventures in Disabling VgaSave

This story is long and probably amusing to people who aren’t me. I’m posting it for your enjoyment. Chalk this one up in the “worst personal IT experience so far” column…

I was at a friend’s place and her roommate’s DVD player on her notebook computer wasn’t working properly. It said something like “requires DDraw, cannot use the current display mode”.

So I thought maybe she didn’t have the latest version of DirectX. I went to the Windows Update web site, and before I could check to see if she had installed DirectX I had to download a bunch of security updates. Install updates and reboot three times only to discover that she doesn’t need a DirectX update. D’oh.

So I check the video driver. Blank. Blank? Yep. I click on Adapter –> Properties and it says “VgaSave”. I was like WTF? So I go to the laptop’s manufacturer (Acer) to get the real video drivers, try to install them and it says that the current drivers are in use, so I have to disable them to install the new ones. OK, so I disable VgaSave.

I still can’t install the new drivers, and I install a notebook status tool from Acer thinking that maybe it can fix the driver problem. After installation it wants a restart, so I restart.

On reboot the machine shows the product screen (Acer), then the Windows loading bar animating and then nothing. Blackness. OK, I had no idea what was wrong at this point so I try to boot into safe mode. Nope, doesn’t work. Command prompt? Nope. I can’t do anything. Everything else still works though. I can hear the sound, and the computer is thinking like it’s booting up.

So I look up VgaSave online and find this page on Wikipedia. Crap! I just screwed the pooch completely.

Luckily, the computer’s owner has a good sense of humour (and some blind faith)! I look up how to fix this problem. Apparently VgaSave is a fallback video service for NT operating systems. When your video card doesn’t work, or you use Safe Mode, it uses VgaSave. If you disable this service you have no fallback! That’s where I was.

So I needed to re-enable the service and I read that I can do this through the NT Recovery Console (RC), which requires the administrator password. I boot up the RC and try a blank Administrator password. No go. I ask the notebook’s owner for some possible passwords. No dice. I try stupid common ones like password, god, etc. Nope.

The last Windows install on this machine was done by CompuSmart, a computer repair shop in Ottawa. They would have set the Administrator password on install, probably to blank. But blank didn’t work! So I searched the net for more info…

Apparently because of security, XP changed the way it stores passwords internally. This changed sometime between the original release of XP and now, though I couldn’t pinpoint exactly when. The RC I was using was from the XP CD (SP 1a) and I thought maybe it read the passwords the old way. So I downloaded the bootdisk maker from Microsoft’s web site to get the new RC. It’s SIX DISKS and it takes about 5 minutes to make ’em all.

I make the six disks on another machine and take them to the notebook. Using boot disks takes a long time, probably ten minutes. I get to disk 5 and it won’t read. I redo the disks with the same floppies and disk 5 is still toast. I redo them AGAIN with a new floppy for disk 5 and I get an error that a file on the disk was corrupted, which means the boot disk maker itself was bad!

At this point I was calm on the outside, snapping on the inside, and thanking God I’m not an IT administrator who does this for a living. By then it was late, so I gave up and took the machine with me.

So today I’m still fooling around with it some more. The blank Administrator password doesn’t work with any of the boot disks: XP Home, XP Home SP1, XP Pro SP1 or XP Home SP2. I actually made the boot disks for them all. No dice.

I call up CompuSmart and ask them about the Administrator passwords for new systems. The dude said “we leave them blank”. Ok. The blank passwords didn’t work on any version of RC! I was stuck.

At this point, I was at my last resort: use a 3rd party utility to change the Administrator password. Microsoft obviously does not recommend doing this; the only ways Microsoft recommends changing the Administrator password are through the GUI in XP or with a password reset disk. Obviously I couldn’t get into the GUI, and who actually makes password reset disks? Not students. I was out of official options.

So I downloaded a neat little tool called the Offline NT Password and Registry Editor (ONTPRE). It actually boots into Linux(!), and edits the Windows registry, including the SAM file that contains all of the users and passwords.

So I get into ONTPRE and check out the users. Administrator is disabled and locked. Good lord, CompuSmart lied to me. My urge to tear my hair out after this ordeal is wrestled to the ground by a feeling of relief. I didn’t have to reformat. That is good news. Thank you, ONTPRE!

So I boot up RC (with the six boot disks, natch), use the blank Administrator password and voila, I’m in the RC. To enable VgaSave again, I ran the Recovery Console’s enable command on the vgasave service at the RC command line.

Rebooted the machine and everything is back to normal. Will I try to screw with this machine some more to actually get the drivers working properly? Hell yes, because now I know how to fix it if I screw up again. Ha. 🙂

Update: after yet ANOTHER hiccup, this time installing XP Service Pack 2, I’m all done. I got an error midway through the installer and had an error dialog that wouldn’t go away, so I had to force quit the installer. Then Windows Update thought I had the full SP2 installed, so I had to uninstall the partial install and redo the installation.

You should have seen the spyware that Ad Aware found on this machine. I’ve never seen that much on one computer before. Now IE is hiding and Firefox is on the desktop. If you don’t want spyware or random popup windows, you should use Firefox too.

Ad Hoc Communities for Specific Problems

I honestly cannot believe how many people have come to my blog for a solution to the Ant JAVA_HOME problem with Eclipse 2.1 that I wrote up six months ago. I probably get 50-75 hits a day from that issue alone.

A strange combination of circumstances gives me all of those hits. My increased Google juice has to do with Google’s inability to properly score blogs, meaning I have a relatively inflated score. So when you search for the error text, my blog comes up near the top. It probably doesn’t help that I got linked a few times by Robert Scoble, who also has an inflated score, which probably adds to mine (Scoble calls this ‘giving Google juice’). The increased linkage of blogs is part of what gives Google problems, since linking frequency is a basis for its scoring system.

The great thing is that I have a solution to a problem and I can share it with others, creating a sort of ad hoc community around a specific problem. It’s really great when you can just Google an error message and get a solution right away instead of wading through manuals.


Browsing Zenith Upon Us?

Microsoft presumably can’t justify further development or expense on IE6. They are a money-making business and they’ve moved on. The idiosyncrasies of the browser are well known to the development community, or at least should be given how long it’s been out. Changing IE6 now would break the code workarounds developers rely on, making people angry at MSFT.

There are a few UI enhancements that wouldn’t break the DHTML code rendering and would only improve the “browsing experience”. Tabbed browsing and pop-up blocking are probably the two that are most requested.

I don’t think Microsoft will add tabbed browsing for two reasons: 1) it’s not an intuitive GUI, as I said in my comments last week, and 2) it would require a major rewrite of IE. You can’t mash in something major like tabs. It’s like adding browser support for frames: instead of dealing with one thing you are dealing with n things.

Pop-up blocking is a code interpretation hack because pop-ups are perfectly valid DHTML. It works in Mozilla because Mozilla users know what a pop-up is on a technical level. Most users of IE6 don’t, and as a result they could miss important web site functionality, especially on (corporate intranet) web sites that use pop-ups to simulate modal dialogs in web applications. Yes, gross.

I think Paul missed part of what the Microsoft employee meant by “Further improvements to IE will require enhancements to the underlying OS.” They aren’t just talking about eye-candy GUI improvements. They are talking about having access to Longhorn’s (still rumoured? I can’t keep up) SQL Server-based file system for easier file system searching and other low-level technical enhancements. These additions to the underlying OS will improve the next IE’s UI experience by adding the possibility for unique features not possible on WinXP. The flying and fluttering windows are just gravy …. I guess.

Forget about the browser GUI. If you want to talk about zeniths, then DHTML has reached its practical zenith. People are doing UI things with browsers today that just seem unnatural, and I’m not talking about web pages here (though there are many weird ones out there), I’m talking about corporate web applications. Sure, it’s easier to maintain a so-called “thin” client application (or easier to break one, depending on who you ask), but if the UI tools (DHTML) available to you don’t measure up, use something else instead of kludging it.

Update 12:12 Tim Bray adds to his comments that CSS ain’t Rocket Science. I agree with a lot of what he says, but I’m going to explain my earlier statement about using DHTML improperly to add to his arguments.

You can’t control what people will do with your UI toolkit. VB programmers suck at making GUIs because they generally have no formal training — you can’t help this. Web pages solve the problem in one dimension by limiting what you can do on one page … and as a result web applications are cleaner but sometimes require more clicks. Usability improves on look and feel, but suffers when speed is important.

Some WinForm widgets are more dense and complex. The slider, for example, does something you’d never be able to do well on a web site. Same with tabs. You can kludge tabs in DHTML, but in WinForms they are just there. Managing that density is a job that requires skill and is generally not something a programmer can do well without help from someone with usability training.

The other side of the coin is when you want your application to be GUI dense and client-server. You can’t have it both ways — in that case you must use WinForms or you will suffer in browser development hell. Believe me, I’ve been there.

Overlapping Backgrounds

Wow, the default Moveable Type 2.63 CSS stylesheet is horrible. I discovered why Internet Explorer 6 is coughing up a lung and other major organs: because every <div> on this page had a background. Some places on the page had three white backgrounds layered on top of each other. It’s definitely another Internet Explorer bug, but it’s generally something a good CSS stylesheet doesn’t do anyway. I took out every background but the one on <body> and that seems to fix the problem.

So why did the Moveable Type development team do it? My guess is so that CSS newbies would know where to change the background colour of the individual elements. Sure that’s all well and good and helpful but it really screws the default rendering. The default template should be rock solid and something that people can depend on. There are people out there that won’t even touch their templates — and the administrator can configure Moveable Type to have users/bloggers that can’t change their templates at all. D’oh!

If there are any other weird browser problems with this site please let me know.

Update on April 26th, 2003 12:15pm: The overlapping backgrounds were causing the page to render with large white blocks in it, covering text. If you went to another area of the page and came back, the text might be fixed. Or you could highlight the area where the text should be and it would appear.

I’ve made a mirror of this index page with the default “clean” stylesheet that is provided with Moveable Type 2.63 so people can try it out for themselves. If you’re wondering why that page doesn’t fit on your screen laterally, it’s because of this problem.

Update on April 26th, 2003 12:45pm: I can’t get that IE background problem to reproduce on the mirror page with the original templates. I’m stumped. I had a half dozen different people with this problem on my site. I’ll try to track it down … sorry about the inconvenience.

Nevertheless, the rather serious absolute div problem with the original style sheets I mentioned earlier is still present. This is a combination of bad CSS and IE 6 not being too forgiving (or maybe too literal). That’s very unlike IE, considering some of the malformed HTML it renders.

Java Delete to Recycle Bin

It’s interesting that even a small shift in the desktop paradigm hasn’t been picked up by Java: the Recycle Bin. All of the popular operating systems have a variation of it but the only thing you can do in Java is permanently delete a file.

This is pretty inconvenient for AudioMan, which is trying to mimic Windows Explorer in some ways. When I delete a track I want it to show up in the Recycle Bin. It’s not as simple as moving the file to the Recycle Bin folder, which is hidden; other things have to be done too. On Windows the proper way is a Win32 API call: SHFileOperation().

So far the only solution I’ve found is to use JNI but that’s not appealing to me as a Java developer. Wouldn’t it be nice if SWT did this for me on Windows, Mac OS and Linux like it does for opening files? Eventually the Java library should do it instead, but the SWT guys might be a bit faster implementing it.

Maybe I should just write a patch for SWT myself … I’ll have to dust off my copy of Visual C++ 6.0. Nevermind that, how do I implement it on Mac OS and Linux? Oh boy …

Update Monday 17:36: I submitted a bug report for “Move to Recycle Bin” to SWT and they replied that it is out of their scope, forwarding it to the Eclipse platform resources component. That’s pretty reasonable, and wasn’t completely unexpected.

They also suggested I ask Sun about including it in the File class (it’s not in the JavaDocs for Java 5). Yeah, wouldn’t that be nice? But who do I talk to about that? Is there a bug database? Yep, but I need to register. Are there Java bloggers I can talk to directly to find out why this hasn’t been done yet? Maybe the answer is really simple. Does a JSR need to be written for something this small? I’m in pretty foreign territory here. 🙂 Maybe Tim Bray knows someone I should talk to ….

… more legwork necessary. I’ll keep y’all updated.

Sun has a bug for this (login required) but hasn’t moved on it. The person reporting the bug wanted to replace delete() and deleteOnExit() with methods that send to the recycle bin. I agree with the rest of the people in the discussion that this is a bad idea. The top of the bug description says “Will break crossplatform compatibility. Wont fix.”

I agree, don’t change delete(). The File class just needs one method, moveToRecycleBin() or moveToTrash(), that can be used instead of delete(). For deletes in the GUI where the user expects to see the files moved to the recycle bin, it works. For other deletes (i.e. configuration files that the user isn’t interested in) you need the permanent delete. They both have their place.

Another interesting idea is to write a new class that manages a recycle bin: moving files there, listing them, and recovering them. Now that all of the major platforms have the recycle bin paradigm, Java should get on the bandwagon and give us cross-platform access to it, don’t you think?
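To sketch what I mean, here’s a purely hypothetical class along those lines. It only approximates the idea with a plain trash directory; a real implementation would hook into the native facility (SHFileOperation() on Windows) via JNI:

```java
import java.io.File;
import java.io.IOException;

// Hypothetical sketch of a cross-platform recycle bin helper. A real
// implementation would call the native shell facility via JNI; this
// version just approximates the idea with an ordinary directory.
class RecycleBin {
    private final File trashDir;

    RecycleBin(File trashDir) throws IOException {
        this.trashDir = trashDir;
        if (!trashDir.exists() && !trashDir.mkdirs()) {
            throw new IOException("Could not create trash directory: " + trashDir);
        }
    }

    // The method I'd like to see: move instead of delete.
    File moveToTrash(File file) throws IOException {
        File target = new File(trashDir, file.getName());
        int n = 1;
        // Don't clobber an earlier trashed file with the same name.
        while (target.exists()) {
            target = new File(trashDir, file.getName() + "." + n++);
        }
        if (!file.renameTo(target)) {
            throw new IOException("Could not move " + file + " to " + target);
        }
        return target;
    }

    // List what's in the trash, for a recovery UI.
    File[] listTrashed() {
        File[] files = trashDir.listFiles();
        return files == null ? new File[0] : files;
    }
}
```

The class name and methods are my invention, not anything in the Java library or the Sun bug report.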

Update Thursday July 29 10:04 PM: I have submitted an RFE bug to Sun for this. I’d link to it, but it won’t be visible on the bug site until it is reviewed. For Sun people who might be reading this, the RFE has the internal review ID 290373.


Ryan Lowe’s Blog

Free Software Realities
James Robertson has linked to a few capitalist rants by Clemens Vasters: one and two. Ponder this quote from Clemens: “selfish is not the one who wants to get a tangible reward for his work. Selfish is the one who denies that reward.”

I’m glad I’m not alone in disagreeing with him and I didn’t think I would be. I was going to write a comment on James’ blog about his latest post but it got long so I’ll post it here instead:

Isn’t it kind of silly to be having a conversation about this? I mean, do people expect open source developers to say “hey, ya … you know what? I’m wasting my time. You’re right. OK, let’s go back to the old way so everyone can get a paycheck. We all deserve it.” It ain’t gonna happen. There will always be enough control freaks and freedom idealists to commoditize the next software market with free software.

There’s clearly a difference of opinion here. On one hand we have people who are looking for something in return for their investment of time. Open source has some qualities that allow those people to do that (free marketing, public speaking fees, increased feedback, bugfixes, whatever). That’s fine, that’s great, I can appreciate that. You need to support a family and I need to support my Mac hardware fetish.

But there is a segment of the free software world that just doesn’t see it that way. They pump out code for the greater good, or to boost their egos and not their wallets. These people will not be convinced by capitalists ranting about losing earning potential … in fact they will be driven the other way and be motivated to stick it to you. As software developers looking to get paid I think we owe it to ourselves to understand the rationale behind this “competition”. I think it’s great that I can get paid for my hobby and call it a career … but I’m also realistic. Enough hobbyists working for free will marginalize me, so I’ll just have to watch out and stay ahead of them. Free software has too much momentum now.

I know I’m naive — and I can conveniently use my youth as an excuse. But I’m also realistic about free software. It’s serious competition … and you aren’t going to convince too many free software developers to start coding only for dollars just because you think it’s “selfish” not to charge for their work. Oh no, they’ll just see it the other way … it’s selfish for you to expect to get paid for something someone else will do for FREE in their spare time. That’s life in the software industry of the next 30 years … that’s what I’m expecting.

Naive would be thinking that free software won’t impact your market niche or that you can convince people to change their minds. It will have an impact and you can’t change enough minds to make a difference at this point. People are getting a taste of software freedom (that’s the free that should be emphasized, not the code) and they like it. So you might as well start figuring out how to make money despite free software’s existence. I’m naive about business but I know that much … and I can see it coming from a mile away. Oh, and so can IBM.

Software developers seem to have it in their heads that they will always have jobs and it’s just not the case. Remember, this “market” didn’t exist 50 years ago. If you lose your job because of free software don’t blame free software, blame yourself for not having the foresight to move on to greener pastures and better opportunities. If you want a secure job, the software “industry” is not a good place to look for one. Things change too quickly.