Oracle Blog

kto

Thursday May 07, 2009

When ant runs, it sometimes creates temporary files, using
the Java property java.io.tmpdir as the root of the temporary file area
for the system.
Unfortunately, the temporary file names ant uses aren't unique enough to
prevent two processes running ant from bumping into
each other.
We had problems using the same machine to build both Solaris 32-bit
and Solaris 64-bit at the same time.
OUCH!

Jonathan came up with a
good solution specific to the JDK in the
langtools Makefile that runs ant.
He defines the java.io.tmpdir property on the ant command line to be
unique to this build area and platform:
ant -Djava.io.tmpdir='$(ABS_OUTPUTDIR)/build/ant-tmp' ...

The basic idea is to redefine the java.io.tmpdir (the root of the temporary
file area) to something more unique to the circumstances, and ideally
in a location that will get cleaned up at the right time.

Had a lovely time (<- sarcasm) tracking down a problem on Windows with JDK ant builds.
Apparently on Windows, if you manage to get the ant.bat startup instead
of the shell script version of the startup, commas are not allowed on the
command line.
So you cannot do this:

ant -Djavac.debuglevel=source,lines,vars

OUCH!

This problem was reported
here
but nothing was done.
A simple note in the User Manual seems like the minimum that was called for here.
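One way to sidestep the problem is to keep the comma-separated value off the command line entirely and define it in the build file (or a properties file it loads); a sketch, with illustrative <javac> attributes:

```xml
<!-- build.xml fragment: avoid passing commas on the Windows command line.
     A -Djavac.debuglevel=... given on a command line that can handle it
     still wins, since the first setting of an ant property sticks. -->
<property name="javac.debuglevel" value="source,lines,vars"/>
<javac srcdir="src" destdir="classes" debug="true"
       debuglevel="${javac.debuglevel}"/>
```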

So I downloaded the cpptasks for ant from
the ant-contrib site.
And it builds from the command line just fine with the latest ant, but
it won't build using the ant that comes with NetBeans, and it won't
build when loaded into NetBeans.
So why is that? The build requires xercesImpl.jar, but the cpptasks build.xml
file doesn't explicitly say that, so how did it find it in one case and not
in another?

Turns out that the default behavior for the ant <javac> task is to
include the entire ant runtime classpath in your java compilation. Yipes!
That seems like a horrible default if you ask me; depending on what ant decides
to use in its runtime classpath, you get it all? :^(
OUCH!

Seems like every ant installation could potentially behave differently, depending
on how ant is configured.

So I'm thinking I want to change all <javac> uses to
<javac includeAntRuntime="false">
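For the cpptasks case above, a hedged sketch of what that looks like with the dependency stated explicitly (the jar location is an assumption for illustration):

```xml
<!-- Exclude the ant runtime classpath and state the dependency explicitly. -->
<javac srcdir="src" destdir="build/classes" includeAntRuntime="false">
  <classpath>
    <pathelement location="lib/xercesImpl.jar"/>
  </classpath>
</javac>
```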

Saturday Feb 28, 2009

Just how many copies of junit.jar have been added to source repositories on the planet?
Quite a few I imagine, seems like a waste of repository data space and well, just wrong.
Not junit, which is a fantastic product, just the fact that we have so many copies.
Granted, you have gained a pretty stable tool by freezing the version you have, and you
have guaranteed having a copy at all times, but is it a good idea to add all these binary
files to your repository data?
As the list of tools like this grows and grows, does the "just add it to the repository"
solution continue to scale?
And each time you need a new version, you end up adding even more binary data to
your repository.

Some projects have taken to doing a kind of "tools bootstrap" by downloading all the
open source tools the first time you setup a repository, making the files immune from
normal 'ant clean' actions.
Ant
has a task called <get> which can allow you to download tool
bundles, and it works quite well, but there are some catches to doing
it this way.
Expecting all the download sites to be up and available 24/7 is not realistic.
And predictability is really important, so you want to make sure you always download
the same version of the tools and keep a record of what versions you use.

So what we did in the
openjfx-compiler setup repository
was to create an import/ directory to hold the downloaded tools,
automate the population of that area with the <get> task,
and also allow for quick population of import/ from a large
import zip bundle.
The initial version of the repository had a very
similar mechanism, so this idea should be credited to the original authors
on the
OpenJFX Compiler Project.

This logic is contained in the file build-importer.xml of the
setup repository, and
for each tool NAME
downloaded, a set of properties is defined (import.NAME.*),
along with two ant targets, import-get-NAME and import-NAME.
Probably best to look at the bottom of this file first.
As before, quite a few macrodefs were used to make this all work.
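A much-simplified sketch of the pattern for one tool (the URL, version, and property names here are placeholders, not the real build-importer.xml contents):

```xml
<property name="import.junit.version" value="4.5"/>
<property name="import.dir" location="${basedir}/import"/>
<property name="import.junit.jar"
          location="${import.dir}/junit-${import.junit.version}.jar"/>

<!-- Download only when needed, so a pre-populated import/ area
     (e.g. from an import zip bundle) short-circuits the download. -->
<target name="import-get-junit">
  <mkdir dir="${import.dir}"/>
  <get src="http://downloads.example.com/junit-${import.junit.version}.jar"
       dest="${import.junit.jar}" usetimestamp="true" verbose="true"/>
</target>

<target name="import-junit" depends="import-get-junit"/>
```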


The ant build script then just uses ${import.junit.jar} to get a junit.jar file.

You can actually try this out yourself pretty easily if you have Mercurial (hg)
and ant by doing this:

Of course I'll predict that it fails the first time for 50% or more of people;
this kind of downloading is just not that reliable when it depends on all these
sites.
So you may have to run ant import a few times.

Monday Feb 23, 2009

Ant
works great for any pure Java project: very simple to deal with, maybe a little tricky when dealing
with jar manifests, but not bad, and very efficient in terms of limiting the Java VM startup overhead.
But what about platform specific tasks?
I myself find the "<exec>" ant task so painful to use that I
avoid it at all costs, or at least isolate each use to a "<macrodef>".
And this macrodef isolation actually works pretty well when you are dealing with many different
platforms that you need to build on.

There are many solutions to the issue of platform specific builds, including the
ant cpptasks
and I am sure many more. So what I am saying here is not new and not the end all to
this issue. Just some ideas for people to consider when up against this problem.
Please, add your comments if you have some good references and ideas.
It's very obvious to me that I am nowhere near an ant expert,
so take all this with a grain of salt.
I also want to give credit to the many JavaFX teams and individuals who wrote
the various ant scripts in all the repositories; most of this is a consolidation
of other people's ideas and techniques.

So how did the JavaFX SDK deal with multiple platform issues in ant?
This project was composed of many sub repositories, each with different system needs and often
using slightly different techniques for building.
The top repository (or setup repository, or root repository)
we have allows for this independence as much as possible, but at the
same time tries to create some kind of structure for the build process.
From the Mercurial file view of the
openjfx-compiler setup repository,
I will try and explain what is happening.

Basic OS arch detection is done in the file build-os-arch.xml:


People unfamiliar with xml or ant might find the syntax a bit convoluted;
it takes time to get used to it.
Key here is the property os_name
(which will contain one of: solaris, windows, linux, or macosx),
and will be used in the build-defs.xml
file to import the right platform specific file build-${os_name}-defs.xml.
Keep in mind this is unique to this project, but the basics should work for any multi-platform
build project.
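The detection itself can be done with ant <condition> and <os> elements; a minimal sketch of the idea (the real build-os-arch.xml does more, including arch detection):

```xml
<!-- ant properties are write-once, so the first matching condition wins. -->
<condition property="os_name" value="windows">
  <os family="windows"/>
</condition>
<condition property="os_name" value="macosx">
  <os family="mac"/>
</condition>
<condition property="os_name" value="solaris">
  <os name="SunOS"/>
</condition>
<condition property="os_name" value="linux">
  <os name="Linux"/>
</condition>
```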

The platform specific macrodefs are in the build-${os_name}-defs.xml
files, customized for each OS, and each defines the -init-platform-defs task.
Consider the macosx file:


Special to this file is the ability to run the xcodebuild utility; this macrodef
should probably be turned into some kind of generic do-project-build macrodef, someday.
Note that with the JavaFX SDK project we have allowed teams to work in sparse Mercurial forests;
this repository we are looking at is the top repository, but it could have many sub repositories.
Depending on the sub repositories present, there are different needs.
We try to check for them in these build-${os_name}-defs.xml files, via the
-init-platform-defs target.
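As an illustration of the macrodef-wrapped <exec> style for xcodebuild (the names here are my own, not the actual file contents):

```xml
<!-- Hypothetical sketch: isolate the platform-specific <exec> in a macrodef. -->
<macrodef name="run-xcodebuild">
  <attribute name="dir"/>
  <attribute name="target"/>
  <sequential>
    <exec executable="xcodebuild" dir="@{dir}" failonerror="true">
      <arg value="-target"/>
      <arg value="@{target}"/>
    </exec>
  </sequential>
</macrodef>
```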

Pull it all together with the file build-defs.xml, which we ask all sub repositories to
import early in their own build.xml.


This file is imported by each subrepository.
Note that this file imports build-os-arch.xml,
build-${os_name}-defs.xml, and
many other files to provide lots of macrodefs and property settings for a sub repository.

Then we established an ant target contract between each sub repository and the
top repository by requiring certain jfx-* targets to be available in the
sub repository, for example in the openjfx-compiler repository build.xml file
(somewhere in the first 100 lines you should see jfx-* targets defined).


Note that it imports ../build-defs.xml and has defined a set of jfx-*
targets for use by the top repository build.xml file.
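In outline, each sub repository build.xml has this shape (the depends chains and target bodies are my guesses for illustration; only the jfx-* naming and the import matter):

```xml
<project name="openjfx-compiler" default="jfx-build" basedir=".">
  <import file="../build-defs.xml"/>

  <!-- The jfx-* contract targets the top repository will invoke. -->
  <target name="jfx-build" depends="-init-platform-defs">
    <!-- repository-specific build steps -->
  </target>
  <target name="jfx-clean">
    <delete dir="build"/>
  </target>
</project>
```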

The forest build script then uses some macrodefs to cycle through the various
sub repositories (sometimes called components) in the file build-components.xml;
look for the do-all-* targets.


You will see it imported in the top level build.xml file.


We have a cached area where a previous SDK build is used in the case of a partial forest,
allowing a developer to concentrate on the work in a single repository and not have to build
the entire forest.
The
OpenJDK builds use a similar concept with the
Import JDK, where the pieces you aren't building can come from.
Effectively, we will cycle over the sub repositories present, in a particular order, and
request each one to perform a certain action as defined by the jfx-* target contract.
Look for the use of the do-all-components macrodef for where we cycle over
the sub repositories.
(It's a shame that ant doesn't have some kind of applyant task, but you use the tools you are given.)
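A hypothetical sketch of the cycling (the real build-components.xml uses macrodefs like this, but the component list and ordering here are assumptions); <subant> with an ordered filelist gets reasonably close to an applyant:

```xml
<macrodef name="do-all-components">
  <attribute name="target"/>
  <sequential>
    <!-- A filelist (unlike a fileset) preserves order, so the sub
         repositories build in the sequence listed here. -->
    <subant target="@{target}" failonerror="true" inheritall="false">
      <filelist dir="${basedir}">
        <file name="openjfx-compiler/build.xml"/>
        <!-- ...other sub repositories present, in build order... -->
      </filelist>
    </subant>
  </sequential>
</macrodef>

<target name="build">
  <do-all-components target="jfx-build"/>
</target>
```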

The JavaFX SDK project is a bit unique, and the techniques used in its build process
may not suit many projects, but I thought
some of this might be of interest to anyone considering putting their hand into
any large ant nest someday. :^)

Hope someone has found this helpful; as always, comments on better ideas are always welcome.

Monday Feb 16, 2009

On Windows, using
Ant
to do native compilation with Visual Studio requires certain environment variables to be set.
So to be able to say "just run ant", you need to require the user to
have set these variables before they start ant.
The standard way to set them is with the script vcvars32.bat or
vsvars32.bat.
So how do you just make this work inside ant?
You can't just call vcvars32.bat from your ant script, or can you?

Turns out that you can dip into a cmd.exe environment just long enough to
run vcvars32.bat, print out the settings as a properties file and pop back out.
As an example,

REM Windows bat file that runs vcvars32.bat for Visual Studio 2003
REM and echos out a property file with the values of the environment
REM variables we want, e.g. PATH, INCLUDE, LIB, and LIBPATH.
REM Clean out the current settings
set INCLUDE=
set LIB=
set LIBPATH=
REM Run the vsvars32.bat file, sending its output to neverland.
set VSVARS32=%VS71COMNTOOLS%\vsvars32.bat
if "%VS71COMNTOOLS%"=="" (
set VSVARS32=%VS80COMNTOOLS%\vsvars32.bat
if "%VS80COMNTOOLS%"=="" set VSVARS32=%VS90COMNTOOLS%\vsvars32.bat
)
call "%VSVARS32%" > NUL
REM Create some vars that are not set with VS Express 2008
if "%MSVCDIR%"=="" set MSVCDIR=%VCINSTALLDIR%
REM Try using exe, com might be hanging in ssh environment?
REM set DEVENVCMD=%DEVENVDIR%\devenv.exe
set DEVENVCMD=%DEVENVDIR%\devenv.com
REM Adjust for lack of devenv in express editions. This needs more work.
REM VCExpress is the correct executable, but cmd line is different...
if not exist "%DEVENVCMD%" set DEVENVCMD=%DEVENVDIR%\VCExpress.exe
REM Make sure Cygwin is on the path
set PATH=C:\cygwin\bin;C:\cygwin;%PATH%
REM Echo out a properties file
echo ############################################################
echo # DO NOT EDIT: This is a generated file.
echo windows.vs.vsvars32.bat=%VSVARS32%
echo windows.vs.DEVENVDIR=%DEVENVDIR%
echo windows.vs.DEVENVCMD=%DEVENVCMD%
echo windows.vs.VCINSTALLDIR=%VCINSTALLDIR%
echo windows.vs.VSINSTALLDIR=%VSINSTALLDIR%
echo windows.vs.MSVCDIR=%MSVCDIR%
echo windows.vs.INCLUDE=%INCLUDE%
echo windows.vs.LIB=%LIB%
echo windows.vs.LIBPATH=%LIBPATH%
echo windows.vs.PATH=%PATH%
echo ############################################################

Of course, the \ characters in the values need to be changed to / for use in a properties file.
This results in a properties file full of windows.vs.* settings that ant can load.
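On the ant side, one way to use a bat file like this (the file and property names here are assumptions) is to run it with <exec>, capture its stdout into a file, and load that file with <property file=...>:

```xml
<!-- Hypothetical wiring: run the bat file, capture the echoed
     properties, then load them into the ant project. -->
<target name="-init-vs-env">
  <mkdir dir="${build.dir}"/>
  <exec executable="cmd.exe" failonerror="true"
        output="${build.dir}/vsvars.properties">
    <arg value="/q"/>
    <arg value="/c"/>
    <arg value="get-vsvars.bat"/>
  </exec>
  <property file="${build.dir}/vsvars.properties"/>
  <echo message="INCLUDE is ${windows.vs.INCLUDE}"/>
</target>
```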

The
OpenJFX Compiler Setup Files
is an open Mercurial repository that allows for building of the entire JavaFX
product from a forest of repositories, not all open of course.

Now before you post a question asking why JavaFX isn't all open source, you
would be asking the wrong person; I don't know, and I have very little influence
over this. See the
open source statement here.
Also see the
OpenJFX Data Site
for more information on what is visible in the OpenJFX Compiler project.

Some of the issues I tried to tackle with this new setup may relate to many other
projects:

Dealing with ant scripts on a very large project containing hundreds of ant scripts.

Sunday Mar 02, 2008

It took me some time, but I figured out how to change the environment variables that
launched Mac applications will get.
Why? Because I wanted NetBeans to be running with a PATH environment
variable setting that matched what Ant got when I used the build.xml file
from the command line.
When running the "<exec>" Ant task to use an executable,
hard-coding full paths or modifying the PATH
is very platform specific, hard to maintain, and a huge pain.
Having the PATH set properly in the environment is the best way.

Now I could launch "netbeans" from a command line and it would work, but
here is the answer for setting the environment for launched
applications:
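The mechanism on Mac OS X (at least in this era) is the file ~/.MacOSX/environment.plist; something like this, where the PATH value shown is just an example:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
          "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Example PATH only; add your ant, findbugs, and hg locations. -->
    <key>PATH</key>
    <string>/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin</string>
</dict>
</plist>
```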

The directory ~/.MacOSX will need to be created.
Apparently the environment.plist file is read in at login time, so if you change it you will need
to logout and log back in again for any change to take effect.
In my case I added the path to my Ant, my Findbugs, and /usr/local/bin
which contains the Mercurial (hg) I want to have available in the PATH.
I'm not sure NetBeans will actually use the version of Ant in the PATH, but
that hasn't been an issue for me.

Why does hg need to be in the PATH?

It was the need for /usr/local/bin (for hg) in the PATH that got me started
on all this, because in my
Ant file I wanted the build to automatically pull the version information out
of the repository and make it available to the built product as a property
setting. Effectively I need to run:

# Get the last changeset
hg tip --template '{node|short}\n'
# Get the latest tag with a Version string in it
hg log -l 1 --template '{desc|firstline}\n' -k "Version:"
# Get the date of the last changeset
hg tip --template '{date|shortdate}\n'

This property and its value would need to be read in or made available at runtime; the
actual Java code would just call getProperty("product.version") to get the
version string.
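One way to wire this up in ant (a sketch; the property and file names are my own) is <exec> with outputproperty, then persisting the value where the product build can pick it up:

```xml
<!-- Capture the changeset id from hg into an ant property, then
     write it to a properties file bundled with the product. -->
<target name="-init-version">
  <exec executable="hg" outputproperty="product.version" failonerror="false">
    <arg value="tip"/>
    <arg value="--template"/>
    <arg value="{node|short}"/>
  </exec>
  <propertyfile file="build/version.properties">
    <entry key="product.version" value="${product.version}"/>
  </propertyfile>
</target>
```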

I used a tag to track the version code name; the most recent tag
containing "Version:" provides the product code name.
The date and changeset id come from the tip, or most recent changeset.

Automating the creation of a new version code name tag can be done with a special
Ant target used when needed.
Effectively it needs to run:

hg tag -f -m "Version: Name" TAG-YYYY-MM-DD
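As an ant target this might look like the following sketch (version.name would be supplied elsewhere, e.g. picked from the code name file; all names here are assumptions):

```xml
<target name="new-version-tag">
  <tstamp>
    <format property="tag.date" pattern="yyyy-MM-dd"/>
  </tstamp>
  <exec executable="hg" failonerror="true">
    <arg value="tag"/>
    <arg value="-f"/>
    <arg value="-m"/>
    <arg value="Version: ${version.name}"/>
    <arg value="TAG-${tag.date}"/>
  </exec>
</target>
```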

I just manually created a file with a few hundred code names and pick one based on the
day of the year.
This "AllVersions" file could look as simple as:

Humor Risk (1921), previewed once and never released; thought to be lost
The Cocoanuts (1929), released by Paramount Pictures
Animal Crackers (1930), released by Paramount
The House That Shadows Built (1931), released by Paramount (short subject)
Monkey Business (1931), released by Paramount
Horse Feathers (1932), released by Paramount
Duck Soup (1933), released by Paramount
A Night at the Opera (1935), released by MGM
A Day at the Races (1937), released by MGM
Room Service (1938), released by RKO Radio Pictures
At the Circus (1939), released by MGM
Go West (1940), released by MGM
The Big Store (1941), released by MGM
A Night in Casablanca (1946), released by United Artists
Love Happy (1949), released by United Artists
The Story of Mankind (1957), released by Warner Brothers

But ideally you would want enough code names in the list to avoid a
name getting re-used too many times.
The "hg tip --template '{node|short}\n'" output is your real version;
these code names are just a way to help people quickly identify a version.

Now creating tags like this may not be advised or necessary with all
repositories, but the basic
principle can work in many situations.
For example, with the
OpenJDK
project, the Release Engineering people will create the major milestone
tags, and using those you could effectively identify a JDK version with a
name (e.g. JDK 7 Build 23), and an exact changeset id.
The trick is to get the version information from the repository, into the
build tool (make or ant), into the product installation or
baked into the product executable, and then available at runtime by the product
plus easily seen when looking at an installation of the product.
One issue I see is dealing with the situation where you are building a
plain source tree without the Mercurial data or the Mercurial tools;
somehow, when the plain source tree is created, the version data would
need to be left in the source bundle.

Hope this is of some use to people. I'm sure there might be a better way, so if anyone has
any ideas please add your comments.
Ultimately I'd like a product to be able to provide enough details to a user
so that the original source tree could be made quickly available.
Given the changeset id, the exact and complete source could be re-created
with hg clone --rev; of course that gets more complicated with
a forest, but it is still pretty simple.