As I promised overholt on IRC, I wanted to share my views about the Fedora Java packaging guidelines from a non-Java-coder point of view.
- "JNI-using JAR files"
This link is broken. It can be fixed easily.
- BuildRequires: jpackage-utils
Why do we need this? I understand Requires: jpackage-utils for directory ownership etc., but the BuildRequires is not necessary for most packages (at least not for any of the ones I have seen so far). I think this should *not* be required.
- In certain cases, we can build applications GCJ-natively (producing .so files). But these won't work with any JVM. What should be the packager's primary preference? GCJ-native or OpenJDK? The first one runs faster, but the second one has larger coverage.
For instance, tuxguitar (which I packaged) ships GNU Makefiles that use GCJ for this. Are the resulting .so files going to be the same as the ones built by aot-compile-rpm? (More about AOT later.)
This case has confused me a lot in the past.
- Some explanation at the beginning about what GCJ can do and what OpenJDK can do, plus some information about byte code vs. machine code, would be very useful.
- BuildRequires: java-devel [>= specific_version]
How is the packager supposed to find out the "specific_version"? For OpenJDK this is 1:1.6.0, but for GCJ it is 1.5.0. Are there other numbers we need to know? Can't we list the numbers for all the cases in the guidelines?
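For the record, this is roughly what I would expect the two cases to look like in a spec file (a sketch based only on the numbers above; the epoch and exact format may vary between releases):

```
# Building against OpenJDK (note the epoch prefix):
BuildRequires: java-devel >= 1:1.6.0

# Building against GCJ:
BuildRequires: java-devel >= 1.5.0
```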
- The specfile templates for ant and maven contradict what is written in the BuildRequires and Requires section.
- The abbreviation "SNPG" should be defined at its first occurrence, not at the third (for both ant and maven).
- "Will this preserve the line ending as the this page says it must?"
This would be an artistic ending if we were writing a novel. But I think a guideline shouldn't end with open questions :)
- It would be nice if there were a definition of the GCJ AOT bits. What do they do? Why do we like them? What does gij do? etc.
- "Note: For Fedora versions < 8, no JDK was available other than GCJ so java packages with executable code MUST have the GCJ AOT bits."
This notice can be removed safely.
- The occurrences of
should be removed.
- GCJ AOT bits SHOULD be built and included in packages.
This needs to be more explicit, i.e. 's@in packages@in all Java packages@'. I also think this sentence should go into the Java guidelines themselves, so that people actually click through to the "GCJ Guidelines". The way the "GCJ Guidelines" link is placed there does not give the impression that it must be visited for every Java package.
These are all the issues I encountered. If I remember more, I will post them here. I thought a review of the guidelines by a Java-ignorant person would help other Java-ignorant people in the future. Thanks for reading :)
Is it true that the Java library for serial communication, RXTX, is not
part of the distribution?
I searched for it using yum and was not successful. Maybe its name is
different from what I expected. If it is there, what is the package's name?
 - http://users.frii.com/jarvi/rxtx/
I am currently trying to add incremental parsing (TM) to a Java-based
LR(n) parser. To prove that it works, I have the following test setup:
For every given test input file the following steps are run:
- parse that file normally
- for every token (well, for large files every 100th token or so) in
that file, add a test case which assumes that token has changed and runs
the incremental parser with that information
For a really large file this gives me some 100 test cases and two
_really_ strange observations.
1. Memory growth from test case to test case: as I add tests dynamically,
I run a single test suite which creates all test cases, so I cannot fork
a new VM for every test case. That should not make any difference,
because in tearDown() I set all fields of the test case to null and
manually invoke the garbage collector. Still, the heap usage grows by
about 1 or 2 MB with every test case run, resulting in huge performance
holes when the heap runs full, and finally an OutOfMemoryError.
- Remember: I set _all_ fields of the test cases to null after each test
was run -
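As a sanity check, this is the kind of standalone probe I would use to see whether dropping references actually shrinks the used heap between runs (plain Java, no JUnit; the class and names are mine, and System.gc() is only a hint to the VM, so the numbers are approximate):

```java
public class HeapProbe {
    // Rough used-heap measurement: request a GC, then subtract free from total.
    public static long usedHeap() {
        System.gc();
        Runtime rt = Runtime.getRuntime();
        return rt.totalMemory() - rt.freeMemory();
    }

    public static void main(String[] args) {
        long before = usedHeap();

        // Simulate one "test case": allocate some data, then drop the reference,
        // which mimics setting all test-case fields to null in tearDown().
        byte[] scratch = new byte[4 * 1024 * 1024];
        scratch[0] = 1;
        scratch = null;

        long after = usedHeap();
        System.out.println("heap delta after dropping reference: "
                + (after - before) + " bytes");
    }
}
```

Printing the delta after each test case would show whether the 1-2 MB per run is really unreachable garbage or something still being referenced (e.g. by the suite itself holding on to finished test instances).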
2. And this is where the WTF-o-meter goes through the ceiling: in one
test the non-incremental parser takes 300 ms on an 18K input file
containing ~2800 tokens (which is nice performance, I think). When I
run my incremental parser with the very first token marked as changed,
it finishes after only 30 ms. That would be a very good result if the
incremental parser implementation were finished, but it is not!
Currently those two invocations run exactly the same code path! So I see
a reduction to 10% of the runtime just by invoking the same code path again.
If I encountered either phenomenon on its own I would blame (or, in
case 2, praise) OpenJDK's JVM for it, but seeing both leaves me
guessing at some kind of code-caching mechanism in the background.
Is that some kind of HotSpot feature? And how (for the sake of benchmarking)
can I deactivate it?
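For comparison, here is a toy probe (my own busy loop, not the parser) showing the same effect in isolation: the first run of a code path is typically much slower than later runs, because HotSpot JIT-compiles a method only after it gets hot. Running the same class with java -Xint (pure interpretation, no JIT) should flatten the difference:

```java
public class WarmupProbe {
    // Deterministic busy work so the JIT has something worth compiling.
    public static long work(int n) {
        long sum = 0;
        for (int i = 0; i < n; i++) {
            sum += i % 7;
        }
        return sum;
    }

    public static void main(String[] args) {
        // Time the identical code path several times; run 1 is usually
        // interpreted, later runs benefit from JIT compilation.
        for (int run = 1; run <= 5; run++) {
            long t0 = System.nanoTime();
            long result = work(5_000_000);
            long ms = (System.nanoTime() - t0) / 1_000_000;
            System.out.println("run " + run + ": " + ms + " ms, result=" + result);
        }
    }
}
```

Compare `java WarmupProbe` against `java -Xint WarmupProbe` to separate JIT warm-up from anything your own code is doing.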
We'd like to enable an RPM script that mirrors OSGi Require-Bundle,
Import-Package, and Export-Package statements into RPM virtual Requires
and Provides. While testing this, Alphonse van Assche discovered
that our eclipse-ecj package should actually Require eclipse-rcp.
eclipse-rcp also includes SWT which itself needs a bunch of GNOME
libraries and XULRunner. I don't think we want java-gcj-compat dragging
in XULRunner and a bunch of GNOME libraries.
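To illustrate what the mirroring would produce (the exact symbol syntax below is my guess, not necessarily the final format):

```
# MANIFEST.MF headers of a hypothetical bundle:
Import-Package: org.eclipse.core.runtime
Export-Package: org.eclipse.jdt.core.compiler

# ...mirrored into RPM metadata by the script:
Requires: osgi(org.eclipse.core.runtime)
Provides: osgi(org.eclipse.jdt.core.compiler)
```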
Therefore, I think we need to have a separate ecj SRPM/RPM. I have
whipped up a simple one for use by java-gcj-compat and put it here:
The java-gcj-compat maintainers will have to own it and therefore test
it out. I don't anticipate any issues, but please test it ASAP so we
can deal with any fallout. Once it's been deemed acceptable, it will
have to go through a review which I probably shouldn't do :) Then, the
eclipse package will need to:
- remove all references to ecj
- keep jdtcore symlinks but not ecj symlinks
- move jdtcore symlinks to -jdt package and remove -ecj package
This is all pretty minor and can be done very quickly. Ideally we'd get
it done before the beta freeze on Tuesday. If we can't get it done by
then, we could perhaps justify the minor changes for after the freeze or
wait until F-12.
Having automatic and correct OSGi bundle-level requirements matched in
our RPMs will be very nice. We will obviously still need BuildRequires
but we'll know at install time if there will be an error with OSGi
dependency resolution at runtime.
This is because our eclipse-ecj package contains the
org.eclipse.jdt.core plugin, which needs org.eclipse.core.runtime among
other things, and core.runtime is in the RCP feature. Note that the new
ecj package will contain just the batch compiler part of jdt.core -- as
opposed to the entire thing, as we ship now -- so it won't have a
dependency on any RCP bundles at the OSGi level (in fact, it won't even
be an OSGi bundle).
There are other benefits to a standalone ecj SRPM, of course:
- we don't need to patch org.eclipse.jdt.core and diverge from upstream
- GCCMain -- the gcj driver for ecj -- can go into the ecj JAR and not
- bootstrapping a full gcc SRPM no longer requires the output of
building an eclipse SRPM
One concern is that RHEL-5 has an unversioned Obsoletes/Provides on
"ecj" which should probably get fixed.
So I figured out what caused my Maven setup to suck: my .m2 was corrupted.
But now I encounter another small problem. I have a library
(page-runtime) which is a dependency of another project (page).
When editing page in Eclipse I want to have the source code and/or
javadoc set in .classpath. I could do that manually, but mvn
eclipse:eclipse -DdownloadSources=true should do it for me, and it does not.
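For the record, the same can also be set as plugin configuration in the pom; a sketch (downloadJavadocs being the javadoc counterpart, as far as I understand the maven-eclipse-plugin options):

```xml
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-eclipse-plugin</artifactId>
      <configuration>
        <!-- attach source and javadoc jars to the generated .classpath -->
        <downloadSources>true</downloadSources>
        <downloadJavadocs>true</downloadJavadocs>
      </configuration>
    </plugin>
  </plugins>
</build>
```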
Any advice on that?
The jsr-305 package failed the mass rebuild due to maven being unavailable
at the time. Now that maven is reported to be fixed, I updated the sources
to the latest upstream version and tried to rebuild. I'm heading out of
town shortly, so I don't have time to diagnose the error I got on x86_64:
The fact that the build succeeded on i586 suggests to me that this is yet
another problem with maven or one of its dependencies. If someone
recognizes the symptoms, please give me a clue. Otherwise I'll try to
figure it out when I get back on Wednesday. Regards,