Introduction

Overview of current state

A module may export a certain list of public packages to any caller, or certain friend packages to an enumerated list of callers only, or no packages at all. Both Ant- and Maven-based build harnesses verify at build time that a calling module has permission to access packages from an API module via a regular "specification version" dependency, and the same check is done at runtime by the module system.

A spec dep is on a given specification version or newer, together with a major release version or (occasionally) range. The major release version is not related in any way to the specification version, but must be a natural number (if omitted it is typically considered to be zero). When compiling against m/1 2.3, the build harness will by default create a dep "m/1 > 2.3" matching any version of m with major equal to 1 and spec equal to 2.3 or higher. It is possible to explicitly request a major range like "m/1-2 > 2.3" matching m with major 1 and spec 2.3+, or m with major 2 and any spec. The expectation is that compatible API changes increment the spec version (somehow); whereas incompatible changes increment the major version (and perhaps also change the spec version): CompatibilityPolicy.
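
In manifest terms (the module name and versions here are only illustrative), such a spec dep looks like:

```
OpenIDE-Module-Module-Dependencies: org.openide.nodes/1 > 2.3
```

This requests any version of org.openide.nodes with major release version 1 and specification version 2.3 or higher.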

An implementation dep is on an exact implementation version of a module. By default that is generated by the build harness, e.g. as the current date, but it may be given explicitly, typically as a natural number (unrelated to either the major or spec versions). When using an impl dep, a public type in any package may be accessed (at build time and runtime).
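
For comparison (module name and build number invented for illustration), an impl dep pins one exact build:

```
OpenIDE-Module-Module-Dependencies: org.netbeans.modules.foo/1 = 201403101706
```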

Problems with current state

Usage from external modules

The most serious problem: external module developers frequently need to call some semi-API which is either not officially exported, or exported only to some friends. This is very common for IDE plugins, since plenty of widely used modules offer only friend packages: xml.multiview, gsf.testrunner, jumpto, and so on. The developer is usually implored to either ask for friend status from the maintainer of the API module, or to initiate a review of the module to make it public. Either way, this advice is useless for building against the current NetBeans release, and it discourages people from experimenting with potentially useful APIs which might be improved by their feedback. The problem is especially notable for API modules which are not actively maintained, both because getting changes made to the export metadata is hard, and because the API is probably not changing much from release to release anyway.

The usual workarounds are:

1. Declaring an implementation dependency and shipping a different copy of the importing module for each NetBeans release, even when (in the usual case) its code is unchanged. This makes it difficult to build the product in a single job when using the Maven harness, and essentially impossible when using the Ant harness.

2. Like #1, but not bothering to ship different versions; users of a different NetBeans release must recompile the module from sources.

3. Calling the entire API using reflection.

4. Using reflection to hack into the NB module system and convince it that the importer is a "friend". (See https://bitbucket.org/jglick/yenta for example.) This has become the preferred technique for those who know about it, but of course it subverts the original intent of protecting a user from an unlinkable module - and introduces an unannounced dependency on internal implementation classes in the module system.

Version madness

Having three different kinds of versions that may control module-to-module dependencies - specification, major release, and implementation - is rather confusing. It is not obvious, for example, that a spec version upgrade from 1.1 to 2.0 is not essentially any different from one from 1.1 to 1.2; or that "m/1-2 > 1.3" will match m/2 in version 1.0.

Another minor point of confusion is that "m/1-2 > 1.3" matches 1.3 itself, i.e. '>' should really be '≥'.
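
The legacy matching rule described above can be sketched as follows (the method names are invented; this illustrates the semantics, not the module system's actual code):

```java
import java.util.Arrays;

public class LegacyMatch {
    /**
     * Does a dependency "m/lo-hi > spec" accept an exporter m/major in
     * version exporterSpec? The spec floor applies only to the lowest
     * permitted major release; any higher major matches regardless of spec.
     * Note that '>' really behaves as '>='.
     */
    static boolean matches(int lo, int hi, String spec, int major, String exporterSpec) {
        if (major < lo || major > hi) {
            return false;
        }
        if (major > lo) {
            return true; // e.g. "m/1-2 > 1.3" matches m/2 in version 1.0
        }
        return compareDotted(exporterSpec, spec) >= 0;
    }

    /** Compares dotted version strings component by component, padding with zeros. */
    static int compareDotted(String a, String b) {
        int[] x = parse(a), y = parse(b);
        for (int i = 0; i < Math.max(x.length, y.length); i++) {
            int xi = i < x.length ? x[i] : 0;
            int yi = i < y.length ? y[i] : 0;
            if (xi != yi) {
                return Integer.compare(xi, yi);
            }
        }
        return 0;
    }

    static int[] parse(String v) {
        return Arrays.stream(v.split("\\.")).mapToInt(Integer::parseInt).toArray();
    }

    public static void main(String[] args) {
        System.out.println(matches(1, 2, "1.3", 2, "1.0")); // true: major 2, any spec
        System.out.println(matches(1, 2, "1.3", 1, "1.2")); // false: spec too old
        System.out.println(matches(1, 2, "1.3", 1, "1.3")); // true: '>' is really '>='
    }
}
```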

Difficulties of implementation dependencies

Using impl deps is awkward in the best of circumstances. When the exporting module is cooperative, it will declare a numeric implementation version, then use spec.version.base to ensure that changes to that version automatically change the spec version too. (See DevFaqImplementationDependency for background.) Then the importer must declare an impl dep, which is difficult in the Maven harness (requires special metadata).

The importer might accidentally begin using packages which the exporter did not mean to make available even to this importer.

When publishing a numeric impl version, it is all too easy to forget to increment it when making changes, so some dependencies are often stuck on the same impl dep for years - even while the actual API has changed considerably.

Mixing public and friend packages

Declaring a list of friends for a module prevents you from also declaring public packages. (This is essentially because a module has just one version and we would not want to mark a major release version increment when only changing the friend packages: bug #54123 comment #6.) Since it is commonplace for an API module to need to export a special SPI to certain callers, but which should not be part of the general API, there are various workarounds known:

1. Add a public package with a name implying implementation and hope only the right modules call it. If an incompatible change needs to be made in it, make a compatible-style increment to the spec version and just hope the calling modules are updated in tandem (since the module system will not understand). (e.g.: org.openide.util.lookup.implspi)

2. Like #1, but override module.javadoc.packages so it does not appear in API documentation (but will still appear in code completion etc.). (e.g.: org.netbeans.modules.progress.spi)

3. Like #1, but use ad-hoc techniques at runtime to prevent calls from unexpected places, like checking the name of a subclass or checking a stack trace. (e.g.: org.openide.util.lookup.implspi.AbstractServiceProviderProcessor)

4. Try to factor the friend API code into a third module which both the public API module and SPI implementors will depend upon. This is not always straightforward, and is rarely done in practice.

5. Declare an impl dep. This is forbidden in the platform cluster, especially due to the problems interoperating with OSGi, but still common in other clusters. (e.g.: org.netbeans.modules.project.uiapi)

Minor incompatible changes

Most incompatible API changes are quite small in scope and are likely to affect very few clients; if they were more important, effort would be put into avoiding them. Yet the module system has only a binary notion of a change: "compatible" vs. "incompatible". So the API developer who needs to make a slightly incompatible change is forced to choose between:

1. Make the API change and just mark a regular specification version increase. Increment the versions of any callers which are known to need updating and hope that users receive the updates. Hope that there are no other callers needing updates out there.

2. Increment the major release version. This forces those callers which really needed updates to be updated, which is good. But it also forces the great majority of callers to declare a dep on the new version (or a range) and release a new binary, which is potentially very disruptive.

Long-lost friends

OpenIDE-Module-Friends lists can easily include obsolete entries, since there is no check for unused friends. The entries which are netbeans.org modules are useless, since you can just look them up using the golden file; those which are external modules are potentially useful if you want to confer with external module developers about possibly incompatible changes, but it is not necessarily clear whether such friends still exist, or even where they live. For example, org.netbeans.modules.scala.project has been moved around, org.netbeans.modules.fortress.project is essentially dead, and org.netbeans.modules.javafx.dataloader is gone, so the declared friend list for org.netbeans.modules.gsf is no longer very useful.

OSGi translation

Creating OSGi equivalents of module dependency declarations is challenging when the major release version is involved, resulting in hacks like translating NetBeans 1.1 to OSGi 101.1.

Friend packages are not possible in OSGi mode; they are treated as public.

Implementation dependencies are also not available in OSGi mode, so modules with an integral impl version are simply treated as exporting everything.

Proposal: package stability classifications

Rather than forcing a module to declare a single stability level for its entire exported surface, permit individual packages to be separately marked as stable, somewhat stable, or not stable at all. Drop the explicit list of friends. Enforce a uniform, intuitive version range model that matches these expectations; but provide a graceful way to deal with the relatively rare case of a dependency falling outside that range.

Marking stability of export

API Stability lists various stability classifications for APIs. We should permit a module developer to explicitly indicate on each package their expectations for its stability over time.

One attractive option is to mark this right in the source code, as suggested in MNBMODULE-148. The advantage is that the metadata stays close to the sources, and (with @Documented) becomes part of any generated Javadoc. For example, in package-info.java:
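
A minimal sketch (the annotation name and its constants are assumed for illustration; no such API exists yet):

```java
// package-info.java -- hypothetical stability annotation per MNBMODULE-148
@Stability(STABLE)
package org.netbeans.api.foo;
```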

where Stability could also be (say) DEVEL, or PRIVATE (the default if unmarked).

The build harness might need to collect stability classifications for the packages in the module and summarize them in the generated manifest; that depends on how they are consumed by importers and the module system (below).

TBD how to deal with binary packages included in a library wrapper module, which obviously could not be annotated in this way. Perhaps these could be listed manually in the source manifest in some way, or passed to the build harness.

A more conservative change would be to keep OpenIDE-Module-Public-Packages with its current meaning (assuming OpenIDE-Module-Friends is deprecated); and add OpenIDE-Module-Devel-Packages. A module could export public packages, devel packages, or both (but they may not overlap).

Marking stability of import

The next step is to determine what kinds of APIs a module is importing. Assume that major release versions and implementation versions are both deprecated (OpenIDE-Module-Build-Version taking over the informational role in the log file sometimes carried by OpenIDE-Module-Implementation-Version today), leaving the specification version as the sole determiner of API compatibility. (It is already the sole indication of an "upgrade" from the standpoint of Auto Update.)

For each exporter, i.e. module being compiled against, collect a list of packages actually being used. (See below about constants.) Pick the least stable such package, and include a dependency clause in the generated manifest according to the exporter's name, the exporter's current spec version (see policy below), and that minimum stability. Unlike current module dependencies, the stability would result in an implicit or explicit closed-open version range. Suppose the exporter is m 1.1 (~ 1.1.0); then:

stable -> [1.1,2.0)

devel -> [1.1,1.2)

private -> [1.1,1.1.1)
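
The mapping above can be expressed as a small function (class and method names invented; a sketch of the proposed rule, not harness code):

```java
import java.util.Arrays;

public class RangeCalc {
    enum Stability { STABLE, DEVEL, PRIVATE }

    /**
     * Computes the closed-open range implied by a dependency on exporter
     * version spec ("x.y" or "x.y.z") at the given minimum stability.
     */
    static String range(String spec, Stability stability) {
        int[] v = Arrays.stream(spec.split("\\.")).mapToInt(Integer::parseInt).toArray();
        v = Arrays.copyOf(v, 3); // pad to x.y.z, so "1.1" is treated as 1.1.0
        switch (stability) {
            case STABLE:  // next major release is assumed incompatible
                return String.format("[%s,%d.0)", spec, v[0] + 1);
            case DEVEL:   // next minor version may be incompatible
                return String.format("[%s,%d.%d)", spec, v[0], v[1] + 1);
            default:      // PRIVATE: any change at all may be incompatible
                return String.format("[%s,%d.%d.%d)", spec, v[0], v[1], v[2] + 1);
        }
    }

    public static void main(String[] args) {
        System.out.println(range("1.1", Stability.STABLE));  // [1.1,2.0)
        System.out.println(range("1.1", Stability.DEVEL));   // [1.1,1.2)
        System.out.println(range("1.1", Stability.PRIVATE)); // [1.1,1.1.1)
    }
}
```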

Thus a use of a stable (or "official") API would have an effect similar to a spec version dependency today (with no declared major release version range): 2.0 would be considered to break compatibility for the official API. This is similar to the conventions used in Semantic Versioning. Note that the OSGi specification also defines a form of semantic versioning expressing this kind of proximity.

For a devel API (roughly comparable to friend packages today), routine trunk development could include incompatible changes if convenient, so long as the spec version is updated and importers are updated to match; patches made to a release branch would be assumed compatible for importers, so only the edited module needs to be published on Auto Update. In the case of private packages, it is assumed that any specification version change, even patch updates, might involve incompatible changes.

Note that there is no explicit friend list, so any external module developer is potentially on equal footing as far as using APIs which are not official but are in practice stable enough to be useful.

It is TBD what form this new dependency metadata would take. The most conservative approach is to keep the current header but permit the dependency clause to be richer, perhaps using a range syntax:

OpenIDE-Module-Module-Dependencies: org.openide.nodes ~ [7.2,8.0)

Another possibility is to begin using the OSGi header, which already supports this:

Require-Bundle: org.openide.nodes;bundle-version="[7.2,8.0)"

TBD whether the module system would accept arbitrary version ranges, or only the three specific kinds generated above: [x.y.z,x'.0.0) or [x.y.z,x.y'.0) or [x.y.z,x.y.z'). If restricted, then some non-range-based syntax could be devised, e.g. "1.1+" vs. "1.1.0+" vs. "1.1.0.0+" in the example above.

Handling possibly broken dependencies

(In this section we assume that an API developer updates the spec version of a module in accordance with the stability level of the package being changed; see policy below.)

Since incompatible API changes do happen, and on occasion an importer would actually be broken by the change, we want to ensure that a user is never offered a new or updated module which begins throwing linkage errors when it is loaded.

(You might think that a broken module would just throw one such error and then "stop", but this is not so. A module may have many functional entry points, and some may be invoked very frequently, not necessarily in response to an explicit user gesture. For example, an editor hint which made use of an unlinkable utility class could throw a fresh exception every time the editor pauses for a rescan!)

Conversely, since most incompatible changes do not affect most callers, we do not want to gratuitously prevent a user from loading a module which in fact would work fine just because the version numbers do not match.

The solution is to degrade gracefully. If the (spec) version of the exporter is within the importer's requested version range, permit it to be loaded as now without any further ado - trusting that versions have been updated in a compliant way. If it is older than the lower end of the range, mark the importer as unloadable, as now - the user should look for a newer version (usually one would have been offered by Auto Update anyway, or they are simply using too old a platform).

If the exporter's version is newer than the upper end of the importer's range, rather than immediately marking the importer as unloadable, go through all the classes in its module and try to resolve them. If any have linkage errors, mark the importer as incompatible as we do today and refuse to enable it; if not, go ahead and load it, while logging a warning. There are a few subtleties here:

While this situation can be expected to be relatively uncommon, forcing resolution of every class in the importer during every startup is probably too expensive. This could be solved by caching the "known-good" state, with the cache invalidated on a subsequent startup if either the exporter or importer is changed.

Once a class is resolved, it remains referenced by its class loader for the rest of the session, consuming valuable PermGen heap space. It may be feasible to use a special "throwaway" loader for doing these checks, so that no extra memory is permanently consumed.

The JVM's binary compatibility specification perversely considers it compatible, from the standpoint of an implementer of an interface, for a method to be added to that interface. Since this will still result in NoSuchMethodError at runtime if the method is called, which is just as unacceptable as any other linkage error, the compatibility checker would need to explicitly ensure that all methods from all nominally implemented interfaces are in fact implemented.

It would be desirable to check the existence of Java members referenced by XML layers (via newvalue and methodvalue attributes), since these often refer to factories in API modules.

Usages of compile-time constants would not be detected; see below.

Classes in Class-Path extensions often have unmet "optional" dependencies, so probably only the main module JAR should be verified.

The result is a balance of several objectives: the same startup performance as today for the usual case that everything is compatible; the ability to load potentially incompatible plugins at a small cost if they are not really broken; and a more or less polite way to refuse to load plugins which look like they are broken and need to be updated.

Behavioral API changes

This system only covers incompatibilities at the level of Java signatures; it does not handle incompatible changes to the behavior of methods, or to extralinguistic contracts such as XML layer locations or system properties. On the other hand, this may not be so bad:

Most of the ways in which modules communicate with one another are through regular Java binary interfaces.

Changes to behavior can be accompanied by changes in signature, which makes it clear at the language level what is different. In fact it is often possible and desirable to expose complex semantics in the Java signature, where it can be managed using common tools like Javadoc, Find Usages, and so on.

If such contracts do get broken, the consequences are generally not as dire as a linkage error. The plugin might not work well or at all, but it is unlikely to ruin the whole application by throwing constant exceptions or the like.

Compile-time constants

Detecting packages in use by a build-time dependency could be done at the source level, but much more easily in bytecode after compilation. The main difference is that compile-time constants - static final fields of primitive or String type whose initializers follow certain restrictions - are inlined by current versions of javac, losing information about the source dependency. Uses of constants would also not be visible to a linkage checker.

Possible workarounds:

Detect usages of constants in source code. Does not help with the linkage checker unless these are also recorded somewhere in the JAR.

Try to get javac itself to stop inlining constants - perhaps a request for a JLS change with -target 8, but not viable in the short term.

Force the field to not be initializable as a constant, so the caller's bytecode must refer to it. (Signature checkers could warn about fields with the ConstantValue attribute.) One downside is that you lose the ability to see the value of the constant in Javadoc. This is possible by wrapping constant values in:

private static <T> T id(T t) {
return t;
}
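
For example (class and constant names invented), a field declared this way is no longer a compile-time constant, so importers' bytecode retains the field reference:

```java
public class ApiConstants {
    // Identity method: routing a value through a method call prevents javac
    // from treating the field below as a compile-time constant.
    private static <T> T id(T t) {
        return t;
    }

    // No ConstantValue attribute is emitted for this field, so importers'
    // bytecode must reference ApiConstants.DISPLAY_NAME rather than an
    // inlined copy of the string.
    public static final String DISPLAY_NAME = id("Display Name");
}
```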

This issue may not matter much anyway, since if the constant is inlined there will be no runtime reference either. The case of interest is when an API developer wishes to change a constant, which is typically an incompatible change that ought to be treated as such. In many such cases other parts of the API signature will be changed incompatibly as well.

Policies

Transitioning

TBD what the best way would be to transition from the current state. The module system should still be able to handle legacy release and implementation versions, and both impl and legacy (non-range) spec dependencies, and the generic build harnesses (both Ant and Maven) should continue to support legacy mode; but we would want to begin using the new convention for all netbeans.org modules and their interdependencies.

For modules declaring no major release version today, probably this is easy: deps on "m > 1.30" just switch quietly to (e.g.) "m ~ [1.30,2)" and keep similar semantics. OpenIDE-Module-Friends is dropped from the manifest, and the entries in OpenIDE-Module-Public-Packages are marked either stable or devel accordingly.

For modules declaring a major release version today, it is TBD whether that version should be dropped, or retained for compatibility with old external modules but ignored when creating new-style dependencies.

The relatively rare major release range dependencies can simply be replaced with a plain module dep (i.e. on the current version).

OpenIDE-Module-Implementation-Version would be removed from all modules explicitly declaring one (probably this should be made into a build error in projectized.xml if present), and all impl deps converted to plain module deps after ensuring that the packages actually being used are marked as devel. (Or, if appropriate, moving the necessary elements to a new devel package.) spec.version.base=1.5.0 would be converted back to plain OpenIDE-Module-Specification-Version: 1.6 (and again its usage should be made a build error for netbeans.org modules).

Conventional version numbers

TBD whether the current netbeans.org convention of x.y in trunk and x.y.z in release branches is compatible with this proposal. Using x.y.z in trunk would be more comfortable, since an update to x.y.z' could be used to push fixes to AU and/or mark a compatible change to a devel package. (Tracking compatibility of changes to private packages is probably unwieldy.)

Unfortunately that would not be compatible with the current convention for release branches, in which versions start at x.y.1, go to x.y.2 for the first patch or minor release, and so on. Either we could accept release branches being cut from x.y.z and the first release being x.y.z', etc.; or update to x.y'.0 in trunk, cut the branch (using initially x.y'.1), then update again in trunk to x.y".0.

Recording baseline version

Implementing #70917 in the Ant harness would mean that source projects no longer need to explicitly state the version of each dependency, which is very often wrong (usually too old) anyway.

CompatibilityPolicy#Versioning_impact could be simplified this way: when making an incompatible change, you would simply update the first component of the affected module's spec version, and that is all. Other modules in the same source tree would automatically begin to request the new version, so no patch to them would be needed, nor would they need to have been "prepared" using major release version ranges. Externally hosted modules would incur the one-time cost of a compatibility check unless and until recompiled against your new release.

In the case of the Maven harness, the exporter is picked out of the repository by the given version and compiled against, so it makes sense to just declare that as the base version. (Maven syntax actually permits version ranges, but these are useless for our purposes as they will result in the newest match being added to the compiler's classpath, whereas we would want the oldest.)

Permitted dependencies inside netbeans.org sources

The netbeans.org build harness (projectized.xml) should simply prohibit any dependencies on private packages. This means that the generic harness should have some configurable minimum stability level for all dependencies, below which package usage results in a build error.

Signature testing

A build harness could try to enforce change policy using signature tests. Currently the Hudson jobs permit any compatible change, sending a report from nbms-and-javadoc; and forbid any incompatible change relative to the last release.

For purposes of this proposal, it would be better to permit any compatible change so long as the specification version is incremented, and also any incompatible change so long as the spec version is incremented in the appropriate digit (according to the package's declared stability level), reporting either kind of change in nbms-and-javadoc. Doing this from the Maven harness would be easy enough (since older releases of a plugin are available in the repository), but it is trickier in an Ant build since you would need to somehow retain the signature file from the previous version.

Together with #Recording_baseline_version, a corollary is that making potentially incompatible changes to devel APIs would be both easier and safer than it is to do so today on friend packages. If you forget to update the spec of the exporter, sigtest complains. If you update the exporter but forget to update the spec of each importer affected by the change, the diachronic consistency test will complain. If you update specs of both exporter and all affected importers, everything will be fine, because the new importers will automatically request the new exporter, and unaffected importers will be found linkage-compatible at runtime. The same is true for incompatible changes to stable APIs, though of course these should be much rarer.

(This assumes that ConsistencyVerifier, used by the diachronic consistency test, performs the same linkage check as the module system would during startup: i.e. a dep on an exporter which is newer than the high end of the range should be permitted iff it in fact links.)

Usage warnings

When building an external module, i.e. the harness default when not used in netbeans.org sources, deps on private packages should be permitted but issue a warning. Deps on devel packages might merit an informational note but not a warning.

The Whitelist API could probably be used to warn the developer about usage of private APIs right in the Java editor: bug #201453.

The current tactic of compiling against build/public-package-jars/*.jar in the Ant harness would have to be dropped - the full module should be placed in the classpath instead. Note that this means that annotations using validateResource will begin permitting resources from non-API packages of other modules, which is not ideal but it does more closely match runtime behavior. The Maven plugin currently enforces public packages after compilation only, by emitting an error if needed; of course this would be switched to creating a dependency range.

(Alternately, permit deps on stable packages only by default, and force the user to explicitly enable deps on devel or private packages. This could be done in project.xml for the Ant harness, or plugin configuration for Maven.)

The current Add Dependency dialog for the Ant harness differentiates "API" from "Non-API" modules based on the existence of public packages, or friend packages for which the proposed importer is in fact a friend. This would probably need to be relaxed so that any module offering either stable or devel packages would be offered in the "API" list: by using one of the devel packages, you just create an accordingly limited dependency range.

OSGi impact

When using NetBeansInOSGi, the MakeOSGi task would be able to directly translate NetBeans version range dependencies to Require-Bundle entries. An OSGi container would simply lack the graceful fallback for out-of-range dependencies. This would help align NetBeans and OSGi dependencies more closely.

OSGiAndNetBeans#Runtime could be simplified in the same way; unnatural dependencies like [2.1,100) would become the natural [2.1,3.0).

It is noteworthy that section 3.7.3 "Semantic Versioning" of the OSGi r4 4.3 specification describes a similar policy for version ranges, except that rather than stable or devel packages, it discusses consumer and provider APIs, which might be mixed together in a single package or even class (effectively precluding any automated way of determining who might be broken by a given signature change). Also the OSGi specification reserves micro-digit changes for those which "do not affect the API" but does not specify whether API changes compatible for both consumers and providers may increment just this digit.

Jigsaw interoperability

In the future it may be necessary to interoperate with Jigsaw (JDK 8 modules) in various ways - depending on JDK modules or third-party Jigsaw modules from NB modules, writing NB modules in Jigsaw format, or converting NBMs to Jigsaw format in batch mode. This spec is still under development, so the following discussion assumes something like the first draft, and omits various interoperability issues not relevant to package accessibility (such as how to define layer.xml locations).

Reducing module versions to just the specification version should simplify such efforts. Jigsaw supports version ranges in dependencies, albeit in more flexible form than proposed here.

Jigsaw offers "permits" clauses very much like NB's current friend packages. (#Mixing_public_and_friend_packages is possible using nondefault views with permits.) If present, Jigsaw itself would enforce the restriction. Permits would simply be discouraged in the case of NBMs written in Jigsaw format.

Jigsaw has a binary distinction between exported packages (or types within packages, though unlikely to be used in practice) and internal packages. Nonexported packages cannot be compiled against (when using javac in module mode), and are not accessible or even visible to other modules at runtime. This is clearly not a perfect match to the current proposal. Also, a module-info.java must specify version ranges of imports, preventing an NBM written in Jigsaw source format from taking advantage of automatic ranges as in #Marking_stability_of_import. (#Recording_baseline_version is irrelevant in this case since javac will compute the classpath according to module-info.java.)

The suggested policy for NBMs written as, or converted to, Jigsaw modules is that both PUBLIC and DEVEL packages be exported, though the @Stability annotation would be retained; PRIVATE packages (whether so marked or not) should not be exported. For NBMs written in Jigsaw source format, this constraint must be verified somehow (using an annotation processor or some batch tool).

In the case of an NBM written in Jigsaw source format, the module-info.java should specify the desired version range of each dependency, e.g.
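
A sketch in the style of the Jigsaw first draft (module names and the exact requires syntax are assumptions, since the draft was still in flux):

```
// module-info.java (hypothetical first-draft Jigsaw syntax)
module org.example.importer @ 1.0 {
    requires org.openide.nodes @ >= 7.2 < 8.0;
}
```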

After compilation, some verification tool (run either in a per-module build script or in batch on a whole cluster or application) would check that all dependencies are either of the form ">= x.y.z < x'.0.0" or ">= x.y.z < x.y'.0"; and that for dependencies in the first form, no types from DEVEL packages are referenced. (Dependencies on JDK or third-party modules, i.e. those with no stability annotations at all, must be excluded from this check.)

For an NBM converted to Jigsaw module format, the translation tool need do nothing special - merely copy the version ranges recorded in the NBM manifest into the generated module-info.class.

In either case, importing PRIVATE packages would not be possible: such code would neither compile nor run, any more than code trying to use a package-private type from another module. Since using PRIVATE packages is a measure of last resort, and should be even less commonly needed when #Mixing_public_and_friend_packages is no longer an issue, this seems acceptable.

#Handling_possibly_broken_dependencies would be impossible in the case of an app running entirely under Jigsaw, except perhaps using some batch tool to retroactively expand version ranges in module-info.class after verifying binary compatibility. If running the NB module system with Jigsaw interoperation (akin to current Netigso/Netbinox), regular NBMs with dependencies on Jigsaw modules could benefit from runtime compatibility checks; whether native Jigsaw modules with too-old imports could be checked in this way as well would depend on whether Jigsaw exposes hooks for adjusting module metadata after parsing but before final resolution.

JEP #179 mentions that the JDK will include its own stability annotations.

Documentation

Only stable packages should be published in Javadoc HTML form. These should carefully document all assumptions, include a changelog, etc.

Devel packages should include Javadoc comments but need not carry a changelog. A potential importer is expected to have access to the source code, so there is no need to publish HTML documentation (code completion popups suffice).

Documenting private packages is senseless except for comprehension by the module's own developer(s).

Entries in nbbuild/build.properties related to Javadoc (config.javadoc.*, javadoc.packages) could be dropped, as well as any module.javadoc.packages overrides; the build-javadoc target would simply run the javadoc target on all modules in the cluster config; a module with no stable packages would do nothing, and a module with some stable packages would create a documentation set for them (ignoring any additional devel or private packages).