DataNucleus JIRA is now in read-only mode. Raise any new issues in GitHub against the plugin that it applies to.
DataNucleus JIRA will remain for the foreseeable future but will eventually be discontinued.

Provide API to allow a class' ClassMetaData to be removed from the MetaDataManager.


Description

This is to be able to implement a JRebel plugin, whereby each class can be reloaded.
From experiments, I've discovered that using Eclipse IDE with the DN enhancer plugin actually causes JRebel to reload the class twice:
- the first time, JRebel picks up the unenhanced class. At this point we want to remove the class' metadata from the MetaDataManager
- the second time (called a few milliseconds later), JRebel picks up the enhanced class (after the DN plugin has done the enhancement process). This is a no-op for the JRebel plugin; DN can simply lazily recreate the metadata when the class is next used.
The above hopefully is sufficient to implement the plugin. (There are other more exotic designs whereby we could look to support doing the enhancement itself within the JRebel plugin, but they shouldn't require any further DN changes.)
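The two-phase reload described above implies the plugin needs to distinguish an unenhanced class from an enhanced one. A minimal sketch of one way to do that, assuming (as is the case for JDO/DataNucleus enhancement) that enhanced classes implement a marker interface — matching by name avoids a compile-time dependency on the enhancer jars:

```java
public class EnhancementCheck {

    // Hedged sketch: the interface names below are the usual JDO and
    // DataNucleus marker interfaces added by the enhancer; which one
    // applies depends on the DN version in use.
    static boolean looksEnhanced(Class<?> cls) {
        for (Class<?> iface : cls.getInterfaces()) {
            String n = iface.getName();
            if (n.equals("javax.jdo.spi.PersistenceCapable")
                    || n.equals("org.datanucleus.enhancement.Persistable")) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        // String is obviously not an enhanced persistable class
        System.out.println(looksEnhanced(String.class));
    }
}
```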


Andy Jefferson added a comment - 19/Jan/14 10:46 AM Implementing unloading of the metadata is easily doable. A point worth making is that if the user updates their class/metadata to something that implies a change in datastore schema then that will almost certainly NOT work (with codebase as of v3.3). Such a change is in the realms of "schema evolution" and only particular things are supported for RDBMS (like adding an implementation of an interface where a field is of an interface type).
http://www.datanucleus.org/servlet/wiki/display/ENG/Schema+Evolution


Dan Haywood added a comment - 19/Jan/14 03:28 PM Thanks for pointing that out, Andy.
Given we're trying to build a tool for development use only (where the database itself will often be in-memory, e.g. using HSQLDB), we probably don't need a Rolls-Royce solution. In terms of priorities, I would see:
1. being able to add a new (scalar) field
2. being able to remove a (scalar) field
3. being able to rename a (scalar) field.
More sophisticated changes, such as adding bidirectional relationships, could be left until later.
Two questions:
1. how feasible is it to implement just the above changes?
2. if the entire PMF is discarded and rebuilt, and the "datanucleus.autoCreateSchema" and similar properties [1] are set, then - although it would take longer to rebuild - would this support more complex changes (such as a new bidirectional relationship)?
[1] http://www.datanucleus.org/products/accessplatform/jdo/schema.html
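For reference, the schema-control properties alluded to in [1] can be set like any other persistence properties. A sketch, assuming DataNucleus 3.x property names (the specific values shown are illustrative, not a recommendation):

```java
import java.util.Properties;

public class SchemaProps {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Let DN create schema elements on demand (development use only)
        props.put("datanucleus.autoCreateSchema", "true");
        props.put("datanucleus.autoCreateTables", "true");
        props.put("datanucleus.autoCreateColumns", "true");
        // Skip validation since the schema is being regenerated anyway
        props.put("datanucleus.validateTables", "false");
        System.out.println(props.getProperty("datanucleus.autoCreateSchema"));
    }
}
```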


Andy Jefferson added a comment - 19/Jan/14 04:31 PM - edited GitHub master (datanucleus-core and datanucleus-api-jdo) have a method
MetaDataManager.unloadMetaDataForClass(String), allowing unloading of metadata for a class. Suggest you test it using nightly builds and give feedback. It is likely that there are other features downstream (StoreManager, backing stores) that will be making use of the AbstractClassMetaData object, and obviously any (currently) managed object will have a reference to the (previous) AbstractClassMetaData.
Note that Schema Evolution is not addressed as part of this issue, that would be on a store-by-store basis (for those stores that are schema aware) and will not be addressed in the 3.3 lifecycle.
The full solution would be a method on the PMF:
unmanageClass(String className);
and then internally it can do all of the unravelling of references, whether in MetaDataManager, StoreManager or wherever. That particular option should be raised on Apache JDO project (the PMF already has getManagedClasses(), so it is a reasonable addition).
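To make the proposed shape concrete, here is a toy sketch of what such an `unmanageClass` addition might look like. Note that `unmanageClass` does not exist on the real PersistenceManagerFactory; the interface and implementation below are purely illustrative, and a real implementation would also have to unravel references held by the MetaDataManager, StoreManager, etc.:

```java
import java.util.HashSet;
import java.util.Set;

public class UnmanageSketch {

    // Hypothetical interface mirroring the proposed PMF addition.
    interface ClassUnmanagement {
        void unmanageClass(String className);
        Set<String> getManagedClasses();
    }

    // Toy stand-in for a PMF that just tracks managed class names.
    static class ToyPmf implements ClassUnmanagement {
        private final Set<String> managed = new HashSet<>();
        void manageClass(String className) { managed.add(className); }
        public void unmanageClass(String className) { managed.remove(className); }
        public Set<String> getManagedClasses() { return managed; }
    }

    public static void main(String[] args) {
        ToyPmf pmf = new ToyPmf();
        pmf.manageClass("com.example.Customer");
        pmf.unmanageClass("com.example.Customer");
        System.out.println(pmf.getManagedClasses().isEmpty());
    }
}
```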

Dan Haywood added a comment - 20/Jan/14 02:48 PM Hi Andy,
I've done some experimentation with this.
The unloading of the metadata works fine, it seems. But (for reasons I'll omit here) it does appear that I will (after all) have to crack the problem of running the enhancer with the new metadata.
I've been able to write a CustomClassLoader and use it to defineClass with the byte[] given to me by JRebel:
public class CustomClassLoader extends ClassLoader {
    public CustomClassLoader(ClassLoader parent) {
        super(parent);
    }

    public synchronized void defineClass(String fullClassName, byte[] bytes) {
        defineClass(fullClassName, bytes, 0, bytes.length);
    }
}
This CustomClassLoader has a parent class loader whose URLs are obtained from the classpath (via the ThreadContextClassLoader):
URL[] urls = ((URLClassLoader) contextClassLoader).getURLs();
URLClassLoader classLoader = new URLClassLoader(urls, parentClassLoader);
this.customClassLoader = new CustomClassLoader(classLoader);
When I load the class from this classloader, it does seem to be loaded by the custom class loader, not its parent:
Class<?> cls = customClassLoader.loadClass(className);
System.err.println(" loaded: " + cls.getName());
System.err.println(" - classloader: " + cls.getClassLoader());
and when I iterate through its methods, it reflects the change just made in the IDE:
System.err.println(" - methods:");
Method[] methods = cls.getMethods();
for (Method method : methods) {
    System.err.println("   - method: " + method);
}
... so, to summarize, I think I have the class loading bit sorted out.
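The redefinition trick in play here can be shown self-contained. A minimal sketch, assuming only the JDK: it reads the already-compiled bytecode of a nested class off the classpath (standing in for the byte[] JRebel supplies) and redefines it in a child loader, demonstrating that the JVM then treats it as a distinct Class because the defining loader differs:

```java
import java.io.ByteArrayOutputStream;
import java.io.InputStream;

public class ReloadDemo {
    static class CustomClassLoader extends ClassLoader {
        CustomClassLoader(ClassLoader parent) { super(parent); }
        Class<?> define(String name, byte[] bytes) {
            return defineClass(name, bytes, 0, bytes.length);
        }
    }

    static class Payload {}

    public static void main(String[] args) throws Exception {
        // Read Payload's compiled bytecode off the classpath, standing in
        // for the byte[] that JRebel hands to the plugin.
        InputStream in = ReloadDemo.class.getResourceAsStream("ReloadDemo$Payload.class");
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        for (int n; (n = in.read(buf)) != -1; ) out.write(buf, 0, n);

        // Define the same class again in a child loader.
        CustomClassLoader loader = new CustomClassLoader(ReloadDemo.class.getClassLoader());
        Class<?> cls = loader.define("ReloadDemo$Payload", out.toByteArray());
        System.out.println(cls.getClassLoader() == loader);
        System.out.println(cls != Payload.class);
    }
}
```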
Where I am stuck, though, is that when creating a new PMF with the above custom class loader defined as primaryClassLoader, I still seem to pick up the old metadata:
props.put("datanucleus.primaryClassLoader", customClassLoader);
props.put("datanucleus.identifier.case", "PreserveCase");
props.put("javax.jdo.PersistenceManagerFactoryClass", "org.datanucleus.api.jdo.JDOPersistenceManagerFactory");
props.put("javax.jdo.option.ConnectionDriverName", "org.hsqldb.jdbcDriver");
props.put("javax.jdo.option.ConnectionURL", "jdbc:hsqldb:mem:test");
props.put("javax.jdo.option.ConnectionUserName", "sa");
props.put("javax.jdo.option.ConnectionPassword", "");
PersistenceManagerFactory persistenceManagerFactory = JDOHelper.getPersistenceManagerFactory(props, "simple");
TypeMetadata typeMetadata = persistenceManagerFactory.getMetadata(className);
System.err.println(" typeMetadata: " + typeMetadata.getName());
I do get back a typeMetadata, and I'm iterating up through its parents to get to the containing JDOMetadata (to hand off to the Enhancer):
Metadata md = typeMetadata;
Metadata parent;
JDOMetadata jdoMetadata = null;
while ((parent = md.getParent()) != null) {
    System.err.println(" - parent: " + parent.getClass().getName());
    System.err.println(" - parent.toString():\n" + parent.toString());
    md = parent;
    if (md instanceof JDOMetadata) {
        jdoMetadata = (JDOMetadata) md;
    }
}
Although this works, the metadata shows the class' original structure. That seems to suggest that I've not managed to point this PMF at my CustomClassLoader.
Any insights appreciated...


Dan Haywood added a comment - 22/Jan/14 11:16 PM OK, here's where I am with this (generally, good news).
After more experimentation, I established that JRebel does indeed load the class twice, once unenhanced, and then (after the Eclipse DN enhancer has run) in enhanced form.
I came up with the strategy of caching the bytecode whenever it is enhanced (and every class, when initially loaded, is of course enhanced). So, if my JRebel plugin is ever fed unenhanced bytecode, I basically just return the previous enhanced bytecode... a no-op.
So that means that in-memory enhancement isn't required, which simplifies things.
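The caching strategy described above can be sketched in a few lines. This is a toy illustration, not the actual plugin code: `isEnhanced` is a stand-in for a real enhancement check (e.g. testing for the enhancer's marker interface), and here any byte array starting with 1 counts as "enhanced" purely for demonstration:

```java
import java.util.HashMap;
import java.util.Map;

public class EnhancedBytecodeCache {
    private final Map<String, byte[]> cache = new HashMap<>();

    // Stand-in for a real enhancement check; illustrative only.
    static boolean isEnhanced(byte[] bytecode) {
        return bytecode.length > 0 && bytecode[0] == 1;
    }

    public byte[] process(String className, byte[] bytecode) {
        if (isEnhanced(bytecode)) {
            // Remember the enhanced form whenever it passes through.
            cache.put(className, bytecode);
            return bytecode;
        }
        // Unenhanced bytecode arriving later: answer with the cached
        // enhanced version, effectively making the first reload a no-op.
        byte[] enhanced = cache.get(className);
        return enhanced != null ? enhanced : bytecode;
    }

    public static void main(String[] args) {
        EnhancedBytecodeCache c = new EnhancedBytecodeCache();
        byte[] enhanced = {1, 2, 3};
        byte[] unenhanced = {0, 2, 3};
        c.process("com.example.Customer", enhanced);
        System.out.println(c.process("com.example.Customer", unenhanced) == enhanced);
    }
}
```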
I did, though, manage to use the custom class loader, with a patched version of the ClassLoaderResolver (picking up the userDefinedClassLoader first), to build the Class<?> for new enhanced bytecode, and feed that to a new PMF in order to obtain the corresponding metadata.
I then used the new unload method (in this ticket) to remove the old metadata, and register the new.
What I found, however, was that although I got no errors, any instantiated objects were basically empty. I suspect this is because (as you thought?) the downstream StoreManager (?) hasn't been invalidated/resynced.
So, I took a different tack, and threw away the entire PMF, and built a new one (meaning also that I didn't even call the unload method of this ticket).
End result: it works!
The resultant plugin is up on github, at https://github.com/danhaywood/isis-jrebel-plugin.
Net result: I don't, therefore, need this unload method (though I don't think it does any harm). Also, there is no real schema evolution in the above plugin: it can add new properties if optional, or indeed remove properties, but a "rename" of a property is interpreted merely as removing the old property and adding a new one.
Whatever, it's a good result, so I'm happy.
Thanks for your help, Andy. Please close this ticket, optionally backing out the change since I don't require it.