Datanucleus, JDO and executable jar - how to do it?

I am developing a desktop app with DataNucleus and JDO for an embedded H2 database. It all works fine when I run it from Eclipse, but it stops working when I try to make an executable jar out of it. I get the following error:

org.datanucleus.exceptions.NucleusUserException: Persistence process has been specified to use a ClassLoaderResolver of name "jdo" yet this has not been found by the DataNucleus plugin mechanism. Please check your CLASSPATH and plugin specification.

Of course this shows that I have not configured something properly; what am I missing? If I were missing something big, it wouldn't work at all, so I am assuming the executable jar itself is flawed. I have seen this error reported for other apps, like JPOX, where it was marked as fixed, but without any solution given.

The whole error stack trace:

Exception in thread "main" javax.jdo.JDOFatalInternalException: Unexpected exception caught.
        at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1193)
        at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
        at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
        at db.PersistenceManagerFilter.init(PersistenceManagerFilter.java:44)
        at Main.main(Main.java:26)
NestedThrowablesStackTrace:
java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
        at java.lang.reflect.Method.invoke(Unknown Source)
        at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
        at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
        at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
        at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
        at db.PersistenceManagerFilter.init(PersistenceManagerFilter.java:44)
        at Main.main(Main.java:26)
Caused by: org.datanucleus.exceptions.NucleusUserException: Persistence process has been specified to use a ClassLoaderResolver of name "jdo" yet this has not been found by the DataNucleus plugin mechanism. Please check your CLASSPATH and plugin specification.
        at org.datanucleus.NucleusContext.<init>(NucleusContext.java:233)
        at org.datanucleus.NucleusContext.<init>(NucleusContext.java:196)
        at org.datanucleus.NucleusContext.<init>(NucleusContext.java:174)
        at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.<init>(JDOPersistenceManagerFactory.java:364)
        at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:294)
        at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:195)
        ... 12 more

The line it points to is in the PersistenceManagerFilter init method:

pmf = JDOHelper.getPersistenceManagerFactory(getProperties());

The properties file looks like this:

javax.jdo.PersistenceManagerFactoryClass=org.datanucleus.api.jdo.JDOPersistenceManagerFactory
datanucleus.ConnectionDriverName=org.h2.Driver
datanucleus.ConnectionURL=jdbc:h2:datanucleus
datanucleus.ConnectionUserName=sa
datanucleus.ConnectionPassword=
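
For reference, getProperties() just loads this file from the classpath. A minimal sketch of such a helper (the resource name /datanucleus.properties is an assumption; the real code may differ):

import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

// Hypothetical helper: loads the DataNucleus settings shown above from
// the classpath. Assumes the file is bundled as /datanucleus.properties.
private Properties getProperties() {
    Properties props = new Properties();
    try (InputStream in = getClass().getResourceAsStream("/datanucleus.properties")) {
        if (in == null) {
            throw new IllegalStateException("datanucleus.properties not found on classpath");
        }
        props.load(in);
    } catch (IOException e) {
        throw new RuntimeException("Could not read datanucleus.properties", e);
    }
    return props;
}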

I have all the dependencies from Maven, built with a goal that deploys the jar together with its dependencies. The dependencies are as stated on the DataNucleus page: http://www.datanucleus.org/products/datanucleus/jdo/maven.html

Any ideas?

Magnus answered 10/4, 2012 at 21:33 Comment(6)
what "executable jar" ? made up of what ?Discarnate
Whole project with maven dependencies deployed as executable jar file.Magnus
you mean you unjarred the DN jars and put everything in a single jar?Discarnate
I have bundled everything into big jar, all dependency jars into executable jar. With a maven-repo style structure inside the jar for dependencies. What is the correct way to bundle everything, so I can have working jar, with all dependencies working properly?Magnus
DN jars need to have their OSGi info in the right places (plugin.xml and META-INF/MANIFEST.MF). Are they?Discarnate
No, I do not have that in my MANIFEST file. I am not creating a plugin, just a standalone java app, so I didn't take a look at that section of documentation. I am not really sure what to put there then, bundle information too, or just define imports?Magnus

DataNucleus jars are all OSGi-enabled and use a plugin mechanism to identify capabilities, so they contain plugin.xml and META-INF/MANIFEST.MF files. These need to be in the same locations as in the original DN jars (relative to the root of the jar). If you unpack the jars and re-jar them into one, you will need to merge all of the plugin.xml and META-INF/MANIFEST.MF files from the DN jars: ALL of the information there, not just some of it.
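
For illustration, a merged plugin.xml would be shaped roughly like this. This is a hypothetical skeleton only: the ids are placeholders, and the real <extension-point> and <extension> elements must be copied verbatim from the plugin.xml files inside the DN jars being bundled:

<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical skeleton of a merged plugin.xml; placeholder ids only. -->
<plugin id="org.datanucleus" name="DataNucleus Core" provider-name="DataNucleus">
    <!-- 1. Every <extension-point> from every merged plugin.xml comes first. -->
    <extension-point id="example_point" name="Example Point" schema="schema/example_point.exsd"/>
    <!-- 2. Then every <extension>, keeping all entries for the same point together. -->
    <extension point="org.datanucleus.example_point">
        <!-- entries copied verbatim from the source plugin.xml files -->
    </extension>
</plugin>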

Discarnate answered 11/4, 2012 at 12:58 Comment(4)
Ok, I am now pretty sure that my jar is correct and contains all the required information. I think I am missing something in the properties; what could it be, or are they correct? I have edited my post to include all properties. Could it be that datanucleus.ConnectionFactoryName is missing? What value should it have? – Magnus
Could you be more precise about how to merge these files? It's not obvious. – Leopoldine
@Discarnate yes, explaining the best/right way to merge the manifest files would be great. I compile a shaded JAR in a big project and there are too many things to merge manually. – Boggart
@Boggart The MANIFEST info is OSGi, so any OSGi resource should help you with that (DataNucleus doesn't use it explicitly and won't stop working if you have an error in it, AFAIK). The plugin.xml should be fairly obvious to merge (being an XML file): put all extension-points at the top (from all the plugin.xml files being merged), then put all extensions below (with all entries for each extension together). – Discarnate

Adding to the DataNucleus answer: to achieve what you need, use the maven-dependency-plugin and add the following to your pom.xml:

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-dependency-plugin</artifactId>
            <version>2.4</version>
            <executions>
                <execution>
                    <id>copy-dependencies</id>
                    <phase>package</phase>
                    <goals>
                        <goal>copy-dependencies</goal>
                    </goals>
                    <configuration>
                        <outputDirectory>${project.build.directory}/jars</outputDirectory>
                        <overWriteReleases>false</overWriteReleases>
                        <overWriteSnapshots>false</overWriteSnapshots>
                        <overWriteIfNewer>true</overWriteIfNewer>
                    </configuration>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>

Then the dependencies will be in the target/jars directory.

To run your app, use this command:

Windows:
java -cp "yourFile.jar;jars/*" package.className

Linux:
java -cp "yourFile.jar:jars/*" package.className

NOTE: do not use jars/*.jar; that will not work. The JVM expands a classpath wildcard only when it is written as a bare * after the directory.
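
If you also want plain java -jar yourFile.jar to work without typing the classpath, one option (a sketch using standard maven-jar-plugin settings, not part of the original setup) is to write a Class-Path entry into the jar's manifest that points at the jars/ directory:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-jar-plugin</artifactId>
    <configuration>
        <archive>
            <manifest>
                <!-- replace with your real main class -->
                <mainClass>package.className</mainClass>
                <!-- lists every dependency in the manifest Class-Path, prefixed with jars/ -->
                <addClasspath>true</addClasspath>
                <classpathPrefix>jars/</classpathPrefix>
            </manifest>
        </archive>
    </configuration>
</plugin>

The manifest Class-Path names each jar explicitly, so the wildcard restriction above does not apply to it; this assumes the jars/ directory sits next to yourFile.jar at runtime.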

Menell answered 13/4, 2012 at 8:2 Comment(0)

In order to use DataNucleus 4.x in an Apache Storm topology, which requires a single jar, I had to apply two hacks to keep their PluginRegistry mechanism working. The issue is that the DataNucleus core tries to load modules as OSGi bundles, even when it is not running in an OSGi container. This works fine as long as the jars are not merged (and I would prefer not to merge my dependencies, but that is not an option for me).

First, I merged all the plugin.xml files into the datanucleus-core plugin.xml. The trick is that extension-point ids are relative to their parent plugin's id. So if any of the modules you are using define new extension-points, e.g. datanucleus-rdbms, you have to rewrite the ids so they are relative to their new parent plugin.
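
For example (the ids here are illustrative; take the real ones from the plugin.xml files in your DataNucleus version): a point declared in datanucleus-rdbms, whose plugin id is org.datanucleus.store.rdbms, is qualified by that plugin id, so after moving it into datanucleus-core's plugin.xml (plugin id org.datanucleus) the id attribute must be rewritten to keep the qualified name unchanged:

<!-- In datanucleus-rdbms's plugin.xml (plugin id "org.datanucleus.store.rdbms"),
     qualified name: org.datanucleus.store.rdbms.connectionpool -->
<extension-point id="connectionpool" name="..." schema="..."/>

<!-- After merging into datanucleus-core's plugin.xml (plugin id "org.datanucleus"),
     rewritten so the qualified name stays the same -->
<extension-point id="store.rdbms.connectionpool" name="..." schema="..."/>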

Second, I added the following entries to our jar's MANIFEST.MF:

Premain-Class: org.datanucleus.enhancer.DataNucleusClassFileTransformer
Bundle-SymbolicName: org.datanucleus;singleton:=true

This solution is not ideal, as our application essentially pretends to be the DataNucleus core OSGi bundle. However, this is what I ended up with after a few days of smashing my head on my desk.

It might be possible to provide a different PluginRegistry implementation, but I have not looked into this.

Homegrown answered 20/11, 2014 at 1:16 Comment(1)
Can you explain which ids? – Gilbertine

For anyone else struggling to merge the DataNucleus plugin.xml files, I used the following to help. Pipe the contents of the three separate plugin.xml files through this command, and it will tell you which extensions explicitly need merging:

cat plugin_core.xml plugin_rdbms.xml plugin_api.xml | grep -h "extension point" | tr -d "[:blank:]" | sort | uniq -d

More details are in a separate post.

Undesigning answered 28/6, 2020 at 20:12 Comment(0)

Building on top of the existing answers (especially https://mcmap.net/q/1018618/-datanucleus-jdo-and-executable-jar-how-to-do-it), I overrode my plugin.xml once, manually adding the ids for the RDBMS plugin. I followed this answer for that: https://mcmap.net/q/999973/-apache-spark-hive-executable-jar-with-maven-shade

After storing that new plugin.xml as a resource, I configured my maven-shade-plugin as follows:

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>3.2.4</version>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                </execution>
            </executions>
            <configuration>
                <shadedArtifactAttached>true</shadedArtifactAttached>
                <finalName>${project.artifactId}-shaded</finalName>
                <filters>
                    <filter>
                        <artifact>*:*</artifact>
                        <excludes>
                            <exclude>META-INF/*.SF</exclude>
                            <exclude>META-INF/*.DSA</exclude>
                            <exclude>META-INF/*.RSA</exclude>
                            <exclude>META-INF/DEPENDENCIES</exclude>
                            <exclude>META-INF/license/*</exclude>
                            <exclude>META-INF/LICENSE</exclude>
                            <exclude>META-INF/NOTICE</exclude>
                        </excludes>
                    </filter>
                </filters>
                <transformers>
                    <!--
                    Metastore dependencies require the highly specific merging solutions below:
                    - https://github.com/apache/iceberg/issues/5946#issuecomment-1278674526
                    - https://mcmap.net/q/1018618/-datanucleus-jdo-and-executable-jar-how-to-do-it
                    - https://mcmap.net/q/999973/-apache-spark-hive-executable-jar-with-maven-shade
                    -->
                    <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
                    <transformer implementation="org.apache.maven.plugins.shade.resource.DontIncludeResourceTransformer">
                        <resource>META-INF/MANIFEST.MF</resource>
                    </transformer>
                    <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                        <manifestEntries>
                            <Main-Class>${mainClass}</Main-Class>
                            <Premain-Class>org.datanucleus.enhancer.DataNucleusClassFileTransformer</Premain-Class>
                            <Bundle-SymbolicName>org.datanucleus;singleton:=true</Bundle-SymbolicName>
                        </manifestEntries>
                    </transformer>
                    <transformer implementation="org.apache.maven.plugins.shade.resource.DontIncludeResourceTransformer">
                        <resource>plugin.xml</resource>
                    </transformer>
                    <transformer implementation="org.apache.maven.plugins.shade.resource.IncludeResourceTransformer">
                        <resource>plugin.xml</resource>
                        <file>src/main/resources/plugin.xml</file>
                    </transformer>
                    <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                        <resource>reference.conf</resource>
                    </transformer>
                </transformers>
            </configuration>
        </plugin>
    </plugins>
</build>
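
With this configuration, building and running should look roughly like this (the jar name follows from the finalName setting above; substitute your own artifactId for the placeholder):

mvn package
java -jar target/<artifactId>-shaded.jar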
Beebe answered 16/12, 2022 at 20:49 Comment(0)
