Zero2Hero: Offering Trainings on Hot Technologies (2013-03-02)
http://www.martinahrer.at/blog/2013/03/02/zero2hero-offering-trainings-on-hot-technologies

It's been a while since I wrote my last blog post, mostly because I have been busy with Zero2Hero, which I co-founded in fall last year. Zero2Hero is offering the coolest trainings on hot technologies. You can check out the current offerings at http://www.zero2hero.at.

At the moment we are preparing some Git workshops: we have been able to engage Tim Berglund, a well-known GitHubber. Registration has already started.

Git Beginners

Git Advanced

In the past months we started a series of workshops around Groovy technologies. We started out with Groovy (with Dierk König, Canoo) and continued with Gradle (with Peter Niederwieser, Gradleware). In fall this year we plan to finalize this series with a Grails training.

2012-07-21
http://www.martinahrer.at/blog/2012/07/21/892

Zero2Hero is starting its first series of Groovy training classes with a core Groovy training with Dierk König. Registration is open now! Seats are limited to a maximum of 15.
Hosting a Static Web Site in an Amazon S3 Bucket (2012-07-14)
http://www.martinahrer.at/blog/2012/07/14/hosting-a-static-web-site-in-a-amazon-s3-bucket

Create an Amazon S3 bucket

Using the Amazon Console, create an S3 bucket. Use your domain name (e.g. www.domain.tld) as the bucket name.
Open the properties panel, switch to the Website tab, and enable static web site hosting. Next, go back to the Permissions tab and click Edit bucket policy.
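The bucket policy for a public static site typically grants anonymous read access to all objects. A sketch of such a policy, using the example bucket name www.domain.tld from above:

```json
{
  "Version": "2008-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": { "AWS": "*" },
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::www.domain.tld/*"
    }
  ]
}
```

Replace the bucket name in the Resource ARN with your own; the `/*` suffix is what makes the policy apply to the objects rather than the bucket itself.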

2012-02-15
http://www.martinahrer.at/blog/2012/02/15/868

Yesterday I had to move a few files from Git repository A to repository B while working on a client's project. Today I had to do it again on another project, and again I had to think hard about the sequence of commands I had used. For that reason I'm posting it here.

In line #1 we clone the source repository so we can't break the original one; in line #4 we remove the remote so no links back to the source repository remain. Next, using the filter-branch command, we filter out everything but the directory we want to keep.
Pull files to new repository
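The command listing itself did not survive in this archive; a reconstruction might look like the following (repository path and the directory to keep are placeholders, and the numbering only roughly matches the text above):

```shell
# 1: clone the source repository so the original stays untouched
git clone /path/to/repositoryA repositoryA-filtered
# 2: work inside the clone
cd repositoryA-filtered
# 3: (make sure the branches you need exist locally)
# 4: remove the remote so no link back to the source repository remains
git remote rm origin
# 5: rewrite history, keeping only the directory of interest
git filter-branch --subdirectory-filter path/to/keep -- --all
```

After the rewrite, the contents of the kept directory become the new repository root, and the filtered history can be fetched into repository B.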

Migrate from Hades to Spring Data JPA (2011-09-13)
http://www.martinahrer.at/blog/2011/09/13/migrate-from-hades-to-spring-data-jpa

For the past 18 months I have been using Hades for implementing JPA-based repositories/DAOs. Hades has now been merged into the Spring Data JPA project. Starting with its 1.0.1 release, I tried migrating one of my projects. Below I document the steps that were necessary.

The project page starts off by showing the Maven repositories and dependency descriptors (in case you are using Maven). Obviously the first step is to replace the Hades artifact with the spring-data-jpa artifact.
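The dependency swap might look like this sketch (the version numbers are examples from around the time of writing):

```xml
<!-- Before: the Hades artifact
<dependency>
  <groupId>org.synyx.hades</groupId>
  <artifactId>org.synyx.hades</artifactId>
  <version>2.0.3</version>
</dependency>
-->
<dependency>
  <groupId>org.springframework.data</groupId>
  <artifactId>spring-data-jpa</artifactId>
  <version>1.0.1.RELEASE</version>
</dependency>
```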

The project's domain classes implement the org.synyx.hades.domain.Persistable interface, which has to be replaced by org.springframework.data.domain.Persistable (it takes a single generic type parameter for the id type).

DAO interfaces extending org.synyx.hades.dao.GenericDao have to be modified to extend org.springframework.data.repository.CrudRepository. With this modification you will face a couple of compile problems: read*-methods have been renamed to find*-methods. You also lose the paging- and sorting-capable find*-methods (which I didn't use); if those are required, extend the org.springframework.data.repository.PagingAndSortingRepository interface instead.

find*-methods now return Iterable rather than List types, requiring minor adjustments to the existing codebase. To get around this, the interface can also extend JpaRepository; using this base interface hands you a pretty powerful DAO that goes far beyond a simple CRUD repository.
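A migrated repository interface might end up looking like this sketch (Customer and findByName are invented examples, not taken from the original project):

```java
// Before: public interface CustomerDao extends GenericDao<Customer, Long> {}
import java.util.List;
import org.springframework.data.jpa.repository.JpaRepository;

public interface CustomerRepository extends JpaRepository<Customer, Long> {
    // findByName replaces a former readByName; the query is derived
    // from the method name, and JpaRepository keeps the List return type
    List<Customer> findByName(String name);
}
```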

When I started off with Hades many months ago I used an explicit XML-based configuration of the DAO components.

With these changes in place, all tests were showing the green bar again. However, today I would recommend using the repository namespace http://www.springframework.org/schema/data/jpa, which significantly reduces the amount of configuration needed by utilizing class path scanning to detect repository components.
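With the namespace, the whole DAO configuration can shrink to something like this sketch (the base package name is an assumption):

```xml
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:jpa="http://www.springframework.org/schema/data/jpa"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="
           http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans.xsd
           http://www.springframework.org/schema/data/jpa
           http://www.springframework.org/schema/data/jpa/spring-jpa.xsd">

    <!-- Detects all repository interfaces below the given package via class path scanning -->
    <jpa:repositories base-package="com.example.repositories"/>
</beans>
```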

Certainly you might encounter more steps during migration, depending on the features you used from the old Hades code base. In my case I had only used named JPA queries, which continued to work as before.

Analyzing Class Metadata with the Springframework (2011-08-17)
http://www.martinahrer.at/blog/2011/08/17/analyzing-class-metadata-with-the-springframework

What a shame that I haven't published anything for a long time. To make up for the long pause, I'm going to discuss how Spring can support analyzing class metadata. These days everybody is crazy about using annotations in their frameworks, and occasionally you need to know which classes/methods are annotated. A good example might be building a list of class names for all JPA entities (those classes annotated with javax.persistence.Entity and the like).

A simple use-case showing how to detect classes annotated as components

This use-case uses a few Spring APIs and some home-grown code that controls resource matching (ClassResourceResolver and AnnotationMetadataMatcher).

Under the hood Spring uses the ASM bytecode library to look into the class files. Note that the code above does not actually load any of the examined classes, so it is fairly lightweight and does not fill up your perm gen space. Still, you need to choose useful search paths so you don't end up scanning the full classpath.
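Spring also ships a ready-made scanner that does essentially the same job as the home-grown code described above. A minimal sketch (the package name com.example.domain is an assumption):

```java
import javax.persistence.Entity;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.context.annotation.ClassPathScanningCandidateComponentProvider;
import org.springframework.core.type.filter.AnnotationTypeFilter;

public class EntityScanner {
    public static void main(String[] args) {
        // 'false' disables the default filters (@Component and friends)
        ClassPathScanningCandidateComponentProvider scanner =
                new ClassPathScanningCandidateComponentProvider(false);
        scanner.addIncludeFilter(new AnnotationTypeFilter(Entity.class));
        // Only class file metadata is read (via ASM); no class is loaded
        for (BeanDefinition bd : scanner.findCandidateComponents("com.example.domain")) {
            System.out.println(bd.getBeanClassName());
        }
    }
}
```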

Matthew McCullough’s Bio
Matthew McCullough is an energetic 15-year veteran of enterprise software development, a world-traveling open source educator, and co-founder of Ambient Ideas, LLC, a US consultancy. Matthew is currently a trainer for GitHub.com, author of the Git Master Class series for O'Reilly, a speaker on the No Fluff Just Stuff tour, author of three of the top 10 DZone RefCards (including the Git RefCard), and President of the Denver Open Source Users Group.
His current topics of research center around project automation: build tools (Maven, Leiningen, Gradle), distributed version control (Git), Continuous Integration (Hudson) and Quality Metrics (Sonar).

Testing Static Methods (2011-01-19)
http://www.martinahrer.at/blog/2011/01/19/testing-static-methods

We have all learned that it is not wise to overuse static methods, as they are hard to test and deal with. So in general I strive to stay out of their way. Sometimes, though, I face the situation that a framework or some technical infrastructure requires me to implement static methods.
For testing those, I found PowerMock, which integrates nicely with EasyMock and Mockito.
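A minimal sketch of how that looks with PowerMock and Mockito; LegacyIdGenerator and its nextId() method are invented for the example:

```java
import org.junit.Test;
import org.junit.runner.RunWith;
import org.powermock.api.mockito.PowerMockito;
import org.powermock.core.classloader.annotations.PrepareForTest;
import org.powermock.modules.junit4.PowerMockRunner;
import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.when;

// invented example class with a static method
class LegacyIdGenerator {
    static long nextId() { return System.nanoTime(); }
}

@RunWith(PowerMockRunner.class)
@PrepareForTest(LegacyIdGenerator.class)
public class LegacyIdGeneratorTest {

    @Test
    public void staticMethodCanBeStubbed() {
        // replaces all static methods of the class with mocks
        PowerMockito.mockStatic(LegacyIdGenerator.class);
        when(LegacyIdGenerator.nextId()).thenReturn(42L);
        assertEquals(42L, LegacyIdGenerator.nextId());
    }
}
```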
ActiveMQ in Embedded Mode (2010-10-20)
http://www.martinahrer.at/blog/2010/10/20/activemq-in-embedded-mode

I'm using Apache ActiveMQ for testing JMS-based code. With its Springframework support it's pretty simple to embed it into a test suite.
Today I encountered a strange problem: after I had broken my test and later fixed the bug, I was not able to restart the embedded ActiveMQ container.

Obviously java.io.EOFException: Chunk stream does not exist at page: 0 sounds like corrupt temporary file data. The JobSchedulerStore creates a directory activemq-data/localhost/scheduler for storing scheduler data. Clean/delete that directory and the broker should start again. If you don't need scheduling, you are better off deactivating that feature completely.

Also, it's helpful to disable the automatic start of the queue container so the test can fully control it.
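A broker definition sketch in the amq namespace showing both switches (attribute values are examples):

```xml
<!-- schedulerSupport="false" turns the JobSchedulerStore off,
     start="false" leaves starting the broker to the test itself -->
<amq:broker brokerName="localhost" persistent="false"
            schedulerSupport="false" start="false" useJmx="false">
    <amq:transportConnectors>
        <amq:transportConnector uri="vm://localhost"/>
    </amq:transportConnectors>
</amq:broker>
```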


public class ServiceOperationsJmsProducerTest extends AbstractJUnit4SpringContextTests implements BrokerServiceAware {

    private BrokerService brokerService;

    @Test
    public void testProducer() throws Exception {
        try {
            brokerService.start();
            // do the testing
        } finally {
            brokerService.stop();
            brokerService.waitUntilStopped();
        }
    }

    @Override
    @Resource
    public void setBrokerService(BrokerService brokerService) {
        this.brokerService = brokerService;
    }
}

AspectJ Runtime Weaving Within a Maven Build (2010-10-16)
http://www.martinahrer.at/blog/2010/10/16/aspectj-runtime-weaving-within-a-maven-build

In order to perform AspectJ runtime weaving, an agent is required. Here we are using the agent (spring-instrument-3.0.3.RELEASE) provided by the Springframework. For surefire we therefore have to use the command line argument -javaagent: specifying the JAR file that contains the agent. To keep the build portable, we first copy that artifact to the build directory using the maven-dependency-plugin.
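A configuration sketch of the two plugins (the group/artifact ids are the official ones; phase, output directory, and version are assumptions):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>copy-agent</id>
      <phase>process-test-classes</phase>
      <goals><goal>copy</goal></goals>
      <configuration>
        <artifactItems>
          <artifactItem>
            <groupId>org.springframework</groupId>
            <artifactId>spring-instrument</artifactId>
            <version>3.0.3.RELEASE</version>
          </artifactItem>
        </artifactItems>
        <outputDirectory>${project.build.directory}/agents</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <!-- point the forked test JVM at the copied agent -->
    <argLine>-javaagent:${project.build.directory}/agents/spring-instrument-3.0.3.RELEASE.jar</argLine>
  </configuration>
</plugin>
```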

Using a BlackBerry Device as a Modem on Your Mac (2010-10-11)
http://www.martinahrer.at/blog/2010/10/11/using-a-blackberry-device-as-a-modem-on-your-mac

While on the Windows platform it was sufficient to provide only the username and password for the provider's APN, on Mac OS X a generic modem phone number is required. This phone number is "991#".
For the username and password of your provider's APN you have to consult your provider.
The user name for A1 (Austria) is ppp@A1plus.at with the password ppp.

In the "Advanced" settings choose "Research in Motion" as the vendor, "BlackBerry IP Modem (GSM)" as the model, and "a1.net" as the APN.

Register ActiveMQ Spring Namespace with Eclipse 3.6 (2010-10-06)
http://www.martinahrer.at/blog/2010/10/06/register-activemq-spring-namespace-with-eclipse-3-6

In an earlier post I described how to set up the Eclipse XML editor to validate a Spring context file containing the AMQ namespace.

It looks like with Eclipse 3.6 that does not work anymore; I was not able to use a JAR file as the schema location the way I had set this up with Eclipse 3.4.
So I'm now directly associating the AMQ namespace with the XML schema available from the AMQ project.

]]>2010-05-26T00:03:42+02:00http://www.martinahrer.at/blog/2010/05/26/using-nexus-and-the-nexus-rest-api-for-implementing-a-software-update-toolNexus is using a pretty well documented REST API which is usable externally as well.
For one of my customers I implemented a kind of automatic software update tool that can be embedded into any product. It is based on the Sonatype NEXUS repository manager.

The tool performs the following steps:

1. Detect the current software version from the artifact metadata that is embedded in any Maven-built artifact.
2. Query Nexus for available artifacts for the installed software (identified by Maven groupId and artifactId).
3. Test whether any of the available artifacts has a version number higher than the installed one.
4. If a newer version is available, download it.
5. Move the new version to the target directory (e.g. the deployment folder of the application server).

The Nexus REST URL for a simple artifact query looks like http://localhost/nexus/service/local/data_index/repositories/releases/content?q=commons-lang

This query asks the releases repository for the commons-lang artifact.

The response only needs to be parsed with a SAX parser to extract the most important elements: the Maven GAV, packaging, classifier, and download URL, which are stored in a POJO. This POJO implements the algorithm for selecting the latest version in its compareTo method.

While implementing the download part I learned an important lesson: for downloading and moving large files it is much more efficient to use the Java NIO API.
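A sketch of such an NIO-based copy, written against current Java (try-with-resources did not yet exist when this post was written); the class and method names are invented:

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.channels.FileChannel;

public class NioMove {
    /** Copies a file via channel transfer instead of a manual read/write loop. */
    public static void copy(File source, File target) throws IOException {
        try (FileChannel in = new FileInputStream(source).getChannel();
             FileChannel out = new FileOutputStream(target).getChannel()) {
            long transferred = 0;
            long size = in.size();
            // transferTo may move fewer bytes than requested, so loop until done
            while (transferred < size) {
                transferred += in.transferTo(transferred, size - transferred, out);
            }
        }
    }
}
```

On most platforms the channel transfer is delegated to the operating system, which avoids copying the data through a Java-side buffer.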

Java5 Generics Are Implemented by Type Erasure, But… (2010-05-22)
http://www.martinahrer.at/blog/2010/05/22/java5-generics-are-implemented-by-type-erasure-but

…still it is possible to get hold of the actual type parameters of parameterized types. Even though classical reflection does not deliver full type information, these type parameters can be extremely useful. I have a few pieces of code (DAOs, controllers, etc.) that are parameterized by class objects.

This kind of violates the DRY principle: the actual type is already given through the type parameter. So I tried to eliminate those few lines of code and come up with a solution that can analyze any parameterized type from a class declaration (super class hierarchy + interfaces).
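The core of such a solution, reduced to the generic superclass case, can be sketched like this (GenericDao and StringDao are invented demo classes):

```java
import java.lang.reflect.ParameterizedType;

public class TypeArgumentDemo {

    public abstract static class GenericDao<T> {
        @SuppressWarnings("unchecked")
        public Class<T> getEntityClass() {
            // Works because the subclass fixes T in its 'extends' clause,
            // which is retained in the class file despite erasure.
            ParameterizedType superType =
                    (ParameterizedType) getClass().getGenericSuperclass();
            return (Class<T>) superType.getActualTypeArguments()[0];
        }
    }

    public static class StringDao extends GenericDao<String> {}

    public static void main(String[] args) {
        System.out.println(new StringDao().getEntityClass()); // class java.lang.String
    }
}
```

A full solution would also walk the super class hierarchy and the implemented interfaces, as described above.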

A Simple ViewController for JSF (2010-05-18)
http://www.martinahrer.at/blog/2010/05/18/a-simple-viewcontroller-for-jsf

Very early on, Apache Shale came up with the idea of providing a view controller that allowed executing dedicated (annotated, etc.) methods during various JSF life-cycle (phase) events, for example to do initialization work upfront. When Shale was hibernated, Apache Orchestra stepped in and added many more useful features.
Both frameworks are based on the idea of having a dedicated page bean (a JSF managed bean per view) that controls the important processing steps for the entire view (page).
Associating a page bean with a view requires following naming conventions; optionally, a view controller mapper provided the ability to explicitly associate page beans and views.

In many projects I found that the client didn't want to pull in another framework just for that little feature. Also, to distribute responsibilities, I usually have multiple managed beans per view, so another "artificial" page bean would be required just to control the others. I therefore came up with a fairly simple solution that allows invoking those life-cycle callbacks directly on any managed bean used on a single page.

The <f:phaseListener> tag allows binding a JSF PhaseListener. You can even bind multiple phase listeners (one for each managed bean; you get the idea).

So each bean provides a reference to a phase listener that takes care of performing the proper processing for the bean itself. Each bean then uses a set of annotations (like @PreRenderView, @PostRestoreView, @PreInvokeApplication, etc.) to mark the methods that should be called during a specific phase.

The code above partially shows a base class from which many of my managed beans are subclassed. Key is the ViewControllerPhaseListener, which hides all the details of doing the reflection work for finding (and then caching) the annotated methods to invoke during a specific phase. For brevity I'm showing only the key elements of this class.
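Since the original listing is not part of this archive, here is a self-contained sketch of the reflection part only; the @PreRenderView name follows the post, everything else (class and method names) is invented for the demo:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;

public class PhaseCallbackDemo {

    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.METHOD)
    public @interface PreRenderView {}

    /** Invokes every public no-arg @PreRenderView method found on the bean. */
    public static void firePreRenderView(Object bean) {
        for (Method m : bean.getClass().getMethods()) {
            if (m.isAnnotationPresent(PreRenderView.class)) {
                try {
                    m.invoke(bean);
                } catch (ReflectiveOperationException e) {
                    throw new RuntimeException(e);
                }
            }
        }
    }

    public static class SamplePageBean {
        public boolean initialized;

        @PreRenderView
        public void init() { initialized = true; }
    }

    public static void main(String[] args) {
        SamplePageBean bean = new SamplePageBean();
        firePreRenderView(bean);
        System.out.println("initialized = " + bean.initialized); // initialized = true
    }
}
```

In the real ViewControllerPhaseListener the lookup result would be cached per bean class, and the dispatch would happen from the PhaseListener's beforePhase/afterPhase callbacks.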

In a few projects this has proven to be quite versatile. Currently I have a couple of interesting phase listener components, from the simple DebugPhaseListener to the OpenTransactionInViewPhaseListener, and I can throw them into any view as required.

Resource Versioning in JSF2 (2010-03-24)
http://www.martinahrer.at/blog/2010/03/24/resource-versioning-in-jsf2

JSF2 made a promising attempt at providing versatile resource versioning but obviously failed at it. After using weblets for a while, I tried to migrate a JSF2 web application to this standard. Unfortunately I had to find out that resource versioning does not work as proposed in the JSF2 spec.

It seems that versioning resources loaded from a classpath location gave the JSF2 implementation team a hard time. The spec suggested that a JAR could contain a version number for either the library or even the resource name (e.g. META-INF/resources/library-name/version/resource-name), as described in the blog entry linked above.

Versioning only works for resources loaded from the web root (e.g. webroot/resources/library-name/version/resource-name), and it looks like the next version of the JSF spec will drop some of the versioning features again.

Rendering a FacesMessage Reliably (2010-02-09)
http://www.martinahrer.at/blog/2010/02/09/rendering-a-facesmessage-reliably

Quite a while ago I posted about generating a FacesMessage within a method that is called during the RENDER_RESPONSE phase. Today I had to find a way to display those messages at the very top of a page without losing any of them.

So I tried doing that with jQuery (in my case it is already included in RichFaces, which I was using here). The idea is to render all FacesMessage objects at the very bottom of the page and, once the page has finished loading, move their DOM subtree up to where it is supposed to be displayed.
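A sketch of that jQuery move; the element ids are invented, since the post's actual markup is not part of this archive:

```javascript
// Assumes the messages are rendered into a container with id "messages" at
// the bottom of the page, and a placeholder with id "messages-target" sits
// at the top where they should actually appear.
jQuery(function ($) {
    // runs on DOM ready: relocate the rendered messages to the top placeholder
    $('#messages').appendTo('#messages-target');
});
```

Because the move happens after the full page has loaded, every FacesMessage generated during RENDER_RESPONSE has already been written out and none gets lost.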