Overview

This section describes several common transformation and job problems and how to address them.

Report Parameters That Include Accented Characters Fail to Validate

If you run a report with parameters that include accented characters and you see an error message that says "This parameter value is of an invalid value," you must make the Tomcat server modification explained in Modifying server.xml To Work With Accented Characters.
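As a rough sketch of that modification, the usual fix is to add URIEncoding="UTF-8" to the HTTP Connector element in Tomcat's server.xml so that request parameters are decoded as UTF-8. The port, protocol, and other attributes below are Tomcat defaults and may differ in your installation; see the referenced document for the authoritative steps.

<!-- Illustrative Connector with UTF-8 URI decoding enabled -->
<Connector URIEncoding="UTF-8" port="8080" protocol="HTTP/1.1"
           connectionTimeout="20000" redirectPort="8443" />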

Cannot Execute or Modify a Transformation or Job

If you cannot run, preview, debug, replay, verify, schedule, copy, export, or save a transformation or job, ask an administrative user to check whether the role you are assigned to has been granted Execute permission, which is required for these tasks.

Action Sequences That Call PDI Content Won't Run

If you've established a solution repository in PDI to store your jobs and transformations, and you attempt to use that stored PDI content in an action sequence on the BA Server, the action sequence will not execute. This is because the BA Server needs specific connection information for the Data Integration (DI) Server in order to retrieve the job or transformation.

Kitchen Can't Read KJBs from a Zip Export

Note: This also applies to Pan and KTR files in Zip archives.

If you are trying to read a KJB file from a Zip export but are getting errors, you may have a syntax error in your Kitchen command. The path to the KJB file inside the Zip archive must be prefaced by a ! (exclamation mark) character. On Linux and other Unix-like operating systems, you must escape the exclamation mark with a backslash: \!
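For example, a command along these lines reads a job from inside a Zip export on a Unix-like system (the archive path, job name, and logging level are illustrative):

kitchen.sh /file:"zip:file:///home/user/pdi_export.zip\!Nightly_Job.kjb" /level:Basic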

Execute Scheduled Jobs on a Remote Carte Server

Follow the instructions below if you need to schedule a job to run on a remote Carte server. Without making these configuration changes, you will be unable to remotely execute scheduled jobs.

Note: This process is also required for using the DI Server as a load balancer in a dynamic Carte cluster.

Stop the DI Server and remote Carte server.

Copy the repositories.xml file from the .kettle directory on your workstation to the same location on your Carte slave. Without this file, the Carte slave will be unable to connect to the DI Repository to retrieve PDI content.
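For example, from the workstation (the user account, hostname, and home directory are illustrative):

scp ~/.kettle/repositories.xml pdi@carte-slave.example.com:~/.kettle/repositories.xml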

Open the /pentaho/server/data-integration-server/tomcat/webapps/pentaho-di/WEB-INF/web.xml file with a text editor.
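The exact change to make here depends on your Pentaho version; typically you locate the Proxy Trusting Filter definition and add your Carte server's IP address to its TrustedIpAddrs parameter. The element names below assume a standard Pentaho web.xml, and the 192.168.0.1 address is illustrative; verify both against your own file. Depending on your version, you may also need to enable the related filter-mapping entries for the Proxy Trusting Filter.

<filter>
    <filter-name>Proxy Trusting Filter</filter-name>
    <filter-class>org.pentaho.platform.web.http.filters.ProxyTrustingFilter</filter-class>
    <init-param>
        <param-name>TrustedIpAddrs</param-name>
        <!-- Add your Carte server's IP address to this list -->
        <param-value>127.0.0.1,192.168.0.1</param-value>
    </init-param>
</filter>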

Save and close the file, then edit the carte.sh or Carte.bat startup script on the machine that runs your Carte server.

Add -Dpentaho.repository.client.attemptTrust=true to the java line at the bottom of the file.
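The exact launch line varies by PDI version; as a sketch, after the edit the final line of carte.sh might look like the following, where everything except the added -D option comes from your existing script:

# Existing Carte launch line, with the trust option added
java $OPT -Dpentaho.repository.client.attemptTrust=true org.pentaho.di.www.Carte "$@"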

Save and close the file.

Start your Carte and DI servers.

You can now schedule a job to run on a remote Carte instance.

Kettle Cluster on YARN Will Not Start

If you are using the Start YARN Kettle Cluster job entry but the Kettle cluster will not start and an error message is displayed, make sure that the Default FS setting matches the configured hostname for the HDFS NameNode, then try starting the Kettle cluster again.
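The value to match is the one configured for the cluster's fs.defaultFS property, typically found in core-site.xml; the hostname and port below are illustrative:

<property>
    <name>fs.defaultFS</name>
    <value>hdfs://namenode.example.com:8020</value>
</property>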

Class org.pentaho.hbase.shim.hdp21.authentication.c not found

The software cannot find the jar that contains the class listed in the error message. The jar is in the HDP hadoop-configurations directory. To resolve this issue, open the Sqoop Import entry in your job, click the Command Line Tool button, and add the -libjars argument along with the path to the pentaho-hadoop-shims-hdp21-security-<version number>.jar, like this:
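A sketch of the resulting command follows. The jar path, connection string, table, and target directory are placeholders for your own values; note that -libjars is a generic Hadoop option and must come immediately after the tool name:

sqoop import -libjars /path/to/hadoop-configurations/hdp21/lib/pentaho-hadoop-shims-hdp21-security-<version number>.jar --connect jdbc:mysql://db.example.com/sales --table orders --target-dir /user/pdi/orders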