A.
-- anietorp
Thread: https://forums.pentaho.com/threads/231348-Kettle-8-2-Database-lookup-error-with-sqlserver/

Regarding Select Value step in kettle
https://forums.pentaho.com/threads/231347-Regarding-Select-Value-step-in-kettle/?goto=newpost
Wed, 20 Feb 2019 07:49:59 GMT

Hello,

Is it good practice to use the Select Values step many times in a Kettle transformation? For example, if we have 10 columns in the input stream, join them through a Merge Join step, and need only 4 columns after the join, is it good to remove the remaining 6 columns?

Regards,
Har**** Saxena
-- hsaxena20
Thread: https://forums.pentaho.com/threads/231347-Regarding-Select-Value-step-in-kettle/

BigQuery error with pentaho 8.2 using table output step
https://forums.pentaho.com/threads/231346-BigQuery-error-with-pentaho-8-2-using-table-output-step/?goto=newpost
Tue, 19 Feb 2019 21:17:45 GMT

We built a simple Pentaho transformation with a row generator producing 3 columns, and the same columns were added to the target table in BigQuery. We used the Table Output step to write to the target.

We can successfully connect and browse the BigQuery schema and tables using the target connection, but when we try to run the ETL it fails with the error below:

Table output.0 - The Simba driver for a Google Big Query database connection does not support regular DDL statements. Please use the GBQ Bulk Loader step to create your table.

We also checked JIRA for the same error; it says it was fixed in 8.2, but still no go.

Can somebody advise us if something is missing here? Has anyone connected successfully to BigQuery and done a data load with 8.2?
-- greyyfox7
Thread: https://forums.pentaho.com/threads/231346-BigQuery-error-with-pentaho-8-2-using-table-output-step/

Execute R Script - Access Denied to jri.dll
https://forums.pentaho.com/threads/231345-Execute-R-Script-Access-Denied-to-jri-dll/?goto=newpost
Tue, 19 Feb 2019 16:41:33 GMT

Hello!

I'm using the "Execute R Script" step to execute R code, but a transformation that uses this step doesn't run successfully because access to jri.dll is denied.
See the image that I attached.

Any help will be greatly appreciated. Thank you!
-- wisleyvelasco
Thread: https://forums.pentaho.com/threads/231345-Execute-R-Script-Access-Denied-to-jri-dll/

Copy rows to result behavior
https://forums.pentaho.com/threads/231344-Copy-rows-to-result-behavior/?goto=newpost
Tue, 19 Feb 2019 15:23:56 GMT

I'm following the guide at https://www.clearpeaks.com/copy-rows...ult-in-kettle/. I'm using the REST Client step in one transformation to retrieve a result set, and then using Copy Rows to Result to pass the result set to another transformation, where I access the individual fields as parameters. This mostly works; however, I'm finding that certain parameters that should have a null value instead contain the value of the previous field in the result set. Is this expected behavior? Any ideas on how to overcome this? I'm using PDI 8.2.0.0-342 and I have 'Execute every input row' checked.
-- clarkddc
Thread: https://forums.pentaho.com/threads/231344-Copy-rows-to-result-behavior/

Property Input not including subfolders
https://forums.pentaho.com/threads/231343-Property-Input-not-including-subfolders/?goto=newpost
Tue, 19 Feb 2019 14:21:32 GMT

When I configure the Property Input step to include subfolders, it doesn't work, while the Text File Input step does.
Has anyone encountered the same thing?
Am I missing something, or is it a bug? Tested with CE 8.0 and 8.2!

https://uploadix.de/image/ZqP
https://uploadix.de/image/Zq9
-- R2-G2
Thread: https://forums.pentaho.com/threads/231343-Property-Input-not-including-subfolders/

Scope of variables Different between kettle 5 and kettle 8
https://forums.pentaho.com/threads/231342-Scope-of-variables-Different-between-kettle-5-and-kettle-8/?goto=newpost
Mon, 18 Feb 2019 08:47:39 GMT

Hi All,

I have a job that runs successfully on Kettle 5.0.0 CE, but when I run the same job on Kettle 8.0.0.5 CE it raises an error. I checked the log and found the root cause is a difference in variable scope: the transformation can't get the variable values in Kettle 8. I set the scope to root-job.

But when I set it to JVM, it does work on Kettle 8.

I want to know what the difference is between Kettle 5 and Kettle 8 regarding variable scope, and whether setting the scope to JVM will impact the other jobs called by crontab.

Thanks
-- zxx2403
Thread: https://forums.pentaho.com/threads/231342-Scope-of-variables-Different-between-kettle-5-and-kettle-8/

8.2 CE Google Drive API how to see folders
https://forums.pentaho.com/threads/231341-8-2-CE-Google-Drive-API-how-to-see-folders/?goto=newpost
Fri, 15 Feb 2019 12:50:56 GMT

Hi, I am exploring the new feature to connect to Google Drive, following a blog I found online (not sure if I am allowed to post a link here).

I have set up the Google API, downloaded and renamed the JSON file into the credentials folder, and now after a restart I can see the Google Drive option appear via Open URL.
However, I cannot see any folders, am not allowed to create a new folder, and am not allowed to press OK to finish the Open URL action.

Replace new line character(\n) in String using "Replace in String" step

Please suggest how to replace the new line character (\n) in a String using the "Replace in String" step.

I have tried entering \n in the "Replace in String" step, but it had no effect.

Thanks in advance,
Ram
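For comparison, the two behaviours at issue can be sketched in plain Java (the class and method names are illustrative; whether the PDI step treats the pattern literally or as a regex depends on its configuration):

```java
public class NewlineReplace {
    // Replace the literal newline character with a space.
    public static String replaceNewline(String s) {
        return s.replace("\n", " ");
    }

    // Treat the pattern as a regular expression: the regex \n
    // (written "\\n" in Java source) also matches a newline.
    public static String replaceNewlineRegex(String s) {
        return s.replaceAll("\\n", " ");
    }
}
```

Entering the two characters backslash and n into a field that does literal matching will not match a real newline; only the regex interpretation (or typing an actual newline) will.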
-- Jahnavi
Thread: https://forums.pentaho.com/threads/231340-Replace-new-line-character(-n)-in-String-using-quot-Replace-in-String-quot-step/

Looking for efficient way to transfer a large table (Partitioning the input table?)
https://forums.pentaho.com/threads/231339-Looking-for-efficient-way-to-transfer-a-large-table-(Partitioning-the-input-table-)/?goto=newpost
Tue, 12 Feb 2019 23:56:06 GMT

Hi,

I'm trying to transfer a local Oracle table with 22 million rows to an Azure SQL database (plain transfer, no transformations). On the "table output" step, I can easily pump up "Number of copies to start" (e. g. 200), and it works fine. However, now the input part can't keep up with the output, i. e. it became the bottleneck.

I tried to start the table input step multiple times (and I made sure data movement is set to "Round-Robin"), but I noticed that data gets multiplied nonetheless, so that doesn't seem to be an option either.

I then came across "table partitioning"; here's what I've tried (and it seems to "somehow" work): I set up a partition schema in the "View" tab and arbitrarily gave it 5 partitions. Then I attached a Select Values step to the input step and chose right click -> Partitioning -> Remainder of division -> selected the new partition schema -> selected a field name. Then I attached a Table Output step and started it multiple times.

It seems to work on smaller tables (that's all I tested), but my approach is a bit "hit and miss" (or rather "voodoo"). Before I try it on a large table, could someone give me feedback on whether I'm on the right track here? And do I need to include other steps? For example, I could partition the transactions table by year. Do I need to sort the table by year before partitioning?

Thanks in advance
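An alternative sometimes used for parallel reads, not taken from this thread but sketched here under the assumption of a numeric key column: give each Table Input copy its own modulo-based WHERE clause, so the copies read disjoint slices without a partition schema or pre-sorting. Generating the clauses in plain Java (names are illustrative):

```java
import java.util.ArrayList;
import java.util.List;

public class ModuloSplit {
    // Build one WHERE clause per reader copy. Each copy reads a disjoint
    // slice of the table, so rows are neither duplicated nor skipped and
    // no pre-sorting is needed.
    public static List<String> whereClauses(String keyColumn, int copies) {
        List<String> clauses = new ArrayList<>();
        for (int i = 0; i < copies; i++) {
            clauses.add("MOD(" + keyColumn + ", " + copies + ") = " + i);
        }
        return clauses;
    }
}
```

Each clause would go into one Table Input copy's query (e.g. via a variable); the union of the slices is exactly the source table.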
-- MacPhotoBiker
Thread: https://forums.pentaho.com/threads/231339-Looking-for-efficient-way-to-transfer-a-large-table-(Partitioning-the-input-table-)/

Longest match
https://forums.pentaho.com/threads/231337-Longest-match/?goto=newpost
Mon, 11 Feb 2019 14:23:18 GMT

I'm trying to process a stream of data by comparing a field to an info-list, then returning an ID from this info-list (if a match is found). So far so good, a fairly simple task. However, this lookup must be done using longest match! This is similar to the longest prefix match used with IP addresses.

Example:

Incoming row with value "10000123"

Info-list:
1000
100001
1000012
100002
1200

The longest match here would be 1000012. The result should be fetching a corresponding ID in this info list.

I tried using a recursive sub-transformation, iterating through the various lengths, to find the longest match first. Kettle/Spoon didn't like that very much, and it stopped/froze. I think the sub-transformation executes even if it doesn't receive any input rows; if that's the case, it's not able to break the recursion and will crash into infinity. I don't want to hardcode each iteration (there could be very many different lengths in the info-list). So I wonder if there is any solution other than doing it manually in Java?
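Outside of Kettle's step graph, the longest-match lookup itself is short; a sketch in plain Java (names are illustrative) that could be adapted for a User Defined Java Class step: sort the info-list by descending length and return the first entry that is a prefix of the incoming value.

```java
import java.util.ArrayList;
import java.util.Collection;
import java.util.Comparator;
import java.util.List;

public class LongestPrefix {
    // Return the longest entry in the info-list that is a prefix of
    // `value`, or null if none matches. Sorting candidates by descending
    // length means the first hit is automatically the longest one.
    public static String match(String value, Collection<String> infoList) {
        List<String> byLength = new ArrayList<>(infoList);
        byLength.sort(Comparator.comparingInt(String::length).reversed());
        for (String candidate : byLength) {
            if (value.startsWith(candidate)) {
                return candidate;
            }
        }
        return null;
    }
}
```

With the example above, matching "10000123" against {1000, 100001, 1000012, 100002, 1200} returns 1000012; a second lookup on that key can then fetch the corresponding ID.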
-- Sparkles
Thread: https://forums.pentaho.com/threads/231337-Longest-match/

Find characters in a list
https://forums.pentaho.com/threads/231335-Find-characters-in-a-list/?goto=newpost
Thu, 07 Feb 2019 19:40:12 GMT

Hello.

I have an issue that I can't solve.

I have a field, shown in the attached image, from which I return the values W, T, @, CH or SAC.

So, I need to check whether these values are in the first 4 or 5 characters, but I don't have a pattern in my field.

Can anyone help me?
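The check itself (ignoring how the field arrives in PDI) can be sketched in plain Java; the class name and the 5-character window are assumptions based on the description:

```java
import java.util.List;

public class TokenScan {
    // True when any of the tokens occurs within the first `window`
    // characters of the field value; false otherwise (null-safe).
    public static boolean headContains(String field, List<String> tokens, int window) {
        if (field == null) {
            return false;
        }
        String head = field.substring(0, Math.min(window, field.length()));
        for (String token : tokens) {
            if (head.contains(token)) {
                return true;
            }
        }
        return false;
    }
}
```

Because there is no fixed pattern in the field, a contains-within-window check like this avoids assuming where the token starts.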
-- thiagofred
Thread: https://forums.pentaho.com/threads/231335-Find-characters-in-a-list/

Rest API CDA for Report
https://forums.pentaho.com/threads/231334-Rest-API-CDA-for-Report/?goto=newpost
Thu, 07 Feb 2019 19:08:27 GMT

Hi guys, I'm new here!
I've read a lot about PDI and the whole suite, and I'm very excited to participate here!

Now my first question...

I have a CDA on my BI Server to read data from a MySQL table. I call the CDA with a URL request and I get the option to take the output in XML.

thanks to all!!!
-- renato.droguett
Thread: https://forums.pentaho.com/threads/231334-Rest-API-CDA-for-Report/

PDI 8.2.0.0-342 Microsoft Excel input step Browse doesn't "see" xlsx files
https://forums.pentaho.com/threads/231333-PDI-8-2-0-0-342-Microsoft-Excel-input-step-Browse-doesn-t-quot-see-quot-xlsx-files/?goto=newpost
Thu, 07 Feb 2019 17:01:35 GMT

I've been using PDI for quite some time, but this is the first time I've run across this issue. I'm trying to load an Excel (xlsx) file into the Microsoft Excel Input step, yet when I browse to the folder containing my xlsx files, nothing shows up in the step's browse window. It will see xls files but not xlsx files. If I change the browse filter to "all files" so I can see the xlsx files, add one, and then try to get the sheet names, I get the following error:

at org.pentaho.di.trans.steps.excelinput.jxl.XLSWorkbook.<init>(XLSWorkbook.java:54)
at org.pentaho.di.trans.steps.excelinput.WorkbookFactory.getWorkbook(WorkbookFactory.java:39)
at org.pentaho.di.ui.trans.steps.excelinput.ExcelInputDialog.getSheets(ExcelInputDialog.java:1922)
at org.pentaho.di.ui.trans.steps.excelinput.ExcelInputDialog$17.widgetSelected(ExcelInputDialog.java:1184)
at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.pentaho.di.ui.trans.steps.excelinput.ExcelInputDialog.open(ExcelInputDialog.java:1214)
at org.pentaho.di.ui.spoon.delegates.SpoonStepsDelegate.editStep(SpoonStepsDelegate.java:120)
at org.pentaho.di.ui.spoon.Spoon.editStep(Spoon.java:8662)
at org.pentaho.di.ui.spoon.trans.TransGraph.editStep(TransGraph.java:3293)
at org.pentaho.di.ui.spoon.trans.TransGraph.mouseDoubleClick(TransGraph.java:785)
at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.pentaho.di.ui.spoon.Spoon.readAndDispatch(Spoon.java:1381)
at org.pentaho.di.ui.spoon.Spoon.waitForDispose(Spoon.java:7817)
at org.pentaho.di.ui.spoon.Spoon.start(Spoon.java:9179)
at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:707)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.pentaho.commons.launcher.Launcher.main(Launcher.java:92)
Caused by: jxl.read.biff.BiffException: Unable to recognize OLE stream
at jxl.read.biff.CompoundFile.<init>(CompoundFile.java:116)
at jxl.read.biff.File.<init>(File.java:127)
at jxl.Workbook.getWorkbook(Workbook.java:268)
at org.pentaho.di.trans.steps.excelinput.jxl.XLSWorkbook.<init>(XLSWorkbook.java:52)
... 29 more

Access of external jar using udjc step

I have written Java code for encryption of data. Currently it is throwing the error: "no applicable constructor/method found for actual parameters [byte], [byte]. Candidates are byte org.springframework.security.crypto.util.Encodingutils.concatenate (byte[])."
Can anyone please help.

I am currently using Pentaho version 7.
I have placed the jar file in the Pentaho lib folder, and I also checked in the launcher.properties file that the classpath is mapped to the lib folder.
After all these checks there is still an issue.
Can anyone please help me out.
-- saidlp13
Thread: https://forums.pentaho.com/threads/231331-Access-of-external-jar-using-udjc-step/

Launch Spoon on ubuntu 18.04
https://forums.pentaho.com/threads/231330-Launch-Spoon-on-ubuntu-18-04/?goto=newpost
Mon, 04 Feb 2019 08:21:57 GMT

Hi All,
I have installed Pentaho on an Ubuntu server running on GCP. I followed all the steps to install, and the installation was fine. I ran the job from the terminal using Kitchen and it works fine. But when I try to open Spoon, I'm facing the below error.

PDI (8.2) Status SVG file not found

When I try to access the status page, everything is broken because static web resources (css, svg) cannot be loaded.

I have the following use cases.

Case 1) Through the Carte status page, it seems it's trying to access inaccessible content. I tried changing to "kettle/content..." and "kettle/pentaho/content..." but it still failed.

Case 2) Through the PDI status page, I have a custom URL: instead of "/pentaho" I need to use "/pentaho-di", but resources keep pointing to the bad location. If I adjust the URL manually, the resource is there.

Where can I change those resource locations?
-- dobeslao
Thread: https://forums.pentaho.com/threads/231329-PDI-(8-2)-Status-SVG-file-not-found/

Unknown type 245 in column 0 of 1 in binary-encoded
https://forums.pentaho.com/threads/231328-Unknown-type-245-in-column-0-of-1-in-binary-encoded/?goto=newpost
Fri, 01 Feb 2019 15:03:52 GMT

Hi,

PDI 6.1, MySQL database.

We are getting the error "Unknown type 245 in column 0 of 1 in binary-encoded". Could you please help with this?

We get it when we try to load JSON data using the Table Input step.

Thanks
-- ranala
Thread: https://forums.pentaho.com/threads/231328-Unknown-type-245-in-column-0-of-1-in-binary-encoded/

differences between kettle 8.0.0.0-28 and 8.2.0.0-342?
https://forums.pentaho.com/threads/231327-differences-between-kettle-8-0-0-0-28-and-8-2-0-0-342/?goto=newpost
Thu, 31 Jan 2019 17:02:33 GMT

After looking at https://help.pentaho.com/Documentation/8.2/Whats_New to see what is new in 8.2, I still have some issues.
I have a number of dbunit tests that execute Pentaho jobs and transformations via Java code and then compare job/transformation outputs against known inputs.
These 121-odd tests ran clean against 8.0.0.0-28, but after upgrading to 8.2.0.0-342 I have a number of failures.
I was wondering if somebody else has seen similar behavior in the 8.2 release.
Any pointers would be appreciated.
Thanks.
-- nmogielnicki
Thread: https://forums.pentaho.com/threads/231327-differences-between-kettle-8-0-0-0-28-and-8-2-0-0-342/

DB Repository to File Repo
https://forums.pentaho.com/threads/231326-DB-Repository-to-File-Repo/?goto=newpost
Wed, 30 Jan 2019 16:31:08 GMT

I am migrating from a DB repo to a file-based repo.

While all the transformations and jobs are progressing properly, I have not yet figured out how to do the following.

Currently (DB repo):
When making a transformation, you have all of your database connections available on the left-hand side. This makes it quick to configure what you need.

File-based repo:
Only the connections in use (for that job or transformation) appear to be listed. When making new transformations or jobs this is a PITA.

Question:
How do I export all of the database connections into unique kdb files? (I have 50-60 unique DB connections.)

First init PDI8.2 Linux: Creating configuration won't terminate

I'm using the latest version of CentOS. I just downloaded PDI 8.2 and tried to launch Kitchen from the command line (file repository).
It seems that it wanted to configure some things, but it never finishes and blocks at the same step every time I launch it.
Then I'm stuck and must kill the processes (from another shell session) to get my hands back on the shell.

Thanks
-- roms
Thread: https://forums.pentaho.com/threads/231324-First-init-PDI8-2-Linux-Creating-configuration-won-t-terminate/

REST client keeps sending "Transfer-Encoding: chunked"
https://forums.pentaho.com/threads/231323-REST-client-keeps-sending-quot-Transfer-Encoding-chunked-quot/?goto=newpost
Thu, 24 Jan 2019 09:52:14 GMT

How do I remove the header "Transfer-Encoding: chunked" from the REST client transmission?
Normally this header should only be added when 'Content-Length' is unknown, but even if I add 'Content-Length', Pentaho keeps sending 'chunked'.
Is there any trick to manipulate the headers Kettle sends?
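For reference, at the plain-Java level the chunked/Content-Length choice on HttpURLConnection is controlled by declaring the body length up front; whether this helps depends on which HTTP client the REST Client step uses internally (the class name below is illustrative):

```java
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;

public class FixedLengthPost {
    // Open a connection without touching the network (no I/O happens
    // until connect() or a stream is requested).
    public static HttpURLConnection open(String url) {
        try {
            return (HttpURLConnection) new URL(url).openConnection();
        } catch (IOException e) {
            throw new IllegalStateException(e);
        }
    }

    // Declaring the exact body length makes the connection send a
    // Content-Length header instead of Transfer-Encoding: chunked.
    public static void configure(HttpURLConnection con, int bodyLength) {
        try {
            con.setRequestMethod("POST");
        } catch (IOException e) {
            throw new IllegalStateException(e);
        }
        con.setDoOutput(true);
        con.setFixedLengthStreamingMode(bodyLength);
    }
}
```

If the length is not declared (or setChunkedStreamingMode is used), the connection falls back to chunked transfer, which matches the behaviour the poster describes.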
-- Gosforth
Thread: https://forums.pentaho.com/threads/231323-REST-client-keeps-sending-quot-Transfer-Encoding-chunked-quot/

SPARK AEL execution through Pan
https://forums.pentaho.com/threads/231322-SPARK-AEL-execution-through-Pan/?goto=newpost
Thu, 24 Jan 2019 07:54:15 GMT

Hi,

I am able to run a PDI transformation using the AEL Spark engine through Spoon, but I need to run this transformation using Pan, and I don't see any option to tell Pan to use the AEL Spark engine.