June 8, 2010

Just a little update to my previous posting on RightNow's .NET API performance.

This API has me CRYING for their Apex Data Loader. Apex was FAST!!! It was extremely fast at selecting records, and even fast at inserting and updating them. That was an awesome tool: dump the records out, massage them in Access using SQL, pump the records back in, go home and eat lunch. Done in 3hrs.

So here are the latest metrics:

RightNow .Net API

Extract: 1.5MM records in 1hr 4min

Update: 1.5MM records in 3+ days

Some of you may ask what the update is and how involved it is. I believe it to be a fairly simple task, myself: I'm updating each contact record, setting a custom field's value to an externally assigned primary key. The one excuse I will allow RightNow is that there are 250 or so custom fields on the contact. I would expect some performance degradation, but a data conversion of 1.5MM records taking longer than 3 days is just stupid.
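For context, the update itself is nothing exotic. Here's a minimal sketch of the kind of batched update loop involved, in Python rather than .NET, with `send_batch` standing in for the actual API call (a hypothetical placeholder, not RightNow's real interface):

```python
def chunk(records, size):
    """Split a list into batches of at most `size` items."""
    for i in range(0, len(records), size):
        yield records[i:i + size]

def update_contacts(records, send_batch, batch_size=100):
    """Update contacts in batches; returns the number of batches sent.

    `send_batch` is a hypothetical stand-in for the real API call."""
    sent = 0
    for batch in chunk(records, batch_size):
        send_batch(batch)  # one round trip per batch, not per record
        sent += 1
    return sent

# Example: 1,050 (contact_id, external_key) pairs in batches of 100
records = [(cid, "EXT-%d" % cid) for cid in range(1050)]
batches = update_contacts(records, send_batch=lambda b: None)
print(batches)  # 11 round trips instead of 1,050
```

The point being: the logical work is a single field assignment per record, so anything beyond a few hours for 1.5MM rows is overhead on the API's side, not complexity on mine.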

I always lose sight of this when I need it. That, and RightNow's website hides it from anyone's view (side rant: really, how does one search Google for RightNow the product?). Gartner says RightNow is #1 for customer support KB search, but I can never find shit using their support KB search. (Side rant: seriously, how much was Gartner paid?)

Create a New Report via SQL

To do this, we’re going to be using a custom script.

Define a filter that will always return zero rows. Filter: 1=0

Select Level > Custom Scripts

Select the 'Finish' tab of custom scripts

Paste in the following, modifying the SQL to your liking:

$temp = array();
$row_idx = 0;
//Run the desired query
$query = sql_prepare("
    SELECT label, ac_id, header_code, init_code, process_code, exit_code
    FROM ac_scripts a, labels l
    WHERE a.ac_id = l.label_id
    AND tbl = 121
    AND (header_code IS NOT NULL
        OR init_code IS NOT NULL
        OR process_code IS NOT NULL
        OR exit_code IS NOT NULL)
");
//Each column in the SELECT clause will need to be returned as the appropriate data-type (INT for integer, NTS for string, DTTM for datetime)
sql_bind_col($query,1,BIND_NTS,80);
sql_bind_col($query,2,BIND_INT,0);
sql_bind_col($query,3,BIND_NTS,1000);
sql_bind_col($query,4,BIND_NTS,1000);
sql_bind_col($query,5,BIND_NTS,1000);
sql_bind_col($query,6,BIND_NTS,1000);
//For each record returned in the above query, return it as one row of output, each field in its respective column
while ($temp = sql_fetch($query))
{
    $exit_obj[$row_idx][0]->val = $temp[0];
    $exit_obj[$row_idx][1]->val = $temp[1];
    $exit_obj[$row_idx][2]->val = $temp[2];
    $exit_obj[$row_idx][3]->val = $temp[3];
    $exit_obj[$row_idx][4]->val = $temp[4];
    $exit_obj[$row_idx][5]->val = $temp[5];
    ++$row_idx;
}
//Clear your buffer
sql_free($query);

$exit_obj is the returned array, whose values are subsequently displayed as the report's output rows.

February 24, 2010

I wanted to dump out data from RightNow’s Contact tables. The intention was to load a SQL Server table that would then be used in later ETL for data quality rules.

The code does the magic I want.

Execute a RightNow report for a range of contacts.c_id values

Store the results in a SQL Server table on my local machine

The only problem is performance. SFDC can dump out 2-3 million rows in an hour. My code was far from efficient, as it was built as a quick exercise: 1.5 million rows would finish in about 14 hours. YUCK!

Update: I was able to score 760k records in 1.5hrs … much better.

Looking at the comments on this post, it appears that I have RightNow checking my code. Yes, there is a bug in it. The idea is to use a report to dump out data, so the start of the range needs to be incremented more than it is in the AcFilter.
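The fixed paging logic looks roughly like this sketch (Python rather than my actual .NET code; `fetch_page` is a hypothetical stand-in for running the report with an AcFilter on c_id):

```python
def extract_in_ranges(fetch_page, max_id, page_size=10000):
    """Page through records by c_id range.

    `fetch_page(start, end)` is a hypothetical stand-in for executing
    the RightNow report with an AcFilter on c_id for that range."""
    rows = []
    start = 1
    while start <= max_id:
        end = min(start + page_size - 1, max_id)
        rows.extend(fetch_page(start, end))
        # The bug in my original code: the start of the range was not
        # advanced past the page just fetched, so ranges overlapped.
        start = end + 1
    return rows
```

Each pass moves the range's start past the last id fetched, so no range is ever re-read.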

October 20, 2009

With longer running processes, it is a common practice to create a 0-byte file as a signal to a dependent process. An example would be the data transfer via FTP of several files to a staging directory. This post shows how to wait for the 0-byte file.

Control Flow

When complete, the control flow for this package will have the following items defined:

For Loop Container – this keeps the process looping until the 0-byte file is found

Check File script task – this checks for the presence of a file and sets a Boolean variable to True or False

Sleep script task – this adds a time delay so that the process isn't checking every nanosecond
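Outside SSIS, the same wait-for-handoff pattern can be sketched in a few lines of Python (the timeout and delay values here are assumptions, not part of the package):

```python
import os
import time

def wait_for_handoff(path, sleep_seconds=60, timeout_seconds=3600):
    """Loop until the 0-byte handoff file appears, sleeping between checks.

    Mirrors the package's control flow: the For Loop Container (the while),
    the Check File task (os.path.exists), and the Sleep task (time.sleep)."""
    deadline = time.monotonic() + timeout_seconds
    while True:
        if os.path.exists(path):       # Check File script task
            return True
        if time.monotonic() >= deadline:
            return False               # give up after the timeout
        time.sleep(sleep_seconds)      # Sleep script task
```

The SSIS version spreads these three responsibilities across separate tasks wired together by precedence constraints, as described below.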

The script task is configured to execute the Sleep task based on the precedence constraint. This is defined by double clicking on the connector and setting the following:

Definition: Sleep Script Task

Configured to sleep or pause the process.

ReadOnlyVariables

User::SleepMinutes

ReadWriteVariables

The script task will then execute the following code:

Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
' The execution engine calls this method when the task executes.
' To access the object model, use the Dts object. Connections, variables, events,
' and logging features are available as static members of the Dts class.
' Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
'
' To open Code and Text Editor Help, press F1.
' To open Object Browser, press Ctrl+Alt+J.
Public Sub Main()
'
' Zack Bethem @ AmberLeaf
' Based on: http://blogs.pragmaticworks.com/mike_davis/2009/06/make-an-ssis-package-delay-or-wait-for-data.html
'
If Dts.Variables.Contains("User::SleepMinutes") = True Then
    ' Convert SleepMinutes to seconds, then to milliseconds for Thread.Sleep
    Dim seconds As Double = Convert.ToDouble(Dts.Variables("SleepMinutes").Value) * 60
    Dim ms As Int32 = Convert.ToInt32(seconds * 1000)
    System.Threading.Thread.Sleep(ms)
    Dts.TaskResult = Dts.Results.Success
Else
    Dts.TaskResult = Dts.Results.Failure
End If
End Sub
End Class

The For Loop container is configured to execute the Move CheckFile task based on the precedence constraint. This is defined by double clicking on the connector and setting the following:

Definition: Move CheckFile Script Task

Configured to move the 0-byte handoff file to an Archive location, appending a date suffix in yyyyMMddHH format.