2016-05-28

My Dell E7450 laptop just died. It's very dead, no sign of life whatsoever; even the power adapter shut itself off. Will the data on the PC be recoverable? I'm not very good at backing up my PC. It is Saturday morning, so I have to wait until Monday to call Dell Service, fuck! Most unfortunate and inconvenient: I was planning to debug a failing MS SQL Server connector this weekend, and fix some MS SharePoint lists. But for that I need my PC and access to the company network.

Hopefully the Dell support team can resurrect the PC on Monday.

And on Monday morning we had planned to use my PC for end user training. We rehearsed the training on my PC, with Single Sign-On login and all. User training without meticulous preparation is seldom a success, f-u-c-k again.

Sometimes life is such a bitch.

Update: It's late Tuesday afternoon and still no sign of the Dell PC repairman. I am not happy.

Update 2: Wednesday afternoon, the friendly Dell PC repairman has fixed my PC and I'm happy again.

Yippee, here I am for the world to see

A new Data Warehouse was born last week. After many R.I.P. posts, at last a post celebrating a newborn. (Sadly there will be more R.I.P.s, but today is a day of joy and celebration.)

The new Data Warehouse is a proof of concept only; it is intended as the data store for a BI solution serving a faraway distribution center. It took some 3 hours to create the Data Warehouse and schedule some 10 extraction jobs; the job stats might even show up in the monthly Twitter graph for May.

The only problem so far is the MS SQL Server connector of the Data Warehouse; for some reason it refuses to run. Hopefully it is a minor problem.

If this PoC makes it all the way to production I will write more about it. So far so good (except for the SQL Server connector).

2016-05-22

In Part 1 I described how I wanted to add a tiny little feature to the Data Warehouse's SAP connector. As far as Part 1 goes it looks like a walk in the park, but then reality hit me.

I wanted to implement support for prereqs in <sql> statements like this:

<sql>
  <prereq>@TABLE_ROWS</prereq>
  <autoload duplicate='replace' database='@C_DB' truncate='yes'/>
</sql>

But after spending some four or five hours looking at the code, I realised it would take me a whole weekend to implement this, so I decided to implement the prereq at the autoload level instead, which was much easier, like this:

<sql>
  <autoload duplicate='replace' database='@C_DB' truncate='yes'>
    <prereq>@TABLE_ROWS</prereq>
  </autoload>
</sql>

Then I needed to carry the new @TABLE_ROWS symbolic variable over to subsequent jobs, so I placed it in the job symbolic table. But it turned out that the symbolic table was already 'fixed' at this late phase of the job's execution. It turned out to be a somewhat tedious job to analyse and code the ability to carry over @TABLE_ROWS, since it appeared in two different symbolic tables written to disk (I have still not figured out why; it looks awkward). Finally, after some ten hours of looking at the code, I decided to do a somewhat ugly hack in the SAP connector:
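The connector itself is PHP, but as a rough sketch of the idea behind the hack, assuming a symbolic table is just a name-to-value map: late-bind the extracted row count into the job's symbolic table after it has been 'fixed', under both a plain and a job-qualified name. All function and key names here are invented for illustration.

```python
def inject_table_rows(symbolic_table, job_name, row_count):
    """Late-bind TABLE_ROWS into the job symbolic table (illustrative sketch)."""
    symbolic_table["TABLE_ROWS"] = str(row_count)
    # A job-qualified copy lets subsequent jobs refer to
    # @<jobname>.TABLE_ROWS after this job has finished.
    symbolic_table[job_name + ".TABLE_ROWS"] = str(row_count)
    return symbolic_table

symtab = inject_table_rows({}, "extractFromSAP", 42)
print(symtab["extractFromSAP.TABLE_ROWS"])  # prints 42
```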

Yes, it took me some ten hours to get those lines of code right. Now I can do this:

<job name='extractFromSAP'>
  <sap>...</sap>
  <sql>
    <autoload duplicate='replace' database='@C_DB' truncate='yes'>
      <prereq>@TABLE_ROWS</prereq>
    </autoload>
  </sql>
</job>

<job name='postProcessSAPData'>
  <prereq>@extractFromSAP.TABLE_ROWS</prereq>
  …
</job>

TABLE_ROWS is the number of data lines extracted from SAP.
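A minimal sketch of how such a prereq could be evaluated, assuming the symbolic table maps names (without the leading '@') to string values; the function names are mine, not the connector's:

```python
def resolve(symbolic_table, reference):
    """Look up a symbolic reference like '@extractFromSAP.TABLE_ROWS'."""
    return symbolic_table.get(reference.lstrip("@"), "0")

def prereq_go(symbolic_table, reference):
    """A prereq is a boolean saying stop or go: 0 stops, anything above goes."""
    return int(resolve(symbolic_table, reference)) > 0

symtab = {"extractFromSAP.TABLE_ROWS": "17"}
print(prereq_go(symtab, "@extractFromSAP.TABLE_ROWS"))  # True: 17 rows extracted
print(prereq_go(symtab, "@otherJob.TABLE_ROWS"))        # False: nothing extracted
```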

All in all, adding just a tiny little feature took some twenty hours. It is a long time since I last looked at the code; most of the time was spent on understanding the code and the consequences of the change. As you have seen, the actual code added and changed is small. If I had a better understanding of the code, the change would have taken one to three hours to complete. Now it took about 20 hours, and it could have taken 200 hours; you never know what it will cost or what the end result will be when you add a tiny little 'thing' to a complex system. Was this change worth the effort? Nah, can't say that, but it was a bit fun.

Still a question remains: what is the result of the extractFromSAP job if the prereq prevents autoloading? Was it a success or a failure? Or was it bypassed?

You can argue for all three alternatives. Right now it is counted as a success, but I think 'bypassed' better reflects the result. I have to think about that.
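If 'bypassed' were made a first-class outcome, the job result could become a three-valued state instead of a boolean. A sketch of that design, with names of my own choosing:

```python
from enum import Enum

class JobResult(Enum):
    SUCCESS = "success"
    FAILURE = "failure"
    BYPASSED = "bypassed"

def classify(prereq_met, load_failed=False):
    """Classify the outcome of a job whose autoload is gated by a prereq."""
    if not prereq_met:
        return JobResult.BYPASSED  # the prereq stopped the autoload
    return JobResult.FAILURE if load_failed else JobResult.SUCCESS

print(classify(prereq_met=False).value)  # bypassed
```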

2016-05-15

Often, but not always, when I revisit old code I realise there is room for improvement. This time I had reason to review the SAP connector code I wrote for importing data from SAP into a relational database; the basic structure is very simple:

I looked into this because I came across a job that runs frequently but seldom finds any new data to export from SAP (the job is the first in a workflow that raises an alarm signalling 'incoming bulk order'). It annoyed me when I realized there is no way to stop the SQL part, no matter the result of the SAP RFC part of the job. The rest of the workflow was stopped by a prereq:

<prereq type='sql'>
  SELECT IF(COUNT(*)>0,1,0) FROM import_table;
</prereq>

Prereqs are booleans saying stop or go. I decided I would implement support for prereqs in the sql part of the job, like this:

<job>
  <rfc>
    SAP extraction code goes here…
  </rfc>
  <sql>
    <prereq>got data from rfc</prereq>
    SQL import code goes here...
  </sql>
</job>

This is a pretty neat design. I'm reusing the prereq, which is a very commonly used instruction of my data tag language, and it's self-explanatory. The rfc part already creates a file with the number of rows found, so it's just a matter of making that number accessible to the sql part. The prereq is also very easy to evaluate: if you feed the extracted row count to the prereq it is evaluated as a boolean, 0 results in false and anything above in true. I decided to call the extracted row count TABLE_ROWS and put it into the job's symbolic table for accessibility. That was easy; I only changed a few lines.
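Since the rfc part already writes the row count to a file, picking it up in the sql part could look roughly like this; the file name and one-integer format are assumptions of mine, not the connector's actual convention:

```python
import tempfile
from pathlib import Path

def read_table_rows(stats_file):
    """Read the row count the rfc part left behind (a single integer)."""
    return int(Path(stats_file).read_text().strip())

def publish_table_rows(symbolic_table, stats_file):
    """Make TABLE_ROWS visible to the sql part via the job symbolic table."""
    symbolic_table["TABLE_ROWS"] = str(read_table_rows(stats_file))
    return symbolic_table

# Simulate the file the rfc part would leave behind.
stats = Path(tempfile.gettempdir()) / "rowcount.txt"
stats.write_text("128\n")
symtab = publish_table_rows({}, stats)
print(symtab["TABLE_ROWS"])  # 128
```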

The line beginning with <autoload holds the parameters for automagically loading a table into an RDBMS, and it looks like crap. I decided a facelift was needed. This is what I would like it to look like:

<sql>
  <prereq>@TABLE_ROWS</prereq>
  <fieldclosure>"</fieldclosure>
  <autoload duplicate='replace' database='@C_DB' truncate='yes'>
    <table>
      <name>@TABLE1=@DWTABLE1</name>
      <idxdef>PRIMARY KEY (`VBELN`,`POSNR`),INDEX(`MATNR`,`WERKS`)</idxdef>
      <comment>This table is derived from SAP table @TABLE1</comment>
    </table>
  </autoload>
</sql>

Not pretty maybe, but less ugly, and with support for sql prereqs. Now I only had to implement the new functionality. I started with the new XML syntax. Just to keep things simple, I decided to downgrade the new syntax to the old syntax: I had a function that scans for <autoload> and, if found, passes it over to a function that does the autoload of SAP data into tables.

            $sqlarr[$i] = $sqlxml['sql'][0]; // Note this is our copy of the job control block
        }
    }
    return ['sql' => $sqlarr];
}
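Only the tail of that PHP scan function survives above, so here is a hedged Python sketch of the downgrade idea: flatten the new nested <autoload> form back into the old attribute form before the existing autoload function sees it. The old attribute name ('table') is my assumption, not the connector's real syntax.

```python
import xml.etree.ElementTree as ET

NEW_SQL = """
<sql>
  <prereq>@TABLE_ROWS</prereq>
  <autoload duplicate='replace' database='@C_DB' truncate='yes'>
    <table>
      <name>@TABLE1=@DWTABLE1</name>
    </table>
  </autoload>
</sql>
"""

def downgrade(sql_xml):
    """Fold nested <table> children of <autoload> back into attributes."""
    root = ET.fromstring(sql_xml)
    for autoload in root.iter("autoload"):
        for table in list(autoload.findall("table")):
            name = table.findtext("name")
            if name:
                autoload.set("table", name)  # assumed old-style attribute
            autoload.remove(table)
    return ET.tostring(root, encoding="unicode")

print(downgrade(NEW_SQL))
```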

And that is where I am now; I didn't do what I originally wanted to accomplish. I suppose that is what happens to most of us when we start looking into old code where we only intend to add or change something minor. Next weekend I hope to get some time to finish the task.

P.S.

You may rightly wonder: why is this bulk order alarm not implemented directly in SAP? I do not know; probably it was too expensive or would have taken too long to get into production.