This XML input document is so big that it's tough to understand what may be wrong, so I need your whole XML input document. I can cut/paste the XML input from debug.log and run it without a program call using a special XMLSERVICE $ctl="*test" mode, so I don't need your target ILE RPG/COBOL or whatever; just cut/paste/send the entire input XML to adc@us.ibm.com (or send the entire debug.log).

I want to look into your data just to see if we can get XMLSERVICE to handle this massive XML input document efficiently (fun to try), so please send it to my email (hopefully it will pass through my email filters without getting kicked out).

-- but i am very curious --

I find myself wondering why a 3600-record/field input document would ever be considered a web application at all. I mean no disrespect, but what is the web application context of such a large amount of data running between browser and XMLSERVICE? Won't such a big data transfer be prohibitive on the web ... or ... perhaps need to be broken up into smaller chunks to avoid web/browser timeouts?

rangercairns wrote:This XML input document is so big that it's tough to understand what may be wrong, so I need your whole XML input document. I can cut/paste the XML input from debug.log and run it without a program call using a special XMLSERVICE $ctl="*test" mode, so I don't need your target ILE RPG/COBOL or whatever; just cut/paste/send the entire input XML to *** (or send the entire debug.log).

Yes, you can actually see in the debug snippet I posted that my plug size is 1M (Stmt: call XMLSERVICE.iPLUG1M(?,?,?,?)), because the default 512K was not enough for this to run at all. We also have some requests where 1M isn't enough either and we have to go to 5M. That's another problem we'll have to discuss some other day :)

rangercairns wrote:I find myself wondering why a 3600-record/field input document would ever be considered a web application at all. I mean no disrespect, but what is the web application context of such a large amount of data running between browser and XMLSERVICE? Won't such a big data transfer be prohibitive on the web ... or ... perhaps need to be broken up into smaller chunks to avoid web/browser timeouts?

Well, our system running on i and written in RPG is your regular ERP system. Given the nature of ERP systems, the data there tends to be quite huge. Our PHP applications have many different ways of running the RPG business programs. For example, we have product or customer listings on an ExtJS grid that can be edited right there on the grid, and we then have to send all the updates (possibly 200 rows at a time) to RPG to process and save to DB2. We also have Excel import/export functionality, which actually uses a straight SQL write to file because of the time processing would otherwise take (even with the old toolkit). Here we can have something like 60,000 rows of products with around 20 fields per row. I'm not dreaming of putting XMLSERVICE through all that; I'm just pointing out that the "big data" we are talking about in this thread isn't actually that big in the big picture.

But the main thing we are concerned about is our web services; we have services for things like getting product prices, warehouse availability, etc. There are obviously third-party systems requesting this information from our ERP, and many different third-party systems can and do connect to these services at several of our customers. Now imagine this: a third-party system currently (don't ask me why) gets information on 10,000 products per day. If we make that 10,000 different web service calls and 10,000 different program calls, it will be a hell of a lot slower in total than asking 50 times for 200 product infos. That may sound a bit weird, but that's what the third party does and we have to oblige. Anyway, this was just one extreme example that's actually in production use, but we have many more examples, with actual reasons that make sense, for getting up to 200 product or customer or whatever records at once.
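The batching arithmetic above (50 calls of 200 items instead of 10,000 single calls) can be sketched as a simple chunking helper. This is just an illustration in Python; the function name and the product-ID list are invented, not part of XMLSERVICE or the toolkit:

```python
def chunked(items, size=200):
    """Yield successive fixed-size batches from a list."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

# Hypothetical 10,000 product IDs requested per day.
product_ids = list(range(10000))

batches = list(chunked(product_ids, 200))
print(len(batches))     # 50 service calls instead of 10,000
print(len(batches[0]))  # 200 items per call
```

Each batch would then become one web service call and one program call, which is exactly the 50-versus-10,000 trade-off described above.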

Also, browser/third-party system timeouts were never a problem with the old toolkit; it handles these requests happily in a couple of seconds tops, depending on the load of the machine, etc. Actually, the 200 limit on the occurrence DSs we have in many programs comes from empirical testing of what the Easycom toolkit could handle in reasonable time; we originally had 500, but that was too slow (4-5 seconds).

>>> GET (no update) ... for getting up to 200 product or customer or whatever information at once.

Yes, we expected this sort of many-record output, therefore XMLSERVICE has a shortcut XML syntax for arrays (dim='n') ... and Alan is currently working to implement it in the PHP toolkit; I think this will do a fairly reasonable performing job for output.

<ds dim='999'><data type='132a'>...</data><data type='12p2'>...</data>... and so on ...</ds>
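As a rough illustration of how a client might generate that array-DS input programmatically (the element names and type codes follow the snippet above; the helper function, sample records, and the choice to set dim to the record count are all mine, not toolkit API):

```python
# Build an XMLSERVICE-style <ds dim='n'> parameter from a list of records.
# Field types ('132a' = 132-char alpha, '12p2' = packed 12,2) follow the
# snippet above; the sample records are invented for illustration.
from xml.sax.saxutils import escape

def build_ds(records, types=("132a", "12p2")):
    rows = []
    for rec in records:
        fields = "".join(
            "<data type='%s'>%s</data>" % (t, escape(str(v)))
            for t, v in zip(types, rec)
        )
        rows.append(fields)
    return "<ds dim='%d'>%s</ds>" % (len(records), "".join(rows))

xml = build_ds([("frog132", "12.37"), ("toad145", "34512.37")])
print(xml)
```

The same shape works in PHP or any other client language, since the payload is just a character string.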

>>> PUT (update input) ... have to send all the updates (like possibly 200 rows at a time) to RPG to process and save to DB2.

OK, we did not expect this much input data, and we need to work on XMLSERVICE to handle it. Actually, the debug file you sent me has 16,429 XML nodes' worth of data, essentially a tree with 16,000 branches each leading to a leaf of data ... mmm ... this may just be way too big for our discrete XML <data></data> based design.

I will give it a try with the current design ... BUT ...

We may have to invent a new format for massive-record input ... perhaps something like this ...

<template label='mybigds'><describe type='132A'/><describe type='12p2'/>... and so on ...</template><raw template='mybigds' delimit=':' eol='LF'>frog132:12.37:...:toad145:34512.37:...:... and so on ...</raw>

... where each delimited record can be quickly popped into memory in consecutive fashion (RPG DS-array style).
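Since the template/raw format above is only a proposal, here is a sketch of how a client might split such a raw payload back into typed records. It assumes ':' delimits fields and that each record has exactly one field per <describe> entry in the template; the parser and the packed-decimal heuristic are my own illustration:

```python
# Sketch of parsing the proposed <raw> payload: fields are ':'-delimited
# and each record has one field per <describe> entry in the template.
# The format itself is only a proposal in this thread, not a shipped API.
from decimal import Decimal

def parse_raw(raw, template, delimit=":"):
    fields = raw.split(delimit)
    n = len(template)
    records = []
    for i in range(0, len(fields), n):
        rec = []
        for t, v in zip(template, fields[i:i + n]):
            # Packed/zoned decimal types (e.g. '12p2') become numbers;
            # everything else stays character data.
            rec.append(Decimal(v) if "p" in t.lower() else v)
        records.append(rec)
    return records

template = ["132A", "12p2"]  # types from <describe type='...'/>
raw = "frog132:12.37:toad145:34512.37"
print(parse_raw(raw, template))
# [['frog132', Decimal('12.37')], ['toad145', Decimal('34512.37')]]
```

Because the whole payload is one string split at fixed intervals, records can be materialized consecutively without walking thousands of discrete XML nodes, which is the point of the proposal.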

Philosophy: I would like to stay with character data (frog132:12.37:...) and avoid "binary" data transfers, because clients never have types like packed/zoned decimal ... and ... from a web point of view, you can send a big string around the web using any protocol (DB2, REST GET/POST, ftp, etc.) from any language (PHP, Ruby, Perl, csh, bash, curl, etc.), which in the long run will protect your application against all manner of device proliferation (iPad, phone, PC, etc.).

With that said, do you have other design ideas (this is open source development, my friend)?

Oh nuts, my new 'bigAssist' code for parsing big data has an error; I will correct it.

Bottom line, the 1.7.4-sg3 download has an error. Well, this is why we test on the Yips testing page before moving to production on the main XMLSERVICE page ... I thank you for your test ... and your big data test will now become a standard test from now on.