rangercairns wrote: I will give it a try with the current design ... BUT ...
We may have to invent a new format for massive records passed as input ... perhaps something like this ...
... and so on ...
<raw template='mybigds' delimit=':' eol='LF'>
... and so on ...
... where each delimited record can be quickly popped into memory in consecutive fashion (RPG DS array style).
I would like to stay with character data (frog132:12.37:...) and avoid "binary" data transfers, because clients never have types like packed/zoned decimal ... and ... from a web point of view you can send a big string around the web using any protocol (DB2, REST GET/POST, ftp, etc.) from any language (PHP, Ruby, perl, csh, bash, curl, etc.), which in the long run will protect your application against all manner of device proliferation (ipad, phone, pc, etc.)
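If this <raw> proposal ever lands, client-side handling stays trivial in any of those languages. A minimal sketch in PHP, assuming the delimit=':' and eol='LF' attributes from the example above (the <raw> tag is only a proposal, not an implemented xmlservice tag, and the sample records are made up):

<?php
// Sketch of unpacking the proposed <raw> payload: records split on LF
// (eol='LF'), fields split on ':' (delimit=':'), all plain character data.
$raw = "frog132:12.37:42\nfrog133:99.01:7\n"; // made-up sample records

$records = [];
foreach (explode("\n", rtrim($raw, "\n")) as $line) {
    $records[] = explode(':', $line); // one DS array element per record
}

var_dump($records); // $records[0] => ['frog132', '12.37', '42']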
With that said, do you have other design ideas (this is open source development, my friend)???
rangercairns wrote: Ok, timo, the massive data test works on the new xmlservice version 1.7.4-sg4; please try it when/if you have time.
This 1.7.4-sg4 xmlservice fix does NOT require any changes to the PHP Toolkit or the customer script (Yahoo).
Thanks for the help.
Active test versions
FIX -- parsing large data works again
big data test, 5,000 elements ... improved from 40 seconds to 3 seconds
massive data timo test (16,000 XML elements) works ... improved from 60+ seconds to 15 seconds
Not done with this work yet, but try it if you would like.
I suspect 15 seconds is still longer than you would like, but this version fix is still 100% RPG code (no PASE assembler) ... and ... there may still be room for improvement in the RPG-only xmlservice implementation.
MAYBE at some point we will have to entertain a new format for "massive data", but not today. BTW -- I suspect "massive data" is completely possible (even staying within XML), but we will see what you want to do.
Customers x, y and z have 3rd party web shops (from different vendors, of course). For fast response times, these web shops query stock balance data from our iSeries ERP every hour. This means they run more than 50 web service queries every hour, each with an occurrence DS of 200 products. So they query stock balances for more than 10,000 products every hour. And this is for the "front page" views only...
When an order is created, they check the stock balances for the order to inform the user... and these orders are big... dozens of order rows... and at this point this web service is a "real time" query and it has to be fast. So an occurrence DS is the solution, and it needs to be fast.
Application --(fast)--> Compatibility Wrapper --(fast)--> XML-Toolkit --(SLOW)--> XMLSERVICE --(fast)--> RPG --(fast)--> XMLSERVICE --(fast)--> XML-Toolkit --(fast)--> Compatibility Wrapper --(fast)--> Application
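One quick way to confirm that the SLOW hop really is the toolkit-to-XMLSERVICE leg is to put a stopwatch around the toolkit call itself. A minimal sketch, assuming the standard PHP Toolkit ToolkitService API (credentials, library name, and the single char parm are placeholders; ZZSRV/ZZTON come from the test below):

<?php
// Minimal timing sketch around the toolkit call (the SLOW hop above).
// Assumes the standard PHP Toolkit ToolkitService API; credentials,
// library, and the single parameter are placeholders.
require_once 'ToolkitService.php';

$tkit = ToolkitService::getInstance('*LOCAL', 'MYUSER', 'MYPASS');
$tkit->setToolkitServiceParams(array('stateless' => true));

$param = array($tkit->AddParameterChar('both', 7, 'nn7a', 'nn7a', 'frog132'));

$start = microtime(true);
$result = $tkit->PgmCall('ZZSRV', 'MYLIB', $param, null, array('func' => 'ZZTON'));
printf("toolkit round trip: %.2f seconds\n", microtime(true) - $start);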
In this case "a ton" of complex data records will be spilled as output (xmlservice output),
but input to the ZZTON call would just be default *BLANKS and zeros (xmlservice input).
D zzton PR
D nn1p0 1p 0
D nn7a 7a
D nn8p0 8p 0
D nnDS likeds(dcTon_t) dim(TONMAX)
<cmd comment='chglibl'>CHGLIBL LIBL(xyzlibxmlservicexyz)</cmd>
<pgm name='ZZSRV' func='ZZTON'>
<parm io='in'><data type='1p0'>1</data></parm>
<parm io='in'><data type='7a'>7</data></parm>
<parm io='in'><data type='8p0'>8</data></parm>
<parm io='out'><ds var='dcTon_t' dim='999'>
<data var='s01' type='7A'>1</data>
<data var='s02' type='4p0'>2</data>
</ds></parm>
</pgm>
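The client side never has to change for any of this: walking the dim'd DS rows back out of the response is plain XML parsing. A sketch using PHP's SimpleXML ($xmlOut is a hand-built stand-in for a real response; the exact repetition shape of dim'd output can vary by xmlservice version, so the xpath is kept generic):

<?php
// Sketch: pull the dim'd DS subfields out of an xmlservice response.
// $xmlOut is a hand-built stand-in for the real response document.
$xmlOut = <<<XML
<script><pgm name='ZZSRV' func='ZZTON'>
<parm io='out'><ds var='dcTon_t' dim='999'>
<data var='s01' type='7A'>frog132</data>
<data var='s02' type='4p0'>12</data>
</ds></parm>
</pgm></script>
XML;

$rows = [];
foreach (simplexml_load_string($xmlOut)->xpath('//ds/data') as $data) {
    // group subfield values by their var name (s01, s02, ...)
    $rows[(string) $data['var']][] = (string) $data;
}
var_dump($rows); // ['s01' => ['frog132'], 's02' => ['12']]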