Working with multiple occurrence data structures

Re: Working with multiple occurrence data structures

Postby timo_karvinen on Mon Nov 12, 2012 11:00 am

I tried to run a quick test with the new version against one of our real production programs, but something went wrong.

Code:
12 Nov 2012 11:46:34,14646 i5Error: num=312 cat=9 msg="UNEXPECTED" desc="XML run loop failed (p(54) <pgm name='RMETUH' lib='*LIBL')"


Here's the debug log, but I snipped the input params out of it (roughly 3,600 rows of them).

Code:
Exec start: 2012-11-12 11:46:14
IPC: '/tmp/xtoolkit/WEBSMERX13'. Control key: *cdata *sbmjob(ZENDSVR/ZSVR_JOBD/XTOOLKIT)
Stmt: call XMLSERVICE.iPLUG1M(?,?,?,?)
Input XML: <?xml version="1.0" encoding="ISO-8859-1" ?>
<script>
<pgm name='RMETUH' lib='*LIBL'>
********** SNIP **********
</pgm>
</script>
Output XML: <?xml version="1.0" encoding="ISO-8859-1" ?>
<report>
<version>XML Toolkit 1.7.4-sg3</version>
<error>
<errnoxml>1100004</errnoxml>
<xmlerrmsg><![CDATA[XML run loop failed]]></xmlerrmsg>
<xmlhint><![CDATA[p(54) <pgm name='RMETUH' lib='*LIBL']]></xmlhint>
</error>
<error>
<errnoxml>1100007</errnoxml>
<xmlerrmsg><![CDATA[XML copy in excp]]></xmlerrmsg>
<xmlhint><![CDATA[p(351224) <ds var='PO_PriceTable' array='on']]></xmlhint>
</error>
<error>
<errnoxml>1100017</errnoxml>
<xmlerrmsg><![CDATA[XML copy in data]]></xmlerrmsg>
<xmlhint><![CDATA[p(351224)
<ds var='PO_PriceTable'>
<data var='PO_Pr_Hity' t]]></xmlhint>
</error>
<error>
<errnoxml>1100007</errnoxml>
<xmlerrmsg><![CDATA[XML copy in excp]]></xmlerrmsg>
<xmlhint><![CDATA[p(350865) <ds var='PO_ProdTable']]></xmlhint>
</error>
<error>
<errnoxml>1100017</errnoxml>
<xmlerrmsg><![CDATA[XML copy in data]]></xmlerrmsg>
<xmlhint><![CDATA[p(350865)
<data var='PO_Pr_Error' type='1a' /><data var='PO]]></xmlhint>
</error>
<error>
<errnoxml>1100007</errnoxml>
<xmlerrmsg><![CDATA[XML copy in excp]]></xmlerrmsg>
<xmlhint><![CDATA[p(77022) <ds var='PO_ProdTable' array='on']]></xmlhint>
</error>
<error>
<errnoxml>1100017</errnoxml>
<xmlerrmsg><![CDATA[XML copy in data]]></xmlerrmsg>
<xmlhint><![CDATA[p(77022)
<ds var='PO_ProdTable'>
<data var='PO_Pr_Error' ty]]></xmlhint>
</error>
<error>
<errnoxml>1100007</errnoxml>
<xmlerrmsg><![CDATA[XML copy in excp]]></xmlerrmsg>
<xmlhint><![CDATA[p(76993) <parm comment='PO_ProdTable']]></xmlhint>
</error>
<error>
<errnoxml>1100017</errnoxml>
<xmlerrmsg><![CDATA[XML copy in data]]></xmlerrmsg>
<xmlhint><![CDATA[p(76993) <ds var='PO_ProdTable' array='on'>
<ds var='PO_Prod]]></xmlhint>
</error>
<error>
<errnoxml>1100008</errnoxml>
<xmlerrmsg><![CDATA[XML copy in failed]]></xmlerrmsg>
<xmlhint><![CDATA[<pgm name='RMETUH' lib='*LIBL'>
<parm comment='PI_Caller'><]]></xmlhint>
</error>
<error>
<errnoxml>1100007</errnoxml>
<xmlerrmsg><![CDATA[XML copy in excp]]></xmlerrmsg>
<xmlhint><![CDATA[p(76993) <parm comment='PO_ProdTable']]></xmlhint>
</error>
<error>
<errnoxml>1100017</errnoxml>
<xmlerrmsg><![CDATA[XML copy in data]]></xmlerrmsg>
<xmlhint><![CDATA[p(76993) <ds var='PO_ProdTable' array='on'>
<ds var='PO_Prod]]></xmlhint>
</error>
<error>
<errnoxml>1100004</errnoxml>
<xmlerrmsg><![CDATA[XML run loop failed]]></xmlerrmsg>
<xmlhint><![CDATA[p(54) <pgm name='RMETUH' lib='*LIBL']]></xmlhint>
</error>
<jobinfo>
<jobipc>/tmp/xtoolkit/WEBSMERX13</jobipc>
<jobipcskey>011E56F1</jobipcskey>
<jobname>XTOOLKIT</jobname>
<jobuser>WEBSMERX</jobuser>
<jobnbr>145510</jobnbr>
<jobsts>*ACTIVE</jobsts>
<curuser>WEBSMERX</curuser>
<ccsid>65535</ccsid>
<dftccsid>278</dftccsid>
<paseccsid>819</paseccsid>
<langid>FIN</langid>
<cntryid>FI</cntryid>
<sbsname>ZENDSVR</sbsname>
<sbslib>ZENDSVR</sbslib>
<curlib></curlib>
<syslibl>THSYS QSYS QSYS2 QHLPSYS QUSRSYS QADM</syslibl>
<usrlibl>MRX8.M8 MERXO8K QGPL QTEMP</usrlibl>
<jobcpffind>see log scan, not error list</jobcpffind>
</jobinfo>
<joblogscan>
<joblogrec>
<jobcpf>CPF1124</jobcpf>
<jobtime><![CDATA[12.11.12  10:40:05,312976]]></jobtime>
<jobtext><![CDATA[Job 145510/WEBSMERX/XTOOLKIT started 12.11.12 at Job 145510/WEBSMERX/XTOOLKIT has been placed on the job queue.]]></jobtext>
</joblogrec>
<joblogrec>
<jobcpf>*NONE</jobcpf>
<jobtime><![CDATA[12.11.12  10:40:05,312976]]></jobtime>
<jobtext><![CDATA[CALL PGM(XMLSERVICE/XMLSERVICE)]]></jobtext>
</joblogrec>
</joblogscan>
<joblog job='XTOOLKIT' user='WEBSMERX' nbr='145510'>
<![CDATA[5722SS1 V5R4M0 060210                        Job log display                        TTH400   12.11.12 11:46:33          Page    1
  Job  . . . . . . . . . . . . :   XTOOLKIT        User  . . . . . . :   WEBSMERX     Number . . . . . . . . . . . :   145510
  Job description  . . . . . . :   ZSVR_JOBD       Library . . . . . :   ZENDSVR
MSG ID     TYPE                    SEV  DATE      TIME             FROM PGM     LIBRARY     INST     TO PGM      LIBRARY     INST
CPF1124    Information message     00   12.11.12  10:40:05,312408  QWTPIIPP     QSYS        0671     *EXT                    *N
                                     Message . . . . :   Job 145510/WEBSMERX/XTOOLKIT started 12.11.12 at
                                       10:40:05 in subsystem ZENDSVR in library ZENDSVR. The system received
                                       the job on 12.11.12 at 10:40:05.
CPI1125    Information message     00   12.11.12  10:40:05,312824  QWTPCRJA     QSYS        010F     *EXT                    *N
                                     Message . . . . :   Job 145510/WEBSMERX/XTOOLKIT has been placed on the job queue.
                                     Cause . . . . . :   Job 145510/WEBSMERX/XTOOLKIT was placed on job queue
                                       ZSVR_JOBQ in library ZENDSVR. The submitting job is 145424/QUSER/QSQSRVR.
                                       Job 145510/WEBSMERX/XTOOLKIT was started with the SBMJOB (Submit Job)
                                       command with the following job attributes: JOBPTY(5) OUTPTY(5) PRTTXT()
                                       RTGDTA(QCMDB) SYSLIBL(THSYS      QSYS       QSYS2      QHLPSYS    QUSRSYS
                                       QADM) CURLIB(*CRTDFT) INLLIBL(MRX8.M8    MERXO8K    QGPL       QTEMP)
                                       INLASPGRP(*NONE) LOG(4 00 *NOLIST) LOGCLPGM(*YES) LOGOUTPUT(*JOBEND)
                                       OUTQ(QGPL/QPRINT2) PRTDEV(QSUPPRT) INQMSGRPY(*RQD) HOLD(*NO) DATE(*SYSVAL)
                                       SWS(00000000)  MSGQ(QUSRSYS/WEBSMERX) CCSID(65535) SRTSEQ(*N/*HEX)
                                       LANGID(FIN) CNTRYID(FI) JOBMSGQMX(16) JOBMSGQFL(*WRAP) ALWMLTTHD(*NO)
                                       SPLFACN(*KEEP).
*NONE      Command message              12.11.12  10:40:05,312976  QWTSCSBJ                 *N       QCMD        QSYS        0194
                                     Message . . . . :  -CALL PGM(XMLSERVICE/XMLSERVICE)
                                       PARM('/tmp/xtoolkit/WEBSMERX13')
]]>
</joblog>
</report>

Exec end: 2012-11-12 11:46:34. Seconds to execute: 19,383512973785.


It also still took almost 20 seconds to fail.

-Timo
timo_karvinen
 
Posts: 74
Joined: Wed Aug 12, 2009 7:58 am
Location: Tampere, Finland

Re: Working with multiple occurrence data structures

Postby rangercairns on Mon Nov 12, 2012 9:03 pm

This XML input document is so big that it's tough to understand what may be wrong, so I need your whole XML input document. I can run XML input cut/pasted from debug.log without a program call using a special XMLSERVICE $ctl="*test" mode, so I don't need your target ILE RPG/COBOL program or whatever; just cut/paste and send the entire input XML to adc@us.ibm.com (or send the entire debug.log).
rangercairns
 
Posts: 215
Joined: Fri Jul 24, 2009 6:28 pm

Re: Working with multiple occurrence data structures

Postby rangercairns on Mon Nov 12, 2012 9:09 pm

Also, I hate to ask, but did you try something simple like using a bigger plug size for all this data (maybe 5M, 10M, or 15M)?

Code:
// new toolkit
try {
    $ToolkitServiceObj = ToolkitService::getInstance($database, $user, $password);
} catch (Exception $e) {
    echo $e->getMessage(), "\n";
    exit();
}
$ToolkitServiceObj->setToolkitServiceParams(array(
    'InternalKey' => $ipc,            // route to same XMLSERVICE job /tmp/myjob1
    'subsystem'   => "QGPL/QDFTJOBD", // subsystem/jobd to start XMLSERVICE (if not running)
    'plug'        => "iPLUG5M"));     // max size data i/o (iPLUG4K, 32K, 65K, 512K, 1M, 5M, 10M, 15M)

Re: Working with multiple occurrence data structures

Postby rangercairns on Mon Nov 12, 2012 9:26 pm

I want to look into your data just to see if we can get XMLSERVICE to handle this massive XML input document efficiently (fun to try), so please send it to my email (hopefully it will pass through my email filters without getting kicked out).

-- but I am very curious --

I find myself wondering why a 3,600-record/field input document would ever be considered a web application at all. I mean no disrespect, but what is the web application context for such a large amount of data running between browser and XMLSERVICE? Won't such a big data transfer be prohibitive on the web ... or perhaps need to be broken up into smaller chunks to avoid web/browser timeouts?

Re: Working with multiple occurrence data structures

Postby timo_karvinen on Tue Nov 13, 2012 9:36 am

rangercairns wrote:This XML input document is so big that it's tough to understand what may be wrong, so I need your whole XML input document. I can run XML input cut/pasted from debug.log without a program call using a special XMLSERVICE $ctl="*test" mode, so I don't need your target ILE RPG/COBOL program or whatever; just cut/paste and send the entire input XML to *** (or send the entire debug.log).


I sent the debug log to you.

-Timo

Re: Working with multiple occurrence data structures

Postby timo_karvinen on Tue Nov 13, 2012 9:50 am

rangercairns wrote:Also, I hate to ask, but did you try something simple like using a bigger plug size for all this data (maybe 5M, 10M, or 15M)?

Code:
// new toolkit
try {
    $ToolkitServiceObj = ToolkitService::getInstance($database, $user, $password);
} catch (Exception $e) {
    echo $e->getMessage(), "\n";
    exit();
}
$ToolkitServiceObj->setToolkitServiceParams(array(
    'InternalKey' => $ipc,            // route to same XMLSERVICE job /tmp/myjob1
    'subsystem'   => "QGPL/QDFTJOBD", // subsystem/jobd to start XMLSERVICE (if not running)
    'plug'        => "iPLUG5M"));     // max size data i/o (iPLUG4K, 32K, 65K, 512K, 1M, 5M, 10M, 15M)


Yes, you can actually see in the debug snippet I posted that my plug size is 1M (Stmt: call XMLSERVICE.iPLUG1M(?,?,?,?)), because the default 512K was not enough for this to run at all.
We also have some requests where even 1M isn't enough and we have to go to 5M.
That's another problem we'll have to discuss some other day :)

-Timo

Re: Working with multiple occurrence data structures

Postby timo_karvinen on Tue Nov 13, 2012 10:05 am

rangercairns wrote:I find myself wondering why a 3,600-record/field input document would ever be considered a web application at all. I mean no disrespect, but what is the web application context for such a large amount of data running between browser and XMLSERVICE? Won't such a big data transfer be prohibitive on the web ... or perhaps need to be broken up into smaller chunks to avoid web/browser timeouts?


Well, our system, which runs on i and is written in RPG, is your regular ERP system.
Given the nature of ERP systems, the data tends to be quite large.
Our PHP applications have many different ways of running the RPG business programs.
For example, we have product or customer listings in an ExtJS grid which can be edited right there on the grid, and we then have to send all the updates (possibly 200 rows at a time) to RPG to process and save to DB2.
We also have Excel import/export functionality, which actually uses a straight SQL write to file because of the time it would otherwise take to process (even with the old toolkit). Here we can have around 60,000 rows of products with about 20 fields per row. I'm not dreaming of putting XMLSERVICE through all that; I'm just saying that the "big data" we are talking about in this thread isn't actually that big in the big picture.

But the main thing we are concerned about is our web services: we have services for things like getting product prices, warehouse availability, and so on.
There are obviously third-party systems requesting this information from our ERP, and many different third-party systems can and do connect to these services at several of our customers.
Now imagine this: a third-party system (don't ask me why) gets information on 10,000 products per day. If we make that 10,000 separate web service calls and 10,000 separate program calls, it will be a whole lot slower in total than asking 50 times for 200 product records each. That may sound a bit weird, but that's what the third party does and we have to oblige. Anyway, this was just one extreme example that's actually in production use, but we have many more examples, with reasons that actually make sense, for getting up to 200 product, customer, or whatever records at once.
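[Editor's note] The batching arithmetic above can be sketched in a few lines of PHP; the `fetchPrices()` wrapper is hypothetical, standing in for whatever toolkit program call the application makes per batch.

```php
<?php
// Sketch: batch 10,000 product IDs into 50 calls of 200 products each,
// instead of making 10,000 single-product web service / program calls.
$productIds = range(1, 10000);          // example IDs, not real data
$batches = array_chunk($productIds, 200);
foreach ($batches as $batch) {
    // $prices = fetchPrices($batch);   // hypothetical: one RPG call per 200 products
}
echo count($batches), "\n";             // 50 batches instead of 10,000 calls
```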

Also, browser/third-party system timeouts were never a problem with the old toolkit; it handles these requests happily in a couple of seconds at most, depending on the load on the machine and so on.
Actually, the limit of 200 occurrences on the multiple-occurrence DSs we have in many programs comes from empirical testing of what the Easycom toolkit could handle in reasonable time; we originally had 500, but that was too slow (4-5 seconds).

-Timo

Re: Working with multiple occurrence data structures

Postby rangercairns on Tue Nov 13, 2012 5:00 pm

>>> GET (no update) ... for getting up to 200 product or customer or whatever information at once.

Yes, we expected this sort of many-record output, so XMLSERVICE has a shortcut XML syntax for arrays (dim='n') ... and Alan is currently working to implement it in the PHP toolkit; I think this will perform reasonably well for output.
Code:
<ds dim='999'>
<data type='132a' >...</data>
<data type='12p2' >...</data>
... and so on ...
</ds>
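[Editor's note] A small PHP helper makes the saving concrete: one `dim='n'` node replaces n repeated `<ds>` nodes in the input document. The field names and types below are illustrative only, not from the actual RMETUH interface.

```php
<?php
// Sketch: emit the dim='n' shortcut for an array DS instead of repeating
// the <ds> node n times. Field names/types here are made-up examples.
function dsArrayXml(int $dim, array $fields): string {
    $xml = "<ds dim='$dim'>\n";
    foreach ($fields as $name => $type) {
        $xml .= "  <data var='$name' type='$type'></data>\n";
    }
    return $xml . "</ds>\n";
}

// One compact node describes all 999 occurrences:
echo dsArrayXml(999, ['PO_Pr_Name' => '132a', 'PO_Pr_Price' => '12p2']);
```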


>>> PUT (update input) ... have to send all the updates (like possibly 200 rows at a time) to RPG to process and save to DB2.

OK, we did not expect this massive input, and we need to work on XMLSERVICE to handle this much data. Actually, the debug file you sent me has 16,429 XML nodes' worth of data, essentially a tree with 16,000 branches each leading to a leaf of data ... mmm ... this may just be way too big for our discrete XML <data></data> based design.

I will give it a try with the current design ... BUT ...

We may have to invent a new format for massive record input ... perhaps something like this ...
Code:
<template label='mybigds'>
<describe type='132a'/>
<describe type='12p2'/>
... and so on ...
</template>
<raw template='mybigds' delimit=':' eol='LF'>
frog132:12.37:...:
toad145:34512.37:...:
... and so on ...
</raw>

... where each delimited record can be quickly popped into memory in consecutive fashion (RPG DS array style).
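[Editor's note] On the client side, consuming such a payload would be a couple of string splits. This is only a sketch of the *proposed* format above (which was never finalized in this post); the sample records are the ones from the example.

```php
<?php
// Sketch: split the proposed <raw> payload into rows and fields, per the
// delimit=':' and eol='LF' attributes of the (proposed) <raw> element.
function parseRaw(string $raw, string $delimit = ':', string $eol = "\n"): array {
    $rows = [];
    foreach (explode($eol, rtrim($raw, $eol)) as $record) {
        // each record ends with a trailing delimiter, so trim it first
        $rows[] = explode($delimit, rtrim($record, $delimit));
    }
    return $rows;
}

$rows = parseRaw("frog132:12.37:\ntoad145:34512.37:\n");
print_r($rows); // two rows of two fields each
```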

Philosophy:
I would like to stay with character data (frog132:12.37:...) and avoid "binary" data transfers, because clients never have types like packed/zoned decimal ... and, from a web point of view, you can send a big string around the web using any protocol (DB2, REST GET/POST, FTP, etc.) from any language (PHP, Ruby, Perl, csh, bash, curl, etc.), which in the long run will protect your application against all manner of device proliferation (iPad, phone, PC, etc.).

With that said, do you have other design ideas (this is open source development, my friend)?

Re: Working with multiple occurrence data structures

Postby rangercairns on Tue Nov 13, 2012 5:32 pm

An update of 16,000 sounds too big ... which may be the error ... I am still looking.

adc@cairns:~/src/zmaster/zmaster_tool/tests$ grep -c '<data' test_40502_nocall_timo.phpt
1612
adc@cairns:~/src/zmaster/zmaster_tool/tests$ grep -c '<parm' test_40502_nocall_timo.phpt
15
adc@cairns:~/src/zmaster/zmaster_tool/tests$ grep -c '<ds' test_40502_nocall_timo.phpt
1803


However, <template> <raw> may still be a good idea for massive data ... so I may do that anyway (easy to do).

Re: Working with multiple occurrence data structures

Postby rangercairns on Tue Nov 13, 2012 5:43 pm

Oh nuts, my new 'bigAssist' code for parsing big data has an error; I will correct it.

Bottom line: the 1.7.4-sg3 download has an error. Well, this is why we test on the Yips testing page before moving to production on the main XMLSERVICE page ... I thank you for your test ... and your big data test will now become a standard test from now on.
