kentatzend wrote: As Tony commented, we had a set of priorities for the development of this new toolkit, and initially performance was not our highest item. But now that we've addressed a lot of the basic problems we're looking more at performance than previously.
Yes, this is also why we're here now, asking about performance for our use cases. Now that most everything else is more or less working, I'd imagine everyone wants to focus a bit more on the performance side.
kentatzend wrote: But I'm a bit worried that this is really just an artificial test case that is not going to exist in very many real world test cases. Of course I'm not 100% sure that this "assessment" by me is correct so I'd like to ask a few questions.
As both Jan and I have already mentioned several times, there's nothing artificial about this: this is how our programs/services work, and have worked, in production environments at several of our customers for a couple of years now. I can't tell you whether such cases are common out in the rest of the world, but I can tell you it's certainly the way we use most of our PHP-RPG programs/services, and the way our ERP (RPG) is designed to work with PHP applications/interfaces.
kentatzend wrote: It seems like there is a mismatch between what Jan says about the test case and the example Timo is giving us.
There is indeed a mismatch, but it's not between what Jan says and the data I have sent you; it's between what we are saying and how you are reading that information. You seem to be reading our posts as if 200 products equals 200 (XML) elements. A product, as we're talking about it here, is an actual physical product in the real world, and it has many, many properties in the database. So instead of 200 product rows equaling 200 XML elements (just one property per row), each row actually has something like 30 or maybe even 100 properties (name, description, packaging size, price without tax, price with tax, warehouse availability in several different warehouses, and so on), so 200 rows * 100 properties becomes 20000 XML elements. In reality it's even more than that, because in XML even a product row with only one property consists of several elements thanks to the wrapper elements.
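To make the arithmetic concrete, here is roughly what a single product row looks like once serialized; the element names here are made up for illustration, not our actual schema:

    <product>
      <id>12345</id>
      <name>Example widget</name>
      <price_no_tax>10.00</price_no_tax>
      <price_with_tax>12.40</price_with_tax>
      <availability>
        <warehouse id="HEL">42</warehouse>
        <warehouse id="TKU">7</warehouse>
      </availability>
      <!-- ...in practice 30-100 property elements per product... -->
    </product>

Even this stripped-down row carrying six values is already eight XML elements once you count the <product> and <availability> wrappers, so with 200 rows and a realistic property count you get into the tens of thousands of elements very quickly.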
kentatzend wrote: Quite frankly I'm wondering what application really ever needs 10K+ elements to be fetched at one time and has to do that on any sort of regular basis. I could see where you might fetch 10K elements once, or even periodically where the period is fairly long (15 minutes, hourly, etc.). The case would be for something like a local cached "catalog" of items for performance, etc. I could even see where you would fetch 10K elements in bunches (as Jan describes). But I find it hard to imagine (and I have a pretty good imagination) a scenario where a user request would need to pull all 10K+ elements and could display or process them as part of a single user request/response.
Again, our ERP solution talks to a lot of other systems: CRMs, web shops, finance systems, and so on. So it's not necessarily "a user request", as you put it; it's more likely another system requesting or inputting the data, for a multitude of different reasons. And again, Jan was talking about 10K products, not elements, which could actually mean something like 10K * 100 XML elements. And you are exactly right there: the case Jan described is a local cached "catalog" of items for performance, at a web shop on a different server, which sells items based on the inventory and product information stored in our ERP.
kentatzend wrote: Am I missing something here, or is this really just a good artificial test case that was designed to push the edge of the envelope?
Once more, with feeling :) no, this is definitely not an artificial test case, at least not for us here in Finland; these are real, in-production programs/services, and we have several of them. And I can't for the life of me imagine what on earth my/our motivation would be for making up an artificial test case like that in the first place.
-Timo