

Looping through large 900MB file

Submitted by supertester on Thu, 2014-08-21 16:25

Hi David,

I have a 900MB XML file that needs to be looped through and inserted into a database. When I unleash Magic Parser on it like I normally do, it just stops after a while: no time-out errors or anything, it just stops. I then coded an XMLReader parser for it, and it does the same, it just stops. However, it does work when I 'divide' the file in half with a global $counter and process 250,000 rows at a time. The downside is that XMLReader then needs to read the file twice, which is very slow, and I don't like XMLReader.
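
For context, the two-pass workaround described above looks roughly like the following sketch (not the poster's actual script). It assumes the feed's records are <product> elements; insertProduct() is a hypothetical stand-in for the real DB insert.

<?php
// Minimal sketch of the two-pass workaround: each run counts past the
// records an earlier run already handled, then processes the next batch.
// Assumes records are <product> elements (an assumption, not confirmed
// by the thread); insertProduct() is a hypothetical DB-insert stand-in.
function insertProduct(SimpleXMLElement $record)
{
    // real DB insert would go here
}

function processChunk($file, $skip, $limit)
{
    $reader = new XMLReader();
    $reader->open($file);
    $counter = 0;
    while ($reader->read()) {
        if ($reader->nodeType == XMLReader::ELEMENT && $reader->name == 'product') {
            $counter++;
            if ($counter <= $skip) continue;      // already handled by an earlier run
            if ($counter > $skip + $limit) break; // leave the rest for the next run
            insertProduct(simplexml_load_string($reader->readOuterXml()));
        }
    }
    $reader->close();
}

processChunk('feed.xml', 0, 250000);      // first run: records 1-250,000
processChunk('feed.xml', 250000, 250000); // second run re-reads the first half just to skip it

The second call is where the slowness comes from: XMLReader still has to scan the first 250,000 records from the top of the file before it reaches anything new.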

Now I stumbled upon your reply on this topic: http://www.magicparser.com/node/1104

"However; I have in development and you would be welcome to try out a new method of achieving this which enables you to get the current offset data of the the parser and then to jump directly to that point in the XML source in the next iteration; which has none of the overhead of the above technique - i'll email it to you to try out together with a template calling script to demonstrate usage..."

I would love to try that development version if you still have it? Or do you have any other suggestions how to tackle this feed?
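
In general terms, the offset technique quoted above could look something like the sketch below. It uses plain fseek()/ftell() rather than the unreleased Magic Parser method, and assumes a pretty-printed feed where records open with a bare <product> tag and </product> ends its own line; handleRecord() is a hypothetical stand-in for the DB insert.

<?php
// Sketch only: each run resumes from a saved byte offset instead of
// re-reading the whole file. The assumptions above (record tag names,
// one tag per line, handleRecord()) are illustrative, not from the thread.
function handleRecord($xml)
{
    // real DB insert would go here
}

function processBatch($file, $stateFile, $limit)
{
    $offset = file_exists($stateFile) ? (int)file_get_contents($stateFile) : 0;
    $fp = fopen($file, 'rb');
    fseek($fp, $offset);                       // jump straight past the records already done
    $buffer = '';
    $done = 0;
    while ($done < $limit && ($line = fgets($fp)) !== false) {
        $buffer .= $line;
        if (strpos($line, '</product>') !== false) {
            $start = strpos($buffer, '<product>');
            handleRecord(substr($buffer, $start)); // one complete record accumulated
            $buffer = '';
            $done++;
            $offset = ftell($fp);              // byte position just after this record
        }
    }
    fclose($fp);
    file_put_contents($stateFile, $offset);    // resume point for the next run
    return $done;                              // 0 means the whole file has been processed
}

processBatch('feed.xml', 'offset.txt', 250000); // call once per run

Each invocation picks up exactly where the previous one stopped, so no run ever scans the same data twice - which is the overhead the quoted method is meant to remove.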

Submitted by support on Fri, 2014-08-22 08:04

Hi,

Sure - what I'll do is package up an example using the latest version with SEEK support and forward it to you shortly...

Cheers,
David