When I execute a query that returns a very large number of rows, it
seems MySQL sends ALL the data to the client before the client can
start looping over the result set.
The behaviour is the same with e.g. Perl DBI's execute() or PHP's
mysql_query(), so it looks like it has something to do with the MySQL
client library itself rather than any particular language binding.
I was wondering if there is a way with MySQL to do some kind of
"progressive fetch", so that I do not end up with a script that uses
3 GB of memory. Is it actually something that one cannot change, that ...
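To illustrate the kind of "progressive fetch" I mean, here is a sketch in Python using the MySQLdb (mysqlclient) package, whose SSCursor class wraps the C API's mysql_use_result() instead of mysql_store_result(), so rows are streamed from the server one at a time rather than buffered in client memory. The connection parameters and table name are placeholders, and this assumes a reachable MySQL server:

```python
# Sketch only: assumes a running MySQL server and the MySQLdb
# (mysqlclient) package; host/user/db/table names are placeholders.
import MySQLdb
import MySQLdb.cursors

conn = MySQLdb.connect(
    host="localhost",
    user="user",
    passwd="secret",
    db="test",
    # SSCursor uses mysql_use_result(): rows are fetched from the
    # server as the client iterates, not buffered up front.
    cursorclass=MySQLdb.cursors.SSCursor,
)

cur = conn.cursor()
cur.execute("SELECT * FROM big_table")

# Memory use stays roughly flat no matter how many rows come back.
for row in cur:
    pass  # per-row processing would go here

cur.close()
conn.close()
```

One caveat with this mode: until every row has been fetched (or the cursor closed), the connection is tied up and cannot run other queries, and long pauses mid-iteration hold server resources.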