No database or other Oracle stuff this time. Close to Apache though.
I was testing some large downloads with WordPress, and the download of a large file stopped at around 2.6GB.

The download goes through HTTP, so my first thought was that it was a time-out in Apache or something like that. But the Apache error log showed:
"PHP Fatal error:  Maximum execution time of 30 seconds exceeded in <my home – wordpress – plugin – directory>wp-e-commerce/wp-shopping-cart.old.php on line 1179"

30 seconds? The download had stopped after more than an hour!

My blog is powered by WordPress, and therefore by PHP. I don't know much about PHP, so I just dug into forums with similar messages. In the end I found a wonderful function: set_time_limit().
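As far as I understand it, the 30 seconds come from PHP's max_execution_time setting, and calling set_time_limit(0) lifts that limit for the current request. A minimal sketch just to illustrate the idea (not from the plugin):

<?php
// Illustration only: show the configured limit, then remove it for this request.
echo ini_get('max_execution_time') . "\n"; // e.g. "30" on a default install
set_time_limit(0);                         // 0 means: no execution time limit
echo ini_get('max_execution_time') . "\n"; // now reports "0"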
I updated two lines in a function in the file wp-shopping-cart.old.php of my wp-e-commerce plugin:

function nzshpcrt_download_file() {
    global $wpdb, $user_level, $wp_rewrite;
    get_currentuserinfo();

    function readfile_chunked($filename, $retbytes = true) {
        $chunksize = 2 * (1024 * 1024); // how many bytes per chunk (2MB)
        $buffer = '';
        $cnt = 0;
        $handle = fopen($filename, 'rb');
        set_time_limit(0); // lift the 30-second maximum execution time for this request
        if ($handle === false) {
            return false;
        }
        while (!feof($handle)) {
            $buffer = fread($handle, $chunksize);
            echo $buffer;
            ob_flush(); // push the chunk out of PHP's output buffer
            flush();    // and on to the client
            if ($retbytes) {
                $cnt += strlen($buffer);
            }
        }
        $status = fclose($handle);
        if ($retbytes && $status) {
            return $cnt; // return num. bytes delivered like readfile() does.
        }
        return $status;
    }
    // ... rest of nzshpcrt_download_file() continues unchanged in the plugin file
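
For completeness, this is roughly how such a chunked download would typically be served. It is only a sketch under my own assumptions ($file_path and $file_name are made-up names), not the plugin's actual calling code:

<?php
// Sketch: send download headers, then stream the file in chunks.
$file_path = '/path/to/large-file.zip'; // made-up example path
$file_name = basename($file_path);
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $file_name . '"');
header('Content-Length: ' . filesize($file_path));
readfile_chunked($file_path); // streams the file in 2MB chunks

Reading and flushing in 2MB chunks keeps PHP's output buffer small even for multi-gigabyte files.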

I tested it, and it worked: it downloaded a 7GB file(!). I wasn't sure whether the set_time_limit(0) was in the right place, but apparently it is.
I hope I didn't break any other functions in the PHP file…
