I recently faced a problem with file transfers to my webserver:
if you have a lot of tiny files (say around 5000) to transfer, it takes a very long time, so why not compress them all into one bigger file and unzip it on the webserver?
Unfortunately I'm using Aruba, where the system() function is not allowed, so I used exec() instead:
<html>
<head>
<title>Unzip Page</title>
</head>
<body>
<?php
// archive name passed as ?filename=FILENAME.ZIP
$zip = isset($_GET['filename']) ? $_GET['filename'] : NULL;
if ($zip === NULL) {
    echo "Missing filename<br>";
    echo "usage: http://url/untar.php?filename=FILENAME.ZIP";
} else {
    // escapeshellarg() keeps the filename from being interpreted by the shell
    exec("unzip " . escapeshellarg($zip));
    echo "file unzipped: $zip<br>";
    echo "this is provided by <a href=\"http://lxphotostudio.mine.nu\">Lxphotostudio</a>";
}
?>
</body>
</html>
Actually I have to say that you could face some timeouts due to the webserver settings, so try to tune the size of the archives accordingly.
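If the unzip takes too long you can also try to raise the script's time limit before calling exec(); this is just a sketch, and on shared hosting like Aruba these settings are often locked, so it may have no effect:
<?php
// try to raise the PHP execution time limit before the long exec() call
// (often ignored or disabled on shared hosting)
set_time_limit(300);                      // 300 seconds instead of the default 30
@ini_set('max_execution_time', '300');    // alternative knob, silenced if not allowed
?>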
--
webDesign: http://www.Lxphotostudio.info
bLuELab: http://bluelab.mine.nu
I recently made a small and ugly (absolutely not elegant) script to synchronize my local Joomla files properly with the online Aruba webserver:
if [ $# -lt 1 ]
then
    echo "usage: makeTar.sh <joomla-dir>"
else
    cp -r "$1" /tmp
    # uid/gid of my Aruba account, so ownership is preserved inside the archives
    chown -R 18825551:100 "/tmp/$1"
    tar czpsvf file1.tgz -C "/tmp/$1" administrator \
        components language logs robots.txt \
        installation_done includes
    tar czpsvf file2.tgz -C "/tmp/$1" images
    tar czpsvf file3.tgz -C "/tmp/$1" \
        configuration.php CREDITS.php includes \
        INSTALL.php LICENSES.php plugins xmlrpc \
        demo.flv index2.php libraries CHANGELOG.php \
        configuration.php-dist htaccess.txt index.php \
        media templates COPYRIGHT.php modules tmp
    rm -fr "/tmp/$1"
    echo "done:"
    ls -l file*.tgz
fi
I just put it here to avoid losing it
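By the way, the archives this script produces are .tgz, not .zip, so on the server side you need a tar-based variant of the unzip page above. Something along these lines should work (same idea, just swapping the command; I haven't polished it):
<?php
// tar-based variant of the unzip page above, for the .tgz archives
$tgz = isset($_GET['filename']) ? $_GET['filename'] : NULL;
if ($tgz === NULL) {
    echo "Missing filename<br>";
    echo "usage: http://url/untar.php?filename=FILENAME.TGZ";
} else {
    // the p flag keeps the permissions prepared with chown on the local side
    exec("tar xzpf " . escapeshellarg($tgz));
    echo "file untarred: $tgz<br>";
}
?>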
NB: the strange group id and user id are my Aruba webserver ids and are necessary to keep the permissions. Even so, I noticed that you have to change the PHP file permissions by hand to 744 to make it work (my local Apache 2 uses 644 and is happy with that).
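If you cannot ssh into the webserver and run chmod yourself, the 744 fix can also be scripted in PHP. A rough sketch (the starting directory '.' is just an example, point it at the extracted Joomla tree):
<?php
// rough sketch: set every .php file under the given directory to 744 (rwxr--r--)
$it = new RecursiveIteratorIterator(new RecursiveDirectoryIterator('.'));
foreach ($it as $file) {
    if ($file->isFile() && pathinfo($file->getPathname(), PATHINFO_EXTENSION) === 'php') {
        chmod($file->getPathname(), 0744);
        echo "chmod 744 " . $file->getPathname() . "<br>";
    }
}
?>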