I am currently trying to develop a simple content management system: an internal website where my users can upload large files to the server. The site is password protected and my users won't be trying to hack into the system, so security is a non-factor (at least for this post).
At the moment I have a PHP/HTML page that allows users to upload files. These files can get quite large, up to 3 GB in size (the transfer rate is ~4 MB/s), so I was wondering what the best approach would be. Ideally I would like a progress bar to show the status of the transfer. I don't know of many options, but since my users are laboratory staff with minimal computer skills, I would like a solution that is web based.
3 GB? Whew. Whatever system you choose, whether it be MediaWiki, Tiki, or one of thousands of others, you'll need to modify the HTTP server and PHP configuration to allow such large uploads. For Apache, this means making sure your virtualhost/global config permits request bodies of that size.
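As a sketch, the relevant Apache directive is `LimitRequestBody` (0 means unlimited; the byte value below is just an illustrative ceiling a bit above 3 GB):

```apache
# In the virtualhost or global config: allow request bodies up to ~3.2 GB
LimitRequestBody 3435973836
```

On the PHP side, the matching `php.ini` settings would look something like this; the exact values are assumptions you should tune to your environment (at ~4 MB/s, a 3 GB upload takes roughly 13 minutes, hence the generous time limits):

```ini
; php.ini — allow large, slow uploads
upload_max_filesize = 3100M
post_max_size      = 3200M   ; must exceed upload_max_filesize
max_input_time     = 1800    ; seconds allowed to receive the upload
max_execution_time = 1800
```

Note that `post_max_size` has to be larger than `upload_max_filesize`, since the POST body includes form overhead in addition to the file itself.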
In one of our systems we sometimes upload huge data files. We use a Java program written by someone here; it runs for a few hours to finish the job.