Nothing to look into from a developer's perspective - as I said, it's expected behaviour:
Regular uploads work as expected [...] - it's expected behaviour that such huge files time out when uploaded over http.
You haven't done as I suggested:
Use FTP-upload plus batch-add to upload such huge files. If that doesn't work for you, post a link to one of the files you're having issues with when batch-adding.
php_value upload_max_filesize 2000M
php_value max_execution_time 10000
php_value max_input_time 10000
That's just wishful thinking: PHP is only one component of the webserver. The webserver daemon has configuration settings of its own that apply as well, and so does the OS underneath it. To give you an analogy: if you want your car to go faster, you can't just swap the speedometer for a model that shows higher speeds; you also have to modify the engine and transmission to actually give the car more power. There are several limiting factors, and you need to take care of all of them if you want your system to work reliably.
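Just to illustrate what "other limiting factors" means, here are a few of the settings that typically cap upload size besides upload_max_filesize. The directive names are real, but the values are arbitrary examples, the Apache/nginx lines assume you actually run those servers, and on shared hosting you often aren't allowed to change them at all:

# PHP itself: the whole POST body must fit, and the script needs enough memory
php_value post_max_size 2000M
php_value memory_limit 256M
# Apache: maximum request body in bytes (0 = unlimited) - may be locked down by your host
LimitRequestBody 2097152000
# nginx (if that is your webserver): belongs in nginx.conf, not .htaccess
client_max_body_size 2000m;

Even with all of these raised, routers, proxies and browser timeouts can still kill a long-running http upload, which is the point being made here.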
Uploads of 30 MB do not work reliably through http because many factors have an impact on that process. It's hard to figure out all the limitations that can get in the way (in other words: there are several things that can go wrong), basically because the process wasn't built to cater for such large files in the first place. That's why you should use the mechanism that has been designed to transfer larger files to a webserver. The method is outlined in the docs: it's FTP upload. Coppermine offers a nifty mechanism to cater for that, i.e. batch-add. There is nothing that we (as Coppermine developers) could do to circumvent this, as we have no power over the limiting factors that apply to http uploads.
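To make the suggested route a bit more concrete, a minimal sketch of what the FTP step could look like from the command line - the hostname, login and target folder are made-up examples, any graphical FTP client works just as well, and the Coppermine docs describe exactly which folder batch-add expects the files in:

# connect with your hosting account's FTP credentials
ftp ftp.example.com
# after logging in:
binary
cd public_html/gallery/albums/batch_upload
put huge_file.avi
quit

Once the file sits on the server, log into Coppermine as admin and run the batch-add files function to register the uploaded file in the gallery.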