Bug #323
Closed: File upload problem
Description
While the new version (1.4.5) permits much larger file uploads than previous versions (especially when RAM is limited), it nonetheless appears to have a problem with really large files. As far as I can tell, the entire file is written out in 1 MB chunks, which are then recombined into one large file. For a sufficiently large file, however, while the deletion of the 1 MB chunks continues unabated, the large file stops being written to after roughly the first 900 MB (the figure ranges between 881 MB and, once, a record of just over 1 GB, but generally sticks somewhere in the 900 MB range).
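For illustration only, a minimal sketch of the recombination step as the reporter describes it (write the upload in 1 MB temp chunks, then concatenate them into one target file, deleting each chunk as it is consumed). The function and file names here are hypothetical, not lighttpd's actual internals; it just shows why chunk deletion can appear to continue even if the target file stops growing, since the two operations are separate steps.

```python
import os
import tempfile

CHUNK_SIZE = 1024 * 1024  # 1 MB, matching the chunk size described above

def combine_chunks(chunk_paths, target_path):
    """Concatenate chunk files into one target file, deleting each chunk
    after it has been appended. If the append to `target` ever stalls or
    silently fails, the removals below would still proceed, matching the
    observed symptom (chunks vanish, large file stops growing)."""
    with open(target_path, "wb") as target:
        for path in chunk_paths:
            with open(path, "rb") as chunk:
                target.write(chunk.read())
            os.remove(path)  # chunk is deleted regardless of target growth

# demo: create three small stand-in "chunks" and recombine them
tmpdir = tempfile.mkdtemp()
paths = []
for i in range(3):
    p = os.path.join(tmpdir, "chunk-%d" % i)
    with open(p, "wb") as f:
        f.write(bytes([i]) * 10)
    paths.append(p)

upload = os.path.join(tmpdir, "upload")
combine_chunks(paths, upload)
print(os.path.getsize(upload))  # 30
```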
Updated by stbuehler over 16 years ago
- Status changed from New to Fixed
- Resolution set to worksforme
"into one large file": do you mean WebDAV? That is not true for CGI, FastCGI, proxy, ...
Too old anyway.
Updated by Anonymous over 16 years ago
- Status changed from Fixed to Need Feedback
- Resolution deleted (worksforme)
I have the same problem. After uploading a 168 MB file, when it starts to combine the chunks, something goes wrong and the process stops. There are no errors in the log file.
-- polkila
Updated by stbuehler over 16 years ago
aaaaaaaaaaargs.
please.
give.
details.
It works for many people, so how do you think we can fix your problem if you don't give us details?
Why don't you at least answer the question I asked?
So, again: what OS/system?
Config?
What did you do to trigger the error?
And: 1.4.18 is too old, current is 1.4.20; there have been fixes for the sendfile backend on FreeBSD, and many other fixes.
Updated by stbuehler over 16 years ago
- Status changed from Need Feedback to Invalid
- Pending changed from Yes to No
- Resolution set to worksforme
- Patch available set to No
Missing feedback.
Updated by stbuehler over 16 years ago
- Status changed from Invalid to Missing Feedback