Trouble with big files

Added by Flavio C almost 14 years ago

Hello, I have a problem. When I use Redmine to download files (e.g. from a "filesystem" repository, from the wiki as attachments, or from the Files area), I get the following error (in production.log), but only for large files:

NoMemoryError (failed to allocate memory):
/vendor/rails/actionpack/lib/action_controller/streaming.rb:86:in `write'
/vendor/rails/actionpack/lib/action_controller/streaming.rb:86:in `send_file'
/vendor/rails/actionpack/lib/action_controller/streaming.rb:84:in `open'
/vendor/rails/actionpack/lib/action_controller/streaming.rb:84:in `send_file'
[...]
/usr/bin/mongrel_rails:19:in `load'
/usr/bin/mongrel_rails:19

I'm on a hosted server, and I don't think it's a configuration error, because this happens only with large files and the rest of Redmine works perfectly.

I searched the forum but didn't find a solution. I would like to know how I can manage big files. And if a workaround is needed, is there a user-friendly one? For example, if I put these files in an FTP area or an htaccess-protected folder, I force a second login on my Redmine users. So, do you know a workaround that allows downloading big files only to users already logged in to Redmine?

Many thanks!


Replies (10)

RE: Trouble with big files - Added by Felix Schäfer almost 14 years ago

Redmine loads the complete file into memory before sending it to the client, so your server needs to be able to allocate that much memory in addition to the memory Rails and Redmine already use. You could try serving your files through WebDAV on Apache, which you could integrate with the Redmine authentication and authorization through the Redmine.pm provided for SVN integration (SVN over HTTP is just a superset of WebDAV, so it should work).
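
For illustration, here is a rough and untested sketch of the kind of vhost fragment I mean (the paths, DSN and credentials are placeholders; it assumes mod_dav and mod_perl are loaded and that the Redmine.pm shipped in Redmine's extra/svn directory is on the Perl include path):

PerlLoadModule Apache::Authn::Redmine

Alias /bigfiles /srv/bigfiles
<Location /bigfiles>
  DAV On
  AuthType Basic
  AuthName "Redmine WebDAV"
  Require valid-user
  # Redmine.pm validates the Basic auth credentials against the Redmine database
  PerlAccessHandler Apache::Authn::Redmine::access_handler
  PerlAuthenHandler Apache::Authn::Redmine::authen_handler
  RedmineDSN "DBI:mysql:database=redmine;host=localhost"
  RedmineDbUser "redmine"
  RedmineDbPass "secret"
</Location>

Note that Redmine.pm was written for repository URLs of the form /svn/<project-identifier>, so it expects a project identifier in the request path; take this as a sketch of the idea rather than a drop-in configuration.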

RE: Trouble with big files - Added by Flavio C almost 14 years ago

Thanks for the answer; excuse me if my question is too trivial, but I have never worked with Redmine.pm and WebDAV. I'm on shared hosting and have no access to httpd.conf; is your solution applicable in this case?

RE: Trouble with big files - Added by Felix Schäfer almost 14 years ago

Mmh, good question. I think you need some directives in the vhost.conf and mod_perl loaded. But even then, it's not "simple", so I'm not sure it would be the right solution for you if you have no experience with WebDAV.

RE: Trouble with big files - Added by Flavio C almost 14 years ago

I asked the hosting support whether there is a memory limit on Mongrel, and they answered that there is no limit, so maybe the error in my first post has some other cause. Now I have no idea what the cause could be; what kinds of checks can I do? Many thanks!

RE: Trouble with big files - Added by Felix Schäfer almost 14 years ago

Mmh, the limitation could come from a lot of places: available memory, the number of other processes, an OS limit on per-process memory… Anyway, at approximately what file size does the problem begin to occur?
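
If you want to check the limits yourself, here is something you can do without root access (assuming a Linux host; <mongrel-pid> is a placeholder for the PID of your Mongrel process):

[~]# ulimit -a                       # resource limits of your current shell
[~]# cat /proc/<mongrel-pid>/limits  # resource limits of the running Mongrel

Keep in mind that the hoster may apply different limits to your Mongrel process than to your shell, so the second check is the more telling one.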

RE: Trouble with big files - Added by Flavio C almost 14 years ago

Thanks again for the answer; I've done more tests, and I can't download files bigger than 8 MB. This is very frustrating: our team needs to manage files of this size with Redmine.

For the available memory:

[~]# free -m
             total       used       free     shared    buffers     cached
Mem:          7986       7375        611          0        327       3410
-/+ buffers/cache:       3637       4348
Swap:         8000          1       7999

It's a snapshot of a single moment, but there is always much more than 250 MB available.

As for the number of other processes: I'm on shared hosting, and with ps aux I see many processes, tens of them. But the situation (from the memory point of view) seems fine:

[~]# ps auxw --sort -rss
USER      PID %CPU %MEM     VSZ    RSS TTY STAT START    TIME COMMAND
mysql   31281  5.5  9.9 1232644 809872 ?   Sl   Apr29 1131:05 /usr/sbin/mysqld [...]
flxxxx  17956  0.0  1.6  262132 135168 ?   S    Apr18   10:41 /usr/bin/ruby /usr/bin/mongrel_rails start -p 12xxx -d -e production -P log/mongrel.pid
690      8145  0.0  1.6  256424 131788 ?   S    May05    1:31 /usr/bin/ruby /usr/bin/mongrel_rails start -p [...]
532     25012  0.0  1.5  249648 123020 ?   S    May04    0:10 /usr/bin/ruby /usr/bin/mongrel_rails start -p [...]
752     16568  0.0  1.4  244648 119896 ?   S    Apr18    0:15 /usr/bin/ruby /usr/bin/mongrel_rails start -d -p [...]
root     9136  0.0  1.3  145228 113380 ?   Ss   Apr15    3:27 cfenvd
864     14221  8.2  1.2  376292  98404 ?   SN   17:20    0:00 /usr/local/apache/bin/httpd -DSSL
864     14200  6.8  1.2  376284  98400 ?   SN   17:20    0:00 /usr/local/apache/bin/httpd -DSSL
864     14159  7.1  1.1  375484  97612 ?   SN   17:20    0:00 /usr/local/apache/bin/httpd -DSSL
[...]

My Mongrel process is the first mongrel_rails entry (user flxxxx): it uses about 135,000 KB, that is, 1.6% of the memory. What can I do? Thanks!

RE: Trouble with big files - Added by Felix Schäfer almost 14 years ago

Well, that's not a file size that should cause problems. Just a shot in the dark: is your maximum file size setting maybe near 8 MB? If so, try raising it and see if it makes any difference. If that doesn't work either, head over to IRC and see if anyone can help (edavis10, khalsa and rchady come to mind as people who could have an idea) and/or file a new bug, though you will need to describe how to reproduce the error, i.e. check whether you get the same errors on a fresh install.

RE: Trouble with big files - Added by Flavio C almost 14 years ago

Felix Schäfer wrote:

is your maximum file size setting maybe near 8 MB?

Do you refer to this setting? If so, it was set to 40 MB.

If so, try raising it and see if it makes any difference.

I tried, but I saw exactly the same behaviour.

Anyway, the problem occurs only in one direction: downloading. I can upload files without problems (even much bigger than 8 MB), and the uploads are correct (I can see the uploaded files in Redmine's files folder); but when I try to download them (for example as wiki attachments, though the same happens in other areas), I get the error.

If that doesn't work either, head over to IRC and see if anyone can help (edavis10, khalsa and rchady come to mind as people who could have an idea) and/or file a new bug, though you will need to describe how to reproduce the error, i.e. check whether you get the same errors on a fresh install.

Thanks for the hints, I'll try your suggestions.

RE: Trouble with big files - Added by Ronald Theile over 13 years ago

I just ran into this error too, with a 320 MB file on Redmine 0.8.6. The server had 1 GB RAM and got the "NoMemoryError (failed to allocate memory)".
Because it runs on VMware, I added another GB of memory and could then download the file.
For testing I ran top on the machine during the download and saw that the Ruby process allocated the memory but did not give it back! So with the fourth download I got the NoMemoryError again! Only a restart of the appliance helped!

I hope this error is fixed in the newer version that we'll move to in about a month.

RE: Trouble with big files - Added by Felix Schäfer over 13 years ago

Ronald Theile wrote:

For testing I ran top on the machine during the download and saw that the Ruby process allocated the memory but did not give it back! So with the fourth download I got the NoMemoryError again! Only a restart of the appliance helped!

That's a problem with the Ruby GC handling large files poorly, and that's why there's a setting in the Redmine configuration to limit file sizes. If you need to work with files this big, you're probably better off serving them through a WebDAV or SVN directory.
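
To illustrate the memory pattern (a minimal sketch in plain Ruby, not Redmine's actual code): reading a file in one go allocates a single string as large as the file, and under Ruby 1.8's conservative GC that memory often isn't returned to the OS afterwards, whereas a chunked read keeps the peak allocation at the buffer size:

# Slurping: allocates one Ruby string as large as the whole file.
def send_whole_file(path, output)
  output.write(File.read(path))
end

# Chunked: peak allocation stays around buffer_size, whatever the file size.
def send_file_in_chunks(path, output, buffer_size = 4096)
  File.open(path, 'rb') do |file|
    while (chunk = file.read(buffer_size))
      output.write(chunk)
    end
  end
end

Rails' send_file actually streams in chunks like the second variant; the catch, if I remember correctly, is that Mongrel buffers the whole response body before sending it, so the chunks end up accumulated in memory anyway.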
