I am working on a log analysis tool, and I have added support for reading
.gz log files. When I scan very large files, I run out of heap space. Here
is the part of the code that reads the file. Will this try to keep the
whole file in memory? Is there a way to avoid this?
begin
  file = Zlib::GzipReader.open(filename)
  file.each_line do |line|
    …
  end
ensure
  file.close if file
end
Sorry for the spam, but it turns out I have a corrupted log file containing
a long run of binary data with no newlines. each_line buffers until it sees
a line separator, so it tries to build one "line" larger than the available
heap space. Problem found.