
Bugfix for #15 for compressing large files
This is a potential quick fix for Issue #15, which occurs when attempting to compress files larger than ZIP64_LIMIT. `zinfo.file_size` is never initialized to the actual file size, so the determination of whether zip64 is required is made against a file size of 0. This later causes an exception to be raised as though the file size had increased during compression, because the real size is counted during compression and then written over `zinfo.file_size`.
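The failure mode described above can be sketched as follows (the sizes here are hypothetical; `ZIP64_LIMIT` is the real constant from the stdlib `zipfile` module):

```python
from zipfile import ZIP64_LIMIT  # (1 << 31) - 1, i.e. just under 2 GiB

actual_size = ZIP64_LIMIT + 1  # a file large enough to require zip64
announced_size = 0             # zinfo.file_size, never initialized

# The zip64 decision is made from the (wrong) announced size,
# so no zip64 headers are emitted...
needs_zip64 = announced_size > ZIP64_LIMIT

# ...but the bytes actually streamed exceed the limit, which later
# surfaces as an exception about the file growing during compression.
assert needs_zip64 is False
assert actual_size > ZIP64_LIMIT
```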

It is important to note that this fix may not be fully cross-platform. Different versions of Python handle the `st_size` field of the `os.stat` result differently on Windows, so that may be worth investigating further. In the short term, however, this fixes the problem on Linux, macOS, and some Windows platforms, without making things worse on the platforms where it still doesn't work.
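A quick way to sanity-check the behaviour on a given platform is to confirm that index 6 of the `os.stat` result (the `st[6]` used in the patch) matches both `stat.ST_SIZE` and the `st_size` attribute:

```python
import os
import stat
import tempfile

# os.stat() returns a tuple-like os.stat_result; index 6 is
# stat.ST_SIZE, the same value exposed as the st_size attribute.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"0123456789")  # 10 bytes
    path = f.name

try:
    st = os.stat(path)
    assert st[6] == st[stat.ST_SIZE] == st.st_size == 10
finally:
    os.remove(path)
```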

I would leave it to the maintainers to make a broader decision on whether this fix is appropriate or if a better solution would be desired.  I'm happy to help.
lesthaeghet committed Jan 28, 2016
1 parent 8abfa34 commit da7d3ed
Showing 1 changed file with 4 additions and 1 deletion.
5 changes: 4 additions & 1 deletion zipstream/__init__.py
@@ -262,7 +262,10 @@ def __write(self, filename=None, iterable=None, arcname=None, compress_type=None
         else:
             zinfo.compress_type = compress_type

-        zinfo.file_size = 0
+        if st:
+            zinfo.file_size = st[6]
+        else:
+            zinfo.file_size = 0
         zinfo.flag_bits = 0x00
         zinfo.flag_bits |= 0x08  # ZIP flag bits, bit 3 indicates presence of data descriptor
         zinfo.header_offset = self.fp.tell()  # Start of header bytes
