Description
When an archive contains many entries (for example 22k+), the library always loads all of them into memory as an internal array of entries, which leads to out-of-memory errors (adding more RAM is not a solution). The same happens when zipping a large number of files (around 7k is enough): entries are collected into an internal array.
Maybe this is simply a limitation of the library and it is not intended for such large jobs, only for zipping/unzipping a relatively small number of files. My original use case is building deployment packages with ZIP (both zipping and unzipping) in a fairly locked-down environment, so a large number of files has to be processed under tight resource restrictions.
How to reproduce
Set the PHP script's memory limit to, for example, 32 MB and try to zip/unzip a folder with 10k files; the script will die on memory usage.
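A minimal reproduction sketch, assuming the library in question is PhpZip (nelexa/zip) with `addDirRecursive()` available; the paths are placeholders:

```php
<?php
// Reproduction sketch (assumes PhpZip / nelexa/zip; adjust class and method
// names if the issue concerns a different library). Paths are placeholders.
ini_set('memory_limit', '32M');

require __DIR__ . '/vendor/autoload.php';

$zipFile = new \PhpZip\ZipFile();
// Every file under the directory becomes an entry in an internal array,
// so memory grows linearly with the number of files.
$zipFile->addDirRecursive('/path/to/folder-with-10k-files');
$zipFile->saveAsFile('/tmp/huge.zip'); // dies with "Allowed memory size of ... exhausted"
$zipFile->close();
```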
Possible Solution
This could be hard to implement because of the internal array of entries. A possible solution is to zip on the fly as each entry is added, and to read entries through a stream/iterator instead of an array. Functions like hasEntry() would then perform poorly on huge entry sets, since they would have to scan through all entries in the stream; see the sketch below.
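A rough sketch of the iterator-based reading proposed above; `readNextEntryHeader()` is a hypothetical helper used for illustration, not part of any real library API:

```php
<?php
// Hypothetical sketch of iterator-based entry reading: yield entries one at
// a time instead of materializing them all in an array.
// readNextEntryHeader() is an illustrative placeholder, not a real API.
function entries($stream): \Generator
{
    while (($entry = readNextEntryHeader($stream)) !== null) {
        yield $entry->name => $entry;
    }
}

// The trade-off mentioned above: hasEntry() degrades to a linear scan.
function hasEntry($stream, string $name): bool
{
    foreach (entries($stream) as $entryName => $_entry) {
        if ($entryName === $name) {
            return true;
        }
    }
    return false;
}
```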
Additional context
Nothing to say here.
I have this problem.
I want to zip nested folders containing thousands of small photos; eventually they add up to gigabytes.
But I get "Allowed memory size of....."
The server doesn't have enough RAM, so ini_set('memory_limit', '-1') is not an option.
Maybe zip the files to disk step by step instead of all at once? A rough sketch follows.
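One way to approximate "step by step" with PHP's built-in ZipArchive (not the library from this issue) is to close and reopen the archive every N files so finished entries are flushed to disk. This is only a workaround sketch; the paths and the batch size of 500 are arbitrary assumptions:

```php
<?php
// Workaround sketch using PHP's built-in ZipArchive: close/reopen the
// archive every 500 files so each batch is flushed to disk instead of
// accumulating in memory and open file handles.
$src = '/path/to/photos';
$dst = '/tmp/photos.zip';

$zip = new ZipArchive();
$zip->open($dst, ZipArchive::CREATE);

$files = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($src, FilesystemIterator::SKIP_DOTS)
);

$i = 0;
foreach ($files as $file) {
    if (!$file->isFile()) {
        continue;
    }
    // Store the entry under its path relative to $src.
    $entryName = substr($file->getPathname(), strlen($src) + 1);
    $zip->addFile($file->getPathname(), $entryName);

    if (++$i % 500 === 0) {
        $zip->close();    // flush this batch to disk
        $zip->open($dst); // reopen and keep appending
    }
}
$zip->close();
```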