
Is there a way, using 7-Zip, to create multi-volume archives (totaling many gigabytes) with each volume as a standalone zip, so that no file spans multiple volumes?

Scenario: a user has to upload 16 GB of images to a website, but uploads are capped at 100 MB and the site does not support multi-volume extraction, so each "volume" must be extractable individually without any broken files.

Preferably a macOS-supported program.

Thanks in advance.

Steve Reeder

1 Answer


datapacker is a tool that packs files into the minimum number of fixed-size bins. I don't know how easily you can get it on macOS. The source is here: https://github.com/jgoerzen/datapacker; a manual is available as well.

General procedure:

  1. Use datapacker to create bins (directories) not exceeding 99 MB each. Use --action=hardlink (and --deep-links if needed).

  2. Iterate over the bins (a shell sketch of this loop follows the list) and

    • pack the content of each bin individually,
    • upload the resulting archive,
    • delete the archive locally.

    Alternatively, you can pack all the bins first, upload all the archives, and only then delete them in a batch, but this requires more disk space than processing one bin at a time.

    No archive should exceed 100 MB. If the original files generally compress well, consider creating larger bins in the first place (preliminary testing is advised).

  3. Remove the bins. Since they were created with hard links, the original files are untouched.
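
For concreteness, here is a minimal bash sketch of steps 2 and 3. It assumes step 1 has already produced one subdirectory per bin under `./bins` (my choice of path, not anything datapacker mandates), uses the `zip` command that ships with macOS (7-Zip's `7z a` would work just as well if installed), and leaves the upload step as a placeholder, since that depends entirely on the target site.

```bash
#!/usr/bin/env bash
# Minimal sketch of steps 2 and 3. Assumptions: step 1 has already created
# one subdirectory per bin under ./bins, each bin zips to under 100 MB, and
# the upload step is a placeholder to be replaced with whatever the target
# site actually accepts.
set -euo pipefail
shopt -s nullglob

BINS=./bins            # bins created by datapacker in step 1 (assumed location)

upload() {
    # Placeholder: replace with the real upload command for your site.
    echo "would upload: $1"
}

# Step 2: pack, upload, and delete one bin at a time to keep disk usage low.
for bin in "$BINS"/*/; do
    name=$(basename "$bin")
    zip -r -q "$name.zip" "$bin"   # standalone zip; entries keep the bins/NNN/ prefix
    upload "$name.zip"
    rm "$name.zip"
done

# Step 3: remove the bins. They contain hard links, so the originals are untouched.
rm -r "$BINS"
```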

I guess that with --action=exec it's possible to run a script (separately for each bin) that packs and uploads on the fly; a rough sketch of such a script is below. This approach would be hard to resume reliably if something fails in the middle, so I would prefer creating directories and handling them later, as described above.
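
I have not checked how --action=exec actually hands each bin to the command it runs, so the following is only a rough stand-in: a standalone script that packs and "uploads" the single bin directory given as its first argument. The argument convention and the echo placeholder are assumptions.

```bash
#!/usr/bin/env bash
# Rough per-bin helper: packs one bin directory and uploads it immediately.
# How (or whether) datapacker's --action=exec would pass the bin to this
# script is an assumption; as written it takes the bin directory as $1, e.g.
#   ./pack-and-upload.sh bins/001
set -euo pipefail

bin=${1:?usage: $0 <bin-directory>}
name=$(basename "$bin")

zip -r -q "$name.zip" "$bin"    # standalone zip for this bin
echo "would upload: $name.zip"  # placeholder: replace with the real upload command
rm "$name.zip"
```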