That's really confusing since `pixz` exists and its "pixie" pronunciation actually works
pixz (https://github.com/vasi/pixz) is a nice parallel xz that additionally creates an index of tar files so you can decompress individual files. I wonder if dpkg could be extended to do something similar.
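A minimal sketch of that indexed workflow, with commands based on the pixz README (guarded in case pixz isn't installed; the file names are illustrative):

```shell
# Build a small tar archive to play with.
echo "hello from pixz demo" > hello.txt
tar -cf demo.tar hello.txt

if command -v pixz >/dev/null 2>&1; then
    pixz demo.tar demo.tpxz                  # parallel compress; appends a file index
    pixz -l demo.tpxz                        # list archive members using the index
    pixz -x hello.txt < demo.tpxz | tar -x   # decompress just one member, not the whole tar
fi
```

Because the index maps members to xz blocks, the single-file extraction only has to inflate the blocks containing that member.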
Related: pixz – a variant of xz that works with tar and enables (semi)efficient random access: https://github.com/vasi/pixz
Also relevant is pixz [1], which can do parallel LZMA/XZ decompression as well as tar file indexing.
[1] https://github.com/vasi/pixz
Also https://github.com/vasi/pixz, which uses a more computationally efficient yet widely supported compression algorithm (LZMA/XZ) and adds indexing, making byte-range access to large files within the archive possible (it also supports parallel decompression).
For general purposes, I like using pixz, which by comparison produces indexable output: https://github.com/vasi/pixz
Do you know if Debian is using parallelized XZ or not with apt / dpkg?
And the same for LZMA: https://github.com/vasi/pixz
(it's relatively easy to remember those commands)
Totally agree with this. As someone with a commit bit to the project, as well as a long-time user, I'd like to second the recommendation. Pixz is a terrific parallel XZ compression/expansion tool. I find it indispensable for logs and database backups. Link: https://github.com/vasi/pixz
Fish shell users can take advantage of the Extract and Compress plugins I wrote, which utilize Pixz if installed: https://github.com/justinmayer/tackle/tree/master/plugins/ex...
Bzip2 doesn't handle multiple cores as far as I'm aware, but tools such as pbzip2 can. I wrote about this some time ago: https://hackercodex.com/guide/parallel-bzip-compression/
That said, parallel XZ is even better: https://github.com/vasi/pixz
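A guarded sketch of both tools side by side (flags per the pbzip2 man page and pixz README; `-p4` pins four workers, and the file names are made up):

```shell
printf 'line one\nline two\n' > app.log

if command -v pbzip2 >/dev/null 2>&1; then
    pbzip2 -k -p4 app.log        # parallel bzip2 compress on 4 cores, keep the original
    pbzip2 -d -k -f app.log.bz2  # parallel decompress; -f overwrites app.log in place
fi

if command -v pixz >/dev/null 2>&1; then
    pixz -p 4 -i app.log -o app.log.xz   # parallel xz compress of a plain (non-tar) file
fi
```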
I've also got my version of parallel xz: https://github.com/vasi/pixz
Unlike pxz, it doesn't require large temporary files, and the xz files it produces can also be decompressed in parallel.
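A round-trip sketch of that parallel compress/decompress path (guarded; file names are illustrative):

```shell
echo "payload" > big.txt
tar -cf big.tar big.txt

if command -v pixz >/dev/null 2>&1; then
    pixz big.tar big.tpxz           # compress: blocks are written independently, no big temp file
    pixz -d big.tpxz roundtrip.tar  # decompress: independent blocks can be inflated across cores
fi
```

The block-oriented layout is what makes the decompression side parallelizable, since each xz block can be handed to a separate thread.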