mirror of
https://github.com/adamhathcock/sharpcompress.git
synced 2026-02-03 21:23:38 +00:00
Really bad performance. #313
Originally created by @SuperJMN on GitHub (Jul 8, 2018).
I'm using this code.
It takes like 10 minutes to extract 272 files from a 7-zip archive that is not special by any means. Well, it's been compressed using the Ultra setting… :)
Some of them are about 80MB in size.
What could be wrong?
Thanks!!
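(The snippet referenced as "this code" is not preserved in this mirror. A typical SharpCompress per-entry extraction loop of the kind described, as an assumed reconstruction rather than the poster's exact code, would look like:)

```csharp
// Assumed reconstruction, not the poster's exact code.
// Uses the public SharpCompress archive API.
using SharpCompress.Archives;
using SharpCompress.Common;

using (var archive = ArchiveFactory.Open("drivers.7z")) // hypothetical file name
{
    foreach (var entry in archive.Entries)
    {
        if (!entry.IsDirectory)
        {
            // Extract each entry to an output directory, keeping paths.
            entry.WriteToDirectory(@"C:\output", new ExtractionOptions
            {
                ExtractFullPath = true,
                Overwrite = true
            });
        }
    }
}
```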
@adamhathcock commented on GitHub (Jul 8, 2018):
The 7zip implementation is far from optimized. I’m sure there are some quick wins to be had from pooling byte arrays, but I haven’t looked into it.
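(A sketch of what "pooling byte arrays" can look like in .NET, using System.Buffers.ArrayPool. This illustrates the general technique only; it is not SharpCompress's internal code:)

```csharp
using System;
using System.Buffers;
using System.IO;

static class PooledCopy
{
    // Instead of allocating a fresh byte[] per copy (which pressures the GC),
    // rent a reusable buffer from the shared pool and return it when done.
    public static void Copy(Stream source, Stream destination)
    {
        byte[] buffer = ArrayPool<byte>.Shared.Rent(81920);
        try
        {
            int read;
            while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
            {
                destination.Write(buffer, 0, read);
            }
        }
        finally
        {
            // Always return the buffer so other callers can reuse it.
            ArrayPool<byte>.Shared.Return(buffer);
        }
    }
}
```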
@SuperJMN commented on GitHub (Jul 8, 2018):
Thank you. I hope it gets better. I'm using it inside this application: https://github.com/SuperJMN/Lumia-WoA-Installer (in the most recent dev branch, still not committed). I will use it to import driver packages. They're .7z files compressed at the Ultra level.
@adamhathcock commented on GitHub (Jul 9, 2018):
There is a perf improvement someone contributed that I haven't released yet that is specific for LZMA but should help everything: https://github.com/adamhathcock/sharpcompress/pull/384
I probably ought to release soon.
@SuperJMN commented on GitHub (Jul 9, 2018):
It's already merged. I hope it's published to NuGet.org soon! Thank you.
@adamhathcock commented on GitHub (Jul 9, 2018):
https://www.nuget.org/packages/sharpcompress/0.22.0
@SuperJMN commented on GitHub (Jul 9, 2018):
Wow, that's FAST. Thanks a lot! I think my tool is now ready for prime time! As soon as I add an "About" section, this wonderful project will be in the credits, for sure.
@SuperJMN commented on GitHub (Jul 14, 2018):
Hello again!
Unfortunately, the performance is still alarming. It seems the extraction is quite quick at the beginning, but performance degrades as the extraction of the files progresses. I've profiled the extraction process and this is what I've got:
Can you please take a look? I had to switch to regular .zip files instead because of this. This is a link to the file I'm testing: https://drive.google.com/open?id=1xSWWALufBbe-2H4fgF8GQChn-NKt7mQ3
Thank you!
@adamhathcock commented on GitHub (Jul 14, 2018):
Probably places where byte arrays can be pooled.
Not sure if I’ll get a chance soon as I’ve been going through a long and drawn-out series of personal issues.
@androschuk commented on GitHub (Feb 22, 2023):
Hello @adamhathcock Any updates?
In my case, there is a 7z archive with 800+ files (archive size ~9 MB, extracted folder size ~54 MB); when I call OpenEntryStream on a 300+ file, it starts to noticeably slow down. Example:
These look like related issues:
@adamhathcock commented on GitHub (Mar 1, 2023):
PRs are welcome. I've been away for personal reasons.
@Erior commented on GitHub (Mar 12, 2023):
@androschuk, try using
var reader = archive.ExtractAllEntries();
while (reader.MoveToNextEntry())
{
    if (!reader.Entry.IsDirectory)
    {
        // Do something with reader.Entry / reader.OpenEntryStream()
    }
}
@androschuk commented on GitHub (Mar 14, 2023):
@Erior Unfortunately, it doesn't help, because the main problem is inside OpenEntryStream.
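(For context on why the sequential reader was suggested: 7z archives are typically "solid", meaning many entries are packed into one compressed stream, so random access to entry N can force re-decompressing everything before N in that stream. The pattern below, a hedged illustration using public SharpCompress API calls rather than anyone's exact code, shows the per-entry access shape that can degrade progressively on a solid archive:)

```csharp
using System.IO;
using System.Linq;
using SharpCompress.Archives.SevenZip;

using (var archive = SevenZipArchive.Open("packages.7z")) // hypothetical file name
{
    // Potentially slow on a solid 7z archive: each OpenEntryStream call may
    // have to decompress the solid stream from its start up to the requested
    // entry, so total work can grow roughly quadratically with entry count.
    foreach (var entry in archive.Entries.Where(e => !e.IsDirectory))
    {
        using (var stream = entry.OpenEntryStream())
        {
            stream.CopyTo(Stream.Null); // stand-in for real per-entry work
        }
    }
}
```

The ExtractAllEntries/MoveToNextEntry loop shown earlier in the thread avoids this by walking the solid stream once, in order.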