Compare commits

...

54 Commits

Author SHA1 Message Date
Adam Hathcock
4ca1a7713e Merge pull request #1157 from adamhathcock/adam/1154-release
Merge pull request #1156 from adamhathcock/copilot/fix-sharpcompress-…
2026-01-25 11:36:59 +00:00
Adam Hathcock
9caf7be928 Revert testing 2026-01-24 10:23:02 +00:00
Adam Hathcock
bf4217fde6 Merge pull request #1156 from adamhathcock/copilot/fix-sharpcompress-archive-iteration
Fix silent iteration failure when input stream throws on Flush()
# Conflicts:
#	src/SharpCompress/packages.lock.json
2026-01-24 10:18:02 +00:00
Adam Hathcock
d5a8c37113 Merge pull request #1154 from adamhathcock/adam/1151-release
Adam/1151 release cherry pick
2026-01-23 09:31:03 +00:00
Adam Hathcock
21ce9a38e6 fix up tests 2026-01-23 09:04:55 +00:00
Adam Hathcock
7732fbb698 Merge pull request #1151 from adamhathcock/copilot/fix-entrystream-flush-issue
Fix EntryStream.Dispose() throwing NotSupportedException on non-seekable streams
2026-01-23 08:59:56 +00:00
Adam Hathcock
97879f18b6 Merge pull request #1146 from adamhathcock/adam/pr-1145-release
Merge pull request #1145 from adamhathcock/copilot/add-leaveopen-para…
2026-01-19 10:35:33 +00:00
Adam Hathcock
d74454f7e9 Merge pull request #1145 from adamhathcock/copilot/add-leaveopen-parameter-lzipstream
Add leaveOpen parameter to LZipStream and BZip2Stream
2026-01-19 09:58:10 +00:00
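The new parameter is visible in the `BZip2Stream` and `LZipStream` constructor diffs further down this page. A minimal usage sketch, not taken from the PR itself (namespaces assumed from the library's usual layout):

```csharp
using System.IO;
using SharpCompress.Compressors;
using SharpCompress.Compressors.BZip2;

using (var file = File.OpenRead("data.bz2"))
{
    using (var bzip2 = new BZip2Stream(
        file, CompressionMode.Decompress, decompressConcatenated: false, leaveOpen: true))
    {
        bzip2.CopyTo(Stream.Null);
    }
    // With leaveOpen: true the wrapper no longer disposes `file`,
    // so it can be rewound and reused here.
    file.Position = 0;
}
```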
Adam Hathcock
5c947bccc7 Merge branch 'adam/update-docs' 2026-01-07 16:18:51 +00:00
Adam Hathcock
fbdefc17c1 updates from review 2026-01-07 16:18:27 +00:00
Adam Hathcock
1425c6ff0d Merge pull request #1120 from adamhathcock/adam/update-docs
Update docs
2026-01-07 16:12:51 +00:00
Adam Hathcock
e038aea694 move old changelog 2026-01-07 16:10:55 +00:00
Adam Hathcock
87ccbf329d moved examples to USAGE 2026-01-07 15:56:38 +00:00
Adam Hathcock
9dcf384263 update for progress reporting 2026-01-07 15:30:26 +00:00
Adam Hathcock
be045c4f15 Merge pull request #1114 from adamhathcock/copilot/fix-7z-file-decompression-error
Fix async decompression of .7z files by implementing Memory<byte> ReadAsync overload
2026-01-07 08:16:51 +00:00
Adam Hathcock
fd968b3f78 Update src/SharpCompress/IO/ReadOnlySubStream.cs
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-01-06 16:33:03 +00:00
Adam Hathcock
833dd7b3a2 fix tests and fmt 2026-01-06 15:33:43 +00:00
Adam Hathcock
b9258ad496 use more ValueTask methods but types are still created because of state machine suspension 2026-01-06 15:26:49 +00:00
copilot-swe-agent[bot]
0678318dde Fix async decompression by implementing Memory<byte> ReadAsync overload
The issue was that .NET 10's ReadExactlyAsync calls the Memory<byte> overload of ReadAsync, which wasn't implemented in BufferedSubStream. This caused it to fall back to the base Stream implementation that uses synchronous reads, leading to cache state corruption.

Solution: Added ValueTask<int> ReadAsync(Memory<byte>, CancellationToken) overload for modern .NET versions.

All tests now passing including LZMA2 and Solid archives.

Co-authored-by: adamhathcock <527620+adamhathcock@users.noreply.github.com>
2026-01-06 14:18:15 +00:00
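A minimal sketch of the pattern this commit describes, using a hypothetical cache-backed stream rather than the actual BufferedSubStream source (the class and field names `_cache`, `_cacheOffset`, `_cacheLength` are invented for illustration):

```csharp
using System;
using System.IO;
using System.Threading;
using System.Threading.Tasks;

// Hypothetical cache-backed read stream. Without the Memory<byte> override
// below, .NET 10's ReadExactlyAsync falls back to Stream's base
// implementation, which routes through the synchronous Read() path.
internal sealed class CachedReadStream : Stream
{
    private readonly Stream _source;
    private readonly byte[] _cache = new byte[81920];
    private int _cacheOffset;
    private int _cacheLength;

    public CachedReadStream(Stream source) => _source = source;

    public override async ValueTask<int> ReadAsync(
        Memory<byte> buffer,
        CancellationToken cancellationToken = default
    )
    {
        if (_cacheLength == 0)
        {
            // Refill the cache asynchronously so no blocking read ever
            // runs underneath an async caller.
            _cacheOffset = 0;
            _cacheLength = await _source
                .ReadAsync(_cache, cancellationToken)
                .ConfigureAwait(false);
            if (_cacheLength == 0)
            {
                return 0;
            }
        }
        var toCopy = Math.Min(buffer.Length, _cacheLength);
        _cache.AsMemory(_cacheOffset, toCopy).CopyTo(buffer);
        _cacheOffset += toCopy;
        _cacheLength -= toCopy;
        return toCopy;
    }

    public override int Read(byte[] buffer, int offset, int count) =>
        ReadAsync(buffer.AsMemory(offset, count)).AsTask().GetAwaiter().GetResult();

    public override bool CanRead => true;
    public override bool CanSeek => false;
    public override bool CanWrite => false;
    public override long Length => throw new NotSupportedException();
    public override long Position
    {
        get => throw new NotSupportedException();
        set => throw new NotSupportedException();
    }

    public override void Flush() { }

    public override long Seek(long offset, SeekOrigin origin) =>
        throw new NotSupportedException();

    public override void SetLength(long value) => throw new NotSupportedException();

    public override void Write(byte[] buffer, int offset, int count) =>
        throw new NotSupportedException();
}
```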
copilot-swe-agent[bot]
7116c0d098 Add async support to BufferedSubStream for 7zip decompression
- Implemented ReadAsync and RefillCacheAsync methods in BufferedSubStream
- Added async test cases for SevenZipArchive (LZMA, LZMA2, Solid, BZip2, PPMd)
- Tests show LZMA, BZip2, and PPMd working correctly
- LZMA2 and Solid archives still failing with Data Error - investigating cache state management

Co-authored-by: adamhathcock <527620+adamhathcock@users.noreply.github.com>
2026-01-06 14:13:48 +00:00
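A hedged sketch of what such an async archive test could look like (xUnit assumed; the real test names and fixture file names in the repository may differ):

```csharp
using System.IO;
using System.Threading.Tasks;
using SharpCompress.Archives.SevenZip;
using Xunit;

public class SevenZipAsyncSmokeTests
{
    // Hypothetical test shape only, covering the compression methods
    // named in the commit message above.
    [Theory]
    [InlineData("7Zip.LZMA.7z")]
    [InlineData("7Zip.LZMA2.7z")]
    [InlineData("7Zip.BZip2.7z")]
    [InlineData("7Zip.PPMd.7z")]
    public async Task Entries_extract_asynchronously(string fixture)
    {
        using var archive = SevenZipArchive.Open(Path.Combine("TestArchives", fixture));
        foreach (var entry in archive.Entries)
        {
            if (entry.IsDirectory)
            {
                continue;
            }
            using var entryStream = entry.OpenEntryStream();
            // CopyToAsync exercises the async read path end to end.
            await entryStream.CopyToAsync(Stream.Null);
        }
    }
}
```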
copilot-swe-agent[bot]
2fde8436fb Initial plan 2026-01-06 14:00:03 +00:00
Adam Hathcock
61ecd6475f Merge pull request #1113 from adamhathcock/adam/fix-readonly-dispose
Fix a usage of ReadOnly that uses dispose in 7Zip
2026-01-06 13:22:50 +00:00
Adam Hathcock
64b209a772 add disposal tests and fix lzipstream 2026-01-06 13:13:34 +00:00
Adam Hathcock
48dbdbfed5 fmt 2026-01-06 12:49:26 +00:00
Adam Hathcock
cf50311b9c Skip should use framework stuff 2026-01-06 12:45:13 +00:00
Adam Hathcock
e4d8582a2a 7zip streams always want to be disposed 2026-01-06 12:42:48 +00:00
Adam Hathcock
b8e5ee45eb Merge pull request #1109 from adamhathcock/dependabot/nuget/build/SimpleExec-13.0.0
Bump SimpleExec from 12.1.0 to 13.0.0
2026-01-05 17:17:55 +00:00
Adam Hathcock
9f20a9e7d2 Merge pull request #1110 from TwanVanDongen/master
Formats.md updated to reflect additions of Ace, Arc and Arj
2026-01-05 17:14:26 +00:00
Twan
201521d814 Merge branch 'adamhathcock:master' into master 2026-01-05 18:09:55 +01:00
Twan van Dongen
18bb3cba11 Added descriptions for archives Ace, Arc and Arj 2026-01-05 18:08:53 +01:00
Adam Hathcock
af951d6f6a Merge pull request #1102 from TwanVanDongen/master
Add support for ACE archives
2026-01-05 16:37:35 +00:00
Adam Hathcock
e5fe92bf90 Merge pull request #1108 from adamhathcock/dependabot/nuget/dot-config/csharpier-1.2.5
Bump csharpier from 1.2.4 to 1.2.5
2026-01-05 16:15:12 +00:00
dependabot[bot]
b1aca7c305 Bump SimpleExec from 12.1.0 to 13.0.0
---
updated-dependencies:
- dependency-name: SimpleExec
  dependency-version: 13.0.0
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-05 09:32:51 +00:00
dependabot[bot]
c0a0cc4a44 Bump csharpier from 1.2.4 to 1.2.5
---
updated-dependencies:
- dependency-name: csharpier
  dependency-version: 1.2.5
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-05 09:32:17 +00:00
Twan van Dongen
7a49eb9e93 Archives containing encrypted content throw an exception. 2026-01-04 19:10:57 +01:00
Adam Hathcock
5aa0610882 Merge pull request #1104 from adamhathcock/copilot/fix-invalidoperationexception-rar
Fix InvalidOperationException when RAR uncompressed size exceeds header value
2026-01-04 12:18:38 +00:00
copilot-swe-agent[bot]
41ed4c8186 Add negative test case for premature stream termination
Added Rar_StreamValidation_ThrowsOnTruncatedStream test that verifies InvalidOperationException IS thrown when a RAR stream ends prematurely (position < expected length). Created TruncatedStream mock to simulate corrupted/truncated RAR files. This test validates the exception condition (_position < Length) works correctly.

Co-authored-by: adamhathcock <527620+adamhathcock@users.noreply.github.com>
2026-01-04 11:47:57 +00:00
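A minimal sketch of a truncated-stream wrapper of the kind the commit describes; the mock's actual implementation is not shown on this page, so this is an assumption-labeled stand-in:

```csharp
using System;
using System.IO;

// Hypothetical stand-in for the TruncatedStream mock described above:
// serve only the first N bytes of an inner stream, then report EOF, so
// RarStream sees _position < Length and throws InvalidOperationException.
internal sealed class TruncatedStream : Stream
{
    private readonly Stream _inner;
    private long _remaining;

    public TruncatedStream(Stream inner, long truncateAt)
    {
        _inner = inner;
        _remaining = truncateAt;
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        if (_remaining <= 0)
        {
            return 0; // premature end of stream
        }
        var read = _inner.Read(buffer, offset, (int)Math.Min(count, _remaining));
        _remaining -= read;
        return read;
    }

    public override bool CanRead => true;
    public override bool CanSeek => false;
    public override bool CanWrite => false;
    public override long Length => throw new NotSupportedException();
    public override long Position
    {
        get => throw new NotSupportedException();
        set => throw new NotSupportedException();
    }

    public override void Flush() { }

    public override long Seek(long offset, SeekOrigin origin) =>
        throw new NotSupportedException();

    public override void SetLength(long value) => throw new NotSupportedException();

    public override void Write(byte[] buffer, int offset, int count) =>
        throw new NotSupportedException();
}
```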
copilot-swe-agent[bot]
90a33ce6b0 Remove duplicate test that doesn't verify exception case
Removed Rar_StreamValidation_CorrectExceptionBehavior test as it duplicated the validation in Rar_StreamValidation_OnlyThrowsOnPrematureEnd without actually testing the exception case. The remaining test adequately validates that the fix works correctly across multiple RAR formats.

Co-authored-by: adamhathcock <527620+adamhathcock@users.noreply.github.com>
2026-01-04 11:36:25 +00:00
copilot-swe-agent[bot]
12574798e1 Update test to document exception behavior for RAR stream validation
Renamed and enhanced the test to better document the fix. Added second test (Rar_StreamValidation_CorrectExceptionBehavior) that explicitly validates the difference between old and new behavior. Tests verify that InvalidOperationException is only thrown when position < expected length (premature termination), not when position >= expected length (which can be valid for some RAR files).

Co-authored-by: adamhathcock <527620+adamhathcock@users.noreply.github.com>
2026-01-04 09:55:18 +00:00
copilot-swe-agent[bot]
83b11254db Add test for InvalidOperationException fix in RAR extraction
Added Rar_ExtractionCompletesWithoutInvalidOperationException test that verifies RAR extraction completes successfully without throwing InvalidOperationException when reading streams to EOF. The test validates the fix works across RAR, RAR5, RAR4, and RAR2 formats by reading all entries completely and ensuring no exceptions are thrown.

Co-authored-by: adamhathcock <527620+adamhathcock@users.noreply.github.com>
2026-01-04 09:33:31 +00:00
copilot-swe-agent[bot]
b25493fd29 Fix InvalidOperationException when RAR unpacks more data than header specifies
Changed validation condition from `_position != Length` to `_position < Length` in RarStream.Read() and RarStream.ReadImplAsync() to only throw when unpacking ends prematurely, not when more data is unpacked than the header specifies. This allows successful extraction of RAR files where the actual uncompressed size exceeds the header size.

Co-authored-by: adamhathcock <527620+adamhathcock@users.noreply.github.com>
2026-01-04 09:23:47 +00:00
copilot-swe-agent[bot]
bb66100486 Initial plan 2026-01-04 09:10:50 +00:00
Twan van Dongen
3ebf97dd49 MultiVolume not supported, tests provided by split archive. 2026-01-03 19:09:43 +01:00
Twan van Dongen
bfcdeb3784 Ace largefile test added 2026-01-03 18:48:54 +01:00
Twan van Dongen
feece3d788 Missed some CSharpier edits 2026-01-03 18:08:01 +01:00
Twan van Dongen
94adb77e9e Merge branch 'master' of https://github.com/TwanVanDongen/sharpcompress 2026-01-03 18:01:13 +01:00
Twan van Dongen
909d36c237 more subtle check of magic bytes for ARJ archives 2026-01-03 17:59:17 +01:00
Twan van Dongen
e1c8aa226d Add ACE archive support (read-only, stored entries) 2026-01-03 17:18:00 +01:00
Adam Hathcock
2327679f23 Merge pull request #1098 from adamhathcock/adam/remove-old-release
remove old release
2026-01-03 14:27:13 +00:00
Adam Hathcock
574d9f970c Merge pull request #1099 from adamhathcock/copilot/sub-pr-1098
Configure nuget-release workflow to validate PRs without publishing
2026-01-03 14:22:13 +00:00
copilot-swe-agent[bot]
235096a2eb Configure nuget-release.yml to run on PRs without publishing
Co-authored-by: adamhathcock <527620+adamhathcock@users.noreply.github.com>
2026-01-03 14:15:50 +00:00
copilot-swe-agent[bot]
a739fdc544 Initial plan 2026-01-03 14:13:34 +00:00
Adam Hathcock
6196e26044 also remove from sln 2026-01-03 13:57:42 +00:00
Adam Hathcock
46a4064989 remove old build 2026-01-03 13:54:51 +00:00
66 changed files with 2610 additions and 401 deletions

View File

@@ -3,11 +3,11 @@
"isRoot": true,
"tools": {
"csharpier": {
"version": "1.2.4",
"version": "1.2.5",
"commands": [
"csharpier"
],
"rollForward": false
}
}
}
}

View File

@@ -1,25 +0,0 @@
name: SharpCompress
on:
push:
branches:
- 'master'
pull_request:
types: [ opened, synchronize, reopened, ready_for_review ]
jobs:
build:
runs-on: ${{ matrix.os }}
strategy:
matrix:
os: [windows-latest, ubuntu-latest]
steps:
- uses: actions/checkout@v6
- uses: actions/setup-dotnet@v5
with:
dotnet-version: 10.0.x
- run: dotnet run --project build/build.csproj
- uses: actions/upload-artifact@v6
with:
name: ${{ matrix.os }}-sharpcompress.nupkg
path: artifacts/*

View File

@@ -7,6 +7,10 @@ on:
- 'release'
tags:
- '[0-9]+.[0-9]+.[0-9]+'
pull_request:
branches:
- 'master'
- 'release'
permissions:
contents: read
@@ -49,9 +53,9 @@ jobs:
name: ${{ matrix.os }}-nuget-package
path: artifacts/*.nupkg
# Push to NuGet.org using C# build target (Windows only)
# Push to NuGet.org using C# build target (Windows only, not on PRs)
- name: Push to NuGet
if: success() && matrix.os == 'windows-latest'
if: success() && matrix.os == 'windows-latest' && github.event_name != 'pull_request'
run: dotnet run --project build/build.csproj -- push-to-nuget
env:
NUGET_API_KEY: ${{ secrets.NUGET_API_KEY }}

View File

@@ -7,7 +7,7 @@
<PackageVersion Include="Microsoft.Bcl.AsyncInterfaces" Version="10.0.0" />
<PackageVersion Include="Microsoft.NET.Test.Sdk" Version="18.0.1" />
<PackageVersion Include="Mono.Posix.NETStandard" Version="1.0.0" />
<PackageVersion Include="SimpleExec" Version="12.1.0" />
<PackageVersion Include="SimpleExec" Version="13.0.0" />
<PackageVersion Include="System.Text.Encoding.CodePages" Version="10.0.0" />
<PackageVersion Include="System.Buffers" Version="4.6.1" />
<PackageVersion Include="System.Memory" Version="4.6.3" />

View File

@@ -10,7 +10,10 @@
| Archive Format | Compression Format(s) | Compress/Decompress | Archive API | Reader API | Writer API |
| ---------------------- | ------------------------------------------------- | ------------------- | --------------- | ---------- | ------------- |
| Rar | Rar | Decompress (1) | RarArchive | RarReader | N/A |
| Ace | None | Decompress | N/A | AceReader | N/A |
| Arc | None, Packed, Squeezed, Crunched | Decompress | N/A | ArcReader | N/A |
| Arj | None | Decompress | N/A | ArjReader | N/A |
| Rar | Rar | Decompress | RarArchive | RarReader | N/A |
| Zip (2) | None, Shrink, Reduce, Implode, DEFLATE, Deflate64, BZip2, LZMA/LZMA2, PPMd | Both | ZipArchive | ZipReader | ZipWriter |
| Tar | None | Both | TarArchive | TarReader | TarWriter (3) |
| Tar.GZip | DEFLATE | Both | TarArchive | TarReader | TarWriter (3) |

OLD_CHANGELOG.md (new file, 142 lines added)
View File

@@ -0,0 +1,142 @@
# Version Log
* [Releases](https://github.com/adamhathcock/sharpcompress/releases)
## Version 0.18
* [Now on Github releases](https://github.com/adamhathcock/sharpcompress/releases/tag/0.18)
## Version 0.17.1
* Fix - [Bug Fix for .NET Core on Windows](https://github.com/adamhathcock/sharpcompress/pull/257)
## Version 0.17.0
* New - Full LZip support! Can read and write LZip files and Tars inside LZip files. [Make LZip a first class citizen. #241](https://github.com/adamhathcock/sharpcompress/issues/241)
* New - XZ read support! Can read XZ files and Tars inside XZ files. [XZ in SharpCompress #91](https://github.com/adamhathcock/sharpcompress/issues/94)
* Fix - [Regression - zip file writing on seekable streams always assumed stream start was 0. Introduced with Zip64 writing.](https://github.com/adamhathcock/sharpcompress/issues/244)
* Fix - [Zip files with post-data descriptors can be properly skipped via decompression](https://github.com/adamhathcock/sharpcompress/issues/162)
## Version 0.16.2
* Fix [.NET 3.5 should support files and cryptography (was a regression from 0.16.0)](https://github.com/adamhathcock/sharpcompress/pull/251)
* Fix [Zip per entry compression customization wrote the wrong method into the zip archive](https://github.com/adamhathcock/sharpcompress/pull/249)
## Version 0.16.1
* Fix [Preserve compression method when getting a compressed stream](https://github.com/adamhathcock/sharpcompress/pull/235)
* Fix [RAR entry key normalization fix](https://github.com/adamhathcock/sharpcompress/issues/201)
## Version 0.16.0
* Breaking - [Progress Event Tracking rethink](https://github.com/adamhathcock/sharpcompress/pull/226)
* Update to VS2017 - [VS2017](https://github.com/adamhathcock/sharpcompress/pull/231) - Framework targets have been changed.
* New - [Add Zip64 writing](https://github.com/adamhathcock/sharpcompress/pull/211)
* [Fix invalid/mismatching Zip version flags.](https://github.com/adamhathcock/sharpcompress/issues/164) - This allows nuget/System.IO.Packaging to read zip files generated by SharpCompress
* [Fix 7Zip directory hiding](https://github.com/adamhathcock/sharpcompress/pull/215/files)
* [Verify RAR CRC headers](https://github.com/adamhathcock/sharpcompress/pull/220)
## Version 0.15.2
* [Fix invalid headers](https://github.com/adamhathcock/sharpcompress/pull/210) - fixes an issue creating large-ish zip archives that was introduced with zip64 reading.
## Version 0.15.1
* [Zip64 extending information and ZipReader](https://github.com/adamhathcock/sharpcompress/pull/206)
## Version 0.15.0
* [Add zip64 support for ZipArchive extraction](https://github.com/adamhathcock/sharpcompress/pull/205)
## Version 0.14.1
* [.NET Assemblies aren't strong named](https://github.com/adamhathcock/sharpcompress/issues/158)
* [Pkware encryption for Zip files didn't allow for multiple reads of an entry](https://github.com/adamhathcock/sharpcompress/issues/197)
* [GZip Entry couldn't be read multiple times](https://github.com/adamhathcock/sharpcompress/issues/198)
## Version 0.14.0
* [Support for LZip reading in for Tars](https://github.com/adamhathcock/sharpcompress/pull/191)
## Version 0.13.1
* [Fix null password on ReaderFactory. Fix null options on SevenZipArchive](https://github.com/adamhathcock/sharpcompress/pull/188)
* [Make PpmdProperties lazy to avoid unnecessary allocations.](https://github.com/adamhathcock/sharpcompress/pull/185)
## Version 0.13.0
* Breaking change: Big refactor of Options on API.
* 7Zip supports Deflate
## Version 0.12.4
* Forward only zip issue fix https://github.com/adamhathcock/sharpcompress/issues/160
* Try to fix frameworks again by copying targets from JSON.NET
## Version 0.12.3
* 7Zip fixes https://github.com/adamhathcock/sharpcompress/issues/73
* Maybe all profiles will work with project.json now
## Version 0.12.2
* Support Profile 259 again
## Version 0.12.1
* Support Silverlight 5
## Version 0.12.0
* .NET Core RTM!
* Bug fix for Tar long paths
## Version 0.11.6
* Bug fix for global header in Tar
* Writers now have a leaveOpen `bool` overload. They won't close streams if not-requested to.
## Version 0.11.5
* Bug fix in Skip method
## Version 0.11.4
* SharpCompress is now endian neutral (matters for Mono platforms)
* Fix for Inflate (need to change implementation)
* Fixes for RAR detection
## Version 0.11.1
* Added Cancel on IReader
* Removed .NET 2.0 support and LinqBridge dependency
## Version 0.11
* Been over a year, contains mainly fixes from contributors!
* Possible breaking change: ArchiveEncoding is UTF8 by default now.
* TAR supports writing long names using longlink
* RAR Protect Header added
## Version 0.10.3
* Finally fixed Disposal issue when creating a new archive with the Archive API
## Version 0.10.2
* Fixed Rar Header reading for invalid extended time headers.
* Windows Store assembly is now strong named
* Known issues with Long Tar names being worked on
* Updated to VS2013
* Portable targets SL5 and Windows Phone 8 (up from SL4 and WP7)
## Version 0.10.1
* Fixed 7Zip extraction performance problem
## Version 0.10:
* Added support for RAR Decryption (thanks to https://github.com/hrasyid)
* Embedded some BouncyCastle crypto classes to allow RAR Decryption and Winzip AES Decryption in Portable and Windows Store DLLs
* Built in Release (I think)

README.md (228 changed lines)
View File

@@ -4,7 +4,7 @@ SharpCompress is a compression library in pure C# for .NET Framework 4.8, .NET 8
The major feature is support for non-seekable streams so large files can be processed on the fly (i.e. download stream).
**NEW:** All I/O operations now support async/await for improved performance and scalability. See the [Async Usage](#async-usage) section below.
**NEW:** All I/O operations now support async/await for improved performance and scalability. See the [USAGE.md](USAGE.md#async-examples) for examples.
GitHub Actions Build -
[![SharpCompress](https://github.com/adamhathcock/sharpcompress/actions/workflows/dotnetcore.yml/badge.svg)](https://github.com/adamhathcock/sharpcompress/actions/workflows/dotnetcore.yml)
@@ -34,235 +34,11 @@ Hi everyone. I hope you're using SharpCompress and finding it useful. Please giv
Please do not email me directly to ask for help. If you think there is a real issue, please report it here.
## Async Usage
SharpCompress now provides full async/await support for all I/O operations, allowing for better performance and scalability in modern applications.
### Async Reading Examples
Extract entries asynchronously:
```csharp
using (Stream stream = File.OpenRead("archive.zip"))
using (var reader = ReaderFactory.Open(stream))
{
while (reader.MoveToNextEntry())
{
if (!reader.Entry.IsDirectory)
{
// Async extraction
await reader.WriteEntryToDirectoryAsync(
@"C:\temp",
new ExtractionOptions() { ExtractFullPath = true, Overwrite = true },
cancellationToken
);
}
}
}
```
Extract all entries to directory asynchronously:
```csharp
using (Stream stream = File.OpenRead("archive.tar.gz"))
using (var reader = ReaderFactory.Open(stream))
{
await reader.WriteAllToDirectoryAsync(
@"C:\temp",
new ExtractionOptions() { ExtractFullPath = true, Overwrite = true },
cancellationToken
);
}
```
Open entry stream asynchronously:
```csharp
using (var archive = ZipArchive.Open("archive.zip"))
{
foreach (var entry in archive.Entries.Where(e => !e.IsDirectory))
{
using (var entryStream = await entry.OpenEntryStreamAsync(cancellationToken))
{
// Process stream asynchronously
await entryStream.CopyToAsync(outputStream, cancellationToken);
}
}
}
```
### Async Writing Examples
Write files asynchronously:
```csharp
using (Stream stream = File.OpenWrite("output.zip"))
using (var writer = WriterFactory.Open(stream, ArchiveType.Zip, CompressionType.Deflate))
{
await writer.WriteAsync("file1.txt", fileStream, DateTime.Now, cancellationToken);
}
```
Write all files from directory asynchronously:
```csharp
using (Stream stream = File.OpenWrite("output.tar.gz"))
using (var writer = WriterFactory.Open(stream, ArchiveType.Tar, new WriterOptions(CompressionType.GZip)))
{
await writer.WriteAllAsync(@"D:\files", "*", SearchOption.AllDirectories, cancellationToken);
}
```
All async methods support `CancellationToken` for graceful cancellation of long-running operations.
## Want to contribute?
I'm always looking for help or ideas. Please submit code or email with ideas. Unfortunately, just letting me know you'd like to help is not enough because I really have no overall plan of what needs to be done. I'll definitely accept code submissions and add you as a member of the project!
## TODOs (always lots)
* RAR 5 decryption crc check support
* 7Zip writing
* Zip64 (Need writing and extend Reading)
* Multi-volume Zip support.
* ZStandard writing
## Version Log
* [Releases](https://github.com/adamhathcock/sharpcompress/releases)
### Version 0.18
* [Now on Github releases](https://github.com/adamhathcock/sharpcompress/releases/tag/0.18)
### Version 0.17.1
* Fix - [Bug Fix for .NET Core on Windows](https://github.com/adamhathcock/sharpcompress/pull/257)
### Version 0.17.0
* New - Full LZip support! Can read and write LZip files and Tars inside LZip files. [Make LZip a first class citizen. #241](https://github.com/adamhathcock/sharpcompress/issues/241)
* New - XZ read support! Can read XZ files and Tars inside XZ files. [XZ in SharpCompress #91](https://github.com/adamhathcock/sharpcompress/issues/94)
* Fix - [Regression - zip file writing on seekable streams always assumed stream start was 0. Introduced with Zip64 writing.](https://github.com/adamhathcock/sharpcompress/issues/244)
* Fix - [Zip files with post-data descriptors can be properly skipped via decompression](https://github.com/adamhathcock/sharpcompress/issues/162)
### Version 0.16.2
* Fix [.NET 3.5 should support files and cryptography (was a regression from 0.16.0)](https://github.com/adamhathcock/sharpcompress/pull/251)
* Fix [Zip per entry compression customization wrote the wrong method into the zip archive](https://github.com/adamhathcock/sharpcompress/pull/249)
### Version 0.16.1
* Fix [Preserve compression method when getting a compressed stream](https://github.com/adamhathcock/sharpcompress/pull/235)
* Fix [RAR entry key normalization fix](https://github.com/adamhathcock/sharpcompress/issues/201)
### Version 0.16.0
* Breaking - [Progress Event Tracking rethink](https://github.com/adamhathcock/sharpcompress/pull/226)
* Update to VS2017 - [VS2017](https://github.com/adamhathcock/sharpcompress/pull/231) - Framework targets have been changed.
* New - [Add Zip64 writing](https://github.com/adamhathcock/sharpcompress/pull/211)
* [Fix invalid/mismatching Zip version flags.](https://github.com/adamhathcock/sharpcompress/issues/164) - This allows nuget/System.IO.Packaging to read zip files generated by SharpCompress
* [Fix 7Zip directory hiding](https://github.com/adamhathcock/sharpcompress/pull/215/files)
* [Verify RAR CRC headers](https://github.com/adamhathcock/sharpcompress/pull/220)
### Version 0.15.2
* [Fix invalid headers](https://github.com/adamhathcock/sharpcompress/pull/210) - fixes an issue creating large-ish zip archives that was introduced with zip64 reading.
### Version 0.15.1
* [Zip64 extending information and ZipReader](https://github.com/adamhathcock/sharpcompress/pull/206)
### Version 0.15.0
* [Add zip64 support for ZipArchive extraction](https://github.com/adamhathcock/sharpcompress/pull/205)
### Version 0.14.1
* [.NET Assemblies aren't strong named](https://github.com/adamhathcock/sharpcompress/issues/158)
* [Pkware encryption for Zip files didn't allow for multiple reads of an entry](https://github.com/adamhathcock/sharpcompress/issues/197)
* [GZip Entry couldn't be read multiple times](https://github.com/adamhathcock/sharpcompress/issues/198)
### Version 0.14.0
* [Support for LZip reading in for Tars](https://github.com/adamhathcock/sharpcompress/pull/191)
### Version 0.13.1
* [Fix null password on ReaderFactory. Fix null options on SevenZipArchive](https://github.com/adamhathcock/sharpcompress/pull/188)
* [Make PpmdProperties lazy to avoid unnecessary allocations.](https://github.com/adamhathcock/sharpcompress/pull/185)
### Version 0.13.0
* Breaking change: Big refactor of Options on API.
* 7Zip supports Deflate
### Version 0.12.4
* Forward only zip issue fix https://github.com/adamhathcock/sharpcompress/issues/160
* Try to fix frameworks again by copying targets from JSON.NET
### Version 0.12.3
* 7Zip fixes https://github.com/adamhathcock/sharpcompress/issues/73
* Maybe all profiles will work with project.json now
### Version 0.12.2
* Support Profile 259 again
### Version 0.12.1
* Support Silverlight 5
### Version 0.12.0
* .NET Core RTM!
* Bug fix for Tar long paths
### Version 0.11.6
* Bug fix for global header in Tar
* Writers now have a leaveOpen `bool` overload. They won't close streams if not-requested to.
### Version 0.11.5
* Bug fix in Skip method
### Version 0.11.4
* SharpCompress is now endian neutral (matters for Mono platforms)
* Fix for Inflate (need to change implementation)
* Fixes for RAR detection
### Version 0.11.1
* Added Cancel on IReader
* Removed .NET 2.0 support and LinqBridge dependency
### Version 0.11
* Been over a year, contains mainly fixes from contributors!
* Possible breaking change: ArchiveEncoding is UTF8 by default now.
* TAR supports writing long names using longlink
* RAR Protect Header added
### Version 0.10.3
* Finally fixed Disposal issue when creating a new archive with the Archive API
### Version 0.10.2
* Fixed Rar Header reading for invalid extended time headers.
* Windows Store assembly is now strong named
* Known issues with Long Tar names being worked on
* Updated to VS2013
* Portable targets SL5 and Windows Phone 8 (up from SL4 and WP7)
### Version 0.10.1
* Fixed 7Zip extraction performance problem
### Version 0.10:
* Added support for RAR Decryption (thanks to https://github.com/hrasyid)
* Embedded some BouncyCastle crypto classes to allow RAR Decryption and Winzip AES Decryption in Portable and Windows Store DLLs
* Built in Release (I think)
## Notes
XZ implementation based on: https://github.com/sambott/XZ.NET by @sambott

View File

@@ -20,7 +20,7 @@ Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "Config", "Config", "{CDB425
.editorconfig = .editorconfig
Directory.Packages.props = Directory.Packages.props
NuGet.config = NuGet.config
.github\workflows\dotnetcore.yml = .github\workflows\dotnetcore.yml
.github\workflows\nuget-release.yml = .github\workflows\nuget-release.yml
USAGE.md = USAGE.md
README.md = README.md
FORMATS.md = FORMATS.md

View File

@@ -113,38 +113,26 @@ using (var archive = RarArchive.Open("Test.rar"))
}
```
### Extract solid Rar or 7Zip archives with manual progress reporting
### Extract solid Rar or 7Zip archives with progress reporting
`ExtractAllEntries` only works for solid archives (Rar) or 7Zip archives. For optimal performance with these archive types, use this method:
```C#
using (var archive = RarArchive.Open("archive.rar")) // Must be solid Rar or 7Zip
using SharpCompress.Common;
using SharpCompress.Readers;
var progress = new Progress<ProgressReport>(report =>
{
if (archive.IsSolid || archive.Type == ArchiveType.SevenZip)
Console.WriteLine($"Extracting {report.EntryPath}: {report.PercentComplete}%");
});
using (var archive = RarArchive.Open("archive.rar", new ReaderOptions { Progress = progress })) // Must be solid Rar or 7Zip
{
archive.WriteToDirectory(@"D:\output", new ExtractionOptions()
{
// Calculate total size for progress reporting
double totalSize = archive.Entries.Where(e => !e.IsDirectory).Sum(e => e.Size);
long completed = 0;
using (var reader = archive.ExtractAllEntries())
{
while (reader.MoveToNextEntry())
{
if (!reader.Entry.IsDirectory)
{
reader.WriteEntryToDirectory(@"D:\output", new ExtractionOptions()
{
ExtractFullPath = true,
Overwrite = true
});
completed += reader.Entry.Size;
double progress = completed / totalSize;
Console.WriteLine($"Progress: {progress:P}");
}
}
}
}
ExtractFullPath = true,
Overwrite = true
});
}
```

View File

@@ -16,9 +16,9 @@
},
"SimpleExec": {
"type": "Direct",
"requested": "[12.1.0, )",
"resolved": "12.1.0",
"contentHash": "PcCSAlMcKr5yTd571MgEMoGmoSr+omwziq2crB47lKP740lrmjuBocAUXHj+Q6LR6aUDFyhszot2wbtFJTClkA=="
"requested": "[13.0.0, )",
"resolved": "13.0.0",
"contentHash": "zcCR1pupa1wI1VqBULRiQKeHKKZOuJhi/K+4V5oO+rHJZlaOD53ViFo1c3PavDoMAfSn/FAXGAWpPoF57rwhYg=="
}
}
}

View File

@@ -0,0 +1,61 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
namespace SharpCompress.Common.Ace
{
public class AceCrc
{
// CRC-32 lookup table (standard polynomial 0xEDB88320, reflected)
private static readonly uint[] Crc32Table = GenerateTable();
private static uint[] GenerateTable()
{
var table = new uint[256];
for (int i = 0; i < 256; i++)
{
uint crc = (uint)i;
for (int j = 0; j < 8; j++)
{
if ((crc & 1) != 0)
crc = (crc >> 1) ^ 0xEDB88320u;
else
crc >>= 1;
}
table[i] = crc;
}
return table;
}
/// <summary>
/// Calculate ACE CRC-32 checksum.
/// ACE CRC-32 uses standard CRC-32 polynomial (0xEDB88320, reflected)
/// with init=0xFFFFFFFF but NO final XOR.
/// </summary>
public static uint AceCrc32(ReadOnlySpan<byte> data)
{
uint crc = 0xFFFFFFFFu;
foreach (byte b in data)
{
crc = (crc >> 8) ^ Crc32Table[(crc ^ b) & 0xFF];
}
return crc; // No final XOR for ACE
}
/// <summary>
/// ACE CRC-16 is the lower 16 bits of the ACE CRC-32.
/// </summary>
public static ushort AceCrc16(ReadOnlySpan<byte> data)
{
return (ushort)(AceCrc32(data) & 0xFFFF);
}
}
}
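A short usage sketch of the `AceCrc` helpers added above, mirroring the header validation performed in `AceHeader.ReadHeader` later on this page (input bytes are illustrative only):

```csharp
using System;
using System.Text;
using SharpCompress.Common.Ace;

byte[] body = Encoding.ASCII.GetBytes("**ACE**");
uint crc32 = AceCrc.AceCrc32(body);   // init 0xFFFFFFFF, no final XOR
ushort crc16 = AceCrc.AceCrc16(body); // low 16 bits of the CRC-32

Console.WriteLine($"{crc32:X8} {crc16:X4}");
// A stored header CRC would be compared against crc16, as in ReadHeader:
// if (AceCrc.AceCrc16(body) != HeaderCrc) throw new InvalidDataException(...);
```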

View File

@@ -0,0 +1,68 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using SharpCompress.Common.Ace.Headers;
namespace SharpCompress.Common.Ace
{
public class AceEntry : Entry
{
private readonly AceFilePart _filePart;
internal AceEntry(AceFilePart filePart)
{
_filePart = filePart;
}
public override long Crc
{
get
{
if (_filePart == null)
{
return 0;
}
return _filePart.Header.Crc32;
}
}
public override string? Key => _filePart?.Header.Filename;
public override string? LinkTarget => null;
public override long CompressedSize => _filePart?.Header.PackedSize ?? 0;
public override CompressionType CompressionType
{
get
{
if (_filePart.Header.CompressionType == Headers.CompressionType.Stored)
{
return CompressionType.None;
}
return CompressionType.AceLZ77;
}
}
public override long Size => _filePart?.Header.OriginalSize ?? 0;
public override DateTime? LastModifiedTime => _filePart.Header.DateTime;
public override DateTime? CreatedTime => null;
public override DateTime? LastAccessedTime => null;
public override DateTime? ArchivedTime => null;
public override bool IsEncrypted => _filePart.Header.IsFileEncrypted;
public override bool IsDirectory => _filePart.Header.IsDirectory;
public override bool IsSplitAfter => false;
internal override IEnumerable<FilePart> Parts => _filePart.Empty();
}
}

View File

@@ -0,0 +1,52 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using SharpCompress.Common.Ace.Headers;
using SharpCompress.IO;
namespace SharpCompress.Common.Ace
{
public class AceFilePart : FilePart
{
private readonly Stream _stream;
internal AceFileHeader Header { get; set; }
internal AceFilePart(AceFileHeader localAceHeader, Stream seekableStream)
: base(localAceHeader.ArchiveEncoding)
{
_stream = seekableStream;
Header = localAceHeader;
}
internal override string? FilePartName => Header.Filename;
internal override Stream GetCompressedStream()
{
if (_stream != null)
{
Stream compressedStream;
switch (Header.CompressionType)
{
case Headers.CompressionType.Stored:
compressedStream = new ReadOnlySubStream(
_stream,
Header.DataStartPosition,
Header.PackedSize
);
break;
default:
throw new NotSupportedException(
"CompressionMethod: " + Header.CompressionQuality
);
}
return compressedStream;
}
return _stream.NotNull();
}
internal override Stream? GetRawStream() => _stream;
}
}

View File

@@ -0,0 +1,35 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using SharpCompress.Common.Arj;
using SharpCompress.Readers;
namespace SharpCompress.Common.Ace
{
public class AceVolume : Volume
{
public AceVolume(Stream stream, ReaderOptions readerOptions, int index = 0)
: base(stream, readerOptions, index) { }
public override bool IsFirstVolume
{
get { return true; }
}
/// <summary>
/// ArjArchive is part of a multi-part archive.
/// </summary>
public override bool IsMultiVolume
{
get { return false; }
}
internal IEnumerable<AceFilePart> GetVolumeFileParts()
{
return new List<AceFilePart>();
}
}
}

View File

@@ -0,0 +1,171 @@
using System;
using System.Buffers.Binary;
using System.Collections.Generic;
using System.IO;
using System.Xml.Linq;
using SharpCompress.Common.Arc;
namespace SharpCompress.Common.Ace.Headers
{
/// <summary>
/// ACE file entry header
/// </summary>
public sealed class AceFileHeader : AceHeader
{
public long DataStartPosition { get; private set; }
public long PackedSize { get; set; }
public long OriginalSize { get; set; }
public DateTime DateTime { get; set; }
public int Attributes { get; set; }
public uint Crc32 { get; set; }
public CompressionType CompressionType { get; set; }
public CompressionQuality CompressionQuality { get; set; }
public ushort Parameters { get; set; }
public string Filename { get; set; } = string.Empty;
public List<byte> Comment { get; set; } = new();
/// <summary>
/// File data offset in the archive
/// </summary>
public ulong DataOffset { get; set; }
public bool IsDirectory => (Attributes & 0x10) != 0;
public bool IsContinuedFromPrev =>
(HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.CONTINUED_PREV) != 0;
public bool IsContinuedToNext =>
(HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.CONTINUED_NEXT) != 0;
public int DictionarySize
{
get
{
int bits = Parameters & 0x0F;
return bits < 10 ? 1024 : 1 << bits;
}
}
public AceFileHeader(ArchiveEncoding archiveEncoding)
: base(archiveEncoding, AceHeaderType.FILE) { }
/// <summary>
/// Reads the next file entry header from the stream.
/// Returns null if no more entries or end of archive.
/// Supports both ACE 1.0 and ACE 2.0 formats.
/// </summary>
public override AceHeader? Read(Stream stream)
{
var headerData = ReadHeader(stream);
if (headerData.Length == 0)
{
return null;
}
int offset = 0;
// Header type (1 byte)
HeaderType = headerData[offset++];
// Skip recovery record headers (ACE 2.0 feature)
if (HeaderType == (byte)SharpCompress.Common.Ace.Headers.AceHeaderType.RECOVERY32)
{
// Skip to next header
return null;
}
if (HeaderType != (byte)SharpCompress.Common.Ace.Headers.AceHeaderType.FILE)
{
// Unknown header type - skip
return null;
}
// Header flags (2 bytes)
HeaderFlags = BitConverter.ToUInt16(headerData, offset);
offset += 2;
// Packed size (4 bytes)
PackedSize = BitConverter.ToUInt32(headerData, offset);
offset += 4;
// Original size (4 bytes)
OriginalSize = BitConverter.ToUInt32(headerData, offset);
offset += 4;
// File date/time in DOS format (4 bytes)
var dosDateTime = BitConverter.ToUInt32(headerData, offset);
DateTime = ConvertDosDateTime(dosDateTime);
offset += 4;
// File attributes (4 bytes)
Attributes = (int)BitConverter.ToUInt32(headerData, offset);
offset += 4;
// CRC32 (4 bytes)
Crc32 = BitConverter.ToUInt32(headerData, offset);
offset += 4;
// Compression type (1 byte)
byte compressionType = headerData[offset++];
CompressionType = GetCompressionType(compressionType);
// Compression quality/parameter (1 byte)
byte compressionQuality = headerData[offset++];
CompressionQuality = GetCompressionQuality(compressionQuality);
// Parameters (2 bytes)
Parameters = BitConverter.ToUInt16(headerData, offset);
offset += 2;
// Reserved (2 bytes) - skip
offset += 2;
// Filename length (2 bytes)
var filenameLength = BitConverter.ToUInt16(headerData, offset);
offset += 2;
// Filename
if (offset + filenameLength <= headerData.Length)
{
Filename = ArchiveEncoding.Decode(headerData, offset, filenameLength);
offset += filenameLength;
}
// Handle comment if present
if ((HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.COMMENT) != 0)
{
// Comment length (2 bytes)
if (offset + 2 <= headerData.Length)
{
ushort commentLength = BitConverter.ToUInt16(headerData, offset);
offset += 2 + commentLength; // Skip comment
}
}
// Store the data start position
DataStartPosition = stream.Position;
return this;
}
public CompressionType GetCompressionType(byte value) =>
value switch
{
0 => CompressionType.Stored,
1 => CompressionType.Lz77,
2 => CompressionType.Blocked,
_ => CompressionType.Unknown,
};
public CompressionQuality GetCompressionQuality(byte value) =>
value switch
{
0 => CompressionQuality.None,
1 => CompressionQuality.Fastest,
2 => CompressionQuality.Fast,
3 => CompressionQuality.Normal,
4 => CompressionQuality.Good,
5 => CompressionQuality.Best,
_ => CompressionQuality.Unknown,
};
}
}

View File

@@ -0,0 +1,153 @@
using System;
using System.IO;
using SharpCompress.Common.Arj.Headers;
using SharpCompress.Crypto;
namespace SharpCompress.Common.Ace.Headers
{
/// <summary>
/// Header type constants
/// </summary>
public enum AceHeaderType
{
MAIN = 0,
FILE = 1,
RECOVERY32 = 2,
RECOVERY64A = 3,
RECOVERY64B = 4,
}
public abstract class AceHeader
{
// ACE signature: bytes at offset 7 should be "**ACE**"
private static readonly byte[] AceSignature =
[
(byte)'*',
(byte)'*',
(byte)'A',
(byte)'C',
(byte)'E',
(byte)'*',
(byte)'*',
];
public AceHeader(ArchiveEncoding archiveEncoding, AceHeaderType type)
{
AceHeaderType = type;
ArchiveEncoding = archiveEncoding;
}
public ArchiveEncoding ArchiveEncoding { get; }
public AceHeaderType AceHeaderType { get; }
public ushort HeaderFlags { get; set; }
public ushort HeaderCrc { get; set; }
public ushort HeaderSize { get; set; }
public byte HeaderType { get; set; }
public bool IsFileEncrypted =>
(HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.FILE_ENCRYPTED) != 0;
public bool Is64Bit =>
(HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.MEMORY_64BIT) != 0;
public bool IsSolid =>
(HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.SOLID_MAIN) != 0;
public bool IsMultiVolume =>
(HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.MULTIVOLUME) != 0;
public abstract AceHeader? Read(Stream reader);
public byte[] ReadHeader(Stream stream)
{
// Read header CRC (2 bytes) and header size (2 bytes)
var headerBytes = new byte[4];
if (stream.Read(headerBytes, 0, 4) != 4)
{
return Array.Empty<byte>();
}
HeaderCrc = BitConverter.ToUInt16(headerBytes, 0); // CRC for validation
HeaderSize = BitConverter.ToUInt16(headerBytes, 2);
if (HeaderSize == 0)
{
return Array.Empty<byte>();
}
// Read the header data
var body = new byte[HeaderSize];
if (stream.Read(body, 0, HeaderSize) != HeaderSize)
{
return Array.Empty<byte>();
}
// Verify crc
var checksum = AceCrc.AceCrc16(body);
if (checksum != HeaderCrc)
{
throw new InvalidDataException("Header checksum is invalid");
}
return body;
}
public static bool IsArchive(Stream stream)
{
// ACE files have a specific signature
// First two bytes are typically 0x60 0xEA (signature bytes)
// At offset 7, there should be "**ACE**" (7 bytes)
var bytes = new byte[14];
if (stream.Read(bytes, 0, 14) != 14)
{
return false;
}
// Check for "**ACE**" at offset 7
return CheckMagicBytes(bytes, 7);
}
protected static bool CheckMagicBytes(byte[] headerBytes, int offset)
{
// Check for "**ACE**" at specified offset
for (int i = 0; i < AceSignature.Length; i++)
{
if (headerBytes[offset + i] != AceSignature[i])
{
return false;
}
}
return true;
}
protected DateTime ConvertDosDateTime(uint dosDateTime)
{
try
{
int second = (int)(dosDateTime & 0x1F) * 2;
int minute = (int)((dosDateTime >> 5) & 0x3F);
int hour = (int)((dosDateTime >> 11) & 0x1F);
int day = (int)((dosDateTime >> 16) & 0x1F);
int month = (int)((dosDateTime >> 21) & 0x0F);
int year = (int)((dosDateTime >> 25) & 0x7F) + 1980;
if (
day < 1
|| day > 31
|| month < 1
|| month > 12
|| hour > 23
|| minute > 59
|| second > 59
)
{
return DateTime.MinValue;
}
return new DateTime(year, month, day, hour, minute, second);
}
catch
{
return DateTime.MinValue;
}
}
}
}

View File

@@ -0,0 +1,97 @@
using System;
using System.Buffers.Binary;
using System.Collections.Generic;
using System.IO;
using SharpCompress.Common.Ace.Headers;
using SharpCompress.Common.Zip.Headers;
using SharpCompress.Crypto;
namespace SharpCompress.Common.Ace.Headers
{
/// <summary>
/// ACE main archive header
/// </summary>
public sealed class AceMainHeader : AceHeader
{
public byte ExtractVersion { get; set; }
public byte CreatorVersion { get; set; }
public HostOS HostOS { get; set; }
public byte VolumeNumber { get; set; }
public DateTime DateTime { get; set; }
public string Advert { get; set; } = string.Empty;
public List<byte> Comment { get; set; } = new();
public byte AceVersion { get; private set; }
public AceMainHeader(ArchiveEncoding archiveEncoding)
: base(archiveEncoding, AceHeaderType.MAIN) { }
/// <summary>
/// Reads the main archive header from the stream.
/// Returns header if this is a valid ACE archive.
/// Supports both ACE 1.0 and ACE 2.0 formats.
/// </summary>
public override AceHeader? Read(Stream stream)
{
var headerData = ReadHeader(stream);
if (headerData.Length == 0)
{
return null;
}
int offset = 0;
// Header type should be 0 for main header
if (headerData[offset++] != HeaderType)
{
return null;
}
// Header flags (2 bytes)
HeaderFlags = BitConverter.ToUInt16(headerData, offset);
offset += 2;
// Skip signature "**ACE**" (7 bytes)
if (!CheckMagicBytes(headerData, offset))
{
throw new InvalidDataException("Invalid ACE archive signature.");
}
offset += 7;
// ACE version (1 byte) - 10 for ACE 1.0, 20 for ACE 2.0
AceVersion = headerData[offset++];
ExtractVersion = headerData[offset++];
// Host OS (1 byte)
if (offset < headerData.Length)
{
var hostOsByte = headerData[offset++];
HostOS = hostOsByte <= 11 ? (HostOS)hostOsByte : HostOS.Unknown;
}
// Volume number (1 byte)
VolumeNumber = headerData[offset++];
// Creation date/time (4 bytes)
var dosDateTime = BitConverter.ToUInt32(headerData, offset);
DateTime = ConvertDosDateTime(dosDateTime);
offset += 4;
// Reserved fields (8 bytes)
if (offset + 8 <= headerData.Length)
{
offset += 8;
}
// Skip additional fields based on flags
// Handle comment if present
if ((HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.COMMENT) != 0)
{
if (offset + 2 <= headerData.Length)
{
ushort commentLength = BitConverter.ToUInt16(headerData, offset);
offset += 2 + commentLength;
}
}
return this;
}
}
}

View File

@@ -0,0 +1,16 @@
namespace SharpCompress.Common.Ace.Headers
{
/// <summary>
/// Compression quality
/// </summary>
public enum CompressionQuality
{
None,
Fastest,
Fast,
Normal,
Good,
Best,
Unknown,
}
}

View File

@@ -0,0 +1,13 @@
namespace SharpCompress.Common.Ace.Headers
{
/// <summary>
/// Compression types
/// </summary>
public enum CompressionType
{
Stored,
Lz77,
Blocked,
Unknown,
}
}

View File

@@ -0,0 +1,33 @@
namespace SharpCompress.Common.Ace.Headers
{
/// <summary>
/// Header flags (main + file, overlapping meanings)
/// </summary>
public static class HeaderFlags
{
// Shared / low bits
public const ushort ADDSIZE = 0x0001; // extra size field present
public const ushort COMMENT = 0x0002; // comment present
public const ushort MEMORY_64BIT = 0x0004;
public const ushort AV_STRING = 0x0008; // AV string present
public const ushort SOLID = 0x0010; // solid file
public const ushort LOCKED = 0x0020;
public const ushort PROTECTED = 0x0040;
// Main header specific
public const ushort V20FORMAT = 0x0100;
public const ushort SFX = 0x0200;
public const ushort LIMITSFXJR = 0x0400;
public const ushort MULTIVOLUME = 0x0800;
public const ushort ADVERT = 0x1000;
public const ushort RECOVERY = 0x2000;
public const ushort LOCKED_MAIN = 0x4000;
public const ushort SOLID_MAIN = 0x8000;
// File header specific (same bits, different meaning)
public const ushort NTSECURITY = 0x0400;
public const ushort CONTINUED_PREV = 0x1000;
public const ushort CONTINUED_NEXT = 0x2000;
public const ushort FILE_ENCRYPTED = 0x4000; // file encrypted (file header)
}
}

View File

@@ -0,0 +1,22 @@
namespace SharpCompress.Common.Ace.Headers
{
/// <summary>
/// Host OS type
/// </summary>
public enum HostOS
{
MsDos = 0,
Os2,
Windows,
Unix,
MacOs,
WinNt,
Primos,
AppleGs,
Atari,
Vax,
Amiga,
Next,
Unknown,
}
}

View File

@@ -9,4 +9,5 @@ public enum ArchiveType
GZip,
Arc,
Arj,
Ace,
}

View File

@@ -34,14 +34,13 @@ namespace SharpCompress.Common.Arj.Headers
public byte[] ReadHeader(Stream stream)
{
// check for magic bytes
Span<byte> magic = stackalloc byte[2];
var magic = new byte[2];
if (stream.Read(magic) != 2)
{
return Array.Empty<byte>();
}
var magicValue = (ushort)(magic[0] | magic[1] << 8);
if (magicValue != ARJ_MAGIC)
if (!CheckMagicBytes(magic))
{
throw new InvalidDataException("Not an ARJ file (wrong magic bytes)");
}
@@ -138,5 +137,22 @@ namespace SharpCompress.Common.Arj.Headers
? (FileType)value
: Headers.FileType.Unknown;
}
public static bool IsArchive(Stream stream)
{
var bytes = new byte[2];
if (stream.Read(bytes, 0, 2) != 2)
{
return false;
}
return CheckMagicBytes(bytes);
}
protected static bool CheckMagicBytes(byte[] headerBytes)
{
var magicValue = (ushort)(headerBytes[0] | headerBytes[1] << 8);
return magicValue == ARJ_MAGIC;
}
}
}

View File

@@ -30,4 +30,5 @@ public enum CompressionType
Distilled,
ZStandard,
ArjLZ77,
AceLZ77,
}

View File

@@ -55,7 +55,7 @@ internal class SevenZipFilePart : FilePart
{
folderStream.Skip(skipSize);
}
return new ReadOnlySubStream(folderStream, Header.Size);
return new ReadOnlySubStream(folderStream, Header.Size, leaveOpen: false);
}
public CompressionType CompressionType

View File

@@ -30,6 +30,7 @@ public sealed class BZip2Stream : Stream, IStreamStack
private readonly Stream stream;
private bool isDisposed;
private readonly bool leaveOpen;
/// <summary>
/// Create a BZip2Stream
@@ -37,19 +38,30 @@ public sealed class BZip2Stream : Stream, IStreamStack
/// <param name="stream">The stream to read from</param>
/// <param name="compressionMode">Compression Mode</param>
/// <param name="decompressConcatenated">Decompress Concatenated</param>
public BZip2Stream(Stream stream, CompressionMode compressionMode, bool decompressConcatenated)
/// <param name="leaveOpen">Leave the stream open after disposing</param>
public BZip2Stream(
Stream stream,
CompressionMode compressionMode,
bool decompressConcatenated,
bool leaveOpen = false
)
{
#if DEBUG_STREAMS
this.DebugConstruct(typeof(BZip2Stream));
#endif
this.leaveOpen = leaveOpen;
Mode = compressionMode;
if (Mode == CompressionMode.Compress)
{
this.stream = new CBZip2OutputStream(stream);
this.stream = new CBZip2OutputStream(stream, 9, leaveOpen);
}
else
{
this.stream = new CBZip2InputStream(stream, decompressConcatenated);
this.stream = new CBZip2InputStream(
stream,
decompressConcatenated,
leaveOpen: leaveOpen
);
}
}

View File

@@ -168,6 +168,7 @@ internal class CBZip2InputStream : Stream, IStreamStack
private int computedBlockCRC,
computedCombinedCRC;
private readonly bool decompressConcatenated;
private readonly bool leaveOpen;
private int i2,
count,
@@ -181,9 +182,10 @@ internal class CBZip2InputStream : Stream, IStreamStack
private char z;
private bool isDisposed;
public CBZip2InputStream(Stream zStream, bool decompressConcatenated)
public CBZip2InputStream(Stream zStream, bool decompressConcatenated, bool leaveOpen = false)
{
this.decompressConcatenated = decompressConcatenated;
this.leaveOpen = leaveOpen;
ll8 = null;
tt = null;
BsSetStream(zStream);
@@ -207,7 +209,10 @@ internal class CBZip2InputStream : Stream, IStreamStack
this.DebugDispose(typeof(CBZip2InputStream));
#endif
base.Dispose(disposing);
bsStream?.Dispose();
if (!leaveOpen)
{
bsStream?.Dispose();
}
}
internal static int[][] InitIntArray(int n1, int n2)
@@ -398,7 +403,10 @@ internal class CBZip2InputStream : Stream, IStreamStack
private void BsFinishedWithStream()
{
bsStream?.Dispose();
if (!leaveOpen)
{
bsStream?.Dispose();
}
bsStream = null;
}

View File

@@ -341,12 +341,14 @@ internal sealed class CBZip2OutputStream : Stream, IStreamStack
private int currentChar = -1;
private int runLength;
private readonly bool leaveOpen;
public CBZip2OutputStream(Stream inStream)
: this(inStream, 9) { }
public CBZip2OutputStream(Stream inStream, bool leaveOpen = false)
: this(inStream, 9, leaveOpen) { }
public CBZip2OutputStream(Stream inStream, int inBlockSize)
public CBZip2OutputStream(Stream inStream, int inBlockSize, bool leaveOpen = false)
{
this.leaveOpen = leaveOpen;
block = null;
quadrant = null;
zptr = null;
@@ -481,7 +483,10 @@ internal sealed class CBZip2OutputStream : Stream, IStreamStack
this.DebugDispose(typeof(CBZip2OutputStream));
#endif
Dispose();
bsStream?.Dispose();
if (!leaveOpen)
{
bsStream?.Dispose();
}
bsStream = null;
}
}

View File

@@ -586,7 +586,13 @@ internal class ZlibBaseStream : Stream, IStreamStack
public override void Flush()
{
_stream.Flush();
// Only flush the underlying stream when in write mode
// Flushing input streams during read operations is not meaningful
// and can cause issues with forward-only/non-seekable streams
if (_streamMode == StreamMode.Writer)
{
_stream.Flush();
}
//rewind the buffer
((IStreamStack)this).Rewind(z.AvailableBytesIn); //unused
z.AvailableBytesIn = 0;
@@ -594,7 +600,13 @@ internal class ZlibBaseStream : Stream, IStreamStack
public override async Task FlushAsync(CancellationToken cancellationToken)
{
await _stream.FlushAsync(cancellationToken).ConfigureAwait(false);
// Only flush the underlying stream when in write mode
// Flushing input streams during read operations is not meaningful
// and can cause issues with forward-only/non-seekable streams
if (_streamMode == StreamMode.Writer)
{
await _stream.FlushAsync(cancellationToken).ConfigureAwait(false);
}
//rewind the buffer
((IStreamStack)this).Rewind(z.AvailableBytesIn); //unused
z.AvailableBytesIn = 0;

View File

@@ -153,7 +153,7 @@ internal class OutWindow : IDisposable
_pendingLen = rem;
}
public async Task CopyPendingAsync(CancellationToken cancellationToken = default)
public async ValueTask CopyPendingAsync(CancellationToken cancellationToken = default)
{
if (_pendingLen < 1)
{
@@ -206,7 +206,7 @@ internal class OutWindow : IDisposable
_pendingDist = distance;
}
public async Task CopyBlockAsync(
public async ValueTask CopyBlockAsync(
int distance,
int len,
CancellationToken cancellationToken = default
@@ -253,7 +253,7 @@ internal class OutWindow : IDisposable
}
}
public async Task PutByteAsync(byte b, CancellationToken cancellationToken = default)
public async ValueTask PutByteAsync(byte b, CancellationToken cancellationToken = default)
{
_buffer[_pos++] = b;
_total++;
@@ -369,6 +369,28 @@ internal class OutWindow : IDisposable
return size;
}
public int Read(Memory<byte> buffer, int offset, int count)
{
if (_streamPos >= _pos)
{
return 0;
}
var size = _pos - _streamPos;
if (size > count)
{
size = count;
}
_buffer.AsMemory(_streamPos, size).CopyTo(buffer.Slice(offset, size));
_streamPos += size;
if (_streamPos >= _windowSize)
{
_pos = 0;
_streamPos = 0;
}
return size;
}
public int ReadByte()
{
if (_streamPos >= _pos)

View File

@@ -45,10 +45,14 @@ public sealed class LZipStream : Stream, IStreamStack
private bool _finished;
private long _writeCount;
private readonly Stream? _originalStream;
private readonly bool _leaveOpen;
public LZipStream(Stream stream, CompressionMode mode)
public LZipStream(Stream stream, CompressionMode mode, bool leaveOpen = false)
{
Mode = mode;
_originalStream = stream;
_leaveOpen = leaveOpen;
if (mode == CompressionMode.Decompress)
{
@@ -58,7 +62,7 @@ public sealed class LZipStream : Stream, IStreamStack
throw new InvalidFormatException("Not an LZip stream");
}
var properties = GetProperties(dSize);
_stream = new LzmaStream(properties, stream);
_stream = new LzmaStream(properties, stream, leaveOpen: leaveOpen);
}
else
{
@@ -125,6 +129,10 @@ public sealed class LZipStream : Stream, IStreamStack
{
Finish();
_stream.Dispose();
if (Mode == CompressionMode.Compress && !_leaveOpen)
{
_originalStream?.Dispose();
}
}
}

View File

@@ -3,6 +3,7 @@
using System;
using System.Diagnostics.CodeAnalysis;
using System.IO;
using System.Threading.Tasks;
using SharpCompress.Compressors.LZMA.LZ;
using SharpCompress.Compressors.LZMA.RangeCoder;
@@ -475,7 +476,7 @@ public class Decoder : ICoder, ISetDecoderProperties // ,System.IO.Stream
return false;
}
internal async System.Threading.Tasks.Task<bool> CodeAsync(
internal async ValueTask<bool> CodeAsync(
int dictionarySize,
OutWindow outWindow,
RangeCoder.Decoder rangeDecoder,

View File

@@ -35,6 +35,7 @@ public class LzmaStream : Stream, IStreamStack
private readonly Stream _inputStream;
private readonly long _inputSize;
private readonly long _outputSize;
private readonly bool _leaveOpen;
private readonly int _dictionarySize;
private readonly OutWindow _outWindow = new();
@@ -56,14 +57,28 @@ public class LzmaStream : Stream, IStreamStack
private readonly Encoder _encoder;
private bool _isDisposed;
public LzmaStream(byte[] properties, Stream inputStream)
: this(properties, inputStream, -1, -1, null, properties.Length < 5) { }
public LzmaStream(byte[] properties, Stream inputStream, bool leaveOpen = false)
: this(properties, inputStream, -1, -1, null, properties.Length < 5, leaveOpen) { }
public LzmaStream(byte[] properties, Stream inputStream, long inputSize)
: this(properties, inputStream, inputSize, -1, null, properties.Length < 5) { }
public LzmaStream(byte[] properties, Stream inputStream, long inputSize, bool leaveOpen = false)
: this(properties, inputStream, inputSize, -1, null, properties.Length < 5, leaveOpen) { }
public LzmaStream(byte[] properties, Stream inputStream, long inputSize, long outputSize)
: this(properties, inputStream, inputSize, outputSize, null, properties.Length < 5) { }
public LzmaStream(
byte[] properties,
Stream inputStream,
long inputSize,
long outputSize,
bool leaveOpen = false
)
: this(
properties,
inputStream,
inputSize,
outputSize,
null,
properties.Length < 5,
leaveOpen
) { }
public LzmaStream(
byte[] properties,
@@ -71,13 +86,15 @@ public class LzmaStream : Stream, IStreamStack
long inputSize,
long outputSize,
Stream presetDictionary,
bool isLzma2
bool isLzma2,
bool leaveOpen = false
)
{
_inputStream = inputStream;
_inputSize = inputSize;
_outputSize = outputSize;
_isLzma2 = isLzma2;
_leaveOpen = leaveOpen;
#if DEBUG_STREAMS
this.DebugConstruct(typeof(LzmaStream));
@@ -179,7 +196,10 @@ public class LzmaStream : Stream, IStreamStack
{
_position = _encoder.Code(null, true);
}
_inputStream?.Dispose();
if (!_leaveOpen)
{
_inputStream?.Dispose();
}
_outWindow.Dispose();
}
base.Dispose(disposing);
@@ -425,7 +445,7 @@ public class LzmaStream : Stream, IStreamStack
}
}
private async Task DecodeChunkHeaderAsync(CancellationToken cancellationToken = default)
private async ValueTask DecodeChunkHeaderAsync(CancellationToken cancellationToken = default)
{
var controlBuffer = new byte[1];
await _inputStream
@@ -632,6 +652,119 @@ public class LzmaStream : Stream, IStreamStack
return total;
}
#if !NETFRAMEWORK && !NETSTANDARD2_0
public override async ValueTask<int> ReadAsync(
Memory<byte> buffer,
CancellationToken cancellationToken = default
)
{
if (_endReached)
{
return 0;
}
var total = 0;
var offset = 0;
var count = buffer.Length;
while (total < count)
{
cancellationToken.ThrowIfCancellationRequested();
if (_availableBytes == 0)
{
if (_isLzma2)
{
await DecodeChunkHeaderAsync(cancellationToken).ConfigureAwait(false);
}
else
{
_endReached = true;
}
if (_endReached)
{
break;
}
}
var toProcess = count - total;
if (toProcess > _availableBytes)
{
toProcess = (int)_availableBytes;
}
_outWindow.SetLimit(toProcess);
if (_uncompressedChunk)
{
_inputPosition += await _outWindow
.CopyStreamAsync(_inputStream, toProcess, cancellationToken)
.ConfigureAwait(false);
}
else if (
await _decoder
.CodeAsync(_dictionarySize, _outWindow, _rangeDecoder, cancellationToken)
.ConfigureAwait(false)
&& _outputSize < 0
)
{
_availableBytes = _outWindow.AvailableBytes;
}
var read = _outWindow.Read(buffer, offset, toProcess);
total += read;
offset += read;
_position += read;
_availableBytes -= read;
if (_availableBytes == 0 && !_uncompressedChunk)
{
if (
!_rangeDecoder.IsFinished
|| (_rangeDecoderLimit >= 0 && _rangeDecoder._total != _rangeDecoderLimit)
)
{
_outWindow.SetLimit(toProcess + 1);
if (
!await _decoder
.CodeAsync(
_dictionarySize,
_outWindow,
_rangeDecoder,
cancellationToken
)
.ConfigureAwait(false)
)
{
_rangeDecoder.ReleaseStream();
throw new DataErrorException();
}
}
_rangeDecoder.ReleaseStream();
_inputPosition += _rangeDecoder._total;
if (_outWindow.HasPending)
{
throw new DataErrorException();
}
}
}
if (_endReached)
{
if (_inputSize >= 0 && _inputPosition != _inputSize)
{
throw new DataErrorException();
}
if (_outputSize >= 0 && _position != _outputSize)
{
throw new DataErrorException();
}
}
return total;
}
#endif
public override Task WriteAsync(
byte[] buffer,
int offset,

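A minimal sketch (illustrative, not repository code) of the new leaveOpen parameter; the 5-byte property array mirrors the disposal tests further down in this change (one properties byte plus a little-endian dictionary size):

using System;
using System.IO;
using SharpCompress.Compressors.LZMA;

var props = new byte[] { 0, 0, 4, 0, 0 }; // properties byte + 1024-byte dictionary, as in the disposal tests below
var inner = new MemoryStream();
using (var lzma = new LzmaStream(props, inner, leaveOpen: true))
{
    // no data is read here; the point is disposal behaviour only
}
Console.WriteLine(inner.CanRead); // True: inner survived LzmaStream.Dispose()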
View File

@@ -134,7 +134,7 @@ internal class RarStream : Stream, IStreamStack
fetch = false;
}
_position += outTotal;
if (count > 0 && outTotal == 0 && _position != Length)
if (count > 0 && outTotal == 0 && _position < Length)
{
// sanity check, e.g. if we try to decompress a redir entry
throw new InvalidOperationException(
@@ -179,7 +179,7 @@ internal class RarStream : Stream, IStreamStack
fetch = false;
}
_position += outTotal;
if (count > 0 && outTotal == 0 && _position != Length)
if (count > 0 && outTotal == 0 && _position < Length)
{
// sanity check, e.g. if we try to decompress a redir entry
throw new InvalidOperationException(

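The relaxed sanity check only fires when decompression ends prematurely; hypothetical positions against a header length of 100 illustrate the difference (comment-only sketch, values are illustrative):

// header Length = 100 (hypothetical values)
// _position == 100                  -> neither check throws (normal completion)
// _position == 120                  -> old (_position != Length) threw;
//                                      new (_position < Length) accepts, since the actual
//                                      unpacked size may legitimately exceed the header value
// _position == 40 and outTotal == 0 -> both checks throw: decompression ended prematurely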
View File

@@ -0,0 +1,37 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Common.Ace.Headers;
using SharpCompress.Readers;
using SharpCompress.Readers.Ace;
namespace SharpCompress.Factories
{
public class AceFactory : Factory, IReaderFactory
{
public override string Name => "Ace";
public override ArchiveType? KnownArchiveType => ArchiveType.Ace;
public override IEnumerable<string> GetSupportedExtensions()
{
yield return "ace";
}
public override bool IsArchive(
Stream stream,
string? password = null,
int bufferSize = ReaderOptions.DefaultBufferSize
)
{
return AceHeader.IsArchive(stream);
}
public IReader OpenReader(Stream stream, ReaderOptions? options) =>
AceReader.Open(stream, options);
}
}
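With the factory registered (see the Factory change below), ACE input participates in auto-detection; a minimal sketch, assuming a hypothetical file name:

using System;
using System.IO;
using SharpCompress.Readers;

using var stream = File.OpenRead("archive.ace"); // hypothetical path
using var reader = ReaderFactory.Open(stream);   // detects ACE via AceFactory.IsArchive
while (reader.MoveToNextEntry())
{
    Console.WriteLine(reader.Entry.Key);
}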

View File

@@ -28,12 +28,7 @@ namespace SharpCompress.Factories
int bufferSize = ReaderOptions.DefaultBufferSize
)
{
var arjHeader = new ArjMainHeader(new ArchiveEncoding());
if (arjHeader.Read(stream) == null)
{
return false;
}
return true;
return ArjHeader.IsArchive(stream);
}
public IReader OpenReader(Stream stream, ReaderOptions? options) =>

View File

@@ -19,6 +19,7 @@ public abstract class Factory : IFactory
RegisterFactory(new TarFactory());
RegisterFactory(new ArcFactory());
RegisterFactory(new ArjFactory());
RegisterFactory(new AceFactory());
}
private static readonly HashSet<Factory> _factories = new();

View File

@@ -1,5 +1,7 @@
using System;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
namespace SharpCompress.IO;
@@ -68,6 +70,23 @@ internal class BufferedSubStream : SharpCompressStream, IStreamStack
BytesLeftToRead -= _cacheLength;
}
private async ValueTask RefillCacheAsync(CancellationToken cancellationToken)
{
var count = (int)Math.Min(BytesLeftToRead, _cache.Length);
_cacheOffset = 0;
if (count == 0)
{
_cacheLength = 0;
return;
}
Stream.Position = origin;
_cacheLength = await Stream
.ReadAsync(_cache, 0, count, cancellationToken)
.ConfigureAwait(false);
origin += _cacheLength;
BytesLeftToRead -= _cacheLength;
}
public override int Read(byte[] buffer, int offset, int count)
{
if (count > Length)
@@ -104,6 +123,61 @@ internal class BufferedSubStream : SharpCompressStream, IStreamStack
return _cache[_cacheOffset++];
}
public override async Task<int> ReadAsync(
byte[] buffer,
int offset,
int count,
CancellationToken cancellationToken
)
{
if (count > Length)
{
count = (int)Length;
}
if (count > 0)
{
if (_cacheOffset == _cacheLength)
{
await RefillCacheAsync(cancellationToken).ConfigureAwait(false);
}
count = Math.Min(count, _cacheLength - _cacheOffset);
Buffer.BlockCopy(_cache, _cacheOffset, buffer, offset, count);
_cacheOffset += count;
}
return count;
}
#if !NETFRAMEWORK && !NETSTANDARD2_0
public override async ValueTask<int> ReadAsync(
Memory<byte> buffer,
CancellationToken cancellationToken = default
)
{
var count = buffer.Length;
if (count > Length)
{
count = (int)Length;
}
if (count > 0)
{
if (_cacheOffset == _cacheLength)
{
await RefillCacheAsync(cancellationToken).ConfigureAwait(false);
}
count = Math.Min(count, _cacheLength - _cacheOffset);
_cache.AsSpan(_cacheOffset, count).CopyTo(buffer.Span);
_cacheOffset += count;
}
return count;
}
#endif
public override long Seek(long offset, SeekOrigin origin) => throw new NotSupportedException();
public override void SetLength(long value) => throw new NotSupportedException();
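A sketch of the call path this overload fixes: Stream.ReadExactlyAsync on modern .NET dispatches to ReadAsync(Memory&lt;byte&gt;, CancellationToken), so without this override the call fell through to the base Stream bridge and mixed synchronous reads into the async path. The helper and buffer size below are illustrative:

using System.IO;
using System.Threading;
using System.Threading.Tasks;

static async Task ReadHeaderAsync(Stream subStream, CancellationToken ct)
{
    var header = new byte[32];                    // illustrative size
    await subStream.ReadExactlyAsync(header, ct); // byte[] converts to Memory<byte>, reaching RefillCacheAsync
}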

View File

@@ -16,11 +16,11 @@ internal class ReadOnlySubStream : SharpCompressStream, IStreamStack
private long _position;
public ReadOnlySubStream(Stream stream, long bytesToRead)
: this(stream, null, bytesToRead) { }
public ReadOnlySubStream(Stream stream, long bytesToRead, bool leaveOpen = true)
: this(stream, null, bytesToRead, leaveOpen) { }
public ReadOnlySubStream(Stream stream, long? origin, long bytesToRead)
: base(stream, leaveOpen: true, throwOnDispose: false)
public ReadOnlySubStream(Stream stream, long? origin, long bytesToRead, bool leaveOpen = true)
: base(stream, leaveOpen, throwOnDispose: false)
{
if (origin != null && stream.Position != origin.Value)
{

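A usage sketch of the widened constructor (ReadOnlySubStream is internal, so this is illustrative library-side code); the default leaveOpen: true preserves the previous behaviour:

using var file = File.OpenRead("archive.bin"); // hypothetical path
using (var part = new ReadOnlySubStream(file, bytesToRead: 128, leaveOpen: true))
{
    part.CopyTo(Stream.Null); // reads at most 128 bytes of the wrapped stream
}
// file remains open here because leaveOpen was true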
View File

@@ -138,8 +138,6 @@ public class SharpCompressStream : Stream, IStreamStack
#endif
}
internal bool IsRecording { get; private set; }
protected override void Dispose(bool disposing)
{
#if DEBUG_STREAMS

View File

@@ -0,0 +1,115 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Common.Ace;
using SharpCompress.Common.Ace.Headers;
using SharpCompress.Common.Arj;
namespace SharpCompress.Readers.Ace
{
/// <summary>
/// Reader for ACE archives.
/// ACE is a proprietary archive format. This implementation supports both ACE 1.0 and ACE 2.0 formats
/// and can read archive metadata and extract uncompressed (stored) entries.
/// Compressed entries require proprietary decompression algorithms that are not publicly documented.
/// </summary>
/// <remarks>
/// ACE 2.0 additions over ACE 1.0:
/// - Improved LZ77 compression (compression type 2)
/// - Recovery record support
/// - Additional header flags
/// </remarks>
public abstract class AceReader : AbstractReader<AceEntry, AceVolume>
{
private readonly ArchiveEncoding _archiveEncoding;
internal AceReader(ReaderOptions options)
: base(options, ArchiveType.Ace)
{
_archiveEncoding = Options.ArchiveEncoding;
}
private AceReader(Stream stream, ReaderOptions options)
: this(options) { }
/// <summary>
/// Derived class must create or manage the Volume itself.
/// AbstractReader.Volume is get-only, so it cannot be set here.
/// </summary>
public override AceVolume? Volume => _volume;
private AceVolume? _volume;
/// <summary>
/// Opens an AceReader for non-seeking usage with a single volume.
/// </summary>
/// <param name="stream">The stream containing the ACE archive.</param>
/// <param name="options">Reader options.</param>
/// <returns>An AceReader instance.</returns>
public static AceReader Open(Stream stream, ReaderOptions? options = null)
{
stream.NotNull(nameof(stream));
return new SingleVolumeAceReader(stream, options ?? new ReaderOptions());
}
/// <summary>
/// Opens an AceReader for non-seeking usage with multiple volumes.
/// </summary>
/// <param name="streams">The streams containing the ACE archive volumes, in order.</param>
/// <param name="options">Reader options.</param>
/// <returns>An AceReader instance.</returns>
public static AceReader Open(IEnumerable<Stream> streams, ReaderOptions? options = null)
{
streams.NotNull(nameof(streams));
return new MultiVolumeAceReader(streams, options ?? new ReaderOptions());
}
protected abstract void ValidateArchive(AceVolume archive);
protected override IEnumerable<AceEntry> GetEntries(Stream stream)
{
var mainHeaderReader = new AceMainHeader(_archiveEncoding);
var mainHeader = mainHeaderReader.Read(stream);
if (mainHeader == null)
{
yield break;
}
if (mainHeader?.IsMultiVolume == true)
{
throw new MultiVolumeExtractionException(
"Multi volumes are currently not supported"
);
}
if (_volume == null)
{
_volume = new AceVolume(stream, Options, 0);
ValidateArchive(_volume);
}
var localHeaderReader = new AceFileHeader(_archiveEncoding);
while (true)
{
var localHeader = localHeaderReader.Read(stream);
if (localHeader?.IsFileEncrypted == true)
{
throw new CryptographicException(
"Password protected archives are currently not supported"
);
}
if (localHeader == null)
{
    break;
}
yield return new AceEntry(new AceFilePart((AceFileHeader)localHeader, stream));
}
}
protected virtual IEnumerable<FilePart> CreateFilePartEnumerableForCurrentEntry() =>
Entry.Parts;
}
}
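A minimal sketch of reading a single-volume ACE archive (file name and output path hypothetical); compressed AceLZ77 entries throw NotSupportedException, as the reader tests below assert:

using System.IO;
using SharpCompress.Common;
using SharpCompress.Readers;
using SharpCompress.Readers.Ace;

using var stream = File.OpenRead("archive.ace");
using var reader = AceReader.Open(stream);
while (reader.MoveToNextEntry())
{
    if (!reader.Entry.IsDirectory)
    {
        reader.WriteEntryToDirectory(
            "output",
            new ExtractionOptions { ExtractFullPath = true, Overwrite = true }
        );
    }
}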

View File

@@ -0,0 +1,117 @@
#nullable disable
using System;
using System.Collections;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Common.Ace;
namespace SharpCompress.Readers.Ace
{
internal class MultiVolumeAceReader : AceReader
{
private readonly IEnumerator<Stream> streams;
private Stream tempStream;
internal MultiVolumeAceReader(IEnumerable<Stream> streams, ReaderOptions options)
: base(options) => this.streams = streams.GetEnumerator();
protected override void ValidateArchive(AceVolume archive) { }
protected override Stream RequestInitialStream()
{
if (streams.MoveNext())
{
return streams.Current;
}
throw new MultiVolumeExtractionException(
"No stream provided when requested by MultiVolumeAceReader"
);
}
internal override bool NextEntryForCurrentStream()
{
if (!base.NextEntryForCurrentStream())
{
// if we've got another stream to process then do so
return streams.MoveNext() && LoadStreamForReading(streams.Current);
}
return true;
}
protected override IEnumerable<FilePart> CreateFilePartEnumerableForCurrentEntry()
{
var enumerator = new MultiVolumeStreamEnumerator(this, streams, tempStream);
tempStream = null;
return enumerator;
}
private class MultiVolumeStreamEnumerator : IEnumerable<FilePart>, IEnumerator<FilePart>
{
private readonly MultiVolumeAceReader reader;
private readonly IEnumerator<Stream> nextReadableStreams;
private Stream tempStream;
private bool isFirst = true;
internal MultiVolumeStreamEnumerator(
MultiVolumeAceReader r,
IEnumerator<Stream> nextReadableStreams,
Stream tempStream
)
{
reader = r;
this.nextReadableStreams = nextReadableStreams;
this.tempStream = tempStream;
}
public IEnumerator<FilePart> GetEnumerator() => this;
IEnumerator IEnumerable.GetEnumerator() => this;
public FilePart Current { get; private set; }
public void Dispose() { }
object IEnumerator.Current => Current;
public bool MoveNext()
{
if (isFirst)
{
Current = reader.Entry.Parts.First();
isFirst = false; // first stream is already loaded and ready to go
return true;
}
if (!reader.Entry.IsSplitAfter)
{
return false;
}
if (tempStream != null)
{
reader.LoadStreamForReading(tempStream);
tempStream = null;
}
else if (!nextReadableStreams.MoveNext())
{
throw new MultiVolumeExtractionException(
"No stream provided when requested by MultiVolumeAceReader"
);
}
else
{
reader.LoadStreamForReading(nextReadableStreams.Current);
}
Current = reader.Entry.Parts.First();
return true;
}
public void Reset() { }
}
}
}

View File

@@ -0,0 +1,31 @@
using System;
using System.IO;
using SharpCompress.Common;
using SharpCompress.Common.Ace;
namespace SharpCompress.Readers.Ace
{
internal class SingleVolumeAceReader : AceReader
{
private readonly Stream _stream;
internal SingleVolumeAceReader(Stream stream, ReaderOptions options)
: base(options)
{
stream.NotNull(nameof(stream));
_stream = stream;
}
protected override Stream RequestInitialStream() => _stream;
protected override void ValidateArchive(AceVolume archive)
{
if (archive.IsMultiVolume)
{
throw new MultiVolumeExtractionException(
"Streamed archive is a Multi-volume archive. Use a different AceReader method to extract."
);
}
}
}
}

View File

@@ -70,7 +70,7 @@ public static class ReaderFactory
}
throw new InvalidFormatException(
"Cannot determine compressed stream type. Supported Reader Formats: Arc, Arj, Zip, GZip, BZip2, Tar, Rar, LZip, XZ, ZStandard"
"Cannot determine compressed stream type. Supported Reader Formats: Ace, Arc, Arj, Zip, GZip, BZip2, Tar, Rar, LZip, XZ, ZStandard"
);
}
}

View File

@@ -71,48 +71,16 @@ internal static class Utility
return;
}
using var buffer = MemoryPool<byte>.Shared.Rent(TEMP_BUFFER_SIZE);
while (advanceAmount > 0)
{
var toRead = (int)Math.Min(buffer.Memory.Length, advanceAmount);
var read = source.Read(buffer.Memory.Slice(0, toRead).Span);
if (read <= 0)
{
break;
}
advanceAmount -= read;
}
using var readOnlySubStream = new IO.ReadOnlySubStream(source, advanceAmount);
readOnlySubStream.CopyTo(Stream.Null);
}
public static void Skip(this Stream source)
{
using var buffer = MemoryPool<byte>.Shared.Rent(TEMP_BUFFER_SIZE);
while (source.Read(buffer.Memory.Span) > 0) { }
}
public static void Skip(this Stream source) => source.CopyTo(Stream.Null);
public static async Task SkipAsync(
this Stream source,
CancellationToken cancellationToken = default
)
public static Task SkipAsync(this Stream source, CancellationToken cancellationToken = default)
{
var array = ArrayPool<byte>.Shared.Rent(TEMP_BUFFER_SIZE);
try
{
while (true)
{
var read = await source
.ReadAsync(array, 0, array.Length, cancellationToken)
.ConfigureAwait(false);
if (read <= 0)
{
break;
}
}
}
finally
{
ArrayPool<byte>.Shared.Return(array);
}
cancellationToken.ThrowIfCancellationRequested();
return source.CopyToAsync(Stream.Null);
}
public static DateTime DosDateToDateTime(ushort iDate, ushort iTime)

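The replacement leans on Stream.Null, which discards every byte written to it, so draining a stream no longer needs a pooled scratch buffer; a minimal equivalence sketch:

using System;
using System.IO;

var source = new MemoryStream(new byte[1024]);
source.CopyTo(Stream.Null);         // discards everything it reads
Console.WriteLine(source.Position); // 1024: fully consumed, nothing retained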
View File

@@ -0,0 +1,61 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Readers;
using SharpCompress.Readers.Ace;
using Xunit;
namespace SharpCompress.Test.Ace
{
public class AceReaderTests : ReaderTests
{
public AceReaderTests()
{
UseExtensionInsteadOfNameToVerify = true;
UseCaseInsensitiveToVerify = true;
}
[Fact]
public void Ace_Uncompressed_Read() => Read("Ace.store.ace", CompressionType.None);
[Fact]
public void Ace_Encrypted_Read()
{
var exception = Assert.Throws<CryptographicException>(() => Read("Ace.encrypted.ace"));
}
[Theory]
[InlineData("Ace.method1.ace", CompressionType.AceLZ77)]
[InlineData("Ace.method1-solid.ace", CompressionType.AceLZ77)]
[InlineData("Ace.method2.ace", CompressionType.AceLZ77)]
[InlineData("Ace.method2-solid.ace", CompressionType.AceLZ77)]
public void Ace_Unsupported_ShouldThrow(string fileName, CompressionType compressionType)
{
var exception = Assert.Throws<NotSupportedException>(() =>
Read(fileName, compressionType)
);
}
[Theory]
[InlineData("Ace.store.largefile.ace", CompressionType.None)]
public void Ace_LargeFileTest_Read(string fileName, CompressionType compressionType)
{
ReadForBufferBoundaryCheck(fileName, compressionType);
}
[Fact]
public void Ace_Multi_Reader()
{
var exception = Assert.Throws<MultiVolumeExtractionException>(() =>
DoMultiReader(
["Ace.store.split.ace", "Ace.store.split.c01"],
streams => AceReader.Open(streams)
)
);
}
}
}

View File

@@ -45,14 +45,17 @@ namespace SharpCompress.Test.Arj
public void Arj_Multi_Reader()
{
var exception = Assert.Throws<MultiVolumeExtractionException>(() =>
DoArj_Multi_Reader([
"Arj.store.split.arj",
"Arj.store.split.a01",
"Arj.store.split.a02",
"Arj.store.split.a03",
"Arj.store.split.a04",
"Arj.store.split.a05",
])
DoMultiReader(
[
"Arj.store.split.arj",
"Arj.store.split.a01",
"Arj.store.split.a02",
"Arj.store.split.a03",
"Arj.store.split.a04",
"Arj.store.split.a05",
],
streams => ArjReader.Open(streams)
)
);
}
@@ -74,26 +77,5 @@ namespace SharpCompress.Test.Arj
{
ReadForBufferBoundaryCheck(fileName, compressionType);
}
private void DoArj_Multi_Reader(string[] archives)
{
using (
var reader = ArjReader.Open(
archives
.Select(s => Path.Combine(TEST_ARCHIVES_PATH, s))
.Select(p => File.OpenRead(p))
)
)
{
while (reader.MoveToNextEntry())
{
reader.WriteEntryToDirectory(
SCRATCH_FILES_PATH,
new ExtractionOptions { ExtractFullPath = true, Overwrite = true }
);
}
}
VerifyFiles();
}
}
}

View File

@@ -0,0 +1,73 @@
using System;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
namespace SharpCompress.Test.Mocks;
/// <summary>
/// A stream wrapper that throws NotSupportedException on Flush() calls.
/// This is used to test that archive iteration handles streams that don't support flushing.
/// </summary>
public class ThrowOnFlushStream : Stream
{
private readonly Stream inner;
public ThrowOnFlushStream(Stream inner)
{
this.inner = inner;
}
public override bool CanRead => inner.CanRead;
public override bool CanSeek => false;
public override bool CanWrite => false;
public override long Length => throw new NotSupportedException();
public override long Position
{
get => throw new NotSupportedException();
set => throw new NotSupportedException();
}
public override void Flush() => throw new NotSupportedException("Flush not supported");
public override Task FlushAsync(CancellationToken cancellationToken) =>
throw new NotSupportedException("FlushAsync not supported");
public override int Read(byte[] buffer, int offset, int count) =>
inner.Read(buffer, offset, count);
public override Task<int> ReadAsync(
byte[] buffer,
int offset,
int count,
CancellationToken cancellationToken
) => inner.ReadAsync(buffer, offset, count, cancellationToken);
#if !NETFRAMEWORK && !NETSTANDARD2_0
public override ValueTask<int> ReadAsync(
Memory<byte> buffer,
CancellationToken cancellationToken = default
) => inner.ReadAsync(buffer, cancellationToken);
#endif
public override long Seek(long offset, SeekOrigin origin) => throw new NotSupportedException();
public override void SetLength(long value) => throw new NotSupportedException();
public override void Write(byte[] buffer, int offset, int count) =>
throw new NotSupportedException();
protected override void Dispose(bool disposing)
{
if (disposing)
{
inner.Dispose();
}
base.Dispose(disposing);
}
}

View File

@@ -0,0 +1,65 @@
using System;
using System.IO;
namespace SharpCompress.Test.Mocks;
/// <summary>
/// A stream wrapper that truncates the underlying stream after reading a specified number of bytes.
/// Used for testing error handling when streams end prematurely.
/// </summary>
public class TruncatedStream : Stream
{
private readonly Stream baseStream;
private readonly long truncateAfterBytes;
private long bytesRead;
public TruncatedStream(Stream baseStream, long truncateAfterBytes)
{
this.baseStream = baseStream ?? throw new ArgumentNullException(nameof(baseStream));
this.truncateAfterBytes = truncateAfterBytes;
bytesRead = 0;
}
public override bool CanRead => baseStream.CanRead;
public override bool CanSeek => baseStream.CanSeek;
public override bool CanWrite => false;
public override long Length => baseStream.Length;
public override long Position
{
get => baseStream.Position;
set => baseStream.Position = value;
}
public override int Read(byte[] buffer, int offset, int count)
{
if (bytesRead >= truncateAfterBytes)
{
// Simulate premature end of stream
return 0;
}
var maxBytesToRead = (int)Math.Min(count, truncateAfterBytes - bytesRead);
var actualBytesRead = baseStream.Read(buffer, offset, maxBytesToRead);
bytesRead += actualBytesRead;
return actualBytesRead;
}
public override long Seek(long offset, SeekOrigin origin) => baseStream.Seek(offset, origin);
public override void SetLength(long value) => throw new NotSupportedException();
public override void Write(byte[] buffer, int offset, int count) =>
throw new NotSupportedException();
public override void Flush() => baseStream.Flush();
protected override void Dispose(bool disposing)
{
if (disposing)
{
baseStream?.Dispose();
}
base.Dispose(disposing);
}
}

View File

@@ -1,3 +1,4 @@
using System;
using System.IO;
using System.Linq;
using SharpCompress.Archives;
@@ -5,6 +6,7 @@ using SharpCompress.Archives.Rar;
using SharpCompress.Common;
using SharpCompress.Compressors.LZMA.Utilites;
using SharpCompress.Readers;
using SharpCompress.Test.Mocks;
using Xunit;
namespace SharpCompress.Test.Rar;
@@ -642,4 +644,77 @@ public class RarArchiveTests : ArchiveTests
);
Assert.True(passwordProtectedFilesArchive.IsEncrypted);
}
/// <summary>
/// Test for issue: InvalidOperationException when extracting RAR files.
/// This test verifies the fix for the validation logic that was changed from
/// (_position != Length) to (_position &lt; Length).
/// The old logic would throw an exception when position exceeded expected length,
/// but the new logic only throws when decompression ends prematurely (position &lt; expected).
/// </summary>
[Fact]
public void Rar_StreamValidation_OnlyThrowsOnPrematureEnd()
{
// Test normal extraction - should NOT throw InvalidOperationException
// even if actual decompressed size differs from header
var testFiles = new[] { "Rar.rar", "Rar5.rar", "Rar4.rar", "Rar2.rar" };
foreach (var testFile in testFiles)
{
using var stream = File.OpenRead(Path.Combine(TEST_ARCHIVES_PATH, testFile));
using var archive = RarArchive.Open(stream);
// Extract all entries and read them completely
foreach (var entry in archive.Entries.Where(e => !e.IsDirectory))
{
using var entryStream = entry.OpenEntryStream();
using var ms = new MemoryStream();
// This should complete without throwing InvalidOperationException
// The fix ensures we only throw when position < expected length, not when position >= expected
entryStream.CopyTo(ms);
// Verify we read some data
Assert.True(
ms.Length > 0,
$"Failed to extract data from {entry.Key} in {testFile}"
);
}
}
}
/// <summary>
/// Negative test case: Verifies that InvalidOperationException IS thrown when
/// a RAR stream ends prematurely (position &lt; expected length).
/// This tests the validation condition (_position &lt; Length) works correctly.
/// </summary>
[Fact]
public void Rar_StreamValidation_ThrowsOnTruncatedStream()
{
// This test verifies the exception is thrown when decompression ends prematurely
// by using a truncated stream that stops reading after a small number of bytes
var testFile = "Rar.rar";
using var fileStream = File.OpenRead(Path.Combine(TEST_ARCHIVES_PATH, testFile));
// Wrap the file stream with a truncated stream that will stop reading early
// This simulates a corrupted or truncated RAR file
using var truncatedStream = new TruncatedStream(fileStream, 1000);
// Opening the archive should work, but extracting should throw
// when we try to read beyond the truncated data
var exception = Assert.Throws<InvalidOperationException>(() =>
{
using var archive = RarArchive.Open(truncatedStream);
foreach (var entry in archive.Entries.Where(e => !e.IsDirectory))
{
using var entryStream = entry.OpenEntryStream();
using var ms = new MemoryStream();
// This should throw InvalidOperationException when it can't read all expected bytes
entryStream.CopyTo(ms);
}
});
// Verify the exception message matches our expectation
Assert.Contains("unpacked file size does not match header", exception.Message);
}
}

View File

@@ -1,6 +1,7 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common;
@@ -220,4 +221,26 @@ public abstract class ReaderTests : TestBase
Assert.Equal(expected.Pop(), reader.Entry.Key);
}
}
protected void DoMultiReader(
string[] archives,
Func<IEnumerable<Stream>, IDisposable> readerFactory
)
{
using var reader = readerFactory(
archives.Select(s => Path.Combine(TEST_ARCHIVES_PATH, s)).Select(File.OpenRead)
);
dynamic dynReader = reader;
while (dynReader.MoveToNextEntry())
{
dynReader.WriteEntryToDirectory(
SCRATCH_FILES_PATH,
new ExtractionOptions { ExtractFullPath = true, Overwrite = true }
);
}
VerifyFiles();
}
}
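The dynamic dispatch works because every reader exposes MoveToNextEntry and the WriteEntryToDirectory extension; a typed alternative sketch, assuming both ArjReader and AceReader implement IReader (which AbstractReader-derived readers do):

protected void DoMultiReaderTyped(
    string[] archives,
    Func<IEnumerable<Stream>, IReader> readerFactory
)
{
    using var reader = readerFactory(
        archives.Select(s => Path.Combine(TEST_ARCHIVES_PATH, s)).Select(File.OpenRead)
    );
    while (reader.MoveToNextEntry())
    {
        reader.WriteEntryToDirectory(
            SCRATCH_FILES_PATH,
            new ExtractionOptions { ExtractFullPath = true, Overwrite = true }
        );
    }
    VerifyFiles();
}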

View File

@@ -0,0 +1,139 @@
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Archives;
using SharpCompress.Common;
using Xunit;
namespace SharpCompress.Test.SevenZip;
#if !NETFRAMEWORK
public class SevenZipArchiveAsyncTests : ArchiveTests
{
[Fact]
public async Task SevenZipArchive_LZMA_AsyncStreamExtraction()
{
var testArchive = Path.Combine(TEST_ARCHIVES_PATH, "7Zip.LZMA.7z");
using var stream = File.OpenRead(testArchive);
using var archive = ArchiveFactory.Open(stream);
foreach (var entry in archive.Entries.Where(entry => !entry.IsDirectory))
{
var targetPath = Path.Combine(SCRATCH_FILES_PATH, entry.Key!);
var targetDir = Path.GetDirectoryName(targetPath);
if (!string.IsNullOrEmpty(targetDir) && !Directory.Exists(targetDir))
{
Directory.CreateDirectory(targetDir);
}
using var sourceStream = await entry.OpenEntryStreamAsync(CancellationToken.None);
await using var targetStream = File.Create(targetPath);
await sourceStream.CopyToAsync(targetStream, CancellationToken.None);
}
VerifyFiles();
}
[Fact]
public async Task SevenZipArchive_LZMA2_AsyncStreamExtraction()
{
var testArchive = Path.Combine(TEST_ARCHIVES_PATH, "7Zip.LZMA2.7z");
using var stream = File.OpenRead(testArchive);
using var archive = ArchiveFactory.Open(stream);
foreach (var entry in archive.Entries.Where(entry => !entry.IsDirectory))
{
var targetPath = Path.Combine(SCRATCH_FILES_PATH, entry.Key!);
var targetDir = Path.GetDirectoryName(targetPath);
if (!string.IsNullOrEmpty(targetDir) && !Directory.Exists(targetDir))
{
Directory.CreateDirectory(targetDir);
}
using var sourceStream = await entry.OpenEntryStreamAsync(CancellationToken.None);
await using var targetStream = File.Create(targetPath);
await sourceStream.CopyToAsync(targetStream, CancellationToken.None);
}
VerifyFiles();
}
[Fact]
public async Task SevenZipArchive_Solid_AsyncStreamExtraction()
{
var testArchive = Path.Combine(TEST_ARCHIVES_PATH, "7Zip.solid.7z");
using var stream = File.OpenRead(testArchive);
using var archive = ArchiveFactory.Open(stream);
foreach (var entry in archive.Entries.Where(entry => !entry.IsDirectory))
{
var targetPath = Path.Combine(SCRATCH_FILES_PATH, entry.Key!);
var targetDir = Path.GetDirectoryName(targetPath);
if (!string.IsNullOrEmpty(targetDir) && !Directory.Exists(targetDir))
{
Directory.CreateDirectory(targetDir);
}
using var sourceStream = await entry.OpenEntryStreamAsync(CancellationToken.None);
await using var targetStream = File.Create(targetPath);
await sourceStream.CopyToAsync(targetStream, CancellationToken.None);
}
VerifyFiles();
}
[Fact]
public async Task SevenZipArchive_BZip2_AsyncStreamExtraction()
{
var testArchive = Path.Combine(TEST_ARCHIVES_PATH, "7Zip.BZip2.7z");
using var stream = File.OpenRead(testArchive);
using var archive = ArchiveFactory.Open(stream);
foreach (var entry in archive.Entries.Where(entry => !entry.IsDirectory))
{
var targetPath = Path.Combine(SCRATCH_FILES_PATH, entry.Key!);
var targetDir = Path.GetDirectoryName(targetPath);
if (!string.IsNullOrEmpty(targetDir) && !Directory.Exists(targetDir))
{
Directory.CreateDirectory(targetDir);
}
using var sourceStream = await entry.OpenEntryStreamAsync(CancellationToken.None);
await using var targetStream = File.Create(targetPath);
await sourceStream.CopyToAsync(targetStream, CancellationToken.None);
}
VerifyFiles();
}
[Fact]
public async Task SevenZipArchive_PPMd_AsyncStreamExtraction()
{
var testArchive = Path.Combine(TEST_ARCHIVES_PATH, "7Zip.PPMd.7z");
using var stream = File.OpenRead(testArchive);
using var archive = ArchiveFactory.Open(stream);
foreach (var entry in archive.Entries.Where(entry => !entry.IsDirectory))
{
var targetPath = Path.Combine(SCRATCH_FILES_PATH, entry.Key!);
var targetDir = Path.GetDirectoryName(targetPath);
if (!string.IsNullOrEmpty(targetDir) && !Directory.Exists(targetDir))
{
Directory.CreateDirectory(targetDir);
}
using var sourceStream = await entry.OpenEntryStreamAsync(CancellationToken.None);
await using var targetStream = File.Create(targetPath);
await sourceStream.CopyToAsync(targetStream, CancellationToken.None);
}
VerifyFiles();
}
}
#endif
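The five tests share one body and differ only in the archive name; a consolidated sketch (editorial, not repository code) using an xUnit [Theory]:

[Theory]
[InlineData("7Zip.LZMA.7z")]
[InlineData("7Zip.LZMA2.7z")]
[InlineData("7Zip.solid.7z")]
[InlineData("7Zip.BZip2.7z")]
[InlineData("7Zip.PPMd.7z")]
public async Task SevenZipArchive_AsyncStreamExtraction(string archiveName)
{
    var testArchive = Path.Combine(TEST_ARCHIVES_PATH, archiveName);
    using var stream = File.OpenRead(testArchive);
    using var archive = ArchiveFactory.Open(stream);
    foreach (var entry in archive.Entries.Where(entry => !entry.IsDirectory))
    {
        var targetPath = Path.Combine(SCRATCH_FILES_PATH, entry.Key!);
        var targetDir = Path.GetDirectoryName(targetPath);
        if (!string.IsNullOrEmpty(targetDir) && !Directory.Exists(targetDir))
        {
            Directory.CreateDirectory(targetDir);
        }
        using var sourceStream = await entry.OpenEntryStreamAsync(CancellationToken.None);
        await using var targetStream = File.Create(targetPath);
        await sourceStream.CopyToAsync(targetStream, CancellationToken.None);
    }
    VerifyFiles();
}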

View File

@@ -9,6 +9,9 @@
<PropertyGroup Condition="'$(Configuration)|$(TargetFramework)|$(Platform)'=='Debug|net10.0|AnyCPU'">
<DefineConstants>$(DefineConstants);DEBUG_STREAMS</DefineConstants>
</PropertyGroup>
<PropertyGroup Condition=" '$(TargetFramework)' == 'net48' ">
<DefineConstants>$(DefineConstants);LEGACY_DOTNET</DefineConstants>
</PropertyGroup>
<PropertyGroup Condition="$([System.Runtime.InteropServices.RuntimeInformation]::IsOSPlatform($([System.Runtime.InteropServices.OSPlatform]::Windows)))">
<DefineConstants>$(DefineConstants);WINDOWS</DefineConstants>
</PropertyGroup>
@@ -25,7 +28,7 @@
<PackageReference Include="xunit" />
<PackageReference Include="Microsoft.NETFramework.ReferenceAssemblies" PrivateAssets="All" />
</ItemGroup>
<ItemGroup Condition=" '$(VersionlessImplicitFrameworkDefine)' != 'NETFRAMEWORK' ">
<ItemGroup Condition="$([System.Runtime.InteropServices.RuntimeInformation]::IsOSPlatform($([System.Runtime.InteropServices.OSPlatform]::Linux)))">
<PackageReference Include="Mono.Posix.NETStandard" />
</ItemGroup>
</Project>

View File

@@ -0,0 +1,202 @@
using System;
using System.IO;
using SharpCompress.Common;
using SharpCompress.Compressors;
using SharpCompress.Compressors.BZip2;
using SharpCompress.Compressors.Deflate;
using SharpCompress.Compressors.LZMA;
using SharpCompress.Compressors.Lzw;
using SharpCompress.Compressors.PPMd;
using SharpCompress.Compressors.Reduce;
using SharpCompress.Compressors.ZStandard;
using SharpCompress.IO;
using SharpCompress.Readers;
using SharpCompress.Test.Mocks;
using Xunit;
namespace SharpCompress.Test.Streams;
public class DisposalTests
{
private void VerifyStreamDisposal(
Func<Stream, bool, Stream> createStream,
bool supportsLeaveOpen = true
)
{
// 1. Test Dispose behavior (should dispose inner stream)
{
using var innerStream = new TestStream(new MemoryStream());
// createStream(stream, leaveOpen: false)
var stream = createStream(innerStream, false);
stream.Dispose();
// Some streams never dispose the inner stream (e.g. PpmdStream); those cases
// are covered by VerifyNeverDispose instead. For streams that follow the
// pattern, disposal of the inner stream is asserted here.
Assert.True(
innerStream.IsDisposed,
"Stream should have been disposed when leaveOpen=false"
);
}
// 2. Test LeaveOpen behavior (should NOT dispose inner stream)
if (supportsLeaveOpen)
{
using var innerStream = new TestStream(new MemoryStream());
// createStream(stream, leaveOpen: true)
var stream = createStream(innerStream, true);
stream.Dispose();
Assert.False(
innerStream.IsDisposed,
"Stream should NOT have been disposed when leaveOpen=true"
);
}
}
private void VerifyAlwaysDispose(Func<Stream, Stream> createStream)
{
using var innerStream = new TestStream(new MemoryStream());
var stream = createStream(innerStream);
stream.Dispose();
Assert.True(innerStream.IsDisposed, "Stream should have been disposed (AlwaysDispose)");
}
private void VerifyNeverDispose(Func<Stream, Stream> createStream)
{
using var innerStream = new TestStream(new MemoryStream());
var stream = createStream(innerStream);
stream.Dispose();
Assert.False(innerStream.IsDisposed, "Stream should NOT have been disposed (NeverDispose)");
}
[Fact]
public void SourceStream_Disposal()
{
VerifyStreamDisposal(
(stream, leaveOpen) =>
new SourceStream(
stream,
i => null,
new ReaderOptions { LeaveStreamOpen = leaveOpen }
)
);
}
[Fact]
public void ProgressReportingStream_Disposal()
{
VerifyStreamDisposal(
(stream, leaveOpen) =>
new ProgressReportingStream(
stream,
new Progress<ProgressReport>(),
"",
0,
leaveOpen: leaveOpen
)
);
}
[Fact]
public void DataDescriptorStream_Disposal()
{
// DataDescriptorStream DOES dispose inner stream
VerifyAlwaysDispose(stream => new DataDescriptorStream(stream));
}
[Fact]
public void DeflateStream_Disposal()
{
// DeflateStream in SharpCompress always disposes inner stream
VerifyAlwaysDispose(stream => new DeflateStream(stream, CompressionMode.Compress));
}
[Fact]
public void GZipStream_Disposal()
{
// GZipStream in SharpCompress always disposes inner stream
VerifyAlwaysDispose(stream => new GZipStream(stream, CompressionMode.Compress));
}
[Fact]
public void LzwStream_Disposal()
{
VerifyStreamDisposal(
(stream, leaveOpen) =>
{
var lzw = new LzwStream(stream);
lzw.IsStreamOwner = !leaveOpen;
return lzw;
}
);
}
[Fact]
public void PpmdStream_Disposal()
{
// PpmdStream seems to not dispose inner stream based on code analysis
// It takes PpmdProperties which we need to mock or create.
var props = new PpmdProperties();
VerifyNeverDispose(stream => new PpmdStream(props, stream, false));
}
[Fact]
public void LzmaStream_Disposal()
{
// LzmaStream always disposes inner stream
// Need to provide valid properties to avoid crash in constructor (invalid window size)
// 5 bytes: 1 byte properties + 4 bytes dictionary size (little endian)
// Dictionary size = 1024 (0x400) -> 00 04 00 00
var lzmaProps = new byte[] { 0, 0, 4, 0, 0 };
VerifyAlwaysDispose(stream => new LzmaStream(lzmaProps, stream));
}
[Fact]
public void LZipStream_Disposal()
{
// LZipStream now supports leaveOpen parameter
// Use Compress mode to avoid need for valid input header
VerifyStreamDisposal(
(stream, leaveOpen) => new LZipStream(stream, CompressionMode.Compress, leaveOpen)
);
}
[Fact]
public void BZip2Stream_Disposal()
{
// BZip2Stream now supports leaveOpen parameter
VerifyStreamDisposal(
(stream, leaveOpen) =>
new BZip2Stream(stream, CompressionMode.Compress, false, leaveOpen)
);
}
[Fact]
public void ReduceStream_Disposal()
{
// ReduceStream does not dispose inner stream
VerifyNeverDispose(stream => new ReduceStream(stream, 0, 0, 1));
}
[Fact]
public void ZStandard_CompressionStream_Disposal()
{
VerifyStreamDisposal(
(stream, leaveOpen) =>
new CompressionStream(stream, level: 0, bufferSize: 0, leaveOpen: leaveOpen)
);
}
[Fact]
public void ZStandard_DecompressionStream_Disposal()
{
VerifyStreamDisposal(
(stream, leaveOpen) =>
new DecompressionStream(
stream,
bufferSize: 0,
checkEndOfStream: false,
leaveOpen: leaveOpen
)
);
}
}

View File

@@ -0,0 +1,226 @@
using System;
using System.IO;
using System.Text;
using SharpCompress.Compressors;
using SharpCompress.Compressors.BZip2;
using SharpCompress.Compressors.LZMA;
using SharpCompress.Test.Mocks;
using Xunit;
namespace SharpCompress.Test.Streams;
public class LeaveOpenBehaviorTests
{
private static byte[] CreateTestData() =>
Encoding.UTF8.GetBytes("The quick brown fox jumps over the lazy dog");
[Fact]
public void BZip2Stream_Compress_LeaveOpen_False()
{
using var innerStream = new TestStream(new MemoryStream());
using (
var bzip2 = new BZip2Stream(
innerStream,
CompressionMode.Compress,
false,
leaveOpen: false
)
)
{
bzip2.Write(CreateTestData(), 0, CreateTestData().Length);
bzip2.Finish();
}
Assert.True(innerStream.IsDisposed, "Inner stream should be disposed when leaveOpen=false");
}
[Fact]
public void BZip2Stream_Compress_LeaveOpen_True()
{
using var innerStream = new TestStream(new MemoryStream());
byte[] compressed;
using (
var bzip2 = new BZip2Stream(
innerStream,
CompressionMode.Compress,
false,
leaveOpen: true
)
)
{
bzip2.Write(CreateTestData(), 0, CreateTestData().Length);
bzip2.Finish();
}
Assert.False(
innerStream.IsDisposed,
"Inner stream should NOT be disposed when leaveOpen=true"
);
// Should be able to read the compressed data
innerStream.Position = 0;
compressed = new byte[innerStream.Length];
innerStream.Read(compressed, 0, compressed.Length);
Assert.True(compressed.Length > 0);
}
[Fact]
public void BZip2Stream_Decompress_LeaveOpen_False()
{
// First compress some data
var memStream = new MemoryStream();
using (var bzip2 = new BZip2Stream(memStream, CompressionMode.Compress, false, true))
{
bzip2.Write(CreateTestData(), 0, CreateTestData().Length);
bzip2.Finish();
}
memStream.Position = 0;
using var innerStream = new TestStream(memStream);
var decompressed = new byte[CreateTestData().Length];
using (
var bzip2 = new BZip2Stream(
innerStream,
CompressionMode.Decompress,
false,
leaveOpen: false
)
)
{
bzip2.Read(decompressed, 0, decompressed.Length);
}
Assert.True(innerStream.IsDisposed, "Inner stream should be disposed when leaveOpen=false");
Assert.Equal(CreateTestData(), decompressed);
}
[Fact]
public void BZip2Stream_Decompress_LeaveOpen_True()
{
// First compress some data
var memStream = new MemoryStream();
using (var bzip2 = new BZip2Stream(memStream, CompressionMode.Compress, false, true))
{
bzip2.Write(CreateTestData(), 0, CreateTestData().Length);
bzip2.Finish();
}
memStream.Position = 0;
using var innerStream = new TestStream(memStream);
var decompressed = new byte[CreateTestData().Length];
using (
var bzip2 = new BZip2Stream(
innerStream,
CompressionMode.Decompress,
false,
leaveOpen: true
)
)
{
bzip2.Read(decompressed, 0, decompressed.Length);
}
Assert.False(
innerStream.IsDisposed,
"Inner stream should NOT be disposed when leaveOpen=true"
);
Assert.Equal(CreateTestData(), decompressed);
// Should still be able to use the stream
innerStream.Position = 0;
Assert.True(innerStream.CanRead);
}
[Fact]
public void LZipStream_Compress_LeaveOpen_False()
{
using var innerStream = new TestStream(new MemoryStream());
using (var lzip = new LZipStream(innerStream, CompressionMode.Compress, leaveOpen: false))
{
lzip.Write(CreateTestData(), 0, CreateTestData().Length);
lzip.Finish();
}
Assert.True(innerStream.IsDisposed, "Inner stream should be disposed when leaveOpen=false");
}
[Fact]
public void LZipStream_Compress_LeaveOpen_True()
{
using var innerStream = new TestStream(new MemoryStream());
byte[] compressed;
using (var lzip = new LZipStream(innerStream, CompressionMode.Compress, leaveOpen: true))
{
lzip.Write(CreateTestData(), 0, CreateTestData().Length);
lzip.Finish();
}
Assert.False(
innerStream.IsDisposed,
"Inner stream should NOT be disposed when leaveOpen=true"
);
// Should be able to read the compressed data
innerStream.Position = 0;
compressed = new byte[innerStream.Length];
innerStream.Read(compressed, 0, compressed.Length);
Assert.True(compressed.Length > 0);
}
[Fact]
public void LZipStream_Decompress_LeaveOpen_False()
{
// First compress some data
var memStream = new MemoryStream();
using (var lzip = new LZipStream(memStream, CompressionMode.Compress, true))
{
lzip.Write(CreateTestData(), 0, CreateTestData().Length);
lzip.Finish();
}
memStream.Position = 0;
using var innerStream = new TestStream(memStream);
var decompressed = new byte[CreateTestData().Length];
using (var lzip = new LZipStream(innerStream, CompressionMode.Decompress, leaveOpen: false))
{
lzip.Read(decompressed, 0, decompressed.Length);
}
Assert.True(innerStream.IsDisposed, "Inner stream should be disposed when leaveOpen=false");
Assert.Equal(CreateTestData(), decompressed);
}
[Fact]
public void LZipStream_Decompress_LeaveOpen_True()
{
// First compress some data
var memStream = new MemoryStream();
using (var lzip = new LZipStream(memStream, CompressionMode.Compress, true))
{
lzip.Write(CreateTestData(), 0, CreateTestData().Length);
lzip.Finish();
}
memStream.Position = 0;
using var innerStream = new TestStream(memStream);
var decompressed = new byte[CreateTestData().Length];
using (var lzip = new LZipStream(innerStream, CompressionMode.Decompress, leaveOpen: true))
{
lzip.Read(decompressed, 0, decompressed.Length);
}
Assert.False(
innerStream.IsDisposed,
"Inner stream should NOT be disposed when leaveOpen=true"
);
Assert.Equal(CreateTestData(), decompressed);
// Should still be able to use the stream
innerStream.Position = 0;
Assert.True(innerStream.CanRead);
}
}

View File

@@ -251,4 +251,106 @@ public class ZipReaderAsyncTests : ReaderTests
}
Assert.Equal(8, count);
}
[Fact]
public async ValueTask EntryStream_Dispose_DoesNotThrow_OnNonSeekableStream_Deflate_Async()
{
// Since version 0.41.0: EntryStream.DisposeAsync() should not throw NotSupportedException
// when FlushAsync() fails on non-seekable streams (Deflate compression)
var path = Path.Combine(TEST_ARCHIVES_PATH, "Zip.deflate.dd.zip");
using Stream stream = new ForwardOnlyStream(File.OpenRead(path));
using var reader = ReaderFactory.Open(stream);
// This should not throw, even if internal FlushAsync() fails
while (await reader.MoveToNextEntryAsync())
{
if (!reader.Entry.IsDirectory)
{
#if LEGACY_DOTNET
using var entryStream = await reader.OpenEntryStreamAsync();
#else
await using var entryStream = await reader.OpenEntryStreamAsync();
#endif
// Read some data
var buffer = new byte[1024];
await entryStream.ReadAsync(buffer, 0, buffer.Length);
// DisposeAsync should not throw NotSupportedException
}
}
}
[Fact]
public async ValueTask EntryStream_Dispose_DoesNotThrow_OnNonSeekableStream_LZMA_Async()
{
// Since version 0.41.0: EntryStream.DisposeAsync() should not throw NotSupportedException
// when FlushAsync() fails on non-seekable streams (LZMA compression)
var path = Path.Combine(TEST_ARCHIVES_PATH, "Zip.lzma.dd.zip");
using Stream stream = new ForwardOnlyStream(File.OpenRead(path));
using var reader = ReaderFactory.Open(stream);
// This should not throw, even if internal FlushAsync() fails
while (await reader.MoveToNextEntryAsync())
{
if (!reader.Entry.IsDirectory)
{
#if LEGACY_DOTNET
using var entryStream = await reader.OpenEntryStreamAsync();
#else
await using var entryStream = await reader.OpenEntryStreamAsync();
#endif
// Read some data
var buffer = new byte[1024];
await entryStream.ReadAsync(buffer, 0, buffer.Length);
// DisposeAsync should not throw NotSupportedException
}
}
}
[Fact]
public async ValueTask Archive_Iteration_DoesNotBreak_WhenFlushThrows_Deflate_Async()
{
// Regression test: since 0.41.0, archive iteration would silently break
// when the input stream throws NotSupportedException in Flush().
// Only the first entry would be returned, then iteration would stop without exception.
var path = Path.Combine(TEST_ARCHIVES_PATH, "Zip.deflate.dd.zip");
using var fileStream = File.OpenRead(path);
using Stream stream = new ThrowOnFlushStream(fileStream);
using var reader = ReaderFactory.Open(stream);
var count = 0;
while (await reader.MoveToNextEntryAsync())
{
if (!reader.Entry.IsDirectory)
{
count++;
}
}
// Should iterate through all entries, not just the first one
Assert.True(count > 1, $"Expected more than 1 entry, but got {count}");
}
[Fact]
public async ValueTask Archive_Iteration_DoesNotBreak_WhenFlushThrows_LZMA_Async()
{
// Regression test: since 0.41.0, archive iteration would silently break
// when the input stream throws NotSupportedException in Flush().
// Only the first entry would be returned, then iteration would stop without exception.
var path = Path.Combine(TEST_ARCHIVES_PATH, "Zip.lzma.dd.zip");
using var fileStream = File.OpenRead(path);
using Stream stream = new ThrowOnFlushStream(fileStream);
using var reader = ReaderFactory.Open(stream);
var count = 0;
while (await reader.MoveToNextEntryAsync())
{
if (!reader.Entry.IsDirectory)
{
count++;
}
}
// Should iterate through all entries, not just the first one
Assert.True(count > 1, $"Expected more than 1 entry, but got {count}");
}
}

View File

@@ -436,4 +436,98 @@ public class ZipReaderTests : ReaderTests
Assert.Equal(archiveKeys.OrderBy(k => k), readerKeys.OrderBy(k => k));
}
}
[Fact]
public void EntryStream_Dispose_DoesNotThrow_OnNonSeekableStream_Deflate()
{
// Since version 0.41.0: EntryStream.Dispose() should not throw NotSupportedException
// when Flush() fails on non-seekable streams (Deflate compression)
var path = Path.Combine(TEST_ARCHIVES_PATH, "Zip.deflate.dd.zip");
using Stream stream = new ForwardOnlyStream(File.OpenRead(path));
using var reader = ReaderFactory.Open(stream);
// This should not throw, even if internal Flush() fails
while (reader.MoveToNextEntry())
{
if (!reader.Entry.IsDirectory)
{
using var entryStream = reader.OpenEntryStream();
// Read some data
var buffer = new byte[1024];
entryStream.Read(buffer, 0, buffer.Length);
// Dispose should not throw NotSupportedException
}
}
}
[Fact]
public void EntryStream_Dispose_DoesNotThrow_OnNonSeekableStream_LZMA()
{
// Since version 0.41.0: EntryStream.Dispose() should not throw NotSupportedException
// when Flush() fails on non-seekable streams (LZMA compression)
var path = Path.Combine(TEST_ARCHIVES_PATH, "Zip.lzma.dd.zip");
using Stream stream = new ForwardOnlyStream(File.OpenRead(path));
using var reader = ReaderFactory.Open(stream);
// This should not throw, even if internal Flush() fails
while (reader.MoveToNextEntry())
{
if (!reader.Entry.IsDirectory)
{
using var entryStream = reader.OpenEntryStream();
// Read some data
var buffer = new byte[1024];
entryStream.Read(buffer, 0, buffer.Length);
// Dispose should not throw NotSupportedException
}
}
}
[Fact]
public void Archive_Iteration_DoesNotBreak_WhenFlushThrows_Deflate()
{
// Regression test: since 0.41.0, archive iteration would silently break
// when the input stream throws NotSupportedException in Flush().
// Only the first entry would be returned, then iteration would stop without exception.
var path = Path.Combine(TEST_ARCHIVES_PATH, "Zip.deflate.dd.zip");
using var fileStream = File.OpenRead(path);
using Stream stream = new ThrowOnFlushStream(fileStream);
using var reader = ReaderFactory.Open(stream);
var count = 0;
while (reader.MoveToNextEntry())
{
if (!reader.Entry.IsDirectory)
{
count++;
}
}
// Should iterate through all entries, not just the first one
Assert.True(count > 1, $"Expected more than 1 entry, but got {count}");
}
[Fact]
public void Archive_Iteration_DoesNotBreak_WhenFlushThrows_LZMA()
{
// Regression test: since 0.41.0, archive iteration would silently break
// when the input stream throws NotSupportedException in Flush().
// Only the first entry would be returned, then iteration would stop without exception.
var path = Path.Combine(TEST_ARCHIVES_PATH, "Zip.lzma.dd.zip");
using var fileStream = File.OpenRead(path);
using Stream stream = new ThrowOnFlushStream(fileStream);
using var reader = ReaderFactory.Open(stream);
var count = 0;
while (reader.MoveToNextEntry())
{
if (!reader.Entry.IsDirectory)
{
count++;
}
}
// Should iterate through all entries, not just the first one
Assert.True(count > 1, $"Expected more than 1 entry, but got {count}");
}
}

Binary files not shown (9 files).