Compare commits

..

8 Commits

Author SHA1 Message Date
Adam Hathcock
5c4b83e501 Merge remote-tracking branch 'origin/master' into copilot/add-performance-benchmarking
# Conflicts:
#	tests/SharpCompress.Performance/packages.lock.json
2026-01-14 14:52:21 +00:00
copilot-swe-agent[bot]
80ac10a5fe Merge latest master branch and resolve conflicts
Co-authored-by: adamhathcock <527620+adamhathcock@users.noreply.github.com>
2026-01-12 14:11:57 +00:00
copilot-swe-agent[bot]
a92ce90252 Fix path validation and add iteration cleanup to prevent file reuse
Co-authored-by: adamhathcock <527620+adamhathcock@users.noreply.github.com>
2026-01-05 17:38:17 +00:00
copilot-swe-agent[bot]
e519f61f0f Address code review feedback: fix exception handling and initialization order
Co-authored-by: adamhathcock <527620+adamhathcock@users.noreply.github.com>
2026-01-05 17:36:08 +00:00
copilot-swe-agent[bot]
49f2271253 Format code with CSharpier
Co-authored-by: adamhathcock <527620+adamhathcock@users.noreply.github.com>
2026-01-05 17:33:06 +00:00
copilot-swe-agent[bot]
5b1d11bc1d Add WriteBenchmarks, BaselineComparisonBenchmarks, and comprehensive documentation
Co-authored-by: adamhathcock <527620+adamhathcock@users.noreply.github.com>
2026-01-05 17:31:32 +00:00
copilot-swe-agent[bot]
aa3a40d968 Add BenchmarkDotNet integration with Archive and Reader benchmarks
Co-authored-by: adamhathcock <527620+adamhathcock@users.noreply.github.com>
2026-01-05 17:27:50 +00:00
copilot-swe-agent[bot]
6125654b2e Initial plan 2026-01-05 17:21:29 +00:00
600 changed files with 17008 additions and 38073 deletions


@@ -3,7 +3,7 @@
"isRoot": true,
"tools": {
"csharpier": {
"version": "1.2.6",
"version": "1.2.5",
"commands": [
"csharpier"
],

.copilot-agent.yml Normal file

@@ -0,0 +1,7 @@
enabled: true
agent:
name: copilot-coding-agent
allow:
- paths: ["src/**/*", "tests/**/*", "README.md", "AGENTS.md"]
actions: ["create", "modify"]
require_review_before_merge: true


@@ -307,6 +307,7 @@ dotnet_diagnostic.CS8602.severity = error
dotnet_diagnostic.CS8604.severity = error
dotnet_diagnostic.CS8618.severity = error
dotnet_diagnostic.CS0618.severity = suggestion
dotnet_diagnostic.CS1998.severity = error
dotnet_diagnostic.CS4014.severity = error
dotnet_diagnostic.CS8600.severity = error
dotnet_diagnostic.CS8603.severity = error

.github/agents/copilot-agent.yml vendored Normal file

@@ -0,0 +1,17 @@
enabled: true
agent:
name: copilot-coding-agent
allow:
- paths: ["src/**/*", "tests/**/*", "README.md", "AGENTS.md"]
actions: ["create", "modify", "delete"]
require_review_before_merge: true
required_approvals: 1
allowed_merge_strategies:
- squash
- merge
auto_merge_on_green: false
run_workflows: true
notes: |
- This manifest expresses the policy for the Copilot coding agent in this repository.
- It does NOT install or authorize the agent; a repository admin must install the Copilot coding agent app and grant the repository the necessary permissions (contents: write, pull_requests: write, checks: write, actions: write/read, issues: write) to allow the agent to act.
- Keep allow paths narrow and prefer require_review_before_merge during initial rollout.

.github/prompts/plan-async.prompt.md vendored Normal file

@@ -0,0 +1,25 @@
# Plan: Implement Missing Async Functionality in SharpCompress
SharpCompress has async support for low-level stream operations and Reader/Writer APIs, but critical entry points (Archive.Open, factory methods, initialization) remain synchronous. This plan adds async overloads for all user-facing I/O operations and fixes existing async bugs, enabling full end-to-end async workflows.
## Steps
1. **Add async factory methods** to [ArchiveFactory.cs](src/SharpCompress/Factories/ArchiveFactory.cs), [ReaderFactory.cs](src/SharpCompress/Factories/ReaderFactory.cs), and [WriterFactory.cs](src/SharpCompress/Factories/WriterFactory.cs) with `OpenAsync` and `CreateAsync` overloads accepting `CancellationToken` (a sketch of the intended shape follows this list)
2. **Implement async Open methods** on concrete archive types ([ZipArchive.cs](src/SharpCompress/Archives/Zip/ZipArchive.cs), [TarArchive.cs](src/SharpCompress/Archives/Tar/TarArchive.cs), [RarArchive.cs](src/SharpCompress/Archives/Rar/RarArchive.cs), [GZipArchive.cs](src/SharpCompress/Archives/GZip/GZipArchive.cs), [SevenZipArchive.cs](src/SharpCompress/Archives/SevenZip/SevenZipArchive.cs)) and reader types ([ZipReader.cs](src/SharpCompress/Readers/Zip/ZipReader.cs), [TarReader.cs](src/SharpCompress/Readers/Tar/TarReader.cs), etc.)
3. **Convert archive initialization logic to async** including header reading, volume loading, and format signature detection across archive constructors and internal initialization methods
4. **Fix LZMA decoder async bugs** in [LzmaStream.cs](src/SharpCompress/Compressors/LZMA/LzmaStream.cs), [Decoder.cs](src/SharpCompress/Compressors/LZMA/Decoder.cs), and [OutWindow.cs](src/SharpCompress/Compressors/LZMA/OutWindow.cs) to enable true async 7Zip support and remove `NonDisposingStream` workaround
5. **Complete Rar async implementation** by converting `UnpackV2017` methods to async in [UnpackV2017.cs](src/SharpCompress/Compressors/Rar/UnpackV2017.cs) and updating Rar20 decompression
6. **Add comprehensive async tests** covering all new async entry points, cancellation scenarios, and concurrent operations across all archive formats in test files
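The overloads in step 1 might take roughly this shape. This is a sketch of the intent only; `DetectFormatAsync` and the per-format `OpenAsync` it delegates to are assumed helpers, not existing APIs:
```csharp
// Hypothetical shape for a step-1 overload (names and helpers assumed).
public static async ValueTask<IArchive> OpenAsync(
    Stream stream,
    ReaderOptions? readerOptions = null,
    CancellationToken cancellationToken = default)
{
    // Probe the signature without blocking, then delegate to the detected format.
    var factory = await DetectFormatAsync(stream, cancellationToken).ConfigureAwait(false);
    return await factory.OpenAsync(stream, readerOptions, cancellationToken).ConfigureAwait(false);
}
```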
## Further Considerations
1. **Breaking changes** - Should new async methods be added alongside existing sync methods (non-breaking), or should sync methods eventually be deprecated? Recommend additive approach for backward compatibility.
2. **Performance impact** - Headers for formats like Zip/Tar are often small; consider whether truly async parsing adds value over sync parsing wrapped in a Task (a sketch follows this list), or make it conditional on stream type (network vs file).
3. **7Zip complexity** - The LZMA async bug fix (Step 4) may be challenging due to state management in the decoder; consider whether to scope it separately or implement a simpler workaround that maintains correctness.
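A minimal sketch of the wrapped-sync alternative from point 2, assuming the existing synchronous `ZipArchive.Open(string)` entry point; the method name and placement are illustrative only:
```csharp
// Sync parse behind an async signature: reasonable when headers are small
// and the source is a local file. ZipArchive.Open is the existing sync API.
public static ValueTask<ZipArchive> OpenAsync(string filePath, CancellationToken cancellationToken = default)
{
    cancellationToken.ThrowIfCancellationRequested();
    return new ValueTask<ZipArchive>(ZipArchive.Open(filePath));
}
```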

.github/prompts/plan-for-next.prompt.md vendored Normal file

@@ -0,0 +1,123 @@
# Plan: Modernize SharpCompress Public API
Based on a comprehensive analysis, the API has several inconsistencies around factory patterns, async support, format capabilities, and options classes. Most improvements can be done incrementally without breaking changes.
## Steps
1. **Standardize factory patterns** by deprecating format-specific static `Open` methods in [Archives/Zip/ZipArchive.cs](src/SharpCompress/Archives/Zip/ZipArchive.cs), [Archives/Tar/TarArchive.cs](src/SharpCompress/Archives/Tar/TarArchive.cs), etc. in favor of centralized [Factories/ArchiveFactory.cs](src/SharpCompress/Factories/ArchiveFactory.cs)
2. **Complete async implementation** in [Writers/Zip/ZipWriter.cs](src/SharpCompress/Writers/Zip/ZipWriter.cs) and other writers that currently use sync-over-async, implementing true async I/O throughout the writer hierarchy
3. **Unify options classes** by making [Common/ExtractionOptions.cs](src/SharpCompress/Common/ExtractionOptions.cs) inherit from `OptionsBase` and adding progress reporting to extraction methods consistently
4. **Clarify GZip semantics** in [Archives/GZip/GZipArchive.cs](src/SharpCompress/Archives/GZip/GZipArchive.cs) by adding XML documentation explaining single-entry limitation and relationship to GZip compression used in Tar.gz
## Further Considerations
1. **Breaking changes roadmap** - Should we plan a major version (2.0) to remove deprecated factory methods, clean up `ArchiveType` enum (remove Arc/Arj or add full support), and consolidate naming patterns?
2. **Progress reporting consistency** - Should `IProgress<ArchiveExtractionProgress<IEntry>>` be added to all extraction extension methods or consolidated into options classes?
## Detailed Analysis
### Factory Pattern Issues
Three different factory patterns exist with overlapping functionality:
1. **Static Factories**: ArchiveFactory, ReaderFactory, WriterFactory
2. **Instance Factories**: IArchiveFactory, IReaderFactory, IWriterFactory
3. **Format-specific static methods**: Each archive class has static `Open` methods
**Example confusion:**
```csharp
// Three ways to open a Zip archive - which is recommended?
var archive1 = ArchiveFactory.Open("file.zip");
var archive2 = ZipArchive.Open("file.zip");
var archive3 = ArchiveFactory.AutoFactory.Open(fileInfo, options);
```
### Async Support Gaps
The base `IWriter` interface declares async methods, but most writer implementations provide only minimal support and simply call the synchronous versions:
```csharp
public virtual async Task WriteAsync(...)
{
// Default implementation calls synchronous version
Write(filename, source, modificationTime);
await Task.CompletedTask.ConfigureAwait(false);
}
```
Only `TarWriter` has a proper async implementation; the remaining writers fall back to sync-over-async.
### GZip Archive Special Case
GZip is treated as both a compression format and an archive format, but only supports single-entry archives:
```csharp
protected override GZipArchiveEntry CreateEntryInternal(...)
{
if (Entries.Any())
{
throw new InvalidFormatException("Only one entry is allowed in a GZip Archive");
}
// ...
}
```
### Options Class Hierarchy
```
OptionsBase (LeaveStreamOpen, ArchiveEncoding)
├─ ReaderOptions (LookForHeader, Password, DisableCheckIncomplete, BufferSize, ExtensionHint, Progress)
├─ WriterOptions (CompressionType, CompressionLevel, Progress)
│ ├─ ZipWriterOptions (ArchiveComment, UseZip64)
│ ├─ TarWriterOptions (FinalizeArchiveOnClose, HeaderFormat)
│ └─ GZipWriterOptions (no additional properties)
└─ ExtractionOptions (standalone - Overwrite, ExtractFullPath, PreserveFileTime, PreserveAttributes)
```
**Issues:**
- `ExtractionOptions` doesn't inherit from `OptionsBase` - no encoding support during extraction (see the sketch after this list)
- Progress reporting inconsistency between readers and extraction
- Obsolete properties (`ChecksumIsValid`, `Version`) with unclear migration path
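A sketch of the first fix; this is hypothetical restructuring, reusing only the properties documented in the hierarchy above:
```csharp
// Hypothetical: ExtractionOptions inheriting OptionsBase, so extraction
// gains ArchiveEncoding and LeaveStreamOpen without new plumbing.
public class ExtractionOptions : OptionsBase
{
    public bool Overwrite { get; set; }
    public bool ExtractFullPath { get; set; }
    public bool PreserveFileTime { get; set; }
    public bool PreserveAttributes { get; set; }
}
```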
### Implementation Priorities
**High Priority (Non-Breaking):**
1. Add API usage guide (Archive vs Reader, factory recommendations, async best practices)
2. Fix progress reporting consistency
3. Complete async implementation in writers
**Medium Priority (Next Major Version):**
1. Unify factory pattern - deprecate format-specific static `Open` methods
2. Clean up options classes - make `ExtractionOptions` inherit from `OptionsBase`
3. Clarify archive types - remove Arc/Arj from `ArchiveType` enum or add full support
4. Standardize naming across archive types
**Low Priority:**
1. Add BZip2 archive support similar to GZipArchive
2. Complete obsolete property cleanup with migration guide
### Backward Compatibility Strategy
**Safe (Non-Breaking) Changes:**
- Add new methods to interfaces (use default implementations; see the sketch after this list)
- Add new options properties (with defaults)
- Add new factory methods
- Improve async implementations
- Add progress reporting support
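The first item leans on default interface members (C# 8+); a minimal sketch of the pattern, with the `WriteToDirectoryAsync` shape assumed rather than taken from the current API:
```csharp
// Non-breaking interface growth via a default implementation:
// existing implementers compile unchanged; new ones can override with true async.
public interface IArchive : IDisposable
{
    void WriteToDirectory(string destinationDirectory);

    ValueTask WriteToDirectoryAsync(string destinationDirectory, CancellationToken cancellationToken = default)
    {
        WriteToDirectory(destinationDirectory); // default: sync fallback
        return default; // completed ValueTask
    }
}
```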
**Breaking Changes to Avoid:**
- ❌ Removing format-specific `Open` methods (deprecate instead)
- ❌ Changing `LeaveStreamOpen` default (currently `true`)
- ❌ Removing obsolete properties before major version bump
- ❌ Changing return types or signatures of existing methods
**Deprecation Pattern:**
- Use `[Obsolete]` for one major version
- Use `[EditorBrowsable(EditorBrowsableState.Never)]` in next major version
- Remove in following major version
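Applied to a format-specific factory method, the pattern looks roughly like this (message text and the `OpenCore` shared implementation are hypothetical):
```csharp
// Version N: deprecate with guidance; existing callers still compile, with a warning.
[Obsolete("Use ArchiveFactory.Open(path) instead.")]
public static ZipArchive Open(string filePath) => OpenCore(filePath); // OpenCore: hypothetical shared impl

// Version N+1: keep the [Obsolete] warning and additionally hide the method
// from completion lists with [EditorBrowsable(EditorBrowsableState.Never)]
// (System.ComponentModel).

// Version N+2: remove the method entirely in the major release.
```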


@@ -1,50 +0,0 @@
name: Performance Benchmarks
on:
push:
branches:
- 'master'
- 'release'
pull_request:
branches:
- 'master'
- 'release'
workflow_dispatch:
permissions:
contents: read
jobs:
benchmark:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v6
with:
fetch-depth: 0
- uses: actions/setup-dotnet@v5
with:
dotnet-version: 10.0.x
- name: Build Performance Project
run: dotnet build tests/SharpCompress.Performance/SharpCompress.Performance.csproj --configuration Release
- name: Run Benchmarks
run: dotnet run --project tests/SharpCompress.Performance/SharpCompress.Performance.csproj --configuration Release --no-build -- --filter "*" --exporters json markdown --artifacts benchmark-results
continue-on-error: true
- name: Display Benchmark Results
if: always()
run: dotnet run --project build/build.csproj -- display-benchmark-results
- name: Compare with Baseline
if: always()
run: dotnet run --project build/build.csproj -- compare-benchmark-results
- name: Upload Benchmark Results
if: always()
uses: actions/upload-artifact@v6
with:
name: benchmark-results
path: benchmark-results/

.gitignore vendored

@@ -4,8 +4,8 @@ _ReSharper.SharpCompress/
bin/
*.suo
*.user
tests/TestArchives/Scratch/
tests/TestArchives/Scratch2/
TestArchives/Scratch/
TestArchives/Scratch2/
TestResults/
*.nupkg
packages/*/
@@ -17,10 +17,10 @@ tests/TestArchives/*/Scratch2
tools
.idea/
artifacts/
BenchmarkDotNet.Artifacts/
baseline-artifacts/
profiler-snapshots/
.DS_Store
*.snupkg
benchmark-results/
# BenchmarkDotNet artifacts
BenchmarkDotNet.Artifacts/
**/BenchmarkDotNet.Artifacts/


@@ -178,59 +178,5 @@ SharpCompress supports multiple archive and compression formats:
2. **Solid archives (Rar, 7Zip)** - Use `ExtractAllEntries()` for best performance, not individual entry extraction
3. **Stream disposal** - Always set `LeaveStreamOpen` explicitly when needed (default is to close)
4. **Tar + non-seekable stream** - Must provide file size or it will throw (see the sketch after this list)
5. **Multi-framework differences** - Some features differ between .NET Framework and modern .NET (e.g., Mono.Posix)
6. **Format detection** - Use `ReaderFactory.Open()` for auto-detection, test with actual archive files
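For gotcha 4, a sketch of declaring the size up front when the source can't be rewound; the `long? size` parameter on `TarWriter.Write` reflects the current API surface, but treat the exact overload as an assumption (`httpClient`, `url`, and `contentLength` are context variables):
```csharp
// Tar headers store the entry size, so a non-seekable source must declare it;
// without a size, TarWriter must seek the stream to measure it and will throw
// if it cannot.
using var output = File.Create("out.tar");
using var writer = new TarWriter(output, new TarWriterOptions(CompressionType.None, finalizeArchiveOnClose: true));
using var network = await httpClient.GetStreamAsync(url); // non-seekable
writer.Write("payload.bin", network, DateTime.UtcNow, size: contentLength);
```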
### Async Struct-Copy Bug in LZMA RangeCoder
When implementing async methods on mutable `struct` types (like `BitEncoder` and `BitDecoder` in the LZMA RangeCoder), be aware that the async state machine copies the struct when `await` is encountered. This means mutations to struct fields after the `await` point may not persist back to the original struct stored in arrays or fields.
**The Bug:**
```csharp
// BAD: async method on mutable struct
public async ValueTask<uint> DecodeAsync(Decoder decoder, CancellationToken cancellationToken = default)
{
var newBound = (decoder._range >> K_NUM_BIT_MODEL_TOTAL_BITS) * _prob;
if (decoder._code < newBound)
{
decoder._range = newBound;
_prob += (K_BIT_MODEL_TOTAL - _prob) >> K_NUM_MOVE_BITS; // Mutates _prob
await decoder.Normalize2Async(cancellationToken).ConfigureAwait(false); // Struct gets copied here
return 0; // Original _prob update may be lost
}
// ...
}
```
**The Fix:**
Refactor async methods on mutable structs to perform all struct mutations synchronously before any `await`, or use a helper method to separate the await from the struct mutation:
```csharp
// GOOD: struct mutations happen synchronously, await is conditional
public ValueTask<uint> DecodeAsync(Decoder decoder, CancellationToken cancellationToken = default)
{
var newBound = (decoder._range >> K_NUM_BIT_MODEL_TOTAL_BITS) * _prob;
if (decoder._code < newBound)
{
decoder._range = newBound;
_prob += (K_BIT_MODEL_TOTAL - _prob) >> K_NUM_MOVE_BITS; // All mutations complete
return DecodeAsyncHelper(decoder.Normalize2Async(cancellationToken), 0); // Await in helper
}
decoder._range -= newBound;
decoder._code -= newBound;
_prob -= (_prob) >> K_NUM_MOVE_BITS; // All mutations complete
return DecodeAsyncHelper(decoder.Normalize2Async(cancellationToken), 1); // Await in helper
}
private static async ValueTask<uint> DecodeAsyncHelper(ValueTask normalizeTask, uint result)
{
await normalizeTask.ConfigureAwait(false);
return result;
}
```
**Why This Matters:**
In LZMA, the `BitEncoder` and `BitDecoder` structs maintain adaptive probability models in their `_prob` field. When these structs are stored in arrays (e.g., `_models[m]`), the async state machine copy breaks the adaptive model, causing incorrect bit decoding and eventually `DataErrorException` exceptions.
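The effect is easy to reproduce outside SharpCompress; a minimal standalone demo (not library code):
```csharp
using System;
using System.Threading.Tasks;

struct Counter
{
    public int Value;

    // The async state machine operates on a copy of 'this', so this
    // increment never reaches the struct stored in the array.
    public async Task IncrementAsync()
    {
        await Task.Yield();
        Value++;
    }
}

class Demo
{
    static async Task Main()
    {
        var counters = new Counter[1];
        await counters[0].IncrementAsync();
        Console.WriteLine(counters[0].Value); // prints 0, not 1
    }
}
```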
**Related Files:**
- `src/SharpCompress/Compressors/LZMA/RangeCoder/RangeCoderBit.Async.cs` - Fixed
- `src/SharpCompress/Compressors/LZMA/RangeCoder/RangeCoderBitTree.Async.cs` - Uses readonly structs, so this pattern doesn't apply


@@ -12,6 +12,5 @@
<RunAnalyzersDuringBuild>False</RunAnalyzersDuringBuild>
<ManagePackageVersionsCentrally>true</ManagePackageVersionsCentrally>
<RestorePackagesWithLockFile>true</RestorePackagesWithLockFile>
<CentralPackageTransitivePinningEnabled>true</CentralPackageTransitivePinningEnabled>
</PropertyGroup>
</Project>


@@ -1,10 +1,10 @@
<Project>
<ItemGroup>
<PackageVersion Include="BenchmarkDotNet" Version="0.15.8" />
<PackageVersion Include="BenchmarkDotNet" Version="0.14.0" />
<PackageVersion Include="Bullseye" Version="6.1.0" />
<PackageVersion Include="AwesomeAssertions" Version="9.3.0" />
<PackageVersion Include="Glob" Version="1.1.9" />
<PackageVersion Include="JetBrains.Profiler.SelfApi" Version="2.5.16" />
<PackageVersion Include="JetBrains.Profiler.SelfApi" Version="2.5.15" />
<PackageVersion Include="Microsoft.Bcl.AsyncInterfaces" Version="10.0.0" />
<PackageVersion Include="Microsoft.NET.Test.Sdk" Version="18.0.1" />
<PackageVersion Include="Mono.Posix.NETStandard" Version="1.0.0" />
@@ -12,9 +12,9 @@
<PackageVersion Include="System.Text.Encoding.CodePages" Version="10.0.0" />
<PackageVersion Include="System.Buffers" Version="4.6.1" />
<PackageVersion Include="System.Memory" Version="4.6.3" />
<PackageVersion Include="xunit.v3" Version="3.2.2" />
<PackageVersion Include="xunit" Version="2.9.3" />
<PackageVersion Include="xunit.runner.visualstudio" Version="3.1.5" />
<GlobalPackageReference Include="Microsoft.SourceLink.GitHub" Version="10.0.102" />
<GlobalPackageReference Include="Microsoft.SourceLink.GitHub" Version="8.0.0" />
<GlobalPackageReference Include="Microsoft.NETFramework.ReferenceAssemblies" Version="1.0.3" />
<GlobalPackageReference
Include="Microsoft.VisualStudio.Threading.Analyzers"


@@ -18,11 +18,12 @@ Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "Config", "Config", "{CDB425
Directory.Build.props = Directory.Build.props
global.json = global.json
.editorconfig = .editorconfig
.gitignore = .gitignore
Directory.Packages.props = Directory.Packages.props
NuGet.config = NuGet.config
.github\workflows\nuget-release.yml = .github\workflows\nuget-release.yml
USAGE.md = USAGE.md
README.md = README.md
FORMATS.md = FORMATS.md
AGENTS.md = AGENTS.md
EndProjectSection
EndProject


@@ -19,9 +19,6 @@ const string Publish = "publish";
const string DetermineVersion = "determine-version";
const string UpdateVersion = "update-version";
const string PushToNuGet = "push-to-nuget";
const string DisplayBenchmarkResults = "display-benchmark-results";
const string CompareBenchmarkResults = "compare-benchmark-results";
const string GenerateBaseline = "generate-baseline";
Target(
Clean,
@@ -213,249 +210,6 @@ Target(
}
);
Target(
DisplayBenchmarkResults,
() =>
{
var githubStepSummary = Environment.GetEnvironmentVariable("GITHUB_STEP_SUMMARY");
var resultsDir = "benchmark-results/results";
if (!Directory.Exists(resultsDir))
{
Console.WriteLine("No benchmark results found.");
return;
}
var markdownFiles = Directory
.GetFiles(resultsDir, "*-report-github.md")
.OrderBy(f => f)
.ToList();
if (markdownFiles.Count == 0)
{
Console.WriteLine("No benchmark markdown reports found.");
return;
}
var output = new List<string> { "## Benchmark Results", "" };
foreach (var file in markdownFiles)
{
Console.WriteLine($"Processing {Path.GetFileName(file)}");
var content = File.ReadAllText(file);
output.Add(content);
output.Add("");
}
// Write to GitHub Step Summary if available
if (!string.IsNullOrEmpty(githubStepSummary))
{
File.AppendAllLines(githubStepSummary, output);
Console.WriteLine($"Benchmark results written to GitHub Step Summary");
}
else
{
// Write to console if not in GitHub Actions
foreach (var line in output)
{
Console.WriteLine(line);
}
}
}
);
Target(
CompareBenchmarkResults,
() =>
{
var githubStepSummary = Environment.GetEnvironmentVariable("GITHUB_STEP_SUMMARY");
var baselinePath = "tests/SharpCompress.Performance/baseline-results.md";
var resultsDir = "benchmark-results/results";
var output = new List<string> { "## Comparison with Baseline", "" };
if (!File.Exists(baselinePath))
{
Console.WriteLine("Baseline file not found");
output.Add("⚠️ Baseline file not found. Run `generate-baseline` to create it.");
WriteOutput(output, githubStepSummary);
return;
}
if (!Directory.Exists(resultsDir))
{
Console.WriteLine("No current benchmark results found.");
output.Add("⚠️ No current benchmark results found. Showing baseline only.");
output.Add("");
output.Add("### Baseline Results");
output.AddRange(File.ReadAllLines(baselinePath));
WriteOutput(output, githubStepSummary);
return;
}
var markdownFiles = Directory
.GetFiles(resultsDir, "*-report-github.md")
.OrderBy(f => f)
.ToList();
if (markdownFiles.Count == 0)
{
Console.WriteLine("No current benchmark markdown reports found.");
output.Add("⚠️ No current benchmark results found. Showing baseline only.");
output.Add("");
output.Add("### Baseline Results");
output.AddRange(File.ReadAllLines(baselinePath));
WriteOutput(output, githubStepSummary);
return;
}
Console.WriteLine("Parsing baseline results...");
var baselineMetrics = ParseBenchmarkResults(File.ReadAllText(baselinePath));
Console.WriteLine("Parsing current results...");
var currentText = string.Join("\n", markdownFiles.Select(f => File.ReadAllText(f)));
var currentMetrics = ParseBenchmarkResults(currentText);
Console.WriteLine("Comparing results...");
output.Add("### Performance Comparison");
output.Add("");
output.Add(
"| Benchmark | Baseline Mean | Current Mean | Change | Baseline Memory | Current Memory | Change |"
);
output.Add(
"|-----------|---------------|--------------|--------|-----------------|----------------|--------|"
);
var hasRegressions = false;
var hasImprovements = false;
foreach (var method in currentMetrics.Keys.Union(baselineMetrics.Keys).OrderBy(k => k))
{
var hasCurrent = currentMetrics.TryGetValue(method, out var current);
var hasBaseline = baselineMetrics.TryGetValue(method, out var baseline);
if (!hasCurrent)
{
output.Add(
$"| {method} | {baseline!.Mean} | ❌ Missing | N/A | {baseline.Memory} | N/A | N/A |"
);
continue;
}
if (!hasBaseline)
{
output.Add(
$"| {method} | ❌ New | {current!.Mean} | N/A | N/A | {current.Memory} | N/A |"
);
continue;
}
var timeChange = CalculateChange(baseline!.MeanValue, current!.MeanValue);
var memChange = CalculateChange(baseline.MemoryValue, current.MemoryValue);
var timeIcon =
timeChange > 25 ? "🔴"
: timeChange < -25 ? "🟢"
: "⚪";
var memIcon =
memChange > 25 ? "🔴"
: memChange < -25 ? "🟢"
: "⚪";
if (timeChange > 25 || memChange > 25)
hasRegressions = true;
if (timeChange < -25 || memChange < -25)
hasImprovements = true;
output.Add(
$"| {method} | {baseline.Mean} | {current.Mean} | {timeIcon} {timeChange:+0.0;-0.0;0}% | {baseline.Memory} | {current.Memory} | {memIcon} {memChange:+0.0;-0.0;0}% |"
);
}
output.Add("");
output.Add("**Legend:**");
output.Add("- 🔴 Regression (>25% slower/more memory)");
output.Add("- 🟢 Improvement (>25% faster/less memory)");
output.Add("- ⚪ No significant change");
if (hasRegressions)
{
output.Add("");
output.Add(
"⚠️ **Warning**: Performance regressions detected. Review the changes carefully."
);
}
else if (hasImprovements)
{
output.Add("");
output.Add("✅ Performance improvements detected!");
}
else
{
output.Add("");
output.Add("✅ Performance is stable compared to baseline.");
}
WriteOutput(output, githubStepSummary);
}
);
Target(
GenerateBaseline,
() =>
{
var perfProject = "tests/SharpCompress.Performance/SharpCompress.Performance.csproj";
var baselinePath = "tests/SharpCompress.Performance/baseline-results.md";
var artifactsDir = "baseline-artifacts";
Console.WriteLine("Building performance project...");
Run("dotnet", $"build {perfProject} --configuration Release");
Console.WriteLine("Running benchmarks to generate baseline...");
Run(
"dotnet",
$"run --project {perfProject} --configuration Release --no-build -- --filter \"*\" --exporters markdown --artifacts {artifactsDir}"
);
var resultsDir = Path.Combine(artifactsDir, "results");
if (!Directory.Exists(resultsDir))
{
Console.WriteLine("ERROR: No benchmark results generated.");
return;
}
var markdownFiles = Directory
.GetFiles(resultsDir, "*-report-github.md")
.OrderBy(f => f)
.ToList();
if (markdownFiles.Count == 0)
{
Console.WriteLine("ERROR: No markdown reports found.");
return;
}
Console.WriteLine($"Combining {markdownFiles.Count} benchmark reports...");
var baselineContent = new List<string>();
foreach (var file in markdownFiles)
{
var lines = File.ReadAllLines(file);
baselineContent.AddRange(lines.Select(l => l.Trim()).Where(l => l.StartsWith('|')));
}
File.WriteAllText(baselinePath, string.Join(Environment.NewLine, baselineContent));
Console.WriteLine($"Baseline written to {baselinePath}");
// Clean up artifacts directory
if (Directory.Exists(artifactsDir))
{
Directory.Delete(artifactsDir, true);
Console.WriteLine("Cleaned up artifacts directory.");
}
}
);
Target("default", [Publish], () => Console.WriteLine("Done!"));
await RunTargetsAndExitAsync(args);
@@ -476,7 +230,7 @@ static async Task<(string version, bool isPrerelease)> GetVersion()
}
else
{
// Not tagged - create prerelease version
// Not tagged - create prerelease version based on next minor version
var allTags = (await GetGitOutput("tag", "--list"))
.Split('\n', StringSplitOptions.RemoveEmptyEntries)
.Where(tag => Regex.IsMatch(tag.Trim(), @"^\d+\.\d+\.\d+$"))
@@ -486,22 +240,8 @@ static async Task<(string version, bool isPrerelease)> GetVersion()
var lastTag = allTags.OrderBy(tag => Version.Parse(tag)).LastOrDefault() ?? "0.0.0";
var lastVersion = Version.Parse(lastTag);
// Determine version increment based on branch
var currentBranch = await GetCurrentBranch();
Version nextVersion;
if (currentBranch == "release")
{
// Release branch: increment patch version
nextVersion = new Version(lastVersion.Major, lastVersion.Minor, lastVersion.Build + 1);
Console.WriteLine($"Building prerelease for release branch (patch increment)");
}
else
{
// Master or other branches: increment minor version
nextVersion = new Version(lastVersion.Major, lastVersion.Minor + 1, 0);
Console.WriteLine($"Building prerelease for {currentBranch} branch (minor increment)");
}
// Increment minor version for next release
var nextVersion = new Version(lastVersion.Major, lastVersion.Minor + 1, 0);
// Use commit count since the last version tag if available; otherwise, fall back to total count
var revListArgs = allTags.Any() ? $"--count {lastTag}..HEAD" : "--count HEAD";
@@ -513,28 +253,6 @@ static async Task<(string version, bool isPrerelease)> GetVersion()
}
}
static async Task<string> GetCurrentBranch()
{
// In GitHub Actions, GITHUB_REF_NAME contains the branch name
var githubRefName = Environment.GetEnvironmentVariable("GITHUB_REF_NAME");
if (!string.IsNullOrEmpty(githubRefName))
{
return githubRefName;
}
// Fallback to git command for local builds
try
{
var (output, _) = await ReadAsync("git", "branch --show-current");
return output.Trim();
}
catch (Exception ex)
{
Console.WriteLine($"Warning: Could not determine current branch: {ex.Message}");
return "unknown";
}
}
static async Task<string> GetGitOutput(string command, string args)
{
try
@@ -548,142 +266,3 @@ static async Task<string> GetGitOutput(string command, string args)
throw new Exception($"Git command failed: git {command} {args}\n{ex.Message}", ex);
}
}
static void WriteOutput(List<string> output, string? githubStepSummary)
{
if (!string.IsNullOrEmpty(githubStepSummary))
{
File.AppendAllLines(githubStepSummary, output);
Console.WriteLine("Comparison written to GitHub Step Summary");
}
else
{
foreach (var line in output)
{
Console.WriteLine(line);
}
}
}
static Dictionary<string, BenchmarkMetric> ParseBenchmarkResults(string markdown)
{
var metrics = new Dictionary<string, BenchmarkMetric>();
var lines = markdown.Split('\n');
for (int i = 0; i < lines.Length; i++)
{
var line = lines[i].Trim();
// Look for table rows with benchmark data
if (line.StartsWith("|") && line.Contains("&#39;") && i > 0)
{
var parts = line.Split('|', StringSplitOptions.TrimEntries);
if (parts.Length >= 5)
{
var method = parts[1].Replace("&#39;", "'");
var meanStr = parts[2];
// Find Allocated column - it's usually the last column or labeled "Allocated"
string memoryStr = "N/A";
for (int j = parts.Length - 2; j >= 2; j--)
{
if (
parts[j].Contains("KB")
|| parts[j].Contains("MB")
|| parts[j].Contains("GB")
|| parts[j].Contains("B")
)
{
memoryStr = parts[j];
break;
}
}
if (
!method.Equals("Method", StringComparison.OrdinalIgnoreCase)
&& !string.IsNullOrWhiteSpace(method)
)
{
var metric = new BenchmarkMetric
{
Method = method,
Mean = meanStr,
MeanValue = ParseTimeValue(meanStr),
Memory = memoryStr,
MemoryValue = ParseMemoryValue(memoryStr),
};
metrics[method] = metric;
}
}
}
}
return metrics;
}
static double ParseTimeValue(string timeStr)
{
if (string.IsNullOrWhiteSpace(timeStr) || timeStr == "N/A" || timeStr == "NA")
return 0;
// Remove thousands separators and parse
timeStr = timeStr.Replace(",", "").Trim();
var match = Regex.Match(timeStr, @"([\d.]+)\s*(\w+)");
if (!match.Success)
return 0;
var value = double.Parse(match.Groups[1].Value);
var unit = match.Groups[2].Value.ToLower();
// Convert to microseconds for comparison
return unit switch
{
"s" => value * 1_000_000,
"ms" => value * 1_000,
"μs" or "us" => value,
"ns" => value / 1_000,
_ => value,
};
}
static double ParseMemoryValue(string memStr)
{
if (string.IsNullOrWhiteSpace(memStr) || memStr == "N/A" || memStr == "NA")
return 0;
memStr = memStr.Replace(",", "").Trim();
var match = Regex.Match(memStr, @"([\d.]+)\s*(\w+)");
if (!match.Success)
return 0;
var value = double.Parse(match.Groups[1].Value);
var unit = match.Groups[2].Value.ToUpper();
// Convert to KB for comparison
return unit switch
{
"GB" => value * 1_024 * 1_024,
"MB" => value * 1_024,
"KB" => value,
"B" => value / 1_024,
_ => value,
};
}
static double CalculateChange(double baseline, double current)
{
if (baseline == 0)
return 0;
return ((current - baseline) / baseline) * 100;
}
record BenchmarkMetric
{
public string Method { get; init; } = "";
public string Mean { get; init; } = "";
public double MeanValue { get; init; }
public string Memory { get; init; } = "";
public double MemoryValue { get; init; }
}


@@ -25,12 +25,12 @@
},
"Microsoft.SourceLink.GitHub": {
"type": "Direct",
"requested": "[10.0.102, )",
"resolved": "10.0.102",
"contentHash": "Oxq3RCIJSdtpIU4hLqO7XaDe/Ra3HS9Wi8rJl838SAg6Zu1iQjerA0+xXWBgUFYbgknUGCLOU0T+lzMLkvY9Qg==",
"requested": "[8.0.0, )",
"resolved": "8.0.0",
"contentHash": "G5q7OqtwIyGTkeIOAc3u2ZuV/kicQaec5EaRnc0pIeSnh9LUjj+PYQrJYBURvDt7twGl2PKA7nSN0kz1Zw5bnQ==",
"dependencies": {
"Microsoft.Build.Tasks.Git": "10.0.102",
"Microsoft.SourceLink.Common": "10.0.102"
"Microsoft.Build.Tasks.Git": "8.0.0",
"Microsoft.SourceLink.Common": "8.0.0"
}
},
"Microsoft.VisualStudio.Threading.Analyzers": {
@@ -47,8 +47,8 @@
},
"Microsoft.Build.Tasks.Git": {
"type": "Transitive",
"resolved": "10.0.102",
"contentHash": "0i81LYX31U6UiXz4NOLbvc++u+/mVDmOt+PskrM/MygpDxkv9THKQyRUmavBpLK6iBV0abNWnn+CQgSRz//Pwg=="
"resolved": "8.0.0",
"contentHash": "bZKfSIKJRXLTuSzLudMFte/8CempWjVamNUR5eHJizsy+iuOuO/k2gnh7W0dHJmYY0tBf+gUErfluCv5mySAOQ=="
},
"Microsoft.NETFramework.ReferenceAssemblies.net461": {
"type": "Transitive",
@@ -57,8 +57,8 @@
},
"Microsoft.SourceLink.Common": {
"type": "Transitive",
"resolved": "10.0.102",
"contentHash": "Mk1IMb9q5tahC2NltxYXFkLBtuBvfBoCQ3pIxYQWfzbCE9o1OB9SsHe0hnNGo7lWgTA/ePbFAJLWu6nLL9K17A=="
"resolved": "8.0.0",
"contentHash": "dk9JPxTCIevS75HyEQ0E4OVAFhB2N+V9ShCXf8Q6FkUQZDkgLI12y679Nym1YqsiSysuQskT7Z+6nUf3yab6Vw=="
}
}
}


@@ -8,61 +8,52 @@ Quick reference for commonly used SharpCompress APIs.
```csharp
// Auto-detect format
using (var reader = ReaderFactory.OpenReader(stream))
using (var reader = ReaderFactory.Open(stream))
{
// Works with Zip, Tar, GZip, Rar, 7Zip, etc.
}
// Specific format - Archive API
using (var archive = ZipArchive.OpenArchive("file.zip"))
using (var archive = TarArchive.OpenArchive("file.tar"))
using (var archive = RarArchive.OpenArchive("file.rar"))
using (var archive = SevenZipArchive.OpenArchive("file.7z"))
using (var archive = GZipArchive.OpenArchive("file.gz"))
using (var archive = ZipArchive.Open("file.zip"))
using (var archive = TarArchive.Open("file.tar"))
using (var archive = RarArchive.Open("file.rar"))
using (var archive = SevenZipArchive.Open("file.7z"))
using (var archive = GZipArchive.Open("file.gz"))
// With fluent options (preferred)
var options = ReaderOptions.ForEncryptedArchive("password")
.WithArchiveEncoding(new ArchiveEncoding { Default = Encoding.GetEncoding(932) });
using (var archive = ZipArchive.OpenArchive("encrypted.zip", options))
// Alternative: object initializer
var options2 = new ReaderOptions
{
// With options
var options = new ReaderOptions
{
Password = "password",
LeaveStreamOpen = true,
ArchiveEncoding = new ArchiveEncoding { Default = Encoding.GetEncoding(932) }
};
using (var archive = ZipArchive.Open("encrypted.zip", options))
```
### Creating Archives
```csharp
// Writer Factory
using (var writer = WriterFactory.OpenWriter(stream, ArchiveType.Zip, CompressionType.Deflate))
using (var writer = WriterFactory.Open(stream, ArchiveType.Zip, CompressionType.Deflate))
{
// Write entries
}
// Specific writer
using (var archive = ZipArchive.CreateArchive())
using (var archive = TarArchive.CreateArchive())
using (var archive = GZipArchive.CreateArchive())
using (var archive = ZipArchive.Create())
using (var archive = TarArchive.Create())
using (var archive = GZipArchive.Create())
// With fluent options (preferred)
var options = WriterOptions.ForZip()
.WithCompressionLevel(9)
.WithLeaveStreamOpen(false);
using (var archive = ZipArchive.CreateArchive())
{
archive.SaveTo("output.zip", options);
}
// Alternative: constructor with object initializer
var options2 = new WriterOptions(CompressionType.Deflate)
{
// With options
var options = new WriterOptions(CompressionType.Deflate)
{
CompressionLevel = 9,
LeaveStreamOpen = false
};
using (var archive = ZipArchive.Create())
{
archive.SaveTo("output.zip", options);
}
```
---
@@ -72,21 +63,26 @@ var options2 = new WriterOptions(CompressionType.Deflate)
### Reading/Extracting
```csharp
using (var archive = ZipArchive.OpenArchive("file.zip"))
using (var archive = ZipArchive.Open("file.zip"))
{
// Get all entries
IEnumerable<IArchiveEntry> entries = archive.Entries;
IEnumerable<IEntry> entries = archive.Entries;
// Find specific entry
var entry = archive.Entries.FirstOrDefault(e => e.Key == "file.txt");
// Extract all
archive.WriteToDirectory(@"C:\output");
archive.WriteToDirectory(@"C:\output", new ExtractionOptions
{
ExtractFullPath = true,
Overwrite = true
});
// Extract single entry
var entry = archive.Entries.First();
entry.WriteToFile(@"C:\output\file.txt");
entry.WriteToFile(@"C:\output\file.txt", new ExtractionOptions { Overwrite = true });
// Get entry stream
using (var stream = entry.OpenEntryStream())
{
@@ -94,14 +90,8 @@ using (var archive = ZipArchive.OpenArchive("file.zip"))
}
}
// Async extraction (requires IAsyncArchive)
using (var asyncArchive = await ZipArchive.OpenAsyncArchive("file.zip"))
{
await asyncArchive.WriteToDirectoryAsync(
@"C:\output",
cancellationToken: cancellationToken
);
}
// Async variants
await archive.WriteToDirectoryAsync(@"C:\output", options, cancellationToken);
using (var stream = await entry.OpenEntryStreamAsync(cancellationToken))
{
// ...
@@ -125,18 +115,18 @@ foreach (var entry in archive.Entries)
### Creating Archives
```csharp
using (var archive = ZipArchive.CreateArchive())
using (var archive = ZipArchive.Create())
{
// Add file
archive.AddEntry("file.txt", @"C:\source\file.txt");
archive.AddEntry("file.txt", "C:\\source\\file.txt");
// Add multiple files
archive.AddAllFromDirectory(@"C:\source");
archive.AddAllFromDirectory(@"C:\source", "*.txt"); // Pattern
archive.AddAllFromDirectory("C:\\source");
archive.AddAllFromDirectory("C:\\source", "*.txt"); // Pattern
// Save to file
archive.SaveTo("output.zip", CompressionType.Deflate);
// Save to stream
archive.SaveTo(outputStream, new WriterOptions(CompressionType.Deflate)
{
@@ -154,18 +144,18 @@ using (var archive = ZipArchive.CreateArchive())
```csharp
using (var stream = File.OpenRead("file.zip"))
using (var reader = ReaderFactory.OpenReader(stream))
using (var reader = ReaderFactory.Open(stream))
{
while (reader.MoveToNextEntry())
{
IArchiveEntry entry = reader.Entry;
IEntry entry = reader.Entry;
if (!entry.IsDirectory)
{
// Extract entry
reader.WriteEntryToDirectory(@"C:\output");
reader.WriteEntryToFile(@"C:\output\file.txt");
// Or get stream
using (var entryStream = reader.OpenEntryStream())
{
@@ -175,24 +165,16 @@ using (var reader = ReaderFactory.OpenReader(stream))
}
}
// Async variants (use OpenAsyncReader to get IAsyncReader)
using (var stream = File.OpenRead("file.zip"))
using (var reader = await ReaderFactory.OpenAsyncReader(stream))
// Async variants
while (await reader.MoveToNextEntryAsync())
{
while (await reader.MoveToNextEntryAsync())
{
await reader.WriteEntryToFileAsync(
@"C:\output\" + reader.Entry.Key,
cancellationToken: cancellationToken
);
}
// Async extraction of all entries
await reader.WriteAllToDirectoryAsync(
@"C:\output",
cancellationToken
);
await reader.WriteEntryToFileAsync(@"C:\output\" + reader.Entry.Key, cancellationToken);
}
// Async extraction
await reader.WriteAllToDirectoryAsync(@"C:\output",
new ExtractionOptions { ExtractFullPath = true, Overwrite = true },
cancellationToken);
```
---
@@ -203,7 +185,7 @@ using (var reader = await ReaderFactory.OpenAsyncReader(stream))
```csharp
using (var stream = File.Create("output.zip"))
using (var writer = WriterFactory.OpenWriter(stream, ArchiveType.Zip, CompressionType.Deflate))
using (var writer = WriterFactory.Open(stream, ArchiveType.Zip, CompressionType.Deflate))
{
// Write single file
using (var fileStream = File.OpenRead("source.txt"))
@@ -231,91 +213,43 @@ using (var writer = WriterFactory.OpenWriter(stream, ArchiveType.Zip, Compressio
### ReaderOptions
Use factory presets and fluent helpers for common configurations:
```csharp
// External stream with password and custom encoding
var options = ReaderOptions.ForExternalStream()
.WithPassword("password")
.WithArchiveEncoding(new ArchiveEncoding { Default = Encoding.GetEncoding(932) });
using (var archive = ZipArchive.OpenArchive("file.zip", options))
{
// ...
}
// Common presets
var safeOptions = ReaderOptions.SafeExtract; // No overwrite
var flatOptions = ReaderOptions.FlatExtract; // No directory structure
// Factory defaults:
// - file path / FileInfo overloads use LeaveStreamOpen = false
// - stream overloads use LeaveStreamOpen = true
```
Alternative: traditional object initializer:
```csharp
var options = new ReaderOptions
{
Password = "password",
LeaveStreamOpen = true,
ArchiveEncoding = new ArchiveEncoding { Default = Encoding.GetEncoding(932) }
Password = "password", // For encrypted archives
LeaveStreamOpen = true, // Don't close wrapped stream
ArchiveEncoding = new ArchiveEncoding // Custom character encoding
{
Default = Encoding.GetEncoding(932)
}
};
using (var archive = ZipArchive.Open("file.zip", options))
{
// ...
}
```
### WriterOptions
Factory methods provide a clean, discoverable way to create writer options:
```csharp
// Factory methods for common archive types
var zipOptions = WriterOptions.ForZip() // ZIP with Deflate
.WithCompressionLevel(9) // 0-9 for Deflate
.WithLeaveStreamOpen(false); // Close stream when done
var tarOptions = WriterOptions.ForTar(CompressionType.GZip) // TAR with GZip
.WithLeaveStreamOpen(false);
var gzipOptions = WriterOptions.ForGZip() // GZip file
.WithCompressionLevel(6);
archive.SaveTo("output.zip", zipOptions);
```
Alternative: traditional constructor with object initializer:
```csharp
var options = new WriterOptions(CompressionType.Deflate)
{
CompressionLevel = 9,
LeaveStreamOpen = true,
CompressionLevel = 9, // 0-9 for Deflate
LeaveStreamOpen = true, // Don't close stream
};
archive.SaveTo("output.zip", options);
```
### Extraction behavior
### ExtractionOptions
```csharp
var options = new ReaderOptions
var options = new ExtractionOptions
{
ExtractFullPath = true, // Recreate directory structure
Overwrite = true, // Overwrite existing files
PreserveFileTime = true // Keep original timestamps
};
using (var archive = ZipArchive.OpenArchive("file.zip", options))
{
archive.WriteToDirectory(@"C:\output");
}
```
### Options matrix
```text
ReaderOptions: open-time behavior (password, encoding, stream ownership, extraction defaults)
WriterOptions: write-time behavior (compression type/level, encoding, stream ownership)
ZipWriterEntryOptions: per-entry ZIP overrides (compression, level, timestamps, comments, zip64)
archive.WriteToDirectory(@"C:\output", options);
```
---
@@ -328,20 +262,15 @@ ZipWriterEntryOptions: per-entry ZIP overrides (compression, level, timestamps,
// For creating archives
CompressionType.None // No compression (store)
CompressionType.Deflate // DEFLATE (default for ZIP/GZip)
CompressionType.Deflate64 // Deflate64
CompressionType.BZip2 // BZip2
CompressionType.LZMA // LZMA (for 7Zip, LZip, XZ)
CompressionType.PPMd // PPMd (for ZIP)
CompressionType.Rar // RAR compression (read-only)
CompressionType.ZStandard // ZStandard
ArchiveType.Arc
ArchiveType.Arj
ArchiveType.Ace
// For Tar archives with compression
// Use WriterFactory to create compressed tar archives
using (var writer = WriterFactory.OpenWriter(stream, ArchiveType.Tar, CompressionType.GZip)) // Tar.GZip
using (var writer = WriterFactory.OpenWriter(stream, ArchiveType.Tar, CompressionType.BZip2)) // Tar.BZip2
// For Tar archives
// Use CompressionType in TarWriter constructor
using (var writer = new TarWriter(stream, CompressionType.GZip)) // Tar.GZip
using (var writer = new TarWriter(stream, CompressionType.BZip2)) // Tar.BZip2
```
### Archive Types
@@ -367,9 +296,13 @@ ArchiveType.ZStandard
try
{
using (var archive = ZipArchive.Open("archive.zip",
ReaderOptions.ForEncryptedArchive("password")))
new ReaderOptions { Password = "password" }))
{
archive.WriteToDirectory(@"C:\output");
archive.WriteToDirectory(@"C:\output", new ExtractionOptions
{
ExtractFullPath = true,
Overwrite = true
});
}
}
catch (PasswordRequiredException)
@@ -394,8 +327,8 @@ var progress = new Progress<ProgressReport>(report =>
Console.WriteLine($"Extracting {report.EntryPath}: {report.PercentComplete}%");
});
var options = ReaderOptions.ForOwnedFile().WithProgress(progress);
using (var archive = ZipArchive.OpenArchive("archive.zip", options))
var options = new ReaderOptions { Progress = progress };
using (var archive = ZipArchive.Open("archive.zip", options))
{
archive.WriteToDirectory(@"C:\output");
}
@@ -409,12 +342,11 @@ cts.CancelAfter(TimeSpan.FromMinutes(5));
try
{
using (var archive = await ZipArchive.OpenAsyncArchive("archive.zip"))
using (var archive = ZipArchive.Open("archive.zip"))
{
await archive.WriteToDirectoryAsync(
@"C:\output",
cancellationToken: cts.Token
);
await archive.WriteToDirectoryAsync(@"C:\output",
new ExtractionOptions { ExtractFullPath = true, Overwrite = true },
cts.Token);
}
}
catch (OperationCanceledException)
@@ -426,23 +358,23 @@ catch (OperationCanceledException)
### Create with Custom Compression
```csharp
using (var archive = ZipArchive.CreateArchive())
using (var archive = ZipArchive.Create())
{
archive.AddAllFromDirectory(@"D:\source");
// Fastest
archive.SaveTo("fast.zip", new WriterOptions(CompressionType.Deflate)
{
CompressionLevel = 1
archive.SaveTo("fast.zip", new WriterOptions(CompressionType.Deflate)
{
CompressionLevel = 1
});
// Balanced (default)
archive.SaveTo("normal.zip", CompressionType.Deflate);
// Best compression
archive.SaveTo("best.zip", new WriterOptions(CompressionType.Deflate)
{
CompressionLevel = 9
archive.SaveTo("best.zip", new WriterOptions(CompressionType.Deflate)
{
CompressionLevel = 9
});
}
```
@@ -451,7 +383,7 @@ using (var archive = ZipArchive.CreateArchive())
```csharp
using (var outputStream = new MemoryStream())
using (var archive = ZipArchive.CreateArchive())
using (var archive = ZipArchive.Create())
{
// Add content from memory
using (var contentStream = new MemoryStream(Encoding.UTF8.GetBytes("Hello")))
@@ -470,7 +402,7 @@ using (var archive = ZipArchive.CreateArchive())
### Extract Specific Files
```csharp
using (var archive = ZipArchive.OpenArchive("archive.zip"))
using (var archive = ZipArchive.Open("archive.zip"))
{
var filesToExtract = new[] { "file1.txt", "file2.txt" };
@@ -484,7 +416,7 @@ using (var archive = ZipArchive.OpenArchive("archive.zip"))
### List Archive Contents
```csharp
using (var archive = ZipArchive.OpenArchive("archive.zip"))
using (var archive = ZipArchive.Open("archive.zip"))
{
foreach (var entry in archive.Entries)
{
@@ -504,7 +436,7 @@ using (var archive = ZipArchive.OpenArchive("archive.zip"))
```csharp
var stream = File.OpenRead("archive.zip");
var archive = ZipArchive.OpenArchive(stream);
var archive = ZipArchive.Open(stream);
archive.WriteToDirectory(@"C:\output");
// stream not disposed - leaked resource
```
@@ -513,7 +445,7 @@ archive.WriteToDirectory(@"C:\output");
```csharp
using (var stream = File.OpenRead("archive.zip"))
using (var archive = ZipArchive.OpenArchive(stream))
using (var archive = ZipArchive.Open(stream))
{
archive.WriteToDirectory(@"C:\output");
}
@@ -524,7 +456,7 @@ using (var archive = ZipArchive.OpenArchive(stream))
```csharp
// Loading entire archive then iterating
using (var archive = ZipArchive.OpenArchive("large.zip"))
using (var archive = ZipArchive.Open("large.zip"))
{
var entries = archive.Entries.ToList(); // Loads all in memory
foreach (var e in entries)
@@ -539,7 +471,7 @@ using (var archive = ZipArchive.OpenArchive("large.zip"))
```csharp
// Streaming iteration
using (var stream = File.OpenRead("large.zip"))
using (var reader = ReaderFactory.OpenReader(stream))
using (var reader = ReaderFactory.Open(stream))
{
while (reader.MoveToNextEntry())
{


@@ -76,7 +76,7 @@ Factory classes for auto-detecting archive format and creating appropriate reade
- Format-specific: `ZipFactory.cs`, `TarFactory.cs`, `RarFactory.cs`, etc.
**How It Works:**
1. `ReaderFactory.OpenReader(stream)` probes stream signatures
1. `ReaderFactory.Open(stream)` probes stream signatures
2. Identifies format by magic bytes
3. Creates appropriate reader instance
4. Returns generic `IReader` interface
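Step 2's magic-byte check can be illustrated with a simplified probe; the signatures shown are the well-known ones, but the dispatch code itself is hypothetical:
```csharp
// Hypothetical sketch of signature probing; assumes a seekable/rewindable
// stream, as the real factories ensure. Real probes also handle formats whose
// markers sit at an offset (e.g., Tar's "ustar" at byte 257).
static ArchiveType? Probe(Stream stream)
{
    var header = new byte[4];
    var read = stream.Read(header, 0, header.Length);
    stream.Position -= read; // rewind so the chosen reader sees the full stream

    if (read >= 4 && header[0] == 'P' && header[1] == 'K' && header[2] == 3 && header[3] == 4)
        return ArchiveType.Zip;
    if (read >= 2 && header[0] == 0x1F && header[1] == 0x8B)
        return ArchiveType.GZip;
    if (read >= 4 && header[0] == 'R' && header[1] == 'a' && header[2] == 'r' && header[3] == '!')
        return ArchiveType.Rar;
    return null; // fall through to other probes
}
```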
@@ -90,7 +90,7 @@ Common types, options, and enumerations used across formats.
- `ArchiveType.cs` - Enum for archive formats
- `CompressionType.cs` - Enum for compression methods
- `ArchiveEncoding.cs` - Character encoding configuration
- `IExtractionOptions.cs` - Extraction configuration exposed through `ReaderOptions`
- `ExtractionOptions.cs` - Extraction configuration
- Format-specific headers: `Zip/Headers/`, `Tar/Headers/`, `Rar/Headers/`, etc.
#### `Compressors/` - Compression Algorithms
@@ -142,7 +142,7 @@ Stream wrappers and utilities.
**Example:**
```csharp
// User calls factory
using (var reader = ReaderFactory.OpenReader(stream)) // Returns IReader
using (var reader = ReaderFactory.Open(stream)) // Returns IReader
{
while (reader.MoveToNextEntry())
{
@@ -175,7 +175,7 @@ CompressionType.LZMA // LZMA
CompressionType.PPMd // PPMd
// Writer uses strategy pattern
var archive = ZipArchive.CreateArchive();
var archive = ZipArchive.Create();
archive.SaveTo("output.zip", CompressionType.Deflate); // Use Deflate
archive.SaveTo("output.bz2", CompressionType.BZip2); // Use BZip2
```
@@ -215,13 +215,13 @@ using (var compressor = new DeflateStream(nonDisposingStream))
public abstract class AbstractArchive : IArchive
{
// Template methods
public virtual void WriteToDirectory(string destinationDirectory)
public virtual void WriteToDirectory(string destinationDirectory, ExtractionOptions options)
{
// Common extraction logic
foreach (var entry in Entries)
{
// Call subclass method
entry.WriteToFile(destinationPath);
entry.WriteToFile(destinationPath, options);
}
}
@@ -248,7 +248,7 @@ foreach (var entry in entries)
}
// Reader API - provides iterator
IReader reader = ReaderFactory.OpenReader(stream);
IReader reader = ReaderFactory.Open(stream);
while (reader.MoveToNextEntry())
{
// Forward-only iteration - one entry at a time
@@ -267,7 +267,8 @@ public interface IArchive : IDisposable
{
IEnumerable<IEntry> Entries { get; }
void WriteToDirectory(string destinationDirectory);
void WriteToDirectory(string destinationDirectory,
ExtractionOptions options = null);
IEntry FirstOrDefault(Func<IEntry, bool> predicate);
@@ -286,7 +287,8 @@ public interface IReader : IDisposable
bool MoveToNextEntry();
void WriteEntryToDirectory(string destinationDirectory);
void WriteEntryToDirectory(string destinationDirectory,
ExtractionOptions options = null);
Stream OpenEntryStream();
@@ -325,7 +327,7 @@ public interface IEntry
DateTime? LastModifiedTime { get; }
CompressionType CompressionType { get; }
void WriteToFile(string fullPath);
void WriteToFile(string fullPath, ExtractionOptions options = null);
void WriteToStream(Stream destinationStream);
Stream OpenEntryStream();
@@ -379,7 +381,7 @@ public class NewFormatArchive : AbstractArchive
private NewFormatHeader _header;
private List<NewFormatEntry> _entries;
public static NewFormatArchive OpenArchive(Stream stream)
public static NewFormatArchive Open(Stream stream)
{
var archive = new NewFormatArchive();
archive._header = NewFormatHeader.Read(stream);
@@ -440,8 +442,8 @@ public class NewFormatFactory : Factory, IArchiveFactory, IReaderFactory
public static NewFormatFactory Instance { get; } = new();
public IArchive CreateArchive(Stream stream)
=> NewFormatArchive.OpenArchive(stream);
public IArchive CreateArchive(Stream stream)
=> NewFormatArchive.Open(stream);
public IReader CreateReader(Stream stream, ReaderOptions options)
=> new NewFormatReader(stream) { Options = options };
@@ -479,7 +481,7 @@ public class NewFormatTests : TestBase
public void NewFormat_Extracts_Successfully()
{
var archivePath = Path.Combine(TEST_ARCHIVES_PATH, "archive.newformat");
using (var archive = NewFormatArchive.OpenArchive(archivePath))
using (var archive = NewFormatArchive.Open(archivePath))
{
archive.WriteToDirectory(SCRATCH_FILES_PATH);
// Assert extraction
@@ -559,7 +561,7 @@ public class CustomStream : Stream
```csharp
// Correct: Nested using blocks
using (var fileStream = File.OpenRead("archive.zip"))
using (var archive = ZipArchive.OpenArchive(fileStream))
using (var archive = ZipArchive.Open(fileStream))
{
archive.WriteToDirectory(@"C:\output");
}
@@ -568,7 +570,7 @@ using (var archive = ZipArchive.OpenArchive(fileStream))
// Correct: Using with options
var options = new ReaderOptions { LeaveStreamOpen = true };
var stream = File.OpenRead("archive.zip");
using (var archive = ZipArchive.OpenArchive(stream, options))
using (var archive = ZipArchive.Open(stream, options))
{
archive.WriteToDirectory(@"C:\output");
}
@@ -639,7 +641,7 @@ public void Archive_Extraction_Works()
var testArchive = Path.Combine(TEST_ARCHIVES_PATH, "test.zip");
// Act
using (var archive = ZipArchive.OpenArchive(testArchive))
using (var archive = ZipArchive.Open(testArchive))
{
archive.WriteToDirectory(SCRATCH_FILES_PATH);
}


@@ -18,23 +18,22 @@ Most archive formats store filenames and metadata as bytes. SharpCompress must c
using SharpCompress.Common;
using SharpCompress.Readers;
// Configure encoding using fluent factory method (preferred)
var options = ReaderOptions.ForEncoding(
new ArchiveEncoding { Default = Encoding.GetEncoding(932) }); // cp932 for Japanese
// Configure encoding before opening archive
var options = new ReaderOptions
{
ArchiveEncoding = new ArchiveEncoding
{
Default = Encoding.GetEncoding(932) // cp932 for Japanese
}
};
using (var archive = ZipArchive.OpenArchive("japanese.zip", options))
using (var archive = ZipArchive.Open("japanese.zip", options))
{
foreach (var entry in archive.Entries)
{
Console.WriteLine(entry.Key); // Now shows correct characters
}
}
// Alternative: object initializer
var options2 = new ReaderOptions
{
ArchiveEncoding = new ArchiveEncoding { Default = Encoding.GetEncoding(932) }
};
```
### ArchiveEncoding Properties
@@ -48,9 +47,11 @@ var options2 = new ReaderOptions
**Archive API:**
```csharp
var options = ReaderOptions.ForEncoding(
new ArchiveEncoding { Default = Encoding.GetEncoding(932) });
using (var archive = ZipArchive.OpenArchive("file.zip", options))
var options = new ReaderOptions
{
ArchiveEncoding = new ArchiveEncoding { Default = Encoding.GetEncoding(932) }
};
using (var archive = ZipArchive.Open("file.zip", options))
{
// Use archive with correct encoding
}
@@ -58,10 +59,12 @@ using (var archive = ZipArchive.OpenArchive("file.zip", options))
**Reader API:**
```csharp
var options = ReaderOptions.ForEncoding(
new ArchiveEncoding { Default = Encoding.GetEncoding(932) });
var options = new ReaderOptions
{
ArchiveEncoding = new ArchiveEncoding { Default = Encoding.GetEncoding(932) }
};
using (var stream = File.OpenRead("file.zip"))
using (var reader = ReaderFactory.OpenReader(stream, options))
using (var reader = ReaderFactory.Open(stream, options))
{
while (reader.MoveToNextEntry())
{
@@ -86,7 +89,7 @@ var options = new ReaderOptions
Default = Encoding.GetEncoding(932)
}
};
using (var archive = ZipArchive.OpenArchive("japanese.zip", options))
using (var archive = ZipArchive.Open("japanese.zip", options))
{
// Correctly decodes Japanese filenames
}
@@ -263,7 +266,7 @@ SharpCompress attempts to auto-detect encoding, but this isn't always reliable:
```csharp
// Auto-detection (default)
using (var archive = ZipArchive.OpenArchive("file.zip")) // Uses UTF8 by default
using (var archive = ZipArchive.Open("file.zip")) // Uses UTF8 by default
{
// May show corrupted characters if archive uses different encoding
}
@@ -273,7 +276,7 @@ var options = new ReaderOptions
{
ArchiveEncoding = new ArchiveEncoding { Default = Encoding.GetEncoding(932) }
};
using (var archive = ZipArchive.OpenArchive("file.zip", options))
using (var archive = ZipArchive.Open("file.zip", options))
{
// Correct characters displayed
}
@@ -321,7 +324,7 @@ var options = new ReaderOptions
}
};
using (var archive = ZipArchive.OpenArchive("mixed.zip", options))
using (var archive = ZipArchive.Open("mixed.zip", options))
{
foreach (var entry in archive.Entries)
{
@@ -385,9 +388,13 @@ var options = new ReaderOptions
}
};
using (var archive = ZipArchive.OpenArchive("japanese_files.zip", options))
using (var archive = ZipArchive.Open("japanese_files.zip", options))
{
archive.WriteToDirectory(@"C:\output");
archive.WriteToDirectory(@"C:\output", new ExtractionOptions
{
ExtractFullPath = true,
Overwrite = true
});
}
// Files extracted with correct Japanese names
```
@@ -403,7 +410,7 @@ var options = new ReaderOptions
}
};
using (var archive = ZipArchive.OpenArchive("french_files.zip", options))
using (var archive = ZipArchive.Open("french_files.zip", options))
{
archive.WriteToDirectory(@"C:\output");
}
@@ -421,7 +428,7 @@ var options = new ReaderOptions
}
};
using (var archive = ZipArchive.OpenArchive("chinese_files.zip", options))
using (var archive = ZipArchive.Open("chinese_files.zip", options))
{
archive.WriteToDirectory(@"C:\output");
}
@@ -438,7 +445,7 @@ var options = new ReaderOptions
}
};
using (var archive = ZipArchive.OpenArchive("russian_files.zip", options))
using (var archive = ZipArchive.Open("russian_files.zip", options))
{
archive.WriteToDirectory(@"C:\output");
}
@@ -456,7 +463,7 @@ var options = new ReaderOptions
};
using (var stream = File.OpenRead("japanese.zip"))
using (var reader = ReaderFactory.OpenReader(stream, options))
using (var reader = ReaderFactory.Open(stream, options))
{
while (reader.MoveToNextEntry())
{
@@ -477,7 +484,7 @@ When creating archives, SharpCompress uses UTF8 by default (recommended):
```csharp
// Create with UTF8 (default, recommended)
using (var archive = ZipArchive.CreateArchive())
using (var archive = ZipArchive.Create())
{
archive.AddAllFromDirectory(@"D:\my_files");
archive.SaveTo("output.zip", CompressionType.Deflate);


@@ -24,7 +24,7 @@ Choose the right API based on your use case:
// - You need random access to entries
// - Stream is seekable (file, MemoryStream)
using (var archive = ZipArchive.OpenArchive("archive.zip"))
using (var archive = ZipArchive.Open("archive.zip"))
{
// Random access - all entries available
var specific = archive.Entries.FirstOrDefault(e => e.Key == "file.txt");
@@ -51,7 +51,7 @@ using (var archive = ZipArchive.OpenArchive("archive.zip"))
// - Forward-only processing is acceptable
using (var stream = File.OpenRead("large.zip"))
using (var reader = ReaderFactory.OpenReader(stream))
using (var reader = ReaderFactory.Open(stream))
{
while (reader.MoveToNextEntry())
{
@@ -129,7 +129,7 @@ For processing archives from downloads or pipes:
```csharp
// Download stream (non-seekable)
using (var httpStream = await httpClient.GetStreamAsync(url))
using (var reader = ReaderFactory.OpenReader(httpStream))
using (var reader = ReaderFactory.Open(httpStream))
{
// Process entries as they arrive
while (reader.MoveToNextEntry())
@@ -159,14 +159,14 @@ Choose based on your constraints:
```csharp
// Download then extract (requires disk space)
var archivePath = await DownloadFile(url, @"C:\temp\archive.zip");
using (var archive = ZipArchive.OpenArchive(archivePath))
using (var archive = ZipArchive.Open(archivePath))
{
archive.WriteToDirectory(@"C:\output");
}
// Stream during download (on-the-fly extraction)
using (var httpStream = await httpClient.GetStreamAsync(url))
using (var reader = ReaderFactory.OpenReader(httpStream))
using (var reader = ReaderFactory.Open(httpStream))
{
while (reader.MoveToNextEntry())
{
@@ -198,7 +198,7 @@ Extracting File3 requires decompressing File1 and File2 first.
**Random Extraction (Slow):**
```csharp
using (var archive = RarArchive.OpenArchive("solid.rar"))
using (var archive = RarArchive.Open("solid.rar"))
{
foreach (var entry in archive.Entries)
{
@@ -210,10 +210,14 @@ using (var archive = RarArchive.OpenArchive("solid.rar"))
**Sequential Extraction (Fast):**
```csharp
using (var archive = RarArchive.OpenArchive("solid.rar"))
using (var archive = RarArchive.Open("solid.rar"))
{
// Method 1: Use WriteToDirectory (recommended)
archive.WriteToDirectory(@"C:\output");
archive.WriteToDirectory(@"C:\output", new ExtractionOptions
{
ExtractFullPath = true,
Overwrite = true
});
// Method 2: Use ExtractAllEntries
archive.ExtractAllEntries();
@@ -252,7 +256,7 @@ using (var archive = RarArchive.OpenArchive("solid.rar"))
// Level 9 = Slowest, best compression
// Write with different compression levels
using (var archive = ZipArchive.CreateArchive())
using (var archive = ZipArchive.Create())
{
archive.AddAllFromDirectory(@"D:\data");
@@ -289,7 +293,7 @@ using (var archive = ZipArchive.CreateArchive())
// Smaller block size = lower memory, faster
// Larger block size = better compression, slower
using (var archive = TarArchive.CreateArchive())
using (var archive = TarArchive.Create())
{
archive.AddAllFromDirectory(@"D:\data");
@@ -309,7 +313,7 @@ LZMA compression is very powerful but memory-intensive:
// - Better compression: larger dictionary
// Preset via CompressionType
using (var archive = TarArchive.CreateArchive())
using (var archive = TarArchive.Create())
{
archive.AddAllFromDirectory(@"D:\data");
archive.SaveTo("archive.tar.xz", CompressionType.LZMA); // Default settings
@@ -329,10 +333,11 @@ Async is beneficial when:
```csharp
// Async extraction (non-blocking)
using (var archive = ZipArchive.OpenArchive("archive.zip"))
using (var archive = ZipArchive.Open("archive.zip"))
{
await archive.WriteToDirectoryAsync(
@"C:\output",
new ExtractionOptions { ExtractFullPath = true, Overwrite = true },
cancellationToken
);
}
@@ -348,9 +353,12 @@ Async doesn't improve performance for:
```csharp
// Sync extraction (simpler, same performance on fast I/O)
using (var archive = ZipArchive.OpenArchive("archive.zip"))
using (var archive = ZipArchive.Open("archive.zip"))
{
archive.WriteToDirectory(@"C:\output");
archive.WriteToDirectory(
@"C:\output",
new ExtractionOptions { ExtractFullPath = true, Overwrite = true }
);
}
// Simple and fast - no async needed
```
@@ -365,10 +373,11 @@ cts.CancelAfter(TimeSpan.FromMinutes(5));
try
{
using (var archive = ZipArchive.OpenArchive("archive.zip"))
using (var archive = ZipArchive.Open("archive.zip"))
{
await archive.WriteToDirectoryAsync(
@"C:\output",
new ExtractionOptions { ExtractFullPath = true, Overwrite = true },
cts.Token
);
}
@@ -399,14 +408,14 @@ catch (OperationCanceledException)
// ✗ Slow - opens each archive separately
foreach (var file in files)
{
using (var archive = ZipArchive.OpenArchive("archive.zip"))
using (var archive = ZipArchive.Open("archive.zip"))
{
archive.WriteToDirectory(@"C:\output");
}
}
// ✓ Better - process multiple entries at once
using (var archive = ZipArchive.OpenArchive("archive.zip"))
using (var archive = ZipArchive.Open("archive.zip"))
{
archive.WriteToDirectory(@"C:\output");
}
@@ -416,7 +425,7 @@ using (var archive = ZipArchive.OpenArchive("archive.zip"))
```csharp
var sw = Stopwatch.StartNew();
using (var archive = ZipArchive.OpenArchive("large.zip"))
using (var archive = ZipArchive.Open("large.zip"))
{
archive.WriteToDirectory(@"C:\output");
}

View File

@@ -6,7 +6,7 @@ SharpCompress now provides full async/await support for all I/O operations. All
**Key Async Methods:**
- `reader.WriteEntryToAsync(stream, cancellationToken)` - Extract entry asynchronously
- `reader.WriteAllToDirectoryAsync(path, cancellationToken)` - Extract all asynchronously
- `reader.WriteAllToDirectoryAsync(path, options, cancellationToken)` - Extract all asynchronously
- `writer.WriteAsync(filename, stream, modTime, cancellationToken)` - Write entry asynchronously
- `writer.WriteAllAsync(directory, pattern, searchOption, cancellationToken)` - Write directory asynchronously
- `entry.OpenEntryStreamAsync(cancellationToken)` - Open entry stream asynchronously
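For quick orientation before the full samples later in this document, a minimal sketch combining these methods (the archive name, output path, and token source are illustrative):

```C#
using System.IO;
using System.Threading;
using SharpCompress.Common;
using SharpCompress.Readers;

var cts = new CancellationTokenSource();
using (Stream stream = File.OpenRead("archive.zip"))
using (var reader = ReaderFactory.Open(stream))
{
    // Extract everything asynchronously, honoring the cancellation token.
    await reader.WriteAllToDirectoryAsync(
        @"C:\output",
        new ExtractionOptions { ExtractFullPath = true, Overwrite = true },
        cts.Token
    );
}
```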
@@ -40,10 +40,6 @@ To deal with the "correct" rules as well as the expectations of users, I've deci
To be explicit though, consider always using the overloads that use `ReaderOptions` or `WriterOptions` and explicitly set `LeaveStreamOpen` the way you want.
Default behavior in factory APIs:
- File path / `FileInfo` overloads set `LeaveStreamOpen = false`.
- Caller-provided `Stream` overloads set `LeaveStreamOpen = true`.
If you use the compression stream classes directly and don't want the wrapped stream to be closed, use `NonDisposingStream` as a wrapper to prevent the inner stream from being disposed. The change in 0.21 simplified a lot, even though the usage is a bit more convoluted.
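A hedged sketch of that wrapper pattern, assuming `NonDisposingStream` (in `SharpCompress.IO`) wraps an existing `Stream` via its constructor; the file name is illustrative:

```C#
using System.IO;
using SharpCompress.Compressors;
using SharpCompress.Compressors.Deflate;
using SharpCompress.IO;

using (var fileStream = File.Create("data.gz"))
{
    using (var gzip = new GZipStream(new NonDisposingStream(fileStream), CompressionMode.Compress))
    {
        gzip.Write(new byte[] { 1, 2, 3 }, 0, 3); // payload bytes go through gzip
    }

    // Disposing gzip closed only the wrapper; fileStream remains open and usable here.
    fileStream.Flush();
}
```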
## Samples
@@ -52,7 +48,7 @@ Also, look over the tests for more thorough [examples](https://github.com/adamha
### Create Zip Archive from multiple files
```C#
using(var archive = ZipArchive.CreateArchive())
using(var archive = ZipArchive.Create())
{
archive.AddEntry("file01.txt", "C:\\file01.txt");
archive.AddEntry("file02.txt", "C:\\file02.txt");
@@ -65,7 +61,7 @@ using(var archive = ZipArchive.CreateArchive())
### Create Zip Archive from all files in a directory to a file
```C#
using (var archive = ZipArchive.CreateArchive())
using (var archive = ZipArchive.Create())
{
archive.AddAllFromDirectory("D:\\temp");
archive.SaveTo("C:\\temp.zip", CompressionType.Deflate);
@@ -76,7 +72,7 @@ using (var archive = ZipArchive.CreateArchive())
```C#
var memoryStream = new MemoryStream();
using (var archive = ZipArchive.CreateArchive())
using (var archive = ZipArchive.Create())
{
archive.AddAllFromDirectory("D:\\temp");
archive.SaveTo(memoryStream, new WriterOptions(CompressionType.Deflate)
@@ -94,21 +90,21 @@ Note: Extracting a solid rar or 7z file needs to be done in sequential order to
`ExtractAllEntries` is primarily intended for solid archives (like solid Rar) or 7Zip archives, where sequential extraction provides the best performance. For general/simple extraction with any supported archive type, use `archive.WriteToDirectory()` instead.
```C#
// Using fluent factory method for extraction options
using (var archive = RarArchive.OpenArchive("Test.rar",
ReaderOptions.ForOwnedFile()
.WithExtractFullPath(true)
.WithOverwrite(true)))
using (var archive = RarArchive.Open("Test.rar"))
{
// Simple extraction with RarArchive; this WriteToDirectory pattern works for all archive types
archive.WriteToDirectory(@"D:\temp");
archive.WriteToDirectory(@"D:\temp", new ExtractionOptions()
{
ExtractFullPath = true,
Overwrite = true
});
}
```
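For the solid-archive case specifically, a sketch of the sequential `ExtractAllEntries` reader loop (the archive must be solid or 7Zip, per the note above; the paths are illustrative):

```C#
using SharpCompress.Archives.Rar;
using SharpCompress.Common;

using (var archive = RarArchive.Open("solid.rar"))
using (var reader = archive.ExtractAllEntries())
{
    while (reader.MoveToNextEntry())
    {
        if (!reader.Entry.IsDirectory)
        {
            // Sequential order means each entry is decompressed exactly once.
            reader.WriteEntryToDirectory(@"D:\temp", new ExtractionOptions
            {
                ExtractFullPath = true,
                Overwrite = true
            });
        }
    }
}
```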
### Iterate over all files from a Rar file using RarArchive
```C#
using (var archive = RarArchive.OpenArchive("Test.rar"))
using (var archive = RarArchive.Open("Test.rar"))
{
foreach (var entry in archive.Entries.Where(entry => !entry.IsDirectory))
{
@@ -130,13 +126,13 @@ var progress = new Progress<ProgressReport>(report =>
Console.WriteLine($"Extracting {report.EntryPath}: {report.PercentComplete}%");
});
using (var archive = RarArchive.OpenArchive("archive.rar",
ReaderOptions.ForOwnedFile()
.WithProgress(progress)
.WithExtractFullPath(true)
.WithOverwrite(true))) // Must be solid Rar or 7Zip
using (var archive = RarArchive.Open("archive.rar", new ReaderOptions { Progress = progress })) // Must be solid Rar or 7Zip
{
archive.WriteToDirectory(@"D:\output");
archive.WriteToDirectory(@"D:\output", new ExtractionOptions()
{
ExtractFullPath = true,
Overwrite = true
});
}
```
@@ -144,14 +140,18 @@ using (var archive = RarArchive.OpenArchive("archive.rar",
```C#
using (Stream stream = File.OpenRead("Tar.tar.bz2"))
using (var reader = ReaderFactory.OpenReader(stream))
using (var reader = ReaderFactory.Open(stream))
{
while (reader.MoveToNextEntry())
{
if (!reader.Entry.IsDirectory)
{
Console.WriteLine(reader.Entry.Key);
reader.WriteEntryToDirectory(@"C:\temp");
reader.WriteEntryToDirectory(@"C:\temp", new ExtractionOptions()
{
ExtractFullPath = true,
Overwrite = true
});
}
}
}
@@ -161,7 +161,7 @@ using (var reader = ReaderFactory.OpenReader(stream))
```C#
using (Stream stream = File.OpenRead("Tar.tar.bz2"))
using (var reader = ReaderFactory.OpenReader(stream))
using (var reader = ReaderFactory.Open(stream))
{
while (reader.MoveToNextEntry())
{
@@ -180,10 +180,10 @@ using (var reader = ReaderFactory.OpenReader(stream))
```C#
using (Stream stream = File.OpenWrite("C:\\temp.tgz"))
using (var writer = WriterFactory.OpenWriter(
stream,
ArchiveType.Tar,
WriterOptions.ForTar(CompressionType.GZip).WithLeaveStreamOpen(true)))
using (var writer = WriterFactory.Open(stream, ArchiveType.Tar, new WriterOptions(CompressionType.GZip)
{
LeaveStreamOpen = true
}))
{
writer.WriteAll("D:\\temp", "*", SearchOption.AllDirectories);
}
@@ -192,15 +192,15 @@ using (var writer = WriterFactory.OpenWriter(
### Extract a zip with non-UTF-8 encoded filenames (cp932)
```C#
var opts = new SharpCompress.Readers.ReaderOptions();
var encoding = Encoding.GetEncoding(932);
var opts = new ReaderOptions()
.WithArchiveEncoding(new ArchiveEncoding
{
CustomDecoder = (data, x, y) => encoding.GetString(data)
});
using var archive = ZipArchive.OpenArchive("test.zip", opts);
foreach(var entry in archive.Entries)
opts.ArchiveEncoding = new SharpCompress.Common.ArchiveEncoding();
opts.ArchiveEncoding.CustomDecoder = (data, x, y) =>
{
return encoding.GetString(data);
};
var tr = SharpCompress.Archives.Zip.ZipArchive.Open("test.zip", opts);
foreach(var entry in tr.Entries)
{
Console.WriteLine($"{entry.Key}");
}
@@ -213,7 +213,7 @@ foreach(var entry in archive.Entries)
**Extract single entry asynchronously:**
```C#
using (Stream stream = File.OpenRead("archive.zip"))
using (var reader = ReaderFactory.OpenReader(stream))
using (var reader = ReaderFactory.Open(stream))
{
while (reader.MoveToNextEntry())
{
@@ -234,10 +234,15 @@ using (var reader = ReaderFactory.OpenReader(stream))
**Extract all entries asynchronously:**
```C#
using (Stream stream = File.OpenRead("archive.tar.gz"))
using (var reader = ReaderFactory.OpenReader(stream))
using (var reader = ReaderFactory.Open(stream))
{
await reader.WriteAllToDirectoryAsync(
@"D:\temp",
new ExtractionOptions()
{
ExtractFullPath = true,
Overwrite = true
},
cancellationToken
);
}
@@ -245,7 +250,7 @@ using (var reader = ReaderFactory.OpenReader(stream))
**Open and process entry stream asynchronously:**
```C#
using (var archive = ZipArchive.OpenArchive("archive.zip"))
using (var archive = ZipArchive.Open("archive.zip"))
{
foreach (var entry in archive.Entries.Where(e => !e.IsDirectory))
{
@@ -263,7 +268,7 @@ using (var archive = ZipArchive.OpenArchive("archive.zip"))
**Write single file asynchronously:**
```C#
using (Stream archiveStream = File.OpenWrite("output.zip"))
using (var writer = WriterFactory.OpenWriter(archiveStream, ArchiveType.Zip, CompressionType.Deflate))
using (var writer = WriterFactory.Open(archiveStream, ArchiveType.Zip, CompressionType.Deflate))
{
using (Stream fileStream = File.OpenRead("input.txt"))
{
@@ -275,7 +280,7 @@ using (var writer = WriterFactory.OpenWriter(archiveStream, ArchiveType.Zip, Com
**Write entire directory asynchronously:**
```C#
using (Stream stream = File.OpenWrite("backup.tar.gz"))
using (var writer = WriterFactory.OpenWriter(stream, ArchiveType.Tar, new WriterOptions(CompressionType.GZip)))
using (var writer = WriterFactory.Open(stream, ArchiveType.Tar, new WriterOptions(CompressionType.GZip)))
{
await writer.WriteAllAsync(
@"D:\files",
@@ -294,7 +299,7 @@ var cts = new CancellationTokenSource();
cts.CancelAfter(TimeSpan.FromMinutes(5));
using (Stream stream = File.OpenWrite("archive.zip"))
using (var writer = WriterFactory.OpenWriter(stream, ArchiveType.Zip, CompressionType.Deflate))
using (var writer = WriterFactory.Open(stream, ArchiveType.Zip, CompressionType.Deflate))
{
try
{
@@ -311,11 +316,12 @@ using (var writer = WriterFactory.OpenWriter(stream, ArchiveType.Zip, Compressio
**Extract from archive asynchronously:**
```C#
using (var archive = ZipArchive.OpenArchive("archive.zip"))
using (var archive = ZipArchive.Open("archive.zip"))
{
// Simple async extraction - works for all archive types
await archive.WriteToDirectoryAsync(
@"C:\output",
new ExtractionOptions() { ExtractFullPath = true, Overwrite = true },
cancellationToken
);
}

View File

@@ -1,7 +1,7 @@
// Copyright (c) Six Labors.
// Licensed under the Apache License, Version 2.0.
#if !LEGACY_DOTNET
#if !NETSTANDARD2_0 && !NETSTANDARD2_1 && !NETFRAMEWORK
#define SUPPORTS_RUNTIME_INTRINSICS
#define SUPPORTS_HOTPATH
#endif

View File

@@ -1,103 +0,0 @@
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Readers;
namespace SharpCompress.Archives;
public abstract partial class AbstractArchive<TEntry, TVolume>
where TEntry : IArchiveEntry
where TVolume : IVolume
{
#region Async Support
// Async properties
public virtual IAsyncEnumerable<TEntry> EntriesAsync => _lazyEntriesAsync;
public IAsyncEnumerable<TVolume> VolumesAsync => _lazyVolumesAsync;
protected virtual async IAsyncEnumerable<TEntry> LoadEntriesAsync(
IAsyncEnumerable<TVolume> volumes
)
{
foreach (var item in LoadEntries(await volumes.ToListAsync()))
{
yield return item;
}
}
public virtual async ValueTask DisposeAsync()
{
if (!_disposed)
{
await foreach (var v in _lazyVolumesAsync)
{
v.Dispose();
}
foreach (var v in _lazyEntriesAsync.GetLoaded().Cast<Entry>())
{
v.Close();
}
_sourceStream?.Dispose();
_disposed = true;
}
}
private async ValueTask EnsureEntriesLoadedAsync()
{
await _lazyEntriesAsync.EnsureFullyLoaded();
await _lazyVolumesAsync.EnsureFullyLoaded();
}
private async IAsyncEnumerable<IArchiveEntry> EntriesAsyncCast()
{
await foreach (var entry in EntriesAsync)
{
yield return entry;
}
}
IAsyncEnumerable<IArchiveEntry> IAsyncArchive.EntriesAsync => EntriesAsyncCast();
IAsyncEnumerable<IVolume> IAsyncArchive.VolumesAsync => VolumesAsyncCast();
private async IAsyncEnumerable<IVolume> VolumesAsyncCast()
{
await foreach (var volume in _lazyVolumesAsync)
{
yield return volume;
}
}
public async ValueTask<IAsyncReader> ExtractAllEntriesAsync()
{
if (!await IsSolidAsync() && Type != ArchiveType.SevenZip)
{
throw new SharpCompressException(
"ExtractAllEntries can only be used on solid archives or 7Zip archives (which require random access)."
);
}
await EnsureEntriesLoadedAsync();
return await CreateReaderForSolidExtractionAsync();
}
public virtual ValueTask<bool> IsSolidAsync() => new(false);
public async ValueTask<bool> IsCompleteAsync()
{
await EnsureEntriesLoadedAsync();
return await EntriesAsync.AllAsync(x => x.IsComplete);
}
public async ValueTask<long> TotalSizeAsync() =>
await EntriesAsync.AggregateAsync(0L, (total, cf) => total + cf.CompressedSize);
public async ValueTask<long> TotalUncompressedSizeAsync() =>
await EntriesAsync.AggregateAsync(0L, (total, cf) => total + cf.Size);
public ValueTask<bool> IsEncryptedAsync() => new(IsEncrypted);
#endregion
}

View File

@@ -7,7 +7,7 @@ using SharpCompress.Readers;
namespace SharpCompress.Archives;
public abstract partial class AbstractArchive<TEntry, TVolume> : IArchive, IAsyncArchive
public abstract class AbstractArchive<TEntry, TVolume> : IArchive, IAsyncArchive
where TEntry : IArchiveEntry
where TVolume : IVolume
{
@@ -16,11 +16,7 @@ public abstract partial class AbstractArchive<TEntry, TVolume> : IArchive, IAsyn
private bool _disposed;
private readonly SourceStream? _sourceStream;
// Async fields - kept in original file per refactoring rules
private readonly LazyAsyncReadOnlyCollection<TVolume> _lazyVolumesAsync;
private readonly LazyAsyncReadOnlyCollection<TEntry> _lazyEntriesAsync;
public ReaderOptions ReaderOptions { get; protected set; }
protected ReaderOptions ReaderOptions { get; }
internal AbstractArchive(ArchiveType type, SourceStream sourceStream)
{
@@ -72,7 +68,7 @@ public abstract partial class AbstractArchive<TEntry, TVolume> : IArchive, IAsyn
/// <summary>
/// The total size of the files as uncompressed in the archive.
/// </summary>
public virtual long TotalUncompressedSize =>
public virtual long TotalUncompressSize =>
Entries.Aggregate(0L, (total, cf) => total + cf.Size);
protected abstract IEnumerable<TVolume> LoadVolumes(SourceStream sourceStream);
@@ -81,6 +77,16 @@ public abstract partial class AbstractArchive<TEntry, TVolume> : IArchive, IAsyn
protected virtual IAsyncEnumerable<TVolume> LoadVolumesAsync(SourceStream sourceStream) =>
LoadVolumes(sourceStream).ToAsyncEnumerable();
protected virtual async IAsyncEnumerable<TEntry> LoadEntriesAsync(
IAsyncEnumerable<TVolume> volumes
)
{
foreach (var item in LoadEntries(await volumes.ToListAsync()))
{
yield return item;
}
}
IEnumerable<IArchiveEntry> IArchive.Entries => Entries.Cast<IArchiveEntry>();
IEnumerable<IVolume> IArchive.Volumes => _lazyVolumes.Cast<IVolume>();
@@ -150,4 +156,67 @@ public abstract partial class AbstractArchive<TEntry, TVolume> : IArchive, IAsyn
return Entries.All(x => x.IsComplete);
}
}
#region Async Support
private readonly LazyAsyncReadOnlyCollection<TVolume> _lazyVolumesAsync;
private readonly LazyAsyncReadOnlyCollection<TEntry> _lazyEntriesAsync;
public virtual async ValueTask DisposeAsync()
{
if (!_disposed)
{
await foreach (var v in _lazyVolumesAsync)
{
v.Dispose();
}
foreach (var v in _lazyEntriesAsync.GetLoaded().Cast<Entry>())
{
v.Close();
}
_sourceStream?.Dispose();
_disposed = true;
}
}
private async ValueTask EnsureEntriesLoadedAsync()
{
await _lazyEntriesAsync.EnsureFullyLoaded();
await _lazyVolumesAsync.EnsureFullyLoaded();
}
public virtual IAsyncEnumerable<TEntry> EntriesAsync => _lazyEntriesAsync;
IAsyncEnumerable<IArchiveEntry> IAsyncArchive.EntriesAsync =>
EntriesAsync.Cast<TEntry, IArchiveEntry>();
public IAsyncEnumerable<IVolume> VolumesAsync => _lazyVolumesAsync.Cast<TVolume, IVolume>();
public async ValueTask<IAsyncReader> ExtractAllEntriesAsync()
{
if (!IsSolid && Type != ArchiveType.SevenZip)
{
throw new SharpCompressException(
"ExtractAllEntries can only be used on solid archives or 7Zip archives (which require random access)."
);
}
await EnsureEntriesLoadedAsync();
return await CreateReaderForSolidExtractionAsync();
}
public virtual ValueTask<bool> IsSolidAsync() => new(false);
public async ValueTask<bool> IsCompleteAsync()
{
await EnsureEntriesLoadedAsync();
return await EntriesAsync.All(x => x.IsComplete);
}
public async ValueTask<long> TotalSizeAsync() =>
await EntriesAsync.Aggregate(0L, (total, cf) => total + cf.CompressedSize);
public async ValueTask<long> TotalUncompressSizeAsync() =>
await EntriesAsync.Aggregate(0L, (total, cf) => total + cf.Size);
#endregion
}

View File

@@ -1,132 +0,0 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Common.Options;
namespace SharpCompress.Archives;
public abstract partial class AbstractWritableArchive<TEntry, TVolume, TOptions>
where TEntry : IArchiveEntry
where TVolume : IVolume
where TOptions : IWriterOptions
{
// Async property moved from main file
private IAsyncEnumerable<TEntry> OldEntriesAsync =>
base.EntriesAsync.Where(x => !removedEntries.Contains(x));
private async ValueTask RebuildModifiedCollectionAsync()
{
if (pauseRebuilding)
{
return;
}
hasModifications = true;
newEntries.RemoveAll(v => removedEntries.Contains(v));
modifiedEntries.Clear();
await foreach (var entry in OldEntriesAsync)
{
modifiedEntries.Add(entry);
}
modifiedEntries.AddRange(newEntries);
}
public async ValueTask RemoveEntryAsync(TEntry entry)
{
if (!removedEntries.Contains(entry))
{
removedEntries.Add(entry);
await RebuildModifiedCollectionAsync();
}
}
private async ValueTask<bool> DoesKeyMatchExistingAsync(
string key,
CancellationToken cancellationToken
)
{
await foreach (
var entry in EntriesAsync.WithCancellation(cancellationToken).ConfigureAwait(false)
)
{
var path = entry.Key;
if (path is null)
{
continue;
}
var p = path.Replace('/', '\\');
if (p.Length > 0 && p[0] == '\\')
{
p = p.Substring(1);
}
return string.Equals(p, key, StringComparison.OrdinalIgnoreCase);
}
return false;
}
public async ValueTask<TEntry> AddEntryAsync(
string key,
Stream source,
bool closeStream,
long size = 0,
DateTime? modified = null,
CancellationToken cancellationToken = default
)
{
if (key.Length > 0 && key[0] is '/' or '\\')
{
key = key.Substring(1);
}
if (await DoesKeyMatchExistingAsync(key, cancellationToken).ConfigureAwait(false))
{
throw new ArchiveException("Cannot add entry with duplicate key: " + key);
}
var entry = CreateEntry(key, source, size, modified, closeStream);
newEntries.Add(entry);
await RebuildModifiedCollectionAsync();
return entry;
}
public async ValueTask<TEntry> AddDirectoryEntryAsync(
string key,
DateTime? modified = null,
CancellationToken cancellationToken = default
)
{
if (key.Length > 0 && key[0] is '/' or '\\')
{
key = key.Substring(1);
}
if (await DoesKeyMatchExistingAsync(key, cancellationToken).ConfigureAwait(false))
{
throw new ArchiveException("Cannot add entry with duplicate key: " + key);
}
var entry = CreateDirectoryEntry(key, modified);
newEntries.Add(entry);
await RebuildModifiedCollectionAsync();
return entry;
}
public async ValueTask SaveToAsync(
Stream stream,
TOptions options,
CancellationToken cancellationToken = default
)
{
//reset streams of new entries
newEntries.Cast<IWritableArchiveEntry>().ForEach(x => x.Stream.Seek(0, SeekOrigin.Begin));
await SaveToAsync(stream, options, OldEntriesAsync, newEntries, cancellationToken)
.ConfigureAwait(false);
}
protected abstract ValueTask SaveToAsync(
Stream stream,
TOptions options,
IAsyncEnumerable<TEntry> oldEntries,
IEnumerable<TEntry> newEntries,
CancellationToken cancellationToken = default
);
}

View File

@@ -5,25 +5,22 @@ using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Common.Options;
using SharpCompress.IO;
using SharpCompress.Writers;
namespace SharpCompress.Archives;
public abstract partial class AbstractWritableArchive<TEntry, TVolume, TOptions>
public abstract class AbstractWritableArchive<TEntry, TVolume>
: AbstractArchive<TEntry, TVolume>,
IWritableArchive<TOptions>,
IWritableAsyncArchive<TOptions>
IWritableArchive
where TEntry : IArchiveEntry
where TVolume : IVolume
where TOptions : IWriterOptions
{
private class RebuildPauseDisposable : IDisposable
{
private readonly AbstractWritableArchive<TEntry, TVolume, TOptions> archive;
private readonly AbstractWritableArchive<TEntry, TVolume> archive;
public RebuildPauseDisposable(AbstractWritableArchive<TEntry, TVolume, TOptions> archive)
public RebuildPauseDisposable(AbstractWritableArchive<TEntry, TVolume> archive)
{
this.archive = archive;
archive.pauseRebuilding = true;
@@ -142,24 +139,6 @@ public abstract partial class AbstractWritableArchive<TEntry, TVolume, TOptions>
return false;
}
ValueTask IWritableAsyncArchive.RemoveEntryAsync(IArchiveEntry entry) =>
RemoveEntryAsync((TEntry)entry);
async ValueTask<IArchiveEntry> IWritableAsyncArchive.AddEntryAsync(
string key,
Stream source,
bool closeStream,
long size,
DateTime? modified,
CancellationToken cancellationToken
) => await AddEntryAsync(key, source, closeStream, size, modified, cancellationToken);
async ValueTask<IArchiveEntry> IWritableAsyncArchive.AddDirectoryEntryAsync(
string key,
DateTime? modified,
CancellationToken cancellationToken
) => await AddDirectoryEntryAsync(key, modified, cancellationToken);
public TEntry AddDirectoryEntry(string key, DateTime? modified = null)
{
if (key.Length > 0 && key[0] is '/' or '\\')
@@ -176,13 +155,25 @@ public abstract partial class AbstractWritableArchive<TEntry, TVolume, TOptions>
return entry;
}
public void SaveTo(Stream stream, TOptions options)
public void SaveTo(Stream stream, WriterOptions options)
{
//reset streams of new entries
newEntries.Cast<IWritableArchiveEntry>().ForEach(x => x.Stream.Seek(0, SeekOrigin.Begin));
SaveTo(stream, options, OldEntries, newEntries);
}
public async ValueTask SaveToAsync(
Stream stream,
WriterOptions options,
CancellationToken cancellationToken = default
)
{
//reset streams of new entries
newEntries.Cast<IWritableArchiveEntry>().ForEach(x => x.Stream.Seek(0, SeekOrigin.Begin));
await SaveToAsync(stream, options, OldEntries, newEntries, cancellationToken)
.ConfigureAwait(false);
}
protected TEntry CreateEntry(
string key,
Stream source,
@@ -212,11 +203,19 @@ public abstract partial class AbstractWritableArchive<TEntry, TVolume, TOptions>
protected abstract void SaveTo(
Stream stream,
TOptions options,
WriterOptions options,
IEnumerable<TEntry> oldEntries,
IEnumerable<TEntry> newEntries
);
protected abstract ValueTask SaveToAsync(
Stream stream,
WriterOptions options,
IEnumerable<TEntry> oldEntries,
IEnumerable<TEntry> newEntries,
CancellationToken cancellationToken = default
);
public override void Dispose()
{
base.Dispose();

View File

@@ -1,157 +0,0 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Factories;
using SharpCompress.IO;
using SharpCompress.Readers;
namespace SharpCompress.Archives;
public static partial class ArchiveFactory
{
public static async ValueTask<IAsyncArchive> OpenAsyncArchive(
Stream stream,
ReaderOptions? readerOptions = null,
CancellationToken cancellationToken = default
)
{
readerOptions ??= ReaderOptions.ForExternalStream;
var factory = await FindFactoryAsync<IArchiveFactory>(stream, cancellationToken);
return factory.OpenAsyncArchive(stream, readerOptions);
}
public static ValueTask<IAsyncArchive> OpenAsyncArchive(
string filePath,
ReaderOptions? options = null,
CancellationToken cancellationToken = default
)
{
filePath.NotNullOrEmpty(nameof(filePath));
return OpenAsyncArchive(new FileInfo(filePath), options, cancellationToken);
}
public static async ValueTask<IAsyncArchive> OpenAsyncArchive(
FileInfo fileInfo,
ReaderOptions? options = null,
CancellationToken cancellationToken = default
)
{
options ??= ReaderOptions.ForOwnedFile;
var factory = await FindFactoryAsync<IArchiveFactory>(fileInfo, cancellationToken);
return factory.OpenAsyncArchive(fileInfo, options);
}
public static async ValueTask<IAsyncArchive> OpenAsyncArchive(
IEnumerable<FileInfo> fileInfos,
ReaderOptions? options = null,
CancellationToken cancellationToken = default
)
{
fileInfos.NotNull(nameof(fileInfos));
var filesArray = fileInfos.ToArray();
if (filesArray.Length == 0)
{
throw new InvalidOperationException("No files to open");
}
var fileInfo = filesArray[0];
if (filesArray.Length == 1)
{
return await OpenAsyncArchive(fileInfo, options, cancellationToken);
}
fileInfo.NotNull(nameof(fileInfo));
options ??= ReaderOptions.ForOwnedFile;
var factory = await FindFactoryAsync<IMultiArchiveFactory>(fileInfo, cancellationToken);
return factory.OpenAsyncArchive(filesArray, options);
}
public static async ValueTask<IAsyncArchive> OpenAsyncArchive(
IEnumerable<Stream> streams,
ReaderOptions? options = null,
CancellationToken cancellationToken = default
)
{
cancellationToken.ThrowIfCancellationRequested();
streams.NotNull(nameof(streams));
var streamsArray = streams.ToArray();
if (streamsArray.Length == 0)
{
throw new InvalidOperationException("No streams");
}
var firstStream = streamsArray[0];
if (streamsArray.Length == 1)
{
return await OpenAsyncArchive(firstStream, options, cancellationToken);
}
firstStream.NotNull(nameof(firstStream));
options ??= ReaderOptions.ForExternalStream;
var factory = await FindFactoryAsync<IMultiArchiveFactory>(firstStream, cancellationToken);
return factory.OpenAsyncArchive(streamsArray, options);
}
public static ValueTask<T> FindFactoryAsync<T>(
string path,
CancellationToken cancellationToken = default
)
where T : IFactory
{
path.NotNullOrEmpty(nameof(path));
return FindFactoryAsync<T>(new FileInfo(path), cancellationToken);
}
private static async ValueTask<T> FindFactoryAsync<T>(
FileInfo finfo,
CancellationToken cancellationToken
)
where T : IFactory
{
finfo.NotNull(nameof(finfo));
using Stream stream = finfo.OpenRead();
return await FindFactoryAsync<T>(stream, cancellationToken);
}
private static async ValueTask<T> FindFactoryAsync<T>(
Stream stream,
CancellationToken cancellationToken
)
where T : IFactory
{
stream.NotNull(nameof(stream));
if (!stream.CanRead || !stream.CanSeek)
{
throw new ArgumentException("Stream should be readable and seekable");
}
var factories = Factory.Factories.OfType<T>();
var startPosition = stream.Position;
foreach (var factory in factories)
{
stream.Seek(startPosition, SeekOrigin.Begin);
if (await factory.IsArchiveAsync(stream, cancellationToken: cancellationToken))
{
stream.Seek(startPosition, SeekOrigin.Begin);
return factory;
}
}
var extensions = string.Join(", ", factories.Select(item => item.Name));
throw new InvalidOperationException(
$"Cannot determine compressed stream type. Supported Archive Formats: {extensions}"
);
}
}

View File

@@ -5,52 +5,157 @@ using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Common.Options;
using SharpCompress.Factories;
using SharpCompress.IO;
using SharpCompress.Readers;
namespace SharpCompress.Archives;
public static partial class ArchiveFactory
public static class ArchiveFactory
{
public static IArchive OpenArchive(Stream stream, ReaderOptions? readerOptions = null)
/// <summary>
/// Opens an Archive for random access
/// </summary>
/// <param name="stream"></param>
/// <param name="readerOptions"></param>
/// <returns></returns>
public static IArchive Open(Stream stream, ReaderOptions? readerOptions = null)
{
readerOptions ??= ReaderOptions.ForExternalStream;
return FindFactory<IArchiveFactory>(stream).OpenArchive(stream, readerOptions);
readerOptions ??= new ReaderOptions();
stream = SharpCompressStream.Create(stream, bufferSize: readerOptions.BufferSize);
return FindFactory<IArchiveFactory>(stream).Open(stream, readerOptions);
}
public static IWritableArchive<TOptions> CreateArchive<TOptions>()
where TOptions : IWriterOptions
/// <summary>
/// Opens an Archive for random access asynchronously
/// </summary>
/// <param name="stream"></param>
/// <param name="readerOptions"></param>
/// <param name="cancellationToken"></param>
/// <returns></returns>
public static async ValueTask<IAsyncArchive> OpenAsync(
Stream stream,
ReaderOptions? readerOptions = null,
CancellationToken cancellationToken = default
)
{
readerOptions ??= new ReaderOptions();
stream = SharpCompressStream.Create(stream, bufferSize: readerOptions.BufferSize);
var factory = await FindFactoryAsync<IArchiveFactory>(stream, cancellationToken)
.ConfigureAwait(false);
return await factory
.OpenAsync(stream, readerOptions, cancellationToken)
.ConfigureAwait(false);
}
public static IWritableArchive Create(ArchiveType type)
{
var factory = Factory
.Factories.OfType<IWriteableArchiveFactory<TOptions>>()
.FirstOrDefault();
.Factories.OfType<IWriteableArchiveFactory>()
.FirstOrDefault(item => item.KnownArchiveType == type);
if (factory != null)
{
return factory.CreateArchive();
return factory.CreateWriteableArchive();
}
throw new NotSupportedException("Cannot create Archives of type: " + typeof(TOptions));
throw new NotSupportedException("Cannot create Archives of type: " + type);
}
public static IArchive OpenArchive(string filePath, ReaderOptions? options = null)
/// <summary>
/// Constructor expects a filepath to an existing file.
/// </summary>
/// <param name="filePath"></param>
/// <param name="options"></param>
public static IArchive Open(string filePath, ReaderOptions? options = null)
{
filePath.NotNullOrEmpty(nameof(filePath));
return OpenArchive(new FileInfo(filePath), options);
return Open(new FileInfo(filePath), options);
}
public static IArchive OpenArchive(FileInfo fileInfo, ReaderOptions? options = null)
/// <summary>
/// Opens an Archive from a filepath asynchronously.
/// </summary>
/// <param name="filePath"></param>
/// <param name="options"></param>
/// <param name="cancellationToken"></param>
public static ValueTask<IAsyncArchive> OpenAsync(
string filePath,
ReaderOptions? options = null,
CancellationToken cancellationToken = default
)
{
options ??= ReaderOptions.ForOwnedFile;
return FindFactory<IArchiveFactory>(fileInfo).OpenArchive(fileInfo, options);
filePath.NotNullOrEmpty(nameof(filePath));
return OpenAsync(new FileInfo(filePath), options, cancellationToken);
}
public static IArchive OpenArchive(
/// <summary>
/// Constructor with a FileInfo object to an existing file.
/// </summary>
/// <param name="fileInfo"></param>
/// <param name="options"></param>
public static IArchive Open(FileInfo fileInfo, ReaderOptions? options = null)
{
options ??= new ReaderOptions { LeaveStreamOpen = false };
return FindFactory<IArchiveFactory>(fileInfo).Open(fileInfo, options);
}
/// <summary>
/// Opens an Archive from a FileInfo object asynchronously.
/// </summary>
/// <param name="fileInfo"></param>
/// <param name="options"></param>
/// <param name="cancellationToken"></param>
public static async ValueTask<IAsyncArchive> OpenAsync(
FileInfo fileInfo,
ReaderOptions? options = null,
CancellationToken cancellationToken = default
)
{
options ??= new ReaderOptions { LeaveStreamOpen = false };
var factory = await FindFactoryAsync<IArchiveFactory>(fileInfo, cancellationToken)
.ConfigureAwait(false);
return await factory.OpenAsync(fileInfo, options, cancellationToken).ConfigureAwait(false);
}
/// <summary>
/// Constructor with IEnumerable FileInfo objects, multi and split support.
/// </summary>
/// <param name="fileInfos"></param>
/// <param name="options"></param>
public static IArchive Open(IEnumerable<FileInfo> fileInfos, ReaderOptions? options = null)
{
fileInfos.NotNull(nameof(fileInfos));
var filesArray = fileInfos.ToArray();
if (filesArray.Length == 0)
{
throw new InvalidOperationException("No files to open");
}
var fileInfo = filesArray[0];
if (filesArray.Length == 1)
{
return Open(fileInfo, options);
}
fileInfo.NotNull(nameof(fileInfo));
options ??= new ReaderOptions { LeaveStreamOpen = false };
return FindFactory<IMultiArchiveFactory>(fileInfo).Open(filesArray, options);
}
/// <summary>
/// Opens a multi-part archive from files asynchronously.
/// </summary>
/// <param name="fileInfos"></param>
/// <param name="options"></param>
/// <param name="cancellationToken"></param>
public static async ValueTask<IAsyncArchive> OpenAsync(
IEnumerable<FileInfo> fileInfos,
ReaderOptions? options = null
ReaderOptions? options = null,
CancellationToken cancellationToken = default
)
{
fileInfos.NotNull(nameof(fileInfos));
@@ -63,16 +168,24 @@ public static partial class ArchiveFactory
var fileInfo = filesArray[0];
if (filesArray.Length == 1)
{
return OpenArchive(fileInfo, options);
return await OpenAsync(fileInfo, options, cancellationToken).ConfigureAwait(false);
}
fileInfo.NotNull(nameof(fileInfo));
options ??= ReaderOptions.ForOwnedFile;
options ??= new ReaderOptions { LeaveStreamOpen = false };
return FindFactory<IMultiArchiveFactory>(fileInfo).OpenArchive(filesArray, options);
var factory = FindFactory<IMultiArchiveFactory>(fileInfo);
return await factory
.OpenAsync(filesArray, options, cancellationToken)
.ConfigureAwait(false);
}
public static IArchive OpenArchive(IEnumerable<Stream> streams, ReaderOptions? options = null)
/// <summary>
/// Constructor with IEnumerable Stream objects, multi and split support.
/// </summary>
/// <param name="streams"></param>
/// <param name="options"></param>
public static IArchive Open(IEnumerable<Stream> streams, ReaderOptions? options = null)
{
streams.NotNull(nameof(streams));
var streamsArray = streams.ToArray();
@@ -84,34 +197,64 @@ public static partial class ArchiveFactory
var firstStream = streamsArray[0];
if (streamsArray.Length == 1)
{
return OpenArchive(firstStream, options);
return Open(firstStream, options);
}
firstStream.NotNull(nameof(firstStream));
options ??= ReaderOptions.ForExternalStream;
options ??= new ReaderOptions();
return FindFactory<IMultiArchiveFactory>(firstStream).OpenArchive(streamsArray, options);
return FindFactory<IMultiArchiveFactory>(firstStream).Open(streamsArray, options);
}
/// <summary>
/// Opens a multi-part archive from streams asynchronously.
/// </summary>
/// <param name="streams"></param>
/// <param name="options"></param>
/// <param name="cancellationToken"></param>
public static async ValueTask<IAsyncArchive> OpenAsync(
IEnumerable<Stream> streams,
ReaderOptions? options = null,
CancellationToken cancellationToken = default
)
{
cancellationToken.ThrowIfCancellationRequested();
streams.NotNull(nameof(streams));
var streamsArray = streams.ToArray();
if (streamsArray.Length == 0)
{
throw new InvalidOperationException("No streams");
}
var firstStream = streamsArray[0];
if (streamsArray.Length == 1)
{
return await OpenAsync(firstStream, options, cancellationToken).ConfigureAwait(false);
}
firstStream.NotNull(nameof(firstStream));
options ??= new ReaderOptions();
var factory = FindFactory<IMultiArchiveFactory>(firstStream);
return await factory
.OpenAsync(streamsArray, options, cancellationToken)
.ConfigureAwait(false);
}
/// <summary>
/// Extract to specific directory, retaining filename
/// </summary>
public static void WriteToDirectory(
string sourceArchive,
string destinationDirectory,
ReaderOptions? options = null
ExtractionOptions? options = null
)
{
using var archive = OpenArchive(sourceArchive, options);
archive.WriteToDirectory(destinationDirectory);
using var archive = Open(sourceArchive);
archive.WriteToDirectory(destinationDirectory, options);
}
public static T FindFactory<T>(string path)
where T : IFactory
{
path.NotNullOrEmpty(nameof(path));
using Stream stream = File.OpenRead(path);
return FindFactory<T>(stream);
}
public static T FindFactory<T>(FileInfo finfo)
private static T FindFactory<T>(FileInfo finfo)
where T : IFactory
{
finfo.NotNull(nameof(finfo));
@@ -119,7 +262,7 @@ public static partial class ArchiveFactory
return FindFactory<T>(stream);
}
public static T FindFactory<T>(Stream stream)
private static T FindFactory<T>(Stream stream)
where T : IFactory
{
stream.NotNull(nameof(stream));
@@ -151,14 +294,68 @@ public static partial class ArchiveFactory
);
}
public static bool IsArchive(string filePath, out ArchiveType? type)
private static async ValueTask<T> FindFactoryAsync<T>(
FileInfo finfo,
CancellationToken cancellationToken
)
where T : IFactory
{
finfo.NotNull(nameof(finfo));
using Stream stream = finfo.OpenRead();
return await FindFactoryAsync<T>(stream, cancellationToken);
}
private static async ValueTask<T> FindFactoryAsync<T>(
Stream stream,
CancellationToken cancellationToken
)
where T : IFactory
{
stream.NotNull(nameof(stream));
if (!stream.CanRead || !stream.CanSeek)
{
throw new ArgumentException("Stream should be readable and seekable");
}
var factories = Factory.Factories.OfType<T>();
var startPosition = stream.Position;
foreach (var factory in factories)
{
stream.Seek(startPosition, SeekOrigin.Begin);
if (await factory.IsArchiveAsync(stream, cancellationToken: cancellationToken))
{
stream.Seek(startPosition, SeekOrigin.Begin);
return factory;
}
}
var extensions = string.Join(", ", factories.Select(item => item.Name));
throw new InvalidOperationException(
$"Cannot determine compressed stream type. Supported Archive Formats: {extensions}"
);
}
public static bool IsArchive(
string filePath,
out ArchiveType? type,
int bufferSize = ReaderOptions.DefaultBufferSize
)
{
filePath.NotNullOrEmpty(nameof(filePath));
using Stream s = File.OpenRead(filePath);
return IsArchive(s, out type);
return IsArchive(s, out type, bufferSize);
}
public static bool IsArchive(Stream stream, out ArchiveType? type)
public static bool IsArchive(
Stream stream,
out ArchiveType? type,
int bufferSize = ReaderOptions.DefaultBufferSize
)
{
type = null;
stream.NotNull(nameof(stream));
@@ -185,12 +382,22 @@ public static partial class ArchiveFactory
return false;
}
/// <summary>
/// From a passed in archive (zip, rar, 7z, 001), return all parts.
/// </summary>
/// <param name="part1"></param>
/// <returns></returns>
public static IEnumerable<string> GetFileParts(string part1)
{
part1.NotNullOrEmpty(nameof(part1));
return GetFileParts(new FileInfo(part1)).Select(a => a.FullName);
}
/// <summary>
/// From a passed in archive (zip, rar, 7z, 001), return all parts.
/// </summary>
/// <param name="part1"></param>
/// <returns></returns>
public static IEnumerable<FileInfo> GetFileParts(FileInfo part1)
{
part1.NotNull(nameof(part1));
@@ -204,7 +411,7 @@ public static partial class ArchiveFactory
if (part != null)
{
yield return part;
while ((part = factory.GetFilePart(i++, part1)) != null)
while ((part = factory.GetFilePart(i++, part1)) != null) //tests split too
{
yield return part;
}
@@ -213,4 +420,6 @@ public static partial class ArchiveFactory
}
}
}
public static IArchiveFactory AutoFactory { get; } = new AutoArchiveFactory();
}

View File

@@ -13,7 +13,6 @@ internal abstract class ArchiveVolumeFactory
//split 001, 002 ...
var m = Regex.Match(part1.Name, @"^(.*\.)([0-9]+)$", RegexOptions.IgnoreCase);
if (m.Success)
{
item = new FileInfo(
Path.Combine(
part1.DirectoryName!,
@@ -23,13 +22,9 @@ internal abstract class ArchiveVolumeFactory
)
)
);
}
if (item != null && item.Exists)
{
return item;
}
return null;
}
}
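A hedged usage sketch of the public helper built on this lookup: starting from the first part, `ArchiveFactory.GetFileParts` enumerates every file in a split set (the path below is illustrative):

```C#
using System;
using SharpCompress.Archives;

foreach (var partPath in ArchiveFactory.GetFileParts(@"C:\archives\backup.7z.001"))
{
    // Yields backup.7z.001, backup.7z.002, ... while each next part exists.
    Console.WriteLine(partPath);
}
```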

View File

@@ -0,0 +1,51 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Readers;
namespace SharpCompress.Archives;
internal class AutoArchiveFactory : IArchiveFactory
{
public string Name => nameof(AutoArchiveFactory);
public ArchiveType? KnownArchiveType => null;
public IEnumerable<string> GetSupportedExtensions() => throw new NotSupportedException();
public bool IsArchive(
Stream stream,
string? password = null,
int bufferSize = ReaderOptions.DefaultBufferSize
) => throw new NotSupportedException();
public ValueTask<bool> IsArchiveAsync(
Stream stream,
string? password = null,
int bufferSize = ReaderOptions.DefaultBufferSize,
CancellationToken cancellationToken = default
) => throw new NotSupportedException();
public FileInfo? GetFilePart(int index, FileInfo part1) => throw new NotSupportedException();
public IArchive Open(Stream stream, ReaderOptions? readerOptions = null) =>
ArchiveFactory.Open(stream, readerOptions);
public async ValueTask<IAsyncArchive> OpenAsync(
Stream stream,
ReaderOptions? readerOptions = null,
CancellationToken cancellationToken = default
) => await ArchiveFactory.OpenAsync(stream, readerOptions, cancellationToken);
public IArchive Open(FileInfo fileInfo, ReaderOptions? readerOptions = null) =>
ArchiveFactory.Open(fileInfo, readerOptions);
public async ValueTask<IAsyncArchive> OpenAsync(
FileInfo fileInfo,
ReaderOptions? readerOptions = null,
CancellationToken cancellationToken = default
) => await ArchiveFactory.OpenAsync(fileInfo, readerOptions, cancellationToken);
}

View File

@@ -1,90 +0,0 @@
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Common.GZip;
using SharpCompress.Common.Options;
using SharpCompress.Readers;
using SharpCompress.Readers.GZip;
using SharpCompress.Writers;
using SharpCompress.Writers.GZip;
namespace SharpCompress.Archives.GZip;
public partial class GZipArchive
{
public ValueTask SaveToAsync(string filePath, CancellationToken cancellationToken = default) =>
SaveToAsync(new FileInfo(filePath), cancellationToken);
public async ValueTask SaveToAsync(
FileInfo fileInfo,
CancellationToken cancellationToken = default
)
{
using var stream = fileInfo.Open(FileMode.Create, FileAccess.Write);
await SaveToAsync(stream, new GZipWriterOptions(CompressionType.GZip), cancellationToken)
.ConfigureAwait(false);
}
protected override async ValueTask SaveToAsync(
Stream stream,
GZipWriterOptions options,
IAsyncEnumerable<GZipArchiveEntry> oldEntries,
IEnumerable<GZipArchiveEntry> newEntries,
CancellationToken cancellationToken = default
)
{
if (Entries.Count > 1)
{
throw new InvalidFormatException("Only one entry is allowed in a GZip Archive");
}
await using var writer = new GZipWriter(
stream,
options as GZipWriterOptions ?? new GZipWriterOptions(options)
);
await foreach (
var entry in oldEntries.WithCancellation(cancellationToken).ConfigureAwait(false)
)
{
if (!entry.IsDirectory)
{
using var entryStream = await entry.OpenEntryStreamAsync(cancellationToken);
await writer
.WriteAsync(
entry.Key.NotNull("Entry Key is null"),
entryStream,
cancellationToken
)
.ConfigureAwait(false);
}
}
foreach (var entry in newEntries.Where(x => !x.IsDirectory))
{
using var entryStream = await entry.OpenEntryStreamAsync(cancellationToken);
await writer
.WriteAsync(entry.Key.NotNull("Entry Key is null"), entryStream, cancellationToken)
.ConfigureAwait(false);
}
}
protected override ValueTask<IAsyncReader> CreateReaderForSolidExtractionAsync()
{
var stream = Volumes.Single().Stream;
stream.Position = 0;
return new((IAsyncReader)GZipReader.OpenReader(stream));
}
protected override async IAsyncEnumerable<GZipArchiveEntry> LoadEntriesAsync(
IAsyncEnumerable<GZipVolume> volumes
)
{
var stream = (await volumes.SingleAsync()).Stream;
yield return new GZipArchiveEntry(
this,
await GZipFilePart.CreateAsync(stream, ReaderOptions.ArchiveEncoding),
ReaderOptions
);
}
}

View File

@@ -1,183 +0,0 @@
using System;
using System.Buffers;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.IO;
using SharpCompress.Readers;
using SharpCompress.Writers.GZip;
namespace SharpCompress.Archives.GZip;
public partial class GZipArchive
#if NET8_0_OR_GREATER
: IWritableArchiveOpenable<GZipWriterOptions>,
IMultiArchiveOpenable<
IWritableArchive<GZipWriterOptions>,
IWritableAsyncArchive<GZipWriterOptions>
>
#endif
{
public static IWritableAsyncArchive<GZipWriterOptions> OpenAsyncArchive(
string path,
ReaderOptions? readerOptions = null
)
{
path.NotNullOrEmpty(nameof(path));
return (IWritableAsyncArchive<GZipWriterOptions>)
OpenArchive(new FileInfo(path), readerOptions ?? new ReaderOptions());
}
public static IWritableArchive<GZipWriterOptions> OpenArchive(
string filePath,
ReaderOptions? readerOptions = null
)
{
filePath.NotNullOrEmpty(nameof(filePath));
return OpenArchive(new FileInfo(filePath), readerOptions ?? new ReaderOptions());
}
public static IWritableArchive<GZipWriterOptions> OpenArchive(
FileInfo fileInfo,
ReaderOptions? readerOptions = null
)
{
fileInfo.NotNull(nameof(fileInfo));
return new GZipArchive(
new SourceStream(
fileInfo,
i => ArchiveVolumeFactory.GetFilePart(i, fileInfo),
readerOptions ?? new ReaderOptions()
)
);
}
public static IWritableArchive<GZipWriterOptions> OpenArchive(
IEnumerable<FileInfo> fileInfos,
ReaderOptions? readerOptions = null
)
{
fileInfos.NotNull(nameof(fileInfos));
var files = fileInfos.ToArray();
return new GZipArchive(
new SourceStream(
files[0],
i => i < files.Length ? files[i] : null,
readerOptions ?? new ReaderOptions()
)
);
}
public static IWritableArchive<GZipWriterOptions> OpenArchive(
IEnumerable<Stream> streams,
ReaderOptions? readerOptions = null
)
{
streams.NotNull(nameof(streams));
var strms = streams.ToArray();
return new GZipArchive(
new SourceStream(
strms[0],
i => i < strms.Length ? strms[i] : null,
readerOptions ?? new ReaderOptions()
)
);
}
public static IWritableArchive<GZipWriterOptions> OpenArchive(
Stream stream,
ReaderOptions? readerOptions = null
)
{
stream.NotNull(nameof(stream));
if (stream is not { CanSeek: true })
{
throw new ArgumentException("Stream must be seekable", nameof(stream));
}
return new GZipArchive(
new SourceStream(stream, _ => null, readerOptions ?? new ReaderOptions())
);
}
public static IWritableAsyncArchive<GZipWriterOptions> OpenAsyncArchive(
Stream stream,
ReaderOptions? readerOptions = null
) => (IWritableAsyncArchive<GZipWriterOptions>)OpenArchive(stream, readerOptions);
public static IWritableAsyncArchive<GZipWriterOptions> OpenAsyncArchive(
FileInfo fileInfo,
ReaderOptions? readerOptions = null
) => (IWritableAsyncArchive<GZipWriterOptions>)OpenArchive(fileInfo, readerOptions);
public static IWritableAsyncArchive<GZipWriterOptions> OpenAsyncArchive(
IReadOnlyList<Stream> streams,
ReaderOptions? readerOptions = null
) => (IWritableAsyncArchive<GZipWriterOptions>)OpenArchive(streams, readerOptions);
public static IWritableAsyncArchive<GZipWriterOptions> OpenAsyncArchive(
IReadOnlyList<FileInfo> fileInfos,
ReaderOptions? readerOptions = null
) => (IWritableAsyncArchive<GZipWriterOptions>)OpenArchive(fileInfos, readerOptions);
public static IWritableArchive<GZipWriterOptions> CreateArchive() => new GZipArchive();
public static IWritableAsyncArchive<GZipWriterOptions> CreateAsyncArchive() =>
new GZipArchive();
public static bool IsGZipFile(string filePath) => IsGZipFile(new FileInfo(filePath));
public static bool IsGZipFile(FileInfo fileInfo)
{
if (!fileInfo.Exists)
{
return false;
}
using Stream stream = fileInfo.OpenRead();
return IsGZipFile(stream);
}
public static bool IsGZipFile(Stream stream)
{
Span<byte> header = stackalloc byte[10];
if (!stream.ReadFully(header))
{
return false;
}
if (header[0] != 0x1F || header[1] != 0x8B || header[2] != 8)
{
return false;
}
return true;
}
public static async ValueTask<bool> IsGZipFileAsync(
Stream stream,
CancellationToken cancellationToken = default
)
{
var header = ArrayPool<byte>.Shared.Rent(10);
try
{
await stream.ReadFullyAsync(header, 0, 10, cancellationToken).ConfigureAwait(false);
if (header[0] != 0x1F || header[1] != 0x8B || header[2] != 8)
{
return false;
}
return true;
}
finally
{
ArrayPool<byte>.Shared.Return(header);
}
}
}

View File

@@ -2,9 +2,10 @@ using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Common.GZip;
using SharpCompress.Common.Options;
using SharpCompress.IO;
using SharpCompress.Readers;
using SharpCompress.Readers.GZip;
@@ -13,29 +14,251 @@ using SharpCompress.Writers.GZip;
namespace SharpCompress.Archives.GZip;
public partial class GZipArchive
: AbstractWritableArchive<GZipArchiveEntry, GZipVolume, GZipWriterOptions>
public class GZipArchive : AbstractWritableArchive<GZipArchiveEntry, GZipVolume>
{
/// <summary>
/// Constructor expects a filepath to an existing file.
/// </summary>
/// <param name="filePath"></param>
/// <param name="readerOptions"></param>
public static GZipArchive Open(string filePath, ReaderOptions? readerOptions = null)
{
filePath.NotNullOrEmpty(nameof(filePath));
return Open(new FileInfo(filePath), readerOptions ?? new ReaderOptions());
}
/// <summary>
/// Constructor with a FileInfo object to an existing file.
/// </summary>
/// <param name="fileInfo"></param>
/// <param name="readerOptions"></param>
public static GZipArchive Open(FileInfo fileInfo, ReaderOptions? readerOptions = null)
{
fileInfo.NotNull(nameof(fileInfo));
return new GZipArchive(
new SourceStream(
fileInfo,
i => ArchiveVolumeFactory.GetFilePart(i, fileInfo),
readerOptions ?? new ReaderOptions()
)
);
}
/// <summary>
/// Constructor with all file parts passed in
/// </summary>
/// <param name="fileInfos"></param>
/// <param name="readerOptions"></param>
public static GZipArchive Open(
IEnumerable<FileInfo> fileInfos,
ReaderOptions? readerOptions = null
)
{
fileInfos.NotNull(nameof(fileInfos));
var files = fileInfos.ToArray();
return new GZipArchive(
new SourceStream(
files[0],
i => i < files.Length ? files[i] : null,
readerOptions ?? new ReaderOptions()
)
);
}
/// <summary>
/// Constructor with all stream parts passed in
/// </summary>
/// <param name="streams"></param>
/// <param name="readerOptions"></param>
public static GZipArchive Open(IEnumerable<Stream> streams, ReaderOptions? readerOptions = null)
{
streams.NotNull(nameof(streams));
var strms = streams.ToArray();
return new GZipArchive(
new SourceStream(
strms[0],
i => i < strms.Length ? strms[i] : null,
readerOptions ?? new ReaderOptions()
)
);
}
/// <summary>
/// Takes a seekable Stream as a source
/// </summary>
/// <param name="stream"></param>
/// <param name="readerOptions"></param>
public static GZipArchive Open(Stream stream, ReaderOptions? readerOptions = null)
{
stream.NotNull(nameof(stream));
if (stream is not { CanSeek: true })
{
throw new ArgumentException("Stream must be seekable", nameof(stream));
}
return new GZipArchive(
new SourceStream(stream, _ => null, readerOptions ?? new ReaderOptions())
);
}
/// <summary>
/// Opens a GZipArchive asynchronously from a stream.
/// </summary>
/// <param name="stream"></param>
/// <param name="readerOptions"></param>
/// <param name="cancellationToken"></param>
public static ValueTask<IAsyncArchive> OpenAsync(
Stream stream,
ReaderOptions? readerOptions = null,
CancellationToken cancellationToken = default
)
{
cancellationToken.ThrowIfCancellationRequested();
return new(Open(stream, readerOptions));
}
/// <summary>
/// Opens a GZipArchive asynchronously from a FileInfo.
/// </summary>
/// <param name="fileInfo"></param>
/// <param name="readerOptions"></param>
/// <param name="cancellationToken"></param>
public static ValueTask<IAsyncArchive> OpenAsync(
FileInfo fileInfo,
ReaderOptions? readerOptions = null,
CancellationToken cancellationToken = default
)
{
cancellationToken.ThrowIfCancellationRequested();
return new(Open(fileInfo, readerOptions));
}
/// <summary>
/// Opens a GZipArchive asynchronously from multiple streams.
/// </summary>
/// <param name="streams"></param>
/// <param name="readerOptions"></param>
/// <param name="cancellationToken"></param>
public static ValueTask<IAsyncArchive> OpenAsync(
IReadOnlyList<Stream> streams,
ReaderOptions? readerOptions = null,
CancellationToken cancellationToken = default
)
{
cancellationToken.ThrowIfCancellationRequested();
return new(Open(streams, readerOptions));
}
/// <summary>
/// Opens a GZipArchive asynchronously from multiple FileInfo objects.
/// </summary>
/// <param name="fileInfos"></param>
/// <param name="readerOptions"></param>
/// <param name="cancellationToken"></param>
public static ValueTask<IAsyncArchive> OpenAsync(
IReadOnlyList<FileInfo> fileInfos,
ReaderOptions? readerOptions = null,
CancellationToken cancellationToken = default
)
{
cancellationToken.ThrowIfCancellationRequested();
return new(Open(fileInfos, readerOptions));
}
public static GZipArchive Create() => new();
/// <summary>
/// Constructor with a SourceStream able to handle FileInfo and Streams.
/// </summary>
/// <param name="sourceStream"></param>
private GZipArchive(SourceStream sourceStream)
: base(ArchiveType.GZip, sourceStream) { }
internal GZipArchive()
: base(ArchiveType.GZip) { }
protected override IEnumerable<GZipVolume> LoadVolumes(SourceStream sourceStream)
{
sourceStream.LoadAllParts();
return sourceStream.Streams.Select(a => new GZipVolume(a, ReaderOptions, 0));
}
public static bool IsGZipFile(string filePath) => IsGZipFile(new FileInfo(filePath));
public static bool IsGZipFile(FileInfo fileInfo)
{
if (!fileInfo.Exists)
{
return false;
}
using Stream stream = fileInfo.OpenRead();
return IsGZipFile(stream);
}
public void SaveTo(string filePath) => SaveTo(new FileInfo(filePath));
public void SaveTo(FileInfo fileInfo)
{
using var stream = fileInfo.Open(FileMode.Create, FileAccess.Write);
SaveTo(stream, new GZipWriterOptions(CompressionType.GZip));
SaveTo(stream, new WriterOptions(CompressionType.GZip));
}
public ValueTask SaveToAsync(string filePath, CancellationToken cancellationToken = default) =>
SaveToAsync(new FileInfo(filePath), cancellationToken);
public async ValueTask SaveToAsync(
FileInfo fileInfo,
CancellationToken cancellationToken = default
)
{
using var stream = fileInfo.Open(FileMode.Create, FileAccess.Write);
await SaveToAsync(stream, new WriterOptions(CompressionType.GZip), cancellationToken)
.ConfigureAwait(false);
}
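// Caller sketch (assumed file names; AddEntry comes from the
// IWritableArchiveExtensions shown later in this diff):
//
//   using var archive = GZipArchive.Create();
//   archive.AddEntry("report.txt", "report.txt");
//   await archive.SaveToAsync("report.txt.gz", cancellationToken);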
public static bool IsGZipFile(Stream stream)
{
// read the header on the first read
Span<byte> header = stackalloc byte[10];
// workitem 8501: handle edge case (decompress empty stream)
if (!stream.ReadFully(header))
{
return false;
}
if (header[0] != 0x1F || header[1] != 0x8B || header[2] != 8)
{
return false;
}
return true;
}
public static async ValueTask<bool> IsGZipFileAsync(
Stream stream,
CancellationToken cancellationToken = default
)
{
// read the header on the first read
byte[] header = new byte[10];
// workitem 8501: handle edge case (decompress empty stream)
if (!await stream.ReadFullyAsync(header, cancellationToken).ConfigureAwait(false))
{
return false;
}
if (header[0] != 0x1F || header[1] != 0x8B || header[2] != 8)
{
return false;
}
return true;
}
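// Note for callers: both detection methods consume the first 10 bytes of the
// stream (0x1F 0x8B magic plus compression method 8) without seeking back, so
// rewind a seekable stream before reusing it, e.g. (sketch, "stream" assumed):
//
//   if (await GZipArchive.IsGZipFileAsync(stream))
//   {
//       stream.Position = 0;
//       await using var archive = await GZipArchive.OpenAsync(stream);
//   }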
internal GZipArchive()
: base(ArchiveType.GZip) { }
protected override GZipArchiveEntry CreateEntryInternal(
string filePath,
Stream source,
@@ -58,7 +281,7 @@ public partial class GZipArchive
protected override void SaveTo(
Stream stream,
GZipWriterOptions options,
WriterOptions options,
IEnumerable<GZipArchiveEntry> oldEntries,
IEnumerable<GZipArchiveEntry> newEntries
)
@@ -67,10 +290,7 @@ public partial class GZipArchive
{
throw new InvalidFormatException("Only one entry is allowed in a GZip Archive");
}
using var writer = new GZipWriter(
stream,
options as GZipWriterOptions ?? new GZipWriterOptions(options)
);
using var writer = new GZipWriter(stream, new GZipWriterOptions(options));
foreach (var entry in oldEntries.Concat(newEntries).Where(x => !x.IsDirectory))
{
using var entryStream = entry.OpenEntryStream();
@@ -82,13 +302,34 @@ public partial class GZipArchive
}
}
protected override async ValueTask SaveToAsync(
Stream stream,
WriterOptions options,
IEnumerable<GZipArchiveEntry> oldEntries,
IEnumerable<GZipArchiveEntry> newEntries,
CancellationToken cancellationToken = default
)
{
if (Entries.Count > 1)
{
throw new InvalidFormatException("Only one entry is allowed in a GZip Archive");
}
using var writer = new GZipWriter(stream, new GZipWriterOptions(options));
foreach (var entry in oldEntries.Concat(newEntries).Where(x => !x.IsDirectory))
{
using var entryStream = entry.OpenEntryStream();
await writer
.WriteAsync(entry.Key.NotNull("Entry Key is null"), entryStream, cancellationToken)
.ConfigureAwait(false);
}
}
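// Behavioral sketch (assumed file names): both save paths enforce the
// single-member rule above, so a second entry fails at save time.
//
//   using var archive = GZipArchive.Create();
//   archive.AddEntry("a.txt", "a.txt");
//   archive.AddEntry("b.txt", "b.txt");
//   archive.SaveTo("out.gz"); // throws InvalidFormatException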
protected override IEnumerable<GZipArchiveEntry> LoadEntries(IEnumerable<GZipVolume> volumes)
{
var stream = volumes.Single().Stream;
yield return new GZipArchiveEntry(
this,
GZipFilePart.Create(stream, ReaderOptions.ArchiveEncoding),
ReaderOptions
new GZipFilePart(stream, ReaderOptions.ArchiveEncoding)
);
}
@@ -96,6 +337,13 @@ public partial class GZipArchive
{
var stream = Volumes.Single().Stream;
stream.Position = 0;
return GZipReader.OpenReader(stream);
return GZipReader.Open(stream);
}
protected override ValueTask<IAsyncReader> CreateReaderForSolidExtractionAsync()
{
var stream = Volumes.Single().Stream;
stream.Position = 0;
return new(GZipReader.Open(stream));
}
}

View File

@@ -1,16 +1,15 @@
using System.IO;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common.GZip;
using SharpCompress.Common.Options;
namespace SharpCompress.Archives.GZip;
public class GZipArchiveEntry : GZipEntry, IArchiveEntry
{
internal GZipArchiveEntry(GZipArchive archive, GZipFilePart? part, IReaderOptions readerOptions)
: base(part, readerOptions) => Archive = archive;
internal GZipArchiveEntry(GZipArchive archive, GZipFilePart? part)
: base(part) => Archive = archive;
public virtual Stream OpenEntryStream()
{
@@ -24,10 +23,12 @@ public class GZipArchiveEntry : GZipEntry, IArchiveEntry
return Parts.Single().GetCompressedStream().NotNull();
}
public ValueTask<Stream> OpenEntryStreamAsync(CancellationToken cancellationToken = default)
public async ValueTask<Stream> OpenEntryStreamAsync(
CancellationToken cancellationToken = default
)
{
// GZip synchronous implementation is fast enough, just wrap it
return new(OpenEntryStream());
return OpenEntryStream();
}
#region IArchiveEntry Members

View File

@@ -19,7 +19,7 @@ internal sealed class GZipWritableArchiveEntry : GZipArchiveEntry, IWritableArch
DateTime? lastModified,
bool closeStream
)
: base(archive, null, archive.ReaderOptions)
: base(archive, null)
{
this.stream = stream;
Key = path;
@@ -58,7 +58,7 @@ internal sealed class GZipWritableArchiveEntry : GZipArchiveEntry, IWritableArch
{
//ensure new stream is at the start, this could be reset
stream.Seek(0, SeekOrigin.Begin);
return SharpCompressStream.CreateNonDisposing(stream);
return SharpCompressStream.Create(stream, leaveOpen: true);
}
internal override void Close()

View File

@@ -12,11 +12,6 @@ public interface IArchive : IDisposable
ArchiveType Type { get; }
/// <summary>
/// The options used when opening this archive, including extraction behavior settings.
/// </summary>
ReaderOptions ReaderOptions { get; }
/// <summary>
/// Use this method to extract all entries in an archive in order.
/// This is primarily for SOLID Rar Archives or 7Zip Archives as they need to be
@@ -43,10 +38,5 @@ public interface IArchive : IDisposable
/// <summary>
/// The total size of the files as uncompressed in the archive.
/// </summary>
long TotalUncompressedSize { get; }
/// <summary>
/// Returns whether the archive is encrypted.
/// </summary>
bool IsEncrypted { get; }
long TotalUncompressSize { get; }
}

View File

@@ -9,6 +9,8 @@ namespace SharpCompress.Archives;
public static class IArchiveEntryExtensions
{
private const int BufferSize = 81920;
/// <param name="archiveEntry">The archive entry to extract.</param>
extension(IArchiveEntry archiveEntry)
{
@@ -26,7 +28,7 @@ public static class IArchiveEntryExtensions
using var entryStream = archiveEntry.OpenEntryStream();
var sourceStream = WrapWithProgress(entryStream, archiveEntry, progress);
sourceStream.CopyTo(streamToWriteTo, Constants.BufferSize);
sourceStream.CopyTo(streamToWriteTo, BufferSize);
}
/// <summary>
@@ -46,16 +48,10 @@ public static class IArchiveEntryExtensions
throw new ExtractionException("Entry is a file directory and cannot be extracted.");
}
#if LEGACY_DOTNET
using var entryStream = await archiveEntry.OpenEntryStreamAsync(cancellationToken);
#else
await using var entryStream = await archiveEntry.OpenEntryStreamAsync(
cancellationToken
);
#endif
var sourceStream = WrapWithProgress(entryStream, archiveEntry, progress);
await sourceStream
.CopyToAsync(streamToWriteTo, Constants.BufferSize, cancellationToken)
.CopyToAsync(streamToWriteTo, BufferSize, cancellationToken)
.ConfigureAwait(false);
}
}
@@ -100,11 +96,15 @@ public static class IArchiveEntryExtensions
/// <summary>
/// Extract to specific directory, retaining filename
/// </summary>
public void WriteToDirectory(string destinationDirectory) =>
public void WriteToDirectory(
string destinationDirectory,
ExtractionOptions? options = null
) =>
ExtractionMethods.WriteEntryToDirectory(
entry,
destinationDirectory,
(path) => entry.WriteToFile(path)
options,
entry.WriteToFile
);
/// <summary>
@@ -112,14 +112,15 @@ public static class IArchiveEntryExtensions
/// </summary>
public async ValueTask WriteToDirectoryAsync(
string destinationDirectory,
ExtractionOptions? options = null,
CancellationToken cancellationToken = default
) =>
await ExtractionMethods
.WriteEntryToDirectoryAsync(
entry,
destinationDirectory,
async (path, ct) =>
await entry.WriteToFileAsync(path, ct).ConfigureAwait(false),
options,
entry.WriteToFileAsync,
cancellationToken
)
.ConfigureAwait(false);
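// Per-entry sketch (assumed "out" directory and token): the new options
// parameter flows through to the shared extraction helpers.
//
//   await entry.WriteToDirectoryAsync(
//       "out",
//       new ExtractionOptions { ExtractFullPath = true, Overwrite = true },
//       cancellationToken
//   );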
@@ -127,10 +128,11 @@ public static class IArchiveEntryExtensions
/// <summary>
/// Extract to specific file
/// </summary>
public void WriteToFile(string destinationFileName) =>
public void WriteToFile(string destinationFileName, ExtractionOptions? options = null) =>
ExtractionMethods.WriteEntryToFile(
entry,
destinationFileName,
options,
(x, fm) =>
{
using var fs = File.Open(destinationFileName, fm);
@@ -143,12 +145,14 @@ public static class IArchiveEntryExtensions
/// </summary>
public async ValueTask WriteToFileAsync(
string destinationFileName,
ExtractionOptions? options = null,
CancellationToken cancellationToken = default
) =>
await ExtractionMethods
.WriteEntryToFileAsync(
entry,
destinationFileName,
options,
async (x, fm, ct) =>
{
using var fs = File.Open(destinationFileName, fm);

View File

@@ -8,38 +8,48 @@ namespace SharpCompress.Archives;
public static class IArchiveExtensions
{
/// <param name="archive">The archive to extract.</param>
extension(IArchive archive)
{
/// <summary>
/// Extract to specific directory with progress reporting
/// </summary>
/// <param name="destinationDirectory">The folder to extract into.</param>
/// <param name="options">Extraction options.</param>
/// <param name="progress">Optional progress reporter for tracking extraction progress.</param>
public void WriteToDirectory(
string destinationDirectory,
ExtractionOptions? options = null,
IProgress<ProgressReport>? progress = null
)
{
// For solid archives (Rar, 7Zip), use the optimized reader-based approach
if (archive.IsSolid || archive.Type == ArchiveType.SevenZip)
{
using var reader = archive.ExtractAllEntries();
reader.WriteAllToDirectory(destinationDirectory);
reader.WriteAllToDirectory(destinationDirectory, options);
}
else
{
archive.WriteToDirectoryInternal(destinationDirectory, progress);
// For non-solid archives, extract entries directly
archive.WriteToDirectoryInternal(destinationDirectory, options, progress);
}
}
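// Progress sketch (ProgressReport's members are not shown in this hunk, so only
// the callback shape is illustrated):
//
//   archive.WriteToDirectory(
//       "out",
//       options: null,
//       progress: new Progress<ProgressReport>(report => { /* update UI */ })
//   );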
private void WriteToDirectoryInternal(
string destinationDirectory,
ExtractionOptions? options,
IProgress<ProgressReport>? progress
)
{
var totalBytes = archive.TotalUncompressedSize;
// Prepare for progress reporting
var totalBytes = archive.TotalUncompressSize;
var bytesRead = 0L;
// Tracking for created directories.
var seenDirectories = new HashSet<string>();
// Extract
foreach (var entry in archive.Entries)
{
if (entry.IsDirectory)
@@ -58,8 +68,10 @@ public static class IArchiveExtensions
continue;
}
entry.WriteToDirectory(destinationDirectory);
// Use the entry's WriteToDirectory method which respects ExtractionOptions
entry.WriteToDirectory(destinationDirectory, options);
// Update progress
bytesRead += entry.Size;
progress?.Report(
new ProgressReport(entry.Key ?? string.Empty, bytesRead, totalBytes)

View File

@@ -1,5 +1,6 @@
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Factories;
using SharpCompress.Readers;
@@ -25,21 +26,26 @@ public interface IArchiveFactory : IFactory
/// </summary>
/// <param name="stream">An open, readable and seekable stream.</param>
/// <param name="readerOptions">reading options.</param>
IArchive OpenArchive(Stream stream, ReaderOptions? readerOptions = null);
IArchive Open(Stream stream, ReaderOptions? readerOptions = null);
/// <summary>
/// Opens an Archive for random access asynchronously.
/// </summary>
/// <param name="stream">An open, readable and seekable stream.</param>
/// <param name="readerOptions">reading options.</param>
IAsyncArchive OpenAsyncArchive(Stream stream, ReaderOptions? readerOptions = null);
/// <param name="cancellationToken">Cancellation token.</param>
ValueTask<IAsyncArchive> OpenAsync(
Stream stream,
ReaderOptions? readerOptions = null,
CancellationToken cancellationToken = default
);
/// <summary>
/// Constructor with a FileInfo object to an existing file.
/// </summary>
/// <param name="fileInfo">the file to open.</param>
/// <param name="readerOptions">reading options.</param>
IArchive OpenArchive(FileInfo fileInfo, ReaderOptions? readerOptions = null);
IArchive Open(FileInfo fileInfo, ReaderOptions? readerOptions = null);
/// <summary>
/// Opens an Archive from a FileInfo object asynchronously.
@@ -47,5 +53,9 @@ public interface IArchiveFactory : IFactory
/// <param name="fileInfo">the file to open.</param>
/// <param name="readerOptions">reading options.</param>
/// <param name="cancellationToken">Cancellation token.</param>
IAsyncArchive OpenAsyncArchive(FileInfo fileInfo, ReaderOptions? readerOptions = null);
ValueTask<IAsyncArchive> OpenAsync(
FileInfo fileInfo,
ReaderOptions? readerOptions = null,
CancellationToken cancellationToken = default
);
}
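// Consumer sketch (assumes a concrete factory instance resolved elsewhere;
// GetFactory is a hypothetical helper, not part of this interface):
//
//   IArchiveFactory factory = GetFactory(".zip");
//   await using var archive = await factory.OpenAsync(stream, new ReaderOptions(), token);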

View File

@@ -1,37 +0,0 @@
#if NET8_0_OR_GREATER
using System.IO;
using System.Threading;
using SharpCompress.Readers;
namespace SharpCompress.Archives;
public interface IArchiveOpenable<TSync, TASync>
where TSync : IArchive
where TASync : IAsyncArchive
{
public static abstract TSync OpenArchive(string filePath, ReaderOptions? readerOptions = null);
public static abstract TSync OpenArchive(
FileInfo fileInfo,
ReaderOptions? readerOptions = null
);
public static abstract TSync OpenArchive(Stream stream, ReaderOptions? readerOptions = null);
public static abstract TASync OpenAsyncArchive(
string path,
ReaderOptions? readerOptions = null
);
public static abstract TASync OpenAsyncArchive(
Stream stream,
ReaderOptions? readerOptions = null
);
public static abstract TASync OpenAsyncArchive(
FileInfo fileInfo,
ReaderOptions? readerOptions = null
);
}
#endif

View File

@@ -39,10 +39,5 @@ public interface IAsyncArchive : IAsyncDisposable
/// <summary>
/// The total size of the files as uncompressed in the archive.
/// </summary>
ValueTask<long> TotalUncompressedSizeAsync();
/// <summary>
/// Returns whether the archive is encrypted.
/// </summary>
ValueTask<bool> IsEncryptedAsync();
ValueTask<long> TotalUncompressSizeAsync();
}

View File

@@ -10,77 +10,84 @@ namespace SharpCompress.Archives;
public static class IAsyncArchiveExtensions
{
extension(IAsyncArchive archive)
/// <summary>
/// Extract to specific directory asynchronously with progress reporting and cancellation support
/// </summary>
/// <param name="archive">The archive to extract.</param>
/// <param name="destinationDirectory">The folder to extract into.</param>
/// <param name="options">Extraction options.</param>
/// <param name="progress">Optional progress reporter for tracking extraction progress.</param>
/// <param name="cancellationToken">Optional cancellation token.</param>
public static async Task WriteToDirectoryAsync(
this IAsyncArchive archive,
string destinationDirectory,
ExtractionOptions? options = null,
IProgress<ProgressReport>? progress = null,
CancellationToken cancellationToken = default
)
{
/// <summary>
/// Extract to specific directory asynchronously with progress reporting and cancellation support
/// </summary>
/// <param name="archive">The archive to extract.</param>
/// <param name="destinationDirectory">The folder to extract into.</param>
/// <param name="progress">Optional progress reporter for tracking extraction progress.</param>
/// <param name="cancellationToken">Optional cancellation token.</param>
public async ValueTask WriteToDirectoryAsync(
string destinationDirectory,
IProgress<ProgressReport>? progress = null,
CancellationToken cancellationToken = default
)
// For solid archives (Rar, 7Zip), use the optimized reader-based approach
if (await archive.IsSolidAsync() || archive.Type == ArchiveType.SevenZip)
{
if (await archive.IsSolidAsync() || archive.Type == ArchiveType.SevenZip)
{
await using var reader = await archive.ExtractAllEntriesAsync();
await reader
.WriteAllToDirectoryAsync(destinationDirectory, cancellationToken)
.ConfigureAwait(false);
}
else
{
await archive.WriteToDirectoryAsyncInternal(
destinationDirectory,
progress,
cancellationToken
);
}
await using var reader = await archive.ExtractAllEntriesAsync();
await reader.WriteAllToDirectoryAsync(destinationDirectory, options, cancellationToken);
}
private async ValueTask WriteToDirectoryAsyncInternal(
string destinationDirectory,
IProgress<ProgressReport>? progress,
CancellationToken cancellationToken
)
else
{
var totalBytes = await archive.TotalUncompressedSizeAsync();
var bytesRead = 0L;
var seenDirectories = new HashSet<string>();
// For non-solid archives, extract entries directly
await archive.WriteToDirectoryAsyncInternal(
destinationDirectory,
options,
progress,
cancellationToken
);
}
}
await foreach (var entry in archive.EntriesAsync.WithCancellation(cancellationToken))
private static async Task WriteToDirectoryAsyncInternal(
this IAsyncArchive archive,
string destinationDirectory,
ExtractionOptions? options,
IProgress<ProgressReport>? progress,
CancellationToken cancellationToken
)
{
// Prepare for progress reporting
var totalBytes = await archive.TotalUncompressSizeAsync();
var bytesRead = 0L;
// Tracking for created directories.
var seenDirectories = new HashSet<string>();
// Extract
await foreach (var entry in archive.EntriesAsync.WithCancellation(cancellationToken))
{
cancellationToken.ThrowIfCancellationRequested();
if (entry.IsDirectory)
{
cancellationToken.ThrowIfCancellationRequested();
if (entry.IsDirectory)
{
var dirPath = Path.Combine(
destinationDirectory,
entry.Key.NotNull("Entry Key is null")
);
if (
Path.GetDirectoryName(dirPath + "/") is { } parentDirectory
&& seenDirectories.Add(dirPath)
)
{
Directory.CreateDirectory(parentDirectory);
}
continue;
}
await entry
.WriteToDirectoryAsync(destinationDirectory, cancellationToken)
.ConfigureAwait(false);
bytesRead += entry.Size;
progress?.Report(
new ProgressReport(entry.Key ?? string.Empty, bytesRead, totalBytes)
var dirPath = Path.Combine(
destinationDirectory,
entry.Key.NotNull("Entry Key is null")
);
if (
Path.GetDirectoryName(dirPath + "/") is { } parentDirectory
&& seenDirectories.Add(dirPath)
)
{
Directory.CreateDirectory(parentDirectory);
}
continue;
}
// Use the entry's WriteToDirectoryAsync method which respects ExtractionOptions
await entry
.WriteToDirectoryAsync(destinationDirectory, options, cancellationToken)
.ConfigureAwait(false);
// Update progress
bytesRead += entry.Size;
progress?.Report(new ProgressReport(entry.Key ?? string.Empty, bytesRead, totalBytes));
}
}
}
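// End-to-end sketch (assumed paths): solid archives route through the async
// reader path above; everything else extracts entry by entry with progress and
// cancellation.
//
//   await using var archive = await RarArchive.OpenAsync(new FileInfo("data.rar"));
//   await archive.WriteToDirectoryAsync("out", cancellationToken: token);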

View File

@@ -1,6 +1,7 @@
using System.Collections.Generic;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Factories;
using SharpCompress.Readers;
@@ -26,16 +27,18 @@ public interface IMultiArchiveFactory : IFactory
/// </summary>
/// <param name="streams"></param>
/// <param name="readerOptions">reading options.</param>
IArchive OpenArchive(IReadOnlyList<Stream> streams, ReaderOptions? readerOptions = null);
IArchive Open(IReadOnlyList<Stream> streams, ReaderOptions? readerOptions = null);
/// <summary>
/// Opens a multi-part archive from streams asynchronously.
/// </summary>
/// <param name="streams"></param>
/// <param name="readerOptions">reading options.</param>
IAsyncArchive OpenAsyncArchive(
/// <param name="cancellationToken">Cancellation token.</param>
ValueTask<IAsyncArchive> OpenAsync(
IReadOnlyList<Stream> streams,
ReaderOptions? readerOptions = null
ReaderOptions? readerOptions = null,
CancellationToken cancellationToken = default
);
/// <summary>
@@ -43,15 +46,17 @@ public interface IMultiArchiveFactory : IFactory
/// </summary>
/// <param name="fileInfos"></param>
/// <param name="readerOptions">reading options.</param>
IArchive OpenArchive(IReadOnlyList<FileInfo> fileInfos, ReaderOptions? readerOptions = null);
IArchive Open(IReadOnlyList<FileInfo> fileInfos, ReaderOptions? readerOptions = null);
/// <summary>
/// Opens a multi-part archive from files asynchronously.
/// </summary>
/// <param name="fileInfos"></param>
/// <param name="readerOptions">reading options.</param>
IAsyncArchive OpenAsyncArchive(
/// <param name="cancellationToken">Cancellation token.</param>
ValueTask<IAsyncArchive> OpenAsync(
IReadOnlyList<FileInfo> fileInfos,
ReaderOptions? readerOptions = null
ReaderOptions? readerOptions = null,
CancellationToken cancellationToken = default
);
}

View File

@@ -1,33 +0,0 @@
#if NET8_0_OR_GREATER
using System.Collections.Generic;
using System.IO;
using System.Threading;
using SharpCompress.Readers;
namespace SharpCompress.Archives;
public interface IMultiArchiveOpenable<TSync, TASync>
where TSync : IArchive
where TASync : IAsyncArchive
{
public static abstract TSync OpenArchive(
IEnumerable<FileInfo> fileInfos,
ReaderOptions? readerOptions = null
);
public static abstract TSync OpenArchive(
IEnumerable<Stream> streams,
ReaderOptions? readerOptions = null
);
public static abstract TASync OpenAsyncArchive(
IReadOnlyList<Stream> streams,
ReaderOptions? readerOptions = null
);
public static abstract TASync OpenAsyncArchive(
IReadOnlyList<FileInfo> fileInfos,
ReaderOptions? readerOptions = null
);
}
#endif

View File

@@ -2,22 +2,14 @@ using System;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common.Options;
using SharpCompress.Writers;
namespace SharpCompress.Archives;
public interface IWritableArchiveCommon
public interface IWritableArchive : IArchive
{
/// <summary>
/// Use this to pause entry rebuilding when adding large collections of entries. Dispose when complete. A using statement is recommended.
/// </summary>
/// <returns>IDisposable to resume entry rebuilding</returns>
IDisposable PauseEntryRebuilding();
}
void RemoveEntry(IArchiveEntry entry);
public interface IWritableArchive : IArchive, IWritableArchiveCommon
{
IArchiveEntry AddEntry(
string key,
Stream source,
@@ -28,59 +20,17 @@ public interface IWritableArchive : IArchive, IWritableArchiveCommon
IArchiveEntry AddDirectoryEntry(string key, DateTime? modified = null);
/// <summary>
/// Removes the specified entry from the archive.
/// </summary>
void RemoveEntry(IArchiveEntry entry);
}
void SaveTo(Stream stream, WriterOptions options);
public interface IWritableArchive<TOptions> : IWritableArchive
where TOptions : IWriterOptions
{
/// <summary>
/// Saves the archive to the specified stream using the given writer options.
/// </summary>
void SaveTo(Stream stream, TOptions options);
}
public interface IWritableAsyncArchive : IAsyncArchive, IWritableArchiveCommon
{
/// <summary>
/// Asynchronously adds an entry to the archive with the specified key, source stream, and options.
/// </summary>
ValueTask<IArchiveEntry> AddEntryAsync(
string key,
Stream source,
bool closeStream,
long size = 0,
DateTime? modified = null,
CancellationToken cancellationToken = default
);
/// <summary>
/// Asynchronously adds a directory entry to the archive with the specified key and modification time.
/// </summary>
ValueTask<IArchiveEntry> AddDirectoryEntryAsync(
string key,
DateTime? modified = null,
CancellationToken cancellationToken = default
);
/// <summary>
/// Removes the specified entry from the archive.
/// </summary>
ValueTask RemoveEntryAsync(IArchiveEntry entry);
}
public interface IWritableAsyncArchive<TOptions> : IWritableAsyncArchive
where TOptions : IWriterOptions
{
/// <summary>
/// Asynchronously saves the archive to the specified stream using the given writer options.
/// </summary>
ValueTask SaveToAsync(
Stream stream,
TOptions options,
WriterOptions options,
CancellationToken cancellationToken = default
);
/// <summary>
/// Use this to pause entry rebuilding when adding large collections of entries. Dispose when complete. A using statement is recommended.
/// </summary>
/// <returns>IDisposeable to resume entry rebuilding</returns>
IDisposable PauseEntryRebuilding();
}

View File

@@ -1,80 +1,106 @@
using System;
using System;
using System.IO;
using SharpCompress.Common;
using SharpCompress.Common.Options;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Writers;
namespace SharpCompress.Archives;
public static class IWritableArchiveExtensions
{
extension(IWritableArchive writableArchive)
public static void AddEntry(
this IWritableArchive writableArchive,
string entryPath,
string filePath
)
{
public void AddAllFromDirectory(
string filePath,
string searchPattern = "*.*",
SearchOption searchOption = SearchOption.AllDirectories
)
var fileInfo = new FileInfo(filePath);
if (!fileInfo.Exists)
{
using (writableArchive.PauseEntryRebuilding())
{
foreach (
var path in Directory.EnumerateFiles(filePath, searchPattern, searchOption)
)
{
var fileInfo = new FileInfo(path);
writableArchive.AddEntry(
path.Substring(filePath.Length),
fileInfo.OpenRead(),
true,
fileInfo.Length,
fileInfo.LastWriteTime
);
}
}
}
public IArchiveEntry AddEntry(string key, string file) =>
writableArchive.AddEntry(key, new FileInfo(file));
public IArchiveEntry AddEntry(
string key,
Stream source,
long size = 0,
DateTime? modified = null
) => writableArchive.AddEntry(key, source, false, size, modified);
public IArchiveEntry AddEntry(string key, FileInfo fileInfo)
{
if (!fileInfo.Exists)
{
throw new ArgumentException("FileInfo does not exist.");
}
return writableArchive.AddEntry(
key,
fileInfo.OpenRead(),
true,
fileInfo.Length,
fileInfo.LastWriteTime
);
throw new FileNotFoundException("Could not AddEntry: " + filePath);
}
writableArchive.AddEntry(
entryPath,
new FileInfo(filePath).OpenRead(),
true,
fileInfo.Length,
fileInfo.LastWriteTime
);
}
public static void SaveTo<TOptions>(
this IWritableArchive<TOptions> writableArchive,
public static void SaveTo(
this IWritableArchive writableArchive,
string filePath,
TOptions options
)
where TOptions : IWriterOptions => writableArchive.SaveTo(new FileInfo(filePath), options);
WriterOptions options
) => writableArchive.SaveTo(new FileInfo(filePath), options);
public static void SaveTo<TOptions>(
this IWritableArchive<TOptions> writableArchive,
public static void SaveTo(
this IWritableArchive writableArchive,
FileInfo fileInfo,
TOptions options
WriterOptions options
)
where TOptions : IWriterOptions
{
using var stream = fileInfo.Open(FileMode.Create, FileAccess.Write);
writableArchive.SaveTo(stream, options);
}
public static ValueTask SaveToAsync(
this IWritableArchive writableArchive,
string filePath,
WriterOptions options,
CancellationToken cancellationToken = default
) => writableArchive.SaveToAsync(new FileInfo(filePath), options, cancellationToken);
public static async ValueTask SaveToAsync(
this IWritableArchive writableArchive,
FileInfo fileInfo,
WriterOptions options,
CancellationToken cancellationToken = default
)
{
using var stream = fileInfo.Open(FileMode.Create, FileAccess.Write);
await writableArchive.SaveToAsync(stream, options, cancellationToken).ConfigureAwait(false);
}
public static void AddAllFromDirectory(
this IWritableArchive writableArchive,
string filePath,
string searchPattern = "*.*",
SearchOption searchOption = SearchOption.AllDirectories
)
{
using (writableArchive.PauseEntryRebuilding())
{
foreach (var path in Directory.EnumerateFiles(filePath, searchPattern, searchOption))
{
var fileInfo = new FileInfo(path);
writableArchive.AddEntry(
path.Substring(filePath.Length),
fileInfo.OpenRead(),
true,
fileInfo.Length,
fileInfo.LastWriteTime
);
}
}
}
public static IArchiveEntry AddEntry(
this IWritableArchive writableArchive,
string key,
FileInfo fileInfo
)
{
if (!fileInfo.Exists)
{
throw new ArgumentException("FileInfo does not exist.");
}
return writableArchive.AddEntry(
key,
fileInfo.OpenRead(),
true,
fileInfo.Length,
fileInfo.LastWriteTime
);
}
}
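// Bulk-add sketch (assumed paths; the archive type must allow multiple entries,
// unlike GZip): AddAllFromDirectory already pauses entry rebuilding internally,
// so a plain call stays cheap for large trees.
//
//   using var archive = factory.CreateWriteableArchive(); // factory: an IWriteableArchiveFactory
//   archive.AddAllFromDirectory(@"C:\input");
//   await archive.SaveToAsync("out.bin", new WriterOptions(CompressionType.Deflate), token);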

View File

@@ -1,13 +0,0 @@
using SharpCompress.Common.Options;
#if NET8_0_OR_GREATER
namespace SharpCompress.Archives;
public interface IWritableArchiveOpenable<TOptions>
: IArchiveOpenable<IWritableArchive<TOptions>, IWritableAsyncArchive<TOptions>>
where TOptions : IWriterOptions
{
public static abstract IWritableArchive<TOptions> CreateArchive();
public static abstract IWritableAsyncArchive<TOptions> CreateAsyncArchive();
}
#endif

View File

@@ -1,85 +0,0 @@
using System;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Common.Options;
using SharpCompress.Writers;
namespace SharpCompress.Archives;
public static class IWritableAsyncArchiveExtensions
{
extension(IWritableAsyncArchive writableArchive)
{
public async ValueTask AddAllFromDirectoryAsync(
string filePath,
string searchPattern = "*.*",
SearchOption searchOption = SearchOption.AllDirectories
)
{
using (writableArchive.PauseEntryRebuilding())
{
foreach (
var path in Directory.EnumerateFiles(filePath, searchPattern, searchOption)
)
{
var fileInfo = new FileInfo(path);
await writableArchive.AddEntryAsync(
path.Substring(filePath.Length),
fileInfo.OpenRead(),
true,
fileInfo.Length,
fileInfo.LastWriteTime
);
}
}
}
public ValueTask<IArchiveEntry> AddEntryAsync(string key, string file) =>
writableArchive.AddEntryAsync(key, new FileInfo(file));
public ValueTask<IArchiveEntry> AddEntryAsync(
string key,
Stream source,
long size = 0,
DateTime? modified = null
) => writableArchive.AddEntryAsync(key, source, false, size, modified);
public ValueTask<IArchiveEntry> AddEntryAsync(string key, FileInfo fileInfo)
{
if (!fileInfo.Exists)
{
throw new ArgumentException("FileInfo does not exist.");
}
return writableArchive.AddEntryAsync(
key,
fileInfo.OpenRead(),
true,
fileInfo.Length,
fileInfo.LastWriteTime
);
}
}
public static ValueTask SaveToAsync<TOptions>(
this IWritableAsyncArchive<TOptions> writableArchive,
string filePath,
TOptions options,
CancellationToken cancellationToken = default
)
where TOptions : IWriterOptions =>
writableArchive.SaveToAsync(new FileInfo(filePath), options, cancellationToken);
public static async ValueTask SaveToAsync<TOptions>(
this IWritableAsyncArchive<TOptions> writableArchive,
FileInfo fileInfo,
TOptions options,
CancellationToken cancellationToken = default
)
where TOptions : IWriterOptions
{
using var stream = fileInfo.Open(FileMode.Create, FileAccess.Write);
await writableArchive.SaveToAsync(stream, options, cancellationToken).ConfigureAwait(false);
}
}

View File

@@ -1,5 +1,3 @@
using SharpCompress.Common.Options;
namespace SharpCompress.Archives;
/// <summary>
@@ -12,12 +10,11 @@ namespace SharpCompress.Archives;
/// <item><see cref="Factories.ZipFactory"/></item>
/// <item><see cref="Factories.GZipFactory"/></item>
/// </list>
public interface IWriteableArchiveFactory<TOptions> : Factories.IFactory
where TOptions : IWriterOptions
public interface IWriteableArchiveFactory : Factories.IFactory
{
/// <summary>
/// Creates a new, empty archive, ready to be written.
/// </summary>
/// <returns></returns>
IWritableArchive<TOptions> CreateArchive();
IWritableArchive CreateWriteableArchive();
}

View File

@@ -24,7 +24,8 @@ internal class FileInfoRarArchiveVolume : RarVolume
private static ReaderOptions FixOptions(ReaderOptions options)
{
//make sure we're closing streams with fileinfo
return options with { LeaveStreamOpen = false };
options.LeaveStreamOpen = false;
return options;
}
internal ReadOnlyCollection<RarFilePart> FileParts { get; }
@@ -35,7 +36,4 @@ internal class FileInfoRarArchiveVolume : RarVolume
new FileInfoRarFilePart(this, ReaderOptions.Password, markHeader, fileHeader, FileInfo);
internal override IEnumerable<RarFilePart> ReadFileParts() => FileParts;
internal override IAsyncEnumerable<RarFilePart> ReadFilePartsAsync() =>
FileParts.ToAsyncEnumerable();
}

View File

@@ -1,53 +0,0 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Archives.Rar;
using SharpCompress.Common;
using SharpCompress.Common.Rar;
using SharpCompress.IO;
using SharpCompress.Readers;
using SharpCompress.Readers.Rar;
namespace SharpCompress.Archives.Rar;
public partial class RarArchive
{
public override async ValueTask DisposeAsync()
{
if (!_disposed)
{
if (UnpackV1.IsValueCreated && UnpackV1.Value is IDisposable unpackV1)
{
unpackV1.Dispose();
}
_disposed = true;
await base.DisposeAsync();
}
}
protected override async ValueTask<IAsyncReader> CreateReaderForSolidExtractionAsync()
{
if (await this.IsMultipartVolumeAsync())
{
var streams = await VolumesAsync
.Select(volume =>
{
volume.Stream.Position = 0;
return volume.Stream;
})
.ToListAsync();
return (RarReader)RarReader.OpenReader(streams, ReaderOptions);
}
var stream = (await VolumesAsync.FirstAsync()).Stream;
stream.Position = 0;
return (RarReader)RarReader.OpenReader(stream, ReaderOptions);
}
public override async ValueTask<bool> IsSolidAsync() =>
await (await VolumesAsync.CastAsync<RarVolume>().FirstAsync()).IsSolidArchiveAsync();
}

View File

@@ -1,36 +1,18 @@
using System.Linq;
using System.Threading.Tasks;
using SharpCompress.Common.Rar;
using System.Linq;
namespace SharpCompress.Archives.Rar;
public static class RarArchiveExtensions
{
extension(IRarArchive archive)
{
/// <summary>
/// RarArchive is the first volume of a multi-part archive. If MultipartVolume is true and IsFirstVolume is false, then the first volume file must be missing.
/// </summary>
public bool IsFirstVolume() => archive.Volumes.Cast<RarVolume>().First().IsFirstVolume;
/// <summary>
/// RarArchive is the first volume of a multi-part archive. If MultipartVolume is true and IsFirstVolume is false, then the first volume file must be missing.
/// </summary>
public static bool IsFirstVolume(this RarArchive archive) =>
archive.Volumes.First().IsFirstVolume;
/// <summary>
/// RarArchive is part of a multi-part archive.
/// </summary>
public bool IsMultipartVolume() => archive.Volumes.Cast<RarVolume>().First().IsMultiVolume;
}
extension(IRarAsyncArchive archive)
{
/// <summary>
/// RarArchive is the first volume of a multi-part archive. If MultipartVolume is true and IsFirstVolume is false, then the first volume file must be missing.
/// </summary>
public async ValueTask<bool> IsFirstVolumeAsync() =>
(await archive.VolumesAsync.CastAsync<RarVolume>().FirstAsync()).IsFirstVolume;
/// <summary>
/// RarArchive is part of a multi-part archive.
/// </summary>
public async ValueTask<bool> IsMultipartVolumeAsync() =>
(await archive.VolumesAsync.CastAsync<RarVolume>().FirstAsync()).IsMultiVolume;
}
/// <summary>
/// RarArchive is part of a multi-part archive.
/// </summary>
public static bool IsMultipartVolume(this RarArchive archive) =>
archive.Volumes.First().IsMultiVolume;
}
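// Volume-check sketch (assumed part files): both helpers inspect only the first
// volume supplied, so a missing first part shows up as
// IsMultipartVolume() && !IsFirstVolume().
//
//   var archive = RarArchive.Open(new[] { new FileInfo("a.part2.rar"), new FileInfo("a.part3.rar") });
//   var firstVolumeMissing = archive.IsMultipartVolume() && !archive.IsFirstVolume();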

View File

@@ -1,177 +0,0 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Common.Rar;
using SharpCompress.Common.Rar.Headers;
using SharpCompress.Compressors.Rar;
using SharpCompress.IO;
using SharpCompress.Readers;
using SharpCompress.Readers.Rar;
namespace SharpCompress.Archives.Rar;
public partial class RarArchive
#if NET8_0_OR_GREATER
: IArchiveOpenable<IRarArchive, IRarAsyncArchive>,
IMultiArchiveOpenable<IRarArchive, IRarAsyncArchive>
#endif
{
public static IRarAsyncArchive OpenAsyncArchive(
string path,
ReaderOptions? readerOptions = null
)
{
path.NotNullOrEmpty(nameof(path));
return (IRarAsyncArchive)OpenArchive(new FileInfo(path), readerOptions);
}
public static IRarArchive OpenArchive(string filePath, ReaderOptions? options = null)
{
filePath.NotNullOrEmpty(nameof(filePath));
var fileInfo = new FileInfo(filePath);
return new RarArchive(
new SourceStream(
fileInfo,
i => RarArchiveVolumeFactory.GetFilePart(i, fileInfo),
options ?? new ReaderOptions()
)
);
}
public static IRarArchive OpenArchive(FileInfo fileInfo, ReaderOptions? options = null)
{
fileInfo.NotNull(nameof(fileInfo));
return new RarArchive(
new SourceStream(
fileInfo,
i => RarArchiveVolumeFactory.GetFilePart(i, fileInfo),
options ?? new ReaderOptions()
)
);
}
public static IRarArchive OpenArchive(Stream stream, ReaderOptions? options = null)
{
stream.NotNull(nameof(stream));
if (stream is not { CanSeek: true })
{
throw new ArgumentException("Stream must be seekable", nameof(stream));
}
return new RarArchive(new SourceStream(stream, _ => null, options ?? new ReaderOptions()));
}
public static IRarArchive OpenArchive(
IEnumerable<FileInfo> fileInfos,
ReaderOptions? readerOptions = null
)
{
fileInfos.NotNull(nameof(fileInfos));
var files = fileInfos.ToArray();
return new RarArchive(
new SourceStream(
files[0],
i => i < files.Length ? files[i] : null,
readerOptions ?? new ReaderOptions()
)
);
}
public static IRarArchive OpenArchive(
IEnumerable<Stream> streams,
ReaderOptions? readerOptions = null
)
{
streams.NotNull(nameof(streams));
var strms = streams.ToArray();
return new RarArchive(
new SourceStream(
strms[0],
i => i < strms.Length ? strms[i] : null,
readerOptions ?? new ReaderOptions()
)
);
}
public static IRarAsyncArchive OpenAsyncArchive(
Stream stream,
ReaderOptions? readerOptions = null
)
{
return (IRarAsyncArchive)OpenArchive(stream, readerOptions);
}
public static IRarAsyncArchive OpenAsyncArchive(
FileInfo fileInfo,
ReaderOptions? readerOptions = null
)
{
return (IRarAsyncArchive)OpenArchive(fileInfo, readerOptions);
}
public static IRarAsyncArchive OpenAsyncArchive(
IReadOnlyList<Stream> streams,
ReaderOptions? readerOptions = null
)
{
return (IRarAsyncArchive)OpenArchive(streams, readerOptions);
}
public static IRarAsyncArchive OpenAsyncArchive(
IReadOnlyList<FileInfo> fileInfos,
ReaderOptions? readerOptions = null
)
{
return (IRarAsyncArchive)OpenArchive(fileInfos, readerOptions);
}
public static bool IsRarFile(string filePath) => IsRarFile(new FileInfo(filePath));
public static bool IsRarFile(FileInfo fileInfo)
{
if (!fileInfo.Exists)
{
return false;
}
using Stream stream = fileInfo.OpenRead();
return IsRarFile(stream);
}
public static bool IsRarFile(Stream stream, ReaderOptions? options = null)
{
try
{
MarkHeader.Read(stream, true, false);
return true;
}
catch
{
return false;
}
}
public static async ValueTask<bool> IsRarFileAsync(
Stream stream,
ReaderOptions? options = null,
CancellationToken cancellationToken = default
)
{
cancellationToken.ThrowIfCancellationRequested();
try
{
await MarkHeader
.ReadAsync(stream, true, false, cancellationToken)
.ConfigureAwait(false);
return true;
}
catch
{
return false;
}
}
}

View File

@@ -14,26 +14,17 @@ using SharpCompress.Readers.Rar;
namespace SharpCompress.Archives.Rar;
public interface IRarArchiveCommon
{
int MinVersion { get; }
int MaxVersion { get; }
}
public interface IRarArchive : IArchive, IRarArchiveCommon { }
public interface IRarAsyncArchive : IAsyncArchive, IRarArchiveCommon { }
public partial class RarArchive
: AbstractArchive<RarArchiveEntry, RarVolume>,
IRarArchive,
IRarAsyncArchive
public class RarArchive : AbstractArchive<RarArchiveEntry, RarVolume>
{
private bool _disposed;
internal Lazy<IRarUnpack> UnpackV2017 { get; } =
new(() => new Compressors.Rar.UnpackV2017.Unpack());
internal Lazy<IRarUnpack> UnpackV1 { get; } = new(() => new Compressors.Rar.UnpackV1.Unpack());
/// <summary>
/// Constructor with a SourceStream able to handle FileInfo and Streams.
/// </summary>
/// <param name="sourceStream"></param>
private RarArchive(SourceStream sourceStream)
: base(ArchiveType.Rar, sourceStream) { }
@@ -54,17 +45,12 @@ public partial class RarArchive
protected override IEnumerable<RarArchiveEntry> LoadEntries(IEnumerable<RarVolume> volumes) =>
RarArchiveEntryFactory.GetEntries(this, volumes, ReaderOptions);
// Simple async property - kept in original file
protected override IAsyncEnumerable<RarArchiveEntry> LoadEntriesAsync(
IAsyncEnumerable<RarVolume> volumes
) => RarArchiveEntryFactory.GetEntriesAsync(this, volumes, ReaderOptions);
protected override IEnumerable<RarVolume> LoadVolumes(SourceStream sourceStream)
{
sourceStream.LoadAllParts();
sourceStream.LoadAllParts(); //request all streams
var streams = sourceStream.Streams.ToArray();
var i = 0;
if (streams.Length > 1 && IsRarFile(streams[1], ReaderOptions))
if (streams.Length > 1 && IsRarFile(streams[1], ReaderOptions)) //test part 2 - true = multipart not split
{
sourceStream.IsVolumes = true;
streams[1].Position = 0;
@@ -77,10 +63,17 @@ public partial class RarArchive
));
}
//split mode or single file
return new StreamRarArchiveVolume(sourceStream, ReaderOptions, i++).AsEnumerable();
}
protected override IReader CreateReaderForSolidExtraction()
protected override IReader CreateReaderForSolidExtraction() =>
CreateReaderForSolidExtractionInternal();
protected override ValueTask<IAsyncReader> CreateReaderForSolidExtractionAsync() =>
new(CreateReaderForSolidExtractionInternal());
private RarReader CreateReaderForSolidExtractionInternal()
{
if (this.IsMultipartVolume())
{
@@ -89,12 +82,12 @@ public partial class RarArchive
volume.Stream.Position = 0;
return volume.Stream;
});
return (RarReader)RarReader.OpenReader(streams, ReaderOptions);
return RarReader.Open(streams, ReaderOptions);
}
var stream = Volumes.First().Stream;
stream.Position = 0;
return (RarReader)RarReader.OpenReader(stream, ReaderOptions);
return RarReader.Open(stream, ReaderOptions);
}
public override bool IsSolid => Volumes.First().IsSolidArchive;
@@ -102,6 +95,188 @@ public partial class RarArchive
public override bool IsEncrypted => Entries.First(x => !x.IsDirectory).IsEncrypted;
public virtual int MinVersion => Volumes.First().MinVersion;
public virtual int MaxVersion => Volumes.First().MaxVersion;
#region Creation
/// <summary>
/// Constructor with a FileInfo object to an existing file.
/// </summary>
/// <param name="filePath"></param>
/// <param name="options"></param>
public static RarArchive Open(string filePath, ReaderOptions? options = null)
{
filePath.NotNullOrEmpty(nameof(filePath));
var fileInfo = new FileInfo(filePath);
return new RarArchive(
new SourceStream(
fileInfo,
i => RarArchiveVolumeFactory.GetFilePart(i, fileInfo),
options ?? new ReaderOptions()
)
);
}
/// <summary>
/// Constructor with a FileInfo object to an existing file.
/// </summary>
/// <param name="fileInfo"></param>
/// <param name="options"></param>
public static RarArchive Open(FileInfo fileInfo, ReaderOptions? options = null)
{
fileInfo.NotNull(nameof(fileInfo));
return new RarArchive(
new SourceStream(
fileInfo,
i => RarArchiveVolumeFactory.GetFilePart(i, fileInfo),
options ?? new ReaderOptions()
)
);
}
/// <summary>
/// Takes a seekable Stream as a source
/// </summary>
/// <param name="stream"></param>
/// <param name="options"></param>
public static RarArchive Open(Stream stream, ReaderOptions? options = null)
{
stream.NotNull(nameof(stream));
if (stream is not { CanSeek: true })
{
throw new ArgumentException("Stream must be seekable", nameof(stream));
}
return new RarArchive(new SourceStream(stream, _ => null, options ?? new ReaderOptions()));
}
/// <summary>
/// Constructor with all file parts passed in
/// </summary>
/// <param name="fileInfos"></param>
/// <param name="readerOptions"></param>
public static RarArchive Open(
IEnumerable<FileInfo> fileInfos,
ReaderOptions? readerOptions = null
)
{
fileInfos.NotNull(nameof(fileInfos));
var files = fileInfos.ToArray();
return new RarArchive(
new SourceStream(
files[0],
i => i < files.Length ? files[i] : null,
readerOptions ?? new ReaderOptions()
)
);
}
/// <summary>
/// Constructor with all stream parts passed in
/// </summary>
/// <param name="streams"></param>
/// <param name="readerOptions"></param>
public static RarArchive Open(IEnumerable<Stream> streams, ReaderOptions? readerOptions = null)
{
streams.NotNull(nameof(streams));
var strms = streams.ToArray();
return new RarArchive(
new SourceStream(
strms[0],
i => i < strms.Length ? strms[i] : null,
readerOptions ?? new ReaderOptions()
)
);
}
/// <summary>
/// Opens a RarArchive asynchronously from a stream.
/// </summary>
/// <param name="stream"></param>
/// <param name="readerOptions"></param>
/// <param name="cancellationToken"></param>
public static ValueTask<IAsyncArchive> OpenAsync(
Stream stream,
ReaderOptions? readerOptions = null,
CancellationToken cancellationToken = default
)
{
cancellationToken.ThrowIfCancellationRequested();
return new(Open(stream, readerOptions));
}
/// <summary>
/// Opens a RarArchive asynchronously from a FileInfo.
/// </summary>
/// <param name="fileInfo"></param>
/// <param name="readerOptions"></param>
/// <param name="cancellationToken"></param>
public static ValueTask<IAsyncArchive> OpenAsync(
FileInfo fileInfo,
ReaderOptions? readerOptions = null,
CancellationToken cancellationToken = default
)
{
cancellationToken.ThrowIfCancellationRequested();
return new(Open(fileInfo, readerOptions));
}
/// <summary>
/// Opens a RarArchive asynchronously from multiple streams.
/// </summary>
/// <param name="streams"></param>
/// <param name="readerOptions"></param>
/// <param name="cancellationToken"></param>
public static ValueTask<IAsyncArchive> OpenAsync(
IReadOnlyList<Stream> streams,
ReaderOptions? readerOptions = null,
CancellationToken cancellationToken = default
)
{
cancellationToken.ThrowIfCancellationRequested();
return new(Open(streams, readerOptions));
}
/// <summary>
/// Opens a RarArchive asynchronously from multiple FileInfo objects.
/// </summary>
/// <param name="fileInfos"></param>
/// <param name="readerOptions"></param>
/// <param name="cancellationToken"></param>
public static ValueTask<IAsyncArchive> OpenAsync(
IReadOnlyList<FileInfo> fileInfos,
ReaderOptions? readerOptions = null,
CancellationToken cancellationToken = default
)
{
cancellationToken.ThrowIfCancellationRequested();
return new(Open(fileInfos, readerOptions));
}
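// Multi-volume async sketch (assumed part files): arrays satisfy
// IReadOnlyList<FileInfo>, and the overload defers to the synchronous Open after
// checking the token.
//
//   var parts = new[] { new FileInfo("a.part1.rar"), new FileInfo("a.part2.rar") };
//   await using var archive = await RarArchive.OpenAsync(parts, cancellationToken: token);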
public static bool IsRarFile(string filePath) => IsRarFile(new FileInfo(filePath));
public static bool IsRarFile(FileInfo fileInfo)
{
if (!fileInfo.Exists)
{
return false;
}
using Stream stream = fileInfo.OpenRead();
return IsRarFile(stream);
}
public static bool IsRarFile(Stream stream, ReaderOptions? options = null)
{
try
{
MarkHeader.Read(stream, true, false);
return true;
}
catch
{
return false;
}
}
#endregion
}

View File

@@ -1,43 +0,0 @@
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Common.Rar;
using SharpCompress.Common.Rar.Headers;
using SharpCompress.Compressors.Rar;
using SharpCompress.Readers;
namespace SharpCompress.Archives.Rar;
public partial class RarArchiveEntry
{
public async ValueTask<Stream> OpenEntryStreamAsync(
CancellationToken cancellationToken = default
)
{
RarStream stream;
if (IsRarV3)
{
stream = new RarStream(
archive.UnpackV1.Value,
FileHeader,
await MultiVolumeReadOnlyAsyncStream.Create(
Parts.ToAsyncEnumerable().CastAsync<RarFilePart>()
)
);
}
else
{
stream = new RarStream(
archive.UnpackV2017.Value,
FileHeader,
await MultiVolumeReadOnlyAsyncStream.Create(
Parts.ToAsyncEnumerable().CastAsync<RarFilePart>()
)
);
}
await stream.InitializeAsync(cancellationToken);
return stream;
}
}

View File

@@ -12,7 +12,7 @@ using SharpCompress.Readers;
namespace SharpCompress.Archives.Rar;
public partial class RarArchiveEntry : RarEntry, IArchiveEntry
public class RarArchiveEntry : RarEntry, IArchiveEntry
{
private readonly ICollection<RarFilePart> parts;
private readonly RarArchive archive;
@@ -23,7 +23,6 @@ public partial class RarArchiveEntry : RarEntry, IArchiveEntry
IEnumerable<RarFilePart> parts,
ReaderOptions readerOptions
)
: base(readerOptions)
{
this.parts = parts.ToList();
this.archive = archive;
@@ -93,6 +92,32 @@ public partial class RarArchiveEntry : RarEntry, IArchiveEntry
return stream;
}
public async ValueTask<Stream> OpenEntryStreamAsync(
CancellationToken cancellationToken = default
)
{
RarStream stream;
if (IsRarV3)
{
stream = new RarStream(
archive.UnpackV1.Value,
FileHeader,
new MultiVolumeReadOnlyStream(Parts.Cast<RarFilePart>())
);
}
else
{
stream = new RarStream(
archive.UnpackV2017.Value,
FileHeader,
new MultiVolumeReadOnlyStream(Parts.Cast<RarFilePart>())
);
}
await stream.InitializeAsync(cancellationToken);
return stream;
}
public bool IsComplete
{
get

View File

@@ -17,19 +17,6 @@ internal static class RarArchiveEntryFactory
}
}
private static async IAsyncEnumerable<RarFilePart> GetFilePartsAsync(
IAsyncEnumerable<RarVolume> parts
)
{
await foreach (var rarPart in parts)
{
await foreach (var fp in rarPart.ReadFilePartsAsync())
{
yield return fp;
}
}
}
private static IEnumerable<IEnumerable<RarFilePart>> GetMatchedFileParts(
IEnumerable<RarVolume> parts
)
@@ -51,27 +38,6 @@ internal static class RarArchiveEntryFactory
}
}
private static async IAsyncEnumerable<IEnumerable<RarFilePart>> GetMatchedFilePartsAsync(
IAsyncEnumerable<RarVolume> parts
)
{
var groupedParts = new List<RarFilePart>();
await foreach (var fp in GetFilePartsAsync(parts))
{
groupedParts.Add(fp);
if (!fp.FileHeader.IsSplitAfter)
{
yield return groupedParts;
groupedParts = new List<RarFilePart>();
}
}
if (groupedParts.Count > 0)
{
yield return groupedParts;
}
}
internal static IEnumerable<RarArchiveEntry> GetEntries(
RarArchive archive,
IEnumerable<RarVolume> rarParts,
@@ -83,16 +49,4 @@ internal static class RarArchiveEntryFactory
yield return new RarArchiveEntry(archive, groupedParts, readerOptions);
}
}
internal static async IAsyncEnumerable<RarArchiveEntry> GetEntriesAsync(
RarArchive archive,
IAsyncEnumerable<RarVolume> rarParts,
ReaderOptions readerOptions
)
{
await foreach (var groupedParts in GetMatchedFilePartsAsync(rarParts))
{
yield return new RarArchiveEntry(archive, groupedParts, readerOptions);
}
}
}

View File

@@ -13,7 +13,6 @@ internal static class RarArchiveVolumeFactory
//new style rar - ..part1 | /part01 | part001 ....
var m = Regex.Match(part1.Name, @"^(.*\.part)([0-9]+)(\.rar)$", RegexOptions.IgnoreCase);
if (m.Success)
{
item = new FileInfo(
Path.Combine(
part1.DirectoryName!,
@@ -24,13 +23,11 @@ internal static class RarArchiveVolumeFactory
)
)
);
}
else
{
//old style - ...rar, .r00, .r01 ...
m = Regex.Match(part1.Name, @"^(.*\.)([r-z{])(ar|[0-9]+)$", RegexOptions.IgnoreCase);
if (m.Success)
{
item = new FileInfo(
Path.Combine(
part1.DirectoryName!,
@@ -43,17 +40,12 @@ internal static class RarArchiveVolumeFactory
)
)
);
}
else //split .001, .002 ....
{
return ArchiveVolumeFactory.GetFilePart(index, part1);
}
}
if (item != null && item.Exists)
{
return item;
}
return null; //no more items
}

View File

@@ -14,9 +14,6 @@ internal class StreamRarArchiveVolume : RarVolume
internal override IEnumerable<RarFilePart> ReadFileParts() => GetVolumeFileParts();
internal override IAsyncEnumerable<RarFilePart> ReadFilePartsAsync() =>
GetVolumeFilePartsAsync();
internal override RarFilePart CreateFilePart(MarkHeader markHeader, FileHeader fileHeader) =>
new SeekableFilePart(markHeader, fileHeader, Index, Stream, ReaderOptions.Password);
}

View File

@@ -1,74 +0,0 @@
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Common.SevenZip;
using SharpCompress.IO;
using SharpCompress.Readers;
namespace SharpCompress.Archives.SevenZip;
public partial class SevenZipArchive
{
private async ValueTask LoadFactoryAsync(
Stream stream,
CancellationToken cancellationToken = default
)
{
if (_database is null)
{
stream.Position = 0;
var reader = new ArchiveReader();
await reader.OpenAsync(
stream,
lookForHeader: ReaderOptions.LookForHeader,
cancellationToken
);
_database = await reader.ReadDatabaseAsync(
new PasswordProvider(ReaderOptions.Password),
cancellationToken
);
}
}
protected override async IAsyncEnumerable<SevenZipArchiveEntry> LoadEntriesAsync(
IAsyncEnumerable<SevenZipVolume> volumes
)
{
var stream = (await volumes.SingleAsync()).Stream;
await LoadFactoryAsync(stream);
if (_database is null)
{
yield break;
}
var entries = new SevenZipArchiveEntry[_database._files.Count];
for (var i = 0; i < _database._files.Count; i++)
{
var file = _database._files[i];
entries[i] = new SevenZipArchiveEntry(
this,
new SevenZipFilePart(stream, _database, i, file, ReaderOptions.ArchiveEncoding),
ReaderOptions
);
}
foreach (var group in entries.Where(x => !x.IsDirectory).GroupBy(x => x.FilePart.Folder))
{
var isSolid = false;
foreach (var entry in group)
{
entry.IsSolid = isSolid;
isSolid = true;
}
}
foreach (var entry in entries)
{
yield return entry;
}
}
protected override ValueTask<IAsyncReader> CreateReaderForSolidExtractionAsync() =>
new(new SevenZipReader(ReaderOptions, this));
}

View File

@@ -1,194 +0,0 @@
using System;
using System.Buffers;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.IO;
using SharpCompress.Readers;
namespace SharpCompress.Archives.SevenZip;
public partial class SevenZipArchive
#if NET8_0_OR_GREATER
: IArchiveOpenable<IArchive, IAsyncArchive>,
IMultiArchiveOpenable<IArchive, IAsyncArchive>
#endif
{
public static IAsyncArchive OpenAsyncArchive(string path, ReaderOptions? readerOptions = null)
{
path.NotNullOrEmpty("path");
return (IAsyncArchive)OpenArchive(new FileInfo(path), readerOptions ?? new ReaderOptions());
}
public static IArchive OpenArchive(string filePath, ReaderOptions? readerOptions = null)
{
filePath.NotNullOrEmpty("filePath");
return OpenArchive(new FileInfo(filePath), readerOptions ?? new ReaderOptions());
}
public static IArchive OpenArchive(FileInfo fileInfo, ReaderOptions? readerOptions = null)
{
fileInfo.NotNull("fileInfo");
return new SevenZipArchive(
new SourceStream(
fileInfo,
i => ArchiveVolumeFactory.GetFilePart(i, fileInfo),
readerOptions ?? new ReaderOptions()
)
);
}
public static IArchive OpenArchive(
IEnumerable<FileInfo> fileInfos,
ReaderOptions? readerOptions = null
)
{
fileInfos.NotNull(nameof(fileInfos));
var files = fileInfos.ToArray();
return new SevenZipArchive(
new SourceStream(
files[0],
i => i < files.Length ? files[i] : null,
readerOptions ?? new ReaderOptions()
)
);
}
public static IArchive OpenArchive(
IEnumerable<Stream> streams,
ReaderOptions? readerOptions = null
)
{
streams.NotNull(nameof(streams));
var strms = streams.ToArray();
return new SevenZipArchive(
new SourceStream(
strms[0],
i => i < strms.Length ? strms[i] : null,
readerOptions ?? new ReaderOptions()
)
);
}
public static IArchive OpenArchive(Stream stream, ReaderOptions? readerOptions = null)
{
stream.NotNull("stream");
if (stream is not { CanSeek: true })
{
throw new ArgumentException("Stream must be seekable", nameof(stream));
}
return new SevenZipArchive(
new SourceStream(stream, _ => null, readerOptions ?? new ReaderOptions())
);
}
public static IAsyncArchive OpenAsyncArchive(Stream stream, ReaderOptions? readerOptions = null)
{
return (IAsyncArchive)OpenArchive(stream, readerOptions);
}
public static IAsyncArchive OpenAsyncArchive(
FileInfo fileInfo,
ReaderOptions? readerOptions = null
)
{
return (IAsyncArchive)OpenArchive(fileInfo, readerOptions);
}
public static IAsyncArchive OpenAsyncArchive(
IReadOnlyList<Stream> streams,
ReaderOptions? readerOptions = null
)
{
return (IAsyncArchive)OpenArchive(streams, readerOptions);
}
public static IAsyncArchive OpenAsyncArchive(
IReadOnlyList<FileInfo> fileInfos,
ReaderOptions? readerOptions = null
)
{
return (IAsyncArchive)OpenArchive(fileInfos, readerOptions);
}
public static bool IsSevenZipFile(string filePath) => IsSevenZipFile(new FileInfo(filePath));
public static bool IsSevenZipFile(FileInfo fileInfo)
{
if (!fileInfo.Exists)
{
return false;
}
using Stream stream = fileInfo.OpenRead();
return IsSevenZipFile(stream);
}
public static bool IsSevenZipFile(Stream stream)
{
try
{
return SignatureMatch(stream);
}
catch
{
return false;
}
}
public static async ValueTask<bool> IsSevenZipFileAsync(
Stream stream,
CancellationToken cancellationToken = default
)
{
cancellationToken.ThrowIfCancellationRequested();
try
{
return await SignatureMatchAsync(stream, cancellationToken);
}
catch
{
return false;
}
}
private static ReadOnlySpan<byte> Signature => [(byte)'7', (byte)'z', 0xBC, 0xAF, 0x27, 0x1C];
private static bool SignatureMatch(Stream stream)
{
var buffer = ArrayPool<byte>.Shared.Rent(6);
try
{
stream.ReadExact(buffer, 0, 6);
return buffer.AsSpan().Slice(0, 6).SequenceEqual(Signature);
}
finally
{
ArrayPool<byte>.Shared.Return(buffer);
}
}
private static async ValueTask<bool> SignatureMatchAsync(
Stream stream,
CancellationToken cancellationToken
)
{
var buffer = ArrayPool<byte>.Shared.Rent(6);
try
{
if (!await stream.ReadFullyAsync(buffer, 0, 6, cancellationToken).ConfigureAwait(false))
{
return false;
}
return buffer.AsSpan().Slice(0, 6).SequenceEqual(Signature);
}
finally
{
ArrayPool<byte>.Shared.Return(buffer);
}
}
}

View File

@@ -12,10 +12,163 @@ using SharpCompress.Readers;
namespace SharpCompress.Archives.SevenZip;
public partial class SevenZipArchive : AbstractArchive<SevenZipArchiveEntry, SevenZipVolume>
public class SevenZipArchive : AbstractArchive<SevenZipArchiveEntry, SevenZipVolume>
{
private ArchiveDatabase? _database;
/// <summary>
/// Constructor expects a filepath to an existing file.
/// </summary>
/// <param name="filePath"></param>
/// <param name="readerOptions"></param>
public static SevenZipArchive Open(string filePath, ReaderOptions? readerOptions = null)
{
filePath.NotNullOrEmpty("filePath");
return Open(new FileInfo(filePath), readerOptions ?? new ReaderOptions());
}
/// <summary>
/// Opens an existing 7z archive from a FileInfo.
/// </summary>
/// <param name="fileInfo"></param>
/// <param name="readerOptions"></param>
public static SevenZipArchive Open(FileInfo fileInfo, ReaderOptions? readerOptions = null)
{
fileInfo.NotNull("fileInfo");
return new SevenZipArchive(
new SourceStream(
fileInfo,
i => ArchiveVolumeFactory.GetFilePart(i, fileInfo),
readerOptions ?? new ReaderOptions()
)
);
}
/// <summary>
/// Opens a 7z archive from all file parts passed in.
/// </summary>
/// <param name="fileInfos"></param>
/// <param name="readerOptions"></param>
public static SevenZipArchive Open(
IEnumerable<FileInfo> fileInfos,
ReaderOptions? readerOptions = null
)
{
fileInfos.NotNull(nameof(fileInfos));
var files = fileInfos.ToArray();
return new SevenZipArchive(
new SourceStream(
files[0],
i => i < files.Length ? files[i] : null,
readerOptions ?? new ReaderOptions()
)
);
}
/// <summary>
/// Opens a 7z archive from all stream parts passed in.
/// </summary>
/// <param name="streams"></param>
/// <param name="readerOptions"></param>
public static SevenZipArchive Open(
IEnumerable<Stream> streams,
ReaderOptions? readerOptions = null
)
{
streams.NotNull(nameof(streams));
var strms = streams.ToArray();
return new SevenZipArchive(
new SourceStream(
strms[0],
i => i < strms.Length ? strms[i] : null,
readerOptions ?? new ReaderOptions()
)
);
}
/// <summary>
/// Takes a seekable Stream as a source
/// </summary>
/// <param name="stream"></param>
/// <param name="readerOptions"></param>
public static SevenZipArchive Open(Stream stream, ReaderOptions? readerOptions = null)
{
stream.NotNull("stream");
if (stream is not { CanSeek: true })
{
throw new ArgumentException("Stream must be seekable", nameof(stream));
}
return new SevenZipArchive(
new SourceStream(stream, _ => null, readerOptions ?? new ReaderOptions())
);
}
/// <summary>
/// Opens a SevenZipArchive asynchronously from a stream.
/// </summary>
/// <param name="stream"></param>
/// <param name="readerOptions"></param>
/// <param name="cancellationToken"></param>
public static ValueTask<IAsyncArchive> OpenAsync(
Stream stream,
ReaderOptions? readerOptions = null,
CancellationToken cancellationToken = default
)
{
cancellationToken.ThrowIfCancellationRequested();
return new(Open(stream, readerOptions));
}
/// <summary>
/// Opens a SevenZipArchive asynchronously from a FileInfo.
/// </summary>
/// <param name="fileInfo"></param>
/// <param name="readerOptions"></param>
/// <param name="cancellationToken"></param>
public static ValueTask<IAsyncArchive> OpenAsync(
FileInfo fileInfo,
ReaderOptions? readerOptions = null,
CancellationToken cancellationToken = default
)
{
cancellationToken.ThrowIfCancellationRequested();
return new(Open(fileInfo, readerOptions));
}
/// <summary>
/// Opens a SevenZipArchive asynchronously from multiple streams.
/// </summary>
/// <param name="streams"></param>
/// <param name="readerOptions"></param>
/// <param name="cancellationToken"></param>
public static ValueTask<IAsyncArchive> OpenAsync(
IReadOnlyList<Stream> streams,
ReaderOptions? readerOptions = null,
CancellationToken cancellationToken = default
)
{
cancellationToken.ThrowIfCancellationRequested();
return new(Open(streams, readerOptions));
}
/// <summary>
/// Opens a SevenZipArchive asynchronously from multiple FileInfo objects.
/// </summary>
/// <param name="fileInfos"></param>
/// <param name="readerOptions"></param>
/// <param name="cancellationToken"></param>
public static ValueTask<IAsyncArchive> OpenAsync(
IReadOnlyList<FileInfo> fileInfos,
ReaderOptions? readerOptions = null,
CancellationToken cancellationToken = default
)
{
cancellationToken.ThrowIfCancellationRequested();
return new(Open(fileInfos, readerOptions));
}
/// <summary>
/// Constructor with a SourceStream able to handle FileInfo and Streams.
/// </summary>
@@ -29,6 +182,18 @@ public partial class SevenZipArchive : AbstractArchive<SevenZipArchiveEntry, Sev
return new SevenZipVolume(sourceStream, ReaderOptions, 0).AsEnumerable(); //simple single volume or split, multivolume not supported
}
public static bool IsSevenZipFile(string filePath) => IsSevenZipFile(new FileInfo(filePath));
public static bool IsSevenZipFile(FileInfo fileInfo)
{
if (!fileInfo.Exists)
{
return false;
}
using Stream stream = fileInfo.OpenRead();
return IsSevenZipFile(stream);
}
internal SevenZipArchive()
: base(ArchiveType.SevenZip) { }
@@ -36,46 +201,32 @@ public partial class SevenZipArchive : AbstractArchive<SevenZipArchiveEntry, Sev
IEnumerable<SevenZipVolume> volumes
)
{
foreach (var volume in volumes)
var stream = volumes.Single().Stream;
LoadFactory(stream);
if (_database is null)
{
LoadFactory(volume.Stream);
if (_database is null)
return Enumerable.Empty<SevenZipArchiveEntry>();
}
var entries = new SevenZipArchiveEntry[_database._files.Count];
for (var i = 0; i < _database._files.Count; i++)
{
var file = _database._files[i];
entries[i] = new SevenZipArchiveEntry(
this,
new SevenZipFilePart(stream, _database, i, file, ReaderOptions.ArchiveEncoding)
);
}
foreach (var group in entries.Where(x => !x.IsDirectory).GroupBy(x => x.FilePart.Folder))
{
var isSolid = false;
foreach (var entry in group)
{
yield break;
}
var entries = new SevenZipArchiveEntry[_database._files.Count];
for (var i = 0; i < _database._files.Count; i++)
{
var file = _database._files[i];
entries[i] = new SevenZipArchiveEntry(
this,
new SevenZipFilePart(
volume.Stream,
_database,
i,
file,
ReaderOptions.ArchiveEncoding
),
ReaderOptions
);
}
foreach (
var group in entries.Where(x => !x.IsDirectory).GroupBy(x => x.FilePart.Folder)
)
{
var isSolid = false;
foreach (var entry in group)
{
entry.IsSolid = isSolid;
isSolid = true;
}
}
foreach (var entry in entries)
{
yield return entry;
entry.IsSolid = isSolid;
isSolid = true; //mark others in this group as solid - same as rar behaviour.
}
}
return entries;
}
private void LoadFactory(Stream stream)
@@ -89,9 +240,34 @@ public partial class SevenZipArchive : AbstractArchive<SevenZipArchiveEntry, Sev
}
}
public static bool IsSevenZipFile(Stream stream)
{
try
{
return SignatureMatch(stream);
}
catch
{
return false;
}
}
private static ReadOnlySpan<byte> Signature =>
new byte[] { (byte)'7', (byte)'z', 0xBC, 0xAF, 0x27, 0x1C };
private static bool SignatureMatch(Stream stream)
{
var reader = new BinaryReader(stream);
ReadOnlySpan<byte> signatureBytes = reader.ReadBytes(6);
return signatureBytes.SequenceEqual(Signature);
}
protected override IReader CreateReaderForSolidExtraction() =>
new SevenZipReader(ReaderOptions, this);
protected override ValueTask<IAsyncReader> CreateReaderForSolidExtractionAsync() =>
new(new SevenZipReader(ReaderOptions, this));
public override bool IsSolid =>
Entries
.Where(x => !x.IsDirectory)
@@ -103,34 +279,13 @@ public partial class SevenZipArchive : AbstractArchive<SevenZipArchiveEntry, Sev
public override long TotalSize =>
_database?._packSizes.Aggregate(0L, (total, packSize) => total + packSize) ?? 0;
internal sealed class SevenZipReader : AbstractReader<SevenZipEntry, SevenZipVolume>
private sealed class SevenZipReader : AbstractReader<SevenZipEntry, SevenZipVolume>
{
private readonly SevenZipArchive _archive;
private SevenZipEntry? _currentEntry;
private Stream? _currentFolderStream;
private CFolder? _currentFolder;
/// <summary>
/// Enables internal diagnostics for tests.
/// When disabled (default), diagnostics properties return null to avoid exposing internal state.
/// </summary>
internal bool DiagnosticsEnabled { get; set; }
/// <summary>
/// Current folder instance used to decide whether the solid folder stream should be reused.
/// Only available when <see cref="DiagnosticsEnabled"/> is true.
/// </summary>
internal object? DiagnosticsCurrentFolder => DiagnosticsEnabled ? _currentFolder : null;
/// <summary>
/// Current shared folder stream instance.
/// Only available when <see cref="DiagnosticsEnabled"/> is true.
/// </summary>
internal Stream? DiagnosticsCurrentFolderStream =>
DiagnosticsEnabled ? _currentFolderStream : null;
internal SevenZipReader(ReaderOptions readerOptions, SevenZipArchive archive)
: base(readerOptions, ArchiveType.SevenZip, false) => this._archive = archive;
: base(readerOptions, ArchiveType.SevenZip) => this._archive = archive;
public override SevenZipVolume Volume => _archive.Volumes.Single();
@@ -143,10 +298,9 @@ public partial class SevenZipArchive : AbstractArchive<SevenZipArchiveEntry, Sev
_currentEntry = dir;
yield return dir;
}
// For solid archives (entries in the same folder share a compressed stream),
// we must iterate entries sequentially and maintain the folder stream state
// across entries in the same folder to avoid recreating the decompression
// stream for each file, which breaks contiguous streaming.
// For non-directory entries, yield them without creating shared streams.
// Each call to GetEntryStream() creates a fresh decompression stream
// to avoid state-corruption issues with async operations.
foreach (var entry in entries.Where(x => !x.IsDirectory))
{
_currentEntry = entry;
@@ -156,46 +310,19 @@ public partial class SevenZipArchive : AbstractArchive<SevenZipArchiveEntry, Sev
protected override EntryStream GetEntryStream()
{
// Create a fresh decompression stream for each file (no state sharing).
// However, the LZMA decoder has bugs in its async implementation that cause
// state corruption even on fresh streams. The SyncOnlyStream wrapper
// works around these bugs by forcing async operations to use sync equivalents.
//
// TODO: Fix the LZMA decoder async bugs (in LzmaStream, Decoder, OutWindow)
// so this wrapper is no longer necessary.
var entry = _currentEntry.NotNull("currentEntry is not null");
if (entry.IsDirectory)
{
return CreateEntryStream(Stream.Null);
}
var folder = entry.FilePart.Folder;
// Check if we're starting a new folder - dispose old folder stream if needed
if (folder != _currentFolder)
{
_currentFolderStream?.Dispose();
_currentFolderStream = null;
_currentFolder = folder;
}
// Create the folder stream once per folder
if (_currentFolderStream is null)
{
_currentFolderStream = _archive._database!.GetFolderStream(
_archive.Volumes.Single().Stream,
folder!,
_archive._database.PasswordProvider
);
}
// Wrap with SyncOnlyStream to work around LZMA async bugs
// Return a ReadOnlySubStream that reads from the shared folder stream
return CreateEntryStream(
new SyncOnlyStream(
new ReadOnlySubStream(_currentFolderStream, entry.Size, leaveOpen: true)
)
);
}
public override void Dispose()
{
_currentFolderStream?.Dispose();
_currentFolderStream = null;
base.Dispose();
return CreateEntryStream(new SyncOnlyStream(entry.FilePart.GetCompressedStream()));
}
}
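
A short sketch of how the solid-extraction path above is reached from user code. ExtractAllEntries() is SharpCompress's existing public API and is assumed here to route through CreateReaderForSolidExtraction, so entries that share a solid folder are decoded in one sequential pass. The archive path is a placeholder.

using System.IO;
using SharpCompress.Archives.SevenZip;
using SharpCompress.Readers;

using var archive = SevenZipArchive.Open("archive.7z"); // placeholder path
using var reader = archive.ExtractAllEntries();
while (reader.MoveToNextEntry())
{
    if (!reader.Entry.IsDirectory)
    {
        // sequential reads keep the shared folder stream contiguous
        using var target = File.Create(Path.GetFileName(reader.Entry.Key!));
        reader.WriteEntryTo(target);
    }
}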
@@ -268,7 +395,7 @@ public partial class SevenZipArchive : AbstractArchive<SevenZipArchiveEntry, Sev
return Task.CompletedTask;
}
#if !LEGACY_DOTNET
#if !NETFRAMEWORK && !NETSTANDARD2_0
public override ValueTask<int> ReadAsync(
Memory<byte> buffer,
CancellationToken cancellationToken = default

View File

@@ -1,25 +1,20 @@
using System.IO;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common.Options;
using SharpCompress.Common.SevenZip;
namespace SharpCompress.Archives.SevenZip;
public class SevenZipArchiveEntry : SevenZipEntry, IArchiveEntry
{
internal SevenZipArchiveEntry(
SevenZipArchive archive,
SevenZipFilePart part,
IReaderOptions readerOptions
)
: base(part, readerOptions) => Archive = archive;
internal SevenZipArchiveEntry(SevenZipArchive archive, SevenZipFilePart part)
: base(part) => Archive = archive;
public Stream OpenEntryStream() => FilePart.GetCompressedStream();
public async ValueTask<Stream> OpenEntryStreamAsync(
CancellationToken cancellationToken = default
) => (await FilePart.GetCompressedStreamAsync(cancellationToken)).NotNull();
) => OpenEntryStream();
public IArchive Archive { get; }

View File

@@ -1,167 +0,0 @@
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Common.Options;
using SharpCompress.Common.Tar;
using SharpCompress.Common.Tar.Headers;
using SharpCompress.IO;
using SharpCompress.Readers;
using SharpCompress.Readers.Tar;
using SharpCompress.Writers;
using SharpCompress.Writers.Tar;
namespace SharpCompress.Archives.Tar;
public partial class TarArchive
{
protected override async ValueTask SaveToAsync(
Stream stream,
TarWriterOptions options,
IAsyncEnumerable<TarArchiveEntry> oldEntries,
IEnumerable<TarArchiveEntry> newEntries,
CancellationToken cancellationToken = default
)
{
using var writer = new TarWriter(
stream,
options as TarWriterOptions ?? new TarWriterOptions(options)
);
await foreach (
var entry in oldEntries.WithCancellation(cancellationToken).ConfigureAwait(false)
)
{
if (entry.IsDirectory)
{
await writer
.WriteDirectoryAsync(
entry.Key.NotNull("Entry Key is null"),
entry.LastModifiedTime,
cancellationToken
)
.ConfigureAwait(false);
}
else
{
using var entryStream = entry.OpenEntryStream();
await writer
.WriteAsync(
entry.Key.NotNull("Entry Key is null"),
entryStream,
entry.LastModifiedTime,
entry.Size,
cancellationToken
)
.ConfigureAwait(false);
}
}
foreach (var entry in newEntries)
{
if (entry.IsDirectory)
{
await writer
.WriteDirectoryAsync(
entry.Key.NotNull("Entry Key is null"),
entry.LastModifiedTime,
cancellationToken
)
.ConfigureAwait(false);
}
else
{
using var entryStream = entry.OpenEntryStream();
await writer
.WriteAsync(
entry.Key.NotNull("Entry Key is null"),
entryStream,
entry.LastModifiedTime,
entry.Size,
cancellationToken
)
.ConfigureAwait(false);
}
}
}
protected override ValueTask<IAsyncReader> CreateReaderForSolidExtractionAsync()
{
var stream = Volumes.Single().Stream;
stream.Position = 0;
return new((IAsyncReader)TarReader.OpenReader(stream));
}
protected override async IAsyncEnumerable<TarArchiveEntry> LoadEntriesAsync(
IAsyncEnumerable<TarVolume> volumes
)
{
var stream = (await volumes.SingleAsync()).Stream;
if (stream.CanSeek)
{
stream.Position = 0;
}
// Always use async header reading in LoadEntriesAsync: it must support
// async-only streams, and seekable streams take the same path for consistency.
{
TarHeader? previousHeader = null;
await foreach (
var header in TarHeaderFactory.ReadHeaderAsync(
StreamingMode.Seekable,
stream,
ReaderOptions.ArchiveEncoding
)
)
{
if (header != null)
{
if (header.EntryType == EntryType.LongName)
{
previousHeader = header;
}
else
{
if (previousHeader != null)
{
var entry = new TarArchiveEntry(
this,
new TarFilePart(previousHeader, stream),
CompressionType.None,
ReaderOptions
);
var oldStreamPos = stream.Position;
using (var entryStream = entry.OpenEntryStream())
{
using var memoryStream = new MemoryStream();
await entryStream.CopyToAsync(memoryStream);
memoryStream.Position = 0;
var bytes = memoryStream.ToArray();
header.Name = ReaderOptions
.ArchiveEncoding.Decode(bytes)
.TrimNulls();
}
stream.Position = oldStreamPos;
previousHeader = null;
}
yield return new TarArchiveEntry(
this,
new TarFilePart(header, stream),
CompressionType.None,
ReaderOptions
);
}
}
else
{
throw new IncompleteArchiveException("Failed to read TAR header");
}
}
}
}
}

View File

@@ -1,187 +0,0 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Common.Tar.Headers;
using SharpCompress.IO;
using SharpCompress.Readers;
using SharpCompress.Writers.Tar;
namespace SharpCompress.Archives.Tar;
public partial class TarArchive
#if NET8_0_OR_GREATER
: IWritableArchiveOpenable<TarWriterOptions>,
IMultiArchiveOpenable<
IWritableArchive<TarWriterOptions>,
IWritableAsyncArchive<TarWriterOptions>
>
#endif
{
public static IWritableArchive<TarWriterOptions> OpenArchive(
string filePath,
ReaderOptions? readerOptions = null
)
{
filePath.NotNullOrEmpty(nameof(filePath));
return OpenArchive(new FileInfo(filePath), readerOptions);
}
public static IWritableArchive<TarWriterOptions> OpenArchive(
FileInfo fileInfo,
ReaderOptions? readerOptions = null
)
{
fileInfo.NotNull(nameof(fileInfo));
return new TarArchive(
new SourceStream(
fileInfo,
i => ArchiveVolumeFactory.GetFilePart(i, fileInfo),
readerOptions ?? new ReaderOptions() { LeaveStreamOpen = false }
)
);
}
public static IWritableArchive<TarWriterOptions> OpenArchive(
IEnumerable<FileInfo> fileInfos,
ReaderOptions? readerOptions = null
)
{
fileInfos.NotNull(nameof(fileInfos));
var files = fileInfos.ToArray();
return new TarArchive(
new SourceStream(
files[0],
i => i < files.Length ? files[i] : null,
readerOptions ?? new ReaderOptions() { LeaveStreamOpen = false }
)
);
}
public static IWritableArchive<TarWriterOptions> OpenArchive(
IEnumerable<Stream> streams,
ReaderOptions? readerOptions = null
)
{
streams.NotNull(nameof(streams));
var strms = streams.ToArray();
return new TarArchive(
new SourceStream(
strms[0],
i => i < strms.Length ? strms[i] : null,
readerOptions ?? new ReaderOptions()
)
);
}
public static IWritableArchive<TarWriterOptions> OpenArchive(
Stream stream,
ReaderOptions? readerOptions = null
)
{
stream.NotNull(nameof(stream));
if (stream is not { CanSeek: true })
{
throw new ArgumentException("Stream must be seekable", nameof(stream));
}
return new TarArchive(
new SourceStream(stream, i => null, readerOptions ?? new ReaderOptions())
);
}
public static IWritableAsyncArchive<TarWriterOptions> OpenAsyncArchive(
Stream stream,
ReaderOptions? readerOptions = null
) => (IWritableAsyncArchive<TarWriterOptions>)OpenArchive(stream, readerOptions);
public static IWritableAsyncArchive<TarWriterOptions> OpenAsyncArchive(
string path,
ReaderOptions? readerOptions = null
) => (IWritableAsyncArchive<TarWriterOptions>)OpenArchive(new FileInfo(path), readerOptions);
public static IWritableAsyncArchive<TarWriterOptions> OpenAsyncArchive(
FileInfo fileInfo,
ReaderOptions? readerOptions = null
) => (IWritableAsyncArchive<TarWriterOptions>)OpenArchive(fileInfo, readerOptions);
public static IWritableAsyncArchive<TarWriterOptions> OpenAsyncArchive(
IReadOnlyList<Stream> streams,
ReaderOptions? readerOptions = null
) => (IWritableAsyncArchive<TarWriterOptions>)OpenArchive(streams, readerOptions);
public static IWritableAsyncArchive<TarWriterOptions> OpenAsyncArchive(
IReadOnlyList<FileInfo> fileInfos,
ReaderOptions? readerOptions = null
) => (IWritableAsyncArchive<TarWriterOptions>)OpenArchive(fileInfos, readerOptions);
public static bool IsTarFile(string filePath) => IsTarFile(new FileInfo(filePath));
public static bool IsTarFile(FileInfo fileInfo)
{
if (!fileInfo.Exists)
{
return false;
}
using Stream stream = fileInfo.OpenRead();
return IsTarFile(stream);
}
public static bool IsTarFile(Stream stream)
{
try
{
var tarHeader = new TarHeader(new ArchiveEncoding());
var reader = new BinaryReader(stream, Encoding.UTF8, false);
var readSucceeded = tarHeader.Read(reader);
var isEmptyArchive =
tarHeader.Name?.Length == 0
&& tarHeader.Size == 0
&& Enum.IsDefined(typeof(EntryType), tarHeader.EntryType);
return readSucceeded || isEmptyArchive;
}
catch (Exception)
{
// Catch all exceptions during tar header reading to determine if this is a valid tar file
// Invalid tar files or corrupted streams will throw various exceptions
return false;
}
}
public static async ValueTask<bool> IsTarFileAsync(
Stream stream,
CancellationToken cancellationToken = default
)
{
try
{
var tarHeader = new TarHeader(new ArchiveEncoding());
#if NET8_0_OR_GREATER
await using var reader = new AsyncBinaryReader(stream, leaveOpen: true);
#else
using var reader = new AsyncBinaryReader(stream, leaveOpen: true);
#endif
var readSucceeded = await tarHeader.ReadAsync(reader);
var isEmptyArchive =
tarHeader.Name?.Length == 0
&& tarHeader.Size == 0
&& Enum.IsDefined(typeof(EntryType), tarHeader.EntryType);
return readSucceeded || isEmptyArchive;
}
catch (Exception)
{
// Catch all exceptions during tar header reading to determine if this is a valid tar file
// Invalid tar files or corrupted streams will throw various exceptions
return false;
}
}
public static IWritableArchive<TarWriterOptions> CreateArchive() => new TarArchive();
public static IWritableAsyncArchive<TarWriterOptions> CreateAsyncArchive() => new TarArchive();
}
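
A minimal sketch of the detection logic above: a successful header read, or an all-zero header (empty name, zero size, defined entry type), both count as a valid tar. The path is a placeholder; the probe consumes a 512-byte header block, so rewind before reusing the stream.

using System;
using System.IO;
using SharpCompress.Archives.Tar;

using var stream = File.OpenRead("data.tar"); // placeholder path
var isTar = TarArchive.IsTarFile(stream);
stream.Position = 0; // rewind after the 512-byte header probe
Console.WriteLine(isTar ? "tar" : "not tar");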

View File

@@ -5,7 +5,6 @@ using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Common.Options;
using SharpCompress.Common.Tar;
using SharpCompress.Common.Tar.Headers;
using SharpCompress.IO;
@@ -16,15 +15,196 @@ using SharpCompress.Writers.Tar;
namespace SharpCompress.Archives.Tar;
public partial class TarArchive
: AbstractWritableArchive<TarArchiveEntry, TarVolume, TarWriterOptions>
public class TarArchive : AbstractWritableArchive<TarArchiveEntry, TarVolume>
{
protected override IEnumerable<TarVolume> LoadVolumes(SourceStream sourceStream)
/// <summary>
/// Opens an existing tar archive from a file path.
/// </summary>
/// <param name="filePath"></param>
/// <param name="readerOptions"></param>
public static TarArchive Open(string filePath, ReaderOptions? readerOptions = null)
{
sourceStream.NotNull("SourceStream is null").LoadAllParts();
return new TarVolume(sourceStream, ReaderOptions, 1).AsEnumerable();
filePath.NotNullOrEmpty(nameof(filePath));
return Open(new FileInfo(filePath), readerOptions ?? new ReaderOptions());
}
/// <summary>
/// Opens an existing tar archive from a FileInfo.
/// </summary>
/// <param name="fileInfo"></param>
/// <param name="readerOptions"></param>
public static TarArchive Open(FileInfo fileInfo, ReaderOptions? readerOptions = null)
{
fileInfo.NotNull(nameof(fileInfo));
return new TarArchive(
new SourceStream(
fileInfo,
i => ArchiveVolumeFactory.GetFilePart(i, fileInfo),
readerOptions ?? new ReaderOptions()
)
);
}
/// <summary>
/// Opens a tar archive from all file parts passed in.
/// </summary>
/// <param name="fileInfos"></param>
/// <param name="readerOptions"></param>
public static TarArchive Open(
IEnumerable<FileInfo> fileInfos,
ReaderOptions? readerOptions = null
)
{
fileInfos.NotNull(nameof(fileInfos));
var files = fileInfos.ToArray();
return new TarArchive(
new SourceStream(
files[0],
i => i < files.Length ? files[i] : null,
readerOptions ?? new ReaderOptions()
)
);
}
/// <summary>
/// Opens a tar archive from all stream parts passed in.
/// </summary>
/// <param name="streams"></param>
/// <param name="readerOptions"></param>
public static TarArchive Open(IEnumerable<Stream> streams, ReaderOptions? readerOptions = null)
{
streams.NotNull(nameof(streams));
var strms = streams.ToArray();
return new TarArchive(
new SourceStream(
strms[0],
i => i < strms.Length ? strms[i] : null,
readerOptions ?? new ReaderOptions()
)
);
}
/// <summary>
/// Takes a seekable Stream as a source
/// </summary>
/// <param name="stream"></param>
/// <param name="readerOptions"></param>
public static TarArchive Open(Stream stream, ReaderOptions? readerOptions = null)
{
stream.NotNull(nameof(stream));
if (stream is not { CanSeek: true })
{
throw new ArgumentException("Stream must be seekable", nameof(stream));
}
return new TarArchive(
new SourceStream(stream, i => null, readerOptions ?? new ReaderOptions())
);
}
/// <summary>
/// Opens a TarArchive asynchronously from a stream.
/// </summary>
/// <param name="stream"></param>
/// <param name="readerOptions"></param>
/// <param name="cancellationToken"></param>
public static ValueTask<IAsyncArchive> OpenAsync(
Stream stream,
ReaderOptions? readerOptions = null,
CancellationToken cancellationToken = default
)
{
cancellationToken.ThrowIfCancellationRequested();
return new(Open(stream, readerOptions));
}
/// <summary>
/// Opens a TarArchive asynchronously from a FileInfo.
/// </summary>
/// <param name="fileInfo"></param>
/// <param name="readerOptions"></param>
/// <param name="cancellationToken"></param>
public static ValueTask<IAsyncArchive> OpenAsync(
FileInfo fileInfo,
ReaderOptions? readerOptions = null,
CancellationToken cancellationToken = default
)
{
cancellationToken.ThrowIfCancellationRequested();
return new(Open(fileInfo, readerOptions));
}
/// <summary>
/// Opens a TarArchive asynchronously from multiple streams.
/// </summary>
/// <param name="streams"></param>
/// <param name="readerOptions"></param>
/// <param name="cancellationToken"></param>
public static ValueTask<IAsyncArchive> OpenAsync(
IReadOnlyList<Stream> streams,
ReaderOptions? readerOptions = null,
CancellationToken cancellationToken = default
)
{
cancellationToken.ThrowIfCancellationRequested();
return new(Open(streams, readerOptions));
}
/// <summary>
/// Opens a TarArchive asynchronously from multiple FileInfo objects.
/// </summary>
/// <param name="fileInfos"></param>
/// <param name="readerOptions"></param>
/// <param name="cancellationToken"></param>
public static ValueTask<IAsyncArchive> OpenAsync(
IReadOnlyList<FileInfo> fileInfos,
ReaderOptions? readerOptions = null,
CancellationToken cancellationToken = default
)
{
cancellationToken.ThrowIfCancellationRequested();
return new(Open(fileInfos, readerOptions));
}
public static bool IsTarFile(string filePath) => IsTarFile(new FileInfo(filePath));
public static bool IsTarFile(FileInfo fileInfo)
{
if (!fileInfo.Exists)
{
return false;
}
using Stream stream = fileInfo.OpenRead();
return IsTarFile(stream);
}
public static bool IsTarFile(Stream stream)
{
try
{
var tarHeader = new TarHeader(new ArchiveEncoding());
var readSucceeded = tarHeader.Read(new BinaryReader(stream));
var isEmptyArchive =
tarHeader.Name?.Length == 0
&& tarHeader.Size == 0
&& Enum.IsDefined(typeof(EntryType), tarHeader.EntryType);
return readSucceeded || isEmptyArchive;
}
catch { }
return false;
}
protected override IEnumerable<TarVolume> LoadVolumes(SourceStream sourceStream)
{
sourceStream.NotNull("SourceStream is null").LoadAllParts(); //request all streams
return new TarVolume(sourceStream, ReaderOptions, 1).AsEnumerable(); //simple single volume or split, multivolume not supported
}
/// <summary>
/// Constructor with a SourceStream able to handle FileInfo and Streams.
/// </summary>
/// <param name="sourceStream"></param>
private TarArchive(SourceStream sourceStream)
: base(ArchiveType.Tar, sourceStream) { }
@@ -34,10 +214,6 @@ public partial class TarArchive
protected override IEnumerable<TarArchiveEntry> LoadEntries(IEnumerable<TarVolume> volumes)
{
var stream = volumes.Single().Stream;
if (stream.CanSeek)
{
stream.Position = 0;
}
TarHeader? previousHeader = null;
foreach (
var header in TarHeaderFactory.ReadHeader(
@@ -60,8 +236,7 @@ public partial class TarArchive
var entry = new TarArchiveEntry(
this,
new TarFilePart(previousHeader, stream),
CompressionType.None,
ReaderOptions
CompressionType.None
);
var oldStreamPos = stream.Position;
@@ -69,7 +244,7 @@ public partial class TarArchive
using (var entryStream = entry.OpenEntryStream())
{
using var memoryStream = new MemoryStream();
entryStream.CopyTo(memoryStream, Constants.BufferSize);
entryStream.CopyTo(memoryStream);
memoryStream.Position = 0;
var bytes = memoryStream.ToArray();
@@ -83,8 +258,7 @@ public partial class TarArchive
yield return new TarArchiveEntry(
this,
new TarFilePart(header, stream),
CompressionType.None,
ReaderOptions
CompressionType.None
);
}
}
@@ -95,6 +269,8 @@ public partial class TarArchive
}
}
public static TarArchive Create() => new();
protected override TarArchiveEntry CreateEntryInternal(
string filePath,
Stream source,
@@ -119,15 +295,12 @@ public partial class TarArchive
protected override void SaveTo(
Stream stream,
TarWriterOptions options,
WriterOptions options,
IEnumerable<TarArchiveEntry> oldEntries,
IEnumerable<TarArchiveEntry> newEntries
)
{
using var writer = new TarWriter(
stream,
options as TarWriterOptions ?? new TarWriterOptions(options)
);
using var writer = new TarWriter(stream, new TarWriterOptions(options));
foreach (var entry in oldEntries.Concat(newEntries))
{
if (entry.IsDirectory)
@@ -150,10 +323,54 @@ public partial class TarArchive
}
}
protected override async ValueTask SaveToAsync(
Stream stream,
WriterOptions options,
IEnumerable<TarArchiveEntry> oldEntries,
IEnumerable<TarArchiveEntry> newEntries,
CancellationToken cancellationToken = default
)
{
using var writer = new TarWriter(stream, new TarWriterOptions(options));
foreach (var entry in oldEntries.Concat(newEntries))
{
if (entry.IsDirectory)
{
await writer
.WriteDirectoryAsync(
entry.Key.NotNull("Entry Key is null"),
entry.LastModifiedTime,
cancellationToken
)
.ConfigureAwait(false);
}
else
{
using var entryStream = entry.OpenEntryStream();
await writer
.WriteAsync(
entry.Key.NotNull("Entry Key is null"),
entryStream,
entry.LastModifiedTime,
entry.Size,
cancellationToken
)
.ConfigureAwait(false);
}
}
}
protected override IReader CreateReaderForSolidExtraction()
{
var stream = Volumes.Single().Stream;
stream.Position = 0;
return TarReader.OpenReader(stream);
return TarReader.Open(stream);
}
protected override ValueTask<IAsyncReader> CreateReaderForSolidExtractionAsync()
{
var stream = Volumes.Single().Stream;
stream.Position = 0;
return new(TarReader.Open(stream));
}
}
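
A hedged round-trip sketch for the save path above; the public async save surface on the writable archive base is assumed to mirror the synchronous SaveTo(stream, options) shown in this file.

using System.IO;
using System.Text;
using System.Threading.Tasks;
using SharpCompress.Archives.Tar;
using SharpCompress.Common;
using SharpCompress.Writers;

public static class TarRoundTrip
{
    public static async Task RunAsync()
    {
        using var archive = TarArchive.Create();
        archive.AddEntry("hello.txt", new MemoryStream(Encoding.UTF8.GetBytes("hi")), closeStream: true);
        using var output = File.Create("out.tar"); // placeholder path
        // assumed public wrapper over the protected SaveToAsync override above
        await archive.SaveToAsync(output, new WriterOptions(CompressionType.None));
    }
}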

View File

@@ -1,28 +1,22 @@
using System.IO;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Common.Options;
using SharpCompress.Common.Tar;
namespace SharpCompress.Archives.Tar;
public class TarArchiveEntry : TarEntry, IArchiveEntry
{
internal TarArchiveEntry(
TarArchive archive,
TarFilePart? part,
CompressionType compressionType,
IReaderOptions readerOptions
)
: base(part, compressionType, readerOptions) => Archive = archive;
internal TarArchiveEntry(TarArchive archive, TarFilePart? part, CompressionType compressionType)
: base(part, compressionType) => Archive = archive;
public virtual Stream OpenEntryStream() => Parts.Single().GetCompressedStream().NotNull();
public async ValueTask<Stream> OpenEntryStreamAsync(
CancellationToken cancellationToken = default
) => (await Parts.Single().GetCompressedStreamAsync(cancellationToken)).NotNull();
) => OpenEntryStream();
#region IArchiveEntry Members

View File

@@ -21,7 +21,7 @@ internal sealed class TarWritableArchiveEntry : TarArchiveEntry, IWritableArchiv
DateTime? lastModified,
bool closeStream
)
: base(archive, null, compressionType, archive.ReaderOptions)
: base(archive, null, compressionType)
{
this.stream = stream;
Key = path;
@@ -36,7 +36,7 @@ internal sealed class TarWritableArchiveEntry : TarArchiveEntry, IWritableArchiv
string directoryPath,
DateTime? lastModified
)
: base(archive, null, CompressionType.None, archive.ReaderOptions)
: base(archive, null, CompressionType.None)
{
stream = null;
Key = directoryPath;
@@ -79,7 +79,7 @@ internal sealed class TarWritableArchiveEntry : TarArchiveEntry, IWritableArchiv
}
//ensure new stream is at the start, this could be reset
stream.Seek(0, SeekOrigin.Begin);
return SharpCompressStream.CreateNonDisposing(stream);
return SharpCompressStream.Create(stream, leaveOpen: true);
}
internal override void Close()

View File

@@ -1,137 +0,0 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Common.Options;
using SharpCompress.Common.Zip;
using SharpCompress.Common.Zip.Headers;
using SharpCompress.IO;
using SharpCompress.Readers;
using SharpCompress.Writers;
using SharpCompress.Writers.Zip;
namespace SharpCompress.Archives.Zip;
public partial class ZipArchive
{
protected override async IAsyncEnumerable<ZipArchiveEntry> LoadEntriesAsync(
IAsyncEnumerable<ZipVolume> volumes
)
{
var vols = await volumes.ToListAsync();
var volsArray = vols.ToArray();
await foreach (
var h in headerFactory.NotNull().ReadSeekableHeaderAsync(volsArray.Last().Stream)
)
{
if (h != null)
{
switch (h.ZipHeaderType)
{
case ZipHeaderType.DirectoryEntry:
{
var deh = (DirectoryEntryHeader)h;
Stream s;
if (
deh.RelativeOffsetOfEntryHeader + deh.CompressedSize
> volsArray[deh.DiskNumberStart].Stream.Length
)
{
var v = volsArray.Skip(deh.DiskNumberStart).ToArray();
s = new SourceStream(
v[0].Stream,
i => i < v.Length ? v[i].Stream : null,
new ReaderOptions() { LeaveStreamOpen = true }
);
}
else
{
s = volsArray[deh.DiskNumberStart].Stream;
}
yield return new ZipArchiveEntry(
this,
new SeekableZipFilePart(headerFactory.NotNull(), deh, s),
ReaderOptions
);
}
break;
case ZipHeaderType.DirectoryEnd:
{
var bytes = ((DirectoryEndHeader)h).Comment ?? Array.Empty<byte>();
volsArray.Last().Comment = ReaderOptions.ArchiveEncoding.Decode(bytes);
yield break;
}
}
}
}
}
protected override async ValueTask SaveToAsync(
Stream stream,
ZipWriterOptions options,
IAsyncEnumerable<ZipArchiveEntry> oldEntries,
IEnumerable<ZipArchiveEntry> newEntries,
CancellationToken cancellationToken = default
)
{
using var writer = new ZipWriter(
stream,
options as ZipWriterOptions ?? new ZipWriterOptions(options)
);
await foreach (
var entry in oldEntries.WithCancellation(cancellationToken).ConfigureAwait(false)
)
{
if (entry.IsDirectory)
{
await writer
.WriteDirectoryAsync(
entry.Key.NotNull("Entry Key is null"),
entry.LastModifiedTime,
cancellationToken
)
.ConfigureAwait(false);
}
else
{
using var entryStream = entry.OpenEntryStream();
await writer
.WriteAsync(
entry.Key.NotNull("Entry Key is null"),
entryStream,
cancellationToken
)
.ConfigureAwait(false);
}
}
foreach (var entry in newEntries)
{
if (entry.IsDirectory)
{
await writer
.WriteDirectoryAsync(
entry.Key.NotNull("Entry Key is null"),
entry.LastModifiedTime,
cancellationToken
)
.ConfigureAwait(false);
}
else
{
using var entryStream = entry.OpenEntryStream();
await writer
.WriteAsync(
entry.Key.NotNull("Entry Key is null"),
entryStream,
cancellationToken
)
.ConfigureAwait(false);
}
}
}
}

View File

@@ -1,272 +0,0 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Common.Zip;
using SharpCompress.Common.Zip.Headers;
using SharpCompress.IO;
using SharpCompress.Readers;
using SharpCompress.Writers.Zip;
namespace SharpCompress.Archives.Zip;
public partial class ZipArchive
#if NET8_0_OR_GREATER
: IWritableArchiveOpenable<ZipWriterOptions>,
IMultiArchiveOpenable<
IWritableArchive<ZipWriterOptions>,
IWritableAsyncArchive<ZipWriterOptions>
>
#endif
{
public static IWritableArchive<ZipWriterOptions> OpenArchive(
string filePath,
ReaderOptions? readerOptions = null
)
{
filePath.NotNullOrEmpty(nameof(filePath));
return OpenArchive(new FileInfo(filePath), readerOptions);
}
public static IWritableArchive<ZipWriterOptions> OpenArchive(
FileInfo fileInfo,
ReaderOptions? readerOptions = null
)
{
fileInfo.NotNull(nameof(fileInfo));
return new ZipArchive(
new SourceStream(
fileInfo,
i => ZipArchiveVolumeFactory.GetFilePart(i, fileInfo),
readerOptions ?? new ReaderOptions() { LeaveStreamOpen = false }
)
);
}
public static IWritableArchive<ZipWriterOptions> OpenArchive(
IEnumerable<FileInfo> fileInfos,
ReaderOptions? readerOptions = null
)
{
fileInfos.NotNull(nameof(fileInfos));
var files = fileInfos.ToArray();
return new ZipArchive(
new SourceStream(
files[0],
i => i < files.Length ? files[i] : null,
readerOptions ?? new ReaderOptions() { LeaveStreamOpen = false }
)
);
}
public static IWritableArchive<ZipWriterOptions> OpenArchive(
IEnumerable<Stream> streams,
ReaderOptions? readerOptions = null
)
{
streams.NotNull(nameof(streams));
var strms = streams.ToArray();
return new ZipArchive(
new SourceStream(
strms[0],
i => i < strms.Length ? strms[i] : null,
readerOptions ?? new ReaderOptions()
)
);
}
public static IWritableArchive<ZipWriterOptions> OpenArchive(
Stream stream,
ReaderOptions? readerOptions = null
)
{
stream.NotNull(nameof(stream));
if (stream is not { CanSeek: true })
{
throw new ArgumentException("Stream must be seekable", nameof(stream));
}
return new ZipArchive(
new SourceStream(stream, i => null, readerOptions ?? new ReaderOptions())
);
}
public static IWritableAsyncArchive<ZipWriterOptions> OpenAsyncArchive(
string path,
ReaderOptions? readerOptions = null
) => (IWritableAsyncArchive<ZipWriterOptions>)OpenArchive(path, readerOptions);
public static IWritableAsyncArchive<ZipWriterOptions> OpenAsyncArchive(
Stream stream,
ReaderOptions? readerOptions = null
) => (IWritableAsyncArchive<ZipWriterOptions>)OpenArchive(stream, readerOptions);
public static IWritableAsyncArchive<ZipWriterOptions> OpenAsyncArchive(
FileInfo fileInfo,
ReaderOptions? readerOptions = null
) => (IWritableAsyncArchive<ZipWriterOptions>)OpenArchive(fileInfo, readerOptions);
public static IWritableAsyncArchive<ZipWriterOptions> OpenAsyncArchive(
IReadOnlyList<Stream> streams,
ReaderOptions? readerOptions = null
) => (IWritableAsyncArchive<ZipWriterOptions>)OpenArchive(streams, readerOptions);
public static IWritableAsyncArchive<ZipWriterOptions> OpenAsyncArchive(
IReadOnlyList<FileInfo> fileInfos,
ReaderOptions? readerOptions = null
) => (IWritableAsyncArchive<ZipWriterOptions>)OpenArchive(fileInfos, readerOptions);
public static bool IsZipFile(string filePath, string? password = null) =>
IsZipFile(new FileInfo(filePath), password);
public static bool IsZipFile(FileInfo fileInfo, string? password = null)
{
if (!fileInfo.Exists)
{
return false;
}
using Stream stream = fileInfo.OpenRead();
return IsZipFile(stream, password);
}
public static bool IsZipFile(Stream stream, string? password = null)
{
var headerFactory = new StreamingZipHeaderFactory(password, new ArchiveEncoding(), null);
try
{
var header = headerFactory
.ReadStreamHeader(stream)
.FirstOrDefault(x => x.ZipHeaderType != ZipHeaderType.Split);
if (header is null)
{
return false;
}
return Enum.IsDefined(typeof(ZipHeaderType), header.ZipHeaderType);
}
catch (CryptographicException)
{
return true;
}
catch
{
return false;
}
}
public static bool IsZipMulti(Stream stream, string? password = null)
{
var headerFactory = new StreamingZipHeaderFactory(password, new ArchiveEncoding(), null);
try
{
var header = headerFactory
.ReadStreamHeader(stream)
.FirstOrDefault(x => x.ZipHeaderType != ZipHeaderType.Split);
if (header is null)
{
if (stream.CanSeek)
{
var z = new SeekableZipHeaderFactory(password, new ArchiveEncoding());
var x = z.ReadSeekableHeader(stream).FirstOrDefault();
return x?.ZipHeaderType == ZipHeaderType.DirectoryEntry;
}
else
{
return false;
}
}
return Enum.IsDefined(typeof(ZipHeaderType), header.ZipHeaderType);
}
catch (CryptographicException)
{
return true;
}
catch
{
return false;
}
}
public static async ValueTask<bool> IsZipFileAsync(
Stream stream,
string? password = null,
CancellationToken cancellationToken = default
)
{
cancellationToken.ThrowIfCancellationRequested();
var headerFactory = new StreamingZipHeaderFactory(password, new ArchiveEncoding(), null);
try
{
var header = await headerFactory
.ReadStreamHeaderAsync(stream)
.Where(x => x.ZipHeaderType != ZipHeaderType.Split)
.FirstOrDefaultAsync(cancellationToken);
if (header is null)
{
return false;
}
return Enum.IsDefined(typeof(ZipHeaderType), header.ZipHeaderType);
}
catch (CryptographicException)
{
return true;
}
catch
{
return false;
}
}
public static IWritableArchive<ZipWriterOptions> CreateArchive() => new ZipArchive();
public static IWritableAsyncArchive<ZipWriterOptions> CreateAsyncArchive() => new ZipArchive();
public static async ValueTask<bool> IsZipMultiAsync(
Stream stream,
string? password = null,
CancellationToken cancellationToken = default
)
{
cancellationToken.ThrowIfCancellationRequested();
var headerFactory = new StreamingZipHeaderFactory(password, new ArchiveEncoding(), null);
try
{
var header = headerFactory
.ReadStreamHeader(stream)
.FirstOrDefault(x => x.ZipHeaderType != ZipHeaderType.Split);
if (header is null)
{
if (stream.CanSeek)
{
var z = new SeekableZipHeaderFactory(password, new ArchiveEncoding());
ZipHeader? x = null;
await foreach (
var h in z.ReadSeekableHeaderAsync(stream)
.WithCancellation(cancellationToken)
)
{
x = h;
break;
}
return x?.ZipHeaderType == ZipHeaderType.DirectoryEntry;
}
else
{
return false;
}
}
return Enum.IsDefined(typeof(ZipHeaderType), header.ZipHeaderType);
}
catch (CryptographicException)
{
return true;
}
catch
{
return false;
}
}
}
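
A small sketch of the detection semantics above: a CryptographicException during header reading is treated as proof of a zip, since only a real zip gets far enough for decryption to fail, so encrypted archives still return true. The file name and password are placeholders.

using System;
using System.IO;
using SharpCompress.Archives.Zip;

using var stream = File.OpenRead("secret.zip"); // placeholder path
Console.WriteLine(ZipArchive.IsZipFile(stream));       // true even for encrypted headers
stream.Position = 0;
Console.WriteLine(ZipArchive.IsZipFile(stream, "pw")); // password lets headers decrypt cleanly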

View File

@@ -5,7 +5,6 @@ using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Common.Options;
using SharpCompress.Common.Zip;
using SharpCompress.Common.Zip.Headers;
using SharpCompress.Compressors.Deflate;
@@ -17,13 +16,21 @@ using SharpCompress.Writers.Zip;
namespace SharpCompress.Archives.Zip;
public partial class ZipArchive
: AbstractWritableArchive<ZipArchiveEntry, ZipVolume, ZipWriterOptions>
public class ZipArchive : AbstractWritableArchive<ZipArchiveEntry, ZipVolume>
{
private readonly SeekableZipHeaderFactory? headerFactory;
/// <summary>
/// Gets or sets the compression level applied to files added to the archive,
/// if the compression method is set to deflate
/// </summary>
public CompressionLevel DeflateCompressionLevel { get; set; }
/// <summary>
/// Constructor with a SourceStream able to handle FileInfo and Streams.
/// </summary>
/// <param name="sourceStream"></param>
/// <param name="options"></param>
internal ZipArchive(SourceStream sourceStream)
: base(ArchiveType.Zip, sourceStream) =>
headerFactory = new SeekableZipHeaderFactory(
@@ -31,43 +38,377 @@ public partial class ZipArchive
sourceStream.ReaderOptions.ArchiveEncoding
);
internal ZipArchive()
: base(ArchiveType.Zip) { }
/// <summary>
/// Opens an existing zip archive from a file path.
/// </summary>
/// <param name="filePath"></param>
/// <param name="readerOptions"></param>
public static ZipArchive Open(string filePath, ReaderOptions? readerOptions = null)
{
filePath.NotNullOrEmpty(nameof(filePath));
return Open(new FileInfo(filePath), readerOptions ?? new ReaderOptions());
}
/// <summary>
/// Opens an existing zip archive from a FileInfo.
/// </summary>
/// <param name="fileInfo"></param>
/// <param name="readerOptions"></param>
public static ZipArchive Open(FileInfo fileInfo, ReaderOptions? readerOptions = null)
{
fileInfo.NotNull(nameof(fileInfo));
return new ZipArchive(
new SourceStream(
fileInfo,
i => ZipArchiveVolumeFactory.GetFilePart(i, fileInfo),
readerOptions ?? new ReaderOptions()
)
);
}
/// <summary>
/// Opens a zip archive from all file parts passed in.
/// </summary>
/// <param name="fileInfos"></param>
/// <param name="readerOptions"></param>
public static ZipArchive Open(
IEnumerable<FileInfo> fileInfos,
ReaderOptions? readerOptions = null
)
{
fileInfos.NotNull(nameof(fileInfos));
var files = fileInfos.ToArray();
return new ZipArchive(
new SourceStream(
files[0],
i => i < files.Length ? files[i] : null,
readerOptions ?? new ReaderOptions()
)
);
}
/// <summary>
/// Opens a zip archive from all stream parts passed in.
/// </summary>
/// <param name="streams"></param>
/// <param name="readerOptions"></param>
public static ZipArchive Open(IEnumerable<Stream> streams, ReaderOptions? readerOptions = null)
{
streams.NotNull(nameof(streams));
var strms = streams.ToArray();
return new ZipArchive(
new SourceStream(
strms[0],
i => i < strms.Length ? strms[i] : null,
readerOptions ?? new ReaderOptions()
)
);
}
/// <summary>
/// Takes a seekable Stream as a source
/// </summary>
/// <param name="stream"></param>
/// <param name="readerOptions"></param>
public static ZipArchive Open(Stream stream, ReaderOptions? readerOptions = null)
{
stream.NotNull(nameof(stream));
if (stream is not { CanSeek: true })
{
throw new ArgumentException("Stream must be seekable", nameof(stream));
}
return new ZipArchive(
new SourceStream(stream, i => null, readerOptions ?? new ReaderOptions())
);
}
/// <summary>
/// Opens a ZipArchive asynchronously from a stream.
/// </summary>
/// <param name="stream"></param>
/// <param name="readerOptions"></param>
/// <param name="cancellationToken"></param>
public static ValueTask<IAsyncArchive> OpenAsync(
Stream stream,
ReaderOptions? readerOptions = null,
CancellationToken cancellationToken = default
)
{
cancellationToken.ThrowIfCancellationRequested();
return new(Open(stream, readerOptions));
}
/// <summary>
/// Opens a ZipArchive asynchronously from a FileInfo.
/// </summary>
/// <param name="fileInfo"></param>
/// <param name="readerOptions"></param>
/// <param name="cancellationToken"></param>
public static ValueTask<IAsyncArchive> OpenAsync(
FileInfo fileInfo,
ReaderOptions? readerOptions = null,
CancellationToken cancellationToken = default
)
{
cancellationToken.ThrowIfCancellationRequested();
return new(Open(fileInfo, readerOptions));
}
/// <summary>
/// Opens a ZipArchive asynchronously from multiple streams.
/// </summary>
/// <param name="streams"></param>
/// <param name="readerOptions"></param>
/// <param name="cancellationToken"></param>
public static ValueTask<IAsyncArchive> OpenAsync(
IReadOnlyList<Stream> streams,
ReaderOptions? readerOptions = null,
CancellationToken cancellationToken = default
)
{
cancellationToken.ThrowIfCancellationRequested();
return new(Open(streams, readerOptions));
}
/// <summary>
/// Opens a ZipArchive asynchronously from multiple FileInfo objects.
/// </summary>
/// <param name="fileInfos"></param>
/// <param name="readerOptions"></param>
/// <param name="cancellationToken"></param>
public static ValueTask<IAsyncArchive> OpenAsync(
IReadOnlyList<FileInfo> fileInfos,
ReaderOptions? readerOptions = null,
CancellationToken cancellationToken = default
)
{
cancellationToken.ThrowIfCancellationRequested();
return new(Open(fileInfos, readerOptions));
}
public static bool IsZipFile(
string filePath,
string? password = null,
int bufferSize = ReaderOptions.DefaultBufferSize
) => IsZipFile(new FileInfo(filePath), password, bufferSize);
public static bool IsZipFile(
FileInfo fileInfo,
string? password = null,
int bufferSize = ReaderOptions.DefaultBufferSize
)
{
if (!fileInfo.Exists)
{
return false;
}
using Stream stream = fileInfo.OpenRead();
return IsZipFile(stream, password, bufferSize);
}
public static bool IsZipFile(
Stream stream,
string? password = null,
int bufferSize = ReaderOptions.DefaultBufferSize
)
{
var headerFactory = new StreamingZipHeaderFactory(password, new ArchiveEncoding(), null);
try
{
if (stream is not SharpCompressStream)
{
stream = new SharpCompressStream(stream, bufferSize: bufferSize);
}
var header = headerFactory
.ReadStreamHeader(stream)
.FirstOrDefault(x => x.ZipHeaderType != ZipHeaderType.Split);
if (header is null)
{
return false;
}
return Enum.IsDefined(typeof(ZipHeaderType), header.ZipHeaderType);
}
catch (CryptographicException)
{
return true;
}
catch
{
return false;
}
}
public static bool IsZipMulti(
Stream stream,
string? password = null,
int bufferSize = ReaderOptions.DefaultBufferSize
)
{
var headerFactory = new StreamingZipHeaderFactory(password, new ArchiveEncoding(), null);
try
{
if (stream is not SharpCompressStream)
{
stream = new SharpCompressStream(stream, bufferSize: bufferSize);
}
var header = headerFactory
.ReadStreamHeader(stream)
.FirstOrDefault(x => x.ZipHeaderType != ZipHeaderType.Split);
if (header is null)
{
if (stream.CanSeek) //could be multipart. Test for central directory - might not be Zip64-safe
{
var z = new SeekableZipHeaderFactory(password, new ArchiveEncoding());
var x = z.ReadSeekableHeader(stream, useSync: true).FirstOrDefault();
return x?.ZipHeaderType == ZipHeaderType.DirectoryEntry;
}
else
{
return false;
}
}
return Enum.IsDefined(typeof(ZipHeaderType), header.ZipHeaderType);
}
catch (CryptographicException)
{
return true;
}
catch
{
return false;
}
}
public static async ValueTask<bool> IsZipFileAsync(
Stream stream,
string? password = null,
int bufferSize = ReaderOptions.DefaultBufferSize,
CancellationToken cancellationToken = default
)
{
cancellationToken.ThrowIfCancellationRequested();
var headerFactory = new StreamingZipHeaderFactory(password, new ArchiveEncoding(), null);
try
{
if (stream is not SharpCompressStream)
{
stream = new SharpCompressStream(stream, bufferSize: bufferSize);
}
var header = await headerFactory
.ReadStreamHeaderAsync(stream)
.Where(x => x.ZipHeaderType != ZipHeaderType.Split)
.FirstOrDefaultAsync();
if (header is null)
{
return false;
}
return Enum.IsDefined(typeof(ZipHeaderType), header.ZipHeaderType);
}
catch (CryptographicException)
{
return true;
}
catch
{
return false;
}
}
public static async ValueTask<bool> IsZipMultiAsync(
Stream stream,
string? password = null,
int bufferSize = ReaderOptions.DefaultBufferSize,
CancellationToken cancellationToken = default
)
{
cancellationToken.ThrowIfCancellationRequested();
var headerFactory = new StreamingZipHeaderFactory(password, new ArchiveEncoding(), null);
try
{
if (stream is not SharpCompressStream)
{
stream = new SharpCompressStream(stream, bufferSize: bufferSize);
}
var header = headerFactory
.ReadStreamHeader(stream)
.FirstOrDefault(x => x.ZipHeaderType != ZipHeaderType.Split);
if (header is null)
{
if (stream.CanSeek) //could be multipart. Test for central directory - might not be Zip64-safe
{
var z = new SeekableZipHeaderFactory(password, new ArchiveEncoding());
ZipHeader? x = null;
await foreach (
var h in z.ReadSeekableHeaderAsync(stream)
.WithCancellation(cancellationToken)
)
{
x = h;
break;
}
return x?.ZipHeaderType == ZipHeaderType.DirectoryEntry;
}
else
{
return false;
}
}
return Enum.IsDefined(typeof(ZipHeaderType), header.ZipHeaderType);
}
catch (CryptographicException)
{
return true;
}
catch
{
return false;
}
}
protected override IEnumerable<ZipVolume> LoadVolumes(SourceStream stream)
{
stream.LoadAllParts();
//stream.Position = 0;
stream.LoadAllParts(); //request all streams
stream.Position = 0;
var streams = stream.Streams.ToList();
var idx = 0;
if (streams.Count() > 1)
if (streams.Count() > 1) //probe part 2: true = multipart (.z01 style), not split
{
//check if second stream is zip header without changing position
var headerProbeStream = streams[1];
var startPosition = headerProbeStream.Position;
headerProbeStream.Position = startPosition + 4;
var isZip = IsZipFile(headerProbeStream, ReaderOptions.Password);
headerProbeStream.Position = startPosition;
streams[1].Position += 4; //skip the POST_DATA_DESCRIPTOR to prevent an exception
var isZip = IsZipFile(streams[1], ReaderOptions.Password, ReaderOptions.BufferSize);
streams[1].Position -= 4;
if (isZip)
{
stream.IsVolumes = true;
var tmp = streams[0];
var tmp = streams[0]; //archives named .zip, .z01 ... move the .zip to the end
streams.RemoveAt(0);
streams.Add(tmp);
//streams[0].Position = 4; //skip the POST_DATA_DESCRIPTOR to prevent an exception
return streams.Select(a => new ZipVolume(a, ReaderOptions, idx++));
}
}
//split mode or single file
return new ZipVolume(stream, ReaderOptions, idx++).AsEnumerable();
}
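
A usage sketch for the multipart handling above: opening the .zip part by FileInfo lets ZipArchiveVolumeFactory discover the sibling .z01/.z02 parts, and LoadVolumes then moves the .zip (which holds the central directory) to the end of the stream list. File names are placeholders.

using System;
using System.IO;
using SharpCompress.Archives.Zip;

using var archive = ZipArchive.Open(new FileInfo("backup.zip")); // backup.z01, backup.z02 found automatically
foreach (var entry in archive.Entries)
{
    Console.WriteLine(entry.Key); // entries resolve across volume boundaries
}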
internal ZipArchive()
: base(ArchiveType.Zip) { }
protected override IEnumerable<ZipArchiveEntry> LoadEntries(IEnumerable<ZipVolume> volumes)
{
var vols = volumes.ToArray();
foreach (var h in headerFactory.NotNull().ReadSeekableHeader(vols.Last().Stream))
foreach (
var h in headerFactory.NotNull().ReadSeekableHeader(vols.Last().Stream, useSync: true)
)
{
if (h != null)
{
@@ -96,8 +437,7 @@ public partial class ZipArchive
yield return new ZipArchiveEntry(
this,
new SeekableZipFilePart(headerFactory.NotNull(), deh, s),
ReaderOptions
new SeekableZipFilePart(headerFactory.NotNull(), deh, s)
);
}
break;
@@ -112,20 +452,69 @@ public partial class ZipArchive
}
}
public void SaveTo(Stream stream) =>
SaveTo(stream, new ZipWriterOptions(CompressionType.Deflate));
protected override async IAsyncEnumerable<ZipArchiveEntry> LoadEntriesAsync(
IAsyncEnumerable<ZipVolume> volumes
)
{
var vols = await volumes.ToListAsync();
var volsArray = vols.ToArray();
await foreach (
var h in headerFactory.NotNull().ReadSeekableHeaderAsync(volsArray.Last().Stream)
)
{
if (h != null)
{
switch (h.ZipHeaderType)
{
case ZipHeaderType.DirectoryEntry:
{
var deh = (DirectoryEntryHeader)h;
Stream s;
if (
deh.RelativeOffsetOfEntryHeader + deh.CompressedSize
> volsArray[deh.DiskNumberStart].Stream.Length
)
{
var v = volsArray.Skip(deh.DiskNumberStart).ToArray();
s = new SourceStream(
v[0].Stream,
i => i < v.Length ? v[i].Stream : null,
new ReaderOptions() { LeaveStreamOpen = true }
);
}
else
{
s = volsArray[deh.DiskNumberStart].Stream;
}
yield return new ZipArchiveEntry(
this,
new SeekableZipFilePart(headerFactory.NotNull(), deh, s)
);
}
break;
case ZipHeaderType.DirectoryEnd:
{
var bytes = ((DirectoryEndHeader)h).Comment ?? Array.Empty<byte>();
volsArray.Last().Comment = ReaderOptions.ArchiveEncoding.Decode(bytes);
yield break;
}
}
}
}
}
public void SaveTo(Stream stream) => SaveTo(stream, new WriterOptions(CompressionType.Deflate));
protected override void SaveTo(
Stream stream,
ZipWriterOptions options,
WriterOptions options,
IEnumerable<ZipArchiveEntry> oldEntries,
IEnumerable<ZipArchiveEntry> newEntries
)
{
using var writer = new ZipWriter(
stream,
options as ZipWriterOptions ?? new ZipWriterOptions(options)
);
using var writer = new ZipWriter(stream, new ZipWriterOptions(options));
foreach (var entry in oldEntries.Concat(newEntries))
{
if (entry.IsDirectory)
@@ -147,6 +536,41 @@ public partial class ZipArchive
}
}
protected override async ValueTask SaveToAsync(
Stream stream,
WriterOptions options,
IEnumerable<ZipArchiveEntry> oldEntries,
IEnumerable<ZipArchiveEntry> newEntries,
CancellationToken cancellationToken = default
)
{
using var writer = new ZipWriter(stream, new ZipWriterOptions(options));
foreach (var entry in oldEntries.Concat(newEntries))
{
if (entry.IsDirectory)
{
await writer
.WriteDirectoryAsync(
entry.Key.NotNull("Entry Key is null"),
entry.LastModifiedTime,
cancellationToken
)
.ConfigureAwait(false);
}
else
{
using var entryStream = entry.OpenEntryStream();
await writer
.WriteAsync(
entry.Key.NotNull("Entry Key is null"),
entryStream,
cancellationToken
)
.ConfigureAwait(false);
}
}
}
protected override ZipArchiveEntry CreateEntryInternal(
string filePath,
Stream source,
@@ -160,17 +584,19 @@ public partial class ZipArchive
DateTime? modified
) => new ZipWritableArchiveEntry(this, directoryPath, modified);
public static ZipArchive Create() => new();
protected override IReader CreateReaderForSolidExtraction()
{
var stream = Volumes.Single().Stream;
//stream.Position = 0;
return ZipReader.OpenReader(stream, ReaderOptions, Entries);
((IStreamStack)stream).StackSeek(0);
return ZipReader.Open(stream, ReaderOptions, Entries);
}
protected override ValueTask<IAsyncReader> CreateReaderForSolidExtractionAsync()
{
var stream = Volumes.Single().Stream;
stream.Position = 0;
return new((IAsyncReader)ZipReader.OpenReader(stream));
return new(ZipReader.Open(stream));
}
}
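
A hedged sketch of the async open path added in this file. OpenAsync appears above; the entry-enumeration member on IAsyncArchive is illustrative only (EntriesAsync is an assumed name, not confirmed by this diff).

using System;
using System.IO;
using System.Threading.Tasks;
using SharpCompress.Archives;
using SharpCompress.Archives.Zip;

public static class ZipListExample
{
    public static async Task ListAsync(string path)
    {
        await using var stream = File.OpenRead(path); // placeholder path
        var archive = await ZipArchive.OpenAsync(stream);
        await foreach (var entry in archive.EntriesAsync()) // assumed member name
        {
            Console.WriteLine(entry.Key);
        }
    }
}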

View File

@@ -1,22 +0,0 @@
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common.Zip;
namespace SharpCompress.Archives.Zip;
public partial class ZipArchiveEntry
{
public async ValueTask<Stream> OpenEntryStreamAsync(
CancellationToken cancellationToken = default
)
{
var part = Parts.Single();
if (part is SeekableZipFilePart seekablePart)
{
return (await seekablePart.GetCompressedStreamAsync(cancellationToken)).NotNull();
}
return OpenEntryStream();
}
}

View File

@@ -1,23 +1,30 @@
using System.IO;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common.Options;
using SharpCompress.Common.Zip;
namespace SharpCompress.Archives.Zip;
public partial class ZipArchiveEntry : ZipEntry, IArchiveEntry
public class ZipArchiveEntry : ZipEntry, IArchiveEntry
{
internal ZipArchiveEntry(
ZipArchive archive,
SeekableZipFilePart? part,
IReaderOptions readerOptions
)
: base(part, readerOptions) => Archive = archive;
internal ZipArchiveEntry(ZipArchive archive, SeekableZipFilePart? part)
: base(part) => Archive = archive;
public virtual Stream OpenEntryStream() => Parts.Single().GetCompressedStream().NotNull();
public async ValueTask<Stream> OpenEntryStreamAsync(
CancellationToken cancellationToken = default
)
{
var part = Parts.Single();
if (part is SeekableZipFilePart seekablePart)
{
return (await seekablePart.GetCompressedStreamAsync(cancellationToken)).NotNull();
}
return OpenEntryStream();
}
#region IArchiveEntry Members
public IArchive Archive { get; }

View File

@@ -14,7 +14,6 @@ internal static class ZipArchiveVolumeFactory
//new style .zip, .z01... | .zipx, .zx01 - numbering continues past 99 as 100...1000 etc
var m = Regex.Match(part1.Name, @"^(.*\.)(zipx?|zx?[0-9]+)$", RegexOptions.IgnoreCase);
if (m.Success)
{
item = new FileInfo(
Path.Combine(
part1.DirectoryName!,
@@ -25,16 +24,11 @@ internal static class ZipArchiveVolumeFactory
)
)
);
}
else //split - 001, 002 ...
{
return ArchiveVolumeFactory.GetFilePart(index, part1);
}
if (item != null && item.Exists)
{
return item;
}
return null; //no more items
}

View File

@@ -21,7 +21,7 @@ internal class ZipWritableArchiveEntry : ZipArchiveEntry, IWritableArchiveEntry
DateTime? lastModified,
bool closeStream
)
: base(archive, null, archive.ReaderOptions)
: base(archive, null)
{
this.stream = stream;
Key = path;
@@ -36,7 +36,7 @@ internal class ZipWritableArchiveEntry : ZipArchiveEntry, IWritableArchiveEntry
string directoryPath,
DateTime? lastModified
)
: base(archive, null, archive.ReaderOptions)
: base(archive, null)
{
stream = null;
Key = directoryPath;
@@ -80,7 +80,7 @@ internal class ZipWritableArchiveEntry : ZipArchiveEntry, IWritableArchiveEntry
}
//ensure the new stream is at the start; its position may have been moved
stream.Seek(0, SeekOrigin.Begin);
return SharpCompressStream.CreateNonDisposing(stream);
return SharpCompressStream.Create(stream, leaveOpen: true);
}
internal override void Close()

View File

@@ -4,61 +4,58 @@ using System.Linq;
using System.Text;
using System.Threading.Tasks;
namespace SharpCompress.Common.Ace;
public class AceCrc
namespace SharpCompress.Common.Ace
{
// CRC-32 lookup table (standard polynomial 0xEDB88320, reflected)
private static readonly uint[] Crc32Table = GenerateTable();
private static uint[] GenerateTable()
public class AceCrc
{
var table = new uint[256];
// CRC-32 lookup table (standard polynomial 0xEDB88320, reflected)
private static readonly uint[] Crc32Table = GenerateTable();
for (int i = 0; i < 256; i++)
private static uint[] GenerateTable()
{
uint crc = (uint)i;
var table = new uint[256];
for (int j = 0; j < 8; j++)
for (int i = 0; i < 256; i++)
{
if ((crc & 1) != 0)
uint crc = (uint)i;
for (int j = 0; j < 8; j++)
{
crc = (crc >> 1) ^ 0xEDB88320u;
}
else
{
crc >>= 1;
if ((crc & 1) != 0)
crc = (crc >> 1) ^ 0xEDB88320u;
else
crc >>= 1;
}
table[i] = crc;
}
table[i] = crc;
return table;
}
return table;
}
/// <summary>
/// Calculate ACE CRC-32 checksum.
/// ACE CRC-32 uses standard CRC-32 polynomial (0xEDB88320, reflected)
/// with init=0xFFFFFFFF but NO final XOR.
/// </summary>
public static uint AceCrc32(ReadOnlySpan<byte> data)
{
uint crc = 0xFFFFFFFFu;
foreach (byte b in data)
/// <summary>
/// Calculate ACE CRC-32 checksum.
/// ACE CRC-32 uses standard CRC-32 polynomial (0xEDB88320, reflected)
/// with init=0xFFFFFFFF but NO final XOR.
/// </summary>
public static uint AceCrc32(ReadOnlySpan<byte> data)
{
crc = (crc >> 8) ^ Crc32Table[(crc ^ b) & 0xFF];
uint crc = 0xFFFFFFFFu;
foreach (byte b in data)
{
crc = (crc >> 8) ^ Crc32Table[(crc ^ b) & 0xFF];
}
return crc; // No final XOR for ACE
}
return crc; // No final XOR for ACE
}
/// <summary>
/// ACE CRC-16 is the lower 16 bits of the ACE CRC-32.
/// </summary>
public static ushort AceCrc16(ReadOnlySpan<byte> data)
{
return (ushort)(AceCrc32(data) & 0xFFFF);
/// <summary>
/// ACE CRC-16 is the lower 16 bits of the ACE CRC-32.
/// </summary>
public static ushort AceCrc16(ReadOnlySpan<byte> data)
{
return (ushort)(AceCrc32(data) & 0xFFFF);
}
}
}
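Downstream, header validation uses the 16-bit variant: the reader checksums the header body and compares it to the stored value. A minimal sketch of that check, with hypothetical inputs:

```csharp
using System.IO;

// body: header bytes read from the archive; storedCrc: the on-disk HeaderCrc.
// Both values here are hypothetical stand-ins for what ReadHeader supplies.
static void VerifyAceHeader(byte[] body, ushort storedCrc)
{
    if (AceCrc.AceCrc16(body) != storedCrc)
    {
        throw new InvalidDataException("Header checksum is invalid");
    }
}
```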

View File

@@ -5,65 +5,64 @@ using System.Linq;
using System.Text;
using System.Threading.Tasks;
using SharpCompress.Common.Ace.Headers;
using SharpCompress.Common.Options;
namespace SharpCompress.Common.Ace;
public class AceEntry : Entry
namespace SharpCompress.Common.Ace
{
private readonly AceFilePart _filePart;
internal AceEntry(AceFilePart filePart, IReaderOptions readerOptions)
: base(readerOptions)
public class AceEntry : Entry
{
_filePart = filePart;
}
private readonly AceFilePart _filePart;
public override long Crc
{
get
internal AceEntry(AceFilePart filePart)
{
if (_filePart == null)
{
return 0;
}
return _filePart.Header.Crc32;
_filePart = filePart;
}
}
public override string? Key => _filePart?.Header.Filename;
public override string? LinkTarget => null;
public override long CompressedSize => _filePart?.Header.PackedSize ?? 0;
public override CompressionType CompressionType
{
get
public override long Crc
{
if (_filePart.Header.CompressionType == Headers.CompressionType.Stored)
get
{
return CompressionType.None;
if (_filePart == null)
{
return 0;
}
return _filePart.Header.Crc32;
}
return CompressionType.AceLZ77;
}
public override string? Key => _filePart?.Header.Filename;
public override string? LinkTarget => null;
public override long CompressedSize => _filePart?.Header.PackedSize ?? 0;
public override CompressionType CompressionType
{
get
{
if (_filePart.Header.CompressionType == Headers.CompressionType.Stored)
{
return CompressionType.None;
}
return CompressionType.AceLZ77;
}
}
public override long Size => _filePart?.Header.OriginalSize ?? 0;
public override DateTime? LastModifiedTime => _filePart.Header.DateTime;
public override DateTime? CreatedTime => null;
public override DateTime? LastAccessedTime => null;
public override DateTime? ArchivedTime => null;
public override bool IsEncrypted => _filePart.Header.IsFileEncrypted;
public override bool IsDirectory => _filePart.Header.IsDirectory;
public override bool IsSplitAfter => false;
internal override IEnumerable<FilePart> Parts => _filePart.Empty();
}
public override long Size => _filePart?.Header.OriginalSize ?? 0;
public override DateTime? LastModifiedTime => _filePart.Header.DateTime;
public override DateTime? CreatedTime => null;
public override DateTime? LastAccessedTime => null;
public override DateTime? ArchivedTime => null;
public override bool IsEncrypted => _filePart.Header.IsFileEncrypted;
public override bool IsDirectory => _filePart.Header.IsDirectory;
public override bool IsSplitAfter => false;
internal override IEnumerable<FilePart> Parts => _filePart.Empty();
}

View File

@@ -7,45 +7,46 @@ using System.Threading.Tasks;
using SharpCompress.Common.Ace.Headers;
using SharpCompress.IO;
namespace SharpCompress.Common.Ace;
public class AceFilePart : FilePart
namespace SharpCompress.Common.Ace
{
private readonly Stream _stream;
internal AceFileHeader Header { get; set; }
internal AceFilePart(AceFileHeader localAceHeader, Stream seekableStream)
: base(localAceHeader.ArchiveEncoding)
public class AceFilePart : FilePart
{
_stream = seekableStream;
Header = localAceHeader;
}
private readonly Stream _stream;
internal AceFileHeader Header { get; set; }
internal override string? FilePartName => Header.Filename;
internal override Stream GetCompressedStream()
{
if (_stream != null)
internal AceFilePart(AceFileHeader localAceHeader, Stream seekableStream)
: base(localAceHeader.ArchiveEncoding)
{
Stream compressedStream;
switch (Header.CompressionType)
{
case Headers.CompressionType.Stored:
compressedStream = new ReadOnlySubStream(
_stream,
Header.DataStartPosition,
Header.PackedSize
);
break;
default:
throw new NotSupportedException(
"CompressionMethod: " + Header.CompressionQuality
);
}
return compressedStream;
_stream = seekableStream;
Header = localAceHeader;
}
return _stream.NotNull();
}
internal override Stream? GetRawStream() => _stream;
internal override string? FilePartName => Header.Filename;
internal override Stream GetCompressedStream()
{
if (_stream != null)
{
Stream compressedStream;
switch (Header.CompressionType)
{
case Headers.CompressionType.Stored:
compressedStream = new ReadOnlySubStream(
_stream,
Header.DataStartPosition,
Header.PackedSize
);
break;
default:
throw new NotSupportedException(
"CompressionMethod: " + Header.CompressionQuality
);
}
return compressedStream;
}
return _stream.NotNull();
}
internal override Stream? GetRawStream() => _stream;
}
}

View File

@@ -7,28 +7,29 @@ using System.Threading.Tasks;
using SharpCompress.Common.Arj;
using SharpCompress.Readers;
namespace SharpCompress.Common.Ace;
public class AceVolume : Volume
namespace SharpCompress.Common.Ace
{
public AceVolume(Stream stream, ReaderOptions readerOptions, int index = 0)
: base(stream, readerOptions, index) { }
public override bool IsFirstVolume
public class AceVolume : Volume
{
get { return true; }
}
public AceVolume(Stream stream, ReaderOptions readerOptions, int index = 0)
: base(stream, readerOptions, index) { }
/// <summary>
/// Whether this volume is part of a multi-volume archive.
/// </summary>
public override bool IsMultiVolume
{
get { return false; }
}
public override bool IsFirstVolume
{
get { return true; }
}
internal IEnumerable<AceFilePart> GetVolumeFileParts()
{
return new List<AceFilePart>();
/// <summary>
/// Whether this volume is part of a multi-volume archive.
/// </summary>
public override bool IsMultiVolume
{
get { return false; }
}
internal IEnumerable<AceFilePart> GetVolumeFileParts()
{
return new List<AceFilePart>();
}
}
}

View File

@@ -1,111 +0,0 @@
using System;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common.Arc;
namespace SharpCompress.Common.Ace.Headers;
public sealed partial class AceFileHeader
{
/// <summary>
/// Asynchronously reads the next file entry header from the stream.
/// Returns null if no more entries or end of archive.
/// Supports both ACE 1.0 and ACE 2.0 formats.
/// </summary>
public override async ValueTask<AceHeader?> ReadAsync(
Stream stream,
CancellationToken cancellationToken = default
)
{
var headerData = await ReadHeaderAsync(stream, cancellationToken);
if (headerData.Length == 0)
{
return null;
}
int offset = 0;
// Header type (1 byte)
HeaderType = headerData[offset++];
// Skip recovery record headers (ACE 2.0 feature)
if (HeaderType == (byte)SharpCompress.Common.Ace.Headers.AceHeaderType.RECOVERY32)
{
// Skip to next header
return null;
}
if (HeaderType != (byte)SharpCompress.Common.Ace.Headers.AceHeaderType.FILE)
{
// Unknown header type - skip
return null;
}
// Header flags (2 bytes)
HeaderFlags = BitConverter.ToUInt16(headerData, offset);
offset += 2;
// Packed size (4 bytes)
PackedSize = BitConverter.ToUInt32(headerData, offset);
offset += 4;
// Original size (4 bytes)
OriginalSize = BitConverter.ToUInt32(headerData, offset);
offset += 4;
// File date/time in DOS format (4 bytes)
var dosDateTime = BitConverter.ToUInt32(headerData, offset);
DateTime = ConvertDosDateTime(dosDateTime);
offset += 4;
// File attributes (4 bytes)
Attributes = (int)BitConverter.ToUInt32(headerData, offset);
offset += 4;
// CRC32 (4 bytes)
Crc32 = BitConverter.ToUInt32(headerData, offset);
offset += 4;
// Compression type (1 byte)
byte compressionType = headerData[offset++];
CompressionType = GetCompressionType(compressionType);
// Compression quality/parameter (1 byte)
byte compressionQuality = headerData[offset++];
CompressionQuality = GetCompressionQuality(compressionQuality);
// Parameters (2 bytes)
Parameters = BitConverter.ToUInt16(headerData, offset);
offset += 2;
// Reserved (2 bytes) - skip
offset += 2;
// Filename length (2 bytes)
var filenameLength = BitConverter.ToUInt16(headerData, offset);
offset += 2;
// Filename
if (offset + filenameLength <= headerData.Length)
{
Filename = ArchiveEncoding.Decode(headerData, offset, filenameLength);
offset += filenameLength;
}
// Handle comment if present
if ((HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.COMMENT) != 0)
{
// Comment length (2 bytes)
if (offset + 2 <= headerData.Length)
{
ushort commentLength = BitConverter.ToUInt16(headerData, offset);
offset += 2 + commentLength; // Skip comment
}
}
// Store the data start position
DataStartPosition = stream.Position;
return this;
}
}

View File

@@ -2,173 +2,170 @@ using System;
using System.Buffers.Binary;
using System.Collections.Generic;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using System.Xml.Linq;
using SharpCompress.Common.Arc;
namespace SharpCompress.Common.Ace.Headers;
/// <summary>
/// ACE file entry header
/// </summary>
public sealed partial class AceFileHeader : AceHeader
namespace SharpCompress.Common.Ace.Headers
{
public long DataStartPosition { get; private set; }
public long PackedSize { get; set; }
public long OriginalSize { get; set; }
public DateTime DateTime { get; set; }
public int Attributes { get; set; }
public uint Crc32 { get; set; }
public CompressionType CompressionType { get; set; }
public CompressionQuality CompressionQuality { get; set; }
public ushort Parameters { get; set; }
public string Filename { get; set; } = string.Empty;
public List<byte> Comment { get; set; } = new();
/// <summary>
/// File data offset in the archive
/// ACE file entry header
/// </summary>
public ulong DataOffset { get; set; }
public bool IsDirectory => (Attributes & 0x10) != 0;
public bool IsContinuedFromPrev =>
(HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.CONTINUED_PREV) != 0;
public bool IsContinuedToNext =>
(HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.CONTINUED_NEXT) != 0;
public int DictionarySize
public sealed class AceFileHeader : AceHeader
{
get
public long DataStartPosition { get; private set; }
public long PackedSize { get; set; }
public long OriginalSize { get; set; }
public DateTime DateTime { get; set; }
public int Attributes { get; set; }
public uint Crc32 { get; set; }
public CompressionType CompressionType { get; set; }
public CompressionQuality CompressionQuality { get; set; }
public ushort Parameters { get; set; }
public string Filename { get; set; } = string.Empty;
public List<byte> Comment { get; set; } = new();
/// <summary>
/// File data offset in the archive
/// </summary>
public ulong DataOffset { get; set; }
public bool IsDirectory => (Attributes & 0x10) != 0;
public bool IsContinuedFromPrev =>
(HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.CONTINUED_PREV) != 0;
public bool IsContinuedToNext =>
(HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.CONTINUED_NEXT) != 0;
public int DictionarySize
{
int bits = Parameters & 0x0F;
return bits < 10 ? 1024 : 1 << bits;
}
}
public AceFileHeader(IArchiveEncoding archiveEncoding)
: base(archiveEncoding, AceHeaderType.FILE) { }
/// <summary>
/// Reads the next file entry header from the stream.
/// Returns null if no more entries or end of archive.
/// Supports both ACE 1.0 and ACE 2.0 formats.
/// </summary>
public override AceHeader? Read(Stream stream)
{
var headerData = ReadHeader(stream);
if (headerData.Length == 0)
{
return null;
}
int offset = 0;
// Header type (1 byte)
HeaderType = headerData[offset++];
// Skip recovery record headers (ACE 2.0 feature)
if (HeaderType == (byte)SharpCompress.Common.Ace.Headers.AceHeaderType.RECOVERY32)
{
// Skip to next header
return null;
}
if (HeaderType != (byte)SharpCompress.Common.Ace.Headers.AceHeaderType.FILE)
{
// Unknown header type - skip
return null;
}
// Header flags (2 bytes)
HeaderFlags = BitConverter.ToUInt16(headerData, offset);
offset += 2;
// Packed size (4 bytes)
PackedSize = BitConverter.ToUInt32(headerData, offset);
offset += 4;
// Original size (4 bytes)
OriginalSize = BitConverter.ToUInt32(headerData, offset);
offset += 4;
// File date/time in DOS format (4 bytes)
var dosDateTime = BitConverter.ToUInt32(headerData, offset);
DateTime = ConvertDosDateTime(dosDateTime);
offset += 4;
// File attributes (4 bytes)
Attributes = (int)BitConverter.ToUInt32(headerData, offset);
offset += 4;
// CRC32 (4 bytes)
Crc32 = BitConverter.ToUInt32(headerData, offset);
offset += 4;
// Compression type (1 byte)
byte compressionType = headerData[offset++];
CompressionType = GetCompressionType(compressionType);
// Compression quality/parameter (1 byte)
byte compressionQuality = headerData[offset++];
CompressionQuality = GetCompressionQuality(compressionQuality);
// Parameters (2 bytes)
Parameters = BitConverter.ToUInt16(headerData, offset);
offset += 2;
// Reserved (2 bytes) - skip
offset += 2;
// Filename length (2 bytes)
var filenameLength = BitConverter.ToUInt16(headerData, offset);
offset += 2;
// Filename
if (offset + filenameLength <= headerData.Length)
{
Filename = ArchiveEncoding.Decode(headerData, offset, filenameLength);
offset += filenameLength;
}
// Handle comment if present
if ((HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.COMMENT) != 0)
{
// Comment length (2 bytes)
if (offset + 2 <= headerData.Length)
get
{
ushort commentLength = BitConverter.ToUInt16(headerData, offset);
offset += 2 + commentLength; // Skip comment
int bits = Parameters & 0x0F;
return bits < 10 ? 1024 : 1 << bits;
}
}
// Store the data start position
DataStartPosition = stream.Position;
public AceFileHeader(IArchiveEncoding archiveEncoding)
: base(archiveEncoding, AceHeaderType.FILE) { }
return this;
/// <summary>
/// Reads the next file entry header from the stream.
/// Returns null if no more entries or end of archive.
/// Supports both ACE 1.0 and ACE 2.0 formats.
/// </summary>
public override AceHeader? Read(Stream stream)
{
var headerData = ReadHeader(stream);
if (headerData.Length == 0)
{
return null;
}
int offset = 0;
// Header type (1 byte)
HeaderType = headerData[offset++];
// Skip recovery record headers (ACE 2.0 feature)
if (HeaderType == (byte)SharpCompress.Common.Ace.Headers.AceHeaderType.RECOVERY32)
{
// Skip to next header
return null;
}
if (HeaderType != (byte)SharpCompress.Common.Ace.Headers.AceHeaderType.FILE)
{
// Unknown header type - skip
return null;
}
// Header flags (2 bytes)
HeaderFlags = BitConverter.ToUInt16(headerData, offset);
offset += 2;
// Packed size (4 bytes)
PackedSize = BitConverter.ToUInt32(headerData, offset);
offset += 4;
// Original size (4 bytes)
OriginalSize = BitConverter.ToUInt32(headerData, offset);
offset += 4;
// File date/time in DOS format (4 bytes)
var dosDateTime = BitConverter.ToUInt32(headerData, offset);
DateTime = ConvertDosDateTime(dosDateTime);
offset += 4;
// File attributes (4 bytes)
Attributes = (int)BitConverter.ToUInt32(headerData, offset);
offset += 4;
// CRC32 (4 bytes)
Crc32 = BitConverter.ToUInt32(headerData, offset);
offset += 4;
// Compression type (1 byte)
byte compressionType = headerData[offset++];
CompressionType = GetCompressionType(compressionType);
// Compression quality/parameter (1 byte)
byte compressionQuality = headerData[offset++];
CompressionQuality = GetCompressionQuality(compressionQuality);
// Parameters (2 bytes)
Parameters = BitConverter.ToUInt16(headerData, offset);
offset += 2;
// Reserved (2 bytes) - skip
offset += 2;
// Filename length (2 bytes)
var filenameLength = BitConverter.ToUInt16(headerData, offset);
offset += 2;
// Filename
if (offset + filenameLength <= headerData.Length)
{
Filename = ArchiveEncoding.Decode(headerData, offset, filenameLength);
offset += filenameLength;
}
// Handle comment if present
if ((HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.COMMENT) != 0)
{
// Comment length (2 bytes)
if (offset + 2 <= headerData.Length)
{
ushort commentLength = BitConverter.ToUInt16(headerData, offset);
offset += 2 + commentLength; // Skip comment
}
}
// Store the data start position
DataStartPosition = stream.Position;
return this;
}
public CompressionType GetCompressionType(byte value) =>
value switch
{
0 => CompressionType.Stored,
1 => CompressionType.Lz77,
2 => CompressionType.Blocked,
_ => CompressionType.Unknown,
};
public CompressionQuality GetCompressionQuality(byte value) =>
value switch
{
0 => CompressionQuality.None,
1 => CompressionQuality.Fastest,
2 => CompressionQuality.Fast,
3 => CompressionQuality.Normal,
4 => CompressionQuality.Good,
5 => CompressionQuality.Best,
_ => CompressionQuality.Unknown,
};
}
// ReadAsync moved to AceFileHeader.Async.cs
public CompressionType GetCompressionType(byte value) =>
value switch
{
0 => CompressionType.Stored,
1 => CompressionType.Lz77,
2 => CompressionType.Blocked,
_ => CompressionType.Unknown,
};
public CompressionQuality GetCompressionQuality(byte value) =>
value switch
{
0 => CompressionQuality.None,
1 => CompressionQuality.Fastest,
2 => CompressionQuality.Fast,
3 => CompressionQuality.Normal,
4 => CompressionQuality.Good,
5 => CompressionQuality.Best,
_ => CompressionQuality.Unknown,
};
}
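The `DictionarySize` getter above maps the low four bits of `Parameters` to a power-of-two size, clamping anything below 2^10 to 1 KiB. A few worked values:

```csharp
// Same computation as the DictionarySize getter, with sample inputs:
//   bits =  9 -> 1024     (clamped minimum)
//   bits = 10 -> 1 << 10 = 1024
//   bits = 12 -> 1 << 12 = 4096
//   bits = 15 -> 1 << 15 = 32768
static int DictionarySizeFor(ushort parameters)
{
    int bits = parameters & 0x0F;
    return bits < 10 ? 1024 : 1 << bits;
}
```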

View File

@@ -1,69 +0,0 @@
using System;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
namespace SharpCompress.Common.Ace.Headers;
public abstract partial class AceHeader
{
public abstract ValueTask<AceHeader?> ReadAsync(
Stream reader,
CancellationToken cancellationToken = default
);
public async ValueTask<byte[]> ReadHeaderAsync(
Stream stream,
CancellationToken cancellationToken = default
)
{
// Read header CRC (2 bytes) and header size (2 bytes)
var headerBytes = new byte[4];
if (!await stream.ReadFullyAsync(headerBytes, 0, 4, cancellationToken))
{
return Array.Empty<byte>();
}
HeaderCrc = BitConverter.ToUInt16(headerBytes, 0); // CRC for validation
HeaderSize = BitConverter.ToUInt16(headerBytes, 2);
if (HeaderSize == 0)
{
return Array.Empty<byte>();
}
// Read the header data
var body = new byte[HeaderSize];
if (!await stream.ReadFullyAsync(body, 0, HeaderSize, cancellationToken))
{
return Array.Empty<byte>();
}
// Verify crc
var checksum = AceCrc.AceCrc16(body);
if (checksum != HeaderCrc)
{
throw new InvalidDataException("Header checksum is invalid");
}
return body;
}
/// <summary>
/// Asynchronously checks if the stream is an ACE archive
/// </summary>
/// <param name="stream">The stream to read from</param>
/// <param name="cancellationToken">Cancellation token</param>
/// <returns>True if the stream is an ACE archive, false otherwise</returns>
public static async ValueTask<bool> IsArchiveAsync(
Stream stream,
CancellationToken cancellationToken = default
)
{
var bytes = new byte[14];
if (!await stream.ReadFullyAsync(bytes, 0, 14, cancellationToken))
{
return false;
}
return CheckMagicBytes(bytes, 7);
}
}

View File

@@ -1,156 +1,153 @@
using System;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common.Arj.Headers;
using SharpCompress.Crypto;
namespace SharpCompress.Common.Ace.Headers;
/// <summary>
/// Header type constants
/// </summary>
public enum AceHeaderType
namespace SharpCompress.Common.Ace.Headers
{
MAIN = 0,
FILE = 1,
RECOVERY32 = 2,
RECOVERY64A = 3,
RECOVERY64B = 4,
}
public abstract partial class AceHeader
{
// ACE signature: bytes at offset 7 should be "**ACE**"
private static readonly byte[] AceSignature =
[
(byte)'*',
(byte)'*',
(byte)'A',
(byte)'C',
(byte)'E',
(byte)'*',
(byte)'*',
];
public AceHeader(IArchiveEncoding archiveEncoding, AceHeaderType type)
/// <summary>
/// Header type constants
/// </summary>
public enum AceHeaderType
{
AceHeaderType = type;
ArchiveEncoding = archiveEncoding;
MAIN = 0,
FILE = 1,
RECOVERY32 = 2,
RECOVERY64A = 3,
RECOVERY64B = 4,
}
public IArchiveEncoding ArchiveEncoding { get; }
public AceHeaderType AceHeaderType { get; }
public ushort HeaderFlags { get; set; }
public ushort HeaderCrc { get; set; }
public ushort HeaderSize { get; set; }
public byte HeaderType { get; set; }
public bool IsFileEncrypted =>
(HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.FILE_ENCRYPTED) != 0;
public bool Is64Bit =>
(HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.MEMORY_64BIT) != 0;
public bool IsSolid =>
(HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.SOLID_MAIN) != 0;
public bool IsMultiVolume =>
(HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.MULTIVOLUME) != 0;
public abstract AceHeader? Read(Stream reader);
// Async methods moved to AceHeader.Async.cs
public byte[] ReadHeader(Stream stream)
public abstract class AceHeader
{
// Read header CRC (2 bytes) and header size (2 bytes)
var headerBytes = new byte[4];
if (!stream.ReadFully(headerBytes))
// ACE signature: bytes at offset 7 should be "**ACE**"
private static readonly byte[] AceSignature =
[
(byte)'*',
(byte)'*',
(byte)'A',
(byte)'C',
(byte)'E',
(byte)'*',
(byte)'*',
];
public AceHeader(IArchiveEncoding archiveEncoding, AceHeaderType type)
{
return Array.Empty<byte>();
AceHeaderType = type;
ArchiveEncoding = archiveEncoding;
}
HeaderCrc = BitConverter.ToUInt16(headerBytes, 0); // CRC for validation
HeaderSize = BitConverter.ToUInt16(headerBytes, 2);
if (HeaderSize == 0)
public IArchiveEncoding ArchiveEncoding { get; }
public AceHeaderType AceHeaderType { get; }
public ushort HeaderFlags { get; set; }
public ushort HeaderCrc { get; set; }
public ushort HeaderSize { get; set; }
public byte HeaderType { get; set; }
public bool IsFileEncrypted =>
(HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.FILE_ENCRYPTED) != 0;
public bool Is64Bit =>
(HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.MEMORY_64BIT) != 0;
public bool IsSolid =>
(HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.SOLID_MAIN) != 0;
public bool IsMultiVolume =>
(HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.MULTIVOLUME) != 0;
public abstract AceHeader? Read(Stream reader);
public byte[] ReadHeader(Stream stream)
{
return Array.Empty<byte>();
// Read header CRC (2 bytes) and header size (2 bytes)
var headerBytes = new byte[4];
if (stream.Read(headerBytes, 0, 4) != 4)
{
return Array.Empty<byte>();
}
HeaderCrc = BitConverter.ToUInt16(headerBytes, 0); // CRC for validation
HeaderSize = BitConverter.ToUInt16(headerBytes, 2);
if (HeaderSize == 0)
{
return Array.Empty<byte>();
}
// Read the header data
var body = new byte[HeaderSize];
if (stream.Read(body, 0, HeaderSize) != HeaderSize)
{
return Array.Empty<byte>();
}
// Verify crc
var checksum = AceCrc.AceCrc16(body);
if (checksum != HeaderCrc)
{
throw new InvalidDataException("Header checksum is invalid");
}
return body;
}
// Read the header data
var body = new byte[HeaderSize];
if (!stream.ReadFully(body))
public static bool IsArchive(Stream stream)
{
return Array.Empty<byte>();
}
// Verify crc
var checksum = AceCrc.AceCrc16(body);
if (checksum != HeaderCrc)
{
throw new InvalidDataException("Header checksum is invalid");
}
return body;
}
public static bool IsArchive(Stream stream)
{
// ACE files have a specific signature
// First two bytes are typically 0x60 0xEA (signature bytes)
// At offset 7, there should be "**ACE**" (7 bytes)
var bytes = new byte[14];
if (stream.Read(bytes, 0, 14) != 14)
{
return false;
}
// Check for "**ACE**" at offset 7
return CheckMagicBytes(bytes, 7);
}
protected static bool CheckMagicBytes(byte[] headerBytes, int offset)
{
// Check for "**ACE**" at specified offset
for (int i = 0; i < AceSignature.Length; i++)
{
if (headerBytes[offset + i] != AceSignature[i])
// ACE files have a specific signature
// First two bytes are typically 0x60 0xEA (signature bytes)
// At offset 7, there should be "**ACE**" (7 bytes)
var bytes = new byte[14];
if (stream.Read(bytes, 0, 14) != 14)
{
return false;
}
// Check for "**ACE**" at offset 7
return CheckMagicBytes(bytes, 7);
}
return true;
}
protected DateTime ConvertDosDateTime(uint dosDateTime)
{
try
protected static bool CheckMagicBytes(byte[] headerBytes, int offset)
{
int second = (int)(dosDateTime & 0x1F) * 2;
int minute = (int)((dosDateTime >> 5) & 0x3F);
int hour = (int)((dosDateTime >> 11) & 0x1F);
int day = (int)((dosDateTime >> 16) & 0x1F);
int month = (int)((dosDateTime >> 21) & 0x0F);
int year = (int)((dosDateTime >> 25) & 0x7F) + 1980;
// Check for "**ACE**" at specified offset
for (int i = 0; i < AceSignature.Length; i++)
{
if (headerBytes[offset + i] != AceSignature[i])
{
return false;
}
}
return true;
}
if (
day < 1
|| day > 31
|| month < 1
|| month > 12
|| hour > 23
|| minute > 59
|| second > 59
)
protected DateTime ConvertDosDateTime(uint dosDateTime)
{
try
{
int second = (int)(dosDateTime & 0x1F) * 2;
int minute = (int)((dosDateTime >> 5) & 0x3F);
int hour = (int)((dosDateTime >> 11) & 0x1F);
int day = (int)((dosDateTime >> 16) & 0x1F);
int month = (int)((dosDateTime >> 21) & 0x0F);
int year = (int)((dosDateTime >> 25) & 0x7F) + 1980;
if (
day < 1
|| day > 31
|| month < 1
|| month > 12
|| hour > 23
|| minute > 59
|| second > 59
)
{
return DateTime.MinValue;
}
return new DateTime(year, month, day, hour, minute, second);
}
catch
{
return DateTime.MinValue;
}
return new DateTime(year, month, day, hour, minute, second);
}
catch
{
return DateTime.MinValue;
}
}
}
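`ConvertDosDateTime` unpacks the classic DOS layout: seconds/2 in bits 0-4, minutes in 5-10, hours in 11-15, day in 16-20, month in 21-24, and years since 1980 in 25-31. One worked value:

```csharp
// 0x1EE463C5 decodes to 1995-07-04 12:30:10 under the layout above:
uint v = 0x1EE463C5;
int second = (int)(v & 0x1F) * 2;            // 5 * 2 = 10
int minute = (int)((v >> 5) & 0x3F);         //       = 30
int hour   = (int)((v >> 11) & 0x1F);        //       = 12
int day    = (int)((v >> 16) & 0x1F);        //       = 4
int month  = (int)((v >> 21) & 0x0F);        //       = 7
int year   = (int)((v >> 25) & 0x7F) + 1980; // 15 + 1980 = 1995
```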

View File

@@ -1,83 +0,0 @@
using System;
using System.Buffers.Binary;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Crypto;
namespace SharpCompress.Common.Ace.Headers;
public sealed partial class AceMainHeader
{
/// <summary>
/// Asynchronously reads the main archive header from the stream.
/// Returns header if this is a valid ACE archive.
/// Supports both ACE 1.0 and ACE 2.0 formats.
/// </summary>
public override async ValueTask<AceHeader?> ReadAsync(
Stream stream,
CancellationToken cancellationToken = default
)
{
var headerData = await ReadHeaderAsync(stream, cancellationToken);
if (headerData.Length == 0)
{
return null;
}
int offset = 0;
// Header type should be 0 for main header
if (headerData[offset++] != HeaderType)
{
return null;
}
// Header flags (2 bytes)
HeaderFlags = BitConverter.ToUInt16(headerData, offset);
offset += 2;
// Skip signature "**ACE**" (7 bytes)
if (!CheckMagicBytes(headerData, offset))
{
throw new InvalidDataException("Invalid ACE archive signature.");
}
offset += 7;
// ACE version (1 byte) - 10 for ACE 1.0, 20 for ACE 2.0
AceVersion = headerData[offset++];
ExtractVersion = headerData[offset++];
// Host OS (1 byte)
if (offset < headerData.Length)
{
var hostOsByte = headerData[offset++];
HostOS = hostOsByte <= 11 ? (HostOS)hostOsByte : HostOS.Unknown;
}
// Volume number (1 byte)
VolumeNumber = headerData[offset++];
// Creation date/time (4 bytes)
var dosDateTime = BitConverter.ToUInt32(headerData, offset);
DateTime = ConvertDosDateTime(dosDateTime);
offset += 4;
// Reserved fields (8 bytes)
if (offset + 8 <= headerData.Length)
{
offset += 8;
}
// Skip additional fields based on flags
// Handle comment if present
if ((HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.COMMENT) != 0)
{
if (offset + 2 <= headerData.Length)
{
ushort commentLength = BitConverter.ToUInt16(headerData, offset);
offset += 2 + commentLength;
}
}
return this;
}
}

View File

@@ -2,99 +2,96 @@ using System;
using System.Buffers.Binary;
using System.Collections.Generic;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common.Ace.Headers;
using SharpCompress.Common.Zip.Headers;
using SharpCompress.Crypto;
namespace SharpCompress.Common.Ace.Headers;
/// <summary>
/// ACE main archive header
/// </summary>
public sealed partial class AceMainHeader : AceHeader
namespace SharpCompress.Common.Ace.Headers
{
public byte ExtractVersion { get; set; }
public byte CreatorVersion { get; set; }
public HostOS HostOS { get; set; }
public byte VolumeNumber { get; set; }
public DateTime DateTime { get; set; }
public string Advert { get; set; } = string.Empty;
public List<byte> Comment { get; set; } = new();
public byte AceVersion { get; private set; }
public AceMainHeader(IArchiveEncoding archiveEncoding)
: base(archiveEncoding, AceHeaderType.MAIN) { }
/// <summary>
/// Reads the main archive header from the stream.
/// Returns header if this is a valid ACE archive.
/// Supports both ACE 1.0 and ACE 2.0 formats.
/// ACE main archive header
/// </summary>
public override AceHeader? Read(Stream stream)
public sealed class AceMainHeader : AceHeader
{
var headerData = ReadHeader(stream);
if (headerData.Length == 0)
public byte ExtractVersion { get; set; }
public byte CreatorVersion { get; set; }
public HostOS HostOS { get; set; }
public byte VolumeNumber { get; set; }
public DateTime DateTime { get; set; }
public string Advert { get; set; } = string.Empty;
public List<byte> Comment { get; set; } = new();
public byte AceVersion { get; private set; }
public AceMainHeader(IArchiveEncoding archiveEncoding)
: base(archiveEncoding, AceHeaderType.MAIN) { }
/// <summary>
/// Reads the main archive header from the stream.
/// Returns header if this is a valid ACE archive.
/// Supports both ACE 1.0 and ACE 2.0 formats.
/// </summary>
public override AceHeader? Read(Stream stream)
{
return null;
}
int offset = 0;
// Header type should be 0 for main header
if (headerData[offset++] != HeaderType)
{
return null;
}
// Header flags (2 bytes)
HeaderFlags = BitConverter.ToUInt16(headerData, offset);
offset += 2;
// Skip signature "**ACE**" (7 bytes)
if (!CheckMagicBytes(headerData, offset))
{
throw new InvalidDataException("Invalid ACE archive signature.");
}
offset += 7;
// ACE version (1 byte) - 10 for ACE 1.0, 20 for ACE 2.0
AceVersion = headerData[offset++];
ExtractVersion = headerData[offset++];
// Host OS (1 byte)
if (offset < headerData.Length)
{
var hostOsByte = headerData[offset++];
HostOS = hostOsByte <= 11 ? (HostOS)hostOsByte : HostOS.Unknown;
}
// Volume number (1 byte)
VolumeNumber = headerData[offset++];
// Creation date/time (4 bytes)
var dosDateTime = BitConverter.ToUInt32(headerData, offset);
DateTime = ConvertDosDateTime(dosDateTime);
offset += 4;
// Reserved fields (8 bytes)
if (offset + 8 <= headerData.Length)
{
offset += 8;
}
// Skip additional fields based on flags
// Handle comment if present
if ((HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.COMMENT) != 0)
{
if (offset + 2 <= headerData.Length)
var headerData = ReadHeader(stream);
if (headerData.Length == 0)
{
ushort commentLength = BitConverter.ToUInt16(headerData, offset);
offset += 2 + commentLength;
return null;
}
int offset = 0;
// Header type should be 0 for main header
if (headerData[offset++] != HeaderType)
{
return null;
}
// Header flags (2 bytes)
HeaderFlags = BitConverter.ToUInt16(headerData, offset);
offset += 2;
// Skip signature "**ACE**" (7 bytes)
if (!CheckMagicBytes(headerData, offset))
{
throw new InvalidDataException("Invalid ACE archive signature.");
}
offset += 7;
// ACE version (1 byte) - 10 for ACE 1.0, 20 for ACE 2.0
AceVersion = headerData[offset++];
ExtractVersion = headerData[offset++];
// Host OS (1 byte)
if (offset < headerData.Length)
{
var hostOsByte = headerData[offset++];
HostOS = hostOsByte <= 11 ? (HostOS)hostOsByte : HostOS.Unknown;
}
// Volume number (1 byte)
VolumeNumber = headerData[offset++];
// Creation date/time (4 bytes)
var dosDateTime = BitConverter.ToUInt32(headerData, offset);
DateTime = ConvertDosDateTime(dosDateTime);
offset += 4;
// Reserved fields (8 bytes)
if (offset + 8 <= headerData.Length)
{
offset += 8;
}
// Skip additional fields based on flags
// Handle comment if present
if ((HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.COMMENT) != 0)
{
if (offset + 2 <= headerData.Length)
{
ushort commentLength = BitConverter.ToUInt16(headerData, offset);
offset += 2 + commentLength;
}
}
return this;
}
return this;
}
// ReadAsync moved to AceMainHeader.Async.cs
}
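Both the detection path and the main-header parse hinge on the same signature: the ASCII bytes `**ACE**` at offset 7 of the first 14 bytes. A standalone sketch of that test:

```csharp
using System.Text;

// Mirrors CheckMagicBytes(bytes, 7) from the header code above.
static bool LooksLikeAce(byte[] first14)
{
    if (first14.Length < 14)
    {
        return false;
    }
    var magic = Encoding.ASCII.GetBytes("**ACE**");
    for (var i = 0; i < magic.Length; i++)
    {
        if (first14[7 + i] != magic[i])
        {
            return false;
        }
    }
    return true;
}
```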

View File

@@ -1,15 +1,16 @@
namespace SharpCompress.Common.Ace.Headers;
/// <summary>
/// Compression quality
/// </summary>
public enum CompressionQuality
namespace SharpCompress.Common.Ace.Headers
{
None,
Fastest,
Fast,
Normal,
Good,
Best,
Unknown,
/// <summary>
/// Compression quality
/// </summary>
public enum CompressionQuality
{
None,
Fastest,
Fast,
Normal,
Good,
Best,
Unknown,
}
}

View File

@@ -1,12 +1,13 @@
namespace SharpCompress.Common.Ace.Headers;
/// <summary>
/// Compression types
/// </summary>
public enum CompressionType
namespace SharpCompress.Common.Ace.Headers
{
Stored,
Lz77,
Blocked,
Unknown,
/// <summary>
/// Compression types
/// </summary>
public enum CompressionType
{
Stored,
Lz77,
Blocked,
Unknown,
}
}

View File

@@ -1,32 +1,33 @@
namespace SharpCompress.Common.Ace.Headers;
/// <summary>
/// Header flags (main + file, overlapping meanings)
/// </summary>
public static class HeaderFlags
namespace SharpCompress.Common.Ace.Headers
{
// Shared / low bits
public const ushort ADDSIZE = 0x0001; // extra size field present
public const ushort COMMENT = 0x0002; // comment present
public const ushort MEMORY_64BIT = 0x0004;
public const ushort AV_STRING = 0x0008; // AV string present
public const ushort SOLID = 0x0010; // solid file
public const ushort LOCKED = 0x0020;
public const ushort PROTECTED = 0x0040;
/// <summary>
/// Header flags (main + file, overlapping meanings)
/// </summary>
public static class HeaderFlags
{
// Shared / low bits
public const ushort ADDSIZE = 0x0001; // extra size field present
public const ushort COMMENT = 0x0002; // comment present
public const ushort MEMORY_64BIT = 0x0004;
public const ushort AV_STRING = 0x0008; // AV string present
public const ushort SOLID = 0x0010; // solid file
public const ushort LOCKED = 0x0020;
public const ushort PROTECTED = 0x0040;
// Main header specific
public const ushort V20FORMAT = 0x0100;
public const ushort SFX = 0x0200;
public const ushort LIMITSFXJR = 0x0400;
public const ushort MULTIVOLUME = 0x0800;
public const ushort ADVERT = 0x1000;
public const ushort RECOVERY = 0x2000;
public const ushort LOCKED_MAIN = 0x4000;
public const ushort SOLID_MAIN = 0x8000;
// Main header specific
public const ushort V20FORMAT = 0x0100;
public const ushort SFX = 0x0200;
public const ushort LIMITSFXJR = 0x0400;
public const ushort MULTIVOLUME = 0x0800;
public const ushort ADVERT = 0x1000;
public const ushort RECOVERY = 0x2000;
public const ushort LOCKED_MAIN = 0x4000;
public const ushort SOLID_MAIN = 0x8000;
// File header specific (same bits, different meaning)
public const ushort NTSECURITY = 0x0400;
public const ushort CONTINUED_PREV = 0x1000;
public const ushort CONTINUED_NEXT = 0x2000;
public const ushort FILE_ENCRYPTED = 0x4000; // file encrypted (file header)
// File header specific (same bits, different meaning)
public const ushort NTSECURITY = 0x0400;
public const ushort CONTINUED_PREV = 0x1000;
public const ushort CONTINUED_NEXT = 0x2000;
public const ushort FILE_ENCRYPTED = 0x4000; // file encrypted (file header)
}
}
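Because main and file headers reuse bit positions, the same mask means different things depending on header type; for example 0x4000 is `LOCKED_MAIN` on a main header but `FILE_ENCRYPTED` on a file header:

```csharp
// Illustrative: identical bit, type-dependent meaning.
static bool IsLockedArchive(ushort mainHeaderFlags) =>
    (mainHeaderFlags & HeaderFlags.LOCKED_MAIN) != 0; // 0x4000 on a MAIN header

static bool IsEncryptedFile(ushort fileHeaderFlags) =>
    (fileHeaderFlags & HeaderFlags.FILE_ENCRYPTED) != 0; // 0x4000 on a FILE header
```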

View File

@@ -1,21 +1,22 @@
namespace SharpCompress.Common.Ace.Headers;
/// <summary>
/// Host OS type
/// </summary>
public enum HostOS
namespace SharpCompress.Common.Ace.Headers
{
MsDos = 0,
Os2,
Windows,
Unix,
MacOs,
WinNt,
Primos,
AppleGs,
Atari,
Vax,
Amiga,
Next,
Unknown,
/// <summary>
/// Host OS type
/// </summary>
public enum HostOS
{
MsDos = 0,
Os2,
Windows,
Unix,
MacOs,
WinNt,
Primos,
AppleGs,
Atari,
Vax,
Amiga,
Next,
Unknown,
}
}

View File

@@ -5,57 +5,56 @@ using System.Linq;
using System.Text;
using System.Threading.Tasks;
using SharpCompress.Common.GZip;
using SharpCompress.Common.Options;
using SharpCompress.Common.Tar;
namespace SharpCompress.Common.Arc;
public class ArcEntry : Entry
namespace SharpCompress.Common.Arc
{
private readonly ArcFilePart? _filePart;
internal ArcEntry(ArcFilePart? filePart, IReaderOptions readerOptions)
: base(readerOptions)
public class ArcEntry : Entry
{
_filePart = filePart;
}
private readonly ArcFilePart? _filePart;
public override long Crc
{
get
internal ArcEntry(ArcFilePart? filePart)
{
if (_filePart == null)
{
return 0;
}
return _filePart.Header.Crc16;
_filePart = filePart;
}
public override long Crc
{
get
{
if (_filePart == null)
{
return 0;
}
return _filePart.Header.Crc16;
}
}
public override string? Key => _filePart?.Header.Name;
public override string? LinkTarget => null;
public override long CompressedSize => _filePart?.Header.CompressedSize ?? 0;
public override CompressionType CompressionType =>
_filePart?.Header.CompressionMethod ?? CompressionType.Unknown;
public override long Size => throw new NotImplementedException();
public override DateTime? LastModifiedTime => null;
public override DateTime? CreatedTime => null;
public override DateTime? LastAccessedTime => null;
public override DateTime? ArchivedTime => null;
public override bool IsEncrypted => false;
public override bool IsDirectory => false;
public override bool IsSplitAfter => false;
internal override IEnumerable<FilePart> Parts => _filePart.Empty();
}
public override string? Key => _filePart?.Header.Name;
public override string? LinkTarget => null;
public override long CompressedSize => _filePart?.Header.CompressedSize ?? 0;
public override CompressionType CompressionType =>
_filePart?.Header.CompressionMethod ?? CompressionType.Unknown;
public override long Size => throw new NotImplementedException();
public override DateTime? LastModifiedTime => null;
public override DateTime? CreatedTime => null;
public override DateTime? LastAccessedTime => null;
public override DateTime? ArchivedTime => null;
public override bool IsEncrypted => false;
public override bool IsDirectory => false;
public override bool IsSplitAfter => false;
internal override IEnumerable<FilePart> Parts => _filePart.Empty();
}

View File

@@ -2,93 +2,75 @@ using System;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
namespace SharpCompress.Common.Arc;
public class ArcEntryHeader
namespace SharpCompress.Common.Arc
{
public IArchiveEncoding ArchiveEncoding { get; }
public CompressionType CompressionMethod { get; private set; }
public string? Name { get; private set; }
public long CompressedSize { get; private set; }
public DateTime DateTime { get; private set; }
public int Crc16 { get; private set; }
public long OriginalSize { get; private set; }
public long DataStartPosition { get; private set; }
public ArcEntryHeader(IArchiveEncoding archiveEncoding)
public class ArcEntryHeader
{
this.ArchiveEncoding = archiveEncoding;
}
public IArchiveEncoding ArchiveEncoding { get; }
public CompressionType CompressionMethod { get; private set; }
public string? Name { get; private set; }
public long CompressedSize { get; private set; }
public DateTime DateTime { get; private set; }
public int Crc16 { get; private set; }
public long OriginalSize { get; private set; }
public long DataStartPosition { get; private set; }
public ArcEntryHeader? ReadHeader(Stream stream)
{
byte[] headerBytes = new byte[29];
if (stream.Read(headerBytes, 0, headerBytes.Length) != headerBytes.Length)
public ArcEntryHeader(IArchiveEncoding archiveEncoding)
{
return null;
this.ArchiveEncoding = archiveEncoding;
}
DataStartPosition = stream.Position;
return LoadFrom(headerBytes);
}
public async ValueTask<ArcEntryHeader?> ReadHeaderAsync(
Stream stream,
CancellationToken cancellationToken = default
)
{
byte[] headerBytes = new byte[29];
if (
await stream.ReadAsync(headerBytes, 0, headerBytes.Length, cancellationToken)
!= headerBytes.Length
)
public ArcEntryHeader? ReadHeader(Stream stream)
{
return null;
byte[] headerBytes = new byte[29];
if (stream.Read(headerBytes, 0, headerBytes.Length) != headerBytes.Length)
{
return null;
}
DataStartPosition = stream.Position;
return LoadFrom(headerBytes);
}
DataStartPosition = stream.Position;
return LoadFrom(headerBytes);
}
public ArcEntryHeader LoadFrom(byte[] headerBytes)
{
CompressionMethod = GetCompressionType(headerBytes[1]);
// Read name
int nameEnd = Array.IndexOf(headerBytes, (byte)0, 1); // Find null terminator
Name = Encoding.UTF8.GetString(headerBytes, 2, nameEnd > 0 ? nameEnd - 2 : 12);
int offset = 15;
CompressedSize = BitConverter.ToUInt32(headerBytes, offset);
offset += 4;
uint rawDateTime = BitConverter.ToUInt32(headerBytes, offset);
DateTime = ConvertToDateTime(rawDateTime);
offset += 4;
Crc16 = BitConverter.ToUInt16(headerBytes, offset);
offset += 2;
OriginalSize = BitConverter.ToUInt32(headerBytes, offset);
return this;
}
private CompressionType GetCompressionType(byte value)
{
return value switch
public ArcEntryHeader LoadFrom(byte[] headerBytes)
{
1 or 2 => CompressionType.None,
3 => CompressionType.Packed,
4 => CompressionType.Squeezed,
5 or 6 or 7 or 8 => CompressionType.Crunched,
9 => CompressionType.Squashed,
10 => CompressionType.Crushed,
11 => CompressionType.Distilled,
_ => CompressionType.Unknown,
};
}
CompressionMethod = GetCompressionType(headerBytes[1]);
public static DateTime ConvertToDateTime(long rawDateTime)
{
// Convert Unix timestamp to DateTime (UTC)
return DateTimeOffset.FromUnixTimeSeconds(rawDateTime).UtcDateTime;
// Read name
int nameEnd = Array.IndexOf(headerBytes, (byte)0, 1); // Find null terminator
Name = Encoding.UTF8.GetString(headerBytes, 2, nameEnd > 0 ? nameEnd - 2 : 12);
int offset = 15;
CompressedSize = BitConverter.ToUInt32(headerBytes, offset);
offset += 4;
uint rawDateTime = BitConverter.ToUInt32(headerBytes, offset);
DateTime = ConvertToDateTime(rawDateTime);
offset += 4;
Crc16 = BitConverter.ToUInt16(headerBytes, offset);
offset += 2;
OriginalSize = BitConverter.ToUInt32(headerBytes, offset);
return this;
}
private CompressionType GetCompressionType(byte value)
{
return value switch
{
1 or 2 => CompressionType.None,
3 => CompressionType.Packed,
4 => CompressionType.Squeezed,
5 or 6 or 7 or 8 => CompressionType.Crunched,
9 => CompressionType.Squashed,
10 => CompressionType.Crushed,
11 => CompressionType.Distilled,
_ => CompressionType.Unknown,
};
}
public static DateTime ConvertToDateTime(long rawDateTime)
{
// Convert Unix timestamp to DateTime (UTC)
return DateTimeOffset.FromUnixTimeSeconds(rawDateTime).UtcDateTime;
}
}
}
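Read together, `ReadHeader` and `LoadFrom` imply a fixed 29-byte layout; the offsets below are taken from the parsing code above (the byte-0 marker is the conventional ARC 0x1A and is not checked here):

```csharp
// ARC entry header layout as consumed by LoadFrom:
//   [0]       marker byte (0x1A by ARC convention; unused in LoadFrom)
//   [1]       compression method (mapped by GetCompressionType)
//   [2..14]   filename, NUL-terminated, up to 12 characters
//   [15..18]  compressed size (uint32, little-endian)
//   [19..22]  date/time (uint32; passed to ConvertToDateTime)
//   [23..24]  CRC-16 (uint16, little-endian)
//   [25..28]  original size (uint32, little-endian)
```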

View File

@@ -1,58 +0,0 @@
using System;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Compressors.Lzw;
using SharpCompress.Compressors.RLE90;
using SharpCompress.Compressors.Squeezed;
using SharpCompress.IO;
namespace SharpCompress.Common.Arc;
public partial class ArcFilePart
{
internal override async ValueTask<Stream?> GetCompressedStreamAsync(
CancellationToken cancellationToken = default
)
{
if (_stream != null)
{
Stream compressedStream;
switch (Header.CompressionMethod)
{
case CompressionType.None:
compressedStream = new ReadOnlySubStream(
_stream,
Header.DataStartPosition,
Header.CompressedSize
);
break;
case CompressionType.Packed:
compressedStream = new RunLength90Stream(_stream, (int)Header.CompressedSize);
break;
case CompressionType.Squeezed:
compressedStream = await SqueezeStream.CreateAsync(
_stream,
(int)Header.CompressedSize,
cancellationToken
);
break;
case CompressionType.Crunched:
if (Header.OriginalSize > 128 * 1024)
{
throw new NotSupportedException(
"CompressionMethod: " + Header.CompressionMethod + " with size > 128KB"
);
}
compressedStream = new ArcLzwStream(_stream, (int)Header.CompressedSize, true);
break;
default:
throw new NotSupportedException(
"CompressionMethod: " + Header.CompressionMethod
);
}
return compressedStream;
}
return _stream;
}
}

View File

@@ -13,61 +13,71 @@ using SharpCompress.Compressors.RLE90;
using SharpCompress.Compressors.Squeezed;
using SharpCompress.IO;
namespace SharpCompress.Common.Arc;
public partial class ArcFilePart : FilePart
namespace SharpCompress.Common.Arc
{
private readonly Stream? _stream;
internal ArcFilePart(ArcEntryHeader localArcHeader, Stream? seekableStream)
: base(localArcHeader.ArchiveEncoding)
public class ArcFilePart : FilePart
{
_stream = seekableStream;
Header = localArcHeader;
}
private readonly Stream? _stream;
internal ArcEntryHeader Header { get; set; }
internal override string? FilePartName => Header.Name;
internal override Stream GetCompressedStream()
{
if (_stream != null)
internal ArcFilePart(ArcEntryHeader localArcHeader, Stream? seekableStream)
: base(localArcHeader.ArchiveEncoding)
{
Stream compressedStream;
switch (Header.CompressionMethod)
{
case CompressionType.None:
compressedStream = new ReadOnlySubStream(
_stream,
Header.DataStartPosition,
Header.CompressedSize
);
break;
case CompressionType.Packed:
compressedStream = new RunLength90Stream(_stream, (int)Header.CompressedSize);
break;
case CompressionType.Squeezed:
compressedStream = SqueezeStream.Create(_stream, (int)Header.CompressedSize);
break;
case CompressionType.Crunched:
if (Header.OriginalSize > 128 * 1024)
{
throw new NotSupportedException(
"CompressionMethod: " + Header.CompressionMethod + " with size > 128KB"
);
}
compressedStream = new ArcLzwStream(_stream, (int)Header.CompressedSize, true);
break;
default:
throw new NotSupportedException(
"CompressionMethod: " + Header.CompressionMethod
);
}
return compressedStream;
_stream = seekableStream;
Header = localArcHeader;
}
return _stream.NotNull();
}
internal override Stream? GetRawStream() => _stream;
internal ArcEntryHeader Header { get; set; }
internal override string? FilePartName => Header.Name;
internal override Stream GetCompressedStream()
{
if (_stream != null)
{
Stream compressedStream;
switch (Header.CompressionMethod)
{
case CompressionType.None:
compressedStream = new ReadOnlySubStream(
_stream,
Header.DataStartPosition,
Header.CompressedSize
);
break;
case CompressionType.Packed:
compressedStream = new RunLength90Stream(
_stream,
(int)Header.CompressedSize
);
break;
case CompressionType.Squeezed:
compressedStream = new SqueezeStream(_stream, (int)Header.CompressedSize);
break;
case CompressionType.Crunched:
if (Header.OriginalSize > 128 * 1024)
{
throw new NotSupportedException(
"CompressionMethod: "
+ Header.CompressionMethod
+ " with size > 128KB"
);
}
compressedStream = new ArcLzwStream(
_stream,
(int)Header.CompressedSize,
true
);
break;
default:
throw new NotSupportedException(
"CompressionMethod: " + Header.CompressionMethod
);
}
return compressedStream;
}
return _stream.NotNull();
}
internal override Stream? GetRawStream() => _stream;
}
}

View File

@@ -6,10 +6,11 @@ using System.Text;
using System.Threading.Tasks;
using SharpCompress.Readers;
namespace SharpCompress.Common.Arc;
public class ArcVolume : Volume
namespace SharpCompress.Common.Arc
{
public ArcVolume(Stream stream, ReaderOptions readerOptions, int index = 0)
: base(stream, readerOptions, index) { }
public class ArcVolume : Volume
{
public ArcVolume(Stream stream, ReaderOptions readerOptions, int index = 0)
: base(stream, readerOptions, index) { }
}
}

View File

@@ -10,5 +10,4 @@ public enum ArchiveType
Arc,
Arj,
Ace,
Lzw,
}

View File

@@ -5,55 +5,54 @@ using System.Text;
using System.Threading.Tasks;
using SharpCompress.Common.Arc;
using SharpCompress.Common.Arj.Headers;
using SharpCompress.Common.Options;
namespace SharpCompress.Common.Arj;
public class ArjEntry : Entry
namespace SharpCompress.Common.Arj
{
private readonly ArjFilePart _filePart;
internal ArjEntry(ArjFilePart filePart, IReaderOptions readerOptions)
: base(readerOptions)
public class ArjEntry : Entry
{
_filePart = filePart;
}
private readonly ArjFilePart _filePart;
public override long Crc => _filePart.Header.OriginalCrc32;
public override string? Key => _filePart?.Header.Name;
public override string? LinkTarget => null;
public override long CompressedSize => _filePart?.Header.CompressedSize ?? 0;
public override CompressionType CompressionType
{
get
internal ArjEntry(ArjFilePart filePart)
{
if (_filePart.Header.CompressionMethod == CompressionMethod.Stored)
{
return CompressionType.None;
}
return CompressionType.ArjLZ77;
_filePart = filePart;
}
public override long Crc => _filePart.Header.OriginalCrc32;
public override string? Key => _filePart?.Header.Name;
public override string? LinkTarget => null;
public override long CompressedSize => _filePart?.Header.CompressedSize ?? 0;
public override CompressionType CompressionType
{
get
{
if (_filePart.Header.CompressionMethod == CompressionMethod.Stored)
{
return CompressionType.None;
}
return CompressionType.ArjLZ77;
}
}
public override long Size => _filePart?.Header.OriginalSize ?? 0;
public override DateTime? LastModifiedTime => _filePart.Header.DateTimeModified.DateTime;
public override DateTime? CreatedTime => _filePart.Header.DateTimeCreated.DateTime;
public override DateTime? LastAccessedTime => _filePart.Header.DateTimeAccessed.DateTime;
public override DateTime? ArchivedTime => null;
public override bool IsEncrypted => false;
public override bool IsDirectory => _filePart.Header.FileType == FileType.Directory;
public override bool IsSplitAfter => false;
internal override IEnumerable<FilePart> Parts => _filePart.Empty();
}
public override long Size => _filePart?.Header.OriginalSize ?? 0;
public override DateTime? LastModifiedTime => _filePart.Header.DateTimeModified.DateTime;
public override DateTime? CreatedTime => _filePart.Header.DateTimeCreated?.DateTime;
public override DateTime? LastAccessedTime => _filePart.Header.DateTimeAccessed?.DateTime;
public override DateTime? ArchivedTime => null;
public override bool IsEncrypted => false;
public override bool IsDirectory => _filePart.Header.FileType == FileType.Directory;
public override bool IsSplitAfter => false;
internal override IEnumerable<FilePart> Parts => _filePart.Empty();
}

View File

@@ -8,62 +8,65 @@ using SharpCompress.Common.Arj.Headers;
using SharpCompress.Compressors.Arj;
using SharpCompress.IO;
namespace SharpCompress.Common.Arj;
public class ArjFilePart : FilePart
namespace SharpCompress.Common.Arj
{
private readonly Stream _stream;
internal ArjLocalHeader Header { get; set; }
internal ArjFilePart(ArjLocalHeader localArjHeader, Stream seekableStream)
: base(localArjHeader.ArchiveEncoding)
public class ArjFilePart : FilePart
{
_stream = seekableStream;
Header = localArjHeader;
}
private readonly Stream _stream;
internal ArjLocalHeader Header { get; set; }
internal override string? FilePartName => Header.Name;
internal override Stream GetCompressedStream()
{
if (_stream != null)
internal ArjFilePart(ArjLocalHeader localArjHeader, Stream seekableStream)
: base(localArjHeader.ArchiveEncoding)
{
Stream compressedStream;
switch (Header.CompressionMethod)
{
case CompressionMethod.Stored:
compressedStream = new ReadOnlySubStream(
_stream,
Header.DataStartPosition,
Header.CompressedSize
);
break;
case CompressionMethod.CompressedMost:
case CompressionMethod.Compressed:
case CompressionMethod.CompressedFaster:
if (Header.OriginalSize > 128 * 1024)
{
throw new NotSupportedException(
"CompressionMethod: " + Header.CompressionMethod + " with size > 128KB"
);
}
compressedStream = new LhaStream<Lh7DecoderCfg>(
_stream,
(int)Header.OriginalSize
);
break;
case CompressionMethod.CompressedFastest:
compressedStream = new LHDecoderStream(_stream, (int)Header.OriginalSize);
break;
default:
throw new NotSupportedException(
"CompressionMethod: " + Header.CompressionMethod
);
}
return compressedStream;
_stream = seekableStream;
Header = localArjHeader;
}
return _stream.NotNull();
}
internal override Stream GetRawStream() => _stream;
internal override string? FilePartName => Header.Name;
internal override Stream GetCompressedStream()
{
if (_stream != null)
{
Stream compressedStream;
switch (Header.CompressionMethod)
{
case CompressionMethod.Stored:
compressedStream = new ReadOnlySubStream(
_stream,
Header.DataStartPosition,
Header.CompressedSize
);
break;
case CompressionMethod.CompressedMost:
case CompressionMethod.Compressed:
case CompressionMethod.CompressedFaster:
if (Header.OriginalSize > 128 * 1024)
{
throw new NotSupportedException(
"CompressionMethod: "
+ Header.CompressionMethod
+ " with size > 128KB"
);
}
compressedStream = new LhaStream<Lh7DecoderCfg>(
_stream,
(int)Header.OriginalSize
);
break;
case CompressionMethod.CompressedFastest:
compressedStream = new LHDecoderStream(_stream, (int)Header.OriginalSize);
break;
default:
throw new NotSupportedException(
"CompressionMethod: " + Header.CompressionMethod
);
}
return compressedStream;
}
return _stream.NotNull();
}
internal override Stream GetRawStream() => _stream;
}
}

View File

@@ -8,28 +8,29 @@ using SharpCompress.Common.Rar;
using SharpCompress.Common.Rar.Headers;
using SharpCompress.Readers;
namespace SharpCompress.Common.Arj;
public class ArjVolume : Volume
namespace SharpCompress.Common.Arj
{
public ArjVolume(Stream stream, ReaderOptions readerOptions, int index = 0)
: base(stream, readerOptions, index) { }
public override bool IsFirstVolume
public class ArjVolume : Volume
{
get { return true; }
}
public ArjVolume(Stream stream, ReaderOptions readerOptions, int index = 0)
: base(stream, readerOptions, index) { }
/// <summary>
/// Whether this volume is part of a multi-volume archive.
/// </summary>
public override bool IsMultiVolume
{
get { return false; }
}
public override bool IsFirstVolume
{
get { return true; }
}
internal IEnumerable<ArjFilePart> GetVolumeFileParts()
{
return new List<ArjFilePart>();
/// <summary>
/// Whether this volume is part of a multi-volume archive.
/// </summary>
public override bool IsMultiVolume
{
get { return false; }
}
internal IEnumerable<ArjFilePart> GetVolumeFileParts()
{
return new List<ArjFilePart>();
}
}
}

View File

@@ -1,132 +0,0 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Crypto;
namespace SharpCompress.Common.Arj.Headers;
public abstract partial class ArjHeader
{
public abstract ValueTask<ArjHeader?> ReadAsync(
Stream reader,
CancellationToken cancellationToken = default
);
public async ValueTask<byte[]> ReadHeaderAsync(
Stream stream,
CancellationToken cancellationToken = default
)
{
// check for magic bytes
var magic = new byte[2];
if (await stream.ReadAsync(magic, 0, 2, cancellationToken) != 2)
{
return Array.Empty<byte>();
}
if (!CheckMagicBytes(magic))
{
throw new InvalidDataException("Not an ARJ file (wrong magic bytes)");
}
// read header_size
byte[] headerBytes = new byte[2];
await stream.ReadAsync(headerBytes, 0, 2, cancellationToken);
var headerSize = (ushort)(headerBytes[0] | headerBytes[1] << 8);
if (headerSize < 1)
{
return Array.Empty<byte>();
}
var body = new byte[headerSize];
var read = await stream.ReadAsync(body, 0, headerSize, cancellationToken);
if (read < headerSize)
{
return Array.Empty<byte>();
}
byte[] crc = new byte[4];
read = await stream.ReadAsync(crc, 0, 4, cancellationToken);
var checksum = Crc32Stream.Compute(body);
// Compare the computed CRC against the stored value
if (checksum != BitConverter.ToUInt32(crc, 0))
{
throw new InvalidDataException("Header checksum is invalid");
}
return body;
}
protected async ValueTask<List<byte[]>> ReadExtendedHeadersAsync(
Stream reader,
CancellationToken cancellationToken = default
)
{
List<byte[]> extendedHeader = new List<byte[]>();
byte[] buffer = new byte[2];
while (true)
{
int bytesRead = await reader.ReadAsync(buffer, 0, 2, cancellationToken);
if (bytesRead < 2)
{
throw new EndOfStreamException(
"Unexpected end of stream while reading extended header size."
);
}
var extHeaderSize = (ushort)(buffer[0] | (buffer[1] << 8));
if (extHeaderSize == 0)
{
return extendedHeader;
}
byte[] header = new byte[extHeaderSize];
bytesRead = await reader.ReadAsync(header, 0, extHeaderSize, cancellationToken);
if (bytesRead < extHeaderSize)
{
throw new EndOfStreamException(
"Unexpected end of stream while reading extended header data."
);
}
byte[] crcextended = new byte[4];
bytesRead = await reader.ReadAsync(crcextended, 0, 4, cancellationToken);
if (bytesRead < 4)
{
throw new EndOfStreamException(
"Unexpected end of stream while reading extended header CRC."
);
}
var checksum = Crc32Stream.Compute(header);
if (checksum != BitConverter.ToUInt32(crcextended, 0))
{
throw new InvalidDataException("Extended header checksum is invalid");
}
extendedHeader.Add(header);
}
}
/// <summary>
/// Asynchronously checks if the stream is an ARJ archive
/// </summary>
/// <param name="stream">The stream to read from</param>
/// <param name="cancellationToken">Cancellation token</param>
/// <returns>True if the stream is an ARJ archive, false otherwise</returns>
public static async ValueTask<bool> IsArchiveAsync(
Stream stream,
CancellationToken cancellationToken = default
)
{
var bytes = new byte[2];
if (await stream.ReadAsync(bytes, 0, 2, cancellationToken) != 2)
{
return false;
}
return CheckMagicBytes(bytes);
}
}
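The async reader removed here spells out the ARJ basic-header framing: a little-endian magic word 0xEA60, a two-byte header size (zero marks the end of the archive), the header body, and a four-byte CRC-32 over the body. A standalone sketch of that framing, assuming the System.IO.Hashing package for the checksum (the library itself uses its internal Crc32Stream.Compute):

using System;
using System.Buffers.Binary;
using System.IO;
using System.IO.Hashing;

static class ArjHeaderFraming
{
    private const ushort ArjMagic = 0xEA60;

    // Returns the header body, an empty array at the end-of-archive marker,
    // or null when the stream is truncated or not at an ARJ header.
    public static byte[]? TryReadHeader(Stream stream)
    {
        Span<byte> u16 = stackalloc byte[2];
        if (stream.Read(u16) != 2 || BinaryPrimitives.ReadUInt16LittleEndian(u16) != ArjMagic)
        {
            return null;
        }
        if (stream.Read(u16) != 2)
        {
            return null;
        }
        int headerSize = BinaryPrimitives.ReadUInt16LittleEndian(u16);
        if (headerSize == 0)
        {
            return Array.Empty<byte>();
        }
        var body = new byte[headerSize];
        if (stream.Read(body, 0, headerSize) != headerSize)
        {
            return null;
        }
        Span<byte> crc = stackalloc byte[4];
        if (stream.Read(crc) != 4)
        {
            return null;
        }
        if (Crc32.HashToUInt32(body) != BinaryPrimitives.ReadUInt32LittleEndian(crc))
        {
            throw new InvalidDataException("Header checksum is invalid");
        }
        return body;
    }
}

Note the sketch, like the original, issues single Read calls; a hardened reader would loop until each buffer is full.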

View File

@@ -3,158 +3,156 @@ using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common.Zip.Headers;
using SharpCompress.Crypto;
namespace SharpCompress.Common.Arj.Headers;
public enum ArjHeaderType
namespace SharpCompress.Common.Arj.Headers
{
MainHeader,
LocalHeader,
}
public abstract partial class ArjHeader
{
private const int FIRST_HDR_SIZE = 34;
private const ushort ARJ_MAGIC = 0xEA60;
public ArjHeader(ArjHeaderType type)
public enum ArjHeaderType
{
ArjHeaderType = type;
MainHeader,
LocalHeader,
}
public ArjHeaderType ArjHeaderType { get; }
public byte Flags { get; set; }
public FileType FileType { get; set; }
public abstract ArjHeader? Read(Stream reader);
// Async methods moved to ArjHeader.Async.cs
public byte[] ReadHeader(Stream stream)
public abstract class ArjHeader
{
// check for magic bytes
var magic = new byte[2];
if (stream.Read(magic) != 2)
private const int FIRST_HDR_SIZE = 34;
private const ushort ARJ_MAGIC = 0xEA60;
public ArjHeader(ArjHeaderType type)
{
return Array.Empty<byte>();
ArjHeaderType = type;
}
if (!CheckMagicBytes(magic))
public ArjHeaderType ArjHeaderType { get; }
public byte Flags { get; set; }
public FileType FileType { get; set; }
public abstract ArjHeader? Read(Stream reader);
public byte[] ReadHeader(Stream stream)
{
throw new InvalidDataException("Not an ARJ file (wrong magic bytes)");
}
// read header_size
byte[] headerBytes = new byte[2];
stream.Read(headerBytes, 0, 2);
var headerSize = (ushort)(headerBytes[0] | headerBytes[1] << 8);
if (headerSize < 1)
{
return Array.Empty<byte>();
}
var body = new byte[headerSize];
var read = stream.Read(body, 0, headerSize);
if (read < headerSize)
{
return Array.Empty<byte>();
}
byte[] crc = new byte[4];
read = stream.Read(crc, 0, 4);
var checksum = Crc32Stream.Compute(body);
// Compute the hash value
if (checksum != BitConverter.ToUInt32(crc, 0))
{
throw new InvalidDataException("Header checksum is invalid");
}
return body;
}
// ReadHeaderAsync moved to ArjHeader.Async.cs
protected List<byte[]> ReadExtendedHeaders(Stream reader)
{
List<byte[]> extendedHeader = new List<byte[]>();
byte[] buffer = new byte[2];
while (true)
{
int bytesRead = reader.Read(buffer, 0, 2);
if (bytesRead < 2)
// check for magic bytes
var magic = new byte[2];
if (stream.Read(magic) != 2)
{
throw new EndOfStreamException(
"Unexpected end of stream while reading extended header size."
);
return Array.Empty<byte>();
}
var extHeaderSize = (ushort)(buffer[0] | (buffer[1] << 8));
if (extHeaderSize == 0)
if (!CheckMagicBytes(magic))
{
return extendedHeader;
throw new InvalidDataException("Not an ARJ file (wrong magic bytes)");
}
byte[] header = new byte[extHeaderSize];
bytesRead = reader.Read(header, 0, extHeaderSize);
if (bytesRead < extHeaderSize)
// read header_size
byte[] headerBytes = new byte[2];
stream.Read(headerBytes, 0, 2);
var headerSize = (ushort)(headerBytes[0] | headerBytes[1] << 8);
if (headerSize < 1)
{
throw new EndOfStreamException(
"Unexpected end of stream while reading extended header data."
);
return Array.Empty<byte>();
}
var body = new byte[headerSize];
var read = stream.Read(body, 0, headerSize);
if (read < headerSize)
{
return Array.Empty<byte>();
}
byte[] crc = new byte[4];
bytesRead = reader.Read(crc, 0, 4);
if (bytesRead < 4)
{
throw new EndOfStreamException(
"Unexpected end of stream while reading extended header CRC."
);
}
var checksum = Crc32Stream.Compute(header);
read = stream.Read(crc, 0, 4);
var checksum = Crc32Stream.Compute(body);
// Compute the hash value
if (checksum != BitConverter.ToUInt32(crc, 0))
{
throw new InvalidDataException("Extended header checksum is invalid");
throw new InvalidDataException("Header checksum is invalid");
}
return body;
}
protected List<byte[]> ReadExtendedHeaders(Stream reader)
{
List<byte[]> extendedHeader = new List<byte[]>();
byte[] buffer = new byte[2];
while (true)
{
int bytesRead = reader.Read(buffer, 0, 2);
if (bytesRead < 2)
{
throw new EndOfStreamException(
"Unexpected end of stream while reading extended header size."
);
}
var extHeaderSize = (ushort)(buffer[0] | (buffer[1] << 8));
if (extHeaderSize == 0)
{
return extendedHeader;
}
byte[] header = new byte[extHeaderSize];
bytesRead = reader.Read(header, 0, extHeaderSize);
if (bytesRead < extHeaderSize)
{
throw new EndOfStreamException(
"Unexpected end of stream while reading extended header data."
);
}
byte[] crc = new byte[4];
bytesRead = reader.Read(crc, 0, 4);
if (bytesRead < 4)
{
throw new EndOfStreamException(
"Unexpected end of stream while reading extended header CRC."
);
}
var checksum = Crc32Stream.Compute(header);
if (checksum != BitConverter.ToUInt32(crc, 0))
{
throw new InvalidDataException("Extended header checksum is invalid");
}
extendedHeader.Add(header);
}
}
// Flag helpers
public bool IsGabled => (Flags & 0x01) != 0;
public bool IsAnsiPage => (Flags & 0x02) != 0;
public bool IsVolume => (Flags & 0x04) != 0;
public bool IsArjProtected => (Flags & 0x08) != 0;
public bool IsPathSym => (Flags & 0x10) != 0;
public bool IsBackup => (Flags & 0x20) != 0;
public bool IsSecured => (Flags & 0x40) != 0;
public bool IsAltName => (Flags & 0x80) != 0;
public static FileType FileTypeFromByte(byte value)
{
return Enum.IsDefined(typeof(FileType), value)
? (FileType)value
: Headers.FileType.Unknown;
}
public static bool IsArchive(Stream stream)
{
var bytes = new byte[2];
if (stream.Read(bytes, 0, 2) != 2)
{
return false;
}
extendedHeader.Add(header);
return CheckMagicBytes(bytes);
}
}
// Flag helpers
public bool IsGabled => (Flags & 0x01) != 0;
public bool IsAnsiPage => (Flags & 0x02) != 0;
public bool IsVolume => (Flags & 0x04) != 0;
public bool IsArjProtected => (Flags & 0x08) != 0;
public bool IsPathSym => (Flags & 0x10) != 0;
public bool IsBackup => (Flags & 0x20) != 0;
public bool IsSecured => (Flags & 0x40) != 0;
public bool IsAltName => (Flags & 0x80) != 0;
public static FileType FileTypeFromByte(byte value)
{
return Enum.IsDefined(typeof(FileType), value) ? (FileType)value : Headers.FileType.Unknown;
}
public static bool IsArchive(Stream stream)
{
var bytes = new byte[2];
if (stream.Read(bytes, 0, 2) != 2)
protected static bool CheckMagicBytes(byte[] headerBytes)
{
return false;
var magicValue = (ushort)(headerBytes[0] | headerBytes[1] << 8);
return magicValue == ARJ_MAGIC;
}
return CheckMagicBytes(bytes);
}
protected static bool CheckMagicBytes(byte[] headerBytes)
{
var magicValue = (ushort)(headerBytes[0] | headerBytes[1] << 8);
return magicValue == ARJ_MAGIC;
}
}
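The flag helpers near the end each test one bit of the header's flags byte. For example, a flags value of 0x05 (binary 0000_0101) sets bits 0 and 2:

byte flags = 0x05;
bool garbled = (flags & 0x01) != 0; // IsGabled ("garbled", i.e. password-protected) -> true
bool ansiPage = (flags & 0x02) != 0; // IsAnsiPage                                    -> false
bool volume = (flags & 0x04) != 0; // IsVolume                                      -> true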

View File

@@ -1,24 +0,0 @@
using System.IO;
using System.Threading;
using System.Threading.Tasks;
namespace SharpCompress.Common.Arj.Headers;
public partial class ArjLocalHeader
{
public override async ValueTask<ArjHeader?> ReadAsync(
Stream stream,
CancellationToken cancellationToken = default
)
{
var body = await ReadHeaderAsync(stream, cancellationToken);
if (body.Length > 0)
{
await ReadExtendedHeadersAsync(stream, cancellationToken);
var header = LoadFrom(body);
header.DataStartPosition = stream.Position;
return header;
}
return null;
}
}
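The .Async.cs files removed in this diff follow a partial-class convention: synchronous members live in the main file, and the async counterparts compile from a sibling file into the same type. A minimal sketch of the pattern with a hypothetical Widget type (not part of SharpCompress):

using System.IO;
using System.Threading;
using System.Threading.Tasks;

// Widget.cs — synchronous members.
public partial class Widget
{
    public int Read(Stream stream) => stream.ReadByte();
}

// Widget.Async.cs — async counterparts join the same type.
public partial class Widget
{
    public async ValueTask<int> ReadAsync(Stream stream, CancellationToken ct = default)
    {
        var buffer = new byte[1];
        var read = await stream.ReadAsync(buffer, 0, 1, ct);
        return read == 1 ? buffer[0] : -1;
    }
}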

View File

@@ -4,157 +4,158 @@ using System.IO;
using System.Linq;
using System.Runtime.CompilerServices;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
namespace SharpCompress.Common.Arj.Headers;
public partial class ArjLocalHeader : ArjHeader
namespace SharpCompress.Common.Arj.Headers
{
public ArchiveEncoding ArchiveEncoding { get; }
public long DataStartPosition { get; protected set; }
public byte ArchiverVersionNumber { get; set; }
public byte MinVersionToExtract { get; set; }
public HostOS HostOS { get; set; }
public CompressionMethod CompressionMethod { get; set; }
public DosDateTime DateTimeModified { get; set; } = new DosDateTime(0);
public long CompressedSize { get; set; }
public long OriginalSize { get; set; }
public long OriginalCrc32 { get; set; }
public int FileSpecPosition { get; set; }
public int FileAccessMode { get; set; }
public byte FirstChapter { get; set; }
public byte LastChapter { get; set; }
public long ExtendedFilePosition { get; set; }
public DosDateTime? DateTimeAccessed { get; set; }
public DosDateTime? DateTimeCreated { get; set; }
public long OriginalSizeEvenForVolumes { get; set; }
public string Name { get; set; } = string.Empty;
public string Comment { get; set; } = string.Empty;
private const byte StdHdrSize = 30;
private const byte R9HdrSize = 46;
public ArjLocalHeader(ArchiveEncoding archiveEncoding)
: base(ArjHeaderType.LocalHeader)
public class ArjLocalHeader : ArjHeader
{
ArchiveEncoding =
archiveEncoding ?? throw new ArgumentNullException(nameof(archiveEncoding));
}
public ArchiveEncoding ArchiveEncoding { get; }
public long DataStartPosition { get; protected set; }
public override ArjHeader? Read(Stream stream)
{
var body = ReadHeader(stream);
if (body.Length > 0)
public byte ArchiverVersionNumber { get; set; }
public byte MinVersionToExtract { get; set; }
public HostOS HostOS { get; set; }
public CompressionMethod CompressionMethod { get; set; }
public DosDateTime DateTimeModified { get; set; } = new DosDateTime(0);
public long CompressedSize { get; set; }
public long OriginalSize { get; set; }
public long OriginalCrc32 { get; set; }
public int FileSpecPosition { get; set; }
public int FileAccessMode { get; set; }
public byte FirstChapter { get; set; }
public byte LastChapter { get; set; }
public long ExtendedFilePosition { get; set; }
public DosDateTime DateTimeAccessed { get; set; } = new DosDateTime(0);
public DosDateTime DateTimeCreated { get; set; } = new DosDateTime(0);
public long OriginalSizeEvenForVolumes { get; set; }
public string Name { get; set; } = string.Empty;
public string Comment { get; set; } = string.Empty;
private const byte StdHdrSize = 30;
private const byte R9HdrSize = 46;
public ArjLocalHeader(ArchiveEncoding archiveEncoding)
: base(ArjHeaderType.LocalHeader)
{
ReadExtendedHeaders(stream);
var header = LoadFrom(body);
header.DataStartPosition = stream.Position;
return header;
ArchiveEncoding =
archiveEncoding ?? throw new ArgumentNullException(nameof(archiveEncoding));
}
return null;
}
// ReadAsync moved to ArjLocalHeader.Async.cs
public ArjLocalHeader LoadFrom(byte[] headerBytes)
{
int offset = 0;
int ReadInt16()
public override ArjHeader? Read(Stream stream)
{
if (offset + 1 >= headerBytes.Length)
var body = ReadHeader(stream);
if (body.Length > 0)
{
throw new EndOfStreamException();
ReadExtendedHeaders(stream);
var header = LoadFrom(body);
header.DataStartPosition = stream.Position;
return header;
}
var v = headerBytes[offset] & 0xFF | (headerBytes[offset + 1] & 0xFF) << 8;
offset += 2;
return v;
return null;
}
long ReadInt32()
public ArjLocalHeader LoadFrom(byte[] headerBytes)
{
if (offset + 3 >= headerBytes.Length)
int offset = 0;
int ReadInt16()
{
throw new EndOfStreamException();
if (offset + 1 >= headerBytes.Length)
{
throw new EndOfStreamException();
}
var v = headerBytes[offset] & 0xFF | (headerBytes[offset + 1] & 0xFF) << 8;
offset += 2;
return v;
}
long v =
headerBytes[offset] & 0xFF
| (headerBytes[offset + 1] & 0xFF) << 8
| (headerBytes[offset + 2] & 0xFF) << 16
| (headerBytes[offset + 3] & 0xFF) << 24;
offset += 4;
return v;
}
byte headerSize = headerBytes[offset++];
ArchiverVersionNumber = headerBytes[offset++];
MinVersionToExtract = headerBytes[offset++];
HostOS hostOS = (HostOS)headerBytes[offset++];
Flags = headerBytes[offset++];
CompressionMethod = CompressionMethodFromByte(headerBytes[offset++]);
FileType = FileTypeFromByte(headerBytes[offset++]);
offset++; // Skip 1 byte
var rawTimestamp = ReadInt32();
DateTimeModified = rawTimestamp != 0 ? new DosDateTime(rawTimestamp) : new DosDateTime(0);
CompressedSize = ReadInt32();
OriginalSize = ReadInt32();
OriginalCrc32 = ReadInt32();
FileSpecPosition = ReadInt16();
FileAccessMode = ReadInt16();
FirstChapter = headerBytes[offset++];
LastChapter = headerBytes[offset++];
ExtendedFilePosition = 0;
OriginalSizeEvenForVolumes = 0;
if (headerSize > StdHdrSize)
{
ExtendedFilePosition = ReadInt32();
if (headerSize >= R9HdrSize)
long ReadInt32()
{
rawTimestamp = ReadInt32();
DateTimeAccessed = rawTimestamp != 0 ? new DosDateTime(rawTimestamp) : null;
rawTimestamp = ReadInt32();
DateTimeCreated = rawTimestamp != 0 ? new DosDateTime(rawTimestamp) : null;
OriginalSizeEvenForVolumes = ReadInt32();
if (offset + 3 >= headerBytes.Length)
{
throw new EndOfStreamException();
}
long v =
headerBytes[offset] & 0xFF
| (headerBytes[offset + 1] & 0xFF) << 8
| (headerBytes[offset + 2] & 0xFF) << 16
| (headerBytes[offset + 3] & 0xFF) << 24;
offset += 4;
return v;
}
byte headerSize = headerBytes[offset++];
ArchiverVersionNumber = headerBytes[offset++];
MinVersionToExtract = headerBytes[offset++];
HostOS hostOS = (HostOS)headerBytes[offset++];
Flags = headerBytes[offset++];
CompressionMethod = CompressionMethodFromByte(headerBytes[offset++]);
FileType = FileTypeFromByte(headerBytes[offset++]);
offset++; // Skip 1 byte
var rawTimestamp = ReadInt32();
DateTimeModified =
rawTimestamp != 0 ? new DosDateTime(rawTimestamp) : new DosDateTime(0);
CompressedSize = ReadInt32();
OriginalSize = ReadInt32();
OriginalCrc32 = ReadInt32();
FileSpecPosition = ReadInt16();
FileAccessMode = ReadInt16();
FirstChapter = headerBytes[offset++];
LastChapter = headerBytes[offset++];
ExtendedFilePosition = 0;
OriginalSizeEvenForVolumes = 0;
if (headerSize > StdHdrSize)
{
ExtendedFilePosition = ReadInt32();
if (headerSize >= R9HdrSize)
{
rawTimestamp = ReadInt32();
DateTimeAccessed =
rawTimestamp != 0 ? new DosDateTime(rawTimestamp) : new DosDateTime(0);
rawTimestamp = ReadInt32();
DateTimeCreated =
rawTimestamp != 0 ? new DosDateTime(rawTimestamp) : new DosDateTime(0);
OriginalSizeEvenForVolumes = ReadInt32();
}
}
Name = Encoding.ASCII.GetString(
headerBytes,
offset,
Array.IndexOf(headerBytes, (byte)0, offset) - offset
);
offset += Name.Length + 1;
Comment = Encoding.ASCII.GetString(
headerBytes,
offset,
Array.IndexOf(headerBytes, (byte)0, offset) - offset
);
offset += Comment.Length + 1;
return this;
}
Name = Encoding.ASCII.GetString(
headerBytes,
offset,
Array.IndexOf(headerBytes, (byte)0, offset) - offset
);
offset += Name.Length + 1;
Comment = Encoding.ASCII.GetString(
headerBytes,
offset,
Array.IndexOf(headerBytes, (byte)0, offset) - offset
);
offset += Comment.Length + 1;
return this;
}
public static CompressionMethod CompressionMethodFromByte(byte value)
{
return value switch
public static CompressionMethod CompressionMethodFromByte(byte value)
{
0 => CompressionMethod.Stored,
1 => CompressionMethod.CompressedMost,
2 => CompressionMethod.Compressed,
3 => CompressionMethod.CompressedFaster,
4 => CompressionMethod.CompressedFastest,
8 => CompressionMethod.NoDataNoCrc,
9 => CompressionMethod.NoData,
_ => CompressionMethod.Unknown,
};
return value switch
{
0 => CompressionMethod.Stored,
1 => CompressionMethod.CompressedMost,
2 => CompressionMethod.Compressed,
3 => CompressionMethod.CompressedFaster,
4 => CompressionMethod.CompressedFastest,
8 => CompressionMethod.NoDataNoCrc,
9 => CompressionMethod.NoData,
_ => CompressionMethod.Unknown,
};
}
}
}
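LoadFrom walks the fixed header with the little-endian ReadInt16/ReadInt32 helpers, and the three timestamps are packed DOS date-times (date in the high 16 bits, time in the low 16). A decoding sketch, assuming the standard DOS layout that DosDateTime presumably implements:

using System;

static class DosTime
{
    // No range validation; a real decoder should check month/day before
    // constructing the DateTime.
    public static DateTime Decode(uint packed)
    {
        int second = (int)(packed & 0x1F) * 2;          // bits 0-4: seconds / 2
        int minute = (int)((packed >> 5) & 0x3F);       // bits 5-10
        int hour = (int)((packed >> 11) & 0x1F);        // bits 11-15
        int day = (int)((packed >> 16) & 0x1F);         // bits 16-20
        int month = (int)((packed >> 21) & 0x0F);       // bits 21-24
        int year = (int)((packed >> 25) & 0x7F) + 1980; // bits 25-31
        return new DateTime(year, month, day, hour, minute, second);
    }
}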

Some files were not shown because too many files have changed in this diff.