Compare commits

..

3 Commits

Author SHA1 Message Date
copilot-swe-agent[bot]
4a6e5232ae Add opt-in multi-threading support with SupportsMultiThreadedExtraction flag
- Added IArchive.SupportsMultiThreadedExtraction property to indicate if multi-threading is supported
- Added ReaderOptions.EnableMultiThreadedExtraction option to opt-in to multi-threading
- Updated SeekableZipFilePart, TarFilePart, and SeekableFilePart to check the flag
- Added test to verify multi-threading flag behavior
- Multi-threading is now disabled by default for backward compatibility
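The opt-in flow described in the bullets above can be sketched as follows (a minimal sketch, assuming `ArchiveFactory.Open(string, ReaderOptions)` accepts the options instance; the exact overload may differ):

```csharp
using SharpCompress.Archives;
using SharpCompress.Readers;

// Opt in explicitly; multi-threading is disabled by default for backward compatibility.
var options = new ReaderOptions { EnableMultiThreadedExtraction = true };

// Multi-threading requires a file-based archive (FileInfo or path), not a raw stream.
using var archive = ArchiveFactory.Open("archive.zip", options);

if (archive.SupportsMultiThreadedExtraction)
{
    // Safe to extract entries in parallel.
}
else
{
    // Fall back to sequential extraction (e.g. stream-based or SOLID archives).
}
```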

Co-authored-by: adamhathcock <527620+adamhathcock@users.noreply.github.com>
2026-01-18 16:27:59 +00:00
copilot-swe-agent[bot]
3e23a6e5a6 Add multi-threading support for file-based archives - sync test passing
Co-authored-by: adamhathcock <527620+adamhathcock@users.noreply.github.com>
2026-01-18 16:09:54 +00:00
copilot-swe-agent[bot]
e0a43e9727 Initial plan 2026-01-18 15:56:08 +00:00
497 changed files with 14610 additions and 32156 deletions

7
.copilot-agent.yml Normal file

@@ -0,0 +1,7 @@
enabled: true
agent:
name: copilot-coding-agent
allow:
- paths: ["src/**/*", "tests/**/*", "README.md", "AGENTS.md"]
actions: ["create", "modify"]
require_review_before_merge: true

17
.github/agents/copilot-agent.yml vendored Normal file

@@ -0,0 +1,17 @@
enabled: true
agent:
name: copilot-coding-agent
allow:
- paths: ["src/**/*", "tests/**/*", "README.md", "AGENTS.md"]
actions: ["create", "modify", "delete"]
require_review_before_merge: true
required_approvals: 1
allowed_merge_strategies:
- squash
- merge
auto_merge_on_green: false
run_workflows: true
notes: |
- This manifest expresses the policy for the Copilot coding agent in this repository.
- It does NOT install or authorize the agent; a repository admin must install the Copilot coding agent app and grant the repository the necessary permissions (contents: write, pull_requests: write, checks: write, actions: write/read, issues: write) to allow the agent to act.
- Keep allow paths narrow and prefer require_review_before_merge during initial rollout.

25
.github/prompts/plan-async.prompt.md vendored Normal file

@@ -0,0 +1,25 @@
# Plan: Implement Missing Async Functionality in SharpCompress
SharpCompress has async support for low-level stream operations and Reader/Writer APIs, but critical entry points (Archive.Open, factory methods, initialization) remain synchronous. This plan adds async overloads for all user-facing I/O operations and fixes existing async bugs, enabling full end-to-end async workflows.
## Steps
1. **Add async factory methods** to [ArchiveFactory.cs](src/SharpCompress/Factories/ArchiveFactory.cs), [ReaderFactory.cs](src/SharpCompress/Factories/ReaderFactory.cs), and [WriterFactory.cs](src/SharpCompress/Factories/WriterFactory.cs) with `OpenAsync` and `CreateAsync` overloads accepting `CancellationToken`
2. **Implement async Open methods** on concrete archive types ([ZipArchive.cs](src/SharpCompress/Archives/Zip/ZipArchive.cs), [TarArchive.cs](src/SharpCompress/Archives/Tar/TarArchive.cs), [RarArchive.cs](src/SharpCompress/Archives/Rar/RarArchive.cs), [GZipArchive.cs](src/SharpCompress/Archives/GZip/GZipArchive.cs), [SevenZipArchive.cs](src/SharpCompress/Archives/SevenZip/SevenZipArchive.cs)) and reader types ([ZipReader.cs](src/SharpCompress/Readers/Zip/ZipReader.cs), [TarReader.cs](src/SharpCompress/Readers/Tar/TarReader.cs), etc.)
3. **Convert archive initialization logic to async** including header reading, volume loading, and format signature detection across archive constructors and internal initialization methods
4. **Fix LZMA decoder async bugs** in [LzmaStream.cs](src/SharpCompress/Compressors/LZMA/LzmaStream.cs), [Decoder.cs](src/SharpCompress/Compressors/LZMA/Decoder.cs), and [OutWindow.cs](src/SharpCompress/Compressors/LZMA/OutWindow.cs) to enable true async 7Zip support and remove `NonDisposingStream` workaround
5. **Complete Rar async implementation** by converting `UnpackV2017` methods to async in [UnpackV2017.cs](src/SharpCompress/Compressors/Rar/UnpackV2017.cs) and updating Rar20 decompression
6. **Add comprehensive async tests** covering all new async entry points, cancellation scenarios, and concurrent operations across all archive formats in test files
## Further Considerations
1. **Breaking changes** - Should new async methods be added alongside existing sync methods (non-breaking), or should sync methods eventually be deprecated? Recommend additive approach for backward compatibility.
2. **Performance impact** - Header parsing for formats like Zip/Tar often involves only small reads; consider whether truly async parsing adds value versus sync parsing wrapped in a Task, or make it conditional on stream type (network vs file).
3. **7Zip complexity** - The LZMA async bug fix (Step 4) may be challenging due to state management in the decoder; consider whether to scope it separately or implement a simpler workaround that maintains correctness.
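One possible shape for the async factory overloads proposed in Step 1 (hypothetical signature and body; the actual API added by the PR may differ):

```csharp
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Archives;
using SharpCompress.Readers;

public static partial class ArchiveFactory
{
    // Hypothetical async counterpart to ArchiveFactory.Open: format signature
    // detection and header reading would be awaited internally rather than
    // blocking, and the token would flow into all I/O.
    public static ValueTask<IArchive> OpenAsync(
        string filePath,
        ReaderOptions? readerOptions = null,
        CancellationToken cancellationToken = default)
    {
        // ... async format detection and archive initialization ...
        throw new System.NotImplementedException(); // sketch only
    }
}
```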

123
.github/prompts/plan-for-next.prompt.md vendored Normal file

@@ -0,0 +1,123 @@
# Plan: Modernize SharpCompress Public API
Based on comprehensive analysis, the API has several inconsistencies around factory patterns, async support, format capabilities, and options classes. Most improvements can be done incrementally without breaking changes.
## Steps
1. **Standardize factory patterns** by deprecating format-specific static `Open` methods in [Archives/Zip/ZipArchive.cs](src/SharpCompress/Archives/Zip/ZipArchive.cs), [Archives/Tar/TarArchive.cs](src/SharpCompress/Archives/Tar/TarArchive.cs), etc. in favor of centralized [Factories/ArchiveFactory.cs](src/SharpCompress/Factories/ArchiveFactory.cs)
2. **Complete async implementation** in [Writers/Zip/ZipWriter.cs](src/SharpCompress/Writers/Zip/ZipWriter.cs) and other writers that currently use sync-over-async, implementing true async I/O throughout the writer hierarchy
3. **Unify options classes** by making [Common/ExtractionOptions.cs](src/SharpCompress/Common/ExtractionOptions.cs) inherit from `OptionsBase` and adding progress reporting to extraction methods consistently
4. **Clarify GZip semantics** in [Archives/GZip/GZipArchive.cs](src/SharpCompress/Archives/GZip/GZipArchive.cs) by adding XML documentation explaining single-entry limitation and relationship to GZip compression used in Tar.gz
## Further Considerations
1. **Breaking changes roadmap** - Should we plan a major version (2.0) to remove deprecated factory methods, clean up `ArchiveType` enum (remove Arc/Arj or add full support), and consolidate naming patterns?
2. **Progress reporting consistency** - Should `IProgress<ArchiveExtractionProgress<IEntry>>` be added to all extraction extension methods or consolidated into options classes?
## Detailed Analysis
### Factory Pattern Issues
Three different factory patterns exist with overlapping functionality:
1. **Static Factories**: ArchiveFactory, ReaderFactory, WriterFactory
2. **Instance Factories**: IArchiveFactory, IReaderFactory, IWriterFactory
3. **Format-specific static methods**: Each archive class has static `Open` methods
**Example confusion:**
```csharp
// Three ways to open a Zip archive - which is recommended?
var archive1 = ArchiveFactory.Open("file.zip");
var archive2 = ZipArchive.Open("file.zip");
var archive3 = ArchiveFactory.AutoFactory.Open(fileInfo, options);
```
### Async Support Gaps
Base `IWriter` interface has async methods, but writer implementations provide minimal async support. Most writers just call synchronous methods:
```csharp
public virtual async Task WriteAsync(...)
{
// Default implementation calls synchronous version
Write(filename, source, modificationTime);
await Task.CompletedTask.ConfigureAwait(false);
}
```
A real async implementation exists only in `TarWriter`; most other writers use sync-over-async.
### GZip Archive Special Case
GZip is treated as both a compression format and an archive format, but only supports single-entry archives:
```csharp
protected override GZipArchiveEntry CreateEntryInternal(...)
{
if (Entries.Any())
{
throw new InvalidFormatException("Only one entry is allowed in a GZip Archive");
}
// ...
}
```
### Options Class Hierarchy
```
OptionsBase (LeaveStreamOpen, ArchiveEncoding)
├─ ReaderOptions (LookForHeader, Password, DisableCheckIncomplete, BufferSize, ExtensionHint, Progress)
├─ WriterOptions (CompressionType, CompressionLevel, Progress)
│ ├─ ZipWriterOptions (ArchiveComment, UseZip64)
│ ├─ TarWriterOptions (FinalizeArchiveOnClose, HeaderFormat)
│ └─ GZipWriterOptions (no additional properties)
└─ ExtractionOptions (standalone - Overwrite, ExtractFullPath, PreserveFileTime, PreserveAttributes)
```
**Issues:**
- `ExtractionOptions` doesn't inherit from `OptionsBase` - no encoding support during extraction
- Progress reporting inconsistency between readers and extraction
- Obsolete properties (`ChecksumIsValid`, `Version`) with unclear migration path
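The unification proposed in Step 3 might look like this (hypothetical sketch; the property set is taken from the hierarchy above, and the `Progress` property illustrates one of the two approaches raised under Further Considerations):

```csharp
using System;

// Hypothetical: ExtractionOptions inheriting OptionsBase would give extraction
// access to LeaveStreamOpen and ArchiveEncoding like the other options classes.
public class ExtractionOptions : OptionsBase
{
    public bool Overwrite { get; set; }
    public bool ExtractFullPath { get; set; }
    public bool PreserveFileTime { get; set; }
    public bool PreserveAttributes { get; set; }

    // Progress reporting consolidated into the options class rather than added
    // as a parameter on every extraction extension method.
    public IProgress<ArchiveExtractionProgress<IEntry>>? Progress { get; set; }
}
```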
### Implementation Priorities
**High Priority (Non-Breaking):**
1. Add API usage guide (Archive vs Reader, factory recommendations, async best practices)
2. Fix progress reporting consistency
3. Complete async implementation in writers
**Medium Priority (Next Major Version):**
1. Unify factory pattern - deprecate format-specific static `Open` methods
2. Clean up options classes - make `ExtractionOptions` inherit from `OptionsBase`
3. Clarify archive types - remove Arc/Arj from `ArchiveType` enum or add full support
4. Standardize naming across archive types
**Low Priority:**
1. Add BZip2 archive support similar to GZipArchive
2. Complete obsolete property cleanup with migration guide
### Backward Compatibility Strategy
**Safe (Non-Breaking) Changes:**
- Add new methods to interfaces (use default implementations)
- Add new options properties (with defaults)
- Add new factory methods
- Improve async implementations
- Add progress reporting support
**Breaking Changes to Avoid:**
- ❌ Removing format-specific `Open` methods (deprecate instead)
- ❌ Changing `LeaveStreamOpen` default (currently `true`)
- ❌ Removing obsolete properties before major version bump
- ❌ Changing return types or signatures of existing methods
**Deprecation Pattern:**
- Use `[Obsolete]` for one major version
- Use `[EditorBrowsable(EditorBrowsableState.Never)]` in next major version
- Remove in following major version
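As a concrete illustration of stage one of this pattern (the attribute usage is standard .NET; the message text and method shape are hypothetical):

```csharp
using System;

public partial class ZipArchive
{
    // Stage one: [Obsolete] for one major version. The next major version would
    // add [EditorBrowsable(EditorBrowsableState.Never)], and the one after that
    // would remove the method entirely.
    [Obsolete("Use ArchiveFactory.Open instead; this overload will be removed in a future major version.")]
    public static ZipArchive Open(string filePath) =>
        throw new NotImplementedException(); // sketch only
}
```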


@@ -1,50 +0,0 @@
name: Performance Benchmarks
on:
push:
branches:
- 'master'
- 'release'
pull_request:
branches:
- 'master'
- 'release'
workflow_dispatch:
permissions:
contents: read
jobs:
benchmark:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v6
with:
fetch-depth: 0
- uses: actions/setup-dotnet@v5
with:
dotnet-version: 10.0.x
- name: Build Performance Project
run: dotnet build tests/SharpCompress.Performance/SharpCompress.Performance.csproj --configuration Release
- name: Run Benchmarks
run: dotnet run --project tests/SharpCompress.Performance/SharpCompress.Performance.csproj --configuration Release --no-build -- --filter "*" --exporters json markdown --artifacts benchmark-results
continue-on-error: true
- name: Display Benchmark Results
if: always()
run: dotnet run --project build/build.csproj -- display-benchmark-results
- name: Compare with Baseline
if: always()
run: dotnet run --project build/build.csproj -- compare-benchmark-results
- name: Upload Benchmark Results
if: always()
uses: actions/upload-artifact@v6
with:
name: benchmark-results
path: benchmark-results/

4
.gitignore vendored

@@ -17,10 +17,6 @@ tests/TestArchives/*/Scratch2
tools
.idea/
artifacts/
BenchmarkDotNet.Artifacts/
baseline-artifacts/
profiler-snapshots/
.DS_Store
*.snupkg
benchmark-results/


@@ -179,58 +179,3 @@ SharpCompress supports multiple archive and compression formats:
3. **Stream disposal** - Always set `LeaveStreamOpen` explicitly when needed (default is to close)
4. **Tar + non-seekable stream** - Must provide file size or it will throw
5. **Format detection** - Use `ReaderFactory.Open()` for auto-detection, test with actual archive files
### Async Struct-Copy Bug in LZMA RangeCoder
When implementing async methods on mutable `struct` types (like `BitEncoder` and `BitDecoder` in the LZMA RangeCoder), be aware that the async state machine copies the struct when `await` is encountered. This means mutations to struct fields after the `await` point may not persist back to the original struct stored in arrays or fields.
**The Bug:**
```csharp
// BAD: async method on mutable struct
public async ValueTask<uint> DecodeAsync(Decoder decoder, CancellationToken cancellationToken = default)
{
var newBound = (decoder._range >> K_NUM_BIT_MODEL_TOTAL_BITS) * _prob;
if (decoder._code < newBound)
{
decoder._range = newBound;
_prob += (K_BIT_MODEL_TOTAL - _prob) >> K_NUM_MOVE_BITS; // Mutates _prob
await decoder.Normalize2Async(cancellationToken).ConfigureAwait(false); // Struct gets copied here
return 0; // Original _prob update may be lost
}
// ...
}
```
**The Fix:**
Refactor async methods on mutable structs to perform all struct mutations synchronously before any `await`, or use a helper method to separate the await from the struct mutation:
```csharp
// GOOD: struct mutations happen synchronously, await is conditional
public ValueTask<uint> DecodeAsync(Decoder decoder, CancellationToken cancellationToken = default)
{
var newBound = (decoder._range >> K_NUM_BIT_MODEL_TOTAL_BITS) * _prob;
if (decoder._code < newBound)
{
decoder._range = newBound;
_prob += (K_BIT_MODEL_TOTAL - _prob) >> K_NUM_MOVE_BITS; // All mutations complete
return DecodeAsyncHelper(decoder.Normalize2Async(cancellationToken), 0); // Await in helper
}
decoder._range -= newBound;
decoder._code -= newBound;
_prob -= (_prob) >> K_NUM_MOVE_BITS; // All mutations complete
return DecodeAsyncHelper(decoder.Normalize2Async(cancellationToken), 1); // Await in helper
}
private static async ValueTask<uint> DecodeAsyncHelper(ValueTask normalizeTask, uint result)
{
await normalizeTask.ConfigureAwait(false);
return result;
}
```
**Why This Matters:**
In LZMA, the `BitEncoder` and `BitDecoder` structs maintain adaptive probability models in their `_prob` field. When these structs are stored in arrays (e.g., `_models[m]`), the async state machine copy breaks the adaptive model, causing incorrect bit decoding and eventually `DataErrorException` exceptions.
**Related Files:**
- `src/SharpCompress/Compressors/LZMA/RangeCoder/RangeCoderBit.Async.cs` - Fixed
- `src/SharpCompress/Compressors/LZMA/RangeCoder/RangeCoderBitTree.Async.cs` - Uses readonly structs, so this pattern doesn't apply


@@ -1,20 +1,20 @@
<Project>
<ItemGroup>
<PackageVersion Include="BenchmarkDotNet" Version="0.15.8" />
<PackageVersion Include="Bullseye" Version="6.1.0" />
<PackageVersion Include="AwesomeAssertions" Version="9.3.0" />
<PackageVersion Include="Glob" Version="1.1.9" />
<PackageVersion Include="JetBrains.Profiler.SelfApi" Version="2.5.16" />
<PackageVersion Include="JetBrains.Profiler.SelfApi" Version="2.5.15" />
<PackageVersion Include="Microsoft.Bcl.AsyncInterfaces" Version="10.0.0" />
<PackageVersion Include="Microsoft.NET.ILLink.Task" Version="10.0.0" />
<PackageVersion Include="Microsoft.NET.Test.Sdk" Version="18.0.1" />
<PackageVersion Include="Mono.Posix.NETStandard" Version="1.0.0" />
<PackageVersion Include="SimpleExec" Version="13.0.0" />
<PackageVersion Include="System.Text.Encoding.CodePages" Version="10.0.0" />
<PackageVersion Include="System.Buffers" Version="4.6.1" />
<PackageVersion Include="System.Memory" Version="4.6.3" />
<PackageVersion Include="xunit.v3" Version="3.2.2" />
<PackageVersion Include="xunit" Version="2.9.3" />
<PackageVersion Include="xunit.runner.visualstudio" Version="3.1.5" />
<GlobalPackageReference Include="Microsoft.SourceLink.GitHub" Version="10.0.102" />
<GlobalPackageReference Include="Microsoft.SourceLink.GitHub" Version="8.0.0" />
<GlobalPackageReference Include="Microsoft.NETFramework.ReferenceAssemblies" Version="1.0.3" />
<GlobalPackageReference
Include="Microsoft.VisualStudio.Threading.Analyzers"


@@ -19,9 +19,6 @@ const string Publish = "publish";
const string DetermineVersion = "determine-version";
const string UpdateVersion = "update-version";
const string PushToNuGet = "push-to-nuget";
const string DisplayBenchmarkResults = "display-benchmark-results";
const string CompareBenchmarkResults = "compare-benchmark-results";
const string GenerateBaseline = "generate-baseline";
Target(
Clean,
@@ -213,249 +210,6 @@ Target(
}
);
Target(
DisplayBenchmarkResults,
() =>
{
var githubStepSummary = Environment.GetEnvironmentVariable("GITHUB_STEP_SUMMARY");
var resultsDir = "benchmark-results/results";
if (!Directory.Exists(resultsDir))
{
Console.WriteLine("No benchmark results found.");
return;
}
var markdownFiles = Directory
.GetFiles(resultsDir, "*-report-github.md")
.OrderBy(f => f)
.ToList();
if (markdownFiles.Count == 0)
{
Console.WriteLine("No benchmark markdown reports found.");
return;
}
var output = new List<string> { "## Benchmark Results", "" };
foreach (var file in markdownFiles)
{
Console.WriteLine($"Processing {Path.GetFileName(file)}");
var content = File.ReadAllText(file);
output.Add(content);
output.Add("");
}
// Write to GitHub Step Summary if available
if (!string.IsNullOrEmpty(githubStepSummary))
{
File.AppendAllLines(githubStepSummary, output);
Console.WriteLine($"Benchmark results written to GitHub Step Summary");
}
else
{
// Write to console if not in GitHub Actions
foreach (var line in output)
{
Console.WriteLine(line);
}
}
}
);
Target(
CompareBenchmarkResults,
() =>
{
var githubStepSummary = Environment.GetEnvironmentVariable("GITHUB_STEP_SUMMARY");
var baselinePath = "tests/SharpCompress.Performance/baseline-results.md";
var resultsDir = "benchmark-results/results";
var output = new List<string> { "## Comparison with Baseline", "" };
if (!File.Exists(baselinePath))
{
Console.WriteLine("Baseline file not found");
output.Add("⚠️ Baseline file not found. Run `generate-baseline` to create it.");
WriteOutput(output, githubStepSummary);
return;
}
if (!Directory.Exists(resultsDir))
{
Console.WriteLine("No current benchmark results found.");
output.Add("⚠️ No current benchmark results found. Showing baseline only.");
output.Add("");
output.Add("### Baseline Results");
output.AddRange(File.ReadAllLines(baselinePath));
WriteOutput(output, githubStepSummary);
return;
}
var markdownFiles = Directory
.GetFiles(resultsDir, "*-report-github.md")
.OrderBy(f => f)
.ToList();
if (markdownFiles.Count == 0)
{
Console.WriteLine("No current benchmark markdown reports found.");
output.Add("⚠️ No current benchmark results found. Showing baseline only.");
output.Add("");
output.Add("### Baseline Results");
output.AddRange(File.ReadAllLines(baselinePath));
WriteOutput(output, githubStepSummary);
return;
}
Console.WriteLine("Parsing baseline results...");
var baselineMetrics = ParseBenchmarkResults(File.ReadAllText(baselinePath));
Console.WriteLine("Parsing current results...");
var currentText = string.Join("\n", markdownFiles.Select(f => File.ReadAllText(f)));
var currentMetrics = ParseBenchmarkResults(currentText);
Console.WriteLine("Comparing results...");
output.Add("### Performance Comparison");
output.Add("");
output.Add(
"| Benchmark | Baseline Mean | Current Mean | Change | Baseline Memory | Current Memory | Change |"
);
output.Add(
"|-----------|---------------|--------------|--------|-----------------|----------------|--------|"
);
var hasRegressions = false;
var hasImprovements = false;
foreach (var method in currentMetrics.Keys.Union(baselineMetrics.Keys).OrderBy(k => k))
{
var hasCurrent = currentMetrics.TryGetValue(method, out var current);
var hasBaseline = baselineMetrics.TryGetValue(method, out var baseline);
if (!hasCurrent)
{
output.Add(
$"| {method} | {baseline!.Mean} | ❌ Missing | N/A | {baseline.Memory} | N/A | N/A |"
);
continue;
}
if (!hasBaseline)
{
output.Add(
$"| {method} | ❌ New | {current!.Mean} | N/A | N/A | {current.Memory} | N/A |"
);
continue;
}
var timeChange = CalculateChange(baseline!.MeanValue, current!.MeanValue);
var memChange = CalculateChange(baseline.MemoryValue, current.MemoryValue);
var timeIcon =
timeChange > 25 ? "🔴"
: timeChange < -25 ? "🟢"
: "⚪";
var memIcon =
memChange > 25 ? "🔴"
: memChange < -25 ? "🟢"
: "⚪";
if (timeChange > 25 || memChange > 25)
hasRegressions = true;
if (timeChange < -25 || memChange < -25)
hasImprovements = true;
output.Add(
$"| {method} | {baseline.Mean} | {current.Mean} | {timeIcon} {timeChange:+0.0;-0.0;0}% | {baseline.Memory} | {current.Memory} | {memIcon} {memChange:+0.0;-0.0;0}% |"
);
}
output.Add("");
output.Add("**Legend:**");
output.Add("- 🔴 Regression (>25% slower/more memory)");
output.Add("- 🟢 Improvement (>25% faster/less memory)");
output.Add("- ⚪ No significant change");
if (hasRegressions)
{
output.Add("");
output.Add(
"⚠️ **Warning**: Performance regressions detected. Review the changes carefully."
);
}
else if (hasImprovements)
{
output.Add("");
output.Add("✅ Performance improvements detected!");
}
else
{
output.Add("");
output.Add("✅ Performance is stable compared to baseline.");
}
WriteOutput(output, githubStepSummary);
}
);
Target(
GenerateBaseline,
() =>
{
var perfProject = "tests/SharpCompress.Performance/SharpCompress.Performance.csproj";
var baselinePath = "tests/SharpCompress.Performance/baseline-results.md";
var artifactsDir = "baseline-artifacts";
Console.WriteLine("Building performance project...");
Run("dotnet", $"build {perfProject} --configuration Release");
Console.WriteLine("Running benchmarks to generate baseline...");
Run(
"dotnet",
$"run --project {perfProject} --configuration Release --no-build -- --filter \"*\" --exporters markdown --artifacts {artifactsDir}"
);
var resultsDir = Path.Combine(artifactsDir, "results");
if (!Directory.Exists(resultsDir))
{
Console.WriteLine("ERROR: No benchmark results generated.");
return;
}
var markdownFiles = Directory
.GetFiles(resultsDir, "*-report-github.md")
.OrderBy(f => f)
.ToList();
if (markdownFiles.Count == 0)
{
Console.WriteLine("ERROR: No markdown reports found.");
return;
}
Console.WriteLine($"Combining {markdownFiles.Count} benchmark reports...");
var baselineContent = new List<string>();
foreach (var file in markdownFiles)
{
var lines = File.ReadAllLines(file);
baselineContent.AddRange(lines.Select(l => l.Trim()).Where(l => l.StartsWith('|')));
}
File.WriteAllText(baselinePath, string.Join(Environment.NewLine, baselineContent));
Console.WriteLine($"Baseline written to {baselinePath}");
// Clean up artifacts directory
if (Directory.Exists(artifactsDir))
{
Directory.Delete(artifactsDir, true);
Console.WriteLine("Cleaned up artifacts directory.");
}
}
);
Target("default", [Publish], () => Console.WriteLine("Done!"));
await RunTargetsAndExitAsync(args);
@@ -476,7 +230,7 @@ static async Task<(string version, bool isPrerelease)> GetVersion()
}
else
{
// Not tagged - create prerelease version
// Not tagged - create prerelease version based on next minor version
var allTags = (await GetGitOutput("tag", "--list"))
.Split('\n', StringSplitOptions.RemoveEmptyEntries)
.Where(tag => Regex.IsMatch(tag.Trim(), @"^\d+\.\d+\.\d+$"))
@@ -486,22 +240,8 @@ static async Task<(string version, bool isPrerelease)> GetVersion()
var lastTag = allTags.OrderBy(tag => Version.Parse(tag)).LastOrDefault() ?? "0.0.0";
var lastVersion = Version.Parse(lastTag);
// Determine version increment based on branch
var currentBranch = await GetCurrentBranch();
Version nextVersion;
if (currentBranch == "release")
{
// Release branch: increment patch version
nextVersion = new Version(lastVersion.Major, lastVersion.Minor, lastVersion.Build + 1);
Console.WriteLine($"Building prerelease for release branch (patch increment)");
}
else
{
// Master or other branches: increment minor version
nextVersion = new Version(lastVersion.Major, lastVersion.Minor + 1, 0);
Console.WriteLine($"Building prerelease for {currentBranch} branch (minor increment)");
}
// Increment minor version for next release
var nextVersion = new Version(lastVersion.Major, lastVersion.Minor + 1, 0);
// Use commit count since the last version tag if available; otherwise, fall back to total count
var revListArgs = allTags.Any() ? $"--count {lastTag}..HEAD" : "--count HEAD";
@@ -513,28 +253,6 @@ static async Task<(string version, bool isPrerelease)> GetVersion()
}
}
static async Task<string> GetCurrentBranch()
{
// In GitHub Actions, GITHUB_REF_NAME contains the branch name
var githubRefName = Environment.GetEnvironmentVariable("GITHUB_REF_NAME");
if (!string.IsNullOrEmpty(githubRefName))
{
return githubRefName;
}
// Fallback to git command for local builds
try
{
var (output, _) = await ReadAsync("git", "branch --show-current");
return output.Trim();
}
catch (Exception ex)
{
Console.WriteLine($"Warning: Could not determine current branch: {ex.Message}");
return "unknown";
}
}
static async Task<string> GetGitOutput(string command, string args)
{
try
@@ -548,142 +266,3 @@ static async Task<string> GetGitOutput(string command, string args)
throw new Exception($"Git command failed: git {command} {args}\n{ex.Message}", ex);
}
}
static void WriteOutput(List<string> output, string? githubStepSummary)
{
if (!string.IsNullOrEmpty(githubStepSummary))
{
File.AppendAllLines(githubStepSummary, output);
Console.WriteLine("Comparison written to GitHub Step Summary");
}
else
{
foreach (var line in output)
{
Console.WriteLine(line);
}
}
}
static Dictionary<string, BenchmarkMetric> ParseBenchmarkResults(string markdown)
{
var metrics = new Dictionary<string, BenchmarkMetric>();
var lines = markdown.Split('\n');
for (int i = 0; i < lines.Length; i++)
{
var line = lines[i].Trim();
// Look for table rows with benchmark data
if (line.StartsWith("|") && line.Contains("&#39;") && i > 0)
{
var parts = line.Split('|', StringSplitOptions.TrimEntries);
if (parts.Length >= 5)
{
var method = parts[1].Replace("&#39;", "'");
var meanStr = parts[2];
// Find Allocated column - it's usually the last column or labeled "Allocated"
string memoryStr = "N/A";
for (int j = parts.Length - 2; j >= 2; j--)
{
if (
parts[j].Contains("KB")
|| parts[j].Contains("MB")
|| parts[j].Contains("GB")
|| parts[j].Contains("B")
)
{
memoryStr = parts[j];
break;
}
}
if (
!method.Equals("Method", StringComparison.OrdinalIgnoreCase)
&& !string.IsNullOrWhiteSpace(method)
)
{
var metric = new BenchmarkMetric
{
Method = method,
Mean = meanStr,
MeanValue = ParseTimeValue(meanStr),
Memory = memoryStr,
MemoryValue = ParseMemoryValue(memoryStr),
};
metrics[method] = metric;
}
}
}
}
return metrics;
}
static double ParseTimeValue(string timeStr)
{
if (string.IsNullOrWhiteSpace(timeStr) || timeStr == "N/A" || timeStr == "NA")
return 0;
// Remove thousands separators and parse
timeStr = timeStr.Replace(",", "").Trim();
var match = Regex.Match(timeStr, @"([\d.]+)\s*(\w+)");
if (!match.Success)
return 0;
var value = double.Parse(match.Groups[1].Value);
var unit = match.Groups[2].Value.ToLower();
// Convert to microseconds for comparison
return unit switch
{
"s" => value * 1_000_000,
"ms" => value * 1_000,
"μs" or "us" => value,
"ns" => value / 1_000,
_ => value,
};
}
static double ParseMemoryValue(string memStr)
{
if (string.IsNullOrWhiteSpace(memStr) || memStr == "N/A" || memStr == "NA")
return 0;
memStr = memStr.Replace(",", "").Trim();
var match = Regex.Match(memStr, @"([\d.]+)\s*(\w+)");
if (!match.Success)
return 0;
var value = double.Parse(match.Groups[1].Value);
var unit = match.Groups[2].Value.ToUpper();
// Convert to KB for comparison
return unit switch
{
"GB" => value * 1_024 * 1_024,
"MB" => value * 1_024,
"KB" => value,
"B" => value / 1_024,
_ => value,
};
}
static double CalculateChange(double baseline, double current)
{
if (baseline == 0)
return 0;
return ((current - baseline) / baseline) * 100;
}
record BenchmarkMetric
{
public string Method { get; init; } = "";
public string Mean { get; init; } = "";
public double MeanValue { get; init; }
public string Memory { get; init; } = "";
public double MemoryValue { get; init; }
}


@@ -25,12 +25,12 @@
},
"Microsoft.SourceLink.GitHub": {
"type": "Direct",
"requested": "[10.0.102, )",
"resolved": "10.0.102",
"contentHash": "Oxq3RCIJSdtpIU4hLqO7XaDe/Ra3HS9Wi8rJl838SAg6Zu1iQjerA0+xXWBgUFYbgknUGCLOU0T+lzMLkvY9Qg==",
"requested": "[8.0.0, )",
"resolved": "8.0.0",
"contentHash": "G5q7OqtwIyGTkeIOAc3u2ZuV/kicQaec5EaRnc0pIeSnh9LUjj+PYQrJYBURvDt7twGl2PKA7nSN0kz1Zw5bnQ==",
"dependencies": {
"Microsoft.Build.Tasks.Git": "10.0.102",
"Microsoft.SourceLink.Common": "10.0.102"
"Microsoft.Build.Tasks.Git": "8.0.0",
"Microsoft.SourceLink.Common": "8.0.0"
}
},
"Microsoft.VisualStudio.Threading.Analyzers": {
@@ -47,8 +47,8 @@
},
"Microsoft.Build.Tasks.Git": {
"type": "Transitive",
"resolved": "10.0.102",
"contentHash": "0i81LYX31U6UiXz4NOLbvc++u+/mVDmOt+PskrM/MygpDxkv9THKQyRUmavBpLK6iBV0abNWnn+CQgSRz//Pwg=="
"resolved": "8.0.0",
"contentHash": "bZKfSIKJRXLTuSzLudMFte/8CempWjVamNUR5eHJizsy+iuOuO/k2gnh7W0dHJmYY0tBf+gUErfluCv5mySAOQ=="
},
"Microsoft.NETFramework.ReferenceAssemblies.net461": {
"type": "Transitive",
@@ -57,8 +57,8 @@
},
"Microsoft.SourceLink.Common": {
"type": "Transitive",
"resolved": "10.0.102",
"contentHash": "Mk1IMb9q5tahC2NltxYXFkLBtuBvfBoCQ3pIxYQWfzbCE9o1OB9SsHe0hnNGo7lWgTA/ePbFAJLWu6nLL9K17A=="
"resolved": "8.0.0",
"contentHash": "dk9JPxTCIevS75HyEQ0E4OVAFhB2N+V9ShCXf8Q6FkUQZDkgLI12y679Nym1YqsiSysuQskT7Z+6nUf3yab6Vw=="
}
}
}


@@ -1,103 +0,0 @@
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Readers;
namespace SharpCompress.Archives;
public abstract partial class AbstractArchive<TEntry, TVolume>
where TEntry : IArchiveEntry
where TVolume : IVolume
{
#region Async Support
// Async properties
public virtual IAsyncEnumerable<TEntry> EntriesAsync => _lazyEntriesAsync;
public IAsyncEnumerable<TVolume> VolumesAsync => _lazyVolumesAsync;
protected virtual async IAsyncEnumerable<TEntry> LoadEntriesAsync(
IAsyncEnumerable<TVolume> volumes
)
{
foreach (var item in LoadEntries(await volumes.ToListAsync()))
{
yield return item;
}
}
public virtual async ValueTask DisposeAsync()
{
if (!_disposed)
{
await foreach (var v in _lazyVolumesAsync)
{
v.Dispose();
}
foreach (var v in _lazyEntriesAsync.GetLoaded().Cast<Entry>())
{
v.Close();
}
_sourceStream?.Dispose();
_disposed = true;
}
}
private async ValueTask EnsureEntriesLoadedAsync()
{
await _lazyEntriesAsync.EnsureFullyLoaded();
await _lazyVolumesAsync.EnsureFullyLoaded();
}
private async IAsyncEnumerable<IArchiveEntry> EntriesAsyncCast()
{
await foreach (var entry in EntriesAsync)
{
yield return entry;
}
}
IAsyncEnumerable<IArchiveEntry> IAsyncArchive.EntriesAsync => EntriesAsyncCast();
IAsyncEnumerable<IVolume> IAsyncArchive.VolumesAsync => VolumesAsyncCast();
private async IAsyncEnumerable<IVolume> VolumesAsyncCast()
{
await foreach (var volume in _lazyVolumesAsync)
{
yield return volume;
}
}
public async ValueTask<IAsyncReader> ExtractAllEntriesAsync()
{
if (!await IsSolidAsync() && Type != ArchiveType.SevenZip)
{
throw new SharpCompressException(
"ExtractAllEntries can only be used on solid archives or 7Zip archives (which require random access)."
);
}
await EnsureEntriesLoadedAsync();
return await CreateReaderForSolidExtractionAsync();
}
public virtual ValueTask<bool> IsSolidAsync() => new(false);
public async ValueTask<bool> IsCompleteAsync()
{
await EnsureEntriesLoadedAsync();
return await EntriesAsync.AllAsync(x => x.IsComplete);
}
public async ValueTask<long> TotalSizeAsync() =>
await EntriesAsync.AggregateAsync(0L, (total, cf) => total + cf.CompressedSize);
public async ValueTask<long> TotalUncompressedSizeAsync() =>
await EntriesAsync.AggregateAsync(0L, (total, cf) => total + cf.Size);
public ValueTask<bool> IsEncryptedAsync() => new(IsEncrypted);
#endregion
}


@@ -7,7 +7,7 @@ using SharpCompress.Readers;
namespace SharpCompress.Archives;
public abstract partial class AbstractArchive<TEntry, TVolume> : IArchive, IAsyncArchive
public abstract class AbstractArchive<TEntry, TVolume> : IArchive, IAsyncArchive
where TEntry : IArchiveEntry
where TVolume : IVolume
{
@@ -16,10 +16,6 @@ public abstract partial class AbstractArchive<TEntry, TVolume> : IArchive, IAsyn
private bool _disposed;
private readonly SourceStream? _sourceStream;
// Async fields - kept in original file per refactoring rules
private readonly LazyAsyncReadOnlyCollection<TVolume> _lazyVolumesAsync;
private readonly LazyAsyncReadOnlyCollection<TEntry> _lazyEntriesAsync;
protected ReaderOptions ReaderOptions { get; }
internal AbstractArchive(ArchiveType type, SourceStream sourceStream)
@@ -81,6 +77,16 @@ public abstract partial class AbstractArchive<TEntry, TVolume> : IArchive, IAsyn
protected virtual IAsyncEnumerable<TVolume> LoadVolumesAsync(SourceStream sourceStream) =>
LoadVolumes(sourceStream).ToAsyncEnumerable();
protected virtual async IAsyncEnumerable<TEntry> LoadEntriesAsync(
IAsyncEnumerable<TVolume> volumes
)
{
foreach (var item in LoadEntries(await volumes.ToListAsync()))
{
yield return item;
}
}
IEnumerable<IArchiveEntry> IArchive.Entries => Entries.Cast<IArchiveEntry>();
IEnumerable<IVolume> IArchive.Volumes => _lazyVolumes.Cast<IVolume>();
@@ -139,6 +145,19 @@ public abstract partial class AbstractArchive<TEntry, TVolume> : IArchive, IAsyn
/// </summary>
public virtual bool IsEncrypted => false;
/// <summary>
/// Returns whether multi-threaded extraction is supported for this archive.
/// Multi-threading is supported when:
/// 1. The archive is opened from a FileInfo or file path (not a stream)
/// 2. Multi-threading is explicitly enabled in ReaderOptions
/// 3. The archive is not SOLID (SOLID archives should use sequential extraction)
/// </summary>
public virtual bool SupportsMultiThreadedExtraction =>
_sourceStream is not null
&& _sourceStream.IsFileMode
&& ReaderOptions.EnableMultiThreadedExtraction
&& !IsSolid;
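
A sketch of how a caller would opt in to the new flag. `Parallel.ForEach`, the output directory, and the flattened file names are illustrative, not part of this change; multi-threading only engages when the archive was opened from a file path or FileInfo and `EnableMultiThreadedExtraction` is set.

```csharp
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using SharpCompress.Archives;
using SharpCompress.Readers;

var options = new ReaderOptions { EnableMultiThreadedExtraction = true };
using var archive = ArchiveFactory.OpenArchive(new FileInfo("data.zip"), options);
Directory.CreateDirectory("out");

if (archive.SupportsMultiThreadedExtraction)
{
    // File mode: each entry can open its own stream over the underlying file.
    Parallel.ForEach(
        archive.Entries.Where(e => !e.IsDirectory),
        entry =>
        {
            using var input = entry.OpenEntryStream();
            using var output = File.Create(Path.Combine("out", Path.GetFileName(entry.Key!)));
            input.CopyTo(output);
        });
}
else
{
    // Streams, SOLID archives, and the default (opt-out) path stay sequential.
    foreach (var entry in archive.Entries.Where(e => !e.IsDirectory))
    {
        using var input = entry.OpenEntryStream();
        using var output = File.Create(Path.Combine("out", Path.GetFileName(entry.Key!)));
        input.CopyTo(output);
    }
}
```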
/// <summary>
/// The archive can find all the parts of the archive needed to fully extract the archive. This forces the parsing of the entire archive.
/// </summary>
@@ -150,4 +169,85 @@ public abstract partial class AbstractArchive<TEntry, TVolume> : IArchive, IAsyn
return Entries.All(x => x.IsComplete);
}
}
#region Async Support
private readonly LazyAsyncReadOnlyCollection<TVolume> _lazyVolumesAsync;
private readonly LazyAsyncReadOnlyCollection<TEntry> _lazyEntriesAsync;
public virtual async ValueTask DisposeAsync()
{
if (!_disposed)
{
await foreach (var v in _lazyVolumesAsync)
{
v.Dispose();
}
foreach (var v in _lazyEntriesAsync.GetLoaded().Cast<Entry>())
{
v.Close();
}
_sourceStream?.Dispose();
_disposed = true;
}
}
private async ValueTask EnsureEntriesLoadedAsync()
{
await _lazyEntriesAsync.EnsureFullyLoaded();
await _lazyVolumesAsync.EnsureFullyLoaded();
}
public virtual IAsyncEnumerable<TEntry> EntriesAsync => _lazyEntriesAsync;
private async IAsyncEnumerable<IArchiveEntry> EntriesAsyncCast()
{
await foreach (var entry in EntriesAsync)
{
yield return entry;
}
}
IAsyncEnumerable<IArchiveEntry> IAsyncArchive.EntriesAsync => EntriesAsyncCast();
private async IAsyncEnumerable<IVolume> VolumesAsyncCast()
{
await foreach (var volume in VolumesAsync)
{
yield return volume;
}
}
public IAsyncEnumerable<IVolume> VolumesAsync => VolumesAsyncCast();
public async ValueTask<IAsyncReader> ExtractAllEntriesAsync()
{
if (!IsSolid && Type != ArchiveType.SevenZip)
{
throw new SharpCompressException(
"ExtractAllEntries can only be used on solid archives or 7Zip archives (which require random access)."
);
}
await EnsureEntriesLoadedAsync();
return await CreateReaderForSolidExtractionAsync();
}
public virtual ValueTask<bool> IsSolidAsync() => new(false);
public async ValueTask<bool> IsCompleteAsync()
{
await EnsureEntriesLoadedAsync();
return await EntriesAsync.AllAsync(x => x.IsComplete);
}
public async ValueTask<long> TotalSizeAsync() =>
await EntriesAsync.AggregateAsync(0L, (total, cf) => total + cf.CompressedSize);
public async ValueTask<long> TotalUncompressedSizeAsync() =>
await EntriesAsync.AggregateAsync(0L, (total, cf) => total + cf.Size);
public ValueTask<bool> IsEncryptedAsync() => new(IsEncrypted);
#endregion
}

View File

@@ -1,123 +0,0 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Writers;
namespace SharpCompress.Archives;
public abstract partial class AbstractWritableArchive<TEntry, TVolume>
where TEntry : IArchiveEntry
where TVolume : IVolume
{
// Async property moved from main file
private IAsyncEnumerable<TEntry> OldEntriesAsync =>
base.EntriesAsync.Where(x => !removedEntries.Contains(x));
private async ValueTask RebuildModifiedCollectionAsync()
{
if (pauseRebuilding)
{
return;
}
hasModifications = true;
newEntries.RemoveAll(v => removedEntries.Contains(v));
modifiedEntries.Clear();
await foreach (var entry in OldEntriesAsync)
{
modifiedEntries.Add(entry);
}
modifiedEntries.AddRange(newEntries);
}
public async ValueTask RemoveEntryAsync(TEntry entry)
{
if (!removedEntries.Contains(entry))
{
removedEntries.Add(entry);
await RebuildModifiedCollectionAsync();
}
}
private async ValueTask<bool> DoesKeyMatchExistingAsync(
string key,
CancellationToken cancellationToken
)
{
await foreach (
var entry in EntriesAsync.WithCancellation(cancellationToken).ConfigureAwait(false)
)
{
var path = entry.Key;
if (path is null)
{
continue;
}
var p = path.Replace('/', '\\');
if (p.Length > 0 && p[0] == '\\')
{
p = p.Substring(1);
}
if (string.Equals(p, key, StringComparison.OrdinalIgnoreCase))
{
return true;
}
}
return false;
}
public async ValueTask<TEntry> AddEntryAsync(
string key,
Stream source,
bool closeStream,
long size = 0,
DateTime? modified = null,
CancellationToken cancellationToken = default
)
{
if (key.Length > 0 && key[0] is '/' or '\\')
{
key = key.Substring(1);
}
if (await DoesKeyMatchExistingAsync(key, cancellationToken).ConfigureAwait(false))
{
throw new ArchiveException("Cannot add entry with duplicate key: " + key);
}
var entry = CreateEntry(key, source, size, modified, closeStream);
newEntries.Add(entry);
await RebuildModifiedCollectionAsync();
return entry;
}
public async ValueTask<TEntry> AddDirectoryEntryAsync(
string key,
DateTime? modified = null,
CancellationToken cancellationToken = default
)
{
if (key.Length > 0 && key[0] is '/' or '\\')
{
key = key.Substring(1);
}
if (await DoesKeyMatchExistingAsync(key, cancellationToken).ConfigureAwait(false))
{
throw new ArchiveException("Cannot add entry with duplicate key: " + key);
}
var entry = CreateDirectoryEntry(key, modified);
newEntries.Add(entry);
await RebuildModifiedCollectionAsync();
return entry;
}
public async ValueTask SaveToAsync(
Stream stream,
WriterOptions options,
CancellationToken cancellationToken = default
)
{
//reset streams of new entries
newEntries.Cast<IWritableArchiveEntry>().ForEach(x => x.Stream.Seek(0, SeekOrigin.Begin));
await SaveToAsync(stream, options, OldEntriesAsync, newEntries, cancellationToken)
.ConfigureAwait(false);
}
}

View File

@@ -10,7 +10,7 @@ using SharpCompress.Writers;
namespace SharpCompress.Archives;
public abstract partial class AbstractWritableArchive<TEntry, TVolume>
public abstract class AbstractWritableArchive<TEntry, TVolume>
: AbstractArchive<TEntry, TVolume>,
IWritableArchive,
IWritableAsyncArchive
@@ -84,12 +84,12 @@ public abstract partial class AbstractWritableArchive<TEntry, TVolume>
}
}
void IWritableArchive.RemoveEntry(IArchiveEntry entry) => RemoveEntry((TEntry)entry);
void IWritableArchiveCommon.RemoveEntry(IArchiveEntry entry) => RemoveEntry((TEntry)entry);
public TEntry AddEntry(string key, Stream source, long size = 0, DateTime? modified = null) =>
AddEntry(key, source, false, size, modified);
IArchiveEntry IWritableArchive.AddEntry(
IArchiveEntry IWritableArchiveCommon.AddEntry(
string key,
Stream source,
bool closeStream,
@@ -97,7 +97,7 @@ public abstract partial class AbstractWritableArchive<TEntry, TVolume>
DateTime? modified
) => AddEntry(key, source, closeStream, size, modified);
IArchiveEntry IWritableArchive.AddDirectoryEntry(string key, DateTime? modified) =>
IArchiveEntry IWritableArchiveCommon.AddDirectoryEntry(string key, DateTime? modified) =>
AddDirectoryEntry(key, modified);
public TEntry AddEntry(
@@ -140,24 +140,6 @@ public abstract partial class AbstractWritableArchive<TEntry, TVolume>
return false;
}
ValueTask IWritableAsyncArchive.RemoveEntryAsync(IArchiveEntry entry) =>
RemoveEntryAsync((TEntry)entry);
async ValueTask<IArchiveEntry> IWritableAsyncArchive.AddEntryAsync(
string key,
Stream source,
bool closeStream,
long size,
DateTime? modified,
CancellationToken cancellationToken
) => await AddEntryAsync(key, source, closeStream, size, modified, cancellationToken);
async ValueTask<IArchiveEntry> IWritableAsyncArchive.AddDirectoryEntryAsync(
string key,
DateTime? modified,
CancellationToken cancellationToken
) => await AddDirectoryEntryAsync(key, modified, cancellationToken);
public TEntry AddDirectoryEntry(string key, DateTime? modified = null)
{
if (key.Length > 0 && key[0] is '/' or '\\')
@@ -181,6 +163,18 @@ public abstract partial class AbstractWritableArchive<TEntry, TVolume>
SaveTo(stream, options, OldEntries, newEntries);
}
public async ValueTask SaveToAsync(
Stream stream,
WriterOptions options,
CancellationToken cancellationToken = default
)
{
//reset streams of new entries
newEntries.Cast<IWritableArchiveEntry>().ForEach(x => x.Stream.Seek(0, SeekOrigin.Begin));
await SaveToAsync(stream, options, OldEntries, newEntries, cancellationToken)
.ConfigureAwait(false);
}
protected TEntry CreateEntry(
string key,
Stream source,
@@ -218,7 +212,7 @@ public abstract partial class AbstractWritableArchive<TEntry, TVolume>
protected abstract ValueTask SaveToAsync(
Stream stream,
WriterOptions options,
IAsyncEnumerable<TEntry> oldEntries,
IEnumerable<TEntry> oldEntries,
IEnumerable<TEntry> newEntries,
CancellationToken cancellationToken = default
);
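
For orientation, the consolidated writable flow now looks roughly like this. Sketch only: the cast assumes the concrete archive returned by CreateArchive implements IWritableAsyncArchive, and AddEntry is reached through the shared members moved to IWritableArchiveCommon in this diff.

```csharp
using System.IO;
using System.Text;
using System.Threading.Tasks;
using SharpCompress.Archives;
using SharpCompress.Common;
using SharpCompress.Writers;

public static class WritableArchiveExample
{
    public static async Task RunAsync()
    {
        var archive = (IWritableAsyncArchive)ArchiveFactory.CreateArchive(ArchiveType.Zip);
        using var payload = new MemoryStream(Encoding.UTF8.GetBytes("hello"));

        // AddEntry/AddDirectoryEntry/RemoveEntry are now shared between the
        // sync and async writable interfaces; only SaveToAsync is async-specific.
        archive.AddEntry("docs/hello.txt", payload, closeStream: false, size: payload.Length);

        using var destination = File.Create("out.zip");
        await archive.SaveToAsync(destination, new WriterOptions(CompressionType.Deflate));
    }
}
```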

View File

@@ -1,157 +0,0 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Factories;
using SharpCompress.IO;
using SharpCompress.Readers;
namespace SharpCompress.Archives;
public static partial class ArchiveFactory
{
public static async ValueTask<IAsyncArchive> OpenAsyncArchive(
Stream stream,
ReaderOptions? readerOptions = null,
CancellationToken cancellationToken = default
)
{
readerOptions ??= new ReaderOptions();
var factory = await FindFactoryAsync<IArchiveFactory>(stream, cancellationToken);
return factory.OpenAsyncArchive(stream, readerOptions);
}
public static ValueTask<IAsyncArchive> OpenAsyncArchive(
string filePath,
ReaderOptions? options = null,
CancellationToken cancellationToken = default
)
{
filePath.NotNullOrEmpty(nameof(filePath));
return OpenAsyncArchive(new FileInfo(filePath), options, cancellationToken);
}
public static async ValueTask<IAsyncArchive> OpenAsyncArchive(
FileInfo fileInfo,
ReaderOptions? options = null,
CancellationToken cancellationToken = default
)
{
options ??= new ReaderOptions { LeaveStreamOpen = false };
var factory = await FindFactoryAsync<IArchiveFactory>(fileInfo, cancellationToken);
return factory.OpenAsyncArchive(fileInfo, options);
}
public static async ValueTask<IAsyncArchive> OpenAsyncArchive(
IEnumerable<FileInfo> fileInfos,
ReaderOptions? options = null,
CancellationToken cancellationToken = default
)
{
fileInfos.NotNull(nameof(fileInfos));
var filesArray = fileInfos.ToArray();
if (filesArray.Length == 0)
{
throw new InvalidOperationException("No files to open");
}
var fileInfo = filesArray[0];
if (filesArray.Length == 1)
{
return await OpenAsyncArchive(fileInfo, options, cancellationToken);
}
fileInfo.NotNull(nameof(fileInfo));
options ??= new ReaderOptions { LeaveStreamOpen = false };
var factory = await FindFactoryAsync<IMultiArchiveFactory>(fileInfo, cancellationToken);
return factory.OpenAsyncArchive(filesArray, options, cancellationToken);
}
public static async ValueTask<IAsyncArchive> OpenAsyncArchive(
IEnumerable<Stream> streams,
ReaderOptions? options = null,
CancellationToken cancellationToken = default
)
{
cancellationToken.ThrowIfCancellationRequested();
streams.NotNull(nameof(streams));
var streamsArray = streams.ToArray();
if (streamsArray.Length == 0)
{
throw new InvalidOperationException("No streams");
}
var firstStream = streamsArray[0];
if (streamsArray.Length == 1)
{
return await OpenAsyncArchive(firstStream, options, cancellationToken);
}
firstStream.NotNull(nameof(firstStream));
options ??= new ReaderOptions();
var factory = await FindFactoryAsync<IMultiArchiveFactory>(firstStream, cancellationToken);
return factory.OpenAsyncArchive(streamsArray, options);
}
public static ValueTask<T> FindFactoryAsync<T>(
string path,
CancellationToken cancellationToken = default
)
where T : IFactory
{
path.NotNullOrEmpty(nameof(path));
return FindFactoryAsync<T>(new FileInfo(path), cancellationToken);
}
private static async ValueTask<T> FindFactoryAsync<T>(
FileInfo finfo,
CancellationToken cancellationToken
)
where T : IFactory
{
finfo.NotNull(nameof(finfo));
using Stream stream = finfo.OpenRead();
return await FindFactoryAsync<T>(stream, cancellationToken);
}
private static async ValueTask<T> FindFactoryAsync<T>(
Stream stream,
CancellationToken cancellationToken
)
where T : IFactory
{
stream.NotNull(nameof(stream));
if (!stream.CanRead || !stream.CanSeek)
{
throw new ArgumentException("Stream should be readable and seekable");
}
var factories = Factory.Factories.OfType<T>();
var startPosition = stream.Position;
foreach (var factory in factories)
{
stream.Seek(startPosition, SeekOrigin.Begin);
if (await factory.IsArchiveAsync(stream, cancellationToken: cancellationToken))
{
stream.Seek(startPosition, SeekOrigin.Begin);
return factory;
}
}
var extensions = string.Join(", ", factories.Select(item => item.Name));
throw new InvalidOperationException(
$"Cannot determine compressed stream type. Supported Archive Formats: {extensions}"
);
}
}

View File

@@ -11,14 +11,27 @@ using SharpCompress.Readers;
namespace SharpCompress.Archives;
public static partial class ArchiveFactory
public static class ArchiveFactory
{
public static IArchive OpenArchive(Stream stream, ReaderOptions? readerOptions = null)
{
readerOptions ??= new ReaderOptions();
stream = SharpCompressStream.Create(stream, bufferSize: readerOptions.BufferSize);
return FindFactory<IArchiveFactory>(stream).OpenArchive(stream, readerOptions);
}
public static async ValueTask<IAsyncArchive> OpenAsyncArchive(
Stream stream,
ReaderOptions? readerOptions = null,
CancellationToken cancellationToken = default
)
{
readerOptions ??= new ReaderOptions();
stream = SharpCompressStream.Create(stream, bufferSize: readerOptions.BufferSize);
var factory = await FindFactoryAsync<IArchiveFactory>(stream, cancellationToken);
return factory.OpenAsyncArchive(stream, readerOptions);
}
public static IWritableArchive CreateArchive(ArchiveType type)
{
var factory = Factory
@@ -39,6 +52,16 @@ public static partial class ArchiveFactory
return OpenArchive(new FileInfo(filePath), options);
}
public static ValueTask<IAsyncArchive> OpenAsyncArchive(
string filePath,
ReaderOptions? options = null,
CancellationToken cancellationToken = default
)
{
filePath.NotNullOrEmpty(nameof(filePath));
return OpenAsyncArchive(new FileInfo(filePath), options, cancellationToken);
}
public static IArchive OpenArchive(FileInfo fileInfo, ReaderOptions? options = null)
{
options ??= new ReaderOptions { LeaveStreamOpen = false };
@@ -46,6 +69,18 @@ public static partial class ArchiveFactory
return FindFactory<IArchiveFactory>(fileInfo).OpenArchive(fileInfo, options);
}
public static async ValueTask<IAsyncArchive> OpenAsyncArchive(
FileInfo fileInfo,
ReaderOptions? options = null,
CancellationToken cancellationToken = default
)
{
options ??= new ReaderOptions { LeaveStreamOpen = false };
var factory = await FindFactoryAsync<IArchiveFactory>(fileInfo, cancellationToken);
return factory.OpenAsyncArchive(fileInfo, options, cancellationToken);
}
public static IArchive OpenArchive(
IEnumerable<FileInfo> fileInfos,
ReaderOptions? options = null
@@ -70,6 +105,32 @@ public static partial class ArchiveFactory
return FindFactory<IMultiArchiveFactory>(fileInfo).OpenArchive(filesArray, options);
}
public static async ValueTask<IAsyncArchive> OpenAsyncArchive(
IEnumerable<FileInfo> fileInfos,
ReaderOptions? options = null,
CancellationToken cancellationToken = default
)
{
fileInfos.NotNull(nameof(fileInfos));
var filesArray = fileInfos.ToArray();
if (filesArray.Length == 0)
{
throw new InvalidOperationException("No files to open");
}
var fileInfo = filesArray[0];
if (filesArray.Length == 1)
{
return await OpenAsyncArchive(fileInfo, options, cancellationToken);
}
fileInfo.NotNull(nameof(fileInfo));
options ??= new ReaderOptions { LeaveStreamOpen = false };
var factory = await FindFactoryAsync<IMultiArchiveFactory>(fileInfo, cancellationToken);
return factory.OpenAsyncArchive(filesArray, options, cancellationToken);
}
public static IArchive OpenArchive(IEnumerable<Stream> streams, ReaderOptions? options = null)
{
streams.NotNull(nameof(streams));
@@ -91,6 +152,33 @@ public static partial class ArchiveFactory
return FindFactory<IMultiArchiveFactory>(firstStream).OpenArchive(streamsArray, options);
}
public static async ValueTask<IAsyncArchive> OpenAsyncArchive(
IEnumerable<Stream> streams,
ReaderOptions? options = null,
CancellationToken cancellationToken = default
)
{
cancellationToken.ThrowIfCancellationRequested();
streams.NotNull(nameof(streams));
var streamsArray = streams.ToArray();
if (streamsArray.Length == 0)
{
throw new InvalidOperationException("No streams");
}
var firstStream = streamsArray[0];
if (streamsArray.Length == 1)
{
return await OpenAsyncArchive(firstStream, options, cancellationToken);
}
firstStream.NotNull(nameof(firstStream));
options ??= new ReaderOptions();
var factory = FindFactory<IMultiArchiveFactory>(firstStream);
return factory.OpenAsyncArchive(streamsArray, options);
}
public static void WriteToDirectory(
string sourceArchive,
string destinationDirectory,
@@ -101,15 +189,7 @@ public static partial class ArchiveFactory
archive.WriteToDirectory(destinationDirectory, options);
}
public static T FindFactory<T>(string path)
where T : IFactory
{
path.NotNullOrEmpty(nameof(path));
using Stream stream = File.OpenRead(path);
return FindFactory<T>(stream);
}
public static T FindFactory<T>(FileInfo finfo)
private static T FindFactory<T>(FileInfo finfo)
where T : IFactory
{
finfo.NotNull(nameof(finfo));
@@ -117,7 +197,7 @@ public static partial class ArchiveFactory
return FindFactory<T>(stream);
}
public static T FindFactory<T>(Stream stream)
private static T FindFactory<T>(Stream stream)
where T : IFactory
{
stream.NotNull(nameof(stream));
@@ -149,14 +229,68 @@ public static partial class ArchiveFactory
);
}
public static bool IsArchive(string filePath, out ArchiveType? type)
private static async ValueTask<T> FindFactoryAsync<T>(
FileInfo finfo,
CancellationToken cancellationToken
)
where T : IFactory
{
finfo.NotNull(nameof(finfo));
using Stream stream = finfo.OpenRead();
return await FindFactoryAsync<T>(stream, cancellationToken);
}
private static async ValueTask<T> FindFactoryAsync<T>(
Stream stream,
CancellationToken cancellationToken
)
where T : IFactory
{
stream.NotNull(nameof(stream));
if (!stream.CanRead || !stream.CanSeek)
{
throw new ArgumentException("Stream should be readable and seekable");
}
var factories = Factory.Factories.OfType<T>();
var startPosition = stream.Position;
foreach (var factory in factories)
{
stream.Seek(startPosition, SeekOrigin.Begin);
if (await factory.IsArchiveAsync(stream, cancellationToken: cancellationToken))
{
stream.Seek(startPosition, SeekOrigin.Begin);
return factory;
}
}
var extensions = string.Join(", ", factories.Select(item => item.Name));
throw new InvalidOperationException(
$"Cannot determine compressed stream type. Supported Archive Formats: {extensions}"
);
}
public static bool IsArchive(
string filePath,
out ArchiveType? type,
int bufferSize = ReaderOptions.DefaultBufferSize
)
{
filePath.NotNullOrEmpty(nameof(filePath));
using Stream s = File.OpenRead(filePath);
return IsArchive(s, out type);
return IsArchive(s, out type, bufferSize);
}
public static bool IsArchive(Stream stream, out ArchiveType? type)
public static bool IsArchive(
Stream stream,
out ArchiveType? type,
int bufferSize = ReaderOptions.DefaultBufferSize
)
{
type = null;
stream.NotNull(nameof(stream));
@@ -211,4 +345,6 @@ public static partial class ArchiveFactory
}
}
}
public static IArchiveFactory AutoFactory { get; } = new AutoArchiveFactory();
}
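
Detection and opening compose as below; a sketch using the IsArchive and OpenAsyncArchive overloads defined in this file, and assuming IsCompleteAsync is exposed on IAsyncArchive as in the abstract base above.

```csharp
using System;
using System.Threading.Tasks;
using SharpCompress.Archives;
using SharpCompress.Common;

public static class DetectionExample
{
    public static async Task RunAsync(string path)
    {
        // IsArchive probes each registered factory's signature check.
        if (!ArchiveFactory.IsArchive(path, out ArchiveType? type))
        {
            Console.WriteLine("Not a recognized archive.");
            return;
        }

        Console.WriteLine($"Detected format: {type}");
        var archive = await ArchiveFactory.OpenAsyncArchive(path);
        await using var _ = archive;
        Console.WriteLine($"Complete: {await archive.IsCompleteAsync()}");
    }
}
```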

View File

@@ -13,7 +13,6 @@ internal abstract class ArchiveVolumeFactory
//split 001, 002 ...
var m = Regex.Match(part1.Name, @"^(.*\.)([0-9]+)$", RegexOptions.IgnoreCase);
if (m.Success)
{
item = new FileInfo(
Path.Combine(
part1.DirectoryName!,
@@ -23,13 +22,9 @@ internal abstract class ArchiveVolumeFactory
)
)
);
}
if (item != null && item.Exists)
{
return item;
}
return null;
}
}

View File

@@ -0,0 +1,52 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Readers;
namespace SharpCompress.Archives;
internal class AutoArchiveFactory : IArchiveFactory
{
public string Name => nameof(AutoArchiveFactory);
public ArchiveType? KnownArchiveType => null;
public IEnumerable<string> GetSupportedExtensions() => throw new NotSupportedException();
public bool IsArchive(
Stream stream,
string? password = null,
int bufferSize = ReaderOptions.DefaultBufferSize
) => throw new NotSupportedException();
public ValueTask<bool> IsArchiveAsync(
Stream stream,
string? password = null,
int bufferSize = ReaderOptions.DefaultBufferSize,
CancellationToken cancellationToken = default
) => throw new NotSupportedException();
public FileInfo? GetFilePart(int index, FileInfo part1) => throw new NotSupportedException();
public IArchive OpenArchive(Stream stream, ReaderOptions? readerOptions = null) =>
ArchiveFactory.OpenArchive(stream, readerOptions);
public IAsyncArchive OpenAsyncArchive(Stream stream, ReaderOptions? readerOptions = null) =>
(IAsyncArchive)OpenArchive(stream, readerOptions);
public IArchive OpenArchive(FileInfo fileInfo, ReaderOptions? readerOptions = null) =>
ArchiveFactory.OpenArchive(fileInfo, readerOptions);
public IAsyncArchive OpenAsyncArchive(
FileInfo fileInfo,
ReaderOptions? readerOptions = null,
CancellationToken cancellationToken = default
)
{
cancellationToken.ThrowIfCancellationRequested();
return (IAsyncArchive)OpenArchive(fileInfo, readerOptions);
}
}

View File

@@ -1,86 +0,0 @@
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Common.GZip;
using SharpCompress.IO;
using SharpCompress.Readers;
using SharpCompress.Readers.GZip;
using SharpCompress.Writers;
using SharpCompress.Writers.GZip;
namespace SharpCompress.Archives.GZip;
public partial class GZipArchive
{
public ValueTask SaveToAsync(string filePath, CancellationToken cancellationToken = default) =>
SaveToAsync(new FileInfo(filePath), cancellationToken);
public async ValueTask SaveToAsync(
FileInfo fileInfo,
CancellationToken cancellationToken = default
)
{
using var stream = fileInfo.Open(FileMode.Create, FileAccess.Write);
await SaveToAsync(stream, new WriterOptions(CompressionType.GZip), cancellationToken)
.ConfigureAwait(false);
}
protected override async ValueTask SaveToAsync(
Stream stream,
WriterOptions options,
IAsyncEnumerable<GZipArchiveEntry> oldEntries,
IEnumerable<GZipArchiveEntry> newEntries,
CancellationToken cancellationToken = default
)
{
if (Entries.Count > 1)
{
throw new InvalidFormatException("Only one entry is allowed in a GZip Archive");
}
using var writer = new GZipWriter(stream, new GZipWriterOptions(options));
await foreach (
var entry in oldEntries.WithCancellation(cancellationToken).ConfigureAwait(false)
)
{
if (!entry.IsDirectory)
{
using var entryStream = entry.OpenEntryStream();
await writer
.WriteAsync(
entry.Key.NotNull("Entry Key is null"),
entryStream,
cancellationToken
)
.ConfigureAwait(false);
}
}
foreach (var entry in newEntries.Where(x => !x.IsDirectory))
{
using var entryStream = entry.OpenEntryStream();
await writer
.WriteAsync(entry.Key.NotNull("Entry Key is null"), entryStream, cancellationToken)
.ConfigureAwait(false);
}
}
protected override ValueTask<IAsyncReader> CreateReaderForSolidExtractionAsync()
{
var stream = Volumes.Single().Stream;
stream.Position = 0;
return new((IAsyncReader)GZipReader.OpenReader(stream));
}
protected override async IAsyncEnumerable<GZipArchiveEntry> LoadEntriesAsync(
IAsyncEnumerable<GZipVolume> volumes
)
{
var stream = (await volumes.SingleAsync()).Stream;
yield return new GZipArchiveEntry(
this,
await GZipFilePart.CreateAsync(stream, ReaderOptions.ArchiveEncoding)
);
}
}

View File

@@ -1,5 +1,4 @@
using System;
using System.Buffers;
using System.Collections.Generic;
using System.IO;
using System.Linq;
@@ -181,21 +180,18 @@ public partial class GZipArchive
CancellationToken cancellationToken = default
)
{
var header = ArrayPool<byte>.Shared.Rent(10);
try
{
await stream.ReadFullyAsync(header, 0, 10, cancellationToken).ConfigureAwait(false);
if (header[0] != 0x1F || header[1] != 0x8B || header[2] != 8)
{
return false;
}
return true;
}
finally
{
ArrayPool<byte>.Shared.Return(header);
}
byte[] header = new byte[10];
if (!await stream.ReadFullyAsync(header, cancellationToken).ConfigureAwait(false))
{
return false;
}
if (header[0] != 0x1F || header[1] != 0x8B || header[2] != 8)
{
return false;
}
return true;
}
}
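
The probe above reduces to a three-byte signature test: `0x1F 0x8B` (the gzip magic) followed by `8` (the deflate compression method, per RFC 1952). As a standalone helper with a hypothetical name:

```csharp
using System;

static class GZipSignature
{
    // RFC 1952 member header: ID1 = 0x1F, ID2 = 0x8B, CM = 8 (deflate).
    public static bool LooksLikeGZip(ReadOnlySpan<byte> header) =>
        header.Length >= 3 && header[0] == 0x1F && header[1] == 0x8B && header[2] == 8;
}
```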

View File

@@ -36,6 +36,19 @@ public partial class GZipArchive : AbstractWritableArchive<GZipArchiveEntry, GZi
SaveTo(stream, new WriterOptions(CompressionType.GZip));
}
public ValueTask SaveToAsync(string filePath, CancellationToken cancellationToken = default) =>
SaveToAsync(new FileInfo(filePath), cancellationToken);
public async ValueTask SaveToAsync(
FileInfo fileInfo,
CancellationToken cancellationToken = default
)
{
using var stream = fileInfo.Open(FileMode.Create, FileAccess.Write);
await SaveToAsync(stream, new WriterOptions(CompressionType.GZip), cancellationToken)
.ConfigureAwait(false);
}
protected override GZipArchiveEntry CreateEntryInternal(
string filePath,
Stream source,
@@ -79,6 +92,28 @@ public partial class GZipArchive : AbstractWritableArchive<GZipArchiveEntry, GZi
}
}
protected override async ValueTask SaveToAsync(
Stream stream,
WriterOptions options,
IEnumerable<GZipArchiveEntry> oldEntries,
IEnumerable<GZipArchiveEntry> newEntries,
CancellationToken cancellationToken = default
)
{
if (Entries.Count > 1)
{
throw new InvalidFormatException("Only one entry is allowed in a GZip Archive");
}
using var writer = new GZipWriter(stream, new GZipWriterOptions(options));
foreach (var entry in oldEntries.Concat(newEntries).Where(x => !x.IsDirectory))
{
using var entryStream = entry.OpenEntryStream();
await writer
.WriteAsync(entry.Key.NotNull("Entry Key is null"), entryStream, cancellationToken)
.ConfigureAwait(false);
}
}
protected override IEnumerable<GZipArchiveEntry> LoadEntries(IEnumerable<GZipVolume> volumes)
{
var stream = volumes.Single().Stream;
@@ -88,10 +123,28 @@ public partial class GZipArchive : AbstractWritableArchive<GZipArchiveEntry, GZi
);
}
protected override async IAsyncEnumerable<GZipArchiveEntry> LoadEntriesAsync(
IAsyncEnumerable<GZipVolume> volumes
)
{
var stream = (await volumes.SingleAsync()).Stream;
yield return new GZipArchiveEntry(
this,
await GZipFilePart.CreateAsync(stream, ReaderOptions.ArchiveEncoding)
);
}
protected override IReader CreateReaderForSolidExtraction()
{
var stream = Volumes.Single().Stream;
stream.Position = 0;
return GZipReader.OpenReader(stream);
}
protected override ValueTask<IAsyncReader> CreateReaderForSolidExtractionAsync()
{
var stream = Volumes.Single().Stream;
stream.Position = 0;
return new((IAsyncReader)GZipReader.OpenReader(stream));
}
}

View File

@@ -58,7 +58,7 @@ internal sealed class GZipWritableArchiveEntry : GZipArchiveEntry, IWritableArch
{
//ensure new stream is at the start, this could be reset
stream.Seek(0, SeekOrigin.Begin);
return SharpCompressStream.CreateNonDisposing(stream);
return SharpCompressStream.Create(stream, leaveOpen: true);
}
internal override void Close()

View File

@@ -44,4 +44,12 @@ public interface IArchive : IDisposable
/// Returns whether the archive is encrypted.
/// </summary>
bool IsEncrypted { get; }
/// <summary>
/// Returns whether multi-threaded extraction is supported for this archive.
/// Multi-threading is supported when the archive is opened from a FileInfo or file path
/// (not a stream) and the format supports random access (e.g., Zip, Tar, Rar).
/// SOLID archives (some Rar, all 7Zip) should use sequential extraction for best performance.
/// </summary>
bool SupportsMultiThreadedExtraction { get; }
}

View File

@@ -9,6 +9,8 @@ namespace SharpCompress.Archives;
public static class IArchiveEntryExtensions
{
private const int BufferSize = 81920;
/// <param name="archiveEntry">The archive entry to extract.</param>
extension(IArchiveEntry archiveEntry)
{
@@ -26,7 +28,7 @@ public static class IArchiveEntryExtensions
using var entryStream = archiveEntry.OpenEntryStream();
var sourceStream = WrapWithProgress(entryStream, archiveEntry, progress);
sourceStream.CopyTo(streamToWriteTo, Constants.BufferSize);
sourceStream.CopyTo(streamToWriteTo, BufferSize);
}
/// <summary>
@@ -46,16 +48,10 @@ public static class IArchiveEntryExtensions
throw new ExtractionException("Entry is a file directory and cannot be extracted.");
}
#if LEGACY_DOTNET
using var entryStream = await archiveEntry.OpenEntryStreamAsync(cancellationToken);
#else
await using var entryStream = await archiveEntry.OpenEntryStreamAsync(
cancellationToken
);
#endif
var sourceStream = WrapWithProgress(entryStream, archiveEntry, progress);
await sourceStream
.CopyToAsync(streamToWriteTo, Constants.BufferSize, cancellationToken)
.CopyToAsync(streamToWriteTo, BufferSize, cancellationToken)
.ConfigureAwait(false);
}
}

View File

@@ -47,5 +47,9 @@ public interface IArchiveFactory : IFactory
/// <param name="fileInfo">the file to open.</param>
/// <param name="readerOptions">reading options.</param>
/// <param name="cancellationToken">Cancellation token.</param>
IAsyncArchive OpenAsyncArchive(FileInfo fileInfo, ReaderOptions? readerOptions = null);
IAsyncArchive OpenAsyncArchive(
FileInfo fileInfo,
ReaderOptions? readerOptions = null,
CancellationToken cancellationToken = default
);
}

View File

@@ -20,7 +20,7 @@ public static class IAsyncArchiveExtensions
/// <param name="options">Extraction options.</param>
/// <param name="progress">Optional progress reporter for tracking extraction progress.</param>
/// <param name="cancellationToken">Optional cancellation token.</param>
public async ValueTask WriteToDirectoryAsync(
public async Task WriteToDirectoryAsync(
string destinationDirectory,
ExtractionOptions? options = null,
IProgress<ProgressReport>? progress = null,
@@ -47,7 +47,7 @@ public static class IAsyncArchiveExtensions
}
}
private async ValueTask WriteToDirectoryAsyncInternal(
private async Task WriteToDirectoryAsyncInternal(
string destinationDirectory,
ExtractionOptions? options,
IProgress<ProgressReport>? progress,

View File

@@ -13,10 +13,12 @@ public interface IWritableArchiveCommon
/// </summary>
/// <returns>An IDisposable that resumes entry rebuilding when disposed</returns>
IDisposable PauseEntryRebuilding();
}
public interface IWritableArchive : IArchive, IWritableArchiveCommon
{
/// <summary>
/// Removes the specified entry from the archive.
/// </summary>
void RemoveEntry(IArchiveEntry entry);
IArchiveEntry AddEntry(
string key,
Stream source,
@@ -26,16 +28,14 @@ public interface IWritableArchive : IArchive, IWritableArchiveCommon
);
IArchiveEntry AddDirectoryEntry(string key, DateTime? modified = null);
}
public interface IWritableArchive : IArchive, IWritableArchiveCommon
{
/// <summary>
/// Saves the archive to the specified stream using the given writer options.
/// </summary>
void SaveTo(Stream stream, WriterOptions options);
/// <summary>
/// Removes the specified entry from the archive.
/// </summary>
void RemoveEntry(IArchiveEntry entry);
}
public interface IWritableAsyncArchive : IAsyncArchive, IWritableArchiveCommon
@@ -48,30 +48,4 @@ public interface IWritableAsyncArchive : IAsyncArchive, IWritableArchiveCommon
WriterOptions options,
CancellationToken cancellationToken = default
);
/// <summary>
/// Asynchronously adds an entry to the archive with the specified key, source stream, and options.
/// </summary>
ValueTask<IArchiveEntry> AddEntryAsync(
string key,
Stream source,
bool closeStream,
long size = 0,
DateTime? modified = null,
CancellationToken cancellationToken = default
);
/// <summary>
/// Asynchronously adds a directory entry to the archive with the specified key and modification time.
/// </summary>
ValueTask<IArchiveEntry> AddDirectoryEntryAsync(
string key,
DateTime? modified = null,
CancellationToken cancellationToken = default
);
/// <summary>
/// Removes the specified entry from the archive.
/// </summary>
ValueTask RemoveEntryAsync(IArchiveEntry entry);
}

View File

@@ -0,0 +1,59 @@
using System;
using System.IO;
namespace SharpCompress.Archives;
public static class IWritableArchiveCommonExtensions
{
extension(IWritableArchiveCommon writableArchive)
{
public void AddAllFromDirectory(
string filePath,
string searchPattern = "*.*",
SearchOption searchOption = SearchOption.AllDirectories
)
{
using (writableArchive.PauseEntryRebuilding())
{
foreach (
var path in Directory.EnumerateFiles(filePath, searchPattern, searchOption)
)
{
var fileInfo = new FileInfo(path);
writableArchive.AddEntry(
path.Substring(filePath.Length),
fileInfo.OpenRead(),
true,
fileInfo.Length,
fileInfo.LastWriteTime
);
}
}
}
public IArchiveEntry AddEntry(string key, string file) =>
writableArchive.AddEntry(key, new FileInfo(file));
public IArchiveEntry AddEntry(
string key,
Stream source,
long size = 0,
DateTime? modified = null
) => writableArchive.AddEntry(key, source, false, size, modified);
public IArchiveEntry AddEntry(string key, FileInfo fileInfo)
{
if (!fileInfo.Exists)
{
throw new ArgumentException("FileInfo does not exist.");
}
return writableArchive.AddEntry(
key,
fileInfo.OpenRead(),
true,
fileInfo.Length,
fileInfo.LastWriteTime
);
}
}
}
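The extension members above (using C# extension-member syntax) bulk-add files while entry rebuilding is paused. A hedged usage sketch; the `ZipArchive.CreateArchive` factory name is assumed by analogy with `TarArchive.CreateArchive` elsewhere in this diff:

```csharp
using System.IO;
using SharpCompress.Archives.Zip;
using SharpCompress.Common;
using SharpCompress.Writers;

// Add every .log file under C:\logs, then save as a zip.
using var archive = ZipArchive.CreateArchive();
archive.AddAllFromDirectory(@"C:\logs", "*.log", SearchOption.AllDirectories);
archive.SaveTo("logs.zip", new WriterOptions(CompressionType.Deflate));
```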

View File

@@ -1,4 +1,3 @@
using System;
using System.IO;
using SharpCompress.Common;
using SharpCompress.Writers;
@@ -9,55 +8,6 @@ public static class IWritableArchiveExtensions
{
extension(IWritableArchive writableArchive)
{
public void AddAllFromDirectory(
string filePath,
string searchPattern = "*.*",
SearchOption searchOption = SearchOption.AllDirectories
)
{
using (writableArchive.PauseEntryRebuilding())
{
foreach (
var path in Directory.EnumerateFiles(filePath, searchPattern, searchOption)
)
{
var fileInfo = new FileInfo(path);
writableArchive.AddEntry(
path.Substring(filePath.Length),
fileInfo.OpenRead(),
true,
fileInfo.Length,
fileInfo.LastWriteTime
);
}
}
}
public IArchiveEntry AddEntry(string key, string file) =>
writableArchive.AddEntry(key, new FileInfo(file));
public IArchiveEntry AddEntry(
string key,
Stream source,
long size = 0,
DateTime? modified = null
) => writableArchive.AddEntry(key, source, false, size, modified);
public IArchiveEntry AddEntry(string key, FileInfo fileInfo)
{
if (!fileInfo.Exists)
{
throw new ArgumentException("FileInfo does not exist.");
}
return writableArchive.AddEntry(
key,
fileInfo.OpenRead(),
true,
fileInfo.Length,
fileInfo.LastWriteTime
);
}
public void SaveTo(string filePath, WriterOptions? options = null) =>
writableArchive.SaveTo(new FileInfo(filePath), options ?? new(CompressionType.Deflate));

View File

@@ -1,4 +1,3 @@
using System;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
@@ -11,55 +10,6 @@ public static class IWritableAsyncArchiveExtensions
{
extension(IWritableAsyncArchive writableArchive)
{
public async ValueTask AddAllFromDirectoryAsync(
string filePath,
string searchPattern = "*.*",
SearchOption searchOption = SearchOption.AllDirectories
)
{
using (writableArchive.PauseEntryRebuilding())
{
foreach (
var path in Directory.EnumerateFiles(filePath, searchPattern, searchOption)
)
{
var fileInfo = new FileInfo(path);
await writableArchive.AddEntryAsync(
path.Substring(filePath.Length),
fileInfo.OpenRead(),
true,
fileInfo.Length,
fileInfo.LastWriteTime
);
}
}
}
public ValueTask<IArchiveEntry> AddEntryAsync(string key, string file) =>
writableArchive.AddEntryAsync(key, new FileInfo(file));
public ValueTask<IArchiveEntry> AddEntryAsync(
string key,
Stream source,
long size = 0,
DateTime? modified = null
) => writableArchive.AddEntryAsync(key, source, false, size, modified);
public ValueTask<IArchiveEntry> AddEntryAsync(string key, FileInfo fileInfo)
{
if (!fileInfo.Exists)
{
throw new ArgumentException("FileInfo does not exist.");
}
return writableArchive.AddEntryAsync(
key,
fileInfo.OpenRead(),
true,
fileInfo.Length,
fileInfo.LastWriteTime
);
}
public ValueTask SaveToAsync(
string filePath,
WriterOptions? options = null,

View File

@@ -36,7 +36,4 @@ internal class FileInfoRarArchiveVolume : RarVolume
new FileInfoRarFilePart(this, ReaderOptions.Password, markHeader, fileHeader, FileInfo);
internal override IEnumerable<RarFilePart> ReadFileParts() => FileParts;
internal override IAsyncEnumerable<RarFilePart> ReadFilePartsAsync() =>
FileParts.ToAsyncEnumerable();
}

View File

@@ -1,53 +0,0 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Archives.Rar;
using SharpCompress.Common;
using SharpCompress.Common.Rar;
using SharpCompress.IO;
using SharpCompress.Readers;
using SharpCompress.Readers.Rar;
namespace SharpCompress.Archives.Rar;
public partial class RarArchive
{
public override async ValueTask DisposeAsync()
{
if (!_disposed)
{
if (UnpackV1.IsValueCreated && UnpackV1.Value is IDisposable unpackV1)
{
unpackV1.Dispose();
}
_disposed = true;
await base.DisposeAsync();
}
}
protected override async ValueTask<IAsyncReader> CreateReaderForSolidExtractionAsync()
{
if (await this.IsMultipartVolumeAsync())
{
var streams = await VolumesAsync
.Select(volume =>
{
volume.Stream.Position = 0;
return volume.Stream;
})
.ToListAsync();
return (RarReader)RarReader.OpenReader(streams, ReaderOptions);
}
var stream = (await VolumesAsync.FirstAsync()).Stream;
stream.Position = 0;
return (RarReader)RarReader.OpenReader(stream, ReaderOptions);
}
public override async ValueTask<bool> IsSolidAsync() =>
await (await VolumesAsync.CastAsync<RarVolume>().FirstAsync()).IsSolidArchiveAsync();
}

View File

@@ -3,7 +3,6 @@ using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Common.Rar;
using SharpCompress.Common.Rar.Headers;
@@ -164,24 +163,4 @@ public partial class RarArchive
return false;
}
}
public static async ValueTask<bool> IsRarFileAsync(
Stream stream,
ReaderOptions? options = null,
CancellationToken cancellationToken = default
)
{
cancellationToken.ThrowIfCancellationRequested();
try
{
await MarkHeader
.ReadAsync(stream, true, false, cancellationToken)
.ConfigureAwait(false);
return true;
}
catch
{
return false;
}
}
}

View File

@@ -24,10 +24,7 @@ public interface IRarArchive : IArchive, IRarArchiveCommon { }
public interface IRarAsyncArchive : IAsyncArchive, IRarArchiveCommon { }
public partial class RarArchive
: AbstractArchive<RarArchiveEntry, RarVolume>,
IRarArchive,
IRarAsyncArchive
public partial class RarArchive : AbstractArchive<RarArchiveEntry, RarVolume>, IRarArchive
{
private bool _disposed;
internal Lazy<IRarUnpack> UnpackV2017 { get; } =
@@ -51,14 +48,23 @@ public partial class RarArchive
}
}
public override async ValueTask DisposeAsync()
{
if (!_disposed)
{
if (UnpackV1.IsValueCreated && UnpackV1.Value is IDisposable unpackV1)
{
unpackV1.Dispose();
}
_disposed = true;
await base.DisposeAsync();
}
}
protected override IEnumerable<RarArchiveEntry> LoadEntries(IEnumerable<RarVolume> volumes) =>
RarArchiveEntryFactory.GetEntries(this, volumes, ReaderOptions);
// Simple async property - kept in original file
protected override IAsyncEnumerable<RarArchiveEntry> LoadEntriesAsync(
IAsyncEnumerable<RarVolume> volumes
) => RarArchiveEntryFactory.GetEntriesAsync(this, volumes, ReaderOptions);
protected override IEnumerable<RarVolume> LoadVolumes(SourceStream sourceStream)
{
sourceStream.LoadAllParts();
@@ -80,7 +86,13 @@ public partial class RarArchive
return new StreamRarArchiveVolume(sourceStream, ReaderOptions, i++).AsEnumerable();
}
protected override IReader CreateReaderForSolidExtraction()
protected override IReader CreateReaderForSolidExtraction() =>
CreateReaderForSolidExtractionInternal();
protected override ValueTask<IAsyncReader> CreateReaderForSolidExtractionAsync() =>
new(CreateReaderForSolidExtractionInternal());
private RarReader CreateReaderForSolidExtractionInternal()
{
if (this.IsMultipartVolume())
{
@@ -102,6 +114,5 @@ public partial class RarArchive
public override bool IsEncrypted => Entries.First(x => !x.IsDirectory).IsEncrypted;
public virtual int MinVersion => Volumes.First().MinVersion;
public virtual int MaxVersion => Volumes.First().MaxVersion;
}

View File

@@ -1,43 +0,0 @@
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Common.Rar;
using SharpCompress.Common.Rar.Headers;
using SharpCompress.Compressors.Rar;
using SharpCompress.Readers;
namespace SharpCompress.Archives.Rar;
public partial class RarArchiveEntry
{
public async ValueTask<Stream> OpenEntryStreamAsync(
CancellationToken cancellationToken = default
)
{
RarStream stream;
if (IsRarV3)
{
stream = new RarStream(
archive.UnpackV1.Value,
FileHeader,
await MultiVolumeReadOnlyAsyncStream.Create(
Parts.ToAsyncEnumerable().CastAsync<RarFilePart>()
)
);
}
else
{
stream = new RarStream(
archive.UnpackV2017.Value,
FileHeader,
await MultiVolumeReadOnlyAsyncStream.Create(
Parts.ToAsyncEnumerable().CastAsync<RarFilePart>()
)
);
}
await stream.InitializeAsync(cancellationToken);
return stream;
}
}

View File

@@ -12,7 +12,7 @@ using SharpCompress.Readers;
namespace SharpCompress.Archives.Rar;
public partial class RarArchiveEntry : RarEntry, IArchiveEntry
public class RarArchiveEntry : RarEntry, IArchiveEntry
{
private readonly ICollection<RarFilePart> parts;
private readonly RarArchive archive;
@@ -92,6 +92,32 @@ public partial class RarArchiveEntry : RarEntry, IArchiveEntry
return stream;
}
public async ValueTask<Stream> OpenEntryStreamAsync(
CancellationToken cancellationToken = default
)
{
RarStream stream;
if (IsRarV3)
{
stream = new RarStream(
archive.UnpackV1.Value,
FileHeader,
new MultiVolumeReadOnlyStream(Parts.Cast<RarFilePart>())
);
}
else
{
stream = new RarStream(
archive.UnpackV2017.Value,
FileHeader,
new MultiVolumeReadOnlyStream(Parts.Cast<RarFilePart>())
);
}
await stream.InitializeAsync(cancellationToken);
return stream;
}
public bool IsComplete
{
get

View File

@@ -17,19 +17,6 @@ internal static class RarArchiveEntryFactory
}
}
private static async IAsyncEnumerable<RarFilePart> GetFilePartsAsync(
IAsyncEnumerable<RarVolume> parts
)
{
await foreach (var rarPart in parts)
{
await foreach (var fp in rarPart.ReadFilePartsAsync())
{
yield return fp;
}
}
}
private static IEnumerable<IEnumerable<RarFilePart>> GetMatchedFileParts(
IEnumerable<RarVolume> parts
)
@@ -51,27 +38,6 @@ internal static class RarArchiveEntryFactory
}
}
private static async IAsyncEnumerable<IEnumerable<RarFilePart>> GetMatchedFilePartsAsync(
IAsyncEnumerable<RarVolume> parts
)
{
var groupedParts = new List<RarFilePart>();
await foreach (var fp in GetFilePartsAsync(parts))
{
groupedParts.Add(fp);
if (!fp.FileHeader.IsSplitAfter)
{
yield return groupedParts;
groupedParts = new List<RarFilePart>();
}
}
if (groupedParts.Count > 0)
{
yield return groupedParts;
}
}
internal static IEnumerable<RarArchiveEntry> GetEntries(
RarArchive archive,
IEnumerable<RarVolume> rarParts,
@@ -83,16 +49,4 @@ internal static class RarArchiveEntryFactory
yield return new RarArchiveEntry(archive, groupedParts, readerOptions);
}
}
internal static async IAsyncEnumerable<RarArchiveEntry> GetEntriesAsync(
RarArchive archive,
IAsyncEnumerable<RarVolume> rarParts,
ReaderOptions readerOptions
)
{
await foreach (var groupedParts in GetMatchedFilePartsAsync(rarParts))
{
yield return new RarArchiveEntry(archive, groupedParts, readerOptions);
}
}
}

View File

@@ -13,7 +13,6 @@ internal static class RarArchiveVolumeFactory
//new style rar - .part1 | .part01 | .part001 ....
var m = Regex.Match(part1.Name, @"^(.*\.part)([0-9]+)(\.rar)$", RegexOptions.IgnoreCase);
if (m.Success)
{
item = new FileInfo(
Path.Combine(
part1.DirectoryName!,
@@ -24,13 +23,11 @@ internal static class RarArchiveVolumeFactory
)
)
);
}
else
{
//old style - ...rar, .r00, .r01 ...
m = Regex.Match(part1.Name, @"^(.*\.)([r-z{])(ar|[0-9]+)$", RegexOptions.IgnoreCase);
if (m.Success)
{
item = new FileInfo(
Path.Combine(
part1.DirectoryName!,
@@ -43,17 +40,12 @@ internal static class RarArchiveVolumeFactory
)
)
);
}
else //split .001, .002 ....
{
return ArchiveVolumeFactory.GetFilePart(index, part1);
}
}
if (item != null && item.Exists)
{
return item;
}
return null; //no more items
}
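The two multi-volume naming schemes handled above can be exercised directly. A minimal sketch (using the simplified old-style character class `[r-z]`):

```csharp
using System;
using System.Text.RegularExpressions;

// New style: archive.part1.rar, archive.part01.rar, archive.part001.rar ...
var newStyle = new Regex(@"^(.*\.part)([0-9]+)(\.rar)$", RegexOptions.IgnoreCase);
// Old style: archive.rar, archive.r00, archive.r01 ...
var oldStyle = new Regex(@"^(.*\.)([r-z])(ar|[0-9]+)$", RegexOptions.IgnoreCase);

Console.WriteLine(newStyle.IsMatch("backup.part02.rar")); // True
Console.WriteLine(oldStyle.IsMatch("backup.r00"));        // True
Console.WriteLine(oldStyle.IsMatch("backup.rar"));        // True
```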

View File

@@ -1,6 +1,7 @@
using System.IO;
using SharpCompress.Common.Rar;
using SharpCompress.Common.Rar.Headers;
using SharpCompress.IO;
namespace SharpCompress.Archives.Rar;
@@ -24,6 +25,76 @@ internal class SeekableFilePart : RarFilePart
internal override Stream GetCompressedStream()
{
Stream streamToUse;
// If the stream is a SourceStream in file mode with multi-threading enabled,
// create an independent stream to support concurrent extraction
if (
_stream is SourceStream sourceStream
&& sourceStream.IsFileMode
&& sourceStream.ReaderOptions.EnableMultiThreadedExtraction
)
{
var independentStream = sourceStream.CreateIndependentStream(0);
if (independentStream is not null)
{
streamToUse = independentStream;
streamToUse.Position = FileHeader.DataStartPosition;
if (FileHeader.R4Salt != null)
{
var cryptKey = new CryptKey3(_password!);
return new RarCryptoWrapper(streamToUse, FileHeader.R4Salt, cryptKey);
}
if (FileHeader.Rar5CryptoInfo != null)
{
var cryptKey = new CryptKey5(_password!, FileHeader.Rar5CryptoInfo);
return new RarCryptoWrapper(
streamToUse,
FileHeader.Rar5CryptoInfo.Salt,
cryptKey
);
}
return streamToUse;
}
}
// Check if the stream wraps a FileStream
Stream? underlyingStream = _stream;
if (_stream is IStreamStack streamStack)
{
underlyingStream = streamStack.BaseStream();
}
if (underlyingStream is FileStream fileStream)
{
// Create a new independent stream from the file
streamToUse = new FileStream(
fileStream.Name,
FileMode.Open,
FileAccess.Read,
FileShare.Read
);
streamToUse.Position = FileHeader.DataStartPosition;
if (FileHeader.R4Salt != null)
{
var cryptKey = new CryptKey3(_password!);
return new RarCryptoWrapper(streamToUse, FileHeader.R4Salt, cryptKey);
}
if (FileHeader.Rar5CryptoInfo != null)
{
var cryptKey = new CryptKey5(_password!, FileHeader.Rar5CryptoInfo);
return new RarCryptoWrapper(streamToUse, FileHeader.Rar5CryptoInfo.Salt, cryptKey);
}
return streamToUse;
}
// Fall back to existing behavior for stream-based sources
_stream.Position = FileHeader.DataStartPosition;
if (FileHeader.R4Salt != null)
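The independent-stream path above only activates when the caller opts in via `ReaderOptions.EnableMultiThreadedExtraction`. A hedged usage sketch (the `RarArchive.OpenArchive` overload and the `SupportsMultiThreadedExtraction` property are assumed from this changeset's API shape):

```csharp
using System.Linq;
using System.Threading.Tasks;
using SharpCompress.Archives;
using SharpCompress.Archives.Rar;
using SharpCompress.Readers;

// Opt in; each file part then gets its own independent FileStream
// instead of sharing one seekable stream across threads.
var options = new ReaderOptions { EnableMultiThreadedExtraction = true };
using var archive = RarArchive.OpenArchive("archive.rar", options);

if (archive.SupportsMultiThreadedExtraction)
{
    Parallel.ForEach(
        archive.Entries.Where(e => !e.IsDirectory),
        entry => entry.WriteToDirectory("output"));
}
```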

View File

@@ -14,9 +14,6 @@ internal class StreamRarArchiveVolume : RarVolume
internal override IEnumerable<RarFilePart> ReadFileParts() => GetVolumeFileParts();
internal override IAsyncEnumerable<RarFilePart> ReadFilePartsAsync() =>
GetVolumeFilePartsAsync();
internal override RarFilePart CreateFilePart(MarkHeader markHeader, FileHeader fileHeader) =>
new SeekableFilePart(markHeader, fileHeader, Index, Stream, ReaderOptions.Password);
}

View File

@@ -1,73 +0,0 @@
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Common.SevenZip;
using SharpCompress.IO;
using SharpCompress.Readers;
namespace SharpCompress.Archives.SevenZip;
public partial class SevenZipArchive
{
private async ValueTask LoadFactoryAsync(
Stream stream,
CancellationToken cancellationToken = default
)
{
if (_database is null)
{
stream.Position = 0;
var reader = new ArchiveReader();
await reader.OpenAsync(
stream,
lookForHeader: ReaderOptions.LookForHeader,
cancellationToken
);
_database = await reader.ReadDatabaseAsync(
new PasswordProvider(ReaderOptions.Password),
cancellationToken
);
}
}
protected override async IAsyncEnumerable<SevenZipArchiveEntry> LoadEntriesAsync(
IAsyncEnumerable<SevenZipVolume> volumes
)
{
var stream = (await volumes.SingleAsync()).Stream;
await LoadFactoryAsync(stream);
if (_database is null)
{
yield break;
}
var entries = new SevenZipArchiveEntry[_database._files.Count];
for (var i = 0; i < _database._files.Count; i++)
{
var file = _database._files[i];
entries[i] = new SevenZipArchiveEntry(
this,
new SevenZipFilePart(stream, _database, i, file, ReaderOptions.ArchiveEncoding)
);
}
foreach (var group in entries.Where(x => !x.IsDirectory).GroupBy(x => x.FilePart.Folder))
{
var isSolid = false;
foreach (var entry in group)
{
entry.IsSolid = isSolid;
isSolid = true;
}
}
foreach (var entry in entries)
{
yield return entry;
}
}
protected override ValueTask<IAsyncReader> CreateReaderForSolidExtractionAsync() =>
new(new SevenZipReader(ReaderOptions, this));
}

View File

@@ -1,10 +1,12 @@
using System;
using System.Buffers;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Common.SevenZip;
using SharpCompress.Compressors.LZMA.Utilites;
using SharpCompress.IO;
using SharpCompress.Readers;
@@ -155,56 +157,13 @@ public partial class SevenZipArchive
}
}
public static async ValueTask<bool> IsSevenZipFileAsync(
Stream stream,
CancellationToken cancellationToken = default
)
{
cancellationToken.ThrowIfCancellationRequested();
try
{
return await SignatureMatchAsync(stream, cancellationToken);
}
catch
{
return false;
}
}
private static ReadOnlySpan<byte> Signature => [(byte)'7', (byte)'z', 0xBC, 0xAF, 0x27, 0x1C];
private static ReadOnlySpan<byte> Signature =>
new byte[] { (byte)'7', (byte)'z', 0xBC, 0xAF, 0x27, 0x1C };
private static bool SignatureMatch(Stream stream)
{
var buffer = ArrayPool<byte>.Shared.Rent(6);
try
{
stream.ReadExact(buffer, 0, 6);
return buffer.AsSpan().Slice(0, 6).SequenceEqual(Signature);
}
finally
{
ArrayPool<byte>.Shared.Return(buffer);
}
}
private static async ValueTask<bool> SignatureMatchAsync(
Stream stream,
CancellationToken cancellationToken
)
{
var buffer = ArrayPool<byte>.Shared.Rent(6);
try
{
if (!await stream.ReadFullyAsync(buffer, 0, 6, cancellationToken).ConfigureAwait(false))
{
return false;
}
return buffer.AsSpan().Slice(0, 6).SequenceEqual(Signature);
}
finally
{
ArrayPool<byte>.Shared.Return(buffer);
}
var reader = new BinaryReader(stream);
ReadOnlySpan<byte> signatureBytes = reader.ReadBytes(6);
return signatureBytes.SequenceEqual(Signature);
}
}
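The signature comparison above checks the first six bytes against the 7z magic number. A self-contained sketch of the same idea:

```csharp
using System;
using System.IO;

// Minimal standalone version of the 7z signature check:
// '7', 'z', 0xBC, 0xAF, 0x27, 0x1C.
static bool LooksLikeSevenZip(Stream stream)
{
    ReadOnlySpan<byte> signature = stackalloc byte[] { (byte)'7', (byte)'z', 0xBC, 0xAF, 0x27, 0x1C };
    Span<byte> buffer = stackalloc byte[6];
    return stream.Read(buffer) == 6 && buffer.SequenceEqual(signature);
}
```

Note this advances the stream position; callers that go on to open the archive should rewind first.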

View File

@@ -16,65 +16,48 @@ public partial class SevenZipArchive : AbstractArchive<SevenZipArchiveEntry, Sev
{
private ArchiveDatabase? _database;
/// <summary>
/// Constructor with a SourceStream able to handle FileInfo and Streams.
/// </summary>
/// <param name="sourceStream"></param>
private SevenZipArchive(SourceStream sourceStream)
: base(ArchiveType.SevenZip, sourceStream) { }
protected override IEnumerable<SevenZipVolume> LoadVolumes(SourceStream sourceStream)
{
sourceStream.NotNull("SourceStream is null").LoadAllParts(); //request all streams
return new SevenZipVolume(sourceStream, ReaderOptions, 0).AsEnumerable(); //simple single volume or split, multivolume not supported
}
internal SevenZipArchive()
: base(ArchiveType.SevenZip) { }
protected override IEnumerable<SevenZipVolume> LoadVolumes(SourceStream sourceStream)
{
sourceStream.NotNull("SourceStream is null").LoadAllParts();
return new SevenZipVolume(sourceStream, ReaderOptions, 0).AsEnumerable();
}
protected override IEnumerable<SevenZipArchiveEntry> LoadEntries(
IEnumerable<SevenZipVolume> volumes
)
{
foreach (var volume in volumes)
var stream = volumes.Single().Stream;
LoadFactory(stream);
if (_database is null)
{
LoadFactory(volume.Stream);
if (_database is null)
return Enumerable.Empty<SevenZipArchiveEntry>();
}
var entries = new SevenZipArchiveEntry[_database._files.Count];
for (var i = 0; i < _database._files.Count; i++)
{
var file = _database._files[i];
entries[i] = new SevenZipArchiveEntry(
this,
new SevenZipFilePart(stream, _database, i, file, ReaderOptions.ArchiveEncoding)
);
}
foreach (var group in entries.Where(x => !x.IsDirectory).GroupBy(x => x.FilePart.Folder))
{
var isSolid = false;
foreach (var entry in group)
{
yield break;
}
var entries = new SevenZipArchiveEntry[_database._files.Count];
for (var i = 0; i < _database._files.Count; i++)
{
var file = _database._files[i];
entries[i] = new SevenZipArchiveEntry(
this,
new SevenZipFilePart(
volume.Stream,
_database,
i,
file,
ReaderOptions.ArchiveEncoding
)
);
}
foreach (
var group in entries.Where(x => !x.IsDirectory).GroupBy(x => x.FilePart.Folder)
)
{
var isSolid = false;
foreach (var entry in group)
{
entry.IsSolid = isSolid;
isSolid = true;
}
}
foreach (var entry in entries)
{
yield return entry;
entry.IsSolid = isSolid;
isSolid = true;
}
}
return entries;
}
private void LoadFactory(Stream stream)
@@ -91,6 +74,9 @@ public partial class SevenZipArchive : AbstractArchive<SevenZipArchiveEntry, Sev
protected override IReader CreateReaderForSolidExtraction() =>
new SevenZipReader(ReaderOptions, this);
protected override ValueTask<IAsyncReader> CreateReaderForSolidExtractionAsync() =>
new(new SevenZipReader(ReaderOptions, this));
public override bool IsSolid =>
Entries
.Where(x => !x.IsDirectory)
@@ -102,34 +88,13 @@ public partial class SevenZipArchive : AbstractArchive<SevenZipArchiveEntry, Sev
public override long TotalSize =>
_database?._packSizes.Aggregate(0L, (total, packSize) => total + packSize) ?? 0;
internal sealed class SevenZipReader : AbstractReader<SevenZipEntry, SevenZipVolume>
private sealed class SevenZipReader : AbstractReader<SevenZipEntry, SevenZipVolume>
{
private readonly SevenZipArchive _archive;
private SevenZipEntry? _currentEntry;
private Stream? _currentFolderStream;
private CFolder? _currentFolder;
/// <summary>
/// Enables internal diagnostics for tests.
/// When disabled (default), diagnostics properties return null to avoid exposing internal state.
/// </summary>
internal bool DiagnosticsEnabled { get; set; }
/// <summary>
/// Current folder instance used to decide whether the solid folder stream should be reused.
/// Only available when <see cref="DiagnosticsEnabled"/> is true.
/// </summary>
internal object? DiagnosticsCurrentFolder => DiagnosticsEnabled ? _currentFolder : null;
/// <summary>
/// Current shared folder stream instance.
/// Only available when <see cref="DiagnosticsEnabled"/> is true.
/// </summary>
internal Stream? DiagnosticsCurrentFolderStream =>
DiagnosticsEnabled ? _currentFolderStream : null;
internal SevenZipReader(ReaderOptions readerOptions, SevenZipArchive archive)
: base(readerOptions, ArchiveType.SevenZip, false) => this._archive = archive;
: base(readerOptions, ArchiveType.SevenZip) => this._archive = archive;
public override SevenZipVolume Volume => _archive.Volumes.Single();
@@ -142,10 +107,6 @@ public partial class SevenZipArchive : AbstractArchive<SevenZipArchiveEntry, Sev
_currentEntry = dir;
yield return dir;
}
// For solid archives (entries in the same folder share a compressed stream),
// we must iterate entries sequentially and maintain the folder stream state
// across entries in the same folder to avoid recreating the decompression
// stream for each file, which breaks contiguous streaming.
foreach (var entry in entries.Where(x => !x.IsDirectory))
{
_currentEntry = entry;
@@ -160,53 +121,10 @@ public partial class SevenZipArchive : AbstractArchive<SevenZipArchiveEntry, Sev
{
return CreateEntryStream(Stream.Null);
}
var folder = entry.FilePart.Folder;
// Check if we're starting a new folder - dispose old folder stream if needed
if (folder != _currentFolder)
{
_currentFolderStream?.Dispose();
_currentFolderStream = null;
_currentFolder = folder;
}
// Create the folder stream once per folder
if (_currentFolderStream is null)
{
_currentFolderStream = _archive._database!.GetFolderStream(
_archive.Volumes.Single().Stream,
folder!,
_archive._database.PasswordProvider
);
}
// Wrap with SyncOnlyStream to work around LZMA async bugs
// Return a ReadOnlySubStream that reads from the shared folder stream
return CreateEntryStream(
new SyncOnlyStream(
new ReadOnlySubStream(_currentFolderStream, entry.Size, leaveOpen: true)
)
);
}
public override void Dispose()
{
_currentFolderStream?.Dispose();
_currentFolderStream = null;
base.Dispose();
return CreateEntryStream(new SyncOnlyStream(entry.FilePart.GetCompressedStream()));
}
}
/// <summary>
/// WORKAROUND: Forces async operations to use synchronous equivalents.
/// This is necessary because the LZMA decoder has bugs in its async implementation
/// that cause state corruption (IndexOutOfRangeException, DataErrorException).
///
/// The proper fix would be to repair the LZMA decoder's async methods
/// (LzmaStream.ReadAsync, Decoder.CodeAsync, OutWindow async operations),
/// but that requires deep changes to the decoder state machine.
/// </summary>
private sealed class SyncOnlyStream : Stream
{
private readonly Stream _baseStream;
@@ -236,7 +154,6 @@ public partial class SevenZipArchive : AbstractArchive<SevenZipArchiveEntry, Sev
public override void Write(byte[] buffer, int offset, int count) =>
_baseStream.Write(buffer, offset, count);
// Force async operations to use sync equivalents to avoid LZMA decoder bugs
public override Task<int> ReadAsync(
byte[] buffer,
int offset,
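The `SyncOnlyStream` workaround documented above forces async reads through the synchronous path to avoid LZMA decoder state corruption. A hedged sketch of the core idea (a simplified wrapper, not the actual implementation):

```csharp
using System;
using System.IO;
using System.Threading;
using System.Threading.Tasks;

// Pass-through read-only wrapper that routes ReadAsync through Read.
internal sealed class SyncOnlyReadStream : Stream
{
    private readonly Stream _inner;

    public SyncOnlyReadStream(Stream inner) => _inner = inner;

    public override int Read(byte[] buffer, int offset, int count) =>
        _inner.Read(buffer, offset, count);

    // Force the sync path to sidestep the buggy async decoder code.
    public override Task<int> ReadAsync(
        byte[] buffer, int offset, int count, CancellationToken token) =>
        token.IsCancellationRequested
            ? Task.FromCanceled<int>(token)
            : Task.FromResult(Read(buffer, offset, count));

    public override bool CanRead => _inner.CanRead;
    public override bool CanSeek => false;
    public override bool CanWrite => false;
    public override long Length => _inner.Length;
    public override long Position
    {
        get => _inner.Position;
        set => throw new NotSupportedException();
    }

    public override void Flush() { }
    public override long Seek(long offset, SeekOrigin origin) => throw new NotSupportedException();
    public override void SetLength(long value) => throw new NotSupportedException();
    public override void Write(byte[] buffer, int offset, int count) => throw new NotSupportedException();
}
```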

View File

@@ -12,9 +12,8 @@ public class SevenZipArchiveEntry : SevenZipEntry, IArchiveEntry
public Stream OpenEntryStream() => FilePart.GetCompressedStream();
public async ValueTask<Stream> OpenEntryStreamAsync(
CancellationToken cancellationToken = default
) => (await FilePart.GetCompressedStreamAsync(cancellationToken)).NotNull();
public ValueTask<Stream> OpenEntryStreamAsync(CancellationToken cancellationToken = default) =>
new(OpenEntryStream());
public IArchive Archive { get; }

View File

@@ -1,161 +0,0 @@
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Common.Tar;
using SharpCompress.Common.Tar.Headers;
using SharpCompress.IO;
using SharpCompress.Readers;
using SharpCompress.Readers.Tar;
using SharpCompress.Writers;
using SharpCompress.Writers.Tar;
namespace SharpCompress.Archives.Tar;
public partial class TarArchive
{
protected override async ValueTask SaveToAsync(
Stream stream,
WriterOptions options,
IAsyncEnumerable<TarArchiveEntry> oldEntries,
IEnumerable<TarArchiveEntry> newEntries,
CancellationToken cancellationToken = default
)
{
using var writer = new TarWriter(stream, new TarWriterOptions(options));
await foreach (
var entry in oldEntries.WithCancellation(cancellationToken).ConfigureAwait(false)
)
{
if (entry.IsDirectory)
{
await writer
.WriteDirectoryAsync(
entry.Key.NotNull("Entry Key is null"),
entry.LastModifiedTime,
cancellationToken
)
.ConfigureAwait(false);
}
else
{
using var entryStream = entry.OpenEntryStream();
await writer
.WriteAsync(
entry.Key.NotNull("Entry Key is null"),
entryStream,
entry.LastModifiedTime,
entry.Size,
cancellationToken
)
.ConfigureAwait(false);
}
}
foreach (var entry in newEntries)
{
if (entry.IsDirectory)
{
await writer
.WriteDirectoryAsync(
entry.Key.NotNull("Entry Key is null"),
entry.LastModifiedTime,
cancellationToken
)
.ConfigureAwait(false);
}
else
{
using var entryStream = entry.OpenEntryStream();
await writer
.WriteAsync(
entry.Key.NotNull("Entry Key is null"),
entryStream,
entry.LastModifiedTime,
entry.Size,
cancellationToken
)
.ConfigureAwait(false);
}
}
}
protected override ValueTask<IAsyncReader> CreateReaderForSolidExtractionAsync()
{
var stream = Volumes.Single().Stream;
stream.Position = 0;
return new((IAsyncReader)TarReader.OpenReader(stream));
}
protected override async IAsyncEnumerable<TarArchiveEntry> LoadEntriesAsync(
IAsyncEnumerable<TarVolume> volumes
)
{
var stream = (await volumes.SingleAsync()).Stream;
if (stream.CanSeek)
{
stream.Position = 0;
}
// Always use async header reading in LoadEntriesAsync for consistency
{
// Use async header reading for async-only streams
TarHeader? previousHeader = null;
await foreach (
var header in TarHeaderFactory.ReadHeaderAsync(
StreamingMode.Seekable,
stream,
ReaderOptions.ArchiveEncoding
)
)
{
if (header != null)
{
if (header.EntryType == EntryType.LongName)
{
previousHeader = header;
}
else
{
if (previousHeader != null)
{
var entry = new TarArchiveEntry(
this,
new TarFilePart(previousHeader, stream),
CompressionType.None
);
var oldStreamPos = stream.Position;
using (var entryStream = entry.OpenEntryStream())
{
using var memoryStream = new MemoryStream();
await entryStream.CopyToAsync(memoryStream);
memoryStream.Position = 0;
var bytes = memoryStream.ToArray();
header.Name = ReaderOptions
.ArchiveEncoding.Decode(bytes)
.TrimNulls();
}
stream.Position = oldStreamPos;
previousHeader = null;
}
yield return new TarArchiveEntry(
this,
new TarFilePart(header, stream),
CompressionType.None
);
}
}
else
{
throw new IncompleteArchiveException("Failed to read TAR header");
}
}
}
}
}

View File

@@ -2,13 +2,15 @@ using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Common.Tar;
using SharpCompress.Common.Tar.Headers;
using SharpCompress.IO;
using SharpCompress.Readers;
using SharpCompress.Writers;
using SharpCompress.Writers.Tar;
namespace SharpCompress.Archives.Tar;
@@ -21,7 +23,7 @@ public partial class TarArchive
public static IWritableArchive OpenArchive(string filePath, ReaderOptions? readerOptions = null)
{
filePath.NotNullOrEmpty(nameof(filePath));
return OpenArchive(new FileInfo(filePath), readerOptions);
return OpenArchive(new FileInfo(filePath), readerOptions ?? new ReaderOptions());
}
public static IWritableArchive OpenArchive(
@@ -34,7 +36,7 @@ public partial class TarArchive
new SourceStream(
fileInfo,
i => ArchiveVolumeFactory.GetFilePart(i, fileInfo),
readerOptions ?? new ReaderOptions() { LeaveStreamOpen = false }
readerOptions ?? new ReaderOptions()
)
);
}
@@ -50,7 +52,7 @@ public partial class TarArchive
new SourceStream(
files[0],
i => i < files.Length ? files[i] : null,
readerOptions ?? new ReaderOptions() { LeaveStreamOpen = false }
readerOptions ?? new ReaderOptions()
)
);
}
@@ -152,48 +154,15 @@ public partial class TarArchive
try
{
var tarHeader = new TarHeader(new ArchiveEncoding());
var reader = new BinaryReader(stream, Encoding.UTF8, false);
var readSucceeded = tarHeader.Read(reader);
var readSucceeded = tarHeader.Read(new BinaryReader(stream));
var isEmptyArchive =
tarHeader.Name?.Length == 0
&& tarHeader.Size == 0
&& Enum.IsDefined(typeof(EntryType), tarHeader.EntryType);
return readSucceeded || isEmptyArchive;
}
catch (Exception)
{
// Catch all exceptions during tar header reading to determine if this is a valid tar file
// Invalid tar files or corrupted streams will throw various exceptions
return false;
}
}
public static async ValueTask<bool> IsTarFileAsync(
Stream stream,
CancellationToken cancellationToken = default
)
{
try
{
var tarHeader = new TarHeader(new ArchiveEncoding());
#if NET8_0_OR_GREATER
await using var reader = new AsyncBinaryReader(stream, leaveOpen: true);
#else
using var reader = new AsyncBinaryReader(stream, leaveOpen: true);
#endif
var readSucceeded = await tarHeader.ReadAsync(reader);
var isEmptyArchive =
tarHeader.Name?.Length == 0
&& tarHeader.Size == 0
&& Enum.IsDefined(typeof(EntryType), tarHeader.EntryType);
return readSucceeded || isEmptyArchive;
}
catch (Exception)
{
// Catch all exceptions during tar header reading to determine if this is a valid tar file
// Invalid tar files or corrupted streams will throw various exceptions
return false;
}
catch { }
return false;
}
public static IWritableArchive CreateArchive() => new TarArchive();


@@ -32,10 +32,6 @@ public partial class TarArchive : AbstractWritableArchive<TarArchiveEntry, TarVo
protected override IEnumerable<TarArchiveEntry> LoadEntries(IEnumerable<TarVolume> volumes)
{
var stream = volumes.Single().Stream;
if (stream.CanSeek)
{
stream.Position = 0;
}
TarHeader? previousHeader = null;
foreach (
var header in TarHeaderFactory.ReadHeader(
@@ -66,7 +62,7 @@ public partial class TarArchive : AbstractWritableArchive<TarArchiveEntry, TarVo
using (var entryStream = entry.OpenEntryStream())
{
using var memoryStream = new MemoryStream();
entryStream.CopyTo(memoryStream, Constants.BufferSize);
entryStream.CopyTo(memoryStream);
memoryStream.Position = 0;
var bytes = memoryStream.ToArray();
@@ -143,10 +139,54 @@ public partial class TarArchive : AbstractWritableArchive<TarArchiveEntry, TarVo
}
}
protected override async ValueTask SaveToAsync(
Stream stream,
WriterOptions options,
IEnumerable<TarArchiveEntry> oldEntries,
IEnumerable<TarArchiveEntry> newEntries,
CancellationToken cancellationToken = default
)
{
using var writer = new TarWriter(stream, new TarWriterOptions(options));
foreach (var entry in oldEntries.Concat(newEntries))
{
if (entry.IsDirectory)
{
await writer
.WriteDirectoryAsync(
entry.Key.NotNull("Entry Key is null"),
entry.LastModifiedTime,
cancellationToken
)
.ConfigureAwait(false);
}
else
{
using var entryStream = entry.OpenEntryStream();
await writer
.WriteAsync(
entry.Key.NotNull("Entry Key is null"),
entryStream,
entry.LastModifiedTime,
entry.Size,
cancellationToken
)
.ConfigureAwait(false);
}
}
}
protected override IReader CreateReaderForSolidExtraction()
{
var stream = Volumes.Single().Stream;
stream.Position = 0;
return TarReader.OpenReader(stream);
}
protected override ValueTask<IAsyncReader> CreateReaderForSolidExtractionAsync()
{
var stream = Volumes.Single().Stream;
stream.Position = 0;
return new((IAsyncReader)TarReader.OpenReader(stream));
}
}


@@ -14,9 +14,8 @@ public class TarArchiveEntry : TarEntry, IArchiveEntry
public virtual Stream OpenEntryStream() => Parts.Single().GetCompressedStream().NotNull();
public async ValueTask<Stream> OpenEntryStreamAsync(
CancellationToken cancellationToken = default
) => (await Parts.Single().GetCompressedStreamAsync(cancellationToken)).NotNull();
public ValueTask<Stream> OpenEntryStreamAsync(CancellationToken cancellationToken = default) =>
new(OpenEntryStream());
#region IArchiveEntry Members


@@ -79,7 +79,7 @@ internal sealed class TarWritableArchiveEntry : TarArchiveEntry, IWritableArchiv
}
// Ensure the new stream is at the start; it may have been repositioned.
stream.Seek(0, SeekOrigin.Begin);
return SharpCompressStream.CreateNonDisposing(stream);
return SharpCompressStream.Create(stream, leaveOpen: true);
}
internal override void Close()


@@ -1,132 +0,0 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.Common.Zip;
using SharpCompress.Common.Zip.Headers;
using SharpCompress.IO;
using SharpCompress.Readers;
using SharpCompress.Writers;
using SharpCompress.Writers.Zip;
namespace SharpCompress.Archives.Zip;
public partial class ZipArchive
{
protected override async IAsyncEnumerable<ZipArchiveEntry> LoadEntriesAsync(
IAsyncEnumerable<ZipVolume> volumes
)
{
var vols = await volumes.ToListAsync();
var volsArray = vols.ToArray();
await foreach (
var h in headerFactory.NotNull().ReadSeekableHeaderAsync(volsArray.Last().Stream)
)
{
if (h != null)
{
switch (h.ZipHeaderType)
{
case ZipHeaderType.DirectoryEntry:
{
var deh = (DirectoryEntryHeader)h;
Stream s;
if (
deh.RelativeOffsetOfEntryHeader + deh.CompressedSize
> volsArray[deh.DiskNumberStart].Stream.Length
)
{
var v = volsArray.Skip(deh.DiskNumberStart).ToArray();
s = new SourceStream(
v[0].Stream,
i => i < v.Length ? v[i].Stream : null,
new ReaderOptions() { LeaveStreamOpen = true }
);
}
else
{
s = volsArray[deh.DiskNumberStart].Stream;
}
yield return new ZipArchiveEntry(
this,
new SeekableZipFilePart(headerFactory.NotNull(), deh, s)
);
}
break;
case ZipHeaderType.DirectoryEnd:
{
var bytes = ((DirectoryEndHeader)h).Comment ?? Array.Empty<byte>();
volsArray.Last().Comment = ReaderOptions.ArchiveEncoding.Decode(bytes);
yield break;
}
}
}
}
}
protected override async ValueTask SaveToAsync(
Stream stream,
WriterOptions options,
IAsyncEnumerable<ZipArchiveEntry> oldEntries,
IEnumerable<ZipArchiveEntry> newEntries,
CancellationToken cancellationToken = default
)
{
using var writer = new ZipWriter(stream, new ZipWriterOptions(options));
await foreach (
var entry in oldEntries.WithCancellation(cancellationToken).ConfigureAwait(false)
)
{
if (entry.IsDirectory)
{
await writer
.WriteDirectoryAsync(
entry.Key.NotNull("Entry Key is null"),
entry.LastModifiedTime,
cancellationToken
)
.ConfigureAwait(false);
}
else
{
using var entryStream = entry.OpenEntryStream();
await writer
.WriteAsync(
entry.Key.NotNull("Entry Key is null"),
entryStream,
cancellationToken
)
.ConfigureAwait(false);
}
}
foreach (var entry in newEntries)
{
if (entry.IsDirectory)
{
await writer
.WriteDirectoryAsync(
entry.Key.NotNull("Entry Key is null"),
entry.LastModifiedTime,
cancellationToken
)
.ConfigureAwait(false);
}
else
{
using var entryStream = entry.OpenEntryStream();
await writer
.WriteAsync(
entry.Key.NotNull("Entry Key is null"),
entryStream,
cancellationToken
)
.ConfigureAwait(false);
}
}
}
}


@@ -21,7 +21,7 @@ public partial class ZipArchive
public static IWritableArchive OpenArchive(string filePath, ReaderOptions? readerOptions = null)
{
filePath.NotNullOrEmpty(nameof(filePath));
return OpenArchive(new FileInfo(filePath), readerOptions);
return OpenArchive(new FileInfo(filePath), readerOptions ?? new ReaderOptions());
}
public static IWritableArchive OpenArchive(
@@ -34,7 +34,7 @@ public partial class ZipArchive
new SourceStream(
fileInfo,
i => ZipArchiveVolumeFactory.GetFilePart(i, fileInfo),
readerOptions ?? new ReaderOptions() { LeaveStreamOpen = false }
readerOptions ?? new ReaderOptions()
)
);
}
@@ -50,7 +50,7 @@ public partial class ZipArchive
new SourceStream(
files[0],
i => i < files.Length ? files[i] : null,
readerOptions ?? new ReaderOptions() { LeaveStreamOpen = false }
readerOptions ?? new ReaderOptions()
)
);
}
@@ -135,24 +135,40 @@ public partial class ZipArchive
return (IWritableAsyncArchive)OpenArchive(fileInfos, readerOptions);
}
public static bool IsZipFile(string filePath, string? password = null) =>
IsZipFile(new FileInfo(filePath), password);
public static bool IsZipFile(
string filePath,
string? password = null,
int bufferSize = ReaderOptions.DefaultBufferSize
) => IsZipFile(new FileInfo(filePath), password, bufferSize);
public static bool IsZipFile(FileInfo fileInfo, string? password = null)
public static bool IsZipFile(
FileInfo fileInfo,
string? password = null,
int bufferSize = ReaderOptions.DefaultBufferSize
)
{
if (!fileInfo.Exists)
{
return false;
}
using Stream stream = fileInfo.OpenRead();
return IsZipFile(stream, password);
return IsZipFile(stream, password, bufferSize);
}
public static bool IsZipFile(Stream stream, string? password = null)
public static bool IsZipFile(
Stream stream,
string? password = null,
int bufferSize = ReaderOptions.DefaultBufferSize
)
{
var headerFactory = new StreamingZipHeaderFactory(password, new ArchiveEncoding(), null);
try
{
if (stream is not SharpCompressStream)
{
stream = new SharpCompressStream(stream, bufferSize: bufferSize);
}
var header = headerFactory
.ReadStreamHeader(stream)
.FirstOrDefault(x => x.ZipHeaderType != ZipHeaderType.Split);
@@ -172,11 +188,20 @@ public partial class ZipArchive
}
}
public static bool IsZipMulti(Stream stream, string? password = null)
public static bool IsZipMulti(
Stream stream,
string? password = null,
int bufferSize = ReaderOptions.DefaultBufferSize
)
{
var headerFactory = new StreamingZipHeaderFactory(password, new ArchiveEncoding(), null);
try
{
if (stream is not SharpCompressStream)
{
stream = new SharpCompressStream(stream, bufferSize: bufferSize);
}
var header = headerFactory
.ReadStreamHeader(stream)
.FirstOrDefault(x => x.ZipHeaderType != ZipHeaderType.Split);
@@ -185,7 +210,7 @@ public partial class ZipArchive
if (stream.CanSeek)
{
var z = new SeekableZipHeaderFactory(password, new ArchiveEncoding());
var x = z.ReadSeekableHeader(stream).FirstOrDefault();
var x = z.ReadSeekableHeader(stream, useSync: true).FirstOrDefault();
return x?.ZipHeaderType == ZipHeaderType.DirectoryEntry;
}
else
@@ -208,6 +233,7 @@ public partial class ZipArchive
public static async ValueTask<bool> IsZipFileAsync(
Stream stream,
string? password = null,
int bufferSize = ReaderOptions.DefaultBufferSize,
CancellationToken cancellationToken = default
)
{
@@ -215,6 +241,11 @@ public partial class ZipArchive
var headerFactory = new StreamingZipHeaderFactory(password, new ArchiveEncoding(), null);
try
{
if (stream is not SharpCompressStream)
{
stream = new SharpCompressStream(stream, bufferSize: bufferSize);
}
var header = await headerFactory
.ReadStreamHeaderAsync(stream)
.Where(x => x.ZipHeaderType != ZipHeaderType.Split)
@@ -242,6 +273,7 @@ public partial class ZipArchive
public static async ValueTask<bool> IsZipMultiAsync(
Stream stream,
string? password = null,
int bufferSize = ReaderOptions.DefaultBufferSize,
CancellationToken cancellationToken = default
)
{
@@ -249,6 +281,11 @@ public partial class ZipArchive
var headerFactory = new StreamingZipHeaderFactory(password, new ArchiveEncoding(), null);
try
{
if (stream is not SharpCompressStream)
{
stream = new SharpCompressStream(stream, bufferSize: bufferSize);
}
var header = headerFactory
.ReadStreamHeader(stream)
.FirstOrDefault(x => x.ZipHeaderType != ZipHeaderType.Split);


@@ -35,18 +35,15 @@ public partial class ZipArchive : AbstractWritableArchive<ZipArchiveEntry, ZipVo
protected override IEnumerable<ZipVolume> LoadVolumes(SourceStream stream)
{
stream.LoadAllParts();
//stream.Position = 0;
stream.Position = 0;
var streams = stream.Streams.ToList();
var idx = 0;
if (streams.Count() > 1)
{
//check if second stream is zip header without changing position
var headerProbeStream = streams[1];
var startPosition = headerProbeStream.Position;
headerProbeStream.Position = startPosition + 4;
var isZip = IsZipFile(headerProbeStream, ReaderOptions.Password);
headerProbeStream.Position = startPosition;
streams[1].Position += 4;
var isZip = IsZipFile(streams[1], ReaderOptions.Password, ReaderOptions.BufferSize);
streams[1].Position -= 4;
if (isZip)
{
stream.IsVolumes = true;
@@ -65,7 +62,9 @@ public partial class ZipArchive : AbstractWritableArchive<ZipArchiveEntry, ZipVo
protected override IEnumerable<ZipArchiveEntry> LoadEntries(IEnumerable<ZipVolume> volumes)
{
var vols = volumes.ToArray();
foreach (var h in headerFactory.NotNull().ReadSeekableHeader(vols.Last().Stream))
foreach (
var h in headerFactory.NotNull().ReadSeekableHeader(vols.Last().Stream, useSync: true)
)
{
if (h != null)
{
@@ -109,6 +108,59 @@ public partial class ZipArchive : AbstractWritableArchive<ZipArchiveEntry, ZipVo
}
}
protected override async IAsyncEnumerable<ZipArchiveEntry> LoadEntriesAsync(
IAsyncEnumerable<ZipVolume> volumes
)
{
var vols = await volumes.ToListAsync();
var volsArray = vols.ToArray();
await foreach (
var h in headerFactory.NotNull().ReadSeekableHeaderAsync(volsArray.Last().Stream)
)
{
if (h != null)
{
switch (h.ZipHeaderType)
{
case ZipHeaderType.DirectoryEntry:
{
var deh = (DirectoryEntryHeader)h;
Stream s;
if (
deh.RelativeOffsetOfEntryHeader + deh.CompressedSize
> volsArray[deh.DiskNumberStart].Stream.Length
)
{
var v = volsArray.Skip(deh.DiskNumberStart).ToArray();
s = new SourceStream(
v[0].Stream,
i => i < v.Length ? v[i].Stream : null,
new ReaderOptions() { LeaveStreamOpen = true }
);
}
else
{
s = volsArray[deh.DiskNumberStart].Stream;
}
yield return new ZipArchiveEntry(
this,
new SeekableZipFilePart(headerFactory.NotNull(), deh, s)
);
}
break;
case ZipHeaderType.DirectoryEnd:
{
var bytes = ((DirectoryEndHeader)h).Comment ?? Array.Empty<byte>();
volsArray.Last().Comment = ReaderOptions.ArchiveEncoding.Decode(bytes);
yield break;
}
}
}
}
}
public void SaveTo(Stream stream) => SaveTo(stream, new WriterOptions(CompressionType.Deflate));
protected override void SaveTo(
@@ -140,6 +192,41 @@ public partial class ZipArchive : AbstractWritableArchive<ZipArchiveEntry, ZipVo
}
}
protected override async ValueTask SaveToAsync(
Stream stream,
WriterOptions options,
IEnumerable<ZipArchiveEntry> oldEntries,
IEnumerable<ZipArchiveEntry> newEntries,
CancellationToken cancellationToken = default
)
{
using var writer = new ZipWriter(stream, new ZipWriterOptions(options));
foreach (var entry in oldEntries.Concat(newEntries))
{
if (entry.IsDirectory)
{
await writer
.WriteDirectoryAsync(
entry.Key.NotNull("Entry Key is null"),
entry.LastModifiedTime,
cancellationToken
)
.ConfigureAwait(false);
}
else
{
using var entryStream = entry.OpenEntryStream();
await writer
.WriteAsync(
entry.Key.NotNull("Entry Key is null"),
entryStream,
cancellationToken
)
.ConfigureAwait(false);
}
}
}
protected override ZipArchiveEntry CreateEntryInternal(
string filePath,
Stream source,
@@ -156,7 +243,7 @@ public partial class ZipArchive : AbstractWritableArchive<ZipArchiveEntry, ZipVo
protected override IReader CreateReaderForSolidExtraction()
{
var stream = Volumes.Single().Stream;
//stream.Position = 0;
((IStreamStack)stream).StackSeek(0);
return ZipReader.OpenReader(stream, ReaderOptions, Entries);
}


@@ -1,22 +0,0 @@
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common.Zip;
namespace SharpCompress.Archives.Zip;
public partial class ZipArchiveEntry
{
public async ValueTask<Stream> OpenEntryStreamAsync(
CancellationToken cancellationToken = default
)
{
var part = Parts.Single();
if (part is SeekableZipFilePart seekablePart)
{
return (await seekablePart.GetCompressedStreamAsync(cancellationToken)).NotNull();
}
return OpenEntryStream();
}
}


@@ -6,13 +6,25 @@ using SharpCompress.Common.Zip;
namespace SharpCompress.Archives.Zip;
public partial class ZipArchiveEntry : ZipEntry, IArchiveEntry
public class ZipArchiveEntry : ZipEntry, IArchiveEntry
{
internal ZipArchiveEntry(ZipArchive archive, SeekableZipFilePart? part)
: base(part) => Archive = archive;
public virtual Stream OpenEntryStream() => Parts.Single().GetCompressedStream().NotNull();
public async ValueTask<Stream> OpenEntryStreamAsync(
CancellationToken cancellationToken = default
)
{
var part = Parts.Single();
if (part is SeekableZipFilePart seekablePart)
{
return (await seekablePart.GetCompressedStreamAsync(cancellationToken)).NotNull();
}
return OpenEntryStream();
}
#region IArchiveEntry Members
public IArchive Archive { get; }


@@ -14,7 +14,6 @@ internal static class ZipArchiveVolumeFactory
//new style .zip, z01.. | .zipx, zx01 - if the numbers go beyond 99 then they use 100 ...1000 etc
var m = Regex.Match(part1.Name, @"^(.*\.)(zipx?|zx?[0-9]+)$", RegexOptions.IgnoreCase);
if (m.Success)
{
item = new FileInfo(
Path.Combine(
part1.DirectoryName!,
@@ -25,16 +24,11 @@ internal static class ZipArchiveVolumeFactory
)
)
);
}
else //split - 001, 002 ...
{
return ArchiveVolumeFactory.GetFilePart(index, part1);
}
if (item != null && item.Exists)
{
return item;
}
return null; //no more items
}


@@ -80,7 +80,7 @@ internal class ZipWritableArchiveEntry : ZipArchiveEntry, IWritableArchiveEntry
}
// Ensure the new stream is at the start; it may have been repositioned.
stream.Seek(0, SeekOrigin.Begin);
return SharpCompressStream.CreateNonDisposing(stream);
return SharpCompressStream.Create(stream, leaveOpen: true);
}
internal override void Close()


@@ -4,61 +4,58 @@ using System.Linq;
using System.Text;
using System.Threading.Tasks;
namespace SharpCompress.Common.Ace;
public class AceCrc
namespace SharpCompress.Common.Ace
{
// CRC-32 lookup table (standard polynomial 0xEDB88320, reflected)
private static readonly uint[] Crc32Table = GenerateTable();
private static uint[] GenerateTable()
public class AceCrc
{
var table = new uint[256];
// CRC-32 lookup table (standard polynomial 0xEDB88320, reflected)
private static readonly uint[] Crc32Table = GenerateTable();
for (int i = 0; i < 256; i++)
private static uint[] GenerateTable()
{
uint crc = (uint)i;
var table = new uint[256];
for (int j = 0; j < 8; j++)
for (int i = 0; i < 256; i++)
{
if ((crc & 1) != 0)
uint crc = (uint)i;
for (int j = 0; j < 8; j++)
{
crc = (crc >> 1) ^ 0xEDB88320u;
}
else
{
crc >>= 1;
if ((crc & 1) != 0)
crc = (crc >> 1) ^ 0xEDB88320u;
else
crc >>= 1;
}
table[i] = crc;
}
table[i] = crc;
return table;
}
return table;
}
/// <summary>
/// Calculate ACE CRC-32 checksum.
/// ACE CRC-32 uses standard CRC-32 polynomial (0xEDB88320, reflected)
/// with init=0xFFFFFFFF but NO final XOR.
/// </summary>
public static uint AceCrc32(ReadOnlySpan<byte> data)
{
uint crc = 0xFFFFFFFFu;
foreach (byte b in data)
/// <summary>
/// Calculate ACE CRC-32 checksum.
/// ACE CRC-32 uses standard CRC-32 polynomial (0xEDB88320, reflected)
/// with init=0xFFFFFFFF but NO final XOR.
/// </summary>
public static uint AceCrc32(ReadOnlySpan<byte> data)
{
crc = (crc >> 8) ^ Crc32Table[(crc ^ b) & 0xFF];
uint crc = 0xFFFFFFFFu;
foreach (byte b in data)
{
crc = (crc >> 8) ^ Crc32Table[(crc ^ b) & 0xFF];
}
return crc; // No final XOR for ACE
}
return crc; // No final XOR for ACE
}
/// <summary>
/// ACE CRC-16 is the lower 16 bits of the ACE CRC-32.
/// </summary>
public static ushort AceCrc16(ReadOnlySpan<byte> data)
{
return (ushort)(AceCrc32(data) & 0xFFFF);
/// <summary>
/// ACE CRC-16 is the lower 16 bits of the ACE CRC-32.
/// </summary>
public static ushort AceCrc16(ReadOnlySpan<byte> data)
{
return (ushort)(AceCrc32(data) & 0xFFFF);
}
}
}
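The ACE checksum above differs from standard CRC-32 only in the final step: same reflected polynomial 0xEDB88320 and initial value 0xFFFFFFFF, but no final XOR, with CRC-16 defined as the low 16 bits of the 32-bit result. A minimal, table-free Python sketch of the same computation:

```python
def ace_crc32(data: bytes) -> int:
    """ACE CRC-32: reflected poly 0xEDB88320, init 0xFFFFFFFF, NO final XOR."""
    crc = 0xFFFFFFFF
    for b in data:
        crc ^= b
        for _ in range(8):
            crc = (crc >> 1) ^ 0xEDB88320 if crc & 1 else crc >> 1
    return crc  # a standard CRC-32 would return crc ^ 0xFFFFFFFF here


def ace_crc16(data: bytes) -> int:
    """ACE CRC-16 is simply the low 16 bits of the ACE CRC-32."""
    return ace_crc32(data) & 0xFFFF
```

For the classic check string b"123456789", standard CRC-32 yields 0xCBF43926; without the final XOR the ACE value is its bitwise complement, 0x340BC6D9.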


@@ -6,62 +6,63 @@ using System.Text;
using System.Threading.Tasks;
using SharpCompress.Common.Ace.Headers;
namespace SharpCompress.Common.Ace;
public class AceEntry : Entry
namespace SharpCompress.Common.Ace
{
private readonly AceFilePart _filePart;
internal AceEntry(AceFilePart filePart)
public class AceEntry : Entry
{
_filePart = filePart;
}
private readonly AceFilePart _filePart;
public override long Crc
{
get
internal AceEntry(AceFilePart filePart)
{
if (_filePart == null)
{
return 0;
}
return _filePart.Header.Crc32;
_filePart = filePart;
}
}
public override string? Key => _filePart?.Header.Filename;
public override string? LinkTarget => null;
public override long CompressedSize => _filePart?.Header.PackedSize ?? 0;
public override CompressionType CompressionType
{
get
public override long Crc
{
if (_filePart.Header.CompressionType == Headers.CompressionType.Stored)
get
{
return CompressionType.None;
if (_filePart == null)
{
return 0;
}
return _filePart.Header.Crc32;
}
return CompressionType.AceLZ77;
}
public override string? Key => _filePart?.Header.Filename;
public override string? LinkTarget => null;
public override long CompressedSize => _filePart?.Header.PackedSize ?? 0;
public override CompressionType CompressionType
{
get
{
if (_filePart.Header.CompressionType == Headers.CompressionType.Stored)
{
return CompressionType.None;
}
return CompressionType.AceLZ77;
}
}
public override long Size => _filePart?.Header.OriginalSize ?? 0;
public override DateTime? LastModifiedTime => _filePart.Header.DateTime;
public override DateTime? CreatedTime => null;
public override DateTime? LastAccessedTime => null;
public override DateTime? ArchivedTime => null;
public override bool IsEncrypted => _filePart.Header.IsFileEncrypted;
public override bool IsDirectory => _filePart.Header.IsDirectory;
public override bool IsSplitAfter => false;
internal override IEnumerable<FilePart> Parts => _filePart.Empty();
}
public override long Size => _filePart?.Header.OriginalSize ?? 0;
public override DateTime? LastModifiedTime => _filePart.Header.DateTime;
public override DateTime? CreatedTime => null;
public override DateTime? LastAccessedTime => null;
public override DateTime? ArchivedTime => null;
public override bool IsEncrypted => _filePart.Header.IsFileEncrypted;
public override bool IsDirectory => _filePart.Header.IsDirectory;
public override bool IsSplitAfter => false;
internal override IEnumerable<FilePart> Parts => _filePart.Empty();
}


@@ -7,45 +7,46 @@ using System.Threading.Tasks;
using SharpCompress.Common.Ace.Headers;
using SharpCompress.IO;
namespace SharpCompress.Common.Ace;
public class AceFilePart : FilePart
namespace SharpCompress.Common.Ace
{
private readonly Stream _stream;
internal AceFileHeader Header { get; set; }
internal AceFilePart(AceFileHeader localAceHeader, Stream seekableStream)
: base(localAceHeader.ArchiveEncoding)
public class AceFilePart : FilePart
{
_stream = seekableStream;
Header = localAceHeader;
}
private readonly Stream _stream;
internal AceFileHeader Header { get; set; }
internal override string? FilePartName => Header.Filename;
internal override Stream GetCompressedStream()
{
if (_stream != null)
internal AceFilePart(AceFileHeader localAceHeader, Stream seekableStream)
: base(localAceHeader.ArchiveEncoding)
{
Stream compressedStream;
switch (Header.CompressionType)
{
case Headers.CompressionType.Stored:
compressedStream = new ReadOnlySubStream(
_stream,
Header.DataStartPosition,
Header.PackedSize
);
break;
default:
throw new NotSupportedException(
"CompressionMethod: " + Header.CompressionQuality
);
}
return compressedStream;
_stream = seekableStream;
Header = localAceHeader;
}
return _stream.NotNull();
}
internal override Stream? GetRawStream() => _stream;
internal override string? FilePartName => Header.Filename;
internal override Stream GetCompressedStream()
{
if (_stream != null)
{
Stream compressedStream;
switch (Header.CompressionType)
{
case Headers.CompressionType.Stored:
compressedStream = new ReadOnlySubStream(
_stream,
Header.DataStartPosition,
Header.PackedSize
);
break;
default:
throw new NotSupportedException(
"CompressionMethod: " + Header.CompressionQuality
);
}
return compressedStream;
}
return _stream.NotNull();
}
internal override Stream? GetRawStream() => _stream;
}
}


@@ -7,28 +7,29 @@ using System.Threading.Tasks;
using SharpCompress.Common.Arj;
using SharpCompress.Readers;
namespace SharpCompress.Common.Ace;
public class AceVolume : Volume
namespace SharpCompress.Common.Ace
{
public AceVolume(Stream stream, ReaderOptions readerOptions, int index = 0)
: base(stream, readerOptions, index) { }
public override bool IsFirstVolume
public class AceVolume : Volume
{
get { return true; }
}
public AceVolume(Stream stream, ReaderOptions readerOptions, int index = 0)
: base(stream, readerOptions, index) { }
/// <summary>
/// Gets a value indicating whether this volume is part of a multi-volume archive.
/// </summary>
public override bool IsMultiVolume
{
get { return false; }
}
public override bool IsFirstVolume
{
get { return true; }
}
internal IEnumerable<AceFilePart> GetVolumeFileParts()
{
return new List<AceFilePart>();
/// <summary>
/// Gets a value indicating whether this volume is part of a multi-volume archive.
/// </summary>
public override bool IsMultiVolume
{
get { return false; }
}
internal IEnumerable<AceFilePart> GetVolumeFileParts()
{
return new List<AceFilePart>();
}
}
}


@@ -1,111 +0,0 @@
using System;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common.Arc;
namespace SharpCompress.Common.Ace.Headers;
public sealed partial class AceFileHeader
{
/// <summary>
/// Asynchronously reads the next file entry header from the stream.
/// Returns null if no more entries or end of archive.
/// Supports both ACE 1.0 and ACE 2.0 formats.
/// </summary>
public override async ValueTask<AceHeader?> ReadAsync(
Stream stream,
CancellationToken cancellationToken = default
)
{
var headerData = await ReadHeaderAsync(stream, cancellationToken);
if (headerData.Length == 0)
{
return null;
}
int offset = 0;
// Header type (1 byte)
HeaderType = headerData[offset++];
// Skip recovery record headers (ACE 2.0 feature)
if (HeaderType == (byte)SharpCompress.Common.Ace.Headers.AceHeaderType.RECOVERY32)
{
// Skip to next header
return null;
}
if (HeaderType != (byte)SharpCompress.Common.Ace.Headers.AceHeaderType.FILE)
{
// Unknown header type - skip
return null;
}
// Header flags (2 bytes)
HeaderFlags = BitConverter.ToUInt16(headerData, offset);
offset += 2;
// Packed size (4 bytes)
PackedSize = BitConverter.ToUInt32(headerData, offset);
offset += 4;
// Original size (4 bytes)
OriginalSize = BitConverter.ToUInt32(headerData, offset);
offset += 4;
// File date/time in DOS format (4 bytes)
var dosDateTime = BitConverter.ToUInt32(headerData, offset);
DateTime = ConvertDosDateTime(dosDateTime);
offset += 4;
// File attributes (4 bytes)
Attributes = (int)BitConverter.ToUInt32(headerData, offset);
offset += 4;
// CRC32 (4 bytes)
Crc32 = BitConverter.ToUInt32(headerData, offset);
offset += 4;
// Compression type (1 byte)
byte compressionType = headerData[offset++];
CompressionType = GetCompressionType(compressionType);
// Compression quality/parameter (1 byte)
byte compressionQuality = headerData[offset++];
CompressionQuality = GetCompressionQuality(compressionQuality);
// Parameters (2 bytes)
Parameters = BitConverter.ToUInt16(headerData, offset);
offset += 2;
// Reserved (2 bytes) - skip
offset += 2;
// Filename length (2 bytes)
var filenameLength = BitConverter.ToUInt16(headerData, offset);
offset += 2;
// Filename
if (offset + filenameLength <= headerData.Length)
{
Filename = ArchiveEncoding.Decode(headerData, offset, filenameLength);
offset += filenameLength;
}
// Handle comment if present
if ((HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.COMMENT) != 0)
{
// Comment length (2 bytes)
if (offset + 2 <= headerData.Length)
{
ushort commentLength = BitConverter.ToUInt16(headerData, offset);
offset += 2 + commentLength; // Skip comment
}
}
// Store the data start position
DataStartPosition = stream.Position;
return this;
}
}
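Both the sync and async header readers above parse the same flat little-endian record, and both route the 4-byte DOS date/time field through `ConvertDosDateTime`. That helper is not shown in this diff, so as a hedged sketch, here is the conventional FAT/DOS 32-bit timestamp unpacking it is expected to perform (date in the high word, time in the low word, 2-second resolution):

```python
from datetime import datetime

def convert_dos_datetime(value: int) -> datetime:
    """Unpack a 32-bit DOS date/time: date bits in the high word,
    time bits in the low word; seconds are stored halved."""
    year = ((value >> 25) & 0x7F) + 1980
    month = (value >> 21) & 0x0F
    day = (value >> 16) & 0x1F
    hour = (value >> 11) & 0x1F
    minute = (value >> 5) & 0x3F
    second = (value & 0x1F) * 2
    return datetime(year, month, day, hour, minute, second)
```

Packing 2023-05-17 10:30:46 into those bit fields and decoding it round-trips the components exactly, except that odd second values lose their low bit to the 2-second granularity.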


@@ -2,173 +2,170 @@ using System;
using System.Buffers.Binary;
using System.Collections.Generic;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using System.Xml.Linq;
using SharpCompress.Common.Arc;
namespace SharpCompress.Common.Ace.Headers;
/// <summary>
/// ACE file entry header
/// </summary>
public sealed partial class AceFileHeader : AceHeader
namespace SharpCompress.Common.Ace.Headers
{
public long DataStartPosition { get; private set; }
public long PackedSize { get; set; }
public long OriginalSize { get; set; }
public DateTime DateTime { get; set; }
public int Attributes { get; set; }
public uint Crc32 { get; set; }
public CompressionType CompressionType { get; set; }
public CompressionQuality CompressionQuality { get; set; }
public ushort Parameters { get; set; }
public string Filename { get; set; } = string.Empty;
public List<byte> Comment { get; set; } = new();
/// <summary>
/// File data offset in the archive
/// ACE file entry header
/// </summary>
public ulong DataOffset { get; set; }
public bool IsDirectory => (Attributes & 0x10) != 0;
public bool IsContinuedFromPrev =>
(HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.CONTINUED_PREV) != 0;
public bool IsContinuedToNext =>
(HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.CONTINUED_NEXT) != 0;
public int DictionarySize
public sealed class AceFileHeader : AceHeader
{
get
public long DataStartPosition { get; private set; }
public long PackedSize { get; set; }
public long OriginalSize { get; set; }
public DateTime DateTime { get; set; }
public int Attributes { get; set; }
public uint Crc32 { get; set; }
public CompressionType CompressionType { get; set; }
public CompressionQuality CompressionQuality { get; set; }
public ushort Parameters { get; set; }
public string Filename { get; set; } = string.Empty;
public List<byte> Comment { get; set; } = new();
/// <summary>
/// File data offset in the archive
/// </summary>
public ulong DataOffset { get; set; }
public bool IsDirectory => (Attributes & 0x10) != 0;
public bool IsContinuedFromPrev =>
(HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.CONTINUED_PREV) != 0;
public bool IsContinuedToNext =>
(HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.CONTINUED_NEXT) != 0;
public int DictionarySize
{
int bits = Parameters & 0x0F;
return bits < 10 ? 1024 : 1 << bits;
}
}
public AceFileHeader(IArchiveEncoding archiveEncoding)
: base(archiveEncoding, AceHeaderType.FILE) { }
/// <summary>
/// Reads the next file entry header from the stream.
/// Returns null if no more entries or end of archive.
/// Supports both ACE 1.0 and ACE 2.0 formats.
/// </summary>
public override AceHeader? Read(Stream stream)
{
var headerData = ReadHeader(stream);
if (headerData.Length == 0)
{
return null;
}
int offset = 0;
// Header type (1 byte)
HeaderType = headerData[offset++];
// Skip recovery record headers (ACE 2.0 feature)
if (HeaderType == (byte)SharpCompress.Common.Ace.Headers.AceHeaderType.RECOVERY32)
{
// Skip to next header
return null;
}
if (HeaderType != (byte)SharpCompress.Common.Ace.Headers.AceHeaderType.FILE)
{
// Unknown header type - skip
return null;
}
// Header flags (2 bytes)
HeaderFlags = BitConverter.ToUInt16(headerData, offset);
offset += 2;
// Packed size (4 bytes)
PackedSize = BitConverter.ToUInt32(headerData, offset);
offset += 4;
// Original size (4 bytes)
OriginalSize = BitConverter.ToUInt32(headerData, offset);
offset += 4;
// File date/time in DOS format (4 bytes)
var dosDateTime = BitConverter.ToUInt32(headerData, offset);
DateTime = ConvertDosDateTime(dosDateTime);
offset += 4;
// File attributes (4 bytes)
Attributes = (int)BitConverter.ToUInt32(headerData, offset);
offset += 4;
// CRC32 (4 bytes)
Crc32 = BitConverter.ToUInt32(headerData, offset);
offset += 4;
// Compression type (1 byte)
byte compressionType = headerData[offset++];
CompressionType = GetCompressionType(compressionType);
// Compression quality/parameter (1 byte)
byte compressionQuality = headerData[offset++];
CompressionQuality = GetCompressionQuality(compressionQuality);
// Parameters (2 bytes)
Parameters = BitConverter.ToUInt16(headerData, offset);
offset += 2;
// Reserved (2 bytes) - skip
offset += 2;
// Filename length (2 bytes)
var filenameLength = BitConverter.ToUInt16(headerData, offset);
offset += 2;
// Filename
if (offset + filenameLength <= headerData.Length)
{
Filename = ArchiveEncoding.Decode(headerData, offset, filenameLength);
offset += filenameLength;
}
// Handle comment if present
if ((HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.COMMENT) != 0)
{
// Comment length (2 bytes)
if (offset + 2 <= headerData.Length)
get
{
ushort commentLength = BitConverter.ToUInt16(headerData, offset);
offset += 2 + commentLength; // Skip comment
int bits = Parameters & 0x0F;
return bits < 10 ? 1024 : 1 << bits;
}
}
// Store the data start position
DataStartPosition = stream.Position;
public AceFileHeader(IArchiveEncoding archiveEncoding)
: base(archiveEncoding, AceHeaderType.FILE) { }
return this;
/// <summary>
/// Reads the next file entry header from the stream.
/// Returns null if no more entries or end of archive.
/// Supports both ACE 1.0 and ACE 2.0 formats.
/// </summary>
public override AceHeader? Read(Stream stream)
{
var headerData = ReadHeader(stream);
if (headerData.Length == 0)
{
return null;
}
int offset = 0;
// Header type (1 byte)
HeaderType = headerData[offset++];
// Skip recovery record headers (ACE 2.0 feature)
if (HeaderType == (byte)SharpCompress.Common.Ace.Headers.AceHeaderType.RECOVERY32)
{
// Skip to next header
return null;
}
if (HeaderType != (byte)SharpCompress.Common.Ace.Headers.AceHeaderType.FILE)
{
// Unknown header type - skip
return null;
}
// Header flags (2 bytes)
HeaderFlags = BitConverter.ToUInt16(headerData, offset);
offset += 2;
// Packed size (4 bytes)
PackedSize = BitConverter.ToUInt32(headerData, offset);
offset += 4;
// Original size (4 bytes)
OriginalSize = BitConverter.ToUInt32(headerData, offset);
offset += 4;
// File date/time in DOS format (4 bytes)
var dosDateTime = BitConverter.ToUInt32(headerData, offset);
DateTime = ConvertDosDateTime(dosDateTime);
offset += 4;
// File attributes (4 bytes)
Attributes = (int)BitConverter.ToUInt32(headerData, offset);
offset += 4;
// CRC32 (4 bytes)
Crc32 = BitConverter.ToUInt32(headerData, offset);
offset += 4;
// Compression type (1 byte)
byte compressionType = headerData[offset++];
CompressionType = GetCompressionType(compressionType);
// Compression quality/parameter (1 byte)
byte compressionQuality = headerData[offset++];
CompressionQuality = GetCompressionQuality(compressionQuality);
// Parameters (2 bytes)
Parameters = BitConverter.ToUInt16(headerData, offset);
offset += 2;
// Reserved (2 bytes) - skip
offset += 2;
// Filename length (2 bytes)
var filenameLength = BitConverter.ToUInt16(headerData, offset);
offset += 2;
// Filename
if (offset + filenameLength <= headerData.Length)
{
Filename = ArchiveEncoding.Decode(headerData, offset, filenameLength);
offset += filenameLength;
}
// Handle comment if present
if ((HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.COMMENT) != 0)
{
// Comment length (2 bytes)
if (offset + 2 <= headerData.Length)
{
ushort commentLength = BitConverter.ToUInt16(headerData, offset);
offset += 2 + commentLength; // Skip comment
}
}
// Store the data start position
DataStartPosition = stream.Position;
return this;
}
public CompressionType GetCompressionType(byte value) =>
value switch
{
0 => CompressionType.Stored,
1 => CompressionType.Lz77,
2 => CompressionType.Blocked,
_ => CompressionType.Unknown,
};
public CompressionQuality GetCompressionQuality(byte value) =>
value switch
{
0 => CompressionQuality.None,
1 => CompressionQuality.Fastest,
2 => CompressionQuality.Fast,
3 => CompressionQuality.Normal,
4 => CompressionQuality.Good,
5 => CompressionQuality.Best,
_ => CompressionQuality.Unknown,
};
}
// ReadAsync moved to AceFileHeader.Async.cs
}

View File

@@ -1,69 +0,0 @@
using System;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
namespace SharpCompress.Common.Ace.Headers;
public abstract partial class AceHeader
{
public abstract ValueTask<AceHeader?> ReadAsync(
Stream reader,
CancellationToken cancellationToken = default
);
public async ValueTask<byte[]> ReadHeaderAsync(
Stream stream,
CancellationToken cancellationToken = default
)
{
// Read header CRC (2 bytes) and header size (2 bytes)
var headerBytes = new byte[4];
if (!await stream.ReadFullyAsync(headerBytes, 0, 4, cancellationToken))
{
return Array.Empty<byte>();
}
HeaderCrc = BitConverter.ToUInt16(headerBytes, 0); // CRC for validation
HeaderSize = BitConverter.ToUInt16(headerBytes, 2);
if (HeaderSize == 0)
{
return Array.Empty<byte>();
}
// Read the header data
var body = new byte[HeaderSize];
if (!await stream.ReadFullyAsync(body, 0, HeaderSize, cancellationToken))
{
return Array.Empty<byte>();
}
// Verify crc
var checksum = AceCrc.AceCrc16(body);
if (checksum != HeaderCrc)
{
throw new InvalidDataException("Header checksum is invalid");
}
return body;
}
/// <summary>
/// Asynchronously checks if the stream is an ACE archive
/// </summary>
/// <param name="stream">The stream to read from</param>
/// <param name="cancellationToken">Cancellation token</param>
/// <returns>True if the stream is an ACE archive, false otherwise</returns>
public static async ValueTask<bool> IsArchiveAsync(
Stream stream,
CancellationToken cancellationToken = default
)
{
var bytes = new byte[14];
if (!await stream.ReadFullyAsync(bytes, 0, 14, cancellationToken))
{
return false;
}
return CheckMagicBytes(bytes, 7);
}
}

View File

@@ -1,156 +1,153 @@
using System;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common.Arj.Headers;
using SharpCompress.Crypto;
namespace SharpCompress.Common.Ace.Headers;
/// <summary>
/// Header type constants
/// </summary>
public enum AceHeaderType
namespace SharpCompress.Common.Ace.Headers
{
MAIN = 0,
FILE = 1,
RECOVERY32 = 2,
RECOVERY64A = 3,
RECOVERY64B = 4,
}
public abstract partial class AceHeader
{
// ACE signature: bytes at offset 7 should be "**ACE**"
private static readonly byte[] AceSignature =
[
(byte)'*',
(byte)'*',
(byte)'A',
(byte)'C',
(byte)'E',
(byte)'*',
(byte)'*',
];
public AceHeader(IArchiveEncoding archiveEncoding, AceHeaderType type)
/// <summary>
/// Header type constants
/// </summary>
public enum AceHeaderType
{
AceHeaderType = type;
ArchiveEncoding = archiveEncoding;
MAIN = 0,
FILE = 1,
RECOVERY32 = 2,
RECOVERY64A = 3,
RECOVERY64B = 4,
}
public IArchiveEncoding ArchiveEncoding { get; }
public AceHeaderType AceHeaderType { get; }
public ushort HeaderFlags { get; set; }
public ushort HeaderCrc { get; set; }
public ushort HeaderSize { get; set; }
public byte HeaderType { get; set; }
public bool IsFileEncrypted =>
(HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.FILE_ENCRYPTED) != 0;
public bool Is64Bit =>
(HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.MEMORY_64BIT) != 0;
public bool IsSolid =>
(HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.SOLID_MAIN) != 0;
public bool IsMultiVolume =>
(HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.MULTIVOLUME) != 0;
public abstract AceHeader? Read(Stream reader);
// Async methods moved to AceHeader.Async.cs
public byte[] ReadHeader(Stream stream)
public abstract class AceHeader
{
// Read header CRC (2 bytes) and header size (2 bytes)
var headerBytes = new byte[4];
if (!stream.ReadFully(headerBytes))
// ACE signature: bytes at offset 7 should be "**ACE**"
private static readonly byte[] AceSignature =
[
(byte)'*',
(byte)'*',
(byte)'A',
(byte)'C',
(byte)'E',
(byte)'*',
(byte)'*',
];
public AceHeader(IArchiveEncoding archiveEncoding, AceHeaderType type)
{
return Array.Empty<byte>();
AceHeaderType = type;
ArchiveEncoding = archiveEncoding;
}
HeaderCrc = BitConverter.ToUInt16(headerBytes, 0); // CRC for validation
HeaderSize = BitConverter.ToUInt16(headerBytes, 2);
if (HeaderSize == 0)
public IArchiveEncoding ArchiveEncoding { get; }
public AceHeaderType AceHeaderType { get; }
public ushort HeaderFlags { get; set; }
public ushort HeaderCrc { get; set; }
public ushort HeaderSize { get; set; }
public byte HeaderType { get; set; }
public bool IsFileEncrypted =>
(HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.FILE_ENCRYPTED) != 0;
public bool Is64Bit =>
(HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.MEMORY_64BIT) != 0;
public bool IsSolid =>
(HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.SOLID_MAIN) != 0;
public bool IsMultiVolume =>
(HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.MULTIVOLUME) != 0;
public abstract AceHeader? Read(Stream reader);
public byte[] ReadHeader(Stream stream)
{
return Array.Empty<byte>();
// Read header CRC (2 bytes) and header size (2 bytes)
var headerBytes = new byte[4];
if (stream.Read(headerBytes, 0, 4) != 4)
{
return Array.Empty<byte>();
}
HeaderCrc = BitConverter.ToUInt16(headerBytes, 0); // CRC for validation
HeaderSize = BitConverter.ToUInt16(headerBytes, 2);
if (HeaderSize == 0)
{
return Array.Empty<byte>();
}
// Read the header data
var body = new byte[HeaderSize];
if (stream.Read(body, 0, HeaderSize) != HeaderSize)
{
return Array.Empty<byte>();
}
// Verify crc
var checksum = AceCrc.AceCrc16(body);
if (checksum != HeaderCrc)
{
throw new InvalidDataException("Header checksum is invalid");
}
return body;
}
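The `ReadHeader` method above shows the framing every ACE header block uses: a 2-byte little-endian CRC, a 2-byte little-endian body size, then the body itself. A minimal sketch of that framing in Python (illustrative only; the real reader also validates the body with `AceCrc.AceCrc16`, which is omitted here):

```python
import struct

def read_ace_header(buf):
    """Split one ACE header block into (crc, body).

    Framing: 2-byte LE CRC, 2-byte LE body size, then the body.
    Returns None if the buffer is truncated or the size is zero,
    mirroring the empty-array returns in ReadHeader.
    """
    if len(buf) < 4:
        return None
    crc, size = struct.unpack_from("<HH", buf, 0)
    if size == 0 or len(buf) < 4 + size:
        return None
    return crc, buf[4:4 + size]

# Example frame: CRC 0x1234 over a 3-byte body
frame = struct.pack("<HH", 0x1234, 3) + b"\x01\x02\x03"
```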
// Read the header data
var body = new byte[HeaderSize];
if (!stream.ReadFully(body))
public static bool IsArchive(Stream stream)
{
return Array.Empty<byte>();
}
// Verify crc
var checksum = AceCrc.AceCrc16(body);
if (checksum != HeaderCrc)
{
throw new InvalidDataException("Header checksum is invalid");
}
return body;
}
public static bool IsArchive(Stream stream)
{
// ACE files have a specific signature
// First two bytes are typically 0x60 0xEA (signature bytes)
// At offset 7, there should be "**ACE**" (7 bytes)
var bytes = new byte[14];
if (stream.Read(bytes, 0, 14) != 14)
{
return false;
}
// Check for "**ACE**" at offset 7
return CheckMagicBytes(bytes, 7);
}
protected static bool CheckMagicBytes(byte[] headerBytes, int offset)
{
// Check for "**ACE**" at specified offset
for (int i = 0; i < AceSignature.Length; i++)
{
if (headerBytes[offset + i] != AceSignature[i])
// ACE files have a specific signature
// First two bytes are typically 0x60 0xEA (signature bytes)
// At offset 7, there should be "**ACE**" (7 bytes)
var bytes = new byte[14];
if (stream.Read(bytes, 0, 14) != 14)
{
return false;
}
// Check for "**ACE**" at offset 7
return CheckMagicBytes(bytes, 7);
}
return true;
}
protected DateTime ConvertDosDateTime(uint dosDateTime)
{
try
protected static bool CheckMagicBytes(byte[] headerBytes, int offset)
{
int second = (int)(dosDateTime & 0x1F) * 2;
int minute = (int)((dosDateTime >> 5) & 0x3F);
int hour = (int)((dosDateTime >> 11) & 0x1F);
int day = (int)((dosDateTime >> 16) & 0x1F);
int month = (int)((dosDateTime >> 21) & 0x0F);
int year = (int)((dosDateTime >> 25) & 0x7F) + 1980;
// Check for "**ACE**" at specified offset
for (int i = 0; i < AceSignature.Length; i++)
{
if (headerBytes[offset + i] != AceSignature[i])
{
return false;
}
}
return true;
}
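`IsArchive` and `CheckMagicBytes` above boil down to one check: the literal marker `**ACE**` must appear at offset 7 of the first 14 bytes. A minimal sketch (the function name is hypothetical):

```python
ACE_SIGNATURE = b"**ACE**"

def looks_like_ace(first_bytes):
    """Mirror of IsArchive above: read 14 bytes and compare the
    7-byte "**ACE**" marker at offset 7."""
    return len(first_bytes) >= 14 and first_bytes[7:14] == ACE_SIGNATURE
```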
if (
day < 1
|| day > 31
|| month < 1
|| month > 12
|| hour > 23
|| minute > 59
|| second > 59
)
protected DateTime ConvertDosDateTime(uint dosDateTime)
{
try
{
int second = (int)(dosDateTime & 0x1F) * 2;
int minute = (int)((dosDateTime >> 5) & 0x3F);
int hour = (int)((dosDateTime >> 11) & 0x1F);
int day = (int)((dosDateTime >> 16) & 0x1F);
int month = (int)((dosDateTime >> 21) & 0x0F);
int year = (int)((dosDateTime >> 25) & 0x7F) + 1980;
if (
day < 1
|| day > 31
|| month < 1
|| month > 12
|| hour > 23
|| minute > 59
|| second > 59
)
{
return DateTime.MinValue;
}
return new DateTime(year, month, day, hour, minute, second);
}
catch
{
return DateTime.MinValue;
}
return new DateTime(year, month, day, hour, minute, second);
}
catch
{
return DateTime.MinValue;
}
}
}
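`ConvertDosDateTime` above unpacks the packed 32-bit DOS date/time: the low word holds seconds/2 (5 bits), minutes (6 bits), and hours (5 bits); the high word holds day (5 bits), month (4 bits), and years since 1980 (7 bits). A sketch of the same bit layout (illustrative; returns a tuple rather than a `DateTime`, and `None` stands in for `DateTime.MinValue`):

```python
def decode_dos_datetime(value):
    """Unpack a 32-bit DOS date/time using the same shifts and masks
    as ConvertDosDateTime above."""
    second = (value & 0x1F) * 2
    minute = (value >> 5) & 0x3F
    hour = (value >> 11) & 0x1F
    day = (value >> 16) & 0x1F
    month = (value >> 21) & 0x0F
    year = ((value >> 25) & 0x7F) + 1980
    # Same range validation as the C# version
    if not (1 <= day <= 31 and 1 <= month <= 12) or hour > 23 or minute > 59 or second > 59:
        return None
    return (year, month, day, hour, minute, second)
```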

View File

@@ -1,83 +0,0 @@
using System;
using System.Buffers.Binary;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Crypto;
namespace SharpCompress.Common.Ace.Headers;
public sealed partial class AceMainHeader
{
/// <summary>
/// Asynchronously reads the main archive header from the stream.
/// Returns header if this is a valid ACE archive.
/// Supports both ACE 1.0 and ACE 2.0 formats.
/// </summary>
public override async ValueTask<AceHeader?> ReadAsync(
Stream stream,
CancellationToken cancellationToken = default
)
{
var headerData = await ReadHeaderAsync(stream, cancellationToken);
if (headerData.Length == 0)
{
return null;
}
int offset = 0;
// Header type should be 0 for main header
if (headerData[offset++] != HeaderType)
{
return null;
}
// Header flags (2 bytes)
HeaderFlags = BitConverter.ToUInt16(headerData, offset);
offset += 2;
// Skip signature "**ACE**" (7 bytes)
if (!CheckMagicBytes(headerData, offset))
{
throw new InvalidDataException("Invalid ACE archive signature.");
}
offset += 7;
// ACE version (1 byte) - 10 for ACE 1.0, 20 for ACE 2.0
AceVersion = headerData[offset++];
ExtractVersion = headerData[offset++];
// Host OS (1 byte)
if (offset < headerData.Length)
{
var hostOsByte = headerData[offset++];
HostOS = hostOsByte <= 11 ? (HostOS)hostOsByte : HostOS.Unknown;
}
// Volume number (1 byte)
VolumeNumber = headerData[offset++];
// Creation date/time (4 bytes)
var dosDateTime = BitConverter.ToUInt32(headerData, offset);
DateTime = ConvertDosDateTime(dosDateTime);
offset += 4;
// Reserved fields (8 bytes)
if (offset + 8 <= headerData.Length)
{
offset += 8;
}
// Skip additional fields based on flags
// Handle comment if present
if ((HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.COMMENT) != 0)
{
if (offset + 2 <= headerData.Length)
{
ushort commentLength = BitConverter.ToUInt16(headerData, offset);
offset += 2 + commentLength;
}
}
return this;
}
}

View File

@@ -2,99 +2,96 @@ using System;
using System.Buffers.Binary;
using System.Collections.Generic;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common.Ace.Headers;
using SharpCompress.Common.Zip.Headers;
using SharpCompress.Crypto;
namespace SharpCompress.Common.Ace.Headers;
/// <summary>
/// ACE main archive header
/// </summary>
public sealed partial class AceMainHeader : AceHeader
namespace SharpCompress.Common.Ace.Headers
{
public byte ExtractVersion { get; set; }
public byte CreatorVersion { get; set; }
public HostOS HostOS { get; set; }
public byte VolumeNumber { get; set; }
public DateTime DateTime { get; set; }
public string Advert { get; set; } = string.Empty;
public List<byte> Comment { get; set; } = new();
public byte AceVersion { get; private set; }
public AceMainHeader(IArchiveEncoding archiveEncoding)
: base(archiveEncoding, AceHeaderType.MAIN) { }
/// <summary>
/// Reads the main archive header from the stream.
/// Returns header if this is a valid ACE archive.
/// Supports both ACE 1.0 and ACE 2.0 formats.
/// ACE main archive header
/// </summary>
public override AceHeader? Read(Stream stream)
public sealed class AceMainHeader : AceHeader
{
var headerData = ReadHeader(stream);
if (headerData.Length == 0)
public byte ExtractVersion { get; set; }
public byte CreatorVersion { get; set; }
public HostOS HostOS { get; set; }
public byte VolumeNumber { get; set; }
public DateTime DateTime { get; set; }
public string Advert { get; set; } = string.Empty;
public List<byte> Comment { get; set; } = new();
public byte AceVersion { get; private set; }
public AceMainHeader(IArchiveEncoding archiveEncoding)
: base(archiveEncoding, AceHeaderType.MAIN) { }
/// <summary>
/// Reads the main archive header from the stream.
/// Returns header if this is a valid ACE archive.
/// Supports both ACE 1.0 and ACE 2.0 formats.
/// </summary>
public override AceHeader? Read(Stream stream)
{
return null;
}
int offset = 0;
// Header type should be 0 for main header
if (headerData[offset++] != HeaderType)
{
return null;
}
// Header flags (2 bytes)
HeaderFlags = BitConverter.ToUInt16(headerData, offset);
offset += 2;
// Skip signature "**ACE**" (7 bytes)
if (!CheckMagicBytes(headerData, offset))
{
throw new InvalidDataException("Invalid ACE archive signature.");
}
offset += 7;
// ACE version (1 byte) - 10 for ACE 1.0, 20 for ACE 2.0
AceVersion = headerData[offset++];
ExtractVersion = headerData[offset++];
// Host OS (1 byte)
if (offset < headerData.Length)
{
var hostOsByte = headerData[offset++];
HostOS = hostOsByte <= 11 ? (HostOS)hostOsByte : HostOS.Unknown;
}
// Volume number (1 byte)
VolumeNumber = headerData[offset++];
// Creation date/time (4 bytes)
var dosDateTime = BitConverter.ToUInt32(headerData, offset);
DateTime = ConvertDosDateTime(dosDateTime);
offset += 4;
// Reserved fields (8 bytes)
if (offset + 8 <= headerData.Length)
{
offset += 8;
}
// Skip additional fields based on flags
// Handle comment if present
if ((HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.COMMENT) != 0)
{
if (offset + 2 <= headerData.Length)
var headerData = ReadHeader(stream);
if (headerData.Length == 0)
{
ushort commentLength = BitConverter.ToUInt16(headerData, offset);
offset += 2 + commentLength;
return null;
}
int offset = 0;
// Header type should be 0 for main header
if (headerData[offset++] != HeaderType)
{
return null;
}
// Header flags (2 bytes)
HeaderFlags = BitConverter.ToUInt16(headerData, offset);
offset += 2;
// Skip signature "**ACE**" (7 bytes)
if (!CheckMagicBytes(headerData, offset))
{
throw new InvalidDataException("Invalid ACE archive signature.");
}
offset += 7;
// ACE version (1 byte) - 10 for ACE 1.0, 20 for ACE 2.0
AceVersion = headerData[offset++];
ExtractVersion = headerData[offset++];
// Host OS (1 byte)
if (offset < headerData.Length)
{
var hostOsByte = headerData[offset++];
HostOS = hostOsByte <= 11 ? (HostOS)hostOsByte : HostOS.Unknown;
}
// Volume number (1 byte)
VolumeNumber = headerData[offset++];
// Creation date/time (4 bytes)
var dosDateTime = BitConverter.ToUInt32(headerData, offset);
DateTime = ConvertDosDateTime(dosDateTime);
offset += 4;
// Reserved fields (8 bytes)
if (offset + 8 <= headerData.Length)
{
offset += 8;
}
// Skip additional fields based on flags
// Handle comment if present
if ((HeaderFlags & SharpCompress.Common.Ace.Headers.HeaderFlags.COMMENT) != 0)
{
if (offset + 2 <= headerData.Length)
{
ushort commentLength = BitConverter.ToUInt16(headerData, offset);
offset += 2 + commentLength;
}
}
return this;
}
return this;
}
// ReadAsync moved to AceMainHeader.Async.cs
}
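The field order in `AceMainHeader.Read` above is: type byte (0 = main header), 2-byte flags, the 7-byte `**ACE**` signature, ACE version, extract version, host OS, volume number, then the 4-byte DOS date/time. A sketch of just that fixed prefix (illustrative; the real reader also consumes reserved bytes and flag-dependent fields such as the comment):

```python
import struct

ACE_SIGNATURE = b"**ACE**"

def parse_main_header(body):
    """Parse the fixed prefix of an ACE main-header body,
    following the offsets used in AceMainHeader.Read."""
    if body[0] != 0:  # header type 0 = main header
        return None
    flags = struct.unpack_from("<H", body, 1)[0]
    if body[3:10] != ACE_SIGNATURE:
        raise ValueError("Invalid ACE archive signature.")
    ace_version, extract_version, host_os, volume = body[10:14]
    dos_datetime = struct.unpack_from("<I", body, 14)[0]
    return {
        "flags": flags,
        "ace_version": ace_version,      # 10 = ACE 1.0, 20 = ACE 2.0
        "extract_version": extract_version,
        "host_os": host_os,
        "volume": volume,
        "dos_datetime": dos_datetime,
    }
```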

View File

@@ -1,15 +1,16 @@
namespace SharpCompress.Common.Ace.Headers;
/// <summary>
/// Compression quality
/// </summary>
public enum CompressionQuality
namespace SharpCompress.Common.Ace.Headers
{
None,
Fastest,
Fast,
Normal,
Good,
Best,
Unknown,
/// <summary>
/// Compression quality
/// </summary>
public enum CompressionQuality
{
None,
Fastest,
Fast,
Normal,
Good,
Best,
Unknown,
}
}

View File

@@ -1,12 +1,13 @@
namespace SharpCompress.Common.Ace.Headers;
/// <summary>
/// Compression types
/// </summary>
public enum CompressionType
namespace SharpCompress.Common.Ace.Headers
{
Stored,
Lz77,
Blocked,
Unknown,
/// <summary>
/// Compression types
/// </summary>
public enum CompressionType
{
Stored,
Lz77,
Blocked,
Unknown,
}
}

View File

@@ -1,32 +1,33 @@
namespace SharpCompress.Common.Ace.Headers;
/// <summary>
/// Header flags (main + file, overlapping meanings)
/// </summary>
public static class HeaderFlags
namespace SharpCompress.Common.Ace.Headers
{
// Shared / low bits
public const ushort ADDSIZE = 0x0001; // extra size field present
public const ushort COMMENT = 0x0002; // comment present
public const ushort MEMORY_64BIT = 0x0004;
public const ushort AV_STRING = 0x0008; // AV string present
public const ushort SOLID = 0x0010; // solid file
public const ushort LOCKED = 0x0020;
public const ushort PROTECTED = 0x0040;
/// <summary>
/// Header flags (main + file, overlapping meanings)
/// </summary>
public static class HeaderFlags
{
// Shared / low bits
public const ushort ADDSIZE = 0x0001; // extra size field present
public const ushort COMMENT = 0x0002; // comment present
public const ushort MEMORY_64BIT = 0x0004;
public const ushort AV_STRING = 0x0008; // AV string present
public const ushort SOLID = 0x0010; // solid file
public const ushort LOCKED = 0x0020;
public const ushort PROTECTED = 0x0040;
// Main header specific
public const ushort V20FORMAT = 0x0100;
public const ushort SFX = 0x0200;
public const ushort LIMITSFXJR = 0x0400;
public const ushort MULTIVOLUME = 0x0800;
public const ushort ADVERT = 0x1000;
public const ushort RECOVERY = 0x2000;
public const ushort LOCKED_MAIN = 0x4000;
public const ushort SOLID_MAIN = 0x8000;
// Main header specific
public const ushort V20FORMAT = 0x0100;
public const ushort SFX = 0x0200;
public const ushort LIMITSFXJR = 0x0400;
public const ushort MULTIVOLUME = 0x0800;
public const ushort ADVERT = 0x1000;
public const ushort RECOVERY = 0x2000;
public const ushort LOCKED_MAIN = 0x4000;
public const ushort SOLID_MAIN = 0x8000;
// File header specific (same bits, different meaning)
public const ushort NTSECURITY = 0x0400;
public const ushort CONTINUED_PREV = 0x1000;
public const ushort CONTINUED_NEXT = 0x2000;
public const ushort FILE_ENCRYPTED = 0x4000; // file encrypted (file header)
// File header specific (same bits, different meaning)
public const ushort NTSECURITY = 0x0400;
public const ushort CONTINUED_PREV = 0x1000;
public const ushort CONTINUED_NEXT = 0x2000;
public const ushort FILE_ENCRYPTED = 0x4000; // file encrypted (file header)
}
}
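The constants above are single bits in a 16-bit field, and properties such as `IsFileEncrypted` and `IsMultiVolume` test them bitwise. A minimal sketch of that test (constant values copied from `HeaderFlags` above):

```python
# Bit values mirror the HeaderFlags constants above.
COMMENT = 0x0002
MULTIVOLUME = 0x0800
FILE_ENCRYPTED = 0x4000

def has_flag(header_flags, flag):
    """Bitwise flag test, as in (HeaderFlags & FLAG) != 0."""
    return (header_flags & flag) != 0
```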

View File

@@ -1,21 +1,22 @@
namespace SharpCompress.Common.Ace.Headers;
/// <summary>
/// Host OS type
/// </summary>
public enum HostOS
namespace SharpCompress.Common.Ace.Headers
{
MsDos = 0,
Os2,
Windows,
Unix,
MacOs,
WinNt,
Primos,
AppleGs,
Atari,
Vax,
Amiga,
Next,
Unknown,
/// <summary>
/// Host OS type
/// </summary>
public enum HostOS
{
MsDos = 0,
Os2,
Windows,
Unix,
MacOs,
WinNt,
Primos,
AppleGs,
Atari,
Vax,
Amiga,
Next,
Unknown,
}
}

View File

@@ -7,53 +7,54 @@ using System.Threading.Tasks;
using SharpCompress.Common.GZip;
using SharpCompress.Common.Tar;
namespace SharpCompress.Common.Arc;
public class ArcEntry : Entry
namespace SharpCompress.Common.Arc
{
private readonly ArcFilePart? _filePart;
internal ArcEntry(ArcFilePart? filePart)
public class ArcEntry : Entry
{
_filePart = filePart;
}
private readonly ArcFilePart? _filePart;
public override long Crc
{
get
internal ArcEntry(ArcFilePart? filePart)
{
if (_filePart == null)
{
return 0;
}
return _filePart.Header.Crc16;
_filePart = filePart;
}
public override long Crc
{
get
{
if (_filePart == null)
{
return 0;
}
return _filePart.Header.Crc16;
}
}
public override string? Key => _filePart?.Header.Name;
public override string? LinkTarget => null;
public override long CompressedSize => _filePart?.Header.CompressedSize ?? 0;
public override CompressionType CompressionType =>
_filePart?.Header.CompressionMethod ?? CompressionType.Unknown;
public override long Size => throw new NotImplementedException();
public override DateTime? LastModifiedTime => null;
public override DateTime? CreatedTime => null;
public override DateTime? LastAccessedTime => null;
public override DateTime? ArchivedTime => null;
public override bool IsEncrypted => false;
public override bool IsDirectory => false;
public override bool IsSplitAfter => false;
internal override IEnumerable<FilePart> Parts => _filePart.Empty();
}
public override string? Key => _filePart?.Header.Name;
public override string? LinkTarget => null;
public override long CompressedSize => _filePart?.Header.CompressedSize ?? 0;
public override CompressionType CompressionType =>
_filePart?.Header.CompressionMethod ?? CompressionType.Unknown;
public override long Size => throw new NotImplementedException();
public override DateTime? LastModifiedTime => null;
public override DateTime? CreatedTime => null;
public override DateTime? LastAccessedTime => null;
public override DateTime? ArchivedTime => null;
public override bool IsEncrypted => false;
public override bool IsDirectory => false;
public override bool IsSplitAfter => false;
internal override IEnumerable<FilePart> Parts => _filePart.Empty();
}

View File

@@ -2,93 +2,75 @@ using System;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
namespace SharpCompress.Common.Arc;
public class ArcEntryHeader
namespace SharpCompress.Common.Arc
{
public IArchiveEncoding ArchiveEncoding { get; }
public CompressionType CompressionMethod { get; private set; }
public string? Name { get; private set; }
public long CompressedSize { get; private set; }
public DateTime DateTime { get; private set; }
public int Crc16 { get; private set; }
public long OriginalSize { get; private set; }
public long DataStartPosition { get; private set; }
public ArcEntryHeader(IArchiveEncoding archiveEncoding)
public class ArcEntryHeader
{
this.ArchiveEncoding = archiveEncoding;
}
public IArchiveEncoding ArchiveEncoding { get; }
public CompressionType CompressionMethod { get; private set; }
public string? Name { get; private set; }
public long CompressedSize { get; private set; }
public DateTime DateTime { get; private set; }
public int Crc16 { get; private set; }
public long OriginalSize { get; private set; }
public long DataStartPosition { get; private set; }
public ArcEntryHeader? ReadHeader(Stream stream)
{
byte[] headerBytes = new byte[29];
if (stream.Read(headerBytes, 0, headerBytes.Length) != headerBytes.Length)
public ArcEntryHeader(IArchiveEncoding archiveEncoding)
{
return null;
this.ArchiveEncoding = archiveEncoding;
}
DataStartPosition = stream.Position;
return LoadFrom(headerBytes);
}
public async ValueTask<ArcEntryHeader?> ReadHeaderAsync(
Stream stream,
CancellationToken cancellationToken = default
)
{
byte[] headerBytes = new byte[29];
if (
await stream.ReadAsync(headerBytes, 0, headerBytes.Length, cancellationToken)
!= headerBytes.Length
)
public ArcEntryHeader? ReadHeader(Stream stream)
{
return null;
byte[] headerBytes = new byte[29];
if (stream.Read(headerBytes, 0, headerBytes.Length) != headerBytes.Length)
{
return null;
}
DataStartPosition = stream.Position;
return LoadFrom(headerBytes);
}
DataStartPosition = stream.Position;
return LoadFrom(headerBytes);
}
public ArcEntryHeader LoadFrom(byte[] headerBytes)
{
CompressionMethod = GetCompressionType(headerBytes[1]);
// Read name
int nameEnd = Array.IndexOf(headerBytes, (byte)0, 1); // Find null terminator
Name = Encoding.UTF8.GetString(headerBytes, 2, nameEnd > 0 ? nameEnd - 2 : 12);
int offset = 15;
CompressedSize = BitConverter.ToUInt32(headerBytes, offset);
offset += 4;
uint rawDateTime = BitConverter.ToUInt32(headerBytes, offset);
DateTime = ConvertToDateTime(rawDateTime);
offset += 4;
Crc16 = BitConverter.ToUInt16(headerBytes, offset);
offset += 2;
OriginalSize = BitConverter.ToUInt32(headerBytes, offset);
return this;
}
private CompressionType GetCompressionType(byte value)
{
return value switch
public ArcEntryHeader LoadFrom(byte[] headerBytes)
{
1 or 2 => CompressionType.None,
3 => CompressionType.Packed,
4 => CompressionType.Squeezed,
5 or 6 or 7 or 8 => CompressionType.Crunched,
9 => CompressionType.Squashed,
10 => CompressionType.Crushed,
11 => CompressionType.Distilled,
_ => CompressionType.Unknown,
};
}
CompressionMethod = GetCompressionType(headerBytes[1]);
public static DateTime ConvertToDateTime(long rawDateTime)
{
// Convert Unix timestamp to DateTime (UTC)
return DateTimeOffset.FromUnixTimeSeconds(rawDateTime).UtcDateTime;
// Read name
int nameEnd = Array.IndexOf(headerBytes, (byte)0, 1); // Find null terminator
Name = Encoding.UTF8.GetString(headerBytes, 2, nameEnd > 0 ? nameEnd - 2 : 12);
int offset = 15;
CompressedSize = BitConverter.ToUInt32(headerBytes, offset);
offset += 4;
uint rawDateTime = BitConverter.ToUInt32(headerBytes, offset);
DateTime = ConvertToDateTime(rawDateTime);
offset += 4;
Crc16 = BitConverter.ToUInt16(headerBytes, offset);
offset += 2;
OriginalSize = BitConverter.ToUInt32(headerBytes, offset);
return this;
}
private CompressionType GetCompressionType(byte value)
{
return value switch
{
1 or 2 => CompressionType.None,
3 => CompressionType.Packed,
4 => CompressionType.Squeezed,
5 or 6 or 7 or 8 => CompressionType.Crunched,
9 => CompressionType.Squashed,
10 => CompressionType.Crushed,
11 => CompressionType.Distilled,
_ => CompressionType.Unknown,
};
}
public static DateTime ConvertToDateTime(long rawDateTime)
{
// Convert Unix timestamp to DateTime (UTC)
return DateTimeOffset.FromUnixTimeSeconds(rawDateTime).UtcDateTime;
}
}
}
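`ArcEntryHeader.LoadFrom` above reads a fixed 29-byte header: a marker byte, the compression-method byte, a 13-byte null-terminated name field at offset 2, then at offset 15 four little-endian numbers (compressed size, raw date/time, CRC-16, original size). A sketch of that layout (illustrative; the date/time word is left raw here):

```python
import struct

def parse_arc_header(buf):
    """Parse a 29-byte ARC entry header per the offsets in LoadFrom."""
    if len(buf) != 29:
        return None
    method = buf[1]
    name = buf[2:15].split(b"\x00", 1)[0].decode("utf-8")
    compressed_size, raw_datetime = struct.unpack_from("<II", buf, 15)
    crc16 = struct.unpack_from("<H", buf, 23)[0]
    original_size = struct.unpack_from("<I", buf, 25)[0]
    return method, name, compressed_size, crc16, original_size
```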

View File

@@ -1,58 +0,0 @@
using System;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Compressors.Lzw;
using SharpCompress.Compressors.RLE90;
using SharpCompress.Compressors.Squeezed;
using SharpCompress.IO;
namespace SharpCompress.Common.Arc;
public partial class ArcFilePart
{
internal override async ValueTask<Stream?> GetCompressedStreamAsync(
CancellationToken cancellationToken = default
)
{
if (_stream != null)
{
Stream compressedStream;
switch (Header.CompressionMethod)
{
case CompressionType.None:
compressedStream = new ReadOnlySubStream(
_stream,
Header.DataStartPosition,
Header.CompressedSize
);
break;
case CompressionType.Packed:
compressedStream = new RunLength90Stream(_stream, (int)Header.CompressedSize);
break;
case CompressionType.Squeezed:
compressedStream = await SqueezeStream.CreateAsync(
_stream,
(int)Header.CompressedSize,
cancellationToken
);
break;
case CompressionType.Crunched:
if (Header.OriginalSize > 128 * 1024)
{
throw new NotSupportedException(
"CompressionMethod: " + Header.CompressionMethod + " with size > 128KB"
);
}
compressedStream = new ArcLzwStream(_stream, (int)Header.CompressedSize, true);
break;
default:
throw new NotSupportedException(
"CompressionMethod: " + Header.CompressionMethod
);
}
return compressedStream;
}
return _stream;
}
}

View File

@@ -13,61 +13,71 @@ using SharpCompress.Compressors.RLE90;
using SharpCompress.Compressors.Squeezed;
using SharpCompress.IO;
namespace SharpCompress.Common.Arc;
public partial class ArcFilePart : FilePart
namespace SharpCompress.Common.Arc
{
private readonly Stream? _stream;
internal ArcFilePart(ArcEntryHeader localArcHeader, Stream? seekableStream)
: base(localArcHeader.ArchiveEncoding)
public class ArcFilePart : FilePart
{
_stream = seekableStream;
Header = localArcHeader;
}
private readonly Stream? _stream;
internal ArcEntryHeader Header { get; set; }
internal override string? FilePartName => Header.Name;
internal override Stream GetCompressedStream()
{
if (_stream != null)
internal ArcFilePart(ArcEntryHeader localArcHeader, Stream? seekableStream)
            : base(localArcHeader.ArchiveEncoding)
        {
            _stream = seekableStream;
            Header = localArcHeader;
        }

        internal ArcEntryHeader Header { get; set; }

        internal override string? FilePartName => Header.Name;

        internal override Stream GetCompressedStream()
        {
            if (_stream != null)
            {
                Stream compressedStream;
                switch (Header.CompressionMethod)
                {
                    case CompressionType.None:
                        compressedStream = new ReadOnlySubStream(
                            _stream,
                            Header.DataStartPosition,
                            Header.CompressedSize
                        );
                        break;
                    case CompressionType.Packed:
                        compressedStream = new RunLength90Stream(
                            _stream,
                            (int)Header.CompressedSize
                        );
                        break;
                    case CompressionType.Squeezed:
                        compressedStream = new SqueezeStream(_stream, (int)Header.CompressedSize);
                        break;
                    case CompressionType.Crunched:
                        if (Header.OriginalSize > 128 * 1024)
                        {
                            throw new NotSupportedException(
                                "CompressionMethod: "
                                    + Header.CompressionMethod
                                    + " with size > 128KB"
                            );
                        }
                        compressedStream = new ArcLzwStream(
                            _stream,
                            (int)Header.CompressedSize,
                            true
                        );
                        break;
                    default:
                        throw new NotSupportedException(
                            "CompressionMethod: " + Header.CompressionMethod
                        );
                }
                return compressedStream;
            }
            return _stream.NotNull();
        }

        internal override Stream? GetRawStream() => _stream;
    }
}

View File

@@ -6,10 +6,11 @@ using System.Text;
using System.Threading.Tasks;
using SharpCompress.Readers;
namespace SharpCompress.Common.Arc
{
    public class ArcVolume : Volume
    {
        public ArcVolume(Stream stream, ReaderOptions readerOptions, int index = 0)
            : base(stream, readerOptions, index) { }
    }
}

View File

@@ -6,52 +6,53 @@ using System.Threading.Tasks;
using SharpCompress.Common.Arc;
using SharpCompress.Common.Arj.Headers;
namespace SharpCompress.Common.Arj
{
    public class ArjEntry : Entry
    {
        private readonly ArjFilePart _filePart;

        internal ArjEntry(ArjFilePart filePart)
        {
            _filePart = filePart;
        }

        public override long Crc => _filePart.Header.OriginalCrc32;
        public override string? Key => _filePart?.Header.Name;
        public override string? LinkTarget => null;
        public override long CompressedSize => _filePart?.Header.CompressedSize ?? 0;

        public override CompressionType CompressionType
        {
            get
            {
                if (_filePart.Header.CompressionMethod == CompressionMethod.Stored)
                {
                    return CompressionType.None;
                }
                return CompressionType.ArjLZ77;
            }
        }

        public override long Size => _filePart?.Header.OriginalSize ?? 0;
        public override DateTime? LastModifiedTime => _filePart.Header.DateTimeModified.DateTime;
        public override DateTime? CreatedTime => _filePart.Header.DateTimeCreated.DateTime;
        public override DateTime? LastAccessedTime => _filePart.Header.DateTimeAccessed.DateTime;
        public override DateTime? ArchivedTime => null;
        public override bool IsEncrypted => false;
        public override bool IsDirectory => _filePart.Header.FileType == FileType.Directory;
        public override bool IsSplitAfter => false;

        internal override IEnumerable<FilePart> Parts => _filePart.Empty();
    }
}

View File

@@ -8,62 +8,65 @@ using SharpCompress.Common.Arj.Headers;
using SharpCompress.Compressors.Arj;
using SharpCompress.IO;
namespace SharpCompress.Common.Arj
{
    public class ArjFilePart : FilePart
    {
        private readonly Stream _stream;
        internal ArjLocalHeader Header { get; set; }

        internal ArjFilePart(ArjLocalHeader localArjHeader, Stream seekableStream)
            : base(localArjHeader.ArchiveEncoding)
        {
            _stream = seekableStream;
            Header = localArjHeader;
        }

        internal override string? FilePartName => Header.Name;

        internal override Stream GetCompressedStream()
        {
            if (_stream != null)
            {
                Stream compressedStream;
                switch (Header.CompressionMethod)
                {
                    case CompressionMethod.Stored:
                        compressedStream = new ReadOnlySubStream(
                            _stream,
                            Header.DataStartPosition,
                            Header.CompressedSize
                        );
                        break;
                    case CompressionMethod.CompressedMost:
                    case CompressionMethod.Compressed:
                    case CompressionMethod.CompressedFaster:
                        if (Header.OriginalSize > 128 * 1024)
                        {
                            throw new NotSupportedException(
                                "CompressionMethod: "
                                    + Header.CompressionMethod
                                    + " with size > 128KB"
                            );
                        }
                        compressedStream = new LhaStream<Lh7DecoderCfg>(
                            _stream,
                            (int)Header.OriginalSize
                        );
                        break;
                    case CompressionMethod.CompressedFastest:
                        compressedStream = new LHDecoderStream(_stream, (int)Header.OriginalSize);
                        break;
                    default:
                        throw new NotSupportedException(
                            "CompressionMethod: " + Header.CompressionMethod
                        );
                }
                return compressedStream;
            }
            return _stream.NotNull();
        }

        internal override Stream GetRawStream() => _stream;
    }
}

View File

@@ -8,28 +8,29 @@ using SharpCompress.Common.Rar;
using SharpCompress.Common.Rar.Headers;
using SharpCompress.Readers;
namespace SharpCompress.Common.Arj
{
    public class ArjVolume : Volume
    {
        public ArjVolume(Stream stream, ReaderOptions readerOptions, int index = 0)
            : base(stream, readerOptions, index) { }

        public override bool IsFirstVolume
        {
            get { return true; }
        }

        /// <summary>
        /// ArjArchive is part of a multi-part archive.
        /// </summary>
        public override bool IsMultiVolume
        {
            get { return false; }
        }

        internal IEnumerable<ArjFilePart> GetVolumeFileParts()
        {
            return new List<ArjFilePart>();
        }
    }
}

View File

@@ -1,132 +0,0 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Crypto;

namespace SharpCompress.Common.Arj.Headers;

public abstract partial class ArjHeader
{
    public abstract ValueTask<ArjHeader?> ReadAsync(
        Stream reader,
        CancellationToken cancellationToken = default
    );

    public async ValueTask<byte[]> ReadHeaderAsync(
        Stream stream,
        CancellationToken cancellationToken = default
    )
    {
        // check for magic bytes
        var magic = new byte[2];
        if (await stream.ReadAsync(magic, 0, 2, cancellationToken) != 2)
        {
            return Array.Empty<byte>();
        }
        if (!CheckMagicBytes(magic))
        {
            throw new InvalidDataException("Not an ARJ file (wrong magic bytes)");
        }
        // read header_size
        byte[] headerBytes = new byte[2];
        await stream.ReadAsync(headerBytes, 0, 2, cancellationToken);
        var headerSize = (ushort)(headerBytes[0] | headerBytes[1] << 8);
        if (headerSize < 1)
        {
            return Array.Empty<byte>();
        }
        var body = new byte[headerSize];
        var read = await stream.ReadAsync(body, 0, headerSize, cancellationToken);
        if (read < headerSize)
        {
            return Array.Empty<byte>();
        }
        byte[] crc = new byte[4];
        read = await stream.ReadAsync(crc, 0, 4, cancellationToken);
        var checksum = Crc32Stream.Compute(body);
        // Compute the hash value
        if (checksum != BitConverter.ToUInt32(crc, 0))
        {
            throw new InvalidDataException("Header checksum is invalid");
        }
        return body;
    }

    protected async ValueTask<List<byte[]>> ReadExtendedHeadersAsync(
        Stream reader,
        CancellationToken cancellationToken = default
    )
    {
        List<byte[]> extendedHeader = new List<byte[]>();
        byte[] buffer = new byte[2];
        while (true)
        {
            int bytesRead = await reader.ReadAsync(buffer, 0, 2, cancellationToken);
            if (bytesRead < 2)
            {
                throw new EndOfStreamException(
                    "Unexpected end of stream while reading extended header size."
                );
            }
            var extHeaderSize = (ushort)(buffer[0] | (buffer[1] << 8));
            if (extHeaderSize == 0)
            {
                return extendedHeader;
            }
            byte[] header = new byte[extHeaderSize];
            bytesRead = await reader.ReadAsync(header, 0, extHeaderSize, cancellationToken);
            if (bytesRead < extHeaderSize)
            {
                throw new EndOfStreamException(
                    "Unexpected end of stream while reading extended header data."
                );
            }
            byte[] crcextended = new byte[4];
            bytesRead = await reader.ReadAsync(crcextended, 0, 4, cancellationToken);
            if (bytesRead < 4)
            {
                throw new EndOfStreamException(
                    "Unexpected end of stream while reading extended header CRC."
                );
            }
            var checksum = Crc32Stream.Compute(header);
            if (checksum != BitConverter.ToUInt32(crcextended, 0))
            {
                throw new InvalidDataException("Extended header checksum is invalid");
            }
            extendedHeader.Add(header);
        }
    }

    /// <summary>
    /// Asynchronously checks if the stream is an ARJ archive
    /// </summary>
    /// <param name="stream">The stream to read from</param>
    /// <param name="cancellationToken">Cancellation token</param>
    /// <returns>True if the stream is an ARJ archive, false otherwise</returns>
    public static async ValueTask<bool> IsArchiveAsync(
        Stream stream,
        CancellationToken cancellationToken = default
    )
    {
        var bytes = new byte[2];
        if (await stream.ReadAsync(bytes, 0, 2, cancellationToken) != 2)
        {
            return false;
        }
        return CheckMagicBytes(bytes);
    }
}
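The extended-header framing read by `ReadExtendedHeadersAsync` can be sketched in Python (an illustrative sketch, not the library's API: the name `read_extended_headers` is mine, and `zlib.crc32` stands in for `Crc32Stream.Compute`). Each record is a little-endian u16 size, where 0 terminates the list, followed by the payload and a little-endian u32 CRC-32 of that payload:

```python
import io
import struct
import zlib

def read_extended_headers(stream):
    """Read ARJ extended-header records until a zero-length terminator."""
    headers = []
    while True:
        size_bytes = stream.read(2)
        if len(size_bytes) < 2:
            raise EOFError("Unexpected end of stream while reading extended header size.")
        (ext_size,) = struct.unpack("<H", size_bytes)
        if ext_size == 0:
            return headers
        payload = stream.read(ext_size)
        if len(payload) < ext_size:
            raise EOFError("Unexpected end of stream while reading extended header data.")
        (crc,) = struct.unpack("<I", stream.read(4))
        if zlib.crc32(payload) != crc:
            raise ValueError("Extended header checksum is invalid")
        headers.append(payload)

# One record followed by the zero terminator:
payload = b"extra"
record = struct.pack("<H", len(payload)) + payload + struct.pack("<I", zlib.crc32(payload))
stream = io.BytesIO(record + struct.pack("<H", 0))
assert read_extended_headers(stream) == [b"extra"]
```

Note that a corrupt CRC raises rather than returning partial data, matching the C# method's fail-fast behavior.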

View File

@@ -3,158 +3,156 @@ using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common.Zip.Headers;
using SharpCompress.Crypto;
namespace SharpCompress.Common.Arj.Headers
{
    public enum ArjHeaderType
    {
        MainHeader,
        LocalHeader,
    }

    public abstract class ArjHeader
    {
        private const int FIRST_HDR_SIZE = 34;
        private const ushort ARJ_MAGIC = 0xEA60;

        public ArjHeader(ArjHeaderType type)
        {
            ArjHeaderType = type;
        }

        public ArjHeaderType ArjHeaderType { get; }
        public byte Flags { get; set; }
        public FileType FileType { get; set; }

        public abstract ArjHeader? Read(Stream reader);

        public byte[] ReadHeader(Stream stream)
        {
            // check for magic bytes
            var magic = new byte[2];
            if (stream.Read(magic) != 2)
            {
                return Array.Empty<byte>();
            }
            if (!CheckMagicBytes(magic))
            {
                throw new InvalidDataException("Not an ARJ file (wrong magic bytes)");
            }
            // read header_size
            byte[] headerBytes = new byte[2];
            stream.Read(headerBytes, 0, 2);
            var headerSize = (ushort)(headerBytes[0] | headerBytes[1] << 8);
            if (headerSize < 1)
            {
                return Array.Empty<byte>();
            }
            var body = new byte[headerSize];
            var read = stream.Read(body, 0, headerSize);
            if (read < headerSize)
            {
                return Array.Empty<byte>();
            }
            byte[] crc = new byte[4];
            read = stream.Read(crc, 0, 4);
            var checksum = Crc32Stream.Compute(body);
            // Compute the hash value
            if (checksum != BitConverter.ToUInt32(crc, 0))
            {
                throw new InvalidDataException("Header checksum is invalid");
            }
            return body;
        }

        protected List<byte[]> ReadExtendedHeaders(Stream reader)
        {
            List<byte[]> extendedHeader = new List<byte[]>();
            byte[] buffer = new byte[2];
            while (true)
            {
                int bytesRead = reader.Read(buffer, 0, 2);
                if (bytesRead < 2)
                {
                    throw new EndOfStreamException(
                        "Unexpected end of stream while reading extended header size."
                    );
                }
                var extHeaderSize = (ushort)(buffer[0] | (buffer[1] << 8));
                if (extHeaderSize == 0)
                {
                    return extendedHeader;
                }
                byte[] header = new byte[extHeaderSize];
                bytesRead = reader.Read(header, 0, extHeaderSize);
                if (bytesRead < extHeaderSize)
                {
                    throw new EndOfStreamException(
                        "Unexpected end of stream while reading extended header data."
                    );
                }
                byte[] crc = new byte[4];
                bytesRead = reader.Read(crc, 0, 4);
                if (bytesRead < 4)
                {
                    throw new EndOfStreamException(
                        "Unexpected end of stream while reading extended header CRC."
                    );
                }
                var checksum = Crc32Stream.Compute(header);
                if (checksum != BitConverter.ToUInt32(crc, 0))
                {
                    throw new InvalidDataException("Extended header checksum is invalid");
                }
                extendedHeader.Add(header);
            }
        }

        // Flag helpers
        public bool IsGabled => (Flags & 0x01) != 0;
        public bool IsAnsiPage => (Flags & 0x02) != 0;
        public bool IsVolume => (Flags & 0x04) != 0;
        public bool IsArjProtected => (Flags & 0x08) != 0;
        public bool IsPathSym => (Flags & 0x10) != 0;
        public bool IsBackup => (Flags & 0x20) != 0;
        public bool IsSecured => (Flags & 0x40) != 0;
        public bool IsAltName => (Flags & 0x80) != 0;

        public static FileType FileTypeFromByte(byte value)
        {
            return Enum.IsDefined(typeof(FileType), value)
                ? (FileType)value
                : Headers.FileType.Unknown;
        }

        public static bool IsArchive(Stream stream)
        {
            var bytes = new byte[2];
            if (stream.Read(bytes, 0, 2) != 2)
            {
                return false;
            }
            return CheckMagicBytes(bytes);
        }

        protected static bool CheckMagicBytes(byte[] headerBytes)
        {
            var magicValue = (ushort)(headerBytes[0] | headerBytes[1] << 8);
            return magicValue == ARJ_MAGIC;
        }
    }
}
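The magic-byte test and the header checksum step above can be sketched in Python (an illustrative sketch only: the function names are mine, and `zlib.crc32` is used in place of `Crc32Stream.Compute`). ARJ's magic value `0xEA60` is stored little-endian, so the first two bytes of an archive are `60 EA`:

```python
import struct
import zlib

ARJ_MAGIC = 0xEA60  # appears on disk as the byte pair 60 EA

def check_magic_bytes(header_bytes):
    """Decode two bytes little-endian and compare against the ARJ magic."""
    magic = header_bytes[0] | (header_bytes[1] << 8)
    return magic == ARJ_MAGIC

def verify_header(body, crc_bytes):
    """The stored little-endian u32 must equal the CRC-32 of the header body."""
    (stored,) = struct.unpack("<I", crc_bytes)
    return zlib.crc32(body) == stored

assert check_magic_bytes(b"\x60\xea")
assert not check_magic_bytes(b"PK")  # a ZIP signature is not an ARJ archive
body = b"\x22\x0b\x01\x02"  # arbitrary header bytes for illustration
assert verify_header(body, struct.pack("<I", zlib.crc32(body)))
```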

View File

@@ -1,24 +0,0 @@
using System.IO;
using System.Threading;
using System.Threading.Tasks;

namespace SharpCompress.Common.Arj.Headers;

public partial class ArjLocalHeader
{
    public override async ValueTask<ArjHeader?> ReadAsync(
        Stream stream,
        CancellationToken cancellationToken = default
    )
    {
        var body = await ReadHeaderAsync(stream, cancellationToken);
        if (body.Length > 0)
        {
            await ReadExtendedHeadersAsync(stream, cancellationToken);
            var header = LoadFrom(body);
            header.DataStartPosition = stream.Position;
            return header;
        }
        return null;
    }
}

View File

@@ -4,159 +4,158 @@ using System.IO;
using System.Linq;
using System.Runtime.CompilerServices;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
namespace SharpCompress.Common.Arj.Headers
{
    public class ArjLocalHeader : ArjHeader
    {
        public ArchiveEncoding ArchiveEncoding { get; }
        public long DataStartPosition { get; protected set; }
        public byte ArchiverVersionNumber { get; set; }
        public byte MinVersionToExtract { get; set; }
        public HostOS HostOS { get; set; }
        public CompressionMethod CompressionMethod { get; set; }
        public DosDateTime DateTimeModified { get; set; } = new DosDateTime(0);
        public long CompressedSize { get; set; }
        public long OriginalSize { get; set; }
        public long OriginalCrc32 { get; set; }
        public int FileSpecPosition { get; set; }
        public int FileAccessMode { get; set; }
        public byte FirstChapter { get; set; }
        public byte LastChapter { get; set; }
        public long ExtendedFilePosition { get; set; }
        public DosDateTime DateTimeAccessed { get; set; } = new DosDateTime(0);
        public DosDateTime DateTimeCreated { get; set; } = new DosDateTime(0);
        public long OriginalSizeEvenForVolumes { get; set; }
        public string Name { get; set; } = string.Empty;
        public string Comment { get; set; } = string.Empty;

        private const byte StdHdrSize = 30;
        private const byte R9HdrSize = 46;

        public ArjLocalHeader(ArchiveEncoding archiveEncoding)
            : base(ArjHeaderType.LocalHeader)
        {
            ArchiveEncoding =
                archiveEncoding ?? throw new ArgumentNullException(nameof(archiveEncoding));
        }

        public override ArjHeader? Read(Stream stream)
        {
            var body = ReadHeader(stream);
            if (body.Length > 0)
            {
                ReadExtendedHeaders(stream);
                var header = LoadFrom(body);
                header.DataStartPosition = stream.Position;
                return header;
            }
            return null;
        }

        public ArjLocalHeader LoadFrom(byte[] headerBytes)
        {
            int offset = 0;

            int ReadInt16()
            {
                if (offset + 1 >= headerBytes.Length)
                {
                    throw new EndOfStreamException();
                }
                var v = headerBytes[offset] & 0xFF | (headerBytes[offset + 1] & 0xFF) << 8;
                offset += 2;
                return v;
            }

            long ReadInt32()
            {
                if (offset + 3 >= headerBytes.Length)
                {
                    throw new EndOfStreamException();
                }
                long v =
                    headerBytes[offset] & 0xFF
                    | (headerBytes[offset + 1] & 0xFF) << 8
                    | (headerBytes[offset + 2] & 0xFF) << 16
                    | (headerBytes[offset + 3] & 0xFF) << 24;
                offset += 4;
                return v;
            }

            byte headerSize = headerBytes[offset++];
            ArchiverVersionNumber = headerBytes[offset++];
            MinVersionToExtract = headerBytes[offset++];
            HostOS hostOS = (HostOS)headerBytes[offset++];
            Flags = headerBytes[offset++];
            CompressionMethod = CompressionMethodFromByte(headerBytes[offset++]);
            FileType = FileTypeFromByte(headerBytes[offset++]);
            offset++; // Skip 1 byte
            var rawTimestamp = ReadInt32();
            DateTimeModified =
                rawTimestamp != 0 ? new DosDateTime(rawTimestamp) : new DosDateTime(0);
            CompressedSize = ReadInt32();
            OriginalSize = ReadInt32();
            OriginalCrc32 = ReadInt32();
            FileSpecPosition = ReadInt16();
            FileAccessMode = ReadInt16();
            FirstChapter = headerBytes[offset++];
            LastChapter = headerBytes[offset++];
            ExtendedFilePosition = 0;
            OriginalSizeEvenForVolumes = 0;
            if (headerSize > StdHdrSize)
            {
                ExtendedFilePosition = ReadInt32();
                if (headerSize >= R9HdrSize)
                {
                    rawTimestamp = ReadInt32();
                    DateTimeAccessed =
                        rawTimestamp != 0 ? new DosDateTime(rawTimestamp) : new DosDateTime(0);
                    rawTimestamp = ReadInt32();
                    DateTimeCreated =
                        rawTimestamp != 0 ? new DosDateTime(rawTimestamp) : new DosDateTime(0);
                    OriginalSizeEvenForVolumes = ReadInt32();
                }
            }
            Name = Encoding.ASCII.GetString(
                headerBytes,
                offset,
                Array.IndexOf(headerBytes, (byte)0, offset) - offset
            );
            offset += Name.Length + 1;
            Comment = Encoding.ASCII.GetString(
                headerBytes,
                offset,
                Array.IndexOf(headerBytes, (byte)0, offset) - offset
            );
            offset += Comment.Length + 1;
            return this;
        }

        public static CompressionMethod CompressionMethodFromByte(byte value)
        {
            return value switch
            {
                0 => CompressionMethod.Stored,
                1 => CompressionMethod.CompressedMost,
                2 => CompressionMethod.Compressed,
                3 => CompressionMethod.CompressedFaster,
                4 => CompressionMethod.CompressedFastest,
                8 => CompressionMethod.NoDataNoCrc,
                9 => CompressionMethod.NoData,
                _ => CompressionMethod.Unknown,
            };
        }
    }
}
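The `ReadInt16`/`ReadInt32` local functions in `LoadFrom` read little-endian values from the header buffer while advancing a shared offset. A minimal Python sketch of that pattern (the name `make_readers` and the closure-over-dict state are mine, chosen to mirror the C# closures over `offset`):

```python
def make_readers(header_bytes):
    """Return (read_int16, read_int32) closures over a shared offset,
    decoding little-endian values from header_bytes."""
    state = {"offset": 0}

    def read_int16():
        o = state["offset"]
        if o + 1 >= len(header_bytes):
            raise EOFError
        state["offset"] = o + 2
        return header_bytes[o] | (header_bytes[o + 1] << 8)

    def read_int32():
        o = state["offset"]
        if o + 3 >= len(header_bytes):
            raise EOFError
        state["offset"] = o + 4
        return (header_bytes[o]
                | (header_bytes[o + 1] << 8)
                | (header_bytes[o + 2] << 16)
                | (header_bytes[o + 3] << 24))

    return read_int16, read_int32

read16, read32 = make_readers(bytes([0x34, 0x12, 0x78, 0x56, 0x34, 0x12]))
assert read16() == 0x1234       # consumes the first two bytes
assert read32() == 0x12345678   # continues from the shared offset
```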

View File

@@ -1,18 +0,0 @@
using System.IO;
using System.Threading;
using System.Threading.Tasks;

namespace SharpCompress.Common.Arj.Headers;

public partial class ArjMainHeader
{
    public override async ValueTask<ArjHeader?> ReadAsync(
        Stream stream,
        CancellationToken cancellationToken = default
    )
    {
        var body = await ReadHeaderAsync(stream, cancellationToken);
        await ReadExtendedHeadersAsync(stream, cancellationToken);
        return LoadFrom(body);
    }
}

View File

@@ -1,141 +1,138 @@
using System;
using System.IO;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Compressors.Deflate;
using SharpCompress.Crypto;
namespace SharpCompress.Common.Arj.Headers
{
    public class ArjMainHeader : ArjHeader
    {
        private const int FIRST_HDR_SIZE = 34;
        private const ushort ARJ_MAGIC = 0xEA60;

        public ArchiveEncoding ArchiveEncoding { get; }
        public int ArchiverVersionNumber { get; private set; }
        public int MinVersionToExtract { get; private set; }
        public HostOS HostOs { get; private set; }
        public int SecurityVersion { get; private set; }
        public DosDateTime CreationDateTime { get; private set; } = new DosDateTime(0);
        public long CompressedSize { get; private set; }
        public long ArchiveSize { get; private set; }
        public long SecurityEnvelope { get; private set; }
        public int FileSpecPosition { get; private set; }
        public int SecurityEnvelopeLength { get; private set; }
        public int EncryptionVersion { get; private set; }
        public int LastChapter { get; private set; }
        public int ArjProtectionFactor { get; private set; }
        public int Flags2 { get; private set; }
        public string Name { get; private set; } = string.Empty;
        public string Comment { get; private set; } = string.Empty;

        public ArjMainHeader(ArchiveEncoding archiveEncoding)
            : base(ArjHeaderType.MainHeader)
        {
            ArchiveEncoding =
                archiveEncoding ?? throw new ArgumentNullException(nameof(archiveEncoding));
        }

        public override ArjHeader? Read(Stream stream)
        {
            var body = ReadHeader(stream);
            ReadExtendedHeaders(stream);
            return LoadFrom(body);
        }

        public ArjMainHeader LoadFrom(byte[] headerBytes)
        {
            var offset = 1;

            byte ReadByte()
            {
                if (offset >= headerBytes.Length)
                {
                    throw new EndOfStreamException();
                }
                return (byte)(headerBytes[offset++] & 0xFF);
            }

            int ReadInt16()
            {
                if (offset + 1 >= headerBytes.Length)
                {
                    throw new EndOfStreamException();
                }
                var v = headerBytes[offset] & 0xFF | (headerBytes[offset + 1] & 0xFF) << 8;
                offset += 2;
                return v;
            }

            long ReadInt32()
            {
                if (offset + 3 >= headerBytes.Length)
                {
                    throw new EndOfStreamException();
                }
                long v =
                    headerBytes[offset] & 0xFF
                    | (headerBytes[offset + 1] & 0xFF) << 8
                    | (headerBytes[offset + 2] & 0xFF) << 16
                    | (headerBytes[offset + 3] & 0xFF) << 24;
                offset += 4;
                return v;
            }

            string ReadNullTerminatedString(byte[] x, int startIndex)
            {
                var result = new StringBuilder();
                int i = startIndex;
                while (i < x.Length && x[i] != 0)
                {
                    result.Append((char)x[i]);
                    i++;
                }
                // Skip the null terminator
                i++;
                if (i < x.Length)
                {
                    byte[] remainder = new byte[x.Length - i];
                    Array.Copy(x, i, remainder, 0, remainder.Length);
                    x = remainder;
                }
                return result.ToString();
            }

            ArchiverVersionNumber = ReadByte();
            MinVersionToExtract = ReadByte();
            var hostOsByte = ReadByte();
            HostOs = hostOsByte <= 11 ? (HostOS)hostOsByte : HostOS.Unknown;
            Flags = ReadByte();
            SecurityVersion = ReadByte();
            FileType = FileTypeFromByte(ReadByte());
            offset++; // skip reserved
            CreationDateTime = new DosDateTime((int)ReadInt32());
            CompressedSize = ReadInt32();
            ArchiveSize = ReadInt32();
            SecurityEnvelope = ReadInt32();
            FileSpecPosition = ReadInt16();
            SecurityEnvelopeLength = ReadInt16();
            EncryptionVersion = ReadByte();
            LastChapter = ReadByte();
            Name = ReadNullTerminatedString(headerBytes, offset);
            Comment = ReadNullTerminatedString(headerBytes, offset + 1 + Name.Length);
            return this;
        }
    }
}
View File

@@ -4,16 +4,17 @@ using System.Linq;
using System.Text;
using System.Threading.Tasks;
namespace SharpCompress.Common.Arj.Headers;
public enum CompressionMethod
namespace SharpCompress.Common.Arj.Headers
{
Stored = 0,
CompressedMost = 1,
Compressed = 2,
CompressedFaster = 3,
CompressedFastest = 4,
NoDataNoCrc = 8,
NoData = 9,
Unknown,
public enum CompressionMethod
{
Stored = 0,
CompressedMost = 1,
Compressed = 2,
CompressedFaster = 3,
CompressedFastest = 4,
NoDataNoCrc = 8,
NoData = 9,
Unknown,
}
}

View File

@@ -1,36 +1,37 @@
using System;
namespace SharpCompress.Common.Arj.Headers;
public class DosDateTime
namespace SharpCompress.Common.Arj.Headers
{
public DateTime DateTime { get; }
public DosDateTime(long dosValue)
public class DosDateTime
{
// Ensure only the lower 32 bits are used
int value = unchecked((int)(dosValue & 0xFFFFFFFF));
public DateTime DateTime { get; }
var date = (value >> 16) & 0xFFFF;
var time = value & 0xFFFF;
var day = date & 0x1F;
var month = (date >> 5) & 0x0F;
var year = ((date >> 9) & 0x7F) + 1980;
var second = (time & 0x1F) * 2;
var minute = (time >> 5) & 0x3F;
var hour = (time >> 11) & 0x1F;
try
public DosDateTime(long dosValue)
{
DateTime = new DateTime(year, month, day, hour, minute, second);
}
catch
{
DateTime = DateTime.MinValue;
// Ensure only the lower 32 bits are used
int value = unchecked((int)(dosValue & 0xFFFFFFFF));
var date = (value >> 16) & 0xFFFF;
var time = value & 0xFFFF;
var day = date & 0x1F;
var month = (date >> 5) & 0x0F;
var year = ((date >> 9) & 0x7F) + 1980;
var second = (time & 0x1F) * 2;
var minute = (time >> 5) & 0x3F;
var hour = (time >> 11) & 0x1F;
try
{
DateTime = new DateTime(year, month, day, hour, minute, second);
}
catch
{
DateTime = DateTime.MinValue;
}
}
public override string ToString() => DateTime.ToString("yyyy-MM-dd HH:mm:ss");
}
public override string ToString() => DateTime.ToString("yyyy-MM-dd HH:mm:ss");
}
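The bit unpacking in the DosDateTime constructor above (day in the low 5 bits of the date word, 2-second resolution for seconds, years stored relative to 1980) can be sketched outside the library. The following is an illustrative Python re-implementation of the same decoding, not part of SharpCompress itself:

```python
from datetime import datetime

def decode_dos_datetime(dos_value: int) -> datetime:
    """Decode a packed MS-DOS date/time value (date word in the high
    16 bits, time word in the low 16 bits), mirroring DosDateTime."""
    value = dos_value & 0xFFFFFFFF
    date = (value >> 16) & 0xFFFF
    time = value & 0xFFFF
    day = date & 0x1F
    month = (date >> 5) & 0x0F
    year = ((date >> 9) & 0x7F) + 1980   # years are stored relative to 1980
    second = (time & 0x1F) * 2           # seconds have 2-second resolution
    minute = (time >> 5) & 0x3F
    hour = (time >> 11) & 0x1F
    try:
        return datetime(year, month, day, hour, minute, second)
    except ValueError:
        return datetime.min              # mirrors the catch -> DateTime.MinValue fallback

# 1989-06-15 13:45:30, packed by hand with the same bit layout:
packed = (((9 << 9) | (6 << 5) | 15) << 16) | (13 << 11) | (45 << 5) | 15
```

Out-of-range fields (e.g. month 15) raise inside `datetime(...)` and fall back to `datetime.min`, just as the C# constructor falls back to `DateTime.MinValue`.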

View File

@@ -1,12 +1,13 @@
namespace SharpCompress.Common.Arj.Headers;
public enum FileType : byte
namespace SharpCompress.Common.Arj.Headers
{
Binary = 0,
Text7Bit = 1,
CommentHeader = 2,
Directory = 3,
VolumeLabel = 4,
ChapterLabel = 5,
Unknown = 255,
public enum FileType : byte
{
Binary = 0,
Text7Bit = 1,
CommentHeader = 2,
Directory = 3,
VolumeLabel = 4,
ChapterLabel = 5,
Unknown = 255,
}
}

View File

@@ -1,18 +1,19 @@
namespace SharpCompress.Common.Arj.Headers;
public enum HostOS
namespace SharpCompress.Common.Arj.Headers
{
MsDos = 0,
PrimOS = 1,
Unix = 2,
Amiga = 3,
MacOs = 4,
OS2 = 5,
AppleGS = 6,
AtariST = 7,
NeXT = 8,
VaxVMS = 9,
Win95 = 10,
Win32 = 11,
Unknown = 255,
public enum HostOS
{
MsDos = 0,
PrimOS = 1,
Unix = 2,
Amiga = 3,
MacOs = 4,
OS2 = 5,
AppleGS = 6,
AtariST = 7,
NeXT = 8,
VaxVMS = 9,
Win95 = 10,
Win32 = 11,
Unknown = 255,
}
}

View File

@@ -0,0 +1,103 @@
using System;
using System.Buffers.Binary;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
namespace SharpCompress.Common
{
public sealed class AsyncBinaryReader : IDisposable
{
private readonly Stream _stream;
private readonly Stream _originalStream;
private readonly bool _leaveOpen;
private readonly byte[] _buffer = new byte[8];
private bool _disposed;
public AsyncBinaryReader(Stream stream, bool leaveOpen = false, int bufferSize = 4096)
{
_originalStream = stream ?? throw new ArgumentNullException(nameof(stream));
_leaveOpen = leaveOpen;
// Use the stream directly without wrapping in BufferedStream
// BufferedStream uses synchronous Read internally which doesn't work with async-only streams
// SharpCompress uses SharpCompressStream for buffering which supports true async reads
_stream = stream;
}
public Stream BaseStream => _stream;
public async ValueTask<byte> ReadByteAsync(CancellationToken ct = default)
{
await _stream.ReadExactAsync(_buffer, 0, 1, ct).ConfigureAwait(false);
return _buffer[0];
}
public async ValueTask<ushort> ReadUInt16Async(CancellationToken ct = default)
{
await _stream.ReadExactAsync(_buffer, 0, 2, ct).ConfigureAwait(false);
return BinaryPrimitives.ReadUInt16LittleEndian(_buffer);
}
public async ValueTask<uint> ReadUInt32Async(CancellationToken ct = default)
{
await _stream.ReadExactAsync(_buffer, 0, 4, ct).ConfigureAwait(false);
return BinaryPrimitives.ReadUInt32LittleEndian(_buffer);
}
public async ValueTask<ulong> ReadUInt64Async(CancellationToken ct = default)
{
await _stream.ReadExactAsync(_buffer, 0, 8, ct).ConfigureAwait(false);
return BinaryPrimitives.ReadUInt64LittleEndian(_buffer);
}
public async ValueTask ReadBytesAsync(
byte[] bytes,
int offset,
int count,
CancellationToken ct = default
)
{
await _stream.ReadExactAsync(bytes, offset, count, ct).ConfigureAwait(false);
}
public async ValueTask SkipAsync(int count, CancellationToken ct = default)
{
await _stream.SkipAsync(count, ct).ConfigureAwait(false);
}
public void Dispose()
{
if (_disposed)
{
return;
}
_disposed = true;
// Dispose the original stream if we own it
if (!_leaveOpen)
{
_originalStream.Dispose();
}
}
#if NET8_0_OR_GREATER
public async ValueTask DisposeAsync()
{
if (_disposed)
{
return;
}
_disposed = true;
// Dispose the original stream if we own it
if (!_leaveOpen)
{
await _originalStream.DisposeAsync().ConfigureAwait(false);
}
}
#endif
}
}
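AsyncBinaryReader above is a thin wrapper that reads fixed-size little-endian primitives into a small scratch buffer. A minimal synchronous Python analogue of the same reads (the class and method names here are hypothetical, chosen only to mirror the C# shape):

```python
import io
import struct

class LittleEndianReader:
    """Illustrative synchronous analogue of AsyncBinaryReader's
    fixed-size little-endian reads over a raw stream."""

    def __init__(self, stream):
        self._stream = stream

    def _read_exact(self, n: int) -> bytes:
        # Equivalent of ReadExactAsync: fail rather than return short data.
        data = self._stream.read(n)
        if len(data) != n:
            raise EOFError("unexpected end of stream")
        return data

    def read_u8(self) -> int:
        return self._read_exact(1)[0]

    def read_u16(self) -> int:
        return struct.unpack("<H", self._read_exact(2))[0]   # little-endian ushort

    def read_u32(self) -> int:
        return struct.unpack("<I", self._read_exact(4))[0]   # little-endian uint

r = LittleEndianReader(io.BytesIO(b"\x2a\x34\x12\x78\x56\x34\x12"))
```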

View File

@@ -1,41 +0,0 @@
namespace SharpCompress.Common;
public static class Constants
{
/// <summary>
/// The default buffer size for stream operations, matching .NET's Stream.CopyTo default of 81920 bytes.
/// This can be modified globally at runtime.
/// </summary>
public static int BufferSize { get; set; } = 81920;
/// <summary>
/// The default size for rewindable buffers in SharpCompressStream.
/// Used for format detection on non-seekable streams.
/// </summary>
/// <remarks>
/// <para>
/// When opening archives from non-seekable streams (network streams, pipes,
/// compressed streams), SharpCompress uses a ring buffer to enable format
/// auto-detection. This buffer allows the library to try multiple decoders
/// by rewinding and re-reading the same data.
/// </para>
/// <para>
/// <b>Default:</b> 81920 bytes (81KB) - sufficient for typical format detection.
/// </para>
/// <para>
/// <b>Typical usage:</b> 500-1000 bytes for most archives
/// </para>
/// <para>
/// <b>Can be overridden per-stream via ReaderOptions.RewindableBufferSize.</b>
/// </para>
/// <para>
/// <b>Increase if:</b>
/// <list type="bullet">
/// <item>Handling self-extracting archives (may need 512KB+)</item>
/// <item>Format detection fails with buffer overflow errors</item>
/// <item>Using custom formats with large headers</item>
/// </list>
/// </para>
/// </remarks>
public static int RewindableBufferSize { get; set; } = 81920;
}

View File

@@ -1,81 +0,0 @@
using System;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.IO;
namespace SharpCompress.Common;
public partial class EntryStream
{
/// <summary>
/// Asynchronously skip the rest of the entry stream.
/// </summary>
public async ValueTask SkipEntryAsync(CancellationToken cancellationToken = default)
{
await this.SkipAsync(cancellationToken).ConfigureAwait(false);
_completed = true;
}
#if !LEGACY_DOTNET
public override async ValueTask DisposeAsync()
{
if (_isDisposed)
{
return;
}
_isDisposed = true;
if (!(_completed || _reader.Cancelled))
{
await SkipEntryAsync().ConfigureAwait(false);
}
//Need a safe standard approach to this - it's okay for compression to over-read. Handling needs to be standardised
if (_stream is IStreamStack ss)
{
if (ss.BaseStream() is SharpCompress.Compressors.Deflate.DeflateStream deflateStream)
{
await deflateStream.FlushAsync().ConfigureAwait(false);
}
else if (ss.BaseStream() is SharpCompress.Compressors.LZMA.LzmaStream lzmaStream)
{
await lzmaStream.FlushAsync().ConfigureAwait(false);
}
}
await base.DisposeAsync().ConfigureAwait(false);
await _stream.DisposeAsync().ConfigureAwait(false);
}
#endif
public override async Task<int> ReadAsync(
byte[] buffer,
int offset,
int count,
CancellationToken cancellationToken
)
{
var read = await _stream
.ReadAsync(buffer, offset, count, cancellationToken)
.ConfigureAwait(false);
if (read <= 0)
{
_completed = true;
}
return read;
}
#if !LEGACY_DOTNET
public override async ValueTask<int> ReadAsync(
Memory<byte> buffer,
CancellationToken cancellationToken = default
)
{
var read = await _stream.ReadAsync(buffer, cancellationToken).ConfigureAwait(false);
if (read <= 0)
{
_completed = true;
}
return read;
}
#endif
}

View File

@@ -8,8 +8,28 @@ using SharpCompress.Readers;
namespace SharpCompress.Common;
public partial class EntryStream : Stream
public class EntryStream : Stream, IStreamStack
{
#if DEBUG_STREAMS
long IStreamStack.InstanceId { get; set; }
#endif
int IStreamStack.DefaultBufferSize { get; set; }
Stream IStreamStack.BaseStream() => _stream;
int IStreamStack.BufferSize
{
get => 0;
set { }
}
int IStreamStack.BufferPosition
{
get => 0;
set { }
}
void IStreamStack.SetPosition(long position) { }
private readonly IReader _reader;
private readonly Stream _stream;
private bool _completed;
@@ -19,6 +39,9 @@ public partial class EntryStream : Stream
{
_reader = reader;
_stream = stream;
#if DEBUG_STREAMS
this.DebugConstruct(typeof(EntryStream));
#endif
}
/// <summary>
@@ -30,6 +53,15 @@ public partial class EntryStream : Stream
_completed = true;
}
/// <summary>
/// Asynchronously skip the rest of the entry stream.
/// </summary>
public async ValueTask SkipEntryAsync(CancellationToken cancellationToken = default)
{
await this.SkipAsync(cancellationToken).ConfigureAwait(false);
_completed = true;
}
protected override void Dispose(bool disposing)
{
if (_isDisposed)
@@ -39,38 +71,61 @@ public partial class EntryStream : Stream
_isDisposed = true;
if (!(_completed || _reader.Cancelled))
{
if (Utility.UseSyncOverAsyncDispose())
{
SkipEntryAsync().GetAwaiter().GetResult();
}
else
{
SkipEntry();
}
SkipEntry();
}
//Need a safe standard approach to this - it's okay for compression to over-read. Handling needs to be standardised
if (_stream is IStreamStack ss)
{
if (
ss.GetStream<SharpCompress.Compressors.Deflate.DeflateStream>()
is SharpCompress.Compressors.Deflate.DeflateStream deflateStream
)
if (ss.BaseStream() is SharpCompress.Compressors.Deflate.DeflateStream deflateStream)
{
deflateStream.Flush(); //Deflate over reads. Knock it back
}
else if (
ss.GetStream<SharpCompress.Compressors.LZMA.LzmaStream>()
is SharpCompress.Compressors.LZMA.LzmaStream lzmaStream
)
else if (ss.BaseStream() is SharpCompress.Compressors.LZMA.LzmaStream lzmaStream)
{
lzmaStream.Flush(); //Lzma over reads. Knock it back
}
}
#if DEBUG_STREAMS
this.DebugDispose(typeof(EntryStream));
#endif
base.Dispose(disposing);
_stream.Dispose();
}
#if !LEGACY_DOTNET
public override async ValueTask DisposeAsync()
{
if (_isDisposed)
{
return;
}
_isDisposed = true;
if (!(_completed || _reader.Cancelled))
{
await SkipEntryAsync().ConfigureAwait(false);
}
//Need a safe standard approach to this - it's okay for compression to over-read. Handling needs to be standardised
if (_stream is IStreamStack ss)
{
if (ss.BaseStream() is SharpCompress.Compressors.Deflate.DeflateStream deflateStream)
{
await deflateStream.FlushAsync().ConfigureAwait(false);
}
else if (ss.BaseStream() is SharpCompress.Compressors.LZMA.LzmaStream lzmaStream)
{
await lzmaStream.FlushAsync().ConfigureAwait(false);
}
}
#if DEBUG_STREAMS
this.DebugDispose(typeof(EntryStream));
#endif
await base.DisposeAsync().ConfigureAwait(false);
await _stream.DisposeAsync().ConfigureAwait(false);
}
#endif
public override bool CanRead => true;
public override bool CanSeek => false;
@@ -99,6 +154,38 @@ public partial class EntryStream : Stream
return read;
}
public override async Task<int> ReadAsync(
byte[] buffer,
int offset,
int count,
CancellationToken cancellationToken
)
{
var read = await _stream
.ReadAsync(buffer, offset, count, cancellationToken)
.ConfigureAwait(false);
if (read <= 0)
{
_completed = true;
}
return read;
}
#if !LEGACY_DOTNET
public override async ValueTask<int> ReadAsync(
Memory<byte> buffer,
CancellationToken cancellationToken = default
)
{
var read = await _stream.ReadAsync(buffer, cancellationToken).ConfigureAwait(false);
if (read <= 0)
{
_completed = true;
}
return read;
}
#endif
public override int ReadByte()
{
var value = _stream.ReadByte();

View File

@@ -1,116 +0,0 @@
using System;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
namespace SharpCompress.Common;
internal static partial class ExtractionMethods
{
public static async ValueTask WriteEntryToDirectoryAsync(
IEntry entry,
string destinationDirectory,
ExtractionOptions? options,
Func<string, ExtractionOptions?, CancellationToken, ValueTask> writeAsync,
CancellationToken cancellationToken = default
)
{
string destinationFileName;
var fullDestinationDirectoryPath = Path.GetFullPath(destinationDirectory);
//check for trailing slash.
if (
fullDestinationDirectoryPath[fullDestinationDirectoryPath.Length - 1]
!= Path.DirectorySeparatorChar
)
{
fullDestinationDirectoryPath += Path.DirectorySeparatorChar;
}
if (!Directory.Exists(fullDestinationDirectoryPath))
{
throw new ExtractionException(
$"Directory does not exist to extract to: {fullDestinationDirectoryPath}"
);
}
options ??= new ExtractionOptions() { Overwrite = true };
var file = Path.GetFileName(entry.Key.NotNull("Entry Key is null")).NotNull("File is null");
file = Utility.ReplaceInvalidFileNameChars(file);
if (options.ExtractFullPath)
{
var folder = Path.GetDirectoryName(entry.Key.NotNull("Entry Key is null"))
.NotNull("Directory is null");
var destdir = Path.GetFullPath(Path.Combine(fullDestinationDirectoryPath, folder));
if (!Directory.Exists(destdir))
{
if (!destdir.StartsWith(fullDestinationDirectoryPath, PathComparison))
{
throw new ExtractionException(
"Entry is trying to create a directory outside of the destination directory."
);
}
Directory.CreateDirectory(destdir);
}
destinationFileName = Path.Combine(destdir, file);
}
else
{
destinationFileName = Path.Combine(fullDestinationDirectoryPath, file);
}
if (!entry.IsDirectory)
{
destinationFileName = Path.GetFullPath(destinationFileName);
if (!destinationFileName.StartsWith(fullDestinationDirectoryPath, PathComparison))
{
throw new ExtractionException(
"Entry is trying to write a file outside of the destination directory."
);
}
await writeAsync(destinationFileName, options, cancellationToken).ConfigureAwait(false);
}
else if (options.ExtractFullPath && !Directory.Exists(destinationFileName))
{
Directory.CreateDirectory(destinationFileName);
}
}
public static async ValueTask WriteEntryToFileAsync(
IEntry entry,
string destinationFileName,
ExtractionOptions? options,
Func<string, FileMode, CancellationToken, ValueTask> openAndWriteAsync,
CancellationToken cancellationToken = default
)
{
if (entry.LinkTarget != null)
{
if (options?.WriteSymbolicLink is null)
{
throw new ExtractionException(
"Entry is a symbolic link but ExtractionOptions.WriteSymbolicLink delegate is null"
);
}
options.WriteSymbolicLink(destinationFileName, entry.LinkTarget);
}
else
{
var fm = FileMode.Create;
options ??= new ExtractionOptions() { Overwrite = true };
if (!options.Overwrite)
{
fm = FileMode.CreateNew;
}
await openAndWriteAsync(destinationFileName, fm, cancellationToken)
.ConfigureAwait(false);
entry.PreserveExtractionOptions(destinationFileName, options);
}
}
}

View File

@@ -6,7 +6,7 @@ using System.Threading.Tasks;
namespace SharpCompress.Common;
internal static partial class ExtractionMethods
internal static class ExtractionMethods
{
/// <summary>
/// Gets the appropriate StringComparison for path checks based on the file system.
@@ -123,4 +123,111 @@ internal static partial class ExtractionMethods
entry.PreserveExtractionOptions(destinationFileName, options);
}
}
public static async ValueTask WriteEntryToDirectoryAsync(
IEntry entry,
string destinationDirectory,
ExtractionOptions? options,
Func<string, ExtractionOptions?, CancellationToken, ValueTask> writeAsync,
CancellationToken cancellationToken = default
)
{
string destinationFileName;
var fullDestinationDirectoryPath = Path.GetFullPath(destinationDirectory);
//check for trailing slash.
if (
fullDestinationDirectoryPath[fullDestinationDirectoryPath.Length - 1]
!= Path.DirectorySeparatorChar
)
{
fullDestinationDirectoryPath += Path.DirectorySeparatorChar;
}
if (!Directory.Exists(fullDestinationDirectoryPath))
{
throw new ExtractionException(
$"Directory does not exist to extract to: {fullDestinationDirectoryPath}"
);
}
options ??= new ExtractionOptions() { Overwrite = true };
var file = Path.GetFileName(entry.Key.NotNull("Entry Key is null")).NotNull("File is null");
file = Utility.ReplaceInvalidFileNameChars(file);
if (options.ExtractFullPath)
{
var folder = Path.GetDirectoryName(entry.Key.NotNull("Entry Key is null"))
.NotNull("Directory is null");
var destdir = Path.GetFullPath(Path.Combine(fullDestinationDirectoryPath, folder));
if (!Directory.Exists(destdir))
{
if (!destdir.StartsWith(fullDestinationDirectoryPath, PathComparison))
{
throw new ExtractionException(
"Entry is trying to create a directory outside of the destination directory."
);
}
Directory.CreateDirectory(destdir);
}
destinationFileName = Path.Combine(destdir, file);
}
else
{
destinationFileName = Path.Combine(fullDestinationDirectoryPath, file);
}
if (!entry.IsDirectory)
{
destinationFileName = Path.GetFullPath(destinationFileName);
if (!destinationFileName.StartsWith(fullDestinationDirectoryPath, PathComparison))
{
throw new ExtractionException(
"Entry is trying to write a file outside of the destination directory."
);
}
await writeAsync(destinationFileName, options, cancellationToken).ConfigureAwait(false);
}
else if (options.ExtractFullPath && !Directory.Exists(destinationFileName))
{
Directory.CreateDirectory(destinationFileName);
}
}
public static async ValueTask WriteEntryToFileAsync(
IEntry entry,
string destinationFileName,
ExtractionOptions? options,
Func<string, FileMode, CancellationToken, ValueTask> openAndWriteAsync,
CancellationToken cancellationToken = default
)
{
if (entry.LinkTarget != null)
{
if (options?.WriteSymbolicLink is null)
{
throw new ExtractionException(
"Entry is a symbolic link but ExtractionOptions.WriteSymbolicLink delegate is null"
);
}
options.WriteSymbolicLink(destinationFileName, entry.LinkTarget);
}
else
{
var fm = FileMode.Create;
options ??= new ExtractionOptions() { Overwrite = true };
if (!options.Overwrite)
{
fm = FileMode.CreateNew;
}
await openAndWriteAsync(destinationFileName, fm, cancellationToken)
.ConfigureAwait(false);
entry.PreserveExtractionOptions(destinationFileName, options);
}
}
}
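The `StartsWith(fullDestinationDirectoryPath, PathComparison)` checks in WriteEntryToDirectoryAsync above guard against entries that resolve outside the destination directory ("zip-slip"). A minimal Python sketch of the same guard, assuming a case-sensitive file system (the C# code picks the comparison per platform via `PathComparison`):

```python
import os

def safe_destination(destination_dir: str, entry_key: str) -> str:
    """Resolve an archive entry path under destination_dir, rejecting
    entries that would escape it (the zip-slip guard sketched above)."""
    root = os.path.abspath(destination_dir)
    if not root.endswith(os.sep):
        root += os.sep                      # same trailing-slash normalization as the C# code
    candidate = os.path.abspath(os.path.join(root, entry_key))
    if not candidate.startswith(root):
        raise ValueError("entry is trying to write outside of the destination directory")
    return candidate
```

An entry key such as `../evil.txt` canonicalizes to a path outside the root and is rejected before anything is written.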

View File

@@ -1,15 +0,0 @@
using System.Collections.Generic;
using System.IO;
namespace SharpCompress.Common.GZip;
public partial class GZipEntry
{
internal static async IAsyncEnumerable<GZipEntry> GetEntriesAsync(
Stream stream,
OptionsBase options
)
{
yield return new GZipEntry(await GZipFilePart.CreateAsync(stream, options.ArchiveEncoding));
}
}

View File

@@ -4,7 +4,7 @@ using System.IO;
namespace SharpCompress.Common.GZip;
public partial class GZipEntry : Entry
public class GZipEntry : Entry
{
private readonly GZipFilePart? _filePart;
@@ -42,6 +42,4 @@ public partial class GZipEntry : Entry
{
yield return new GZipEntry(GZipFilePart.Create(stream, options.ArchiveEncoding));
}
// Async methods moved to GZipEntry.Async.cs
}

View File

@@ -1,133 +0,0 @@
using System;
using System.Buffers.Binary;
using System.Collections.Generic;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common.Tar.Headers;
using SharpCompress.Compressors.Deflate;
namespace SharpCompress.Common.GZip;
internal sealed partial class GZipFilePart
{
internal static async ValueTask<GZipFilePart> CreateAsync(
Stream stream,
IArchiveEncoding archiveEncoding,
CancellationToken cancellationToken = default
)
{
var part = new GZipFilePart(stream, archiveEncoding);
await part.ReadAndValidateGzipHeaderAsync(cancellationToken);
if (stream.CanSeek)
{
var position = stream.Position;
stream.Position = stream.Length - 8;
await part.ReadTrailerAsync(cancellationToken);
stream.Position = position;
part.EntryStartPosition = position;
}
else
{
// For non-seekable streams, we can't read the trailer or track position.
// Set to 0 since the stream will be read sequentially from its current position.
part.EntryStartPosition = 0;
}
return part;
}
private async ValueTask ReadTrailerAsync(CancellationToken cancellationToken = default)
{
// Read and potentially verify the GZIP trailer: CRC32 and size mod 2^32
var trailer = new byte[8];
_ = await _stream.ReadFullyAsync(trailer, 0, 8, cancellationToken);
Crc = BinaryPrimitives.ReadUInt32LittleEndian(trailer);
UncompressedSize = BinaryPrimitives.ReadUInt32LittleEndian(trailer.AsSpan().Slice(4));
}
private async ValueTask ReadAndValidateGzipHeaderAsync(
CancellationToken cancellationToken = default
)
{
// read the header on the first read
var header = new byte[10];
var n = await _stream.ReadAsync(header, 0, 10, cancellationToken);
// workitem 8501: handle edge case (decompress empty stream)
if (n == 0)
{
return;
}
if (n != 10)
{
throw new ZlibException("Not a valid GZIP stream.");
}
if (header[0] != 0x1F || header[1] != 0x8B || header[2] != 8)
{
throw new ZlibException("Bad GZIP header.");
}
var timet = BinaryPrimitives.ReadInt32LittleEndian(header.AsSpan().Slice(4));
DateModified = TarHeader.EPOCH.AddSeconds(timet);
if ((header[3] & 0x04) == 0x04)
{
// read and discard extra field
var lengthField = new byte[2];
_ = await _stream.ReadAsync(lengthField, 0, 2, cancellationToken);
var extraLength = (short)(lengthField[0] + (lengthField[1] * 256));
var extra = new byte[extraLength];
if (!await _stream.ReadFullyAsync(extra, cancellationToken))
{
throw new ZlibException("Unexpected end-of-file reading GZIP header.");
}
}
if ((header[3] & 0x08) == 0x08)
{
_name = await ReadZeroTerminatedStringAsync(_stream, cancellationToken);
}
if ((header[3] & 0x10) == 0x010)
{
await ReadZeroTerminatedStringAsync(_stream, cancellationToken);
}
if ((header[3] & 0x02) == 0x02)
{
var buf = new byte[1];
_ = await _stream.ReadAsync(buf, 0, 1, cancellationToken); // CRC16, ignore
}
}
private async ValueTask<string> ReadZeroTerminatedStringAsync(
Stream stream,
CancellationToken cancellationToken = default
)
{
var buf1 = new byte[1];
var list = new List<byte>();
var done = false;
do
{
// workitem 7740
var n = await stream.ReadAsync(buf1, 0, 1, cancellationToken);
if (n != 1)
{
throw new ZlibException("Unexpected EOF reading GZIP header.");
}
if (buf1[0] == 0)
{
done = true;
}
else
{
list.Add(buf1[0]);
}
} while (!done);
var buffer = list.ToArray();
return ArchiveEncoding.Decode(buffer);
}
}

View File

@@ -2,13 +2,15 @@ using System;
using System.Buffers.Binary;
using System.Collections.Generic;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common.Tar.Headers;
using SharpCompress.Compressors;
using SharpCompress.Compressors.Deflate;
namespace SharpCompress.Common.GZip;
internal sealed partial class GZipFilePart : FilePart
internal sealed class GZipFilePart : FilePart
{
private string? _name;
private readonly Stream _stream;
@@ -35,6 +37,32 @@ internal sealed partial class GZipFilePart : FilePart
return part;
}
internal static async ValueTask<GZipFilePart> CreateAsync(
Stream stream,
IArchiveEncoding archiveEncoding,
CancellationToken cancellationToken = default
)
{
var part = new GZipFilePart(stream, archiveEncoding);
await part.ReadAndValidateGzipHeaderAsync(cancellationToken);
if (stream.CanSeek)
{
var position = stream.Position;
stream.Position = stream.Length - 8;
await part.ReadTrailerAsync(cancellationToken);
stream.Position = position;
part.EntryStartPosition = position;
}
else
{
// For non-seekable streams, we can't read the trailer or track position.
// Set to 0 since the stream will be read sequentially from its current position.
part.EntryStartPosition = 0;
}
return part;
}
private GZipFilePart(Stream stream, IArchiveEncoding archiveEncoding)
: base(archiveEncoding) => _stream = stream;
@@ -47,12 +75,7 @@ internal sealed partial class GZipFilePart : FilePart
internal override string? FilePartName => _name;
internal override Stream GetCompressedStream() =>
new DeflateStream(
_stream,
CompressionMode.Decompress,
CompressionLevel.Default,
leaveOpen: true
);
new DeflateStream(_stream, CompressionMode.Decompress, CompressionLevel.Default);
internal override Stream GetRawStream() => _stream;
@@ -66,6 +89,16 @@ internal sealed partial class GZipFilePart : FilePart
UncompressedSize = BinaryPrimitives.ReadUInt32LittleEndian(trailer.Slice(4));
}
private async ValueTask ReadTrailerAsync(CancellationToken cancellationToken = default)
{
// Read and potentially verify the GZIP trailer: CRC32 and size mod 2^32
var trailer = new byte[8];
_ = await _stream.ReadFullyAsync(trailer, 0, 8, cancellationToken);
Crc = BinaryPrimitives.ReadUInt32LittleEndian(trailer);
UncompressedSize = BinaryPrimitives.ReadUInt32LittleEndian(trailer.AsSpan().Slice(4));
}
private void ReadAndValidateGzipHeader()
{
// read the header on the first read
@@ -118,6 +151,61 @@ internal sealed partial class GZipFilePart : FilePart
}
}
private async ValueTask ReadAndValidateGzipHeaderAsync(
CancellationToken cancellationToken = default
)
{
// read the header on the first read
var header = new byte[10];
var n = await _stream.ReadAsync(header, 0, 10, cancellationToken);
// workitem 8501: handle edge case (decompress empty stream)
if (n == 0)
{
return;
}
if (n != 10)
{
throw new ZlibException("Not a valid GZIP stream.");
}
if (header[0] != 0x1F || header[1] != 0x8B || header[2] != 8)
{
throw new ZlibException("Bad GZIP header.");
}
var timet = BinaryPrimitives.ReadInt32LittleEndian(header.AsSpan().Slice(4));
DateModified = TarHeader.EPOCH.AddSeconds(timet);
if ((header[3] & 0x04) == 0x04)
{
// read and discard extra field
var lengthField = new byte[2];
_ = await _stream.ReadAsync(lengthField, 0, 2, cancellationToken);
var extraLength = (short)(lengthField[0] + (lengthField[1] * 256));
var extra = new byte[extraLength];
if (!await _stream.ReadFullyAsync(extra, cancellationToken))
{
throw new ZlibException("Unexpected end-of-file reading GZIP header.");
}
}
if ((header[3] & 0x08) == 0x08)
{
_name = await ReadZeroTerminatedStringAsync(_stream, cancellationToken);
}
if ((header[3] & 0x10) == 0x010)
{
await ReadZeroTerminatedStringAsync(_stream, cancellationToken);
}
if ((header[3] & 0x02) == 0x02)
{
var buf = new byte[1];
_ = await _stream.ReadAsync(buf, 0, 1, cancellationToken); // CRC16, ignore
}
}
private string ReadZeroTerminatedString(Stream stream)
{
Span<byte> buf1 = stackalloc byte[1];
@@ -143,4 +231,33 @@ internal sealed partial class GZipFilePart : FilePart
var buffer = list.ToArray();
return ArchiveEncoding.Decode(buffer);
}
private async ValueTask<string> ReadZeroTerminatedStringAsync(
Stream stream,
CancellationToken cancellationToken = default
)
{
var buf1 = new byte[1];
var list = new List<byte>();
var done = false;
do
{
// workitem 7740
var n = await stream.ReadAsync(buf1, 0, 1, cancellationToken);
if (n != 1)
{
throw new ZlibException("Unexpected EOF reading GZIP header.");
}
if (buf1[0] == 0)
{
done = true;
}
else
{
list.Add(buf1[0]);
}
} while (!done);
var buffer = list.ToArray();
return ArchiveEncoding.Decode(buffer);
}
}
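ReadAndValidateGzipHeader above checks the fixed 10-byte gzip header: magic bytes 0x1F 0x8B, compression method 8 (deflate), then the FLG bits (FEXTRA 0x04, FNAME 0x08, FCOMMENT 0x10, FHCRC 0x02) and the little-endian mtime. The same fixed-header check, sketched in Python against a real gzip stream (an illustration, not SharpCompress code):

```python
import gzip
import struct

def parse_gzip_header(data: bytes) -> dict:
    """Parse the fixed 10-byte gzip header validated by the code above:
    magic 0x1F 0x8B, method 8 (deflate), flags byte, little-endian mtime."""
    if len(data) < 10:
        raise ValueError("not a valid GZIP stream")
    if data[0] != 0x1F or data[1] != 0x8B or data[2] != 8:
        raise ValueError("bad GZIP header")
    flags = data[3]
    mtime = struct.unpack_from("<I", data, 4)[0]
    return {
        "flags": flags,
        "mtime": mtime,
        "has_extra": bool(flags & 0x04),   # FEXTRA: 2-byte length then payload
        "has_name": bool(flags & 0x08),    # FNAME: zero-terminated string follows
    }

blob = gzip.compress(b"hello", mtime=1700000000)
header = parse_gzip_header(blob)
```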

View File

@@ -2,7 +2,7 @@ using System;
namespace SharpCompress.Common;
public interface IVolume : IDisposable, IAsyncDisposable
public interface IVolume : IDisposable
{
int Index { get; }

View File

@@ -1,190 +0,0 @@
using System;
using System.Buffers.Binary;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using SharpCompress.Common;
using SharpCompress.IO;
namespace SharpCompress.Common.Rar;
internal class AsyncMarkingBinaryReader
{
private readonly AsyncBinaryReader _reader;
public AsyncMarkingBinaryReader(Stream stream)
{
_reader = new AsyncBinaryReader(stream, leaveOpen: true);
}
public Stream BaseStream => _reader.BaseStream;
public virtual long CurrentReadByteCount { get; protected set; }
public virtual void Mark() => CurrentReadByteCount = 0;
public virtual async ValueTask<bool> ReadBooleanAsync(
CancellationToken cancellationToken = default
) => await ReadByteAsync(cancellationToken).ConfigureAwait(false) != 0;
public virtual async ValueTask<byte> ReadByteAsync(
CancellationToken cancellationToken = default
)
{
CurrentReadByteCount++;
return await _reader.ReadByteAsync(cancellationToken).ConfigureAwait(false);
}
public virtual async ValueTask<byte[]> ReadBytesAsync(
int count,
CancellationToken cancellationToken = default
)
{
CurrentReadByteCount += count;
var bytes = new byte[count];
await _reader.ReadBytesAsync(bytes, 0, count, cancellationToken).ConfigureAwait(false);
return bytes;
}
public async ValueTask<ushort> ReadUInt16Async(CancellationToken cancellationToken = default)
{
var bytes = await ReadBytesAsync(2, cancellationToken).ConfigureAwait(false);
return BinaryPrimitives.ReadUInt16LittleEndian(bytes);
}
public async ValueTask<uint> ReadUInt32Async(CancellationToken cancellationToken = default)
{
var bytes = await ReadBytesAsync(4, cancellationToken).ConfigureAwait(false);
return BinaryPrimitives.ReadUInt32LittleEndian(bytes);
}
public virtual async ValueTask<ulong> ReadUInt64Async(
CancellationToken cancellationToken = default
)
{
var bytes = await ReadBytesAsync(8, cancellationToken).ConfigureAwait(false);
return BinaryPrimitives.ReadUInt64LittleEndian(bytes);
}
public virtual async ValueTask<short> ReadInt16Async(
CancellationToken cancellationToken = default
)
{
var bytes = await ReadBytesAsync(2, cancellationToken).ConfigureAwait(false);
return BinaryPrimitives.ReadInt16LittleEndian(bytes);
}
public virtual async ValueTask<int> ReadInt32Async(
CancellationToken cancellationToken = default
)
{
var bytes = await ReadBytesAsync(4, cancellationToken).ConfigureAwait(false);
return BinaryPrimitives.ReadInt32LittleEndian(bytes);
}
public virtual async ValueTask<long> ReadInt64Async(
CancellationToken cancellationToken = default
)
{
var bytes = await ReadBytesAsync(8, cancellationToken).ConfigureAwait(false);
return BinaryPrimitives.ReadInt64LittleEndian(bytes);
}
public async ValueTask<ulong> ReadRarVIntAsync(
CancellationToken cancellationToken = default,
int maxBytes = 10
) => await DoReadRarVIntAsync((maxBytes - 1) * 7, cancellationToken).ConfigureAwait(false);
private async ValueTask<ulong> DoReadRarVIntAsync(
int maxShift,
CancellationToken cancellationToken
)
{
var shift = 0;
ulong result = 0;
do
{
var b0 = await ReadByteAsync(cancellationToken).ConfigureAwait(false);
var b1 = ((uint)b0) & 0x7f;
ulong n = b1;
var shifted = n << shift;
if (n != shifted >> shift)
{
// overflow
break;
}
result |= shifted;
if (b0 == b1)
{
return result;
}
shift += 7;
} while (shift <= maxShift);
throw new FormatException("malformed vint");
}
public async ValueTask<uint> ReadRarVIntUInt32Async(
int maxBytes = 5,
CancellationToken cancellationToken = default
) =>
// hopefully this gets inlined
await DoReadRarVIntUInt32Async((maxBytes - 1) * 7, cancellationToken).ConfigureAwait(false);
public async ValueTask<ushort> ReadRarVIntUInt16Async(
int maxBytes = 3,
CancellationToken cancellationToken = default
) =>
// hopefully this gets inlined
checked(
(ushort)
await DoReadRarVIntUInt32Async((maxBytes - 1) * 7, cancellationToken)
.ConfigureAwait(false)
);
public async ValueTask<byte> ReadRarVIntByteAsync(
int maxBytes = 2,
CancellationToken cancellationToken = default
) =>
// hopefully this gets inlined
checked(
(byte)
await DoReadRarVIntUInt32Async((maxBytes - 1) * 7, cancellationToken)
.ConfigureAwait(false)
);
public async ValueTask SkipAsync(int count, CancellationToken cancellationToken = default)
{
CurrentReadByteCount += count;
await _reader.SkipAsync(count, cancellationToken).ConfigureAwait(false);
}
private async ValueTask<uint> DoReadRarVIntUInt32Async(
int maxShift,
CancellationToken cancellationToken = default
)
{
var shift = 0;
uint result = 0;
do
{
var b0 = await ReadByteAsync(cancellationToken).ConfigureAwait(false);
var b1 = ((uint)b0) & 0x7f;
var n = b1;
var shifted = n << shift;
if (n != shifted >> shift)
{
// overflow
break;
}
result |= shifted;
if (b0 == b1)
{
return result;
}
shift += 7;
} while (shift <= maxShift);
throw new FormatException("malformed vint");
}
}
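DoReadRarVIntAsync above decodes RAR5-style variable-length integers: 7 payload bits per byte, least-significant group first, with the high bit of each byte as a continuation flag (the `b0 == b1` test is true exactly when the high bit is clear). An illustrative Python round-trip of the same encoding:

```python
def encode_vint(value: int) -> bytes:
    """Encode a non-negative integer as a RAR5-style vint:
    7 bits per byte, least-significant group first, high bit = continue."""
    out = bytearray()
    while True:
        b = value & 0x7F
        value >>= 7
        if value:
            out.append(b | 0x80)   # more bytes follow
        else:
            out.append(b)          # high bit clear terminates the vint
            return bytes(out)

def decode_vint(data: bytes) -> int:
    """Mirror of the async decoder above: accumulate 7-bit groups
    until a byte with the high bit clear is seen."""
    result = 0
    shift = 0
    for b in data:
        result |= (b & 0x7F) << shift
        if not (b & 0x80):
            return result
        shift += 7
    raise ValueError("malformed vint")
```

For example, 300 (binary 1_0010_1100) splits into the groups 0x2C and 0x02, so it encodes as the two bytes 0xAC, 0x02.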

Some files were not shown because too many files have changed in this diff.