Compare commits


6 Commits

Author  SHA1  Message  Date
Adam Hathcock  0dfdba3ea1  Change more ReadBytes  2016-10-07 16:02:08 +01:00
Adam Hathcock  3d4b1904ca  use pool in MarkingBinaryReader and BinaryReader  2016-10-07 15:41:31 +01:00
Adam Hathcock  cd7e480e51  Dumb mistake on buffer usage  2016-10-07 11:56:31 +01:00
Adam Hathcock  4f092f4c44  update xunit  2016-10-07 11:51:50 +01:00
Adam Hathcock  aa55e1643c  Merge branch 'master' into system_buffers  2016-10-07 11:49:29 +01:00
Adam Hathcock  771986c50c  Start using buffers. Minimal in algorithms for now.  2016-10-03 11:09:09 +01:00
254 changed files with 2124 additions and 2825 deletions

.gitignore (vendored)

@@ -10,7 +10,6 @@ TestResults/
*.nupkg
packages/*/
project.lock.json
tests/TestArchives/Scratch
test/TestArchives/Scratch
.vs
tools
.vscode


@@ -1,13 +0,0 @@
dist: trusty
language: csharp
cache:
directories:
- $HOME/.dotnet
solution: SharpCompress.sln
matrix:
include:
- dotnet: 1.0.4
mono: none
env: DOTNETCORE=1
script:
- ./build.sh


@@ -1,28 +1,25 @@
# Archive Formats
## Accessing Archives
Archive classes allow random access to a seekable stream.
Reader classes allow forward-only reading.
Writer classes allow forward-only writing.
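The three access styles can be sketched as follows. This is an illustrative outline based on the usage examples later in this document; exact option types and method overloads may differ between SharpCompress versions:

```csharp
using System.IO;
using SharpCompress.Archives.Zip;   // Archive API: random access over a seekable stream
using SharpCompress.Common;
using SharpCompress.Readers;        // Reader API: forward-only reading
using SharpCompress.Writers;        // Writer API: forward-only writing

class AccessStyles
{
    static void Demo()
    {
        // Archive API: requires seekability; entries can be visited in any order.
        using (var archive = ZipArchive.Open("archive.zip"))
        {
            foreach (var entry in archive.Entries)
            {
                // inspect entry metadata or open its stream at will
            }
        }

        // Reader API: forward-only, so it also works on non-seekable streams.
        using (Stream stream = File.OpenRead("archive.zip"))
        using (var reader = ReaderFactory.Open(stream))
        {
            while (reader.MoveToNextEntry())
            {
                // process reader.Entry strictly in stream order
            }
        }

        // Writer API: forward-only writing to an output stream.
        using (Stream stream = File.OpenWrite("out.zip"))
        using (var writer = WriterFactory.Open(stream, ArchiveType.Zip, CompressionType.Deflate))
        {
            writer.WriteAll("D:\\temp", "*", SearchOption.AllDirectories);
        }
    }
}
```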
## Supported Format Table
| Archive Format | Compression Format(s) | Compress/Decompress | Archive API | Reader API | Writer API |
| --- | --- | --- | --- | --- | --- |
| Rar | Rar | Decompress (1) | RarArchive | RarReader | N/A |
| Zip (2) | None, DEFLATE, BZip2, LZMA/LZMA2, PPMd | Both | ZipArchive | ZipReader | ZipWriter |
| Tar | None, BZip2, GZip, LZip | Both | TarArchive | TarReader | TarWriter (3) |
| Tar | None, BZip2, GZip | Both | TarArchive | TarReader | TarWriter (3) |
| GZip (single file) | GZip | Both | GZipArchive | GZipReader | GZipWriter |
| 7Zip (4) | LZMA, LZMA2, BZip2, PPMd, BCJ, BCJ2, Deflate | Decompress | SevenZipArchive | N/A | N/A |
1. SOLID Rars are only supported in the RarReader API.
2. Zip format supports pkware and WinzipAES encryption. However, encrypted LZMA is not supported. Zip64 reading is supported.
2. Zip format supports pkware and WinzipAES encryption. However, encrypted LZMA is not supported.
3. The Tar format requires a file size in the header. If no size is specified to the TarWriter and the stream is not seekable, then an exception will be thrown.
4. The 7Zip format doesn't allow for reading as a forward-only stream, so 7Zip is only supported through the Archive API.
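Because 7Zip cannot be read forward-only (note 4), it is consumed through the Archive API only. A hedged sketch, assuming the `SevenZipArchive` type named in the table above:

```csharp
using SharpCompress.Archives.SevenZip;

// 7Zip requires a seekable stream, so only the Archive API applies
// (there is no SevenZipReader or SevenZipWriter).
using (var archive = SevenZipArchive.Open("archive.7z"))
{
    foreach (var entry in archive.Entries)
    {
        if (!entry.IsDirectory)
        {
            using (var entryStream = entry.OpenEntryStream())
            {
                // consume decompressed bytes; entry order is driven by the
                // archive index, not by stream position
            }
        }
    }
}
```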
## Compressors
For those who want to directly compress/decompress bits
| Compressor | Compress/Decompress |
@@ -33,4 +30,3 @@ For those who want to directly compress/decompress bits
| LZMAStream | Both |
| PPMdStream | Both |
| ADCStream | Decompress |
| LZipStream | Decompress |
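For direct stream-level work, a compressor stream wraps an underlying stream. A minimal sketch using the GZip compressor; the namespace is an assumption here, as it has moved between SharpCompress versions:

```csharp
using System.IO;
using SharpCompress.Compressors;            // CompressionMode (namespace is version-dependent)
using SharpCompress.Compressors.Deflate;    // GZipStream (namespace is version-dependent)

// Compress a file's bytes with the raw GZip compressor stream,
// without any archive container around it.
using (Stream input = File.OpenRead("input.txt"))
using (Stream output = File.Create("input.txt.gz"))
using (var gzip = new GZipStream(output, CompressionMode.Compress))
{
    input.CopyTo(gzip);
}
```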


@@ -1,15 +1,11 @@
# SharpCompress
SharpCompress is a compression library in pure C# for .NET 3.5, 4.5, and .NET Standard 1.0/1.3 that can unrar, un7zip, unzip, untar, unbzip2, and ungzip with forward-only reading and file random access APIs. Write support for zip/tar/bzip2/gzip is implemented.
SharpCompress is a compression library for .NET/Mono/Silverlight/WP7 that can unrar, un7zip, unzip, untar, unbzip2, and ungzip with forward-only reading and file random access APIs. Write support for zip/tar/bzip2/gzip is implemented.
The major feature is support for non-seekable streams, so large files can be processed on the fly (e.g. a download stream).
AppVeyor Build -
[![Build status](https://ci.appveyor.com/api/projects/status/voxg971oemmvxh1e/branch/master?svg=true)](https://ci.appveyor.com/project/adamhathcock/sharpcompress/branch/master)
Travis CI Build -
[![Build Status](https://travis-ci.org/adamhathcock/sharpcompress.svg?branch=master)](https://travis-ci.org/adamhathcock/sharpcompress)
## Need Help?
Post Issues on Github!
@@ -29,52 +25,12 @@ I'm always looking for help or ideas. Please submit code or email with ideas. Un
* RAR 5 support
* 7Zip writing
* Zip64 (Need writing and extend Reading)
* Zip64
* Multi-volume Zip support.
* RAR5 support
## Version Log
### Version 0.16.2
* Fix [.NET 3.5 should support files and cryptography (was a regression from 0.16.0)](https://github.com/adamhathcock/sharpcompress/pull/251)
* Fix [Zip per entry compression customization wrote the wrong method into the zip archive ](https://github.com/adamhathcock/sharpcompress/pull/249)
### Version 0.16.1
* Fix [Preserve compression method when getting a compressed stream](https://github.com/adamhathcock/sharpcompress/pull/235)
* Fix [RAR entry key normalization fix](https://github.com/adamhathcock/sharpcompress/issues/201)
### Version 0.16.0
* Breaking - [Progress Event Tracking rethink](https://github.com/adamhathcock/sharpcompress/pull/226)
* Update to VS2017 - [VS2017](https://github.com/adamhathcock/sharpcompress/pull/231) - Framework targets have been changed.
* New - [Add Zip64 writing](https://github.com/adamhathcock/sharpcompress/pull/211)
* [Fix invalid/mismatching Zip version flags.](https://github.com/adamhathcock/sharpcompress/issues/164) - This allows nuget/System.IO.Packaging to read zip files generated by SharpCompress
* [Fix 7Zip directory hiding](https://github.com/adamhathcock/sharpcompress/pull/215/files)
* [Verify RAR CRC headers](https://github.com/adamhathcock/sharpcompress/pull/220)
### Version 0.15.2
* [Fix invalid headers](https://github.com/adamhathcock/sharpcompress/pull/210) - fixes an issue creating large-ish zip archives that was introduced with zip64 reading.
### Version 0.15.1
* [Zip64 extending information and ZipReader](https://github.com/adamhathcock/sharpcompress/pull/206)
### Version 0.15.0
* [Add zip64 support for ZipArchive extraction](https://github.com/adamhathcock/sharpcompress/pull/205)
### Version 0.14.1
* [.NET Assemblies aren't strong named](https://github.com/adamhathcock/sharpcompress/issues/158)
* [Pkware encryption for Zip files didn't allow for multiple reads of an entry](https://github.com/adamhathcock/sharpcompress/issues/197)
* [GZip Entry couldn't be read multiple times](https://github.com/adamhathcock/sharpcompress/issues/198)
### Version 0.14.0
* [Support for LZip reading in for Tars](https://github.com/adamhathcock/sharpcompress/pull/191)
### Version 0.13.1
* [Fix null password on ReaderFactory. Fix null options on SevenZipArchive](https://github.com/adamhathcock/sharpcompress/pull/188)
@@ -157,6 +113,8 @@ I'm always looking for help or ideas. Please submit code or email with ideas. Un
* Embedded some BouncyCastle crypto classes to allow RAR Decryption and Winzip AES Decryption in Portable and Windows Store DLLs
* Built in Release (I think)
Some Help/Discussion: https://sharpcompress.codeplex.com/discussions
7Zip implementation based on: https://code.google.com/p/managed-lzma/
LICENSE


@@ -1,38 +1,44 @@
Microsoft Visual Studio Solution File, Format Version 12.00
# Visual Studio 15
VisualStudioVersion = 15.0.26430.6
MinimumVisualStudioVersion = 10.0.40219.1
Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "Solution Items", "Solution Items", "{F18F1765-4A02-42FD-9BEF-F0E2FCBD9D17}"
EndProject
Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "src", "src", "{3C5BE746-03E5-4895-9988-0B57F162F86C}"
EndProject
Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "tests", "tests", "{0F0901FF-E8D9-426A-B5A2-17C7F47C1529}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "SharpCompress", "src\SharpCompress\SharpCompress.csproj", "{FD19DDD8-72B2-4024-8665-0D1F7A2AA998}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "SharpCompress.Test", "tests\SharpCompress.Test\SharpCompress.Test.csproj", "{F2B1A1EB-0FA6-40D0-8908-E13247C7226F}"
EndProject
Global
GlobalSection(SolutionConfigurationPlatforms) = preSolution
Debug|Any CPU = Debug|Any CPU
Release|Any CPU = Release|Any CPU
EndGlobalSection
GlobalSection(ProjectConfigurationPlatforms) = postSolution
{FD19DDD8-72B2-4024-8665-0D1F7A2AA998}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{FD19DDD8-72B2-4024-8665-0D1F7A2AA998}.Debug|Any CPU.Build.0 = Debug|Any CPU
{FD19DDD8-72B2-4024-8665-0D1F7A2AA998}.Release|Any CPU.ActiveCfg = Release|Any CPU
{FD19DDD8-72B2-4024-8665-0D1F7A2AA998}.Release|Any CPU.Build.0 = Release|Any CPU
{F2B1A1EB-0FA6-40D0-8908-E13247C7226F}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{F2B1A1EB-0FA6-40D0-8908-E13247C7226F}.Debug|Any CPU.Build.0 = Debug|Any CPU
{F2B1A1EB-0FA6-40D0-8908-E13247C7226F}.Release|Any CPU.ActiveCfg = Release|Any CPU
{F2B1A1EB-0FA6-40D0-8908-E13247C7226F}.Release|Any CPU.Build.0 = Release|Any CPU
EndGlobalSection
GlobalSection(SolutionProperties) = preSolution
HideSolutionNode = FALSE
EndGlobalSection
GlobalSection(NestedProjects) = preSolution
{FD19DDD8-72B2-4024-8665-0D1F7A2AA998} = {3C5BE746-03E5-4895-9988-0B57F162F86C}
{F2B1A1EB-0FA6-40D0-8908-E13247C7226F} = {0F0901FF-E8D9-426A-B5A2-17C7F47C1529}
EndGlobalSection
EndGlobal
Microsoft Visual Studio Solution File, Format Version 12.00
# Visual Studio 14
VisualStudioVersion = 14.0.24720.0
MinimumVisualStudioVersion = 10.0.40219.1
Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "Solution Items", "Solution Items", "{F18F1765-4A02-42FD-9BEF-F0E2FCBD9D17}"
ProjectSection(SolutionItems) = preProject
global.json = global.json
EndProjectSection
EndProject
Project("{8BB2217D-0F2D-49D1-97BC-3654ED321F3B}") = "SharpCompress", "src\SharpCompress\SharpCompress.xproj", "{FD19DDD8-72B2-4024-8665-0D1F7A2AA998}"
EndProject
Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "src", "src", "{3C5BE746-03E5-4895-9988-0B57F162F86C}"
EndProject
Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "test", "test", "{0F0901FF-E8D9-426A-B5A2-17C7F47C1529}"
EndProject
Project("{8BB2217D-0F2D-49D1-97BC-3654ED321F3B}") = "SharpCompress.Test", "test\SharpCompress.Test\SharpCompress.Test.xproj", "{3B80E585-A2F3-4666-8F69-C7FFDA0DD7E5}"
ProjectSection(ProjectDependencies) = postProject
{FD19DDD8-72B2-4024-8665-0D1F7A2AA998} = {FD19DDD8-72B2-4024-8665-0D1F7A2AA998}
EndProjectSection
EndProject
Global
GlobalSection(SolutionConfigurationPlatforms) = preSolution
Debug|Any CPU = Debug|Any CPU
Release|Any CPU = Release|Any CPU
EndGlobalSection
GlobalSection(ProjectConfigurationPlatforms) = postSolution
{FD19DDD8-72B2-4024-8665-0D1F7A2AA998}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{FD19DDD8-72B2-4024-8665-0D1F7A2AA998}.Debug|Any CPU.Build.0 = Debug|Any CPU
{FD19DDD8-72B2-4024-8665-0D1F7A2AA998}.Release|Any CPU.ActiveCfg = Release|Any CPU
{FD19DDD8-72B2-4024-8665-0D1F7A2AA998}.Release|Any CPU.Build.0 = Release|Any CPU
{3B80E585-A2F3-4666-8F69-C7FFDA0DD7E5}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{3B80E585-A2F3-4666-8F69-C7FFDA0DD7E5}.Debug|Any CPU.Build.0 = Debug|Any CPU
{3B80E585-A2F3-4666-8F69-C7FFDA0DD7E5}.Release|Any CPU.ActiveCfg = Release|Any CPU
{3B80E585-A2F3-4666-8F69-C7FFDA0DD7E5}.Release|Any CPU.Build.0 = Release|Any CPU
EndGlobalSection
GlobalSection(SolutionProperties) = preSolution
HideSolutionNode = FALSE
EndGlobalSection
GlobalSection(NestedProjects) = preSolution
{FD19DDD8-72B2-4024-8665-0D1F7A2AA998} = {3C5BE746-03E5-4895-9988-0B57F162F86C}
{3B80E585-A2F3-4666-8F69-C7FFDA0DD7E5} = {0F0901FF-E8D9-426A-B5A2-17C7F47C1529}
EndGlobalSection
EndGlobal

Binary file not shown.


@@ -80,7 +80,7 @@ using (var archive = RarArchive.Open("Test.rar"))
### Use ReaderFactory to autodetect archive type and Open the entry stream
```C#
using (Stream stream = File.OpenRead("Tar.tar.bz2"))
using (var reader = ReaderFactory.Open(stream))
{
while (reader.MoveToNextEntry())
@@ -101,7 +101,7 @@ using (var reader = ReaderFactory.Open(stream))
### Use ReaderFactory to autodetect archive type and Open the entry stream
```C#
using (Stream stream = File.OpenRead("Tar.tar.bz2"))
using (var reader = ReaderFactory.Open(stream))
{
while (reader.MoveToNextEntry())
@@ -128,4 +128,4 @@ using (var writer = WriterFactory.Open(stream, ArchiveType.Tar, new WriterOption
{
writer.WriteAll("D:\\temp", "*", SearchOption.AllDirectories);
}
```
```
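Building on the factory examples above, entries can also be extracted straight to disk. A sketch assuming the `WriteEntryToDirectory` helper, whose options parameter has varied in name and shape across SharpCompress releases:

```csharp
using System.IO;
using SharpCompress.Readers;

using (Stream stream = File.OpenRead("Tar.tar.bz2"))
using (var reader = ReaderFactory.Open(stream))
{
    while (reader.MoveToNextEntry())
    {
        if (!reader.Entry.IsDirectory)
        {
            // Extracts the current entry into the target folder.
            reader.WriteEntryToDirectory("D:\\output");
        }
    }
}
```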


@@ -1,20 +1,17 @@
version: '{build}'
image: Visual Studio 2017
version: '0.13.{build}'
pull_requests:
do_not_increment_build_number: true
branches:
only:
- master
nuget:
disable_publish_on_pr: true
init:
- git config --global core.autocrlf true
build_script:
- ps: .\build.ps1
test: off
cache:
- tools -> build.cake
- tools -> build.ps1
artifacts:
- path: src\SharpCompress\bin\Release\*.nupkg
- path: nupkgs\*.nupkg
name: NuPkgs


@@ -1,93 +1,229 @@
#addin "Cake.Json"
#addin "nuget:?package=NuGet.Core"
using NuGet;
//////////////////////////////////////////////////////////////////////
// ARGUMENTS
//////////////////////////////////////////////////////////////////////
var target = Argument("target", "Default");
var tag = Argument("tag", "cake");
var apiKey = Argument("apiKey", "");
var repo = Argument("repo", "");
//////////////////////////////////////////////////////////////////////
// PREPARATION
//////////////////////////////////////////////////////////////////////
var sources = new [] { "https://api.nuget.org/v3/index.json" };
var publishTarget = "";
Warning("=============");
var globalPath = MakeFullPath("global.json");
var nupkgs = MakeFullPath("nupkgs");
Warning("Operating on global.json: " + globalPath);
Warning("=============");
//////////////////////////////////////////////////////////////////////
// FUNCTIONS
//////////////////////////////////////////////////////////////////////
string MakeFullPath(string relativePath)
{
if (string.IsNullOrEmpty(repo))
{
return MakeAbsolute(new DirectoryPath(relativePath)).ToString();
}
if (!System.IO.Path.IsPathRooted(repo))
{
return MakeAbsolute(new DirectoryPath(System.IO.Path.Combine(repo,relativePath))).ToString();
}
return System.IO.Path.Combine(repo, relativePath);
}
IEnumerable<string> GetAllProjects()
{
var global = DeserializeJsonFromFile<JObject>(globalPath);
var projs = global["projects"].Select(x => x.ToString());
foreach(var y in projs)
{
yield return MakeFullPath(y);
}
}
IEnumerable<string> GetSourceProjects()
{
return GetAllProjects().Where(x => x.EndsWith("src"));
}
IEnumerable<string> GetTestProjects()
{
return GetAllProjects().Where(x => x.EndsWith("test"));
}
IEnumerable<string> GetFrameworks(string path)
{
var projectJObject = DeserializeJsonFromFile<JObject>(path);
foreach(var prop in ((JObject)projectJObject["frameworks"]).Properties())
{
yield return prop.Name;
}
}
string GetVersion(string path)
{
var projectJObject = DeserializeJsonFromFile<JObject>(path);
return ((JToken)projectJObject["version"]).ToString();
}
IEnumerable<string> GetProjectJsons(IEnumerable<string> projects)
{
foreach(var proj in projects)
{
foreach(var projectJson in GetFiles(proj + "/**/project.json"))
{
yield return MakeFullPath(projectJson.ToString());
}
}
}
bool IsNuGetPublished (FilePath file, string nugetSource)
{
var pkg = new ZipPackage(file.ToString());
var repo = PackageRepositoryFactory.Default.CreateRepository(nugetSource);
var packages = repo.FindPackagesById(pkg.Id);
var version = SemanticVersion.Parse(pkg.Version.ToString());
//Check whether any published package matches this exact version
var exists = packages.Any (p => p.Version == version);
return exists;
}
//////////////////////////////////////////////////////////////////////
// TASKS
//////////////////////////////////////////////////////////////////////
Task("Restore")
.Does(() =>
.Does(() =>
{
DotNetCoreRestore(".");
var settings = new DotNetCoreRestoreSettings
{
Sources = sources,
NoCache = true
};
foreach(var project in GetProjectJsons(GetSourceProjects().Concat(GetTestProjects())))
{
DotNetCoreRestore(project, settings);
}
});
Task("Build")
.IsDependentOn("Restore")
.Does(() =>
.Does(() =>
{
if (IsRunningOnWindows())
var settings = new DotNetCoreBuildSettings
{
MSBuild("./sharpcompress.sln", c =>
{
c.SetConfiguration("Release")
.SetVerbosity(Verbosity.Minimal)
.UseToolVersion(MSBuildToolVersion.VS2017);
});
}
else
Configuration = "Release"
};
foreach(var project in GetProjectJsons(GetSourceProjects().Concat(GetTestProjects())))
{
var settings = new DotNetCoreBuildSettings
foreach(var framework in GetFrameworks(project))
{
Framework = "netstandard1.0",
Configuration = "Release"
};
DotNetCoreBuild("./src/SharpCompress/SharpCompress.csproj", settings);
settings.Framework = "netcoreapp1.1";
DotNetCoreBuild("./tests/SharpCompress.Test/SharpCompress.Test.csproj", settings);
}
Information("Building: {0} on Framework: {1}", project, framework);
Information("========");
settings.Framework = framework;
DotNetCoreBuild(project, settings);
}
}
});
Task("Test")
.IsDependentOn("Build")
.Does(() =>
{
if (!bool.Parse(EnvironmentVariable("APPVEYOR") ?? "false")
&& !bool.Parse(EnvironmentVariable("TRAVIS") ?? "false"))
.Does(() =>
{
var settings = new DotNetCoreTestSettings
{
var files = GetFiles("tests/**/*.csproj");
foreach(var file in files)
{
var settings = new DotNetCoreTestSettings
{
Configuration = "Release"
};
DotNetCoreTest(file.ToString(), settings);
}
}
else
{
Information("Skipping tests as this is AppVeyor or Travis CI");
Configuration = "Release",
Verbose = true
};
foreach(var project in GetProjectJsons(GetTestProjects()))
{
settings.Framework = GetFrameworks(project).First();
DotNetCoreTest(project.ToString(), settings);
}
}).ReportError(exception =>
{
Error(exception.ToString());
});
Task("Pack")
.IsDependentOn("Build")
.Does(() =>
{
if (IsRunningOnWindows())
.Does(() =>
{
if (DirectoryExists(nupkgs))
{
MSBuild("src/SharpCompress/SharpCompress.csproj", c => c
.SetConfiguration("Release")
.SetVerbosity(Verbosity.Minimal)
.UseToolVersion(MSBuildToolVersion.VS2017)
.WithProperty("NoBuild", "true")
.WithTarget("Pack"));
}
else
{
Information("Skipping Pack as this is not Windows");
DeleteDirectory(nupkgs, true);
}
CreateDirectory(nupkgs);
var settings = new DotNetCorePackSettings
{
Configuration = "Release",
OutputDirectory = nupkgs
};
foreach(var project in GetProjectJsons(GetSourceProjects()))
{
DotNetCorePack(project, settings);
}
});
Task("Publish")
.IsDependentOn("Restore")
.IsDependentOn("Build")
.IsDependentOn("Test")
.IsDependentOn("Pack")
.Does(() =>
{
var packages = GetFiles(nupkgs + "/*.nupkg");
foreach(var package in packages)
{
if (package.ToString().Contains("symbols"))
{
Warning("Skipping Symbols package " + package);
continue;
}
if (IsNuGetPublished(package, sources[0]))
{
throw new InvalidOperationException(package + " is already published.");
}
NuGetPush(package, new NuGetPushSettings{
ApiKey = apiKey,
Verbosity = NuGetVerbosity.Detailed,
Source = publishTarget
});
}
});
//////////////////////////////////////////////////////////////////////
// TASK TARGETS
//////////////////////////////////////////////////////////////////////
Task("Default")
.IsDependentOn("Restore")
.IsDependentOn("Build")
.IsDependentOn("Test")
.IsDependentOn("Pack");
Task("RunTests")
.IsDependentOn("Restore")
.IsDependentOn("Build")
.IsDependentOn("Test");
//////////////////////////////////////////////////////////////////////
// EXECUTION
//////////////////////////////////////////////////////////////////////
RunTarget(target);

build.ps1

@@ -1,41 +1,22 @@
##########################################################################
# This is the Cake bootstrapper script for PowerShell.
# This file was downloaded from https://github.com/cake-build/resources
# Feel free to change this file to fit your needs.
##########################################################################
<#
.SYNOPSIS
This is a Powershell script to bootstrap a Cake build.
.DESCRIPTION
This Powershell script will download NuGet if missing, restore NuGet tools (including Cake)
and execute your Cake build script with the parameters you provide.
.PARAMETER Script
The build script to execute.
.PARAMETER Target
The build script target to run.
.PARAMETER Configuration
The build configuration to use.
.PARAMETER Verbosity
Specifies the amount of information to be displayed.
.PARAMETER Experimental
Tells Cake to use the latest Roslyn release.
.PARAMETER WhatIf
Performs a dry run of the build script.
No tasks will be executed.
.PARAMETER Mono
Tells Cake to use the Mono scripting engine.
.PARAMETER SkipToolPackageRestore
Skips restoring of packages.
.PARAMETER ScriptArgs
Remaining arguments are added here.
.LINK
http://cakebuild.net
#>
[CmdletBinding()]
@@ -46,183 +27,104 @@ Param(
[string]$Configuration = "Release",
[ValidateSet("Quiet", "Minimal", "Normal", "Verbose", "Diagnostic")]
[string]$Verbosity = "Verbose",
[switch]$Experimental,
[Alias("DryRun","Noop")]
[switch]$WhatIf,
[switch]$Mono,
[switch]$SkipToolPackageRestore,
[Parameter(Position=0,Mandatory=$false,ValueFromRemainingArguments=$true)]
[string[]]$ScriptArgs
)
[Reflection.Assembly]::LoadWithPartialName("System.Security") | Out-Null
function MD5HashFile([string] $filePath)
{
if ([string]::IsNullOrEmpty($filePath) -or !(Test-Path $filePath -PathType Leaf))
{
return $null
}
[System.IO.Stream] $file = $null;
[System.Security.Cryptography.MD5] $md5 = $null;
try
{
$md5 = [System.Security.Cryptography.MD5]::Create()
$file = [System.IO.File]::OpenRead($filePath)
return [System.BitConverter]::ToString($md5.ComputeHash($file))
}
finally
{
if ($file -ne $null)
{
$file.Dispose()
}
}
}
Write-Host "Preparing to run build script..."
if(!$PSScriptRoot){
$PSScriptRoot = Split-Path $MyInvocation.MyCommand.Path -Parent
}
$TOOLS_DIR = Join-Path $PSScriptRoot "tools"
$ADDINS_DIR = Join-Path $TOOLS_DIR "addins"
$MODULES_DIR = Join-Path $TOOLS_DIR "modules"
$NUGET_EXE = Join-Path $TOOLS_DIR "nuget.exe"
$CAKE_EXE = Join-Path $TOOLS_DIR "Cake/Cake.exe"
$NUGET_URL = "https://dist.nuget.org/win-x86-commandline/latest/nuget.exe"
$PACKAGES_CONFIG = Join-Path $TOOLS_DIR "packages.config"
$PACKAGES_CONFIG_MD5 = Join-Path $TOOLS_DIR "packages.config.md5sum"
$ADDINS_PACKAGES_CONFIG = Join-Path $ADDINS_DIR "packages.config"
$MODULES_PACKAGES_CONFIG = Join-Path $MODULES_DIR "packages.config"
# Should we use mono?
$UseMono = "";
if($Mono.IsPresent) {
Write-Verbose -Message "Using the Mono based scripting engine."
$UseMono = "-mono"
}
# Should we use the new Roslyn?
$UseExperimental = "";
if($Experimental.IsPresent -and !($Mono.IsPresent)) {
Write-Verbose -Message "Using experimental version of Roslyn."
$UseExperimental = "-experimental"
}
# Is this a dry run?
$UseDryRun = "";
if($WhatIf.IsPresent) {
$UseDryRun = "-dryrun"
}
$CakeVersion = "0.16.1"
$DotNetChannel = "preview";
$DotNetVersion = "1.0.0-preview2-003131";
$DotNetInstallerUri = "https://raw.githubusercontent.com/dotnet/cli/rel/1.0.0-preview2/scripts/obtain/dotnet-install.ps1";
$NugetUrl = "https://dist.nuget.org/win-x86-commandline/latest/nuget.exe"
# Make sure tools folder exists
if ((Test-Path $PSScriptRoot) -and !(Test-Path $TOOLS_DIR)) {
Write-Verbose -Message "Creating tools directory..."
New-Item -Path $TOOLS_DIR -Type directory | out-null
$PSScriptRoot = Split-Path $MyInvocation.MyCommand.Path -Parent
$ToolPath = Join-Path $PSScriptRoot "tools"
if (!(Test-Path $ToolPath)) {
Write-Verbose "Creating tools directory..."
New-Item -Path $ToolPath -Type directory | out-null
}
# Make sure that packages.config exist.
if (!(Test-Path $PACKAGES_CONFIG)) {
Write-Verbose -Message "Downloading packages.config..."
try { (New-Object System.Net.WebClient).DownloadFile("http://cakebuild.net/download/bootstrapper/packages", $PACKAGES_CONFIG) } catch {
Throw "Could not download packages.config."
}
}
###########################################################################
# INSTALL .NET CORE CLI
###########################################################################
# Try find NuGet.exe in path if not exists
if (!(Test-Path $NUGET_EXE)) {
Write-Verbose -Message "Trying to find nuget.exe in PATH..."
$existingPaths = $Env:Path -Split ';' | Where-Object { (![string]::IsNullOrEmpty($_)) -and (Test-Path $_ -PathType Container) }
$NUGET_EXE_IN_PATH = Get-ChildItem -Path $existingPaths -Filter "nuget.exe" | Select -First 1
if ($NUGET_EXE_IN_PATH -ne $null -and (Test-Path $NUGET_EXE_IN_PATH.FullName)) {
Write-Verbose -Message "Found in PATH at $($NUGET_EXE_IN_PATH.FullName)."
$NUGET_EXE = $NUGET_EXE_IN_PATH.FullName
}
}
# Try download NuGet.exe if not exists
if (!(Test-Path $NUGET_EXE)) {
Write-Verbose -Message "Downloading NuGet.exe..."
try {
(New-Object System.Net.WebClient).DownloadFile($NUGET_URL, $NUGET_EXE)
} catch {
Throw "Could not download NuGet.exe."
}
}
# Save nuget.exe path to environment to be available to child processed
$ENV:NUGET_EXE = $NUGET_EXE
# Restore tools from NuGet?
if(-Not $SkipToolPackageRestore.IsPresent) {
Push-Location
Set-Location $TOOLS_DIR
# Check for changes in packages.config and remove installed tools if true.
[string] $md5Hash = MD5HashFile($PACKAGES_CONFIG)
if((!(Test-Path $PACKAGES_CONFIG_MD5)) -Or
($md5Hash -ne (Get-Content $PACKAGES_CONFIG_MD5 ))) {
Write-Verbose -Message "Missing or changed package.config hash..."
Remove-Item * -Recurse -Exclude packages.config,nuget.exe
}
Write-Verbose -Message "Restoring tools from NuGet..."
$NuGetOutput = Invoke-Expression "&`"$NUGET_EXE`" install -ExcludeVersion -OutputDirectory `"$TOOLS_DIR`""
if ($LASTEXITCODE -ne 0) {
Throw "An error occurred while restoring NuGet tools."
}
else
Function Remove-PathVariable([string]$VariableToRemove)
{
$path = [Environment]::GetEnvironmentVariable("PATH", "User")
if ($path -ne $null)
{
$md5Hash | Out-File $PACKAGES_CONFIG_MD5 -Encoding "ASCII"
$newItems = $path.Split(';', [StringSplitOptions]::RemoveEmptyEntries) | Where-Object { "$($_)" -inotlike $VariableToRemove }
[Environment]::SetEnvironmentVariable("PATH", [System.String]::Join(';', $newItems), "User")
}
$path = [Environment]::GetEnvironmentVariable("PATH", "Process")
if ($path -ne $null)
{
$newItems = $path.Split(';', [StringSplitOptions]::RemoveEmptyEntries) | Where-Object { "$($_)" -inotlike $VariableToRemove }
[Environment]::SetEnvironmentVariable("PATH", [System.String]::Join(';', $newItems), "Process")
}
Write-Verbose -Message ($NuGetOutput | out-string)
Pop-Location
}
# Restore addins from NuGet
if (Test-Path $ADDINS_PACKAGES_CONFIG) {
Push-Location
Set-Location $ADDINS_DIR
# Get .NET Core CLI path if installed.
$FoundDotNetCliVersion = $null;
if (Get-Command dotnet -ErrorAction SilentlyContinue) {
$FoundDotNetCliVersion = dotnet --version;
}
Write-Verbose -Message "Restoring addins from NuGet..."
$NuGetOutput = Invoke-Expression "&`"$NUGET_EXE`" install -ExcludeVersion -OutputDirectory `"$ADDINS_DIR`""
if($FoundDotNetCliVersion -ne $DotNetVersion) {
$InstallPath = Join-Path $PSScriptRoot ".dotnet"
if (!(Test-Path $InstallPath)) {
mkdir -Force $InstallPath | Out-Null;
}
(New-Object System.Net.WebClient).DownloadFile($DotNetInstallerUri, "$InstallPath\dotnet-install.ps1");
& $InstallPath\dotnet-install.ps1 -Channel $DotNetChannel -Version $DotNetVersion -InstallDir $InstallPath;
Remove-PathVariable "$InstallPath"
$env:PATH = "$InstallPath;$env:PATH"
$env:DOTNET_SKIP_FIRST_TIME_EXPERIENCE=1
$env:DOTNET_CLI_TELEMETRY_OPTOUT=1
}
###########################################################################
# INSTALL NUGET
###########################################################################
# Make sure nuget.exe exists.
$NugetPath = Join-Path $ToolPath "nuget.exe"
if (!(Test-Path $NugetPath)) {
Write-Host "Downloading NuGet.exe..."
(New-Object System.Net.WebClient).DownloadFile($NugetUrl, $NugetPath);
}
###########################################################################
# INSTALL CAKE
###########################################################################
# Make sure Cake has been installed.
$CakePath = Join-Path $ToolPath "Cake.$CakeVersion/Cake.exe"
if (!(Test-Path $CakePath)) {
Write-Host "Installing Cake..."
Invoke-Expression "&`"$NugetPath`" install Cake -Version $CakeVersion -OutputDirectory `"$ToolPath`"" | Out-Null;
if ($LASTEXITCODE -ne 0) {
Throw "An error occurred while restoring NuGet addins."
Throw "An error occurred while restoring Cake from NuGet."
}
Write-Verbose -Message ($NuGetOutput | out-string)
Pop-Location
}
# Restore modules from NuGet
if (Test-Path $MODULES_PACKAGES_CONFIG) {
Push-Location
Set-Location $MODULES_DIR
###########################################################################
# RUN BUILD SCRIPT
###########################################################################
Write-Verbose -Message "Restoring modules from NuGet..."
$NuGetOutput = Invoke-Expression "&`"$NUGET_EXE`" install -ExcludeVersion -OutputDirectory `"$MODULES_DIR`""
if ($LASTEXITCODE -ne 0) {
Throw "An error occurred while restoring NuGet modules."
}
Write-Verbose -Message ($NuGetOutput | out-string)
Pop-Location
}
# Make sure that Cake has been installed.
if (!(Test-Path $CAKE_EXE)) {
Throw "Could not find Cake.exe at $CAKE_EXE"
}
# Build the argument list.
$Arguments = @{
target=$Target;
configuration=$Configuration;
verbosity=$Verbosity;
dryrun=$WhatIf;
}.GetEnumerator() | %{"--{0}=`"{1}`"" -f $_.key, $_.value };
# Start Cake
Write-Host "Running build script..."
Invoke-Expression "& `"$CAKE_EXE`" `"$Script`" -target=`"$Target`" -configuration=`"$Configuration`" -verbosity=`"$Verbosity`" $UseMono $UseDryRun $UseExperimental $ScriptArgs"
Invoke-Expression "& `"$CakePath`" `"$Script`" $Arguments $ScriptArgs"
exit $LASTEXITCODE


@@ -1,42 +0,0 @@
#!/usr/bin/env bash
##########################################################################
# This is the Cake bootstrapper script for Linux and OS X.
# This file was downloaded from https://github.com/cake-build/resources
# Feel free to change this file to fit your needs.
##########################################################################
# Define directories.
SCRIPT_DIR=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )
TOOLS_DIR=$SCRIPT_DIR/tools
CAKE_VERSION=0.19.1
CAKE_DLL=$TOOLS_DIR/Cake.CoreCLR.$CAKE_VERSION/Cake.dll
# Make sure the tools folder exist.
if [ ! -d "$TOOLS_DIR" ]; then
mkdir "$TOOLS_DIR"
fi
###########################################################################
# INSTALL CAKE
###########################################################################
if [ ! -f "$CAKE_DLL" ]; then
curl -Lsfo Cake.CoreCLR.zip "https://www.nuget.org/api/v2/package/Cake.CoreCLR/$CAKE_VERSION" && unzip -q Cake.CoreCLR.zip -d "$TOOLS_DIR/Cake.CoreCLR.$CAKE_VERSION" && rm -f Cake.CoreCLR.zip
if [ $? -ne 0 ]; then
echo "An error occurred while installing Cake."
exit 1
fi
fi
# Make sure that Cake has been installed.
if [ ! -f "$CAKE_DLL" ]; then
echo "Could not find Cake.exe at '$CAKE_DLL'."
exit 1
fi
###########################################################################
# RUN BUILD SCRIPT
###########################################################################
# Start Cake
exec dotnet "$CAKE_DLL" "$@"

global.json (new file)

@@ -0,0 +1,3 @@
{
"projects": ["src","test"]
}


@@ -61,12 +61,18 @@ namespace SharpCompress.Archives
void IArchiveExtractionListener.FireEntryExtractionBegin(IArchiveEntry entry)
{
EntryExtractionBegin?.Invoke(this, new ArchiveExtractionEventArgs<IArchiveEntry>(entry));
if (EntryExtractionBegin != null)
{
EntryExtractionBegin(this, new ArchiveExtractionEventArgs<IArchiveEntry>(entry));
}
}
void IArchiveExtractionListener.FireEntryExtractionEnd(IArchiveEntry entry)
{
EntryExtractionEnd?.Invoke(this, new ArchiveExtractionEventArgs<IArchiveEntry>(entry));
if (EntryExtractionEnd != null)
{
EntryExtractionEnd(this, new ArchiveExtractionEventArgs<IArchiveEntry>(entry));
}
}
private static Stream CheckStreams(Stream stream)
@@ -123,21 +129,27 @@ namespace SharpCompress.Archives
void IExtractionListener.FireCompressedBytesRead(long currentPartCompressedBytes, long compressedReadBytes)
{
CompressedBytesRead?.Invoke(this, new CompressedBytesReadEventArgs
if (CompressedBytesRead != null)
{
CurrentFilePartCompressedBytesRead = currentPartCompressedBytes,
CompressedBytesRead = compressedReadBytes
});
CompressedBytesRead(this, new CompressedBytesReadEventArgs
{
CurrentFilePartCompressedBytesRead = currentPartCompressedBytes,
CompressedBytesRead = compressedReadBytes
});
}
}
void IExtractionListener.FireFilePartExtractionBegin(string name, long size, long compressedSize)
{
FilePartExtractionBegin?.Invoke(this, new FilePartExtractionBeginEventArgs
if (FilePartExtractionBegin != null)
{
CompressedSize = compressedSize,
Size = size,
Name = name
});
FilePartExtractionBegin(this, new FilePartExtractionBeginEventArgs
{
CompressedSize = compressedSize,
Size = size,
Name = name
});
}
}
/// <summary>


@@ -4,6 +4,7 @@ using System.IO;
using System.Linq;
using SharpCompress.Common;
using SharpCompress.Common.GZip;
using SharpCompress.IO;
using SharpCompress.Readers;
using SharpCompress.Readers.GZip;
using SharpCompress.Writers;
@@ -103,26 +104,28 @@ namespace SharpCompress.Archives.GZip
public static bool IsGZipFile(Stream stream)
{
// read the header on the first read
byte[] header = new byte[10];
int n = stream.Read(header, 0, header.Length);
// workitem 8501: handle edge case (decompress empty stream)
if (n == 0)
using (var header = ByteArrayPool.RentScope(10))
{
return false;
}
int n = stream.Read(header);
if (n != 10)
{
return false;
}
// workitem 8501: handle edge case (decompress empty stream)
if (n == 0)
{
return false;
}
if (header[0] != 0x1F || header[1] != 0x8B || header[2] != 8)
{
return false;
}
if (n != 10)
{
return false;
}
return true;
if (header[0] != 0x1F || header[1] != 0x8B || header[2] != 8)
{
return false;
}
return true;
}
}
/// <summary>

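Editorial note: the `IsGZipFile` hunk above replaces a stack-allocated header buffer with a pooled scope, but the validation logic is unchanged. As a standalone sketch (class and method names hypothetical, not part of this diff), the same check reads the 10-byte fixed GZip header and tests ID1 `0x1F`, ID2 `0x8B`, and the DEFLATE method byte `8`:

```csharp
using System.IO;

static class GZipCheck
{
    // Mirrors the validation in the diff: a GZip member starts with
    // the magic bytes 0x1F 0x8B, followed by compression method 8
    // (DEFLATE), inside a 10-byte fixed header.
    public static bool LooksLikeGZip(Stream stream)
    {
        var header = new byte[10];
        int n = stream.Read(header, 0, header.Length);
        if (n != 10)
        {
            return false; // empty or truncated stream
        }
        return header[0] == 0x1F && header[1] == 0x8B && header[2] == 8;
    }
}
```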

@@ -14,12 +14,6 @@ namespace SharpCompress.Archives.GZip
public virtual Stream OpenEntryStream()
{
//this is to reset the stream to be read multiple times
var part = Parts.Single() as GZipFilePart;
if (part.GetRawStream().Position != part.EntryStartPosition)
{
part.GetRawStream().Position = part.EntryStartPosition;
}
return Parts.Single().GetCompressedStream();
}
@@ -27,7 +21,7 @@ namespace SharpCompress.Archives.GZip
public IArchive Archive { get; }
public bool IsComplete => true;
public bool IsComplete { get { return true; } }
#endregion
}


@@ -22,31 +22,31 @@ namespace SharpCompress.Archives.GZip
this.closeStream = closeStream;
}
public override long Crc => 0;
public override long Crc { get { return 0; } }
public override string Key { get; }
public override long CompressedSize => 0;
public override long CompressedSize { get { return 0; } }
public override long Size { get; }
public override DateTime? LastModifiedTime { get; }
public override DateTime? CreatedTime => null;
public override DateTime? CreatedTime { get { return null; } }
public override DateTime? LastAccessedTime => null;
public override DateTime? LastAccessedTime { get { return null; } }
public override DateTime? ArchivedTime => null;
public override DateTime? ArchivedTime { get { return null; } }
public override bool IsEncrypted => false;
public override bool IsEncrypted { get { return false; } }
public override bool IsDirectory => false;
public override bool IsDirectory { get { return false; } }
public override bool IsSplit => false;
public override bool IsSplit { get { return false; } }
internal override IEnumerable<FilePart> Parts => throw new NotImplementedException();
internal override IEnumerable<FilePart> Parts { get { throw new NotImplementedException(); } }
Stream IWritableArchiveEntry.Stream => stream;
Stream IWritableArchiveEntry.Stream { get { return stream; } }
public override Stream OpenEntryStream()
{


@@ -60,7 +60,7 @@ namespace SharpCompress.Archives.Rar
return RarReader.Open(stream, ReaderOptions);
}
public override bool IsSolid => Volumes.First().IsSolidArchive;
public override bool IsSolid { get { return Volumes.First().IsSolidArchive; } }
#region Creation


@@ -20,13 +20,13 @@ namespace SharpCompress.Archives.Rar
this.archive = archive;
}
public override CompressionType CompressionType => CompressionType.Rar;
public override CompressionType CompressionType { get { return CompressionType.Rar; } }
public IArchive Archive => archive;
public IArchive Archive { get { return archive; } }
internal override IEnumerable<FilePart> Parts => parts.Cast<FilePart>();
internal override IEnumerable<FilePart> Parts { get { return parts.Cast<FilePart>(); } }
internal override FileHeader FileHeader => parts.First().FileHeader;
internal override FileHeader FileHeader { get { return parts.First().FileHeader; } }
public override long Crc
{


@@ -28,6 +28,6 @@ namespace SharpCompress.Archives.Rar
return stream;
}
internal override string FilePartName => "Unknown Stream - File Entry: " + FileHeader.FileName;
internal override string FilePartName { get { return "Unknown Stream - File Entry: " + FileHeader.FileName; } }
}
}


@@ -106,7 +106,10 @@ namespace SharpCompress.Archives.SevenZip
for (int i = 0; i < database.Files.Count; i++)
{
var file = database.Files[i];
yield return new SevenZipArchiveEntry(this, new SevenZipFilePart(stream, database, i, file));
if (!file.IsDir)
{
yield return new SevenZipArchiveEntry(this, new SevenZipFilePart(stream, database, i, file));
}
}
}
@@ -138,8 +141,10 @@ namespace SharpCompress.Archives.SevenZip
private static bool SignatureMatch(Stream stream)
{
BinaryReader reader = new BinaryReader(stream);
byte[] signatureBytes = reader.ReadBytes(6);
return signatureBytes.BinaryEquals(SIGNATURE);
using (var signatureBytes = reader.ReadScope(6))
{
return signatureBytes.BinaryEquals(SIGNATURE);
}
}
protected override IReader CreateReaderForSolidExtraction()
@@ -171,7 +176,7 @@ namespace SharpCompress.Archives.SevenZip
this.archive = archive;
}
public override SevenZipVolume Volume => archive.Volumes.Single();
public override SevenZipVolume Volume { get { return archive.Volumes.Single(); } }
internal override IEnumerable<SevenZipEntry> GetEntries(Stream stream)
{
@@ -206,4 +211,4 @@ namespace SharpCompress.Archives.SevenZip
}
}
}
}
}


@@ -18,11 +18,11 @@ namespace SharpCompress.Archives.SevenZip
public IArchive Archive { get; }
public bool IsComplete => true;
public bool IsComplete { get { return true; } }
/// <summary>
/// This is a 7Zip Anti item
/// </summary>
public bool IsAnti => FilePart.Header.IsAnti;
public bool IsAnti { get { return FilePart.Header.IsAnti; } }
}
}


@@ -22,7 +22,7 @@ namespace SharpCompress.Archives.Tar
public IArchive Archive { get; }
public bool IsComplete => true;
public bool IsComplete { get { return true; } }
#endregion
}


@@ -22,30 +22,30 @@ namespace SharpCompress.Archives.Tar
this.closeStream = closeStream;
}
public override long Crc => 0;
public override long Crc { get { return 0; } }
public override string Key { get; }
public override long CompressedSize => 0;
public override long CompressedSize { get { return 0; } }
public override long Size { get; }
public override DateTime? LastModifiedTime { get; }
public override DateTime? CreatedTime => null;
public override DateTime? CreatedTime { get { return null; } }
public override DateTime? LastAccessedTime => null;
public override DateTime? LastAccessedTime { get { return null; } }
public override DateTime? ArchivedTime => null;
public override DateTime? ArchivedTime { get { return null; } }
public override bool IsEncrypted => false;
public override bool IsEncrypted { get { return false; } }
public override bool IsDirectory => false;
public override bool IsDirectory { get { return false; } }
public override bool IsSplit => false;
public override bool IsSplit { get { return false; } }
internal override IEnumerable<FilePart> Parts => throw new NotImplementedException();
Stream IWritableArchiveEntry.Stream => stream;
internal override IEnumerable<FilePart> Parts { get { throw new NotImplementedException(); } }
Stream IWritableArchiveEntry.Stream { get { return stream; } }
public override Stream OpenEntryStream()
{


@@ -21,10 +21,10 @@ namespace SharpCompress.Archives.Zip
public IArchive Archive { get; }
public bool IsComplete => true;
public bool IsComplete { get { return true; } }
#endregion
public string Comment => (Parts.Single() as SeekableZipFilePart).Comment;
public string Comment { get { return (Parts.Single() as SeekableZipFilePart).Comment; } }
}
}


@@ -23,31 +23,31 @@ namespace SharpCompress.Archives.Zip
this.closeStream = closeStream;
}
public override long Crc => 0;
public override long Crc { get { return 0; } }
public override string Key { get; }
public override long CompressedSize => 0;
public override long CompressedSize { get { return 0; } }
public override long Size { get; }
public override DateTime? LastModifiedTime { get; }
public override DateTime? CreatedTime => null;
public override DateTime? CreatedTime { get { return null; } }
public override DateTime? LastAccessedTime => null;
public override DateTime? LastAccessedTime { get { return null; } }
public override DateTime? ArchivedTime => null;
public override DateTime? ArchivedTime { get { return null; } }
public override bool IsEncrypted => false;
public override bool IsEncrypted { get { return false; } }
public override bool IsDirectory => false;
public override bool IsDirectory { get { return false; } }
public override bool IsSplit => false;
public override bool IsSplit { get { return false; } }
internal override IEnumerable<FilePart> Parts => throw new NotImplementedException();
internal override IEnumerable<FilePart> Parts { get { throw new NotImplementedException(); } }
Stream IWritableArchiveEntry.Stream => stream;
Stream IWritableArchiveEntry.Stream { get { return stream; } }
public override Stream OpenEntryStream()
{


@@ -4,22 +4,6 @@ using System.Runtime.CompilerServices;
[assembly: AssemblyTitle("SharpCompress")]
[assembly: AssemblyProduct("SharpCompress")]
[assembly: InternalsVisibleTo("SharpCompress.Test" + SharpCompress.AssemblyInfo.PublicKeySuffix)]
[assembly: InternalsVisibleTo("SharpCompress.Test.Portable" + SharpCompress.AssemblyInfo.PublicKeySuffix)]
[assembly: CLSCompliant(true)]
namespace SharpCompress
{
/// <summary>
/// Just a static class to house the public key, to avoid repetition.
/// </summary>
internal static class AssemblyInfo
{
internal const string PublicKeySuffix =
",PublicKey=002400000480000094000000060200000024000052534131000400000100010059acfa17d26c44" +
"7a4d03f16eaa72c9187c04f16e6569dd168b080e39a6f5c9fd00f28c768cd8e9a089d5a0e1b34c" +
"cd971488e7afe030ce5ce8df2053cf12ec89f6d38065c434c09ee6af3ee284c5dc08f44774b679" +
"bf39298e57efe30d4b00aecf9e4f6f8448b2cb0146d8956dfcab606cc64a0ac38c60a7d78b0d65" +
"d3b98dc0";
}
}
[assembly: InternalsVisibleTo("SharpCompress.Test")]
[assembly: InternalsVisibleTo("SharpCompress.Test.Portable")]
[assembly: CLSCompliant(true)]


@@ -9,6 +9,6 @@ namespace SharpCompress.Common
Item = entry;
}
public T Item { get; }
public T Item { get; private set; }
}
}


@@ -11,7 +11,6 @@
LZMA,
BCJ,
BCJ2,
LZip,
Unknown
}
}


@@ -75,6 +75,6 @@ namespace SharpCompress.Common
/// <summary>
/// Entry file attribute.
/// </summary>
public virtual int? Attrib => throw new NotImplementedException();
public virtual int? Attrib { get { throw new NotImplementedException(); } }
}
}


@@ -44,20 +44,20 @@ namespace SharpCompress.Common
stream.Dispose();
}
public override bool CanRead => true;
public override bool CanRead { get { return true; } }
public override bool CanSeek => false;
public override bool CanSeek { get { return false; } }
public override bool CanWrite => false;
public override bool CanWrite { get { return false; } }
public override void Flush()
{
throw new NotSupportedException();
}
public override long Length => throw new NotSupportedException();
public override long Length { get { throw new NotSupportedException(); } }
public override long Position { get => throw new NotSupportedException(); set => throw new NotSupportedException(); }
public override long Position { get { throw new NotSupportedException(); } set { throw new NotSupportedException(); } }
public override int Read(byte[] buffer, int offset, int count)
{


@@ -13,31 +13,31 @@ namespace SharpCompress.Common.GZip
this.filePart = filePart;
}
public override CompressionType CompressionType => CompressionType.GZip;
public override CompressionType CompressionType { get { return CompressionType.GZip; } }
public override long Crc => 0;
public override long Crc { get { return 0; } }
public override string Key => filePart.FilePartName;
public override string Key { get { return filePart.FilePartName; } }
public override long CompressedSize => 0;
public override long CompressedSize { get { return 0; } }
public override long Size => 0;
public override long Size { get { return 0; } }
public override DateTime? LastModifiedTime => filePart.DateModified;
public override DateTime? LastModifiedTime { get { return filePart.DateModified; } }
public override DateTime? CreatedTime => null;
public override DateTime? CreatedTime { get { return null; } }
public override DateTime? LastAccessedTime => null;
public override DateTime? LastAccessedTime { get { return null; } }
public override DateTime? ArchivedTime => null;
public override DateTime? ArchivedTime { get { return null; } }
public override bool IsEncrypted => false;
public override bool IsEncrypted { get { return false; } }
public override bool IsDirectory => false;
public override bool IsDirectory { get { return false; } }
public override bool IsSplit => false;
public override bool IsSplit { get { return false; } }
internal override IEnumerable<FilePart> Parts => filePart.AsEnumerable<FilePart>();
internal override IEnumerable<FilePart> Parts { get { return filePart.AsEnumerable<FilePart>(); } }
internal static IEnumerable<GZipEntry> GetEntries(Stream stream)
{


@@ -5,6 +5,7 @@ using SharpCompress.Common.Tar.Headers;
using SharpCompress.Compressors;
using SharpCompress.Compressors.Deflate;
using SharpCompress.Converters;
using SharpCompress.IO;
namespace SharpCompress.Common.GZip
{
@@ -16,15 +17,12 @@ namespace SharpCompress.Common.GZip
internal GZipFilePart(Stream stream)
{
ReadAndValidateGzipHeader(stream);
EntryStartPosition = stream.Position;
this.stream = stream;
}
internal long EntryStartPosition { get; }
internal DateTime? DateModified { get; private set; }
internal override string FilePartName => name;
internal override string FilePartName { get { return name; } }
internal override Stream GetCompressedStream()
{
@@ -39,79 +37,85 @@ namespace SharpCompress.Common.GZip
private void ReadAndValidateGzipHeader(Stream stream)
{
// read the header on the first read
byte[] header = new byte[10];
int n = stream.Read(header, 0, header.Length);
// workitem 8501: handle edge case (decompress empty stream)
if (n == 0)
using (var header = ByteArrayPool.RentScope(10))
{
return;
}
int n = stream.Read(header);
if (n != 10)
{
throw new ZlibException("Not a valid GZIP stream.");
}
if (header[0] != 0x1F || header[1] != 0x8B || header[2] != 8)
{
throw new ZlibException("Bad GZIP header.");
}
Int32 timet = DataConverter.LittleEndian.GetInt32(header, 4);
DateModified = TarHeader.Epoch.AddSeconds(timet);
if ((header[3] & 0x04) == 0x04)
{
// read and discard extra field
n = stream.Read(header, 0, 2); // 2-byte length field
Int16 extraLength = (Int16)(header[0] + header[1] * 256);
byte[] extra = new byte[extraLength];
n = stream.Read(extra, 0, extra.Length);
if (n != extraLength)
// workitem 8501: handle edge case (decompress empty stream)
if (n == 0)
{
throw new ZlibException("Unexpected end-of-file reading GZIP header.");
return;
}
if (n != 10)
{
throw new ZlibException("Not a valid GZIP stream.");
}
if (header[0] != 0x1F || header[1] != 0x8B || header[2] != 8)
{
throw new ZlibException("Bad GZIP header.");
}
Int32 timet = DataConverter.LittleEndian.GetInt32(header.Array, 4);
DateModified = TarHeader.Epoch.AddSeconds(timet);
if ((header[3] & 0x04) == 0x04)
{
// read and discard extra field
n = stream.Read(header.Array, 0, 2); // 2-byte length field
Int16 extraLength = (Int16)(header[0] + header[1] * 256);
using (var extra = ByteArrayPool.RentScope(extraLength))
{
n = stream.Read(extra);
if (n != extraLength)
{
throw new ZlibException("Unexpected end-of-file reading GZIP header.");
}
}
}
if ((header[3] & 0x08) == 0x08)
{
name = ReadZeroTerminatedString(stream);
}
if ((header[3] & 0x10) == 0x010)
{
ReadZeroTerminatedString(stream);
}
if ((header[3] & 0x02) == 0x02)
{
stream.ReadByte(); // CRC16, ignore
}
}
if ((header[3] & 0x08) == 0x08)
{
name = ReadZeroTerminatedString(stream);
}
if ((header[3] & 0x10) == 0x010)
{
ReadZeroTerminatedString(stream);
}
if ((header[3] & 0x02) == 0x02)
{
stream.ReadByte(); // CRC16, ignore
}
}
private static string ReadZeroTerminatedString(Stream stream)
{
byte[] buf1 = new byte[1];
var list = new List<byte>();
bool done = false;
do
using (var buf1 = ByteArrayPool.RentScope(1))
{
// workitem 7740
int n = stream.Read(buf1, 0, 1);
if (n != 1)
var list = new List<byte>();
bool done = false;
do
{
throw new ZlibException("Unexpected EOF reading GZIP header.");
}
if (buf1[0] == 0)
{
done = true;
}
else
{
list.Add(buf1[0]);
// workitem 7740
int n = stream.Read(buf1);
if (n != 1)
{
throw new ZlibException("Unexpected EOF reading GZIP header.");
}
if (buf1[0] == 0)
{
done = true;
}
else
{
list.Add(buf1[0]);
}
}
while (!done);
byte[] a = list.ToArray();
return ArchiveEncoding.Default.GetString(a, 0, a.Length);
}
while (!done);
byte[] a = list.ToArray();
return ArchiveEncoding.Default.GetString(a, 0, a.Length);
}
}
}

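Editorial note: the `ReadZeroTerminatedString` rewrite above keeps the same byte-at-a-time loop, only moving the one-byte buffer into a pooled scope. A minimal version of the underlying pattern, detached from the pool (names hypothetical; UTF-8 assumed for illustration, where the diff uses `ArchiveEncoding.Default`):

```csharp
using System.Collections.Generic;
using System.IO;
using System.Text;

static class ZeroTerminated
{
    // Reads bytes until a NUL terminator; throws on premature EOF,
    // matching the ZlibException path in the diff above.
    public static string Read(Stream stream)
    {
        var bytes = new List<byte>();
        int b;
        while ((b = stream.ReadByte()) > 0)
        {
            bytes.Add((byte)b);
        }
        if (b == -1)
        {
            throw new EndOfStreamException("Unexpected EOF reading GZIP header.");
        }
        return Encoding.UTF8.GetString(bytes.ToArray());
    }
}
```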

@@ -18,8 +18,8 @@ namespace SharpCompress.Common.GZip
}
#endif
public override bool IsFirstVolume => true;
public override bool IsFirstVolume { get { return true; } }
public override bool IsMultiVolume => true;
public override bool IsMultiVolume { get { return true; } }
}
}


@@ -17,7 +17,7 @@ namespace SharpCompress.Common.Rar.Headers
}
}
internal ArchiveFlags ArchiveHeaderFlags => (ArchiveFlags)Flags;
internal ArchiveFlags ArchiveHeaderFlags { get { return (ArchiveFlags)Flags; } }
internal short HighPosAv { get; private set; }
@@ -25,6 +25,6 @@ namespace SharpCompress.Common.Rar.Headers
internal byte EncryptionVersion { get; private set; }
public bool HasPassword => ArchiveHeaderFlags.HasFlag(ArchiveFlags.PASSWORD);
public bool HasPassword { get { return ArchiveHeaderFlags.HasFlag(ArchiveFlags.PASSWORD); } }
}
}


@@ -16,7 +16,7 @@ namespace SharpCompress.Common.Rar.Headers
}
}
internal EndArchiveFlags EndArchiveFlags => (EndArchiveFlags)Flags;
internal EndArchiveFlags EndArchiveFlags { get { return (EndArchiveFlags)Flags; } }
internal int? ArchiveCRC { get; private set; }


@@ -47,81 +47,82 @@ namespace SharpCompress.Common.Rar.Headers
nameSize = nameSize > 4 * 1024 ? (short)(4 * 1024) : nameSize;
byte[] fileNameBytes = reader.ReadBytes(nameSize);
switch (HeaderType)
using (var fileNameBytes = reader.ReadScope(nameSize))
{
case HeaderType.FileHeader:
switch (HeaderType)
{
if (FileFlags.HasFlag(FileFlags.UNICODE))
case HeaderType.FileHeader:
{
int length = 0;
while (length < fileNameBytes.Length
&& fileNameBytes[length] != 0)
if (FileFlags.HasFlag(FileFlags.UNICODE))
{
length++;
}
if (length != nameSize)
{
length++;
FileName = FileNameDecoder.Decode(fileNameBytes, length);
int length = 0;
while (length < fileNameBytes.Count
&& fileNameBytes[length] != 0)
{
length++;
}
if (length != nameSize)
{
length++;
FileName = FileNameDecoder.Decode(fileNameBytes, length);
}
else
{
FileName = DecodeDefault(fileNameBytes);
}
}
else
{
FileName = DecodeDefault(fileNameBytes);
}
FileName = ConvertPath(FileName, HostOS);
}
else
break;
case HeaderType.NewSubHeader:
{
FileName = DecodeDefault(fileNameBytes);
}
FileName = ConvertPath(FileName, HostOS);
}
break;
case HeaderType.NewSubHeader:
{
int datasize = HeaderSize - NEWLHD_SIZE - nameSize;
if (FileFlags.HasFlag(FileFlags.SALT))
{
datasize -= SALT_SIZE;
}
if (datasize > 0)
{
SubData = reader.ReadBytes(datasize);
}
int datasize = HeaderSize - NEWLHD_SIZE - nameSize;
if (FileFlags.HasFlag(FileFlags.SALT))
{
datasize -= SALT_SIZE;
}
if (datasize > 0)
{
SubData = reader.ReadBytes(datasize);
}
if (NewSubHeaderType.SUBHEAD_TYPE_RR.Equals(fileNameBytes))
{
RecoverySectors = SubData[8] + (SubData[9] << 8)
+ (SubData[10] << 16) + (SubData[11] << 24);
if (NewSubHeaderType.SUBHEAD_TYPE_RR.Equals(fileNameBytes))
{
RecoverySectors = SubData[8] + (SubData[9] << 8)
+ (SubData[10] << 16) + (SubData[11] << 24);
}
}
break;
}
break;
}
if (FileFlags.HasFlag(FileFlags.SALT))
{
Salt = reader.ReadBytes(SALT_SIZE);
}
if (FileFlags.HasFlag(FileFlags.EXTTIME))
{
// verify that the end of the header hasn't been reached before reading the Extended Time.
// some tools incorrectly omit Extended Time despite specifying FileFlags.EXTTIME, which most parsers tolerate.
if (ReadBytes + reader.CurrentReadByteCount <= HeaderSize - 2)
if (FileFlags.HasFlag(FileFlags.SALT))
{
ushort extendedFlags = reader.ReadUInt16();
FileLastModifiedTime = ProcessExtendedTime(extendedFlags, FileLastModifiedTime, reader, 0);
FileCreatedTime = ProcessExtendedTime(extendedFlags, null, reader, 1);
FileLastAccessedTime = ProcessExtendedTime(extendedFlags, null, reader, 2);
FileArchivedTime = ProcessExtendedTime(extendedFlags, null, reader, 3);
Salt = reader.ReadBytes(SALT_SIZE);
}
if (FileFlags.HasFlag(FileFlags.EXTTIME))
{
// verify that the end of the header hasn't been reached before reading the Extended Time.
// some tools incorrectly omit Extended Time despite specifying FileFlags.EXTTIME, which most parsers tolerate.
if (ReadBytes + reader.CurrentReadByteCount <= HeaderSize - 2)
{
ushort extendedFlags = reader.ReadUInt16();
FileLastModifiedTime = ProcessExtendedTime(extendedFlags, FileLastModifiedTime, reader, 0);
FileCreatedTime = ProcessExtendedTime(extendedFlags, null, reader, 1);
FileLastAccessedTime = ProcessExtendedTime(extendedFlags, null, reader, 2);
FileArchivedTime = ProcessExtendedTime(extendedFlags, null, reader, 3);
}
}
}
}
//only the full .net framework will do other code pages than unicode/utf8
private string DecodeDefault(byte[] bytes)
private string DecodeDefault(ByteArrayPoolScope bytes)
{
return ArchiveEncoding.Default.GetString(bytes, 0, bytes.Length);
return ArchiveEncoding.Default.GetString(bytes.Array, 0, bytes.Count);
}
private long UInt32To64(uint x, uint y)
@@ -165,13 +166,25 @@ namespace SharpCompress.Common.Rar.Headers
#if NO_FILE
return path.Replace('\\', '/');
#else
if (Path.DirectorySeparatorChar == '/')
switch (os)
{
return path.Replace('\\', '/');
}
else if (Path.DirectorySeparatorChar == '\\')
{
return path.Replace('/', '\\');
case HostOS.MacOS:
case HostOS.Unix:
{
if (Path.DirectorySeparatorChar == '\\')
{
return path.Replace('/', '\\');
}
}
break;
default:
{
if (Path.DirectorySeparatorChar == '/')
{
return path.Replace('\\', '/');
}
}
break;
}
return path;
#endif
@@ -196,7 +209,7 @@ namespace SharpCompress.Common.Rar.Headers
internal int FileAttributes { get; private set; }
internal FileFlags FileFlags => (FileFlags)Flags;
internal FileFlags FileFlags { get { return (FileFlags)Flags; } }
internal long CompressedSize { get; private set; }
internal long UncompressedSize { get; private set; }


@@ -1,4 +1,5 @@
using System.Text;
using SharpCompress.IO;
namespace SharpCompress.Common.Rar.Headers
{
@@ -7,12 +8,12 @@ namespace SharpCompress.Common.Rar.Headers
/// </summary>
internal static class FileNameDecoder
{
internal static int GetChar(byte[] name, int pos)
internal static int GetChar(ByteArrayPoolScope name, int pos)
{
return name[pos] & 0xff;
}
internal static string Decode(byte[] name, int encPos)
internal static string Decode(ByteArrayPoolScope name, int encPos)
{
int decPos = 0;
int flags = 0;
@@ -22,7 +23,7 @@ namespace SharpCompress.Common.Rar.Headers
int high = 0;
int highByte = GetChar(name, encPos++);
StringBuilder buf = new StringBuilder();
while (encPos < name.Length)
while (encPos < name.Count)
{
if (flagBits == 0)
{
@@ -54,7 +55,7 @@ namespace SharpCompress.Common.Rar.Headers
if ((length & 0x80) != 0)
{
int correction = GetChar(name, encPos++);
for (length = (length & 0x7f) + 2; length > 0 && decPos < name.Length; length--, decPos++)
for (length = (length & 0x7f) + 2; length > 0 && decPos < name.Count; length--, decPos++)
{
low = (GetChar(name, decPos) + correction) & 0xff;
buf.Append((char)((highByte << 8) + low));
@@ -62,7 +63,7 @@ namespace SharpCompress.Common.Rar.Headers
}
else
{
for (length += 2; length > 0 && decPos < name.Length; length--, decPos++)
for (length += 2; length > 0 && decPos < name.Count; length--, decPos++)
{
buf.Append((char)(GetChar(name, decPos)));
}


@@ -13,7 +13,7 @@ namespace SharpCompress.Common.Rar.Headers
Mark = reader.ReadBytes(8);
}
internal uint DataSize => AdditionalSize;
internal uint DataSize { get { return AdditionalSize; } }
internal byte Version { get; private set; }
internal ushort RecSectors { get; private set; }
internal uint TotalBlocks { get; private set; }


@@ -1,5 +1,4 @@
using System;
using System.IO;
using System.IO;
using SharpCompress.IO;
namespace SharpCompress.Common.Rar.Headers
@@ -19,14 +18,14 @@ namespace SharpCompress.Common.Rar.Headers
ReadBytes = baseHeader.ReadBytes;
}
internal static RarHeader Create(RarCrcBinaryReader reader)
internal static RarHeader Create(MarkingBinaryReader reader)
{
try
{
RarHeader header = new RarHeader();
reader.Mark();
header.ReadStartFromReader(reader);
header.ReadFromReader(reader);
header.ReadBytes += reader.CurrentReadByteCount;
return header;
@@ -37,10 +36,9 @@ namespace SharpCompress.Common.Rar.Headers
}
}
private void ReadStartFromReader(RarCrcBinaryReader reader)
protected virtual void ReadFromReader(MarkingBinaryReader reader)
{
HeadCRC = reader.ReadUInt16();
reader.ResetCrc();
HeadCRC = reader.ReadInt16();
HeaderType = (HeaderType)(reader.ReadByte() & 0xff);
Flags = reader.ReadInt16();
HeaderSize = reader.ReadInt16();
@@ -50,11 +48,7 @@ namespace SharpCompress.Common.Rar.Headers
}
}
protected virtual void ReadFromReader(MarkingBinaryReader reader) {
throw new NotImplementedException();
}
internal T PromoteHeader<T>(RarCrcBinaryReader reader)
internal T PromoteHeader<T>(MarkingBinaryReader reader)
where T : RarHeader, new()
{
T header = new T();
@@ -68,24 +62,13 @@ namespace SharpCompress.Common.Rar.Headers
if (headerSizeDiff > 0)
{
reader.ReadBytes(headerSizeDiff);
using (reader.ReadScope(headerSizeDiff))
{ }
}
VerifyHeaderCrc(reader.GetCrc());
return header;
}
private void VerifyHeaderCrc(ushort crc) {
if (HeaderType != HeaderType.MarkHeader)
{
if (crc != HeadCRC)
{
throw new InvalidFormatException("rar header crc mismatch");
}
}
}
protected virtual void PostReadingBytes(MarkingBinaryReader reader)
{
}
@@ -95,7 +78,7 @@ namespace SharpCompress.Common.Rar.Headers
/// </summary>
protected long ReadBytes { get; private set; }
protected ushort HeadCRC { get; private set; }
protected short HeadCRC { get; private set; }
internal HeaderType HeaderType { get; private set; }


@@ -52,35 +52,39 @@ namespace SharpCompress.Common.Rar.Headers
if (firstByte == 0x52)
{
MemoryStream buffer = new MemoryStream();
byte[] nextThreeBytes = reader.ReadBytes(3);
if ((nextThreeBytes[0] == 0x45)
&& (nextThreeBytes[1] == 0x7E)
&& (nextThreeBytes[2] == 0x5E))
using (var nextThreeBytes = reader.ReadScope(3))
{
//old format and isvalid
buffer.WriteByte(0x52);
buffer.Write(nextThreeBytes, 0, 3);
rewindableStream.Rewind(buffer);
break;
if ((nextThreeBytes[0] == 0x45)
&& (nextThreeBytes[1] == 0x7E)
&& (nextThreeBytes[2] == 0x5E))
{
//old format and isvalid
buffer.WriteByte(0x52);
buffer.Write(nextThreeBytes.Array, 0, 3);
rewindableStream.Rewind(buffer);
break;
}
using (var secondThreeBytes = reader.ReadScope(3))
{
if ((nextThreeBytes[0] == 0x61)
&& (nextThreeBytes[1] == 0x72)
&& (nextThreeBytes[2] == 0x21)
&& (secondThreeBytes[0] == 0x1A)
&& (secondThreeBytes[1] == 0x07)
&& (secondThreeBytes[2] == 0x00))
{
//new format and isvalid
buffer.WriteByte(0x52);
buffer.Write(nextThreeBytes.Array, 0, 3);
buffer.Write(secondThreeBytes.Array, 0, 3);
rewindableStream.Rewind(buffer);
break;
}
buffer.Write(nextThreeBytes.Array, 0, 3);
buffer.Write(secondThreeBytes.Array, 0, 3);
rewindableStream.Rewind(buffer);
}
}
byte[] secondThreeBytes = reader.ReadBytes(3);
if ((nextThreeBytes[0] == 0x61)
&& (nextThreeBytes[1] == 0x72)
&& (nextThreeBytes[2] == 0x21)
&& (secondThreeBytes[0] == 0x1A)
&& (secondThreeBytes[1] == 0x07)
&& (secondThreeBytes[2] == 0x00))
{
//new format and isvalid
buffer.WriteByte(0x52);
buffer.Write(nextThreeBytes, 0, 3);
buffer.Write(secondThreeBytes, 0, 3);
rewindableStream.Rewind(buffer);
break;
}
buffer.Write(nextThreeBytes, 0, 3);
buffer.Write(secondThreeBytes, 0, 3);
rewindableStream.Rewind(buffer);
}
if (count > MAX_SFX_SIZE)
{
@@ -129,7 +133,7 @@ namespace SharpCompress.Common.Rar.Headers
reader.InitializeAes(salt);
}
#else
var reader = new RarCrcBinaryReader(stream);
var reader = new MarkingBinaryReader(stream);
#endif
@@ -247,4 +251,4 @@ namespace SharpCompress.Common.Rar.Headers
}
}
}
}
}


@@ -1,40 +0,0 @@
using System.IO;
using SharpCompress.Compressors.Rar;
using SharpCompress.IO;
namespace SharpCompress.Common.Rar {
internal class RarCrcBinaryReader : MarkingBinaryReader {
private uint currentCrc;
public RarCrcBinaryReader(Stream stream) : base(stream)
{
}
public ushort GetCrc()
{
return (ushort)~currentCrc;
}
public void ResetCrc()
{
currentCrc = 0xffffffff;
}
protected void UpdateCrc(byte b)
{
currentCrc = RarCRC.CheckCrc(currentCrc, b);
}
protected byte[] ReadBytesNoCrc(int count)
{
return base.ReadBytes(count);
}
public override byte[] ReadBytes(int count)
{
var result = base.ReadBytes(count);
currentCrc = RarCRC.CheckCrc(currentCrc, result, 0, result.Length);
return result;
}
}
}


@@ -6,13 +6,12 @@ using SharpCompress.IO;
namespace SharpCompress.Common.Rar
{
internal class RarCryptoBinaryReader : RarCrcBinaryReader
internal class RarCryptoBinaryReader : MarkingBinaryReader
{
private RarRijndael rijndael;
private byte[] salt;
private readonly string password;
private readonly Queue<byte> data = new Queue<byte>();
private long readCount;
public RarCryptoBinaryReader(Stream stream, string password )
: base(stream)
@@ -20,22 +19,6 @@ namespace SharpCompress.Common.Rar
this.password = password;
}
// track read count ourselves rather than using the underlying stream since we buffer
public override long CurrentReadByteCount {
get
{
return this.readCount;
}
protected set
{
// ignore
}
}
public override void Mark() {
this.readCount = 0;
}
protected bool UseEncryption
{
get { return salt != null; }
@@ -49,15 +32,21 @@ namespace SharpCompress.Common.Rar
public override byte[] ReadBytes(int count)
{
if (UseEncryption)
{
return ReadAndDecryptBytes(count);
}
this.readCount += count;
return base.ReadBytes(count);
byte[] b = new byte[count];
Read(b, 0, count);
return b;
}
private byte[] ReadAndDecryptBytes(int count)
public override int Read(byte[] buffer, int index, int count)
{
if (UseEncryption)
{
return ReadAndDecryptBytes(buffer, index, count);
}
return base.Read(buffer, index, count);
}
private int ReadAndDecryptBytes(byte[] buffer, int index, int count)
{
int queueSize = data.Count;
int sizeToRead = count - queueSize;
@@ -67,26 +56,41 @@ namespace SharpCompress.Common.Rar
int alignedSize = sizeToRead + ((~sizeToRead + 1) & 0xf);
for (int i = 0; i < alignedSize / 16; i++)
{
//long ax = System.currentTimeMillis();
byte[] cipherText = base.ReadBytesNoCrc(16);
var readBytes = rijndael.ProcessBlock(cipherText);
foreach (var readByte in readBytes)
data.Enqueue(readByte);
using (var cipherText = PrivateReadScope(16))
{
var readBytes = rijndael.ProcessBlock(cipherText);
foreach (var readByte in readBytes)
{
data.Enqueue(readByte);
}
}
}
}
var decryptedBytes = new byte[count];
for (int i = 0; i < count; i++)
for (int i = index; i < count; i++)
{
var b = data.Dequeue();
decryptedBytes[i] = b;
UpdateCrc(b);
buffer[i] = data.Dequeue();
}
this.readCount += count;
return decryptedBytes;
return count;
}
private ByteArrayPoolScope PrivateReadScope(int count)
{
var scope = ByteArrayPool.RentScope(count);
int numRead = 0;
do
{
int n = base.Read(scope.Array, numRead, count);
if (n == 0)
{
break;
}
numRead += n;
count -= n;
} while (count > 0);
scope.OverrideSize(numRead);
return scope;
}
public void ClearQueue()

View File
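The `PrivateReadScope` helper above loops because a single `Stream.Read` call may legitimately return fewer bytes than requested. A minimal Python sketch of the same read-until-full pattern (the function name `read_fully` is illustrative, not part of SharpCompress):

```python
import io

def read_fully(stream, count):
    """Read up to `count` bytes, looping because one read()
    may return fewer bytes than requested."""
    chunks = []
    remaining = count
    while remaining > 0:
        chunk = stream.read(remaining)
        if not chunk:  # end of stream reached early
            break
        chunks.append(chunk)
        remaining -= len(chunk)
    return b"".join(chunks)

data = read_fully(io.BytesIO(b"abcdef"), 4)  # b"abcd"
```

Like `OverrideSize(numRead)` in the C# version, the caller must cope with a short result when the stream ends before `count` bytes arrive.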

@@ -3,6 +3,7 @@
using System;
using System.Collections.Generic;
using System.IO;
using SharpCompress.IO;
namespace SharpCompress.Common.Rar
{
@@ -55,17 +56,21 @@ namespace SharpCompress.Common.Rar
for (int i = 0; i < alignedSize / 16; i++)
{
//long ax = System.currentTimeMillis();
byte[] cipherText = new byte[RarRijndael.CRYPTO_BLOCK_SIZE];
actualStream.Read(cipherText, 0, RarRijndael.CRYPTO_BLOCK_SIZE);
var readBytes = rijndael.ProcessBlock(cipherText);
foreach (var readByte in readBytes)
data.Enqueue(readByte);
using (var cipherText = ByteArrayPool.RentScope(RarRijndael.CRYPTO_BLOCK_SIZE))
{
actualStream.Read(cipherText.Array, 0, RarRijndael.CRYPTO_BLOCK_SIZE);
var readBytes = rijndael.ProcessBlock(cipherText);
foreach (var readByte in readBytes)
{
data.Enqueue(readByte);
}
}
}
for (int i = 0; i < count; i++)
{
buffer[offset + i] = data.Dequeue();
}
}
return count;
}

View File
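Both decryption paths round the requested size up to the 16-byte AES block size with `sizeToRead + ((~sizeToRead + 1) & 0xf)`. Since `~n + 1` is `-n` in two's complement, the masked term is exactly the number of pad bytes needed to reach the next multiple of 16. A quick sketch of the identity:

```python
def align_to_block(size, block=16):
    # (~size + 1) is -size in two's complement, so the masked term
    # is the pad needed to reach the next multiple of `block`.
    # `block` must be a power of two for the mask to work.
    return size + ((~size + 1) & (block - 1))
```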

@@ -10,44 +10,44 @@ namespace SharpCompress.Common.Rar
/// <summary>
/// The File's 32 bit CRC Hash
/// </summary>
public override long Crc => FileHeader.FileCRC;
public override long Crc { get { return FileHeader.FileCRC; } }
/// <summary>
/// The path of the file internal to the Rar Archive.
/// </summary>
public override string Key => FileHeader.FileName;
public override string Key { get { return FileHeader.FileName; } }
/// <summary>
/// The entry last modified time in the archive, if recorded
/// </summary>
public override DateTime? LastModifiedTime => FileHeader.FileLastModifiedTime;
public override DateTime? LastModifiedTime { get { return FileHeader.FileLastModifiedTime; } }
/// <summary>
/// The entry create time in the archive, if recorded
/// </summary>
public override DateTime? CreatedTime => FileHeader.FileCreatedTime;
public override DateTime? CreatedTime { get { return FileHeader.FileCreatedTime; } }
/// <summary>
/// The entry last accessed time in the archive, if recorded
/// </summary>
public override DateTime? LastAccessedTime => FileHeader.FileLastAccessedTime;
public override DateTime? LastAccessedTime { get { return FileHeader.FileLastAccessedTime; } }
/// <summary>
/// The entry time when archived, if recorded
/// </summary>
public override DateTime? ArchivedTime => FileHeader.FileArchivedTime;
public override DateTime? ArchivedTime { get { return FileHeader.FileArchivedTime; } }
/// <summary>
/// Entry is password protected and encrypted and cannot be extracted.
/// </summary>
public override bool IsEncrypted => FileHeader.FileFlags.HasFlag(FileFlags.PASSWORD);
public override bool IsEncrypted { get { return FileHeader.FileFlags.HasFlag(FileFlags.PASSWORD); } }
/// <summary>
/// Entry is a directory.
/// </summary>
public override bool IsDirectory => FileHeader.FileFlags.HasFlag(FileFlags.DIRECTORY);
public override bool IsDirectory { get { return FileHeader.FileFlags.HasFlag(FileFlags.DIRECTORY); } }
public override bool IsSplit => FileHeader.FileFlags.HasFlag(FileFlags.SPLIT_AFTER);
public override bool IsSplit { get { return FileHeader.FileFlags.HasFlag(FileFlags.SPLIT_AFTER); } }
public override string ToString()
{

View File

@@ -14,9 +14,9 @@ namespace SharpCompress.Common.Rar
FileHeader = fh;
}
internal MarkHeader MarkHeader { get; }
internal MarkHeader MarkHeader { get; private set; }
internal FileHeader FileHeader { get; }
internal FileHeader FileHeader { get; private set; }
internal override Stream GetRawStream()
{

View File

@@ -6,6 +6,7 @@ using System.Security.Cryptography;
using System.Text;
using Org.BouncyCastle.Crypto.Engines;
using Org.BouncyCastle.Crypto.Parameters;
using SharpCompress.IO;
namespace SharpCompress.Common.Rar
{
@@ -96,22 +97,24 @@ namespace SharpCompress.Common.Rar
return rijndael;
}
public byte[] ProcessBlock(byte[] cipherText)
public byte[] ProcessBlock(ByteArrayPoolScope cipherText)
{
var plainText = new byte[CRYPTO_BLOCK_SIZE];
var decryptedBytes = new List<byte>();
rijndael.ProcessBlock(cipherText, 0, plainText, 0);
for (int j = 0; j < plainText.Length; j++)
using (var plainText = ByteArrayPool.RentScope(CRYPTO_BLOCK_SIZE))
{
decryptedBytes.Add((byte) (plainText[j] ^ aesInitializationVector[j%16])); //32:114, 33:101
}
var decryptedBytes = new List<byte>();
rijndael.ProcessBlock(cipherText, plainText);
for (int j = 0; j < aesInitializationVector.Length; j++)
{
aesInitializationVector[j] = cipherText[j];
for (int j = 0; j < plainText.Count; j++)
{
decryptedBytes.Add((byte)(plainText[j] ^ aesInitializationVector[j % 16])); //32:114, 33:101
}
for (int j = 0; j < aesInitializationVector.Length; j++)
{
aesInitializationVector[j] = cipherText[j];
}
return decryptedBytes.ToArray();
}
return decryptedBytes.ToArray();
}
public void Dispose()

View File
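`RarRijndael.ProcessBlock` implements CBC-style decryption by hand: each decrypted block is XORed with an initialization vector, and the IV is then replaced by the raw ciphertext block just consumed. A Python sketch of one step of that chaining (`block_decrypt` stands in for the Rijndael engine and is an assumption, not a real API):

```python
def cbc_decrypt_block(block_decrypt, iv, ciphertext):
    """One step of manual CBC: XOR the decrypted block with the IV,
    then advance the IV to the ciphertext block just consumed."""
    plain = bytes(b ^ v for b, v in zip(block_decrypt(ciphertext), iv))
    return plain, ciphertext  # (plaintext block, next IV)

# With an identity "cipher", CBC decryption reduces to XOR with the IV:
plain, next_iv = cbc_decrypt_block(lambda b: b, b"\x01" * 16, b"\x03" * 16)
```

This mirrors the two loops in the C# code: the first XORs `plainText[j]` with `aesInitializationVector[j % 16]`, the second copies `cipherText` into the IV for the next block.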

@@ -21,7 +21,7 @@ namespace SharpCompress.Common.Rar
headerFactory = new RarHeaderFactory(mode, options);
}
internal StreamingMode Mode => headerFactory.StreamingMode;
internal StreamingMode Mode { get { return headerFactory.StreamingMode; } }
internal abstract IEnumerable<RarFilePart> ReadFileParts();

View File

@@ -1,17 +1,14 @@
using System;
using SharpCompress.Readers;
namespace SharpCompress.Common
{
public class ReaderExtractionEventArgs<T> : EventArgs
{
internal ReaderExtractionEventArgs(T entry, ReaderProgress readerProgress = null)
internal ReaderExtractionEventArgs(T entry)
{
Item = entry;
ReaderProgress = readerProgress;
}
public T Item { get; }
public ReaderProgress ReaderProgress { get; }
public T Item { get; private set; }
}
}

View File

@@ -1339,20 +1339,20 @@ namespace SharpCompress.Common.SevenZip
#region Stream
public override bool CanRead => true;
public override bool CanRead { get { return true; } }
public override bool CanSeek => false;
public override bool CanSeek { get { return false; } }
public override bool CanWrite => false;
public override bool CanWrite { get { return false; } }
public override void Flush()
{
throw new NotSupportedException();
}
public override long Length => throw new NotSupportedException();
public override long Length { get { throw new NotSupportedException(); } }
public override long Position { get => throw new NotSupportedException(); set => throw new NotSupportedException(); }
public override long Position { get { throw new NotSupportedException(); } set { throw new NotSupportedException(); } }
public override int Read(byte[] buffer, int offset, int count)
{

View File

@@ -12,9 +12,9 @@ namespace SharpCompress.Common.SevenZip
public bool HasStream { get; internal set; }
public bool IsDir { get; internal set; }
public bool CrcDefined => Crc != null;
public bool CrcDefined { get { return Crc != null; } }
public bool AttribDefined => Attrib != null;
public bool AttribDefined { get { return Attrib != null; } }
public void SetAttrib(uint attrib)
{

View File

@@ -13,7 +13,7 @@ namespace SharpCompress.Common.SevenZip
internal List<long> UnpackSizes = new List<long>();
internal uint? UnpackCRC;
internal bool UnpackCRCDefined => UnpackCRC != null;
internal bool UnpackCRCDefined { get { return UnpackCRC != null; } }
public long GetUnpackSize()
{

View File

@@ -12,32 +12,32 @@ namespace SharpCompress.Common.SevenZip
internal SevenZipFilePart FilePart { get; }
public override CompressionType CompressionType => FilePart.CompressionType;
public override CompressionType CompressionType { get { return FilePart.CompressionType; } }
public override long Crc => FilePart.Header.Crc ?? 0;
public override long Crc { get { return FilePart.Header.Crc ?? 0; } }
public override string Key => FilePart.Header.Name;
public override string Key { get { return FilePart.Header.Name; } }
public override long CompressedSize => 0;
public override long CompressedSize { get { return 0; } }
public override long Size => FilePart.Header.Size;
public override long Size { get { return FilePart.Header.Size; } }
public override DateTime? LastModifiedTime => FilePart.Header.MTime;
public override DateTime? LastModifiedTime { get { return FilePart.Header.MTime; } }
public override DateTime? CreatedTime => null;
public override DateTime? CreatedTime { get { return null; } }
public override DateTime? LastAccessedTime => null;
public override DateTime? LastAccessedTime { get { return null; } }
public override DateTime? ArchivedTime => null;
public override DateTime? ArchivedTime { get { return null; } }
public override bool IsEncrypted => false;
public override bool IsEncrypted { get { return false; } }
public override bool IsDirectory => FilePart.Header.IsDir;
public override bool IsDirectory { get { return FilePart.Header.IsDir; } }
public override bool IsSplit => false;
public override bool IsSplit { get { return false; } }
public override int? Attrib => (int)FilePart.Header.Attrib;
public override int? Attrib { get { return (int)FilePart.Header.Attrib; } }
internal override IEnumerable<FilePart> Parts => FilePart.AsEnumerable<FilePart>();
internal override IEnumerable<FilePart> Parts { get { return FilePart.AsEnumerable<FilePart>(); } }
}
}

View File

@@ -28,7 +28,7 @@ namespace SharpCompress.Common.SevenZip
internal CFolder Folder { get; }
internal int Index { get; }
internal override string FilePartName => Header.Name;
internal override string FilePartName { get { return Header.Name; } }
internal override Stream GetRawStream()
{

View File

@@ -2,6 +2,7 @@
using System.IO;
using System.Text;
using SharpCompress.Converters;
using SharpCompress.IO;
namespace SharpCompress.Common.Tar.Headers
{
@@ -87,63 +88,75 @@ namespace SharpCompress.Common.Tar.Headers
internal bool Read(BinaryReader reader)
{
var buffer = ReadBlock(reader);
if (buffer.Length == 0)
try
{
return false;
}
if (ReadEntryType(buffer) == EntryType.LongName)
{
Name = ReadLongName(reader, buffer);
buffer = ReadBlock(reader);
}
else
{
Name = ArchiveEncoding.Default.GetString(buffer, 0, 100).TrimNulls();
}
EntryType = ReadEntryType(buffer);
Size = ReadSize(buffer);
//Mode = ReadASCIIInt32Base8(buffer, 100, 7);
//UserId = ReadASCIIInt32Base8(buffer, 108, 7);
//GroupId = ReadASCIIInt32Base8(buffer, 116, 7);
long unixTimeStamp = ReadASCIIInt64Base8(buffer, 136, 11);
LastModifiedTime = Epoch.AddSeconds(unixTimeStamp).ToLocalTime();
Magic = ArchiveEncoding.Default.GetString(buffer, 257, 6).TrimNulls();
if (!string.IsNullOrEmpty(Magic)
&& "ustar".Equals(Magic))
{
string namePrefix = ArchiveEncoding.Default.GetString(buffer, 345, 157);
namePrefix = namePrefix.TrimNulls();
if (!string.IsNullOrEmpty(namePrefix))
if (buffer.Count == 0)
{
Name = namePrefix + "/" + Name;
return false;
}
if (ReadEntryType(buffer.Array) == EntryType.LongName)
{
Name = ReadLongName(reader, buffer.Array);
buffer.Dispose();
buffer = ReadBlock(reader);
}
else
{
Name = ArchiveEncoding.Default.GetString(buffer.Array, 0, 100).TrimNulls();
}
EntryType = ReadEntryType(buffer.Array);
Size = ReadSize(buffer.Array);
//Mode = ReadASCIIInt32Base8(buffer, 100, 7);
//UserId = ReadASCIIInt32Base8(buffer, 108, 7);
//GroupId = ReadASCIIInt32Base8(buffer, 116, 7);
long unixTimeStamp = ReadASCIIInt64Base8(buffer.Array, 136, 11);
LastModifiedTime = Epoch.AddSeconds(unixTimeStamp).ToLocalTime();
Magic = ArchiveEncoding.Default.GetString(buffer.Array, 257, 6).TrimNulls();
if (!string.IsNullOrEmpty(Magic)
&& "ustar".Equals(Magic))
{
string namePrefix = ArchiveEncoding.Default.GetString(buffer.Array, 345, 157);
namePrefix = namePrefix.TrimNulls();
if (!string.IsNullOrEmpty(namePrefix))
{
Name = namePrefix + "/" + Name;
}
}
if (EntryType != EntryType.LongName
&& Name.Length == 0)
{
return false;
}
return true;
}
if (EntryType != EntryType.LongName
&& Name.Length == 0)
finally
{
return false;
buffer.Dispose();
}
return true;
}
private string ReadLongName(BinaryReader reader, byte[] buffer)
{
var size = ReadSize(buffer);
var nameLength = (int)size;
var nameBytes = reader.ReadBytes(nameLength);
var remainingBytesToRead = BlockSize - (nameLength % BlockSize);
// Read the rest of the block and discard the data
if (remainingBytesToRead < BlockSize)
using (var nameBytes = reader.ReadScope(nameLength))
{
reader.ReadBytes(remainingBytesToRead);
var remainingBytesToRead = BlockSize - (nameLength % BlockSize);
// Read the rest of the block and discard the data
if (remainingBytesToRead < BlockSize)
{
using (reader.ReadScope(remainingBytesToRead))
{
}
}
return ArchiveEncoding.Default.GetString(nameBytes.Array, 0, nameBytes.Count).TrimNulls();
}
return ArchiveEncoding.Default.GetString(nameBytes, 0, nameBytes.Length).TrimNulls();
}
private static EntryType ReadEntryType(byte[] buffer)
@@ -160,11 +173,11 @@ namespace SharpCompress.Common.Tar.Headers
return ReadASCIIInt64Base8(buffer, 124, 11);
}
private static byte[] ReadBlock(BinaryReader reader)
private static ByteArrayPoolScope ReadBlock(BinaryReader reader)
{
byte[] buffer = reader.ReadBytes(BlockSize);
var buffer = reader.ReadScope(BlockSize);
if (buffer.Length != 0 && buffer.Length < BlockSize)
if (buffer.Count != 0 && buffer.Count < BlockSize)
{
throw new InvalidOperationException("Buffer is invalid size");
}

View File
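`TarHeader` reads numeric fields as NUL/space-padded ASCII octal (`ReadASCIIInt64Base8`), and the archive is laid out in 512-byte blocks, which is why `ReadLongName` reads and discards the remainder of the final block. A sketch of both conventions (function names are illustrative):

```python
BLOCK_SIZE = 512

def parse_octal(buffer, offset, length):
    # tar numeric fields are ASCII octal, padded with NULs/spaces
    text = buffer[offset:offset + length].strip(b"\x00 ")
    return int(text, 8) if text else 0

def padding_after(size):
    # bytes left in the last 512-byte block after `size` payload bytes,
    # equivalent to the BlockSize - (nameLength % BlockSize) < BlockSize
    # check in the C# code
    return (BLOCK_SIZE - size % BLOCK_SIZE) % BLOCK_SIZE
```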

@@ -18,29 +18,29 @@ namespace SharpCompress.Common.Tar
public override CompressionType CompressionType { get; }
public override long Crc => 0;
public override long Crc { get { return 0; } }
public override string Key => filePart.Header.Name;
public override string Key { get { return filePart.Header.Name; } }
public override long CompressedSize => filePart.Header.Size;
public override long CompressedSize { get { return filePart.Header.Size; } }
public override long Size => filePart.Header.Size;
public override long Size { get { return filePart.Header.Size; } }
public override DateTime? LastModifiedTime => filePart.Header.LastModifiedTime;
public override DateTime? LastModifiedTime { get { return filePart.Header.LastModifiedTime; } }
public override DateTime? CreatedTime => null;
public override DateTime? CreatedTime { get { return null; } }
public override DateTime? LastAccessedTime => null;
public override DateTime? LastAccessedTime { get { return null; } }
public override DateTime? ArchivedTime => null;
public override DateTime? ArchivedTime { get { return null; } }
public override bool IsEncrypted => false;
public override bool IsEncrypted { get { return false; } }
public override bool IsDirectory => filePart.Header.EntryType == EntryType.Directory;
public override bool IsDirectory { get { return filePart.Header.EntryType == EntryType.Directory; } }
public override bool IsSplit => false;
public override bool IsSplit { get { return false; } }
internal override IEnumerable<FilePart> Parts => filePart.AsEnumerable<FilePart>();
internal override IEnumerable<FilePart> Parts { get { return filePart.AsEnumerable<FilePart>(); } }
internal static IEnumerable<TarEntry> GetEntries(StreamingMode mode, Stream stream,
CompressionType compressionType)

View File

@@ -16,7 +16,7 @@ namespace SharpCompress.Common.Tar
internal TarHeader Header { get; }
internal override string FilePartName => Header.Name;
internal override string FilePartName { get { return Header.Name; } }
internal override Stream GetCompressedStream()
{

View File

@@ -42,20 +42,20 @@ namespace SharpCompress.Common.Tar
public Stream Stream { get; }
public override bool CanRead => true;
public override bool CanRead { get { return true; } }
public override bool CanSeek => false;
public override bool CanSeek { get { return false; } }
public override bool CanWrite => false;
public override bool CanWrite { get { return false; } }
public override void Flush()
{
throw new NotSupportedException();
}
public override long Length => throw new NotSupportedException();
public override long Length { get { throw new NotSupportedException(); } }
public override long Position { get => throw new NotSupportedException(); set => throw new NotSupportedException(); }
public override long Position { get { throw new NotSupportedException(); } set { throw new NotSupportedException(); } }
public override int Read(byte[] buffer, int offset, int count)
{

View File

@@ -14,7 +14,7 @@ namespace SharpCompress.Common
ReaderOptions = readerOptions;
}
internal Stream Stream => new NonDisposingStream(actualStream);
internal Stream Stream { get { return new NonDisposingStream(actualStream); } }
protected ReaderOptions ReaderOptions { get; }
@@ -22,12 +22,12 @@ namespace SharpCompress.Common
/// RarArchive is the first volume of a multi-part archive.
/// Only Rar 3.0 format and higher
/// </summary>
public virtual bool IsFirstVolume => true;
public virtual bool IsFirstVolume { get { return true; } }
/// <summary>
/// RarArchive is part of a multi-part archive.
/// </summary>
public virtual bool IsMultiVolume => true;
public virtual bool IsMultiVolume { get { return true; } }
private bool disposed;

View File

@@ -21,6 +21,18 @@ namespace SharpCompress.Common.Zip.Headers
Comment = reader.ReadBytes(CommentLength);
}
internal override void Write(BinaryWriter writer)
{
writer.Write(VolumeNumber);
writer.Write(FirstVolumeWithDirectory);
writer.Write(TotalNumberOfEntriesInDisk);
writer.Write(TotalNumberOfEntries);
writer.Write(DirectorySize);
writer.Write(DirectoryStartOffsetRelativeToDisk);
writer.Write(CommentLength);
writer.Write(Comment);
}
public ushort VolumeNumber { get; private set; }
public ushort FirstVolumeWithDirectory { get; private set; }
@@ -36,9 +48,5 @@ namespace SharpCompress.Common.Zip.Headers
public byte[] Comment { get; private set; }
public ushort TotalNumberOfEntries { get; private set; }
public bool IsZip64 => TotalNumberOfEntriesInDisk == ushort.MaxValue
|| DirectorySize == uint.MaxValue
|| DirectoryStartOffsetRelativeToDisk == uint.MaxValue;
}
}

View File

@@ -1,6 +1,6 @@
using System;
using System.IO;
using System.IO;
using System.Linq;
using SharpCompress.IO;
namespace SharpCompress.Common.Zip.Headers
{
@@ -30,42 +30,59 @@ namespace SharpCompress.Common.Zip.Headers
ExternalFileAttributes = reader.ReadUInt32();
RelativeOffsetOfEntryHeader = reader.ReadUInt32();
byte[] name = reader.ReadBytes(nameLength);
Name = DecodeString(name);
byte[] extra = reader.ReadBytes(extraLength);
byte[] comment = reader.ReadBytes(commentLength);
Comment = DecodeString(comment);
LoadExtra(extra);
using (var name = reader.ReadScope(nameLength))
{
Name = DecodeString(name);
}
using (var extra = reader.ReadScope(extraLength))
using (var comment = reader.ReadScope(commentLength))
{
Comment = DecodeString(comment);
LoadExtra(extra);
}
var unicodePathExtra = Extra.FirstOrDefault(u => u.Type == ExtraDataType.UnicodePathExtraField);
if (unicodePathExtra != null)
{
Name = ((ExtraUnicodePathExtraField)unicodePathExtra).UnicodeName;
}
}
var zip64ExtraData = Extra.OfType<Zip64ExtendedInformationExtraField>().FirstOrDefault();
if (zip64ExtraData != null)
{
if (CompressedSize == uint.MaxValue)
{
CompressedSize = zip64ExtraData.CompressedSize;
}
if (UncompressedSize == uint.MaxValue)
{
UncompressedSize = zip64ExtraData.UncompressedSize;
}
if (RelativeOffsetOfEntryHeader == uint.MaxValue)
{
RelativeOffsetOfEntryHeader = zip64ExtraData.RelativeOffsetOfEntryHeader;
}
}
internal override void Write(BinaryWriter writer)
{
writer.Write(Version);
writer.Write(VersionNeededToExtract);
writer.Write((ushort)Flags);
writer.Write((ushort)CompressionMethod);
writer.Write(LastModifiedTime);
writer.Write(LastModifiedDate);
writer.Write(Crc);
writer.Write(CompressedSize);
writer.Write(UncompressedSize);
byte[] nameBytes = EncodeString(Name);
writer.Write((ushort)nameBytes.Length);
//writer.Write((ushort)Extra.Length);
writer.Write((ushort)0);
writer.Write((ushort)Comment.Length);
writer.Write(DiskNumberStart);
writer.Write(InternalFileAttributes);
writer.Write(ExternalFileAttributes);
writer.Write(RelativeOffsetOfEntryHeader);
writer.Write(nameBytes);
// writer.Write(Extra);
writer.Write(Comment);
}
internal ushort Version { get; private set; }
public ushort VersionNeededToExtract { get; set; }
public long RelativeOffsetOfEntryHeader { get; set; }
public uint RelativeOffsetOfEntryHeader { get; set; }
public uint ExternalFileAttributes { get; set; }

View File

@@ -13,5 +13,10 @@ namespace SharpCompress.Common.Zip.Headers
internal override void Read(BinaryReader reader)
{
}
internal override void Write(BinaryWriter writer)
{
throw new NotImplementedException();
}
}
}

View File

@@ -1,5 +1,6 @@
using System.IO;
using System.Linq;
using SharpCompress.IO;
namespace SharpCompress.Common.Zip.Headers
{
@@ -22,29 +23,41 @@ namespace SharpCompress.Common.Zip.Headers
UncompressedSize = reader.ReadUInt32();
ushort nameLength = reader.ReadUInt16();
ushort extraLength = reader.ReadUInt16();
byte[] name = reader.ReadBytes(nameLength);
byte[] extra = reader.ReadBytes(extraLength);
Name = DecodeString(name);
LoadExtra(extra);
using (var name = reader.ReadScope(nameLength))
using (var extra = reader.ReadScope(extraLength))
{
Name = DecodeString(name);
LoadExtra(extra);
}
var unicodePathExtra = Extra.FirstOrDefault(u => u.Type == ExtraDataType.UnicodePathExtraField);
if (unicodePathExtra != null)
{
Name = ((ExtraUnicodePathExtraField)unicodePathExtra).UnicodeName;
}
}
var zip64ExtraData = Extra.OfType<Zip64ExtendedInformationExtraField>().FirstOrDefault();
if (zip64ExtraData != null)
{
if (CompressedSize == uint.MaxValue)
{
CompressedSize = zip64ExtraData.CompressedSize;
}
if (UncompressedSize == uint.MaxValue)
{
UncompressedSize = zip64ExtraData.UncompressedSize;
}
}
internal override void Write(BinaryWriter writer)
{
writer.Write(Version);
writer.Write((ushort)Flags);
writer.Write((ushort)CompressionMethod);
writer.Write(LastModifiedTime);
writer.Write(LastModifiedDate);
writer.Write(Crc);
writer.Write(CompressedSize);
writer.Write(UncompressedSize);
byte[] nameBytes = EncodeString(Name);
writer.Write((ushort)nameBytes.Length);
writer.Write((ushort)0);
//if (Extra != null)
//{
// writer.Write(Extra);
//}
writer.Write(nameBytes);
}
internal ushort Version { get; private set; }

View File

@@ -1,6 +1,5 @@
using System;
using System.Text;
using SharpCompress.Converters;
namespace SharpCompress.Common.Zip.Headers
{
@@ -12,8 +11,7 @@ namespace SharpCompress.Common.Zip.Headers
// Third Party Mappings
// -Info-ZIP Unicode Path Extra Field
UnicodePathExtraField = 0x7075,
Zip64ExtendedInformationExtraField = 0x0001
UnicodePathExtraField = 0x7075
}
internal class ExtraData
@@ -25,7 +23,7 @@ namespace SharpCompress.Common.Zip.Headers
internal class ExtraUnicodePathExtraField : ExtraData
{
internal byte Version => DataBytes[0];
internal byte Version { get { return DataBytes[0]; } }
internal byte[] NameCRC32
{
@@ -49,73 +47,6 @@ namespace SharpCompress.Common.Zip.Headers
}
}
internal class Zip64ExtendedInformationExtraField : ExtraData
{
public Zip64ExtendedInformationExtraField(ExtraDataType type, ushort length, byte[] dataBytes)
{
Type = type;
Length = length;
DataBytes = dataBytes;
Process();
}
//From the spec values are only in the extradata if the standard
//value is set to 0xFFFF, but if one of the sizes are present, both are.
//Hence if length == 4 volume only
// if length == 8 offset only
// if length == 12 offset + volume
// if length == 16 sizes only
// if length == 20 sizes + volume
// if length == 24 sizes + offset
// if length == 28 everything.
//It is unclear how many of these are used in the wild.
private void Process()
{
switch (DataBytes.Length)
{
case 4:
VolumeNumber = DataConverter.LittleEndian.GetUInt32(DataBytes, 0);
return;
case 8:
RelativeOffsetOfEntryHeader = (long)DataConverter.LittleEndian.GetUInt64(DataBytes, 0);
return;
case 12:
RelativeOffsetOfEntryHeader = (long)DataConverter.LittleEndian.GetUInt64(DataBytes, 0);
VolumeNumber = DataConverter.LittleEndian.GetUInt32(DataBytes, 8);
return;
case 16:
UncompressedSize = (long)DataConverter.LittleEndian.GetUInt64(DataBytes, 0);
CompressedSize = (long)DataConverter.LittleEndian.GetUInt64(DataBytes, 8);
return;
case 20:
UncompressedSize = (long)DataConverter.LittleEndian.GetUInt64(DataBytes, 0);
CompressedSize = (long)DataConverter.LittleEndian.GetUInt64(DataBytes, 8);
VolumeNumber = DataConverter.LittleEndian.GetUInt32(DataBytes, 16);
return;
case 24:
UncompressedSize = (long)DataConverter.LittleEndian.GetUInt64(DataBytes, 0);
CompressedSize = (long)DataConverter.LittleEndian.GetUInt64(DataBytes, 8);
RelativeOffsetOfEntryHeader = (long)DataConverter.LittleEndian.GetUInt64(DataBytes, 16);
return;
case 28:
UncompressedSize = (long)DataConverter.LittleEndian.GetUInt64(DataBytes, 0);
CompressedSize = (long)DataConverter.LittleEndian.GetUInt64(DataBytes, 8);
RelativeOffsetOfEntryHeader = (long)DataConverter.LittleEndian.GetUInt64(DataBytes, 16);
VolumeNumber = DataConverter.LittleEndian.GetUInt32(DataBytes, 24);
return;
default:
throw new ArchiveException("Unexpected size of Zip64 extended information extra field");
}
}
public long UncompressedSize { get; private set; }
public long CompressedSize { get; private set; }
public long RelativeOffsetOfEntryHeader { get; private set; }
public uint VolumeNumber { get; private set; }
}
internal static class LocalEntryHeaderExtraFactory
{
internal static ExtraData Create(ExtraDataType type, ushort length, byte[] extraData)
@@ -129,13 +60,6 @@ namespace SharpCompress.Common.Zip.Headers
Length = length,
DataBytes = extraData
};
case ExtraDataType.Zip64ExtendedInformationExtraField:
return new Zip64ExtendedInformationExtraField
(
type,
length,
extraData
);
default:
return new ExtraData
{

View File
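The removed `Zip64ExtendedInformationExtraField.Process` decodes a variable layout purely from the data length: per the comment table, the two 8-byte sizes come first when present, then the 8-byte header offset, then the 4-byte volume number. That switch can be sketched as a length-driven parse (field names are illustrative):

```python
import struct

def parse_zip64_extra(data):
    """Decode a Zip64 extra field by its length, mirroring the case
    table: 16/20/24/28 start with the two 8-byte sizes, 8/12/24/28
    carry the offset, 4/12/20/28 end with the volume number."""
    fields = {}
    n, off = len(data), 0
    if n in (16, 20, 24, 28):
        fields["uncompressed"], fields["compressed"] = struct.unpack_from("<QQ", data, 0)
        off = 16
    if n in (8, 12, 24, 28):
        fields["offset"], = struct.unpack_from("<Q", data, off)
        off += 8
    if n in (4, 12, 20, 28):
        fields["volume"], = struct.unpack_from("<I", data, off)
    return fields
```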

@@ -14,5 +14,10 @@ namespace SharpCompress.Common.Zip.Headers
{
throw new NotImplementedException();
}
internal override void Write(BinaryWriter writer)
{
throw new NotImplementedException();
}
}
}

View File

@@ -1,49 +0,0 @@
using System;
using System.IO;
namespace SharpCompress.Common.Zip.Headers
{
internal class Zip64DirectoryEndHeader : ZipHeader
{
public Zip64DirectoryEndHeader()
: base(ZipHeaderType.Zip64DirectoryEnd)
{
}
internal override void Read(BinaryReader reader)
{
SizeOfDirectoryEndRecord = (long)reader.ReadUInt64();
VersionMadeBy = reader.ReadUInt16();
VersionNeededToExtract = reader.ReadUInt16();
VolumeNumber = reader.ReadUInt32();
FirstVolumeWithDirectory = reader.ReadUInt32();
TotalNumberOfEntriesInDisk = (long)reader.ReadUInt64();
TotalNumberOfEntries = (long)reader.ReadUInt64();
DirectorySize = (long)reader.ReadUInt64();
DirectoryStartOffsetRelativeToDisk = (long)reader.ReadUInt64();
DataSector = reader.ReadBytes((int)(SizeOfDirectoryEndRecord - SizeOfFixedHeaderDataExceptSignatureAndSizeFields));
}
const int SizeOfFixedHeaderDataExceptSignatureAndSizeFields = 44;
public long SizeOfDirectoryEndRecord { get; private set; }
public ushort VersionMadeBy { get; private set; }
public ushort VersionNeededToExtract { get; private set; }
public uint VolumeNumber { get; private set; }
public uint FirstVolumeWithDirectory { get; private set; }
public long TotalNumberOfEntriesInDisk { get; private set; }
public long TotalNumberOfEntries { get; private set; }
public long DirectorySize { get; private set; }
public long DirectoryStartOffsetRelativeToDisk { get; private set; }
public byte[] DataSector { get; private set; }
}
}

View File

@@ -1,25 +0,0 @@
using System.IO;
namespace SharpCompress.Common.Zip.Headers
{
internal class Zip64DirectoryEndLocatorHeader : ZipHeader
{
public Zip64DirectoryEndLocatorHeader()
: base(ZipHeaderType.Zip64DirectoryEndLocator)
{
}
internal override void Read(BinaryReader reader)
{
FirstVolumeWithDirectory = reader.ReadUInt32();
RelativeOffsetOfTheEndOfDirectoryRecord = (long)reader.ReadUInt64();
TotalNumberOfVolumes = reader.ReadUInt32();
}
public uint FirstVolumeWithDirectory { get; private set; }
public long RelativeOffsetOfTheEndOfDirectoryRecord { get; private set; }
public uint TotalNumberOfVolumes { get; private set; }
}
}

View File

@@ -3,6 +3,7 @@ using System.Collections.Generic;
using System.IO;
using System.Text;
using SharpCompress.Converters;
using SharpCompress.IO;
namespace SharpCompress.Common.Zip.Headers
{
@@ -30,14 +31,14 @@ namespace SharpCompress.Common.Zip.Headers
}
}
protected string DecodeString(byte[] str)
protected string DecodeString(ByteArrayPoolScope str)
{
if (FlagUtility.HasFlag(Flags, HeaderFlags.UTF8))
{
return Encoding.UTF8.GetString(str, 0, str.Length);
return Encoding.UTF8.GetString(str.Array, 0, str.Count);
}
return ArchiveEncoding.Default.GetString(str, 0, str.Length);
return ArchiveEncoding.Default.GetString(str.Array, 0, str.Count);
}
protected byte[] EncodeString(string str)
@@ -57,31 +58,15 @@ namespace SharpCompress.Common.Zip.Headers
internal ZipCompressionMethod CompressionMethod { get; set; }
internal long CompressedSize { get; set; }
internal uint CompressedSize { get; set; }
internal long? DataStartPosition { get; set; }
internal long UncompressedSize { get; set; }
internal uint UncompressedSize { get; set; }
internal List<ExtraData> Extra { get; set; }
public string Password { get; set; }
internal PkwareTraditionalEncryptionData ComposeEncryptionData(Stream archiveStream)
{
if (archiveStream == null)
{
throw new ArgumentNullException(nameof(archiveStream));
}
var buffer = new byte[12];
archiveStream.Read(buffer, 0, 12);
PkwareTraditionalEncryptionData encryptionData = PkwareTraditionalEncryptionData.ForRead(Password, this, buffer);
return encryptionData;
}
internal PkwareTraditionalEncryptionData PkwareTraditionalEncryptionData { get; set; }
#if !NO_CRYPTO
internal WinzipAesEncryptionData WinzipAesEncryptionData { get; set; }
#endif
@@ -92,27 +77,27 @@ namespace SharpCompress.Common.Zip.Headers
internal uint Crc { get; set; }
protected void LoadExtra(byte[] extra)
protected void LoadExtra(ByteArrayPoolScope extra)
{
for (int i = 0; i < extra.Length - 4;)
for (int i = 0; i < extra.Count - 4;)
{
ExtraDataType type = (ExtraDataType)DataConverter.LittleEndian.GetUInt16(extra, i);
ExtraDataType type = (ExtraDataType)DataConverter.LittleEndian.GetUInt16(extra.Array, i);
if (!Enum.IsDefined(typeof(ExtraDataType), type))
{
type = ExtraDataType.NotImplementedExtraData;
}
ushort length = DataConverter.LittleEndian.GetUInt16(extra, i + 2);
byte[] data = new byte[length];
Buffer.BlockCopy(extra, i + 4, data, 0, length);
Extra.Add(LocalEntryHeaderExtraFactory.Create(type, length, data));
ushort length = DataConverter.LittleEndian.GetUInt16(extra.Array, i + 2);
using (var data = ByteArrayPool.RentScope(length))
{
Buffer.BlockCopy(extra.Array, i + 4, data.Array, 0, length);
Extra.Add(LocalEntryHeaderExtraFactory.Create(type, length, data.Array));
}
i += length + 4;
}
}
internal ZipFilePart Part { get; set; }
internal bool IsZip64 => CompressedSize == uint.MaxValue;
}
}

View File
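`LoadExtra` walks the extra area as a sequence of TLV records: a 2-byte little-endian type, a 2-byte length, then `length` data bytes. A sketch of that walk (a plain Python rendering of the loop, not the SharpCompress API):

```python
import struct

def parse_extra(extra):
    """Split a zip 'extra' blob into (type, data) records:
    2-byte id, 2-byte length, then the payload."""
    records, i = [], 0
    while i + 4 <= len(extra):
        etype, length = struct.unpack_from("<HH", extra, i)
        records.append((etype, extra[i + 4:i + 4 + length]))
        i += 4 + length
    return records
```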

@@ -10,10 +10,12 @@ namespace SharpCompress.Common.Zip.Headers
HasData = true;
}
internal ZipHeaderType ZipHeaderType { get; }
internal ZipHeaderType ZipHeaderType { get; private set; }
internal abstract void Read(BinaryReader reader);
internal abstract void Write(BinaryWriter writer);
internal bool HasData { get; set; }
}
}

View File

@@ -6,8 +6,6 @@
LocalEntry,
DirectoryEntry,
DirectoryEnd,
Split,
Zip64DirectoryEnd,
Zip64DirectoryEndLocator
Split
}
}

View File

@@ -23,15 +23,15 @@ namespace SharpCompress.Common.Zip
this.mode = mode;
}
public override bool CanRead => (mode == CryptoMode.Decrypt);
public override bool CanRead { get { return (mode == CryptoMode.Decrypt); } }
public override bool CanSeek => false;
public override bool CanSeek { get { return false; } }
public override bool CanWrite => (mode == CryptoMode.Encrypt);
public override bool CanWrite { get { return (mode == CryptoMode.Encrypt); } }
public override long Length => throw new NotSupportedException();
public override long Length { get { throw new NotSupportedException(); } }
public override long Position { get => throw new NotSupportedException(); set => throw new NotSupportedException(); }
public override long Position { get { throw new NotSupportedException(); } set { throw new NotSupportedException(); } }
public override int Read(byte[] buffer, int offset, int count)
{

View File

@@ -24,7 +24,7 @@ namespace SharpCompress.Common.Zip
return base.GetCompressedStream();
}
internal string Comment => (Header as DirectoryEntryHeader).Comment;
internal string Comment { get { return (Header as DirectoryEntryHeader).Comment; } }
private void LoadLocalHeader()
{

View File

@@ -9,7 +9,6 @@ namespace SharpCompress.Common.Zip
internal class SeekableZipHeaderFactory : ZipHeaderFactory
{
private const int MAX_ITERATIONS_FOR_DIRECTORY_HEADER = 4096;
private bool zip64;
internal SeekableZipHeaderFactory(string password)
: base(StreamingMode.Seekable, password)
@@ -17,56 +16,11 @@ namespace SharpCompress.Common.Zip
}
internal IEnumerable<DirectoryEntryHeader> ReadSeekableHeader(Stream stream)
{
var reader = new BinaryReader(stream);
SeekBackToHeader(stream, reader, DIRECTORY_END_HEADER_BYTES);
var entry = new DirectoryEndHeader();
entry.Read(reader);
if (entry.IsZip64)
{
zip64 = true;
SeekBackToHeader(stream, reader, ZIP64_END_OF_CENTRAL_DIRECTORY_LOCATOR);
var zip64Locator = new Zip64DirectoryEndLocatorHeader();
zip64Locator.Read(reader);
stream.Seek(zip64Locator.RelativeOffsetOfTheEndOfDirectoryRecord, SeekOrigin.Begin);
uint zip64Signature = reader.ReadUInt32();
if(zip64Signature != ZIP64_END_OF_CENTRAL_DIRECTORY)
throw new ArchiveException("Failed to locate the Zip64 Header");
var zip64Entry = new Zip64DirectoryEndHeader();
zip64Entry.Read(reader);
stream.Seek(zip64Entry.DirectoryStartOffsetRelativeToDisk, SeekOrigin.Begin);
}
else
{
stream.Seek(entry.DirectoryStartOffsetRelativeToDisk, SeekOrigin.Begin);
}
long position = stream.Position;
while (true)
{
stream.Position = position;
uint signature = reader.ReadUInt32();
var directoryEntryHeader = ReadHeader(signature, reader, zip64) as DirectoryEntryHeader;
position = stream.Position;
if (directoryEntryHeader == null)
{
yield break;
}
//entry could be zero bytes so we need to know that.
directoryEntryHeader.HasData = directoryEntryHeader.CompressedSize != 0;
yield return directoryEntryHeader;
}
}
private static void SeekBackToHeader(Stream stream, BinaryReader reader, uint headerSignature)
{
long offset = 0;
uint signature;
BinaryReader reader = new BinaryReader(stream);
int iterationCount = 0;
do
{
@@ -80,10 +34,33 @@ namespace SharpCompress.Common.Zip
iterationCount++;
if (iterationCount > MAX_ITERATIONS_FOR_DIRECTORY_HEADER)
{
throw new ArchiveException("Could not find Zip file Directory at the end of the file. File may be corrupted.");
throw new ArchiveException(
"Could not find Zip file Directory at the end of the file. File may be corrupted.");
}
}
while (signature != headerSignature);
while (signature != DIRECTORY_END_HEADER_BYTES);
var entry = new DirectoryEndHeader();
entry.Read(reader);
stream.Seek(entry.DirectoryStartOffsetRelativeToDisk, SeekOrigin.Begin);
DirectoryEntryHeader directoryEntryHeader = null;
long position = stream.Position;
while (true)
{
stream.Position = position;
signature = reader.ReadUInt32();
directoryEntryHeader = ReadHeader(signature, reader) as DirectoryEntryHeader;
position = stream.Position;
if (directoryEntryHeader == null)
{
yield break;
}
//entry could be zero bytes so we need to know that.
directoryEntryHeader.HasData = directoryEntryHeader.CompressedSize != 0;
yield return directoryEntryHeader;
}
}
internal LocalEntryHeader GetLocalHeader(Stream stream, DirectoryEntryHeader directoryEntryHeader)
@@ -91,7 +68,7 @@ namespace SharpCompress.Common.Zip
stream.Seek(directoryEntryHeader.RelativeOffsetOfEntryHeader, SeekOrigin.Begin);
BinaryReader reader = new BinaryReader(stream);
uint signature = reader.ReadUInt32();
var localEntryHeader = ReadHeader(signature, reader, zip64) as LocalEntryHeader;
var localEntryHeader = ReadHeader(signature, reader) as LocalEntryHeader;
if (localEntryHeader == null)
{
throw new InvalidOperationException();

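Reviewer note: `SeekBackToHeader` scans backwards from the end of the stream for the End Of Central Directory signature (`DIRECTORY_END_HEADER_BYTES`, `0x06054b50`); the removed Zip64 branch then chased the `0x07064b50` locator to the 64-bit directory record. A Python sketch of the backward scan against a real archive (`find_eocd` is a hypothetical helper, not part of SharpCompress):

```python
import io
import struct
import zipfile

EOCD_SIG = 0x06054b50  # same value as DIRECTORY_END_HEADER_BYTES

def find_eocd(data, max_scan=4096):
    """Scan backwards for the End Of Central Directory record."""
    sig = struct.pack("<I", EOCD_SIG)
    # The EOCD is at least 22 bytes from the end; a trailing archive
    # comment can push it further back, hence the bounded backward scan.
    start = max(0, len(data) - max_scan)
    pos = data.rfind(sig, start)
    if pos < 0:
        raise ValueError(
            "Could not find Zip file Directory at the end of the file.")
    # Central-directory start offset: 16 bytes past the signature.
    (cd_offset,) = struct.unpack_from("<I", data, pos + 16)
    return pos, cd_offset

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("a.txt", "hello")
pos, cd_offset = find_eocd(buf.getvalue())
```

In a Zip64 archive `cd_offset` is `0xFFFFFFFF`, which is exactly the cue the removed `IsZip64` branch keyed on.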
View File

@@ -25,7 +25,7 @@ namespace SharpCompress.Common.Zip
{
return Stream.Null;
}
decompressionStream = CreateDecompressionStream(GetCryptoStream(CreateBaseStream()), Header.CompressionMethod);
decompressionStream = CreateDecompressionStream(GetCryptoStream(CreateBaseStream()));
if (LeaveStreamOpen)
{
return new NonDisposingStream(decompressionStream);

View File

@@ -28,7 +28,7 @@ namespace SharpCompress.Common.Zip
ZipHeader header = null;
BinaryReader reader = new BinaryReader(rewindableStream);
if (lastEntryHeader != null &&
(FlagUtility.HasFlag(lastEntryHeader.Flags, HeaderFlags.UsePostDataDescriptor) || lastEntryHeader.IsZip64))
FlagUtility.HasFlag(lastEntryHeader.Flags, HeaderFlags.UsePostDataDescriptor))
{
reader = (lastEntryHeader.Part as StreamingZipFilePart).FixStreamedFileLocation(ref rewindableStream);
long? pos = rewindableStream.CanSeek ? (long?)rewindableStream.Position : null;

View File

@@ -52,28 +52,28 @@ namespace SharpCompress.Common.Zip
}
}
public override long Crc => filePart.Header.Crc;
public override long Crc { get { return filePart.Header.Crc; } }
public override string Key => filePart.Header.Name;
public override string Key { get { return filePart.Header.Name; } }
public override long CompressedSize => filePart.Header.CompressedSize;
public override long CompressedSize { get { return filePart.Header.CompressedSize; } }
public override long Size => filePart.Header.UncompressedSize;
public override long Size { get { return filePart.Header.UncompressedSize; } }
public override DateTime? LastModifiedTime { get; }
public override DateTime? CreatedTime => null;
public override DateTime? CreatedTime { get { return null; } }
public override DateTime? LastAccessedTime => null;
public override DateTime? LastAccessedTime { get { return null; } }
public override DateTime? ArchivedTime => null;
public override DateTime? ArchivedTime { get { return null; } }
public override bool IsEncrypted => FlagUtility.HasFlag(filePart.Header.Flags, HeaderFlags.Encrypted);
public override bool IsEncrypted { get { return FlagUtility.HasFlag(filePart.Header.Flags, HeaderFlags.Encrypted); } }
public override bool IsDirectory => filePart.Header.IsDirectory;
public override bool IsDirectory { get { return filePart.Header.IsDirectory; } }
public override bool IsSplit => false;
public override bool IsSplit { get { return false; } }
internal override IEnumerable<FilePart> Parts => filePart.AsEnumerable<FilePart>();
internal override IEnumerable<FilePart> Parts { get { return filePart.AsEnumerable<FilePart>(); } }
}
}

View File

@@ -21,10 +21,10 @@ namespace SharpCompress.Common.Zip
BaseStream = stream;
}
internal Stream BaseStream { get; }
internal Stream BaseStream { get; private set; }
internal ZipFileEntry Header { get; set; }
internal override string FilePartName => Header.Name;
internal override string FilePartName { get { return Header.Name; } }
internal override Stream GetCompressedStream()
{
@@ -32,7 +32,7 @@ namespace SharpCompress.Common.Zip
{
return Stream.Null;
}
Stream decompressionStream = CreateDecompressionStream(GetCryptoStream(CreateBaseStream()), Header.CompressionMethod);
Stream decompressionStream = CreateDecompressionStream(GetCryptoStream(CreateBaseStream()));
if (LeaveStreamOpen)
{
return new NonDisposingStream(decompressionStream);
@@ -51,11 +51,11 @@ namespace SharpCompress.Common.Zip
protected abstract Stream CreateBaseStream();
protected bool LeaveStreamOpen => FlagUtility.HasFlag(Header.Flags, HeaderFlags.UsePostDataDescriptor) || Header.IsZip64;
protected bool LeaveStreamOpen { get { return FlagUtility.HasFlag(Header.Flags, HeaderFlags.UsePostDataDescriptor); } }
protected Stream CreateDecompressionStream(Stream stream, ZipCompressionMethod method)
protected Stream CreateDecompressionStream(Stream stream)
{
switch (method)
switch (Header.CompressionMethod)
{
case ZipCompressionMethod.None:
{
@@ -102,9 +102,9 @@ namespace SharpCompress.Common.Zip
{
throw new InvalidFormatException("Winzip data length is not 7.");
}
ushort compressedMethod = DataConverter.LittleEndian.GetUInt16(data.DataBytes, 0);
ushort method = DataConverter.LittleEndian.GetUInt16(data.DataBytes, 0);
if (compressedMethod != 0x01 && compressedMethod != 0x02)
if (method != 0x01 && method != 0x02)
{
throw new InvalidFormatException("Unexpected vendor version number for WinZip AES metadata");
}
@@ -114,7 +114,8 @@ namespace SharpCompress.Common.Zip
{
throw new InvalidFormatException("Unexpected vendor ID for WinZip AES metadata");
}
return CreateDecompressionStream(stream, (ZipCompressionMethod)DataConverter.LittleEndian.GetUInt16(data.DataBytes, 5));
Header.CompressionMethod = (ZipCompressionMethod)DataConverter.LittleEndian.GetUInt16(data.DataBytes, 5);
return CreateDecompressionStream(stream);
}
default:
{
@@ -125,16 +126,18 @@ namespace SharpCompress.Common.Zip
protected Stream GetCryptoStream(Stream plainStream)
{
bool isFileEncrypted = FlagUtility.HasFlag(Header.Flags, HeaderFlags.Encrypted);
if (Header.CompressedSize == 0 && isFileEncrypted)
if ((Header.CompressedSize == 0)
#if !NO_CRYPTO
&& ((Header.PkwareTraditionalEncryptionData != null)
|| (Header.WinzipAesEncryptionData != null)))
#else
&& (Header.PkwareTraditionalEncryptionData != null))
#endif
{
throw new NotSupportedException("Cannot encrypt file with unknown size at start.");
}
if ((Header.CompressedSize == 0
if ((Header.CompressedSize == 0)
&& FlagUtility.HasFlag(Header.Flags, HeaderFlags.UsePostDataDescriptor))
|| Header.IsZip64)
{
plainStream = new NonDisposingStream(plainStream); //make sure AES doesn't close
}
@@ -142,40 +145,18 @@ namespace SharpCompress.Common.Zip
{
plainStream = new ReadOnlySubStream(plainStream, Header.CompressedSize); //make sure AES doesn't close
}
if (isFileEncrypted)
if (Header.PkwareTraditionalEncryptionData != null)
{
switch (Header.CompressionMethod)
{
case ZipCompressionMethod.None:
case ZipCompressionMethod.Deflate:
case ZipCompressionMethod.Deflate64:
case ZipCompressionMethod.BZip2:
case ZipCompressionMethod.LZMA:
case ZipCompressionMethod.PPMd:
{
return new PkwareTraditionalCryptoStream(plainStream, Header.ComposeEncryptionData(plainStream), CryptoMode.Decrypt);
}
case ZipCompressionMethod.WinzipAes:
{
#if !NO_FILE
if (Header.WinzipAesEncryptionData != null)
{
return new WinzipAesCryptoStream(plainStream, Header.WinzipAesEncryptionData, Header.CompressedSize - 10);
}
#endif
return plainStream;
}
default:
{
throw new ArgumentOutOfRangeException();
}
}
return new PkwareTraditionalCryptoStream(plainStream, Header.PkwareTraditionalEncryptionData,
CryptoMode.Decrypt);
}
#if !NO_FILE
if (Header.WinzipAesEncryptionData != null)
{
//only read 10 less because the last ten are auth bytes
return new WinzipAesCryptoStream(plainStream, Header.WinzipAesEncryptionData, Header.CompressedSize - 10);
}
#endif
return plainStream;
}
}
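Reviewer note: the WinZip AES path above explains the two size adjustments in this file — the stored data is `salt | 2-byte password verifier | ciphertext | 10-byte authentication code`, the salt being half the AES key length. That is why `CompressedSize` is reduced by `salt.Length + 2` after the header is consumed and why the crypto stream is built over `CompressedSize - 10`. A sketch of the framing arithmetic (helper name hypothetical):

```python
def winzip_aes_layout(compressed_size, key_bits):
    """Split a WinZip AES entry's stored bytes into regions (sizes only).
    Layout: salt | 2-byte password verifier | ciphertext | 10-byte auth code."""
    key_bytes = key_bits // 8
    salt_len = key_bytes // 2       # AES-128 -> 8, AES-192 -> 12, AES-256 -> 16
    verifier_len = 2
    auth_len = 10                   # "only read 10 less because the last ten are auth bytes"
    cipher_len = compressed_size - salt_len - verifier_len - auth_len
    if cipher_len < 0:
        raise ValueError("entry too small for WinZip AES framing")
    return {"salt": salt_len, "verifier": verifier_len,
            "ciphertext": cipher_len, "auth_code": auth_len}

layout = winzip_aes_layout(compressed_size=100, key_bits=256)
```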

View File

@@ -17,8 +17,8 @@ namespace SharpCompress.Common.Zip
internal const uint DIGITAL_SIGNATURE = 0x05054b50;
internal const uint SPLIT_ARCHIVE_HEADER_BYTES = 0x30304b50;
internal const uint ZIP64_END_OF_CENTRAL_DIRECTORY = 0x06064b50;
internal const uint ZIP64_END_OF_CENTRAL_DIRECTORY_LOCATOR = 0x07064b50;
private const uint ZIP64_END_OF_CENTRAL_DIRECTORY = 0x06064b50;
private const uint ZIP64_END_OF_CENTRAL_DIRECTORY_LOCATOR = 0x07064b50;
protected LocalEntryHeader lastEntryHeader;
private readonly string password;
@@ -30,7 +30,7 @@ namespace SharpCompress.Common.Zip
this.password = password;
}
protected ZipHeader ReadHeader(uint headerBytes, BinaryReader reader, bool zip64 = false)
protected ZipHeader ReadHeader(uint headerBytes, BinaryReader reader)
{
switch (headerBytes)
{
@@ -54,12 +54,14 @@ namespace SharpCompress.Common.Zip
if (FlagUtility.HasFlag(lastEntryHeader.Flags, HeaderFlags.UsePostDataDescriptor))
{
lastEntryHeader.Crc = reader.ReadUInt32();
lastEntryHeader.CompressedSize = zip64 ? (long)reader.ReadUInt64() : reader.ReadUInt32();
lastEntryHeader.UncompressedSize = zip64 ? (long)reader.ReadUInt64() : reader.ReadUInt32();
lastEntryHeader.CompressedSize = reader.ReadUInt32();
lastEntryHeader.UncompressedSize = reader.ReadUInt32();
}
else
{
reader.ReadBytes(zip64 ? 20 : 12);
reader.ReadUInt32();
reader.ReadUInt32();
reader.ReadUInt32();
}
return null;
}
@@ -76,14 +78,9 @@ namespace SharpCompress.Common.Zip
return new SplitHeader();
}
case ZIP64_END_OF_CENTRAL_DIRECTORY:
{
var entry = new Zip64DirectoryEndHeader();
entry.Read(reader);
return entry;
}
case ZIP64_END_OF_CENTRAL_DIRECTORY_LOCATOR:
{
var entry = new Zip64DirectoryEndLocatorHeader();
var entry = new IgnoreHeader(ZipHeaderType.Ignore);
entry.Read(reader);
return entry;
}
@@ -114,43 +111,46 @@ namespace SharpCompress.Common.Zip
{
if (FlagUtility.HasFlag(entryHeader.Flags, HeaderFlags.Encrypted))
{
if (!entryHeader.IsDirectory && entryHeader.CompressedSize == 0 &&
if (!entryHeader.IsDirectory &&
entryHeader.CompressedSize == 0 &&
FlagUtility.HasFlag(entryHeader.Flags, HeaderFlags.UsePostDataDescriptor))
{
throw new NotSupportedException("SharpCompress cannot currently read non-seekable Zip Streams with encrypted data that has been written in a non-seekable manner.");
throw new NotSupportedException(
"SharpCompress cannot currently read non-seekable Zip Streams with encrypted data that has been written in a non-seekable manner.");
}
if (password == null)
{
throw new CryptographicException("No password supplied for encrypted zip.");
}
entryHeader.Password = password;
if (entryHeader.CompressionMethod == ZipCompressionMethod.WinzipAes)
if (entryHeader.CompressionMethod != ZipCompressionMethod.WinzipAes)
{
byte[] buffer = new byte[12];
stream.Read(buffer, 0, 12);
entryHeader.PkwareTraditionalEncryptionData = PkwareTraditionalEncryptionData.ForRead(password,
entryHeader,
buffer);
entryHeader.CompressedSize -= 12;
}
else
{
#if NO_CRYPTO
throw new NotSupportedException("Cannot decrypt Winzip AES with Silverlight or WP7.");
#else
ExtraData data = entryHeader.Extra.SingleOrDefault(x => x.Type == ExtraDataType.WinZipAes);
if (data != null)
{
var keySize = (WinzipAesKeySize)data.DataBytes[4];
var data = entryHeader.Extra.SingleOrDefault(x => x.Type == ExtraDataType.WinZipAes);
WinzipAesKeySize keySize = (WinzipAesKeySize) data.DataBytes[4];
var salt = new byte[WinzipAesEncryptionData.KeyLengthInBytes(keySize) / 2];
var passwordVerifyValue = new byte[2];
stream.Read(salt, 0, salt.Length);
stream.Read(passwordVerifyValue, 0, 2);
entryHeader.WinzipAesEncryptionData =
new WinzipAesEncryptionData(keySize, salt, passwordVerifyValue, password);
byte[] salt = new byte[WinzipAesEncryptionData.KeyLengthInBytes(keySize)/2];
byte[] passwordVerifyValue = new byte[2];
stream.Read(salt, 0, salt.Length);
stream.Read(passwordVerifyValue, 0, 2);
entryHeader.WinzipAesEncryptionData = new WinzipAesEncryptionData(keySize, salt, passwordVerifyValue,
password);
entryHeader.CompressedSize -= (uint) (salt.Length + 2);
entryHeader.CompressedSize -= (uint)(salt.Length + 2);
}
#endif
}
}
if (entryHeader.IsDirectory)
{
return;
@@ -168,15 +168,13 @@ namespace SharpCompress.Common.Zip
{
entryHeader.DataStartPosition = stream.Position;
stream.Position += entryHeader.CompressedSize;
break;
}
break;
case StreamingMode.Streaming:
{
entryHeader.PackedStream = stream;
break;
}
break;
default:
{
throw new InvalidFormatException("Invalid StreamingMode");

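Reviewer note: the post-data-descriptor hunk above is where the Zip64 support goes away — the descriptor holds CRC-32 plus compressed and uncompressed sizes, and under Zip64 the two size fields widen from 4 to 8 bytes, which is why the removed code skipped 20 bytes where the restored code skips 12. A hypothetical parser showing both shapes:

```python
import struct

def parse_data_descriptor(data, zip64=False):
    """Parse a zip post-data descriptor; the PK\x07\x08 signature is
    optional per the spec, so strip it if present."""
    if data[:4] == b"PK\x07\x08":
        data = data[4:]
    # crc32, compressed size, uncompressed size:
    # 4+4+4 = 12 bytes normally, 4+8+8 = 20 bytes under Zip64.
    fmt = "<IQQ" if zip64 else "<III"
    crc, csize, usize = struct.unpack_from(fmt, data)
    return crc, csize, usize

crc, csize, usize = parse_data_descriptor(
    b"PK\x07\x08" + struct.pack("<III", 0xDEADBEEF, 5, 11))
```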
View File

@@ -73,15 +73,15 @@ namespace SharpCompress.Compressors.ADC
this.stream = stream;
}
public override bool CanRead => stream.CanRead;
public override bool CanRead { get { return stream.CanRead; } }
public override bool CanSeek => false;
public override bool CanSeek { get { return false; } }
public override bool CanWrite => false;
public override bool CanWrite { get { return false; } }
public override long Length => throw new NotSupportedException();
public override long Length { get { throw new NotSupportedException(); } }
public override long Position { get => position; set => throw new NotSupportedException(); }
public override long Position { get { return position; } set { throw new NotSupportedException(); } }
public override void Flush()
{

View File

@@ -1,4 +1,5 @@
using System.IO;
using SharpCompress.IO;
namespace SharpCompress.Compressors.BZip2
{
@@ -48,20 +49,20 @@ namespace SharpCompress.Compressors.BZip2
public CompressionMode Mode { get; }
public override bool CanRead => stream.CanRead;
public override bool CanRead { get { return stream.CanRead; } }
public override bool CanSeek => stream.CanSeek;
public override bool CanSeek { get { return stream.CanSeek; } }
public override bool CanWrite => stream.CanWrite;
public override bool CanWrite { get { return stream.CanWrite; } }
public override void Flush()
{
stream.Flush();
}
public override long Length => stream.Length;
public override long Length { get { return stream.Length; } }
public override long Position { get => stream.Position; set => stream.Position = value; }
public override long Position { get { return stream.Position; } set { stream.Position = value; } }
public override int Read(byte[] buffer, int offset, int count)
{
@@ -91,12 +92,14 @@ namespace SharpCompress.Compressors.BZip2
public static bool IsBZip2(Stream stream)
{
BinaryReader br = new BinaryReader(stream);
byte[] chars = br.ReadBytes(2);
if (chars.Length < 2 || chars[0] != 'B' || chars[1] != 'Z')
using (var chars = br.ReadScope(2))
{
return false;
if (chars.Count < 2 || chars[0] != 'B' || chars[1] != 'Z')
{
return false;
}
return true;
}
return true;
}
}
}
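Reviewer note: `IsBZip2` just sniffs the two-byte `BZ` magic at the current stream position (the pooled `ReadScope` only changes where the scratch bytes come from). The equivalent check in Python:

```python
import bz2
import io

def is_bzip2(stream):
    """Sniff the two-byte 'BZ' magic; a short read also means 'not BZip2'."""
    magic = stream.read(2)
    return len(magic) == 2 and magic == b"BZ"

compressed = bz2.compress(b"hello")
ok = is_bzip2(io.BytesIO(compressed))
```

Like the C# version, this consumes the two bytes it inspects, so callers on non-seekable streams must account for that.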

View File

@@ -1092,13 +1092,13 @@ namespace SharpCompress.Compressors.BZip2
{
}
public override bool CanRead => true;
public override bool CanRead { get { return true; } }
public override bool CanSeek => false;
public override bool CanSeek { get { return false; } }
public override bool CanWrite => false;
public override bool CanWrite { get { return false; } }
public override long Length => 0;
public override long Length { get { return 0; } }
public override long Position { get { return 0; } set { } }
}

View File

@@ -1956,13 +1956,13 @@ namespace SharpCompress.Compressors.BZip2
}
}
public override bool CanRead => false;
public override bool CanRead { get { return false; } }
public override bool CanSeek => false;
public override bool CanSeek { get { return false; } }
public override bool CanWrite => true;
public override bool CanWrite { get { return true; } }
public override long Length => 0;
public override long Length { get { return 0; } }
public override long Position { get { return 0; } set { } }
}

View File

@@ -92,7 +92,14 @@ namespace SharpCompress.Compressors.Deflate
/// <summary>
/// Indicates the current CRC for all blocks slurped in.
/// </summary>
public Int32 Crc32Result => unchecked((Int32)(~runningCrc32Result));
public Int32 Crc32Result
{
get
{
// return one's complement of the running result
return unchecked((Int32)(~runningCrc32Result));
}
}
/// <summary>
/// Returns the CRC32 for the specified stream.

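Reviewer note: the restored comment ("return one's complement of the running result") is worth keeping — CRC-32 initializes its register to all ones and inverts it at the end, so `runningCrc32Result` holds the pre-inversion register between `SlurpBlock` calls and `Crc32Result` applies the final `~`. A table-driven sketch that matches zlib's CRC-32:

```python
import zlib

def make_crc32_table():
    table = []
    for n in range(256):
        c = n
        for _ in range(8):
            c = (c >> 1) ^ 0xEDB88320 if c & 1 else c >> 1
        table.append(c)
    return table

TABLE = make_crc32_table()

def slurp(running, data):
    """Update the complemented running register, byte at a time."""
    for b in data:
        running = TABLE[(running ^ b) & 0xFF] ^ (running >> 8)
    return running

running = 0xFFFFFFFF                      # register starts as all ones
running = slurp(running, b"hello ")       # blocks can be slurped incrementally
running = slurp(running, b"world")
crc32_result = (~running) & 0xFFFFFFFF    # one's complement, as in Crc32Result
```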
View File

@@ -50,7 +50,7 @@ namespace SharpCompress.Compressors.Deflate
/// </remarks>
public virtual FlushType FlushMode
{
get => (_baseStream._flushMode);
get { return (_baseStream._flushMode); }
set
{
if (_disposed)
@@ -80,7 +80,7 @@ namespace SharpCompress.Compressors.Deflate
/// </remarks>
public int BufferSize
{
get => _baseStream._bufferSize;
get { return _baseStream._bufferSize; }
set
{
if (_disposed)
@@ -111,7 +111,7 @@ namespace SharpCompress.Compressors.Deflate
/// </remarks>
public CompressionStrategy Strategy
{
get => _baseStream.Strategy;
get { return _baseStream.Strategy; }
set
{
if (_disposed)
@@ -123,10 +123,10 @@ namespace SharpCompress.Compressors.Deflate
}
/// <summary> Returns the total number of bytes input so far.</summary>
public virtual long TotalIn => _baseStream._z.TotalBytesIn;
public virtual long TotalIn { get { return _baseStream._z.TotalBytesIn; } }
/// <summary> Returns the total number of bytes output so far.</summary>
public virtual long TotalOut => _baseStream._z.TotalBytesOut;
public virtual long TotalOut { get { return _baseStream._z.TotalBytesOut; } }
#endregion
@@ -156,7 +156,7 @@ namespace SharpCompress.Compressors.Deflate
/// <remarks>
/// Always returns false.
/// </remarks>
public override bool CanSeek => false;
public override bool CanSeek { get { return false; } }
/// <summary>
/// Indicates whether the stream can be written.
@@ -179,7 +179,7 @@ namespace SharpCompress.Compressors.Deflate
/// <summary>
/// Reading this property always throws a <see cref="NotImplementedException"/>.
/// </summary>
public override long Length => throw new NotSupportedException();
public override long Length { get { throw new NotSupportedException(); } }
/// <summary>
/// The position of the stream pointer.
@@ -206,7 +206,7 @@ namespace SharpCompress.Compressors.Deflate
}
return 0;
}
set => throw new NotSupportedException();
set { throw new NotSupportedException(); }
}
/// <summary>
@@ -342,7 +342,13 @@ namespace SharpCompress.Compressors.Deflate
#endregion
public MemoryStream InputBuffer => new MemoryStream(_baseStream._z.InputBuffer, _baseStream._z.NextIn,
_baseStream._z.AvailableBytesIn);
public MemoryStream InputBuffer
{
get
{
return new MemoryStream(_baseStream._z.InputBuffer, _baseStream._z.NextIn,
_baseStream._z.AvailableBytesIn);
}
}
}
}

View File

@@ -30,6 +30,7 @@ using System;
using System.IO;
using SharpCompress.Common;
using SharpCompress.Converters;
using SharpCompress.IO;
namespace SharpCompress.Compressors.Deflate
{
@@ -71,7 +72,7 @@ namespace SharpCompress.Compressors.Deflate
public virtual FlushType FlushMode
{
get => (BaseStream._flushMode);
get { return (BaseStream._flushMode); }
set
{
if (disposed)
@@ -84,7 +85,7 @@ namespace SharpCompress.Compressors.Deflate
public int BufferSize
{
get => BaseStream._bufferSize;
get { return BaseStream._bufferSize; }
set
{
if (disposed)
@@ -105,9 +106,9 @@ namespace SharpCompress.Compressors.Deflate
}
}
internal virtual long TotalIn => BaseStream._z.TotalBytesIn;
internal virtual long TotalIn { get { return BaseStream._z.TotalBytesIn; } }
internal virtual long TotalOut => BaseStream._z.TotalBytesOut;
internal virtual long TotalOut { get { return BaseStream._z.TotalBytesOut; } }
#endregion
@@ -137,7 +138,7 @@ namespace SharpCompress.Compressors.Deflate
/// <remarks>
/// Always returns false.
/// </remarks>
public override bool CanSeek => false;
public override bool CanSeek { get { return false; } }
/// <summary>
/// Indicates whether the stream can be written.
@@ -160,7 +161,7 @@ namespace SharpCompress.Compressors.Deflate
/// <summary>
/// Reading this property always throws a <see cref="NotImplementedException"/>.
/// </summary>
public override long Length => throw new NotSupportedException();
public override long Length { get { throw new NotSupportedException(); } }
/// <summary>
/// The position of the stream pointer.
@@ -188,7 +189,7 @@ namespace SharpCompress.Compressors.Deflate
return 0;
}
set => throw new NotSupportedException();
set { throw new NotSupportedException(); }
}
/// <summary>
@@ -350,7 +351,7 @@ namespace SharpCompress.Compressors.Deflate
public String Comment
{
get => comment;
get { return comment; }
set
{
if (disposed)
@@ -363,7 +364,7 @@ namespace SharpCompress.Compressors.Deflate
public string FileName
{
get => fileName;
get { return fileName; }
set
{
if (disposed)
@@ -413,67 +414,74 @@ namespace SharpCompress.Compressors.Deflate
int fnLength = (FileName == null) ? 0 : filenameBytes.Length + 1;
int bufferLength = 10 + cbLength + fnLength;
var header = new byte[bufferLength];
int i = 0;
// ID
header[i++] = 0x1F;
header[i++] = 0x8B;
// compression method
header[i++] = 8;
byte flag = 0;
if (Comment != null)
var header = ByteArrayPool.RentWritable(bufferLength);
try
{
flag ^= 0x10;
int i = 0;
// ID
header[i++] = 0x1F;
header[i++] = 0x8B;
// compression method
header[i++] = 8;
byte flag = 0;
if (Comment != null)
{
flag ^= 0x10;
}
if (FileName != null)
{
flag ^= 0x8;
}
// flag
header[i++] = flag;
// mtime
if (!LastModified.HasValue)
{
LastModified = DateTime.Now;
}
TimeSpan delta = LastModified.Value - UnixEpoch;
var timet = (Int32)delta.TotalSeconds;
DataConverter.LittleEndian.PutBytes(header, i, timet);
i += 4;
// xflg
header[i++] = 0; // this field is totally useless
// OS
header[i++] = 0xFF; // 0xFF == unspecified
// extra field length - only if FEXTRA is set, which it is not.
//header[i++]= 0;
//header[i++]= 0;
// filename
if (fnLength != 0)
{
Array.Copy(filenameBytes, 0, header, i, fnLength - 1);
i += fnLength - 1;
header[i++] = 0; // terminate
}
// comment
if (cbLength != 0)
{
Array.Copy(commentBytes, 0, header, i, cbLength - 1);
i += cbLength - 1;
header[i++] = 0; // terminate
}
BaseStream._stream.Write(header, 0, bufferLength);
return bufferLength; // bytes written
}
if (FileName != null)
finally
{
flag ^= 0x8;
ByteArrayPool.Return(header);
}
// flag
header[i++] = flag;
// mtime
if (!LastModified.HasValue)
{
LastModified = DateTime.Now;
}
TimeSpan delta = LastModified.Value - UnixEpoch;
var timet = (Int32)delta.TotalSeconds;
DataConverter.LittleEndian.PutBytes(header, i, timet);
i += 4;
// xflg
header[i++] = 0; // this field is totally useless
// OS
header[i++] = 0xFF; // 0xFF == unspecified
// extra field length - only if FEXTRA is set, which it is not.
//header[i++]= 0;
//header[i++]= 0;
// filename
if (fnLength != 0)
{
Array.Copy(filenameBytes, 0, header, i, fnLength - 1);
i += fnLength - 1;
header[i++] = 0; // terminate
}
// comment
if (cbLength != 0)
{
Array.Copy(commentBytes, 0, header, i, cbLength - 1);
i += cbLength - 1;
header[i++] = 0; // terminate
}
BaseStream._stream.Write(header, 0, header.Length);
return header.Length; // bytes written
}
}
}
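Reviewer note: the `EmitHeader` rewrite above only changes where the buffer comes from; the bytes it writes are the standard GZIP member header — `ID1 ID2 CM FLG MTIME XFL OS`, then optional zero-terminated `FNAME`/`FCOMMENT` when flag bits `0x08`/`0x10` are set. A sketch of the same layout:

```python
import struct
import time

def emit_gzip_header(filename=None, comment=None, mtime=None):
    """Build a GZIP member header, mirroring EmitHeader's field order."""
    flag = 0
    if comment is not None:
        flag |= 0x10                 # FCOMMENT
    if filename is not None:
        flag |= 0x08                 # FNAME
    if mtime is None:
        mtime = int(time.time())     # EmitHeader defaults to DateTime.Now
    # ID1=0x1F, ID2=0x8B, CM=8 (deflate), FLG, MTIME, XFL=0, OS=0xFF (unspecified)
    header = struct.pack("<BBBBIBB", 0x1F, 0x8B, 8, flag, mtime, 0, 0xFF)
    if filename is not None:
        header += filename.encode("latin-1") + b"\x00"   # zero-terminated
    if comment is not None:
        header += comment.encode("latin-1") + b"\x00"
    return header

hdr = emit_gzip_header(filename="a.txt", mtime=0)
```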

View File

@@ -30,6 +30,7 @@ using System.IO;
using SharpCompress.Common;
using SharpCompress.Common.Tar.Headers;
using SharpCompress.Converters;
using SharpCompress.IO;
namespace SharpCompress.Compressors.Deflate
{
@@ -98,7 +99,7 @@ namespace SharpCompress.Compressors.Deflate
}
}
protected internal bool _wantCompress => (_compressionMode == CompressionMode.Compress);
protected internal bool _wantCompress { get { return (_compressionMode == CompressionMode.Compress); } }
private ZlibCodec z
{
@@ -267,46 +268,48 @@ namespace SharpCompress.Compressors.Deflate
}
// Read and potentially verify the GZIP trailer: CRC32 and size mod 2^32
byte[] trailer = new byte[8];
// workitem 8679
if (_z.AvailableBytesIn != 8)
using (var trailer = ByteArrayPool.RentScope(8))
{
// Make sure we have read to the end of the stream
Array.Copy(_z.InputBuffer, _z.NextIn, trailer, 0, _z.AvailableBytesIn);
int bytesNeeded = 8 - _z.AvailableBytesIn;
int bytesRead = _stream.Read(trailer,
_z.AvailableBytesIn,
bytesNeeded);
if (bytesNeeded != bytesRead)
// workitem 8679
if (_z.AvailableBytesIn != 8)
{
throw new ZlibException(String.Format(
"Protocol error. AvailableBytesIn={0}, expected 8",
_z.AvailableBytesIn + bytesRead));
// Make sure we have read to the end of the stream
Array.Copy(_z.InputBuffer, _z.NextIn, trailer.Array, 0, _z.AvailableBytesIn);
int bytesNeeded = 8 - _z.AvailableBytesIn;
int bytesRead = _stream.Read(trailer.Array,
_z.AvailableBytesIn,
bytesNeeded);
if (bytesNeeded != bytesRead)
{
throw new ZlibException(String.Format(
"Protocol error. AvailableBytesIn={0}, expected 8",
_z.AvailableBytesIn + bytesRead));
}
}
else
{
Array.Copy(_z.InputBuffer, _z.NextIn, trailer.Array, 0, trailer.Count);
}
}
else
{
Array.Copy(_z.InputBuffer, _z.NextIn, trailer, 0, trailer.Length);
}
Int32 crc32_expected = DataConverter.LittleEndian.GetInt32(trailer, 0);
Int32 crc32_actual = crc.Crc32Result;
Int32 isize_expected = DataConverter.LittleEndian.GetInt32(trailer, 4);
Int32 isize_actual = (Int32)(_z.TotalBytesOut & 0x00000000FFFFFFFF);
Int32 crc32_expected = DataConverter.LittleEndian.GetInt32(trailer.Array, 0);
Int32 crc32_actual = crc.Crc32Result;
Int32 isize_expected = DataConverter.LittleEndian.GetInt32(trailer.Array, 4);
Int32 isize_actual = (Int32)(_z.TotalBytesOut & 0x00000000FFFFFFFF);
if (crc32_actual != crc32_expected)
{
throw new ZlibException(
String.Format("Bad CRC32 in GZIP stream. (actual({0:X8})!=expected({1:X8}))",
crc32_actual, crc32_expected));
}
if (crc32_actual != crc32_expected)
{
throw new ZlibException(
String.Format("Bad CRC32 in GZIP stream. (actual({0:X8})!=expected({1:X8}))",
crc32_actual, crc32_expected));
}
if (isize_actual != isize_expected)
{
throw new ZlibException(
String.Format("Bad size in GZIP stream. (actual({0})!=expected({1}))", isize_actual,
isize_expected));
if (isize_actual != isize_expected)
{
throw new ZlibException(
String.Format("Bad size in GZIP stream. (actual({0})!=expected({1}))", isize_actual,
isize_expected));
}
}
}
else
@@ -427,57 +430,61 @@ namespace SharpCompress.Compressors.Deflate
int totalBytesRead = 0;
// read the header on the first read
byte[] header = new byte[10];
int n = _stream.Read(header, 0, header.Length);
// workitem 8501: handle edge case (decompress empty stream)
if (n == 0)
using (var header = ByteArrayPool.RentScope(10))
{
return 0;
}
int n = _stream.Read(header);
if (n != 10)
{
throw new ZlibException("Not a valid GZIP stream.");
}
if (header[0] != 0x1F || header[1] != 0x8B || header[2] != 8)
{
throw new ZlibException("Bad GZIP header.");
}
Int32 timet = DataConverter.LittleEndian.GetInt32(header, 4);
_GzipMtime = TarHeader.Epoch.AddSeconds(timet);
totalBytesRead += n;
if ((header[3] & 0x04) == 0x04)
{
// read and discard extra field
n = _stream.Read(header, 0, 2); // 2-byte length field
totalBytesRead += n;
Int16 extraLength = (Int16)(header[0] + header[1] * 256);
byte[] extra = new byte[extraLength];
n = _stream.Read(extra, 0, extra.Length);
if (n != extraLength)
// workitem 8501: handle edge case (decompress empty stream)
if (n == 0)
{
throw new ZlibException("Unexpected end-of-file reading GZIP header.");
return 0;
}
totalBytesRead += n;
}
if ((header[3] & 0x08) == 0x08)
{
_GzipFileName = ReadZeroTerminatedString();
}
if ((header[3] & 0x10) == 0x010)
{
_GzipComment = ReadZeroTerminatedString();
}
if ((header[3] & 0x02) == 0x02)
{
Read(_buf1, 0, 1); // CRC16, ignore
}
return totalBytesRead;
if (n != 10)
{
throw new ZlibException("Not a valid GZIP stream.");
}
if (header[0] != 0x1F || header[1] != 0x8B || header[2] != 8)
{
throw new ZlibException("Bad GZIP header.");
}
Int32 timet = DataConverter.LittleEndian.GetInt32(header.Array, 4);
_GzipMtime = TarHeader.Epoch.AddSeconds(timet);
totalBytesRead += n;
if ((header[3] & 0x04) == 0x04)
{
// read and discard extra field
n = _stream.Read(header.Array, 0, 2); // 2-byte length field
totalBytesRead += n;
Int16 extraLength = (Int16)(header[0] + header[1] * 256);
using (var extra = ByteArrayPool.RentScope(extraLength))
{
n = _stream.Read(extra);
if (n != extraLength)
{
throw new ZlibException("Unexpected end-of-file reading GZIP header.");
}
totalBytesRead += n;
}
}
if ((header[3] & 0x08) == 0x08)
{
_GzipFileName = ReadZeroTerminatedString();
}
if ((header[3] & 0x10) == 0x010)
{
_GzipComment = ReadZeroTerminatedString();
}
if ((header[3] & 0x02) == 0x02)
{
Read(_buf1, 0, 1); // CRC16, ignore
}
return totalBytesRead;
}
}
public override Int32 Read(Byte[] buffer, Int32 offset, Int32 count)
@@ -630,15 +637,15 @@ namespace SharpCompress.Compressors.Deflate
return rc;
}
public override Boolean CanRead => _stream.CanRead;
public override Boolean CanRead { get { return _stream.CanRead; } }
public override Boolean CanSeek => _stream.CanSeek;
public override Boolean CanSeek { get { return _stream.CanSeek; } }
public override Boolean CanWrite => _stream.CanWrite;
public override Boolean CanWrite { get { return _stream.CanWrite; } }
public override Int64 Length => _stream.Length;
public override Int64 Length { get { return _stream.Length; } }
public override long Position { get => throw new NotSupportedException(); set => throw new NotSupportedException(); }
public override long Position { get { throw new NotSupportedException(); } set { throw new NotSupportedException(); } }
internal enum StreamMode
{

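Reviewer note: the trailer hunk in this file validates the 8 bytes every GZIP member ends with — little-endian CRC-32 of the uncompressed data, then ISIZE (uncompressed length mod 2^32) — raising on mismatch exactly as above. A self-contained check against Python's own gzip output:

```python
import gzip
import struct
import zlib

def check_gzip_trailer(member, payload):
    """Verify the 8-byte GZIP trailer: CRC32 then ISIZE, both little-endian."""
    crc_expected, isize_expected = struct.unpack("<II", member[-8:])
    if zlib.crc32(payload) != crc_expected:
        raise ValueError("Bad CRC32 in GZIP stream.")
    if (len(payload) & 0xFFFFFFFF) != isize_expected:
        raise ValueError("Bad size in GZIP stream.")

data = b"compressed payload"
member = gzip.compress(data)
check_gzip_trailer(member, data)   # no exception: trailer is consistent
```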
View File

@@ -171,7 +171,7 @@ namespace SharpCompress.Compressors.Deflate
/// <summary>
/// The Adler32 checksum on the data transferred through the codec so far. You probably don't need to look at this.
/// </summary>
public int Adler32 => (int)_Adler32;
public int Adler32 { get { return (int)_Adler32; } }
/// <summary>
/// Create a ZlibCodec.

View File

@@ -63,7 +63,7 @@ namespace SharpCompress.Compressors.Deflate
/// </summary>
public virtual FlushType FlushMode
{
get => (_baseStream._flushMode);
get { return (_baseStream._flushMode); }
set
{
if (_disposed)
@@ -93,7 +93,7 @@ namespace SharpCompress.Compressors.Deflate
/// </remarks>
public int BufferSize
{
get => _baseStream._bufferSize;
get { return _baseStream._bufferSize; }
set
{
if (_disposed)
@@ -115,10 +115,10 @@ namespace SharpCompress.Compressors.Deflate
}
/// <summary> Returns the total number of bytes input so far.</summary>
public virtual long TotalIn => _baseStream._z.TotalBytesIn;
public virtual long TotalIn { get { return _baseStream._z.TotalBytesIn; } }
/// <summary> Returns the total number of bytes output so far.</summary>
public virtual long TotalOut => _baseStream._z.TotalBytesOut;
public virtual long TotalOut { get { return _baseStream._z.TotalBytesOut; } }
#endregion
@@ -148,7 +148,7 @@ namespace SharpCompress.Compressors.Deflate
/// <remarks>
/// Always returns false.
/// </remarks>
public override bool CanSeek => false;
public override bool CanSeek { get { return false; } }
/// <summary>
/// Indicates whether the stream can be written.
@@ -171,7 +171,7 @@ namespace SharpCompress.Compressors.Deflate
/// <summary>
/// Reading this property always throws a <see cref="NotImplementedException"/>.
/// </summary>
public override long Length => throw new NotSupportedException();
public override long Length { get { throw new NotSupportedException(); } }
/// <summary>
/// The position of the stream pointer.
@@ -199,7 +199,7 @@ namespace SharpCompress.Compressors.Deflate
return 0;
}
set => throw new NotSupportedException();
set { throw new NotSupportedException(); }
}
/// <summary>


@@ -78,20 +78,20 @@ namespace SharpCompress.Compressors.Filters
baseStream.Dispose();
}
public override bool CanRead => true;
public override bool CanRead { get { return true; } }
public override bool CanSeek => false;
public override bool CanSeek { get { return false; } }
public override bool CanWrite => false;
public override bool CanWrite { get { return false; } }
public override void Flush()
{
throw new NotSupportedException();
}
public override long Length => baseStream.Length + data1.Length + data2.Length;
public override long Length { get { return baseStream.Length + data1.Length + data2.Length; } }
public override long Position { get => position; set => throw new NotSupportedException(); }
public override long Position { get { return position; } set { throw new NotSupportedException(); } }
public override int Read(byte[] buffer, int offset, int count)
{


@@ -34,20 +34,20 @@ namespace SharpCompress.Compressors.Filters
baseStream.Dispose();
}
public override bool CanRead => !isEncoder;
public override bool CanRead { get { return !isEncoder; } }
public override bool CanSeek => false;
public override bool CanSeek { get { return false; } }
public override bool CanWrite => isEncoder;
public override bool CanWrite { get { return isEncoder; } }
public override void Flush()
{
throw new NotSupportedException();
}
public override long Length => baseStream.Length;
public override long Length { get { return baseStream.Length; } }
public override long Position { get => baseStream.Position; set => throw new NotSupportedException(); }
public override long Position { get { return baseStream.Position; } set { throw new NotSupportedException(); } }
public override int Read(byte[] buffer, int offset, int count)
{


@@ -8,20 +8,20 @@ namespace SharpCompress.Compressors.LZMA
{
internal abstract class DecoderStream2 : Stream
{
public override bool CanRead => true;
public override bool CanRead { get { return true; } }
public override bool CanSeek => false;
public override bool CanSeek { get { return false; } }
public override bool CanWrite => false;
public override bool CanWrite { get { return false; } }
public override void Flush()
{
throw new NotSupportedException();
}
public override long Length => throw new NotSupportedException();
public override long Length { get { throw new NotSupportedException(); } }
public override long Position { get => throw new NotSupportedException(); set => throw new NotSupportedException(); }
public override long Position { get { throw new NotSupportedException(); } set { throw new NotSupportedException(); } }
public override long Seek(long offset, SeekOrigin origin)
{


@@ -178,6 +178,6 @@ namespace SharpCompress.Compressors.LZMA.LZ
_streamPos -= (UInt32)subValue;
}
public bool IsDataStarved => _streamPos - _pos < _keepSizeAfter;
public bool IsDataStarved { get { return _streamPos - _pos < _keepSizeAfter; } }
}
}


@@ -166,9 +166,9 @@ namespace SharpCompress.Compressors.LZMA.LZ
Limit = Total + size;
}
public bool HasSpace => _pos < _windowSize && Total < Limit;
public bool HasSpace { get { return _pos < _windowSize && Total < Limit; } }
public bool HasPending => _pendingLen > 0;
public bool HasPending { get { return _pendingLen > 0; } }
public int Read(byte[] buffer, int offset, int count)
{
@@ -200,6 +200,6 @@ namespace SharpCompress.Compressors.LZMA.LZ
}
}
public int AvailableBytes => _pos - _streamPos;
public int AvailableBytes { get { return _pos - _streamPos; } }
}
}


@@ -1,153 +0,0 @@
using System;
using System.IO;
namespace SharpCompress.Compressors.LZMA
{
// TODO:
// - Write as well as read
// - Multi-volume support
// - Use of the data size / member size values at the end of the stream
/// <summary>
/// Stream supporting the LZIP format, as documented at http://www.nongnu.org/lzip/manual/lzip_manual.html
/// </summary>
public class LZipStream : Stream
{
private readonly Stream stream;
private bool disposed;
private readonly bool leaveOpen;
public LZipStream(Stream stream, CompressionMode mode)
: this(stream, mode, false)
{
}
public LZipStream(Stream stream, CompressionMode mode, bool leaveOpen)
{
if (mode != CompressionMode.Decompress)
{
throw new NotImplementedException("Only LZip decompression is currently supported");
}
Mode = mode;
this.leaveOpen = leaveOpen;
int dictionarySize = ValidateAndReadSize(stream);
if (dictionarySize == 0)
{
throw new IOException("Not an LZip stream");
}
byte[] properties = GetProperties(dictionarySize);
this.stream = new LzmaStream(properties, stream);
}
#region Stream methods
protected override void Dispose(bool disposing)
{
if (disposed)
{
return;
}
disposed = true;
if (disposing && !leaveOpen)
{
stream.Dispose();
}
}
public CompressionMode Mode { get; }
public override bool CanRead => stream.CanRead;
public override bool CanSeek => false;
public override bool CanWrite => false;
public override void Flush()
{
stream.Flush();
}
// TODO: Both Length and Position are sometimes feasible, but would require
// reading the output length when we initialize.
public override long Length => throw new NotImplementedException();
public override long Position { get => throw new NotImplementedException(); set => throw new NotImplementedException(); }
public override int Read(byte[] buffer, int offset, int count) => stream.Read(buffer, offset, count);
public override long Seek(long offset, SeekOrigin origin)
{
throw new NotSupportedException();
}
public override void SetLength(long value)
{
throw new NotImplementedException();
}
public override void Write(byte[] buffer, int offset, int count)
{
throw new NotImplementedException();
}
#endregion
/// <summary>
/// Determines if the given stream is positioned at the start of a v1 LZip
/// file, as indicated by the ASCII characters "LZIP" and a version byte
/// of 1, followed by at least one byte.
/// </summary>
/// <param name="stream">The stream to read from. Must not be null.</param>
/// <returns><c>true</c> if the given stream is an LZip file, <c>false</c> otherwise.</returns>
public static bool IsLZipFile(Stream stream) => ValidateAndReadSize(stream) != 0;
/// <summary>
/// Reads the 6-byte header of the stream, and returns 0 if either the header
/// couldn't be read or it isn't a valid LZIP header, or the dictionary
/// size if it *is* a valid LZIP file.
/// </summary>
private static int ValidateAndReadSize(Stream stream)
{
if (stream == null)
{
throw new ArgumentNullException(nameof(stream));
}
// Read the header
byte[] header = new byte[6];
int n = stream.Read(header, 0, header.Length);
// TODO: Handle reading only part of the header?
if (n != 6)
{
return 0;
}
if (header[0] != 'L' || header[1] != 'Z' || header[2] != 'I' || header[3] != 'P' || header[4] != 1 /* version 1 */)
{
return 0;
}
int basePower = header[5] & 0x1F;
int subtractionNumerator = (header[5] & 0xE0) >> 5;
return (1 << basePower) - subtractionNumerator * (1 << (basePower - 4));
}
/// <summary>
/// Creates a byte array to communicate the parameters and dictionary size to LzmaStream.
/// </summary>
private static byte[] GetProperties(int dictionarySize) =>
new byte[]
{
// Parameters as per http://www.nongnu.org/lzip/manual/lzip_manual.html#Stream-format
// but encoded as a single byte in the format LzmaStream expects.
// literal_context_bits = 3
// literal_pos_state_bits = 0
// pos_state_bits = 2
93,
// Dictionary size as 4-byte little-endian value
(byte)(dictionarySize & 0xff),
(byte)((dictionarySize >> 8) & 0xff),
(byte)((dictionarySize >> 16) & 0xff),
(byte)((dictionarySize >> 24) & 0xff)
};
}
}
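Two pieces of bit-packing in the deleted LZipStream are worth spelling out. `ValidateAndReadSize` decodes byte 5 of the header: the low 5 bits are a base power of two, and the high 3 bits subtract that many sixteenths of the base. `GetProperties` hard-codes the LZMA properties byte `93`, which is exactly `(pb * 5 + lp) * 9 + lc` for the lc=3, lp=0, pb=2 parameters the LZip format fixes. A Python sketch of both computations (function names are mine):

```python
def lzip_dictionary_size(ds_byte):
    """Decode byte 5 of an LZip v1 header into a dictionary size.

    Low 5 bits give a base power of two; the high 3 bits subtract that
    many sixteenths of the base, mirroring ValidateAndReadSize above.
    """
    base_power = ds_byte & 0x1F
    numerator = (ds_byte & 0xE0) >> 5
    return (1 << base_power) - numerator * (1 << (base_power - 4))

def lzma_properties_byte(lc=3, lp=0, pb=2):
    """Pack LZMA literal/position parameters into the single properties
    byte LzmaStream expects: (pb * 5 + lp) * 9 + lc."""
    return (pb * 5 + lp) * 9 + lc
```

With the defaults, `lzma_properties_byte()` yields the literal `93` seen in `GetProperties`, and a header byte of `0x17` decodes to LZip's usual 8 MiB dictionary.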


@@ -118,11 +118,11 @@ namespace SharpCompress.Compressors.LZMA
}
}
public override bool CanRead => encoder == null;
public override bool CanRead { get { return encoder == null; } }
public override bool CanSeek => false;
public override bool CanSeek { get { return false; } }
public override bool CanWrite => encoder != null;
public override bool CanWrite { get { return encoder != null; } }
public override void Flush()
{
@@ -149,9 +149,9 @@ namespace SharpCompress.Compressors.LZMA
base.Dispose(disposing);
}
public override long Length => position + availableBytes;
public override long Length { get { return position + availableBytes; } }
public override long Position { get => position; set => throw new NotSupportedException(); }
public override long Position { get { return position; } set { throw new NotSupportedException(); } }
public override int Read(byte[] buffer, int offset, int count)
{


@@ -245,7 +245,7 @@ namespace SharpCompress.Compressors.LZMA.RangeCoder
return symbol;
}
public bool IsFinished => Code == 0;
public bool IsFinished { get { return Code == 0; } }
// ulong GetProcessedSize() {return Stream.GetProcessedSize(); }
}


@@ -40,19 +40,19 @@ namespace SharpCompress.Compressors.LZMA.Utilites
return mCRC;
}
public override bool CanRead => false;
public override bool CanRead { get { return false; } }
public override bool CanSeek => false;
public override bool CanSeek { get { return false; } }
public override bool CanWrite => true;
public override bool CanWrite { get { return true; } }
public override void Flush()
{
}
public override long Length => throw new NotSupportedException();
public override long Length { get { throw new NotSupportedException(); } }
public override long Position { get => throw new NotSupportedException(); set => throw new NotSupportedException(); }
public override long Position { get { throw new NotSupportedException(); } set { throw new NotSupportedException(); } }
public override int Read(byte[] buffer, int offset, int count)
{
@@ -122,20 +122,20 @@ namespace SharpCompress.Compressors.LZMA.Utilites
return mCRC;
}
public override bool CanRead => mSource.CanRead;
public override bool CanRead { get { return mSource.CanRead; } }
public override bool CanSeek => false;
public override bool CanSeek { get { return false; } }
public override bool CanWrite => false;
public override bool CanWrite { get { return false; } }
public override void Flush()
{
throw new NotSupportedException();
}
public override long Length => throw new NotSupportedException();
public override long Length { get { throw new NotSupportedException(); } }
public override long Position { get => throw new NotSupportedException(); set => throw new NotSupportedException(); }
public override long Position { get { throw new NotSupportedException(); } set { throw new NotSupportedException(); } }
public override int Read(byte[] buffer, int offset, int count)
{


@@ -62,19 +62,19 @@ namespace SharpCompress.Compressors.LZMA.Utilites
}
}
public override bool CanRead => false;
public override bool CanRead { get { return false; } }
public override bool CanSeek => false;
public override bool CanSeek { get { return false; } }
public override bool CanWrite => true;
public override bool CanWrite { get { return true; } }
public override void Flush()
{
}
public override long Length => throw new NotSupportedException();
public override long Length { get { throw new NotSupportedException(); } }
public override long Position { get => throw new NotSupportedException(); set => throw new NotSupportedException(); }
public override long Position { get { throw new NotSupportedException(); } set { throw new NotSupportedException(); } }
public override int Read(byte[] buffer, int offset, int count)
{


@@ -19,7 +19,7 @@ namespace SharpCompress.Compressors.PPMd.H
{
}
internal int SummFreq { get => DataConverter.LittleEndian.GetInt16(Memory, Address) & 0xffff; set => DataConverter.LittleEndian.PutBytes(Memory, Address, (short)value); }
internal int SummFreq { get { return DataConverter.LittleEndian.GetInt16(Memory, Address) & 0xffff; } set { DataConverter.LittleEndian.PutBytes(Memory, Address, (short)value); } }
internal FreqData Initialize(byte[] mem)
{


@@ -22,33 +22,33 @@ namespace SharpCompress.Compressors.PPMd.H
public SubAllocator SubAlloc { get; } = new SubAllocator();
public virtual SEE2Context DummySEE2Cont => dummySEE2Cont;
public virtual SEE2Context DummySEE2Cont { get { return dummySEE2Cont; } }
public virtual int InitRL => initRL;
public virtual int InitRL { get { return initRL; } }
public virtual int EscCount { get => escCount; set => escCount = value & 0xff; }
public virtual int EscCount { get { return escCount; } set { escCount = value & 0xff; } }
public virtual int[] CharMask => charMask;
public virtual int[] CharMask { get { return charMask; } }
public virtual int NumMasked { get => numMasked; set => numMasked = value; }
public virtual int NumMasked { get { return numMasked; } set { numMasked = value; } }
public virtual int PrevSuccess { get => prevSuccess; set => prevSuccess = value & 0xff; }
public virtual int PrevSuccess { get { return prevSuccess; } set { prevSuccess = value & 0xff; } }
public virtual int InitEsc { get => initEsc; set => initEsc = value; }
public virtual int InitEsc { get { return initEsc; } set { initEsc = value; } }
public virtual int RunLength { get => runLength; set => runLength = value; }
public virtual int RunLength { get { return runLength; } set { runLength = value; } }
public virtual int HiBitsFlag { get => hiBitsFlag; set => hiBitsFlag = value & 0xff; }
public virtual int HiBitsFlag { get { return hiBitsFlag; } set { hiBitsFlag = value & 0xff; } }
public virtual int[][] BinSumm => binSumm;
public virtual int[][] BinSumm { get { return binSumm; } }
internal RangeCoder Coder { get; private set; }
internal State FoundState { get; private set; }
public virtual byte[] Heap => SubAlloc.Heap;
public virtual byte[] Heap { get { return SubAlloc.Heap; } }
public virtual int OrderFall => orderFall;
public virtual int OrderFall { get { return orderFall; } }
public const int MAX_O = 64; /* maximum allowed model order */


@@ -8,7 +8,8 @@ namespace SharpCompress.Compressors.PPMd.H
{
internal FreqData FreqData
{
get => freqData;
get { return freqData; }
set
{
freqData.SummFreq = value.SummFreq;
@@ -130,7 +131,7 @@ namespace SharpCompress.Compressors.PPMd.H
internal override int Address
{
get => base.Address;
get { return base.Address; }
set
{
base.Address = value;

Some files were not shown because too many files have changed in this diff.