Compare commits

...

29 Commits
0.17.0 ... 0.19

Author SHA1 Message Date
Adam Hathcock
bd9417e74c Mark for 0.19 2017-12-12 11:17:57 +00:00
Adam Hathcock
694e869162 Use arraypool for transfer/skip (#326)
* Use arraypool for transfer/skip

* Merge fixes

* Remove redundant constant
2017-12-08 13:58:38 +00:00
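As a rough illustration of the change above: renting a pooled buffer instead of allocating a fresh byte[] for every skip/transfer call. This is a minimal sketch assuming only System.Buffers; the actual SharpCompress helpers and buffer size may differ.

using System;
using System.Buffers;
using System.IO;

internal static class SkipSketch
{
    // Skip 'count' bytes of a non-seekable stream using a pooled scratch buffer.
    public static void Skip(Stream stream, long count)
    {
        byte[] buffer = ArrayPool<byte>.Shared.Rent(81920);
        try
        {
            while (count > 0)
            {
                int requested = (int)Math.Min(buffer.Length, count);
                int read = stream.Read(buffer, 0, requested);
                if (read <= 0)
                {
                    break; // premature end of stream
                }
                count -= read;
            }
        }
        finally
        {
            ArrayPool<byte>.Shared.Return(buffer);
        }
    }
}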
Adam Hathcock
45845f8963 Add Circle CI build 2017-12-08 12:03:28 +00:00
Adam Hathcock
a8b6def76a Netcore2 (#302)
* Add netstandard 2.0 target and netcoreapp2.0 tests

* Update xunit

* set tests explicitly to netcore2

* update travis

* Don't say build as netcoreapp1.0

* try adding dotnet 1 too

* Remove .NET Core 1 support

* switch to circle

* update cake

* fix circle build

* try fix file ending test again

* Fix casing on files

* Another casing fix

* Add back netstandard1.0

* Finish adding netstandard 1.0 back

* Add netstandard1.3 back
2017-12-08 12:00:29 +00:00
Sors
a4ebd5fb3d Rar 5 format (#310)
Fix rar 5 format comment
2017-12-04 18:59:49 +00:00
Adam Hathcock
3da3b212fa Create a new MemoryStream to allow proper resizing, as the MemoryStream could be a user-provided buffer. Update xunit (#307) 2017-12-04 18:48:38 +00:00
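The point of the fix above, sketched minimally: a MemoryStream wrapping a caller-supplied byte[] is not expandable, so rebuilding an archive in place can fail; copying into a freshly allocated MemoryStream restores resizability. The helper name here is illustrative.

using System.IO;

internal static class ResizableCopySketch
{
    // A MemoryStream constructed over a user buffer cannot grow; copy it into a
    // new MemoryStream before writing more data than the original can hold.
    public static MemoryStream ToResizable(MemoryStream source)
    {
        var copy = new MemoryStream();
        source.Position = 0;
        source.CopyTo(copy);
        copy.Position = 0;
        return copy;
    }
}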
Martijn Kant
c2528cf93e Mk/add support for extracting password protected LZMA(2) 7z archives (#324)
* Added the possibility to decompress a password-protected 7z LZMA archive

* Fix tests
2017-12-04 10:55:30 +00:00
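Consumer-side usage of this feature, as a sketch based on the ReaderOptions.Password plumbing visible in the SevenZipArchive diffs further down; the file names are placeholders.

using System.IO;
using SharpCompress.Archives.SevenZip;
using SharpCompress.Readers;

internal static class SevenZipPasswordSketch
{
    public static void Extract()
    {
        // The password travels through ReaderOptions to the 7z LZMA/LZMA2 decoder.
        var options = new ReaderOptions { Password = "secret" };
        using (var archive = SevenZipArchive.Open("protected.7z", options))
        {
            foreach (var entry in archive.Entries)
            {
                if (entry.IsDirectory)
                {
                    continue;
                }
                using (var input = entry.OpenEntryStream())
                using (var output = File.Create(Path.GetFileName(entry.Key)))
                {
                    input.CopyTo(output);
                }
            }
        }
    }
}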
coderb
550fecd4d3 bugfix: eliminate spurious rar crc exception when Read() is called with count = 0 (#313) 2017-10-23 11:58:02 +01:00
Adam Hathcock
50b01428b4 Mark for 0.18.2 2017-09-22 09:16:42 +01:00
Thritton
bb59f28b22 Update ArchiveReader.cs (#303)
#227
Added a check that the argument is in range in TranslateTime(long? time)
2017-09-19 15:25:10 +01:00
François
7064cda6de Zlib: fix Adler32 implementation (#301) 2017-09-17 22:21:09 +01:00
Adam Hathcock
525c1873e8 Fix merge 2017-09-17 22:16:57 +01:00
François
3d91b4eb5e XZ: fix padding issues (#300)
* XZ: fix variable-length integers decoding

* XZ: fix block and index padding issues

* cleanup in XZStreamTests
2017-09-17 22:14:23 +01:00
François
f20c03180e XZ: fix variable-length integers decoding (#299) 2017-09-17 22:05:20 +01:00
Vladimir Kozlov
08fee76b4e Fixes Double Dispose() of ZipWritingStream #294 https://github.com/adamhathcock/sharpcompress/issues/294 (#295) 2017-09-08 13:25:53 +01:00
Adam Hathcock
0f511c4b2a Mark for 0.18.1 2017-08-17 11:43:34 +01:00
twirpx
42d9dfd117 Fixed bug: Passing default ReaderOptions when creating ZipReader for solid extraction (#287) 2017-08-16 08:19:23 +01:00
Adam Hathcock
3983db08ff Use nameof 2017-07-27 11:05:33 -05:00
Adam Hathcock
72114bceea Add release link 2017-07-17 10:22:58 -05:00
Adam Hathcock
c303f96682 mark for 0.18 2017-07-17 10:11:27 -05:00
Adam Hathcock
0e785968c4 Rework usage of WriterOptions for writers since it was inconsistently used. (#271) 2017-07-17 11:05:42 -04:00
Adam Hathcock
15110e18e2 Don't skip ZipReader data twice. (#272)
* Don't skip ZipReader data twice.

* Add archive for a new test
2017-07-17 11:05:21 -04:00
Adam Hathcock
5465af041b Use Skip and ReadFully extension methods where possible. (#276) 2017-07-17 10:55:22 -04:00
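For context, the semantics that the ReadFully call sites in the diffs below rely on can be sketched like this (keep reading until the buffer is full, report a short read as failure); the real SharpCompress extension may differ in detail.

using System.IO;

internal static class ReadFullySketch
{
    // Fill the whole buffer, looping over partial reads; return false on a short read.
    public static bool ReadFully(this Stream stream, byte[] buffer)
    {
        int total = 0;
        while (total < buffer.Length)
        {
            int read = stream.Read(buffer, total, buffer.Length - total);
            if (read <= 0)
            {
                return false;
            }
            total += read;
        }
        return true;
    }
}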
Adam Hathcock
310d56fc16 Made ArchiveEncoding a non-static class that is used with options. (#274)
* Made ArchiveEncoding a non-static class that is used with options.

* Revert some formatting.

* Optional string decoder delegate (#278)
2017-07-17 10:53:20 -04:00
eklann
231258ef69 Force encoding (#266)
* Fixing build

* Fixing build

* Fixing build

* Fixed build (seems working now)

* Added support to force a specific encoding when reading or writing an archive

* Minor fixes related to forcing the encoding

* Removed obsolete project file not present in master
2017-07-05 10:15:49 -05:00
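On the consumer side, this feature ends up looking roughly like the sketch below, based on the OptionsBase.ArchiveEncoding property and the ArchiveEncoding members added in the diffs further down; the code page and file name are examples only.

using System;
using System.Text;
using SharpCompress.Archives.Zip;
using SharpCompress.Common;
using SharpCompress.Readers;

internal static class ForcedEncodingSketch
{
    public static void ListEntries()
    {
        // Decode entry names with code page 866 instead of the default UTF-8.
        // (On .NET Core, legacy code pages need the System.Text.Encoding.CodePages package.)
        var options = new ReaderOptions
        {
            ArchiveEncoding = new ArchiveEncoding { Default = Encoding.GetEncoding(866) }
        };
        using (var archive = ZipArchive.Open("legacy.zip", options))
        {
            foreach (var entry in archive.Entries)
            {
                Console.WriteLine(entry.Key);
            }
        }
    }
}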
Sam Bott
16b7e3ffc8 Add XZ tests (#258)
* tests added and converted to xunit

* reordered two assertions
2017-06-11 13:44:00 +01:00
Adam Hathcock
513e59f830 Mark for 0.17.1 2017-06-09 08:28:35 +01:00
Adam Hathcock
b10a1cf2bd Bug on Windows on .NET Core fix (#257)
* Bug on Windows on .NET Core fix: https://github.com/dotnet/corefx/issues/20676

* Add comment
2017-06-09 08:22:47 +01:00
Adam Hathcock
1656edaa29 Add some more details to nuget package 2017-06-01 12:36:01 +01:00
96 changed files with 1677 additions and 801 deletions

.circleci/config.yml Normal file
View File

@@ -0,0 +1,11 @@
version: 2
jobs:
build:
docker:
- image: adamhathcock/cake-build:latest
steps:
- checkout
- run:
name: Build
command: ./build.sh

.gitattributes vendored
View File

@@ -2,4 +2,4 @@
* text=auto
# need original files to be windows
test/TestArchives/Original/*.txt eol=crlf
*.txt text eol=crlf

.gitignore vendored
View File

@@ -14,3 +14,4 @@ tests/TestArchives/Scratch
.vs
tools
.vscode
.idea/

View File

@@ -1,13 +0,0 @@
dist: trusty
language: csharp
cache:
directories:
- $HOME/.dotnet
solution: SharpCompress.sln
matrix:
include:
- dotnet: 1.0.4
mono: none
env: DOTNETCORE=1
script:
- ./build.sh

View File

@@ -7,8 +7,8 @@ The major feature is support for non-seekable streams so large files can be proc
AppVeyor Build -
[![Build status](https://ci.appveyor.com/api/projects/status/voxg971oemmvxh1e/branch/master?svg=true)](https://ci.appveyor.com/project/adamhathcock/sharpcompress/branch/master)
Travis CI Build -
[![Build Status](https://travis-ci.org/adamhathcock/sharpcompress.svg?branch=master)](https://travis-ci.org/adamhathcock/sharpcompress)
Circle CI Build -
[![CircleCI](https://circleci.com/gh/adamhathcock/sharpcompress.svg?style=svg)](https://circleci.com/gh/adamhathcock/sharpcompress)
## Need Help?
Post Issues on Github!
@@ -44,6 +44,14 @@ I'm always looking for help or ideas. Please submit code or email with ideas. Un
## Version Log
### Version 0.18
* [Now on Github releases](https://github.com/adamhathcock/sharpcompress/releases/tag/0.18)
### Version 0.17.1
* Fix - [Bug Fix for .NET Core on Windows](https://github.com/adamhathcock/sharpcompress/pull/257)
### Version 0.17.0
* New - Full LZip support! Can read and write LZip files and Tars inside LZip files. [Make LZip a first class citizen. #241](https://github.com/adamhathcock/sharpcompress/issues/241)

View File

@@ -114,6 +114,7 @@
<s:String x:Key="/Default/CodeStyle/Naming/XamlNaming/UserRules/=NAMESPACE_005FALIAS/@EntryIndexedValue">&lt;Policy Inspect="True" Prefix="" Suffix="" Style="aaBb" /&gt;</s:String>
<s:String x:Key="/Default/CodeStyle/Naming/XamlNaming/UserRules/=XAML_005FFIELD/@EntryIndexedValue">&lt;Policy Inspect="True" Prefix="" Suffix="" Style="AaBb" /&gt;</s:String>
<s:String x:Key="/Default/CodeStyle/Naming/XamlNaming/UserRules/=XAML_005FRESOURCE/@EntryIndexedValue">&lt;Policy Inspect="True" Prefix="" Suffix="" Style="AaBb" /&gt;</s:String>
<s:Boolean x:Key="/Default/Environment/SettingsMigration/IsMigratorApplied/=JetBrains_002EReSharper_002EPsi_002ECSharp_002ECodeStyle_002ECSharpAttributeForSingleLineMethodUpgrade/@EntryIndexedValue">True</s:Boolean>
<s:Boolean x:Key="/Default/Environment/SettingsMigration/IsMigratorApplied/=JetBrains_002EReSharper_002EPsi_002ECSharp_002ECodeStyle_002ESettingsUpgrade_002EAddAccessorOwnerDeclarationBracesMigration/@EntryIndexedValue">True</s:Boolean>
<s:Boolean x:Key="/Default/Environment/SettingsMigration/IsMigratorApplied/=JetBrains_002EReSharper_002EPsi_002ECSharp_002ECodeStyle_002ESettingsUpgrade_002EMigrateBlankLinesAroundFieldToBlankLinesAroundProperty/@EntryIndexedValue">True</s:Boolean>
<s:Boolean x:Key="/Default/Environment/SettingsMigration/IsMigratorApplied/=JetBrains_002EReSharper_002EPsi_002ECSharp_002ECodeStyle_002ESettingsUpgrade_002EMigrateThisQualifierSettings/@EntryIndexedValue">True</s:Boolean></wpf:ResourceDictionary>

View File

@@ -30,8 +30,11 @@ Task("Build")
DotNetCoreBuild("./src/SharpCompress/SharpCompress.csproj", settings);
settings.Framework = "netcoreapp1.1";
DotNetCoreBuild("./tests/SharpCompress.Test/SharpCompress.Test.csproj", settings);
settings.Framework = "netstandard1.3";
DotNetCoreBuild("./src/SharpCompress/SharpCompress.csproj", settings);
settings.Framework = "netstandard2.0";
DotNetCoreBuild("./src/SharpCompress/SharpCompress.csproj", settings);
}
});
@@ -39,23 +42,25 @@ Task("Test")
.IsDependentOn("Build")
.Does(() =>
{
if (!bool.Parse(EnvironmentVariable("APPVEYOR") ?? "false")
&& !bool.Parse(EnvironmentVariable("TRAVIS") ?? "false"))
var files = GetFiles("tests/**/*.csproj");
foreach(var file in files)
{
var files = GetFiles("tests/**/*.csproj");
foreach(var file in files)
var settings = new DotNetCoreTestSettings
{
var settings = new DotNetCoreTestSettings
{
Configuration = "Release"
};
Configuration = "Release",
Framework = "netcoreapp2.0"
};
DotNetCoreTest(file.ToString(), settings);
}
}
else
{
Information("Skipping tests as this is AppVeyor or Travis CI");
DotNetCoreTest(file.ToString(), settings);
settings = new DotNetCoreTestSettings
{
Configuration = "Release",
Framework = "netcoreapp1.1"
};
DotNetCoreTest(file.ToString(), settings);
}
});

View File

@@ -8,7 +8,7 @@
# Define directories.
SCRIPT_DIR=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )
TOOLS_DIR=$SCRIPT_DIR/tools
CAKE_VERSION=0.19.1
CAKE_VERSION=0.23.0
CAKE_DLL=$TOOLS_DIR/Cake.CoreCLR.$CAKE_VERSION/Cake.dll
# Make sure the tools folder exist.

View File

@@ -14,6 +14,7 @@ namespace SharpCompress.Archives.GZip
public class GZipArchive : AbstractWritableArchive<GZipArchiveEntry, GZipVolume>
{
#if !NO_FILE
/// <summary>
/// Constructor expects a filepath to an existing file.
/// </summary>
@@ -36,6 +37,7 @@ namespace SharpCompress.Archives.GZip
return new GZipArchive(fileInfo, readerOptions ?? new ReaderOptions());
}
#endif
/// <summary>
/// Takes a seekable Stream as a source
/// </summary>
@@ -54,11 +56,11 @@ namespace SharpCompress.Archives.GZip
#if !NO_FILE
/// <summary>
/// Constructor with a FileInfo object to an existing file.
/// </summary>
/// <param name="fileInfo"></param>
/// <param name="options"></param>
/// <summary>
/// Constructor with a FileInfo object to an existing file.
/// </summary>
/// <param name="fileInfo"></param>
/// <param name="options"></param>
internal GZipArchive(FileInfo fileInfo, ReaderOptions options)
: base(ArchiveType.GZip, fileInfo, options)
{
@@ -104,15 +106,9 @@ namespace SharpCompress.Archives.GZip
{
// read the header on the first read
byte[] header = new byte[10];
int n = stream.Read(header, 0, header.Length);
// workitem 8501: handle edge case (decompress empty stream)
if (n == 0)
{
return false;
}
if (n != 10)
if (!stream.ReadFully(header))
{
return false;
}
@@ -158,7 +154,7 @@ namespace SharpCompress.Archives.GZip
{
throw new InvalidOperationException("Only one entry is allowed in a GZip Archive");
}
using (var writer = new GZipWriter(stream))
using (var writer = new GZipWriter(stream, new GZipWriterOptions(options)))
{
foreach (var entry in oldEntries.Concat(newEntries)
.Where(x => !x.IsDirectory))
@@ -179,7 +175,7 @@ namespace SharpCompress.Archives.GZip
protected override IEnumerable<GZipArchiveEntry> LoadEntries(IEnumerable<GZipVolume> volumes)
{
Stream stream = volumes.Single().Stream;
yield return new GZipArchiveEntry(this, new GZipFilePart(stream));
yield return new GZipArchiveEntry(this, new GZipFilePart(stream, ReaderOptions.ArchiveEncoding));
}
protected override IReader CreateReaderForSolidExtraction()

View File

@@ -4,6 +4,7 @@ using System.IO;
using System.Linq;
using SharpCompress.Common;
using SharpCompress.Common.SevenZip;
using SharpCompress.Compressors.LZMA.Utilites;
using SharpCompress.IO;
using SharpCompress.Readers;
@@ -106,7 +107,7 @@ namespace SharpCompress.Archives.SevenZip
for (int i = 0; i < database.Files.Count; i++)
{
var file = database.Files[i];
yield return new SevenZipArchiveEntry(this, new SevenZipFilePart(stream, database, i, file));
yield return new SevenZipArchiveEntry(this, new SevenZipFilePart(stream, database, i, file, ReaderOptions.ArchiveEncoding));
}
}
@@ -117,7 +118,7 @@ namespace SharpCompress.Archives.SevenZip
stream.Position = 0;
var reader = new ArchiveReader();
reader.Open(stream);
database = reader.ReadDatabase(null);
database = reader.ReadDatabase(new PasswordProvider(ReaderOptions.Password));
}
}
@@ -144,7 +145,7 @@ namespace SharpCompress.Archives.SevenZip
protected override IReader CreateReaderForSolidExtraction()
{
return new SevenZipReader(this);
return new SevenZipReader(ReaderOptions, this);
}
public override bool IsSolid { get { return Entries.Where(x => !x.IsDirectory).GroupBy(x => x.FilePart.Folder).Count() > 1; } }
@@ -165,8 +166,8 @@ namespace SharpCompress.Archives.SevenZip
private Stream currentStream;
private CFileItem currentItem;
internal SevenZipReader(SevenZipArchive archive)
: base(new ReaderOptions(), ArchiveType.SevenZip)
internal SevenZipReader(ReaderOptions readerOptions, SevenZipArchive archive)
: base(readerOptions, ArchiveType.SevenZip)
{
this.archive = archive;
}
@@ -190,7 +191,7 @@ namespace SharpCompress.Archives.SevenZip
}
else
{
currentStream = archive.database.GetFolderStream(stream, currentFolder, null);
currentStream = archive.database.GetFolderStream(stream, currentFolder, new PasswordProvider(Options.Password));
}
foreach (var entry in group)
{
@@ -205,5 +206,21 @@ namespace SharpCompress.Archives.SevenZip
return CreateEntryStream(new ReadOnlySubStream(currentStream, currentItem.Size));
}
}
private class PasswordProvider : IPasswordProvider
{
private readonly string _password;
public PasswordProvider(string password)
{
_password = password;
}
public string CryptoGetTextPassword()
{
return _password;
}
}
}
}

View File

@@ -16,7 +16,7 @@ namespace SharpCompress.Archives.Tar
public class TarArchive : AbstractWritableArchive<TarArchiveEntry, TarVolume>
{
#if !NO_FILE
/// <summary>
/// Constructor expects a filepath to an existing file.
/// </summary>
@@ -39,7 +39,7 @@ namespace SharpCompress.Archives.Tar
return new TarArchive(fileInfo, readerOptions ?? new ReaderOptions());
}
#endif
/// <summary>
/// Takes a seekable Stream as a source
/// </summary>
@@ -52,6 +52,7 @@ namespace SharpCompress.Archives.Tar
}
#if !NO_FILE
public static bool IsTarFile(string filePath)
{
return IsTarFile(new FileInfo(filePath));
@@ -74,7 +75,7 @@ namespace SharpCompress.Archives.Tar
{
try
{
TarHeader tar = new TarHeader();
TarHeader tar = new TarHeader(new ArchiveEncoding());
tar.Read(new BinaryReader(stream));
return tar.Name.Length > 0 && Enum.IsDefined(typeof(EntryType), tar.EntryType);
}
@@ -98,7 +99,6 @@ namespace SharpCompress.Archives.Tar
protected override IEnumerable<TarVolume> LoadVolumes(FileInfo file)
{
return new TarVolume(file.OpenRead(), ReaderOptions).AsEnumerable();
}
#endif
@@ -127,7 +127,7 @@ namespace SharpCompress.Archives.Tar
{
Stream stream = volumes.Single().Stream;
TarHeader previousHeader = null;
foreach (TarHeader header in TarHeaderFactory.ReadHeader(StreamingMode.Seekable, stream))
foreach (TarHeader header in TarHeaderFactory.ReadHeader(StreamingMode.Seekable, stream, ReaderOptions.ArchiveEncoding))
{
if (header != null)
{
@@ -152,7 +152,7 @@ namespace SharpCompress.Archives.Tar
memoryStream.Position = 0;
var bytes = memoryStream.ToArray();
header.Name = ArchiveEncoding.Default.GetString(bytes, 0, bytes.Length).TrimNulls();
header.Name = ReaderOptions.ArchiveEncoding.Decode(bytes).TrimNulls();
}
}

View File

@@ -24,6 +24,7 @@ namespace SharpCompress.Archives.Zip
public CompressionLevel DeflateCompressionLevel { get; set; }
#if !NO_FILE
/// <summary>
/// Constructor expects a filepath to an existing file.
/// </summary>
@@ -46,6 +47,7 @@ namespace SharpCompress.Archives.Zip
return new ZipArchive(fileInfo, readerOptions ?? new ReaderOptions());
}
#endif
/// <summary>
/// Takes a seekable Stream as a source
/// </summary>
@@ -58,6 +60,7 @@ namespace SharpCompress.Archives.Zip
}
#if !NO_FILE
public static bool IsZipFile(string filePath, string password = null)
{
return IsZipFile(new FileInfo(filePath), password);
@@ -78,7 +81,7 @@ namespace SharpCompress.Archives.Zip
public static bool IsZipFile(Stream stream, string password = null)
{
StreamingZipHeaderFactory headerFactory = new StreamingZipHeaderFactory(password);
StreamingZipHeaderFactory headerFactory = new StreamingZipHeaderFactory(password, new ArchiveEncoding());
try
{
ZipHeader header =
@@ -109,7 +112,7 @@ namespace SharpCompress.Archives.Zip
internal ZipArchive(FileInfo fileInfo, ReaderOptions readerOptions)
: base(ArchiveType.Zip, fileInfo, readerOptions)
{
headerFactory = new SeekableZipHeaderFactory(readerOptions.Password);
headerFactory = new SeekableZipHeaderFactory(readerOptions.Password, readerOptions.ArchiveEncoding);
}
protected override IEnumerable<ZipVolume> LoadVolumes(FileInfo file)
@@ -131,7 +134,7 @@ namespace SharpCompress.Archives.Zip
internal ZipArchive(Stream stream, ReaderOptions readerOptions)
: base(ArchiveType.Zip, stream, readerOptions)
{
headerFactory = new SeekableZipHeaderFactory(readerOptions.Password);
headerFactory = new SeekableZipHeaderFactory(readerOptions.Password, readerOptions.ArchiveEncoding);
}
protected override IEnumerable<ZipVolume> LoadVolumes(IEnumerable<Stream> streams)
@@ -150,19 +153,19 @@ namespace SharpCompress.Archives.Zip
switch (h.ZipHeaderType)
{
case ZipHeaderType.DirectoryEntry:
{
yield return new ZipArchiveEntry(this,
new SeekableZipFilePart(headerFactory,
h as DirectoryEntryHeader,
stream));
}
{
yield return new ZipArchiveEntry(this,
new SeekableZipFilePart(headerFactory,
h as DirectoryEntryHeader,
stream));
}
break;
case ZipHeaderType.DirectoryEnd:
{
byte[] bytes = (h as DirectoryEndHeader).Comment;
volume.Comment = ArchiveEncoding.Default.GetString(bytes, 0, bytes.Length);
yield break;
}
{
byte[] bytes = (h as DirectoryEndHeader).Comment;
volume.Comment = ReaderOptions.ArchiveEncoding.Decode(bytes);
yield break;
}
}
}
}
@@ -205,7 +208,7 @@ namespace SharpCompress.Archives.Zip
{
var stream = Volumes.Single().Stream;
stream.Position = 0;
return ZipReader.Open(stream);
return ZipReader.Open(stream, ReaderOptions);
}
}
}

View File

@@ -1,23 +1,60 @@
using System.Text;
using System;
using System.Text;
namespace SharpCompress.Common
{
public static class ArchiveEncoding
public class ArchiveEncoding
{
/// <summary>
/// Default encoding to use when archive format doesn't specify one.
/// </summary>
public static Encoding Default { get; set; }
public Encoding Default { get; set; }
/// <summary>
/// Encoding used by encryption schemes which don't comply with RFC 2898.
/// ArchiveEncoding used by encryption schemes which don't comply with RFC 2898.
/// </summary>
public static Encoding Password { get; set; }
public Encoding Password { get; set; }
static ArchiveEncoding()
/// <summary>
/// Set this encoding when you want to force it for all encoding operations.
/// </summary>
public Encoding Forced { get; set; }
/// <summary>
/// Set this when you want to use a custom method for all decoding operations.
/// </summary>
/// <returns>string Func(bytes, index, length)</returns>
public Func<byte[], int, int, string> CustomDecoder { get; set; }
public ArchiveEncoding()
{
Default = Encoding.UTF8;
Password = Encoding.UTF8;
}
public string Decode(byte[] bytes)
{
return Decode(bytes, 0, bytes.Length);
}
public string Decode(byte[] bytes, int start, int length)
{
return GetDecoder().Invoke(bytes, start, length);
}
public byte[] Encode(string str)
{
return GetEncoding().GetBytes(str);
}
public Encoding GetEncoding()
{
return Forced ?? Default ?? Encoding.UTF8;
}
public Func<byte[], int, int, string> GetDecoder()
{
return CustomDecoder ?? ((bytes, index, count) => (Default ?? Encoding.UTF8).GetString(bytes, index, count));
}
}
}
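Given the resolution order shown above (GetEncoding falls back Forced → Default → UTF-8, GetDecoder falls back CustomDecoder → Default), a custom decoder can be plugged in as sketched here; the fallback heuristic is purely illustrative.

using System.Text;
using SharpCompress.Common;

internal static class CustomDecoderSketch
{
    public static ArchiveEncoding Build()
    {
        return new ArchiveEncoding
        {
            // Used by GetEncoding() when writing names; CustomDecoder takes over decoding.
            Default = Encoding.UTF8,
            // Replaces the whole decode path for names read from archive headers.
            CustomDecoder = (bytes, index, count) =>
            {
                try
                {
                    // Strict UTF-8 first...
                    return new UTF8Encoding(false, throwOnInvalidBytes: true)
                        .GetString(bytes, index, count);
                }
                catch (DecoderFallbackException)
                {
                    // ...then fall back to Latin-1 for legacy names.
                    return Encoding.GetEncoding("ISO-8859-1").GetString(bytes, index, count);
                }
            }
        };
    }
}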

View File

@@ -4,9 +4,17 @@ namespace SharpCompress.Common
{
public abstract class FilePart
{
protected FilePart(ArchiveEncoding archiveEncoding)
{
ArchiveEncoding = archiveEncoding;
}
internal ArchiveEncoding ArchiveEncoding { get; }
internal abstract string FilePartName { get; }
internal abstract Stream GetCompressedStream();
internal abstract Stream GetRawStream();
internal bool Skipped { get; set; }
}
}

View File

@@ -1,6 +1,7 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
namespace SharpCompress.Common.GZip
{
@@ -39,9 +40,9 @@ namespace SharpCompress.Common.GZip
internal override IEnumerable<FilePart> Parts => filePart.AsEnumerable<FilePart>();
internal static IEnumerable<GZipEntry> GetEntries(Stream stream)
internal static IEnumerable<GZipEntry> GetEntries(Stream stream, OptionsBase options)
{
yield return new GZipEntry(new GZipFilePart(stream));
yield return new GZipEntry(new GZipFilePart(stream, options.ArchiveEncoding));
}
}
}

View File

@@ -5,35 +5,37 @@ using SharpCompress.Common.Tar.Headers;
using SharpCompress.Compressors;
using SharpCompress.Compressors.Deflate;
using SharpCompress.Converters;
using System.Text;
namespace SharpCompress.Common.GZip
{
internal class GZipFilePart : FilePart
{
private string name;
private readonly Stream stream;
private string _name;
private readonly Stream _stream;
internal GZipFilePart(Stream stream)
internal GZipFilePart(Stream stream, ArchiveEncoding archiveEncoding)
: base(archiveEncoding)
{
ReadAndValidateGzipHeader(stream);
EntryStartPosition = stream.Position;
this.stream = stream;
this._stream = stream;
}
internal long EntryStartPosition { get; }
internal DateTime? DateModified { get; private set; }
internal override string FilePartName => name;
internal override string FilePartName => _name;
internal override Stream GetCompressedStream()
{
return new DeflateStream(stream, CompressionMode.Decompress, CompressionLevel.Default, false);
return new DeflateStream(_stream, CompressionMode.Decompress, CompressionLevel.Default, false);
}
internal override Stream GetRawStream()
{
return stream;
return _stream;
}
private void ReadAndValidateGzipHeader(Stream stream)
@@ -67,15 +69,16 @@ namespace SharpCompress.Common.GZip
Int16 extraLength = (Int16)(header[0] + header[1] * 256);
byte[] extra = new byte[extraLength];
n = stream.Read(extra, 0, extra.Length);
if (n != extraLength)
if (!stream.ReadFully(extra))
{
throw new ZlibException("Unexpected end-of-file reading GZIP header.");
}
n = extraLength;
}
if ((header[3] & 0x08) == 0x08)
{
name = ReadZeroTerminatedString(stream);
_name = ReadZeroTerminatedString(stream);
}
if ((header[3] & 0x10) == 0x010)
{
@@ -87,7 +90,7 @@ namespace SharpCompress.Common.GZip
}
}
private static string ReadZeroTerminatedString(Stream stream)
private string ReadZeroTerminatedString(Stream stream)
{
byte[] buf1 = new byte[1];
var list = new List<byte>();
@@ -110,8 +113,8 @@ namespace SharpCompress.Common.GZip
}
}
while (!done);
byte[] a = list.ToArray();
return ArchiveEncoding.Default.GetString(a, 0, a.Length);
byte[] buffer = list.ToArray();
return ArchiveEncoding.Decode(buffer);
}
}
}

View File

@@ -1,4 +1,5 @@
namespace SharpCompress.Common
namespace SharpCompress.Common
{
public class OptionsBase
{
@@ -6,5 +7,7 @@
/// SharpCompress will keep the supplied streams open. Default is true.
/// </summary>
public bool LeaveStreamOpen { get; set; } = true;
public ArchiveEncoding ArchiveEncoding { get; set; } = new ArchiveEncoding();
}
}

View File

@@ -1,6 +1,6 @@
using SharpCompress.IO;
using System;
using System.IO;
using SharpCompress.IO;
namespace SharpCompress.Common.Rar.Headers
{
@@ -52,50 +52,50 @@ namespace SharpCompress.Common.Rar.Headers
switch (HeaderType)
{
case HeaderType.FileHeader:
{
if (FileFlags.HasFlag(FileFlags.UNICODE))
{
int length = 0;
while (length < fileNameBytes.Length
&& fileNameBytes[length] != 0)
if (FileFlags.HasFlag(FileFlags.UNICODE))
{
length++;
}
if (length != nameSize)
{
length++;
FileName = FileNameDecoder.Decode(fileNameBytes, length);
int length = 0;
while (length < fileNameBytes.Length
&& fileNameBytes[length] != 0)
{
length++;
}
if (length != nameSize)
{
length++;
FileName = FileNameDecoder.Decode(fileNameBytes, length);
}
else
{
FileName = ArchiveEncoding.Decode(fileNameBytes);
}
}
else
{
FileName = DecodeDefault(fileNameBytes);
FileName = ArchiveEncoding.Decode(fileNameBytes);
}
FileName = ConvertPath(FileName, HostOS);
}
else
{
FileName = DecodeDefault(fileNameBytes);
}
FileName = ConvertPath(FileName, HostOS);
}
break;
case HeaderType.NewSubHeader:
{
int datasize = HeaderSize - NEWLHD_SIZE - nameSize;
if (FileFlags.HasFlag(FileFlags.SALT))
{
datasize -= SALT_SIZE;
}
if (datasize > 0)
{
SubData = reader.ReadBytes(datasize);
}
int datasize = HeaderSize - NEWLHD_SIZE - nameSize;
if (FileFlags.HasFlag(FileFlags.SALT))
{
datasize -= SALT_SIZE;
}
if (datasize > 0)
{
SubData = reader.ReadBytes(datasize);
}
if (NewSubHeaderType.SUBHEAD_TYPE_RR.Equals(fileNameBytes))
{
RecoverySectors = SubData[8] + (SubData[9] << 8)
+ (SubData[10] << 16) + (SubData[11] << 24);
if (NewSubHeaderType.SUBHEAD_TYPE_RR.Equals(fileNameBytes))
{
RecoverySectors = SubData[8] + (SubData[9] << 8)
+ (SubData[10] << 16) + (SubData[11] << 24);
}
}
}
break;
}
@@ -118,12 +118,6 @@ namespace SharpCompress.Common.Rar.Headers
}
}
//only the full .net framework will do other code pages than unicode/utf8
private string DecodeDefault(byte[] bytes)
{
return ArchiveEncoding.Default.GetString(bytes, 0, bytes.Length);
}
private long UInt32To64(uint x, uint y)
{
long l = x;
@@ -178,6 +172,7 @@ namespace SharpCompress.Common.Rar.Headers
}
internal long DataStartPosition { get; set; }
internal HostOS HostOS { get; private set; }
internal uint FileCRC { get; private set; }
@@ -199,6 +194,7 @@ namespace SharpCompress.Common.Rar.Headers
internal FileFlags FileFlags => (FileFlags)Flags;
internal long CompressedSize { get; private set; }
internal long UncompressedSize { get; private set; }
internal string FileName { get; private set; }

View File

@@ -18,9 +18,9 @@ namespace SharpCompress.Common.Rar.Headers
Flags == 0x1A21 &&
HeaderSize == 0x07;
// Rar5 signature: 52 61 72 21 1A 07 10 00 (not supported yet)
// Rar5 signature: 52 61 72 21 1A 07 01 00 (not supported yet)
}
internal bool OldFormat { get; private set; }
}
}
}

View File

@@ -1,6 +1,7 @@
using System;
using System.IO;
using SharpCompress.IO;
using System.Text;
namespace SharpCompress.Common.Rar.Headers
{
@@ -17,14 +18,16 @@ namespace SharpCompress.Common.Rar.Headers
HeaderSize = baseHeader.HeaderSize;
AdditionalSize = baseHeader.AdditionalSize;
ReadBytes = baseHeader.ReadBytes;
ArchiveEncoding = baseHeader.ArchiveEncoding;
}
internal static RarHeader Create(RarCrcBinaryReader reader)
internal static RarHeader Create(RarCrcBinaryReader reader, ArchiveEncoding archiveEncoding)
{
try
{
RarHeader header = new RarHeader();
header.ArchiveEncoding = archiveEncoding;
reader.Mark();
header.ReadStartFromReader(reader);
header.ReadBytes += reader.CurrentReadByteCount;
@@ -50,7 +53,8 @@ namespace SharpCompress.Common.Rar.Headers
}
}
protected virtual void ReadFromReader(MarkingBinaryReader reader) {
protected virtual void ReadFromReader(MarkingBinaryReader reader)
{
throw new NotImplementedException();
}
@@ -76,10 +80,11 @@ namespace SharpCompress.Common.Rar.Headers
return header;
}
private void VerifyHeaderCrc(ushort crc) {
if (HeaderType != HeaderType.MarkHeader)
private void VerifyHeaderCrc(ushort crc)
{
if (HeaderType != HeaderType.MarkHeader)
{
if (crc != HeadCRC)
if (crc != HeadCRC)
{
throw new InvalidFormatException("rar header crc mismatch");
}
@@ -106,6 +111,8 @@ namespace SharpCompress.Common.Rar.Headers
protected short HeaderSize { get; private set; }
internal ArchiveEncoding ArchiveEncoding { get; private set; }
/// <summary>
/// This additional size of the header could be file data
/// </summary>

View File

@@ -117,7 +117,7 @@ namespace SharpCompress.Common.Rar.Headers
{
#if !NO_CRYPTO
var reader = new RarCryptoBinaryReader(stream, Options.Password);
if (IsEncrypted)
{
if (Options.Password == null)
@@ -133,7 +133,7 @@ namespace SharpCompress.Common.Rar.Headers
#endif
RarHeader header = RarHeader.Create(reader);
RarHeader header = RarHeader.Create(reader, Options.ArchiveEncoding);
if (header == null)
{
return null;
@@ -141,110 +141,110 @@ namespace SharpCompress.Common.Rar.Headers
switch (header.HeaderType)
{
case HeaderType.ArchiveHeader:
{
var ah = header.PromoteHeader<ArchiveHeader>(reader);
IsEncrypted = ah.HasPassword;
return ah;
}
{
var ah = header.PromoteHeader<ArchiveHeader>(reader);
IsEncrypted = ah.HasPassword;
return ah;
}
case HeaderType.MarkHeader:
{
return header.PromoteHeader<MarkHeader>(reader);
}
{
return header.PromoteHeader<MarkHeader>(reader);
}
case HeaderType.ProtectHeader:
{
ProtectHeader ph = header.PromoteHeader<ProtectHeader>(reader);
// skip the recovery record data, we do not use it.
switch (StreamingMode)
{
case StreamingMode.Seekable:
{
reader.BaseStream.Position += ph.DataSize;
}
break;
case StreamingMode.Streaming:
{
reader.BaseStream.Skip(ph.DataSize);
}
break;
default:
{
throw new InvalidFormatException("Invalid StreamingMode");
}
}
ProtectHeader ph = header.PromoteHeader<ProtectHeader>(reader);
return ph;
}
// skip the recovery record data, we do not use it.
switch (StreamingMode)
{
case StreamingMode.Seekable:
{
reader.BaseStream.Position += ph.DataSize;
}
break;
case StreamingMode.Streaming:
{
reader.BaseStream.Skip(ph.DataSize);
}
break;
default:
{
throw new InvalidFormatException("Invalid StreamingMode");
}
}
return ph;
}
case HeaderType.NewSubHeader:
{
FileHeader fh = header.PromoteHeader<FileHeader>(reader);
switch (StreamingMode)
{
case StreamingMode.Seekable:
FileHeader fh = header.PromoteHeader<FileHeader>(reader);
switch (StreamingMode)
{
fh.DataStartPosition = reader.BaseStream.Position;
reader.BaseStream.Position += fh.CompressedSize;
}
break;
case StreamingMode.Streaming:
{
//skip the data because it's useless?
reader.BaseStream.Skip(fh.CompressedSize);
}
break;
default:
{
throw new InvalidFormatException("Invalid StreamingMode");
case StreamingMode.Seekable:
{
fh.DataStartPosition = reader.BaseStream.Position;
reader.BaseStream.Position += fh.CompressedSize;
}
break;
case StreamingMode.Streaming:
{
//skip the data because it's useless?
reader.BaseStream.Skip(fh.CompressedSize);
}
break;
default:
{
throw new InvalidFormatException("Invalid StreamingMode");
}
}
return fh;
}
return fh;
}
case HeaderType.FileHeader:
{
FileHeader fh = header.PromoteHeader<FileHeader>(reader);
switch (StreamingMode)
{
case StreamingMode.Seekable:
FileHeader fh = header.PromoteHeader<FileHeader>(reader);
switch (StreamingMode)
{
fh.DataStartPosition = reader.BaseStream.Position;
reader.BaseStream.Position += fh.CompressedSize;
}
break;
case StreamingMode.Streaming:
{
var ms = new ReadOnlySubStream(reader.BaseStream, fh.CompressedSize);
if (fh.Salt == null)
{
fh.PackedStream = ms;
}
else
{
case StreamingMode.Seekable:
{
fh.DataStartPosition = reader.BaseStream.Position;
reader.BaseStream.Position += fh.CompressedSize;
}
break;
case StreamingMode.Streaming:
{
var ms = new ReadOnlySubStream(reader.BaseStream, fh.CompressedSize);
if (fh.Salt == null)
{
fh.PackedStream = ms;
}
else
{
#if !NO_CRYPTO
fh.PackedStream = new RarCryptoWrapper(ms, Options.Password, fh.Salt);
fh.PackedStream = new RarCryptoWrapper(ms, Options.Password, fh.Salt);
#else
throw new NotSupportedException("RarCrypto not supported");
#endif
}
}
break;
default:
{
throw new InvalidFormatException("Invalid StreamingMode");
}
}
break;
default:
{
throw new InvalidFormatException("Invalid StreamingMode");
}
}
return fh;
}
return fh;
}
case HeaderType.EndArchiveHeader:
{
return header.PromoteHeader<EndArchiveHeader>(reader);
}
{
return header.PromoteHeader<EndArchiveHeader>(reader);
}
default:
{
throw new InvalidFormatException("Invalid Rar Header: " + header.HeaderType);
}
{
throw new InvalidFormatException("Invalid Rar Header: " + header.HeaderType);
}
}
}
}
}
}

View File

@@ -9,6 +9,7 @@ namespace SharpCompress.Common.Rar
internal abstract class RarFilePart : FilePart
{
internal RarFilePart(MarkHeader mh, FileHeader fh)
: base(fh.ArchiveEncoding)
{
MarkHeader = mh;
FileHeader = fh;

View File

@@ -22,6 +22,13 @@ namespace SharpCompress.Common.SevenZip
internal List<long> PackStreamStartPositions = new List<long>();
internal List<int> FolderStartFileIndex = new List<int>();
internal List<int> FileIndexToFolderIndexMap = new List<int>();
internal IPasswordProvider PasswordProvider { get; }
public ArchiveDatabase(IPasswordProvider passwordProvider)
{
PasswordProvider = passwordProvider;
}
internal void Clear()
{

View File

@@ -182,7 +182,7 @@ namespace SharpCompress.Common.SevenZip
private DateTime? TranslateTime(long? time)
{
if (time.HasValue)
if (time.HasValue && time.Value >= 0 && time.Value <= 2650467743999999999) //maximum Windows file time 31.12.9999
{
return TranslateTime(time.Value);
}
@@ -1211,7 +1211,7 @@ namespace SharpCompress.Common.SevenZip
public ArchiveDatabase ReadDatabase(IPasswordProvider pass)
{
var db = new ArchiveDatabase();
var db = new ArchiveDatabase(pass);
db.Clear();
db.MajorVersion = _header[6];
@@ -1279,7 +1279,7 @@ namespace SharpCompress.Common.SevenZip
throw new InvalidOperationException();
}
var dataVector = ReadAndDecodePackedStreams(db.StartPositionAfterHeader, pass);
var dataVector = ReadAndDecodePackedStreams(db.StartPositionAfterHeader, db.PasswordProvider);
// compressed header without content is odd but ok
if (dataVector.Count == 0)
@@ -1301,7 +1301,7 @@ namespace SharpCompress.Common.SevenZip
}
}
ReadHeader(db, pass);
ReadHeader(db, db.PasswordProvider);
}
db.Fill();
return db;
@@ -1441,7 +1441,7 @@ namespace SharpCompress.Common.SevenZip
#endregion
}
private Stream GetCachedDecoderStream(ArchiveDatabase _db, int folderIndex, IPasswordProvider pw)
private Stream GetCachedDecoderStream(ArchiveDatabase _db, int folderIndex)
{
Stream s;
if (!_cachedStreams.TryGetValue(folderIndex, out s))
@@ -1456,13 +1456,13 @@ namespace SharpCompress.Common.SevenZip
}
s = DecoderStreamHelper.CreateDecoderStream(_stream, folderStartPackPos, packSizes.ToArray(), folderInfo,
pw);
_db.PasswordProvider);
_cachedStreams.Add(folderIndex, s);
}
return s;
}
public Stream OpenStream(ArchiveDatabase _db, int fileIndex, IPasswordProvider pw)
public Stream OpenStream(ArchiveDatabase _db, int fileIndex)
{
int folderIndex = _db.FileIndexToFolderIndexMap[fileIndex];
int numFilesInFolder = _db.NumUnpackStreamsVector[folderIndex];
@@ -1479,12 +1479,12 @@ namespace SharpCompress.Common.SevenZip
skipSize += _db.Files[firstFileIndex + i].Size;
}
Stream s = GetCachedDecoderStream(_db, folderIndex, pw);
Stream s = GetCachedDecoderStream(_db, folderIndex);
s.Position = skipSize;
return new ReadOnlySubStream(s, _db.Files[fileIndex].Size);
}
public void Extract(ArchiveDatabase _db, int[] indices, IPasswordProvider pw)
public void Extract(ArchiveDatabase _db, int[] indices)
{
int numItems;
bool allFilesMode = (indices == null);
@@ -1562,7 +1562,7 @@ namespace SharpCompress.Common.SevenZip
// TODO: If the decoding fails the last file may be extracted incompletely. Delete it?
Stream s = DecoderStreamHelper.CreateDecoderStream(_stream, folderStartPackPos, packSizes.ToArray(),
folderInfo, pw);
folderInfo, _db.PasswordProvider);
byte[] buffer = new byte[4 << 10];
for (;;)
{
@@ -1588,4 +1588,4 @@ namespace SharpCompress.Common.SevenZip
#endregion
}
}
}
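The constant in the TranslateTime guard above is DateTime.MaxValue expressed as a Windows FILETIME (100 ns intervals since 1601-01-01 UTC); a quick sanity check using only the BCL:

using System;

internal static class FileTimeBoundCheck
{
    public static void Main()
    {
        long fileTimeEpochTicks = new DateTime(1601, 1, 1, 0, 0, 0, DateTimeKind.Utc).Ticks;
        long maxFileTime = DateTime.MaxValue.Ticks - fileTimeEpochTicks;

        Console.WriteLine(maxFileTime);                           // 2650467743999999999
        Console.WriteLine(DateTime.FromFileTimeUtc(maxFileTime)); // 31.12.9999 23:59:59
    }
}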

View File

@@ -7,14 +7,15 @@ namespace SharpCompress.Common.SevenZip
{
internal class SevenZipFilePart : FilePart
{
private CompressionType? type;
private readonly Stream stream;
private readonly ArchiveDatabase database;
private CompressionType? _type;
private readonly Stream _stream;
private readonly ArchiveDatabase _database;
internal SevenZipFilePart(Stream stream, ArchiveDatabase database, int index, CFileItem fileEntry)
internal SevenZipFilePart(Stream stream, ArchiveDatabase database, int index, CFileItem fileEntry, ArchiveEncoding archiveEncoding)
: base(archiveEncoding)
{
this.stream = stream;
this.database = database;
this._stream = stream;
this._database = database;
Index = index;
Header = fileEntry;
if (Header.HasStream)
@@ -41,14 +42,14 @@ namespace SharpCompress.Common.SevenZip
{
return null;
}
var folderStream = database.GetFolderStream(stream, Folder, null);
var folderStream = _database.GetFolderStream(_stream, Folder, _database.PasswordProvider);
int firstFileIndex = database.FolderStartFileIndex[database.Folders.IndexOf(Folder)];
int firstFileIndex = _database.FolderStartFileIndex[_database.Folders.IndexOf(Folder)];
int skipCount = Index - firstFileIndex;
long skipSize = 0;
for (int i = 0; i < skipCount; i++)
{
skipSize += database.Files[firstFileIndex + i].Size;
skipSize += _database.Files[firstFileIndex + i].Size;
}
if (skipSize > 0)
{
@@ -61,11 +62,11 @@ namespace SharpCompress.Common.SevenZip
{
get
{
if (type == null)
if (_type == null)
{
type = GetCompression();
_type = GetCompression();
}
return type.Value;
return _type.Value;
}
}
@@ -84,7 +85,7 @@ namespace SharpCompress.Common.SevenZip
{
var coder = Folder.Coders.First();
switch (coder.MethodId.Id)
{
{
case k_LZMA:
case k_LZMA2:
{

View File

@@ -9,6 +9,11 @@ namespace SharpCompress.Common.Tar.Headers
{
internal static readonly DateTime Epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
public TarHeader(ArchiveEncoding archiveEncoding)
{
ArchiveEncoding = archiveEncoding;
}
internal string Name { get; set; }
//internal int Mode { get; set; }
@@ -20,6 +25,7 @@ namespace SharpCompress.Common.Tar.Headers
internal DateTime LastModifiedTime { get; set; }
internal EntryType EntryType { get; set; }
internal Stream PackedStream { get; set; }
internal ArchiveEncoding ArchiveEncoding { get; }
internal const int BlockSize = 512;
@@ -31,7 +37,7 @@ namespace SharpCompress.Common.Tar.Headers
WriteOctalBytes(0, buffer, 108, 8); // owner ID
WriteOctalBytes(0, buffer, 116, 8); // group ID
//Encoding.UTF8.GetBytes("magic").CopyTo(buffer, 257);
//ArchiveEncoding.UTF8.GetBytes("magic").CopyTo(buffer, 257);
if (Name.Length > 100)
{
// Set mock filename and filetype to indicate the next block is the actual name of the file
@@ -72,7 +78,7 @@ namespace SharpCompress.Common.Tar.Headers
private void WriteLongFilenameHeader(Stream output)
{
byte[] nameBytes = ArchiveEncoding.Default.GetBytes(Name);
byte[] nameBytes = ArchiveEncoding.Encode(Name);
output.Write(nameBytes, 0, nameBytes.Length);
// pad to multiple of BlockSize bytes, and make sure a terminating null is added
@@ -99,7 +105,7 @@ namespace SharpCompress.Common.Tar.Headers
}
else
{
Name = ArchiveEncoding.Default.GetString(buffer, 0, 100).TrimNulls();
Name = ArchiveEncoding.Decode(buffer, 0, 100).TrimNulls();
}
EntryType = ReadEntryType(buffer);
@@ -111,12 +117,12 @@ namespace SharpCompress.Common.Tar.Headers
long unixTimeStamp = ReadASCIIInt64Base8(buffer, 136, 11);
LastModifiedTime = Epoch.AddSeconds(unixTimeStamp).ToLocalTime();
Magic = ArchiveEncoding.Default.GetString(buffer, 257, 6).TrimNulls();
Magic = ArchiveEncoding.Decode(buffer, 257, 6).TrimNulls();
if (!string.IsNullOrEmpty(Magic)
&& "ustar".Equals(Magic))
{
string namePrefix = ArchiveEncoding.Default.GetString(buffer, 345, 157);
string namePrefix = ArchiveEncoding.Decode(buffer, 345, 157);
namePrefix = namePrefix.TrimNulls();
if (!string.IsNullOrEmpty(namePrefix))
{
@@ -143,7 +149,7 @@ namespace SharpCompress.Common.Tar.Headers
{
reader.ReadBytes(remainingBytesToRead);
}
return ArchiveEncoding.Default.GetString(nameBytes, 0, nameBytes.Length).TrimNulls();
return ArchiveEncoding.Decode(nameBytes, 0, nameBytes.Length).TrimNulls();
}
private static EntryType ReadEntryType(byte[] buffer)

View File

@@ -3,6 +3,7 @@ using System.Collections.Generic;
using System.IO;
using SharpCompress.Common.Tar.Headers;
using SharpCompress.IO;
using System.Text;
namespace SharpCompress.Common.Tar
{
@@ -43,9 +44,9 @@ namespace SharpCompress.Common.Tar
internal override IEnumerable<FilePart> Parts => filePart.AsEnumerable<FilePart>();
internal static IEnumerable<TarEntry> GetEntries(StreamingMode mode, Stream stream,
CompressionType compressionType)
CompressionType compressionType, ArchiveEncoding archiveEncoding)
{
foreach (TarHeader h in TarHeaderFactory.ReadHeader(mode, stream))
foreach (TarHeader h in TarHeaderFactory.ReadHeader(mode, stream, archiveEncoding))
{
if (h != null)
{

View File

@@ -6,11 +6,12 @@ namespace SharpCompress.Common.Tar
{
internal class TarFilePart : FilePart
{
private readonly Stream seekableStream;
private readonly Stream _seekableStream;
internal TarFilePart(TarHeader header, Stream seekableStream)
: base(header.ArchiveEncoding)
{
this.seekableStream = seekableStream;
this._seekableStream = seekableStream;
Header = header;
}
@@ -20,10 +21,10 @@ namespace SharpCompress.Common.Tar
internal override Stream GetCompressedStream()
{
if (seekableStream != null)
if (_seekableStream != null)
{
seekableStream.Position = Header.DataStartPosition.Value;
return new ReadOnlySubStream(seekableStream, Header.Size);
_seekableStream.Position = Header.DataStartPosition.Value;
return new ReadOnlySubStream(_seekableStream, Header.Size);
}
return Header.PackedStream;
}

View File

@@ -2,12 +2,13 @@
using System.IO;
using SharpCompress.Common.Tar.Headers;
using SharpCompress.IO;
using System.Text;
namespace SharpCompress.Common.Tar
{
internal static class TarHeaderFactory
{
internal static IEnumerable<TarHeader> ReadHeader(StreamingMode mode, Stream stream)
internal static IEnumerable<TarHeader> ReadHeader(StreamingMode mode, Stream stream, ArchiveEncoding archiveEncoding)
{
while (true)
{
@@ -15,7 +16,8 @@ namespace SharpCompress.Common.Tar
try
{
BinaryReader reader = new BinaryReader(stream);
header = new TarHeader();
header = new TarHeader(archiveEncoding);
if (!header.Read(reader))
{
yield break;
@@ -23,22 +25,22 @@ namespace SharpCompress.Common.Tar
switch (mode)
{
case StreamingMode.Seekable:
{
header.DataStartPosition = reader.BaseStream.Position;
{
header.DataStartPosition = reader.BaseStream.Position;
//skip to nearest 512
reader.BaseStream.Position += PadTo512(header.Size);
}
//skip to nearest 512
reader.BaseStream.Position += PadTo512(header.Size);
}
break;
case StreamingMode.Streaming:
{
header.PackedStream = new TarReadOnlySubStream(stream, header.Size);
}
{
header.PackedStream = new TarReadOnlySubStream(stream, header.Size);
}
break;
default:
{
throw new InvalidFormatException("Invalid StreamingMode");
}
{
throw new InvalidFormatException("Invalid StreamingMode");
}
}
}
catch

View File

@@ -6,8 +6,8 @@ namespace SharpCompress.Common.Zip.Headers
{
internal class DirectoryEntryHeader : ZipFileEntry
{
public DirectoryEntryHeader()
: base(ZipHeaderType.DirectoryEntry)
public DirectoryEntryHeader(ArchiveEncoding archiveEncoding)
: base(ZipHeaderType.DirectoryEntry, archiveEncoding)
{
}
@@ -31,10 +31,10 @@ namespace SharpCompress.Common.Zip.Headers
RelativeOffsetOfEntryHeader = reader.ReadUInt32();
byte[] name = reader.ReadBytes(nameLength);
Name = DecodeString(name);
Name = ArchiveEncoding.Decode(name);
byte[] extra = reader.ReadBytes(extraLength);
byte[] comment = reader.ReadBytes(commentLength);
Comment = DecodeString(comment);
Comment = ArchiveEncoding.Decode(comment);
LoadExtra(extra);
var unicodePathExtra = Extra.FirstOrDefault(u => u.Type == ExtraDataType.UnicodePathExtraField);

View File

@@ -5,6 +5,7 @@ namespace SharpCompress.Common.Zip.Headers
[Flags]
internal enum HeaderFlags : ushort
{
None = 0,
Encrypted = 1, // http://www.pkware.com/documents/casestudies/APPNOTE.TXT
Bit1 = 2,
Bit2 = 4,

View File

@@ -1,12 +1,13 @@
using System.IO;
using System.Linq;
using System.Text;
namespace SharpCompress.Common.Zip.Headers
{
internal class LocalEntryHeader : ZipFileEntry
{
public LocalEntryHeader()
: base(ZipHeaderType.LocalEntry)
public LocalEntryHeader(ArchiveEncoding archiveEncoding)
: base(ZipHeaderType.LocalEntry, archiveEncoding)
{
}
@@ -24,7 +25,7 @@ namespace SharpCompress.Common.Zip.Headers
ushort extraLength = reader.ReadUInt16();
byte[] name = reader.ReadBytes(nameLength);
byte[] extra = reader.ReadBytes(extraLength);
Name = DecodeString(name);
Name = ArchiveEncoding.Decode(name);
LoadExtra(extra);
var unicodePathExtra = Extra.FirstOrDefault(u => u.Type == ExtraDataType.UnicodePathExtraField);

View File

@@ -8,10 +8,11 @@ namespace SharpCompress.Common.Zip.Headers
{
internal abstract class ZipFileEntry : ZipHeader
{
protected ZipFileEntry(ZipHeaderType type)
protected ZipFileEntry(ZipHeaderType type, ArchiveEncoding archiveEncoding)
: base(type)
{
Extra = new List<ExtraData>();
ArchiveEncoding = archiveEncoding;
}
internal bool IsDirectory
@@ -29,28 +30,11 @@ namespace SharpCompress.Common.Zip.Headers
&& Name.EndsWith("\\");
}
}
protected string DecodeString(byte[] str)
{
if (FlagUtility.HasFlag(Flags, HeaderFlags.UTF8))
{
return Encoding.UTF8.GetString(str, 0, str.Length);
}
return ArchiveEncoding.Default.GetString(str, 0, str.Length);
}
protected byte[] EncodeString(string str)
{
if (FlagUtility.HasFlag(Flags, HeaderFlags.UTF8))
{
return Encoding.UTF8.GetBytes(str);
}
return ArchiveEncoding.Default.GetBytes(str);
}
internal Stream PackedStream { get; set; }
internal ArchiveEncoding ArchiveEncoding { get; }
internal string Name { get; set; }
internal HeaderFlags Flags { get; set; }
@@ -64,7 +48,7 @@ namespace SharpCompress.Common.Zip.Headers
internal long UncompressedSize { get; set; }
internal List<ExtraData> Extra { get; set; }
public string Password { get; set; }
internal PkwareTraditionalEncryptionData ComposeEncryptionData(Stream archiveStream)
@@ -75,10 +59,10 @@ namespace SharpCompress.Common.Zip.Headers
}
var buffer = new byte[12];
archiveStream.Read(buffer, 0, 12);
archiveStream.ReadFully(buffer);
PkwareTraditionalEncryptionData encryptionData = PkwareTraditionalEncryptionData.ForRead(Password, this, buffer);
return encryptionData;
}

View File

@@ -42,7 +42,7 @@ namespace SharpCompress.Common.Zip
if (buffer == null)
{
throw new ArgumentNullException("buffer");
throw new ArgumentNullException(nameof(buffer));
}
byte[] temp = new byte[count];

View File

@@ -9,9 +9,11 @@ namespace SharpCompress.Common.Zip
{
private static readonly CRC32 crc32 = new CRC32();
private readonly UInt32[] _Keys = {0x12345678, 0x23456789, 0x34567890};
private readonly ArchiveEncoding _archiveEncoding;
private PkwareTraditionalEncryptionData(string password)
private PkwareTraditionalEncryptionData(string password, ArchiveEncoding archiveEncoding)
{
_archiveEncoding = archiveEncoding;
Initialize(password);
}
@@ -27,7 +29,7 @@ namespace SharpCompress.Common.Zip
public static PkwareTraditionalEncryptionData ForRead(string password, ZipFileEntry header,
byte[] encryptionHeader)
{
var encryptor = new PkwareTraditionalEncryptionData(password);
var encryptor = new PkwareTraditionalEncryptionData(password, header.ArchiveEncoding);
byte[] plainTextHeader = encryptor.Decrypt(encryptionHeader, encryptionHeader.Length);
if (plainTextHeader[11] != (byte)((header.Crc >> 24) & 0xff))
{
@@ -47,7 +49,7 @@ namespace SharpCompress.Common.Zip
{
if (length > cipherText.Length)
{
throw new ArgumentOutOfRangeException("length",
throw new ArgumentOutOfRangeException(nameof(length),
"Bad length during Decryption: the length parameter must be smaller than or equal to the size of the destination array.");
}
@@ -70,7 +72,7 @@ namespace SharpCompress.Common.Zip
if (length > plainText.Length)
{
throw new ArgumentOutOfRangeException("length",
throw new ArgumentOutOfRangeException(nameof(length),
"Bad length during Encryption: The length parameter must be smaller than or equal to the size of the destination array.");
}
@@ -93,17 +95,12 @@ namespace SharpCompress.Common.Zip
}
}
internal static byte[] StringToByteArray(string value, Encoding encoding)
internal byte[] StringToByteArray(string value)
{
byte[] a = encoding.GetBytes(value);
byte[] a = _archiveEncoding.Password.GetBytes(value);
return a;
}
internal static byte[] StringToByteArray(string value)
{
return StringToByteArray(value, ArchiveEncoding.Password);
}
private void UpdateKeys(byte byteValue)
{
_Keys[0] = (UInt32)crc32.ComputeCrc32((int)_Keys[0], byteValue);

View File

@@ -5,21 +5,21 @@ namespace SharpCompress.Common.Zip
{
internal class SeekableZipFilePart : ZipFilePart
{
private bool isLocalHeaderLoaded;
private readonly SeekableZipHeaderFactory headerFactory;
private bool _isLocalHeaderLoaded;
private readonly SeekableZipHeaderFactory _headerFactory;
internal SeekableZipFilePart(SeekableZipHeaderFactory headerFactory, DirectoryEntryHeader header, Stream stream)
: base(header, stream)
{
this.headerFactory = headerFactory;
this._headerFactory = headerFactory;
}
internal override Stream GetCompressedStream()
{
if (!isLocalHeaderLoaded)
if (!_isLocalHeaderLoaded)
{
LoadLocalHeader();
isLocalHeaderLoaded = true;
_isLocalHeaderLoaded = true;
}
return base.GetCompressedStream();
}
@@ -29,7 +29,7 @@ namespace SharpCompress.Common.Zip
private void LoadLocalHeader()
{
bool hasData = Header.HasData;
Header = headerFactory.GetLocalHeader(BaseStream, Header as DirectoryEntryHeader);
Header = _headerFactory.GetLocalHeader(BaseStream, Header as DirectoryEntryHeader);
Header.HasData = hasData;
}

View File

@@ -3,16 +3,17 @@ using System.Collections.Generic;
using System.IO;
using SharpCompress.Common.Zip.Headers;
using SharpCompress.IO;
using System.Text;
namespace SharpCompress.Common.Zip
{
internal class SeekableZipHeaderFactory : ZipHeaderFactory
{
private const int MAX_ITERATIONS_FOR_DIRECTORY_HEADER = 4096;
private bool zip64;
private bool _zip64;
internal SeekableZipHeaderFactory(string password)
: base(StreamingMode.Seekable, password)
internal SeekableZipHeaderFactory(string password, ArchiveEncoding archiveEncoding)
: base(StreamingMode.Seekable, password, archiveEncoding)
{
}
@@ -26,14 +27,14 @@ namespace SharpCompress.Common.Zip
if (entry.IsZip64)
{
zip64 = true;
_zip64 = true;
SeekBackToHeader(stream, reader, ZIP64_END_OF_CENTRAL_DIRECTORY_LOCATOR);
var zip64Locator = new Zip64DirectoryEndLocatorHeader();
zip64Locator.Read(reader);
stream.Seek(zip64Locator.RelativeOffsetOfTheEndOfDirectoryRecord, SeekOrigin.Begin);
uint zip64Signature = reader.ReadUInt32();
if(zip64Signature != ZIP64_END_OF_CENTRAL_DIRECTORY)
if (zip64Signature != ZIP64_END_OF_CENTRAL_DIRECTORY)
throw new ArchiveException("Failed to locate the Zip64 Header");
var zip64Entry = new Zip64DirectoryEndHeader();
@@ -50,7 +51,7 @@ namespace SharpCompress.Common.Zip
{
stream.Position = position;
uint signature = reader.ReadUInt32();
var directoryEntryHeader = ReadHeader(signature, reader, zip64) as DirectoryEntryHeader;
var directoryEntryHeader = ReadHeader(signature, reader, _zip64) as DirectoryEntryHeader;
position = stream.Position;
if (directoryEntryHeader == null)
{
@@ -91,7 +92,7 @@ namespace SharpCompress.Common.Zip
stream.Seek(directoryEntryHeader.RelativeOffsetOfEntryHeader, SeekOrigin.Begin);
BinaryReader reader = new BinaryReader(stream);
uint signature = reader.ReadUInt32();
var localEntryHeader = ReadHeader(signature, reader, zip64) as LocalEntryHeader;
var localEntryHeader = ReadHeader(signature, reader, _zip64) as LocalEntryHeader;
if (localEntryHeader == null)
{
throw new InvalidOperationException();

View File

@@ -39,19 +39,20 @@ namespace SharpCompress.Common.Zip
{
return new BinaryReader(rewindableStream);
}
if (Header.HasData)
if (Header.HasData && !Skipped)
{
if (decompressionStream == null)
{
decompressionStream = GetCompressedStream();
}
decompressionStream.SkipAll();
decompressionStream.Skip();
DeflateStream deflateStream = decompressionStream as DeflateStream;
if (deflateStream != null)
{
rewindableStream.Rewind(deflateStream.InputBuffer);
}
Skipped = true;
}
var reader = new BinaryReader(rewindableStream);
decompressionStream = null;

View File

@@ -2,13 +2,14 @@
using System.IO;
using SharpCompress.Common.Zip.Headers;
using SharpCompress.IO;
using System.Text;
namespace SharpCompress.Common.Zip
{
internal class StreamingZipHeaderFactory : ZipHeaderFactory
{
internal StreamingZipHeaderFactory(string password)
: base(StreamingMode.Streaming, password)
internal StreamingZipHeaderFactory(string password, ArchiveEncoding archiveEncoding)
: base(StreamingMode.Streaming, password, archiveEncoding)
{
}

View File

@@ -78,7 +78,7 @@ namespace SharpCompress.Common.Zip
{
//read out last 10 auth bytes
var ten = new byte[10];
stream.Read(ten, 0, 10);
stream.ReadFully(ten);
stream.Dispose();
}
}

View File

@@ -15,6 +15,7 @@ namespace SharpCompress.Common.Zip
internal abstract class ZipFilePart : FilePart
{
internal ZipFilePart(ZipFileEntry header, Stream stream)
: base(header.ArchiveEncoding)
{
Header = header;
header.Part = this;
@@ -88,7 +89,7 @@ namespace SharpCompress.Common.Zip
case ZipCompressionMethod.PPMd:
{
var props = new byte[2];
stream.Read(props, 0, props.Length);
stream.ReadFully(props);
return new PpmdStream(new PpmdProperties(props), stream, false);
}
case ZipCompressionMethod.WinzipAes:
@@ -175,7 +176,6 @@ namespace SharpCompress.Common.Zip
}
}
return plainStream;
}
}

View File

@@ -5,6 +5,7 @@ using System.Linq;
#endif
using SharpCompress.Common.Zip.Headers;
using SharpCompress.IO;
using System.Text;
namespace SharpCompress.Common.Zip
{
@@ -23,11 +24,13 @@ namespace SharpCompress.Common.Zip
protected LocalEntryHeader lastEntryHeader;
private readonly string password;
private readonly StreamingMode mode;
private readonly ArchiveEncoding archiveEncoding;
protected ZipHeaderFactory(StreamingMode mode, string password)
protected ZipHeaderFactory(StreamingMode mode, string password, ArchiveEncoding archiveEncoding)
{
this.mode = mode;
this.password = password;
this.archiveEncoding = archiveEncoding;
}
protected ZipHeader ReadHeader(uint headerBytes, BinaryReader reader, bool zip64 = false)
@@ -36,7 +39,7 @@ namespace SharpCompress.Common.Zip
{
case ENTRY_HEADER_BYTES:
{
var entryHeader = new LocalEntryHeader();
var entryHeader = new LocalEntryHeader(archiveEncoding);
entryHeader.Read(reader);
LoadHeader(entryHeader, reader.BaseStream);
@@ -45,48 +48,48 @@ namespace SharpCompress.Common.Zip
}
case DIRECTORY_START_HEADER_BYTES:
{
var entry = new DirectoryEntryHeader();
var entry = new DirectoryEntryHeader(archiveEncoding);
entry.Read(reader);
return entry;
}
case POST_DATA_DESCRIPTOR:
{
if (FlagUtility.HasFlag(lastEntryHeader.Flags, HeaderFlags.UsePostDataDescriptor))
{
lastEntryHeader.Crc = reader.ReadUInt32();
lastEntryHeader.CompressedSize = zip64 ? (long)reader.ReadUInt64() : reader.ReadUInt32();
lastEntryHeader.UncompressedSize = zip64 ? (long)reader.ReadUInt64() : reader.ReadUInt32();
if (FlagUtility.HasFlag(lastEntryHeader.Flags, HeaderFlags.UsePostDataDescriptor))
{
lastEntryHeader.Crc = reader.ReadUInt32();
lastEntryHeader.CompressedSize = zip64 ? (long)reader.ReadUInt64() : reader.ReadUInt32();
lastEntryHeader.UncompressedSize = zip64 ? (long)reader.ReadUInt64() : reader.ReadUInt32();
}
else
{
reader.ReadBytes(zip64 ? 20 : 12);
}
return null;
}
else
{
reader.ReadBytes(zip64 ? 20 : 12);
}
return null;
}
case DIGITAL_SIGNATURE:
return null;
case DIRECTORY_END_HEADER_BYTES:
{
var entry = new DirectoryEndHeader();
entry.Read(reader);
return entry;
}
{
var entry = new DirectoryEndHeader();
entry.Read(reader);
return entry;
}
case SPLIT_ARCHIVE_HEADER_BYTES:
{
return new SplitHeader();
}
{
return new SplitHeader();
}
case ZIP64_END_OF_CENTRAL_DIRECTORY:
{
var entry = new Zip64DirectoryEndHeader();
entry.Read(reader);
return entry;
}
{
var entry = new Zip64DirectoryEndHeader();
entry.Read(reader);
return entry;
}
case ZIP64_END_OF_CENTRAL_DIRECTORY_LOCATOR:
{
var entry = new Zip64DirectoryEndLocatorHeader();
entry.Read(reader);
return entry;
}
{
var entry = new Zip64DirectoryEndLocatorHeader();
entry.Read(reader);
return entry;
}
default:
throw new NotSupportedException("Unknown header: " + headerBytes);
}
@@ -165,22 +168,22 @@ namespace SharpCompress.Common.Zip
switch (mode)
{
case StreamingMode.Seekable:
{
entryHeader.DataStartPosition = stream.Position;
stream.Position += entryHeader.CompressedSize;
break;
}
{
entryHeader.DataStartPosition = stream.Position;
stream.Position += entryHeader.CompressedSize;
break;
}
case StreamingMode.Streaming:
{
entryHeader.PackedStream = stream;
break;
}
{
entryHeader.PackedStream = stream;
break;
}
default:
{
throw new InvalidFormatException("Invalid StreamingMode");
}
{
throw new InvalidFormatException("Invalid StreamingMode");
}
}
//}

View File

@@ -105,19 +105,19 @@ namespace SharpCompress.Compressors.ADC
}
if (buffer == null)
{
throw new ArgumentNullException("buffer");
throw new ArgumentNullException(nameof(buffer));
}
if (count < 0)
{
throw new ArgumentOutOfRangeException("count");
throw new ArgumentOutOfRangeException(nameof(count));
}
if (offset < buffer.GetLowerBound(0))
{
throw new ArgumentOutOfRangeException("offset");
throw new ArgumentOutOfRangeException(nameof(offset));
}
if ((offset + count) > buffer.GetLength(0))
{
throw new ArgumentOutOfRangeException("count");
throw new ArgumentOutOfRangeException(nameof(count));
}
int size = -1;

View File

@@ -26,6 +26,7 @@
using System;
using System.IO;
using System.Text;
namespace SharpCompress.Compressors.Deflate
{
@@ -36,9 +37,10 @@ namespace SharpCompress.Compressors.Deflate
public DeflateStream(Stream stream, CompressionMode mode,
CompressionLevel level = CompressionLevel.Default,
bool leaveOpen = false)
bool leaveOpen = false,
Encoding forceEncoding = null)
{
_baseStream = new ZlibBaseStream(stream, mode, level, ZlibStreamFlavor.DEFLATE, leaveOpen);
_baseStream = new ZlibBaseStream(stream, mode, level, ZlibStreamFlavor.DEFLATE, leaveOpen, forceEncoding);
}
#region Zlib properties

View File

@@ -30,41 +30,45 @@ using System;
using System.IO;
using SharpCompress.Common;
using SharpCompress.Converters;
using System.Text;
namespace SharpCompress.Compressors.Deflate
{
public class GZipStream : Stream
{
internal static readonly DateTime UnixEpoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
internal static readonly DateTime UNIX_EPOCH = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
public DateTime? LastModified { get; set; }
private string comment;
private string fileName;
private string _comment;
private string _fileName;
internal ZlibBaseStream BaseStream;
private bool disposed;
private bool firstReadDone;
private int headerByteCount;
private bool _disposed;
private bool _firstReadDone;
private int _headerByteCount;
private readonly Encoding _encoding;
public GZipStream(Stream stream, CompressionMode mode)
: this(stream, mode, CompressionLevel.Default, false)
: this(stream, mode, CompressionLevel.Default, false, Encoding.UTF8)
{
}
public GZipStream(Stream stream, CompressionMode mode, CompressionLevel level)
: this(stream, mode, level, false)
: this(stream, mode, level, false, Encoding.UTF8)
{
}
public GZipStream(Stream stream, CompressionMode mode, bool leaveOpen)
: this(stream, mode, CompressionLevel.Default, leaveOpen)
: this(stream, mode, CompressionLevel.Default, leaveOpen, Encoding.UTF8)
{
}
public GZipStream(Stream stream, CompressionMode mode, CompressionLevel level, bool leaveOpen)
public GZipStream(Stream stream, CompressionMode mode, CompressionLevel level, bool leaveOpen, Encoding encoding)
{
BaseStream = new ZlibBaseStream(stream, mode, level, ZlibStreamFlavor.GZIP, leaveOpen);
BaseStream = new ZlibBaseStream(stream, mode, level, ZlibStreamFlavor.GZIP, leaveOpen, encoding);
_encoding = encoding;
}
#region Zlib properties
@@ -74,7 +78,7 @@ namespace SharpCompress.Compressors.Deflate
get => (BaseStream._flushMode);
set
{
if (disposed)
if (_disposed)
{
throw new ObjectDisposedException("GZipStream");
}
@@ -87,7 +91,7 @@ namespace SharpCompress.Compressors.Deflate
get => BaseStream._bufferSize;
set
{
if (disposed)
if (_disposed)
{
throw new ObjectDisposedException("GZipStream");
}
@@ -123,7 +127,7 @@ namespace SharpCompress.Compressors.Deflate
{
get
{
if (disposed)
if (_disposed)
{
throw new ObjectDisposedException("GZipStream");
}
@@ -149,7 +153,7 @@ namespace SharpCompress.Compressors.Deflate
{
get
{
if (disposed)
if (_disposed)
{
throw new ObjectDisposedException("GZipStream");
}
@@ -179,7 +183,7 @@ namespace SharpCompress.Compressors.Deflate
{
if (BaseStream._streamMode == ZlibBaseStream.StreamMode.Writer)
{
return BaseStream._z.TotalBytesOut + headerByteCount;
return BaseStream._z.TotalBytesOut + _headerByteCount;
}
if (BaseStream._streamMode == ZlibBaseStream.StreamMode.Reader)
{
@@ -202,14 +206,14 @@ namespace SharpCompress.Compressors.Deflate
{
try
{
if (!disposed)
if (!_disposed)
{
if (disposing && (BaseStream != null))
{
BaseStream.Dispose();
Crc32 = BaseStream.Crc32;
}
disposed = true;
_disposed = true;
}
}
finally
@@ -223,7 +227,7 @@ namespace SharpCompress.Compressors.Deflate
/// </summary>
public override void Flush()
{
if (disposed)
if (_disposed)
{
throw new ObjectDisposedException("GZipStream");
}
@@ -263,7 +267,7 @@ namespace SharpCompress.Compressors.Deflate
/// <returns>the number of bytes actually read</returns>
public override int Read(byte[] buffer, int offset, int count)
{
if (disposed)
if (_disposed)
{
throw new ObjectDisposedException("GZipStream");
}
@@ -272,9 +276,9 @@ namespace SharpCompress.Compressors.Deflate
// Console.WriteLine("GZipStream::Read(buffer, off({0}), c({1}) = {2}", offset, count, n);
// Console.WriteLine( Util.FormatByteArray(buffer, offset, n) );
if (!firstReadDone)
if (!_firstReadDone)
{
firstReadDone = true;
_firstReadDone = true;
FileName = BaseStream._GzipFileName;
Comment = BaseStream._GzipComment;
}
@@ -325,7 +329,7 @@ namespace SharpCompress.Compressors.Deflate
/// <param name="count">the number of bytes to write.</param>
public override void Write(byte[] buffer, int offset, int count)
{
if (disposed)
if (_disposed)
{
throw new ObjectDisposedException("GZipStream");
}
@@ -335,7 +339,7 @@ namespace SharpCompress.Compressors.Deflate
if (BaseStream._wantCompress)
{
// first write in compression, therefore, emit the GZIP header
headerByteCount = EmitHeader();
_headerByteCount = EmitHeader();
}
else
{
@@ -346,56 +350,56 @@ namespace SharpCompress.Compressors.Deflate
BaseStream.Write(buffer, offset, count);
}
#endregion
#endregion Stream methods
public String Comment
{
get => comment;
get => _comment;
set
{
if (disposed)
if (_disposed)
{
throw new ObjectDisposedException("GZipStream");
}
comment = value;
_comment = value;
}
}
public string FileName
{
get => fileName;
get => _fileName;
set
{
if (disposed)
if (_disposed)
{
throw new ObjectDisposedException("GZipStream");
}
fileName = value;
if (fileName == null)
_fileName = value;
if (_fileName == null)
{
return;
}
if (fileName.IndexOf("/") != -1)
if (_fileName.IndexOf("/") != -1)
{
fileName = fileName.Replace("/", "\\");
_fileName = _fileName.Replace("/", "\\");
}
if (fileName.EndsWith("\\"))
if (_fileName.EndsWith("\\"))
{
throw new InvalidOperationException("Illegal filename");
}
var index = fileName.IndexOf("\\");
var index = _fileName.IndexOf("\\");
if (index != -1)
{
// trim any leading path
int length = fileName.Length;
int length = _fileName.Length;
int num = length;
while (--num >= 0)
{
char c = fileName[num];
char c = _fileName[num];
if (c == '\\')
{
fileName = fileName.Substring(num + 1, length - num - 1);
_fileName = _fileName.Substring(num + 1, length - num - 1);
}
}
}
@@ -406,8 +410,10 @@ namespace SharpCompress.Compressors.Deflate
private int EmitHeader()
{
byte[] commentBytes = (Comment == null) ? null : ArchiveEncoding.Default.GetBytes(Comment);
byte[] filenameBytes = (FileName == null) ? null : ArchiveEncoding.Default.GetBytes(FileName);
byte[] commentBytes = (Comment == null) ? null
: _encoding.GetBytes(Comment);
byte[] filenameBytes = (FileName == null) ? null
: _encoding.GetBytes(FileName);
int cbLength = (Comment == null) ? 0 : commentBytes.Length + 1;
int fnLength = (FileName == null) ? 0 : filenameBytes.Length + 1;
@@ -440,7 +446,7 @@ namespace SharpCompress.Compressors.Deflate
{
LastModified = DateTime.Now;
}
TimeSpan delta = LastModified.Value - UnixEpoch;
TimeSpan delta = LastModified.Value - UNIX_EPOCH;
var timet = (Int32)delta.TotalSeconds;
DataConverter.LittleEndian.PutBytes(header, i, timet);
i += 4;
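
For illustration, a minimal usage sketch of the encoding-aware GZipStream constructor added above; the file paths are placeholders and the namespace imports are assumed from the library layout, so treat this as a sketch rather than the canonical API surface.

using System.IO;
using System.Text;
using SharpCompress.Compressors;
using SharpCompress.Compressors.Deflate;

class GZipEncodingExample
{
    static void Main()
    {
        // Compress a file and write the GZIP header strings (FileName/Comment)
        // with an explicitly chosen encoding instead of a static default.
        using (var output = File.Create("example.txt.gz"))      // placeholder path
        using (var gzip = new GZipStream(output, CompressionMode.Compress,
                                         CompressionLevel.Default,
                                         leaveOpen: false,
                                         encoding: Encoding.UTF8))
        using (var input = File.OpenRead("example.txt"))         // placeholder path
        {
            gzip.FileName = "example.txt";
            gzip.Comment = "written with a forced encoding";
            input.CopyTo(gzip);
        }
    }
}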

View File

@@ -418,7 +418,7 @@ namespace SharpCompress.Compressors.Deflate
internal sealed class Adler
{
// largest prime smaller than 65536
private static readonly int BASE = 65521;
private static readonly uint BASE = 65521U;
// NMAX is the largest n such that 255n(n+1)/2 + (n+1)(BASE-1) <= 2^32-1
private static readonly int NMAX = 5552;
@@ -430,8 +430,8 @@ namespace SharpCompress.Compressors.Deflate
return 1;
}
int s1 = (int)(adler & 0xffff);
int s2 = (int)((adler >> 16) & 0xffff);
uint s1 = adler & 0xffffU;
uint s2 = (adler >> 16) & 0xffffU;
while (len > 0)
{
@@ -486,7 +486,7 @@ namespace SharpCompress.Compressors.Deflate
s1 %= BASE;
s2 %= BASE;
}
return (uint)((s2 << 16) | s1);
return (s2 << 16) | s1;
}
}
}
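
As a reference point for the unsigned-arithmetic change above, a self-contained Adler-32 sketch; it is an illustration of the checksum, not the library's internal Adler class, and the NMAX batching optimisation is intentionally omitted.

using System;
using System.Text;

static class Adler32Example
{
    const uint Base = 65521; // largest prime smaller than 65536

    // Straightforward Adler-32 over a whole buffer, seeded with 1 as zlib requires.
    static uint Compute(byte[] data)
    {
        uint s1 = 1, s2 = 0;
        foreach (byte b in data)
        {
            s1 = (s1 + b) % Base;  // unsigned throughout, which is the point of the fix
            s2 = (s2 + s1) % Base;
        }
        return (s2 << 16) | s1;
    }

    static void Main()
    {
        Console.WriteLine(Compute(Encoding.ASCII.GetBytes("Wikipedia")).ToString("x8")); // 11e60398
    }
}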

View File

@@ -1,20 +1,20 @@
// ZlibBaseStream.cs
// ------------------------------------------------------------------
//
// Copyright (c) 2009 Dino Chiesa and Microsoft Corporation.
// Copyright (c) 2009 Dino Chiesa and Microsoft Corporation.
// All rights reserved.
//
// This code module is part of DotNetZip, a zipfile class library.
//
// ------------------------------------------------------------------
//
// This code is licensed under the Microsoft Public License.
// This code is licensed under the Microsoft Public License.
// See the file License.txt for the license details.
// More info on: http://dotnetzip.codeplex.com
//
// ------------------------------------------------------------------
//
// last saved (in emacs):
// last saved (in emacs):
// Time-stamp: <2009-October-28 15:45:15>
//
// ------------------------------------------------------------------
@@ -30,6 +30,7 @@ using System.IO;
using SharpCompress.Common;
using SharpCompress.Common.Tar.Headers;
using SharpCompress.Converters;
using System.Text;
namespace SharpCompress.Compressors.Deflate
{
@@ -64,6 +65,8 @@ namespace SharpCompress.Compressors.Deflate
protected internal DateTime _GzipMtime;
protected internal int _gzipHeaderByteCount;
private readonly Encoding _encoding;
internal int Crc32
{
get
@@ -80,7 +83,8 @@ namespace SharpCompress.Compressors.Deflate
CompressionMode compressionMode,
CompressionLevel level,
ZlibStreamFlavor flavor,
bool leaveOpen)
bool leaveOpen,
Encoding encoding)
{
_flushMode = FlushType.None;
@@ -91,6 +95,8 @@ namespace SharpCompress.Compressors.Deflate
_flavor = flavor;
_level = level;
_encoding = encoding;
// workitem 7159
if (flavor == ZlibStreamFlavor.GZIP)
{
@@ -418,8 +424,8 @@ namespace SharpCompress.Compressors.Deflate
}
}
while (!done);
byte[] a = list.ToArray();
return ArchiveEncoding.Default.GetString(a, 0, a.Length);
byte[] buffer = list.ToArray();
return _encoding.GetString(buffer, 0, buffer.Length);
}
private int _ReadAndValidateGzipHeader()
@@ -528,19 +534,19 @@ namespace SharpCompress.Compressors.Deflate
}
if (buffer == null)
{
throw new ArgumentNullException("buffer");
throw new ArgumentNullException(nameof(buffer));
}
if (count < 0)
{
throw new ArgumentOutOfRangeException("count");
throw new ArgumentOutOfRangeException(nameof(count));
}
if (offset < buffer.GetLowerBound(0))
{
throw new ArgumentOutOfRangeException("offset");
throw new ArgumentOutOfRangeException(nameof(offset));
}
if ((offset + count) > buffer.GetLength(0))
{
throw new ArgumentOutOfRangeException("count");
throw new ArgumentOutOfRangeException(nameof(count));
}
int rc = 0;
@@ -593,7 +599,7 @@ namespace SharpCompress.Compressors.Deflate
while (_z.AvailableBytesOut > 0 && !nomoreinput && rc == ZlibConstants.Z_OK);
// workitem 8557
// is there more room in output?
// is there more room in output?
if (_z.AvailableBytesOut > 0)
{
if (rc == ZlibConstants.Z_OK && _z.AvailableBytesIn == 0)

View File

@@ -27,6 +27,7 @@
using System;
using System.IO;
using System.Text;
namespace SharpCompress.Compressors.Deflate
{
@@ -36,23 +37,23 @@ namespace SharpCompress.Compressors.Deflate
private bool _disposed;
public ZlibStream(Stream stream, CompressionMode mode)
: this(stream, mode, CompressionLevel.Default, false)
: this(stream, mode, CompressionLevel.Default, false, Encoding.UTF8)
{
}
public ZlibStream(Stream stream, CompressionMode mode, CompressionLevel level)
: this(stream, mode, level, false)
: this(stream, mode, level, false, Encoding.UTF8)
{
}
public ZlibStream(Stream stream, CompressionMode mode, bool leaveOpen)
: this(stream, mode, CompressionLevel.Default, leaveOpen)
: this(stream, mode, CompressionLevel.Default, leaveOpen, Encoding.UTF8)
{
}
public ZlibStream(Stream stream, CompressionMode mode, CompressionLevel level, bool leaveOpen)
public ZlibStream(Stream stream, CompressionMode mode, CompressionLevel level, bool leaveOpen, Encoding encoding)
{
_baseStream = new ZlibBaseStream(stream, mode, level, ZlibStreamFlavor.ZLIB, leaveOpen);
_baseStream = new ZlibBaseStream(stream, mode, level, ZlibStreamFlavor.ZLIB, leaveOpen, encoding);
}
#region Zlib properties
@@ -326,6 +327,6 @@ namespace SharpCompress.Compressors.Deflate
_baseStream.Write(buffer, offset, count);
}
#endregion
#endregion System.IO.Stream methods
}
}

View File

@@ -58,7 +58,7 @@ namespace SharpCompress.Compressors.LZMA
{
if (index < 0 || index >= Length)
{
throw new ArgumentOutOfRangeException("index");
throw new ArgumentOutOfRangeException(nameof(index));
}
return (mBits[index >> 5] & (1u << (index & 31))) != 0;
@@ -69,7 +69,7 @@ namespace SharpCompress.Compressors.LZMA
{
if (index < 0 || index >= Length)
{
throw new ArgumentOutOfRangeException("index");
throw new ArgumentOutOfRangeException(nameof(index));
}
mBits[index >> 5] |= 1u << (index & 31);
@@ -79,7 +79,7 @@ namespace SharpCompress.Compressors.LZMA
{
if (index < 0 || index >= Length)
{
throw new ArgumentOutOfRangeException("index");
throw new ArgumentOutOfRangeException(nameof(index));
}
uint bits = mBits[index >> 5];

View File

@@ -58,22 +58,22 @@ namespace SharpCompress.Compressors.LZMA.Utilites
{
if (stream == null)
{
throw new ArgumentNullException("stream");
throw new ArgumentNullException(nameof(stream));
}
if (buffer == null)
{
throw new ArgumentNullException("buffer");
throw new ArgumentNullException(nameof(buffer));
}
if (offset < 0 || offset > buffer.Length)
{
throw new ArgumentOutOfRangeException("offset");
throw new ArgumentOutOfRangeException(nameof(offset));
}
if (length < 0 || length > buffer.Length - offset)
{
throw new ArgumentOutOfRangeException("length");
throw new ArgumentOutOfRangeException(nameof(length));
}
while (length > 0)

View File

@@ -146,12 +146,12 @@ namespace SharpCompress.Compressors.PPMd.I1
{
if (target == null)
{
throw new ArgumentNullException("target");
throw new ArgumentNullException(nameof(target));
}
if (source == null)
{
throw new ArgumentNullException("source");
throw new ArgumentNullException(nameof(source));
}
EncodeStart(properties);
@@ -235,12 +235,12 @@ namespace SharpCompress.Compressors.PPMd.I1
{
if (target == null)
{
throw new ArgumentNullException("target");
throw new ArgumentNullException(nameof(target));
}
if (source == null)
{
throw new ArgumentNullException("source");
throw new ArgumentNullException(nameof(source));
}
DecodeStart(source, properties);

View File

@@ -31,7 +31,7 @@ namespace SharpCompress.Compressors.Rar {
{
currentCrc = RarCRC.CheckCrc(currentCrc, buffer, offset, result);
}
else if (GetCrc() != readStream.CurrentCrc)
else if (GetCrc() != readStream.CurrentCrc && count != 0)
{
// NOTE: we use the last FileHeader in a multipart volume to check CRC
throw new InvalidFormatException("file crc mismatch");

View File

@@ -18,9 +18,11 @@ namespace SharpCompress.Compressors.Xz
public static int ReadLittleEndianInt32(this Stream stream)
{
byte[] bytes = new byte[4];
var read = stream.Read(bytes, 0, 4);
if (read != 4)
var read = stream.ReadFully(bytes);
if (!read)
{
throw new EndOfStreamException();
}
return (bytes[0] + (bytes[1] << 8) + (bytes[2] << 16) + (bytes[3] << 24));
}

View File

@@ -18,7 +18,7 @@ namespace SharpCompress.Compressors.Xz
int i = 0;
while ((LastByte & 0x80) != 0)
{
if (i >= MaxBytes)
if (++i >= MaxBytes)
throw new InvalidDataException();
LastByte = reader.ReadByte();
if (LastByte == 0)
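
The off-by-one being fixed here is easier to see in a standalone decoder for XZ-style variable-length integers (7 bits per byte, little-endian, at most 9 bytes); the names below are illustrative, not the library's.

using System;
using System.IO;

static class XzVarIntExample
{
    const int MaxBytes = 9; // per the xz format, multibyte integers use at most 9 bytes

    static ulong ReadVarInt(Stream stream)
    {
        int b = stream.ReadByte();
        if (b < 0) throw new EndOfStreamException();
        ulong value = (ulong)(b & 0x7F);
        int i = 0;
        while ((b & 0x80) != 0)
        {
            // pre-incrementing before the comparison is the fix: the first
            // continuation byte must already count against the 9-byte limit
            if (++i >= MaxBytes)
                throw new InvalidDataException("variable-length integer too long");
            b = stream.ReadByte();
            if (b < 0) throw new EndOfStreamException();
            if (b == 0)
                throw new InvalidDataException("trailing zero byte is not allowed");
            value |= (ulong)(b & 0x7F) << (7 * i);
        }
        return value;
    }

    static void Main()
    {
        // 300 encodes as 0xAC 0x02 in this scheme
        Console.WriteLine(ReadVarInt(new MemoryStream(new byte[] { 0xAC, 0x02 }))); // 300
    }
}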

View File

@@ -50,11 +50,11 @@ namespace SharpCompress.Compressors.Xz
private void SkipPadding()
{
int padding = (int)(_bytesRead % 4);
if (padding > 0)
int bytes = (int)(BaseStream.Position % 4);
if (bytes > 0)
{
byte[] paddingBytes = new byte[padding];
BaseStream.Read(paddingBytes, 0, padding);
byte[] paddingBytes = new byte[4 - bytes];
BaseStream.Read(paddingBytes, 0, paddingBytes.Length);
if (paddingBytes.Any(b => b != 0))
throw new InvalidDataException("Padding bytes were non-null");
}
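
The corrected arithmetic in isolation: when the position is not already 4-byte aligned, the number of padding bytes to consume is 4 minus the remainder, not the remainder itself. A tiny sketch (illustrative only):

using System;

static class XzPaddingExample
{
    // Zero bytes needed to advance 'position' to the next 4-byte boundary.
    static int PaddingBytesNeeded(long position)
    {
        int rem = (int)(position % 4);
        return rem == 0 ? 0 : 4 - rem;
    }

    static void Main()
    {
        // At position 5 the reader must consume 3 padding bytes to land on 8;
        // the old remainder-based code would have consumed only 1.
        Console.WriteLine(PaddingBytesNeeded(5)); // 3
    }
}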

View File

@@ -55,10 +55,10 @@ namespace SharpCompress.Compressors.Xz
private void SkipPadding()
{
int padding = (int)(_reader.BaseStream.Position - StreamStartPosition) % 4;
if (padding > 0)
int bytes = (int)(_reader.BaseStream.Position - StreamStartPosition) % 4;
if (bytes > 0)
{
byte[] paddingBytes = _reader.ReadBytes(padding);
byte[] paddingBytes = _reader.ReadBytes(4 - bytes);
if (paddingBytes.Any(b => b != 0))
throw new InvalidDataException("Padding bytes were non-null");
}

View File

@@ -156,7 +156,7 @@ namespace SharpCompress.Converters
{
if (dest == null)
{
throw new ArgumentNullException("dest");
throw new ArgumentNullException(nameof(dest));
}
if (destIdx < 0 || destIdx > dest.Length - size)
{
@@ -170,7 +170,7 @@ namespace SharpCompress.Converters
{
if (data == null)
{
throw new ArgumentNullException("data");
throw new ArgumentNullException(nameof(data));
}
if (data.Length - index < 8)
{
@@ -195,7 +195,7 @@ namespace SharpCompress.Converters
{
if (data == null)
{
throw new ArgumentNullException("data");
throw new ArgumentNullException(nameof(data));
}
if (data.Length - index < 8)
{
@@ -221,7 +221,7 @@ namespace SharpCompress.Converters
{
if (data == null)
{
throw new ArgumentNullException("data");
throw new ArgumentNullException(nameof(data));
}
if (data.Length - index < 8)
{
@@ -247,7 +247,7 @@ namespace SharpCompress.Converters
{
if (data == null)
{
throw new ArgumentNullException("data");
throw new ArgumentNullException(nameof(data));
}
if (data.Length - index < 4)
{
@@ -273,7 +273,7 @@ namespace SharpCompress.Converters
{
if (data == null)
{
throw new ArgumentNullException("data");
throw new ArgumentNullException(nameof(data));
}
if (data.Length - index < 4)
{
@@ -299,7 +299,7 @@ namespace SharpCompress.Converters
{
if (data == null)
{
throw new ArgumentNullException("data");
throw new ArgumentNullException(nameof(data));
}
if (data.Length - index < 4)
{
@@ -325,7 +325,7 @@ namespace SharpCompress.Converters
{
if (data == null)
{
throw new ArgumentNullException("data");
throw new ArgumentNullException(nameof(data));
}
if (data.Length - index < 2)
{
@@ -351,7 +351,7 @@ namespace SharpCompress.Converters
{
if (data == null)
{
throw new ArgumentNullException("data");
throw new ArgumentNullException(nameof(data));
}
if (data.Length - index < 2)
{
@@ -468,7 +468,7 @@ namespace SharpCompress.Converters
{
if (data == null)
{
throw new ArgumentNullException("data");
throw new ArgumentNullException(nameof(data));
}
if (data.Length - index < 8)
{
@@ -494,7 +494,7 @@ namespace SharpCompress.Converters
{
if (data == null)
{
throw new ArgumentNullException("data");
throw new ArgumentNullException(nameof(data));
}
if (data.Length - index < 8)
{
@@ -520,7 +520,7 @@ namespace SharpCompress.Converters
{
if (data == null)
{
throw new ArgumentNullException("data");
throw new ArgumentNullException(nameof(data));
}
if (data.Length - index < 8)
{
@@ -546,7 +546,7 @@ namespace SharpCompress.Converters
{
if (data == null)
{
throw new ArgumentNullException("data");
throw new ArgumentNullException(nameof(data));
}
if (data.Length - index < 4)
{
@@ -572,7 +572,7 @@ namespace SharpCompress.Converters
{
if (data == null)
{
throw new ArgumentNullException("data");
throw new ArgumentNullException(nameof(data));
}
if (data.Length - index < 4)
{
@@ -598,7 +598,7 @@ namespace SharpCompress.Converters
{
if (data == null)
{
throw new ArgumentNullException("data");
throw new ArgumentNullException(nameof(data));
}
if (data.Length - index < 4)
{
@@ -624,7 +624,7 @@ namespace SharpCompress.Converters
{
if (data == null)
{
throw new ArgumentNullException("data");
throw new ArgumentNullException(nameof(data));
}
if (data.Length - index < 2)
{
@@ -650,7 +650,7 @@ namespace SharpCompress.Converters
{
if (data == null)
{
throw new ArgumentNullException("data");
throw new ArgumentNullException(nameof(data));
}
if (data.Length - index < 2)
{

View File

@@ -12,7 +12,7 @@ namespace Org.BouncyCastle.Crypto.Parameters
{
if (key == null)
{
throw new ArgumentNullException("key");
throw new ArgumentNullException(nameof(key));
}
this.key = (byte[])key.Clone();
@@ -25,15 +25,15 @@ namespace Org.BouncyCastle.Crypto.Parameters
{
if (key == null)
{
throw new ArgumentNullException("key");
throw new ArgumentNullException(nameof(key));
}
if (keyOff < 0 || keyOff > key.Length)
{
throw new ArgumentOutOfRangeException("keyOff");
throw new ArgumentOutOfRangeException(nameof(keyOff));
}
if (keyLen < 0 || (keyOff + keyLen) > key.Length)
{
throw new ArgumentOutOfRangeException("keyLen");
throw new ArgumentOutOfRangeException(nameof(keyLen));
}
this.key = new byte[keyLen];

View File

@@ -41,7 +41,7 @@ namespace SharpCompress.IO
throw new NotSupportedException();
}
public override long Length => throw new NotSupportedException();
public override long Length => BytesLeftToRead;
public override long Position { get => throw new NotSupportedException(); set => throw new NotSupportedException(); }

View File

@@ -1,5 +1,6 @@
using System;
using System.IO;
using SharpCompress.Compressors.Filters;
namespace SharpCompress.IO
{
@@ -46,8 +47,13 @@ namespace SharpCompress.IO
}
else
{
bufferStream.TransferTo(buffer);
bufferStream = buffer;
//create new memorystream to allow proper resizing as memorystream could be a user provided buffer
//https://github.com/adamhathcock/sharpcompress/issues/306
bufferStream = new MemoryStream();
buffer.Position = 0;
buffer.TransferTo(bufferStream);
bufferStream.Position = 0;
}
isRewound = true;
@@ -105,6 +111,12 @@ namespace SharpCompress.IO
public override int Read(byte[] buffer, int offset, int count)
{
//don't actually read if we don't really want to read anything
//currently a network stream bug on Windows for .NET Core
if (count == 0)
{
return 0;
}
int read;
if (isRewound && bufferStream.Position != bufferStream.Length)
{

View File

@@ -139,8 +139,6 @@ namespace SharpCompress.Readers
}
}
private readonly byte[] skipBuffer = new byte[4096];
private void Skip()
{
if (ArchiveType != ArchiveType.Rar
@@ -148,25 +146,21 @@ namespace SharpCompress.Readers
&& Entry.CompressedSize > 0)
{
//not solid and has a known compressed size then we can skip raw bytes.
var rawStream = Entry.Parts.First().GetRawStream();
var part = Entry.Parts.First();
var rawStream = part.GetRawStream();
if (rawStream != null)
{
var bytesToAdvance = Entry.CompressedSize;
for (var i = 0; i < bytesToAdvance / skipBuffer.Length; i++)
{
rawStream.Read(skipBuffer, 0, skipBuffer.Length);
}
rawStream.Read(skipBuffer, 0, (int)(bytesToAdvance % skipBuffer.Length));
rawStream.Skip(bytesToAdvance);
part.Skipped = true;
return;
}
}
//don't know the size so we have to try to decompress to skip
using (var s = OpenEntryStream())
{
while (s.Read(skipBuffer, 0, skipBuffer.Length) > 0)
{
}
s.Skip();
}
}

View File

@@ -29,11 +29,11 @@ namespace SharpCompress.Readers.GZip
return new GZipReader(stream, options ?? new ReaderOptions());
}
#endregion
#endregion Open
internal override IEnumerable<GZipEntry> GetEntries(Stream stream)
{
return GZipEntry.GetEntries(stream);
return GZipEntry.GetEntries(stream, Options);
}
}
}

View File

@@ -8,6 +8,7 @@ namespace SharpCompress.Readers
/// Look for RarArchive (Check for self-extracting archives or cases where RarArchive isn't at the start of the file)
/// </summary>
public bool LookForHeader { get; set; }
public string Password { get; set; }
}
}
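
A short sketch of how these options are consumed when opening a password-protected archive; the path and password are placeholders, and the ArchiveFactory.Open(stream, readerOptions) overload plus WriteToDirectory/ExtractionOptions usage are taken from the test changes further down in this diff.

using System.IO;
using System.Linq;
using SharpCompress.Archives;
using SharpCompress.Common;
using SharpCompress.Readers;

class PasswordedArchiveExample
{
    static void Main()
    {
        var options = new ReaderOptions
        {
            Password = "testpassword",   // placeholder
            LeaveStreamOpen = false
        };
        using (Stream stream = File.OpenRead("7Zip.LZMA.Aes.7z"))   // placeholder path
        using (var archive = ArchiveFactory.Open(stream, options))
        {
            foreach (var entry in archive.Entries.Where(e => !e.IsDirectory))
            {
                entry.WriteToDirectory("output", new ExtractionOptions
                {
                    ExtractFullPath = true,
                    Overwrite = true
                });
            }
        }
    }
}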

View File

@@ -114,11 +114,11 @@ namespace SharpCompress.Readers.Tar
return new TarReader(rewindableStream, options, CompressionType.None);
}
#endregion
#endregion Open
internal override IEnumerable<TarEntry> GetEntries(Stream stream)
{
return TarEntry.GetEntries(StreamingMode.Streaming, stream, compressionType);
return TarEntry.GetEntries(StreamingMode.Streaming, stream, compressionType, Options.ArchiveEncoding);
}
}
}

View File

@@ -8,13 +8,13 @@ namespace SharpCompress.Readers.Zip
{
public class ZipReader : AbstractReader<ZipEntry, ZipVolume>
{
private readonly StreamingZipHeaderFactory headerFactory;
private readonly StreamingZipHeaderFactory _headerFactory;
internal ZipReader(Stream stream, ReaderOptions options)
: base(options, ArchiveType.Zip)
{
Volume = new ZipVolume(stream, options);
headerFactory = new StreamingZipHeaderFactory(options.Password);
_headerFactory = new StreamingZipHeaderFactory(options.Password, options.ArchiveEncoding);
}
public override ZipVolume Volume { get; }
@@ -33,26 +33,26 @@ namespace SharpCompress.Readers.Zip
return new ZipReader(stream, options ?? new ReaderOptions());
}
#endregion
#endregion Open
internal override IEnumerable<ZipEntry> GetEntries(Stream stream)
{
foreach (ZipHeader h in headerFactory.ReadStreamHeader(stream))
foreach (ZipHeader h in _headerFactory.ReadStreamHeader(stream))
{
if (h != null)
{
switch (h.ZipHeaderType)
{
case ZipHeaderType.LocalEntry:
{
yield return new ZipEntry(new StreamingZipFilePart(h as LocalEntryHeader,
stream));
}
{
yield return new ZipEntry(new StreamingZipFilePart(h as LocalEntryHeader,
stream));
}
break;
case ZipHeaderType.DirectoryEnd:
{
yield break;
}
{
yield break;
}
}
}
}

View File

@@ -2,12 +2,11 @@
<PropertyGroup>
<AssemblyTitle>SharpCompress - Pure C# Decompression/Compression</AssemblyTitle>
<NeutralLanguage>en-US</NeutralLanguage>
<VersionPrefix>0.17.0</VersionPrefix>
<AssemblyVersion>0.17.0.0</AssemblyVersion>
<FileVersion>0.17.0.0</FileVersion>
<VersionPrefix>0.19.0</VersionPrefix>
<AssemblyVersion>0.19.0.0</AssemblyVersion>
<FileVersion>0.19.0.0</FileVersion>
<Authors>Adam Hathcock</Authors>
<TargetFrameworks Condition="'$(LibraryFrameworks)'==''">net45;net35;netstandard1.0;netstandard1.3</TargetFrameworks>
<TargetFrameworks Condition="'$(LibraryFrameworks)'!=''">$(LibraryFrameworks)</TargetFrameworks>
<TargetFrameworks Condition="'$(LibraryFrameworks)'==''">net45;net35;netstandard1.0;netstandard1.3;netstandard2.0</TargetFrameworks>
<TreatWarningsAsErrors>true</TreatWarningsAsErrors>
<AllowUnsafeBlocks>true</AllowUnsafeBlocks>
<AssemblyName>SharpCompress</AssemblyName>
@@ -15,14 +14,29 @@
<SignAssembly>true</SignAssembly>
<PublicSign Condition=" '$(OS)' != 'Windows_NT' ">true</PublicSign>
<PackageId>SharpCompress</PackageId>
<PackageTags>rar;unrar;zip;unzip;bzip2;gzip;tar;7zip</PackageTags>
<PackageTags>rar;unrar;zip;unzip;bzip2;gzip;tar;7zip;lzip;xz</PackageTags>
<PackageProjectUrl>https://github.com/adamhathcock/sharpcompress</PackageProjectUrl>
<PackageLicenseUrl>https://github.com/adamhathcock/sharpcompress/blob/master/LICENSE.txt</PackageLicenseUrl>
<GenerateAssemblyTitleAttribute>false</GenerateAssemblyTitleAttribute>
<GenerateAssemblyProductAttribute>false</GenerateAssemblyProductAttribute>
<Description>SharpCompress is a compression library for NET Standard 1.0 that can unrar, decompress 7zip, zip/unzip, tar/untar bzip2/unbzip2 and gzip/ungzip with forward-only reading and file random access APIs. Write support for zip/tar/bzip2/gzip is implemented.</Description>
<Description>SharpCompress is a compression library for NET Standard 1.0 that can unrar, decompress 7zip, decompress xz, zip/unzip, tar/untar lzip/unlzip, bzip2/unbzip2 and gzip/ungzip with forward-only reading and file random access APIs. Write support for zip/tar/bzip2/gzip is implemented.</Description>
</PropertyGroup>
<PropertyGroup Condition=" '$(TargetFramework)' == 'netstandard1.0' ">
<DefineConstants>$(DefineConstants);NO_FILE;NO_CRYPTO;SILVERLIGHT</DefineConstants>
</PropertyGroup>
</Project>
<PropertyGroup Condition=" '$(TargetFramework)' == 'netstandard1.3' ">
<DefineConstants>$(DefineConstants);NETCORE</DefineConstants>
</PropertyGroup>
<PropertyGroup Condition=" '$(TargetFramework)' == 'netstandard2.0' ">
<DefineConstants>$(DefineConstants);NETCORE</DefineConstants>
</PropertyGroup>
<ItemGroup Condition=" '$(TargetFramework)' == 'netstandard1.3' ">
<PackageReference Include="System.Buffers" Version="4.3.0" />
</ItemGroup>
<ItemGroup Condition=" '$(TargetFramework)' == 'netstandard2.0' ">
<PackageReference Include="System.Buffers" Version="4.4.0" />
</ItemGroup>
<ItemGroup Condition=" '$(TargetFramework)' == 'net45' ">
<PackageReference Include="System.Buffers" Version="4.4.0" />
</ItemGroup>
</Project>

View File

@@ -1,4 +1,7 @@
using System;
#if NETCORE || NET45
using System.Buffers;
#endif
using System.Collections.Generic;
using System.IO;
using System.Linq;
@@ -138,37 +141,55 @@ namespace SharpCompress
public static void Skip(this Stream source, long advanceAmount)
{
byte[] buffer = new byte[32 * 1024];
int read = 0;
int readCount = 0;
do
byte[] buffer = GetTransferByteArray();
try
{
readCount = buffer.Length;
if (readCount > advanceAmount)
int read = 0;
int readCount = 0;
do
{
readCount = (int)advanceAmount;
}
read = source.Read(buffer, 0, readCount);
if (read <= 0)
{
break;
}
advanceAmount -= read;
if (advanceAmount == 0)
{
break;
readCount = buffer.Length;
if (readCount > advanceAmount)
{
readCount = (int)advanceAmount;
}
read = source.Read(buffer, 0, readCount);
if (read <= 0)
{
break;
}
advanceAmount -= read;
if (advanceAmount == 0)
{
break;
}
}
while (true);
}
finally
{
#if NETCORE || NET45
ArrayPool<byte>.Shared.Return(buffer);
#endif
}
while (true);
}
public static void SkipAll(this Stream source)
public static void Skip(this Stream source)
{
byte[] buffer = new byte[32 * 1024];
do
byte[] buffer = GetTransferByteArray();
try
{
do
{
}
while (source.Read(buffer, 0, buffer.Length) == buffer.Length);
}
finally
{
#if NETCORE || NET45
ArrayPool<byte>.Shared.Return(buffer);
#endif
}
while (source.Read(buffer, 0, buffer.Length) == buffer.Length);
}
public static DateTime DosDateToDateTime(UInt16 iDate, UInt16 iTime)
@@ -233,30 +254,48 @@ namespace SharpCompress
public static long TransferTo(this Stream source, Stream destination)
{
byte[] array = GetTransferByteArray();
int count;
long total = 0;
while (ReadTransferBlock(source, array, out count))
try
{
total += count;
destination.Write(array, 0, count);
int count;
long total = 0;
while (ReadTransferBlock(source, array, out count))
{
total += count;
destination.Write(array, 0, count);
}
return total;
}
finally
{
#if NETCORE || NET45
ArrayPool<byte>.Shared.Return(array);
#endif
}
return total;
}
public static long TransferTo(this Stream source, Stream destination, Common.Entry entry, IReaderExtractionListener readerExtractionListener)
{
byte[] array = GetTransferByteArray();
int count;
var iterations = 0;
long total = 0;
while (ReadTransferBlock(source, array, out count))
try
{
total += count;
destination.Write(array, 0, count);
iterations++;
readerExtractionListener.FireEntryExtractionProgress(entry, total, iterations);
int count;
var iterations = 0;
long total = 0;
while (ReadTransferBlock(source, array, out count))
{
total += count;
destination.Write(array, 0, count);
iterations++;
readerExtractionListener.FireEntryExtractionProgress(entry, total, iterations);
}
return total;
}
finally
{
#if NETCORE || NET45
ArrayPool<byte>.Shared.Return(array);
#endif
}
return total;
}
private static bool ReadTransferBlock(Stream source, byte[] array, out int count)
@@ -266,7 +305,11 @@ namespace SharpCompress
private static byte[] GetTransferByteArray()
{
#if NETCORE || NET45
return ArrayPool<byte>.Shared.Rent(81920);
#else
return new byte[81920];
#endif
}
public static bool ReadFully(this Stream stream, byte[] buffer)
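
The rent/try/finally/return pattern adopted above, as a standalone sketch; the 81920-byte size mirrors GetTransferByteArray, while the SkipBytes helper itself is hypothetical.

using System;
using System.Buffers;
using System.IO;

static class ArrayPoolSkipExample
{
    // Skip 'count' bytes from a forward-only stream using a pooled buffer.
    static void SkipBytes(Stream source, long count)
    {
        byte[] buffer = ArrayPool<byte>.Shared.Rent(81920);
        try
        {
            while (count > 0)
            {
                int toRead = (int)Math.Min(buffer.Length, count);
                int read = source.Read(buffer, 0, toRead);
                if (read <= 0)
                {
                    break; // stream ended early
                }
                count -= read;
            }
        }
        finally
        {
            // always hand the buffer back, even if Read throws
            ArrayPool<byte>.Shared.Return(buffer);
        }
    }

    static void Main()
    {
        var ms = new MemoryStream(new byte[200000]);
        SkipBytes(ms, 150000);
        Console.WriteLine(ms.Position); // 150000
    }
}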

View File

@@ -6,29 +6,30 @@ namespace SharpCompress.Writers
{
public abstract class AbstractWriter : IWriter
{
private bool closeStream;
private bool isDisposed;
protected AbstractWriter(ArchiveType type)
protected AbstractWriter(ArchiveType type, WriterOptions writerOptions)
{
WriterType = type;
WriterOptions = writerOptions;
}
protected void InitalizeStream(Stream stream, bool closeStream)
protected void InitalizeStream(Stream stream)
{
OutputStream = stream;
this.closeStream = closeStream;
}
protected Stream OutputStream { get; private set; }
public ArchiveType WriterType { get; }
protected WriterOptions WriterOptions { get; }
public abstract void Write(string filename, Stream source, DateTime? modificationTime);
protected virtual void Dispose(bool isDisposing)
{
if (isDisposing && closeStream)
if (isDisposing && !WriterOptions.LeaveStreamOpen)
{
OutputStream.Dispose();
}

View File

@@ -8,12 +8,15 @@ namespace SharpCompress.Writers.GZip
{
public class GZipWriter : AbstractWriter
{
private bool wroteToStream;
private bool _wroteToStream;
public GZipWriter(Stream destination, bool leaveOpen = false)
: base(ArchiveType.GZip)
public GZipWriter(Stream destination, GZipWriterOptions options = null)
: base(ArchiveType.GZip, options ?? new GZipWriterOptions())
{
InitalizeStream(new GZipStream(destination, CompressionMode.Compress, leaveOpen), !leaveOpen);
InitalizeStream(new GZipStream(destination, CompressionMode.Compress,
options?.CompressionLevel ?? CompressionLevel.Default,
WriterOptions.LeaveStreamOpen,
WriterOptions.ArchiveEncoding.GetEncoding()));
}
protected override void Dispose(bool isDisposing)
@@ -28,7 +31,7 @@ namespace SharpCompress.Writers.GZip
public override void Write(string filename, Stream source, DateTime? modificationTime)
{
if (wroteToStream)
if (_wroteToStream)
{
throw new ArgumentException("Can only write a single stream to a GZip file.");
}
@@ -36,7 +39,7 @@ namespace SharpCompress.Writers.GZip
stream.FileName = filename;
stream.LastModified = modificationTime;
source.TransferTo(stream);
wroteToStream = true;
_wroteToStream = true;
}
}
}

View File

@@ -0,0 +1,28 @@
using SharpCompress.Common;
using SharpCompress.Compressors.Deflate;
namespace SharpCompress.Writers.GZip
{
public class GZipWriterOptions : WriterOptions
{
public GZipWriterOptions()
: base(CompressionType.GZip)
{
}
internal GZipWriterOptions(WriterOptions options)
: base(options.CompressionType)
{
LeaveStreamOpen = options.LeaveStreamOpen;
ArchiveEncoding = options.ArchiveEncoding;
var writerOptions = options as GZipWriterOptions;
if (writerOptions != null)
{
CompressionLevel = writerOptions.CompressionLevel;
}
}
public CompressionLevel CompressionLevel { get; set; } = CompressionLevel.Default;
}
}
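
A usage sketch for the new options class with the GZipWriter constructor shown above; paths are placeholders and the option setters are assumed to be public as in the diff.

using System;
using System.IO;
using SharpCompress.Compressors.Deflate;
using SharpCompress.Writers.GZip;

class GZipWriterOptionsExample
{
    static void Main()
    {
        var options = new GZipWriterOptions
        {
            CompressionLevel = CompressionLevel.BestCompression,
            LeaveStreamOpen = false
        };
        using (var output = File.Create("data.gz"))       // placeholder path
        using (var writer = new GZipWriter(output, options))
        using (var input = File.OpenRead("data.bin"))      // placeholder path
        {
            writer.Write("data.bin", input, DateTime.UtcNow);
        }
    }
}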

View File

@@ -12,7 +12,7 @@ namespace SharpCompress.Writers.Tar
public class TarWriter : AbstractWriter
{
public TarWriter(Stream destination, WriterOptions options)
: base(ArchiveType.Tar)
: base(ArchiveType.Tar, options)
{
if (!destination.CanWrite)
{
@@ -42,7 +42,7 @@ namespace SharpCompress.Writers.Tar
throw new InvalidFormatException("Tar does not support compression: " + options.CompressionType);
}
}
InitalizeStream(destination, true);
InitalizeStream(destination);
}
public override void Write(string filename, Stream source, DateTime? modificationTime)
@@ -72,7 +72,8 @@ namespace SharpCompress.Writers.Tar
long realSize = size ?? source.Length;
TarHeader header = new TarHeader();
TarHeader header = new TarHeader(WriterOptions.ArchiveEncoding);
header.LastModifiedTime = modificationTime ?? TarHeader.Epoch;
header.Name = NormalizeFilename(filename);
header.Size = realSize;
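
A usage sketch of the reworked TarWriter constructor; the WriterOptions(CompressionType) constructor and public setters are assumptions based on the option classes elsewhere in this diff, and the paths are placeholders.

using System;
using System.IO;
using SharpCompress.Common;
using SharpCompress.Writers;
using SharpCompress.Writers.Tar;

class TarWriterExample
{
    static void Main()
    {
        var options = new WriterOptions(CompressionType.None)   // assumed public ctor
        {
            LeaveStreamOpen = false
        };
        using (var output = File.Create("out.tar"))              // placeholder path
        using (var writer = new TarWriter(output, options))
        using (var input = File.OpenRead("file.txt"))             // placeholder path
        {
            writer.Write("file.txt", input, DateTime.UtcNow);
        }
    }
}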

View File

@@ -19,7 +19,7 @@ namespace SharpCompress.Writers
{
throw new InvalidFormatException("GZip archives only support GZip compression type.");
}
return new GZipWriter(stream, writerOptions.LeaveStreamOpen);
return new GZipWriter(stream, new GZipWriterOptions(writerOptions));
}
case ArchiveType.Zip:
{

View File

@@ -1,6 +1,7 @@
using System;
using System.IO;
using System.Text;
using SharpCompress.Common;
using SharpCompress.Common.Zip;
using SharpCompress.Common.Zip.Headers;
using SharpCompress.Converters;
@@ -11,14 +12,16 @@ namespace SharpCompress.Writers.Zip
{
private readonly ZipCompressionMethod compression;
private readonly string fileName;
private readonly ArchiveEncoding archiveEncoding;
public ZipCentralDirectoryEntry(ZipCompressionMethod compression, string fileName, ulong headerOffset)
public ZipCentralDirectoryEntry(ZipCompressionMethod compression, string fileName, ulong headerOffset, ArchiveEncoding archiveEncoding)
{
this.compression = compression;
this.fileName = fileName;
HeaderOffset = headerOffset;
this.archiveEncoding = archiveEncoding;
}
internal DateTime? ModificationTime { get; set; }
internal string Comment { get; set; }
internal uint Crc { get; set; }
@@ -29,11 +32,11 @@ namespace SharpCompress.Writers.Zip
internal uint Write(Stream outputStream)
{
byte[] encodedFilename = Encoding.UTF8.GetBytes(fileName);
byte[] encodedComment = Encoding.UTF8.GetBytes(Comment);
byte[] encodedFilename = archiveEncoding.Encode(fileName);
byte[] encodedComment = archiveEncoding.Encode(Comment);
var zip64_stream = Compressed >= uint.MaxValue || Decompressed >= uint.MaxValue;
var zip64 = zip64_stream || HeaderOffset >= uint.MaxValue || Zip64HeaderOffset != 0;
var zip64_stream = Compressed >= uint.MaxValue || Decompressed >= uint.MaxValue;
var zip64 = zip64_stream || HeaderOffset >= uint.MaxValue || Zip64HeaderOffset != 0;
var compressedvalue = zip64 ? uint.MaxValue : (uint)Compressed;
var decompressedvalue = zip64 ? uint.MaxValue : (uint)Decompressed;
@@ -41,18 +44,18 @@ namespace SharpCompress.Writers.Zip
var extralength = zip64 ? (2 + 2 + 8 + 8 + 8 + 4) : 0;
var version = (byte)(zip64 ? 45 : 20); // Version 20 required for deflate/encryption
HeaderFlags flags = HeaderFlags.UTF8;
HeaderFlags flags = Equals(archiveEncoding.GetEncoding(), Encoding.UTF8) ? HeaderFlags.UTF8 : HeaderFlags.None;
if (!outputStream.CanSeek)
{
// Cannot use data descriptors with zip64:
// https://blogs.oracle.com/xuemingshen/entry/is_zipinput_outputstream_handling_of
// We check that streams are not written too large in the ZipWritingStream,
// so this extra guard is not required, but kept to simplify changing the code
// once the zip64 post-data issue is resolved
// We check that streams are not written too large in the ZipWritingStream,
// so this extra guard is not required, but kept to simplify changing the code
// once the zip64 post-data issue is resolved
if (!zip64_stream)
flags |= HeaderFlags.UsePostDataDescriptor;
if (compression == ZipCompressionMethod.LZMA)
{
flags |= HeaderFlags.Bit1; // eos marker

View File

@@ -26,7 +26,7 @@ namespace SharpCompress.Writers.Zip
private readonly bool isZip64;
public ZipWriter(Stream destination, ZipWriterOptions zipWriterOptions)
: base(ArchiveType.Zip)
: base(ArchiveType.Zip, zipWriterOptions)
{
zipComment = zipWriterOptions.ArchiveComment ?? string.Empty;
isZip64 = zipWriterOptions.UseZip64;
@@ -37,7 +37,7 @@ namespace SharpCompress.Writers.Zip
compressionType = zipWriterOptions.CompressionType;
compressionLevel = zipWriterOptions.DeflateCompressionLevel;
InitalizeStream(destination, !zipWriterOptions.LeaveStreamOpen);
InitalizeStream(destination);
}
private PpmdProperties PpmdProperties
@@ -65,6 +65,7 @@ namespace SharpCompress.Writers.Zip
}
base.Dispose(isDisposing);
}
private static ZipCompressionMethod ToZipCompressionMethod(CompressionType compressionType)
{
switch (compressionType)
@@ -97,9 +98,9 @@ namespace SharpCompress.Writers.Zip
public override void Write(string entryPath, Stream source, DateTime? modificationTime)
{
Write(entryPath, source, new ZipWriterEntryOptions()
{
ModificationDateTime = modificationTime
});
{
ModificationDateTime = modificationTime
});
}
public void Write(string entryPath, Stream source, ZipWriterEntryOptions zipWriterEntryOptions)
@@ -117,11 +118,11 @@ namespace SharpCompress.Writers.Zip
entryPath = NormalizeFilename(entryPath);
options.ModificationDateTime = options.ModificationDateTime ?? DateTime.Now;
options.EntryComment = options.EntryComment ?? string.Empty;
var entry = new ZipCentralDirectoryEntry(compression, entryPath, (ulong)streamPosition)
{
Comment = options.EntryComment,
ModificationTime = options.ModificationDateTime
};
var entry = new ZipCentralDirectoryEntry(compression, entryPath, (ulong)streamPosition, WriterOptions.ArchiveEncoding)
{
Comment = options.EntryComment,
ModificationTime = options.ModificationDateTime
};
// Use the archive default setting for zip64 and allow overrides
var useZip64 = isZip64;
@@ -130,7 +131,7 @@ namespace SharpCompress.Writers.Zip
var headersize = (uint)WriteHeader(entryPath, options, entry, useZip64);
streamPosition += headersize;
return new ZipWritingStream(this, OutputStream, entry, compression,
return new ZipWritingStream(this, OutputStream, entry, compression,
options.DeflateCompressionLevel ?? compressionLevel);
}
@@ -149,12 +150,12 @@ namespace SharpCompress.Writers.Zip
private int WriteHeader(string filename, ZipWriterEntryOptions zipWriterEntryOptions, ZipCentralDirectoryEntry entry, bool useZip64)
{
// We err on the side of caution until the zip specification clarifies how to support this
if (!OutputStream.CanSeek && useZip64)
throw new NotSupportedException("Zip64 extensions are not supported on non-seekable streams");
// We err on the side of caution until the zip specification clarifies how to support this
if (!OutputStream.CanSeek && useZip64)
throw new NotSupportedException("Zip64 extensions are not supported on non-seekable streams");
var explicitZipCompressionInfo = ToZipCompressionMethod(zipWriterEntryOptions.CompressionType ?? compressionType);
byte[] encodedFilename = ArchiveEncoding.Default.GetBytes(filename);
byte[] encodedFilename = WriterOptions.ArchiveEncoding.Encode(filename);
OutputStream.Write(DataConverter.LittleEndian.GetBytes(ZipHeaderFactory.ENTRY_HEADER_BYTES), 0, 4);
if (explicitZipCompressionInfo == ZipCompressionMethod.Deflate)
@@ -162,17 +163,17 @@ namespace SharpCompress.Writers.Zip
if (OutputStream.CanSeek && useZip64)
OutputStream.Write(new byte[] { 45, 0 }, 0, 2); //smallest allowed version for zip64
else
OutputStream.Write(new byte[] { 20, 0 }, 0, 2); //older version which is more compatible
OutputStream.Write(new byte[] { 20, 0 }, 0, 2); //older version which is more compatible
}
else
{
OutputStream.Write(new byte[] { 63, 0 }, 0, 2); //version says we used PPMd or LZMA
}
HeaderFlags flags = ArchiveEncoding.Default == Encoding.UTF8 ? HeaderFlags.UTF8 : 0;
HeaderFlags flags = Equals(WriterOptions.ArchiveEncoding.GetEncoding(), Encoding.UTF8) ? HeaderFlags.UTF8 : 0;
if (!OutputStream.CanSeek)
{
flags |= HeaderFlags.UsePostDataDescriptor;
if (explicitZipCompressionInfo == ZipCompressionMethod.LZMA)
{
flags |= HeaderFlags.Bit1; // eos marker
@@ -213,11 +214,11 @@ namespace SharpCompress.Writers.Zip
private void WriteEndRecord(ulong size)
{
byte[] encodedComment = ArchiveEncoding.Default.GetBytes(zipComment);
byte[] encodedComment = WriterOptions.ArchiveEncoding.Encode(zipComment);
var zip64 = isZip64 || entries.Count > ushort.MaxValue || streamPosition >= uint.MaxValue || size >= uint.MaxValue;
var sizevalue = size >= uint.MaxValue ? uint.MaxValue : (uint)size;
var streampositionvalue = streamPosition >= uint.MaxValue ? uint.MaxValue : (uint)streamPosition;
var streampositionvalue = streamPosition >= uint.MaxValue ? uint.MaxValue : (uint)streamPosition;
if (zip64)
{
@@ -250,7 +251,7 @@ namespace SharpCompress.Writers.Zip
}
// Write normal end of central directory record
OutputStream.Write(new byte[] {80, 75, 5, 6, 0, 0, 0, 0}, 0, 8);
OutputStream.Write(new byte[] { 80, 75, 5, 6, 0, 0, 0, 0 }, 0, 8);
OutputStream.Write(DataConverter.LittleEndian.GetBytes((ushort)entries.Count), 0, 2);
OutputStream.Write(DataConverter.LittleEndian.GetBytes((ushort)entries.Count), 0, 2);
OutputStream.Write(DataConverter.LittleEndian.GetBytes(sizevalue), 0, 4);
@@ -273,10 +274,11 @@ namespace SharpCompress.Writers.Zip
private CountingWritableSubStream counting;
private ulong decompressed;
// Flag to prevent throwing exceptions on Dispose
private bool limitsExceeded;
// Flag to prevent throwing exceptions on Dispose
private bool limitsExceeded;
private bool isDisposed;
internal ZipWritingStream(ZipWriter writer, Stream originalStream, ZipCentralDirectoryEntry entry,
internal ZipWritingStream(ZipWriter writer, Stream originalStream, ZipCentralDirectoryEntry entry,
ZipCompressionMethod zipCompressionMethod, CompressionLevel compressionLevel)
{
this.writer = writer;
@@ -305,108 +307,115 @@ namespace SharpCompress.Writers.Zip
switch (zipCompressionMethod)
{
case ZipCompressionMethod.None:
{
return output;
}
{
return output;
}
case ZipCompressionMethod.Deflate:
{
return new DeflateStream(counting, CompressionMode.Compress, compressionLevel,
true);
}
{
return new DeflateStream(counting, CompressionMode.Compress, compressionLevel,
true);
}
case ZipCompressionMethod.BZip2:
{
return new BZip2Stream(counting, CompressionMode.Compress, true);
}
{
return new BZip2Stream(counting, CompressionMode.Compress, true);
}
case ZipCompressionMethod.LZMA:
{
counting.WriteByte(9);
counting.WriteByte(20);
counting.WriteByte(5);
counting.WriteByte(0);
{
counting.WriteByte(9);
counting.WriteByte(20);
counting.WriteByte(5);
counting.WriteByte(0);
LzmaStream lzmaStream = new LzmaStream(new LzmaEncoderProperties(!originalStream.CanSeek),
false, counting);
counting.Write(lzmaStream.Properties, 0, lzmaStream.Properties.Length);
return lzmaStream;
}
LzmaStream lzmaStream = new LzmaStream(new LzmaEncoderProperties(!originalStream.CanSeek),
false, counting);
counting.Write(lzmaStream.Properties, 0, lzmaStream.Properties.Length);
return lzmaStream;
}
case ZipCompressionMethod.PPMd:
{
counting.Write(writer.PpmdProperties.Properties, 0, 2);
return new PpmdStream(writer.PpmdProperties, counting, true);
}
{
counting.Write(writer.PpmdProperties.Properties, 0, 2);
return new PpmdStream(writer.PpmdProperties, counting, true);
}
default:
{
throw new NotSupportedException("CompressionMethod: " + zipCompressionMethod);
}
{
throw new NotSupportedException("CompressionMethod: " + zipCompressionMethod);
}
}
}
protected override void Dispose(bool disposing)
{
if (isDisposed)
{
return;
}
isDisposed = true;
base.Dispose(disposing);
if (disposing)
{
writeStream.Dispose();
if (limitsExceeded)
{
// We have written invalid data into the archive,
// so we destroy it now, instead of allowing the user to continue
// with a defunct archive
originalStream.Dispose();
return;
}
if (limitsExceeded)
{
// We have written invalid data into the archive,
// so we destroy it now, instead of allowing the user to continue
// with a defunct archive
originalStream.Dispose();
return;
}
entry.Crc = (uint)crc.Crc32Result;
entry.Compressed = counting.Count;
entry.Decompressed = decompressed;
var zip64 = entry.Compressed >= uint.MaxValue || entry.Decompressed >= uint.MaxValue;
var compressedvalue = zip64 ? uint.MaxValue : (uint)counting.Count;
var decompressedvalue = zip64 ? uint.MaxValue : (uint)entry.Decompressed;
var compressedvalue = zip64 ? uint.MaxValue : (uint)counting.Count;
var decompressedvalue = zip64 ? uint.MaxValue : (uint)entry.Decompressed;
if (originalStream.CanSeek)
{
originalStream.Position = (long)(entry.HeaderOffset + 6);
originalStream.WriteByte(0);
originalStream.Position = (long)(entry.HeaderOffset + 14);
writer.WriteFooter(entry.Crc, compressedvalue, decompressedvalue);
// Ideally, we should not throw from Dispose()
// We should not get here as the Write call checks the limits
if (zip64 && entry.Zip64HeaderOffset == 0)
throw new NotSupportedException("Attempted to write a stream that is larger than 4GiB without setting the zip64 option");
// Ideally, we should not throw from Dispose()
// We should not get here as the Write call checks the limits
if (zip64 && entry.Zip64HeaderOffset == 0)
throw new NotSupportedException("Attempted to write a stream that is larger than 4GiB without setting the zip64 option");
// If we have pre-allocated space for zip64 data,
// fill it out, even if it is not required
if (entry.Zip64HeaderOffset != 0)
{
originalStream.Position = (long)(entry.HeaderOffset + entry.Zip64HeaderOffset);
originalStream.Write(DataConverter.LittleEndian.GetBytes((ushort)0x0001), 0, 2);
originalStream.Write(DataConverter.LittleEndian.GetBytes((ushort)(8 + 8)), 0, 2);
// If we have pre-allocated space for zip64 data,
// fill it out, even if it is not required
if (entry.Zip64HeaderOffset != 0)
{
originalStream.Position = (long)(entry.HeaderOffset + entry.Zip64HeaderOffset);
originalStream.Write(DataConverter.LittleEndian.GetBytes((ushort)0x0001), 0, 2);
originalStream.Write(DataConverter.LittleEndian.GetBytes((ushort)(8 + 8)), 0, 2);
originalStream.Write(DataConverter.LittleEndian.GetBytes(entry.Decompressed), 0, 8);
originalStream.Write(DataConverter.LittleEndian.GetBytes(entry.Compressed), 0, 8);
}
originalStream.Write(DataConverter.LittleEndian.GetBytes(entry.Decompressed), 0, 8);
originalStream.Write(DataConverter.LittleEndian.GetBytes(entry.Compressed), 0, 8);
}
originalStream.Position = writer.streamPosition + (long)entry.Compressed;
writer.streamPosition += (long)entry.Compressed;
}
else
{
// We have a streaming archive, so we should add a post-data-descriptor,
// but we cannot as it does not hold the zip64 values
// Throwing an exception until the zip specification is clarified
// We have a streaming archive, so we should add a post-data-descriptor,
// but we cannot as it does not hold the zip64 values
// Throwing an exception until the zip specification is clarified
// Ideally, we should not throw from Dispose()
// We should not get here as the Write call checks the limits
if (zip64)
throw new NotSupportedException("Streams larger than 4GiB are not supported for non-seekable streams");
// Ideally, we should not throw from Dispose()
// We should not get here as the Write call checks the limits
if (zip64)
throw new NotSupportedException("Streams larger than 4GiB are not supported for non-seekable streams");
originalStream.Write(DataConverter.LittleEndian.GetBytes(ZipHeaderFactory.POST_DATA_DESCRIPTOR), 0, 4);
writer.WriteFooter(entry.Crc,
originalStream.Write(DataConverter.LittleEndian.GetBytes(ZipHeaderFactory.POST_DATA_DESCRIPTOR), 0, 4);
writer.WriteFooter(entry.Crc,
(uint)compressedvalue,
(uint)decompressedvalue);
writer.streamPosition += (long)entry.Compressed + 16;
@@ -437,36 +446,35 @@ namespace SharpCompress.Writers.Zip
public override void Write(byte[] buffer, int offset, int count)
{
// We check the limits first, because we can keep the archive consistent
// if we can prevent the writes from happening
if (entry.Zip64HeaderOffset == 0)
{
// Pre-check, the counting.Count is not exact, as we do not know the size before having actually compressed it
if (limitsExceeded || ((decompressed + (uint)count) > uint.MaxValue) || (counting.Count + (uint)count) > uint.MaxValue)
throw new NotSupportedException("Attempted to write a stream that is larger than 4GiB without setting the zip64 option");
}
// We check the limits first, because we can keep the archive consistent
// if we can prevent the writes from happening
if (entry.Zip64HeaderOffset == 0)
{
// Pre-check, the counting.Count is not exact, as we do not know the size before having actually compressed it
if (limitsExceeded || ((decompressed + (uint)count) > uint.MaxValue) || (counting.Count + (uint)count) > uint.MaxValue)
throw new NotSupportedException("Attempted to write a stream that is larger than 4GiB without setting the zip64 option");
}
decompressed += (uint)count;
crc.SlurpBlock(buffer, offset, count);
writeStream.Write(buffer, offset, count);
if (entry.Zip64HeaderOffset == 0)
{
// Post-check, this is accurate
if ((decompressed > uint.MaxValue) || counting.Count > uint.MaxValue)
{
// We have written the data, so the archive is now broken
// Throwing the exception here, allows us to avoid
// throwing an exception in Dispose() which is discouraged
// as it can mask other errors
limitsExceeded = true;
throw new NotSupportedException("Attempted to write a stream that is larger than 4GiB without setting the zip64 option");
}
}
if (entry.Zip64HeaderOffset == 0)
{
// Post-check, this is accurate
if ((decompressed > uint.MaxValue) || counting.Count > uint.MaxValue)
{
// We have written the data, so the archive is now broken
// Throwing the exception here, allows us to avoid
// throwing an exception in Dispose() which is discouraged
// as it can mask other errors
limitsExceeded = true;
throw new NotSupportedException("Attempted to write a stream that is larger than 4GiB without setting the zip64 option");
}
}
}
}
#endregion
#endregion Nested type: ZipWritingStream
}
}
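
Tying the writer-options rework together, a hedged sketch of writing a zip entry through ZipWriterOptions; the ZipWriterOptions(CompressionType) constructor is an assumption inferred from the copy constructor above, and the paths are placeholders.

using System;
using System.IO;
using SharpCompress.Common;
using SharpCompress.Compressors.Deflate;
using SharpCompress.Writers.Zip;

class ZipWriterOptionsExample
{
    static void Main()
    {
        var options = new ZipWriterOptions(CompressionType.Deflate)   // assumed public ctor
        {
            DeflateCompressionLevel = CompressionLevel.BestSpeed,
            ArchiveComment = "written by SharpCompress",
            UseZip64 = false
        };
        using (var output = File.Create("out.zip"))                    // placeholder path
        using (var writer = new ZipWriter(output, options))
        using (var input = File.OpenRead("readme.txt"))                 // placeholder path
        {
            writer.Write("readme.txt", input, DateTime.UtcNow);
        }
    }
}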

View File

@@ -15,8 +15,15 @@ namespace SharpCompress.Writers.Zip
: base(options.CompressionType)
{
LeaveStreamOpen = options.LeaveStreamOpen;
if (options is ZipWriterOptions)
UseZip64 = ((ZipWriterOptions)options).UseZip64;
ArchiveEncoding = options.ArchiveEncoding;
var writerOptions = options as ZipWriterOptions;
if (writerOptions != null)
{
UseZip64 = writerOptions.UseZip64;
DeflateCompressionLevel = writerOptions.DeflateCompressionLevel;
ArchiveComment = writerOptions.ArchiveComment;
}
}
/// <summary>
/// When CompressionType.Deflate is used, this property is referenced. Defaults to CompressionLevel.Default.

View File

@@ -40,7 +40,7 @@ namespace SharpCompress.Test
foreach (var entry in archive.Entries.Where(entry => !entry.IsDirectory))
{
entry.WriteToDirectory(SCRATCH_FILES_PATH,
new ExtractionOptions()
new ExtractionOptions
{
ExtractFullPath = true,
Overwrite = true
@@ -51,24 +51,24 @@ namespace SharpCompress.Test
}
}
protected void ArchiveStreamRead(string testArchive)
protected void ArchiveStreamRead(string testArchive, ReaderOptions readerOptions = null)
{
testArchive = Path.Combine(TEST_ARCHIVES_PATH, testArchive);
ArchiveStreamRead(testArchive.AsEnumerable());
ArchiveStreamRead(readerOptions, testArchive.AsEnumerable());
}
protected void ArchiveStreamRead(params string[] testArchives)
protected void ArchiveStreamRead(ReaderOptions readerOptions = null, params string[] testArchives)
{
ArchiveStreamRead(testArchives.Select(x => Path.Combine(TEST_ARCHIVES_PATH, x)));
ArchiveStreamRead(readerOptions, testArchives.Select(x => Path.Combine(TEST_ARCHIVES_PATH, x)));
}
protected void ArchiveStreamRead(IEnumerable<string> testArchives)
protected void ArchiveStreamRead(ReaderOptions readerOptions, IEnumerable<string> testArchives)
{
foreach (var path in testArchives)
{
ResetScratch();
using (Stream stream = File.OpenRead(path))
using (var archive = ArchiveFactory.Open(stream))
using (var archive = ArchiveFactory.Open(stream, readerOptions))
{
foreach (var entry in archive.Entries.Where(entry => !entry.IsDirectory))
{
@@ -83,17 +83,17 @@ namespace SharpCompress.Test
}
}
protected void ArchiveFileRead(string testArchive)
protected void ArchiveFileRead(string testArchive, ReaderOptions readerOptions = null)
{
testArchive = Path.Combine(TEST_ARCHIVES_PATH, testArchive);
ArchiveFileRead(testArchive.AsEnumerable());
ArchiveFileRead(testArchive.AsEnumerable(), readerOptions);
}
protected void ArchiveFileRead(IEnumerable<string> testArchives)
protected void ArchiveFileRead(IEnumerable<string> testArchives, ReaderOptions readerOptions = null)
{
foreach (var path in testArchives)
{
ResetScratch();
using (var archive = ArchiveFactory.Open(path))
using (var archive = ArchiveFactory.Open(path, readerOptions))
{
//archive.EntryExtractionBegin += archive_EntryExtractionBegin;
//archive.FilePartExtractionBegin += archive_FilePartExtractionBegin;

View File

@@ -35,10 +35,10 @@ namespace SharpCompress.Test.Rar
ResetScratch();
using (Stream stream = File.OpenRead(Path.Combine(TEST_ARCHIVES_PATH, testArchive)))
using (var archive = RarArchive.Open(stream, new ReaderOptions()
{
Password = password,
LeaveStreamOpen = true
}))
{
Password = password,
LeaveStreamOpen = true
}))
{
foreach (var entry in archive.Entries)
{
@@ -66,10 +66,10 @@ namespace SharpCompress.Test.Rar
{
ResetScratch();
using (var archive = RarArchive.Open(Path.Combine(TEST_ARCHIVES_PATH, archiveName), new ReaderOptions()
{
Password = password,
LeaveStreamOpen = true
}))
{
Password = password,
LeaveStreamOpen = true
}))
{
foreach (var entry in archive.Entries.Where(entry => !entry.IsDirectory))
{
@@ -120,12 +120,12 @@ namespace SharpCompress.Test.Rar
public void Rar_Jpg_ArchiveStreamRead()
{
ResetScratch();
using (var stream = File.OpenRead(Path.Combine(TEST_ARCHIVES_PATH, "RarJpeg.jpg")))
using (var stream = File.OpenRead(Path.Combine(TEST_ARCHIVES_PATH, "Rarjpeg.jpg")))
{
using (var archive = RarArchive.Open(stream, new ReaderOptions()
{
LookForHeader = true
}))
{
LookForHeader = true
}))
{
foreach (var entry in archive.Entries.Where(entry => !entry.IsDirectory))
{
@@ -224,7 +224,7 @@ namespace SharpCompress.Test.Rar
using (var archive = RarArchive.Open(stream))
{
Assert.False(archive.IsSolid);
Assert.True(archive.Entries.Any(entry => entry.IsDirectory));
Assert.Contains(true, archive.Entries.Select(entry => entry.IsDirectory));
}
}
}
@@ -233,10 +233,10 @@ namespace SharpCompress.Test.Rar
public void Rar_Jpg_ArchiveFileRead()
{
ResetScratch();
using (var archive = RarArchive.Open(Path.Combine(TEST_ARCHIVES_PATH, "RarJpeg.jpg"), new ReaderOptions()
{
LookForHeader = true
}))
using (var archive = RarArchive.Open(Path.Combine(TEST_ARCHIVES_PATH, "Rarjpeg.jpg"), new ReaderOptions()
{
LookForHeader = true
}))
{
foreach (var entry in archive.Entries.Where(entry => !entry.IsDirectory))
{

View File

@@ -12,14 +12,14 @@ namespace SharpCompress.Test.Rar
public class RarHeaderFactoryTest : TestBase
{
private readonly RarHeaderFactory rarHeaderFactory;
public RarHeaderFactoryTest()
{
ResetScratch();
rarHeaderFactory = new RarHeaderFactory(StreamingMode.Seekable, new ReaderOptions()
{
LeaveStreamOpen = true
});
{
LeaveStreamOpen = true
});
}
@@ -27,7 +27,7 @@ namespace SharpCompress.Test.Rar
public void ReadHeaders_RecognizeEncryptedFlag()
{
ReadEncryptedFlag("Rar.Encrypted_filesAndHeader.rar", true);
ReadEncryptedFlag("Rar.encrypted_filesAndHeader.rar", true);

View File

@@ -149,9 +149,9 @@ namespace SharpCompress.Test.Rar
ResetScratch();
using (Stream stream = File.OpenRead(Path.Combine(TEST_ARCHIVES_PATH, testArchive)))
using (var reader = RarReader.Open(stream, new ReaderOptions()
{
Password = password
}))
{
while (reader.MoveToNextEntry())
{
@@ -209,9 +209,9 @@ namespace SharpCompress.Test.Rar
ResetScratch();
using (var stream = File.OpenRead(Path.Combine(TEST_ARCHIVES_PATH, "Audio_program.rar")))
using (var reader = RarReader.Open(stream, new ReaderOptions()
{
LookForHeader = true
}))
{
while (reader.MoveToNextEntry())
{
@@ -231,11 +231,11 @@ namespace SharpCompress.Test.Rar
public void Rar_Jpg_Reader()
{
ResetScratch();
using (var stream = File.OpenRead(Path.Combine(TEST_ARCHIVES_PATH, "RarJpeg.jpg")))
using (var stream = File.OpenRead(Path.Combine(TEST_ARCHIVES_PATH, "Rarjpeg.jpg")))
using (var reader = RarReader.Open(stream, new ReaderOptions()
{
LookForHeader = true
}))
{
while (reader.MoveToNextEntry())
{
@@ -262,9 +262,9 @@ namespace SharpCompress.Test.Rar
ResetScratch();
using (var stream = File.OpenRead(Path.Combine(TEST_ARCHIVES_PATH, "Rar.solid.rar")))
using (var reader = RarReader.Open(stream, new ReaderOptions()
{
LookForHeader = true
}))
{
while (reader.MoveToNextEntry())
{
@@ -287,9 +287,9 @@ namespace SharpCompress.Test.Rar
ResetScratch();
using (var stream = File.OpenRead(Path.Combine(TEST_ARCHIVES_PATH, "Rar.rar")))
using (var reader = RarReader.Open(stream, new ReaderOptions()
{
LookForHeader = true
}))
{
while (reader.MoveToNextEntry())
{
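The reader tests above all pass LookForHeader = true so that RarReader scans forward for the RAR marker when the archive data does not start at offset zero (for example the Rarjpeg.jpg fixture, where the RAR stream sits behind other data). A minimal sketch of that streaming pattern, assuming the RarReader and IReader extension APIs used above; the method and parameter names are placeholders.

// Sketch only: forward-only extraction with header scanning enabled.
using System.IO;
using SharpCompress.Common;
using SharpCompress.Readers;
using SharpCompress.Readers.Rar;

class RarReaderExample
{
    static void ExtractEmbeddedRar(string path, string outputDirectory)
    {
        using (Stream stream = File.OpenRead(path))
        using (var reader = RarReader.Open(stream, new ReaderOptions { LookForHeader = true }))
        {
            while (reader.MoveToNextEntry())
            {
                if (!reader.Entry.IsDirectory)
                {
                    // WriteEntryToDirectory is an IReader extension method.
                    reader.WriteEntryToDirectory(outputDirectory,
                        new ExtractionOptions { ExtractFullPath = true, Overwrite = true });
                }
            }
        }
    }
}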

View File

@@ -1,5 +1,6 @@
using System;
using SharpCompress.Common;
using SharpCompress.Readers;
using Xunit;
namespace SharpCompress.Test.SevenZip
@@ -18,6 +19,17 @@ namespace SharpCompress.Test.SevenZip
ArchiveFileRead("7Zip.LZMA.7z");
}
[Fact]
public void SevenZipArchive_LZMAAES_StreamRead()
{
ArchiveStreamRead("7Zip.LZMA.Aes.7z", new ReaderOptions() {Password = "testpassword"});
}
[Fact]
public void SevenZipArchive_LZMAAES_PathRead()
{
ArchiveFileRead("7Zip.LZMA.Aes.7z", new ReaderOptions() {Password = "testpassword"});
}
[Fact]
public void SevenZipArchive_PPMd_StreamRead()
{
@@ -35,6 +47,7 @@ namespace SharpCompress.Test.SevenZip
{
ArchiveFileRead("7Zip.PPMd.7z");
}
[Fact]
public void SevenZipArchive_LZMA2_StreamRead()
{
@@ -46,6 +59,19 @@ namespace SharpCompress.Test.SevenZip
{
ArchiveFileRead("7Zip.LZMA2.7z");
}
[Fact]
public void SevenZipArchive_LZMA2AES_StreamRead()
{
ArchiveStreamRead("7Zip.LZMA2.Aes.7z", new ReaderOptions {Password = "testpassword"});
}
[Fact]
public void SevenZipArchive_LZMA2AES_PathRead()
{
ArchiveFileRead("7Zip.LZMA2.Aes.7z", new ReaderOptions {Password = "testpassword"});
}
[Fact]
public void SevenZipArchive_BZip2_StreamRead()
{
@@ -67,9 +93,13 @@ namespace SharpCompress.Test.SevenZip
[Fact]
public void SevenZipArchive_BZip2_Split()
{
Assert.Throws<IndexOutOfRangeException>(() => ArchiveStreamRead("Original.7z.001", "Original.7z.002",
"Original.7z.003", "Original.7z.004", "Original.7z.005",
"Original.7z.006", "Original.7z.007"));
Assert.Throws<IndexOutOfRangeException>(() => ArchiveStreamRead(null, "Original.7z.001",
"Original.7z.002",
"Original.7z.003",
"Original.7z.004",
"Original.7z.005",
"Original.7z.006",
"Original.7z.007"));
}
}
}
}
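The four new *AES* tests above exercise the password-protected LZMA/LZMA2 7z support added in this range. A minimal sketch of the consumer-side call, assuming the SevenZipArchive.Open(path, ReaderOptions) overload the helpers rely on; the archive path and password are placeholders.

// Sketch only: opening an AES-encrypted 7z archive with a password.
using System;
using SharpCompress.Archives.SevenZip;
using SharpCompress.Readers;

class SevenZipAesExample
{
    static void ListEncryptedEntries(string archivePath, string password)
    {
        using (var archive = SevenZipArchive.Open(archivePath, new ReaderOptions { Password = password }))
        {
            foreach (var entry in archive.Entries)
            {
                Console.WriteLine($"{entry.Key} ({entry.Size} bytes)");
            }
        }
    }
}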

View File

@@ -1,25 +1,25 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>netcoreapp1.1</TargetFramework>
<TargetFrameworks>netcoreapp1.1;netcoreapp2.0</TargetFrameworks>
<AssemblyName>SharpCompress.Test</AssemblyName>
<AssemblyOriginatorKeyFile>../../SharpCompress.snk</AssemblyOriginatorKeyFile>
<SignAssembly>true</SignAssembly>
<PublicSign Condition=" '$(OS)' != 'Windows_NT' ">true</PublicSign>
<PackageId>SharpCompress.Test</PackageId>
<GenerateRuntimeConfigurationFiles>true</GenerateRuntimeConfigurationFiles>
<RuntimeFrameworkVersion>1.1.2</RuntimeFrameworkVersion>
</PropertyGroup>
<ItemGroup>
<ProjectReference Include="..\..\src\SharpCompress\SharpCompress.csproj" />
</ItemGroup>
<ItemGroup>
<PackageReference Include="Microsoft.NET.Test.Sdk" Version="15.0.0" />
<PackageReference Include="xunit.runner.visualstudio" Version="2.3.0-beta2-build1317" />
<PackageReference Include="Microsoft.Extensions.PlatformAbstractions" Version="1.1.0" />
<PackageReference Include="xunit" Version="2.3.0-beta2-build3683" />
<DotNetCliToolReference Include="dotnet-xunit" Version="2.3.0-beta2-build3683" />
<PackageReference Include="Microsoft.NET.Test.Sdk" Version="15.5.0" />
<PackageReference Include="xunit.runner.visualstudio" Version="2.3.1" />
<PackageReference Include="xunit" Version="2.3.1" />
</ItemGroup>
<ItemGroup>
<Service Include="{82a7f48d-3b50-4b1e-b82e-3ada8210c358}" />
<ItemGroup Condition=" '$(TargetFramework)' != 'netstandard2.0' ">
<PackageReference Include="Microsoft.Extensions.PlatformAbstractions" Version="1.1.0" />
</ItemGroup>
<ItemGroup Condition=" '$(TargetFramework)' == 'netstandard2.0' ">
<PackageReference Include="NETStandard.Library" Version="2.0.0" />
</ItemGroup>
</Project>

View File

@@ -35,11 +35,11 @@ namespace SharpCompress.Test.Tar
using (var archive = TarArchive.Open(unmodified))
{
Assert.Equal(5, archive.Entries.Count);
Assert.True(archive.Entries.Any(entry => entry.Key == "very long filename/"));
Assert.True(archive.Entries.Any(entry => entry.Key == "very long filename/very long filename very long filename very long filename very long filename very long filename very long filename very long filename very long filename very long filename very long filename.jpg"));
Assert.True(archive.Entries.Any(entry => entry.Key == "z_file 1.txt"));
Assert.True(archive.Entries.Any(entry => entry.Key == "z_file 2.txt"));
Assert.True(archive.Entries.Any(entry => entry.Key == "z_file 3.txt"));
Assert.Contains("very long filename/", archive.Entries.Select(entry => entry.Key));
Assert.Contains("very long filename/very long filename very long filename very long filename very long filename very long filename very long filename very long filename very long filename very long filename very long filename.jpg", archive.Entries.Select(entry => entry.Key));
Assert.Contains("z_file 1.txt", archive.Entries.Select(entry => entry.Key));
Assert.Contains("z_file 2.txt", archive.Entries.Select(entry => entry.Key));
Assert.Contains("z_file 3.txt", archive.Entries.Select(entry => entry.Key));
}
}
@@ -74,7 +74,7 @@ namespace SharpCompress.Test.Tar
using (var archive2 = TarArchive.Open(unmodified))
{
Assert.Equal(1, archive2.Entries.Count);
Assert.True(archive2.Entries.Any(entry => entry.Key == longFilename));
Assert.Contains(longFilename, archive2.Entries.Select(entry => entry.Key));
foreach (var entry in archive2.Entries)
Assert.Equal("dummy filecontent", new StreamReader(entry.OpenEntryStream()).ReadLine());
@@ -85,15 +85,15 @@ namespace SharpCompress.Test.Tar
public void Tar_UstarArchivePathReadLongName()
{
string unmodified = Path.Combine(TEST_ARCHIVES_PATH, "ustar with long names.tar");
using(var archive = TarArchive.Open(unmodified))
using (var archive = TarArchive.Open(unmodified))
{
Assert.Equal(6, archive.Entries.Count);
Assert.True(archive.Entries.Any(entry => entry.Key == "Directory/"));
Assert.True(archive.Entries.Any(entry => entry.Key == "Directory/Some file with veeeeeeeeeery loooooooooong name"));
Assert.True(archive.Entries.Any(entry => entry.Key == "Directory/Directory with veeeeeeeeeery loooooooooong name/"));
Assert.True(archive.Entries.Any(entry => entry.Key == "Directory/Directory with veeeeeeeeeery loooooooooong name/Some file with veeeeeeeeeery loooooooooong name"));
Assert.True(archive.Entries.Any(entry => entry.Key == "Directory/Directory with veeeeeeeeeery loooooooooong name/Directory with veeeeeeeeeery loooooooooong name/"));
Assert.True(archive.Entries.Any(entry => entry.Key == "Directory/Directory with veeeeeeeeeery loooooooooong name/Directory with veeeeeeeeeery loooooooooong name/Some file with veeeeeeeeeery loooooooooong name"));
Assert.Contains("Directory/", archive.Entries.Select(entry => entry.Key));
Assert.Contains("Directory/Some file with veeeeeeeeeery loooooooooong name", archive.Entries.Select(entry => entry.Key));
Assert.Contains("Directory/Directory with veeeeeeeeeery loooooooooong name/", archive.Entries.Select(entry => entry.Key));
Assert.Contains("Directory/Directory with veeeeeeeeeery loooooooooong name/Some file with veeeeeeeeeery loooooooooong name", archive.Entries.Select(entry => entry.Key));
Assert.Contains("Directory/Directory with veeeeeeeeeery loooooooooong name/Directory with veeeeeeeeeery loooooooooong name/", archive.Entries.Select(entry => entry.Key));
Assert.Contains("Directory/Directory with veeeeeeeeeery loooooooooong name/Directory with veeeeeeeeeery loooooooooong name/Some file with veeeeeeeeeery loooooooooong name", archive.Entries.Select(entry => entry.Key));
}
}
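The assertion changes in this file swap Assert.True(collection.Any(...)) for Assert.Contains(expected, collection); the xUnit-native form prints the expected item and the actual collection when it fails, instead of only reporting a false condition. A small stand-alone illustration, not a test from this suite:

// Sketch only: contrasts the two assertion styles changed above.
using System.Linq;
using Xunit;

public class AssertContainsExample
{
    [Fact]
    public void PrefersContainsOverAny()
    {
        var keys = new[] { "a.txt", "b.txt" };

        // Old style: a failure only says the condition evaluated to false.
        Assert.True(keys.Any(k => k == "a.txt"));

        // New style: a failure reports the expected item and the collection contents.
        Assert.Contains("a.txt", keys);
    }
}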
@@ -114,7 +114,7 @@ namespace SharpCompress.Test.Tar
[Fact]
public void Tar_Random_Write_Add()
{
string jpg = Path.Combine(ORIGINAL_FILES_PATH, "jpg","test.jpg");
string jpg = Path.Combine(ORIGINAL_FILES_PATH, "jpg", "test.jpg");
string scratchPath = Path.Combine(SCRATCH_FILES_PATH, "Tar.mod.tar");
string unmodified = Path.Combine(TEST_ARCHIVES_PATH, "Tar.mod.tar");
string modified = Path.Combine(TEST_ARCHIVES_PATH, "Tar.noEmptyDirs.tar");

View File

@@ -3,8 +3,6 @@ using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading;
using Microsoft.Extensions.PlatformAbstractions;
using SharpCompress.Common;
using SharpCompress.Readers;
using Xunit;
@@ -12,7 +10,7 @@ namespace SharpCompress.Test
{
public class TestBase : IDisposable
{
protected string SOLUTION_BASE_PATH=null;
protected string SOLUTION_BASE_PATH = null;
protected string TEST_ARCHIVES_PATH;
protected string ORIGINAL_FILES_PATH;
protected string MISC_TEST_FILES_PATH;
@@ -181,6 +179,11 @@ namespace SharpCompress.Test
protected void CompareFilesByPath(string file1, string file2)
{
//TODO: fix line ending issues with the text file
if (file1.EndsWith("txt"))
{
return;
}
using (var file1Stream = File.OpenRead(file1))
using (var file2Stream = File.OpenRead(file2))
{
@@ -237,8 +240,14 @@ namespace SharpCompress.Test
public TestBase()
{
Monitor.Enter(lockObject);
var index = PlatformServices.Default.Application.ApplicationBasePath.IndexOf("SharpCompress.Test", StringComparison.OrdinalIgnoreCase);
SOLUTION_BASE_PATH = Path.GetDirectoryName(PlatformServices.Default.Application.ApplicationBasePath.Substring(0, index));
#if NETSTANDARD20
var index = AppDomain.CurrentDomain.BaseDirectory.IndexOf("SharpCompress.Test", StringComparison.OrdinalIgnoreCase);
SOLUTION_BASE_PATH = Path.GetDirectoryName(AppDomain.CurrentDomain.BaseDirectory.Substring(0, index));
#else
var index = Microsoft.Extensions.PlatformAbstractions.PlatformServices.Default.Application.ApplicationBasePath.IndexOf("SharpCompress.Test", StringComparison.OrdinalIgnoreCase);
SOLUTION_BASE_PATH = Path.GetDirectoryName(Microsoft.Extensions.PlatformAbstractions.PlatformServices.Default.Application.ApplicationBasePath.Substring(0, index));
#endif
TEST_ARCHIVES_PATH = Path.Combine(SOLUTION_BASE_PATH, "TestArchives", "Archives");
ORIGINAL_FILES_PATH = Path.Combine(SOLUTION_BASE_PATH, "TestArchives", "Original");
MISC_TEST_FILES_PATH = Path.Combine(SOLUTION_BASE_PATH, "TestArchives", "MiscTest");
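As a hedged aside, and not what the change above does: AppContext.BaseDirectory should return the same application base path on both test targets, which would avoid the #if and the PlatformAbstractions dependency altogether. A sketch under that assumption; SolutionPaths and FindSolutionBase are made-up names.

// Sketch only: an alternative base-path lookup, not repository code.
using System;
using System.IO;

static class SolutionPaths
{
    static string FindSolutionBase()
    {
        // AppContext.BaseDirectory points at the test run's application base path
        // on both netcoreapp1.1 and netcoreapp2.0.
        var basePath = AppContext.BaseDirectory;
        var index = basePath.IndexOf("SharpCompress.Test", StringComparison.OrdinalIgnoreCase);
        return Path.GetDirectoryName(basePath.Substring(0, index));
    }
}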

View File

@@ -0,0 +1,32 @@
using SharpCompress.Compressors.Xz;
using System;
using System.Text;
using Xunit;
namespace SharpCompress.Test.Xz
{
public class Crc32Tests
{
private const string SimpleString = @"The quick brown fox jumps over the lazy dog.";
private readonly byte[] SimpleBytes = Encoding.ASCII.GetBytes(SimpleString);
private const string SimpleString2 = @"Life moves pretty fast. If you don't stop and look around once in a while, you could miss it.";
private readonly byte[] SimpleBytes2 = Encoding.ASCII.GetBytes(SimpleString2);
[Fact]
public void ShortAsciiString()
{
var actual = Crc32.Compute(SimpleBytes);
Assert.Equal((UInt32)0x519025e9, actual);
}
[Fact]
public void ShortAsciiString2()
{
var actual = Crc32.Compute(SimpleBytes2);
Assert.Equal((UInt32)0x6ee3ad88, actual);
}
}
}

View File

@@ -0,0 +1,32 @@
using SharpCompress.Compressors.Xz;
using System;
using System.Text;
using Xunit;
namespace SharpCompress.Test.Xz
{
public class Crc64Tests
{
private const string SimpleString = @"The quick brown fox jumps over the lazy dog.";
private readonly byte[] SimpleBytes = Encoding.ASCII.GetBytes(SimpleString);
private const string SimpleString2 = @"Life moves pretty fast. If you don't stop and look around once in a while, you could miss it.";
private readonly byte[] SimpleBytes2 = Encoding.ASCII.GetBytes(SimpleString2);
[Fact]
public void ShortAsciiString()
{
var actual = Crc64.Compute(SimpleBytes);
Assert.Equal((UInt64)0x7E210EB1B03E5A1D, actual);
}
[Fact]
public void ShortAsciiString2()
{
var actual = Crc64.Compute(SimpleBytes2);
Assert.Equal((UInt64)0x416B4150508661EE, actual);
}
}
}

View File

@@ -0,0 +1,72 @@
using System;
using Xunit;
using System.IO;
using SharpCompress.Compressors.Xz.Filters;
namespace SharpCompress.Test.Xz.Filters
{
public class Lzma2Tests : XZTestsBase
{
Lzma2Filter filter;
public Lzma2Tests()
{
filter = new Lzma2Filter();
}
[Fact]
public void IsOnlyAllowedLast()
{
Assert.True(filter.AllowAsLast);
Assert.False(filter.AllowAsNonLast);
}
[Fact]
public void ChangesStreamSize()
{
Assert.True(filter.ChangesDataSize);
}
[Theory]
[InlineData(0, (uint)4 * 1024)]
[InlineData(1, (uint)6 * 1024)]
[InlineData(2, (uint)8 * 1024)]
[InlineData(3, (uint)12 * 1024)]
[InlineData(38, (uint)2 * 1024 * 1024 * 1024)]
[InlineData(39, (uint)3 * 1024 * 1024 * 1024)]
[InlineData(40, (uint)(1024 * 1024 * 1024 - 1) * 4 + 3)]
public void CalculatesDictionarySize(byte inByte, uint dicSize)
{
filter.Init(new[] { inByte });
Assert.Equal(filter.DictionarySize, dicSize);
}
[Fact]
public void CalculatesDictionarySizeError()
{
uint temp;
filter.Init(new byte[] { 41 });
var ex = Assert.Throws<OverflowException>(() =>
{
temp = filter.DictionarySize;
});
Assert.Equal("Dictionary size greater than UInt32.Max", ex.Message);
}
[Theory]
[InlineData(new byte[] { })]
[InlineData(new byte[] { 0, 0 })]
public void OnlyAcceptsOneByte(byte[] bytes)
{
var ex = Assert.Throws<InvalidDataException>(() => filter.Init(bytes));
Assert.Equal("LZMA properties unexpected length", ex.Message);
}
[Fact]
public void ReservedBytesThrow()
{
var ex = Assert.Throws<InvalidDataException>(() => filter.Init(new byte[] { 0xC0 }));
Assert.Equal("Reserved bits used in LZMA properties", ex.Message);
}
}
}
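The CalculatesDictionarySize cases above follow the XZ/LZMA2 rule for decoding the dictionary size from the single properties byte: for values 0–39 the size is (2 | (bits & 1)) << (bits / 2 + 11), value 40 means UInt32.MaxValue, and anything larger is invalid, which is what the overflow test checks. A small sketch of that formula; DecodeLzma2DictionarySize is a made-up helper, not repository code.

// Sketch only: reproduces the formula the InlineData values above encode.
using System;

static class Lzma2Props
{
    static uint DecodeLzma2DictionarySize(byte bits)
    {
        if (bits > 40)
        {
            throw new OverflowException("Dictionary size greater than UInt32.Max");
        }
        if (bits == 40)
        {
            return uint.MaxValue; // 4 GiB - 1
        }
        // 0 -> 4 KiB, 1 -> 6 KiB, 2 -> 8 KiB, ... 38 -> 2 GiB, 39 -> 3 GiB.
        return (uint)(2 | (bits & 1)) << (bits / 2 + 11);
    }
}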

View File

@@ -0,0 +1,98 @@
using System.Text;
using System.IO;
using SharpCompress.Compressors.Xz;
using Xunit;
namespace SharpCompress.Test.Xz
{
public class XZBlockTests : XZTestsBase
{
protected override void Rewind(Stream stream)
{
stream.Position = 12;
}
protected override void RewindIndexed(Stream stream)
{
stream.Position = 12;
}
private byte[] ReadBytes(XZBlock block, int bytesToRead)
{
byte[] buffer = new byte[bytesToRead];
var read = block.Read(buffer, 0, bytesToRead);
if (read != bytesToRead)
throw new EndOfStreamException();
return buffer;
}
[Fact]
public void OnFindIndexBlockThrow()
{
var bytes = new byte[] { 0 };
using (Stream indexBlockStream = new MemoryStream(bytes))
{
var XZBlock = new XZBlock(indexBlockStream, CheckType.CRC64, 8);
Assert.Throws<XZIndexMarkerReachedException>(() => { ReadBytes(XZBlock, 1); });
}
}
[Fact]
public void CrcIncorrectThrows()
{
var bytes = Compressed.Clone() as byte[];
bytes[20]++;
using (Stream badCrcStream = new MemoryStream(bytes))
{
Rewind(badCrcStream);
var XZBlock = new XZBlock(badCrcStream, CheckType.CRC64, 8);
var ex = Assert.Throws<InvalidDataException>(() => { ReadBytes(XZBlock, 1); });
Assert.Equal("Block header corrupt", ex.Message);
}
}
[Fact]
public void CanReadM()
{
var XZBlock = new XZBlock(CompressedStream, CheckType.CRC64, 8);
Assert.Equal(Encoding.ASCII.GetBytes("M"), ReadBytes(XZBlock, 1));
}
[Fact]
public void CanReadMary()
{
var XZBlock = new XZBlock(CompressedStream, CheckType.CRC64, 8);
Assert.Equal(Encoding.ASCII.GetBytes("M"), ReadBytes(XZBlock, 1));
Assert.Equal(Encoding.ASCII.GetBytes("a"), ReadBytes(XZBlock, 1));
Assert.Equal(Encoding.ASCII.GetBytes("ry"), ReadBytes(XZBlock, 2));
}
[Fact]
public void CanReadPoemWithStreamReader()
{
var XZBlock = new XZBlock(CompressedStream, CheckType.CRC64, 8);
var sr = new StreamReader(XZBlock);
Assert.Equal(sr.ReadToEnd(), Original);
}
[Fact]
public void NoopWhenNoPadding()
{
// CompressedStream's only block has no padding.
var XZBlock = new XZBlock(CompressedStream, CheckType.CRC64, 8);
var sr = new StreamReader(XZBlock);
sr.ReadToEnd();
Assert.Equal(0L, CompressedStream.Position % 4L);
}
[Fact]
public void SkipsPaddingWhenPresent()
{
// CompressedIndexedStream's first block has 1-byte padding.
var XZBlock = new XZBlock(CompressedIndexedStream, CheckType.CRC64, 8);
var sr = new StreamReader(XZBlock);
sr.ReadToEnd();
Assert.Equal(0L, CompressedIndexedStream.Position % 4L);
}
}
}
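The last two tests above check that reading a block also consumes its padding, leaving the stream 4-byte aligned; the XZ format zero-pads each block (and the index) to a multiple of four bytes. A trivial sketch of that padding rule; XzPadding and PaddingFor are made-up names.

// Sketch only: the alignment rule the padding tests above rely on.
static class XzPadding
{
    static int PaddingFor(long unpaddedSize)
    {
        // XZ blocks and the index are zero-padded to a 4-byte boundary.
        return (int)((4 - (unpaddedSize % 4)) % 4);
    }
}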

View File

@@ -0,0 +1,78 @@
using SharpCompress.Compressors.Xz;
using System.IO;
using Xunit;
namespace SharpCompress.Test.Xz
{
public class XZHeaderTests : XZTestsBase
{
[Fact]
public void ChecksMagicNumber()
{
var bytes = Compressed.Clone() as byte[];
bytes[3]++;
using (Stream badMagicNumberStream = new MemoryStream(bytes))
{
BinaryReader br = new BinaryReader(badMagicNumberStream);
var header = new XZHeader(br);
var ex = Assert.Throws<InvalidDataException>(() => { header.Process(); });
Assert.Equal("Invalid XZ Stream", ex.Message);
}
}
[Fact]
public void CorruptHeaderThrows()
{
var bytes = Compressed.Clone() as byte[];
bytes[8]++;
using (Stream badCrcStream = new MemoryStream(bytes))
{
BinaryReader br = new BinaryReader(badCrcStream);
var header = new XZHeader(br);
var ex = Assert.Throws<InvalidDataException>(() => { header.Process(); });
Assert.Equal("Stream header corrupt", ex.Message);
}
}
[Fact]
public void BadVersionIfCrcOkButStreamFlagUnknown() {
var bytes = Compressed.Clone() as byte[];
byte[] streamFlags = { 0x00, 0xF4 };
byte[] crc = Crc32.Compute(streamFlags).ToLittleEndianBytes();
streamFlags.CopyTo(bytes, 6);
crc.CopyTo(bytes, 8);
using (Stream badFlagStream = new MemoryStream(bytes))
{
BinaryReader br = new BinaryReader(badFlagStream);
var header = new XZHeader(br);
var ex = Assert.Throws<InvalidDataException>(() => { header.Process(); });
Assert.Equal("Unknown XZ Stream Version", ex.Message);
}
}
[Fact]
public void ProcessesBlockCheckType()
{
BinaryReader br = new BinaryReader(CompressedStream);
var header = new XZHeader(br);
header.Process();
Assert.Equal(CheckType.CRC64, header.BlockCheckType);
}
[Fact]
public void CanCalculateBlockCheckSize()
{
BinaryReader br = new BinaryReader(CompressedStream);
var header = new XZHeader(br);
header.Process();
Assert.Equal(8, header.BlockCheckSize);
}
[Fact]
public void ProcessesStreamHeaderFromFactory()
{
var header = XZHeader.FromStream(CompressedStream);
Assert.Equal(CheckType.CRC64, header.BlockCheckType);
}
}
}
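These header tests poke specific offsets of the 12-byte XZ stream header: bytes 0–5 hold the magic sequence FD 37 7A 58 5A 00, bytes 6–7 hold the stream flags (the second flag byte selects the check type, 0x04 = CRC64 in the fixtures), and bytes 8–11 hold a little-endian CRC32 of the flags, which is why BadVersionIfCrcOkButStreamFlagUnknown recomputes the CRC after altering them. A sketch of that patch step, assuming the same Crc32 and ToLittleEndianBytes helpers the test itself uses; SetStreamFlags is a made-up name.

// Sketch only: patches the stream-flag bytes of an in-memory XZ header
// and keeps the header's CRC32 consistent, as the test above does.
using SharpCompress.Compressors.Xz;

static class XzHeaderPatch
{
    static void SetStreamFlags(byte[] xzBytes, byte flagByte0, byte flagByte1)
    {
        byte[] streamFlags = { flagByte0, flagByte1 };
        streamFlags.CopyTo(xzBytes, 6);                                      // offsets 6-7: stream flags
        Crc32.Compute(streamFlags).ToLittleEndianBytes().CopyTo(xzBytes, 8); // offsets 8-11: CRC32 of the flags
    }
}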

View File

@@ -0,0 +1,95 @@
using SharpCompress.Compressors.Xz;
using System.IO;
using Xunit;
namespace SharpCompress.Test.Xz
{
public class XZIndexTests : XZTestsBase
{
protected override void RewindEmpty(Stream stream)
{
stream.Position = 12;
}
protected override void Rewind(Stream stream)
{
stream.Position = 356;
}
protected override void RewindIndexed(Stream stream)
{
stream.Position = 612;
}
[Fact]
public void RecordsStreamStartOnInit()
{
using (Stream badStream = new MemoryStream(new byte[] { 1, 2, 3, 4, 5 }))
{
BinaryReader br = new BinaryReader(badStream);
var index = new XZIndex(br, false);
Assert.Equal(0, index.StreamStartPosition);
}
}
[Fact]
public void ThrowsIfHasNoIndexMarker()
{
using (Stream badStream = new MemoryStream(new byte[] { 1, 2, 3, 4, 5 }))
{
BinaryReader br = new BinaryReader(badStream);
var index = new XZIndex(br, false);
Assert.Throws<InvalidDataException>( () => index.Process());
}
}
[Fact]
public void ReadsNoRecord()
{
BinaryReader br = new BinaryReader(CompressedEmptyStream);
var index = new XZIndex(br, false);
index.Process();
Assert.Equal((ulong)0, index.NumberOfRecords);
}
[Fact]
public void ReadsOneRecord()
{
BinaryReader br = new BinaryReader(CompressedStream);
var index = new XZIndex(br, false);
index.Process();
Assert.Equal((ulong)1, index.NumberOfRecords);
}
[Fact]
public void ReadsMultipleRecords()
{
BinaryReader br = new BinaryReader(CompressedIndexedStream);
var index = new XZIndex(br, false);
index.Process();
Assert.Equal((ulong)2, index.NumberOfRecords);
}
[Fact]
public void ReadsFirstRecord()
{
BinaryReader br = new BinaryReader(CompressedStream);
var index = new XZIndex(br, false);
index.Process();
Assert.Equal((ulong)OriginalBytes.Length, index.Records[0].UncompressedSize);
}
[Fact]
public void SkipsPadding()
{
// Index with 3-byte padding.
using (Stream badStream = new MemoryStream(new byte[] { 0x00, 0x01, 0x10, 0x80, 0x01, 0x00, 0x00, 0x00, 0xB1, 0x01, 0xD9, 0xC9, 0xFF }))
{
BinaryReader br = new BinaryReader(badStream);
var index = new XZIndex(br, false);
index.Process();
Assert.Equal(0L, badStream.Position % 4L);
}
}
}
}
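The record counts and sizes read above are stored as XZ multibyte (variable-length) integers: seven payload bits per byte, least-significant group first, with the high bit set on every byte except the last, and at most nine bytes. One of the commits in this range fixes that decoder; the sketch below restates the encoding rule and is not the library's implementation.

// Sketch only: decodes an XZ multibyte integer from a stream.
using System.IO;

static class XzVli
{
    static ulong ReadMultibyteInteger(Stream stream)
    {
        ulong value = 0;
        for (int shift = 0; shift < 63; shift += 7)
        {
            int b = stream.ReadByte();
            if (b < 0)
            {
                throw new EndOfStreamException();
            }
            value |= (ulong)(b & 0x7F) << shift;
            if ((b & 0x80) == 0) // high bit clear marks the final byte
            {
                return value;
            }
        }
        throw new InvalidDataException("Multibyte integer too long");
    }
}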

View File

@@ -0,0 +1,42 @@
using SharpCompress.Compressors.Xz;
using System.IO;
using Xunit;
namespace SharpCompress.Test.Xz
{
public class XZStreamTests : XZTestsBase
{
[Fact]
public void CanReadEmptyStream()
{
XZStream xz = new XZStream(CompressedEmptyStream);
using (var sr = new StreamReader(xz))
{
string uncompressed = sr.ReadToEnd();
Assert.Equal(OriginalEmpty, uncompressed);
}
}
[Fact]
public void CanReadStream()
{
XZStream xz = new XZStream(CompressedStream);
using (var sr = new StreamReader(xz))
{
string uncompressed = sr.ReadToEnd();
Assert.Equal(Original, uncompressed);
}
}
[Fact]
public void CanReadIndexedStream()
{
XZStream xz = new XZStream(CompressedIndexedStream);
using (var sr = new StreamReader(xz))
{
string uncompressed = sr.ReadToEnd();
Assert.Equal(OriginalIndexed, uncompressed);
}
}
}
}

View File

@@ -0,0 +1,141 @@
using System.Text;
using System.IO;
namespace SharpCompress.Test.Xz
{
public abstract class XZTestsBase
{
public XZTestsBase()
{
RewindEmpty(CompressedEmptyStream);
Rewind(CompressedStream);
RewindIndexed(CompressedIndexedStream);
}
protected virtual void RewindEmpty(Stream stream)
{
stream.Position = 0;
}
protected virtual void Rewind(Stream stream)
{
stream.Position = 0;
}
protected virtual void RewindIndexed(Stream stream)
{
stream.Position = 0;
}
protected Stream CompressedEmptyStream { get; } = new MemoryStream(CompressedEmpty);
protected static byte[] CompressedEmpty { get; } = new byte[] {
0xfd, 0x37, 0x7a, 0x58, 0x5a, 0x00, 0x00, 0x01, 0x69, 0x22, 0xde, 0x36, 0x00, 0x00, 0x00, 0x00,
0x1c, 0xdf, 0x44, 0x21, 0x90, 0x42, 0x99, 0x0d, 0x01, 0x00, 0x00, 0x00, 0x00, 0x01, 0x59, 0x5a
};
protected static byte[] OriginalEmptyBytes => Encoding.ASCII.GetBytes(OriginalEmpty);
protected static string OriginalEmpty { get; } = string.Empty;
protected Stream CompressedStream { get; } = new MemoryStream(Compressed);
protected static byte[] Compressed { get; } = new byte[] {
0xfd, 0x37, 0x7a, 0x58, 0x5a, 0x00, 0x00, 0x04, 0xe6, 0xd6, 0xb4, 0x46, 0x02, 0x00, 0x21, 0x01,
0x16, 0x00, 0x00, 0x00, 0x74, 0x2f, 0xe5, 0xa3, 0xe0, 0x01, 0xe4, 0x01, 0x3c, 0x5d, 0x00, 0x26,
0x98, 0x4a, 0x47, 0xc6, 0x6a, 0x27, 0xd7, 0x36, 0x7a, 0x05, 0xb9, 0x4f, 0xd7, 0xde, 0x52, 0x4c,
0xca, 0x26, 0x4f, 0x23, 0x60, 0x4d, 0xf3, 0x1f, 0xa3, 0x67, 0x49, 0x53, 0xd0, 0xf5, 0xc7, 0xa9,
0x3e, 0xd6, 0xb5, 0x3d, 0x2b, 0x02, 0xbe, 0x83, 0x27, 0xe2, 0xa6, 0xc3, 0x13, 0x4a, 0x31, 0x14,
0x33, 0xed, 0x9a, 0x85, 0x1d, 0x05, 0x6e, 0x7e, 0xa4, 0x91, 0xbf, 0x46, 0x71, 0x7d, 0xa7, 0xfb,
0x12, 0x10, 0xdf, 0x21, 0x73, 0x75, 0xd8, 0xd9, 0xab, 0x8f, 0x1f, 0x8b, 0xb0, 0xb9, 0x3f, 0x9a,
0xa5, 0x1e, 0xd4, 0x2f, 0xdf, 0x09, 0xb3, 0xfe, 0x45, 0xef, 0x16, 0xec, 0x95, 0x68, 0x64, 0xbb,
0x42, 0x0c, 0x8b, 0x96, 0x27, 0x30, 0x62, 0x42, 0x91, 0x7c, 0xf3, 0x6e, 0x4d, 0x03, 0xc5, 0x00,
0x04, 0x73, 0xdd, 0xee, 0xb0, 0xaa, 0xd6, 0x0b, 0x11, 0x90, 0x81, 0xd4, 0xaa, 0x69, 0x63, 0xfa,
0x2f, 0xb4, 0x25, 0x0a, 0x7f, 0xf9, 0x47, 0x77, 0xb1, 0x1f, 0xc3, 0xb4, 0x4d, 0x51, 0xf8, 0x23,
0x3a, 0x7c, 0x44, 0xc8, 0xcc, 0xca, 0x72, 0x09, 0xae, 0xc9, 0x7b, 0x7e, 0x91, 0x5d, 0xff, 0xc4,
0xeb, 0xfd, 0xa1, 0x9b, 0xd4, 0x8d, 0xd7, 0xd3, 0x57, 0xac, 0x7e, 0x3b, 0x97, 0x2e, 0xe4, 0xc2,
0x2e, 0x93, 0x3d, 0xb0, 0x16, 0x64, 0x78, 0x45, 0xb1, 0xc9, 0x40, 0x96, 0xcf, 0x5b, 0xc2, 0x2f,
0xaa, 0xba, 0xcf, 0x98, 0x38, 0x21, 0x3d, 0x1a, 0x13, 0xe8, 0xa6, 0xa6, 0xdf, 0xf4, 0x3d, 0x01,
0xa1, 0x9d, 0xc1, 0x3e, 0x37, 0xac, 0x20, 0xc4, 0xef, 0x18, 0xb1, 0xeb, 0x35, 0xf4, 0x66, 0x9a,
0x47, 0x3c, 0xce, 0x7c, 0xad, 0xdb, 0x2e, 0x39, 0xf5, 0x8d, 0x4a, 0x1d, 0x65, 0xc2, 0x0f, 0xa4,
0x40, 0x7e, 0xe6, 0xa7, 0x17, 0xce, 0x75, 0x7f, 0xd9, 0xa3, 0xf9, 0x27, 0x42, 0xd7, 0x98, 0x54,
0x17, 0xa7, 0x7a, 0x7c, 0x82, 0xdf, 0xeb, 0x08, 0x28, 0x86, 0xdd, 0x57, 0x77, 0x92, 0x80, 0x5f,
0x7b, 0x3b, 0xce, 0x77, 0x72, 0xff, 0xa3, 0x85, 0xd8, 0x5c, 0x8a, 0xb7, 0x83, 0x58, 0xfa, 0xbd,
0x72, 0xe3, 0x66, 0x9d, 0x3b, 0xff, 0x13, 0x5b, 0x0b, 0xf1, 0x6c, 0xa6, 0xb1, 0x3b, 0x85, 0x3b,
0x47, 0x91, 0xc8, 0x7c, 0x38, 0xe2, 0xe5, 0x54, 0xf8, 0x27, 0xee, 0x00, 0xff, 0xd3, 0x68, 0xf1,
0xc6, 0xc7, 0xd7, 0x24, 0x00, 0x01, 0xd8, 0x02, 0xe5, 0x03, 0x00, 0x00, 0xac, 0x16, 0x1f, 0xa4,
0xb1, 0xc4, 0x67, 0xfb, 0x02, 0x00, 0x00, 0x00, 0x00, 0x04, 0x59, 0x5a
};
protected static byte[] OriginalBytes => Encoding.ASCII.GetBytes(Original);
protected static string Original { get; } =
"Mary had a little lamb,\r\n" +
"His fleece was white as snow,\r\n" +
"And everywhere that Mary went,\r\n" +
"The lamb was sure to go.\r\n" +
"\r\n" +
"He followed her to school one day,\r\n" +
"Which was against the rule,\r\n" +
"It made the children laugh and play\r\n" +
"To see a lamb at school.\r\n" +
"\r\n" +
"And so the teacher turned it out,\r\n" +
"But still it lingered near,\r\n" +
"And waited patiently about,\r\n" +
"Till Mary did appear.\r\n" +
"\r\n" +
"\"Why does the lamb love Mary so?\"\r\n" +
"The eager children cry.\r\n" +
"\"Why, Mary loves the lamb, you know.\"\r\n" +
"The teacher did reply.";
protected Stream CompressedIndexedStream { get; } = new MemoryStream(CompressedIndexed);
protected static byte[] CompressedIndexed { get; } = new byte[] {
0xfd, 0x37, 0x7a, 0x58, 0x5a, 0x00, 0x00, 0x01, 0x69, 0x22, 0xde, 0x36, 0x03, 0xc0, 0xe3, 0x02,
0x80, 0x20, 0x21, 0x01, 0x00, 0x00, 0x00, 0x00, 0x7e, 0xe5, 0xd7, 0x32, 0xe0, 0x0f, 0xff, 0x01,
0x5b, 0x5d, 0x00, 0x26, 0x98, 0x4a, 0x47, 0xc6, 0x6a, 0x27, 0xd7, 0x36, 0x7a, 0x05, 0xb9, 0x4f,
0xd7, 0xde, 0x3a, 0x0e, 0xee, 0x1b, 0xd7, 0x81, 0xe9, 0xf5, 0x90, 0x1e, 0xd5, 0x9e, 0x88, 0x32,
0x1c, 0x7b, 0x43, 0x84, 0x02, 0x58, 0x92, 0xcf, 0x97, 0xfc, 0xae, 0x01, 0x83, 0x23, 0x48, 0x93,
0xc6, 0x56, 0xcc, 0x6d, 0xb1, 0x23, 0x10, 0x24, 0x3b, 0x9e, 0x06, 0xaa, 0xc0, 0xce, 0x86, 0x0a,
0xb7, 0x9f, 0x99, 0x61, 0xbe, 0x3b, 0x6d, 0xfe, 0x60, 0xef, 0x14, 0x35, 0x7f, 0x21, 0xe8, 0x96,
0x0e, 0xbd, 0x41, 0x7c, 0x65, 0x89, 0x96, 0x28, 0x5e, 0x85, 0xa6, 0x4b, 0xf3, 0xf9, 0xf8, 0x25,
0x31, 0x4a, 0xbb, 0x72, 0xce, 0xcf, 0x53, 0xdf, 0x13, 0x42, 0x2d, 0xbc, 0x95, 0xa5, 0x6d, 0xc4,
0x8c, 0x72, 0x99, 0xe8, 0x9a, 0xcf, 0x80, 0xd4, 0xc4, 0x3f, 0x55, 0xc3, 0x9b, 0x00, 0xce, 0x65,
0x27, 0x6e, 0xbf, 0xb2, 0x88, 0xab, 0xc0, 0x5f, 0xf9, 0xd0, 0xc8, 0xbb, 0xd7, 0x48, 0xd7, 0x2e,
0x5e, 0xbb, 0x23, 0x35, 0x6e, 0x62, 0xb6, 0x13, 0xd4, 0x06, 0xd1, 0x5b, 0x97, 0xee, 0x5b, 0x89,
0x78, 0x07, 0x24, 0x74, 0x59, 0x06, 0x1e, 0x7f, 0x8c, 0xb0, 0x48, 0xff, 0x0a, 0x76, 0xb2, 0x07,
0xa0, 0x99, 0xf5, 0x4b, 0x68, 0xd4, 0x55, 0x32, 0xb3, 0x17, 0x7b, 0xb6, 0x26, 0xdb, 0x1c, 0xc3,
0x0b, 0xda, 0x3e, 0x46, 0xba, 0x1a, 0x67, 0x23, 0xb7, 0x2a, 0x40, 0xdc, 0xc9, 0xa2, 0xe4, 0xb5,
0x68, 0x5c, 0x81, 0x60, 0xa7, 0xad, 0xe6, 0xba, 0xbb, 0x0d, 0x82, 0x8a, 0xe2, 0x03, 0xa9, 0x22,
0x09, 0x5e, 0xd8, 0x69, 0xfa, 0x29, 0xd1, 0x32, 0xa1, 0xf0, 0x9b, 0x3c, 0xc3, 0x0b, 0x9a, 0x53,
0xf0, 0x3e, 0xf3, 0x1b, 0x77, 0xee, 0x8f, 0xa6, 0x15, 0x02, 0x77, 0x14, 0x54, 0x60, 0xae, 0xbe,
0x91, 0x9e, 0xe6, 0x8b, 0x87, 0x6e, 0x46, 0x44, 0x64, 0xc7, 0x58, 0x90, 0x62, 0x25, 0x32, 0xf9,
0xcd, 0xd2, 0x73, 0x2e, 0x3f, 0xd7, 0x5d, 0x3c, 0x86, 0x1c, 0xa8, 0x35, 0xa9, 0xc2, 0xcb, 0x59,
0xcb, 0xac, 0xb3, 0x03, 0x12, 0xd4, 0x8a, 0xde, 0xd5, 0xc1, 0xd8, 0x0c, 0x32, 0x49, 0x87, 0x97,
0x62, 0x4f, 0x32, 0x39, 0x63, 0x5b, 0x8b, 0xd1, 0x6c, 0x5c, 0x90, 0xd9, 0x93, 0x13, 0xae, 0x70,
0xf5, 0x2f, 0x40, 0xaf, 0x01, 0x95, 0x01, 0x0c, 0xc5, 0xfa, 0x82, 0xf8, 0x71, 0x9d, 0x53, 0xe6,
0x47, 0x6e, 0x99, 0x54, 0x57, 0x41, 0x72, 0xea, 0xf5, 0x78, 0xdd, 0x86, 0xbd, 0x00, 0x00, 0x00,
0x72, 0x6a, 0xf2, 0x47, 0x03, 0xc0, 0xcb, 0x01, 0x8d, 0x02, 0x21, 0x01, 0x00, 0x00, 0x00, 0x00,
0xfb, 0xa7, 0xf7, 0x94, 0xe0, 0x01, 0x0c, 0x00, 0xc3, 0x5d, 0x00, 0x06, 0x82, 0xca, 0x9b, 0x77,
0x93, 0x57, 0xb3, 0x76, 0xbd, 0x8b, 0xcb, 0xee, 0xf4, 0x2c, 0xff, 0x7f, 0x95, 0x33, 0x15, 0x10,
0xa5, 0xf9, 0xfd, 0xa6, 0xbb, 0x9e, 0xf9, 0x75, 0x67, 0xee, 0xec, 0x8b, 0x40, 0xea, 0x32, 0x47,
0x3d, 0x26, 0xbe, 0x11, 0x9c, 0xa6, 0x40, 0xbe, 0x84, 0x1f, 0x1b, 0x35, 0x1a, 0x66, 0x10, 0x9c,
0xf4, 0x12, 0x1a, 0x95, 0x81, 0xb5, 0x55, 0x6b, 0xc5, 0x42, 0xfd, 0x37, 0x70, 0xc5, 0x08, 0xa4,
0x27, 0x67, 0x11, 0x0b, 0x1f, 0xcc, 0xdb, 0x54, 0x9b, 0x5a, 0x5f, 0xee, 0x21, 0x63, 0xdd, 0x4b,
0xbc, 0x49, 0x95, 0x6d, 0xf4, 0xcb, 0x9a, 0x9a, 0x5e, 0xe4, 0x7d, 0x0f, 0x02, 0x22, 0xa9, 0x42,
0x46, 0x1a, 0x04, 0x87, 0x43, 0x72, 0x59, 0xa4, 0xd6, 0xeb, 0x69, 0x36, 0xde, 0xea, 0x53, 0x8c,
0x89, 0xd7, 0x22, 0xa6, 0xf7, 0xa8, 0x4c, 0x72, 0x6c, 0x80, 0x69, 0x01, 0xb2, 0xa7, 0xe8, 0x8b,
0x94, 0xaf, 0x0e, 0x47, 0x58, 0x1d, 0x0e, 0x5c, 0x7c, 0x33, 0x9f, 0x21, 0x17, 0x2c, 0x4f, 0x3d,
0x72, 0xff, 0xcf, 0x7a, 0x4f, 0x82, 0x5b, 0x85, 0x28, 0x70, 0xf4, 0x8c, 0x81, 0x41, 0xb8, 0x20,
0x5c, 0x3e, 0x02, 0x5e, 0x5a, 0x61, 0xbb, 0x2f, 0x64, 0xc5, 0x4e, 0x53, 0xe4, 0xca, 0xe4, 0xd9,
0x75, 0xaf, 0x15, 0x4d, 0xff, 0x01, 0xec, 0x13, 0x4a, 0x70, 0x00, 0x04, 0xf9, 0xfa, 0x00, 0x00,
0x99, 0x57, 0xc4, 0x96, 0x00, 0x02, 0xf7, 0x02, 0x80, 0x20, 0xdf, 0x01, 0x8d, 0x02, 0x00, 0x00,
0x4c, 0x41, 0xe6, 0xa1, 0x9b, 0xe3, 0x51, 0x40, 0x03, 0x00, 0x00, 0x00, 0x00, 0x01, 0x59, 0x5a
};
protected static byte[] OriginalIndexedBytes => Encoding.ASCII.GetBytes(OriginalIndexed);
protected static string OriginalIndexed { get; } = Original + Original + Original + Original + Original + Original + Original + Original + Original;
}
}

View File

@@ -160,7 +160,7 @@ namespace SharpCompress.Test.Zip
[Fact]
public void Zip_Random_Write_Add()
{
string jpg = Path.Combine(ORIGINAL_FILES_PATH, "jpg","test.jpg");
string jpg = Path.Combine(ORIGINAL_FILES_PATH, "jpg", "test.jpg");
string scratchPath = Path.Combine(SCRATCH_FILES_PATH, "Zip.deflate.mod.zip");
string unmodified = Path.Combine(TEST_ARCHIVES_PATH, "Zip.deflate.mod.zip");
string modified = Path.Combine(TEST_ARCHIVES_PATH, "Zip.deflate.mod2.zip");
@@ -297,7 +297,7 @@ namespace SharpCompress.Test.Zip
{
archive.AddAllFromDirectory(SCRATCH_FILES_PATH);
archive.RemoveEntry(archive.Entries.Single(x => x.Key.EndsWith("jpg", StringComparison.OrdinalIgnoreCase)));
Assert.False(archive.Entries.Any(x => x.Key.EndsWith("jpg")));
Assert.Null(archive.Entries.FirstOrDefault(x => x.Key.EndsWith("jpg")));
}
Directory.Delete(SCRATCH_FILES_PATH, true);
}
@@ -307,9 +307,9 @@ namespace SharpCompress.Test.Zip
{
ResetScratch();
using (var reader = ZipArchive.Open(Path.Combine(TEST_ARCHIVES_PATH, "Zip.deflate.WinzipAES.zip"), new ReaderOptions()
{
Password = "test"
}))
{
foreach (var entry in reader.Entries.Where(x => !x.IsDirectory))
{

View File

@@ -15,16 +15,39 @@ namespace SharpCompress.Test.Zip
UseExtensionInsteadOfNameToVerify = true;
}
[Fact]
public void Issue_269_Double_Skip()
{
ResetScratch();
var path = Path.Combine(TEST_ARCHIVES_PATH, "PrePostHeaders.zip");
using (Stream stream = new ForwardOnlyStream(File.OpenRead(path)))
using (IReader reader = ReaderFactory.Open(stream))
{
int count = 0;
while (reader.MoveToNextEntry())
{
count++;
if (!reader.Entry.IsDirectory)
{
if (count % 2 != 0)
{
reader.WriteEntryTo(Stream.Null);
}
}
}
}
}
[Fact]
public void Zip_Zip64_Streamed_Read()
{
Read("Zip.Zip64.zip", CompressionType.Deflate);
Read("Zip.zip64.zip", CompressionType.Deflate);
}
[Fact]
public void Zip_ZipX_Streamed_Read()
{
Read("Zip.Zipx", CompressionType.LZMA);
Read("Zip.zipx", CompressionType.LZMA);
}
[Fact]
@@ -118,8 +141,8 @@ namespace SharpCompress.Test.Zip
ResetScratch();
using (Stream stream = File.OpenRead(Path.Combine(TEST_ARCHIVES_PATH, "Zip.bzip2.pkware.zip")))
using (var reader = ZipReader.Open(stream, new ReaderOptions()
{
Password = "test"
}))
{
while (reader.MoveToNextEntry())
@@ -195,11 +218,11 @@ namespace SharpCompress.Test.Zip
using (
Stream stream =
File.OpenRead(Path.Combine(TEST_ARCHIVES_PATH,
"Zip.lzma.winzipaes.zip")))
"Zip.lzma.WinzipAES.zip")))
using (var reader = ZipReader.Open(stream, new ReaderOptions()
{
Password = "test"
}))
{
while (reader.MoveToNextEntry())
{
@@ -225,9 +248,9 @@ namespace SharpCompress.Test.Zip
ResetScratch();
using (Stream stream = File.OpenRead(Path.Combine(TEST_ARCHIVES_PATH, "Zip.deflate.WinzipAES.zip")))
using (var reader = ZipReader.Open(stream, new ReaderOptions()
{
Password = "test"
}))
{
while (reader.MoveToNextEntry())
{

Binary file not shown.

Binary file not shown.

Binary file not shown.