System.ArgumentException: Specified preset is not supported #49

Open
opened 2026-01-29 16:28:07 +00:00 by claunia · 10 comments

Originally created by @buckstephenh on GitHub (Jun 12, 2019).

Originally assigned to: @qmfrederik on GitHub.

System.ArgumentException: Specified preset is not supported
at Packaging.Targets.IO.XZOutputStream.Write(Byte[] buffer, Int32 offset, Int32 count)

Seems to occur with larger projects with many nested folders, but I'm not sure yet.
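For context, liblzma's `lzma_easy_encoder` only accepts preset levels 0–9 (optionally combined with the "extreme" flag) and fails at encoder init for anything else; "Specified preset is not supported" is presumably how `XZOutputStream` surfaces that failure. The preset rule can be illustrated with Python's stdlib `lzma` module, which wraps the same library (an analogy, not the Packaging.Targets code):

```python
# Illustration via Python's stdlib lzma module, which wraps the same
# liblzma that XZOutputStream P/Invokes into.
import lzma

data = b"hello xz" * 1000

# Presets 0-9 (optionally | lzma.PRESET_EXTREME) are valid.
packed = lzma.compress(data, preset=6)
assert lzma.decompress(packed) == data

# Anything outside that range is rejected by liblzma when the encoder
# is initialised, which is the condition the error message describes.
try:
    lzma.compress(data, preset=10)
except lzma.LZMAError as exc:
    print("rejected:", exc)
```

Note that in this issue the preset passed by the library is presumably valid, which is why a race or a broken native build is suspected rather than a bad preset value.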

claunia added the bug label 2026-01-29 16:28:07 +00:00

@qmfrederik commented on GitHub (Jun 16, 2019):

@buckstephenh Thanks for reporting this. It would be really helpful if you could share a project which reproduces this issue.


@qmfrederik commented on GitHub (Dec 5, 2019):

Oddly enough, I can reproduce this when setting a breakpoint around lzma_code, so this appears to be some sort of timing issue.


@qmfrederik commented on GitHub (Dec 10, 2019):

I just got a repro of this on Windows, but haven't seen the issue on Linux. @buckstephenh Did you hit this issue on Windows, too?


@qmfrederik commented on GitHub (Dec 10, 2019):

It looks like the lzma port in vcpkg is buggy or uses a buggy version of lzma; replacing lzma.dll with a copy built directly from https://github.com/xz-mirror/xz/ master works.
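When a bad native build is suspected, one quick sanity check is to ask the loaded liblzma for its own version string via `lzma_version_string()`, which is part of liblzma's public API. A hedged sketch using Python's ctypes (the library lookup is platform-dependent, and this assumes some liblzma is installed on the default search path):

```python
import ctypes
import ctypes.util

# Locate whichever liblzma the dynamic loader would resolve
# (e.g. liblzma.so.5 on Linux, lzma.dll on Windows if it is on PATH).
path = ctypes.util.find_library("lzma")
if path is None:
    print("no liblzma found on the default search path")
else:
    lib = ctypes.CDLL(path)
    # const char *lzma_version_string(void)
    lib.lzma_version_string.restype = ctypes.c_char_p
    print("liblzma version:", lib.lzma_version_string().decode())
```

Comparing the reported version between a failing machine (vcpkg build) and a working one (xz-mirror build) would narrow down whether the native library itself is the variable.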


@buckstephenh commented on GitHub (Dec 10, 2019):

Sorry I haven't replied; I switched projects/jobs and haven't been able to return to this topic. MS also has an issue with single-file EXE builds related to duplicate references that aren't identified/caught in a normal build. Probably not related to this.


@FRvanderVeen commented on GitHub (Feb 24, 2022):

@qmfrederik I ran into this issue too. I'm building on Windows. The strange thing is that sometimes the error occurs, but other times it doesn't.
Don't know if it helps, but this is the stack trace I see:

The "RpmTask" task failed unexpectedly. [project.csproj]
System.ArgumentException: Specified preset is not supported [project.csproj]
   at Packaging.Targets.IO.XZOutputStream.Dispose(Boolean disposing) [project.csproj]
   at Packaging.Targets.Rpm.RpmPackageCreator.CreatePackage(List`1 archiveEntries, Stream payloadStream, String name, String version, String arch, String release, Boolean createUser, String userName, Boolean installService, String serviceName, String vendor, String description, String url, String prefix, String preInstallScript, String postInstallScript, String preRemoveScript, String postRemoveScript, IEnumerable`1 additionalDependencies, Action`1 additionalMetadata, IPackageSigner signer, Stream targetStream, Boolean includeVersionInName, Boolean payloadIsCompressed) [project.csproj]
   at Packaging.Targets.Rpm.RpmPackageCreator.CreatePackage(List`1 archiveEntries, Stream payloadStream, String name, String version, String arch, String release, Boolean createUser, String userName, Boolean installService, String serviceName, String vendor, String description, String url, String prefix, String preInstallScript, String postInstallScript, String preRemoveScript, String postRemoveScript, IEnumerable`1 additionalDependencies, Action`1 additionalMetadata, PgpPrivateKey privateKey, Stream targetStream) [project.csproj]
   at Packaging.Targets.RpmTask.Execute() [project.csproj]
   at Microsoft.Build.BackEnd.TaskExecutionHost.Microsoft.Build.BackEnd.ITaskExecutionHost.Execute() [project.csproj]
   at Microsoft.Build.BackEnd.TaskBuilder.ExecuteInstantiatedTask(ITaskExecutionHost taskExecutionHost, TaskLoggingContext taskLoggingContext, TaskHost taskHost, ItemBucket bucket, TaskExecutionMode howToExecuteTask) [project.csproj]

Using packaging.targets\0.1.220\build\Packaging.Targets.targets(105,5)


@andyjmorgan commented on GitHub (Jan 19, 2023):

Did you manage to find a workaround for this?


@RealOrko commented on GitHub (Feb 16, 2025):

Definitely seeing the same thing on Linux, using Ubuntu 24.04 with .NET 9.0. Will post more when I find it.


@RealOrko commented on GitHub (Feb 16, 2025):

So I tried enabling [LD_DEBUG=all](https://bnikolic.co.uk/blog/linux-ld-debug.html) and found the following:

   2809232:	calling init: /lib/x86_64-linux-gnu/liblzma.so.5
   2809232:	
   2809232:	opening file=/lib/x86_64-linux-gnu/liblzma.so.5 [0]; direct_opencount=1
   2809232:	
   2809232:	symbol=dlsym;  lookup in file=/lib/x86_64-linux-gnu/libdl.so.2 [0]
   2809232:	symbol=dlsym;  lookup in file=/lib/x86_64-linux-gnu/libc.so.6 [0]
   2809232:	binding file /lib/x86_64-linux-gnu/libdl.so.2 [0] to /lib/x86_64-linux-gnu/libc.so.6 [0]: normal symbol `dlsym'
   2809232:	symbol=lzma_stream_decoder;  lookup in file=/lib/x86_64-linux-gnu/liblzma.so.5 [0]
   2809232:	binding file /lib/x86_64-linux-gnu/liblzma.so.5 [0] to /lib/x86_64-linux-gnu/liblzma.so.5 [0]: normal symbol `lzma_stream_decoder'
   2809232:	symbol=lzma_code;  lookup in file=/lib/x86_64-linux-gnu/liblzma.so.5 [0]
   2809232:	binding file /lib/x86_64-linux-gnu/liblzma.so.5 [0] to /lib/x86_64-linux-gnu/liblzma.so.5 [0]: normal symbol `lzma_code'
   2809232:	symbol=lzma_stream_footer_decode;  lookup in file=/lib/x86_64-linux-gnu/liblzma.so.5 [0]
   2809232:	binding file /lib/x86_64-linux-gnu/liblzma.so.5 [0] to /lib/x86_64-linux-gnu/liblzma.so.5 [0]: normal symbol `lzma_stream_footer_decode'
   2809232:	symbol=lzma_index_uncompressed_size;  lookup in file=/lib/x86_64-linux-gnu/liblzma.so.5 [0]
   2809232:	binding file /lib/x86_64-linux-gnu/liblzma.so.5 [0] to /lib/x86_64-linux-gnu/liblzma.so.5 [0]: normal symbol `lzma_index_uncompressed_size'
   2809232:	symbol=lzma_index_buffer_decode;  lookup in file=/lib/x86_64-linux-gnu/liblzma.so.5 [0]
   2809232:	binding file /lib/x86_64-linux-gnu/liblzma.so.5 [0] to /lib/x86_64-linux-gnu/liblzma.so.5 [0]: normal symbol `lzma_index_buffer_decode'
   2809232:	symbol=lzma_index_end;  lookup in file=/lib/x86_64-linux-gnu/liblzma.so.5 [0]
   2809232:	binding file /lib/x86_64-linux-gnu/liblzma.so.5 [0] to /lib/x86_64-linux-gnu/liblzma.so.5 [0]: normal symbol `lzma_index_end'
   2809232:	symbol=lzma_end;  lookup in file=/lib/x86_64-linux-gnu/liblzma.so.5 [0]
   2809232:	binding file /lib/x86_64-linux-gnu/liblzma.so.5 [0] to /lib/x86_64-linux-gnu/liblzma.so.5 [0]: normal symbol `lzma_end'
   2809232:	symbol=lzma_easy_encoder;  lookup in file=/lib/x86_64-linux-gnu/liblzma.so.5 [0]
   2809232:	binding file /lib/x86_64-linux-gnu/liblzma.so.5 [0] to /lib/x86_64-linux-gnu/liblzma.so.5 [0]: normal symbol `lzma_easy_encoder'
   2809232:	symbol=lzma_stream_encoder_mt;  lookup in file=/lib/x86_64-linux-gnu/liblzma.so.5 [0]
   2809232:	binding file /lib/x86_64-linux-gnu/liblzma.so.5 [0] to /lib/x86_64-linux-gnu/liblzma.so.5 [0]: normal symbol `lzma_stream_encoder_mt'
   2809232:	symbol=lzma_stream_buffer_bound;  lookup in file=/lib/x86_64-linux-gnu/liblzma.so.5 [0]
   2809232:	binding file /lib/x86_64-linux-gnu/liblzma.so.5 [0] to /lib/x86_64-linux-gnu/liblzma.so.5 [0]: normal symbol `lzma_stream_buffer_bound'
   2809232:	symbol=lzma_easy_buffer_encode;  lookup in file=/lib/x86_64-linux-gnu/liblzma.so.5 [0]
   2809232:	binding file /lib/x86_64-linux-gnu/liblzma.so.5 [0] to /lib/x86_64-linux-gnu/liblzma.so.5 [0]: normal symbol `lzma_easy_buffer_encode'
  Created DEB package '/home/gavin/code/sql-d/src/sql-d.ui/bin/Release/net9.0/SqlD.UI.1.1.0-helm.deb' from folder 'bin/Release/net9.0/publish/'

Everything works as expected. So I think the output from LD_DEBUG is slowing things down enough to confirm the race condition mentioned [here](https://github.com/quamotion/dotnet-packaging/issues/99#issuecomment-562076434). The question is where, though?
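If the root cause really is a race around the native encoder state, one general mitigation is to ensure each thread owns its own encoder rather than sharing one `lzma_stream`, since liblzma streams are not thread-safe. A rough sketch of the principle in Python, with the stdlib `lzma` wrapper standing in for the native encoder (an illustration of the idea, not a fix for Packaging.Targets):

```python
import lzma
import threading

# Each thread gets a private LZMACompressor, so no native encoder
# state is ever shared across threads.
results = {}

def compress_roundtrip(tid: int, payload: bytes) -> None:
    comp = lzma.LZMACompressor(preset=6)  # per-thread encoder state
    packed = comp.compress(payload) + comp.flush()
    results[tid] = lzma.decompress(packed) == payload

threads = [
    threading.Thread(target=compress_roundtrip, args=(i, bytes([i % 256]) * 10_000))
    for i in range(8)
]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(all(results.values()))
```

Whether the actual race sits in XZOutputStream, in the P/Invoke layer, or in lazy loading of liblzma itself is exactly the open question here.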

There are four flavours of System.Diagnostics.Process involved in this solution. Should we start by harmonising on .NET 9? Please see the screenshot.

![Image](https://github.com/user-attachments/assets/16fa67f8-031f-4131-914e-f450b1721a51)


@dj3mu commented on GitHub (Sep 2, 2025):

Happens for Deb packages in my case.

The problem seems to appear randomly.
But what I observed is that, with three build nodes, the problem tends to persist for some time on the same node: even multiple retries on that node fail with the same error, while a run on a different node immediately works.
A few days later, surprisingly, the previously working node might start to fail while the other one starts working.
All nodes are on different physical hardware, all are Docker runners using the MS SDK image, and only one build runs per node at a time.

It seems pretty reproducible, but I'm totally puzzled about which "condition" triggers it, and have no clue why it suddenly disappears again. I can't prove this, but I suspect load, latencies (during NuGet download?), caching (of corrupt? artifacts), or some combination of these may be causing it.


Reference: starred/dotnet-packaging#49