Compare commits

...

77 Commits

Author SHA1 Message Date
dependabot[bot]
73e13f5799 Bump github/codeql-action from 4.31.4 to 4.31.7
Bumps [github/codeql-action](https://github.com/github/codeql-action) from 4.31.4 to 4.31.7.
- [Release notes](https://github.com/github/codeql-action/releases)
- [Changelog](https://github.com/github/codeql-action/blob/main/CHANGELOG.md)
- [Commits](e12f017898...cf1bb45a27)

---
updated-dependencies:
- dependency-name: github/codeql-action
  dependency-version: 4.31.7
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-12-08 08:06:30 +00:00
Copybara-Service
5fa73e23be Merge pull request #1392 from cclauss:comprehensions
PiperOrigin-RevId: 836153778
2025-11-24 03:36:29 -08:00
Eugene Kliuchnikov
5db7aca571 Merge branch 'master' into comprehensions 2025-11-24 12:17:20 +01:00
Evgenii Kliuchnikov
486ed5bc56 Fix off-by-one in dcheck.
How to prove correctness:
1) when `position` is `0`, `max_length` is allowed to be `ringbuffer_size`
2) in other words: `position + max_length <= ringbuffer_size`
3) `ringbuffer_mask == ringbuffer_size - 1`
4) thus `position + max_length <= ringbuffer_mask + 1`
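The proof above can be sanity-checked with a small sketch (an illustrative Python snippet, not Brotli's actual C code; `dcheck_bound_holds` is an invented name):

```python
# Illustration of the commit's reasoning about the ring-buffer bound.
# For a power-of-two ring buffer, mask == size - 1.
def dcheck_bound_holds(position: int, max_length: int, ringbuffer_size: int) -> bool:
    ringbuffer_mask = ringbuffer_size - 1
    # Premise (steps 1-2): position + max_length never exceeds the buffer size.
    assert position + max_length <= ringbuffer_size
    # Step 3 rewrites the bound in terms of the mask, giving mask + 1.
    return position + max_length <= ringbuffer_mask + 1

# Edge case from step 1: position == 0, max_length == ringbuffer_size.
assert dcheck_bound_holds(0, 1024, 1024)
assert dcheck_bound_holds(512, 512, 1024)
```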

PiperOrigin-RevId: 836145553
2025-11-24 03:05:39 -08:00
Eugene Kliuchnikov
688d661f40 Merge branch 'master' into comprehensions 2025-11-24 11:55:20 +01:00
Copybara-Service
5151a220d5 Merge pull request #1397 from google:dependabot/github_actions/actions/checkout-6.0.0
PiperOrigin-RevId: 836136994
2025-11-24 02:35:08 -08:00
dependabot[bot]
e1979c07fe Bump actions/checkout from 5.0.0 to 6.0.0
Bumps [actions/checkout](https://github.com/actions/checkout) from 5.0.0 to 6.0.0.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](08c6903cd8...1af3b93b68)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-version: 6.0.0
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-24 08:09:44 +00:00
Christian Clauss
7ff6d1d286 Add ruff rule PERF for performance 2025-11-20 10:38:31 +01:00
Copybara-Service
61af0e5b94 Merge pull request #1391 from cclauss:ruff
PiperOrigin-RevId: 834609353
2025-11-19 23:12:53 -08:00
Copybara-Service
a0d2679607 Merge pull request #1388 from google:dependabot/github_actions/github/codeql-action-4.31.3
PiperOrigin-RevId: 834609287
2025-11-19 23:11:59 -08:00
Christian Clauss
52ad34cea4 Lint Python code with ruff 2025-11-19 22:52:03 +01:00
dependabot[bot]
7c77ca0b18 Bump github/codeql-action from 4.31.2 to 4.31.3
Bumps [github/codeql-action](https://github.com/github/codeql-action) from 4.31.2 to 4.31.3.
- [Release notes](https://github.com/github/codeql-action/releases)
- [Changelog](https://github.com/github/codeql-action/blob/main/CHANGELOG.md)
- [Commits](0499de31b9...014f16e7ab)

---
updated-dependencies:
- dependency-name: github/codeql-action
  dependency-version: 4.31.3
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-19 10:57:16 +00:00
Evgenii Kliuchnikov
f81d6bc7f0 Modernize tests.
Avoid file IO.
Drive-by: drop bro.py and bro_test.py; we do not support them well and likely no one uses them.
PiperOrigin-RevId: 834206605
2025-11-19 02:56:01 -08:00
Evgenii Kliuchnikov
fa925d0c15 fix man installation dir
PiperOrigin-RevId: 830898574
2025-11-11 07:07:50 -08:00
Copybara-Service
8e4d912826 Merge pull request #1383 from google:dependabot/github_actions/step-security/harden-runner-2.13.2
PiperOrigin-RevId: 830866924
2025-11-11 05:21:43 -08:00
dependabot[bot]
2138ac6153 Bump step-security/harden-runner from 2.13.1 to 2.13.2
Bumps [step-security/harden-runner](https://github.com/step-security/harden-runner) from 2.13.1 to 2.13.2.
- [Release notes](https://github.com/step-security/harden-runner/releases)
- [Commits](f4a75cfd61...95d9a5deda)

---
updated-dependencies:
- dependency-name: step-security/harden-runner
  dependency-version: 2.13.2
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-10 11:46:53 +00:00
Evgenii Kliuchnikov
ea5b5c10dd Drop Py 3.9- support
PiperOrigin-RevId: 830342213
2025-11-10 02:18:38 -08:00
Evgenii Kliuchnikov
8a9ab54e2e fix JavaDoc warnings
PiperOrigin-RevId: 829420925
2025-11-07 07:00:04 -08:00
Copybara-Service
c83197f8eb Merge pull request #1374 from google:dependabot/github_actions/github/codeql-action-4.31.2
PiperOrigin-RevId: 829379228
2025-11-07 04:34:45 -08:00
Copybara-Service
48152367b3 Merge pull request #1365 from google:dependabot/github_actions/actions/setup-node-6.0.0
PiperOrigin-RevId: 829379009
2025-11-07 04:33:39 -08:00
Copybara-Service
fa141a189a Merge pull request #1364 from google:dependabot/github_actions/actions/download-artifact-6.0.0
PiperOrigin-RevId: 829378980
2025-11-07 04:32:48 -08:00
Copybara-Service
464fe15603 Merge pull request #1379 from Cycloctane:fix-docstring
PiperOrigin-RevId: 829349146
2025-11-07 02:46:17 -08:00
Rui Xi
e4bc10a000 fix method name in python Decompressor docstring 2025-11-07 15:48:14 +08:00
Eugene Kliuchnikov
cedd986cf2 Merge branch 'master' into dependabot/github_actions/actions/download-artifact-6.0.0 2025-11-06 12:48:25 +01:00
Eugene Kliuchnikov
7f0d259e54 Merge branch 'master' into dependabot/github_actions/actions/setup-node-6.0.0 2025-11-06 12:48:11 +01:00
Eugene Kliuchnikov
0e8a06c0bd Merge branch 'master' into dependabot/github_actions/github/codeql-action-4.31.2 2025-11-06 12:47:56 +01:00
Copybara-Service
595a634fd7 Merge pull request #1376 from anthrotype:normalize-py-name
PiperOrigin-RevId: 828867720
2025-11-06 02:49:36 -08:00
Cosimo Lupo
808e2b99e6 [setup.py] Use PEP625-compliant lowercase 'brotli' package name
PyPI now requires that wheel filenames and metadata use normalized
package names following PEP 625. Modern build tooling normalizes this
automatically; however, that does not happen for old Python versions
(e.g. 2.7, 3.6, 3.7), which makes their uploads fail.
I had to apply the same patch to the v1.2.0 release to be able to
upload all the Python wheels to PyPI, see:

https://github.com/google/brotli/issues/1327#issuecomment-3491479583

ca4fed169b

It makes more sense to have this here and get rid of the patch,
regardless of when we drop support for those EOL Pythons.
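For context, PEP 503 defines the normalization that PEP 625 builds on: case-fold the name and collapse runs of `-`, `_`, and `.` into a single `-` (PEP 625 additionally maps `-` to `_` in sdist filenames). A minimal sketch:

```python
import re

def normalize_name(name: str) -> str:
    # PEP 503 normalization: collapse runs of '-', '_', '.' and lowercase.
    return re.sub(r"[-_.]+", "-", name).lower()

assert normalize_name("Brotli") == "brotli"
assert normalize_name("My_Package.Name") == "my-package-name"
```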
2025-11-06 09:19:22 +00:00
dependabot[bot]
b54d27c0f1 Bump github/codeql-action from 4.30.8 to 4.31.2
Bumps [github/codeql-action](https://github.com/github/codeql-action) from 4.30.8 to 4.31.2.
- [Release notes](https://github.com/github/codeql-action/releases)
- [Changelog](https://github.com/github/codeql-action/blob/main/CHANGELOG.md)
- [Commits](https://github.com/github/codeql-action/compare/v4.30.8...0499de31b99561a6d14a36a5f662c2a54f91beee)

---
updated-dependencies:
- dependency-name: github/codeql-action
  dependency-version: 4.31.2
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-05 12:23:31 +00:00
dependabot[bot]
d00c29a783 Bump actions/setup-node from 5.0.0 to 6.0.0
Bumps [actions/setup-node](https://github.com/actions/setup-node) from 5.0.0 to 6.0.0.
- [Release notes](https://github.com/actions/setup-node/releases)
- [Commits](a0853c2454...2028fbc5c2)

---
updated-dependencies:
- dependency-name: actions/setup-node
  dependency-version: 6.0.0
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-05 12:23:23 +00:00
dependabot[bot]
781c2698ba Bump actions/download-artifact from 5.0.0 to 6.0.0
Bumps [actions/download-artifact](https://github.com/actions/download-artifact) from 5.0.0 to 6.0.0.
- [Release notes](https://github.com/actions/download-artifact/releases)
- [Commits](634f93cb29...018cc2cf5b)

---
updated-dependencies:
- dependency-name: actions/download-artifact
  dependency-version: 6.0.0
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-05 12:23:17 +00:00
Evgenii Kliuchnikov
1ed8c49aca fix iOS build
PiperOrigin-RevId: 828405754
2025-11-05 04:22:14 -08:00
Evgenii Kliuchnikov
e230f474b8 disable BROTLI_MODEL macro for some targets
PiperOrigin-RevId: 827486322
2025-11-03 07:20:52 -08:00
Evgenii Kliuchnikov
028fb5a236 release v1.2.0
PiperOrigin-RevId: 824484966
2025-10-27 06:07:48 -07:00
Evgenii Kliuchnikov
390de5b472 build and test csharp decoder
PiperOrigin-RevId: 822490991
2025-10-22 02:21:13 -07:00
Evgenii Kliuchnikov
3499acbb7a regenerate go/kt/js/ts
PiperOrigin-RevId: 822489795
2025-10-22 02:16:51 -07:00
Evgenii Kliuchnikov
8ca2312c61 fix release workflow
PiperOrigin-RevId: 822073417
2025-10-21 05:37:07 -07:00
Evgenii Kliuchnikov
ee771daf20 fix copy-paste in Java decoder
PiperOrigin-RevId: 822024938
2025-10-21 02:42:59 -07:00
Evgenii Kliuchnikov
42aee32891 try to fix release workflow
PiperOrigin-RevId: 822012047
2025-10-21 02:03:38 -07:00
Evgenii Kliuchnikov
392c06bac0 redesign release resource uploading
PiperOrigin-RevId: 821982935
2025-10-21 00:22:30 -07:00
Evgenii Kliuchnikov
1964cdb1b9 ramp up all GH actions plugins
PiperOrigin-RevId: 821598646
2025-10-20 05:07:13 -07:00
Evgenii Kliuchnikov
61605b1cb3 pick VCPKG patches
PiperOrigin-RevId: 821593009
2025-10-20 04:44:24 -07:00
Evgenii Kliuchnikov
4b0f27b6f9 pick changes from Alpine patch
PiperOrigin-RevId: 816164347
2025-10-07 05:39:06 -07:00
Evgenii Kliuchnikov
1e4425a372 pick changes from Debian patch
PiperOrigin-RevId: 816157554
2025-10-07 05:16:20 -07:00
Copybara-Service
f038020bd7 Merge pull request #1346 from google:dependabot/github_actions/softprops/action-gh-release-2.3.4
PiperOrigin-RevId: 816151932
2025-10-07 04:56:02 -07:00
Copybara-Service
4d5a32bf45 Merge pull request #1347 from google:dependabot/github_actions/ossf/scorecard-action-2.4.3
PiperOrigin-RevId: 816151799
2025-10-07 04:54:57 -07:00
Evgenii Kliuchnikov
34e43eb020 fix typos
PiperOrigin-RevId: 815676548
2025-10-06 05:16:07 -07:00
Evgenii Kliuchnikov
b3142143f6 add installation section to README
PiperOrigin-RevId: 815667870
2025-10-06 04:45:22 -07:00
Evgenii Kliuchnikov
0e7ea31e6b add alternative unaligned memory access for MIPS
PiperOrigin-RevId: 815662268
2025-10-06 04:26:03 -07:00
Evgenii Kliuchnikov
da2e091eb7 prepare for v1.2.0.rc1
PiperOrigin-RevId: 815650799
2025-10-06 03:48:49 -07:00
dependabot[bot]
30576423b8 Bump ossf/scorecard-action from 2.4.2 to 2.4.3
Bumps [ossf/scorecard-action](https://github.com/ossf/scorecard-action) from 2.4.2 to 2.4.3.
- [Release notes](https://github.com/ossf/scorecard-action/releases)
- [Changelog](https://github.com/ossf/scorecard-action/blob/main/RELEASE.md)
- [Commits](05b42c6244...4eaacf0543)

---
updated-dependencies:
- dependency-name: ossf/scorecard-action
  dependency-version: 2.4.3
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-10-06 09:30:15 +00:00
dependabot[bot]
9cf25439ad Bump softprops/action-gh-release from 2.3.3 to 2.3.4
Bumps [softprops/action-gh-release](https://github.com/softprops/action-gh-release) from 2.3.3 to 2.3.4.
- [Release notes](https://github.com/softprops/action-gh-release/releases)
- [Changelog](https://github.com/softprops/action-gh-release/blob/master/CHANGELOG.md)
- [Commits](6cbd405e2c...62c96d0c4e)

---
updated-dependencies:
- dependency-name: softprops/action-gh-release
  dependency-version: 2.3.4
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-10-06 09:29:59 +00:00
Copybara-Service
82d3c163cb Merge pull request #1328 from akazwz:go
PiperOrigin-RevId: 815626477
2025-10-06 02:29:02 -07:00
Copybara-Service
4876ada111 Merge pull request #1335 from google:dependabot/github_actions/actions/cache-4.3.0
PiperOrigin-RevId: 815624406
2025-10-06 02:21:54 -07:00
Eugene Kliuchnikov
f1c80224e8 Fix some typos / non-typos. (#1345) 2025-10-03 12:16:07 +02:00
Evgenii Kliuchnikov
ed93810e27 support multi-phase initialization
PiperOrigin-RevId: 814128632
2025-10-02 01:51:02 -07:00
Eugene Kliuchnikov
7cc02a1687 Merge branch 'master' into dependabot/github_actions/actions/cache-4.3.0 2025-10-01 21:06:49 +02:00
Evgenii Kliuchnikov
54481d4ebe use builtin bswap when available
PiperOrigin-RevId: 813849285
2025-10-01 11:56:09 -07:00
Evgenii Kliuchnikov
a896e79d4f Java: ramp-up artifact versions in pom files
PiperOrigin-RevId: 813673237
2025-10-01 03:26:40 -07:00
Evgenii Kliuchnikov
947f74e908 update links in readme
PiperOrigin-RevId: 813665159
2025-10-01 02:58:49 -07:00
Evgenii Kliuchnikov
916e4a46a8 update docs
PiperOrigin-RevId: 813658707
2025-10-01 02:37:06 -07:00
Eugene Kliuchnikov
e4e56a3203 Add missing newline 2025-09-29 16:32:28 +02:00
Eugene Kliuchnikov
2c5f2d1198 Merge branch 'master' into go 2025-09-29 15:39:10 +02:00
Eugene Kliuchnikov
f3b0ceed2d Merge branch 'master' into dependabot/github_actions/actions/cache-4.3.0 2025-09-29 15:37:19 +02:00
Evgenii Kliuchnikov
1f6ab76bff use module-bound exception
PiperOrigin-RevId: 812739918
2025-09-29 05:12:31 -07:00
dependabot[bot]
5c79b32b14 Bump actions/cache from 4.2.4 to 4.3.0
Bumps [actions/cache](https://github.com/actions/cache) from 4.2.4 to 4.3.0.
- [Release notes](https://github.com/actions/cache/releases)
- [Changelog](https://github.com/actions/cache/blob/main/RELEASES.md)
- [Commits](0400d5f644...0057852bfa)

---
updated-dependencies:
- dependency-name: actions/cache
  dependency-version: 4.3.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-29 09:19:37 +00:00
Copybara-Service
d74b0a4a22 Merge pull request #1323 from google:dependabot/github_actions/actions/setup-python-6.0.0
PiperOrigin-RevId: 811710346
2025-09-26 01:25:56 -07:00
Copybara-Service
dbcb332b66 Merge pull request #1324 from google:dependabot/github_actions/softprops/action-gh-release-2.3.3
PiperOrigin-RevId: 811710240
2025-09-26 01:24:57 -07:00
dependabot[bot]
c0d785dfe2 Bump actions/setup-python from 5.6.0 to 6.0.0
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 5.6.0 to 6.0.0.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](a26af69be9...e797f83bcb)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-version: 6.0.0
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-25 15:02:31 +00:00
dependabot[bot]
466613c266 Bump softprops/action-gh-release from 2.3.2 to 2.3.3
Bumps [softprops/action-gh-release](https://github.com/softprops/action-gh-release) from 2.3.2 to 2.3.3.
- [Release notes](https://github.com/softprops/action-gh-release/releases)
- [Changelog](https://github.com/softprops/action-gh-release/blob/master/CHANGELOG.md)
- [Commits](72f2c25fcb...6cbd405e2c)

---
updated-dependencies:
- dependency-name: softprops/action-gh-release
  dependency-version: 2.3.3
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-25 15:01:52 +00:00
Evgenii Kliuchnikov
1406898440 Build and test with PY2.7
PiperOrigin-RevId: 811352084
2025-09-25 08:00:20 -07:00
Evgenii Kliuchnikov
0bef8a6936 clarify that prepared dictionaries are "lean"
PiperOrigin-RevId: 811236534
2025-09-25 01:27:28 -07:00
Evgenii Kliuchnikov
9686382ff3 PY: continue renovation of extension
Fixed an unchecked malloc for "tail" input data.
Fixed inconsistencies in docstrings.

Rewrote the "growable buffer" in C code, so it can run without holding the GIL.

Breaking changes:
 - native object allocation failures are now handled at object creation time
 - some lower-level exceptions (e.g. OOM) are no longer shadowed by brotli.error
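The second breaking change means callers that previously caught only `brotli.error` may now also observe `MemoryError`. A hypothetical sketch of the pattern (stand-in names; the real bindings are not imported here):

```python
class BrotliError(Exception):
    """Stand-in for brotli.error; the real bindings are not assumed here."""

def decompress_stub(data: bytes) -> bytes:
    # Stand-in for brotli.decompress: raises BrotliError on malformed input;
    # allocation failures (MemoryError) propagate unwrapped, mirroring the
    # new behavior described above.
    if not data:
        raise BrotliError("empty input")
    return data

def robust_decompress(data: bytes):
    try:
        return decompress_stub(data)
    except BrotliError:
        return None
    # MemoryError is intentionally NOT caught here: after this change it is
    # no longer turned into brotli.error.

assert robust_decompress(b"") is None
assert robust_decompress(b"abc") == b"abc"
```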

PiperOrigin-RevId: 810813664
2025-09-24 03:52:44 -07:00
Evgenii Kliuchnikov
85d46ce6b5 Drop finalize()
It is now solely the embedder's responsibility to close objects that hold native resources. No more "safety net".

Consider "try-with-resources". For longer lasting items (e.g. native PreparedDictionary) use Cleaner as a last resort.

PiperOrigin-RevId: 807584792
2025-09-16 01:23:46 -07:00
akazwz
3d8eef20a6 Update Go modules to require Go 1.21 and replace ioutil with io package in reader.go 2025-09-12 23:52:50 +08:00
Evgenii Kliuchnikov
41a22f07f2 modernize PY3 class definition
PiperOrigin-RevId: 804460135
2025-09-08 09:15:53 -07:00
Evgenii Kliuchnikov
98a89b1563 temporary rollback
PiperOrigin-RevId: 803462595
2025-09-05 07:57:59 -07:00
83 changed files with 1891 additions and 1976 deletions

.gitattributes vendored (1 changed line)

@@ -51,3 +51,4 @@ tests/testdata/empty !export-ignore
tests/testdata/empty.compressed !export-ignore
tests/testdata/ukkonooa !export-ignore
tests/testdata/ukkonooa.compressed !export-ignore
tests/testdata/zerosukkanooa.compressed !export-ignore


@@ -6,19 +6,23 @@
# Workflow for building and running tests under Ubuntu
name: Build/Test
on:
push:
branches:
- master
pull_request:
types: [opened, reopened, labeled, synchronize]
types: [opened, reopened, labeled, unlabeled, synchronize]
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}-${{ github.event_name }}
cancel-in-progress: ${{ github.event_name == 'pull_request' }}
jobs:
ubuntu_build:
build_test:
name: Build and test ${{ matrix.name }}
runs-on: ${{ matrix.os || 'ubuntu-latest' }}
defaults:
@@ -28,18 +32,36 @@ jobs:
fail-fast: false
matrix:
include:
- name: cmake:gcc
build_system: cmake
c_compiler: gcc
cxx_compiler: g++
- name: cmake:gcc-old
build_system: cmake
c_compiler: gcc
cxx_compiler: g++
os: ubuntu-22.04
- name: cmake:clang
build_system: cmake
c_compiler: clang
cxx_compiler: clang
- name: cmake:clang-old
build_system: cmake
c_compiler: clang
cxx_compiler: clang
os: ubuntu-22.04
- name: cmake:package
build_system: cmake
cmake_args: -DBROTLI_BUILD_FOR_PACKAGE=ON
- name: cmake:static
build_system: cmake
cmake_args: -DBUILD_SHARED_LIBS=OFF
- name: cmake:clang:asan
build_system: cmake
sanitizer: address
@@ -78,6 +100,16 @@ jobs:
cxx_compiler: g++
os: macos-latest
- name: cmake-ios:clang
build_system: cmake
c_compiler: clang
cxx_compiler: clang++
os: macos-latest
skip_tests: true # TODO(eustas): run tests in a simulator
cmake_args: >-
-DCMAKE_SYSTEM_NAME=iOS
-DCMAKE_OSX_ARCHITECTURES=arm64
- name: cmake-win64:msvc-rel
build_system: cmake
cmake_generator: Visual Studio 17 2022
@@ -105,7 +137,6 @@ jobs:
build_system: python
python_version: "3.10"
# TODO: investigate why win-builds can't run tests
py_setuptools_cmd: build_ext
os: windows-2022
- name: maven
@@ -174,6 +205,12 @@ jobs:
CXX: ${{ matrix.cxx_compiler || 'gcc' }}
steps:
- name: Harden Runner
uses: step-security/harden-runner@95d9a5deda9de15063e7595e9719c11c38c90ae2 # v2.13.2
with:
egress-policy: audit
- name: Install extra deps @ Ubuntu
if: ${{ runner.os == 'Linux' }}
# Already installed: bazel, clang{13-15}, cmake, gcc{9.5-13.1}, java{8,11,17,21}, maven, python{3.10}
@@ -183,23 +220,16 @@ jobs:
sudo apt install -y ${EXTRA_PACKAGES}
- name: Checkout the source
uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac # v4.0.0
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
with:
submodules: false
fetch-depth: 1
#- name: Checkout VC9 for Python
# if: ${{ runner.os == 'Windows' && matrix.build_system == 'python' && matrix.python_version == '2.7' }}
# uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac # v4.0.0
# with:
# repository: reider-roque/sulley-win-installer
# path: third_party/VCForPython27
- name: Configure / Build / Test with CMake
if: ${{ matrix.build_system == 'cmake' }}
run: |
export ASAN_OPTIONS=detect_leaks=0
declare -a CMAKE_OPTIONS=()
declare -a CMAKE_OPTIONS=(${{ matrix.cmake_args || '' }})
CMAKE_OPTIONS+=("-DCMAKE_VERBOSE_MAKEFILE=ON")
[ ! -z '${{ matrix.c_compiler || '' }}' ] && CMAKE_OPTIONS+=(-DCMAKE_C_COMPILER='${{ matrix.c_compiler }}')
[ ! -z '${{ matrix.cxx_compiler || '' }}' ] && CMAKE_OPTIONS+=(-DCMAKE_CXX_COMPILER='${{ matrix.cxx_compiler }}')
@@ -212,7 +242,9 @@ jobs:
cmake -B out . ${CMAKE_OPTIONS[*]} -DCMAKE_C_FLAGS='${{ matrix.c_flags || '' }}'
cmake --build out ${CMAKE_BUILD_OPTIONS[*]}
cd out; ctest ${CMAKE_TEST_OPTIONS[*]}; cd ..
cd out
[ ! -z '${{ matrix.skip_tests || '' }}' ] || ctest ${CMAKE_TEST_OPTIONS[*]}
cd ..
- name: Quick Fuzz
if: ${{ matrix.build_system == 'fuzz' }}
@@ -289,22 +321,35 @@ jobs:
# cd integration
# mvn -B verify
- uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
- uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
if: ${{ matrix.build_system == 'python' }}
with:
python-version: ${{ matrix.python_version }}
# TODO: investigate, why msiexec hangs
#- name: Install VC9 for Python
# if: ${{ runner.os == 'Windows' && matrix.build_system == 'python' && matrix.python_version == '2.7' }}
# run: |
# echo "070474db76a2e625513a5835df4595df9324d820f9cc97eab2a596dcbc2f5cbf third_party/VCForPython27/VCForPython27.msi" | sha256sum --check --status
# msiexec ALLUSERS=1 /qn /norestart /i third_party/VCForPython27/VCForPython27.msi /l*v ${RUNNER_TEMP}/msiexec.log
# cat ${RUNNER_TEMP}/msiexec.log
# TODO(eustas): use modern setuptools (split out testing)
- name: Build / Test with Python
if: ${{ matrix.build_system == 'python' }}
run: |
python -VV
python -c "import sys; sys.exit('Invalid python version') if '.'.join(map(str,sys.version_info[0:2])) != '${{ matrix.python_version }}' else True"
python setup.py ${{ matrix.py_setuptools_cmd || 'test'}}
pip install setuptools==51.3.3 pytest
python setup.py build_ext --inplace
pytest ./python/tests
build_test_dotnet:
name: Build and test with .NET
runs-on: ubuntu-latest
steps:
- name: Checkout the source
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
with:
submodules: false
fetch-depth: 1
- name: Build / Test
run: |
cd csharp
dotnet build brotlidec.csproj --configuration Release
dotnet test brotlidec.Tests.csproj

.github/workflows/build_test_wasm.yml vendored (new file, 70 lines)

@@ -0,0 +1,70 @@
# Copyright 2025 Google Inc. All Rights Reserved.
#
# Distributed under MIT license.
# See file LICENSE for detail or copy at https://opensource.org/licenses/MIT
# Workflow for building and running tests with WASM
name: Build/Test WASM
on:
push:
branches:
- master
pull_request:
types: [opened, reopened, labeled, unlabeled, synchronize]
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}-${{ github.event_name }}
cancel-in-progress: ${{ github.event_name == 'pull_request' }}
jobs:
build_test_wasm:
name: Build and test with WASM
runs-on: ubuntu-latest
env:
CCACHE_DIR: ${{ github.workspace }}/.ccache
BUILD_TARGET: wasm32
EM_VERSION: 3.1.51
# As of 28.08.2025 ubuntu-latest is 24.04; it is shipped with node 22.18
NODE_VERSION: 22
steps:
- name: Harden Runner
uses: step-security/harden-runner@95d9a5deda9de15063e7595e9719c11c38c90ae2 # v2.13.2
with:
egress-policy: audit
- uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
with:
submodules: true
fetch-depth: 1
- name: Install node
uses: actions/setup-node@2028fbc5c25fe9cf00d9f06a71cc4710d4507903 # v6.0.0
with:
node-version: ${{env.NODE_VERSION}}
- name: Get non-EMSDK node path
run: which node >> $HOME/.base_node_path
- name: Install emsdk
uses: mymindstorm/setup-emsdk@6ab9eb1bda2574c4ddb79809fc9247783eaf9021 # v14
with:
version: ${{env.EM_VERSION}}
no-cache: true
- name: Set EMSDK node version
run: |
echo "NODE_JS='$(cat $HOME/.base_node_path)'" >> $EMSDK/.emscripten
emsdk construct_env
- name: Build
run: |
LDFLAGS=" -s ALLOW_MEMORY_GROWTH=1 -s NODERAWFS=1 " emcmake cmake -B out .
cmake --build out
cd out; ctest --output-on-failure; cd ..


@@ -9,6 +9,9 @@ on:
schedule:
- cron: '18 15 * * 0'
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}-${{ github.event_name }}
cancel-in-progress: ${{ github.event_name == 'pull_request' }}
@@ -30,12 +33,18 @@ jobs:
# CodeQL supports [ 'cpp', 'csharp', 'go', 'java', 'javascript', 'python', 'ruby', 'swift' ]
steps:
- name: Harden Runner
uses: step-security/harden-runner@95d9a5deda9de15063e7595e9719c11c38c90ae2 # v2.13.2
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac # v4.0.0
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@43750fe4fc4f068f04f2215206e6f6a29c78c763 # v2.14.4
uses: github/codeql-action/init@cf1bb45a277cb3c205638b2cd5c984db1c46a412 # v3.29.5
with:
languages: ${{ matrix.language }}
# CodeQL is currently crashing on files with large lists:
@@ -47,7 +56,7 @@ jobs:
- if: matrix.language == 'cpp'
name: Build CPP
uses: github/codeql-action/autobuild@43750fe4fc4f068f04f2215206e6f6a29c78c763 # v2.14.4
uses: github/codeql-action/autobuild@cf1bb45a277cb3c205638b2cd5c984db1c46a412 # v3.29.5
- if: matrix.language == 'cpp' || matrix.language == 'java'
name: Build Java
@@ -57,7 +66,7 @@ jobs:
- if: matrix.language == 'javascript'
name: Build JS
uses: github/codeql-action/autobuild@43750fe4fc4f068f04f2215206e6f6a29c78c763 # v2.14.4
uses: github/codeql-action/autobuild@cf1bb45a277cb3c205638b2cd5c984db1c46a412 # v3.29.5
- if: matrix.language == 'cpp' || matrix.language == 'python'
name: Build Python
@@ -65,7 +74,7 @@ jobs:
python setup.py build_ext
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@43750fe4fc4f068f04f2215206e6f6a29c78c763 # v2.14.4
uses: github/codeql-action/analyze@cf1bb45a277cb3c205638b2cd5c984db1c46a412 # v3.29.5
with:
category: "/language:${{matrix.language}}"
ref: "${{ github.ref != 'master' && github.ref || '/refs/heads/master' }}"


@@ -6,8 +6,12 @@
# Workflow for building / running oss-fuzz.
name: CIFuzz
on: [pull_request]
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}-${{ github.event_name }}
cancel-in-progress: ${{ github.event_name == 'pull_request' }}
@@ -16,17 +20,25 @@ jobs:
Fuzzing:
runs-on: ubuntu-latest
steps:
- name: Harden Runner
uses: step-security/harden-runner@95d9a5deda9de15063e7595e9719c11c38c90ae2 # v2.13.2
with:
egress-policy: audit
- name: Build Fuzzers
uses: google/oss-fuzz/infra/cifuzz/actions/build_fuzzers@master
uses: google/oss-fuzz/infra/cifuzz/actions/build_fuzzers@3e6a7fd7bcd631647ab9beed1fe0897498e6af39 # 22.09.2025
with:
oss-fuzz-project-name: 'brotli'
dry-run: false
- name: Run Fuzzers
uses: google/oss-fuzz/infra/cifuzz/actions/run_fuzzers@master
uses: google/oss-fuzz/infra/cifuzz/actions/run_fuzzers@3e6a7fd7bcd631647ab9beed1fe0897498e6af39 # 22.09.2025
with:
oss-fuzz-project-name: 'brotli'
fuzz-seconds: 600
dry-run: false
- name: Upload Crash
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
if: failure()

.github/workflows/lint.yml vendored (new file, 55 lines)

@@ -0,0 +1,55 @@
# Copyright 2025 Google Inc. All Rights Reserved.
#
# Distributed under MIT license.
# See file LICENSE for detail or copy at https://opensource.org/licenses/MIT
# Workflow for checking typos and buildifier, formatting, etc.
name: "Lint"
on:
push:
branches: [ "master" ]
pull_request:
branches: [ "master" ]
schedule:
- cron: '18 15 * * 0'
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}-${{ github.event_name }}
cancel-in-progress: ${{ github.event_name == 'pull_request' }}
jobs:
check:
name: Lint
runs-on: 'ubuntu-latest'
steps:
- name: Harden Runner
uses: step-security/harden-runner@95d9a5deda9de15063e7595e9719c11c38c90ae2 # v2.13.2
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
- name: Install tools
run: |
eval "$(/home/linuxbrew/.linuxbrew/bin/brew shellenv)"
brew install buildifier ruff typos-cli
- name: Check typos
run: |
eval "$(/home/linuxbrew/.linuxbrew/bin/brew shellenv)"
./scripts/check_typos.sh
- name: Lint Python code
run: |
eval "$(/home/linuxbrew/.linuxbrew/bin/brew shellenv)"
ruff check --extend-select=C4,C90,PERF,RET,SIM,W
# TODO(eustas): run buildifier


@@ -14,7 +14,10 @@ on:
release:
types: [ published ]
pull_request:
types: [opened, reopened, labeled, synchronize]
types: [opened, reopened, labeled, unlabeled, synchronize]
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}-${{ github.event_name }}
@@ -59,13 +62,19 @@ jobs:
VCPKG_DISABLE_METRICS: 1
steps:
- name: Harden Runner
uses: step-security/harden-runner@95d9a5deda9de15063e7595e9719c11c38c90ae2 # v2.13.2
with:
egress-policy: audit
- name: Checkout the source
uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac # v4.0.0
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
with:
submodules: false
fetch-depth: 1
- uses: actions/cache@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
- uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
id: cache-vcpkg
with:
path: vcpkg
@@ -76,7 +85,7 @@ jobs:
shell: 'powershell'
run: |
Invoke-WebRequest -Uri "https://github.com/microsoft/vcpkg/archive/refs/tags/${{ env.VCPKG_VERSION }}.zip" -OutFile "vcpkg.zip"
- name: Bootstrap vcpkg
if: steps.cache-vcpkg.outputs.cache-hit != 'true'
shell: 'bash'
@@ -100,23 +109,19 @@ jobs:
-DCMAKE_TOOLCHAIN_FILE=${VCPKG_ROOT}/scripts/buildsystems/vcpkg.cmake \
-DVCPKG_TARGET_TRIPLET=${{ matrix.triplet }} \
#
- name: Build
shell: 'bash'
run: |
set -x
cmake --build out --config Release
- name: Install
shell: 'bash'
run: |
set -x
cmake --build out --config Release --target install
cp LICENSE prefix/bin/LICENSE.brotli
- name: Upload artifacts
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
name: brotli-${{matrix.triplet}}
path: |
prefix/bin/*
- name: Package release zip
shell: 'powershell'
@@ -124,11 +129,12 @@ jobs:
Compress-Archive -Path prefix\bin\* `
-DestinationPath brotli-${{matrix.triplet}}.zip
- name: Upload binaries to release
if: github.event_name == 'release'
uses: softprops/action-gh-release@72f2c25fcb47643c292f7107632f7a47c1df5cd8 # v0.1.15
- name: Upload package
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
files: brotli-${{matrix.triplet}}.zip
name: brotli-${{matrix.triplet}}
path: brotli-${{matrix.triplet}}.zip
compression-level: 0
testdata_upload:
name: Upload testdata
@@ -138,8 +144,13 @@ jobs:
shell: bash
steps:
- name: Harden Runner
uses: step-security/harden-runner@95d9a5deda9de15063e7595e9719c11c38c90ae2 # v2.13.2
with:
egress-policy: audit
- name: Checkout the source
uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac # v4.0.0
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
with:
submodules: false
fetch-depth: 1
@@ -148,14 +159,42 @@ jobs:
run: |
tar cvfJ testdata.txz tests/testdata
- name: Upload archive to release
if: github.event_name == 'release'
uses: softprops/action-gh-release@72f2c25fcb47643c292f7107632f7a47c1df5cd8 # v0.1.15
- name: Upload archive
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
files: testdata.txz
name: testdata
path: testdata.txz
compression-level: 0
publish_release_assets:
name: Publish release assets
needs: [windows_build, testdata_upload]
if: github.event_name == 'release'
runs-on: [ubuntu-latest]
permissions:
contents: write
steps:
- name: Checkout the source
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
with:
submodules: false
fetch-depth: 1
- name: Download all artifacts
uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53 # v6.0.0
with:
path: release_assets
merge-multiple: true
- name: Publish assets
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
gh release upload ${{ github.event.release.tag_name }} ./release_assets/*
archive_build:
needs: testdata_upload
needs: publish_release_assets
name: Build and test from archive
runs-on: 'ubuntu-latest'
defaults:
@@ -163,8 +202,13 @@ jobs:
shell: bash
steps:
- name: Harden Runner
uses: step-security/harden-runner@95d9a5deda9de15063e7595e9719c11c38c90ae2 # v2.13.2
with:
egress-policy: audit
- name: Checkout the source
uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac # v4.0.0
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
with:
submodules: false
fetch-depth: 1

View File

@@ -3,6 +3,7 @@
# policy, and support documentation.
name: Scorecard supply-chain security
on:
# For Branch-Protection check. Only the default branch is supported. See
# https://github.com/ossf/scorecard/blob/main/docs/checks.md#branch-protection
@@ -14,13 +15,13 @@ on:
push:
branches: [ "master" ]
# Declare default permissions as read only.
permissions: read-all
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}-${{ github.event_name }}
cancel-in-progress: ${{ github.event_name == 'pull_request' }}
# Declare default permissions as read only.
permissions: read-all
jobs:
analysis:
name: Scorecard analysis
@@ -35,13 +36,18 @@ jobs:
# actions: read
steps:
- name: Harden Runner
uses: step-security/harden-runner@95d9a5deda9de15063e7595e9719c11c38c90ae2 # v2.13.2
with:
egress-policy: audit
- name: "Checkout code"
uses: actions/checkout@v4 # v3.1.0
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
with:
persist-credentials: false
- name: "Run analysis"
uses: ossf/scorecard-action@05b42c624433fc40578a4040d5cf5e36ddca8cde # v2.4.2
uses: ossf/scorecard-action@4eaacf0543bb3f2c246792bd56e8cdeffafb205a # v2.4.3
with:
results_file: results.sarif
results_format: sarif
@@ -71,6 +77,6 @@ jobs:
# Upload the results to GitHub's code scanning dashboard.
- name: "Upload to code-scanning"
uses: github/codeql-action/upload-sarif@17573ee1cc1b9d061760f3a006fc4aac4f944fd5 # v2.2.4
uses: github/codeql-action/upload-sarif@cf1bb45a277cb3c205638b2cd5c984db1c46a412 # v2.23.3
with:
sarif_file: results.sarif

View File

@@ -7,12 +7,38 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
## Unreleased
## [1.2.0] - 2025-10-27
### SECURITY
- python: added `Decompressor::can_accept_more_data` method and optional
`max_output_length` argument `Decompressor::process`;
that allows mitigation of unexpextedely large output;
`output_buffer_limit` argument `Decompressor::process`;
that allows mitigation of unexpectedly large output;
reported by Charles Chan (https://github.com/charleswhchan)
### Added
- **decoder / encoder: added static initialization to reduce binary size**
- python: allow limiting decoder output (see SECURITY section)
- CLI: `brcat` alias; allow decoding concatenated brotli streams
- kt: pure Kotlin decoder
- cgo: support "raw" dictionaries
- build: Bazel modules
### Removed
- java: dropped `finalize()` for native entities
### Fixed
- java: in `compress` pass correct length to native encoder
### Improved
- build: install man pages
- build: updated / fixed / refined Bazel buildfiles
- encoder: faster encoding
- cgo: link via pkg-config
- python: modernize extension / allow multi-phase module initialization
### Changed
- decoder / encoder: static tables use "small" model (allows 2GiB+ binaries)
## [1.1.0] - 2023-08-28

View File

@@ -10,11 +10,8 @@ cmake_minimum_required(VERSION 3.15)
cmake_policy(SET CMP0048 NEW)
project(brotli C)
option(BUILD_SHARED_LIBS "Build shared libraries" ON)
set(BROTLI_BUILD_TOOLS ON CACHE BOOL "Build/install CLI tools")
if(NOT CMAKE_BUILD_TYPE AND NOT CMAKE_CONFIGURATION_TYPES)
message(STATUS "Setting build type to Release as none was specified.")
if (NOT CMAKE_BUILD_TYPE AND NOT CMAKE_CONFIGURATION_TYPES)
message(STATUS "Setting build type to Release as none was specified")
set(CMAKE_BUILD_TYPE "Release" CACHE STRING "Choose the type of build" FORCE)
else()
message(STATUS "Build type is '${CMAKE_BUILD_TYPE}'")
@@ -33,20 +30,34 @@ else()
message("-- Compiler is not EMSCRIPTEN")
endif()
if (BROTLI_EMSCRIPTEN)
message(STATUS "Switching to static build for EMSCRIPTEN")
set(BUILD_SHARED_LIBS OFF)
endif()
# Reflect CMake variable as a build option.
option(BUILD_SHARED_LIBS "Build shared libraries" ON)
set(BROTLI_BUILD_TOOLS ON CACHE BOOL "Build/install CLI tools")
set(BROTLI_BUILD_FOR_PACKAGE OFF CACHE BOOL "Build/install both shared and static libraries")
if (BROTLI_BUILD_FOR_PACKAGE AND NOT BUILD_SHARED_LIBS)
message(FATAL_ERROR "Both BROTLI_BUILD_FOR_PACKAGE and BUILD_SHARED_LIBS are set")
endif()
# If Brotli is being bundled in another project, we don't want to
# install anything. However, we want to let people override this, so
# we'll use the BROTLI_BUNDLED_MODE variable to let them do that; just
# set it to OFF in your project before you add_subdirectory(brotli).
get_directory_property(BROTLI_PARENT_DIRECTORY PARENT_DIRECTORY)
if(NOT DEFINED BROTLI_BUNDLED_MODE)
if (NOT DEFINED BROTLI_BUNDLED_MODE)
# Bundled mode hasn't been set one way or the other, set the default
# depending on whether or not we are the top-level project.
if(BROTLI_PARENT_DIRECTORY)
if (BROTLI_PARENT_DIRECTORY)
set(BROTLI_BUNDLED_MODE ON)
else()
set(BROTLI_BUNDLED_MODE OFF)
endif()
endif()
endif() # BROTLI_BUNDLED_MODE
mark_as_advanced(BROTLI_BUNDLED_MODE)
include(GNUInstallDirs)
@@ -80,66 +91,79 @@ endif ()
include(CheckLibraryExists)
set(LIBM_LIBRARY)
set(LIBM_DEP)
CHECK_LIBRARY_EXISTS(m log2 "" HAVE_LIB_M)
if(HAVE_LIB_M)
if (HAVE_LIB_M)
set(LIBM_LIBRARY "m")
if (NOT BUILD_SHARED_LIBS)
set(LIBM_DEP "-lm")
endif()
endif()
set(BROTLI_INCLUDE_DIRS "${CMAKE_CURRENT_SOURCE_DIR}/c/include")
mark_as_advanced(BROTLI_INCLUDE_DIRS)
set(BROTLI_LIBRARIES_CORE brotlienc brotlidec brotlicommon)
set(BROTLI_LIBRARIES ${BROTLI_LIBRARIES_CORE} ${LIBM_LIBRARY})
if (BROTLI_BUILD_FOR_PACKAGE)
set(BROTLI_SHARED_LIBRARIES brotlienc brotlidec brotlicommon)
set(BROTLI_STATIC_LIBRARIES brotlienc-static brotlidec-static brotlicommon-static)
set(BROTLI_LIBRARIES ${BROTLI_SHARED_LIBRARIES} ${LIBM_LIBRARY})
else() # NOT BROTLI_BUILD_FOR_PACKAGE
if (BUILD_SHARED_LIBS)
set(BROTLI_SHARED_LIBRARIES brotlienc brotlidec brotlicommon)
set(BROTLI_STATIC_LIBRARIES)
else() # NOT BUILD_SHARED_LIBS
set(BROTLI_SHARED_LIBRARIES)
set(BROTLI_STATIC_LIBRARIES brotlienc brotlidec brotlicommon)
endif()
set(BROTLI_LIBRARIES ${BROTLI_SHARED_LIBRARIES} ${BROTLI_STATIC_LIBRARIES} ${LIBM_LIBRARY})
endif() # BROTLI_BUILD_FOR_PACKAGE
mark_as_advanced(BROTLI_LIBRARIES)
if(${CMAKE_SYSTEM_NAME} MATCHES "Linux")
add_definitions(-DOS_LINUX)
elseif(${CMAKE_SYSTEM_NAME} MATCHES "FreeBSD")
add_definitions(-DOS_FREEBSD)
elseif(${CMAKE_SYSTEM_NAME} MATCHES "Darwin")
add_definitions(-DOS_MACOSX)
endif()
if (MSVC)
message(STATUS "Defining _CRT_SECURE_NO_WARNINGS to avoid warnings about security")
if(BROTLI_EMSCRIPTEN)
set(BUILD_SHARED_LIBS OFF)
add_definitions(-D_CRT_SECURE_NO_WARNINGS)
endif()
file(GLOB_RECURSE BROTLI_COMMON_SOURCES RELATIVE ${CMAKE_CURRENT_SOURCE_DIR} c/common/*.c)
add_library(brotlicommon ${BROTLI_COMMON_SOURCES})
file(GLOB_RECURSE BROTLI_DEC_SOURCES RELATIVE ${CMAKE_CURRENT_SOURCE_DIR} c/dec/*.c)
add_library(brotlidec ${BROTLI_DEC_SOURCES})
file(GLOB_RECURSE BROTLI_ENC_SOURCES RELATIVE ${CMAKE_CURRENT_SOURCE_DIR} c/enc/*.c)
add_library(brotlicommon ${BROTLI_COMMON_SOURCES})
add_library(brotlidec ${BROTLI_DEC_SOURCES})
add_library(brotlienc ${BROTLI_ENC_SOURCES})
if (BROTLI_BUILD_FOR_PACKAGE)
add_library(brotlicommon-static STATIC ${BROTLI_COMMON_SOURCES})
add_library(brotlidec-static STATIC ${BROTLI_DEC_SOURCES})
add_library(brotlienc-static STATIC ${BROTLI_ENC_SOURCES})
endif()
# Older CMake versions do not understand INCLUDE_DIRECTORIES property.
include_directories(${BROTLI_INCLUDE_DIRS})
if(BUILD_SHARED_LIBS)
foreach(lib ${BROTLI_LIBRARIES_CORE})
if (BUILD_SHARED_LIBS)
foreach(lib ${BROTLI_SHARED_LIBRARIES})
target_compile_definitions(${lib} PUBLIC "BROTLI_SHARED_COMPILATION" )
string(TOUPPER "${lib}" LIB)
set_target_properties (${lib} PROPERTIES DEFINE_SYMBOL "${LIB}_SHARED_COMPILATION")
endforeach()
endif()
endif() # BUILD_SHARED_LIBS
foreach(lib ${BROTLI_LIBRARIES_CORE})
foreach(lib ${BROTLI_SHARED_LIBRARIES} ${BROTLI_STATIC_LIBRARIES})
target_link_libraries(${lib} ${LIBM_LIBRARY})
set_property(TARGET ${lib} APPEND PROPERTY INCLUDE_DIRECTORIES ${BROTLI_INCLUDE_DIRS})
set_target_properties(${lib} PROPERTIES
VERSION "${BROTLI_ABI_COMPATIBILITY}.${BROTLI_ABI_AGE}.${BROTLI_ABI_REVISION}"
SOVERSION "${BROTLI_ABI_COMPATIBILITY}")
if(NOT BROTLI_EMSCRIPTEN)
if (NOT BROTLI_EMSCRIPTEN)
set_target_properties(${lib} PROPERTIES POSITION_INDEPENDENT_CODE TRUE)
endif()
set_property(TARGET ${lib} APPEND PROPERTY INTERFACE_INCLUDE_DIRECTORIES "$<BUILD_INTERFACE:${BROTLI_INCLUDE_DIRS}>")
endforeach()
endforeach() # BROTLI_xxx_LIBRARIES
if(NOT BROTLI_EMSCRIPTEN)
target_link_libraries(brotlidec brotlicommon)
target_link_libraries(brotlienc brotlicommon)
endif()
target_link_libraries(brotlidec brotlicommon)
target_link_libraries(brotlienc brotlicommon)
# For projects stuck on older versions of CMake, this will set the
# BROTLI_INCLUDE_DIRS and BROTLI_LIBRARIES variables so they still
@@ -147,19 +171,21 @@ endif()
#
# include_directories(${BROTLI_INCLUDE_DIRS})
# target_link_libraries(foo ${BROTLI_LIBRARIES})
if(BROTLI_PARENT_DIRECTORY)
if (BROTLI_PARENT_DIRECTORY)
set(BROTLI_INCLUDE_DIRS "${BROTLI_INCLUDE_DIRS}" PARENT_SCOPE)
set(BROTLI_LIBRARIES "${BROTLI_LIBRARIES}" PARENT_SCOPE)
endif()
# Build the brotli executable
if(BROTLI_BUILD_TOOLS)
if (BROTLI_BUILD_TOOLS)
add_executable(brotli c/tools/brotli.c)
target_link_libraries(brotli ${BROTLI_LIBRARIES})
# brotli is a CLI tool
set_target_properties(brotli PROPERTIES MACOSX_BUNDLE OFF)
endif()
# Installation
if(NOT BROTLI_BUNDLED_MODE)
if (NOT BROTLI_BUNDLED_MODE)
if (BROTLI_BUILD_TOOLS)
install(
TARGETS brotli
@@ -168,7 +194,7 @@ if(NOT BROTLI_BUNDLED_MODE)
endif()
install(
TARGETS ${BROTLI_LIBRARIES_CORE}
TARGETS ${BROTLI_SHARED_LIBRARIES} ${BROTLI_STATIC_LIBRARIES}
ARCHIVE DESTINATION "${CMAKE_INSTALL_LIBDIR}"
LIBRARY DESTINATION "${CMAKE_INSTALL_LIBDIR}"
RUNTIME DESTINATION "${CMAKE_INSTALL_BINDIR}"
@@ -183,17 +209,28 @@ endif() # BROTLI_BUNDLED_MODE
# Tests
# Integration tests; these depend on the `brotli` binary
if(NOT BROTLI_DISABLE_TESTS AND BROTLI_BUILD_TOOLS)
if (NOT BROTLI_DISABLE_TESTS AND BROTLI_BUILD_TOOLS)
# If we're targeting Windows but not running on Windows, we need Wine
# to run the tests...
if(WIN32 AND NOT CMAKE_HOST_WIN32)
if (WIN32 AND NOT CMAKE_HOST_WIN32)
find_program(BROTLI_WRAPPER NAMES wine)
if(NOT BROTLI_WRAPPER)
if (NOT BROTLI_WRAPPER)
message(STATUS "wine not found, disabling tests")
set(BROTLI_DISABLE_TESTS TRUE)
endif()
endif()
endif() # WIN32 emulation
if (BROTLI_EMSCRIPTEN)
find_program(BROTLI_WRAPPER NAMES node)
if (NOT BROTLI_WRAPPER)
message(STATUS "node not found, disabling tests")
set(BROTLI_DISABLE_TESTS TRUE)
endif()
endif() # BROTLI_EMSCRIPTEN
endif() # BROTLI_DISABLE_TESTS
# NB: BROTLI_DISABLE_TESTS might have changed.
if (NOT BROTLI_DISABLE_TESTS AND BROTLI_BUILD_TOOLS)
# If our compiler is a cross-compiler that we know about (arm/aarch64),
# then we need to use qemu to execute the tests.
if ("${CMAKE_C_COMPILER}" MATCHES "^.*/arm-linux-gnueabihf-.*$")
@@ -255,13 +292,16 @@ if(NOT BROTLI_DISABLE_TESTS AND BROTLI_BUILD_TOOLS)
tests/testdata/*.compressed*)
foreach(INPUT ${COMPATIBILITY_INPUTS})
add_test(NAME "${BROTLI_TEST_PREFIX}compatibility/${INPUT}"
COMMAND "${CMAKE_COMMAND}"
-DBROTLI_WRAPPER=${BROTLI_WRAPPER}
-DBROTLI_WRAPPER_LD_PREFIX=${BROTLI_WRAPPER_LD_PREFIX}
-DBROTLI_CLI=$<TARGET_FILE:brotli>
-DINPUT=${CMAKE_CURRENT_SOURCE_DIR}/${INPUT}
-P ${CMAKE_CURRENT_SOURCE_DIR}/tests/run-compatibility-test.cmake)
string(REGEX REPLACE "([a-zA-Z0-9\\.]+)\\.compressed(\\.[0-9]+)?$" "\\1" UNCOMPRESSED_INPUT "${INPUT}")
if (EXISTS ${UNCOMPRESSED_INPUT})
add_test(NAME "${BROTLI_TEST_PREFIX}compatibility/${INPUT}"
COMMAND "${CMAKE_COMMAND}"
-DBROTLI_WRAPPER=${BROTLI_WRAPPER}
-DBROTLI_WRAPPER_LD_PREFIX=${BROTLI_WRAPPER_LD_PREFIX}
-DBROTLI_CLI=$<TARGET_FILE:brotli>
-DINPUT=${CMAKE_CURRENT_SOURCE_DIR}/${INPUT}
-P ${CMAKE_CURRENT_SOURCE_DIR}/tests/run-compatibility-test.cmake)
endif()
endforeach()
endif() # BROTLI_DISABLE_TESTS
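The `REGEX REPLACE` in the compatibility-test loop above strips a trailing `.compressed` (optionally followed by a numeric part) to locate the uncompressed counterpart of each test file. The same pattern, translated to Python purely for illustration (the file name is a made-up example):

```python
import re

def uncompressed_name(path):
    # Same pattern as the CMake REGEX REPLACE in the compatibility tests:
    # drop a trailing ".compressed" or ".compressed.NN" suffix.
    return re.sub(r"([a-zA-Z0-9\.]+)\.compressed(\.[0-9]+)?$", r"\1", path)

# "/" is not in the character class, so only the file name part matches.
assert uncompressed_name("tests/testdata/alice29.txt.compressed") == "tests/testdata/alice29.txt"
```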
@@ -283,19 +323,19 @@ function(generate_pkg_config_path outvar path)
get_filename_component(value_full "${value}" ABSOLUTE)
string(LENGTH "${value}" value_length)
if(path_length EQUAL value_length AND path STREQUAL value)
if (path_length EQUAL value_length AND path STREQUAL value)
set("${outvar}" "\${${name}}")
break()
elseif(path_length GREATER value_length)
elseif (path_length GREATER value_length)
# We might be in a subdirectory of the value, but we have to be
# careful about a prefix matching but not being a subdirectory
# (for example, /usr/lib64 is not a subdirectory of /usr/lib).
# We'll do this by making sure the next character is a directory
# separator.
string(SUBSTRING "${path}" ${value_length} 1 sep)
if(sep STREQUAL "/")
if (sep STREQUAL "/")
string(SUBSTRING "${path}" 0 ${value_length} s)
if(s STREQUAL value)
if (s STREQUAL value)
string(SUBSTRING "${path}" "${value_length}" -1 suffix)
set("${outvar}" "\${${name}}${suffix}")
break()
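The prefix check described in the comments above (a prefix match is only a substitution candidate when it is an exact match or a true subdirectory) can be sketched in Python; `substitute_prefix` is a hypothetical helper mirroring the CMake logic, not part of the build files:

```python
def substitute_prefix(path, name, value):
    """Replace `value` at the start of `path` with ${name}, but only when
    the match is exact or a true subdirectory, so that /usr/lib64 is NOT
    rewritten against /usr/lib."""
    if path == value:
        return "${%s}" % name
    # Require a directory separator right after the prefix.
    if path.startswith(value) and path[len(value)] == "/":
        return "${%s}%s" % (name, path[len(value):])
    return path

assert substitute_prefix("/usr/lib64", "libdir", "/usr/lib") == "/usr/lib64"
```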
@@ -316,6 +356,7 @@ function(transform_pc_file INPUT_FILE OUTPUT_FILE VERSION)
set(PREFIX "${CMAKE_INSTALL_PREFIX}")
string(REGEX REPLACE "@prefix@" "${PREFIX}" TEXT ${TEXT})
string(REGEX REPLACE "@exec_prefix@" "${PREFIX}" TEXT ${TEXT})
string(REGEX REPLACE "@libm@" "${LIBM_DEP}" TEXT ${TEXT})
generate_pkg_config_path(LIBDIR "${CMAKE_INSTALL_FULL_LIBDIR}" prefix "${PREFIX}")
string(REGEX REPLACE "@libdir@" "${LIBDIR}" TEXT ${TEXT})
@@ -334,7 +375,7 @@ transform_pc_file("scripts/libbrotlidec.pc.in" "${CMAKE_CURRENT_BINARY_DIR}/libb
transform_pc_file("scripts/libbrotlienc.pc.in" "${CMAKE_CURRENT_BINARY_DIR}/libbrotlienc.pc" "${BROTLI_VERSION}")
if(NOT BROTLI_BUNDLED_MODE)
if (NOT BROTLI_BUNDLED_MODE)
install(FILES "${CMAKE_CURRENT_BINARY_DIR}/libbrotlicommon.pc"
DESTINATION "${CMAKE_INSTALL_LIBDIR}/pkgconfig")
install(FILES "${CMAKE_CURRENT_BINARY_DIR}/libbrotlidec.pc"
@@ -345,11 +386,11 @@ endif() # BROTLI_BUNDLED_MODE
if (BROTLI_BUILD_TOOLS)
install(FILES "docs/brotli.1"
DESTINATION "${CMAKE_INSTALL_FULL_MANDIR}/man1")
DESTINATION "${CMAKE_INSTALL_MANDIR}/man1")
endif()
install(FILES docs/constants.h.3 docs/decode.h.3 docs/encode.h.3 docs/types.h.3
DESTINATION "${CMAKE_INSTALL_FULL_MANDIR}/man3")
DESTINATION "${CMAKE_INSTALL_MANDIR}/man3")
if (ENABLE_COVERAGE STREQUAL "yes")
setup_target_for_coverage(coverage test coverage)

View File

@@ -9,7 +9,6 @@ include c/include/brotli/*.h
include LICENSE
include MANIFEST.in
include python/_brotli.cc
include python/bro.py
include python/brotli.py
include python/README.md
include python/tests/*

View File

@@ -7,6 +7,6 @@
module(
name = "brotli",
version = "1.1.0",
version = "1.2.0",
repo_name = "org_brotli",
)

README
View File

@@ -7,9 +7,9 @@ currently available general-purpose compression methods. It is similar in speed
with deflate but offers more dense compression.
The specification of the Brotli Compressed Data Format is defined in RFC 7932
https://tools.ietf.org/html/rfc7932
https://datatracker.ietf.org/doc/html/rfc7932
Brotli is open-sourced under the MIT License, see the LICENSE file.
Brotli mailing list:
https://groups.google.com/forum/#!forum/brotli
https://groups.google.com/g/brotli

View File

@@ -12,7 +12,8 @@ and 2nd order context modeling, with a compression ratio comparable to the best
currently available general-purpose compression methods. It is similar in speed
with deflate but offers more dense compression.
The specification of the Brotli Compressed Data Format is defined in [RFC 7932](https://tools.ietf.org/html/rfc7932).
The specification of the Brotli Compressed Data Format is defined in
[RFC 7932](https://datatracker.ietf.org/doc/html/rfc7932).
Brotli is open-sourced under the MIT License, see the LICENSE file.
@@ -21,11 +22,23 @@ Brotli is open-sourced under the MIT License, see the LICENSE file.
> to modify "raw" ranges of the compressed stream and the decoder will not
> notice that.
### Installation
In most Linux distributions, installing `brotli` is just a matter of using
the package management system. For example in Debian-based distributions:
`apt install brotli` will install `brotli`. On MacOS, you can use
[Homebrew](https://brew.sh/): `brew install brotli`.
[![brotli packaging status](https://repology.org/badge/vertical-allrepos/brotli.svg?exclude_unsupported=1&columns=3&exclude_sources=modules,site&header=brotli%20packaging%20status)](https://repology.org/project/brotli/versions)
Of course you can also build brotli from sources.
### Build instructions
#### Vcpkg
You can download and install brotli using the [vcpkg](https://github.com/Microsoft/vcpkg/) dependency manager:
You can download and install brotli using the
[vcpkg](https://github.com/Microsoft/vcpkg/) dependency manager:
git clone https://github.com/Microsoft/vcpkg.git
cd vcpkg
@@ -33,11 +46,13 @@ You can download and install brotli using the [vcpkg](https://github.com/Microso
./vcpkg integrate install
./vcpkg install brotli
The brotli port in vcpkg is kept up to date by Microsoft team members and community contributors. If the version is out of date, please [create an issue or pull request](https://github.com/Microsoft/vcpkg) on the vcpkg repository.
The brotli port in vcpkg is kept up to date by Microsoft team members and
community contributors. If the version is out of date, please [create an issue
or pull request](https://github.com/Microsoft/vcpkg) on the vcpkg repository.
#### Bazel
See [Bazel](http://www.bazel.build/)
See [Bazel](https://www.bazel.build/)
#### CMake
@@ -65,7 +80,7 @@ from source, development, and testing.
### Contributing
We are glad to answer library related questions in
[brotli mailing list](https://groups.google.com/forum/#!forum/brotli).
[brotli mailing list](https://groups.google.com/g/brotli).
Regular issues / feature requests should be reported in
[issue tracker](https://github.com/google/brotli/issues).
@@ -76,20 +91,24 @@ For contributing changes please read [CONTRIBUTING](CONTRIBUTING.md).
### Benchmarks
* [Squash Compression Benchmark](https://quixdb.github.io/squash-benchmark/) / [Unstable Squash Compression Benchmark](https://quixdb.github.io/squash-benchmark/unstable/)
* [Large Text Compression Benchmark](http://mattmahoney.net/dc/text.html)
* [Large Text Compression Benchmark](https://mattmahoney.net/dc/text.html)
* [Lzturbo Benchmark](https://sites.google.com/site/powturbo/home/benchmark)
### Related projects
> **Disclaimer:** Brotli authors take no responsibility for the third party projects mentioned in this section.
Independent [decoder](https://github.com/madler/brotli) implementation by Mark Adler, based entirely on format specification.
Independent [decoder](https://github.com/madler/brotli) implementation
by Mark Adler, based entirely on format specification.
JavaScript port of brotli [decoder](https://github.com/devongovett/brotli.js). Could be used directly via `npm install brotli`
JavaScript port of brotli [decoder](https://github.com/devongovett/brotli.js).
Could be used directly via `npm install brotli`
Hand ported [decoder / encoder](https://github.com/dominikhlbg/BrotliHaxe) in haxe by Dominik Homberger. Output source code: JavaScript, PHP, Python, Java and C#
Hand ported [decoder / encoder](https://github.com/dominikhlbg/BrotliHaxe)
in haxe by Dominik Homberger.
Output source code: JavaScript, PHP, Python, Java and C#
7Zip [plugin](https://github.com/mcmilk/7-Zip-Zstd)
Dart [native bindings](https://github.com/thosakwe/brotli)
Dart compression framework with [fast FFI-based Brotli implementation](https://pub.dev/documentation/es_compression/latest/brotli/brotli-library.html) with ready-to-use prebuilt binaries for Win/Linux/Mac
Dart compression framework with
[fast FFI-based Brotli implementation](https://pub.dev/documentation/es_compression/latest/brotli/)
with ready-to-use prebuilt binaries for Win/Linux/Mac

View File

@@ -203,9 +203,23 @@ OR:
#define BROTLI_TARGET_LOONGARCH64
#endif
/* __s390x__ does not seem to be a reliable indicator of z/Architecture
(64-bit), nor that unaligned loads can be used. */
#if defined(__s390x__)
#define BROTLI_TARGET_S390X
#endif
#if defined(__mips64)
#define BROTLI_TARGET_MIPS64
#endif
#if defined(__ia64__) || defined(_M_IA64)
#define BROTLI_TARGET_IA64
#endif
#if defined(BROTLI_TARGET_X64) || defined(BROTLI_TARGET_ARMV8_64) || \
defined(BROTLI_TARGET_POWERPC64) || defined(BROTLI_TARGET_RISCV64) || \
defined(BROTLI_TARGET_LOONGARCH64)
defined(BROTLI_TARGET_LOONGARCH64) || defined(BROTLI_TARGET_MIPS64)
#define BROTLI_TARGET_64_BITS 1
#else
#define BROTLI_TARGET_64_BITS 0
@@ -267,6 +281,46 @@ OR:
#endif
/* Portable unaligned memory access: read / write values via memcpy. */
#if !defined(BROTLI_USE_PACKED_FOR_UNALIGNED)
#if defined(__mips__) && (!defined(__mips_isa_rev) || __mips_isa_rev < 6)
#define BROTLI_USE_PACKED_FOR_UNALIGNED 1
#else
#define BROTLI_USE_PACKED_FOR_UNALIGNED 0
#endif
#endif /* defined(BROTLI_USE_PACKED_FOR_UNALIGNED) */
#if BROTLI_USE_PACKED_FOR_UNALIGNED
typedef union BrotliPackedValue {
uint16_t u16;
uint32_t u32;
uint64_t u64;
size_t szt;
} __attribute__ ((packed)) BrotliPackedValue;
static BROTLI_INLINE uint16_t BrotliUnalignedRead16(const void* p) {
const BrotliPackedValue* address = (const BrotliPackedValue*)p;
return address->u16;
}
static BROTLI_INLINE uint32_t BrotliUnalignedRead32(const void* p) {
const BrotliPackedValue* address = (const BrotliPackedValue*)p;
return address->u32;
}
static BROTLI_INLINE uint64_t BrotliUnalignedRead64(const void* p) {
const BrotliPackedValue* address = (const BrotliPackedValue*)p;
return address->u64;
}
static BROTLI_INLINE size_t BrotliUnalignedReadSizeT(const void* p) {
const BrotliPackedValue* address = (const BrotliPackedValue*)p;
return address->szt;
}
static BROTLI_INLINE void BrotliUnalignedWrite64(void* p, uint64_t v) {
BrotliPackedValue* address = (BrotliPackedValue*)p;
address->u64 = v;
}
#else /* not BROTLI_USE_PACKED_FOR_UNALIGNED */
static BROTLI_INLINE uint16_t BrotliUnalignedRead16(const void* p) {
uint16_t t;
memcpy(&t, p, sizeof t);
@@ -291,6 +345,34 @@ static BROTLI_INLINE void BrotliUnalignedWrite64(void* p, uint64_t v) {
memcpy(p, &v, sizeof v);
}
#endif /* BROTLI_USE_PACKED_FOR_UNALIGNED */
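Both variants above implement the same contract: copy N bytes from an arbitrarily aligned address. A Python analogue, illustrative only (byte order is pinned to little-endian here for determinism; the C helpers read native byte order):

```python
import struct

buf = bytes(range(16))

def unaligned_read32(data, offset):
    # Like BrotliUnalignedRead32: copy 4 bytes from any offset,
    # with no alignment requirement ('<I' = little-endian uint32).
    return struct.unpack_from("<I", data, offset)[0]

# Offset 3 is not 4-byte aligned, yet the read is well-defined:
# bytes 03 04 05 06 read little-endian.
assert unaligned_read32(buf, 3) == 0x06050403
```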
#if BROTLI_GNUC_HAS_BUILTIN(__builtin_bswap16, 4, 3, 0)
#define BROTLI_BSWAP16(V) ((uint16_t)__builtin_bswap16(V))
#else
#define BROTLI_BSWAP16(V) ((uint16_t)( \
(((V) & 0xFFU) << 8) | \
(((V) >> 8) & 0xFFU)))
#endif
#if BROTLI_GNUC_HAS_BUILTIN(__builtin_bswap32, 4, 3, 0)
#define BROTLI_BSWAP32(V) ((uint32_t)__builtin_bswap32(V))
#else
#define BROTLI_BSWAP32(V) ((uint32_t)( \
(((V) & 0xFFU) << 24) | (((V) & 0xFF00U) << 8) | \
(((V) >> 8) & 0xFF00U) | (((V) >> 24) & 0xFFU)))
#endif
#if BROTLI_GNUC_HAS_BUILTIN(__builtin_bswap64, 4, 3, 0)
#define BROTLI_BSWAP64(V) ((uint64_t)__builtin_bswap64(V))
#else
#define BROTLI_BSWAP64(V) ((uint64_t)( \
(((V) & 0xFFU) << 56) | (((V) & 0xFF00U) << 40) | \
(((V) & 0xFF0000U) << 24) | (((V) & 0xFF000000U) << 8) | \
(((V) >> 8) & 0xFF000000U) | (((V) >> 24) & 0xFF0000U) | \
(((V) >> 40) & 0xFF00U) | (((V) >> 56) & 0xFFU)))
#endif
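The fallback shift-and-mask forms above are byte-order reversals; they can be cross-checked with a small sketch (Python used purely for illustration, mirroring the macro arithmetic):

```python
def bswap16(v):
    # Mirrors the BROTLI_BSWAP16 fallback macro.
    return ((v & 0xFF) << 8) | ((v >> 8) & 0xFF)

def bswap32(v):
    # Mirrors the BROTLI_BSWAP32 fallback macro.
    return (((v & 0xFF) << 24) | ((v & 0xFF00) << 8) |
            ((v >> 8) & 0xFF00) | ((v >> 24) & 0xFF))

def bswap64(v):
    # Mirrors the BROTLI_BSWAP64 fallback macro.
    return (((v & 0xFF) << 56) | ((v & 0xFF00) << 40) |
            ((v & 0xFF0000) << 24) | ((v & 0xFF000000) << 8) |
            ((v >> 8) & 0xFF000000) | ((v >> 24) & 0xFF0000) |
            ((v >> 40) & 0xFF00) | ((v >> 56) & 0xFF))

# Each swap reverses the value's bytes.
assert bswap32(0x12345678) == 0x78563412
```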
#if BROTLI_LITTLE_ENDIAN
/* Straight endianness. Just read / write values. */
#define BROTLI_UNALIGNED_LOAD16LE BrotliUnalignedRead16
@@ -298,32 +380,20 @@ static BROTLI_INLINE void BrotliUnalignedWrite64(void* p, uint64_t v) {
#define BROTLI_UNALIGNED_LOAD64LE BrotliUnalignedRead64
#define BROTLI_UNALIGNED_STORE64LE BrotliUnalignedWrite64
#elif BROTLI_BIG_ENDIAN /* BROTLI_LITTLE_ENDIAN */
/* Explain compiler to byte-swap values. */
#define BROTLI_BSWAP16_(V) ((uint16_t)( \
(((V) & 0xFFU) << 8) | \
(((V) >> 8) & 0xFFU)))
static BROTLI_INLINE uint16_t BROTLI_UNALIGNED_LOAD16LE(const void* p) {
uint16_t value = BrotliUnalignedRead16(p);
return BROTLI_BSWAP16_(value);
return BROTLI_BSWAP16(value);
}
#define BROTLI_BSWAP32_(V) ( \
(((V) & 0xFFU) << 24) | (((V) & 0xFF00U) << 8) | \
(((V) >> 8) & 0xFF00U) | (((V) >> 24) & 0xFFU))
static BROTLI_INLINE uint32_t BROTLI_UNALIGNED_LOAD32LE(const void* p) {
uint32_t value = BrotliUnalignedRead32(p);
return BROTLI_BSWAP32_(value);
return BROTLI_BSWAP32(value);
}
#define BROTLI_BSWAP64_(V) ( \
(((V) & 0xFFU) << 56) | (((V) & 0xFF00U) << 40) | \
(((V) & 0xFF0000U) << 24) | (((V) & 0xFF000000U) << 8) | \
(((V) >> 8) & 0xFF000000U) | (((V) >> 24) & 0xFF0000U) | \
(((V) >> 40) & 0xFF00U) | (((V) >> 56) & 0xFFU))
static BROTLI_INLINE uint64_t BROTLI_UNALIGNED_LOAD64LE(const void* p) {
uint64_t value = BrotliUnalignedRead64(p);
return BROTLI_BSWAP64_(value);
return BROTLI_BSWAP64(value);
}
static BROTLI_INLINE void BROTLI_UNALIGNED_STORE64LE(void* p, uint64_t v) {
uint64_t value = BROTLI_BSWAP64_(v);
uint64_t value = BROTLI_BSWAP64(v);
BrotliUnalignedWrite64(p, value);
}
#else /* BROTLI_LITTLE_ENDIAN */
@@ -599,13 +669,14 @@ BROTLI_UNUSED_FUNCTION void BrotliSuppressUnusedFunctions(void) {
#undef BROTLI_TEST
#endif
#if BROTLI_GNUC_HAS_ATTRIBUTE(model, 3, 0, 3)
#if !defined(BROTLI_MODEL) && BROTLI_GNUC_HAS_ATTRIBUTE(model, 3, 0, 3) && \
!defined(BROTLI_TARGET_IA64) && !defined(BROTLI_TARGET_LOONGARCH64)
#define BROTLI_MODEL(M) __attribute__((model(M)))
#else
#define BROTLI_MODEL(M) /* M */
#endif
#if BROTLI_GNUC_HAS_ATTRIBUTE(cold, 4, 3, 0)
#if !defined(BROTLI_COLD) && BROTLI_GNUC_HAS_ATTRIBUTE(cold, 4, 3, 0)
#define BROTLI_COLD __attribute__((cold))
#else
#define BROTLI_COLD /* cold */

View File

@@ -275,7 +275,7 @@ static BROTLI_BOOL ParseDictionary(const uint8_t* encoded, size_t size,
size_t pos = 0;
uint32_t chunk_size = 0;
size_t total_prefix_suffix_count = 0;
size_t trasform_list_start[SHARED_BROTLI_NUM_DICTIONARY_CONTEXTS];
size_t transform_list_start[SHARED_BROTLI_NUM_DICTIONARY_CONTEXTS];
uint16_t temporary_prefix_suffix_table[256];
/* Skip magic header bytes. */
@@ -329,7 +329,7 @@ static BROTLI_BOOL ParseDictionary(const uint8_t* encoded, size_t size,
for (i = 0; i < dict->num_transform_lists; i++) {
BROTLI_BOOL ok = BROTLI_FALSE;
size_t prefix_suffix_count = 0;
trasform_list_start[i] = pos;
transform_list_start[i] = pos;
dict->transforms_instances[i].prefix_suffix_map =
temporary_prefix_suffix_table;
ok = ParseTransformsList(
@@ -347,7 +347,7 @@ static BROTLI_BOOL ParseDictionary(const uint8_t* encoded, size_t size,
total_prefix_suffix_count = 0;
for (i = 0; i < dict->num_transform_lists; i++) {
size_t prefix_suffix_count = 0;
size_t position = trasform_list_start[i];
size_t position = transform_list_start[i];
uint16_t* prefix_suffix_map =
&dict->prefix_suffix_maps[total_prefix_suffix_count];
BROTLI_BOOL ok = ParsePrefixSuffixTable(

View File

@@ -18,7 +18,7 @@
BrotliEncoderVersion methods. */
#define BROTLI_VERSION_MAJOR 1
#define BROTLI_VERSION_MINOR 1
#define BROTLI_VERSION_MINOR 2
#define BROTLI_VERSION_PATCH 0
#define BROTLI_VERSION BROTLI_MAKE_HEX_VERSION( \
@@ -32,9 +32,9 @@
- interfaces not changed -> current:revision+1:age
*/
#define BROTLI_ABI_CURRENT 2
#define BROTLI_ABI_CURRENT 3
#define BROTLI_ABI_REVISION 0
#define BROTLI_ABI_AGE 1
#define BROTLI_ABI_AGE 2
#if BROTLI_VERSION_MAJOR != (BROTLI_ABI_CURRENT - BROTLI_ABI_AGE)
#error ABI/API version inconsistency
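The version bump and the `#error` guard above follow libtool-style versioning (current:revision:age). A sketch of the consistency check, with the 24/12-bit field layout of `BROTLI_MAKE_HEX_VERSION` stated as an assumption since the macro body is not shown in this diff:

```python
def make_hex_version(major, minor, patch):
    # Assumed field layout: 8-bit major, 12-bit minor, 12-bit patch.
    return (major << 24) | (minor << 12) | patch

MAJOR, MINOR, PATCH = 1, 2, 0      # values from this diff
CURRENT, REVISION, AGE = 3, 0, 2   # libtool-style ABI triple from this diff

# The #error guard enforces MAJOR == CURRENT - AGE.
assert MAJOR == CURRENT - AGE
assert make_hex_version(MAJOR, MINOR, PATCH) == 0x1002000
```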

View File

@@ -484,7 +484,7 @@ static BROTLI_INLINE int BrotliCopyPreloadedSymbolsToU8(const HuffmanCode* table
/* Calculate range where CheckInputAmount is always true.
Start with the number of bytes we can read. */
int64_t new_lim = br->guard_in - br->next_in;
/* Convert to bits, since sybmols use variable number of bits. */
/* Convert to bits, since symbols use variable number of bits. */
new_lim *= 8;
/* At most 15 bits per symbol, so this is safe. */
new_lim /= 15;
@@ -1539,7 +1539,7 @@ static BROTLI_BOOL AttachCompoundDictionary(
return BROTLI_TRUE;
}
static void EnsureCoumpoundDictionaryInitialized(BrotliDecoderState* state) {
static void EnsureCompoundDictionaryInitialized(BrotliDecoderState* state) {
BrotliDecoderCompoundDictionary* addon = state->compound_dictionary;
/* 256 = (1 << 8) slots in block map. */
int block_bits = 8;
@@ -1560,7 +1560,7 @@ static BROTLI_BOOL InitializeCompoundDictionaryCopy(BrotliDecoderState* s,
int address, int length) {
BrotliDecoderCompoundDictionary* addon = s->compound_dictionary;
int index;
EnsureCoumpoundDictionaryInitialized(s);
EnsureCompoundDictionaryInitialized(s);
index = addon->block_map[address >> addon->block_bits];
while (address >= addon->chunk_offsets[index + 1]) index++;
if (addon->total_size < address + length) return BROTLI_FALSE;

View File

@@ -343,22 +343,22 @@ static uint32_t ComputeDistanceShortcut(const size_t block_start,
const size_t max_backward_limit,
const size_t gap,
const ZopfliNode* nodes) {
const size_t clen = ZopfliNodeCopyLength(&nodes[pos]);
const size_t ilen = nodes[pos].dcode_insert_length & 0x7FFFFFF;
const size_t c_len = ZopfliNodeCopyLength(&nodes[pos]);
const size_t i_len = nodes[pos].dcode_insert_length & 0x7FFFFFF;
const size_t dist = ZopfliNodeCopyDistance(&nodes[pos]);
/* Since |block_start + pos| is the end position of the command, the copy part
starts from |block_start + pos - clen|. Distances that are greater than
starts from |block_start + pos - c_len|. Distances that are greater than
this or greater than |max_backward_limit| + |gap| are static dictionary
references, and do not update the last distances.
Also distance code 0 (last distance) does not update the last distances. */
if (pos == 0) {
return 0;
} else if (dist + clen <= block_start + pos + gap &&
} else if (dist + c_len <= block_start + pos + gap &&
dist <= max_backward_limit + gap &&
ZopfliNodeDistanceCode(&nodes[pos]) > 0) {
return (uint32_t)pos;
} else {
return nodes[pos - clen - ilen].u.shortcut;
return nodes[pos - c_len - i_len].u.shortcut;
}
}
@@ -376,12 +376,12 @@ static void ComputeDistanceCache(const size_t pos,
int idx = 0;
size_t p = nodes[pos].u.shortcut;
while (idx < 4 && p > 0) {
const size_t ilen = nodes[p].dcode_insert_length & 0x7FFFFFF;
const size_t clen = ZopfliNodeCopyLength(&nodes[p]);
const size_t i_len = nodes[p].dcode_insert_length & 0x7FFFFFF;
const size_t c_len = ZopfliNodeCopyLength(&nodes[p]);
const size_t dist = ZopfliNodeCopyDistance(&nodes[p]);
dist_cache[idx++] = (int)dist;
/* Because of prerequisite, p >= clen + ilen >= 2. */
p = nodes[p - clen - ilen].u.shortcut;
/* Because of prerequisite, p >= c_len + i_len >= 2. */
p = nodes[p - c_len - i_len].u.shortcut;
}
for (; idx < 4; ++idx) {
dist_cache[idx] = *starting_dist_cache++;
@@ -433,7 +433,7 @@ static size_t UpdateNodes(
const CompoundDictionary* addon = &params->dictionary.compound;
size_t gap = addon->total_size;
BROTLI_DCHECK(cur_ix_masked + max_len <= ringbuffer_mask);
BROTLI_DCHECK(cur_ix_masked + max_len <= ringbuffer_mask + 1);
EvaluateNode(block_start + stream_offset, pos, max_backward_limit, gap,
starting_dist_cache, model, queue, nodes);
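The `ringbuffer_mask` to `ringbuffer_mask + 1` relaxation in the DCHECK above follows the commit's argument: when `position` is 0, `max_length` may equal the full ring-buffer size. A boundary sketch (Python, illustrative values only):

```python
ringbuffer_size = 1 << 4              # any power of two
ringbuffer_mask = ringbuffer_size - 1

# Worst case permitted by the encoder: position 0 with a match
# spanning the whole ring buffer.
position, max_length = 0, ringbuffer_size

# position + max_length <= ringbuffer_size == ringbuffer_mask + 1,
# so the old check `<= ringbuffer_mask` was off by one.
assert position + max_length == ringbuffer_mask + 1
assert not (position + max_length <= ringbuffer_mask)
```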

View File

@@ -545,7 +545,7 @@ static BROTLI_INLINE void FindCompoundDictionaryMatch(
source = (const uint8_t*)BROTLI_UNALIGNED_LOAD_PTR((const uint8_t**)tail);
}
BROTLI_DCHECK(cur_ix_masked + max_length <= ring_buffer_mask);
BROTLI_DCHECK(cur_ix_masked + max_length <= ring_buffer_mask + 1);
for (i = 0; i < 4; ++i) {
const size_t distance = (size_t)distance_cache[i];
@@ -656,7 +656,7 @@ static BROTLI_INLINE size_t FindAllCompoundDictionaryMatches(
source = (const uint8_t*)BROTLI_UNALIGNED_LOAD_PTR((const uint8_t**)tail);
}
BROTLI_DCHECK(cur_ix_masked + max_length <= ring_buffer_mask);
BROTLI_DCHECK(cur_ix_masked + max_length <= ring_buffer_mask + 1);
while (item == 0) {
size_t offset;

View File

@@ -213,7 +213,7 @@ static BROTLI_INLINE void FN(FindLongestMatch)(
out->len = 0;
out->len_code_delta = 0;
BROTLI_DCHECK(cur_ix_masked + max_length <= ring_buffer_mask);
BROTLI_DCHECK(cur_ix_masked + max_length <= ring_buffer_mask + 1);
/* Try last distance first. */
for (i = 0; i < NUM_LAST_DISTANCES_TO_CHECK; ++i) {


@@ -178,7 +178,7 @@ static BROTLI_INLINE void FN(FindLongestMatch)(
out->len = 0;
out->len_code_delta = 0;
-  BROTLI_DCHECK(cur_ix_masked + max_length <= ring_buffer_mask);
+  BROTLI_DCHECK(cur_ix_masked + max_length <= ring_buffer_mask + 1);
/* Try last distance first. */
for (i = 0; i < (size_t)self->num_last_distances_to_check_; ++i) {


@@ -195,7 +195,7 @@ static BROTLI_INLINE void FN(FindLongestMatch)(
out->len = 0;
out->len_code_delta = 0;
-  BROTLI_DCHECK(cur_ix_masked + max_length <= ring_buffer_mask);
+  BROTLI_DCHECK(cur_ix_masked + max_length <= ring_buffer_mask + 1);
/* Try last distance first. */
for (i = 0; i < (size_t)self->num_last_distances_to_check_; ++i) {


@@ -178,7 +178,7 @@ static BROTLI_INLINE void FN(FindLongestMatch)(
out->len = 0;
out->len_code_delta = 0;
-  BROTLI_DCHECK(cur_ix_masked + max_length <= ring_buffer_mask);
+  BROTLI_DCHECK(cur_ix_masked + max_length <= ring_buffer_mask + 1);
/* Try last distance first. */
for (i = 0; i < (size_t)self->num_last_distances_to_check_; ++i) {


@@ -165,7 +165,7 @@ static BROTLI_INLINE void FN(FindLongestMatch)(
size_t cached_backward = (size_t)distance_cache[0];
size_t prev_ix = cur_ix - cached_backward;
-  BROTLI_DCHECK(cur_ix_masked + max_length <= ring_buffer_mask);
+  BROTLI_DCHECK(cur_ix_masked + max_length <= ring_buffer_mask + 1);
out->len_code_delta = 0;
if (prev_ix < cur_ix) {


@@ -170,7 +170,7 @@ static BROTLI_INLINE void FN(FindLongestMatch)(
out->len = 0;
out->len_code_delta = 0;
-  BROTLI_DCHECK(cur_ix_masked + max_length <= ring_buffer_mask);
+  BROTLI_DCHECK(cur_ix_masked + max_length <= ring_buffer_mask + 1);
/* Try last distance first. */
for (i = 0; i < (size_t)self->num_last_distances_to_check_; ++i) {


@@ -7,13 +7,13 @@
module(
name = "brotli_fuzz",
-    version = "1.1.0",
+    version = "1.2.0",
repo_name = "org_brotli_fuzz",
)
bazel_dep(name = "rules_fuzzing", version = "0.5.2")
-bazel_dep(name = "brotli", version = "1.1.0", repo_name = "org_brotli")
+bazel_dep(name = "brotli", version = "1.2.0", repo_name = "org_brotli")
local_path_override(
module_name = "brotli",
path = "../..",


@@ -283,6 +283,10 @@ typedef struct BrotliEncoderPreparedDictionaryStruct
* passed to @p alloc_func and @p free_func when they are called. @p free_func
* has to return without doing anything when asked to free a NULL pointer.
*
* @warning Created instance is "lean"; it does not contain copy of @p data,
* rather it contains only pointer to it; therefore,
* @p data @b MUST outlive the created instance.
*
* @param type type of dictionary stored in data
* @param data_size size of @p data buffer
* @param data pointer to the dictionary data


@@ -269,20 +269,20 @@
#if defined(_WIN32)
#if defined(BROTLICOMMON_SHARED_COMPILATION)
#define BROTLI_COMMON_API __declspec(dllexport)
-#else
+#else  /* !BROTLICOMMON_SHARED_COMPILATION */
#define BROTLI_COMMON_API __declspec(dllimport)
#endif /* BROTLICOMMON_SHARED_COMPILATION */
#if defined(BROTLIDEC_SHARED_COMPILATION)
#define BROTLI_DEC_API __declspec(dllexport)
-#else
+#else  /* !BROTLIDEC_SHARED_COMPILATION */
#define BROTLI_DEC_API __declspec(dllimport)
#endif /* BROTLIDEC_SHARED_COMPILATION */
#if defined(BROTLIENC_SHARED_COMPILATION)
#define BROTLI_ENC_API __declspec(dllexport)
-#else
+#else  /* !BROTLIENC_SHARED_COMPILATION */
#define BROTLI_ENC_API __declspec(dllimport)
#endif /* BROTLIENC_SHARED_COMPILATION */
-#else  /* _WIN32 */
+#else  /* !_WIN32 */
#define BROTLI_COMMON_API BROTLI_PUBLIC
#define BROTLI_DEC_API BROTLI_PUBLIC
#define BROTLI_ENC_API BROTLI_PUBLIC


@@ -0,0 +1,20 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net9.0</TargetFramework>
<EnableDefaultCompileItems>false</EnableDefaultCompileItems>
</PropertyGroup>
<ItemGroup>
<!-- Both regular sources and test sources -->
<Compile Include="org\brotli\dec\*.cs" />
</ItemGroup>
<ItemGroup>
<PackageReference Include="Microsoft.NET.Test.Sdk" Version="18.0.0" />
<!-- Stick to NUnit3 until tests are regenerated -->
<PackageReference Include="NUnit" Version="3.14.0" />
<PackageReference Include="NUnit3TestAdapter" Version="5.2.0" />
</ItemGroup>
</Project>

csharp/brotlidec.csproj Normal file

@@ -0,0 +1,13 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net9.0</TargetFramework>
<EnableDefaultCompileItems>false</EnableDefaultCompileItems>
</PropertyGroup>
<ItemGroup>
<Compile Include="org\brotli\dec\*.cs" />
<Compile Remove="**\*Test.cs" />
</ItemGroup>
</Project>


@@ -63,7 +63,7 @@ echo "${CODE//$PATTERN/$REPLACEMENT}" > org/brotli/dec/BrotliInputStream.cs
#-------------------------------------------------------------------------------
-echo -e '\n\033[01;33mDowloading dependencies.\033[00m'
+echo -e '\n\033[01;33mDownloading dependencies.\033[00m'
cd build
nuget install NUnit -Version 3.6.1


@@ -535,6 +535,11 @@ Result is only valid if quality is at least \fC2\fP and, in case \fBBrotliEncode
.PP
Prepares a shared dictionary from the given file format for the encoder\&. \fCalloc_func\fP and \fCfree_func\fP \fBMUST\fP be both zero or both non-zero\&. In the case they are both zero, default memory allocators are used\&. \fCopaque\fP is passed to \fCalloc_func\fP and \fCfree_func\fP when they are called\&. \fCfree_func\fP has to return without doing anything when asked to free a NULL pointer\&.
.PP
\fBWarning:\fP
.RS 4
Created instance is 'lean'; it does not contain copy of \fCdata\fP, rather it contains only pointer to it; therefore, \fCdata\fP \fBMUST\fP outlive the created instance\&.
.RE
.PP
\fBParameters:\fP
.RS 4
\fItype\fP type of dictionary stored in data


@@ -1879,8 +1879,8 @@ func copyRawBytes(s *_State, data []int8, offset int32, length int32) int32 {
}
for len > 0 {
var chunkLen int32 = readInput(s, data, pos, len)
-		if len < -1 {
-			return len
+		if chunkLen < -1 {
+			return chunkLen
}
if chunkLen <= 0 {
return makeError(s, -16)

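The fix above is the classic copy-paste bug of testing the remaining length instead of the value just returned: `readInput` can report an error code smaller than -1, and that code, not `len`, must be propagated. A hedged Python sketch of the corrected loop (the `read_input` callback and the -16 truncation code mirror the Go snippet; names are illustrative):

```python
def copy_raw_bytes(read_input, data, pos, length):
    """Drain `length` bytes via read_input(buf, pos, n), which returns
    the number of bytes read, 0 at EOF, or an error code below -1."""
    ERROR_TRUNCATED_INPUT = -16  # mirrors makeError(s, -16) in the diff
    while length > 0:
        chunk_len = read_input(data, pos, length)
        if chunk_len < -1:
            return chunk_len  # propagate the chunk's error, not `length`
        if chunk_len <= 0:
            return ERROR_TRUNCATED_INPUT  # EOF before all bytes arrived
        pos += chunk_len
        length -= chunk_len
    return 0
```

Before the fix, the `len < -1` guard never fired (remaining length is non-negative), so error codes fell through to the EOF branch and were misreported as truncated input.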

@@ -1 +1,3 @@
module github.com/google/brotli/go/brotli
go 1.21


@@ -9,7 +9,6 @@ import (
"bytes"
"errors"
"io"
-	"io/ioutil"
"strconv"
"unsafe"
)
@@ -127,5 +126,5 @@ func Decode(encodedData []byte) ([]byte, error) {
func DecodeWithRawDictionary(encodedData []byte, dictionary []byte) ([]byte, error) {
r := NewReaderWithOptions(bytes.NewReader(encodedData), ReaderOptions{RawDictionary: dictionary})
defer r.Close()
-	return ioutil.ReadAll(r)
+	return io.ReadAll(r)
}


@@ -1 +1,3 @@
module github.com/google/brotli/go/cbrotli
go 1.21


@@ -7,7 +7,7 @@
module(
name = "brotli_java",
-    version = "1.1.0",
+    version = "1.2.0",
repo_name = "org_brotli_java",
)
@@ -16,7 +16,7 @@ bazel_dep(name = "rules_jvm_external", version = "6.7")
bazel_dep(name = "rules_kotlin", version = "2.1.4")
bazel_dep(name = "platforms", version = "0.0.11")
-bazel_dep(name = "brotli", version = "1.1.0", repo_name = "org_brotli")
+bazel_dep(name = "brotli", version = "1.2.0", repo_name = "org_brotli")
local_path_override(
module_name = "brotli",
path = "..",


@@ -72,10 +72,10 @@ final class BitReader {
while (bytesInBuffer < CAPACITY) {
final int spaceLeft = CAPACITY - bytesInBuffer;
final int len = Utils.readInput(s, s.byteBuffer, bytesInBuffer, spaceLeft);
-      // EOF is -1 in Java, but 0 in C#.
      if (len < BROTLI_ERROR) {
        return len;
      }
+      // EOF is -1 in Java, but 0 in C#.
if (len <= 0) {
s.endOfStreamReached = 1;
s.tailBytes = bytesInBuffer;
@@ -276,10 +276,10 @@ final class BitReader {
// Now it is possible to copy bytes directly.
while (len > 0) {
final int chunkLen = Utils.readInput(s, data, pos, len);
-      // EOF is -1 in Java, but 0 in C#.
-      if (len < BROTLI_ERROR) {
-        return len;
+      if (chunkLen < BROTLI_ERROR) {
+        return chunkLen;
      }
+      // EOF is -1 in Java, but 0 in C#.
if (chunkLen <= 0) {
return Utils.makeError(s, BROTLI_ERROR_TRUNCATED_INPUT);
}


@@ -6,42 +6,73 @@
package org.brotli.dec;
-/**
- * Possible errors from decoder.
- */
-public class BrotliError {
+/** Possible errors from decoder. */
+public final class BrotliError {
/** Success; anything greater is also success. */
public static final int BROTLI_OK = 0;
/** Success; decoder has finished decompressing the input. */
public static final int BROTLI_OK_DONE = BROTLI_OK + 1;
/** Success; decoder has more output to produce. */
public static final int BROTLI_OK_NEED_MORE_OUTPUT = BROTLI_OK + 2;
-  // It is important that actual error codes are LESS than -1!
+  /** Error code threshold; actual error codes are LESS than -1! */
public static final int BROTLI_ERROR = -1;
/** Stream error: corrupted code length table. */
public static final int BROTLI_ERROR_CORRUPTED_CODE_LENGTH_TABLE = BROTLI_ERROR - 1;
/** Stream error: corrupted context map. */
public static final int BROTLI_ERROR_CORRUPTED_CONTEXT_MAP = BROTLI_ERROR - 2;
/** Stream error: corrupted Huffman code histogram. */
public static final int BROTLI_ERROR_CORRUPTED_HUFFMAN_CODE_HISTOGRAM = BROTLI_ERROR - 3;
/** Stream error: corrupted padding bits. */
public static final int BROTLI_ERROR_CORRUPTED_PADDING_BITS = BROTLI_ERROR - 4;
/** Stream error: corrupted reserved bit. */
public static final int BROTLI_ERROR_CORRUPTED_RESERVED_BIT = BROTLI_ERROR - 5;
/** Stream error: duplicate simple Huffman symbol. */
public static final int BROTLI_ERROR_DUPLICATE_SIMPLE_HUFFMAN_SYMBOL = BROTLI_ERROR - 6;
/** Stream error: exuberant nibble. */
public static final int BROTLI_ERROR_EXUBERANT_NIBBLE = BROTLI_ERROR - 7;
/** Stream error: invalid backward reference. */
public static final int BROTLI_ERROR_INVALID_BACKWARD_REFERENCE = BROTLI_ERROR - 8;
/** Stream error: invalid metablock length. */
public static final int BROTLI_ERROR_INVALID_METABLOCK_LENGTH = BROTLI_ERROR - 9;
/** Stream error: invalid window bits. */
public static final int BROTLI_ERROR_INVALID_WINDOW_BITS = BROTLI_ERROR - 10;
/** Stream error: negative distance. */
public static final int BROTLI_ERROR_NEGATIVE_DISTANCE = BROTLI_ERROR - 11;
/** Stream error: read after end of input buffer. */
public static final int BROTLI_ERROR_READ_AFTER_END = BROTLI_ERROR - 12;
/** IO error: read failed. */
public static final int BROTLI_ERROR_READ_FAILED = BROTLI_ERROR - 13;
/** IO error: symbol out of range. */
public static final int BROTLI_ERROR_SYMBOL_OUT_OF_RANGE = BROTLI_ERROR - 14;
/** Stream error: truncated input. */
public static final int BROTLI_ERROR_TRUNCATED_INPUT = BROTLI_ERROR - 15;
/** Stream error: unused bytes after end of stream. */
public static final int BROTLI_ERROR_UNUSED_BYTES_AFTER_END = BROTLI_ERROR - 16;
/** Stream error: unused Huffman space. */
public static final int BROTLI_ERROR_UNUSED_HUFFMAN_SPACE = BROTLI_ERROR - 17;
/** Exception code threshold. */
public static final int BROTLI_PANIC = -21;
/** Exception: stream is already closed. */
public static final int BROTLI_PANIC_ALREADY_CLOSED = BROTLI_PANIC - 1;
/** Exception: max distance is too small. */
public static final int BROTLI_PANIC_MAX_DISTANCE_TOO_SMALL = BROTLI_PANIC - 2;
/** Exception: state is not fresh. */
public static final int BROTLI_PANIC_STATE_NOT_FRESH = BROTLI_PANIC - 3;
/** Exception: state is not initialized. */
public static final int BROTLI_PANIC_STATE_NOT_INITIALIZED = BROTLI_PANIC - 4;
/** Exception: state is not uninitialized. */
public static final int BROTLI_PANIC_STATE_NOT_UNINITIALIZED = BROTLI_PANIC - 5;
/** Exception: too many dictionary chunks. */
public static final int BROTLI_PANIC_TOO_MANY_DICTIONARY_CHUNKS = BROTLI_PANIC - 6;
/** Exception: unexpected state. */
public static final int BROTLI_PANIC_UNEXPECTED_STATE = BROTLI_PANIC - 7;
/** Exception: unreachable code. */
public static final int BROTLI_PANIC_UNREACHABLE = BROTLI_PANIC - 8;
/** Exception: unaligned copy bytes. */
public static final int BROTLI_PANIC_UNALIGNED_COPY_BYTES = BROTLI_PANIC - 9;
/** Non-instantiable. */
private BrotliError() {}
}
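The constants divide the status space by two thresholds: non-negative codes are success, codes below `BROTLI_ERROR` (-1) are stream or IO errors, and codes at or below `BROTLI_PANIC` (-21) indicate programming errors or unexpected states. An illustrative Python classifier built on those thresholds (not part of the library API):

```python
BROTLI_OK = 0
BROTLI_ERROR = -1   # actual errors are strictly less than this
BROTLI_PANIC = -21  # actual panics are at or below this

def classify(code):
    """Map a status code to its band, per the thresholds above."""
    if code >= BROTLI_OK:
        return "ok"
    if code <= BROTLI_PANIC:
        return "panic"   # programming error / unexpected state
    if code < BROTLI_ERROR:
        return "error"   # malformed stream or IO failure
    return "unclassified"  # -1 itself is only the threshold

print(classify(1), classify(-16), classify(-22))  # ok error panic
```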


@@ -16,6 +16,7 @@ import java.io.InputStream;
*/
public class BrotliInputStream extends InputStream {
/** Default size of internal buffer (used for faster byte-by-byte reading). */
public static final int DEFAULT_INTERNAL_BUFFER_SIZE = 256;
/**
@@ -93,14 +94,17 @@ public class BrotliInputStream extends InputStream {
}
}
/** Attach "RAW" dictionary (chunk) to decoder. */
public void attachDictionaryChunk(byte[] data) {
Decode.attachDictionaryChunk(state, data);
}
/** Request decoder to produce output as soon as it is available. */
public void enableEagerOutput() {
Decode.enableEagerOutput(state);
}
/** Enable "large window" stream feature. */
public void enableLargeWindow() {
Decode.enableLargeWindow(state);
}


@@ -85,6 +85,7 @@ public class DecodeTest {
@Test
public void testUkkonooa() throws IOException {
// typo:off
checkDecodeResource(
"ukko nooa, ukko nooa oli kunnon mies, kun han meni saunaan, "
+ "pisti laukun naulaan, ukko nooa, ukko nooa oli kunnon mies.",
@@ -92,6 +93,7 @@ public class DecodeTest {
+ "6\u000E\u009C\u00E0\u0090\u0003\u00F7\u008B\u009E8\u00E6\u00B6\u0000\u00AB\u00C3\u00CA"
+ "\u00A0\u00C2\u00DAf6\u00DC\u00CD\u0080\u008D.!\u00D7n\u00E3\u00EAL\u00B8\u00F0\u00D2"
+ "\u00B8\u00C7\u00C2pM:\u00F0i~\u00A1\u00B8Es\u00AB\u00C4W\u001E");
// typo:on
}
@Test
@@ -142,7 +144,6 @@ public class DecodeTest {
public void testUtils() {
new Context();
new Decode();
-    new Dictionary();
new Huffman();
}
}


@@ -6,6 +6,7 @@ import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
/** Toy decoder CLI; mostly used for simple benchmarking. */
public class Decoder {
private static long decodeBytes(InputStream input, OutputStream output, byte[] buffer)
throws IOException {
@@ -53,6 +54,7 @@ public class Decoder {
System.out.println(mbDecoded / timeDelta + " MiB/s");
}
/** CLI entry point. */
public static void main(String... args) throws IOException {
if (args.length != 2 && args.length != 3) {
System.out.println("Usage: decoder <compressed_in> <decompressed_out> [repeat]");
@@ -69,4 +71,7 @@ public class Decoder {
decompress(args[0], args[1], buffer);
}
}
/** Non-instantiable. */
private Decoder() {}
}


@@ -41,6 +41,7 @@ public final class Dictionary {
private static final int DICTIONARY_DEBUG = Utils.isDebugMode();
/** Initialize static dictionary. */
public static void setData(ByteBuffer newData, int[] newSizeBits) {
if (DICTIONARY_DEBUG != 0) {
if ((Utils.isDirect(newData) == 0) || (Utils.isReadOnly(newData) == 0)) {
@@ -90,6 +91,7 @@ public final class Dictionary {
Dictionary.data = newData;
}
/** Access static dictionary. */
public static ByteBuffer getData() {
if (data.capacity() != 0) {
return data;
@@ -100,4 +102,7 @@ public final class Dictionary {
/* Might have been set when {@link DictionaryData} was loaded.*/
return data;
}
/** Non-instantiable. */
private Dictionary() {}
}

File diff suppressed because one or more lines are too long


@@ -283,6 +283,7 @@ public class SynthTest {
*/
compressed,
true,
// typo:off
"|categories|categories | categories |ategories|Categories |categories the | categories|s cat"
+ "egories |categories of |Categories|categories and |tegories|categorie|, categories |catego"
+ "ries, | Categories |categories in |categories to |e categories |categories\"|categories.|c"
@@ -301,6 +302,7 @@ public class SynthTest {
+ "\"|categoriesous |CATEGORIES, |Categories='| Categories,| CATEGORIES=\"| CATEGORIES, |CATE"
+ "GORIES,|CATEGORIES(|CATEGORIES. | CATEGORIES.|CATEGORIES='| CATEGORIES. | Categories=\"| C"
+ "ATEGORIES='| Categories='"
// typo:on
);
}


@@ -64,6 +64,7 @@ final class Transform {
private static final int SHIFT_ALL = SHIFT_FIRST + 1;
// Bundle of 0-terminated strings.
// typo:off
private static final String PREFIX_SUFFIX_SRC = "# #s #, #e #.# the #.com/#\u00C2\u00A0# of # and"
+ " # in # to #\"#\">#\n#]# for # a # that #. # with #'# from # by #. The # on # as # is #ing"
+ " #\n\t#:#ed #(# at #ly #=\"# of the #. This #,# not #er #al #='#ful #ive #less #est #ize #"
@@ -73,6 +74,7 @@ final class Transform {
+ " ; < ' != > ?! 4 @ 4 2 & A *# ( B C& ) % ) !*# *-% A +! *. D! %' & E *6 F "
+ " G% ! *A *% H! D I!+! J!+ K +- *4! A L!*4 M N +6 O!*% +.! K *G P +%( ! G *D +D "
+ " Q +# *K!*G!+D!+# +G +A +4!+% +K!+4!*D!+K!*K";
// typo:on
private static void unpackTransforms(byte[] prefixSuffix,
int[] prefixSuffixHeads, int[] transforms, String prefixSuffixSrc, String transformsSrc) {


@@ -1706,8 +1706,8 @@ internal fun copyRawBytes(s: State, data: ByteArray, offset: Int, length: Int):
}
while (len > 0) {
val chunkLen: Int = readInput(s, data, pos, len);
-    if (len < -1) {
-      return len;
+    if (chunkLen < -1) {
+      return chunkLen;
}
if (chunkLen <= 0) {
return makeError(s, -16);


@@ -5,10 +5,10 @@
<parent>
<groupId>org.brotli</groupId>
<artifactId>parent</artifactId>
-    <version>1.0.0-SNAPSHOT</version>
+    <version>1.2.0-SNAPSHOT</version>
</parent>
<artifactId>dec</artifactId>
-  <version>1.0.0-SNAPSHOT</version>
+  <version>1.2.0-SNAPSHOT</version>
<packaging>jar</packaging>
<name>${project.groupId}:${project.artifactId}</name>
@@ -43,7 +43,7 @@
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
-        <version>3.1.2</version>
+        <version>3.5.4</version>
<configuration>
<systemPropertyVariables>
<BROTLI_ENABLE_ASSERTS>true</BROTLI_ENABLE_ASSERTS>
@@ -53,7 +53,7 @@
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-source-plugin</artifactId>
-        <version>3.3.0</version>
+        <version>3.3.1</version>
<configuration>
<finalName>${project.groupId}.${project.artifactId}-${project.version}</finalName>
</configuration>
@@ -78,7 +78,7 @@
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-javadoc-plugin</artifactId>
-        <version>3.5.0</version>
+        <version>3.12.0</version>
<configuration>
<source>8</source>
<finalName>${project.groupId}.${project.artifactId}-${project.version}</finalName>
@@ -102,7 +102,7 @@
<plugin>
<groupId>org.apache.felix</groupId>
<artifactId>maven-bundle-plugin</artifactId>
-        <version>5.1.9</version>
+        <version>6.0.0</version>
<configuration>
<archive>
<forced>true</forced>
@@ -134,7 +134,7 @@
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
-        <version>3.3.0</version>
+        <version>3.4.2</version>
<configuration>
<archive>
<manifestFile>${manifestfile}</manifestFile>


@@ -5,10 +5,10 @@
<parent>
<groupId>org.brotli</groupId>
<artifactId>parent</artifactId>
-    <version>1.0.0-SNAPSHOT</version>
+    <version>1.2.0-SNAPSHOT</version>
</parent>
<artifactId>integration</artifactId>
-  <version>1.0.0-SNAPSHOT</version>
+  <version>1.2.0-SNAPSHOT</version>
<packaging>jar</packaging>
<name>${project.groupId}:${project.artifactId}</name>
@@ -17,7 +17,7 @@
<dependency>
<groupId>org.brotli</groupId>
<artifactId>dec</artifactId>
-      <version>1.0.0-SNAPSHOT</version>
+      <version>1.2.0-SNAPSHOT</version>
</dependency>
</dependencies>
@@ -27,7 +27,7 @@
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
-        <version>1.5.0</version>
+        <version>3.5.1</version>
<executions>
<execution>
<id>data</id>


@@ -4,7 +4,7 @@
<groupId>org.brotli</groupId>
<artifactId>parent</artifactId>
-  <version>1.0.0-SNAPSHOT</version>
+  <version>1.2.0-SNAPSHOT</version>
<packaging>pom</packaging>
<name>${project.groupId}:${project.artifactId}</name>
@@ -57,7 +57,7 @@
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-gpg-plugin</artifactId>
-        <version>1.5</version>
+        <version>3.2.8</version>
<executions>
<execution>
<id>sign-artifacts</id>
@@ -78,16 +78,16 @@
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
-        <version>3.1</version>
+        <version>3.14.0</version>
        <configuration>
-          <source>1.7</source>
-          <target>1.7</target>
+          <source>8</source>
+          <target>8</target>
</configuration>
</plugin>
<plugin>
<groupId>org.sonatype.plugins</groupId>
<artifactId>nexus-staging-maven-plugin</artifactId>
-        <version>1.6.13</version>
+        <version>1.7.0</version>
<extensions>true</extensions>
<configuration>
<serverId>ossrh</serverId>


@@ -15,7 +15,7 @@ import java.util.ArrayList;
/**
* Base class for InputStream / Channel implementations.
*/
-public class Decoder {
+public class Decoder implements AutoCloseable {
private static final ByteBuffer EMPTY_BUFFER = ByteBuffer.allocate(0);
private final ReadableByteChannel source;
private final DecoderJNI.Wrapper decoder;
@@ -129,7 +129,8 @@ public class Decoder {
return limit;
}
-  void close() throws IOException {
+  @Override
+  public void close() throws IOException {
if (closed) {
return;
}
@@ -140,9 +141,9 @@ public class Decoder {
/** Decodes the given data buffer starting at offset till length. */
public static byte[] decompress(byte[] data, int offset, int length) throws IOException {
-    DecoderJNI.Wrapper decoder = new DecoderJNI.Wrapper(length);
-    ArrayList<byte[]> output = new ArrayList<byte[]>();
+    ArrayList<byte[]> output = new ArrayList<>();
    int totalOutputSize = 0;
+    DecoderJNI.Wrapper decoder = new DecoderJNI.Wrapper(length);
try {
decoder.getInputBuffer().put(data, offset, length);
decoder.push(length);


@@ -122,14 +122,5 @@ public class DecoderJNI {
nativeDestroy(context);
context[0] = 0;
}
-    @Override
-    protected void finalize() throws Throwable {
-      if (context[0] != 0) {
-        /* TODO(eustas): log resource leak? */
-        destroy();
-      }
-      super.finalize();
-    }
}
}
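Dropping `finalize()` shifts cleanup responsibility to explicit `close()` calls; the wrapper classes now implement `AutoCloseable`, so Java callers can rely on try-with-resources instead of the garbage collector. The same deterministic-release pattern, sketched as a Python context manager with a stand-in for the native handle (illustrative, not the JNI wrapper itself):

```python
class NativeHandle:
    """Toy analogue of a JNI wrapper that owns a native resource."""
    def __init__(self):
        self.context = 1          # stand-in for the native pointer

    def close(self):
        if self.context:          # idempotent, like the wrappers' close()
            self.context = 0      # stand-in for nativeDestroy(context)

    def __enter__(self):
        return self

    def __exit__(self, *exc):
        self.close()              # released on scope exit, no finalizer

with NativeHandle() as h:
    assert h.context == 1         # usable inside the scope
assert h.context == 0             # deterministically released
```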


@@ -42,8 +42,8 @@ public class BrotliEncoderChannelTest extends BrotliJniTestBase {
try {
List<String> entries = BundleHelper.listEntries(bundle);
for (String entry : entries) {
-        suite.addTest(new ChannleTestCase(entry, TestMode.WRITE_ALL));
-        suite.addTest(new ChannleTestCase(entry, TestMode.WRITE_CHUNKS));
+        suite.addTest(new ChannelTestCase(entry, TestMode.WRITE_ALL));
+        suite.addTest(new ChannelTestCase(entry, TestMode.WRITE_CHUNKS));
}
} finally {
bundle.close();
@@ -52,10 +52,10 @@ public class BrotliEncoderChannelTest extends BrotliJniTestBase {
}
/** Test case with a unique name. */
-  static class ChannleTestCase extends TestCase {
+  static class ChannelTestCase extends TestCase {
final String entryName;
final TestMode mode;
-    ChannleTestCase(String entryName, TestMode mode) {
+    ChannelTestCase(String entryName, TestMode mode) {
super("BrotliEncoderChannelTest." + entryName + "." + mode.name());
this.entryName = entryName;
this.mode = mode;


@@ -6,10 +6,10 @@
package org.brotli.wrapper.enc;
-import org.brotli.enc.PreparedDictionary;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.channels.Channels;
+import org.brotli.enc.PreparedDictionary;
/**
* Output stream that wraps native brotli encoder.


@@ -17,7 +17,7 @@ import org.brotli.enc.PreparedDictionary;
/**
* Base class for OutputStream / Channel implementations.
*/
-public class Encoder {
+public class Encoder implements AutoCloseable {
private final WritableByteChannel destination;
private final List<PreparedDictionary> dictionaries;
private final EncoderJNI.Wrapper encoder;
@@ -65,12 +65,6 @@ public class Encoder {
public Parameters() { }
-    private Parameters(Parameters other) {
-      this.quality = other.quality;
-      this.lgwin = other.lgwin;
-      this.mode = other.mode;
-    }
/**
* Setup encoder quality.
*
@@ -199,7 +193,8 @@ public class Encoder {
encode(EncoderJNI.Operation.FLUSH);
}
-  void close() throws IOException {
+  @Override
+  public void close() throws IOException {
if (closed) {
return;
}
@@ -221,10 +216,10 @@ public class Encoder {
return empty;
}
/* data.length > 0 */
+    ArrayList<byte[]> output = new ArrayList<>();
+    int totalOutputSize = 0;
    EncoderJNI.Wrapper encoder =
        new EncoderJNI.Wrapper(length, params.quality, params.lgwin, params.mode);
-    ArrayList<byte[]> output = new ArrayList<byte[]>();
-    int totalOutputSize = 0;
try {
encoder.getInputBuffer().put(data, offset, length);
encoder.push(EncoderJNI.Operation.FINISH, length);


@@ -6,9 +6,9 @@
package org.brotli.wrapper.enc;
-import org.brotli.enc.PreparedDictionary;
import java.io.IOException;
import java.nio.ByteBuffer;
+import org.brotli.enc.PreparedDictionary;
/**
* JNI wrapper for brotli encoder.
@@ -28,7 +28,7 @@ class EncoderJNI {
FINISH
}
-  private static class PreparedDictionaryImpl implements PreparedDictionary {
+  private static class PreparedDictionaryImpl implements AutoCloseable, PreparedDictionary {
private ByteBuffer data;
/** Reference to (non-copied) LZ data. */
private ByteBuffer rawData;
@@ -43,15 +43,11 @@ class EncoderJNI {
}
@Override
-    protected void finalize() throws Throwable {
-      try {
-        ByteBuffer data = this.data;
-        this.data = null;
-        this.rawData = null;
-        nativeDestroyDictionary(data);
-      } finally {
-        super.finalize();
-      }
+    public void close() {
+      ByteBuffer data = this.data;
+      this.data = null;
+      this.rawData = null;
+      nativeDestroyDictionary(data);
}
}
@@ -168,14 +164,5 @@ class EncoderJNI {
nativeDestroy(context);
context[0] = 0;
}
-    @Override
-    protected void finalize() throws Throwable {
-      if (context[0] != 0) {
-        /* TODO(eustas): log resource leak? */
-        destroy();
-      }
-      super.finalize();
-    }
}
}

File diff suppressed because one or more lines are too long

js/decode.min.js vendored

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long


@@ -239,6 +239,7 @@ testAllTransforms10() {
*/
compressed,
true,
// typo:off
'|categories|categories | categories |ategories|Categories |categories the '
+ '| categories|s categories |categories of |Categories|categories and |teg'
+ 'ories|categorie|, categories |categories, | Categories |categories in |c'
@@ -261,6 +262,7 @@ testAllTransforms10() {
+ '|CATEGORIES, |Categories=\'| Categories,| CATEGORIES="| CATEGORIES, |CAT'
+ 'EGORIES,|CATEGORIES(|CATEGORIES. | CATEGORIES.|CATEGORIES=\'| CATEGORIES'
+ '. | Categories="| CATEGORIES=\'| Categories=\''
// typo:on
);
},


@@ -229,6 +229,7 @@ testAllTransforms10() {
*/
compressed,
true,
// typo:off
'|categories|categories | categories |ategories|Categories |categories the '
+ '| categories|s categories |categories of |Categories|categories and |teg'
+ 'ories|categorie|, categories |categories, | Categories |categories in |c'
@@ -251,6 +252,7 @@ testAllTransforms10() {
+ '|CATEGORIES, |Categories=\'| Categories,| CATEGORIES="| CATEGORIES, |CAT'
+ 'EGORIES,|CATEGORIES(|CATEGORIES. | CATEGORIES.|CATEGORIES=\'| CATEGORIES'
+ '. | Categories="| CATEGORIES=\'| Categories=\''
// typo:on
);
},

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long


@@ -1,7 +1,7 @@
This directory contains the code for the Python `brotli` module,
-`bro.py` tool, and roundtrip tests.
+and roundtrip tests.
-Only Python 2.7+ is supported.
+Only Python 3.10+ is supported.
We provide a `Makefile` to simplify common development commands.
@@ -17,13 +17,17 @@ following command from this directory:
$ make install
-If you already have native Brotli installed on your system and want to use this one instead of the vendored sources, you
-should set the `USE_SYSTEM_BROTLI=1` environment variable when building the wheel, like this:
+If you already have native Brotli installed on your system and want to use
+this one instead of the vendored sources, you should set
+the `USE_SYSTEM_BROTLI=1` environment variable when building the wheel,
+like this:
$ USE_SYSTEM_BROTLI=1 pip install brotli --no-binary brotli
-Brotli is found via the `pkg-config` utility. Moreover, you must build all 3 `brotlicommon`, `brotlienc`, and `brotlidec`
-components. If you're installing brotli from the package manager, you need the development package, like this on Fedora:
+Brotli is found via the `pkg-config` utility. Moreover, you must build
+all 3 `brotlicommon`, `brotlienc`, and `brotlidec` components. If you're
+installing brotli from the package manager, you need the development package,
+like this on Fedora:
$ dnf install brotli brotli-devel
@@ -45,8 +49,8 @@ able to edit the source files, you can use the `setuptools`
### Code Style
-Brotli's code follows the [Google Python Style Guide][]. To
-automatically format your code, first install [YAPF][]:
+Brotli code follows the [Google Python Style Guide][].
+To automatically format your code, first install [YAPF][]:
$ pip install yapf
@@ -56,7 +60,6 @@ Then, to format all files in the project, you can run:
See the [YAPF usage][] documentation for more information.
[PyPI]: https://pypi.org/project/Brotli/
[development mode]: https://setuptools.readthedocs.io/en/latest/setuptools.html#development-mode
[Google Python Style Guide]: https://google.github.io/styleguide/pyguide.html

File diff suppressed because it is too large Load Diff


@@ -1,194 +0,0 @@
#! /usr/bin/env python
"""Compression/decompression utility using the Brotli algorithm."""
# Note: Python2 has been deprecated long ago, but some projects out in
# the wide world may still use it nevertheless. This should not
# deprive them from being able to run Brotli.
from __future__ import print_function
import argparse
import os
import platform
import sys
import brotli
# default values of encoder parameters
_DEFAULT_PARAMS = {
'mode': brotli.MODE_GENERIC,
'quality': 11,
'lgwin': 22,
'lgblock': 0,
}
def get_binary_stdio(stream):
"""Return the specified stdin/stdout/stderr stream.
If the stdio stream requested (i.e. sys.(stdin|stdout|stderr))
has been replaced with a stream object that does not have a `.buffer`
attribute, this will return the original stdio stream's buffer, i.e.
`sys.__(stdin|stdout|stderr)__.buffer`.
Args:
stream: One of 'stdin', 'stdout', 'stderr'.
Returns:
The stream, as a 'raw' buffer object (i.e. io.BufferedIOBase subclass
instance such as io.Bufferedreader/io.BufferedWriter), suitable for
reading/writing binary data from/to it.
"""
if stream == 'stdin': stdio = sys.stdin
elif stream == 'stdout': stdio = sys.stdout
elif stream == 'stderr': stdio = sys.stderr
else:
raise ValueError('invalid stream name: %s' % (stream,))
if sys.version_info[0] < 3:
if sys.platform == 'win32':
# set I/O stream binary flag on python2.x (Windows)
runtime = platform.python_implementation()
if runtime == 'PyPy':
# the msvcrt trick doesn't work in pypy, so use fdopen().
mode = 'rb' if stream == 'stdin' else 'wb'
stdio = os.fdopen(stdio.fileno(), mode, 0)
else:
# this works with CPython -- untested on other implementations
import msvcrt
msvcrt.setmode(stdio.fileno(), os.O_BINARY)
return stdio
else:
try:
return stdio.buffer
except AttributeError:
# The Python reference explains
# (-> https://docs.python.org/3/library/sys.html#sys.stdin)
# that the `.buffer` attribute might not exist, since
# the standard streams might have been replaced by something else
# (such as an `io.StringIO()` - perhaps via
# `contextlib.redirect_stdout()`).
# We fall back to the original stdio in these cases.
if stream == 'stdin': return sys.__stdin__.buffer
if stream == 'stdout': return sys.__stdout__.buffer
if stream == 'stderr': return sys.__stderr__.buffer
assert False, 'Impossible Situation.'
def main(args=None):
parser = argparse.ArgumentParser(
prog=os.path.basename(__file__), description=__doc__)
parser.add_argument(
'--version', action='version', version=brotli.version)
parser.add_argument(
'-i',
'--input',
metavar='FILE',
type=str,
dest='infile',
help='Input file',
default=None)
parser.add_argument(
'-o',
'--output',
metavar='FILE',
type=str,
dest='outfile',
help='Output file',
default=None)
parser.add_argument(
'-f',
'--force',
action='store_true',
help='Overwrite existing output file',
default=False)
parser.add_argument(
'-d',
'--decompress',
action='store_true',
help='Decompress input file',
default=False)
params = parser.add_argument_group('optional encoder parameters')
params.add_argument(
'-m',
'--mode',
metavar='MODE',
type=int,
choices=[0, 1, 2],
help='The compression mode can be 0 for generic input, '
'1 for UTF-8 encoded text, or 2 for WOFF 2.0 font data. '
'Defaults to 0.')
params.add_argument(
'-q',
'--quality',
metavar='QUALITY',
type=int,
choices=list(range(0, 12)),
help='Controls the compression-speed vs compression-density '
'tradeoff. The higher the quality, the slower the '
'compression. Range is 0 to 11. Defaults to 11.')
params.add_argument(
'--lgwin',
metavar='LGWIN',
type=int,
choices=list(range(10, 25)),
help='Base 2 logarithm of the sliding window size. Range is '
'10 to 24. Defaults to 22.')
params.add_argument(
'--lgblock',
metavar='LGBLOCK',
type=int,
choices=[0] + list(range(16, 25)),
help='Base 2 logarithm of the maximum input block size. '
'Range is 16 to 24. If set to 0, the value will be set based '
'on the quality. Defaults to 0.')
# set default values using global _DEFAULT_PARAMS dictionary
parser.set_defaults(**_DEFAULT_PARAMS)
options = parser.parse_args(args=args)
if options.infile:
try:
with open(options.infile, 'rb') as infile:
data = infile.read()
except OSError:
parser.error('Could not read --infile: %s' % (options.infile,))
else:
if sys.stdin.isatty():
# interactive console, just quit
parser.error('No input (called from interactive terminal).')
infile = get_binary_stdio('stdin')
data = infile.read()
if options.outfile:
# Caution! If `options.outfile` is a broken symlink, the write
# follows the symlink target (os.path.exists reports False for it).
if os.path.exists(options.outfile) and not options.force:
parser.error(('Target --outfile=%s already exists, '
'but --force was not requested.') % (options.outfile,))
outfile = open(options.outfile, 'wb')
did_open_outfile = True
else:
outfile = get_binary_stdio('stdout')
did_open_outfile = False
try:
try:
if options.decompress:
data = brotli.decompress(data)
else:
data = brotli.compress(
data,
mode=options.mode,
quality=options.quality,
lgwin=options.lgwin,
lgblock=options.lgblock)
outfile.write(data)
finally:
if did_open_outfile: outfile.close()
except brotli.error as e:
parser.exit(1,
'bro: error: %s: %s\n' % (e, options.infile or '{stdin}'))
if __name__ == '__main__':
main()
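The `parser.set_defaults(**_DEFAULT_PARAMS)` pattern used above can be shown in isolation; the defaults dict here is illustrative, not the real `_DEFAULT_PARAMS`:

```python
import argparse

# Hypothetical defaults dict standing in for _DEFAULT_PARAMS.
DEFAULTS = {'mode': 0, 'quality': 11, 'lgwin': 22, 'lgblock': 0}

parser = argparse.ArgumentParser()
parser.add_argument('-q', '--quality', type=int, choices=range(12))
parser.add_argument('--lgwin', type=int, choices=range(10, 25))
parser.add_argument('-m', '--mode', type=int, choices=[0, 1, 2])
parser.add_argument('--lgblock', type=int)
parser.set_defaults(**DEFAULTS)       # dict values become argument defaults

opts = parser.parse_args([])          # no flags given: dict values win
assert opts.quality == 11 and opts.lgwin == 22
opts = parser.parse_args(['-q', '5']) # an explicit flag overrides the default
assert opts.quality == 5
```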

View File

@@ -1,126 +1,83 @@
"""Common utilities for Brotli tests."""
from __future__ import print_function
import filecmp
import glob
import itertools
import os
import pathlib
import sys
import sysconfig
import tempfile
import unittest
project_dir = os.path.abspath(os.path.join(__file__, '..', '..', '..'))
test_dir = os.getenv("BROTLI_TESTS_PATH")
BRO_ARGS = [os.getenv("BROTLI_WRAPPER")]
project_dir = str(pathlib.PurePath(__file__).parent.parent.parent)
runtime_dir = os.getenv('TEST_SRCDIR')
test_dir = os.getenv('BROTLI_TESTS_PATH')
# Fallbacks
if test_dir is None:
if test_dir and runtime_dir:
test_dir = os.path.join(runtime_dir, test_dir)
elif test_dir is None:
test_dir = os.path.join(project_dir, 'tests')
if BRO_ARGS[0] is None:
python_exe = sys.executable or 'python'
bro_path = os.path.join(project_dir, 'python', 'bro.py')
BRO_ARGS = [python_exe, bro_path]
# Get the platform/version-specific build folder.
# By default, the distutils build base is in the same location as setup.py.
platform_lib_name = 'lib.{platform}-{version[0]}.{version[1]}'.format(
platform=sysconfig.get_platform(), version=sys.version_info)
platform=sysconfig.get_platform(), version=sys.version_info
)
build_dir = os.path.join(project_dir, 'bin', platform_lib_name)
# Prepend the build folder to sys.path and the PYTHONPATH environment variable.
if build_dir not in sys.path:
sys.path.insert(0, build_dir)
TEST_ENV = os.environ.copy()
sys.path.insert(0, build_dir)
TEST_ENV = dict(os.environ)
if 'PYTHONPATH' not in TEST_ENV:
TEST_ENV['PYTHONPATH'] = build_dir
TEST_ENV['PYTHONPATH'] = build_dir
else:
TEST_ENV['PYTHONPATH'] = build_dir + os.pathsep + TEST_ENV['PYTHONPATH']
TEST_ENV['PYTHONPATH'] = build_dir + os.pathsep + TEST_ENV['PYTHONPATH']
TESTDATA_DIR = os.path.join(test_dir, 'testdata')
TESTDATA_FILES = [
'empty', # Empty file
'10x10y', # Small text
'alice29.txt', # Large text
'random_org_10k.bin', # Small data
'mapsdatazrh', # Large data
'ukkonooa', # Poem
'cp1251-utf16le', # Codepage 1251 table saved in UTF16-LE encoding
'cp852-utf8', # Codepage 852 table saved in UTF8 encoding
]
# Some files might be missing in a lightweight sources pack.
TESTDATA_PATH_CANDIDATES = [
os.path.join(TESTDATA_DIR, f) for f in TESTDATA_FILES
]
TESTDATA_PATHS = [
path for path in TESTDATA_PATH_CANDIDATES if os.path.isfile(path)
]
TESTDATA_PATHS_FOR_DECOMPRESSION = glob.glob(
os.path.join(TESTDATA_DIR, '*.compressed'))
TEMP_DIR = tempfile.mkdtemp()
def gather_text_inputs():
"""Discover inputs for decompression tests."""
all_inputs = [
'empty', # Empty file
'10x10y', # Small text
'alice29.txt', # Large text
'random_org_10k.bin', # Small data
'mapsdatazrh', # Large data
'ukkonooa', # Poem
'cp1251-utf16le', # Codepage 1251 table saved in UTF16-LE encoding
'cp852-utf8', # Codepage 852 table saved in UTF8 encoding
# TODO(eustas): add test on already compressed content
]
# Filter out non-existent files, e.g. in a lightweight sources pack.
return [
f for f in all_inputs if os.path.isfile(os.path.join(TESTDATA_DIR, f))
]
def get_temp_compressed_name(filename):
return os.path.join(TEMP_DIR, os.path.basename(filename + '.bro'))
def gather_compressed_inputs():
"""Discover inputs for compression tests."""
candidates = glob.glob(os.path.join(TESTDATA_DIR, '*.compressed'))
pairs = [(f, f.split('.compressed')[0]) for f in candidates]
existing = [
pair
for pair in pairs
if os.path.isfile(pair[0]) and os.path.isfile(pair[1])
]
return [
(os.path.basename(pair[0]), os.path.basename(pair[1]))
for pair in existing
]
def get_temp_uncompressed_name(filename):
return os.path.join(TEMP_DIR, os.path.basename(filename + '.unbro'))
def take_input(input_name):
with open(os.path.join(TESTDATA_DIR, input_name), 'rb') as f:
return f.read()
def bind_method_args(method, *args, **kwargs):
return lambda self: method(self, *args, **kwargs)
def has_input(input_name):
return os.path.isfile(os.path.join(TESTDATA_DIR, input_name))
def generate_test_methods(test_case_class,
for_decompression=False,
variants=None):
# Add test methods for each test data file. This makes identifying problems
# with specific compression scenarios easier.
if for_decompression:
paths = TESTDATA_PATHS_FOR_DECOMPRESSION
else:
paths = TESTDATA_PATHS
opts = []
if variants:
opts_list = []
for k, v in variants.items():
opts_list.append([r for r in itertools.product([k], v)])
for o in itertools.product(*opts_list):
opts_name = '_'.join([str(i) for i in itertools.chain(*o)])
opts_dict = dict(o)
opts.append([opts_name, opts_dict])
else:
opts.append(['', {}])
for method in [m for m in dir(test_case_class) if m.startswith('_test')]:
for testdata in paths:
for (opts_name, opts_dict) in opts:
f = os.path.splitext(os.path.basename(testdata))[0]
name = 'test_{method}_{options}_{file}'.format(
method=method, options=opts_name, file=f)
func = bind_method_args(
getattr(test_case_class, method), testdata, **opts_dict)
setattr(test_case_class, name, func)
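The variant expansion in `generate_test_methods` above builds a Cartesian product of per-key option lists; a minimal sketch of that shape:

```python
import itertools

variants = {'quality': (1, 6), 'lgwin': (10, 15)}
# One (key, value) pair list per key, then the product across keys,
# matching the opts_list/opts construction above.
opts_list = [[(k, v) for v in values] for k, values in variants.items()]
combos = [dict(o) for o in itertools.product(*opts_list)]
names = ['_'.join(str(i) for i in itertools.chain(*o))
         for o in itertools.product(*opts_list)]
assert len(combos) == 4
assert {'quality': 1, 'lgwin': 10} in combos
assert 'quality_1_lgwin_10' in names
```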
class TestCase(unittest.TestCase):
def tearDown(self):
for f in TESTDATA_PATHS:
try:
os.unlink(get_temp_compressed_name(f))
except OSError:
pass
try:
os.unlink(get_temp_uncompressed_name(f))
except OSError:
pass
def assertFilesMatch(self, first, second):
self.assertTrue(
filecmp.cmp(first, second, shallow=False),
'File {} differs from {}'.format(first, second))
def chunk_input(data, chunk_size):
return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
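For instance, the slicing helper above splits a byte string into fixed-size pieces whose concatenation round-trips (the helper is restated here so the snippet is self-contained):

```python
def chunk_input(data, chunk_size):
    # Restated from _test_utils above.
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

chunks = chunk_input(b'abcdefg', 3)
assert chunks == [b'abc', b'def', b'g']   # last chunk may be short
assert b''.join(chunks) == b'abcdefg'     # lossless round trip
assert chunk_input(b'', 3) == []          # empty input yields no chunks
```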

View File

@@ -1,101 +0,0 @@
# Copyright 2016 The Brotli Authors. All rights reserved.
#
# Distributed under MIT license.
# See file LICENSE for detail or copy at https://opensource.org/licenses/MIT
import subprocess
import unittest
from . import _test_utils
BRO_ARGS = _test_utils.BRO_ARGS
TEST_ENV = _test_utils.TEST_ENV
def _get_original_name(test_data):
return test_data.split('.compressed')[0]
class TestBroDecompress(_test_utils.TestCase):
def _check_decompression(self, test_data):
# Verify decompression matches the original.
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
original = _get_original_name(test_data)
self.assertFilesMatch(temp_uncompressed, original)
def _decompress_file(self, test_data):
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
args = BRO_ARGS + ['-f', '-d', '-i', test_data, '-o', temp_uncompressed]
subprocess.check_call(args, env=TEST_ENV)
def _decompress_pipe(self, test_data):
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
args = BRO_ARGS + ['-d']
with open(temp_uncompressed, 'wb') as out_file:
with open(test_data, 'rb') as in_file:
subprocess.check_call(
args, stdin=in_file, stdout=out_file, env=TEST_ENV)
def _test_decompress_file(self, test_data):
self._decompress_file(test_data)
self._check_decompression(test_data)
def _test_decompress_pipe(self, test_data):
self._decompress_pipe(test_data)
self._check_decompression(test_data)
_test_utils.generate_test_methods(TestBroDecompress, for_decompression=True)
class TestBroCompress(_test_utils.TestCase):
VARIANTS = {'quality': (1, 6, 9, 11), 'lgwin': (10, 15, 20, 24)}
def _check_decompression(self, test_data, **kwargs):
# Write decompression to temp file and verify it matches the original.
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
temp_compressed = _test_utils.get_temp_compressed_name(test_data)
original = test_data
args = BRO_ARGS + ['-f', '-d']
args.extend(['-i', temp_compressed, '-o', temp_uncompressed])
subprocess.check_call(args, env=TEST_ENV)
self.assertFilesMatch(temp_uncompressed, original)
def _compress_file(self, test_data, **kwargs):
temp_compressed = _test_utils.get_temp_compressed_name(test_data)
args = BRO_ARGS + ['-f']
if 'quality' in kwargs:
args.extend(['-q', str(kwargs['quality'])])
if 'lgwin' in kwargs:
args.extend(['--lgwin', str(kwargs['lgwin'])])
args.extend(['-i', test_data, '-o', temp_compressed])
subprocess.check_call(args, env=TEST_ENV)
def _compress_pipe(self, test_data, **kwargs):
temp_compressed = _test_utils.get_temp_compressed_name(test_data)
args = BRO_ARGS
if 'quality' in kwargs:
args.extend(['-q', str(kwargs['quality'])])
if 'lgwin' in kwargs:
args.extend(['--lgwin', str(kwargs['lgwin'])])
with open(temp_compressed, 'wb') as out_file:
with open(test_data, 'rb') as in_file:
subprocess.check_call(
args, stdin=in_file, stdout=out_file, env=TEST_ENV)
def _test_compress_file(self, test_data, **kwargs):
self._compress_file(test_data, **kwargs)
self._check_decompression(test_data)
def _test_compress_pipe(self, test_data, **kwargs):
self._compress_pipe(test_data, **kwargs)
self._check_decompression(test_data)
_test_utils.generate_test_methods(
TestBroCompress, variants=TestBroCompress.VARIANTS)
if __name__ == '__main__':
unittest.main()

View File

@@ -3,39 +3,17 @@
# Distributed under MIT license.
# See file LICENSE for detail or copy at https://opensource.org/licenses/MIT
import unittest
import brotli
import pytest
from . import _test_utils
import brotli
class TestCompress(_test_utils.TestCase):
VARIANTS = {'quality': (1, 6, 9, 11), 'lgwin': (10, 15, 20, 24)}
def _check_decompression(self, test_data, **kwargs):
kwargs = {}
# Write decompression to temp file and verify it matches the original.
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
temp_compressed = _test_utils.get_temp_compressed_name(test_data)
original = test_data
with open(temp_uncompressed, 'wb') as out_file:
with open(temp_compressed, 'rb') as in_file:
out_file.write(brotli.decompress(in_file.read(), **kwargs))
self.assertFilesMatch(temp_uncompressed, original)
def _compress(self, test_data, **kwargs):
temp_compressed = _test_utils.get_temp_compressed_name(test_data)
with open(temp_compressed, 'wb') as out_file:
with open(test_data, 'rb') as in_file:
out_file.write(brotli.compress(in_file.read(), **kwargs))
def _test_compress(self, test_data, **kwargs):
self._compress(test_data, **kwargs)
self._check_decompression(test_data, **kwargs)
_test_utils.generate_test_methods(TestCompress, variants=TestCompress.VARIANTS)
if __name__ == '__main__':
unittest.main()
@pytest.mark.parametrize("quality", [1, 6, 9, 11])
@pytest.mark.parametrize("lgwin", [10, 15, 20, 24])
@pytest.mark.parametrize("text_name", _test_utils.gather_text_inputs())
def test_compress(quality, lgwin, text_name):
original = _test_utils.take_input(text_name)
compressed = brotli.compress(original, quality=quality, lgwin=lgwin)
decompressed = brotli.decompress(compressed)
assert original == decompressed

View File

@@ -3,92 +3,49 @@
# Distributed under MIT license.
# See file LICENSE for detail or copy at https://opensource.org/licenses/MIT
import functools
import unittest
import brotli
import pytest
from . import _test_utils
import brotli
# Do not inherit from TestCase here to ensure that test methods
# are not run automatically and instead are run as part of a specific
# configuration below.
class _TestCompressor(object):
CHUNK_SIZE = 2048
def tearDown(self):
self.compressor = None
def _check_decompression(self, test_data):
# Write decompression to temp file and verify it matches the original.
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
temp_compressed = _test_utils.get_temp_compressed_name(test_data)
original = test_data
with open(temp_uncompressed, 'wb') as out_file:
with open(temp_compressed, 'rb') as in_file:
out_file.write(brotli.decompress(in_file.read()))
self.assertFilesMatch(temp_uncompressed, original)
def _test_single_process(self, test_data):
# Write single-shot compression to temp file.
temp_compressed = _test_utils.get_temp_compressed_name(test_data)
with open(temp_compressed, 'wb') as out_file:
with open(test_data, 'rb') as in_file:
out_file.write(self.compressor.process(in_file.read()))
out_file.write(self.compressor.finish())
self._check_decompression(test_data)
def _test_multiple_process(self, test_data):
# Write chunked compression to temp file.
temp_compressed = _test_utils.get_temp_compressed_name(test_data)
with open(temp_compressed, 'wb') as out_file:
with open(test_data, 'rb') as in_file:
read_chunk = functools.partial(in_file.read, self.CHUNK_SIZE)
for data in iter(read_chunk, b''):
out_file.write(self.compressor.process(data))
out_file.write(self.compressor.finish())
self._check_decompression(test_data)
def _test_multiple_process_and_flush(self, test_data):
# Write chunked and flushed compression to temp file.
temp_compressed = _test_utils.get_temp_compressed_name(test_data)
with open(temp_compressed, 'wb') as out_file:
with open(test_data, 'rb') as in_file:
read_chunk = functools.partial(in_file.read, self.CHUNK_SIZE)
for data in iter(read_chunk, b''):
out_file.write(self.compressor.process(data))
out_file.write(self.compressor.flush())
out_file.write(self.compressor.finish())
self._check_decompression(test_data)
@pytest.mark.parametrize("quality", [1, 6, 9, 11])
@pytest.mark.parametrize("text_name", _test_utils.gather_text_inputs())
def test_single_process(quality, text_name):
original = _test_utils.take_input(text_name)
compressor = brotli.Compressor(quality=quality)
compressed = compressor.process(original)
compressed += compressor.finish()
decompressed = brotli.decompress(compressed)
assert original == decompressed
_test_utils.generate_test_methods(_TestCompressor)
@pytest.mark.parametrize("quality", [1, 6, 9, 11])
@pytest.mark.parametrize("text_name", _test_utils.gather_text_inputs())
def test_multiple_process(quality, text_name):
original = _test_utils.take_input(text_name)
chunk_size = 2048
chunks = _test_utils.chunk_input(original, chunk_size)
compressor = brotli.Compressor(quality=quality)
compressed = b''
for chunk in chunks:
compressed += compressor.process(chunk)
compressed += compressor.finish()
decompressed = brotli.decompress(compressed)
assert original == decompressed
class TestCompressorQuality1(_TestCompressor, _test_utils.TestCase):
def setUp(self):
self.compressor = brotli.Compressor(quality=1)
class TestCompressorQuality6(_TestCompressor, _test_utils.TestCase):
def setUp(self):
self.compressor = brotli.Compressor(quality=6)
class TestCompressorQuality9(_TestCompressor, _test_utils.TestCase):
def setUp(self):
self.compressor = brotli.Compressor(quality=9)
class TestCompressorQuality11(_TestCompressor, _test_utils.TestCase):
def setUp(self):
self.compressor = brotli.Compressor(quality=11)
if __name__ == '__main__':
unittest.main()
@pytest.mark.parametrize("quality", [1, 6, 9, 11])
@pytest.mark.parametrize("text_name", _test_utils.gather_text_inputs())
def test_multiple_process_and_flush(quality, text_name):
original = _test_utils.take_input(text_name)
chunk_size = 2048
chunks = _test_utils.chunk_input(original, chunk_size)
compressor = brotli.Compressor(quality=quality)
compressed = b''
for chunk in chunks:
compressed += compressor.process(chunk)
compressed += compressor.flush()
compressed += compressor.finish()
decompressed = brotli.decompress(compressed)
assert original == decompressed
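The process()/flush()/finish() streaming shape tested above has a stdlib analogue; this sketch uses zlib's compressobj as a stand-in for brotli.Compressor, with zlib's final flush() playing the role of finish():

```python
import zlib

original = b'streaming data ' * 1000

# Feed fixed-size chunks, then finalize; mirrors the chunked loops above
# with zlib.compressobj standing in for brotli.Compressor.
compressor = zlib.compressobj()
compressed = b''
for i in range(0, len(original), 2048):
    compressed += compressor.compress(original[i:i + 2048])
compressed += compressor.flush()   # analogous to Compressor.finish()
assert zlib.decompress(compressed) == original
```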

View File

@@ -3,40 +3,22 @@
# Distributed under MIT license.
# See file LICENSE for detail or copy at https://opensource.org/licenses/MIT
import unittest
import brotli
import pytest
from . import _test_utils
import brotli
def _get_original_name(test_data):
return test_data.split('.compressed')[0]
@pytest.mark.parametrize(
'compressed_name, original_name', _test_utils.gather_compressed_inputs()
)
def test_decompress(compressed_name, original_name):
compressed = _test_utils.take_input(compressed_name)
original = _test_utils.take_input(original_name)
decompressed = brotli.decompress(compressed)
assert decompressed == original
class TestDecompress(_test_utils.TestCase):
def _check_decompression(self, test_data):
# Verify decompression matches the original.
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
original = _get_original_name(test_data)
self.assertFilesMatch(temp_uncompressed, original)
def _decompress(self, test_data):
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
with open(temp_uncompressed, 'wb') as out_file:
with open(test_data, 'rb') as in_file:
out_file.write(brotli.decompress(in_file.read()))
def _test_decompress(self, test_data):
self._decompress(test_data)
self._check_decompression(test_data)
def test_garbage_appended(self):
with self.assertRaises(brotli.error):
brotli.decompress(brotli.compress(b'a') + b'a')
_test_utils.generate_test_methods(TestDecompress, for_decompression=True)
if __name__ == '__main__':
unittest.main()
def test_garbage_appended():
with pytest.raises(brotli.error):
brotli.decompress(brotli.compress(b'a') + b'a')

View File

@@ -3,99 +3,89 @@
# Distributed under MIT license.
# See file LICENSE for detail or copy at https://opensource.org/licenses/MIT
import functools
import os
import unittest
import brotli
import pytest
from . import _test_utils
import brotli
MIN_OUTPUT_BUFFER_SIZE = 32768 # Actually, several bytes less.
def _get_original_name(test_data):
return test_data.split('.compressed')[0]
@pytest.mark.parametrize(
'compressed_name, original_name', _test_utils.gather_compressed_inputs()
)
def test_decompress(compressed_name, original_name):
decompressor = brotli.Decompressor()
compressed = _test_utils.take_input(compressed_name)
original = _test_utils.take_input(original_name)
chunk_size = 1
chunks = _test_utils.chunk_input(compressed, chunk_size)
decompressed = b''
for chunk in chunks:
decompressed += decompressor.process(chunk)
assert decompressor.is_finished()
assert original == decompressed
class TestDecompressor(_test_utils.TestCase):
CHUNK_SIZE = 1
def setUp(self):
self.decompressor = brotli.Decompressor()
def tearDown(self):
self.decompressor = None
def _check_decompression(self, test_data):
# Verify decompression matches the original.
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
original = _get_original_name(test_data)
self.assertFilesMatch(temp_uncompressed, original)
def _decompress(self, test_data):
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
with open(temp_uncompressed, 'wb') as out_file:
with open(test_data, 'rb') as in_file:
read_chunk = functools.partial(in_file.read, self.CHUNK_SIZE)
for data in iter(read_chunk, b''):
out_file.write(self.decompressor.process(data))
self.assertTrue(self.decompressor.is_finished())
def _decompress_with_limit(self, test_data, max_output_length):
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
with open(temp_uncompressed, 'wb') as out_file:
with open(test_data, 'rb') as in_file:
chunk_iter = iter(functools.partial(in_file.read, 10 * 1024), b'')
while not self.decompressor.is_finished():
data = b''
if self.decompressor.can_accept_more_data():
data = next(chunk_iter, b'')
decompressed_data = self.decompressor.process(data, max_output_length=max_output_length)
self.assertTrue(len(decompressed_data) <= max_output_length)
out_file.write(decompressed_data)
self.assertIsNone(next(chunk_iter, None))
def _test_decompress(self, test_data):
self._decompress(test_data)
self._check_decompression(test_data)
def _test_decompress_with_limit(self, test_data):
self._decompress_with_limit(test_data, max_output_length=20)
self._check_decompression(test_data)
def test_too_much_input(self):
with open(os.path.join(_test_utils.TESTDATA_DIR, "zerosukkanooa.compressed"), 'rb') as in_file:
compressed = in_file.read()
self.decompressor.process(compressed[:-1], max_output_length=1)
# the following assertion checks whether the test setup is correct
self.assertTrue(not self.decompressor.can_accept_more_data())
with self.assertRaises(brotli.error):
self.decompressor.process(compressed[-1:])
def test_changing_limit(self):
test_data = os.path.join(_test_utils.TESTDATA_DIR, "zerosukkanooa.compressed")
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
with open(temp_uncompressed, 'wb') as out_file:
with open(test_data, 'rb') as in_file:
compressed = in_file.read()
uncompressed = self.decompressor.process(compressed[:-1], max_output_length=1)
self.assertTrue(len(uncompressed) <= 1)
out_file.write(uncompressed)
while not self.decompressor.can_accept_more_data():
out_file.write(self.decompressor.process(b''))
out_file.write(self.decompressor.process(compressed[-1:]))
self._check_decompression(test_data)
def test_garbage_appended(self):
with self.assertRaises(brotli.error):
self.decompressor.process(brotli.compress(b'a') + b'a')
def test_already_finished(self):
self.decompressor.process(brotli.compress(b'a'))
with self.assertRaises(brotli.error):
self.decompressor.process(b'a')
@pytest.mark.parametrize(
'compressed_name, original_name', _test_utils.gather_compressed_inputs()
)
def test_decompress_with_limit(compressed_name, original_name):
decompressor = brotli.Decompressor()
compressed = _test_utils.take_input(compressed_name)
original = _test_utils.take_input(original_name)
chunk_size = 10 * 1024
output_buffer_limit = 10922
chunks = _test_utils.chunk_input(compressed, chunk_size)
decompressed = b''
while not decompressor.is_finished():
data = b''
if decompressor.can_accept_more_data() and chunks:
data = chunks.pop(0)
decompressed_chunk = decompressor.process(
data, output_buffer_limit=output_buffer_limit
)
assert len(decompressed_chunk) <= MIN_OUTPUT_BUFFER_SIZE
decompressed += decompressed_chunk
assert not chunks
assert original == decompressed
_test_utils.generate_test_methods(TestDecompressor, for_decompression=True)
def test_too_much_input():
decompressor = brotli.Decompressor()
compressed = _test_utils.take_input('zerosukkanooa.compressed')
decompressor.process(compressed[:-1], output_buffer_limit=10240)
# The following assertion checks whether the test setup is correct.
assert not decompressor.can_accept_more_data()
with pytest.raises(brotli.error):
decompressor.process(compressed[-1:])
if __name__ == '__main__':
unittest.main()
def test_changing_limit():
decompressor = brotli.Decompressor()
input_name = 'zerosukkanooa'
compressed = _test_utils.take_input(input_name + '.compressed')
check_output = _test_utils.has_input(input_name)
decompressed = decompressor.process(
compressed[:-1], output_buffer_limit=10240
)
assert len(decompressed) <= MIN_OUTPUT_BUFFER_SIZE
while not decompressor.can_accept_more_data():
decompressed += decompressor.process(b'')
decompressed += decompressor.process(compressed[-1:])
if check_output:
original = _test_utils.take_input(input_name)
assert original == decompressed
def test_garbage_appended():
decompressor = brotli.Decompressor()
with pytest.raises(brotli.error):
decompressor.process(brotli.compress(b'a') + b'a')
def test_already_finished():
decompressor = brotli.Decompressor()
decompressor.process(brotli.compress(b'a'))
with pytest.raises(brotli.error):
decompressor.process(b'a')
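The bounded-output decompression loop above (`output_buffer_limit` plus `can_accept_more_data`) has a rough stdlib analogue in zlib's `max_length` and `unconsumed_tail`; a sketch under that substitution:

```python
import zlib

original = b'bounded output ' * 2000
compressed = zlib.compress(original)

# Drain bounded chunks, feeding back any input zlib has not yet consumed;
# this parallels the output_buffer_limit loop above.
decompressor = zlib.decompressobj()
decompressed = b''
pending = compressed
for _ in range(10000):              # sanity bound instead of `while True`
    chunk = decompressor.decompress(pending, 1024)  # at most 1024 bytes out
    assert len(chunk) <= 1024
    decompressed += chunk
    if decompressor.eof:
        break
    pending = decompressor.unconsumed_tail  # input not yet consumed
assert decompressed == original
```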

View File

@@ -7,14 +7,14 @@
module(
name = "brotli_research",
version = "1.1.0",
version = "1.2.0",
repo_name = "org_brotli_research",
)
bazel_dep(name = "divsufsort", version = "2.0.1")
bazel_dep(name = "esaxx", version = "20250106.0")
bazel_dep(name = "brotli", version = "1.1.0", repo_name = "org_brotli")
bazel_dep(name = "brotli", version = "1.2.0", repo_name = "org_brotli")
local_path_override(
module_name = "brotli",
path = "..",

View File

@@ -8,6 +8,7 @@ I found the following issues with the Brotli format:
- The block type code is useless if NBLTYPES==2, you would only need 1 symbol
anyway, so why don't you just switch to "the other" type?
"""
# ruff: noqa
import struct
from operator import itemgetter, methodcaller
from itertools import accumulate, repeat
@@ -1286,8 +1287,9 @@ class WordList:
return word.encode('utf8')
#Super compact form of action table.
#_ means space, .U means UpperCaseAll, U(w) means UpperCaseFirst
# Super compact form of action table.
# _ means space, .U means UpperCaseAll, U(w) means UpperCaseFirst
# typo:off
actionTable = r"""
0:w 25:w+_for_ 50:w+\n\t 75:w+. This_100:w+ize_
1:w+_ 26:w[3:] 51:w+: 76:w+, 101:w.U+.
@@ -1315,6 +1317,7 @@ class WordList:
23:w[:-3] 48:w[:-7] 98:_+w+=\'
24:w+] 49:w[:-1]+ing_ 74:U(w)+\' 99:U(w)+,
"""
# typo:on
def compileActions(self):
"""Build the action table from the text above

scripts/check_typos.sh Executable file
View File

@@ -0,0 +1,10 @@
#!/bin/bash
HERE=$(realpath "$(dirname "$0")")
PROJECT_DIR=$(realpath "${HERE}/..")
SRC_EXT="bazel|bzl|c|cc|cmake|gni|h|html|in|java|js|m|md|nix|py|rst|sh|ts|txt|yaml|yml"
cd "${PROJECT_DIR}"
sources=$(find . -type f | sort | grep -E "\.(${SRC_EXT})$" | grep -v -E "^(./)?tests/testdata/" | grep -v -E "\.min\.js$" | grep -v -E "brotli_dictionary\.txt$")
echo "Checking sources:"
echo "${sources}"
typos -c "${HERE}/typos.toml" ${sources}

View File

@@ -18,13 +18,11 @@ for line in lines:
if appendix_a_found:
if re_data_line.match(line) is not None:
data = line.strip()
for i in range(32):
dictionary.append(int(data[2 * i:2 * i + 2], 16))
dictionary.extend(int(data[2 * i:2 * i + 2], 16) for i in range(32))
if len(dictionary) == 122784:
break
else:
if line.startswith("Appendix A."):
appendix_a_found = True
elif line.startswith("Appendix A."):
appendix_a_found = True
bin_path = "dictionary.bin"

View File

@@ -40,13 +40,12 @@ for b in data:
is_skip = False
hi.append(unichr(cntr))
cntr = skip_flip_offset + 1
elif value >= 0x80:
cntr += 1
else:
if value >= 0x80:
cntr += 1
else:
is_skip = True
hi.append(unichr(cntr))
cntr = skip_flip_offset + 1
is_skip = True
hi.append(unichr(cntr))
cntr = skip_flip_offset + 1
hi.append(unichr(cntr))
low0 = low[0:len(low) // 2]
@@ -56,15 +55,15 @@ low1 = low[len(low) // 2:len(low)]
def escape(chars):
result = []
for c in chars:
if "\r" == c:
if c == "\r":
result.append("\\r")
elif "\n" == c:
elif c == "\n":
result.append("\\n")
elif "\t" == c:
elif c == "\t":
result.append("\\t")
elif "\"" == c:
elif c == "\"":
result.append("\\\"")
elif "\\" == c:
elif c == "\\":
result.append("\\\\")
elif ord(c) < 32 or ord(c) >= 127:
result.append("\\u%04X" % ord(c))

View File

@@ -7,5 +7,5 @@ Name: libbrotlicommon
URL: https://github.com/google/brotli
Description: Brotli common dictionary library
Version: @PACKAGE_VERSION@
Libs: -L${libdir} -lbrotlicommon
Libs: -L${libdir} -lbrotlicommon @libm@
Cflags: -I${includedir}

View File

@@ -8,5 +8,5 @@ URL: https://github.com/google/brotli
Description: Brotli decoder library
Version: @PACKAGE_VERSION@
Libs: -L${libdir} -lbrotlidec
Requires.private: libbrotlicommon >= 1.1.0
Requires.private: libbrotlicommon >= 1.2.0
Cflags: -I${includedir}

View File

@@ -8,5 +8,5 @@ URL: https://github.com/google/brotli
Description: Brotli encoder library
Version: @PACKAGE_VERSION@
Libs: -L${libdir} -lbrotlienc
Requires.private: libbrotlicommon >= 1.1.0
Requires.private: libbrotlicommon >= 1.2.0
Cflags: -I${includedir}

scripts/typos.toml Normal file
View File

@@ -0,0 +1,17 @@
[default]
extend-ignore-re = [
"(?Rm)^.*// notypo$", # disable check in current line
"(?s)(#|//)\\s*typo:off.*?\\n\\s*(#|//)\\s*typo:on", # disable check in block
"0x[0-9a-fA-F]+[ ,u]", # hexadecimal literal
"\\W2-nd\\W", # second
"\\W2\\^nd\\W", # second with superscript
]
[default.extend-words]
sais = "sais" # SAIS library
uncompressible = "uncompressible" # personal choice
flate = "flate" # compression algorithm
[default.extend-identifiers]
gl_pathc = "gl_pathc" # glob_t
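The block-disable rule in the config above can be exercised with Python's `re` module; note the real `typos` tool uses Rust regex syntax (e.g. the `(?Rm)` flags in the line rule), but `(?s)` behaves the same in both:

```python
import re

# The typo:off / typo:on block rule from typos.toml, as a Python sketch.
# (?s) lets '.' span newlines, so everything between the markers is covered.
block = re.compile(r"(?s)(#|//)\s*typo:off.*?\n\s*(#|//)\s*typo:on")

text = "code\n# typo:off\nteh wierd words\n# typo:on\nmore code"
assert block.search(text) is not None
# Without a closing marker the rule does not match, so checking resumes.
assert block.search("# typo:off\nteh") is None
```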

View File

@@ -4,16 +4,11 @@
# See file LICENSE for detail or copy at https://opensource.org/licenses/MIT
import os
import platform
import re
import unittest
try:
from setuptools import Extension
from setuptools import setup
except:
from distutils.core import Extension
from distutils.core import setup
from setuptools import Extension
from setuptools import setup
from distutils.command.build_ext import build_ext
from distutils import errors
from distutils import dep_util
@@ -24,7 +19,7 @@ from distutils import log
CURR_DIR = os.path.abspath(os.path.dirname(os.path.realpath(__file__)))
def bool_from_environ(key: str):
def bool_from_environ(key):
value = os.environ.get(key)
if not value:
return False
@@ -32,7 +27,7 @@ def bool_from_environ(key: str):
return True
if value == "0":
return False
raise ValueError(f"Environment variable {key} has invalid value {value}. Please set it to 1, 0 or an empty string")
raise ValueError("Environment variable {} has invalid value {}. Please set it to 1, 0 or an empty string".format(key, value))
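The environment-flag helper above maps unset or empty to False, "1" to True, "0" to False, and rejects anything else; a standalone sketch (the variable name `DEMO_BROTLI_FLAG` is illustrative):

```python
import os

def bool_from_environ(key):
    # Restated from setup.py above: unset/'' -> False, '1' -> True, '0' -> False.
    value = os.environ.get(key)
    if not value:
        return False
    if value == "1":
        return True
    if value == "0":
        return False
    raise ValueError(
        "Environment variable {} has invalid value {}. "
        "Please set it to 1, 0 or an empty string".format(key, value))

os.environ['DEMO_BROTLI_FLAG'] = '1'
assert bool_from_environ('DEMO_BROTLI_FLAG') is True
os.environ['DEMO_BROTLI_FLAG'] = ''
assert bool_from_environ('DEMO_BROTLI_FLAG') is False
try:
    os.environ['DEMO_BROTLI_FLAG'] = 'yes'
    bool_from_environ('DEMO_BROTLI_FLAG')
except ValueError:
    pass
else:
    raise AssertionError('expected ValueError for an invalid value')
```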
def read_define(path, macro):
@@ -58,8 +53,7 @@ def get_version():
def get_test_suite():
test_loader = unittest.TestLoader()
test_suite = test_loader.discover("python", pattern="*_test.py")
return test_suite
return test_loader.discover("python", pattern="*_test.py")
class BuildExt(build_ext):
@@ -82,21 +76,15 @@ class BuildExt(build_ext):
if not (self.force or dep_util.newer_group(depends, ext_path, "newer")):
log.debug("skipping '%s' extension (up-to-date)", ext.name)
return
else:
log.info("building '%s' extension", ext.name)
log.info("building '%s' extension", ext.name)
c_sources = []
for source in ext.sources:
if source.endswith(".c"):
c_sources.append(source)
c_sources = [source for source in ext.sources if source.endswith(".c")]
extra_args = ext.extra_compile_args or []
objects = []
macros = ext.define_macros[:]
if platform.system() == "Darwin":
macros.append(("OS_MACOSX", "1"))
elif self.compiler.compiler_type == "mingw32":
if self.compiler.compiler_type == "mingw32":
# On Windows Python 2.7, pyconfig.h defines "hypot" as "_hypot",
# This clashes with GCC's cmath, and causes compilation errors when
# building under MinGW: http://bugs.python.org/issue11566
@@ -142,7 +130,7 @@ class BuildExt(build_ext):
)
NAME = "Brotli"
NAME = "brotli"
VERSION = get_version()
@@ -160,19 +148,20 @@ CLASSIFIERS = [
"Development Status :: 4 - Beta",
"Environment :: Console",
"Intended Audience :: Developers",
"License :: OSI Approved :: MIT License",
# Deprecated, see https://packaging.python.org/en/latest/guides/writing-pyproject-toml/#license for details.
# "License :: OSI Approved :: MIT License",
"Operating System :: MacOS :: MacOS X",
"Operating System :: Microsoft :: Windows",
"Operating System :: POSIX :: Linux",
"Programming Language :: C",
"Programming Language :: C++",
"Programming Language :: Python",
"Programming Language :: Python :: 2",
"Programming Language :: Python :: 2.7",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.3",
"Programming Language :: Python :: 3.4",
"Programming Language :: Python :: 3.5",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: 3.13",
"Programming Language :: Python :: 3.14",
"Programming Language :: Unix Shell",
"Topic :: Software Development :: Libraries",
"Topic :: Software Development :: Libraries :: Python Modules",
@@ -190,7 +179,7 @@ USE_SYSTEM_BROTLI = bool_from_environ("USE_SYSTEM_BROTLI")
if USE_SYSTEM_BROTLI:
import pkgconfig
REQUIRED_BROTLI_SYSTEM_LIBRARIES = ["libbrotlicommon", "libbrotlienc", "libbrotlidec"]
define_macros = []