Compare commits

...

41 Commits

Author SHA1 Message Date
Evgenii Kliuchnikov
8ca2312c61 fix release workflow
PiperOrigin-RevId: 822073417
2025-10-21 05:37:07 -07:00
Evgenii Kliuchnikov
ee771daf20 fix copy-paste in Java decoder
PiperOrigin-RevId: 822024938
2025-10-21 02:42:59 -07:00
Evgenii Kliuchnikov
42aee32891 try to fix release workflow
PiperOrigin-RevId: 822012047
2025-10-21 02:03:38 -07:00
Evgenii Kliuchnikov
392c06bac0 redesign release resource uploading
PiperOrigin-RevId: 821982935
2025-10-21 00:22:30 -07:00
Evgenii Kliuchnikov
1964cdb1b9 ramp up all GH actions plugins
PiperOrigin-RevId: 821598646
2025-10-20 05:07:13 -07:00
Evgenii Kliuchnikov
61605b1cb3 pick VCPKG patches
PiperOrigin-RevId: 821593009
2025-10-20 04:44:24 -07:00
Evgenii Kliuchnikov
4b0f27b6f9 pick changes from Alpine patch
PiperOrigin-RevId: 816164347
2025-10-07 05:39:06 -07:00
Evgenii Kliuchnikov
1e4425a372 pick changes from Debian patch
PiperOrigin-RevId: 816157554
2025-10-07 05:16:20 -07:00
Copybara-Service
f038020bd7 Merge pull request #1346 from google:dependabot/github_actions/softprops/action-gh-release-2.3.4
PiperOrigin-RevId: 816151932
2025-10-07 04:56:02 -07:00
Copybara-Service
4d5a32bf45 Merge pull request #1347 from google:dependabot/github_actions/ossf/scorecard-action-2.4.3
PiperOrigin-RevId: 816151799
2025-10-07 04:54:57 -07:00
Evgenii Kliuchnikov
34e43eb020 fix typos
PiperOrigin-RevId: 815676548
2025-10-06 05:16:07 -07:00
Evgenii Kliuchnikov
b3142143f6 add installation section to README
PiperOrigin-RevId: 815667870
2025-10-06 04:45:22 -07:00
Evgenii Kliuchnikov
0e7ea31e6b add alternative unaligned memory access for MIPS
PiperOrigin-RevId: 815662268
2025-10-06 04:26:03 -07:00
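The commit above refers to the standard portable technique for unaligned reads: going through `memcpy` instead of dereferencing a cast pointer. A minimal sketch (the function name mirrors brotli's style but is illustrative, not a copy of its internals):

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

/* Portable unaligned 32-bit load. memcpy is the standard-compliant way
 * to read from a possibly misaligned address: compilers lower it to a
 * single load on architectures that permit unaligned access, and to
 * byte-wise loads on strict-alignment targets such as classic MIPS. */
static uint32_t BrotliUnalignedRead32(const void* p) {
  uint32_t v;
  memcpy(&v, p, sizeof(v));
  return v;
}
```

Casting the pointer to `uint32_t*` and dereferencing would be undefined behavior on strict-alignment targets, which is exactly the case this commit addresses.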
Evgenii Kliuchnikov
da2e091eb7 prepare for v1.2.0.rc1
PiperOrigin-RevId: 815650799
2025-10-06 03:48:49 -07:00
dependabot[bot]
30576423b8 Bump ossf/scorecard-action from 2.4.2 to 2.4.3
Bumps [ossf/scorecard-action](https://github.com/ossf/scorecard-action) from 2.4.2 to 2.4.3.
- [Release notes](https://github.com/ossf/scorecard-action/releases)
- [Changelog](https://github.com/ossf/scorecard-action/blob/main/RELEASE.md)
- [Commits](05b42c6244...4eaacf0543)

---
updated-dependencies:
- dependency-name: ossf/scorecard-action
  dependency-version: 2.4.3
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-10-06 09:30:15 +00:00
dependabot[bot]
9cf25439ad Bump softprops/action-gh-release from 2.3.3 to 2.3.4
Bumps [softprops/action-gh-release](https://github.com/softprops/action-gh-release) from 2.3.3 to 2.3.4.
- [Release notes](https://github.com/softprops/action-gh-release/releases)
- [Changelog](https://github.com/softprops/action-gh-release/blob/master/CHANGELOG.md)
- [Commits](6cbd405e2c...62c96d0c4e)

---
updated-dependencies:
- dependency-name: softprops/action-gh-release
  dependency-version: 2.3.4
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-10-06 09:29:59 +00:00
Copybara-Service
82d3c163cb Merge pull request #1328 from akazwz:go
PiperOrigin-RevId: 815626477
2025-10-06 02:29:02 -07:00
Copybara-Service
4876ada111 Merge pull request #1335 from google:dependabot/github_actions/actions/cache-4.3.0
PiperOrigin-RevId: 815624406
2025-10-06 02:21:54 -07:00
Eugene Kliuchnikov
f1c80224e8 Fix some typos / non-typos. (#1345) 2025-10-03 12:16:07 +02:00
Evgenii Kliuchnikov
ed93810e27 support multi-phase initialization
PiperOrigin-RevId: 814128632
2025-10-02 01:51:02 -07:00
Eugene Kliuchnikov
7cc02a1687 Merge branch 'master' into dependabot/github_actions/actions/cache-4.3.0 2025-10-01 21:06:49 +02:00
Evgenii Kliuchnikov
54481d4ebe use builtin bswap when available
PiperOrigin-RevId: 813849285
2025-10-01 11:56:09 -07:00
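"Use builtin bswap when available" points at a common pattern: prefer the compiler intrinsic, fall back to portable shifts. A sketch of that pattern, with an illustrative name rather than brotli's actual implementation:

```c
#include <assert.h>
#include <stdint.h>

/* Byte-swap a 32-bit value. GCC and Clang provide __builtin_bswap32,
 * which compiles to a single instruction on most targets; other
 * compilers get the portable shift-and-mask fallback. */
#if defined(__GNUC__) || defined(__clang__)
static uint32_t ByteSwap32(uint32_t v) { return __builtin_bswap32(v); }
#else
static uint32_t ByteSwap32(uint32_t v) {
  return (v >> 24) | ((v >> 8) & 0x0000FF00u) |
         ((v << 8) & 0x00FF0000u) | (v << 24);
}
#endif
```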
Evgenii Kliuchnikov
a896e79d4f Java: ramp-up artifact versions in pom files
PiperOrigin-RevId: 813673237
2025-10-01 03:26:40 -07:00
Evgenii Kliuchnikov
947f74e908 update links in readme
PiperOrigin-RevId: 813665159
2025-10-01 02:58:49 -07:00
Evgenii Kliuchnikov
916e4a46a8 update docs
PiperOrigin-RevId: 813658707
2025-10-01 02:37:06 -07:00
Eugene Kliuchnikov
e4e56a3203 Add missing newline 2025-09-29 16:32:28 +02:00
Eugene Kliuchnikov
2c5f2d1198 Merge branch 'master' into go 2025-09-29 15:39:10 +02:00
Eugene Kliuchnikov
f3b0ceed2d Merge branch 'master' into dependabot/github_actions/actions/cache-4.3.0 2025-09-29 15:37:19 +02:00
Evgenii Kliuchnikov
1f6ab76bff use module-bound exception
PiperOrigin-RevId: 812739918
2025-09-29 05:12:31 -07:00
dependabot[bot]
5c79b32b14 Bump actions/cache from 4.2.4 to 4.3.0
Bumps [actions/cache](https://github.com/actions/cache) from 4.2.4 to 4.3.0.
- [Release notes](https://github.com/actions/cache/releases)
- [Changelog](https://github.com/actions/cache/blob/main/RELEASES.md)
- [Commits](0400d5f644...0057852bfa)

---
updated-dependencies:
- dependency-name: actions/cache
  dependency-version: 4.3.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-29 09:19:37 +00:00
Copybara-Service
d74b0a4a22 Merge pull request #1323 from google:dependabot/github_actions/actions/setup-python-6.0.0
PiperOrigin-RevId: 811710346
2025-09-26 01:25:56 -07:00
Copybara-Service
dbcb332b66 Merge pull request #1324 from google:dependabot/github_actions/softprops/action-gh-release-2.3.3
PiperOrigin-RevId: 811710240
2025-09-26 01:24:57 -07:00
dependabot[bot]
c0d785dfe2 Bump actions/setup-python from 5.6.0 to 6.0.0
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 5.6.0 to 6.0.0.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](a26af69be9...e797f83bcb)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-version: 6.0.0
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-25 15:02:31 +00:00
dependabot[bot]
466613c266 Bump softprops/action-gh-release from 2.3.2 to 2.3.3
Bumps [softprops/action-gh-release](https://github.com/softprops/action-gh-release) from 2.3.2 to 2.3.3.
- [Release notes](https://github.com/softprops/action-gh-release/releases)
- [Changelog](https://github.com/softprops/action-gh-release/blob/master/CHANGELOG.md)
- [Commits](72f2c25fcb...6cbd405e2c)

---
updated-dependencies:
- dependency-name: softprops/action-gh-release
  dependency-version: 2.3.3
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-25 15:01:52 +00:00
Evgenii Kliuchnikov
1406898440 Build and test with PY2.7
PiperOrigin-RevId: 811352084
2025-09-25 08:00:20 -07:00
Evgenii Kliuchnikov
0bef8a6936 clarify that prepared dictionaries are "lean"
PiperOrigin-RevId: 811236534
2025-09-25 01:27:28 -07:00
Evgenii Kliuchnikov
9686382ff3 PY: continue renovation of extension
Fixed unchecked malloc for "tail" input data.
Fixed inconsistencies in docstrings.

Rewrote the "growable buffer" in C so it can run without acquiring the GIL.

Breaking changes:
 - native object allocation failures now handled at object creation time
 - some lower-level exceptions (e.g. OOM) are no longer shadowed by brotli.error

PiperOrigin-RevId: 810813664
2025-09-24 03:52:44 -07:00
Evgenii Kliuchnikov
85d46ce6b5 Drop finalize()
Now it is solely the embedder's responsibility to close objects that hold native resources. No more "safety net".

Consider "try-with-resources". For longer-lasting items (e.g. a native PreparedDictionary) use Cleaner as a last resort.

PiperOrigin-RevId: 807584792
2025-09-16 01:23:46 -07:00
akazwz
3d8eef20a6 Update Go modules to require Go 1.21 and replace ioutil with io package in reader.go 2025-09-12 23:52:50 +08:00
Evgenii Kliuchnikov
41a22f07f2 modernize PY3 class definition
PiperOrigin-RevId: 804460135
2025-09-08 09:15:53 -07:00
Evgenii Kliuchnikov
98a89b1563 temporary rollback
PiperOrigin-RevId: 803462595
2025-09-05 07:57:59 -07:00
62 changed files with 1961 additions and 1497 deletions

.gitattributes

@@ -51,3 +51,4 @@ tests/testdata/empty !export-ignore
tests/testdata/empty.compressed !export-ignore
tests/testdata/ukkonooa !export-ignore
tests/testdata/ukkonooa.compressed !export-ignore
tests/testdata/zerosukkanooa.compressed !export-ignore


@@ -6,19 +6,23 @@
# Workflow for building and running tests under Ubuntu
name: Build/Test
on:
push:
branches:
- master
pull_request:
types: [opened, reopened, labeled, synchronize]
types: [opened, reopened, labeled, unlabeled, synchronize]
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}-${{ github.event_name }}
cancel-in-progress: ${{ github.event_name == 'pull_request' }}
jobs:
ubuntu_build:
build_test:
name: Build and test ${{ matrix.name }}
runs-on: ${{ matrix.os || 'ubuntu-latest' }}
defaults:
@@ -28,18 +32,36 @@ jobs:
fail-fast: false
matrix:
include:
- name: cmake:gcc
build_system: cmake
c_compiler: gcc
cxx_compiler: g++
- name: cmake:gcc-old
build_system: cmake
c_compiler: gcc
cxx_compiler: g++
os: ubuntu-22.04
- name: cmake:clang
build_system: cmake
c_compiler: clang
cxx_compiler: clang
- name: cmake:clang-old
build_system: cmake
c_compiler: clang
cxx_compiler: clang
os: ubuntu-22.04
- name: cmake:package
build_system: cmake
cmake_args: -DBROTLI_BUILD_FOR_PACKAGE=ON
- name: cmake:static
build_system: cmake
cmake_args: -DBUILD_SHARED_LIBS=OFF
- name: cmake:clang:asan
build_system: cmake
sanitizer: address
@@ -174,6 +196,12 @@ jobs:
CXX: ${{ matrix.cxx_compiler || 'gcc' }}
steps:
- name: Harden Runner
uses: step-security/harden-runner@f4a75cfd619ee5ce8d5b864b0d183aff3c69b55a # v2.13.1
with:
egress-policy: audit
- name: Install extra deps @ Ubuntu
if: ${{ runner.os == 'Linux' }}
# Already installed: bazel, clang{13-15}, cmake, gcc{9.5-13.1}, java{8,11,17,21}, maven, python{3.10}
@@ -183,23 +211,16 @@ jobs:
sudo apt install -y ${EXTRA_PACKAGES}
- name: Checkout the source
uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac # v4.0.0
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
submodules: false
fetch-depth: 1
#- name: Checkout VC9 for Python
# if: ${{ runner.os == 'Windows' && matrix.build_system == 'python' && matrix.python_version == '2.7' }}
# uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac # v4.0.0
# with:
# repository: reider-roque/sulley-win-installer
# path: third_party/VCForPython27
- name: Configure / Build / Test with CMake
if: ${{ matrix.build_system == 'cmake' }}
run: |
export ASAN_OPTIONS=detect_leaks=0
declare -a CMAKE_OPTIONS=()
declare -a CMAKE_OPTIONS=(${{ matrix.cmake_args || '' }})
CMAKE_OPTIONS+=("-DCMAKE_VERBOSE_MAKEFILE=ON")
[ ! -z '${{ matrix.c_compiler || '' }}' ] && CMAKE_OPTIONS+=(-DCMAKE_C_COMPILER='${{ matrix.c_compiler }}')
[ ! -z '${{ matrix.cxx_compiler || '' }}' ] && CMAKE_OPTIONS+=(-DCMAKE_CXX_COMPILER='${{ matrix.cxx_compiler }}')
@@ -289,22 +310,48 @@ jobs:
# cd integration
# mvn -B verify
- uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
- uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
if: ${{ matrix.build_system == 'python' }}
with:
python-version: ${{ matrix.python_version }}
# TODO: investigate, why msiexec hangs
#- name: Install VC9 for Python
# if: ${{ runner.os == 'Windows' && matrix.build_system == 'python' && matrix.python_version == '2.7' }}
# run: |
# echo "070474db76a2e625513a5835df4595df9324d820f9cc97eab2a596dcbc2f5cbf third_party/VCForPython27/VCForPython27.msi" | sha256sum --check --status
# msiexec ALLUSERS=1 /qn /norestart /i third_party/VCForPython27/VCForPython27.msi /l*v ${RUNNER_TEMP}/msiexec.log
# cat ${RUNNER_TEMP}/msiexec.log
# TODO(eustas): use modern setuptools (split out testing)
- name: Build / Test with Python
if: ${{ matrix.build_system == 'python' }}
run: |
python -VV
python -c "import sys; sys.exit('Invalid python version') if '.'.join(map(str,sys.version_info[0:2])) != '${{ matrix.python_version }}' else True"
pip install setuptools==51.3.3
python setup.py ${{ matrix.py_setuptools_cmd || 'test'}}
build_test_py27:
name: Build and test with Python 2.7
runs-on: ubuntu-latest
container:
image: ubuntu:22.04
steps:
- name: Harden Runner
uses: step-security/harden-runner@f4a75cfd619ee5ce8d5b864b0d183aff3c69b55a # v2.13.1
with:
egress-policy: audit
- name: Install deps
run: |
apt update
apt install -y curl gcc python2.7 python2.7-dev
curl https://bootstrap.pypa.io/pip/2.7/get-pip.py --output get-pip.py
python2.7 get-pip.py
python2.7 -m pip install distutils-pytest==0.1
- name: Checkout the source
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
submodules: false
fetch-depth: 1
- name: Build / Test
run: |
python2.7 -VV
python2.7 -c "import sys; sys.exit('Invalid python version') if '.'.join(map(str,sys.version_info[0:2])) != '2.7' else True"
python2.7 setup.py test

.github/workflows/build_test_wasm.yml (new file)

@@ -0,0 +1,70 @@
# Copyright 2025 Google Inc. All Rights Reserved.
#
# Distributed under MIT license.
# See file LICENSE for detail or copy at https://opensource.org/licenses/MIT
# Workflow for building and running tests with WASM
name: Build/Test WASM
on:
push:
branches:
- master
pull_request:
types: [opened, reopened, labeled, unlabeled, synchronize]
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}-${{ github.event_name }}
cancel-in-progress: ${{ github.event_name == 'pull_request' }}
jobs:
build_test_wasm:
name: Build and test with WASM
runs-on: ubuntu-latest
env:
CCACHE_DIR: ${{ github.workspace }}/.ccache
BUILD_TARGET: wasm32
EM_VERSION: 3.1.51
# As of 28.08.2025 ubuntu-latest is 24.04; it is shipped with node 22.18
NODE_VERSION: 22
steps:
- name: Harden Runner
uses: step-security/harden-runner@f4a75cfd619ee5ce8d5b864b0d183aff3c69b55a # v2.13.1
with:
egress-policy: audit
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
submodules: true
fetch-depth: 1
- name: Install node
uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
node-version: ${{env.NODE_VERSION}}
- name: Get non-EMSDK node path
run: which node >> $HOME/.base_node_path
- name: Install emsdk
uses: mymindstorm/setup-emsdk@6ab9eb1bda2574c4ddb79809fc9247783eaf9021 # v14
with:
version: ${{env.EM_VERSION}}
no-cache: true
- name: Set EMSDK node version
run: |
echo "NODE_JS='$(cat $HOME/.base_node_path)'" >> $EMSDK/.emscripten
emsdk construct_env
- name: Build
run: |
LDFLAGS=" -s ALLOW_MEMORY_GROWTH=1 -s NODERAWFS=1 " emcmake cmake -B out .
cmake --build out
cd out; ctest --output-on-failure; cd ..


@@ -9,6 +9,9 @@ on:
schedule:
- cron: '18 15 * * 0'
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}-${{ github.event_name }}
cancel-in-progress: ${{ github.event_name == 'pull_request' }}
@@ -30,12 +33,18 @@ jobs:
# CodeQL supports [ 'cpp', 'csharp', 'go', 'java', 'javascript', 'python', 'ruby', 'swift' ]
steps:
- name: Harden Runner
uses: step-security/harden-runner@f4a75cfd619ee5ce8d5b864b0d183aff3c69b55a # v2.13.1
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac # v4.0.0
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@43750fe4fc4f068f04f2215206e6f6a29c78c763 # v2.14.4
uses: github/codeql-action/init@f443b600d91635bebf5b0d9ebc620189c0d6fba5 # v3.29.5
with:
languages: ${{ matrix.language }}
# CodeQL is currently crashing on files with large lists:
@@ -47,7 +56,7 @@ jobs:
- if: matrix.language == 'cpp'
name: Build CPP
uses: github/codeql-action/autobuild@43750fe4fc4f068f04f2215206e6f6a29c78c763 # v2.14.4
uses: github/codeql-action/autobuild@f443b600d91635bebf5b0d9ebc620189c0d6fba5 # v3.29.5
- if: matrix.language == 'cpp' || matrix.language == 'java'
name: Build Java
@@ -57,7 +66,7 @@ jobs:
- if: matrix.language == 'javascript'
name: Build JS
uses: github/codeql-action/autobuild@43750fe4fc4f068f04f2215206e6f6a29c78c763 # v2.14.4
uses: github/codeql-action/autobuild@f443b600d91635bebf5b0d9ebc620189c0d6fba5 # v3.29.5
- if: matrix.language == 'cpp' || matrix.language == 'python'
name: Build Python
@@ -65,7 +74,7 @@ jobs:
python setup.py build_ext
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@43750fe4fc4f068f04f2215206e6f6a29c78c763 # v2.14.4
uses: github/codeql-action/analyze@f443b600d91635bebf5b0d9ebc620189c0d6fba5 # v3.29.5
with:
category: "/language:${{matrix.language}}"
ref: "${{ github.ref != 'master' && github.ref || '/refs/heads/master' }}"


@@ -6,8 +6,12 @@
# Workflow for building / running oss-fuzz.
name: CIFuzz
on: [pull_request]
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}-${{ github.event_name }}
cancel-in-progress: ${{ github.event_name == 'pull_request' }}
@@ -16,17 +20,25 @@ jobs:
Fuzzing:
runs-on: ubuntu-latest
steps:
- name: Harden Runner
uses: step-security/harden-runner@f4a75cfd619ee5ce8d5b864b0d183aff3c69b55a # v2.13.1
with:
egress-policy: audit
- name: Build Fuzzers
uses: google/oss-fuzz/infra/cifuzz/actions/build_fuzzers@master
uses: google/oss-fuzz/infra/cifuzz/actions/build_fuzzers@3e6a7fd7bcd631647ab9beed1fe0897498e6af39 # 22.09.2025
with:
oss-fuzz-project-name: 'brotli'
dry-run: false
- name: Run Fuzzers
uses: google/oss-fuzz/infra/cifuzz/actions/run_fuzzers@master
uses: google/oss-fuzz/infra/cifuzz/actions/run_fuzzers@3e6a7fd7bcd631647ab9beed1fe0897498e6af39 # 22.09.2025
with:
oss-fuzz-project-name: 'brotli'
fuzz-seconds: 600
dry-run: false
- name: Upload Crash
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
if: failure()

.github/workflows/lint.yml (new file)

@@ -0,0 +1,50 @@
# Copyright 2025 Google Inc. All Rights Reserved.
#
# Distributed under MIT license.
# See file LICENSE for detail or copy at https://opensource.org/licenses/MIT
# Workflow for checking typos and buildifier, formatting, etc.
name: "Lint"
on:
push:
branches: [ "master" ]
pull_request:
branches: [ "master" ]
schedule:
- cron: '18 15 * * 0'
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}-${{ github.event_name }}
cancel-in-progress: ${{ github.event_name == 'pull_request' }}
jobs:
check:
name: Lint
runs-on: 'ubuntu-latest'
steps:
- name: Harden Runner
uses: step-security/harden-runner@f4a75cfd619ee5ce8d5b864b0d183aff3c69b55a # v2.13.1
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Install tools
run: |
eval "$(/home/linuxbrew/.linuxbrew/bin/brew shellenv)"
brew install buildifier typos-cli
- name: Check typos
run: |
eval "$(/home/linuxbrew/.linuxbrew/bin/brew shellenv)"
./scripts/check_typos.sh
# TODO(eustas): run buildifier


@@ -14,7 +14,10 @@ on:
release:
types: [ published ]
pull_request:
types: [opened, reopened, labeled, synchronize]
types: [opened, reopened, labeled, unlabeled, synchronize]
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}-${{ github.event_name }}
@@ -59,13 +62,19 @@ jobs:
VCPKG_DISABLE_METRICS: 1
steps:
- name: Harden Runner
uses: step-security/harden-runner@f4a75cfd619ee5ce8d5b864b0d183aff3c69b55a # v2.13.1
with:
egress-policy: audit
- name: Checkout the source
uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac # v4.0.0
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
submodules: false
fetch-depth: 1
- uses: actions/cache@0400d5f644dc74513175e3cd8d07132dd4860809 # v4.2.4
- uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
id: cache-vcpkg
with:
path: vcpkg
@@ -76,7 +85,7 @@ jobs:
shell: 'powershell'
run: |
Invoke-WebRequest -Uri "https://github.com/microsoft/vcpkg/archive/refs/tags/${{ env.VCPKG_VERSION }}.zip" -OutFile "vcpkg.zip"
- name: Bootstrap vcpkg
if: steps.cache-vcpkg.outputs.cache-hit != 'true'
shell: 'bash'
@@ -100,23 +109,19 @@ jobs:
-DCMAKE_TOOLCHAIN_FILE=${VCPKG_ROOT}/scripts/buildsystems/vcpkg.cmake \
-DVCPKG_TARGET_TRIPLET=${{ matrix.triplet }} \
#
- name: Build
shell: 'bash'
run: |
set -x
cmake --build out --config Release
- name: Install
shell: 'bash'
run: |
set -x
cmake --build out --config Release --target install
cp LICENSE prefix/bin/LICENSE.brotli
- name: Upload artifacts
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
name: brotli-${{matrix.triplet}}
path: |
prefix/bin/*
- name: Package release zip
shell: 'powershell'
@@ -124,11 +129,12 @@ jobs:
Compress-Archive -Path prefix\bin\* `
-DestinationPath brotli-${{matrix.triplet}}.zip
- name: Upload binaries to release
if: github.event_name == 'release'
uses: softprops/action-gh-release@72f2c25fcb47643c292f7107632f7a47c1df5cd8 # v0.1.15
- name: Upload package
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
files: brotli-${{matrix.triplet}}.zip
name: brotli-${{matrix.triplet}}
path: brotli-${{matrix.triplet}}.zip
compression-level: 0
testdata_upload:
name: Upload testdata
@@ -138,8 +144,13 @@ jobs:
shell: bash
steps:
- name: Harden Runner
uses: step-security/harden-runner@f4a75cfd619ee5ce8d5b864b0d183aff3c69b55a # v2.13.1
with:
egress-policy: audit
- name: Checkout the source
uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac # v4.0.0
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
submodules: false
fetch-depth: 1
@@ -148,14 +159,42 @@ jobs:
run: |
tar cvfJ testdata.txz tests/testdata
- name: Upload archive to release
if: github.event_name == 'release'
uses: softprops/action-gh-release@72f2c25fcb47643c292f7107632f7a47c1df5cd8 # v0.1.15
- name: Upload archive
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
files: testdata.txz
name: testdata
path: testdata.txz
compression-level: 0
publish_release_assets:
name: Publish release assets
needs: [windows_build, testdata_upload]
if: github.event_name == 'release'
runs-on: [ubuntu-latest]
permissions:
contents: write
steps:
- name: Checkout the source
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
submodules: false
fetch-depth: 1
- name: Download all artifacts
uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
with:
path: release_assets
merge-multiple: true
- name: Publish assets
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
gh release upload ${{ github.event.release.tag_name }} ./release_assets/*
archive_build:
needs: testdata_upload
needs: publish_release_assets
name: Build and test from archive
runs-on: 'ubuntu-latest'
defaults:
@@ -163,8 +202,13 @@ jobs:
shell: bash
steps:
- name: Harden Runner
uses: step-security/harden-runner@f4a75cfd619ee5ce8d5b864b0d183aff3c69b55a # v2.13.1
with:
egress-policy: audit
- name: Checkout the source
uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac # v4.0.0
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
submodules: false
fetch-depth: 1


@@ -3,6 +3,7 @@
# policy, and support documentation.
name: Scorecard supply-chain security
on:
# For Branch-Protection check. Only the default branch is supported. See
# https://github.com/ossf/scorecard/blob/main/docs/checks.md#branch-protection
@@ -14,13 +15,13 @@ on:
push:
branches: [ "master" ]
# Declare default permissions as read only.
permissions: read-all
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}-${{ github.event_name }}
cancel-in-progress: ${{ github.event_name == 'pull_request' }}
# Declare default permissions as read only.
permissions: read-all
jobs:
analysis:
name: Scorecard analysis
@@ -35,13 +36,18 @@ jobs:
# actions: read
steps:
- name: Harden Runner
uses: step-security/harden-runner@f4a75cfd619ee5ce8d5b864b0d183aff3c69b55a # v2.13.1
with:
egress-policy: audit
- name: "Checkout code"
uses: actions/checkout@v4 # v3.1.0
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- name: "Run analysis"
uses: ossf/scorecard-action@05b42c624433fc40578a4040d5cf5e36ddca8cde # v2.4.2
uses: ossf/scorecard-action@4eaacf0543bb3f2c246792bd56e8cdeffafb205a # v2.4.3
with:
results_file: results.sarif
results_format: sarif
@@ -71,6 +77,6 @@ jobs:
# Upload the results to GitHub's code scanning dashboard.
- name: "Upload to code-scanning"
uses: github/codeql-action/upload-sarif@17573ee1cc1b9d061760f3a006fc4aac4f944fd5 # v2.2.4
uses: github/codeql-action/upload-sarif@17783bfb99b07f70fae080b654aed0c514057477 # v2.23.3
with:
sarif_file: results.sarif


@@ -7,12 +7,38 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
## Unreleased
## [1.2.0] - 2025-10-xx
### SECURITY
- python: added `Decompressor::can_accept_more_data` method and optional
`max_output_length` argument `Decompressor::process`;
that allows mitigation of unexpextedely large output;
`output_buffer_limit` argument `Decompressor::process`;
that allows mitigation of unexpectedly large output;
reported by Charles Chan (https://github.com/charleswhchan)
### Added
- **decoder / encoder: added static initialization to reduce binary size**
- python: allow limiting decoder output (see SECURITY section)
- CLI: `brcat` alias; allow decoding concatenated brotli streams
- kt: pure Kotlin decoder
- cgo: support "raw" dictionaries
- build: Bazel modules
### Removed
- java: dropped `finalize()` for native entities
### Fixed
- java: in `compress` pass correct length to native encoder
### Improved
- build: install man pages
- build: updated / fixed / refined Bazel buildfiles
- encoder: faster encoding
- cgo: link via pkg-config
- python: modernize extension / allow multi-phase module initialization
### Changed
- decoder / encoder: static tables use "small" model (allows 2GiB+ binaries)
## [1.1.0] - 2023-08-28


@@ -10,11 +10,8 @@ cmake_minimum_required(VERSION 3.15)
cmake_policy(SET CMP0048 NEW)
project(brotli C)
option(BUILD_SHARED_LIBS "Build shared libraries" ON)
set(BROTLI_BUILD_TOOLS ON CACHE BOOL "Build/install CLI tools")
if(NOT CMAKE_BUILD_TYPE AND NOT CMAKE_CONFIGURATION_TYPES)
message(STATUS "Setting build type to Release as none was specified.")
if (NOT CMAKE_BUILD_TYPE AND NOT CMAKE_CONFIGURATION_TYPES)
message(STATUS "Setting build type to Release as none was specified")
set(CMAKE_BUILD_TYPE "Release" CACHE STRING "Choose the type of build" FORCE)
else()
message(STATUS "Build type is '${CMAKE_BUILD_TYPE}'")
@@ -33,20 +30,34 @@ else()
message("-- Compiler is not EMSCRIPTEN")
endif()
if (BROTLI_EMSCRIPTEN)
message(STATUS "Switching to static build for EMSCRIPTEN")
set(BUILD_SHARED_LIBS OFF)
endif()
# Reflect CMake variable as a build option.
option(BUILD_SHARED_LIBS "Build shared libraries" ON)
set(BROTLI_BUILD_TOOLS ON CACHE BOOL "Build/install CLI tools")
set(BROTLI_BUILD_FOR_PACKAGE OFF CACHE BOOL "Build/install both shared and static libraries")
if (BROTLI_BUILD_FOR_PACKAGE AND NOT BUILD_SHARED_LIBS)
message(FATAL_ERROR "Both BROTLI_BUILD_FOR_PACKAGE and BUILD_SHARED_LIBS are set")
endif()
# If Brotli is being bundled in another project, we don't want to
# install anything. However, we want to let people override this, so
# we'll use the BROTLI_BUNDLED_MODE variable to let them do that; just
# set it to OFF in your project before you add_subdirectory(brotli).
get_directory_property(BROTLI_PARENT_DIRECTORY PARENT_DIRECTORY)
if(NOT DEFINED BROTLI_BUNDLED_MODE)
if (NOT DEFINED BROTLI_BUNDLED_MODE)
# Bundled mode hasn't been set one way or the other, set the default
# depending on whether or not we are the top-level project.
if(BROTLI_PARENT_DIRECTORY)
if (BROTLI_PARENT_DIRECTORY)
set(BROTLI_BUNDLED_MODE ON)
else()
set(BROTLI_BUNDLED_MODE OFF)
endif()
endif()
endif() # BROTLI_BUNDLED_MODE
mark_as_advanced(BROTLI_BUNDLED_MODE)
include(GNUInstallDirs)
@@ -80,66 +91,79 @@ endif ()
include(CheckLibraryExists)
set(LIBM_LIBRARY)
set(LIBM_DEP)
CHECK_LIBRARY_EXISTS(m log2 "" HAVE_LIB_M)
if(HAVE_LIB_M)
if (HAVE_LIB_M)
set(LIBM_LIBRARY "m")
if (NOT BUILD_SHARED_LIBS)
set(LIBM_DEP "-lm")
endif()
endif()
set(BROTLI_INCLUDE_DIRS "${CMAKE_CURRENT_SOURCE_DIR}/c/include")
mark_as_advanced(BROTLI_INCLUDE_DIRS)
set(BROTLI_LIBRARIES_CORE brotlienc brotlidec brotlicommon)
set(BROTLI_LIBRARIES ${BROTLI_LIBRARIES_CORE} ${LIBM_LIBRARY})
if (BROTLI_BUILD_FOR_PACKAGE)
set(BROTLI_SHARED_LIBRARIES brotlienc brotlidec brotlicommon)
set(BROTLI_STATIC_LIBRARIES brotlienc-static brotlidec-static brotlicommon-static)
set(BROTLI_LIBRARIES ${BROTLI_SHARED_LIBRARIES} ${LIBM_LIBRARY})
else() # NOT BROTLI_BUILD_FOR_PACKAGE
if (BUILD_SHARED_LIBS)
set(BROTLI_SHARED_LIBRARIES brotlienc brotlidec brotlicommon)
set(BROTLI_STATIC_LIBRARIES)
else() # NOT BUILD_SHARED_LIBS
set(BROTLI_SHARED_LIBRARIES)
set(BROTLI_STATIC_LIBRARIES brotlienc brotlidec brotlicommon)
endif()
set(BROTLI_LIBRARIES ${BROTLI_SHARED_LIBRARIES} ${BROTLI_STATIC_LIBRARIES} ${LIBM_LIBRARY})
endif() # BROTLI_BUILD_FOR_PACKAGE
mark_as_advanced(BROTLI_LIBRARIES)
if(${CMAKE_SYSTEM_NAME} MATCHES "Linux")
add_definitions(-DOS_LINUX)
elseif(${CMAKE_SYSTEM_NAME} MATCHES "FreeBSD")
add_definitions(-DOS_FREEBSD)
elseif(${CMAKE_SYSTEM_NAME} MATCHES "Darwin")
add_definitions(-DOS_MACOSX)
endif()
if (MSVC)
message(STATUS "Defining _CRT_SECURE_NO_WARNINGS to avoid warnings about security")
if(BROTLI_EMSCRIPTEN)
set(BUILD_SHARED_LIBS OFF)
add_definitions(-D_CRT_SECURE_NO_WARNINGS)
endif()
file(GLOB_RECURSE BROTLI_COMMON_SOURCES RELATIVE ${CMAKE_CURRENT_SOURCE_DIR} c/common/*.c)
add_library(brotlicommon ${BROTLI_COMMON_SOURCES})
file(GLOB_RECURSE BROTLI_DEC_SOURCES RELATIVE ${CMAKE_CURRENT_SOURCE_DIR} c/dec/*.c)
add_library(brotlidec ${BROTLI_DEC_SOURCES})
file(GLOB_RECURSE BROTLI_ENC_SOURCES RELATIVE ${CMAKE_CURRENT_SOURCE_DIR} c/enc/*.c)
add_library(brotlicommon ${BROTLI_COMMON_SOURCES})
add_library(brotlidec ${BROTLI_DEC_SOURCES})
add_library(brotlienc ${BROTLI_ENC_SOURCES})
if (BROTLI_BUILD_FOR_PACKAGE)
add_library(brotlicommon-static STATIC ${BROTLI_COMMON_SOURCES})
add_library(brotlidec-static STATIC ${BROTLI_DEC_SOURCES})
add_library(brotlienc-static STATIC ${BROTLI_ENC_SOURCES})
endif()
# Older CMake versions does not understand INCLUDE_DIRECTORIES property.
include_directories(${BROTLI_INCLUDE_DIRS})
if(BUILD_SHARED_LIBS)
foreach(lib ${BROTLI_LIBRARIES_CORE})
if (BUILD_SHARED_LIBS)
foreach(lib ${BROTLI_SHARED_LIBRARIES})
target_compile_definitions(${lib} PUBLIC "BROTLI_SHARED_COMPILATION" )
string(TOUPPER "${lib}" LIB)
set_target_properties (${lib} PROPERTIES DEFINE_SYMBOL "${LIB}_SHARED_COMPILATION")
endforeach()
endif()
endif() # BUILD_SHARED_LIBS
foreach(lib ${BROTLI_LIBRARIES_CORE})
foreach(lib ${BROTLI_SHARED_LIBRARIES} ${BROTLI_STATIC_LIBRARIES})
target_link_libraries(${lib} ${LIBM_LIBRARY})
set_property(TARGET ${lib} APPEND PROPERTY INCLUDE_DIRECTORIES ${BROTLI_INCLUDE_DIRS})
set_target_properties(${lib} PROPERTIES
VERSION "${BROTLI_ABI_COMPATIBILITY}.${BROTLI_ABI_AGE}.${BROTLI_ABI_REVISION}"
SOVERSION "${BROTLI_ABI_COMPATIBILITY}")
if(NOT BROTLI_EMSCRIPTEN)
if (NOT BROTLI_EMSCRIPTEN)
set_target_properties(${lib} PROPERTIES POSITION_INDEPENDENT_CODE TRUE)
endif()
set_property(TARGET ${lib} APPEND PROPERTY INTERFACE_INCLUDE_DIRECTORIES "$<BUILD_INTERFACE:${BROTLI_INCLUDE_DIRS}>")
endforeach()
endforeach() # BROTLI_xxx_LIBRARIES
if(NOT BROTLI_EMSCRIPTEN)
target_link_libraries(brotlidec brotlicommon)
target_link_libraries(brotlienc brotlicommon)
endif()
target_link_libraries(brotlidec brotlicommon)
target_link_libraries(brotlienc brotlicommon)
# For projects stuck on older versions of CMake, this will set the
# BROTLI_INCLUDE_DIRS and BROTLI_LIBRARIES variables so they still
@@ -147,19 +171,19 @@ endif()
#
# include_directories(${BROTLI_INCLUDE_DIRS})
# target_link_libraries(foo ${BROTLI_LIBRARIES})
if(BROTLI_PARENT_DIRECTORY)
if (BROTLI_PARENT_DIRECTORY)
set(BROTLI_INCLUDE_DIRS "${BROTLI_INCLUDE_DIRS}" PARENT_SCOPE)
set(BROTLI_LIBRARIES "${BROTLI_LIBRARIES}" PARENT_SCOPE)
endif()
# Build the brotli executable
if(BROTLI_BUILD_TOOLS)
if (BROTLI_BUILD_TOOLS)
add_executable(brotli c/tools/brotli.c)
target_link_libraries(brotli ${BROTLI_LIBRARIES})
endif()
# Installation
if(NOT BROTLI_BUNDLED_MODE)
if (NOT BROTLI_BUNDLED_MODE)
if (BROTLI_BUILD_TOOLS)
install(
TARGETS brotli
@@ -168,7 +192,7 @@ if(NOT BROTLI_BUNDLED_MODE)
endif()
install(
TARGETS ${BROTLI_LIBRARIES_CORE}
TARGETS ${BROTLI_SHARED_LIBRARIES} ${BROTLI_STATIC_LIBRARIES}
ARCHIVE DESTINATION "${CMAKE_INSTALL_LIBDIR}"
LIBRARY DESTINATION "${CMAKE_INSTALL_LIBDIR}"
RUNTIME DESTINATION "${CMAKE_INSTALL_BINDIR}"
@@ -183,17 +207,28 @@ endif() # BROTLI_BUNDLED_MODE
# Tests
# Integration tests, those depend on `brotli` binary
if(NOT BROTLI_DISABLE_TESTS AND BROTLI_BUILD_TOOLS)
if (NOT BROTLI_DISABLE_TESTS AND BROTLI_BUILD_TOOLS)
# If we're targeting Windows but not running on Windows, we need Wine
# to run the tests...
if(WIN32 AND NOT CMAKE_HOST_WIN32)
if (WIN32 AND NOT CMAKE_HOST_WIN32)
find_program(BROTLI_WRAPPER NAMES wine)
if(NOT BROTLI_WRAPPER)
if (NOT BROTLI_WRAPPER)
message(STATUS "wine not found, disabling tests")
set(BROTLI_DISABLE_TESTS TRUE)
endif()
endif()
endif() # WIN32 emulation
if (BROTLI_EMSCRIPTEN)
find_program(BROTLI_WRAPPER NAMES node)
if (NOT BROTLI_WRAPPER)
message(STATUS "node not found, disabling tests")
set(BROTLI_DISABLE_TESTS TRUE)
endif()
endif() # BROTLI_EMSCRIPTEN
endif() # BROTLI_DISABLE_TESTS
# NB: BROTLI_DISABLE_TESTS might have changed.
if (NOT BROTLI_DISABLE_TESTS AND BROTLI_BUILD_TOOLS)
# If our compiler is a cross-compiler that we know about (arm/aarch64),
# then we need to use qemu to execute the tests.
if ("${CMAKE_C_COMPILER}" MATCHES "^.*/arm-linux-gnueabihf-.*$")
@@ -255,13 +290,16 @@ if(NOT BROTLI_DISABLE_TESTS AND BROTLI_BUILD_TOOLS)
tests/testdata/*.compressed*)
foreach(INPUT ${COMPATIBILITY_INPUTS})
add_test(NAME "${BROTLI_TEST_PREFIX}compatibility/${INPUT}"
COMMAND "${CMAKE_COMMAND}"
-DBROTLI_WRAPPER=${BROTLI_WRAPPER}
-DBROTLI_WRAPPER_LD_PREFIX=${BROTLI_WRAPPER_LD_PREFIX}
-DBROTLI_CLI=$<TARGET_FILE:brotli>
-DINPUT=${CMAKE_CURRENT_SOURCE_DIR}/${INPUT}
-P ${CMAKE_CURRENT_SOURCE_DIR}/tests/run-compatibility-test.cmake)
string(REGEX REPLACE "([a-zA-Z0-9\\.]+)\\.compressed(\\.[0-9]+)?$" "\\1" UNCOMPRESSED_INPUT "${INPUT}")
if (EXISTS ${UNCOMPRESSED_INPUT})
add_test(NAME "${BROTLI_TEST_PREFIX}compatibility/${INPUT}"
COMMAND "${CMAKE_COMMAND}"
-DBROTLI_WRAPPER=${BROTLI_WRAPPER}
-DBROTLI_WRAPPER_LD_PREFIX=${BROTLI_WRAPPER_LD_PREFIX}
-DBROTLI_CLI=$<TARGET_FILE:brotli>
-DINPUT=${CMAKE_CURRENT_SOURCE_DIR}/${INPUT}
-P ${CMAKE_CURRENT_SOURCE_DIR}/tests/run-compatibility-test.cmake)
endif()
endforeach()
endif() # BROTLI_DISABLE_TESTS
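The `REGEX REPLACE` in this hunk derives the expected uncompressed file name from a `*.compressed` (or `*.compressed.NN`) test input before registering the compatibility test. The same transformation can be sketched in Python (file names below are illustrative):

```python
import re

def uncompressed_name(path: str) -> str:
    """Strip a trailing '.compressed' (optionally '.compressed.NN'),
    mirroring the CMake REGEX REPLACE above."""
    return re.sub(r"\.compressed(\.[0-9]+)?$", "", path)

assert uncompressed_name("tests/testdata/alice29.txt.compressed") == "tests/testdata/alice29.txt"
assert uncompressed_name("x.compressed.00") == "x"
assert uncompressed_name("plain.txt") == "plain.txt"  # non-matching names pass through
```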
@@ -283,19 +321,19 @@ function(generate_pkg_config_path outvar path)
get_filename_component(value_full "${value}" ABSOLUTE)
string(LENGTH "${value}" value_length)
if(path_length EQUAL value_length AND path STREQUAL value)
if (path_length EQUAL value_length AND path STREQUAL value)
set("${outvar}" "\${${name}}")
break()
elseif(path_length GREATER value_length)
elseif (path_length GREATER value_length)
# We might be in a subdirectory of the value, but we have to be
# careful about a prefix matching but not being a subdirectory
# (for example, /usr/lib64 is not a subdirectory of /usr/lib).
# We'll do this by making sure the next character is a directory
# separator.
string(SUBSTRING "${path}" ${value_length} 1 sep)
if(sep STREQUAL "/")
if (sep STREQUAL "/")
string(SUBSTRING "${path}" 0 ${value_length} s)
if(s STREQUAL value)
if (s STREQUAL value)
string(SUBSTRING "${path}" "${value_length}" -1 suffix)
set("${outvar}" "\${${name}}${suffix}")
break()
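The prefix matching in `generate_pkg_config_path` accepts either an exact match or a true subdirectory, guarding against false prefixes (the hunk's own example: `/usr/lib64` is not under `/usr/lib`). A sketch of that logic in Python (the function name is illustrative, not part of brotli):

```python
def substitute_prefix(path: str, name: str, value: str) -> str:
    """Rewrite `path` relative to ${name} when `value` is a directory prefix of it."""
    if path == value:
        return "${" + name + "}"
    # A longer path only counts as a subdirectory if the character right
    # after the prefix is a separator, so /usr/lib never matches /usr/lib64.
    if len(path) > len(value) and path.startswith(value) and path[len(value)] == "/":
        return "${" + name + "}" + path[len(value):]
    return path
```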
@@ -316,6 +354,7 @@ function(transform_pc_file INPUT_FILE OUTPUT_FILE VERSION)
set(PREFIX "${CMAKE_INSTALL_PREFIX}")
string(REGEX REPLACE "@prefix@" "${PREFIX}" TEXT ${TEXT})
string(REGEX REPLACE "@exec_prefix@" "${PREFIX}" TEXT ${TEXT})
string(REGEX REPLACE "@libm@" "${LIBM_DEP}" TEXT ${TEXT})
generate_pkg_config_path(LIBDIR "${CMAKE_INSTALL_FULL_LIBDIR}" prefix "${PREFIX}")
string(REGEX REPLACE "@libdir@" "${LIBDIR}" TEXT ${TEXT})
@@ -334,7 +373,7 @@ transform_pc_file("scripts/libbrotlidec.pc.in" "${CMAKE_CURRENT_BINARY_DIR}/libb
transform_pc_file("scripts/libbrotlienc.pc.in" "${CMAKE_CURRENT_BINARY_DIR}/libbrotlienc.pc" "${BROTLI_VERSION}")
if(NOT BROTLI_BUNDLED_MODE)
if (NOT BROTLI_BUNDLED_MODE)
install(FILES "${CMAKE_CURRENT_BINARY_DIR}/libbrotlicommon.pc"
DESTINATION "${CMAKE_INSTALL_LIBDIR}/pkgconfig")
install(FILES "${CMAKE_CURRENT_BINARY_DIR}/libbrotlidec.pc"


@@ -7,6 +7,6 @@
module(
name = "brotli",
version = "1.1.0",
version = "1.2.0",
repo_name = "org_brotli",
)

README

@@ -7,9 +7,9 @@ currently available general-purpose compression methods. It is similar in speed
with deflate but offers more dense compression.
The specification of the Brotli Compressed Data Format is defined in RFC 7932
https://tools.ietf.org/html/rfc7932
https://datatracker.ietf.org/doc/html/rfc7932
Brotli is open-sourced under the MIT License, see the LICENSE file.
Brotli mailing list:
https://groups.google.com/forum/#!forum/brotli
https://groups.google.com/g/brotli


@@ -12,7 +12,8 @@ and 2nd order context modeling, with a compression ratio comparable to the best
currently available general-purpose compression methods. It is similar in speed
with deflate but offers more dense compression.
The specification of the Brotli Compressed Data Format is defined in [RFC 7932](https://tools.ietf.org/html/rfc7932).
The specification of the Brotli Compressed Data Format is defined in
[RFC 7932](https://datatracker.ietf.org/doc/html/rfc7932).
Brotli is open-sourced under the MIT License, see the LICENSE file.
@@ -21,11 +22,23 @@ Brotli is open-sourced under the MIT License, see the LICENSE file.
> to modify "raw" ranges of the compressed stream and the decoder will not
> notice that.
### Installation
In most Linux distributions, installing `brotli` is just a matter of using
the package management system. For example in Debian-based distributions:
`apt install brotli` will install `brotli`. On MacOS, you can use
[Homebrew](https://brew.sh/): `brew install brotli`.
[![brotli packaging status](https://repology.org/badge/vertical-allrepos/brotli.svg?exclude_unsupported=1&columns=3&exclude_sources=modules,site&header=brotli%20packaging%20status)](https://repology.org/project/brotli/versions)
Of course you can also build brotli from sources.
### Build instructions
#### Vcpkg
You can download and install brotli using the [vcpkg](https://github.com/Microsoft/vcpkg/) dependency manager:
You can download and install brotli using the
[vcpkg](https://github.com/Microsoft/vcpkg/) dependency manager:
git clone https://github.com/Microsoft/vcpkg.git
cd vcpkg
@@ -33,11 +46,13 @@ You can download and install brotli using the [vcpkg](https://github.com/Microso
./vcpkg integrate install
./vcpkg install brotli
The brotli port in vcpkg is kept up to date by Microsoft team members and community contributors. If the version is out of date, please [create an issue or pull request](https://github.com/Microsoft/vcpkg) on the vcpkg repository.
The brotli port in vcpkg is kept up to date by Microsoft team members and
community contributors. If the version is out of date, please [create an issue
or pull request](https://github.com/Microsoft/vcpkg) on the vcpkg repository.
#### Bazel
See [Bazel](http://www.bazel.build/)
See [Bazel](https://www.bazel.build/)
#### CMake
@@ -65,7 +80,7 @@ from source, development, and testing.
### Contributing
We glad to answer/library related questions in
[brotli mailing list](https://groups.google.com/forum/#!forum/brotli).
[brotli mailing list](https://groups.google.com/g/brotli).
Regular issues / feature requests should be reported in
[issue tracker](https://github.com/google/brotli/issues).
@@ -76,20 +91,24 @@ For contributing changes please read [CONTRIBUTING](CONTRIBUTING.md).
### Benchmarks
* [Squash Compression Benchmark](https://quixdb.github.io/squash-benchmark/) / [Unstable Squash Compression Benchmark](https://quixdb.github.io/squash-benchmark/unstable/)
* [Large Text Compression Benchmark](http://mattmahoney.net/dc/text.html)
* [Large Text Compression Benchmark](https://mattmahoney.net/dc/text.html)
* [Lzturbo Benchmark](https://sites.google.com/site/powturbo/home/benchmark)
### Related projects
> **Disclaimer:** Brotli authors take no responsibility for the third party projects mentioned in this section.
Independent [decoder](https://github.com/madler/brotli) implementation by Mark Adler, based entirely on format specification.
Independent [decoder](https://github.com/madler/brotli) implementation
by Mark Adler, based entirely on format specification.
JavaScript port of brotli [decoder](https://github.com/devongovett/brotli.js). Could be used directly via `npm install brotli`
JavaScript port of brotli [decoder](https://github.com/devongovett/brotli.js).
Could be used directly via `npm install brotli`
Hand ported [decoder / encoder](https://github.com/dominikhlbg/BrotliHaxe) in haxe by Dominik Homberger. Output source code: JavaScript, PHP, Python, Java and C#
Hand ported [decoder / encoder](https://github.com/dominikhlbg/BrotliHaxe)
in haxe by Dominik Homberger.
Output source code: JavaScript, PHP, Python, Java and C#
7Zip [plugin](https://github.com/mcmilk/7-Zip-Zstd)
Dart [native bindings](https://github.com/thosakwe/brotli)
Dart compression framework with [fast FFI-based Brotli implementation](https://pub.dev/documentation/es_compression/latest/brotli/brotli-library.html) with ready-to-use prebuilt binaries for Win/Linux/Mac
Dart compression framework with
[fast FFI-based Brotli implementation](https://pub.dev/documentation/es_compression/latest/brotli/)
with ready-to-use prebuilt binaries for Win/Linux/Mac


@@ -203,9 +203,19 @@ OR:
#define BROTLI_TARGET_LOONGARCH64
#endif
/* This does not seem to be an indicator of z/Architecture (64-bit); neither
that allows to use unaligned loads. */
#if defined(__s390x__)
#define BROTLI_TARGET_S390X
#endif
#if defined(__mips64)
#define BROTLI_TARGET_MIPS64
#endif
#if defined(BROTLI_TARGET_X64) || defined(BROTLI_TARGET_ARMV8_64) || \
defined(BROTLI_TARGET_POWERPC64) || defined(BROTLI_TARGET_RISCV64) || \
defined(BROTLI_TARGET_LOONGARCH64)
defined(BROTLI_TARGET_LOONGARCH64) || defined(BROTLI_TARGET_MIPS64)
#define BROTLI_TARGET_64_BITS 1
#else
#define BROTLI_TARGET_64_BITS 0
@@ -267,6 +277,46 @@ OR:
#endif
/* Portable unaligned memory access: read / write values via memcpy. */
#if !defined(BROTLI_USE_PACKED_FOR_UNALIGNED)
#if defined(__mips__) && (!defined(__mips_isa_rev) || __mips_isa_rev < 6)
#define BROTLI_USE_PACKED_FOR_UNALIGNED 1
#else
#define BROTLI_USE_PACKED_FOR_UNALIGNED 0
#endif
#endif /* defined(BROTLI_USE_PACKED_FOR_UNALIGNED) */
#if BROTLI_USE_PACKED_FOR_UNALIGNED
typedef union BrotliPackedValue {
uint16_t u16;
uint32_t u32;
uint64_t u64;
size_t szt;
} __attribute__ ((packed)) BrotliPackedValue;
static BROTLI_INLINE uint16_t BrotliUnalignedRead16(const void* p) {
const BrotliPackedValue* address = (const BrotliPackedValue*)p;
return address->u16;
}
static BROTLI_INLINE uint32_t BrotliUnalignedRead32(const void* p) {
const BrotliPackedValue* address = (const BrotliPackedValue*)p;
return address->u32;
}
static BROTLI_INLINE uint64_t BrotliUnalignedRead64(const void* p) {
const BrotliPackedValue* address = (const BrotliPackedValue*)p;
return address->u64;
}
static BROTLI_INLINE size_t BrotliUnalignedReadSizeT(const void* p) {
const BrotliPackedValue* address = (const BrotliPackedValue*)p;
return address->szt;
}
static BROTLI_INLINE void BrotliUnalignedWrite64(void* p, uint64_t v) {
BrotliPackedValue* address = (BrotliPackedValue*)p;
address->u64 = v;
}
#else /* not BROTLI_USE_PACKED_FOR_UNALIGNED */
static BROTLI_INLINE uint16_t BrotliUnalignedRead16(const void* p) {
uint16_t t;
memcpy(&t, p, sizeof t);
@@ -291,6 +341,34 @@ static BROTLI_INLINE void BrotliUnalignedWrite64(void* p, uint64_t v) {
memcpy(p, &v, sizeof v);
}
#endif /* BROTLI_USE_PACKED_FOR_UNALIGNED */
#if BROTLI_GNUC_HAS_BUILTIN(__builtin_bswap16, 4, 3, 0)
#define BROTLI_BSWAP16(V) ((uint16_t)__builtin_bswap16(V))
#else
#define BROTLI_BSWAP16(V) ((uint16_t)( \
(((V) & 0xFFU) << 8) | \
(((V) >> 8) & 0xFFU)))
#endif
#if BROTLI_GNUC_HAS_BUILTIN(__builtin_bswap32, 4, 3, 0)
#define BROTLI_BSWAP32(V) ((uint32_t)__builtin_bswap32(V))
#else
#define BROTLI_BSWAP32(V) ((uint32_t)( \
(((V) & 0xFFU) << 24) | (((V) & 0xFF00U) << 8) | \
(((V) >> 8) & 0xFF00U) | (((V) >> 24) & 0xFFU)))
#endif
#if BROTLI_GNUC_HAS_BUILTIN(__builtin_bswap64, 4, 3, 0)
#define BROTLI_BSWAP64(V) ((uint64_t)__builtin_bswap64(V))
#else
#define BROTLI_BSWAP64(V) ((uint64_t)( \
(((V) & 0xFFU) << 56) | (((V) & 0xFF00U) << 40) | \
(((V) & 0xFF0000U) << 24) | (((V) & 0xFF000000U) << 8) | \
(((V) >> 8) & 0xFF000000U) | (((V) >> 24) & 0xFF0000U) | \
(((V) >> 40) & 0xFF00U) | (((V) >> 56) & 0xFFU)))
#endif
#if BROTLI_LITTLE_ENDIAN
/* Straight endianness. Just read / write values. */
#define BROTLI_UNALIGNED_LOAD16LE BrotliUnalignedRead16
@@ -298,32 +376,20 @@ static BROTLI_INLINE void BrotliUnalignedWrite64(void* p, uint64_t v) {
#define BROTLI_UNALIGNED_LOAD64LE BrotliUnalignedRead64
#define BROTLI_UNALIGNED_STORE64LE BrotliUnalignedWrite64
#elif BROTLI_BIG_ENDIAN /* BROTLI_LITTLE_ENDIAN */
/* Explain compiler to byte-swap values. */
#define BROTLI_BSWAP16_(V) ((uint16_t)( \
(((V) & 0xFFU) << 8) | \
(((V) >> 8) & 0xFFU)))
static BROTLI_INLINE uint16_t BROTLI_UNALIGNED_LOAD16LE(const void* p) {
uint16_t value = BrotliUnalignedRead16(p);
return BROTLI_BSWAP16_(value);
return BROTLI_BSWAP16(value);
}
#define BROTLI_BSWAP32_(V) ( \
(((V) & 0xFFU) << 24) | (((V) & 0xFF00U) << 8) | \
(((V) >> 8) & 0xFF00U) | (((V) >> 24) & 0xFFU))
static BROTLI_INLINE uint32_t BROTLI_UNALIGNED_LOAD32LE(const void* p) {
uint32_t value = BrotliUnalignedRead32(p);
return BROTLI_BSWAP32_(value);
return BROTLI_BSWAP32(value);
}
#define BROTLI_BSWAP64_(V) ( \
(((V) & 0xFFU) << 56) | (((V) & 0xFF00U) << 40) | \
(((V) & 0xFF0000U) << 24) | (((V) & 0xFF000000U) << 8) | \
(((V) >> 8) & 0xFF000000U) | (((V) >> 24) & 0xFF0000U) | \
(((V) >> 40) & 0xFF00U) | (((V) >> 56) & 0xFFU))
static BROTLI_INLINE uint64_t BROTLI_UNALIGNED_LOAD64LE(const void* p) {
uint64_t value = BrotliUnalignedRead64(p);
return BROTLI_BSWAP64_(value);
return BROTLI_BSWAP64(value);
}
static BROTLI_INLINE void BROTLI_UNALIGNED_STORE64LE(void* p, uint64_t v) {
uint64_t value = BROTLI_BSWAP64_(v);
uint64_t value = BROTLI_BSWAP64(v);
BrotliUnalignedWrite64(p, value);
}
#else /* BROTLI_LITTLE_ENDIAN */


@@ -275,7 +275,7 @@ static BROTLI_BOOL ParseDictionary(const uint8_t* encoded, size_t size,
size_t pos = 0;
uint32_t chunk_size = 0;
size_t total_prefix_suffix_count = 0;
size_t trasform_list_start[SHARED_BROTLI_NUM_DICTIONARY_CONTEXTS];
size_t transform_list_start[SHARED_BROTLI_NUM_DICTIONARY_CONTEXTS];
uint16_t temporary_prefix_suffix_table[256];
/* Skip magic header bytes. */
@@ -329,7 +329,7 @@ static BROTLI_BOOL ParseDictionary(const uint8_t* encoded, size_t size,
for (i = 0; i < dict->num_transform_lists; i++) {
BROTLI_BOOL ok = BROTLI_FALSE;
size_t prefix_suffix_count = 0;
trasform_list_start[i] = pos;
transform_list_start[i] = pos;
dict->transforms_instances[i].prefix_suffix_map =
temporary_prefix_suffix_table;
ok = ParseTransformsList(
@@ -347,7 +347,7 @@ static BROTLI_BOOL ParseDictionary(const uint8_t* encoded, size_t size,
total_prefix_suffix_count = 0;
for (i = 0; i < dict->num_transform_lists; i++) {
size_t prefix_suffix_count = 0;
size_t position = trasform_list_start[i];
size_t position = transform_list_start[i];
uint16_t* prefix_suffix_map =
&dict->prefix_suffix_maps[total_prefix_suffix_count];
BROTLI_BOOL ok = ParsePrefixSuffixTable(


@@ -18,7 +18,7 @@
BrotliEncoderVersion methods. */
#define BROTLI_VERSION_MAJOR 1
#define BROTLI_VERSION_MINOR 1
#define BROTLI_VERSION_MINOR 2
#define BROTLI_VERSION_PATCH 0
#define BROTLI_VERSION BROTLI_MAKE_HEX_VERSION( \
@@ -32,9 +32,9 @@
- interfaces not changed -> current:revision+1:age
*/
#define BROTLI_ABI_CURRENT 2
#define BROTLI_ABI_CURRENT 3
#define BROTLI_ABI_REVISION 0
#define BROTLI_ABI_AGE 1
#define BROTLI_ABI_AGE 2
#if BROTLI_VERSION_MAJOR != (BROTLI_ABI_CURRENT - BROTLI_ABI_AGE)
#error ABI/API version inconsistency
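Under the libtool `current:revision:age` convention used here, the SONAME major number is `current - age`; bumping from 2:0:1 to 3:0:2 keeps it at 1, which is what the `#error` check enforces against `BROTLI_VERSION_MAJOR`. A quick check with the values from this hunk:

```python
# libtool ABI triple after this change: current=3, revision=0, age=2.
current, revision, age = 3, 0, 2
soversion = current - age  # the SONAME major, e.g. libbrotlicommon.so.1
assert soversion == 1      # consistent with BROTLI_VERSION_MAJOR == 1
```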


@@ -484,7 +484,7 @@ static BROTLI_INLINE int BrotliCopyPreloadedSymbolsToU8(const HuffmanCode* table
/* Calculate range where CheckInputAmount is always true.
Start with the number of bytes we can read. */
int64_t new_lim = br->guard_in - br->next_in;
/* Convert to bits, since sybmols use variable number of bits. */
/* Convert to bits, since symbols use variable number of bits. */
new_lim *= 8;
/* At most 15 bits per symbol, so this is safe. */
new_lim /= 15;
@@ -1539,7 +1539,7 @@ static BROTLI_BOOL AttachCompoundDictionary(
return BROTLI_TRUE;
}
static void EnsureCoumpoundDictionaryInitialized(BrotliDecoderState* state) {
static void EnsureCompoundDictionaryInitialized(BrotliDecoderState* state) {
BrotliDecoderCompoundDictionary* addon = state->compound_dictionary;
/* 256 = (1 << 8) slots in block map. */
int block_bits = 8;
@@ -1560,7 +1560,7 @@ static BROTLI_BOOL InitializeCompoundDictionaryCopy(BrotliDecoderState* s,
int address, int length) {
BrotliDecoderCompoundDictionary* addon = s->compound_dictionary;
int index;
EnsureCoumpoundDictionaryInitialized(s);
EnsureCompoundDictionaryInitialized(s);
index = addon->block_map[address >> addon->block_bits];
while (address >= addon->chunk_offsets[index + 1]) index++;
if (addon->total_size < address + length) return BROTLI_FALSE;


@@ -343,22 +343,22 @@ static uint32_t ComputeDistanceShortcut(const size_t block_start,
const size_t max_backward_limit,
const size_t gap,
const ZopfliNode* nodes) {
const size_t clen = ZopfliNodeCopyLength(&nodes[pos]);
const size_t ilen = nodes[pos].dcode_insert_length & 0x7FFFFFF;
const size_t c_len = ZopfliNodeCopyLength(&nodes[pos]);
const size_t i_len = nodes[pos].dcode_insert_length & 0x7FFFFFF;
const size_t dist = ZopfliNodeCopyDistance(&nodes[pos]);
/* Since |block_start + pos| is the end position of the command, the copy part
starts from |block_start + pos - clen|. Distances that are greater than
starts from |block_start + pos - c_len|. Distances that are greater than
this or greater than |max_backward_limit| + |gap| are static dictionary
references, and do not update the last distances.
Also distance code 0 (last distance) does not update the last distances. */
if (pos == 0) {
return 0;
} else if (dist + clen <= block_start + pos + gap &&
} else if (dist + c_len <= block_start + pos + gap &&
dist <= max_backward_limit + gap &&
ZopfliNodeDistanceCode(&nodes[pos]) > 0) {
return (uint32_t)pos;
} else {
return nodes[pos - clen - ilen].u.shortcut;
return nodes[pos - c_len - i_len].u.shortcut;
}
}
@@ -376,12 +376,12 @@ static void ComputeDistanceCache(const size_t pos,
int idx = 0;
size_t p = nodes[pos].u.shortcut;
while (idx < 4 && p > 0) {
const size_t ilen = nodes[p].dcode_insert_length & 0x7FFFFFF;
const size_t clen = ZopfliNodeCopyLength(&nodes[p]);
const size_t i_len = nodes[p].dcode_insert_length & 0x7FFFFFF;
const size_t c_len = ZopfliNodeCopyLength(&nodes[p]);
const size_t dist = ZopfliNodeCopyDistance(&nodes[p]);
dist_cache[idx++] = (int)dist;
/* Because of prerequisite, p >= clen + ilen >= 2. */
p = nodes[p - clen - ilen].u.shortcut;
/* Because of prerequisite, p >= c_len + i_len >= 2. */
p = nodes[p - c_len - i_len].u.shortcut;
}
for (; idx < 4; ++idx) {
dist_cache[idx] = *starting_dist_cache++;


@@ -7,13 +7,13 @@
module(
name = "brotli_fuzz",
version = "1.1.0",
version = "1.2.0",
repo_name = "org_brotli_fuzz",
)
bazel_dep(name = "rules_fuzzing", version = "0.5.2")
bazel_dep(name = "brotli", version = "1.1.0", repo_name = "org_brotli")
bazel_dep(name = "brotli", version = "1.2.0", repo_name = "org_brotli")
local_path_override(
module_name = "brotli",
path = "../..",


@@ -283,6 +283,10 @@ typedef struct BrotliEncoderPreparedDictionaryStruct
* passed to @p alloc_func and @p free_func when they are called. @p free_func
* has to return without doing anything when asked to free a NULL pointer.
*
* @warning Created instance is "lean"; it does not contain copy of @p data,
* rather it contains only pointer to it; therefore,
* @p data @b MUST outlive the created instance.
*
* @param type type of dictionary stored in data
* @param data_size size of @p data buffer
* @param data pointer to the dictionary data


@@ -269,20 +269,20 @@
#if defined(_WIN32)
#if defined(BROTLICOMMON_SHARED_COMPILATION)
#define BROTLI_COMMON_API __declspec(dllexport)
#else
#else /* !BROTLICOMMON_SHARED_COMPILATION */
#define BROTLI_COMMON_API __declspec(dllimport)
#endif /* BROTLICOMMON_SHARED_COMPILATION */
#if defined(BROTLIDEC_SHARED_COMPILATION)
#define BROTLI_DEC_API __declspec(dllexport)
#else
#else /* !BROTLIDEC_SHARED_COMPILATION */
#define BROTLI_DEC_API __declspec(dllimport)
#endif /* BROTLIDEC_SHARED_COMPILATION */
#if defined(BROTLIENC_SHARED_COMPILATION)
#define BROTLI_ENC_API __declspec(dllexport)
#else
#else /* !BROTLIENC_SHARED_COMPILATION */
#define BROTLI_ENC_API __declspec(dllimport)
#endif /* BROTLIENC_SHARED_COMPILATION */
#else /* _WIN32 */
#else /* !_WIN32 */
#define BROTLI_COMMON_API BROTLI_PUBLIC
#define BROTLI_DEC_API BROTLI_PUBLIC
#define BROTLI_ENC_API BROTLI_PUBLIC


@@ -63,7 +63,7 @@ echo "${CODE//$PATTERN/$REPLACEMENT}" > org/brotli/dec/BrotliInputStream.cs
#-------------------------------------------------------------------------------
echo -e '\n\033[01;33mDowloading dependencies.\033[00m'
echo -e '\n\033[01;33mDownloading dependencies.\033[00m'
cd build
nuget install NUnit -Version 3.6.1


@@ -535,6 +535,11 @@ Result is only valid if quality is at least \fC2\fP and, in case \fBBrotliEncode
.PP
Prepares a shared dictionary from the given file format for the encoder\&. \fCalloc_func\fP and \fCfree_func\fP \fBMUST\fP be both zero or both non-zero\&. In the case they are both zero, default memory allocators are used\&. \fCopaque\fP is passed to \fCalloc_func\fP and \fCfree_func\fP when they are called\&. \fCfree_func\fP has to return without doing anything when asked to free a NULL pointer\&.
.PP
\fBWarning:\fP
.RS 4
Created instance is 'lean'; it does not contain copy of \fCdata\fP, rather it contains only pointer to it; therefore, \fCdata\fP \fBMUST\fP outlive the created instance\&.
.RE
.PP
\fBParameters:\fP
.RS 4
\fItype\fP type of dictionary stored in data


@@ -1 +1,3 @@
module github.com/google/brotli/go/brotli
go 1.21


@@ -9,7 +9,6 @@ import (
"bytes"
"errors"
"io"
"io/ioutil"
"strconv"
"unsafe"
)
@@ -127,5 +126,5 @@ func Decode(encodedData []byte) ([]byte, error) {
func DecodeWithRawDictionary(encodedData []byte, dictionary []byte) ([]byte, error) {
r := NewReaderWithOptions(bytes.NewReader(encodedData), ReaderOptions{RawDictionary: dictionary})
defer r.Close()
return ioutil.ReadAll(r)
return io.ReadAll(r)
}


@@ -1 +1,3 @@
module github.com/google/brotli/go/cbrotli
go 1.21


@@ -7,7 +7,7 @@
module(
name = "brotli_java",
version = "1.1.0",
version = "1.2.0",
repo_name = "org_brotli_java",
)
@@ -16,7 +16,7 @@ bazel_dep(name = "rules_jvm_external", version = "6.7")
bazel_dep(name = "rules_kotlin", version = "2.1.4")
bazel_dep(name = "platforms", version = "0.0.11")
bazel_dep(name = "brotli", version = "1.1.0", repo_name = "org_brotli")
bazel_dep(name = "brotli", version = "1.2.0", repo_name = "org_brotli")
local_path_override(
module_name = "brotli",
path = "..",


@@ -72,10 +72,10 @@ final class BitReader {
while (bytesInBuffer < CAPACITY) {
final int spaceLeft = CAPACITY - bytesInBuffer;
final int len = Utils.readInput(s, s.byteBuffer, bytesInBuffer, spaceLeft);
// EOF is -1 in Java, but 0 in C#.
if (len < BROTLI_ERROR) {
return len;
}
// EOF is -1 in Java, but 0 in C#.
if (len <= 0) {
s.endOfStreamReached = 1;
s.tailBytes = bytesInBuffer;
@@ -276,10 +276,10 @@ final class BitReader {
// Now it is possible to copy bytes directly.
while (len > 0) {
final int chunkLen = Utils.readInput(s, data, pos, len);
// EOF is -1 in Java, but 0 in C#.
if (len < BROTLI_ERROR) {
return len;
if (chunkLen < BROTLI_ERROR) {
return chunkLen;
}
// EOF is -1 in Java, but 0 in C#.
if (chunkLen <= 0) {
return Utils.makeError(s, BROTLI_ERROR_TRUNCATED_INPUT);
}


@@ -85,6 +85,7 @@ public class DecodeTest {
@Test
public void testUkkonooa() throws IOException {
// typo:off
checkDecodeResource(
"ukko nooa, ukko nooa oli kunnon mies, kun han meni saunaan, "
+ "pisti laukun naulaan, ukko nooa, ukko nooa oli kunnon mies.",
@@ -92,6 +93,7 @@ public class DecodeTest {
+ "6\u000E\u009C\u00E0\u0090\u0003\u00F7\u008B\u009E8\u00E6\u00B6\u0000\u00AB\u00C3\u00CA"
+ "\u00A0\u00C2\u00DAf6\u00DC\u00CD\u0080\u008D.!\u00D7n\u00E3\u00EAL\u00B8\u00F0\u00D2"
+ "\u00B8\u00C7\u00C2pM:\u00F0i~\u00A1\u00B8Es\u00AB\u00C4W\u001E");
// typo:on
}
@Test

File diff suppressed because one or more lines are too long


@@ -283,6 +283,7 @@ public class SynthTest {
*/
compressed,
true,
// typo:off
"|categories|categories | categories |ategories|Categories |categories the | categories|s cat"
+ "egories |categories of |Categories|categories and |tegories|categorie|, categories |catego"
+ "ries, | Categories |categories in |categories to |e categories |categories\"|categories.|c"
@@ -301,6 +302,7 @@ public class SynthTest {
+ "\"|categoriesous |CATEGORIES, |Categories='| Categories,| CATEGORIES=\"| CATEGORIES, |CATE"
+ "GORIES,|CATEGORIES(|CATEGORIES. | CATEGORIES.|CATEGORIES='| CATEGORIES. | Categories=\"| C"
+ "ATEGORIES='| Categories='"
// typo:on
);
}


@@ -64,6 +64,7 @@ final class Transform {
private static final int SHIFT_ALL = SHIFT_FIRST + 1;
// Bundle of 0-terminated strings.
// typo:off
private static final String PREFIX_SUFFIX_SRC = "# #s #, #e #.# the #.com/#\u00C2\u00A0# of # and"
+ " # in # to #\"#\">#\n#]# for # a # that #. # with #'# from # by #. The # on # as # is #ing"
+ " #\n\t#:#ed #(# at #ly #=\"# of the #. This #,# not #er #al #='#ful #ive #less #est #ize #"
@@ -73,6 +74,7 @@ final class Transform {
+ " ; < ' != > ?! 4 @ 4 2 & A *# ( B C& ) % ) !*# *-% A +! *. D! %' & E *6 F "
+ " G% ! *A *% H! D I!+! J!+ K +- *4! A L!*4 M N +6 O!*% +.! K *G P +%( ! G *D +D "
+ " Q +# *K!*G!+D!+# +G +A +4!+% +K!+4!*D!+K!*K";
// typo:on
private static void unpackTransforms(byte[] prefixSuffix,
int[] prefixSuffixHeads, int[] transforms, String prefixSuffixSrc, String transformsSrc) {


@@ -5,10 +5,10 @@
<parent>
<groupId>org.brotli</groupId>
<artifactId>parent</artifactId>
<version>1.0.0-SNAPSHOT</version>
<version>1.2.0-SNAPSHOT</version>
</parent>
<artifactId>dec</artifactId>
<version>1.0.0-SNAPSHOT</version>
<version>1.2.0-SNAPSHOT</version>
<packaging>jar</packaging>
<name>${project.groupId}:${project.artifactId}</name>
@@ -43,7 +43,7 @@
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>3.1.2</version>
<version>3.5.4</version>
<configuration>
<systemPropertyVariables>
<BROTLI_ENABLE_ASSERTS>true</BROTLI_ENABLE_ASSERTS>
@@ -53,7 +53,7 @@
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-source-plugin</artifactId>
<version>3.3.0</version>
<version>3.3.1</version>
<configuration>
<finalName>${project.groupId}.${project.artifactId}-${project.version}</finalName>
</configuration>
@@ -78,7 +78,7 @@
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-javadoc-plugin</artifactId>
<version>3.5.0</version>
<version>3.12.0</version>
<configuration>
<source>8</source>
<finalName>${project.groupId}.${project.artifactId}-${project.version}</finalName>
@@ -102,7 +102,7 @@
<plugin>
<groupId>org.apache.felix</groupId>
<artifactId>maven-bundle-plugin</artifactId>
<version>5.1.9</version>
<version>6.0.0</version>
<configuration>
<archive>
<forced>true</forced>
@@ -134,7 +134,7 @@
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>3.3.0</version>
<version>3.4.2</version>
<configuration>
<archive>
<manifestFile>${manifestfile}</manifestFile>


@@ -5,10 +5,10 @@
<parent>
<groupId>org.brotli</groupId>
<artifactId>parent</artifactId>
<version>1.0.0-SNAPSHOT</version>
<version>1.2.0-SNAPSHOT</version>
</parent>
<artifactId>integration</artifactId>
<version>1.0.0-SNAPSHOT</version>
<version>1.2.0-SNAPSHOT</version>
<packaging>jar</packaging>
<name>${project.groupId}:${project.artifactId}</name>
@@ -17,7 +17,7 @@
<dependency>
<groupId>org.brotli</groupId>
<artifactId>dec</artifactId>
<version>1.0.0-SNAPSHOT</version>
<version>1.2.0-SNAPSHOT</version>
</dependency>
</dependencies>
@@ -27,7 +27,7 @@
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<version>1.5.0</version>
<version>3.5.1</version>
<executions>
<execution>
<id>data</id>


@@ -4,7 +4,7 @@
<groupId>org.brotli</groupId>
<artifactId>parent</artifactId>
<version>1.0.0-SNAPSHOT</version>
<version>1.2.0-SNAPSHOT</version>
<packaging>pom</packaging>
<name>${project.groupId}:${project.artifactId}</name>
@@ -57,7 +57,7 @@
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-gpg-plugin</artifactId>
<version>1.5</version>
<version>3.2.8</version>
<executions>
<execution>
<id>sign-artifacts</id>
@@ -78,16 +78,16 @@
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<version>3.14.0</version>
<configuration>
<source>1.7</source>
<target>1.7</target>
<source>8</source>
<target>8</target>
</configuration>
</plugin>
<plugin>
<groupId>org.sonatype.plugins</groupId>
<artifactId>nexus-staging-maven-plugin</artifactId>
<version>1.6.13</version>
<version>1.7.0</version>
<extensions>true</extensions>
<configuration>
<serverId>ossrh</serverId>


@@ -15,7 +15,7 @@ import java.util.ArrayList;
/**
* Base class for InputStream / Channel implementations.
*/
public class Decoder {
public class Decoder implements AutoCloseable {
private static final ByteBuffer EMPTY_BUFFER = ByteBuffer.allocate(0);
private final ReadableByteChannel source;
private final DecoderJNI.Wrapper decoder;
@@ -129,7 +129,8 @@ public class Decoder {
return limit;
}
void close() throws IOException {
@Override
public void close() throws IOException {
if (closed) {
return;
}
@@ -140,9 +141,9 @@ public class Decoder {
/** Decodes the given data buffer starting at offset till length. */
public static byte[] decompress(byte[] data, int offset, int length) throws IOException {
DecoderJNI.Wrapper decoder = new DecoderJNI.Wrapper(length);
ArrayList<byte[]> output = new ArrayList<byte[]>();
ArrayList<byte[]> output = new ArrayList<>();
int totalOutputSize = 0;
DecoderJNI.Wrapper decoder = new DecoderJNI.Wrapper(length);
try {
decoder.getInputBuffer().put(data, offset, length);
decoder.push(length);

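With `Decoder` (and, below, `Encoder`) now implementing `AutoCloseable` and exposing a public `close()`, callers can release native state via try-with-resources. The pattern in miniature (class names here are illustrative stand-ins, not brotli's API):

```java
public class TryWithResourcesDemo {
  static final StringBuilder log = new StringBuilder();

  /** Stand-in for a wrapper holding native state, like the brotli Decoder. */
  static class Resource implements AutoCloseable {
    void use() { log.append("use;"); }
    @Override public void close() { log.append("close;"); }
  }

  public static void main(String[] args) {
    try (Resource r = new Resource()) {
      r.use();
    }
    // close() ran automatically when the try block exited.
    System.out.println(log);
  }
}
```

This is also why the `finalize()` override is deleted from `DecoderJNI` in the next hunk: scoped closing replaces finalizer-based cleanup.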

@@ -122,14 +122,5 @@ public class DecoderJNI {
nativeDestroy(context);
context[0] = 0;
}
@Override
protected void finalize() throws Throwable {
if (context[0] != 0) {
/* TODO(eustas): log resource leak? */
destroy();
}
super.finalize();
}
}
}


@@ -42,8 +42,8 @@ public class BrotliEncoderChannelTest extends BrotliJniTestBase {
try {
List<String> entries = BundleHelper.listEntries(bundle);
for (String entry : entries) {
suite.addTest(new ChannleTestCase(entry, TestMode.WRITE_ALL));
suite.addTest(new ChannleTestCase(entry, TestMode.WRITE_CHUNKS));
suite.addTest(new ChannelTestCase(entry, TestMode.WRITE_ALL));
suite.addTest(new ChannelTestCase(entry, TestMode.WRITE_CHUNKS));
}
} finally {
bundle.close();
@@ -52,10 +52,10 @@ public class BrotliEncoderChannelTest extends BrotliJniTestBase {
}
/** Test case with a unique name. */
static class ChannleTestCase extends TestCase {
static class ChannelTestCase extends TestCase {
final String entryName;
final TestMode mode;
ChannleTestCase(String entryName, TestMode mode) {
ChannelTestCase(String entryName, TestMode mode) {
super("BrotliEncoderChannelTest." + entryName + "." + mode.name());
this.entryName = entryName;
this.mode = mode;

@@ -6,10 +6,10 @@
package org.brotli.wrapper.enc;
import org.brotli.enc.PreparedDictionary;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.channels.Channels;
import org.brotli.enc.PreparedDictionary;
/**
* Output stream that wraps native brotli encoder.

@@ -17,7 +17,7 @@ import org.brotli.enc.PreparedDictionary;
/**
* Base class for OutputStream / Channel implementations.
*/
public class Encoder {
public class Encoder implements AutoCloseable {
private final WritableByteChannel destination;
private final List<PreparedDictionary> dictionaries;
private final EncoderJNI.Wrapper encoder;
@@ -65,12 +65,6 @@ public class Encoder {
public Parameters() { }
private Parameters(Parameters other) {
this.quality = other.quality;
this.lgwin = other.lgwin;
this.mode = other.mode;
}
/**
* Setup encoder quality.
*
@@ -199,7 +193,8 @@ public class Encoder {
encode(EncoderJNI.Operation.FLUSH);
}
void close() throws IOException {
@Override
public void close() throws IOException {
if (closed) {
return;
}
@@ -221,10 +216,10 @@ public class Encoder {
return empty;
}
/* data.length > 0 */
ArrayList<byte[]> output = new ArrayList<>();
int totalOutputSize = 0;
EncoderJNI.Wrapper encoder =
new EncoderJNI.Wrapper(length, params.quality, params.lgwin, params.mode);
ArrayList<byte[]> output = new ArrayList<byte[]>();
int totalOutputSize = 0;
try {
encoder.getInputBuffer().put(data, offset, length);
encoder.push(EncoderJNI.Operation.FINISH, length);

@@ -6,9 +6,9 @@
package org.brotli.wrapper.enc;
import org.brotli.enc.PreparedDictionary;
import java.io.IOException;
import java.nio.ByteBuffer;
import org.brotli.enc.PreparedDictionary;
/**
* JNI wrapper for brotli encoder.
@@ -28,7 +28,7 @@ class EncoderJNI {
FINISH
}
private static class PreparedDictionaryImpl implements PreparedDictionary {
private static class PreparedDictionaryImpl implements AutoCloseable, PreparedDictionary {
private ByteBuffer data;
/** Reference to (non-copied) LZ data. */
private ByteBuffer rawData;
@@ -43,15 +43,11 @@ class EncoderJNI {
}
@Override
protected void finalize() throws Throwable {
try {
ByteBuffer data = this.data;
this.data = null;
this.rawData = null;
nativeDestroyDictionary(data);
} finally {
super.finalize();
}
public void close() {
ByteBuffer data = this.data;
this.data = null;
this.rawData = null;
nativeDestroyDictionary(data);
}
}
@@ -168,14 +164,5 @@ class EncoderJNI {
nativeDestroy(context);
context[0] = 0;
}
@Override
protected void finalize() throws Throwable {
if (context[0] != 0) {
/* TODO(eustas): log resource leak? */
destroy();
}
super.finalize();
}
}
}

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

@@ -239,6 +239,7 @@ testAllTransforms10() {
*/
compressed,
true,
// typo:off
'|categories|categories | categories |ategories|Categories |categories the '
+ '| categories|s categories |categories of |Categories|categories and |teg'
+ 'ories|categorie|, categories |categories, | Categories |categories in |c'
@@ -261,6 +262,7 @@ testAllTransforms10() {
+ '|CATEGORIES, |Categories=\'| Categories,| CATEGORIES="| CATEGORIES, |CAT'
+ 'EGORIES,|CATEGORIES(|CATEGORIES. | CATEGORIES.|CATEGORIES=\'| CATEGORIES'
+ '. | Categories="| CATEGORIES=\'| Categories=\''
// typo:on
);
},

@@ -229,6 +229,7 @@ testAllTransforms10() {
*/
compressed,
true,
// typo:off
'|categories|categories | categories |ategories|Categories |categories the '
+ '| categories|s categories |categories of |Categories|categories and |teg'
+ 'ories|categorie|, categories |categories, | Categories |categories in |c'
@@ -251,6 +252,7 @@ testAllTransforms10() {
+ '|CATEGORIES, |Categories=\'| Categories,| CATEGORIES="| CATEGORIES, |CAT'
+ 'EGORIES,|CATEGORIES(|CATEGORIES. | CATEGORIES.|CATEGORIES=\'| CATEGORIES'
+ '. | Categories="| CATEGORIES=\'| Categories=\''
// typo:on
);
},

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because it is too large Load Diff

@@ -1,3 +1,5 @@
"""Common utilities for Brotli tests."""
from __future__ import print_function
import filecmp
import glob
@@ -8,10 +10,10 @@ import sysconfig
import tempfile
import unittest
project_dir = os.path.abspath(os.path.join(__file__, '..', '..', '..'))
test_dir = os.getenv("BROTLI_TESTS_PATH")
BRO_ARGS = [os.getenv("BROTLI_WRAPPER")]
# TODO(eustas): use str(pathlib.PurePath(file).parent.parent) for Python 3.4+
project_dir = os.path.dirname(os.path.dirname(os.path.dirname(__file__)))
test_dir = os.getenv('BROTLI_TESTS_PATH')
BRO_ARGS = [os.getenv('BROTLI_WRAPPER')]
# Fallbacks
if test_dir is None:
@@ -24,17 +26,18 @@ if BRO_ARGS[0] is None:
# Get the platform/version-specific build folder.
# By default, the distutils build base is in the same location as setup.py.
platform_lib_name = 'lib.{platform}-{version[0]}.{version[1]}'.format(
platform=sysconfig.get_platform(), version=sys.version_info)
platform=sysconfig.get_platform(), version=sys.version_info
)
build_dir = os.path.join(project_dir, 'bin', platform_lib_name)
# Prepend the build folder to sys.path and the PYTHONPATH environment variable.
if build_dir not in sys.path:
sys.path.insert(0, build_dir)
TEST_ENV = os.environ.copy()
sys.path.insert(0, build_dir)
TEST_ENV = dict(os.environ)
if 'PYTHONPATH' not in TEST_ENV:
TEST_ENV['PYTHONPATH'] = build_dir
TEST_ENV['PYTHONPATH'] = build_dir
else:
TEST_ENV['PYTHONPATH'] = build_dir + os.pathsep + TEST_ENV['PYTHONPATH']
TEST_ENV['PYTHONPATH'] = build_dir + os.pathsep + TEST_ENV['PYTHONPATH']
TESTDATA_DIR = os.path.join(test_dir, 'testdata')
@@ -47,6 +50,7 @@ TESTDATA_FILES = [
'ukkonooa', # Poem
'cp1251-utf16le', # Codepage 1251 table saved in UTF16-LE encoding
'cp852-utf8', # Codepage 852 table saved in UTF8 encoding
# TODO(eustas): add test on already compressed content
]
# Some files might be missing in a lightweight sources pack.
@@ -59,68 +63,91 @@ TESTDATA_PATHS = [
]
TESTDATA_PATHS_FOR_DECOMPRESSION = glob.glob(
os.path.join(TESTDATA_DIR, '*.compressed'))
os.path.join(TESTDATA_DIR, '*.compressed')
)
TEMP_DIR = tempfile.mkdtemp()
def get_temp_compressed_name(filename):
return os.path.join(TEMP_DIR, os.path.basename(filename + '.bro'))
return os.path.join(TEMP_DIR, os.path.basename(filename + '.bro'))
def get_temp_uncompressed_name(filename):
return os.path.join(TEMP_DIR, os.path.basename(filename + '.unbro'))
return os.path.join(TEMP_DIR, os.path.basename(filename + '.unbro'))
def bind_method_args(method, *args, **kwargs):
return lambda self: method(self, *args, **kwargs)
return lambda self: method(self, *args, **kwargs)
def generate_test_methods(test_case_class,
for_decompression=False,
variants=None):
# Add test methods for each test data file. This makes identifying problems
# with specific compression scenarios easier.
if for_decompression:
paths = TESTDATA_PATHS_FOR_DECOMPRESSION
else:
paths = TESTDATA_PATHS
opts = []
if variants:
opts_list = []
for k, v in variants.items():
opts_list.append([r for r in itertools.product([k], v)])
for o in itertools.product(*opts_list):
opts_name = '_'.join([str(i) for i in itertools.chain(*o)])
opts_dict = dict(o)
opts.append([opts_name, opts_dict])
else:
opts.append(['', {}])
for method in [m for m in dir(test_case_class) if m.startswith('_test')]:
for testdata in paths:
for (opts_name, opts_dict) in opts:
f = os.path.splitext(os.path.basename(testdata))[0]
name = 'test_{method}_{options}_{file}'.format(
method=method, options=opts_name, file=f)
func = bind_method_args(
getattr(test_case_class, method), testdata, **opts_dict)
setattr(test_case_class, name, func)
# TODO(eustas): migrate to absl.testing.parameterized.
def generate_test_methods(
test_case_class, for_decompression=False, variants=None
):
"""Adds test methods for each test data file and each variant.
This makes identifying problems with specific compression scenarios easier.
Args:
test_case_class: The test class to add methods to.
for_decompression: If True, uses compressed test data files.
variants: A dictionary where keys are option names and values are lists of
possible values for that option. Each combination of variants will
generate a separate test method.
"""
if for_decompression:
paths = [
path for path in TESTDATA_PATHS_FOR_DECOMPRESSION
if os.path.exists(path.replace('.compressed', ''))
]
else:
paths = TESTDATA_PATHS
opts = []
if variants:
opts_list = []
for k, v in variants.items():
opts_list.append([r for r in itertools.product([k], v)])
for o in itertools.product(*opts_list):
opts_name = '_'.join([str(i) for i in itertools.chain(*o)])
opts_dict = dict(o)
opts.append([opts_name, opts_dict])
else:
opts.append(['', {}])
for method in [m for m in dir(test_case_class) if m.startswith('_test')]:
for testdata in paths:
for opts_name, opts_dict in opts:
f = os.path.splitext(os.path.basename(testdata))[0]
name = 'test_{method}_{options}_{file}'.format(
method=method, options=opts_name, file=f
)
func = bind_method_args(
getattr(test_case_class, method), testdata, **opts_dict
)
setattr(test_case_class, name, func)
class TestCase(unittest.TestCase):
"""Base class for Brotli test cases.
def tearDown(self):
for f in TESTDATA_PATHS:
try:
os.unlink(get_temp_compressed_name(f))
except OSError:
pass
try:
os.unlink(get_temp_uncompressed_name(f))
except OSError:
pass
Provides common setup and teardown logic, including cleaning up temporary
files and a utility for comparing file contents.
"""
def assertFilesMatch(self, first, second):
self.assertTrue(
filecmp.cmp(first, second, shallow=False),
'File {} differs from {}'.format(first, second))
def tearDown(self):
for f in TESTDATA_PATHS:
try:
os.unlink(get_temp_compressed_name(f))
except OSError:
pass
try:
os.unlink(get_temp_uncompressed_name(f))
except OSError:
pass
# super().tearDown() # Requires Py3+
def assert_files_match(self, first, second):
self.assertTrue(
filecmp.cmp(first, second, shallow=False),
'File {} differs from {}'.format(first, second),
)

@@ -13,37 +13,38 @@ TEST_ENV = _test_utils.TEST_ENV
def _get_original_name(test_data):
return test_data.split('.compressed')[0]
return test_data.split('.compressed')[0]
class TestBroDecompress(_test_utils.TestCase):
def _check_decompression(self, test_data):
# Verify decompression matches the original.
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
original = _get_original_name(test_data)
self.assertFilesMatch(temp_uncompressed, original)
def _check_decompression(self, test_data):
# Verify decompression matches the original.
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
original = _get_original_name(test_data)
self.assert_files_match(temp_uncompressed, original)
def _decompress_file(self, test_data):
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
args = BRO_ARGS + ['-f', '-d', '-i', test_data, '-o', temp_uncompressed]
subprocess.check_call(args, env=TEST_ENV)
def _decompress_file(self, test_data):
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
args = BRO_ARGS + ['-f', '-d', '-i', test_data, '-o', temp_uncompressed]
subprocess.check_call(args, env=TEST_ENV)
def _decompress_pipe(self, test_data):
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
args = BRO_ARGS + ['-d']
with open(temp_uncompressed, 'wb') as out_file:
with open(test_data, 'rb') as in_file:
subprocess.check_call(
args, stdin=in_file, stdout=out_file, env=TEST_ENV)
def _decompress_pipe(self, test_data):
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
args = BRO_ARGS + ['-d']
with open(temp_uncompressed, 'wb') as out_file:
with open(test_data, 'rb') as in_file:
subprocess.check_call(
args, stdin=in_file, stdout=out_file, env=TEST_ENV
)
def _test_decompress_file(self, test_data):
self._decompress_file(test_data)
self._check_decompression(test_data)
def _test_decompress_file(self, test_data):
self._decompress_file(test_data)
self._check_decompression(test_data)
def _test_decompress_pipe(self, test_data):
self._decompress_pipe(test_data)
self._check_decompression(test_data)
def _test_decompress_pipe(self, test_data):
self._decompress_pipe(test_data)
self._check_decompression(test_data)
_test_utils.generate_test_methods(TestBroDecompress, for_decompression=True)
@@ -51,51 +52,53 @@ _test_utils.generate_test_methods(TestBroDecompress, for_decompression=True)
class TestBroCompress(_test_utils.TestCase):
VARIANTS = {'quality': (1, 6, 9, 11), 'lgwin': (10, 15, 20, 24)}
VARIANTS = {'quality': (1, 6, 9, 11), 'lgwin': (10, 15, 20, 24)}
def _check_decompression(self, test_data, **kwargs):
# Write decompression to temp file and verify it matches the original.
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
temp_compressed = _test_utils.get_temp_compressed_name(test_data)
original = test_data
args = BRO_ARGS + ['-f', '-d']
args.extend(['-i', temp_compressed, '-o', temp_uncompressed])
subprocess.check_call(args, env=TEST_ENV)
self.assertFilesMatch(temp_uncompressed, original)
def _check_decompression(self, test_data):
# Write decompression to temp file and verify it matches the original.
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
temp_compressed = _test_utils.get_temp_compressed_name(test_data)
original = test_data
args = BRO_ARGS + ['-f', '-d']
args.extend(['-i', temp_compressed, '-o', temp_uncompressed])
subprocess.check_call(args, env=TEST_ENV)
self.assert_files_match(temp_uncompressed, original)
def _compress_file(self, test_data, **kwargs):
temp_compressed = _test_utils.get_temp_compressed_name(test_data)
args = BRO_ARGS + ['-f']
if 'quality' in kwargs:
args.extend(['-q', str(kwargs['quality'])])
if 'lgwin' in kwargs:
args.extend(['--lgwin', str(kwargs['lgwin'])])
args.extend(['-i', test_data, '-o', temp_compressed])
subprocess.check_call(args, env=TEST_ENV)
def _compress_file(self, test_data, **kwargs):
temp_compressed = _test_utils.get_temp_compressed_name(test_data)
args = BRO_ARGS + ['-f']
if 'quality' in kwargs:
args.extend(['-q', str(kwargs['quality'])])
if 'lgwin' in kwargs:
args.extend(['--lgwin', str(kwargs['lgwin'])])
args.extend(['-i', test_data, '-o', temp_compressed])
subprocess.check_call(args, env=TEST_ENV)
def _compress_pipe(self, test_data, **kwargs):
temp_compressed = _test_utils.get_temp_compressed_name(test_data)
args = BRO_ARGS
if 'quality' in kwargs:
args.extend(['-q', str(kwargs['quality'])])
if 'lgwin' in kwargs:
args.extend(['--lgwin', str(kwargs['lgwin'])])
with open(temp_compressed, 'wb') as out_file:
with open(test_data, 'rb') as in_file:
subprocess.check_call(
args, stdin=in_file, stdout=out_file, env=TEST_ENV)
def _compress_pipe(self, test_data, **kwargs):
temp_compressed = _test_utils.get_temp_compressed_name(test_data)
args = BRO_ARGS
if 'quality' in kwargs:
args.extend(['-q', str(kwargs['quality'])])
if 'lgwin' in kwargs:
args.extend(['--lgwin', str(kwargs['lgwin'])])
with open(temp_compressed, 'wb') as out_file:
with open(test_data, 'rb') as in_file:
subprocess.check_call(
args, stdin=in_file, stdout=out_file, env=TEST_ENV
)
def _test_compress_file(self, test_data, **kwargs):
self._compress_file(test_data, **kwargs)
self._check_decompression(test_data)
def _test_compress_file(self, test_data, **kwargs):
self._compress_file(test_data, **kwargs)
self._check_decompression(test_data)
def _test_compress_pipe(self, test_data, **kwargs):
self._compress_pipe(test_data, **kwargs)
self._check_decompression(test_data)
def _test_compress_pipe(self, test_data, **kwargs):
self._compress_pipe(test_data, **kwargs)
self._check_decompression(test_data)
_test_utils.generate_test_methods(
TestBroCompress, variants=TestBroCompress.VARIANTS)
TestBroCompress, variants=TestBroCompress.VARIANTS
)
if __name__ == '__main__':
unittest.main()
unittest.main()

@@ -5,37 +5,37 @@
import unittest
from . import _test_utils
import brotli
from . import _test_utils
class TestCompress(_test_utils.TestCase):
VARIANTS = {'quality': (1, 6, 9, 11), 'lgwin': (10, 15, 20, 24)}
VARIANTS = {'quality': (1, 6, 9, 11), 'lgwin': (10, 15, 20, 24)}
def _check_decompression(self, test_data, **kwargs):
kwargs = {}
# Write decompression to temp file and verify it matches the original.
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
temp_compressed = _test_utils.get_temp_compressed_name(test_data)
original = test_data
with open(temp_uncompressed, 'wb') as out_file:
with open(temp_compressed, 'rb') as in_file:
out_file.write(brotli.decompress(in_file.read(), **kwargs))
self.assertFilesMatch(temp_uncompressed, original)
def _check_decompression(self, test_data):
# Write decompression to temp file and verify it matches the original.
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
temp_compressed = _test_utils.get_temp_compressed_name(test_data)
original = test_data
with open(temp_uncompressed, 'wb') as out_file:
with open(temp_compressed, 'rb') as in_file:
out_file.write(brotli.decompress(in_file.read()))
self.assert_files_match(temp_uncompressed, original)
def _compress(self, test_data, **kwargs):
temp_compressed = _test_utils.get_temp_compressed_name(test_data)
with open(temp_compressed, 'wb') as out_file:
with open(test_data, 'rb') as in_file:
out_file.write(brotli.compress(in_file.read(), **kwargs))
def _compress(self, test_data, **kwargs):
temp_compressed = _test_utils.get_temp_compressed_name(test_data)
with open(temp_compressed, 'wb') as out_file:
with open(test_data, 'rb') as in_file:
out_file.write(brotli.compress(in_file.read(), **kwargs))
def _test_compress(self, test_data, **kwargs):
self._compress(test_data, **kwargs)
self._check_decompression(test_data, **kwargs)
def _test_compress(self, test_data, **kwargs):
self._compress(test_data, **kwargs)
self._check_decompression(test_data)
_test_utils.generate_test_methods(TestCompress, variants=TestCompress.VARIANTS)
if __name__ == '__main__':
unittest.main()
unittest.main()

@@ -6,61 +6,63 @@
import functools
import unittest
from . import _test_utils
import brotli
from . import _test_utils
# Do not inherit from TestCase here to ensure that test methods
# are not run automatically and instead are run as part of a specific
# configuration below.
class _TestCompressor(object):
CHUNK_SIZE = 2048
CHUNK_SIZE = 2048
def tearDown(self):
self.compressor = None
def tearDown(self):
self.compressor = None
# super().tearDown() # Requires Py3+
def _check_decompression(self, test_data):
# Write decompression to temp file and verify it matches the original.
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
temp_compressed = _test_utils.get_temp_compressed_name(test_data)
original = test_data
with open(temp_uncompressed, 'wb') as out_file:
with open(temp_compressed, 'rb') as in_file:
out_file.write(brotli.decompress(in_file.read()))
self.assertFilesMatch(temp_uncompressed, original)
def _check_decompression(self, test_data):
# Write decompression to temp file and verify it matches the original.
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
temp_compressed = _test_utils.get_temp_compressed_name(test_data)
original = test_data
with open(temp_uncompressed, 'wb') as out_file:
with open(temp_compressed, 'rb') as in_file:
out_file.write(brotli.decompress(in_file.read()))
self.assert_files_match(temp_uncompressed, original)
def _test_single_process(self, test_data):
# Write single-shot compression to temp file.
temp_compressed = _test_utils.get_temp_compressed_name(test_data)
with open(temp_compressed, 'wb') as out_file:
with open(test_data, 'rb') as in_file:
out_file.write(self.compressor.process(in_file.read()))
out_file.write(self.compressor.finish())
self._check_decompression(test_data)
def _test_single_process(self, test_data):
# Write single-shot compression to temp file.
temp_compressed = _test_utils.get_temp_compressed_name(test_data)
with open(temp_compressed, 'wb') as out_file:
with open(test_data, 'rb') as in_file:
out_file.write(self.compressor.process(in_file.read()))
out_file.write(self.compressor.finish())
self._check_decompression(test_data)
def _test_multiple_process(self, test_data):
# Write chunked compression to temp file.
temp_compressed = _test_utils.get_temp_compressed_name(test_data)
with open(temp_compressed, 'wb') as out_file:
with open(test_data, 'rb') as in_file:
read_chunk = functools.partial(in_file.read, self.CHUNK_SIZE)
for data in iter(read_chunk, b''):
out_file.write(self.compressor.process(data))
out_file.write(self.compressor.finish())
self._check_decompression(test_data)
def _test_multiple_process(self, test_data):
# Write chunked compression to temp file.
temp_compressed = _test_utils.get_temp_compressed_name(test_data)
with open(temp_compressed, 'wb') as out_file:
with open(test_data, 'rb') as in_file:
read_chunk = functools.partial(in_file.read, self.CHUNK_SIZE)
for data in iter(read_chunk, b''):
out_file.write(self.compressor.process(data))
out_file.write(self.compressor.finish())
self._check_decompression(test_data)
def _test_multiple_process_and_flush(self, test_data):
# Write chunked and flushed compression to temp file.
temp_compressed = _test_utils.get_temp_compressed_name(test_data)
with open(temp_compressed, 'wb') as out_file:
with open(test_data, 'rb') as in_file:
read_chunk = functools.partial(in_file.read, self.CHUNK_SIZE)
for data in iter(read_chunk, b''):
out_file.write(self.compressor.process(data))
out_file.write(self.compressor.flush())
out_file.write(self.compressor.finish())
self._check_decompression(test_data)
def _test_multiple_process_and_flush(self, test_data):
# Write chunked and flushed compression to temp file.
temp_compressed = _test_utils.get_temp_compressed_name(test_data)
with open(temp_compressed, 'wb') as out_file:
with open(test_data, 'rb') as in_file:
read_chunk = functools.partial(in_file.read, self.CHUNK_SIZE)
for data in iter(read_chunk, b''):
out_file.write(self.compressor.process(data))
out_file.write(self.compressor.flush())
out_file.write(self.compressor.finish())
self._check_decompression(test_data)
_test_utils.generate_test_methods(_TestCompressor)
@@ -68,27 +70,31 @@ _test_utils.generate_test_methods(_TestCompressor)
class TestCompressorQuality1(_TestCompressor, _test_utils.TestCase):
def setUp(self):
self.compressor = brotli.Compressor(quality=1)
def setUp(self):
# super().setUp() # Requires Py3+
self.compressor = brotli.Compressor(quality=1)
class TestCompressorQuality6(_TestCompressor, _test_utils.TestCase):
def setUp(self):
self.compressor = brotli.Compressor(quality=6)
def setUp(self):
# super().setUp() # Requires Py3+
self.compressor = brotli.Compressor(quality=6)
class TestCompressorQuality9(_TestCompressor, _test_utils.TestCase):
def setUp(self):
self.compressor = brotli.Compressor(quality=9)
def setUp(self):
# super().setUp() # Requires Py3+
self.compressor = brotli.Compressor(quality=9)
class TestCompressorQuality11(_TestCompressor, _test_utils.TestCase):
def setUp(self):
self.compressor = brotli.Compressor(quality=11)
def setUp(self):
# super().setUp() # Requires Py3+
self.compressor = brotli.Compressor(quality=11)
if __name__ == '__main__':
unittest.main()
unittest.main()

@@ -5,38 +5,39 @@
import unittest
from . import _test_utils
import brotli
from . import _test_utils
def _get_original_name(test_data):
return test_data.split('.compressed')[0]
return test_data.split('.compressed')[0]
class TestDecompress(_test_utils.TestCase):
def _check_decompression(self, test_data):
# Verify decompression matches the original.
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
original = _get_original_name(test_data)
self.assertFilesMatch(temp_uncompressed, original)
def _check_decompression(self, test_data):
# Verify decompression matches the original.
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
original = _get_original_name(test_data)
self.assert_files_match(temp_uncompressed, original)
def _decompress(self, test_data):
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
with open(temp_uncompressed, 'wb') as out_file:
with open(test_data, 'rb') as in_file:
out_file.write(brotli.decompress(in_file.read()))
def _decompress(self, test_data):
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
with open(temp_uncompressed, 'wb') as out_file:
with open(test_data, 'rb') as in_file:
out_file.write(brotli.decompress(in_file.read()))
def _test_decompress(self, test_data):
self._decompress(test_data)
self._check_decompression(test_data)
def _test_decompress(self, test_data):
self._decompress(test_data)
self._check_decompression(test_data)
def test_garbage_appended(self):
with self.assertRaises(brotli.error):
brotli.decompress(brotli.compress(b'a') + b'a')
def test_garbage_appended(self):
with self.assertRaises(brotli.error):
brotli.decompress(brotli.compress(b'a') + b'a')
_test_utils.generate_test_methods(TestDecompress, for_decompression=True)
if __name__ == '__main__':
unittest.main()
unittest.main()
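`test_garbage_appended` above relies on `brotli.decompress` raising `brotli.error` when bytes follow a complete stream. zlib reports trailing bytes instead of raising, which lets the same check be sketched in a self-contained form (zlib stands in for brotli here; the helper name is invented for illustration):

```python
import zlib

def split_trailing_garbage(data):
    # Decompress one complete stream and report any trailing bytes --
    # the condition the brotli test above expects to raise on.
    d = zlib.decompressobj()
    out = d.decompress(data)
    if not d.eof:
        raise ValueError('truncated stream')
    return out, d.unused_data

body, garbage = split_trailing_garbage(zlib.compress(b'a') + b'a')
```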

@@ -7,95 +7,112 @@ import functools
import os
import unittest
from . import _test_utils
import brotli
from . import _test_utils
def _get_original_name(test_data):
return test_data.split('.compressed')[0]
return test_data.split('.compressed')[0]
class TestDecompressor(_test_utils.TestCase):
CHUNK_SIZE = 1
CHUNK_SIZE = 1
MIN_OUTPUT_BUFFER_SIZE = 32768 # Actually, several bytes less.
def setUp(self):
self.decompressor = brotli.Decompressor()
def setUp(self):
# super().setUp() # Requires Py3+
self.decompressor = brotli.Decompressor()
def tearDown(self):
self.decompressor = None
def tearDown(self):
self.decompressor = None
# super().tearDown() # Requires Py3+
def _check_decompression(self, test_data):
# Verify decompression matches the original.
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
original = _get_original_name(test_data)
self.assertFilesMatch(temp_uncompressed, original)
def _check_decompression(self, test_data):
# Verify decompression matches the original.
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
original = _get_original_name(test_data)
self.assert_files_match(temp_uncompressed, original)
def _decompress(self, test_data):
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
with open(temp_uncompressed, 'wb') as out_file:
with open(test_data, 'rb') as in_file:
read_chunk = functools.partial(in_file.read, self.CHUNK_SIZE)
for data in iter(read_chunk, b''):
out_file.write(self.decompressor.process(data))
self.assertTrue(self.decompressor.is_finished())
def _decompress(self, test_data):
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
with open(temp_uncompressed, 'wb') as out_file:
with open(test_data, 'rb') as in_file:
read_chunk = functools.partial(in_file.read, self.CHUNK_SIZE)
for data in iter(read_chunk, b''):
out_file.write(self.decompressor.process(data))
self.assertTrue(self.decompressor.is_finished())
def _decompress_with_limit(self, test_data, max_output_length):
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
with open(temp_uncompressed, 'wb') as out_file:
with open(test_data, 'rb') as in_file:
chunk_iter = iter(functools.partial(in_file.read, 10 * 1024), b'')
while not self.decompressor.is_finished():
data = b''
if self.decompressor.can_accept_more_data():
data = next(chunk_iter, b'')
decompressed_data = self.decompressor.process(data, max_output_length=max_output_length)
self.assertTrue(len(decompressed_data) <= max_output_length)
out_file.write(decompressed_data)
self.assertTrue(next(chunk_iter, None) == None)
def _decompress_with_limit(self, test_data):
output_buffer_limit = 10922
temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
with open(temp_uncompressed, 'wb') as out_file:
with open(test_data, 'rb') as in_file:
chunk_iter = iter(functools.partial(in_file.read, 10 * 1024), b'')
while not self.decompressor.is_finished():
data = b''
if self.decompressor.can_accept_more_data():
data = next(chunk_iter, b'')
decompressed_data = self.decompressor.process(
data, output_buffer_limit=output_buffer_limit
)
self.assertLessEqual(
len(decompressed_data), self.MIN_OUTPUT_BUFFER_SIZE
)
out_file.write(decompressed_data)
self.assertIsNone(next(chunk_iter, None))
def _test_decompress(self, test_data):
self._decompress(test_data)
self._check_decompression(test_data)
def _test_decompress(self, test_data):
self._decompress(test_data)
self._check_decompression(test_data)
def _test_decompress_with_limit(self, test_data):
self._decompress_with_limit(test_data, max_output_length=20)
self._check_decompression(test_data)
def _test_decompress_with_limit(self, test_data):
self._decompress_with_limit(test_data)
self._check_decompression(test_data)
def test_too_much_input(self):
with open(os.path.join(_test_utils.TESTDATA_DIR, "zerosukkanooa.compressed"), 'rb') as in_file:
compressed = in_file.read()
self.decompressor.process(compressed[:-1], max_output_length=1)
# the following assertion checks whether the test setup is correct
self.assertTrue(not self.decompressor.can_accept_more_data())
with self.assertRaises(brotli.error):
self.decompressor.process(compressed[-1:])
def test_too_much_input(self):
with open(
os.path.join(_test_utils.TESTDATA_DIR, 'zerosukkanooa.compressed'), 'rb'
) as in_file:
compressed = in_file.read()
self.decompressor.process(compressed[:-1], output_buffer_limit=10240)
# the following assertion checks whether the test setup is correct
self.assertFalse(self.decompressor.can_accept_more_data())
with self.assertRaises(brotli.error):
       self.decompressor.process(compressed[-1:])

   def test_changing_limit(self):
-    test_data = os.path.join(_test_utils.TESTDATA_DIR, "zerosukkanooa.compressed")
-    temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
-    with open(temp_uncompressed, 'wb') as out_file:
-      with open(test_data, 'rb') as in_file:
-        compressed = in_file.read()
-        uncompressed = self.decompressor.process(compressed[:-1], max_output_length=1)
-        self.assertTrue(len(uncompressed) <= 1)
-        out_file.write(uncompressed)
-        while not self.decompressor.can_accept_more_data():
-          out_file.write(self.decompressor.process(b''))
-        out_file.write(self.decompressor.process(compressed[-1:]))
-    self._check_decompression(test_data)
+    test_data = os.path.join(
+        _test_utils.TESTDATA_DIR, 'zerosukkanooa.compressed'
+    )
+    check_output = os.path.exists(test_data.replace('.compressed', ''))
+    temp_uncompressed = _test_utils.get_temp_uncompressed_name(test_data)
+    with open(temp_uncompressed, 'wb') as out_file:
+      with open(test_data, 'rb') as in_file:
+        compressed = in_file.read()
+        uncompressed = self.decompressor.process(
+            compressed[:-1], output_buffer_limit=10240
+        )
+        self.assertLessEqual(len(uncompressed), self.MIN_OUTPUT_BUFFER_SIZE)
+        out_file.write(uncompressed)
+        while not self.decompressor.can_accept_more_data():
+          out_file.write(self.decompressor.process(b''))
+        out_file.write(self.decompressor.process(compressed[-1:]))
+    if check_output:
+      self._check_decompression(test_data)

   def test_garbage_appended(self):
     with self.assertRaises(brotli.error):
       self.decompressor.process(brotli.compress(b'a') + b'a')

   def test_already_finished(self):
     self.decompressor.process(brotli.compress(b'a'))
     with self.assertRaises(brotli.error):
       self.decompressor.process(b'a')

 _test_utils.generate_test_methods(TestDecompressor, for_decompression=True)

 if __name__ == '__main__':
   unittest.main()
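The drain loop exercised by the new test can be sketched as a standalone program. Note the hedge: `output_buffer_limit` and `can_accept_more_data()` exist only in Python bindings built from this release (the parameter replaces `max_output_length`), so the sketch falls back to one-shot `brotli.decompress` on older bindings.

```python
import brotli

data = b'brotli streaming example ' * 8000
compressed = brotli.compress(data)

decompressor = brotli.Decompressor()
if hasattr(decompressor, 'can_accept_more_data'):
    # Cap the first process() call's output; while buffered data remains,
    # can_accept_more_data() stays False and we drain with empty input.
    output = bytearray(decompressor.process(compressed, output_buffer_limit=4096))
    while not decompressor.can_accept_more_data():
        output += decompressor.process(b'')
else:
    # Older bindings (<= 1.1.0): no output cap available, decode in one shot.
    output = bytearray(brotli.decompress(compressed))

assert bytes(output) == data
```

Capping the per-call output bounds peak memory when decompressing untrusted input, which is what the `test_changing_limit` test verifies against the reference data.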


@@ -7,14 +7,14 @@
 module(
     name = "brotli_research",
-    version = "1.1.0",
+    version = "1.2.0",
     repo_name = "org_brotli_research",
 )

 bazel_dep(name = "divsufsort", version = "2.0.1")
 bazel_dep(name = "esaxx", version = "20250106.0")
-bazel_dep(name = "brotli", version = "1.1.0", repo_name = "org_brotli")
+bazel_dep(name = "brotli", version = "1.2.0", repo_name = "org_brotli")

 local_path_override(
     module_name = "brotli",
     path = "..",


@@ -1286,8 +1286,9 @@ class WordList:
     return word.encode('utf8')

-  #Super compact form of action table.
-  #_ means space, .U means UpperCaseAll, U(w) means UpperCaseFirst
+  # Super compact form of action table.
+  # _ means space, .U means UpperCaseAll, U(w) means UpperCaseFirst
+  # typo:off
   actionTable = r"""
 0:w 25:w+_for_ 50:w+\n\t 75:w+. This_100:w+ize_
 1:w+_ 26:w[3:] 51:w+: 76:w+, 101:w.U+.
@@ -1315,6 +1316,7 @@ class WordList:
 23:w[:-3] 48:w[:-7] 98:_+w+=\'
 24:w+] 49:w[:-1]+ing_ 74:U(w)+\' 99:U(w)+,
 """
+  # typo:on

   def compileActions(self):
     """Build the action table from the text above

scripts/check_typos.sh (new executable file)

@@ -0,0 +1,10 @@
+#!/bin/bash
+HERE=`realpath $(dirname "$0")`
+PROJECT_DIR=`realpath ${HERE}/..`
+SRC_EXT="bazel|bzl|c|cc|cmake|gni|h|html|in|java|js|m|md|nix|py|rst|sh|ts|txt|yaml|yml"
+
+cd "${PROJECT_DIR}"
+sources=`find . -type f | sort |grep -E "\.(${SRC_EXT})$" | grep -v -E "^(./)?tests/testdata/" | grep -v -E "\.min\.js$" | grep -v -E "brotli_dictionary\.txt$"`
+echo "Checking sources:"
+echo "${sources}"
+typos -c "${HERE}/typos.toml" ${sources}


@@ -7,5 +7,5 @@ Name: libbrotlicommon
 URL: https://github.com/google/brotli
 Description: Brotli common dictionary library
 Version: @PACKAGE_VERSION@
-Libs: -L${libdir} -lbrotlicommon
+Libs: -L${libdir} -lbrotlicommon @libm@
 Cflags: -I${includedir}


@@ -8,5 +8,5 @@ URL: https://github.com/google/brotli
 Description: Brotli decoder library
 Version: @PACKAGE_VERSION@
 Libs: -L${libdir} -lbrotlidec
-Requires.private: libbrotlicommon >= 1.1.0
+Requires.private: libbrotlicommon >= 1.2.0
 Cflags: -I${includedir}


@@ -8,5 +8,5 @@ URL: https://github.com/google/brotli
 Description: Brotli encoder library
 Version: @PACKAGE_VERSION@
 Libs: -L${libdir} -lbrotlienc
-Requires.private: libbrotlicommon >= 1.1.0
+Requires.private: libbrotlicommon >= 1.2.0
 Cflags: -I${includedir}

scripts/typos.toml (new file)

@@ -0,0 +1,17 @@
+[default]
+extend-ignore-re = [
+    "(?Rm)^.*// notypo$", # disable check in current line
+    "(?s)(#|//)\\s*typo:off.*?\\n\\s*(#|//)\\s*typo:on", # disable check in block
+    "0x[0-9a-fA-F]+[ ,u]", # hexadecimal literal
+    "\\W2-nd\\W", # second
+    "\\W2\\^nd\\W", # second with superscript
+]
+
+[default.extend-words]
+sais = "sais" # SAIS library
+uncompressible = "uncompressible" # personal choice
+flate = "flate" # compression algorithm
+
+[default.extend-identifiers]
+gl_pathc = "gl_pathc" # glob_t
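The suppression markers declared above can be used directly in sources. A minimal sketch in Python (per the regexes in `typos.toml`, the block form accepts `#` or `//` comments, while the line-level form matches only C-style `// notypo`):

```python
# typo:off
# Everything between typo:off and typo:on is skipped by the checker,
# which is how the WordList action table is exempted.
ACTION_FRAGMENT = "49:w[:-1]+ing_ 74:U(w)+\\'"
# typo:on

# In C-family sources a single line can be exempted instead:
#   int teh_count = 0;  // notypo

assert "ing_" in ACTION_FRAGMENT
```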


@@ -24,7 +24,7 @@ from distutils import log
 CURR_DIR = os.path.abspath(os.path.dirname(os.path.realpath(__file__)))

-def bool_from_environ(key: str):
+def bool_from_environ(key):
   value = os.environ.get(key)
   if not value:
     return False
@@ -32,7 +32,7 @@ def bool_from_environ(key: str):
     return True
   if value == "0":
     return False
-  raise ValueError(f"Environment variable {key} has invalid value {value}. Please set it to 1, 0 or an empty string")
+  raise ValueError("Environment variable {} has invalid value {}. Please set it to 1, 0 or an empty string".format(key, value))

def read_define(path, macro):
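Reassembled from the two hunks above, the updated helper reads as follows (the `.format()` call replaces the f-string so the file stays compatible with old Python versions; the environment variable names below are hypothetical, for illustration only):

```python
import os

def bool_from_environ(key):
  # Unset or empty means False; "1" means True; "0" means False;
  # anything else is rejected loudly.
  value = os.environ.get(key)
  if not value:
    return False
  if value == "1":
    return True
  if value == "0":
    return False
  raise ValueError("Environment variable {} has invalid value {}. Please set it to 1, 0 or an empty string".format(key, value))

os.environ["BROTLI_EXAMPLE_FLAG"] = "1"  # hypothetical variable
assert bool_from_environ("BROTLI_EXAMPLE_FLAG") is True
assert bool_from_environ("BROTLI_UNSET_EXAMPLE") is False
```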
@@ -94,9 +94,7 @@ class BuildExt(build_ext):
     objects = []
     macros = ext.define_macros[:]
-    if platform.system() == "Darwin":
-      macros.append(("OS_MACOSX", "1"))
-    elif self.compiler.compiler_type == "mingw32":
+    if self.compiler.compiler_type == "mingw32":
       # On Windows Python 2.7, pyconfig.h defines "hypot" as "_hypot",
       # This clashes with GCC's cmath, and causes compilation errors when
       # building under MinGW: http://bugs.python.org/issue11566
@@ -160,7 +158,8 @@ CLASSIFIERS = [
     "Development Status :: 4 - Beta",
     "Environment :: Console",
     "Intended Audience :: Developers",
-    "License :: OSI Approved :: MIT License",
+    # Deprecated, see https://packaging.python.org/en/latest/guides/writing-pyproject-toml/#license for details.
+    # "License :: OSI Approved :: MIT License",
     "Operating System :: MacOS :: MacOS X",
     "Operating System :: Microsoft :: Windows",
     "Operating System :: POSIX :: Linux",