Merge branch v3.1 into v3.2

kevinpollet 2024-10-09 16:28:03 +02:00
commit be13b5b55d
No known key found for this signature in database
GPG key ID: 0C9A5DDD1B292453
22 changed files with 1616 additions and 1483 deletions


@@ -7,7 +7,7 @@ on:
 env:
   GO_VERSION: '1.23'
-  GOLANGCI_LINT_VERSION: v1.60.3
+  GOLANGCI_LINT_VERSION: v1.61.0
   MISSPELL_VERSION: v0.6.0

 jobs:


@@ -25,7 +25,7 @@ global_job_config:
       - export "PATH=${GOPATH}/bin:${PATH}"
       - mkdir -vp "${SEMAPHORE_GIT_DIR}" "${GOPATH}/bin"
       - export GOPROXY=https://proxy.golang.org,direct
-      - curl -sSfL https://raw.githubusercontent.com/golangci/golangci-lint/master/install.sh | sh -s -- -b "${GOPATH}/bin" v1.60.3
+      - curl -sSfL https://raw.githubusercontent.com/golangci/golangci-lint/master/install.sh | sh -s -- -b "${GOPATH}/bin" v1.61.0
       - curl -sSfL https://gist.githubusercontent.com/traefiker/6d7ac019c11d011e4f131bb2cca8900e/raw/goreleaser.sh | bash -s -- -b "${GOPATH}/bin"
       - checkout
       - cache restore traefik-$(checksum go.sum)


@@ -1,3 +1,27 @@
+## [v3.1.6](https://github.com/traefik/traefik/tree/v3.1.6) (2024-10-09)
+[All Commits](https://github.com/traefik/traefik/compare/v3.1.5...v3.1.6)
+
+**Bug fixes:**
+- **[middleware]** Reuse compression writers ([#11168](https://github.com/traefik/traefik/pull/11168) by [michelheusschen](https://github.com/michelheusschen))
+- **[middleware]** Use correct default weight in Accept-Encoding ([#11084](https://github.com/traefik/traefik/pull/11084) by [michelheusschen](https://github.com/michelheusschen))
+- **[plugins]** Close wasm middleware to prevent memory leak ([#11151](https://github.com/traefik/traefik/pull/11151) by [ttys3](https://github.com/ttys3))
+
+**Misc:**
+- Merge branch v2.11 into v3.1 ([#11179](https://github.com/traefik/traefik/pull/11179) by [kevinpollet](https://github.com/kevinpollet))
+- Merge branch v2.11 into v3.1 ([#11174](https://github.com/traefik/traefik/pull/11174) by [mmatur](https://github.com/mmatur))
+
+## [v2.11.12](https://github.com/traefik/traefik/tree/v2.11.12) (2024-10-09)
+[All Commits](https://github.com/traefik/traefik/compare/v2.11.11...v2.11.12)
+
+**Bug fixes:**
+- **[middleware]** Bump github.com/klauspost/compress to dbd6c381492a ([#11162](https://github.com/traefik/traefik/pull/11162) by [kevinpollet](https://github.com/kevinpollet))
+- **[webui]** Upgrade to node 22.9 and yarn lock to fix vulnerabilities ([#11173](https://github.com/traefik/traefik/pull/11173) by [kevinpollet](https://github.com/kevinpollet))
+- **[webui]** Adopt a layout for the large amount of entrypoint port numbers ([#11157](https://github.com/traefik/traefik/pull/11157) by [framebassman](https://github.com/framebassman))
+
+**Documentation:**
+- **[accesslogs]** Clarify that only header fields may be redacted in access-logs ([#11139](https://github.com/traefik/traefik/pull/11139) by [mattbnz](https://github.com/mattbnz))
+- Update business callout ([#11172](https://github.com/traefik/traefik/pull/11172) by [tomatokoolaid](https://github.com/tomatokoolaid))
+
 ## [v3.2.0-rc1](https://github.com/traefik/traefik/tree/v3.2.0-rc1) (2024-10-02)
 [All Commits](https://github.com/traefik/traefik/compare/v3.1.0-rc1...v3.2.0-rc1)

@@ -32,6 +56,8 @@

 **Bug fixes:**
 - **[k8s/ingress,k8s]** Disable IngressClass lookup when disableClusterScopeResources is enabled ([#11111](https://github.com/traefik/traefik/pull/11111) by [jnoordsij](https://github.com/jnoordsij))
 - **[server]** Rework condition to not log on timeout ([#11132](https://github.com/traefik/traefik/pull/11132) by [rtribotte](https://github.com/rtribotte))
+
+**Misc:**
 - Merge branch v2.11 into v3.1 ([#11149](https://github.com/traefik/traefik/pull/11149) by [kevinpollet](https://github.com/kevinpollet))
 - Merge branch v2.11 into v3.1 ([#11142](https://github.com/traefik/traefik/pull/11142) by [rtribotte](https://github.com/rtribotte))


@@ -1,13 +1,10 @@
 SRCS = $(shell git ls-files '*.go' | grep -v '^vendor/')

-TAG_NAME := $(shell git tag -l --contains HEAD)
+TAG_NAME := $(shell git describe --abbrev=0 --tags --exact-match)
 SHA := $(shell git rev-parse HEAD)
 VERSION_GIT := $(if $(TAG_NAME),$(TAG_NAME),$(SHA))
 VERSION := $(if $(VERSION),$(VERSION),$(VERSION_GIT))
-GIT_BRANCH := $(subst heads/,,$(shell git rev-parse --abbrev-ref HEAD 2>/dev/null))
-REPONAME := $(shell echo $(REPO) | tr '[:upper:]' '[:lower:]')

 BIN_NAME := traefik
 CODENAME ?= cheddar


@@ -5,6 +5,6 @@
 Add API Gateway or API Management capabilities seamlessly to your existing Traefik deployments.
 No rip and replace. No learning curve.

-- [Explore our API Gateway](https://traefik.io/traefik-hub-api-gateway/)
+- [Explore our API Gateway](https://traefik.io/traefik-hub-api-gateway/) ([Watch the Demo Video](https://info.traefik.io/watch-traefik-api-gw-demo?cta=doc))
 - [Explore our API Management](https://traefik.io/traefik-hub/)
 - [Get 24/7/365 Commercial Support for Traefik OSS](https://info.traefik.io/request-commercial-support)


@@ -158,7 +158,8 @@ Each field can be set to:
 - `keep` to keep the value
 - `drop` to drop the value
-- `redact` to replace the value with "redacted"
+
+Header fields may also optionally be set to `redact` to replace the value with "REDACTED".

 The `defaultMode` for `fields.names` is `keep`.
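For reference, the keep/drop/redact behaviour described by this documentation change can be summarized in a minimal sketch. The `applyFieldMode` helper below is purely illustrative and is not Traefik's actual access-log implementation; it only mirrors the semantics of the three modes.

```go
package main

import "fmt"

// applyFieldMode is a hypothetical helper illustrating the three modes above.
// It returns the value to log and whether the field should appear at all.
func applyFieldMode(mode, value string) (string, bool) {
	switch mode {
	case "drop":
		return "", false // field is omitted from the log line
	case "redact":
		return "REDACTED", true // header value is masked (header fields only)
	default: // "keep"
		return value, true
	}
}

func main() {
	for _, mode := range []string{"keep", "drop", "redact"} {
		v, ok := applyFieldMode(mode, "Bearer secret-token")
		fmt.Printf("%-6s -> kept=%v value=%q\n", mode, ok, v)
	}
}
```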

go.mod

@@ -31,10 +31,10 @@ require (
     github.com/hashicorp/go-retryablehttp v0.7.7
     github.com/hashicorp/go-version v1.6.0
     github.com/hashicorp/nomad/api v0.0.0-20240122103822-8a4bd61caf74 // No tag on the repo.
-    github.com/http-wasm/http-wasm-host-go v0.6.0
+    github.com/http-wasm/http-wasm-host-go v0.7.0
     github.com/influxdata/influxdb-client-go/v2 v2.7.0
     github.com/influxdata/influxdb1-client v0.0.0-20200827194710-b269163b24ab // No tag on the repo.
-    github.com/klauspost/compress v1.17.11-0.20240927175842-8e14b1b5a913 // Required to have the content-type fix: https://github.com/klauspost/compress/pull/1011
+    github.com/klauspost/compress v1.17.11-0.20241004063537-dbd6c381492a // Required to have the content-type fix: https://github.com/klauspost/compress/pull/1013
     github.com/kvtools/consul v1.0.2
     github.com/kvtools/etcdv3 v1.0.2
     github.com/kvtools/redis v1.1.0
@@ -61,7 +61,7 @@ require (
     github.com/tailscale/tscert v0.0.0-20230806124524-28a91b69a046 // No tag on the repo.
     github.com/testcontainers/testcontainers-go v0.32.0
     github.com/testcontainers/testcontainers-go/modules/k3s v0.32.0
-    github.com/tetratelabs/wazero v1.7.2
+    github.com/tetratelabs/wazero v1.8.0
     github.com/tidwall/gjson v1.17.0
     github.com/traefik/grpc-web v0.16.0
     github.com/traefik/paerser v0.2.1
@@ -374,6 +374,3 @@ replace (
 // ambiguous import: found package github.com/tencentcloud/tencentcloud-sdk-go/tencentcloud/common/http in multiple modules
 // tencentcloud uses monorepo with multimodule but the go.mod files are incomplete.
 exclude github.com/tencentcloud/tencentcloud-sdk-go v3.0.83+incompatible
-
-// Replace to handle new wasmexport in official go and wazergo for http calls.
-replace github.com/http-wasm/http-wasm-host-go => github.com/traefik/http-wasm-host-go v0.0.0-20240618100324-3c53dcaa1a70

go.sum

@@ -551,6 +551,8 @@ github.com/hashicorp/serf v0.8.2/go.mod h1:6hOLApaqBFA1NXqRQAsxw9QxuDEvNxSQRwA/J
 github.com/hashicorp/serf v0.10.1 h1:Z1H2J60yRKvfDYAOZLd2MU0ND4AH/WDz7xYHDWQsIPY=
 github.com/hashicorp/serf v0.10.1/go.mod h1:yL2t6BqATOLGc5HF7qbFkTfXoPIY0WZdWHfEvMqbG+4=
 github.com/hpcloud/tail v1.0.0/go.mod h1:ab1qPbhIpdTxEkNHXyeSf5vhxWSCs/tWer42PpOxQnU=
+github.com/http-wasm/http-wasm-host-go v0.7.0 h1:+1KrRyOO6tWiDB24QrtSYyDmzFLBBs3jioKaUT0mq1c=
+github.com/http-wasm/http-wasm-host-go v0.7.0/go.mod h1:adXKcLmL7yuavH/e0kBAp7b3TgAHTo/enCduyN5bXGM=
 github.com/huandu/xstrings v1.3.3/go.mod h1:y5/lhBue+AyNmUVz9RLU9xbLR0o4KIIExikq4ovT0aE=
 github.com/huandu/xstrings v1.5.0 h1:2ag3IFq9ZDANvthTwTiqSSZLjDc+BedvHPAp5tJy2TI=
 github.com/huandu/xstrings v1.5.0/go.mod h1:y5/lhBue+AyNmUVz9RLU9xbLR0o4KIIExikq4ovT0aE=
@@ -597,8 +599,8 @@ github.com/kisielk/errcheck v1.1.0/go.mod h1:EZBBE59ingxPouuu3KfxchcWSUPOHkagtvW
 github.com/kisielk/errcheck v1.5.0/go.mod h1:pFxgyoBC7bSaBwPgfKdkLd5X25qrDl4LWUI2bnpBCr8=
 github.com/kisielk/gotool v1.0.0/go.mod h1:XhKaO+MFFWcvkIS/tQcRk01m1F5IRFswLeQ+oQHNcck=
 github.com/klauspost/compress v1.10.3/go.mod h1:aoV0uJVorq1K+umq18yTdKaF57EivdYsUV+/s2qKfXs=
-github.com/klauspost/compress v1.17.11-0.20240927175842-8e14b1b5a913 h1:7s7Xd7zVElAw1qh/eh+tXDNfDNXXj38Tpq54eeG6/BM=
-github.com/klauspost/compress v1.17.11-0.20240927175842-8e14b1b5a913/go.mod h1:pMDklpSncoRMuLFrf1W9Ss9KT+0rH90U12bZKk7uwG0=
+github.com/klauspost/compress v1.17.11-0.20241004063537-dbd6c381492a h1:cwHOqPB4H4iQq8177kf2SxpjNbcjJ2m3lNwKIe28Hqg=
+github.com/klauspost/compress v1.17.11-0.20241004063537-dbd6c381492a/go.mod h1:pMDklpSncoRMuLFrf1W9Ss9KT+0rH90U12bZKk7uwG0=
 github.com/klauspost/cpuid/v2 v2.0.9/go.mod h1:FInQzS24/EEf25PyTYn52gqo7WaD8xa0213Md/qVLRg=
 github.com/klauspost/cpuid/v2 v2.2.5 h1:0E5MSMDEoAulmXNFquVs//DdoomxaoTY1kUhbc/qbZg=
 github.com/klauspost/cpuid/v2 v2.2.5/go.mod h1:Lcz8mBdAVJIBVzewtcLocK12l3Y+JytZYpaMropDUws=
@@ -995,8 +997,8 @@ github.com/testcontainers/testcontainers-go v0.32.0 h1:ug1aK08L3gCHdhknlTTwWjPHP
 github.com/testcontainers/testcontainers-go v0.32.0/go.mod h1:CRHrzHLQhlXUsa5gXjTOfqIEJcrK5+xMDmBr/WMI88E=
 github.com/testcontainers/testcontainers-go/modules/k3s v0.32.0 h1:Z3DTMveNUqeGJZ+CXZhpvI7OF1BS71Ywi3SwoXLZ4Lc=
 github.com/testcontainers/testcontainers-go/modules/k3s v0.32.0/go.mod h1:SYp1WtvNc3n/cg5atO6LvaOd2aqkQYMSDCcWPOUdaZg=
-github.com/tetratelabs/wazero v1.7.2 h1:1+z5nXJNwMLPAWaTePFi49SSTL0IMx/i3Fg8Yc25GDc=
-github.com/tetratelabs/wazero v1.7.2/go.mod h1:ytl6Zuh20R/eROuyDaGPkp82O9C/DJfXAwJfQ3X6/7Y=
+github.com/tetratelabs/wazero v1.8.0 h1:iEKu0d4c2Pd+QSRieYbnQC9yiFlMS9D+Jr0LsRmcF4g=
+github.com/tetratelabs/wazero v1.8.0/go.mod h1:yAI0XTsMBhREkM/YDAK/zNou3GoiAce1P6+rp/wQhjs=
 github.com/tidwall/gjson v1.17.0 h1:/Jocvlh98kcTfpN2+JzGQWQcqrPQwDrVEMApx/M5ZwM=
 github.com/tidwall/gjson v1.17.0/go.mod h1:/wbyibRr2FHMks5tjHJ5F8dMZh3AcwJEMf5vlfC0lxk=
 github.com/tidwall/match v1.1.1 h1:+Ho715JplO36QYgwN9PGYNhgZvoUSc9X2c80KVTi+GA=
@@ -1011,8 +1013,6 @@ github.com/tklauser/numcpus v0.6.1/go.mod h1:1XfjsgE2zo8GVw7POkMbHENHzVg3GzmoZ9f
 github.com/tmc/grpc-websocket-proxy v0.0.0-20190109142713-0ad062ec5ee5/go.mod h1:ncp9v5uamzpCO7NfCPTXjqaC+bZgJeR0sMTm6dMHP7U=
 github.com/traefik/grpc-web v0.16.0 h1:eeUWZaFg6ZU0I9dWOYE2D5qkNzRBmXzzuRlxdltascY=
 github.com/traefik/grpc-web v0.16.0/go.mod h1:2ttniSv7pTgBWIU2HZLokxRfFX3SA60c/DTmQQgVml4=
-github.com/traefik/http-wasm-host-go v0.0.0-20240618100324-3c53dcaa1a70 h1:I+oBnV0orhmasb87yaX54tOAfqrV9+yKoQ1Cum5mq8w=
-github.com/traefik/http-wasm-host-go v0.0.0-20240618100324-3c53dcaa1a70/go.mod h1:zQB3w+df4hryDEqBorGyA1DwPJ86LfKIASNLFuj6CuI=
 github.com/traefik/paerser v0.2.1 h1:LFgeak1NmjEHF53c9ENdXdL1UMkF/lD5t+7Evsz4hH4=
 github.com/traefik/paerser v0.2.1/go.mod h1:7BBDd4FANoVgaTZG+yh26jI6CA2nds7D/4VTEdIsh24=
 github.com/traefik/yaegi v0.16.1 h1:f1De3DVJqIDKmnasUF6MwmWv1dSEEat0wcpXhD2On3E=


@@ -1,6 +1,7 @@
 package compress

 import (
+    "cmp"
     "slices"
     "strconv"
     "strings"
@@ -19,7 +20,7 @@ const (
 type Encoding struct {
     Type   string
-    Weight *float64
+    Weight float64
 }

 func getCompressionEncoding(acceptEncoding []string, defaultEncoding string, supportedEncodings []string) string {
@@ -42,11 +43,11 @@ func getCompressionEncoding(acceptEncoding []string, defaultEncoding string, sup
     encoding := encodings[0]

-    if encoding.Type == identityName && encoding.Weight != nil && *encoding.Weight == 0 {
+    if encoding.Type == identityName && encoding.Weight == 0 {
         return notAcceptable
     }

-    if encoding.Type == wildcardName && encoding.Weight != nil && *encoding.Weight == 0 {
+    if encoding.Type == wildcardName && encoding.Weight == 0 {
         return notAcceptable
     }

@@ -87,11 +88,13 @@ func parseAcceptEncoding(acceptEncoding, supportedEncodings []string) ([]Encodin
             continue
         }

-        var weight *float64
+        // If no "q" parameter is present, the default weight is 1.
+        // https://www.rfc-editor.org/rfc/rfc9110.html#name-quality-values
+        weight := 1.0
         if len(parsed) > 1 && strings.HasPrefix(parsed[1], "q=") {
             w, _ := strconv.ParseFloat(strings.TrimPrefix(parsed[1], "q="), 64)
-            weight = &w
+            weight = w
             hasWeight = true
         }
@@ -102,41 +105,9 @@ func parseAcceptEncoding(acceptEncoding, supportedEncodings []string) ([]Encodin
         }
     }

-    slices.SortFunc(encodings, compareEncoding)
+    slices.SortFunc(encodings, func(a, b Encoding) int {
+        return cmp.Compare(b.Weight, a.Weight)
+    })

     return encodings, hasWeight
 }
-
-func compareEncoding(a, b Encoding) int {
-    lhs, rhs := a.Weight, b.Weight
-
-    if lhs == nil && rhs == nil {
-        return 0
-    }
-
-    if lhs == nil && *rhs == 0 {
-        return -1
-    }
-
-    if lhs == nil {
-        return 1
-    }
-
-    if rhs == nil && *lhs == 0 {
-        return 1
-    }
-
-    if rhs == nil {
-        return -1
-    }
-
-    if *lhs < *rhs {
-        return 1
-    }
-
-    if *lhs > *rhs {
-        return -1
-    }
-
-    return 0
-}
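To make the new Accept-Encoding semantics concrete, here is a minimal, self-contained sketch written outside Traefik's package, with a simplified Encoding type: a missing q parameter now defaults to a weight of 1.0, and the anonymous SortFunc orders encodings from highest to lowest weight. Names and values in the snippet are assumptions for illustration only.

```go
package main

import (
	"cmp"
	"fmt"
	"slices"
)

// Encoding mirrors the simplified shape used in the diff above: the weight is
// a plain float64 that defaults to 1.0 when the client sends no "q" parameter.
type Encoding struct {
	Type   string
	Weight float64
}

func main() {
	// For "zstd,gzip, br;q=0.9", zstd and gzip get the RFC 9110 default of 1.0.
	encodings := []Encoding{
		{Type: "zstd", Weight: 1.0},
		{Type: "gzip", Weight: 1.0},
		{Type: "br", Weight: 0.9},
	}

	// Sort by weight, highest first, as the anonymous SortFunc above does.
	slices.SortFunc(encodings, func(a, b Encoding) int {
		return cmp.Compare(b.Weight, a.Weight)
	})

	// Highest weights come first, e.g. [{zstd 1} {gzip 1} {br 0.9}].
	fmt.Println(encodings)
}
```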


@@ -87,6 +87,12 @@ func Test_getCompressionEncoding(t *testing.T) {
             supportedEncodings: []string{zstdName, brotliName},
             expected:           brotliName,
         },
+        {
+            desc:               "mixed weight",
+            acceptEncoding:     []string{"gzip, br;q=0.9"},
+            supportedEncodings: []string{gzipName, brotliName},
+            expected:           gzipName,
+        },
     }

     for _, test := range testCases {
@@ -116,10 +122,10 @@ func Test_parseAcceptEncoding(t *testing.T) {
             desc:   "weight",
             values: []string{"br;q=1.0, zstd;q=0.9, gzip;q=0.8, *;q=0.1"},
             expected: []Encoding{
-                {Type: brotliName, Weight: ptr[float64](1)},
-                {Type: zstdName, Weight: ptr(0.9)},
-                {Type: gzipName, Weight: ptr(0.8)},
-                {Type: wildcardName, Weight: ptr(0.1)},
+                {Type: brotliName, Weight: 1},
+                {Type: zstdName, Weight: 0.9},
+                {Type: gzipName, Weight: 0.8},
+                {Type: wildcardName, Weight: 0.1},
             },
             assertWeight: assert.True,
         },
@@ -128,9 +134,9 @@
             values:             []string{"br;q=1.0, zstd;q=0.9, gzip;q=0.8, *;q=0.1"},
             supportedEncodings: []string{brotliName, gzipName},
             expected: []Encoding{
-                {Type: brotliName, Weight: ptr[float64](1)},
-                {Type: gzipName, Weight: ptr(0.8)},
-                {Type: wildcardName, Weight: ptr(0.1)},
+                {Type: brotliName, Weight: 1},
+                {Type: gzipName, Weight: 0.8},
+                {Type: wildcardName, Weight: 0.1},
             },
             assertWeight: assert.True,
         },
@@ -138,10 +144,10 @@ func Test_parseAcceptEncoding(t *testing.T) {
             desc:   "mixed",
             values: []string{"zstd,gzip, br;q=1.0, *;q=0"},
             expected: []Encoding{
-                {Type: brotliName, Weight: ptr[float64](1)},
-                {Type: zstdName},
-                {Type: gzipName},
-                {Type: wildcardName, Weight: ptr[float64](0)},
+                {Type: zstdName, Weight: 1},
+                {Type: gzipName, Weight: 1},
+                {Type: brotliName, Weight: 1},
+                {Type: wildcardName, Weight: 0},
             },
             assertWeight: assert.True,
         },
@@ -150,8 +156,8 @@ func Test_parseAcceptEncoding(t *testing.T) {
             values:             []string{"zstd,gzip, br;q=1.0, *;q=0"},
             supportedEncodings: []string{zstdName},
             expected: []Encoding{
-                {Type: zstdName},
-                {Type: wildcardName, Weight: ptr[float64](0)},
+                {Type: zstdName, Weight: 1},
+                {Type: wildcardName, Weight: 0},
             },
             assertWeight: assert.True,
         },
@@ -159,10 +165,10 @@ func Test_parseAcceptEncoding(t *testing.T) {
             desc:   "no weight",
             values: []string{"zstd, gzip, br, *"},
             expected: []Encoding{
-                {Type: zstdName},
-                {Type: gzipName},
-                {Type: brotliName},
-                {Type: wildcardName},
+                {Type: zstdName, Weight: 1},
+                {Type: gzipName, Weight: 1},
+                {Type: brotliName, Weight: 1},
+                {Type: wildcardName, Weight: 1},
             },
             assertWeight: assert.False,
         },
@@ -171,8 +177,8 @@ func Test_parseAcceptEncoding(t *testing.T) {
             values:             []string{"zstd, gzip, br, *"},
             supportedEncodings: []string{"gzip"},
             expected: []Encoding{
-                {Type: gzipName},
-                {Type: wildcardName},
+                {Type: gzipName, Weight: 1},
+                {Type: wildcardName, Weight: 1},
             },
             assertWeight: assert.False,
         },
@@ -180,9 +186,9 @@ func Test_parseAcceptEncoding(t *testing.T) {
             desc:   "weight and identity",
             values: []string{"gzip;q=1.0, identity; q=0.5, *;q=0"},
             expected: []Encoding{
-                {Type: gzipName, Weight: ptr[float64](1)},
-                {Type: identityName, Weight: ptr(0.5)},
-                {Type: wildcardName, Weight: ptr[float64](0)},
+                {Type: gzipName, Weight: 1},
+                {Type: identityName, Weight: 0.5},
+                {Type: wildcardName, Weight: 0},
             },
             assertWeight: assert.True,
         },
@@ -191,8 +197,8 @@ func Test_parseAcceptEncoding(t *testing.T) {
             values:             []string{"gzip;q=1.0, identity; q=0.5, *;q=0"},
             supportedEncodings: []string{"br"},
             expected: []Encoding{
-                {Type: identityName, Weight: ptr(0.5)},
-                {Type: wildcardName, Weight: ptr[float64](0)},
+                {Type: identityName, Weight: 0.5},
+                {Type: wildcardName, Weight: 0},
             },
             assertWeight: assert.True,
         },
@@ -213,7 +219,3 @@ func Test_parseAcceptEncoding(t *testing.T) {
         })
     }
 }
-
-func ptr[T any](t T) *T {
-    return &t
-}


@@ -688,39 +688,32 @@ func Test1xxResponses(t *testing.T) {
     assert.NotEqualValues(t, body, fakeBody)
 }

-func BenchmarkCompress(b *testing.B) {
+func BenchmarkCompressGzip(b *testing.B) {
+    runCompressionBenchmark(b, gzipName)
+}
+
+func BenchmarkCompressBrotli(b *testing.B) {
+    runCompressionBenchmark(b, brotliName)
+}
+
+func BenchmarkCompressZstandard(b *testing.B) {
+    runCompressionBenchmark(b, zstdName)
+}
+
+func runCompressionBenchmark(b *testing.B, algorithm string) {
+    b.Helper()
+
     testCases := []struct {
         name     string
         parallel bool
         size     int
     }{
-        {
-            name: "2k",
-            size: 2048,
-        },
-        {
-            name: "20k",
-            size: 20480,
-        },
-        {
-            name: "100k",
-            size: 102400,
-        },
-        {
-            name:     "2k parallel",
-            parallel: true,
-            size:     2048,
-        },
-        {
-            name:     "20k parallel",
-            parallel: true,
-            size:     20480,
-        },
-        {
-            name:     "100k parallel",
-            parallel: true,
-            size:     102400,
-        },
+        {"2k", false, 2048},
+        {"20k", false, 20480},
+        {"100k", false, 102400},
+        {"2k parallel", true, 2048},
+        {"20k parallel", true, 20480},
+        {"100k parallel", true, 102400},
     }

     for _, test := range testCases {
@@ -734,7 +727,7 @@
             handler, _ := New(context.Background(), next, dynamic.Compress{}, "testing")

             req, _ := http.NewRequest(http.MethodGet, "/whatever", nil)
-            req.Header.Set("Accept-Encoding", "gzip")
+            req.Header.Set("Accept-Encoding", algorithm)

             b.ReportAllocs()
             b.SetBytes(int64(test.size))
@@ -742,7 +735,7 @@
                 b.ResetTimer()
                 b.RunParallel(func(pb *testing.PB) {
                     for pb.Next() {
-                        runBenchmark(b, req, handler)
+                        runBenchmark(b, req, handler, algorithm)
                     }
                 })
                 return
@@ -750,13 +743,13 @@
             b.ResetTimer()
             for range b.N {
-                runBenchmark(b, req, handler)
+                runBenchmark(b, req, handler, algorithm)
             }
         })
     }
 }

-func runBenchmark(b *testing.B, req *http.Request, handler http.Handler) {
+func runBenchmark(b *testing.B, req *http.Request, handler http.Handler, algorithm string) {
     b.Helper()

     res := httptest.NewRecorder()
@@ -765,7 +758,7 @@ func runBenchmark(b *testing.B, req *http.Request, handler http.Handler) {
         b.Fatalf("Expected 200 but got %d", code)
     }

-    assert.Equal(b, gzipName, res.Header().Get(contentEncodingHeader))
+    assert.Equal(b, algorithm, res.Header().Get(contentEncodingHeader))
 }

 func generateBytes(length int) []byte {


@@ -8,6 +8,7 @@ import (
     "mime"
     "net"
     "net/http"
+    "sync"

     "github.com/andybalholm/brotli"
     "github.com/klauspost/compress/zstd"
@@ -45,6 +46,7 @@ type CompressionHandler struct {
     excludedContentTypes []parsedContentType
     includedContentTypes []parsedContentType
     next                 http.Handler
+    writerPool           sync.Pool
 }

 // NewCompressionHandler returns a new compressing handler.
@@ -92,7 +94,7 @@ func NewCompressionHandler(cfg Config, next http.Handler) (http.Handler, error)
 func (c *CompressionHandler) ServeHTTP(rw http.ResponseWriter, r *http.Request) {
     rw.Header().Add(vary, acceptEncoding)

-    compressionWriter, err := newCompressionWriter(c.cfg.Algorithm, rw)
+    compressionWriter, err := c.getCompressionWriter(rw)
     if err != nil {
         logger := middlewares.GetLogger(r.Context(), c.cfg.MiddlewareName, typeName)
         logger.Debug().Msgf("Create compression handler: %v", err)
@@ -100,6 +102,7 @@ func (c *CompressionHandler) ServeHTTP(rw http.ResponseWriter, r *http.Request)
         rw.WriteHeader(http.StatusInternalServerError)
         return
     }
+    defer c.putCompressionWriter(compressionWriter)

     responseWriter := &responseWriter{
         rw: rw,
@@ -130,6 +133,8 @@ type compression interface {
     // as it would otherwise send some extra "end of compression" bytes.
     // Close also makes sure to flush whatever was left to write from the buffer.
     Close() error
+    // Reset reinitializes the state of the encoder, allowing it to be reused.
+    Reset(w io.Writer)
 }

 type compressionWriter struct {
@@ -137,6 +142,19 @@ type compressionWriter struct {
     alg string
 }

+func (c *CompressionHandler) getCompressionWriter(rw io.Writer) (*compressionWriter, error) {
+    if writer, ok := c.writerPool.Get().(*compressionWriter); ok {
+        writer.compression.Reset(rw)
+        return writer, nil
+    }
+    return newCompressionWriter(c.cfg.Algorithm, rw)
+}
+
+func (c *CompressionHandler) putCompressionWriter(writer *compressionWriter) {
+    writer.Reset(nil)
+    c.writerPool.Put(writer)
+}
+
 func newCompressionWriter(algo string, in io.Writer) (*compressionWriter, error) {
     switch algo {
     case brotliName:
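The pooling logic above is the heart of the "reuse compression writers" fix. As a rough illustration of the same sync.Pool pattern outside Traefik's codebase, the sketch below pools gzip writers directly; the names writerPool and compress are made up for the example and are not part of Traefik's API.

```go
package main

import (
	"bytes"
	"compress/gzip"
	"fmt"
	"io"
	"sync"
)

// writerPool reuses gzip writers across calls, mirroring the pattern above:
// Get (or create) a writer, Reset it onto the current destination, and Put it
// back once the work is done so the encoder state can be recycled.
var writerPool = sync.Pool{
	New: func() any { return gzip.NewWriter(io.Discard) },
}

func compress(dst io.Writer, payload []byte) error {
	zw := writerPool.Get().(*gzip.Writer)
	zw.Reset(dst) // point the pooled encoder at the new destination
	defer func() {
		zw.Reset(io.Discard) // drop the reference to dst before pooling
		writerPool.Put(zw)
	}()

	if _, err := zw.Write(payload); err != nil {
		return err
	}
	return zw.Close()
}

func main() {
	var buf bytes.Buffer
	if err := compress(&buf, []byte("hello, compression pool")); err != nil {
		panic(err)
	}
	fmt.Println("compressed bytes:", buf.Len())
}
```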


@@ -8,6 +8,7 @@ import (
     "os"
     "path/filepath"
     "reflect"
+    "runtime"
     "strings"

     "github.com/http-wasm/http-wasm-host-go/handler"
@@ -135,7 +136,19 @@ func (b *wasmMiddlewareBuilder) buildMiddleware(ctx context.Context, next http.H
         return nil, nil, fmt.Errorf("creating middleware: %w", err)
     }

-    return mw.NewHandler(ctx, next), applyCtx, nil
+    h := mw.NewHandler(ctx, next)
+    // Traefik does not Close the middleware when creating a new instance on a configuration change.
+    // When the middleware is marked to be GC, we need to close it so the wasm instance is properly closed.
+    // Reference: https://github.com/traefik/traefik/issues/11119
+    runtime.SetFinalizer(h, func(_ http.Handler) {
+        if err := mw.Close(ctx); err != nil {
+            logger.Err(err).Msg("[wasm] middleware Close failed")
+        } else {
+            logger.Debug().Msg("[wasm] middleware Close ok")
+        }
+    })
+
+    return h, applyCtx, nil
 }

 // WasmMiddleware is an HTTP handler plugin wrapper.
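As a standalone illustration of the finalizer technique used above (not Traefik's actual code), the sketch below attaches a runtime.SetFinalizer to a wrapper so that an underlying resource is closed once the wrapper becomes unreachable; the resource and handler types are invented for the example.

```go
package main

import (
	"fmt"
	"runtime"
)

// resource stands in for the wasm guest module: something holding native
// state that must be explicitly closed.
type resource struct{ name string }

func (r *resource) Close() error {
	fmt.Println("closed", r.name)
	return nil
}

// handler wraps the resource the way the HTTP handler wraps the wasm middleware.
type handler struct{ res *resource }

func newHandler(name string) *handler {
	h := &handler{res: &resource{name: name}}
	// When h becomes unreachable, the finalizer closes the underlying resource.
	// The closure must not capture h itself, or h would never be collected.
	res := h.res
	runtime.SetFinalizer(h, func(_ *handler) {
		_ = res.Close()
	})
	return h
}

func main() {
	_ = newHandler("wasm-instance")
	runtime.GC() // finalizers run at some point after collection, not necessarily before exit
	runtime.GC()
	fmt.Println("done")
}
```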


@@ -1,8 +1,8 @@
 #!/usr/bin/env bash

-# shellcheck disable=SC2046
 set -e -o pipefail

+# shellcheck disable=SC1091 # Cannot check source of this file
 source /go/src/k8s.io/code-generator/kube_codegen.sh

 git config --global --add safe.directory "/go/src/${PROJECT_MODULE}"


@@ -4,11 +4,11 @@ RepositoryName = "traefik"
 OutputType = "file"
 FileName = "traefik_changelog.md"

-# example new bugfix v3.1.5
+# example new bugfix v3.1.6
 CurrentRef = "v3.1"
-PreviousRef = "v3.1.4"
+PreviousRef = "v3.1.5"
 BaseBranch = "v3.1"
-FutureCurrentRefName = "v3.1.5"
+FutureCurrentRefName = "v3.1.6"

 ThresholdPreviousRef = 10
 ThresholdCurrentRef = 10


@@ -15,9 +15,9 @@ for os in linux darwin windows freebsd openbsd; do
     go clean -cache
 done

-cat dist/**/*_checksums.txt >> dist/traefik_${VERSION}_checksums.txt
+cat dist/**/*_checksums.txt >> "dist/traefik_${VERSION}_checksums.txt"
 rm dist/**/*_checksums.txt

-tar cfz dist/traefik-${VERSION}.src.tar.gz \
+tar cfz "dist/traefik-${VERSION}.src.tar.gz" \
     --exclude-vcs \
     --exclude .idea \
     --exclude .travis \
@@ -25,4 +25,4 @@ tar cfz dist/traefik-${VERSION}.src.tar.gz \
     --exclude .github \
     --exclude dist .

-chown -R $(id -u):$(id -g) dist/
+chown -R "$(id -u)":"$(id -g)" dist/


@@ -6,6 +6,7 @@ script_dir="$( cd "$( dirname "${0}" )" && pwd -P)"

 if command -v shellcheck
 then
+  exit_code=0
   # The list of shell script come from the (grep ...) command, feeding the loop
   while IFS= read -r script_to_check
   do
@@ -18,7 +19,13 @@ then
     | grep -v '.git/' | grep -v 'vendor/' | grep -v 'node_modules/' \
     | cut -d':' -f1
   )
-  wait # Wait for all background command to be completed
+  # Wait for all background command to be completed
+  for p in $(jobs -p)
+  do
+    wait "$p" || exit_code=$?
+  done
+
+  exit $exit_code
 else
   echo "== Command shellcheck not found in your PATH. No shell script checked."
+  exit 1
 fi


@@ -1,7 +1,7 @@
-FROM node:20.14
+FROM node:22.9-alpine3.20
 # Current Active LTS release according to (https://nodejs.org/en/about/releases/)

-ENV WEBUI_DIR /src/webui
+ENV WEBUI_DIR=/src/webui
 RUN mkdir -p $WEBUI_DIR

 COPY package.json $WEBUI_DIR/


@@ -50,8 +50,11 @@
     "postcss": "^8.4.14",
     "vitest": "^1.6.0"
   },
+  "resolutions": {
+    "cookie": "^0.7.0"
+  },
   "engines": {
-    "node": "^20 || ^18 || ^16",
+    "node": "^22 || ^20 || ^18 || ^16",
     "npm": ">= 6.13.4",
     "yarn": ">= 1.22.22"
   },


@@ -64,7 +64,7 @@ export default defineComponent({
   &-focus {
     border: solid 2px $accent;
   }

-  &-ex-size{
+  &-ex-size {
     .text-h3 {
       font-size: 22px;
     }


@@ -15,7 +15,7 @@
     <div
       v-for="(entryItems, index) in entryAll.items"
       :key="index"
-      class="col-12 col-sm-6 col-md-2"
+      class="col-12 col-sm-6 col-md-3"
     >
       <panel-entry
         :name="entryItems.name"

File diff suppressed because it is too large.