Compare commits


8 Commits

Author SHA1 Message Date
Eric Eastwood
c75fb2eeb6 Add changelog 2025-11-20 17:36:21 -06:00
Eric Eastwood
ad94b9103e Use consistent indentation 2025-11-20 17:31:42 -06:00
Eric Eastwood
588c2b4db9 Use return instead of exit
This is useful so that the script can be sourced without exiting the calling subshell.
2025-11-20 17:29:07 -06:00
Eric Eastwood
573accd0df Run complement.sh logic as function
This is useful as later we can refactor the early `exit` calls to `return` calls
so that the script can be sourced without exiting the calling subshell.
2025-11-20 17:25:50 -06:00
Devon Hudson
7a9660367a Capitalize Synapse in CHANGES.md 2025-11-18 18:04:26 -07:00
Devon Hudson
b631bf7c2a 1.143.0rc2 2025-11-18 17:38:04 -07:00
Devon Hudson
2cffd755f2 Fix duplicate poetry version from merging patch release 2025-11-18 17:31:55 -07:00
Devon Hudson
f5bf02eff6 1.143.0rc1 2025-11-18 13:20:59 -07:00
40 changed files with 290 additions and 439 deletions


@@ -1,3 +1,74 @@
# Synapse 1.143.0rc2 (2025-11-18)
## Internal Changes
- Fixes docker image creation in the release workflow.
# Synapse 1.143.0rc1 (2025-11-18)
## Dropping support for PostgreSQL 13
In line with our [deprecation policy](https://github.com/element-hq/synapse/blob/develop/docs/deprecation_policy.md), we've dropped
support for PostgreSQL 13, as it is no longer supported upstream.
This release of Synapse requires PostgreSQL 14+.
## Features
- Support multiple config files in `register_new_matrix_user`. ([\#18784](https://github.com/element-hq/synapse/issues/18784))
- Remove authentication from `POST /_matrix/client/v1/delayed_events`, and allow calling this endpoint with the update action to take (`send`/`cancel`/`restart`) in the request path instead of the body. ([\#19152](https://github.com/element-hq/synapse/issues/19152))
## Bugfixes
- Fixed a longstanding bug where background updates were only run on the `main` database. ([\#19181](https://github.com/element-hq/synapse/issues/19181))
- Fixed a bug introduced in v1.142.0 preventing subpaths in MAS endpoints from working. ([\#19186](https://github.com/element-hq/synapse/issues/19186))
- Fix the SQLite-to-PostgreSQL migration script to correctly migrate a boolean column in the `delayed_events` table. ([\#19155](https://github.com/element-hq/synapse/issues/19155))
## Improved Documentation
- Improve documentation around streams, particularly ID generators and adding new streams. ([\#18943](https://github.com/element-hq/synapse/issues/18943))
## Deprecations and Removals
- Remove support for PostgreSQL 13. ([\#19170](https://github.com/element-hq/synapse/issues/19170))
## Internal Changes
- Provide additional servers with federation room directory results. ([\#18970](https://github.com/element-hq/synapse/issues/18970))
- Add a shortcut return when there are no events to purge. ([\#19093](https://github.com/element-hq/synapse/issues/19093))
- Write union types as `X | Y` where possible, as per PEP 604, added in Python 3.10. ([\#19111](https://github.com/element-hq/synapse/issues/19111))
- Reduce cardinality of `synapse_storage_events_persisted_events_sep_total` metric by removing `origin_entity` label. This also separates out events sent by local application services by changing the `origin_type` for such events to `application_service`. The `type` field also only tracks common event types, and anything else is bucketed under `*other*`. ([\#19133](https://github.com/element-hq/synapse/issues/19133), [\#19168](https://github.com/element-hq/synapse/issues/19168))
- Run trial tests on Python 3.14 for PRs. ([\#19135](https://github.com/element-hq/synapse/issues/19135))
- Update `pyproject.toml` project metadata to be compatible with standard Python packaging tooling. ([\#19137](https://github.com/element-hq/synapse/issues/19137))
- Minor speed up of processing of inbound replication. ([\#19138](https://github.com/element-hq/synapse/issues/19138), [\#19145](https://github.com/element-hq/synapse/issues/19145), [\#19146](https://github.com/element-hq/synapse/issues/19146))
- Ignore recent Python language refactors from git blame (`.git-blame-ignore-revs`). ([\#19150](https://github.com/element-hq/synapse/issues/19150))
- Bump lower bounds of dependencies `parameterized` to `0.9.0` and `idna` to `3.3` as those are the first to advertise support for Python 3.10. ([\#19167](https://github.com/element-hq/synapse/issues/19167))
- Point out which event caused the exception when checking [MSC4293](https://github.com/matrix-org/matrix-spec-proposals/pull/4293) redactions. ([\#19169](https://github.com/element-hq/synapse/issues/19169))
- Restore printing `sentinel` for the log record `request` when no logcontext is active. ([\#19172](https://github.com/element-hq/synapse/issues/19172))
- Add debug logs to track `Clock` utilities. ([\#19173](https://github.com/element-hq/synapse/issues/19173))
- Remove explicit python version skips in `cibuildwheel` config as it's no longer required after [#19137](https://github.com/element-hq/synapse/pull/19137). ([\#19177](https://github.com/element-hq/synapse/issues/19177))
- Fix potential lost logcontext when `PerDestinationQueue.shutdown(...)` is called. ([\#19178](https://github.com/element-hq/synapse/issues/19178))
- Fix bad deferred logcontext handling across the codebase. ([\#19180](https://github.com/element-hq/synapse/issues/19180))
### Updates to locked dependencies
* Bump bytes from 1.10.1 to 1.11.0. ([\#19193](https://github.com/element-hq/synapse/issues/19193))
* Bump click from 8.1.8 to 8.3.1. ([\#19195](https://github.com/element-hq/synapse/issues/19195))
* Bump cryptography from 43.0.3 to 45.0.7. ([\#19159](https://github.com/element-hq/synapse/issues/19159))
* Bump docker/metadata-action from 5.8.0 to 5.9.0. ([\#19161](https://github.com/element-hq/synapse/issues/19161))
* Bump pydantic from 2.12.3 to 2.12.4. ([\#19158](https://github.com/element-hq/synapse/issues/19158))
* Bump pyo3-log from 0.13.1 to 0.13.2. ([\#19156](https://github.com/element-hq/synapse/issues/19156))
* Bump ruff from 0.14.3 to 0.14.5. ([\#19196](https://github.com/element-hq/synapse/issues/19196))
* Bump sentry-sdk from 2.34.1 to 2.43.0. ([\#19157](https://github.com/element-hq/synapse/issues/19157))
* Bump sentry-sdk from 2.43.0 to 2.44.0. ([\#19197](https://github.com/element-hq/synapse/issues/19197))
* Bump tomli from 2.2.1 to 2.3.0. ([\#19194](https://github.com/element-hq/synapse/issues/19194))
* Bump types-netaddr from 1.3.0.20240530 to 1.3.0.20251108. ([\#19160](https://github.com/element-hq/synapse/issues/19160))
# Synapse 1.142.1 (2025-11-18)
## Bugfixes


@@ -1 +0,0 @@
Support multiple config files in `register_new_matrix_user`.


@@ -1 +0,0 @@
Improve documentation around streams, particularly ID generators and adding new streams.


@@ -1 +0,0 @@
Provide additional servers with federation room directory results.


@@ -1 +0,0 @@
Add a shortcut return when there are no events to purge.


@@ -1 +0,0 @@
Write union types as `X | Y` where possible, as per PEP 604, added in Python 3.10.


@@ -1 +0,0 @@
Reduce cardinality of `synapse_storage_events_persisted_events_sep_total` metric by removing `origin_entity` label. This also separates out events sent by local application services by changing the `origin_type` for such events to `application_service`. The `type` field also only tracks common event types, and anything else is bucketed under `*other*`.


@@ -1 +0,0 @@
Run trial tests on Python 3.14 for PRs.


@@ -1 +0,0 @@
Update `pyproject.toml` project metadata to be compatible with standard Python packaging tooling.


@@ -1 +0,0 @@
Minor speed up of processing of inbound replication.


@@ -1 +0,0 @@
Minor speed up of processing of inbound replication.


@@ -1 +0,0 @@
Minor speed up of processing of inbound replication.


@@ -1 +0,0 @@
Ignore recent Python language refactors from git blame (`.git-blame-ignore-revs`).


@@ -1 +0,0 @@
Remove authentication from `POST /_matrix/client/v1/delayed_events`, and allow calling this endpoint with the update action to take (`send`/`cancel`/`restart`) in the request path instead of the body.


@@ -1 +0,0 @@
Let the SQLite-to-PostgreSQL migration script correctly migrate a boolean column in the `delayed_events` table.


@@ -1 +0,0 @@
Bump lower bounds of dependencies `parameterized` to `0.9.0` and `idna` to `3.3` as those are the first to advertise support for Python 3.10.


@@ -1 +0,0 @@
Reduce cardinality of `synapse_storage_events_persisted_events_sep_total` metric by removing `origin_entity` label. This also separates out events sent by local application services by changing the `origin_type` for such events to `application_service`. The `type` field also only tracks common event types, and anything else is bucketed under `*other*`.


@@ -1 +0,0 @@
Point out which event caused the exception when checking [MSC4293](https://github.com/matrix-org/matrix-spec-proposals/pull/4293) redactions.


@@ -1 +0,0 @@
Remove support for PostgreSQL 13.


@@ -1 +0,0 @@
Restore printing `sentinel` for the log record `request` when no logcontext is active.


@@ -1 +0,0 @@
Add debug logs to track `Clock` utilities.


@@ -1 +0,0 @@
Remove explicit python version skips in `cibuildwheel` config as it's no longer required after [#19137](https://github.com/element-hq/synapse/pull/19137).


@@ -1 +0,0 @@
Fix potential lost logcontext when `PerDestinationQueue.shutdown(...)` is called.


@@ -1 +0,0 @@
Fix bad deferred logcontext handling across the codebase.


@@ -1 +0,0 @@
Run background updates on all databases.


@@ -1 +0,0 @@
Fix regression preventing subpaths in MAS endpoints.


@@ -1 +0,0 @@
Add experimental implementation of MSC4380 (invite blocking).

changelog.d/19209.misc (new file)

@@ -0,0 +1 @@
Refactor `scripts-dev/complement.sh` logic to avoid `exit` to facilitate being able to source it from other scripts (composable).
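The pattern this refactor adopts can be sketched as follows. This is a hypothetical toy script, not the real `complement.sh`: the body lives in a function that uses `return` for early bail-outs, so another script can `source` the file and reuse the function without its error paths terminating the calling shell, while direct execution still reports a failing exit status.

```shell
run_checks() {
    if [ "$1" = "--bad-flag" ]; then
        echo "error: unknown flag" >&2
        return 1   # `exit 1` here would terminate a shell that sourced this file
    fi
    echo "checks passed"
}

# When the script is executed (rather than sourced), surface failures
# as the process exit status.
run_checks "$@"
```

A caller can then do `source ./script.sh` and invoke `run_checks` directly; a `return 1` inside the function only ends the function, whereas `exit 1` would have killed the sourcing shell.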

debian/changelog

@@ -1,3 +1,15 @@
matrix-synapse-py3 (1.143.0~rc2) stable; urgency=medium
* New Synapse release 1.143.0rc2.
-- Synapse Packaging team <packages@matrix.org> Tue, 18 Nov 2025 17:36:08 -0700
matrix-synapse-py3 (1.143.0~rc1) stable; urgency=medium
* New Synapse release 1.143.0rc1.
-- Synapse Packaging team <packages@matrix.org> Tue, 18 Nov 2025 13:08:39 -0700
matrix-synapse-py3 (1.142.1) stable; urgency=medium
* New Synapse release 1.142.1.


@@ -1,6 +1,6 @@
[project]
name = "matrix-synapse"
version = "1.142.0"
version = "1.143.0rc2"
description = "Homeserver for the Matrix decentralised comms protocol"
readme = "README.rst"
authors = [
@@ -291,13 +291,6 @@ manifest-path = "rust/Cargo.toml"
module-name = "synapse.synapse_rust"
[tool.poetry]
name = "matrix-synapse"
version = "1.142.1"
description = "Homeserver for the Matrix decentralised comms protocol"
authors = ["Matrix.org Team and Contributors <packages@matrix.org>"]
license = "AGPL-3.0-or-later OR LicenseRef-Element-Commercial"
readme = "README.rst"
repository = "https://github.com/element-hq/synapse"
packages = [
{ include = "synapse" },
]


@@ -1,5 +1,5 @@
$schema: https://element-hq.github.io/synapse/latest/schema/v1/meta.schema.json
$id: https://element-hq.github.io/synapse/schema/synapse/v1.142/synapse-config.schema.json
$id: https://element-hq.github.io/synapse/schema/synapse/v1.143/synapse-config.schema.json
type: object
properties:
modules:


@@ -72,153 +72,154 @@ For help on arguments to 'go test', run 'go help testflag'.
EOF
}
main() {
    # parse our arguments
    skip_docker_build=""
    skip_complement_run=""
    while [ $# -ge 1 ]; do
        arg=$1
        case "$arg" in
            "-h")
                usage
                return 1
                ;;
            "-f"|"--fast")
                skip_docker_build=1
                ;;
            "--build-only")
                skip_complement_run=1
                ;;
            "-e"|"--editable")
                use_editable_synapse=1
                ;;
            "--rebuild-editable")
                rebuild_editable_synapse=1
                ;;
            *)
                # unknown arg: presumably an argument to gotest. break the loop.
                break
        esac
        shift
    done

    # enable buildkit for the docker builds
    export DOCKER_BUILDKIT=1

    # Determine whether to use the docker or podman container runtime.
    if [ -n "$PODMAN" ]; then
        export CONTAINER_RUNTIME=podman
        export DOCKER_HOST=unix://$XDG_RUNTIME_DIR/podman/podman.sock
        export BUILDAH_FORMAT=docker
        export COMPLEMENT_HOSTNAME_RUNNING_COMPLEMENT=host.containers.internal
    else
        export CONTAINER_RUNTIME=docker
    fi

    # Change to the repository root
    cd "$(dirname $0)/.."

    # Check for a user-specified Complement checkout
    if [[ -z "$COMPLEMENT_DIR" ]]; then
        COMPLEMENT_REF=${COMPLEMENT_REF:-main}
        echo "COMPLEMENT_DIR not set. Fetching Complement checkout from ${COMPLEMENT_REF}..."
        wget -Nq https://github.com/matrix-org/complement/archive/${COMPLEMENT_REF}.tar.gz
        tar -xzf ${COMPLEMENT_REF}.tar.gz
        COMPLEMENT_DIR=complement-${COMPLEMENT_REF}
        echo "Checkout available at 'complement-${COMPLEMENT_REF}'"
    fi

    if [ -n "$use_editable_synapse" ]; then
        if [[ -e synapse/synapse_rust.abi3.so ]]; then
            # In an editable install, back up the host's compiled Rust module to prevent
            # inconvenience; the container will overwrite the module with its own copy.
            mv -n synapse/synapse_rust.abi3.so synapse/synapse_rust.abi3.so~host
            # And restore it on exit:
            synapse_pkg=`realpath synapse`
            trap "mv -f '$synapse_pkg/synapse_rust.abi3.so~host' '$synapse_pkg/synapse_rust.abi3.so'" EXIT
        fi

        editable_mount="$(realpath .):/editable-src:z"
        if [ -n "$rebuild_editable_synapse" ]; then
            unset skip_docker_build
        elif $CONTAINER_RUNTIME inspect complement-synapse-editable &>/dev/null; then
            # complement-synapse-editable already exists: see if we can still use it:
            # - The Rust module must still be importable; it will fail to import if the Rust source has changed.
            # - The Poetry lock file must be the same (otherwise we assume dependencies have changed)

            # First set up the module in the right place for an editable installation.
            $CONTAINER_RUNTIME run --rm -v $editable_mount --entrypoint 'cp' complement-synapse-editable -- /synapse_rust.abi3.so.bak /editable-src/synapse/synapse_rust.abi3.so

            if ($CONTAINER_RUNTIME run --rm -v $editable_mount --entrypoint 'python' complement-synapse-editable -c 'import synapse.synapse_rust' \
                && $CONTAINER_RUNTIME run --rm -v $editable_mount --entrypoint 'diff' complement-synapse-editable --brief /editable-src/poetry.lock /poetry.lock.bak); then
                skip_docker_build=1
            else
                echo "Editable Synapse image is stale. Will rebuild."
                unset skip_docker_build
            fi
        fi
    fi

    if [ -z "$skip_docker_build" ]; then
        if [ -n "$use_editable_synapse" ]; then
            # Build a special image designed for use in development with editable
            # installs.
            $CONTAINER_RUNTIME build -t synapse-editable \
                -f "docker/editable.Dockerfile" .

            $CONTAINER_RUNTIME build -t synapse-workers-editable \
                --build-arg FROM=synapse-editable \
                -f "docker/Dockerfile-workers" .

            $CONTAINER_RUNTIME build -t complement-synapse-editable \
                --build-arg FROM=synapse-workers-editable \
                -f "docker/complement/Dockerfile" "docker/complement"

            # Prepare the Rust module
            $CONTAINER_RUNTIME run --rm -v $editable_mount --entrypoint 'cp' complement-synapse-editable -- /synapse_rust.abi3.so.bak /editable-src/synapse/synapse_rust.abi3.so
        else
            # Build the base Synapse image from the local checkout
            echo_if_github "::group::Build Docker image: matrixdotorg/synapse"
            $CONTAINER_RUNTIME build -t matrixdotorg/synapse \
                --build-arg TEST_ONLY_SKIP_DEP_HASH_VERIFICATION \
                --build-arg TEST_ONLY_IGNORE_POETRY_LOCKFILE \
                -f "docker/Dockerfile" .
            echo_if_github "::endgroup::"

            # Build the workers docker image (from the base Synapse image we just built).
            echo_if_github "::group::Build Docker image: matrixdotorg/synapse-workers"
            $CONTAINER_RUNTIME build -t matrixdotorg/synapse-workers -f "docker/Dockerfile-workers" .
            echo_if_github "::endgroup::"

            # Build the unified Complement image (from the worker Synapse image we just built).
            echo_if_github "::group::Build Docker image: complement/Dockerfile"
            $CONTAINER_RUNTIME build -t complement-synapse \
                `# This is the tag we end up pushing to the registry (see` \
                `# .github/workflows/push_complement_image.yml) so let's just label it now` \
                `# so people can reference it by the same name locally.` \
                -t ghcr.io/element-hq/synapse/complement-synapse \
                -f "docker/complement/Dockerfile" "docker/complement"
            echo_if_github "::endgroup::"
        fi
    fi

    if [ -n "$skip_complement_run" ]; then
        echo "Skipping Complement run as requested."
        return 0
    fi

    export COMPLEMENT_BASE_IMAGE=complement-synapse
    if [ -n "$use_editable_synapse" ]; then
        export COMPLEMENT_BASE_IMAGE=complement-synapse-editable
        export COMPLEMENT_HOST_MOUNTS="$editable_mount"
    fi

    extra_test_args=()

    test_packages=(
        ./tests/csapi
        ./tests
        ./tests/msc3874
@@ -231,71 +232,80 @@ test_packages=(
        ./tests/msc4140
        ./tests/msc4155
        ./tests/msc4306
    )

    # Enable dirty runs, so tests will reuse the same container where possible.
    # This significantly speeds up tests, but increases the possibility of test pollution.
    export COMPLEMENT_ENABLE_DIRTY_RUNS=1

    # All environment variables starting with PASS_ will be shared.
    # (The prefix is stripped off before reaching the container.)
    export COMPLEMENT_SHARE_ENV_PREFIX=PASS_

    # It takes longer than 10m to run the whole suite.
    extra_test_args+=("-timeout=60m")

    if [[ -n "$WORKERS" ]]; then
        # Use workers.
        export PASS_SYNAPSE_COMPLEMENT_USE_WORKERS=true

        # Pass through the workers defined. If none, it will be an empty string
        export PASS_SYNAPSE_WORKER_TYPES="$WORKER_TYPES"

        # Workers can only use Postgres as a database.
        export PASS_SYNAPSE_COMPLEMENT_DATABASE=postgres

        # And provide some more configuration to complement.

        # It can take quite a while to spin up a worker-mode Synapse for the first
        # time (the main problem is that we start 14 python processes for each test,
        # and complement likes to do two of them in parallel).
        export COMPLEMENT_SPAWN_HS_TIMEOUT_SECS=120
    else
        export PASS_SYNAPSE_COMPLEMENT_USE_WORKERS=
        if [[ -n "$POSTGRES" ]]; then
            export PASS_SYNAPSE_COMPLEMENT_DATABASE=postgres
        else
            export PASS_SYNAPSE_COMPLEMENT_DATABASE=sqlite
        fi
    fi

    if [[ -n "$ASYNCIO_REACTOR" ]]; then
        # Enable the Twisted asyncio reactor
        export PASS_SYNAPSE_COMPLEMENT_USE_ASYNCIO_REACTOR=true
    fi

    if [[ -n "$UNIX_SOCKETS" ]]; then
        # Enable full on Unix socket mode for Synapse, Redis and Postgresql
        export PASS_SYNAPSE_USE_UNIX_SOCKET=1
    fi

    if [[ -n "$SYNAPSE_TEST_LOG_LEVEL" ]]; then
        # Set the log level to what is desired
        export PASS_SYNAPSE_LOG_LEVEL="$SYNAPSE_TEST_LOG_LEVEL"

        # Allow logging sensitive things (currently SQL queries & parameters).
        # (This won't have any effect if we're not logging at DEBUG level overall.)
        # Since this is just a test suite, this is fine and won't reveal anyone's
        # personal information
        export PASS_SYNAPSE_LOG_SENSITIVE=1
    fi

    # Log a few more useful things for a developer attempting to debug something
    # particularly tricky.
    export PASS_SYNAPSE_LOG_TESTING=1

    # Run the tests!
    echo "Images built; running complement with ${extra_test_args[@]} $@ ${test_packages[@]}"
    cd "$COMPLEMENT_DIR"
    go test -v -tags "synapse_blacklist" -count=1 "${extra_test_args[@]}" "$@" "${test_packages[@]}"
}

main "$@"

# For any non-zero exit code (indicating some sort of error happened), we want to exit
# with that code.
exit_code=$?
if [ $exit_code -ne 0 ]; then
    exit $exit_code
fi


@@ -307,10 +307,6 @@ class AccountDataTypes:
MSC4155_INVITE_PERMISSION_CONFIG: Final = (
"org.matrix.msc4155.invite_permission_config"
)
# MSC4380: Invite blocking
MSC4380_INVITE_PERMISSION_CONFIG: Final = (
"org.matrix.msc4380.invite_permission_config"
)
# Synapse-specific behaviour. See "Client-Server API Extensions" documentation
# in Admin API for more information.
SYNAPSE_ADMIN_CLIENT_CONFIG: Final = "io.element.synapse.admin_client_config"


@@ -137,7 +137,7 @@ class Codes(str, Enum):
PROFILE_TOO_LARGE = "M_PROFILE_TOO_LARGE"
KEY_TOO_LARGE = "M_KEY_TOO_LARGE"
# Part of MSC4155/MSC4380
# Part of MSC4155
INVITE_BLOCKED = "ORG.MATRIX.MSC4155.M_INVITE_BLOCKED"
# Part of MSC4190


@@ -593,6 +593,3 @@ class ExperimentalConfig(Config):
# MSC4306: Thread Subscriptions
# (and MSC4308: Thread Subscriptions extension to Sliding Sync)
self.msc4306_enabled: bool = experimental.get("msc4306_enabled", False)
# MSC4380: Invite blocking
self.msc4380_enabled: bool = experimental.get("msc4380_enabled", False)


@@ -182,8 +182,6 @@ class VersionsRestServlet(RestServlet):
"org.matrix.msc4306": self.config.experimental.msc4306_enabled,
# MSC4169: Backwards-compatible redaction sending using `/send`
"com.beeper.msc4169": self.config.experimental.msc4169_enabled,
# MSC4380: Invite blocking
"org.matrix.msc4380": self.config.experimental.msc4380_enabled,
},
},
)


@@ -40,12 +40,7 @@ from synapse.storage.database import (
)
from synapse.storage.databases.main.cache import CacheInvalidationWorkerStore
from synapse.storage.databases.main.push_rule import PushRulesWorkerStore
from synapse.storage.invite_rule import (
AllowAllInviteRulesConfig,
InviteRulesConfig,
MSC4155InviteRulesConfig,
MSC4380InviteRulesConfig,
)
from synapse.storage.invite_rule import InviteRulesConfig
from synapse.storage.util.id_generators import MultiWriterIdGenerator
from synapse.types import JsonDict, JsonMapping
from synapse.util.caches.descriptors import cached
@@ -109,7 +104,6 @@ class AccountDataWorkerStore(PushRulesWorkerStore, CacheInvalidationWorkerStore)
)
self._msc4155_enabled = hs.config.experimental.msc4155_enabled
self._msc4380_enabled = hs.config.experimental.msc4380_enabled
def get_max_account_data_stream_id(self) -> int:
"""Get the current max stream ID for account data stream
@@ -568,28 +562,20 @@ class AccountDataWorkerStore(PushRulesWorkerStore, CacheInvalidationWorkerStore)
async def get_invite_config_for_user(self, user_id: str) -> InviteRulesConfig:
"""
Get the invite configuration for the given user.
Get the invite configuration for the current user.
Args:
user_id: The user whose invite configuration should be returned.
user_id:
"""
if self._msc4380_enabled:
data = await self.get_global_account_data_by_type_for_user(
user_id, AccountDataTypes.MSC4380_INVITE_PERMISSION_CONFIG
)
# If the user has an MSC4380-style config setting, prioritise that
# above an MSC4155 one
if data is not None:
return MSC4380InviteRulesConfig.from_account_data(data)
if self._msc4155_enabled:
data = await self.get_global_account_data_by_type_for_user(
user_id, AccountDataTypes.MSC4155_INVITE_PERMISSION_CONFIG
)
if data is not None:
return MSC4155InviteRulesConfig(data)
if not self._msc4155_enabled:
# This equates to allowing all invites, as if the setting was off.
return InviteRulesConfig(None)
return AllowAllInviteRulesConfig()
data = await self.get_global_account_data_by_type_for_user(
user_id, AccountDataTypes.MSC4155_INVITE_PERMISSION_CONFIG
)
return InviteRulesConfig(data)
async def get_admin_client_config_for_user(self, user_id: str) -> AdminClientConfig:
"""


@@ -1,9 +1,7 @@
import logging
from abc import abstractmethod
from enum import Enum
from typing import Pattern
import attr
from matrix_common.regex import glob_to_regex
from synapse.types import JsonMapping, UserID
@@ -20,29 +18,9 @@ class InviteRule(Enum):
class InviteRulesConfig:
"""An object encapsulating a given user's choices about whether to accept invites."""
"""Class to determine if a given user permits an invite from another user, and the action to take."""
@abstractmethod
def get_invite_rule(self, inviter_user_id: str) -> InviteRule:
"""Get the invite rule that matches this user. Will return InviteRule.ALLOW if no rules match
Args:
inviter_user_id: The user ID of the inviting user.
"""
@attr.s(slots=True)
class AllowAllInviteRulesConfig(InviteRulesConfig):
"""An `InviteRulesConfig` implementation which will accept all invites."""
def get_invite_rule(self, inviter_user_id: str) -> InviteRule:
return InviteRule.ALLOW
class MSC4155InviteRulesConfig(InviteRulesConfig):
"""An object encapsulating [MSC4155](https://github.com/matrix-org/matrix-spec-proposals/pull/4155) invite rules."""
def __init__(self, account_data: JsonMapping):
def __init__(self, account_data: JsonMapping | None):
self.allowed_users: list[Pattern[str]] = []
self.ignored_users: list[Pattern[str]] = []
self.blocked_users: list[Pattern[str]] = []
@@ -132,20 +110,3 @@ class MSC4155InviteRulesConfig(InviteRulesConfig):
return rule
return InviteRule.ALLOW
@attr.s(slots=True, auto_attribs=True)
class MSC4380InviteRulesConfig(InviteRulesConfig):
block_all: bool
"""If true, all invites are blocked."""
@classmethod
def from_account_data(cls, data: JsonMapping) -> "MSC4380InviteRulesConfig":
block_all = data.get("block_all")
if not isinstance(block_all, bool):
block_all = False
return cls(block_all=block_all)
def get_invite_rule(self, inviter_user_id: str) -> InviteRule:
return InviteRule.BLOCK if self.block_all else InviteRule.ALLOW


@@ -458,9 +458,7 @@ class RoomMemberMasterHandlerTestCase(HomeserverTestCase):
self.assertEqual(initial_count, new_count)
class TestMSC4155InviteFiltering(FederatingHomeserverTestCase):
"""Tests for MSC4155-style invite filtering."""
class TestInviteFiltering(FederatingHomeserverTestCase):
servlets = [
synapse.rest.admin.register_servlets,
synapse.rest.client.login.register_servlets,
@@ -620,145 +618,3 @@ class TestMSC4155InviteFiltering(FederatingHomeserverTestCase):
).value
self.assertEqual(f.code, 403)
self.assertEqual(f.errcode, "ORG.MATRIX.MSC4155.M_INVITE_BLOCKED")
class TestMSC4380InviteFiltering(FederatingHomeserverTestCase):
"""Tests for MSC4380-style invite filtering."""
servlets = [
synapse.rest.admin.register_servlets,
synapse.rest.client.login.register_servlets,
synapse.rest.client.room.register_servlets,
]
def prepare(self, reactor: MemoryReactor, clock: Clock, hs: HomeServer) -> None:
self.handler = hs.get_room_member_handler()
self.fed_handler = hs.get_federation_handler()
self.store = hs.get_datastores().main
# Create two users.
self.alice = self.register_user("alice", "pass")
self.alice_token = self.login("alice", "pass")
self.bob = self.register_user("bob", "pass")
self.bob_token = self.login("bob", "pass")
@override_config({"experimental_features": {"msc4380_enabled": True}})
def test_msc4380_block_invite_local(self) -> None:
"""Test that MSC4380 will block a user from being invited to a room"""
room_id = self.helper.create_room_as(self.alice, tok=self.alice_token)
self.get_success(
self.store.add_account_data_for_user(
self.bob,
AccountDataTypes.MSC4380_INVITE_PERMISSION_CONFIG,
{
"block_all": True,
},
)
)
f = self.get_failure(
self.handler.update_membership(
requester=create_requester(self.alice),
target=UserID.from_string(self.bob),
room_id=room_id,
action=Membership.INVITE,
),
SynapseError,
).value
self.assertEqual(f.code, 403)
self.assertEqual(f.errcode, "ORG.MATRIX.MSC4155.M_INVITE_BLOCKED")
@override_config({"experimental_features": {"msc4380_enabled": True}})
def test_msc4380_non_bool_setting(self) -> None:
"""Test that `block_all` set to a non-boolean value is treated the same as False."""
room_id = self.helper.create_room_as(self.alice, tok=self.alice_token)
self.get_success(
self.store.add_account_data_for_user(
self.bob,
AccountDataTypes.MSC4380_INVITE_PERMISSION_CONFIG,
{
"block_all": "True",
},
)
)
self.get_success(
self.handler.update_membership(
requester=create_requester(self.alice),
target=UserID.from_string(self.bob),
room_id=room_id,
action=Membership.INVITE,
)
)
@override_config({"experimental_features": {"msc4380_enabled": False}})
def test_msc4380_disabled_allow_invite_local(self) -> None:
"""Test that MSC4380 will block a user from being invited to a room"""
room_id = self.helper.create_room_as(self.alice, tok=self.alice_token)
self.get_success(
self.store.add_account_data_for_user(
self.bob,
AccountDataTypes.MSC4380_INVITE_PERMISSION_CONFIG,
{
"block_all": True,
},
)
)
self.get_success(
self.handler.update_membership(
requester=create_requester(self.alice),
target=UserID.from_string(self.bob),
room_id=room_id,
action=Membership.INVITE,
),
)
@override_config({"experimental_features": {"msc4380_enabled": True}})
def test_msc4380_block_invite_remote(self) -> None:
"""Test that MSC4380 will block a user from being invited to a room by a remote user."""
# A remote user who sends the invite
remote_server = "otherserver"
remote_user = "@otheruser:" + remote_server
self.get_success(
self.store.add_account_data_for_user(
self.bob,
AccountDataTypes.MSC4380_INVITE_PERMISSION_CONFIG,
{"block_all": True},
)
)
room_id = self.helper.create_room_as(
room_creator=self.alice, tok=self.alice_token
)
room_version = self.get_success(self.store.get_room_version(room_id))
invite_event = event_from_pdu_json(
{
"type": EventTypes.Member,
"content": {"membership": "invite"},
"room_id": room_id,
"sender": remote_user,
"state_key": self.bob,
"depth": 32,
"prev_events": [],
"auth_events": [],
"origin_server_ts": self.clock.time_msec(),
},
room_version,
)
f = self.get_failure(
self.fed_handler.on_invite_request(
remote_server,
invite_event,
invite_event.room_version,
),
SynapseError,
).value
self.assertEqual(f.code, 403)
self.assertEqual(f.errcode, "ORG.MATRIX.MSC4155.M_INVITE_BLOCKED")


@@ -1,8 +1,4 @@
from synapse.storage.invite_rule import (
AllowAllInviteRulesConfig,
InviteRule,
MSC4155InviteRulesConfig,
)
from synapse.storage.invite_rule import InviteRule, InviteRulesConfig
from synapse.types import UserID
from tests import unittest
@@ -14,23 +10,23 @@ ignored_user = UserID.from_string("@ignored:ignore.example.org")
class InviteFilterTestCase(unittest.TestCase):
def test_allow_all(self) -> None:
def test_empty(self) -> None:
"""Permit by default"""
config = AllowAllInviteRulesConfig()
config = InviteRulesConfig(None)
self.assertEqual(
config.get_invite_rule(regular_user.to_string()), InviteRule.ALLOW
)
def test_ignore_invalid(self) -> None:
"""Invalid strings are ignored"""
config = MSC4155InviteRulesConfig({"blocked_users": ["not a user"]})
config = InviteRulesConfig({"blocked_users": ["not a user"]})
self.assertEqual(
config.get_invite_rule(blocked_user.to_string()), InviteRule.ALLOW
)
def test_user_blocked(self) -> None:
"""Permit all, except explicitly blocked users"""
config = MSC4155InviteRulesConfig({"blocked_users": [blocked_user.to_string()]})
config = InviteRulesConfig({"blocked_users": [blocked_user.to_string()]})
self.assertEqual(
config.get_invite_rule(blocked_user.to_string()), InviteRule.BLOCK
)
@@ -40,7 +36,7 @@ class InviteFilterTestCase(unittest.TestCase):
def test_user_ignored(self) -> None:
"""Permit all, except explicitly ignored users"""
config = MSC4155InviteRulesConfig({"ignored_users": [ignored_user.to_string()]})
config = InviteRulesConfig({"ignored_users": [ignored_user.to_string()]})
self.assertEqual(
config.get_invite_rule(ignored_user.to_string()), InviteRule.IGNORE
)
@@ -50,7 +46,7 @@ class InviteFilterTestCase(unittest.TestCase):
def test_user_precedence(self) -> None:
"""Always take allowed over ignored, ignored over blocked, and then block."""
config = MSC4155InviteRulesConfig(
config = InviteRulesConfig(
{
"allowed_users": [allowed_user.to_string()],
"ignored_users": [allowed_user.to_string(), ignored_user.to_string()],
@@ -74,7 +70,7 @@ class InviteFilterTestCase(unittest.TestCase):
def test_server_blocked(self) -> None:
"""Block all users on the server except those allowed."""
user_on_same_server = UserID("blocked", allowed_user.domain)
config = MSC4155InviteRulesConfig(
config = InviteRulesConfig(
{
"allowed_users": [allowed_user.to_string()],
"blocked_servers": [allowed_user.domain],
@@ -90,7 +86,7 @@ class InviteFilterTestCase(unittest.TestCase):
def test_server_ignored(self) -> None:
"""Ignore all users on the server except those allowed."""
user_on_same_server = UserID("ignored", allowed_user.domain)
config = MSC4155InviteRulesConfig(
config = InviteRulesConfig(
{
"allowed_users": [allowed_user.to_string()],
"ignored_servers": [allowed_user.domain],
@@ -108,7 +104,7 @@ class InviteFilterTestCase(unittest.TestCase):
blocked_user_on_same_server = UserID("blocked", allowed_user.domain)
ignored_user_on_same_server = UserID("ignored", allowed_user.domain)
allowed_user_on_same_server = UserID("another", allowed_user.domain)
config = MSC4155InviteRulesConfig(
config = InviteRulesConfig(
{
"ignored_users": [ignored_user_on_same_server.to_string()],
"blocked_users": [blocked_user_on_same_server.to_string()],
@@ -133,7 +129,7 @@ class InviteFilterTestCase(unittest.TestCase):
def test_server_precedence(self) -> None:
"""Always take allowed over ignored, ignored over blocked, and then block."""
config = MSC4155InviteRulesConfig(
config = InviteRulesConfig(
{
"allowed_servers": [allowed_user.domain],
"ignored_servers": [allowed_user.domain, ignored_user.domain],
@@ -156,7 +152,7 @@ class InviteFilterTestCase(unittest.TestCase):
def test_server_glob(self) -> None:
"""Test that glob patterns match"""
config = MSC4155InviteRulesConfig({"blocked_servers": ["*.example.org"]})
config = InviteRulesConfig({"blocked_servers": ["*.example.org"]})
self.assertEqual(
config.get_invite_rule(allowed_user.to_string()), InviteRule.BLOCK
)
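The precedence these tests exercise — allowed rules win over ignored rules, ignored rules win over blocked rules, and anything unmatched is allowed — can be sketched as a standalone function. This is an illustrative approximation only: it uses `fnmatch` for glob matching, whereas Synapse's actual implementation compiles patterns with `matrix_common.regex.glob_to_regex`.

```python
from enum import Enum
from fnmatch import fnmatchcase
from typing import Any, Mapping


class InviteRule(Enum):
    ALLOW = "allow"
    IGNORE = "ignore"
    BLOCK = "block"


def get_invite_rule(config: Mapping[str, Any], inviter_user_id: str) -> InviteRule:
    """Return the first matching rule, checking in precedence order:
    allowed, then ignored, then blocked; default to ALLOW."""
    # Domain part of an MXID like "@user:example.org".
    domain = inviter_user_id.split(":", 1)[1] if ":" in inviter_user_id else ""
    checks = [
        (InviteRule.ALLOW, config.get("allowed_users", []), inviter_user_id),
        (InviteRule.ALLOW, config.get("allowed_servers", []), domain),
        (InviteRule.IGNORE, config.get("ignored_users", []), inviter_user_id),
        (InviteRule.IGNORE, config.get("ignored_servers", []), domain),
        (InviteRule.BLOCK, config.get("blocked_users", []), inviter_user_id),
        (InviteRule.BLOCK, config.get("blocked_servers", []), domain),
    ]
    for rule, patterns, value in checks:
        if any(fnmatchcase(value, pattern) for pattern in patterns):
            return rule
    return InviteRule.ALLOW
```

Under this ordering a user listed in both `allowed_users` and `blocked_users` is allowed, matching `test_user_precedence`, and `{"blocked_servers": ["*.example.org"]}` blocks any inviter on a matching domain, matching `test_server_glob`.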