Compare commits


2 Commits

Author SHA1 Message Date
Erik Johnston
1ebc8975cf Newsfile 2024-10-22 15:57:32 +01:00
Erik Johnston
61c37b2c71 Fix check for outdated Rust library
This failed when installing with poetry, so let's properly try to detect
what's going on.
2024-10-22 15:56:19 +01:00
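The check this commit fixes detects, at startup, whether the compiled Rust extension has fallen behind the Rust sources in a source checkout; it previously misfired when Synapse was installed with poetry. Below is a rough Python sketch of the idea, with the directory layout and helper names being assumptions made for this example rather than Synapse's actual API.

```
# Illustrative sketch only: hash the Rust sources in a checkout and compare
# the result with a digest that was (hypothetically) recorded at build time.
import hashlib
from pathlib import Path
from typing import Optional


def _source_digest(rust_src: Path) -> str:
    """Hash every .rs file under the source tree in a stable order."""
    digest = hashlib.sha256()
    for path in sorted(rust_src.rglob("*.rs")):
        digest.update(path.read_bytes())
    return digest.hexdigest()


def rust_lib_is_outdated(rust_src: Path, built_digest: Optional[str]) -> bool:
    """Return True if we appear to be in a source checkout whose Rust code
    no longer matches the compiled extension module.

    A wheel installed by pip or poetry ships no Rust sources alongside the
    Python package, so the check must simply pass in that case rather than
    erroring out (the failure mode the commit message alludes to).
    """
    if not rust_src.exists():
        # Not a source checkout: nothing to compare against.
        return False
    if built_digest is None:
        # Extension was built before digests were recorded; assume outdated.
        return True
    return _source_digest(rust_src) != built_digest
```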
59 changed files with 204 additions and 835 deletions


@@ -1,63 +1,3 @@
# Synapse 1.118.0 (2024-10-29)
No significant changes since 1.118.0rc1.
### Python 3.8 support will be dropped in the next release
Python 3.8 is now [end-of-life](https://devguide.python.org/versions/). As per our [Deprecation Policy for Platform Dependencies](https://element-hq.github.io/synapse/latest/deprecation_policy.html#policy), Synapse will be dropping support for Python 3.8 in the next release; Synapse 1.119.0.
Synapse 1.118.x will be the final release to support Python 3.8. If you are running Synapse with Python 3.8, please upgrade before the 1.119.0 release, due in less than one month.
### Python 3.13 and PostgreSQL 17 support
On the other end of the spectrum, Synapse 1.118.0 is the first release to support [Python 3.13](https://www.python.org/downloads/release/python-3130/)! [PostgreSQL 17](https://www.postgresql.org/about/news/postgresql-17-released-2936/) is also supported as of this release.
# Synapse 1.118.0rc1 (2024-10-22)
### Features
- Added the `display_name_claim` option to the JWT configuration. This option allows specifying the claim key that contains the user's display name in the JWT payload. ([\#17708](https://github.com/element-hq/synapse/issues/17708))
- Implement [MSC4210](https://github.com/matrix-org/matrix-spec-proposals/pull/4210): Remove legacy mentions. Contributed by @tulir @ Beeper. ([\#17783](https://github.com/element-hq/synapse/issues/17783))
### Bugfixes
- Fix saving of PNG thumbnails, when the original image is in the CMYK color space. ([\#17736](https://github.com/element-hq/synapse/issues/17736))
- Fix bug with sliding sync where the server would not return state that was added to the `required_state` config. ([\#17785](https://github.com/element-hq/synapse/issues/17785), [\#17805](https://github.com/element-hq/synapse/issues/17805))
- Fix a bug in [MSC4186](https://github.com/matrix-org/matrix-spec-proposals/pull/4186) Sliding Sync that would cause rooms to stay forgotten and hidden even after rejoining. ([\#17835](https://github.com/element-hq/synapse/issues/17835))
### Improved Documentation
- Clarify when the `user_may_invite` and `user_may_send_3pid_invite` module callbacks are called. ([\#17627](https://github.com/element-hq/synapse/issues/17627))
- Correct documentation to refer to the `--config-path` argument instead of `--config-file`. ([\#17802](https://github.com/element-hq/synapse/issues/17802))
- Fix typo in `target_cache_memory_usage` docs. ([\#17825](https://github.com/element-hq/synapse/issues/17825))
### Internal Changes
- Slight optimization when fetching state/events for Sliding Sync. ([\#17718](https://github.com/element-hq/synapse/issues/17718))
- Add Python 3.13 and Postgres 17 to the test matrix. ([\#17752](https://github.com/element-hq/synapse/issues/17752))
- Test github token before running release script steps. ([\#17803](https://github.com/element-hq/synapse/issues/17803))
- Build debian packages for new Ubuntu versions, and stop building for no longer supported versions. ([\#17824](https://github.com/element-hq/synapse/issues/17824))
- Enable the `.org.matrix.msc4028.encrypted_event` push rule by default in accordance with [MSC4028](https://github.com/matrix-org/matrix-spec-proposals/pull/4028). Note that the corresponding experimental feature must still be switched on for this push rule to have any effect. ([\#17826](https://github.com/element-hq/synapse/issues/17826))
- Fix some typing issues uncovered by upgrading mypy to 1.11.x. ([\#17842](https://github.com/element-hq/synapse/issues/17842))
### Updates to locked dependencies
* Bump mypy from 1.10.1 to 1.11.2. ([\#17842](https://github.com/element-hq/synapse/issues/17842))
* Bump mypy-zope from 1.0.5 to 1.0.7. ([\#17827](https://github.com/element-hq/synapse/issues/17827))
* Bump phonenumbers from 8.13.46 to 8.13.47. ([\#17797](https://github.com/element-hq/synapse/issues/17797))
* Bump psycopg2 from 2.9.9 to 2.9.10. ([\#17843](https://github.com/element-hq/synapse/issues/17843))
* Bump ruff from 0.6.8 to 0.6.9. ([\#17794](https://github.com/element-hq/synapse/issues/17794))
* Bump sentry-sdk from 2.14.0 to 2.15.0. ([\#17795](https://github.com/element-hq/synapse/issues/17795))
* Bump sentry-sdk from 2.15.0 to 2.16.0. ([\#17829](https://github.com/element-hq/synapse/issues/17829))
* Bump sentry-sdk from 2.16.0 to 2.17.0. ([\#17844](https://github.com/element-hq/synapse/issues/17844))
* Bump sigstore/cosign-installer from 3.6.0 to 3.7.0. ([\#17798](https://github.com/element-hq/synapse/issues/17798))
* Bump tomli from 2.0.1 to 2.0.2. ([\#17796](https://github.com/element-hq/synapse/issues/17796))
* Bump types-requests from 2.32.0.20240914 to 2.32.0.20241016. ([\#17841](https://github.com/element-hq/synapse/issues/17841))
* Bump types-setuptools from 75.1.0.20240917 to 75.1.0.20241014. ([\#17828](https://github.com/element-hq/synapse/issues/17828))
# Synapse 1.117.0 (2024-10-15)
No significant changes since 1.117.0rc1.

Cargo.lock generated

@@ -13,9 +13,9 @@ dependencies = [
[[package]]
name = "anyhow"
version = "1.0.91"
version = "1.0.89"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c042108f3ed77fd83760a5fd79b53be043192bb3b9dba91d8c574c0ada7850c8"
checksum = "86fdf8605db99b54d3cd748a44c6d04df638eb5dafb219b135d0149bd0db01f6"
[[package]]
name = "arc-swap"
@@ -67,9 +67,9 @@ checksum = "79296716171880943b8470b5f8d03aa55eb2e645a4874bdbb28adb49162e012c"
[[package]]
name = "bytes"
version = "1.8.0"
version = "1.7.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9ac0150caa2ae65ca5bd83f25c7de183dea78d4d366469f148435e2acfbad0da"
checksum = "428d9aa8fbc0670b7b8d6030a7fadd0f86151cae55e4dbbece15f3780a3dfaf3"
[[package]]
name = "cfg-if"
@@ -302,9 +302,9 @@ checksum = "5b40af805b3121feab8a3c29f04d8ad262fa8e0561883e7653e024ae4479e6de"
[[package]]
name = "proc-macro2"
version = "1.0.89"
version = "1.0.82"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f139b0662de085916d1fb67d2b4169d1addddda1919e696f3252b740b629986e"
checksum = "8ad3d49ab951a01fbaafe34f2ec74122942fe18a3f9814c3268f1bb72042131b"
dependencies = [
"unicode-ident",
]
@@ -444,9 +444,9 @@ dependencies = [
[[package]]
name = "regex"
version = "1.11.1"
version = "1.11.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b544ef1b4eac5dc2db33ea63606ae9ffcfac26c1416a2806ae0bf5f56b201191"
checksum = "38200e5ee88914975b69f657f0801b6f6dccafd44fd9326302a4aaeecfacb1d8"
dependencies = [
"aho-corasick",
"memchr",
@@ -485,18 +485,18 @@ checksum = "94143f37725109f92c262ed2cf5e59bce7498c01bcc1502d7b9afe439a4e9f49"
[[package]]
name = "serde"
version = "1.0.213"
version = "1.0.210"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3ea7893ff5e2466df8d720bb615088341b295f849602c6956047f8f80f0e9bc1"
checksum = "c8e3592472072e6e22e0a54d5904d9febf8508f65fb8552499a1abc7d1078c3a"
dependencies = [
"serde_derive",
]
[[package]]
name = "serde_derive"
version = "1.0.213"
version = "1.0.210"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7e85ad2009c50b58e87caa8cd6dac16bdf511bbfb7af6c33df902396aa480fa5"
checksum = "243902eda00fad750862fc144cea25caca5e20d615af0a81bee94ca738f1df1f"
dependencies = [
"proc-macro2",
"quote",
@@ -505,9 +505,9 @@ dependencies = [
[[package]]
name = "serde_json"
version = "1.0.132"
version = "1.0.128"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d726bfaff4b320266d395898905d0eba0345aae23b54aee3a737e260fd46db03"
checksum = "6ff5456707a1de34e7e37f2a6fd3d3f808c318259cbd01ab6377795054b483d8"
dependencies = [
"itoa",
"memchr",
@@ -551,9 +551,9 @@ checksum = "81cdd64d312baedb58e21336b31bc043b77e01cc99033ce76ef539f78e965ebc"
[[package]]
name = "syn"
version = "2.0.85"
version = "2.0.61"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5023162dfcd14ef8f32034d8bcd4cc5ddc61ef7a247c024a33e24e1f24d21b56"
checksum = "c993ed8ccba56ae856363b1845da7266a7cb78e1d146c8a32d54b45a8b831fc9"
dependencies = [
"proc-macro2",
"quote",


@@ -1 +0,0 @@
Support [MSC4151](https://github.com/matrix-org/matrix-spec-proposals/pull/4151)'s stable report room API.

changelog.d/17627.doc Normal file

@@ -0,0 +1 @@
Clarify when the `user_may_invite` and `user_may_send_3pid_invite` module callbacks are called.


@@ -0,0 +1 @@
Added the `display_name_claim` option to the JWT configuration. This option allows specifying the claim key that contains the user's display name in the JWT payload.

changelog.d/17718.misc Normal file

@@ -0,0 +1 @@
Slight optimization when fetching state/events for Sliding Sync.

changelog.d/17736.bugfix Normal file

@@ -0,0 +1 @@
Fix saving of PNG thumbnails, when the original image is in the CMYK color space.

changelog.d/17752.misc Normal file

@@ -0,0 +1 @@
Add Python 3.13 and Postgres 17 to the test matrix.


@@ -0,0 +1 @@
Implement [MSC4210](https://github.com/matrix-org/matrix-spec-proposals/pull/4210): Remove legacy mentions. Contributed by @tulir @ Beeper.

changelog.d/17785.bugfix Normal file

@@ -0,0 +1 @@
Fix bug with sliding sync where the server would not return state that was added to the `required_state` config.


@@ -1 +0,0 @@
Add a test for downloading and thumbnailing a CMYK JPEG.

changelog.d/17802.doc Normal file

@@ -0,0 +1 @@
Correct documentation to refer to the `--config-path` argument instead of `--config-file`.

changelog.d/17803.misc Normal file

@@ -0,0 +1 @@
Test github token before running release script steps.

changelog.d/17805.bugfix Normal file

@@ -0,0 +1 @@
Fix bug with sliding sync where the server would not return state that was added to the `required_state` config.


@@ -1 +0,0 @@
Avoid lost data on some database query retries.


@@ -1 +0,0 @@
Avoid lost data on some database query retries.


@@ -1 +0,0 @@
Avoid lost data on some database query retries.


@@ -1 +0,0 @@
Avoid lost data on some database query retries.


@@ -1 +0,0 @@
Avoid lost data on some database query retries.


@@ -1 +0,0 @@
Avoid lost data on some database query retries.

changelog.d/17824.misc Normal file

@@ -0,0 +1 @@
Build debian packages for new Ubuntu versions, and stop building for no longer supported versions.

changelog.d/17825.doc Normal file

@@ -0,0 +1 @@
Fix typo in `target_cache_memory_usage` docs.

changelog.d/17826.misc Normal file

@@ -0,0 +1 @@
Enable the `.org.matrix.msc4028.encrypted_event` push rule by default in accordance with [MSC4028](https://github.com/matrix-org/matrix-spec-proposals/pull/4028). Note that the corresponding experimental feature must still be switched on for this push rule to have any effect.


@@ -1 +0,0 @@
Include the destination in the error of 'Destination mismatch' on federation requests.

changelog.d/17835.bugfix Normal file

@@ -0,0 +1 @@
Fix a bug in [MSC4186](https://github.com/matrix-org/matrix-spec-proposals/pull/4186) Sliding Sync that would cause rooms to stay forgotten and hidden even after rejoining.


@@ -1 +0,0 @@
Check if user has membership in a room before tagging it. Contributed by Lama Alosaimi.

changelog.d/17842.misc Normal file

@@ -0,0 +1 @@
Fix some typing issues uncovered by upgrading mypy to 1.11.x.


@@ -1,2 +0,0 @@
Fix a bug in the admin redact endpoint where the background task would not run if a worker was specified in
the config option `run_background_tasks_on`.


@@ -1 +0,0 @@
Minor speed-up of sliding sync by computing extensions results in parallel.


@@ -1 +0,0 @@
Remove usage of internal header encoding API.

debian/changelog vendored

@@ -1,15 +1,3 @@
matrix-synapse-py3 (1.118.0) stable; urgency=medium
* New Synapse release 1.118.0.
-- Synapse Packaging team <packages@matrix.org> Tue, 29 Oct 2024 15:29:53 +0100
matrix-synapse-py3 (1.118.0~rc1) stable; urgency=medium
* New Synapse release 1.118.0rc1.
-- Synapse Packaging team <packages@matrix.org> Tue, 22 Oct 2024 11:48:14 +0100
matrix-synapse-py3 (1.117.0) stable; urgency=medium
* New Synapse release 1.117.0.


@@ -1365,9 +1365,6 @@ _Added in Synapse 1.72.0._
## Redact all the events of a user
This endpoint allows an admin to redact the events of a given user. There are no restrictions on redactions for a
local user. By default, we puppet the user who sent the message to redact it themselves. Redactions for non-local users are issued using the admin user, and will fail in rooms where the admin user is not admin/does not have the specified power level to issue redactions.
The API is
```
POST /_synapse/admin/v1/user/$user_id/redact
```
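For reference, here is a minimal Python sketch of calling this endpoint and then polling the matching `redact_status` endpoint, mirroring what the worker test further down in this diff does. The homeserver URL, user ID, and access token are placeholders.

```
# Sketch of driving the admin redaction API described above.
import time

import requests

BASE = "https://homeserver.example.com"
HEADERS = {"Authorization": "Bearer <admin_access_token>"}

# An empty "rooms" list asks the server to redact the user's events in all
# rooms they are a member of.
resp = requests.post(
    f"{BASE}/_synapse/admin/v1/user/@baduser:example.com/redact",
    json={"rooms": []},
    headers=HEADERS,
)
resp.raise_for_status()
redact_id = resp.json()["redact_id"]

# Redaction runs as a background task; poll until it completes or fails.
while True:
    status = requests.get(
        f"{BASE}/_synapse/admin/v1/user/redact_status/{redact_id}",
        headers=HEADERS,
    ).json()["status"]
    if status in ("complete", "failed"):
        break
    time.sleep(1)

print("redaction finished with status:", status)
```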

poetry.lock generated

@@ -360,38 +360,38 @@ files = [
[[package]]
name = "cryptography"
version = "43.0.3"
version = "43.0.1"
description = "cryptography is a package which provides cryptographic recipes and primitives to Python developers."
optional = false
python-versions = ">=3.7"
files = [
{file = "cryptography-43.0.3-cp37-abi3-macosx_10_9_universal2.whl", hash = "sha256:bf7a1932ac4176486eab36a19ed4c0492da5d97123f1406cf15e41b05e787d2e"},
{file = "cryptography-43.0.3-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:63efa177ff54aec6e1c0aefaa1a241232dcd37413835a9b674b6e3f0ae2bfd3e"},
{file = "cryptography-43.0.3-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e1ce50266f4f70bf41a2c6dc4358afadae90e2a1e5342d3c08883df1675374f"},
{file = "cryptography-43.0.3-cp37-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:443c4a81bb10daed9a8f334365fe52542771f25aedaf889fd323a853ce7377d6"},
{file = "cryptography-43.0.3-cp37-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:74f57f24754fe349223792466a709f8e0c093205ff0dca557af51072ff47ab18"},
{file = "cryptography-43.0.3-cp37-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:9762ea51a8fc2a88b70cf2995e5675b38d93bf36bd67d91721c309df184f49bd"},
{file = "cryptography-43.0.3-cp37-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:81ef806b1fef6b06dcebad789f988d3b37ccaee225695cf3e07648eee0fc6b73"},
{file = "cryptography-43.0.3-cp37-abi3-win32.whl", hash = "sha256:cbeb489927bd7af4aa98d4b261af9a5bc025bd87f0e3547e11584be9e9427be2"},
{file = "cryptography-43.0.3-cp37-abi3-win_amd64.whl", hash = "sha256:f46304d6f0c6ab8e52770addfa2fc41e6629495548862279641972b6215451cd"},
{file = "cryptography-43.0.3-cp39-abi3-macosx_10_9_universal2.whl", hash = "sha256:8ac43ae87929a5982f5948ceda07001ee5e83227fd69cf55b109144938d96984"},
{file = "cryptography-43.0.3-cp39-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:846da004a5804145a5f441b8530b4bf35afbf7da70f82409f151695b127213d5"},
{file = "cryptography-43.0.3-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0f996e7268af62598f2fc1204afa98a3b5712313a55c4c9d434aef49cadc91d4"},
{file = "cryptography-43.0.3-cp39-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:f7b178f11ed3664fd0e995a47ed2b5ff0a12d893e41dd0494f406d1cf555cab7"},
{file = "cryptography-43.0.3-cp39-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:c2e6fc39c4ab499049df3bdf567f768a723a5e8464816e8f009f121a5a9f4405"},
{file = "cryptography-43.0.3-cp39-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:e1be4655c7ef6e1bbe6b5d0403526601323420bcf414598955968c9ef3eb7d16"},
{file = "cryptography-43.0.3-cp39-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:df6b6c6d742395dd77a23ea3728ab62f98379eff8fb61be2744d4679ab678f73"},
{file = "cryptography-43.0.3-cp39-abi3-win32.whl", hash = "sha256:d56e96520b1020449bbace2b78b603442e7e378a9b3bd68de65c782db1507995"},
{file = "cryptography-43.0.3-cp39-abi3-win_amd64.whl", hash = "sha256:0c580952eef9bf68c4747774cde7ec1d85a6e61de97281f2dba83c7d2c806362"},
{file = "cryptography-43.0.3-pp310-pypy310_pp73-macosx_10_9_x86_64.whl", hash = "sha256:d03b5621a135bffecad2c73e9f4deb1a0f977b9a8ffe6f8e002bf6c9d07b918c"},
{file = "cryptography-43.0.3-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:a2a431ee15799d6db9fe80c82b055bae5a752bef645bba795e8e52687c69efe3"},
{file = "cryptography-43.0.3-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:281c945d0e28c92ca5e5930664c1cefd85efe80e5c0d2bc58dd63383fda29f83"},
{file = "cryptography-43.0.3-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:f18c716be16bc1fea8e95def49edf46b82fccaa88587a45f8dc0ff6ab5d8e0a7"},
{file = "cryptography-43.0.3-pp39-pypy39_pp73-macosx_10_9_x86_64.whl", hash = "sha256:4a02ded6cd4f0a5562a8887df8b3bd14e822a90f97ac5e544c162899bc467664"},
{file = "cryptography-43.0.3-pp39-pypy39_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:53a583b6637ab4c4e3591a15bc9db855b8d9dee9a669b550f311480acab6eb08"},
{file = "cryptography-43.0.3-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:1ec0bcf7e17c0c5669d881b1cd38c4972fade441b27bda1051665faaa89bdcaa"},
{file = "cryptography-43.0.3-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:2ce6fae5bdad59577b44e4dfed356944fbf1d925269114c28be377692643b4ff"},
{file = "cryptography-43.0.3.tar.gz", hash = "sha256:315b9001266a492a6ff443b61238f956b214dbec9910a081ba5b6646a055a805"},
{file = "cryptography-43.0.1-cp37-abi3-macosx_10_9_universal2.whl", hash = "sha256:8385d98f6a3bf8bb2d65a73e17ed87a3ba84f6991c155691c51112075f9ffc5d"},
{file = "cryptography-43.0.1-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:27e613d7077ac613e399270253259d9d53872aaf657471473ebfc9a52935c062"},
{file = "cryptography-43.0.1-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:68aaecc4178e90719e95298515979814bda0cbada1256a4485414860bd7ab962"},
{file = "cryptography-43.0.1-cp37-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:de41fd81a41e53267cb020bb3a7212861da53a7d39f863585d13ea11049cf277"},
{file = "cryptography-43.0.1-cp37-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:f98bf604c82c416bc829e490c700ca1553eafdf2912a91e23a79d97d9801372a"},
{file = "cryptography-43.0.1-cp37-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:61ec41068b7b74268fa86e3e9e12b9f0c21fcf65434571dbb13d954bceb08042"},
{file = "cryptography-43.0.1-cp37-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:014f58110f53237ace6a408b5beb6c427b64e084eb451ef25a28308270086494"},
{file = "cryptography-43.0.1-cp37-abi3-win32.whl", hash = "sha256:2bd51274dcd59f09dd952afb696bf9c61a7a49dfc764c04dd33ef7a6b502a1e2"},
{file = "cryptography-43.0.1-cp37-abi3-win_amd64.whl", hash = "sha256:666ae11966643886c2987b3b721899d250855718d6d9ce41b521252a17985f4d"},
{file = "cryptography-43.0.1-cp39-abi3-macosx_10_9_universal2.whl", hash = "sha256:ac119bb76b9faa00f48128b7f5679e1d8d437365c5d26f1c2c3f0da4ce1b553d"},
{file = "cryptography-43.0.1-cp39-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1bbcce1a551e262dfbafb6e6252f1ae36a248e615ca44ba302df077a846a8806"},
{file = "cryptography-43.0.1-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:58d4e9129985185a06d849aa6df265bdd5a74ca6e1b736a77959b498e0505b85"},
{file = "cryptography-43.0.1-cp39-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:d03a475165f3134f773d1388aeb19c2d25ba88b6a9733c5c590b9ff7bbfa2e0c"},
{file = "cryptography-43.0.1-cp39-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:511f4273808ab590912a93ddb4e3914dfd8a388fed883361b02dea3791f292e1"},
{file = "cryptography-43.0.1-cp39-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:80eda8b3e173f0f247f711eef62be51b599b5d425c429b5d4ca6a05e9e856baa"},
{file = "cryptography-43.0.1-cp39-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:38926c50cff6f533f8a2dae3d7f19541432610d114a70808f0926d5aaa7121e4"},
{file = "cryptography-43.0.1-cp39-abi3-win32.whl", hash = "sha256:a575913fb06e05e6b4b814d7f7468c2c660e8bb16d8d5a1faf9b33ccc569dd47"},
{file = "cryptography-43.0.1-cp39-abi3-win_amd64.whl", hash = "sha256:d75601ad10b059ec832e78823b348bfa1a59f6b8d545db3a24fd44362a1564cb"},
{file = "cryptography-43.0.1-pp310-pypy310_pp73-macosx_10_9_x86_64.whl", hash = "sha256:ea25acb556320250756e53f9e20a4177515f012c9eaea17eb7587a8c4d8ae034"},
{file = "cryptography-43.0.1-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:c1332724be35d23a854994ff0b66530119500b6053d0bd3363265f7e5e77288d"},
{file = "cryptography-43.0.1-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:fba1007b3ef89946dbbb515aeeb41e30203b004f0b4b00e5e16078b518563289"},
{file = "cryptography-43.0.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:5b43d1ea6b378b54a1dc99dd8a2b5be47658fe9a7ce0a58ff0b55f4b43ef2b84"},
{file = "cryptography-43.0.1-pp39-pypy39_pp73-macosx_10_9_x86_64.whl", hash = "sha256:88cce104c36870d70c49c7c8fd22885875d950d9ee6ab54df2745f83ba0dc365"},
{file = "cryptography-43.0.1-pp39-pypy39_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:9d3cdb25fa98afdd3d0892d132b8d7139e2c087da1712041f6b762e4f807cc96"},
{file = "cryptography-43.0.1-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:e710bf40870f4db63c3d7d929aa9e09e4e7ee219e703f949ec4073b4294f6172"},
{file = "cryptography-43.0.1-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:7c05650fe8023c5ed0d46793d4b7d7e6cd9c04e68eabe5b0aeea836e37bdcec2"},
{file = "cryptography-43.0.1.tar.gz", hash = "sha256:203e92a75716d8cfb491dc47c79e17d0d9207ccffcbcb35f598fbe463ae3444d"},
]
[package.dependencies]
@@ -404,7 +404,7 @@ nox = ["nox"]
pep8test = ["check-sdist", "click", "mypy", "ruff"]
sdist = ["build"]
ssh = ["bcrypt (>=3.1.5)"]
test = ["certifi", "cryptography-vectors (==43.0.3)", "pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
test = ["certifi", "cryptography-vectors (==43.0.1)", "pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
test-randomorder = ["pytest-randomly"]
[[package]]
@@ -1451,13 +1451,13 @@ dev = ["jinja2"]
[[package]]
name = "phonenumbers"
version = "8.13.48"
version = "8.13.47"
description = "Python version of Google's common library for parsing, formatting, storing and validating international phone numbers."
optional = false
python-versions = "*"
files = [
{file = "phonenumbers-8.13.48-py2.py3-none-any.whl", hash = "sha256:5c51939acefa390eb74119750afb10a85d3c628dc83fd62c52d6f532fcf5d205"},
{file = "phonenumbers-8.13.48.tar.gz", hash = "sha256:62d8df9b0f3c3c41571c6b396f044ddd999d61631534001b8be7fdf7ba1b18f3"},
{file = "phonenumbers-8.13.47-py2.py3-none-any.whl", hash = "sha256:5d3c0142ef7055ca5551884352e3b6b93bfe002a0bc95b8eaba39b0e2184541b"},
{file = "phonenumbers-8.13.47.tar.gz", hash = "sha256:53c5e7c6d431cafe4efdd44956078404ae9bc8b0eacc47be3105d3ccc88aaffa"},
]
[[package]]
@@ -1974,13 +1974,13 @@ six = ">=1.5"
[[package]]
name = "python-multipart"
version = "0.0.16"
version = "0.0.12"
description = "A streaming multipart parser for Python"
optional = false
python-versions = ">=3.8"
files = [
{file = "python_multipart-0.0.16-py3-none-any.whl", hash = "sha256:c2759b7b976ef3937214dfb592446b59dfaa5f04682a076f78b117c94776d87a"},
{file = "python_multipart-0.0.16.tar.gz", hash = "sha256:8dee37b88dab9b59922ca173c35acb627cc12ec74019f5cd4578369c6df36554"},
{file = "python_multipart-0.0.12-py3-none-any.whl", hash = "sha256:43dcf96cf65888a9cd3423544dd0d75ac10f7aa0c3c28a175bbcd00c9ce1aebf"},
{file = "python_multipart-0.0.12.tar.gz", hash = "sha256:045e1f98d719c1ce085ed7f7e1ef9d8ccc8c02ba02b5566d5f7521410ced58cb"},
]
[[package]]
@@ -2277,29 +2277,29 @@ files = [
[[package]]
name = "ruff"
version = "0.7.1"
version = "0.6.9"
description = "An extremely fast Python linter and code formatter, written in Rust."
optional = false
python-versions = ">=3.7"
files = [
{file = "ruff-0.7.1-py3-none-linux_armv6l.whl", hash = "sha256:cb1bc5ed9403daa7da05475d615739cc0212e861b7306f314379d958592aaa89"},
{file = "ruff-0.7.1-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:27c1c52a8d199a257ff1e5582d078eab7145129aa02721815ca8fa4f9612dc35"},
{file = "ruff-0.7.1-py3-none-macosx_11_0_arm64.whl", hash = "sha256:588a34e1ef2ea55b4ddfec26bbe76bc866e92523d8c6cdec5e8aceefeff02d99"},
{file = "ruff-0.7.1-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:94fc32f9cdf72dc75c451e5f072758b118ab8100727168a3df58502b43a599ca"},
{file = "ruff-0.7.1-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:985818742b833bffa543a84d1cc11b5e6871de1b4e0ac3060a59a2bae3969250"},
{file = "ruff-0.7.1-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:32f1e8a192e261366c702c5fb2ece9f68d26625f198a25c408861c16dc2dea9c"},
{file = "ruff-0.7.1-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:699085bf05819588551b11751eff33e9ca58b1b86a6843e1b082a7de40da1565"},
{file = "ruff-0.7.1-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:344cc2b0814047dc8c3a8ff2cd1f3d808bb23c6658db830d25147339d9bf9ea7"},
{file = "ruff-0.7.1-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4316bbf69d5a859cc937890c7ac7a6551252b6a01b1d2c97e8fc96e45a7c8b4a"},
{file = "ruff-0.7.1-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:79d3af9dca4c56043e738a4d6dd1e9444b6d6c10598ac52d146e331eb155a8ad"},
{file = "ruff-0.7.1-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:c5c121b46abde94a505175524e51891f829414e093cd8326d6e741ecfc0a9112"},
{file = "ruff-0.7.1-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:8422104078324ea250886954e48f1373a8fe7de59283d747c3a7eca050b4e378"},
{file = "ruff-0.7.1-py3-none-musllinux_1_2_i686.whl", hash = "sha256:56aad830af8a9db644e80098fe4984a948e2b6fc2e73891538f43bbe478461b8"},
{file = "ruff-0.7.1-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:658304f02f68d3a83c998ad8bf91f9b4f53e93e5412b8f2388359d55869727fd"},
{file = "ruff-0.7.1-py3-none-win32.whl", hash = "sha256:b517a2011333eb7ce2d402652ecaa0ac1a30c114fbbd55c6b8ee466a7f600ee9"},
{file = "ruff-0.7.1-py3-none-win_amd64.whl", hash = "sha256:f38c41fcde1728736b4eb2b18850f6d1e3eedd9678c914dede554a70d5241307"},
{file = "ruff-0.7.1-py3-none-win_arm64.whl", hash = "sha256:19aa200ec824c0f36d0c9114c8ec0087082021732979a359d6f3c390a6ff2a37"},
{file = "ruff-0.7.1.tar.gz", hash = "sha256:9d8a41d4aa2dad1575adb98a82870cf5db5f76b2938cf2206c22c940034a36f4"},
{file = "ruff-0.6.9-py3-none-linux_armv6l.whl", hash = "sha256:064df58d84ccc0ac0fcd63bc3090b251d90e2a372558c0f057c3f75ed73e1ccd"},
{file = "ruff-0.6.9-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:140d4b5c9f5fc7a7b074908a78ab8d384dd7f6510402267bc76c37195c02a7ec"},
{file = "ruff-0.6.9-py3-none-macosx_11_0_arm64.whl", hash = "sha256:53fd8ca5e82bdee8da7f506d7b03a261f24cd43d090ea9db9a1dc59d9313914c"},
{file = "ruff-0.6.9-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:645d7d8761f915e48a00d4ecc3686969761df69fb561dd914a773c1a8266e14e"},
{file = "ruff-0.6.9-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:eae02b700763e3847595b9d2891488989cac00214da7f845f4bcf2989007d577"},
{file = "ruff-0.6.9-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7d5ccc9e58112441de8ad4b29dcb7a86dc25c5f770e3c06a9d57e0e5eba48829"},
{file = "ruff-0.6.9-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:417b81aa1c9b60b2f8edc463c58363075412866ae4e2b9ab0f690dc1e87ac1b5"},
{file = "ruff-0.6.9-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3c866b631f5fbce896a74a6e4383407ba7507b815ccc52bcedabb6810fdb3ef7"},
{file = "ruff-0.6.9-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7b118afbb3202f5911486ad52da86d1d52305b59e7ef2031cea3425142b97d6f"},
{file = "ruff-0.6.9-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a67267654edc23c97335586774790cde402fb6bbdb3c2314f1fc087dee320bfa"},
{file = "ruff-0.6.9-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:3ef0cc774b00fec123f635ce5c547dac263f6ee9fb9cc83437c5904183b55ceb"},
{file = "ruff-0.6.9-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:12edd2af0c60fa61ff31cefb90aef4288ac4d372b4962c2864aeea3a1a2460c0"},
{file = "ruff-0.6.9-py3-none-musllinux_1_2_i686.whl", hash = "sha256:55bb01caeaf3a60b2b2bba07308a02fca6ab56233302406ed5245180a05c5625"},
{file = "ruff-0.6.9-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:925d26471fa24b0ce5a6cdfab1bb526fb4159952385f386bdcc643813d472039"},
{file = "ruff-0.6.9-py3-none-win32.whl", hash = "sha256:eb61ec9bdb2506cffd492e05ac40e5bc6284873aceb605503d8494180d6fc84d"},
{file = "ruff-0.6.9-py3-none-win_amd64.whl", hash = "sha256:785d31851c1ae91f45b3d8fe23b8ae4b5170089021fbb42402d811135f0b7117"},
{file = "ruff-0.6.9-py3-none-win_arm64.whl", hash = "sha256:a9641e31476d601f83cd602608739a0840e348bda93fec9f1ee816f8b6798b93"},
{file = "ruff-0.6.9.tar.gz", hash = "sha256:b076ef717a8e5bc819514ee1d602bbdca5b4420ae13a9cf61a0c0a4f53a2baa2"},
]
[[package]]
@@ -2783,13 +2783,13 @@ files = [
[[package]]
name = "types-psycopg2"
version = "2.9.21.20241019"
version = "2.9.21.20240819"
description = "Typing stubs for psycopg2"
optional = false
python-versions = ">=3.8"
files = [
{file = "types-psycopg2-2.9.21.20241019.tar.gz", hash = "sha256:bca89b988d2ebd19bcd08b177d22a877ea8b841decb10ed130afcf39404612fa"},
{file = "types_psycopg2-2.9.21.20241019-py3-none-any.whl", hash = "sha256:44d091e67732d16a941baae48cd7b53bf91911bc36888652447cf1ef0c1fb3f6"},
{file = "types-psycopg2-2.9.21.20240819.tar.gz", hash = "sha256:4ed6b47464d6374fa64e5e3b234cea0f710e72123a4596d67ab50b7415a84666"},
{file = "types_psycopg2-2.9.21.20240819-py3-none-any.whl", hash = "sha256:c9192311c27d7ad561eef705f1b2df1074f2cdcf445a98a6a2fcaaaad43278cf"},
]
[[package]]
@@ -2834,13 +2834,13 @@ urllib3 = ">=2"
[[package]]
name = "types-setuptools"
version = "75.2.0.20241019"
version = "75.1.0.20241014"
description = "Typing stubs for setuptools"
optional = false
python-versions = ">=3.8"
files = [
{file = "types-setuptools-75.2.0.20241019.tar.gz", hash = "sha256:86ea31b5f6df2c6b8f2dc8ae3f72b213607f62549b6fa2ed5866e5299f968694"},
{file = "types_setuptools-75.2.0.20241019-py3-none-any.whl", hash = "sha256:2e48ff3acd4919471e80d5e3f049cce5c177e108d5d36d2d4cee3fa4d4104258"},
{file = "types-setuptools-75.1.0.20241014.tar.gz", hash = "sha256:29b0560a8d4b4a91174be085847002c69abfcb048e20b33fc663005aedf56804"},
{file = "types_setuptools-75.1.0.20241014-py3-none-any.whl", hash = "sha256:caab58366741fb99673d0138b6e2d760717f154cfb981b74fea5e8de40f0b703"},
]
[[package]]
@@ -3122,4 +3122,4 @@ user-search = ["pyicu"]
[metadata]
lock-version = "2.0"
python-versions = "^3.8.0"
content-hash = "aa1f6d97809596c23a6d160c0c5804971dad0ba49e34b137bbfb79df038fe6f0"
content-hash = "c8a22f901970b2f851151e731532757fd3acf7ba02930952636d2e6c5c9c0c90"


@@ -97,7 +97,7 @@ module-name = "synapse.synapse_rust"
[tool.poetry]
name = "matrix-synapse"
version = "1.118.0"
version = "1.117.0"
description = "Homeserver for the Matrix decentralised comms protocol"
authors = ["Matrix.org Team and Contributors <packages@matrix.org>"]
license = "AGPL-3.0-or-later"
@@ -320,7 +320,7 @@ all = [
# failing on new releases. Keeping lower bounds loose here means that dependabot
# can bump versions without having to update the content-hash in the lockfile.
# This helps prevents merge conflicts when running a batch of dependabot updates.
ruff = "0.7.1"
ruff = "0.6.9"
# Type checking only works with the pydantic.v1 compat module from pydantic v2
pydantic = "^2"


@@ -113,7 +113,7 @@ class Authenticator:
):
raise AuthenticationError(
HTTPStatus.UNAUTHORIZED,
f"Destination mismatch in auth header, received: {destination!r}",
"Destination mismatch in auth header",
Codes.UNAUTHORIZED,
)
if (

View File

@@ -73,8 +73,6 @@ class AdminHandler:
self._redact_all_events, REDACT_ALL_EVENTS_ACTION_NAME
)
self.hs = hs
async def get_redact_task(self, redact_id: str) -> Optional[ScheduledTask]:
"""Get the current status of an active redaction process
@@ -425,10 +423,8 @@ class AdminHandler:
user_id = task.params.get("user_id")
assert user_id is not None
# puppet the user if they're ours, otherwise use admin to redact
requester = create_requester(
user_id if self.hs.is_mine_id(user_id) else admin.user.to_string(),
authenticated_entity=admin.user.to_string(),
user_id, authenticated_entity=admin.user.to_string()
)
reason = task.params.get("reason")


@@ -1190,26 +1190,6 @@ class RoomMemberHandler(metaclass=abc.ABCMeta):
origin_server_ts=origin_server_ts,
)
async def check_for_any_membership_in_room(
self, *, user_id: str, room_id: str
) -> None:
"""
Check if the user has any membership in the room and raise error if not.
Args:
user_id: The user to check.
room_id: The room to check.
Raises:
AuthError if the user doesn't have any membership in the room.
"""
result = await self.store.get_local_current_membership_for_user_in_room(
user_id=user_id, room_id=room_id
)
if result is None or result == (None, None):
raise AuthError(403, f"User {user_id} has no membership in room {room_id}")
async def _should_perform_remote_join(
self,
user_id: str,


@@ -49,10 +49,7 @@ from synapse.types.handlers.sliding_sync import (
SlidingSyncConfig,
SlidingSyncResult,
)
from synapse.util.async_helpers import (
concurrently_execute,
gather_optional_coroutines,
)
from synapse.util.async_helpers import concurrently_execute
if TYPE_CHECKING:
from synapse.server import HomeServer
@@ -100,26 +97,26 @@ class SlidingSyncExtensionHandler:
if sync_config.extensions is None:
return SlidingSyncResult.Extensions()
to_device_coro = None
to_device_response = None
if sync_config.extensions.to_device is not None:
to_device_coro = self.get_to_device_extension_response(
to_device_response = await self.get_to_device_extension_response(
sync_config=sync_config,
to_device_request=sync_config.extensions.to_device,
to_token=to_token,
)
e2ee_coro = None
e2ee_response = None
if sync_config.extensions.e2ee is not None:
e2ee_coro = self.get_e2ee_extension_response(
e2ee_response = await self.get_e2ee_extension_response(
sync_config=sync_config,
e2ee_request=sync_config.extensions.e2ee,
to_token=to_token,
from_token=from_token,
)
account_data_coro = None
account_data_response = None
if sync_config.extensions.account_data is not None:
account_data_coro = self.get_account_data_extension_response(
account_data_response = await self.get_account_data_extension_response(
sync_config=sync_config,
previous_connection_state=previous_connection_state,
new_connection_state=new_connection_state,
@@ -130,9 +127,9 @@ class SlidingSyncExtensionHandler:
from_token=from_token,
)
receipts_coro = None
receipts_response = None
if sync_config.extensions.receipts is not None:
receipts_coro = self.get_receipts_extension_response(
receipts_response = await self.get_receipts_extension_response(
sync_config=sync_config,
previous_connection_state=previous_connection_state,
new_connection_state=new_connection_state,
@@ -144,9 +141,9 @@ class SlidingSyncExtensionHandler:
from_token=from_token,
)
typing_coro = None
typing_response = None
if sync_config.extensions.typing is not None:
typing_coro = self.get_typing_extension_response(
typing_response = await self.get_typing_extension_response(
sync_config=sync_config,
actual_lists=actual_lists,
actual_room_ids=actual_room_ids,
@@ -156,20 +153,6 @@ class SlidingSyncExtensionHandler:
from_token=from_token,
)
(
to_device_response,
e2ee_response,
account_data_response,
receipts_response,
typing_response,
) = await gather_optional_coroutines(
to_device_coro,
e2ee_coro,
account_data_coro,
receipts_coro,
typing_coro,
)
return SlidingSyncResult.Extensions(
to_device=to_device_response,
e2ee=e2ee_response,


@@ -51,17 +51,25 @@ logger = logging.getLogger(__name__)
# "Hop-by-hop" headers (as opposed to "end-to-end" headers) as defined by RFC2616
# section 13.5.1 and referenced in RFC9110 section 7.6.1. These are meant to only be
# consumed by the immediate recipient and not be forwarded on.
HOP_BY_HOP_HEADERS_LOWERCASE = {
"connection",
"keep-alive",
"proxy-authenticate",
"proxy-authorization",
"te",
"trailers",
"transfer-encoding",
"upgrade",
HOP_BY_HOP_HEADERS = {
"Connection",
"Keep-Alive",
"Proxy-Authenticate",
"Proxy-Authorization",
"TE",
"Trailers",
"Transfer-Encoding",
"Upgrade",
}
assert all(header.lower() == header for header in HOP_BY_HOP_HEADERS_LOWERCASE)
if hasattr(Headers, "_canonicalNameCaps"):
# Twisted < 24.7.0rc1
_canonicalHeaderName = Headers()._canonicalNameCaps # type: ignore[attr-defined]
else:
# Twisted >= 24.7.0rc1
# But note that `_encodeName` still exists on prior versions,
# it just encodes differently
_canonicalHeaderName = Headers()._encodeName
def parse_connection_header_value(
@@ -84,12 +92,12 @@ def parse_connection_header_value(
Returns:
The set of header names that should not be copied over from the remote response.
The keys are lowercased.
The keys are capitalized in canonical capitalization.
"""
extra_headers_to_remove: Set[str] = set()
if connection_header_value:
extra_headers_to_remove = {
connection_option.decode("ascii").strip().lower()
_canonicalHeaderName(connection_option.strip()).decode("ascii")
for connection_option in connection_header_value.split(b",")
}
@@ -186,7 +194,7 @@ class ProxyResource(_AsyncResource):
# The `Connection` header also defines which headers should not be copied over.
connection_header = response_headers.getRawHeaders(b"connection")
extra_headers_to_remove_lowercase = parse_connection_header_value(
extra_headers_to_remove = parse_connection_header_value(
connection_header[0] if connection_header else None
)
@@ -194,10 +202,10 @@ class ProxyResource(_AsyncResource):
for k, v in response_headers.getAllRawHeaders():
# Do not copy over any hop-by-hop headers. These are meant to only be
# consumed by the immediate recipient and not be forwarded on.
header_key_lowercase = k.decode("ascii").lower()
header_key = k.decode("ascii")
if (
header_key_lowercase in HOP_BY_HOP_HEADERS_LOWERCASE
or header_key_lowercase in extra_headers_to_remove_lowercase
header_key in HOP_BY_HOP_HEADERS
or header_key in extra_headers_to_remove
):
continue


@@ -37,7 +37,6 @@ import warnings
from types import TracebackType
from typing import (
TYPE_CHECKING,
Any,
Awaitable,
Callable,
Optional,
@@ -851,45 +850,6 @@ def run_in_background(
return d
def run_coroutine_in_background(
coroutine: typing.Coroutine[Any, Any, R],
) -> "defer.Deferred[R]":
"""Run the coroutine, ensuring that the current context is restored after
return from the function, and that the sentinel context is set once the
deferred returned by the function completes.
Useful for wrapping coroutines that you don't yield or await on (for
instance because you want to pass it to deferred.gatherResults()).
This is a special case of `run_in_background` where we can accept a
coroutine directly rather than a function. We can do this because coroutines
do not run until called, and so calling an async function without awaiting
cannot change the log contexts.
"""
current = current_context()
d = defer.ensureDeferred(coroutine)
# The function may have reset the context before returning, so
# we need to restore it now.
ctx = set_current_context(current)
# The original context will be restored when the deferred
# completes, but there is nothing waiting for it, so it will
# get leaked into the reactor or some other function which
# wasn't expecting it. We therefore need to reset the context
# here.
#
# (If this feels asymmetric, consider it this way: we are
# effectively forking a new thread of execution. We are
# probably currently within a ``with LoggingContext()`` block,
# which is supposed to have a single entry and exit point. But
# by spawning off another deferred, we are effectively
# adding a new exit point.)
d.addBoth(_set_context_cb, ctx)
return d
T = TypeVar("T")


@@ -20,13 +20,11 @@
#
import logging
import re
from http import HTTPStatus
from typing import TYPE_CHECKING, Tuple
from synapse._pydantic_compat import StrictStr
from synapse.api.errors import AuthError, Codes, NotFoundError, SynapseError
from synapse.api.urls import CLIENT_API_PREFIX
from synapse.http.server import HttpServer
from synapse.http.servlet import (
RestServlet,
@@ -107,17 +105,18 @@ class ReportEventRestServlet(RestServlet):
class ReportRoomRestServlet(RestServlet):
"""This endpoint lets clients report a room for abuse.
Introduced by MSC4151: https://github.com/matrix-org/matrix-spec-proposals/pull/4151
Whilst MSC4151 is not yet merged, this unstable endpoint is enabled on matrix.org
for content moderation purposes, and therefore backwards compatibility should be
carefully considered when changing anything on this endpoint.
More details on the MSC: https://github.com/matrix-org/matrix-spec-proposals/pull/4151
"""
# Cast the Iterable to a list so that we can `append` below.
PATTERNS = list(
client_patterns(
"/rooms/(?P<room_id>[^/]*)/report$",
releases=("v3",),
unstable=False,
v1=False,
)
PATTERNS = client_patterns(
"/org.matrix.msc4151/rooms/(?P<room_id>[^/]*)/report$",
releases=[],
v1=False,
unstable=True,
)
def __init__(self, hs: "HomeServer"):
@@ -127,16 +126,6 @@ class ReportRoomRestServlet(RestServlet):
self.clock = hs.get_clock()
self.store = hs.get_datastores().main
# TODO: Remove the unstable variant after 2-3 releases
# https://github.com/element-hq/synapse/issues/17373
if hs.config.experimental.msc4151_enabled:
self.PATTERNS.append(
re.compile(
f"^{CLIENT_API_PREFIX}/unstable/org.matrix.msc4151"
"/rooms/(?P<room_id>[^/]*)/report$"
)
)
class PostBody(RequestBodyModel):
reason: StrictStr
@@ -164,4 +153,6 @@ class ReportRoomRestServlet(RestServlet):
def register_servlets(hs: "HomeServer", http_server: HttpServer) -> None:
ReportEventRestServlet(hs).register(http_server)
ReportRoomRestServlet(hs).register(http_server)
if hs.config.experimental.msc4151_enabled:
ReportRoomRestServlet(hs).register(http_server)


@@ -78,7 +78,6 @@ class TagServlet(RestServlet):
super().__init__()
self.auth = hs.get_auth()
self.handler = hs.get_account_data_handler()
self.room_member_handler = hs.get_room_member_handler()
async def on_PUT(
self, request: SynapseRequest, user_id: str, room_id: str, tag: str
@@ -86,12 +85,6 @@ class TagServlet(RestServlet):
requester = await self.auth.get_user_by_req(request)
if user_id != requester.user.to_string():
raise AuthError(403, "Cannot add tags for other users.")
# Check if the user has any membership in the room and raise error if not.
# Although it's not harmful for users to tag random rooms, it's just superfluous
# data we don't need to track or allow.
await self.room_member_handler.check_for_any_membership_in_room(
user_id=user_id, room_id=room_id
)
body = parse_json_object_from_request(request)


@@ -249,7 +249,6 @@ class HomeServer(metaclass=abc.ABCMeta):
"""
REQUIRED_ON_BACKGROUND_TASK_STARTUP = [
"admin",
"account_validity",
"auth",
"deactivate_account",


@@ -1422,7 +1422,7 @@ class DeviceWorkerStore(RoomMemberWorkerStore, EndToEndKeyWorkerStore):
DELETE FROM device_lists_outbound_last_success
WHERE destination = ? AND user_id = ?
"""
txn.execute_batch(sql, [(row[0], row[1]) for row in rows])
txn.execute_batch(sql, ((row[0], row[1]) for row in rows))
logger.info("Pruned %d device list outbound pokes", count)


@@ -1686,7 +1686,7 @@ class PersistEventsStore:
"""
txn.execute_batch(
sql,
[
(
(
stream_id,
self._instance_name,
@@ -1699,17 +1699,17 @@ class PersistEventsStore:
state_key,
)
for etype, state_key in itertools.chain(to_delete, to_insert)
],
),
)
# Now we actually update the current_state_events table
txn.execute_batch(
"DELETE FROM current_state_events"
" WHERE room_id = ? AND type = ? AND state_key = ?",
[
(
(room_id, etype, state_key)
for etype, state_key in itertools.chain(to_delete, to_insert)
],
),
)
# We include the membership in the current state table, hence we do
@@ -1799,11 +1799,11 @@ class PersistEventsStore:
txn.execute_batch(
"DELETE FROM local_current_membership"
" WHERE room_id = ? AND user_id = ?",
[
(
(room_id, state_key)
for etype, state_key in itertools.chain(to_delete, to_insert)
if etype == EventTypes.Member and self.is_mine_id(state_key)
],
),
)
if to_insert:
@@ -3208,7 +3208,7 @@ class PersistEventsStore:
if notifiable_events:
txn.execute_batch(
sql,
[
(
(
event.room_id,
event.internal_metadata.stream_ordering,
@@ -3216,18 +3216,18 @@ class PersistEventsStore:
event.event_id,
)
for event in notifiable_events
],
),
)
# Now we delete the staging area for *all* events that were being
# persisted.
txn.execute_batch(
"DELETE FROM event_push_actions_staging WHERE event_id = ?",
[
(
(event.event_id,)
for event, _ in all_events_and_contexts
if event.internal_metadata.is_notifiable()
],
),
)
def _remove_push_actions_for_event_id_txn(


@@ -729,10 +729,10 @@ class MediaRepositoryStore(MediaRepositoryBackgroundUpdateStore):
txn.execute_batch(
sql,
[
(
(time_ms, media_origin, media_id)
for media_origin, media_id in remote_media
],
),
)
sql = (
@@ -740,7 +740,7 @@ class MediaRepositoryStore(MediaRepositoryBackgroundUpdateStore):
" WHERE media_id = ?"
)
txn.execute_batch(sql, [(time_ms, media_id) for media_id in local_media])
txn.execute_batch(sql, ((time_ms, media_id) for media_id in local_media))
await self.db_pool.runInteraction(
"update_cached_last_access_time", update_cache_txn


@@ -1175,7 +1175,7 @@ class RoomWorkerStore(CacheInvalidationWorkerStore):
SET quarantined_by = ?
WHERE media_origin = ? AND media_id = ?
""",
[(quarantined_by, origin, media_id) for origin, media_id in remote_mxcs],
((quarantined_by, origin, media_id) for origin, media_id in remote_mxcs),
)
total_media_quarantined += txn.rowcount if txn.rowcount > 0 else 0


@@ -94,7 +94,7 @@ class SearchWorkerStore(SQLBaseStore):
VALUES (?,?,?,to_tsvector('english', ?),?,?)
"""
args1 = [
args1 = (
(
entry.event_id,
entry.room_id,
@@ -104,7 +104,7 @@ class SearchWorkerStore(SQLBaseStore):
entry.origin_server_ts,
)
for entry in entries
]
)
txn.execute_batch(sql, args1)


@@ -804,11 +804,11 @@ class StateGroupDataStore(StateBackgroundUpdateStore, SQLBaseStore):
logger.info("[purge] removing redundant state groups")
txn.execute_batch(
"DELETE FROM state_groups_state WHERE state_group = ?",
[(sg,) for sg in state_groups_to_delete],
((sg,) for sg in state_groups_to_delete),
)
txn.execute_batch(
"DELETE FROM state_groups WHERE id = ?",
[(sg,) for sg in state_groups_to_delete],
((sg,) for sg in state_groups_to_delete),
)
@trace


@@ -51,7 +51,7 @@ from typing import (
)
import attr
from typing_extensions import Concatenate, Literal, ParamSpec, Unpack
from typing_extensions import Concatenate, Literal, ParamSpec
from twisted.internet import defer
from twisted.internet.defer import CancelledError
@@ -61,7 +61,6 @@ from twisted.python.failure import Failure
from synapse.logging.context import (
PreserveLoggingContext,
make_deferred_yieldable,
run_coroutine_in_background,
run_in_background,
)
from synapse.util import Clock
@@ -345,7 +344,6 @@ T1 = TypeVar("T1")
T2 = TypeVar("T2")
T3 = TypeVar("T3")
T4 = TypeVar("T4")
T5 = TypeVar("T5")
@overload
@@ -404,112 +402,6 @@ def gather_results( # type: ignore[misc]
return deferred.addCallback(tuple)
@overload
async def gather_optional_coroutines(
*coroutines: Unpack[Tuple[Optional[Coroutine[Any, Any, T1]]]],
) -> Tuple[Optional[T1]]: ...
@overload
async def gather_optional_coroutines(
*coroutines: Unpack[
Tuple[
Optional[Coroutine[Any, Any, T1]],
Optional[Coroutine[Any, Any, T2]],
]
],
) -> Tuple[Optional[T1], Optional[T2]]: ...
@overload
async def gather_optional_coroutines(
*coroutines: Unpack[
Tuple[
Optional[Coroutine[Any, Any, T1]],
Optional[Coroutine[Any, Any, T2]],
Optional[Coroutine[Any, Any, T3]],
]
],
) -> Tuple[Optional[T1], Optional[T2], Optional[T3]]: ...
@overload
async def gather_optional_coroutines(
*coroutines: Unpack[
Tuple[
Optional[Coroutine[Any, Any, T1]],
Optional[Coroutine[Any, Any, T2]],
Optional[Coroutine[Any, Any, T3]],
Optional[Coroutine[Any, Any, T4]],
]
],
) -> Tuple[Optional[T1], Optional[T2], Optional[T3], Optional[T4]]: ...
@overload
async def gather_optional_coroutines(
*coroutines: Unpack[
Tuple[
Optional[Coroutine[Any, Any, T1]],
Optional[Coroutine[Any, Any, T2]],
Optional[Coroutine[Any, Any, T3]],
Optional[Coroutine[Any, Any, T4]],
Optional[Coroutine[Any, Any, T5]],
]
],
) -> Tuple[Optional[T1], Optional[T2], Optional[T3], Optional[T4], Optional[T5]]: ...
async def gather_optional_coroutines(
*coroutines: Unpack[Tuple[Optional[Coroutine[Any, Any, T1]], ...]],
) -> Tuple[Optional[T1], ...]:
"""Helper function that allows waiting on multiple coroutines at once.
The return value is a tuple of the return values of the coroutines in order.
If a `None` is passed instead of a coroutine, it will be ignored and a None
is returned in the tuple.
Note: For typechecking we need to have an explicit overload for each
distinct number of coroutines passed in. If you see type problems, it's
likely because you're using many arguments and you need to add a new
overload above.
"""
try:
results = await make_deferred_yieldable(
defer.gatherResults(
[
run_coroutine_in_background(coroutine)
for coroutine in coroutines
if coroutine is not None
],
consumeErrors=True,
)
)
results_iter = iter(results)
return tuple(
next(results_iter) if coroutine is not None else None
for coroutine in coroutines
)
except defer.FirstError as dfe:
# unwrap the error from defer.gatherResults.
# The raised exception's traceback only includes func() etc if
# the 'await' happens before the exception is thrown - ie if the failure
# happens *asynchronously* - otherwise Twisted throws away the traceback as it
# could be large.
#
# We could maybe reconstruct a fake traceback from Failure.frames. Or maybe
# we could throw Twisted into the fires of Mordor.
# suppress exception chaining, because the FirstError doesn't tell us anything
# very interesting.
assert isinstance(dfe.subFailure.value, BaseException)
raise dfe.subFailure.value from None
@attr.s(slots=True, auto_attribs=True)
class _LinearizerEntry:
# The number of things executing.


@@ -903,18 +903,12 @@ class FederationClientProxyTests(BaseMultiWorkerStreamTestCase):
headers=Headers(
{
"Content-Type": ["application/json"],
# Define some hop-by-hop headers (try with varying casing to
# make sure we still match-up the headers)
"Connection": ["close, X-fOo, X-Bar", "X-baz"],
"Connection": ["close, X-Foo, X-Bar"],
# Should be removed because it's defined in the `Connection` header
"X-Foo": ["foo"],
"X-Bar": ["bar"],
# (not in canonical case)
"x-baZ": ["baz"],
# Should be removed because it's a hop-by-hop header
"Proxy-Authorization": "abcdef",
# Should be removed because it's a hop-by-hop header (not in canonical case)
"transfer-EnCoDiNg": "abcdef",
}
),
)


@@ -30,19 +30,19 @@ from tests.unittest import TestCase
class ProxyTests(TestCase):
@parameterized.expand(
[
[b"close, X-Foo, X-Bar", {"close", "x-foo", "x-bar"}],
[b"close, X-Foo, X-Bar", {"Close", "X-Foo", "X-Bar"}],
# No whitespace
[b"close,X-Foo,X-Bar", {"close", "x-foo", "x-bar"}],
[b"close,X-Foo,X-Bar", {"Close", "X-Foo", "X-Bar"}],
# More whitespace
[b"close, X-Foo, X-Bar", {"close", "x-foo", "x-bar"}],
[b"close, X-Foo, X-Bar", {"Close", "X-Foo", "X-Bar"}],
# "close" directive in not the first position
[b"X-Foo, X-Bar, close", {"x-foo", "x-bar", "close"}],
[b"X-Foo, X-Bar, close", {"X-Foo", "X-Bar", "Close"}],
# Normalizes header capitalization
[b"keep-alive, x-fOo, x-bAr", {"keep-alive", "x-foo", "x-bar"}],
[b"keep-alive, x-fOo, x-bAr", {"Keep-Alive", "X-Foo", "X-Bar"}],
# Handles header names with whitespace
[
b"keep-alive, x foo, x bar",
{"keep-alive", "x foo", "x bar"},
{"Keep-Alive", "X foo", "X bar"},
],
]
)


@@ -60,7 +60,7 @@ from synapse.util import Clock
from tests import unittest
from tests.server import FakeChannel
from tests.test_utils import SMALL_CMYK_JPEG, SMALL_PNG
from tests.test_utils import SMALL_PNG
from tests.unittest import override_config
from tests.utils import default_config
@@ -187,68 +187,6 @@ small_png_with_transparency = TestImage(
# different versions of Pillow.
)
small_cmyk_jpeg = TestImage(
SMALL_CMYK_JPEG,
b"image/jpeg",
b".jpeg",
# These values were sourced simply by seeing at what the tests produced at
# the time of writing. If this changes, the tests will fail.
unhexlify(
b"ffd8ffe000104a46494600010100000100010000ffdb00430006"
b"040506050406060506070706080a100a0a09090a140e0f0c1017"
b"141818171416161a1d251f1a1b231c1616202c20232627292a29"
b"191f2d302d283025282928ffdb0043010707070a080a130a0a13"
b"281a161a28282828282828282828282828282828282828282828"
b"2828282828282828282828282828282828282828282828282828"
b"2828ffc00011080020002003012200021101031101ffc4001f00"
b"0001050101010101010000000000000000010203040506070809"
b"0a0bffc400b5100002010303020403050504040000017d010203"
b"00041105122131410613516107227114328191a1082342b1c115"
b"52d1f02433627282090a161718191a25262728292a3435363738"
b"393a434445464748494a535455565758595a636465666768696a"
b"737475767778797a838485868788898a92939495969798999aa2"
b"a3a4a5a6a7a8a9aab2b3b4b5b6b7b8b9bac2c3c4c5c6c7c8c9ca"
b"d2d3d4d5d6d7d8d9dae1e2e3e4e5e6e7e8e9eaf1f2f3f4f5f6f7"
b"f8f9faffc4001f01000301010101010101010100000000000001"
b"02030405060708090a0bffc400b5110002010204040304070504"
b"0400010277000102031104052131061241510761711322328108"
b"144291a1b1c109233352f0156272d10a162434e125f11718191a"
b"262728292a35363738393a434445464748494a53545556575859"
b"5a636465666768696a737475767778797a82838485868788898a"
b"92939495969798999aa2a3a4a5a6a7a8a9aab2b3b4b5b6b7b8b9"
b"bac2c3c4c5c6c7c8c9cad2d3d4d5d6d7d8d9dae2e3e4e5e6e7e8"
b"e9eaf2f3f4f5f6f7f8f9faffda000c03010002110311003f00fa"
b"a68a28a0028a28a0028a28a0028a28a00fffd9"
),
unhexlify(
b"ffd8ffe000104a46494600010100000100010000ffdb00430006"
b"040506050406060506070706080a100a0a09090a140e0f0c1017"
b"141818171416161a1d251f1a1b231c1616202c20232627292a29"
b"191f2d302d283025282928ffdb0043010707070a080a130a0a13"
b"281a161a28282828282828282828282828282828282828282828"
b"2828282828282828282828282828282828282828282828282828"
b"2828ffc00011080001000103012200021101031101ffc4001f00"
b"0001050101010101010000000000000000010203040506070809"
b"0a0bffc400b5100002010303020403050504040000017d010203"
b"00041105122131410613516107227114328191a1082342b1c115"
b"52d1f02433627282090a161718191a25262728292a3435363738"
b"393a434445464748494a535455565758595a636465666768696a"
b"737475767778797a838485868788898a92939495969798999aa2"
b"a3a4a5a6a7a8a9aab2b3b4b5b6b7b8b9bac2c3c4c5c6c7c8c9ca"
b"d2d3d4d5d6d7d8d9dae1e2e3e4e5e6e7e8e9eaf1f2f3f4f5f6f7"
b"f8f9faffc4001f01000301010101010101010100000000000001"
b"02030405060708090a0bffc400b5110002010204040304070504"
b"0400010277000102031104052131061241510761711322328108"
b"144291a1b1c109233352f0156272d10a162434e125f11718191a"
b"262728292a35363738393a434445464748494a53545556575859"
b"5a636465666768696a737475767778797a82838485868788898a"
b"92939495969798999aa2a3a4a5a6a7a8a9aab2b3b4b5b6b7b8b9"
b"bac2c3c4c5c6c7c8c9cad2d3d4d5d6d7d8d9dae2e3e4e5e6e7e8"
b"e9eaf2f3f4f5f6f7f8f9faffda000c03010002110311003f00fa"
b"a68a28a00fffd9"
),
)
small_lossless_webp = TestImage(
unhexlify(
b"524946461a000000574542505650384c0d0000002f0000001007" b"1011118888fe0700"


@@ -23,7 +23,6 @@ import hashlib
import hmac
import json
import os
import time
import urllib.parse
from binascii import unhexlify
from http import HTTPStatus
@@ -57,7 +56,6 @@ from synapse.types import JsonDict, UserID, create_requester
from synapse.util import Clock
from tests import unittest
from tests.replication._base import BaseMultiWorkerStreamTestCase
from tests.test_utils import SMALL_PNG
from tests.unittest import override_config
@@ -5129,6 +5127,7 @@ class UserRedactionTestCase(unittest.HomeserverTestCase):
"""
Test that request to redact events in all rooms user is member of is successful
"""
# join rooms, send some messages
originals = []
for rm in [self.rm1, self.rm2, self.rm3]:
@@ -5405,98 +5404,3 @@ class UserRedactionTestCase(unittest.HomeserverTestCase):
matches.append((event_id, event))
# we redacted 6 messages
self.assertEqual(len(matches), 6)
class UserRedactionBackgroundTaskTestCase(BaseMultiWorkerStreamTestCase):
servlets = [
synapse.rest.admin.register_servlets,
login.register_servlets,
admin.register_servlets,
room.register_servlets,
sync.register_servlets,
]
def prepare(self, reactor: MemoryReactor, clock: Clock, hs: HomeServer) -> None:
self.admin = self.register_user("thomas", "pass", True)
self.admin_tok = self.login("thomas", "pass")
self.bad_user = self.register_user("teresa", "pass")
self.bad_user_tok = self.login("teresa", "pass")
# create rooms - room versions 11+ store the `redacts` key in content while
# earlier ones don't, so we use a mix of room versions
self.rm1 = self.helper.create_room_as(
self.admin, tok=self.admin_tok, room_version="7"
)
self.rm2 = self.helper.create_room_as(self.admin, tok=self.admin_tok)
self.rm3 = self.helper.create_room_as(
self.admin, tok=self.admin_tok, room_version="11"
)
@override_config({"run_background_tasks_on": "worker1"})
def test_redact_messages_all_rooms(self) -> None:
"""
Test that redact task successfully runs when `run_background_tasks_on` is specified
"""
self.make_worker_hs(
"synapse.app.generic_worker",
extra_config={
"worker_name": "worker1",
"run_background_tasks_on": "worker1",
"redis": {"enabled": True},
},
)
# join rooms, send some messages
original_event_ids = set()
for rm in [self.rm1, self.rm2, self.rm3]:
join = self.helper.join(rm, self.bad_user, tok=self.bad_user_tok)
original_event_ids.add(join["event_id"])
for i in range(15):
event = {"body": f"hello{i}", "msgtype": "m.text"}
res = self.helper.send_event(
rm, "m.room.message", event, tok=self.bad_user_tok, expect_code=200
)
original_event_ids.add(res["event_id"])
# redact all events in all rooms
channel = self.make_request(
"POST",
f"/_synapse/admin/v1/user/{self.bad_user}/redact",
content={"rooms": []},
access_token=self.admin_tok,
)
self.assertEqual(channel.code, 200)
id = channel.json_body.get("redact_id")
timeout_s = 10
start_time = time.time()
redact_result = ""
while redact_result != "complete":
if start_time + timeout_s < time.time():
self.fail("Timed out waiting for redactions.")
channel2 = self.make_request(
"GET",
f"/_synapse/admin/v1/user/redact_status/{id}",
access_token=self.admin_tok,
)
redact_result = channel2.json_body["status"]
if redact_result == "failed":
self.fail("Redaction task failed.")
redaction_ids = set()
for rm in [self.rm1, self.rm2, self.rm3]:
filter = json.dumps({"types": [EventTypes.Redaction]})
channel = self.make_request(
"GET",
f"rooms/{rm}/messages?filter={filter}&limit=50",
access_token=self.admin_tok,
)
self.assertEqual(channel.code, 200)
for event in channel.json_body["chunk"]:
if event["type"] == "m.room.redaction":
redaction_ids.add(event["redacts"])
self.assertIncludes(redaction_ids, original_event_ids, exact=True)
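For context, the removed test exercises the admin redaction flow end to end: the admin POSTs to `/_synapse/admin/v1/user/<user_id>/redact` (an empty `rooms` list meaning "all rooms"), then polls `/_synapse/admin/v1/user/redact_status/<redact_id>` until the background task reports `complete`. A minimal sketch of the same flow from an external admin client might look like the following; the `requests` usage, homeserver URL, token and user ID are illustrative assumptions, not part of this diff.

import time
import requests

BASE = "https://homeserver.example"      # assumed homeserver URL
ADMIN_TOKEN = "syt_admin_token"          # assumed admin access token
headers = {"Authorization": f"Bearer {ADMIN_TOKEN}"}

# Ask the server to redact all of a user's events; an empty "rooms" list
# means "every room they are a member of", as in the test above.
resp = requests.post(
    f"{BASE}/_synapse/admin/v1/user/@teresa:example.org/redact",
    json={"rooms": []},
    headers=headers,
)
resp.raise_for_status()
redact_id = resp.json()["redact_id"]

# Poll the status endpoint until the background task finishes or times out.
deadline = time.time() + 10
while True:
    status = requests.get(
        f"{BASE}/_synapse/admin/v1/user/redact_status/{redact_id}",
        headers=headers,
    ).json()["status"]
    if status == "complete":
        break
    if status == "failed" or time.time() > deadline:
        raise RuntimeError(f"redaction did not complete: {status}")
    time.sleep(0.5)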

View File

@@ -66,7 +66,6 @@ from tests.media.test_media_storage import (
SVG,
TestImage,
empty_file,
small_cmyk_jpeg,
small_lossless_webp,
small_png,
small_png_with_transparency,
@@ -1917,7 +1916,6 @@ class RemoteDownloadLimiterTestCase(unittest.HomeserverTestCase):
test_images = [
small_png,
small_png_with_transparency,
small_cmyk_jpeg,
small_lossless_webp,
empty_file,
SVG,
@@ -2402,7 +2400,7 @@ class DownloadAndThumbnailTestCase(unittest.HomeserverTestCase):
if expected_body is not None:
self.assertEqual(
channel.result["body"], expected_body, channel.result["body"].hex()
channel.result["body"], expected_body, channel.result["body"]
)
else:
# ensure that the result is at least some valid image

View File

@@ -156,31 +156,58 @@ class ReportRoomTestCase(unittest.HomeserverTestCase):
self.room_id = self.helper.create_room_as(
self.other_user, tok=self.other_user_tok, is_public=True
)
self.report_path = f"/_matrix/client/v3/rooms/{self.room_id}/report"
self.report_path = (
f"/_matrix/client/unstable/org.matrix.msc4151/rooms/{self.room_id}/report"
)
@unittest.override_config(
{
"experimental_features": {"msc4151_enabled": True},
}
)
def test_reason_str(self) -> None:
data = {"reason": "this makes me sad"}
self._assert_status(200, data)
@unittest.override_config(
{
"experimental_features": {"msc4151_enabled": True},
}
)
def test_no_reason(self) -> None:
data = {"not_reason": "for typechecking"}
self._assert_status(400, data)
@unittest.override_config(
{
"experimental_features": {"msc4151_enabled": True},
}
)
def test_reason_nonstring(self) -> None:
data = {"reason": 42}
self._assert_status(400, data)
@unittest.override_config(
{
"experimental_features": {"msc4151_enabled": True},
}
)
def test_reason_null(self) -> None:
data = {"reason": None}
self._assert_status(400, data)
@unittest.override_config(
{
"experimental_features": {"msc4151_enabled": True},
}
)
def test_cannot_report_nonexistent_room(self) -> None:
"""
Tests that we don't accept event reports for rooms which do not exist.
"""
channel = self.make_request(
"POST",
"/_matrix/client/v3/rooms/!bloop:example.org/report",
"/_matrix/client/unstable/org.matrix.msc4151/rooms/!bloop:example.org/report",
{"reason": "i am very sad"},
access_token=self.other_user_tok,
shorthand=False,
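The change above moves these tests back to the unstable MSC4151 prefix: a room is reported by POSTing a JSON body with a string `reason` to `/_matrix/client/unstable/org.matrix.msc4151/rooms/<room_id>/report`, and the server rejects a missing or non-string reason with 400. A rough client-side sketch of that call; the `requests` usage, homeserver URL, token and room ID are illustrative assumptions.

import requests

BASE = "https://homeserver.example"   # assumed homeserver URL
TOKEN = "syt_user_token"              # assumed user access token
room_id = "!abcdefg:example.org"      # assumed room ID

# Report a room under the unstable MSC4151 prefix; "reason" must be a string.
resp = requests.post(
    f"{BASE}/_matrix/client/unstable/org.matrix.msc4151/rooms/{room_id}/report",
    json={"reason": "this makes me sad"},
    headers={"Authorization": f"Bearer {TOKEN}"},
)
print(resp.status_code)  # 200 on success, 400 for a missing or non-string reason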

View File

@@ -1,95 +0,0 @@
#
# This file is licensed under the Affero General Public License (AGPL) version 3.
#
# Copyright (C) 2024 New Vector, Ltd
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# See the GNU Affero General Public License for more details:
# <https://www.gnu.org/licenses/agpl-3.0.html>.
#
"""Tests REST events for /tags paths."""
from http import HTTPStatus
import synapse.rest.admin
from synapse.rest.client import login, room, tags
from tests import unittest
class RoomTaggingTestCase(unittest.HomeserverTestCase):
"""Tests /user/$user_id/rooms/$room_id/tags/$tag REST API."""
servlets = [
room.register_servlets,
tags.register_servlets,
login.register_servlets,
synapse.rest.admin.register_servlets_for_client_rest_resource,
]
def test_put_tag_checks_room_membership(self) -> None:
"""
Test that a user can add a tag to a room if they have membership in the room.
"""
user1_id = self.register_user("user1", "pass")
user1_tok = self.login(user1_id, "pass")
room_id = self.helper.create_room_as(user1_id, tok=user1_tok)
tag = "test_tag"
# Make the request
channel = self.make_request(
"PUT",
f"/user/{user1_id}/rooms/{room_id}/tags/{tag}",
content={"order": 0.5},
access_token=user1_tok,
)
# Check that the request was successful
self.assertEqual(channel.code, HTTPStatus.OK, channel.result)
def test_put_tag_fails_if_not_in_room(self) -> None:
"""
Test that a user cannot add a tag to a room if they don't have membership in the
room.
"""
user1_id = self.register_user("user1", "pass")
user1_tok = self.login(user1_id, "pass")
user2_id = self.register_user("user2", "pass")
user2_tok = self.login(user2_id, "pass")
# Create the room with user2 (user1 has no membership in the room)
room_id = self.helper.create_room_as(user2_id, tok=user2_tok)
tag = "test_tag"
# Make the request
channel = self.make_request(
"PUT",
f"/user/{user1_id}/rooms/{room_id}/tags/{tag}",
content={"order": 0.5},
access_token=user1_tok,
)
# Check that the request failed with the correct error
self.assertEqual(channel.code, HTTPStatus.FORBIDDEN, channel.result)
def test_put_tag_fails_if_room_does_not_exist(self) -> None:
"""
Test that a user cannot add a tag to a room if the room doesn't exist (and they
therefore have no membership in it).
"""
user1_id = self.register_user("user1", "pass")
user1_tok = self.login(user1_id, "pass")
room_id = "!nonexistent:test"
tag = "test_tag"
# Make the request
channel = self.make_request(
"PUT",
f"/user/{user1_id}/rooms/{room_id}/tags/{tag}",
content={"order": 0.5},
access_token=user1_tok,
)
# Check that the request failed with the correct error
self.assertEqual(channel.code, HTTPStatus.FORBIDDEN, channel.result)
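The deleted file covered the room-tag endpoint: a user PUTs a JSON body such as {"order": 0.5} to `/user/<user_id>/rooms/<room_id>/tags/<tag>` under the client API prefix, and the request is rejected with 403 when the user is not joined to the room. A minimal external-client sketch of the same request follows; the `/_matrix/client/v3` prefix, homeserver URL, token, user ID and room ID are illustrative assumptions.

import requests

BASE = "https://homeserver.example"      # assumed homeserver URL
TOKEN = "syt_user_token"                 # assumed user access token
user_id = "@user1:example.org"           # assumed user ID
room_id = "!roomid:example.org"          # assumed room ID

# Tag a room the user is joined to; "order" controls how clients sort the tag.
resp = requests.put(
    f"{BASE}/_matrix/client/v3/user/{user_id}/rooms/{room_id}/tags/test_tag",
    json={"order": 0.5},
    headers={"Authorization": f"Bearer {TOKEN}"},
)
print(resp.status_code)  # 200 when joined, 403 when not a member of the room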

View File

@@ -23,7 +23,6 @@
Utilities for running the unit tests
"""
import base64
import json
import sys
import warnings
@@ -139,21 +138,3 @@ SMALL_PNG = unhexlify(
b"0000001f15c4890000000a49444154789c63000100000500010d"
b"0a2db40000000049454e44ae426082"
)
# A small CMYK-encoded JPEG image used in some tests.
#
# Generated with:
# img = PIL.Image.new('CMYK', (1, 1), (0, 0, 0, 0))
# img.save('minimal_cmyk.jpg', 'JPEG')
#
# Resolution: 1x1, MIME type: image/jpeg, Extension: jpeg, Size: 4 KiB
SMALL_CMYK_JPEG = base64.b64decode("""
/9j/7gAOQWRvYmUAZAAAAAAA/9sAQwAIBgYHBgUIBwcHCQkICgwUDQwLCww
ZEhMPFB0aHx4dGhwcICQuJyAiLCMcHCg3KSwwMTQ0NB8nOT04MjwuMzQy/8
AAFAgAAQABBEMRAE0RAFkRAEsRAP/EAB8AAAEFAQEBAQEBAAAAAAAAAAABA
gMEBQYHCAkKC//EALUQAAIBAwMCBAMFBQQEAAABfQECAwAEEQUSITFBBhNR
YQcicRQygZGhCCNCscEVUtHwJDNicoIJChYXGBkaJSYnKCkqNDU2Nzg5OkN
ERUZHSElKU1RVVldYWVpjZGVmZ2hpanN0dXZ3eHl6g4SFhoeIiYqSk5SVlp
eYmZqio6Slpqeoqaqys7S1tre4ubrCw8TFxsfIycrS09TV1tfY2drh4uPk5
ebn6Onq8fLz9PX29/j5+v/aAA4EQwBNAFkASwAAPwD3+vf69/r3+v/Z
""")

View File

@@ -18,7 +18,7 @@
#
#
import traceback
from typing import Any, Coroutine, Generator, List, NoReturn, Optional, Tuple, TypeVar
from typing import Generator, List, NoReturn, Optional
from parameterized import parameterized_class
@@ -39,7 +39,6 @@ from synapse.util.async_helpers import (
ObservableDeferred,
concurrently_execute,
delay_cancellation,
gather_optional_coroutines,
stop_cancellation,
timeout_deferred,
)
@@ -47,8 +46,6 @@ from synapse.util.async_helpers import (
from tests.server import get_clock
from tests.unittest import TestCase
T = TypeVar("T")
class ObservableDeferredTest(TestCase):
def test_succeed(self) -> None:
@@ -591,106 +588,3 @@ class AwakenableSleeperTests(TestCase):
sleeper.wake("name")
self.assertTrue(d1.called)
self.assertTrue(d2.called)
class GatherCoroutineTests(TestCase):
"""Tests for `gather_optional_coroutines`"""
def make_coroutine(self) -> Tuple[Coroutine[Any, Any, T], "defer.Deferred[T]"]:
"""Returns a coroutine and a deferred that it is waiting on to resolve"""
d: "defer.Deferred[T]" = defer.Deferred()
async def inner() -> T:
with PreserveLoggingContext():
return await d
return inner(), d
def test_single(self) -> None:
"Test passing in a single coroutine works"
with LoggingContext("test_ctx") as text_ctx:
deferred: "defer.Deferred[None]"
coroutine, deferred = self.make_coroutine()
gather_deferred = defer.ensureDeferred(
gather_optional_coroutines(coroutine)
)
# We shouldn't have a result yet, and should be in the sentinel
# context.
self.assertNoResult(gather_deferred)
self.assertEqual(current_context(), SENTINEL_CONTEXT)
# Resolving the deferred will resolve the coroutine
deferred.callback(None)
# All coroutines have resolved, and so we should have the results
result = self.successResultOf(gather_deferred)
self.assertEqual(result, (None,))
# We should be back in the normal context.
self.assertEqual(current_context(), text_ctx)
def test_multiple_resolve(self) -> None:
"Test passing in multiple coroutine that all resolve works"
with LoggingContext("test_ctx") as test_ctx:
deferred1: "defer.Deferred[int]"
coroutine1, deferred1 = self.make_coroutine()
deferred2: "defer.Deferred[str]"
coroutine2, deferred2 = self.make_coroutine()
gather_deferred = defer.ensureDeferred(
gather_optional_coroutines(coroutine1, coroutine2)
)
# We shouldn't have a result yet, and should be in the sentinel
# context.
self.assertNoResult(gather_deferred)
self.assertEqual(current_context(), SENTINEL_CONTEXT)
# Even if we resolve one of the coroutines, we shouldn't have a result
# yet
deferred2.callback("test")
self.assertNoResult(gather_deferred)
self.assertEqual(current_context(), SENTINEL_CONTEXT)
deferred1.callback(1)
# All coroutines have resolved, and so we should have the results
result = self.successResultOf(gather_deferred)
self.assertEqual(result, (1, "test"))
# We should be back in the normal context.
self.assertEqual(current_context(), test_ctx)
def test_multiple_fail(self) -> None:
"Test passing in multiple coroutine where one fails does the right thing"
with LoggingContext("test_ctx") as test_ctx:
deferred1: "defer.Deferred[int]"
coroutine1, deferred1 = self.make_coroutine()
deferred2: "defer.Deferred[str]"
coroutine2, deferred2 = self.make_coroutine()
gather_deferred = defer.ensureDeferred(
gather_optional_coroutines(coroutine1, coroutine2)
)
# We shouldn't have a result yet, and should be in the sentinel
# context.
self.assertNoResult(gather_deferred)
self.assertEqual(current_context(), SENTINEL_CONTEXT)
# Throw an exception in one of the coroutines
exc = Exception("test")
deferred2.errback(exc)
# Expect the gather deferred to immediately fail
result_exc = self.failureResultOf(gather_deferred)
self.assertEqual(result_exc.value, exc)
# We should be back in the normal context.
self.assertEqual(current_context(), test_ctx)
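The removed tests describe the contract of `gather_optional_coroutines`: it awaits several coroutines concurrently, returns their results as a tuple in argument order, fails fast as soon as any of them raises, and restores the caller's logging context either way. A rough usage sketch based on that behaviour; the handling of `None` arguments is inferred from the helper's name and typing rather than from the tests above, and the lookup functions are made-up stand-ins.

from synapse.util.async_helpers import gather_optional_coroutines

async def load_profile(user_id: str) -> dict:
    # Stand-in for a real async lookup.
    return {"user_id": user_id}

async def load_presence(user_id: str) -> str:
    # Stand-in for a real async lookup.
    return "online"

async def handler(user_id: str, include_presence: bool) -> None:
    # The coroutines run concurrently; a None slot is assumed to simply yield
    # None in the result tuple, keeping call sites simple when a lookup is
    # optional.
    profile, presence = await gather_optional_coroutines(
        load_profile(user_id),
        load_presence(user_id) if include_presence else None,
    )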