Compare commits


90 Commits

Author SHA1 Message Date
Devon Hudson
1246e54d7f 1.125.0 2025-02-25 08:10:32 -07:00
Quentin Gliech
30fcd586fe Tweak changelog 2025-02-18 14:47:31 +01:00
Quentin Gliech
4aa725a730 1.125.0rc1 2025-02-18 14:33:34 +01:00
qashlan
2d4f28915e Add method to get current server time in milliseconds in ModuleApi (#18144)
- Add a `get_current_time_msec()` method to the [module
API](https://matrix-org.github.io/synapse/latest/modules/writing_a_module.html)
for sound time comparisons with Synapse (see the sketch below).
- Fixes #18104
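
A minimal sketch of how a module might use the new method. The module class, its config handling, and the assumption that the call is synchronous are hypothetical; only `ModuleApi.get_current_time_msec()` comes from this PR.

```python
from synapse.module_api import ModuleApi


class ExampleModule:
    """Hypothetical module that compares timestamps against Synapse's clock."""

    def __init__(self, config: dict, api: ModuleApi):
        self._api = api

    def is_expired(self, expires_at_ms: int) -> bool:
        # Use Synapse's own clock so comparisons stay consistent with the
        # timestamps Synapse itself generates.
        return self._api.get_current_time_msec() >= expires_at_ms
```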

Signed-off-by: Ahmed Qashlan <ahmedelqashlan@gmail.com>
Co-authored-by: Eric Eastwood <madlittlemods@gmail.com>
Co-authored-by: Erik Johnston <erikj@jki.re>
2025-02-18 10:20:30 +00:00
Eric Eastwood
12dc6b102f Add support to proxy outbound requests from Synapse in tests (#18158)
Adds new environment variables that can be used with the Docker image
(`SYNAPSE_HTTP_PROXY`/`SYNAPSE_HTTPS_PROXY`/`SYNAPSE_NO_PROXY`)

Useful for things like the [Secure Border
Gateway](https://element.io/server-suite/secure-border-gateways)


### Why is this necessary?

You can already configure the `HTTP_PROXY`/`HTTPS_PROXY` environment
variables to proxy outbound requests, but setting them globally in the
Docker image affects all processes. That isn't always desirable, or even
workable when the proxy runs inside the Docker image itself (the Debian
packages would fail to download because the proxy isn't up and running
yet). Adding Synapse-specific environment variables
(`SYNAPSE_HTTP_PROXY`/`SYNAPSE_HTTPS_PROXY`/`SYNAPSE_NO_PROXY`) makes
the proxying much more targeted.
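
To illustrate why per-application variables are easier to target, here is a hedged sketch of the kind of entrypoint wrapper the Docker image could use. The wrapper itself and the homeserver invocation details are assumptions for illustration, not the image's actual code.

```python
import os
import subprocess

# Map the Synapse-specific proxy variables onto the standard ones for the
# Synapse process only, leaving the rest of the container untouched.
env = dict(os.environ)
for synapse_var, standard_var in [
    ("SYNAPSE_HTTP_PROXY", "HTTP_PROXY"),
    ("SYNAPSE_HTTPS_PROXY", "HTTPS_PROXY"),
    ("SYNAPSE_NO_PROXY", "NO_PROXY"),
]:
    value = env.get(synapse_var)
    if value is not None:
        env[standard_var] = value

# Launch Synapse with the adjusted environment; other processes in the image
# (e.g. package installation) never see the proxy settings.
subprocess.run(
    ["python", "-m", "synapse.app.homeserver", "--config-path", "/data/homeserver.yaml"],
    env=env,
    check=True,
)
```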
2025-02-17 10:23:04 -06:00
Erik Johnston
0c31783b4f Limit size of user directory search queries (#18172)
If a user search has many words, we can end up creating really large
queries that take a long time for the database to process. Generally,
such searches don't return any results anyway (due to limits on user ID
and display name length).

We "fix" this by cheating and only searching for the first ten words.
2025-02-17 15:39:26 +00:00
V02460
e462950338 Document consequences of replacing secrets (#18138)
Document consequences of replacing secrets. The covered config options
are `registration_shared_secret`, `macaroon_secret_key`, `form_secret`
and `worker_replication_secret`.

Even though I looked at the source code to check the added documentation
is right, I would appreciate additional verification of the statements
made.

In a hand-wavy attempt at classifying how bad the consequences of
secret replacement are, I added some explanations as warnings and others
as regular paragraphs.

Closes #17971 

2025-02-17 14:25:42 +00:00
dependabot[bot]
3e34f5ccc7 Bump hiredis from 3.0.0 to 3.1.0 (#18169) 2025-02-17 14:08:51 +00:00
dependabot[bot]
8ae9d9e8c5 Bump service-identity from 24.1.0 to 24.2.0 (#18171) 2025-02-17 14:08:21 +00:00
dependabot[bot]
22bb3c50d1 Bump twine from 6.0.1 to 6.1.0 (#18170) 2025-02-17 14:08:11 +00:00
Andrew Morgan
74a70190ab Update rc_presence config docs with int burst_count (#18159) 2025-02-14 17:23:40 +07:00
Eric Eastwood
0b1830b121 Document missing server config options (#18122)
I was looking into the `USE_FROZEN_DICTS` option during the review of
https://github.com/element-hq/synapse/pull/18103#discussion_r1935876168
and noticed that there are several other server config options that
aren't in the docs.
2025-02-12 11:16:06 -06:00
Devon Hudson
74aa47828d Add log message when worker lock timeouts get large (#18124)
This is to help track down a possible, but very rare, worker deadlock
that was seen on matrix.org.
In theory, you could work back from an instance of these new logs to the
approximate time when the lock was obtained and focus the diagnostic
efforts there.
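
A hedged sketch of the general pattern; the threshold, helper name, and log message are illustrative, not Synapse's actual code.

```python
import logging
import threading
import time
from contextlib import contextmanager

logger = logging.getLogger(__name__)
LOCK_WAIT_WARN_THRESHOLD_S = 30.0  # hypothetical threshold


@contextmanager
def timed_lock(lock: threading.Lock):
    """Acquire a lock, warning if the wait grows suspiciously long."""
    start = time.monotonic()
    with lock:
        waited = time.monotonic() - start
        if waited > LOCK_WAIT_WARN_THRESHOLD_S:
            # Leaves a trail pointing back to roughly when the lock was taken out.
            logger.warning("Waited %.1fs to acquire worker lock", waited)
        yield


# usage: with timed_lock(threading.Lock()): ...
```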

2025-02-12 14:37:56 +00:00
qashlan
816054b012 Fix internal server error when updating 3pid address with invalid email (#18125)
When updating a user's email 3PID via the admin API with an invalid
email address, the server threw a 500 Internal Server Error.
This has been changed to a 400 Bad Request that returns the error message.

Signed-off-by: qashlan <ahmedelqashlan@gmail.com>
Signed-off-by: Ahmed Qashlan <ahmedelqashlan@gmail.com>
2025-02-12 14:06:21 +00:00
dependabot[bot]
aaffc3566e Bump ulid from 1.1.4 to 1.2.0 (#18148) 2025-02-12 13:33:14 +00:00
dependabot[bot]
fe3f462b79 Bump sigstore/cosign-installer from 3.7.0 to 3.8.0 (#18147) 2025-02-12 13:30:36 +00:00
dependabot[bot]
c274839234 Bump bytes from 1.9.0 to 1.10.0 (#18149) 2025-02-12 13:29:23 +00:00
dependabot[bot]
5a833ebbc8 Bump bcrypt from 4.2.0 to 4.2.1 (#18127) 2025-02-12 13:25:21 +00:00
dependabot[bot]
30418653fd Bump gitpython from 3.1.43 to 3.1.44 (#18128) 2025-02-12 13:24:47 +00:00
dependabot[bot]
26331cbbd5 Bump serde_json from 1.0.137 to 1.0.138 (#18129) 2025-02-12 13:24:38 +00:00
Till Faelligen
d6f9332a6b Merge branch 'master' into develop 2025-02-11 14:27:58 +01:00
Till Faelligen
c1b7c6b12e 1.124.0 2025-02-11 11:56:50 +01:00
Andrew Morgan
c1815bf5a1 Add rc_presence ratelimiting config to demo/start.sh (#18145)
Missed in https://github.com/element-hq/synapse/pull/18000
2025-02-11 14:39:12 +07:00
dependabot[bot]
703f2e8c43 Bump types-pyyaml from 6.0.12.20240917 to 6.0.12.20241230 (#18097) 2025-02-11 00:07:43 +07:00
V02460
068e22b4b7 Cleanup Python 3.8 leftovers (#17967)
Some small cleanups now that Python 3.8 has reached end of life.

- Move some type imports from `typing_extensions` to `typing` (see the sketch below)
- Remove the `abi3-py38` feature from pyo3
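
An illustrative example of the import move; the concrete names moved in this PR may differ.

```python
# Old style, needed while older Pythons were still supported:
#     from typing_extensions import Literal
# New style, now that the minimum Python version provides the name directly:
from typing import Literal

SyncMode = Literal["full", "incremental"]
```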

---------

Co-authored-by: Quentin Gliech <quenting@element.io>
2025-02-10 16:53:24 +00:00
Andrew Ferrazzutti
e4074749d2 Overload "allow_none" on DB pool static method (#17616)

---------

Co-authored-by: Quentin Gliech <quenting@element.io>
2025-02-10 15:37:05 +00:00
meise
8f07ef5c93 feat: Allow multiple values for SSO attribute_requirements via comma separation (#17949)
The current `attribute_requirements` implementation only allows
exact-match attribute values; multiple allowed values for one attribute
are not possible, as described in #13238.
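
An illustrative sketch of the comma-separated matching this PR describes; the helper name and call shape are hypothetical, not Synapse's actual SSO code.

```python
def attribute_matches(required_value: str, actual_values: list[str]) -> bool:
    """Allow a requirement to list several acceptable values, comma-separated."""
    allowed = [v.strip() for v in required_value.split(",")]
    return any(actual in allowed for actual in actual_values)


attribute_matches("staff,admin", ["admin"])  # True
attribute_matches("staff,admin", ["guest"])  # False
```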

---------

Co-authored-by: Sebastian Neuser <pzkz@infra.run>
Co-authored-by: Quentin Gliech <quenting@element.io>
2025-02-10 15:36:21 +00:00
Erik Johnston
4c84c9c4ad Don't log exceptions for obviously incorrect stream tokens (#18139)
We log incorrect ones as we want to catch bugs where Synapse returns bad
tokens. However, sometimes clients just send tokens that are e.g. empty.

---------

Co-authored-by: Eric Eastwood <erice@element.io>
2025-02-10 15:27:46 +00:00
villepeh
deb09b3836 Add Oracle Linux installation instructions (#17436)
I forgot the guide applies to Oracle Linux as well. In fact, I ran a
small homeserver on OEL a few months back.

I did minimal installations of Rocky and OEL on VirtualBox and noticed
the Codeready/Powertools repos aren't required, so I removed those
commands from the guide. I switched `RHEL` references to `EL`.

#17423 was merged before I remembered about OEL, but a new PR shouldn't
hurt :)

---------

Co-authored-by: Quentin Gliech <quenting@element.io>
2025-02-10 08:17:26 +00:00
Erik Johnston
77261301d2 Merge branch 'release-v1.124' into develop 2025-02-07 14:20:18 +00:00
Erik Johnston
0076197c97 1.124.0rc3 2025-02-07 13:42:57 +00:00
Erik Johnston
dcf7b39276 Fix performance of check_state_groups_and_bump_deletion (#18141)
Regressed as part of https://github.com/element-hq/synapse/pull/18107

This does two things:
1. Only check if the state groups have been deleted when calculating the
event context (as that's when we will insert them). This avoids lots of
checks for read operations.
2. Don't lock the `state_groups` rows when doing the check. This adds
overhead, and it doesn't prevent any races.
2025-02-07 10:18:32 +00:00
Erik Johnston
29534e7d0a Merge branch 'release-v1.124' into develop 2025-02-05 18:23:59 +00:00
Erik Johnston
553e9882bf 1.124.0rc2 2025-02-05 16:35:55 +00:00
Erik Johnston
3391da348f Fix bug where persisting some events fails after unclean shutdown. (#18137)
Introduced in #18107

`UniqueViolation: duplicate key value violates unique constraint
"state_groups_persisting_pkey"`
2025-02-05 16:26:07 +00:00
Matthew Hodgson
6fe41d2b47 make dual licensing explicit (#18134)
Update readme & LICENSE files to make it explicit that you can buy a
commercial license as an AGPL alternative from Element.
2025-02-05 13:40:10 +00:00
Erik Johnston
5b03265cfb Fix 'Fix lint' GHA (#18136)
c.f. #18121

---------

Co-authored-by: Quentin Gliech <quenting@element.io>
2025-02-05 12:30:13 +00:00
Erik Johnston
b8a333004a Fix legacy modules check_username_for_spam (#18135)
Broke in #17916, as the signature inspection incorrectly looks at the
wrapper function. We fix this by setting the signature on the wrapper
function to that of the wrapped function via `@functools.wraps`.
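
A minimal illustration of the `functools.wraps` behaviour the fix relies on; the callback and wrapper names here are hypothetical.

```python
import functools
import inspect


def legacy_check_username_for_spam(user_profile):
    """A stand-in for a legacy module callback with the old signature."""
    return False


# Copying the wrapped function's metadata (including __wrapped__) onto the
# wrapper means later inspect.signature() checks see the original callback,
# not the wrapper's generic (*args, **kwargs) signature.
@functools.wraps(legacy_check_username_for_spam)
def wrapper(*args, **kwargs):
    return legacy_check_username_for_spam(*args, **kwargs)


print(inspect.signature(wrapper))  # (user_profile)
```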
2025-02-05 12:07:49 +00:00
V02460
e41174cae3 Add MSC3861 config options admin_token_path and client_secret_path (#18004)
Another PR in my quest to add a `*_path` variant for every secret. Adds
two config options, `admin_token_path` and `client_secret_path`, to the
experimental config under `experimental_features.msc3861`. Also includes
tests.

I tried to be a good citizen here by following `attrs` conventions and
not rewriting the corresponding non-path variants in the class, but
instead adding methods to retrieve the value.

Reading secrets from files has the security advantage of separating the
secrets from the config. It also simplifies secrets management in
Kubernetes, and is useful to NixOS users as well.
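
A hedged sketch of the general `*_path` pattern; the helper function and the example path are hypothetical, not the code added in this PR.

```python
from pathlib import Path
from typing import Optional


def read_secret(inline_value: Optional[str], path_value: Optional[str]) -> Optional[str]:
    """Prefer an inline secret; otherwise read it from the *_path file."""
    if inline_value is not None:
        return inline_value
    if path_value is not None:
        # Secret files commonly end with a trailing newline; strip it.
        return Path(path_value).read_text().strip()
    return None


# e.g. client_secret = read_secret(None, "/run/secrets/msc3861_client_secret")
```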
2025-02-04 12:45:33 -06:00
Erik Johnston
37e893499f 1.124.0rc1 2025-02-04 11:53:27 +00:00
Erik Johnston
c46d452c7c Fix bug where purging history could lead to increase in disk space usage (#18131)
When purging history, we try and delete any state groups that become
unreferenced (i.e. there are no longer any events that directly
reference them). When we delete a state group that is referenced by
another state group, we "de-delta" that state group so that it no longer
refers to the state group that is deleted.

There are two bugs with this approach that we fix here:
1. There is a common pattern where we end up storing two state groups
when persisting a state event: the state before and after the new state
event, where the latter is stored as a delta to the former. When
deleting state groups we only deleted the "new" state and left (and
potentially de-deltaed) the old state. This was due to a bug/typo when
trying to find referenced state groups.
2. There are times where we store unreferenced state groups in the DB,
during the purging of history these would not get rechecked and instead
always de-deltaed. Instead, we should check for this case and delete any
unreferenced state groups rather than de-deltaing them.

The effect of the above bugs is that when purging history we'd end up
with lots of unreferenced state groups that had been de-deltaed (i.e.
stored as the full state). This can lead to dramatic increases in
storage space used.
2025-02-03 19:04:19 +00:00
Erik Johnston
27dbb1b429 Add locking to more safely delete state groups: Part 2 (#18130)
This actually makes it so that deleting state groups goes via the new
mechanism.

c.f. #18107
2025-02-03 17:58:55 +00:00
Erik Johnston
aa6e5c2ecb Add locking to more safely delete state groups: Part 1 (#18107)
Currently we don't really have anything that stops us from deleting
state groups while an in-flight event references them. This is a fairly
rare race at the moment, but we want to be able to delete state groups
more aggressively, so it is important to address this to ensure that the
database remains valid.

This implements the locking, but doesn't actually use it.

See the class docstring of the new data store for an explanation for how
this works.

---------

Co-authored-by: Devon Hudson <devon.dmytro@gmail.com>
2025-02-03 17:29:15 +00:00
Andrew Morgan
ac1bf682ff Allow (un)block_room storage functions to be called on workers (#18119)
This is so workers can call these functions.

This was preventing the [Delete Room Admin
API](https://element-hq.github.io/synapse/latest/admin_api/rooms.html#version-2-new-version)
from succeeding when `block: true` was specified. This was because we
had `run_background_tasks_on` configured to run on a separate worker.

As workers weren't able to call the `block_room` storage function before
this PR, the (delete room) task failed when taken off the queue by the
worker.
2025-01-30 20:48:12 +00:00
Eric Eastwood
a0b70473fc Raise an error if someone is using an incorrect suffix in a config duration string (#18112)
Previously, a value like `5q` would be interpreted as 5 milliseconds. We
should just raise an error instead of letting someone run with a
misconfiguration.
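
A minimal sketch of duration parsing that rejects unknown suffixes; this is not Synapse's actual parser, and the unit table is illustrative only.

```python
UNITS_MS = {
    "ms": 1,
    "s": 1_000,
    "m": 60_000,
    "h": 3_600_000,
    "d": 86_400_000,
    "w": 604_800_000,
    "y": 31_536_000_000,
}


def parse_duration_ms(value: str) -> int:
    """Parse a duration string like '5m' into milliseconds, or raise."""
    for suffix, multiplier in UNITS_MS.items():
        if value.endswith(suffix):
            return int(value[: -len(suffix)]) * multiplier
    if value.isdigit():
        return int(value)  # a bare number is taken as milliseconds
    raise ValueError(f"Unknown duration suffix in {value!r}")


parse_duration_ms("5m")    # 300000
# parse_duration_ms("5q")  # raises ValueError instead of silently returning 5
```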
2025-01-29 18:14:02 -06:00
Devon Hudson
95a85b1129 Merge branch 'master' into develop 2025-01-28 09:23:26 -07:00
Devon Hudson
3d8535b1de 1.123.0 2025-01-28 08:37:58 -07:00
Will Hunt
628351b98d Never autojoin deactivated & suspended users. (#18073)
This PR changes the logic so that deactivated users are always ignored.
Suspended users were already effectively ignored as Synapse forbids a
join while suspended.

---------

Co-authored-by: Devon Hudson <devon.dmytro@gmail.com>
2025-01-28 00:37:24 +00:00
dependabot[bot]
8f27b3af07 Bump python-multipart from 0.0.18 to 0.0.20 (#18096)
Bumps [python-multipart](https://github.com/Kludex/python-multipart)
from 0.0.18 to 0.0.20.
Release notes (condensed):
- 0.0.20: Handle messages containing only the end boundary, fixing Kludex/python-multipart#38 (Kludex/python-multipart#142). Adds Python 3.13 to the CI matrix (Kludex/python-multipart#185).
- 0.0.19: Don't warn when CRLF is found after the last boundary in `MultipartParser` (Kludex/python-multipart#193).

Full changelog: https://github.com/Kludex/python-multipart/compare/0.0.18...0.0.20
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-27 21:28:00 +00:00
dependabot[bot]
579f4ac1cd Bump serde_json from 1.0.135 to 1.0.137 (#18099)
Bumps [serde_json](https://github.com/serde-rs/json) from 1.0.135 to
1.0.137.
Release notes (condensed):
- 1.0.137: Turn on "float_roundtrip" and "unbounded_depth" features for serde_json in play.rust-lang.org (serde-rs/json#1231).
- 1.0.136: Optimize serde_json::value::Serializer::serialize_map by using Map::with_capacity (serde-rs/json#1230, thanks @goffrie).

Full diff: https://github.com/serde-rs/json/compare/v1.0.135...v1.0.137
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-27 21:24:07 +00:00
dependabot[bot]
c53999dab8 Bump types-bleach from 6.1.0.20240331 to 6.2.0.20241123 (#18082)
Bumps [types-bleach](https://github.com/python/typeshed) from
6.1.0.20240331 to 6.2.0.20241123.
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-27 21:04:41 +00:00
Andrew Morgan
b41a9ebb38 OIDC: increase length of generated nonce parameter from 30->32 chars (#18109) 2025-01-27 18:39:51 +00:00
Eric Eastwood
6ec5e13ec9 Fix join being denied after being invited over federation (#18075)
This also happens for rejecting an invite. Basically, it affects any out-of-band membership transition where we first get the membership as an `outlier` and then rely on federation filling us in to de-outlier it.

This PR mainly addresses automated test flakiness, bots/scripts, and options within Synapse like [`auto_accept_invites`](https://element-hq.github.io/synapse/v1.122/usage/configuration/config_documentation.html#auto_accept_invites) that are able to react quickly (before federation is able to push us events), but also helps in generic scenarios where federation is lagging.

I initially thought this might be a Synapse consistency issue (see issues labeled with [`Z-Read-After-Write`](https://github.com/matrix-org/synapse/labels/Z-Read-After-Write)) but it seems to be an event auth logic problem. Workers probably do increase the number of possible race condition scenarios that make this visible though (replication and cache invalidation lag).

Fix https://github.com/element-hq/synapse/issues/15012
(probably fixes https://github.com/matrix-org/synapse/issues/15012 (https://github.com/element-hq/synapse/issues/15012))
Related to https://github.com/matrix-org/matrix-spec/issues/2062

Problems:

 1. We don't consider [out-of-band membership](https://github.com/element-hq/synapse/blob/develop/docs/development/room-dag-concepts.md#out-of-band-membership-events) (outliers) in our `event_auth` logic even though we expose them in `/sync`.
 1. (This PR doesn't address this point.) Perhaps we should also consider authing events against the persistence queue, since events already in the queue could allow subsequent events to be allowed (events come through many channels: federation transaction, remote invite, remote join, local send). But this doesn't save us in the case where the event is even more delayed over federation.


### What happened before?

I wrote some Complement test that stresses this exact scenario and reproduces the problem: https://github.com/matrix-org/complement/pull/757

```
COMPLEMENT_ALWAYS_PRINT_SERVER_LOGS=1 COMPLEMENT_DIR=../complement ./scripts-dev/complement.sh -run TestSynapseConsistency
```


We have `hs1` and `hs2` running in monolith mode (no workers):

 1. `@charlie1:hs2` is invited and joins the room:
     1. `hs1` invites `@charlie1:hs2` to a room which we receive on `hs2` as `PUT /_matrix/federation/v1/invite/{roomId}/{eventId}` (`on_invite_request(...)`) and the invite membership is persisted as an outlier. The `room_memberships` and `local_current_membership` database tables are also updated which means they are visible down `/sync` at this point.
     1. `@charlie1:hs2` decides to join because it saw the invite down `/sync`. Because `hs2` is not yet in the room, this happens as a remote join `make_join`/`send_join` which comes back with all of the auth events needed to auth successfully and now `@charlie1:hs2` is successfully joined to the room.
 1. `@charlie2:hs2` is invited and tries to join the room:
     1. `hs1` invites `@charlie2:hs2` to the room which we receive on `hs2` as `PUT /_matrix/federation/v1/invite/{roomId}/{eventId}` (`on_invite_request(...)`) and the invite membership is persisted as an outlier. The `room_memberships` and `local_current_membership` database tables are also updated which means they are visible down `/sync` at this point.
     1. Because `hs2` is already participating in the room, we also see the invite come over federation in a transaction and we start processing it (not done yet, see below)
     1. `@charlie2:hs2` decides to join because it saw the invite down `/sync`. Because `hs2` is already in the room, this happens as a local join, but we deny the event because our `event_auth` logic thinks that we have no membership in the room (we expected to be able to join because we saw the invite down `/sync`)
     1. We finally finish processing the `@charlie2:hs2` invite event from federation and de-outlier it.
         - If this had finished before we tried to join, we would have been fine, but this is the race condition that makes this situation visible.


Logs for `hs2`:

```
🗳️ on_invite_request: handling event <FrozenEventV3 event_id=$PRPCvdXdcqyjdUKP_NxGF2CcukmwOaoK0ZR1WiVOZVk, type=m.room.member, state_key=@user-2-charlie1:hs2, membership=invite, outlier=False>
🔦 _store_room_members_txn update room_memberships: <FrozenEventV3 event_id=$PRPCvdXdcqyjdUKP_NxGF2CcukmwOaoK0ZR1WiVOZVk, type=m.room.member, state_key=@user-2-charlie1:hs2, membership=invite, outlier=True>
🔦 _store_room_members_txn update local_current_membership: <FrozenEventV3 event_id=$PRPCvdXdcqyjdUKP_NxGF2CcukmwOaoK0ZR1WiVOZVk, type=m.room.member, state_key=@user-2-charlie1:hs2, membership=invite, outlier=True>
📨 Notifying about new event <FrozenEventV3 event_id=$PRPCvdXdcqyjdUKP_NxGF2CcukmwOaoK0ZR1WiVOZVk, type=m.room.member, state_key=@user-2-charlie1:hs2, membership=invite, outlier=True>
 on_invite_request: handled event <FrozenEventV3 event_id=$PRPCvdXdcqyjdUKP_NxGF2CcukmwOaoK0ZR1WiVOZVk, type=m.room.member, state_key=@user-2-charlie1:hs2, membership=invite, outlier=True>
🧲 do_invite_join for @user-2-charlie1:hs2 in !sfZVBdLUezpPWetrol:hs1
🔦 _store_room_members_txn update room_memberships: <FrozenEventV3 event_id=$bwv8LxFnqfpsw_rhR7OrTjtz09gaJ23MqstKOcs7ygA, type=m.room.member, state_key=@user-1-alice:hs1, membership=join, outlier=True>
🔦 _store_room_members_txn update room_memberships: <FrozenEventV3 event_id=$oju1ts3G3pz5O62IesrxX5is4LxAwU3WPr4xvid5ijI, type=m.room.member, state_key=@user-2-charlie1:hs2, membership=join, outlier=False>
📨 Notifying about new event <FrozenEventV3 event_id=$oju1ts3G3pz5O62IesrxX5is4LxAwU3WPr4xvid5ijI, type=m.room.member, state_key=@user-2-charlie1:hs2, membership=join, outlier=False>

...

🗳️ on_invite_request: handling event <FrozenEventV3 event_id=$O_54j7O--6xMsegY5EVZ9SA-mI4_iHJOIoRwYyeWIPY, type=m.room.member, state_key=@user-3-charlie2:hs2, membership=invite, outlier=False>
🔦 _store_room_members_txn update room_memberships: <FrozenEventV3 event_id=$O_54j7O--6xMsegY5EVZ9SA-mI4_iHJOIoRwYyeWIPY, type=m.room.member, state_key=@user-3-charlie2:hs2, membership=invite, outlier=True>
🔦 _store_room_members_txn update local_current_membership: <FrozenEventV3 event_id=$O_54j7O--6xMsegY5EVZ9SA-mI4_iHJOIoRwYyeWIPY, type=m.room.member, state_key=@user-3-charlie2:hs2, membership=invite, outlier=True>
📨 Notifying about new event <FrozenEventV3 event_id=$O_54j7O--6xMsegY5EVZ9SA-mI4_iHJOIoRwYyeWIPY, type=m.room.member, state_key=@user-3-charlie2:hs2, membership=invite, outlier=True>
 on_invite_request: handled event <FrozenEventV3 event_id=$O_54j7O--6xMsegY5EVZ9SA-mI4_iHJOIoRwYyeWIPY, type=m.room.member, state_key=@user-3-charlie2:hs2, membership=invite, outlier=True>
📬 handling received PDU in room !sfZVBdLUezpPWetrol:hs1: <FrozenEventV3 event_id=$O_54j7O--6xMsegY5EVZ9SA-mI4_iHJOIoRwYyeWIPY, type=m.room.member, state_key=@user-3-charlie2:hs2, membership=invite, outlier=False>
📮 handle_new_client_event: handling <FrozenEventV3 event_id=$WNVDTQrxy5tCdPQHMyHyIn7tE4NWqKsZ8Bn8R4WbBSA, type=m.room.member, state_key=@user-3-charlie2:hs2, membership=join, outlier=False>
 Denying new event <FrozenEventV3 event_id=$WNVDTQrxy5tCdPQHMyHyIn7tE4NWqKsZ8Bn8R4WbBSA, type=m.room.member, state_key=@user-3-charlie2:hs2, membership=join, outlier=False> because 403: You are not invited to this room.
synapse.http.server - 130 - INFO - POST-16 - <SynapseRequest at 0x7f460c91fbf0 method='POST' uri='/_matrix/client/v3/join/%21sfZVBdLUezpPWetrol:hs1?server_name=hs1' clientproto='HTTP/1.0' site='8080'> SynapseError: 403 - You are not invited to this room.
📨 Notifying about new event <FrozenEventV3 event_id=$O_54j7O--6xMsegY5EVZ9SA-mI4_iHJOIoRwYyeWIPY, type=m.room.member, state_key=@user-3-charlie2:hs2, membership=invite, outlier=False>
 handled received PDU in room !sfZVBdLUezpPWetrol:hs1: <FrozenEventV3 event_id=$O_54j7O--6xMsegY5EVZ9SA-mI4_iHJOIoRwYyeWIPY, type=m.room.member, state_key=@user-3-charlie2:hs2, membership=invite, outlier=False>
```
2025-01-27 11:21:10 -06:00
dependabot[bot]
148e93576e Bump log from 0.4.22 to 0.4.25 (#18098)
Bumps [log](https://github.com/rust-lang/log) from 0.4.22 to 0.4.25.
Release notes (condensed):
- 0.4.25: Revert loosening of kv cargo features (rust-lang/log#662).
- 0.4.24 (yanked): Fix up kv feature activation (rust-lang/log#659).
- 0.4.23 (yanked): Fix some typos (rust-lang/log#637); add logforth, spdlog-rs, clang_log and Ftail to the implementations list (#638, #639, #646, #652); add the correct lifetime to kv::Value::to_borrowed_str (#643); fix up key lifetimes and add a method to try to get a borrowed key (#653); relax the feature flag for value's std_support (#657).

Full changelog: https://github.com/rust-lang/log/compare/0.4.22...0.4.25
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-27 15:23:28 +00:00
dependabot[bot]
56ed412839 Bump dawidd6/action-download-artifact from 7 to 8 (#18108)
Bumps
[dawidd6/action-download-artifact](https://github.com/dawidd6/action-download-artifact)
from 7 to 8.
Release notes (condensed):
- v8: new `use_unzip` boolean input (defaulting to false); if set to true, the action uses the system-provided `unzip` utility to unpack downloaded artifacts (the action first downloads the .zip artifact file, then unpacks it and removes the .zip) (dawidd6/action-download-artifact#325).

Full changelog: https://github.com/dawidd6/action-download-artifact/compare/v7...v8
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-27 15:20:41 +00:00
Sven Mäder
9c5d08fff8 Ratelimit presence updates (#18000) 2025-01-24 19:58:01 +00:00
Max Kratz
90a6bd01c2 Contrib: Docker: updates PostgreSQL version in docker-compose.yml (#18089)
Co-authored-by: Andrew Morgan <1342360+anoadragon453@users.noreply.github.com>
2025-01-21 18:54:31 +00:00
Till Faelligen
aa07a01452 One more tiny change 2025-01-21 15:01:16 +01:00
Till Faelligen
8364c01a2b Update changelog 2025-01-21 14:58:20 +01:00
Till Faelligen
e27808f306 1.123.0rc1 2025-01-21 14:46:40 +01:00
Quentin Gliech
048c1ac7f6 Support the new /auth_metadata endpoint defined in MSC2965. (#18093)
See the updated MSC2965

---------

Co-authored-by: Andrew Morgan <1342360+anoadragon453@users.noreply.github.com>
2025-01-21 13:48:49 +01:00
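For orientation, a minimal client-side sketch of querying the new endpoint could look like the following. The unstable `org.matrix.msc2965` path prefix and the homeserver URL are illustrative assumptions, not details confirmed by this changelog:

```python
# Hedged sketch: fetching OAuth 2.0 authorization server metadata from a
# homeserver via the MSC2965 endpoint. The unstable path prefix below is an
# assumption based on the MSC; the stable location may differ.
import requests

HOMESERVER = "https://matrix.example.org"  # hypothetical homeserver

resp = requests.get(
    f"{HOMESERVER}/_matrix/client/unstable/org.matrix.msc2965/auth_metadata",
    timeout=10,
)
resp.raise_for_status()
metadata = resp.json()

# The response is expected to mirror OAuth 2.0 authorization server metadata,
# e.g. the issuer and the authorization/token endpoints.
print(metadata.get("issuer"))
print(metadata.get("authorization_endpoint"))
print(metadata.get("token_endpoint"))
```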
Patrick Cloke
ca290d325c Implement MSC4133 to support custom profile fields. (#17488)
Implementation of
[MSC4133](https://github.com/matrix-org/matrix-spec-proposals/pull/4133)
to support custom profile fields. It is behind an experimental flag and
includes tests.


### Pull Request Checklist

<!-- Please read
https://element-hq.github.io/synapse/latest/development/contributing_guide.html
before submitting your pull request -->

* [x] Pull request is based on the develop branch
* [x] Pull request includes a [changelog
file](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#changelog).
The entry should:
- Be a short description of your change which makes sense to users.
"Fixed a bug that prevented receiving messages from other servers."
instead of "Moved X method from `EventStore` to `EventWorkerStore`.".
  - Use markdown where necessary, mostly for `code blocks`.
  - End with either a period (.) or an exclamation mark (!).
  - Start with a capital letter.
- Feel free to credit yourself, by adding a sentence "Contributed by
@github_username." or "Contributed by [Your Name]." to the end of the
entry.
* [x] [Code
style](https://element-hq.github.io/synapse/latest/code_style.html) is
correct
(run the
[linters](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#run-the-linters))

---------

Co-authored-by: Andrew Morgan <1342360+anoadragon453@users.noreply.github.com>
2025-01-21 11:11:04 +00:00
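As a rough illustration only, setting and reading a custom profile field under MSC4133 might look like the sketch below. The exact paths, any unstable prefix required while the feature sits behind the experimental flag, and the example field name are all assumptions here:

```python
# Hedged sketch of MSC4133-style custom profile fields. The path shape, body
# format and example field name are assumptions for illustration only; while
# the feature is experimental the server may require an unstable path prefix.
import requests

HOMESERVER = "https://matrix.example.org"   # hypothetical
ACCESS_TOKEN = "syt_example_token"          # hypothetical
USER_ID = "@alice:example.org"              # hypothetical
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# Set a custom (non-standard) profile field alongside displayname/avatar_url.
requests.put(
    f"{HOMESERVER}/_matrix/client/v3/profile/{USER_ID}/org.example.pronouns",
    headers=headers,
    json={"org.example.pronouns": "they/them"},
    timeout=10,
).raise_for_status()

# Read the whole profile back; custom fields come back next to the standard ones.
profile = requests.get(
    f"{HOMESERVER}/_matrix/client/v3/profile/{USER_ID}",
    headers=headers,
    timeout=10,
).json()
print(profile)
```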
Eric Eastwood
0a31cf18cd Document possibility of configuring tls for a worker instance in instance_map (#18064) 2025-01-20 12:40:05 -06:00
Erik Johnston
48db0c2d6c Drop indices concurrently on background updates (#18091)
Otherwise these can race with other long running queries and lock out
all other queries.

This caused problems in v1.122.0 as we added an index to the `events` table
in #17948, but that got interrupted and so next time we ran the
background update we needed to delete the half-finished index. However,
that got blocked behind some long running queries and then locked other
queries out (stopping workers from even starting).
2025-01-20 17:14:06 +00:00
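For background, PostgreSQL cannot run `DROP INDEX CONCURRENTLY` inside a transaction block, so dropping an index without blocking other queries needs an autocommit connection. A minimal standalone sketch of the idea (using psycopg2 directly with a hypothetical index name, not Synapse's database layer) is:

```python
# Minimal sketch of dropping a half-built index without taking locks that
# block other queries. This uses psycopg2 directly for illustration; it is
# not the code path Synapse itself uses.
import psycopg2

conn = psycopg2.connect("dbname=synapse user=synapse")  # hypothetical DSN
conn.autocommit = True  # DROP INDEX CONCURRENTLY cannot run in a transaction
try:
    with conn.cursor() as cur:
        # CONCURRENTLY avoids an ACCESS EXCLUSIVE lock on the table, so the
        # drop does not pile up behind long-running queries and lock out
        # everything else.
        cur.execute("DROP INDEX CONCURRENTLY IF EXISTS some_half_built_index")
finally:
    conn.close()
```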
dependabot[bot]
24c4d82aeb Bump pyo3 from 0.23.3 to 0.23.4 (#18079) 2025-01-16 14:18:06 +00:00
dependabot[bot]
3fda8d3b67 Bump serde_json from 1.0.134 to 1.0.135 (#18081) 2025-01-16 14:15:01 +00:00
dependabot[bot]
5f15a549d7 Bump ulid from 1.1.3 to 1.1.4 (#18080) 2025-01-16 14:14:46 +00:00
dependabot[bot]
6cefbc6852 Bump mypy from 1.12.1 to 1.13.0 (#18083) 2025-01-16 10:17:58 +00:00
dependabot[bot]
fd3ec6435e Bump pillow from 11.0.0 to 11.1.0 (#18084) 2025-01-16 10:17:46 +00:00
Andrew Morgan
39bd6e2c16 Merge branch 'master' into develop 2025-01-14 15:41:08 +00:00
Andrew Morgan
5c736cd2af move additional release missed in last commit 2025-01-14 14:23:35 +00:00
Andrew Morgan
e70e8d132c Move 2023/4 changelog entries under docs/changelogs 2025-01-14 14:20:08 +00:00
Andrew Morgan
48334fbc40 move postgres changelog to the top 2025-01-14 14:17:55 +00:00
Andrew Morgan
b4fd694ce3 1.122.0 2025-01-14 14:14:23 +00:00
Eric Eastwood
e2d757f62d Increase rc_invites.per_issuer for Complement (#18072)
It's possible to run into `SynapseError: 429 - Too Many Requests (rc_invites.per_issuer)`

`rc_invites.per_issuer` was originally introduced in
https://github.com/matrix-org/synapse/pull/13125
2025-01-13 15:01:00 -06:00
Eric Eastwood
aab3672037 Bust _membership_stream_cache cache when current state changes (#17732)
This is particularly a problem in a state reset scenario where the membership
might change without a corresponding event.

This PR is targeting a scenario where a state reset happens which causes
room membership to change. Previously, the cache would just hold onto
stale data; now we properly bust the cache in this scenario.

We have a few tests for these scenarios which you can see are now fixed
because we can remove the `FIXME` where we were previously manually
busting the cache in the test itself.

This is a general Synapse thing so by its nature it helps out Sliding
Sync.

Fix https://github.com/element-hq/synapse/issues/17368

Prerequisite for https://github.com/element-hq/synapse/issues/17929

---

Match when we are busting `_curr_state_delta_stream_cache`
2025-01-08 10:11:09 -06:00
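A toy sketch of the cache-busting idea (not Synapse's actual `_membership_stream_cache` implementation) follows: keep a per-user "last changed at this stream position" record and update it on every current-state delta, including state resets, so later "has anything changed since X" checks never answer from stale data:

```python
# Illustrative sketch only. A per-entity "last changed at stream position"
# map that must be updated whenever current state changes, even when there
# is no new membership event (e.g. after a state reset).
from typing import Dict, Optional


class MembershipStreamCache:
    def __init__(self) -> None:
        self._last_changed: Dict[str, int] = {}

    def entity_has_changed(self, user_id: str, stream_pos: int) -> None:
        # Record that this user's membership may differ after stream_pos.
        prev = self._last_changed.get(user_id, 0)
        self._last_changed[user_id] = max(prev, stream_pos)

    def has_entity_changed(self, user_id: str, since_pos: int) -> bool:
        # If we never recorded a change, be conservative and answer "maybe".
        last: Optional[int] = self._last_changed.get(user_id)
        return last is None or last > since_pos


cache = MembershipStreamCache()


def on_current_state_delta(user_id: str, stream_pos: int) -> None:
    # Called for every current-state change, including state resets, so the
    # cache never holds onto stale membership information.
    cache.entity_has_changed(user_id, stream_pos)
```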
dependabot[bot]
d0677dca39 Bump jinja2 from 3.1.4 to 3.1.5 (#18067) 2025-01-08 16:08:43 +00:00
Shay
e34fd1228d Add the ability to filter by state event type on admin room state endpoint (#18035)
Adds a query param `type` to `/_synapse/admin/v1/rooms/{room_id}/state`
that filters the state event query by state event type.

---------

Co-authored-by: Andrew Morgan <1342360+anoadragon453@users.noreply.github.com>
2025-01-08 15:38:26 +00:00
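An example invocation of the endpoint with the new `type` filter might look like the sketch below; the homeserver URL, admin token and room ID are placeholders:

```python
# Example (with placeholder values) of using the new `type` filter on the
# admin room state endpoint to fetch only m.room.member state events.
import requests
from urllib.parse import quote

HOMESERVER = "https://matrix.example.org"   # hypothetical
ADMIN_TOKEN = "syt_admin_example_token"     # hypothetical
ROOM_ID = "!abcdef:example.org"             # hypothetical

resp = requests.get(
    f"{HOMESERVER}/_synapse/admin/v1/rooms/{quote(ROOM_ID)}/state",
    headers={"Authorization": f"Bearer {ADMIN_TOKEN}"},
    params={"type": "m.room.member"},
    timeout=10,
)
resp.raise_for_status()
for event in resp.json().get("state", []):
    print(event["type"], event.get("state_key"))
```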
Travis Ralston
beea39f000 Drop unstable MSC4151 implementation (#18052)
It's been rotated out of known clients, and should be safe for removal
now.

Fixes https://github.com/element-hq/synapse/issues/17373

### Pull Request Checklist

<!-- Please read
https://element-hq.github.io/synapse/latest/development/contributing_guide.html
before submitting your pull request -->

* [x] Pull request is based on the develop branch
* [x] Pull request includes a [changelog
file](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#changelog).
The entry should:
- Be a short description of your change which makes sense to users.
"Fixed a bug that prevented receiving messages from other servers."
instead of "Moved X method from `EventStore` to `EventWorkerStore`.".
  - Use markdown where necessary, mostly for `code blocks`.
  - End with either a period (.) or an exclamation mark (!).
  - Start with a capital letter.
- Feel free to credit yourself, by adding a sentence "Contributed by
@github_username." or "Contributed by [Your Name]." to the end of the
entry.
* [x] [Code
style](https://element-hq.github.io/synapse/latest/code_style.html) is
correct
(run the
[linters](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#run-the-linters))
2025-01-07 15:45:57 -07:00
Olivier 'reivilibre
fa320c4fcb Fix typographical error in changelog 2025-01-07 17:43:41 +00:00
Olivier 'reivilibre
22c2add9c0 Merge branch 'release-v1.122' into develop 2025-01-07 17:42:44 +00:00
dependabot[bot]
60f596b4d8 Bump pyopenssl from 24.2.1 to 24.3.0 (#18062)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-07 16:50:50 +00:00
Olivier 'reivilibre
1143e14479 Tweak changelog 2025-01-07 15:20:24 +00:00
Olivier 'reivilibre
c199ede287 1.122.0rc1 2025-01-07 14:13:02 +00:00
dependabot[bot]
9fb7333a7c Bump sentry-sdk from 2.17.0 to 2.19.2 (#18061)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-06 18:27:05 +00:00
dependabot[bot]
a0a4a36891 Bump pyicu from 2.13.1 to 2.14 (#18060) 2025-01-06 18:24:49 +00:00
dependabot[bot]
49fcda31f6 Bump serde from 1.0.216 to 1.0.217 (#18059) 2025-01-06 18:23:12 +00:00
Mathieu Velten
b3ba501c52 Properly purge state groups tables when purging a room (#18024)
Currently purging a complex room can lead to a lot of orphaned rows left
behind in the state groups tables.
It seems this is because we are sometimes losing track of state groups.

This change uses the `room_id` indexed column of `state_groups` table to
decide what to delete instead of doing an indirection through
`event_to_state_groups`.

Related to https://github.com/element-hq/synapse/issues/3364.

### Pull Request Checklist

* [x] Pull request is based on the develop branch
* [x] Pull request includes a [changelog
file](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#changelog).
* [x] [Code
style](https://element-hq.github.io/synapse/latest/code_style.html) is
correct
(run the
[linters](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#run-the-linters))

---------

Co-authored-by: Erik Johnston <erikj@jki.re>
2025-01-06 15:32:18 +00:00
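A simplified sketch of the new direct approach (not the actual Synapse code; the auxiliary table and column names are assumptions for illustration) could read:

```python
# Illustrative sketch of the purge change: pick the state groups to delete
# straight from state_groups.room_id instead of going through
# event_to_state_groups (which can miss groups and leave orphaned rows).
def purge_room_state_groups(txn, room_id: str) -> None:
    # state_groups has an indexed room_id column, so we can find every group
    # for the room without the event indirection.
    txn.execute("SELECT id FROM state_groups WHERE room_id = ?", (room_id,))
    group_ids = [row[0] for row in txn.fetchall()]
    if not group_ids:
        return

    placeholders = ",".join("?" for _ in group_ids)
    # Delete rows referencing the groups first, then the groups themselves.
    txn.execute(
        f"DELETE FROM state_groups_state WHERE state_group IN ({placeholders})",
        group_ids,
    )
    txn.execute(
        f"DELETE FROM state_group_edges WHERE state_group IN ({placeholders})",
        group_ids,
    )
    txn.execute(
        f"DELETE FROM state_groups WHERE id IN ({placeholders})", group_ids
    )
```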
Patrick Cloke
6306de8e16 Refactor get_profile: do not return missing fields. (#18063)
Refactor `get_profile` to avoid returning "empty" (`None` / `null`)
fields. Currently this is not very important, but will be more useful
once #17488 lands. It does update the servlet to use this now, which
results in a minor change in behavior: additional fields served over
federation will now be properly sent back to clients.

It also adds constants for `avatar_url` / `displayname`, although I did
not attempt to use them everywhere possible.
2025-01-03 17:23:29 +00:00
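The behavioural point of the refactor can be illustrated with a tiny, generic sketch (not the actual Synapse code): unset fields are omitted rather than returned as explicit nulls:

```python
# Tiny illustration of the behavioural point: omit fields that are unset
# rather than returning them as null.
from typing import Any, Dict, Optional

# Hypothetical constants mirroring the idea of named profile fields.
AVATAR_URL = "avatar_url"
DISPLAYNAME = "displayname"


def build_profile_response(
    displayname: Optional[str], avatar_url: Optional[str]
) -> Dict[str, Any]:
    raw = {DISPLAYNAME: displayname, AVATAR_URL: avatar_url}
    # Drop missing fields entirely instead of sending explicit nulls.
    return {key: value for key, value in raw.items() if value is not None}


assert build_profile_response("Alice", None) == {"displayname": "Alice"}
```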
Shay
b5267678d2 Add a test to verify remote user messages can be redacted via admin api redaction endpoint if requester is admin in room (#18043) 2025-01-03 12:52:42 +00:00
167 changed files with 9847 additions and 5073 deletions

View File

@@ -30,7 +30,7 @@ jobs:
run: docker buildx inspect
- name: Install Cosign
uses: sigstore/cosign-installer@v3.7.0
uses: sigstore/cosign-installer@v3.8.0
- name: Checkout repository
uses: actions/checkout@v4

View File

@@ -14,7 +14,7 @@ jobs:
# There's a 'download artifact' action, but it hasn't been updated for the workflow_run action
# (https://github.com/actions/download-artifact/issues/60) so instead we get this mess:
- name: 📥 Download artifact
uses: dawidd6/action-download-artifact@80620a5d27ce0ae443b965134db88467fc607b43 # v7
uses: dawidd6/action-download-artifact@20319c5641d495c8a52e688b7dc5fada6c3a9fbc # v8
with:
workflow: docs-pr.yaml
run_id: ${{ github.event.workflow_run.id }}

View File

@@ -21,7 +21,7 @@ jobs:
# We use nightly so that `fmt` correctly groups together imports, and
# clippy correctly fixes up the benchmarks.
toolchain: nightly-2022-12-01
components: rustfmt
components: clippy, rustfmt
- uses: Swatinem/rust-cache@v2
- name: Setup Poetry

3860
CHANGES.md

File diff suppressed because it is too large.

173
Cargo.lock generated
View File

@@ -35,6 +35,12 @@ version = "0.21.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9d297deb1925b89f2ccc13d7635fa0714f12c87adce1c75356b39ca9b7178567"
[[package]]
name = "bitflags"
version = "2.8.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8f68f53c83ab957f72c32642f3868eec03eb974d1fb82e453128456482613d36"
[[package]]
name = "blake2"
version = "0.10.6"
@@ -61,9 +67,9 @@ checksum = "79296716171880943b8470b5f8d03aa55eb2e645a4874bdbb28adb49162e012c"
[[package]]
name = "bytes"
version = "1.9.0"
version = "1.10.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "325918d6fe32f23b19878fe4b34794ae41fc19ddbe53b10571a4874d44ffd39b"
checksum = "f61dac84819c6588b558454b194026eb1f09c293b9036ae9b159e74e73ab6cf9"
[[package]]
name = "cfg-if"
@@ -119,15 +125,14 @@ dependencies = [
[[package]]
name = "getrandom"
version = "0.2.15"
version = "0.3.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c4567c8db10ae91089c99af84c68c38da3ec2f087c3f82960bcdbf3656b6f4d7"
checksum = "43a49c392881ce6d5c3b8cb70f98717b7c07aabbdff06687b9030dbfbe2725f8"
dependencies = [
"cfg-if",
"js-sys",
"libc",
"wasi",
"wasm-bindgen",
"windows-targets",
]
[[package]]
@@ -218,9 +223,9 @@ checksum = "ae743338b92ff9146ce83992f766a31066a91a8c84a45e0e9f21e7cf6de6d346"
[[package]]
name = "log"
version = "0.4.22"
version = "0.4.25"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a7a70ba024b9dc04c27ea2f0c0548feb474ec5c54bba33a7f72f873a39d07b24"
checksum = "04cbf5b083de1c7e0222a7a51dbfdba1cbe1c6ab0b15e29fff3f6c077fd9cd9f"
[[package]]
name = "memchr"
@@ -272,9 +277,9 @@ dependencies = [
[[package]]
name = "pyo3"
version = "0.23.3"
version = "0.23.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e484fd2c8b4cb67ab05a318f1fd6fa8f199fcc30819f08f07d200809dba26c15"
checksum = "57fe09249128b3173d092de9523eaa75136bf7ba85e0d69eca241c7939c933cc"
dependencies = [
"anyhow",
"cfg-if",
@@ -291,9 +296,9 @@ dependencies = [
[[package]]
name = "pyo3-build-config"
version = "0.23.3"
version = "0.23.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "dc0e0469a84f208e20044b98965e1561028180219e35352a2afaf2b942beff3b"
checksum = "1cd3927b5a78757a0d71aa9dff669f903b1eb64b54142a9bd9f757f8fde65fd7"
dependencies = [
"once_cell",
"target-lexicon",
@@ -301,9 +306,9 @@ dependencies = [
[[package]]
name = "pyo3-ffi"
version = "0.23.3"
version = "0.23.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "eb1547a7f9966f6f1a0f0227564a9945fe36b90da5a93b3933fc3dc03fae372d"
checksum = "dab6bb2102bd8f991e7749f130a70d05dd557613e39ed2deeee8e9ca0c4d548d"
dependencies = [
"libc",
"pyo3-build-config",
@@ -322,9 +327,9 @@ dependencies = [
[[package]]
name = "pyo3-macros"
version = "0.23.3"
version = "0.23.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fdb6da8ec6fa5cedd1626c886fc8749bdcbb09424a86461eb8cdf096b7c33257"
checksum = "91871864b353fd5ffcb3f91f2f703a22a9797c91b9ab497b1acac7b07ae509c7"
dependencies = [
"proc-macro2",
"pyo3-macros-backend",
@@ -334,9 +339,9 @@ dependencies = [
[[package]]
name = "pyo3-macros-backend"
version = "0.23.3"
version = "0.23.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "38a385202ff5a92791168b1136afae5059d3ac118457bb7bc304c197c2d33e7d"
checksum = "43abc3b80bc20f3facd86cd3c60beed58c3e2aa26213f3cda368de39c60a27e4"
dependencies = [
"heck",
"proc-macro2",
@@ -366,20 +371,20 @@ dependencies = [
[[package]]
name = "rand"
version = "0.8.5"
version = "0.9.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "34af8d1a0e25924bc5b7c43c079c942339d8f0a8b57c39049bef581b46327404"
checksum = "3779b94aeb87e8bd4e834cee3650289ee9e0d5677f976ecdb6d219e5f4f6cd94"
dependencies = [
"libc",
"rand_chacha",
"rand_core",
"zerocopy",
]
[[package]]
name = "rand_chacha"
version = "0.3.1"
version = "0.9.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e6c10a63a0fa32252be49d21e7709d4d4baf8d231c2dbce1eaa8141b9b127d88"
checksum = "d3022b5f1df60f26e1ffddd6c66e8aa15de382ae63b3a0c1bfc0e4d3e3f325cb"
dependencies = [
"ppv-lite86",
"rand_core",
@@ -387,11 +392,12 @@ dependencies = [
[[package]]
name = "rand_core"
version = "0.6.4"
version = "0.9.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ec0be4795e2f6a28069bec0b5ff3e2ac9bafc99e6a9a7dc3547996c5c816922c"
checksum = "b08f3c9802962f7e1b25113931d94f43ed9725bebc59db9d0c3e9a23b67e15ff"
dependencies = [
"getrandom",
"zerocopy",
]
[[package]]
@@ -431,18 +437,18 @@ checksum = "f3cb5ba0dc43242ce17de99c180e96db90b235b8a9fdc9543c96d2209116bd9f"
[[package]]
name = "serde"
version = "1.0.216"
version = "1.0.217"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0b9781016e935a97e8beecf0c933758c97a5520d32930e460142b4cd80c6338e"
checksum = "02fc4265df13d6fa1d00ecff087228cc0a2b5f3c0e87e258d8b94a156e984c70"
dependencies = [
"serde_derive",
]
[[package]]
name = "serde_derive"
version = "1.0.216"
version = "1.0.217"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "46f859dbbf73865c6627ed570e78961cd3ac92407a2d117204c49232485da55e"
checksum = "5a9bf7cf98d04a2b28aead066b7496853d4779c9cc183c440dbac457641e19a0"
dependencies = [
"proc-macro2",
"quote",
@@ -451,9 +457,9 @@ dependencies = [
[[package]]
name = "serde_json"
version = "1.0.134"
version = "1.0.138"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d00f4175c42ee48b15416f6193a959ba3a0d67fc699a0db9ad12df9f83991c7d"
checksum = "d434192e7da787e94a6ea7e9670b26a036d0ca41e0b7efb2676dd32bae872949"
dependencies = [
"itoa",
"memchr",
@@ -538,11 +544,10 @@ checksum = "42ff0bf0c66b8238c6f3b578df37d0b7848e55df8577b3f74f92a69acceeb825"
[[package]]
name = "ulid"
version = "1.1.3"
version = "1.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "04f903f293d11f31c0c29e4148f6dc0d033a7f80cebc0282bea147611667d289"
checksum = "ab82fc73182c29b02e2926a6df32f2241dbadb5cfc111fd595515b3598f46bb3"
dependencies = [
"getrandom",
"rand",
"web-time",
]
@@ -567,9 +572,12 @@ checksum = "49874b5167b65d7193b8aba1567f5c7d93d001cafc34600cee003eda787e483f"
[[package]]
name = "wasi"
version = "0.11.0+wasi-snapshot-preview1"
version = "0.13.3+wasi-0.2.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9c8d87e72b64a3b4db28d11ce29237c246188f4f51057d65a7eab63b7987e423"
checksum = "26816d2e1a4a36a2940b96c5296ce403917633dff8f3440e9b236ed6f6bacad2"
dependencies = [
"wit-bindgen-rt",
]
[[package]]
name = "wasm-bindgen"
@@ -634,3 +642,96 @@ dependencies = [
"js-sys",
"wasm-bindgen",
]
[[package]]
name = "windows-targets"
version = "0.52.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9b724f72796e036ab90c1021d4780d4d3d648aca59e491e6b98e725b84e99973"
dependencies = [
"windows_aarch64_gnullvm",
"windows_aarch64_msvc",
"windows_i686_gnu",
"windows_i686_gnullvm",
"windows_i686_msvc",
"windows_x86_64_gnu",
"windows_x86_64_gnullvm",
"windows_x86_64_msvc",
]
[[package]]
name = "windows_aarch64_gnullvm"
version = "0.52.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "32a4622180e7a0ec044bb555404c800bc9fd9ec262ec147edd5989ccd0c02cd3"
[[package]]
name = "windows_aarch64_msvc"
version = "0.52.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "09ec2a7bb152e2252b53fa7803150007879548bc709c039df7627cabbd05d469"
[[package]]
name = "windows_i686_gnu"
version = "0.52.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8e9b5ad5ab802e97eb8e295ac6720e509ee4c243f69d781394014ebfe8bbfa0b"
[[package]]
name = "windows_i686_gnullvm"
version = "0.52.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0eee52d38c090b3caa76c563b86c3a4bd71ef1a819287c19d586d7334ae8ed66"
[[package]]
name = "windows_i686_msvc"
version = "0.52.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "240948bc05c5e7c6dabba28bf89d89ffce3e303022809e73deaefe4f6ec56c66"
[[package]]
name = "windows_x86_64_gnu"
version = "0.52.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "147a5c80aabfbf0c7d901cb5895d1de30ef2907eb21fbbab29ca94c5b08b1a78"
[[package]]
name = "windows_x86_64_gnullvm"
version = "0.52.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "24d5b23dc417412679681396f2b49f3de8c1473deb516bd34410872eff51ed0d"
[[package]]
name = "windows_x86_64_msvc"
version = "0.52.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "589f6da84c646204747d1270a2a5661ea66ed1cced2631d546fdfb155959f9ec"
[[package]]
name = "wit-bindgen-rt"
version = "0.33.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3268f3d866458b787f390cf61f4bbb563b922d091359f9608842999eaee3943c"
dependencies = [
"bitflags",
]
[[package]]
name = "zerocopy"
version = "0.8.17"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "aa91407dacce3a68c56de03abe2760159582b846c6a4acd2f456618087f12713"
dependencies = [
"zerocopy-derive",
]
[[package]]
name = "zerocopy-derive"
version = "0.8.17"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "06718a168365cad3d5ff0bb133aad346959a2074bd4a85c121255a11304a8626"
dependencies = [
"proc-macro2",
"quote",
"syn",
]

6
LICENSE-COMMERCIAL Normal file
View File

@@ -0,0 +1,6 @@
Licensees holding a valid commercial license with Element may use this
software in accordance with the terms contained in a written agreement
between you and Element.
To purchase a commercial license please contact our sales team at
licensing@element.io

View File

@@ -10,14 +10,15 @@ implementation, written and maintained by `Element <https://element.io>`_.
`Matrix <https://github.com/matrix-org>`__ is the open standard for
secure and interoperable real time communications. You can directly run
and manage the source code in this repository, available under an AGPL
license. There is no support provided from Element unless you have a
subscription.
license (or alternatively under a commercial license from Element).
There is no support provided by Element unless you have a
subscription from Element.
Subscription alternative
========================
Subscription
============
Alternatively, for those that need an enterprise-ready solution, Element
Server Suite (ESS) is `available as a subscription <https://element.io/pricing>`_.
For those that need an enterprise-ready solution, Element
Server Suite (ESS) is `available via subscription <https://element.io/pricing>`_.
ESS builds on Synapse to offer a complete Matrix-based backend including the full
`Admin Console product <https://element.io/enterprise-functionality/admin-console>`_,
giving admins the power to easily manage an organization-wide
@@ -249,6 +250,20 @@ Developers might be particularly interested in:
Alongside all that, join our developer community on Matrix:
`#synapse-dev:matrix.org <https://matrix.to/#/#synapse-dev:matrix.org>`_, featuring real humans!
Copyright and Licensing
=======================
Copyright 2014-2017 OpenMarket Ltd
Copyright 2017 Vector Creations Ltd
Copyright 2017-2025 New Vector Ltd
This software is dual-licensed by New Vector Ltd (Element). It can be used either:
(1) for free under the terms of the GNU Affero General Public License (as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version); OR
(2) under the terms of a paid-for Element Commercial License agreement between you and Element (the terms of which may vary depending on what you and Element have agreed to).
Unless required by applicable law or agreed to in writing, software distributed under the Licenses is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the Licenses for the specific language governing permissions and limitations under the Licenses.
.. |support| image:: https://img.shields.io/badge/matrix-community%20support-success
:alt: (get community support in #synapse:matrix.org)

View File

@@ -1 +0,0 @@
Update Alpine Linux Synapse Package Maintainer within installation.md.

View File

@@ -1 +0,0 @@
Added the `email.tlsname` config option. This allows specifying the domain name used to validate the SMTP server's TLS certificate separately from the `email.smtp_host` to connect to.

View File

@@ -1 +0,0 @@
Module developers will have access to user id of requester when adding `check_username_for_spam` callbacks to `spam_checker_module_callbacks`. Contributed by Wilson@Pangea.chat.

View File

@@ -1 +0,0 @@
Fix a bug when rejecting a withdrawn invite with a third_party_rules module, where the invite would be stuck for the client.

View File

@@ -1,3 +0,0 @@
Add endpoints to Admin API to fetch the number of invites the provided user has sent after a given timestamp,
fetch the number of rooms the provided user has joined after a given timestamp, and get report IDs of event
reports against a provided user (ie where the user was the sender of the reported event).

View File

@@ -1 +0,0 @@
Update `synapse.app.generic_worker` documentation to only recommend `GET` requests for stream writer routes by default, unless the worker is also configured as a stream writer. Contributed by @evoL.

View File

@@ -1 +0,0 @@
Support stable account suspension from [MSC3823](https://github.com/matrix-org/matrix-spec-proposals/pull/3823).

View File

@@ -1 +0,0 @@
Add previously-undocumented `last_seen_ts` to query user admin API.

View File

@@ -1 +0,0 @@
Add `macaroon_secret_key_path` config option.

View File

@@ -1 +0,0 @@
Improve documentation for the `TaskScheduler` class.

View File

@@ -1 +0,0 @@
Fix example in reverse proxy docs to include server port.

View File

@@ -1 +0,0 @@
Add `RoomID` & `EventID` rust types.

View File

@@ -1 +0,0 @@
Fix various type errors across the codebase.

View File

@@ -1 +0,0 @@
Bump mypy from 1.11.2 to 1.12.1.

View File

@@ -1 +0,0 @@
Disable DB statement timeout when doing a purge room since it can be quite long.

View File

@@ -1 +0,0 @@
Remove some remaining uses of `twisted.internet.defer.returnValue`. Contributed by Colin Watson.

View File

@@ -1 +0,0 @@
Fix a bug preventing the admin redaction endpoint from working on messages from remote users.

View File

@@ -1 +0,0 @@
Remove support for PostgreSQL 11 and 12. Contributed by @clokep.

View File

@@ -1 +0,0 @@
Document `tls` option for a worker instance in `instance_map`.

View File

@@ -51,7 +51,7 @@ services:
- traefik.http.routers.https-synapse.tls.certResolver=le-ssl
db:
image: docker.io/postgres:12-alpine
image: docker.io/postgres:15-alpine
# Change that password, of course!
environment:
- POSTGRES_USER=synapse

60
debian/changelog vendored
View File

@@ -1,3 +1,63 @@
matrix-synapse-py3 (1.125.0) stable; urgency=medium
* New Synapse release 1.125.0.
-- Synapse Packaging team <packages@matrix.org> Tue, 25 Feb 2025 08:10:07 -0700
matrix-synapse-py3 (1.125.0~rc1) stable; urgency=medium
* New synapse release 1.125.0rc1.
-- Synapse Packaging team <packages@matrix.org> Tue, 18 Feb 2025 13:32:49 +0000
matrix-synapse-py3 (1.124.0) stable; urgency=medium
* New Synapse release 1.124.0.
-- Synapse Packaging team <packages@matrix.org> Tue, 11 Feb 2025 11:55:22 +0100
matrix-synapse-py3 (1.124.0~rc3) stable; urgency=medium
* New Synapse release 1.124.0rc3.
-- Synapse Packaging team <packages@matrix.org> Fri, 07 Feb 2025 13:42:55 +0000
matrix-synapse-py3 (1.124.0~rc2) stable; urgency=medium
* New Synapse release 1.124.0rc2.
-- Synapse Packaging team <packages@matrix.org> Wed, 05 Feb 2025 16:35:53 +0000
matrix-synapse-py3 (1.124.0~rc1) stable; urgency=medium
* New Synapse release 1.124.0rc1.
-- Synapse Packaging team <packages@matrix.org> Tue, 04 Feb 2025 11:53:05 +0000
matrix-synapse-py3 (1.123.0) stable; urgency=medium
* New Synapse release 1.123.0.
-- Synapse Packaging team <packages@matrix.org> Tue, 28 Jan 2025 08:37:34 -0700
matrix-synapse-py3 (1.123.0~rc1) stable; urgency=medium
* New Synapse release 1.123.0rc1.
-- Synapse Packaging team <packages@matrix.org> Tue, 21 Jan 2025 14:39:57 +0100
matrix-synapse-py3 (1.122.0) stable; urgency=medium
* New Synapse release 1.122.0.
-- Synapse Packaging team <packages@matrix.org> Tue, 14 Jan 2025 14:14:14 +0000
matrix-synapse-py3 (1.122.0~rc1) stable; urgency=medium
* New Synapse release 1.122.0rc1.
-- Synapse Packaging team <packages@matrix.org> Tue, 07 Jan 2025 14:06:19 +0000
matrix-synapse-py3 (1.121.1) stable; urgency=medium
* New Synapse release 1.121.1.

View File

@@ -138,6 +138,10 @@ for port in 8080 8081 8082; do
per_user:
per_second: 1000
burst_count: 1000
rc_presence:
per_user:
per_second: 1000
burst_count: 1000
RC
)
echo "${ratelimiting}" >> "$port.config"

View File

@@ -114,6 +114,9 @@ The following environment variables are supported in `run` mode:
is set via `docker run --user`, defaults to `991`, `991`. Note that this user
must have permission to read the config files, and write to the data directories.
* `TZ`: the [timezone](https://en.wikipedia.org/wiki/List_of_tz_database_time_zones) the container will run with. Defaults to `UTC`.
* `SYNAPSE_HTTP_PROXY`: Passed through to the Synapse process as the `http_proxy` environment variable.
* `SYNAPSE_HTTPS_PROXY`: Passed through to the Synapse process as the `https_proxy` environment variable.
* `SYNAPSE_NO_PROXY`: Passed through to the Synapse process as the `no_proxy` environment variable.
For more complex setups (e.g. for workers) you can also pass your args directly to synapse using `run` mode. For example like this:

View File

@@ -85,6 +85,14 @@ rc_invites:
per_user:
per_second: 1000
burst_count: 1000
per_issuer:
per_second: 1000
burst_count: 1000
rc_presence:
per_user:
per_second: 9999
burst_count: 9999
federation_rr_transactions_per_room_per_second: 9999

View File

@@ -1,5 +1,6 @@
{% if use_forking_launcher %}
[program:synapse_fork]
environment=http_proxy="%(ENV_SYNAPSE_HTTP_PROXY)s",https_proxy="%(ENV_SYNAPSE_HTTPS_PROXY)s",no_proxy="%(ENV_SYNAPSE_NO_PROXY)s"
command=/usr/local/bin/python -m synapse.app.complement_fork_starter
{{ main_config_path }}
synapse.app.homeserver
@@ -20,6 +21,7 @@ exitcodes=0
{% else %}
[program:synapse_main]
environment=http_proxy="%(ENV_SYNAPSE_HTTP_PROXY)s",https_proxy="%(ENV_SYNAPSE_HTTPS_PROXY)s",no_proxy="%(ENV_SYNAPSE_NO_PROXY)s"
command=/usr/local/bin/prefix-log /usr/local/bin/python -m synapse.app.homeserver
--config-path="{{ main_config_path }}"
--config-path=/conf/workers/shared.yaml
@@ -36,6 +38,7 @@ exitcodes=0
{% for worker in workers %}
[program:synapse_{{ worker.name }}]
environment=http_proxy="%(ENV_SYNAPSE_HTTP_PROXY)s",https_proxy="%(ENV_SYNAPSE_HTTPS_PROXY)s",no_proxy="%(ENV_SYNAPSE_NO_PROXY)s"
command=/usr/local/bin/prefix-log /usr/local/bin/python -m {{ worker.app }}
--config-path="{{ main_config_path }}"
--config-path=/conf/workers/shared.yaml

View File

@@ -1099,6 +1099,13 @@ def main(args: List[str], environ: MutableMapping[str, str]) -> None:
else:
log("Could not find %s, will not use" % (jemallocpath,))
# Empty strings are falsy in Python so this default is fine. We just can't have these
# be undefined because supervisord will complain about our
# `%(ENV_SYNAPSE_HTTP_PROXY)s` usage.
environ.setdefault("SYNAPSE_HTTP_PROXY", "")
environ.setdefault("SYNAPSE_HTTPS_PROXY", "")
environ.setdefault("SYNAPSE_NO_PROXY", "")
# Start supervisord, which will start Synapse, all of the configured worker
# processes, redis, nginx etc. according to the config we created above.
log("Starting supervisord")

View File

@@ -385,6 +385,13 @@ The API is:
GET /_synapse/admin/v1/rooms/<room_id>/state
```
**Parameters**
The following query parameter is available:
* `type` - The type of room state event to filter by, eg "m.room.create". If provided, only state events
of this type will be returned (regardless of their `state_key` value).
A response body like the following is returned:
```json

File diff suppressed because it is too large.

File diff suppressed because it is too large.

View File

@@ -310,29 +310,18 @@ sudo dnf install libtiff-devel libjpeg-devel libzip-devel freetype-devel \
sudo dnf group install "Development Tools"
```
##### Red Hat Enterprise Linux / Rocky Linux
##### Red Hat Enterprise Linux / Rocky Linux / Oracle Linux
*Note: The term "RHEL" below refers to both Red Hat Enterprise Linux and Rocky Linux. The distributions are 1:1 binary compatible.*
*Note: The term "RHEL" below refers to Red Hat Enterprise Linux, Oracle Linux and Rocky Linux. The distributions are 1:1 binary compatible.*
It's recommended to use the latest Python versions.
RHEL 8 in particular ships with Python 3.6 by default which is EOL and therefore no longer supported by Synapse. RHEL 9 ship with Python 3.9 which is still supported by the Python core team as of this writing. However, newer Python versions provide significant performance improvements and they're available in official distributions' repositories. Therefore it's recommended to use them.
RHEL 8 in particular ships with Python 3.6 by default which is EOL and therefore no longer supported by Synapse. RHEL 9 ships with Python 3.9 which is still supported by the Python core team as of this writing. However, newer Python versions provide significant performance improvements and they're available in official distributions' repositories. Therefore it's recommended to use them.
Python 3.11 and 3.12 are available for both RHEL 8 and 9.
These commands should be run as root user.
RHEL 8
```bash
# Enable PowerTools repository
dnf config-manager --set-enabled powertools
```
RHEL 9
```bash
# Enable CodeReady Linux Builder repository
crb enable
```
Install new version of Python. You only need one of these:
```bash
# Python 3.11

View File

@@ -162,6 +162,53 @@ Example configuration:
pid_file: DATADIR/homeserver.pid
```
---
### `daemonize`
Specifies whether Synapse should be started as a daemon process. If Synapse is being
managed by [systemd](../../systemd-with-workers/), this option must be omitted or set to
`false`.
This can also be set by the `--daemonize` (`-D`) argument when starting Synapse.
See `worker_daemonize` for more information on daemonizing workers.
Example configuration:
```yaml
daemonize: true
```
---
### `print_pidfile`
Print the path to the pidfile just before daemonizing. Defaults to false.
This can also be set by the `--print-pidfile` argument when starting Synapse.
Example configuration:
```yaml
print_pidfile: true
```
---
### `user_agent_suffix`
A suffix that is appended to the Synapse user-agent (ex. `Synapse/v1.123.0`). Defaults
to None.
Example configuration:
```yaml
user_agent_suffix: " (I'm a teapot; Linux x86_64)"
```
---
### `use_frozen_dicts`
Determines whether we should freeze the internal dict object in `FrozenEvent`. Freezing
prevents bugs where we accidentally share e.g. signature dicts. However, freezing a
dict is expensive. Defaults to false.
Example configuration:
```yaml
use_frozen_dicts: true
```
---
### `web_client_location`
The absolute URL to the web client which `/` will redirect to. Defaults to none.
@@ -595,6 +642,17 @@ listeners:
- names: [client, federation]
```
---
### `manhole`
Turn on the Twisted telnet manhole service on the given port. Defaults to none.
This can also be set by the `--manhole` argument when starting Synapse.
Example configuration:
```yaml
manhole: 1234
```
---
### `manhole_settings`
@@ -1868,6 +1926,27 @@ rc_federation:
concurrent: 5
```
---
### `rc_presence`
This option sets ratelimiting for presence.
The `rc_presence.per_user` option sets rate limits on how often a specific
user's presence updates are evaluated. Ratelimited presence updates sent via sync are
ignored, and no error is returned to the client.
This option also sets the rate limit for the
[`PUT /_matrix/client/v3/presence/{userId}/status`](https://spec.matrix.org/latest/client-server-api/#put_matrixclientv3presenceuseridstatus)
endpoint.
`per_user` defaults to `per_second: 0.1`, `burst_count: 1`.
Example configuration:
```yaml
rc_presence:
per_user:
per_second: 0.05
burst_count: 1
```
---
### `federation_rr_transactions_per_room_per_second`
Sets outgoing federation transaction frequency for sending read-receipts,
@@ -2513,6 +2592,14 @@ This is primarily intended for use with the `register_new_matrix_user` script
(see [Registering a user](../../setup/installation.md#registering-a-user));
however, the interface is [documented](../../admin_api/register_api.html).
Replacing an existing `registration_shared_secret` with a new one requires users
of the [Shared-Secret Registration API](../../admin_api/register_api.html) to
start using the new secret for requesting any further one-time nonces.
> ⚠️ **Warning** The additional consequences of replacing
> [`macaroon_secret_key`](#macaroon_secret_key) also apply when that option is
> unset and falls back to `registration_shared_secret`.
See also [`registration_shared_secret_path`](#registration_shared_secret_path).
Example configuration:
@@ -3089,6 +3176,11 @@ A secret which is used to sign
If none is specified, the `registration_shared_secret` is used, if one is given;
otherwise, a secret key is derived from the signing key.
> ⚠️ **Warning** Replacing an existing `macaroon_secret_key` with a new one
> will lead to invalidation of access tokens for all guest users. It will also
> break unsubscribe links in emails sent before the change. An unlucky user
> might encounter a broken SSO login flow and would have to start again.
Example configuration:
```yaml
macaroon_secret_key: <PRIVATE STRING>
@@ -3116,6 +3208,9 @@ A secret which is used to calculate HMACs for form values, to stop
falsification of values. Must be specified for the User Consent
forms to work.
Replacing an existing `form_secret` with a new one might break the user consent
page for an unlucky user and require them to reopen the page from a new link.
Example configuration:
```yaml
form_secret: <PRIVATE STRING>
@@ -3316,8 +3411,9 @@ This setting has the following sub-options:
The default is 'uid'.
* `attribute_requirements`: It is possible to configure Synapse to only allow logins if SAML attributes
match particular values. The requirements can be listed under
`attribute_requirements` as shown in the example. All of the listed attributes must
match for the login to be permitted.
`attribute_requirements` as shown in the example. All of the listed attributes must
match for the login to be permitted. Values can be specified in a `one_of` list to allow
multiple values for an attribute.
* `idp_entityid`: If the metadata XML contains multiple IdP entities then the `idp_entityid`
option must be set to the entity to redirect users to.
Most deployments only have a single IdP entity and so should omit this option.
@@ -3398,7 +3494,9 @@ saml2_config:
- attribute: userGroup
value: "staff"
- attribute: department
value: "sales"
one_of:
- "sales"
- "admins"
idp_entityid: 'https://our_idp/entityid'
```
@@ -4361,6 +4459,9 @@ HTTP requests from workers.
By default this value is omitted (equivalently `null`), which means that
traffic between the workers and the main process is not authenticated.
Replacing an existing `worker_replication_secret` with a new one will break
communication with all workers that have not yet updated their secret.
Example configuration:
```yaml
worker_replication_secret: "secret_secret"

586
poetry.lock generated
View File

@@ -64,38 +64,36 @@ visualize = ["Twisted (>=16.1.1)", "graphviz (>0.5.1)"]
[[package]]
name = "bcrypt"
version = "4.2.0"
version = "4.2.1"
description = "Modern password hashing for your software and your servers"
optional = false
python-versions = ">=3.7"
files = [
{file = "bcrypt-4.2.0-cp37-abi3-macosx_10_12_universal2.whl", hash = "sha256:096a15d26ed6ce37a14c1ac1e48119660f21b24cba457f160a4b830f3fe6b5cb"},
{file = "bcrypt-4.2.0-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c02d944ca89d9b1922ceb8a46460dd17df1ba37ab66feac4870f6862a1533c00"},
{file = "bcrypt-4.2.0-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1d84cf6d877918620b687b8fd1bf7781d11e8a0998f576c7aa939776b512b98d"},
{file = "bcrypt-4.2.0-cp37-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:1bb429fedbe0249465cdd85a58e8376f31bb315e484f16e68ca4c786dcc04291"},
{file = "bcrypt-4.2.0-cp37-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:655ea221910bcac76ea08aaa76df427ef8625f92e55a8ee44fbf7753dbabb328"},
{file = "bcrypt-4.2.0-cp37-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:1ee38e858bf5d0287c39b7a1fc59eec64bbf880c7d504d3a06a96c16e14058e7"},
{file = "bcrypt-4.2.0-cp37-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:0da52759f7f30e83f1e30a888d9163a81353ef224d82dc58eb5bb52efcabc399"},
{file = "bcrypt-4.2.0-cp37-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:3698393a1b1f1fd5714524193849d0c6d524d33523acca37cd28f02899285060"},
{file = "bcrypt-4.2.0-cp37-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:762a2c5fb35f89606a9fde5e51392dad0cd1ab7ae64149a8b935fe8d79dd5ed7"},
{file = "bcrypt-4.2.0-cp37-abi3-win32.whl", hash = "sha256:5a1e8aa9b28ae28020a3ac4b053117fb51c57a010b9f969603ed885f23841458"},
{file = "bcrypt-4.2.0-cp37-abi3-win_amd64.whl", hash = "sha256:8f6ede91359e5df88d1f5c1ef47428a4420136f3ce97763e31b86dd8280fbdf5"},
{file = "bcrypt-4.2.0-cp39-abi3-macosx_10_12_universal2.whl", hash = "sha256:c52aac18ea1f4a4f65963ea4f9530c306b56ccd0c6f8c8da0c06976e34a6e841"},
{file = "bcrypt-4.2.0-cp39-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3bbbfb2734f0e4f37c5136130405332640a1e46e6b23e000eeff2ba8d005da68"},
{file = "bcrypt-4.2.0-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3413bd60460f76097ee2e0a493ccebe4a7601918219c02f503984f0a7ee0aebe"},
{file = "bcrypt-4.2.0-cp39-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:8d7bb9c42801035e61c109c345a28ed7e84426ae4865511eb82e913df18f58c2"},
{file = "bcrypt-4.2.0-cp39-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:3d3a6d28cb2305b43feac298774b997e372e56c7c7afd90a12b3dc49b189151c"},
{file = "bcrypt-4.2.0-cp39-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:9c1c4ad86351339c5f320ca372dfba6cb6beb25e8efc659bedd918d921956bae"},
{file = "bcrypt-4.2.0-cp39-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:27fe0f57bb5573104b5a6de5e4153c60814c711b29364c10a75a54bb6d7ff48d"},
{file = "bcrypt-4.2.0-cp39-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:8ac68872c82f1add6a20bd489870c71b00ebacd2e9134a8aa3f98a0052ab4b0e"},
{file = "bcrypt-4.2.0-cp39-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:cb2a8ec2bc07d3553ccebf0746bbf3d19426d1c6d1adbd4fa48925f66af7b9e8"},
{file = "bcrypt-4.2.0-cp39-abi3-win32.whl", hash = "sha256:77800b7147c9dc905db1cba26abe31e504d8247ac73580b4aa179f98e6608f34"},
{file = "bcrypt-4.2.0-cp39-abi3-win_amd64.whl", hash = "sha256:61ed14326ee023917ecd093ee6ef422a72f3aec6f07e21ea5f10622b735538a9"},
{file = "bcrypt-4.2.0-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:39e1d30c7233cfc54f5c3f2c825156fe044efdd3e0b9d309512cc514a263ec2a"},
{file = "bcrypt-4.2.0-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:f4f4acf526fcd1c34e7ce851147deedd4e26e6402369304220250598b26448db"},
{file = "bcrypt-4.2.0-pp39-pypy39_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:1ff39b78a52cf03fdf902635e4c81e544714861ba3f0efc56558979dd4f09170"},
{file = "bcrypt-4.2.0-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:373db9abe198e8e2c70d12b479464e0d5092cc122b20ec504097b5f2297ed184"},
{file = "bcrypt-4.2.0.tar.gz", hash = "sha256:cf69eaf5185fd58f268f805b505ce31f9b9fc2d64b376642164e9244540c1221"},
{file = "bcrypt-4.2.1-cp37-abi3-macosx_10_12_universal2.whl", hash = "sha256:1340411a0894b7d3ef562fb233e4b6ed58add185228650942bdc885362f32c17"},
{file = "bcrypt-4.2.1-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b1ee315739bc8387aa36ff127afc99120ee452924e0df517a8f3e4c0187a0f5f"},
{file = "bcrypt-4.2.1-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8dbd0747208912b1e4ce730c6725cb56c07ac734b3629b60d4398f082ea718ad"},
{file = "bcrypt-4.2.1-cp37-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:aaa2e285be097050dba798d537b6efd9b698aa88eef52ec98d23dcd6d7cf6fea"},
{file = "bcrypt-4.2.1-cp37-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:76d3e352b32f4eeb34703370e370997065d28a561e4a18afe4fef07249cb4396"},
{file = "bcrypt-4.2.1-cp37-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:b7703ede632dc945ed1172d6f24e9f30f27b1b1a067f32f68bf169c5f08d0425"},
{file = "bcrypt-4.2.1-cp37-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:89df2aea2c43be1e1fa066df5f86c8ce822ab70a30e4c210968669565c0f4685"},
{file = "bcrypt-4.2.1-cp37-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:04e56e3fe8308a88b77e0afd20bec516f74aecf391cdd6e374f15cbed32783d6"},
{file = "bcrypt-4.2.1-cp37-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:cfdf3d7530c790432046c40cda41dfee8c83e29482e6a604f8930b9930e94139"},
{file = "bcrypt-4.2.1-cp37-abi3-win32.whl", hash = "sha256:adadd36274510a01f33e6dc08f5824b97c9580583bd4487c564fc4617b328005"},
{file = "bcrypt-4.2.1-cp37-abi3-win_amd64.whl", hash = "sha256:8c458cd103e6c5d1d85cf600e546a639f234964d0228909d8f8dbeebff82d526"},
{file = "bcrypt-4.2.1-cp39-abi3-macosx_10_12_universal2.whl", hash = "sha256:8ad2f4528cbf0febe80e5a3a57d7a74e6635e41af1ea5675282a33d769fba413"},
{file = "bcrypt-4.2.1-cp39-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:909faa1027900f2252a9ca5dfebd25fc0ef1417943824783d1c8418dd7d6df4a"},
{file = "bcrypt-4.2.1-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cde78d385d5e93ece5479a0a87f73cd6fa26b171c786a884f955e165032b262c"},
{file = "bcrypt-4.2.1-cp39-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:533e7f3bcf2f07caee7ad98124fab7499cb3333ba2274f7a36cf1daee7409d99"},
{file = "bcrypt-4.2.1-cp39-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:687cf30e6681eeda39548a93ce9bfbb300e48b4d445a43db4298d2474d2a1e54"},
{file = "bcrypt-4.2.1-cp39-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:041fa0155c9004eb98a232d54da05c0b41d4b8e66b6fc3cb71b4b3f6144ba837"},
{file = "bcrypt-4.2.1-cp39-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:f85b1ffa09240c89aa2e1ae9f3b1c687104f7b2b9d2098da4e923f1b7082d331"},
{file = "bcrypt-4.2.1-cp39-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:c6f5fa3775966cca251848d4d5393ab016b3afed251163c1436fefdec3b02c84"},
{file = "bcrypt-4.2.1-cp39-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:807261df60a8b1ccd13e6599c779014a362ae4e795f5c59747f60208daddd96d"},
{file = "bcrypt-4.2.1-cp39-abi3-win32.whl", hash = "sha256:b588af02b89d9fad33e5f98f7838bf590d6d692df7153647724a7f20c186f6bf"},
{file = "bcrypt-4.2.1-cp39-abi3-win_amd64.whl", hash = "sha256:e84e0e6f8e40a242b11bce56c313edc2be121cec3e0ec2d76fce01f6af33c07c"},
{file = "bcrypt-4.2.1-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:76132c176a6d9953cdc83c296aeaed65e1a708485fd55abf163e0d9f8f16ce0e"},
{file = "bcrypt-4.2.1-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:e158009a54c4c8bc91d5e0da80920d048f918c61a581f0a63e4e93bb556d362f"},
{file = "bcrypt-4.2.1.tar.gz", hash = "sha256:6765386e3ab87f569b276988742039baab087b2cdb01e809d74e74503c2faafe"},
]
[package.extras]
@@ -472,123 +470,138 @@ smmap = ">=3.0.1,<6"
[[package]]
name = "gitpython"
version = "3.1.43"
version = "3.1.44"
description = "GitPython is a Python library used to interact with Git repositories"
optional = false
python-versions = ">=3.7"
files = [
{file = "GitPython-3.1.43-py3-none-any.whl", hash = "sha256:eec7ec56b92aad751f9912a73404bc02ba212a23adb2c7098ee668417051a1ff"},
{file = "GitPython-3.1.43.tar.gz", hash = "sha256:35f314a9f878467f5453cc1fee295c3e18e52f1b99f10f6cf5b1682e968a9e7c"},
{file = "GitPython-3.1.44-py3-none-any.whl", hash = "sha256:9e0e10cda9bed1ee64bc9a6de50e7e38a9c9943241cd7f585f6df3ed28011110"},
{file = "gitpython-3.1.44.tar.gz", hash = "sha256:c87e30b26253bf5418b01b0660f818967f3c503193838337fe5e573331249269"},
]
[package.dependencies]
gitdb = ">=4.0.1,<5"
[package.extras]
doc = ["sphinx (==4.3.2)", "sphinx-autodoc-typehints", "sphinx-rtd-theme", "sphinxcontrib-applehelp (>=1.0.2,<=1.0.4)", "sphinxcontrib-devhelp (==1.0.2)", "sphinxcontrib-htmlhelp (>=2.0.0,<=2.0.1)", "sphinxcontrib-qthelp (==1.0.3)", "sphinxcontrib-serializinghtml (==1.1.5)"]
doc = ["sphinx (>=7.1.2,<7.2)", "sphinx-autodoc-typehints", "sphinx_rtd_theme"]
test = ["coverage[toml]", "ddt (>=1.1.1,!=1.4.3)", "mock", "mypy", "pre-commit", "pytest (>=7.3.1)", "pytest-cov", "pytest-instafail", "pytest-mock", "pytest-sugar", "typing-extensions"]
[[package]]
name = "hiredis"
version = "3.0.0"
version = "3.1.0"
description = "Python wrapper for hiredis"
optional = true
python-versions = ">=3.8"
files = [
{file = "hiredis-3.0.0-cp310-cp310-macosx_10_15_universal2.whl", hash = "sha256:4b182791c41c5eb1d9ed736f0ff81694b06937ca14b0d4dadde5dadba7ff6dae"},
{file = "hiredis-3.0.0-cp310-cp310-macosx_10_15_x86_64.whl", hash = "sha256:13c275b483a052dd645eb2cb60d6380f1f5215e4c22d6207e17b86be6dd87ffa"},
{file = "hiredis-3.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:c1018cc7f12824506f165027eabb302735b49e63af73eb4d5450c66c88f47026"},
{file = "hiredis-3.0.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:83a29cc7b21b746cb6a480189e49f49b2072812c445e66a9e38d2004d496b81c"},
{file = "hiredis-3.0.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e241fab6332e8fb5f14af00a4a9c6aefa22f19a336c069b7ddbf28ef8341e8d6"},
{file = "hiredis-3.0.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1fb8de899f0145d6c4d5d4bd0ee88a78eb980a7ffabd51e9889251b8f58f1785"},
{file = "hiredis-3.0.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b23291951959141173eec10f8573538e9349fa27f47a0c34323d1970bf891ee5"},
{file = "hiredis-3.0.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e421ac9e4b5efc11705a0d5149e641d4defdc07077f748667f359e60dc904420"},
{file = "hiredis-3.0.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:77c8006c12154c37691b24ff293c077300c22944018c3ff70094a33e10c1d795"},
{file = "hiredis-3.0.0-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:41afc0d3c18b59eb50970479a9c0e5544fb4b95e3a79cf2fbaece6ddefb926fe"},
{file = "hiredis-3.0.0-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:04ccae6dcd9647eae6025425ab64edb4d79fde8b9e6e115ebfabc6830170e3b2"},
{file = "hiredis-3.0.0-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:fe91d62b0594db5ea7d23fc2192182b1a7b6973f628a9b8b2e0a42a2be721ac6"},
{file = "hiredis-3.0.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:99516d99316062824a24d145d694f5b0d030c80da693ea6f8c4ecf71a251d8bb"},
{file = "hiredis-3.0.0-cp310-cp310-win32.whl", hash = "sha256:562eaf820de045eb487afaa37e6293fe7eceb5b25e158b5a1974b7e40bf04543"},
{file = "hiredis-3.0.0-cp310-cp310-win_amd64.whl", hash = "sha256:a1c81c89ed765198da27412aa21478f30d54ef69bf5e4480089d9c3f77b8f882"},
{file = "hiredis-3.0.0-cp311-cp311-macosx_10_15_universal2.whl", hash = "sha256:4664dedcd5933364756d7251a7ea86d60246ccf73a2e00912872dacbfcef8978"},
{file = "hiredis-3.0.0-cp311-cp311-macosx_10_15_x86_64.whl", hash = "sha256:47de0bbccf4c8a9f99d82d225f7672b9dd690d8fd872007b933ef51a302c9fa6"},
{file = "hiredis-3.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:e43679eca508ba8240d016d8cca9d27342d70184773c15bea78a23c87a1922f1"},
{file = "hiredis-3.0.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:13c345e7278c210317e77e1934b27b61394fee0dec2e8bd47e71570900f75823"},
{file = "hiredis-3.0.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:00018f22f38530768b73ea86c11f47e8d4df65facd4e562bd78773bd1baef35e"},
{file = "hiredis-3.0.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4ea3a86405baa8eb0d3639ced6926ad03e07113de54cb00fd7510cb0db76a89d"},
{file = "hiredis-3.0.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c073848d2b1d5561f3903879ccf4e1a70c9b1e7566c7bdcc98d082fa3e7f0a1d"},
{file = "hiredis-3.0.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5a8dffb5f5b3415a4669d25de48b617fd9d44b0bccfc4c2ab24b06406ecc9ecb"},
{file = "hiredis-3.0.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:22c17c96143c2a62dfd61b13803bc5de2ac526b8768d2141c018b965d0333b66"},
{file = "hiredis-3.0.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:c3ece960008dab66c6b8bb3a1350764677ee7c74ccd6270aaf1b1caf9ccebb46"},
{file = "hiredis-3.0.0-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:f75999ae00a920f7dce6ecae76fa5e8674a3110e5a75f12c7a2c75ae1af53396"},
{file = "hiredis-3.0.0-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:e069967cbd5e1900aafc4b5943888f6d34937fc59bf8918a1a546cb729b4b1e4"},
{file = "hiredis-3.0.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:0aacc0a78e1d94d843a6d191f224a35893e6bdfeb77a4a89264155015c65f126"},
{file = "hiredis-3.0.0-cp311-cp311-win32.whl", hash = "sha256:719c32147ba29528cb451f037bf837dcdda4ff3ddb6cdb12c4216b0973174718"},
{file = "hiredis-3.0.0-cp311-cp311-win_amd64.whl", hash = "sha256:bdc144d56333c52c853c31b4e2e52cfbdb22d3da4374c00f5f3d67c42158970f"},
{file = "hiredis-3.0.0-cp312-cp312-macosx_10_15_universal2.whl", hash = "sha256:484025d2eb8f6348f7876fc5a2ee742f568915039fcb31b478fd5c242bb0fe3a"},
{file = "hiredis-3.0.0-cp312-cp312-macosx_10_15_x86_64.whl", hash = "sha256:fcdb552ffd97151dab8e7bc3ab556dfa1512556b48a367db94b5c20253a35ee1"},
{file = "hiredis-3.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:0bb6f9fd92f147ba11d338ef5c68af4fd2908739c09e51f186e1d90958c68cc1"},
{file = "hiredis-3.0.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fa86bf9a0ed339ec9e8a9a9d0ae4dccd8671625c83f9f9f2640729b15e07fbfd"},
{file = "hiredis-3.0.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e194a0d5df9456995d8f510eab9f529213e7326af6b94770abf8f8b7952ddcaa"},
{file = "hiredis-3.0.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c8a1df39d74ec507d79c7a82c8063eee60bf80537cdeee652f576059b9cdd15c"},
{file = "hiredis-3.0.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f91456507427ba36fd81b2ca11053a8e112c775325acc74e993201ea912d63e9"},
{file = "hiredis-3.0.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9862db92ef67a8a02e0d5370f07d380e14577ecb281b79720e0d7a89aedb9ee5"},
{file = "hiredis-3.0.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:d10fcd9e0eeab835f492832b2a6edb5940e2f1230155f33006a8dfd3bd2c94e4"},
{file = "hiredis-3.0.0-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:48727d7d405d03977d01885f317328dc21d639096308de126c2c4e9950cbd3c9"},
{file = "hiredis-3.0.0-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:8e0bb6102ebe2efecf8a3292c6660a0e6fac98176af6de67f020bea1c2343717"},
{file = "hiredis-3.0.0-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:df274e3abb4df40f4c7274dd3e587dfbb25691826c948bc98d5fead019dfb001"},
{file = "hiredis-3.0.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:034925b5fb514f7b11aac38cd55b3fd7e9d3af23bd6497f3f20aa5b8ba58e232"},
{file = "hiredis-3.0.0-cp312-cp312-win32.whl", hash = "sha256:120f2dda469b28d12ccff7c2230225162e174657b49cf4cd119db525414ae281"},
{file = "hiredis-3.0.0-cp312-cp312-win_amd64.whl", hash = "sha256:e584fe5f4e6681d8762982be055f1534e0170f6308a7a90f58d737bab12ff6a8"},
{file = "hiredis-3.0.0-cp38-cp38-macosx_10_15_universal2.whl", hash = "sha256:122171ff47d96ed8dd4bba6c0e41d8afaba3e8194949f7720431a62aa29d8895"},
{file = "hiredis-3.0.0-cp38-cp38-macosx_10_15_x86_64.whl", hash = "sha256:ba9fc605ac558f0de67463fb588722878641e6fa1dabcda979e8e69ff581d0bd"},
{file = "hiredis-3.0.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:a631e2990b8be23178f655cae8ac6c7422af478c420dd54e25f2e26c29e766f1"},
{file = "hiredis-3.0.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:63482db3fadebadc1d01ad33afa6045ebe2ea528eb77ccaabd33ee7d9c2bad48"},
{file = "hiredis-3.0.0-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1f669212c390eebfbe03c4e20181f5970b82c5d0a0ad1df1785f7ffbe7d61150"},
{file = "hiredis-3.0.0-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a6a49ef161739f8018c69b371528bdb47d7342edfdee9ddc75a4d8caddf45a6e"},
{file = "hiredis-3.0.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:98a152052b8878e5e43a2e3a14075218adafc759547c98668a21e9485882696c"},
{file = "hiredis-3.0.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:50a196af0ce657fcde9bf8a0bbe1032e22c64d8fcec2bc926a35e7ff68b3a166"},
{file = "hiredis-3.0.0-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:f2f312eef8aafc2255e3585dcf94d5da116c43ef837db91db9ecdc1bc930072d"},
{file = "hiredis-3.0.0-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:6ca41fa40fa019cde42c21add74aadd775e71458051a15a352eabeb12eb4d084"},
{file = "hiredis-3.0.0-cp38-cp38-musllinux_1_2_ppc64le.whl", hash = "sha256:6eecb343c70629f5af55a8b3e53264e44fa04e155ef7989de13668a0cb102a90"},
{file = "hiredis-3.0.0-cp38-cp38-musllinux_1_2_s390x.whl", hash = "sha256:c3fdad75e7837a475900a1d3a5cc09aa024293c3b0605155da2d42f41bc0e482"},
{file = "hiredis-3.0.0-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:8854969e7480e8d61ed7549eb232d95082a743e94138d98d7222ba4e9f7ecacd"},
{file = "hiredis-3.0.0-cp38-cp38-win32.whl", hash = "sha256:f114a6c86edbf17554672b050cce72abf489fe58d583c7921904d5f1c9691605"},
{file = "hiredis-3.0.0-cp38-cp38-win_amd64.whl", hash = "sha256:7d99b91e42217d7b4b63354b15b41ce960e27d216783e04c4a350224d55842a4"},
{file = "hiredis-3.0.0-cp39-cp39-macosx_10_15_universal2.whl", hash = "sha256:4c6efcbb5687cf8d2aedcc2c3ed4ac6feae90b8547427d417111194873b66b06"},
{file = "hiredis-3.0.0-cp39-cp39-macosx_10_15_x86_64.whl", hash = "sha256:5b5cff42a522a0d81c2ae7eae5e56d0ee7365e0c4ad50c4de467d8957aff4414"},
{file = "hiredis-3.0.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:82f794d564f4bc76b80c50b03267fe5d6589e93f08e66b7a2f674faa2fa76ebc"},
{file = "hiredis-3.0.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d7a4c1791d7aa7e192f60fe028ae409f18ccdd540f8b1e6aeb0df7816c77e4a4"},
{file = "hiredis-3.0.0-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a2537b2cd98192323fce4244c8edbf11f3cac548a9d633dbbb12b48702f379f4"},
{file = "hiredis-3.0.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8fed69bbaa307040c62195a269f82fc3edf46b510a17abb6b30a15d7dab548df"},
{file = "hiredis-3.0.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:869f6d5537d243080f44253491bb30aa1ec3c21754003b3bddeadedeb65842b0"},
{file = "hiredis-3.0.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d435ae89073d7cd51e6b6bf78369c412216261c9c01662e7008ff00978153729"},
{file = "hiredis-3.0.0-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:204b79b30a0e6be0dc2301a4d385bb61472809f09c49f400497f1cdd5a165c66"},
{file = "hiredis-3.0.0-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:3ea635101b739c12effd189cc19b2671c268abb03013fd1f6321ca29df3ca625"},
{file = "hiredis-3.0.0-cp39-cp39-musllinux_1_2_ppc64le.whl", hash = "sha256:f359175197fd833c8dd7a8c288f1516be45415bb5c939862ab60c2918e1e1943"},
{file = "hiredis-3.0.0-cp39-cp39-musllinux_1_2_s390x.whl", hash = "sha256:ac6d929cb33dd12ad3424b75725975f0a54b5b12dbff95f2a2d660c510aa106d"},
{file = "hiredis-3.0.0-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:100431e04d25a522ef2c3b94f294c4219c4de3bfc7d557b6253296145a144c11"},
{file = "hiredis-3.0.0-cp39-cp39-win32.whl", hash = "sha256:e1a9c14ae9573d172dc050a6f63a644457df5d01ec4d35a6a0f097f812930f83"},
{file = "hiredis-3.0.0-cp39-cp39-win_amd64.whl", hash = "sha256:54a6dd7b478e6eb01ce15b3bb5bf771e108c6c148315bf194eb2ab776a3cac4d"},
{file = "hiredis-3.0.0-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:50da7a9edf371441dfcc56288d790985ee9840d982750580710a9789b8f4a290"},
{file = "hiredis-3.0.0-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:9b285ef6bf1581310b0d5e8f6ce64f790a1c40e89c660e1320b35f7515433672"},
{file = "hiredis-3.0.0-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0dcfa684966f25b335072115de2f920228a3c2caf79d4bfa2b30f6e4f674a948"},
{file = "hiredis-3.0.0-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a41be8af1fd78ca97bc948d789a09b730d1e7587d07ca53af05758f31f4b985d"},
{file = "hiredis-3.0.0-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:038756db735e417ab36ee6fd7725ce412385ed2bd0767e8179a4755ea11b804f"},
{file = "hiredis-3.0.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:fcecbd39bd42cef905c0b51c9689c39d0cc8b88b1671e7f40d4fb213423aef3a"},
{file = "hiredis-3.0.0-pp38-pypy38_pp73-macosx_10_15_x86_64.whl", hash = "sha256:a131377493a59fb0f5eaeb2afd49c6540cafcfba5b0b3752bed707be9e7c4eaf"},
{file = "hiredis-3.0.0-pp38-pypy38_pp73-macosx_11_0_arm64.whl", hash = "sha256:3d22c53f0ec5c18ecb3d92aa9420563b1c5d657d53f01356114978107b00b860"},
{file = "hiredis-3.0.0-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c8a91e9520fbc65a799943e5c970ffbcd67905744d8becf2e75f9f0a5e8414f0"},
{file = "hiredis-3.0.0-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3dc8043959b50141df58ab4f398e8ae84c6f9e673a2c9407be65fc789138f4a6"},
{file = "hiredis-3.0.0-pp38-pypy38_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:51b99cfac514173d7b8abdfe10338193e8a0eccdfe1870b646009d2fb7cbe4b5"},
{file = "hiredis-3.0.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:fa1fcad89d8a41d8dc10b1e54951ec1e161deabd84ed5a2c95c3c7213bdb3514"},
{file = "hiredis-3.0.0-pp39-pypy39_pp73-macosx_10_15_x86_64.whl", hash = "sha256:898636a06d9bf575d2c594129085ad6b713414038276a4bfc5db7646b8a5be78"},
{file = "hiredis-3.0.0-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:466f836dbcf86de3f9692097a7a01533dc9926986022c6617dc364a402b265c5"},
{file = "hiredis-3.0.0-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:23142a8af92a13fc1e3f2ca1d940df3dcf2af1d176be41fe8d89e30a837a0b60"},
{file = "hiredis-3.0.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:793c80a3d6b0b0e8196a2d5de37a08330125668c8012922685e17aa9108c33ac"},
{file = "hiredis-3.0.0-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:467d28112c7faa29b7db743f40803d927c8591e9da02b6ce3d5fadc170a542a2"},
{file = "hiredis-3.0.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:dc384874a719c767b50a30750f937af18842ee5e288afba95a5a3ed703b1515a"},
{file = "hiredis-3.0.0.tar.gz", hash = "sha256:fed8581ae26345dea1f1e0d1a96e05041a727a45e7d8d459164583e23c6ac441"},
{file = "hiredis-3.1.0-cp310-cp310-macosx_10_15_universal2.whl", hash = "sha256:2892db9db21f0cf7cc298d09f85d3e1f6dc4c4c24463ab67f79bc7a006d51867"},
{file = "hiredis-3.1.0-cp310-cp310-macosx_10_15_x86_64.whl", hash = "sha256:93cfa6cc25ee2ceb0be81dc61eca9995160b9e16bdb7cca4a00607d57e998918"},
{file = "hiredis-3.1.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:2af62070aa9433802cae7be7364d5e82f76462c6a2ae34e53008b637aaa9a156"},
{file = "hiredis-3.1.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:072c162260ebb1d892683107da22d0d5da7a1414739eae4e185cac22fe89627f"},
{file = "hiredis-3.1.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c6b232c43e89755ba332c2745ddab059c0bc1a0f01448a3a14d506f8448b1ce6"},
{file = "hiredis-3.1.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:eb5316c9a65c4dde80796aa245b76011bab64eb84461a77b0a61c1bf2970bcc9"},
{file = "hiredis-3.1.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e812a4e656bbd1c1c15c844b28259c49e26bb384837e44e8d2aa55412c91d2f7"},
{file = "hiredis-3.1.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:93a6c9230e5a5565847130c0e1005c8d3aa5ca681feb0ed542c4651323d32feb"},
{file = "hiredis-3.1.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:a5f65e89ce50a94d9490d5442a649c6116f53f216c8c14eb37cf9637956482b2"},
{file = "hiredis-3.1.0-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:9b2d6e33601c67c074c367fdccdd6033e642284e7a56adc130f18f724c378ca8"},
{file = "hiredis-3.1.0-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:bad3b1e0c83849910f28c95953417106f539277035a4b515d1425f93947bc28f"},
{file = "hiredis-3.1.0-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:9646de31f5994e6218311dcf216e971703dbf804c510fd3f84ddb9813c495824"},
{file = "hiredis-3.1.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:59a9230f3aa38a33d09d8171400de202f575d7a38869e5ce2947829bca6fe359"},
{file = "hiredis-3.1.0-cp310-cp310-win32.whl", hash = "sha256:0322d70f3328b97da14b6e98b18f0090a12ed8a8bf7ae20932e2eb9d1bb0aa2c"},
{file = "hiredis-3.1.0-cp310-cp310-win_amd64.whl", hash = "sha256:802474c18e878b3f9905e160a8b7df87d57885758083eda76c5978265acb41aa"},
{file = "hiredis-3.1.0-cp311-cp311-macosx_10_15_universal2.whl", hash = "sha256:c339ff4b4739b2a40da463763dd566129762f72926bca611ad9a457a9fe64abd"},
{file = "hiredis-3.1.0-cp311-cp311-macosx_10_15_x86_64.whl", hash = "sha256:0ffa2552f704a45954627697a378fc2f559004e53055b82f00daf30bd4305330"},
{file = "hiredis-3.1.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:9acf7f0e7106f631cd618eb60ec9bbd6e43045addd5310f66ba1177209567e59"},
{file = "hiredis-3.1.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ea4f5ecf9dbea93c827486f59c606684c3496ea71c7ba9a8131932780696e61a"},
{file = "hiredis-3.1.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:39efab176fca3d5111075f6ba56cd864f18db46d858289d39360c5672e0e5c3e"},
{file = "hiredis-3.1.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1110eae007f30e70a058d743e369c24430327cd01fd97d99519d6794a58dd587"},
{file = "hiredis-3.1.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9b390f63191bcccbb6044d4c118acdf4fa55f38e5658ac4cfd5a33a6f0c07659"},
{file = "hiredis-3.1.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:72a98ccc7b8ec9ce0100ecf59f45f05d2023606e8e3676b07a316d1c1c364072"},
{file = "hiredis-3.1.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:7c76e751fd1e2f221dec09cdc24040ee486886e943d5d7ffc256e8cf15c75e51"},
{file = "hiredis-3.1.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:7d3880f213b6f14e9c69ce52beffd1748eecc8669698c4782761887273b6e1bd"},
{file = "hiredis-3.1.0-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:87c2b3fe7e7c96eba376506a76e11514e07e848f737b254e0973e4b5c3a491e9"},
{file = "hiredis-3.1.0-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:d3cfb4089e96f8f8ee9554da93148a9261aa6612ad2cc202c1a494c7b712e31f"},
{file = "hiredis-3.1.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:4f12018e5c5f866a1c3f7017cb2d88e5c6f9440df2281e48865a2b6c40f247f4"},
{file = "hiredis-3.1.0-cp311-cp311-win32.whl", hash = "sha256:107b66ce977bb2dff8f2239e68344360a75d05fed3d9fa0570ac4d3020ce2396"},
{file = "hiredis-3.1.0-cp311-cp311-win_amd64.whl", hash = "sha256:8f1240bde53d3d1676f0aba61b3661560dc9a681cae24d9de33e650864029aa4"},
{file = "hiredis-3.1.0-cp312-cp312-macosx_10_15_universal2.whl", hash = "sha256:f7c7f89e0bc4246115754e2eda078a111282f6d6ecc6fb458557b724fe6f2aac"},
{file = "hiredis-3.1.0-cp312-cp312-macosx_10_15_x86_64.whl", hash = "sha256:3dbf9163296fa45fbddcfc4c5900f10e9ddadda37117dbfb641e327e536b53e0"},
{file = "hiredis-3.1.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:af46a4be0e82df470f68f35316fa16cd1e134d1c5092fc1082e1aad64cce716d"},
{file = "hiredis-3.1.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bc63d698c43aea500a84d8b083f830c03808b6cf3933ae4d35a27f0a3d881652"},
{file = "hiredis-3.1.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:676b3d88674134bfaaf70dac181d1790b0f33b3187bfb9da9221e17e0e624f83"},
{file = "hiredis-3.1.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:aed10d9df1e2fb0011db2713ac64497462e9c2c0208b648c97569da772b959ca"},
{file = "hiredis-3.1.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3b5bd8adfe8742e331a94cccd782bffea251fa70d9a709e71f4510f50794d700"},
{file = "hiredis-3.1.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9fc4e35b4afb0af6da55495dd0742ad32ab88150428a6ecdbb3085cbd60714e8"},
{file = "hiredis-3.1.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:89b83e76eb00ab0464e7b0752a3ffcb02626e742e9509bc141424a9c3202e8dc"},
{file = "hiredis-3.1.0-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:98ebf08c907836b70a8f40e030df8ab6f174dc7f6fa765251d813e89f14069d8"},
{file = "hiredis-3.1.0-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:6c840b9cec086328f2ee2cfee0038b5d6bbb514bac7b5e579da6e346eaac056c"},
{file = "hiredis-3.1.0-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:c5c44e9fa6f4462d0330cb5f5d46fa652512fc86b41d4d1974d0356f263e9105"},
{file = "hiredis-3.1.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:e665b14ab50aa175cfa306fcb00fffd4e3ff02ceb36ca6a4df00b1246d6a73c4"},
{file = "hiredis-3.1.0-cp312-cp312-win32.whl", hash = "sha256:bd33db977ac7af97e8d035ffadb163b00546be22e5f1297b2123f5f9bf0f8a21"},
{file = "hiredis-3.1.0-cp312-cp312-win_amd64.whl", hash = "sha256:37aed4aa9348600145e2d019c7be27855e503ecc4906c6976ff2f3b52e3d5d97"},
{file = "hiredis-3.1.0-cp313-cp313-macosx_10_15_universal2.whl", hash = "sha256:b87cddd8107487863fed6994de51e5594a0be267b0b19e213694e99cdd614623"},
{file = "hiredis-3.1.0-cp313-cp313-macosx_10_15_x86_64.whl", hash = "sha256:d302deff8cb63a7feffc1844e4dafc8076e566bbf10c5aaaf0f4fe791b8a6bd0"},
{file = "hiredis-3.1.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:4a018340c073cf88cb635b2bedff96619df2f666018c655e7911f46fa2c1c178"},
{file = "hiredis-3.1.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f1e8ba6414ac1ae536129e18c069f3eb497df5a74e136e3566471620a4fa5f95"},
{file = "hiredis-3.1.0-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a86b9fef256c2beb162244791fdc025aa55f936d6358e86e2020e512fe2e4972"},
{file = "hiredis-3.1.0-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7acdc68e29a446ad17aadaff19c981a36b3bd8c894c3520412c8a7ab1c3e0de7"},
{file = "hiredis-3.1.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c7e06baea05de57e1e7548064f505a6964e992674fe61b8f274afe2ac93b6371"},
{file = "hiredis-3.1.0-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:35b5fc061c8a0dbfdb440053280504d6aaa8d9726bd4d1d0e1cfcbbdf0d60b73"},
{file = "hiredis-3.1.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:c89d2dcb271d24c44f02264233b75d5db8c58831190fa92456a90b87fa17b748"},
{file = "hiredis-3.1.0-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:aa36688c10a08f626fddcf68c2b1b91b0e90b070c26e550a4151a877f5c2d431"},
{file = "hiredis-3.1.0-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:f3982a9c16c1c4bc05a00b65d01ffb8d80ea1a7b6b533be2f1a769d3e989d2c0"},
{file = "hiredis-3.1.0-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:d1a6f889514ee2452300c9a06862fceedef22a2891f1c421a27b1ba52ef130b2"},
{file = "hiredis-3.1.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:8a45ff7915392a55d9386bb235ea1d1eb9960615f301979f02143fc20036b699"},
{file = "hiredis-3.1.0-cp313-cp313-win32.whl", hash = "sha256:539e5bb725b62b76a5319a4e68fc7085f01349abc2316ef3df608ea0883c51d2"},
{file = "hiredis-3.1.0-cp313-cp313-win_amd64.whl", hash = "sha256:9020fd7e58f489fda6a928c31355add0e665fd6b87b21954e675cf9943eafa32"},
{file = "hiredis-3.1.0-cp38-cp38-macosx_10_15_universal2.whl", hash = "sha256:b621a89fc29b3f4b01be6640ec81a6a94b5382bc78fecb876408d57a071e45aa"},
{file = "hiredis-3.1.0-cp38-cp38-macosx_10_15_x86_64.whl", hash = "sha256:363e21fba55e1a26349dc9ca7da6b14332123879b6359bcee4a9acecb40ca33b"},
{file = "hiredis-3.1.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:c156156798729eadc9ab76ffee96c88b93cc1c3b493f4dd0a4341f53939194ee"},
{file = "hiredis-3.1.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e38d8a325f9a6afac1b1c72d996d1add9e1b99696ce9410538ba5e9aa8fdba02"},
{file = "hiredis-3.1.0-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3004ef7436feb7bfa61c0b36d422b8fb8c29aaa1a514c9405f0fdee5e9694dd3"},
{file = "hiredis-3.1.0-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:13f5b16f97d0bbd1c04ce367c49097d1214d60e11f9fee7ef2a9b54e0a6645c8"},
{file = "hiredis-3.1.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:230dd0e77cb0f525f58a1306a7b4aaf078037fc5229110922332ca46f90821bb"},
{file = "hiredis-3.1.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d968116caddd19d63120d1298e62b1bbc694db3360ed0d5df8c3a97edbc12552"},
{file = "hiredis-3.1.0-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:511e36a6fa41d3efab3cd5cd70ac388ed825993b9e66fa3b0e47cf27a2f5ffee"},
{file = "hiredis-3.1.0-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:c5cd20804e3cb0d31e7d899d8dd091f569c33fe40d4bade670a067ab7d31c2ac"},
{file = "hiredis-3.1.0-cp38-cp38-musllinux_1_2_ppc64le.whl", hash = "sha256:09e89e7d34cfe5ca8f7a869fca827d1af0afe8aaddb26b38c01058730edb79ad"},
{file = "hiredis-3.1.0-cp38-cp38-musllinux_1_2_s390x.whl", hash = "sha256:570cbf31413c77fe5e7c157f2943ca4400493ddd9cf2184731cfcafc753becd7"},
{file = "hiredis-3.1.0-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:b9b4da8162cf289781732d6a5ba01d820c42c05943fcdb7de307d03639961db3"},
{file = "hiredis-3.1.0-cp38-cp38-win32.whl", hash = "sha256:bc117a04bcb461d3bb1b2c5b417aee3442e1e8aa33ebc800481431f4c09fe0c5"},
{file = "hiredis-3.1.0-cp38-cp38-win_amd64.whl", hash = "sha256:34f3f5f0354db2d6797a6fb08d2c036a50af62a1d919d122c1c784304ef49347"},
{file = "hiredis-3.1.0-cp39-cp39-macosx_10_15_universal2.whl", hash = "sha256:a26fa888025badb5563f283cc19594c215a413e905729e59a5f7cf3f46d66c32"},
{file = "hiredis-3.1.0-cp39-cp39-macosx_10_15_x86_64.whl", hash = "sha256:f50763cd819d4a52a47b5966d4bb47dee34b637c5fa6402509800eee6ecb61e6"},
{file = "hiredis-3.1.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:b6d1c9e1fce5e0a94072667ae2bf0142b89ebbb1917d3531184e060a43f3ee11"},
{file = "hiredis-3.1.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e38d7a56b1a79ed0bbb9e6fe376d82e3f4dcc646ae47472f2c858e19a597c112"},
{file = "hiredis-3.1.0-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4ef5ad8b91530e4d10a68562b0a380ea22705a60e88cecee086d7c63a38564ce"},
{file = "hiredis-3.1.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cf3d2299b054e57a9f97ca08704c2843e44f29b57dc69b76a2592ecd212efe1a"},
{file = "hiredis-3.1.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:93811d60b0f73d0f049c86f4373a3833b4a38fce374ab151074d929553eb4304"},
{file = "hiredis-3.1.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:18e703ff860c1d83abbcf57012b309ead02b56b60e85150c6c3bfb37cbb16ebf"},
{file = "hiredis-3.1.0-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:f9ea0678806c53d96758e74c6a898f9d506a2e3367a344757f768bef9e069366"},
{file = "hiredis-3.1.0-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:cf6844035abf47d52a1c3f4257255af3bf3b0f14d559b08eaa45885418c6c55d"},
{file = "hiredis-3.1.0-cp39-cp39-musllinux_1_2_ppc64le.whl", hash = "sha256:7acf35cfa7ec9e1e7559c04e7095628f7d06049b5f24dcb58c1a55ef6dc689f8"},
{file = "hiredis-3.1.0-cp39-cp39-musllinux_1_2_s390x.whl", hash = "sha256:b885695dce7a39b1fd9a609ed9c4cf312e53df2ec028d5a78af7a891b5fbea4d"},
{file = "hiredis-3.1.0-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:1c22fa74ddd063396b19fe8445a1ae8b4190eff755d5750dda48e860a45b2ee7"},
{file = "hiredis-3.1.0-cp39-cp39-win32.whl", hash = "sha256:0614e16339f1784df3bbd2800322e20b4127d3f3a3509f00a5562efddb2521aa"},
{file = "hiredis-3.1.0-cp39-cp39-win_amd64.whl", hash = "sha256:c2bc713ee73ab9de4a0d68b0ab0f29612342b63173714742437b977584adb2d8"},
{file = "hiredis-3.1.0-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:07ab990d0835f36bf358dbb84db4541ac0a8f533128ec09af8f80a576eef2e88"},
{file = "hiredis-3.1.0-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:5c54a88eb9d8ebc4e5eefaadbe2102a4f7499f9e413654172f40aefd25350959"},
{file = "hiredis-3.1.0-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8095ef159896e5999a795b0f80e4d64281301a109e442a8d29cd750ca6bd8303"},
{file = "hiredis-3.1.0-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0f8ca13e2476ffd6d5be4763f5868133506ddcfa5ce54b4dac231ebdc19be6c6"},
{file = "hiredis-3.1.0-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:34d25aa25c10f966d5415795ed271da84605044dbf436c054966cea5442451b3"},
{file = "hiredis-3.1.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:4180dc5f646b426e5fa1212e1348c167ee2a864b3a70d56579163d64a847dd1e"},
{file = "hiredis-3.1.0-pp38-pypy38_pp73-macosx_10_15_x86_64.whl", hash = "sha256:d92144e0cd6e6e841a6ad343e9d58631626eeb4ac96b0322649379b5d4527447"},
{file = "hiredis-3.1.0-pp38-pypy38_pp73-macosx_11_0_arm64.whl", hash = "sha256:fcb91ba42903de637b94a1b64477f381f94ad82c0742c264f9245be76a7a3cbc"},
{file = "hiredis-3.1.0-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5ce71a797b5bc02c51da082428c00251ed6a7a67a03acbda5fbf9e8d028725f6"},
{file = "hiredis-3.1.0-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2e04c7feb9467e3170cd4d5bee381775783d81bbc45d6147c1c0ce3b50dc04f9"},
{file = "hiredis-3.1.0-pp38-pypy38_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a31806306a60f3565c04c964d6bee0e9d4a5120e1da589e41976b53972edf635"},
{file = "hiredis-3.1.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:bc51f594c2c0863ded6501642dc96701ca8bbea9ced4fa3af0a1aeda8aa634cb"},
{file = "hiredis-3.1.0-pp39-pypy39_pp73-macosx_10_15_x86_64.whl", hash = "sha256:4663a319ab7d22c597b9421e5ea384fd583e044f2f1ca9a1b98d4fef8a0fea2f"},
{file = "hiredis-3.1.0-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:8060fa256862b0c3de64a73ab45bc1ccf381caca464f2647af9075b200828948"},
{file = "hiredis-3.1.0-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3e9445b7f117a9c8c8ccad97cb44daa55ddccff3cbc9079984eac56d982ba01f"},
{file = "hiredis-3.1.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:732cf1c5cf1324f7bf3b6086976fe62a2ca98f0bf6316f31063c2c67be8797bc"},
{file = "hiredis-3.1.0-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2102a94063d878c40df92f55199637a74f535e3a0b79ceba4a00538853a21be3"},
{file = "hiredis-3.1.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:d968dde69e3fe903bf9ef00667669dcf04a3e096e33aaf138775106ead138bc8"},
{file = "hiredis-3.1.0.tar.gz", hash = "sha256:51d40ac3611091020d7dea6b05ed62cb152bff595fa4f931e7b6479d777acf7c"},
]
[[package]]
@@ -605,6 +618,25 @@ files = [
[package.dependencies]
idna = ">=2.5"
[[package]]
name = "id"
version = "1.5.0"
description = "A tool for generating OIDC identities"
optional = false
python-versions = ">=3.8"
files = [
{file = "id-1.5.0-py3-none-any.whl", hash = "sha256:f1434e1cef91f2cbb8a4ec64663d5a23b9ed43ef44c4c957d02583d61714c658"},
{file = "id-1.5.0.tar.gz", hash = "sha256:292cb8a49eacbbdbce97244f47a97b4c62540169c976552e497fd57df0734c1d"},
]
[package.dependencies]
requests = "*"
[package.extras]
dev = ["build", "bump (>=1.3.2)", "id[lint,test]"]
lint = ["bandit", "interrogate", "mypy", "ruff (<0.8.2)", "types-requests"]
test = ["coverage[toml]", "pretend", "pytest", "pytest-cov"]
[[package]]
name = "idna"
version = "3.10"
@@ -842,13 +874,13 @@ trio = ["async_generator", "trio"]
[[package]]
name = "jinja2"
version = "3.1.4"
version = "3.1.5"
description = "A very fast and expressive template engine."
optional = false
python-versions = ">=3.7"
files = [
{file = "jinja2-3.1.4-py3-none-any.whl", hash = "sha256:bc5dd2abb727a5319567b7a813e6a2e7318c39f4f487cfe6c89c6f9c7d25197d"},
{file = "jinja2-3.1.4.tar.gz", hash = "sha256:4a3aee7acbbe7303aede8e9648d13b8bf88a429282aa6122a993f0ac800cb369"},
{file = "jinja2-3.1.5-py3-none-any.whl", hash = "sha256:aba0f4dc9ed8013c424088f68a5c226f7d6097ed89b246d7749c2ec4175c6adb"},
{file = "jinja2-3.1.5.tar.gz", hash = "sha256:8fefff8dc3034e27bb80d67c671eb8a9bc424c0ef4c0826edbff304cceff43bb"},
]
[package.dependencies]
@@ -1314,43 +1346,43 @@ files = [
[[package]]
name = "mypy"
version = "1.12.1"
version = "1.13.0"
description = "Optional static typing for Python"
optional = false
python-versions = ">=3.8"
files = [
{file = "mypy-1.12.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:3d7d4371829184e22fda4015278fbfdef0327a4b955a483012bd2d423a788801"},
{file = "mypy-1.12.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:f59f1dfbf497d473201356966e353ef09d4daec48caeacc0254db8ef633a28a5"},
{file = "mypy-1.12.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b947097fae68004b8328c55161ac9db7d3566abfef72d9d41b47a021c2fba6b1"},
{file = "mypy-1.12.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:96af62050971c5241afb4701c15189ea9507db89ad07794a4ee7b4e092dc0627"},
{file = "mypy-1.12.1-cp310-cp310-win_amd64.whl", hash = "sha256:d90da248f4c2dba6c44ddcfea94bb361e491962f05f41990ff24dbd09969ce20"},
{file = "mypy-1.12.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:1230048fec1380faf240be6385e709c8570604d2d27ec6ca7e573e3bc09c3735"},
{file = "mypy-1.12.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:02dcfe270c6ea13338210908f8cadc8d31af0f04cee8ca996438fe6a97b4ec66"},
{file = "mypy-1.12.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a5a437c9102a6a252d9e3a63edc191a3aed5f2fcb786d614722ee3f4472e33f6"},
{file = "mypy-1.12.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:186e0c8346efc027ee1f9acf5ca734425fc4f7dc2b60144f0fbe27cc19dc7931"},
{file = "mypy-1.12.1-cp311-cp311-win_amd64.whl", hash = "sha256:673ba1140a478b50e6d265c03391702fa11a5c5aff3f54d69a62a48da32cb811"},
{file = "mypy-1.12.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:9fb83a7be97c498176fb7486cafbb81decccaef1ac339d837c377b0ce3743a7f"},
{file = "mypy-1.12.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:389e307e333879c571029d5b93932cf838b811d3f5395ed1ad05086b52148fb0"},
{file = "mypy-1.12.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:94b2048a95a21f7a9ebc9fbd075a4fcd310410d078aa0228dbbad7f71335e042"},
{file = "mypy-1.12.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:4ee5932370ccf7ebf83f79d1c157a5929d7ea36313027b0d70a488493dc1b179"},
{file = "mypy-1.12.1-cp312-cp312-win_amd64.whl", hash = "sha256:19bf51f87a295e7ab2894f1d8167622b063492d754e69c3c2fed6563268cb42a"},
{file = "mypy-1.12.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:d34167d43613ffb1d6c6cdc0cc043bb106cac0aa5d6a4171f77ab92a3c758bcc"},
{file = "mypy-1.12.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:427878aa54f2e2c5d8db31fa9010c599ed9f994b3b49e64ae9cd9990c40bd635"},
{file = "mypy-1.12.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:5fcde63ea2c9f69d6be859a1e6dd35955e87fa81de95bc240143cf00de1f7f81"},
{file = "mypy-1.12.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:d54d840f6c052929f4a3d2aab2066af0f45a020b085fe0e40d4583db52aab4e4"},
{file = "mypy-1.12.1-cp313-cp313-win_amd64.whl", hash = "sha256:20db6eb1ca3d1de8ece00033b12f793f1ea9da767334b7e8c626a4872090cf02"},
{file = "mypy-1.12.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:b16fe09f9c741d85a2e3b14a5257a27a4f4886c171d562bc5a5e90d8591906b8"},
{file = "mypy-1.12.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:0dcc1e843d58f444fce19da4cce5bd35c282d4bde232acdeca8279523087088a"},
{file = "mypy-1.12.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e10ba7de5c616e44ad21005fa13450cd0de7caaa303a626147d45307492e4f2d"},
{file = "mypy-1.12.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:0e6fe449223fa59fbee351db32283838a8fee8059e0028e9e6494a03802b4004"},
{file = "mypy-1.12.1-cp38-cp38-win_amd64.whl", hash = "sha256:dc6e2a2195a290a7fd5bac3e60b586d77fc88e986eba7feced8b778c373f9afe"},
{file = "mypy-1.12.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:de5b2a8988b4e1269a98beaf0e7cc71b510d050dce80c343b53b4955fff45f19"},
{file = "mypy-1.12.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:843826966f1d65925e8b50d2b483065c51fc16dc5d72647e0236aae51dc8d77e"},
{file = "mypy-1.12.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9fe20f89da41a95e14c34b1ddb09c80262edcc295ad891f22cc4b60013e8f78d"},
{file = "mypy-1.12.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:8135ffec02121a75f75dc97c81af7c14aa4ae0dda277132cfcd6abcd21551bfd"},
{file = "mypy-1.12.1-cp39-cp39-win_amd64.whl", hash = "sha256:a7b76fa83260824300cc4834a3ab93180db19876bce59af921467fd03e692810"},
{file = "mypy-1.12.1-py3-none-any.whl", hash = "sha256:ce561a09e3bb9863ab77edf29ae3a50e65685ad74bba1431278185b7e5d5486e"},
{file = "mypy-1.12.1.tar.gz", hash = "sha256:f5b3936f7a6d0e8280c9bdef94c7ce4847f5cdfc258fbb2c29a8c1711e8bb96d"},
{file = "mypy-1.13.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:6607e0f1dd1fb7f0aca14d936d13fd19eba5e17e1cd2a14f808fa5f8f6d8f60a"},
{file = "mypy-1.13.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:8a21be69bd26fa81b1f80a61ee7ab05b076c674d9b18fb56239d72e21d9f4c80"},
{file = "mypy-1.13.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:7b2353a44d2179846a096e25691d54d59904559f4232519d420d64da6828a3a7"},
{file = "mypy-1.13.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:0730d1c6a2739d4511dc4253f8274cdd140c55c32dfb0a4cf8b7a43f40abfa6f"},
{file = "mypy-1.13.0-cp310-cp310-win_amd64.whl", hash = "sha256:c5fc54dbb712ff5e5a0fca797e6e0aa25726c7e72c6a5850cfd2adbc1eb0a372"},
{file = "mypy-1.13.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:581665e6f3a8a9078f28d5502f4c334c0c8d802ef55ea0e7276a6e409bc0d82d"},
{file = "mypy-1.13.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:3ddb5b9bf82e05cc9a627e84707b528e5c7caaa1c55c69e175abb15a761cec2d"},
{file = "mypy-1.13.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:20c7ee0bc0d5a9595c46f38beb04201f2620065a93755704e141fcac9f59db2b"},
{file = "mypy-1.13.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:3790ded76f0b34bc9c8ba4def8f919dd6a46db0f5a6610fb994fe8efdd447f73"},
{file = "mypy-1.13.0-cp311-cp311-win_amd64.whl", hash = "sha256:51f869f4b6b538229c1d1bcc1dd7d119817206e2bc54e8e374b3dfa202defcca"},
{file = "mypy-1.13.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:5c7051a3461ae84dfb5dd15eff5094640c61c5f22257c8b766794e6dd85e72d5"},
{file = "mypy-1.13.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:39bb21c69a5d6342f4ce526e4584bc5c197fd20a60d14a8624d8743fffb9472e"},
{file = "mypy-1.13.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:164f28cb9d6367439031f4c81e84d3ccaa1e19232d9d05d37cb0bd880d3f93c2"},
{file = "mypy-1.13.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:a4c1bfcdbce96ff5d96fc9b08e3831acb30dc44ab02671eca5953eadad07d6d0"},
{file = "mypy-1.13.0-cp312-cp312-win_amd64.whl", hash = "sha256:a0affb3a79a256b4183ba09811e3577c5163ed06685e4d4b46429a271ba174d2"},
{file = "mypy-1.13.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:a7b44178c9760ce1a43f544e595d35ed61ac2c3de306599fa59b38a6048e1aa7"},
{file = "mypy-1.13.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:5d5092efb8516d08440e36626f0153b5006d4088c1d663d88bf79625af3d1d62"},
{file = "mypy-1.13.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:de2904956dac40ced10931ac967ae63c5089bd498542194b436eb097a9f77bc8"},
{file = "mypy-1.13.0-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:7bfd8836970d33c2105562650656b6846149374dc8ed77d98424b40b09340ba7"},
{file = "mypy-1.13.0-cp313-cp313-win_amd64.whl", hash = "sha256:9f73dba9ec77acb86457a8fc04b5239822df0c14a082564737833d2963677dbc"},
{file = "mypy-1.13.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:100fac22ce82925f676a734af0db922ecfea991e1d7ec0ceb1e115ebe501301a"},
{file = "mypy-1.13.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:7bcb0bb7f42a978bb323a7c88f1081d1b5dee77ca86f4100735a6f541299d8fb"},
{file = "mypy-1.13.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bde31fc887c213e223bbfc34328070996061b0833b0a4cfec53745ed61f3519b"},
{file = "mypy-1.13.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:07de989f89786f62b937851295ed62e51774722e5444a27cecca993fc3f9cd74"},
{file = "mypy-1.13.0-cp38-cp38-win_amd64.whl", hash = "sha256:4bde84334fbe19bad704b3f5b78c4abd35ff1026f8ba72b29de70dda0916beb6"},
{file = "mypy-1.13.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:0246bcb1b5de7f08f2826451abd947bf656945209b140d16ed317f65a17dc7dc"},
{file = "mypy-1.13.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:7f5b7deae912cf8b77e990b9280f170381fdfbddf61b4ef80927edd813163732"},
{file = "mypy-1.13.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:7029881ec6ffb8bc233a4fa364736789582c738217b133f1b55967115288a2bc"},
{file = "mypy-1.13.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:3e38b980e5681f28f033f3be86b099a247b13c491f14bb8b1e1e134d23bb599d"},
{file = "mypy-1.13.0-cp39-cp39-win_amd64.whl", hash = "sha256:a6789be98a2017c912ae6ccb77ea553bbaf13d27605d2ca20a76dfbced631b24"},
{file = "mypy-1.13.0-py3-none-any.whl", hash = "sha256:9c250883f9fd81d212e0952c92dbfcc96fc237f4b7c92f56ac81fd48460b3e5a"},
{file = "mypy-1.13.0.tar.gz", hash = "sha256:0291a61b6fbf3e6673e3405cfcc0e7650bebc7939659fdca2702958038bd835e"},
]
[package.dependencies]
@@ -1360,6 +1392,7 @@ typing-extensions = ">=4.6.0"
[package.extras]
dmypy = ["psutil (>=4.0)"]
faster-cache = ["orjson"]
install-types = ["pip"]
mypyc = ["setuptools (>=50)"]
reports = ["lxml"]
@@ -1459,110 +1492,92 @@ files = [
[[package]]
name = "pillow"
version = "11.0.0"
version = "11.1.0"
description = "Python Imaging Library (Fork)"
optional = false
python-versions = ">=3.9"
files = [
{file = "pillow-11.0.0-cp310-cp310-macosx_10_10_x86_64.whl", hash = "sha256:6619654954dc4936fcff82db8eb6401d3159ec6be81e33c6000dfd76ae189947"},
{file = "pillow-11.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:b3c5ac4bed7519088103d9450a1107f76308ecf91d6dabc8a33a2fcfb18d0fba"},
{file = "pillow-11.0.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a65149d8ada1055029fcb665452b2814fe7d7082fcb0c5bed6db851cb69b2086"},
{file = "pillow-11.0.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:88a58d8ac0cc0e7f3a014509f0455248a76629ca9b604eca7dc5927cc593c5e9"},
{file = "pillow-11.0.0-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:c26845094b1af3c91852745ae78e3ea47abf3dbcd1cf962f16b9a5fbe3ee8488"},
{file = "pillow-11.0.0-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:1a61b54f87ab5786b8479f81c4b11f4d61702830354520837f8cc791ebba0f5f"},
{file = "pillow-11.0.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:674629ff60030d144b7bca2b8330225a9b11c482ed408813924619c6f302fdbb"},
{file = "pillow-11.0.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:598b4e238f13276e0008299bd2482003f48158e2b11826862b1eb2ad7c768b97"},
{file = "pillow-11.0.0-cp310-cp310-win32.whl", hash = "sha256:9a0f748eaa434a41fccf8e1ee7a3eed68af1b690e75328fd7a60af123c193b50"},
{file = "pillow-11.0.0-cp310-cp310-win_amd64.whl", hash = "sha256:a5629742881bcbc1f42e840af185fd4d83a5edeb96475a575f4da50d6ede337c"},
{file = "pillow-11.0.0-cp310-cp310-win_arm64.whl", hash = "sha256:ee217c198f2e41f184f3869f3e485557296d505b5195c513b2bfe0062dc537f1"},
{file = "pillow-11.0.0-cp311-cp311-macosx_10_10_x86_64.whl", hash = "sha256:1c1d72714f429a521d8d2d018badc42414c3077eb187a59579f28e4270b4b0fc"},
{file = "pillow-11.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:499c3a1b0d6fc8213519e193796eb1a86a1be4b1877d678b30f83fd979811d1a"},
{file = "pillow-11.0.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c8b2351c85d855293a299038e1f89db92a2f35e8d2f783489c6f0b2b5f3fe8a3"},
{file = "pillow-11.0.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6f4dba50cfa56f910241eb7f883c20f1e7b1d8f7d91c750cd0b318bad443f4d5"},
{file = "pillow-11.0.0-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:5ddbfd761ee00c12ee1be86c9c0683ecf5bb14c9772ddbd782085779a63dd55b"},
{file = "pillow-11.0.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:45c566eb10b8967d71bf1ab8e4a525e5a93519e29ea071459ce517f6b903d7fa"},
{file = "pillow-11.0.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:b4fd7bd29610a83a8c9b564d457cf5bd92b4e11e79a4ee4716a63c959699b306"},
{file = "pillow-11.0.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:cb929ca942d0ec4fac404cbf520ee6cac37bf35be479b970c4ffadf2b6a1cad9"},
{file = "pillow-11.0.0-cp311-cp311-win32.whl", hash = "sha256:006bcdd307cc47ba43e924099a038cbf9591062e6c50e570819743f5607404f5"},
{file = "pillow-11.0.0-cp311-cp311-win_amd64.whl", hash = "sha256:52a2d8323a465f84faaba5236567d212c3668f2ab53e1c74c15583cf507a0291"},
{file = "pillow-11.0.0-cp311-cp311-win_arm64.whl", hash = "sha256:16095692a253047fe3ec028e951fa4221a1f3ed3d80c397e83541a3037ff67c9"},
{file = "pillow-11.0.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:d2c0a187a92a1cb5ef2c8ed5412dd8d4334272617f532d4ad4de31e0495bd923"},
{file = "pillow-11.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:084a07ef0821cfe4858fe86652fffac8e187b6ae677e9906e192aafcc1b69903"},
{file = "pillow-11.0.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8069c5179902dcdce0be9bfc8235347fdbac249d23bd90514b7a47a72d9fecf4"},
{file = "pillow-11.0.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f02541ef64077f22bf4924f225c0fd1248c168f86e4b7abdedd87d6ebaceab0f"},
{file = "pillow-11.0.0-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:fcb4621042ac4b7865c179bb972ed0da0218a076dc1820ffc48b1d74c1e37fe9"},
{file = "pillow-11.0.0-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:00177a63030d612148e659b55ba99527803288cea7c75fb05766ab7981a8c1b7"},
{file = "pillow-11.0.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:8853a3bf12afddfdf15f57c4b02d7ded92c7a75a5d7331d19f4f9572a89c17e6"},
{file = "pillow-11.0.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:3107c66e43bda25359d5ef446f59c497de2b5ed4c7fdba0894f8d6cf3822dafc"},
{file = "pillow-11.0.0-cp312-cp312-win32.whl", hash = "sha256:86510e3f5eca0ab87429dd77fafc04693195eec7fd6a137c389c3eeb4cfb77c6"},
{file = "pillow-11.0.0-cp312-cp312-win_amd64.whl", hash = "sha256:8ec4a89295cd6cd4d1058a5e6aec6bf51e0eaaf9714774e1bfac7cfc9051db47"},
{file = "pillow-11.0.0-cp312-cp312-win_arm64.whl", hash = "sha256:27a7860107500d813fcd203b4ea19b04babe79448268403172782754870dac25"},
{file = "pillow-11.0.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:bcd1fb5bb7b07f64c15618c89efcc2cfa3e95f0e3bcdbaf4642509de1942a699"},
{file = "pillow-11.0.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:0e038b0745997c7dcaae350d35859c9715c71e92ffb7e0f4a8e8a16732150f38"},
{file = "pillow-11.0.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0ae08bd8ffc41aebf578c2af2f9d8749d91f448b3bfd41d7d9ff573d74f2a6b2"},
{file = "pillow-11.0.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d69bfd8ec3219ae71bcde1f942b728903cad25fafe3100ba2258b973bd2bc1b2"},
{file = "pillow-11.0.0-cp313-cp313-manylinux_2_28_aarch64.whl", hash = "sha256:61b887f9ddba63ddf62fd02a3ba7add935d053b6dd7d58998c630e6dbade8527"},
{file = "pillow-11.0.0-cp313-cp313-manylinux_2_28_x86_64.whl", hash = "sha256:c6a660307ca9d4867caa8d9ca2c2658ab685de83792d1876274991adec7b93fa"},
{file = "pillow-11.0.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:73e3a0200cdda995c7e43dd47436c1548f87a30bb27fb871f352a22ab8dcf45f"},
{file = "pillow-11.0.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:fba162b8872d30fea8c52b258a542c5dfd7b235fb5cb352240c8d63b414013eb"},
{file = "pillow-11.0.0-cp313-cp313-win32.whl", hash = "sha256:f1b82c27e89fffc6da125d5eb0ca6e68017faf5efc078128cfaa42cf5cb38798"},
{file = "pillow-11.0.0-cp313-cp313-win_amd64.whl", hash = "sha256:8ba470552b48e5835f1d23ecb936bb7f71d206f9dfeee64245f30c3270b994de"},
{file = "pillow-11.0.0-cp313-cp313-win_arm64.whl", hash = "sha256:846e193e103b41e984ac921b335df59195356ce3f71dcfd155aa79c603873b84"},
{file = "pillow-11.0.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:4ad70c4214f67d7466bea6a08061eba35c01b1b89eaa098040a35272a8efb22b"},
{file = "pillow-11.0.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:6ec0d5af64f2e3d64a165f490d96368bb5dea8b8f9ad04487f9ab60dc4bb6003"},
{file = "pillow-11.0.0-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c809a70e43c7977c4a42aefd62f0131823ebf7dd73556fa5d5950f5b354087e2"},
{file = "pillow-11.0.0-cp313-cp313t-manylinux_2_28_x86_64.whl", hash = "sha256:4b60c9520f7207aaf2e1d94de026682fc227806c6e1f55bba7606d1c94dd623a"},
{file = "pillow-11.0.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:1e2688958a840c822279fda0086fec1fdab2f95bf2b717b66871c4ad9859d7e8"},
{file = "pillow-11.0.0-cp313-cp313t-win32.whl", hash = "sha256:607bbe123c74e272e381a8d1957083a9463401f7bd01287f50521ecb05a313f8"},
{file = "pillow-11.0.0-cp313-cp313t-win_amd64.whl", hash = "sha256:5c39ed17edea3bc69c743a8dd3e9853b7509625c2462532e62baa0732163a904"},
{file = "pillow-11.0.0-cp313-cp313t-win_arm64.whl", hash = "sha256:75acbbeb05b86bc53cbe7b7e6fe00fbcf82ad7c684b3ad82e3d711da9ba287d3"},
{file = "pillow-11.0.0-cp39-cp39-macosx_10_10_x86_64.whl", hash = "sha256:2e46773dc9f35a1dd28bd6981332fd7f27bec001a918a72a79b4133cf5291dba"},
{file = "pillow-11.0.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:2679d2258b7f1192b378e2893a8a0a0ca472234d4c2c0e6bdd3380e8dfa21b6a"},
{file = "pillow-11.0.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:eda2616eb2313cbb3eebbe51f19362eb434b18e3bb599466a1ffa76a033fb916"},
{file = "pillow-11.0.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:20ec184af98a121fb2da42642dea8a29ec80fc3efbaefb86d8fdd2606619045d"},
{file = "pillow-11.0.0-cp39-cp39-manylinux_2_28_aarch64.whl", hash = "sha256:8594f42df584e5b4bb9281799698403f7af489fba84c34d53d1c4bfb71b7c4e7"},
{file = "pillow-11.0.0-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:c12b5ae868897c7338519c03049a806af85b9b8c237b7d675b8c5e089e4a618e"},
{file = "pillow-11.0.0-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:70fbbdacd1d271b77b7721fe3cdd2d537bbbd75d29e6300c672ec6bb38d9672f"},
{file = "pillow-11.0.0-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:5178952973e588b3f1360868847334e9e3bf49d19e169bbbdfaf8398002419ae"},
{file = "pillow-11.0.0-cp39-cp39-win32.whl", hash = "sha256:8c676b587da5673d3c75bd67dd2a8cdfeb282ca38a30f37950511766b26858c4"},
{file = "pillow-11.0.0-cp39-cp39-win_amd64.whl", hash = "sha256:94f3e1780abb45062287b4614a5bc0874519c86a777d4a7ad34978e86428b8dd"},
{file = "pillow-11.0.0-cp39-cp39-win_arm64.whl", hash = "sha256:290f2cc809f9da7d6d622550bbf4c1e57518212da51b6a30fe8e0a270a5b78bd"},
{file = "pillow-11.0.0-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:1187739620f2b365de756ce086fdb3604573337cc28a0d3ac4a01ab6b2d2a6d2"},
{file = "pillow-11.0.0-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:fbbcb7b57dc9c794843e3d1258c0fbf0f48656d46ffe9e09b63bbd6e8cd5d0a2"},
{file = "pillow-11.0.0-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5d203af30149ae339ad1b4f710d9844ed8796e97fda23ffbc4cc472968a47d0b"},
{file = "pillow-11.0.0-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:21a0d3b115009ebb8ac3d2ebec5c2982cc693da935f4ab7bb5c8ebe2f47d36f2"},
{file = "pillow-11.0.0-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:73853108f56df97baf2bb8b522f3578221e56f646ba345a372c78326710d3830"},
{file = "pillow-11.0.0-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:e58876c91f97b0952eb766123bfef372792ab3f4e3e1f1a2267834c2ab131734"},
{file = "pillow-11.0.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:224aaa38177597bb179f3ec87eeefcce8e4f85e608025e9cfac60de237ba6316"},
{file = "pillow-11.0.0-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:5bd2d3bdb846d757055910f0a59792d33b555800813c3b39ada1829c372ccb06"},
{file = "pillow-11.0.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:375b8dd15a1f5d2feafff536d47e22f69625c1aa92f12b339ec0b2ca40263273"},
{file = "pillow-11.0.0-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:daffdf51ee5db69a82dd127eabecce20729e21f7a3680cf7cbb23f0829189790"},
{file = "pillow-11.0.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:7326a1787e3c7b0429659e0a944725e1b03eeaa10edd945a86dead1913383944"},
{file = "pillow-11.0.0.tar.gz", hash = "sha256:72bacbaf24ac003fea9bff9837d1eedb6088758d41e100c1552930151f677739"},
{file = "pillow-11.1.0-cp310-cp310-macosx_10_10_x86_64.whl", hash = "sha256:e1abe69aca89514737465752b4bcaf8016de61b3be1397a8fc260ba33321b3a8"},
{file = "pillow-11.1.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:c640e5a06869c75994624551f45e5506e4256562ead981cce820d5ab39ae2192"},
{file = "pillow-11.1.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a07dba04c5e22824816b2615ad7a7484432d7f540e6fa86af60d2de57b0fcee2"},
{file = "pillow-11.1.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e267b0ed063341f3e60acd25c05200df4193e15a4a5807075cd71225a2386e26"},
{file = "pillow-11.1.0-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:bd165131fd51697e22421d0e467997ad31621b74bfc0b75956608cb2906dda07"},
{file = "pillow-11.1.0-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:abc56501c3fd148d60659aae0af6ddc149660469082859fa7b066a298bde9482"},
{file = "pillow-11.1.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:54ce1c9a16a9561b6d6d8cb30089ab1e5eb66918cb47d457bd996ef34182922e"},
{file = "pillow-11.1.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:73ddde795ee9b06257dac5ad42fcb07f3b9b813f8c1f7f870f402f4dc54b5269"},
{file = "pillow-11.1.0-cp310-cp310-win32.whl", hash = "sha256:3a5fe20a7b66e8135d7fd617b13272626a28278d0e578c98720d9ba4b2439d49"},
{file = "pillow-11.1.0-cp310-cp310-win_amd64.whl", hash = "sha256:b6123aa4a59d75f06e9dd3dac5bf8bc9aa383121bb3dd9a7a612e05eabc9961a"},
{file = "pillow-11.1.0-cp310-cp310-win_arm64.whl", hash = "sha256:a76da0a31da6fcae4210aa94fd779c65c75786bc9af06289cd1c184451ef7a65"},
{file = "pillow-11.1.0-cp311-cp311-macosx_10_10_x86_64.whl", hash = "sha256:e06695e0326d05b06833b40b7ef477e475d0b1ba3a6d27da1bb48c23209bf457"},
{file = "pillow-11.1.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:96f82000e12f23e4f29346e42702b6ed9a2f2fea34a740dd5ffffcc8c539eb35"},
{file = "pillow-11.1.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a3cd561ded2cf2bbae44d4605837221b987c216cff94f49dfeed63488bb228d2"},
{file = "pillow-11.1.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f189805c8be5ca5add39e6f899e6ce2ed824e65fb45f3c28cb2841911da19070"},
{file = "pillow-11.1.0-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:dd0052e9db3474df30433f83a71b9b23bd9e4ef1de13d92df21a52c0303b8ab6"},
{file = "pillow-11.1.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:837060a8599b8f5d402e97197d4924f05a2e0d68756998345c829c33186217b1"},
{file = "pillow-11.1.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:aa8dd43daa836b9a8128dbe7d923423e5ad86f50a7a14dc688194b7be5c0dea2"},
{file = "pillow-11.1.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:0a2f91f8a8b367e7a57c6e91cd25af510168091fb89ec5146003e424e1558a96"},
{file = "pillow-11.1.0-cp311-cp311-win32.whl", hash = "sha256:c12fc111ef090845de2bb15009372175d76ac99969bdf31e2ce9b42e4b8cd88f"},
{file = "pillow-11.1.0-cp311-cp311-win_amd64.whl", hash = "sha256:fbd43429d0d7ed6533b25fc993861b8fd512c42d04514a0dd6337fb3ccf22761"},
{file = "pillow-11.1.0-cp311-cp311-win_arm64.whl", hash = "sha256:f7955ecf5609dee9442cbface754f2c6e541d9e6eda87fad7f7a989b0bdb9d71"},
{file = "pillow-11.1.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:2062ffb1d36544d42fcaa277b069c88b01bb7298f4efa06731a7fd6cc290b81a"},
{file = "pillow-11.1.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:a85b653980faad27e88b141348707ceeef8a1186f75ecc600c395dcac19f385b"},
{file = "pillow-11.1.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9409c080586d1f683df3f184f20e36fb647f2e0bc3988094d4fd8c9f4eb1b3b3"},
{file = "pillow-11.1.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7fdadc077553621911f27ce206ffcbec7d3f8d7b50e0da39f10997e8e2bb7f6a"},
{file = "pillow-11.1.0-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:93a18841d09bcdd774dcdc308e4537e1f867b3dec059c131fde0327899734aa1"},
{file = "pillow-11.1.0-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:9aa9aeddeed452b2f616ff5507459e7bab436916ccb10961c4a382cd3e03f47f"},
{file = "pillow-11.1.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:3cdcdb0b896e981678eee140d882b70092dac83ac1cdf6b3a60e2216a73f2b91"},
{file = "pillow-11.1.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:36ba10b9cb413e7c7dfa3e189aba252deee0602c86c309799da5a74009ac7a1c"},
{file = "pillow-11.1.0-cp312-cp312-win32.whl", hash = "sha256:cfd5cd998c2e36a862d0e27b2df63237e67273f2fc78f47445b14e73a810e7e6"},
{file = "pillow-11.1.0-cp312-cp312-win_amd64.whl", hash = "sha256:a697cd8ba0383bba3d2d3ada02b34ed268cb548b369943cd349007730c92bddf"},
{file = "pillow-11.1.0-cp312-cp312-win_arm64.whl", hash = "sha256:4dd43a78897793f60766563969442020e90eb7847463eca901e41ba186a7d4a5"},
{file = "pillow-11.1.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:ae98e14432d458fc3de11a77ccb3ae65ddce70f730e7c76140653048c71bfcbc"},
{file = "pillow-11.1.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:cc1331b6d5a6e144aeb5e626f4375f5b7ae9934ba620c0ac6b3e43d5e683a0f0"},
{file = "pillow-11.1.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:758e9d4ef15d3560214cddbc97b8ef3ef86ce04d62ddac17ad39ba87e89bd3b1"},
{file = "pillow-11.1.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b523466b1a31d0dcef7c5be1f20b942919b62fd6e9a9be199d035509cbefc0ec"},
{file = "pillow-11.1.0-cp313-cp313-manylinux_2_28_aarch64.whl", hash = "sha256:9044b5e4f7083f209c4e35aa5dd54b1dd5b112b108648f5c902ad586d4f945c5"},
{file = "pillow-11.1.0-cp313-cp313-manylinux_2_28_x86_64.whl", hash = "sha256:3764d53e09cdedd91bee65c2527815d315c6b90d7b8b79759cc48d7bf5d4f114"},
{file = "pillow-11.1.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:31eba6bbdd27dde97b0174ddf0297d7a9c3a507a8a1480e1e60ef914fe23d352"},
{file = "pillow-11.1.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:b5d658fbd9f0d6eea113aea286b21d3cd4d3fd978157cbf2447a6035916506d3"},
{file = "pillow-11.1.0-cp313-cp313-win32.whl", hash = "sha256:f86d3a7a9af5d826744fabf4afd15b9dfef44fe69a98541f666f66fbb8d3fef9"},
{file = "pillow-11.1.0-cp313-cp313-win_amd64.whl", hash = "sha256:593c5fd6be85da83656b93ffcccc2312d2d149d251e98588b14fbc288fd8909c"},
{file = "pillow-11.1.0-cp313-cp313-win_arm64.whl", hash = "sha256:11633d58b6ee5733bde153a8dafd25e505ea3d32e261accd388827ee987baf65"},
{file = "pillow-11.1.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:70ca5ef3b3b1c4a0812b5c63c57c23b63e53bc38e758b37a951e5bc466449861"},
{file = "pillow-11.1.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:8000376f139d4d38d6851eb149b321a52bb8893a88dae8ee7d95840431977081"},
{file = "pillow-11.1.0-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9ee85f0696a17dd28fbcfceb59f9510aa71934b483d1f5601d1030c3c8304f3c"},
{file = "pillow-11.1.0-cp313-cp313t-manylinux_2_28_x86_64.whl", hash = "sha256:dd0e081319328928531df7a0e63621caf67652c8464303fd102141b785ef9547"},
{file = "pillow-11.1.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:e63e4e5081de46517099dc30abe418122f54531a6ae2ebc8680bcd7096860eab"},
{file = "pillow-11.1.0-cp313-cp313t-win32.whl", hash = "sha256:dda60aa465b861324e65a78c9f5cf0f4bc713e4309f83bc387be158b077963d9"},
{file = "pillow-11.1.0-cp313-cp313t-win_amd64.whl", hash = "sha256:ad5db5781c774ab9a9b2c4302bbf0c1014960a0a7be63278d13ae6fdf88126fe"},
{file = "pillow-11.1.0-cp313-cp313t-win_arm64.whl", hash = "sha256:67cd427c68926108778a9005f2a04adbd5e67c442ed21d95389fe1d595458756"},
{file = "pillow-11.1.0-cp39-cp39-macosx_10_10_x86_64.whl", hash = "sha256:bf902d7413c82a1bfa08b06a070876132a5ae6b2388e2712aab3a7cbc02205c6"},
{file = "pillow-11.1.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:c1eec9d950b6fe688edee07138993e54ee4ae634c51443cfb7c1e7613322718e"},
{file = "pillow-11.1.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8e275ee4cb11c262bd108ab2081f750db2a1c0b8c12c1897f27b160c8bd57bbc"},
{file = "pillow-11.1.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4db853948ce4e718f2fc775b75c37ba2efb6aaea41a1a5fc57f0af59eee774b2"},
{file = "pillow-11.1.0-cp39-cp39-manylinux_2_28_aarch64.whl", hash = "sha256:ab8a209b8485d3db694fa97a896d96dd6533d63c22829043fd9de627060beade"},
{file = "pillow-11.1.0-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:54251ef02a2309b5eec99d151ebf5c9904b77976c8abdcbce7891ed22df53884"},
{file = "pillow-11.1.0-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:5bb94705aea800051a743aa4874bb1397d4695fb0583ba5e425ee0328757f196"},
{file = "pillow-11.1.0-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:89dbdb3e6e9594d512780a5a1c42801879628b38e3efc7038094430844e271d8"},
{file = "pillow-11.1.0-cp39-cp39-win32.whl", hash = "sha256:e5449ca63da169a2e6068dd0e2fcc8d91f9558aba89ff6d02121ca8ab11e79e5"},
{file = "pillow-11.1.0-cp39-cp39-win_amd64.whl", hash = "sha256:3362c6ca227e65c54bf71a5f88b3d4565ff1bcbc63ae72c34b07bbb1cc59a43f"},
{file = "pillow-11.1.0-cp39-cp39-win_arm64.whl", hash = "sha256:b20be51b37a75cc54c2c55def3fa2c65bb94ba859dde241cd0a4fd302de5ae0a"},
{file = "pillow-11.1.0-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:8c730dc3a83e5ac137fbc92dfcfe1511ce3b2b5d7578315b63dbbb76f7f51d90"},
{file = "pillow-11.1.0-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:7d33d2fae0e8b170b6a6c57400e077412240f6f5bb2a342cf1ee512a787942bb"},
{file = "pillow-11.1.0-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a8d65b38173085f24bc07f8b6c505cbb7418009fa1a1fcb111b1f4961814a442"},
{file = "pillow-11.1.0-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:015c6e863faa4779251436db398ae75051469f7c903b043a48f078e437656f83"},
{file = "pillow-11.1.0-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:d44ff19eea13ae4acdaaab0179fa68c0c6f2f45d66a4d8ec1eda7d6cecbcc15f"},
{file = "pillow-11.1.0-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:d3d8da4a631471dfaf94c10c85f5277b1f8e42ac42bade1ac67da4b4a7359b73"},
{file = "pillow-11.1.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:4637b88343166249fe8aa94e7c4a62a180c4b3898283bb5d3d2fd5fe10d8e4e0"},
{file = "pillow-11.1.0.tar.gz", hash = "sha256:368da70808b36d73b4b390a8ffac11069f8a5c85f29eff1f1b01bcf3ef5b2a20"},
]
[package.extras]
docs = ["furo", "olefile", "sphinx (>=8.1)", "sphinx-copybutton", "sphinx-inline-tabs", "sphinxext-opengraph"]
fpx = ["olefile"]
mic = ["olefile"]
tests = ["check-manifest", "coverage", "defusedxml", "markdown2", "olefile", "packaging", "pyroma", "pytest", "pytest-cov", "pytest-timeout"]
tests = ["check-manifest", "coverage (>=7.4.2)", "defusedxml", "markdown2", "olefile", "packaging", "pyroma", "pytest", "pytest-cov", "pytest-timeout", "trove-classifiers (>=2024.10.12)"]
typing = ["typing-extensions"]
xmp = ["defusedxml"]
[[package]]
name = "pkginfo"
version = "1.9.6"
description = "Query metadata from sdists / bdists / installed packages."
optional = false
python-versions = ">=3.6"
files = [
{file = "pkginfo-1.9.6-py3-none-any.whl", hash = "sha256:4b7a555a6d5a22169fcc9cf7bfd78d296b0361adad412a346c1226849af5e546"},
{file = "pkginfo-1.9.6.tar.gz", hash = "sha256:8fd5896e8718a4372f0ea9cc9d96f6417c9b986e23a4d116dda26b62cc29d046"},
]
[package.extras]
testing = ["pytest", "pytest-cov"]
[[package]]
name = "prometheus-client"
version = "0.21.0"
@@ -1590,6 +1605,7 @@ files = [
{file = "psycopg2-2.9.10-cp311-cp311-win_amd64.whl", hash = "sha256:0435034157049f6846e95103bd8f5a668788dd913a7c30162ca9503fdf542cb4"},
{file = "psycopg2-2.9.10-cp312-cp312-win32.whl", hash = "sha256:65a63d7ab0e067e2cdb3cf266de39663203d38d6a8ed97f5ca0cb315c73fe067"},
{file = "psycopg2-2.9.10-cp312-cp312-win_amd64.whl", hash = "sha256:4a579d6243da40a7b3182e0430493dbd55950c493d8c68f4eec0b302f6bbf20e"},
{file = "psycopg2-2.9.10-cp313-cp313-win_amd64.whl", hash = "sha256:91fd603a2155da8d0cfcdbf8ab24a2d54bca72795b90d2a3ed2b6da8d979dee2"},
{file = "psycopg2-2.9.10-cp39-cp39-win32.whl", hash = "sha256:9d5b3b94b79a844a986d029eee38998232451119ad653aea42bb9220a8c5066b"},
{file = "psycopg2-2.9.10-cp39-cp39-win_amd64.whl", hash = "sha256:88138c8dedcbfa96408023ea2b0c369eda40fe5d75002c0964c78f46f11fa442"},
{file = "psycopg2-2.9.10.tar.gz", hash = "sha256:12ec0b40b0273f95296233e8750441339298e6a572f7039da5b260e3c8b60e11"},
@@ -1825,12 +1841,12 @@ plugins = ["importlib-metadata"]
[[package]]
name = "pyicu"
version = "2.13.1"
version = "2.14"
description = "Python extension wrapping the ICU C++ API"
optional = true
python-versions = "*"
files = [
{file = "PyICU-2.13.1.tar.gz", hash = "sha256:d4919085eaa07da12bade8ee721e7bbf7ade0151ca0f82946a26c8f4b98cdceb"},
{file = "PyICU-2.14.tar.gz", hash = "sha256:acc7eb92bd5c554ed577249c6978450a4feda0aa6f01470152b3a7b382a02132"},
]
[[package]]
@@ -1907,20 +1923,20 @@ tests = ["hypothesis (>=3.27.0)", "pytest (>=3.2.1,!=3.3.0)"]
[[package]]
name = "pyopenssl"
version = "24.2.1"
version = "24.3.0"
description = "Python wrapper module around the OpenSSL library"
optional = false
python-versions = ">=3.7"
files = [
{file = "pyOpenSSL-24.2.1-py3-none-any.whl", hash = "sha256:967d5719b12b243588573f39b0c677637145c7a1ffedcd495a487e58177fbb8d"},
{file = "pyopenssl-24.2.1.tar.gz", hash = "sha256:4247f0dbe3748d560dcbb2ff3ea01af0f9a1a001ef5f7c4c647956ed8cbf0e95"},
{file = "pyOpenSSL-24.3.0-py3-none-any.whl", hash = "sha256:e474f5a473cd7f92221cc04976e48f4d11502804657a08a989fb3be5514c904a"},
{file = "pyopenssl-24.3.0.tar.gz", hash = "sha256:49f7a019577d834746bc55c5fce6ecbcec0f2b4ec5ce1cf43a9a173b8138bb36"},
]
[package.dependencies]
cryptography = ">=41.0.5,<44"
cryptography = ">=41.0.5,<45"
[package.extras]
docs = ["sphinx (!=5.2.0,!=5.2.0.post0,!=7.2.5)", "sphinx-rtd-theme"]
docs = ["sphinx (!=5.2.0,!=5.2.0.post0,!=7.2.5)", "sphinx_rtd_theme"]
test = ["pretend", "pytest (>=3.0.1)", "pytest-rerunfailures"]
[[package]]
@@ -1962,13 +1978,13 @@ six = ">=1.5"
[[package]]
name = "python-multipart"
version = "0.0.18"
version = "0.0.20"
description = "A streaming multipart parser for Python"
optional = false
python-versions = ">=3.8"
files = [
{file = "python_multipart-0.0.18-py3-none-any.whl", hash = "sha256:efe91480f485f6a361427a541db4796f9e1591afc0fb8e7a4ba06bfbc6708996"},
{file = "python_multipart-0.0.18.tar.gz", hash = "sha256:7a68db60c8bfb82e460637fa4750727b45af1d5e2ed215593f917f64694d34fe"},
{file = "python_multipart-0.0.20-py3-none-any.whl", hash = "sha256:8a62d3a8335e06589fe01f2a3e178cdcc632f3fbe0d492ad9ee0ec35aab1f104"},
{file = "python_multipart-0.0.20.tar.gz", hash = "sha256:8dd0cab45b8e23064ae09147625994d090fa46f5b0d1e13af944c331a7fa9d13"},
]
[[package]]
@@ -2321,13 +2337,13 @@ doc = ["Sphinx", "sphinx-rtd-theme"]
[[package]]
name = "sentry-sdk"
version = "2.17.0"
version = "2.19.2"
description = "Python client for Sentry (https://sentry.io)"
optional = true
python-versions = ">=3.6"
files = [
{file = "sentry_sdk-2.17.0-py2.py3-none-any.whl", hash = "sha256:625955884b862cc58748920f9e21efdfb8e0d4f98cca4ab0d3918576d5b606ad"},
{file = "sentry_sdk-2.17.0.tar.gz", hash = "sha256:dd0a05352b78ffeacced73a94e86f38b32e2eae15fff5f30ca5abb568a72eacf"},
{file = "sentry_sdk-2.19.2-py2.py3-none-any.whl", hash = "sha256:ebdc08228b4d131128e568d696c210d846e5b9d70aa0327dec6b1272d9d40b84"},
{file = "sentry_sdk-2.19.2.tar.gz", hash = "sha256:467df6e126ba242d39952375dd816fbee0f217d119bf454a8ce74cf1e7909e8d"},
]
[package.dependencies]
@@ -2353,14 +2369,16 @@ grpcio = ["grpcio (>=1.21.1)", "protobuf (>=3.8.0)"]
http2 = ["httpcore[http2] (==1.*)"]
httpx = ["httpx (>=0.16.0)"]
huey = ["huey (>=2)"]
huggingface-hub = ["huggingface-hub (>=0.22)"]
huggingface-hub = ["huggingface_hub (>=0.22)"]
langchain = ["langchain (>=0.0.210)"]
launchdarkly = ["launchdarkly-server-sdk (>=9.8.0)"]
litestar = ["litestar (>=2.0.0)"]
loguru = ["loguru (>=0.5)"]
openai = ["openai (>=1.0.0)", "tiktoken (>=0.3.0)"]
openfeature = ["openfeature-sdk (>=0.7.1)"]
opentelemetry = ["opentelemetry-distro (>=0.35b0)"]
opentelemetry-experimental = ["opentelemetry-distro"]
pure-eval = ["asttokens", "executing", "pure-eval"]
pure-eval = ["asttokens", "executing", "pure_eval"]
pymongo = ["pymongo (>=3.1)"]
pyspark = ["pyspark (>=2.4.4)"]
quart = ["blinker (>=1.1)", "quart (>=0.16.1)"]
@@ -2373,13 +2391,13 @@ tornado = ["tornado (>=6)"]
[[package]]
name = "service-identity"
version = "24.1.0"
version = "24.2.0"
description = "Service identity verification for pyOpenSSL & cryptography."
optional = false
python-versions = ">=3.8"
files = [
{file = "service_identity-24.1.0-py3-none-any.whl", hash = "sha256:a28caf8130c8a5c1c7a6f5293faaf239bbfb7751e4862436920ee6f2616f568a"},
{file = "service_identity-24.1.0.tar.gz", hash = "sha256:6829c9d62fb832c2e1c435629b0a8c476e1929881f28bee4d20bc24161009221"},
{file = "service_identity-24.2.0-py3-none-any.whl", hash = "sha256:6b047fbd8a84fd0bb0d55ebce4031e400562b9196e1e0d3e0fe2b8a59f6d4a85"},
{file = "service_identity-24.2.0.tar.gz", hash = "sha256:b8683ba13f0d39c6cd5d625d2c5f65421d6d707b013b375c355751557cbe8e09"},
]
[package.dependencies]
@@ -2389,7 +2407,7 @@ pyasn1 = "*"
pyasn1-modules = "*"
[package.extras]
dev = ["pyopenssl", "service-identity[idna,mypy,tests]"]
dev = ["coverage[toml] (>=5.0.2)", "idna", "mypy", "pyopenssl", "pytest", "types-pyopenssl"]
docs = ["furo", "myst-parser", "pyopenssl", "sphinx", "sphinx-notfound-page"]
idna = ["idna"]
mypy = ["idna", "mypy", "types-pyopenssl"]
@@ -2627,20 +2645,20 @@ docs = ["sphinx (<7.0.0)"]
[[package]]
name = "twine"
version = "6.0.1"
version = "6.1.0"
description = "Collection of utilities for publishing packages on PyPI"
optional = false
python-versions = ">=3.8"
files = [
{file = "twine-6.0.1-py3-none-any.whl", hash = "sha256:9c6025b203b51521d53e200f4a08b116dee7500a38591668c6a6033117bdc218"},
{file = "twine-6.0.1.tar.gz", hash = "sha256:36158b09df5406e1c9c1fb8edb24fc2be387709443e7376689b938531582ee27"},
{file = "twine-6.1.0-py3-none-any.whl", hash = "sha256:a47f973caf122930bf0fbbf17f80b83bc1602c9ce393c7845f289a3001dc5384"},
{file = "twine-6.1.0.tar.gz", hash = "sha256:be324f6272eff91d07ee93f251edf232fc647935dd585ac003539b42404a8dbd"},
]
[package.dependencies]
id = "*"
importlib-metadata = {version = ">=3.6", markers = "python_version < \"3.10\""}
keyring = {version = ">=15.1", markers = "platform_machine != \"ppc64le\" and platform_machine != \"s390x\""}
packaging = "*"
pkginfo = ">=1.8.1"
packaging = ">=24.0"
readme-renderer = ">=35.0"
requests = ">=2.20"
requests-toolbelt = ">=0.8.0,<0.9.0 || >0.9.0"
@@ -2706,13 +2724,13 @@ twisted = "*"
[[package]]
name = "types-bleach"
version = "6.1.0.20240331"
version = "6.2.0.20241123"
description = "Typing stubs for bleach"
optional = false
python-versions = ">=3.8"
files = [
{file = "types-bleach-6.1.0.20240331.tar.gz", hash = "sha256:2ee858a84fb06fc2225ff56ba2f7f6c88b65638659efae0d7bfd6b24a1b5a524"},
{file = "types_bleach-6.1.0.20240331-py3-none-any.whl", hash = "sha256:399bc59bfd20a36a56595f13f805e56c8a08e5a5c07903e5cf6fafb5a5107dd4"},
{file = "types_bleach-6.2.0.20241123-py3-none-any.whl", hash = "sha256:c6e58b3646665ca7c6b29890375390f4569e84f0cf5c171e0fe1ddb71a7be86a"},
{file = "types_bleach-6.2.0.20241123.tar.gz", hash = "sha256:dac5fe9015173514da3ac810c1a935619a3ccbcc5d66c4cbf4707eac00539057"},
]
[package.dependencies]
@@ -2829,13 +2847,13 @@ types-cffi = "*"
[[package]]
name = "types-pyyaml"
version = "6.0.12.20240917"
version = "6.0.12.20241230"
description = "Typing stubs for PyYAML"
optional = false
python-versions = ">=3.8"
files = [
{file = "types-PyYAML-6.0.12.20240917.tar.gz", hash = "sha256:d1405a86f9576682234ef83bcb4e6fff7c9305c8b1fbad5e0bcd4f7dbdc9c587"},
{file = "types_PyYAML-6.0.12.20240917-py3-none-any.whl", hash = "sha256:392b267f1c0fe6022952462bf5d6523f31e37f6cea49b14cee7ad634b6301570"},
{file = "types_PyYAML-6.0.12.20241230-py3-none-any.whl", hash = "sha256:fa4d32565219b68e6dee5f67534c722e53c00d1cfc09c435ef04d7353e1e96e6"},
{file = "types_pyyaml-6.0.12.20241230.tar.gz", hash = "sha256:7f07622dbd34bb9c8b264fe860a17e0efcad00d50b5f27e93984909d9363498c"},
]
[[package]]

View File

@@ -97,7 +97,7 @@ module-name = "synapse.synapse_rust"
[tool.poetry]
name = "matrix-synapse"
version = "1.121.1"
version = "1.125.0"
description = "Homeserver for the Matrix decentralised comms protocol"
authors = ["Matrix.org Team and Contributors <packages@matrix.org>"]
license = "AGPL-3.0-or-later"

View File

@@ -34,7 +34,7 @@ pyo3 = { version = "0.23.2", features = [
"macros",
"anyhow",
"abi3",
"abi3-py38",
"abi3-py39",
] }
pyo3-log = "0.12.0"
pythonize = "0.23.0"

View File

@@ -42,12 +42,12 @@ from typing import (
Set,
Tuple,
Type,
TypedDict,
TypeVar,
cast,
)
import yaml
from typing_extensions import TypedDict
from twisted.internet import defer, reactor as reactor_

View File

@@ -18,9 +18,7 @@
# [This file includes modifications made by New Vector Limited]
#
#
from typing import TYPE_CHECKING, Optional, Tuple
from typing_extensions import Protocol
from typing import TYPE_CHECKING, Optional, Protocol, Tuple
from twisted.web.server import Request

View File

@@ -19,7 +19,7 @@
#
#
import logging
from typing import TYPE_CHECKING, Any, Dict, List, Optional
from typing import TYPE_CHECKING, Any, Callable, Dict, List, Optional
from urllib.parse import urlencode
from authlib.oauth2 import ClientAuth
@@ -119,7 +119,7 @@ class MSC3861DelegatedAuth(BaseAuth):
self._clock = hs.get_clock()
self._http_client = hs.get_proxied_http_client()
self._hostname = hs.hostname
self._admin_token = self._config.admin_token
self._admin_token: Callable[[], Optional[str]] = self._config.admin_token
self._issuer_metadata = RetryOnExceptionCachedCall[OpenIDProviderMetadata](
self._load_metadata
@@ -133,9 +133,10 @@ class MSC3861DelegatedAuth(BaseAuth):
)
else:
# Else use the client secret
assert self._config.client_secret, "No client_secret provided"
client_secret = self._config.client_secret()
assert client_secret, "No client_secret provided"
self._client_auth = ClientAuth(
self._config.client_id, self._config.client_secret, auth_method
self._config.client_id, client_secret, auth_method
)
async def _load_metadata(self) -> OpenIDProviderMetadata:
@@ -174,6 +175,12 @@ class MSC3861DelegatedAuth(BaseAuth):
logger.warning("Failed to load metadata:", exc_info=True)
return None
async def auth_metadata(self) -> Dict[str, Any]:
"""
Returns the auth metadata dict
"""
return await self._issuer_metadata.get()
async def _introspection_endpoint(self) -> str:
"""
Returns the introspection endpoint of the issuer
@@ -277,7 +284,7 @@ class MSC3861DelegatedAuth(BaseAuth):
requester = await self.get_user_by_access_token(access_token, allow_expired)
# Do not record requests from MAS using the virtual `__oidc_admin` user.
if access_token != self._admin_token:
if access_token != self._admin_token():
await self._record_request(request, requester)
if not allow_guest and requester.is_guest:
@@ -318,7 +325,8 @@ class MSC3861DelegatedAuth(BaseAuth):
token: str,
allow_expired: bool = False,
) -> Requester:
if self._admin_token is not None and token == self._admin_token:
admin_token = self._admin_token()
if admin_token is not None and token == admin_token:
# XXX: This is a temporary solution so that the admin API can be called by
# the OIDC provider. This will be removed once we have OIDC client
# credentials grant support in matrix-authentication-service.

View File

@@ -320,3 +320,8 @@ class ApprovalNoticeMedium:
class Direction(enum.Enum):
BACKWARDS = "b"
FORWARDS = "f"
class ProfileFields:
DISPLAYNAME: Final = "displayname"
AVATAR_URL: Final = "avatar_url"

View File

@@ -132,6 +132,10 @@ class Codes(str, Enum):
# connection.
UNKNOWN_POS = "M_UNKNOWN_POS"
# Part of MSC4133
PROFILE_TOO_LARGE = "M_PROFILE_TOO_LARGE"
KEY_TOO_LARGE = "M_KEY_TOO_LARGE"
class CodeMessageException(RuntimeError):
"""An exception with integer code, a message string attributes and optional headers.

View File

@@ -275,6 +275,7 @@ class Ratelimiter:
update: bool = True,
n_actions: int = 1,
_time_now_s: Optional[float] = None,
pause: Optional[float] = 0.5,
) -> None:
"""Checks if an action can be performed. If not, raises a LimitExceededError
@@ -298,6 +299,8 @@ class Ratelimiter:
at all.
_time_now_s: The current time. Optional, defaults to the current time according
to self.clock. Only used by tests.
pause: Time in seconds to pause when an action is being limited. Defaults to 0.5
to stop clients from "tight-looping" on retrying their request.
Raises:
LimitExceededError: If an action could not be performed, along with the time in
@@ -316,9 +319,8 @@ class Ratelimiter:
)
if not allowed:
# We pause for a bit here to stop clients from "tight-looping" on
# retrying their request.
await self.clock.sleep(0.5)
if pause:
await self.clock.sleep(pause)
raise LimitExceededError(
limiter_name=self._limiter_name,

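A toy model of the new behaviour, assuming nothing about Synapse internals: the sleep only happens when `pause` is truthy, so a caller that does its own back-off can pass `pause=None` (or `0`) to fail fast.

```python
import asyncio
from typing import Optional

class LimitExceeded(Exception):
    def __init__(self, retry_after_ms: int) -> None:
        self.retry_after_ms = retry_after_ms

async def check_action(allowed: bool, pause: Optional[float] = 0.5) -> None:
    # Toy stand-in for the ratelimiter above: optionally sleep, then raise.
    if not allowed:
        if pause:
            await asyncio.sleep(pause)  # slows down tight-looping clients
        raise LimitExceeded(retry_after_ms=500)

asyncio.run(check_action(allowed=True))                # permitted: returns immediately
# asyncio.run(check_action(allowed=False, pause=None)) # raises without sleeping
```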
View File

@@ -221,9 +221,13 @@ class Config:
The number of milliseconds in the duration.
Raises:
TypeError, if given something other than an integer or a string
TypeError: if given something other than an integer or a string, or the
duration is using an incorrect suffix.
ValueError: if given a string not of the form described above.
"""
# For integers, we prefer to use `type(value) is int` instead of
# `isinstance(value, int)` because we want to exclude subclasses of int, such as
# bool.
if type(value) is int: # noqa: E721
return value
elif isinstance(value, str):
@@ -246,9 +250,20 @@ class Config:
if suffix in sizes:
value = value[:-1]
size = sizes[suffix]
elif suffix.isdigit():
# No suffix is treated as milliseconds.
value = value
size = 1
else:
raise TypeError(
f"Bad duration suffix {value} (expected no suffix or one of these suffixes: {sizes.keys()})"
)
return int(value) * size
else:
raise TypeError(f"Bad duration {value!r}")
raise TypeError(
f"Bad duration type {value!r} (expected int or string duration)"
)
@staticmethod
def abspath(file_path: str) -> str:

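For illustration, a self-contained sketch of the duration rules described in the updated docstring; the suffix table below is an assumption, since the real `sizes` mapping is defined elsewhere in this file.

```python
from typing import Union

def parse_duration_sketch(value: Union[int, str]) -> int:
    # Assumed subset of suffixes for illustration only.
    sizes = {"s": 1000, "m": 60_000, "h": 3_600_000}
    if type(value) is int:  # deliberately excludes bool subclasses
        return value
    elif isinstance(value, str):
        suffix = value[-1]
        if suffix in sizes:
            return int(value[:-1]) * sizes[suffix]
        elif suffix.isdigit():
            return int(value)  # no suffix: the string is already milliseconds
        raise TypeError(f"Bad duration suffix {value}")
    raise TypeError(f"Bad duration type {value!r}")

assert parse_duration_sketch("5m") == 300_000
assert parse_duration_sketch("250") == 250
assert parse_duration_sketch(1000) == 1000
```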
View File

@@ -20,14 +20,15 @@
#
import enum
from typing import TYPE_CHECKING, Any, Optional
from functools import cache
from typing import TYPE_CHECKING, Any, Iterable, Optional
import attr
import attr.validators
from synapse.api.room_versions import KNOWN_ROOM_VERSIONS, RoomVersions
from synapse.config import ConfigError
from synapse.config._base import Config, RootConfig
from synapse.config._base import Config, RootConfig, read_file
from synapse.types import JsonDict
# Determine whether authlib is installed.
@@ -43,6 +44,12 @@ if TYPE_CHECKING:
from authlib.jose.rfc7517 import JsonWebKey
@cache
def read_secret_from_file_once(file_path: Any, config_path: Iterable[str]) -> str:
"""Returns the memoized secret read from file."""
return read_file(file_path, config_path).strip()
class ClientAuthMethod(enum.Enum):
"""List of supported client auth methods."""
@@ -63,6 +70,40 @@ def _parse_jwks(jwks: Optional[JsonDict]) -> Optional["JsonWebKey"]:
return JsonWebKey.import_key(jwks)
def _check_client_secret(
instance: "MSC3861", _attribute: attr.Attribute, _value: Optional[str]
) -> None:
if instance._client_secret and instance._client_secret_path:
raise ConfigError(
(
"You have configured both "
"`experimental_features.msc3861.client_secret` and "
"`experimental_features.msc3861.client_secret_path`. "
"These are mutually incompatible."
),
("experimental", "msc3861", "client_secret"),
)
# Check client secret can be retrieved
instance.client_secret()
def _check_admin_token(
instance: "MSC3861", _attribute: attr.Attribute, _value: Optional[str]
) -> None:
if instance._admin_token and instance._admin_token_path:
raise ConfigError(
(
"You have configured both "
"`experimental_features.msc3861.admin_token` and "
"`experimental_features.msc3861.admin_token_path`. "
"These are mutually incompatible."
),
("experimental", "msc3861", "admin_token"),
)
# Check admin token can be retrieved
instance.admin_token()
@attr.s(slots=True, frozen=True)
class MSC3861:
"""Configuration for MSC3861: Matrix architecture change to delegate authentication via OIDC"""
@@ -97,15 +138,30 @@ class MSC3861:
)
"""The auth method used when calling the introspection endpoint."""
client_secret: Optional[str] = attr.ib(
_client_secret: Optional[str] = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(str)),
validator=[
attr.validators.optional(attr.validators.instance_of(str)),
_check_client_secret,
],
)
"""
The client secret to use when calling the introspection endpoint,
when using any of the client_secret_* client auth methods.
"""
_client_secret_path: Optional[str] = attr.ib(
default=None,
validator=[
attr.validators.optional(attr.validators.instance_of(str)),
_check_client_secret,
],
)
"""
Alternative to `client_secret`: allows the secret to be specified in an
external file.
"""
jwk: Optional["JsonWebKey"] = attr.ib(default=None, converter=_parse_jwks)
"""
The JWKS to use when calling the introspection endpoint,
@@ -133,7 +189,7 @@ class MSC3861:
ClientAuthMethod.CLIENT_SECRET_BASIC,
ClientAuthMethod.CLIENT_SECRET_JWT,
)
and self.client_secret is None
and self.client_secret() is None
):
raise ConfigError(
f"A client secret must be provided when using the {value} client auth method",
@@ -152,15 +208,48 @@ class MSC3861:
)
"""The URL of the My Account page on the OIDC Provider as per MSC2965."""
admin_token: Optional[str] = attr.ib(
_admin_token: Optional[str] = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(str)),
validator=[
attr.validators.optional(attr.validators.instance_of(str)),
_check_admin_token,
],
)
"""
A token that should be considered as an admin token.
This is used by the OIDC provider, to make admin calls to Synapse.
"""
_admin_token_path: Optional[str] = attr.ib(
default=None,
validator=[
attr.validators.optional(attr.validators.instance_of(str)),
_check_admin_token,
],
)
"""
Alternative to `admin_token`: allows the secret to be specified in an
external file.
"""
def client_secret(self) -> Optional[str]:
"""Returns the secret given via `client_secret` or `client_secret_path`."""
if self._client_secret_path:
return read_secret_from_file_once(
self._client_secret_path,
("experimental_features", "msc3861", "client_secret_path"),
)
return self._client_secret
def admin_token(self) -> Optional[str]:
"""Returns the admin token given via `admin_token` or `admin_token_path`."""
if self._admin_token_path:
return read_secret_from_file_once(
self._admin_token_path,
("experimental_features", "msc3861", "admin_token_path"),
)
return self._admin_token
def check_config_conflicts(self, root: RootConfig) -> None:
"""Checks for any configuration conflicts with other parts of Synapse.
@@ -436,8 +525,8 @@ class ExperimentalConfig(Config):
("experimental", "msc4108_delegation_endpoint"),
)
# MSC4151: Report room API (Client-Server API)
self.msc4151_enabled: bool = experimental.get("msc4151_enabled", False)
# MSC4133: Custom profile fields
self.msc4133_enabled: bool = experimental.get("msc4133_enabled", False)
# MSC4210: Remove legacy mentions
self.msc4210_enabled: bool = experimental.get("msc4210_enabled", False)
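The `client_secret_path`/`admin_token_path` options above go through the `@cache`-decorated `read_secret_from_file_once` helper, so each secret file is read from disk only once. A minimal sketch of the same memoisation pattern (file handling below is invented for the example):

```python
import tempfile
from functools import cache

@cache
def read_secret_once(path: str) -> str:
    # Later calls with the same path return the cached string.
    with open(path, encoding="utf-8") as f:
        return f.read().strip()

with tempfile.NamedTemporaryFile("w", suffix=".secret", delete=False) as f:
    f.write("s3kr1t\n")

assert read_secret_once(f.name) == "s3kr1t"
assert read_secret_once(f.name) is read_secret_once(f.name)  # served from the cache
```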

View File

@@ -228,3 +228,9 @@ class RatelimitConfig(Config):
config.get("remote_media_download_burst_count", "500M")
),
)
self.rc_presence_per_user = RatelimitSettings.parse(
config,
"rc_presence.per_user",
defaults={"per_second": 0.1, "burst_count": 1},
)

View File

@@ -19,7 +19,7 @@
#
#
import logging
from typing import Any, Dict, Optional
from typing import Any, Dict, List, Optional
import attr
@@ -43,13 +43,23 @@ class SsoAttributeRequirement:
"""Object describing a single requirement for SSO attributes."""
attribute: str
# If a value is not given, than the attribute must simply exist.
value: Optional[str]
# If neither value nor one_of is given, the attribute must simply exist. This is
# only true for CAS configs which use a different JSON schema than the one below.
value: Optional[str] = None
one_of: Optional[List[str]] = None
JSON_SCHEMA = {
"type": "object",
"properties": {"attribute": {"type": "string"}, "value": {"type": "string"}},
"required": ["attribute", "value"],
"properties": {
"attribute": {"type": "string"},
"value": {"type": "string"},
"one_of": {"type": "array", "items": {"type": "string"}},
},
"required": ["attribute"],
"oneOf": [
{"required": ["value"]},
{"required": ["one_of"]},
],
}
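Two illustrative requirement entries that satisfy the updated schema — exactly one of `value` or `one_of` must accompany `attribute` (the attribute names below are made up):

```python
# Matches only when the SSO attribute "group" contains the exact value "staff".
req_exact = {"attribute": "group", "value": "staff"}

# Matches when the attribute contains any one of the listed values.
req_one_of = {"attribute": "group", "one_of": ["staff", "admin"]}
```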

View File

@@ -32,6 +32,7 @@ from typing import (
Mapping,
MutableMapping,
Optional,
Protocol,
Set,
Tuple,
Union,
@@ -41,7 +42,6 @@ from typing import (
from canonicaljson import encode_canonical_json
from signedjson.key import decode_verify_key_bytes
from signedjson.sign import SignatureVerifyException, verify_signed_json
from typing_extensions import Protocol
from unpaddedbase64 import decode_base64
from synapse.api.constants import (
@@ -566,6 +566,7 @@ def _is_membership_change_allowed(
logger.debug(
"_is_membership_change_allowed: %s",
{
"caller_membership": caller.membership if caller else None,
"caller_in_room": caller_in_room,
"caller_invited": caller_invited,
"caller_knocked": caller_knocked,
@@ -677,7 +678,8 @@ def _is_membership_change_allowed(
and join_rule == JoinRules.KNOCK_RESTRICTED
)
):
if not caller_in_room and not caller_invited:
# You can only join the room if you are invited or are already in the room.
if not (caller_in_room or caller_invited):
raise AuthError(403, "You are not invited to this room.")
else:
# TODO (erikj): may_join list

View File

@@ -30,6 +30,7 @@ from typing import (
Generic,
Iterable,
List,
Literal,
Optional,
Tuple,
Type,
@@ -39,10 +40,9 @@ from typing import (
)
import attr
from typing_extensions import Literal
from unpaddedbase64 import encode_base64
from synapse.api.constants import RelationTypes
from synapse.api.constants import EventTypes, RelationTypes
from synapse.api.room_versions import EventFormatVersions, RoomVersion, RoomVersions
from synapse.synapse_rust.events import EventInternalMetadata
from synapse.types import JsonDict, StrCollection
@@ -325,12 +325,17 @@ class EventBase(metaclass=abc.ABCMeta):
def __repr__(self) -> str:
rejection = f"REJECTED={self.rejected_reason}, " if self.rejected_reason else ""
conditional_membership_string = ""
if self.get("type") == EventTypes.Member:
conditional_membership_string = f"membership={self.membership}, "
return (
f"<{self.__class__.__name__} "
f"{rejection}"
f"event_id={self.event_id}, "
f"type={self.get('type')}, "
f"state_key={self.get('state_key')}, "
f"{conditional_membership_string}"
f"outlier={self.internal_metadata.is_outlier()}"
">"
)

View File

@@ -66,50 +66,67 @@ class InviteAutoAccepter:
event: The incoming event.
"""
# Check if the event is an invite for a local user.
is_invite_for_local_user = (
event.type == EventTypes.Member
and event.is_state()
and event.membership == Membership.INVITE
and self._api.is_mine(event.state_key)
)
if (
event.type != EventTypes.Member
or event.is_state() is False
or event.membership != Membership.INVITE
or self._api.is_mine(event.state_key) is False
):
return
# Only accept invites for direct messages if the configuration mandates it.
is_direct_message = event.content.get("is_direct", False)
is_allowed_by_direct_message_rules = (
not self._config.accept_invites_only_for_direct_messages
or is_direct_message is True
)
if (
self._config.accept_invites_only_for_direct_messages
and is_direct_message is False
):
return
# Only accept invites from remote users if the configuration mandates it.
is_from_local_user = self._api.is_mine(event.sender)
is_allowed_by_local_user_rules = (
not self._config.accept_invites_only_from_local_users
or is_from_local_user is True
if (
self._config.accept_invites_only_from_local_users
and is_from_local_user is False
):
return
# Check the user is activated.
recipient = await self._api.get_userinfo_by_id(event.state_key)
# Ignore if the user doesn't exist.
if recipient is None:
return
# Never accept invites for deactivated users.
if recipient.is_deactivated:
return
# Never accept invites for suspended users.
if recipient.suspended:
return
# Never accept invites for locked users.
if recipient.locked:
return
# Make the user join the room. We run this as a background process to circumvent a race condition
# that occurs when responding to invites over federation (see https://github.com/matrix-org/synapse-auto-accept-invite/issues/12)
run_as_background_process(
"retry_make_join",
self._retry_make_join,
event.state_key,
event.state_key,
event.room_id,
"join",
bg_start_span=False,
)
if (
is_invite_for_local_user
and is_allowed_by_direct_message_rules
and is_allowed_by_local_user_rules
):
# Make the user join the room. We run this as a background process to circumvent a race condition
# that occurs when responding to invites over federation (see https://github.com/matrix-org/synapse-auto-accept-invite/issues/12)
run_as_background_process(
"retry_make_join",
self._retry_make_join,
event.state_key,
event.state_key,
event.room_id,
"join",
bg_start_span=False,
if is_direct_message:
# Mark this room as a direct message!
await self._mark_room_as_direct_message(
event.state_key, event.sender, event.room_id
)
if is_direct_message:
# Mark this room as a direct message!
await self._mark_room_as_direct_message(
event.state_key, event.sender, event.room_id
)
async def _mark_room_as_direct_message(
self, user_id: str, dm_user_id: str, room_id: str
) -> None:

View File

@@ -24,7 +24,7 @@ from typing import TYPE_CHECKING, Any, Dict, List, Optional, Tuple, Union
import attr
from signedjson.types import SigningKey
from synapse.api.constants import MAX_DEPTH
from synapse.api.constants import MAX_DEPTH, EventTypes
from synapse.api.room_versions import (
KNOWN_EVENT_FORMAT_VERSIONS,
EventFormatVersions,
@@ -109,6 +109,19 @@ class EventBuilder:
def is_state(self) -> bool:
return self._state_key is not None
def is_mine_id(self, user_id: str) -> bool:
"""Determines whether a user ID or room alias originates from this homeserver.
Returns:
`True` if the hostname part of the user ID or room alias matches this
homeserver.
`False` otherwise, or if the user ID or room alias is malformed.
"""
localpart_hostname = user_id.split(":", 1)
if len(localpart_hostname) < 2:
return False
return localpart_hostname[1] == self._hostname
async def build(
self,
prev_event_ids: List[str],
@@ -142,6 +155,46 @@ class EventBuilder:
self, state_ids
)
# Check for out-of-band membership that may have been exposed on `/sync` but
# the events have not been de-outliered yet so they won't be part of the
# room state yet.
#
# This helps in situations where a remote homeserver invites a local user to
# a room that we're already participating in; and we've persisted the invite
# as an out-of-band membership (outlier), but it hasn't been pushed to us as
# part of a `/send` transaction yet and de-outliered. This also helps for
# any of the other out-of-band membership transitions.
#
# As an optimization, we could check if the room state already includes a
# non-`leave` membership event, then we can assume the membership event has
# been de-outliered and we don't need to check for an out-of-band
# membership. But we don't have the necessary information from a
# `StateMap[str]` and we'll just have to take the hit of this extra lookup
# for any membership event for now.
if self.type == EventTypes.Member and self.is_mine_id(self.state_key):
(
_membership,
member_event_id,
) = await self._store.get_local_current_membership_for_user_in_room(
user_id=self.state_key,
room_id=self.room_id,
)
# There is no need to check if the membership is actually an
# out-of-band membership (`outlier`) as we would end up with the
# same result either way (adding the member event to the
# `auth_event_ids`).
if (
member_event_id is not None
# We only need to be careful about duplicating the event in the
# `auth_event_ids` list (duplicate `type`/`state_key` is part of the
# authorization rules)
and member_event_id not in auth_event_ids
):
auth_event_ids.append(member_event_id)
# Also make sure to point to the previous membership event that will
# allow this one to happen so the computed state works out.
prev_event_ids.append(member_event_id)
format_version = self.room_version.event_format
# The types of auth/prev events changes between event versions.
prev_events: Union[StrCollection, List[Tuple[str, Dict[str, str]]]]

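A standalone sketch of the hostname check that the new `is_mine_id` helper performs above (the homeserver name is an assumption):

```python
def is_mine_id(user_id: str, hostname: str = "example.com") -> bool:
    # The part after the first ":" must match this server's hostname;
    # malformed identifiers without a ":" are treated as not ours.
    parts = user_id.split(":", 1)
    return len(parts) == 2 and parts[1] == hostname

assert is_mine_id("@alice:example.com")
assert not is_mine_id("@bob:other.org")
assert not is_mine_id("no-colon-here")
```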
View File

@@ -139,13 +139,13 @@ from typing import (
Hashable,
Iterable,
List,
Literal,
Optional,
Tuple,
)
import attr
from prometheus_client import Counter
from typing_extensions import Literal
from twisted.internet import defer

View File

@@ -20,9 +20,7 @@
#
#
import logging
from typing import TYPE_CHECKING, Dict, Iterable, List, Optional, Tuple, Type
from typing_extensions import Literal
from typing import TYPE_CHECKING, Dict, Iterable, List, Literal, Optional, Tuple, Type
from synapse.api.errors import FederationDeniedError, SynapseError
from synapse.federation.transport.server._base import (

View File

@@ -24,6 +24,7 @@ from typing import (
TYPE_CHECKING,
Dict,
List,
Literal,
Mapping,
Optional,
Sequence,
@@ -32,8 +33,6 @@ from typing import (
Union,
)
from typing_extensions import Literal
from synapse.api.constants import Direction, EduTypes
from synapse.api.errors import Codes, SynapseError
from synapse.api.room_versions import RoomVersions

View File

@@ -1579,7 +1579,10 @@ class AuthHandler:
# for the presence of an email address during password reset was
# case sensitive).
if medium == "email":
address = canonicalise_email(address)
try:
address = canonicalise_email(address)
except ValueError as e:
raise SynapseError(400, str(e))
await self.store.user_add_threepid(
user_id, medium, address, validated_at, self.hs.get_clock().time_msec()
@@ -1610,7 +1613,10 @@ class AuthHandler:
"""
# 'Canonicalise' email addresses as per above
if medium == "email":
address = canonicalise_email(address)
try:
address = canonicalise_email(address)
except ValueError as e:
raise SynapseError(400, str(e))
await self.store.user_delete_threepid(user_id, medium, address)

View File

@@ -21,9 +21,7 @@
import logging
import string
from typing import TYPE_CHECKING, Iterable, List, Optional, Sequence
from typing_extensions import Literal
from typing import TYPE_CHECKING, Iterable, List, Literal, Optional, Sequence
from synapse.api.constants import MAX_ALIAS_LENGTH, EventTypes
from synapse.api.errors import (

View File

@@ -20,9 +20,7 @@
#
import logging
from typing import TYPE_CHECKING, Dict, Optional, cast
from typing_extensions import Literal
from typing import TYPE_CHECKING, Dict, Literal, Optional, cast
from synapse.api.errors import (
Codes,

View File

@@ -151,6 +151,8 @@ class FederationEventHandler:
def __init__(self, hs: "HomeServer"):
self._clock = hs.get_clock()
self._store = hs.get_datastores().main
self._state_store = hs.get_datastores().state
self._state_deletion_store = hs.get_datastores().state_deletion
self._storage_controllers = hs.get_storage_controllers()
self._state_storage_controller = self._storage_controllers.state
@@ -580,7 +582,9 @@ class FederationEventHandler:
room_version.identifier,
state_maps_to_resolve,
event_map=None,
state_res_store=StateResolutionStore(self._store),
state_res_store=StateResolutionStore(
self._store, self._state_deletion_store
),
)
)
else:
@@ -1179,7 +1183,9 @@ class FederationEventHandler:
room_version,
state_maps,
event_map={event_id: event},
state_res_store=StateResolutionStore(self._store),
state_res_store=StateResolutionStore(
self._store, self._state_deletion_store
),
)
except Exception as e:
@@ -1874,7 +1880,9 @@ class FederationEventHandler:
room_version,
[local_state_id_map, claimed_auth_events_id_map],
event_map=None,
state_res_store=StateResolutionStore(self._store),
state_res_store=StateResolutionStore(
self._store, self._state_deletion_store
),
)
)
else:
@@ -2014,7 +2022,9 @@ class FederationEventHandler:
room_version,
state_sets,
event_map=None,
state_res_store=StateResolutionStore(self._store),
state_res_store=StateResolutionStore(
self._store, self._state_deletion_store
),
)
)
else:
@@ -2272,8 +2282,9 @@ class FederationEventHandler:
event_and_contexts, backfilled=backfilled
)
# After persistence we always need to notify replication there may
# be new data.
# After persistence, we never notify clients (wake up `/sync` streams) about
# backfilled events but it's important to let all the workers know about any
# new event (backfilled or not) because TODO
self._notifier.notify_replication()
if self._ephemeral_messages_enabled:

View File

@@ -31,6 +31,7 @@ from typing import (
List,
Optional,
Type,
TypedDict,
TypeVar,
Union,
)
@@ -52,7 +53,6 @@ from pymacaroons.exceptions import (
MacaroonInitException,
MacaroonInvalidSignatureException,
)
from typing_extensions import TypedDict
from twisted.web.client import readBody
from twisted.web.http_headers import Headers
@@ -1002,7 +1002,21 @@ class OidcProvider:
"""
state = generate_token()
nonce = generate_token()
# Generate a nonce 32 characters long. When encoded with base64url later on,
# the nonce will be 43 characters when sent to the identity provider.
#
# While RFC7636 does not specify a minimum length for the `nonce`
# parameter, the TI-Messenger IDP_FD spec v1.7.3 does require it to be
# between 43 and 128 characters. This spec concerns using Matrix for
# communication in German healthcare.
#
# As increasing the length only strengthens security, we use this length
# to allow TI-Messenger deployments using Synapse to satisfy this
# external spec.
#
# See https://github.com/element-hq/synapse/pull/18109 for more context.
nonce = generate_token(length=32)
code_verifier = ""
if not client_redirect_url:

View File

@@ -22,6 +22,7 @@ import logging
import random
from typing import TYPE_CHECKING, List, Optional, Union
from synapse.api.constants import ProfileFields
from synapse.api.errors import (
AuthError,
Codes,
@@ -31,7 +32,7 @@ from synapse.api.errors import (
SynapseError,
)
from synapse.storage.databases.main.media_repository import LocalMedia, RemoteMedia
from synapse.types import JsonDict, Requester, UserID, create_requester
from synapse.types import JsonDict, JsonValue, Requester, UserID, create_requester
from synapse.util.caches.descriptors import cached
from synapse.util.stringutils import parse_and_validate_mxc_uri
@@ -42,6 +43,8 @@ logger = logging.getLogger(__name__)
MAX_DISPLAYNAME_LEN = 256
MAX_AVATAR_URL_LEN = 1000
# Field name length is specced at 255 bytes.
MAX_CUSTOM_FIELD_LEN = 255
class ProfileHandler:
@@ -83,19 +86,33 @@ class ProfileHandler:
Returns:
A JSON dictionary. For local queries this will include the displayname and avatar_url
fields. For remote queries it may contain arbitrary information.
fields, if set. For remote queries it may contain arbitrary information.
"""
target_user = UserID.from_string(user_id)
if self.hs.is_mine(target_user):
profileinfo = await self.store.get_profileinfo(target_user)
if profileinfo.display_name is None and profileinfo.avatar_url is None:
extra_fields = {}
if self.hs.config.experimental.msc4133_enabled:
extra_fields = await self.store.get_profile_fields(target_user)
if (
profileinfo.display_name is None
and profileinfo.avatar_url is None
and not extra_fields
):
raise SynapseError(404, "Profile was not found", Codes.NOT_FOUND)
return {
"displayname": profileinfo.display_name,
"avatar_url": profileinfo.avatar_url,
}
# Do not include display name or avatar if unset.
ret = {}
if profileinfo.display_name is not None:
ret[ProfileFields.DISPLAYNAME] = profileinfo.display_name
if profileinfo.avatar_url is not None:
ret[ProfileFields.AVATAR_URL] = profileinfo.avatar_url
if extra_fields:
ret.update(extra_fields)
return ret
else:
try:
result = await self.federation.make_query(
@@ -399,6 +416,110 @@ class ProfileHandler:
return True
async def get_profile_field(
self, target_user: UserID, field_name: str
) -> JsonValue:
"""
Fetch a user's profile from the database for local users and over federation
for remote users.
Args:
target_user: The user ID to fetch the profile for.
field_name: The field to fetch the profile for.
Returns:
The value for the profile field or None if the field does not exist.
"""
if self.hs.is_mine(target_user):
try:
field_value = await self.store.get_profile_field(
target_user, field_name
)
except StoreError as e:
if e.code == 404:
raise SynapseError(404, "Profile was not found", Codes.NOT_FOUND)
raise
return field_value
else:
try:
result = await self.federation.make_query(
destination=target_user.domain,
query_type="profile",
args={"user_id": target_user.to_string(), "field": field_name},
ignore_backoff=True,
)
except RequestSendFailed as e:
raise SynapseError(502, "Failed to fetch profile") from e
except HttpResponseException as e:
raise e.to_synapse_error()
return result.get(field_name)
async def set_profile_field(
self,
target_user: UserID,
requester: Requester,
field_name: str,
new_value: JsonValue,
by_admin: bool = False,
deactivation: bool = False,
) -> None:
"""Set a new profile field for a user.
Args:
target_user: the user whose profile is to be changed.
requester: The user attempting to make this change.
field_name: The name of the profile field to update.
new_value: The new field value for this user.
by_admin: Whether this change was made by an administrator.
deactivation: Whether this change was made while deactivating the user.
"""
if not self.hs.is_mine(target_user):
raise SynapseError(400, "User is not hosted on this homeserver")
if not by_admin and target_user != requester.user:
raise AuthError(403, "Cannot set another user's profile")
await self.store.set_profile_field(target_user, field_name, new_value)
# Custom fields do not propagate into the user directory *or* rooms.
profile = await self.store.get_profileinfo(target_user)
await self._third_party_rules.on_profile_update(
target_user.to_string(), profile, by_admin, deactivation
)
async def delete_profile_field(
self,
target_user: UserID,
requester: Requester,
field_name: str,
by_admin: bool = False,
deactivation: bool = False,
) -> None:
"""Delete a field from a user's profile.
Args:
target_user: the user whose profile is to be changed.
requester: The user attempting to make this change.
field_name: The name of the profile field to remove.
by_admin: Whether this change was made by an administrator.
deactivation: Whether this change was made while deactivating the user.
"""
if not self.hs.is_mine(target_user):
raise SynapseError(400, "User is not hosted on this homeserver")
if not by_admin and target_user != requester.user:
raise AuthError(400, "Cannot set another user's profile")
await self.store.delete_profile_field(target_user, field_name)
# Custom fields do not propagate into the user directory *or* rooms.
profile = await self.store.get_profileinfo(target_user)
await self._third_party_rules.on_profile_update(
target_user.to_string(), profile, by_admin, deactivation
)
async def on_profile_query(self, args: JsonDict) -> JsonDict:
"""Handles federation profile query requests."""
@@ -415,13 +536,24 @@ class ProfileHandler:
just_field = args.get("field", None)
response = {}
response: JsonDict = {}
try:
if just_field is None or just_field == "displayname":
if just_field is None or just_field == ProfileFields.DISPLAYNAME:
response["displayname"] = await self.store.get_profile_displayname(user)
if just_field is None or just_field == "avatar_url":
if just_field is None or just_field == ProfileFields.AVATAR_URL:
response["avatar_url"] = await self.store.get_profile_avatar_url(user)
if self.hs.config.experimental.msc4133_enabled:
if just_field is None:
response.update(await self.store.get_profile_fields(user))
elif just_field not in (
ProfileFields.DISPLAYNAME,
ProfileFields.AVATAR_URL,
):
response[just_field] = await self.store.get_profile_field(
user, just_field
)
except StoreError as e:
if e.code == 404:
raise SynapseError(404, "Profile was not found", Codes.NOT_FOUND)

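By way of illustration, the shape of a local profile response under the new logic: unset fields are omitted rather than returned as null, and with MSC4133 enabled any custom fields are merged in. Values below are made up.

```python
# Old behaviour: both keys were always present, even when one was unset.
old_response = {"displayname": "Alice", "avatar_url": None}

# New behaviour: only set fields appear, plus any MSC4133 custom fields.
new_response = {
    "displayname": "Alice",
    "org.example.pronouns": "they/them",
}
```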
View File

@@ -23,10 +23,9 @@
"""Contains functions for registering clients."""
import logging
from typing import TYPE_CHECKING, Iterable, List, Optional, Tuple
from typing import TYPE_CHECKING, Iterable, List, Optional, Tuple, TypedDict
from prometheus_client import Counter
from typing_extensions import TypedDict
from synapse import types
from synapse.api.constants import (

View File

@@ -33,17 +33,17 @@ from typing import (
Mapping,
NoReturn,
Optional,
Protocol,
Set,
)
from urllib.parse import urlencode
import attr
from typing_extensions import Protocol
from twisted.web.iweb import IRequest
from twisted.web.server import Request
from synapse.api.constants import LoginType
from synapse.api.constants import LoginType, ProfileFields
from synapse.api.errors import Codes, NotFoundError, RedirectException, SynapseError
from synapse.config.sso import SsoAttributeRequirement
from synapse.handlers.device import DeviceHandler
@@ -813,9 +813,10 @@ class SsoHandler:
# bail if user already has the same avatar
profile = await self._profile_handler.get_profile(user_id)
if profile["avatar_url"] is not None:
server_name = profile["avatar_url"].split("/")[-2]
media_id = profile["avatar_url"].split("/")[-1]
if ProfileFields.AVATAR_URL in profile:
avatar_url_parts = profile[ProfileFields.AVATAR_URL].split("/")
server_name = avatar_url_parts[-2]
media_id = avatar_url_parts[-1]
if self._is_mine_server_name(server_name):
media = await self._media_repo.store.get_local_media(media_id) # type: ignore[has-type]
if media is not None and upload_name == media.upload_name:
@@ -1276,12 +1277,16 @@ def _check_attribute_requirement(
return False
# If the requirement is None, the attribute existing is enough.
if req.value is None:
if req.value is None and req.one_of is None:
return True
values = attributes[req.attribute]
if req.value in values:
return True
if req.one_of:
for value in req.one_of:
if value in values:
return True
logger.info(
"SSO attribute %s did not match required value '%s' (was '%s')",

View File

@@ -26,7 +26,13 @@ from typing import TYPE_CHECKING, List, Optional, Set, Tuple
from twisted.internet.interfaces import IDelayedCall
import synapse.metrics
from synapse.api.constants import EventTypes, HistoryVisibility, JoinRules, Membership
from synapse.api.constants import (
EventTypes,
HistoryVisibility,
JoinRules,
Membership,
ProfileFields,
)
from synapse.api.errors import Codes, SynapseError
from synapse.handlers.state_deltas import MatchChange, StateDeltasHandler
from synapse.metrics.background_process_metrics import run_as_background_process
@@ -756,6 +762,10 @@ class UserDirectoryHandler(StateDeltasHandler):
await self.store.update_profile_in_user_dir(
user_id,
display_name=non_null_str_or_none(profile.get("displayname")),
avatar_url=non_null_str_or_none(profile.get("avatar_url")),
display_name=non_null_str_or_none(
profile.get(ProfileFields.DISPLAYNAME)
),
avatar_url=non_null_str_or_none(
profile.get(ProfileFields.AVATAR_URL)
),
)

View File

@@ -19,6 +19,7 @@
#
#
import logging
import random
from types import TracebackType
from typing import (
@@ -269,6 +270,10 @@ class WaitingLock:
def _get_next_retry_interval(self) -> float:
next = self._retry_interval
self._retry_interval = max(5, next * 2)
if self._retry_interval > 5 * 2**7:  # ~10 minutes
logging.warning(
f"Lock timeout is getting excessive: {self._retry_interval}s. There may be a deadlock."
)
return next * random.uniform(0.9, 1.1)
@@ -344,4 +349,8 @@ class WaitingMultiLock:
def _get_next_retry_interval(self) -> float:
next = self._retry_interval
self._retry_interval = max(5, next * 2)
if self._retry_interval > 5 * 2**7:  # ~10 minutes
logging.warning(
f"Lock timeout is getting excessive: {self._retry_interval}s. There may be a deadlock."
)
return next * random.uniform(0.9, 1.1)

View File

@@ -31,6 +31,7 @@ from typing import (
List,
Mapping,
Optional,
Protocol,
Tuple,
Union,
)
@@ -40,7 +41,6 @@ import treq
from canonicaljson import encode_canonical_json
from netaddr import AddrFormatError, IPAddress, IPSet
from prometheus_client import Counter
from typing_extensions import Protocol
from zope.interface import implementer
from OpenSSL import SSL

View File

@@ -34,6 +34,7 @@ from typing import (
Dict,
Generic,
List,
Literal,
Optional,
TextIO,
Tuple,
@@ -48,7 +49,6 @@ import treq
from canonicaljson import encode_canonical_json
from prometheus_client import Counter
from signedjson.sign import sign_json
from typing_extensions import Literal
from twisted.internet import defer
from twisted.internet.error import DNSLookupError

View File

@@ -150,6 +150,12 @@ class ProxyAgent(_AgentBase):
http_proxy = proxies["http"].encode() if "http" in proxies else None
https_proxy = proxies["https"].encode() if "https" in proxies else None
no_proxy = proxies["no"] if "no" in proxies else None
logger.debug(
"Using proxy settings: http_proxy=%s, https_proxy=%s, no_proxy=%s",
http_proxy,
https_proxy,
no_proxy,
)
self.http_proxy_endpoint, self.http_proxy_creds = http_proxy_endpoint(
http_proxy, self.proxy_reactor, contextFactory, **self._endpoint_kwargs

View File

@@ -39,6 +39,7 @@ from typing import (
List,
Optional,
Pattern,
Protocol,
Tuple,
Union,
)
@@ -46,7 +47,6 @@ from typing import (
import attr
import jinja2
from canonicaljson import encode_canonical_json
from typing_extensions import Protocol
from zope.interface import implementer
from twisted.internet import defer, interfaces

View File

@@ -28,6 +28,7 @@ from http import HTTPStatus
from typing import (
TYPE_CHECKING,
List,
Literal,
Mapping,
Optional,
Sequence,
@@ -37,8 +38,6 @@ from typing import (
overload,
)
from typing_extensions import Literal
from twisted.web.server import Request
from synapse._pydantic_compat import (

View File

@@ -40,6 +40,7 @@ from typing import (
Any,
Awaitable,
Callable,
Literal,
Optional,
Tuple,
Type,
@@ -49,7 +50,7 @@ from typing import (
)
import attr
from typing_extensions import Literal, ParamSpec
from typing_extensions import ParamSpec
from twisted.internet import defer, threads
from twisted.python.threadpool import ThreadPool

View File

@@ -19,8 +19,7 @@
#
#
import logging
from typing_extensions import Literal
from typing import Literal
class MetadataFilter(logging.Filter):

View File

@@ -23,11 +23,10 @@ import ctypes
import logging
import os
import re
from typing import Iterable, Optional, overload
from typing import Iterable, Literal, Optional, overload
import attr
from prometheus_client import REGISTRY, Metric
from typing_extensions import Literal
from synapse.metrics import GaugeMetricFamily
from synapse.metrics._types import Collector

View File

@@ -45,6 +45,7 @@ from twisted.internet.interfaces import IDelayedCall
from twisted.web.resource import Resource
from synapse.api import errors
from synapse.api.constants import ProfileFields
from synapse.api.errors import SynapseError
from synapse.api.presence import UserPresenceState
from synapse.config import ConfigError
@@ -1086,7 +1087,10 @@ class ModuleApi:
content = {}
# Set the profile if not already done by the module.
if "avatar_url" not in content or "displayname" not in content:
if (
ProfileFields.AVATAR_URL not in content
or ProfileFields.DISPLAYNAME not in content
):
try:
# Try to fetch the user's profile.
profile = await self._hs.get_profile_handler().get_profile(
@@ -1095,8 +1099,8 @@ class ModuleApi:
except SynapseError as e:
# If the profile couldn't be found, use default values.
profile = {
"displayname": target_user_id.localpart,
"avatar_url": None,
ProfileFields.DISPLAYNAME: target_user_id.localpart,
ProfileFields.AVATAR_URL: None,
}
if e.code != 404:
@@ -1109,11 +1113,9 @@ class ModuleApi:
)
# Set the profile where it needs to be set.
if "avatar_url" not in content:
content["avatar_url"] = profile["avatar_url"]
if "displayname" not in content:
content["displayname"] = profile["displayname"]
for field_name in [ProfileFields.AVATAR_URL, ProfileFields.DISPLAYNAME]:
if field_name not in content and field_name in profile:
content[field_name] = profile[field_name]
event_id, _ = await self._hs.get_room_member_handler().update_membership(
requester=requester,
@@ -1844,6 +1846,10 @@ class ModuleApi:
deactivation=deactivation,
)
def get_current_time_msec(self) -> int:
"""Returns the current server time in milliseconds."""
return self._clock.time_msec()
class PublicRoomListManager:
"""Contains methods for adding to, removing from and querying whether a room

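A hypothetical module-side helper using the new `get_current_time_msec()` API; `module_api` stands for the `ModuleApi` instance a module receives, and the expiry value is invented.

```python
from typing import Any

def is_expired(module_api: Any, expires_at_ms: int) -> bool:
    # Compare against Synapse's own clock rather than time.time(),
    # so module and homeserver agree on what "now" means.
    return module_api.get_current_time_msec() > expires_at_ms
```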
View File

@@ -19,6 +19,7 @@
#
#
import functools
import inspect
import logging
from typing import (
@@ -28,15 +29,13 @@ from typing import (
Callable,
Collection,
List,
Literal,
Optional,
Tuple,
Union,
cast,
)
# `Literal` appears with Python 3.8.
from typing_extensions import Literal
import synapse
from synapse.api.errors import Codes
from synapse.logging.opentracing import trace
@@ -297,6 +296,7 @@ def load_legacy_spam_checkers(hs: "synapse.server.HomeServer") -> None:
"Bad signature for callback check_registration_for_spam",
)
@functools.wraps(wrapped_func)
def run(*args: Any, **kwargs: Any) -> Awaitable:
# Assertion required because mypy can't prove we won't change `f`
# back to `None`. See

View File

@@ -18,9 +18,7 @@
# [This file includes modifications made by New Vector Limited]
#
#
from typing import List, Optional
from typing_extensions import TypedDict
from typing import List, Optional, TypedDict
class EmailReason(TypedDict, total=False):

View File

@@ -29,7 +29,7 @@ from synapse.rest.client import (
account_validity,
appservice_ping,
auth,
auth_issuer,
auth_metadata,
capabilities,
delayed_events,
devices,
@@ -121,7 +121,7 @@ CLIENT_SERVLET_FUNCTIONS: Tuple[RegisterServletsFunc, ...] = (
mutual_rooms.register_servlets,
login_token_request.register_servlets,
rendezvous.register_servlets,
auth_issuer.register_servlets,
auth_metadata.register_servlets,
)
SERVLET_GROUPS: Dict[str, Iterable[RegisterServletsFunc]] = {
@@ -187,7 +187,7 @@ class ClientRestResource(JsonResource):
mutual_rooms.register_servlets,
login_token_request.register_servlets,
rendezvous.register_servlets,
auth_issuer.register_servlets,
auth_metadata.register_servlets,
]:
continue

View File

@@ -23,6 +23,7 @@ from http import HTTPStatus
from typing import TYPE_CHECKING, List, Optional, Tuple, cast
import attr
from immutabledict import immutabledict
from synapse.api.constants import Direction, EventTypes, JoinRules, Membership
from synapse.api.errors import AuthError, Codes, NotFoundError, SynapseError
@@ -463,7 +464,18 @@ class RoomStateRestServlet(RestServlet):
if not room:
raise NotFoundError("Room not found")
event_ids = await self._storage_controllers.state.get_current_state_ids(room_id)
state_filter = None
type = parse_string(request, "type")
if type:
state_filter = StateFilter(
types=immutabledict({type: None}),
include_others=False,
)
event_ids = await self._storage_controllers.state.get_current_state_ids(
room_id, state_filter
)
events = await self.store.get_events(event_ids.values())
now = self.clock.time_msec()
room_state = await self._event_serializer.serialize_events(events.values(), now)

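A hedged example of the new `?type=` query parameter; only the parameter itself is taken from the diff, while the endpoint path, server name, and token are assumptions (this appears to be the admin room-state servlet).

```python
import requests

resp = requests.get(
    "https://synapse.example.com/_synapse/admin/v1/rooms/!room:example.com/state",
    params={"type": "m.room.member"},  # only return state events of this type
    headers={"Authorization": "Bearer <admin access token>"},
)
print(resp.json())
```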
View File

@@ -21,11 +21,10 @@
#
import logging
import random
from typing import TYPE_CHECKING, List, Optional, Tuple
from typing import TYPE_CHECKING, List, Literal, Optional, Tuple
from urllib.parse import urlparse
import attr
from typing_extensions import Literal
from twisted.web.server import Request

View File

@@ -32,6 +32,8 @@ logger = logging.getLogger(__name__)
class AuthIssuerServlet(RestServlet):
"""
Advertises what OpenID Connect issuer clients should use to authorise users.
This endpoint was defined in a previous iteration of MSC2965, and is still
used by some clients.
"""
PATTERNS = client_patterns(
@@ -63,7 +65,42 @@ class AuthIssuerServlet(RestServlet):
)
class AuthMetadataServlet(RestServlet):
"""
Advertises the OAuth 2.0 server metadata for the homeserver.
"""
PATTERNS = client_patterns(
"/org.matrix.msc2965/auth_metadata$",
unstable=True,
releases=(),
)
def __init__(self, hs: "HomeServer"):
super().__init__()
self._config = hs.config
self._auth = hs.get_auth()
async def on_GET(self, request: SynapseRequest) -> Tuple[int, JsonDict]:
if self._config.experimental.msc3861.enabled:
# If MSC3861 is enabled, we can assume self._auth is an instance of MSC3861DelegatedAuth
# We import lazily here because of the authlib requirement
from synapse.api.auth.msc3861_delegated import MSC3861DelegatedAuth
auth = cast(MSC3861DelegatedAuth, self._auth)
return 200, await auth.auth_metadata()
else:
# Wouldn't expect this to be reached: the servlet shouldn't have been
# registered. Still, fail gracefully if we are registered for some reason.
raise SynapseError(
404,
"OIDC discovery has not been configured on this homeserver",
Codes.NOT_FOUND,
)
def register_servlets(hs: "HomeServer", http_server: HttpServer) -> None:
# We use the MSC3861 values as they are used by multiple MSCs
if hs.config.experimental.msc3861.enabled:
AuthIssuerServlet(hs).register(http_server)
AuthMetadataServlet(hs).register(http_server)

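Assuming the unstable client prefix that `client_patterns(..., unstable=True)` produces, the new endpoint can be fetched like this (the server name is made up):

```python
import requests

resp = requests.get(
    "https://synapse.example.com/_matrix/client/unstable/org.matrix.msc2965/auth_metadata"
)
metadata = resp.json()
# The body is the issuer's OAuth 2.0 server metadata, e.g. its issuer URL
# and introspection endpoint.
print(metadata.get("issuer"), metadata.get("introspection_endpoint"))
```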
View File

@@ -92,6 +92,23 @@ class CapabilitiesRestServlet(RestServlet):
"enabled": self.config.experimental.msc3664_enabled,
}
if self.config.experimental.msc4133_enabled:
response["capabilities"]["uk.tcpip.msc4133.profile_fields"] = {
"enabled": True,
}
# Ensure this is consistent with the legacy m.set_displayname and
# m.set_avatar_url.
disallowed = []
if not self.config.registration.enable_set_displayname:
disallowed.append("displayname")
if not self.config.registration.enable_set_avatar_url:
disallowed.append("avatar_url")
if disallowed:
response["capabilities"]["uk.tcpip.msc4133.profile_fields"][
"disallowed"
] = disallowed
return HTTPStatus.OK, response
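The resulting capability fragment, as built by the code above (shown here with display name changes disabled; values are illustrative):

```python
capabilities_fragment = {
    "capabilities": {
        "uk.tcpip.msc4133.profile_fields": {
            "enabled": True,
            "disallowed": ["displayname"],
        },
    },
}
```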

View File

@@ -20,9 +20,7 @@
#
import logging
from typing import TYPE_CHECKING, List, Optional, Tuple
from typing_extensions import Literal
from typing import TYPE_CHECKING, List, Literal, Optional, Tuple
from twisted.web.server import Request

View File

@@ -30,11 +30,10 @@ from typing import (
List,
Optional,
Tuple,
TypedDict,
Union,
)
from typing_extensions import TypedDict
from synapse.api.constants import ApprovalNoticeMedium
from synapse.api.errors import (
Codes,

View File

@@ -24,7 +24,8 @@
import logging
from typing import TYPE_CHECKING, Tuple
from synapse.api.errors import AuthError, SynapseError
from synapse.api.errors import AuthError, Codes, LimitExceededError, SynapseError
from synapse.api.ratelimiting import Ratelimiter
from synapse.handlers.presence import format_user_presence_state
from synapse.http.server import HttpServer
from synapse.http.servlet import RestServlet, parse_json_object_from_request
@@ -48,6 +49,14 @@ class PresenceStatusRestServlet(RestServlet):
self.presence_handler = hs.get_presence_handler()
self.clock = hs.get_clock()
self.auth = hs.get_auth()
self.store = hs.get_datastores().main
# Ratelimiter for presence updates, keyed by requester.
self._presence_per_user_limiter = Ratelimiter(
store=self.store,
clock=self.clock,
cfg=hs.config.ratelimiting.rc_presence_per_user,
)
async def on_GET(
self, request: SynapseRequest, user_id: str
@@ -82,6 +91,17 @@ class PresenceStatusRestServlet(RestServlet):
if requester.user != user:
raise AuthError(403, "Can only set your own presence state")
# ignore the presence update if the ratelimit is exceeded
try:
await self._presence_per_user_limiter.ratelimit(requester)
except LimitExceededError as e:
logger.debug("User presence ratelimit exceeded; ignoring it.")
return 429, {
"errcode": Codes.LIMIT_EXCEEDED,
"error": "Too many requests",
"retry_after_ms": e.retry_after_ms,
}
state = {}
content = parse_json_object_from_request(request)

View File

@@ -21,10 +21,13 @@
"""This module contains REST servlets to do with profile: /profile/<paths>"""
import re
from http import HTTPStatus
from typing import TYPE_CHECKING, Tuple
from synapse.api.constants import ProfileFields
from synapse.api.errors import Codes, SynapseError
from synapse.handlers.profile import MAX_CUSTOM_FIELD_LEN
from synapse.http.server import HttpServer
from synapse.http.servlet import (
RestServlet,
@@ -33,7 +36,8 @@ from synapse.http.servlet import (
)
from synapse.http.site import SynapseRequest
from synapse.rest.client._base import client_patterns
from synapse.types import JsonDict, UserID
from synapse.types import JsonDict, JsonValue, UserID
from synapse.util.stringutils import is_namedspaced_grammar
if TYPE_CHECKING:
from synapse.server import HomeServer
@@ -91,6 +95,11 @@ class ProfileDisplaynameRestServlet(RestServlet):
async def on_PUT(
self, request: SynapseRequest, user_id: str
) -> Tuple[int, JsonDict]:
if not UserID.is_valid(user_id):
raise SynapseError(
HTTPStatus.BAD_REQUEST, "Invalid user id", Codes.INVALID_PARAM
)
requester = await self.auth.get_user_by_req(request, allow_guest=True)
user = UserID.from_string(user_id)
is_admin = await self.auth.is_server_admin(requester)
@@ -101,9 +110,7 @@ class ProfileDisplaynameRestServlet(RestServlet):
new_name = content["displayname"]
except Exception:
raise SynapseError(
code=400,
msg="Unable to parse name",
errcode=Codes.BAD_JSON,
400, "Missing key 'displayname'", errcode=Codes.MISSING_PARAM
)
propagate = _read_propagate(self.hs, request)
@@ -166,6 +173,11 @@ class ProfileAvatarURLRestServlet(RestServlet):
async def on_PUT(
self, request: SynapseRequest, user_id: str
) -> Tuple[int, JsonDict]:
if not UserID.is_valid(user_id):
raise SynapseError(
HTTPStatus.BAD_REQUEST, "Invalid user id", Codes.INVALID_PARAM
)
requester = await self.auth.get_user_by_req(request)
user = UserID.from_string(user_id)
is_admin = await self.auth.is_server_admin(requester)
@@ -227,19 +239,185 @@ class ProfileRestServlet(RestServlet):
user = UserID.from_string(user_id)
await self.profile_handler.check_profile_query_allowed(user, requester_user)
displayname = await self.profile_handler.get_displayname(user)
avatar_url = await self.profile_handler.get_avatar_url(user)
ret = {}
if displayname is not None:
ret["displayname"] = displayname
if avatar_url is not None:
ret["avatar_url"] = avatar_url
ret = await self.profile_handler.get_profile(user_id)
return 200, ret
class UnstableProfileFieldRestServlet(RestServlet):
PATTERNS = [
re.compile(
r"^/_matrix/client/unstable/uk\.tcpip\.msc4133/profile/(?P<user_id>[^/]*)/(?P<field_name>[^/]*)"
)
]
CATEGORY = "Event sending requests"
def __init__(self, hs: "HomeServer"):
super().__init__()
self.hs = hs
self.profile_handler = hs.get_profile_handler()
self.auth = hs.get_auth()
async def on_GET(
self, request: SynapseRequest, user_id: str, field_name: str
) -> Tuple[int, JsonDict]:
requester_user = None
if self.hs.config.server.require_auth_for_profile_requests:
requester = await self.auth.get_user_by_req(request)
requester_user = requester.user
if not UserID.is_valid(user_id):
raise SynapseError(
HTTPStatus.BAD_REQUEST, "Invalid user id", Codes.INVALID_PARAM
)
if not field_name:
raise SynapseError(400, "Field name too short", errcode=Codes.INVALID_PARAM)
if len(field_name.encode("utf-8")) > MAX_CUSTOM_FIELD_LEN:
raise SynapseError(400, "Field name too long", errcode=Codes.KEY_TOO_LARGE)
if not is_namedspaced_grammar(field_name):
raise SynapseError(
400,
"Field name does not follow Common Namespaced Identifier Grammar",
errcode=Codes.INVALID_PARAM,
)
user = UserID.from_string(user_id)
await self.profile_handler.check_profile_query_allowed(user, requester_user)
if field_name == ProfileFields.DISPLAYNAME:
field_value: JsonValue = await self.profile_handler.get_displayname(user)
elif field_name == ProfileFields.AVATAR_URL:
field_value = await self.profile_handler.get_avatar_url(user)
else:
field_value = await self.profile_handler.get_profile_field(user, field_name)
return 200, {field_name: field_value}
async def on_PUT(
self, request: SynapseRequest, user_id: str, field_name: str
) -> Tuple[int, JsonDict]:
if not UserID.is_valid(user_id):
raise SynapseError(
HTTPStatus.BAD_REQUEST, "Invalid user id", Codes.INVALID_PARAM
)
requester = await self.auth.get_user_by_req(request)
user = UserID.from_string(user_id)
is_admin = await self.auth.is_server_admin(requester)
if not field_name:
raise SynapseError(400, "Field name too short", errcode=Codes.INVALID_PARAM)
if len(field_name.encode("utf-8")) > MAX_CUSTOM_FIELD_LEN:
raise SynapseError(400, "Field name too long", errcode=Codes.KEY_TOO_LARGE)
if not is_namedspaced_grammar(field_name):
raise SynapseError(
400,
"Field name does not follow Common Namespaced Identifier Grammar",
errcode=Codes.INVALID_PARAM,
)
content = parse_json_object_from_request(request)
try:
new_value = content[field_name]
except KeyError:
raise SynapseError(
400, f"Missing key '{field_name}'", errcode=Codes.MISSING_PARAM
)
propagate = _read_propagate(self.hs, request)
requester_suspended = (
await self.hs.get_datastores().main.get_user_suspended_status(
requester.user.to_string()
)
)
if requester_suspended:
raise SynapseError(
403,
"Updating profile while account is suspended is not allowed.",
Codes.USER_ACCOUNT_SUSPENDED,
)
if field_name == ProfileFields.DISPLAYNAME:
await self.profile_handler.set_displayname(
user, requester, new_value, is_admin, propagate=propagate
)
elif field_name == ProfileFields.AVATAR_URL:
await self.profile_handler.set_avatar_url(
user, requester, new_value, is_admin, propagate=propagate
)
else:
await self.profile_handler.set_profile_field(
user, requester, field_name, new_value, is_admin
)
return 200, {}
async def on_DELETE(
self, request: SynapseRequest, user_id: str, field_name: str
) -> Tuple[int, JsonDict]:
if not UserID.is_valid(user_id):
raise SynapseError(
HTTPStatus.BAD_REQUEST, "Invalid user id", Codes.INVALID_PARAM
)
requester = await self.auth.get_user_by_req(request)
user = UserID.from_string(user_id)
is_admin = await self.auth.is_server_admin(requester)
if not field_name:
raise SynapseError(400, "Field name too short", errcode=Codes.INVALID_PARAM)
if len(field_name.encode("utf-8")) > MAX_CUSTOM_FIELD_LEN:
raise SynapseError(400, "Field name too long", errcode=Codes.KEY_TOO_LARGE)
if not is_namedspaced_grammar(field_name):
raise SynapseError(
400,
"Field name does not follow Common Namespaced Identifier Grammar",
errcode=Codes.INVALID_PARAM,
)
propagate = _read_propagate(self.hs, request)
requester_suspended = (
await self.hs.get_datastores().main.get_user_suspended_status(
requester.user.to_string()
)
)
if requester_suspended:
raise SynapseError(
403,
"Updating profile while account is suspended is not allowed.",
Codes.USER_ACCOUNT_SUSPENDED,
)
if field_name == ProfileFields.DISPLAYNAME:
await self.profile_handler.set_displayname(
user, requester, "", is_admin, propagate=propagate
)
elif field_name == ProfileFields.AVATAR_URL:
await self.profile_handler.set_avatar_url(
user, requester, "", is_admin, propagate=propagate
)
else:
await self.profile_handler.delete_profile_field(
user, requester, field_name, is_admin
)
return 200, {}
def register_servlets(hs: "HomeServer", http_server: HttpServer) -> None:
# The specific displayname / avatar URL / custom field endpoints *must* appear
# before their corresponding generic profile endpoint.
ProfileDisplaynameRestServlet(hs).register(http_server)
ProfileAvatarURLRestServlet(hs).register(http_server)
ProfileRestServlet(hs).register(http_server)
if hs.config.experimental.msc4133_enabled:
UnstableProfileFieldRestServlet(hs).register(http_server)
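
A rough client-side sketch of the unstable MSC4133 endpoints registered above; the homeserver URL, token, and the `org.example.language` field name are placeholders, and a real field name must pass the namespaced-grammar and length checks the servlet enforces.

```python
# Hypothetical calls against the unstable profile-field endpoints shown above.
import requests

base = "https://example.org/_matrix/client/unstable/uk.tcpip.msc4133"
user_id = "@alice:example.org"
headers = {"Authorization": "Bearer <access_token>"}

# PUT sets a custom field; the body keys the value by the field name.
requests.put(
    f"{base}/profile/{user_id}/org.example.language",
    headers=headers,
    json={"org.example.language": "en-GB"},
)

# GET returns {field_name: value}.
print(requests.get(
    f"{base}/profile/{user_id}/org.example.language", headers=headers
).json())

# DELETE removes the field again.
requests.delete(f"{base}/profile/{user_id}/org.example.language", headers=headers)
```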

View File

@@ -20,13 +20,11 @@
#
import logging
import re
from http import HTTPStatus
from typing import TYPE_CHECKING, Tuple
from synapse._pydantic_compat import StrictStr
from synapse.api.errors import AuthError, Codes, NotFoundError, SynapseError
from synapse.api.urls import CLIENT_API_PREFIX
from synapse.http.server import HttpServer
from synapse.http.servlet import (
RestServlet,
@@ -127,16 +125,6 @@ class ReportRoomRestServlet(RestServlet):
self.clock = hs.get_clock()
self.store = hs.get_datastores().main
# TODO: Remove the unstable variant after 2-3 releases
# https://github.com/element-hq/synapse/issues/17373
if hs.config.experimental.msc4151_enabled:
self.PATTERNS.append(
re.compile(
f"^{CLIENT_API_PREFIX}/unstable/org.matrix.msc4151"
"/rooms/(?P<room_id>[^/]*)/report$"
)
)
class PostBody(RequestBodyModel):
reason: StrictStr

View File

@@ -24,9 +24,10 @@ from collections import defaultdict
from typing import TYPE_CHECKING, Any, Dict, List, Mapping, Optional, Tuple, Union
from synapse.api.constants import AccountDataTypes, EduTypes, Membership, PresenceState
from synapse.api.errors import Codes, StoreError, SynapseError
from synapse.api.errors import Codes, LimitExceededError, StoreError, SynapseError
from synapse.api.filtering import FilterCollection
from synapse.api.presence import UserPresenceState
from synapse.api.ratelimiting import Ratelimiter
from synapse.events.utils import (
SerializeEventConfig,
format_event_for_client_v2_without_room_id,
@@ -126,6 +127,13 @@ class SyncRestServlet(RestServlet):
cache_name="sync_valid_filter",
)
# Ratelimiter for presence updates, keyed by requester.
self._presence_per_user_limiter = Ratelimiter(
store=self.store,
clock=self.clock,
cfg=hs.config.ratelimiting.rc_presence_per_user,
)
async def on_GET(self, request: SynapseRequest) -> Tuple[int, JsonDict]:
# This will always be set by the time Twisted calls us.
assert request.args is not None
@@ -239,7 +247,14 @@ class SyncRestServlet(RestServlet):
# send any outstanding server notices to the user.
await self._server_notices_sender.on_user_syncing(user.to_string())
affect_presence = set_presence != PresenceState.OFFLINE
# ignore the presence update if the ratelimit is exceeded but do not pause the request
try:
await self._presence_per_user_limiter.ratelimit(requester, pause=0.0)
except LimitExceededError:
affect_presence = False
logger.debug("User set_presence ratelimit exceeded; ignoring it.")
else:
affect_presence = set_presence != PresenceState.OFFLINE
context = await self.presence_handler.user_syncing(
user.to_string(),

View File

@@ -170,10 +170,10 @@ class VersionsRestServlet(RestServlet):
),
# MSC4140: Delayed events
"org.matrix.msc4140": bool(self.config.server.max_event_delay_ms),
# MSC4151: Report room API (Client-Server API)
"org.matrix.msc4151": self.config.experimental.msc4151_enabled,
# Simplified sliding sync
"org.matrix.simplified_msc3575": msc3575_enabled,
# Arbitrary key-value profile fields.
"uk.tcpip.msc4133": self.config.experimental.msc4133_enabled,
},
},
)
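
Clients can feature-detect these flags via `GET /_matrix/client/versions`; a short sketch (placeholder homeserver URL) of checking for MSC4133 before touching the unstable profile endpoints:

```python
# Hypothetical feature-detection snippet; "https://example.org" is a placeholder.
import requests

versions = requests.get("https://example.org/_matrix/client/versions", timeout=10).json()
if versions.get("unstable_features", {}).get("uk.tcpip.msc4133", False):
    print("server advertises arbitrary key-value profile fields (MSC4133)")
```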

View File

@@ -391,7 +391,7 @@ class HomeServer(metaclass=abc.ABCMeta):
def is_mine(self, domain_specific_string: DomainSpecificString) -> bool:
return domain_specific_string.domain == self.hostname
def is_mine_id(self, string: str) -> bool:
def is_mine_id(self, user_id: str) -> bool:
"""Determines whether a user ID or room alias originates from this homeserver.
Returns:
@@ -399,7 +399,7 @@ class HomeServer(metaclass=abc.ABCMeta):
homeserver.
`False` otherwise, or if the user ID or room alias is malformed.
"""
localpart_hostname = string.split(":", 1)
localpart_hostname = user_id.split(":", 1)
if len(localpart_hostname) < 2:
return False
return localpart_hostname[1] == self.hostname
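
The renamed parameter only clarifies intent; the behaviour is unchanged. As a standalone illustration (hypothetical free function mirroring the method above), malformed IDs simply compare as not local rather than raising:

```python
def is_mine_id(user_id: str, hostname: str = "example.org") -> bool:
    localpart_hostname = user_id.split(":", 1)
    if len(localpart_hostname) < 2:
        return False  # malformed: no ":domain" part at all
    return localpart_hostname[1] == hostname

assert is_mine_id("@alice:example.org") is True
assert is_mine_id("@bob:matrix.org") is False
assert is_mine_id("not-a-user-id") is False  # malformed IDs are simply not ours
```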

View File

@@ -59,11 +59,13 @@ from synapse.types.state import StateFilter
from synapse.util.async_helpers import Linearizer
from synapse.util.caches.expiringcache import ExpiringCache
from synapse.util.metrics import Measure, measure_func
from synapse.util.stringutils import shortstr
if TYPE_CHECKING:
from synapse.server import HomeServer
from synapse.storage.controllers import StateStorageController
from synapse.storage.databases.main import DataStore
from synapse.storage.databases.state.deletion import StateDeletionDataStore
logger = logging.getLogger(__name__)
metrics_logger = logging.getLogger("synapse.state.metrics")
@@ -194,6 +196,8 @@ class StateHandler:
self._storage_controllers = hs.get_storage_controllers()
self._events_shard_config = hs.config.worker.events_shard_config
self._instance_name = hs.get_instance_name()
self._state_store = hs.get_datastores().state
self._state_deletion_store = hs.get_datastores().state_deletion
self._update_current_state_client = (
ReplicationUpdateCurrentStateRestServlet.make_client(hs)
@@ -355,6 +359,28 @@ class StateHandler:
await_full_state=False,
)
# Ensure we still have the state groups we're relying on, and bump
# their usage time to avoid them being deleted from under us.
if entry.state_group:
missing_state_group = await self._state_deletion_store.check_state_groups_and_bump_deletion(
{entry.state_group}
)
if missing_state_group:
raise Exception(f"Missing state group: {entry.state_group}")
elif entry.prev_group:
# We only rely on the prev group when persisting the event if we
# don't have an `entry.state_group`.
missing_state_group = await self._state_deletion_store.check_state_groups_and_bump_deletion(
{entry.prev_group}
)
if missing_state_group:
# If we're missing the prev group then we can just clear the
# entries, and rely on `entry._state` (which must exist if
# `entry.state_group` is None)
entry.prev_group = None
entry.delta_ids = None
state_group_before_event_prev_group = entry.prev_group
deltas_to_state_group_before_event = entry.delta_ids
state_ids_before_event = None
@@ -475,7 +501,10 @@ class StateHandler:
@trace
@measure_func()
async def resolve_state_groups_for_events(
self, room_id: str, event_ids: StrCollection, await_full_state: bool = True
self,
room_id: str,
event_ids: StrCollection,
await_full_state: bool = True,
) -> _StateCacheEntry:
"""Given a list of event_ids this method fetches the state at each
event, resolves conflicts between them and returns them.
@@ -511,6 +540,7 @@ class StateHandler:
) = await self._state_storage_controller.get_state_group_delta(
state_group_id
)
return _StateCacheEntry(
state=None,
state_group=state_group_id,
@@ -531,7 +561,9 @@ class StateHandler:
room_version,
state_to_resolve,
None,
state_res_store=StateResolutionStore(self.store),
state_res_store=StateResolutionStore(
self.store, self._state_deletion_store
),
)
return result
@@ -663,7 +695,25 @@ class StateResolutionHandler:
async with self.resolve_linearizer.queue(group_names):
cache = self._state_cache.get(group_names, None)
if cache:
return cache
# Check that the returned cache entry doesn't point to deleted
# state groups.
state_groups_to_check = set()
if cache.state_group is not None:
state_groups_to_check.add(cache.state_group)
if cache.prev_group is not None:
state_groups_to_check.add(cache.prev_group)
missing_state_groups = await state_res_store.state_deletion_store.check_state_groups_and_bump_deletion(
state_groups_to_check
)
if not missing_state_groups:
return cache
else:
# There are missing state groups, so let's remove the stale
# entry and continue as if it was a cache miss.
self._state_cache.pop(group_names, None)
logger.info(
"Resolving state for %s with groups %s",
@@ -671,6 +721,16 @@ class StateResolutionHandler:
list(group_names),
)
# We double check that none of the state groups have been deleted.
# They shouldn't be as all these state groups should be referenced.
missing_state_groups = await state_res_store.state_deletion_store.check_state_groups_and_bump_deletion(
group_names
)
if missing_state_groups:
raise Exception(
f"State groups have been deleted: {shortstr(missing_state_groups)}"
)
state_groups_histogram.observe(len(state_groups_ids))
new_state = await self.resolve_events_with_store(
@@ -884,7 +944,8 @@ class StateResolutionStore:
in well defined way.
"""
store: "DataStore"
main_store: "DataStore"
state_deletion_store: "StateDeletionDataStore"
def get_events(
self, event_ids: StrCollection, allow_rejected: bool = False
@@ -899,7 +960,7 @@ class StateResolutionStore:
An awaitable which resolves to a dict from event_id to event.
"""
return self.store.get_events(
return self.main_store.get_events(
event_ids,
redact_behaviour=EventRedactBehaviour.as_is,
get_prev_content=False,
@@ -920,4 +981,4 @@ class StateResolutionStore:
An awaitable that resolves to a set of event IDs.
"""
return self.store.get_auth_chain_difference(room_id, state_sets)
return self.main_store.get_auth_chain_difference(room_id, state_sets)
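
The recurring pattern in this file is "validate before trusting": any cached entry or prev-group is checked against the state-deletion store (which also bumps its deletion timer) before it is relied on. A self-contained sketch of the cache part, with hypothetical types standing in for the real stores:

```python
# Hypothetical stand-ins for the state cache and the deletion store.
from dataclasses import dataclass
from typing import Optional

@dataclass
class CacheEntry:
    state_group: Optional[int]
    prev_group: Optional[int]

deleted_groups = {42}  # groups pending deletion, per the deletion store
cache = {("g1", "g2"): CacheEntry(state_group=42, prev_group=None)}

def lookup(key):
    entry = cache.get(key)
    if entry is None:
        return None
    referenced = {g for g in (entry.state_group, entry.prev_group) if g is not None}
    if referenced & deleted_groups:
        # Stale: a referenced group is pending deletion, so evict the entry
        # and let the caller fall through to a full re-resolution.
        cache.pop(key, None)
        return None
    return entry

assert lookup(("g1", "g2")) is None  # evicted because group 42 is pending deletion
```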

View File

@@ -29,15 +29,15 @@ from typing import (
Generator,
Iterable,
List,
Literal,
Optional,
Protocol,
Sequence,
Set,
Tuple,
overload,
)
from typing_extensions import Literal, Protocol
from synapse import event_auth
from synapse.api.constants import EventTypes
from synapse.api.errors import AuthError

View File

@@ -86,7 +86,9 @@ class SQLBaseStore(metaclass=ABCMeta):
"""
def _invalidate_state_caches(
self, room_id: str, members_changed: Collection[str]
self,
room_id: str,
members_changed: Collection[str],
) -> None:
"""Invalidates caches that are based on the current state, but does
not stream invalidations down replication.

View File

@@ -789,7 +789,7 @@ class BackgroundUpdater:
# we may already have a half-built index. Let's just drop it
# before trying to create it again.
sql = "DROP INDEX IF EXISTS %s" % (index_name,)
sql = "DROP INDEX CONCURRENTLY IF EXISTS %s" % (index_name,)
logger.debug("[SQL] %s", sql)
c.execute(sql)
@@ -814,7 +814,7 @@ class BackgroundUpdater:
if replaces_index is not None:
# We drop the old index as the new index has now been created.
sql = f"DROP INDEX IF EXISTS {replaces_index}"
sql = f"DROP INDEX CONCURRENTLY IF EXISTS {replaces_index}"
logger.debug("[SQL] %s", sql)
c.execute(sql)
finally:
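
`CONCURRENTLY` lets PostgreSQL drop the half-built or superseded index without taking a lock that blocks concurrent writes, but such statements cannot run inside a transaction block, so they have to be issued on an autocommit connection. A minimal standalone sketch (placeholder DSN and index name, not Synapse code):

```python
import psycopg2

conn = psycopg2.connect("dbname=synapse user=synapse host=localhost")  # placeholder DSN
conn.autocommit = True  # CONCURRENTLY refuses to run inside a transaction block
with conn.cursor() as cur:
    # Drops the index without blocking concurrent writes to the table.
    cur.execute("DROP INDEX CONCURRENTLY IF EXISTS some_half_built_index")
conn.close()
```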

View File

@@ -332,6 +332,7 @@ class EventsPersistenceStorageController:
# store for now.
self.main_store = stores.main
self.state_store = stores.state
self._state_deletion_store = stores.state_deletion
assert stores.persist_events
self.persist_events_store = stores.persist_events
@@ -549,7 +550,9 @@ class EventsPersistenceStorageController:
room_version,
state_maps_by_state_group,
event_map=None,
state_res_store=StateResolutionStore(self.main_store),
state_res_store=StateResolutionStore(
self.main_store, self._state_deletion_store
),
)
return await res.get_state(self._state_controller, StateFilter.all())
@@ -635,15 +638,20 @@ class EventsPersistenceStorageController:
room_id, [e for e, _ in chunk]
)
await self.persist_events_store._persist_events_and_state_updates(
room_id,
chunk,
state_delta_for_room=state_delta_for_room,
new_forward_extremities=new_forward_extremities,
use_negative_stream_ordering=backfilled,
inhibit_local_membership_updates=backfilled,
new_event_links=new_event_links,
)
# Stop the state groups from being deleted while we're persisting
# them.
async with self._state_deletion_store.persisting_state_group_references(
events_and_contexts
):
await self.persist_events_store._persist_events_and_state_updates(
room_id,
chunk,
state_delta_for_room=state_delta_for_room,
new_forward_extremities=new_forward_extremities,
use_negative_stream_ordering=backfilled,
inhibit_local_membership_updates=backfilled,
new_event_links=new_event_links,
)
return replaced_events
@@ -965,7 +973,9 @@ class EventsPersistenceStorageController:
room_version,
state_groups,
events_map,
state_res_store=StateResolutionStore(self.main_store),
state_res_store=StateResolutionStore(
self.main_store, self._state_deletion_store
),
)
state_resolutions_during_persistence.inc()

View File

@@ -21,9 +21,10 @@
import itertools
import logging
from typing import TYPE_CHECKING, Set
from typing import TYPE_CHECKING, Collection, Mapping, Set
from synapse.logging.context import nested_logging_context
from synapse.metrics.background_process_metrics import wrap_as_background_process
from synapse.storage.databases import Databases
if TYPE_CHECKING:
@@ -38,12 +39,17 @@ class PurgeEventsStorageController:
def __init__(self, hs: "HomeServer", stores: Databases):
self.stores = stores
if hs.config.worker.run_background_tasks:
self._delete_state_loop_call = hs.get_clock().looping_call(
self._delete_state_groups_loop, 60 * 1000
)
async def purge_room(self, room_id: str) -> None:
"""Deletes all record of a room"""
with nested_logging_context(room_id):
state_groups_to_delete = await self.stores.main.purge_room(room_id)
await self.stores.state.purge_room_state(room_id, state_groups_to_delete)
await self.stores.main.purge_room(room_id)
await self.stores.state.purge_room_state(room_id)
async def purge_history(
self, room_id: str, token: str, delete_local_events: bool
@@ -68,11 +74,15 @@ class PurgeEventsStorageController:
logger.info("[purge] finding state groups that can be deleted")
sg_to_delete = await self._find_unreferenced_groups(state_groups)
await self.stores.state.purge_unreferenced_state_groups(
room_id, sg_to_delete
# Mark these state groups as pending deletion, they will actually
# get deleted automatically later.
await self.stores.state_deletion.mark_state_groups_as_pending_deletion(
sg_to_delete
)
async def _find_unreferenced_groups(self, state_groups: Set[int]) -> Set[int]:
async def _find_unreferenced_groups(
self, state_groups: Collection[int]
) -> Set[int]:
"""Used when purging history to figure out which state groups can be
deleted.
@@ -118,6 +128,78 @@ class PurgeEventsStorageController:
next_to_search |= prevs
state_groups_seen |= prevs
# We also check to see if anything referencing the state groups are
# also unreferenced. This helps ensure that we delete unreferenced
# state groups, if we don't then we will de-delta them when we
# delete the other state groups leading to increased DB usage.
next_edges = await self.stores.state.get_next_state_groups(current_search)
nexts = set(next_edges.keys())
nexts -= state_groups_seen
next_to_search |= nexts
state_groups_seen |= nexts
to_delete = state_groups_seen - referenced_groups
return to_delete
@wrap_as_background_process("_delete_state_groups_loop")
async def _delete_state_groups_loop(self) -> None:
"""Background task that deletes any state groups that may be pending
deletion."""
while True:
next_to_delete = await self.stores.state_deletion.get_next_state_group_collection_to_delete()
if next_to_delete is None:
break
(room_id, groups_to_sequences) = next_to_delete
made_progress = await self._delete_state_groups(
room_id, groups_to_sequences
)
# If no progress was made in deleting the state groups, then we
# break to allow a pause before trying again next time we get
# called.
if not made_progress:
break
async def _delete_state_groups(
self, room_id: str, groups_to_sequences: Mapping[int, int]
) -> bool:
"""Tries to delete the given state groups.
Returns:
Whether we made progress in deleting the state groups (or marking
them as referenced).
"""
# We double check if any of the state groups have become referenced.
# This shouldn't happen, as any usages should cause the state group to
# be removed as pending deletion.
referenced_state_groups = await self.stores.main.get_referenced_state_groups(
groups_to_sequences
)
if referenced_state_groups:
# We mark any state groups that have become referenced as being
# used.
await self.stores.state_deletion.mark_state_groups_as_used(
referenced_state_groups
)
# Update list of state groups to remove referenced ones
groups_to_sequences = {
state_group: sequence_number
for state_group, sequence_number in groups_to_sequences.items()
if state_group not in referenced_state_groups
}
if not groups_to_sequences:
# We made progress here as long as we marked some state groups as
# now referenced.
return len(referenced_state_groups) > 0
return await self.stores.state.purge_unreferenced_state_groups(
room_id,
groups_to_sequences,
)
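
Purges now only mark state groups as pending deletion; a background loop deletes whatever is still unreferenced later and backs off once it stops making progress. A loose, self-contained sketch of that two-phase shape (hypothetical helpers, greatly simplified):

```python
import collections

pending = collections.deque()   # stand-in for the state_deletion store
referenced: set[int] = {7}      # groups that became referenced again meanwhile

def mark_state_groups_as_pending_deletion(groups: set[int]) -> None:
    pending.append(set(groups))

def delete_state_groups_once() -> bool:
    """Process one pending batch; return whether any progress was made."""
    if not pending:
        return False
    batch = pending.popleft()
    still_referenced = batch & referenced
    to_delete = batch - still_referenced
    # A real implementation would re-mark `still_referenced` as in use and
    # issue the actual deletes for `to_delete` here.
    return bool(still_referenced or to_delete)

mark_state_groups_as_pending_deletion({7, 8, 9})
while delete_state_groups_once():
    pass
```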

View File

@@ -35,6 +35,7 @@ from typing import (
Iterable,
Iterator,
List,
Literal,
Mapping,
Optional,
Sequence,
@@ -47,7 +48,7 @@ from typing import (
import attr
from prometheus_client import Counter, Histogram
from typing_extensions import Concatenate, Literal, ParamSpec
from typing_extensions import Concatenate, ParamSpec
from twisted.enterprise import adbapi
from twisted.internet.interfaces import IReactorCore
@@ -2159,10 +2160,26 @@ class DatabasePool:
if rowcount > 1:
raise StoreError(500, "More than one row matched (%s)" % (table,))
# Ideally we could use the overload decorator here to specify that the
# return type is only optional if allow_none is True, but this does not work
# when you call a static method from an instance.
# See https://github.com/python/mypy/issues/7781
@overload
@staticmethod
def simple_select_one_txn(
txn: LoggingTransaction,
table: str,
keyvalues: Dict[str, Any],
retcols: Collection[str],
allow_none: Literal[False] = False,
) -> Tuple[Any, ...]: ...
@overload
@staticmethod
def simple_select_one_txn(
txn: LoggingTransaction,
table: str,
keyvalues: Dict[str, Any],
retcols: Collection[str],
allow_none: Literal[True] = True,
) -> Optional[Tuple[Any, ...]]: ...
@staticmethod
def simple_select_one_txn(
txn: LoggingTransaction,
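
The overloads encode "the result is only `Optional` when `allow_none=True`" for the type checker, with the in-source comment noting the mypy caveat about static methods called through an instance. A minimal standalone sketch of the same pattern (hypothetical `fetch_row` helper):

```python
from typing import Literal, Optional, overload

@overload
def fetch_row(key: str, allow_none: Literal[False] = False) -> tuple: ...
@overload
def fetch_row(key: str, allow_none: Literal[True] = True) -> Optional[tuple]: ...
def fetch_row(key: str, allow_none: bool = False) -> Optional[tuple]:
    row = {"k": ("v",)}.get(key)
    if row is None and not allow_none:
        raise KeyError(key)
    return row

row: tuple = fetch_row("k")                                # checker sees tuple
maybe: Optional[tuple] = fetch_row("x", allow_none=True)   # checker sees Optional[tuple]
```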

Some files were not shown because too many files have changed in this diff.