Compare commits

..

122 Commits

Author SHA1 Message Date
Erik Johnston
d2e0c0c927 Newsfile 2025-02-06 12:17:17 +00:00
Erik Johnston
af84f1d7aa Don't log exceptions for obviously incorrect stream tokens
We log incorrect tokens because we want to catch bugs where Synapse returns bad
tokens. However, sometimes clients simply send tokens that are, e.g., empty.
2025-02-06 12:15:56 +00:00
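A minimal sketch of the distinction described above, with hypothetical names and a toy token format (Synapse's real tokens are richer): obviously malformed client-supplied tokens are rejected without logging a stack trace, while tokens that look like they could have been minted by Synapse but fail to parse are still logged as potential bugs.

```
import logging

logger = logging.getLogger(__name__)


class InvalidClientTokenError(ValueError):
    """Raised for tokens that cannot possibly be valid, e.g. an empty string."""


def parse_stream_token(token: str) -> int:
    # Toy format "s<position>"; illustrative only.
    if not token or not token.startswith("s"):
        # Obviously bogus client input: reject it, but don't log an exception.
        raise InvalidClientTokenError(f"Malformed stream token: {token!r}")
    try:
        return int(token[1:])
    except ValueError:
        # Superficially well-formed but unparseable: this may indicate that
        # Synapse handed out a bad token, so keep logging these.
        logger.exception("Failed to parse stream token %r", token)
        raise
```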
Erik Johnston
29534e7d0a Merge branch 'release-v1.124' into develop 2025-02-05 18:23:59 +00:00
Erik Johnston
553e9882bf 1.124.0rc2 2025-02-05 16:35:55 +00:00
Erik Johnston
3391da348f Fix bug where persisting some events fails after unclean shutdown. (#18137)
Introduced in #18107

`UniqueViolation: duplicate key value violates unique constraint
"state_groups_persisting_pkey"`
2025-02-05 16:26:07 +00:00
Matthew Hodgson
6fe41d2b47 make dual licensing explicit (#18134)
Update the README & LICENSE files to make it explicit that you can buy a
commercial license from Element as an alternative to the AGPL.
2025-02-05 13:40:10 +00:00
Erik Johnston
5b03265cfb Fix 'Fix lint' GHA (#18136)
c.f. #18121

---------

Co-authored-by: Quentin Gliech <quenting@element.io>
2025-02-05 12:30:13 +00:00
Erik Johnston
b8a333004a Fix legacy modules check_username_for_spam (#18135)
Broke in #17916, as the signature inspection incorrectly looks at the
wrapper function. We fix this by setting the signature on the wrapper
function to that of the wrapped function via `@functools.wraps`.
2025-02-05 12:07:49 +00:00
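A standalone illustration of the `@functools.wraps` fix described above: without it, `inspect.signature()` reports the generic wrapper's `(*args, **kwargs)` signature, so signature-based detection of legacy one-argument callbacks breaks.

```
import functools
import inspect


def check_username_for_spam(user_profile):  # a legacy one-argument callback
    return False


def wrap_without_wraps(f):
    def wrapper(*args, **kwargs):
        return f(*args, **kwargs)
    return wrapper


def wrap_with_wraps(f):
    @functools.wraps(f)
    def wrapper(*args, **kwargs):
        return f(*args, **kwargs)
    return wrapper


print(inspect.signature(wrap_without_wraps(check_username_for_spam)))
# -> (*args, **kwargs): inspection sees the wrapper, not the callback
print(inspect.signature(wrap_with_wraps(check_username_for_spam)))
# -> (user_profile): the wrapped callback's signature is preserved
```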
V02460
e41174cae3 Add MSC3861 config options admin_token_path and client_secret_path (#18004)
Another PR on my quest to a `*_path` variant for every secret. Adds two
config options `admin_token_path` and `client_secret_path` to the
experimental config under `experimental_features.msc3861`. Also includes
tests.

I tried to be a good citizen here by following `attrs` conventions and
not rewriting the corresponding non-path variants in the class, but
instead adding methods to retrieve the value.

Reading secrets from files has the security advantage of separating the
secrets from the config. It also simplifies secrets management in
Kubernetes. Also useful to NixOS users.
2025-02-04 12:45:33 -06:00
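A hedged sketch of the general `*_path` pattern these options follow (names are illustrative, not Synapse's actual config code): the secret lives in a file referenced by the config, and a small accessor returns whichever variant was set.

```
from pathlib import Path
from typing import Optional


def read_secret_file(path: str) -> str:
    # Secret files commonly end with a trailing newline; strip it.
    return Path(path).read_text().rstrip("\n")


def resolve_secret(inline: Optional[str], path: Optional[str]) -> Optional[str]:
    if inline is not None and path is not None:
        raise ValueError("Set either the secret or its *_path variant, not both")
    if path is not None:
        return read_secret_file(path)
    return inline


# e.g. client_secret = resolve_secret(cfg.get("client_secret"), cfg.get("client_secret_path"))
```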
Erik Johnston
37e893499f 1.124.0rc1 2025-02-04 11:53:27 +00:00
Erik Johnston
c46d452c7c Fix bug where purging history could lead to increase in disk space usage (#18131)
When purging history, we try and delete any state groups that become
unreferenced (i.e. there are no longer any events that directly
reference them). When we delete a state group that is referenced by
another state group, we "de-delta" that state group so that it no longer
refers to the state group that is deleted.

There are two bugs with this approach that we fix here:
1. There is a common pattern where we end up storing two state groups
when persisting a state event: the state before and after the new state
event, where the latter is stored as a delta to the former. When
deleting state groups we only deleted the "new" state and left (and
potentially de-deltaed) the old state. This was due to a bug/typo when
trying to find referenced state groups.
2. There are times where we store unreferenced state groups in the DB;
during the purging of history these would not get rechecked and would
instead always be de-deltaed. Instead, we should check for this case and
delete any unreferenced state groups rather than de-deltaing them.

The effect of the above bugs is that when purging history we'd end up
with lots of unreferenced state groups that had been de-deltaed (i.e.
stored as the full state). This can lead to dramatic increases in
storage space used.
2025-02-03 19:04:19 +00:00
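A conceptual sketch of the "de-delta" operation the commit refers to, with plain dicts standing in for the real state group storage: when a group that others are stored as deltas against is deleted, its entries are folded into the referencing groups so they no longer point at it.

```
def de_delta(groups, deltas, group_to_delete):
    """groups: {group_id: {state_key: event_id}} of each group's stored entries.
    deltas: {group_id: prev_group_id or None} (which group each is a delta of)."""
    for group_id, prev in list(deltas.items()):
        if prev == group_to_delete:
            merged = dict(groups[group_to_delete])
            merged.update(groups[group_id])  # the referencing group's own entries win
            groups[group_id] = merged
            # Re-point past the deleted group (to its parent, if it had one).
            deltas[group_id] = deltas.get(group_to_delete)
    groups.pop(group_to_delete, None)
    deltas.pop(group_to_delete, None)


groups = {1: {"m.room.create": "$a"}, 2: {"m.room.member": "$b"}}
deltas = {1: None, 2: 1}  # group 2 is stored as a delta on top of group 1
de_delta(groups, deltas, 1)
# groups[2] is now the full state {"m.room.create": "$a", "m.room.member": "$b"}
```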
Erik Johnston
27dbb1b429 Add locking to more safely delete state groups: Part 2 (#18130)
This actually makes it so that deleting state groups goes via the new
mechanism.

c.f. #18107
2025-02-03 17:58:55 +00:00
Erik Johnston
aa6e5c2ecb Add locking to more safely delete state groups: Part 1 (#18107)
Currently we don't really have anything that stops us from deleting
state groups when an in-flight event references it. This is a fairly
rare race currently, but we want to be able to more aggressively delete
state groups so it is important to address this to ensure that the
database remains valid.

This implements the locking, but doesn't actually use it.

See the class docstring of the new data store for an explanation for how
this works.

---------

Co-authored-by: Devon Hudson <devon.dmytro@gmail.com>
2025-02-03 17:29:15 +00:00
Andrew Morgan
ac1bf682ff Allow (un)block_room storage functions to be called on workers (#18119)
This is so workers can call these functions.

This was preventing the [Delete Room Admin
API](https://element-hq.github.io/synapse/latest/admin_api/rooms.html#version-2-new-version)
from succeeding when `block: true` was specified. This was because we
had `run_background_tasks_on` configured to run on a separate worker.

As workers weren't able to call the `block_room` storage function before
this PR, the (delete room) task failed when taken off the queue by the
worker.
2025-01-30 20:48:12 +00:00
Eric Eastwood
a0b70473fc Raise an error if someone is using an incorrect suffix in a config duration string (#18112)
Previously, a value like `5q` would be interpreted as 5 milliseconds. We
should just raise an error instead of letting someone run with a
misconfiguration.
2025-01-29 18:14:02 -06:00
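An illustrative reimplementation of the behaviour described above (not Synapse's actual parser): known duration suffixes map to milliseconds and anything unrecognised raises, instead of `5q` being silently read as 5 ms.

```
_SUFFIXES_MS = {
    "ms": 1,
    "s": 1000,
    "m": 60 * 1000,
    "h": 60 * 60 * 1000,
    "d": 24 * 60 * 60 * 1000,
    "w": 7 * 24 * 60 * 60 * 1000,
    "y": 365 * 24 * 60 * 60 * 1000,
}


def parse_duration_ms(value: str) -> int:
    if value.isdigit():
        return int(value)  # bare integers are already milliseconds
    # Check longer suffixes first so "ms" wins over "m" and "s".
    for suffix, multiplier in sorted(_SUFFIXES_MS.items(), key=lambda kv: -len(kv[0])):
        if value.endswith(suffix) and value[: -len(suffix)].isdigit():
            return int(value[: -len(suffix)]) * multiplier
    raise ValueError(f"Unknown duration suffix in {value!r}")


parse_duration_ms("5s")    # -> 5000
# parse_duration_ms("5q")  # -> ValueError instead of a silent misconfiguration
```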
Devon Hudson
95a85b1129 Merge branch 'master' into develop 2025-01-28 09:23:26 -07:00
Devon Hudson
3d8535b1de 1.123.0 2025-01-28 08:37:58 -07:00
Will Hunt
628351b98d Never autojoin deactivated & suspended users. (#18073)
This PR changes the logic so that deactivated users are always ignored.
Suspended users were already effectively ignored as Synapse forbids a
join while suspended.

---------

Co-authored-by: Devon Hudson <devon.dmytro@gmail.com>
2025-01-28 00:37:24 +00:00
dependabot[bot]
8f27b3af07 Bump python-multipart from 0.0.18 to 0.0.20 (#18096)
Bumps [python-multipart](https://github.com/Kludex/python-multipart)
from 0.0.18 to 0.0.20.

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-27 21:28:00 +00:00
dependabot[bot]
579f4ac1cd Bump serde_json from 1.0.135 to 1.0.137 (#18099)
Bumps [serde_json](https://github.com/serde-rs/json) from 1.0.135 to
1.0.137.

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-27 21:24:07 +00:00
dependabot[bot]
c53999dab8 Bump types-bleach from 6.1.0.20240331 to 6.2.0.20241123 (#18082)
Bumps [types-bleach](https://github.com/python/typeshed) from
6.1.0.20240331 to 6.2.0.20241123.

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-27 21:04:41 +00:00
Andrew Morgan
b41a9ebb38 OIDC: increase length of generated nonce parameter from 30->32 chars (#18109) 2025-01-27 18:39:51 +00:00
Eric Eastwood
6ec5e13ec9 Fix join being denied after being invited over federation (#18075)
This also happens for rejecting an invite. Basically, any out-of-band membership transition where we first get the membership as an `outlier` and then rely on federation filling us in to de-outlier it.

This PR mainly addresses automated test flakiness, bots/scripts, and options within Synapse like [`auto_accept_invites`](https://element-hq.github.io/synapse/v1.122/usage/configuration/config_documentation.html#auto_accept_invites) that are able to react quickly (before federation is able to push us events), but also helps in generic scenarios where federation is lagging.

I initially thought this might be a Synapse consistency issue (see issues labeled with [`Z-Read-After-Write`](https://github.com/matrix-org/synapse/labels/Z-Read-After-Write)) but it seems to be an event auth logic problem. Workers probably do increase the number of possible race condition scenarios that make this visible though (replication and cache invalidation lag).

Fix https://github.com/element-hq/synapse/issues/15012
(probably also fixes https://github.com/matrix-org/synapse/issues/15012)
Related to https://github.com/matrix-org/matrix-spec/issues/2062

Problems:

 1. We don't consider [out-of-band membership](https://github.com/element-hq/synapse/blob/develop/docs/development/room-dag-concepts.md#out-of-band-membership-events) (outliers) in our `event_auth` logic even though we expose them in `/sync`.
 1. (This PR doesn't address this point) Perhaps we should consider authing events in the persistence queue, as events already in the queue could allow subsequent events to pass auth (events come through many channels: federation transaction, remote invite, remote join, local send). But this doesn't save us in the case where the event is more delayed over federation.


### What happened before?

I wrote some Complement test that stresses this exact scenario and reproduces the problem: https://github.com/matrix-org/complement/pull/757

```
COMPLEMENT_ALWAYS_PRINT_SERVER_LOGS=1 COMPLEMENT_DIR=../complement ./scripts-dev/complement.sh -run TestSynapseConsistency
```


We have `hs1` and `hs2` running in monolith mode (no workers):

 1. `@charlie1:hs2` is invited and joins the room:
     1. `hs1` invites `@charlie1:hs2` to a room which we receive on `hs2` as `PUT /_matrix/federation/v1/invite/{roomId}/{eventId}` (`on_invite_request(...)`) and the invite membership is persisted as an outlier. The `room_memberships` and `local_current_membership` database tables are also updated which means they are visible down `/sync` at this point.
     1. `@charlie1:hs2` decides to join because it saw the invite down `/sync`. Because `hs2` is not yet in the room, this happens as a remote join `make_join`/`send_join` which comes back with all of the auth events needed to auth successfully and now `@charlie1:hs2` is successfully joined to the room.
 1. `@charlie2:hs2` is invited and tries to join the room:
     1. `hs1` invites `@charlie2:hs2` to the room which we receive on `hs2` as `PUT /_matrix/federation/v1/invite/{roomId}/{eventId}` (`on_invite_request(...)`) and the invite membership is persisted as an outlier. The `room_memberships` and `local_current_membership` database tables are also updated which means they are visible down `/sync` at this point.
     1. Because `hs2` is already participating in the room, we also see the invite come over federation in a transaction and we start processing it (not done yet, see below)
     1. `@charlie2:hs2` decides to join because it saw the invite down `/sync`. Because `hs2` is already in the room, this happens as a local join, but we deny the event because our `event_auth` logic thinks that we have no membership in the room (we expected to be able to join because we saw the invite down `/sync`)
     1. We finally finish processing the `@charlie2:hs2` invite event from federation and de-outlier it.
         - If this had finished before we tried to join, we would have been fine, but this is the race condition that makes the situation visible.


Logs for `hs2`:

```
🗳️ on_invite_request: handling event <FrozenEventV3 event_id=$PRPCvdXdcqyjdUKP_NxGF2CcukmwOaoK0ZR1WiVOZVk, type=m.room.member, state_key=@user-2-charlie1:hs2, membership=invite, outlier=False>
🔦 _store_room_members_txn update room_memberships: <FrozenEventV3 event_id=$PRPCvdXdcqyjdUKP_NxGF2CcukmwOaoK0ZR1WiVOZVk, type=m.room.member, state_key=@user-2-charlie1:hs2, membership=invite, outlier=True>
🔦 _store_room_members_txn update local_current_membership: <FrozenEventV3 event_id=$PRPCvdXdcqyjdUKP_NxGF2CcukmwOaoK0ZR1WiVOZVk, type=m.room.member, state_key=@user-2-charlie1:hs2, membership=invite, outlier=True>
📨 Notifying about new event <FrozenEventV3 event_id=$PRPCvdXdcqyjdUKP_NxGF2CcukmwOaoK0ZR1WiVOZVk, type=m.room.member, state_key=@user-2-charlie1:hs2, membership=invite, outlier=True>
 on_invite_request: handled event <FrozenEventV3 event_id=$PRPCvdXdcqyjdUKP_NxGF2CcukmwOaoK0ZR1WiVOZVk, type=m.room.member, state_key=@user-2-charlie1:hs2, membership=invite, outlier=True>
🧲 do_invite_join for @user-2-charlie1:hs2 in !sfZVBdLUezpPWetrol:hs1
🔦 _store_room_members_txn update room_memberships: <FrozenEventV3 event_id=$bwv8LxFnqfpsw_rhR7OrTjtz09gaJ23MqstKOcs7ygA, type=m.room.member, state_key=@user-1-alice:hs1, membership=join, outlier=True>
🔦 _store_room_members_txn update room_memberships: <FrozenEventV3 event_id=$oju1ts3G3pz5O62IesrxX5is4LxAwU3WPr4xvid5ijI, type=m.room.member, state_key=@user-2-charlie1:hs2, membership=join, outlier=False>
📨 Notifying about new event <FrozenEventV3 event_id=$oju1ts3G3pz5O62IesrxX5is4LxAwU3WPr4xvid5ijI, type=m.room.member, state_key=@user-2-charlie1:hs2, membership=join, outlier=False>

...

🗳️ on_invite_request: handling event <FrozenEventV3 event_id=$O_54j7O--6xMsegY5EVZ9SA-mI4_iHJOIoRwYyeWIPY, type=m.room.member, state_key=@user-3-charlie2:hs2, membership=invite, outlier=False>
🔦 _store_room_members_txn update room_memberships: <FrozenEventV3 event_id=$O_54j7O--6xMsegY5EVZ9SA-mI4_iHJOIoRwYyeWIPY, type=m.room.member, state_key=@user-3-charlie2:hs2, membership=invite, outlier=True>
🔦 _store_room_members_txn update local_current_membership: <FrozenEventV3 event_id=$O_54j7O--6xMsegY5EVZ9SA-mI4_iHJOIoRwYyeWIPY, type=m.room.member, state_key=@user-3-charlie2:hs2, membership=invite, outlier=True>
📨 Notifying about new event <FrozenEventV3 event_id=$O_54j7O--6xMsegY5EVZ9SA-mI4_iHJOIoRwYyeWIPY, type=m.room.member, state_key=@user-3-charlie2:hs2, membership=invite, outlier=True>
 on_invite_request: handled event <FrozenEventV3 event_id=$O_54j7O--6xMsegY5EVZ9SA-mI4_iHJOIoRwYyeWIPY, type=m.room.member, state_key=@user-3-charlie2:hs2, membership=invite, outlier=True>
📬 handling received PDU in room !sfZVBdLUezpPWetrol:hs1: <FrozenEventV3 event_id=$O_54j7O--6xMsegY5EVZ9SA-mI4_iHJOIoRwYyeWIPY, type=m.room.member, state_key=@user-3-charlie2:hs2, membership=invite, outlier=False>
📮 handle_new_client_event: handling <FrozenEventV3 event_id=$WNVDTQrxy5tCdPQHMyHyIn7tE4NWqKsZ8Bn8R4WbBSA, type=m.room.member, state_key=@user-3-charlie2:hs2, membership=join, outlier=False>
 Denying new event <FrozenEventV3 event_id=$WNVDTQrxy5tCdPQHMyHyIn7tE4NWqKsZ8Bn8R4WbBSA, type=m.room.member, state_key=@user-3-charlie2:hs2, membership=join, outlier=False> because 403: You are not invited to this room.
synapse.http.server - 130 - INFO - POST-16 - <SynapseRequest at 0x7f460c91fbf0 method='POST' uri='/_matrix/client/v3/join/%21sfZVBdLUezpPWetrol:hs1?server_name=hs1' clientproto='HTTP/1.0' site='8080'> SynapseError: 403 - You are not invited to this room.
📨 Notifying about new event <FrozenEventV3 event_id=$O_54j7O--6xMsegY5EVZ9SA-mI4_iHJOIoRwYyeWIPY, type=m.room.member, state_key=@user-3-charlie2:hs2, membership=invite, outlier=False>
 handled received PDU in room !sfZVBdLUezpPWetrol:hs1: <FrozenEventV3 event_id=$O_54j7O--6xMsegY5EVZ9SA-mI4_iHJOIoRwYyeWIPY, type=m.room.member, state_key=@user-3-charlie2:hs2, membership=invite, outlier=False>
```
2025-01-27 11:21:10 -06:00
dependabot[bot]
148e93576e Bump log from 0.4.22 to 0.4.25 (#18098)
Bumps [log](https://github.com/rust-lang/log) from 0.4.22 to 0.4.25.

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-27 15:23:28 +00:00
dependabot[bot]
56ed412839 Bump dawidd6/action-download-artifact from 7 to 8 (#18108)
Bumps
[dawidd6/action-download-artifact](https://github.com/dawidd6/action-download-artifact)
from 7 to 8.

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-27 15:20:41 +00:00
Sven Mäder
9c5d08fff8 Ratelimit presence updates (#18000) 2025-01-24 19:58:01 +00:00
Max Kratz
90a6bd01c2 Contrib: Docker: updates PostgreSQL version in docker-compose.yml (#18089)
Co-authored-by: Andrew Morgan <1342360+anoadragon453@users.noreply.github.com>
2025-01-21 18:54:31 +00:00
Till Faelligen
aa07a01452 One more tiny change 2025-01-21 15:01:16 +01:00
Till Faelligen
8364c01a2b Update changelog 2025-01-21 14:58:20 +01:00
Till Faelligen
e27808f306 1.123.0rc1 2025-01-21 14:46:40 +01:00
Quentin Gliech
048c1ac7f6 Support the new /auth_metadata endpoint defined in MSC2965. (#18093)
See the updated MSC2965

---------

Co-authored-by: Andrew Morgan <1342360+anoadragon453@users.noreply.github.com>
2025-01-21 13:48:49 +01:00
Patrick Cloke
ca290d325c Implement MSC4133 to support custom profile fields. (#17488)
Implementation of
[MSC4133](https://github.com/matrix-org/matrix-spec-proposals/pull/4133)
to support custom profile fields. It is behind an experimental flag and
includes tests.


---------

Co-authored-by: Andrew Morgan <1342360+anoadragon453@users.noreply.github.com>
2025-01-21 11:11:04 +00:00
Eric Eastwood
0a31cf18cd Document possibility of configuring tls for a worker instance in instance_map (#18064) 2025-01-20 12:40:05 -06:00
Erik Johnston
48db0c2d6c Drop indices concurrently on background updates (#18091)
Otherwise these can race with other long running queries and lock out
all other queries.

This caused problems in v1.122.0, as we added an index to the `events` table
in #17948, but that got interrupted and so next time we ran the
background update we needed to delete the half-finished index. However,
that got blocked behind some long running queries and then locked other
queries out (stopping workers from even starting).
2025-01-20 17:14:06 +00:00
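A hedged sketch of the mitigation described above, using a plain psycopg2 connection rather than Synapse's actual background-update machinery (the index name is hypothetical): `DROP INDEX CONCURRENTLY` must run outside a transaction, but avoids holding locks that would block other queries on the table.

```
import psycopg2

conn = psycopg2.connect("dbname=synapse")
conn.autocommit = True  # CONCURRENTLY cannot run inside a transaction block
with conn.cursor() as cur:
    # Drop a half-built index left behind by an interrupted background update.
    cur.execute("DROP INDEX CONCURRENTLY IF EXISTS some_half_finished_index")
conn.close()
```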
dependabot[bot]
24c4d82aeb Bump pyo3 from 0.23.3 to 0.23.4 (#18079) 2025-01-16 14:18:06 +00:00
dependabot[bot]
3fda8d3b67 Bump serde_json from 1.0.134 to 1.0.135 (#18081) 2025-01-16 14:15:01 +00:00
dependabot[bot]
5f15a549d7 Bump ulid from 1.1.3 to 1.1.4 (#18080) 2025-01-16 14:14:46 +00:00
dependabot[bot]
6cefbc6852 Bump mypy from 1.12.1 to 1.13.0 (#18083) 2025-01-16 10:17:58 +00:00
dependabot[bot]
fd3ec6435e Bump pillow from 11.0.0 to 11.1.0 (#18084) 2025-01-16 10:17:46 +00:00
Andrew Morgan
39bd6e2c16 Merge branch 'master' into develop 2025-01-14 15:41:08 +00:00
Andrew Morgan
5c736cd2af move additional release missed in last commit 2025-01-14 14:23:35 +00:00
Andrew Morgan
e70e8d132c Move 2023/4 changelog entries under docs/changelogs 2025-01-14 14:20:08 +00:00
Andrew Morgan
48334fbc40 move postgres changelog to the top 2025-01-14 14:17:55 +00:00
Andrew Morgan
b4fd694ce3 1.122.0 2025-01-14 14:14:23 +00:00
Eric Eastwood
e2d757f62d Increase rc_invites.per_issuer for Complement (#18072)
It's possible to run into `SynapseError: 429 - Too Many Requests (rc_invites.per_issuer)`

`rc_invites.per_issuer` was originally introduced in
https://github.com/matrix-org/synapse/pull/13125
2025-01-13 15:01:00 -06:00
Eric Eastwood
aab3672037 Bust _membership_stream_cache cache when current state changes (#17732)
This is particularly a problem in a state reset scenario where the membership
might change without a corresponding event.

This PR is targeting a scenario where a state reset happens which causes
room membership to change. Previously, the cache would just hold onto
stale data and now we properly bust the cache in this scenario.

We have a few tests for these scenarios which you can see are now fixed
because we can remove the `FIXME` where we were previously manually
busting the cache in the test itself.

This is a general Synapse thing, so by its nature it helps out Sliding
Sync.

Fix https://github.com/element-hq/synapse/issues/17368

Prerequisite for https://github.com/element-hq/synapse/issues/17929

---

Match when we are busting `_curr_state_delta_stream_cache`.
2025-01-08 10:11:09 -06:00
dependabot[bot]
d0677dca39 Bump jinja2 from 3.1.4 to 3.1.5 (#18067) 2025-01-08 16:08:43 +00:00
Shay
e34fd1228d Add the ability to filter by state event type on admin room state endpoint (#18035)
Adds a query param `type` to `/_synapse/admin/v1/rooms/{room_id}/state`
that filters the state event query by state event type.

---------

Co-authored-by: Andrew Morgan <1342360+anoadragon453@users.noreply.github.com>
2025-01-08 15:38:26 +00:00
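A minimal sketch of the filtering the new `type` query parameter performs (the parameter name comes from the commit; the handler shape is illustrative, not Synapse's servlet code).

```
from typing import Optional


def filter_room_state(state_events: list, event_type: Optional[str] = None) -> list:
    """Return only state events matching the optional ?type= filter."""
    if event_type is None:
        return state_events
    return [ev for ev in state_events if ev.get("type") == event_type]


filter_room_state(
    [{"type": "m.room.member"}, {"type": "m.room.name"}],
    event_type="m.room.member",
)  # -> [{"type": "m.room.member"}]
```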
Travis Ralston
beea39f000 Drop unstable MSC4151 implementation (#18052)
It's been rotated out of known clients, and should be safe for removal
now.

Fixes https://github.com/element-hq/synapse/issues/17373

2025-01-07 15:45:57 -07:00
Olivier 'reivilibre'
fa320c4fcb Fix typographical error in changelog 2025-01-07 17:43:41 +00:00
Olivier 'reivilibre'
22c2add9c0 Merge branch 'release-v1.122' into develop 2025-01-07 17:42:44 +00:00
dependabot[bot]
60f596b4d8 Bump pyopenssl from 24.2.1 to 24.3.0 (#18062)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-07 16:50:50 +00:00
Olivier 'reivilibre'
1143e14479 Tweak changelog 2025-01-07 15:20:24 +00:00
Olivier 'reivilibre'
c199ede287 1.122.0rc1 2025-01-07 14:13:02 +00:00
dependabot[bot]
9fb7333a7c Bump sentry-sdk from 2.17.0 to 2.19.2 (#18061)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-06 18:27:05 +00:00
dependabot[bot]
a0a4a36891 Bump pyicu from 2.13.1 to 2.14 (#18060) 2025-01-06 18:24:49 +00:00
dependabot[bot]
49fcda31f6 Bump serde from 1.0.216 to 1.0.217 (#18059) 2025-01-06 18:23:12 +00:00
Mathieu Velten
b3ba501c52 Properly purge state groups tables when purging a room (#18024)
Currently purging a complex room can lead to a lot of orphaned rows left
behind in the state groups tables.
It seems this is because we are sometimes losing track of state groups.

This change uses the `room_id` indexed column of `state_groups` table to
decide what to delete instead of doing an indirection through
`event_to_state_groups`.

Related to https://github.com/element-hq/synapse/issues/3364.


---------

Co-authored-by: Erik Johnston <erikj@jki.re>
2025-01-06 15:32:18 +00:00
Patrick Cloke
6306de8e16 Refactor get_profile: do not return missing fields. (#18063)
Refactor `get_profile` to avoid returning "empty" (`None` / `null`)
fields. Currently this is not very important, but will be more useful
once #17488 lands. It does update the servlet to use this now which has
a minor change in behavior: additional fields served over federation
will now be properly sent back to clients.

It also adds constants for `avatar_url` / `displayname` although I did
not attempt to use it everywhere possible.
2025-01-03 17:23:29 +00:00
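A small illustration of the refactor described above (field names follow the commit message; the function shape is illustrative): the profile response is built without `None` fields rather than returning them as `null`.

```
from typing import Optional


def build_profile_response(
    displayname: Optional[str], avatar_url: Optional[str]
) -> dict:
    response = {}
    if displayname is not None:
        response["displayname"] = displayname
    if avatar_url is not None:
        response["avatar_url"] = avatar_url
    return response


build_profile_response("Alice", None)  # -> {"displayname": "Alice"}, no null avatar_url
```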
Shay
b5267678d2 Add a test to verify remote user messages can be redacted via admin api redaction endpoint if requester is admin in room (#18043) 2025-01-03 12:52:42 +00:00
dependabot[bot]
ebc21a8c67 Bump twine from 5.1.1 to 6.0.1 (#18049)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-12-23 15:09:51 +00:00
dependabot[bot]
e5a53819fc Bump mypy-zope from 1.0.8 to 1.0.9 (#18047) 2024-12-23 15:03:55 +00:00
dependabot[bot]
66b24d3d00 Bump anyhow from 1.0.94 to 1.0.95 (#18045) 2024-12-23 15:03:10 +00:00
dependabot[bot]
2b59e738ee Bump authlib from 1.3.2 to 1.4.0 (#18048)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-12-23 14:57:22 +00:00
dependabot[bot]
b1d030a107 Bump serde_json from 1.0.133 to 1.0.134 (#18044) 2024-12-23 14:52:41 +00:00
morguldir
7c2284b2f2 Make admin api redactions use the requester to send the redaction (#18029) 2024-12-23 11:19:35 +00:00
Colin Watson
d69c00b5a1 Stop using twisted.internet.defer.returnValue (#18020)
`defer.returnValue` was only needed in Python 2; in Python 3, a simple
`return` is fine.

`twisted.internet.defer.returnValue` is deprecated as of Twisted 24.7.0.

Most uses of `returnValue` in synapse were removed a while back; this
cleans up some remaining bits.
2024-12-20 10:57:59 +00:00
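A before/after illustration of the cleanup described above: under Python 3, an `inlineCallbacks`-decorated function can simply `return` its result, making the deprecated `defer.returnValue` call unnecessary.

```
from twisted.internet import defer


@defer.inlineCallbacks
def old_style():
    result = yield defer.succeed(42)
    defer.returnValue(result)  # only needed on Python 2; deprecated since Twisted 24.7.0


@defer.inlineCallbacks
def new_style():
    result = yield defer.succeed(42)
    return result  # a plain return is fine on Python 3
```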
Patrick Cloke
2d23250da7 Remove support for PostgreSQL 11 and 12 (#18034)
This is essentially matrix-org/synapse#14392. I didn't see anything in
there about updating sytest or complement.

The main driver of this is so that I can use `jsonb_path_exists` in
#17488. 😄
2024-12-19 17:02:47 +00:00
Mathieu Velten
234d07eb09 Disable statement timeout during room purge (#18017)
This is already done for `purge_history` but seems to have been
forgotten for `purge_room`.
2024-12-19 14:02:06 +00:00
Eric Eastwood
bd9a1079bc Update reverse proxy docs with what we've learned from #17986 (#17994)
Update reverse proxy docs with what we've learned from
https://github.com/element-hq/synapse/pull/17986

And vice versa: update our nginx config with what I learned from the
reverse proxy docs.

2024-12-19 14:00:50 +00:00
Andrew Morgan
3eb92369ca Fix mypy errors on Twisted 24.11.0 (#17998)
Fixes various `mypy` errors associated with Twisted `24.11.0`.

Hopefully addresses https://github.com/element-hq/synapse/issues/17075,
though I've yet to test against `trunk`.

Changes should be compatible with our currently pinned Twisted version
of `24.7.0`.
2024-12-18 11:49:38 +00:00
Andrew Morgan
09f377fa52 Wording improvements for the TaskScheduler (#17992)
As I found the current docstrings a bit unclear while trying to wrap my
head around this class.
2024-12-18 11:42:34 +00:00
Andrew Morgan
f1b0f9a4ef Bump mypy from 1.11.2 to 1.12.1 and fix new typechecking errors (#17999)
Supersedes https://github.com/element-hq/synapse/pull/17958.

Awkwardly, the changes made to fix the mypy errors in 1.12.1 cause
errors in 1.11.2, so you'll need to update your mypy version to 1.12.1
to eliminate typechecking errors during development.
2024-12-18 11:42:17 +00:00
cynhr
f1ecf46647 Add email.tlsname config option (#17849)
The existing `email.smtp_host` config option is used for two distinct
purposes: it is resolved into the IP address to connect to, and used to
(request via SNI and) validate the server's certificate if TLS is
enabled. This new option allows specifying a different name for the
second purpose.

This is especially helpful if `email.smtp_host` isn't a global FQDN but
something that only resolves locally (e.g. "localhost" to connect
through the loopback interface, or some other internally routed name)
for which one cannot obtain a valid certificate.
The alternatives would of course be to specify a global FQDN as
`email.smtp_host`, or to disable TLS entirely, both of which might be
undesirable depending on the SMTP server configuration.
2024-12-17 18:05:38 -06:00
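Not Synapse's mail code, but a standard-library sketch of the idea behind `tlsname`: the address you dial and the name the certificate is validated against can differ. The host names and port below are assumptions for illustration.

```python
import socket
import ssl

smtp_host = "localhost"        # assumed: a locally routed relay we connect to directly
tlsname = "mail.example.com"   # assumed: the name the relay's certificate is issued for

context = ssl.create_default_context()
raw = socket.create_connection((smtp_host, 465))  # implicit-TLS SMTP port
# `server_hostname` drives both SNI and certificate validation, independently
# of the address that was dialled -- the same separation `email.tlsname`
# gives Synapse for its TLS connections to `email.smtp_host`.
with context.wrap_socket(raw, server_hostname=tlsname) as tls:
    print(tls.getpeercert()["subject"])
```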
V02460
57bf44941e Add macaroon_secret_key_path config option (#17983)
Another config option on my quest to a `*_path` variant for every
secret. This time it’s `macaroon_secret_key_path`.

Reading secrets from files has the security advantage of separating the secrets from the config. It also simplifies secrets management in Kubernetes. Also useful to NixOS users.
2024-12-16 18:01:33 -06:00
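A rough sketch of the general `*_path` pattern this series of PRs follows: read the secret from a file once at startup, otherwise fall back to the inline option. The helper name, fallback order, and whitespace stripping here are assumptions for illustration, not Synapse's actual implementation.

```python
from pathlib import Path
from typing import Optional


def read_secret(inline_value: Optional[str], path_value: Optional[str]) -> Optional[str]:
    """Illustrative only: prefer the *_path variant and read it once at startup."""
    if path_value is not None:
        # Strip the trailing newline most editors leave at the end of the file.
        return Path(path_value).read_text().strip()
    return inline_value


# e.g. with `macaroon_secret_key_path: /path/to/secrets/file` in homeserver.yaml
# (placeholder path):
macaroon_secret_key = read_secret(None, "/path/to/secrets/file")
```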
Travis Ralston
3d60a58ad6 Add last_seen_ts to query user example (#17976)
This section could probably do with a lot more editorial attention, but
for now this is all there is in terms of documentation. The field is
already returned by Synapse:
4587decd67/synapse/handlers/admin.py (L150)

`last_seen_ts` was introduced in
https://github.com/matrix-org/synapse/pull/16218
2024-12-16 17:12:40 -06:00
Shay
8208186e3c Add some useful endpoints to Admin API (#17948)
- Fetch the number of invites the provided user has sent after a given
timestamp
- Fetch the number of rooms the provided user has joined after a given
timestamp, regardless of whether they have subsequently left or been
banned from those rooms
- Get report IDs of event reports where the provided user was the sender
of the reported event
2024-12-16 13:27:34 -06:00
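A hedged example of calling the first of these endpoints with `requests`; the path, `from_ts` parameter, and response shape come from the Admin API docs added in this changeset, while the homeserver URL, user ID, and token are placeholders.

```python
import requests

BASE_URL = "https://synapse.example.com"  # placeholder homeserver URL
ADMIN_TOKEN = "..."                       # placeholder access token of a server admin

resp = requests.get(
    f"{BASE_URL}/_synapse/admin/v1/users/@alice:example.com/sent_invite_count",
    # Milliseconds since the unix epoch; only invites sent at or after this
    # timestamp are counted.
    params={"from_ts": 1704067200000},
    headers={"Authorization": f"Bearer {ADMIN_TOKEN}"},
)
resp.raise_for_status()
print(resp.json())  # e.g. {"invite_count": 30}
```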
dependabot[bot]
29d586311d Bump http from 1.1.0 to 1.2.0 (#18013) 2024-12-16 13:23:11 +00:00
dependabot[bot]
512c9efcb3 Bump serde from 1.0.215 to 1.0.216 (#18031) 2024-12-16 12:20:16 +00:00
dependabot[bot]
35c361c0d9 Bump pillow from 10.4.0 to 11.0.0 (#18015) 2024-12-16 12:19:09 +00:00
dependabot[bot]
95853c5f31 Bump pydantic from 2.9.2 to 2.10.3 (#18014) 2024-12-16 12:03:42 +00:00
dependabot[bot]
eb019c03c4 Bump anyhow from 1.0.93 to 1.0.94 (#18012) 2024-12-16 11:58:34 +00:00
Wilson
eedab12e6d forward requester id to check username for spam callbacks (#17916) 2024-12-13 14:17:41 +00:00
Andrew Morgan
483602efb2 Merge branch 'master' into develop 2024-12-11 19:24:03 +00:00
Andrew Morgan
ac429050bc Remove redundant security disclaimer 2024-12-11 18:28:45 +00:00
Andrew Morgan
daa783f16c 1.121.1 2024-12-11 18:25:44 +00:00
Till
6c4037dcf3 Downgrade ubuntu to 22.04 when building docker images (#18026)
As all docker builds are currently failing.


https://github.blog/changelog/2024-12-05-notice-of-upcoming-releases-and-breaking-changes-for-github-actions/
https://github.com/actions/runner-images/issues/10636
2024-12-11 18:27:56 +01:00
Till Faelligen
737f6c73f7 Update changelog 2024-12-11 15:20:39 +01:00
Till Faelligen
ed6edc17d0 1.121.0 2024-12-11 13:12:50 +01:00
Till
5b0873516c Attempt to fix duplicate releases issue (#18025)
This hopefully fixes https://github.com/element-hq/synapse/issues/17991,
as we first upgraded to v2 and are now back to 0.1.15.
(This was lost in https://github.com/element-hq/synapse/pull/17923,
related https://github.com/element-hq/synapse/pull/17995)
2024-12-11 12:40:36 +01:00
jahway603
5da7081197 Update Alpine Linux Synapse Package Maintainer within installation.md (#17846)
Update the Alpine Linux Synapse package maintainer listed in installation.md, as
the current entry is outdated.

2024-12-10 22:24:03 +00:00
Mathieu Velten
5cf74c2da0 Fix bug when rejecting withdrew invite with a third_party_rules module (#17930)
When rejecting a withdrawn invite over federation, an out-of-band
event needs to be created.

When doing so with a third_party_rules module installed,
`get_prev_state_ids` [is
called](e0fdb862cb/synapse/module_api/callbacks/third_party_event_rules_callbacks.py (L285))
on the context to calculate the state to pass at `check_event_allowed`
callbacks.

The context for outliers is defined
[here](e0fdb862cb/synapse/events/snapshot.py (L168)),
and `state_group_before_event` is None.

This change makes the behavior of `get_prev_state_ids` and
`get_current_state_ids` match what the docstring describes for a null
state_group.
2024-12-10 14:26:38 +00:00
Rafał Hirsch
adce8a0111 Reorganize account data, receipts and presence request regexps in generic_worker docs (#17954)
POST requests for account data, receipts and presence require the worker
to be configured as a stream writer. The regular expressions in the
default list don't assume any HTTP method, so if the worker is not a
stream writer, the request fails.

The stream writer section of the documentation lists the same regexps as
the ones I'm removing, so people configuring stream writers can still
set up their routing properly.

More context:
https://github.com/element-hq/synapse/issues/17243#issuecomment-2493621645
2024-12-09 10:30:03 -06:00
dependabot[bot]
790ce14e46 Bump pyo3 from 0.23.2 to 0.23.3 (#18001) 2024-12-09 10:54:53 +00:00
dependabot[bot]
ecbc0b740c Bump dawidd6/action-download-artifact from 6 to 7 (#17981) 2024-12-05 17:37:40 +00:00
dependabot[bot]
0db5d247f8 Bump python-multipart from 0.0.16 to 0.0.18 (#17985) 2024-12-05 17:07:40 +00:00
Devon Hudson
02d09e3f0c Add RoomID & EventID rust types (#17996)
Adds the RoomID & EventID rust types to the rust lib.
Also adds a Deserialize impl to the existing UserID type.

2024-12-05 15:41:57 +00:00
Travis Ralston
b90ad26ebc Promote account suspension to stable (#17964)
MSC: https://github.com/matrix-org/matrix-spec-proposals/pull/3823
2024-12-04 17:56:42 -06:00
Andrew Morgan
a00d0b3d0e 1.121.0rc1 2024-12-04 14:49:28 +00:00
Andrew Morgan
45ca6392f4 Pin Rust to 1.82.0 when building Python wheels (#17993)
Addresses step 1 of #17988.
2024-12-04 12:58:26 +00:00
Andrew Morgan
05d58b86ac Pin softprops/action-gh-release to v0.1.15 (#17995)
We are still seeing duplicate releases on v2.0.5, so roll back further.
[Other](f8a5a60b7c (diff-88ab30345d9874c4336fe50b54b083ba5bdd925be961c34060e6a192b56b0433R72))
[repositories](55fca4fec7 (diff-e426ed45842837026e10e66af23d9c7077e89eacbe6958ce7cb991130ad05adaR105))
seem to have settled on this version.

Addresses https://github.com/element-hq/synapse/issues/17991

We're just going to test this during 1.121.0rc1.
2024-12-04 12:53:51 +00:00
Quentin Gliech
23b626f2e6 Support for MSC4190: device management for application services (#17705)
This is an implementation of MSC4190, which allows appservices to manage
their users' devices without /login & /logout.

---------

Co-authored-by: Andrew Morgan <1342360+anoadragon453@users.noreply.github.com>
2024-12-04 12:04:49 +01:00
manuroe
abf44ad324 MSC4076: Add disable_badge_count to pusher configuration (#17975)
This PR implements [MSC4076: Let E2EE clients calculate app badge counts
themselves
(disable_badge_count)](https://github.com/matrix-org/matrix-spec-proposals/pull/4076).
2024-12-03 22:58:43 +00:00
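A sketch of how a client might opt in when registering a pusher. The `POST /_matrix/client/v3/pushers/set` fields follow the stable client-server API; placing `disable_badge_count` inside `data` follows the MSC as summarised above, and the URLs, app IDs, and tokens are placeholders.

```python
import requests

BASE_URL = "https://matrix.example.com"  # placeholder homeserver URL
ACCESS_TOKEN = "..."                     # placeholder user access token

pusher = {
    "kind": "http",
    "app_id": "com.example.app",
    "app_display_name": "Example App",
    "device_display_name": "Alice's phone",
    "pushkey": "a-unique-push-key",
    "lang": "en",
    "data": {
        "url": "https://push-gateway.example.com/_matrix/push/v1/notify",
        "format": "event_id_only",
        # MSC4076: ask the homeserver to omit unread counts so the E2EE
        # client can calculate its own badge count.
        "disable_badge_count": True,
    },
}

resp = requests.post(
    f"{BASE_URL}/_matrix/client/v3/pushers/set",
    json=pusher,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
resp.raise_for_status()
```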
Quentin Gliech
657dd5151e Merge branch 'master' into develop 2024-12-03 17:44:48 +01:00
Quentin Gliech
6f689d452c 1.120.2 2024-12-03 16:58:40 +01:00
Quentin Gliech
650492ed4d Stop building wheels for macOS 2024-12-03 16:39:41 +01:00
Eric Eastwood
b257c7ab19 Be able to test /login/sso/redirect in Complement (#17986)
Be able to test `/login/sso/redirect` in Complement

Spawning from
https://github.com/element-hq/sbg/pull/421#discussion_r1854926218 where
we have a proxy that intercepts responses to
`/_matrix/client/v3/login/sso/redirect(/{idpId})` in order to upgrade
them to use OAuth 2.0 Pushed Authorization Requests (PAR). We have some
Complement tests in that codebase that go over this flow and these
changes are required [in order for the URLs to line
up](d648c8ce3f/synapse/rest/client/login.py (L652-L673)).
2024-12-03 12:54:25 +00:00
Quentin Gliech
fe3d88b833 1.120.1 2024-12-03 11:18:31 +01:00
Olivier 'reivilibre'
b64a4e5fbb Restrict which image formats we will decode in order to generate thumbnails 2024-12-03 09:53:21 +01:00
Devon Hudson
4b7154c585 Don't allow unsupported content-type
Co-authored-by: Eric Eastwood <erice@element.io>
2024-12-03 09:53:21 +01:00
Erik Johnston
d82e1ed357 Handle null invite and knock room state 2024-12-03 09:53:21 +01:00
Eric Eastwood
4daa533e82 Sliding Sync: Fix state leaking on incremental sync 2024-12-03 09:53:21 +01:00
Erik Johnston
f3fd6852ac Fix release process to not create duplicate releases (#17970)
This is to work around
https://github.com/softprops/action-gh-release/issues/445

---------

Co-authored-by: Quentin Gliech <quenting@element.io>
2024-12-03 09:53:20 +01:00
dependabot[bot]
d648c8ce3f Bump bytes from 1.8.0 to 1.9.0 (#17982) 2024-12-02 16:55:53 +00:00
dependabot[bot]
190c400a83 Bump tomli from 2.1.0 to 2.2.1 (#17979) 2024-12-02 16:55:40 +00:00
Eric Eastwood
e5d3bfba30 Sliding Sync: Include invite, ban, kick, targets when $LAZY-loading room members (#17947)
Part of https://github.com/element-hq/synapse/issues/17929
2024-12-02 10:17:55 -06:00
Travis Ralston
9b2ae62d20 Use stable error code for account locking (#17965) 2024-12-02 15:28:47 +00:00
dependabot[bot]
a89b697209 Bump pysaml2 from 7.3.1 to 7.5.0 (#17978) 2024-12-02 15:28:08 +00:00
Erik Johnston
a82f5f206f Fix release process to not create duplicate releases (#17970)
This is to work around
https://github.com/softprops/action-gh-release/issues/445

---------

Co-authored-by: Quentin Gliech <quenting@element.io>
2024-12-02 10:54:14 +00:00
Eric Eastwood
6a909aade2 Consolidate SSO redirects through /_matrix/client/v3/login/sso/redirect(/{idpId}) (#17972)
Consolidate SSO redirects through
`/_matrix/client/v3/login/sso/redirect(/{idpId})`

Spawning from
https://github.com/element-hq/sbg/pull/421#discussion_r1859497330 where
we have a proxy that intercepts responses to
`/_matrix/client/v3/login/sso/redirect(/{idpId})` in order to upgrade
them to use OAuth 2.0 Pushed Authorization Requests (PAR). Instead of
needing to intercept multiple endpoints that redirect to the
authorization endpoint, it seems better to just have Synapse consolidate
to a single flow.


### Testing strategy

1. Create a new OAuth application. I'll be using GitHub for example but
there are [many
options](be65a8ec01/docs/openid.md).
Visit https://github.com/settings/developers -> **New OAuth App**
    - Application name: `Synapse local testing`
    - Homepage URL: `http://localhost:8008`
- Authorization callback URL:
`http://localhost:8008/_synapse/client/oidc/callback`
 1. Update your Synapse `homeserver.yaml`
    ```yaml
    server_name: "my.synapse.server"
    public_baseurl: http://localhost:8008/
    listeners:
      - port: 8008
        bind_addresses: [
          #'::1',
          '127.0.0.1'
        ]
        tls: false
        type: http
        x_forwarded: true
        resources:
          - names: [client, federation, metrics]
            compress: false
    
    # SSO login testing
    oidc_providers:
      - idp_id: github
        idp_name: Github
        idp_brand: "github"  # optional: styling hint for clients
        discover: false
        issuer: "https://github.com/"
        client_id: "xxx" # TO BE FILLED
        client_secret: "xxx" # TO BE FILLED
        authorization_endpoint: "https://github.com/login/oauth/authorize"
        token_endpoint: "https://github.com/login/oauth/access_token"
        userinfo_endpoint: "https://api.github.com/user"
        scopes: ["read:user"]
        user_mapping_provider:
          config:
            subject_claim: "id"
            localpart_template: "{{ user.login }}"
            display_name_template: "{{ user.name }}"
    ```
1. Start Synapse: `poetry run synapse_homeserver --config-path
homeserver.yaml`
1. Visit
`http://localhost:8008/_synapse/client/pick_idp?redirectUrl=http%3A%2F%2Fexample.com`
 1. Choose GitHub
1. Notice that you're redirected to GitHub to sign in
(`https://github.com/login/oauth/authorize?...`)

Tested locally and works:

1.
`http://localhost:8008/_synapse/client/pick_idp?idp=oidc-github&redirectUrl=http%3A//example.com`
->
1.
`http://localhost:8008/_matrix/client/v3/login/sso/redirect/oidc-github?redirectUrl=http://example.com`
->
1.
`https://github.com/login/oauth/authorize?response_type=code&client_id=xxx&redirect_uri=http%3A%2F%2Flocalhost%3A8008%2F_synapse%2Fclient%2Foidc%2Fcallback&scope=read%3Auser&state=xxx&nonce=xxx`
2024-11-29 11:26:37 -06:00
Richard van der Hoff
d80cd57c54 Fix new scheduled tasks jumping the queue (#17962)
Currently, when a new scheduled task is added and its scheduled time has
already passed, we set it to ACTIVE. This is problematic, because it
means it will jump the queue ahead of all other SCHEDULED tasks;
furthermore, if the Synapse process gets restarted, it will jump ahead
of any ACTIVE tasks which have been started but are taking a while to
run.

Instead, we leave it set to SCHEDULED, but kick off a call to
`_launch_scheduled_tasks`, which will decide if we actually have
capacity to start a new task, and start the newly-added task if so.
2024-11-28 18:06:19 +00:00
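An illustrative Python sketch of the rule described above, not the actual `TaskScheduler` code: a newly added task stays SCHEDULED even if it is already overdue, and `_launch_scheduled_tasks` (name taken from the commit message) decides whether there is capacity to start anything. The capacity limit and task fields are assumptions.

```python
import time

MAX_CONCURRENT = 5  # assumed capacity limit
tasks = []          # each task: {"status": "SCHEDULED" | "ACTIVE", "timestamp": ...}


def add_task(task):
    # Even if the task's scheduled time has already passed, leave it
    # SCHEDULED so it queues behind earlier SCHEDULED and ACTIVE tasks
    # instead of jumping straight to ACTIVE.
    task["status"] = "SCHEDULED"
    tasks.append(task)
    _launch_scheduled_tasks()


def _launch_scheduled_tasks():
    running = sum(1 for t in tasks if t["status"] == "ACTIVE")
    due = sorted(
        (t for t in tasks if t["status"] == "SCHEDULED" and t["timestamp"] <= time.time()),
        key=lambda t: t["timestamp"],
    )
    # Only start as many tasks as we have spare capacity for.
    for task in due[: max(0, MAX_CONCURRENT - running)]:
        task["status"] = "ACTIVE"
        # ... actually kick off the task's work here ...
```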
Erik Johnston
59ad4b18fc Update setuptools-rust and fix building abi3 wheels (#17969)
Newer versions of `setuptools-rust` ignore the `py_limited_api` flag to
`RustExtension`, and instead read it from `bdist_wheel` config.

c.f.
https://github.com/PyO3/setuptools-rust/blob/main/CHANGELOG.md#190-2024-02-24
2024-11-27 13:31:43 +00:00
184 changed files with 11656 additions and 5154 deletions

View File

@@ -60,7 +60,7 @@ trial_postgres_tests = [
{
"python-version": "3.9",
"database": "postgres",
"postgres-version": "11",
"postgres-version": "13",
"extras": "all",
}
]

View File

@@ -14,7 +14,7 @@ permissions:
id-token: write # needed for signing the images with GitHub OIDC Token
jobs:
build:
runs-on: ubuntu-latest
runs-on: ubuntu-22.04
steps:
- name: Set up QEMU
id: qemu

View File

@@ -14,7 +14,7 @@ jobs:
# There's a 'download artifact' action, but it hasn't been updated for the workflow_run action
# (https://github.com/actions/download-artifact/issues/60) so instead we get this mess:
- name: 📥 Download artifact
uses: dawidd6/action-download-artifact@bf251b5aa9c2f7eeb574a96ee720e24f801b7c11 # v6
uses: dawidd6/action-download-artifact@20319c5641d495c8a52e688b7dc5fada6c3a9fbc # v8
with:
workflow: docs-pr.yaml
run_id: ${{ github.event.workflow_run.id }}

View File

@@ -21,7 +21,7 @@ jobs:
# We use nightly so that `fmt` correctly groups together imports, and
# clippy correctly fixes up the benchmarks.
toolchain: nightly-2022-12-01
components: rustfmt
components: clippy, rustfmt
- uses: Swatinem/rust-cache@v2
- name: Setup Poetry

View File

@@ -212,7 +212,8 @@ jobs:
mv debs*/* debs/
tar -cvJf debs.tar.xz debs
- name: Attach to release
uses: softprops/action-gh-release@v2.0.5
# Pinned to work around https://github.com/softprops/action-gh-release/issues/445
uses: softprops/action-gh-release@v0.1.15
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
@@ -220,3 +221,7 @@ jobs:
Sdist/*
Wheel*/*
debs.tar.xz
# if it's not already published, keep the release as a draft.
draft: true
# mark it as a prerelease if the tag contains 'rc'.
prerelease: ${{ contains(github.ref, 'rc') }}

View File

@@ -581,7 +581,7 @@ jobs:
matrix:
include:
- python-version: "3.9"
postgres-version: "11"
postgres-version: "13"
- python-version: "3.13"
postgres-version: "17"

3722
CHANGES.md

File diff suppressed because it is too large

55
Cargo.lock generated
View File

@@ -13,9 +13,9 @@ dependencies = [
[[package]]
name = "anyhow"
version = "1.0.93"
version = "1.0.95"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4c95c10ba0b00a02636238b814946408b1322d5ac4760326e6fb8ec956d85775"
checksum = "34ac096ce696dc2fcabef30516bb13c0a68a11d30131d3df6f04711467681b04"
[[package]]
name = "arc-swap"
@@ -61,9 +61,9 @@ checksum = "79296716171880943b8470b5f8d03aa55eb2e645a4874bdbb28adb49162e012c"
[[package]]
name = "bytes"
version = "1.8.0"
version = "1.9.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9ac0150caa2ae65ca5bd83f25c7de183dea78d4d366469f148435e2acfbad0da"
checksum = "325918d6fe32f23b19878fe4b34794ae41fc19ddbe53b10571a4874d44ffd39b"
[[package]]
name = "cfg-if"
@@ -124,10 +124,8 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c4567c8db10ae91089c99af84c68c38da3ec2f087c3f82960bcdbf3656b6f4d7"
dependencies = [
"cfg-if",
"js-sys",
"libc",
"wasi",
"wasm-bindgen",
]
[[package]]
@@ -168,9 +166,9 @@ checksum = "7f24254aa9a54b5c858eaee2f5bccdb46aaf0e486a595ed5fd8f86ba55232a70"
[[package]]
name = "http"
version = "1.1.0"
version = "1.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "21b9ddb458710bc376481b842f5da65cdf31522de232c1ca8146abce2a358258"
checksum = "f16ca2af56261c99fba8bac40a10251ce8188205a4c448fbb745a2e4daa76fea"
dependencies = [
"bytes",
"fnv",
@@ -218,9 +216,9 @@ checksum = "ae743338b92ff9146ce83992f766a31066a91a8c84a45e0e9f21e7cf6de6d346"
[[package]]
name = "log"
version = "0.4.22"
version = "0.4.25"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a7a70ba024b9dc04c27ea2f0c0548feb474ec5c54bba33a7f72f873a39d07b24"
checksum = "04cbf5b083de1c7e0222a7a51dbfdba1cbe1c6ab0b15e29fff3f6c077fd9cd9f"
[[package]]
name = "memchr"
@@ -272,9 +270,9 @@ dependencies = [
[[package]]
name = "pyo3"
version = "0.23.2"
version = "0.23.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f54b3d09cbdd1f8c20650b28e7b09e338881482f4aa908a5f61a00c98fba2690"
checksum = "57fe09249128b3173d092de9523eaa75136bf7ba85e0d69eca241c7939c933cc"
dependencies = [
"anyhow",
"cfg-if",
@@ -291,9 +289,9 @@ dependencies = [
[[package]]
name = "pyo3-build-config"
version = "0.23.2"
version = "0.23.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3015cf985888fe66cfb63ce0e321c603706cd541b7aec7ddd35c281390af45d8"
checksum = "1cd3927b5a78757a0d71aa9dff669f903b1eb64b54142a9bd9f757f8fde65fd7"
dependencies = [
"once_cell",
"target-lexicon",
@@ -301,9 +299,9 @@ dependencies = [
[[package]]
name = "pyo3-ffi"
version = "0.23.2"
version = "0.23.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6fca7cd8fd809b5ac4eefb89c1f98f7a7651d3739dfb341ca6980090f554c270"
checksum = "dab6bb2102bd8f991e7749f130a70d05dd557613e39ed2deeee8e9ca0c4d548d"
dependencies = [
"libc",
"pyo3-build-config",
@@ -322,9 +320,9 @@ dependencies = [
[[package]]
name = "pyo3-macros"
version = "0.23.2"
version = "0.23.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "34e657fa5379a79151b6ff5328d9216a84f55dc93b17b08e7c3609a969b73aa0"
checksum = "91871864b353fd5ffcb3f91f2f703a22a9797c91b9ab497b1acac7b07ae509c7"
dependencies = [
"proc-macro2",
"pyo3-macros-backend",
@@ -334,9 +332,9 @@ dependencies = [
[[package]]
name = "pyo3-macros-backend"
version = "0.23.2"
version = "0.23.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "295548d5ffd95fd1981d2d3cf4458831b21d60af046b729b6fd143b0ba7aee2f"
checksum = "43abc3b80bc20f3facd86cd3c60beed58c3e2aa26213f3cda368de39c60a27e4"
dependencies = [
"heck",
"proc-macro2",
@@ -431,18 +429,18 @@ checksum = "f3cb5ba0dc43242ce17de99c180e96db90b235b8a9fdc9543c96d2209116bd9f"
[[package]]
name = "serde"
version = "1.0.215"
version = "1.0.217"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6513c1ad0b11a9376da888e3e0baa0077f1aed55c17f50e7b2397136129fb88f"
checksum = "02fc4265df13d6fa1d00ecff087228cc0a2b5f3c0e87e258d8b94a156e984c70"
dependencies = [
"serde_derive",
]
[[package]]
name = "serde_derive"
version = "1.0.215"
version = "1.0.217"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ad1e866f866923f252f05c889987993144fb74e722403468a4ebd70c3cd756c0"
checksum = "5a9bf7cf98d04a2b28aead066b7496853d4779c9cc183c440dbac457641e19a0"
dependencies = [
"proc-macro2",
"quote",
@@ -451,9 +449,9 @@ dependencies = [
[[package]]
name = "serde_json"
version = "1.0.133"
version = "1.0.137"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c7fceb2473b9166b2294ef05efcb65a3db80803f0b03ef86a5fc88a2b85ee377"
checksum = "930cfb6e6abf99298aaad7d29abbef7a9999a9a8806a40088f55f0dcec03146b"
dependencies = [
"itoa",
"memchr",
@@ -538,11 +536,10 @@ checksum = "42ff0bf0c66b8238c6f3b578df37d0b7848e55df8577b3f74f92a69acceeb825"
[[package]]
name = "ulid"
version = "1.1.3"
version = "1.1.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "04f903f293d11f31c0c29e4148f6dc0d033a7f80cebc0282bea147611667d289"
checksum = "f294bff79170ed1c5633812aff1e565c35d993a36e757f9bc0accf5eec4e6045"
dependencies = [
"getrandom",
"rand",
"web-time",
]

6
LICENSE-COMMERCIAL Normal file
View File

@@ -0,0 +1,6 @@
Licensees holding a valid commercial license with Element may use this
software in accordance with the terms contained in a written agreement
between you and Element.
To purchase a commercial license please contact our sales team at
licensing@element.io

View File

@@ -10,14 +10,15 @@ implementation, written and maintained by `Element <https://element.io>`_.
`Matrix <https://github.com/matrix-org>`__ is the open standard for
secure and interoperable real time communications. You can directly run
and manage the source code in this repository, available under an AGPL
license. There is no support provided from Element unless you have a
subscription.
license (or alternatively under a commercial license from Element).
There is no support provided by Element unless you have a
subscription from Element.
Subscription alternative
========================
Subscription
============
Alternatively, for those that need an enterprise-ready solution, Element
Server Suite (ESS) is `available as a subscription <https://element.io/pricing>`_.
For those that need an enterprise-ready solution, Element
Server Suite (ESS) is `available via subscription <https://element.io/pricing>`_.
ESS builds on Synapse to offer a complete Matrix-based backend including the full
`Admin Console product <https://element.io/enterprise-functionality/admin-console>`_,
giving admins the power to easily manage an organization-wide
@@ -249,6 +250,20 @@ Developers might be particularly interested in:
Alongside all that, join our developer community on Matrix:
`#synapse-dev:matrix.org <https://matrix.to/#/#synapse-dev:matrix.org>`_, featuring real humans!
Copyright and Licensing
=======================
Copyright 2014-2017 OpenMarket Ltd
Copyright 2017 Vector Creations Ltd
Copyright 2017-2025 New Vector Ltd
This software is dual-licensed by New Vector Ltd (Element). It can be used either:
(1) for free under the terms of the GNU Affero General Public License (as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version); OR
(2) under the terms of a paid-for Element Commercial License agreement between you and Element (the terms of which may vary depending on what you and Element have agreed to).
Unless required by applicable law or agreed to in writing, software distributed under the Licenses is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the Licenses for the specific language governing permissions and limitations under the Licenses.
.. |support| image:: https://img.shields.io/badge/matrix-community%20support-success
:alt: (get community support in #synapse:matrix.org)

View File

@@ -1,8 +1,10 @@
# A build script for poetry that adds the rust extension.
import itertools
import os
from typing import Any, Dict
from packaging.specifiers import SpecifierSet
from setuptools_rust import Binding, RustExtension
@@ -14,6 +16,8 @@ def build(setup_kwargs: Dict[str, Any]) -> None:
target="synapse.synapse_rust",
path=cargo_toml_path,
binding=Binding.PyO3,
# This flag is a no-op in the latest versions. Instead, we need to
# specify this in the `bdist_wheel` config below.
py_limited_api=True,
# We force always building in release mode, as we can't tell the
# difference between using `poetry` in development vs production.
@@ -21,3 +25,18 @@ def build(setup_kwargs: Dict[str, Any]) -> None:
)
setup_kwargs.setdefault("rust_extensions", []).append(extension)
setup_kwargs["zip_safe"] = False
# We lookup the minimum supported python version by looking at
# `python_requires` (e.g. ">=3.9.0,<4.0.0") and finding the first python
# version that matches. We then convert that into the `py_limited_api` form,
# e.g. cp39 for python 3.9.
py_limited_api: str
python_bounds = SpecifierSet(setup_kwargs["python_requires"])
for minor_version in itertools.count(start=8):
if f"3.{minor_version}.0" in python_bounds:
py_limited_api = f"cp3{minor_version}"
break
setup_kwargs.setdefault("options", {}).setdefault("bdist_wheel", {})[
"py_limited_api"
] = py_limited_api

View File

@@ -1 +0,0 @@
[MSC4108](https://github.com/matrix-org/matrix-spec-proposals/pull/4108): Add a `Content-Type` header on the `PUT` response to work around a faulty behavior in some caching reverse proxies.

View File

@@ -1 +0,0 @@
Add OIDC example configuration for Forgejo (fork of Gitea).

View File

@@ -1 +0,0 @@
Fix long-standing bug where read receipts could get overly delayed being sent over federation.

View File

@@ -1 +0,0 @@
Fix incorrect comment in new schema delta.

View File

@@ -1 +0,0 @@
Raise setuptools_rust version cap to 1.10.2.

View File

@@ -1 +0,0 @@
Enable encrypted appservice related experimental features in the complement docker image.

View File

@@ -1 +0,0 @@
Return whether the user is suspended when querying the user account in the Admin API.

View File

@@ -1 +0,0 @@
Link to element-docker-demo from contrib/docker*.

View File

@@ -1 +0,0 @@
Bump pyo3 and dependencies to v0.23.2.

View File

@@ -1 +0,0 @@
Fix release process to not create duplicate releases.

View File

@@ -0,0 +1 @@
Add experimental config options `admin_token_path` and `client_secret_path` for MSC 3861.

1
changelog.d/18134.misc Normal file
View File

@@ -0,0 +1 @@
Make it explicit that you can buy an AGPL-alternative commercial license from Element.

1
changelog.d/18135.bugfix Normal file
View File

@@ -0,0 +1 @@
Fix user directory search when using a legacy module with a `check_username_for_spam` callback. Broke in v1.122.0.

1
changelog.d/18136.misc Normal file
View File

@@ -0,0 +1 @@
Fix the 'Fix linting' GitHub Actions workflow.

1
changelog.d/18139.misc Normal file
View File

@@ -0,0 +1 @@
Do not log exceptions when clients provide empty `since` token to `/sync` API.

View File

@@ -245,7 +245,7 @@ class SynapseCmd(cmd.Cmd):
if "flows" not in json_res:
print("Failed to find any login flows.")
defer.returnValue(False)
return False
flow = json_res["flows"][0] # assume first is the one we want.
if "type" not in flow or "m.login.password" != flow["type"] or "stages" in flow:
@@ -254,8 +254,8 @@ class SynapseCmd(cmd.Cmd):
"Unable to login via the command line client. Please visit "
"%s to login." % fallback_url
)
defer.returnValue(False)
defer.returnValue(True)
return False
return True
def do_emailrequest(self, line):
"""Requests the association of a third party identifier

View File

@@ -78,7 +78,7 @@ class TwistedHttpClient(HttpClient):
url, data, headers_dict={"Content-Type": ["application/json"]}
)
body = yield readBody(response)
defer.returnValue((response.code, body))
return response.code, body
@defer.inlineCallbacks
def get_json(self, url, args=None):
@@ -88,7 +88,7 @@ class TwistedHttpClient(HttpClient):
url = "%s?%s" % (url, qs)
response = yield self._create_get_request(url)
body = yield readBody(response)
defer.returnValue(json.loads(body))
return json.loads(body)
def _create_put_request(self, url, json_data, headers_dict: Optional[dict] = None):
"""Wrapper of _create_request to issue a PUT request"""
@@ -134,7 +134,7 @@ class TwistedHttpClient(HttpClient):
response = yield self._create_request(method, url)
body = yield readBody(response)
defer.returnValue(json.loads(body))
return json.loads(body)
@defer.inlineCallbacks
def _create_request(
@@ -173,7 +173,7 @@ class TwistedHttpClient(HttpClient):
if self.verbose:
print("Status %s %s" % (response.code, response.phrase))
print(pformat(list(response.headers.getAllRawHeaders())))
defer.returnValue(response)
return response
def sleep(self, seconds):
d = defer.Deferred()

View File

@@ -51,7 +51,7 @@ services:
- traefik.http.routers.https-synapse.tls.certResolver=le-ssl
db:
image: docker.io/postgres:12-alpine
image: docker.io/postgres:15-alpine
# Change that password, of course!
environment:
- POSTGRES_USER=synapse

66
debian/changelog vendored
View File

@@ -1,3 +1,69 @@
matrix-synapse-py3 (1.124.0~rc2) stable; urgency=medium
* New Synapse release 1.124.0rc2.
-- Synapse Packaging team <packages@matrix.org> Wed, 05 Feb 2025 16:35:53 +0000
matrix-synapse-py3 (1.124.0~rc1) stable; urgency=medium
* New Synapse release 1.124.0rc1.
-- Synapse Packaging team <packages@matrix.org> Tue, 04 Feb 2025 11:53:05 +0000
matrix-synapse-py3 (1.123.0) stable; urgency=medium
* New Synapse release 1.123.0.
-- Synapse Packaging team <packages@matrix.org> Tue, 28 Jan 2025 08:37:34 -0700
matrix-synapse-py3 (1.123.0~rc1) stable; urgency=medium
* New Synapse release 1.123.0rc1.
-- Synapse Packaging team <packages@matrix.org> Tue, 21 Jan 2025 14:39:57 +0100
matrix-synapse-py3 (1.122.0) stable; urgency=medium
* New Synapse release 1.122.0.
-- Synapse Packaging team <packages@matrix.org> Tue, 14 Jan 2025 14:14:14 +0000
matrix-synapse-py3 (1.122.0~rc1) stable; urgency=medium
* New Synapse release 1.122.0rc1.
-- Synapse Packaging team <packages@matrix.org> Tue, 07 Jan 2025 14:06:19 +0000
matrix-synapse-py3 (1.121.1) stable; urgency=medium
* New Synapse release 1.121.1.
-- Synapse Packaging team <packages@matrix.org> Wed, 11 Dec 2024 18:24:48 +0000
matrix-synapse-py3 (1.121.0) stable; urgency=medium
* New Synapse release 1.121.0.
-- Synapse Packaging team <packages@matrix.org> Wed, 11 Dec 2024 13:12:30 +0100
matrix-synapse-py3 (1.121.0~rc1) stable; urgency=medium
* New Synapse release 1.121.0rc1.
-- Synapse Packaging team <packages@matrix.org> Wed, 04 Dec 2024 14:47:23 +0000
matrix-synapse-py3 (1.120.2) stable; urgency=medium
* New synapse release 1.120.2.
-- Synapse Packaging team <packages@matrix.org> Tue, 03 Dec 2024 15:43:37 +0000
matrix-synapse-py3 (1.120.1) stable; urgency=medium
* New synapse release 1.120.1.
-- Synapse Packaging team <packages@matrix.org> Tue, 03 Dec 2024 09:07:57 +0000
matrix-synapse-py3 (1.120.0) stable; urgency=medium
* New synapse release 1.120.0.

View File

@@ -7,6 +7,7 @@
#}
## Server ##
public_baseurl: http://127.0.0.1:8008/
report_stats: False
trusted_key_servers: []
enable_registration: true
@@ -84,6 +85,14 @@ rc_invites:
per_user:
per_second: 1000
burst_count: 1000
per_issuer:
per_second: 1000
burst_count: 1000
rc_presence:
per_user:
per_second: 9999
burst_count: 9999
federation_rr_transactions_per_room_per_second: 9999

View File

@@ -38,10 +38,13 @@ server {
{% if using_unix_sockets %}
proxy_pass http://unix:/run/main_public.sock;
{% else %}
# note: do not add a path (even a single /) after the port in `proxy_pass`,
# otherwise nginx will canonicalise the URI and cause signature verification
# errors.
proxy_pass http://localhost:8080;
{% endif %}
proxy_set_header X-Forwarded-For $remote_addr;
proxy_set_header X-Forwarded-Proto $scheme;
proxy_set_header Host $host;
proxy_set_header Host $host:$server_port;
}
}

View File

@@ -60,10 +60,11 @@ paginate through.
anything other than the return value of `next_token` from a previous call. Defaults to `0`.
* `dir`: string - Direction of event report order. Whether to fetch the most recent
first (`b`) or the oldest first (`f`). Defaults to `b`.
* `user_id`: string - Is optional and filters to only return users with user IDs that
contain this value. This is the user who reported the event and wrote the reason.
* `room_id`: string - Is optional and filters to only return rooms with room IDs that
contain this value.
* `user_id`: optional string - Filter by the user ID of the reporter. This is the user who reported the event
and wrote the reason.
* `room_id`: optional string - Filter by room id.
* `event_sender_user_id`: optional string - Filter by the sender of the reported event. This is the user who
the report was made against.
**Response**

View File

@@ -385,6 +385,13 @@ The API is:
GET /_synapse/admin/v1/rooms/<room_id>/state
```
**Parameters**
The following query parameter is available:
* `type` - The type of room state event to filter by, eg "m.room.create". If provided, only state events
of this type will be returned (regardless of their `state_key` value).
A response body like the following is returned:
```json

View File

@@ -40,6 +40,7 @@ It returns a JSON body like the following:
"erased": false,
"shadow_banned": 0,
"creation_ts": 1560432506,
"last_seen_ts": 1732919539393,
"appservice_id": null,
"consent_server_notice_sent": null,
"consent_version": null,
@@ -477,9 +478,9 @@ with a body of:
}
```
## List room memberships of a user
## List joined rooms of a user
Gets a list of all `room_id` that a specific `user_id` is member.
Gets a list of all `room_id` that a specific `user_id` is joined to and is a member of (participating in).
The API is:
@@ -516,6 +517,73 @@ The following fields are returned in the JSON response body:
- `joined_rooms` - An array of `room_id`.
- `total` - Number of rooms.
## Get the number of invites sent by the user
Fetches the number of invites sent by the provided user ID across all rooms
after the given timestamp.
```
GET /_synapse/admin/v1/users/$user_id/sent_invite_count
```
**Parameters**
The following parameters should be set in the URL:
* `user_id`: fully qualified: for example, `@user:server.com`
The following should be set as query parameters in the URL:
* `from_ts`: int, required. A timestamp in ms from the unix epoch. Only
invites sent at or after the provided timestamp will be returned.
This works by comparing the provided timestamp to the `received_ts`
column in the `events` table.
Note: https://currentmillis.com/ is a useful tool for converting dates
into timestamps and vice versa.
A response body like the following is returned:
```json
{
"invite_count": 30
}
```
_Added in Synapse 1.122.0_
## Get the cumulative number of rooms a user has joined after a given timestamp
Fetches the number of rooms that the user joined after the given timestamp, even
if they have subsequently left/been banned from those rooms.
```
GET /_synapse/admin/v1/users/$user_id/cumulative_joined_room_count
```
**Parameters**
The following parameters should be set in the URL:
* `user_id`: fully qualified: for example, `@user:server.com`
The following should be set as query parameters in the URL:
* `from_ts`: int, required. A timestamp in ms from the unix epoch. Only
rooms joined at or after the provided timestamp will be counted.
This works by comparing the provided timestamp to the `received_ts`
column in the `events` table.
Note: https://currentmillis.com/ is a useful tool for converting dates
into timestamps and vice versa.
A response body like the following is returned:
```json
{
"cumulative_joined_room_count": 30
}
```
_Added in Synapse 1.122.0_
## Account Data
Gets information about account data for a specific `user_id`.
@@ -1444,4 +1512,6 @@ The following fields are returned in the JSON response body:
- `failed_redactions` - dictionary - the keys of the dict are event ids the process was unable to redact, if any, and the values are
the corresponding error that caused the redaction to fail
_Added in Synapse 1.116.0._
_Added in Synapse 1.116.0._

File diff suppressed because it is too large

File diff suppressed because it is too large

View File

@@ -245,7 +245,7 @@ this callback.
_First introduced in Synapse v1.37.0_
```python
async def check_username_for_spam(user_profile: synapse.module_api.UserProfile) -> bool
async def check_username_for_spam(user_profile: synapse.module_api.UserProfile, requester_id: str) -> bool
```
Called when computing search results in the user directory. The module must return a
@@ -264,6 +264,8 @@ The profile is represented as a dictionary with the following keys:
The module is given a copy of the original dictionary, so modifying it from within the
module cannot modify a user's profile when included in user directory search results.
The requester_id parameter is the ID of the user that called the user directory API.
If multiple modules implement this callback, they will be considered in order. If a
callback returns `False`, Synapse falls through to the next one. The value of the first
callback that does not return `False` will be used. If this happens, Synapse will not call

View File

@@ -74,7 +74,7 @@ server {
proxy_pass http://localhost:8008;
proxy_set_header X-Forwarded-For $remote_addr;
proxy_set_header X-Forwarded-Proto $scheme;
proxy_set_header Host $host;
proxy_set_header Host $host:$server_port;
# Nginx by default only allows file uploads up to 1M in size
# Increase client_max_body_size to match max_upload_size defined in homeserver.yaml

View File

@@ -157,7 +157,7 @@ sudo pip install py-bcrypt
#### Alpine Linux
6543 maintains [Synapse packages for Alpine Linux](https://pkgs.alpinelinux.org/packages?name=synapse&branch=edge) in the community repository. Install with:
Jahway603 maintains [Synapse packages for Alpine Linux](https://pkgs.alpinelinux.org/packages?name=synapse&branch=edge) in the community repository. Install with:
```sh
sudo apk add synapse

View File

@@ -72,8 +72,8 @@ class ExampleSpamChecker:
async def user_may_publish_room(self, userid, room_id):
return True # allow publishing of all rooms
async def check_username_for_spam(self, user_profile):
return False # allow all usernames
async def check_username_for_spam(self, user_profile, requester_id):
return False # allow all usernames regardless of requester
async def check_registration_for_spam(
self,

View File

@@ -117,6 +117,14 @@ each upgrade are complete before moving on to the next upgrade, to avoid
stacking them up. You can monitor the currently running background updates with
[the Admin API](usage/administration/admin_api/background_updates.html#status).
# Upgrading to v1.122.0
## Dropping support for PostgreSQL 11 and 12
In line with our [deprecation policy](deprecation_policy.md), we've dropped
support for PostgreSQL 11 and 12, as they are no longer supported upstream.
This release of Synapse requires PostgreSQL 13+.
# Upgrading to v1.120.0
## Removal of experimental MSC3886 feature

View File

@@ -673,8 +673,9 @@ This setting has the following sub-options:
TLS via STARTTLS *if the SMTP server supports it*. If this option is set,
Synapse will refuse to connect unless the server supports STARTTLS.
* `enable_tls`: By default, if the server supports TLS, it will be used, and the server
must present a certificate that is valid for 'smtp_host'. If this option
must present a certificate that is valid for `tlsname`. If this option
is set to false, TLS will not be used.
* `tlsname`: The domain name the SMTP server's TLS certificate must be valid for, defaulting to `smtp_host`.
* `notif_from`: defines the "From" address to use when sending emails.
It must be set if email sending is enabled. The placeholder '%(app)s' will be replaced by the application name,
which is normally set in `app_name`, but may be overridden by the
@@ -741,6 +742,7 @@ email:
force_tls: true
require_transport_security: true
enable_tls: false
tlsname: mail.server.example.com
notif_from: "Your Friendly %(app)s homeserver <noreply@example.com>"
app_name: my_branded_matrix_server
enable_notifs: true
@@ -1866,6 +1868,27 @@ rc_federation:
concurrent: 5
```
---
### `rc_presence`
This option sets ratelimiting for presence.
The `rc_presence.per_user` option sets rate limits on how often a specific
users' presence updates are evaluated. Ratelimited presence updates sent via sync are
ignored, and no error is returned to the client.
This option also sets the rate limit for the
[`PUT /_matrix/client/v3/presence/{userId}/status`](https://spec.matrix.org/latest/client-server-api/#put_matrixclientv3presenceuseridstatus)
endpoint.
`per_user` defaults to `per_second: 0.1`, `burst_count: 1`.
Example configuration:
```yaml
rc_presence:
per_user:
per_second: 0.05
burst_count: 0.5
```
---
### `federation_rr_transactions_per_room_per_second`
Sets outgoing federation transaction frequency for sending read-receipts,
@@ -3091,6 +3114,22 @@ Example configuration:
```yaml
macaroon_secret_key: <PRIVATE STRING>
```
---
### `macaroon_secret_key_path`
An alternative to [`macaroon_secret_key`](#macaroon_secret_key):
allows the secret key to be specified in an external file.
The file should be a plain text file, containing only the secret key.
Synapse reads the secret key from the given file once at startup.
Example configuration:
```yaml
macaroon_secret_key_path: /path/to/secrets/file
```
_Added in Synapse 1.121.0._
---
### `form_secret`
@@ -4447,6 +4486,10 @@ instance_map:
worker1:
host: localhost
port: 8034
other:
host: localhost
port: 8035
tls: true
```
Example configuration(#2, for UNIX sockets):
```yaml

View File

@@ -273,17 +273,6 @@ information.
^/_matrix/client/(api/v1|r0|v3|unstable)/knock/
^/_matrix/client/(api/v1|r0|v3|unstable)/profile/
# Account data requests
^/_matrix/client/(r0|v3|unstable)/.*/tags
^/_matrix/client/(r0|v3|unstable)/.*/account_data
# Receipts requests
^/_matrix/client/(r0|v3|unstable)/rooms/.*/receipt
^/_matrix/client/(r0|v3|unstable)/rooms/.*/read_markers
# Presence requests
^/_matrix/client/(api/v1|r0|v3|unstable)/presence/
# User directory search requests
^/_matrix/client/(r0|v3|unstable)/user_directory/search$
@@ -292,6 +281,13 @@ Additionally, the following REST endpoints can be handled for GET requests:
^/_matrix/client/(api/v1|r0|v3|unstable)/pushrules/
^/_matrix/client/unstable/org.matrix.msc4140/delayed_events
# Account data requests
^/_matrix/client/(r0|v3|unstable)/.*/tags
^/_matrix/client/(r0|v3|unstable)/.*/account_data
# Presence requests
^/_matrix/client/(api/v1|r0|v3|unstable)/presence/
Pagination requests can also be handled, but all requests for a given
room must be routed to the same instance. Additionally, care must be taken to
ensure that the purge history admin API is not used while pagination requests

559
poetry.lock generated
View File

@@ -1,4 +1,4 @@
# This file is automatically @generated by Poetry 1.8.3 and should not be changed by hand.
# This file is automatically @generated by Poetry 1.8.5 and should not be changed by hand.
[[package]]
name = "annotated-types"
@@ -32,13 +32,13 @@ tests-mypy = ["mypy (>=1.11.1)", "pytest-mypy-plugins"]
[[package]]
name = "authlib"
version = "1.3.2"
version = "1.4.0"
description = "The ultimate Python library in building OAuth and OpenID Connect servers and clients."
optional = true
python-versions = ">=3.8"
python-versions = ">=3.9"
files = [
{file = "Authlib-1.3.2-py2.py3-none-any.whl", hash = "sha256:ede026a95e9f5cdc2d4364a52103f5405e75aa156357e831ef2bfd0bc5094dfc"},
{file = "authlib-1.3.2.tar.gz", hash = "sha256:4b16130117f9eb82aa6eec97f6dd4673c3f960ac0283ccdae2897ee4bc030ba2"},
{file = "Authlib-1.4.0-py2.py3-none-any.whl", hash = "sha256:4bb20b978c8b636222b549317c1815e1fe62234fc1c5efe8855d84aebf3a74e3"},
{file = "authlib-1.4.0.tar.gz", hash = "sha256:1c1e6608b5ed3624aeeee136ca7f8c120d6f51f731aa152b153d54741840e1f2"},
]
[package.dependencies]
@@ -842,13 +842,13 @@ trio = ["async_generator", "trio"]
[[package]]
name = "jinja2"
version = "3.1.4"
version = "3.1.5"
description = "A very fast and expressive template engine."
optional = false
python-versions = ">=3.7"
files = [
{file = "jinja2-3.1.4-py3-none-any.whl", hash = "sha256:bc5dd2abb727a5319567b7a813e6a2e7318c39f4f487cfe6c89c6f9c7d25197d"},
{file = "jinja2-3.1.4.tar.gz", hash = "sha256:4a3aee7acbbe7303aede8e9648d13b8bf88a429282aa6122a993f0ac800cb369"},
{file = "jinja2-3.1.5-py3-none-any.whl", hash = "sha256:aba0f4dc9ed8013c424088f68a5c226f7d6097ed89b246d7749c2ec4175c6adb"},
{file = "jinja2-3.1.5.tar.gz", hash = "sha256:8fefff8dc3034e27bb80d67c671eb8a9bc424c0ef4c0826edbff304cceff43bb"},
]
[package.dependencies]
@@ -1314,38 +1314,43 @@ files = [
[[package]]
name = "mypy"
version = "1.11.2"
version = "1.13.0"
description = "Optional static typing for Python"
optional = false
python-versions = ">=3.8"
files = [
{file = "mypy-1.11.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d42a6dd818ffce7be66cce644f1dff482f1d97c53ca70908dff0b9ddc120b77a"},
{file = "mypy-1.11.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:801780c56d1cdb896eacd5619a83e427ce436d86a3bdf9112527f24a66618fef"},
{file = "mypy-1.11.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:41ea707d036a5307ac674ea172875f40c9d55c5394f888b168033177fce47383"},
{file = "mypy-1.11.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:6e658bd2d20565ea86da7d91331b0eed6d2eee22dc031579e6297f3e12c758c8"},
{file = "mypy-1.11.2-cp310-cp310-win_amd64.whl", hash = "sha256:478db5f5036817fe45adb7332d927daa62417159d49783041338921dcf646fc7"},
{file = "mypy-1.11.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:75746e06d5fa1e91bfd5432448d00d34593b52e7e91a187d981d08d1f33d4385"},
{file = "mypy-1.11.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:a976775ab2256aadc6add633d44f100a2517d2388906ec4f13231fafbb0eccca"},
{file = "mypy-1.11.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:cd953f221ac1379050a8a646585a29574488974f79d8082cedef62744f0a0104"},
{file = "mypy-1.11.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:57555a7715c0a34421013144a33d280e73c08df70f3a18a552938587ce9274f4"},
{file = "mypy-1.11.2-cp311-cp311-win_amd64.whl", hash = "sha256:36383a4fcbad95f2657642a07ba22ff797de26277158f1cc7bd234821468b1b6"},
{file = "mypy-1.11.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:e8960dbbbf36906c5c0b7f4fbf2f0c7ffb20f4898e6a879fcf56a41a08b0d318"},
{file = "mypy-1.11.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:06d26c277962f3fb50e13044674aa10553981ae514288cb7d0a738f495550b36"},
{file = "mypy-1.11.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:6e7184632d89d677973a14d00ae4d03214c8bc301ceefcdaf5c474866814c987"},
{file = "mypy-1.11.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:3a66169b92452f72117e2da3a576087025449018afc2d8e9bfe5ffab865709ca"},
{file = "mypy-1.11.2-cp312-cp312-win_amd64.whl", hash = "sha256:969ea3ef09617aff826885a22ece0ddef69d95852cdad2f60c8bb06bf1f71f70"},
{file = "mypy-1.11.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:37c7fa6121c1cdfcaac97ce3d3b5588e847aa79b580c1e922bb5d5d2902df19b"},
{file = "mypy-1.11.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:4a8a53bc3ffbd161b5b2a4fff2f0f1e23a33b0168f1c0778ec70e1a3d66deb86"},
{file = "mypy-1.11.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:2ff93107f01968ed834f4256bc1fc4475e2fecf6c661260066a985b52741ddce"},
{file = "mypy-1.11.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:edb91dded4df17eae4537668b23f0ff6baf3707683734b6a818d5b9d0c0c31a1"},
{file = "mypy-1.11.2-cp38-cp38-win_amd64.whl", hash = "sha256:ee23de8530d99b6db0573c4ef4bd8f39a2a6f9b60655bf7a1357e585a3486f2b"},
{file = "mypy-1.11.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:801ca29f43d5acce85f8e999b1e431fb479cb02d0e11deb7d2abb56bdaf24fd6"},
{file = "mypy-1.11.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:af8d155170fcf87a2afb55b35dc1a0ac21df4431e7d96717621962e4b9192e70"},
{file = "mypy-1.11.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f7821776e5c4286b6a13138cc935e2e9b6fde05e081bdebf5cdb2bb97c9df81d"},
{file = "mypy-1.11.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:539c570477a96a4e6fb718b8d5c3e0c0eba1f485df13f86d2970c91f0673148d"},
{file = "mypy-1.11.2-cp39-cp39-win_amd64.whl", hash = "sha256:3f14cd3d386ac4d05c5a39a51b84387403dadbd936e17cb35882134d4f8f0d24"},
{file = "mypy-1.11.2-py3-none-any.whl", hash = "sha256:b499bc07dbdcd3de92b0a8b29fdf592c111276f6a12fe29c30f6c417dd546d12"},
{file = "mypy-1.11.2.tar.gz", hash = "sha256:7f9993ad3e0ffdc95c2a14b66dee63729f021968bff8ad911867579c65d13a79"},
{file = "mypy-1.13.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:6607e0f1dd1fb7f0aca14d936d13fd19eba5e17e1cd2a14f808fa5f8f6d8f60a"},
{file = "mypy-1.13.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:8a21be69bd26fa81b1f80a61ee7ab05b076c674d9b18fb56239d72e21d9f4c80"},
{file = "mypy-1.13.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:7b2353a44d2179846a096e25691d54d59904559f4232519d420d64da6828a3a7"},
{file = "mypy-1.13.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:0730d1c6a2739d4511dc4253f8274cdd140c55c32dfb0a4cf8b7a43f40abfa6f"},
{file = "mypy-1.13.0-cp310-cp310-win_amd64.whl", hash = "sha256:c5fc54dbb712ff5e5a0fca797e6e0aa25726c7e72c6a5850cfd2adbc1eb0a372"},
{file = "mypy-1.13.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:581665e6f3a8a9078f28d5502f4c334c0c8d802ef55ea0e7276a6e409bc0d82d"},
{file = "mypy-1.13.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:3ddb5b9bf82e05cc9a627e84707b528e5c7caaa1c55c69e175abb15a761cec2d"},
{file = "mypy-1.13.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:20c7ee0bc0d5a9595c46f38beb04201f2620065a93755704e141fcac9f59db2b"},
{file = "mypy-1.13.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:3790ded76f0b34bc9c8ba4def8f919dd6a46db0f5a6610fb994fe8efdd447f73"},
{file = "mypy-1.13.0-cp311-cp311-win_amd64.whl", hash = "sha256:51f869f4b6b538229c1d1bcc1dd7d119817206e2bc54e8e374b3dfa202defcca"},
{file = "mypy-1.13.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:5c7051a3461ae84dfb5dd15eff5094640c61c5f22257c8b766794e6dd85e72d5"},
{file = "mypy-1.13.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:39bb21c69a5d6342f4ce526e4584bc5c197fd20a60d14a8624d8743fffb9472e"},
{file = "mypy-1.13.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:164f28cb9d6367439031f4c81e84d3ccaa1e19232d9d05d37cb0bd880d3f93c2"},
{file = "mypy-1.13.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:a4c1bfcdbce96ff5d96fc9b08e3831acb30dc44ab02671eca5953eadad07d6d0"},
{file = "mypy-1.13.0-cp312-cp312-win_amd64.whl", hash = "sha256:a0affb3a79a256b4183ba09811e3577c5163ed06685e4d4b46429a271ba174d2"},
{file = "mypy-1.13.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:a7b44178c9760ce1a43f544e595d35ed61ac2c3de306599fa59b38a6048e1aa7"},
{file = "mypy-1.13.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:5d5092efb8516d08440e36626f0153b5006d4088c1d663d88bf79625af3d1d62"},
{file = "mypy-1.13.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:de2904956dac40ced10931ac967ae63c5089bd498542194b436eb097a9f77bc8"},
{file = "mypy-1.13.0-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:7bfd8836970d33c2105562650656b6846149374dc8ed77d98424b40b09340ba7"},
{file = "mypy-1.13.0-cp313-cp313-win_amd64.whl", hash = "sha256:9f73dba9ec77acb86457a8fc04b5239822df0c14a082564737833d2963677dbc"},
{file = "mypy-1.13.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:100fac22ce82925f676a734af0db922ecfea991e1d7ec0ceb1e115ebe501301a"},
{file = "mypy-1.13.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:7bcb0bb7f42a978bb323a7c88f1081d1b5dee77ca86f4100735a6f541299d8fb"},
{file = "mypy-1.13.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bde31fc887c213e223bbfc34328070996061b0833b0a4cfec53745ed61f3519b"},
{file = "mypy-1.13.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:07de989f89786f62b937851295ed62e51774722e5444a27cecca993fc3f9cd74"},
{file = "mypy-1.13.0-cp38-cp38-win_amd64.whl", hash = "sha256:4bde84334fbe19bad704b3f5b78c4abd35ff1026f8ba72b29de70dda0916beb6"},
{file = "mypy-1.13.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:0246bcb1b5de7f08f2826451abd947bf656945209b140d16ed317f65a17dc7dc"},
{file = "mypy-1.13.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:7f5b7deae912cf8b77e990b9280f170381fdfbddf61b4ef80927edd813163732"},
{file = "mypy-1.13.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:7029881ec6ffb8bc233a4fa364736789582c738217b133f1b55967115288a2bc"},
{file = "mypy-1.13.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:3e38b980e5681f28f033f3be86b099a247b13c491f14bb8b1e1e134d23bb599d"},
{file = "mypy-1.13.0-cp39-cp39-win_amd64.whl", hash = "sha256:a6789be98a2017c912ae6ccb77ea553bbaf13d27605d2ca20a76dfbced631b24"},
{file = "mypy-1.13.0-py3-none-any.whl", hash = "sha256:9c250883f9fd81d212e0952c92dbfcc96fc237f4b7c92f56ac81fd48460b3e5a"},
{file = "mypy-1.13.0.tar.gz", hash = "sha256:0291a61b6fbf3e6673e3405cfcc0e7650bebc7939659fdca2702958038bd835e"},
]
[package.dependencies]
@@ -1355,6 +1360,7 @@ typing-extensions = ">=4.6.0"
[package.extras]
dmypy = ["psutil (>=4.0)"]
faster-cache = ["orjson"]
install-types = ["pip"]
mypyc = ["setuptools (>=50)"]
reports = ["lxml"]
@@ -1372,17 +1378,17 @@ files = [
[[package]]
name = "mypy-zope"
version = "1.0.8"
version = "1.0.9"
description = "Plugin for mypy to support zope interfaces"
optional = false
python-versions = "*"
files = [
{file = "mypy_zope-1.0.8-py3-none-any.whl", hash = "sha256:8794a77dae0c7e2f28b8ac48569091310b3ee45bb9d6cd4797dcb837c40f9976"},
{file = "mypy_zope-1.0.8.tar.gz", hash = "sha256:854303a95aefc4289e8a0796808e002c2c7ecde0a10a8f7b8f48092f94ef9b9f"},
{file = "mypy_zope-1.0.9-py3-none-any.whl", hash = "sha256:6666c1556891a3cb186137519dbd7a58cb30fb72b2504798cad47b35391921ba"},
{file = "mypy_zope-1.0.9.tar.gz", hash = "sha256:37d6985dfb05a4c27b35cff47577fd5bad878db4893ddedf54d165f7389a1cdb"},
]
[package.dependencies]
mypy = ">=1.0.0,<1.13.0"
mypy = ">=1.0.0,<1.14.0"
"zope.interface" = "*"
"zope.schema" = "*"
@@ -1454,98 +1460,89 @@ files = [
[[package]]
name = "pillow"
version = "10.4.0"
version = "11.1.0"
description = "Python Imaging Library (Fork)"
optional = false
python-versions = ">=3.8"
python-versions = ">=3.9"
files = [
{file = "pillow-10.4.0-cp310-cp310-macosx_10_10_x86_64.whl", hash = "sha256:4d9667937cfa347525b319ae34375c37b9ee6b525440f3ef48542fcf66f2731e"},
{file = "pillow-10.4.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:543f3dc61c18dafb755773efc89aae60d06b6596a63914107f75459cf984164d"},
{file = "pillow-10.4.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7928ecbf1ece13956b95d9cbcfc77137652b02763ba384d9ab508099a2eca856"},
{file = "pillow-10.4.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e4d49b85c4348ea0b31ea63bc75a9f3857869174e2bf17e7aba02945cd218e6f"},
{file = "pillow-10.4.0-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:6c762a5b0997f5659a5ef2266abc1d8851ad7749ad9a6a5506eb23d314e4f46b"},
{file = "pillow-10.4.0-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:a985e028fc183bf12a77a8bbf36318db4238a3ded7fa9df1b9a133f1cb79f8fc"},
{file = "pillow-10.4.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:812f7342b0eee081eaec84d91423d1b4650bb9828eb53d8511bcef8ce5aecf1e"},
{file = "pillow-10.4.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:ac1452d2fbe4978c2eec89fb5a23b8387aba707ac72810d9490118817d9c0b46"},
{file = "pillow-10.4.0-cp310-cp310-win32.whl", hash = "sha256:bcd5e41a859bf2e84fdc42f4edb7d9aba0a13d29a2abadccafad99de3feff984"},
{file = "pillow-10.4.0-cp310-cp310-win_amd64.whl", hash = "sha256:ecd85a8d3e79cd7158dec1c9e5808e821feea088e2f69a974db5edf84dc53141"},
{file = "pillow-10.4.0-cp310-cp310-win_arm64.whl", hash = "sha256:ff337c552345e95702c5fde3158acb0625111017d0e5f24bf3acdb9cc16b90d1"},
{file = "pillow-10.4.0-cp311-cp311-macosx_10_10_x86_64.whl", hash = "sha256:0a9ec697746f268507404647e531e92889890a087e03681a3606d9b920fbee3c"},
{file = "pillow-10.4.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:dfe91cb65544a1321e631e696759491ae04a2ea11d36715eca01ce07284738be"},
{file = "pillow-10.4.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5dc6761a6efc781e6a1544206f22c80c3af4c8cf461206d46a1e6006e4429ff3"},
{file = "pillow-10.4.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5e84b6cc6a4a3d76c153a6b19270b3526a5a8ed6b09501d3af891daa2a9de7d6"},
{file = "pillow-10.4.0-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:bbc527b519bd3aa9d7f429d152fea69f9ad37c95f0b02aebddff592688998abe"},
{file = "pillow-10.4.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:76a911dfe51a36041f2e756b00f96ed84677cdeb75d25c767f296c1c1eda1319"},
{file = "pillow-10.4.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:59291fb29317122398786c2d44427bbd1a6d7ff54017075b22be9d21aa59bd8d"},
{file = "pillow-10.4.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:416d3a5d0e8cfe4f27f574362435bc9bae57f679a7158e0096ad2beb427b8696"},
{file = "pillow-10.4.0-cp311-cp311-win32.whl", hash = "sha256:7086cc1d5eebb91ad24ded9f58bec6c688e9f0ed7eb3dbbf1e4800280a896496"},
{file = "pillow-10.4.0-cp311-cp311-win_amd64.whl", hash = "sha256:cbed61494057c0f83b83eb3a310f0bf774b09513307c434d4366ed64f4128a91"},
{file = "pillow-10.4.0-cp311-cp311-win_arm64.whl", hash = "sha256:f5f0c3e969c8f12dd2bb7e0b15d5c468b51e5017e01e2e867335c81903046a22"},
{file = "pillow-10.4.0-cp312-cp312-macosx_10_10_x86_64.whl", hash = "sha256:673655af3eadf4df6b5457033f086e90299fdd7a47983a13827acf7459c15d94"},
{file = "pillow-10.4.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:866b6942a92f56300012f5fbac71f2d610312ee65e22f1aa2609e491284e5597"},
{file = "pillow-10.4.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:29dbdc4207642ea6aad70fbde1a9338753d33fb23ed6956e706936706f52dd80"},
{file = "pillow-10.4.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bf2342ac639c4cf38799a44950bbc2dfcb685f052b9e262f446482afaf4bffca"},
{file = "pillow-10.4.0-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:f5b92f4d70791b4a67157321c4e8225d60b119c5cc9aee8ecf153aace4aad4ef"},
{file = "pillow-10.4.0-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:86dcb5a1eb778d8b25659d5e4341269e8590ad6b4e8b44d9f4b07f8d136c414a"},
{file = "pillow-10.4.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:780c072c2e11c9b2c7ca37f9a2ee8ba66f44367ac3e5c7832afcfe5104fd6d1b"},
{file = "pillow-10.4.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:37fb69d905be665f68f28a8bba3c6d3223c8efe1edf14cc4cfa06c241f8c81d9"},
{file = "pillow-10.4.0-cp312-cp312-win32.whl", hash = "sha256:7dfecdbad5c301d7b5bde160150b4db4c659cee2b69589705b6f8a0c509d9f42"},
{file = "pillow-10.4.0-cp312-cp312-win_amd64.whl", hash = "sha256:1d846aea995ad352d4bdcc847535bd56e0fd88d36829d2c90be880ef1ee4668a"},
{file = "pillow-10.4.0-cp312-cp312-win_arm64.whl", hash = "sha256:e553cad5179a66ba15bb18b353a19020e73a7921296a7979c4a2b7f6a5cd57f9"},
{file = "pillow-10.4.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:8bc1a764ed8c957a2e9cacf97c8b2b053b70307cf2996aafd70e91a082e70df3"},
{file = "pillow-10.4.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:6209bb41dc692ddfee4942517c19ee81b86c864b626dbfca272ec0f7cff5d9fb"},
{file = "pillow-10.4.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bee197b30783295d2eb680b311af15a20a8b24024a19c3a26431ff83eb8d1f70"},
{file = "pillow-10.4.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1ef61f5dd14c300786318482456481463b9d6b91ebe5ef12f405afbba77ed0be"},
{file = "pillow-10.4.0-cp313-cp313-manylinux_2_28_aarch64.whl", hash = "sha256:297e388da6e248c98bc4a02e018966af0c5f92dfacf5a5ca22fa01cb3179bca0"},
{file = "pillow-10.4.0-cp313-cp313-manylinux_2_28_x86_64.whl", hash = "sha256:e4db64794ccdf6cb83a59d73405f63adbe2a1887012e308828596100a0b2f6cc"},
{file = "pillow-10.4.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:bd2880a07482090a3bcb01f4265f1936a903d70bc740bfcb1fd4e8a2ffe5cf5a"},
{file = "pillow-10.4.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:4b35b21b819ac1dbd1233317adeecd63495f6babf21b7b2512d244ff6c6ce309"},
{file = "pillow-10.4.0-cp313-cp313-win32.whl", hash = "sha256:551d3fd6e9dc15e4c1eb6fc4ba2b39c0c7933fa113b220057a34f4bb3268a060"},
{file = "pillow-10.4.0-cp313-cp313-win_amd64.whl", hash = "sha256:030abdbe43ee02e0de642aee345efa443740aa4d828bfe8e2eb11922ea6a21ea"},
{file = "pillow-10.4.0-cp313-cp313-win_arm64.whl", hash = "sha256:5b001114dd152cfd6b23befeb28d7aee43553e2402c9f159807bf55f33af8a8d"},
{file = "pillow-10.4.0-cp38-cp38-macosx_10_10_x86_64.whl", hash = "sha256:8d4d5063501b6dd4024b8ac2f04962d661222d120381272deea52e3fc52d3736"},
{file = "pillow-10.4.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:7c1ee6f42250df403c5f103cbd2768a28fe1a0ea1f0f03fe151c8741e1469c8b"},
{file = "pillow-10.4.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b15e02e9bb4c21e39876698abf233c8c579127986f8207200bc8a8f6bb27acf2"},
{file = "pillow-10.4.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7a8d4bade9952ea9a77d0c3e49cbd8b2890a399422258a77f357b9cc9be8d680"},
{file = "pillow-10.4.0-cp38-cp38-manylinux_2_28_aarch64.whl", hash = "sha256:43efea75eb06b95d1631cb784aa40156177bf9dd5b4b03ff38979e048258bc6b"},
{file = "pillow-10.4.0-cp38-cp38-manylinux_2_28_x86_64.whl", hash = "sha256:950be4d8ba92aca4b2bb0741285a46bfae3ca699ef913ec8416c1b78eadd64cd"},
{file = "pillow-10.4.0-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:d7480af14364494365e89d6fddc510a13e5a2c3584cb19ef65415ca57252fb84"},
{file = "pillow-10.4.0-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:73664fe514b34c8f02452ffb73b7a92c6774e39a647087f83d67f010eb9a0cf0"},
{file = "pillow-10.4.0-cp38-cp38-win32.whl", hash = "sha256:e88d5e6ad0d026fba7bdab8c3f225a69f063f116462c49892b0149e21b6c0a0e"},
{file = "pillow-10.4.0-cp38-cp38-win_amd64.whl", hash = "sha256:5161eef006d335e46895297f642341111945e2c1c899eb406882a6c61a4357ab"},
{file = "pillow-10.4.0-cp39-cp39-macosx_10_10_x86_64.whl", hash = "sha256:0ae24a547e8b711ccaaf99c9ae3cd975470e1a30caa80a6aaee9a2f19c05701d"},
{file = "pillow-10.4.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:298478fe4f77a4408895605f3482b6cc6222c018b2ce565c2b6b9c354ac3229b"},
{file = "pillow-10.4.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:134ace6dc392116566980ee7436477d844520a26a4b1bd4053f6f47d096997fd"},
{file = "pillow-10.4.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:930044bb7679ab003b14023138b50181899da3f25de50e9dbee23b61b4de2126"},
{file = "pillow-10.4.0-cp39-cp39-manylinux_2_28_aarch64.whl", hash = "sha256:c76e5786951e72ed3686e122d14c5d7012f16c8303a674d18cdcd6d89557fc5b"},
{file = "pillow-10.4.0-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:b2724fdb354a868ddf9a880cb84d102da914e99119211ef7ecbdc613b8c96b3c"},
{file = "pillow-10.4.0-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:dbc6ae66518ab3c5847659e9988c3b60dc94ffb48ef9168656e0019a93dbf8a1"},
{file = "pillow-10.4.0-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:06b2f7898047ae93fad74467ec3d28fe84f7831370e3c258afa533f81ef7f3df"},
{file = "pillow-10.4.0-cp39-cp39-win32.whl", hash = "sha256:7970285ab628a3779aecc35823296a7869f889b8329c16ad5a71e4901a3dc4ef"},
{file = "pillow-10.4.0-cp39-cp39-win_amd64.whl", hash = "sha256:961a7293b2457b405967af9c77dcaa43cc1a8cd50d23c532e62d48ab6cdd56f5"},
{file = "pillow-10.4.0-cp39-cp39-win_arm64.whl", hash = "sha256:32cda9e3d601a52baccb2856b8ea1fc213c90b340c542dcef77140dfa3278a9e"},
{file = "pillow-10.4.0-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:5b4815f2e65b30f5fbae9dfffa8636d992d49705723fe86a3661806e069352d4"},
{file = "pillow-10.4.0-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:8f0aef4ef59694b12cadee839e2ba6afeab89c0f39a3adc02ed51d109117b8da"},
{file = "pillow-10.4.0-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9f4727572e2918acaa9077c919cbbeb73bd2b3ebcfe033b72f858fc9fbef0026"},
{file = "pillow-10.4.0-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ff25afb18123cea58a591ea0244b92eb1e61a1fd497bf6d6384f09bc3262ec3e"},
{file = "pillow-10.4.0-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:dc3e2db6ba09ffd7d02ae9141cfa0ae23393ee7687248d46a7507b75d610f4f5"},
{file = "pillow-10.4.0-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:02a2be69f9c9b8c1e97cf2713e789d4e398c751ecfd9967c18d0ce304efbf885"},
{file = "pillow-10.4.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:0755ffd4a0c6f267cccbae2e9903d95477ca2f77c4fcf3a3a09570001856c8a5"},
{file = "pillow-10.4.0-pp39-pypy39_pp73-macosx_10_15_x86_64.whl", hash = "sha256:a02364621fe369e06200d4a16558e056fe2805d3468350df3aef21e00d26214b"},
{file = "pillow-10.4.0-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:1b5dea9831a90e9d0721ec417a80d4cbd7022093ac38a568db2dd78363b00908"},
{file = "pillow-10.4.0-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9b885f89040bb8c4a1573566bbb2f44f5c505ef6e74cec7ab9068c900047f04b"},
{file = "pillow-10.4.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:87dd88ded2e6d74d31e1e0a99a726a6765cda32d00ba72dc37f0651f306daaa8"},
{file = "pillow-10.4.0-pp39-pypy39_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:2db98790afc70118bd0255c2eeb465e9767ecf1f3c25f9a1abb8ffc8cfd1fe0a"},
{file = "pillow-10.4.0-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:f7baece4ce06bade126fb84b8af1c33439a76d8a6fd818970215e0560ca28c27"},
{file = "pillow-10.4.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:cfdd747216947628af7b259d274771d84db2268ca062dd5faf373639d00113a3"},
{file = "pillow-10.4.0.tar.gz", hash = "sha256:166c1cd4d24309b30d61f79f4a9114b7b2313d7450912277855ff5dfd7cd4a06"},
{file = "pillow-11.1.0-cp310-cp310-macosx_10_10_x86_64.whl", hash = "sha256:e1abe69aca89514737465752b4bcaf8016de61b3be1397a8fc260ba33321b3a8"},
{file = "pillow-11.1.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:c640e5a06869c75994624551f45e5506e4256562ead981cce820d5ab39ae2192"},
{file = "pillow-11.1.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a07dba04c5e22824816b2615ad7a7484432d7f540e6fa86af60d2de57b0fcee2"},
{file = "pillow-11.1.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e267b0ed063341f3e60acd25c05200df4193e15a4a5807075cd71225a2386e26"},
{file = "pillow-11.1.0-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:bd165131fd51697e22421d0e467997ad31621b74bfc0b75956608cb2906dda07"},
{file = "pillow-11.1.0-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:abc56501c3fd148d60659aae0af6ddc149660469082859fa7b066a298bde9482"},
{file = "pillow-11.1.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:54ce1c9a16a9561b6d6d8cb30089ab1e5eb66918cb47d457bd996ef34182922e"},
{file = "pillow-11.1.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:73ddde795ee9b06257dac5ad42fcb07f3b9b813f8c1f7f870f402f4dc54b5269"},
{file = "pillow-11.1.0-cp310-cp310-win32.whl", hash = "sha256:3a5fe20a7b66e8135d7fd617b13272626a28278d0e578c98720d9ba4b2439d49"},
{file = "pillow-11.1.0-cp310-cp310-win_amd64.whl", hash = "sha256:b6123aa4a59d75f06e9dd3dac5bf8bc9aa383121bb3dd9a7a612e05eabc9961a"},
{file = "pillow-11.1.0-cp310-cp310-win_arm64.whl", hash = "sha256:a76da0a31da6fcae4210aa94fd779c65c75786bc9af06289cd1c184451ef7a65"},
{file = "pillow-11.1.0-cp311-cp311-macosx_10_10_x86_64.whl", hash = "sha256:e06695e0326d05b06833b40b7ef477e475d0b1ba3a6d27da1bb48c23209bf457"},
{file = "pillow-11.1.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:96f82000e12f23e4f29346e42702b6ed9a2f2fea34a740dd5ffffcc8c539eb35"},
{file = "pillow-11.1.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a3cd561ded2cf2bbae44d4605837221b987c216cff94f49dfeed63488bb228d2"},
{file = "pillow-11.1.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f189805c8be5ca5add39e6f899e6ce2ed824e65fb45f3c28cb2841911da19070"},
{file = "pillow-11.1.0-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:dd0052e9db3474df30433f83a71b9b23bd9e4ef1de13d92df21a52c0303b8ab6"},
{file = "pillow-11.1.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:837060a8599b8f5d402e97197d4924f05a2e0d68756998345c829c33186217b1"},
{file = "pillow-11.1.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:aa8dd43daa836b9a8128dbe7d923423e5ad86f50a7a14dc688194b7be5c0dea2"},
{file = "pillow-11.1.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:0a2f91f8a8b367e7a57c6e91cd25af510168091fb89ec5146003e424e1558a96"},
{file = "pillow-11.1.0-cp311-cp311-win32.whl", hash = "sha256:c12fc111ef090845de2bb15009372175d76ac99969bdf31e2ce9b42e4b8cd88f"},
{file = "pillow-11.1.0-cp311-cp311-win_amd64.whl", hash = "sha256:fbd43429d0d7ed6533b25fc993861b8fd512c42d04514a0dd6337fb3ccf22761"},
{file = "pillow-11.1.0-cp311-cp311-win_arm64.whl", hash = "sha256:f7955ecf5609dee9442cbface754f2c6e541d9e6eda87fad7f7a989b0bdb9d71"},
{file = "pillow-11.1.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:2062ffb1d36544d42fcaa277b069c88b01bb7298f4efa06731a7fd6cc290b81a"},
{file = "pillow-11.1.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:a85b653980faad27e88b141348707ceeef8a1186f75ecc600c395dcac19f385b"},
{file = "pillow-11.1.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9409c080586d1f683df3f184f20e36fb647f2e0bc3988094d4fd8c9f4eb1b3b3"},
{file = "pillow-11.1.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7fdadc077553621911f27ce206ffcbec7d3f8d7b50e0da39f10997e8e2bb7f6a"},
{file = "pillow-11.1.0-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:93a18841d09bcdd774dcdc308e4537e1f867b3dec059c131fde0327899734aa1"},
{file = "pillow-11.1.0-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:9aa9aeddeed452b2f616ff5507459e7bab436916ccb10961c4a382cd3e03f47f"},
{file = "pillow-11.1.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:3cdcdb0b896e981678eee140d882b70092dac83ac1cdf6b3a60e2216a73f2b91"},
{file = "pillow-11.1.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:36ba10b9cb413e7c7dfa3e189aba252deee0602c86c309799da5a74009ac7a1c"},
{file = "pillow-11.1.0-cp312-cp312-win32.whl", hash = "sha256:cfd5cd998c2e36a862d0e27b2df63237e67273f2fc78f47445b14e73a810e7e6"},
{file = "pillow-11.1.0-cp312-cp312-win_amd64.whl", hash = "sha256:a697cd8ba0383bba3d2d3ada02b34ed268cb548b369943cd349007730c92bddf"},
{file = "pillow-11.1.0-cp312-cp312-win_arm64.whl", hash = "sha256:4dd43a78897793f60766563969442020e90eb7847463eca901e41ba186a7d4a5"},
{file = "pillow-11.1.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:ae98e14432d458fc3de11a77ccb3ae65ddce70f730e7c76140653048c71bfcbc"},
{file = "pillow-11.1.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:cc1331b6d5a6e144aeb5e626f4375f5b7ae9934ba620c0ac6b3e43d5e683a0f0"},
{file = "pillow-11.1.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:758e9d4ef15d3560214cddbc97b8ef3ef86ce04d62ddac17ad39ba87e89bd3b1"},
{file = "pillow-11.1.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b523466b1a31d0dcef7c5be1f20b942919b62fd6e9a9be199d035509cbefc0ec"},
{file = "pillow-11.1.0-cp313-cp313-manylinux_2_28_aarch64.whl", hash = "sha256:9044b5e4f7083f209c4e35aa5dd54b1dd5b112b108648f5c902ad586d4f945c5"},
{file = "pillow-11.1.0-cp313-cp313-manylinux_2_28_x86_64.whl", hash = "sha256:3764d53e09cdedd91bee65c2527815d315c6b90d7b8b79759cc48d7bf5d4f114"},
{file = "pillow-11.1.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:31eba6bbdd27dde97b0174ddf0297d7a9c3a507a8a1480e1e60ef914fe23d352"},
{file = "pillow-11.1.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:b5d658fbd9f0d6eea113aea286b21d3cd4d3fd978157cbf2447a6035916506d3"},
{file = "pillow-11.1.0-cp313-cp313-win32.whl", hash = "sha256:f86d3a7a9af5d826744fabf4afd15b9dfef44fe69a98541f666f66fbb8d3fef9"},
{file = "pillow-11.1.0-cp313-cp313-win_amd64.whl", hash = "sha256:593c5fd6be85da83656b93ffcccc2312d2d149d251e98588b14fbc288fd8909c"},
{file = "pillow-11.1.0-cp313-cp313-win_arm64.whl", hash = "sha256:11633d58b6ee5733bde153a8dafd25e505ea3d32e261accd388827ee987baf65"},
{file = "pillow-11.1.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:70ca5ef3b3b1c4a0812b5c63c57c23b63e53bc38e758b37a951e5bc466449861"},
{file = "pillow-11.1.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:8000376f139d4d38d6851eb149b321a52bb8893a88dae8ee7d95840431977081"},
{file = "pillow-11.1.0-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9ee85f0696a17dd28fbcfceb59f9510aa71934b483d1f5601d1030c3c8304f3c"},
{file = "pillow-11.1.0-cp313-cp313t-manylinux_2_28_x86_64.whl", hash = "sha256:dd0e081319328928531df7a0e63621caf67652c8464303fd102141b785ef9547"},
{file = "pillow-11.1.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:e63e4e5081de46517099dc30abe418122f54531a6ae2ebc8680bcd7096860eab"},
{file = "pillow-11.1.0-cp313-cp313t-win32.whl", hash = "sha256:dda60aa465b861324e65a78c9f5cf0f4bc713e4309f83bc387be158b077963d9"},
{file = "pillow-11.1.0-cp313-cp313t-win_amd64.whl", hash = "sha256:ad5db5781c774ab9a9b2c4302bbf0c1014960a0a7be63278d13ae6fdf88126fe"},
{file = "pillow-11.1.0-cp313-cp313t-win_arm64.whl", hash = "sha256:67cd427c68926108778a9005f2a04adbd5e67c442ed21d95389fe1d595458756"},
{file = "pillow-11.1.0-cp39-cp39-macosx_10_10_x86_64.whl", hash = "sha256:bf902d7413c82a1bfa08b06a070876132a5ae6b2388e2712aab3a7cbc02205c6"},
{file = "pillow-11.1.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:c1eec9d950b6fe688edee07138993e54ee4ae634c51443cfb7c1e7613322718e"},
{file = "pillow-11.1.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8e275ee4cb11c262bd108ab2081f750db2a1c0b8c12c1897f27b160c8bd57bbc"},
{file = "pillow-11.1.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4db853948ce4e718f2fc775b75c37ba2efb6aaea41a1a5fc57f0af59eee774b2"},
{file = "pillow-11.1.0-cp39-cp39-manylinux_2_28_aarch64.whl", hash = "sha256:ab8a209b8485d3db694fa97a896d96dd6533d63c22829043fd9de627060beade"},
{file = "pillow-11.1.0-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:54251ef02a2309b5eec99d151ebf5c9904b77976c8abdcbce7891ed22df53884"},
{file = "pillow-11.1.0-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:5bb94705aea800051a743aa4874bb1397d4695fb0583ba5e425ee0328757f196"},
{file = "pillow-11.1.0-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:89dbdb3e6e9594d512780a5a1c42801879628b38e3efc7038094430844e271d8"},
{file = "pillow-11.1.0-cp39-cp39-win32.whl", hash = "sha256:e5449ca63da169a2e6068dd0e2fcc8d91f9558aba89ff6d02121ca8ab11e79e5"},
{file = "pillow-11.1.0-cp39-cp39-win_amd64.whl", hash = "sha256:3362c6ca227e65c54bf71a5f88b3d4565ff1bcbc63ae72c34b07bbb1cc59a43f"},
{file = "pillow-11.1.0-cp39-cp39-win_arm64.whl", hash = "sha256:b20be51b37a75cc54c2c55def3fa2c65bb94ba859dde241cd0a4fd302de5ae0a"},
{file = "pillow-11.1.0-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:8c730dc3a83e5ac137fbc92dfcfe1511ce3b2b5d7578315b63dbbb76f7f51d90"},
{file = "pillow-11.1.0-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:7d33d2fae0e8b170b6a6c57400e077412240f6f5bb2a342cf1ee512a787942bb"},
{file = "pillow-11.1.0-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a8d65b38173085f24bc07f8b6c505cbb7418009fa1a1fcb111b1f4961814a442"},
{file = "pillow-11.1.0-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:015c6e863faa4779251436db398ae75051469f7c903b043a48f078e437656f83"},
{file = "pillow-11.1.0-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:d44ff19eea13ae4acdaaab0179fa68c0c6f2f45d66a4d8ec1eda7d6cecbcc15f"},
{file = "pillow-11.1.0-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:d3d8da4a631471dfaf94c10c85f5277b1f8e42ac42bade1ac67da4b4a7359b73"},
{file = "pillow-11.1.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:4637b88343166249fe8aa94e7c4a62a180c4b3898283bb5d3d2fd5fe10d8e4e0"},
{file = "pillow-11.1.0.tar.gz", hash = "sha256:368da70808b36d73b4b390a8ffac11069f8a5c85f29eff1f1b01bcf3ef5b2a20"},
]
[package.extras]
docs = ["furo", "olefile", "sphinx (>=7.3)", "sphinx-copybutton", "sphinx-inline-tabs", "sphinxext-opengraph"]
docs = ["furo", "olefile", "sphinx (>=8.1)", "sphinx-copybutton", "sphinx-inline-tabs", "sphinxext-opengraph"]
fpx = ["olefile"]
mic = ["olefile"]
tests = ["check-manifest", "coverage", "defusedxml", "markdown2", "olefile", "packaging", "pyroma", "pytest", "pytest-cov", "pytest-timeout"]
tests = ["check-manifest", "coverage (>=7.4.2)", "defusedxml", "markdown2", "olefile", "packaging", "pyroma", "pytest", "pytest-cov", "pytest-timeout", "trove-classifiers (>=2024.10.12)"]
typing = ["typing-extensions"]
xmp = ["defusedxml"]
@@ -1590,6 +1587,7 @@ files = [
{file = "psycopg2-2.9.10-cp311-cp311-win_amd64.whl", hash = "sha256:0435034157049f6846e95103bd8f5a668788dd913a7c30162ca9503fdf542cb4"},
{file = "psycopg2-2.9.10-cp312-cp312-win32.whl", hash = "sha256:65a63d7ab0e067e2cdb3cf266de39663203d38d6a8ed97f5ca0cb315c73fe067"},
{file = "psycopg2-2.9.10-cp312-cp312-win_amd64.whl", hash = "sha256:4a579d6243da40a7b3182e0430493dbd55950c493d8c68f4eec0b302f6bbf20e"},
{file = "psycopg2-2.9.10-cp313-cp313-win_amd64.whl", hash = "sha256:91fd603a2155da8d0cfcdbf8ab24a2d54bca72795b90d2a3ed2b6da8d979dee2"},
{file = "psycopg2-2.9.10-cp39-cp39-win32.whl", hash = "sha256:9d5b3b94b79a844a986d029eee38998232451119ad653aea42bb9220a8c5066b"},
{file = "psycopg2-2.9.10-cp39-cp39-win_amd64.whl", hash = "sha256:88138c8dedcbfa96408023ea2b0c369eda40fe5d75002c0964c78f46f11fa442"},
{file = "psycopg2-2.9.10.tar.gz", hash = "sha256:12ec0b40b0273f95296233e8750441339298e6a572f7039da5b260e3c8b60e11"},
@@ -1660,22 +1658,19 @@ files = [
[[package]]
name = "pydantic"
version = "2.9.2"
version = "2.10.3"
description = "Data validation using Python type hints"
optional = false
python-versions = ">=3.8"
files = [
{file = "pydantic-2.9.2-py3-none-any.whl", hash = "sha256:f048cec7b26778210e28a0459867920654d48e5e62db0958433636cde4254f12"},
{file = "pydantic-2.9.2.tar.gz", hash = "sha256:d155cef71265d1e9807ed1c32b4c8deec042a44a50a4188b25ac67ecd81a9c0f"},
{file = "pydantic-2.10.3-py3-none-any.whl", hash = "sha256:be04d85bbc7b65651c5f8e6b9976ed9c6f41782a55524cef079a34a0bb82144d"},
{file = "pydantic-2.10.3.tar.gz", hash = "sha256:cb5ac360ce894ceacd69c403187900a02c4b20b693a9dd1d643e1effab9eadf9"},
]
[package.dependencies]
annotated-types = ">=0.6.0"
pydantic-core = "2.23.4"
typing-extensions = [
{version = ">=4.12.2", markers = "python_version >= \"3.13\""},
{version = ">=4.6.1", markers = "python_version < \"3.13\""},
]
pydantic-core = "2.27.1"
typing-extensions = ">=4.12.2"
[package.extras]
email = ["email-validator (>=2.0.0)"]
@@ -1683,100 +1678,111 @@ timezone = ["tzdata"]
[[package]]
name = "pydantic-core"
version = "2.23.4"
version = "2.27.1"
description = "Core functionality for Pydantic validation and serialization"
optional = false
python-versions = ">=3.8"
files = [
{file = "pydantic_core-2.23.4-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:b10bd51f823d891193d4717448fab065733958bdb6a6b351967bd349d48d5c9b"},
{file = "pydantic_core-2.23.4-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:4fc714bdbfb534f94034efaa6eadd74e5b93c8fa6315565a222f7b6f42ca1166"},
{file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:63e46b3169866bd62849936de036f901a9356e36376079b05efa83caeaa02ceb"},
{file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ed1a53de42fbe34853ba90513cea21673481cd81ed1be739f7f2efb931b24916"},
{file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:cfdd16ab5e59fc31b5e906d1a3f666571abc367598e3e02c83403acabc092e07"},
{file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:255a8ef062cbf6674450e668482456abac99a5583bbafb73f9ad469540a3a232"},
{file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4a7cd62e831afe623fbb7aabbb4fe583212115b3ef38a9f6b71869ba644624a2"},
{file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f09e2ff1f17c2b51f2bc76d1cc33da96298f0a036a137f5440ab3ec5360b624f"},
{file = "pydantic_core-2.23.4-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:e38e63e6f3d1cec5a27e0afe90a085af8b6806ee208b33030e65b6516353f1a3"},
{file = "pydantic_core-2.23.4-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:0dbd8dbed2085ed23b5c04afa29d8fd2771674223135dc9bc937f3c09284d071"},
{file = "pydantic_core-2.23.4-cp310-none-win32.whl", hash = "sha256:6531b7ca5f951d663c339002e91aaebda765ec7d61b7d1e3991051906ddde119"},
{file = "pydantic_core-2.23.4-cp310-none-win_amd64.whl", hash = "sha256:7c9129eb40958b3d4500fa2467e6a83356b3b61bfff1b414c7361d9220f9ae8f"},
{file = "pydantic_core-2.23.4-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:77733e3892bb0a7fa797826361ce8a9184d25c8dffaec60b7ffe928153680ba8"},
{file = "pydantic_core-2.23.4-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1b84d168f6c48fabd1f2027a3d1bdfe62f92cade1fb273a5d68e621da0e44e6d"},
{file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:df49e7a0861a8c36d089c1ed57d308623d60416dab2647a4a17fe050ba85de0e"},
{file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ff02b6d461a6de369f07ec15e465a88895f3223eb75073ffea56b84d9331f607"},
{file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:996a38a83508c54c78a5f41456b0103c30508fed9abcad0a59b876d7398f25fd"},
{file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d97683ddee4723ae8c95d1eddac7c192e8c552da0c73a925a89fa8649bf13eea"},
{file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:216f9b2d7713eb98cb83c80b9c794de1f6b7e3145eef40400c62e86cee5f4e1e"},
{file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:6f783e0ec4803c787bcea93e13e9932edab72068f68ecffdf86a99fd5918878b"},
{file = "pydantic_core-2.23.4-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:d0776dea117cf5272382634bd2a5c1b6eb16767c223c6a5317cd3e2a757c61a0"},
{file = "pydantic_core-2.23.4-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:d5f7a395a8cf1621939692dba2a6b6a830efa6b3cee787d82c7de1ad2930de64"},
{file = "pydantic_core-2.23.4-cp311-none-win32.whl", hash = "sha256:74b9127ffea03643e998e0c5ad9bd3811d3dac8c676e47db17b0ee7c3c3bf35f"},
{file = "pydantic_core-2.23.4-cp311-none-win_amd64.whl", hash = "sha256:98d134c954828488b153d88ba1f34e14259284f256180ce659e8d83e9c05eaa3"},
{file = "pydantic_core-2.23.4-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:f3e0da4ebaef65158d4dfd7d3678aad692f7666877df0002b8a522cdf088f231"},
{file = "pydantic_core-2.23.4-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:f69a8e0b033b747bb3e36a44e7732f0c99f7edd5cea723d45bc0d6e95377ffee"},
{file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:723314c1d51722ab28bfcd5240d858512ffd3116449c557a1336cbe3919beb87"},
{file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:bb2802e667b7051a1bebbfe93684841cc9351004e2badbd6411bf357ab8d5ac8"},
{file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d18ca8148bebe1b0a382a27a8ee60350091a6ddaf475fa05ef50dc35b5df6327"},
{file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:33e3d65a85a2a4a0dc3b092b938a4062b1a05f3a9abde65ea93b233bca0e03f2"},
{file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:128585782e5bfa515c590ccee4b727fb76925dd04a98864182b22e89a4e6ed36"},
{file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:68665f4c17edcceecc112dfed5dbe6f92261fb9d6054b47d01bf6371a6196126"},
{file = "pydantic_core-2.23.4-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:20152074317d9bed6b7a95ade3b7d6054845d70584216160860425f4fbd5ee9e"},
{file = "pydantic_core-2.23.4-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:9261d3ce84fa1d38ed649c3638feefeae23d32ba9182963e465d58d62203bd24"},
{file = "pydantic_core-2.23.4-cp312-none-win32.whl", hash = "sha256:4ba762ed58e8d68657fc1281e9bb72e1c3e79cc5d464be146e260c541ec12d84"},
{file = "pydantic_core-2.23.4-cp312-none-win_amd64.whl", hash = "sha256:97df63000f4fea395b2824da80e169731088656d1818a11b95f3b173747b6cd9"},
{file = "pydantic_core-2.23.4-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:7530e201d10d7d14abce4fb54cfe5b94a0aefc87da539d0346a484ead376c3cc"},
{file = "pydantic_core-2.23.4-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:df933278128ea1cd77772673c73954e53a1c95a4fdf41eef97c2b779271bd0bd"},
{file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0cb3da3fd1b6a5d0279a01877713dbda118a2a4fc6f0d821a57da2e464793f05"},
{file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:42c6dcb030aefb668a2b7009c85b27f90e51e6a3b4d5c9bc4c57631292015b0d"},
{file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:696dd8d674d6ce621ab9d45b205df149399e4bb9aa34102c970b721554828510"},
{file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2971bb5ffe72cc0f555c13e19b23c85b654dd2a8f7ab493c262071377bfce9f6"},
{file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8394d940e5d400d04cad4f75c0598665cbb81aecefaca82ca85bd28264af7f9b"},
{file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:0dff76e0602ca7d4cdaacc1ac4c005e0ce0dcfe095d5b5259163a80d3a10d327"},
{file = "pydantic_core-2.23.4-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:7d32706badfe136888bdea71c0def994644e09fff0bfe47441deaed8e96fdbc6"},
{file = "pydantic_core-2.23.4-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:ed541d70698978a20eb63d8c5d72f2cc6d7079d9d90f6b50bad07826f1320f5f"},
{file = "pydantic_core-2.23.4-cp313-none-win32.whl", hash = "sha256:3d5639516376dce1940ea36edf408c554475369f5da2abd45d44621cb616f769"},
{file = "pydantic_core-2.23.4-cp313-none-win_amd64.whl", hash = "sha256:5a1504ad17ba4210df3a045132a7baeeba5a200e930f57512ee02909fc5c4cb5"},
{file = "pydantic_core-2.23.4-cp38-cp38-macosx_10_12_x86_64.whl", hash = "sha256:d4488a93b071c04dc20f5cecc3631fc78b9789dd72483ba15d423b5b3689b555"},
{file = "pydantic_core-2.23.4-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:81965a16b675b35e1d09dd14df53f190f9129c0202356ed44ab2728b1c905658"},
{file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4ffa2ebd4c8530079140dd2d7f794a9d9a73cbb8e9d59ffe24c63436efa8f271"},
{file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:61817945f2fe7d166e75fbfb28004034b48e44878177fc54d81688e7b85a3665"},
{file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:29d2c342c4bc01b88402d60189f3df065fb0dda3654744d5a165a5288a657368"},
{file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5e11661ce0fd30a6790e8bcdf263b9ec5988e95e63cf901972107efc49218b13"},
{file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9d18368b137c6295db49ce7218b1a9ba15c5bc254c96d7c9f9e924a9bc7825ad"},
{file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:ec4e55f79b1c4ffb2eecd8a0cfba9955a2588497d96851f4c8f99aa4a1d39b12"},
{file = "pydantic_core-2.23.4-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:374a5e5049eda9e0a44c696c7ade3ff355f06b1fe0bb945ea3cac2bc336478a2"},
{file = "pydantic_core-2.23.4-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:5c364564d17da23db1106787675fc7af45f2f7b58b4173bfdd105564e132e6fb"},
{file = "pydantic_core-2.23.4-cp38-none-win32.whl", hash = "sha256:d7a80d21d613eec45e3d41eb22f8f94ddc758a6c4720842dc74c0581f54993d6"},
{file = "pydantic_core-2.23.4-cp38-none-win_amd64.whl", hash = "sha256:5f5ff8d839f4566a474a969508fe1c5e59c31c80d9e140566f9a37bba7b8d556"},
{file = "pydantic_core-2.23.4-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:a4fa4fc04dff799089689f4fd502ce7d59de529fc2f40a2c8836886c03e0175a"},
{file = "pydantic_core-2.23.4-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:0a7df63886be5e270da67e0966cf4afbae86069501d35c8c1b3b6c168f42cb36"},
{file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dcedcd19a557e182628afa1d553c3895a9f825b936415d0dbd3cd0bbcfd29b4b"},
{file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:5f54b118ce5de9ac21c363d9b3caa6c800341e8c47a508787e5868c6b79c9323"},
{file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:86d2f57d3e1379a9525c5ab067b27dbb8a0642fb5d454e17a9ac434f9ce523e3"},
{file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:de6d1d1b9e5101508cb37ab0d972357cac5235f5c6533d1071964c47139257df"},
{file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1278e0d324f6908e872730c9102b0112477a7f7cf88b308e4fc36ce1bdb6d58c"},
{file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:9a6b5099eeec78827553827f4c6b8615978bb4b6a88e5d9b93eddf8bb6790f55"},
{file = "pydantic_core-2.23.4-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:e55541f756f9b3ee346b840103f32779c695a19826a4c442b7954550a0972040"},
{file = "pydantic_core-2.23.4-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:a5c7ba8ffb6d6f8f2ab08743be203654bb1aaa8c9dcb09f82ddd34eadb695605"},
{file = "pydantic_core-2.23.4-cp39-none-win32.whl", hash = "sha256:37b0fe330e4a58d3c58b24d91d1eb102aeec675a3db4c292ec3928ecd892a9a6"},
{file = "pydantic_core-2.23.4-cp39-none-win_amd64.whl", hash = "sha256:1498bec4c05c9c787bde9125cfdcc63a41004ff167f495063191b863399b1a29"},
{file = "pydantic_core-2.23.4-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:f455ee30a9d61d3e1a15abd5068827773d6e4dc513e795f380cdd59932c782d5"},
{file = "pydantic_core-2.23.4-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:1e90d2e3bd2c3863d48525d297cd143fe541be8bbf6f579504b9712cb6b643ec"},
{file = "pydantic_core-2.23.4-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2e203fdf807ac7e12ab59ca2bfcabb38c7cf0b33c41efeb00f8e5da1d86af480"},
{file = "pydantic_core-2.23.4-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e08277a400de01bc72436a0ccd02bdf596631411f592ad985dcee21445bd0068"},
{file = "pydantic_core-2.23.4-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f220b0eea5965dec25480b6333c788fb72ce5f9129e8759ef876a1d805d00801"},
{file = "pydantic_core-2.23.4-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:d06b0c8da4f16d1d1e352134427cb194a0a6e19ad5db9161bf32b2113409e728"},
{file = "pydantic_core-2.23.4-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:ba1a0996f6c2773bd83e63f18914c1de3c9dd26d55f4ac302a7efe93fb8e7433"},
{file = "pydantic_core-2.23.4-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:9a5bce9d23aac8f0cf0836ecfc033896aa8443b501c58d0602dbfd5bd5b37753"},
{file = "pydantic_core-2.23.4-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:78ddaaa81421a29574a682b3179d4cf9e6d405a09b99d93ddcf7e5239c742e21"},
{file = "pydantic_core-2.23.4-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:883a91b5dd7d26492ff2f04f40fbb652de40fcc0afe07e8129e8ae779c2110eb"},
{file = "pydantic_core-2.23.4-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:88ad334a15b32a791ea935af224b9de1bf99bcd62fabf745d5f3442199d86d59"},
{file = "pydantic_core-2.23.4-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:233710f069d251feb12a56da21e14cca67994eab08362207785cf8c598e74577"},
{file = "pydantic_core-2.23.4-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:19442362866a753485ba5e4be408964644dd6a09123d9416c54cd49171f50744"},
{file = "pydantic_core-2.23.4-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:624e278a7d29b6445e4e813af92af37820fafb6dcc55c012c834f9e26f9aaaef"},
{file = "pydantic_core-2.23.4-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:f5ef8f42bec47f21d07668a043f077d507e5bf4e668d5c6dfe6aaba89de1a5b8"},
{file = "pydantic_core-2.23.4-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:aea443fffa9fbe3af1a9ba721a87f926fe548d32cab71d188a6ede77d0ff244e"},
{file = "pydantic_core-2.23.4.tar.gz", hash = "sha256:2584f7cf844ac4d970fba483a717dbe10c1c1c96a969bf65d61ffe94df1b2863"},
{file = "pydantic_core-2.27.1-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:71a5e35c75c021aaf400ac048dacc855f000bdfed91614b4a726f7432f1f3d6a"},
{file = "pydantic_core-2.27.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:f82d068a2d6ecfc6e054726080af69a6764a10015467d7d7b9f66d6ed5afa23b"},
{file = "pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:121ceb0e822f79163dd4699e4c54f5ad38b157084d97b34de8b232bcaad70278"},
{file = "pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:4603137322c18eaf2e06a4495f426aa8d8388940f3c457e7548145011bb68e05"},
{file = "pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a33cd6ad9017bbeaa9ed78a2e0752c5e250eafb9534f308e7a5f7849b0b1bfb4"},
{file = "pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:15cc53a3179ba0fcefe1e3ae50beb2784dede4003ad2dfd24f81bba4b23a454f"},
{file = "pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45d9c5eb9273aa50999ad6adc6be5e0ecea7e09dbd0d31bd0c65a55a2592ca08"},
{file = "pydantic_core-2.27.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:8bf7b66ce12a2ac52d16f776b31d16d91033150266eb796967a7e4621707e4f6"},
{file = "pydantic_core-2.27.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:655d7dd86f26cb15ce8a431036f66ce0318648f8853d709b4167786ec2fa4807"},
{file = "pydantic_core-2.27.1-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:5556470f1a2157031e676f776c2bc20acd34c1990ca5f7e56f1ebf938b9ab57c"},
{file = "pydantic_core-2.27.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:f69ed81ab24d5a3bd93861c8c4436f54afdf8e8cc421562b0c7504cf3be58206"},
{file = "pydantic_core-2.27.1-cp310-none-win32.whl", hash = "sha256:f5a823165e6d04ccea61a9f0576f345f8ce40ed533013580e087bd4d7442b52c"},
{file = "pydantic_core-2.27.1-cp310-none-win_amd64.whl", hash = "sha256:57866a76e0b3823e0b56692d1a0bf722bffb324839bb5b7226a7dbd6c9a40b17"},
{file = "pydantic_core-2.27.1-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:ac3b20653bdbe160febbea8aa6c079d3df19310d50ac314911ed8cc4eb7f8cb8"},
{file = "pydantic_core-2.27.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:a5a8e19d7c707c4cadb8c18f5f60c843052ae83c20fa7d44f41594c644a1d330"},
{file = "pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7f7059ca8d64fea7f238994c97d91f75965216bcbe5f695bb44f354893f11d52"},
{file = "pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:bed0f8a0eeea9fb72937ba118f9db0cb7e90773462af7962d382445f3005e5a4"},
{file = "pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a3cb37038123447cf0f3ea4c74751f6a9d7afef0eb71aa07bf5f652b5e6a132c"},
{file = "pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:84286494f6c5d05243456e04223d5a9417d7f443c3b76065e75001beb26f88de"},
{file = "pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:acc07b2cfc5b835444b44a9956846b578d27beeacd4b52e45489e93276241025"},
{file = "pydantic_core-2.27.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:4fefee876e07a6e9aad7a8c8c9f85b0cdbe7df52b8a9552307b09050f7512c7e"},
{file = "pydantic_core-2.27.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:258c57abf1188926c774a4c94dd29237e77eda19462e5bb901d88adcab6af919"},
{file = "pydantic_core-2.27.1-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:35c14ac45fcfdf7167ca76cc80b2001205a8d5d16d80524e13508371fb8cdd9c"},
{file = "pydantic_core-2.27.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:d1b26e1dff225c31897696cab7d4f0a315d4c0d9e8666dbffdb28216f3b17fdc"},
{file = "pydantic_core-2.27.1-cp311-none-win32.whl", hash = "sha256:2cdf7d86886bc6982354862204ae3b2f7f96f21a3eb0ba5ca0ac42c7b38598b9"},
{file = "pydantic_core-2.27.1-cp311-none-win_amd64.whl", hash = "sha256:3af385b0cee8df3746c3f406f38bcbfdc9041b5c2d5ce3e5fc6637256e60bbc5"},
{file = "pydantic_core-2.27.1-cp311-none-win_arm64.whl", hash = "sha256:81f2ec23ddc1b476ff96563f2e8d723830b06dceae348ce02914a37cb4e74b89"},
{file = "pydantic_core-2.27.1-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:9cbd94fc661d2bab2bc702cddd2d3370bbdcc4cd0f8f57488a81bcce90c7a54f"},
{file = "pydantic_core-2.27.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:5f8c4718cd44ec1580e180cb739713ecda2bdee1341084c1467802a417fe0f02"},
{file = "pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:15aae984e46de8d376df515f00450d1522077254ef6b7ce189b38ecee7c9677c"},
{file = "pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:1ba5e3963344ff25fc8c40da90f44b0afca8cfd89d12964feb79ac1411a260ac"},
{file = "pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:992cea5f4f3b29d6b4f7f1726ed8ee46c8331c6b4eed6db5b40134c6fe1768bb"},
{file = "pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0325336f348dbee6550d129b1627cb8f5351a9dc91aad141ffb96d4937bd9529"},
{file = "pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7597c07fbd11515f654d6ece3d0e4e5093edc30a436c63142d9a4b8e22f19c35"},
{file = "pydantic_core-2.27.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:3bbd5d8cc692616d5ef6fbbbd50dbec142c7e6ad9beb66b78a96e9c16729b089"},
{file = "pydantic_core-2.27.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:dc61505e73298a84a2f317255fcc72b710b72980f3a1f670447a21efc88f8381"},
{file = "pydantic_core-2.27.1-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:e1f735dc43da318cad19b4173dd1ffce1d84aafd6c9b782b3abc04a0d5a6f5bb"},
{file = "pydantic_core-2.27.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:f4e5658dbffe8843a0f12366a4c2d1c316dbe09bb4dfbdc9d2d9cd6031de8aae"},
{file = "pydantic_core-2.27.1-cp312-none-win32.whl", hash = "sha256:672ebbe820bb37988c4d136eca2652ee114992d5d41c7e4858cdd90ea94ffe5c"},
{file = "pydantic_core-2.27.1-cp312-none-win_amd64.whl", hash = "sha256:66ff044fd0bb1768688aecbe28b6190f6e799349221fb0de0e6f4048eca14c16"},
{file = "pydantic_core-2.27.1-cp312-none-win_arm64.whl", hash = "sha256:9a3b0793b1bbfd4146304e23d90045f2a9b5fd5823aa682665fbdaf2a6c28f3e"},
{file = "pydantic_core-2.27.1-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:f216dbce0e60e4d03e0c4353c7023b202d95cbaeff12e5fd2e82ea0a66905073"},
{file = "pydantic_core-2.27.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:a2e02889071850bbfd36b56fd6bc98945e23670773bc7a76657e90e6b6603c08"},
{file = "pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42b0e23f119b2b456d07ca91b307ae167cc3f6c846a7b169fca5326e32fdc6cf"},
{file = "pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:764be71193f87d460a03f1f7385a82e226639732214b402f9aa61f0d025f0737"},
{file = "pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1c00666a3bd2f84920a4e94434f5974d7bbc57e461318d6bb34ce9cdbbc1f6b2"},
{file = "pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3ccaa88b24eebc0f849ce0a4d09e8a408ec5a94afff395eb69baf868f5183107"},
{file = "pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c65af9088ac534313e1963443d0ec360bb2b9cba6c2909478d22c2e363d98a51"},
{file = "pydantic_core-2.27.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:206b5cf6f0c513baffaeae7bd817717140770c74528f3e4c3e1cec7871ddd61a"},
{file = "pydantic_core-2.27.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:062f60e512fc7fff8b8a9d680ff0ddaaef0193dba9fa83e679c0c5f5fbd018bc"},
{file = "pydantic_core-2.27.1-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:a0697803ed7d4af5e4c1adf1670af078f8fcab7a86350e969f454daf598c4960"},
{file = "pydantic_core-2.27.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:58ca98a950171f3151c603aeea9303ef6c235f692fe555e883591103da709b23"},
{file = "pydantic_core-2.27.1-cp313-none-win32.whl", hash = "sha256:8065914ff79f7eab1599bd80406681f0ad08f8e47c880f17b416c9f8f7a26d05"},
{file = "pydantic_core-2.27.1-cp313-none-win_amd64.whl", hash = "sha256:ba630d5e3db74c79300d9a5bdaaf6200172b107f263c98a0539eeecb857b2337"},
{file = "pydantic_core-2.27.1-cp313-none-win_arm64.whl", hash = "sha256:45cf8588c066860b623cd11c4ba687f8d7175d5f7ef65f7129df8a394c502de5"},
{file = "pydantic_core-2.27.1-cp38-cp38-macosx_10_12_x86_64.whl", hash = "sha256:5897bec80a09b4084aee23f9b73a9477a46c3304ad1d2d07acca19723fb1de62"},
{file = "pydantic_core-2.27.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:d0165ab2914379bd56908c02294ed8405c252250668ebcb438a55494c69f44ab"},
{file = "pydantic_core-2.27.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6b9af86e1d8e4cfc82c2022bfaa6f459381a50b94a29e95dcdda8442d6d83864"},
{file = "pydantic_core-2.27.1-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:5f6c8a66741c5f5447e047ab0ba7a1c61d1e95580d64bce852e3df1f895c4067"},
{file = "pydantic_core-2.27.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9a42d6a8156ff78981f8aa56eb6394114e0dedb217cf8b729f438f643608cbcd"},
{file = "pydantic_core-2.27.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:64c65f40b4cd8b0e049a8edde07e38b476da7e3aaebe63287c899d2cff253fa5"},
{file = "pydantic_core-2.27.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9fdcf339322a3fae5cbd504edcefddd5a50d9ee00d968696846f089b4432cf78"},
{file = "pydantic_core-2.27.1-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:bf99c8404f008750c846cb4ac4667b798a9f7de673ff719d705d9b2d6de49c5f"},
{file = "pydantic_core-2.27.1-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:8f1edcea27918d748c7e5e4d917297b2a0ab80cad10f86631e488b7cddf76a36"},
{file = "pydantic_core-2.27.1-cp38-cp38-musllinux_1_1_armv7l.whl", hash = "sha256:159cac0a3d096f79ab6a44d77a961917219707e2a130739c64d4dd46281f5c2a"},
{file = "pydantic_core-2.27.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:029d9757eb621cc6e1848fa0b0310310de7301057f623985698ed7ebb014391b"},
{file = "pydantic_core-2.27.1-cp38-none-win32.whl", hash = "sha256:a28af0695a45f7060e6f9b7092558a928a28553366519f64083c63a44f70e618"},
{file = "pydantic_core-2.27.1-cp38-none-win_amd64.whl", hash = "sha256:2d4567c850905d5eaaed2f7a404e61012a51caf288292e016360aa2b96ff38d4"},
{file = "pydantic_core-2.27.1-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:e9386266798d64eeb19dd3677051f5705bf873e98e15897ddb7d76f477131967"},
{file = "pydantic_core-2.27.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:4228b5b646caa73f119b1ae756216b59cc6e2267201c27d3912b592c5e323b60"},
{file = "pydantic_core-2.27.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0b3dfe500de26c52abe0477dde16192ac39c98f05bf2d80e76102d394bd13854"},
{file = "pydantic_core-2.27.1-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:aee66be87825cdf72ac64cb03ad4c15ffef4143dbf5c113f64a5ff4f81477bf9"},
{file = "pydantic_core-2.27.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3b748c44bb9f53031c8cbc99a8a061bc181c1000c60a30f55393b6e9c45cc5bd"},
{file = "pydantic_core-2.27.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5ca038c7f6a0afd0b2448941b6ef9d5e1949e999f9e5517692eb6da58e9d44be"},
{file = "pydantic_core-2.27.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6e0bd57539da59a3e4671b90a502da9a28c72322a4f17866ba3ac63a82c4498e"},
{file = "pydantic_core-2.27.1-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:ac6c2c45c847bbf8f91930d88716a0fb924b51e0c6dad329b793d670ec5db792"},
{file = "pydantic_core-2.27.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:b94d4ba43739bbe8b0ce4262bcc3b7b9f31459ad120fb595627eaeb7f9b9ca01"},
{file = "pydantic_core-2.27.1-cp39-cp39-musllinux_1_1_armv7l.whl", hash = "sha256:00e6424f4b26fe82d44577b4c842d7df97c20be6439e8e685d0d715feceb9fb9"},
{file = "pydantic_core-2.27.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:38de0a70160dd97540335b7ad3a74571b24f1dc3ed33f815f0880682e6880131"},
{file = "pydantic_core-2.27.1-cp39-none-win32.whl", hash = "sha256:7ccebf51efc61634f6c2344da73e366c75e735960b5654b63d7e6f69a5885fa3"},
{file = "pydantic_core-2.27.1-cp39-none-win_amd64.whl", hash = "sha256:a57847b090d7892f123726202b7daa20df6694cbd583b67a592e856bff603d6c"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:3fa80ac2bd5856580e242dbc202db873c60a01b20309c8319b5c5986fbe53ce6"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:d950caa237bb1954f1b8c9227b5065ba6875ac9771bb8ec790d956a699b78676"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0e4216e64d203e39c62df627aa882f02a2438d18a5f21d7f721621f7a5d3611d"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:02a3d637bd387c41d46b002f0e49c52642281edacd2740e5a42f7017feea3f2c"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:161c27ccce13b6b0c8689418da3885d3220ed2eae2ea5e9b2f7f3d48f1d52c27"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:19910754e4cc9c63bc1c7f6d73aa1cfee82f42007e407c0f413695c2f7ed777f"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:e173486019cc283dc9778315fa29a363579372fe67045e971e89b6365cc035ed"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:af52d26579b308921b73b956153066481f064875140ccd1dfd4e77db89dbb12f"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:981fb88516bd1ae8b0cbbd2034678a39dedc98752f264ac9bc5839d3923fa04c"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:5fde892e6c697ce3e30c61b239330fc5d569a71fefd4eb6512fc6caec9dd9e2f"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:816f5aa087094099fff7edabb5e01cc370eb21aa1a1d44fe2d2aefdfb5599b31"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9c10c309e18e443ddb108f0ef64e8729363adbfd92d6d57beec680f6261556f3"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:98476c98b02c8e9b2eec76ac4156fd006628b1b2d0ef27e548ffa978393fd154"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:c3027001c28434e7ca5a6e1e527487051136aa81803ac812be51802150d880dd"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:7699b1df36a48169cdebda7ab5a2bac265204003f153b4bd17276153d997670a"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:1c39b07d90be6b48968ddc8c19e7585052088fd7ec8d568bb31ff64c70ae3c97"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:46ccfe3032b3915586e469d4972973f893c0a2bb65669194a5bdea9bacc088c2"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:62ba45e21cf6571d7f716d903b5b7b6d2617e2d5d67c0923dc47b9d41369f840"},
{file = "pydantic_core-2.27.1.tar.gz", hash = "sha256:62a763352879b84aa31058fc931884055fd75089cccbd9d58bb6afd01141b235"},
]
[package.dependencies]
@@ -1817,12 +1823,12 @@ plugins = ["importlib-metadata"]
[[package]]
name = "pyicu"
version = "2.13.1"
version = "2.14"
description = "Python extension wrapping the ICU C++ API"
optional = true
python-versions = "*"
files = [
{file = "PyICU-2.13.1.tar.gz", hash = "sha256:d4919085eaa07da12bade8ee721e7bbf7ade0151ca0f82946a26c8f4b98cdceb"},
{file = "PyICU-2.14.tar.gz", hash = "sha256:acc7eb92bd5c554ed577249c6978450a4feda0aa6f01470152b3a7b382a02132"},
]
[[package]]
@@ -1899,31 +1905,31 @@ tests = ["hypothesis (>=3.27.0)", "pytest (>=3.2.1,!=3.3.0)"]
[[package]]
name = "pyopenssl"
version = "24.2.1"
version = "24.3.0"
description = "Python wrapper module around the OpenSSL library"
optional = false
python-versions = ">=3.7"
files = [
{file = "pyOpenSSL-24.2.1-py3-none-any.whl", hash = "sha256:967d5719b12b243588573f39b0c677637145c7a1ffedcd495a487e58177fbb8d"},
{file = "pyopenssl-24.2.1.tar.gz", hash = "sha256:4247f0dbe3748d560dcbb2ff3ea01af0f9a1a001ef5f7c4c647956ed8cbf0e95"},
{file = "pyOpenSSL-24.3.0-py3-none-any.whl", hash = "sha256:e474f5a473cd7f92221cc04976e48f4d11502804657a08a989fb3be5514c904a"},
{file = "pyopenssl-24.3.0.tar.gz", hash = "sha256:49f7a019577d834746bc55c5fce6ecbcec0f2b4ec5ce1cf43a9a173b8138bb36"},
]
[package.dependencies]
cryptography = ">=41.0.5,<44"
cryptography = ">=41.0.5,<45"
[package.extras]
docs = ["sphinx (!=5.2.0,!=5.2.0.post0,!=7.2.5)", "sphinx-rtd-theme"]
docs = ["sphinx (!=5.2.0,!=5.2.0.post0,!=7.2.5)", "sphinx_rtd_theme"]
test = ["pretend", "pytest (>=3.0.1)", "pytest-rerunfailures"]
[[package]]
name = "pysaml2"
version = "7.3.1"
version = "7.5.0"
description = "Python implementation of SAML Version 2 Standard"
optional = true
python-versions = ">=3.6.2,<4.0.0"
python-versions = ">=3.9,<4.0"
files = [
{file = "pysaml2-7.3.1-py3-none-any.whl", hash = "sha256:2cc66e7a371d3f5ff9601f0ed93b5276cca816fce82bb38447d5a0651f2f5193"},
{file = "pysaml2-7.3.1.tar.gz", hash = "sha256:eab22d187c6dd7707c58b5bb1688f9b8e816427667fc99d77f54399e15cd0a0a"},
{file = "pysaml2-7.5.0-py3-none-any.whl", hash = "sha256:bc6627cc344476a83c757f440a73fda1369f13b6fda1b4e16bca63ffbabb5318"},
{file = "pysaml2-7.5.0.tar.gz", hash = "sha256:f36871d4e5ee857c6b85532e942550d2cf90ea4ee943d75eb681044bbc4f54f7"},
]
[package.dependencies]
@@ -1933,7 +1939,7 @@ pyopenssl = "*"
python-dateutil = "*"
pytz = "*"
requests = ">=2,<3"
xmlschema = ">=1.2.1"
xmlschema = ">=2,<3"
[package.extras]
s2repoze = ["paste", "repoze.who", "zope.interface"]
@@ -1954,13 +1960,13 @@ six = ">=1.5"
[[package]]
name = "python-multipart"
version = "0.0.16"
version = "0.0.20"
description = "A streaming multipart parser for Python"
optional = false
python-versions = ">=3.8"
files = [
{file = "python_multipart-0.0.16-py3-none-any.whl", hash = "sha256:c2759b7b976ef3937214dfb592446b59dfaa5f04682a076f78b117c94776d87a"},
{file = "python_multipart-0.0.16.tar.gz", hash = "sha256:8dee37b88dab9b59922ca173c35acb627cc12ec74019f5cd4578369c6df36554"},
{file = "python_multipart-0.0.20-py3-none-any.whl", hash = "sha256:8a62d3a8335e06589fe01f2a3e178cdcc632f3fbe0d492ad9ee0ec35aab1f104"},
{file = "python_multipart-0.0.20.tar.gz", hash = "sha256:8dd0cab45b8e23064ae09147625994d090fa46f5b0d1e13af944c331a7fa9d13"},
]
[[package]]
@@ -2313,13 +2319,13 @@ doc = ["Sphinx", "sphinx-rtd-theme"]
[[package]]
name = "sentry-sdk"
version = "2.17.0"
version = "2.19.2"
description = "Python client for Sentry (https://sentry.io)"
optional = true
python-versions = ">=3.6"
files = [
{file = "sentry_sdk-2.17.0-py2.py3-none-any.whl", hash = "sha256:625955884b862cc58748920f9e21efdfb8e0d4f98cca4ab0d3918576d5b606ad"},
{file = "sentry_sdk-2.17.0.tar.gz", hash = "sha256:dd0a05352b78ffeacced73a94e86f38b32e2eae15fff5f30ca5abb568a72eacf"},
{file = "sentry_sdk-2.19.2-py2.py3-none-any.whl", hash = "sha256:ebdc08228b4d131128e568d696c210d846e5b9d70aa0327dec6b1272d9d40b84"},
{file = "sentry_sdk-2.19.2.tar.gz", hash = "sha256:467df6e126ba242d39952375dd816fbee0f217d119bf454a8ce74cf1e7909e8d"},
]
[package.dependencies]
@@ -2345,14 +2351,16 @@ grpcio = ["grpcio (>=1.21.1)", "protobuf (>=3.8.0)"]
http2 = ["httpcore[http2] (==1.*)"]
httpx = ["httpx (>=0.16.0)"]
huey = ["huey (>=2)"]
huggingface-hub = ["huggingface-hub (>=0.22)"]
huggingface-hub = ["huggingface_hub (>=0.22)"]
langchain = ["langchain (>=0.0.210)"]
launchdarkly = ["launchdarkly-server-sdk (>=9.8.0)"]
litestar = ["litestar (>=2.0.0)"]
loguru = ["loguru (>=0.5)"]
openai = ["openai (>=1.0.0)", "tiktoken (>=0.3.0)"]
openfeature = ["openfeature-sdk (>=0.7.1)"]
opentelemetry = ["opentelemetry-distro (>=0.35b0)"]
opentelemetry-experimental = ["opentelemetry-distro"]
pure-eval = ["asttokens", "executing", "pure-eval"]
pure-eval = ["asttokens", "executing", "pure_eval"]
pymongo = ["pymongo (>=3.1)"]
pyspark = ["pyspark (>=2.4.4)"]
quart = ["blinker (>=1.1)", "quart (>=0.16.1)"]
@@ -2405,19 +2413,18 @@ test = ["build[virtualenv] (>=1.0.3)", "filelock (>=3.4.0)", "importlib-metadata
[[package]]
name = "setuptools-rust"
version = "1.8.1"
version = "1.10.2"
description = "Setuptools Rust extension plugin"
optional = false
python-versions = ">=3.8"
files = [
{file = "setuptools-rust-1.8.1.tar.gz", hash = "sha256:94b1dd5d5308b3138d5b933c3a2b55e6d6927d1a22632e509fcea9ddd0f7e486"},
{file = "setuptools_rust-1.8.1-py3-none-any.whl", hash = "sha256:b5324493949ccd6aa0c03890c5f6b5f02de4512e3ac1697d02e9a6c02b18aa8e"},
{file = "setuptools_rust-1.10.2-py3-none-any.whl", hash = "sha256:4b39c435ae9670315d522ed08fa0e8cb29f2a6048033966b6be2571a90ce4f1c"},
{file = "setuptools_rust-1.10.2.tar.gz", hash = "sha256:5d73e7eee5f87a6417285b617c97088a7c20d1a70fcea60e3bdc94ff567c29dc"},
]
[package.dependencies]
semantic-version = ">=2.8.2,<3"
setuptools = ">=62.4"
tomli = {version = ">=1.2.1", markers = "python_version < \"3.11\""}
[[package]]
name = "signedjson"
@@ -2515,20 +2522,50 @@ twisted = ["twisted"]
[[package]]
name = "tomli"
version = "2.1.0"
version = "2.2.1"
description = "A lil' TOML parser"
optional = false
python-versions = ">=3.8"
files = [
{file = "tomli-2.1.0-py3-none-any.whl", hash = "sha256:a5c57c3d1c56f5ccdf89f6523458f60ef716e210fc47c4cfb188c5ba473e0391"},
{file = "tomli-2.1.0.tar.gz", hash = "sha256:3f646cae2aec94e17d04973e4249548320197cfabdf130015d023de4b74d8ab8"},
{file = "tomli-2.2.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:678e4fa69e4575eb77d103de3df8a895e1591b48e740211bd1067378c69e8249"},
{file = "tomli-2.2.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:023aa114dd824ade0100497eb2318602af309e5a55595f76b626d6d9f3b7b0a6"},
{file = "tomli-2.2.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ece47d672db52ac607a3d9599a9d48dcb2f2f735c6c2d1f34130085bb12b112a"},
{file = "tomli-2.2.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6972ca9c9cc9f0acaa56a8ca1ff51e7af152a9f87fb64623e31d5c83700080ee"},
{file = "tomli-2.2.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c954d2250168d28797dd4e3ac5cf812a406cd5a92674ee4c8f123c889786aa8e"},
{file = "tomli-2.2.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:8dd28b3e155b80f4d54beb40a441d366adcfe740969820caf156c019fb5c7ec4"},
{file = "tomli-2.2.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:e59e304978767a54663af13c07b3d1af22ddee3bb2fb0618ca1593e4f593a106"},
{file = "tomli-2.2.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:33580bccab0338d00994d7f16f4c4ec25b776af3ffaac1ed74e0b3fc95e885a8"},
{file = "tomli-2.2.1-cp311-cp311-win32.whl", hash = "sha256:465af0e0875402f1d226519c9904f37254b3045fc5084697cefb9bdde1ff99ff"},
{file = "tomli-2.2.1-cp311-cp311-win_amd64.whl", hash = "sha256:2d0f2fdd22b02c6d81637a3c95f8cd77f995846af7414c5c4b8d0545afa1bc4b"},
{file = "tomli-2.2.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:4a8f6e44de52d5e6c657c9fe83b562f5f4256d8ebbfe4ff922c495620a7f6cea"},
{file = "tomli-2.2.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8d57ca8095a641b8237d5b079147646153d22552f1c637fd3ba7f4b0b29167a8"},
{file = "tomli-2.2.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e340144ad7ae1533cb897d406382b4b6fede8890a03738ff1683af800d54192"},
{file = "tomli-2.2.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:db2b95f9de79181805df90bedc5a5ab4c165e6ec3fe99f970d0e302f384ad222"},
{file = "tomli-2.2.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:40741994320b232529c802f8bc86da4e1aa9f413db394617b9a256ae0f9a7f77"},
{file = "tomli-2.2.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:400e720fe168c0f8521520190686ef8ef033fb19fc493da09779e592861b78c6"},
{file = "tomli-2.2.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:02abe224de6ae62c19f090f68da4e27b10af2b93213d36cf44e6e1c5abd19fdd"},
{file = "tomli-2.2.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:b82ebccc8c8a36f2094e969560a1b836758481f3dc360ce9a3277c65f374285e"},
{file = "tomli-2.2.1-cp312-cp312-win32.whl", hash = "sha256:889f80ef92701b9dbb224e49ec87c645ce5df3fa2cc548664eb8a25e03127a98"},
{file = "tomli-2.2.1-cp312-cp312-win_amd64.whl", hash = "sha256:7fc04e92e1d624a4a63c76474610238576942d6b8950a2d7f908a340494e67e4"},
{file = "tomli-2.2.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f4039b9cbc3048b2416cc57ab3bda989a6fcf9b36cf8937f01a6e731b64f80d7"},
{file = "tomli-2.2.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:286f0ca2ffeeb5b9bd4fcc8d6c330534323ec51b2f52da063b11c502da16f30c"},
{file = "tomli-2.2.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a92ef1a44547e894e2a17d24e7557a5e85a9e1d0048b0b5e7541f76c5032cb13"},
{file = "tomli-2.2.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9316dc65bed1684c9a98ee68759ceaed29d229e985297003e494aa825ebb0281"},
{file = "tomli-2.2.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e85e99945e688e32d5a35c1ff38ed0b3f41f43fad8df0bdf79f72b2ba7bc5272"},
{file = "tomli-2.2.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:ac065718db92ca818f8d6141b5f66369833d4a80a9d74435a268c52bdfa73140"},
{file = "tomli-2.2.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:d920f33822747519673ee656a4b6ac33e382eca9d331c87770faa3eef562aeb2"},
{file = "tomli-2.2.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:a198f10c4d1b1375d7687bc25294306e551bf1abfa4eace6650070a5c1ae2744"},
{file = "tomli-2.2.1-cp313-cp313-win32.whl", hash = "sha256:d3f5614314d758649ab2ab3a62d4f2004c825922f9e370b29416484086b264ec"},
{file = "tomli-2.2.1-cp313-cp313-win_amd64.whl", hash = "sha256:a38aa0308e754b0e3c67e344754dff64999ff9b513e691d0e786265c93583c69"},
{file = "tomli-2.2.1-py3-none-any.whl", hash = "sha256:cb55c73c5f4408779d0cf3eef9f762b9c9f147a77de7b258bef0a5628adc85cc"},
{file = "tomli-2.2.1.tar.gz", hash = "sha256:cd45e1dc79c835ce60f7404ec8119f2eb06d38b1deba146f07ced3bbc44505ff"},
]
[[package]]
name = "tornado"
version = "6.4.2"
description = "Tornado is a Python web framework and asynchronous networking library, originally developed at FriendFeed."
optional = false
optional = true
python-versions = ">=3.8"
files = [
{file = "tornado-6.4.2-cp38-abi3-macosx_10_9_universal2.whl", hash = "sha256:e828cce1123e9e44ae2a50a9de3055497ab1d0aeb440c5ac23064d9e44880da1"},
@@ -2590,19 +2627,20 @@ docs = ["sphinx (<7.0.0)"]
[[package]]
name = "twine"
version = "5.1.1"
version = "6.0.1"
description = "Collection of utilities for publishing packages on PyPI"
optional = false
python-versions = ">=3.8"
files = [
{file = "twine-5.1.1-py3-none-any.whl", hash = "sha256:215dbe7b4b94c2c50a7315c0275d2258399280fbb7d04182c7e55e24b5f93997"},
{file = "twine-5.1.1.tar.gz", hash = "sha256:9aa0825139c02b3434d913545c7b847a21c835e11597f5255842d457da2322db"},
{file = "twine-6.0.1-py3-none-any.whl", hash = "sha256:9c6025b203b51521d53e200f4a08b116dee7500a38591668c6a6033117bdc218"},
{file = "twine-6.0.1.tar.gz", hash = "sha256:36158b09df5406e1c9c1fb8edb24fc2be387709443e7376689b938531582ee27"},
]
[package.dependencies]
importlib-metadata = ">=3.6"
keyring = ">=15.1"
pkginfo = ">=1.8.1,<1.11"
importlib-metadata = {version = ">=3.6", markers = "python_version < \"3.10\""}
keyring = {version = ">=15.1", markers = "platform_machine != \"ppc64le\" and platform_machine != \"s390x\""}
packaging = "*"
pkginfo = ">=1.8.1"
readme-renderer = ">=35.0"
requests = ">=2.20"
requests-toolbelt = ">=0.8.0,<0.9.0 || >0.9.0"
@@ -2610,6 +2648,9 @@ rfc3986 = ">=1.4.0"
rich = ">=12.0.0"
urllib3 = ">=1.26.0"
[package.extras]
keyring = ["keyring (>=15.1)"]
[[package]]
name = "twisted"
version = "24.7.0"
@@ -2665,13 +2706,13 @@ twisted = "*"
[[package]]
name = "types-bleach"
version = "6.1.0.20240331"
version = "6.2.0.20241123"
description = "Typing stubs for bleach"
optional = false
python-versions = ">=3.8"
files = [
{file = "types-bleach-6.1.0.20240331.tar.gz", hash = "sha256:2ee858a84fb06fc2225ff56ba2f7f6c88b65638659efae0d7bfd6b24a1b5a524"},
{file = "types_bleach-6.1.0.20240331-py3-none-any.whl", hash = "sha256:399bc59bfd20a36a56595f13f805e56c8a08e5a5c07903e5cf6fafb5a5107dd4"},
{file = "types_bleach-6.2.0.20241123-py3-none-any.whl", hash = "sha256:c6e58b3646665ca7c6b29890375390f4569e84f0cf5c171e0fe1ddb71a7be86a"},
{file = "types_bleach-6.2.0.20241123.tar.gz", hash = "sha256:dac5fe9015173514da3ac810c1a935619a3ccbcc5d66c4cbf4707eac00539057"},
]
[package.dependencies]

View File

@@ -97,7 +97,7 @@ module-name = "synapse.synapse_rust"
[tool.poetry]
name = "matrix-synapse"
version = "1.120.0"
version = "1.124.0rc2"
description = "Homeserver for the Matrix decentralised comms protocol"
authors = ["Matrix.org Team and Contributors <packages@matrix.org>"]
license = "AGPL-3.0-or-later"
@@ -386,8 +386,11 @@ build-backend = "poetry.core.masonry.api"
# c.f. https://github.com/matrix-org/synapse/pull/14259
skip = "cp36* cp37* cp38* pp37* pp38* *-musllinux_i686 pp*aarch64 *-musllinux_aarch64"
# We need a rust compiler
before-all = "curl https://sh.rustup.rs -sSf | sh -s -- --default-toolchain stable -y --profile minimal"
# We need a rust compiler.
#
# We temporarily pin Rust to 1.82.0 to work around
# https://github.com/element-hq/synapse/issues/17988
before-all = "curl https://sh.rustup.rs -sSf | sh -s -- --default-toolchain 1.82.0 -y --profile minimal"
environment= { PATH = "$PATH:$HOME/.cargo/bin" }
# For some reason if we don't manually clean the build directory we

View File

@@ -71,6 +71,34 @@ impl TryFrom<&str> for UserID {
}
}
impl TryFrom<String> for UserID {
type Error = IdentifierError;
/// Will try creating a `UserID` from the provided `String`.
/// Can fail if the user_id is incorrectly formatted.
fn try_from(s: String) -> Result<Self, Self::Error> {
if !s.starts_with('@') {
return Err(IdentifierError::IncorrectSigil);
}
if s.find(':').is_none() {
return Err(IdentifierError::MissingColon);
}
Ok(UserID(s))
}
}
impl<'de> serde::Deserialize<'de> for UserID {
fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
where
D: serde::Deserializer<'de>,
{
let s: String = serde::Deserialize::deserialize(deserializer)?;
UserID::try_from(s).map_err(serde::de::Error::custom)
}
}
impl Deref for UserID {
type Target = str;
@@ -84,3 +112,141 @@ impl fmt::Display for UserID {
write!(f, "{}", self.0)
}
}
/// A Matrix room_id.
#[derive(Clone, Debug, PartialEq)]
pub struct RoomID(String);
impl RoomID {
/// Returns the `localpart` of the room_id.
pub fn localpart(&self) -> &str {
&self[1..self.colon_pos()]
}
/// Returns the `server_name` / `domain` of the room_id.
pub fn server_name(&self) -> &str {
&self[self.colon_pos() + 1..]
}
/// Returns the position of the ':' inside of the room_id.
/// Used when splitting the room_id into its respective parts.
fn colon_pos(&self) -> usize {
self.find(':').unwrap()
}
}
impl TryFrom<&str> for RoomID {
type Error = IdentifierError;
/// Will try creating a `RoomID` from the provided `&str`.
/// Can fail if the room_id is incorrectly formatted.
fn try_from(s: &str) -> Result<Self, Self::Error> {
if !s.starts_with('!') {
return Err(IdentifierError::IncorrectSigil);
}
if s.find(':').is_none() {
return Err(IdentifierError::MissingColon);
}
Ok(RoomID(s.to_string()))
}
}
impl TryFrom<String> for RoomID {
type Error = IdentifierError;
/// Will try creating a `RoomID` from the provided `String`.
/// Can fail if the room_id is incorrectly formatted.
fn try_from(s: String) -> Result<Self, Self::Error> {
if !s.starts_with('!') {
return Err(IdentifierError::IncorrectSigil);
}
if s.find(':').is_none() {
return Err(IdentifierError::MissingColon);
}
Ok(RoomID(s))
}
}
impl<'de> serde::Deserialize<'de> for RoomID {
fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
where
D: serde::Deserializer<'de>,
{
let s: String = serde::Deserialize::deserialize(deserializer)?;
RoomID::try_from(s).map_err(serde::de::Error::custom)
}
}
impl Deref for RoomID {
type Target = str;
fn deref(&self) -> &Self::Target {
&self.0
}
}
impl fmt::Display for RoomID {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
write!(f, "{}", self.0)
}
}
/// A Matrix event_id.
#[derive(Clone, Debug, PartialEq)]
pub struct EventID(String);
impl TryFrom<&str> for EventID {
type Error = IdentifierError;
/// Will try creating an `EventID` from the provided `&str`.
/// Can fail if the event_id is incorrectly formatted.
fn try_from(s: &str) -> Result<Self, Self::Error> {
if !s.starts_with('$') {
return Err(IdentifierError::IncorrectSigil);
}
Ok(EventID(s.to_string()))
}
}
impl TryFrom<String> for EventID {
type Error = IdentifierError;
/// Will try creating an `EventID` from the provided `String`.
/// Can fail if the event_id is incorrectly formatted.
fn try_from(s: String) -> Result<Self, Self::Error> {
if !s.starts_with('$') {
return Err(IdentifierError::IncorrectSigil);
}
Ok(EventID(s))
}
}
impl<'de> serde::Deserialize<'de> for EventID {
fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
where
D: serde::Deserializer<'de>,
{
let s: String = serde::Deserialize::deserialize(deserializer)?;
EventID::try_from(s).map_err(serde::de::Error::custom)
}
}
impl Deref for EventID {
type Target = str;
fn deref(&self) -> &Self::Target {
&self.0
}
}
impl fmt::Display for EventID {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
write!(f, "{}", self.0)
}
}
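For reference, the sigil and colon checks that these Rust identifier types enforce can be restated as a small Python sketch (illustrative only, not part of the change):

def validate_identifier(s: str, sigil: str, needs_colon: bool) -> bool:
    # UserID and RoomID require both a leading sigil and a ':' separating the
    # localpart from the server name; EventID only requires the sigil.
    if not s.startswith(sigil):
        return False  # IncorrectSigil
    if needs_colon and ":" not in s:
        return False  # MissingColon
    return True

assert validate_identifier("@alice:example.org", "@", True)   # UserID
assert validate_identifier("!roomid:example.org", "!", True)  # RoomID
assert validate_identifier("$event_id_hash", "$", False)      # EventID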

View File

@@ -195,6 +195,10 @@ if [ -z "$skip_docker_build" ]; then
# Build the unified Complement image (from the worker Synapse image we just built).
echo_if_github "::group::Build Docker image: complement/Dockerfile"
$CONTAINER_RUNTIME build -t complement-synapse \
`# This is the tag we end up pushing to the registry (see` \
`# .github/workflows/push_complement_image.yml) so let's just label it now` \
`# so people can reference it by the same name locally.` \
-t ghcr.io/element-hq/synapse/complement-synapse \
-f "docker/complement/Dockerfile" "docker/complement"
echo_if_github "::endgroup::"

View File

@@ -19,7 +19,7 @@
#
#
import logging
from typing import TYPE_CHECKING, Any, Dict, List, Optional
from typing import TYPE_CHECKING, Any, Callable, Dict, List, Optional
from urllib.parse import urlencode
from authlib.oauth2 import ClientAuth
@@ -119,7 +119,7 @@ class MSC3861DelegatedAuth(BaseAuth):
self._clock = hs.get_clock()
self._http_client = hs.get_proxied_http_client()
self._hostname = hs.hostname
self._admin_token = self._config.admin_token
self._admin_token: Callable[[], Optional[str]] = self._config.admin_token
self._issuer_metadata = RetryOnExceptionCachedCall[OpenIDProviderMetadata](
self._load_metadata
@@ -133,9 +133,10 @@ class MSC3861DelegatedAuth(BaseAuth):
)
else:
# Else use the client secret
assert self._config.client_secret, "No client_secret provided"
client_secret = self._config.client_secret()
assert client_secret, "No client_secret provided"
self._client_auth = ClientAuth(
self._config.client_id, self._config.client_secret, auth_method
self._config.client_id, client_secret, auth_method
)
async def _load_metadata(self) -> OpenIDProviderMetadata:
@@ -174,6 +175,12 @@ class MSC3861DelegatedAuth(BaseAuth):
logger.warning("Failed to load metadata:", exc_info=True)
return None
async def auth_metadata(self) -> Dict[str, Any]:
"""
Returns the auth metadata dict
"""
return await self._issuer_metadata.get()
async def _introspection_endpoint(self) -> str:
"""
Returns the introspection endpoint of the issuer
@@ -277,7 +284,7 @@ class MSC3861DelegatedAuth(BaseAuth):
requester = await self.get_user_by_access_token(access_token, allow_expired)
# Do not record requests from MAS using the virtual `__oidc_admin` user.
if access_token != self._admin_token:
if access_token != self._admin_token():
await self._record_request(request, requester)
if not allow_guest and requester.is_guest:
@@ -318,7 +325,8 @@ class MSC3861DelegatedAuth(BaseAuth):
token: str,
allow_expired: bool = False,
) -> Requester:
if self._admin_token is not None and token == self._admin_token:
admin_token = self._admin_token()
if admin_token is not None and token == admin_token:
# XXX: This is a temporary solution so that the admin API can be called by
# the OIDC provider. This will be removed once we have OIDC client
# credentials grant support in matrix-authentication-service.

View File

@@ -231,6 +231,8 @@ class EventContentFields:
ROOM_NAME: Final = "name"
MEMBERSHIP: Final = "membership"
MEMBERSHIP_DISPLAYNAME: Final = "displayname"
MEMBERSHIP_AVATAR_URL: Final = "avatar_url"
# Used in m.room.guest_access events.
GUEST_ACCESS: Final = "guest_access"
@@ -318,3 +320,8 @@ class ApprovalNoticeMedium:
class Direction(enum.Enum):
BACKWARDS = "b"
FORWARDS = "f"
class ProfileFields:
DISPLAYNAME: Final = "displayname"
AVATAR_URL: Final = "avatar_url"

View File

@@ -87,8 +87,7 @@ class Codes(str, Enum):
WEAK_PASSWORD = "M_WEAK_PASSWORD"
INVALID_SIGNATURE = "M_INVALID_SIGNATURE"
USER_DEACTIVATED = "M_USER_DEACTIVATED"
# USER_LOCKED = "M_USER_LOCKED"
USER_LOCKED = "ORG_MATRIX_MSC3939_USER_LOCKED"
USER_LOCKED = "M_USER_LOCKED"
NOT_YET_UPLOADED = "M_NOT_YET_UPLOADED"
CANNOT_OVERWRITE_MEDIA = "M_CANNOT_OVERWRITE_MEDIA"
@@ -101,8 +100,9 @@ class Codes(str, Enum):
# The account has been suspended on the server.
# By opposition to `USER_DEACTIVATED`, this is a reversible measure
# that can possibly be appealed and reverted.
# Part of MSC3823.
USER_ACCOUNT_SUSPENDED = "ORG.MATRIX.MSC3823.USER_ACCOUNT_SUSPENDED"
# Introduced by MSC3823
# https://github.com/matrix-org/matrix-spec-proposals/pull/3823
USER_ACCOUNT_SUSPENDED = "M_USER_SUSPENDED"
BAD_ALIAS = "M_BAD_ALIAS"
# For restricted join rules.
@@ -132,6 +132,10 @@ class Codes(str, Enum):
# connection.
UNKNOWN_POS = "M_UNKNOWN_POS"
# Part of MSC4133
PROFILE_TOO_LARGE = "M_PROFILE_TOO_LARGE"
KEY_TOO_LARGE = "M_KEY_TOO_LARGE"
class CodeMessageException(RuntimeError):
"""An exception with integer code, a message string attributes and optional headers.

View File

@@ -275,6 +275,7 @@ class Ratelimiter:
update: bool = True,
n_actions: int = 1,
_time_now_s: Optional[float] = None,
pause: Optional[float] = 0.5,
) -> None:
"""Checks if an action can be performed. If not, raises a LimitExceededError
@@ -298,6 +299,8 @@ class Ratelimiter:
at all.
_time_now_s: The current time. Optional, defaults to the current time according
to self.clock. Only used by tests.
pause: Time in seconds to pause when an action is being limited. Defaults to 0.5
to stop clients from "tight-looping" on retrying their request.
Raises:
LimitExceededError: If an action could not be performed, along with the time in
@@ -316,9 +319,8 @@ class Ratelimiter:
)
if not allowed:
# We pause for a bit here to stop clients from "tight-looping" on
# retrying their request.
await self.clock.sleep(0.5)
if pause:
await self.clock.sleep(pause)
raise LimitExceededError(
limiter_name=self._limiter_name,

View File

@@ -23,7 +23,8 @@
import hmac
from hashlib import sha256
from urllib.parse import urlencode
from typing import Optional
from urllib.parse import urlencode, urljoin
from synapse.config import ConfigError
from synapse.config.homeserver import HomeServerConfig
@@ -66,3 +67,42 @@ class ConsentURIBuilder:
urlencode({"u": user_id, "h": mac}),
)
return consent_uri
class LoginSSORedirectURIBuilder:
def __init__(self, hs_config: HomeServerConfig):
self._public_baseurl = hs_config.server.public_baseurl
def build_login_sso_redirect_uri(
self, *, idp_id: Optional[str], client_redirect_url: str
) -> str:
"""Build a `/login/sso/redirect` URI for the given identity provider.
Builds `/_matrix/client/v3/login/sso/redirect/{idpId}?redirectUrl=xxx` when `idp_id` is specified.
Otherwise, builds `/_matrix/client/v3/login/sso/redirect?redirectUrl=xxx` when `idp_id` is `None`.
Args:
idp_id: Optional ID of the identity provider
client_redirect_url: URL to redirect the user to after login
Returns:
The URI to follow when choosing a specific identity provider.
"""
base_url = urljoin(
self._public_baseurl,
f"{CLIENT_API_PREFIX}/v3/login/sso/redirect",
)
serialized_query_parameters = urlencode({"redirectUrl": client_redirect_url})
if idp_id:
resultant_url = urljoin(
# We have to add a trailing slash to the base URL to ensure that the
# last path segment is not stripped away when joining with another path.
f"{base_url}/",
f"{idp_id}?{serialized_query_parameters}",
)
else:
resultant_url = f"{base_url}?{serialized_query_parameters}"
return resultant_url
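A usage sketch for the builder above; the base URL, IdP ID and redirect URL are made-up values, but the resulting shapes follow directly from the code:

# Assuming hs_config.server.public_baseurl == "https://synapse.example.com/"
builder = LoginSSORedirectURIBuilder(hs_config)

builder.build_login_sso_redirect_uri(
    idp_id=None, client_redirect_url="https://app.example.com/?loggedIn=1"
)
# -> "https://synapse.example.com/_matrix/client/v3/login/sso/redirect?redirectUrl=https%3A%2F%2Fapp.example.com%2F%3FloggedIn%3D1"

builder.build_login_sso_redirect_uri(
    idp_id="oidc-github", client_redirect_url="https://app.example.com/?loggedIn=1"
)
# -> "https://synapse.example.com/_matrix/client/v3/login/sso/redirect/oidc-github?redirectUrl=https%3A%2F%2Fapp.example.com%2F%3FloggedIn%3D1"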

View File

@@ -87,6 +87,7 @@ class ApplicationService:
ip_range_whitelist: Optional[IPSet] = None,
supports_ephemeral: bool = False,
msc3202_transaction_extensions: bool = False,
msc4190_device_management: bool = False,
):
self.token = token
self.url = (
@@ -100,6 +101,7 @@ class ApplicationService:
self.ip_range_whitelist = ip_range_whitelist
self.supports_ephemeral = supports_ephemeral
self.msc3202_transaction_extensions = msc3202_transaction_extensions
self.msc4190_device_management = msc4190_device_management
if "|" in self.id:
raise Exception("application service ID cannot contain '|' character")

View File

@@ -221,9 +221,13 @@ class Config:
The number of milliseconds in the duration.
Raises:
TypeError, if given something other than an integer or a string
TypeError: if given something other than an integer or a string, or the
duration uses an unrecognised suffix.
ValueError: if given a string not of the form described above.
"""
# For integers, we prefer to use `type(value) is int` instead of
# `isinstance(value, int)` because we want to exclude subclasses of int, such as
# bool.
if type(value) is int: # noqa: E721
return value
elif isinstance(value, str):
@@ -246,9 +250,20 @@ class Config:
if suffix in sizes:
value = value[:-1]
size = sizes[suffix]
elif suffix.isdigit():
# No suffix is treated as milliseconds.
value = value
size = 1
else:
raise TypeError(
f"Bad duration suffix {value} (expected no suffix or one of these suffixes: {sizes.keys()})"
)
return int(value) * size
else:
raise TypeError(f"Bad duration {value!r}")
raise TypeError(
f"Bad duration type {value!r} (expected int or string duration)"
)
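To make the accepted syntax concrete, here is a standalone sketch of the parsing rules above (the suffix table is an assumption based on Synapse's documented duration units and is not part of this diff):

SIZES = {"s": 1_000, "m": 60_000, "h": 3_600_000, "d": 86_400_000}

def parse_duration_sketch(value: str) -> int:
    # A trailing letter scales the number; a bare number is already milliseconds.
    suffix = value[-1]
    if suffix in SIZES:
        return int(value[:-1]) * SIZES[suffix]
    if suffix.isdigit():
        return int(value)
    raise TypeError(f"Bad duration suffix {value}")

assert parse_duration_sketch("30s") == 30_000
assert parse_duration_sketch("150") == 150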
@staticmethod
def abspath(file_path: str) -> str:

View File

@@ -183,6 +183,18 @@ def _load_appservice(
"The `org.matrix.msc3202` option should be true or false if specified."
)
# Opt-in flag for the MSC4190 behaviours.
# When enabled, the following C-S API endpoints change for appservices:
# - POST /register does not return an access token
# - PUT /devices/{device_id} creates a new device if one does not exist
# - DELETE /devices/{device_id} no longer requires UIA
# - POST /delete_devices no longer requires UIA
msc4190_enabled = as_info.get("io.element.msc4190", False)
if not isinstance(msc4190_enabled, bool):
raise ValueError(
"The `io.element.msc4190` option should be true or false if specified."
)
return ApplicationService(
token=as_info["as_token"],
url=as_info["url"],
@@ -195,4 +207,5 @@ def _load_appservice(
ip_range_whitelist=ip_range_whitelist,
supports_ephemeral=supports_ephemeral,
msc3202_transaction_extensions=msc3202_transaction_extensions,
msc4190_device_management=msc4190_enabled,
)
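For illustration, a minimal appservice registration (shown as the parsed `as_info` dict, all values hypothetical) opting in to the MSC4190 behaviours listed in the comment above:

as_info = {
    "id": "my-bridge",
    "url": "http://localhost:9000",
    "as_token": "<as token>",
    "hs_token": "<hs token>",
    "sender_localpart": "bridge",
    "namespaces": {"users": [{"exclusive": True, "regex": "@bridge_.*"}]},
    "io.element.msc4190": True,  # the new opt-in flag parsed above
}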

View File

@@ -20,7 +20,7 @@
#
#
from typing import Any, List
from typing import Any, List, Optional
from synapse.config.sso import SsoAttributeRequirement
from synapse.types import JsonDict
@@ -46,7 +46,9 @@ class CasConfig(Config):
# TODO Update this to a _synapse URL.
public_baseurl = self.root.server.public_baseurl
self.cas_service_url = public_baseurl + "_matrix/client/r0/login/cas/ticket"
self.cas_service_url: Optional[str] = (
public_baseurl + "_matrix/client/r0/login/cas/ticket"
)
self.cas_protocol_version = cas_config.get("protocol_version")
if (

View File

@@ -110,6 +110,7 @@ class EmailConfig(Config):
raise ConfigError(
"email.require_transport_security requires email.enable_tls to be true"
)
self.email_tlsname = email_config.get("tlsname", None)
if "app_name" in email_config:
self.email_app_name = email_config["app_name"]

View File

@@ -20,14 +20,15 @@
#
import enum
from typing import TYPE_CHECKING, Any, Optional
from functools import cache
from typing import TYPE_CHECKING, Any, Iterable, Optional
import attr
import attr.validators
from synapse.api.room_versions import KNOWN_ROOM_VERSIONS, RoomVersions
from synapse.config import ConfigError
from synapse.config._base import Config, RootConfig
from synapse.config._base import Config, RootConfig, read_file
from synapse.types import JsonDict
# Determine whether authlib is installed.
@@ -43,6 +44,12 @@ if TYPE_CHECKING:
from authlib.jose.rfc7517 import JsonWebKey
@cache
def read_secret_from_file_once(file_path: Any, config_path: Iterable[str]) -> str:
"""Returns the memoized secret read from file."""
return read_file(file_path, config_path).strip()
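A small aside on the `@cache` decorator used here (illustration, not Synapse code): the secret file is read on the first call only and the result is memoized per argument tuple, so rotating the secret on disk requires a restart before the new value is picked up.

from functools import cache

@cache
def read_once(path: str) -> str:
    with open(path, encoding="utf-8") as f:
        return f.read().strip()

# read_once("/run/secrets/client_secret")  # hits the filesystem
# read_once("/run/secrets/client_secret")  # returns the memoized value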
class ClientAuthMethod(enum.Enum):
"""List of supported client auth methods."""
@@ -63,6 +70,40 @@ def _parse_jwks(jwks: Optional[JsonDict]) -> Optional["JsonWebKey"]:
return JsonWebKey.import_key(jwks)
def _check_client_secret(
instance: "MSC3861", _attribute: attr.Attribute, _value: Optional[str]
) -> None:
if instance._client_secret and instance._client_secret_path:
raise ConfigError(
(
"You have configured both "
"`experimental_features.msc3861.client_secret` and "
"`experimental_features.msc3861.client_secret_path`. "
"These are mutually incompatible."
),
("experimental", "msc3861", "client_secret"),
)
# Check client secret can be retrieved
instance.client_secret()
def _check_admin_token(
instance: "MSC3861", _attribute: attr.Attribute, _value: Optional[str]
) -> None:
if instance._admin_token and instance._admin_token_path:
raise ConfigError(
(
"You have configured both "
"`experimental_features.msc3861.admin_token` and "
"`experimental_features.msc3861.admin_token_path`. "
"These are mutually incompatible."
),
("experimental", "msc3861", "admin_token"),
)
# Check admin token can be retrieved
instance.admin_token()
@attr.s(slots=True, frozen=True)
class MSC3861:
"""Configuration for MSC3861: Matrix architecture change to delegate authentication via OIDC"""
@@ -97,15 +138,30 @@ class MSC3861:
)
"""The auth method used when calling the introspection endpoint."""
client_secret: Optional[str] = attr.ib(
_client_secret: Optional[str] = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(str)),
validator=[
attr.validators.optional(attr.validators.instance_of(str)),
_check_client_secret,
],
)
"""
The client secret to use when calling the introspection endpoint,
when using any of the client_secret_* client auth methods.
"""
_client_secret_path: Optional[str] = attr.ib(
default=None,
validator=[
attr.validators.optional(attr.validators.instance_of(str)),
_check_client_secret,
],
)
"""
Alternative to `client_secret`: allows the secret to be specified in an
external file.
"""
jwk: Optional["JsonWebKey"] = attr.ib(default=None, converter=_parse_jwks)
"""
The JWKS to use when calling the introspection endpoint,
@@ -133,7 +189,7 @@ class MSC3861:
ClientAuthMethod.CLIENT_SECRET_BASIC,
ClientAuthMethod.CLIENT_SECRET_JWT,
)
and self.client_secret is None
and self.client_secret() is None
):
raise ConfigError(
f"A client secret must be provided when using the {value} client auth method",
@@ -152,15 +208,48 @@ class MSC3861:
)
"""The URL of the My Account page on the OIDC Provider as per MSC2965."""
admin_token: Optional[str] = attr.ib(
_admin_token: Optional[str] = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(str)),
validator=[
attr.validators.optional(attr.validators.instance_of(str)),
_check_admin_token,
],
)
"""
A token that should be considered as an admin token.
This is used by the OIDC provider to make admin calls to Synapse.
"""
_admin_token_path: Optional[str] = attr.ib(
default=None,
validator=[
attr.validators.optional(attr.validators.instance_of(str)),
_check_admin_token,
],
)
"""
Alternative to `admin_token`: allows the secret to be specified in an
external file.
"""
def client_secret(self) -> Optional[str]:
"""Returns the secret given via `client_secret` or `client_secret_path`."""
if self._client_secret_path:
return read_secret_from_file_once(
self._client_secret_path,
("experimental_features", "msc3861", "client_secret_path"),
)
return self._client_secret
def admin_token(self) -> Optional[str]:
"""Returns the admin token given via `admin_token` or `admin_token_path`."""
if self._admin_token_path:
return read_secret_from_file_once(
self._admin_token_path,
("experimental_features", "msc3861", "admin_token_path"),
)
return self._admin_token
def check_config_conflicts(self, root: RootConfig) -> None:
"""Checks for any configuration conflicts with other parts of Synapse.
@@ -436,15 +525,14 @@ class ExperimentalConfig(Config):
("experimental", "msc4108_delegation_endpoint"),
)
self.msc3823_account_suspension = experimental.get(
"msc3823_account_suspension", False
)
# MSC4151: Report room API (Client-Server API)
self.msc4151_enabled: bool = experimental.get("msc4151_enabled", False)
# MSC4133: Custom profile fields
self.msc4133_enabled: bool = experimental.get("msc4133_enabled", False)
# MSC4210: Remove legacy mentions
self.msc4210_enabled: bool = experimental.get("msc4210_enabled", False)
# MSC4222: Adding `state_after` to sync v2
self.msc4222_enabled: bool = experimental.get("msc4222_enabled", False)
# MSC4076: Add `disable_badge_count`` to pusher configuration
self.msc4076_enabled: bool = experimental.get("msc4076_enabled", False)

View File

@@ -43,7 +43,7 @@ from unpaddedbase64 import decode_base64
from synapse.types import JsonDict
from synapse.util.stringutils import random_string, random_string_with_symbols
from ._base import Config, ConfigError
from ._base import Config, ConfigError, read_file
if TYPE_CHECKING:
from signedjson.key import VerifyKeyWithExpiry
@@ -91,6 +91,11 @@ To suppress this warning and continue using 'matrix.org', admins should set
'suppress_key_server_warning' to 'true' in homeserver.yaml.
--------------------------------------------------------------------------------"""
CONFLICTING_MACAROON_SECRET_KEY_OPTS_ERROR = """\
Conflicting options 'macaroon_secret_key' and 'macaroon_secret_key_path' are
both defined in config file.
"""
logger = logging.getLogger(__name__)
@@ -166,10 +171,16 @@ class KeyConfig(Config):
)
)
macaroon_secret_key: Optional[str] = config.get(
"macaroon_secret_key", self.root.registration.registration_shared_secret
)
macaroon_secret_key = config.get("macaroon_secret_key")
macaroon_secret_key_path = config.get("macaroon_secret_key_path")
if macaroon_secret_key_path:
if macaroon_secret_key:
raise ConfigError(CONFLICTING_MACAROON_SECRET_KEY_OPTS_ERROR)
macaroon_secret_key = read_file(
macaroon_secret_key_path, "macaroon_secret_key_path"
).strip()
if not macaroon_secret_key:
macaroon_secret_key = self.root.registration.registration_shared_secret
if not macaroon_secret_key:
# Unfortunately, there are people out there that don't have this
# set. Lets just be "nice" and derive one from their secret key.

View File

@@ -360,5 +360,6 @@ def setup_logging(
"Licensed under the AGPL 3.0 license. Website: https://github.com/element-hq/synapse"
)
logging.info("Server hostname: %s", config.server.server_name)
logging.info("Public Base URL: %s", config.server.public_baseurl)
logging.info("Instance name: %s", hs.get_instance_name())
logging.info("Twisted reactor: %s", type(reactor).__name__)

View File

@@ -228,3 +228,9 @@ class RatelimitConfig(Config):
config.get("remote_media_download_burst_count", "500M")
),
)
self.rc_presence_per_user = RatelimitSettings.parse(
config,
"rc_presence.per_user",
defaults={"per_second": 0.1, "burst_count": 1},
)

View File

@@ -22,7 +22,7 @@
import logging
import os
from typing import Any, Dict, List, Tuple
from urllib.request import getproxies_environment # type: ignore
from urllib.request import getproxies_environment
import attr

View File

@@ -332,8 +332,14 @@ class ServerConfig(Config):
logger.info("Using default public_baseurl %s", public_baseurl)
else:
self.serve_client_wellknown = True
# Ensure that public_baseurl ends with a trailing slash
if public_baseurl[-1] != "/":
public_baseurl += "/"
# Scrutinize user-provided config
if not isinstance(public_baseurl, str):
raise ConfigError("Must be a string", ("public_baseurl",))
self.public_baseurl = public_baseurl
# check that public_baseurl is valid

View File

@@ -566,6 +566,7 @@ def _is_membership_change_allowed(
logger.debug(
"_is_membership_change_allowed: %s",
{
"caller_membership": caller.membership if caller else None,
"caller_in_room": caller_in_room,
"caller_invited": caller_invited,
"caller_knocked": caller_knocked,
@@ -677,7 +678,8 @@ def _is_membership_change_allowed(
and join_rule == JoinRules.KNOCK_RESTRICTED
)
):
if not caller_in_room and not caller_invited:
# You can only join the room if you are invited or are already in the room.
if not (caller_in_room or caller_invited):
raise AuthError(403, "You are not invited to this room.")
else:
# TODO (erikj): may_join list

View File

@@ -42,7 +42,7 @@ import attr
from typing_extensions import Literal
from unpaddedbase64 import encode_base64
from synapse.api.constants import RelationTypes
from synapse.api.constants import EventTypes, RelationTypes
from synapse.api.room_versions import EventFormatVersions, RoomVersion, RoomVersions
from synapse.synapse_rust.events import EventInternalMetadata
from synapse.types import JsonDict, StrCollection
@@ -325,12 +325,17 @@ class EventBase(metaclass=abc.ABCMeta):
def __repr__(self) -> str:
rejection = f"REJECTED={self.rejected_reason}, " if self.rejected_reason else ""
conditional_membership_string = ""
if self.get("type") == EventTypes.Member:
conditional_membership_string = f"membership={self.membership}, "
return (
f"<{self.__class__.__name__} "
f"{rejection}"
f"event_id={self.event_id}, "
f"type={self.get('type')}, "
f"state_key={self.get('state_key')}, "
f"{conditional_membership_string}"
f"outlier={self.internal_metadata.is_outlier()}"
">"
)

View File

@@ -66,50 +66,67 @@ class InviteAutoAccepter:
event: The incoming event.
"""
# Check if the event is an invite for a local user.
is_invite_for_local_user = (
event.type == EventTypes.Member
and event.is_state()
and event.membership == Membership.INVITE
and self._api.is_mine(event.state_key)
)
if (
event.type != EventTypes.Member
or event.is_state() is False
or event.membership != Membership.INVITE
or self._api.is_mine(event.state_key) is False
):
return
# Only accept invites for direct messages if the configuration mandates it.
is_direct_message = event.content.get("is_direct", False)
is_allowed_by_direct_message_rules = (
not self._config.accept_invites_only_for_direct_messages
or is_direct_message is True
)
if (
self._config.accept_invites_only_for_direct_messages
and is_direct_message is False
):
return
# Only accept invites from remote users if the configuration mandates it.
is_from_local_user = self._api.is_mine(event.sender)
is_allowed_by_local_user_rules = (
not self._config.accept_invites_only_from_local_users
or is_from_local_user is True
if (
self._config.accept_invites_only_from_local_users
and is_from_local_user is False
):
return
# Check the user is activated.
recipient = await self._api.get_userinfo_by_id(event.state_key)
# Ignore if the user doesn't exist.
if recipient is None:
return
# Never accept invites for deactivated users.
if recipient.is_deactivated:
return
# Never accept invites for suspended users.
if recipient.suspended:
return
# Never accept invites for locked users.
if recipient.locked:
return
# Make the user join the room. We run this as a background process to circumvent a race condition
# that occurs when responding to invites over federation (see https://github.com/matrix-org/synapse-auto-accept-invite/issues/12)
run_as_background_process(
"retry_make_join",
self._retry_make_join,
event.state_key,
event.state_key,
event.room_id,
"join",
bg_start_span=False,
)
if (
is_invite_for_local_user
and is_allowed_by_direct_message_rules
and is_allowed_by_local_user_rules
):
# Make the user join the room. We run this as a background process to circumvent a race condition
# that occurs when responding to invites over federation (see https://github.com/matrix-org/synapse-auto-accept-invite/issues/12)
run_as_background_process(
"retry_make_join",
self._retry_make_join,
event.state_key,
event.state_key,
event.room_id,
"join",
bg_start_span=False,
if is_direct_message:
# Mark this room as a direct message!
await self._mark_room_as_direct_message(
event.state_key, event.sender, event.room_id
)
if is_direct_message:
# Mark this room as a direct message!
await self._mark_room_as_direct_message(
event.state_key, event.sender, event.room_id
)
async def _mark_room_as_direct_message(
self, user_id: str, dm_user_id: str, room_id: str
) -> None:

View File

@@ -24,7 +24,7 @@ from typing import TYPE_CHECKING, Any, Dict, List, Optional, Tuple, Union
import attr
from signedjson.types import SigningKey
from synapse.api.constants import MAX_DEPTH
from synapse.api.constants import MAX_DEPTH, EventTypes
from synapse.api.room_versions import (
KNOWN_EVENT_FORMAT_VERSIONS,
EventFormatVersions,
@@ -109,6 +109,19 @@ class EventBuilder:
def is_state(self) -> bool:
return self._state_key is not None
def is_mine_id(self, user_id: str) -> bool:
"""Determines whether a user ID or room alias originates from this homeserver.
Returns:
`True` if the hostname part of the user ID or room alias matches this
homeserver.
`False` otherwise, or if the user ID or room alias is malformed.
"""
localpart_hostname = user_id.split(":", 1)
if len(localpart_hostname) < 2:
return False
return localpart_hostname[1] == self._hostname
async def build(
self,
prev_event_ids: List[str],
@@ -142,6 +155,46 @@ class EventBuilder:
self, state_ids
)
# Check for out-of-band membership that may have been exposed on `/sync` but
# the events have not been de-outliered yet so they won't be part of the
# room state yet.
#
# This helps in situations where a remote homeserver invites a local user to
# a room that we're already participating in; and we've persisted the invite
# as an out-of-band membership (outlier), but it hasn't been pushed to us as
# part of a `/send` transaction yet and de-outliered. This also helps for
# any of the other out-of-band membership transitions.
#
# As an optimization, we could check if the room state already includes a
# non-`leave` membership event, then we can assume the membership event has
# been de-outliered and we don't need to check for an out-of-band
# membership. But we don't have the necessary information from a
# `StateMap[str]` and we'll just have to take the hit of this extra lookup
# for any membership event for now.
if self.type == EventTypes.Member and self.is_mine_id(self.state_key):
(
_membership,
member_event_id,
) = await self._store.get_local_current_membership_for_user_in_room(
user_id=self.state_key,
room_id=self.room_id,
)
# There is no need to check if the membership is actually an
# out-of-band membership (`outlier`) as we would end up with the
# same result either way (adding the member event to the
# `auth_event_ids`).
if (
member_event_id is not None
# We only need to be careful about duplicating the event in the
# `auth_event_ids` list (duplicate `type`/`state_key` is part of the
# authorization rules)
and member_event_id not in auth_event_ids
):
auth_event_ids.append(member_event_id)
# Also make sure to point to the previous membership event that will
# allow this one to happen so the computed state works out.
prev_event_ids.append(member_event_id)
format_version = self.room_version.event_format
# The types of auth/prev events changes between event versions.
prev_events: Union[StrCollection, List[Tuple[str, Dict[str, str]]]]

View File

@@ -248,7 +248,7 @@ class EventContext(UnpersistedEventContextBase):
@tag_args
async def get_current_state_ids(
self, state_filter: Optional["StateFilter"] = None
) -> Optional[StateMap[str]]:
) -> StateMap[str]:
"""
Gets the room state map, including this event - ie, the state in ``state_group``
@@ -256,13 +256,12 @@ class EventContext(UnpersistedEventContextBase):
not make it into the room state. This method will raise an exception if
``rejected`` is set.
It is also an error to access this for an outlier event.
Args:
state_filter: specifies the type of state event to fetch from DB, example: EventTypes.JoinRules
Returns:
Returns None if state_group is None, which happens when the associated
event is an outlier.
Maps a (type, state_key) to the event ID of the state event matching
this tuple.
"""
@@ -300,7 +299,8 @@ class EventContext(UnpersistedEventContextBase):
this tuple.
"""
assert self.state_group_before_event is not None
if self.state_group_before_event is None:
return {}
return await self._storage.state.get_state_ids_for_group(
self.state_group_before_event, state_filter
)

View File

@@ -509,6 +509,9 @@ class FederationV2InviteServlet(BaseFederationServerServlet):
event = content["event"]
invite_room_state = content.get("invite_room_state", [])
if not isinstance(invite_room_state, list):
invite_room_state = []
# Synapse expects invite_room_state to be in unsigned, as it is in v1
# API

View File

@@ -473,7 +473,7 @@ class AdminHandler:
"type": EventTypes.Redaction,
"content": {"reason": reason} if reason else {},
"room_id": room,
"sender": user_id,
"sender": requester.user.to_string(),
}
if room_version.updated_redaction_rules:
event_dict["content"]["redacts"] = event.event_id

View File

@@ -896,10 +896,10 @@ class ApplicationServicesHandler:
results = await make_deferred_yieldable(
defer.DeferredList(
[
run_in_background(
run_in_background( # type: ignore[call-overload]
self.appservice_api.claim_client_keys,
# We know this must be an app service.
self.store.get_app_service_by_id(service_id), # type: ignore[arg-type]
self.store.get_app_service_by_id(service_id),
service_query,
)
for service_id, service_query in query_by_appservice.items()
@@ -952,10 +952,10 @@ class ApplicationServicesHandler:
results = await make_deferred_yieldable(
defer.DeferredList(
[
run_in_background(
run_in_background( # type: ignore[call-overload]
self.appservice_api.query_keys,
# We know this must be an app service.
self.store.get_app_service_by_id(service_id), # type: ignore[arg-type]
self.store.get_app_service_by_id(service_id),
service_query,
)
for service_id, service_query in query_by_appservice.items()

View File

@@ -729,6 +729,40 @@ class DeviceHandler(DeviceWorkerHandler):
await self.notify_device_update(user_id, device_ids)
async def upsert_device(
self, user_id: str, device_id: str, display_name: Optional[str] = None
) -> bool:
"""Create or update a device
Args:
user_id: The user to update devices of.
device_id: The device to update.
display_name: The new display name for this device.
Returns:
True if the device was created, False if it was updated.
"""
# Reject a new displayname which is too long.
self._check_device_name_length(display_name)
created = await self.store.store_device(
user_id,
device_id,
initial_device_display_name=display_name,
)
if not created:
await self.store.update_device(
user_id,
device_id,
new_display_name=display_name,
)
await self.notify_device_update(user_id, [device_id])
return created
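A usage sketch (user and device IDs are illustrative) showing the create-or-update semantics described in the docstring:

created = await device_handler.upsert_device(
    user_id="@alice:example.org",
    device_id="ADEVICEID",
    display_name="Alice's phone",
)
# created is True if the device did not exist before, False if only the
# display name was updated.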
async def update_device(self, user_id: str, device_id: str, content: dict) -> None:
"""Update the given device

View File

@@ -880,6 +880,9 @@ class FederationHandler:
if stripped_room_state is None:
raise KeyError("Missing 'knock_room_state' field in send_knock response")
if not isinstance(stripped_room_state, list):
raise TypeError("'knock_room_state' has wrong type")
event.unsigned["knock_room_state"] = stripped_room_state
context = EventContext.for_outlier(self._storage_controllers)

View File

@@ -151,6 +151,8 @@ class FederationEventHandler:
def __init__(self, hs: "HomeServer"):
self._clock = hs.get_clock()
self._store = hs.get_datastores().main
self._state_store = hs.get_datastores().state
self._state_deletion_store = hs.get_datastores().state_deletion
self._storage_controllers = hs.get_storage_controllers()
self._state_storage_controller = self._storage_controllers.state
@@ -580,7 +582,9 @@ class FederationEventHandler:
room_version.identifier,
state_maps_to_resolve,
event_map=None,
state_res_store=StateResolutionStore(self._store),
state_res_store=StateResolutionStore(
self._store, self._state_deletion_store
),
)
)
else:
@@ -1179,7 +1183,9 @@ class FederationEventHandler:
room_version,
state_maps,
event_map={event_id: event},
state_res_store=StateResolutionStore(self._store),
state_res_store=StateResolutionStore(
self._store, self._state_deletion_store
),
)
except Exception as e:
@@ -1874,7 +1880,9 @@ class FederationEventHandler:
room_version,
[local_state_id_map, claimed_auth_events_id_map],
event_map=None,
state_res_store=StateResolutionStore(self._store),
state_res_store=StateResolutionStore(
self._store, self._state_deletion_store
),
)
)
else:
@@ -2014,7 +2022,9 @@ class FederationEventHandler:
room_version,
state_sets,
event_map=None,
state_res_store=StateResolutionStore(self._store),
state_res_store=StateResolutionStore(
self._store, self._state_deletion_store
),
)
)
else:
@@ -2272,8 +2282,9 @@ class FederationEventHandler:
event_and_contexts, backfilled=backfilled
)
# After persistence we always need to notify replication there may
# be new data.
# After persistence, we never notify clients (wake up `/sync` streams) about
# backfilled events but it's important to let all the workers know about any
# new event (backfilled or not) because TODO
self._notifier.notify_replication()
if self._ephemeral_messages_enabled:

View File

@@ -1002,7 +1002,21 @@ class OidcProvider:
"""
state = generate_token()
nonce = generate_token()
# Generate a nonce 32 characters long. When encoded with base64url later on,
# the nonce will be 43 characters when sent to the identity provider.
#
# While RFC7636 does not specify a minimum length for the `nonce`
# parameter, the TI-Messenger IDP_FD spec v1.7.3 does require it to be
# between 43 and 128 characters. This spec concerns using Matrix for
# communication in German healthcare.
#
# As increasing the length only strengthens security, we use this length
# to allow TI-Messenger deployments using Synapse to satisfy this
# external spec.
#
# See https://github.com/element-hq/synapse/pull/18109 for more context.
nonce = generate_token(length=32)
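A quick sanity check of the length claim in the comment above (illustrative, not part of the change): 32 bytes of token material encode to ceil(32 / 3) * 4 = 44 base64url characters with padding, i.e. 43 once the single trailing "=" is stripped.

import base64
import os

token = os.urandom(32)
assert len(base64.urlsafe_b64encode(token).rstrip(b"=")) == 43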
code_verifier = ""
if not client_redirect_url:

View File

@@ -22,6 +22,7 @@ import logging
import random
from typing import TYPE_CHECKING, List, Optional, Union
from synapse.api.constants import ProfileFields
from synapse.api.errors import (
AuthError,
Codes,
@@ -31,7 +32,7 @@ from synapse.api.errors import (
SynapseError,
)
from synapse.storage.databases.main.media_repository import LocalMedia, RemoteMedia
from synapse.types import JsonDict, Requester, UserID, create_requester
from synapse.types import JsonDict, JsonValue, Requester, UserID, create_requester
from synapse.util.caches.descriptors import cached
from synapse.util.stringutils import parse_and_validate_mxc_uri
@@ -42,6 +43,8 @@ logger = logging.getLogger(__name__)
MAX_DISPLAYNAME_LEN = 256
MAX_AVATAR_URL_LEN = 1000
# Field name length is specced at 255 bytes.
MAX_CUSTOM_FIELD_LEN = 255
class ProfileHandler:
@@ -83,19 +86,33 @@ class ProfileHandler:
Returns:
A JSON dictionary. For local queries this will include the displayname and avatar_url
fields. For remote queries it may contain arbitrary information.
fields, if set. For remote queries it may contain arbitrary information.
"""
target_user = UserID.from_string(user_id)
if self.hs.is_mine(target_user):
profileinfo = await self.store.get_profileinfo(target_user)
if profileinfo.display_name is None and profileinfo.avatar_url is None:
extra_fields = {}
if self.hs.config.experimental.msc4133_enabled:
extra_fields = await self.store.get_profile_fields(target_user)
if (
profileinfo.display_name is None
and profileinfo.avatar_url is None
and not extra_fields
):
raise SynapseError(404, "Profile was not found", Codes.NOT_FOUND)
return {
"displayname": profileinfo.display_name,
"avatar_url": profileinfo.avatar_url,
}
# Do not include display name or avatar if unset.
ret = {}
if profileinfo.display_name is not None:
ret[ProfileFields.DISPLAYNAME] = profileinfo.display_name
if profileinfo.avatar_url is not None:
ret[ProfileFields.AVATAR_URL] = profileinfo.avatar_url
if extra_fields:
ret.update(extra_fields)
return ret
else:
try:
result = await self.federation.make_query(
@@ -399,6 +416,110 @@ class ProfileHandler:
return True
async def get_profile_field(
self, target_user: UserID, field_name: str
) -> JsonValue:
"""
Fetch a user's profile from the database for local users and over federation
for remote users.
Args:
target_user: The user ID to fetch the profile for.
field_name: The field to fetch the profile for.
Returns:
The value for the profile field or None if the field does not exist.
"""
if self.hs.is_mine(target_user):
try:
field_value = await self.store.get_profile_field(
target_user, field_name
)
except StoreError as e:
if e.code == 404:
raise SynapseError(404, "Profile was not found", Codes.NOT_FOUND)
raise
return field_value
else:
try:
result = await self.federation.make_query(
destination=target_user.domain,
query_type="profile",
args={"user_id": target_user.to_string(), "field": field_name},
ignore_backoff=True,
)
except RequestSendFailed as e:
raise SynapseError(502, "Failed to fetch profile") from e
except HttpResponseException as e:
raise e.to_synapse_error()
return result.get(field_name)
async def set_profile_field(
self,
target_user: UserID,
requester: Requester,
field_name: str,
new_value: JsonValue,
by_admin: bool = False,
deactivation: bool = False,
) -> None:
"""Set a new profile field for a user.
Args:
target_user: the user whose profile is to be changed.
requester: The user attempting to make this change.
field_name: The name of the profile field to update.
new_value: The new field value for this user.
by_admin: Whether this change was made by an administrator.
deactivation: Whether this change was made while deactivating the user.
"""
if not self.hs.is_mine(target_user):
raise SynapseError(400, "User is not hosted on this homeserver")
if not by_admin and target_user != requester.user:
raise AuthError(403, "Cannot set another user's profile")
await self.store.set_profile_field(target_user, field_name, new_value)
# Custom fields do not propagate into the user directory *or* rooms.
profile = await self.store.get_profileinfo(target_user)
await self._third_party_rules.on_profile_update(
target_user.to_string(), profile, by_admin, deactivation
)
async def delete_profile_field(
self,
target_user: UserID,
requester: Requester,
field_name: str,
by_admin: bool = False,
deactivation: bool = False,
) -> None:
"""Delete a field from a user's profile.
Args:
target_user: the user whose profile is to be changed.
requester: The user attempting to make this change.
field_name: The name of the profile field to remove.
by_admin: Whether this change was made by an administrator.
deactivation: Whether this change was made while deactivating the user.
"""
if not self.hs.is_mine(target_user):
raise SynapseError(400, "User is not hosted on this homeserver")
if not by_admin and target_user != requester.user:
raise AuthError(400, "Cannot set another user's profile")
await self.store.delete_profile_field(target_user, field_name)
# Custom fields do not propagate into the user directory *or* rooms.
profile = await self.store.get_profileinfo(target_user)
await self._third_party_rules.on_profile_update(
target_user.to_string(), profile, by_admin, deactivation
)
async def on_profile_query(self, args: JsonDict) -> JsonDict:
"""Handles federation profile query requests."""
@@ -415,13 +536,24 @@ class ProfileHandler:
just_field = args.get("field", None)
response = {}
response: JsonDict = {}
try:
if just_field is None or just_field == "displayname":
if just_field is None or just_field == ProfileFields.DISPLAYNAME:
response["displayname"] = await self.store.get_profile_displayname(user)
if just_field is None or just_field == "avatar_url":
if just_field is None or just_field == ProfileFields.AVATAR_URL:
response["avatar_url"] = await self.store.get_profile_avatar_url(user)
if self.hs.config.experimental.msc4133_enabled:
if just_field is None:
response.update(await self.store.get_profile_fields(user))
elif just_field not in (
ProfileFields.DISPLAYNAME,
ProfileFields.AVATAR_URL,
):
response[just_field] = await self.store.get_profile_field(
user, just_field
)
except StoreError as e:
if e.code == 404:
raise SynapseError(404, "Profile was not found", Codes.NOT_FOUND)
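To make the federation flow above concrete, the "profile" query exchange looks roughly like this (values illustrative): the requesting server sends the user ID plus an optional field name, and the responder returns just that field, falling back to MSC4133 custom fields when the name is neither displayname nor avatar_url.

# Query args sent via make_query(query_type="profile", ...):
args = {"user_id": "@alice:example.org", "field": "displayname"}
# Response built by on_profile_query:
response = {"displayname": "Alice"}
# With msc4133_enabled and e.g. field="org.example.language" (a hypothetical
# custom field), the value is looked up via get_profile_field instead.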

View File

@@ -630,7 +630,9 @@ class RegistrationHandler:
"""
await self._auto_join_rooms(user_id)
async def appservice_register(self, user_localpart: str, as_token: str) -> str:
async def appservice_register(
self, user_localpart: str, as_token: str
) -> Tuple[str, ApplicationService]:
user = UserID(user_localpart, self.hs.hostname)
user_id = user.to_string()
service = self.store.get_app_service_by_token(as_token)
@@ -653,7 +655,7 @@ class RegistrationHandler:
appservice_id=service_id,
create_profile_with_displayname=user.localpart,
)
return user_id
return (user_id, service)
def check_user_id_not_appservice_exclusive(
self, user_id: str, allowed_appservice: Optional[ApplicationService] = None

View File

@@ -47,15 +47,45 @@ logger = logging.getLogger(__name__)
_is_old_twisted = parse_version(twisted.__version__) < parse_version("21")
class _NoTLSESMTPSender(ESMTPSender):
"""Extend ESMTPSender to disable TLS
class _BackportESMTPSender(ESMTPSender):
"""Extend old versions of ESMTPSender to configure TLS.
Unfortunately, before Twisted 21.2, ESMTPSender doesn't give an easy way to disable
TLS, so we override its internal method which it uses to generate a context factory.
Unfortunately, before Twisted 21.2, ESMTPSender doesn't give an easy way to
disable TLS, or to configure the hostname used for TLS certificate validation.
This backports the `hostname` parameter for that functionality.
"""
__hostname: Optional[str]
def __init__(self, *args: Any, **kwargs: Any) -> None:
""""""
self.__hostname = kwargs.pop("hostname", None)
super().__init__(*args, **kwargs)
def _getContextFactory(self) -> Optional[IOpenSSLContextFactory]:
return None
if self.context is not None:
return self.context
elif self.__hostname is None:
return None # disable TLS if hostname is None
return optionsForClientTLS(self.__hostname)
class _BackportESMTPSenderFactory(ESMTPSenderFactory):
"""An ESMTPSenderFactory for _BackportESMTPSender.
This backports the `hostname` parameter, to disable or configure TLS.
"""
__hostname: Optional[str]
def __init__(self, *args: Any, **kwargs: Any) -> None:
self.__hostname = kwargs.pop("hostname", None)
super().__init__(*args, **kwargs)
def protocol(self, *args: Any, **kwargs: Any) -> ESMTPSender: # type: ignore
# this overrides ESMTPSenderFactory's `protocol` attribute, with a Callable
# instantiating our _BackportESMTPSender, providing the hostname parameter
return _BackportESMTPSender(*args, **kwargs, hostname=self.__hostname)
async def _sendmail(
@@ -71,6 +101,7 @@ async def _sendmail(
require_tls: bool = False,
enable_tls: bool = True,
force_tls: bool = False,
tlsname: Optional[str] = None,
) -> None:
"""A simple wrapper around ESMTPSenderFactory, to allow substitution in tests
@@ -88,39 +119,33 @@ async def _sendmail(
enable_tls: True to enable STARTTLS. If this is False and require_tls is True,
the request will fail.
force_tls: True to enable Implicit TLS.
tlsname: the domain name expected as the TLS certificate's commonname,
defaults to smtphost.
"""
msg = BytesIO(msg_bytes)
d: "Deferred[object]" = Deferred()
if not enable_tls:
tlsname = None
elif tlsname is None:
tlsname = smtphost
def build_sender_factory(**kwargs: Any) -> ESMTPSenderFactory:
return ESMTPSenderFactory(
username,
password,
from_addr,
to_addr,
msg,
d,
heloFallback=True,
requireAuthentication=require_auth,
requireTransportSecurity=require_tls,
**kwargs,
)
factory: IProtocolFactory
if _is_old_twisted:
# before twisted 21.2, we have to override the ESMTPSender protocol to disable
# TLS
factory = build_sender_factory()
if not enable_tls:
factory.protocol = _NoTLSESMTPSender
else:
# for twisted 21.2 and later, there is a 'hostname' parameter which we should
# set to enable TLS.
factory = build_sender_factory(hostname=smtphost if enable_tls else None)
factory: IProtocolFactory = (
_BackportESMTPSenderFactory if _is_old_twisted else ESMTPSenderFactory
)(
username,
password,
from_addr,
to_addr,
msg,
d,
heloFallback=True,
requireAuthentication=require_auth,
requireTransportSecurity=require_tls,
hostname=tlsname,
)
if force_tls:
factory = TLSMemoryBIOFactory(optionsForClientTLS(smtphost), True, factory)
factory = TLSMemoryBIOFactory(optionsForClientTLS(tlsname), True, factory)
endpoint = HostnameEndpoint(
reactor, smtphost, smtpport, timeout=30, bindAddress=None
@@ -148,6 +173,7 @@ class SendEmailHandler:
self._require_transport_security = hs.config.email.require_transport_security
self._enable_tls = hs.config.email.enable_smtp_tls
self._force_tls = hs.config.email.force_tls
self._tlsname = hs.config.email.email_tlsname
self._sendmail = _sendmail
@@ -227,4 +253,5 @@ class SendEmailHandler:
require_tls=self._require_transport_security,
enable_tls=self._enable_tls,
force_tls=self._force_tls,
tlsname=self._tlsname,
)
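A hedged sketch of calling `_sendmail` directly with the new parameter, using hypothetical values and assumed surrounding names: certificate validation expects "mail.example.com" even though we connect to a different host.
await _sendmail(
    reactor=reactor,
    smtphost="10.0.0.5",
    smtpport=465,
    from_addr="noreply@example.com",
    to_addr="alice@example.com",
    msg_bytes=b"Subject: test\r\n\r\nhello",
    username=b"noreply@example.com",
    password=b"secret",
    require_auth=True,
    require_tls=True,
    enable_tls=True,
    force_tls=True,
    tlsname="mail.example.com",
)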

View File

@@ -39,6 +39,7 @@ from synapse.logging.opentracing import (
trace,
)
from synapse.storage.databases.main.roommember import extract_heroes_from_room_summary
from synapse.storage.databases.main.state_deltas import StateDelta
from synapse.storage.databases.main.stream import PaginateFunction
from synapse.storage.roommember import (
MemberSummary,
@@ -48,6 +49,7 @@ from synapse.types import (
MutableStateMap,
PersistedEventPosition,
Requester,
RoomStreamToken,
SlidingSyncStreamToken,
StateMap,
StrCollection,
@@ -470,6 +472,64 @@ class SlidingSyncHandler:
return state_map
@trace
async def get_current_state_deltas_for_room(
self,
room_id: str,
room_membership_for_user_at_to_token: RoomsForUserType,
from_token: RoomStreamToken,
to_token: RoomStreamToken,
) -> List[StateDelta]:
"""
Get the state deltas between two tokens (> `from_token` and <= `to_token`),
taking the user's membership into account. If the user's membership is
LEAVE/BAN, we only return the state deltas up to and including their
LEAVE/BAN event.
"""
membership = room_membership_for_user_at_to_token.membership
# We don't know how to handle `membership` values other than these. The
# code below would need to be updated.
assert membership in (
Membership.JOIN,
Membership.INVITE,
Membership.KNOCK,
Membership.LEAVE,
Membership.BAN,
)
# People shouldn't see past their leave/ban event
if membership in (
Membership.LEAVE,
Membership.BAN,
):
to_bound = (
room_membership_for_user_at_to_token.event_pos.to_room_stream_token()
)
# If we are participating in the room, we can get the latest current state in
# the room
elif membership == Membership.JOIN:
to_bound = to_token
# We can only rely on the stripped state included in the invite/knock event
# itself so there will never be any state deltas to send down.
elif membership in (Membership.INVITE, Membership.KNOCK):
return []
else:
# We don't know how to handle this type of membership yet
#
# FIXME: We should use `assert_never` here but for some reason
# the exhaustive matching doesn't recognize the `Never` here.
# assert_never(membership)
raise AssertionError(
f"Unexpected membership {membership} that we don't know how to handle yet"
)
return await self.store.get_current_state_deltas_for_room(
room_id=room_id,
from_token=from_token,
to_token=to_bound,
)
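In short, the bound chosen above works out as follows (a summary of the logic, not new behaviour):
#   JOIN           -> deltas in (from_token, to_token]
#   LEAVE / BAN    -> deltas in (from_token, leave/ban event position]
#   INVITE / KNOCK -> []  (only the stripped state on the event itself is usable)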
@trace
async def get_room_sync_data(
self,
@@ -755,13 +815,19 @@ class SlidingSyncHandler:
stripped_state = []
if invite_or_knock_event.membership == Membership.INVITE:
stripped_state.extend(
invite_or_knock_event.unsigned.get("invite_room_state", [])
invite_state = invite_or_knock_event.unsigned.get(
"invite_room_state", []
)
if not isinstance(invite_state, list):
invite_state = []
stripped_state.extend(invite_state)
elif invite_or_knock_event.membership == Membership.KNOCK:
stripped_state.extend(
invite_or_knock_event.unsigned.get("knock_room_state", [])
)
knock_state = invite_or_knock_event.unsigned.get("knock_room_state", [])
if not isinstance(knock_state, list):
knock_state = []
stripped_state.extend(knock_state)
stripped_state.append(strip_event(invite_or_knock_event))
@@ -790,8 +856,9 @@ class SlidingSyncHandler:
# TODO: Limit the number of state events we're about to send down
# for the room; if it's too many we should change this to an
# `initial=True`?
deltas = await self.store.get_current_state_deltas_for_room(
deltas = await self.get_current_state_deltas_for_room(
room_id=room_id,
room_membership_for_user_at_to_token=room_membership_for_user_at_to_token,
from_token=from_bound,
to_token=to_token.room_key,
)
@@ -955,15 +1022,21 @@ class SlidingSyncHandler:
and state_key == StateValues.LAZY
):
lazy_load_room_members = True
# Everyone in the timeline is relevant
#
# FIXME: We probably also care about invite, ban, kick, targets, etc
# but the spec only mentions "senders".
timeline_membership: Set[str] = set()
if timeline_events is not None:
for timeline_event in timeline_events:
# Anyone who sent a message is relevant
timeline_membership.add(timeline_event.sender)
# We also care about invite, ban, kick, targets,
# etc.
if timeline_event.type == EventTypes.Member:
timeline_membership.add(
timeline_event.state_key
)
# Update the required state filter so we pick up the new
# membership
for user_id in timeline_membership:

View File

@@ -43,7 +43,7 @@ from typing_extensions import Protocol
from twisted.web.iweb import IRequest
from twisted.web.server import Request
from synapse.api.constants import LoginType
from synapse.api.constants import LoginType, ProfileFields
from synapse.api.errors import Codes, NotFoundError, RedirectException, SynapseError
from synapse.config.sso import SsoAttributeRequirement
from synapse.handlers.device import DeviceHandler
@@ -813,9 +813,10 @@ class SsoHandler:
# bail if user already has the same avatar
profile = await self._profile_handler.get_profile(user_id)
if profile["avatar_url"] is not None:
server_name = profile["avatar_url"].split("/")[-2]
media_id = profile["avatar_url"].split("/")[-1]
if ProfileFields.AVATAR_URL in profile:
avatar_url_parts = profile[ProfileFields.AVATAR_URL].split("/")
server_name = avatar_url_parts[-2]
media_id = avatar_url_parts[-1]
if self._is_mine_server_name(server_name):
media = await self._media_repo.store.get_local_media(media_id) # type: ignore[has-type]
if media is not None and upload_name == media.upload_name:

View File

@@ -26,7 +26,13 @@ from typing import TYPE_CHECKING, List, Optional, Set, Tuple
from twisted.internet.interfaces import IDelayedCall
import synapse.metrics
from synapse.api.constants import EventTypes, HistoryVisibility, JoinRules, Membership
from synapse.api.constants import (
EventTypes,
HistoryVisibility,
JoinRules,
Membership,
ProfileFields,
)
from synapse.api.errors import Codes, SynapseError
from synapse.handlers.state_deltas import MatchChange, StateDeltasHandler
from synapse.metrics.background_process_metrics import run_as_background_process
@@ -161,7 +167,7 @@ class UserDirectoryHandler(StateDeltasHandler):
non_spammy_users = []
for user in results["results"]:
if not await self._spam_checker_module_callbacks.check_username_for_spam(
user
user, user_id
):
non_spammy_users.append(user)
results["results"] = non_spammy_users
@@ -756,6 +762,10 @@ class UserDirectoryHandler(StateDeltasHandler):
await self.store.update_profile_in_user_dir(
user_id,
display_name=non_null_str_or_none(profile.get("displayname")),
avatar_url=non_null_str_or_none(profile.get("avatar_url")),
display_name=non_null_str_or_none(
profile.get(ProfileFields.DISPLAYNAME)
),
avatar_url=non_null_str_or_none(
profile.get(ProfileFields.AVATAR_URL)
),
)

View File

@@ -41,7 +41,7 @@ from canonicaljson import encode_canonical_json
from netaddr import AddrFormatError, IPAddress, IPSet
from prometheus_client import Counter
from typing_extensions import Protocol
from zope.interface import implementer, provider
from zope.interface import implementer
from OpenSSL import SSL
from OpenSSL.SSL import VERIFY_NONE
@@ -225,7 +225,7 @@ class _IPBlockingResolver:
recv.addressResolved(address)
recv.resolutionComplete()
@provider(IResolutionReceiver)
@implementer(IResolutionReceiver)
class EndpointReceiver:
@staticmethod
def resolutionBegan(resolutionInProgress: IHostResolution) -> None:
@@ -239,8 +239,9 @@ class _IPBlockingResolver:
def resolutionComplete() -> None:
_callback()
endpoint_receiver_wrapper = EndpointReceiver()
self._reactor.nameResolver.resolveHostName(
EndpointReceiver, hostname, portNumber=portNumber
endpoint_receiver_wrapper, hostname, portNumber=portNumber
)
return recv
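The distinction matters because `resolveHostName` expects an object that provides `IResolutionReceiver`; a minimal standalone zope.interface sketch (hypothetical interface) of `@provider` versus `@implementer`:
from zope.interface import Interface, implementer, provider

class IGreeter(Interface):
    def greet():
        """Return a greeting."""

@provider(IGreeter)
class ClassProvides:
    pass  # IGreeter.providedBy(ClassProvides) is True, but not of its instances

@implementer(IGreeter)
class InstanceProvides:
    def greet(self):
        return "hello"  # IGreeter.providedBy(InstanceProvides()) is True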

View File

@@ -21,7 +21,7 @@
import logging
import random
import re
from typing import Any, Collection, Dict, List, Optional, Sequence, Tuple
from typing import Any, Collection, Dict, List, Optional, Sequence, Tuple, Union
from urllib.parse import urlparse
from urllib.request import ( # type: ignore[attr-defined]
getproxies_environment,
@@ -351,7 +351,9 @@ def http_proxy_endpoint(
proxy: Optional[bytes],
reactor: IReactorCore,
tls_options_factory: Optional[IPolicyForHTTPS],
**kwargs: object,
timeout: float = 30,
bindAddress: Optional[Union[bytes, str, tuple[Union[bytes, str], int]]] = None,
attemptDelay: Optional[float] = None,
) -> Tuple[Optional[IStreamClientEndpoint], Optional[ProxyCredentials]]:
"""Parses an http proxy setting and returns an endpoint for the proxy
@@ -382,12 +384,15 @@ def http_proxy_endpoint(
# 3.9+) on scheme-less proxies, e.g. host:port.
scheme, host, port, credentials = parse_proxy(proxy)
proxy_endpoint = HostnameEndpoint(reactor, host, port, **kwargs)
proxy_endpoint = HostnameEndpoint(
reactor, host, port, timeout, bindAddress, attemptDelay
)
if scheme == b"https":
if tls_options_factory:
tls_options = tls_options_factory.creatorForNetloc(host, port)
proxy_endpoint = wrapClientTLS(tls_options, proxy_endpoint)
wrapped_proxy_endpoint = wrapClientTLS(tls_options, proxy_endpoint)
return wrapped_proxy_endpoint, credentials
else:
raise RuntimeError(
f"No TLS options for a https connection via proxy {proxy!s}"

View File

@@ -89,7 +89,7 @@ class ReplicationEndpointFactory:
location_config.port,
)
if scheme == "https":
endpoint = wrapClientTLS(
wrapped_endpoint = wrapClientTLS(
# The 'port' argument below isn't actually used by the function
self.context_factory.creatorForNetloc(
location_config.host.encode("utf-8"),
@@ -97,6 +97,8 @@ class ReplicationEndpointFactory:
),
endpoint,
)
return wrapped_endpoint
return endpoint
elif isinstance(location_config, InstanceUnixLocationConfig):
return UNIXClientEndpoint(self.reactor, location_config.path)

View File

@@ -21,6 +21,7 @@
import contextlib
import logging
import time
from http import HTTPStatus
from typing import TYPE_CHECKING, Any, Generator, Optional, Tuple, Union
import attr
@@ -139,6 +140,41 @@ class SynapseRequest(Request):
self.synapse_site.site_tag,
)
# Twisted machinery: this method is called by the Channel once the full request has
# been received, to dispatch the request to a resource.
#
# We're patching Twisted to bail out early when we see someone trying to upload
# `multipart/form-data`, so we can avoid Twisted parsing the entire request body
# into memory (a problem specific to this `Content-Type`). This protects us from
# an attacker uploading something bigger than the available RAM and crashing the
# server with a `MemoryError`, or carefully blocking just enough resources to
# cause all other requests to fail.
#
# FIXME: This can be removed once Twisted releases a fix and we update to a
# version that is patched
def requestReceived(self, command: bytes, path: bytes, version: bytes) -> None:
if command == b"POST":
ctype = self.requestHeaders.getRawHeaders(b"content-type")
if ctype and b"multipart/form-data" in ctype[0]:
self.method, self.uri = command, path
self.clientproto = version
self.code = HTTPStatus.UNSUPPORTED_MEDIA_TYPE.value
self.code_message = bytes(
HTTPStatus.UNSUPPORTED_MEDIA_TYPE.phrase, "ascii"
)
self.responseHeaders.setRawHeaders(b"content-length", [b"0"])
logger.warning(
"Aborting connection from %s because `content-type: multipart/form-data` is unsupported: %s %s",
self.client,
command,
path,
)
self.write(b"")
self.loseConnection()
return
return super().requestReceived(command, path, version)
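A hypothetical client-side illustration (using the third-party requests library; URL and payload assumed) of what such an upload now sees:
import requests

resp = requests.post(
    "https://synapse.example.com/_matrix/media/v3/upload",
    files={"file": ("cat.png", b"\x89PNG...")},  # requests encodes this as multipart/form-data
)
print(resp.status_code)  # 415, returned before the body is buffered server-side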
def handleContentChunk(self, data: bytes) -> None:
# we should have a `content` by now.
assert self.content, "handleContentChunk() called before gotLength()"

View File

@@ -20,13 +20,10 @@
#
import logging
from types import TracebackType
from typing import Optional, Type
from typing import Optional
from opentracing import Scope, ScopeManager, Span
import twisted
from synapse.logging.context import (
LoggingContext,
current_context,
@@ -112,9 +109,6 @@ class _LogContextScope(Scope):
"""
A custom opentracing scope, associated with a LogContext
* filters out _DefGen_Return exceptions which arise from calling
`defer.returnValue` in Twisted code
* When the scope is closed, the logcontext's active scope is reset to None.
and - if enter_logcontext was set - the logcontext is finished too.
"""
@@ -146,17 +140,6 @@ class _LogContextScope(Scope):
self._finish_on_close = finish_on_close
self._enter_logcontext = enter_logcontext
def __exit__(
self,
exc_type: Optional[Type[BaseException]],
value: Optional[BaseException],
traceback: Optional[TracebackType],
) -> None:
if exc_type == twisted.internet.defer._DefGen_Return:
# filter out defer.returnValue() calls
exc_type = value = traceback = None
super().__exit__(exc_type, value, traceback)
def __str__(self) -> str:
return f"Scope<{self.span}>"

View File

@@ -67,6 +67,11 @@ class ThumbnailError(Exception):
class Thumbnailer:
FORMATS = {"image/jpeg": "JPEG", "image/png": "PNG"}
# Which image formats we allow Pillow to open.
# This should intentionally be kept restrictive, because the decoder of any
# format in this list becomes part of our trusted computing base.
PILLOW_FORMATS = ("jpeg", "png", "webp", "gif")
@staticmethod
def set_limits(max_image_pixels: int) -> None:
Image.MAX_IMAGE_PIXELS = max_image_pixels
@@ -76,7 +81,7 @@ class Thumbnailer:
self._closed = False
try:
self.image = Image.open(input_path)
self.image = Image.open(input_path, formats=self.PILLOW_FORMATS)
except OSError as e:
# If an error occurs opening the image, a thumbnail won't be able to
# be generated.
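A standalone sketch (file name hypothetical) of the effect of the allow-list: Pillow refuses to try any other decoder and raises UnidentifiedImageError, which subclasses OSError and is therefore still caught by the handler above.
from PIL import Image, UnidentifiedImageError

try:
    Image.open("upload.tiff", formats=("jpeg", "png", "webp", "gif"))
except UnidentifiedImageError:
    print("rejected: not one of the allow-listed formats")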

View File

@@ -45,6 +45,7 @@ from twisted.internet.interfaces import IDelayedCall
from twisted.web.resource import Resource
from synapse.api import errors
from synapse.api.constants import ProfileFields
from synapse.api.errors import SynapseError
from synapse.api.presence import UserPresenceState
from synapse.config import ConfigError
@@ -1086,7 +1087,10 @@ class ModuleApi:
content = {}
# Set the profile if not already done by the module.
if "avatar_url" not in content or "displayname" not in content:
if (
ProfileFields.AVATAR_URL not in content
or ProfileFields.DISPLAYNAME not in content
):
try:
# Try to fetch the user's profile.
profile = await self._hs.get_profile_handler().get_profile(
@@ -1095,8 +1099,8 @@ class ModuleApi:
except SynapseError as e:
# If the profile couldn't be found, use default values.
profile = {
"displayname": target_user_id.localpart,
"avatar_url": None,
ProfileFields.DISPLAYNAME: target_user_id.localpart,
ProfileFields.AVATAR_URL: None,
}
if e.code != 404:
@@ -1109,11 +1113,9 @@ class ModuleApi:
)
# Set the profile where it needs to be set.
if "avatar_url" not in content:
content["avatar_url"] = profile["avatar_url"]
if "displayname" not in content:
content["displayname"] = profile["displayname"]
for field_name in [ProfileFields.AVATAR_URL, ProfileFields.DISPLAYNAME]:
if field_name not in content and field_name in profile:
content[field_name] = profile[field_name]
event_id, _ = await self._hs.get_room_member_handler().update_membership(
requester=requester,

View File

@@ -19,6 +19,7 @@
#
#
import functools
import inspect
import logging
from typing import (
@@ -31,6 +32,7 @@ from typing import (
Optional,
Tuple,
Union,
cast,
)
# `Literal` appears with Python 3.8.
@@ -168,7 +170,10 @@ USER_MAY_PUBLISH_ROOM_CALLBACK = Callable[
]
],
]
CHECK_USERNAME_FOR_SPAM_CALLBACK = Callable[[UserProfile], Awaitable[bool]]
CHECK_USERNAME_FOR_SPAM_CALLBACK = Union[
Callable[[UserProfile], Awaitable[bool]],
Callable[[UserProfile, str], Awaitable[bool]],
]
LEGACY_CHECK_REGISTRATION_FOR_SPAM_CALLBACK = Callable[
[
Optional[dict],
@@ -293,6 +298,7 @@ def load_legacy_spam_checkers(hs: "synapse.server.HomeServer") -> None:
"Bad signature for callback check_registration_for_spam",
)
@functools.wraps(wrapped_func)
def run(*args: Any, **kwargs: Any) -> Awaitable:
# Assertion required because mypy can't prove we won't change `f`
# back to `None`. See
@@ -716,7 +722,9 @@ class SpamCheckerModuleApiCallbacks:
return self.NOT_SPAM
async def check_username_for_spam(self, user_profile: UserProfile) -> bool:
async def check_username_for_spam(
self, user_profile: UserProfile, requester_id: str
) -> bool:
"""Checks if a user ID or display name are considered "spammy" by this server.
If the server considers a username spammy, then it will not be included in
@@ -727,15 +735,33 @@ class SpamCheckerModuleApiCallbacks:
* user_id
* display_name
* avatar_url
requester_id: The user ID of the user making the user directory search request.
Returns:
True if the user is spammy.
"""
for callback in self._check_username_for_spam_callbacks:
with Measure(self.clock, f"{callback.__module__}.{callback.__qualname__}"):
checker_args = inspect.signature(callback)
# Make a copy of the user profile object to ensure the spam checker cannot
# modify it.
res = await delay_cancellation(callback(user_profile.copy()))
# Also ensure backwards compatibility with spam checker callbacks
# that don't expect the requester_id argument.
if len(checker_args.parameters) == 2:
callback_with_requester_id = cast(
Callable[[UserProfile, str], Awaitable[bool]], callback
)
res = await delay_cancellation(
callback_with_requester_id(user_profile.copy(), requester_id)
)
else:
callback_without_requester_id = cast(
Callable[[UserProfile], Awaitable[bool]], callback
)
res = await delay_cancellation(
callback_without_requester_id(user_profile.copy())
)
if res:
return True
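A minimal module sketch (module-side names assumed) showing both callback shapes the dispatch above accepts; either form can be passed to `register_spam_checker_callbacks(check_username_for_spam=...)`:
async def check_username_for_spam_legacy(user_profile):
    # Old one-argument form, still supported via the signature inspection above.
    return "spam" in (user_profile.get("display_name") or "")

async def check_username_for_spam(user_profile, requester_id):
    # New form: requester_id is the MXID of the user running the directory search.
    return requester_id.endswith(":blocked.example.org")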

View File

@@ -371,7 +371,7 @@ class BulkPushRuleEvaluator:
"Deferred[Tuple[int, Tuple[dict, Optional[int]], Dict[str, Dict[str, JsonValue]], Mapping[str, ProfileInfo]]]",
gather_results(
(
run_in_background( # type: ignore[call-arg]
run_in_background( # type: ignore[call-overload]
self.store.get_number_joined_users_in_room,
event.room_id, # type: ignore[arg-type]
),
@@ -382,10 +382,10 @@ class BulkPushRuleEvaluator:
event_id_to_event,
),
run_in_background(self._related_events, event),
run_in_background( # type: ignore[call-arg]
run_in_background( # type: ignore[call-overload]
self.store.get_subset_users_in_room_with_profiles,
event.room_id, # type: ignore[arg-type]
rules_by_user.keys(), # type: ignore[arg-type]
event.room_id,
rules_by_user.keys(),
),
),
consumeErrors=True,

View File

@@ -127,6 +127,11 @@ class HttpPusher(Pusher):
if self.data is None:
raise PusherConfigException("'data' key can not be null for HTTP pusher")
# Check if badge counts should be disabled for this push gateway
self.disable_badge_count = self.hs.config.experimental.msc4076_enabled and bool(
self.data.get("org.matrix.msc4076.disable_badge_count", False)
)
self.name = "%s/%s/%s" % (
pusher_config.user_name,
pusher_config.app_id,
@@ -461,9 +466,10 @@ class HttpPusher(Pusher):
content: JsonDict = {
"event_id": event.event_id,
"room_id": event.room_id,
"counts": {"unread": badge},
"prio": priority,
}
if not self.disable_badge_count:
content["counts"] = {"unread": badge}
# event_id_only doesn't include the tweaks, so override them.
tweaks = {}
else:
@@ -478,11 +484,11 @@ class HttpPusher(Pusher):
"type": event.type,
"sender": event.user_id,
"prio": priority,
"counts": {
"unread": badge,
# 'missed_calls': 2
},
}
if not self.disable_badge_count:
content["counts"] = {
"unread": badge,
}
if event.type == "m.room.member" and event.is_state():
content["membership"] = event.content["membership"]
content["user_is_target"] = event.state_key == self.user_id

View File

@@ -74,9 +74,13 @@ async def get_context_for_event(
room_state = []
if ev.content.get("membership") == Membership.INVITE:
room_state = ev.unsigned.get("invite_room_state", [])
invite_room_state = ev.unsigned.get("invite_room_state", [])
if isinstance(invite_room_state, list):
room_state = invite_room_state
elif ev.content.get("membership") == Membership.KNOCK:
room_state = ev.unsigned.get("knock_room_state", [])
knock_room_state = ev.unsigned.get("knock_room_state", [])
if isinstance(knock_room_state, list):
room_state = knock_room_state
# Ideally we'd reuse the logic in `calculate_room_name`, but that gets
# complicated to handle partial events vs pulling events from the DB.

View File

@@ -495,7 +495,7 @@ class LockReleasedCommand(Command):
class NewActiveTaskCommand(_SimpleCommand):
"""Sent to inform instance handling background tasks that a new active task is available to run.
"""Sent to inform instance handling background tasks that a new task is ready to run.
Format::

View File

@@ -727,7 +727,7 @@ class ReplicationCommandHandler:
) -> None:
"""Called when get a new NEW_ACTIVE_TASK command."""
if self._task_scheduler:
self._task_scheduler.launch_task_by_id(cmd.data)
self._task_scheduler.on_new_task(cmd.data)
def new_connection(self, connection: IReplicationConnection) -> None:
"""Called when we have a new connection."""

View File

@@ -29,7 +29,7 @@ from synapse.rest.client import (
account_validity,
appservice_ping,
auth,
auth_issuer,
auth_metadata,
capabilities,
delayed_events,
devices,
@@ -121,7 +121,7 @@ CLIENT_SERVLET_FUNCTIONS: Tuple[RegisterServletsFunc, ...] = (
mutual_rooms.register_servlets,
login_token_request.register_servlets,
rendezvous.register_servlets,
auth_issuer.register_servlets,
auth_metadata.register_servlets,
)
SERVLET_GROUPS: Dict[str, Iterable[RegisterServletsFunc]] = {
@@ -187,7 +187,7 @@ class ClientRestResource(JsonResource):
mutual_rooms.register_servlets,
login_token_request.register_servlets,
rendezvous.register_servlets,
auth_issuer.register_servlets,
auth_metadata.register_servlets,
]:
continue

View File

@@ -107,6 +107,8 @@ from synapse.rest.admin.users import (
UserAdminServlet,
UserByExternalId,
UserByThreePid,
UserInvitesCount,
UserJoinedRoomCount,
UserMembershipRestServlet,
UserRegisterServlet,
UserReplaceMasterCrossSigningKeyRestServlet,
@@ -323,6 +325,8 @@ def register_servlets(hs: "HomeServer", http_server: HttpServer) -> None:
UserByThreePid(hs).register(http_server)
RedactUser(hs).register(http_server)
RedactUserStatus(hs).register(http_server)
UserInvitesCount(hs).register(http_server)
UserJoinedRoomCount(hs).register(http_server)
DeviceRestServlet(hs).register(http_server)
DevicesRestServlet(hs).register(http_server)
@@ -332,8 +336,7 @@ def register_servlets(hs: "HomeServer", http_server: HttpServer) -> None:
BackgroundUpdateRestServlet(hs).register(http_server)
BackgroundUpdateStartJobRestServlet(hs).register(http_server)
ExperimentalFeaturesRestServlet(hs).register(http_server)
if hs.config.experimental.msc3823_account_suspension:
SuspendAccountRestServlet(hs).register(http_server)
SuspendAccountRestServlet(hs).register(http_server)
def register_servlets_for_client_rest_resource(

View File

@@ -50,8 +50,10 @@ class EventReportsRestServlet(RestServlet):
The parameters `from` and `limit` are required only for pagination.
By default, a `limit` of 100 is used.
The parameter `dir` can be used to define the order of results.
The parameter `user_id` can be used to filter by user id.
The parameter `room_id` can be used to filter by room id.
The `user_id` query parameter filters by the user ID of the reporter of the event.
The `room_id` query parameter filters by room id.
The `event_sender_user_id` query parameter can be used to filter by the user id
of the sender of the reported event.
Returns:
A list of reported events and an integer representing the total number of
reported events that exist given this query
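For example, assuming the existing /_synapse/admin/v1/event_reports admin path, a request like the following (hypothetical user ID) returns only reports whose reported event was sent by that user:
# GET /_synapse/admin/v1/event_reports?event_sender_user_id=@spammer:example.com&limit=10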
@@ -71,6 +73,7 @@ class EventReportsRestServlet(RestServlet):
direction = parse_enum(request, "dir", Direction, Direction.BACKWARDS)
user_id = parse_string(request, "user_id")
room_id = parse_string(request, "room_id")
event_sender_user_id = parse_string(request, "event_sender_user_id")
if start < 0:
raise SynapseError(
@@ -87,7 +90,7 @@ class EventReportsRestServlet(RestServlet):
)
event_reports, total = await self._store.get_event_reports_paginate(
start, limit, direction, user_id, room_id
start, limit, direction, user_id, room_id, event_sender_user_id
)
ret = {"event_reports": event_reports, "total": total}
if (start + limit) < total:

View File

@@ -23,6 +23,7 @@ from http import HTTPStatus
from typing import TYPE_CHECKING, List, Optional, Tuple, cast
import attr
from immutabledict import immutabledict
from synapse.api.constants import Direction, EventTypes, JoinRules, Membership
from synapse.api.errors import AuthError, Codes, NotFoundError, SynapseError
@@ -463,7 +464,18 @@ class RoomStateRestServlet(RestServlet):
if not room:
raise NotFoundError("Room not found")
event_ids = await self._storage_controllers.state.get_current_state_ids(room_id)
state_filter = None
type = parse_string(request, "type")
if type:
state_filter = StateFilter(
types=immutabledict({type: None}),
include_others=False,
)
event_ids = await self._storage_controllers.state.get_current_state_ids(
room_id, state_filter
)
events = await self.store.get_events(event_ids.values())
now = self.clock.time_msec()
room_state = await self._event_serializer.serialize_events(events.values(), now)
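For example, assuming the existing admin room state path, a request like the following (hypothetical room ID) returns only the current m.room.member events rather than the full room state:
# GET /_synapse/admin/v1/rooms/!abc123:example.com/state?type=m.room.member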

Some files were not shown because too many files have changed in this diff Show More