Compare commits

...

67 Commits

Author SHA1 Message Date
Erik Johnston
90e1df262d Handle SQLite 2024-06-26 13:14:48 +01:00
Erik Johnston
b8b6fe0da3 Remove debug logging 2024-06-26 13:14:42 +01:00
Erik Johnston
fe9fa90af4 Add a cache to auth links 2024-05-17 10:35:03 +01:00
Erik Johnston
4ffe5a4459 Up batch size 2024-05-14 10:26:31 +01:00
Erik Johnston
b58ed63884 Go faster stripes 2024-05-13 13:10:16 +01:00
Erik Johnston
2d4cea496b Debug logging 2024-05-13 09:54:32 +01:00
Erik Johnston
ca79b4d87d Use a sortedset instead 2024-05-09 10:58:00 +01:00
Erik Johnston
202a09cdb3 Newsfile 2024-05-08 16:05:24 +01:00
Erik Johnston
db25e30a25 Perf improvement to getting auth chains 2024-05-08 16:04:35 +01:00
Timshel
34a8652366 Optional whitespace support in Authorization (#1350) (#17145)
Co-authored-by: Andrew Morgan <1342360+anoadragon453@users.noreply.github.com>
2024-05-08 13:56:16 +00:00
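
For readers unfamiliar with the header format touched by the commit above, here is a purely illustrative sketch (not Synapse's actual parser) of reading an `X-Matrix`-style `Authorization` header while tolerating optional whitespace after the separating commas; the function name and example values are invented for the demonstration.

```python
import re
from typing import Dict


def parse_x_matrix(header: str) -> Dict[str, str]:
    """Parse 'X-Matrix origin=...,key=...,sig=...' (simplified: no commas inside values)."""
    scheme, _, params = header.partition(" ")
    if scheme != "X-Matrix":
        raise ValueError(f"unexpected auth scheme: {scheme!r}")
    result: Dict[str, str] = {}
    # Allow, but do not require, whitespace after each separating comma.
    for part in re.split(r",\s*", params):
        name, _, value = part.strip().partition("=")
        result[name] = value.strip('"')
    return result


print(parse_x_matrix('X-Matrix origin=origin.example, key="ed25519:1", sig="c2ln"'))
# {'origin': 'origin.example', 'key': 'ed25519:1', 'sig': 'c2ln'}
```

HTTP list syntax permits optional whitespace around the separating commas, which is why a strict comma-only split is too restrictive.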
Erik Johnston
414ddcd457 Update PyO3 to 0.21 (#17162)
This version change requires a migration to a new API. See
https://pyo3.rs/v0.21.2/migration#from-020-to-021

This will fix the annoying warnings added when using the recent rust
nightly:

> warning: non-local `impl` definition, they should be avoided as they
go against expectation
2024-05-08 14:30:06 +01:00
Andrew Morgan
4d408cb4dd Note preset behaviour in autocreate_auto_join_room_preset docs (#17150) 2024-05-08 13:05:10 +01:00
Hugh Nimmo-Smith
212f150208 Add note about MSC3886 being closed (#17151) 2024-05-08 12:49:32 +01:00
Olivier 'reivilibre'
4c6e78fa14 Merge branch 'release-v1.107' into develop 2024-05-07 18:52:15 +01:00
Jacob Sánchez
1b155362ca Add note about external_ids for User Admin API in documentation (#17139) 2024-05-07 16:38:29 +00:00
Olivier 'reivilibre'
522a40c4de Tweak changelog 2024-05-07 17:25:47 +01:00
Olivier 'reivilibre'
dcd03d3b15 1.107.0rc1 2024-05-07 16:30:07 +01:00
dependabot[bot]
438bc23560 Bump serde from 1.0.199 to 1.0.200 (#17161) 2024-05-07 10:35:37 +01:00
dependabot[bot]
cf30cfe5d1 Bump pydantic from 2.7.0 to 2.7.1 (#17160) 2024-05-07 10:35:24 +01:00
dependabot[bot]
1726b49457 Bump types-pillow from 10.2.0.20240415 to 10.2.0.20240423 (#17159) 2024-05-07 10:34:56 +01:00
dependabot[bot]
792cfe7ba6 Bump lxml from 5.1.0 to 5.2.1 (#17158) 2024-05-07 10:34:46 +01:00
dependabot[bot]
c3682ff668 Bump jsonschema from 4.21.1 to 4.22.0 (#17157) 2024-05-07 10:34:30 +01:00
Erik Johnston
3e6ee8ff88 Add optimisation to StreamChangeCache (#17130)
When there have been lots of changes compared with the number of
entities, we can do a fast(er) path.

Locally I ran some benchmarks, and this comparison seems to give the
best determination of which method to use.
2024-05-06 12:56:52 +01:00
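
As a rough illustration of the heuristic described in the commit message above, here is a toy sketch (an assumed structure, not Synapse's actual `StreamChangeCache` implementation): when more changes have happened since the given stream position than there are entities being queried, it is cheaper to probe each queried entity's latest change directly than to walk every change.

```python
import bisect
from typing import Dict, Iterable, List, Set, Tuple


class ToyStreamChangeCache:
    """Toy model: tracks the stream position of each entity's latest change."""

    def __init__(self) -> None:
        self._latest_change: Dict[str, int] = {}   # entity -> latest stream position
        self._changes: List[Tuple[int, str]] = []  # (stream position, entity), sorted

    def record_change(self, entity: str, stream_pos: int) -> None:
        # Assumes positions only increase; stale entries are kept (fine for a toy).
        self._latest_change[entity] = stream_pos
        bisect.insort(self._changes, (stream_pos, entity))

    def get_entities_changed(self, entities: Iterable[str], stream_pos: int) -> Set[str]:
        entities = set(entities)
        # Index of the first change strictly after stream_pos (cheap binary search).
        first_newer = bisect.bisect_left(self._changes, (stream_pos + 1,))
        n_changes = len(self._changes) - first_newer
        if n_changes > len(entities):
            # Fast(er) path: many changes but few entities -> probe each entity directly.
            return {e for e in entities if self._latest_change.get(e, 0) > stream_pos}
        # Otherwise walk the (short) tail of recent changes.
        return {e for _, e in self._changes[first_newer:] if e in entities}


cache = ToyStreamChangeCache()
cache.record_change("@alice:example.org", 5)
cache.record_change("@bob:example.org", 9)
print(cache.get_entities_changed({"@alice:example.org", "@carol:example.org"}, 6))
# set(): neither queried entity changed after position 6
```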
Erik Johnston
7c9ac01eb5 Fix bug where StreamChangeCache would not respect cache factors (#17152)
Annoyingly mypy didn't pick up this typo.
2024-05-03 18:00:08 +01:00
Erik Johnston
3818597751 Fix lint.sh script (#17148)
Broke in #17073
2024-05-03 17:12:03 +01:00
Andrew Morgan
3aadf43122 Bump pillow from 10.2.0 to 10.3.0 (#17146) 2024-05-03 10:55:59 +01:00
jahway603
5b6a75935e upgrade.md: Bump minimum Rust version to 1.66.0 (element-hq#17079) (#17140) 2024-05-02 14:57:29 +01:00
Benjamin Bouvier
c0ea2bf800 synapse complement image: hardcode enabling msc3266 (#17105)
This is an alternative to
https://github.com/matrix-org/matrix-rust-sdk/issues/3340 where we don't
need to change our CI setup too much in the Rust SDK repository, and
still can test MSC3266.
2024-05-02 11:48:27 +01:00
Shay
37558d5e4c Add support for MSC3823 - Account Suspension (#17051) 2024-05-01 17:45:17 +01:00
Erik Johnston
0b358f8643 Drop sphinx docs (#17073)
It is broken, and we only seemed to have been building it for the
federation sender.

Closes https://github.com/element-hq/synapse/issues/16804
2024-05-01 16:01:50 +00:00
Ben Banfield-Zanin
7254015665 Correct licensing metadata on the Docker image (#17141) 2024-05-01 16:23:42 +01:00
Andrew Morgan
e84a493f41 Merge branch 'master' into develop 2024-04-30 14:42:45 +01:00
Richard van der Hoff
07232e27a8 Enable complement tests for MSC4115 support (#17137)
Follow-up to #17137 and
https://github.com/matrix-org/complement/pull/722
2024-04-30 13:57:20 +01:00
Andrew Morgan
e26673fe97 1.106.0 2024-04-30 11:51:50 +01:00
devonh
7ab0f630da Apply user email & picture during OIDC registration if present & selected (#17120)
This change will apply the `email` & `picture` provided by OIDC to the
new user account when registering a new user via OIDC. If the user is
directed to the account details form, this change makes sure they have
been selected before applying them, otherwise they are omitted. In
particular, this change ensures the values are carried through when
Synapse has consent configured, and the redirect to the consent form/s
are followed.

I have tested everything manually. Including: 
- with/without consent configured
- allowing/not allowing the use of email/avatar (via
`sso_auth_account_details.html`)
- with/without automatic account detail population (by un/commenting the
`localpart_template` option in synapse config).

2024-04-29 15:23:05 +00:00
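
A minimal sketch of the gating logic described in the commit message above, under assumed inputs (a `userinfo` mapping from the OIDC provider and the set of fields the user left selected on the account-details form); this is not Synapse's actual code.

```python
from typing import Dict, Optional, Set


def attributes_to_apply(
    userinfo: Dict[str, str], selected: Optional[Set[str]]
) -> Dict[str, str]:
    """Profile attributes to copy onto the newly registered account.

    `selected` is None when the user never saw the account-details form
    (fully automatic registration); otherwise only fields that were both
    supplied by the provider and left selected by the user are applied.
    """
    result: Dict[str, str] = {}
    for field in ("email", "picture"):
        value = userinfo.get(field)
        if value and (selected is None or field in selected):
            result[field] = value
    return result


print(attributes_to_apply({"email": "a@example.org", "picture": "mxc://x/y"}, {"email"}))
# {'email': 'a@example.org'}  -- picture was supplied but not selected, so omitted
```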
Richard van der Hoff
b548f7803a Add support for MSC4115 (#17104)
Co-authored-by: Andrew Morgan <1342360+anoadragon453@users.noreply.github.com>
2024-04-29 15:22:13 +01:00
Andrew Morgan
758aec6b34 Update tornado 6.2 -> 6.4 (#17131) 2024-04-29 14:33:25 +01:00
Richard van der Hoff
c897ac63e9 Ensure that incoming to-device messages are not dropped (#17127)
... when workers are unreachable, etc.

Fixes https://github.com/element-hq/synapse/issues/17117.

The general principle is just to make sure that we propagate any
exceptions to the JsonResource, so that we return an error code to the
sending server. That means that the sending server no longer considers
the message safely sent, so it will retry later.

In the issue, Erik mentions that an alternative solution would be to
persist the to-device messages into a table so that they can be retried.
This might be an improvement for performance, but even if we did that,
we still need this mechanism, since we might be unable to reach the
database. So, if we want to do that, it can be a later follow-up.

---------

Co-authored-by: Erik Johnston <erik@matrix.org>
2024-04-29 14:11:00 +01:00
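
A minimal sketch of the general principle described above, assuming a hypothetical `deliver` callable; it is not the actual Synapse code. The point is that letting the exception propagate turns the federation `/send` request into an error response, so the origin server retries later instead of considering the message delivered.

```python
import logging

logger = logging.getLogger(__name__)


async def handle_incoming_to_device(messages, deliver):
    """Deliver each message; any failure is re-raised rather than swallowed."""
    for message in messages:
        try:
            await deliver(message)
        except Exception:
            # Logging and continuing here would silently drop the message:
            # the origin server would believe delivery succeeded and never
            # retry. Re-raising makes the whole transaction return an error,
            # so the origin keeps the message and resends it later.
            logger.exception("Failed to handle to-device message")
            raise
```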
Patrick Cloke
38bc7a009d Declare support for Matrix v1.10. (#17082)
Pretty straightforward. 😄 

Fixes #17021
2024-04-29 14:09:03 +01:00
dependabot[bot]
6a275828c8 Bump types-setuptools from 69.0.0.20240125 to 69.5.0.20240423 (#17134) 2024-04-29 14:06:14 +01:00
dependabot[bot]
6e373468a4 Bump idna from 3.6 to 3.7 (#17136)
Picks up the upstream fix for CVE-2024-3651 (specially crafted inputs to encode() could take an exceptionally long time to process).
2024-04-29 14:06:02 +01:00
dependabot[bot]
48ee17dc79 Bump twisted from 23.10.0 to 24.3.0 (#17135) 2024-04-29 14:05:53 +01:00
dependabot[bot]
f6437ca1c4 Bump serde from 1.0.198 to 1.0.199 (#17132) 2024-04-29 14:05:30 +01:00
dependabot[bot]
02bda250f8 Bump furo from 2024.1.29 to 2024.4.27 (#17133) 2024-04-29 14:05:24 +01:00
devonh
0fd6b269d3 Fix various typos in docs (#17114) 2024-04-26 18:10:45 +00:00
Andrew Morgan
89fc579329 Fix filtering of rooms when supplying the destination query parameter to /_synapse/admin/v1/federation/destinations/<destination>/rooms (#17077) 2024-04-26 10:52:24 +01:00
villepeh
9c91873922 Add RuntimeDirectory to matrix-synapse.service (#17084)
This makes it easy to store UNIX sockets with correct permissions. Those
would be located in /run/synapse, which is the directory used in many
examples in the Synapse configuration manual. Additionally, the directory
and sockets are deleted when Synapse is shut down.
2024-04-26 09:56:20 +01:00
Michael Telatynski
41fbe387d6 Improve error message for cross signing reset with MSC3861 enabled (#17121) 2024-04-26 09:54:30 +01:00
Amanda H. L. de Andrade Katz
90cc9e5b29 Rephrase enable_notifs configuration (#17116) 2024-04-26 09:52:58 +01:00
Andrew Ferrazzutti
516fd891ee Use recommended endpoint for MSC3266 requests (#17078)
Keep the existing endpoint for backwards compatibility

Signed-off-by: Andrew Ferrazzutti <andrewf@element.io>
2024-04-26 09:46:42 +01:00
Amanda H. L. de Andrade Katz
0ef2315a99 Update event_cache_size and global_factor configurations documentation (#17071) 2024-04-26 09:44:54 +01:00
Melvyn Laïly
59710437e4 Return the search terms as search highlights for SQLite instead of nothing (#17000)
Fixes https://github.com/element-hq/synapse/issues/16999 and
https://github.com/element-hq/element-android/pull/8729 by returning the
search terms as search highlights.
2024-04-26 09:43:52 +01:00
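
A small illustrative sketch of the fallback described above (a hypothetical helper, not Synapse's actual code): when the backend cannot compute real highlight snippets, echo the parsed search terms back so clients still have something to highlight.

```python
from typing import List, Optional


def search_highlights(search_term: str, backend_highlights: Optional[List[str]]) -> List[str]:
    if backend_highlights:
        # e.g. a backend that can compute proper snippets (PostgreSQL ts_headline).
        return backend_highlights
    # SQLite-style fallback: return the individual search terms themselves.
    return [word for word in search_term.split() if word]


print(search_highlights("hello world", None))  # ['hello', 'world']
```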
dependabot[bot]
9985aa6821 Bump serde_json from 1.0.115 to 1.0.116 (#17112) 2024-04-26 09:39:57 +01:00
dependabot[bot]
31742149d4 Bump serde from 1.0.197 to 1.0.198 (#17111) 2024-04-26 09:39:49 +01:00
dependabot[bot]
947e8a6cb0 Bump types-bleach from 6.1.0.1 to 6.1.0.20240331 (#17110) 2024-04-26 09:39:36 +01:00
dependabot[bot]
0d4d00a07c Bump pyicu from 2.12 to 2.13 (#17109) 2024-04-26 09:39:30 +01:00
dependabot[bot]
3166445514 Bump pydantic from 2.6.4 to 2.7.0 (#17107)
Bumps [pydantic](https://github.com/pydantic/pydantic) from 2.6.4 to
2.7.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/pydantic/pydantic/releases">pydantic's
releases</a>.</em></p>
<blockquote>
<h1>v2.7.0 (2024-04-11)</h1>
<p>The code released in v2.7.0 is practically identical to that of
v2.7.0b1.</p>
<h3>What's Changed</h3>
<h4>Packaging</h4>
<ul>
<li>Reorganize <code>pyproject.toml</code> sections by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/8899">#8899</a></li>
<li>Bump <code>pydantic-core</code> to <code>v2.18.1</code> by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/9211">#9211</a></li>
<li>Adopt <code>jiter</code> <code>v0.2.0</code> by <a
href="https://github.com/samuelcolvin"><code>@​samuelcolvin</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic-core/pull/1250">pydantic/pydantic-core#1250</a></li>
</ul>
<h4>New Features</h4>
<ul>
<li>Extract attribute docstrings from <code>FieldInfo.description</code>
by <a href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/6563">#6563</a></li>
<li>Add a <code>with_config</code> decorator to comply with typing spec
by <a href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/8611">#8611</a></li>
<li>Allow an optional separator splitting the value and unit of the
result of <code>ByteSize.human_readable</code> by <a
href="https://github.com/jks15satoshi"><code>@​jks15satoshi</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/8706">#8706</a></li>
<li>Add generic <code>Secret</code> base type by <a
href="https://github.com/conradogarciaberrotaran"><code>@​conradogarciaberrotaran</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/8519">#8519</a></li>
<li>Make use of <code>Sphinx</code> inventories for cross references in
docs by <a href="https://github.com/Viicos"><code>@​Viicos</code></a> in
<a
href="https://redirect.github.com/pydantic/pydantic/pull/8682">#8682</a></li>
<li>Add environment variable to disable plugins by <a
href="https://github.com/geospackle"><code>@​geospackle</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/8767">#8767</a></li>
<li>Add support for <code>deprecated</code> fields by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/8237">#8237</a></li>
<li>Allow <code>field_serializer('*')</code> by <a
href="https://github.com/ornariece"><code>@​ornariece</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/9001">#9001</a></li>
<li>Handle a case when <code>model_config</code> is defined as a model
property by <a
href="https://github.com/alexeyt101"><code>@​alexeyt101</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/9004">#9004</a></li>
<li>Update <code>create_model()</code> to support
<code>typing.Annotated</code> as input by <a
href="https://github.com/wannieman98"><code>@​wannieman98</code></a> in
<a
href="https://redirect.github.com/pydantic/pydantic/pull/8947">#8947</a></li>
<li>Add <code>ClickhouseDsn</code> support by <a
href="https://github.com/solidguy7"><code>@​solidguy7</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/9062">#9062</a></li>
<li>Add support for <code>re.Pattern[str]</code> to <code>pattern</code>
field by <a href="https://github.com/jag-k"><code>@​jag-k</code></a> in
<a
href="https://redirect.github.com/pydantic/pydantic/pull/9053">#9053</a></li>
<li>Support for <code>serialize_as_any</code> runtime setting by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/8830">#8830</a></li>
<li>Add support for <code>typing.Self</code> by <a
href="https://github.com/Youssefares"><code>@​Youssefares</code></a> in
<a
href="https://redirect.github.com/pydantic/pydantic/pull/9023">#9023</a></li>
<li>Ability to pass <code>context</code> to serialization by <a
href="https://github.com/ornariece"><code>@​ornariece</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/8965">#8965</a></li>
<li>Add feedback widget to docs with flarelytics integration by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/9129">#9129</a></li>
<li>Support for parsing partial JSON strings in Python by <a
href="https://github.com/samuelcolvin"><code>@​samuelcolvin</code></a>
in <a
href="https://redirect.github.com/pydantic/jiter/pull/66">pydantic/jiter#66</a></li>
</ul>
<p><strong>Finalized in v2.7.0, rather than v2.7.0b1:</strong></p>
<ul>
<li>Add support for field level number to str coercion option by <a
href="https://github.com/NeevCohen"><code>@​NeevCohen</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/9137">#9137</a></li>
<li>Update <code>warnings</code> parameter for serialization utilities
to allow raising a warning by <a
href="https://github.com/Lance-Drane"><code>@​Lance-Drane</code></a> in
<a
href="https://redirect.github.com/pydantic/pydantic/pull/9166">#9166</a></li>
</ul>
<h4>Changes</h4>
<ul>
<li>Correct docs, logic for <code>model_construct</code> behavior with
<code>extra</code> by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/8807">#8807</a></li>
<li>Improve error message for improper <code>RootModel</code> subclasses
by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/8857">#8857</a></li>
<li>Use <code>PEP570</code> syntax by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/8940">#8940</a></li>
<li>Add <code>enum</code> and <code>type</code> to the JSON schema for
single item literals by <a
href="https://github.com/dmontagu"><code>@​dmontagu</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/8944">#8944</a></li>
<li>Deprecate <code>update_json_schema</code> internal function by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/9125">#9125</a></li>
<li>Serialize duration to hour minute second, instead of just seconds by
<a href="https://github.com/kakilangit"><code>@​kakilangit</code></a> in
<a
href="https://redirect.github.com/pydantic/speedate/pull/50">pydantic/speedate#50</a></li>
<li>Trimming str before parsing to int and float by <a
href="https://github.com/hungtsetse"><code>@​hungtsetse</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-core/pull/1203">pydantic/pydantic-core#1203</a></li>
</ul>
<h4>Performance</h4>
<ul>
<li><code>enum</code> validator improvements by <a
href="https://github.com/samuelcolvin"><code>@​samuelcolvin</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/9045">#9045</a></li>
<li>Move <code>enum</code> validation and serialization to Rust by <a
href="https://github.com/samuelcolvin"><code>@​samuelcolvin</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/9064">#9064</a></li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
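As a concrete illustration of the new-features list above, the short Python sketch below exercises two of the 2.7 additions (the generic `Secret` base type and the `with_config` decorator). It is illustrative only, not taken from Synapse or from pydantic's release notes, and the model, field, and value names are invented for the example.

```python
# Hedged example exercising two pydantic 2.7 additions mentioned above.
# Names and values here are invented for illustration.
from typing_extensions import TypedDict

from pydantic import BaseModel, ConfigDict, Secret, TypeAdapter, with_config


class ApiCredentials(BaseModel):
    # Generic Secret base type: the value is masked in reprs/serialisation
    # and only revealed via .get_secret_value().
    api_key: Secret[str]


creds = ApiCredentials(api_key="hunter2")
print(creds)                             # api_key=Secret('**********')
print(creds.api_key.get_secret_value())  # hunter2


# with_config attaches a ConfigDict to a TypedDict in the way the typing
# spec allows (no class keyword arguments needed).
@with_config(ConfigDict(str_to_lower=True))
class UserRecord(TypedDict):
    username: str


print(TypeAdapter(UserRecord).validate_python({"username": "ALICE"}))
# {'username': 'alice'}
```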
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/pydantic/pydantic/blob/main/HISTORY.md">pydantic's
changelog</a>.</em></p>
<blockquote>
<h2>v2.7.0 (2024-04-11)</h2>
<p><a
href="https://github.com/pydantic/pydantic/releases/tag/v2.7.0">GitHub
release</a></p>
<p>The code released in v2.7.0 is practically identical to that of
v2.7.0b1.</p>
<h3>What's Changed</h3>
<h4>Packaging</h4>
<ul>
<li>Reorganize <code>pyproject.toml</code> sections by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/8899">#8899</a></li>
<li>Bump <code>pydantic-core</code> to <code>v2.18.1</code> by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/9211">#9211</a></li>
<li>Adopt <code>jiter</code> <code>v0.2.0</code> by <a
href="https://github.com/samuelcolvin"><code>@​samuelcolvin</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic-core/pull/1250">pydantic/pydantic-core#1250</a></li>
</ul>
<h4>New Features</h4>
<ul>
<li>Extract attribute docstrings from <code>FieldInfo.description</code>
by <a href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/6563">#6563</a></li>
<li>Add a <code>with_config</code> decorator to comply with typing spec
by <a href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/8611">#8611</a></li>
<li>Allow an optional separator splitting the value and unit of the
result of <code>ByteSize.human_readable</code> by <a
href="https://github.com/jks15satoshi"><code>@​jks15satoshi</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/8706">#8706</a></li>
<li>Add generic <code>Secret</code> base type by <a
href="https://github.com/conradogarciaberrotaran"><code>@​conradogarciaberrotaran</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/8519">#8519</a></li>
<li>Make use of <code>Sphinx</code> inventories for cross references in
docs by <a href="https://github.com/Viicos"><code>@​Viicos</code></a> in
<a
href="https://redirect.github.com/pydantic/pydantic/pull/8682">#8682</a></li>
<li>Add environment variable to disable plugins by <a
href="https://github.com/geospackle"><code>@​geospackle</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/8767">#8767</a></li>
<li>Add support for <code>deprecated</code> fields by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/8237">#8237</a></li>
<li>Allow <code>field_serializer('*')</code> by <a
href="https://github.com/ornariece"><code>@​ornariece</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/9001">#9001</a></li>
<li>Handle a case when <code>model_config</code> is defined as a model
property by <a
href="https://github.com/alexeyt101"><code>@​alexeyt101</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/9004">#9004</a></li>
<li>Update <code>create_model()</code> to support
<code>typing.Annotated</code> as input by <a
href="https://github.com/wannieman98"><code>@​wannieman98</code></a> in
<a
href="https://redirect.github.com/pydantic/pydantic/pull/8947">#8947</a></li>
<li>Add <code>ClickhouseDsn</code> support by <a
href="https://github.com/solidguy7"><code>@​solidguy7</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/9062">#9062</a></li>
<li>Add support for <code>re.Pattern[str]</code> to <code>pattern</code>
field by <a href="https://github.com/jag-k"><code>@​jag-k</code></a> in
<a
href="https://redirect.github.com/pydantic/pydantic/pull/9053">#9053</a></li>
<li>Support for <code>serialize_as_any</code> runtime setting by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/8830">#8830</a></li>
<li>Add support for <code>typing.Self</code> by <a
href="https://github.com/Youssefares"><code>@​Youssefares</code></a> in
<a
href="https://redirect.github.com/pydantic/pydantic/pull/9023">#9023</a></li>
<li>Ability to pass <code>context</code> to serialization by <a
href="https://github.com/ornariece"><code>@​ornariece</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/8965">#8965</a></li>
<li>Add feedback widget to docs with flarelytics integration by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/9129">#9129</a></li>
<li>Support for parsing partial JSON strings in Python by <a
href="https://github.com/samuelcolvin"><code>@​samuelcolvin</code></a>
in <a
href="https://redirect.github.com/pydantic/jiter/pull/66">pydantic/jiter#66</a></li>
</ul>
<p><strong>Finalized in v2.7.0, rather than v2.7.0b1:</strong></p>
<ul>
<li>Add support for field level number to str coercion option by <a
href="https://github.com/NeevCohen"><code>@​NeevCohen</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/9137">#9137</a></li>
<li>Update <code>warnings</code> parameter for serialization utilities
to allow raising a warning by <a
href="https://github.com/Lance-Drane"><code>@​Lance-Drane</code></a> in
<a
href="https://redirect.github.com/pydantic/pydantic/pull/9166">#9166</a></li>
</ul>
<h4>Changes</h4>
<ul>
<li>Correct docs, logic for <code>model_construct</code> behavior with
<code>extra</code> by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/8807">#8807</a></li>
<li>Improve error message for improper <code>RootModel</code> subclasses
by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/8857">#8857</a></li>
<li>Use <code>PEP570</code> syntax by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/8940">#8940</a></li>
<li>Add <code>enum</code> and <code>type</code> to the JSON schema for
single item literals by <a
href="https://github.com/dmontagu"><code>@​dmontagu</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/8944">#8944</a></li>
<li>Deprecate <code>update_json_schema</code> internal function by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/9125">#9125</a></li>
<li>Serialize duration to hour minute second, instead of just seconds by
<a href="https://github.com/kakilangit"><code>@​kakilangit</code></a> in
<a
href="https://redirect.github.com/pydantic/speedate/pull/50">pydantic/speedate#50</a></li>
<li>Trimming str before parsing to int and float by <a
href="https://github.com/hungtsetse"><code>@​hungtsetse</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-core/pull/1203">pydantic/pydantic-core#1203</a></li>
</ul>
<h4>Performance</h4>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="7af856a109"><code>7af856a</code></a>
Prep for 2.7 Release (<a
href="https://redirect.github.com/pydantic/pydantic/issues/9212">#9212</a>)</li>
<li><a
href="60d77f02e7"><code>60d77f0</code></a>
Update <code>warnings</code> parameter for serialization utilities to
allow raising a wa...</li>
<li><a
href="99821e9532"><code>99821e9</code></a>
Add support for field level number to str coercion option (<a
href="https://redirect.github.com/pydantic/pydantic/issues/9137">#9137</a>)</li>
<li><a
href="a01b9029e3"><code>a01b902</code></a>
Updating JSON docs, adding <code>cache_strings</code> to
<code>ConfigDict</code> (<a
href="https://redirect.github.com/pydantic/pydantic/issues/9178">#9178</a>)</li>
<li><a
href="932b025f89"><code>932b025</code></a>
Bump core to 2.18.1 (<a
href="https://redirect.github.com/pydantic/pydantic/issues/9211">#9211</a>)</li>
<li><a
href="a7d3253477"><code>a7d3253</code></a>
Fix allow extra generic (<a
href="https://redirect.github.com/pydantic/pydantic/issues/9193">#9193</a>)</li>
<li><a
href="8aeac1a4c6"><code>8aeac1a</code></a>
Update mkdocs_material (<a
href="https://redirect.github.com/pydantic/pydantic/issues/9169">#9169</a>)</li>
<li><a
href="75012318fb"><code>7501231</code></a>
Add 1.10.15 section to HISTORY.md (<a
href="https://redirect.github.com/pydantic/pydantic/issues/9161">#9161</a>)</li>
<li><a
href="d294244e2d"><code>d294244</code></a>
Prep for 2.7 beta release 🚀 (<a
href="https://redirect.github.com/pydantic/pydantic/issues/9158">#9158</a>)</li>
<li><a
href="d77a940360"><code>d77a940</code></a>
Uprev <code>pydantic-core</code> (<a
href="https://redirect.github.com/pydantic/pydantic/issues/9153">#9153</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/pydantic/pydantic/compare/v2.6.4...v2.7.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=pydantic&package-manager=pip&previous-version=2.6.4&new-version=2.7.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-26 09:36:47 +01:00
dependabot[bot]
922656fc77 Bump phonenumbers from 8.13.29 to 8.13.35 (#17106)
Bumps
[phonenumbers](https://github.com/daviddrysdale/python-phonenumbers)
from 8.13.29 to 8.13.35.
<details>
<summary>Commits</summary>
<ul>
<li><a
href="9369ff4607"><code>9369ff4</code></a>
Prep for 8.13.35 release</li>
<li><a
href="2e1e133890"><code>2e1e133</code></a>
Generated files for metadata</li>
<li><a
href="25a306f670"><code>25a306f</code></a>
Merge metadata changes from upstream 8.13.35</li>
<li><a
href="710529234b"><code>7105292</code></a>
Prep for 8.13.34 release</li>
<li><a
href="e7b328d071"><code>e7b328d</code></a>
Generated files for metadata</li>
<li><a
href="315eb10e00"><code>315eb10</code></a>
Merge metadata changes from upstream 8.13.34</li>
<li><a
href="29dab756ac"><code>29dab75</code></a>
Prep for 8.13.33 release</li>
<li><a
href="f5b9401fdb"><code>f5b9401</code></a>
Generated files for metadata</li>
<li><a
href="aa21158f8d"><code>aa21158</code></a>
Merge metadata changes from upstream 8.13.33</li>
<li><a
href="92c242c2b4"><code>92c242c</code></a>
Prep for 8.13.32 release</li>
<li>Additional commits viewable in <a
href="https://github.com/daviddrysdale/python-phonenumbers/compare/v8.13.29...v8.13.35">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=phonenumbers&package-manager=pip&previous-version=8.13.29&new-version=8.13.35)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-26 09:36:21 +01:00
Olivier 'reivilibre
30c50e0240 Tweak changelog 2024-04-25 16:00:37 +01:00
Olivier 'reivilibre
48a90c697b 1.106.0rc1 2024-04-25 15:55:18 +01:00
Till
47773232b0 Redact membership events if the user requested erasure upon deactivating (#17076)
Fixes #15355 by redacting all membership events before leaving rooms.
2024-04-25 14:25:31 +01:00
Quentin Gliech
2e92b718d5 MSC4108 implementation (#17056)
Co-authored-by: Hugh Nimmo-Smith <hughns@element.io>
Co-authored-by: Hugh Nimmo-Smith <hughns@users.noreply.github.com>
Co-authored-by: Andrew Morgan <1342360+anoadragon453@users.noreply.github.com>
2024-04-25 12:50:12 +00:00
Andrew Morgan
646cb6ff24 Add type annotation to visited_chains (#17125)
This should fix CI on `develop`, which broke in
0fe9e1f7da,
presumably due to a `mypy` dependency upgrade.
2024-04-25 12:25:26 +00:00
Erik Johnston
0fe9e1f7da Merge branch 'master' into develop 2024-04-23 17:06:52 +01:00
mcalinghee
ae181233aa Send an email if the address is already bound to a user account (#16819)
Co-authored-by: Mathieu Velten <mathieu.velten@beta.gouv.fr>
Co-authored-by: Olivier D <odelcroi@gmail.com>
2024-04-23 16:45:24 +01:00
Erik Johnston
20c9e19519 1.105.1 2024-04-23 15:57:13 +01:00
Erik Johnston
55b0aa847a Fix GHSA-3h7q-rfh9-xm4v
Weakness in auth chain indexing allows DoS from remote room members
through disk fill and high CPU usage.

A remote Matrix user with malicious intent, sharing a room with Synapse
instances before 1.104.1, can dispatch specially crafted events to
exploit a weakness in how the auth chain cover index is calculated. This
can induce high CPU consumption and accumulate excessive data in the
database of such instances, resulting in a denial of service.

Servers in private federations, or those that do not federate, are not
affected.
2024-04-23 15:25:49 +01:00
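For readers unfamiliar with the mechanism named in the advisory, the sketch below shows roughly what an event's auth chain is: the transitive closure of its `auth_events` references. This is a hedged, illustrative sketch only, not Synapse's implementation or schema; it is included to show why a maliciously shaped auth graph can make the chains, and anything indexed per event, grow quickly, which is the disk-fill and CPU-usage vector the advisory describes.

```python
# Illustrative sketch only, not Synapse code: an event's "auth chain" is the
# transitive closure of its auth_events references.
from collections import deque
from typing import Dict, List, Set


def auth_chain(event_id: str, auth_edges: Dict[str, List[str]]) -> Set[str]:
    """Return every event reachable from event_id via auth_events edges."""
    seen: Set[str] = set()
    queue = deque(auth_edges.get(event_id, []))
    while queue:
        eid = queue.popleft()
        if eid in seen:
            continue
        seen.add(eid)
        queue.extend(auth_edges.get(eid, []))
    return seen


# A small, well-behaved graph; a crafted room can make these chains (and any
# per-event index derived from them) far larger, hence the disk/CPU concern.
graph = {
    "$create": [],
    "$power": ["$create"],
    "$join": ["$create", "$power"],
    "$msg": ["$join", "$power", "$create"],
}
print(sorted(auth_chain("$msg", graph)))  # ['$create', '$join', '$power']
```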
117 changed files with 3317 additions and 1274 deletions

View File

@@ -85,33 +85,3 @@ jobs:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: ./book
          destination_dir: ./${{ needs.pre.outputs.branch-version }}
  ################################################################################
  pages-devdocs:
    name: GitHub Pages (developer docs)
    runs-on: ubuntu-latest
    needs:
      - pre
    steps:
      - uses: actions/checkout@v4
      - name: "Set up Sphinx"
        uses: matrix-org/setup-python-poetry@v1
        with:
          python-version: "3.x"
          poetry-version: "1.3.2"
          groups: "dev-docs"
          extras: ""
      - name: Build the documentation
        run: |
          cd dev-docs
          poetry run make html
      # Deploy to the target directory.
      - name: Deploy to gh pages
        uses: peaceiris/actions-gh-pages@4f9cc6602d3f66b9c108549d475ec49e8ef4d45e # v4.0.0
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: ./dev-docs/_build/html
          destination_dir: ./dev-docs/${{ needs.pre.outputs.branch-version }}

View File

@@ -1,3 +1,125 @@
# Synapse 1.107.0rc1 (2024-05-07)
### Features
- Add preliminary support for [MSC3823: Account Suspension](https://github.com/matrix-org/matrix-spec-proposals/pull/3823). ([\#17051](https://github.com/element-hq/synapse/issues/17051))
- Declare support for [Matrix v1.10](https://matrix.org/blog/2024/03/22/matrix-v1.10-release/). Contributed by @clokep. ([\#17082](https://github.com/element-hq/synapse/issues/17082))
- Add support for [MSC4115: membership metadata on events](https://github.com/matrix-org/matrix-spec-proposals/pull/4115). ([\#17104](https://github.com/element-hq/synapse/issues/17104), [\#17137](https://github.com/element-hq/synapse/issues/17137))
### Bugfixes
- Fixed search feature of Element Android on homeservers using SQLite by returning search terms as search highlights. ([\#17000](https://github.com/element-hq/synapse/issues/17000))
- Fixes a bug introduced in v1.52.0 where the `destination` query parameter for the [Destination Rooms Admin API](https://element-hq.github.io/synapse/v1.105/usage/administration/admin_api/federation.html#destination-rooms) failed to actually filter returned rooms. ([\#17077](https://github.com/element-hq/synapse/issues/17077))
- For MSC3266 room summaries, support queries at the recommended endpoint of `/_matrix/client/unstable/im.nheko.summary/summary/{roomIdOrAlias}`. The existing endpoint of `/_matrix/client/unstable/im.nheko.summary/rooms/{roomIdOrAlias}/summary` is deprecated; an example request against the recommended endpoint is sketched after this list. ([\#17078](https://github.com/element-hq/synapse/issues/17078))
- Apply user email & picture during OIDC registration if present & selected. ([\#17120](https://github.com/element-hq/synapse/issues/17120))
- Improve error message for cross signing reset with [MSC3861](https://github.com/matrix-org/matrix-spec-proposals/pull/3861) enabled. ([\#17121](https://github.com/element-hq/synapse/issues/17121))
- Fix a bug which meant that to-device messages received over federation could be dropped when the server was under load, or when networking issues caused problems between Synapse processes or the database. ([\#17127](https://github.com/element-hq/synapse/issues/17127))
- Fix bug where `StreamChangeCache` would not respect configured cache factors. ([\#17152](https://github.com/element-hq/synapse/issues/17152))
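As referenced in the MSC3266 entry above, a client query against the newly recommended summary endpoint might look like the following. This is a hedged sketch: the homeserver URL, access token, and room alias are placeholder values, and the response fields shown come from MSC3266 rather than from this changelog.

```python
# Hedged sketch of querying the recommended MSC3266 summary endpoint.
# The homeserver URL, access token and room alias are placeholders.
import urllib.parse

import requests

HOMESERVER = "https://matrix.example.org"  # placeholder
ACCESS_TOKEN = "syt_placeholder_token"     # placeholder
ROOM = "#some-room:example.org"            # placeholder alias or room ID

room_encoded = urllib.parse.quote(ROOM, safe="")
resp = requests.get(
    f"{HOMESERVER}/_matrix/client/unstable/im.nheko.summary/summary/{room_encoded}",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=10,
)
resp.raise_for_status()
summary = resp.json()
# Field names follow MSC3266; .get() is used since the exact set may vary.
print(summary.get("name"), summary.get("num_joined_members"))
```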
### Updates to the Docker image
- Correct licensing metadata on Docker image. ([\#17141](https://github.com/element-hq/synapse/issues/17141))
### Improved Documentation
- Update the `event_cache_size` and `global_factor` configuration options' documentation. ([\#17071](https://github.com/element-hq/synapse/issues/17071))
- Remove broken sphinx docs. ([\#17073](https://github.com/element-hq/synapse/issues/17073), [\#17148](https://github.com/element-hq/synapse/issues/17148))
- Add RuntimeDirectory to example matrix-synapse.service systemd unit. ([\#17084](https://github.com/element-hq/synapse/issues/17084))
- Fix various small typos throughout the docs. ([\#17114](https://github.com/element-hq/synapse/issues/17114))
- Update `enable_notifs` configuration documentation. ([\#17116](https://github.com/element-hq/synapse/issues/17116))
- Update the Upgrade Notes with the latest minimum supported Rust version of 1.66.0. Contributed by @jahway603. ([\#17140](https://github.com/element-hq/synapse/issues/17140))
### Internal Changes
- Enable [MSC3266](https://github.com/matrix-org/matrix-spec-proposals/pull/3266) by default in the Synapse Complement image. ([\#17105](https://github.com/element-hq/synapse/issues/17105))
- Add optimisation to `StreamChangeCache.get_entities_changed(..)`. ([\#17130](https://github.com/element-hq/synapse/issues/17130))
### Updates to locked dependencies
* Bump furo from 2024.1.29 to 2024.4.27. ([\#17133](https://github.com/element-hq/synapse/issues/17133))
* Bump idna from 3.6 to 3.7. ([\#17136](https://github.com/element-hq/synapse/issues/17136))
* Bump jsonschema from 4.21.1 to 4.22.0. ([\#17157](https://github.com/element-hq/synapse/issues/17157))
* Bump lxml from 5.1.0 to 5.2.1. ([\#17158](https://github.com/element-hq/synapse/issues/17158))
* Bump phonenumbers from 8.13.29 to 8.13.35. ([\#17106](https://github.com/element-hq/synapse/issues/17106))
* Bump pillow from 10.2.0 to 10.3.0. ([\#17146](https://github.com/element-hq/synapse/issues/17146))
* Bump pydantic from 2.6.4 to 2.7.0. ([\#17107](https://github.com/element-hq/synapse/issues/17107))
* Bump pydantic from 2.7.0 to 2.7.1. ([\#17160](https://github.com/element-hq/synapse/issues/17160))
* Bump pyicu from 2.12 to 2.13. ([\#17109](https://github.com/element-hq/synapse/issues/17109))
* Bump serde from 1.0.197 to 1.0.198. ([\#17111](https://github.com/element-hq/synapse/issues/17111))
* Bump serde from 1.0.198 to 1.0.199. ([\#17132](https://github.com/element-hq/synapse/issues/17132))
* Bump serde from 1.0.199 to 1.0.200. ([\#17161](https://github.com/element-hq/synapse/issues/17161))
* Bump serde_json from 1.0.115 to 1.0.116. ([\#17112](https://github.com/element-hq/synapse/issues/17112))
* Update `tornado` Python dependency from 6.2 to 6.4. ([\#17131](https://github.com/element-hq/synapse/issues/17131))
* Bump twisted from 23.10.0 to 24.3.0. ([\#17135](https://github.com/element-hq/synapse/issues/17135))
* Bump types-bleach from 6.1.0.1 to 6.1.0.20240331. ([\#17110](https://github.com/element-hq/synapse/issues/17110))
* Bump types-pillow from 10.2.0.20240415 to 10.2.0.20240423. ([\#17159](https://github.com/element-hq/synapse/issues/17159))
* Bump types-setuptools from 69.0.0.20240125 to 69.5.0.20240423. ([\#17134](https://github.com/element-hq/synapse/issues/17134))
# Synapse 1.106.0 (2024-04-30)
No significant changes since 1.106.0rc1.
# Synapse 1.106.0rc1 (2024-04-25)
### Features
- Send an email if the address is already bound to a user account. ([\#16819](https://github.com/element-hq/synapse/issues/16819))
- Implement the rendezvous mechanism described by [MSC4108](https://github.com/matrix-org/matrix-spec-proposals/issues/4108). ([\#17056](https://github.com/element-hq/synapse/issues/17056))
- Support delegating the rendezvous mechanism described [MSC4108](https://github.com/matrix-org/matrix-spec-proposals/issues/4108) to an external implementation. ([\#17086](https://github.com/element-hq/synapse/issues/17086))
### Bugfixes
- Add validation to ensure that the `limit` parameter on `/publicRooms` is non-negative. ([\#16920](https://github.com/element-hq/synapse/issues/16920))
- Return `400 M_NOT_JSON` upon receiving invalid JSON in query parameters across various client and admin endpoints, rather than an internal server error. ([\#16923](https://github.com/element-hq/synapse/issues/16923))
- Make the CSAPI endpoint `/keys/device_signing/upload` idempotent. ([\#16943](https://github.com/element-hq/synapse/issues/16943))
- Redact membership events if the user requested erasure upon deactivating. ([\#17076](https://github.com/element-hq/synapse/issues/17076))
### Improved Documentation
- Add a prompt in the contributing guide to manually configure icu4c. ([\#17069](https://github.com/element-hq/synapse/issues/17069))
- Clarify what part of message retention is still experimental. ([\#17099](https://github.com/element-hq/synapse/issues/17099))
### Internal Changes
- Use new receipts column to optimise receipt and push action SQL queries. Contributed by Nick @ Beeper (@fizzadar). ([\#17032](https://github.com/element-hq/synapse/issues/17032), [\#17096](https://github.com/element-hq/synapse/issues/17096))
- Fix mypy with latest Twisted release. ([\#17036](https://github.com/element-hq/synapse/issues/17036))
- Bump minimum supported Rust version to 1.66.0. ([\#17079](https://github.com/element-hq/synapse/issues/17079))
- Add helpers to transform Twisted requests to Rust http Requests/Responses. ([\#17081](https://github.com/element-hq/synapse/issues/17081))
- Fix type annotation for `visited_chains` after `mypy` upgrade. ([\#17125](https://github.com/element-hq/synapse/issues/17125))
### Updates to locked dependencies
* Bump anyhow from 1.0.81 to 1.0.82. ([\#17095](https://github.com/element-hq/synapse/issues/17095))
* Bump peaceiris/actions-gh-pages from 3.9.3 to 4.0.0. ([\#17087](https://github.com/element-hq/synapse/issues/17087))
* Bump peaceiris/actions-mdbook from 1.2.0 to 2.0.0. ([\#17089](https://github.com/element-hq/synapse/issues/17089))
* Bump pyasn1-modules from 0.3.0 to 0.4.0. ([\#17093](https://github.com/element-hq/synapse/issues/17093))
* Bump pygithub from 2.2.0 to 2.3.0. ([\#17092](https://github.com/element-hq/synapse/issues/17092))
* Bump ruff from 0.3.5 to 0.3.7. ([\#17094](https://github.com/element-hq/synapse/issues/17094))
* Bump sigstore/cosign-installer from 3.4.0 to 3.5.0. ([\#17088](https://github.com/element-hq/synapse/issues/17088))
* Bump twine from 4.0.2 to 5.0.0. ([\#17091](https://github.com/element-hq/synapse/issues/17091))
* Bump types-pillow from 10.2.0.20240406 to 10.2.0.20240415. ([\#17090](https://github.com/element-hq/synapse/issues/17090))
# Synapse 1.105.1 (2024-04-23)
## Security advisory
The following issues are fixed in 1.105.1.
- [GHSA-3h7q-rfh9-xm4v](https://github.com/element-hq/synapse/security/advisories/GHSA-3h7q-rfh9-xm4v) / [CVE-2024-31208](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2024-31208) — High Severity
Weakness in auth chain indexing allows DoS from remote room members through disk fill and high CPU usage.
See the advisories for more details. If you have any questions, email security@element.io.
# Synapse 1.105.0 (2024-04-16)
No significant changes since 1.105.0rc1.

379
Cargo.lock generated
View File

@@ -4,30 +4,30 @@ version = 3
[[package]]
name = "aho-corasick"
version = "1.0.2"
version = "1.1.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "43f6cb1bf222025340178f382c426f13757b2960e89779dfcb319c32542a5a41"
checksum = "8e60d3430d3a69478ad0993f19238d2df97c507009a52b3c10addcd7f6bcb916"
dependencies = [
"memchr",
]
[[package]]
name = "anyhow"
version = "1.0.82"
version = "1.0.83"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f538837af36e6f6a9be0faa67f9a314f8119e4e4b5867c6ab40ed60360142519"
checksum = "25bdb32cbbdce2b519a9cd7df3a678443100e265d5e25ca763b7572a5104f5f3"
[[package]]
name = "arc-swap"
version = "1.5.1"
version = "1.7.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "983cd8b9d4b02a6dc6ffa557262eb5858a27a0038ffffe21a0f133eaa819a164"
checksum = "69f7f8c3906b62b754cd5326047894316021dcfe5a194c8ea52bdd94934a3457"
[[package]]
name = "autocfg"
version = "1.1.0"
version = "1.3.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d468802bab17cbc0cc575e9b053f41e72aa36bfa6b7f55e3529ffa43161b97fa"
checksum = "0c4b4d0bd25bd0b74681c0ad21497610ce1b7c91b1022cd21c80c6fbdd9476b0"
[[package]]
name = "base64"
@@ -37,9 +37,9 @@ checksum = "9d297deb1925b89f2ccc13d7635fa0714f12c87adce1c75356b39ca9b7178567"
[[package]]
name = "bitflags"
version = "1.3.2"
version = "2.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bef38d45163c2f1dde094a7dfd33ccf595c92905c8f8f4fdc18d06fb1037718a"
checksum = "cf4b9d6a944f767f8e5e0db018570623c85f3d925ac718db4e06d0187adb21c1"
[[package]]
name = "blake2"
@@ -52,13 +52,19 @@ dependencies = [
[[package]]
name = "block-buffer"
version = "0.10.3"
version = "0.10.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "69cce20737498f97b993470a6e536b8523f0af7892a4f928cceb1ac5e52ebe7e"
checksum = "3078c7629b62d3f0439517fa394996acacc5cbc91c5a20d8c658e77abd503a71"
dependencies = [
"generic-array",
]
[[package]]
name = "bumpalo"
version = "3.16.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "79296716171880943b8470b5f8d03aa55eb2e645a4874bdbb28adb49162e012c"
[[package]]
name = "bytes"
version = "1.6.0"
@@ -92,9 +98,9 @@ dependencies = [
[[package]]
name = "digest"
version = "0.10.5"
version = "0.10.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "adfbc57365a37acbd2ebf2b64d7e69bb766e2fea813521ed536f5d0520dcf86c"
checksum = "9ed9a281f7bc9b7576e61468ba615a66a5c8cfdff42420a70aa82701a3b1e292"
dependencies = [
"block-buffer",
"crypto-common",
@@ -109,14 +115,27 @@ checksum = "3f9eec918d3f24069decb9af1554cad7c880e2da24a9afd88aca000531ab82c1"
[[package]]
name = "generic-array"
version = "0.14.6"
version = "0.14.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bff49e947297f3312447abdca79f45f4738097cc82b06e72054d2223f601f1b9"
checksum = "85649ca51fd72272d7821adaf274ad91c288277713d9c18820d8499a7ff69e9a"
dependencies = [
"typenum",
"version_check",
]
[[package]]
name = "getrandom"
version = "0.2.15"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c4567c8db10ae91089c99af84c68c38da3ec2f087c3f82960bcdbf3656b6f4d7"
dependencies = [
"cfg-if",
"js-sys",
"libc",
"wasi",
"wasm-bindgen",
]
[[package]]
name = "headers"
version = "0.4.0"
@@ -172,15 +191,24 @@ checksum = "df3b46402a9d5adb4c86a0cf463f42e19994e3ee891101b1841f30a545cb49a9"
[[package]]
name = "indoc"
version = "2.0.4"
version = "2.0.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1e186cfbae8084e513daff4240b4797e342f988cecda4fb6c939150f96315fd8"
checksum = "b248f5224d1d606005e02c97f5aa4e88eeb230488bcc03bc9ca4d7991399f2b5"
[[package]]
name = "itoa"
version = "1.0.4"
version = "1.0.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4217ad341ebadf8d8e724e264f13e593e0648f5b3e94b3896a5df283be015ecc"
checksum = "49f1f14873335454500d59611f1cf4a4b0f786f9ac11f4312a78e4cf2566695b"
[[package]]
name = "js-sys"
version = "0.3.69"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "29c15563dc2726973df627357ce0c9ddddbea194836909d655df6a75d2cf296d"
dependencies = [
"wasm-bindgen",
]
[[package]]
name = "lazy_static"
@@ -190,15 +218,15 @@ checksum = "e2abad23fbc42b3700f2f279844dc832adb2b2eb069b2df918f455c4e18cc646"
[[package]]
name = "libc"
version = "0.2.153"
version = "0.2.154"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9c198f91728a82281a64e1f4f9eeb25d82cb32a5de251c6bd1b5154d63a8e7bd"
checksum = "ae743338b92ff9146ce83992f766a31066a91a8c84a45e0e9f21e7cf6de6d346"
[[package]]
name = "lock_api"
version = "0.4.9"
version = "0.4.12"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "435011366fe56583b16cf956f9df0095b405b82d76425bc8981c0e22e60ec4df"
checksum = "07af8b9cdd281b7915f413fa73f29ebd5d55d0d3f0155584dade1ff18cea1b17"
dependencies = [
"autocfg",
"scopeguard",
@@ -212,15 +240,15 @@ checksum = "90ed8c1e510134f979dbc4f070f87d4313098b704861a105fe34231c70a3901c"
[[package]]
name = "memchr"
version = "2.6.3"
version = "2.7.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8f232d6ef707e1956a43342693d2a31e72989554d58299d7a88738cc95b0d35c"
checksum = "6c8640c5d730cb13ebd907d8d04b52f55ac9a2eec55b440c8892f40d56c76c1d"
[[package]]
name = "memoffset"
version = "0.9.0"
version = "0.9.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5a634b1c61a95585bd15607c6ab0c4e5b226e695ff2800ba0cdccddf208c406c"
checksum = "488016bfae457b036d996092f6cb448677611ce4449e970ceaf42695203f218a"
dependencies = [
"autocfg",
]
@@ -233,15 +261,15 @@ checksum = "6877bb514081ee2a7ff5ef9de3281f14a4dd4bceac4c09388074a6b5df8a139a"
[[package]]
name = "once_cell"
version = "1.15.0"
version = "1.19.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e82dad04139b71a90c080c8463fe0dc7902db5192d939bd0950f074d014339e1"
checksum = "3fdb12b2476b595f9358c5161aa467c2438859caa136dec86c26fdd2efe17b92"
[[package]]
name = "parking_lot"
version = "0.12.1"
version = "0.12.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3742b2c103b9f06bc9fff0a37ff4912935851bee6d36f3c02bcc755bcfec228f"
checksum = "7e4af0ca4f6caed20e900d564c242b8e5d4903fdacf31d3daf527b66fe6f42fb"
dependencies = [
"lock_api",
"parking_lot_core",
@@ -249,15 +277,15 @@ dependencies = [
[[package]]
name = "parking_lot_core"
version = "0.9.3"
version = "0.9.10"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "09a279cbf25cb0757810394fbc1e359949b59e348145c643a939a525692e6929"
checksum = "1e401f977ab385c9e4e3ab30627d6f26d00e2c73eef317493c4ec6d468726cf8"
dependencies = [
"cfg-if",
"libc",
"redox_syscall",
"smallvec",
"windows-sys",
"windows-targets",
]
[[package]]
@@ -267,19 +295,25 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7170ef9988bc169ba16dd36a7fa041e5c4cbeb6a35b76d4c03daded371eae7c0"
[[package]]
name = "proc-macro2"
version = "1.0.76"
name = "ppv-lite86"
version = "0.2.17"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "95fc56cda0b5c3325f5fbbd7ff9fda9e02bb00bb3dac51252d2f1bfa1cb8cc8c"
checksum = "5b40af805b3121feab8a3c29f04d8ad262fa8e0561883e7653e024ae4479e6de"
[[package]]
name = "proc-macro2"
version = "1.0.82"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8ad3d49ab951a01fbaafe34f2ec74122942fe18a3f9814c3268f1bb72042131b"
dependencies = [
"unicode-ident",
]
[[package]]
name = "pyo3"
version = "0.20.3"
version = "0.21.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "53bdbb96d49157e65d45cc287af5f32ffadd5f4761438b527b055fb0d4bb8233"
checksum = "a5e00b96a521718e08e03b1a622f01c8a8deb50719335de3f60b3b3950f069d8"
dependencies = [
"anyhow",
"cfg-if",
@@ -296,9 +330,9 @@ dependencies = [
[[package]]
name = "pyo3-build-config"
version = "0.20.3"
version = "0.21.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "deaa5745de3f5231ce10517a1f5dd97d53e5a2fd77aa6b5842292085831d48d7"
checksum = "7883df5835fafdad87c0d888b266c8ec0f4c9ca48a5bed6bbb592e8dedee1b50"
dependencies = [
"once_cell",
"target-lexicon",
@@ -306,9 +340,9 @@ dependencies = [
[[package]]
name = "pyo3-ffi"
version = "0.20.3"
version = "0.21.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "62b42531d03e08d4ef1f6e85a2ed422eb678b8cd62b762e53891c05faf0d4afa"
checksum = "01be5843dc60b916ab4dad1dca6d20b9b4e6ddc8e15f50c47fe6d85f1fb97403"
dependencies = [
"libc",
"pyo3-build-config",
@@ -316,9 +350,9 @@ dependencies = [
[[package]]
name = "pyo3-log"
version = "0.9.0"
version = "0.10.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4c10808ee7250403bedb24bc30c32493e93875fef7ba3e4292226fe924f398bd"
checksum = "2af49834b8d2ecd555177e63b273b708dea75150abc6f5341d0a6e1a9623976c"
dependencies = [
"arc-swap",
"log",
@@ -327,9 +361,9 @@ dependencies = [
[[package]]
name = "pyo3-macros"
version = "0.20.3"
version = "0.21.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7305c720fa01b8055ec95e484a6eca7a83c841267f0dd5280f0c8b8551d2c158"
checksum = "77b34069fc0682e11b31dbd10321cbf94808394c56fd996796ce45217dfac53c"
dependencies = [
"proc-macro2",
"pyo3-macros-backend",
@@ -339,9 +373,9 @@ dependencies = [
[[package]]
name = "pyo3-macros-backend"
version = "0.20.3"
version = "0.21.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7c7e9b68bb9c3149c5b0cade5d07f953d6d125eb4337723c4ccdb665f1f96185"
checksum = "08260721f32db5e1a5beae69a55553f56b99bd0e1c3e6e0a5e8851a9d0f5a85c"
dependencies = [
"heck",
"proc-macro2",
@@ -352,9 +386,9 @@ dependencies = [
[[package]]
name = "pythonize"
version = "0.20.0"
version = "0.21.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ffd1c3ef39c725d63db5f9bc455461bafd80540cb7824c61afb823501921a850"
checksum = "9d0664248812c38cc55a4ed07f88e4df516ce82604b93b1ffdc041aa77a6cb3c"
dependencies = [
"pyo3",
"serde",
@@ -362,18 +396,48 @@ dependencies = [
[[package]]
name = "quote"
version = "1.0.35"
version = "1.0.36"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "291ec9ab5efd934aaf503a6466c5d5251535d108ee747472c3977cc5acc868ef"
checksum = "0fa76aaf39101c457836aec0ce2316dbdc3ab723cdda1c6bd4e6ad4208acaca7"
dependencies = [
"proc-macro2",
]
[[package]]
name = "redox_syscall"
version = "0.2.16"
name = "rand"
version = "0.8.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fb5a58c1855b4b6819d59012155603f0b22ad30cad752600aadfcb695265519a"
checksum = "34af8d1a0e25924bc5b7c43c079c942339d8f0a8b57c39049bef581b46327404"
dependencies = [
"libc",
"rand_chacha",
"rand_core",
]
[[package]]
name = "rand_chacha"
version = "0.3.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e6c10a63a0fa32252be49d21e7709d4d4baf8d231c2dbce1eaa8141b9b127d88"
dependencies = [
"ppv-lite86",
"rand_core",
]
[[package]]
name = "rand_core"
version = "0.6.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ec0be4795e2f6a28069bec0b5ff3e2ac9bafc99e6a9a7dc3547996c5c816922c"
dependencies = [
"getrandom",
]
[[package]]
name = "redox_syscall"
version = "0.5.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "469052894dcb553421e483e4209ee581a45100d31b4018de03e5a7ad86374a7e"
dependencies = [
"bitflags",
]
@@ -392,9 +456,9 @@ dependencies = [
[[package]]
name = "regex-automata"
version = "0.4.4"
version = "0.4.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3b7fa1134405e2ec9353fd416b17f8dacd46c473d7d3fd1cf202706a14eb792a"
checksum = "86b83b8b9847f9bf95ef68afb0b8e6cdb80f498442f5179a29fad448fcc1eaea"
dependencies = [
"aho-corasick",
"memchr",
@@ -403,36 +467,36 @@ dependencies = [
[[package]]
name = "regex-syntax"
version = "0.8.2"
version = "0.8.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c08c74e62047bb2de4ff487b251e4a92e24f48745648451635cec7d591162d9f"
checksum = "adad44e29e4c806119491a7f06f03de4d1af22c3a680dd47f1e6e179439d1f56"
[[package]]
name = "ryu"
version = "1.0.11"
version = "1.0.18"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4501abdff3ae82a1c1b477a17252eb69cee9e66eb915c1abaa4f44d873df9f09"
checksum = "f3cb5ba0dc43242ce17de99c180e96db90b235b8a9fdc9543c96d2209116bd9f"
[[package]]
name = "scopeguard"
version = "1.1.0"
version = "1.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d29ab0c6d3fc0ee92fe66e2d99f700eab17a8d57d1c1d3b748380fb20baa78cd"
checksum = "94143f37725109f92c262ed2cf5e59bce7498c01bcc1502d7b9afe439a4e9f49"
[[package]]
name = "serde"
version = "1.0.197"
version = "1.0.200"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3fb1c873e1b9b056a4dc4c0c198b24c3ffa059243875552b2bd0933b1aee4ce2"
checksum = "ddc6f9cc94d67c0e21aaf7eda3a010fd3af78ebf6e096aa6e2e13c79749cce4f"
dependencies = [
"serde_derive",
]
[[package]]
name = "serde_derive"
version = "1.0.197"
version = "1.0.200"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7eb0b34b42edc17f6b7cac84a52a1c5f0e1bb2227e997ca9011ea3dd34e8610b"
checksum = "856f046b9400cee3c8c94ed572ecdb752444c24528c035cd35882aad6f492bcb"
dependencies = [
"proc-macro2",
"quote",
@@ -441,9 +505,9 @@ dependencies = [
[[package]]
name = "serde_json"
version = "1.0.115"
version = "1.0.116"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "12dc5c46daa8e9fdf4f5e71b6cf9a53f2487da0e86e55808e2d35539666497dd"
checksum = "3e17db7126d17feb94eb3fad46bf1a96b034e8aacbc2e775fe81505f8b0b2813"
dependencies = [
"itoa",
"ryu",
@@ -452,9 +516,20 @@ dependencies = [
[[package]]
name = "sha1"
version = "0.10.5"
version = "0.10.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f04293dc80c3993519f2d7f6f511707ee7094fe0c6d3406feb330cdb3540eba3"
checksum = "e3bf829a2d51ab4a5ddf1352d8470c140cadc8301b2ae1789db023f01cedd6ba"
dependencies = [
"cfg-if",
"cpufeatures",
"digest",
]
[[package]]
name = "sha2"
version = "0.10.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "793db75ad2bcafc3ffa7c68b215fee268f537982cd901d132f89c6343f3a3dc8"
dependencies = [
"cfg-if",
"cpufeatures",
@@ -463,21 +538,21 @@ dependencies = [
[[package]]
name = "smallvec"
version = "1.10.0"
version = "1.13.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a507befe795404456341dfab10cef66ead4c041f62b8b11bbb92bffe5d0953e0"
checksum = "3c5e1a9a646d36c3599cd173a41282daf47c44583ad367b8e6837255952e5c67"
[[package]]
name = "subtle"
version = "2.4.1"
version = "2.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6bdef32e8150c2a081110b42772ffe7d7c9032b606bc226c8260fd97e0976601"
checksum = "81cdd64d312baedb58e21336b31bc043b77e01cc99033ce76ef539f78e965ebc"
[[package]]
name = "syn"
version = "2.0.48"
version = "2.0.61"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0f3531638e407dfc0814761abb7c00a5b54992b849452a0646b7f65c9f770f3f"
checksum = "c993ed8ccba56ae856363b1845da7266a7cb78e1d146c8a32d54b45a8b831fc9"
dependencies = [
"proc-macro2",
"quote",
@@ -489,6 +564,7 @@ name = "synapse"
version = "0.1.0"
dependencies = [
"anyhow",
"base64",
"blake2",
"bytes",
"headers",
@@ -496,31 +572,45 @@ dependencies = [
"http",
"lazy_static",
"log",
"mime",
"pyo3",
"pyo3-log",
"pythonize",
"regex",
"serde",
"serde_json",
"sha2",
"ulid",
]
[[package]]
name = "target-lexicon"
version = "0.12.4"
version = "0.12.14"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c02424087780c9b71cc96799eaeddff35af2bc513278cda5c99fc1f5d026d3c1"
checksum = "e1fc403891a21bcfb7c37834ba66a547a8f402146eba7265b5a6d88059c9ff2f"
[[package]]
name = "typenum"
version = "1.15.0"
version = "1.17.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "dcf81ac59edc17cc8697ff311e8f5ef2d99fcbd9817b34cec66f90b6c3dfd987"
checksum = "42ff0bf0c66b8238c6f3b578df37d0b7848e55df8577b3f74f92a69acceeb825"
[[package]]
name = "ulid"
version = "1.1.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "34778c17965aa2a08913b57e1f34db9b4a63f5de31768b55bf20d2795f921259"
dependencies = [
"getrandom",
"rand",
"web-time",
]
[[package]]
name = "unicode-ident"
version = "1.0.5"
version = "1.0.12"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6ceab39d59e4c9499d4e5a8ee0e2735b891bb7308ac83dfb4e80cad195c9f6f3"
checksum = "3354b9ac3fae1ff6755cb6db53683adb661634f67557942dea4facebec0fee4b"
[[package]]
name = "unindent"
@@ -535,44 +625,135 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "49874b5167b65d7193b8aba1567f5c7d93d001cafc34600cee003eda787e483f"
[[package]]
name = "windows-sys"
version = "0.36.1"
name = "wasi"
version = "0.11.0+wasi-snapshot-preview1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ea04155a16a59f9eab786fe12a4a450e75cdb175f9e0d80da1e17db09f55b8d2"
checksum = "9c8d87e72b64a3b4db28d11ce29237c246188f4f51057d65a7eab63b7987e423"
[[package]]
name = "wasm-bindgen"
version = "0.2.92"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4be2531df63900aeb2bca0daaaddec08491ee64ceecbee5076636a3b026795a8"
dependencies = [
"cfg-if",
"wasm-bindgen-macro",
]
[[package]]
name = "wasm-bindgen-backend"
version = "0.2.92"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "614d787b966d3989fa7bb98a654e369c762374fd3213d212cfc0251257e747da"
dependencies = [
"bumpalo",
"log",
"once_cell",
"proc-macro2",
"quote",
"syn",
"wasm-bindgen-shared",
]
[[package]]
name = "wasm-bindgen-macro"
version = "0.2.92"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a1f8823de937b71b9460c0c34e25f3da88250760bec0ebac694b49997550d726"
dependencies = [
"quote",
"wasm-bindgen-macro-support",
]
[[package]]
name = "wasm-bindgen-macro-support"
version = "0.2.92"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e94f17b526d0a461a191c78ea52bbce64071ed5c04c9ffe424dcb38f74171bb7"
dependencies = [
"proc-macro2",
"quote",
"syn",
"wasm-bindgen-backend",
"wasm-bindgen-shared",
]
[[package]]
name = "wasm-bindgen-shared"
version = "0.2.92"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "af190c94f2773fdb3729c55b007a722abb5384da03bc0986df4c289bf5567e96"
[[package]]
name = "web-time"
version = "1.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5a6580f308b1fad9207618087a65c04e7a10bc77e02c8e84e9b00dd4b12fa0bb"
dependencies = [
"js-sys",
"wasm-bindgen",
]
[[package]]
name = "windows-targets"
version = "0.52.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6f0713a46559409d202e70e28227288446bf7841d3211583a4b53e3f6d96e7eb"
dependencies = [
"windows_aarch64_gnullvm",
"windows_aarch64_msvc",
"windows_i686_gnu",
"windows_i686_gnullvm",
"windows_i686_msvc",
"windows_x86_64_gnu",
"windows_x86_64_gnullvm",
"windows_x86_64_msvc",
]
[[package]]
name = "windows_aarch64_msvc"
version = "0.36.1"
name = "windows_aarch64_gnullvm"
version = "0.52.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9bb8c3fd39ade2d67e9874ac4f3db21f0d710bee00fe7cab16949ec184eeaa47"
checksum = "7088eed71e8b8dda258ecc8bac5fb1153c5cffaf2578fc8ff5d61e23578d3263"
[[package]]
name = "windows_aarch64_msvc"
version = "0.52.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9985fd1504e250c615ca5f281c3f7a6da76213ebd5ccc9561496568a2752afb6"
[[package]]
name = "windows_i686_gnu"
version = "0.36.1"
version = "0.52.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "180e6ccf01daf4c426b846dfc66db1fc518f074baa793aa7d9b9aaeffad6a3b6"
checksum = "88ba073cf16d5372720ec942a8ccbf61626074c6d4dd2e745299726ce8b89670"
[[package]]
name = "windows_i686_gnullvm"
version = "0.52.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "87f4261229030a858f36b459e748ae97545d6f1ec60e5e0d6a3d32e0dc232ee9"
[[package]]
name = "windows_i686_msvc"
version = "0.36.1"
version = "0.52.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e2e7917148b2812d1eeafaeb22a97e4813dfa60a3f8f78ebe204bcc88f12f024"
checksum = "db3c2bf3d13d5b658be73463284eaf12830ac9a26a90c717b7f771dfe97487bf"
[[package]]
name = "windows_x86_64_gnu"
version = "0.36.1"
version = "0.52.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4dcd171b8776c41b97521e5da127a2d86ad280114807d0b2ab1e462bc764d9e1"
checksum = "4e4246f76bdeff09eb48875a0fd3e2af6aada79d409d33011886d3e1581517d9"
[[package]]
name = "windows_x86_64_gnullvm"
version = "0.52.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "852298e482cd67c356ddd9570386e2862b5673c85bd5f88df9ab6802b334c596"
[[package]]
name = "windows_x86_64_msvc"
version = "0.36.1"
version = "0.52.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c811ca4a8c853ef420abd8592ba53ddbbac90410fab6903b3e79972a631f7680"
checksum = "bec47e5bfd1bff0eeaf6d8b485cc1074891a197ab4225d504cb7a1ab88b02bf0"

View File

@@ -1 +0,0 @@
Adds validation to ensure that the `limit` parameter on `/publicRooms` is non-negative.

View File

@@ -1 +0,0 @@
Return `400 M_NOT_JSON` upon receiving invalid JSON in query parameters across various client and admin endpoints, rather than an internal server error.

View File

@@ -1 +0,0 @@
Make the CSAPI endpoint `/keys/device_signing/upload` idempotent.

View File

@@ -1 +0,0 @@
Use new receipts column to optimise receipt and push action SQL queries. Contributed by Nick @ Beeper (@fizzadar).

View File

@@ -1 +0,0 @@
Fix mypy with latest Twisted release.

View File

@@ -1 +0,0 @@
Add a prompt in the contributing guide to manually configure icu4c.

View File

@@ -1 +0,0 @@
Bump minimum supported Rust version to 1.66.0.

View File

@@ -1 +0,0 @@
Add helpers to transform Twisted requests to Rust http Requests/Responses.

View File

@@ -1 +0,0 @@
Support delegating the rendezvous mechanism described in MSC4108 to an external implementation.

View File

@@ -1 +0,0 @@
Use new receipts column to optimise receipt and push action SQL queries. Contributed by Nick @ Beeper (@fizzadar).

View File

@@ -1 +0,0 @@
Clarify what part of message retention is still experimental.

1
changelog.d/17139.doc Normal file
View File

@@ -0,0 +1 @@
Update User Admin API with note about prefixing OIDC external_id providers.

1
changelog.d/17145.bugfix Normal file
View File

@@ -0,0 +1 @@
Add support for optional whitespace around the Federation API's `Authorization` header's parameter commas.

1
changelog.d/17150.doc Normal file
View File

@@ -0,0 +1 @@
Clarify the state of the created room when using the `autocreate_auto_join_room_preset` config option.

1
changelog.d/17151.misc Normal file
View File

@@ -0,0 +1 @@
Add note to reflect that [MSC3886](https://github.com/matrix-org/matrix-spec-proposals/pull/3886) is closed but support will remain for some time.

1
changelog.d/17162.misc Normal file
View File

@@ -0,0 +1 @@
Update dependency PyO3 to 0.21.

1
changelog.d/17169.misc Normal file
View File

@@ -0,0 +1 @@
Add database performance improvement when fetching auth chains.

24
debian/changelog vendored
View File

@@ -1,3 +1,27 @@
matrix-synapse-py3 (1.107.0~rc1) stable; urgency=medium
* New Synapse release 1.107.0rc1.
-- Synapse Packaging team <packages@matrix.org> Tue, 07 May 2024 16:26:26 +0100
matrix-synapse-py3 (1.106.0) stable; urgency=medium
* New Synapse release 1.106.0.
-- Synapse Packaging team <packages@matrix.org> Tue, 30 Apr 2024 11:51:43 +0100
matrix-synapse-py3 (1.106.0~rc1) stable; urgency=medium
* New Synapse release 1.106.0rc1.
-- Synapse Packaging team <packages@matrix.org> Thu, 25 Apr 2024 15:54:59 +0100
matrix-synapse-py3 (1.105.1) stable; urgency=medium
* New Synapse release 1.105.1.
-- Synapse Packaging team <packages@matrix.org> Tue, 23 Apr 2024 15:56:18 +0100
matrix-synapse-py3 (1.105.0) stable; urgency=medium
* New Synapse release 1.105.0.

View File

@@ -1,20 +0,0 @@
# Minimal makefile for Sphinx documentation
#
# You can set these variables from the command line, and also
# from the environment for the first two.
SPHINXOPTS ?=
SPHINXBUILD ?= sphinx-build
SOURCEDIR = .
BUILDDIR = _build
# Put it first so that "make" without argument is like "make help".
help:
@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
.PHONY: help Makefile
# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)

View File

@@ -1,50 +0,0 @@
# Configuration file for the Sphinx documentation builder.
#
# For the full list of built-in configuration values, see the documentation:
# https://www.sphinx-doc.org/en/master/usage/configuration.html
# -- Project information -----------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#project-information
project = "Synapse development"
copyright = "2023, The Matrix.org Foundation C.I.C."
author = "The Synapse Maintainers and Community"
# -- General configuration ---------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#general-configuration
extensions = [
"autodoc2",
"myst_parser",
]
templates_path = ["_templates"]
exclude_patterns = ["_build", "Thumbs.db", ".DS_Store"]
# -- Options for Autodoc2 ----------------------------------------------------
autodoc2_docstring_parser_regexes = [
# this will render all docstrings as 'MyST' Markdown
(r".*", "myst"),
]
autodoc2_packages = [
{
"path": "../synapse",
# Don't render documentation for everything as a matter of course
"auto_mode": False,
},
]
# -- Options for MyST (Markdown) ---------------------------------------------
# myst_heading_anchors = 2
# -- Options for HTML output -------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#options-for-html-output
html_theme = "furo"
html_static_path = ["_static"]

View File

@@ -1,22 +0,0 @@
.. Synapse Developer Documentation documentation master file, created by
sphinx-quickstart on Mon Mar 13 08:59:51 2023.
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
Welcome to the Synapse Developer Documentation!
===========================================================
.. toctree::
:maxdepth: 2
:caption: Contents:
modules/federation_sender
Indices and tables
==================
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`

View File

@@ -1,5 +0,0 @@
Federation Sender
=================
```{autodoc2-docstring} synapse.federation.sender
```

View File

@@ -163,7 +163,7 @@ FROM docker.io/library/python:${PYTHON_VERSION}-slim-bookworm
LABEL org.opencontainers.image.url='https://matrix.org/docs/projects/server/synapse'
LABEL org.opencontainers.image.documentation='https://github.com/element-hq/synapse/blob/master/docker/README.md'
LABEL org.opencontainers.image.source='https://github.com/element-hq/synapse.git'
LABEL org.opencontainers.image.licenses='Apache-2.0'
LABEL org.opencontainers.image.licenses='AGPL-3.0-or-later'
RUN \
--mount=type=cache,target=/var/cache/apt,sharing=locked \

View File

@@ -92,8 +92,6 @@ allow_device_name_lookup_over_federation: true
## Experimental Features ##
experimental_features:
# client-side support for partial state in /send_join responses
faster_joins: true
# Enable support for polls
msc3381_polls_enabled: true
# Enable deleting device-specific notification settings stored in account data
@@ -104,6 +102,10 @@ experimental_features:
msc3874_enabled: true
# no UIA for x-signing upload for the first time
msc3967_enabled: true
# Expose a room summary for public rooms
msc3266_enabled: true
msc4115_membership_on_events: true
server_notices:
system_mxid_localpart: _server

View File

@@ -1,6 +1,6 @@
# Edit Room Membership API
This API allows an administrator to join an user account with a given `user_id`
This API allows an administrator to join a user account with a given `user_id`
to a room with a given `room_id_or_alias`. You can only modify the membership of
local users. The server administrator must be in the room and have permission to
invite users.

View File

@@ -141,8 +141,8 @@ Body parameters:
provider for SSO (Single sign-on). More details are in the configuration manual under the
sections [sso](../usage/configuration/config_documentation.md#sso) and [oidc_providers](../usage/configuration/config_documentation.md#oidc_providers).
- `auth_provider` - **string**, required. The unique, internal ID of the external identity provider.
The same as `idp_id` from the homeserver configuration. Note that no error is raised if the
provided value is not in the homeserver configuration.
The same as `idp_id` from the homeserver configuration. If using OIDC, this value should be prefixed
with `oidc-`. Note that no error is raised if the provided value is not in the homeserver configuration.
- `external_id` - **string**, required. An identifier for the user in the external identity provider.
When the user logs in to the identity provider, this must be the unique ID that they map to.
- `admin` - **bool**, optional, defaults to `false`. Whether the user is a homeserver administrator,

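For illustration, here is a hedged Python sketch of setting an OIDC `external_id` via this Admin API. The homeserver URL, access token, user ID and the `oidc-dex` IdP are placeholders, and the endpoint path is assumed from the surrounding User Admin API documentation.

```python
import json
import urllib.request

# Hypothetical values: replace with your homeserver, admin access token and user.
BASE_URL = "https://synapse.example.com"
ADMIN_TOKEN = "syt_adminaccesstoken"
USER_ID = "@alice:example.com"

# `auth_provider` is the idp_id from the homeserver config; for OIDC providers
# it should be prefixed with "oidc-" (e.g. idp_id "dex" -> "oidc-dex").
body = {
    "external_ids": [
        {"auth_provider": "oidc-dex", "external_id": "alice-subject-claim"},
    ]
}

req = urllib.request.Request(
    f"{BASE_URL}/_synapse/admin/v2/users/{USER_ID}",
    data=json.dumps(body).encode("utf-8"),
    method="PUT",
    headers={
        "Authorization": f"Bearer {ADMIN_TOKEN}",
        "Content-Type": "application/json",
    },
)
with urllib.request.urlopen(req) as resp:
    print(resp.status, resp.read().decode())
```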
View File

@@ -51,8 +51,8 @@ clients.
## Server configuration
Support for this feature can be enabled and configured by adding a the
`retention` in the Synapse configuration file (see
Support for this feature can be enabled and configured by adding the
`retention` option in the Synapse configuration file (see
[configuration manual](usage/configuration/config_documentation.md#retention)).
To enable support for message retention policies, set the setting
@@ -117,7 +117,7 @@ In this example, we define three jobs:
policy's `max_lifetime` is greater than a week.
Note that this example is tailored to show different configurations and
features slightly more jobs than it's probably necessary (in practice, a
features slightly more jobs than is probably necessary (in practice, a
server admin would probably consider it better to replace the two last
jobs with one that runs once a day and handles rooms which
policy's `max_lifetime` is greater than 3 days).

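As a rough sketch of the `retention` option discussed above (purely illustrative values, assuming PyYAML is available; consult the configuration manual for the authoritative schema):

```python
import yaml  # PyYAML, assumed available

# Illustrative sketch of a `retention` section with a default policy and two
# purge jobs; the values are placeholders, not recommendations.
sample = yaml.safe_load("""
retention:
  enabled: true
  default_policy:
    min_lifetime: 1d
    max_lifetime: 1y
  purge_jobs:
    - longest_max_lifetime: 3d
      interval: 12h
    - shortest_max_lifetime: 3d
      interval: 1d
""")
print(sample["retention"]["purge_jobs"])
```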
View File

@@ -128,7 +128,7 @@ can read more about that [here](https://www.postgresql.org/docs/10/kernel-resour
### Overview
The script `synapse_port_db` allows porting an existing synapse server
backed by SQLite to using PostgreSQL. This is done in as a two phase
backed by SQLite to using PostgreSQL. This is done as a two phase
process:
1. Copy the existing SQLite database to a separate location and run

View File

@@ -259,9 +259,9 @@ users, etc.) to the developers via the `--report-stats` argument.
This command will generate you a config file that you can then customise, but it will
also generate a set of keys for you. These keys will allow your homeserver to
identify itself to other homeserver, so don't lose or delete them. It would be
identify itself to other homeservers, so don't lose or delete them. It would be
wise to back them up somewhere safe. (If, for whatever reason, you do need to
change your homeserver's keys, you may find that other homeserver have the
change your homeserver's keys, you may find that other homeservers have the
old key cached. If you update the signing key, you should change the name of the
key in the `<server name>.signing.key` file (the second word) to something
different. See the [spec](https://matrix.org/docs/spec/server_server/latest.html#retrieving-server-keys) for more information on key management).

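For reference, a hedged sketch of the signing key file format this refers to: a single line whose second word is the key name that changes when rotating keys (the key material below is fabricated).

```python
# One line of "<algorithm> <key name> <base64 seed>"; the second word is the
# key name referred to above. The key material here is fake.
line = "ed25519 a_AbCd UqKLovmzAGdPwINGsIbbnfbvlTJVJx1sZmyOW12FmzY"
algorithm, key_name, key_material = line.split(" ", maxsplit=2)
print(algorithm, key_name)  # ed25519 a_AbCd
```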
View File

@@ -98,6 +98,7 @@ A custom mapping provider must specify the following methods:
either accept this localpart or pick their own username. Otherwise this
option has no effect. If omitted, defaults to `False`.
- `display_name`: An optional string, the display name for the user.
- `picture`: An optional string, the avatar url for the user.
- `emails`: A list of strings, the email address(es) to associate with
this user. If omitted, defaults to an empty list.
* `async def get_extra_attributes(self, userinfo, token)`

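A minimal, hypothetical mapping-provider sketch showing where the new `picture` attribute fits alongside the other documented keys; the method signatures follow the interface described above, but treat the details as assumptions rather than a drop-in implementation.

```python
from typing import Any, Dict

# Hypothetical provider: only the shape of the returned mapping matters here.
class ExampleMappingProvider:
    def __init__(self, parsed_config):
        self._config = parsed_config

    async def map_user_attributes(
        self, userinfo: Any, token: Any, failures: int = 0
    ) -> Dict[str, Any]:
        return {
            "localpart": userinfo["preferred_username"],
            "confirm_localpart": False,
            "display_name": userinfo.get("name"),
            "picture": userinfo.get("picture"),  # avatar URL, as documented above
            "emails": [userinfo["email"]] if userinfo.get("email") else [],
        }

    async def get_extra_attributes(self, userinfo: Any, token: Any) -> Dict[str, Any]:
        return {}
```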
View File

@@ -9,6 +9,7 @@ ReloadPropagatedFrom=matrix-synapse.target
Type=notify
NotifyAccess=main
User=matrix-synapse
RuntimeDirectory=synapse
WorkingDirectory=/var/lib/matrix-synapse
EnvironmentFile=-/etc/default/matrix-synapse
ExecStartPre=/opt/venvs/matrix-synapse/bin/python -m synapse.app.homeserver --config-path=/etc/matrix-synapse/homeserver.yaml --config-path=/etc/matrix-synapse/conf.d/ --generate-keys

View File

@@ -117,6 +117,14 @@ each upgrade are complete before moving on to the next upgrade, to avoid
stacking them up. You can monitor the currently running background updates with
[the Admin API](usage/administration/admin_api/background_updates.html#status).
# Upgrading to v1.106.0
## Minimum supported Rust version
The minimum supported Rust version has been increased from v1.65.0 to v1.66.0.
Users building from source will need to ensure their `rustc` version is up to
date.
# Upgrading to v1.100.0
## Minimum supported Rust version

View File

@@ -44,7 +44,7 @@ For each update:
## Enabled
This API allow pausing background updates.
This API allows pausing background updates.
Background updates should *not* be paused for significant periods of time, as
this can affect the performance of Synapse.

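For example, a hedged sketch of pausing background updates with this API (the `background_updates/enabled` path, homeserver URL and token are assumptions drawn from the Admin API docs):

```python
import json
import urllib.request

# Placeholder homeserver and admin token.
req = urllib.request.Request(
    "https://synapse.example.com/_synapse/admin/v1/background_updates/enabled",
    data=json.dumps({"enabled": False}).encode("utf-8"),
    method="POST",
    headers={
        "Authorization": "Bearer syt_admintoken",
        "Content-Type": "application/json",
    },
)
with urllib.request.urlopen(req) as resp:
    print(resp.read().decode())  # expected to echo {"enabled": false}
```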
View File

@@ -241,7 +241,7 @@ in memory constrained environments, or increased if performance starts to
degrade.
However, degraded performance due to a low cache factor, common on
machines with slow disks, often leads to explosions in memory use due
machines with slow disks, often leads to explosions in memory use due to
backlogged requests. In this case, reducing the cache factor will make
things worse. Instead, try increasing it drastically. 2.0 is a good
starting value.

View File

@@ -676,8 +676,8 @@ This setting has the following sub-options:
trailing 's'.
* `app_name`: `app_name` defines the default value for '%(app)s' in `notif_from` and email
subjects. It defaults to 'Matrix'.
* `enable_notifs`: Set to true to enable sending emails for messages that the user
has missed. Disabled by default.
* `enable_notifs`: Set to true to allow users to receive e-mail notifications. If this is not set,
users can configure e-mail notifications but will not receive them. Disabled by default.
* `notif_for_new_users`: Set to false to disable automatic subscription to email
notifications for new users. Enabled by default.
* `notif_delay_before_mail`: The time to wait before emailing about a notification.
@@ -1317,6 +1317,12 @@ Options related to caching.
The number of events to cache in memory. Defaults to 10K. Like other caches,
this is affected by `caches.global_factor` (see below).
For example, the default is 10K and the global_factor default is 0.5.
Since 10K * 0.5 is 5K, the event cache size will be 5K.
The cache affected by this configuration is named "*getEvent*".
Note that this option is not part of the `caches` section.
Example configuration:
@@ -1342,6 +1348,8 @@ number of entries that can be stored.
Defaults to 0.5, which will halve the size of all caches.
Note that changing this value also affects the HTTP connection pool.
* `per_cache_factors`: A dictionary of cache name to cache factor for that individual
cache. Overrides the global cache factor for a given cache.
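To make the sizing arithmetic above concrete, a hedged sketch (the config keys mirror the documentation, the per-cache name is only an example, and PyYAML is assumed):

```python
import yaml  # PyYAML, assumed available

# Hedged sketch of the cache sizing arithmetic described above.
config = yaml.safe_load("""
event_cache_size: 10K
caches:
  global_factor: 0.5
  per_cache_factors:
    get_users_in_room: 2.0
""")

def parse_size(value: str) -> int:
    # "10K" -> 10_000, matching the docs' example
    return int(float(value[:-1]) * 1000) if value.endswith("K") else int(value)

base = parse_size(config["event_cache_size"])
effective = base * config["caches"]["global_factor"]
print(effective)  # 5000.0 -> the cache named "*getEvent*" holds 5K events
```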
@@ -2583,6 +2591,11 @@ Possible values for this option are:
* "trusted_private_chat": an invitation is required to join this room and the invitee is
assigned a power level of 100 upon joining the room.
Each preset will set up a room in the same manner as if it were provided as the `preset` parameter when
calling the
[`POST /_matrix/client/v3/createRoom`](https://spec.matrix.org/latest/client-server-api/#post_matrixclientv3createroom)
Client-Server API endpoint.
If a value of "private_chat" or "trusted_private_chat" is used then
`auto_join_mxid_localpart` must also be configured.

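A hedged illustration of the preset behaviour described above: a config where the auto-join room is created with a non-public preset, which is why `auto_join_mxid_localpart` must also be set (values are placeholders, PyYAML assumed).

```python
import yaml  # PyYAML, assumed available

# Illustrative config sketch; the localpart and room alias are placeholders.
config = yaml.safe_load("""
auto_join_rooms:
  - "#welcome:example.com"
autocreate_auto_join_rooms: true
autocreate_auto_join_room_preset: trusted_private_chat
auto_join_mxid_localpart: system
""")
assert config["autocreate_auto_join_room_preset"] in (
    "public_chat", "private_chat", "trusted_private_chat",
)
```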
View File

@@ -86,9 +86,9 @@ The search term is then split into words:
* If unavailable, then runs of ASCII characters, numbers, underscores, and hyphens
are considered words.
The queries for PostgreSQL and SQLite are detailed below, by their overall goal
The queries for PostgreSQL and SQLite are detailed below, but their overall goal
is to find matching users, preferring users who are "real" (e.g. not bots,
not deactivated). It is assumed that real users will have an display name and
not deactivated). It is assumed that real users will have a display name and
avatar set.
### PostgreSQL

View File

@@ -232,7 +232,7 @@ information.
^/_matrix/client/v1/rooms/.*/hierarchy$
^/_matrix/client/(v1|unstable)/rooms/.*/relations/
^/_matrix/client/v1/rooms/.*/threads$
^/_matrix/client/unstable/im.nheko.summary/rooms/.*/summary$
^/_matrix/client/unstable/im.nheko.summary/summary/.*$
^/_matrix/client/(r0|v3|unstable)/account/3pid$
^/_matrix/client/(r0|v3|unstable)/account/whoami$
^/_matrix/client/(r0|v3|unstable)/devices$
@@ -634,7 +634,7 @@ worker application type.
#### Push Notifications
You can designate generic worker to sending push notifications to
You can designate generic workers to send push notifications to
a [push gateway](https://spec.matrix.org/v1.5/push-gateway-api/) such as
[sygnal](https://github.com/matrix-org/sygnal) and email.

1005
poetry.lock generated

File diff suppressed because it is too large

View File

@@ -96,7 +96,7 @@ module-name = "synapse.synapse_rust"
[tool.poetry]
name = "matrix-synapse"
version = "1.105.0"
version = "1.107.0rc1"
description = "Homeserver for the Matrix decentralised comms protocol"
authors = ["Matrix.org Team and Contributors <packages@matrix.org>"]
license = "AGPL-3.0-or-later"
@@ -364,17 +364,6 @@ towncrier = ">=18.6.0rc1"
tomli = ">=1.2.3"
# Dependencies for building the development documentation
[tool.poetry.group.dev-docs]
optional = true
[tool.poetry.group.dev-docs.dependencies]
sphinx = {version = "^6.1", python = "^3.8"}
sphinx-autodoc2 = {version = ">=0.4.2,<0.6.0", python = "^3.8"}
myst-parser = {version = "^1.0.0", python = "^3.8"}
furo = ">=2022.12.7,<2025.0.0"
[build-system]
# The upper bounds here are defensive, intended to prevent situations like
# https://github.com/matrix-org/synapse/issues/13849 and

View File

@@ -23,22 +23,26 @@ name = "synapse.synapse_rust"
[dependencies]
anyhow = "1.0.63"
base64 = "0.21.7"
bytes = "1.6.0"
headers = "0.4.0"
http = "1.1.0"
lazy_static = "1.4.0"
log = "0.4.17"
pyo3 = { version = "0.20.0", features = [
mime = "0.3.17"
pyo3 = { version = "0.21.0", features = [
"macros",
"anyhow",
"abi3",
"abi3-py38",
] }
pyo3-log = "0.9.0"
pythonize = "0.20.0"
pyo3-log = "0.10.0"
pythonize = "0.21.0"
regex = "1.6.0"
sha2 = "0.10.8"
serde = { version = "1.0.144", features = ["derive"] }
serde_json = "1.0.85"
ulid = "1.1.2"
[features]
extension-module = ["pyo3/extension-module"]

View File

@@ -25,21 +25,21 @@ use std::net::Ipv4Addr;
use std::str::FromStr;
use anyhow::Error;
use pyo3::prelude::*;
use pyo3::{prelude::*, pybacked::PyBackedStr};
use regex::Regex;
use crate::push::utils::{glob_to_regex, GlobMatchType};
/// Called when registering modules with python.
pub fn register_module(py: Python<'_>, m: &PyModule) -> PyResult<()> {
let child_module = PyModule::new(py, "acl")?;
pub fn register_module(py: Python<'_>, m: &Bound<'_, PyModule>) -> PyResult<()> {
let child_module = PyModule::new_bound(py, "acl")?;
child_module.add_class::<ServerAclEvaluator>()?;
m.add_submodule(child_module)?;
m.add_submodule(&child_module)?;
// We need to manually add the module to sys.modules to make `from
// synapse.synapse_rust import acl` work.
py.import("sys")?
py.import_bound("sys")?
.getattr("modules")?
.set_item("synapse.synapse_rust.acl", child_module)?;
@@ -59,8 +59,8 @@ impl ServerAclEvaluator {
#[new]
pub fn py_new(
allow_ip_literals: bool,
allow: Vec<&str>,
deny: Vec<&str>,
allow: Vec<PyBackedStr>,
deny: Vec<PyBackedStr>,
) -> Result<Self, Error> {
let allow = allow
.iter()

View File

@@ -20,8 +20,10 @@
//! Implements the internal metadata class attached to events.
//!
//! The internal metadata is a bit like a `TypedDict`, in that it is stored as a
//! JSON dict in the DB. Most events have zero, or only a few, of these keys
//! The internal metadata is a bit like a `TypedDict`, in that most of
//! it is stored as a JSON dict in the DB (the exceptions being `outlier`
//! and `stream_ordering` which have their own columns in the database).
//! Most events have zero, or only a few, of these keys
//! set. Therefore, since we care more about memory size than performance here,
//! we store these fields in a mapping.
//!
@@ -36,9 +38,10 @@ use anyhow::Context;
use log::warn;
use pyo3::{
exceptions::PyAttributeError,
pybacked::PyBackedStr,
pyclass, pymethods,
types::{PyDict, PyString},
IntoPy, PyAny, PyObject, PyResult, Python,
types::{PyAnyMethods, PyDict, PyDictMethods, PyString},
Bound, IntoPy, PyAny, PyObject, PyResult, Python,
};
/// Definitions of the various fields of the internal metadata.
@@ -57,7 +60,7 @@ enum EventInternalMetadataData {
impl EventInternalMetadataData {
/// Convert the field to its name and python object.
fn to_python_pair<'a>(&self, py: Python<'a>) -> (&'a PyString, PyObject) {
fn to_python_pair<'a>(&self, py: Python<'a>) -> (&'a Bound<'a, PyString>, PyObject) {
match self {
EventInternalMetadataData::OutOfBandMembership(o) => {
(pyo3::intern!(py, "out_of_band_membership"), o.into_py(py))
@@ -88,10 +91,13 @@ impl EventInternalMetadataData {
/// Converts from python key/values to the field.
///
/// Returns `None` if the key is a valid but unrecognized string.
fn from_python_pair(key: &PyAny, value: &PyAny) -> PyResult<Option<Self>> {
let key_str: &str = key.extract()?;
fn from_python_pair(
key: &Bound<'_, PyAny>,
value: &Bound<'_, PyAny>,
) -> PyResult<Option<Self>> {
let key_str: PyBackedStr = key.extract()?;
let e = match key_str {
let e = match &*key_str {
"out_of_band_membership" => EventInternalMetadataData::OutOfBandMembership(
value
.extract()
@@ -208,11 +214,11 @@ pub struct EventInternalMetadata {
#[pymethods]
impl EventInternalMetadata {
#[new]
fn new(dict: &PyDict) -> PyResult<Self> {
fn new(dict: &Bound<'_, PyDict>) -> PyResult<Self> {
let mut data = Vec::with_capacity(dict.len());
for (key, value) in dict.iter() {
match EventInternalMetadataData::from_python_pair(key, value) {
match EventInternalMetadataData::from_python_pair(&key, &value) {
Ok(Some(entry)) => data.push(entry),
Ok(None) => {}
Err(err) => {
@@ -234,8 +240,11 @@ impl EventInternalMetadata {
self.clone()
}
/// Get a dict holding the data stored in the `internal_metadata` column in the database.
///
/// Note that `outlier` and `stream_ordering` are stored in separate columns so are not returned here.
fn get_dict(&self, py: Python<'_>) -> PyResult<PyObject> {
let dict = PyDict::new(py);
let dict = PyDict::new_bound(py);
for entry in &self.data {
let (key, value) = entry.to_python_pair(py);

View File

@@ -20,20 +20,23 @@
//! Classes for representing Events.
use pyo3::{types::PyModule, PyResult, Python};
use pyo3::{
types::{PyAnyMethods, PyModule, PyModuleMethods},
Bound, PyResult, Python,
};
mod internal_metadata;
/// Called when registering modules with python.
pub fn register_module(py: Python<'_>, m: &PyModule) -> PyResult<()> {
let child_module = PyModule::new(py, "events")?;
pub fn register_module(py: Python<'_>, m: &Bound<'_, PyModule>) -> PyResult<()> {
let child_module = PyModule::new_bound(py, "events")?;
child_module.add_class::<internal_metadata::EventInternalMetadata>()?;
m.add_submodule(child_module)?;
m.add_submodule(&child_module)?;
// We need to manually add the module to sys.modules to make `from
// synapse.synapse_rust import events` work.
py.import("sys")?
py.import_bound("sys")?
.getattr("modules")?
.set_item("synapse.synapse_rust.events", child_module)?;

View File

@@ -17,8 +17,8 @@ use headers::{Header, HeaderMapExt};
use http::{HeaderName, HeaderValue, Method, Request, Response, StatusCode, Uri};
use pyo3::{
exceptions::PyValueError,
types::{PyBytes, PySequence, PyTuple},
PyAny, PyResult,
types::{PyAnyMethods, PyBytes, PyBytesMethods, PySequence, PyTuple},
Bound, PyAny, PyResult,
};
use crate::errors::SynapseError;
@@ -28,10 +28,11 @@ use crate::errors::SynapseError;
/// # Errors
///
/// Returns an error if calling the `read` on the Python object failed
fn read_io_body(body: &PyAny, chunk_size: usize) -> PyResult<Bytes> {
fn read_io_body(body: &Bound<'_, PyAny>, chunk_size: usize) -> PyResult<Bytes> {
let mut buf = BytesMut::new();
loop {
let bytes: &PyBytes = body.call_method1("read", (chunk_size,))?.downcast()?;
let bound = &body.call_method1("read", (chunk_size,))?;
let bytes: &Bound<'_, PyBytes> = bound.downcast()?;
if bytes.as_bytes().is_empty() {
return Ok(buf.into());
}
@@ -50,17 +51,19 @@ fn read_io_body(body: &PyAny, chunk_size: usize) -> PyResult<Bytes> {
/// # Errors
///
/// Returns an error if the Python object doesn't properly implement `IRequest`
pub fn http_request_from_twisted(request: &PyAny) -> PyResult<Request<Bytes>> {
pub fn http_request_from_twisted(request: &Bound<'_, PyAny>) -> PyResult<Request<Bytes>> {
let content = request.getattr("content")?;
let body = read_io_body(content, 4096)?;
let body = read_io_body(&content, 4096)?;
let mut req = Request::new(body);
let uri: &PyBytes = request.getattr("uri")?.downcast()?;
let bound = &request.getattr("uri")?;
let uri: &Bound<'_, PyBytes> = bound.downcast()?;
*req.uri_mut() =
Uri::try_from(uri.as_bytes()).map_err(|_| PyValueError::new_err("invalid uri"))?;
let method: &PyBytes = request.getattr("method")?.downcast()?;
let bound = &request.getattr("method")?;
let method: &Bound<'_, PyBytes> = bound.downcast()?;
*req.method_mut() = Method::from_bytes(method.as_bytes())
.map_err(|_| PyValueError::new_err("invalid method"))?;
@@ -71,14 +74,17 @@ pub fn http_request_from_twisted(request: &PyAny) -> PyResult<Request<Bytes>> {
for header in headers_iter {
let header = header?;
let header: &PyTuple = header.downcast()?;
let name: &PyBytes = header.get_item(0)?.downcast()?;
let header: &Bound<'_, PyTuple> = header.downcast()?;
let bound = &header.get_item(0)?;
let name: &Bound<'_, PyBytes> = bound.downcast()?;
let name = HeaderName::from_bytes(name.as_bytes())
.map_err(|_| PyValueError::new_err("invalid header name"))?;
let values: &PySequence = header.get_item(1)?.downcast()?;
let bound = &header.get_item(1)?;
let values: &Bound<'_, PySequence> = bound.downcast()?;
for index in 0..values.len()? {
let value: &PyBytes = values.get_item(index)?.downcast()?;
let bound = &values.get_item(index)?;
let value: &Bound<'_, PyBytes> = bound.downcast()?;
let value = HeaderValue::from_bytes(value.as_bytes())
.map_err(|_| PyValueError::new_err("invalid header value"))?;
req.headers_mut().append(name.clone(), value);
@@ -100,7 +106,10 @@ pub fn http_request_from_twisted(request: &PyAny) -> PyResult<Request<Bytes>> {
/// # Errors
///
/// Returns an error if the Python object doesn't properly implement `IRequest`
pub fn http_response_to_twisted<B>(request: &PyAny, response: Response<B>) -> PyResult<()>
pub fn http_response_to_twisted<B>(
request: &Bound<'_, PyAny>,
response: Response<B>,
) -> PyResult<()>
where
B: Buf,
{

View File

@@ -7,6 +7,7 @@ pub mod errors;
pub mod events;
pub mod http;
pub mod push;
pub mod rendezvous;
lazy_static! {
static ref LOGGING_HANDLE: ResetHandle = pyo3_log::init();
@@ -37,7 +38,7 @@ fn reset_logging_config() {
/// The entry point for defining the Python module.
#[pymodule]
fn synapse_rust(py: Python<'_>, m: &PyModule) -> PyResult<()> {
fn synapse_rust(py: Python<'_>, m: &Bound<'_, PyModule>) -> PyResult<()> {
m.add_function(wrap_pyfunction!(sum_as_string, m)?)?;
m.add_function(wrap_pyfunction!(get_rust_file_digest, m)?)?;
m.add_function(wrap_pyfunction!(reset_logging_config, m)?)?;
@@ -45,6 +46,7 @@ fn synapse_rust(py: Python<'_>, m: &PyModule) -> PyResult<()> {
acl::register_module(py, m)?;
push::register_module(py, m)?;
events::register_module(py, m)?;
rendezvous::register_module(py, m)?;
Ok(())
}

View File

@@ -66,7 +66,7 @@ use log::warn;
use pyo3::exceptions::PyTypeError;
use pyo3::prelude::*;
use pyo3::types::{PyBool, PyList, PyLong, PyString};
use pythonize::{depythonize, pythonize};
use pythonize::{depythonize_bound, pythonize};
use serde::de::Error as _;
use serde::{Deserialize, Serialize};
use serde_json::Value;
@@ -78,19 +78,19 @@ pub mod evaluator;
pub mod utils;
/// Called when registering modules with python.
pub fn register_module(py: Python<'_>, m: &PyModule) -> PyResult<()> {
let child_module = PyModule::new(py, "push")?;
pub fn register_module(py: Python<'_>, m: &Bound<'_, PyModule>) -> PyResult<()> {
let child_module = PyModule::new_bound(py, "push")?;
child_module.add_class::<PushRule>()?;
child_module.add_class::<PushRules>()?;
child_module.add_class::<FilteredPushRules>()?;
child_module.add_class::<PushRuleEvaluator>()?;
child_module.add_function(wrap_pyfunction!(get_base_rule_ids, m)?)?;
m.add_submodule(child_module)?;
m.add_submodule(&child_module)?;
// We need to manually add the module to sys.modules to make `from
// synapse.synapse_rust import push` work.
py.import("sys")?
py.import_bound("sys")?
.getattr("modules")?
.set_item("synapse.synapse_rust.push", child_module)?;
@@ -271,12 +271,12 @@ pub enum SimpleJsonValue {
impl<'source> FromPyObject<'source> for SimpleJsonValue {
fn extract(ob: &'source PyAny) -> PyResult<Self> {
if let Ok(s) = <PyString as pyo3::PyTryFrom>::try_from(ob) {
if let Ok(s) = ob.downcast::<PyString>() {
Ok(SimpleJsonValue::Str(Cow::Owned(s.to_string())))
// A bool *is* an int, ensure we try bool first.
} else if let Ok(b) = <PyBool as pyo3::PyTryFrom>::try_from(ob) {
} else if let Ok(b) = ob.downcast::<PyBool>() {
Ok(SimpleJsonValue::Bool(b.extract()?))
} else if let Ok(i) = <PyLong as pyo3::PyTryFrom>::try_from(ob) {
} else if let Ok(i) = ob.downcast::<PyLong>() {
Ok(SimpleJsonValue::Int(i.extract()?))
} else if ob.is_none() {
Ok(SimpleJsonValue::Null)
@@ -299,7 +299,7 @@ pub enum JsonValue {
impl<'source> FromPyObject<'source> for JsonValue {
fn extract(ob: &'source PyAny) -> PyResult<Self> {
if let Ok(l) = <PyList as pyo3::PyTryFrom>::try_from(ob) {
if let Ok(l) = ob.downcast::<PyList>() {
match l.iter().map(SimpleJsonValue::extract).collect() {
Ok(a) => Ok(JsonValue::Array(a)),
Err(e) => Err(PyTypeError::new_err(format!(
@@ -370,8 +370,8 @@ impl IntoPy<PyObject> for Condition {
}
impl<'source> FromPyObject<'source> for Condition {
fn extract(ob: &'source PyAny) -> PyResult<Self> {
Ok(depythonize(ob)?)
fn extract_bound(ob: &Bound<'source, PyAny>) -> PyResult<Self> {
Ok(depythonize_bound(ob.clone())?)
}
}

327
rust/src/rendezvous/mod.rs Normal file
View File

@@ -0,0 +1,327 @@
/*
* This file is licensed under the Affero General Public License (AGPL) version 3.
*
* Copyright (C) 2024 New Vector, Ltd
*
* This program is free software: you can redistribute it and/or modify
* it under the terms of the GNU Affero General Public License as
* published by the Free Software Foundation, either version 3 of the
* License, or (at your option) any later version.
*
* See the GNU Affero General Public License for more details:
* <https://www.gnu.org/licenses/agpl-3.0.html>.
*
*/
use std::{
collections::{BTreeMap, HashMap},
time::{Duration, SystemTime},
};
use bytes::Bytes;
use headers::{
AccessControlAllowOrigin, AccessControlExposeHeaders, CacheControl, ContentLength, ContentType,
HeaderMapExt, IfMatch, IfNoneMatch, Pragma,
};
use http::{header::ETAG, HeaderMap, Response, StatusCode, Uri};
use mime::Mime;
use pyo3::{
exceptions::PyValueError,
pyclass, pymethods,
types::{PyAnyMethods, PyModule, PyModuleMethods},
Bound, Py, PyAny, PyObject, PyResult, Python, ToPyObject,
};
use ulid::Ulid;
use self::session::Session;
use crate::{
errors::{NotFoundError, SynapseError},
http::{http_request_from_twisted, http_response_to_twisted, HeaderMapPyExt},
};
mod session;
// n.b. Because OPTIONS requests are handled by the Python code, we don't need to set Access-Control-Allow-Headers.
fn prepare_headers(headers: &mut HeaderMap, session: &Session) {
headers.typed_insert(AccessControlAllowOrigin::ANY);
headers.typed_insert(AccessControlExposeHeaders::from_iter([ETAG]));
headers.typed_insert(Pragma::no_cache());
headers.typed_insert(CacheControl::new().with_no_store());
headers.typed_insert(session.etag());
headers.typed_insert(session.expires());
headers.typed_insert(session.last_modified());
}
#[pyclass]
struct RendezvousHandler {
base: Uri,
clock: PyObject,
sessions: BTreeMap<Ulid, Session>,
capacity: usize,
max_content_length: u64,
ttl: Duration,
}
impl RendezvousHandler {
/// Check the input headers of a request which sets data for a session, and return the content type.
fn check_input_headers(&self, headers: &HeaderMap) -> PyResult<Mime> {
let ContentLength(content_length) = headers.typed_get_required()?;
if content_length > self.max_content_length {
return Err(SynapseError::new(
StatusCode::PAYLOAD_TOO_LARGE,
"Payload too large".to_owned(),
"M_TOO_LARGE",
None,
None,
));
}
let content_type: ContentType = headers.typed_get_required()?;
// Content-Type must be text/plain
if content_type != ContentType::text() {
return Err(SynapseError::new(
StatusCode::BAD_REQUEST,
"Content-Type must be text/plain".to_owned(),
"M_INVALID_PARAM",
None,
None,
));
}
Ok(content_type.into())
}
/// Evict expired sessions and remove the oldest sessions until we're under the capacity.
fn evict(&mut self, now: SystemTime) {
// First remove all the entries which expired
self.sessions.retain(|_, session| !session.expired(now));
// Then we remove the oldest entries until we're under the limit
while self.sessions.len() > self.capacity {
self.sessions.pop_first();
}
}
}
#[pymethods]
impl RendezvousHandler {
#[new]
#[pyo3(signature = (homeserver, /, capacity=100, max_content_length=4*1024, eviction_interval=60*1000, ttl=60*1000))]
fn new(
py: Python<'_>,
homeserver: &Bound<'_, PyAny>,
capacity: usize,
max_content_length: u64,
eviction_interval: u64,
ttl: u64,
) -> PyResult<Py<Self>> {
let base: String = homeserver
.getattr("config")?
.getattr("server")?
.getattr("public_baseurl")?
.extract()?;
let base = Uri::try_from(format!("{base}_synapse/client/rendezvous"))
.map_err(|_| PyValueError::new_err("Invalid base URI"))?;
let clock = homeserver.call_method0("get_clock")?.to_object(py);
// Construct a Python object so that we can get a reference to the
// evict method and schedule it to run.
let self_ = Py::new(
py,
Self {
base,
clock,
sessions: BTreeMap::new(),
capacity,
max_content_length,
ttl: Duration::from_millis(ttl),
},
)?;
let evict = self_.getattr(py, "_evict")?;
homeserver.call_method0("get_clock")?.call_method(
"looping_call",
(evict, eviction_interval),
None,
)?;
Ok(self_)
}
fn _evict(&mut self, py: Python<'_>) -> PyResult<()> {
let clock = self.clock.bind(py);
let now: u64 = clock.call_method0("time_msec")?.extract()?;
let now = SystemTime::UNIX_EPOCH + Duration::from_millis(now);
self.evict(now);
Ok(())
}
fn handle_post(&mut self, py: Python<'_>, twisted_request: &Bound<'_, PyAny>) -> PyResult<()> {
let request = http_request_from_twisted(twisted_request)?;
let content_type = self.check_input_headers(request.headers())?;
let clock = self.clock.bind(py);
let now: u64 = clock.call_method0("time_msec")?.extract()?;
let now = SystemTime::UNIX_EPOCH + Duration::from_millis(now);
// We trigger an immediate eviction if we're at 2x the capacity
if self.sessions.len() >= self.capacity * 2 {
self.evict(now);
}
// Generate a new ULID for the session from the current time.
let id = Ulid::from_datetime(now);
let uri = format!("{base}/{id}", base = self.base);
let body = request.into_body();
let session = Session::new(body, content_type, now, self.ttl);
let response = serde_json::json!({
"url": uri,
})
.to_string();
let mut response = Response::new(response.as_bytes());
*response.status_mut() = StatusCode::CREATED;
response.headers_mut().typed_insert(ContentType::json());
prepare_headers(response.headers_mut(), &session);
http_response_to_twisted(twisted_request, response)?;
self.sessions.insert(id, session);
Ok(())
}
fn handle_get(
&mut self,
py: Python<'_>,
twisted_request: &Bound<'_, PyAny>,
id: &str,
) -> PyResult<()> {
let request = http_request_from_twisted(twisted_request)?;
let if_none_match: Option<IfNoneMatch> = request.headers().typed_get_optional()?;
let now: u64 = self.clock.call_method0(py, "time_msec")?.extract(py)?;
let now = SystemTime::UNIX_EPOCH + Duration::from_millis(now);
let id: Ulid = id.parse().map_err(|_| NotFoundError::new())?;
let session = self
.sessions
.get(&id)
.filter(|s| !s.expired(now))
.ok_or_else(NotFoundError::new)?;
if let Some(if_none_match) = if_none_match {
if !if_none_match.precondition_passes(&session.etag()) {
let mut response = Response::new(Bytes::new());
*response.status_mut() = StatusCode::NOT_MODIFIED;
prepare_headers(response.headers_mut(), session);
http_response_to_twisted(twisted_request, response)?;
return Ok(());
}
}
let mut response = Response::new(session.data());
*response.status_mut() = StatusCode::OK;
let headers = response.headers_mut();
prepare_headers(headers, session);
headers.typed_insert(session.content_type());
headers.typed_insert(session.content_length());
http_response_to_twisted(twisted_request, response)?;
Ok(())
}
fn handle_put(
&mut self,
py: Python<'_>,
twisted_request: &Bound<'_, PyAny>,
id: &str,
) -> PyResult<()> {
let request = http_request_from_twisted(twisted_request)?;
let content_type = self.check_input_headers(request.headers())?;
let if_match: IfMatch = request.headers().typed_get_required()?;
let data = request.into_body();
let now: u64 = self.clock.call_method0(py, "time_msec")?.extract(py)?;
let now = SystemTime::UNIX_EPOCH + Duration::from_millis(now);
let id: Ulid = id.parse().map_err(|_| NotFoundError::new())?;
let session = self
.sessions
.get_mut(&id)
.filter(|s| !s.expired(now))
.ok_or_else(NotFoundError::new)?;
if !if_match.precondition_passes(&session.etag()) {
let mut headers = HeaderMap::new();
prepare_headers(&mut headers, session);
let mut additional_fields = HashMap::with_capacity(1);
additional_fields.insert(
String::from("org.matrix.msc4108.errcode"),
String::from("M_CONCURRENT_WRITE"),
);
return Err(SynapseError::new(
StatusCode::PRECONDITION_FAILED,
"ETag does not match".to_owned(),
"M_UNKNOWN", // Would be M_CONCURRENT_WRITE
Some(additional_fields),
Some(headers),
));
}
session.update(data, content_type, now);
let mut response = Response::new(Bytes::new());
*response.status_mut() = StatusCode::ACCEPTED;
prepare_headers(response.headers_mut(), session);
http_response_to_twisted(twisted_request, response)?;
Ok(())
}
fn handle_delete(&mut self, twisted_request: &Bound<'_, PyAny>, id: &str) -> PyResult<()> {
let _request = http_request_from_twisted(twisted_request)?;
let id: Ulid = id.parse().map_err(|_| NotFoundError::new())?;
let _session = self.sessions.remove(&id).ok_or_else(NotFoundError::new)?;
let mut response = Response::new(Bytes::new());
*response.status_mut() = StatusCode::NO_CONTENT;
response
.headers_mut()
.typed_insert(AccessControlAllowOrigin::ANY);
http_response_to_twisted(twisted_request, response)?;
Ok(())
}
}
pub fn register_module(py: Python<'_>, m: &Bound<'_, PyModule>) -> PyResult<()> {
let child_module = PyModule::new_bound(py, "rendezvous")?;
child_module.add_class::<RendezvousHandler>()?;
m.add_submodule(&child_module)?;
// We need to manually add the module to sys.modules to make `from
// synapse.synapse_rust import rendezvous` work.
py.import_bound("sys")?
.getattr("modules")?
.set_item("synapse.synapse_rust.rendezvous", child_module)?;
Ok(())
}

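To make the request flow implemented above concrete, here is a hedged client-side sketch of the POST/GET/PUT/DELETE sequence; the homeserver URL is a placeholder, and the behaviour (ETag, If-Match/If-None-Match, status codes) is read off the Rust handler above rather than a separate spec.

```python
import json
import urllib.request
from urllib.error import HTTPError

# Placeholder base URL; the real path is public_baseurl + "_synapse/client/rendezvous",
# as constructed in RendezvousHandler::new above.
BASE = "https://synapse.example.com/_synapse/client/rendezvous"

def call(url, method, body=None, headers=None):
    req = urllib.request.Request(url, data=body, method=method, headers=headers or {})
    return urllib.request.urlopen(req)

# POST creates a session: text/plain body, JSON response with the session URL,
# and the ETag exposed via response headers (see handle_post).
resp = call(BASE, "POST", b"hello", {"Content-Type": "text/plain"})
session_url = json.loads(resp.read())["url"]
etag = resp.headers["ETag"]

# GET with a matching If-None-Match yields 304 Not Modified (see handle_get);
# urllib reports non-2xx statuses as HTTPError.
try:
    call(session_url, "GET", headers={"If-None-Match": etag})
except HTTPError as e:
    assert e.code == 304

# PUT must carry If-Match with the last seen ETag; a stale ETag is rejected with
# 412 and errcode M_UNKNOWN ("would be M_CONCURRENT_WRITE", see handle_put).
call(session_url, "PUT", b"updated", {"Content-Type": "text/plain", "If-Match": etag})

# DELETE removes the session (see handle_delete), returning 204 No Content.
call(session_url, "DELETE")
```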
View File

@@ -0,0 +1,91 @@
/*
* This file is licensed under the Affero General Public License (AGPL) version 3.
*
* Copyright (C) 2024 New Vector, Ltd
*
* This program is free software: you can redistribute it and/or modify
* it under the terms of the GNU Affero General Public License as
* published by the Free Software Foundation, either version 3 of the
* License, or (at your option) any later version.
*
* See the GNU Affero General Public License for more details:
* <https://www.gnu.org/licenses/agpl-3.0.html>.
*/
use std::time::{Duration, SystemTime};
use base64::{engine::general_purpose::URL_SAFE_NO_PAD, Engine as _};
use bytes::Bytes;
use headers::{ContentLength, ContentType, ETag, Expires, LastModified};
use mime::Mime;
use sha2::{Digest, Sha256};
/// A single session, containing data, metadata, and expiry information.
pub struct Session {
hash: [u8; 32],
data: Bytes,
content_type: Mime,
last_modified: SystemTime,
expires: SystemTime,
}
impl Session {
/// Create a new session with the given data, content type, and time-to-live.
pub fn new(data: Bytes, content_type: Mime, now: SystemTime, ttl: Duration) -> Self {
let hash = Sha256::digest(&data).into();
Self {
hash,
data,
content_type,
expires: now + ttl,
last_modified: now,
}
}
/// Returns true if the session has expired at the given time.
pub fn expired(&self, now: SystemTime) -> bool {
self.expires <= now
}
/// Update the session with new data, content type, and last modified time.
pub fn update(&mut self, data: Bytes, content_type: Mime, now: SystemTime) {
self.hash = Sha256::digest(&data).into();
self.data = data;
self.content_type = content_type;
self.last_modified = now;
}
/// Returns the Content-Type header of the session.
pub fn content_type(&self) -> ContentType {
self.content_type.clone().into()
}
/// Returns the Content-Length header of the session.
pub fn content_length(&self) -> ContentLength {
ContentLength(self.data.len() as _)
}
/// Returns the ETag header of the session.
pub fn etag(&self) -> ETag {
let encoded = URL_SAFE_NO_PAD.encode(self.hash);
// SAFETY: Base64 encoding is URL-safe, so ETag-safe
format!("\"{encoded}\"")
.parse()
.expect("base64-encoded hash should be URL-safe")
}
/// Returns the Last-Modified header of the session.
pub fn last_modified(&self) -> LastModified {
self.last_modified.into()
}
/// Returns the Expires header of the session.
pub fn expires(&self) -> Expires {
self.expires.into()
}
/// Returns the current data stored in the session.
pub fn data(&self) -> Bytes {
self.data.clone()
}
}

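For reference, the ETag derivation in `Session::etag` above (SHA-256 of the payload, URL-safe base64 without padding, wrapped in quotes) restated as a short Python sketch:

```python
import base64
import hashlib

def session_etag(data: bytes) -> str:
    # Sha256::digest(&data), then URL_SAFE_NO_PAD base64, quoted as an entity-tag.
    digest = hashlib.sha256(data).digest()
    encoded = base64.urlsafe_b64encode(digest).rstrip(b"=").decode("ascii")
    return f'"{encoded}"'

print(session_etag(b"hello"))
```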
View File

@@ -214,7 +214,17 @@ fi
extra_test_args=()
test_packages="./tests/csapi ./tests ./tests/msc3874 ./tests/msc3890 ./tests/msc3391 ./tests/msc3930 ./tests/msc3902 ./tests/msc3967"
test_packages=(
./tests/csapi
./tests
./tests/msc3874
./tests/msc3890
./tests/msc3391
./tests/msc3930
./tests/msc3902
./tests/msc3967
./tests/msc4115
)
# Enable dirty runs, so tests will reuse the same container where possible.
# This significantly speeds up tests, but increases the possibility of test pollution.
@@ -278,7 +288,7 @@ fi
export PASS_SYNAPSE_LOG_TESTING=1
# Run the tests!
echo "Images built; running complement with ${extra_test_args[@]} $@ $test_packages"
echo "Images built; running complement with ${extra_test_args[@]} $@ ${test_packages[@]}"
cd "$COMPLEMENT_DIR"
go test -v -tags "synapse_blacklist" -count=1 "${extra_test_args[@]}" "$@" $test_packages
go test -v -tags "synapse_blacklist" -count=1 "${extra_test_args[@]}" "$@" "${test_packages[@]}"

View File

@@ -91,7 +91,6 @@ else
"synapse" "docker" "tests"
"scripts-dev"
"contrib" "synmark" "stubs" ".ci"
"dev-docs"
)
fi
fi

View File

@@ -127,7 +127,7 @@ BOOLEAN_COLUMNS = {
"redactions": ["have_censored"],
"room_stats_state": ["is_federatable"],
"rooms": ["is_public", "has_auth_chain_index"],
"users": ["shadow_banned", "approved", "locked"],
"users": ["shadow_banned", "approved", "locked", "suspended"],
"un_partial_stated_event_stream": ["rejection_status_changed"],
"users_who_share_rooms": ["share_private"],
"per_user_experimental_features": ["enabled"],

View File

@@ -234,6 +234,13 @@ class EventContentFields:
TO_DEVICE_MSGID: Final = "org.matrix.msgid"
class EventUnsignedContentFields:
"""Fields found inside the 'unsigned' data on events"""
# Requesting user's membership, per MSC4115
MSC4115_MEMBERSHIP: Final = "io.element.msc4115.membership"
class RoomTypes:
"""Understood values of the room_type field of m.room.create events."""

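A hedged sketch of how the new unsigned field might appear on an event served to a client when `msc4115_membership_on_events` is enabled; the event itself is fabricated for illustration.

```python
# Fabricated event as a client might see it over /sync with the experimental
# flag enabled; only the unsigned key below is taken from the constant above.
event = {
    "type": "m.room.message",
    "sender": "@alice:example.com",
    "content": {"msgtype": "m.text", "body": "hi"},
    "unsigned": {
        # EventUnsignedContentFields.MSC4115_MEMBERSHIP
        "io.element.msc4115.membership": "join",
    },
}

membership_at_event = event.get("unsigned", {}).get("io.element.msc4115.membership")
print(membership_at_event)  # "join"
```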
View File

@@ -52,6 +52,7 @@ DEFAULT_SUBJECTS = {
"invite_from_person_to_space": "[%(app)s] %(person)s has invited you to join the %(space)s space on %(app)s...",
"password_reset": "[%(server_name)s] Password reset",
"email_validation": "[%(server_name)s] Validate your email",
"email_already_in_use": "[%(server_name)s] Email already in use",
}
LEGACY_TEMPLATE_DIR_WARNING = """
@@ -76,6 +77,7 @@ class EmailSubjectConfig:
invite_from_person_to_space: str
password_reset: str
email_validation: str
email_already_in_use: str
class EmailConfig(Config):
@@ -180,6 +182,12 @@ class EmailConfig(Config):
registration_template_text = email_config.get(
"registration_template_text", "registration.txt"
)
already_in_use_template_html = email_config.get(
"already_in_use_template_html", "already_in_use.html"
)
already_in_use_template_text = email_config.get(
"already_in_use_template_html", "already_in_use.txt"
)
add_threepid_template_html = email_config.get(
"add_threepid_template_html", "add_threepid.html"
)
@@ -215,6 +223,8 @@ class EmailConfig(Config):
self.email_password_reset_template_text,
self.email_registration_template_html,
self.email_registration_template_text,
self.email_already_in_use_template_html,
self.email_already_in_use_template_text,
self.email_add_threepid_template_html,
self.email_add_threepid_template_text,
self.email_password_reset_template_confirmation_html,
@@ -230,6 +240,8 @@ class EmailConfig(Config):
password_reset_template_text,
registration_template_html,
registration_template_text,
already_in_use_template_html,
already_in_use_template_text,
add_threepid_template_html,
add_threepid_template_text,
"password_reset_confirmation.html",

View File

@@ -413,12 +413,26 @@ class ExperimentalConfig(Config):
)
# MSC4108: Mechanism to allow OIDC sign in and E2EE set up via QR code
self.msc4108_enabled = experimental.get("msc4108_enabled", False)
self.msc4108_delegation_endpoint: Optional[str] = experimental.get(
"msc4108_delegation_endpoint", None
)
if self.msc4108_delegation_endpoint is not None and not self.msc3861.enabled:
if (
self.msc4108_enabled or self.msc4108_delegation_endpoint is not None
) and not self.msc3861.enabled:
raise ConfigError(
"MSC4108 requires MSC3861 to be enabled",
("experimental", "msc4108_delegation_endpoint"),
)
if self.msc4108_delegation_endpoint is not None and self.msc4108_enabled:
raise ConfigError(
"You cannot have MSC4108 both enabled and delegated at the same time",
("experimental", "msc4108_delegation_endpoint"),
)
self.msc4115_membership_on_events = experimental.get(
"msc4115_membership_on_events", False
)

View File

@@ -49,7 +49,7 @@ from synapse.api.errors import Codes, SynapseError
from synapse.api.room_versions import RoomVersion
from synapse.types import JsonDict, Requester
from . import EventBase
from . import EventBase, make_event_from_dict
if TYPE_CHECKING:
from synapse.handlers.relations import BundledAggregations
@@ -82,17 +82,14 @@ def prune_event(event: EventBase) -> EventBase:
"""
pruned_event_dict = prune_event_dict(event.room_version, event.get_dict())
from . import make_event_from_dict
pruned_event = make_event_from_dict(
pruned_event_dict, event.room_version, event.internal_metadata.get_dict()
)
# copy the internal fields
# Copy the bits of `internal_metadata` that aren't returned by `get_dict`
pruned_event.internal_metadata.stream_ordering = (
event.internal_metadata.stream_ordering
)
pruned_event.internal_metadata.outlier = event.internal_metadata.outlier
# Mark the event as redacted
@@ -101,6 +98,29 @@ def prune_event(event: EventBase) -> EventBase:
return pruned_event
def clone_event(event: EventBase) -> EventBase:
"""Take a copy of the event.
This is mostly useful because it does a *shallow* copy of the `unsigned` data,
which means it can then be updated without corrupting the in-memory cache. Note that
other properties of the event, such as `content`, are *not* (currently) copied here.
"""
# XXX: We rely on at least one of `event.get_dict()` and `make_event_from_dict()`
# making a copy of `unsigned`. Currently, both do, though I don't really know why.
# Still, as long as they do, there's not much point doing yet another copy here.
new_event = make_event_from_dict(
event.get_dict(), event.room_version, event.internal_metadata.get_dict()
)
# Copy the bits of `internal_metadata` that aren't returned by `get_dict`.
new_event.internal_metadata.stream_ordering = (
event.internal_metadata.stream_ordering
)
new_event.internal_metadata.outlier = event.internal_metadata.outlier
return new_event
def prune_event_dict(room_version: RoomVersion, event_dict: JsonDict) -> JsonDict:
"""Redacts the event_dict in the same way as `prune_event`, except it
operates on dicts rather than event objects

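A hedged usage sketch of `clone_event`, assuming a Synapse development checkout so the imports resolve; the event dict and room version are placeholders. The point is that the clone's shallow-copied `unsigned` can be annotated without touching the (possibly cached) original.

```python
from synapse.api.room_versions import RoomVersions
from synapse.events import make_event_from_dict
from synapse.events.utils import clone_event

# A minimal, fabricated event; real events carry many more fields.
original = make_event_from_dict(
    {
        "room_id": "!room:example.com",
        "type": "m.room.message",
        "sender": "@alice:example.com",
        "content": {"msgtype": "m.text", "body": "hi"},
        "unsigned": {},
    },
    RoomVersions.V10,
)

copy = clone_event(original)
# Only the clone's `unsigned` mapping is mutated; the original stays untouched.
copy.unsigned["io.element.msc4115.membership"] = "join"
assert "io.element.msc4115.membership" not in original.unsigned
```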
View File

@@ -546,7 +546,25 @@ class FederationServer(FederationBase):
edu_type=edu_dict["edu_type"],
content=edu_dict["content"],
)
await self.registry.on_edu(edu.edu_type, origin, edu.content)
try:
await self.registry.on_edu(edu.edu_type, origin, edu.content)
except Exception:
# If there was an error handling the EDU, we must reject the
# transaction.
#
# Some EDU types (notably, to-device messages) are, despite their name,
# expected to be reliable; if we weren't able to do something with it,
# we have to tell the sender that, and the only way the protocol gives
# us to do so is by sending an HTTP error back on the transaction.
#
# We log the exception now, and then raise a new SynapseError to cause
# the transaction to be failed.
logger.exception("Error handling EDU of type %s", edu.edu_type)
raise SynapseError(500, f"Error handling EDU of type {edu.edu_type}")
# TODO: if the first EDU fails, we should probably abort the whole
# thing rather than carrying on with the rest of them. That would
# probably be best done inside `concurrently_execute`.
await concurrently_execute(
_process_edu,
@@ -1414,12 +1432,7 @@ class FederationHandlerRegistry:
handler = self.edu_handlers.get(edu_type)
if handler:
with start_active_span_from_edu(content, "handle_edu"):
try:
await handler(origin, content)
except SynapseError as e:
logger.info("Failed to handle edu %r: %r", edu_type, e)
except Exception:
logger.exception("Failed to handle edu %r", edu_type)
await handler(origin, content)
return
# Check if we can route it somewhere else that isn't us
@@ -1428,17 +1441,12 @@ class FederationHandlerRegistry:
# Pick an instance randomly so that we don't overload one.
route_to = random.choice(instances)
try:
await self._send_edu(
instance_name=route_to,
edu_type=edu_type,
origin=origin,
content=content,
)
except SynapseError as e:
logger.info("Failed to handle edu %r: %r", edu_type, e)
except Exception:
logger.exception("Failed to handle edu %r", edu_type)
await self._send_edu(
instance_name=route_to,
edu_type=edu_type,
origin=origin,
content=content,
)
return
# Oh well, let's just log and move on.

View File

@@ -180,7 +180,11 @@ def _parse_auth_header(header_bytes: bytes) -> Tuple[str, str, str, Optional[str
"""
try:
header_str = header_bytes.decode("utf-8")
params = re.split(" +", header_str)[1].split(",")
space_or_tab = "[ \t]"
params = re.split(
rf"{space_or_tab}*,{space_or_tab}*",
re.split(r"^X-Matrix +", header_str, maxsplit=1)[1],
)
param_dict: Dict[str, str] = {
k.lower(): v for k, v in [param.split("=", maxsplit=1) for param in params]
}

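A standalone, hedged illustration of the relaxed parsing above, applying the same optional-whitespace split to a fabricated `X-Matrix` header (the origin, key and signature values are fake):

```python
import re
from typing import Dict

def parse_x_matrix_params(header: str) -> Dict[str, str]:
    # Mirrors the approach above: strip the "X-Matrix " scheme, then split the
    # parameters on commas surrounded by optional spaces or tabs.
    space_or_tab = "[ \t]"
    params = re.split(
        rf"{space_or_tab}*,{space_or_tab}*",
        re.split(r"^X-Matrix +", header, maxsplit=1)[1],
    )
    return {k.lower(): v for k, v in (p.split("=", maxsplit=1) for p in params)}

header = (
    'X-Matrix origin="origin.example.com" , '
    'key="ed25519:1",\tsig="fakesignature", destination="dest.example.com"'
)
print(parse_x_matrix_params(header))
```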
View File

@@ -42,6 +42,7 @@ class AdminHandler:
self._device_handler = hs.get_device_handler()
self._storage_controllers = hs.get_storage_controllers()
self._state_storage_controller = self._storage_controllers.state
self._hs_config = hs.config
self._msc3866_enabled = hs.config.experimental.msc3866.enabled
async def get_whois(self, user: UserID) -> JsonMapping:
@@ -217,7 +218,10 @@ class AdminHandler:
)
events = await filter_events_for_client(
self._storage_controllers, user_id, events
self._storage_controllers,
user_id,
events,
msc4115_membership_on_events=self._hs_config.experimental.msc4115_membership_on_events,
)
writer.write_events(room_id, events)

View File

@@ -261,11 +261,22 @@ class DeactivateAccountHandler:
user = UserID.from_string(user_id)
rooms_for_user = await self.store.get_rooms_for_user(user_id)
requester = create_requester(user, authenticated_entity=self._server_name)
should_erase = await self.store.is_user_erased(user_id)
for room_id in rooms_for_user:
logger.info("User parter parting %r from %r", user_id, room_id)
try:
# Before parting the user, redact all membership events if requested
if should_erase:
event_ids = await self.store.get_membership_event_ids_for_user(
user_id, room_id
)
for event_id in event_ids:
await self.store.expire_event(event_id)
await self._room_member_handler.update_membership(
create_requester(user, authenticated_entity=self._server_name),
requester,
user,
room_id,
"leave",

View File

@@ -104,6 +104,9 @@ class DeviceMessageHandler:
"""
Handle receiving to-device messages from remote homeservers.
Note that any errors thrown from this method will cause the federation /send
request to receive an error response.
Args:
origin: The remote homeserver.
content: The JSON dictionary containing the to-device messages.

View File

@@ -148,6 +148,7 @@ class EventHandler:
def __init__(self, hs: "HomeServer"):
self.store = hs.get_datastores().main
self._storage_controllers = hs.get_storage_controllers()
self._config = hs.config
async def get_event(
self,
@@ -189,7 +190,11 @@ class EventHandler:
is_peeking = not is_user_in_room
filtered = await filter_events_for_client(
self._storage_controllers, user.to_string(), [event], is_peeking=is_peeking
self._storage_controllers,
user.to_string(),
[event],
is_peeking=is_peeking,
msc4115_membership_on_events=self._config.experimental.msc4115_membership_on_events,
)
if not filtered:

View File

@@ -221,7 +221,10 @@ class InitialSyncHandler:
).addErrback(unwrapFirstError)
messages = await filter_events_for_client(
self._storage_controllers, user_id, messages
self._storage_controllers,
user_id,
messages,
msc4115_membership_on_events=self.hs.config.experimental.msc4115_membership_on_events,
)
start_token = now_token.copy_and_replace(StreamKeyType.ROOM, token)
@@ -380,6 +383,7 @@ class InitialSyncHandler:
requester.user.to_string(),
messages,
is_peeking=is_peeking,
msc4115_membership_on_events=self.hs.config.experimental.msc4115_membership_on_events,
)
start_token = StreamToken.START.copy_and_replace(StreamKeyType.ROOM, token)
@@ -494,6 +498,7 @@ class InitialSyncHandler:
requester.user.to_string(),
messages,
is_peeking=is_peeking,
msc4115_membership_on_events=self.hs.config.experimental.msc4115_membership_on_events,
)
start_token = now_token.copy_and_replace(StreamKeyType.ROOM, token)

View File

@@ -623,6 +623,7 @@ class PaginationHandler:
user_id,
events,
is_peeking=(member_event_id is None),
msc4115_membership_on_events=self.hs.config.experimental.msc4115_membership_on_events,
)
# if after the filter applied there are no more events

View File

@@ -95,6 +95,7 @@ class RelationsHandler:
self._event_handler = hs.get_event_handler()
self._event_serializer = hs.get_event_client_serializer()
self._event_creation_handler = hs.get_event_creation_handler()
self._config = hs.config
async def get_relations(
self,
@@ -163,6 +164,7 @@ class RelationsHandler:
user_id,
events,
is_peeking=(member_event_id is None),
msc4115_membership_on_events=self._config.experimental.msc4115_membership_on_events,
)
# The relations returned for the requested event do include their
@@ -608,6 +610,7 @@ class RelationsHandler:
user_id,
events,
is_peeking=(member_event_id is None),
msc4115_membership_on_events=self._config.experimental.msc4115_membership_on_events,
)
aggregations = await self.get_bundled_aggregations(

View File

@@ -1476,6 +1476,7 @@ class RoomContextHandler:
user.to_string(),
events,
is_peeking=is_peeking,
msc4115_membership_on_events=self.hs.config.experimental.msc4115_membership_on_events,
)
event = await self.store.get_event(

View File

@@ -752,6 +752,36 @@ class RoomMemberHandler(metaclass=abc.ABCMeta):
and requester.user.to_string() == self._server_notices_mxid
)
requester_suspended = await self.store.get_user_suspended_status(
requester.user.to_string()
)
if action == Membership.INVITE and requester_suspended:
raise SynapseError(
403,
"Sending invites while account is suspended is not allowed.",
Codes.USER_ACCOUNT_SUSPENDED,
)
if target.to_string() != requester.user.to_string():
target_suspended = await self.store.get_user_suspended_status(
target.to_string()
)
else:
target_suspended = requester_suspended
if action == Membership.JOIN and target_suspended:
raise SynapseError(
403,
"Joining rooms while account is suspended is not allowed.",
Codes.USER_ACCOUNT_SUSPENDED,
)
if action == Membership.KNOCK and target_suspended:
raise SynapseError(
403,
"Knocking on rooms while account is suspended is not allowed.",
Codes.USER_ACCOUNT_SUSPENDED,
)
if (
not self.allow_per_room_profiles and not is_requester_server_notices_user
) or requester.shadow_banned:

View File

@@ -480,7 +480,10 @@ class SearchHandler:
filtered_events = await search_filter.filter([r["event"] for r in results])
events = await filter_events_for_client(
self._storage_controllers, user.to_string(), filtered_events
self._storage_controllers,
user.to_string(),
filtered_events,
msc4115_membership_on_events=self.hs.config.experimental.msc4115_membership_on_events,
)
events.sort(key=lambda e: -rank_map[e.event_id])
@@ -579,7 +582,10 @@ class SearchHandler:
filtered_events = await search_filter.filter([r["event"] for r in results])
events = await filter_events_for_client(
self._storage_controllers, user.to_string(), filtered_events
self._storage_controllers,
user.to_string(),
filtered_events,
msc4115_membership_on_events=self.hs.config.experimental.msc4115_membership_on_events,
)
room_events.extend(events)
@@ -664,11 +670,17 @@ class SearchHandler:
)
events_before = await filter_events_for_client(
self._storage_controllers, user.to_string(), res.events_before
self._storage_controllers,
user.to_string(),
res.events_before,
msc4115_membership_on_events=self.hs.config.experimental.msc4115_membership_on_events,
)
events_after = await filter_events_for_client(
self._storage_controllers, user.to_string(), res.events_after
self._storage_controllers,
user.to_string(),
res.events_after,
msc4115_membership_on_events=self.hs.config.experimental.msc4115_membership_on_events,
)
context: JsonDict = {

View File

@@ -169,6 +169,7 @@ class UsernameMappingSession:
# attributes returned by the ID mapper
display_name: Optional[str]
emails: StrCollection
avatar_url: Optional[str]
# An optional dictionary of extra attributes to be provided to the client in the
# login response.
@@ -183,6 +184,7 @@ class UsernameMappingSession:
# choices made by the user
chosen_localpart: Optional[str] = None
use_display_name: bool = True
use_avatar: bool = True
emails_to_use: StrCollection = ()
terms_accepted_version: Optional[str] = None
@@ -660,6 +662,9 @@ class SsoHandler:
remote_user_id=remote_user_id,
display_name=attributes.display_name,
emails=attributes.emails,
avatar_url=attributes.picture,
# Default to using all mapped emails. Will be overwritten in handle_submit_username_request.
emails_to_use=attributes.emails,
client_redirect_url=client_redirect_url,
expiry_time_ms=now + self._MAPPING_SESSION_VALIDITY_PERIOD_MS,
extra_login_attributes=extra_login_attributes,
@@ -966,6 +971,7 @@ class SsoHandler:
session_id: str,
localpart: str,
use_display_name: bool,
use_avatar: bool,
emails_to_use: Iterable[str],
) -> None:
"""Handle a request to the username-picker 'submit' endpoint
@@ -988,6 +994,7 @@ class SsoHandler:
# update the session with the user's choices
session.chosen_localpart = localpart
session.use_display_name = use_display_name
session.use_avatar = use_avatar
emails_from_idp = set(session.emails)
filtered_emails: Set[str] = set()
@@ -1068,6 +1075,9 @@ class SsoHandler:
if session.use_display_name:
attributes.display_name = session.display_name
if session.use_avatar:
attributes.picture = session.avatar_url
# the following will raise a 400 error if the username has been taken in the
# meantime.
user_id = await self._register_mapped_user(

View File

@@ -596,6 +596,7 @@ class SyncHandler:
sync_config.user.to_string(),
recents,
always_include_ids=current_state_ids,
msc4115_membership_on_events=self.hs_config.experimental.msc4115_membership_on_events,
)
log_kv({"recents_after_visibility_filtering": len(recents)})
else:
@@ -681,6 +682,7 @@ class SyncHandler:
sync_config.user.to_string(),
loaded_recents,
always_include_ids=current_state_ids,
msc4115_membership_on_events=self.hs_config.experimental.msc4115_membership_on_events,
)
loaded_recents = []

View File

@@ -909,8 +909,9 @@ def set_cors_headers(request: "SynapseRequest") -> None:
request.setHeader(
b"Access-Control-Allow-Methods", b"GET, HEAD, POST, PUT, DELETE, OPTIONS"
)
if request.path is not None and request.path.startswith(
b"/_matrix/client/unstable/org.matrix.msc4108/rendezvous"
if request.path is not None and (
request.path == b"/_matrix/client/unstable/org.matrix.msc4108/rendezvous"
or request.path.startswith(b"/_synapse/client/rendezvous")
):
request.setHeader(
b"Access-Control-Allow-Headers",

View File

@@ -721,6 +721,7 @@ class Notifier:
user.to_string(),
new_events,
is_peeking=is_peeking,
msc4115_membership_on_events=self.hs.config.experimental.msc4115_membership_on_events,
)
elif keyname == StreamKeyType.PRESENCE:
now = self.clock.time_msec()

View File

@@ -205,6 +205,22 @@ class Mailer:
template_vars,
)
emails_sent_counter.labels("already_in_use")
async def send_already_in_use_mail(self, email_address: str) -> None:
"""Send an email if the address is already bound to an user account
Args:
email_address: Email address we're sending to the "already in use" mail
"""
await self.send_email(
email_address,
self.email_subjects.email_already_in_use
% {"server_name": self.hs.config.server.server_name, "app": self.app_name},
{},
)
emails_sent_counter.labels("add_threepid")
async def send_add_threepid_mail(
@@ -513,7 +529,10 @@ class Mailer:
}
the_events = await filter_events_for_client(
self._storage_controllers, user_id, results.events_before
self._storage_controllers,
user_id,
results.events_before,
msc4115_membership_on_events=self.hs.config.experimental.msc4115_membership_on_events,
)
the_events.append(notif_event)

View File

@@ -0,0 +1,12 @@
{% extends "_base.html" %}
{% block title %}Email already in use{% endblock %}
{% block body %}
<p>You have asked us to register this email with a new Matrix account, but this email is already registered with an existing account.</p>
<p>Please reset your password if needed.</p>
<p>If this was not you, you can safely disregard this email.</p>
<p>Thank you.</p>
{% endblock %}

View File

@@ -0,0 +1,10 @@
Hello there,
You have asked us to register this email with a new Matrix account,
but this email is already registered with an existing account.
Please reset your password if needed.
If this was not you, you can safely disregard this email.
Thank you.

View File

@@ -393,17 +393,20 @@ class SigningKeyUploadServlet(RestServlet):
# time. Because there is no UIA in MSC3861, for now we throw an error if the
# user tries to reset the device signing key when MSC3861 is enabled, but allow
# first-time setup.
#
# XXX: We now have a get-out clause by which MAS can temporarily mark the master
# key as replaceable. It should do its own equivalent of user interactive auth
# before doing so.
if self.hs.config.experimental.msc3861.enabled:
# The auth service has to explicitly mark the master key as replaceable
# without UIA to reset the device signing key with MSC3861.
if is_cross_signing_setup and not master_key_updatable_without_uia:
config = self.hs.config.experimental.msc3861
if config.account_management_url is not None:
url = f"{config.account_management_url}?action=org.matrix.cross_signing_reset"
else:
url = config.issuer
raise SynapseError(
HTTPStatus.NOT_IMPLEMENTED,
"Resetting cross signing keys is not yet supported with MSC3861",
"To reset your end-to-end encryption cross-signing identity, "
f"you first need to approve it at {url} and then try again.",
Codes.UNRECOGNIZED,
)
# But first-time setup is fine

View File

@@ -86,12 +86,18 @@ class EmailRegisterRequestTokenRestServlet(RestServlet):
self.config = hs.config
if self.hs.config.email.can_verify_email:
self.mailer = Mailer(
self.registration_mailer = Mailer(
hs=self.hs,
app_name=self.config.email.email_app_name,
template_html=self.config.email.email_registration_template_html,
template_text=self.config.email.email_registration_template_text,
)
self.already_in_use_mailer = Mailer(
hs=self.hs,
app_name=self.config.email.email_app_name,
template_html=self.config.email.email_already_in_use_template_html,
template_text=self.config.email.email_already_in_use_template_text,
)
async def on_POST(self, request: SynapseRequest) -> Tuple[int, JsonDict]:
if not self.hs.config.email.can_verify_email:
@@ -139,8 +145,10 @@ class EmailRegisterRequestTokenRestServlet(RestServlet):
if self.hs.config.server.request_token_inhibit_3pid_errors:
# Make the client think the operation succeeded. See the rationale in the
# comments for request_token_inhibit_3pid_errors.
# Still send an email to warn the user that an account already exists.
# Also wait for some random amount of time between 100ms and 1s to make it
# look like we did something.
await self.already_in_use_mailer.send_already_in_use_mail(email)
await self.hs.get_clock().sleep(random.randint(1, 10) / 10)
return 200, {"sid": random_string(16)}
@@ -151,7 +159,7 @@ class EmailRegisterRequestTokenRestServlet(RestServlet):
email,
client_secret,
send_attempt,
self.mailer.send_registration_mail,
self.registration_mailer.send_registration_mail,
next_link,
)

View File
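The behaviour above (pretend success, warn the existing account owner by email, and add 0.1-1.0s of jitter) is a standard anti-enumeration pattern. A generic, self-contained sketch of the idea with hypothetical names, not the Synapse servlet:

import asyncio
import random
import string


def random_sid(length: int = 16) -> str:
    return "".join(random.choices(string.ascii_letters + string.digits, k=length))


async def respond_without_leaking(email_in_use: bool) -> dict:
    # Same response shape whether or not the address is registered, plus a
    # random 100ms-1s delay so response timing doesn't reveal the answer.
    if email_in_use:
        pass  # the real flow sends an "already in use" warning email here
    await asyncio.sleep(random.randint(1, 10) / 10)
    return {"sid": random_sid()}


print(asyncio.run(respond_without_leaking(True)))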

@@ -34,6 +34,9 @@ if TYPE_CHECKING:
logger = logging.getLogger(__name__)
# n.b [MSC3886](https://github.com/matrix-org/matrix-spec-proposals/pull/3886) has now been closed.
# However, we want to keep this implementation around for some time.
# TODO: define an end-of-life date for this implementation.
class MSC3886RendezvousServlet(RestServlet):
"""
This is a placeholder implementation of [MSC3886](https://github.com/matrix-org/matrix-spec-proposals/pull/3886)
@@ -97,9 +100,25 @@ class MSC4108DelegationRendezvousServlet(RestServlet):
)
class MSC4108RendezvousServlet(RestServlet):
PATTERNS = client_patterns(
"/org.matrix.msc4108/rendezvous$", releases=[], v1=False, unstable=True
)
def __init__(self, hs: "HomeServer") -> None:
super().__init__()
self._handler = hs.get_rendezvous_handler()
def on_POST(self, request: SynapseRequest) -> None:
self._handler.handle_post(request)
def register_servlets(hs: "HomeServer", http_server: HttpServer) -> None:
if hs.config.experimental.msc3886_endpoint is not None:
MSC3886RendezvousServlet(hs).register(http_server)
if hs.config.experimental.msc4108_enabled:
MSC4108RendezvousServlet(hs).register(http_server)
if hs.config.experimental.msc4108_delegation_endpoint is not None:
MSC4108DelegationRendezvousServlet(hs).register(http_server)

View File

@@ -1442,10 +1442,16 @@ class RoomHierarchyRestServlet(RestServlet):
class RoomSummaryRestServlet(ResolveRoomIdMixin, RestServlet):
PATTERNS = (
# deprecated endpoint, to be removed
re.compile(
"^/_matrix/client/unstable/im.nheko.summary"
"/rooms/(?P<room_identifier>[^/]*)/summary$"
),
# recommended endpoint
re.compile(
"^/_matrix/client/unstable/im.nheko.summary"
"/summary/(?P<room_identifier>[^/]*)$"
),
)
CATEGORY = "Client API requests"

View File
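For reference, the two patterns above accept the following URL shapes; a quick check, assuming a URL-encoded room alias as the identifier:

import re

deprecated = re.compile(
    "^/_matrix/client/unstable/im.nheko.summary"
    "/rooms/(?P<room_identifier>[^/]*)/summary$"
)
recommended = re.compile(
    "^/_matrix/client/unstable/im.nheko.summary"
    "/summary/(?P<room_identifier>[^/]*)$"
)

room = "%23matrix%3Amatrix.org"  # "#matrix:matrix.org", URL-encoded
assert deprecated.match(f"/_matrix/client/unstable/im.nheko.summary/rooms/{room}/summary")
assert recommended.match(f"/_matrix/client/unstable/im.nheko.summary/summary/{room}")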

@@ -89,6 +89,7 @@ class VersionsRestServlet(RestServlet):
"v1.7",
"v1.8",
"v1.9",
"v1.10",
],
# as per MSC1497:
"unstable_features": {
@@ -141,8 +142,13 @@ class VersionsRestServlet(RestServlet):
# Allows clients to handle push for encrypted events.
"org.matrix.msc4028": self.config.experimental.msc4028_push_encrypted_events,
# MSC4108: Mechanism to allow OIDC sign in and E2EE set up via QR code
"org.matrix.msc4108": self.config.experimental.msc4108_delegation_endpoint
is not None,
"org.matrix.msc4108": (
self.config.experimental.msc4108_enabled
or (
self.config.experimental.msc4108_delegation_endpoint
is not None
)
),
},
},
)

View File

@@ -26,6 +26,7 @@ from twisted.web.resource import Resource
from synapse.rest.synapse.client.new_user_consent import NewUserConsentResource
from synapse.rest.synapse.client.pick_idp import PickIdpResource
from synapse.rest.synapse.client.pick_username import pick_username_resource
from synapse.rest.synapse.client.rendezvous import MSC4108RendezvousSessionResource
from synapse.rest.synapse.client.sso_register import SsoRegisterResource
from synapse.rest.synapse.client.unsubscribe import UnsubscribeResource
@@ -76,6 +77,9 @@ def build_synapse_client_resource_tree(hs: "HomeServer") -> Mapping[str, Resourc
# To be removed in Synapse v1.32.0.
resources["/_matrix/saml2"] = res
if hs.config.experimental.msc4108_enabled:
resources["/_synapse/client/rendezvous"] = MSC4108RendezvousSessionResource(hs)
return resources

View File

@@ -113,6 +113,7 @@ class AccountDetailsResource(DirectServeHtmlResource):
"display_name": session.display_name,
"emails": session.emails,
"localpart": localpart,
"avatar_url": session.avatar_url,
},
}
@@ -134,6 +135,7 @@ class AccountDetailsResource(DirectServeHtmlResource):
try:
localpart = parse_string(request, "username", required=True)
use_display_name = parse_boolean(request, "use_display_name", default=False)
use_avatar = parse_boolean(request, "use_avatar", default=False)
try:
emails_to_use: List[str] = [
@@ -147,5 +149,5 @@ class AccountDetailsResource(DirectServeHtmlResource):
return
await self._sso_handler.handle_submit_username_request(
request, session_id, localpart, use_display_name, emails_to_use
request, session_id, localpart, use_display_name, use_avatar, emails_to_use
)

View File

@@ -0,0 +1,58 @@
#
# This file is licensed under the Affero General Public License (AGPL) version 3.
#
# Copyright (C) 2024 New Vector, Ltd
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# See the GNU Affero General Public License for more details:
# <https://www.gnu.org/licenses/agpl-3.0.html>.
#
#
import logging
from typing import TYPE_CHECKING, List
from synapse.api.errors import UnrecognizedRequestError
from synapse.http.server import DirectServeJsonResource
from synapse.http.site import SynapseRequest
if TYPE_CHECKING:
from synapse.server import HomeServer
logger = logging.getLogger(__name__)
class MSC4108RendezvousSessionResource(DirectServeJsonResource):
isLeaf = True
def __init__(self, hs: "HomeServer") -> None:
super().__init__()
self._handler = hs.get_rendezvous_handler()
async def _async_render_GET(self, request: SynapseRequest) -> None:
postpath: List[bytes] = request.postpath # type: ignore
if len(postpath) != 1:
raise UnrecognizedRequestError()
session_id = postpath[0].decode("ascii")
self._handler.handle_get(request, session_id)
async def _async_render_PUT(self, request: SynapseRequest) -> None:
postpath: List[bytes] = request.postpath # type: ignore
if len(postpath) != 1:
raise UnrecognizedRequestError()
session_id = postpath[0].decode("ascii")
self._handler.handle_put(request, session_id)
async def _async_render_DELETE(self, request: SynapseRequest) -> None:
postpath: List[bytes] = request.postpath # type: ignore
if len(postpath) != 1:
raise UnrecognizedRequestError()
session_id = postpath[0].decode("ascii")
self._handler.handle_delete(request, session_id)

View File

@@ -143,6 +143,7 @@ from synapse.state import StateHandler, StateResolutionHandler
from synapse.storage import Databases
from synapse.storage.controllers import StorageControllers
from synapse.streams.events import EventSources
from synapse.synapse_rust.rendezvous import RendezvousHandler
from synapse.types import DomainSpecificString, ISynapseReactor
from synapse.util import Clock
from synapse.util.distributor import Distributor
@@ -859,6 +860,10 @@ class HomeServer(metaclass=abc.ABCMeta):
def get_room_forgetter_handler(self) -> RoomForgetterHandler:
return RoomForgetterHandler(self)
@cache_in_self
def get_rendezvous_handler(self) -> RendezvousHandler:
return RendezvousHandler(self)
@cache_in_self
def get_outbound_redis_connection(self) -> "ConnectionHandler":
"""

View File

@@ -21,6 +21,7 @@
import datetime
import itertools
import logging
import time
from queue import Empty, PriorityQueue
from typing import (
TYPE_CHECKING,
@@ -39,6 +40,7 @@ from typing import (
import attr
from prometheus_client import Counter, Gauge
from sortedcontainers import SortedSet
from synapse.api.constants import MAX_DEPTH
from synapse.api.errors import StoreError
@@ -118,6 +120,11 @@ class BackfillQueueNavigationItem:
type: str
@attr.s(frozen=True, slots=True, auto_attribs=True)
class _ChainLinksCacheEntry:
links: List[Tuple[int, int, int, "_ChainLinksCacheEntry"]] = attr.Factory(list)
class _NoChainCoverIndex(Exception):
def __init__(self, room_id: str):
super().__init__("Unexpectedly no chain cover for events in %s" % (room_id,))
@@ -138,6 +145,10 @@ class EventFederationWorkerStore(SignatureWorkerStore, EventsWorkerStore, SQLBas
self.hs = hs
self._chain_links_cache: LruCache[int, _ChainLinksCacheEntry] = LruCache(
max_size=10000, cache_name="chain_links_cache"
)
if hs.config.worker.run_background_tasks:
hs.get_clock().looping_call(
self._delete_old_forward_extrem_cache, 60 * 60 * 1000
@@ -283,7 +294,9 @@ class EventFederationWorkerStore(SignatureWorkerStore, EventsWorkerStore, SQLBas
# A map from chain ID to max sequence number *reachable* from any event ID.
chains: Dict[int, int] = {}
for links in self._get_chain_links(txn, set(event_chains.keys())):
for links in self._get_chain_links(
txn, event_chains.keys(), self._chain_links_cache
):
for chain_id in links:
if chain_id not in event_chains:
continue
@@ -335,7 +348,10 @@ class EventFederationWorkerStore(SignatureWorkerStore, EventsWorkerStore, SQLBas
@classmethod
def _get_chain_links(
cls, txn: LoggingTransaction, chains_to_fetch: Set[int]
cls,
txn: LoggingTransaction,
chains_to_fetch: Collection[int],
cache: Optional[LruCache[int, _ChainLinksCacheEntry]] = None,
) -> Generator[Dict[int, List[Tuple[int, int, int]]], None, None]:
"""Fetch all auth chain links from the given set of chains, and all
links from those chains, recursively.
@@ -347,12 +363,55 @@ class EventFederationWorkerStore(SignatureWorkerStore, EventsWorkerStore, SQLBas
of origin sequence number, target chain ID and target sequence number.
"""
found_cached_chains = set()
if cache:
entries: Dict[int, _ChainLinksCacheEntry] = {}
for chain_id in chains_to_fetch:
entry = cache.get(chain_id)
if entry:
entries[chain_id] = entry
cached_links: Dict[int, List[Tuple[int, int, int]]] = {}
while entries:
origin_chain_id, entry = entries.popitem()
for (
origin_sequence_number,
target_chain_id,
target_sequence_number,
target_entry,
) in entry.links:
if target_chain_id in found_cached_chains:
continue
found_cached_chains.add(target_chain_id)
# Bump the chain we just reached in the LRU and queue its cached entry
# so that its own links get walked too.
cache.get(target_chain_id)
entries[target_chain_id] = target_entry
cached_links.setdefault(origin_chain_id, []).append(
(
origin_sequence_number,
target_chain_id,
target_sequence_number,
)
)
yield cached_links
# This query is structured to first get all chain IDs reachable, and
# then pull out all links from those chains. This does pull out more
# rows than is strictly necessary, however there isn't a way of
# structuring the recursive part of the query to pull out the links without
# also returning large quantities of redundant data (which can make it a
# lot slower).
if isinstance(txn.database_engine, PostgresEngine):
# JIT and sequential scans sometimes get hit on this code path, which
# can make the queries much more expensive
txn.execute("SET LOCAL jit = off")
txn.execute("SET LOCAL enable_seqscan = off")
sql = """
WITH RECURSIVE links(chain_id) AS (
SELECT
@@ -371,9 +430,22 @@ class EventFederationWorkerStore(SignatureWorkerStore, EventsWorkerStore, SQLBas
INNER JOIN event_auth_chain_links ON (chain_id = origin_chain_id)
"""
while chains_to_fetch:
batch2 = tuple(itertools.islice(chains_to_fetch, 1000))
chains_to_fetch.difference_update(batch2)
# We fetch the links in batches. Separate batches will likely fetch the
# same set of links (e.g. they'll always pull in the links to the create
# event). To try and minimize the number of redundant links, we query
# the chain IDs in reverse order, as there will be a correlation between
# the order of chain IDs and links (i.e., higher chain IDs are more
# likely to depend on lower chain IDs than vice versa).
BATCH_SIZE = 5000
chains_to_fetch_sorted = SortedSet(chains_to_fetch)
chains_to_fetch_sorted.difference_update(found_cached_chains)
start_block = time.monotonic()
while chains_to_fetch_sorted:
batch2 = list(chains_to_fetch_sorted.islice(-BATCH_SIZE))
chains_to_fetch_sorted.difference_update(batch2)
clause, args = make_in_list_sql_clause(
txn.database_engine, "origin_chain_id", batch2
)
@@ -381,6 +453,8 @@ class EventFederationWorkerStore(SignatureWorkerStore, EventsWorkerStore, SQLBas
links: Dict[int, List[Tuple[int, int, int]]] = {}
cache_entries: Dict[int, _ChainLinksCacheEntry] = {}
for (
origin_chain_id,
origin_sequence_number,
@@ -391,10 +465,33 @@ class EventFederationWorkerStore(SignatureWorkerStore, EventsWorkerStore, SQLBas
(origin_sequence_number, target_chain_id, target_sequence_number)
)
chains_to_fetch.difference_update(links)
if cache:
origin_entry = cache_entries.setdefault(
origin_chain_id, _ChainLinksCacheEntry()
)
target_entry = cache_entries.setdefault(
target_chain_id, _ChainLinksCacheEntry()
)
origin_entry.links.append(
(
origin_sequence_number,
target_chain_id,
target_sequence_number,
target_entry,
)
)
if cache:
for chain_id, entry in cache_entries.items():
if chain_id not in cache:
cache[chain_id] = entry
chains_to_fetch_sorted.difference_update(links)
yield links
end_block = time.monotonic()
def _get_auth_chain_ids_txn(
self, txn: LoggingTransaction, event_ids: Collection[str], include_given: bool
) -> Set[str]:
@@ -581,7 +678,7 @@ class EventFederationWorkerStore(SignatureWorkerStore, EventsWorkerStore, SQLBas
# are reachable from any event.
# (We need to take a copy of `seen_chains` as the function mutates it)
for links in self._get_chain_links(txn, set(seen_chains)):
for links in self._get_chain_links(txn, seen_chains, self._chain_links_cache):
for chains in set_to_chain:
for chain_id in links:
if chain_id not in chains:

View File
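To make the cache above concrete: each `_ChainLinksCacheEntry` stores links whose targets point directly at other cache entries, so a warm cache can be walked entirely in memory instead of re-running the recursive SQL. A simplified, self-contained sketch of that walk (hypothetical names, not the Synapse store):

from typing import Dict, List, Tuple


class LinkEntry:
    """One cached chain: a list of (origin_seq, target_chain, target_seq, target_entry)."""

    def __init__(self) -> None:
        self.links: List[Tuple[int, int, int, "LinkEntry"]] = []


def walk_cached_links(
    cache: Dict[int, LinkEntry], chains: List[int]
) -> Dict[int, List[Tuple[int, int, int]]]:
    """Collect every link reachable from `chains` using only cached entries."""
    found: Dict[int, List[Tuple[int, int, int]]] = {}
    seen = set(chains)
    todo = [(c, cache[c]) for c in chains if c in cache]
    while todo:
        origin, entry = todo.pop()
        for origin_seq, target, target_seq, target_entry in entry.links:
            found.setdefault(origin, []).append((origin_seq, target, target_seq))
            if target not in seen:
                seen.add(target)
                todo.append((target, target_entry))
    return found


# Tiny usage example: chain 1 links to chain 2, which links to chain 3.
e1, e2, e3 = LinkEntry(), LinkEntry(), LinkEntry()
e2.links.append((5, 3, 2, e3))
e1.links.append((7, 2, 4, e2))
cache = {1: e1, 2: e2, 3: e3}
print(walk_cached_links(cache, [1]))  # {1: [(7, 2, 4)], 2: [(5, 3, 2)]}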

@@ -19,6 +19,7 @@
# [This file includes modifications made by New Vector Limited]
#
#
import collections
import itertools
import logging
from collections import OrderedDict
@@ -53,6 +54,7 @@ from synapse.storage.database import (
LoggingDatabaseConnection,
LoggingTransaction,
)
from synapse.storage.databases.main.event_federation import EventFederationStore
from synapse.storage.databases.main.events_worker import EventCacheEntry
from synapse.storage.databases.main.search import SearchEntry
from synapse.storage.engines import PostgresEngine
@@ -768,40 +770,26 @@ class PersistEventsStore:
# that have the same chain ID as the event.
# 2. For each retained auth event we:
# a. Add a link from the event's to the auth event's chain
# ID/sequence number; and
# b. Add a link from the event to every chain reachable by the
# auth event.
# ID/sequence number
# Step 1, fetch all existing links from all the chains we've seen
# referenced.
chain_links = _LinkMap()
auth_chain_rows = cast(
List[Tuple[int, int, int, int]],
db_pool.simple_select_many_txn(
txn,
table="event_auth_chain_links",
column="origin_chain_id",
iterable={chain_id for chain_id, _ in chain_map.values()},
keyvalues={},
retcols=(
"origin_chain_id",
"origin_sequence_number",
"target_chain_id",
"target_sequence_number",
),
),
)
for (
origin_chain_id,
origin_sequence_number,
target_chain_id,
target_sequence_number,
) in auth_chain_rows:
chain_links.add_link(
(origin_chain_id, origin_sequence_number),
(target_chain_id, target_sequence_number),
new=False,
)
for links in EventFederationStore._get_chain_links(
txn, {chain_id for chain_id, _ in chain_map.values()}
):
for origin_chain_id, inner_links in links.items():
for (
origin_sequence_number,
target_chain_id,
target_sequence_number,
) in inner_links:
chain_links.add_link(
(origin_chain_id, origin_sequence_number),
(target_chain_id, target_sequence_number),
new=False,
)
# We do this in topological order to avoid adding redundant links.
for event_id in sorted_topologically(
@@ -836,18 +824,6 @@ class PersistEventsStore:
(chain_id, sequence_number), (auth_chain_id, auth_sequence_number)
)
# Step 2b, add a link to chains reachable from the auth
# event.
for target_id, target_seq in chain_links.get_links_from(
(auth_chain_id, auth_sequence_number)
):
if target_id == chain_id:
continue
chain_links.add_link(
(chain_id, sequence_number), (target_id, target_seq)
)
db_pool.simple_insert_many_txn(
txn,
table="event_auth_chain_links",
@@ -2451,31 +2427,6 @@ class _LinkMap:
current_links[src_seq] = target_seq
return True
def get_links_from(
self, src_tuple: Tuple[int, int]
) -> Generator[Tuple[int, int], None, None]:
"""Gets the chains reachable from the given chain/sequence number.
Yields:
The chain ID and sequence number the link points to.
"""
src_chain, src_seq = src_tuple
for target_id, sequence_numbers in self.maps.get(src_chain, {}).items():
for link_src_seq, target_seq in sequence_numbers.items():
if link_src_seq <= src_seq:
yield target_id, target_seq
def get_links_between(
self, source_chain: int, target_chain: int
) -> Generator[Tuple[int, int], None, None]:
"""Gets the links between two chains.
Yields:
The source and target sequence numbers.
"""
yield from self.maps.get(source_chain, {}).get(target_chain, {}).items()
def get_additions(self) -> Generator[Tuple[int, int, int, int], None, None]:
"""Gets any newly added links.
@@ -2502,9 +2453,24 @@ class _LinkMap:
if src_chain == target_chain:
return target_seq <= src_seq
links = self.get_links_between(src_chain, target_chain)
for link_start_seq, link_end_seq in links:
if link_start_seq <= src_seq and target_seq <= link_end_seq:
return True
# We have to graph traverse the links to check for indirect paths.
visited_chains: Dict[int, int] = collections.Counter()
search = [(src_chain, src_seq)]
while search:
chain, seq = search.pop()
visited_chains[chain] = max(seq, visited_chains[chain])
for tc, links in self.maps.get(chain, {}).items():
for ss, ts in links.items():
# Don't revisit chains we've already seen, unless the target
# sequence number is higher than last time.
if ts <= visited_chains.get(tc, 0):
continue
if ss <= seq:
if tc == target_chain:
if target_seq <= ts:
return True
else:
search.append((tc, ts))
return False

View File
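Since `event_auth_chain_links` no longer stores transitive links, `exists_path_from` has to search the link graph rather than look up a single row. A minimal sketch of that traversal over a plain dict of direct links (illustrative only, not the `_LinkMap` class):

from collections import deque
from typing import Dict, List, Tuple

# links[origin_chain] -> list of (origin_seq, target_chain, target_seq)
Links = Dict[int, List[Tuple[int, int, int]]]


def path_exists(links: Links, src: Tuple[int, int], dst: Tuple[int, int]) -> bool:
    """Breadth-first search: can (dst_chain, dst_seq) be reached from (src_chain, src_seq)?

    A link (origin_seq, target_chain, target_seq) is usable from (chain, seq)
    when origin_seq <= seq, and it lands us at (target_chain, target_seq).
    """
    src_chain, src_seq = src
    dst_chain, dst_seq = dst
    if src_chain == dst_chain:
        return dst_seq <= src_seq

    best_seen: Dict[int, int] = {src_chain: src_seq}
    queue = deque([(src_chain, src_seq)])
    while queue:
        chain, seq = queue.popleft()
        for origin_seq, target_chain, target_seq in links.get(chain, []):
            if origin_seq > seq:
                continue  # link starts "above" where we are in this chain
            if target_chain == dst_chain and dst_seq <= target_seq:
                return True
            if target_seq > best_seen.get(target_chain, 0):
                best_seen[target_chain] = target_seq
                queue.append((target_chain, target_seq))
    return False


# Only direct links are stored: 1 -> 2 and 2 -> 3; a 1 -> 3 path is found by traversal.
links: Links = {1: [(5, 2, 7)], 2: [(4, 3, 9)]}
print(path_exists(links, (1, 6), (3, 2)))   # True: 1@6 -> 2@7 -> 3@9 covers 3@2
print(path_exists(links, (1, 4), (3, 2)))   # False: the 1 -> 2 link needs seq >= 5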

@@ -236,7 +236,8 @@ class RegistrationWorkerStore(CacheInvalidationWorkerStore):
consent_server_notice_sent, appservice_id, creation_ts, user_type,
deactivated, COALESCE(shadow_banned, FALSE) AS shadow_banned,
COALESCE(approved, TRUE) AS approved,
COALESCE(locked, FALSE) AS locked
COALESCE(locked, FALSE) AS locked,
suspended
FROM users
WHERE name = ?
""",
@@ -261,6 +262,7 @@ class RegistrationWorkerStore(CacheInvalidationWorkerStore):
shadow_banned,
approved,
locked,
suspended,
) = row
return UserInfo(
@@ -277,6 +279,7 @@ class RegistrationWorkerStore(CacheInvalidationWorkerStore):
user_type=user_type,
approved=bool(approved),
locked=bool(locked),
suspended=bool(suspended),
)
return await self.db_pool.runInteraction(
@@ -1180,6 +1183,27 @@ class RegistrationWorkerStore(CacheInvalidationWorkerStore):
# Convert the potential integer into a boolean.
return bool(res)
@cached()
async def get_user_suspended_status(self, user_id: str) -> bool:
"""
Determine whether the user's account is suspended.
Args:
user_id: The user ID of the user in question
Returns:
True if the user's account is suspended, false if it is not suspended or
if the user ID cannot be found.
"""
res = await self.db_pool.simple_select_one_onecol(
table="users",
keyvalues={"name": user_id},
retcol="suspended",
allow_none=True,
desc="get_user_suspended",
)
return bool(res)
async def get_threepid_validation_session(
self,
medium: Optional[str],
@@ -2213,6 +2237,35 @@ class RegistrationBackgroundUpdateStore(RegistrationWorkerStore):
self._invalidate_cache_and_stream(txn, self.get_user_by_id, (user_id,))
txn.call_after(self.is_guest.invalidate, (user_id,))
async def set_user_suspended_status(self, user_id: str, suspended: bool) -> None:
"""
Set whether the user's account is suspended in the `users` table.
Args:
user_id: The user ID of the user in question
suspended: True if the user is suspended, false if not
"""
await self.db_pool.runInteraction(
"set_user_suspended_status",
self.set_user_suspended_status_txn,
user_id,
suspended,
)
def set_user_suspended_status_txn(
self, txn: LoggingTransaction, user_id: str, suspended: bool
) -> None:
self.db_pool.simple_update_one_txn(
txn=txn,
table="users",
keyvalues={"name": user_id},
updatevalues={"suspended": suspended},
)
self._invalidate_cache_and_stream(
txn, self.get_user_suspended_status, (user_id,)
)
self._invalidate_cache_and_stream(txn, self.get_user_by_id, (user_id,))
async def set_user_locked_status(self, user_id: str, locked: bool) -> None:
"""Set the `locked` property for the provided user to the provided value.

View File

@@ -1234,6 +1234,28 @@ class RoomMemberWorkerStore(EventsWorkerStore, CacheInvalidationWorkerStore):
return set(room_ids)
async def get_membership_event_ids_for_user(
self, user_id: str, room_id: str
) -> Set[str]:
"""Get all event_ids for the given user and room.
Args:
user_id: The user ID to get the event IDs for.
room_id: The room ID to look up events for.
Returns:
Set of event IDs
"""
event_ids = await self.db_pool.simple_select_onecol(
table="room_memberships",
keyvalues={"user_id": user_id, "room_id": room_id},
retcol="event_id",
desc="get_membership_event_ids_for_user",
)
return set(event_ids)
@cached(max_entries=5000)
async def _get_membership_from_event_id(
self, member_event_id: str

View File

@@ -470,6 +470,8 @@ class SearchStore(SearchBackgroundUpdateStore):
count_args = args
count_clauses = clauses
sqlite_highlights: List[str] = []
if isinstance(self.database_engine, PostgresEngine):
search_query = search_term
sql = """
@@ -486,7 +488,7 @@ class SearchStore(SearchBackgroundUpdateStore):
"""
count_args = [search_query] + count_args
elif isinstance(self.database_engine, Sqlite3Engine):
search_query = _parse_query_for_sqlite(search_term)
search_query, sqlite_highlights = _parse_query_for_sqlite(search_term)
sql = """
SELECT rank(matchinfo(event_search)) as rank, room_id, event_id
@@ -531,9 +533,11 @@ class SearchStore(SearchBackgroundUpdateStore):
event_map = {ev.event_id: ev for ev in events}
highlights = None
highlights: Collection[str] = []
if isinstance(self.database_engine, PostgresEngine):
highlights = await self._find_highlights_in_postgres(search_query, events)
else:
highlights = sqlite_highlights
count_sql += " GROUP BY room_id"
@@ -597,6 +601,8 @@ class SearchStore(SearchBackgroundUpdateStore):
count_args = list(args)
count_clauses = list(clauses)
sqlite_highlights: List[str] = []
if pagination_token:
try:
origin_server_ts_str, stream_str = pagination_token.split(",")
@@ -647,7 +653,7 @@ class SearchStore(SearchBackgroundUpdateStore):
CROSS JOIN events USING (event_id)
WHERE
"""
search_query = _parse_query_for_sqlite(search_term)
search_query, sqlite_highlights = _parse_query_for_sqlite(search_term)
args = [search_query] + args
count_sql = """
@@ -694,9 +700,11 @@ class SearchStore(SearchBackgroundUpdateStore):
event_map = {ev.event_id: ev for ev in events}
highlights = None
highlights: Collection[str] = []
if isinstance(self.database_engine, PostgresEngine):
highlights = await self._find_highlights_in_postgres(search_query, events)
else:
highlights = sqlite_highlights
count_sql += " GROUP BY room_id"
@@ -892,19 +900,25 @@ def _tokenize_query(query: str) -> TokenList:
return tokens
def _tokens_to_sqlite_match_query(tokens: TokenList) -> str:
def _tokens_to_sqlite_match_query(tokens: TokenList) -> Tuple[str, List[str]]:
"""
Convert the list of tokens to a string suitable for passing to sqlite's MATCH.
Assume sqlite was compiled with enhanced query syntax.
Returns the sqlite-formatted query string and the tokenized search terms
that can be used as highlights.
Ref: https://www.sqlite.org/fts3.html#full_text_index_queries
"""
match_query = []
highlights = []
for token in tokens:
if isinstance(token, str):
match_query.append(token)
highlights.append(token)
elif isinstance(token, Phrase):
match_query.append('"' + " ".join(token.phrase) + '"')
highlights.append(" ".join(token.phrase))
elif token == SearchToken.Not:
# TODO: SQLite treats NOT as a *binary* operator. Hopefully a search
# term has already been added before this.
@@ -916,11 +930,14 @@ def _tokens_to_sqlite_match_query(tokens: TokenList) -> str:
else:
raise ValueError(f"unknown token {token}")
return "".join(match_query)
return "".join(match_query), highlights
def _parse_query_for_sqlite(search_term: str) -> str:
def _parse_query_for_sqlite(search_term: str) -> Tuple[str, List[str]]:
"""Takes a plain unicode string from the user and converts it into a form
that can be passed to sqlite's matchinfo().
Returns the converted query string and the tokenized search terms
that can be used as highlights.
"""
return _tokens_to_sqlite_match_query(_tokenize_query(search_term))

View File
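The change above makes the SQLite tokenizer return the search terms alongside the MATCH string so they can double as highlights. A simplified standalone sketch of that shape (implicit AND only; the real tokenizer also handles OR/NOT):

from typing import List, Tuple, Union

Token = Union[str, Tuple[str, ...]]  # a bare word, or a tuple of words forming a phrase


def to_match_query(tokens: List[Token]) -> Tuple[str, List[str]]:
    """Build an FTS MATCH string plus the terms to highlight."""
    parts: List[str] = []
    highlights: List[str] = []
    for token in tokens:
        if isinstance(token, str):
            parts.append(token)
            highlights.append(token)
        else:
            phrase = " ".join(token)
            parts.append(f'"{phrase}"')
            highlights.append(phrase)
    return " AND ".join(parts), highlights


print(to_match_query(["sandy", ("beach", "party")]))
# ('sandy AND "beach party"', ['sandy', 'beach party'])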

@@ -660,6 +660,7 @@ class TransactionWorkerStore(CacheInvalidationWorkerStore):
limit=limit,
retcols=("room_id", "stream_ordering"),
order_direction=order,
keyvalues={"destination": destination},
),
)
return rooms, count

View File

@@ -19,7 +19,7 @@
#
#
SCHEMA_VERSION = 84 # remember to update the list below when updating
SCHEMA_VERSION = 85 # remember to update the list below when updating
"""Represents the expectations made by the codebase about the database schema
This should be incremented whenever the codebase changes its requirements on the
@@ -132,12 +132,19 @@ Changes in SCHEMA_VERSION = 82
Changes in SCHEMA_VERSION = 83
- The event_txn_id is no longer used.
Changes in SCHEMA_VERSION = 84
- No longer assumes that `event_auth_chain_links` holds transitive links, and
so read operations must do graph traversal.
Changes in SCHEMA_VERSION = 85
- Add a column `suspended` to the `users` table
"""
SCHEMA_COMPAT_VERSION = (
# The event_txn_id table and tables from MSC2716 no longer exist.
83
# Transitive links are no longer written to `event_auth_chain_links`
84
)
"""Limit on how far the synapse codebase can be rolled back without breaking db compat

View File

@@ -0,0 +1,14 @@
--
-- This file is licensed under the Affero General Public License (AGPL) version 3.
--
-- Copyright (C) 2024 New Vector, Ltd
--
-- This program is free software: you can redistribute it and/or modify
-- it under the terms of the GNU Affero General Public License as
-- published by the Free Software Foundation, either version 3 of the
-- License, or (at your option) any later version.
--
-- See the GNU Affero General Public License for more details:
-- <https://www.gnu.org/licenses/agpl-3.0.html>.
ALTER TABLE users ADD COLUMN suspended BOOLEAN DEFAULT FALSE NOT NULL;

View File

@@ -0,0 +1,30 @@
# This file is licensed under the Affero General Public License (AGPL) version 3.
#
# Copyright (C) 2024 New Vector, Ltd
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# See the GNU Affero General Public License for more details:
# <https://www.gnu.org/licenses/agpl-3.0.html>.
from twisted.web.iweb import IRequest
from synapse.server import HomeServer
class RendezvousHandler:
def __init__(
self,
homeserver: HomeServer,
/,
capacity: int = 100,
max_content_length: int = 4 * 1024, # MSC4108 specifies 4KB
eviction_interval: int = 60 * 1000,
ttl: int = 60 * 1000,
) -> None: ...
def handle_post(self, request: IRequest) -> None: ...
def handle_get(self, request: IRequest, session_id: str) -> None: ...
def handle_put(self, request: IRequest, session_id: str) -> None: ...
def handle_delete(self, request: IRequest, session_id: str) -> None: ...

View File

@@ -1156,6 +1156,7 @@ class UserInfo:
user_type: User type (None for normal user, 'support' and 'bot' other options).
approved: If the user has been "approved" to register on the server.
locked: Whether the user's account has been locked
suspended: Whether the user's account is currently suspended
"""
user_id: UserID
@@ -1171,6 +1172,7 @@ class UserInfo:
is_shadow_banned: bool
approved: bool
locked: bool
suspended: bool
class UserProfile(TypedDict):

View File

@@ -115,7 +115,7 @@ class StreamChangeCache:
"""
new_size = math.floor(self._original_max_size * factor)
if new_size != self._max_size:
self.max_size = new_size
self._max_size = new_size
self._evict()
return True
return False
@@ -165,7 +165,7 @@ class StreamChangeCache:
return False
def get_entities_changed(
self, entities: Collection[EntityType], stream_pos: int
self, entities: Collection[EntityType], stream_pos: int, _perf_factor: int = 1
) -> Union[Set[EntityType], FrozenSet[EntityType]]:
"""
Returns the subset of the given entities that have had changes after the given position.
@@ -177,6 +177,8 @@ class StreamChangeCache:
Args:
entities: Entities to check for changes.
stream_pos: The stream position to check for changes after.
_perf_factor: Used by unit tests to choose when to use each
optimisation.
Return:
A subset of entities which have changed after the given stream position.
@@ -184,6 +186,22 @@ class StreamChangeCache:
This will be all entities if the given stream position is at or earlier
than the earliest known stream position.
"""
if not self._cache or stream_pos <= self._earliest_known_stream_pos:
self.metrics.inc_misses()
return set(entities)
# If there have been tonnes of changes compared with the number of
# entities, it is faster to check each entity's stream ordering
# one-by-one.
max_stream_pos, _ = self._cache.peekitem()
if max_stream_pos - stream_pos > _perf_factor * len(entities):
self.metrics.inc_hits()
return {
entity
for entity in entities
if self._entity_to_key.get(entity, -1) > stream_pos
}
cache_result = self.get_all_entities_changed(stream_pos)
if cache_result.hit:
# We now do an intersection, trying to do so in the most efficient

View File
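The new fast path above trades one O(changes) scan for O(entities) point lookups. A small sketch of the decision rule with two worked cases (illustrative numbers):

def should_check_per_entity(
    max_stream_pos: int, stream_pos: int, num_entities: int, perf_factor: int = 1
) -> bool:
    """Mirror of the heuristic above: if far more changes have happened since
    `stream_pos` than there are entities being asked about, looking up each
    entity's latest position individually beats walking the change list."""
    return max_stream_pos - stream_pos > perf_factor * num_entities


# 3 entities queried but 100,000 changes since their position:
print(should_check_per_entity(1_100_000, 1_000_000, 3))    # True -> per-entity lookups
# 500 entities queried and only 20 changes since then:
print(should_check_per_entity(1_000_020, 1_000_000, 500))  # False -> walk the changes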

@@ -36,10 +36,15 @@ from typing import (
import attr
from synapse.api.constants import EventTypes, HistoryVisibility, Membership
from synapse.api.constants import (
EventTypes,
EventUnsignedContentFields,
HistoryVisibility,
Membership,
)
from synapse.events import EventBase
from synapse.events.snapshot import EventContext
from synapse.events.utils import prune_event
from synapse.events.utils import clone_event, prune_event
from synapse.logging.opentracing import trace
from synapse.storage.controllers import StorageControllers
from synapse.storage.databases.main import DataStore
@@ -77,6 +82,7 @@ async def filter_events_for_client(
is_peeking: bool = False,
always_include_ids: FrozenSet[str] = frozenset(),
filter_send_to_client: bool = True,
msc4115_membership_on_events: bool = False,
) -> List[EventBase]:
"""
Check which events a user is allowed to see. If the user can see the event but its
@@ -95,9 +101,12 @@ async def filter_events_for_client(
filter_send_to_client: Whether we're checking an event that's going to be
sent to a client. This might not always be the case since this function can
also be called to check whether a user can see the state at a given point.
msc4115_membership_on_events: Whether to include the requesting user's
membership in the "unsigned" data, per MSC4115.
Returns:
The filtered events.
The filtered events. If `msc4115_membership_on_events` is true, the `unsigned`
data is annotated with the membership state of `user_id` at each event.
"""
# Filter out events that have been soft failed so that we don't relay them
# to clients.
@@ -134,7 +143,8 @@ async def filter_events_for_client(
)
def allowed(event: EventBase) -> Optional[EventBase]:
return _check_client_allowed_to_see_event(
state_after_event = event_id_to_state.get(event.event_id)
filtered = _check_client_allowed_to_see_event(
user_id=user_id,
event=event,
clock=storage.main.clock,
@@ -142,13 +152,45 @@ async def filter_events_for_client(
sender_ignored=event.sender in ignore_list,
always_include_ids=always_include_ids,
retention_policy=retention_policies[room_id],
state=event_id_to_state.get(event.event_id),
state=state_after_event,
is_peeking=is_peeking,
sender_erased=erased_senders.get(event.sender, False),
)
if filtered is None:
return None
# Check each event: gives an iterable of None or (a potentially modified)
# EventBase.
if not msc4115_membership_on_events:
return filtered
# Annotate the event with the user's membership after the event.
#
# Normally we just look in `state_after_event`, but if the event is an outlier
# we won't have such a state. The only outliers that are returned here are the
# user's own membership event, so we can just inspect that.
user_membership_event: Optional[EventBase]
if event.type == EventTypes.Member and event.state_key == user_id:
user_membership_event = event
elif state_after_event is not None:
user_membership_event = state_after_event.get((EventTypes.Member, user_id))
else:
# unreachable!
raise Exception("Missing state for event that is not user's own membership")
user_membership = (
user_membership_event.membership
if user_membership_event
else Membership.LEAVE
)
# Copy the event before updating the unsigned data: this shouldn't be persisted
# to the cache!
cloned = clone_event(filtered)
cloned.unsigned[EventUnsignedContentFields.MSC4115_MEMBERSHIP] = user_membership
return cloned
# Check each event: gives an iterable of None or (a modified) EventBase.
filtered_events = map(allowed, events)
# Turn it into a list and remove None entries before returning.
@@ -396,7 +438,13 @@ def _check_client_allowed_to_see_event(
@attr.s(frozen=True, slots=True, auto_attribs=True)
class _CheckMembershipReturn:
"Return value of _check_membership"
"""Return value of `_check_membership`.
Attributes:
allowed: Whether the user should be allowed to see the event.
joined: Whether the user was joined to the room at the event.
"""
allowed: bool
joined: bool
@@ -408,12 +456,7 @@ def _check_membership(
state: StateMap[EventBase],
is_peeking: bool,
) -> _CheckMembershipReturn:
"""Check whether the user can see the event due to their membership
Returns:
True if they can, False if they can't, plus the membership of the user
at the event.
"""
"""Check whether the user can see the event due to their membership"""
# If the event is the user's own membership event, use the 'most joined'
# membership
membership = None
@@ -435,7 +478,7 @@ def _check_membership(
if membership == "leave" and (
prev_membership == "join" or prev_membership == "invite"
):
return _CheckMembershipReturn(True, membership == Membership.JOIN)
return _CheckMembershipReturn(True, False)
new_priority = MEMBERSHIP_PRIORITY.index(membership)
old_priority = MEMBERSHIP_PRIORITY.index(prev_membership)

View File
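The clone-before-mutate step above matters because filtered events may come straight from the event cache; annotating `unsigned` in place would leak per-viewer data into shared state. A generic sketch of the pattern (the dict key below is illustrative; the real field name comes from `EventUnsignedContentFields.MSC4115_MEMBERSHIP`):

import copy
from typing import Any, Dict

event_cache: Dict[str, Dict[str, Any]] = {
    "$ev1": {"type": "m.room.message", "unsigned": {}},
}


def annotate_membership(event_id: str, membership: str) -> Dict[str, Any]:
    # Work on a per-request copy so the shared cached event is never mutated.
    cloned = copy.deepcopy(event_cache[event_id])
    cloned["unsigned"]["membership"] = membership  # illustrative key only
    return cloned


annotated = annotate_membership("$ev1", "join")
print(annotated["unsigned"])              # {'membership': 'join'}
print(event_cache["$ev1"]["unsigned"])    # {} -- the cache is untouched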

@@ -32,6 +32,7 @@ from synapse.events.utils import (
PowerLevelsContent,
SerializeEventConfig,
_split_field,
clone_event,
copy_and_fixup_power_levels_contents,
maybe_upsert_event_field,
prune_event,
@@ -611,6 +612,29 @@ class PruneEventTestCase(stdlib_unittest.TestCase):
)
class CloneEventTestCase(stdlib_unittest.TestCase):
def test_unsigned_is_copied(self) -> None:
original = make_event_from_dict(
{
"type": "A",
"event_id": "$test:domain",
"unsigned": {"a": 1, "b": 2},
},
RoomVersions.V1,
{"txn_id": "txn"},
)
original.internal_metadata.stream_ordering = 1234
self.assertEqual(original.internal_metadata.stream_ordering, 1234)
cloned = clone_event(original)
cloned.unsigned["b"] = 3
self.assertEqual(original.unsigned, {"a": 1, "b": 2})
self.assertEqual(cloned.unsigned, {"a": 1, "b": 3})
self.assertEqual(cloned.internal_metadata.stream_ordering, 1234)
self.assertEqual(cloned.internal_metadata.txn_id, "txn")
class SerializeEventTestCase(stdlib_unittest.TestCase):
def serialize(self, ev: EventBase, fields: Optional[List[str]]) -> JsonDict:
return serialize_event(

View File

@@ -67,6 +67,23 @@ class FederationServerTests(unittest.FederatingHomeserverTestCase):
self.assertEqual(HTTPStatus.BAD_REQUEST, channel.code, channel.result)
self.assertEqual(channel.json_body["errcode"], "M_NOT_JSON")
def test_failed_edu_causes_500(self) -> None:
"""If the EDU handler fails, /send should return a 500."""
async def failing_handler(_origin: str, _content: JsonDict) -> None:
raise Exception("bleh")
self.hs.get_federation_registry().register_edu_handler(
"FAIL_EDU_TYPE", failing_handler
)
channel = self.make_signed_federation_request(
"PUT",
"/_matrix/federation/v1/send/txn",
{"edus": [{"edu_type": "FAIL_EDU_TYPE", "content": {}}]},
)
self.assertEqual(500, channel.code, channel.result)
class ServerACLsTestCase(unittest.TestCase):
def test_blocked_server(self) -> None:

Some files were not shown because too many files have changed in this diff.