Compare commits

...

159 Commits

Author SHA1 Message Date
Richard van der Hoff
2f6cf12255 fixes 2024-04-23 12:05:37 +01:00
Richard van der Hoff
769e9b11cf complement.sh: run tests from all test packages
... so that we no longer have to remember to add each new test package to
the list by hand.
2024-04-23 11:42:20 +01:00
Neil Johnson
074ef4d75f Add an OSX prompt to manually configure icu4c. (#17069)
Documentation fix.
2024-04-19 17:10:44 +01:00
devonh
301c9771c4 Clarify what part of message retention is still experimental (#17099)
2024-04-19 15:26:28 +00:00
dependabot[bot]
800a5b6ef3 Bump types-pillow from 10.2.0.20240406 to 10.2.0.20240415 (#17090)
Bumps [types-pillow](https://github.com/python/typeshed) from
10.2.0.20240406 to 10.2.0.20240415.
Commits: see the full diff in the typeshed compare view
(https://github.com/python/typeshed/commits).


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-19 09:43:25 +01:00
dependabot[bot]
8c667759ad Bump peaceiris/actions-gh-pages from 3.9.3 to 4.0.0 (#17087)
Bumps
[peaceiris/actions-gh-pages](https://github.com/peaceiris/actions-gh-pages)
from 3.9.3 to 4.0.0.
Release notes (actions-github-pages v4.0.0): see CHANGELOG.md for details.

Changelog highlights:
- 4.0.0 (2024-04-08): build on node 20.11.1; bump the node16 runtime to
  node20 (#1067); downgrade engines.npm to 8.0.0; pin the CI node-version
  to 18 (#981); add a Release Strategy doc, fix the Nuxt github-pages link
  (#980) and remove braces in if conditions (#920).
- 3.9.3 (2023-03-30): fix a typo and bump the hugo version (#851); fix
  error handling (#841); update known_hosts (#871).
- 3.9.2 (2023-01-17): rename cicd; replace npm ci with install.

Commits: the 4.0.0 release, the node 20.11.1 build, the node16-to-node20
bump and several `@types/node` / ts-jest dependency updates; full list in
the v3.9.3...v4.0.0 compare view.
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-19 09:43:04 +01:00
dependabot[bot]
14e9ab19be Bump sigstore/cosign-installer from 3.4.0 to 3.5.0 (#17088)
Bumps
[sigstore/cosign-installer](https://github.com/sigstore/cosign-installer)
from 3.4.0 to 3.5.0.
Release notes (v3.5.0): bump actions/checkout from 4.1.1 to 4.1.2 (#157);
use go 1.22 (#160); bump the default cosign version to v2.2.4 in
preparation for the v3.5.0 release (#159). Commits: full diff in the
v3.4.0...v3.5.0 compare view.
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-19 09:42:35 +01:00
dependabot[bot]
20c8991a94 Bump peaceiris/actions-mdbook from 1.2.0 to 2.0.0 (#17089)
Bumps
[peaceiris/actions-mdbook](https://github.com/peaceiris/actions-mdbook)
from 1.2.0 to 2.0.0.
Release notes (actions-mdbook v2.0.0): see CHANGELOG.md for details.

Changelog highlights:
- 2.0.0 (2024-04-08): bump node to 20.12.1 (#504); move to the node20
  runtime (#500); revert a build change; a long series of CI dependency
  bumps (actions/checkout, actions/setup-node,
  actions/dependency-review-action, codecov/codecov-action,
  github/codeql-action, peaceiris/actions-mdbook and others).
- 1.2.0 (2022-10-23): add a postinstall step for husky install; convert
  issue templates to YAML issue forms; fix prettier (#397).

Commits: the 2.0.0 release, the node 20.12.1 build, the node20 runtime
bump and the CI updates above; full list in the v1.2.0...v2.0.0 compare
view.
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-19 09:42:21 +01:00
dependabot[bot]
dcae2b4ba4 Bump twine from 4.0.2 to 5.0.0 (#17091)
Bumps [twine](https://github.com/pypa/twine) from 4.0.2 to 5.0.0.
Changelog highlights (Twine 5.0.0, 2024-02-10): use `email.message`
instead of `cgi`, as `cgi` has been deprecated (#969); miscellaneous
housekeeping (#931, #991, #1028, #1040). Commits: release housekeeping,
codespell fixes (#1034) and API-token test work (#1040); full list in the
4.0.2...5.0.0 compare view.
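
For context on the `cgi` deprecation, a minimal sketch of the usual
stdlib replacement for `cgi.parse_header` (the header value below is
illustrative, not taken from the twine codebase):

```python
from email.message import Message

# Illustrative Content-Type value; not from twine itself.
raw = 'text/html; charset="utf-8"'

# The deprecated approach was cgi.parse_header(raw).
# email.message performs the same split into a type and its parameters:
msg = Message()
msg["content-type"] = raw
print(msg.get_content_type())    # -> text/html
print(msg.get_param("charset"))  # -> utf-8
```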
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-19 09:41:45 +01:00
dependabot[bot]
98f57ea3f2 Bump pygithub from 2.2.0 to 2.3.0 (#17092)
Bumps [pygithub](https://github.com/pygithub/pygithub) from 2.2.0 to
2.3.0.
Release highlights (v2.3.0, 2024-03-21):
- New features: OAuth support for GitHub Enterprise (#2780); creation of
  Dependabot organization and repository secrets (#2874).
- Improvements: create a release with optional `name` and `message` when
  `generate_release_notes` is true (#2868); add missing attributes to
  `WorkflowJob` (#2921); add `created` and `check_suite_id` filters for
  repository workflow runs (#2891); assert the requester argument type in
  Auth (#2912).
- Bug fixes: revert having allowed values for `add_to_collaborators`
  (#2905).
- Maintenance: docs, pre-commit and packaging fixes (#2923, #2614, #2903,
  #2900, #2894).

Commits: full list in the v2.2.0...v2.3.0 compare view.
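
A hedged sketch of the `generate_release_notes` improvement; the token,
repository and tag below are placeholders, not values from this
changelog:

```python
from github import Auth, Github

# Placeholder credentials and repository, for illustration only.
gh = Github(auth=Auth.Token("ghp_example_token"))
repo = gh.get_repo("octocat/Hello-World")

# With pygithub 2.3.0, name and message may be omitted when GitHub is
# asked to generate the release notes itself.
release = repo.create_git_release(
    tag="v1.2.3",
    generate_release_notes=True,
)
print(release.html_url)
```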
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-19 09:41:24 +01:00
dependabot[bot]
f5b6005559 Bump pyasn1-modules from 0.3.0 to 0.4.0 (#17093)
Bumps [pyasn1-modules](https://github.com/pyasn1/pyasn1-modules) from
0.3.0 to 0.4.0.
Release highlights (0.4.0, released 26-03-2024): a major release that
drops Python 2 support entirely; adds support for Python 3.11 and 3.12;
removes support for EOL Pythons 2.7, 3.6 and 3.7. Commits: full list in
the v0.3.0...v0.4.0 compare view.
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-19 09:41:15 +01:00
dependabot[bot]
47f3870894 Bump ruff from 0.3.5 to 0.3.7 (#17094)
Bumps [ruff](https://github.com/astral-sh/ruff) from 0.3.5 to 0.3.7.
Release highlights:
- 0.3.7: preview rules flake8-bugbear loop-iterator-mutation (B909,
  #9578) and pylint prefer-augmented-assignment (PLR6104, #9932); bug
  fixes for TOCTOU errors in cache initialization (#10884), recoding
  nan-comparison to W0177 (#10894) and reversed min/max logic in
  if-stmt-min-max (#10890).
- 0.3.6: preview rules bad-staticmethod-argument (PLW0211),
  if-stmt-min-max (PLR1730/PLR1731), the StrEnum upgrade UP042, the
  refurb rules FURB110, FURB166, FURB103 and FURB118, and sum/min/max
  added to the unnecessary comprehension check (C419); rule changes to
  D403, E731, PYI034, SLOT002, F821 and PGH004; CLI support for the
  FORCE_COLOR env var (#10839); configuration support for negated
  patterns in [extend-]per-file-ignores (#10852).
</ul>
<h3>Configuration</h3>
<ul>
<li>Support negated patterns in <code>[extend-]per-file-ignores</code>
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/10852">#10852</a>)</li>
</ul>
<h3>Bug fixes</h3>
<ul>
<li>[<code>flake8-import-conventions</code>] Accept non-aliased (but
correct) import in <code>unconventional-import-alias</code>
(<code>ICN001</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/10729">#10729</a>)</li>
<li>[<code>flake8-quotes</code>] Add semantic model flag when inside
f-string replacement field (<a
href="https://redirect.github.com/astral-sh/ruff/pull/10766">#10766</a>)</li>
<li>[<code>pep8-naming</code>] Recursively resolve
<code>TypeDicts</code> for N815 violations (<a
href="https://redirect.github.com/astral-sh/ruff/pull/10719">#10719</a>)</li>
<li>[<code>flake8-quotes</code>] Respect <code>Q00*</code> ignores in
<code>flake8-quotes</code> rules (<a
href="https://redirect.github.com/astral-sh/ruff/pull/10728">#10728</a>)</li>
<li>[<code>flake8-simplify</code>] Show negated condition in
<code>needless-bool</code> diagnostics (<code>SIM103</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/10854">#10854</a>)</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="2e37cf6b3b"><code>2e37cf6</code></a>
Bump version to v0.3.7 (<a
href="https://redirect.github.com/astral-sh/ruff/issues/10895">#10895</a>)</li>
<li><a
href="a9e4393008"><code>a9e4393</code></a>
[<code>pylint</code>] Implement rule to prefer augmented assignment
(<code>PLR6104</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/issues/9932">#9932</a>)</li>
<li><a
href="312f43475f"><code>312f434</code></a>
[<code>pylint</code>] Recode <code>nan-comparison</code> rule to
<code>W0177</code> (<a
href="https://redirect.github.com/astral-sh/ruff/issues/10894">#10894</a>)</li>
<li><a
href="563daa8a86"><code>563daa8</code></a>
Fix docs and add overlap test for negated per-file-ignores (<a
href="https://redirect.github.com/astral-sh/ruff/issues/10863">#10863</a>)</li>
<li><a
href="7ae15c6e0a"><code>7ae15c6</code></a>
Fix comment copy/paste typo in newtype_index (<a
href="https://redirect.github.com/astral-sh/ruff/issues/10892">#10892</a>)</li>
<li><a
href="03899dcba3"><code>03899dc</code></a>
[<code>flake8-bugbear</code>] Implement
<code>loop-iterator-mutation</code> (<code>B909</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/issues/9578">#9578</a>)</li>
<li><a
href="25f5a8b201"><code>25f5a8b</code></a>
Struct not tuple for compiled per-file ignores (<a
href="https://redirect.github.com/astral-sh/ruff/issues/10864">#10864</a>)</li>
<li><a
href="e7d1d43f39"><code>e7d1d43</code></a>
[<code>pylint</code>] Reverse min-max logic in
<code>if-stmt-min-max</code> (<a
href="https://redirect.github.com/astral-sh/ruff/issues/10890">#10890</a>)</li>
<li><a
href="9b9098c3dc"><code>9b9098c</code></a>
Downgrade ESLint to v8 (<a
href="https://redirect.github.com/astral-sh/ruff/issues/10888">#10888</a>)</li>
<li><a
href="0cc154c2a9"><code>0cc154c</code></a>
Avoid TOCTOU errors in cache initialization (<a
href="https://redirect.github.com/astral-sh/ruff/issues/10884">#10884</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/astral-sh/ruff/compare/v0.3.5...v0.3.7">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=ruff&package-manager=pip&previous-version=0.3.5&new-version=0.3.7)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-19 09:41:03 +01:00
dependabot[bot]
6d64f1b2b8 Bump anyhow from 1.0.81 to 1.0.82 (#17095)
Bumps [anyhow](https://github.com/dtolnay/anyhow) from 1.0.81 to 1.0.82.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/dtolnay/anyhow/releases">anyhow's
releases</a>.</em></p>
<blockquote>
<h2>1.0.82</h2>
<ul>
<li>Documentation improvements</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="074bdea1c7"><code>074bdea</code></a>
Release 1.0.82</li>
<li><a
href="47a4fbfa36"><code>47a4fbf</code></a>
Merge pull request <a
href="https://redirect.github.com/dtolnay/anyhow/issues/360">#360</a>
from dtolnay/docensure</li>
<li><a
href="c5af1db020"><code>c5af1db</code></a>
Make ensure's doc comment apply to the cfg(not(doc)) macro too</li>
<li><a
href="bebc7a2fe4"><code>bebc7a2</code></a>
Revert &quot;Temporarily disable miri on doctests&quot;</li>
<li><a
href="f2c4db9b47"><code>f2c4db9</code></a>
Update ui test suite to nightly-2024-03-31</li>
<li><a
href="028cbeedf5"><code>028cbee</code></a>
Explicitly install a Rust toolchain for cargo-outdated job</li>
<li><a
href="7a4cac5192"><code>7a4cac5</code></a>
Merge pull request <a
href="https://redirect.github.com/dtolnay/anyhow/issues/358">#358</a>
from dtolnay/workspacewrapper</li>
<li><a
href="939db012c2"><code>939db01</code></a>
Apply RUSTC_WORKSPACE_WRAPPER</li>
<li><a
href="9f84a37551"><code>9f84a37</code></a>
Temporarily disable miri on doctests</li>
<li><a
href="45e5a589e9"><code>45e5a58</code></a>
Ignore dead code lint in test</li>
<li>Additional commits viewable in <a
href="https://github.com/dtolnay/anyhow/compare/1.0.81...1.0.82">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=anyhow&package-manager=cargo&previous-version=1.0.81&new-version=1.0.82)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-19 09:40:55 +01:00
Gordan Trevis
1d47532310 Parse json validation (#16923)
Co-authored-by: Andrew Morgan <1342360+anoadragon453@users.noreply.github.com>
2024-04-18 13:57:38 +01:00
Quentin Gliech
09f0957b36 Helpers to transform Twisted requests to Rust http Requests/Responses (#17081)
This adds functions to transform a Twisted request to the
`http::Request`, and then to send back an `http::Response` through it.

It also imports the SynapseError exception so that we can throw that
from Rust code directly.

Example usage of this would be:

```rust
use crate::http::{http_request_from_twisted, http_response_to_twisted, HeaderMapPyExt};

fn handler(twisted_request: &PyAny) -> PyResult<()> {
    let request = http_request_from_twisted(twisted_request)?;

    let ua: headers::UserAgent = request.headers().typed_get_required()?;

    if whatever {
        return Err((crate::errors::SynapseError::new(
            StatusCode::UNAUTHORIZED,
            "Whatever".to_owned
            "M_UNAUTHORIZED",
            None,
            None,
        )));
    }

    let response = Response::new("hello".as_bytes());
    http_response_to_twisted(twisted_request, response)?;

    Ok(())
}
```
2024-04-18 12:20:30 +02:00
Erik Johnston
803f05f60c Fix remote receipts for events we don't have (#17096)
Introduced in #17032
2024-04-17 16:08:40 +01:00
Quentin Gliech
c8e0bed426 Support for MSC4108 via delegation (#17086)
This adds support for MSC4108 via delegation, similar to what has been done for MSC3886

---------

Co-authored-by: Hugh Nimmo-Smith <hughns@element.io>
2024-04-17 16:47:35 +02:00
Quentin Gliech
28f5ad07d3 Bump minimum required Rust version to 1.66.0 (#17079) 2024-04-17 15:44:40 +02:00
Gordan Trevis
f0d6f14047 Parse Integer negative value validation (#16920) 2024-04-16 19:12:36 +00:00
Olivier Wilkinson (reivilibre)
3a196b3227 Merge branch 'master' into develop 2024-04-16 17:36:21 +01:00
Olivier Wilkinson (reivilibre)
fbb2573525 1.105.0 2024-04-16 15:53:30 +01:00
Kegan Dougal
259442fa4c bugfix: make msc3967 idempotent (#16943)
MSC3967 was updated recently to make it more robust to network failures:

> there is an existing cross-signing master key and it exactly matches
the cross-signing master key provided in the request body. If there are
any additional keys provided in the request (self signing key, user
signing key) they MUST also match the existing keys stored on the
server. In other words, the request contains no new keys. If there are
new keys, UIA MUST be performed.


https://github.com/matrix-org/matrix-spec-proposals/blob/hughns/device-signing-upload-uia/proposals/3967-device-signing-upload-uia.md#proposal

This covers the case where the 200 OK is lost in transit so the client
retries the upload, only to then get UIA'd.

Complement tests: https://github.com/matrix-org/complement/pull/713 -
passing example
https://github.com/element-hq/synapse/actions/runs/7976948122/job/21778795094?pr=16943#step:7:8820
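
For illustration, a minimal sketch (with illustrative names, not the code in this PR) of the idempotency rule described above: the upload only skips UIA if every provided key exactly matches a key already stored on the server.

```python
def upload_requires_uia(existing_keys: dict, uploaded_keys: dict) -> bool:
    """Return True if user-interactive auth must be performed for this upload."""
    if "master_key" not in existing_keys:
        return True  # no existing cross-signing master key: always UIA
    for key_name, uploaded in uploaded_keys.items():
        if existing_keys.get(key_name) != uploaded:
            return True  # the request contains a new or changed key
    return False  # no new keys: a retried upload can succeed without UIA


# A retried upload of identical keys (e.g. after a lost 200 OK) skips UIA:
existing = {"master_key": {"keys": {"ed25519:abc": "base64+key"}}}
assert not upload_requires_uia(existing, existing)
```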

### Pull Request Checklist

<!-- Please read
https://element-hq.github.io/synapse/latest/development/contributing_guide.html
before submitting your pull request -->

* [x] Pull request is based on the develop branch
* [x] Pull request includes a [changelog
file](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#changelog).
The entry should:
- Be a short description of your change which makes sense to users.
"Fixed a bug that prevented receiving messages from other servers."
instead of "Moved X method from `EventStore` to `EventWorkerStore`.".
  - Use markdown where necessary, mostly for `code blocks`.
  - End with either a period (.) or an exclamation mark (!).
  - Start with a capital letter.
- Feel free to credit yourself, by adding a sentence "Contributed by
@github_username." or "Contributed by [Your Name]." to the end of the
entry.
* [x] [Code
style](https://element-hq.github.io/synapse/latest/code_style.html) is
correct
(run the
[linters](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#run-the-linters))

---------

Co-authored-by: reivilibre <oliverw@matrix.org>
2024-04-15 10:57:56 +00:00
Nick Mills-Barrett
fe4719a268 Use receipts event_stream_ordering instead of joins (#17032)
Resurrecting https://github.com/matrix-org/synapse/pull/13918.

This should reduce IOPs incurred by joining to the events table to
lookup stream ordering, which happens in many receipt handling code
paths. Like the previous PR I believe sufficient time has passed between
the original migration in DB schema 72 and now to merge this as-is. It's
highly unlikely that both the migration is still ongoing AND (active)
users still have any receipts prior to that date.

In the unlikely event there is a receipt without a populated
`event_stream_ordering` synapse will behave just as it does now when
receipts exist for events that don't (yet): for push action calculation
the receipts are just ignored.

I've removed the validation on event IDs as this is already covered
here:

59ceabcb97/synapse/handlers/receipts.py (L189-L192)
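
A rough sketch of the query-shape change this describes; the table and column names below follow the prose above but are assumptions, not the exact schema or code from this PR.

```python
# Before: join receipts to the events table just to recover the stream ordering.
OLD_QUERY = """
    SELECT e.stream_ordering
    FROM receipts_linearized AS r
    INNER JOIN events AS e USING (event_id)
    WHERE r.room_id = ? AND r.user_id = ?
"""

# After: read the receipt's own event_stream_ordering column, avoiding the join
# (and the IOPs it incurs) in the receipt-handling code paths.
NEW_QUERY = """
    SELECT r.event_stream_ordering
    FROM receipts_linearized AS r
    WHERE r.room_id = ? AND r.user_id = ?
"""
```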
2024-04-12 09:28:44 +01:00
Erik Johnston
3a30846bd0 Fix mypy on latest Twisted release (#17036)
`ITransport.abortConnection` isn't a thing, but
`HTTPChannel.forceAbortClient` calls it, so let's just use that.

Fixes https://github.com/element-hq/synapse/issues/16728
2024-04-11 16:03:45 +01:00
Andrew Morgan
db4e321219 1.105.0rc1 2024-04-11 12:16:31 +01:00
Patrick Cloke
657b8cc75c Stabilize support for MSC4010: push rules & account data. (#17022)
See
[MSC4010](https://github.com/matrix-org/matrix-spec-proposals/pull/4010),
but this is pretty much just removing an experimental flag.

Part of #17021
2024-04-09 17:11:50 +01:00
Patrick Cloke
a2a543fd12 Stabilize support for MSC3981: recurse /relations (#17023)
See
[MSC3981](https://github.com/matrix-org/matrix-spec-proposals/pull/3981),
this pretty much just removes flags though.

Part of #17021
2024-04-09 17:11:08 +01:00
Erik Johnston
89f1092284 Also check if first event matches the last in prev batch (#17066)
Refinement of #17064 

cc @richvdh
2024-04-09 14:01:12 +00:00
Sumiran Pokharel
4ffed6330f #17039 Issue: Update base_rules.rs (#17043)
Co-authored-by: Olivier Wilkinson (reivilibre) <oliverw@matrix.org>
2024-04-09 14:07:26 +01:00
Mathieu Velten
e363881592 Fix PR #16677, a parameter was missing in a function call (#17033)
Co-authored-by: Andrew Morgan <1342360+anoadragon453@users.noreply.github.com>
2024-04-09 14:06:46 +01:00
Erik Johnston
d40878451c Add forgotten schema delta (#17054)
This should have been in #17045. Whoops.
2024-04-09 13:03:41 +01:00
dependabot[bot]
892cbd0624 Bump packaging from 23.2 to 24.0 (#17027) 2024-04-09 11:25:32 +01:00
dependabot[bot]
106cfd4b39 Bump serde_json from 1.0.114 to 1.0.115 (#17041) 2024-04-09 11:25:23 +01:00
dependabot[bot]
0a6ae6fe4c Bump regex from 1.10.3 to 1.10.4 (#17028)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-08 17:56:37 +01:00
dependabot[bot]
13a3987929 Bump ruff from 0.3.2 to 0.3.5 (#17060)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-08 17:54:18 +01:00
dependabot[bot]
680f60102b Bump types-pillow from 10.2.0.20240125 to 10.2.0.20240406 (#17061)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-08 17:52:43 +01:00
dependabot[bot]
3e51b370c5 Bump typing-extensions from 4.9.0 to 4.11.0 (#17062)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-08 17:52:02 +01:00
dependabot[bot]
9b8597e431 Bump types-requests from 2.31.0.20240125 to 2.31.0.20240406 (#17063)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-08 17:50:16 +01:00
Erik Johnston
4d10a8fb18 Fixups to #17064 (#17065)
Forget a line, and an empty batch is trivially linear.

c.f. #17064
2024-04-08 14:55:19 +01:00
Erik Johnston
1f8f991d51 Add back fast path for non-gappy syncs (#17064)
PR #16942 removed an invalid optimisation that avoided pulling out state
for non-gappy syncs. This causes a large increase in DB usage. c.f.
#16941 for why that optimisation was wrong.

However, we can still optimise in the simple case where the events in
the timeline are a linear chain without any branching/merging of the
DAG.
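
A hedged sketch of what "a linear chain" means here, using an illustrative event shape rather than Synapse's internal types: each event's only parent is the event before it, so the fast path applies.

```python
from typing import Sequence


def timeline_is_linear(events: Sequence[dict]) -> bool:
    """True if every event's prev_events is exactly [the preceding event]."""
    return all(
        ev["prev_events"] == [prev["event_id"]]
        for prev, ev in zip(events, events[1:])
    )


batch = [
    {"event_id": "$a", "prev_events": ["$z"]},
    {"event_id": "$b", "prev_events": ["$a"]},
    {"event_id": "$c", "prev_events": ["$b"]},
]
assert timeline_is_linear(batch)  # no branching/merging of the DAG
assert timeline_is_linear([])     # an empty batch is trivially linear (#17065)
```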

cc. @richvdh
2024-04-08 14:25:28 +01:00
Erik Johnston
5360baeb64 Pull out fewer receipts from DB when doing push (#17049)
Before we were pulling out *all* read receipts for a user for every
event we pushed. Instead let's only pull out the relevant receipts.

This also pulled out the event rows for each receipt, causing load on
the events table.
2024-04-05 12:46:34 +01:00
Richard van der Hoff
0e68e9b7f4 Fix bug in calculating state for non-gappy syncs (#16942)
Unfortunately, the optimisation we applied here for non-gappy syncs is
not actually valid.

Fixes https://github.com/element-hq/synapse/issues/16941.

~~Based on https://github.com/element-hq/synapse/pull/16930.~~
Requires https://github.com/matrix-org/sytest/pull/1374.
2024-04-04 16:15:35 +00:00
Richard van der Hoff
230b709d9d /sync: fix bug in calculating state response (#16930)
Fix a long-standing issue which could cause state to be omitted from the
sync response if the last event was filtered out.

Fixes: https://github.com/element-hq/synapse/issues/16928
2024-04-04 12:14:24 +00:00
Richard van der Hoff
05957ac70f Fix bug in /sync response for archived rooms (#16932)
This PR fixes a very, very niche edge-case, but I've got some more work
coming which will otherwise make the problem worse.

The bug happens when the syncing user leaves a room, and has a sync
filter which includes "left" rooms, but sets the timeline limit to 0. In
that case, the state returned in the `state` section is calculated
incorrectly.

The fix is to pass a token corresponding to the point that the user
leaves the room through to `compute_state_delta`.
2024-04-04 12:47:59 +01:00
Erik Johnston
31122b71bc Add missing index to access_tokens table (#17045)
This was causing sequential scans when using refresh tokens.
2024-04-04 11:05:40 +01:00
Erik Johnston
51776745b9 Merge branch 'master' into develop 2024-04-02 18:44:47 +01:00
Erik Johnston
ca27b51665 1.104.0 2024-04-02 17:17:02 +01:00
Erik Johnston
ec174d0470 Refactor chain fetching (#17044)
Since these queries are duplicated in two places.
2024-04-02 15:33:56 +01:00
Erik Johnston
fd48fc4585 Fixups to new push stream (#17038)
Follow on from #17037
2024-03-28 16:29:23 +00:00
Erik Johnston
ea6bfae0fc Add support for moving /push_rules off of main process (#17037) 2024-03-28 15:44:07 +00:00
Erik Johnston
59ceabcb97 Fixup changelog 2024-03-26 13:45:57 +00:00
Erik Johnston
0581741342 Fixup changelog 2024-03-26 13:44:06 +00:00
Erik Johnston
34878b6bc9 Merge remote-tracking branch 'origin/develop' into release-v1.104 2024-03-26 13:42:09 +00:00
Erik Johnston
c900d18647 Fix OIDC login regression (#17031)
Requests may require a User-Agent header, and the change in #16972
accidentally removed it, resulting in requests getting rejected causing
login to fail.
2024-03-26 13:26:46 +00:00
Erik Johnston
03f0d746c3 1.104.0rc1 2024-03-26 11:49:11 +00:00
Richard van der Hoff
b5322b4daf Ensure that pending to-device events are sent over federation at startup (#16925)
Fixes https://github.com/element-hq/synapse/issues/16680, as well as a
related bug, where servers which we had *never* successfully sent an
event to would not be retried.

In order to fix the case of pending to-device messages, we hook into the
existing `wake_destinations_needing_catchup` process, by extending it to
look for destinations that have pending to-device messages. The
federation transmission loop then attempts to send the pending to-device
messages as normal.
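
A small sketch of the idea with illustrative names (not Synapse's actual query): the set of destinations to wake is widened to include any destination with queued to-device messages, not just those with previously failed sends.

```python
def destinations_to_wake(
    failed_destinations: set, pending_to_device: set
) -> set:
    # Previously only destinations needing catch-up after a failed send were
    # woken; now destinations with pending to-device messages are included too,
    # so the transmission loop picks them up at startup.
    return failed_destinations | pending_to_device


woken = destinations_to_wake({"a.example"}, {"b.example"})
print(sorted(woken))  # ['a.example', 'b.example']
```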
2024-03-22 13:24:11 +00:00
Mathieu Velten
b7af076ab5 Add OIDC config to add extra parameters to the authorize URL (#16971) 2024-03-22 10:35:11 +00:00
SpiritCroc
9ad49e7ecf Do not refuse to set read_marker if previous event_id is in wrong room (#16990) 2024-03-21 18:43:07 +00:00
Hanadi
f7a3ebe44d Fix reject knocks on deactivating account (#17010) 2024-03-21 18:05:54 +00:00
Sam Wedgwood
bef765b262 generate configuration with correct user in start.py for docker (#16978) 2024-03-21 17:55:44 +00:00
dependabot[bot]
6d3ffdd421 Bump dawidd6/action-download-artifact from 3.1.2 to 3.1.4 (#17008) 2024-03-21 17:50:18 +00:00
Mathieu Velten
3ab9e6d524 OIDC: try to JWT decode userinfo response if JSON parsing failed (#16972) 2024-03-21 17:49:44 +00:00
Richard van der Hoff
db95b75515 Patch the db conn pool sooner in tests (#17017)
When running unit tests, we patch the database connection pool so that
it runs queries "synchronously". This is ok, except that if any queries
are launched before we do the patching, those queries get left in limbo
and never complete.

To fix this, let's change the way we do the switcheroo, by patching out
the method which creates the connection pool in the first place.
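
The general shape of that switcheroo, illustrated with a stand-alone example rather than Synapse's test harness: patch the factory that creates the pool, so nothing can issue a query against the unpatched pool first.

```python
from unittest import mock


class AsyncPool:
    """Stands in for the real pool, which runs queries asynchronously."""

    def run(self, query: str) -> str:
        return f"async: {query}"


class SyncPool:
    """Test double that runs queries 'synchronously'."""

    def run(self, query: str) -> str:
        return f"sync: {query}"


def make_pool() -> AsyncPool:
    """The factory the application calls when it first needs a pool."""
    return AsyncPool()


def app_startup() -> str:
    # Any query issued here uses whatever make_pool() returns, so the patch
    # must already be in place when this runs.
    return make_pool().run("SELECT 1")


with mock.patch(f"{__name__}.make_pool", return_value=SyncPool()):
    print(app_startup())  # sync: SELECT 1
```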
2024-03-21 17:48:16 +00:00
dependabot[bot]
4c98aad47b Bump netaddr from 0.9.0 to 1.2.1 (#17006) 2024-03-21 17:36:40 +00:00
Tadeusz Sośnierz
5a59c68b3d Remove the hardcoded poetry version from contributing guide (#17002) 2024-03-21 17:12:02 +00:00
grahhnt
6cf23febb9 Add note to using --curses under sqlite porting (#17012) 2024-03-21 17:07:21 +00:00
Eirik
159536d525 Update link, in installation guide, for docker hub synapse images (#17001) 2024-03-21 17:05:52 +00:00
dependabot[bot]
70a86f69c2 Bump types-jsonschema from 4.21.0.20240118 to 4.21.0.20240311 (#17007) 2024-03-21 16:53:51 +00:00
Andrew Morgan
21daa56ee1 Prevent start_for_complement.sh from setting START_POSTGRES to false when it's already set (#16985)
I have a use case where I'd like the Synapse image to start up a
postgres instance that I can use, but don't want to force Synapse to use
postgres as well.

This commit prevents postgres from being started when it has already
been explicitly enabled elsewhere.
2024-03-21 13:50:51 +00:00
Shay
cf5adc80e1 Update power level default for public rooms (#16907) 2024-03-19 17:55:31 +00:00
Shay
8fb5b0f335 Improve event validation (#16908)
As the title states.
2024-03-19 17:52:53 +00:00
dependabot[bot]
77b824008c Bump pydantic from 2.6.0 to 2.6.4 (#17004) 2024-03-19 17:45:56 +00:00
dependabot[bot]
77317cecc7 Bump anyhow from 1.0.80 to 1.0.81 (#17009) 2024-03-19 17:45:41 +00:00
dependabot[bot]
3e89afdef7 Bump jinja2 from 3.1.2 to 3.1.3 (#17005) 2024-03-19 17:45:23 +00:00
dependabot[bot]
f768e028c1 Bump types-pyopenssl from 23.3.0.0 to 24.0.0.20240311 (#17003) 2024-03-19 17:45:15 +00:00
Mathieu Velten
74ab329eaa Pass module API to OIDC mapping provider (#16974)
As done for SAML mapping provider, let's pass the module API to the OIDC
one so the mapper can do more logic in its code.
2024-03-19 17:20:10 +00:00
V02460
05489d89c6 Specify IP subnet literals in canonical form (#16953)
This is needed because the netaddr package removed support for the
implicit prefix form in version 1.0.0:
https://github.com/netaddr/netaddr/pull/360
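
For reference, a small example of the canonical (explicit-prefix) form that netaddr ≥ 1.0.0 expects; the addresses here are illustrative only.

```python
from netaddr import IPNetwork

# Implicit prefix forms such as IPNetwork("192.168/16") were removed in
# netaddr 1.0.0; spell subnets out in full, with an explicit prefix length.
blocked_ranges = [
    IPNetwork("10.0.0.0/8"),
    IPNetwork("172.16.0.0/12"),
    IPNetwork("192.168.0.0/16"),
]
print(blocked_ranges[0].prefixlen)  # 8
```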
2024-03-19 17:19:12 +00:00
Richard van der Hoff
9635822cc1 Clarify docs for some room state functions (#16950)
State *before* an event is different to state *after* that event, and
people tend to assume the wrong one.
2024-03-19 17:16:37 +00:00
Olivier Wilkinson (reivilibre)
42fa47a2a4 Merge branch 'master' into develop 2024-03-19 14:19:00 +00:00
Olivier Wilkinson (reivilibre)
0b4dc4de7c 1.103.0 2024-03-19 12:24:41 +00:00
Richard van der Hoff
52f456a822 /sync: Fix edge-case in calculating the "device_lists" response (#16949)
Fixes https://github.com/element-hq/synapse/issues/16948. If the `join`
and the `leave` are in the same sync response, we need to count them as
a "left" user.
2024-03-14 17:34:19 +00:00
Richard van der Hoff
6d5bafb2c8 Split up SyncHandler.compute_state_delta (#16929)
This is a huge method, which melts my brain.

This is a non-functional change which lays some groundwork for future
work in this area.
2024-03-14 17:18:48 +00:00
Will Hunt
1198f649ea Sort versions in the documentation version picker appropriately. (#16966)
Fixes #16964 

This adds a proper sorter for versions which takes into account semantic
versions, rather than just relying on localeCompare.
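
The picker fix itself is JavaScript, but the sorting idea is easy to illustrate; a hedged Python sketch (not the PR's code): split numeric components and compare them as numbers instead of lexicographically.

```python
import re


def version_key(name: str):
    """Releases sort numerically (so v1.100 > v1.99); non-releases sort last."""
    m = re.fullmatch(r"v?(\d+(?:\.\d+)*)", name)
    if not m:
        return (1, ())  # e.g. "latest", "develop"
    return (0, tuple(int(part) for part in m.group(1).split(".")))


versions = ["v1.99", "latest", "v1.102", "v1.100"]
print(sorted(versions, key=version_key))
# ['v1.99', 'v1.100', 'v1.102', 'latest'] — a plain string sort puts v1.99 after v1.102
```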
2024-03-14 15:18:51 +00:00
Richard van der Hoff
acc2f00eca upgrade.md: fix grammatical errors (#16965)
comma splice
"rollback" is a noun
2024-03-14 13:54:01 +00:00
Andrew Morgan
1c1b0bfa77 Add query to update local cache of a remote user's device list to docs (#16892) 2024-03-14 13:53:25 +00:00
Mathieu Velten
cb562d73aa Improve lock performance when a lot of locks are waiting (#16840)
When a lot of locks are waiting for a single lock, notifying all locks
independently with `call_later` on each release is really costly and
incurs some kind of async contention, where the CPU is spinning a lot
for not much.

The included test is taking around 30s before the change, and 0.5s
after.

It was found following failing tests with
https://github.com/element-hq/synapse/pull/16827.
2024-03-14 13:49:54 +00:00
dependabot[bot]
a111ba0207 Bump types-psycopg2 from 2.9.21.16 to 2.9.21.20240311 (#16995) 2024-03-14 10:36:21 +00:00
dependabot[bot]
1cc1d6b655 Bump pyo3 from 0.20.2 to 0.20.3 (#16962) 2024-03-14 10:36:13 +00:00
Richard van der Hoff
92f2069627 Multi-worker-docker-container: disable log buffering (#16919)
Background: we have a `matrixdotorg/synapse-workers` docker image, which
is intended for running multiple workers within the same container. That
image includes a `prefix-log` script which, for each line printed to
stdout or stderr by one of the processes, prepends the name of the
process.

This commit disables buffering in that script, so that lines are logged
quickly after they are printed. This makes it much easier to understand
the output, since they then come out in a natural order.
2024-03-13 17:21:37 +00:00
dependabot[bot]
9b5eef95ad Bump ruff from 0.1.14 to 0.3.2 (#16994) 2024-03-13 17:06:23 +00:00
dependabot[bot]
e161103b46 Bump mypy from 1.5.1 to 1.8.0 (#16901) 2024-03-13 17:05:57 +00:00
dependabot[bot]
f4e12ceb1f Bump dawidd6/action-download-artifact from 3.1.1 to 3.1.2 (#16960) 2024-03-13 16:50:47 +00:00
dependabot[bot]
10e56b162f Bump cryptography from 41.0.7 to 42.0.5 (#16958) 2024-03-13 16:50:11 +00:00
dependabot[bot]
74fb3e1996 Bump serde_json from 1.0.113 to 1.0.114 (#16961) 2024-03-13 16:49:54 +00:00
dependabot[bot]
a91fb6cc06 Bump serde from 1.0.196 to 1.0.197 (#16963) 2024-03-13 16:49:19 +00:00
dependabot[bot]
6cb8839f67 Bump log from 0.4.20 to 0.4.21 (#16977) 2024-03-13 16:49:06 +00:00
dependabot[bot]
1e68b56a62 Bump black from 23.10.1 to 24.2.0 (#16936) 2024-03-13 16:46:44 +00:00
V02460
2bdf6280f6 Raise poetry-core version cap to 1.9.0 (#16986)
A new poetry-core version was released. See if CI is happy. Required for
the latest Fedora Synapse package.
2024-03-13 16:40:08 +00:00
Erik Johnston
5c0b87ff95 Update changelog 2024-03-12 15:12:19 +00:00
Erik Johnston
0d44f64c4e Merge remote-tracking branch 'origin/develop' into release-v1.103 2024-03-12 15:11:03 +00:00
Gerrit Gogel
1f88790764 Prevent locking up while processing batched_auth_events (#16968)
This PR aims to fix #16895, caused by a regression in #7 and not fixed
by #16903. The PR #16903 only fixes a starvation issue, where the CPU
isn't released. There is a second issue, where the execution is blocked.
This theory is supported by the flame graphs provided in #16895 and the
fact that I see the CPU usage dropping to far below the limit.

Since the changes in #7, the method `check_state_independent_auth_rules`
is called with the additional parameter `batched_auth_events`:


6fa13b4f92/synapse/handlers/federation_event.py (L1741-L1743)


It makes the execution enter this if clause, introduced with #15195


6fa13b4f92/synapse/event_auth.py (L178-L189)

There are two issues in the above code snippet.

First, there is the blocking issue. I'm not entirely sure if this is a
deadlock, starvation, or something different. In the beginning, I
thought the copy operation was responsible. It wasn't. Then I
investigated the nested `store.get_events` inside the function `update`.
This was also not causing the blocking issue. Only when I replaced the
set difference operation (`-` ) with a list comprehension, the blocking
was resolved. Creating and comparing sets with a very large amount of
events seems to be problematic.

This is how the flamegraph looks now while persisting outliers. As you
can see, the execution no longer locks up in the above function.

![output_2024-02-28_13-59-40](https://github.com/element-hq/synapse/assets/13143850/6db9c9ac-484f-47d0-bdde-70abfbd773ec)

Second, the copying here doesn't serve any purpose, because only a
shallow copy is created. This means the same objects from the original
dict are referenced. This fails the intention of protecting these
objects from mutation. The review of the original PR
https://github.com/matrix-org/synapse/pull/15195 had an extensive
discussion about this matter.

Various approaches to copying the auth_events were attempted:
1) Implementing a deepcopy caused issues due to
builtins.EventInternalMetadata not being pickleable.
2) Creating a dict with new objects akin to a deepcopy.
3) Creating a dict with new objects containing only necessary
attributes.

Concluding, there is no easy way to create an actual copy of the
objects. Opting for a deepcopy can significantly strain memory and CPU
resources, making it an inefficient choice. I don't see why the copy is
necessary in the first place. Therefore I'm proposing to remove it
altogether.

After these changes, I was able to successfully join these rooms,
without the main worker locking up:
- #synapse:matrix.org
- #element-android:matrix.org
- #element-web:matrix.org
- #ecips:matrix.org
- #ipfs-chatter:ipfs.io
- #python:matrix.org
- #matrix:matrix.org
2024-03-12 15:07:36 +00:00
Erik Johnston
9d7880c0c6 1.103.0rc1 2024-03-12 15:03:45 +00:00
Alexander Fechler
48f59d3806 deactivated flag refactored to filter deactivated users. (#16874)
Co-authored-by: Andrew Morgan <1342360+anoadragon453@users.noreply.github.com>
2024-03-11 16:08:04 +00:00
Patrick Cloke
696cc9e802 Stabilize support for Retry-After header (MSC4014) (#16947) 2024-03-08 09:33:46 +00:00
Quentin Gliech
4af33015af Fix joining remote rooms when a on_new_event callback is registered (#16973)
Since Synapse 1.76.0, any module which registers a `on_new_event`
callback would brick the ability to join remote rooms.
This is because this callback tried to get the full state of the room,
which would end up in a deadlock.

Related:
https://github.com/matrix-org/synapse-auto-accept-invite/issues/18

The following module would brick the ability to join remote rooms:

```python
from typing import Any, Dict, Literal, Union
import logging

from synapse.module_api import ModuleApi, EventBase

logger = logging.getLogger(__name__)

class MyModule:
    def __init__(self, config: None, api: ModuleApi):
        self._api = api
        self._config = config

        self._api.register_third_party_rules_callbacks(
            on_new_event=self.on_new_event,
        )

    async def on_new_event(self, event: EventBase, _state_map: Any) -> None:
        logger.info(f"Received new event: {event}")

    @staticmethod
    def parse_config(_config: Dict[str, Any]) -> None:
        return None
```

This is technically a breaking change, as we are now passing partial
state on the `on_new_event` callback.
However, this callback was broken for federated rooms since 1.76.0, and
local rooms have full state anyway, so it's unlikely that it would
change anything.
2024-03-06 16:00:20 +01:00
Andrew Morgan
2d1bb0b06b Merge remote-tracking branch 'origin/release-v1.102' into develop 2024-03-05 16:03:24 +00:00
Andrew Morgan
ab80b3412e Revert "Ignore notification counts from rooms you've left" (#16981) 2024-03-05 16:02:54 +00:00
Andrew Morgan
1dee1b72ec Switch #16979 changelog type from internal change to bugfix 2024-03-05 15:13:32 +00:00
Andrew Morgan
571ca0c004 1.102.0 2024-03-05 14:47:35 +00:00
Andrew Morgan
8a05304222 Revert "Improve DB performance of calculating badge counts for push. (#16756)" (#16979) 2024-03-05 12:27:27 +00:00
Andrew Morgan
274f289a52 Ignore notification counts from rooms you've left (#16954)
Co-authored-by: reivilibre <oliverw@matrix.org>
2024-02-23 14:12:10 +00:00
Twilight Sparkle
8de3283ebe Add docs on upgrading from a very old version (#16951)
Co-authored-by: Andrew Morgan <1342360+anoadragon453@users.noreply.github.com>
2024-02-22 17:36:41 +00:00
dependabot[bot]
4ad70f115b Bump anyhow from 1.0.79 to 1.0.80 (#16935)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-21 10:40:34 +00:00
dependabot[bot]
3778cea776 Bump pyopenssl from 23.3.0 to 24.0.0 (#16937)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-21 10:38:38 +00:00
dependabot[bot]
5ce9498047 Bump JasonEtco/create-an-issue from 2.9.1 to 2.9.2 (#16934)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-21 10:37:32 +00:00
dependabot[bot]
f2c5f1564e Bump types-netaddr from 0.10.0.20240106 to 1.2.0.20240219 (#16938)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-21 10:36:35 +00:00
dependabot[bot]
91694907da Bump furo from 2023.9.10 to 2024.1.29 (#16939)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-21 10:35:05 +00:00
dependabot[bot]
e0b19a4777 Bump dawidd6/action-download-artifact from 3.0.0 to 3.1.1 (#16933)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-21 10:34:03 +00:00
kegsay
0c55c76da8 Better complement docs (#16946) 2024-02-20 17:14:50 +00:00
Andrew Morgan
3eb0a3b468 Merge branch 'release-v1.102' into develop 2024-02-20 16:57:37 +00:00
Georg
7c1c011942 Add HAProxy example for single port operation (#16768) 2024-02-20 16:15:58 +00:00
Andrew Morgan
7856ec96ef 1.102.0rc1 2024-02-20 15:51:17 +00:00
Erik Johnston
cdbbf3653d Don't lock up when joining large rooms (#16903)
Co-authored-by: Andrew Morgan <andrew@amorgan.xyz>
2024-02-20 14:29:18 +00:00
kegsay
c51a2240d1 bugfix: always prefer unthreaded receipt when >1 exist (MSC4102) (#16927)
Co-authored-by: Andrew Morgan <1342360+anoadragon453@users.noreply.github.com>
2024-02-20 14:12:06 +00:00
Erik Johnston
e5dfb6ecbf Fix incorrect docker hub link in release script (#16910) 2024-02-20 12:20:31 +00:00
Rainer Zufall
1b7304c8b4 fix typo in admin_api/rooms.md (#16857)
Co-authored-by: Andrew Morgan <andrew@amorgan.xyz>
2024-02-20 12:20:23 +00:00
Remi Rampin
0621e8eb0e Add metric for emails sent (#16881)
This adds a counter `synapse_emails_sent_total` for emails sent. They
are broken down by `type`, which are `password_reset`, `registration`,
`add_threepid`, `notification` (matching the methods of `Mailer`).
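
Roughly what such a counter looks like with `prometheus_client` (a sketch, not necessarily the exact code added):

```python
from prometheus_client import Counter

emails_sent_counter = Counter(
    "synapse_emails_sent_total",
    "Emails sent by this homeserver, broken down by type",
    labelnames=["type"],
)

# Incremented from the corresponding Mailer method, e.g. when a notification
# email goes out:
emails_sent_counter.labels(type="notification").inc()
```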
2024-02-14 15:30:03 +00:00
Erik Johnston
bc1db16086 Merge branch 'master' into develop 2024-02-13 13:24:29 +00:00
Erik Johnston
7b4d7429f8 Don't invalidate the entire event cache when we purge history (#16905)
We do this by adding support to the LRU cache for "extra indices" based
on the cached value. This allows us to efficiently map from room ID to
the cached events and only invalidate those.
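
A toy sketch of the idea (not Synapse's `LruCache`): keep a secondary index from room ID to the cached event keys, so a history purge can evict just that room's entries instead of dropping the whole cache.

```python
from collections import OrderedDict, defaultdict


class RoomIndexedEventCache:
    def __init__(self, max_size: int = 3) -> None:
        self._cache = OrderedDict()          # event_id -> event
        self._keys_by_room = defaultdict(set)  # room_id -> {event_id, ...}
        self._max_size = max_size

    def set(self, event_id: str, event: dict) -> None:
        self._cache[event_id] = event
        self._cache.move_to_end(event_id)
        self._keys_by_room[event["room_id"]].add(event_id)
        if len(self._cache) > self._max_size:
            evicted_id, evicted = self._cache.popitem(last=False)
            self._keys_by_room[evicted["room_id"]].discard(evicted_id)

    def get(self, event_id: str):
        return self._cache.get(event_id)

    def invalidate_room(self, room_id: str) -> None:
        # Only this room's events are dropped; the rest of the cache survives.
        for event_id in self._keys_by_room.pop(room_id, set()):
            self._cache.pop(event_id, None)


cache = RoomIndexedEventCache()
cache.set("$e1", {"room_id": "!a", "body": "hi"})
cache.set("$e2", {"room_id": "!b", "body": "yo"})
cache.invalidate_room("!a")
print(cache.get("$e1"), cache.get("$e2"))  # None {'room_id': '!b', 'body': 'yo'}
```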
2024-02-13 13:24:11 +00:00
Erik Johnston
01910b981f Add a config to not send out device list updates for specific users (#16909)
List of users not to send out device list updates for when they register
new devices. This is useful to handle bot accounts.

This is undocumented as it's mostly a hack to test on matrix.org.

Note: This will still send out device list updates if the device is
later updated, e.g. end to end keys are added.
2024-02-13 13:23:03 +00:00
Erik Johnston
2252bae3df 1.101.0 2024-02-13 10:45:40 +00:00
dependabot[bot]
79e31e8527 Bump pygithub from 2.1.1 to 2.2.0 (#16902) 2024-02-12 16:29:07 +00:00
dependabot[bot]
b07617fbe8 Bump attrs from 23.1.0 to 23.2.0 (#16899) 2024-02-12 16:27:51 +00:00
dependabot[bot]
e7cdf6152b Bump bcrypt from 4.0.1 to 4.1.2 (#16900)
Bumps [bcrypt](https://github.com/pyca/bcrypt) from 4.0.1 to 4.1.2.
<details>
<summary>Commits</summary>
<ul>
<li><a
href="b9223e61e2"><code>b9223e6</code></a>
Try building py39 wheels to see if that helps with reinitialization
errors (#...</li>
<li><a
href="5049783444"><code>5049783</code></a>
Bump syn from 2.0.40 to 2.0.41 in /src/_bcrypt (<a
href="https://redirect.github.com/pyca/bcrypt/issues/696">#696</a>)</li>
<li><a
href="642d070972"><code>642d070</code></a>
Bump syn from 2.0.39 to 2.0.40 in /src/_bcrypt (<a
href="https://redirect.github.com/pyca/bcrypt/issues/693">#693</a>)</li>
<li><a
href="8b44a1046a"><code>8b44a10</code></a>
Bump libc from 0.2.150 to 0.2.151 in /src/_bcrypt (<a
href="https://redirect.github.com/pyca/bcrypt/issues/692">#692</a>)</li>
<li><a
href="951cc64d0c"><code>951cc64</code></a>
Bump once_cell from 1.18.0 to 1.19.0 in /src/_bcrypt (<a
href="https://redirect.github.com/pyca/bcrypt/issues/690">#690</a>)</li>
<li><a
href="7377c6db3a"><code>7377c6d</code></a>
Bump actions/setup-python from 4.8.0 to 5.0.0 (<a
href="https://redirect.github.com/pyca/bcrypt/issues/689">#689</a>)</li>
<li><a
href="61b32039d4"><code>61b3203</code></a>
Bump actions/setup-python from 4.7.1 to 4.8.0 (<a
href="https://redirect.github.com/pyca/bcrypt/issues/688">#688</a>)</li>
<li><a
href="1c3159a28a"><code>1c3159a</code></a>
Fixed wheels for older versions of macOS (<a
href="https://redirect.github.com/pyca/bcrypt/issues/687">#687</a>)</li>
<li><a
href="1a41437d3a"><code>1a41437</code></a>
Update README.rst (<a
href="https://redirect.github.com/pyca/bcrypt/issues/682">#682</a>)</li>
<li><a
href="7881c5beef"><code>7881c5b</code></a>
Fix building windows abi3 wheels (<a
href="https://redirect.github.com/pyca/bcrypt/issues/681">#681</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/pyca/bcrypt/compare/4.0.1...4.1.2">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=bcrypt&package-manager=pip&previous-version=4.0.1&new-version=4.1.2)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-12 16:27:43 +00:00
dependabot[bot]
c415b7a412 Bump sentry-sdk from 1.40.0 to 1.40.3 (#16898)
Bumps [sentry-sdk](https://github.com/getsentry/sentry-python) from
1.40.0 to 1.40.3.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/getsentry/sentry-python/releases">sentry-sdk's
releases</a>.</em></p>
<blockquote>
<h2>1.40.3</h2>
<h3>Various fixes &amp; improvements</h3>
<ul>
<li>Turn off metrics for uWSGI (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2720">#2720</a>)
by <a
href="https://github.com/sentrivana"><code>@​sentrivana</code></a></li>
<li>Minor improvements (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2714">#2714</a>)
by <a
href="https://github.com/antonpirker"><code>@​antonpirker</code></a></li>
</ul>
<h2>1.40.2</h2>
<h3>Various fixes &amp; improvements</h3>
<ul>
<li>test: Fix <code>pytest</code> error (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2712">#2712</a>)
by <a
href="https://github.com/szokeasaurusrex"><code>@​szokeasaurusrex</code></a></li>
<li>build(deps): bump types-protobuf from 4.24.0.4 to 4.24.0.20240129
(<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2691">#2691</a>)
by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a></li>
</ul>
<h2>1.40.1</h2>
<h3>Various fixes &amp; improvements</h3>
<ul>
<li>Fix uWSGI workers hanging (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2694">#2694</a>)
by <a
href="https://github.com/sentrivana"><code>@​sentrivana</code></a></li>
<li>Make metrics work with <code>gevent</code> (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2694">#2694</a>)
by <a
href="https://github.com/sentrivana"><code>@​sentrivana</code></a></li>
<li>Guard against <code>engine.url</code> being <code>None</code> (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2708">#2708</a>)
by <a
href="https://github.com/sentrivana"><code>@​sentrivana</code></a></li>
<li>Fix performance regression in
<code>sentry_sdk.utils._generate_installed_modules</code> (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2703">#2703</a>)
by <a
href="https://github.com/GlenWalker"><code>@​GlenWalker</code></a></li>
<li>Guard against Sentry initialization mid SQLAlchemy cursor (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2702">#2702</a>)
by <a
href="https://github.com/apmorton"><code>@​apmorton</code></a></li>
<li>Fix yaml generation script (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2695">#2695</a>)
by <a
href="https://github.com/sentrivana"><code>@​sentrivana</code></a></li>
<li>Fix AWS Lambda workflow (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2710">#2710</a>)
by <a
href="https://github.com/sentrivana"><code>@​sentrivana</code></a></li>
<li>Bump <code>codecov/codecov-action</code> from 3 to 4 (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2706">#2706</a>)
by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a></li>
<li>Bump <code>actions/cache</code> from 3 to 4 (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2661">#2661</a>)
by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a></li>
<li>Bump <code>actions/checkout</code> from 3.1.0 to 4.1.1 (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2561">#2561</a>)
by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a></li>
<li>Bump <code>github/codeql-action</code> from 2 to 3 (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2603">#2603</a>)
by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a></li>
<li>Bump <code>actions/setup-python</code> from 4 to 5 (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2577">#2577</a>)
by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a></li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/getsentry/sentry-python/blob/master/CHANGELOG.md">sentry-sdk's
changelog</a>.</em></p>
<blockquote>
<h2>1.40.3</h2>
<h3>Various fixes &amp; improvements</h3>
<ul>
<li>Turn off metrics for uWSGI (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2720">#2720</a>)
by <a
href="https://github.com/sentrivana"><code>@​sentrivana</code></a></li>
<li>Minor improvements (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2714">#2714</a>)
by <a
href="https://github.com/antonpirker"><code>@​antonpirker</code></a></li>
</ul>
<h2>1.40.2</h2>
<h3>Various fixes &amp; improvements</h3>
<ul>
<li>test: Fix <code>pytest</code> error (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2712">#2712</a>)
by <a
href="https://github.com/szokeasaurusrex"><code>@​szokeasaurusrex</code></a></li>
<li>build(deps): bump types-protobuf from 4.24.0.4 to 4.24.0.20240129
(<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2691">#2691</a>)
by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a></li>
</ul>
<h2>1.40.1</h2>
<h3>Various fixes &amp; improvements</h3>
<ul>
<li>Fix uWSGI workers hanging (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2694">#2694</a>)
by <a
href="https://github.com/sentrivana"><code>@​sentrivana</code></a></li>
<li>Make metrics work with <code>gevent</code> (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2694">#2694</a>)
by <a
href="https://github.com/sentrivana"><code>@​sentrivana</code></a></li>
<li>Guard against <code>engine.url</code> being <code>None</code> (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2708">#2708</a>)
by <a
href="https://github.com/sentrivana"><code>@​sentrivana</code></a></li>
<li>Fix performance regression in
<code>sentry_sdk.utils._generate_installed_modules</code> (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2703">#2703</a>)
by <a
href="https://github.com/GlenWalker"><code>@​GlenWalker</code></a></li>
<li>Guard against Sentry initialization mid SQLAlchemy cursor (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2702">#2702</a>)
by <a
href="https://github.com/apmorton"><code>@​apmorton</code></a></li>
<li>Fix yaml generation script (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2695">#2695</a>)
by <a
href="https://github.com/sentrivana"><code>@​sentrivana</code></a></li>
<li>Fix AWS Lambda workflow (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2710">#2710</a>)
by <a
href="https://github.com/sentrivana"><code>@​sentrivana</code></a></li>
<li>Bump <code>codecov/codecov-action</code> from 3 to 4 (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2706">#2706</a>)
by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a></li>
<li>Bump <code>actions/cache</code> from 3 to 4 (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2661">#2661</a>)
by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a></li>
<li>Bump <code>actions/checkout</code> from 3.1.0 to 4.1.1 (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2561">#2561</a>)
by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a></li>
<li>Bump <code>github/codeql-action</code> from 2 to 3 (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2603">#2603</a>)
by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a></li>
<li>Bump <code>actions/setup-python</code> from 4 to 5 (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2577">#2577</a>)
by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a></li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="84c4c127ff"><code>84c4c12</code></a>
Update CHANGELOG.md</li>
<li><a
href="f92b4f2247"><code>f92b4f2</code></a>
release: 1.40.3</li>
<li><a
href="f23bdd32fe"><code>f23bdd3</code></a>
fix(metrics): Turn off metrics for uWSGI (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2720">#2720</a>)</li>
<li><a
href="c77a1235f4"><code>c77a123</code></a>
Minor improvements (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2714">#2714</a>)</li>
<li><a
href="2186e227a5"><code>2186e22</code></a>
Merge branch 'release/1.40.2'</li>
<li><a
href="139469a01f"><code>139469a</code></a>
release: 1.40.2</li>
<li><a
href="d97e7d75f7"><code>d97e7d7</code></a>
test: Fix <code>pytest</code> error (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2712">#2712</a>)</li>
<li><a
href="60e644c8e3"><code>60e644c</code></a>
build(deps): bump types-protobuf from 4.24.0.4 to 4.24.0.20240129 (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/2691">#2691</a>)</li>
<li><a
href="d769becc92"><code>d769bec</code></a>
Merge branch 'release/1.40.1'</li>
<li><a
href="ad25ed961b"><code>ad25ed9</code></a>
Update CHANGELOG.md</li>
<li>Additional commits viewable in <a
href="https://github.com/getsentry/sentry-python/compare/1.40.0...1.40.3">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=sentry-sdk&package-manager=pip&previous-version=1.40.0&new-version=1.40.3)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-12 16:27:32 +00:00
Erik Johnston
ea1b30940e Merge remote-tracking branch 'origin/release-v1.101' into develop 2024-02-09 10:52:35 +00:00
Erik Johnston
bfa93d1d3b Only do one concurrent fetch per server in keyring (#16894)
Otherwise if we've stacked a bunch of requests for the keys of a server,
we'll end up sending lots of concurrent requests for its keys,
needlessly.
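
The general pattern, sketched with asyncio rather than Twisted and with illustrative names (not the keyring's API): keep one in-flight fetch per server and have every concurrent caller await it.

```python
import asyncio

_inflight = {}  # server_name -> in-flight asyncio.Task


async def _fetch_server_keys(server_name: str) -> dict:
    await asyncio.sleep(0.1)  # stand-in for the real federation request
    return {"server_name": server_name, "verify_keys": {}}


async def get_server_keys(server_name: str) -> dict:
    """Coalesce callers so at most one fetch per server is in flight."""
    task = _inflight.get(server_name)
    if task is None:
        task = asyncio.create_task(_fetch_server_keys(server_name))
        _inflight[server_name] = task
        task.add_done_callback(lambda _: _inflight.pop(server_name, None))
    return await task


async def main() -> None:
    results = await asyncio.gather(
        *(get_server_keys("example.org") for _ in range(5))
    )
    print(f"{len(results)} callers shared a single fetch")


asyncio.run(main())
```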
2024-02-09 10:51:11 +00:00
Erik Johnston
02a147039c Increase batching when fetching auth chains (#16893)
This basically reverts a change that was in
https://github.com/element-hq/synapse/pull/16833, where we reduced the
batching.

The smaller batching can cause performance issues on busy servers and
databases.
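
The trade-off described here is just batch size; a minimal sketch of chunked fetching with illustrative values (not the code in the PR):

```python
from itertools import islice


def batch_iter(items, batch_size):
    """Yield successive lists of at most batch_size items."""
    iterator = iter(items)
    while chunk := list(islice(iterator, batch_size)):
        yield chunk


event_ids = [f"$event{i}" for i in range(2500)]
# Larger batches mean fewer database round-trips on busy servers:
print(sum(1 for _ in batch_iter(event_ids, 100)))   # 25 queries
print(sum(1 for _ in batch_iter(event_ids, 1000)))  # 3 queries
```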
2024-02-09 10:51:00 +00:00
Erik Johnston
7c805f00a7 1.101.0rc1 2024-02-06 16:33:19 +00:00
David Baker
71ca199165 Accept unprefixed form of MSC3981 recurse parameter (#16842)
Now that MSC3981 has passed FCP
2024-02-06 09:48:39 +00:00
dependabot[bot]
871f51c270 Bump lxml-stubs from 0.4.0 to 0.5.1 (#16885) 2024-02-06 09:29:17 +00:00
dependabot[bot]
71e8634069 Bump dorny/paths-filter from 2 to 3 (#16869) 2024-02-06 09:28:22 +00:00
kegsay
93edd0932e Update docs for MacOS installs (#16854)
ICU is an optional dependency and also a pain to install. Mention that
you can just not install it and still get a working installation.
2024-02-06 09:27:38 +00:00
kegsay
505cdd044b Fix broken links on docs (#16853)
Some links seemed to be incorrect (vector-im/sygnal and vector-im/sytest
have never been A Thing iirc), so pointed them back to matrix-org/*.
2024-02-06 09:26:55 +00:00
dependabot[bot]
d2674bacdb Bump sigstore/cosign-installer from 3.3.0 to 3.4.0 (#16890) 2024-02-06 09:17:42 +00:00
dependabot[bot]
afd513fb25 Bump sentry-sdk from 1.39.1 to 1.40.0 (#16889) 2024-02-06 09:17:32 +00:00
dependabot[bot]
53744c7258 Bump pydantic from 2.5.3 to 2.6.0 (#16888) 2024-02-06 09:16:47 +00:00
dependabot[bot]
bdcad7823f Bump jsonschema from 4.20.0 to 4.21.1 (#16887) 2024-02-06 09:16:21 +00:00
dependabot[bot]
6f3f9770dc Bump types-requests from 2.31.0.10 to 2.31.0.20240125 (#16886) 2024-02-06 09:16:05 +00:00
dependabot[bot]
50a332cf30 Bump hiredis from 2.2.3 to 2.3.2 (#16862) 2024-02-01 14:32:50 +00:00
dependabot[bot]
6e714a6277 Bump mypy-zope from 1.0.1 to 1.0.3 (#16865) 2024-02-01 14:32:36 +00:00
dependabot[bot]
0c1c56b75c Bump types-pillow from 10.1.0.2 to 10.2.0.20240125 (#16864) 2024-02-01 14:32:18 +00:00
dependabot[bot]
b720059d75 Bump types-setuptools from 69.0.0.0 to 69.0.0.20240125 (#16863) 2024-02-01 14:32:08 +00:00
dependabot[bot]
fbf7fa986f Bump phonenumbers from 8.13.26 to 8.13.29 (#16868) 2024-02-01 14:31:25 +00:00
dependabot[bot]
ab9d3c0f40 Bump serde from 1.0.195 to 1.0.196 (#16867) 2024-02-01 14:30:56 +00:00
dependabot[bot]
8822ea88a3 Bump serde_json from 1.0.111 to 1.0.113 (#16866) 2024-02-01 14:30:44 +00:00
Will Hunt
d24d115706 Update version picker for element-hq (#16880) 2024-02-01 14:30:16 +00:00
Olivier Wilkinson (reivilibre)
3ba984d7af Merge branch 'master' into develop 2024-01-31 12:03:29 +00:00
Olivier Wilkinson (reivilibre)
4a5ea43f1b 1.100.0 2024-01-30 16:58:24 +00:00
195 changed files with 4457 additions and 1997 deletions

View File

@@ -30,7 +30,7 @@ jobs:
run: docker buildx inspect
- name: Install Cosign
uses: sigstore/cosign-installer@v3.3.0
uses: sigstore/cosign-installer@v3.5.0
- name: Checkout repository
uses: actions/checkout@v4

View File

@@ -14,7 +14,7 @@ jobs:
# There's a 'download artifact' action, but it hasn't been updated for the workflow_run action
# (https://github.com/actions/download-artifact/issues/60) so instead we get this mess:
- name: 📥 Download artifact
uses: dawidd6/action-download-artifact@e7466d1a7587ed14867642c2ca74b5bcc1e19a2d # v3.0.0
uses: dawidd6/action-download-artifact@09f2f74827fd3a8607589e5ad7f9398816f540fe # v3.1.4
with:
workflow: docs-pr.yaml
run_id: ${{ github.event.workflow_run.id }}

View File

@@ -19,7 +19,7 @@ jobs:
fetch-depth: 0
- name: Setup mdbook
uses: peaceiris/actions-mdbook@adeb05db28a0c0004681db83893d56c0388ea9ea # v1.2.0
uses: peaceiris/actions-mdbook@ee69d230fe19748b7abf22df32acaa93833fad08 # v2.0.0
with:
mdbook-version: '0.4.17'
@@ -53,7 +53,7 @@ jobs:
- uses: actions/checkout@v4
- name: Setup mdbook
uses: peaceiris/actions-mdbook@adeb05db28a0c0004681db83893d56c0388ea9ea # v1.2.0
uses: peaceiris/actions-mdbook@ee69d230fe19748b7abf22df32acaa93833fad08 # v2.0.0
with:
mdbook-version: '0.4.17'

View File

@@ -56,7 +56,7 @@ jobs:
fetch-depth: 0
- name: Setup mdbook
uses: peaceiris/actions-mdbook@adeb05db28a0c0004681db83893d56c0388ea9ea # v1.2.0
uses: peaceiris/actions-mdbook@ee69d230fe19748b7abf22df32acaa93833fad08 # v2.0.0
with:
mdbook-version: '0.4.17'
@@ -80,7 +80,7 @@ jobs:
# Deploy to the target directory.
- name: Deploy to gh pages
uses: peaceiris/actions-gh-pages@373f7f263a76c20808c831209c920827a82a2847 # v3.9.3
uses: peaceiris/actions-gh-pages@4f9cc6602d3f66b9c108549d475ec49e8ef4d45e # v4.0.0
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
publish_dir: ./book
@@ -110,7 +110,7 @@ jobs:
# Deploy to the target directory.
- name: Deploy to gh pages
uses: peaceiris/actions-gh-pages@373f7f263a76c20808c831209c920827a82a2847 # v3.9.3
uses: peaceiris/actions-gh-pages@4f9cc6602d3f66b9c108549d475ec49e8ef4d45e # v4.0.0
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
publish_dir: ./dev-docs/_build/html

View File

@@ -226,7 +226,7 @@ jobs:
steps:
- uses: actions/checkout@v4
- uses: JasonEtco/create-an-issue@e27dddc79c92bc6e4562f268fffa5ed752639abd # v2.9.1
- uses: JasonEtco/create-an-issue@1b14a70e4d8dc185e5cc76d3bec9eab20257b2c5 # v2.9.2
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:

View File

@@ -22,7 +22,7 @@ jobs:
integration: ${{ !startsWith(github.ref, 'refs/pull/') || steps.filter.outputs.integration }}
linting: ${{ !startsWith(github.ref, 'refs/pull/') || steps.filter.outputs.linting }}
steps:
- uses: dorny/paths-filter@v2
- uses: dorny/paths-filter@v3
id: filter
# We only check on PRs
if: startsWith(github.ref, 'refs/pull/')
@@ -81,7 +81,7 @@ jobs:
steps:
- uses: actions/checkout@v4
- name: Install Rust
uses: dtolnay/rust-toolchain@1.65.0
uses: dtolnay/rust-toolchain@1.66.0
- uses: Swatinem/rust-cache@v2
- uses: matrix-org/setup-python-poetry@v1
with:
@@ -148,7 +148,7 @@ jobs:
uses: actions/checkout@v4
- name: Install Rust
uses: dtolnay/rust-toolchain@1.65.0
uses: dtolnay/rust-toolchain@1.66.0
- uses: Swatinem/rust-cache@v2
- name: Setup Poetry
@@ -208,7 +208,7 @@ jobs:
with:
ref: ${{ github.event.pull_request.head.sha }}
- name: Install Rust
uses: dtolnay/rust-toolchain@1.65.0
uses: dtolnay/rust-toolchain@1.66.0
- uses: Swatinem/rust-cache@v2
- uses: matrix-org/setup-python-poetry@v1
with:
@@ -225,7 +225,7 @@ jobs:
- uses: actions/checkout@v4
- name: Install Rust
uses: dtolnay/rust-toolchain@1.65.0
uses: dtolnay/rust-toolchain@1.66.0
with:
components: clippy
- uses: Swatinem/rust-cache@v2
@@ -344,7 +344,7 @@ jobs:
postgres:${{ matrix.job.postgres-version }}
- name: Install Rust
uses: dtolnay/rust-toolchain@1.65.0
uses: dtolnay/rust-toolchain@1.66.0
- uses: Swatinem/rust-cache@v2
- uses: matrix-org/setup-python-poetry@v1
@@ -386,7 +386,7 @@ jobs:
- uses: actions/checkout@v4
- name: Install Rust
uses: dtolnay/rust-toolchain@1.65.0
uses: dtolnay/rust-toolchain@1.66.0
- uses: Swatinem/rust-cache@v2
# There aren't wheels for some of the older deps, so we need to install
@@ -498,7 +498,7 @@ jobs:
run: cat sytest-blacklist .ci/worker-blacklist > synapse-blacklist-with-workers
- name: Install Rust
uses: dtolnay/rust-toolchain@1.65.0
uses: dtolnay/rust-toolchain@1.66.0
- uses: Swatinem/rust-cache@v2
- name: Run SyTest
@@ -642,7 +642,7 @@ jobs:
path: synapse
- name: Install Rust
uses: dtolnay/rust-toolchain@1.65.0
uses: dtolnay/rust-toolchain@1.66.0
- uses: Swatinem/rust-cache@v2
- name: Prepare Complement's Prerequisites
@@ -674,7 +674,7 @@ jobs:
- uses: actions/checkout@v4
- name: Install Rust
uses: dtolnay/rust-toolchain@1.65.0
uses: dtolnay/rust-toolchain@1.66.0
- uses: Swatinem/rust-cache@v2
- run: cargo test

View File

@@ -207,7 +207,7 @@ jobs:
steps:
- uses: actions/checkout@v4
- uses: JasonEtco/create-an-issue@e27dddc79c92bc6e4562f268fffa5ed752639abd # v2.9.1
- uses: JasonEtco/create-an-issue@1b14a70e4d8dc185e5cc76d3bec9eab20257b2c5 # v2.9.2
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:

View File

@@ -1,3 +1,232 @@
# Synapse 1.105.0 (2024-04-16)
No significant changes since 1.105.0rc1.
# Synapse 1.105.0rc1 (2024-04-11)
### Features
- Stabilize support for [MSC4010](https://github.com/matrix-org/matrix-spec-proposals/pull/4010) which clarifies the interaction of push rules and account data. Contributed by @clokep. ([\#17022](https://github.com/element-hq/synapse/issues/17022))
- Stabilize support for [MSC3981](https://github.com/matrix-org/matrix-spec-proposals/pull/3981): `/relations` recursion. Contributed by @clokep. ([\#17023](https://github.com/element-hq/synapse/issues/17023))
- Add support for moving `/pushrules` off of main process. ([\#17037](https://github.com/element-hq/synapse/issues/17037), [\#17038](https://github.com/element-hq/synapse/issues/17038))
### Bugfixes
- Fix various long-standing bugs which could cause incorrect state to be returned from `/sync` in certain situations. ([\#16930](https://github.com/element-hq/synapse/issues/16930), [\#16932](https://github.com/element-hq/synapse/issues/16932), [\#16942](https://github.com/element-hq/synapse/issues/16942), [\#17064](https://github.com/element-hq/synapse/issues/17064), [\#17065](https://github.com/element-hq/synapse/issues/17065), [\#17066](https://github.com/element-hq/synapse/issues/17066))
- Fix server notice rooms not always being created as unencrypted rooms, even when `encryption_enabled_by_default_for_room_type` is in use (server notices are always unencrypted). ([\#17033](https://github.com/element-hq/synapse/issues/17033))
- Fix the `.m.rule.encrypted_room_one_to_one` and `.m.rule.room_one_to_one` default underride push rules being in the wrong order. Contributed by @Sumpy1. ([\#17043](https://github.com/element-hq/synapse/issues/17043))
### Internal Changes
- Refactor auth chain fetching to reduce duplication. ([\#17044](https://github.com/element-hq/synapse/issues/17044))
- Improve database performance by adding a missing index to `access_tokens.refresh_token_id`. ([\#17045](https://github.com/element-hq/synapse/issues/17045), [\#17054](https://github.com/element-hq/synapse/issues/17054))
- Improve database performance by reducing number of receipts fetched when sending push notifications. ([\#17049](https://github.com/element-hq/synapse/issues/17049))
### Updates to locked dependencies
* Bump packaging from 23.2 to 24.0. ([\#17027](https://github.com/element-hq/synapse/issues/17027))
* Bump regex from 1.10.3 to 1.10.4. ([\#17028](https://github.com/element-hq/synapse/issues/17028))
* Bump ruff from 0.3.2 to 0.3.5. ([\#17060](https://github.com/element-hq/synapse/issues/17060))
* Bump serde_json from 1.0.114 to 1.0.115. ([\#17041](https://github.com/element-hq/synapse/issues/17041))
* Bump types-pillow from 10.2.0.20240125 to 10.2.0.20240406. ([\#17061](https://github.com/element-hq/synapse/issues/17061))
* Bump types-requests from 2.31.0.20240125 to 2.31.0.20240406. ([\#17063](https://github.com/element-hq/synapse/issues/17063))
* Bump typing-extensions from 4.9.0 to 4.11.0. ([\#17062](https://github.com/element-hq/synapse/issues/17062))
# Synapse 1.104.0 (2024-04-02)
### Bugfixes
- Fix regression when using OIDC provider. Introduced in v1.104.0rc1. ([\#17031](https://github.com/element-hq/synapse/issues/17031))
# Synapse 1.104.0rc1 (2024-03-26)
### Features
- Add an OIDC config to specify extra parameters for the authorization grant URL. It can be useful to pass an ACR value, for example. ([\#16971](https://github.com/element-hq/synapse/issues/16971))
- Add support for OIDC provider returning JWT. ([\#16972](https://github.com/element-hq/synapse/issues/16972), [\#17031](https://github.com/element-hq/synapse/issues/17031))
### Bugfixes
- Fix a bug which meant that, under certain circumstances, we might never retry sending events or to-device messages over federation after a failure. ([\#16925](https://github.com/element-hq/synapse/issues/16925))
- Fix various long-standing bugs which could cause incorrect state to be returned from `/sync` in certain situations. ([\#16949](https://github.com/element-hq/synapse/issues/16949))
- Fix case in which `m.fully_read` marker would not get updated. Contributed by @SpiritCroc. ([\#16990](https://github.com/element-hq/synapse/issues/16990))
- Fix bug which did not retract a user's pending knocks at rooms when their account was deactivated. Contributed by @hanadi92. ([\#17010](https://github.com/element-hq/synapse/issues/17010))
### Updates to the Docker image
- Updated `start.py` to generate config using the correct user ID when running as root (fixes [\#16824](https://github.com/element-hq/synapse/issues/16824), [\#15202](https://github.com/element-hq/synapse/issues/15202)). ([\#16978](https://github.com/element-hq/synapse/issues/16978))
### Improved Documentation
- Add a query to force a refresh of a remote user's device list to the "Useful SQL for Admins" documentation page. ([\#16892](https://github.com/element-hq/synapse/issues/16892))
- Minor grammatical corrections to the upgrade documentation. ([\#16965](https://github.com/element-hq/synapse/issues/16965))
- Fix the sort order for the documentation version picker, so that newer releases appear above older ones. ([\#16966](https://github.com/element-hq/synapse/issues/16966))
- Remove recommendation for a specific poetry version from contributing guide. ([\#17002](https://github.com/element-hq/synapse/issues/17002))
### Internal Changes
- Improve lock performance when a lot of locks are all waiting for a single lock to be released. ([\#16840](https://github.com/element-hq/synapse/issues/16840))
- Update power level default for public rooms. ([\#16907](https://github.com/element-hq/synapse/issues/16907))
- Improve event validation. ([\#16908](https://github.com/element-hq/synapse/issues/16908))
- Multi-worker-docker-container: disable log buffering. ([\#16919](https://github.com/element-hq/synapse/issues/16919))
- Refactor state delta calculation in `/sync` handler. ([\#16929](https://github.com/element-hq/synapse/issues/16929))
- Clarify docs for some room state functions. ([\#16950](https://github.com/element-hq/synapse/issues/16950))
- Specify IP subnets in canonical form. ([\#16953](https://github.com/element-hq/synapse/issues/16953))
- As was done for the SAML mapping provider, pass the module API to the OIDC one so the mapper can do more logic in its code. ([\#16974](https://github.com/element-hq/synapse/issues/16974))
- Allow containers building on top of Synapse's Complement container to use the included PostgreSQL cluster. ([\#16985](https://github.com/element-hq/synapse/issues/16985))
- Raise poetry-core version cap to 1.9.0. ([\#16986](https://github.com/element-hq/synapse/issues/16986))
- Patch the db conn pool sooner in tests. ([\#17017](https://github.com/element-hq/synapse/issues/17017))
### Updates to locked dependencies
* Bump anyhow from 1.0.80 to 1.0.81. ([\#17009](https://github.com/element-hq/synapse/issues/17009))
* Bump black from 23.10.1 to 24.2.0. ([\#16936](https://github.com/element-hq/synapse/issues/16936))
* Bump cryptography from 41.0.7 to 42.0.5. ([\#16958](https://github.com/element-hq/synapse/issues/16958))
* Bump dawidd6/action-download-artifact from 3.1.1 to 3.1.2. ([\#16960](https://github.com/element-hq/synapse/issues/16960))
* Bump dawidd6/action-download-artifact from 3.1.2 to 3.1.4. ([\#17008](https://github.com/element-hq/synapse/issues/17008))
* Bump jinja2 from 3.1.2 to 3.1.3. ([\#17005](https://github.com/element-hq/synapse/issues/17005))
* Bump log from 0.4.20 to 0.4.21. ([\#16977](https://github.com/element-hq/synapse/issues/16977))
* Bump mypy from 1.5.1 to 1.8.0. ([\#16901](https://github.com/element-hq/synapse/issues/16901))
* Bump netaddr from 0.9.0 to 1.2.1. ([\#17006](https://github.com/element-hq/synapse/issues/17006))
* Bump pydantic from 2.6.0 to 2.6.4. ([\#17004](https://github.com/element-hq/synapse/issues/17004))
* Bump pyo3 from 0.20.2 to 0.20.3. ([\#16962](https://github.com/element-hq/synapse/issues/16962))
* Bump ruff from 0.1.14 to 0.3.2. ([\#16994](https://github.com/element-hq/synapse/issues/16994))
* Bump serde from 1.0.196 to 1.0.197. ([\#16963](https://github.com/element-hq/synapse/issues/16963))
* Bump serde_json from 1.0.113 to 1.0.114. ([\#16961](https://github.com/element-hq/synapse/issues/16961))
* Bump types-jsonschema from 4.21.0.20240118 to 4.21.0.20240311. ([\#17007](https://github.com/element-hq/synapse/issues/17007))
* Bump types-psycopg2 from 2.9.21.16 to 2.9.21.20240311. ([\#16995](https://github.com/element-hq/synapse/issues/16995))
* Bump types-pyopenssl from 23.3.0.0 to 24.0.0.20240311. ([\#17003](https://github.com/element-hq/synapse/issues/17003))
# Synapse 1.103.0 (2024-03-19)
No significant changes since 1.103.0rc1.
# Synapse 1.103.0rc1 (2024-03-12)
### Features
- Add a new [List Accounts v3](https://element-hq.github.io/synapse/v1.103/admin_api/user_admin_api.html#list-accounts-v3) Admin API with improved deactivated user filtering capabilities. ([\#16874](https://github.com/element-hq/synapse/issues/16874))
- Include `Retry-After` header by default per [MSC4041](https://github.com/matrix-org/matrix-spec-proposals/pull/4041). Contributed by @clokep. ([\#16947](https://github.com/element-hq/synapse/issues/16947))
### Bugfixes
- Fix joining remote rooms when a module uses the `on_new_event` callback. This callback may now pass partial state events instead of the full state for remote rooms. Introduced in v1.76.0. ([\#16973](https://github.com/element-hq/synapse/issues/16973))
- Fix performance issue when joining very large rooms that can cause the server to lock up. Introduced in v1.100.0. Contributed by @ggogel. ([\#16968](https://github.com/element-hq/synapse/issues/16968))
### Improved Documentation
- Add HAProxy example for single port operation to reverse proxy documentation. Contributed by Georg Pfuetzenreuter (@tacerus). ([\#16768](https://github.com/element-hq/synapse/issues/16768))
- Improve the documentation around running Complement tests with new configuration parameters. ([\#16946](https://github.com/element-hq/synapse/issues/16946))
- Add docs on upgrading from a very old version. ([\#16951](https://github.com/element-hq/synapse/issues/16951))
### Updates to locked dependencies
* Bump JasonEtco/create-an-issue from 2.9.1 to 2.9.2. ([\#16934](https://github.com/element-hq/synapse/issues/16934))
* Bump anyhow from 1.0.79 to 1.0.80. ([\#16935](https://github.com/element-hq/synapse/issues/16935))
* Bump dawidd6/action-download-artifact from 3.0.0 to 3.1.1. ([\#16933](https://github.com/element-hq/synapse/issues/16933))
* Bump furo from 2023.9.10 to 2024.1.29. ([\#16939](https://github.com/element-hq/synapse/issues/16939))
* Bump pyopenssl from 23.3.0 to 24.0.0. ([\#16937](https://github.com/element-hq/synapse/issues/16937))
* Bump types-netaddr from 0.10.0.20240106 to 1.2.0.20240219. ([\#16938](https://github.com/element-hq/synapse/issues/16938))
# Synapse 1.102.0 (2024-03-05)
### Bugfixes
- Revert https://github.com/element-hq/synapse/pull/16756, which caused incorrect notification counts on mobile clients since v1.100.0. ([\#16979](https://github.com/element-hq/synapse/issues/16979))
# Synapse 1.102.0rc1 (2024-02-20)
### Features
- A metric was added for emails sent by Synapse, broken down by type: `synapse_emails_sent_total`. Contributed by Remi Rampin. ([\#16881](https://github.com/element-hq/synapse/issues/16881))
### Bugfixes
- Do not send multiple concurrent requests for keys for the same server. ([\#16894](https://github.com/element-hq/synapse/issues/16894))
- Fix performance issue when joining very large rooms that can cause the server to lock up. Introduced in v1.100.0. ([\#16903](https://github.com/element-hq/synapse/issues/16903))
- Always prefer unthreaded receipt when >1 exist ([MSC4102](https://github.com/matrix-org/matrix-spec-proposals/pull/4102)). ([\#16927](https://github.com/element-hq/synapse/issues/16927))
### Improved Documentation
- Fix a small typo in the Rooms section of the Admin API documentation. Contributed by @RainerZufall187. ([\#16857](https://github.com/element-hq/synapse/issues/16857))
### Internal Changes
- Don't invalidate the entire event cache when we purge history. ([\#16905](https://github.com/element-hq/synapse/issues/16905))
- Add experimental config option to not send device list updates for specific users. ([\#16909](https://github.com/element-hq/synapse/issues/16909))
- Fix incorrect docker hub link in release script. ([\#16910](https://github.com/element-hq/synapse/issues/16910))
### Updates to locked dependencies
* Bump attrs from 23.1.0 to 23.2.0. ([\#16899](https://github.com/element-hq/synapse/issues/16899))
* Bump bcrypt from 4.0.1 to 4.1.2. ([\#16900](https://github.com/element-hq/synapse/issues/16900))
* Bump pygithub from 2.1.1 to 2.2.0. ([\#16902](https://github.com/element-hq/synapse/issues/16902))
* Bump sentry-sdk from 1.40.0 to 1.40.3. ([\#16898](https://github.com/element-hq/synapse/issues/16898))
# Synapse 1.101.0 (2024-02-13)
### Bugfixes
- Fix performance regression when fetching auth chains from the DB. Introduced in v1.100.0. ([\#16893](https://github.com/element-hq/synapse/issues/16893))
# Synapse 1.101.0rc1 (2024-02-06)
### Improved Documentation
- Fix broken links in the documentation. ([\#16853](https://github.com/element-hq/synapse/issues/16853))
- Update MacOS installation instructions to mention that libicu is optional. ([\#16854](https://github.com/element-hq/synapse/issues/16854))
- The version picker now correctly lists versions after `v1.98.0`. ([\#16880](https://github.com/element-hq/synapse/issues/16880))
### Internal Changes
- Add support for stabilised [MSC3981](https://github.com/matrix-org/matrix-spec-proposals/pull/3981) that adds a `recurse` parameter on the `/relations` API. ([\#16842](https://github.com/element-hq/synapse/issues/16842))
### Updates to locked dependencies
* Bump dorny/paths-filter from 2 to 3. ([\#16869](https://github.com/element-hq/synapse/issues/16869))
* Bump gitpython from 3.1.40 to 3.1.41. ([\#16850](https://github.com/element-hq/synapse/issues/16850))
* Bump hiredis from 2.2.3 to 2.3.2. ([\#16862](https://github.com/element-hq/synapse/issues/16862))
* Bump jsonschema from 4.20.0 to 4.21.1. ([\#16887](https://github.com/element-hq/synapse/issues/16887))
* Bump lxml-stubs from 0.4.0 to 0.5.1. ([\#16885](https://github.com/element-hq/synapse/issues/16885))
* Bump mypy-zope from 1.0.1 to 1.0.3. ([\#16865](https://github.com/element-hq/synapse/issues/16865))
* Bump phonenumbers from 8.13.26 to 8.13.29. ([\#16868](https://github.com/element-hq/synapse/issues/16868))
* Bump pydantic from 2.5.3 to 2.6.0. ([\#16888](https://github.com/element-hq/synapse/issues/16888))
* Bump sentry-sdk from 1.39.1 to 1.40.0. ([\#16889](https://github.com/element-hq/synapse/issues/16889))
* Bump serde from 1.0.195 to 1.0.196. ([\#16867](https://github.com/element-hq/synapse/issues/16867))
* Bump serde_json from 1.0.111 to 1.0.113. ([\#16866](https://github.com/element-hq/synapse/issues/16866))
* Bump sigstore/cosign-installer from 3.3.0 to 3.4.0. ([\#16890](https://github.com/element-hq/synapse/issues/16890))
* Bump types-pillow from 10.1.0.2 to 10.2.0.20240125. ([\#16864](https://github.com/element-hq/synapse/issues/16864))
* Bump types-requests from 2.31.0.10 to 2.31.0.20240125. ([\#16886](https://github.com/element-hq/synapse/issues/16886))
* Bump types-setuptools from 69.0.0.0 to 69.0.0.20240125. ([\#16863](https://github.com/element-hq/synapse/issues/16863))
# Synapse 1.100.0 (2024-01-30)
No significant changes since 1.100.0rc3.
# Synapse 1.100.0rc3 (2024-01-24)
### Bugfixes

Cargo.lock generated
View File

@@ -13,9 +13,9 @@ dependencies = [
[[package]]
name = "anyhow"
version = "1.0.79"
version = "1.0.82"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "080e9890a082662b09c1ad45f567faeeb47f22b5fb23895fbe1e651e718e25ca"
checksum = "f538837af36e6f6a9be0faa67f9a314f8119e4e4b5867c6ab40ed60360142519"
[[package]]
name = "arc-swap"
@@ -29,6 +29,12 @@ version = "1.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d468802bab17cbc0cc575e9b053f41e72aa36bfa6b7f55e3529ffa43161b97fa"
[[package]]
name = "base64"
version = "0.21.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9d297deb1925b89f2ccc13d7635fa0714f12c87adce1c75356b39ca9b7178567"
[[package]]
name = "bitflags"
version = "1.3.2"
@@ -53,12 +59,27 @@ dependencies = [
"generic-array",
]
[[package]]
name = "bytes"
version = "1.6.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "514de17de45fdb8dc022b1a7975556c53c86f9f0aa5f534b98977b171857c2c9"
[[package]]
name = "cfg-if"
version = "1.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "baf1de4339761588bc0619e3cbc0120ee582ebb74b53b4efbf79117bd2da40fd"
[[package]]
name = "cpufeatures"
version = "0.2.12"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "53fe5e26ff1b7aef8bca9c6080520cfb8d9333c7568e1829cef191a9723e5504"
dependencies = [
"libc",
]
[[package]]
name = "crypto-common"
version = "0.1.6"
@@ -80,6 +101,12 @@ dependencies = [
"subtle",
]
[[package]]
name = "fnv"
version = "1.0.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3f9eec918d3f24069decb9af1554cad7c880e2da24a9afd88aca000531ab82c1"
[[package]]
name = "generic-array"
version = "0.14.6"
@@ -90,6 +117,30 @@ dependencies = [
"version_check",
]
[[package]]
name = "headers"
version = "0.4.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "322106e6bd0cba2d5ead589ddb8150a13d7c4217cf80d7c4f682ca994ccc6aa9"
dependencies = [
"base64",
"bytes",
"headers-core",
"http",
"httpdate",
"mime",
"sha1",
]
[[package]]
name = "headers-core"
version = "0.3.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "54b4a22553d4242c49fddb9ba998a99962b5cc6f22cb5a3482bec22522403ce4"
dependencies = [
"http",
]
[[package]]
name = "heck"
version = "0.4.1"
@@ -102,6 +153,23 @@ version = "0.4.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7f24254aa9a54b5c858eaee2f5bccdb46aaf0e486a595ed5fd8f86ba55232a70"
[[package]]
name = "http"
version = "1.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "21b9ddb458710bc376481b842f5da65cdf31522de232c1ca8146abce2a358258"
dependencies = [
"bytes",
"fnv",
"itoa",
]
[[package]]
name = "httpdate"
version = "1.0.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "df3b46402a9d5adb4c86a0cf463f42e19994e3ee891101b1841f30a545cb49a9"
[[package]]
name = "indoc"
version = "2.0.4"
@@ -122,9 +190,9 @@ checksum = "e2abad23fbc42b3700f2f279844dc832adb2b2eb069b2df918f455c4e18cc646"
[[package]]
name = "libc"
version = "0.2.135"
version = "0.2.153"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "68783febc7782c6c5cb401fbda4de5a9898be1762314da0bb2c10ced61f18b0c"
checksum = "9c198f91728a82281a64e1f4f9eeb25d82cb32a5de251c6bd1b5154d63a8e7bd"
[[package]]
name = "lock_api"
@@ -138,9 +206,9 @@ dependencies = [
[[package]]
name = "log"
version = "0.4.20"
version = "0.4.21"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b5e6163cb8c49088c2c36f57875e58ccd8c87c7427f7fbd50ea6710b2f3f2e8f"
checksum = "90ed8c1e510134f979dbc4f070f87d4313098b704861a105fe34231c70a3901c"
[[package]]
name = "memchr"
@@ -157,6 +225,12 @@ dependencies = [
"autocfg",
]
[[package]]
name = "mime"
version = "0.3.17"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6877bb514081ee2a7ff5ef9de3281f14a4dd4bceac4c09388074a6b5df8a139a"
[[package]]
name = "once_cell"
version = "1.15.0"
@@ -186,6 +260,12 @@ dependencies = [
"windows-sys",
]
[[package]]
name = "portable-atomic"
version = "1.6.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7170ef9988bc169ba16dd36a7fa041e5c4cbeb6a35b76d4c03daded371eae7c0"
[[package]]
name = "proc-macro2"
version = "1.0.76"
@@ -197,9 +277,9 @@ dependencies = [
[[package]]
name = "pyo3"
version = "0.20.2"
version = "0.20.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9a89dc7a5850d0e983be1ec2a463a171d20990487c3cfcd68b5363f1ee3d6fe0"
checksum = "53bdbb96d49157e65d45cc287af5f32ffadd5f4761438b527b055fb0d4bb8233"
dependencies = [
"anyhow",
"cfg-if",
@@ -207,6 +287,7 @@ dependencies = [
"libc",
"memoffset",
"parking_lot",
"portable-atomic",
"pyo3-build-config",
"pyo3-ffi",
"pyo3-macros",
@@ -215,9 +296,9 @@ dependencies = [
[[package]]
name = "pyo3-build-config"
version = "0.20.2"
version = "0.20.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "07426f0d8fe5a601f26293f300afd1a7b1ed5e78b2a705870c5f30893c5163be"
checksum = "deaa5745de3f5231ce10517a1f5dd97d53e5a2fd77aa6b5842292085831d48d7"
dependencies = [
"once_cell",
"target-lexicon",
@@ -225,9 +306,9 @@ dependencies = [
[[package]]
name = "pyo3-ffi"
version = "0.20.2"
version = "0.20.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "dbb7dec17e17766b46bca4f1a4215a85006b4c2ecde122076c562dd058da6cf1"
checksum = "62b42531d03e08d4ef1f6e85a2ed422eb678b8cd62b762e53891c05faf0d4afa"
dependencies = [
"libc",
"pyo3-build-config",
@@ -246,9 +327,9 @@ dependencies = [
[[package]]
name = "pyo3-macros"
version = "0.20.2"
version = "0.20.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "05f738b4e40d50b5711957f142878cfa0f28e054aa0ebdfc3fd137a843f74ed3"
checksum = "7305c720fa01b8055ec95e484a6eca7a83c841267f0dd5280f0c8b8551d2c158"
dependencies = [
"proc-macro2",
"pyo3-macros-backend",
@@ -258,12 +339,13 @@ dependencies = [
[[package]]
name = "pyo3-macros-backend"
version = "0.20.2"
version = "0.20.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0fc910d4851847827daf9d6cdd4a823fbdaab5b8818325c5e97a86da79e8881f"
checksum = "7c7e9b68bb9c3149c5b0cade5d07f953d6d125eb4337723c4ccdb665f1f96185"
dependencies = [
"heck",
"proc-macro2",
"pyo3-build-config",
"quote",
"syn",
]
@@ -298,9 +380,9 @@ dependencies = [
[[package]]
name = "regex"
version = "1.10.3"
version = "1.10.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b62dbe01f0b06f9d8dc7d49e05a0785f153b00b2c227856282f671e0318c9b15"
checksum = "c117dbdfde9c8308975b6a18d71f3f385c89461f7b3fb054288ecf2a2058ba4c"
dependencies = [
"aho-corasick",
"memchr",
@@ -339,18 +421,18 @@ checksum = "d29ab0c6d3fc0ee92fe66e2d99f700eab17a8d57d1c1d3b748380fb20baa78cd"
[[package]]
name = "serde"
version = "1.0.195"
version = "1.0.197"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "63261df402c67811e9ac6def069e4786148c4563f4b50fd4bf30aa370d626b02"
checksum = "3fb1c873e1b9b056a4dc4c0c198b24c3ffa059243875552b2bd0933b1aee4ce2"
dependencies = [
"serde_derive",
]
[[package]]
name = "serde_derive"
version = "1.0.195"
version = "1.0.197"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "46fe8f8603d81ba86327b23a2e9cdf49e1255fb94a4c5f297f6ee0547178ea2c"
checksum = "7eb0b34b42edc17f6b7cac84a52a1c5f0e1bb2227e997ca9011ea3dd34e8610b"
dependencies = [
"proc-macro2",
"quote",
@@ -359,15 +441,26 @@ dependencies = [
[[package]]
name = "serde_json"
version = "1.0.111"
version = "1.0.115"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "176e46fa42316f18edd598015a5166857fc835ec732f5215eac6b7bdbf0a84f4"
checksum = "12dc5c46daa8e9fdf4f5e71b6cf9a53f2487da0e86e55808e2d35539666497dd"
dependencies = [
"itoa",
"ryu",
"serde",
]
[[package]]
name = "sha1"
version = "0.10.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f04293dc80c3993519f2d7f6f511707ee7094fe0c6d3406feb330cdb3540eba3"
dependencies = [
"cfg-if",
"cpufeatures",
"digest",
]
[[package]]
name = "smallvec"
version = "1.10.0"
@@ -397,7 +490,10 @@ version = "0.1.0"
dependencies = [
"anyhow",
"blake2",
"bytes",
"headers",
"hex",
"http",
"lazy_static",
"log",
"pyo3",

changelog.d/16920.bugfix Normal file
View File

@@ -0,0 +1 @@
Adds validation to ensure that the `limit` parameter on `/publicRooms` is non-negative.

changelog.d/16923.bugfix Normal file
View File

@@ -0,0 +1 @@
Return `400 M_NOT_JSON` upon receiving invalid JSON in query parameters across various client and admin endpoints, rather than an internal server error.

changelog.d/16943.bugfix Normal file
View File

@@ -0,0 +1 @@
Make the CSAPI endpoint `/keys/device_signing/upload` idempotent.

changelog.d/17032.misc Normal file
View File

@@ -0,0 +1 @@
Use new receipts column to optimise receipt and push action SQL queries. Contributed by Nick @ Beeper (@fizzadar).

changelog.d/17036.misc Normal file
View File

@@ -0,0 +1 @@
Fix mypy with latest Twisted release.

changelog.d/17069.doc Normal file
View File

@@ -0,0 +1 @@
Add a prompt in the contributing guide to manually configure icu4c.

changelog.d/17079.misc Normal file
View File

@@ -0,0 +1 @@
Bump minimum supported Rust version to 1.66.0.

changelog.d/17081.misc Normal file
View File

@@ -0,0 +1 @@
Add helpers to transform Twisted requests to Rust http Requests/Responses.

View File

@@ -0,0 +1 @@
Support delegating the rendezvous mechanism described in MSC4108 to an external implementation.

changelog.d/17096.misc Normal file
View File

@@ -0,0 +1 @@
Use new receipts column to optimise receipt and push action SQL queries. Contributed by Nick @ Beeper (@fizzadar).

changelog.d/17099.doc Normal file
View File

@@ -0,0 +1 @@
Clarify what part of message retention is still experimental.

changelog.d/17115.misc Normal file
View File

@@ -0,0 +1 @@
`complement.sh`: run tests from all test packages.

debian/changelog vendored
View File

@@ -1,3 +1,69 @@
matrix-synapse-py3 (1.105.0) stable; urgency=medium
* New Synapse release 1.105.0.
-- Synapse Packaging team <packages@matrix.org> Tue, 16 Apr 2024 15:53:23 +0100
matrix-synapse-py3 (1.105.0~rc1) stable; urgency=medium
* New Synapse release 1.105.0rc1.
-- Synapse Packaging team <packages@matrix.org> Thu, 11 Apr 2024 12:15:49 +0100
matrix-synapse-py3 (1.104.0) stable; urgency=medium
* New Synapse release 1.104.0.
-- Synapse Packaging team <packages@matrix.org> Tue, 02 Apr 2024 17:15:45 +0100
matrix-synapse-py3 (1.104.0~rc1) stable; urgency=medium
* New Synapse release 1.104.0rc1.
-- Synapse Packaging team <packages@matrix.org> Tue, 26 Mar 2024 11:48:58 +0000
matrix-synapse-py3 (1.103.0) stable; urgency=medium
* New Synapse release 1.103.0.
-- Synapse Packaging team <packages@matrix.org> Tue, 19 Mar 2024 12:24:36 +0000
matrix-synapse-py3 (1.103.0~rc1) stable; urgency=medium
* New Synapse release 1.103.0rc1.
-- Synapse Packaging team <packages@matrix.org> Tue, 12 Mar 2024 15:02:56 +0000
matrix-synapse-py3 (1.102.0) stable; urgency=medium
* New Synapse release 1.102.0.
-- Synapse Packaging team <packages@matrix.org> Tue, 05 Mar 2024 14:47:03 +0000
matrix-synapse-py3 (1.102.0~rc1) stable; urgency=medium
* New Synapse release 1.102.0rc1.
-- Synapse Packaging team <packages@matrix.org> Tue, 20 Feb 2024 15:50:36 +0000
matrix-synapse-py3 (1.101.0) stable; urgency=medium
* New Synapse release 1.101.0.
-- Synapse Packaging team <packages@matrix.org> Tue, 13 Feb 2024 10:45:35 +0000
matrix-synapse-py3 (1.101.0~rc1) stable; urgency=medium
* New Synapse release 1.101.0rc1.
-- Synapse Packaging team <packages@matrix.org> Tue, 06 Feb 2024 16:02:02 +0000
matrix-synapse-py3 (1.100.0) stable; urgency=medium
* New Synapse release 1.100.0.
-- Synapse Packaging team <packages@matrix.org> Tue, 30 Jan 2024 16:58:19 +0000
matrix-synapse-py3 (1.100.0~rc3) stable; urgency=medium
* New Synapse release 1.100.0rc3.

View File

@@ -30,3 +30,14 @@ Consult `scripts-dev/complement.sh` in the repository root for a real example.
[complement]: https://github.com/matrix-org/complement
[complementEnv]: https://github.com/matrix-org/complement/pull/382
## How to modify homeserver.yaml for Complement tests
It's common for MSCs to be gated behind a feature flag like this:
```yaml
experimental_features:
faster_joins: true
```
To modify this for the Complement image, modify `./conf/workers-shared-extra.yaml.j2`. Despite the name,
this will affect non-worker mode as well. Remember to _rebuild_ the image (so don't use `-e` if using
`complement.sh`).

View File

@@ -1,7 +1,7 @@
[program:postgres]
command=/usr/local/bin/prefix-log gosu postgres postgres
# Only start if START_POSTGRES=1
# Only start if START_POSTGRES=true
autostart=%(ENV_START_POSTGRES)s
# Lower priority number = starts first

View File

@@ -32,8 +32,9 @@ case "$SYNAPSE_COMPLEMENT_DATABASE" in
;;
sqlite|"")
# Configure supervisord not to start Postgres, as we don't need it
export START_POSTGRES=false
# Set START_POSTGRES to false unless it has already been set
# (i.e. by another container image inheriting our own).
export START_POSTGRES=${START_POSTGRES:-false}
;;
*)

View File

@@ -102,6 +102,8 @@ experimental_features:
msc3391_enabled: true
# Filtering /messages by relation type.
msc3874_enabled: true
# no UIA for x-signing upload for the first time
msc3967_enabled: true
server_notices:
system_mxid_localpart: _server

View File

@@ -310,6 +310,13 @@ WORKERS_CONFIG: Dict[str, Dict[str, Any]] = {
"shared_extra_conf": {},
"worker_extra_conf": "",
},
"push_rules": {
"app": "synapse.app.generic_worker",
"listener_resources": ["client", "replication"],
"endpoint_patterns": ["^/_matrix/client/(api/v1|r0|v3|unstable)/pushrules/"],
"shared_extra_conf": {},
"worker_extra_conf": "",
},
}
# Templates for sections that may be inserted multiple times in config files
@@ -401,6 +408,7 @@ def add_worker_roles_to_shared_config(
"receipts",
"to_device",
"typing",
"push_rules",
]
# Worker-type specific sharding config. Now a single worker can fulfill multiple

View File

@@ -7,6 +7,9 @@
# prefix-log command [args...]
#
exec 1> >(awk '{print "'"${SUPERVISOR_PROCESS_NAME}"' | "$0}' >&1)
exec 2> >(awk '{print "'"${SUPERVISOR_PROCESS_NAME}"' | "$0}' >&2)
# '-W interactive' is a `mawk` extension which disables buffering on stdout and sets line-buffered reads on
# stdin. The effect is that the output is flushed after each line, rather than being batched, which helps reduce
# confusion due to interleaving of the different processes.
exec 1> >(awk -W interactive '{print "'"${SUPERVISOR_PROCESS_NAME}"' | "$0 }' >&1)
exec 2> >(awk -W interactive '{print "'"${SUPERVISOR_PROCESS_NAME}"' | "$0 }' >&2)
exec "$@"

View File

@@ -160,11 +160,6 @@ def run_generate_config(environ: Mapping[str, str], ownership: Optional[str]) ->
config_path = environ.get("SYNAPSE_CONFIG_PATH", config_dir + "/homeserver.yaml")
data_dir = environ.get("SYNAPSE_DATA_DIR", "/data")
if ownership is not None:
# make sure that synapse has perms to write to the data dir.
log(f"Setting ownership on {data_dir} to {ownership}")
subprocess.run(["chown", ownership, data_dir], check=True)
# create a suitable log config from our template
log_config_file = "%s/%s.log.config" % (config_dir, server_name)
if not os.path.exists(log_config_file):
@@ -189,9 +184,15 @@ def run_generate_config(environ: Mapping[str, str], ownership: Optional[str]) ->
"--generate-config",
"--open-private-ports",
]
if ownership is not None:
# make sure that synapse has perms to write to the data dir.
log(f"Setting ownership on {data_dir} to {ownership}")
subprocess.run(["chown", ownership, data_dir], check=True)
args = ["gosu", ownership] + args
# log("running %s" % (args, ))
flush_buffers()
os.execv(sys.executable, args)
subprocess.run(args, check=True)
def main(args: List[str], environ: MutableMapping[str, str]) -> None:

View File

@@ -913,7 +913,7 @@ With all that being said, if you still want to try and recover the room:
them handle rejoining themselves.
4. If `new_room_user_id` was given, a 'Content Violation' will have been
created. Consider whether you want to delete that roomm.
created. Consider whether you want to delete that room.
# Make Room Admin API

View File

@@ -164,6 +164,7 @@ Body parameters:
Other allowed options are: `bot` and `support`.
## List Accounts
### List Accounts (V2)
This API returns all local user accounts.
By default, the response is ordered by ascending user ID.
@@ -287,6 +288,19 @@ The following fields are returned in the JSON response body:
*Added in Synapse 1.93:* the `locked` query parameter and response field.
### List Accounts (V3)
This API returns all local user accounts (see v2). In contrast to v2, the query parameter `deactivated` is handled differently.
```
GET /_synapse/admin/v3/users
```
**Parameters**
- `deactivated` - Optional flag to filter deactivated users. If `true`, only deactivated users are returned.
If `false`, deactivated users are excluded from the query. When the flag is absent (the default),
users are not filtered by deactivation status.
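For illustration only (not part of the change itself), a hypothetical admin script might call the v3 endpoint like this, assuming the homeserver listens on `localhost:8008`, that `ADMIN_TOKEN` holds a server admin's access token, and that the response shape matches the v2 API (a `users` array plus pagination fields):
```python
import os

import requests

# Hypothetical example: list only deactivated local accounts via the v3 Admin API.
resp = requests.get(
    "http://localhost:8008/_synapse/admin/v3/users",
    params={"deactivated": "true"},
    headers={"Authorization": f"Bearer {os.environ['ADMIN_TOKEN']}"},
    timeout=10,
)
resp.raise_for_status()
for user in resp.json().get("users", []):
    print(user["name"])
```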
## Query current sessions for a user
This API returns information about the active sessions for a specific user.

View File

@@ -68,7 +68,7 @@ Of their installation methods, we recommend
```shell
pip install --user pipx
pipx install poetry==1.5.1 # Problems with Poetry 1.6, see https://github.com/matrix-org/synapse/issues/16147
pipx install poetry
```
but see poetry's [installation instructions](https://python-poetry.org/docs/#installation)
@@ -86,6 +86,8 @@ poetry install --extras all
This will install the runtime and developer dependencies for the project. Be sure to check
that the `poetry install` step completed cleanly.
For OSX users, be sure to set `PKG_CONFIG_PATH` to support `icu4c`. Run `brew info icu4c` for more details.
## Running Synapse via poetry
To start a local instance of Synapse in the locked poetry environment, create a config file:
@@ -329,7 +331,7 @@ This configuration should generally cover your needs.
- To run with Postgres, supply the `-e POSTGRES=1 -e MULTI_POSTGRES=1` environment flags.
- To run with Synapse in worker mode, supply the `-e WORKERS=1 -e REDIS=1` environment flags (in addition to the Postgres flags).
For more details about other configurations, see the [Docker-specific documentation in the SyTest repo](https://github.com/vector-im/sytest/blob/develop/docker/README.md).
For more details about other configurations, see the [Docker-specific documentation in the SyTest repo](https://github.com/matrix-org/sytest/blob/develop/docker/README.md).
## Run the integration tests ([Complement](https://github.com/matrix-org/complement)).

View File

@@ -7,8 +7,10 @@ follow the semantics described in
and allow server and room admins to configure how long messages should
be kept in a homeserver's database before being purged from it.
**Please note that, as this feature isn't part of the Matrix
specification yet, this implementation is to be considered as
experimental.**
specification yet, the use of `m.room.retention` events for per-room
retention policies is to be considered as experimental. However, the use
of a default message retention policy is considered a stable feature
in Synapse.**
A message retention policy is mainly defined by its `max_lifetime`
parameter, which defines how long a message can be kept around after

View File

@@ -142,6 +142,10 @@ Called after sending an event into a room. The module is passed the event, as we
as the state of the room _after_ the event. This means that if the event is a state event,
it will be included in this state.
The state map may not be complete if Synapse hasn't yet loaded the full state
of the room. This can happen for events in rooms that were just joined from
a remote server.
Note that this callback is called when the event has already been processed and stored
into the room, which means this callback cannot be used to deny persisting the event. To
deny an incoming event, see [`check_event_for_spam`](spam_checker_callbacks.md#check_event_for_spam) instead.
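As a minimal sketch of a module consuming this callback (assuming the standard `register_third_party_rules_callbacks` registration path; the module name and logic here are hypothetical):
```python
from typing import Any, Dict

from synapse.module_api import ModuleApi


class ExampleModule:
    def __init__(self, config: Dict[str, Any], api: ModuleApi):
        # Run our handler after each event has been sent into a room.
        api.register_third_party_rules_callbacks(on_new_event=self.on_new_event)

    async def on_new_event(self, event, state_events) -> None:
        # state_events maps (event_type, state_key) -> event, and may be partial
        # for rooms that were just joined from a remote server, so don't assume
        # any particular state event is present.
        name_event = state_events.get(("m.room.name", ""))
        room_name = name_event.content.get("name") if name_event else "<unknown>"
        print(f"{event.event_id} sent in {event.room_id} ({room_name})")
```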

View File

@@ -12,7 +12,7 @@ This is the main reason people have a poor matrix experience on resource constra
While synapse does have some performance issues with presence [#3971](https://github.com/matrix-org/synapse/issues/3971), the fundamental problem is that this is an easy feature to implement for a centralised service at nearly no overhead, but federation makes it combinatorial [#8055](https://github.com/matrix-org/synapse/issues/8055). There is also a client-side config option which disables the UI and idle tracking [enable_presence_by_hs_url] to blacklist the largest instances but I didn't notice much difference, so I recommend disabling the feature entirely at the server level as well.
[enable_presence_by_hs_url]: https://github.com/vector-im/element-web/blob/v1.7.8/config.sample.json#L45
[enable_presence_by_hs_url]: https://github.com/element-hq/element-web/blob/v1.7.8/config.sample.json#L45
### Joining

View File

@@ -182,7 +182,7 @@ synapse_port_db --sqlite-database homeserver.db.snapshot \
--postgres-config homeserver-postgres.yaml
```
The flag `--curses` displays a coloured curses progress UI.
The flag `--curses` displays a coloured curses progress UI. (NOTE: if your terminal is too small the script will error out)
If the script took a long time to complete, or time has otherwise passed
since the original snapshot was taken, repeat the previous steps with a

View File

@@ -186,6 +186,25 @@ Example configuration, if using a UNIX socket. The configuration lines regarding
backend matrix
server matrix unix@/run/synapse/main_public.sock
```
Example configuration when using a single port for both client and federation traffic.
```
frontend https
bind *:443,[::]:443 ssl crt /etc/ssl/haproxy/ strict-sni alpn h2,http/1.1
http-request set-header X-Forwarded-Proto https if { ssl_fc }
http-request set-header X-Forwarded-Proto http if !{ ssl_fc }
http-request set-header X-Forwarded-For %[src]
acl matrix-host hdr(host) -i matrix.example.com matrix.example.com:443
acl matrix-sni ssl_fc_sni matrix.example.com
acl matrix-path path_beg /_matrix
acl matrix-path path_beg /_synapse/client
use_backend matrix if matrix-host matrix-path
use_backend matrix if matrix-sni
backend matrix
server matrix 127.0.0.1:8008
```
[Delegation](delegate.md) example:
```

View File

@@ -26,7 +26,7 @@ for most users.
#### Docker images and Ansible playbooks
There is an official synapse image available at
<https://hub.docker.com/r/vectorim/synapse> or at [`ghcr.io/element-hq/synapse`](https://ghcr.io/element-hq/synapse)
<https://hub.docker.com/r/matrixdotorg/synapse> or at [`ghcr.io/element-hq/synapse`](https://ghcr.io/element-hq/synapse)
which can be used with the docker-compose file available at
[contrib/docker](https://github.com/element-hq/synapse/tree/develop/contrib/docker).
Further information on this including configuration options is available in the README
@@ -326,6 +326,17 @@ Some extra dependencies may be needed. You can use Homebrew (https://brew.sh) fo
You may need to install icu, and make the icu binaries and libraries accessible.
Please follow [the official instructions of PyICU](https://pypi.org/project/PyICU/) to do so.
If you're struggling to get icu discovered, and see:
```
RuntimeError:
Please install pkg-config on your system or set the ICU_VERSION environment
variable to the version of ICU you have installed.
```
despite it being installed and having your `PATH` updated, you can omit this dependency by
not specifying `--extras all` to `poetry`. If using postgres, you can install Synapse via
`poetry install --extras saml2 --extras oidc --extras postgres --extras opentracing --extras redis --extras sentry`.
ICU is not a hard dependency for getting a working installation.
On ARM-based Macs you may also need to install libjpeg and libpq:
```sh
brew install jpeg libpq

View File

@@ -136,8 +136,8 @@ This will install and start a systemd service called `coturn`.
NB: If your TLS certificate was provided by Let's Encrypt, TLS/DTLS will
not work with any Matrix client that uses Chromium's WebRTC library. This
currently includes Element Android & iOS; for more details, see their
[respective](https://github.com/vector-im/element-android/issues/1533)
[issues](https://github.com/vector-im/element-ios/issues/2712) as well as the underlying
[respective](https://github.com/element-hq/element-android/issues/1533)
[issues](https://github.com/element-hq/element-ios/issues/2712) as well as the underlying
[WebRTC issue](https://bugs.chromium.org/p/webrtc/issues/detail?id=11710).
Consider using a ZeroSSL certificate for your TURN server as a working alternative.

View File

@@ -137,8 +137,8 @@ must be edited:
NB: If your TLS certificate was provided by Let's Encrypt, TLS/DTLS will
not work with any Matrix client that uses Chromium's WebRTC library. This
currently includes Element Android & iOS; for more details, see their
[respective](https://github.com/vector-im/element-android/issues/1533)
[issues](https://github.com/vector-im/element-ios/issues/2712) as well as the underlying
[respective](https://github.com/element-hq/element-android/issues/1533)
[issues](https://github.com/element-hq/element-ios/issues/2712) as well as the underlying
[WebRTC issue](https://bugs.chromium.org/p/webrtc/issues/detail?id=11710).
Consider using a ZeroSSL certificate for your TURN server as a working alternative.

View File

@@ -50,11 +50,13 @@ comment these options out and use those specified by the module instead.
A custom mapping provider must specify the following methods:
* `def __init__(self, parsed_config)`
* `def __init__(self, parsed_config, module_api)`
- Arguments:
- `parsed_config` - A configuration object that is the return value of the
`parse_config` method. You should set any configuration options needed by
the module here.
- `module_api` - a `synapse.module_api.ModuleApi` object which provides the
stable API available for extension modules.
* `def parse_config(config)`
- This method should have the `@staticmethod` decoration.
- Arguments:

View File

@@ -88,15 +88,35 @@ process, for example:
dpkg -i matrix-synapse-py3_1.3.0+stretch1_amd64.deb
```
Generally Synapse database schemas are compatible across multiple versions, once
a version of Synapse is deployed you may not be able to rollback automatically.
Generally Synapse database schemas are compatible across multiple versions, but once
a version of Synapse is deployed you may not be able to roll back automatically.
The following table gives the version ranges and the earliest version they can
be rolled back to. E.g. Synapse versions v1.58.0 through v1.61.1 can be rolled
back safely to v1.57.0, but starting with v1.62.0 it is only safe to rollback to
back safely to v1.57.0, but starting with v1.62.0 it is only safe to roll back to
v1.61.0.
<!-- REPLACE_WITH_SCHEMA_VERSIONS -->
## Upgrading from a very old version
You need to read all of the upgrade notes for each version between your current
version and the latest so that you can update your dependencies, environment,
config files, etc. if necessary. But you do not need to perform an
upgrade to each individual version that was missed.
We do not have a list of which versions must be installed. Instead, we recommend
that you upgrade through each incompatible database schema version, which would
give you the ability to roll back the maximum number of versions should anything
go wrong. See [Rolling back to older versions](#rolling-back-to-older-versions)
above.
Additionally, new versions of Synapse will occasionally run database migrations
and background updates to update the database. Synapse will not start until
database migrations are complete. You should wait until background updates from
each upgrade are complete before moving on to the next upgrade, to avoid
stacking them up. You can monitor the currently running background updates with
[the Admin API](usage/administration/admin_api/background_updates.html#status).
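For example, a hypothetical upgrade helper could poll the background-updates status Admin API (linked above) between upgrade steps and only continue once nothing is running; the base URL and `ADMIN_TOKEN` environment variable are assumptions:
```python
import os
import time

import requests


def wait_for_background_updates(base_url: str = "http://localhost:8008") -> None:
    """Poll the background updates status endpoint until no updates are running."""
    headers = {"Authorization": f"Bearer {os.environ['ADMIN_TOKEN']}"}
    while True:
        status = requests.get(
            f"{base_url}/_synapse/admin/v1/background_updates/status",
            headers=headers,
            timeout=10,
        ).json()
        if not status.get("current_updates"):
            return  # all background updates have finished
        time.sleep(30)
```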
# Upgrading to v1.100.0
## Minimum supported Rust version

View File

@@ -120,6 +120,11 @@ for file in $source_directory/*; do
done
```
How do I upgrade from a very old version of Synapse to the latest?
---
See [this](../../upgrade.html#upgrading-from-a-very-old-version) section in the
upgrade docs.
Manually resetting passwords
---
Users can reset their password through their client. Alternatively, a server admin

View File

@@ -205,3 +205,12 @@ SELECT user_id, device_id, user_agent, TO_TIMESTAMP(last_seen / 1000) AS "last_s
FROM devices
WHERE last_seen < DATE_PART('epoch', NOW() - INTERVAL '3 month') * 1000;
```
## Clear the cache of a remote user's device list
Forces the resync of a remote user's device list - if you have somehow cached a bad state and the remote server
will not send out a device list update.
```sql
INSERT INTO device_lists_remote_resync
VALUES ('USER_ID', (EXTRACT(epoch FROM NOW()) * 1000)::BIGINT);
```

View File

@@ -3349,6 +3349,9 @@ Options for each entry include:
not included in `scopes`. Set to `userinfo_endpoint` to always use the
userinfo endpoint.
* `additional_authorization_parameters`: String to string dictionary that will be passed as
additional parameters to the authorization grant URL.
* `allow_existing_users`: set to true to allow a user logging in via OIDC to
match a pre-existing account instead of failing. This could be used if
switching from password logins to OIDC. Defaults to false.
@@ -3473,6 +3476,8 @@ oidc_providers:
token_endpoint: "https://accounts.example.com/oauth2/token"
userinfo_endpoint: "https://accounts.example.com/userinfo"
jwks_uri: "https://accounts.example.com/.well-known/jwks.json"
additional_authorization_parameters:
acr_values: 2fa
skip_verification: true
enable_registration: true
user_mapping_provider:

View File

@@ -54,7 +54,7 @@ function fetchVersions(dropdown, dropdownMenu) {
return new Promise((resolve, reject) => {
window.addEventListener("load", () => {
fetch("https://api.github.com/repos/matrix-org/synapse/git/trees/gh-pages", {
fetch("https://api.github.com/repos/element-hq/synapse/git/trees/gh-pages", {
cache: "force-cache",
}).then(res =>
res.json()
@@ -100,10 +100,30 @@ function sortVersions(a, b) {
if (a === 'develop' || a === 'latest') return -1;
if (b === 'develop' || b === 'latest') return 1;
const versionA = (a.match(/v\d+(\.\d+)+/) || [])[0];
const versionB = (b.match(/v\d+(\.\d+)+/) || [])[0];
// If any of the versions do not conform to a semantic version string, they
// will be sorted behind a valid version.
const versionA = (a.match(/v(\d+(\.\d+)+)/) || [])[1]?.split('.') ?? '';
const versionB = (b.match(/v(\d+(\.\d+)+)/) || [])[1]?.split('.') ?? '';
return versionB.localeCompare(versionA);
for (let i = 0; i < Math.max(versionA.length, versionB.length); i++) {
if (versionB[i] === undefined) {
return -1;
}
if (versionA[i] === undefined) {
return 1;
}
const partA = parseInt(versionA[i], 10);
const partB = parseInt(versionB[i], 10);
if (partA > partB) {
return -1;
} else if (partB > partA) {
return 1;
}
}
return 0;
}
/**
@@ -124,4 +144,4 @@ function changeVersion(url, newVersion) {
parsedURL.pathname = pathSegments.join('/');
return parsedURL.href;
}
}
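The new `sortVersions` comparison above boils down to: `develop`/`latest` first, then valid `vX.Y.Z` names newest-first (comparing each numeric part), with anything unparsable sorted last. A rough Python equivalent of that ordering, for clarity only (edge cases such as version strings with differing numbers of parts may differ slightly from the JS):
```python
import re


def version_sort_key(name: str):
    # develop/latest first, then valid versions newest-first, then everything else.
    if name in ("develop", "latest"):
        return (0, ())
    match = re.search(r"v(\d+(\.\d+)+)", name)
    if not match:
        return (2, ())
    parts = tuple(int(p) for p in match.group(1).split("."))
    # Negate each part so a plain ascending sort puts newer versions first.
    return (1, tuple(-p for p in parts))


versions = ["v1.98.0", "develop", "v1.105.0", "gh-pages", "v1.100.0"]
print(sorted(versions, key=version_sort_key))
# -> ['develop', 'v1.105.0', 'v1.100.0', 'v1.98.0', 'gh-pages']
```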

View File

@@ -532,6 +532,13 @@ the stream writer for the `presence` stream:
^/_matrix/client/(api/v1|r0|v3|unstable)/presence/
##### The `push_rules` stream
The following endpoints should be routed directly to the worker configured as
the stream writer for the `push` stream:
^/_matrix/client/(api/v1|r0|v3|unstable)/pushrules/
#### Restrict outbound federation traffic to a specific set of workers
The
@@ -629,7 +636,7 @@ worker application type.
You can designate generic worker to sending push notifications to
a [push gateway](https://spec.matrix.org/v1.5/push-gateway-api/) such as
[sygnal](https://github.com/vector-im/sygnal) and email.
[sygnal](https://github.com/matrix-org/sygnal) and email.
This will stop the main process sending push notifications.

poetry.lock generated

File diff suppressed because it is too large.

View File

@@ -96,7 +96,7 @@ module-name = "synapse.synapse_rust"
[tool.poetry]
name = "matrix-synapse"
version = "1.100.0rc3"
version = "1.105.0"
description = "Homeserver for the Matrix decentralised comms protocol"
authors = ["Matrix.org Team and Contributors <packages@matrix.org>"]
license = "AGPL-3.0-or-later"
@@ -321,7 +321,7 @@ all = [
# This helps prevents merge conflicts when running a batch of dependabot updates.
isort = ">=5.10.1"
black = ">=22.7.0"
ruff = "0.1.14"
ruff = "0.3.7"
# Type checking only works with the pydantic.v1 compat module from pydantic v2
pydantic = "^2"
@@ -372,7 +372,7 @@ optional = true
sphinx = {version = "^6.1", python = "^3.8"}
sphinx-autodoc2 = {version = ">=0.4.2,<0.6.0", python = "^3.8"}
myst-parser = {version = "^1.0.0", python = "^3.8"}
furo = ">=2022.12.7,<2024.0.0"
furo = ">=2022.12.7,<2025.0.0"
[build-system]
@@ -382,7 +382,7 @@ furo = ">=2022.12.7,<2024.0.0"
# runtime errors caused by build system changes.
# We are happy to raise these upper bounds upon request,
# provided we check that it's safe to do so (i.e. that CI passes).
requires = ["poetry-core>=1.1.0,<=1.8.1", "setuptools_rust>=1.3,<=1.8.1"]
requires = ["poetry-core>=1.1.0,<=1.9.0", "setuptools_rust>=1.3,<=1.8.1"]
build-backend = "poetry.core.masonry.api"

View File

@@ -7,7 +7,7 @@ name = "synapse"
version = "0.1.0"
edition = "2021"
rust-version = "1.65.0"
rust-version = "1.66.0"
[lib]
name = "synapse"
@@ -23,6 +23,9 @@ name = "synapse.synapse_rust"
[dependencies]
anyhow = "1.0.63"
bytes = "1.6.0"
headers = "0.4.0"
http = "1.1.0"
lazy_static = "1.4.0"
log = "0.4.17"
pyo3 = { version = "0.20.0", features = [

60
rust/src/errors.rs Normal file
View File

@@ -0,0 +1,60 @@
/*
* This file is licensed under the Affero General Public License (AGPL) version 3.
*
* Copyright (C) 2024 New Vector, Ltd
*
* This program is free software: you can redistribute it and/or modify
* it under the terms of the GNU Affero General Public License as
* published by the Free Software Foundation, either version 3 of the
* License, or (at your option) any later version.
*
* See the GNU Affero General Public License for more details:
* <https://www.gnu.org/licenses/agpl-3.0.html>.
*/
#![allow(clippy::new_ret_no_self)]
use std::collections::HashMap;
use http::{HeaderMap, StatusCode};
use pyo3::{exceptions::PyValueError, import_exception};
import_exception!(synapse.api.errors, SynapseError);
impl SynapseError {
pub fn new(
code: StatusCode,
message: String,
errcode: &'static str,
additional_fields: Option<HashMap<String, String>>,
headers: Option<HeaderMap>,
) -> pyo3::PyErr {
// Transform the HeaderMap into a HashMap<String, String>
let headers = if let Some(headers) = headers {
let mut map = HashMap::with_capacity(headers.len());
for (key, value) in headers.iter() {
let Ok(value) = value.to_str() else {
// This should never happen, but we don't want to panic in case it does
return PyValueError::new_err(
"Could not construct SynapseError: header value is not valid ASCII",
);
};
map.insert(key.as_str().to_owned(), value.to_owned());
}
Some(map)
} else {
None
};
SynapseError::new_err((code.as_u16(), message, errcode, additional_fields, headers))
}
}
import_exception!(synapse.api.errors, NotFoundError);
impl NotFoundError {
pub fn new() -> pyo3::PyErr {
NotFoundError::new_err(())
}
}

165
rust/src/http.rs Normal file
View File

@@ -0,0 +1,165 @@
/*
* This file is licensed under the Affero General Public License (AGPL) version 3.
*
* Copyright (C) 2024 New Vector, Ltd
*
* This program is free software: you can redistribute it and/or modify
* it under the terms of the GNU Affero General Public License as
* published by the Free Software Foundation, either version 3 of the
* License, or (at your option) any later version.
*
* See the GNU Affero General Public License for more details:
* <https://www.gnu.org/licenses/agpl-3.0.html>.
*/
use bytes::{Buf, BufMut, Bytes, BytesMut};
use headers::{Header, HeaderMapExt};
use http::{HeaderName, HeaderValue, Method, Request, Response, StatusCode, Uri};
use pyo3::{
exceptions::PyValueError,
types::{PyBytes, PySequence, PyTuple},
PyAny, PyResult,
};
use crate::errors::SynapseError;
/// Read a file-like Python object by chunks
///
/// # Errors
///
/// Returns an error if calling the `read` on the Python object failed
fn read_io_body(body: &PyAny, chunk_size: usize) -> PyResult<Bytes> {
let mut buf = BytesMut::new();
loop {
let bytes: &PyBytes = body.call_method1("read", (chunk_size,))?.downcast()?;
if bytes.as_bytes().is_empty() {
return Ok(buf.into());
}
buf.put(bytes.as_bytes());
}
}
/// Transform a Twisted `IRequest` to an [`http::Request`]
///
/// It uses the following members of `IRequest`:
/// - `content`, which is expected to be a file-like object with a `read` method
/// - `uri`, which is expected to be a valid URI as `bytes`
/// - `method`, which is expected to be a valid HTTP method as `bytes`
/// - `requestHeaders`, which is expected to have a `getAllRawHeaders` method
///
/// # Errors
///
/// Returns an error if the Python object doesn't properly implement `IRequest`
pub fn http_request_from_twisted(request: &PyAny) -> PyResult<Request<Bytes>> {
let content = request.getattr("content")?;
let body = read_io_body(content, 4096)?;
let mut req = Request::new(body);
let uri: &PyBytes = request.getattr("uri")?.downcast()?;
*req.uri_mut() =
Uri::try_from(uri.as_bytes()).map_err(|_| PyValueError::new_err("invalid uri"))?;
let method: &PyBytes = request.getattr("method")?.downcast()?;
*req.method_mut() = Method::from_bytes(method.as_bytes())
.map_err(|_| PyValueError::new_err("invalid method"))?;
let headers_iter = request
.getattr("requestHeaders")?
.call_method0("getAllRawHeaders")?
.iter()?;
for header in headers_iter {
let header = header?;
let header: &PyTuple = header.downcast()?;
let name: &PyBytes = header.get_item(0)?.downcast()?;
let name = HeaderName::from_bytes(name.as_bytes())
.map_err(|_| PyValueError::new_err("invalid header name"))?;
let values: &PySequence = header.get_item(1)?.downcast()?;
for index in 0..values.len()? {
let value: &PyBytes = values.get_item(index)?.downcast()?;
let value = HeaderValue::from_bytes(value.as_bytes())
.map_err(|_| PyValueError::new_err("invalid header value"))?;
req.headers_mut().append(name.clone(), value);
}
}
Ok(req)
}
/// Send an [`http::Response`] through a Twisted `IRequest`
///
/// It uses the following members of `IRequest`:
///
/// - `responseHeaders`, which is expected to have an `addRawHeader(bytes, bytes)` method
/// - `setResponseCode(int)` method
/// - `write(bytes)` method
/// - `finish()` method
///
/// # Errors
///
/// Returns an error if the Python object doesn't properly implement `IRequest`
pub fn http_response_to_twisted<B>(request: &PyAny, response: Response<B>) -> PyResult<()>
where
B: Buf,
{
let (parts, mut body) = response.into_parts();
request.call_method1("setResponseCode", (parts.status.as_u16(),))?;
let response_headers = request.getattr("responseHeaders")?;
for (name, value) in parts.headers.iter() {
response_headers.call_method1("addRawHeader", (name.as_str(), value.as_bytes()))?;
}
while body.remaining() != 0 {
let chunk = body.chunk();
request.call_method1("write", (chunk,))?;
body.advance(chunk.len());
}
request.call_method0("finish")?;
Ok(())
}
/// An extension trait for [`HeaderMap`] that provides typed access to headers, and throws the
/// right python exceptions when the header is missing or fails to parse.
///
/// [`HeaderMap`]: headers::HeaderMap
pub trait HeaderMapPyExt: HeaderMapExt {
/// Get a header from the map, returning an error if it is missing or invalid.
fn typed_get_required<H>(&self) -> PyResult<H>
where
H: Header,
{
self.typed_get_optional::<H>()?.ok_or_else(|| {
SynapseError::new(
StatusCode::BAD_REQUEST,
format!("Missing required header: {}", H::name()),
"M_MISSING_PARAM",
None,
None,
)
})
}
/// Get a header from the map, returning `None` if it is missing and an error if it is invalid.
fn typed_get_optional<H>(&self) -> PyResult<Option<H>>
where
H: Header,
{
self.typed_try_get::<H>().map_err(|_| {
SynapseError::new(
StatusCode::BAD_REQUEST,
format!("Invalid header: {}", H::name()),
"M_INVALID_PARAM",
None,
None,
)
})
}
}
impl<T: HeaderMapExt> HeaderMapPyExt for T {}

View File

@@ -3,7 +3,9 @@ use pyo3::prelude::*;
use pyo3_log::ResetHandle;
pub mod acl;
pub mod errors;
pub mod events;
pub mod http;
pub mod push;
lazy_static! {

View File

@@ -304,12 +304,12 @@ pub const BASE_APPEND_UNDERRIDE_RULES: &[PushRule] = &[
default_enabled: true,
},
PushRule {
rule_id: Cow::Borrowed("global/underride/.m.rule.room_one_to_one"),
rule_id: Cow::Borrowed("global/underride/.m.rule.encrypted_room_one_to_one"),
priority_class: 1,
conditions: Cow::Borrowed(&[
Condition::Known(KnownCondition::EventMatch(EventMatchCondition {
key: Cow::Borrowed("type"),
pattern: Cow::Borrowed("m.room.message"),
pattern: Cow::Borrowed("m.room.encrypted"),
})),
Condition::Known(KnownCondition::RoomMemberCount {
is: Some(Cow::Borrowed("2")),
@@ -320,12 +320,12 @@ pub const BASE_APPEND_UNDERRIDE_RULES: &[PushRule] = &[
default_enabled: true,
},
PushRule {
rule_id: Cow::Borrowed("global/underride/.m.rule.encrypted_room_one_to_one"),
rule_id: Cow::Borrowed("global/underride/.m.rule.room_one_to_one"),
priority_class: 1,
conditions: Cow::Borrowed(&[
Condition::Known(KnownCondition::EventMatch(EventMatchCondition {
key: Cow::Borrowed("type"),
pattern: Cow::Borrowed("m.room.encrypted"),
pattern: Cow::Borrowed("m.room.message"),
})),
Condition::Known(KnownCondition::RoomMemberCount {
is: Some(Cow::Borrowed("2")),

View File

@@ -214,8 +214,6 @@ fi
extra_test_args=()
test_packages="./tests/csapi ./tests ./tests/msc3874 ./tests/msc3890 ./tests/msc3391 ./tests/msc3930 ./tests/msc3902"
# Enable dirty runs, so tests will reuse the same container where possible.
# This significantly speeds up tests, but increases the possibility of test pollution.
export COMPLEMENT_ENABLE_DIRTY_RUNS=1
@@ -278,7 +276,12 @@ fi
export PASS_SYNAPSE_LOG_TESTING=1
# Run the tests!
echo "Images built; running complement with ${extra_test_args[@]} $@ $test_packages"
cd "$COMPLEMENT_DIR"
go test -v -tags "synapse_blacklist" -count=1 "${extra_test_args[@]}" "$@" $test_packages
# This isn't whitespace-safe but *does* work on the prehistoric version of bash
# on OSX.
test_packages=( $(find ./tests -type d) )
echo "Images built; running complement with ${extra_test_args[@]} $@ ${test_packages[@]}"
go test -v -tags "synapse_blacklist" -count=1 "${extra_test_args[@]}" "$@" "${test_packages[@]}"

View File

@@ -660,7 +660,7 @@ def _announce() -> None:
Hi everyone. Synapse {current_version} has just been released.
[notes](https://github.com/element-hq/synapse/releases/tag/{tag_name}) | \
[docker](https://hub.docker.com/r/vectorim/synapse/tags?name={tag_name}) | \
[docker](https://hub.docker.com/r/matrixdotorg/synapse/tags?name={tag_name}) | \
[debs](https://packages.matrix.org/debian/) | \
[pypi](https://pypi.org/project/matrix-synapse/{current_version}/)"""
)

View File

@@ -60,7 +60,7 @@ from synapse.logging.context import (
)
from synapse.notifier import ReplicationNotifier
from synapse.storage.database import DatabasePool, LoggingTransaction, make_conn
from synapse.storage.databases.main import FilteringWorkerStore, PushRuleStore
from synapse.storage.databases.main import FilteringWorkerStore
from synapse.storage.databases.main.account_data import AccountDataWorkerStore
from synapse.storage.databases.main.client_ips import ClientIpBackgroundUpdateStore
from synapse.storage.databases.main.deviceinbox import DeviceInboxBackgroundUpdateStore
@@ -77,10 +77,8 @@ from synapse.storage.databases.main.media_repository import (
)
from synapse.storage.databases.main.presence import PresenceBackgroundUpdateStore
from synapse.storage.databases.main.profile import ProfileWorkerStore
from synapse.storage.databases.main.pusher import (
PusherBackgroundUpdatesStore,
PusherWorkerStore,
)
from synapse.storage.databases.main.push_rule import PusherWorkerStore
from synapse.storage.databases.main.pusher import PusherBackgroundUpdatesStore
from synapse.storage.databases.main.receipts import ReceiptsBackgroundUpdateStore
from synapse.storage.databases.main.registration import (
RegistrationBackgroundUpdateStore,
@@ -245,7 +243,6 @@ class Store(
AccountDataWorkerStore,
FilteringWorkerStore,
ProfileWorkerStore,
PushRuleStore,
PusherWorkerStore,
PusherBackgroundUpdatesStore,
PresenceBackgroundUpdateStore,
@@ -1040,10 +1037,10 @@ class Porter:
return done, remaining + done
async def _setup_state_group_id_seq(self) -> None:
curr_id: Optional[
int
] = await self.sqlite_store.db_pool.simple_select_one_onecol(
table="state_groups", keyvalues={}, retcol="MAX(id)", allow_none=True
curr_id: Optional[int] = (
await self.sqlite_store.db_pool.simple_select_one_onecol(
table="state_groups", keyvalues={}, retcol="MAX(id)", allow_none=True
)
)
if not curr_id:
@@ -1132,13 +1129,13 @@ class Porter:
)
async def _setup_auth_chain_sequence(self) -> None:
curr_chain_id: Optional[
int
] = await self.sqlite_store.db_pool.simple_select_one_onecol(
table="event_auth_chains",
keyvalues={},
retcol="MAX(chain_id)",
allow_none=True,
curr_chain_id: Optional[int] = (
await self.sqlite_store.db_pool.simple_select_one_onecol(
table="event_auth_chains",
keyvalues={},
retcol="MAX(chain_id)",
allow_none=True,
)
)
def r(txn: LoggingTransaction) -> None:

View File

@@ -43,7 +43,6 @@ MAIN_TIMELINE: Final = "main"
class Membership:
"""Represents the membership states of a user in a room."""
INVITE: Final = "invite"
@@ -130,6 +129,8 @@ class EventTypes:
Reaction: Final = "m.reaction"
CallInvite: Final = "m.call.invite"
class ToDeviceEventTypes:
RoomKeyRequest: Final = "m.room_key_request"

View File

@@ -517,8 +517,6 @@ class InvalidCaptchaError(SynapseError):
class LimitExceededError(SynapseError):
"""A client has sent too many requests and is being throttled."""
include_retry_after_header = False
def __init__(
self,
limiter_name: str,
@@ -526,9 +524,10 @@ class LimitExceededError(SynapseError):
retry_after_ms: Optional[int] = None,
errcode: str = Codes.LIMIT_EXCEEDED,
):
# Use HTTP header Retry-After to enable library-assisted retry handling.
headers = (
{"Retry-After": str(math.ceil(retry_after_ms / 1000))}
if self.include_retry_after_header and retry_after_ms is not None
if retry_after_ms is not None
else None
)
super().__init__(code, "Too Many Requests", errcode, headers=headers)
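Always emitting the header lets a generic HTTP client back off without any Matrix-specific logic. A minimal client-side sketch of such library-assisted retry handling (hypothetical code using the `requests` library; not part of this diff):

```python
# Hypothetical client: honour Retry-After on a 429 response (sketch only).
import time
import requests

def post_with_retry(url: str, payload: dict) -> requests.Response:
    resp = requests.post(url, json=payload)
    if resp.status_code == 429:
        # The server now sets Retry-After whenever it knows the wait time.
        delay = int(resp.headers.get("Retry-After", "1"))
        time.sleep(delay)
        resp = requests.post(url, json=payload)
    return resp
```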

View File

@@ -370,9 +370,11 @@ class RoomVersionCapability:
MSC3244_CAPABILITIES = {
cap.identifier: {
"preferred": cap.preferred_version.identifier
if cap.preferred_version is not None
else None,
"preferred": (
cap.preferred_version.identifier
if cap.preferred_version is not None
else None
),
"support": [
v.identifier
for v in KNOWN_ROOM_VERSIONS.values()

View File

@@ -188,9 +188,9 @@ class SynapseHomeServer(HomeServer):
PasswordResetSubmitTokenResource,
)
resources[
"/_synapse/client/password_reset/email/submit_token"
] = PasswordResetSubmitTokenResource(self)
resources["/_synapse/client/password_reset/email/submit_token"] = (
PasswordResetSubmitTokenResource(self)
)
if name == "consent":
from synapse.rest.consent.consent_resource import ConsentResource

View File

@@ -362,16 +362,16 @@ class ApplicationServiceApi(SimpleHttpClient):
# TODO: Update to stable prefixes once MSC3202 completes FCP merge
if service.msc3202_transaction_extensions:
if one_time_keys_count:
body[
"org.matrix.msc3202.device_one_time_key_counts"
] = one_time_keys_count
body[
"org.matrix.msc3202.device_one_time_keys_count"
] = one_time_keys_count
body["org.matrix.msc3202.device_one_time_key_counts"] = (
one_time_keys_count
)
body["org.matrix.msc3202.device_one_time_keys_count"] = (
one_time_keys_count
)
if unused_fallback_keys:
body[
"org.matrix.msc3202.device_unused_fallback_key_types"
] = unused_fallback_keys
body["org.matrix.msc3202.device_unused_fallback_key_types"] = (
unused_fallback_keys
)
if device_list_summary:
body["org.matrix.msc3202.device_lists"] = {
"changed": list(device_list_summary.changed),

View File

@@ -25,7 +25,6 @@ from typing import TYPE_CHECKING, Any, Optional
import attr
import attr.validators
from synapse.api.errors import LimitExceededError
from synapse.api.room_versions import KNOWN_ROOM_VERSIONS, RoomVersions
from synapse.config import ConfigError
from synapse.config._base import Config, RootConfig
@@ -394,11 +393,6 @@ class ExperimentalConfig(Config):
# MSC3967: Do not require UIA when first uploading cross signing keys
self.msc3967_enabled = experimental.get("msc3967_enabled", False)
# MSC3981: Recurse relations
self.msc3981_recurse_relations = experimental.get(
"msc3981_recurse_relations", False
)
# MSC3861: Matrix architecture change to delegate authentication via OIDC
try:
self.msc3861 = MSC3861(**experimental.get("msc3861", {}))
@@ -410,19 +404,6 @@ class ExperimentalConfig(Config):
# Check that none of the other config options conflict with MSC3861 when enabled
self.msc3861.check_config_conflicts(self.root)
# MSC4010: Do not allow setting m.push_rules account data.
self.msc4010_push_rules_account_data = experimental.get(
"msc4010_push_rules_account_data", False
)
# MSC4041: Use HTTP header Retry-After to enable library-assisted retry handling
#
# This is a bit hacky, but the most reasonable way to *always* include the
# headers.
LimitExceededError.include_retry_after_header = experimental.get(
"msc4041_enabled", False
)
self.msc4028_push_encrypted_events = experimental.get(
"msc4028_push_encrypted_events", False
)
@@ -430,3 +411,14 @@ class ExperimentalConfig(Config):
self.msc4069_profile_inhibit_propagation = experimental.get(
"msc4069_profile_inhibit_propagation", False
)
# MSC4108: Mechanism to allow OIDC sign in and E2EE set up via QR code
self.msc4108_delegation_endpoint: Optional[str] = experimental.get(
"msc4108_delegation_endpoint", None
)
if self.msc4108_delegation_endpoint is not None and not self.msc3861.enabled:
raise ConfigError(
"MSC4108 requires MSC3861 to be enabled",
("experimental", "msc4108_delegation_endpoint"),
)

View File

@@ -342,6 +342,9 @@ def _parse_oidc_config_dict(
user_mapping_provider_config=user_mapping_provider_config,
attribute_requirements=attribute_requirements,
enable_registration=oidc_config.get("enable_registration", True),
additional_authorization_parameters=oidc_config.get(
"additional_authorization_parameters", {}
),
)
@@ -444,3 +447,6 @@ class OidcProviderConfig:
# Whether automatic registrations are enabled in the OIDC flow. Defaults to True.
enable_registration: bool
# Additional parameters that will be passed to the authorization grant URL
additional_authorization_parameters: Mapping[str, str]

View File

@@ -171,9 +171,9 @@ class RegistrationConfig(Config):
refreshable_access_token_lifetime = self.parse_duration(
refreshable_access_token_lifetime
)
self.refreshable_access_token_lifetime: Optional[
int
] = refreshable_access_token_lifetime
self.refreshable_access_token_lifetime: Optional[int] = (
refreshable_access_token_lifetime
)
if (
self.session_lifetime is not None
@@ -237,6 +237,14 @@ class RegistrationConfig(Config):
self.inhibit_user_in_use_error = config.get("inhibit_user_in_use_error", False)
# List of user IDs not to send out device list updates for when they
# register new devices. This is useful to handle bot accounts.
#
# Note: This will still send out device list updates if the device is
# later updated, e.g. if end-to-end keys are added.
dont_notify_new_devices_for = config.get("dont_notify_new_devices_for", [])
self.dont_notify_new_devices_for = frozenset(dont_notify_new_devices_for)
def generate_config_section(
self, generate_secrets: bool = False, **kwargs: Any
) -> str:

View File

@@ -199,9 +199,9 @@ class ContentRepositoryConfig(Config):
provider_config["module"] == "file_system"
or provider_config["module"] == "synapse.rest.media.v1.storage_provider"
):
provider_config[
"module"
] = "synapse.media.storage_provider.FileStorageProviderBackend"
provider_config["module"] = (
"synapse.media.storage_provider.FileStorageProviderBackend"
)
provider_class, parsed_config = load_module(
provider_config, ("media_storage_providers", "<item %i>" % i)

View File

@@ -156,6 +156,8 @@ class WriterLocations:
can only be a single instance.
presence: The instances that write to the presence stream. Currently
can only be a single instance.
push_rules: The instances that write to the push stream. Currently
can only be a single instance.
"""
events: List[str] = attr.ib(
@@ -182,6 +184,10 @@ class WriterLocations:
default=["master"],
converter=_instance_to_list_converter,
)
push_rules: List[str] = attr.ib(
default=["master"],
converter=_instance_to_list_converter,
)
@attr.s(auto_attribs=True)
@@ -341,6 +347,7 @@ class WorkerConfig(Config):
"account_data",
"receipts",
"presence",
"push_rules",
):
instances = _instance_to_list_converter(getattr(self.writers, stream))
for instance in instances:
@@ -378,6 +385,11 @@ class WorkerConfig(Config):
"Must only specify one instance to handle `presence` messages."
)
if len(self.writers.push_rules) != 1:
raise ConfigError(
"Must only specify one instance to handle `push` messages."
)
self.events_shard_config = RoutableShardedWorkerHandlingConfig(
self.writers.events
)

View File

@@ -839,11 +839,12 @@ class ServerKeyFetcher(BaseV2KeyFetcher):
Map from server_name -> key_id -> FetchKeyResult
"""
# We only need to do one request per server.
servers_to_fetch = {k.server_name for k in keys_to_fetch}
results = {}
async def get_keys(key_to_fetch_item: _FetchKeyRequest) -> None:
server_name = key_to_fetch_item.server_name
async def get_keys(server_name: str) -> None:
try:
keys = await self.get_server_verify_keys_v2_direct(server_name)
results[server_name] = keys
@@ -852,7 +853,7 @@ class ServerKeyFetcher(BaseV2KeyFetcher):
except Exception:
logger.exception("Error getting keys from %s", server_name)
await yieldable_gather_results(get_keys, keys_to_fetch)
await yieldable_gather_results(get_keys, servers_to_fetch)
return results
async def get_server_verify_keys_v2_direct(

View File

@@ -23,7 +23,20 @@
import collections.abc
import logging
import typing
from typing import Any, Dict, Iterable, List, Mapping, Optional, Set, Tuple, Union
from typing import (
Any,
ChainMap,
Dict,
Iterable,
List,
Mapping,
MutableMapping,
Optional,
Set,
Tuple,
Union,
cast,
)
from canonicaljson import encode_canonical_json
from signedjson.key import decode_verify_key_bytes
@@ -75,8 +88,7 @@ class _EventSourceStore(Protocol):
redact_behaviour: EventRedactBehaviour,
get_prev_content: bool = False,
allow_rejected: bool = False,
) -> Dict[str, "EventBase"]:
...
) -> Dict[str, "EventBase"]: ...
def validate_event_for_room_version(event: "EventBase") -> None:
@@ -175,12 +187,22 @@ async def check_state_independent_auth_rules(
return
# 2. Reject if event has auth_events that: ...
auth_events: ChainMap[str, EventBase] = ChainMap()
if batched_auth_events:
# Copy the batched auth events to avoid mutating them.
auth_events = dict(batched_auth_events)
needed_auth_event_ids = set(event.auth_event_ids()) - batched_auth_events.keys()
# batched_auth_events can become very large. To avoid repeatedly copying it, which
# would significantly impact performance, we use a ChainMap.
# batched_auth_events must be cast to MutableMapping because .new_child() requires
# this type. This casting is safe as the mapping is never mutated.
auth_events = auth_events.new_child(
cast(MutableMapping[str, "EventBase"], batched_auth_events)
)
needed_auth_event_ids = [
event_id
for event_id in event.auth_event_ids()
if event_id not in batched_auth_events
]
if needed_auth_event_ids:
auth_events.update(
auth_events = auth_events.new_child(
await store.get_events(
needed_auth_event_ids,
redact_behaviour=EventRedactBehaviour.as_is,
@@ -188,10 +210,12 @@ async def check_state_independent_auth_rules(
)
)
else:
auth_events = await store.get_events(
event.auth_event_ids(),
redact_behaviour=EventRedactBehaviour.as_is,
allow_rejected=True,
auth_events = auth_events.new_child(
await store.get_events(
event.auth_event_ids(),
redact_behaviour=EventRedactBehaviour.as_is,
allow_rejected=True,
)
)
room_id = event.room_id
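The layered lookup above avoids copying `batched_auth_events` on every call; a small standalone sketch of the same `ChainMap` pattern (illustrative, with made-up event IDs):

```python
# Standalone sketch of the ChainMap layering used above (illustrative only).
from collections import ChainMap

batched = {"$a": "event A", "$b": "event B"}  # large mapping we must not copy
fetched = {"$c": "event C"}                   # events later fetched from the store

auth_events = ChainMap().new_child(batched).new_child(fetched)

print(auth_events["$a"])    # resolved from the shared layer, no copy made
print(auth_events["$c"])    # resolved from the newest layer
print("$d" in auth_events)  # False
```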

View File

@@ -93,16 +93,14 @@ class DictProperty(Generic[T]):
self,
instance: Literal[None],
owner: Optional[Type[_DictPropertyInstance]] = None,
) -> "DictProperty":
...
) -> "DictProperty": ...
@overload
def __get__(
self,
instance: _DictPropertyInstance,
owner: Optional[Type[_DictPropertyInstance]] = None,
) -> T:
...
) -> T: ...
def __get__(
self,
@@ -161,16 +159,14 @@ class DefaultDictProperty(DictProperty, Generic[T]):
self,
instance: Literal[None],
owner: Optional[Type[_DictPropertyInstance]] = None,
) -> "DefaultDictProperty":
...
) -> "DefaultDictProperty": ...
@overload
def __get__(
self,
instance: _DictPropertyInstance,
owner: Optional[Type[_DictPropertyInstance]] = None,
) -> T:
...
) -> T: ...
def __get__(
self,

View File

@@ -612,9 +612,9 @@ class EventClientSerializer:
serialized_aggregations = {}
if event_aggregations.references:
serialized_aggregations[
RelationTypes.REFERENCE
] = event_aggregations.references
serialized_aggregations[RelationTypes.REFERENCE] = (
event_aggregations.references
)
if event_aggregations.replace:
# Include information about it in the relations dict.

View File

@@ -169,9 +169,9 @@ class FederationServer(FederationBase):
# We cache responses to state queries, as they take a while and often
# come in waves.
self._state_resp_cache: ResponseCache[
Tuple[str, Optional[str]]
] = ResponseCache(hs.get_clock(), "state_resp", timeout_ms=30000)
self._state_resp_cache: ResponseCache[Tuple[str, Optional[str]]] = (
ResponseCache(hs.get_clock(), "state_resp", timeout_ms=30000)
)
self._state_ids_resp_cache: ResponseCache[Tuple[str, str]] = ResponseCache(
hs.get_clock(), "state_ids_resp", timeout_ms=30000
)

View File

@@ -88,9 +88,9 @@ class FederationRemoteSendQueue(AbstractFederationSender):
# Stores the destinations we need to explicitly send presence to about a
# given user.
# Stream position -> (user_id, destinations)
self.presence_destinations: SortedDict[
int, Tuple[str, Iterable[str]]
] = SortedDict()
self.presence_destinations: SortedDict[int, Tuple[str, Iterable[str]]] = (
SortedDict()
)
# (destination, key) -> EDU
self.keyed_edu: Dict[Tuple[str, tuple], Edu] = {}

View File

@@ -192,10 +192,9 @@ sent_pdus_destination_dist_total = Counter(
)
# Time (in s) to wait before trying to wake up destinations that have
# catch-up outstanding. This will also be the delay applied at startup
# before trying the same.
# catch-up outstanding.
# Please note that rate limiting still applies, so while the loop is
# executed every X seconds the destinations may not be wake up because
# executed every X seconds the destinations may not be woken up because
# they are being rate limited following previous attempt failures.
WAKEUP_RETRY_PERIOD_SEC = 60
@@ -428,18 +427,17 @@ class FederationSender(AbstractFederationSender):
/ hs.config.ratelimiting.federation_rr_transactions_per_room_per_second
)
self._external_cache = hs.get_external_cache()
self._destination_wakeup_queue = _DestinationWakeupQueue(self, self.clock)
# Regularly wake up destinations that have outstanding PDUs to be caught up
self.clock.looping_call(
self.clock.looping_call_now(
run_as_background_process,
WAKEUP_RETRY_PERIOD_SEC * 1000.0,
"wake_destinations_needing_catchup",
self._wake_destinations_needing_catchup,
)
self._external_cache = hs.get_external_cache()
self._destination_wakeup_queue = _DestinationWakeupQueue(self, self.clock)
def _get_per_destination_queue(self, destination: str) -> PerDestinationQueue:
"""Get or create a PerDestinationQueue for the given destination

View File

@@ -118,10 +118,10 @@ class AccountHandler:
}
if self._use_account_validity_in_account_status:
status[
"org.matrix.expired"
] = await self._account_validity_handler.is_user_expired(
user_id.to_string()
status["org.matrix.expired"] = (
await self._account_validity_handler.is_user_expired(
user_id.to_string()
)
)
return status

View File

@@ -2185,7 +2185,7 @@ class PasswordAuthProvider:
# result is always the right type, but as it is 3rd party code it might not be
if not isinstance(result, tuple) or len(result) != 2:
logger.warning(
logger.warning( # type: ignore[unreachable]
"Wrong type returned by module API callback %s: %s, expected"
" Optional[Tuple[str, Optional[Callable]]]",
callback,
@@ -2248,7 +2248,7 @@ class PasswordAuthProvider:
# result is always the right type, but as it is 3rd party code it might not be
if not isinstance(result, tuple) or len(result) != 2:
logger.warning(
logger.warning( # type: ignore[unreachable]
"Wrong type returned by module API callback %s: %s, expected"
" Optional[Tuple[str, Optional[Callable]]]",
callback,

View File

@@ -18,9 +18,11 @@
# [This file includes modifications made by New Vector Limited]
#
#
import itertools
import logging
from typing import TYPE_CHECKING, Optional
from synapse.api.constants import Membership
from synapse.api.errors import SynapseError
from synapse.handlers.device import DeviceHandler
from synapse.metrics.background_process_metrics import run_as_background_process
@@ -168,9 +170,9 @@ class DeactivateAccountHandler:
# parts users from rooms (if it isn't already running)
self._start_user_parting()
# Reject all pending invites for the user, so that the user doesn't show up in the
# "invited" section of rooms' members list.
await self._reject_pending_invites_for_user(user_id)
# Reject all pending invites and knocks for the user, so that the
# user doesn't show up in the "invited" section of rooms' members list.
await self._reject_pending_invites_and_knocks_for_user(user_id)
# Remove all information on the user from the account_validity table.
if self._account_validity_enabled:
@@ -194,34 +196,37 @@ class DeactivateAccountHandler:
return identity_server_supports_unbinding
async def _reject_pending_invites_for_user(self, user_id: str) -> None:
"""Reject pending invites addressed to a given user ID.
async def _reject_pending_invites_and_knocks_for_user(self, user_id: str) -> None:
"""Reject pending invites and knocks addressed to a given user ID.
Args:
user_id: The user ID to reject pending invites for.
user_id: The user ID to reject pending invites and knocks for.
"""
user = UserID.from_string(user_id)
pending_invites = await self.store.get_invited_rooms_for_local_user(user_id)
pending_knocks = await self.store.get_knocked_at_rooms_for_local_user(user_id)
for room in pending_invites:
for room in itertools.chain(pending_invites, pending_knocks):
try:
await self._room_member_handler.update_membership(
create_requester(user, authenticated_entity=self._server_name),
user,
room.room_id,
"leave",
Membership.LEAVE,
ratelimit=False,
require_consent=False,
)
logger.info(
"Rejected invite for deactivated user %r in room %r",
"Rejected %r for deactivated user %r in room %r",
room.membership,
user_id,
room.room_id,
)
except Exception:
logger.exception(
"Failed to reject invite for user %r in room %r:"
"Failed to reject %r for user %r in room %r:"
" ignoring and continuing",
room.membership,
user_id,
room.room_id,
)

View File

@@ -429,6 +429,10 @@ class DeviceHandler(DeviceWorkerHandler):
self._storage_controllers = hs.get_storage_controllers()
self.db_pool = hs.get_datastores().main.db_pool
self._dont_notify_new_devices_for = (
hs.config.registration.dont_notify_new_devices_for
)
self.device_list_updater = DeviceListUpdater(hs, self)
federation_registry = hs.get_federation_registry()
@@ -505,6 +509,9 @@ class DeviceHandler(DeviceWorkerHandler):
self._check_device_name_length(initial_device_display_name)
# Check if we should send out device list updates for this new device.
notify = user_id not in self._dont_notify_new_devices_for
if device_id is not None:
new_device = await self.store.store_device(
user_id=user_id,
@@ -514,7 +521,8 @@ class DeviceHandler(DeviceWorkerHandler):
auth_provider_session_id=auth_provider_session_id,
)
if new_device:
await self.notify_device_update(user_id, [device_id])
if notify:
await self.notify_device_update(user_id, [device_id])
return device_id
# if the device id is not specified, we'll autogen one, but loop a few
@@ -530,7 +538,8 @@ class DeviceHandler(DeviceWorkerHandler):
auth_provider_session_id=auth_provider_session_id,
)
if new_device:
await self.notify_device_update(user_id, [new_device_id])
if notify:
await self.notify_device_update(user_id, [new_device_id])
return new_device_id
attempts += 1

View File

@@ -265,9 +265,9 @@ class DirectoryHandler:
async def get_association(self, room_alias: RoomAlias) -> JsonDict:
room_id = None
if self.hs.is_mine(room_alias):
result: Optional[
RoomAliasMapping
] = await self.get_association_from_room_alias(room_alias)
result: Optional[RoomAliasMapping] = (
await self.get_association_from_room_alias(room_alias)
)
if result:
room_id = result.room_id

View File

@@ -1476,6 +1476,42 @@ class E2eKeysHandler:
else:
return exists, self.clock.time_msec() < ts_replacable_without_uia_before
async def has_different_keys(self, user_id: str, body: JsonDict) -> bool:
"""
Check if a key provided in `body` differs from the same key stored in the DB. Returns
true on the first difference. If a key exists in `body` but does not exist in the DB,
returns True. If `body` has no keys, this always returns False.
Note by 'key' we mean Matrix key rather than JSON key.
The purpose of this function is to detect whether or not we need to apply UIA checks.
We must apply UIA checks if any key in the database is being overwritten. If a key is
being inserted for the first time, or if the key exactly matches what is in the database,
then no UIA check needs to be performed.
Args:
user_id: The user who sent the `body`.
body: The JSON request body from POST /keys/device_signing/upload
Returns:
True if any key in `body` has a different value in the database.
"""
# Ensure that each key provided in the request body exactly matches the one we have stored.
# The first time we see the DB having a different key to the matching request key, bail.
# Note: we do not care if the DB has a key which the request does not specify, as we only
# care about *replacements* or *insertions* (i.e. UPSERT)
req_body_key_to_db_key = {
"master_key": "master",
"self_signing_key": "self_signing",
"user_signing_key": "user_signing",
}
for req_body_key, db_key in req_body_key_to_db_key.items():
if req_body_key in body:
existing_key = await self.store.get_e2e_cross_signing_key(
user_id, db_key
)
if existing_key != body[req_body_key]:
return True
return False
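A hedged sketch of how a caller could use this check to decide whether user-interactive auth is needed before accepting a cross-signing upload (hypothetical caller code, not part of this diff):

```python
# Hypothetical caller: only demand UIA when an existing key would be replaced.
async def upload_requires_uia(handler, user_id: str, body: dict) -> bool:
    if await handler.has_different_keys(user_id, body):
        # An uploaded key differs from what is stored, so this is a
        # replacement and user-interactive auth should be applied.
        return True
    # First-time upload or identical keys: no UIA needed.
    return False
```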
def _check_cross_signing_key(
key: JsonDict, user_id: str, key_type: str, signing_key: Optional[VerifyKey] = None

View File

@@ -1001,11 +1001,11 @@ class FederationHandler:
)
if include_auth_user_id:
event_content[
EventContentFields.AUTHORISING_USER
] = await self._event_auth_handler.get_user_which_could_invite(
room_id,
state_ids,
event_content[EventContentFields.AUTHORISING_USER] = (
await self._event_auth_handler.get_user_which_could_invite(
room_id,
state_ids,
)
)
builder = self.event_builder_factory.for_room_version(

View File

@@ -1367,9 +1367,9 @@ class FederationEventHandler:
)
if remote_event.is_state() and remote_event.rejected_reason is None:
state_map[
(remote_event.type, remote_event.state_key)
] = remote_event.event_id
state_map[(remote_event.type, remote_event.state_key)] = (
remote_event.event_id
)
return state_map
@@ -1757,17 +1757,25 @@ class FederationEventHandler:
events_and_contexts_to_persist.append((event, context))
for event in sorted_auth_events:
for i, event in enumerate(sorted_auth_events):
await prep(event)
await self.persist_events_and_notify(
room_id,
events_and_contexts_to_persist,
# Mark these events backfilled as they're historic events that will
# eventually be backfilled. For example, missing events we fetch
# during backfill should be marked as backfilled as well.
backfilled=True,
)
# The above function is typically not async, and so won't yield to
# the reactor. For large rooms let's yield to the reactor
# occasionally to ensure we don't block other work.
if (i + 1) % 1000 == 0:
await self._clock.sleep(0)
# Also persist the new event in batches for similar reasons as above.
for batch in batch_iter(events_and_contexts_to_persist, 1000):
await self.persist_events_and_notify(
room_id,
batch,
# Mark these events as backfilled as they're historic events that will
# eventually be backfilled. For example, missing events we fetch
# during backfill should be marked as backfilled as well.
backfilled=True,
)
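The persistence loop relies on a chunking helper; a minimal standalone equivalent (a sketch, not the actual `batch_iter` implementation) would look like:

```python
# Minimal chunking helper, equivalent in spirit to the batch_iter call above.
from typing import Iterable, Iterator, List, TypeVar

T = TypeVar("T")

def chunked(items: Iterable[T], size: int) -> Iterator[List[T]]:
    batch: List[T] = []
    for item in items:
        batch.append(item)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch
```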
@trace
async def _check_event_auth(

View File

@@ -34,6 +34,7 @@ from synapse.api.constants import (
EventTypes,
GuestAccess,
HistoryVisibility,
JoinRules,
Membership,
RelationTypes,
UserTypes,
@@ -1325,6 +1326,18 @@ class EventCreationHandler:
self.validator.validate_new(event, self.config)
await self._validate_event_relation(event)
if event.type == EventTypes.CallInvite:
room_id = event.room_id
room_info = await self.store.get_room_with_stats(room_id)
assert room_info is not None
if room_info.join_rules == JoinRules.PUBLIC:
raise SynapseError(
403,
"Call invites are not allowed in public rooms.",
Codes.FORBIDDEN,
)
logger.debug("Created event %s", event.event_id)
return event, context
@@ -1654,9 +1667,9 @@ class EventCreationHandler:
expiry_ms=60 * 60 * 1000,
)
self._external_cache_joined_hosts_updates[
state_entry.state_group
] = None
self._external_cache_joined_hosts_updates[state_entry.state_group] = (
None
)
async def _validate_canonical_alias(
self,

View File

@@ -65,6 +65,7 @@ from synapse.http.server import finish_request
from synapse.http.servlet import parse_string
from synapse.http.site import SynapseRequest
from synapse.logging.context import make_deferred_yieldable
from synapse.module_api import ModuleApi
from synapse.types import JsonDict, UserID, map_username_to_mxid_localpart
from synapse.util import Clock, json_decoder
from synapse.util.caches.cached_call import RetryOnExceptionCachedCall
@@ -421,9 +422,19 @@ class OidcProvider:
# from the IdP's jwks_uri, if required.
self._jwks = RetryOnExceptionCachedCall(self._load_jwks)
self._user_mapping_provider = provider.user_mapping_provider_class(
provider.user_mapping_provider_config
user_mapping_provider_init_method = (
provider.user_mapping_provider_class.__init__
)
if len(inspect.signature(user_mapping_provider_init_method).parameters) == 3:
self._user_mapping_provider = provider.user_mapping_provider_class(
provider.user_mapping_provider_config,
ModuleApi(hs, hs.get_auth_handler()),
)
else:
self._user_mapping_provider = provider.user_mapping_provider_class(
provider.user_mapping_provider_config,
)
self._skip_verification = provider.skip_verification
self._allow_existing_users = provider.allow_existing_users
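The signature inspection above keeps older providers working: a mapping provider whose constructor takes three parameters now also receives a `ModuleApi`. A hedged sketch of such a third-party provider (the class name is hypothetical):

```python
# Hypothetical third-party OIDC mapping provider accepting the new argument.
# With three __init__ parameters (self, config, module_api) it receives a
# ModuleApi instance; a two-parameter provider is still constructed as before.
class ExampleMappingProvider:
    def __init__(self, config, module_api):
        self._config = config
        self._module_api = module_api
```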
@@ -442,6 +453,10 @@ class OidcProvider:
# optional brand identifier for this auth provider
self.idp_brand = provider.idp_brand
self.additional_authorization_parameters = (
provider.additional_authorization_parameters
)
self._sso_handler = hs.get_sso_handler()
self._device_handler = hs.get_device_handler()
@@ -818,14 +833,38 @@ class OidcProvider:
logger.debug("Using the OAuth2 access_token to request userinfo")
metadata = await self.load_metadata()
resp = await self._http_client.get_json(
resp = await self._http_client.request(
"GET",
metadata["userinfo_endpoint"],
headers={"Authorization": ["Bearer {}".format(token["access_token"])]},
headers=Headers(
{"Authorization": ["Bearer {}".format(token["access_token"])]}
),
)
logger.debug("Retrieved user info from userinfo endpoint: %r", resp)
body = await readBody(resp)
return UserInfo(resp)
content_type_headers = resp.headers.getRawHeaders("Content-Type")
assert content_type_headers
# We use `startswith` because the header value can contain the `charset` parameter
# even if it is useless, and Twisted doesn't take care of that for us.
if content_type_headers[0].startswith("application/jwt"):
alg_values = metadata.get(
"id_token_signing_alg_values_supported", ["RS256"]
)
jwt = JsonWebToken(alg_values)
jwk_set = await self.load_jwks()
try:
decoded_resp = jwt.decode(body, key=jwk_set)
except ValueError:
logger.info("Reloading JWKS after decode error")
jwk_set = await self.load_jwks(force=True) # try reloading the jwks
decoded_resp = jwt.decode(body, key=jwk_set)
else:
decoded_resp = json_decoder.decode(body.decode("utf-8"))
logger.debug("Retrieved user info from userinfo endpoint: %r", decoded_resp)
return UserInfo(decoded_resp)
async def _verify_jwt(
self,
@@ -971,17 +1010,21 @@ class OidcProvider:
metadata = await self.load_metadata()
additional_authorization_parameters = dict(
self.additional_authorization_parameters
)
# Automatically enable PKCE if it is supported.
extra_grant_values = {}
if metadata.get("code_challenge_methods_supported"):
code_verifier = generate_token(48)
# Note that we verified the server supports S256 earlier (in
# OidcProvider._validate_metadata).
extra_grant_values = {
"code_challenge_method": "S256",
"code_challenge": create_s256_code_challenge(code_verifier),
}
additional_authorization_parameters.update(
{
"code_challenge_method": "S256",
"code_challenge": create_s256_code_challenge(code_verifier),
}
)
cookie = self._macaroon_generaton.generate_oidc_session_token(
state=state,
@@ -1020,7 +1063,7 @@ class OidcProvider:
scope=self._scopes,
state=state,
nonce=nonce,
**extra_grant_values,
**additional_authorization_parameters,
)
async def handle_oidc_callback(
@@ -1583,7 +1626,7 @@ class JinjaOidcMappingProvider(OidcMappingProvider[JinjaOidcMappingConfig]):
This is the default mapping provider.
"""
def __init__(self, config: JinjaOidcMappingConfig):
def __init__(self, config: JinjaOidcMappingConfig, module_api: ModuleApi):
self._config = config
@staticmethod

View File

@@ -493,9 +493,9 @@ class WorkerPresenceHandler(BasePresenceHandler):
# The number of ongoing syncs on this process, by (user ID, device ID).
# Empty if _presence_enabled is false.
self._user_device_to_num_current_syncs: Dict[
Tuple[str, Optional[str]], int
] = {}
self._user_device_to_num_current_syncs: Dict[Tuple[str, Optional[str]], int] = (
{}
)
self.notifier = hs.get_notifier()
self.instance_id = hs.get_instance_id()
@@ -818,9 +818,9 @@ class PresenceHandler(BasePresenceHandler):
# Keeps track of the number of *ongoing* syncs on this process. While
# this is non zero a user will never go offline.
self._user_device_to_num_current_syncs: Dict[
Tuple[str, Optional[str]], int
] = {}
self._user_device_to_num_current_syncs: Dict[Tuple[str, Optional[str]], int] = (
{}
)
# Keeps track of the number of *ongoing* syncs on other processes.
#

View File

@@ -320,9 +320,9 @@ class ProfileHandler:
server_name = host
if self._is_mine_server_name(server_name):
media_info: Optional[
Union[LocalMedia, RemoteMedia]
] = await self.store.get_local_media(media_id)
media_info: Optional[Union[LocalMedia, RemoteMedia]] = (
await self.store.get_local_media(media_id)
)
else:
media_info = await self.store.get_cached_remote_media(server_name, media_id)

View File

@@ -55,12 +55,12 @@ class ReadMarkerHandler:
should_update = True
# Get event ordering, this also ensures we know about the event
event_ordering = await self.store.get_event_ordering(event_id)
event_ordering = await self.store.get_event_ordering(event_id, room_id)
if existing_read_marker:
try:
old_event_ordering = await self.store.get_event_ordering(
existing_read_marker["event_id"]
existing_read_marker["event_id"], room_id
)
except SynapseError:
# Old event no longer exists, assume new is ahead. This may

View File

@@ -188,13 +188,13 @@ class RelationsHandler:
if include_original_event:
# Do not bundle aggregations when retrieving the original event because
# we want the content before relations are applied to it.
return_value[
"original_event"
] = await self._event_serializer.serialize_event(
event,
now,
bundle_aggregations=None,
config=serialize_options,
return_value["original_event"] = (
await self._event_serializer.serialize_event(
event,
now,
bundle_aggregations=None,
config=serialize_options,
)
)
if next_token:

View File

@@ -151,7 +151,7 @@ class RoomCreationHandler:
"history_visibility": HistoryVisibility.SHARED,
"original_invitees_have_ops": False,
"guest_can_join": False,
"power_level_content_override": {},
"power_level_content_override": {EventTypes.CallInvite: 50},
},
}
@@ -538,10 +538,10 @@ class RoomCreationHandler:
# deep-copy the power-levels event before we start modifying it
# note that if frozen_dicts are enabled, `power_levels` will be a frozen
# dict so we can't just copy.deepcopy it.
initial_state[
(EventTypes.PowerLevels, "")
] = power_levels = copy_and_fixup_power_levels_contents(
initial_state[(EventTypes.PowerLevels, "")]
initial_state[(EventTypes.PowerLevels, "")] = power_levels = (
copy_and_fixup_power_levels_contents(
initial_state[(EventTypes.PowerLevels, "")]
)
)
# Resolve the minimum power level required to send any state event
@@ -956,6 +956,7 @@ class RoomCreationHandler:
room_alias=room_alias,
power_level_content_override=power_level_content_override,
creator_join_profile=creator_join_profile,
ignore_forced_encryption=ignore_forced_encryption,
)
# we avoid dropping the lock between invites, as otherwise joins can
@@ -1362,9 +1363,11 @@ class RoomCreationHandler:
visibility = room_config.get("visibility", "private")
preset_name = room_config.get(
"preset",
RoomCreationPreset.PRIVATE_CHAT
if visibility == "private"
else RoomCreationPreset.PUBLIC_CHAT,
(
RoomCreationPreset.PRIVATE_CHAT
if visibility == "private"
else RoomCreationPreset.PUBLIC_CHAT
),
)
try:
preset_config = self._presets_dict[preset_name]

View File

@@ -51,6 +51,7 @@ from synapse.handlers.worker_lock import NEW_EVENT_DURING_PURGE_LOCK_NAME
from synapse.logging import opentracing
from synapse.metrics import event_processing_positions
from synapse.metrics.background_process_metrics import run_as_background_process
from synapse.replication.http.push import ReplicationCopyPusherRestServlet
from synapse.storage.databases.main.state_deltas import StateDelta
from synapse.types import (
JsonDict,
@@ -181,6 +182,12 @@ class RoomMemberHandler(metaclass=abc.ABCMeta):
hs.config.server.forgotten_room_retention_period
)
self._is_push_writer = (
hs.get_instance_name() in hs.config.worker.writers.push_rules
)
self._push_writer = hs.config.worker.writers.push_rules[0]
self._copy_push_client = ReplicationCopyPusherRestServlet.make_client(hs)
def _on_user_joined_room(self, event_id: str, room_id: str) -> None:
"""Notify the rate limiter that a room join has occurred.
@@ -1236,11 +1243,11 @@ class RoomMemberHandler(metaclass=abc.ABCMeta):
# If this is going to be a local join, additional information must
# be included in the event content in order to efficiently validate
# the event.
content[
EventContentFields.AUTHORISING_USER
] = await self.event_auth_handler.get_user_which_could_invite(
room_id,
state_before_join,
content[EventContentFields.AUTHORISING_USER] = (
await self.event_auth_handler.get_user_which_could_invite(
room_id,
state_before_join,
)
)
return False, []
@@ -1301,9 +1308,17 @@ class RoomMemberHandler(metaclass=abc.ABCMeta):
old_room_id, new_room_id, user_id
)
# Copy over push rules
await self.store.copy_push_rules_from_room_to_room_for_user(
old_room_id, new_room_id, user_id
)
if self._is_push_writer:
await self.store.copy_push_rules_from_room_to_room_for_user(
old_room_id, new_room_id, user_id
)
else:
await self._copy_push_client(
instance_name=self._push_writer,
user_id=user_id,
old_room_id=old_room_id,
new_room_id=new_room_id,
)
except Exception:
logger.exception(
"Error copying tags and/or push rules from rooms %s to %s for user %s. "

View File

@@ -150,7 +150,7 @@ class UserAttributes:
display_name: Optional[str] = None
picture: Optional[str] = None
# mypy thinks these are incompatible for some reason.
emails: StrCollection = attr.Factory(list) # type: ignore[assignment]
emails: StrCollection = attr.Factory(list)
@attr.s(slots=True, auto_attribs=True)

View File

@@ -41,6 +41,7 @@ from synapse.api.constants import (
AccountDataTypes,
EventContentFields,
EventTypes,
JoinRules,
Membership,
)
from synapse.api.filtering import FilterCollection
@@ -675,13 +676,22 @@ class SyncHandler:
)
)
loaded_recents = await filter_events_for_client(
filtered_recents = await filter_events_for_client(
self._storage_controllers,
sync_config.user.to_string(),
loaded_recents,
always_include_ids=current_state_ids,
)
loaded_recents = []
for event in filtered_recents:
if event.type == EventTypes.CallInvite:
room_info = await self.store.get_room_with_stats(event.room_id)
assert room_info is not None
if room_info.join_rules == JoinRules.PUBLIC:
continue
loaded_recents.append(event)
log_kv({"loaded_recents_after_client_filtering": len(loaded_recents)})
loaded_recents.extend(recents)
@@ -943,7 +953,7 @@ class SyncHandler:
batch: TimelineBatch,
sync_config: SyncConfig,
since_token: Optional[StreamToken],
now_token: StreamToken,
end_token: StreamToken,
full_state: bool,
) -> MutableStateMap[EventBase]:
"""Works out the difference in state between the end of the previous sync and
@@ -954,7 +964,9 @@ class SyncHandler:
batch: The timeline batch for the room that will be sent to the user.
sync_config:
since_token: Token of the end of the previous batch. May be `None`.
now_token: Token of the end of the current batch.
end_token: Token of the end of the current batch. Normally this will be
the same as the global "now_token", but if the user has left the room,
the point just after their leave event.
full_state: Whether to force returning the full state.
`lazy_load_members` still applies when `full_state` is `True`.
@@ -1014,30 +1026,6 @@ class SyncHandler:
if event.is_state():
timeline_state[(event.type, event.state_key)] = event.event_id
if full_state:
# always make sure we LL ourselves so we know we're in the room
# (if we are) to fix https://github.com/vector-im/riot-web/issues/7209
# We only need to apply this on full state syncs given we disabled
# LL for incr syncs in https://github.com/matrix-org/synapse/pull/3840.
# We don't insert ourselves into `members_to_fetch`, because in some
# rare cases (an empty event batch with a now_token after the user's
# leave in a partial state room which another local user has
# joined), the room state will be missing our membership and there
# is no guarantee that our membership will be in the auth events of
# timeline events when the room is partial stated.
state_filter = StateFilter.from_lazy_load_member_list(
members_to_fetch.union((sync_config.user.to_string(),))
)
else:
state_filter = StateFilter.from_lazy_load_member_list(
members_to_fetch
)
# We are happy to use partial state to compute the `/sync` response.
# Since partial state may not include the lazy-loaded memberships we
# require, we fix up the state response afterwards with memberships from
# auth events.
await_full_state = False
else:
timeline_state = {
(event.type, event.state_key): event.event_id
@@ -1045,9 +1033,6 @@ class SyncHandler:
if event.is_state()
}
state_filter = StateFilter.all()
await_full_state = True
# Now calculate the state to return in the sync response for the room.
# This is more or less the change in state between the end of the previous
# sync's timeline and the start of the current sync's timeline.
@@ -1057,132 +1042,29 @@ class SyncHandler:
# whether the room is partial stated *before* fetching it.
is_partial_state_room = await self.store.is_partial_state_room(room_id)
if full_state:
if batch:
state_at_timeline_end = (
await self._state_storage_controller.get_state_ids_for_event(
batch.events[-1].event_id,
state_filter=state_filter,
await_full_state=await_full_state,
)
)
state_at_timeline_start = (
await self._state_storage_controller.get_state_ids_for_event(
batch.events[0].event_id,
state_filter=state_filter,
await_full_state=await_full_state,
)
)
else:
state_at_timeline_end = await self.get_state_at(
room_id,
stream_position=now_token,
state_filter=state_filter,
await_full_state=await_full_state,
)
state_at_timeline_start = state_at_timeline_end
state_ids = _calculate_state(
timeline_contains=timeline_state,
timeline_start=state_at_timeline_start,
timeline_end=state_at_timeline_end,
previous_timeline_end={},
lazy_load_members=lazy_load_members,
state_ids = await self._compute_state_delta_for_full_sync(
room_id,
sync_config.user,
batch,
end_token,
members_to_fetch,
timeline_state,
)
elif batch.limited:
if batch:
state_at_timeline_start = (
await self._state_storage_controller.get_state_ids_for_event(
batch.events[0].event_id,
state_filter=state_filter,
await_full_state=await_full_state,
)
)
else:
# We can get here if the user has ignored the senders of all
# the recent events.
state_at_timeline_start = await self.get_state_at(
room_id,
stream_position=now_token,
state_filter=state_filter,
await_full_state=await_full_state,
)
# for now, we disable LL for gappy syncs - see
# https://github.com/vector-im/riot-web/issues/7211#issuecomment-419976346
# N.B. this slows down incr syncs as we are now processing way
# more state in the server than if we were LLing.
#
# We still have to filter timeline_start to LL entries (above) in order
# for _calculate_state's LL logic to work, as we have to include LL
# members for timeline senders in case they weren't loaded in the initial
# sync. We do this (counterintuitively) by filtering timeline_start
# members to just be ones which were timeline senders, which then ensures
# all of the rest get included in the state block (if we need to know
# about them).
state_filter = StateFilter.all()
else:
# If this is an initial sync then full_state should be set, and
# that case is handled above. We assert here to ensure that this
# is indeed the case.
assert since_token is not None
state_at_previous_sync = await self.get_state_at(
state_ids = await self._compute_state_delta_for_incremental_sync(
room_id,
stream_position=since_token,
state_filter=state_filter,
await_full_state=await_full_state,
batch,
since_token,
end_token,
members_to_fetch,
timeline_state,
)
if batch:
state_at_timeline_end = (
await self._state_storage_controller.get_state_ids_for_event(
batch.events[-1].event_id,
state_filter=state_filter,
await_full_state=await_full_state,
)
)
else:
# We can get here if the user has ignored the senders of all
# the recent events.
state_at_timeline_end = await self.get_state_at(
room_id,
stream_position=now_token,
state_filter=state_filter,
await_full_state=await_full_state,
)
state_ids = _calculate_state(
timeline_contains=timeline_state,
timeline_start=state_at_timeline_start,
timeline_end=state_at_timeline_end,
previous_timeline_end=state_at_previous_sync,
# we have to include LL members in case LL initial sync missed them
lazy_load_members=lazy_load_members,
)
else:
state_ids = {}
if lazy_load_members:
if members_to_fetch and batch.events:
# We're returning an incremental sync, with no
# "gap" since the previous sync, so normally there would be
# no state to return.
# But we're lazy-loading, so the client might need some more
# member events to understand the events in this timeline.
# So we fish out all the member events corresponding to the
# timeline here, and then dedupe any redundant ones below.
state_ids = await self._state_storage_controller.get_state_ids_for_event(
batch.events[0].event_id,
# we only want members!
state_filter=StateFilter.from_types(
(EventTypes.Member, member)
for member in members_to_fetch
),
await_full_state=False,
)
# If we only have partial state for the room, `state_ids` may be missing the
# memberships we wanted. We attempt to find some by digging through the auth
# events of timeline events.
@@ -1245,6 +1127,240 @@ class SyncHandler:
if e.type != EventTypes.Aliases # until MSC2261 or alternative solution
}
async def _compute_state_delta_for_full_sync(
self,
room_id: str,
syncing_user: UserID,
batch: TimelineBatch,
end_token: StreamToken,
members_to_fetch: Optional[Set[str]],
timeline_state: StateMap[str],
) -> StateMap[str]:
"""Calculate the state events to be included in a full sync response.
As with `_compute_state_delta_for_incremental_sync`, the result will include
the membership events for the senders of each event in `members_to_fetch`.
Args:
room_id: The room we are calculating for.
syncing_user: The user that is calling `/sync`.
batch: The timeline batch for the room that will be sent to the user.
end_token: Token of the end of the current batch. Normally this will be
the same as the global "now_token", but if the user has left the room,
the point just after their leave event.
members_to_fetch: If lazy-loading is enabled, the memberships needed for
events in the timeline.
timeline_state: The contribution to the room state from state events in
`batch`. Only contains the last event for any given state key.
Returns:
A map from (type, state_key) to event_id, for each event that we believe
should be included in the `state` part of the sync response.
"""
if members_to_fetch is not None:
# Lazy-loading of membership events is enabled.
#
# Always make sure we load our own membership event so we know if
# we're in the room, to fix https://github.com/vector-im/riot-web/issues/7209.
#
# We only need to apply this on full state syncs given we disabled
# LL for incr syncs in https://github.com/matrix-org/synapse/pull/3840.
#
# We don't insert ourselves into `members_to_fetch`, because in some
# rare cases (an empty event batch with a now_token after the user's
# leave in a partial state room which another local user has
# joined), the room state will be missing our membership and there
# is no guarantee that our membership will be in the auth events of
# timeline events when the room is partial stated.
state_filter = StateFilter.from_lazy_load_member_list(
members_to_fetch.union((syncing_user.to_string(),))
)
# We are happy to use partial state to compute the `/sync` response.
# Since partial state may not include the lazy-loaded memberships we
# require, we fix up the state response afterwards with memberships from
# auth events.
await_full_state = False
lazy_load_members = True
else:
state_filter = StateFilter.all()
await_full_state = True
lazy_load_members = False
state_at_timeline_end = await self.get_state_at(
room_id,
stream_position=end_token,
state_filter=state_filter,
await_full_state=await_full_state,
)
if batch:
# Strictly speaking, this returns the state *after* the first event in the
# timeline, but that is good enough here.
state_at_timeline_start = (
await self._state_storage_controller.get_state_ids_for_event(
batch.events[0].event_id,
state_filter=state_filter,
await_full_state=await_full_state,
)
)
else:
state_at_timeline_start = state_at_timeline_end
state_ids = _calculate_state(
timeline_contains=timeline_state,
timeline_start=state_at_timeline_start,
timeline_end=state_at_timeline_end,
previous_timeline_end={},
lazy_load_members=lazy_load_members,
)
return state_ids
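On a full sync there is no previous sync to diff against, so `previous_timeline_end` is passed as an empty map and the `_calculate_state` call above effectively reduces to set arithmetic over event IDs. A minimal illustration of that reduction with made-up event IDs (the real helper also applies the lazy-loading membership handling):

# Illustration only: these maps use hypothetical event IDs.
timeline_start = {
    ("m.room.create", ""): "$create",
    ("m.room.member", "@alice:example.org"): "$alice_join",
}
timeline_end = {**timeline_start, ("m.room.topic", ""): "$topic"}
timeline_contains = {("m.room.topic", ""): "$topic"}  # state events already in the timeline

state_section = (
    set(timeline_end.values()) | set(timeline_start.values())
) - set()  # previous_timeline_end is empty on a full sync
state_section -= set(timeline_contains.values())
# -> {"$create", "$alice_join"}: the state needed to interpret the timeline,
#    minus anything the client already receives inside the timeline itself.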
async def _compute_state_delta_for_incremental_sync(
self,
room_id: str,
batch: TimelineBatch,
since_token: StreamToken,
end_token: StreamToken,
members_to_fetch: Optional[Set[str]],
timeline_state: StateMap[str],
) -> StateMap[str]:
"""Calculate the state events to be included in an incremental sync response.
If lazy-loading of membership events is enabled (as indicated by
`members_to_fetch` being not-`None`), the result will include the membership
events for each member in `members_to_fetch`. The caller
(`compute_state_delta`) is responsible for keeping track of which membership
events we have already sent to the client, and hence ripping them out.
Args:
room_id: The room we are calculating for.
batch: The timeline batch for the room that will be sent to the user.
since_token: Token of the end of the previous batch.
end_token: Token of the end of the current batch. Normally this will be
the same as the global "now_token", but if the user has left the room,
the point just after their leave event.
members_to_fetch: If lazy-loading is enabled, the memberships needed for
events in the timeline. Otherwise, `None`.
timeline_state: The contribution to the room state from state events in
`batch`. Only contains the last event for any given state key.
Returns:
A map from (type, state_key) to event_id, for each event that we believe
should be included in the `state` part of the sync response.
"""
if members_to_fetch is not None:
# Lazy-loading is enabled. Only return the state that is needed.
state_filter = StateFilter.from_lazy_load_member_list(members_to_fetch)
await_full_state = False
lazy_load_members = True
else:
state_filter = StateFilter.all()
await_full_state = True
lazy_load_members = False
# For a non-gappy sync if the events in the timeline are simply a linear
# chain (i.e. no merging/branching of the graph), then we know the state
# delta between the end of the previous sync and start of the new one is
# empty.
#
# c.f. #16941 for an example of why we can't do this for all non-gappy
# syncs.
is_linear_timeline = True
if batch.events:
# We need to make sure the first event in our batch points to the
# last event in the previous batch.
last_event_id_prev_batch = (
await self.store.get_last_event_in_room_before_stream_ordering(
room_id,
end_token=since_token.room_key,
)
)
prev_event_id = last_event_id_prev_batch
for e in batch.events:
if e.prev_event_ids() != [prev_event_id]:
is_linear_timeline = False
break
prev_event_id = e.event_id
if is_linear_timeline and not batch.limited:
state_ids: StateMap[str] = {}
if lazy_load_members:
if members_to_fetch and batch.events:
# We're lazy-loading, so the client might need some more
# member events to understand the events in this timeline.
# So we fish out all the member events corresponding to the
# timeline here. The caller will then dedupe any redundant
# ones.
state_ids = await self._state_storage_controller.get_state_ids_for_event(
batch.events[0].event_id,
# we only want members!
state_filter=StateFilter.from_types(
(EventTypes.Member, member) for member in members_to_fetch
),
await_full_state=False,
)
return state_ids
if batch:
state_at_timeline_start = (
await self._state_storage_controller.get_state_ids_for_event(
batch.events[0].event_id,
state_filter=state_filter,
await_full_state=await_full_state,
)
)
else:
# We can get here if the user has ignored the senders of all
# the recent events.
state_at_timeline_start = await self.get_state_at(
room_id,
stream_position=end_token,
state_filter=state_filter,
await_full_state=await_full_state,
)
if batch.limited:
# for now, we disable LL for gappy syncs - see
# https://github.com/vector-im/riot-web/issues/7211#issuecomment-419976346
# N.B. this slows down incr syncs as we are now processing way
# more state in the server than if we were LLing.
#
# We still have to filter timeline_start to LL entries (above) in order
# for _calculate_state's LL logic to work, as we have to include LL
# members for timeline senders in case they weren't loaded in the initial
# sync. We do this (counterintuitively) by filtering timeline_start
# members to just be ones which were timeline senders, which then ensures
# all of the rest get included in the state block (if we need to know
# about them).
state_filter = StateFilter.all()
state_at_previous_sync = await self.get_state_at(
room_id,
stream_position=since_token,
state_filter=state_filter,
await_full_state=await_full_state,
)
state_at_timeline_end = await self.get_state_at(
room_id,
stream_position=end_token,
state_filter=state_filter,
await_full_state=await_full_state,
)
state_ids = _calculate_state(
timeline_contains=timeline_state,
timeline_start=state_at_timeline_start,
timeline_end=state_at_timeline_end,
previous_timeline_end=state_at_previous_sync,
lazy_load_members=lazy_load_members,
)
return state_ids
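The linear-timeline optimisation above amounts to checking that every event's only prev_event is the event immediately before it, starting from the last event of the previous batch; any branch or gap forces the full state calculation. A hedged sketch of that walk, using a hypothetical Event stand-in rather than Synapse's real event class:

from dataclasses import dataclass
from typing import List

@dataclass
class Event:
    event_id: str
    prev_event_ids: List[str]

def is_linear(events: List[Event], last_event_id_prev_batch: str) -> bool:
    """True if the batch is a single unbranched chain extending the previous batch."""
    prev = last_event_id_prev_batch
    for e in events:
        if e.prev_event_ids != [prev]:
            return False  # a fork, merge or gap: the state delta may not be empty
        prev = e.event_id
    return True

# A straight chain $B -> $C extending the previous batch's last event $A:
assert is_linear([Event("$B", ["$A"]), Event("$C", ["$B"])], "$A")
# An event with two prev_events marks a merge point, so the shortcut is not taken:
assert not is_linear([Event("$D", ["$A", "$X"])], "$A")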
async def _find_missing_partial_state_memberships(
self,
room_id: str,
@@ -1333,9 +1449,9 @@ class SyncHandler:
and auth_event.state_key == member
):
missing_members.discard(member)
additional_state_ids[
(EventTypes.Member, member)
] = auth_event.event_id
additional_state_ids[(EventTypes.Member, member)] = (
auth_event.event_id
)
break
if missing_members:
@@ -2243,6 +2359,7 @@ class SyncHandler:
full_state=False,
since_token=since_token,
upto_token=leave_token,
end_token=leave_token,
out_of_band=leave_event.internal_metadata.is_out_of_band_membership(),
)
)
@@ -2280,6 +2397,7 @@ class SyncHandler:
full_state=False,
since_token=None if newly_joined else since_token,
upto_token=prev_batch_token,
end_token=now_token,
)
else:
entry = RoomSyncResultBuilder(
@@ -2290,6 +2408,7 @@ class SyncHandler:
full_state=False,
since_token=since_token,
upto_token=since_token,
end_token=now_token,
)
room_entries.append(entry)
@@ -2348,6 +2467,7 @@ class SyncHandler:
full_state=True,
since_token=since_token,
upto_token=now_token,
end_token=now_token,
)
)
elif event.membership == Membership.INVITE:
@@ -2377,6 +2497,7 @@ class SyncHandler:
full_state=True,
since_token=since_token,
upto_token=leave_token,
end_token=leave_token,
)
)
@@ -2447,6 +2568,7 @@ class SyncHandler:
{
"since_token": since_token,
"upto_token": upto_token,
"end_token": room_builder.end_token,
}
)
@@ -2520,7 +2642,7 @@ class SyncHandler:
batch,
sync_config,
since_token,
now_token,
room_builder.end_token,
full_state=full_state,
)
else:
@@ -2680,6 +2802,61 @@ def _calculate_state(
e for t, e in timeline_start.items() if t[0] == EventTypes.Member
)
# Naively, we would just return the difference between the state at the start
# of the timeline (`timeline_start_ids`) and that at the end of the previous sync
# (`previous_timeline_end_ids`). However, that fails in the presence of forks in
# the DAG.
#
# For example, consider a DAG such as the following:
#
# E1
# ↗ ↖
# | S2
# | ↑
# --|------|----
# | |
# E3 |
# ↖ /
# E4
#
# ... and a filter that means we only return 2 events, represented by the dashed
# horizontal line. Assuming S2 was *not* included in the previous sync, we need to
# include it in the `state` section.
#
# Note that the state at the start of the timeline (E3) does not include S2. So,
# to make sure it gets included in the calculation here, we actually look at
# the state at the *end* of the timeline, and subtract any events that are present
# in the timeline.
#
# ----------
#
# Aside 1: You may then wonder if we need to include `timeline_start` in the
# calculation. Consider a linear DAG:
#
# E1
# ↑
# S2
# ↑
# ----|------
# |
# E3
# ↑
# S4
# ↑
# E5
#
# ... where S2 and S4 change the same piece of state; and where we have a filter
# that returns 3 events (E3, S4, E5). We still need to tell the client about S2,
# because it might affect the display of E3. However, the state at the end of the
# timeline only tells us about S4; if we don't inspect `timeline_start` we won't
# find out about S2.
#
# (There are yet more complicated cases in which a state event is excluded from the
# timeline, but whose effect actually lands in the DAG in the *middle* of the
# timeline. We have no way to represent that in the /sync response, and we don't
# even try; it is either omitted or plonked into `state` as if it were at the start
# of the timeline, depending on what else is in the timeline.)
state_ids = (
(timeline_end_ids | timeline_start_ids)
- previous_timeline_end_ids
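A worked example of the fork case above, using hypothetical event IDs (this illustrates the set arithmetic only; the real code works over (type, state_key) → event_id maps and also handles lazy-loading):

# Suppose S2 sets the room topic on a fork that merges back in at E4.
previous_timeline_end = {}                     # the last sync knew nothing of S2
timeline_start = {}                            # the state before E3 also misses S2
timeline_end = {("m.room.topic", ""): "$S2"}   # the state after E4 has merged the fork
timeline_contains = {}                         # E3 and E4 are ordinary (non-state) events

state = (
    set(timeline_end.values()) | set(timeline_start.values())
) - set(previous_timeline_end.values()) - set(timeline_contains.values())
assert state == {"$S2"}  # diffing timeline_start against the previous sync alone misses S2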
@@ -2746,7 +2923,7 @@ class SyncResultBuilder:
if self.since_token:
for joined_sync in self.joined:
it = itertools.chain(
joined_sync.timeline.events, joined_sync.state.values()
joined_sync.state.values(), joined_sync.timeline.events
)
for event in it:
if event.type == EventTypes.Member:
@@ -2758,13 +2935,20 @@ class SyncResultBuilder:
newly_joined_or_invited_or_knocked_users.add(
event.state_key
)
# If the user left and rejoined in the same batch, they
# count as a newly-joined user, *not* a newly-left user.
newly_left_users.discard(event.state_key)
else:
prev_content = event.unsigned.get("prev_content", {})
prev_membership = prev_content.get("membership", None)
if prev_membership == Membership.JOIN:
newly_left_users.add(event.state_key)
# If the user joined and left in the same batch, they
# count as a newly-left user, not a newly-joined user.
newly_joined_or_invited_or_knocked_users.discard(
event.state_key
)
newly_left_users -= newly_joined_or_invited_or_knocked_users
return newly_joined_or_invited_or_knocked_users, newly_left_users
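The join/leave bookkeeping above means the last membership transition in the batch wins. A simplified, hypothetical illustration of that rule (it ignores the prev_content check and collapses invite/knock into the join case):

def classify(transitions):
    """transitions: list of (user_id, membership) pairs in timeline order."""
    newly_joined, newly_left = set(), set()
    for user, membership in transitions:
        if membership in ("join", "invite", "knock"):
            newly_joined.add(user)
            newly_left.discard(user)    # left-then-rejoined counts as newly joined
        elif membership == "leave":
            newly_left.add(user)
            newly_joined.discard(user)  # joined-then-left counts as newly left
    return newly_joined, newly_left - newly_joined

assert classify([("@bob:example.org", "leave"), ("@bob:example.org", "join")]) == (
    {"@bob:example.org"},
    set(),
)
assert classify([("@bob:example.org", "join"), ("@bob:example.org", "leave")]) == (
    set(),
    {"@bob:example.org"},
)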
@@ -2775,13 +2959,30 @@ class RoomSyncResultBuilder:
Attributes:
room_id
rtype: One of `"joined"` or `"archived"`
events: List of events to include in the room (more events may be added
when generating result).
newly_joined: If the user has newly joined the room
full_state: Whether the full state should be sent in result
since_token: Earliest point to return events from, or None
upto_token: Latest point to return events from.
upto_token: Latest point to return events from. If `events` is populated,
this is set to the token at the start of `events`
end_token: The last point in the timeline that the client should see events
from. Normally this will be the same as the global `now_token`, but in
the case of rooms where the user has left the room, this will be the point
just after their leave event.
This is used in the calculation of the state which is returned in `state`:
any state changes *up to* `end_token` (and not beyond!) which are not
reflected in the timeline need to be returned in `state`.
out_of_band: whether the events in the room are "out of band" events
and the server isn't in the room.
"""
@@ -2793,5 +2994,5 @@ class RoomSyncResultBuilder:
full_state: bool
since_token: Optional[StreamToken]
upto_token: StreamToken
end_token: StreamToken
out_of_band: bool = False

View File

@@ -182,12 +182,15 @@ class WorkerLocksHandler:
if not locks:
return
def _wake_deferred(deferred: defer.Deferred) -> None:
if not deferred.called:
deferred.callback(None)
def _wake_all_locks(
locks: Collection[Union[WaitingLock, WaitingMultiLock]]
) -> None:
for lock in locks:
deferred = lock.deferred
if not deferred.called:
deferred.callback(None)
for lock in locks:
self._clock.call_later(0, _wake_deferred, lock.deferred)
self._clock.call_later(0, _wake_all_locks, locks)
@wrap_as_background_process("_cleanup_locks")
async def _cleanup_locks(self) -> None:

View File

@@ -390,6 +390,13 @@ class BaseHttpClient:
cooperator=self._cooperator,
)
# Always make sure we add a user agent to the request
if headers is None:
headers = Headers()
if not headers.hasHeader("User-Agent"):
headers.addRawHeader("User-Agent", self.user_agent)
request_deferred: defer.Deferred = treq.request(
method,
uri,

View File

@@ -931,8 +931,7 @@ class MatrixFederationHttpClient:
try_trailing_slash_on_400: bool = False,
parser: Literal[None] = None,
backoff_on_all_error_codes: bool = False,
) -> JsonDict:
...
) -> JsonDict: ...
@overload
async def put_json(
@@ -949,8 +948,7 @@ class MatrixFederationHttpClient:
try_trailing_slash_on_400: bool = False,
parser: Optional[ByteParser[T]] = None,
backoff_on_all_error_codes: bool = False,
) -> T:
...
) -> T: ...
async def put_json(
self,
@@ -1140,8 +1138,7 @@ class MatrixFederationHttpClient:
ignore_backoff: bool = False,
try_trailing_slash_on_400: bool = False,
parser: Literal[None] = None,
) -> JsonDict:
...
) -> JsonDict: ...
@overload
async def get_json(
@@ -1154,8 +1151,7 @@ class MatrixFederationHttpClient:
ignore_backoff: bool = ...,
try_trailing_slash_on_400: bool = ...,
parser: ByteParser[T] = ...,
) -> T:
...
) -> T: ...
async def get_json(
self,
@@ -1236,8 +1232,7 @@ class MatrixFederationHttpClient:
ignore_backoff: bool = False,
try_trailing_slash_on_400: bool = False,
parser: Literal[None] = None,
) -> Tuple[JsonDict, Dict[bytes, List[bytes]]]:
...
) -> Tuple[JsonDict, Dict[bytes, List[bytes]]]: ...
@overload
async def get_json_with_headers(
@@ -1250,8 +1245,7 @@ class MatrixFederationHttpClient:
ignore_backoff: bool = ...,
try_trailing_slash_on_400: bool = ...,
parser: ByteParser[T] = ...,
) -> Tuple[T, Dict[bytes, List[bytes]]]:
...
) -> Tuple[T, Dict[bytes, List[bytes]]]: ...
async def get_json_with_headers(
self,

View File

@@ -262,7 +262,8 @@ class _ProxyResponseBody(protocol.Protocol):
self._request.finish()
else:
# Abort the underlying request since our remote request also failed.
self._request.transport.abortConnection()
if self._request.channel:
self._request.channel.forceAbortClient()
class ProxySite(Site):

View File

@@ -153,9 +153,9 @@ def return_json_error(
# Only respond with an error response if we haven't already started writing,
# otherwise lets just kill the connection
if request.startedWriting:
if request.transport:
if request.channel:
try:
request.transport.abortConnection()
request.channel.forceAbortClient()
except Exception:
# abortConnection throws if the connection is already closed
pass
@@ -909,7 +909,18 @@ def set_cors_headers(request: "SynapseRequest") -> None:
request.setHeader(
b"Access-Control-Allow-Methods", b"GET, HEAD, POST, PUT, DELETE, OPTIONS"
)
if request.experimental_cors_msc3886:
if request.path is not None and request.path.startswith(
b"/_matrix/client/unstable/org.matrix.msc4108/rendezvous"
):
request.setHeader(
b"Access-Control-Allow-Headers",
b"Content-Type, If-Match, If-None-Match",
)
request.setHeader(
b"Access-Control-Expose-Headers",
b"Synapse-Trace-Id, Server, ETag",
)
elif request.experimental_cors_msc3886:
request.setHeader(
b"Access-Control-Allow-Headers",
b"X-Requested-With, Content-Type, Authorization, Date, If-Match, If-None-Match",

View File

@@ -19,9 +19,11 @@
#
#
""" This module contains base REST classes for constructing REST servlets. """
"""This module contains base REST classes for constructing REST servlets."""
import enum
import logging
import urllib.parse as urlparse
from http import HTTPStatus
from typing import (
TYPE_CHECKING,
@@ -61,24 +63,53 @@ logger = logging.getLogger(__name__)
@overload
def parse_integer(request: Request, name: str, default: int) -> int:
...
@overload
def parse_integer(request: Request, name: str, *, required: Literal[True]) -> int:
...
def parse_integer(request: Request, name: str, default: int) -> int: ...
@overload
def parse_integer(
request: Request, name: str, default: Optional[int] = None, required: bool = False
) -> Optional[int]:
...
request: Request, name: str, *, default: int, negative: bool
) -> int: ...
@overload
def parse_integer(
request: Request, name: str, *, default: int, negative: bool = False
) -> int: ...
@overload
def parse_integer(
request: Request, name: str, *, required: Literal[True], negative: bool = False
) -> int: ...
@overload
def parse_integer(
request: Request, name: str, *, default: Literal[None], negative: bool = False
) -> None: ...
@overload
def parse_integer(request: Request, name: str, *, negative: bool) -> Optional[int]: ...
@overload
def parse_integer(
request: Request,
name: str,
default: Optional[int] = None,
required: bool = False,
negative: bool = False,
) -> Optional[int]: ...
def parse_integer(
request: Request, name: str, default: Optional[int] = None, required: bool = False
request: Request,
name: str,
default: Optional[int] = None,
required: bool = False,
negative: bool = False,
) -> Optional[int]:
"""Parse an integer parameter from the request string
@@ -88,16 +119,17 @@ def parse_integer(
default: value to use if the parameter is absent, defaults to None.
required: whether to raise a 400 SynapseError if the parameter is absent,
defaults to False.
negative: whether to allow negative integers, defaults to False.
Returns:
An int value or the default.
Raises:
SynapseError: if the parameter is absent and required, or if the
parameter is present and not an integer.
SynapseError: if the parameter is absent and required, if the
parameter is present and not an integer, or if the
parameter is negative when negative values are not allowed.
"""
args: Mapping[bytes, Sequence[bytes]] = request.args # type: ignore
return parse_integer_from_args(args, name, default, required)
return parse_integer_from_args(args, name, default, required, negative)
@overload
@@ -105,8 +137,7 @@ def parse_integer_from_args(
args: Mapping[bytes, Sequence[bytes]],
name: str,
default: Optional[int] = None,
) -> Optional[int]:
...
) -> Optional[int]: ...
@overload
@@ -115,8 +146,7 @@ def parse_integer_from_args(
name: str,
*,
required: Literal[True],
) -> int:
...
) -> int: ...
@overload
@@ -125,8 +155,8 @@ def parse_integer_from_args(
name: str,
default: Optional[int] = None,
required: bool = False,
) -> Optional[int]:
...
negative: bool = False,
) -> Optional[int]: ...
def parse_integer_from_args(
@@ -134,6 +164,7 @@ def parse_integer_from_args(
name: str,
default: Optional[int] = None,
required: bool = False,
negative: bool = True,
) -> Optional[int]:
"""Parse an integer parameter from the request string
@@ -143,49 +174,50 @@ def parse_integer_from_args(
default: value to use if the parameter is absent, defaults to None.
required: whether to raise a 400 SynapseError if the parameter is absent,
defaults to False.
negative: whether to allow negative integers, defaults to True.
Returns:
An int value or the default.
Raises:
SynapseError: if the parameter is absent and required, or if the
parameter is present and not an integer.
SynapseError: if the parameter is absent and required, if the
parameter is present and not an integer, or if the
parameter is negative when negative values are not allowed.
"""
name_bytes = name.encode("ascii")
if name_bytes in args:
try:
return int(args[name_bytes][0])
except Exception:
message = "Query parameter %r must be an integer" % (name,)
raise SynapseError(
HTTPStatus.BAD_REQUEST, message, errcode=Codes.INVALID_PARAM
)
else:
if required:
message = "Missing integer query parameter %r" % (name,)
raise SynapseError(
HTTPStatus.BAD_REQUEST, message, errcode=Codes.MISSING_PARAM
)
else:
if name_bytes not in args:
if not required:
return default
message = f"Missing required integer query parameter {name}"
raise SynapseError(HTTPStatus.BAD_REQUEST, message, errcode=Codes.MISSING_PARAM)
@overload
def parse_boolean(request: Request, name: str, default: bool) -> bool:
...
try:
integer = int(args[name_bytes][0])
except Exception:
message = f"Query parameter {name} must be an integer"
raise SynapseError(HTTPStatus.BAD_REQUEST, message, errcode=Codes.INVALID_PARAM)
if not negative and integer < 0:
message = f"Query parameter {name} must be a non-negative integer."
raise SynapseError(HTTPStatus.BAD_REQUEST, message, errcode=Codes.INVALID_PARAM)
return integer
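A hedged illustration of the stricter integer parsing (the args dict and parameter names are made up; note that `parse_integer_from_args` still defaults to allowing negatives, while `parse_integer` does not):

args = {b"limit": [b"10"], b"offset": [b"-3"]}

parse_integer_from_args(args, "limit", default=5, negative=False)   # -> 10
parse_integer_from_args(args, "missing", default=5)                 # -> 5 (absent but not required)
parse_integer_from_args(args, "offset")                             # -> -3 (negatives allowed by default here)
parse_integer_from_args(args, "offset", negative=False)             # raises SynapseError (400, M_INVALID_PARAM)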
@overload
def parse_boolean(request: Request, name: str, *, required: Literal[True]) -> bool:
...
def parse_boolean(request: Request, name: str, default: bool) -> bool: ...
@overload
def parse_boolean(request: Request, name: str, *, required: Literal[True]) -> bool: ...
@overload
def parse_boolean(
request: Request, name: str, default: Optional[bool] = None, required: bool = False
) -> Optional[bool]:
...
) -> Optional[bool]: ...
def parse_boolean(
@@ -216,8 +248,7 @@ def parse_boolean_from_args(
args: Mapping[bytes, Sequence[bytes]],
name: str,
default: bool,
) -> bool:
...
) -> bool: ...
@overload
@@ -226,8 +257,7 @@ def parse_boolean_from_args(
name: str,
*,
required: Literal[True],
) -> bool:
...
) -> bool: ...
@overload
@@ -236,8 +266,7 @@ def parse_boolean_from_args(
name: str,
default: Optional[bool] = None,
required: bool = False,
) -> Optional[bool]:
...
) -> Optional[bool]: ...
def parse_boolean_from_args(
@@ -289,8 +318,7 @@ def parse_bytes_from_args(
args: Mapping[bytes, Sequence[bytes]],
name: str,
default: Optional[bytes] = None,
) -> Optional[bytes]:
...
) -> Optional[bytes]: ...
@overload
@@ -300,8 +328,7 @@ def parse_bytes_from_args(
default: Literal[None] = None,
*,
required: Literal[True],
) -> bytes:
...
) -> bytes: ...
@overload
@@ -310,8 +337,7 @@ def parse_bytes_from_args(
name: str,
default: Optional[bytes] = None,
required: bool = False,
) -> Optional[bytes]:
...
) -> Optional[bytes]: ...
def parse_bytes_from_args(
@@ -355,8 +381,7 @@ def parse_string(
*,
allowed_values: Optional[StrCollection] = None,
encoding: str = "ascii",
) -> str:
...
) -> str: ...
@overload
@@ -367,8 +392,7 @@ def parse_string(
required: Literal[True],
allowed_values: Optional[StrCollection] = None,
encoding: str = "ascii",
) -> str:
...
) -> str: ...
@overload
@@ -380,8 +404,7 @@ def parse_string(
required: bool = False,
allowed_values: Optional[StrCollection] = None,
encoding: str = "ascii",
) -> Optional[str]:
...
) -> Optional[str]: ...
def parse_string(
@@ -428,6 +451,87 @@ def parse_string(
)
def parse_json(
request: Request,
name: str,
default: Optional[dict] = None,
required: bool = False,
encoding: str = "ascii",
) -> Optional[JsonDict]:
"""
Parse a JSON parameter from the request query string.
Args:
request: the twisted HTTP request.
name: the name of the query parameter.
default: value to use if the parameter is absent,
defaults to None.
required: whether to raise a 400 SynapseError if the
parameter is absent, defaults to False.
encoding: The encoding to decode the string content with.
Returns:
A JSON value, or `default` if the named query parameter was not found
and `required` was False.
Raises:
SynapseError if the parameter is absent and required, or if the
parameter is present and not a JSON object.
"""
args: Mapping[bytes, Sequence[bytes]] = request.args # type: ignore
return parse_json_from_args(
args,
name,
default,
required=required,
encoding=encoding,
)
def parse_json_from_args(
args: Mapping[bytes, Sequence[bytes]],
name: str,
default: Optional[dict] = None,
required: bool = False,
encoding: str = "ascii",
) -> Optional[JsonDict]:
"""
Parse a JSON parameter from the request query string.
Args:
args: a mapping of request args as bytes to a list of bytes (e.g. request.args).
name: the name of the query parameter.
default: value to use if the parameter is absent,
defaults to None.
required: whether to raise a 400 SynapseError if the
parameter is absent, defaults to False.
encoding: the encoding to decode the string content with.
Returns:
A JSON value, or `default` if the named query parameter was not found
and `required` was False.
Raises:
SynapseError if the parameter is absent and required, or if the
parameter is present and not a JSON object.
"""
name_bytes = name.encode("ascii")
if name_bytes not in args:
if not required:
return default
message = f"Missing required JSON query parameter {name}"
raise SynapseError(HTTPStatus.BAD_REQUEST, message, errcode=Codes.MISSING_PARAM)
json_str = parse_string_from_args(args, name, required=True, encoding=encoding)
try:
return json_decoder.decode(urlparse.unquote(json_str))
except Exception:
message = f"Query parameter {name} must be a valid JSON object"
raise SynapseError(HTTPStatus.BAD_REQUEST, message, errcode=Codes.NOT_JSON)
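A hedged usage sketch for the new JSON query-parameter helpers (the parameter name and value are illustrative):

import urllib.parse

encoded = urllib.parse.quote('{"types": ["m.room.message"]}')
args = {b"filter": [encoded.encode("ascii")]}

parse_json_from_args(args, "filter")                 # -> {"types": ["m.room.message"]}
parse_json_from_args(args, "absent", default={})     # -> {}
parse_json_from_args(args, "absent", required=True)  # raises SynapseError (400, M_MISSING_PARAM)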
EnumT = TypeVar("EnumT", bound=enum.Enum)
@@ -437,8 +541,7 @@ def parse_enum(
name: str,
E: Type[EnumT],
default: EnumT,
) -> EnumT:
...
) -> EnumT: ...
@overload
@@ -448,8 +551,7 @@ def parse_enum(
E: Type[EnumT],
*,
required: Literal[True],
) -> EnumT:
...
) -> EnumT: ...
def parse_enum(
@@ -526,8 +628,7 @@ def parse_strings_from_args(
*,
allowed_values: Optional[StrCollection] = None,
encoding: str = "ascii",
) -> Optional[List[str]]:
...
) -> Optional[List[str]]: ...
@overload
@@ -538,8 +639,7 @@ def parse_strings_from_args(
*,
allowed_values: Optional[StrCollection] = None,
encoding: str = "ascii",
) -> List[str]:
...
) -> List[str]: ...
@overload
@@ -550,8 +650,7 @@ def parse_strings_from_args(
required: Literal[True],
allowed_values: Optional[StrCollection] = None,
encoding: str = "ascii",
) -> List[str]:
...
) -> List[str]: ...
@overload
@@ -563,8 +662,7 @@ def parse_strings_from_args(
required: bool = False,
allowed_values: Optional[StrCollection] = None,
encoding: str = "ascii",
) -> Optional[List[str]]:
...
) -> Optional[List[str]]: ...
def parse_strings_from_args(
@@ -625,8 +723,7 @@ def parse_string_from_args(
*,
allowed_values: Optional[StrCollection] = None,
encoding: str = "ascii",
) -> Optional[str]:
...
) -> Optional[str]: ...
@overload
@@ -638,8 +735,7 @@ def parse_string_from_args(
required: Literal[True],
allowed_values: Optional[StrCollection] = None,
encoding: str = "ascii",
) -> str:
...
) -> str: ...
@overload
@@ -650,8 +746,7 @@ def parse_string_from_args(
required: bool = False,
allowed_values: Optional[StrCollection] = None,
encoding: str = "ascii",
) -> Optional[str]:
...
) -> Optional[str]: ...
def parse_string_from_args(
@@ -704,22 +799,19 @@ def parse_string_from_args(
@overload
def parse_json_value_from_request(request: Request) -> JsonDict:
...
def parse_json_value_from_request(request: Request) -> JsonDict: ...
@overload
def parse_json_value_from_request(
request: Request, allow_empty_body: Literal[False]
) -> JsonDict:
...
) -> JsonDict: ...
@overload
def parse_json_value_from_request(
request: Request, allow_empty_body: bool = False
) -> Optional[JsonDict]:
...
) -> Optional[JsonDict]: ...
def parse_json_value_from_request(
@@ -847,7 +939,6 @@ def assert_params_in_dict(body: JsonDict, required: StrCollection) -> None:
class RestServlet:
"""A Synapse REST Servlet.
An implementing class can either provide its own custom 'register' method,

View File

@@ -150,7 +150,8 @@ class SynapseRequest(Request):
self.get_method(),
self.get_redacted_uri(),
)
self.transport.abortConnection()
if self.channel:
self.channel.forceAbortClient()
return
super().handleContentChunk(data)

View File

@@ -744,8 +744,7 @@ def preserve_fn(
@overload
def preserve_fn(f: Callable[P, R]) -> Callable[P, "defer.Deferred[R]"]:
...
def preserve_fn(f: Callable[P, R]) -> Callable[P, "defer.Deferred[R]"]: ...
def preserve_fn(
@@ -774,15 +773,10 @@ def run_in_background(
@overload
def run_in_background(
f: Callable[P, R], *args: P.args, **kwargs: P.kwargs
) -> "defer.Deferred[R]":
...
) -> "defer.Deferred[R]": ...
def run_in_background( # type: ignore[misc]
# The `type: ignore[misc]` above suppresses
# "Overloaded function implementation does not accept all possible arguments of signature 1"
# "Overloaded function implementation does not accept all possible arguments of signature 2"
# which seems like a bug in mypy.
def run_in_background(
f: Union[
Callable[P, R],
Callable[P, Awaitable[R]],

Some files were not shown because too many files have changed in this diff Show More