Mirror of https://github.com/go-gitea/gitea.git (synced 2025-12-19 02:20:37 +00:00)

Compare commits: 47 commits, v1.13.3...release/v1
| Author | SHA1 | Date |
|---|---|---|
| | cefcc7613b | |
| | 05f266c331 | |
| | bec60518e9 | |
| | 015d11d26d | |
| | 62fbca3af4 | |
| | 223dddb29e | |
| | ef2cb41dc3 | |
| | 9201068ff9 | |
| | bfd33088b4 | |
| | 711ca0c410 | |
| | 013639b13f | |
| | 558b0005ff | |
| | 0d7afb02c0 | |
| | 1a26f6c7ab | |
| | 1062931cf1 | |
| | 8d4f8ebf31 | |
| | 4f47bf5346 | |
| | 6dfa92bb1c | |
| | 151bedab52 | |
| | 6198403fbc | |
| | a6290f603f | |
| | 2f09e5775f | |
| | b0819efaea | |
| | d7a3bcdd70 | |
| | 7a85e228d8 | |
| | a461d90415 | |
| | 70e4134130 | |
| | 909f2be99d | |
| | 645c0d8abd | |
| | 8c461eb261 | |
| | fff66eb016 | |
| | c965ed6529 | |
| | 71a2adbf10 | |
| | 3231b70043 | |
| | e3c44923d7 | |
| | 3e7dccdf47 | |
| | 33c2c49627 | |
| | 05ac72cf33 | |
| | 906ecfd173 | |
| | 75496b9ff5 | |
| | 8dad47a94a | |
| | 8e792986bb | |
| | da80e90ac8 | |
| | 74dc22358b | |
| | 7d3e174906 | |
| | 8456700411 | |
| | 8a6acbbc12 | |
CHANGELOG.md (61 changed lines)
@@ -4,6 +4,65 @@ This changelog goes through all the changes that have been made in each release
 without substantial changes to our git log; to see the highlights of what has
 been added to each release, please refer to the [blog](https://blog.gitea.io).
+
+## [1.13.7](https://github.com/go-gitea/gitea/releases/tag/v1.13.7) - 2021-04-07
+
+* SECURITY
+  * Update to bluemonday-1.0.6 (#15294) (#15298)
+  * Clusterfuzz found another way (#15160) (#15169)
+* API
+  * Fix wrong user returned in API (#15139) (#15150)
+* BUGFIXES
+  * Add 'fonts' into 'KnownPublicEntries' (#15188) (#15317)
+  * Speed up `enry.IsVendor` (#15213) (#15246)
+  * Respond 404 for diff/patch of a commit that does not exist (#15221) (#15238)
+  * Prevent NPE in CommentMustAsDiff if no hunk header (#15199) (#15201)
+* MISC
+  * Add size to Save function (#15264) (#15271)
+
+## [1.13.6](https://github.com/go-gitea/gitea/releases/tag/v1.13.6) - 2021-03-23
+
+* SECURITY
+  * Fix bug on avatar middleware (#15124) (#15125)
+  * Fix another clusterfuzz identified issue (#15096) (#15114)
+* API
+  * Fix nil exception for get pull reviews API #15104 (#15106)
+* BUGFIXES
+  * Fix markdown rendering in milestone content (#15056) (#15092)
+
+## [1.13.5](https://github.com/go-gitea/gitea/releases/tag/v1.13.5) - 2021-03-21
+
+* SECURITY
+  * Update to goldmark 1.3.3 (#15059) (#15061)
+* API
+  * Fix set milestone on PR creation (#14981) (#15001)
+  * Prevent panic when editing forked repos by API (#14960) (#14963)
+* BUGFIXES
+  * Fix bug when uploading on web (#15042) (#15055)
+  * Delete Labels & IssueLabels on Repo Delete too (#15039) (#15051)
+  * Another clusterfuzz-spotted issue (#15032) (#15034)
+  * Fix postgres ID sequences broken by recreate-table (#15015) (#15029)
+  * Fix several render issues (#14986) (#15013)
+  * Make sure sibling images get a link too (#14979) (#14995)
+  * Fix anchor jumping with escaped query components (#14969) (#14977)
+  * Fix release mail html template (#14976)
+  * Fix excluding more than two labels on issues list (#14962) (#14973)
+  * Don't mark each comment poster as OP (#14971) (#14972)
+  * Add "captcha" to list of reserved usernames (#14930)
+  * Re-enable import local paths after reversion from #13610 (#14925) (#14927)
+
+## [1.13.4](https://github.com/go-gitea/gitea/releases/tag/v1.13.4) - 2021-03-07
+
+* SECURITY
+  * Fix issue popups (#14898) (#14899)
+* BUGFIXES
+  * Fix race in LFS ContentStore.Put(...) (#14895) (#14913)
+  * Fix a couple of issues with feeds (#14897) (#14903)
+  * When transferring a repository and the database transaction fails, roll back the renames (#14864) (#14902)
+  * Fix race in local storage (#14888) (#14901)
+  * Fix 500 on pull view page if user is not logged in (#14885) (#14886)
+* DOCS
+  * Fix how lfs data path is set (#14855) (#14884)
+
 ## [1.13.3](https://github.com/go-gitea/gitea/releases/tag/v1.13.3) - 2021-03-04
 
 * BREAKING
@@ -194,7 +253,7 @@ been added to each release, please refer to the [blog](https://blog.gitea.io).
 * Fix scrolling to resolved comment anchors (#13343) (#13371)
 * Storage configuration support `[storage]` (#13314) (#13379)
 * When creating line diffs do not split within an html entity (#13357) (#13375) (#13425) (#13427)
 * Fix reactions on code comments (#13390) (#13401)
 * Add missing full names when DEFAULT_SHOW_FULL_NAME is enabled (#13424)
 * Replies to outdated code comments should also be outdated (#13217) (#13433)
 * Fix panic bug in handling multiple references in commit (#13486) (#13487)
@@ -606,6 +606,22 @@ func runDoctorCheckDBConsistency(ctx *cli.Context) ([]string, error) {
 		}
 	}
 
+	// find IssueLabels without existing label
+	count, err = models.CountOrphanedIssueLabels()
+	if err != nil {
+		return nil, err
+	}
+	if count > 0 {
+		if ctx.Bool("fix") {
+			if err = models.DeleteOrphanedIssueLabels(); err != nil {
+				return nil, err
+			}
+			results = append(results, fmt.Sprintf("%d issue_labels without existing label deleted", count))
+		} else {
+			results = append(results, fmt.Sprintf("%d issue_labels without existing label", count))
+		}
+	}
+
 	//find issues without existing repository
 	count, err = models.CountOrphanedIssues()
 	if err != nil {
@@ -670,6 +686,23 @@ func runDoctorCheckDBConsistency(ctx *cli.Context) ([]string, error) {
 		}
 	}
 
+	if setting.Database.UsePostgreSQL {
+		count, err = models.CountBadSequences()
+		if err != nil {
+			return nil, err
+		}
+		if count > 0 {
+			if ctx.Bool("fix") {
+				err := models.FixBadSequences()
+				if err != nil {
+					return nil, err
+				}
+				results = append(results, fmt.Sprintf("%d sequences updated", count))
+			} else {
+				results = append(results, fmt.Sprintf("%d sequences with incorrect values", count))
+			}
+		}
+	}
 	//ToDo: function to recalc all counters
 
 	return results, nil
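Both hunks above follow the same doctor pattern: count the suspect rows first, and only delete or repair them when the user passed `--fix`, reporting the count either way. A minimal self-contained sketch of that count-then-fix flow (plain Go with in-memory maps standing in for gitea's models layer; `checkOrphanedIssueLabels` and its data shapes are illustrative, not gitea's API):

```go
package main

import "fmt"

// checkOrphanedIssueLabels mimics the doctor flow above: find issue_label
// rows whose label no longer exists, delete them only when fix is true,
// and report what was found either way.
func checkOrphanedIssueLabels(issueLabels map[int]int, labels map[int]bool, fix bool) ([]string, map[int]int) {
	var orphans []int
	for id, labelID := range issueLabels {
		if !labels[labelID] { // the referenced label row is gone
			orphans = append(orphans, id)
		}
	}
	var results []string
	if len(orphans) > 0 {
		if fix {
			for _, id := range orphans {
				delete(issueLabels, id)
			}
			results = append(results, fmt.Sprintf("%d issue_labels without existing label deleted", len(orphans)))
		} else {
			results = append(results, fmt.Sprintf("%d issue_labels without existing label", len(orphans)))
		}
	}
	return results, issueLabels
}

func main() {
	labels := map[int]bool{1: true}          // label 1 exists; label 2 was deleted
	issueLabels := map[int]int{10: 1, 11: 2} // issue_label 11 points at the missing label
	res, left := checkOrphanedIssueLabels(issueLabels, labels, true)
	fmt.Println(res[0], "| remaining:", len(left))
}
```

The real checks run the same branch structure against SQL counts (`CountOrphanedIssueLabels`, `CountBadSequences`), which is why each one appends a human-readable line to `results` rather than failing hard.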
@@ -276,7 +276,7 @@ Values containing `#` or `;` must be quoted using `` ` `` or `"""`.
 - `LANDING_PAGE`: **home**: Landing page for unauthenticated users \[home, explore, organizations, login\].
 
 - `LFS_START_SERVER`: **false**: Enables git-lfs support.
-- `LFS_CONTENT_PATH`: **%(APP_DATA_PATH)/lfs**: Default LFS content path. (if it is on local storage.)
+- `LFS_CONTENT_PATH`: **%(APP_DATA_PATH)/lfs**: DEPRECATED: Default LFS content path. (if it is on local storage.)
 - `LFS_JWT_SECRET`: **\<empty\>**: LFS authentication secret; change this to a unique string.
 - `LFS_HTTP_AUTH_EXPIRY`: **20m**: LFS authentication validity period as a time.Duration; pushes taking longer than this may fail.
 - `LFS_MAX_FILE_SIZE`: **0**: Maximum allowed LFS file size in bytes (set to 0 for no limit).

@@ -828,7 +828,7 @@ is `data/lfs` and the default of `MINIO_BASE_PATH` is `lfs/`.
 
 - `STORAGE_TYPE`: **local**: Storage type for lfs: `local` for local disk, `minio` for an s3-compatible object storage service, or another name defined with `[storage.xxx]`.
 - `SERVE_DIRECT`: **false**: Allows the storage driver to redirect to authenticated URLs to serve files directly. Currently, only Minio/S3 is supported via signed URLs; local does nothing.
-- `CONTENT_PATH`: **./data/lfs**: Where to store LFS files, only available when `STORAGE_TYPE` is `local`.
+- `PATH`: **./data/lfs**: Where to store LFS files, only available when `STORAGE_TYPE` is `local`. If not set, it falls back to the deprecated `LFS_CONTENT_PATH` value in the `[server]` section.
 - `MINIO_ENDPOINT`: **localhost:9000**: Minio endpoint to connect to; only available when `STORAGE_TYPE` is `minio`.
 - `MINIO_ACCESS_KEY_ID`: Minio accessKeyID to connect with; only available when `STORAGE_TYPE` is `minio`.
 - `MINIO_SECRET_ACCESS_KEY`: Minio secretAccessKey to connect with; only available when `STORAGE_TYPE` is `minio`.
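The documentation change above moves the LFS path setting from `[server].LFS_CONTENT_PATH` to the storage section's `PATH` key. A minimal `app.ini` fragment sketching the new layout (the section name `[lfs]` and the paths are examples based on the docs above, not required defaults):

```ini
[server]
LFS_START_SERVER = true
; LFS_CONTENT_PATH = /var/lib/gitea/data/lfs  ; deprecated: superseded by PATH below

[lfs]
STORAGE_TYPE = local
PATH = /var/lib/gitea/data/lfs
```

If `PATH` is omitted, the docs above say the deprecated `LFS_CONTENT_PATH` value is still honored as a fallback, so existing configurations keep working.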
@@ -73,6 +73,7 @@ menu:
 
 - `LFS_START_SERVER`: Whether to enable git-lfs support. Can be `true` or `false`; defaults to `false`.
 - `LFS_JWT_SECRET`: LFS authentication secret; change it to your own value.
+- `LFS_CONTENT_PATH`: **Deprecated**: where files uploaded via the lfs command are stored; defaults to `data/lfs`.
 
 ## Database (`database`)
 

@@ -323,7 +324,7 @@ LFS storage configuration. If `STORAGE_TYPE` is empty, this configuration falls back to `[storage]`.
 
 - `STORAGE_TYPE`: **local**: Storage type for LFS: `local` stores to disk, `minio` stores to an s3-compatible object service.
 - `SERVE_DIRECT`: **false**: Allows direct redirection to the storage system. Currently, only Minio/S3 is supported.
-- `CONTENT_PATH`: Where files uploaded via the lfs command are stored; defaults to `data/lfs`.
+- `PATH`: Where files uploaded via the lfs command are stored; defaults to `data/lfs`.
 - `MINIO_ENDPOINT`: **localhost:9000**: Minio endpoint; only effective when `LFS_STORAGE_TYPE` is `minio`.
 - `MINIO_ACCESS_KEY_ID`: Minio accessKeyID; only effective when `LFS_STORAGE_TYPE` is `minio`.
 - `MINIO_SECRET_ACCESS_KEY`: Minio secretAccessKey; only effective when `LFS_STORAGE_TYPE` is `minio`.
go.mod (10 changed lines)

@@ -70,7 +70,7 @@ require (
 	github.com/mgechev/dots v0.0.0-20190921121421-c36f7dcfbb81
 	github.com/mgechev/revive v1.0.3-0.20200921231451-246eac737dc7
 	github.com/mholt/archiver/v3 v3.3.0
-	github.com/microcosm-cc/bluemonday v1.0.3-0.20191119130333-0a75d7616912
+	github.com/microcosm-cc/bluemonday v1.0.6
 	github.com/minio/minio-go/v7 v7.0.4
 	github.com/mitchellh/go-homedir v1.1.0
 	github.com/msteinert/pam v0.0.0-20151204160544-02ccfbfaf0cc

@@ -99,15 +99,15 @@ require (
 	github.com/urfave/cli v1.20.0
 	github.com/xanzy/go-gitlab v0.37.0
 	github.com/yohcop/openid-go v1.0.0
-	github.com/yuin/goldmark v1.2.1
+	github.com/yuin/goldmark v1.3.3
 	github.com/yuin/goldmark-highlighting v0.0.0-20200307114337-60d527fdb691
 	github.com/yuin/goldmark-meta v0.0.0-20191126180153-f0638e958b60
 	go.jolheiser.com/hcaptcha v0.0.4
 	go.jolheiser.com/pwn v0.0.3
 	golang.org/x/crypto v0.0.0-20201217014255-9d1352758620
-	golang.org/x/net v0.0.0-20200904194848-62affa334b73
+	golang.org/x/net v0.0.0-20210405180319-a5a99cb37ef4
 	golang.org/x/oauth2 v0.0.0-20200107190931-bf48bf16ab8d
-	golang.org/x/sys v0.0.0-20200918174421-af09f7315aff
+	golang.org/x/sys v0.0.0-20210330210617-4fbd30eecc44
 	golang.org/x/text v0.3.3
 	golang.org/x/time v0.0.0-20200630173020-3af7569d3a1e // indirect
 	golang.org/x/tools v0.0.0-20200921210052-fa0125251cc4

@@ -124,5 +124,3 @@ require (
 )
 
 replace github.com/hashicorp/go-version => github.com/6543/go-version v1.2.4
-
-replace github.com/microcosm-cc/bluemonday => github.com/lunny/bluemonday v1.0.5-0.20201227154428-ca34796141e8
go.sum (22 changed lines)

@@ -140,8 +140,6 @@ github.com/bradfitz/gomemcache v0.0.0-20190329173943-551aad21a668 h1:U/lr3Dgy4WK
 github.com/bradfitz/gomemcache v0.0.0-20190329173943-551aad21a668/go.mod h1:H0wQNHz2YrLsuXOZozoeDmnHXkNCRmMW0gwFWDfEZDA=
 github.com/census-instrumentation/opencensus-proto v0.2.1/go.mod h1:f6KPmirojxKA12rnyqOA5BBL4O983OfeGPqjHWSTneU=
 github.com/cespare/xxhash v1.1.0/go.mod h1:XrSqR1VqqWfGrhpAt58auRo0WTKS1nRRg3ghfAqPWnc=
-github.com/chris-ramon/douceur v0.2.0 h1:IDMEdxlEUUBYBKE4z/mJnFyVXox+MjuEVDJNN27glkU=
-github.com/chris-ramon/douceur v0.2.0/go.mod h1:wDW5xjJdeoMm1mRt4sD4c/LbF/mWdEpRXQKjTR8nIBE=
 github.com/client9/misspell v0.3.4/go.mod h1:qj6jICC3Q7zFZvVWo7KLAzC3yx5G7kyvSDkc90ppPyw=
 github.com/cockroachdb/apd v1.1.0 h1:3LFP3629v+1aKXU5Q37mxmRxX/pIu1nijXydLShEq5I=
 github.com/cockroachdb/apd v1.1.0/go.mod h1:8Sl8LxpKi29FqWXR16WEFZRNSz3SoPzUzeMeY4+DwBQ=

@@ -598,8 +596,6 @@ github.com/lib/pq v1.3.0/go.mod h1:5WUZQaWbwv1U+lTReE5YruASi9Al49XbQIvNi/34Woo=
 github.com/lib/pq v1.7.0/go.mod h1:AlVN5x4E4T544tWzH6hKfbfQvm3HdbOxrmggDNAPY9o=
 github.com/lib/pq v1.8.1-0.20200908161135-083382b7e6fc h1:ERSU1OvZ6MdWhHieo2oT7xwR/HCksqKdgK6iYPU5pHI=
 github.com/lib/pq v1.8.1-0.20200908161135-083382b7e6fc/go.mod h1:AlVN5x4E4T544tWzH6hKfbfQvm3HdbOxrmggDNAPY9o=
-github.com/lunny/bluemonday v1.0.5-0.20201227154428-ca34796141e8 h1:1omo92DLtxQu6VwVPSZAmduHaK5zssed6cvkHyl1XOg=
-github.com/lunny/bluemonday v1.0.5-0.20201227154428-ca34796141e8/go.mod h1:8iwZnFn2CDDNZ0r6UXhF4xawGvzaqzCRa1n3/lO3W2w=
 github.com/lunny/dingtalk_webhook v0.0.0-20171025031554-e3534c89ef96 h1:uNwtsDp7ci48vBTTxDuwcoTXz4lwtDTe7TjCQ0noaWY=
 github.com/lunny/dingtalk_webhook v0.0.0-20171025031554-e3534c89ef96/go.mod h1:mmIfjCSQlGYXmJ95jFN84AkQFnVABtKuJL8IrzwvUKQ=
 github.com/lunny/log v0.0.0-20160921050905-7887c61bf0de h1:nyxwRdWHAVxpFcDThedEgQ07DbcRc5xgNObtbTp76fk=

@@ -651,6 +647,8 @@ github.com/mgechev/revive v1.0.3-0.20200921231451-246eac737dc7 h1:ydVkpU/M4/c45y
 github.com/mgechev/revive v1.0.3-0.20200921231451-246eac737dc7/go.mod h1:no/hfevHbndpXR5CaJahkYCfM/FFpmM/dSOwFGU7Z1o=
 github.com/mholt/archiver/v3 v3.3.0 h1:vWjhY8SQp5yzM9P6OJ/eZEkmi3UAbRrxCq48MxjAzig=
 github.com/mholt/archiver/v3 v3.3.0/go.mod h1:YnQtqsp+94Rwd0D/rk5cnLrxusUBUXg+08Ebtr1Mqao=
+github.com/microcosm-cc/bluemonday v1.0.6 h1:ZOvqHKtnx0fUpnbQm3m3zKFWE+DRC+XB1onh8JoEObE=
+github.com/microcosm-cc/bluemonday v1.0.6/go.mod h1:HOT/6NaBlR0f9XlxD3zolN6Z3N8Lp4pvhp+jLS5ihnI=
 github.com/miekg/dns v1.0.14/go.mod h1:W1PPwlIAgtquWBMBEV9nkV9Cazfe8ScdGz/Lj7v3Nrg=
 github.com/minio/md5-simd v1.1.0 h1:QPfiOqlZH+Cj9teu0t9b1nTBfPbyTl16Of5MeuShdK4=
 github.com/minio/md5-simd v1.1.0/go.mod h1:XpBqgZULrMYD3R+M28PcmP0CkI7PEMzB3U77ZrKZ0Gw=

@@ -885,8 +883,9 @@ github.com/yuin/goldmark v1.1.7/go.mod h1:3hX8gzYuyVAZsxl0MRgGTJEmQBFcNTphYh9dec
 github.com/yuin/goldmark v1.1.22/go.mod h1:3hX8gzYuyVAZsxl0MRgGTJEmQBFcNTphYh9decYSb74=
 github.com/yuin/goldmark v1.1.25/go.mod h1:3hX8gzYuyVAZsxl0MRgGTJEmQBFcNTphYh9decYSb74=
 github.com/yuin/goldmark v1.1.32/go.mod h1:3hX8gzYuyVAZsxl0MRgGTJEmQBFcNTphYh9decYSb74=
-github.com/yuin/goldmark v1.2.1 h1:ruQGxdhGHe7FWOJPT0mKs5+pD2Xs1Bm/kdGlHO04FmM=
 github.com/yuin/goldmark v1.2.1/go.mod h1:3hX8gzYuyVAZsxl0MRgGTJEmQBFcNTphYh9decYSb74=
+github.com/yuin/goldmark v1.3.3 h1:37BdQwPx8VOSic8eDSWee6QL9mRpZRm9VJp/QugNrW0=
+github.com/yuin/goldmark v1.3.3/go.mod h1:mwnBkeHKe2W/ZEtQ+71ViKU8L12m81fl3OWwC1Zlc8k=
 github.com/yuin/goldmark-highlighting v0.0.0-20200307114337-60d527fdb691 h1:VWSxtAiQNh3zgHJpdpkpVYjTPqRE3P6UZCOPa1nRDio=
 github.com/yuin/goldmark-highlighting v0.0.0-20200307114337-60d527fdb691/go.mod h1:YLF3kDffRfUH/bTxOxHhV6lxwIB3Vfj91rEwNMS9MXo=
 github.com/yuin/goldmark-meta v0.0.0-20191126180153-f0638e958b60 h1:gZucqLjL1eDzVWrXj4uiWeMbAopJlBR2mKQAsTGdPwo=

@@ -995,8 +994,9 @@ golang.org/x/net v0.0.0-20200602114024-627f9648deb9/go.mod h1:qpuaurCH72eLCgpAm/
 golang.org/x/net v0.0.0-20200625001655-4c5254603344/go.mod h1:/O7V0waA8r7cgGh81Ro3o1hOxt32SMVPicZroKQ2sZA=
 golang.org/x/net v0.0.0-20200707034311-ab3426394381/go.mod h1:/O7V0waA8r7cgGh81Ro3o1hOxt32SMVPicZroKQ2sZA=
 golang.org/x/net v0.0.0-20200822124328-c89045814202/go.mod h1:/O7V0waA8r7cgGh81Ro3o1hOxt32SMVPicZroKQ2sZA=
-golang.org/x/net v0.0.0-20200904194848-62affa334b73 h1:MXfv8rhZWmFeqX3GNZRsd6vOLoaCHjYEX3qkRo3YBUA=
-golang.org/x/net v0.0.0-20200904194848-62affa334b73/go.mod h1:/O7V0waA8r7cgGh81Ro3o1hOxt32SMVPicZroKQ2sZA=
+golang.org/x/net v0.0.0-20210331212208-0fccb6fa2b5c/go.mod h1:p54w0d4576C0XHj96bSt6lcn1PtDYWL6XObtHCRCNQM=
+golang.org/x/net v0.0.0-20210405180319-a5a99cb37ef4 h1:4nGaVu0QrbjT/AK2PRLuQfQuh6DJve+pELhqTdAj3x0=
+golang.org/x/net v0.0.0-20210405180319-a5a99cb37ef4/go.mod h1:p54w0d4576C0XHj96bSt6lcn1PtDYWL6XObtHCRCNQM=
 golang.org/x/oauth2 v0.0.0-20180620175406-ef147856a6dd/go.mod h1:N/0e6XlmueqKjAGxoOufVs8QHGRruUQn6yWY3a++T0U=
 golang.org/x/oauth2 v0.0.0-20180821212333-d2e6202438be/go.mod h1:N/0e6XlmueqKjAGxoOufVs8QHGRruUQn6yWY3a++T0U=
 golang.org/x/oauth2 v0.0.0-20181106182150-f42d05182288/go.mod h1:N/0e6XlmueqKjAGxoOufVs8QHGRruUQn6yWY3a++T0U=

@@ -1052,10 +1052,12 @@ golang.org/x/sys v0.0.0-20200302150141-5c8b2ff67527/go.mod h1:h1NjWce9XRLGQEsW7w
 golang.org/x/sys v0.0.0-20200323222414-85ca7c5b95cd/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
 golang.org/x/sys v0.0.0-20200413165638-669c56c373c4/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
 golang.org/x/sys v0.0.0-20200625212154-ddb9806d33ae/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
-golang.org/x/sys v0.0.0-20200918174421-af09f7315aff h1:1CPUrky56AcgSpxz/KfgzQWzfG09u5YOL8MvPYBlrL8=
-golang.org/x/sys v0.0.0-20200918174421-af09f7315aff/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
-golang.org/x/term v0.0.0-20201117132131-f5c789dd3221 h1:/ZHdbVpdR/jk3g30/d4yUL0JU9kksj8+F/bnQUVLGDM=
+golang.org/x/sys v0.0.0-20201119102817-f84b799fce68/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
+golang.org/x/sys v0.0.0-20210330210617-4fbd30eecc44 h1:Bli41pIlzTzf3KEY06n+xnzK/BESIg2ze4Pgfh/aI8c=
+golang.org/x/sys v0.0.0-20210330210617-4fbd30eecc44/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
 golang.org/x/term v0.0.0-20201117132131-f5c789dd3221/go.mod h1:Nr5EML6q2oocZ2LXRh80K7BxOlk5/8JxuGnuhpl+muw=
+golang.org/x/term v0.0.0-20201126162022-7de9c90e9dd1 h1:v+OssWQX+hTHEmOBgwxdZxK4zHq3yOs8F9J7mk0PY8E=
+golang.org/x/term v0.0.0-20201126162022-7de9c90e9dd1/go.mod h1:bj7SfCRtBDWHUb9snDiAeCFNEtKQo2Wmx5Cou7ajbmo=
 golang.org/x/text v0.3.0/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=
 golang.org/x/text v0.3.1-0.20180807135948-17ff2d5776d2/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=
 golang.org/x/text v0.3.2/go.mod h1:bEr9sfX3Q8Zfm5fL9x+3itogRgK3+ptLWKqgva+5dAk=
@@ -92,6 +92,10 @@ func testAPIDeleteOAuth2Application(t *testing.T) {
 	session.MakeRequest(t, req, http.StatusNoContent)
 
 	models.AssertNotExistsBean(t, &models.OAuth2Application{UID: oldApp.UID, Name: oldApp.Name})
+
+	// Delete again will return not found
+	req = NewRequest(t, "DELETE", urlStr)
+	session.MakeRequest(t, req, http.StatusNotFound)
 }
 
 func testAPIGetOAuth2Application(t *testing.T) {
@@ -74,8 +74,79 @@ func TestAPICreatePullSuccess(t *testing.T) {
 		Base:  "master",
 		Title: "create a failure pr",
 	})
 
 	session.MakeRequest(t, req, 201)
+	session.MakeRequest(t, req, http.StatusUnprocessableEntity) // second request should fail
+}
+
+func TestAPICreatePullWithFieldsSuccess(t *testing.T) {
+	defer prepareTestEnv(t)()
+	// repo10 have code, pulls units.
+	repo10 := models.AssertExistsAndLoadBean(t, &models.Repository{ID: 10}).(*models.Repository)
+	owner10 := models.AssertExistsAndLoadBean(t, &models.User{ID: repo10.OwnerID}).(*models.User)
+	// repo11 only have code unit but should still create pulls
+	repo11 := models.AssertExistsAndLoadBean(t, &models.Repository{ID: 11}).(*models.Repository)
+	owner11 := models.AssertExistsAndLoadBean(t, &models.User{ID: repo11.OwnerID}).(*models.User)
+
+	session := loginUser(t, owner11.Name)
+	token := getTokenForLoggedInUser(t, session)
+
+	opts := &api.CreatePullRequestOption{
+		Head:      fmt.Sprintf("%s:master", owner11.Name),
+		Base:      "master",
+		Title:     "create a failure pr",
+		Body:      "foobaaar",
+		Milestone: 5,
+		Assignees: []string{owner10.Name},
+		Labels:    []int64{5},
+	}
+
+	req := NewRequestWithJSON(t, http.MethodPost, fmt.Sprintf("/api/v1/repos/%s/%s/pulls?token=%s", owner10.Name, repo10.Name, token), opts)
+
+	res := session.MakeRequest(t, req, 201)
+	pull := new(api.PullRequest)
+	DecodeJSON(t, res, pull)
+
+	assert.NotNil(t, pull.Milestone)
+	assert.EqualValues(t, opts.Milestone, pull.Milestone.ID)
+	if assert.Len(t, pull.Assignees, 1) {
+		assert.EqualValues(t, opts.Assignees[0], owner10.Name)
+	}
+	assert.NotNil(t, pull.Labels)
+	assert.EqualValues(t, opts.Labels[0], pull.Labels[0].ID)
+}
+
+func TestAPICreatePullWithFieldsFailure(t *testing.T) {
+	defer prepareTestEnv(t)()
+	// repo10 have code, pulls units.
+	repo10 := models.AssertExistsAndLoadBean(t, &models.Repository{ID: 10}).(*models.Repository)
+	owner10 := models.AssertExistsAndLoadBean(t, &models.User{ID: repo10.OwnerID}).(*models.User)
+	// repo11 only have code unit but should still create pulls
+	repo11 := models.AssertExistsAndLoadBean(t, &models.Repository{ID: 11}).(*models.Repository)
+	owner11 := models.AssertExistsAndLoadBean(t, &models.User{ID: repo11.OwnerID}).(*models.User)
+
+	session := loginUser(t, owner11.Name)
+	token := getTokenForLoggedInUser(t, session)
+
+	opts := &api.CreatePullRequestOption{
+		Head: fmt.Sprintf("%s:master", owner11.Name),
+		Base: "master",
+	}
+
+	req := NewRequestWithJSON(t, http.MethodPost, fmt.Sprintf("/api/v1/repos/%s/%s/pulls?token=%s", owner10.Name, repo10.Name, token), opts)
+	session.MakeRequest(t, req, http.StatusUnprocessableEntity)
+	opts.Title = "is required"
+
+	opts.Milestone = 666
+	session.MakeRequest(t, req, http.StatusUnprocessableEntity)
+	opts.Milestone = 5
+
+	opts.Assignees = []string{"qweruqweroiuyqweoiruywqer"}
+	session.MakeRequest(t, req, http.StatusUnprocessableEntity)
+	opts.Assignees = []string{owner10.LoginName}
+
+	opts.Labels = []int64{55555}
+	session.MakeRequest(t, req, http.StatusUnprocessableEntity)
+	opts.Labels = []int64{5}
 }
 
 func TestAPIEditPull(t *testing.T) {
@@ -122,7 +122,7 @@ func TestGetAttachment(t *testing.T) {
 		t.Run(tc.name, func(t *testing.T) {
 			//Write empty file to be available for response
 			if tc.createFile {
-				_, err := storage.Attachments.Save(models.AttachmentRelativePath(tc.uuid), strings.NewReader("hello world"))
+				_, err := storage.Attachments.Save(models.AttachmentRelativePath(tc.uuid), strings.NewReader("hello world"), -1)
 				assert.NoError(t, err)
 			}
 			//Actual test
@@ -99,7 +99,7 @@ func (a *Attachment) LinkedRepository() (*Repository, UnitType, error) {
|
|||||||
func NewAttachment(attach *Attachment, buf []byte, file io.Reader) (_ *Attachment, err error) {
|
func NewAttachment(attach *Attachment, buf []byte, file io.Reader) (_ *Attachment, err error) {
|
||||||
attach.UUID = gouuid.New().String()
|
attach.UUID = gouuid.New().String()
|
||||||
|
|
||||||
size, err := storage.Attachments.Save(attach.RelativePath(), io.MultiReader(bytes.NewReader(buf), file))
|
size, err := storage.Attachments.Save(attach.RelativePath(), io.MultiReader(bytes.NewReader(buf), file), -1)
|
||||||
if err != nil {
|
if err != nil {
|
||||||
return nil, fmt.Errorf("Create: %v", err)
|
return nil, fmt.Errorf("Create: %v", err)
|
||||||
}
|
}
|
||||||
|
@@ -5,10 +5,13 @@
 package models
 
 import (
+	"fmt"
 	"reflect"
+	"regexp"
 	"strings"
 	"testing"
 
+	"code.gitea.io/gitea/modules/setting"
 	"github.com/stretchr/testify/assert"
 	"xorm.io/builder"
 )
@@ -221,6 +224,24 @@ func DeleteOrphanedLabels() error {
 	return nil
 }
 
+// CountOrphanedIssueLabels return count of IssueLabels witch have no label behind anymore
+func CountOrphanedIssueLabels() (int64, error) {
+	return x.Table("issue_label").
+		Join("LEFT", "label", "issue_label.label_id = label.id").
+		Where(builder.IsNull{"label.id"}).Count()
+}
+
+// DeleteOrphanedIssueLabels delete IssueLabels witch have no label behind anymore
+func DeleteOrphanedIssueLabels() error {
+
+	_, err := x.In("id", builder.Select("issue_label.id").From("issue_label").
+		Join("LEFT", "label", "issue_label.label_id = label.id").
+		Where(builder.IsNull{"label.id"})).
+		Delete(IssueLabel{})
+
+	return err
+}
+
 // CountOrphanedIssues count issues without a repo
 func CountOrphanedIssues() (int64, error) {
 	return x.Table("issue").
@@ -276,11 +297,15 @@ func CountOrphanedObjects(subject, refobject, joinCond string) (int64, error) {
 
 // DeleteOrphanedObjects delete subjects with have no existing refobject anymore
 func DeleteOrphanedObjects(subject, refobject, joinCond string) error {
-	_, err := x.In("id", builder.Select("`"+subject+"`.id").
+	subQuery := builder.Select("`"+subject+"`.id").
 		From("`"+subject+"`").
 		Join("LEFT", "`"+refobject+"`", joinCond).
-		Where(builder.IsNull{"`" + refobject + "`.id"})).
-		Delete("`" + subject + "`")
+		Where(builder.IsNull{"`" + refobject + "`.id"})
+	sql, args, err := builder.Delete(builder.In("id", subQuery)).From("`" + subject + "`").ToSQL()
+	if err != nil {
+		return err
+	}
+	_, err = x.Exec(append([]interface{}{sql}, args...)...)
 	return err
 }
 
@@ -295,3 +320,61 @@ func FixNullArchivedRepository() (int64, error) {
 		IsArchived: false,
 	})
 }
+
+// CountBadSequences looks for broken sequences from recreate-table mistakes
+func CountBadSequences() (int64, error) {
+	if !setting.Database.UsePostgreSQL {
+		return 0, nil
+	}
+
+	sess := x.NewSession()
+	defer sess.Close()
+
+	var sequences []string
+	schema := sess.Engine().Dialect().URI().Schema
+
+	sess.Engine().SetSchema("")
+	if err := sess.Table("information_schema.sequences").Cols("sequence_name").Where("sequence_name LIKE 'tmp_recreate__%_id_seq%' AND sequence_catalog = ?", setting.Database.Name).Find(&sequences); err != nil {
+		return 0, err
+	}
+	sess.Engine().SetSchema(schema)
+
+	return int64(len(sequences)), nil
+}
+
+// FixBadSequences fixes for broken sequences from recreate-table mistakes
+func FixBadSequences() error {
+	if !setting.Database.UsePostgreSQL {
+		return nil
+	}
+
+	sess := x.NewSession()
+	defer sess.Close()
+	if err := sess.Begin(); err != nil {
+		return err
+	}
+
+	var sequences []string
+	schema := sess.Engine().Dialect().URI().Schema
+
+	sess.Engine().SetSchema("")
+	if err := sess.Table("information_schema.sequences").Cols("sequence_name").Where("sequence_name LIKE 'tmp_recreate__%_id_seq%' AND sequence_catalog = ?", setting.Database.Name).Find(&sequences); err != nil {
+		return err
+	}
+	sess.Engine().SetSchema(schema)
+
+	sequenceRegexp := regexp.MustCompile(`tmp_recreate__(\w+)_id_seq.*`)
+
+	for _, sequence := range sequences {
+		tableName := sequenceRegexp.FindStringSubmatch(sequence)[1]
+		newSequenceName := tableName + "_id_seq"
+		if _, err := sess.Exec(fmt.Sprintf("ALTER SEQUENCE `%s` RENAME TO `%s`", sequence, newSequenceName)); err != nil {
+			return err
+		}
+		if _, err := sess.Exec(fmt.Sprintf("SELECT setval('%s', COALESCE((SELECT MAX(id)+1 FROM `%s`), 1), false)", newSequenceName, tableName)); err != nil {
+			return err
+		}
+	}
+
+	return sess.Commit()
+}
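The sequence-repair loop above recovers the original table name from a leftover `tmp_recreate__<table>_id_seq` sequence name with a capturing group. A minimal stdlib-only sketch of that extraction (the helper name and sample input are illustrative, not part of the commit):

```go
package main

import (
	"fmt"
	"regexp"
)

// tableNameOf extracts the table name from a temporary sequence name,
// using the same pattern as the FixBadSequences loop above.
func tableNameOf(sequence string) string {
	re := regexp.MustCompile(`tmp_recreate__(\w+)_id_seq.*`)
	m := re.FindStringSubmatch(sequence)
	if m == nil {
		// no match: not a tmp_recreate__ sequence
		return ""
	}
	return m[1]
}

func main() {
	fmt.Println(tableNameOf("tmp_recreate__issue_id_seq")) // prints "issue"
}
```

Note that `\w` also matches underscores, so the regexp engine backtracks until the literal `_id_seq` suffix fits, leaving only the table name in the capture group.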
models/consistency_test.go (new file, 32 lines added)
@@ -0,0 +1,32 @@
+// Copyright 2021 Gitea. All rights reserved.
+// Use of this source code is governed by a MIT-style
+// license that can be found in the LICENSE file.
+
+package models
+
+import (
+	"testing"
+
+	"github.com/stretchr/testify/assert"
+)
+
+func TestDeleteOrphanedObjects(t *testing.T) {
+	assert.NoError(t, PrepareTestDatabase())
+
+	countBefore, err := x.Count(&PullRequest{})
+	assert.NoError(t, err)
+
+	_, err = x.Insert(&PullRequest{IssueID: 1000}, &PullRequest{IssueID: 1001}, &PullRequest{IssueID: 1003})
+	assert.NoError(t, err)
+
+	orphaned, err := CountOrphanedObjects("pull_request", "issue", "pull_request.issue_id=issue.id")
+	assert.NoError(t, err)
+	assert.EqualValues(t, 3, orphaned)
+
+	err = DeleteOrphanedObjects("pull_request", "issue", "pull_request.issue_id=issue.id")
+	assert.NoError(t, err)
+
+	countAfter, err := x.Count(&PullRequest{})
+	assert.NoError(t, err)
+	assert.EqualValues(t, countBefore, countAfter)
+}
@@ -33,3 +33,11 @@
   num_issues: 1
   num_closed_issues: 0
+
+-
+  id: 5
+  repo_id: 10
+  org_id: 0
+  name: pull-test-label
+  color: '#000000'
+  num_issues: 0
+  num_closed_issues: 0
@@ -29,3 +29,11 @@
   content: content random
   is_closed: false
   num_issues: 0
+
+-
+  id: 5
+  repo_id: 10
+  name: milestone of repo 10
+  content: for testing with PRs
+  is_closed: false
+  num_issues: 0
@@ -146,6 +146,7 @@
   num_closed_issues: 0
   num_pulls: 1
   num_closed_pulls: 0
+  num_milestones: 1
   is_mirror: false
   num_forks: 1
   status: 0
@@ -764,3 +764,15 @@ func DeleteIssueLabel(issue *Issue, label *Label, doer *User) (err error) {
 
 	return sess.Commit()
 }
+
+func deleteLabelsByRepoID(sess Engine, repoID int64) error {
+	deleteCond := builder.Select("id").From("label").Where(builder.Eq{"label.repo_id": repoID})
+
+	if _, err := sess.In("label_id", deleteCond).
+		Delete(&IssueLabel{}); err != nil {
+		return err
+	}
+
+	_, err := sess.Delete(&Label{RepoID: repoID})
+	return err
+}
@@ -39,6 +39,7 @@ func InsertMilestones(ms ...*Milestone) (err error) {
 // InsertIssues insert issues to database
 func InsertIssues(issues ...*Issue) error {
 	sess := x.NewSession()
+	defer sess.Close()
 	if err := sess.Begin(); err != nil {
 		return err
 	}
@@ -194,6 +195,7 @@ func InsertPullRequests(prs ...*PullRequest) error {
 // InsertReleases migrates release
 func InsertReleases(rels ...*Release) error {
 	sess := x.NewSession()
+	defer sess.Close()
 	if err := sess.Begin(); err != nil {
 		return err
 	}
@@ -516,6 +516,31 @@ func recreateTable(sess *xorm.Session, bean interface{}) error {
 			return err
 		}
 	case setting.Database.UsePostgreSQL:
+		var originalSequences []string
+		type sequenceData struct {
+			LastValue int  `xorm:"'last_value'"`
+			IsCalled  bool `xorm:"'is_called'"`
+		}
+		sequenceMap := map[string]sequenceData{}
+
+		schema := sess.Engine().Dialect().URI().Schema
+		sess.Engine().SetSchema("")
+		if err := sess.Table("information_schema.sequences").Cols("sequence_name").Where("sequence_name LIKE ? || '_%' AND sequence_catalog = ?", tableName, setting.Database.Name).Find(&originalSequences); err != nil {
+			log.Error("Unable to rename %s to %s. Error: %v", tempTableName, tableName, err)
+			return err
+		}
+		sess.Engine().SetSchema(schema)
+
+		for _, sequence := range originalSequences {
+			sequenceData := sequenceData{}
+			if _, err := sess.Table(sequence).Cols("last_value", "is_called").Get(&sequenceData); err != nil {
+				log.Error("Unable to get last_value and is_called from %s. Error: %v", sequence, err)
+				return err
+			}
+			sequenceMap[sequence] = sequenceData
+
+		}
+
 		// CASCADE causes postgres to drop all the constraints on the old table
 		if _, err := sess.Exec(fmt.Sprintf("DROP TABLE `%s` CASCADE", tableName)); err != nil {
 			log.Error("Unable to drop old table %s. Error: %v", tableName, err)
@@ -529,7 +554,6 @@ func recreateTable(sess *xorm.Session, bean interface{}) error {
 		}
 
 		var indices []string
-		schema := sess.Engine().Dialect().URI().Schema
 		sess.Engine().SetSchema("")
 		if err := sess.Table("pg_indexes").Cols("indexname").Where("tablename = ? ", tableName).Find(&indices); err != nil {
 			log.Error("Unable to rename %s to %s. Error: %v", tempTableName, tableName, err)
@@ -545,6 +569,43 @@ func recreateTable(sess *xorm.Session, bean interface{}) error {
 			}
 		}
+
+		var sequences []string
+		sess.Engine().SetSchema("")
+		if err := sess.Table("information_schema.sequences").Cols("sequence_name").Where("sequence_name LIKE 'tmp_recreate__' || ? || '_%' AND sequence_catalog = ?", tableName, setting.Database.Name).Find(&sequences); err != nil {
+			log.Error("Unable to rename %s to %s. Error: %v", tempTableName, tableName, err)
+			return err
+		}
+		sess.Engine().SetSchema(schema)
+
+		for _, sequence := range sequences {
+			newSequenceName := strings.Replace(sequence, "tmp_recreate__", "", 1)
+			if _, err := sess.Exec(fmt.Sprintf("ALTER SEQUENCE `%s` RENAME TO `%s`", sequence, newSequenceName)); err != nil {
+				log.Error("Unable to rename %s sequence to %s. Error: %v", sequence, newSequenceName, err)
+				return err
+			}
+			val, ok := sequenceMap[newSequenceName]
+			if newSequenceName == tableName+"_id_seq" {
+				if ok && val.LastValue != 0 {
+					if _, err := sess.Exec(fmt.Sprintf("SELECT setval('%s', %d, %t)", newSequenceName, val.LastValue, val.IsCalled)); err != nil {
+						log.Error("Unable to reset %s to %d. Error: %v", newSequenceName, val, err)
+						return err
+					}
+				} else {
+					// We're going to try to guess this
+					if _, err := sess.Exec(fmt.Sprintf("SELECT setval('%s', COALESCE((SELECT MAX(id)+1 FROM `%s`), 1), false)", newSequenceName, tableName)); err != nil {
+						log.Error("Unable to reset %s. Error: %v", newSequenceName, err)
+						return err
+					}
+				}
+			} else if ok {
+				if _, err := sess.Exec(fmt.Sprintf("SELECT setval('%s', %d, %t)", newSequenceName, val.LastValue, val.IsCalled)); err != nil {
+					log.Error("Unable to reset %s to %d. Error: %v", newSequenceName, val, err)
+					return err
+				}
+			}
+
+		}
+
 	case setting.Database.UseMSSQL:
 		// MSSQL will drop all the constraints on the old table
 		if _, err := sess.Exec(fmt.Sprintf("DROP TABLE `%s`", tableName)); err != nil {
@@ -233,7 +233,7 @@ func deleteOAuth2Application(sess *xorm.Session, id, userid int64) error {
 	if deleted, err := sess.Delete(&OAuth2Application{ID: id, UID: userid}); err != nil {
 		return err
 	} else if deleted == 0 {
-		return fmt.Errorf("cannot find oauth2 application")
+		return ErrOAuthApplicationNotFound{ID: id}
 	}
 	codes := make([]*OAuth2AuthorizationCode, 0)
 	// delete correlating auth codes
@@ -259,6 +259,7 @@ func deleteOAuth2Application(sess *xorm.Session, id, userid int64) error {
 // DeleteOAuth2Application deletes the application with the given id and the grants and auth codes related to it. It checks if the userid was the creator of the app.
 func DeleteOAuth2Application(id, userid int64) error {
 	sess := x.NewSession()
+	defer sess.Close()
 	if err := sess.Begin(); err != nil {
 		return err
 	}
@@ -1290,11 +1290,44 @@ func IncrementRepoForkNum(ctx DBContext, repoID int64) error {
 }
 
 // TransferOwnership transfers all corresponding setting from old user to new one.
-func TransferOwnership(doer *User, newOwnerName string, repo *Repository) error {
+func TransferOwnership(doer *User, newOwnerName string, repo *Repository) (err error) {
+	repoRenamed := false
+	wikiRenamed := false
+	oldOwnerName := doer.Name
+
+	defer func() {
+		if !repoRenamed && !wikiRenamed {
+			return
+		}
+
+		recoverErr := recover()
+		if err == nil && recoverErr == nil {
+			return
+		}
+
+		if repoRenamed {
+			if err := os.Rename(RepoPath(newOwnerName, repo.Name), RepoPath(oldOwnerName, repo.Name)); err != nil {
+				log.Critical("Unable to move repository %s/%s directory from %s back to correct place %s: %v", oldOwnerName, repo.Name, RepoPath(newOwnerName, repo.Name), RepoPath(oldOwnerName, repo.Name), err)
+			}
+		}
+
+		if wikiRenamed {
+			if err := os.Rename(WikiPath(newOwnerName, repo.Name), WikiPath(oldOwnerName, repo.Name)); err != nil {
+				log.Critical("Unable to move wiki for repository %s/%s directory from %s back to correct place %s: %v", oldOwnerName, repo.Name, WikiPath(newOwnerName, repo.Name), WikiPath(oldOwnerName, repo.Name), err)
+			}
+		}
+
+		if recoverErr != nil {
+			log.Error("Panic within TransferOwnership: %v\n%s", recoverErr, log.Stack(2))
+			panic(recoverErr)
+		}
+	}()
+
 	newOwner, err := GetUserByName(newOwnerName)
 	if err != nil {
 		return fmt.Errorf("get new owner '%s': %v", newOwnerName, err)
 	}
+	newOwnerName = newOwner.Name // ensure capitalisation matches
 
 	// Check if new owner has repository with same name.
 	has, err := IsRepositoryExist(newOwner, repo.Name)
@@ -1311,6 +1344,7 @@ func TransferOwnership(doer *User, newOwnerName string, repo *Repository) error
 	}
 
 	oldOwner := repo.Owner
+	oldOwnerName = oldOwner.Name
 
 	// Note: we have to set value here to make sure recalculate accesses is based on
 	// new owner.
@@ -1370,9 +1404,9 @@ func TransferOwnership(doer *User, newOwnerName string, repo *Repository) error
 	}
 
 	// Update repository count.
-	if _, err = sess.Exec("UPDATE `user` SET num_repos=num_repos+1 WHERE id=?", newOwner.ID); err != nil {
+	if _, err := sess.Exec("UPDATE `user` SET num_repos=num_repos+1 WHERE id=?", newOwner.ID); err != nil {
 		return fmt.Errorf("increase new owner repository count: %v", err)
-	} else if _, err = sess.Exec("UPDATE `user` SET num_repos=num_repos-1 WHERE id=?", oldOwner.ID); err != nil {
+	} else if _, err := sess.Exec("UPDATE `user` SET num_repos=num_repos-1 WHERE id=?", oldOwner.ID); err != nil {
 		return fmt.Errorf("decrease old owner repository count: %v", err)
 	}
 
@@ -1382,7 +1416,7 @@ func TransferOwnership(doer *User, newOwnerName string, repo *Repository) error
 
 	// Remove watch for organization.
 	if oldOwner.IsOrganization() {
-		if err = watchRepo(sess, oldOwner.ID, repo.ID, false); err != nil {
+		if err := watchRepo(sess, oldOwner.ID, repo.ID, false); err != nil {
 			return fmt.Errorf("watchRepo [false]: %v", err)
 		}
 	}
@@ -1394,16 +1428,18 @@ func TransferOwnership(doer *User, newOwnerName string, repo *Repository) error
 		return fmt.Errorf("Failed to create dir %s: %v", dir, err)
 	}
 
-	if err = os.Rename(RepoPath(oldOwner.Name, repo.Name), RepoPath(newOwner.Name, repo.Name)); err != nil {
+	if err := os.Rename(RepoPath(oldOwner.Name, repo.Name), RepoPath(newOwner.Name, repo.Name)); err != nil {
 		return fmt.Errorf("rename repository directory: %v", err)
 	}
+	repoRenamed = true
 
 	// Rename remote wiki repository to new path and delete local copy.
 	wikiPath := WikiPath(oldOwner.Name, repo.Name)
 	if com.IsExist(wikiPath) {
-		if err = os.Rename(wikiPath, WikiPath(newOwner.Name, repo.Name)); err != nil {
+		if err := os.Rename(wikiPath, WikiPath(newOwner.Name, repo.Name)); err != nil {
 			return fmt.Errorf("rename repository wiki: %v", err)
 		}
+		wikiRenamed = true
 	}
 
 	// If there was previously a redirect at this location, remove it.
@@ -1693,6 +1729,10 @@ func DeleteRepository(doer *User, uid, repoID int64) error {
 		return fmt.Errorf("deleteBeans: %v", err)
 	}
 
+	if err := deleteLabelsByRepoID(sess, repoID); err != nil {
+		return err
+	}
+
 	// Delete Issues and related objects
 	var attachmentPaths []string
 	if attachmentPaths, err = deleteIssuesByRepoID(sess, repoID); err != nil {
@@ -727,6 +727,7 @@ var (
 		"assets",
 		"attachments",
 		"avatars",
+		"captcha",
 		"commits",
 		"debug",
 		"error",
modules/analyze/vendor.go (new file, 70 lines added)
@@ -0,0 +1,70 @@
+// Copyright 2021 The Gitea Authors. All rights reserved.
+// Use of this source code is governed by a MIT-style
+// license that can be found in the LICENSE file.
+
+package analyze
+
+import (
+	"regexp"
+	"sort"
+	"strings"
+
+	"github.com/go-enry/go-enry/v2/data"
+)
+
+var isVendorRegExp *regexp.Regexp
+
+func init() {
+	matchers := data.VendorMatchers
+
+	caretStrings := make([]string, 0, 10)
+	caretShareStrings := make([]string, 0, 10)
+
+	matcherStrings := make([]string, 0, len(matchers))
+	for _, matcher := range matchers {
+		str := matcher.String()
+		if str[0] == '^' {
+			caretStrings = append(caretStrings, str[1:])
+		} else if str[0:5] == "(^|/)" {
+			caretShareStrings = append(caretShareStrings, str[5:])
+		} else {
+			matcherStrings = append(matcherStrings, str)
+		}
+	}
+
+	sort.Strings(caretShareStrings)
+	sort.Strings(caretStrings)
+	sort.Strings(matcherStrings)
+
+	sb := &strings.Builder{}
+	sb.WriteString("(?:^(?:")
+	sb.WriteString(caretStrings[0])
+	for _, matcher := range caretStrings[1:] {
+		sb.WriteString(")|(?:")
+		sb.WriteString(matcher)
+	}
+	sb.WriteString("))")
+	sb.WriteString("|")
+	sb.WriteString("(?:(?:^|/)(?:")
+	sb.WriteString(caretShareStrings[0])
+	for _, matcher := range caretShareStrings[1:] {
+		sb.WriteString(")|(?:")
+		sb.WriteString(matcher)
+	}
+	sb.WriteString("))")
+	sb.WriteString("|")
+	sb.WriteString("(?:")
+	sb.WriteString(matcherStrings[0])
+	for _, matcher := range matcherStrings[1:] {
+		sb.WriteString(")|(?:")
+		sb.WriteString(matcher)
+	}
+	sb.WriteString(")")
+	combined := sb.String()
+	isVendorRegExp = regexp.MustCompile(combined)
+}
+
+// IsVendor returns whether or not path is a vendor path.
+func IsVendor(path string) bool {
+	return isVendorRegExp.MatchString(path)
+}
modules/analyze/vendor_test.go (new file, 42 lines added)
@@ -0,0 +1,42 @@
+// Copyright 2021 The Gitea Authors. All rights reserved.
+// Use of this source code is governed by a MIT-style
+// license that can be found in the LICENSE file.
+
+package analyze
+
+import "testing"
+
+func TestIsVendor(t *testing.T) {
+	tests := []struct {
+		path string
+		want bool
+	}{
+		{"cache/", true},
+		{"random/cache/", true},
+		{"cache", false},
+		{"dependencies/", true},
+		{"Dependencies/", true},
+		{"dependency/", false},
+		{"dist/", true},
+		{"dist", false},
+		{"random/dist/", true},
+		{"random/dist", false},
+		{"deps/", true},
+		{"configure", true},
+		{"a/configure", true},
+		{"config.guess", true},
+		{"config.guess/", false},
+		{".vscode/", true},
+		{"doc/_build/", true},
+		{"a/docs/_build/", true},
+		{"a/dasdocs/_build-vsdoc.js", true},
+		{"a/dasdocs/_build-vsdoc.j", false},
+	}
+	for _, tt := range tests {
+		t.Run(tt.path, func(t *testing.T) {
+			if got := IsVendor(tt.path); got != tt.want {
+				t.Errorf("IsVendor() = %v, want %v", got, tt.want)
+			}
+		})
+	}
+}
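The `init` in `modules/analyze/vendor.go` above folds every vendor matcher into one alternation so a path is tested with a single regexp match instead of one match per pattern. A reduced stdlib-only sketch of the same idea (the `combine` helper and the sample patterns are illustrative, not the real go-enry matchers):

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// combine ORs several patterns into a single compiled regexp, wrapping
// each one in a non-capturing group so alternation precedence is safe.
func combine(patterns []string) *regexp.Regexp {
	parts := make([]string, 0, len(patterns))
	for _, p := range patterns {
		parts = append(parts, "(?:"+p+")")
	}
	return regexp.MustCompile(strings.Join(parts, "|"))
}

func main() {
	isVendor := combine([]string{`^vendor/`, `(^|/)node_modules/`, `\.min\.js$`})
	fmt.Println(isVendor.MatchString("vendor/lib.go"))    // true
	fmt.Println(isVendor.MatchString("a/node_modules/x")) // true
	fmt.Println(isVendor.MatchString("src/app.js"))       // false
}
```

The non-capturing groups matter: without them, a pattern containing a bare `|` would split across the surrounding alternation and change its meaning.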
@@ -83,18 +83,17 @@ func ToPullReviewCommentList(review *models.Review, doer *models.User) ([]*api.P
 
 	apiComments := make([]*api.PullReviewComment, 0, len(review.CodeComments))
 
-	auth := false
-	if doer != nil {
-		auth = doer.IsAdmin || doer.ID == review.ReviewerID
-	}
-
 	for _, lines := range review.CodeComments {
 		for _, comments := range lines {
 			for _, comment := range comments {
+				auth := false
+				if doer != nil {
+					auth = doer.IsAdmin || doer.ID == comment.Poster.ID
+				}
 				apiComment := &api.PullReviewComment{
 					ID:       comment.ID,
 					Body:     comment.Content,
-					Reviewer: ToUser(review.Reviewer, doer != nil, auth),
+					Reviewer: ToUser(comment.Poster, doer != nil, auth),
 					ReviewID: review.ID,
 					Created:  comment.CreatedUnix.AsTime(),
 					Updated:  comment.UpdatedUnix.AsTime(),
@@ -13,6 +13,10 @@ import (
 // ToUser convert models.User to api.User
 // signed shall only be set if requester is logged in. authed shall only be set if user is site admin or user himself
 func ToUser(user *models.User, signed, authed bool) *api.User {
+	if user == nil {
+		return nil
+	}
+
 	result := &api.User{
 		ID:       user.ID,
 		UserName: user.Name,
@@ -30,6 +30,9 @@ var (
 	// aliasMap provides a map of the alias to its emoji data.
 	aliasMap map[string]int
 
+	// emptyReplacer is the string replacer for emoji codes.
+	emptyReplacer *strings.Replacer
+
 	// codeReplacer is the string replacer for emoji codes.
 	codeReplacer *strings.Replacer
 
@@ -49,6 +52,7 @@ func loadMap() {
 
 		// process emoji codes and aliases
 		codePairs := make([]string, 0)
+		emptyPairs := make([]string, 0)
 		aliasPairs := make([]string, 0)
 
 		// sort from largest to small so we match combined emoji first
@@ -64,6 +68,7 @@ func loadMap() {
 			// setup codes
 			codeMap[e.Emoji] = i
 			codePairs = append(codePairs, e.Emoji, ":"+e.Aliases[0]+":")
+			emptyPairs = append(emptyPairs, e.Emoji, e.Emoji)
 
 			// setup aliases
 			for _, a := range e.Aliases {
@@ -77,6 +82,7 @@ func loadMap() {
 		}
 
 		// create replacers
+		emptyReplacer = strings.NewReplacer(emptyPairs...)
 		codeReplacer = strings.NewReplacer(codePairs...)
 		aliasReplacer = strings.NewReplacer(aliasPairs...)
 	})
@@ -127,38 +133,53 @@ func ReplaceAliases(s string) string {
 	return aliasReplacer.Replace(s)
 }
 
+type rememberSecondWriteWriter struct {
+	pos        int
+	idx        int
+	end        int
+	writecount int
+}
+
+func (n *rememberSecondWriteWriter) Write(p []byte) (int, error) {
+	n.writecount++
+	if n.writecount == 2 {
+		n.idx = n.pos
+		n.end = n.pos + len(p)
+	}
+	n.pos += len(p)
+	return len(p), nil
+}
+
+func (n *rememberSecondWriteWriter) WriteString(s string) (int, error) {
+	n.writecount++
+	if n.writecount == 2 {
+		n.idx = n.pos
+		n.end = n.pos + len(s)
+	}
+	n.pos += len(s)
+	return len(s), nil
+}
+
 // FindEmojiSubmatchIndex returns index pair of longest emoji in a string
 func FindEmojiSubmatchIndex(s string) []int {
 	loadMap()
-	found := make(map[int]int)
-	keys := make([]int, 0)
+	secondWriteWriter := rememberSecondWriteWriter{}
 
-	//see if there are any emoji in string before looking for position of specific ones
-	//no performance difference when there is a match but 10x faster when there are not
-	if s == ReplaceCodes(s) {
-		return nil
-	}
-
-	// get index of first emoji occurrence while also checking for longest combination
-	for j := range GemojiData {
-		i := strings.Index(s, GemojiData[j].Emoji)
-		if i != -1 {
-			if _, ok := found[i]; !ok {
-				if len(keys) == 0 || i < keys[0] {
-					found[i] = j
-					keys = []int{i}
-				}
-				if i == 0 {
-					break
-				}
-			}
-		}
-	}
-
-	if len(keys) > 0 {
-		index := keys[0]
-		return []int{index, index + len(GemojiData[found[index]].Emoji)}
-	}
-
-	return nil
+	// A faster and clean implementation would copy the trie tree formation in strings.NewReplacer but
+	// we can be lazy here.
+	//
+	// The implementation of strings.Replacer.WriteString is such that the first index of the emoji
+	// submatch is simply the second thing that is written to WriteString in the writer.
+	//
+	// Therefore we can simply take the index of the second write as our first emoji
+	//
+	// FIXME: just copy the trie implementation from strings.NewReplacer
+	_, _ = emptyReplacer.WriteString(&secondWriteWriter, s)
+
+	// if we wrote less than twice then we never "replaced"
+	if secondWriteWriter.writecount < 2 {
+		return nil
+	}
+
+	return []int{secondWriteWriter.idx, secondWriteWriter.end}
 }
@@ -8,6 +8,8 @@ package emoji
 import (
 	"reflect"
 	"testing"
+
+	"github.com/stretchr/testify/assert"
 )
 
 func TestDumpInfo(t *testing.T) {
@@ -65,3 +67,34 @@ func TestReplacers(t *testing.T) {
 		}
 	}
 }
+
+func TestFindEmojiSubmatchIndex(t *testing.T) {
+	type testcase struct {
+		teststring string
+		expected   []int
+	}
+
+	testcases := []testcase{
+		{
+			"\U0001f44d",
+			[]int{0, len("\U0001f44d")},
+		},
+		{
+			"\U0001f44d +1 \U0001f44d \U0001f37a",
+			[]int{0, 4},
+		},
+		{
+			" \U0001f44d",
+			[]int{1, 1 + len("\U0001f44d")},
+		},
+		{
+			string([]byte{'\u0001'}) + "\U0001f44d",
+			[]int{1, 1 + len("\U0001f44d")},
+		},
+	}
+
+	for _, kase := range testcases {
+		actual := FindEmojiSubmatchIndex(kase.teststring)
+		assert.Equal(t, kase.expected, actual)
+	}
+}
@@ -47,7 +47,7 @@ func GetRawDiffForFile(repoPath, startCommit, endCommit string, diffType RawDiff
 func GetRepoRawDiffForFile(repo *Repository, startCommit, endCommit string, diffType RawDiffType, file string, writer io.Writer) error {
 	commit, err := repo.GetCommit(endCommit)
 	if err != nil {
-		return fmt.Errorf("GetCommit: %v", err)
+		return err
 	}
 	fileArgs := make([]string, 0)
 	if len(file) > 0 {
@@ -47,14 +47,7 @@ func (repo *Repository) GetBranchCommitID(name string) (string, error) {
 
 // GetTagCommitID returns last commit ID string of given tag.
 func (repo *Repository) GetTagCommitID(name string) (string, error) {
-	stdout, err := NewCommand("rev-list", "-n", "1", TagPrefix+name).RunInDir(repo.Path)
-	if err != nil {
-		if strings.Contains(err.Error(), "unknown revision or path") {
-			return "", ErrNotExist{name, ""}
-		}
-		return "", err
-	}
-	return strings.TrimSpace(stdout), nil
+	return repo.GetRefCommitID(TagPrefix + name)
 }
 
 func convertPGPSignatureForTag(t *object.Tag) *CommitGPGSignature {
@@ -44,7 +44,7 @@ func (repo *Repository) GetLanguageStats(commitID string) (map[string]int64, err
 
 	sizes := make(map[string]int64)
 	err = tree.Files().ForEach(func(f *object.File) error {
-		if f.Size == 0 || enry.IsVendor(f.Name) || enry.IsDotFile(f.Name) ||
+		if f.Size == 0 || analyze.IsVendor(f.Name) || enry.IsDotFile(f.Name) ||
 			enry.IsDocumentation(f.Name) || enry.IsConfiguration(f.Name) {
 			return nil
 		}
@@ -175,7 +175,7 @@ func NewBleveIndexer(indexDir string) (*BleveIndexer, bool, error) {
 
 func (b *BleveIndexer) addUpdate(commitSha string, update fileUpdate, repo *models.Repository, batch rupture.FlushingBatch) error {
 	// Ignore vendored files in code search
-	if setting.Indexer.ExcludeVendored && enry.IsVendor(update.Filename) {
+	if setting.Indexer.ExcludeVendored && analyze.IsVendor(update.Filename) {
 		return nil
 	}
 
@@ -170,7 +170,7 @@ func (b *ElasticSearchIndexer) init() (bool, error) {
 
 func (b *ElasticSearchIndexer) addUpdate(sha string, update fileUpdate, repo *models.Repository) ([]elastic.BulkableRequest, error) {
 	// Ignore vendored files in code search
-	if setting.Indexer.ExcludeVendored && enry.IsVendor(update.Filename) {
+	if setting.Indexer.ExcludeVendored && analyze.IsVendor(update.Filename) {
 		return nil, nil
 	}
 
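The three hunks above swap `enry.IsVendor` for a project-local `analyze.IsVendor` in language stats and both code-search indexers. The real matcher lives in Gitea's `modules/analyze` package; the underlying idea, matching a path against a fixed set of vendor-directory patterns, can be sketched like this (the pattern list below is a small illustrative subset, not Gitea's actual list):

```go
package main

import (
	"fmt"
	"regexp"
)

// vendorPatterns is an illustrative subset of vendor-path patterns; the real
// analyze.IsVendor in Gitea compiles its own, much larger, list.
var vendorPatterns = regexp.MustCompile(`(?:^|/)(?:vendor|node_modules|third_party)/`)

// IsVendor reports whether a path looks like vendored code.
func IsVendor(path string) bool {
	return vendorPatterns.MatchString(path)
}

func main() {
	fmt.Println(IsVendor("vendor/github.com/foo/bar.go")) // true
	fmt.Println(IsVendor("cmd/web.go"))                   // false
}
```

Keeping the matcher in-tree decouples Gitea's vendoring detection from behavior changes in the upstream enry library.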
@@ -9,6 +9,7 @@ import (
 	"encoding/hex"
 	"errors"
 	"fmt"
+	"hash"
 	"io"
 	"os"
 
@@ -66,15 +67,20 @@ func (s *ContentStore) Get(meta *models.LFSMetaObject, fromByte int64) (io.ReadC
 
 // Put takes a Meta object and an io.Reader and writes the content to the store.
 func (s *ContentStore) Put(meta *models.LFSMetaObject, r io.Reader) error {
-	hash := sha256.New()
-	rd := io.TeeReader(r, hash)
 	p := meta.RelativePath()
-	written, err := s.Save(p, rd)
+
+	// Wrap the provided reader with an inline hashing and size checker
+	wrappedRd := newHashingReader(meta.Size, meta.Oid, r)
+
+	// now pass the wrapped reader to Save - if there is a size mismatch or hash mismatch then
+	// the errors returned by the newHashingReader should percolate up to here
+	written, err := s.Save(p, wrappedRd, meta.Size)
 	if err != nil {
 		log.Error("Whilst putting LFS OID[%s]: Failed to copy to tmpPath: %s Error: %v", meta.Oid, p, err)
 		return err
 	}
 
+	// This shouldn't happen but it is sensible to test
 	if written != meta.Size {
 		if err := s.Delete(p); err != nil {
 			log.Error("Cleaning the LFS OID[%s] failed: %v", meta.Oid, err)
@@ -82,14 +88,6 @@ func (s *ContentStore) Put(meta *models.LFSMetaObject, r io.Reader) error {
 		return errSizeMismatch
 	}
 
-	shaStr := hex.EncodeToString(hash.Sum(nil))
-	if shaStr != meta.Oid {
-		if err := s.Delete(p); err != nil {
-			log.Error("Cleaning the LFS OID[%s] failed: %v", meta.Oid, err)
-		}
-		return errHashMismatch
-	}
-
 	return nil
 }
 
@@ -118,3 +116,45 @@ func (s *ContentStore) Verify(meta *models.LFSMetaObject) (bool, error) {
 
 	return true, nil
 }
+
+type hashingReader struct {
+	internal     io.Reader
+	currentSize  int64
+	expectedSize int64
+	hash         hash.Hash
+	expectedHash string
+}
+
+func (r *hashingReader) Read(b []byte) (int, error) {
+	n, err := r.internal.Read(b)
+
+	if n > 0 {
+		r.currentSize += int64(n)
+		wn, werr := r.hash.Write(b[:n])
+		if wn != n || werr != nil {
+			return n, werr
+		}
+	}
+
+	if err != nil && err == io.EOF {
+		if r.currentSize != r.expectedSize {
+			return n, errSizeMismatch
+		}
+
+		shaStr := hex.EncodeToString(r.hash.Sum(nil))
+		if shaStr != r.expectedHash {
+			return n, errHashMismatch
+		}
+	}
+
+	return n, err
+}
+
+func newHashingReader(expectedSize int64, expectedHash string, reader io.Reader) *hashingReader {
+	return &hashingReader{
+		internal:     reader,
+		expectedSize: expectedSize,
+		expectedHash: expectedHash,
+		hash:         sha256.New(),
+	}
+}
@@ -298,19 +298,27 @@ func RenderEmoji(
 	return ctx.postProcess(rawHTML)
 }
 
+var tagCleaner = regexp.MustCompile(`<((?:/?\w+/\w+)|(?:/[\w ]+/)|(/?[hH][tT][mM][lL]\b)|(/?[hH][eE][aA][dD]\b))`)
+var nulCleaner = strings.NewReplacer("\000", "")
+
 func (ctx *postProcessCtx) postProcess(rawHTML []byte) ([]byte, error) {
 	if ctx.procs == nil {
 		ctx.procs = defaultProcessors
 	}
 
 	// give a generous extra 50 bytes
-	res := make([]byte, 0, len(rawHTML)+50)
-	res = append(res, "<html><body>"...)
-	res = append(res, rawHTML...)
-	res = append(res, "</body></html>"...)
+	res := bytes.NewBuffer(make([]byte, 0, len(rawHTML)+50))
+	// prepend "<html><body>"
+	_, _ = res.WriteString("<html><body>")
+
+	// Strip out nuls - they're always invalid
+	_, _ = res.Write(tagCleaner.ReplaceAll([]byte(nulCleaner.Replace(string(rawHTML))), []byte("&lt;$1")))
+
+	// close the tags
+	_, _ = res.WriteString("</body></html>")
 
 	// parse the HTML
-	nodes, err := html.ParseFragment(bytes.NewReader(res), nil)
+	nodes, err := html.ParseFragment(res, nil)
 	if err != nil {
 		return nil, &postProcessError{"invalid HTML", err}
 	}
@@ -347,17 +355,17 @@ func (ctx *postProcessCtx) postProcess(rawHTML []byte) ([]byte, error) {
 	// Create buffer in which the data will be placed again. We know that the
 	// length will be at least that of res; to spare a few alloc+copy, we
 	// reuse res, resetting its length to 0.
-	buf := bytes.NewBuffer(res[:0])
+	res.Reset()
 	// Render everything to buf.
 	for _, node := range nodes {
-		err = html.Render(buf, node)
+		err = html.Render(res, node)
 		if err != nil {
 			return nil, &postProcessError{"error rendering processed HTML", err}
 		}
 	}
 
 	// Everything done successfully, return parsed data.
-	return buf.Bytes(), nil
+	return res.Bytes(), nil
 }
 
 func (ctx *postProcessCtx) visitNode(node *html.Node, visitText bool) {
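The new `tagCleaner`/`nulCleaner` pre-pass neutralizes NUL bytes and pseudo-tags (such as `<foo/bar>` or a stray `<html>`/`<head>`) before the fragment parser sees them, which is the fuzzer-found escape the security fix closes. A quick demonstration using the same pattern (the `clean` wrapper is illustrative):

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// Same pattern as the tagCleaner added in the diff above: it targets "tags"
// like <foo/bar> or a bare <html>/<head> that confuse the fragment parser.
var tagCleaner = regexp.MustCompile(`<((?:/?\w+/\w+)|(?:/[\w ]+/)|(/?[hH][tT][mM][lL]\b)|(/?[hH][eE][aA][dD]\b))`)
var nulCleaner = strings.NewReplacer("\000", "")

// clean applies both cleaners, escaping the leading "<" of a pseudo-tag.
func clean(raw string) string {
	return tagCleaner.ReplaceAllString(nulCleaner.Replace(raw), "&lt;$1")
}

func main() {
	fmt.Println(clean("<a/b> stays escaped"))         // &lt;a/b> stays escaped
	fmt.Println(clean("<div>normal tag</div>"))       // <div>normal tag</div>
	fmt.Println(clean("nul\000byte dropped silently"))
}
```

Ordinary element tags do not match any alternative, so legitimate markup passes through untouched; only the constructs that could smuggle a new parsing context are defanged.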
@@ -10,6 +10,7 @@ import (
 	"regexp"
 	"strings"
 
+	"code.gitea.io/gitea/modules/log"
 	"code.gitea.io/gitea/modules/markup"
 	"code.gitea.io/gitea/modules/markup/common"
 	"code.gitea.io/gitea/modules/setting"
@@ -76,6 +77,12 @@ func (g *ASTTransformer) Transform(node *ast.Document, reader text.Reader, pc pa
 				header.ID = util.BytesToReadOnlyString(id.([]byte))
 			}
 			toc = append(toc, header)
+		} else {
+			for _, attr := range v.Attributes() {
+				if _, ok := attr.Value.([]byte); !ok {
+					v.SetAttribute(attr.Name, []byte(fmt.Sprintf("%v", attr.Value)))
+				}
+			}
 		}
 	case *ast.Image:
 		// Images need two things:
@@ -101,11 +108,41 @@ func (g *ASTTransformer) Transform(node *ast.Document, reader text.Reader, pc pa
 		parent := n.Parent()
 		// Create a link around image only if parent is not already a link
 		if _, ok := parent.(*ast.Link); !ok && parent != nil {
+			next := n.NextSibling()
+
+			// Create a link wrapper
 			wrap := ast.NewLink()
 			wrap.Destination = link
 			wrap.Title = v.Title
+
+			// Duplicate the current image node
+			image := ast.NewImage(ast.NewLink())
+			image.Destination = link
+			image.Title = v.Title
+			for _, attr := range v.Attributes() {
+				image.SetAttribute(attr.Name, attr.Value)
+			}
+			for child := v.FirstChild(); child != nil; {
+				next := child.NextSibling()
+				image.AppendChild(image, child)
+				child = next
+			}
+
+			// Append our duplicate image to the wrapper link
+			wrap.AppendChild(wrap, image)
+
+			// Wire in the next sibling
+			wrap.SetNextSibling(next)
+
+			// Replace the current node with the wrapper link
 			parent.ReplaceChild(parent, n, wrap)
-			wrap.AppendChild(wrap, n)
+
+			// But most importantly ensure the next sibling is still on the old image too
+			v.SetNextSibling(next)
+		} else {
+			log.Debug("ast.Image: %s has parent: %v", link, parent)
 		}
 	case *ast.Link:
 		// Links need their href to munged to be a real value
|
|||||||
package markdown
|
package markdown
|
||||||
|
|
||||||
import (
|
import (
|
||||||
"bytes"
|
"fmt"
|
||||||
|
"io"
|
||||||
"strings"
|
"strings"
|
||||||
"sync"
|
"sync"
|
||||||
|
|
||||||
@@ -18,7 +19,7 @@ import (
|
|||||||
|
|
||||||
chromahtml "github.com/alecthomas/chroma/formatters/html"
|
chromahtml "github.com/alecthomas/chroma/formatters/html"
|
||||||
"github.com/yuin/goldmark"
|
"github.com/yuin/goldmark"
|
||||||
"github.com/yuin/goldmark-highlighting"
|
highlighting "github.com/yuin/goldmark-highlighting"
|
||||||
meta "github.com/yuin/goldmark-meta"
|
meta "github.com/yuin/goldmark-meta"
|
||||||
"github.com/yuin/goldmark/extension"
|
"github.com/yuin/goldmark/extension"
|
||||||
"github.com/yuin/goldmark/parser"
|
"github.com/yuin/goldmark/parser"
|
||||||
@@ -34,6 +35,44 @@ var urlPrefixKey = parser.NewContextKey()
|
|||||||
var isWikiKey = parser.NewContextKey()
|
var isWikiKey = parser.NewContextKey()
|
||||||
var renderMetasKey = parser.NewContextKey()
|
var renderMetasKey = parser.NewContextKey()
|
||||||
|
|
||||||
|
type closesWithError interface {
|
||||||
|
io.WriteCloser
|
||||||
|
CloseWithError(err error) error
|
||||||
|
}
|
||||||
|
|
||||||
|
type limitWriter struct {
|
||||||
|
w closesWithError
|
||||||
|
sum int64
|
||||||
|
limit int64
|
||||||
|
}
|
||||||
|
|
||||||
|
// Write implements the standard Write interface:
|
||||||
|
func (l *limitWriter) Write(data []byte) (int, error) {
|
||||||
|
leftToWrite := l.limit - l.sum
|
||||||
|
if leftToWrite < int64(len(data)) {
|
||||||
|
n, err := l.w.Write(data[:leftToWrite])
|
||||||
|
l.sum += int64(n)
|
||||||
|
if err != nil {
|
||||||
|
return n, err
|
||||||
|
}
|
||||||
|
_ = l.w.Close()
|
||||||
|
return n, fmt.Errorf("Rendered content too large - truncating render")
|
||||||
|
}
|
||||||
|
n, err := l.w.Write(data)
|
||||||
|
l.sum += int64(n)
|
||||||
|
return n, err
|
||||||
|
}
|
||||||
|
|
||||||
|
// Close closes the writer
|
||||||
|
func (l *limitWriter) Close() error {
|
||||||
|
return l.w.Close()
|
||||||
|
}
|
||||||
|
|
||||||
|
// CloseWithError closes the writer
|
||||||
|
func (l *limitWriter) CloseWithError(err error) error {
|
||||||
|
return l.w.CloseWithError(err)
|
||||||
|
}
|
||||||
|
|
||||||
// NewGiteaParseContext creates a parser.Context with the gitea context set
|
// NewGiteaParseContext creates a parser.Context with the gitea context set
|
||||||
func NewGiteaParseContext(urlPrefix string, metas map[string]string, isWiki bool) parser.Context {
|
func NewGiteaParseContext(urlPrefix string, metas map[string]string, isWiki bool) parser.Context {
|
||||||
pc := parser.NewContext(parser.WithIDs(newPrefixedIDs()))
|
pc := parser.NewContext(parser.WithIDs(newPrefixedIDs()))
|
||||||
@@ -43,8 +82,8 @@ func NewGiteaParseContext(urlPrefix string, metas map[string]string, isWiki bool
|
|||||||
return pc
|
return pc
|
||||||
}
|
}
|
||||||
|
|
||||||
// render renders Markdown to HTML without handling special links.
|
// actualRender renders Markdown to HTML without handling special links.
|
||||||
func render(body []byte, urlPrefix string, metas map[string]string, wikiMarkdown bool) []byte {
|
func actualRender(body []byte, urlPrefix string, metas map[string]string, wikiMarkdown bool) []byte {
|
||||||
once.Do(func() {
|
once.Do(func() {
|
||||||
converter = goldmark.New(
|
converter = goldmark.New(
|
||||||
goldmark.WithExtensions(extension.Table,
|
goldmark.WithExtensions(extension.Table,
|
||||||
@@ -119,12 +158,57 @@ func render(body []byte, urlPrefix string, metas map[string]string, wikiMarkdown
|
|||||||
|
|
||||||
})
|
})
|
||||||
|
|
||||||
pc := NewGiteaParseContext(urlPrefix, metas, wikiMarkdown)
|
rd, wr := io.Pipe()
|
||||||
var buf bytes.Buffer
|
defer func() {
|
||||||
if err := converter.Convert(giteautil.NormalizeEOL(body), &buf, parser.WithContext(pc)); err != nil {
|
_ = rd.Close()
|
||||||
log.Error("Unable to render: %v", err)
|
_ = wr.Close()
|
||||||
|
}()
|
||||||
|
|
||||||
|
lw := &limitWriter{
|
||||||
|
w: wr,
|
||||||
|
limit: setting.UI.MaxDisplayFileSize * 3,
|
||||||
}
|
}
|
||||||
return markup.SanitizeReader(&buf).Bytes()
|
|
||||||
|
// FIXME: should we include a timeout that closes the pipe to abort the parser and sanitizer if it takes too long?
|
||||||
|
go func() {
|
||||||
|
defer func() {
|
||||||
|
err := recover()
|
||||||
|
if err == nil {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
log.Warn("Unable to render markdown due to panic in goldmark: %v", err)
|
||||||
|
if log.IsDebug() {
|
||||||
|
log.Debug("Panic in markdown: %v\n%s", err, string(log.Stack(2)))
|
||||||
|
}
|
||||||
|
_ = lw.CloseWithError(fmt.Errorf("%v", err))
|
||||||
|
}()
|
||||||
|
|
||||||
|
pc := NewGiteaParseContext(urlPrefix, metas, wikiMarkdown)
|
||||||
|
if err := converter.Convert(giteautil.NormalizeEOL(body), lw, parser.WithContext(pc)); err != nil {
|
||||||
|
log.Error("Unable to render: %v", err)
|
||||||
|
_ = lw.CloseWithError(err)
|
||||||
|
return
|
||||||
|
}
|
||||||
|
_ = lw.Close()
|
||||||
|
}()
|
||||||
|
return markup.SanitizeReader(rd).Bytes()
|
||||||
|
}
|
||||||
|
|
||||||
|
func render(body []byte, urlPrefix string, metas map[string]string, wikiMarkdown bool) (ret []byte) {
|
||||||
|
defer func() {
|
||||||
|
err := recover()
|
||||||
|
if err == nil {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
log.Warn("Unable to render markdown due to panic in goldmark - will return sanitized raw bytes")
|
||||||
|
if log.IsDebug() {
|
||||||
|
log.Debug("Panic in markdown: %v\n%s", err, string(log.Stack(2)))
|
||||||
|
}
|
||||||
|
ret = markup.SanitizeBytes(body)
|
||||||
|
}()
|
||||||
|
return actualRender(body, urlPrefix, metas, wikiMarkdown)
|
||||||
}
|
}
|
||||||
|
|
||||||
var (
|
var (
|
||||||
|
|||||||
@@ -308,3 +308,34 @@ func TestRender_RenderParagraphs(t *testing.T) {
 	test(t, "A\n\nB\nC\n", 2)
 	test(t, "A\n\n\nB\nC\n", 2)
 }
+
+func TestMarkdownRenderRaw(t *testing.T) {
+	testcases := [][]byte{
+		{ // clusterfuzz_testcase_minimized_fuzz_markdown_render_raw_6267570554535936
+			0x2a, 0x20, 0x2d, 0x0a, 0x09, 0x20, 0x60, 0x5b, 0x0a, 0x09, 0x20, 0x60,
+			0x5b,
+		},
+		{ // clusterfuzz_testcase_minimized_fuzz_markdown_render_raw_6278827345051648
+			0x2d, 0x20, 0x2d, 0x0d, 0x09, 0x60, 0x0d, 0x09, 0x60,
+		},
+		{ // clusterfuzz_testcase_minimized_fuzz_markdown_render_raw_6016973788020736[] = {
+			0x7b, 0x63, 0x6c, 0x61, 0x73, 0x73, 0x3d, 0x35, 0x7d, 0x0a, 0x3d,
+		},
+	}
+
+	for _, testcase := range testcases {
+		_ = RenderRaw(testcase, "", false)
+	}
+}
+
+func TestRenderSiblingImages_Issue12925(t *testing.T) {
+	testcase := `![image1](/image1)
+![image2](/image2)
+`
+	expected := `<p><a href="/image1" rel="nofollow"><img src="/image1" alt="image1"></a><br>
+<a href="/image2" rel="nofollow"><img src="/image2" alt="image2"></a></p>
+`
+	res := string(RenderRaw([]byte(testcase), "", false))
+	assert.Equal(t, expected, res)
+}
@@ -46,7 +46,9 @@ func ReplaceSanitizer() {
 	sanitizer.policy.AllowAttrs("checked", "disabled", "readonly").OnElements("input")
 
 	// Custom URL-Schemes
-	sanitizer.policy.AllowURLSchemes(setting.Markdown.CustomURLSchemes...)
+	if len(setting.Markdown.CustomURLSchemes) > 0 {
+		sanitizer.policy.AllowURLSchemes(setting.Markdown.CustomURLSchemes...)
+	}
 
 	// Allow keyword markup
 	sanitizer.policy.AllowAttrs("class").Matching(regexp.MustCompile(`^` + keywordClass + `$`)).OnElements("span")
@@ -6,6 +6,8 @@
 package markup
 
 import (
+	"html/template"
+	"strings"
 	"testing"
 
 	"github.com/stretchr/testify/assert"
@@ -50,3 +52,13 @@ func Test_Sanitizer(t *testing.T) {
 		assert.Equal(t, testCases[i+1], string(SanitizeBytes([]byte(testCases[i]))))
 	}
 }
+
+func TestSanitizeNonEscape(t *testing.T) {
+	descStr := "<scrİpt><script>alert(document.domain)</script></scrİpt>"
+
+	output := template.HTML(Sanitize(string(descStr)))
+	if strings.Contains(string(output), "<script>") {
+		t.Errorf("un-escaped <script> in output: %q", output)
+	}
+}
@@ -295,7 +295,8 @@ func (g *GiteaLocalUploader) CreateReleases(downloader base.Downloader, releases
 				}
 				rc = resp.Body
 			}
-			_, err = storage.Attachments.Save(attach.RelativePath(), rc)
+			defer rc.Close()
+			_, err = storage.Attachments.Save(attach.RelativePath(), rc, int64(*asset.Size))
 			return err
 		}()
 		if err != nil {
@@ -52,6 +52,13 @@ func isMigrateURLAllowed(remoteURL string) error {
 		}
 	}
 
+	if u.Host == "" {
+		if !setting.ImportLocalPaths {
+			return &models.ErrMigrationNotAllowed{Host: "<LOCAL_FILESYSTEM>"}
+		}
+		return nil
+	}
+
 	if !setting.Migrations.AllowLocalNetworks {
 		addrList, err := net.LookupIP(strings.Split(u.Host, ":")[0])
 		if err != nil {
@@ -31,4 +31,16 @@ func TestMigrateWhiteBlocklist(t *testing.T) {
 
 	err = isMigrateURLAllowed("https://github.com/go-gitea/gitea.git")
 	assert.Error(t, err)
+
+	old := setting.ImportLocalPaths
+	setting.ImportLocalPaths = false
+
+	err = isMigrateURLAllowed("/home/foo/bar/goo")
+	assert.Error(t, err)
+
+	setting.ImportLocalPaths = true
+	err = isMigrateURLAllowed("/home/foo/bar/goo")
+	assert.NoError(t, err)
+
+	setting.ImportLocalPaths = old
 }
@@ -34,6 +34,7 @@ type Options struct {
 // KnownPublicEntries list all direct children in the `public` directory
 var KnownPublicEntries = []string{
 	"css",
+	"fonts",
 	"img",
 	"js",
 	"serviceworker.js",
@@ -120,7 +120,6 @@ func UploadRepoFiles(repo *models.Repository, doer *models.User, opts *UploadRep
 			return err
 		}
 		infos[i] = uploadInfo
-
 	} else if objectHash, err = t.HashObject(file); err != nil {
 		return err
 	}
@@ -128,7 +127,6 @@ func UploadRepoFiles(repo *models.Repository, doer *models.User, opts *UploadRep
 		// Add the object to the index
 		if err := t.AddObjectToIndex("100644", objectHash, path.Join(opts.TreePath, uploadInfo.upload.Name)); err != nil {
 			return err
-
 		}
 	}
 
@@ -165,28 +163,10 @@ func UploadRepoFiles(repo *models.Repository, doer *models.User, opts *UploadRep
 	// OK now we can insert the data into the store - there's no way to clean up the store
 	// once it's in there, it's in there.
 	contentStore := &lfs.ContentStore{ObjectStorage: storage.LFS}
-	for _, uploadInfo := range infos {
-		if uploadInfo.lfsMetaObject == nil {
-			continue
-		}
-		exist, err := contentStore.Exists(uploadInfo.lfsMetaObject)
-		if err != nil {
+	for _, info := range infos {
+		if err := uploadToLFSContentStore(info, contentStore); err != nil {
 			return cleanUpAfterFailure(&infos, t, err)
 		}
-		if !exist {
-			file, err := os.Open(uploadInfo.upload.LocalPath())
-			if err != nil {
-				return cleanUpAfterFailure(&infos, t, err)
-			}
-			defer file.Close()
-			// FIXME: Put regenerates the hash and copies the file over.
-			// I guess this strictly ensures the soundness of the store but this is inefficient.
-			if err := contentStore.Put(uploadInfo.lfsMetaObject, file); err != nil {
-				// OK Now we need to cleanup
-				// Can't clean up the store, once uploaded there they're there.
-				return cleanUpAfterFailure(&infos, t, err)
-			}
-		}
 	}
 
 	// Then push this tree to NewBranch
@@ -196,3 +176,29 @@ func UploadRepoFiles(repo *models.Repository, doer *models.User, opts *UploadRep
 
 	return models.DeleteUploads(uploads...)
 }
+
+func uploadToLFSContentStore(info uploadInfo, contentStore *lfs.ContentStore) error {
+	if info.lfsMetaObject == nil {
+		return nil
+	}
+	exist, err := contentStore.Exists(info.lfsMetaObject)
+	if err != nil {
+		return err
+	}
+	if !exist {
+		file, err := os.Open(info.upload.LocalPath())
+		if err != nil {
+			return err
+		}
+
+		defer file.Close()
+		// FIXME: Put regenerates the hash and copies the file over.
+		// I guess this strictly ensures the soundness of the store but this is inefficient.
+		if err := contentStore.Put(info.lfsMetaObject, file); err != nil {
+			// OK Now we need to cleanup
+			// Can't clean up the store, once uploaded there they're there.
+			return err
+		}
+	}
+	return nil
+}
@@ -7,6 +7,7 @@ package storage
 import (
 	"context"
 	"io"
+	"io/ioutil"
 	"net/url"
 	"os"
 	"path/filepath"
@@ -24,13 +25,15 @@ const LocalStorageType Type = "local"
 
 // LocalStorageConfig represents the configuration for a local storage
 type LocalStorageConfig struct {
 	Path string `ini:"PATH"`
+	TemporaryPath string `ini:"TEMPORARY_PATH"`
 }
 
 // LocalStorage represents a local files storage
 type LocalStorage struct {
 	ctx    context.Context
 	dir    string
+	tmpdir string
 }
 
 // NewLocalStorage returns a local files
@@ -45,9 +48,14 @@ func NewLocalStorage(ctx context.Context, cfg interface{}) (ObjectStorage, error
 		return nil, err
 	}
 
+	if config.TemporaryPath == "" {
+		config.TemporaryPath = config.Path + "/tmp"
+	}
+
 	return &LocalStorage{
 		ctx: ctx,
 		dir: config.Path,
+		tmpdir: config.TemporaryPath,
 	}, nil
 }
 
@@ -57,23 +65,43 @@ func (l *LocalStorage) Open(path string) (Object, error) {
 }
 
 // Save a file
-func (l *LocalStorage) Save(path string, r io.Reader) (int64, error) {
+func (l *LocalStorage) Save(path string, r io.Reader, size int64) (int64, error) {
 	p := filepath.Join(l.dir, path)
 	if err := os.MkdirAll(filepath.Dir(p), os.ModePerm); err != nil {
 		return 0, err
 	}
 
-	// always override
-	if err := util.Remove(p); err != nil {
+	// Create a temporary file to save to
+	if err := os.MkdirAll(l.tmpdir, os.ModePerm); err != nil {
 		return 0, err
 	}
-
-	f, err := os.Create(p)
+	tmp, err := ioutil.TempFile(l.tmpdir, "upload-*")
 	if err != nil {
 		return 0, err
 	}
-	defer f.Close()
-	return io.Copy(f, r)
+	tmpRemoved := false
+	defer func() {
+		if !tmpRemoved {
+			_ = util.Remove(tmp.Name())
+		}
+	}()
+
+	n, err := io.Copy(tmp, r)
+	if err != nil {
+		return 0, err
+	}
+
+	if err := tmp.Close(); err != nil {
+		return 0, err
+	}
+
+	if err := os.Rename(tmp.Name(), p); err != nil {
+		return 0, err
+	}
+
+	tmpRemoved = true
+
+	return n, nil
 }
 
 // Stat returns the info of the file
@@ -129,13 +129,13 @@ func (m *MinioStorage) Open(path string) (Object, error) {
 }
 
 // Save save a file to minio
-func (m *MinioStorage) Save(path string, r io.Reader) (int64, error) {
+func (m *MinioStorage) Save(path string, r io.Reader, size int64) (int64, error) {
 	uploadInfo, err := m.client.PutObject(
 		m.ctx,
 		m.bucket,
 		m.buildMinioPath(path),
 		r,
-		-1,
+		size,
 		minio.PutObjectOptions{ContentType: "application/octet-stream"},
 	)
 	if err != nil {
@@ -65,7 +65,8 @@ type Object interface {
 // ObjectStorage represents an object storage to handle a bucket and files
 type ObjectStorage interface {
 	Open(path string) (Object, error)
-	Save(path string, r io.Reader) (int64, error)
+	// Save store a object, if size is unknown set -1
+	Save(path string, r io.Reader, size int64) (int64, error)
 	Stat(path string) (os.FileInfo, error)
 	Delete(path string) error
 	URL(path, name string) (*url.URL, error)
@@ -80,7 +81,13 @@ func Copy(dstStorage ObjectStorage, dstPath string, srcStorage ObjectStorage, sr
 	}
 	defer f.Close()
 
-	return dstStorage.Save(dstPath, f)
+	size := int64(-1)
+	fsinfo, err := f.Stat()
+	if err == nil {
+		size = fsinfo.Size()
+	}
+
+	return dstStorage.Save(dstPath, f, size)
 }
 
 // SaveFrom saves data to the ObjectStorage with path p from the callback
@@ -94,7 +101,7 @@ func SaveFrom(objStorage ObjectStorage, p string, callback func(w io.Writer) err
 		}
 	}()
 
-	_, err := objStorage.Save(p, pr)
+	_, err := objStorage.Save(p, pr, -1)
 	return err
 }
 
@@ -689,6 +689,11 @@ func ActionIcon(opType models.ActionType) string {
 // ActionContent2Commits converts action content to push commits
 func ActionContent2Commits(act Actioner) *repository.PushCommits {
 	push := repository.NewPushCommits()
+
+	if act == nil || act.GetContent() == "" {
+		return push
+	}
+
 	if err := json.Unmarshal([]byte(act.GetContent()), push); err != nil {
 		log.Error("json.Unmarshal:\n%s\nERROR: %v", act.GetContent(), err)
 	}
@@ -293,7 +293,6 @@ func CreatePullRequest(ctx *context.APIContext, form api.CreatePullRequestOption
 	var (
 		repo        = ctx.Repo.Repository
 		labelIDs    []int64
-		assigneeID  int64
 		milestoneID int64
 	)
 
@@ -354,7 +353,7 @@ func CreatePullRequest(ctx *context.APIContext, form api.CreatePullRequestOption
 	}
 
 	if form.Milestone > 0 {
-		milestone, err := models.GetMilestoneByRepoID(ctx.Repo.Repository.ID, milestoneID)
+		milestone, err := models.GetMilestoneByRepoID(ctx.Repo.Repository.ID, form.Milestone)
 		if err != nil {
 			if models.IsErrMilestoneNotExist(err) {
 				ctx.NotFound()
@@ -378,7 +377,6 @@ func CreatePullRequest(ctx *context.APIContext, form api.CreatePullRequestOption
 		PosterID:     ctx.User.ID,
 		Poster:       ctx.User,
 		MilestoneID:  milestoneID,
-		AssigneeID:   assigneeID,
 		IsPull:       true,
 		Content:      form.Body,
 		DeadlineUnix: deadlineUnix,
@@ -539,6 +539,10 @@ func updateBasicProperties(ctx *context.APIContext, opts api.EditRepoOption) err
 	if opts.Private != nil {
 		// Visibility of forked repository is forced sync with base repository.
 		if repo.IsFork {
+			if err := repo.GetBaseRepo(); err != nil {
+				ctx.Error(http.StatusInternalServerError, "Unable to load base repository", err)
+				return err
+			}
 			*opts.Private = repo.BaseRepo.IsPrivate
 		}
 
@@ -268,7 +268,11 @@ func DeleteOauth2Application(ctx *context.APIContext) {
 	//     "$ref": "#/responses/empty"
 	appID := ctx.ParamsInt64(":id")
 	if err := models.DeleteOAuth2Application(appID, ctx.User.ID); err != nil {
-		ctx.Error(http.StatusInternalServerError, "DeleteOauth2ApplicationByID", err)
+		if models.IsErrOAuthApplicationNotFound(err) {
+			ctx.NotFound()
+		} else {
+			ctx.Error(http.StatusInternalServerError, "DeleteOauth2ApplicationByID", err)
+		}
 		return
 	}
 
@@ -6,6 +6,7 @@
 package repo
 
 import (
+	"errors"
 	"path"
 	"strings"
 
@@ -341,6 +342,11 @@ func RawDiff(ctx *context.Context) {
 		git.RawDiffType(ctx.Params(":ext")),
 		ctx.Resp,
 	); err != nil {
+		if git.IsErrNotExist(err) {
+			ctx.NotFound("GetRawDiff",
+				errors.New("commit "+ctx.Params(":sha")+" does not exist."))
+			return
+		}
 		ctx.ServerError("GetRawDiff", err)
 		return
 	}
@@ -13,6 +13,7 @@ import (
 	"net/http"
 	"os"
 	"path"
+	"path/filepath"
 	"strings"
 	"text/template"
 	"time"
@@ -152,12 +153,21 @@ func storageHandler(storageSetting setting.Storage, prefix string, objStore stor
 			return
 		}
 
-		if !strings.HasPrefix(req.URL.RequestURI(), "/"+prefix) {
+		prefix := strings.Trim(prefix, "/")
+
+		if !strings.HasPrefix(req.URL.EscapedPath(), "/"+prefix+"/") {
 			return
 		}
-
-		rPath := strings.TrimPrefix(req.URL.RequestURI(), "/"+prefix)
+		rPath := strings.TrimPrefix(req.URL.EscapedPath(), "/"+prefix+"/")
+
 		rPath = strings.TrimPrefix(rPath, "/")
+		if rPath == "" {
+			ctx.Error(404, "file not found")
+			return
+		}
+		rPath = path.Clean("/" + filepath.ToSlash(rPath))
+		rPath = rPath[1:]
+
 		//If we have matched and access to release or issue
 		fr, err := objStore.Open(rPath)
 		if err != nil {
@@ -1014,6 +1014,11 @@ func parseHunks(curFile *DiffFile, maxLines, maxLineCharacters int, input *bufio
 			}
 			diffLine := &DiffLine{Type: DiffLineAdd, RightIdx: rightLine}
 			rightLine++
+			if curSection == nil {
+				// Create a new section to represent this hunk
+				curSection = &DiffSection{}
+				curFile.Sections = append(curFile.Sections, curSection)
+			}
 			curSection.Lines = append(curSection.Lines, diffLine)
 		case '-':
 			curFileLinesCount++
@@ -1026,6 +1031,11 @@ func parseHunks(curFile *DiffFile, maxLines, maxLineCharacters int, input *bufio
 			if leftLine > 0 {
 				leftLine++
 			}
+			if curSection == nil {
+				// Create a new section to represent this hunk
+				curSection = &DiffSection{}
+				curFile.Sections = append(curFile.Sections, curSection)
+			}
 			curSection.Lines = append(curSection.Lines, diffLine)
 		case ' ':
 			curFileLinesCount++
@@ -1036,6 +1046,11 @@ func parseHunks(curFile *DiffFile, maxLines, maxLineCharacters int, input *bufio
 			diffLine := &DiffLine{Type: DiffLinePlain, LeftIdx: leftLine, RightIdx: rightLine}
 			leftLine++
 			rightLine++
+			if curSection == nil {
+				// Create a new section to represent this hunk
+				curSection = &DiffSection{}
+				curFile.Sections = append(curFile.Sections, curSection)
+			}
 			curSection.Lines = append(curSection.Lines, diffLine)
 		default:
 			// This is unexpected
@@ -1282,6 +1297,16 @@ func CommentAsDiff(c *models.Comment) (*Diff, error) {
 
 // CommentMustAsDiff executes AsDiff and logs the error instead of returning
 func CommentMustAsDiff(c *models.Comment) *Diff {
+	if c == nil {
+		return nil
+	}
+	defer func() {
+		if err := recover(); err != nil {
+			stack := log.Stack(2)
+			log.Error("PANIC whilst retrieving diff for comment[%d] Error: %v\nStack: %s", c.ID, err, stack)
+			panic(fmt.Errorf("PANIC whilst retrieving diff for comment[%d] Error: %v\nStack: %s", c.ID, err, stack))
+		}
+	}()
 	diff, err := CommentAsDiff(c)
 	if err != nil {
 		log.Warn("CommentMustAsDiff: %v", err)
@@ -48,6 +48,9 @@ func Update(pull *models.PullRequest, doer *models.User, message string) error {
 
 // IsUserAllowedToUpdate check if user is allowed to update PR with given permissions and branch protections
 func IsUserAllowedToUpdate(pull *models.PullRequest, user *models.User) (bool, error) {
+	if user == nil {
+		return false, nil
+	}
 	headRepoPerm, err := models.GetUserRepoPermission(pull.HeadRepo, user)
 	if err != nil {
 		return false, err
@@ -12,8 +12,10 @@
 </head>
 
 <body>
-	<p><b>@{{.Release.Publisher.Name}}</b> released <a href="{{.Release.HTMLURL}}">{{.Release.TagName}}</a>
-	in <a href="{{AppUrl}}{{.Release.Repo.OwnerName}}/{{.Release.Repo.Name}}">{{.Release.Repo.FullName}} </p>
+	<p>
+		<b>@{{.Release.Publisher.Name}}</b> released <a href="{{.Release.HTMLURL}}">{{.Release.TagName}}</a>
+		in <a href="{{AppUrl}}{{.Release.Repo.OwnerName}}/{{.Release.Repo.Name}}">{{.Release.Repo.FullName}}</a>
+	</p>
 	<h4>Title: {{.Release.Title}}</h4>
 	<p>
 		Note: <br>
@@ -57,14 +57,12 @@
 				<div class="diff-file-box diff-box file-content">
 					<h4 class="ui top attached normal header rounded">
 						<div class="diff-counter count ui left">
-							{{if not $file.IsRenamed}}
-								<span class="add" data-line="{{.Addition}}">+ {{.Addition}}</span>
-								<span class="bar">
-									<div class="pull-left add"></div>
-									<div class="pull-left del"></div>
-								</span>
-								<span class="del" data-line="{{.Deletion}}">- {{.Deletion}}</span>
-							{{end}}
+							<span class="add" data-line="{{.Addition}}">+ {{.Addition}}</span>
+							<span class="bar">
+								<div class="pull-left add"></div>
+								<div class="pull-left del"></div>
+							</span>
+							<span class="del" data-line="{{.Deletion}}">- {{.Deletion}}</span>
 						</div>
 						<span class="file">{{$file.Name}}</span>
 						<div>{{$.i18n.Tr "repo.diff.file_suppressed"}}</div>
@@ -97,7 +95,7 @@
 					<div class="diff-counter count">
 						{{if $file.IsBin}}
 							{{$.i18n.Tr "repo.diff.bin"}}
-						{{else if not $file.IsRenamed}}
+						{{else}}
 							<span class="add" data-line="{{.Addition}}">+ {{.Addition}}</span>
 							<span class="bar">
 								<div class="pull-left add"></div>
@@ -119,138 +117,136 @@
 				{{end}}
 			</h4>
 			<div class="diff-file-body ui attached unstackable table segment">
-				{{if ne $file.Type 4}}
 				<div class="file-body file-code has-context-menu code-diff {{if $.IsSplitStyle}}code-diff-split{{else}}code-diff-unified{{end}}">
 					<table class="chroma">
 						<tbody>
 							{{if $isImage}}
 								{{template "repo/diff/image_diff" dict "file" . "root" $}}
 							{{else}}
 								{{if $.IsSplitStyle}}
 									{{range $j, $section := $file.Sections}}
 										{{range $k, $line := $section.Lines}}
 											<tr class="{{DiffLineTypeToStr .GetType}}-code nl-{{$k}} ol-{{$k}}">
 												{{if eq .GetType 4}}
 													<td class="lines-num lines-num-old">
 														{{if or (eq $line.GetExpandDirection 3) (eq $line.GetExpandDirection 5) }}
 															<a role="button" class="blob-excerpt" data-url="{{$.RepoLink}}/blob_excerpt/{{$.AfterCommitID}}" data-query="{{$line.GetBlobExcerptQuery}}&style=split&direction=down" data-anchor="diff-{{Sha1 $file.Name}}K{{$line.SectionInfo.RightIdx}}">
 																{{svg "octicon-fold-down"}}
 															</a>
 														{{end}}
 														{{if or (eq $line.GetExpandDirection 3) (eq $line.GetExpandDirection 4) }}
 															<a role="button" class="blob-excerpt" data-url="{{$.RepoLink}}/blob_excerpt/{{$.AfterCommitID}}" data-query="{{$line.GetBlobExcerptQuery}}&style=split&direction=up" data-anchor="diff-{{Sha1 $file.Name}}K{{$line.SectionInfo.RightIdx}}">
 																{{svg "octicon-fold-up"}}
 															</a>
 														{{end}}
 														{{if eq $line.GetExpandDirection 2}}
 															<a role="button" class="blob-excerpt" data-url="{{$.RepoLink}}/blob_excerpt/{{$.AfterCommitID}}" data-query="{{$line.GetBlobExcerptQuery}}&style=split&direction=" data-anchor="diff-{{Sha1 $file.Name}}K{{$line.SectionInfo.RightIdx}}">
 																{{svg "octicon-fold"}}
 															</a>
 														{{end}}
 													</td>
 													<td colspan="5" class="lines-code lines-code-old "><span class="mono wrap">{{$section.GetComputedInlineDiffFor $line}}</span></td>
 												{{else}}
 													<td class="lines-num lines-num-old" data-line-num="{{if $line.LeftIdx}}{{$line.LeftIdx}}{{end}}"><span rel="{{if $line.LeftIdx}}diff-{{Sha1 $file.Name}}L{{$line.LeftIdx}}{{end}}"></span></td>
 													<td class="lines-type-marker lines-type-marker-old">{{if $line.LeftIdx}}<span class="mono" data-type-marker="{{$line.GetLineTypeMarker}}"></span>{{end}}</td>
 													<td class="lines-code lines-code-old halfwidth">{{if and $.SignedUserID $line.CanComment $.PageIsPullFiles (not (eq .GetType 2))}}<a class="ui green button add-code-comment add-code-comment-left" data-path="{{$file.Name}}" data-side="left" data-idx="{{$line.LeftIdx}}" data-type-marker="+"></a>{{end}}<span class="mono wrap">{{if $line.LeftIdx}}{{$section.GetComputedInlineDiffFor $line}}{{end}}</span></td>
 													<td class="lines-num lines-num-new" data-line-num="{{if $line.RightIdx}}{{$line.RightIdx}}{{end}}"><span rel="{{if $line.RightIdx}}diff-{{Sha1 $file.Name}}R{{$line.RightIdx}}{{end}}"></span></td>
 													<td class="lines-type-marker lines-type-marker-new">{{if $line.RightIdx}}<span class="mono" data-type-marker="{{$line.GetLineTypeMarker}}"></span>{{end}}</td>
 													<td class="lines-code lines-code-new halfwidth">{{if and $.SignedUserID $line.CanComment $.PageIsPullFiles (not (eq .GetType 3))}}<a class="ui green button add-code-comment add-code-comment-right" data-path="{{$file.Name}}" data-side="right" data-idx="{{$line.RightIdx}}" data-type-marker="+"></a>{{end}}<span class="mono wrap">{{if $line.RightIdx}}{{$section.GetComputedInlineDiffFor $line}}{{end}}</span></td>
-												{{end}}
-											</tr>
-											{{if gt (len $line.Comments) 0}}
-												{{$resolved := (index $line.Comments 0).IsResolved}}
-												{{$resolveDoer := (index $line.Comments 0).ResolveDoer}}
-												{{$isNotPending := (not (eq (index $line.Comments 0).Review.Type 0))}}
-												<tr class="add-code-comment">
-													<td class="lines-num"></td>
-													<td class="lines-type-marker"></td>
-													<td class="add-comment-left">
-														<div class="conversation-holder">
-															{{if and $resolved (eq $line.GetCommentSide "previous")}}
-																<div class="ui top attached header">
-																	<span class="ui grey text left"><b>{{$resolveDoer.Name}}</b> {{$.i18n.Tr "repo.issues.review.resolved_by"}}</span>
-																	<button id="show-outdated-{{(index $line.Comments 0).ID}}" data-comment="{{(index $line.Comments 0).ID}}" class="ui compact right labeled button show-outdated">
-																		{{svg "octicon-unfold"}}
-																		{{$.i18n.Tr "repo.issues.review.show_resolved"}}
-																	</button>
-																	<button id="hide-outdated-{{(index $line.Comments 0).ID}}" data-comment="{{(index $line.Comments 0).ID}}" class="hide ui compact right labeled button hide-outdated">
-																		{{svg "octicon-fold"}}
-																		{{$.i18n.Tr "repo.issues.review.hide_resolved"}}
-																	</button>
-																</div>
-															{{end}}
-															{{if eq $line.GetCommentSide "previous"}}
-																<div id="code-comments-{{(index $line.Comments 0).ID}}" class="field comment-code-cloud {{if $resolved}}hide{{end}}">
-																	<div class="comment-list">
-																		<ui class="ui comments">
-																			{{ template "repo/diff/comments" dict "root" $ "comments" $line.Comments}}
-																		</ui>
-																	</div>
-																	{{template "repo/diff/comment_form_datahandler" dict "reply" (index $line.Comments 0).ReviewID "hidden" true "root" $ "comment" (index $line.Comments 0)}}
-																	{{if and $.CanMarkConversation $isNotPending}}
-																		<button class="ui icon tiny button resolve-conversation" data-action="{{if not $resolved}}Resolve{{else}}UnResolve{{end}}" data-comment-id="{{(index $line.Comments 0).ID}}" data-update-url="{{$.RepoLink}}/issues/resolve_conversation" >
-																			{{if $resolved}}
-																				{{$.i18n.Tr "repo.issues.review.un_resolve_conversation"}}
-																			{{else}}
-																				{{$.i18n.Tr "repo.issues.review.resolve_conversation"}}
-																			{{end}}
-																		</button>
-																	{{end}}
-																</div>
-															{{end}}
-														</div>
-													</td>
-													<td class="lines-num"></td>
-													<td class="lines-type-marker"></td>
-													<td class="add-comment-right">
-														<div class="conversation-holder">
-															{{if and $resolved (eq $line.GetCommentSide "proposed")}}
-																<div class="ui top attached header">
-																	<span class="ui grey text left"><b>{{$resolveDoer.Name}}</b> {{$.i18n.Tr "repo.issues.review.resolved_by"}}</span>
-																	<button id="show-outdated-{{(index $line.Comments 0).ID}}" data-comment="{{(index $line.Comments 0).ID}}" class="ui compact right labeled button show-outdated">
-																		{{svg "octicon-unfold"}}
-																		{{$.i18n.Tr "repo.issues.review.show_resolved"}}
-																	</button>
-																	<button id="hide-outdated-{{(index $line.Comments 0).ID}}" data-comment="{{(index $line.Comments 0).ID}}" class="hide ui compact right labeled button hide-outdated">
-																		{{svg "octicon-fold"}}
-																		{{$.i18n.Tr "repo.issues.review.hide_resolved"}}
-																	</button>
-																</div>
-															{{end}}
-															{{if eq $line.GetCommentSide "proposed"}}
-																<div id="code-comments-{{(index $line.Comments 0).ID}}" class="field comment-code-cloud {{if $resolved}}hide{{end}}">
-																	<div class="comment-list">
-																		<ui class="ui comments">
-																			{{ template "repo/diff/comments" dict "root" $ "comments" $line.Comments}}
-																		</ui>
-																	</div>
-																	{{template "repo/diff/comment_form_datahandler" dict "reply" (index $line.Comments 0).ReviewID "hidden" true "root" $ "comment" (index $line.Comments 0)}}
-																	{{if and $.CanMarkConversation $isNotPending}}
-																		<button class="ui icon tiny button resolve-conversation" data-action="{{if not $resolved}}Resolve{{else}}UnResolve{{end}}" data-comment-id="{{(index $line.Comments 0).ID}}" data-update-url="{{$.RepoLink}}/issues/resolve_conversation" >
-																			{{if $resolved}}
-																				{{$.i18n.Tr "repo.issues.review.un_resolve_conversation"}}
-																			{{else}}
-																				{{$.i18n.Tr "repo.issues.review.resolve_conversation"}}
-																			{{end}}
-																		</button>
-																	{{end}}
-																</div>
-															{{end}}
-														</div>
-													</td>
-												</tr>
 											{{end}}
+										</tr>
+										{{if gt (len $line.Comments) 0}}
+											{{$resolved := (index $line.Comments 0).IsResolved}}
+											{{$resolveDoer := (index $line.Comments 0).ResolveDoer}}
+											{{$isNotPending := (not (eq (index $line.Comments 0).Review.Type 0))}}
+											<tr class="add-code-comment">
+												<td class="lines-num"></td>
+												<td class="lines-type-marker"></td>
+												<td class="add-comment-left">
+													<div class="conversation-holder">
+														{{if and $resolved (eq $line.GetCommentSide "previous")}}
+															<div class="ui top attached header">
+																<span class="ui grey text left"><b>{{$resolveDoer.Name}}</b> {{$.i18n.Tr "repo.issues.review.resolved_by"}}</span>
+																<button id="show-outdated-{{(index $line.Comments 0).ID}}" data-comment="{{(index $line.Comments 0).ID}}" class="ui compact right labeled button show-outdated">
+																	{{svg "octicon-unfold"}}
+																	{{$.i18n.Tr "repo.issues.review.show_resolved"}}
+																</button>
+																<button id="hide-outdated-{{(index $line.Comments 0).ID}}" data-comment="{{(index $line.Comments 0).ID}}" class="hide ui compact right labeled button hide-outdated">
+																	{{svg "octicon-fold"}}
+																	{{$.i18n.Tr "repo.issues.review.hide_resolved"}}
+																</button>
+															</div>
+														{{end}}
+														{{if eq $line.GetCommentSide "previous"}}
+															<div id="code-comments-{{(index $line.Comments 0).ID}}" class="field comment-code-cloud {{if $resolved}}hide{{end}}">
+																<div class="comment-list">
+																	<ui class="ui comments">
+																		{{ template "repo/diff/comments" dict "root" $ "comments" $line.Comments}}
+																	</ui>
+																</div>
+																{{template "repo/diff/comment_form_datahandler" dict "reply" (index $line.Comments 0).ReviewID "hidden" true "root" $ "comment" (index $line.Comments 0)}}
+																{{if and $.CanMarkConversation $isNotPending}}
+																	<button class="ui icon tiny button resolve-conversation" data-action="{{if not $resolved}}Resolve{{else}}UnResolve{{end}}" data-comment-id="{{(index $line.Comments 0).ID}}" data-update-url="{{$.RepoLink}}/issues/resolve_conversation" >
+																		{{if $resolved}}
+																			{{$.i18n.Tr "repo.issues.review.un_resolve_conversation"}}
+																		{{else}}
+																			{{$.i18n.Tr "repo.issues.review.resolve_conversation"}}
+																		{{end}}
+																	</button>
+																{{end}}
+															</div>
+														{{end}}
+													</div>
+												</td>
+												<td class="lines-num"></td>
+												<td class="lines-type-marker"></td>
+												<td class="add-comment-right">
+													<div class="conversation-holder">
+														{{if and $resolved (eq $line.GetCommentSide "proposed")}}
+															<div class="ui top attached header">
+																<span class="ui grey text left"><b>{{$resolveDoer.Name}}</b> {{$.i18n.Tr "repo.issues.review.resolved_by"}}</span>
+																<button id="show-outdated-{{(index $line.Comments 0).ID}}" data-comment="{{(index $line.Comments 0).ID}}" class="ui compact right labeled button show-outdated">
+																	{{svg "octicon-unfold"}}
+																	{{$.i18n.Tr "repo.issues.review.show_resolved"}}
+																</button>
+																<button id="hide-outdated-{{(index $line.Comments 0).ID}}" data-comment="{{(index $line.Comments 0).ID}}" class="hide ui compact right labeled button hide-outdated">
+																	{{svg "octicon-fold"}}
+																	{{$.i18n.Tr "repo.issues.review.hide_resolved"}}
+																</button>
+															</div>
+														{{end}}
+														{{if eq $line.GetCommentSide "proposed"}}
+															<div id="code-comments-{{(index $line.Comments 0).ID}}" class="field comment-code-cloud {{if $resolved}}hide{{end}}">
+																<div class="comment-list">
+																	<ui class="ui comments">
+																		{{ template "repo/diff/comments" dict "root" $ "comments" $line.Comments}}
|
</ui>
|
||||||
|
</div>
|
||||||
|
{{template "repo/diff/comment_form_datahandler" dict "reply" (index $line.Comments 0).ReviewID "hidden" true "root" $ "comment" (index $line.Comments 0)}}
|
||||||
|
{{if and $.CanMarkConversation $isNotPending}}
|
||||||
|
<button class="ui icon tiny button resolve-conversation" data-action="{{if not $resolved}}Resolve{{else}}UnResolve{{end}}" data-comment-id="{{(index $line.Comments 0).ID}}" data-update-url="{{$.RepoLink}}/issues/resolve_conversation" >
|
||||||
|
{{if $resolved}}
|
||||||
|
{{$.i18n.Tr "repo.issues.review.un_resolve_conversation"}}
|
||||||
|
{{else}}
|
||||||
|
{{$.i18n.Tr "repo.issues.review.resolve_conversation"}}
|
||||||
|
{{end}}
|
||||||
|
</button>
|
||||||
|
{{end}}
|
||||||
|
</div>
|
||||||
|
{{end}}
|
||||||
|
</div>
|
||||||
|
</td>
|
||||||
|
</tr>
|
||||||
{{end}}
|
{{end}}
|
||||||
{{end}}
|
{{end}}
|
||||||
{{else}}
|
|
||||||
{{template "repo/diff/section_unified" dict "file" . "root" $}}
|
|
||||||
{{end}}
|
{{end}}
|
||||||
|
{{else}}
|
||||||
|
{{template "repo/diff/section_unified" dict "file" . "root" $}}
|
||||||
{{end}}
|
{{end}}
|
||||||
</tbody>
|
{{end}}
|
||||||
</table>
|
</tbody>
|
||||||
</div>
|
</table>
|
||||||
{{end}}
|
</div>
|
||||||
</div>
|
</div>
|
||||||
</div>
|
</div>
|
||||||
{{end}}
|
{{end}}
|
||||||
|
@@ -3,7 +3,7 @@
 {{ $createdStr:= TimeSinceUnix .CreatedUnix $.root.Lang }}
 <div class="comment" id="{{.HashTag}}">
 	{{if .OriginalAuthor }}
-		<span class="avatar"><img src="/img/avatar_default.png"></span>
+		<span class="avatar"><img src="{{AppSubUrl}}/img/avatar_default.png"></span>
 	{{else}}
 		<a class="avatar" {{if gt .Poster.ID 0}}href="{{.Poster.HomeLink}}"{{end}}>
 			<img src="{{.Poster.RelAvatarLink}}">
@@ -4,8 +4,8 @@
 <div class="ui container">
 	<div class="ui three column stackable grid">
 		<div class="column">
-			<h3>{{.Milestone.Name}}</h3>
-			<div class="content">
+			<h1>{{.Milestone.Name}}</h1>
+			<div class="markdown content">
 				{{.Milestone.RenderedContent|Str2html}}
 			</div>
 		</div>
@@ -63,7 +63,7 @@
 <span class="info">{{.i18n.Tr "repo.issues.filter_label_exclude" | Safe}}</span>
 <a class="item" href="{{$.Link}}?q={{$.Keyword}}&type={{$.ViewType}}&sort={{$.SortType}}&state={{$.State}}&assignee={{$.AssigneeID}}">{{.i18n.Tr "repo.issues.filter_label_no_select"}}</a>
 {{range .Labels}}
-	<a class="item label-filter-item" href="{{$.Link}}?q={{$.Keyword}}&type={{$.ViewType}}&sort={{$.SortType}}&state={{$.State}}&labels={{.ID}}&assignee={{$.AssigneeID}}" data-label-id="{{.ID}}">{{if .IsExcluded}}{{svg "octicon-circle-slash"}}{{else if contain $.SelLabelIDs .ID}}{{svg "octicon-check"}}{{end}}<span class="label color" style="background-color: {{.Color}}"></span> {{.Name | RenderEmoji}}</a>
+	<a class="item label-filter-item" href="{{$.Link}}?q={{$.Keyword}}&type={{$.ViewType}}&sort={{$.SortType}}&state={{$.State}}&labels={{.QueryString}}&assignee={{$.AssigneeID}}" data-label-id="{{.ID}}">{{if .IsExcluded}}{{svg "octicon-circle-slash"}}{{else if contain $.SelLabelIDs .ID}}{{svg "octicon-check"}}{{end}}<span class="label color" style="background-color: {{.Color}}"></span> {{.Name | RenderEmoji}}</a>
 {{end}}
 </div>
 </div>
@@ -43,7 +43,7 @@
 <div class="milestone list">
 	{{range .Milestones}}
 		<li class="item">
-			{{svg "octicon-milestone"}} <a href="{{$.RepoLink}}/milestone/{{.ID}}">{{.Name}}</a>
+			{{svg "octicon-milestone" 16 "mr-2"}} <a href="{{$.RepoLink}}/milestone/{{.ID}}">{{.Name}}</a>
 			<div class="ui right green progress" data-percent="{{.Completeness}}">
 				<div class="bar" {{if not .Completeness}}style="background-color: transparent"{{end}}>
 					<div class="progress"></div>
@@ -80,7 +80,7 @@
 </div>
 {{end}}
 {{if .Content}}
-	<div class="content">
+	<div class="markdown content">
 		{{.RenderedContent|Str2html}}
 	</div>
 {{end}}
@@ -13,7 +13,7 @@
 <ui class="ui timeline">
 	<div id="{{.Issue.HashTag}}" class="timeline-item comment first">
 		{{if .Issue.OriginalAuthor }}
-			<span class="timeline-avatar"><img src="/img/avatar_default.png"></span>
+			<span class="timeline-avatar"><img src="{{AppSubUrl}}/img/avatar_default.png"></span>
 		{{else}}
 			<a class="timeline-avatar" {{if gt .Issue.Poster.ID 0}}href="{{.Issue.Poster.HomeLink}}"{{end}}>
 				<img src="{{.Issue.Poster.RelAvatarLink}}">
@@ -12,7 +12,7 @@
 {{if eq .Type 0}}
 	<div class="timeline-item comment" id="{{.HashTag}}">
 		{{if .OriginalAuthor }}
-			<span class="timeline-avatar"><img src="/img/avatar_default.png"></span>
+			<span class="timeline-avatar"><img src="{{AppSubUrl}}/img/avatar_default.png"></span>
 		{{else}}
 			<a class="timeline-avatar" {{if gt .Poster.ID 0}}href="{{.Poster.HomeLink}}"{{end}}>
 				<img src="{{.Poster.RelAvatarLink}}">
@@ -29,7 +29,7 @@
 </div>
 <div class="header-right actions df ac">
 	{{if not $.Repository.IsArchived}}
-		{{if or (and (eq .PosterID .Issue.PosterID) (eq .Issue.OriginalAuthorID 0)) (eq .Issue.OriginalAuthorID .OriginalAuthorID) }}
+		{{if or (and (eq .PosterID .Issue.PosterID) (eq .Issue.OriginalAuthorID 0)) (and (eq .Issue.OriginalAuthorID .OriginalAuthorID) (not (eq .OriginalAuthorID 0))) }}
 			<div class="item tag">
 				{{$.i18n.Tr "repo.issues.poster"}}
 			</div>
@@ -112,10 +112,23 @@
 <div class="card board-card" data-issue="{{.ID}}">
 	<div class="content">
 		<div class="header">
-			<span class="{{if .IsClosed}}red{{else}}green{{end}}">
-				{{if .IsPull}}{{svg "octicon-git-merge"}}
-				{{else if .IsClosed}}{{svg "octicon-issue-closed"}}
-				{{else}}{{svg "octicon-issue-opened"}}
+			<span>
+				{{if .IsPull}}
+					{{if .PullRequest.HasMerged}}
+						{{svg "octicon-git-merge" 16 "text purple"}}
+					{{else}}
+						{{if .IsClosed}}
+							{{svg "octicon-git-pull-request" 16 "text red"}}
+						{{else}}
+							{{svg "octicon-git-pull-request" 16 "text green"}}
+						{{end}}
+					{{end}}
+				{{else}}
+					{{if .IsClosed}}
+						{{svg "octicon-issue-closed" 16 "text red"}}
+					{{else}}
+						{{svg "octicon-issue-opened" 16 "text green"}}
+					{{end}}
 				{{end}}
 			</span>
 			<a class="project-board-title" href="{{$.RepoLink}}/issues/{{.Index}}">#{{.Index}} {{.Title}}</a>
@@ -96,7 +96,8 @@
 <span class="text truncate issue title">{{index .GetIssueInfos 1 | RenderEmoji}}</span>
 {{else if or (eq .GetOpType 10) (eq .GetOpType 21) (eq .GetOpType 22) (eq .GetOpType 23)}}
 	<a href="{{.GetCommentLink}}" class="text truncate issue title">{{.GetIssueTitle | RenderEmoji}}</a>
-	<p class="text light grey">{{index .GetIssueInfos 1 | RenderEmoji}}</p>
+	{{$comment := index .GetIssueInfos 1}}
+	{{if gt (len $comment) 0}}<p class="text light grey">{{$comment | RenderEmoji}}</p>{{end}}
 {{else if eq .GetOpType 11}}
 	<p class="text light grey">{{index .GetIssueInfos 1}}</p>
 {{else if or (eq .GetOpType 12) (eq .GetOpType 13) (eq .GetOpType 14) (eq .GetOpType 15)}}
vendor/github.com/chris-ramon/douceur/LICENSE (generated, vendored; 22 changed lines)

@@ -1,22 +0,0 @@
-The MIT License (MIT)
-
-Copyright (c) 2015 Aymerick JEHANNE
-
-Permission is hereby granted, free of charge, to any person obtaining a copy
-of this software and associated documentation files (the "Software"), to deal
-in the Software without restriction, including without limitation the rights
-to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom the Software is
-furnished to do so, subject to the following conditions:
-
-The above copyright notice and this permission notice shall be included in all
-copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
-OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
-SOFTWARE.
-
vendor/github.com/microcosm-cc/bluemonday/SECURITY.md (generated, vendored; new file, 15 lines)

@@ -0,0 +1,15 @@
+# Security Policy
+
+## Supported Versions
+
+Latest tag and tip are supported.
+
+Older tags remain present but changes result in new tags and are not back ported... please verify any issue against the latest tag and tip.
+
+## Reporting a Vulnerability
+
+Email: <bluemonday@buro9.com>
+
+Bluemonday is pure OSS and not maintained by a company. As such there is no bug bounty program but security issues will be taken seriously and resolved as soon as possible.
+
+The maintainer lives in the United Kingdom and whilst the email is monitored expect a reply or ACK when the maintainer is awake.
vendor/github.com/microcosm-cc/bluemonday/go.mod (generated, vendored; 7 changed lines)

@@ -1,10 +1,9 @@
 module github.com/microcosm-cc/bluemonday
 
-go 1.9
+go 1.16
 
 require (
-	github.com/aymerick/douceur v0.2.0 // indirect
-	github.com/chris-ramon/douceur v0.2.0
+	github.com/aymerick/douceur v0.2.0
 	github.com/gorilla/css v1.0.0 // indirect
-	golang.org/x/net v0.0.0-20181220203305-927f97764cc3
+	golang.org/x/net v0.0.0-20210331212208-0fccb6fa2b5c
 )
vendor/github.com/microcosm-cc/bluemonday/go.sum (generated, vendored; 11 changed lines)

@@ -1,8 +1,11 @@
 github.com/aymerick/douceur v0.2.0 h1:Mv+mAeH1Q+n9Fr+oyamOlAkUNPWPlA8PPGR0QAaYuPk=
 github.com/aymerick/douceur v0.2.0/go.mod h1:wlT5vV2O3h55X9m7iVYN0TBM0NH/MmbLnd30/FjWUq4=
-github.com/chris-ramon/douceur v0.2.0 h1:IDMEdxlEUUBYBKE4z/mJnFyVXox+MjuEVDJNN27glkU=
-github.com/chris-ramon/douceur v0.2.0/go.mod h1:wDW5xjJdeoMm1mRt4sD4c/LbF/mWdEpRXQKjTR8nIBE=
 github.com/gorilla/css v1.0.0 h1:BQqNyPTi50JCFMTw/b67hByjMVXZRwGha6wxVGkeihY=
 github.com/gorilla/css v1.0.0/go.mod h1:Dn721qIggHpt4+EFCcTLTU/vk5ySda2ReITrtgBl60c=
-golang.org/x/net v0.0.0-20181220203305-927f97764cc3 h1:eH6Eip3UpmR+yM/qI9Ijluzb1bNv/cAU/n+6l8tRSis=
-golang.org/x/net v0.0.0-20181220203305-927f97764cc3/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
+golang.org/x/net v0.0.0-20210331212208-0fccb6fa2b5c h1:KHUzaHIpjWVlVVNh65G3hhuj3KB1HnjY6Cq5cTvRQT8=
+golang.org/x/net v0.0.0-20210331212208-0fccb6fa2b5c/go.mod h1:p54w0d4576C0XHj96bSt6lcn1PtDYWL6XObtHCRCNQM=
+golang.org/x/sys v0.0.0-20201119102817-f84b799fce68/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
+golang.org/x/sys v0.0.0-20210330210617-4fbd30eecc44/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
+golang.org/x/term v0.0.0-20201126162022-7de9c90e9dd1/go.mod h1:bj7SfCRtBDWHUb9snDiAeCFNEtKQo2Wmx5Cou7ajbmo=
+golang.org/x/text v0.3.3/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=
+golang.org/x/tools v0.0.0-20180917221912-90fa682c2a6e/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=
vendor/github.com/microcosm-cc/bluemonday/handlers.go (generated, vendored; 1 changed line)

@@ -26,6 +26,7 @@
 // CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
 // OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
 // OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
 
 package bluemonday
 
 import (
vendor/github.com/microcosm-cc/bluemonday/policy.go (generated, vendored; 45 changed lines)

@@ -69,6 +69,9 @@ type Policy struct {
 	// Will skip for href="/foo" or href="foo"
 	requireNoReferrerFullyQualifiedLinks bool
 
+	// When true, add crossorigin="anonymous" to HTML audio, img, link, script, and video tags
+	requireCrossOriginAnonymous bool
+
 	// When true add target="_blank" to fully qualified links
 	// Will add for href="http://foo"
 	// Will skip for href="/foo" or href="foo"
@@ -433,25 +436,25 @@ func (spb *stylePolicyBuilder) OnElements(elements ...string) *Policy {
 // and return the updated policy
 func (spb *stylePolicyBuilder) OnElementsMatching(regex *regexp.Regexp) *Policy {
 
 	for _, attr := range spb.propertyNames {
 
 		if _, ok := spb.p.elsMatchingAndStyles[regex]; !ok {
 			spb.p.elsMatchingAndStyles[regex] = make(map[string]stylePolicy)
-		}
-
-		sp := stylePolicy{}
-		if spb.handler != nil {
-			sp.handler = spb.handler
-		} else if len(spb.enum) > 0 {
-			sp.enum = spb.enum
-		} else if spb.regexp != nil {
-			sp.regexp = spb.regexp
-		} else {
-			sp.handler = getDefaultHandler(attr)
-		}
-		spb.p.elsMatchingAndStyles[regex][attr] = sp
 		}
 
+		sp := stylePolicy{}
+		if spb.handler != nil {
+			sp.handler = spb.handler
+		} else if len(spb.enum) > 0 {
+			sp.enum = spb.enum
+		} else if spb.regexp != nil {
+			sp.regexp = spb.regexp
+		} else {
+			sp.handler = getDefaultHandler(attr)
+		}
+		spb.p.elsMatchingAndStyles[regex][attr] = sp
+	}
 
 	return spb.p
 }
 
@@ -558,6 +561,16 @@ func (p *Policy) RequireNoReferrerOnFullyQualifiedLinks(require bool) *Policy {
 	return p
 }
 
+// RequireCrossOriginAnonymous will result in all audio, img, link, script, and
+// video tags having a crossorigin="anonymous" added to them if one does not
+// already exist
+func (p *Policy) RequireCrossOriginAnonymous(require bool) *Policy {
+
+	p.requireCrossOriginAnonymous = require
+
+	return p
+}
+
 // AddTargetBlankToFullyQualifiedLinks will result in all a, area and link tags
 // that point to a non-local destination (i.e. starts with a protocol and has a
 // host) having a target="_blank" added to them if one does not already exist
vendor/github.com/microcosm-cc/bluemonday/sanitize.go (generated, vendored; 52 changed lines)

@@ -39,7 +39,7 @@ import (
 
 	"golang.org/x/net/html"
 
-	cssparser "github.com/chris-ramon/douceur/parser"
+	"github.com/aymerick/douceur/parser"
 )
 
 var (
@@ -286,7 +286,7 @@ func (p *Policy) sanitize(r io.Reader) *bytes.Buffer {
 
 		case html.StartTagToken:
 
-			mostRecentlyStartedToken = strings.ToLower(token.Data)
+			mostRecentlyStartedToken = normaliseElementName(token.Data)
 
 			aps, ok := p.elsAndAttrs[token.Data]
 			if !ok {
@@ -329,7 +329,7 @@ func (p *Policy) sanitize(r io.Reader) *bytes.Buffer {
 
 		case html.EndTagToken:
 
-			if mostRecentlyStartedToken == strings.ToLower(token.Data) {
+			if mostRecentlyStartedToken == normaliseElementName(token.Data) {
 				mostRecentlyStartedToken = ""
 			}
 
@@ -407,11 +407,11 @@ func (p *Policy) sanitize(r io.Reader) *bytes.Buffer {
 
 			if !skipElementContent {
 				switch mostRecentlyStartedToken {
-				case "script":
+				case `script`:
 					// not encouraged, but if a policy allows JavaScript we
 					// should not HTML escape it as that would break the output
 					buff.WriteString(token.Data)
-				case "style":
+				case `style`:
 					// not encouraged, but if a policy allows CSS styles we
 					// should not HTML escape it as that would break the output
 					buff.WriteString(token.Data)
@@ -721,6 +721,26 @@ func (p *Policy) sanitizeAttrs(
 		}
 	}
 
+	if p.requireCrossOriginAnonymous && len(cleanAttrs) > 0 {
+		switch elementName {
+		case "audio", "img", "link", "script", "video":
+			var crossOriginFound bool
+			for _, htmlAttr := range cleanAttrs {
+				if htmlAttr.Key == "crossorigin" {
+					crossOriginFound = true
+					htmlAttr.Val = "anonymous"
+				}
+			}
+
+			if !crossOriginFound {
+				crossOrigin := html.Attribute{}
+				crossOrigin.Key = "crossorigin"
+				crossOrigin.Val = "anonymous"
+				cleanAttrs = append(cleanAttrs, crossOrigin)
+			}
+		}
+	}
+
 	return cleanAttrs
 }
 
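The attribute pass added in the hunk above can be sketched as a standalone helper. This is a minimal sketch, not bluemonday's API: `attribute` is a hypothetical stand-in for `html.Attribute`, and the loop is written index-based so the rewrite of an existing `crossorigin` value takes effect on the slice element rather than a range copy.

```go
package main

import "fmt"

// attribute is a hypothetical stand-in for golang.org/x/net/html.Attribute.
type attribute struct {
	Key, Val string
}

// ensureCrossOriginAnonymous mirrors the logic of the vendored pass: on
// eligible elements that already carry attributes, force any existing
// crossorigin attribute to "anonymous", or append one when absent.
func ensureCrossOriginAnonymous(elementName string, attrs []attribute) []attribute {
	if len(attrs) == 0 {
		return attrs // mirrors the len(cleanAttrs) > 0 guard above
	}
	switch elementName {
	case "audio", "img", "link", "script", "video":
		for i := range attrs {
			if attrs[i].Key == "crossorigin" {
				attrs[i].Val = "anonymous"
				return attrs
			}
		}
		attrs = append(attrs, attribute{Key: "crossorigin", Val: "anonymous"})
	}
	return attrs
}

func main() {
	out := ensureCrossOriginAnonymous("img",
		[]attribute{{Key: "src", Val: "https://example.com/a.png"}})
	fmt.Println(out) // [{src https://example.com/a.png} {crossorigin anonymous}]
}
```

Note the `switch` deliberately leaves all other element names untouched, so only the five tag types named in the policy comment ever gain the attribute.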
@@ -744,7 +764,7 @@ func (p *Policy) sanitizeStyles(attr html.Attribute, elementName string) html.At
 	if len(attr.Val) > 0 && attr.Val[len(attr.Val)-1] != ';' {
 		attr.Val = attr.Val + ";"
 	}
-	decs, err := cssparser.ParseDeclarations(attr.Val)
+	decs, err := parser.ParseDeclarations(attr.Val)
 	if err != nil {
 		attr.Val = ""
 		return attr
@@ -944,3 +964,23 @@ func (p *Policy) matchRegex(elementName string) (map[string]attrPolicy, bool) {
 	}
 	return aps, matched
 }
+
+
+// normaliseElementName takes a HTML element like <script> which is user input
+// and returns a lower case version of it that is immune to UTF-8 to ASCII
+// conversion tricks (like the use of upper case cyrillic i scrİpt which a
+// strings.ToLower would convert to script). Instead this func will preserve
+// all non-ASCII as their escaped equivalent, i.e. \u0130 which reveals the
+// characters when lower cased
+func normaliseElementName(str string) string {
+	// that useful QuoteToASCII put quote marks at the start and end
+	// so those are trimmed off
+	return strings.TrimSuffix(
+		strings.TrimPrefix(
+			strings.ToLower(
+				strconv.QuoteToASCII(str),
+			),
+			`"`),
+		`"`,
+	)
+}
vendor/github.com/yuin/goldmark/README.md (generated, vendored; 90 changed lines)

@@ -1,7 +1,7 @@
 goldmark
 ==========================================
 
-[](http://godoc.org/github.com/yuin/goldmark)
+[](https://pkg.go.dev/github.com/yuin/goldmark)
 [](https://github.com/yuin/goldmark/actions?query=workflow:test)
 [](https://coveralls.io/github/yuin/goldmark)
 [](https://goreportcard.com/report/github.com/yuin/goldmark)
@@ -173,6 +173,7 @@ Parser and Renderer options
 	- This extension enables Table, Strikethrough, Linkify and TaskList.
 	- This extension does not filter tags defined in [6.11: Disallowed Raw HTML (extension)](https://github.github.com/gfm/#disallowed-raw-html-extension-).
 	  If you need to filter HTML tags, see [Security](#security).
+	- If you need to parse github emojis, you can use [goldmark-emoji](https://github.com/yuin/goldmark-emoji) extension.
 - `extension.DefinitionList`
 	- [PHP Markdown Extra: Definition lists](https://michelf.ca/projects/php-markdown/extra/#def-list)
 - `extension.Footnote`
@@ -279,13 +280,96 @@ markdown := goldmark.New(
|
|||||||
[]byte("https:"),
|
[]byte("https:"),
|
||||||
}),
|
}),
|
||||||
extension.WithLinkifyURLRegexp(
|
extension.WithLinkifyURLRegexp(
|
||||||
xurls.Strict(),
|
xurls.Strict,
|
||||||
),
|
),
|
||||||
),
|
),
|
||||||
),
|
),
|
||||||
)
|
)
|
||||||
```
|
```
|
||||||
|
|
||||||
|
### Footnotes extension
|
||||||
|
|
||||||
|
The Footnote extension implements [PHP Markdown Extra: Footnotes](https://michelf.ca/projects/php-markdown/extra/#footnotes).
|
||||||
|
|
||||||
|
This extension has some options:
|
||||||
|
|
||||||
|
| Functional option | Type | Description |
|
||||||
|
| ----------------- | ---- | ----------- |
|
||||||
|
| `extension.WithFootnoteIDPrefix` | `[]byte` | a prefix for the id attributes.|
|
||||||
|
| `extension.WithFootnoteIDPrefixFunction` | `func(gast.Node) []byte` | a function that determines the id attribute for given Node.|
|
||||||
|
| `extension.WithFootnoteLinkTitle` | `[]byte` | an optional title attribute for footnote links.|
|
||||||
|
| `extension.WithFootnoteBacklinkTitle` | `[]byte` | an optional title attribute for footnote backlinks. |
|
||||||
|
| `extension.WithFootnoteLinkClass` | `[]byte` | a class for footnote links. This defaults to `footnote-ref`. |
|
||||||
|
| `extension.WithFootnoteBacklinkClass` | `[]byte` | a class for footnote backlinks. This defaults to `footnote-backref`. |
|
||||||
|
| `extension.WithFootnoteBacklinkHTML` | `[]byte` | a class for footnote backlinks. This defaults to `↩︎`. |
|
||||||
|
|
||||||
|
Some options can have special substitutions. Occurances of “^^” in the string will be replaced by the corresponding footnote number in the HTML output. Occurances of “%%” will be replaced by a number for the reference (footnotes can have multiple references).
|
||||||
|
|
||||||
|
`extension.WithFootnoteIDPrefix` and `extension.WithFootnoteIDPrefixFunction` are useful if you have multiple Markdown documents displayed inside one HTML document to avoid footnote ids to clash each other.
|
||||||
|
|
||||||
|
`extension.WithFootnoteIDPrefix` sets fixed id prefix, so you may write codes like the following:
|
||||||
|
|
||||||
|
```go
|
||||||
|
for _, path := range files {
|
||||||
|
source := readAll(path)
|
||||||
|
prefix := getPrefix(path)
|
||||||
|
|
||||||
|
markdown := goldmark.New(
|
||||||
|
goldmark.WithExtensions(
|
||||||
|
NewFootnote(
|
||||||
|
WithFootnoteIDPrefix([]byte(path)),
|
||||||
|
),
|
||||||
|
),
|
||||||
|
)
|
||||||
|
var b bytes.Buffer
|
||||||
|
err := markdown.Convert(source, &b)
|
||||||
|
if err != nil {
|
||||||
|
t.Error(err.Error())
|
||||||
|
}
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
`extension.WithFootnoteIDPrefixFunction` determines an id prefix by calling given function, so you may write codes like the following:
|
||||||
|
|
||||||
|
```go
|
||||||
|
markdown := goldmark.New(
|
||||||
|
goldmark.WithExtensions(
|
||||||
|
NewFootnote(
|
||||||
|
WithFootnoteIDPrefixFunction(func(n gast.Node) []byte {
|
||||||
|
v, ok := n.OwnerDocument().Meta()["footnote-prefix"]
|
||||||
|
if ok {
|
||||||
|
return util.StringToReadOnlyBytes(v.(string))
|
||||||
|
}
|
||||||
|
return nil
|
||||||
|
}),
|
||||||
|
),
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
for _, path := range files {
|
||||||
|
source := readAll(path)
|
||||||
|
var b bytes.Buffer
|
||||||
|
|
||||||
|
doc := markdown.Parser().Parse(text.NewReader(source))
|
||||||
|
doc.Meta()["footnote-prefix"] = getPrefix(path)
|
||||||
|
err := markdown.Renderer().Render(&b, source, doc)
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
You can use [goldmark-meta](https://github.com/yuin/goldmark-meta) to define an id prefix in the markdown document:

```markdown
---
title: document title
slug: article1
footnote-prefix: article1
---

# My article
```
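With a prefix like `article1` configured, the renderer simply concatenates the prefix in front of the usual `fnref:`/`fn:` id fragments. A minimal sketch of that id composition (the `footnoteIDs` helper is illustrative, not goldmark API):

```go
package main

import (
	"fmt"
	"strconv"
)

// footnoteIDs shows how a configured prefix is joined with the
// footnote index to build the element ids used in the HTML output.
func footnoteIDs(prefix string, index int) (refID, noteID string) {
	is := strconv.Itoa(index)
	return prefix + "fnref:" + is, prefix + "fn:" + is
}

func main() {
	ref, note := footnoteIDs("article1", 1)
	fmt.Println(ref)  // article1fnref:1
	fmt.Println(note) // article1fn:1
}
```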
Security
--------------------
By default, goldmark does not render raw HTML or potentially-dangerous URLs.
@@ -336,6 +420,8 @@ Extensions
   extension for the goldmark Markdown parser.
 - [goldmark-highlighting](https://github.com/yuin/goldmark-highlighting): A syntax-highlighting extension
   for the goldmark markdown parser.
+- [goldmark-emoji](https://github.com/yuin/goldmark-emoji): An emoji
+  extension for the goldmark Markdown parser.
 - [goldmark-mathjax](https://github.com/litao91/goldmark-mathjax): Mathjax support for the goldmark markdown parser
 
 goldmark internal(for extension developers)
vendor/github.com/yuin/goldmark/ast/ast.go (generated, vendored; 28 changed lines)

@@ -45,11 +45,6 @@ type Attribute struct {
 	Value interface{}
 }
 
-var attrNameIDS = []byte("#")
-var attrNameID = []byte("id")
-var attrNameClassS = []byte(".")
-var attrNameClass = []byte("class")
-
 // A Node interface defines basic AST node functionalities.
 type Node interface {
 	// Type returns a type of this node.
@@ -116,6 +111,11 @@ type Node interface {
 	// tail of the children.
 	InsertAfter(self, v1, insertee Node)
 
+	// OwnerDocument returns this node's owner document.
+	// If this node is not a child of the Document node, OwnerDocument
+	// returns nil.
+	OwnerDocument() *Document
+
 	// Dump dumps an AST tree structure to stdout.
 	// This function completely aimed for debugging.
 	// level is a indent level. Implementer should indent informations with
@@ -169,7 +169,7 @@ type Node interface {
 	RemoveAttributes()
 }
 
-// A BaseNode struct implements the Node interface.
+// A BaseNode struct implements the Node interface partialliy.
 type BaseNode struct {
 	firstChild Node
 	lastChild  Node
@@ -358,6 +358,22 @@ func (n *BaseNode) InsertBefore(self, v1, insertee Node) {
 	}
 }
 
+// OwnerDocument implements Node.OwnerDocument
+func (n *BaseNode) OwnerDocument() *Document {
+	d := n.Parent()
+	for {
+		p := d.Parent()
+		if p == nil {
+			if v, ok := d.(*Document); ok {
+				return v
+			}
+			break
+		}
+		d = p
+	}
+	return nil
+}
+
 // Text implements Node.Text .
 func (n *BaseNode) Text(source []byte) []byte {
 	var buf bytes.Buffer
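The `OwnerDocument` method added above just climbs `Parent()` pointers to the root and type-asserts it. The same walk, sketched on a toy node type (stand-in types, not the goldmark AST; the nil-parent guard is added for the sketch):

```go
package main

import "fmt"

// node is a minimal stand-in for goldmark's ast.Node parent chain.
type node struct {
	parent *node
	isDoc  bool
}

// ownerDocument mirrors BaseNode.OwnerDocument: climb to the root and
// return it only if the root is a document, otherwise nil.
func (n *node) ownerDocument() *node {
	d := n.parent
	for d != nil {
		p := d.parent
		if p == nil {
			if d.isDoc {
				return d
			}
			break
		}
		d = p
	}
	return nil
}

func main() {
	doc := &node{isDoc: true}
	para := &node{parent: doc}
	leaf := &node{parent: para}
	fmt.Println(leaf.ownerDocument() == doc) // true
	orphan := &node{}
	fmt.Println(orphan.ownerDocument() == nil) // true: not attached to a document
}
```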
vendor/github.com/yuin/goldmark/ast/block.go (generated, vendored; 23 changed lines)

@@ -7,7 +7,7 @@ import (
 	textm "github.com/yuin/goldmark/text"
 )
 
-// A BaseBlock struct implements the Node interface.
+// A BaseBlock struct implements the Node interface partialliy.
 type BaseBlock struct {
 	BaseNode
 	blankPreviousLines bool
@@ -50,6 +50,8 @@ func (b *BaseBlock) SetLines(v *textm.Segments) {
 // A Document struct is a root node of Markdown text.
 type Document struct {
 	BaseBlock
+
+	meta map[string]interface{}
 }
 
 // KindDocument is a NodeKind of the Document node.
@@ -70,10 +72,29 @@ func (n *Document) Kind() NodeKind {
 	return KindDocument
 }
 
+// OwnerDocument implements Node.OwnerDocument
+func (n *Document) OwnerDocument() *Document {
+	return n
+}
+
+// Meta returns metadata of this document.
+func (n *Document) Meta() map[string]interface{} {
+	if n.meta == nil {
+		n.meta = map[string]interface{}{}
+	}
+	return n.meta
+}
+
+// SetMeta sets given metadata to this document.
+func (n *Document) SetMeta(meta map[string]interface{}) {
+	n.meta = meta
+}
+
 // NewDocument returns a new Document node.
 func NewDocument() *Document {
 	return &Document{
 		BaseBlock: BaseBlock{},
+		meta:      nil,
 	}
 }
vendor/github.com/yuin/goldmark/ast/inline.go (generated, vendored; 2 changed lines)

@@ -8,7 +8,7 @@ import (
 	"github.com/yuin/goldmark/util"
 )
 
-// A BaseInline struct implements the Node interface.
+// A BaseInline struct implements the Node interface partialliy.
 type BaseInline struct {
 	BaseNode
 }
vendor/github.com/yuin/goldmark/extension/ast/footnote.go (generated, vendored; 35 changed lines)

@@ -2,6 +2,7 @@ package ast
 
 import (
 	"fmt"
+
 	gast "github.com/yuin/goldmark/ast"
 )
 
@@ -9,13 +10,15 @@ import (
 // (PHP Markdown Extra) text.
 type FootnoteLink struct {
 	gast.BaseInline
 	Index    int
+	RefCount int
 }
 
 // Dump implements Node.Dump.
 func (n *FootnoteLink) Dump(source []byte, level int) {
 	m := map[string]string{}
 	m["Index"] = fmt.Sprintf("%v", n.Index)
+	m["RefCount"] = fmt.Sprintf("%v", n.RefCount)
 	gast.DumpHelper(n, source, level, m, nil)
 }
 
@@ -30,36 +33,40 @@ func (n *FootnoteLink) Kind() gast.NodeKind {
 // NewFootnoteLink returns a new FootnoteLink node.
 func NewFootnoteLink(index int) *FootnoteLink {
 	return &FootnoteLink{
 		Index:    index,
+		RefCount: 0,
 	}
 }
 
-// A FootnoteBackLink struct represents a link to a footnote of Markdown
+// A FootnoteBacklink struct represents a link to a footnote of Markdown
 // (PHP Markdown Extra) text.
-type FootnoteBackLink struct {
+type FootnoteBacklink struct {
 	gast.BaseInline
 	Index    int
+	RefCount int
 }
 
 // Dump implements Node.Dump.
-func (n *FootnoteBackLink) Dump(source []byte, level int) {
+func (n *FootnoteBacklink) Dump(source []byte, level int) {
 	m := map[string]string{}
 	m["Index"] = fmt.Sprintf("%v", n.Index)
+	m["RefCount"] = fmt.Sprintf("%v", n.RefCount)
 	gast.DumpHelper(n, source, level, m, nil)
 }
 
-// KindFootnoteBackLink is a NodeKind of the FootnoteBackLink node.
-var KindFootnoteBackLink = gast.NewNodeKind("FootnoteBackLink")
+// KindFootnoteBacklink is a NodeKind of the FootnoteBacklink node.
+var KindFootnoteBacklink = gast.NewNodeKind("FootnoteBacklink")
 
 // Kind implements Node.Kind.
-func (n *FootnoteBackLink) Kind() gast.NodeKind {
-	return KindFootnoteBackLink
+func (n *FootnoteBacklink) Kind() gast.NodeKind {
+	return KindFootnoteBacklink
 }
 
-// NewFootnoteBackLink returns a new FootnoteBackLink node.
-func NewFootnoteBackLink(index int) *FootnoteBackLink {
-	return &FootnoteBackLink{
+// NewFootnoteBacklink returns a new FootnoteBacklink node.
+func NewFootnoteBacklink(index int) *FootnoteBacklink {
+	return &FootnoteBacklink{
 		Index:    index,
+		RefCount: 0,
 	}
 }
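The `RefCount` field added above is populated by the AST transformer in the next file with a two-pass count over all links that share a footnote index. The counting logic in isolation (a sketch with plain ints instead of AST nodes):

```go
package main

import "fmt"

// refCounts mirrors the transformer's two passes: first tally how many
// links point at each footnote index, then give every link that total.
// Negative indexes (unresolved links) are skipped in the tally.
func refCounts(indexes []int) []int {
	counter := map[int]int{}
	for _, idx := range indexes {
		if idx >= 0 {
			counter[idx]++
		}
	}
	out := make([]int, len(indexes))
	for i, idx := range indexes {
		out[i] = counter[idx]
	}
	return out
}

func main() {
	// Footnote 1 is referenced twice, footnote 2 once.
	fmt.Println(refCounts([]int{1, 2, 1})) // [2 1 2]
}
```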
vendor/github.com/yuin/goldmark/extension/footnote.go (generated, vendored; 382 changed lines)

@@ -2,6 +2,8 @@ package extension
 
 import (
 	"bytes"
+	"strconv"
+
 	"github.com/yuin/goldmark"
 	gast "github.com/yuin/goldmark/ast"
 	"github.com/yuin/goldmark/extension/ast"
@@ -10,10 +12,10 @@ import (
 	"github.com/yuin/goldmark/renderer/html"
 	"github.com/yuin/goldmark/text"
 	"github.com/yuin/goldmark/util"
-	"strconv"
 )
 
 var footnoteListKey = parser.NewContextKey()
+var footnoteLinkListKey = parser.NewContextKey()
 
 type footnoteBlockParser struct {
 }
@@ -164,7 +166,20 @@ func (s *footnoteParser) Parse(parent gast.Node, block text.Reader, pc parser.Co
 		return nil
 	}
 
-	return ast.NewFootnoteLink(index)
+	fnlink := ast.NewFootnoteLink(index)
+	var fnlist []*ast.FootnoteLink
+	if tmp := pc.Get(footnoteLinkListKey); tmp != nil {
+		fnlist = tmp.([]*ast.FootnoteLink)
+	} else {
+		fnlist = []*ast.FootnoteLink{}
+		pc.Set(footnoteLinkListKey, fnlist)
+	}
+	pc.Set(footnoteLinkListKey, append(fnlist, fnlink))
+	if line[0] == '!' {
+		parent.AppendChild(parent, gast.NewTextSegment(text.NewSegment(segment.Start, segment.Start+1)))
+	}
+
+	return fnlink
 }
 
 type footnoteASTTransformer struct {
@@ -180,23 +195,46 @@ func NewFootnoteASTTransformer() parser.ASTTransformer {
 
 func (a *footnoteASTTransformer) Transform(node *gast.Document, reader text.Reader, pc parser.Context) {
 	var list *ast.FootnoteList
-	if tlist := pc.Get(footnoteListKey); tlist != nil {
-		list = tlist.(*ast.FootnoteList)
-	} else {
+	var fnlist []*ast.FootnoteLink
+	if tmp := pc.Get(footnoteListKey); tmp != nil {
+		list = tmp.(*ast.FootnoteList)
+	}
+	if tmp := pc.Get(footnoteLinkListKey); tmp != nil {
+		fnlist = tmp.([]*ast.FootnoteLink)
+	}
+
+	pc.Set(footnoteListKey, nil)
+	pc.Set(footnoteLinkListKey, nil)
+
+	if list == nil {
 		return
 	}
-	pc.Set(footnoteListKey, nil)
+
+	counter := map[int]int{}
+	if fnlist != nil {
+		for _, fnlink := range fnlist {
+			if fnlink.Index >= 0 {
+				counter[fnlink.Index]++
+			}
+		}
+		for _, fnlink := range fnlist {
+			fnlink.RefCount = counter[fnlink.Index]
+		}
+	}
 	for footnote := list.FirstChild(); footnote != nil; {
 		var container gast.Node = footnote
 		next := footnote.NextSibling()
 		if fc := container.LastChild(); fc != nil && gast.IsParagraph(fc) {
 			container = fc
 		}
-		index := footnote.(*ast.Footnote).Index
+		fn := footnote.(*ast.Footnote)
+		index := fn.Index
 		if index < 0 {
 			list.RemoveChild(list, footnote)
 		} else {
-			container.AppendChild(container, ast.NewFootnoteBackLink(index))
+			backLink := ast.NewFootnoteBacklink(index)
+			backLink.RefCount = counter[index]
+			container.AppendChild(container, backLink)
 		}
 		footnote = next
 	}
@@ -214,19 +252,250 @@ func (a *footnoteASTTransformer) Transform(node *gast.Document, reader text.Read
 	node.AppendChild(node, list)
 }
 
+// FootnoteConfig holds configuration values for the footnote extension.
+//
+// Link* and Backlink* configurations have some variables:
+// Occurrances of “^^” in the string will be replaced by the
+// corresponding footnote number in the HTML output.
+// Occurrances of “%%” will be replaced by a number for the
+// reference (footnotes can have multiple references).
+type FootnoteConfig struct {
+	html.Config
+
+	// IDPrefix is a prefix for the id attributes generated by footnotes.
+	IDPrefix []byte
+
+	// IDPrefix is a function that determines the id attribute for given Node.
+	IDPrefixFunction func(gast.Node) []byte
+
+	// LinkTitle is an optional title attribute for footnote links.
+	LinkTitle []byte
+
+	// BacklinkTitle is an optional title attribute for footnote backlinks.
+	BacklinkTitle []byte
+
+	// LinkClass is a class for footnote links.
+	LinkClass []byte
+
+	// BacklinkClass is a class for footnote backlinks.
+	BacklinkClass []byte
+
+	// BacklinkHTML is an HTML content for footnote backlinks.
+	BacklinkHTML []byte
+}
+
+// FootnoteOption interface is a functional option interface for the extension.
+type FootnoteOption interface {
+	renderer.Option
+	// SetFootnoteOption sets given option to the extension.
+	SetFootnoteOption(*FootnoteConfig)
+}
+
+// NewFootnoteConfig returns a new Config with defaults.
+func NewFootnoteConfig() FootnoteConfig {
+	return FootnoteConfig{
+		Config:        html.NewConfig(),
+		LinkTitle:     []byte(""),
+		BacklinkTitle: []byte(""),
+		LinkClass:     []byte("footnote-ref"),
+		BacklinkClass: []byte("footnote-backref"),
+		BacklinkHTML:  []byte("↩︎"),
+	}
+}
+
+// SetOption implements renderer.SetOptioner.
+func (c *FootnoteConfig) SetOption(name renderer.OptionName, value interface{}) {
+	switch name {
+	case optFootnoteIDPrefixFunction:
+		c.IDPrefixFunction = value.(func(gast.Node) []byte)
+	case optFootnoteIDPrefix:
+		c.IDPrefix = value.([]byte)
+	case optFootnoteLinkTitle:
+		c.LinkTitle = value.([]byte)
+	case optFootnoteBacklinkTitle:
+		c.BacklinkTitle = value.([]byte)
+	case optFootnoteLinkClass:
+		c.LinkClass = value.([]byte)
+	case optFootnoteBacklinkClass:
+		c.BacklinkClass = value.([]byte)
+	case optFootnoteBacklinkHTML:
+		c.BacklinkHTML = value.([]byte)
+	default:
+		c.Config.SetOption(name, value)
+	}
+}
+
+type withFootnoteHTMLOptions struct {
+	value []html.Option
+}
+
+func (o *withFootnoteHTMLOptions) SetConfig(c *renderer.Config) {
+	if o.value != nil {
+		for _, v := range o.value {
+			v.(renderer.Option).SetConfig(c)
+		}
+	}
+}
+
+func (o *withFootnoteHTMLOptions) SetFootnoteOption(c *FootnoteConfig) {
+	if o.value != nil {
+		for _, v := range o.value {
+			v.SetHTMLOption(&c.Config)
+		}
+	}
+}
+
+// WithFootnoteHTMLOptions is functional option that wraps goldmark HTMLRenderer options.
+func WithFootnoteHTMLOptions(opts ...html.Option) FootnoteOption {
+	return &withFootnoteHTMLOptions{opts}
+}
+
+const optFootnoteIDPrefix renderer.OptionName = "FootnoteIDPrefix"
+
+type withFootnoteIDPrefix struct {
+	value []byte
+}
+
+func (o *withFootnoteIDPrefix) SetConfig(c *renderer.Config) {
+	c.Options[optFootnoteIDPrefix] = o.value
+}
+
+func (o *withFootnoteIDPrefix) SetFootnoteOption(c *FootnoteConfig) {
+	c.IDPrefix = o.value
+}
+
+// WithFootnoteIDPrefix is a functional option that is a prefix for the id attributes generated by footnotes.
+func WithFootnoteIDPrefix(a []byte) FootnoteOption {
+	return &withFootnoteIDPrefix{a}
+}
+
+const optFootnoteIDPrefixFunction renderer.OptionName = "FootnoteIDPrefixFunction"
+
+type withFootnoteIDPrefixFunction struct {
+	value func(gast.Node) []byte
+}
+
+func (o *withFootnoteIDPrefixFunction) SetConfig(c *renderer.Config) {
+	c.Options[optFootnoteIDPrefixFunction] = o.value
+}
+
+func (o *withFootnoteIDPrefixFunction) SetFootnoteOption(c *FootnoteConfig) {
+	c.IDPrefixFunction = o.value
+}
+
+// WithFootnoteIDPrefixFunction is a functional option that is a prefix for the id attributes generated by footnotes.
+func WithFootnoteIDPrefixFunction(a func(gast.Node) []byte) FootnoteOption {
+	return &withFootnoteIDPrefixFunction{a}
+}
+
+const optFootnoteLinkTitle renderer.OptionName = "FootnoteLinkTitle"
+
+type withFootnoteLinkTitle struct {
+	value []byte
+}
+
+func (o *withFootnoteLinkTitle) SetConfig(c *renderer.Config) {
+	c.Options[optFootnoteLinkTitle] = o.value
+}
+
+func (o *withFootnoteLinkTitle) SetFootnoteOption(c *FootnoteConfig) {
+	c.LinkTitle = o.value
+}
+
+// WithFootnoteLinkTitle is a functional option that is an optional title attribute for footnote links.
+func WithFootnoteLinkTitle(a []byte) FootnoteOption {
+	return &withFootnoteLinkTitle{a}
+}
+
+const optFootnoteBacklinkTitle renderer.OptionName = "FootnoteBacklinkTitle"
+
+type withFootnoteBacklinkTitle struct {
+	value []byte
+}
+
+func (o *withFootnoteBacklinkTitle) SetConfig(c *renderer.Config) {
+	c.Options[optFootnoteBacklinkTitle] = o.value
+}
+
+func (o *withFootnoteBacklinkTitle) SetFootnoteOption(c *FootnoteConfig) {
+	c.BacklinkTitle = o.value
+}
+
+// WithFootnoteBacklinkTitle is a functional option that is an optional title attribute for footnote backlinks.
+func WithFootnoteBacklinkTitle(a []byte) FootnoteOption {
+	return &withFootnoteBacklinkTitle{a}
+}
+
+const optFootnoteLinkClass renderer.OptionName = "FootnoteLinkClass"
+
+type withFootnoteLinkClass struct {
+	value []byte
+}
+
+func (o *withFootnoteLinkClass) SetConfig(c *renderer.Config) {
+	c.Options[optFootnoteLinkClass] = o.value
+}
+
+func (o *withFootnoteLinkClass) SetFootnoteOption(c *FootnoteConfig) {
+	c.LinkClass = o.value
+}
+
+// WithFootnoteLinkClass is a functional option that is a class for footnote links.
+func WithFootnoteLinkClass(a []byte) FootnoteOption {
+	return &withFootnoteLinkClass{a}
+}
+
+const optFootnoteBacklinkClass renderer.OptionName = "FootnoteBacklinkClass"
+
+type withFootnoteBacklinkClass struct {
+	value []byte
+}
+
+func (o *withFootnoteBacklinkClass) SetConfig(c *renderer.Config) {
+	c.Options[optFootnoteBacklinkClass] = o.value
+}
+
+func (o *withFootnoteBacklinkClass) SetFootnoteOption(c *FootnoteConfig) {
+	c.BacklinkClass = o.value
+}
+
+// WithFootnoteBacklinkClass is a functional option that is a class for footnote backlinks.
+func WithFootnoteBacklinkClass(a []byte) FootnoteOption {
+	return &withFootnoteBacklinkClass{a}
+}
+
+const optFootnoteBacklinkHTML renderer.OptionName = "FootnoteBacklinkHTML"
+
+type withFootnoteBacklinkHTML struct {
+	value []byte
+}
+
+func (o *withFootnoteBacklinkHTML) SetConfig(c *renderer.Config) {
+	c.Options[optFootnoteBacklinkHTML] = o.value
+}
+
+func (o *withFootnoteBacklinkHTML) SetFootnoteOption(c *FootnoteConfig) {
+	c.BacklinkHTML = o.value
+}
+
+// WithFootnoteBacklinkHTML is an HTML content for footnote backlinks.
+func WithFootnoteBacklinkHTML(a []byte) FootnoteOption {
+	return &withFootnoteBacklinkHTML{a}
+}
+
 // FootnoteHTMLRenderer is a renderer.NodeRenderer implementation that
 // renders FootnoteLink nodes.
 type FootnoteHTMLRenderer struct {
-	html.Config
+	FootnoteConfig
 }
 
 // NewFootnoteHTMLRenderer returns a new FootnoteHTMLRenderer.
-func NewFootnoteHTMLRenderer(opts ...html.Option) renderer.NodeRenderer {
+func NewFootnoteHTMLRenderer(opts ...FootnoteOption) renderer.NodeRenderer {
 	r := &FootnoteHTMLRenderer{
-		Config: html.NewConfig(),
+		FootnoteConfig: NewFootnoteConfig(),
 	}
 	for _, opt := range opts {
-		opt.SetHTMLOption(&r.Config)
+		opt.SetFootnoteOption(&r.FootnoteConfig)
 	}
 	return r
 }
@@ -234,7 +503,7 @@ func NewFootnoteHTMLRenderer(opts ...html.Option) renderer.NodeRenderer {
 // RegisterFuncs implements renderer.NodeRenderer.RegisterFuncs.
 func (r *FootnoteHTMLRenderer) RegisterFuncs(reg renderer.NodeRendererFuncRegisterer) {
 	reg.Register(ast.KindFootnoteLink, r.renderFootnoteLink)
-	reg.Register(ast.KindFootnoteBackLink, r.renderFootnoteBackLink)
+	reg.Register(ast.KindFootnoteBacklink, r.renderFootnoteBacklink)
 	reg.Register(ast.KindFootnote, r.renderFootnote)
 	reg.Register(ast.KindFootnoteList, r.renderFootnoteList)
 }
@@ -243,25 +512,45 @@ func (r *FootnoteHTMLRenderer) renderFootnoteLink(w util.BufWriter, source []byt
 	if entering {
 		n := node.(*ast.FootnoteLink)
 		is := strconv.Itoa(n.Index)
-		_, _ = w.WriteString(`<sup id="fnref:`)
+		_, _ = w.WriteString(`<sup id="`)
+		_, _ = w.Write(r.idPrefix(node))
+		_, _ = w.WriteString(`fnref:`)
 		_, _ = w.WriteString(is)
-		_, _ = w.WriteString(`"><a href="#fn:`)
+		_, _ = w.WriteString(`"><a href="#`)
+		_, _ = w.Write(r.idPrefix(node))
+		_, _ = w.WriteString(`fn:`)
 		_, _ = w.WriteString(is)
-		_, _ = w.WriteString(`" class="footnote-ref" role="doc-noteref">`)
+		_, _ = w.WriteString(`" class="`)
+		_, _ = w.Write(applyFootnoteTemplate(r.FootnoteConfig.LinkClass,
+			n.Index, n.RefCount))
+		if len(r.FootnoteConfig.LinkTitle) > 0 {
+			_, _ = w.WriteString(`" title="`)
+			_, _ = w.Write(util.EscapeHTML(applyFootnoteTemplate(r.FootnoteConfig.LinkTitle, n.Index, n.RefCount)))
+		}
+		_, _ = w.WriteString(`" role="doc-noteref">`)
+
 		_, _ = w.WriteString(is)
 		_, _ = w.WriteString(`</a></sup>`)
 	}
 	return gast.WalkContinue, nil
 }
 
-func (r *FootnoteHTMLRenderer) renderFootnoteBackLink(w util.BufWriter, source []byte, node gast.Node, entering bool) (gast.WalkStatus, error) {
+func (r *FootnoteHTMLRenderer) renderFootnoteBacklink(w util.BufWriter, source []byte, node gast.Node, entering bool) (gast.WalkStatus, error) {
 	if entering {
-		n := node.(*ast.FootnoteBackLink)
+		n := node.(*ast.FootnoteBacklink)
 		is := strconv.Itoa(n.Index)
-		_, _ = w.WriteString(` <a href="#fnref:`)
+		_, _ = w.WriteString(` <a href="#`)
+		_, _ = w.Write(r.idPrefix(node))
+		_, _ = w.WriteString(`fnref:`)
 		_, _ = w.WriteString(is)
-		_, _ = w.WriteString(`" class="footnote-backref" role="doc-backlink">`)
-		_, _ = w.WriteString("↩︎")
+		_, _ = w.WriteString(`" class="`)
+		_, _ = w.Write(applyFootnoteTemplate(r.FootnoteConfig.BacklinkClass, n.Index, n.RefCount))
+		if len(r.FootnoteConfig.BacklinkTitle) > 0 {
+			_, _ = w.WriteString(`" title="`)
+			_, _ = w.Write(util.EscapeHTML(applyFootnoteTemplate(r.FootnoteConfig.BacklinkTitle, n.Index, n.RefCount)))
+		}
+		_, _ = w.WriteString(`" role="doc-backlink">`)
+		_, _ = w.Write(applyFootnoteTemplate(r.FootnoteConfig.BacklinkHTML, n.Index, n.RefCount))
 		_, _ = w.WriteString(`</a>`)
 	}
 	return gast.WalkContinue, nil
@@ -271,7 +560,9 @@ func (r *FootnoteHTMLRenderer) renderFootnote(w util.BufWriter, source []byte, n
 	n := node.(*ast.Footnote)
 	is := strconv.Itoa(n.Index)
 	if entering {
-		_, _ = w.WriteString(`<li id="fn:`)
+		_, _ = w.WriteString(`<li id="`)
+		_, _ = w.Write(r.idPrefix(node))
+		_, _ = w.WriteString(`fn:`)
 		_, _ = w.WriteString(is)
 		_, _ = w.WriteString(`" role="doc-endnote"`)
 		if node.Attributes() != nil {
@@ -312,11 +603,54 @@ func (r *FootnoteHTMLRenderer) renderFootnoteList(w util.BufWriter, source []byt
 	return gast.WalkContinue, nil
 }
 
+func (r *FootnoteHTMLRenderer) idPrefix(node gast.Node) []byte {
+	if r.FootnoteConfig.IDPrefix != nil {
+		return r.FootnoteConfig.IDPrefix
+	}
+	if r.FootnoteConfig.IDPrefixFunction != nil {
+		return r.FootnoteConfig.IDPrefixFunction(node)
+	}
+	return []byte("")
+}
+
+func applyFootnoteTemplate(b []byte, index, refCount int) []byte {
+	fast := true
+	for i, c := range b {
+		if i != 0 {
+			if b[i-1] == '^' && c == '^' {
+				fast = false
+				break
+			}
+			if b[i-1] == '%' && c == '%' {
+				fast = false
+				break
+			}
+		}
+	}
+	if fast {
+		return b
+	}
+	is := []byte(strconv.Itoa(index))
+	rs := []byte(strconv.Itoa(refCount))
+	ret := bytes.Replace(b, []byte("^^"), is, -1)
+	return bytes.Replace(ret, []byte("%%"), rs, -1)
+}
+
 type footnote struct {
+	options []FootnoteOption
 }
 
 // Footnote is an extension that allow you to use PHP Markdown Extra Footnotes.
-var Footnote = &footnote{}
+var Footnote = &footnote{
+	options: []FootnoteOption{},
+}
+
+// NewFootnote returns a new extension with given options.
+func NewFootnote(opts ...FootnoteOption) goldmark.Extender {
+	return &footnote{
+		options: opts,
+	}
+}
 
 func (e *footnote) Extend(m goldmark.Markdown) {
 	m.Parser().AddOptions(
@@ -331,6 +665,6 @@ func (e *footnote) Extend(m goldmark.Markdown) {
 		),
 	)
 	m.Renderer().AddOptions(renderer.WithNodeRenderers(
-		util.Prioritized(NewFootnoteHTMLRenderer(), 500),
+		util.Prioritized(NewFootnoteHTMLRenderer(e.options...), 500),
 	))
 }
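The `applyFootnoteTemplate` helper added above first scans for a `^^` or `%%` pair so the common placeholder-free case returns the input slice untouched, and only otherwise pays for `bytes.Replace`. The substitution itself behaves like this (a sketch of the same replace logic, without the fast path):

```go
package main

import (
	"bytes"
	"fmt"
	"strconv"
)

// applyTemplate substitutes ^^ with the footnote index and %% with the
// reference number, as the footnote renderer does for titles and classes.
func applyTemplate(b []byte, index, refCount int) []byte {
	is := []byte(strconv.Itoa(index))
	rs := []byte(strconv.Itoa(refCount))
	ret := bytes.Replace(b, []byte("^^"), is, -1)
	return bytes.Replace(ret, []byte("%%"), rs, -1)
}

func main() {
	out := applyTemplate([]byte("to footnote ^^ (ref %%)"), 3, 2)
	fmt.Println(string(out)) // to footnote 3 (ref 2)
}
```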
24
vendor/github.com/yuin/goldmark/extension/linkify.go
generated
vendored
24
vendor/github.com/yuin/goldmark/extension/linkify.go
generated
vendored
@@ -11,9 +11,9 @@ import (
|
|||||||
"github.com/yuin/goldmark/util"
|
"github.com/yuin/goldmark/util"
|
||||||
)
|
)
|
||||||
|
|
||||||
var wwwURLRegxp = regexp.MustCompile(`^www\.[-a-zA-Z0-9@:%._\+~#=]{2,256}\.[a-z]+(?:(?:/|[#?])[-a-zA-Z0-9@:%_\+.~#!?&//=\(\);,'">\^{}\[\]` + "`" + `]*)?`)
|
var wwwURLRegxp = regexp.MustCompile(`^www\.[-a-zA-Z0-9@:%._\+~#=]{1,256}\.[a-z]+(?:[/#?][-a-zA-Z0-9@:%_\+.~#!?&/=\(\);,'">\^{}\[\]` + "`" + `]*)?`)
|
||||||
|
|
||||||
var urlRegexp = regexp.MustCompile(`^(?:http|https|ftp):\/\/(?:www\.)?[-a-zA-Z0-9@:%._\+~#=]{2,256}\.[a-z]+(?:(?:/|[#?])[-a-zA-Z0-9@:%_+.~#$!?&//=\(\);,'">\^{}\[\]` + "`" + `]*)?`)
|
var urlRegexp = regexp.MustCompile(`^(?:http|https|ftp)://[-a-zA-Z0-9@:%._\+~#=]{1,256}\.[a-z]+(?::\d+)?(?:[/#?][-a-zA-Z0-9@:%_+.~#$!?&/=\(\);,'">\^{}\[\]` + "`" + `]*)?`)
|
||||||
|
|
||||||
// An LinkifyConfig struct is a data structure that holds configuration of the
|
// An LinkifyConfig struct is a data structure that holds configuration of the
|
||||||
// Linkify extension.
|
// Linkify extension.
|
||||||
@@ -24,10 +24,12 @@ type LinkifyConfig struct {
|
|||||||
EmailRegexp *regexp.Regexp
|
EmailRegexp *regexp.Regexp
|
||||||
}
|
}
|
||||||
|
|
||||||
const optLinkifyAllowedProtocols parser.OptionName = "LinkifyAllowedProtocols"
|
const (
|
||||||
const optLinkifyURLRegexp parser.OptionName = "LinkifyURLRegexp"
|
optLinkifyAllowedProtocols parser.OptionName = "LinkifyAllowedProtocols"
|
||||||
const optLinkifyWWWRegexp parser.OptionName = "LinkifyWWWRegexp"
|
optLinkifyURLRegexp parser.OptionName = "LinkifyURLRegexp"
|
||||||
const optLinkifyEmailRegexp parser.OptionName = "LinkifyEmailRegexp"
|
optLinkifyWWWRegexp parser.OptionName = "LinkifyWWWRegexp"
|
||||||
|
optLinkifyEmailRegexp parser.OptionName = "LinkifyEmailRegexp"
|
||||||
|
)
|
||||||
|
|
||||||
// SetOption implements SetOptioner.
|
// SetOption implements SetOptioner.
|
||||||
func (c *LinkifyConfig) SetOption(name parser.OptionName, value interface{}) {
|
func (c *LinkifyConfig) SetOption(name parser.OptionName, value interface{}) {
|
||||||
@@ -156,10 +158,12 @@ func (s *linkifyParser) Trigger() []byte {
|
|||||||
return []byte{' ', '*', '_', '~', '('}
|
return []byte{' ', '*', '_', '~', '('}
|
||||||
}
|
}
|
||||||
|
|
||||||
var protoHTTP = []byte("http:")
|
var (
|
||||||
var protoHTTPS = []byte("https:")
|
protoHTTP = []byte("http:")
|
||||||
var protoFTP = []byte("ftp:")
|
protoHTTPS = []byte("https:")
|
||||||
var domainWWW = []byte("www.")
|
protoFTP = []byte("ftp:")
|
||||||
|
domainWWW = []byte("www.")
|
||||||
|
)
|
||||||
|
|
||||||
func (s *linkifyParser) Parse(parent ast.Node, block text.Reader, pc parser.Context) ast.Node {
|
func (s *linkifyParser) Parse(parent ast.Node, block text.Reader, pc parser.Context) ast.Node {
|
||||||
if pc.IsInLinkLabel() {
|
if pc.IsInLinkLabel() {
|
||||||
|
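The `wwwURLRegxp` change above lowers the domain-character bound from `{2,256}` to `{1,256}` (and simplifies the path group). A reduced sketch of just the domain portion of the pattern (trailing path group omitted) shows why single-character hosts now linkify:

```go
package main

import (
	"fmt"
	"regexp"
)

// Domain portion of the linkify www pattern, before and after the change:
// the lower bound {2,256} -> {1,256} lets a single character precede the
// final dot. (Path group omitted for brevity.)
var (
	oldWWW = regexp.MustCompile(`^www\.[-a-zA-Z0-9@:%._\+~#=]{2,256}\.[a-z]+`)
	newWWW = regexp.MustCompile(`^www\.[-a-zA-Z0-9@:%._\+~#=]{1,256}\.[a-z]+`)
)

func main() {
	fmt.Println(oldWWW.MatchString("www.x.com")) // false: needs >= 2 chars before the final dot
	fmt.Println(newWWW.MatchString("www.x.com")) // true
}
```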
117 vendor/github.com/yuin/goldmark/extension/table.go generated vendored
@@ -15,6 +15,14 @@ import (
 	"github.com/yuin/goldmark/util"
 )
 
+var escapedPipeCellListKey = parser.NewContextKey()
+
+type escapedPipeCell struct {
+	Cell        *ast.TableCell
+	Pos         []int
+	Transformed bool
+}
+
 // TableCellAlignMethod indicates how are table cells aligned in HTML format.
 type TableCellAlignMethod int
 
@@ -148,7 +156,7 @@ func (b *tableParagraphTransformer) Transform(node *gast.Paragraph, reader text.
 		if alignments == nil {
 			continue
 		}
-		header := b.parseRow(lines.At(i-1), alignments, true, reader)
+		header := b.parseRow(lines.At(i-1), alignments, true, reader, pc)
 		if header == nil || len(alignments) != header.ChildCount() {
 			return
 		}
@@ -156,7 +164,7 @@ func (b *tableParagraphTransformer) Transform(node *gast.Paragraph, reader text.
 		table.Alignments = alignments
 		table.AppendChild(table, ast.NewTableHeader(header))
 		for j := i + 1; j < lines.Len(); j++ {
-			table.AppendChild(table, b.parseRow(lines.At(j), alignments, false, reader))
+			table.AppendChild(table, b.parseRow(lines.At(j), alignments, false, reader, pc))
 		}
 		node.Lines().SetSliced(0, i-1)
 		node.Parent().InsertAfter(node.Parent(), node, table)
@@ -170,7 +178,7 @@ func (b *tableParagraphTransformer) Transform(node *gast.Paragraph, reader text.
 	}
 }
 
-func (b *tableParagraphTransformer) parseRow(segment text.Segment, alignments []ast.Alignment, isHeader bool, reader text.Reader) *ast.TableRow {
+func (b *tableParagraphTransformer) parseRow(segment text.Segment, alignments []ast.Alignment, isHeader bool, reader text.Reader, pc parser.Context) *ast.TableRow {
 	source := reader.Source()
 	line := segment.Value(source)
 	pos := 0
@@ -194,18 +202,39 @@ func (b *tableParagraphTransformer) parseRow(segment text.Segment, alignments []
 		} else {
 			alignment = alignments[i]
 		}
-		closure := util.FindClosure(line[pos:], byte(0), '|', true, false)
-		if closure < 0 {
-			closure = len(line[pos:])
-		}
+		var escapedCell *escapedPipeCell
 		node := ast.NewTableCell()
-		seg := text.NewSegment(segment.Start+pos, segment.Start+pos+closure)
+		node.Alignment = alignment
+		hasBacktick := false
+		closure := pos
+		for ; closure < limit; closure++ {
+			if line[closure] == '`' {
+				hasBacktick = true
+			}
+			if line[closure] == '|' {
+				if closure == 0 || line[closure-1] != '\\' {
+					break
+				} else if hasBacktick {
+					if escapedCell == nil {
+						escapedCell = &escapedPipeCell{node, []int{}, false}
+						escapedList := pc.ComputeIfAbsent(escapedPipeCellListKey,
+							func() interface{} {
+								return []*escapedPipeCell{}
+							}).([]*escapedPipeCell)
+						escapedList = append(escapedList, escapedCell)
+						pc.Set(escapedPipeCellListKey, escapedList)
+					}
+					escapedCell.Pos = append(escapedCell.Pos, segment.Start+closure-1)
+				}
+			}
+		}
+		seg := text.NewSegment(segment.Start+pos, segment.Start+closure)
 		seg = seg.TrimLeftSpace(source)
 		seg = seg.TrimRightSpace(source)
 		node.Lines().Append(seg)
-		node.Alignment = alignment
 		row.AppendChild(row, node)
-		pos += closure + 1
+		pos = closure + 1
 	}
 	for ; i < len(alignments); i++ {
 		row.AppendChild(row, ast.NewTableCell())
@@ -243,6 +272,61 @@ func (b *tableParagraphTransformer) parseDelimiter(segment text.Segment, reader
 	return alignments
 }
 
+type tableASTTransformer struct {
+}
+
+var defaultTableASTTransformer = &tableASTTransformer{}
+
+// NewTableASTTransformer returns a parser.ASTTransformer for tables.
+func NewTableASTTransformer() parser.ASTTransformer {
+	return defaultTableASTTransformer
+}
+
+func (a *tableASTTransformer) Transform(node *gast.Document, reader text.Reader, pc parser.Context) {
+	lst := pc.Get(escapedPipeCellListKey)
+	if lst == nil {
+		return
+	}
+	pc.Set(escapedPipeCellListKey, nil)
+	for _, v := range lst.([]*escapedPipeCell) {
+		if v.Transformed {
+			continue
+		}
+		_ = gast.Walk(v.Cell, func(n gast.Node, entering bool) (gast.WalkStatus, error) {
+			if !entering || n.Kind() != gast.KindCodeSpan {
+				return gast.WalkContinue, nil
+			}
+
+			for c := n.FirstChild(); c != nil; {
+				next := c.NextSibling()
+				if c.Kind() != gast.KindText {
+					c = next
+					continue
+				}
+				parent := c.Parent()
+				ts := &c.(*gast.Text).Segment
+				n := c
+				for _, v := range lst.([]*escapedPipeCell) {
+					for _, pos := range v.Pos {
+						if ts.Start <= pos && pos < ts.Stop {
+							segment := n.(*gast.Text).Segment
+							n1 := gast.NewRawTextSegment(segment.WithStop(pos))
+							n2 := gast.NewRawTextSegment(segment.WithStart(pos + 1))
+							parent.InsertAfter(parent, n, n1)
+							parent.InsertAfter(parent, n1, n2)
+							parent.RemoveChild(parent, n)
+							n = n2
+							v.Transformed = true
+						}
+					}
+				}
+				c = next
+			}
+			return gast.WalkContinue, nil
+		})
+	}
+}
+
 // TableHTMLRenderer is a renderer.NodeRenderer implementation that
 // renders Table nodes.
 type TableHTMLRenderer struct {
@@ -419,7 +503,7 @@ func (r *TableHTMLRenderer) renderTableCell(w util.BufWriter, source []byte, nod
 			cob.AppendByte(';')
 		}
 		style := fmt.Sprintf("text-align:%s", n.Alignment.String())
-		cob.Append(util.StringToReadOnlyBytes(style))
+		cob.AppendString(style)
 		n.SetAttributeString("style", cob.Bytes())
 	}
 }
@@ -454,9 +538,14 @@ func NewTable(opts ...TableOption) goldmark.Extender {
 }
 
 func (e *table) Extend(m goldmark.Markdown) {
-	m.Parser().AddOptions(parser.WithParagraphTransformers(
-		util.Prioritized(NewTableParagraphTransformer(), 200),
-	))
+	m.Parser().AddOptions(
		parser.WithParagraphTransformers(
+			util.Prioritized(NewTableParagraphTransformer(), 200),
+		),
+		parser.WithASTTransformers(
+			util.Prioritized(defaultTableASTTransformer, 0),
+		),
+	)
 	m.Renderer().AddOptions(renderer.WithNodeRenderers(
 		util.Prioritized(NewTableHTMLRenderer(e.options...), 500),
 	))
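The new `parseRow` loop above scans cell by cell and only treats a `|` as a cell boundary when it is not preceded by a backslash, so `\|` can appear inside a cell. A minimal standalone sketch of that boundary rule (the `splitRow` name is illustrative, not goldmark's function; the real code tracks byte offsets into the source, not substrings):

```go
package main

import (
	"fmt"
	"strings"
)

// splitRow splits a table row on unescaped pipes, mirroring the
// cell-scanning loop in parseRow: a '|' preceded by '\\' stays in the cell.
func splitRow(line string) []string {
	var cells []string
	start := 0
	for i := 0; i < len(line); i++ {
		if line[i] == '|' && (i == 0 || line[i-1] != '\\') {
			cells = append(cells, strings.TrimSpace(line[start:i]))
			start = i + 1
		}
	}
	cells = append(cells, strings.TrimSpace(line[start:]))
	return cells
}

func main() {
	fmt.Println(splitRow(`a | b \| c | d`))
	// [a b \| c d]
}
```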
2 vendor/github.com/yuin/goldmark/go.mod generated vendored
@@ -1,3 +1,3 @@
 module github.com/yuin/goldmark
 
-go 1.13
+go 1.15
17 vendor/github.com/yuin/goldmark/parser/code_block.go generated vendored
@@ -49,6 +49,12 @@ func (b *codeBlockParser) Continue(node ast.Node, reader text.Reader, pc Context
 	}
 	reader.AdvanceAndSetPadding(pos, padding)
 	_, segment = reader.PeekLine()
+
+	// if code block line starts with a tab, keep a tab as it is.
+	if segment.Padding != 0 {
+		preserveLeadingTabInCodeBlock(&segment, reader)
+	}
+
 	node.Lines().Append(segment)
 	reader.Advance(segment.Len() - 1)
 	return Continue | NoChildren
@@ -77,3 +83,14 @@ func (b *codeBlockParser) CanInterruptParagraph() bool {
 func (b *codeBlockParser) CanAcceptIndentedLine() bool {
 	return true
 }
+
+func preserveLeadingTabInCodeBlock(segment *text.Segment, reader text.Reader) {
+	offsetWithPadding := reader.LineOffset()
+	sl, ss := reader.Position()
+	reader.SetPosition(sl, text.NewSegment(ss.Start-1, ss.Stop))
+	if offsetWithPadding == reader.LineOffset() {
+		segment.Padding = 0
+		segment.Start--
+	}
+	reader.SetPosition(sl, ss)
+}
4 vendor/github.com/yuin/goldmark/parser/fcode_block.go generated vendored
@@ -71,6 +71,10 @@ func (b *fencedCodeBlockParser) Open(parent ast.Node, reader text.Reader, pc Con
 func (b *fencedCodeBlockParser) Continue(node ast.Node, reader text.Reader, pc Context) State {
 	line, segment := reader.PeekLine()
 	fdata := pc.Get(fencedCodeBlockInfoKey).(*fenceData)
+	// if code block line starts with a tab, keep a tab as it is.
+	if segment.Padding != 0 {
+		preserveLeadingTabInCodeBlock(&segment, reader)
+	}
 	w, pos := util.IndentWidth(line, reader.LineOffset())
 	if w < 4 {
 		i := pos
8 vendor/github.com/yuin/goldmark/parser/link.go generated vendored
@@ -2,7 +2,6 @@ package parser
 
 import (
 	"fmt"
-	"regexp"
 	"strings"
 
 	"github.com/yuin/goldmark/ast"
@@ -113,8 +112,6 @@ func (s *linkParser) Trigger() []byte {
 	return []byte{'!', '[', ']'}
 }
 
-var linkDestinationRegexp = regexp.MustCompile(`\s*([^\s].+)`)
-var linkTitleRegexp = regexp.MustCompile(`\s+(\)|["'\(].+)`)
 var linkBottom = NewContextKey()
 
 func (s *linkParser) Parse(parent ast.Node, block text.Reader, pc Context) ast.Node {
@@ -293,20 +290,17 @@ func (s *linkParser) parseLink(parent ast.Node, last *linkLabelState, block text
 func parseLinkDestination(block text.Reader) ([]byte, bool) {
 	block.SkipSpaces()
 	line, _ := block.PeekLine()
-	buf := []byte{}
 	if block.Peek() == '<' {
 		i := 1
 		for i < len(line) {
 			c := line[i]
 			if c == '\\' && i < len(line)-1 && util.IsPunct(line[i+1]) {
-				buf = append(buf, '\\', line[i+1])
 				i += 2
 				continue
 			} else if c == '>' {
 				block.Advance(i + 1)
 				return line[1:i], true
 			}
-			buf = append(buf, c)
 			i++
 		}
 		return nil, false
@@ -316,7 +310,6 @@ func parseLinkDestination(block text.Reader) ([]byte, bool) {
 	for i < len(line) {
 		c := line[i]
 		if c == '\\' && i < len(line)-1 && util.IsPunct(line[i+1]) {
-			buf = append(buf, '\\', line[i+1])
 			i += 2
 			continue
 		} else if c == '(' {
@@ -329,7 +322,6 @@ func parseLinkDestination(block text.Reader) ([]byte, bool) {
 		} else if util.IsSpace(c) {
 			break
 		}
-		buf = append(buf, c)
 		i++
 	}
 	block.Advance(i)
12 vendor/github.com/yuin/goldmark/parser/parser.go generated vendored
@@ -138,6 +138,9 @@ type Context interface {
 	// Get returns a value associated with the given key.
 	Get(ContextKey) interface{}
 
+	// ComputeIfAbsent computes a value if a value associated with the given key is absent and returns the value.
+	ComputeIfAbsent(ContextKey, func() interface{}) interface{}
+
 	// Set sets the given value to the context.
 	Set(ContextKey, interface{})
 
@@ -252,6 +255,15 @@ func (p *parseContext) Get(key ContextKey) interface{} {
 	return p.store[key]
 }
 
+func (p *parseContext) ComputeIfAbsent(key ContextKey, f func() interface{}) interface{} {
+	v := p.store[key]
+	if v == nil {
+		v = f()
+		p.store[key] = v
+	}
+	return v
+}
+
 func (p *parseContext) Set(key ContextKey, value interface{}) {
 	p.store[key] = value
 }
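The `ComputeIfAbsent` method added to `parser.Context` above looks up a key and lazily builds the value only on the first access. A minimal sketch of the same pattern over a plain map (the real Context indexes a slice by `ContextKey`; this is illustrative only):

```go
package main

import "fmt"

// computeIfAbsent mirrors parseContext.ComputeIfAbsent: return the stored
// value for key, calling the factory only when no value is present yet.
func computeIfAbsent(store map[string]interface{}, key string, f func() interface{}) interface{} {
	v := store[key]
	if v == nil {
		v = f()
		store[key] = v
	}
	return v
}

func main() {
	store := map[string]interface{}{}
	calls := 0
	for i := 0; i < 3; i++ {
		computeIfAbsent(store, "list", func() interface{} {
			calls++
			return []int{}
		})
	}
	fmt.Println(calls) // the factory runs only once
}
```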
9 vendor/github.com/yuin/goldmark/parser/raw_html.go generated vendored
@@ -2,10 +2,11 @@ package parser
 
 import (
 	"bytes"
+	"regexp"
+
 	"github.com/yuin/goldmark/ast"
 	"github.com/yuin/goldmark/text"
 	"github.com/yuin/goldmark/util"
-	"regexp"
 )
 
 type rawHTMLParser struct {
@@ -67,8 +68,6 @@ func (s *rawHTMLParser) parseSingleLineRegexp(reg *regexp.Regexp, block text.Rea
 	return node
 }
 
-var dummyMatch = [][]byte{}
-
 func (s *rawHTMLParser) parseMultiLineRegexp(reg *regexp.Regexp, block text.Reader, pc Context) ast.Node {
 	sline, ssegment := block.Position()
 	if block.Match(reg) {
@@ -102,7 +101,3 @@ func (s *rawHTMLParser) parseMultiLineRegexp(reg *regexp.Regexp, block text.Read
 	}
 	return nil
 }
-
-func (s *rawHTMLParser) CloseBlock(parent ast.Node, pc Context) {
-	// nothing to do
-}
16 vendor/github.com/yuin/goldmark/util/util.go generated vendored
@@ -37,6 +37,12 @@ func (b *CopyOnWriteBuffer) Write(value []byte) {
 	b.buffer = append(b.buffer, value...)
 }
 
+// WriteString writes given string to the buffer.
+// WriteString allocate new buffer and clears it at the first time.
+func (b *CopyOnWriteBuffer) WriteString(value string) {
+	b.Write(StringToReadOnlyBytes(value))
+}
+
 // Append appends given bytes to the buffer.
 // Append copy buffer at the first time.
 func (b *CopyOnWriteBuffer) Append(value []byte) {
@@ -49,6 +55,12 @@ func (b *CopyOnWriteBuffer) Append(value []byte) {
 	b.buffer = append(b.buffer, value...)
 }
 
+// AppendString appends given string to the buffer.
+// AppendString copy buffer at the first time.
+func (b *CopyOnWriteBuffer) AppendString(value string) {
+	b.Append(StringToReadOnlyBytes(value))
+}
+
 // WriteByte writes the given byte to the buffer.
 // WriteByte allocate new buffer and clears it at the first time.
 func (b *CopyOnWriteBuffer) WriteByte(c byte) {
@@ -804,7 +816,7 @@ func IsPunct(c byte) bool {
 	return punctTable[c] == 1
 }
 
-// IsPunct returns true if the given rune is a punctuation, otherwise false.
+// IsPunctRune returns true if the given rune is a punctuation, otherwise false.
 func IsPunctRune(r rune) bool {
	return int32(r) <= 256 && IsPunct(byte(r)) || unicode.IsPunct(r)
 }
@@ -814,7 +826,7 @@ func IsSpace(c byte) bool {
 	return spaceTable[c] == 1
 }
 
-// IsSpace returns true if the given rune is a space, otherwise false.
+// IsSpaceRune returns true if the given rune is a space, otherwise false.
 func IsSpaceRune(r rune) bool {
 	return int32(r) <= 256 && IsSpace(byte(r)) || unicode.IsSpace(r)
 }
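The `AppendString` convenience added above lets callers append a string to a `CopyOnWriteBuffer`, which shares the initial bytes read-only and copies them on the first mutation. A minimal sketch of that copy-on-write behavior (the `cowBuffer` type is illustrative; goldmark's real buffer also avoids the `[]byte(value)` allocation via `StringToReadOnlyBytes`):

```go
package main

import "fmt"

// cowBuffer shares its initial bytes read-only and copies them on the
// first Append, so the original backing slice is never mutated.
type cowBuffer struct {
	buf    []byte
	copied bool
}

func (b *cowBuffer) Append(value []byte) {
	if !b.copied {
		// copy on first write
		b.buf = append(append([]byte{}, b.buf...), value...)
		b.copied = true
		return
	}
	b.buf = append(b.buf, value...)
}

// AppendString is the string-typed convenience, as in the diff above.
func (b *cowBuffer) AppendString(value string) {
	b.Append([]byte(value))
}

func main() {
	shared := []byte("text-align:")
	b := &cowBuffer{buf: shared}
	b.AppendString("left")
	fmt.Println(string(b.buf), "/", string(shared))
	// text-align:left / text-align:
}
```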
1 vendor/golang.org/x/net/context/go17.go generated vendored
@@ -2,6 +2,7 @@
 // Use of this source code is governed by a BSD-style
 // license that can be found in the LICENSE file.
 
+//go:build go1.7
 // +build go1.7
 
 package context
1 vendor/golang.org/x/net/context/go19.go generated vendored
@@ -2,6 +2,7 @@
 // Use of this source code is governed by a BSD-style
 // license that can be found in the LICENSE file.
 
+//go:build go1.9
 // +build go1.9
 
 package context
1 vendor/golang.org/x/net/context/pre_go17.go generated vendored
@@ -2,6 +2,7 @@
 // Use of this source code is governed by a BSD-style
 // license that can be found in the LICENSE file.
 
+//go:build !go1.7
 // +build !go1.7
 
 package context
1 vendor/golang.org/x/net/context/pre_go19.go generated vendored
@@ -2,6 +2,7 @@
 // Use of this source code is governed by a BSD-style
 // license that can be found in the LICENSE file.
 
+//go:build !go1.9
 // +build !go1.9
 
 package context
2 vendor/golang.org/x/net/html/const.go generated vendored
@@ -52,7 +52,7 @@ var isSpecialElementMap = map[string]bool{
 	"iframe":  true,
 	"img":     true,
 	"input":   true,
-	"keygen":  true,
+	"keygen":  true, // "keygen" has been removed from the spec, but are kept here for backwards compatibility.
 	"li":      true,
 	"link":    true,
 	"listing": true,
119 vendor/golang.org/x/net/html/foreign.go generated vendored
@@ -161,65 +161,62 @@ var mathMLAttributeAdjustments = map[string]string{
 }
 
 var svgAttributeAdjustments = map[string]string{
 	"attributename":       "attributeName",
 	"attributetype":       "attributeType",
 	"basefrequency":       "baseFrequency",
 	"baseprofile":         "baseProfile",
 	"calcmode":            "calcMode",
 	"clippathunits":       "clipPathUnits",
-	"contentscripttype":   "contentScriptType",
-	"contentstyletype":    "contentStyleType",
 	"diffuseconstant":     "diffuseConstant",
 	"edgemode":            "edgeMode",
-	"externalresourcesrequired": "externalResourcesRequired",
 	"filterunits":         "filterUnits",
 	"glyphref":            "glyphRef",
 	"gradienttransform":   "gradientTransform",
 	"gradientunits":       "gradientUnits",
 	"kernelmatrix":        "kernelMatrix",
 	"kernelunitlength":    "kernelUnitLength",
 	"keypoints":           "keyPoints",
 	"keysplines":          "keySplines",
 	"keytimes":            "keyTimes",
 	"lengthadjust":        "lengthAdjust",
 	"limitingconeangle":   "limitingConeAngle",
 	"markerheight":        "markerHeight",
 	"markerunits":         "markerUnits",
 	"markerwidth":         "markerWidth",
 	"maskcontentunits":    "maskContentUnits",
 	"maskunits":           "maskUnits",
 	"numoctaves":          "numOctaves",
 	"pathlength":          "pathLength",
 	"patterncontentunits": "patternContentUnits",
 	"patterntransform":    "patternTransform",
 	"patternunits":        "patternUnits",
 	"pointsatx":           "pointsAtX",
 	"pointsaty":           "pointsAtY",
 	"pointsatz":           "pointsAtZ",
 	"preservealpha":       "preserveAlpha",
 	"preserveaspectratio": "preserveAspectRatio",
 	"primitiveunits":      "primitiveUnits",
 	"refx":                "refX",
 	"refy":                "refY",
 	"repeatcount":         "repeatCount",
 	"repeatdur":           "repeatDur",
 	"requiredextensions":  "requiredExtensions",
 	"requiredfeatures":    "requiredFeatures",
 	"specularconstant":    "specularConstant",
 	"specularexponent":    "specularExponent",
 	"spreadmethod":        "spreadMethod",
 	"startoffset":         "startOffset",
 	"stddeviation":        "stdDeviation",
 	"stitchtiles":         "stitchTiles",
 	"surfacescale":        "surfaceScale",
 	"systemlanguage":      "systemLanguage",
 	"tablevalues":         "tableValues",
 	"targetx":             "targetX",
 	"targety":             "targetY",
 	"textlength":          "textLength",
 	"viewbox":             "viewBox",
 	"viewtarget":          "viewTarget",
 	"xchannelselector":    "xChannelSelector",
 	"ychannelselector":    "yChannelSelector",
 	"zoomandpan":          "zoomAndPan",
 }
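The HTML tokenizer lowercases all attribute names, so foreign (SVG) content needs a case-restoring table like `svgAttributeAdjustments` above. A minimal sketch of that lookup, with a two-entry map standing in for the full table:

```go
package main

import "fmt"

// svgAdjust restores the canonical camelCase spelling of SVG attribute
// names that the tokenizer has lowercased. (Two illustrative entries;
// the real table in foreign.go has dozens.)
var svgAdjust = map[string]string{
	"viewbox":             "viewBox",
	"preserveaspectratio": "preserveAspectRatio",
}

func adjust(name string) string {
	if fixed, ok := svgAdjust[name]; ok {
		return fixed
	}
	return name // names not in the table pass through unchanged
}

func main() {
	fmt.Println(adjust("viewbox"), adjust("width"))
	// viewBox width
}
```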
15 vendor/golang.org/x/net/html/parse.go generated vendored
@@ -728,7 +728,13 @@ func inHeadNoscriptIM(p *parser) bool {
 		return inBodyIM(p)
 	case a.Basefont, a.Bgsound, a.Link, a.Meta, a.Noframes, a.Style:
 		return inHeadIM(p)
-	case a.Head, a.Noscript:
+	case a.Head:
+		// Ignore the token.
+		return true
+	case a.Noscript:
+		// Don't let the tokenizer go into raw text mode even when a <noscript>
+		// tag is in "in head noscript" insertion mode.
+		p.tokenizer.NextIsNotRawText()
 		// Ignore the token.
 		return true
 	}
@@ -1790,6 +1796,13 @@ func inSelectIM(p *parser) bool {
 			return true
 		case a.Script, a.Template:
 			return inHeadIM(p)
+		case a.Iframe, a.Noembed, a.Noframes, a.Noscript, a.Plaintext, a.Style, a.Title, a.Xmp:
+			// Don't let the tokenizer go into raw text mode when there are raw tags
+			// to be ignored. These tags should be ignored from the tokenizer
+			// properly.
+			p.tokenizer.NextIsNotRawText()
+			// Ignore the token.
+			return true
 		}
 	case EndTagToken:
 		switch p.tok.DataAtom {
|||||||
2 vendor/golang.org/x/net/html/render.go (generated, vendored)

@@ -263,7 +263,7 @@ var voidElements = map[string]bool{
 	"hr":     true,
 	"img":    true,
 	"input":  true,
-	"keygen": true,
+	"keygen": true, // "keygen" has been removed from the spec, but are kept here for backwards compatibility.
 	"link":   true,
 	"meta":   true,
 	"param":  true,
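The voidElements map above tells the renderer which elements never take a closing tag. A minimal standalone sketch of how such a set is typically consulted (illustrative names and subset, not the x/net renderer itself):

```go
package main

import "fmt"

// voidElements is a small illustrative subset of the vendored map above:
// elements that never take a closing tag.
var voidElements = map[string]bool{
	"br":     true,
	"hr":     true,
	"img":    true,
	"keygen": true, // removed from the HTML spec, kept for backwards compatibility
}

// renderTag emits an empty element, self-closing it when the element is void.
func renderTag(name string) string {
	if voidElements[name] {
		return "<" + name + "/>"
	}
	return "<" + name + "></" + name + ">"
}

func main() {
	fmt.Println(renderTag("br"))  // → <br/>
	fmt.Println(renderTag("div")) // → <div></div>
}
```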
1 vendor/golang.org/x/net/idna/idna10.0.0.go (generated, vendored)

@@ -4,6 +4,7 @@
 // Use of this source code is governed by a BSD-style
 // license that can be found in the LICENSE file.
 
+//go:build go1.10
 // +build go1.10
 
 // Package idna implements IDNA2008 using the compatibility processing
1 vendor/golang.org/x/net/idna/idna9.0.0.go (generated, vendored)

@@ -4,6 +4,7 @@
 // Use of this source code is governed by a BSD-style
 // license that can be found in the LICENSE file.
 
+//go:build !go1.10
 // +build !go1.10
 
 // Package idna implements IDNA2008 using the compatibility processing
1 vendor/golang.org/x/net/idna/tables10.0.0.go (generated, vendored)

@@ -1,5 +1,6 @@
 // Code generated by running "go generate" in golang.org/x/text. DO NOT EDIT.
 
+//go:build go1.10 && !go1.13
 // +build go1.10,!go1.13
 
 package idna
1 vendor/golang.org/x/net/idna/tables11.0.0.go (generated, vendored)

@@ -1,5 +1,6 @@
 // Code generated by running "go generate" in golang.org/x/text. DO NOT EDIT.
 
+//go:build go1.13 && !go1.14
 // +build go1.13,!go1.14
 
 package idna
@@ -1,6 +1,7 @@
 // Code generated by running "go generate" in golang.org/x/text. DO NOT EDIT.
 
-// +build go1.14
+//go:build go1.14 && !go1.16
+// +build go1.14,!go1.16
 
 package idna
 
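The hunks above add the modern `//go:build` form alongside the legacy `// +build` line: Go 1.17+ reads the `//go:build` expression, while older toolchains only understand `// +build`, so vendored code keeps both during the transition (in the legacy comma form, a comma means AND). A minimal sketch of a file carrying both forms (hypothetical file, not part of the vendored package; the constraint `go1.10` is chosen so any modern toolchain satisfies it):

```go
//go:build go1.10
// +build go1.10

// Illustrative file showing the dual build-constraint convention the
// hunks above introduce. Both lines must express the same condition;
// gofmt keeps them in sync on Go 1.17+.
package main

import "fmt"

// legacyToModern sketches the translation the diff performs for one
// constraint: the comma form "go1.14,!go1.16" ANDs its terms.
func legacyToModern() string {
	return "go1.14 && !go1.16"
}

func main() {
	fmt.Println(legacyToModern())
}
```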
Some files were not shown because too many files have changed in this diff.