Compare commits


3 Commits

Author          SHA1        Message                                     Date
GiteaBot        e4a3785218  [skip ci] Updated translations via Crowdin  2020-10-14 21:45:21 +00:00
techknowlogick  76ac83402b  Clean up mysql service in drone (#13145)    2020-10-14 17:44:18 -04:00
GiteaBot        07c9f6dca4  [skip ci] Updated translations via Crowdin  2020-10-14 18:49:08 +00:00
86 changed files with 743 additions and 1615 deletions

View File

@@ -666,6 +666,7 @@ steps:
event:
exclude:
- pull_request
---
kind: pipeline
name: docker-linux-arm64-dry-run
@@ -695,9 +696,6 @@ steps:
tags: linux-arm64
build_args:
- GOPROXY=off
environment:
PLUGIN_MIRROR:
from_secret: plugin_mirror
when:
event:
- pull_request
@@ -742,13 +740,11 @@ steps:
from_secret: docker_password
username:
from_secret: docker_username
environment:
PLUGIN_MIRROR:
from_secret: plugin_mirror
when:
event:
exclude:
- pull_request
---
kind: pipeline
name: docker-manifest

View File

@@ -4,39 +4,7 @@ This changelog goes through all the changes that have been made in each release
without substantial changes to our git log; to see the highlights of what has
been added to each release, please refer to the [blog](https://blog.gitea.io).
## [1.13.0-rc2](https://github.com/go-gitea/gitea/releases/tag/v1.13.0-rc2) - 2020-11-10
* ENHANCEMENTS
* Return the full rejection message and errors in flash errors (#13221) (#13237)
* Remove PAM from auth dropdown when unavailable (#13276) (#13281)
* BUGFIXES
* Fix Italian language file parsing error (#13156)
* Show outdated comments in pull request (#13148) (#13162)
* Fix parsing of pre-release git version (#13169) (#13172)
* Fix diff skipping lines (#13154) (#13155)
* When handling errors in storageHandler check underlying error (#13178) (#13193)
* Fix size and clickable area on file table back link (#13205) (#13207)
* Add better error checking for inline html diff code (#13251)
* Fix initial commit page & binary munching problem (#13249) (#13258)
* Fix migrations from remote Gitea instances when configuration not set (#13229) (#13273)
* Store task errors following migrations and display them (#13246) (#13287)
* Fix bug isEnd detection on getIssues/getPullRequests (#13299) (#13301)
* When the git ref is unable to be found return broken pr (#13218) (#13303)
* Ensure topics added using the API are added to the repository (#13285) (#13302)
* Fix avatar autogeneration (#13233) (#13282)
* Add migrated pulls to pull request task queue (#13331) (#13334)
* Issue comment reactions should also check pull type on API (#13349) (#13350)
* Fix links to repositories in /user/setting/repos (#13360) (#13362)
* Remove obsolete change of email on profile page (#13341) (#13347)
* Fix scrolling to resolved comment anchors (#13343) (#13371)
* Storage configuration support `[storage]` (#13314) (#13379)
* When creating line diffs do not split within an html entity (#13357) (#13375) (#13425) (#13427)
* Fix reactions on code comments (#13390) (#13401)
* Add missing full names when DEFAULT_SHOW_FULL_NAME is enabled (#13424)
* Replies to outdated code comments should also be outdated (#13217) (#13433)
* Fix panic bug in handling multiple references in commit (#13486) (#13487)
* Prevent panic on git blame by limiting lines to 4096 bytes at most (#13470) (#13491)
## [1.13.0-rc1](https://github.com/go-gitea/gitea/releases/tag/v1.13.0-rc1) - 2020-10-14
## [1.13.0-RC1](https://github.com/go-gitea/gitea/releases/tag/v1.13.0-RC1) - 2020-10-14
* SECURITY
* Mitigate Security vulnerability in the git hook feature (#13058)

View File

@@ -30,8 +30,6 @@ All event pushes are POST requests. The methods currently supported are:
### Event information
**WARNING**: The `secret` field in the payload is deprecated as of Gitea 1.13.0 and will be removed in 1.14.0: https://github.com/go-gitea/gitea/issues/11755
The following is an example of event information that will be sent by Gitea to
a Payload URL:
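The documentation's payload example itself is not included in this hunk. As a rough illustration only, a handler behind such a Payload URL might decode a push event along these lines; the field names below are assumptions based on the common Gitea/Gogs payload shape, not taken from this diff.

package main

import (
	"encoding/json"
	"fmt"
)

// pushPayload models a few assumed fields of a push-event payload for
// illustration; consult the Gitea webhook documentation for the
// authoritative structure.
type pushPayload struct {
	Secret string `json:"secret"` // deprecated as of Gitea 1.13.0, to be removed in 1.14.0
	Ref    string `json:"ref"`
	Before string `json:"before"`
	After  string `json:"after"`
}

func main() {
	// A hypothetical request body, decoded the way a Payload URL handler would.
	body := []byte(`{"secret":"","ref":"refs/heads/master","before":"07c9f6dca4","after":"e4a3785218"}`)
	var p pushPayload
	if err := json.Unmarshal(body, &p); err != nil {
		fmt.Println("decode:", err)
		return
	}
	fmt.Printf("push to %s: %s -> %s\n", p.Ref, p.Before, p.After)
}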

go.mod (4 changed lines)
View File

@@ -117,10 +117,10 @@ require (
gopkg.in/ini.v1 v1.61.0
gopkg.in/ldap.v3 v3.0.2
gopkg.in/yaml.v2 v2.3.0
mvdan.cc/xurls/v2 v2.2.0
mvdan.cc/xurls/v2 v2.1.0
strk.kbt.io/projects/go/libravatar v0.0.0-20191008002943-06d1c002b251
xorm.io/builder v0.3.7
xorm.io/xorm v1.0.5
)
replace github.com/hashicorp/go-version => github.com/6543/go-version v1.2.4
replace github.com/hashicorp/go-version => github.com/6543/go-version v1.2.3

go.sum (8 changed lines)
View File

@@ -48,8 +48,8 @@ gitea.com/macaron/toolbox v0.0.0-20190822013122-05ff0fc766b7 h1:N9QFoeNsUXLhl14m
gitea.com/macaron/toolbox v0.0.0-20190822013122-05ff0fc766b7/go.mod h1:kgsbFPPS4P+acDYDOPDa3N4IWWOuDJt5/INKRUz7aks=
gitea.com/xorm/sqlfiddle v0.0.0-20180821085327-62ce714f951a h1:lSA0F4e9A2NcQSqGqTOXqu2aRi/XEQxDCBwM8yJtE6s=
gitea.com/xorm/sqlfiddle v0.0.0-20180821085327-62ce714f951a/go.mod h1:EXuID2Zs0pAQhH8yz+DNjUbjppKQzKFAn28TMYPB6IU=
github.com/6543/go-version v1.2.4 h1:MPsSnqNrM0HwA9tnmWNnsMdQMg4/u4fflARjwomoof4=
github.com/6543/go-version v1.2.4/go.mod h1:oqFAHCwtLVUTLdhQmVZWYvaHXTdsbB4SY85at64SQEo=
github.com/6543/go-version v1.2.3 h1:uF30BawMhoQLzqBeCwhFcWM6HVxlzMHe/zXbzJeKP+o=
github.com/6543/go-version v1.2.3/go.mod h1:fcfWh4zkneEgGXe8JJptiGwp8l6JgJJgS7oTw6P83So=
github.com/BurntSushi/toml v0.3.1 h1:WXkYYl6Yr3qBf1K79EBnL4mak0OimBfB0XUf9Vl28OQ=
github.com/BurntSushi/toml v0.3.1/go.mod h1:xHWCNGjB5oqiDr8zfno3MHue2Ht5sIBksp03qcyfWMU=
github.com/BurntSushi/xgb v0.0.0-20160522181843-27f122750802/go.mod h1:IVnqGOEym/WlBOVXweHU+Q+/VP0lqqI8lqeDx9IjBqo=
@@ -768,7 +768,6 @@ github.com/rogpeppe/fastuuid v0.0.0-20150106093220-6724a57986af/go.mod h1:XWv6So
github.com/rogpeppe/go-internal v1.1.0/go.mod h1:M8bDsm7K2OlrFYOpmOWEs/qY81heoFRclV5y23lUDJ4=
github.com/rogpeppe/go-internal v1.2.2/go.mod h1:M8bDsm7K2OlrFYOpmOWEs/qY81heoFRclV5y23lUDJ4=
github.com/rogpeppe/go-internal v1.3.0/go.mod h1:M8bDsm7K2OlrFYOpmOWEs/qY81heoFRclV5y23lUDJ4=
github.com/rogpeppe/go-internal v1.5.2/go.mod h1:xXDCJY+GAPziupqXw64V24skbSoqbTEfhy4qGm1nDQc=
github.com/rs/xid v1.2.1 h1:mhH9Nq+C1fY2l1XIpgxIiUOfNpRBYH1kKcr+qfKgjRc=
github.com/rs/xid v1.2.1/go.mod h1:+uKXf+4Djp6Md1KODXJxgGQPKngRmWyn10oCKFzNHOQ=
github.com/rs/zerolog v1.13.0/go.mod h1:YbFCdg8HfsridGWAh22vktObvhZbQsZXe4/zB0OKkWU=
@@ -1197,7 +1196,8 @@ honnef.co/go/tools v0.0.0-20190106161140-3f1c8253044a/go.mod h1:rf3lG4BRIbNafJWh
honnef.co/go/tools v0.0.0-20190418001031-e561f6794a2a/go.mod h1:rf3lG4BRIbNafJWhAfAdb/ePZxsR/4RtNHQocxwk9r4=
honnef.co/go/tools v0.0.0-20190523083050-ea95bdfd59fc/go.mod h1:rf3lG4BRIbNafJWhAfAdb/ePZxsR/4RtNHQocxwk9r4=
honnef.co/go/tools v0.0.1-2019.2.3/go.mod h1:a3bituU0lyd329TUQxRnasdCoJDkEUEAqEt0JzvZhAg=
mvdan.cc/xurls/v2 v2.2.0/go.mod h1:EV1RMtya9D6G5DMYPGD8zTQzaHet6Jh8gFlRgGRJeO8=
mvdan.cc/xurls/v2 v2.1.0 h1:KaMb5GLhlcSX+e+qhbRJODnUUBvlw01jt4yrjFIHAuA=
mvdan.cc/xurls/v2 v2.1.0/go.mod h1:5GrSd9rOnKOpZaji1OZLYL/yeAAtGDlo/cFe+8K5n8E=
rsc.io/binaryregexp v0.2.0/go.mod h1:qTv7/COck+e2FymRvadv62gMdZztPaShugOCi3I+8D8=
strk.kbt.io/projects/go/libravatar v0.0.0-20191008002943-06d1c002b251 h1:mUcz5b3FJbP5Cvdq7Khzn6J9OCUQJaBwgBkCR+MOwSs=
strk.kbt.io/projects/go/libravatar v0.0.0-20191008002943-06d1c002b251/go.mod h1:FJGmPh3vz9jSos1L/F91iAgnC/aejc0wIIrF2ZwJxdY=

View File

@@ -5,17 +5,14 @@
package integrations
import (
"context"
"encoding/json"
"fmt"
"io/ioutil"
"net/http"
"testing"
"time"
"code.gitea.io/gitea/models"
"code.gitea.io/gitea/modules/auth"
"code.gitea.io/gitea/modules/queue"
api "code.gitea.io/gitea/modules/structs"
"github.com/stretchr/testify/assert"
@@ -228,29 +225,11 @@ func doAPIMergePullRequest(ctx APITestContext, owner, repo string, index int64)
Do: string(models.MergeStyleMerge),
})
resp := ctx.Session.MakeRequest(t, req, NoExpectedStatus)
if resp.Code == http.StatusMethodNotAllowed {
err := api.APIError{}
DecodeJSON(t, resp, &err)
assert.EqualValues(t, "Please try again later", err.Message)
queue.GetManager().FlushAll(context.Background(), 5*time.Second)
req = NewRequestWithJSON(t, http.MethodPost, urlStr, &auth.MergePullRequestForm{
MergeMessageField: "doAPIMergePullRequest Merge",
Do: string(models.MergeStyleMerge),
})
resp = ctx.Session.MakeRequest(t, req, NoExpectedStatus)
}
expected := ctx.ExpectedCode
if expected == 0 {
expected = 200
}
if !assert.EqualValues(t, expected, resp.Code,
"Request: %s %s", req.Method, req.URL.String()) {
logUnexpectedResponse(t, resp)
if ctx.ExpectedCode != 0 {
ctx.Session.MakeRequest(t, req, ctx.ExpectedCode)
return
}
ctx.Session.MakeRequest(t, req, 200)
}
}

View File

@@ -26,7 +26,7 @@ func TestUserHeatmap(t *testing.T) {
var heatmap []*models.UserHeatmapData
DecodeJSON(t, resp, &heatmap)
var dummyheatmap []*models.UserHeatmapData
dummyheatmap = append(dummyheatmap, &models.UserHeatmapData{Timestamp: 1603152000, Contributions: 1})
dummyheatmap = append(dummyheatmap, &models.UserHeatmapData{Timestamp: 1571616000, Contributions: 1})
assert.Equal(t, dummyheatmap, heatmap)
}

View File

@@ -141,7 +141,7 @@ func TestLDAPUserSignin(t *testing.T) {
assert.Equal(t, u.UserName, htmlDoc.GetInputValueByName("name"))
assert.Equal(t, u.FullName, htmlDoc.GetInputValueByName("full_name"))
assert.Equal(t, u.Email, htmlDoc.Find(`label[for="email"]`).Siblings().First().Text())
assert.Equal(t, u.Email, htmlDoc.GetInputValueByName("email"))
}
func TestLDAPUserSync(t *testing.T) {

View File

@@ -37,13 +37,6 @@ func (doc *HTMLDoc) GetInputValueByName(name string) string {
return text
}
// Find gets the descendants of each element in the current set of
// matched elements, filtered by a selector. It returns a new Selection
// object containing these matched elements.
func (doc *HTMLDoc) Find(selector string) *goquery.Selection {
return doc.doc.Find(selector)
}
// GetCSRF for get CSRC token value from input
func (doc *HTMLDoc) GetCSRF() string {
return doc.GetInputValueByName("_csrf")

View File

@@ -11,6 +11,7 @@ import (
"encoding/json"
"fmt"
"io"
"log"
"net/http"
"net/http/cookiejar"
"net/http/httptest"
@@ -26,10 +27,8 @@ import (
"code.gitea.io/gitea/models"
"code.gitea.io/gitea/modules/base"
"code.gitea.io/gitea/modules/graceful"
"code.gitea.io/gitea/modules/log"
"code.gitea.io/gitea/modules/queue"
"code.gitea.io/gitea/modules/setting"
"code.gitea.io/gitea/modules/storage"
"code.gitea.io/gitea/modules/util"
"code.gitea.io/gitea/routers"
"code.gitea.io/gitea/routers/routes"
@@ -60,8 +59,6 @@ func NewNilResponseRecorder() *NilResponseRecorder {
}
func TestMain(m *testing.M) {
defer log.Close()
managerCtx, cancel := context.WithCancel(context.Background())
graceful.InitManager(managerCtx)
defer cancel()
@@ -145,10 +142,6 @@ func initIntegrationTest() {
util.RemoveAll(models.LocalCopyPath())
setting.CheckLFSVersion()
setting.InitDBConfig()
if err := storage.Init(); err != nil {
fmt.Printf("Init storage failed: %v", err)
os.Exit(1)
}
switch {
case setting.Database.UseMySQL:
@@ -156,27 +149,27 @@ func initIntegrationTest() {
setting.Database.User, setting.Database.Passwd, setting.Database.Host))
defer db.Close()
if err != nil {
log.Fatal("sql.Open: %v", err)
log.Fatalf("sql.Open: %v", err)
}
if _, err = db.Exec(fmt.Sprintf("CREATE DATABASE IF NOT EXISTS %s", setting.Database.Name)); err != nil {
log.Fatal("db.Exec: %v", err)
log.Fatalf("db.Exec: %v", err)
}
case setting.Database.UsePostgreSQL:
db, err := sql.Open("postgres", fmt.Sprintf("postgres://%s:%s@%s/?sslmode=%s",
setting.Database.User, setting.Database.Passwd, setting.Database.Host, setting.Database.SSLMode))
defer db.Close()
if err != nil {
log.Fatal("sql.Open: %v", err)
log.Fatalf("sql.Open: %v", err)
}
dbrows, err := db.Query(fmt.Sprintf("SELECT 1 FROM pg_database WHERE datname = '%s'", setting.Database.Name))
if err != nil {
log.Fatal("db.Query: %v", err)
log.Fatalf("db.Query: %v", err)
}
defer dbrows.Close()
if !dbrows.Next() {
if _, err = db.Exec(fmt.Sprintf("CREATE DATABASE %s", setting.Database.Name)); err != nil {
log.Fatal("db.Exec: CREATE DATABASE: %v", err)
log.Fatalf("db.Exec: CREATE DATABASE: %v", err)
}
}
// Check if we need to setup a specific schema
@@ -190,18 +183,18 @@ func initIntegrationTest() {
// This is a different db object; requires a different Close()
defer db.Close()
if err != nil {
log.Fatal("sql.Open: %v", err)
log.Fatalf("sql.Open: %v", err)
}
schrows, err := db.Query(fmt.Sprintf("SELECT 1 FROM information_schema.schemata WHERE schema_name = '%s'", setting.Database.Schema))
if err != nil {
log.Fatal("db.Query: %v", err)
log.Fatalf("db.Query: %v", err)
}
defer schrows.Close()
if !schrows.Next() {
// Create and setup a DB schema
if _, err = db.Exec(fmt.Sprintf("CREATE SCHEMA %s", setting.Database.Schema)); err != nil {
log.Fatal("db.Exec: CREATE SCHEMA: %v", err)
log.Fatalf("db.Exec: CREATE SCHEMA: %v", err)
}
}
@@ -210,10 +203,10 @@ func initIntegrationTest() {
db, err := sql.Open("mssql", fmt.Sprintf("server=%s; port=%s; database=%s; user id=%s; password=%s;",
host, port, "master", setting.Database.User, setting.Database.Passwd))
if err != nil {
log.Fatal("sql.Open: %v", err)
log.Fatalf("sql.Open: %v", err)
}
if _, err := db.Exec(fmt.Sprintf("If(db_id(N'%s') IS NULL) BEGIN CREATE DATABASE %s; END;", setting.Database.Name, setting.Database.Name)); err != nil {
log.Fatal("db.Exec: %v", err)
log.Fatalf("db.Exec: %v", err)
}
defer db.Close()
}

View File

@@ -78,7 +78,6 @@ func storeAndGetLfs(t *testing.T, content *[]byte, extraHeader *http.Header, exp
}
}
}
resp := session.MakeRequest(t, req, expectedStatus)
return resp
@@ -211,7 +210,7 @@ func TestGetLFSRange(t *testing.T) {
{"bytes=0-10", "123456789\n", http.StatusPartialContent},
// end-range bigger than length-1 is ignored
{"bytes=0-11", "123456789\n", http.StatusPartialContent},
{"bytes=11-", "Requested Range Not Satisfiable", http.StatusRequestedRangeNotSatisfiable},
{"bytes=11-", "", http.StatusPartialContent},
// incorrect header value cause whole header to be ignored
{"bytes=-", "123456789\n", http.StatusOK},
{"foobar", "123456789\n", http.StatusOK},

View File

@@ -45,21 +45,19 @@ START_SSH_SERVER = true
OFFLINE_MODE = false
LFS_START_SERVER = true
LFS_CONTENT_PATH = integrations/gitea-integration-mysql/datalfs-mysql
LFS_JWT_SECRET = Tv_MjmZuHqpIY6GFl12ebgkRAMt4RlWt0v4EHKSXO0w
[lfs]
MINIO_BASE_PATH = lfs/
LFS_STORE_TYPE = minio
LFS_SERVE_DIRECT = false
LFS_MINIO_ENDPOINT = minio:9000
LFS_MINIO_ACCESS_KEY_ID = 123456
LFS_MINIO_SECRET_ACCESS_KEY = 12345678
LFS_MINIO_BUCKET = gitea
LFS_MINIO_LOCATION = us-east-1
LFS_MINIO_BASE_PATH = lfs/
LFS_MINIO_USE_SSL = false
[attachment]
MINIO_BASE_PATH = attachments/
[avatars]
MINIO_BASE_PATH = avatars/
[repo-avatars]
MINIO_BASE_PATH = repo-avatars/
[storage]
STORAGE_TYPE = minio
SERVE_DIRECT = false
MINIO_ENDPOINT = minio:9000
@@ -67,6 +65,7 @@ MINIO_ACCESS_KEY_ID = 123456
MINIO_SECRET_ACCESS_KEY = 12345678
MINIO_BUCKET = gitea
MINIO_LOCATION = us-east-1
MINIO_BASE_PATH = attachments/
MINIO_USE_SSL = false
[mailer]
@@ -89,6 +88,9 @@ ENABLE_NOTIFY_MAIL = true
DISABLE_GRAVATAR = false
ENABLE_FEDERATED_AVATAR = false
AVATAR_UPLOAD_PATH = integrations/gitea-integration-mysql/data/avatars
REPOSITORY_AVATAR_UPLOAD_PATH = integrations/gitea-integration-mysql/data/repo-avatars
[session]
PROVIDER = file
PROVIDER_CONFIG = integrations/gitea-integration-mysql/data/sessions

View File

@@ -2003,7 +2003,7 @@ type ErrNotValidReviewRequest struct {
// IsErrNotValidReviewRequest checks if an error is a ErrNotValidReviewRequest.
func IsErrNotValidReviewRequest(err error) bool {
_, ok := err.(ErrNotValidReviewRequest)
_, ok := err.(ErrReviewNotExist)
return ok
}

View File

@@ -5,7 +5,7 @@
act_user_id: 2
repo_id: 2
is_private: true
created_unix: 1603228283
created_unix: 1571686356
-
id: 2

View File

@@ -725,7 +725,6 @@ func createComment(e *xorm.Session, opts *CreateCommentOptions) (_ *Comment, err
RefAction: opts.RefAction,
RefIsPull: opts.RefIsPull,
IsForcePush: opts.IsForcePush,
Invalidated: opts.Invalidated,
}
if _, err = e.Insert(comment); err != nil {
return nil, err
@@ -892,7 +891,6 @@ type CreateCommentOptions struct {
RefAction references.XRefAction
RefIsPull bool
IsForcePush bool
Invalidated bool
}
// CreateComment creates comment of issue or commit.
@@ -968,8 +966,6 @@ type FindCommentsOptions struct {
ReviewID int64
Since int64
Before int64
Line int64
TreePath string
Type CommentType
}
@@ -993,12 +989,6 @@ func (opts *FindCommentsOptions) toConds() builder.Cond {
if opts.Type != CommentTypeUnknown {
cond = cond.And(builder.Eq{"comment.type": opts.Type})
}
if opts.Line > 0 {
cond = cond.And(builder.Eq{"comment.line": opts.Line})
}
if len(opts.TreePath) > 0 {
cond = cond.And(builder.Eq{"comment.tree_path": opts.TreePath})
}
return cond
}
@@ -1013,8 +1003,6 @@ func findComments(e Engine, opts FindCommentsOptions) ([]*Comment, error) {
sess = opts.setSessionPagination(sess)
}
// WARNING: If you change this order you will need to fix createCodeComment
return comments, sess.
Asc("comment.created_unix").
Asc("comment.id").
@@ -1136,10 +1124,6 @@ func fetchCodeCommentsByReview(e Engine, issue *Issue, currentUser *User, review
return nil, err
}
if err := comment.LoadReactions(issue.Repo); err != nil {
return nil, err
}
if re, ok := reviews[comment.ReviewID]; ok && re != nil {
// If the review is pending only the author can see the comments (except the review is set)
if review.ID == 0 {

View File

@@ -147,27 +147,6 @@ func GetMigratingTask(repoID int64) (*Task, error) {
return &task, nil
}
// GetMigratingTaskByID returns the migrating task by repo's id
func GetMigratingTaskByID(id, doerID int64) (*Task, *migration.MigrateOptions, error) {
var task = Task{
ID: id,
DoerID: doerID,
Type: structs.TaskTypeMigrateRepo,
}
has, err := x.Get(&task)
if err != nil {
return nil, nil, err
} else if !has {
return nil, nil, ErrTaskDoesNotExist{id, 0, task.Type}
}
var opts migration.MigrateOptions
if err := json.Unmarshal([]byte(task.PayloadContent), &opts); err != nil {
return nil, nil, err
}
return &task, &opts, nil
}
// FindTaskOptions find all tasks
type FindTaskOptions struct {
Status int

View File

@@ -197,13 +197,10 @@ func FindTopics(opts *FindTopicOptions) (topics []*Topic, err error) {
// GetRepoTopicByName retrives topic from name for a repo if it exist
func GetRepoTopicByName(repoID int64, topicName string) (*Topic, error) {
return getRepoTopicByName(x, repoID, topicName)
}
func getRepoTopicByName(e Engine, repoID int64, topicName string) (*Topic, error) {
var cond = builder.NewCond()
var topic Topic
cond = cond.And(builder.Eq{"repo_topic.repo_id": repoID}).And(builder.Eq{"topic.name": topicName})
sess := e.Table("topic").Where(cond)
sess := x.Table("topic").Where(cond)
sess.Join("INNER", "repo_topic", "repo_topic.topic_id = topic.id")
has, err := sess.Get(&topic)
if has {
@@ -214,13 +211,7 @@ func getRepoTopicByName(e Engine, repoID int64, topicName string) (*Topic, error
// AddTopic adds a topic name to a repository (if it does not already have it)
func AddTopic(repoID int64, topicName string) (*Topic, error) {
sess := x.NewSession()
defer sess.Close()
if err := sess.Begin(); err != nil {
return nil, err
}
topic, err := getRepoTopicByName(sess, repoID, topicName)
topic, err := GetRepoTopicByName(repoID, topicName)
if err != nil {
return nil, err
}
@@ -229,25 +220,7 @@ func AddTopic(repoID int64, topicName string) (*Topic, error) {
return topic, nil
}
topic, err = addTopicByNameToRepo(sess, repoID, topicName)
if err != nil {
return nil, err
}
topicNames := make([]string, 0, 25)
if err := sess.Select("name").Table("topic").
Join("INNER", "repo_topic", "repo_topic.topic_id = topic.id").
Where("repo_topic.repo_id = ?", repoID).Desc("topic.repo_count").Find(&topicNames); err != nil {
return nil, err
}
if _, err := sess.ID(repoID).Cols("topics").Update(&Repository{
Topics: topicNames,
}); err != nil {
return nil, err
}
return topic, sess.Commit()
return addTopicByNameToRepo(x, repoID, topicName)
}
// DeleteTopic removes a topic name from a repository (if it has it)

View File

@@ -191,6 +191,9 @@ func (u *User) BeforeUpdate() {
if len(u.AvatarEmail) == 0 {
u.AvatarEmail = u.Email
}
if len(u.AvatarEmail) > 0 && u.Avatar == "" {
u.Avatar = base.HashEmail(u.AvatarEmail)
}
}
u.LowerName = strings.ToLower(u.Name)
@@ -832,6 +835,7 @@ func CreateUser(u *User) (err error) {
u.LowerName = strings.ToLower(u.Name)
u.AvatarEmail = u.Email
u.Avatar = base.HashEmail(u.AvatarEmail)
if u.Rands, err = GetUserSalt(); err != nil {
return err
}

View File

@@ -39,9 +39,10 @@ func (u *User) generateRandomAvatar(e Engine) error {
if err != nil {
return fmt.Errorf("RandomImage: %v", err)
}
// NOTICE for random avatar, it still uses id as avatar name, but custom avatar use md5
// since random image is not a user's photo, there is no security for enumable
if u.Avatar == "" {
u.Avatar = base.HashEmail(u.AvatarEmail)
u.Avatar = fmt.Sprintf("%d", u.ID)
}
if err := storage.SaveFrom(storage.Avatars, u.CustomAvatarRelativePath(), func(w io.Writer) error {

View File

@@ -17,7 +17,7 @@ func TestGetUserHeatmapDataByUser(t *testing.T) {
CountResult int
JSONResult string
}{
{2, 1, `[{"timestamp":1603152000,"contributions":1}]`},
{2, 1, `[{"timestamp":1571616000,"contributions":1}]`},
{3, 0, `[]`},
}
// Prepare

View File

@@ -12,9 +12,6 @@ import (
"github.com/msteinert/pam"
)
// Supported is true when built with PAM
var Supported = true
// Auth pam auth service
func Auth(serviceName, userName, passwd string) (string, error) {
t, err := pam.StartFunc(serviceName, userName, func(s pam.Style, msg string) (string, error) {

View File

@@ -10,9 +10,6 @@ import (
"errors"
)
// Supported is false when built without PAM
var Supported = false
// Auth not supported lack of pam tag
func Auth(serviceName, userName, passwd string) (string, error) {
return "", errors.New("PAM not supported")

View File

@@ -199,6 +199,7 @@ func (f *AccessTokenForm) Validate(ctx *macaron.Context, errs binding.Errors) bi
type UpdateProfileForm struct {
Name string `binding:"AlphaDashDot;MaxSize(40)"`
FullName string `binding:"MaxSize(100)"`
Email string `binding:"Required;Email;MaxSize(254)"`
KeepEmailPrivate bool
Website string `binding:"ValidUrl;MaxSize(255)"`
Location string `binding:"MaxSize(50)"`

View File

@@ -27,7 +27,7 @@ type BlameReader struct {
cmd *exec.Cmd
pid int64
output io.ReadCloser
reader *bufio.Reader
scanner *bufio.Scanner
lastSha *string
cancel context.CancelFunc
}
@@ -38,30 +38,23 @@ var shaLineRegex = regexp.MustCompile("^([a-z0-9]{40})")
func (r *BlameReader) NextPart() (*BlamePart, error) {
var blamePart *BlamePart
reader := r.reader
scanner := r.scanner
if r.lastSha != nil {
blamePart = &BlamePart{*r.lastSha, make([]string, 0)}
}
var line []byte
var isPrefix bool
var err error
for err != io.EOF {
line, isPrefix, err = reader.ReadLine()
if err != nil && err != io.EOF {
return blamePart, err
}
for scanner.Scan() {
line := scanner.Text()
// Skip empty lines
if len(line) == 0 {
// isPrefix will be false
continue
}
lines := shaLineRegex.FindSubmatch(line)
lines := shaLineRegex.FindStringSubmatch(line)
if lines != nil {
sha1 := string(lines[1])
sha1 := lines[1]
if blamePart == nil {
blamePart = &BlamePart{sha1, make([]string, 0)}
@@ -69,27 +62,12 @@ func (r *BlameReader) NextPart() (*BlamePart, error) {
if blamePart.Sha != sha1 {
r.lastSha = &sha1
// need to munch to end of line...
for isPrefix {
_, isPrefix, err = reader.ReadLine()
if err != nil && err != io.EOF {
return blamePart, err
}
}
return blamePart, nil
}
} else if line[0] == '\t' {
code := line[1:]
blamePart.Lines = append(blamePart.Lines, string(code))
}
// need to munch to end of line...
for isPrefix {
_, isPrefix, err = reader.ReadLine()
if err != nil && err != io.EOF {
return blamePart, err
}
blamePart.Lines = append(blamePart.Lines, code)
}
}
@@ -143,13 +121,13 @@ func createBlameReader(ctx context.Context, dir string, command ...string) (*Bla
pid := process.GetManager().Add(fmt.Sprintf("GetBlame [repo_path: %s]", dir), cancel)
reader := bufio.NewReader(stdout)
scanner := bufio.NewScanner(stdout)
return &BlameReader{
cmd,
pid,
stdout,
reader,
scanner,
nil,
cancel,
}, nil

View File

@@ -8,7 +8,6 @@ import (
"crypto/sha256"
"encoding/hex"
"errors"
"fmt"
"io"
"os"
@@ -22,21 +21,6 @@ var (
errSizeMismatch = errors.New("Content size does not match")
)
// ErrRangeNotSatisfiable represents an error which request range is not satisfiable.
type ErrRangeNotSatisfiable struct {
FromByte int64
}
func (err ErrRangeNotSatisfiable) Error() string {
return fmt.Sprintf("Requested range %d is not satisfiable", err.FromByte)
}
// IsErrRangeNotSatisfiable returns true if the error is an ErrRangeNotSatisfiable
func IsErrRangeNotSatisfiable(err error) bool {
_, ok := err.(ErrRangeNotSatisfiable)
return ok
}
// ContentStore provides a simple file system based storage.
type ContentStore struct {
storage.ObjectStorage
@@ -51,12 +35,7 @@ func (s *ContentStore) Get(meta *models.LFSMetaObject, fromByte int64) (io.ReadC
return nil, err
}
if fromByte > 0 {
if fromByte >= meta.Size {
return nil, ErrRangeNotSatisfiable{
FromByte: fromByte,
}
}
_, err = f.Seek(fromByte, io.SeekStart)
_, err = f.Seek(fromByte, os.SEEK_CUR)
if err != nil {
log.Error("Whilst trying to read LFS OID[%s]: Unable to seek to %d Error: %v", meta.Oid, fromByte, err)
}

View File

@@ -191,12 +191,8 @@ func getContentHandler(ctx *context.Context) {
contentStore := &ContentStore{ObjectStorage: storage.LFS}
content, err := contentStore.Get(meta, fromByte)
if err != nil {
if IsErrRangeNotSatisfiable(err) {
writeStatus(ctx, http.StatusRequestedRangeNotSatisfiable)
} else {
// Errors are logged in contentStore.Get
writeStatus(ctx, 404)
}
// Errors are logged in contentStore.Get
writeStatus(ctx, 404)
return
}
defer content.Close()

View File

@@ -14,7 +14,6 @@ import (
"strings"
"time"
"code.gitea.io/gitea/models"
"code.gitea.io/gitea/modules/log"
"code.gitea.io/gitea/modules/migrations/base"
"code.gitea.io/gitea/modules/structs"
@@ -48,7 +47,7 @@ func (f *GiteaDownloaderFactory) New(ctx context.Context, opts base.MigrateOptio
path := strings.Split(repoNameSpace, "/")
if len(path) < 2 {
return nil, fmt.Errorf("invalid path: %s", repoNameSpace)
return nil, fmt.Errorf("invalid path")
}
repoPath := strings.Join(path[len(path)-2:], "/")
@@ -88,7 +87,7 @@ func NewGiteaDownloader(ctx context.Context, baseURL, repoPath, username, passwo
gitea_sdk.SetContext(ctx),
)
if err != nil {
log.Error(fmt.Sprintf("Failed to create NewGiteaDownloader for: %s. Error: %v", baseURL, err))
log.Error(fmt.Sprintf("NewGiteaDownloader: %s", err.Error()))
return nil, err
}
@@ -102,13 +101,12 @@ func NewGiteaDownloader(ctx context.Context, baseURL, repoPath, username, passwo
// set small maxPerPage since we can only guess
// (default would be 50 but this can differ)
maxPerPage := 10
// gitea instances >=1.13 can tell us what maximum they have
apiConf, _, err := giteaClient.GetGlobalAPISettings()
if err != nil {
log.Info("Unable to get global API settings. Ignoring these.")
log.Debug("giteaClient.GetGlobalAPISettings. Error: %v", err)
}
if apiConf != nil {
// new gitea instances can tell us what maximum they have
if giteaClient.CheckServerVersionConstraint(">=1.13.0") == nil {
apiConf, _, err := giteaClient.GetGlobalAPISettings()
if err != nil {
return nil, err
}
maxPerPage = apiConf.MaxResponseItems
}
@@ -396,11 +394,7 @@ func (g *GiteaDownloader) GetIssues(page, perPage int) ([]*base.Issue, bool, err
reactions, err := g.getIssueReactions(issue.Index)
if err != nil {
log.Warn("Unable to load reactions during migrating issue #%d to %s/%s. Error: %v", issue.Index, g.repoOwner, g.repoName, err)
if err2 := models.CreateRepositoryNotice(
fmt.Sprintf("Unable to load reactions during migrating issue #%d to %s/%s. Error: %v", issue.Index, g.repoOwner, g.repoName, err)); err2 != nil {
log.Error("create repository notice failed: ", err2)
}
return nil, false, fmt.Errorf("error while loading reactions: %v", err)
}
var assignees []string
@@ -451,17 +445,13 @@ func (g *GiteaDownloader) GetComments(index int64) ([]*base.Comment, error) {
// Page: i,
}})
if err != nil {
return nil, fmt.Errorf("error while listing comments for issue #%d. Error: %v", index, err)
return nil, fmt.Errorf("error while listing comments: %v", err)
}
for _, comment := range comments {
reactions, err := g.getCommentReactions(comment.ID)
if err != nil {
log.Warn("Unable to load comment reactions during migrating issue #%d for comment %d to %s/%s. Error: %v", index, comment.ID, g.repoOwner, g.repoName, err)
if err2 := models.CreateRepositoryNotice(
fmt.Sprintf("Unable to load reactions during migrating issue #%d for comment %d to %s/%s. Error: %v", index, comment.ID, g.repoOwner, g.repoName, err)); err2 != nil {
log.Error("create repository notice failed: ", err2)
}
return nil, fmt.Errorf("error while listing comment creactions: %v", err)
}
allComments = append(allComments, &base.Comment{
@@ -499,7 +489,7 @@ func (g *GiteaDownloader) GetPullRequests(page, perPage int) ([]*base.PullReques
State: gitea_sdk.StateAll,
})
if err != nil {
return nil, false, fmt.Errorf("error while listing pull requests (page: %d, pagesize: %d). Error: %v", page, perPage, err)
return nil, false, fmt.Errorf("error while listing repos: %v", err)
}
for _, pr := range prs {
var milestone string
@@ -530,7 +520,7 @@ func (g *GiteaDownloader) GetPullRequests(page, perPage int) ([]*base.PullReques
if headSHA == "" {
headCommit, _, err := g.client.GetSingleCommit(g.repoOwner, g.repoName, url.PathEscape(pr.Head.Ref))
if err != nil {
return nil, false, fmt.Errorf("error while resolving head git ref: %s for pull #%d. Error: %v", pr.Head.Ref, pr.Index, err)
return nil, false, fmt.Errorf("error while resolving git ref: %v", err)
}
headSHA = headCommit.SHA
}
@@ -543,11 +533,7 @@ func (g *GiteaDownloader) GetPullRequests(page, perPage int) ([]*base.PullReques
reactions, err := g.getIssueReactions(pr.Index)
if err != nil {
log.Warn("Unable to load reactions during migrating pull #%d to %s/%s. Error: %v", pr.Index, g.repoOwner, g.repoName, err)
if err2 := models.CreateRepositoryNotice(
fmt.Sprintf("Unable to load reactions during migrating pull #%d to %s/%s. Error: %v", pr.Index, g.repoOwner, g.repoName, err)); err2 != nil {
log.Error("create repository notice failed: ", err2)
}
return nil, false, fmt.Errorf("error while loading reactions: %v", err)
}
var assignees []string

View File

@@ -28,7 +28,6 @@ import (
"code.gitea.io/gitea/modules/storage"
"code.gitea.io/gitea/modules/structs"
"code.gitea.io/gitea/modules/timeutil"
"code.gitea.io/gitea/services/pull"
gouuid "github.com/google/uuid"
)
@@ -525,7 +524,6 @@ func (g *GiteaLocalUploader) CreatePullRequests(prs ...*base.PullRequest) error
}
for _, pr := range gprs {
g.issues.Store(pr.Issue.Index, pr.Issue.ID)
pull.AddToTaskQueue(pr)
}
return nil
}

View File

@@ -65,25 +65,23 @@ func (f *GithubDownloaderV3Factory) GitServiceType() structs.GitServiceType {
// GithubDownloaderV3 implements a Downloader interface to get repository informations
// from github via APIv3
type GithubDownloaderV3 struct {
ctx context.Context
client *github.Client
repoOwner string
repoName string
userName string
password string
rate *github.Rate
maxPerPage int
ctx context.Context
client *github.Client
repoOwner string
repoName string
userName string
password string
rate *github.Rate
}
// NewGithubDownloaderV3 creates a github Downloader via github v3 API
func NewGithubDownloaderV3(ctx context.Context, baseURL, userName, password, token, repoOwner, repoName string) *GithubDownloaderV3 {
var downloader = GithubDownloaderV3{
userName: userName,
password: password,
ctx: ctx,
repoOwner: repoOwner,
repoName: repoName,
maxPerPage: 100,
userName: userName,
password: password,
ctx: ctx,
repoOwner: repoOwner,
repoName: repoName,
}
client := &http.Client{
@@ -179,7 +177,7 @@ func (g *GithubDownloaderV3) GetTopics() ([]string, error) {
// GetMilestones returns milestones
func (g *GithubDownloaderV3) GetMilestones() ([]*base.Milestone, error) {
var perPage = g.maxPerPage
var perPage = 100
var milestones = make([]*base.Milestone, 0, perPage)
for i := 1; ; i++ {
g.sleep()
@@ -235,7 +233,7 @@ func convertGithubLabel(label *github.Label) *base.Label {
// GetLabels returns labels
func (g *GithubDownloaderV3) GetLabels() ([]*base.Label, error) {
var perPage = g.maxPerPage
var perPage = 100
var labels = make([]*base.Label, 0, perPage)
for i := 1; ; i++ {
g.sleep()
@@ -306,7 +304,7 @@ func (g *GithubDownloaderV3) convertGithubRelease(rel *github.RepositoryRelease)
// GetReleases returns releases
func (g *GithubDownloaderV3) GetReleases() ([]*base.Release, error) {
var perPage = g.maxPerPage
var perPage = 100
var releases = make([]*base.Release, 0, perPage)
for i := 1; ; i++ {
g.sleep()
@@ -344,9 +342,6 @@ func (g *GithubDownloaderV3) GetAsset(_ string, _, id int64) (io.ReadCloser, err
// GetIssues returns issues according start and limit
func (g *GithubDownloaderV3) GetIssues(page, perPage int) ([]*base.Issue, bool, error) {
if perPage > g.maxPerPage {
perPage = g.maxPerPage
}
opt := &github.IssueListByRepoOptions{
Sort: "created",
Direction: "asc",
@@ -434,7 +429,7 @@ func (g *GithubDownloaderV3) GetIssues(page, perPage int) ([]*base.Issue, bool,
// GetComments returns comments according issueNumber
func (g *GithubDownloaderV3) GetComments(issueNumber int64) ([]*base.Comment, error) {
var (
allComments = make([]*base.Comment, 0, g.maxPerPage)
allComments = make([]*base.Comment, 0, 100)
created = "created"
asc = "asc"
)
@@ -442,7 +437,7 @@ func (g *GithubDownloaderV3) GetComments(issueNumber int64) ([]*base.Comment, er
Sort: &created,
Direction: &asc,
ListOptions: github.ListOptions{
PerPage: g.maxPerPage,
PerPage: 100,
},
}
for {
@@ -464,7 +459,7 @@ func (g *GithubDownloaderV3) GetComments(issueNumber int64) ([]*base.Comment, er
g.sleep()
res, resp, err := g.client.Reactions.ListIssueCommentReactions(g.ctx, g.repoOwner, g.repoName, comment.GetID(), &github.ListOptions{
Page: i,
PerPage: g.maxPerPage,
PerPage: 100,
})
if err != nil {
return nil, err
@@ -502,9 +497,6 @@ func (g *GithubDownloaderV3) GetComments(issueNumber int64) ([]*base.Comment, er
// GetPullRequests returns pull requests according page and perPage
func (g *GithubDownloaderV3) GetPullRequests(page, perPage int) ([]*base.PullRequest, bool, error) {
if perPage > g.maxPerPage {
perPage = g.maxPerPage
}
opt := &github.PullRequestListOptions{
Sort: "created",
Direction: "asc",
@@ -658,7 +650,7 @@ func (g *GithubDownloaderV3) convertGithubReviewComments(cs []*github.PullReques
g.sleep()
res, resp, err := g.client.Reactions.ListPullRequestCommentReactions(g.ctx, g.repoOwner, g.repoName, c.GetID(), &github.ListOptions{
Page: i,
PerPage: g.maxPerPage,
PerPage: 100,
})
if err != nil {
return nil, err
@@ -695,9 +687,9 @@ func (g *GithubDownloaderV3) convertGithubReviewComments(cs []*github.PullReques
// GetReviews returns pull requests review
func (g *GithubDownloaderV3) GetReviews(pullRequestNumber int64) ([]*base.Review, error) {
var allReviews = make([]*base.Review, 0, g.maxPerPage)
var allReviews = make([]*base.Review, 0, 100)
opt := &github.ListOptions{
PerPage: g.maxPerPage,
PerPage: 100,
}
for {
g.sleep()
@@ -711,7 +703,7 @@ func (g *GithubDownloaderV3) GetReviews(pullRequestNumber int64) ([]*base.Review
r.IssueIndex = pullRequestNumber
// retrieve all review comments
opt2 := &github.ListOptions{
PerPage: g.maxPerPage,
PerPage: 100,
}
for {
g.sleep()

View File

@@ -68,7 +68,6 @@ type GitlabDownloader struct {
repoName string
issueCount int64
fetchPRcomments bool
maxPerPage int
}
// NewGitlabDownloader creates a gitlab Downloader via gitlab API
@@ -100,11 +99,10 @@ func NewGitlabDownloader(ctx context.Context, baseURL, repoPath, username, passw
}
return &GitlabDownloader{
ctx: ctx,
client: gitlabClient,
repoID: gr.ID,
repoName: gr.Name,
maxPerPage: 100,
ctx: ctx,
client: gitlabClient,
repoID: gr.ID,
repoName: gr.Name,
}, nil
}
@@ -161,7 +159,7 @@ func (g *GitlabDownloader) GetTopics() ([]string, error) {
// GetMilestones returns milestones
func (g *GitlabDownloader) GetMilestones() ([]*base.Milestone, error) {
var perPage = g.maxPerPage
var perPage = 100
var state = "all"
var milestones = make([]*base.Milestone, 0, perPage)
for i := 1; ; i++ {
@@ -232,7 +230,7 @@ func (g *GitlabDownloader) normalizeColor(val string) string {
// GetLabels returns labels
func (g *GitlabDownloader) GetLabels() ([]*base.Label, error) {
var perPage = g.maxPerPage
var perPage = 100
var labels = make([]*base.Label, 0, perPage)
for i := 1; ; i++ {
ls, _, err := g.client.Labels.ListLabels(g.repoID, &gitlab.ListLabelsOptions{ListOptions: gitlab.ListOptions{
@@ -283,7 +281,7 @@ func (g *GitlabDownloader) convertGitlabRelease(rel *gitlab.Release) *base.Relea
// GetReleases returns releases
func (g *GitlabDownloader) GetReleases() ([]*base.Release, error) {
var perPage = g.maxPerPage
var perPage = 100
var releases = make([]*base.Release, 0, perPage)
for i := 1; ; i++ {
ls, _, err := g.client.Releases.ListReleases(g.repoID, &gitlab.ListReleasesOptions{
@@ -332,10 +330,6 @@ func (g *GitlabDownloader) GetIssues(page, perPage int) ([]*base.Issue, bool, er
state := "all"
sort := "asc"
if perPage > g.maxPerPage {
perPage = g.maxPerPage
}
opt := &gitlab.ListProjectIssuesOptions{
State: &state,
Sort: &sort,
@@ -407,7 +401,7 @@ func (g *GitlabDownloader) GetIssues(page, perPage int) ([]*base.Issue, bool, er
// GetComments returns comments according issueNumber
// TODO: figure out how to transfer comment reactions
func (g *GitlabDownloader) GetComments(issueNumber int64) ([]*base.Comment, error) {
var allComments = make([]*base.Comment, 0, g.maxPerPage)
var allComments = make([]*base.Comment, 0, 100)
var page = 1
var realIssueNumber int64
@@ -421,14 +415,14 @@ func (g *GitlabDownloader) GetComments(issueNumber int64) ([]*base.Comment, erro
realIssueNumber = issueNumber
comments, resp, err = g.client.Discussions.ListIssueDiscussions(g.repoID, int(realIssueNumber), &gitlab.ListIssueDiscussionsOptions{
Page: page,
PerPage: g.maxPerPage,
PerPage: 100,
}, nil, gitlab.WithContext(g.ctx))
} else {
// If this is a PR, we need to figure out the Gitlab/original PR ID to be passed below
realIssueNumber = issueNumber - g.issueCount
comments, resp, err = g.client.Discussions.ListMergeRequestDiscussions(g.repoID, int(realIssueNumber), &gitlab.ListMergeRequestDiscussionsOptions{
Page: page,
PerPage: g.maxPerPage,
PerPage: 100,
}, nil, gitlab.WithContext(g.ctx))
}
@@ -471,10 +465,6 @@ func (g *GitlabDownloader) GetComments(issueNumber int64) ([]*base.Comment, erro
// GetPullRequests returns pull requests according page and perPage
func (g *GitlabDownloader) GetPullRequests(page, perPage int) ([]*base.PullRequest, bool, error) {
if perPage > g.maxPerPage {
perPage = g.maxPerPage
}
opt := &gitlab.ListProjectMergeRequestsOptions{
ListOptions: gitlab.ListOptions{
PerPage: perPage,

View File

@@ -69,7 +69,7 @@ func MigrateRepository(ctx context.Context, doer *models.User, ownerName string,
}
if err2 := models.CreateRepositoryNotice(fmt.Sprintf("Migrate repository from %s failed: %v", opts.OriginalURL, err)); err2 != nil {
log.Error("create repository notice failed: ", err2)
log.Error("create respotiry notice failed: ", err2)
}
return nil, err
}

View File

@@ -314,7 +314,7 @@ func (a *actionNotifier) NotifySyncDeleteRef(doer *models.User, repo *models.Rep
if err := models.NotifyWatchers(&models.Action{
ActUserID: repo.OwnerID,
ActUser: repo.MustOwner(),
OpType: models.ActionMirrorSyncDelete,
OpType: models.ActionMirrorSyncCreate,
RepoID: repo.ID,
Repo: repo,
IsPrivate: repo.IsPrivate,

View File

@@ -235,78 +235,40 @@ func findAllIssueReferencesMarkdown(content string) []*rawReference {
return findAllIssueReferencesBytes(bcontent, links)
}
func convertFullHTMLReferencesToShortRefs(re *regexp.Regexp, contentBytes *[]byte) {
// We will iterate through the content, rewrite and simplify full references.
//
// We want to transform something like:
//
// this is a https://ourgitea.com/git/owner/repo/issues/123456789, foo
// https://ourgitea.com/git/owner/repo/pulls/123456789
//
// Into something like:
//
// this is a #123456789, foo
// !123456789
pos := 0
for {
// re looks for something like: (\s|^|\(|\[)https://ourgitea.com/git/(owner/repo)/(issues)/(123456789)(?:\s|$|\)|\]|[:;,.?!]\s|[:;,.?!]$)
match := re.FindSubmatchIndex((*contentBytes)[pos:])
if match == nil {
break
}
// match is a bunch of indices into the content from pos onwards so
// to simplify things let's just add pos to all of the indices in match
for i := range match {
match[i] += pos
}
// match[0]-match[1] is whole string
// match[2]-match[3] is preamble
// move the position to the end of the preamble
pos = match[3]
// match[4]-match[5] is owner/repo
// now copy the owner/repo to end of the preamble
endPos := pos + match[5] - match[4]
copy((*contentBytes)[pos:endPos], (*contentBytes)[match[4]:match[5]])
// move the current position to the end of the newly copied owner/repo
pos = endPos
// Now set the issue/pull marker:
//
// match[6]-match[7] == 'issues'
(*contentBytes)[pos] = '#'
if string((*contentBytes)[match[6]:match[7]]) == "pulls" {
(*contentBytes)[pos] = '!'
}
pos++
// Then add the issue/pull number
//
// match[8]-match[9] is the number
endPos = pos + match[9] - match[8]
copy((*contentBytes)[pos:endPos], (*contentBytes)[match[8]:match[9]])
// Now copy what's left at the end of the string to the new end position
copy((*contentBytes)[endPos:], (*contentBytes)[match[9]:])
// now we reset the length
// our new section has length endPos - match[3]
// our old section has length match[9] - match[3]
(*contentBytes) = (*contentBytes)[:len((*contentBytes))-match[9]+endPos]
pos = endPos
}
}
// FindAllIssueReferences returns a list of unvalidated references found in a string.
func FindAllIssueReferences(content string) []IssueReference {
// Need to convert fully qualified html references to local system to #/! short codes
contentBytes := []byte(content)
if re := getGiteaIssuePullPattern(); re != nil {
convertFullHTMLReferencesToShortRefs(re, &contentBytes)
pos := 0
for {
match := re.FindSubmatchIndex(contentBytes[pos:])
if match == nil {
break
}
// match[0]-match[1] is whole string
// match[2]-match[3] is preamble
pos += match[3]
// match[4]-match[5] is owner/repo
endPos := pos + match[5] - match[4]
copy(contentBytes[pos:endPos], contentBytes[match[4]:match[5]])
pos = endPos
// match[6]-match[7] == 'issues'
contentBytes[pos] = '#'
if string(contentBytes[match[6]:match[7]]) == "pulls" {
contentBytes[pos] = '!'
}
pos++
// match[8]-match[9] is the number
endPos = pos + match[9] - match[8]
copy(contentBytes[pos:endPos], contentBytes[match[8]:match[9]])
copy(contentBytes[endPos:], contentBytes[match[9]:])
// now we reset the length
// our new section has length endPos - match[3]
// our old section has length match[9] - match[3]
contentBytes = contentBytes[:len(contentBytes)-match[9]+endPos]
pos = endPos
}
} else {
log.Debug("No GiteaIssuePullPattern pattern")
}

View File

@@ -5,7 +5,6 @@
package references
import (
"regexp"
"testing"
"code.gitea.io/gitea/modules/setting"
@@ -30,26 +29,6 @@ type testResult struct {
TimeLog string
}
func TestConvertFullHTMLReferencesToShortRefs(t *testing.T) {
re := regexp.MustCompile(`(\s|^|\(|\[)` +
regexp.QuoteMeta("https://ourgitea.com/git/") +
`([0-9a-zA-Z-_\.]+/[0-9a-zA-Z-_\.]+)/` +
`((?:issues)|(?:pulls))/([0-9]+)(?:\s|$|\)|\]|[:;,.?!]\s|[:;,.?!]$)`)
test := `this is a https://ourgitea.com/git/owner/repo/issues/123456789, foo
https://ourgitea.com/git/owner/repo/pulls/123456789
And https://ourgitea.com/git/owner/repo/pulls/123
`
expect := `this is a owner/repo#123456789, foo
owner/repo!123456789
And owner/repo!123
`
contentBytes := []byte(test)
convertFullHTMLReferencesToShortRefs(re, &contentBytes)
result := string(contentBytes)
assert.EqualValues(t, expect, result)
}
func TestFindAllIssueReferences(t *testing.T) {
fixtures := []testFixture{
@@ -127,13 +106,6 @@ func TestFindAllIssueReferences(t *testing.T) {
{202, "user4", "repo5", "202", true, XRefActionNone, nil, nil, ""},
},
},
{
"This http://gitea.com:3000/user4/repo5/pulls/202 yes. http://gitea.com:3000/user4/repo5/pulls/203 no",
[]testResult{
{202, "user4", "repo5", "202", true, XRefActionNone, nil, nil, ""},
{203, "user4", "repo5", "203", true, XRefActionNone, nil, nil, ""},
},
},
{
"This http://GiTeA.COM:3000/user4/repo6/pulls/205 yes.",
[]testResult{

View File

@@ -21,7 +21,7 @@ type Storage struct {
// MapTo implements the Mappable interface
func (s *Storage) MapTo(v interface{}) error {
pathValue := reflect.ValueOf(v).Elem().FieldByName("Path")
pathValue := reflect.ValueOf(v).FieldByName("Path")
if pathValue.IsValid() && pathValue.Kind() == reflect.String {
pathValue.SetString(s.Path)
}
@@ -32,19 +32,21 @@ func (s *Storage) MapTo(v interface{}) error {
}
func getStorage(name, typ string, overrides ...*ini.Section) Storage {
const sectionName = "storage"
sectionName := "storage"
if len(name) > 0 {
sectionName = sectionName + "." + typ
}
sec := Cfg.Section(sectionName)
if len(overrides) == 0 {
overrides = []*ini.Section{
Cfg.Section(sectionName + "." + typ),
Cfg.Section(sectionName + "." + name),
}
}
var storage Storage
storage.Type = sec.Key("STORAGE_TYPE").MustString(typ)
storage.Type = sec.Key("STORAGE_TYPE").MustString("")
storage.ServeDirect = sec.Key("SERVE_DIRECT").MustBool(false)
// Global Defaults

View File

@@ -11,7 +11,6 @@ import (
"os"
"path/filepath"
"code.gitea.io/gitea/modules/log"
"code.gitea.io/gitea/modules/util"
)
@@ -40,7 +39,7 @@ func NewLocalStorage(ctx context.Context, cfg interface{}) (ObjectStorage, error
return nil, err
}
config := configInterface.(LocalStorageConfig)
log.Info("Creating new Local Storage at %s", config.Path)
if err := os.MkdirAll(config.Path, os.ModePerm); err != nil {
return nil, err
}

View File

@@ -13,7 +13,6 @@ import (
"strings"
"time"
"code.gitea.io/gitea/modules/log"
"github.com/minio/minio-go/v7"
"github.com/minio/minio-go/v7/pkg/credentials"
)
@@ -31,7 +30,7 @@ type minioObject struct {
func (m *minioObject) Stat() (os.FileInfo, error) {
oi, err := m.Object.Stat()
if err != nil {
return nil, convertMinioErr(err)
return nil, err
}
return &minioFileInfo{oi}, nil
@@ -59,41 +58,20 @@ type MinioStorage struct {
basePath string
}
func convertMinioErr(err error) error {
if err == nil {
return nil
}
errResp, ok := err.(minio.ErrorResponse)
if !ok {
return err
}
// Convert two responses to standard analogues
switch errResp.Code {
case "NoSuchKey":
return os.ErrNotExist
case "AccessDenied":
return os.ErrPermission
}
return err
}
// NewMinioStorage returns a minio storage
func NewMinioStorage(ctx context.Context, cfg interface{}) (ObjectStorage, error) {
configInterface, err := toConfig(MinioStorageConfig{}, cfg)
if err != nil {
return nil, convertMinioErr(err)
return nil, err
}
config := configInterface.(MinioStorageConfig)
log.Info("Creating Minio storage at %s:%s with base path %s", config.Endpoint, config.Bucket, config.BasePath)
minioClient, err := minio.New(config.Endpoint, &minio.Options{
Creds: credentials.NewStaticV4(config.AccessKeyID, config.SecretAccessKey, ""),
Secure: config.UseSSL,
})
if err != nil {
return nil, convertMinioErr(err)
return nil, err
}
if err := minioClient.MakeBucket(ctx, config.Bucket, minio.MakeBucketOptions{
@@ -102,7 +80,7 @@ func NewMinioStorage(ctx context.Context, cfg interface{}) (ObjectStorage, error
// Check to see if we already own this bucket (which happens if you run this twice)
exists, errBucketExists := minioClient.BucketExists(ctx, config.Bucket)
if !exists || errBucketExists != nil {
return nil, convertMinioErr(err)
return nil, err
}
}
@@ -123,7 +101,7 @@ func (m *MinioStorage) Open(path string) (Object, error) {
var opts = minio.GetObjectOptions{}
object, err := m.client.GetObject(m.ctx, m.bucket, m.buildMinioPath(path), opts)
if err != nil {
return nil, convertMinioErr(err)
return nil, err
}
return &minioObject{object}, nil
}
@@ -139,7 +117,7 @@ func (m *MinioStorage) Save(path string, r io.Reader) (int64, error) {
minio.PutObjectOptions{ContentType: "application/octet-stream"},
)
if err != nil {
return 0, convertMinioErr(err)
return 0, err
}
return uploadInfo.Size, nil
}
@@ -186,17 +164,14 @@ func (m *MinioStorage) Stat(path string) (os.FileInfo, error) {
return nil, os.ErrNotExist
}
}
return nil, convertMinioErr(err)
return nil, err
}
return &minioFileInfo{info}, nil
}
// Delete delete a file
func (m *MinioStorage) Delete(path string) error {
if err := m.client.RemoveObject(m.ctx, m.bucket, m.buildMinioPath(path), minio.RemoveObjectOptions{}); err != nil {
return convertMinioErr(err)
}
return nil
return m.client.RemoveObject(m.ctx, m.bucket, m.buildMinioPath(path), minio.RemoveObjectOptions{})
}
// URL gets the redirect URL to a file. The presigned link is valid for 5 minutes.
@@ -204,8 +179,7 @@ func (m *MinioStorage) URL(path, name string) (*url.URL, error) {
reqParams := make(url.Values)
// TODO it may be good to embed images with 'inline' like ServeData does, but we don't want to have to read the file, do we?
reqParams.Set("response-content-disposition", "attachment; filename=\""+quoteEscaper.Replace(name)+"\"")
u, err := m.client.PresignedGetObject(m.ctx, m.bucket, m.buildMinioPath(path), 5*time.Minute, reqParams)
return u, convertMinioErr(err)
return m.client.PresignedGetObject(m.ctx, m.bucket, m.buildMinioPath(path), 5*time.Minute, reqParams)
}
// IterateObjects iterates across the objects in the miniostorage
@@ -219,13 +193,13 @@ func (m *MinioStorage) IterateObjects(fn func(path string, obj Object) error) er
}) {
object, err := m.client.GetObject(lobjectCtx, m.bucket, mObjInfo.Key, opts)
if err != nil {
return convertMinioErr(err)
return err
}
if err := func(object *minio.Object, fn func(path string, obj Object) error) error {
defer object.Close()
return fn(strings.TrimPrefix(m.basePath, mObjInfo.Key), &minioObject{object})
}(object, fn); err != nil {
return convertMinioErr(err)
return err
}
}
return nil

View File

@@ -12,7 +12,6 @@ import (
"net/url"
"os"
"code.gitea.io/gitea/modules/log"
"code.gitea.io/gitea/modules/setting"
)
@@ -142,25 +141,21 @@ func NewStorage(typStr string, cfg interface{}) (ObjectStorage, error) {
}
func initAvatars() (err error) {
log.Info("Initialising Avatar storage with type: %s", setting.Avatar.Storage.Type)
Avatars, err = NewStorage(setting.Avatar.Storage.Type, &setting.Avatar.Storage)
Avatars, err = NewStorage(setting.Avatar.Storage.Type, setting.Avatar.Storage)
return
}
func initAttachments() (err error) {
log.Info("Initialising Attachment storage with type: %s", setting.Attachment.Storage.Type)
Attachments, err = NewStorage(setting.Attachment.Storage.Type, &setting.Attachment.Storage)
Attachments, err = NewStorage(setting.Attachment.Storage.Type, setting.Attachment.Storage)
return
}
func initLFS() (err error) {
log.Info("Initialising LFS storage with type: %s", setting.LFS.Storage.Type)
LFS, err = NewStorage(setting.LFS.Storage.Type, &setting.LFS.Storage)
LFS, err = NewStorage(setting.LFS.Storage.Type, setting.LFS.Storage)
return
}
func initRepoAvatars() (err error) {
log.Info("Initialising Repository Avatar storage with type: %s", setting.RepoAvatar.Storage.Type)
RepoAvatars, err = NewStorage(setting.RepoAvatar.Storage.Type, &setting.RepoAvatar.Storage)
RepoAvatars, err = NewStorage(setting.RepoAvatar.Storage.Type, setting.RepoAvatar.Storage)
return
}

View File

@@ -20,7 +20,7 @@ import (
"code.gitea.io/gitea/modules/util"
)
func handleCreateError(owner *models.User, err error) error {
func handleCreateError(owner *models.User, err error, name string) error {
switch {
case models.IsErrReachLimitOfRepo(err):
return fmt.Errorf("You have already reached your limit of %d repositories", owner.MaxCreationLimit())
@@ -38,8 +38,8 @@ func handleCreateError(owner *models.User, err error) error {
func runMigrateTask(t *models.Task) (err error) {
defer func() {
if e := recover(); e != nil {
err = fmt.Errorf("PANIC whilst trying to do migrate task: %v", e)
log.Critical("PANIC during runMigrateTask[%d] by DoerID[%d] to RepoID[%d] for OwnerID[%d]: %v\nStacktrace: %v", t.ID, t.DoerID, t.RepoID, t.OwnerID, e, log.Stack(2))
err = fmt.Errorf("PANIC whilst trying to do migrate task: %v\nStacktrace: %v", err, log.Stack(2))
log.Critical("PANIC during runMigrateTask[%d] by DoerID[%d] to RepoID[%d] for OwnerID[%d]: %v", t.ID, t.DoerID, t.RepoID, t.OwnerID, err)
}
if err == nil {
@@ -55,8 +55,7 @@ func runMigrateTask(t *models.Task) (err error) {
t.EndTime = timeutil.TimeStampNow()
t.Status = structs.TaskStatusFailed
t.Errors = err.Error()
t.RepoID = 0
if err := t.UpdateCols("status", "errors", "repo_id", "end_time"); err != nil {
if err := t.UpdateCols("status", "errors", "end_time"); err != nil {
log.Error("Task UpdateCols failed: %v", err)
}
@@ -67,8 +66,8 @@ func runMigrateTask(t *models.Task) (err error) {
}
}()
if err = t.LoadRepo(); err != nil {
return
if err := t.LoadRepo(); err != nil {
return err
}
// if repository is ready, then just finsih the task
@@ -76,35 +75,33 @@ func runMigrateTask(t *models.Task) (err error) {
return nil
}
if err = t.LoadDoer(); err != nil {
return
if err := t.LoadDoer(); err != nil {
return err
}
if err = t.LoadOwner(); err != nil {
return
if err := t.LoadOwner(); err != nil {
return err
}
t.StartTime = timeutil.TimeStampNow()
t.Status = structs.TaskStatusRunning
if err = t.UpdateCols("start_time", "status"); err != nil {
return
if err := t.UpdateCols("start_time", "status"); err != nil {
return err
}
var opts *migration.MigrateOptions
opts, err = t.MigrateConfig()
if err != nil {
return
return err
}
opts.MigrateToRepoID = t.RepoID
var repo *models.Repository
repo, err = migrations.MigrateRepository(graceful.GetManager().HammerContext(), t.Doer, t.Owner.Name, *opts)
repo, err := migrations.MigrateRepository(graceful.GetManager().HammerContext(), t.Doer, t.Owner.Name, *opts)
if err == nil {
log.Trace("Repository migrated [%d]: %s/%s", repo.ID, t.Owner.Name, repo.Name)
return
return nil
}
if models.IsErrRepoAlreadyExist(err) {
err = errors.New("The repository name is already used")
return
return errors.New("The repository name is already used")
}
// remoteAddr may contain credentials, so we sanitize it
@@ -116,7 +113,5 @@ func runMigrateTask(t *models.Task) (err error) {
return fmt.Errorf("Migration failed: %v", err.Error())
}
// do not be tempted to coalesce this line with the return
err = handleCreateError(t.Owner, err)
return
return handleCreateError(t.Owner, err, "MigratePost")
}

View File

@@ -870,11 +870,9 @@ editor.file_already_exists = A file named '%s' already exists in this repository
editor.commit_empty_file_header = Commit an empty file
editor.commit_empty_file_text = The file you're about to commit is empty. Proceed?
editor.no_changes_to_show = There are no changes to show.
editor.fail_to_update_file = Failed to update/create file '%s'.
editor.fail_to_update_file_summary = Error Message:
editor.fail_to_update_file = Failed to update/create file '%s' with error: %v
editor.push_rejected_no_message = The change was rejected by the server without a message. Please check githooks.
editor.push_rejected = The change was rejected by the server. Please check githooks.
editor.push_rejected_summary = Full Rejection Message:
editor.push_rejected = The change was rejected by the server with the following message:<br>%s<br> Please check githooks.
editor.add_subdir = Add a directory…
editor.unable_to_upload_files = Failed to upload files to '%s' with error: %v
editor.upload_file_is_locked = File '%s' is locked by %s.
@@ -1192,7 +1190,6 @@ issues.review.remove_review_request_self = "refused to review %s"
issues.review.pending = Pending
issues.review.review = Review
issues.review.reviewers = Reviewers
issues.review.outdated = Outdated
issues.review.show_outdated = Show outdated
issues.review.hide_outdated = Hide outdated
issues.review.show_resolved = Show resolved
@@ -1261,15 +1258,11 @@ pulls.rebase_merge_commit_pull_request = Rebase and Merge (--no-ff)
pulls.squash_merge_pull_request = Squash and Merge
pulls.require_signed_wont_sign = The branch requires signed commits but this merge will not be signed
pulls.invalid_merge_option = You cannot use this merge option for this pull request.
pulls.merge_conflict = Merge Failed: There was a conflict whilst merging. Hint: Try a different strategy
pulls.merge_conflict_summary = Error Message
pulls.rebase_conflict = Merge Failed: There was a conflict whilst rebasing commit: %[1]s. Hint: Try a different strategy
pulls.rebase_conflict_summary = Error Message
; </summary><code>%[2]s<br>%[3]s</code></details>
pulls.merge_conflict = Merge Failed: There was a conflict whilst merging: %[1]s<br>%[2]s<br>Hint: Try a different strategy
pulls.rebase_conflict = Merge Failed: There was a conflict whilst rebasing commit: %[1]s<br>%[2]s<br>%[3]s<br>Hint:Try a different strategy
pulls.unrelated_histories = Merge Failed: The merge head and base do not share a common history. Hint: Try a different strategy
pulls.merge_out_of_date = Merge Failed: Whilst generating the merge, the base was updated. Hint: Try again.
pulls.push_rejected = Merge Failed: The push was rejected. Review the githooks for this repository.
pulls.push_rejected_summary = Full Rejection Message
pulls.push_rejected = Merge Failed: The push was rejected with the following message:<br>%s<br>Review the githooks for this repository
pulls.push_rejected_no_message = Merge Failed: The push was rejected but there was no remote message.<br>Review the githooks for this repository
pulls.open_unmerged_pull_exists = `You cannot perform a reopen operation because there is a pending pull request (#%d) with identical properties.`
pulls.status_checking = Some checks are pending

View File

@@ -1037,7 +1037,8 @@ issues.close_comment_issue=Commenta e Chiudi
issues.reopen_issue=Riapri
issues.reopen_comment_issue=Commenta e Riapri
issues.create_comment=Commento
issues.closed_at=`chiuso questo probleam <a id="%[1]s" href="#%[1]s">%[2]s</a>`
issues.closed_at="`chiuso questo probleam <a id=\"%[1]s\" href=\"#%[1]s\">%[2]s</a>`
Contextrequest"
issues.reopened_at=`riaperto questo problema <a id="%[1]s" href="#%[1]s">%[2]s</a>`
issues.commit_ref_at=`ha fatto riferimento a questa issue dal commit <a id="%[1]s" href="#%[1]s">%[2]s</a>`
issues.ref_issue_from=`<a href="%[3]s">ha fatto riferimento a questo problema %[4]s</a> <a id="%[1]s" href="#%[1]s">%[2]s</a>`

Binary file not shown.

Before: image, 11 KiB

View File

@@ -13,7 +13,6 @@ import (
"code.gitea.io/gitea/modules/auth"
"code.gitea.io/gitea/modules/auth/ldap"
"code.gitea.io/gitea/modules/auth/oauth2"
"code.gitea.io/gitea/modules/auth/pam"
"code.gitea.io/gitea/modules/base"
"code.gitea.io/gitea/modules/context"
"code.gitea.io/gitea/modules/log"
@@ -58,20 +57,14 @@ type dropdownItem struct {
}
var (
authSources = func() []dropdownItem {
items := []dropdownItem{
{models.LoginNames[models.LoginLDAP], models.LoginLDAP},
{models.LoginNames[models.LoginDLDAP], models.LoginDLDAP},
{models.LoginNames[models.LoginSMTP], models.LoginSMTP},
{models.LoginNames[models.LoginOAuth2], models.LoginOAuth2},
{models.LoginNames[models.LoginSSPI], models.LoginSSPI},
}
if pam.Supported {
items = append(items, dropdownItem{models.LoginNames[models.LoginPAM], models.LoginPAM})
}
return items
}()
authSources = []dropdownItem{
{models.LoginNames[models.LoginLDAP], models.LoginLDAP},
{models.LoginNames[models.LoginDLDAP], models.LoginDLDAP},
{models.LoginNames[models.LoginSMTP], models.LoginSMTP},
{models.LoginNames[models.LoginPAM], models.LoginPAM},
{models.LoginNames[models.LoginOAuth2], models.LoginOAuth2},
{models.LoginNames[models.LoginSSPI], models.LoginSSPI},
}
securityProtocols = []dropdownItem{
{models.SecurityProtocolNames[ldap.SecurityProtocolUnencrypted], ldap.SecurityProtocolUnencrypted},
{models.SecurityProtocolNames[ldap.SecurityProtocolLDAPS], ldap.SecurityProtocolLDAPS},

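The change above turns authSources from a fixed slice into an immediately-invoked function so the PAM entry is only offered when the binary was built with PAM support. A self-contained sketch of the same construction; pamSupported stands in for the pam.Supported build flag and the labels are illustrative only:

package main

import "fmt"

// pamSupported stands in for pam.Supported, which Gitea sets via a build tag;
// here it is a plain constant so the sketch compiles anywhere.
const pamSupported = false

type dropdownItem struct {
	Name string
}

// authSources is built by an immediately-invoked function literal, the same
// pattern as the hunk above, so optional entries can be appended conditionally
// at package initialisation.
var authSources = func() []dropdownItem {
	items := []dropdownItem{
		{"LDAP (via BindDN)"},
		{"LDAP (simple auth)"},
		{"SMTP"},
		{"OAuth2"},
		{"SPNEGO with SSPI"},
	}
	if pamSupported {
		items = append(items, dropdownItem{"PAM"})
	}
	return items
}()

func main() {
	fmt.Println(len(authSources), "authentication sources in the dropdown")
}
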
View File

@@ -56,11 +56,7 @@ func GetIssueCommentReactions(ctx *context.APIContext) {
return
}
if err := comment.LoadIssue(); err != nil {
ctx.Error(http.StatusInternalServerError, "comment.LoadIssue", err)
}
if !ctx.Repo.CanReadIssuesOrPulls(comment.Issue.IsPull) {
if !ctx.Repo.CanRead(models.UnitTypeIssues) {
ctx.Error(http.StatusForbidden, "GetIssueCommentReactions", errors.New("no permission to get reactions"))
return
}
@@ -274,7 +270,7 @@ func GetIssueReactions(ctx *context.APIContext) {
return
}
if !ctx.Repo.CanReadIssuesOrPulls(issue.IsPull) {
if !ctx.Repo.CanRead(models.UnitTypeIssues) {
ctx.Error(http.StatusForbidden, "GetIssueReactions", errors.New("no permission to get reactions"))
return
}
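
The fix above swaps CanRead(models.UnitTypeIssues) for CanReadIssuesOrPulls(comment.Issue.IsPull), so a reaction on a pull-request comment is checked against the pull-request unit rather than the issues unit. A toy sketch of why that matters, with stand-in types instead of Gitea's permission model:

package main

import "fmt"

type unit int

const (
	unitIssues unit = iota
	unitPullRequests
)

// perm is a stand-in for Gitea's repository permission object.
type perm struct {
	readable map[unit]bool
}

func (p perm) canRead(u unit) bool { return p.readable[u] }

// canReadIssuesOrPulls picks the unit from the comment's context, which is the
// behaviour the API handlers switch to in the hunk above.
func (p perm) canReadIssuesOrPulls(isPull bool) bool {
	if isPull {
		return p.canRead(unitPullRequests)
	}
	return p.canRead(unitIssues)
}

func main() {
	// A repository with the issue tracker disabled but pull requests enabled:
	p := perm{readable: map[unit]bool{unitPullRequests: true}}
	fmt.Println(p.canRead(unitIssues))        // false: the old check rejects reactions on PR comments
	fmt.Println(p.canReadIssuesOrPulls(true)) // true: the new check permits them
}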

View File

@@ -284,12 +284,6 @@ func CreatePullRequest(ctx *context.APIContext, form api.CreatePullRequestOption
// "422":
// "$ref": "#/responses/validationError"
if form.Head == form.Base {
ctx.Error(http.StatusUnprocessableEntity, "BaseHeadSame",
"Invalid PullRequest: There are no changes between the head and the base")
return
}
var (
repo = ctx.Repo.Repository
labelIDs []int64

View File

@@ -355,16 +355,7 @@ func CreateBranch(ctx *context.Context, form auth.NewBranchForm) {
if len(e.Message) == 0 {
ctx.Flash.Error(ctx.Tr("repo.editor.push_rejected_no_message"))
} else {
flashError, err := ctx.HTMLString(string(tplAlertDetails), map[string]interface{}{
"Message": ctx.Tr("repo.editor.push_rejected"),
"Summary": ctx.Tr("repo.editor.push_rejected_summary"),
"Details": utils.SanitizeFlashErrorString(e.Message),
})
if err != nil {
ctx.ServerError("UpdatePullRequest.HTMLString", err)
return
}
ctx.Flash.Error(flashError)
ctx.Flash.Error(ctx.Tr("repo.editor.push_rejected", utils.SanitizeFlashErrorString(e.Message)))
}
ctx.Redirect(ctx.Repo.RepoLink + "/src/" + ctx.Repo.BranchNameSubURL())
return
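
Instead of splicing the raw hook output into a single translated string, the new code above renders a small template with a short message, a summary line, and the sanitized details. A simplified, runnable sketch of that idea using html/template; the template text only approximates base/alert_details.tmpl:

package main

import (
	"html/template"
	"os"
)

// alertDetails approximates base/alert_details.tmpl: a short message followed
// by a collapsible <details> block carrying the full server output.
var alertDetails = template.Must(template.New("alert_details").Parse(
	`{{.Message}}
<details>
  <summary>{{.Summary}}</summary>
  <code>{{.Details}}</code>
</details>
`))

func main() {
	// html/template escapes Details automatically, which plays the same role as
	// SanitizeFlashErrorString in the handlers above.
	_ = alertDetails.Execute(os.Stdout, map[string]string{
		"Message": "The change was rejected by the server. Please check githooks.",
		"Summary": "Full Rejection Message:",
		"Details": "remote: commit message must reference an issue",
	})
}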

View File

@@ -293,28 +293,10 @@ func editFilePost(ctx *context.Context, form auth.EditRepoFileForm, isNewFile bo
if len(errPushRej.Message) == 0 {
ctx.RenderWithErr(ctx.Tr("repo.editor.push_rejected_no_message"), tplEditFile, &form)
} else {
flashError, err := ctx.HTMLString(string(tplAlertDetails), map[string]interface{}{
"Message": ctx.Tr("repo.editor.push_rejected"),
"Summary": ctx.Tr("repo.editor.push_rejected_summary"),
"Details": utils.SanitizeFlashErrorString(errPushRej.Message),
})
if err != nil {
ctx.ServerError("editFilePost.HTMLString", err)
return
}
ctx.RenderWithErr(flashError, tplEditFile, &form)
ctx.RenderWithErr(ctx.Tr("repo.editor.push_rejected", utils.SanitizeFlashErrorString(errPushRej.Message)), tplEditFile, &form)
}
} else {
flashError, err := ctx.HTMLString(string(tplAlertDetails), map[string]interface{}{
"Message": ctx.Tr("repo.editor.fail_to_update_file", form.TreePath),
"Summary": ctx.Tr("repo.editor.fail_to_update_file_summary"),
"Details": utils.SanitizeFlashErrorString(err.Error()),
})
if err != nil {
ctx.ServerError("editFilePost.HTMLString", err)
return
}
ctx.RenderWithErr(flashError, tplEditFile, &form)
ctx.RenderWithErr(ctx.Tr("repo.editor.fail_to_update_file", form.TreePath, utils.SanitizeFlashErrorString(err.Error())), tplEditFile, &form)
}
}
@@ -482,16 +464,7 @@ func DeleteFilePost(ctx *context.Context, form auth.DeleteRepoFileForm) {
if len(errPushRej.Message) == 0 {
ctx.RenderWithErr(ctx.Tr("repo.editor.push_rejected_no_message"), tplDeleteFile, &form)
} else {
flashError, err := ctx.HTMLString(string(tplAlertDetails), map[string]interface{}{
"Message": ctx.Tr("repo.editor.push_rejected"),
"Summary": ctx.Tr("repo.editor.push_rejected_summary"),
"Details": utils.SanitizeFlashErrorString(errPushRej.Message),
})
if err != nil {
ctx.ServerError("DeleteFilePost.HTMLString", err)
return
}
ctx.RenderWithErr(flashError, tplDeleteFile, &form)
ctx.RenderWithErr(ctx.Tr("repo.editor.push_rejected", utils.SanitizeFlashErrorString(errPushRej.Message)), tplDeleteFile, &form)
}
} else {
ctx.ServerError("DeleteRepoFile", err)
@@ -683,16 +656,7 @@ func UploadFilePost(ctx *context.Context, form auth.UploadRepoFileForm) {
if len(errPushRej.Message) == 0 {
ctx.RenderWithErr(ctx.Tr("repo.editor.push_rejected_no_message"), tplUploadFile, &form)
} else {
flashError, err := ctx.HTMLString(string(tplAlertDetails), map[string]interface{}{
"Message": ctx.Tr("repo.editor.push_rejected"),
"Summary": ctx.Tr("repo.editor.push_rejected_summary"),
"Details": utils.SanitizeFlashErrorString(errPushRej.Message),
})
if err != nil {
ctx.ServerError("UploadFilePost.HTMLString", err)
return
}
ctx.RenderWithErr(flashError, tplUploadFile, &form)
ctx.RenderWithErr(ctx.Tr("repo.editor.push_rejected", utils.SanitizeFlashErrorString(errPushRej.Message)), tplUploadFile, &form)
}
} else {
// os.ErrNotExist - upload file missing in the intervening time?!

View File

@@ -311,7 +311,7 @@ func PrepareMergedViewPullInfo(ctx *context.Context, issue *models.Issue) *git.C
compareInfo, err := ctx.Repo.GitRepo.GetCompareInfo(ctx.Repo.Repository.RepoPath(),
pull.MergeBase, pull.GetGitRefName())
if err != nil {
if strings.Contains(err.Error(), "fatal: Not a valid object name") || strings.Contains(err.Error(), "unknown revision or path not in the working tree") {
if strings.Contains(err.Error(), "fatal: Not a valid object name") {
ctx.Data["IsPullRequestBroken"] = true
ctx.Data["BaseTarget"] = pull.BaseBranch
ctx.Data["NumCommits"] = 0
@@ -723,16 +723,7 @@ func UpdatePullRequest(ctx *context.Context) {
if err = pull_service.Update(issue.PullRequest, ctx.User, message); err != nil {
if models.IsErrMergeConflicts(err) {
conflictError := err.(models.ErrMergeConflicts)
flashError, err := ctx.HTMLString(string(tplAlertDetails), map[string]interface{}{
"Message": ctx.Tr("repo.pulls.merge_conflict"),
"Summary": ctx.Tr("repo.pulls.merge_conflict_summary"),
"Details": utils.SanitizeFlashErrorString(conflictError.StdErr) + "<br>" + utils.SanitizeFlashErrorString(conflictError.StdOut),
})
if err != nil {
ctx.ServerError("UpdatePullRequest.HTMLString", err)
return
}
ctx.Flash.Error(flashError)
ctx.Flash.Error(ctx.Tr("repo.pulls.merge_conflict", utils.SanitizeFlashErrorString(conflictError.StdErr), utils.SanitizeFlashErrorString(conflictError.StdOut)))
ctx.Redirect(ctx.Repo.RepoLink + "/pulls/" + com.ToStr(issue.Index))
return
}
@@ -855,30 +846,12 @@ func MergePullRequest(ctx *context.Context, form auth.MergePullRequestForm) {
return
} else if models.IsErrMergeConflicts(err) {
conflictError := err.(models.ErrMergeConflicts)
flashError, err := ctx.HTMLString(string(tplAlertDetails), map[string]interface{}{
"Message": ctx.Tr("repo.editor.merge_conflict"),
"Summary": ctx.Tr("repo.editor.merge_conflict_summary"),
"Details": utils.SanitizeFlashErrorString(conflictError.StdErr) + "<br>" + utils.SanitizeFlashErrorString(conflictError.StdOut),
})
if err != nil {
ctx.ServerError("MergePullRequest.HTMLString", err)
return
}
ctx.Flash.Error(flashError)
ctx.Flash.Error(ctx.Tr("repo.pulls.merge_conflict", utils.SanitizeFlashErrorString(conflictError.StdErr), utils.SanitizeFlashErrorString(conflictError.StdOut)))
ctx.Redirect(ctx.Repo.RepoLink + "/pulls/" + com.ToStr(pr.Index))
return
} else if models.IsErrRebaseConflicts(err) {
conflictError := err.(models.ErrRebaseConflicts)
flashError, err := ctx.HTMLString(string(tplAlertDetails), map[string]interface{}{
"Message": ctx.Tr("repo.pulls.rebase_conflict", utils.SanitizeFlashErrorString(conflictError.CommitSHA)),
"Summary": ctx.Tr("repo.pulls.rebase_conflict_summary"),
"Details": utils.SanitizeFlashErrorString(conflictError.StdErr) + "<br>" + utils.SanitizeFlashErrorString(conflictError.StdOut),
})
if err != nil {
ctx.ServerError("MergePullRequest.HTMLString", err)
return
}
ctx.Flash.Error(flashError)
ctx.Flash.Error(ctx.Tr("repo.pulls.rebase_conflict", utils.SanitizeFlashErrorString(conflictError.CommitSHA), utils.SanitizeFlashErrorString(conflictError.StdErr), utils.SanitizeFlashErrorString(conflictError.StdOut)))
ctx.Redirect(ctx.Repo.RepoLink + "/pulls/" + com.ToStr(pr.Index))
return
} else if models.IsErrMergeUnrelatedHistories(err) {
@@ -898,16 +871,7 @@ func MergePullRequest(ctx *context.Context, form auth.MergePullRequestForm) {
if len(message) == 0 {
ctx.Flash.Error(ctx.Tr("repo.pulls.push_rejected_no_message"))
} else {
flashError, err := ctx.HTMLString(string(tplAlertDetails), map[string]interface{}{
"Message": ctx.Tr("repo.pulls.push_rejected"),
"Summary": ctx.Tr("repo.pulls.push_rejected_summary"),
"Details": utils.SanitizeFlashErrorString(pushrejErr.Message),
})
if err != nil {
ctx.ServerError("MergePullRequest.HTMLString", err)
return
}
ctx.Flash.Error(flashError)
ctx.Flash.Error(ctx.Tr("repo.pulls.push_rejected", utils.SanitizeFlashErrorString(pushrejErr.Message)))
}
ctx.Redirect(ctx.Repo.RepoLink + "/pulls/" + com.ToStr(pr.Index))
return
@@ -1022,16 +986,7 @@ func CompareAndPullRequestPost(ctx *context.Context, form auth.CreateIssueForm)
if len(message) == 0 {
ctx.Flash.Error(ctx.Tr("repo.pulls.push_rejected_no_message"))
} else {
flashError, err := ctx.HTMLString(string(tplAlertDetails), map[string]interface{}{
"Message": ctx.Tr("repo.pulls.push_rejected"),
"Summary": ctx.Tr("repo.pulls.push_rejected_summary"),
"Details": utils.SanitizeFlashErrorString(pushrejErr.Message),
})
if err != nil {
ctx.ServerError("CompareAndPullRequest.HTMLString", err)
return
}
ctx.Flash.Error(flashError)
ctx.Flash.Error(ctx.Tr("repo.pulls.push_rejected", utils.SanitizeFlashErrorString(pushrejErr.Message)))
}
ctx.Redirect(ctx.Repo.RepoLink + "/pulls/" + com.ToStr(pullIssue.Index))
return

View File

@@ -24,8 +24,7 @@ import (
)
const (
tplCreate base.TplName = "repo/create"
tplAlertDetails base.TplName = "base/alert_details"
tplCreate base.TplName = "repo/create"
)
// MustBeNotEmpty render when a repo is a empty git dir
@@ -402,3 +401,19 @@ func Download(ctx *context.Context) {
ctx.ServeFile(archivePath, ctx.Repo.Repository.Name+"-"+refName+ext)
}
// Status returns repository's status
func Status(ctx *context.Context) {
task, err := models.GetMigratingTask(ctx.Repo.Repository.ID)
if err != nil {
ctx.JSON(500, map[string]interface{}{
"err": err,
})
return
}
ctx.JSON(200, map[string]interface{}{
"status": ctx.Repo.Repository.Status,
"err": task.Errors,
})
}
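
The new Status handler above exposes the migration state of a repository as JSON at /{owner}/{repo}/status. A hedged sketch of a client polling it; the host name is a placeholder and the field types are assumptions read off the handler body:

package main

import (
	"encoding/json"
	"fmt"
	"net/http"
)

// migrateStatus mirrors the JSON written by the Status handler above; the field
// types are assumptions based on that handler, not a published API type.
type migrateStatus struct {
	Status int    `json:"status"`
	Err    string `json:"err"`
}

func main() {
	// gitea.example.com, owner and repo are placeholders.
	resp, err := http.Get("https://gitea.example.com/owner/repo/status")
	if err != nil {
		fmt.Println("request failed:", err)
		return
	}
	defer resp.Body.Close()

	var s migrateStatus
	if err := json.NewDecoder(resp.Body).Decode(&s); err != nil {
		fmt.Println("decode failed:", err)
		return
	}
	fmt.Printf("repository status=%d migration error=%q\n", s.Status, s.Err)
}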

View File

@@ -7,11 +7,8 @@ package routes
import (
"bytes"
"encoding/gob"
"errors"
"fmt"
"io"
"net/http"
"os"
"path"
"strings"
"text/template"
@@ -128,13 +125,7 @@ func storageHandler(storageSetting setting.Storage, prefix string, objStore stor
rPath := strings.TrimPrefix(req.RequestURI, "/"+prefix)
u, err := objStore.URL(rPath, path.Base(rPath))
if err != nil {
if os.IsNotExist(err) || errors.Is(err, os.ErrNotExist) {
log.Warn("Unable to find %s %s", prefix, rPath)
ctx.Error(404, "file not found")
return
}
log.Error("Error whilst getting URL for %s %s. Error: %v", prefix, rPath, err)
ctx.Error(500, fmt.Sprintf("Error whilst getting URL for %s %s", prefix, rPath))
ctx.Error(500, err.Error())
return
}
http.Redirect(
@@ -161,13 +152,7 @@ func storageHandler(storageSetting setting.Storage, prefix string, objStore stor
//If we have matched and access to release or issue
fr, err := objStore.Open(rPath)
if err != nil {
if os.IsNotExist(err) || errors.Is(err, os.ErrNotExist) {
log.Warn("Unable to find %s %s", prefix, rPath)
ctx.Error(404, "file not found")
return
}
log.Error("Error whilst opening %s %s. Error: %v", prefix, rPath, err)
ctx.Error(500, fmt.Sprintf("Error whilst opening %s %s", prefix, rPath))
ctx.Error(500, err.Error())
return
}
defer fr.Close()
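
The storageHandler changes above stop returning the raw error text with a blanket 500: a missing object becomes a 404, anything else is logged server-side while the client gets a generic message. A standard-library sketch of that classification, independent of Gitea's macaron context:

package main

import (
	"errors"
	"fmt"
	"log"
	"net/http"
	"os"
)

// writeStorageError applies the same policy as the hunk above: "not found" maps
// to 404, everything else is logged and answered with a generic 500 so the raw
// error text is not leaked to the client.
func writeStorageError(w http.ResponseWriter, prefix, rPath string, err error) {
	if os.IsNotExist(err) || errors.Is(err, os.ErrNotExist) {
		log.Printf("Unable to find %s %s", prefix, rPath)
		http.Error(w, "file not found", http.StatusNotFound)
		return
	}
	log.Printf("Error whilst opening %s %s. Error: %v", prefix, rPath, err)
	http.Error(w, fmt.Sprintf("Error whilst opening %s %s", prefix, rPath), http.StatusInternalServerError)
}

func main() {
	http.HandleFunc("/avatars/", func(w http.ResponseWriter, r *http.Request) {
		f, err := os.Open("." + r.URL.Path) // stand-in for objStore.Open
		if err != nil {
			writeStorageError(w, "avatars", r.URL.Path, err)
			return
		}
		defer f.Close()
		_, _ = fmt.Fprintln(w, "found", f.Name())
	})
	log.Fatal(http.ListenAndServe("127.0.0.1:8080", nil))
}
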
@@ -479,7 +464,6 @@ func RegisterRoutes(m *macaron.Macaron) {
m.Get("/forgot_password", user.ForgotPasswd)
m.Post("/forgot_password", user.ForgotPasswdPost)
m.Post("/logout", user.SignOut)
m.Get("/task/:task", user.TaskStatus)
})
// ***** END: User *****
@@ -987,6 +971,8 @@ func RegisterRoutes(m *macaron.Macaron) {
m.Get("/archive/*", repo.MustBeNotEmpty, reqRepoCodeReader, repo.Download)
m.Get("/status", reqRepoCodeReader, repo.Status)
m.Group("/branches", func() {
m.Get("", repo.Branches)
}, repo.MustBeNotEmpty, context.RepoRef(), reqRepoCodeReader)

View File

@@ -91,6 +91,7 @@ func ProfilePost(ctx *context.Context, form auth.UpdateProfileForm) {
}
ctx.User.FullName = form.FullName
ctx.User.Email = form.Email
ctx.User.KeepEmailPrivate = form.KeepEmailPrivate
ctx.User.Website = form.Website
ctx.User.Location = form.Location
@@ -120,11 +121,7 @@ func ProfilePost(ctx *context.Context, form auth.UpdateProfileForm) {
func UpdateAvatarSetting(ctx *context.Context, form auth.AvatarForm, ctxUser *models.User) error {
ctxUser.UseCustomAvatar = form.Source == auth.AvatarLocal
if len(form.Gravatar) > 0 {
if form.Avatar != nil {
ctxUser.Avatar = base.EncodeMD5(form.Gravatar)
} else {
ctxUser.Avatar = ""
}
ctxUser.Avatar = base.EncodeMD5(form.Gravatar)
ctxUser.AvatarEmail = form.Gravatar
}
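
For context on the Gravatar branch above: base.EncodeMD5 turns the submitted address into the hash that Gravatar-style avatar URLs are keyed on. A standalone sketch of that lookup key; the normalisation shown follows Gravatar's documented convention rather than Gitea's helper:

package main

import (
	"crypto/md5"
	"fmt"
	"strings"
)

// gravatarHash computes the identifier Gravatar expects: the MD5 hex digest of
// the trimmed, lower-cased e-mail address.
func gravatarHash(email string) string {
	normalized := strings.ToLower(strings.TrimSpace(email))
	return fmt.Sprintf("%x", md5.Sum([]byte(normalized)))
}

func main() {
	hash := gravatarHash(" User@Example.com ")
	fmt.Println("https://www.gravatar.com/avatar/" + hash)
}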

View File

@@ -1,30 +0,0 @@
// Copyright 2020 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.
package user
import (
"code.gitea.io/gitea/models"
"code.gitea.io/gitea/modules/context"
)
// TaskStatus returns task's status
func TaskStatus(ctx *context.Context) {
task, opts, err := models.GetMigratingTaskByID(ctx.ParamsInt64("task"), ctx.User.ID)
if err != nil {
ctx.JSON(500, map[string]interface{}{
"err": err,
})
return
}
ctx.JSON(200, map[string]interface{}{
"status": task.Status,
"err": task.Errors,
"repo-id": task.RepoID,
"repo-name": opts.RepoName,
"start": task.StartTime,
"end": task.EndTime,
})
}

View File

@@ -41,6 +41,12 @@ func IsValidSlackChannel(channelName string) bool {
// SanitizeFlashErrorString will sanitize a flash error string
func SanitizeFlashErrorString(x string) string {
runes := []rune(x)
if len(runes) > 512 {
x = "..." + string(runes[len(runes)-512:])
}
return strings.ReplaceAll(html.EscapeString(x), "\n", "<br>")
}
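
SanitizeFlashErrorString, extended above to cap the message at the last 512 runes, is what keeps arbitrary hook output safe to embed in a flash message. A standalone copy with a small usage example; the sample input is invented:

package main

import (
	"fmt"
	"html"
	"strings"
)

// sanitizeFlashErrorString is a standalone copy of the helper above: keep only
// the last 512 runes of very long messages, escape HTML, and turn newlines into
// <br> so the text can be shown inside a flash alert.
func sanitizeFlashErrorString(x string) string {
	runes := []rune(x)
	if len(runes) > 512 {
		x = "..." + string(runes[len(runes)-512:])
	}
	return strings.ReplaceAll(html.EscapeString(x), "\n", "<br>")
}

func main() {
	msg := "remote: rejected\nremote: hook declined: <script>alert(1)</script>"
	fmt.Println(sanitizeFlashErrorString(msg))
	// remote: rejected<br>remote: hook declined: &lt;script&gt;alert(1)&lt;/script&gt;
}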

View File

@@ -181,87 +181,64 @@ var (
removedCodePrefix = []byte(`<span class="removed-code">`)
codeTagSuffix = []byte(`</span>`)
)
var trailingSpanRegex = regexp.MustCompile(`<span\s*[[:alpha:]="]*?[>]?$`)
var entityRegex = regexp.MustCompile(`&[#]*?[0-9[:alpha:]]*$`)
// shouldWriteInline represents combinations where we manually write inline changes
func shouldWriteInline(diff diffmatchpatch.Diff, lineType DiffLineType) bool {
if true &&
diff.Type == diffmatchpatch.DiffEqual ||
diff.Type == diffmatchpatch.DiffInsert && lineType == DiffLineAdd ||
diff.Type == diffmatchpatch.DiffDelete && lineType == DiffLineDel {
return true
}
return false
}
var addSpanRegex = regexp.MustCompile(`<span [class="[a-z]*]*$`)
func diffToHTML(fileName string, diffs []diffmatchpatch.Diff, lineType DiffLineType) template.HTML {
buf := bytes.NewBuffer(nil)
match := ""
for _, diff := range diffs {
if shouldWriteInline(diff, lineType) {
if len(match) > 0 {
diff.Text = match + diff.Text
match = ""
}
// Chroma HTML syntax highlighting is done before diffing individual lines in order to maintain consistency.
// Since inline changes might split in the middle of a chroma span tag or an HTML entity, we manually put it back together
// before writing so we don't try to insert added/removed code spans in the middle of one of those
// and create broken HTML. This is done by moving incomplete HTML forward until it no longer matches our pattern of
// a line ending with an incomplete HTML entity or partial/opening <span>.
// EX:
// diffs[{Type: dmp.DiffDelete, Text: "language</span><span "},
// {Type: dmp.DiffEqual, Text: "c"},
// {Type: dmp.DiffDelete, Text: "lass="p">}]
// After first iteration
// diffs[{Type: dmp.DiffDelete, Text: "language</span>"}, //write out
// {Type: dmp.DiffEqual, Text: "<span c"},
// {Type: dmp.DiffDelete, Text: "lass="p">,</span>}]
// After second iteration
// {Type: dmp.DiffEqual, Text: ""}, // write out
// {Type: dmp.DiffDelete, Text: "<span class="p">,</span>}]
// Final
// {Type: dmp.DiffDelete, Text: "<span class="p">,</span>}]
// end up writing <span class="removed-code"><span class="p">,</span></span>
// Instead of <span class="removed-code">lass="p",</span></span>
m := trailingSpanRegex.FindStringSubmatchIndex(diff.Text)
if m != nil {
match = diff.Text[m[0]:m[1]]
diff.Text = strings.TrimSuffix(diff.Text, match)
}
m = entityRegex.FindStringSubmatchIndex(diff.Text)
if m != nil {
match = diff.Text[m[0]:m[1]]
diff.Text = strings.TrimSuffix(diff.Text, match)
}
// Print an existing closing span first before opening added/remove-code span so it doesn't unintentionally close it
if strings.HasPrefix(diff.Text, "</span>") {
buf.WriteString("</span>")
diff.Text = strings.TrimPrefix(diff.Text, "</span>")
}
// If we weren't able to fix it then this should avoid broken HTML by not inserting more spans below
// The previous/next diff section will contain the rest of the tag that is missing here
if strings.Count(diff.Text, "<") != strings.Count(diff.Text, ">") {
buf.WriteString(diff.Text)
continue
}
}
var addSpan string
for i := range diffs {
switch {
case diff.Type == diffmatchpatch.DiffEqual:
buf.WriteString(diff.Text)
case diff.Type == diffmatchpatch.DiffInsert && lineType == DiffLineAdd:
case diffs[i].Type == diffmatchpatch.DiffEqual:
// Looking for the case where our 3rd party diff library previously detected a string difference
// in the middle of a span class because we highlight them first. This happens when added/deleted code
// also changes the chroma class name, either partially or fully. If found, just move the opening span code forward into the next section
// see TestDiffToHTML for examples
if len(addSpan) > 0 {
diffs[i].Text = addSpan + diffs[i].Text
addSpan = ""
}
m := addSpanRegex.FindStringSubmatchIndex(diffs[i].Text)
if m != nil {
addSpan = diffs[i].Text[m[0]:m[1]]
buf.WriteString(strings.TrimSuffix(diffs[i].Text, addSpan))
} else {
addSpan = ""
buf.WriteString(getLineContent(diffs[i].Text))
}
case diffs[i].Type == diffmatchpatch.DiffInsert && lineType == DiffLineAdd:
if len(addSpan) > 0 {
diffs[i].Text = addSpan + diffs[i].Text
addSpan = ""
}
// Print existing closing span first before opening added-code span so it doesn't unintentionally close it
if strings.HasPrefix(diffs[i].Text, "</span>") {
buf.WriteString("</span>")
diffs[i].Text = strings.TrimPrefix(diffs[i].Text, "</span>")
}
m := addSpanRegex.FindStringSubmatchIndex(diffs[i].Text)
if m != nil {
addSpan = diffs[i].Text[m[0]:m[1]]
diffs[i].Text = strings.TrimSuffix(diffs[i].Text, addSpan)
}
buf.Write(addedCodePrefix)
buf.WriteString(diff.Text)
buf.WriteString(getLineContent(diffs[i].Text))
buf.Write(codeTagSuffix)
case diff.Type == diffmatchpatch.DiffDelete && lineType == DiffLineDel:
case diffs[i].Type == diffmatchpatch.DiffDelete && lineType == DiffLineDel:
if len(addSpan) > 0 {
diffs[i].Text = addSpan + diffs[i].Text
addSpan = ""
}
if strings.HasPrefix(diffs[i].Text, "</span>") {
buf.WriteString("</span>")
diffs[i].Text = strings.TrimPrefix(diffs[i].Text, "</span>")
}
m := addSpanRegex.FindStringSubmatchIndex(diffs[i].Text)
if m != nil {
addSpan = diffs[i].Text[m[0]:m[1]]
diffs[i].Text = strings.TrimSuffix(diffs[i].Text, addSpan)
}
buf.Write(removedCodePrefix)
buf.WriteString(diff.Text)
buf.WriteString(getLineContent(diffs[i].Text))
buf.Write(codeTagSuffix)
}
}
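
The key to the new diffToHTML above is detecting when a diff chunk ends in the middle of a chroma <span> tag or an HTML entity, so the incomplete piece can be carried forward into the next chunk. A small runnable check of what the two new regular expressions match; the sample strings follow the spirit of the test cases added further down:

package main

import (
	"fmt"
	"regexp"
)

// The two patterns added above: one catches a trailing partial/opening <span>,
// the other a trailing partial HTML entity.
var (
	trailingSpanRegex = regexp.MustCompile(`<span\s*[[:alpha:]="]*?[>]?$`)
	entityRegex       = regexp.MustCompile(`&[#]*?[0-9[:alpha:]]*$`)
)

func main() {
	// A deletion chunk that stops in the middle of an opening span tag:
	fmt.Printf("%q\n", trailingSpanRegex.FindString(`language</span><span `)) // "<span "
	// An addition chunk that stops in the middle of the entity &#34;:
	fmt.Printf("%q\n", entityRegex.FindString(`sh &#3`)) // "&#3"
	// A chunk that ends cleanly matches neither:
	fmt.Printf("%q\n", trailingSpanRegex.FindString(`<span class="p">,</span>`)) // ""
}
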
@@ -356,9 +333,6 @@ func (diffSection *DiffSection) GetComputedInlineDiffFor(diffLine *DiffLine) tem
diffRecord := diffMatchPatch.DiffMain(highlight.Code(diffSection.FileName, diff1[1:]), highlight.Code(diffSection.FileName, diff2[1:]), true)
diffRecord = diffMatchPatch.DiffCleanupEfficiency(diffRecord)
diffRecord = diffMatchPatch.DiffCleanupEfficiency(diffRecord)
return diffToHTML(diffSection.FileName, diffRecord, diffLine.Type)
}
@@ -467,254 +441,91 @@ func (diff *Diff) LoadComments(issue *models.Issue, currentUser *models.User) er
const cmdDiffHead = "diff --git "
// ParsePatch builds a Diff object from a io.Reader and some parameters.
// ParsePatch builds a Diff object from a io.Reader and some
// parameters.
// TODO: move this function to gogits/git-module
func ParsePatch(maxLines, maxLineCharacters, maxFiles int, reader io.Reader) (*Diff, error) {
var curFile *DiffFile
diff := &Diff{Files: make([]*DiffFile, 0)}
sb := strings.Builder{}
// OK let's set a reasonable buffer size.
// This should be let's say at least the size of maxLineCharacters or 4096 whichever is larger.
readerSize := maxLineCharacters
if readerSize < 4096 {
readerSize = 4096
}
input := bufio.NewReaderSize(reader, readerSize)
line, err := input.ReadString('\n')
if err != nil {
if err == io.EOF {
return diff, nil
}
return diff, err
}
parsingLoop:
for {
// 1. A patch file always begins with `diff --git ` + `a/path b/path` (possibly quoted)
// if it does not we have bad input!
if !strings.HasPrefix(line, cmdDiffHead) {
return diff, fmt.Errorf("Invalid first file line: %s", line)
}
// TODO: Handle skipping first n files
if len(diff.Files) >= maxFiles {
diff.IsIncomplete = true
_, err := io.Copy(ioutil.Discard, reader)
if err != nil {
// By the definition of io.Copy this never returns io.EOF
return diff, fmt.Errorf("Copy: %v", err)
}
break parsingLoop
}
curFile = createDiffFile(diff, line)
diff.Files = append(diff.Files, curFile)
// 2. It is followed by one or more extended header lines:
//
// old mode <mode>
// new mode <mode>
// deleted file mode <mode>
// new file mode <mode>
// copy from <path>
// copy to <path>
// rename from <path>
// rename to <path>
// similarity index <number>
// dissimilarity index <number>
// index <hash>..<hash> <mode>
//
// * <mode> 6-digit octal numbers including the file type and file permission bits.
// * <path> does not include the a/ and b/ prefixes
// * <number> percentage of unchanged lines for similarity, percentage of changed
// lines dissimilarity as integer rounded down with terminal %. 100% => equal files.
// * The index line includes the blob object names before and after the change.
// The <mode> is included if the file mode does not change; otherwise, separate
// lines indicate the old and the new mode.
// 3. Following this header the "standard unified" diff format header may be encountered: (but not for every case...)
//
// --- a/<path>
// +++ b/<path>
//
// With multiple hunks
//
// @@ <hunk descriptor> @@
// +added line
// -removed line
// unchanged line
//
// 4. Binary files get:
//
// Binary files a/<path> and b/<path> differ
//
// but one of a/<path> and b/<path> could be /dev/null.
curFileLoop:
for {
line, err = input.ReadString('\n')
if err != nil {
if err != io.EOF {
return diff, err
}
break parsingLoop
}
switch {
case strings.HasPrefix(line, cmdDiffHead):
break curFileLoop
case strings.HasPrefix(line, "old mode ") ||
strings.HasPrefix(line, "new mode "):
if strings.HasSuffix(line, " 160000\n") {
curFile.IsSubmodule = true
}
case strings.HasPrefix(line, "copy from "):
curFile.IsRenamed = true
curFile.Type = DiffFileCopy
case strings.HasPrefix(line, "copy to "):
curFile.IsRenamed = true
curFile.Type = DiffFileCopy
case strings.HasPrefix(line, "new file"):
curFile.Type = DiffFileAdd
curFile.IsCreated = true
if strings.HasSuffix(line, " 160000\n") {
curFile.IsSubmodule = true
}
case strings.HasPrefix(line, "deleted"):
curFile.Type = DiffFileDel
curFile.IsDeleted = true
if strings.HasSuffix(line, " 160000\n") {
curFile.IsSubmodule = true
}
case strings.HasPrefix(line, "index"):
if strings.HasSuffix(line, " 160000\n") {
curFile.IsSubmodule = true
}
case strings.HasPrefix(line, "similarity index 100%"):
curFile.Type = DiffFileRename
case strings.HasPrefix(line, "Binary"):
curFile.IsBin = true
case strings.HasPrefix(line, "--- "):
// Do nothing with this line
case strings.HasPrefix(line, "+++ "):
// Do nothing with this line
lineBytes, isFragment, err := parseHunks(curFile, maxLines, maxLineCharacters, input)
diff.TotalAddition += curFile.Addition
diff.TotalDeletion += curFile.Deletion
if err != nil {
if err != io.EOF {
return diff, err
}
break parsingLoop
}
sb.Reset()
_, _ = sb.Write(lineBytes)
for isFragment {
lineBytes, isFragment, err = input.ReadLine()
if err != nil {
// Now by the definition of ReadLine this cannot be io.EOF
return diff, fmt.Errorf("Unable to ReadLine: %v", err)
}
_, _ = sb.Write(lineBytes)
}
line = sb.String()
sb.Reset()
break curFileLoop
}
}
}
// FIXME: There are numerous issues with this:
// - we might want to consider detecting encoding while parsing but...
// - we're likely to fail to get the correct encoding here anyway as we won't have enough information
// - and this doesn't really account for changes in encoding
var buf bytes.Buffer
for _, f := range diff.Files {
buf.Reset()
for _, sec := range f.Sections {
for _, l := range sec.Lines {
if l.Type == DiffLineSection {
continue
}
buf.WriteString(l.Content[1:])
buf.WriteString("\n")
}
}
charsetLabel, err := charset.DetectEncoding(buf.Bytes())
if charsetLabel != "UTF-8" && err == nil {
encoding, _ := stdcharset.Lookup(charsetLabel)
if encoding != nil {
d := encoding.NewDecoder()
for _, sec := range f.Sections {
for _, l := range sec.Lines {
if l.Type == DiffLineSection {
continue
}
if c, _, err := transform.String(d, l.Content[1:]); err == nil {
l.Content = l.Content[0:1] + c
}
}
}
}
}
}
diff.NumFiles = len(diff.Files)
return diff, nil
}
func parseHunks(curFile *DiffFile, maxLines, maxLineCharacters int, input *bufio.Reader) (lineBytes []byte, isFragment bool, err error) {
sb := strings.Builder{}
var (
curSection *DiffSection
curFileLinesCount int
curFileLFSPrefix bool
diff = &Diff{Files: make([]*DiffFile, 0)}
curFile = &DiffFile{}
curSection = &DiffSection{
Lines: make([]*DiffLine, 0, 10),
}
leftLine, rightLine int
lineCount int
curFileLinesCount int
curFileLFSPrefix bool
)
leftLine, rightLine := 1, 1
for {
sb.Reset()
lineBytes, isFragment, err = input.ReadLine()
if err != nil {
if err == io.EOF {
return
}
err = fmt.Errorf("Unable to ReadLine: %v", err)
return
}
if lineBytes[0] == 'd' {
// End of hunks
return
}
switch lineBytes[0] {
case '@':
if curFileLinesCount >= maxLines {
curFile.IsIncomplete = true
continue
}
_, _ = sb.Write(lineBytes)
for isFragment {
// This is very odd indeed - we're in a section header and the line is too long
// This really shouldn't happen...
lineBytes, isFragment, err = input.ReadLine()
if err != nil {
// Now by the definition of ReadLine this cannot be io.EOF
err = fmt.Errorf("Unable to ReadLine: %v", err)
return
input := bufio.NewReader(reader)
isEOF := false
for !isEOF {
var linebuf bytes.Buffer
for {
b, err := input.ReadByte()
if err != nil {
if err == io.EOF {
isEOF = true
break
} else {
return nil, fmt.Errorf("ReadByte: %v", err)
}
_, _ = sb.Write(lineBytes)
}
line := sb.String()
if b == '\n' {
break
}
if linebuf.Len() < maxLineCharacters {
linebuf.WriteByte(b)
} else if linebuf.Len() == maxLineCharacters {
curFile.IsIncomplete = true
}
}
line := linebuf.String()
// Create a new section to represent this hunk
if strings.HasPrefix(line, "+++ ") || strings.HasPrefix(line, "--- ") || len(line) == 0 {
continue
}
trimLine := strings.Trim(line, "+- ")
if trimLine == models.LFSMetaFileIdentifier {
curFileLFSPrefix = true
}
if curFileLFSPrefix && strings.HasPrefix(trimLine, models.LFSMetaFileOidPrefix) {
oid := strings.TrimPrefix(trimLine, models.LFSMetaFileOidPrefix)
if len(oid) == 64 {
m := &models.LFSMetaObject{Oid: oid}
count, err := models.Count(m)
if err == nil && count > 0 {
curFile.IsBin = true
curFile.IsLFSFile = true
curSection.Lines = nil
}
}
}
curFileLinesCount++
lineCount++
// Diff data too large, we only show the first about maxLines lines
if curFileLinesCount >= maxLines {
curFile.IsIncomplete = true
}
switch {
case line[0] == ' ':
diffLine := &DiffLine{Type: DiffLinePlain, Content: line, LeftIdx: leftLine, RightIdx: rightLine}
leftLine++
rightLine++
curSection.Lines = append(curSection.Lines, diffLine)
curSection.FileName = curFile.Name
continue
case line[0] == '@':
curSection = &DiffSection{}
curFile.Sections = append(curFile.Sections, curSection)
lineSectionInfo := getDiffLineSectionInfo(curFile.Name, line, leftLine-1, rightLine-1)
diffLine := &DiffLine{
Type: DiffLineSection,
@@ -727,132 +538,148 @@ func parseHunks(curFile *DiffFile, maxLines, maxLineCharacters int, input *bufio
leftLine = lineSectionInfo.LeftIdx
rightLine = lineSectionInfo.RightIdx
continue
case '\\':
if curFileLinesCount >= maxLines {
curFile.IsIncomplete = true
continue
}
// This is used only to indicate that the current file does not have a terminal newline
if !bytes.Equal(lineBytes, []byte("\\ No newline at end of file")) {
err = fmt.Errorf("Unexpected line in hunk: %s", string(lineBytes))
return
}
// Technically this should be the end of the file!
// FIXME: we should be putting a marker at the end of the file if there is no terminal new line
continue
case '+':
curFileLinesCount++
case line[0] == '+':
curFile.Addition++
if curFileLinesCount >= maxLines {
curFile.IsIncomplete = true
continue
}
diffLine := &DiffLine{Type: DiffLineAdd, RightIdx: rightLine}
diff.TotalAddition++
diffLine := &DiffLine{Type: DiffLineAdd, Content: line, RightIdx: rightLine}
rightLine++
curSection.Lines = append(curSection.Lines, diffLine)
case '-':
curFileLinesCount++
curSection.FileName = curFile.Name
continue
case line[0] == '-':
curFile.Deletion++
if curFileLinesCount >= maxLines {
curFile.IsIncomplete = true
continue
}
diffLine := &DiffLine{Type: DiffLineDel, LeftIdx: leftLine}
diff.TotalDeletion++
diffLine := &DiffLine{Type: DiffLineDel, Content: line, LeftIdx: leftLine}
if leftLine > 0 {
leftLine++
}
curSection.Lines = append(curSection.Lines, diffLine)
case ' ':
curFileLinesCount++
if curFileLinesCount >= maxLines {
curFile.IsIncomplete = true
continue
}
diffLine := &DiffLine{Type: DiffLinePlain, LeftIdx: leftLine, RightIdx: rightLine}
leftLine++
rightLine++
curSection.Lines = append(curSection.Lines, diffLine)
default:
// This is unexpected
err = fmt.Errorf("Unexpected line in hunk: %s", string(lineBytes))
return
curSection.FileName = curFile.Name
case strings.HasPrefix(line, "Binary"):
curFile.IsBin = true
continue
}
line := string(lineBytes)
if isFragment {
curFile.IsIncomplete = true
for isFragment {
lineBytes, isFragment, err = input.ReadLine()
// Get new file.
if strings.HasPrefix(line, cmdDiffHead) {
if len(diff.Files) >= maxFiles {
diff.IsIncomplete = true
_, err := io.Copy(ioutil.Discard, reader)
if err != nil {
// Now by the definition of ReadLine this cannot be io.EOF
err = fmt.Errorf("Unable to ReadLine: %v", err)
return
return nil, fmt.Errorf("Copy: %v", err)
}
break
}
}
curSection.Lines[len(curSection.Lines)-1].Content = line
// handle LFS
if line[1:] == models.LFSMetaFileIdentifier {
curFileLFSPrefix = true
} else if curFileLFSPrefix && strings.HasPrefix(line[1:], models.LFSMetaFileOidPrefix) {
oid := strings.TrimPrefix(line[1:], models.LFSMetaFileOidPrefix)
if len(oid) == 64 {
m := &models.LFSMetaObject{Oid: oid}
count, err := models.Count(m)
// Note: In case file name is surrounded by double quotes (it happens only in git-shell).
// e.g. diff --git "a/xxx" "b/xxx"
var a string
var b string
if err == nil && count > 0 {
curFile.IsBin = true
curFile.IsLFSFile = true
curSection.Lines = nil
rd := strings.NewReader(line[len(cmdDiffHead):])
char, _ := rd.ReadByte()
_ = rd.UnreadByte()
if char == '"' {
fmt.Fscanf(rd, "%q ", &a)
if a[0] == '\\' {
a = a[1:]
}
} else {
fmt.Fscanf(rd, "%s ", &a)
}
char, _ = rd.ReadByte()
_ = rd.UnreadByte()
if char == '"' {
fmt.Fscanf(rd, "%q", &b)
if b[0] == '\\' {
b = b[1:]
}
} else {
fmt.Fscanf(rd, "%s", &b)
}
a = a[2:]
b = b[2:]
curFile = &DiffFile{
Name: b,
OldName: a,
Index: len(diff.Files) + 1,
Type: DiffFileChange,
Sections: make([]*DiffSection, 0, 10),
IsRenamed: a != b,
}
diff.Files = append(diff.Files, curFile)
curFileLinesCount = 0
leftLine = 1
rightLine = 1
curFileLFSPrefix = false
// Check file diff type and is submodule.
for {
line, err := input.ReadString('\n')
if err != nil {
if err == io.EOF {
isEOF = true
} else {
return nil, fmt.Errorf("ReadString: %v", err)
}
}
switch {
case strings.HasPrefix(line, "copy from "):
curFile.IsRenamed = true
curFile.Type = DiffFileCopy
case strings.HasPrefix(line, "copy to "):
curFile.IsRenamed = true
curFile.Type = DiffFileCopy
case strings.HasPrefix(line, "new file"):
curFile.Type = DiffFileAdd
curFile.IsCreated = true
case strings.HasPrefix(line, "deleted"):
curFile.Type = DiffFileDel
curFile.IsDeleted = true
case strings.HasPrefix(line, "index"):
curFile.Type = DiffFileChange
case strings.HasPrefix(line, "similarity index 100%"):
curFile.Type = DiffFileRename
}
if curFile.Type > 0 {
if strings.HasSuffix(line, " 160000\n") {
curFile.IsSubmodule = true
}
break
}
}
}
}
}
func createDiffFile(diff *Diff, line string) *DiffFile {
// The a/ and b/ filenames are the same unless rename/copy is involved.
// Especially, even for a creation or a deletion, /dev/null is not used
// in place of the a/ or b/ filenames.
//
// When rename/copy is involved, file1 and file2 show the name of the
// source file of the rename/copy and the name of the file that rename/copy
// produces, respectively.
//
// Path names are quoted if necessary.
//
// This means that you should always be able to determine the file name even when
// there is potential ambiguity...
//
// but we can be simpler with our heuristics by just forcing git to prefix things nicely
curFile := &DiffFile{
Index: len(diff.Files) + 1,
Type: DiffFileChange,
Sections: make([]*DiffSection, 0, 10),
}
rd := strings.NewReader(line[len(cmdDiffHead):] + " ")
curFile.Type = DiffFileChange
curFile.OldName = readFileName(rd)
curFile.Name = readFileName(rd)
curFile.IsRenamed = curFile.Name != curFile.OldName
return curFile
}
func readFileName(rd *strings.Reader) string {
var name string
char, _ := rd.ReadByte()
_ = rd.UnreadByte()
if char == '"' {
fmt.Fscanf(rd, "%q ", &name)
if name[0] == '\\' {
name = name[1:]
// FIXME: detect encoding while parsing.
var buf bytes.Buffer
for _, f := range diff.Files {
buf.Reset()
for _, sec := range f.Sections {
for _, l := range sec.Lines {
buf.WriteString(l.Content)
buf.WriteString("\n")
}
}
charsetLabel, err := charset.DetectEncoding(buf.Bytes())
if charsetLabel != "UTF-8" && err == nil {
encoding, _ := stdcharset.Lookup(charsetLabel)
if encoding != nil {
d := encoding.NewDecoder()
for _, sec := range f.Sections {
for _, l := range sec.Lines {
if c, _, err := transform.String(d, l.Content); err == nil {
l.Content = c
}
}
}
}
}
} else {
fmt.Fscanf(rd, "%s ", &name)
}
return name[2:]
diff.NumFiles = len(diff.Files)
return diff, nil
}
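
createDiffFile and readFileName above recover the old and new paths from the `diff --git ` header, including the case where git quotes a path that contains spaces or special characters. A trimmed, runnable sketch of that parsing; error handling and the escaped-quote corner cases of the real code are omitted:

package main

import (
	"fmt"
	"strings"
)

// readFileName mirrors the helper above: peek at the first byte to decide
// whether the path is a quoted Go-style string, scan it accordingly, and strip
// the leading "a/" or "b/" prefix.
func readFileName(rd *strings.Reader) string {
	var name string
	char, _ := rd.ReadByte()
	_ = rd.UnreadByte()
	if char == '"' {
		fmt.Fscanf(rd, "%q ", &name)
	} else {
		fmt.Fscanf(rd, "%s ", &name)
	}
	return name[2:]
}

func main() {
	const cmdDiffHead = "diff --git "
	line := `diff --git "a/dir with space/file" "b/dir with space/file"`
	// A trailing space is appended, as in createDiffFile, so the last scan
	// always finds a delimiter.
	rd := strings.NewReader(line[len(cmdDiffHead):] + " ")
	oldName := readFileName(rd)
	newName := readFileName(rd)
	fmt.Println(oldName, "->", newName) // dir with space/file -> dir with space/file
}
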
// GetDiffRange builds a Diff between two commits of a repository.
@@ -882,14 +709,7 @@ func GetDiffRangeWithWhitespaceBehavior(repoPath, beforeCommitID, afterCommitID
defer cancel()
var cmd *exec.Cmd
if (len(beforeCommitID) == 0 || beforeCommitID == git.EmptySHA) && commit.ParentCount() == 0 {
diffArgs := []string{"diff", "--src-prefix=\\a/", "--dst-prefix=\\b/", "-M"}
if len(whitespaceBehavior) != 0 {
diffArgs = append(diffArgs, whitespaceBehavior)
}
// append empty tree ref
diffArgs = append(diffArgs, "4b825dc642cb6eb9a060e54bf8d69288fbee4904")
diffArgs = append(diffArgs, afterCommitID)
cmd = exec.CommandContext(ctx, git.GitExecutable, diffArgs...)
cmd = exec.CommandContext(ctx, git.GitExecutable, "show", afterCommitID)
} else {
actualBeforeCommitID := beforeCommitID
if len(actualBeforeCommitID) == 0 {

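The last hunk above replaces `git show` for initial commits with a `git diff` against git's well-known empty tree object, so whitespace flags and rename detection apply there too. A small sketch of the same trick run directly against a local repository; it assumes it is run inside a git work tree, and the prefixes and flags of the real call are simplified:

package main

import (
	"fmt"
	"os/exec"
)

func main() {
	// 4b825dc642cb6eb9a060e54bf8d69288fbee4904 is the SHA-1 of the empty tree,
	// the same constant appended in the hunk above; diffing it against a commit
	// yields the full content of that commit even when it has no parent.
	const emptyTree = "4b825dc642cb6eb9a060e54bf8d69288fbee4904"
	out, err := exec.Command("git", "diff", "-M", emptyTree, "HEAD").CombinedOutput()
	if err != nil {
		fmt.Println("git diff failed:", err)
		return
	}
	fmt.Printf("patch is %d bytes\n", len(out))
}
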
View File

@@ -50,7 +50,7 @@ func TestDiffToHTML(t *testing.T) {
{Type: dmp.DiffEqual, Text: "</span> <span class=\"p\">{</span>"},
}, DiffLineAdd))
assertEqual(t, "<span class=\"nx\">tagURL</span> <span class=\"o\">:=</span> <span class=\"removed-code\"><span class=\"nx\">fmt</span><span class=\"p\">.</span><span class=\"nf\">Sprintf</span><span class=\"p\">(</span><span class=\"s\">&#34;## [%s](%s/%s/%s/%s?q=&amp;type=all&amp;state=closed&amp;milestone=%d) - %s&#34;</span><span class=\"p\">,</span> <span class=\"nx\">ge</span><span class=\"p\">.</span><span class=\"nx\">Milestone\"</span></span><span class=\"p\">,</span> <span class=\"nx\">ge</span><span class=\"p\">.</span><span class=\"nx\">BaseURL</span><span class=\"p\">,</span> <span class=\"nx\">ge</span><span class=\"p\">.</span><span class=\"nx\">Owner</span><span class=\"p\">,</span> <span class=\"nx\">ge</span><span class=\"p\">.</span><span class=\"nx\">Repo</span><span class=\"p\">,</span> <span class=\"removed-code\"><span class=\"nx\">from</span><span class=\"p\">,</span> <span class=\"nx\">milestoneID</span><span class=\"p\">,</span> <span class=\"nx\">time</span><span class=\"p\">.</span><span class=\"nf\">Now</span><span class=\"p\">(</span><span class=\"p\">)</span><span class=\"p\">.</span><span class=\"nf\">Format</span><span class=\"p\">(</span><span class=\"s\">&#34;2006-01-02&#34;</span><span class=\"p\">)</span></span><span class=\"p\">)</span>", diffToHTML("", []dmp.Diff{
assertEqual(t, "<span class=\"nx\">tagURL</span> <span class=\"o\">:=</span> <span class=\"removed-code\"><span class=\"nx\">fmt</span><span class=\"p\">.</span><span class=\"nf\">Sprintf</span><span class=\"p\">(</span><span class=\"s\">&#34;## [%s](%s/%s/%s/%s?q=&amp;type=all&amp;state=closed&amp;milestone=%d) - %s&#34;</span><span class=\"p\">,</span> <span class=\"nx\">ge</span><span class=\"p\">.</span><span class=\"nx\">Milestone\"</span></span><span class=\"p\">,</span> <span class=\"nx\">ge</span><span class=\"p\">.</span><span class=\"nx\">BaseURL</span><span class=\"p\">,</span> <span class=\"nx\">ge</span><span class=\"p\">.</span><span class=\"nx\">Owner</span><span class=\"p\">,</span> <span class=\"nx\">ge</span><span class=\"p\">.</span><span class=\"nx\">Repo</span><span class=\"p\">,</span> <span class=\"nx\"><span class=\"removed-code\">from</span><span class=\"p\">,</span> <span class=\"nx\">milestoneID</span><span class=\"p\">,</span> <span class=\"nx\">time</span><span class=\"p\">.</span><span class=\"nf\">Now</span><span class=\"p\">(</span><span class=\"p\">)</span><span class=\"p\">.</span><span class=\"nf\">Format</span><span class=\"p\">(</span><span class=\"s\">&#34;2006-01-02&#34;</span><span class=\"p\">)</span></span><span class=\"p\">)</span>", diffToHTML("", []dmp.Diff{
{Type: dmp.DiffEqual, Text: "<span class=\"nx\">tagURL</span> <span class=\"o\">:=</span> <span class=\"n"},
{Type: dmp.DiffDelete, Text: "x\">fmt</span><span class=\"p\">.</span><span class=\"nf\">Sprintf</span><span class=\"p\">(</span><span class=\"s\">&#34;## [%s](%s/%s/%s/%s?q=&amp;type=all&amp;state=closed&amp;milestone=%d) - %s&#34;</span><span class=\"p\">,</span> <span class=\"nx\">ge</span><span class=\"p\">.</span><span class=\"nx\">Milestone\""},
{Type: dmp.DiffInsert, Text: "f\">getGiteaTagURL</span><span class=\"p\">(</span><span class=\"nx\">client"},
@@ -60,7 +60,7 @@ func TestDiffToHTML(t *testing.T) {
{Type: dmp.DiffEqual, Text: "</span><span class=\"p\">)</span>"},
}, DiffLineDel))
assertEqual(t, "<span class=\"nx\">r</span><span class=\"p\">.</span><span class=\"nf\">WrapperRenderer</span><span class=\"p\">(</span><span class=\"nx\">w</span><span class=\"p\">,</span> <span class=\"removed-code\"><span class=\"nx\">language</span></span><span class=\"removed-code\"><span class=\"p\">,</span> <span class=\"kc\">true</span><span class=\"p\">,</span> <span class=\"nx\">attrs</span></span><span class=\"p\">,</span> <span class=\"kc\">false</span><span class=\"p\">)</span>", diffToHTML("", []dmp.Diff{
assertEqual(t, "<span class=\"nx\">r</span><span class=\"p\">.</span><span class=\"nf\">WrapperRenderer</span><span class=\"p\">(</span><span class=\"nx\">w</span><span class=\"p\">,</span> <span class=\"nx\"><span class=\"removed-code\">language</span></span><span class=\"removed-code\"><span class=\"p\">,</span> <span class=\"kc\">true</span><span class=\"p\">,</span> <span class=\"nx\">attrs</span></span><span class=\"p\">,</span> <span class=\"kc\">false</span><span class=\"p\">)</span>", diffToHTML("", []dmp.Diff{
{Type: dmp.DiffEqual, Text: "<span class=\"nx\">r</span><span class=\"p\">.</span><span class=\"nf\">WrapperRenderer</span><span class=\"p\">(</span><span class=\"nx\">w</span><span class=\"p\">,</span> <span class=\"nx\">"},
{Type: dmp.DiffDelete, Text: "language</span><span "},
{Type: dmp.DiffEqual, Text: "c"},
@@ -74,30 +74,6 @@ func TestDiffToHTML(t *testing.T) {
{Type: dmp.DiffInsert, Text: "lass=\"p\">,</span> <span class=\"kc\">true</span><span class=\"p\">,</span> <span class=\"nx\">attrs"},
{Type: dmp.DiffEqual, Text: "</span><span class=\"p\">,</span> <span class=\"kc\">false</span><span class=\"p\">)</span>"},
}, DiffLineAdd))
assertEqual(t, "<span class=\"k\">print</span><span class=\"added-code\"></span><span class=\"added-code\"><span class=\"p\">(</span></span><span class=\"sa\"></span><span class=\"s2\">&#34;</span><span class=\"s2\">// </span><span class=\"s2\">&#34;</span><span class=\"p\">,</span> <span class=\"n\">sys</span><span class=\"o\">.</span><span class=\"n\">argv</span><span class=\"added-code\"><span class=\"p\">)</span></span>", diffToHTML("", []dmp.Diff{
{Type: dmp.DiffEqual, Text: "<span class=\"k\">print</span>"},
{Type: dmp.DiffInsert, Text: "<span"},
{Type: dmp.DiffEqual, Text: " "},
{Type: dmp.DiffInsert, Text: "class=\"p\">(</span>"},
{Type: dmp.DiffEqual, Text: "<span class=\"sa\"></span><span class=\"s2\">&#34;</span><span class=\"s2\">// </span><span class=\"s2\">&#34;</span><span class=\"p\">,</span> <span class=\"n\">sys</span><span class=\"o\">.</span><span class=\"n\">argv</span>"},
{Type: dmp.DiffInsert, Text: "<span class=\"p\">)</span>"},
}, DiffLineAdd))
assertEqual(t, "sh <span class=\"added-code\">&#39;useradd -u $(stat -c &#34;%u&#34; .gitignore) jenkins</span>&#39;", diffToHTML("", []dmp.Diff{
{Type: dmp.DiffEqual, Text: "sh &#3"},
{Type: dmp.DiffDelete, Text: "4;useradd -u 111 jenkins&#34"},
{Type: dmp.DiffInsert, Text: "9;useradd -u $(stat -c &#34;%u&#34; .gitignore) jenkins&#39"},
{Type: dmp.DiffEqual, Text: ";"},
}, DiffLineAdd))
assertEqual(t, "<span class=\"x\"> &lt;h<span class=\"added-code\">4 class=</span><span class=\"added-code\">&#34;release-list-title df ac&#34;</span>&gt;</span>", diffToHTML("", []dmp.Diff{
{Type: dmp.DiffEqual, Text: "<span class=\"x\"> &lt;h"},
{Type: dmp.DiffInsert, Text: "4 class=&#"},
{Type: dmp.DiffEqual, Text: "3"},
{Type: dmp.DiffInsert, Text: "4;release-list-title df ac&#34;"},
{Type: dmp.DiffEqual, Text: "&gt;</span>"},
}, DiffLineAdd))
}
func TestParsePatch_singlefile(t *testing.T) {
@@ -206,27 +182,6 @@ rename to a b/a a/file b/b file
oldFilename: "a b/file b/a a/file",
filename: "a b/a a/file b/b file",
},
{
name: "minuses-and-pluses",
gitdiff: `diff --git a/minuses-and-pluses b/minuses-and-pluses
index 6961180..9ba1a00 100644
--- a/minuses-and-pluses
+++ b/minuses-and-pluses
@@ -1,4 +1,4 @@
--- 1st line
-++ 2nd line
--- 3rd line
-++ 4th line
+++ 1st line
+-- 2nd line
+++ 3rd line
+-- 4th line
`,
oldFilename: "minuses-and-pluses",
filename: "minuses-and-pluses",
addition: 4,
deletion: 4,
},
}
for _, testcase := range tests {

View File

@@ -122,76 +122,41 @@ func createCodeComment(doer *models.User, repo *models.Repository, issue *models
}
defer gitRepo.Close()
invalidated := false
head := pr.GetGitRefName()
// FIXME validate treePath
// Get latest commit referencing the commented line
// No need for get commit for base branch changes
if line > 0 {
if reviewID != 0 {
first, err := models.FindComments(models.FindCommentsOptions{
ReviewID: reviewID,
Line: line,
TreePath: treePath,
Type: models.CommentTypeCode,
ListOptions: models.ListOptions{
PageSize: 1,
Page: 1,
},
})
if err == nil && len(first) > 0 {
commitID = first[0].CommitSHA
invalidated = first[0].Invalidated
patch = first[0].Patch
} else if err != nil && !models.IsErrCommentNotExist(err) {
return nil, fmt.Errorf("Find first comment for %d line %d path %s. Error: %v", reviewID, line, treePath, err)
} else {
review, err := models.GetReviewByID(reviewID)
if err == nil && len(review.CommitID) > 0 {
head = review.CommitID
} else if err != nil && !models.IsErrReviewNotExist(err) {
return nil, fmt.Errorf("GetReviewByID %d. Error: %v", reviewID, err)
}
}
}
if len(commitID) == 0 {
// FIXME validate treePath
// Get latest commit referencing the commented line
// No need for get commit for base branch changes
commit, err := gitRepo.LineBlame(head, gitRepo.Path, treePath, uint(line))
if err == nil {
commitID = commit.ID.String()
} else if !(strings.Contains(err.Error(), "exit status 128 - fatal: no such path") || notEnoughLines.MatchString(err.Error())) {
return nil, fmt.Errorf("LineBlame[%s, %s, %s, %d]: %v", pr.GetGitRefName(), gitRepo.Path, treePath, line, err)
}
commit, err := gitRepo.LineBlame(pr.GetGitRefName(), gitRepo.Path, treePath, uint(line))
if err == nil {
commitID = commit.ID.String()
} else if !(strings.Contains(err.Error(), "exit status 128 - fatal: no such path") || notEnoughLines.MatchString(err.Error())) {
return nil, fmt.Errorf("LineBlame[%s, %s, %s, %d]: %v", pr.GetGitRefName(), gitRepo.Path, treePath, line, err)
}
}
// Only fetch diff if comment is review comment
if len(patch) == 0 && reviewID != 0 {
if len(commitID) == 0 {
commitID, err = gitRepo.GetRefCommitID(pr.GetGitRefName())
if err != nil {
return nil, fmt.Errorf("GetRefCommitID[%s]: %v", pr.GetGitRefName(), err)
}
if reviewID != 0 {
headCommitID, err := gitRepo.GetRefCommitID(pr.GetGitRefName())
if err != nil {
return nil, fmt.Errorf("GetRefCommitID[%s]: %v", pr.GetGitRefName(), err)
}
patchBuf := new(bytes.Buffer)
if err := git.GetRepoRawDiffForFile(gitRepo, pr.MergeBase, commitID, git.RawDiffNormal, treePath, patchBuf); err != nil {
return nil, fmt.Errorf("GetRawDiffForLine[%s, %s, %s, %s]: %v", gitRepo.Path, pr.MergeBase, commitID, treePath, err)
if err := git.GetRepoRawDiffForFile(gitRepo, pr.MergeBase, headCommitID, git.RawDiffNormal, treePath, patchBuf); err != nil {
return nil, fmt.Errorf("GetRawDiffForLine[%s, %s, %s, %s]: %v", err, gitRepo.Path, pr.MergeBase, headCommitID, treePath)
}
patch = git.CutDiffAroundLine(patchBuf, int64((&models.Comment{Line: line}).UnsignedLine()), line < 0, setting.UI.CodeCommentLines)
}
return models.CreateComment(&models.CreateCommentOptions{
Type: models.CommentTypeCode,
Doer: doer,
Repo: repo,
Issue: issue,
Content: content,
LineNum: line,
TreePath: treePath,
CommitSHA: commitID,
ReviewID: reviewID,
Patch: patch,
Invalidated: invalidated,
Type: models.CommentTypeCode,
Doer: doer,
Repo: repo,
Issue: issue,
Content: content,
LineNum: line,
TreePath: treePath,
CommitSHA: commitID,
ReviewID: reviewID,
Patch: patch,
})
}
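
createCodeComment above resolves the commit a code comment should attach to, either by reusing the first comment of the review or by asking git blame for the last commit that touched the line. A hedged sketch of the blame step that shells out to git directly; it is not the git module call used in the service:

package main

import (
	"fmt"
	"os/exec"
	"strings"
)

// lineBlame returns the commit that last touched a single line of a file at the
// given ref, which is conceptually what gitRepo.LineBlame does above.
func lineBlame(repoPath, ref, file string, line uint) (string, error) {
	out, err := exec.Command("git", "-C", repoPath, "blame",
		"--porcelain", "-L", fmt.Sprintf("%d,%d", line, line), ref, "--", file).Output()
	if err != nil {
		return "", err
	}
	fields := strings.Fields(string(out))
	if len(fields) == 0 {
		return "", fmt.Errorf("no blame output for %s:%d", file, line)
	}
	return fields[0], nil // the first token of porcelain output is the commit SHA
}

func main() {
	sha, err := lineBlame(".", "HEAD", "README.md", 1)
	fmt.Println(sha, err)
}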

View File

@@ -1,15 +1,15 @@
{{if .Flash.ErrorMsg}}
<div class="ui negative message flash-error">
<div class="ui negative message">
<p>{{.Flash.ErrorMsg | Str2html}}</p>
</div>
{{end}}
{{if .Flash.SuccessMsg}}
<div class="ui positive message flash-success">
<div class="ui positive message">
<p>{{.Flash.SuccessMsg | Str2html}}</p>
</div>
{{end}}
{{if .Flash.InfoMsg}}
<div class="ui info message flash-info">
<div class="ui info message">
<p>{{.Flash.InfoMsg | Str2html}}</p>
</div>
{{end}}

View File

@@ -1,7 +0,0 @@
{{.Message}}
<details>
<summary>{{.Summary}}</summary>
<code>
{{.Details | Str2html}}
</code>
</details>

View File

@@ -30,7 +30,7 @@
</div>
{{end}}
{{end}}
{{template "repo/issue/view_content/add_reaction" Dict "ctx" $.root "ActionURL" (Printf "%s/comments/%d/reactions" $.root.RepoLink .ID) }}
{{template "repo/issue/view_content/add_reaction" Dict "ctx" $ "ActionURL" (Printf "%s/comments/%d/reactions" $.root.RepoLink .ID) }}
{{template "repo/issue/view_content/context_menu" Dict "ctx" $.root "item" . "delete" true "diff" true "IsCommentPoster" (and $.root.IsSigned (eq $.root.SignedUserID .PosterID))}}
</div>
</div>
@@ -48,7 +48,7 @@
{{$reactions := .Reactions.GroupByType}}
{{if $reactions}}
<div class="ui attached segment reactions">
{{template "repo/issue/view_content/reactions" Dict "ctx" $.root "ActionURL" (Printf "%s/comments/%d/reactions" $.root.RepoLink .ID) "Reactions" $reactions}}
{{template "repo/issue/view_content/reactions" Dict "ctx" $ "ActionURL" (Printf "%s/comments/%d/reactions" $.root.RepoLink .ID) "Reactions" $reactions}}
</div>
{{end}}
</div>

View File

@@ -264,7 +264,7 @@
</span>
{{end}}
{{range .Assignees}}
<a class="ui right assignee poping up" href="{{.HomeLink}}" data-content="{{.GetDisplayName}}" data-variation="inverted" data-position="left center">
<a class="ui right assignee poping up" href="{{.HomeLink}}" data-content="{{.Name}}" data-variation="inverted" data-position="left center">
<img class="ui avatar image" src="{{.RelAvatarLink}}">
</a>
{{end}}

View File

@@ -262,7 +262,7 @@
<span{{if .IsOverdue}} class="overdue"{{end}}>{{.DeadlineUnix.FormatShort}}</span>
{{end}}
{{range .Assignees}}
<a class="ui right assignee poping up" href="{{.HomeLink}}" data-content="{{.GetDisplayName}}" data-variation="inverted" data-position="left center">
<a class="ui right assignee poping up" href="{{.HomeLink}}" data-content="{{.Name}}" data-variation="inverted" data-position="left center">
<img class="ui avatar image" src="{{.RelAvatarLink}}">
</a>
{{end}}

View File

@@ -21,27 +21,25 @@
{{end}}
<div class="content">
<div class="ui top attached header">
<div class="header-left df ac">
{{if .Issue.OriginalAuthor }}
<span class="text black">
<i class="fa {{MigrationIcon .Repository.GetOriginalURLHostname}}" aria-hidden="true"></i>
{{ .Issue.OriginalAuthor }}
</span>
<span class="text grey">
{{ .i18n.Tr "repo.issues.commented_at" .Issue.HashTag $createdStr | Safe }}
</span>
<span class="text migrate">
{{if .Repository.OriginalURL}} ({{$.i18n.Tr "repo.migrated_from" .Repository.OriginalURL .Repository.GetOriginalURLHostname | Safe }}){{end}}
</span>
{{else}}
<span class="text grey">
<a class="author"{{if gt .Issue.Poster.ID 0}} href="{{.Issue.Poster.HomeLink}}"{{end}}>{{.Issue.Poster.GetDisplayName}}</a>
{{.i18n.Tr "repo.issues.commented_at" .Issue.HashTag $createdStr | Safe}}
</span>
{{end}}
</div>
<div class="header-right actions df ac">
{{if not $.Repository.IsArchived}}
{{if .Issue.OriginalAuthor }}
<span class="text black">
<i class="fa {{MigrationIcon .Repository.GetOriginalURLHostname}}" aria-hidden="true"></i>
{{ .Issue.OriginalAuthor }}
</span>
<span class="text grey">
{{ .i18n.Tr "repo.issues.commented_at" .Issue.HashTag $createdStr | Safe }}
</span>
<span class="text migrate">
{{if .Repository.OriginalURL}} ({{$.i18n.Tr "repo.migrated_from" .Repository.OriginalURL .Repository.GetOriginalURLHostname | Safe }}){{end}}
</span>
{{else}}
<span class="text grey">
<a class="author"{{if gt .Issue.Poster.ID 0}} href="{{.Issue.Poster.HomeLink}}"{{end}}>{{.Issue.Poster.GetDisplayName}}</a>
{{.i18n.Tr "repo.issues.commented_at" .Issue.HashTag $createdStr | Safe}}
</span>
{{end}}
{{if not $.Repository.IsArchived}}
<div class="ui right actions">
{{if gt .Issue.ShowTag 0}}
<div class="item tag">
{{if eq .Issue.ShowTag 2}}
@@ -53,8 +51,8 @@
{{end}}
{{template "repo/issue/view_content/add_reaction" Dict "ctx" $ "ActionURL" (Printf "%s/issues/%d/reactions" $.RepoLink .Issue.Index)}}
{{template "repo/issue/view_content/context_menu" Dict "ctx" $ "item" .Issue "delete" false "diff" false "IsCommentPoster" $.IsIssuePoster}}
{{end}}
</div>
</div>
{{end}}
</div>
<div class="ui attached segment">
<div class="render-content markdown">

View File

@@ -20,15 +20,13 @@
{{end}}
<div class="content">
<div class="ui top attached header">
<div class="header-left df ac">
{{if .OriginalAuthor }}
<span class="text black"><i class="fa {{MigrationIcon $.Repository.GetOriginalURLHostname}}" aria-hidden="true"></i> {{ .OriginalAuthor }}</span><span class="text grey"> {{$.i18n.Tr "repo.issues.commented_at" .Issue.HashTag $createdStr | Safe}} {{if $.Repository.OriginalURL}}</span><span class="text migrate">({{$.i18n.Tr "repo.migrated_from" $.Repository.OriginalURL $.Repository.GetOriginalURLHostname | Safe }}){{end}}</span>
{{else}}
<span class="text grey"><a class="author"{{if gt .Poster.ID 0}} href="{{.Poster.HomeLink}}"{{end}}>{{.Poster.GetDisplayName}}</a> {{$.i18n.Tr "repo.issues.commented_at" .HashTag $createdStr | Safe}}</span>
{{end}}
</div>
<div class="header-right actions df ac">
{{if not $.Repository.IsArchived}}
{{if .OriginalAuthor }}
<span class="text black"><i class="fa {{MigrationIcon $.Repository.GetOriginalURLHostname}}" aria-hidden="true"></i> {{ .OriginalAuthor }}</span><span class="text grey"> {{$.i18n.Tr "repo.issues.commented_at" .Issue.HashTag $createdStr | Safe}} {{if $.Repository.OriginalURL}}</span><span class="text migrate">({{$.i18n.Tr "repo.migrated_from" $.Repository.OriginalURL $.Repository.GetOriginalURLHostname | Safe }}){{end}}</span>
{{else}}
<span class="text grey"><a class="author"{{if gt .Poster.ID 0}} href="{{.Poster.HomeLink}}"{{end}}>{{.Poster.GetDisplayName}}</a> {{$.i18n.Tr "repo.issues.commented_at" .HashTag $createdStr | Safe}}</span>
{{end}}
{{if not $.Repository.IsArchived}}
<div class="ui right actions">
{{if eq .PosterID .Issue.PosterID }}
<div class="item tag">
{{$.i18n.Tr "repo.issues.poster"}}
@@ -45,8 +43,8 @@
{{end}}
{{template "repo/issue/view_content/add_reaction" Dict "ctx" $ "ActionURL" (Printf "%s/comments/%d/reactions" $.RepoLink .ID)}}
{{template "repo/issue/view_content/context_menu" Dict "ctx" $ "item" . "delete" true "diff" false "IsCommentPoster" (and $.IsSigned (eq $.SignedUserID .PosterID))}}
{{end}}
</div>
</div>
{{end}}
</div>
<div class="ui attached segment">
<div class="render-content markdown">
@@ -443,34 +441,29 @@
{{$resolveDoer := (index $comms 0).ResolveDoer}}
{{$isNotPending := (not (eq (index $comms 0).Review.Type 0))}}
{{if or $invalid $resolved}}
<button id="show-outdated-{{(index $comms 0).ID}}" data-comment="{{(index $comms 0).ID}}" class="{{if not $resolved}}hide {{end}}ui compact right labeled button show-outdated">
<button id="show-outdated-{{(index $comms 0).ID}}" data-comment="{{(index $comms 0).ID}}" class="ui compact right labeled button show-outdated">
{{svg "octicon-unfold"}}
{{if $resolved}}
{{$.i18n.Tr "repo.issues.review.show_resolved"}}
{{else}}
{{if $invalid }}
{{$.i18n.Tr "repo.issues.review.show_outdated"}}
{{end}}
</button>
<button id="hide-outdated-{{(index $comms 0).ID}}" data-comment="{{(index $comms 0).ID}}" class="{{if $resolved}}hide {{end}}ui compact right labeled button hide-outdated">
{{svg "octicon-fold"}}
{{if $resolved}}
{{$.i18n.Tr "repo.issues.review.hide_resolved"}}
{{else}}
{{$.i18n.Tr "repo.issues.review.show_resolved"}}
{{end}}
</button>
<button id="hide-outdated-{{(index $comms 0).ID}}" data-comment="{{(index $comms 0).ID}}" class="hide ui compact right labeled button hide-outdated">
{{svg "octicon-fold"}}
{{if $invalid}}
{{$.i18n.Tr "repo.issues.review.hide_outdated"}}
{{else}}
{{$.i18n.Tr "repo.issues.review.hide_resolved"}}
{{end}}
</button>
{{end}}
<a href="{{(index $comms 0).CodeCommentURL}}" class="file-comment">{{$filename}}</a>
{{if and $invalid (not $resolved)}}
<span class="tag">
{{$.i18n.Tr "repo.issues.review.outdated"}}
</span>
{{end}}
<a href="{{(index $comms 0).CodeCommentURL}}" class="file-comment">{{$filename}}</a>
</div>
{{$diff := (CommentMustAsDiff (index $comms 0))}}
{{if $diff}}
{{$file := (index $diff.Files 0)}}
<div id="code-preview-{{(index $comms 0).ID}}" class="ui table segment{{if $resolved}} hide{{end}}">
<div id="code-preview-{{(index $comms 0).ID}}" class="ui table segment{{if or $invalid $resolved}} hide{{end}}">
<div class="diff-file-box diff-box file-content {{TabSizeClass $.Editorconfig $file.Name}}">
<div class="file-body file-code code-view code-diff code-diff-unified">
<table>
@@ -482,7 +475,7 @@
</div>
</div>
{{end}}
<div id="code-comments-{{(index $comms 0).ID}}" class="ui segment{{if $resolved}} hide{{end}}">
<div id="code-comments-{{(index $comms 0).ID}}" class="ui segment{{if or $invalid $resolved}} hide{{end}}">
<div class="ui comments">
{{range $comms}}
{{ $createdSubStr:= TimeSinceUnix .CreatedUnix $.Lang }}

View File

@@ -15,7 +15,7 @@
{{end}}
<span class="text grey">
{{if .User}}
<a href="{{.User.HomeLink}}">{{.User.GetDisplayName}}</a>
<a href="{{.User.HomeLink}}">{{.User.Name}}</a>
{{else if .Team}}
<span class="ui text">{{$.Issue.Repo.OwnerName}}/{{.Team.Name}}</span>
{{end}}

View File

@@ -7,16 +7,11 @@
{{template "base/alert" .}}
<div class="home">
<div class="ui stackable middle very relaxed page grid">
<div id="repo_migrating" class="sixteen wide center aligned centered column" task="{{.MigrateTask.ID}}">
<div id="repo_migrating" class="sixteen wide center aligned centered column" repo="{{.Repo.Repository.FullName}}">
<div>
<img src="{{StaticUrlPrefix}}/img/loading.png"/>
</div>
</div>
<div id="repo_migrating_failed_image" class="sixteen wide center aligned centered column" style="display: none;">
<div>
<img src="{{StaticUrlPrefix}}/img/failed.png"/>
</div>
</div>
</div>
<div class="ui stackable middle very relaxed page grid">
<div class="sixteen wide center aligned centered column">
@@ -25,7 +20,6 @@
</div>
<div id="repo_migrating_failed">
<p>{{.i18n.Tr "repo.migrate.migrating_failed" .CloneAddr | Safe}}</p>
<p id="repo_migrating_failed_error"></p>
</div>
</div>
</div>

View File

@@ -59,7 +59,7 @@
{{.OriginalAuthor}}
{{else if .Publisher}}
<img class="img-10" src="{{.Publisher.RelAvatarLink}}">
<a href="{{AppSubUrl}}/{{.Publisher.Name}}">{{.Publisher.GetDisplayName}}</a>
<a href="{{AppSubUrl}}/{{.Publisher.Name}}">{{.Publisher.Name}}</a>
{{else}}
Ghost
{{end}}

View File

@@ -284,7 +284,7 @@
</div>
</div>
{{if not .IsMirror}}
{{if and (not .IsMirror) (.Repository.UnitEnabled $.UnitTypePullRequests)}}
<div class="ui divider"></div>
{{$pullRequestEnabled := .Repository.UnitEnabled $.UnitTypePullRequests}}
{{$prUnit := .Repository.MustGetUnit $.UnitTypePullRequests}}

View File

@@ -49,7 +49,7 @@
{{range .Users}}
<div class="item" data-value="{{.ID}}">
<img class="ui mini image" src="{{.RelAvatarLink}}">
{{.GetDisplayName}}
{{.Name}}
</div>
{{end}}
</div>
@@ -99,7 +99,7 @@
{{range .Users}}
<div class="item" data-value="{{.ID}}">
<img class="ui mini image" src="{{.RelAvatarLink}}">
{{.GetDisplayName}}
{{.Name}}
</div>
{{end}}
</div>
@@ -179,7 +179,7 @@
{{range .Users}}
<div class="item" data-value="{{.ID}}">
<img class="ui mini image" src="{{.RelAvatarLink}}">
{{.GetDisplayName}}
{{.Name}}
</div>
{{end}}
</div>

View File

@@ -21,9 +21,9 @@
<label for="full_name">{{.i18n.Tr "settings.full_name"}}</label>
<input id="full_name" name="full_name" value="{{.SignedUser.FullName}}">
</div>
<div class="field {{if .Err_Email}}error{{end}}">
<div class="required field {{if .Err_Email}}error{{end}}">
<label for="email">{{.i18n.Tr "email"}}</label>
<p>{{.SignedUser.Email}}</p>
<input id="email" name="email" value="{{.SignedUser.Email}}">
</div>
<div class="inline field">
<div class="ui checkbox" id="keep-email-private">

View File

@@ -119,7 +119,7 @@
{{else}}
<span class="iconFloat">{{svg "octicon-repo"}}</span>
{{end}}
<a class="name" href="{{AppSubUrl}}/{{.OwnerName}}/{{.Name}}">{{.OwnerName}}/{{.Name}}</a>
<a class="name" href="{{AppSubUrl}}/{{$.OwnerName}}/{{.Name}}">{{$.OwnerName}}/{{.Name}}</a>
<span>{{SizeFmt .Size}}</span>
{{if .IsFork}}
{{$.i18n.Tr "repo.forked_from"}}

View File

@@ -1,6 +1,6 @@
# Versioning Library for Go
![Build Status](https://github.com/6543/go-version/workflows/Release/badge.svg)
[![GoDoc](https://godoc.org/github.com/6543/go-version?status.svg)](https://godoc.org/github.com/6543/go-version)
[![Build Status](https://circleci.com/gh/hashicorp/go-version/tree/master.svg?style=svg)](https://circleci.com/gh/hashicorp/go-version/tree/master)
[![GoDoc](https://godoc.org/github.com/hashicorp/go-version?status.svg)](https://godoc.org/github.com/hashicorp/go-version)
go-version is a library for parsing versions and version constraints,
and verifying versions against a set of constraints. go-version

View File

@@ -1,3 +1 @@
module github.com/6543/go-version
go 1.15

View File

@@ -18,14 +18,10 @@ var (
// The raw regular expression string used for testing the validity
// of a version.
const (
VersionRegexpRaw string = `[vV]?` + // Optional [vV] prefix
`([0-9]+(\.[0-9]+)*?)` + // ( MajorNum ( '.' MinorNums ) *? )
`(-` + // Followed by (optionally): ( '-'
`([0-9]+[0-9A-Za-z\-~]*(\.[0-9A-Za-z\-~]+)*)` + // Either ( PreNum String ( '.' OtherString ) * )
`|` +
`([-\.]?([A-Za-z\-~]+[0-9A-Za-z\-~]*(\.[0-9A-Za-z\-~]+)*)))?` + // Or ( ['-' '.' ] ? ( AlphaHyphenTilde String * ( '.' String ) * ))) ?
`(\+([0-9A-Za-z\-~]+(\.[0-9A-Za-z\-~]+)*))?` + // and more Optionally: ( '+' String ( '.' String ) * )
`([\+\.\-~]g[0-9A-Fa-f]{10}$)?` + // Optionally a: ( Punct 'g' Sha )
VersionRegexpRaw string = `[vV]?([0-9]+(\.[0-9]+)*?)` +
`(-([0-9]+[0-9A-Za-z\-~]*(\.[0-9A-Za-z\-~]+)*)|(-?([A-Za-z\-~]+[0-9A-Za-z\-~]*(\.[0-9A-Za-z\-~]+)*)))?` +
`(\+([0-9A-Za-z\-~]+(\.[0-9A-Za-z\-~]+)*))?` +
`([\+\.\-~]g[0-9A-Fa-f]{10}$)?` +
`?`
// SemverRegexpRaw requires a separator between version and prerelease
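The commented regex above is easiest to follow through the package's public API. Below is a minimal, illustrative sketch (not part of this diff) of parsing a pre-release version with github.com/hashicorp/go-version, which the module replacement maps onto the github.com/6543/go-version fork; the sample version string and the expected outputs in the comments are assumptions for illustration only.

```go
package main

import (
	"fmt"

	// The replace directive in go.mod/vendor swaps in github.com/6543/go-version,
	// so the import path stays the same.
	version "github.com/hashicorp/go-version"
)

func main() {
	// A pre-release version with an optional "v" prefix, as matched by the
	// relaxed VersionRegexpRaw above (sample value for illustration).
	v, err := version.NewVersion("v1.13.0-rc2")
	if err != nil {
		panic(err)
	}
	fmt.Println(v.Segments())   // expected: [1 13 0]
	fmt.Println(v.Prerelease()) // expected: rc2

	// Constraints are checked against the parsed version.
	c, err := version.NewConstraint(">= 1.13.0-rc1")
	if err != nil {
		panic(err)
	}
	fmt.Println(c.Check(v)) // expected: true
}
```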

vendor/modules.txt vendored (6 lines changed)
View File

@@ -426,7 +426,7 @@ github.com/hashicorp/go-cleanhttp
# github.com/hashicorp/go-retryablehttp v0.6.7
## explicit
github.com/hashicorp/go-retryablehttp
# github.com/hashicorp/go-version v1.2.1 => github.com/6543/go-version v1.2.4
# github.com/hashicorp/go-version v1.2.1 => github.com/6543/go-version v1.2.3
## explicit
github.com/hashicorp/go-version
# github.com/hashicorp/hcl v1.0.0
@@ -954,7 +954,7 @@ gopkg.in/warnings.v0
gopkg.in/yaml.v2
# gopkg.in/yaml.v3 v3.0.0-20200615113413-eeeca48fe776
gopkg.in/yaml.v3
# mvdan.cc/xurls/v2 v2.2.0
# mvdan.cc/xurls/v2 v2.1.0
## explicit
mvdan.cc/xurls/v2
# strk.kbt.io/projects/go/libravatar v0.0.0-20191008002943-06d1c002b251
@@ -978,4 +978,4 @@ xorm.io/xorm/log
xorm.io/xorm/names
xorm.io/xorm/schemas
xorm.io/xorm/tags
# github.com/hashicorp/go-version => github.com/6543/go-version v1.2.4
# github.com/hashicorp/go-version => github.com/6543/go-version v1.2.3

View File

@@ -2,7 +2,7 @@
[![GoDoc](https://godoc.org/mvdan.cc/xurls?status.svg)](https://godoc.org/mvdan.cc/xurls)
Extract urls from text using regular expressions. Requires Go 1.13 or later.
Extract urls from text using regular expressions. Requires Go 1.12 or later.
```go
import "mvdan.cc/xurls/v2"
@@ -18,18 +18,13 @@ func main() {
}
```
Since API is centered around [regexp.Regexp](https://golang.org/pkg/regexp/#Regexp),
many other methods are available, such as finding the [byte indexes](https://golang.org/pkg/regexp/#Regexp.FindAllIndex)
for all matches.
Note that calling the exposed functions means compiling a regular expression, so
repeated calls should be avoided.
Note that the funcs compile regexes, so avoid calling them repeatedly.
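To make the note about repeated compilation concrete, here is a short, illustrative sketch (not part of the vendored README): the relaxed matcher is built once and the resulting *regexp.Regexp is reused; the sample text is made up.

```go
package main

import (
	"fmt"

	"mvdan.cc/xurls/v2"
)

func main() {
	// Per the note above, calling Relaxed() compiles a regular expression,
	// so build the matcher once and reuse it.
	rx := xurls.Relaxed()

	text := "Do gophers live in http://golang.org? Also see golang.org/doc."
	fmt.Println(rx.FindAllString(text, -1))

	// Since the matcher is a plain *regexp.Regexp, the byte indexes of all
	// matches are available as well.
	fmt.Println(rx.FindAllIndex([]byte(text), -1))
}
```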
#### cmd/xurls
To install the tool globally:
cd $(mktemp -d); go mod init tmp; GO111MODULE=on go get mvdan.cc/xurls/v2/cmd/xurls
go get mvdan.cc/xurls/cmd/xurls
```shell
$ echo "Do gophers live in http://golang.org?" | xurls

View File

@@ -1,8 +1,3 @@
module mvdan.cc/xurls/v2
go 1.14
require (
github.com/rogpeppe/go-internal v1.5.2
gopkg.in/check.v1 v1.0.0-20190902080502-41f04d3bba15 // indirect
)
go 1.13

View File

@@ -1,12 +0,0 @@
github.com/kr/pretty v0.1.0 h1:L/CwN0zerZDmRFUapSPitk6f+Q3+0za1rQkzVuMiMFI=
github.com/kr/pretty v0.1.0/go.mod h1:dAy3ld7l9f0ibDNOQOHHMYYIIbhfbHSm3C4ZsoJORNo=
github.com/kr/pty v1.1.1/go.mod h1:pFQYn66WHrOpPYNljwOMqo10TkYh1fy3cYio2l3bCsQ=
github.com/kr/text v0.1.0 h1:45sCR5RtlFHMR4UwH9sdQ5TC8v0qDQCHnXt+kaKSTVE=
github.com/kr/text v0.1.0/go.mod h1:4Jbv+DJW3UT/LiOwJeYQe1efqtUx/iVham/4vfdArNI=
github.com/rogpeppe/go-internal v1.5.2 h1:qLvObTrvO/XRCqmkKxUlOBc48bI3efyDuAZe25QiF0w=
github.com/rogpeppe/go-internal v1.5.2/go.mod h1:xXDCJY+GAPziupqXw64V24skbSoqbTEfhy4qGm1nDQc=
gopkg.in/check.v1 v1.0.0-20180628173108-788fd7840127/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/check.v1 v1.0.0-20190902080502-41f04d3bba15 h1:YR8cESwS4TdDjEe65xsg0ogRM/Nc3DYOhEAlW+xobZo=
gopkg.in/check.v1 v1.0.0-20190902080502-41f04d3bba15/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/errgo.v2 v2.1.0 h1:0vLT13EuvQ0hNvakwLuFZ/jYrLp5F3kcWHXdRggjCE8=
gopkg.in/errgo.v2 v2.1.0/go.mod h1:hNsd1EY+bozCKY1Ytp96fpM3vjJbqLJn88ws8XvfDNI=

View File

@@ -66,7 +66,6 @@ var Schemes = []string{
`dpp`,
`drm`,
`drop`,
`dtmi`,
`dtn`,
`dvb`,
`ed2k`,
@@ -227,7 +226,6 @@ var Schemes = []string{
`pack`,
`palm`,
`paparazzi`,
`payment`,
`payto`,
`pkcs11`,
`platform`,
@@ -240,7 +238,6 @@ var Schemes = []string{
`pttp`,
`qb`,
`query`,
`quic-transport`,
`redis`,
`rediss`,
`reload`,

View File

@@ -57,7 +57,6 @@ var TLDs = []string{
`alsace`,
`alstom`,
`am`,
`amazon`,
`americanexpress`,
`americanfamily`,
`amex`,
@@ -220,6 +219,7 @@ var TLDs = []string{
`career`,
`careers`,
`cars`,
`cartier`,
`casa`,
`case`,
`caseih`,
@@ -252,6 +252,7 @@ var TLDs = []string{
`chintai`,
`christmas`,
`chrome`,
`chrysler`,
`church`,
`ci`,
`cipriani`,
@@ -365,6 +366,7 @@ var TLDs = []string{
`do`,
`docs`,
`doctor`,
`dodge`,
`dog`,
`domains`,
`dot`,
@@ -409,6 +411,7 @@ var TLDs = []string{
`eurovision`,
`eus`,
`events`,
`everbank`,
`exchange`,
`expert`,
`exposed`,
@@ -698,10 +701,12 @@ var TLDs = []string{
`kz`,
`la`,
`lacaixa`,
`ladbrokes`,
`lamborghini`,
`lamer`,
`lancaster`,
`lancia`,
`lancome`,
`land`,
`landrover`,
`lanxess`,
@@ -722,6 +727,7 @@ var TLDs = []string{
`lexus`,
`lgbt`,
`li`,
`liaison`,
`lidl`,
`life`,
`lifeinsurance`,
@@ -823,6 +829,7 @@ var TLDs = []string{
`monash`,
`money`,
`monster`,
`mopar`,
`mormon`,
`mortgage`,
`moscow`,
@@ -830,6 +837,7 @@ var TLDs = []string{
`motorcycles`,
`mov`,
`movie`,
`movistar`,
`mp`,
`mq`,
`mr`,
@@ -848,6 +856,7 @@ var TLDs = []string{
`mz`,
`na`,
`nab`,
`nadex`,
`nagoya`,
`name`,
`nationwide`,
@@ -949,6 +958,7 @@ var TLDs = []string{
`photography`,
`photos`,
`physio`,
`piaget`,
`pics`,
`pictet`,
`pictures`,
@@ -1144,13 +1154,13 @@ var TLDs = []string{
`song`,
`sony`,
`soy`,
`spa`,
`space`,
`sport`,
`spot`,
`spreadbetting`,
`sr`,
`srl`,
`srt`,
`ss`,
`st`,
`stada`,
@@ -1203,6 +1213,7 @@ var TLDs = []string{
`tech`,
`technology`,
`tel`,
`telefonica`,
`temasek`,
`tennis`,
`teva`,
@@ -1262,6 +1273,7 @@ var TLDs = []string{
`ua`,
`ubank`,
`ubs`,
`uconnect`,
`ug`,
`uk`,
`unicom`,
@@ -1297,6 +1309,7 @@ var TLDs = []string{
`virgin`,
`visa`,
`vision`,
`vistaprint`,
`viva`,
`vivo`,
`vlaanderen`,
@@ -1315,6 +1328,7 @@ var TLDs = []string{
`walter`,
`wang`,
`wanggou`,
`warman`,
`watch`,
`watches`,
`weather`,
@@ -1374,7 +1388,6 @@ var TLDs = []string{
`zuerich`,
`zw`,
`ελ`,
`ευ`,
`бг`,
`бел`,
`дети`,
@@ -1456,7 +1469,6 @@ var TLDs = []string{
`ไทย`,
`გე`,
`みんな`,
`アマゾン`,
`クラウド`,
`グーグル`,
`コム`,
@@ -1469,7 +1481,6 @@ var TLDs = []string{
`中国`,
`中國`,
`中文网`,
`亚马逊`,
`企业`,
`佛山`,
`信息`,
@@ -1490,6 +1501,7 @@ var TLDs = []string{
`天主教`,
`娱乐`,
`家電`,
`工行`,
`广东`,
`微博`,
`慈善`,

View File

@@ -19,7 +19,7 @@ const (
iriChar = letter + mark + number
currency = `\p{Sc}`
otherSymb = `\p{So}`
endChar = iriChar + `/\-_+&~%=#` + currency + otherSymb
endChar = iriChar + `/\-+&~%=#` + currency + otherSymb
otherPunc = `\p{Po}`
midChar = endChar + "_*" + otherPunc
wellParen = `\([` + midChar + `]*(\([` + midChar + `]*\)[` + midChar + `]*)*\)`
@@ -76,7 +76,7 @@ func relaxedExp() string {
knownTLDs := anyOf(append(TLDs, PseudoTLDs...)...)
site := domain + `(?i)(` + punycode + `|` + knownTLDs + `)(?-i)`
hostName := `(` + site + `|` + ipAddr + `)`
webURL := hostName + port + `(/|/` + pathCont + `)?`
webURL := hostName + port + `(/|/` + pathCont + `?|\b|(?m)$)`
return strictExp() + `|` + webURL
}

View File

@@ -191,32 +191,25 @@ function updateIssuesMeta(url, action, issueIds, elementId) {
function initRepoStatusChecker() {
const migrating = $('#repo_migrating');
$('#repo_migrating_failed').hide();
$('#repo_migrating_failed_image').hide();
if (migrating) {
const task = migrating.attr('task');
if (typeof task === 'undefined') {
const repo_name = migrating.attr('repo');
if (typeof repo_name === 'undefined') {
return;
}
$.ajax({
type: 'GET',
url: `${AppSubUrl}/user/task/${task}`,
url: `${AppSubUrl}/${repo_name}/status`,
data: {
_csrf: csrf,
},
complete(xhr) {
if (xhr.status === 200) {
if (xhr.responseJSON) {
if (xhr.responseJSON.status === 4) {
if (xhr.responseJSON.status === 0) {
window.location.reload();
return;
} else if (xhr.responseJSON.status === 3) {
$('#repo_migrating_progress').hide();
$('#repo_migrating').hide();
$('#repo_migrating_failed').show();
$('#repo_migrating_failed_image').show();
$('#repo_migrating_failed_error').text(xhr.responseJSON.err);
return;
}
setTimeout(() => {
initRepoStatusChecker();
}, 2000);
@@ -224,9 +217,7 @@ function initRepoStatusChecker() {
}
}
$('#repo_migrating_progress').hide();
$('#repo_migrating').hide();
$('#repo_migrating_failed').show();
$('#repo_migrating_failed_image').show();
}
});
}
@@ -1170,22 +1161,6 @@ async function initRepository() {
}
function initPullRequestReview() {
if (window.location.hash && window.location.hash.startsWith('#issuecomment-')) {
const commentDiv = $(window.location.hash);
if (commentDiv) {
// get the name of the parent id
const groupID = commentDiv.closest('div[id^="code-comments-"]').attr('id');
if (groupID && groupID.startsWith('code-comments-')) {
const id = groupID.substr(14);
$(`#show-outdated-${id}`).addClass('hide');
$(`#code-comments-${id}`).removeClass('hide');
$(`#code-preview-${id}`).removeClass('hide');
$(`#hide-outdated-${id}`).removeClass('hide');
$(window).scrollTop(commentDiv.offset().top);
}
}
}
$('.show-outdated').on('click', function (e) {
e.preventDefault();
const id = $(this).data('comment');
@@ -1236,6 +1211,16 @@ function initPullRequestReview() {
$(this).closest('.menu').toggle('visible');
});
$('.code-view .lines-code,.code-view .lines-num')
.on('mouseenter', function () {
const parent = $(this).closest('td');
$(this).closest('tr').addClass(
parent.hasClass('lines-num-old') || parent.hasClass('lines-code-old') ? 'focus-lines-old' : 'focus-lines-new'
);
})
.on('mouseleave', function () {
$(this).closest('tr').removeClass('focus-lines-new focus-lines-old');
});
$('.add-code-comment').on('click', function (e) {
if ($(e.target).hasClass('btn-add-single')) return; // https://github.com/go-gitea/gitea/issues/4745
e.preventDefault();

View File

@@ -1280,8 +1280,3 @@ table th[data-sortt-desc] {
.ui.header > .ui.label.compact {
margin-top: inherit;
}
.flash-error details code {
display: block;
text-align: left;
}

View File

@@ -10,9 +10,13 @@
.ui.attached.header {
background: #f0f0f0;
.right .button {
padding: 8px 10px;
font-weight: normal;
.right {
margin-top: -5px;
.button {
padding: 8px 10px;
font-weight: normal;
}
}
}

View File

@@ -400,13 +400,6 @@
background-color: #ffffee;
}
tr.has-parent a {
display: inline-block;
padding-top: 8px;
padding-bottom: 8px;
width: calc(100% - 1.25rem);
}
.jumpable-path {
color: #888888;
}
@@ -955,6 +948,7 @@
.tag {
color: #767676;
margin-top: 3px;
padding: 2px 5px;
font-size: 12px;
border: 1px solid rgba(0, 0, 0, .1);
@@ -968,6 +962,26 @@
}
}
.actions {
.item {
float: left;
&.context {
float: none;
}
&.tag {
margin-right: 5px;
}
&.action {
margin-top: 6px;
padding-left: 10px;
padding-right: 3px;
}
}
}
> .content {
> div:first-child {
border-top-left-radius: 4px;
@@ -1007,14 +1021,11 @@
left: 7px;
}
.header-left > * + *,
.header-right > * + * {
margin-left: .25rem;
}
.actions {
display: flex;
padding: 0 .5rem;
a {
padding: .5rem;
color: rgba(0, 0, 0, .4);
&:hover {
@@ -1209,16 +1220,6 @@
display: block;
}
}
.tag {
color: black;
margin: 3px 0 0 5px;
padding: 2px 5px;
font-size: 12px;
border: 1px solid rgba(0, 0, 0, .1);
border-radius: 3px;
background-color: #fffbb2;
}
}
}

View File

@@ -1,7 +1,7 @@
.ui.button.add-code-comment {
font-size: 14px;
height: 16px;
line-height: 12px !important;
line-height: 16px !important;
padding: 0;
position: relative;
width: 16px;
@@ -17,10 +17,6 @@
}
}
.diff-file-box .lines-code:hover .ui.button.add-code-comment {
opacity: 1;
}
.add-comment-left.add-comment-right .ui.attached.header {
border: 1px solid #d4d4d5;
margin-top: .5em;
@@ -36,6 +32,11 @@
}
}
.focus-lines-new .ui.button.add-code-comment.add-code-comment-right,
.focus-lines-old .ui.button.add-code-comment.add-code-comment-left {
opacity: 1;
}
.comment-code-cloud {
padding: 4px;
position: relative;

View File

@@ -384,10 +384,6 @@
}
}
.repository .ui.segment.sub-menu .list .item a:hover {
color: #fff;
}
.ui.horizontal.segments > .segment {
background-color: #383c4a;
}
@@ -539,26 +535,15 @@ a:hover {
.ui.secondary.menu .link.item:hover,
.ui.secondary.menu .active.item:hover,
.ui.secondary.menu a.item:hover,
.ui.dropdown .menu .active.item,
.ui.link.menu .item:hover,
.ui.menu .dropdown.item:hover,
.ui.menu .link.item:hover,
.ui.menu a.item:hover,
.ui.menu .active.item {
color: #dbdbdb;
background: #454b5a;
.ui.dropdown .menu .active.item {
color: #ffffff;
}
.ui.menu .ui.dropdown .menu > .item {
background: #2c303a !important;
color: #9e9e9e !important;
}
.ui.menu .ui.dropdown .menu > .item:hover,
.ui.menu .ui.dropdown .menu > .selected.item {
color: #dbdbdb !important;
background: #454b5a !important;
}
.ui.secondary.menu .dropdown.item > .menu,
.ui.text.menu .dropdown.item > .menu {
border: 1px solid #434444;
@@ -574,8 +559,12 @@ footer {
background: #2c303a;
}
.ui.dropdown .menu > .header,
.ui.dropdown .menu > .header:not(.ui) {
.ui.menu .ui.dropdown .menu > .item:hover,
.ui.menu .ui.dropdown .menu > .selected.item {
color: #ffffff !important;
}
.ui.dropdown .menu > .header {
color: #dbdbdb;
}
@@ -596,16 +585,29 @@ footer {
background: #4b5162;
}
.ui.link.menu .item:hover,
.ui.menu .dropdown.item:hover,
.ui.menu .link.item:hover,
.ui.menu a.item:hover {
color: #dbdbdb;
background: #454b5a;
}
.ui.menu .active.item {
background: #4b5162;
color: #dbdbdb;
}
.ui.input input {
background: #404552;
border: 1px solid #4b505f;
border: 2px solid #353945;
color: #dbdbdb;
}
.ui.input input:focus,
.ui.input.focus input {
background: #404552;
border: 1px solid #6a737d;
border: 2px solid #353945;
color: #dbdbdb;
}
@@ -616,7 +618,7 @@ footer {
.ui.label,
.ui.label.basic {
color: #dbdbdb;
background-color: #2a2e39;
background-color: #404552;
}
.issue.list > .item .title {
@@ -649,12 +651,26 @@ a.ui.basic.green.label:hover {
color: #129c92;
}
.ui.basic.button,
.ui.basic.buttons .button {
color: #797979;
}
.ui.basic.red.active.button,
.ui.basic.red.buttons .active.button {
box-shadow: 0 0 0 1px #c75252 inset !important;
color: #c75252 !important;
}
.ui.basic.button:focus,
.ui.basic.button:hover,
.ui.basic.buttons .button:focus,
.ui.basic.buttons .button:hover {
color: #dbdbdb;
box-shadow: 0 0 0 1px rgba(200, 200, 200, .35) inset;
background: rgba(0, 0, 0, .5);
}
.ui.menu .item {
background: #404552;
color: #9e9e9e;
@@ -718,7 +734,6 @@ a.ui.basic.green.label:hover {
}
.ui.form input:not([type]),
.ui.form textarea,
.ui.form input[type="date"],
.ui.form input[type="datetime-local"],
.ui.form input[type="email"],
@@ -729,34 +744,13 @@ a.ui.basic.green.label:hover {
.ui.form input[type="tel"],
.ui.form input[type="text"],
.ui.form input[type="time"],
.ui.form input[type="url"],
.ui.selection.dropdown {
.ui.form input[type="url"] {
color: #9e9e9e;
background: #404552;
border: 1px solid #4b505f;
}
.ui.form input:not([type]):hover,
.ui.form textarea:hover,
.ui.form input[type="date"]:hover,
.ui.form input[type="datetime-local"]:hover,
.ui.form input[type="email"]:hover,
.ui.form input[type="file"]:hover,
.ui.form input[type="number"]:hover,
.ui.form input[type="password"]:hover,
.ui.form input[type="search"]:hover,
.ui.form input[type="tel"]:hover,
.ui.form input[type="text"]:hover,
.ui.form input[type="time"]:hover,
.ui.form input[type="url"]:hover,
.ui.selection.dropdown:hover {
background: #404552;
border: 1px solid #4b505f;
color: #dbdbdb;
border: 2px solid #353945;
}
.ui.form input:not([type]):focus,
.ui.form textarea:focus,
.ui.form input[type="date"]:focus,
.ui.form input[type="datetime-local"]:focus,
.ui.form input[type="email"]:focus,
@@ -767,66 +761,14 @@ a.ui.basic.green.label:hover {
.ui.form input[type="tel"]:focus,
.ui.form input[type="text"]:focus,
.ui.form input[type="time"]:focus,
.ui.form input[type="url"]:focus,
.ui.selection.dropdown:focus {
.ui.form input[type="url"]:focus {
background: #404552;
border: 1px solid #6a737d;
border: 2px solid #4b505f;
color: #dbdbdb;
}
.ui.form .fields.error .field textarea,
.ui.form .fields.error .field select,
.ui.form .fields.error .field input:not([type]),
.ui.form .fields.error .field input[type="date"],
.ui.form .fields.error .field input[type="datetime-local"],
.ui.form .fields.error .field input[type="email"],
.ui.form .fields.error .field input[type="number"],
.ui.form .fields.error .field input[type="password"],
.ui.form .fields.error .field input[type="search"],
.ui.form .fields.error .field input[type="tel"],
.ui.form .fields.error .field input[type="time"],
.ui.form .fields.error .field input[type="text"],
.ui.form .fields.error .field input[type="file"],
.ui.form .fields.error .field input[type="url"],
.ui.form .field.error textarea,
.ui.form .field.error select,
.ui.form .field.error input:not([type]),
.ui.form .field.error input[type="date"],
.ui.form .field.error input[type="datetime-local"],
.ui.form .field.error input[type="email"],
.ui.form .field.error input[type="number"],
.ui.form .field.error input[type="password"],
.ui.form .field.error input[type="search"],
.ui.form .field.error input[type="tel"],
.ui.form .field.error input[type="time"],
.ui.form .field.error input[type="text"],
.ui.form .field.error input[type="file"],
.ui.form .field.error input[type="url"] {
background-color: #522;
border: 1px solid #7d3434;
color: #f9cbcb;
}
.ui.form .field.error select:focus,
.ui.form .field.error input:not([type]):focus,
.ui.form .field.error input[type="date"]:focus,
.ui.form .field.error input[type="datetime-local"]:focus,
.ui.form .field.error input[type="email"]:focus,
.ui.form .field.error input[type="number"]:focus,
.ui.form .field.error input[type="password"]:focus,
.ui.form .field.error input[type="search"]:focus,
.ui.form .field.error input[type="tel"]:focus,
.ui.form .field.error input[type="time"]:focus,
.ui.form .field.error input[type="text"]:focus,
.ui.form .field.error input[type="file"]:focus,
.ui.form .field.error input[type="url"]:focus {
background-color: #522;
border: 1px solid #a04141;
color: #f9cbcb;
}
.ui.action.input:not([class*="left action"]) input:focus {
border-right-color: #6a737d !important;
border-right-color: #4b505f !important;
}
.ui.green.button,
@@ -845,22 +787,6 @@ a.ui.basic.green.label:hover {
color: #dbdbdb;
}
.ui.basic.button,
.ui.basic.buttons .button {
color: #9e9e9e;
background: rgba(0, 0, 0, .08);
box-shadow: none;
}
.ui.basic.button:focus,
.ui.basic.button:hover,
.ui.basic.buttons .button:focus,
.ui.basic.buttons .button:hover {
color: #dbdbdb;
background: rgba(255, 255, 255, .08);
box-shadow: none;
}
.ui.labeled.button:not([class*="left labeled"]) > .label,
.ui[class*="left labeled"].button > .button {
background: #404552;
@@ -873,20 +799,6 @@ a.ui.basic.green.label:hover {
color: #dbdbdb;
}
.ui.search > .results {
background: #383c4a;
border-color: #4c505c;
}
.ui.search > .results .result:hover,
.ui.category.search > .results .category .result:hover {
background: #404552;
}
.ui.search > .results .result .title {
color: #dbdbdb;
}
.ui.table thead th,
.ui.table > thead > tr > th {
background: #404552;
@@ -1040,7 +952,6 @@ a.ui.basic.green.label:hover {
.ui.dropdown .menu > .item:hover {
color: #dbdbdb;
background: #353945;
}
.ui.dropdown .menu > .item {
@@ -1124,22 +1035,13 @@ a.ui.basic.green.label:hover {
.repository.view.issue .comment-list .comment .tag {
color: #dbdbdb;
border-color: #505667;
border-color: rgb(152, 152, 152);
}
.repository.view.issue .comment-list .timeline-item .badge.badge-commit {
background: radial-gradient(#383c4a 40%, transparent 40%) no-repeat;
}
.repository.file.editor .commit-form-wrapper .commit-form {
border-color: #505667;
}
.repository.file.editor .commit-form-wrapper .commit-form::before,
.repository.file.editor .commit-form-wrapper .commit-form::after {
border-right-color: #505667;
}
.repository .comment.form .content .form:after {
border-right-color: #313c47;
}
@@ -1158,6 +1060,17 @@ a.ui.basic.green.label:hover {
box-shadow: 0 0 0 1px #13ae38 inset !important;
}
.ui.form textarea,
.ui.form textarea:focus {
color: #dbdbdb;
background: #404552;
border: 2px solid #353945;
}
.ui.form textarea:focus {
border: 1px solid #456580;
}
.ui .info.segment.top {
background-color: #404552 !important;
}
@@ -1239,6 +1152,12 @@ td.blob-hunk {
box-shadow: 0 2px 3px 0 rgba(34, 36, 38, .15);
}
.ui.selection.dropdown {
background: #404552;
border: 1px solid #404552;
color: #9e9e9e;
}
.ui.menu .ui.dropdown .menu > .active.item {
color: #dbdbdb !important;
}
@@ -1246,7 +1165,7 @@ td.blob-hunk {
.ui.card,
.ui.cards > .card {
background: #353945;
box-shadow: 0 0 0 1px #4c505c;
box-shadow: 0 1px 3px 0 #4c505c, 0 0 0 1px #4c505c;
}
.ui.card > .content > .header,
@@ -1406,58 +1325,57 @@ input {
.ui.checkbox input:checked ~ .box:after,
.ui.checkbox input:checked ~ label:after {
color: #dbdbdb;
color: #7f98ad;
}
.ui.checkbox input:checked ~ .box:before,
.ui.checkbox input:checked ~ label:before {
background: #404552;
background: #304251;
opacity: 1;
color: #dbdbdb;
border-color: #4b505f;
color: #7f98ad;
border-color: #304251;
}
.ui.checkbox .box:hover::before,
.ui.checkbox label:hover::before {
background: #404552;
border-color: #4b505f;
background: #304251;
}
.ui.checkbox .box:before,
.ui.checkbox label:before {
background: #404552;
border: 1px solid #4b505f;
background: #304251;
border: 1px solid #304251;
}
.ui.checkbox label:before {
border-color: #4b505f;
border-color: #476075;
}
.ui.checkbox .box:active::before,
.ui.checkbox label:active::before {
background: #404552;
border-color: #6a737d;
background: #304251;
border-color: rgba(34, 36, 38, .35);
}
.ui.checkbox input:focus ~ .box:before,
.ui.checkbox input:focus ~ label:before {
border-color: #6a737d;
background: #404552;
border-color: #304251;
background: #304251;
}
.ui.checkbox input:checked:focus ~ .box:before,
.ui.checkbox input:checked:focus ~ label:before,
.ui.checkbox input:not([type="radio"]):indeterminate:focus ~ .box:before,
.ui.checkbox input:not([type="radio"]):indeterminate:focus ~ label:before {
border-color: #6a737d;
background: #404552;
border-color: #304251;
background: #304251;
}
.ui.checkbox input:checked:focus ~ .box:after,
.ui.checkbox input:checked:focus ~ label:after,
.ui.checkbox input:not([type="radio"]):indeterminate:focus ~ .box:after,
.ui.checkbox input:not([type="radio"]):indeterminate:focus ~ label:after {
color: #dbdbdb;
color: #7f98ad;
}
.ui.checkbox input:focus ~ .box:after,
@@ -1466,17 +1384,8 @@ input {
color: #9a9a9a;
}
.ui.radio.checkbox label::after,
.ui.radio.checkbox input:checked ~ label::after,
.ui.radio.checkbox input:focus ~ label::after,
.ui.radio.checkbox input:focus:checked ~ label::after {
background: #dbdbdb;
}
.ui.radio.checkbox input:checked ~ label::before,
.ui.radio.checkbox input:focus ~ label::before,
.ui.radio.checkbox input:focus:checked ~ label::before {
background: none;
.ui.selection.dropdown:hover {
border: 1px solid #456580;
}
.ui.selection.dropdown .menu > .item {
@@ -1504,7 +1413,7 @@ input {
}
.ui.form .dropzone {
border: 1px dashed #7f98ad;
border: 2px dashed #7f98ad;
background-color: #2e323e;
.dz-button {
@@ -1717,7 +1626,6 @@ a.ui.labels .label:hover {
border-color: #634343 !important;
}
.organization.settings .labelspage .item,
.organization.teams .repositories .item:not(:last-child),
.organization.teams .members .item:not(:last-child),
.organization.teams .detail .item:not(:last-child),
@@ -1725,10 +1633,6 @@ a.ui.labels .label:hover {
border-bottom-color: #404552;
}
.organization.settings .labelspage .item a:hover {
color: #fff;
}
.ui.blue.button:active,
.ui.blue.buttons .button:active {
background-color: #a27558;
@@ -1766,7 +1670,7 @@ a.ui.labels .label:hover {
.editor-toolbar {
background-color: #404552;
border-color: #4b505f;
border-color: #7f98ad;
}
.edit-diff > div > .ui.table {
@@ -1785,7 +1689,7 @@ a.ui.labels .label:hover {
}
.editor-toolbar i.separator {
border-right-color: #87ab63;
border-right-color: #7f98ad;
}
.repository .diff-detail-box {
@@ -1897,7 +1801,7 @@ a.ui.labels .label:hover {
.CodeMirror {
color: #9daccc;
background-color: #2e323e;
border-color: #4b505f;
border-color: #7f98ad;
border-top: 0;
div.CodeMirror-cursor {
@@ -2073,10 +1977,6 @@ footer .container .links > * {
color: #2a2e3a;
}
img[src$="/img/matrix.svg"] {
filter: invert(80%);
}
#git-graph-container.monochrome #rel-container .flow-group {
stroke: dimgrey;
fill: dimgrey;