30 Commits

Author SHA1 Message Date
Liangjun Song
b8da04078d Merge pull request #28851 from overleaf/ls-collect-business-details-in-stripe-elements
Collect business details in Stripe Elements

GitOrigin-RevId: 5a3affd1916b2ba659e007a7c25f64879899fd1a
2025-10-09 08:08:28 +00:00
Liangjun Song
a79171479b Merge pull request #28847 from overleaf/kh-verify-trials
[web] ensure trials work for Stripe custom checkout

GitOrigin-RevId: 9918e768502d50d61cf1a01dfc244fc57411ed35
2025-10-09 08:08:24 +00:00
Liangjun Song
fd9be80bb3 Merge pull request #28781 from overleaf/ls-handle-business-details-on-backend
Handle Stripe business details on backend

GitOrigin-RevId: 32608ba4913da493a09341b8880cd5b639066462
2025-10-09 08:08:19 +00:00
Alf Eaton
7e3b853fc1 Revert "Show tooltip immediately if a tooltip is already open (#28870)" (#28935)
This reverts commit 74950ea7e705acb8f42dea552b23ce93c66058c7.

GitOrigin-RevId: 346a947c420448becf294f0174937a5c256bf945
2025-10-09 08:08:14 +00:00
Jakob Ackermann
dd3c1b686e [document-updater] fix test after other async/await work (#28944)
GitOrigin-RevId: 7338f340924b3355dac39a86c40327a2964c3020
2025-10-09 08:08:02 +00:00
Andrew Rumble
3211c7c37a Make test admin users engineers
GitOrigin-RevId: 627b5b05eefdfa675937764b7c798e99ab6ef37e
2025-10-09 08:07:48 +00:00
Jakob Ackermann
d648c96603 [document-updater] migrate HistoryManager to async/await (#28789)
shouldFlushHistoryOps has a default value for 'threshold', which keeps
the exports simpler and still lets the unit tests override it.

GitOrigin-RevId: 1c6d4a2778052b5af40e2e338589a230ac2f4646
2025-10-09 08:07:36 +00:00
Jakob Ackermann
b0b9733a42 [document-updater] migrate ProjectFlusher to async/await (#28796)
GitOrigin-RevId: 24f61d6c0fab5d65b962cc7031ce0b8c84d5a915
2025-10-09 08:07:31 +00:00
ilkin-overleaf
c0a836082c Merge pull request #28865 from overleaf/ii-domain-capture-join-success-message-edit
[web] Domain capture join group message edit

GitOrigin-RevId: 8949dff2e1d95dd978ee5e04165ad4cefe51b088
2025-10-09 08:07:26 +00:00
ilkin-overleaf
9cc6fd9d82 Merge pull request #28833 from overleaf/ii-await-project-helper
[web] Promisify ProjectHelper

GitOrigin-RevId: a31457228c335ba1d70acdfa4671effce30c8014
2025-10-09 08:07:22 +00:00
ilkin-overleaf
22b38d02b0 Merge pull request #28808 from overleaf/ii-await-user-handler
[web] Promisify UserHandler

GitOrigin-RevId: 2daa6f74ec566851d208bf1b3d12d89ecf183383
2025-10-09 08:07:17 +00:00
Andrew Rumble
8b5c920cea Remove request from analytics service
GitOrigin-RevId: f4c5046095e7193449649f845560ecd477280cb5
2025-10-09 08:07:04 +00:00
Brian Gough
d24f37d3a4 Merge pull request #28880 from overleaf/bg-add-time-option-to-clsi
add latexmk `-time` option to clsi and record performance logs

GitOrigin-RevId: 467473859359913da73f83e10b63b45603ea175c
2025-10-09 08:06:12 +00:00
Jakob Ackermann
3913008e02 [web] convert GeoIpLookup to async/await (#28802)
GitOrigin-RevId: 38ad8af970a0674a514bf5ed0dacb8becd7c1f72
2025-10-09 08:06:00 +00:00
Mathias Jakobsen
c90e1cb82c Merge pull request #28889 from overleaf/dp-reference-manager-cta-tweaks
Tweaks to reference manager paywall modal

GitOrigin-RevId: df41862ab99642c9b4e51c06429382692f75212d
2025-10-09 08:05:54 +00:00
Alf Eaton
d3f05fda77 Show tooltip immediately if a tooltip is already open (#28870)
* Memoize delayProps
* Refactor Escape key handler
* Use useTooltipContext
* Remove delay: 0 from tooltips
* Only use isTooltipOpen if available
* Only show transition for initial tooltip

GitOrigin-RevId: 74950ea7e705acb8f42dea552b23ce93c66058c7
2025-10-09 08:05:49 +00:00
Mathias Jakobsen
3a8d383ac3 Merge pull request #28871 from overleaf/mj-recompile-setting-changed-add-missing
[web] Add missing events for changing compile-related settings

GitOrigin-RevId: b2ccb4c8f0f3920762d6e69ccb537ae9bedb0281
2025-10-09 08:05:36 +00:00
Mathias Jakobsen
f9d0f7e3ee Merge pull request #28893 from overleaf/mj-linter-brace-check
[web] Allow braces in documentclass options

GitOrigin-RevId: 9675d3fc760a3b7d402c5a9df57a0cf183a1e648
2025-10-09 08:05:31 +00:00
Miguel Serrano
b6d116e957 Merge pull request #28912 from overleaf/msm-fix-certs-build
Fix `certs` image build

GitOrigin-RevId: 02af78b29915276d55e86001a1cdc4703fc830b5
2025-10-09 08:05:26 +00:00
Miguel Serrano
9723800b68 Merge pull request #28868 from overleaf/msm-async-docstore-acceptance
[docstore] async/await migration in acceptance tests + `request` removal

GitOrigin-RevId: af1fe2b3de3d0b449ba3dad3555b309af3d35b62
2025-10-09 08:05:21 +00:00
Miguel Serrano
985a873971 Merge pull request #28779 from overleaf/msm-clsi-loadtest-async-await
[clsi] Replaced callbacks with async/await in `loadTests`

GitOrigin-RevId: 81e84dd77f71560f765625dfdbeafcf14312a3ff
2025-10-09 08:05:16 +00:00
Antoine Clausse
33e63d79fc Merge pull request #28584 from overleaf/ac-some-web-esm-migration-5
[web] Convert some Features files to ES modules (part 5)

GitOrigin-RevId: 0cad67f9afe0095e2b066bf2f4d3717c00540dab
2025-10-08 08:06:15 +00:00
Domagoj Kriskovic
267fc5393a Promisify ProjectHistoryClient, ProjectHistoryApp, SyncTests and SendingUpdatesTests (#28890)
GitOrigin-RevId: 7bf26c6ed1a172c6506449a821d4e43f424a72bd
2025-10-08 08:06:00 +00:00
Miguel Serrano
3a35b8680e Merge pull request #28554 from overleaf/msm-force-s3-lib-storage-uploads
[object-persistor] Use `@aws-sdk/lib-storage` for all uploads

GitOrigin-RevId: ab8e54a7bae843f9e6b05ed9cf936130a36b8c2f
2025-10-08 08:05:55 +00:00
Simon Gardner
4a5b29d166 Improve helpfulness of sso error messages
GitOrigin-RevId: 4459603cb1a84c21143e47eb817f9455aa9015e9
2025-10-08 08:05:50 +00:00
Miguel Serrano
f326f29a83 Merge pull request #28826 from overleaf/msm-bump-dockerode-4-0-9
[clsi] Bump dockerode 4.0.7 -> 4.0.9

GitOrigin-RevId: ec07c7c1d9e95f415b528a7b61b390f95014ea15
2025-10-08 08:05:45 +00:00
David
48cc1b1cd8 Merge pull request #28800 from overleaf/dp-promisify-user-getter
Promisify UserGetter and UserGetterTests

GitOrigin-RevId: 4a2613e632e6306751d19cb7160ee1f6c5c9e2f4
2025-10-08 08:05:41 +00:00
David
6715b0a6f8 Merge pull request #28801 from overleaf/dp-promisify-login-rate-limiter
Promisify LoginRateLimiter

GitOrigin-RevId: e7247258147635019fe229a6bc6aab3a6cc64f75
2025-10-08 08:05:36 +00:00
David
abd3e6e325 Merge pull request #28811 from overleaf/dp-promisify-learned-words-manager
Promisify LearnedWordsManager and LearnedWordsManagerTests

GitOrigin-RevId: f4e30eca0292409bcefe82b17facd1129fdc85ae
2025-10-08 08:05:31 +00:00
Tim Down
c104aa454e Merge pull request #28845 from overleaf/td-async-await-doc-updater-client
Convert DocUpdateClient in document-updater acceptance tests to async/await

GitOrigin-RevId: 8f2352119f8f1175c2703ed90dbbc483ed039e86
2025-10-08 08:05:26 +00:00
163 changed files with 8170 additions and 11603 deletions

View File

@@ -46,7 +46,7 @@ Uploads a stream to the backend.
- `key`: The key for the uploaded object
- `readStream`: The data stream to upload
- `opts` (optional):
- `sourceMd5`: The md5 hash of the source data, if known. The uploaded data will be compared against this and the operation will fail if it does not match. If omitted, the md5 is calculated as the data is uploaded instead, and verified against the backend.
- `sourceMd5`: The md5 hash of the source data, if known. The uploaded data will be compared against this and the operation will fail if it does not match. If omitted, the md5 is calculated as the data is uploaded instead, and verified against the backend. This is not supported in `S3Persistor` as it performs [its own integrity protections](https://aws.amazon.com/blogs/aws/introducing-default-data-integrity-protections-for-new-objects-in-amazon-s3/). Setting `sourceMd5` with `S3Persistor` will result in an error being thrown.
- `contentType`: The content type to write in the object metadata
- `contentEncoding`: The content encoding to write in the object metadata
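A minimal usage sketch of the options documented above (the `persistor` instance and names are illustrative, and the call is assumed to run inside an async function):

```js
const fs = require('node:fs')

const readStream = fs.createReadStream('output.pdf')
await persistor.sendStream('my-bucket', 'project/output.pdf', readStream, {
  contentType: 'application/pdf',
  contentEncoding: 'gzip',
  // sourceMd5 is accepted by the FS and GCS persistors, but S3Persistor
  // throws on it, relying on S3's own integrity protections instead
})
```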
@@ -266,7 +266,8 @@ For the `FS` persistor, the `bucketName` should be the full path to the folder o
- `s3.partSize`: The part size for S3 uploads. Defaults to 100 megabytes.
- `s3.httpOptions`: HTTP Options passed to the [`NodeHttpHandler` constructor](https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/Package/-smithy-node-http-handler/Class/NodeHttpHandler/)
- For backwards compatibility reasons, the `timeout` property that was passed to the [S3 constructor](https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#constructor-property) before migrating to AWS SDK v3 is now passed to the `NodeHttpHandler` constructor as `connectionTimeout`.
- `s3.maxRetries`: The number of times the S3 client will retry in case of an error
- `s3.maxRetries` (legacy): The number of times the S3 client will retry in case of an error
- `s3.maxAttempts`: The number of times the S3 client will attempt to perform the operation in case there are errors. Default value is 3.
- `s3.endpoint`: For testing - overrides the S3 endpoint to use a different service (e.g. a fake S3 server)
- `s3.pathStyle`: For testing - use old path-style URLs, for services that do not support subdomain-based access
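Putting the parameters above together, a hedged sketch of an `s3` settings block (values are illustrative, not defaults):

```js
const settings = {
  backend: 's3',
  s3: {
    partSize: 100 * 1024 * 1024, // the documented 100 MB default
    maxAttempts: 3, // preferred over the legacy maxRetries (maxRetries + 1)
    httpOptions: { connectionTimeout: 30000 }, // was `timeout` before SDK v3
    endpoint: 'http://localhost:9000', // testing only, e.g. a fake S3 server
    pathStyle: true, // testing only, for services without subdomain access
  },
}
```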
@@ -281,12 +282,6 @@ For the `FS` persistor, the `bucketName` should be the full path to the folder o
}
```
#### Notes
In order for server-side MD5 generation to work, uploads must be below the `partSize`. Otherwise a multipart upload will be used, and the S3 `eTag` which is used to retrieve the MD5 will not be the MD5 hash of the uploaded object. In these cases, we download the data and calculate the MD5 manually.
For verification during upload, we use S3's checksum mechanism to verify the integrity of the uploaded data, but when explicitly retrieving the md5 hash this will download the entire object if its size is above the part size.
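A sketch of the manual fallback described above: when a multipart upload means the eTag is not an MD5, the object has to be streamed back and hashed directly (the helper name is hypothetical):

```js
const crypto = require('node:crypto')

async function md5FromStream(stream) {
  const hash = crypto.createHash('md5')
  for await (const chunk of stream) {
    hash.update(chunk)
  }
  return hash.digest('hex')
}
```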
### GCS-specific parameters
GCS authentication is configured automatically via the local service account, or the `GOOGLE_APPLICATION_CREDENTIALS` environment variable.

View File

@@ -171,19 +171,7 @@ module.exports = class MigrationPersistor extends AbstractPersistor {
destKey
) {
try {
let sourceMd5
try {
sourceMd5 = await this.fallbackPersistor.getObjectMd5Hash(
sourceBucket,
sourceKey
)
} catch (err) {
Logger.warn(err, 'error getting md5 hash from fallback persistor')
}
await this.primaryPersistor.sendStream(destBucket, destKey, stream, {
sourceMd5,
})
await this.primaryPersistor.sendStream(destBucket, destKey, stream)
} catch (err) {
const error = new WriteError(
'unable to copy file to destination persistor',

View File

@@ -433,7 +433,6 @@ class CachedPerProjectEncryptedS3Persistor {
* @param {number} [opts.contentLength]
* @param {'*'} [opts.ifNoneMatch]
* @param {SSECOptions} [opts.ssecOptions]
* @param {string} [opts.sourceMd5]
* @return {Promise<void>}
*/
async sendStream(bucketName, path, sourceStream, opts = {}) {

View File

@@ -81,9 +81,6 @@ class S3Persistor extends AbstractPersistor {
/** @type {Map<string, S3Client>} */
#clients = new Map()
/** @type {Map<string, S3Client>} */
#noRetryClients = new Map()
constructor(settings = {}) {
super()
@@ -111,7 +108,6 @@ class S3Persistor extends AbstractPersistor {
* @param {number} [opts.contentLength]
* @param {'*'} [opts.ifNoneMatch]
* @param {SSECOptions} [opts.ssecOptions]
* @param {string} [opts.sourceMd5]
* @return {Promise<void>}
*/
async sendStream(bucketName, key, readStream, opts = {}) {
@@ -136,6 +132,12 @@ class S3Persistor extends AbstractPersistor {
uploadOptions.StorageClass = this.settings.storageClass[bucketName]
}
if ('sourceMd5' in opts) {
// we fail straight away to prevent the client from wasting CPU/IO precomputing the hash
throw new Error(
'sourceMd5 option is not supported, S3 provides its own integrity protection mechanism'
)
}
if (opts.contentType) {
uploadOptions.ContentType = opts.contentType
}
@@ -152,32 +154,12 @@ class S3Persistor extends AbstractPersistor {
Object.assign(uploadOptions, opts.ssecOptions.getPutOptions())
}
// if we have an md5 hash, pass this to S3 to verify the upload - otherwise
// we rely on the S3 client's checksum calculation to validate the upload
let computeChecksums = false
if (opts.sourceMd5) {
uploadOptions.ContentMD5 = PersistorHelper.hexToBase64(opts.sourceMd5)
} else {
computeChecksums = true
}
if (this.settings.disableMultiPartUpload) {
// retries are disabled for `PutObjectCommand` when using streams
// https://github.com/aws/aws-sdk-js-v3/issues/6770
const noRetry = true
await this._getClientForBucket(
bucketName,
computeChecksums,
noRetry
).send(new PutObjectCommand(uploadOptions))
} else {
const upload = new Upload({
client: this._getClientForBucket(bucketName, computeChecksums),
params: uploadOptions,
partSize: this.settings.partSize,
})
await upload.done()
}
const upload = new Upload({
client: this._getClientForBucket(bucketName),
params: uploadOptions,
partSize: this.settings.partSize,
})
await upload.done()
} catch (err) {
throw PersistorHelper.wrapError(
err,
@@ -555,34 +537,31 @@ class S3Persistor extends AbstractPersistor {
/**
* @param {string} bucket
* @param {boolean} computeChecksums
* @param {boolean} noRetries
* @return {S3Client}
* @private
*/
_getClientForBucket(bucket, computeChecksums = false, noRetries = false) {
_getClientForBucket(bucket) {
/** @type {import('@aws-sdk/client-s3').S3ClientConfig} */
const clientOptions = {}
const cacheKey = `${bucket}:${computeChecksums}`
if (computeChecksums) {
clientOptions.requestChecksumCalculation = 'WHEN_SUPPORTED'
clientOptions.responseChecksumValidation = 'WHEN_SUPPORTED'
}
const cacheKey = bucket
const clientMap = noRetries ? this.#noRetryClients : this.#clients
let client = clientMap.get(cacheKey)
let client = this.#clients.get(cacheKey)
if (!client) {
client = new S3Client(
this._buildClientOptions(
this.settings.bucketCreds?.[bucket],
clientOptions,
noRetries
clientOptions
)
)
clientMap.set(cacheKey, client)
this.#clients.set(cacheKey, client)
// Some third-party S3-compatible services (such as MinIO) do not support the default checksums
// for object integrity in `DeleteObjectsCommand`. We add a fallback for those cases.
// https://github.com/aws/aws-sdk-js-v3/blob/main/supplemental-docs/MD5_FALLBACK.md
addMd5Middleware(client)
const useMd5Fallback = process.env.DELETE_OBJECTS_MD5_FALLBACK === 'true'
if (useMd5Fallback) {
addMd5Middleware(client)
}
}
return client
@@ -591,11 +570,10 @@ class S3Persistor extends AbstractPersistor {
/**
* @param {Object} bucketCredentials
* @param {import('@aws-sdk/client-s3').S3ClientConfig} clientOptions
* @param {boolean} noRetries
* @return {import('@aws-sdk/client-s3').S3ClientConfig}
* @private
*/
_buildClientOptions(bucketCredentials, clientOptions, noRetries = false) {
_buildClientOptions(bucketCredentials, clientOptions) {
const options = clientOptions || {}
if (bucketCredentials) {
@@ -635,10 +613,6 @@ class S3Persistor extends AbstractPersistor {
options.maxAttempts = this.settings.maxRetries + 1
}
if (noRetries) {
options.maxAttempts = 1
}
const requestHandlerParams = this.settings.httpOptions || {}
if (sslEnabled && this.settings.ca) {

View File

@@ -197,19 +197,11 @@ describe('MigrationPersistorTests', function () {
expect(fallbackPersistor.getObjectStream).to.have.been.calledOnce
})
it('should get the md5 hash from the source', function () {
expect(fallbackPersistor.getObjectMd5Hash).to.have.been.calledWith(
fallbackBucket,
key
)
})
it('should send a stream to the primary', function () {
expect(primaryPersistor.sendStream).to.have.been.calledWithExactly(
bucket,
key,
sinon.match.instanceOf(Stream.PassThrough),
{ sourceMd5: md5 }
sinon.match.instanceOf(Stream.PassThrough)
)
})
@@ -476,19 +468,11 @@ describe('MigrationPersistorTests', function () {
).not.to.have.been.calledWithExactly(fallbackBucket, key)
})
it('should get the md5 hash from the source', function () {
expect(fallbackPersistor.getObjectMd5Hash).to.have.been.calledWith(
fallbackBucket,
key
)
})
it('should send the file to the primary', function () {
expect(primaryPersistor.sendStream).to.have.been.calledWithExactly(
bucket,
destKey,
sinon.match.instanceOf(Stream.PassThrough),
{ sourceMd5: md5 }
sinon.match.instanceOf(Stream.PassThrough)
)
})
})

View File

@@ -108,11 +108,5 @@ module.exports = function () {
this.payload = payload
}
},
PutObjectCommand: class PutObjectCommand {
constructor(payload) {
this.name = 'PutObjectCommand'
this.payload = payload
}
},
}
}

View File

@@ -549,27 +549,6 @@ describe('S3PersistorTests', function () {
})
})
describe('when a hash is supplied', function () {
beforeEach(async function () {
await S3Persistor.sendStream(bucket, key, ReadStream, {
sourceMd5: 'aaaaaaaabbbbbbbbaaaaaaaabbbbbbbb',
})
})
it('sends the hash in base64', function () {
expect(awsLibStorageUpload).to.have.been.calledWith({
client: S3.s3ClientStub,
params: {
Bucket: bucket,
Key: key,
Body: sinon.match.instanceOf(Transform),
ContentMD5: 'qqqqqru7u7uqqqqqu7u7uw==',
},
partSize: 100 * 1024 * 1024,
})
})
})
describe('when metadata is supplied', function () {
const contentType = 'text/csv'
const contentEncoding = 'gzip'
@@ -596,23 +575,23 @@ describe('S3PersistorTests', function () {
})
})
describe('when multipart upload is disabled', function () {
const contentType = 'text/csv'
const contentEncoding = 'gzip'
describe('with sourceMd5 option', function () {
let error
beforeEach(async function () {
S3.mockSend(S3.PutObjectCommand, {})
settings.disableMultiPartUpload = true
await S3Persistor.sendStream(bucket, key, ReadStream, {
contentType,
contentEncoding,
})
try {
await S3Persistor.sendStream(bucket, key, ReadStream, {
sourceMd5: 'ffffffff',
})
} catch (err) {
error = err
}
})
it('configures the options to not to retry requests', function () {
expect(S3.S3Client).to.have.been.calledWithMatch({
maxAttempts: 1,
})
it('should throw an error', function () {
expect(error.message).to.equal('upload to S3 failed')
expect(error.cause.message).to.equal(
'sourceMd5 option is not supported, S3 provides its own integrity protection mechanism'
)
})
})

package-lock.json (generated, 21 changed lines)
View File

@@ -27031,9 +27031,9 @@
}
},
"node_modules/dockerode": {
"version": "4.0.7",
"resolved": "https://registry.npmjs.org/dockerode/-/dockerode-4.0.7.tgz",
"integrity": "sha512-R+rgrSRTRdU5mH14PZTCPZtW/zw3HDWNTS/1ZAQpL/5Upe/ye5K9WQkIysu4wBoiMwKynsz0a8qWuGsHgEvSAA==",
"version": "4.0.9",
"resolved": "https://registry.npmjs.org/dockerode/-/dockerode-4.0.9.tgz",
"integrity": "sha512-iND4mcOWhPaCNh54WmK/KoSb35AFqPAUWFMffTQcp52uQt36b5uNwEJTSXntJZBbeGad72Crbi/hvDIv6us/6Q==",
"license": "Apache-2.0",
"dependencies": {
"@balena/dockerignore": "^1.0.2",
@@ -27041,7 +27041,7 @@
"@grpc/proto-loader": "^0.7.13",
"docker-modem": "^5.0.6",
"protobufjs": "^7.3.2",
"tar-fs": "~2.1.2",
"tar-fs": "^2.1.4",
"uuid": "^10.0.0"
},
"engines": {
@@ -27073,9 +27073,9 @@
}
},
"node_modules/dockerode/node_modules/tar-fs": {
"version": "2.1.3",
"resolved": "https://registry.npmjs.org/tar-fs/-/tar-fs-2.1.3.tgz",
"integrity": "sha512-090nwYJDmlhwFwEW3QQl+vaNnxsO2yVsd45eTKRBzSzu+hlb1w2K9inVq5b0ngXuLVqQ4ApvsUHHnu/zQNkWAg==",
"version": "2.1.4",
"resolved": "https://registry.npmjs.org/tar-fs/-/tar-fs-2.1.4.tgz",
"integrity": "sha512-mDAjwmZdh7LTT6pNleZ05Yt65HC3E+NiQzl672vQG38jIrehtJk/J3mNwIg+vShQPcLF/LV7CMnDW6vjj6sfYQ==",
"license": "MIT",
"dependencies": {
"chownr": "^1.1.1",
@@ -50770,7 +50770,6 @@
"pg-copy-streams": "^2.2.2",
"promptly": "^3.0.3",
"recurly": "^4.0.1",
"request": "^2.88.2",
"sequelize": "^6.31.0",
"yargs": "^17.0.0"
},
@@ -50844,7 +50843,7 @@
"async": "^3.2.5",
"body-parser": "^1.20.3",
"bunyan": "^1.8.15",
"dockerode": "^4.0.7",
"dockerode": "^4.0.9",
"express": "^4.21.2",
"lodash": "^4.17.21",
"p-limit": "^3.1.0",
@@ -51041,8 +51040,7 @@
"express": "^4.21.2",
"lodash": "^4.17.21",
"mongodb-legacy": "6.1.3",
"p-map": "^4.0.0",
"request": "^2.88.2"
"p-map": "^4.0.0"
},
"devDependencies": {
"@google-cloud/storage": "^6.10.1",
@@ -51098,6 +51096,7 @@
"services/document-updater": {
"name": "@overleaf/document-updater",
"dependencies": {
"@overleaf/fetch-utils": "*",
"@overleaf/logger": "*",
"@overleaf/metrics": "*",
"@overleaf/mongo-utils": "*",

View File

@@ -6,6 +6,9 @@ const ContentController = require('./app/js/ContentController')
const Settings = require('@overleaf/settings')
const logger = require('@overleaf/logger')
logger.initialize('clsi')
const LoggerSerializers = require('./app/js/LoggerSerializers')
logger.logger.serializers.clsiRequest = LoggerSerializers.clsiRequest
const Metrics = require('@overleaf/metrics')
const smokeTest = require('./test/smoke/js/SmokeTests')

View File

@@ -23,6 +23,7 @@ const {
downloadLatestCompileCache,
downloadOutputDotSynctexFromCompileCache,
} = require('./CLSICacheHandler')
const StatsManager = require('./StatsManager')
const { callbackifyMultiResult } = require('@overleaf/promise-utils')
const COMPILE_TIME_BUCKETS = [
@@ -191,6 +192,26 @@ async function doCompile(request, stats, timings) {
Metrics.inc(`compiles-with-image.${tag}`, 1, request.metricsOpts)
const compileName = getCompileName(request.project_id, request.user_id)
// Record latexmk -time stats for a subset of users
const recordPerformanceMetrics =
request.user_id != null &&
Settings.performanceLogSamplingPercentage > 0 &&
StatsManager.sampleByHash(
request.user_id,
Settings.performanceLogSamplingPercentage
)
// For selected users, define a `latexmk` property on the stats object
// to collect latexmk -time stats.
if (recordPerformanceMetrics) {
// To prevent any changes to the existing compile responses being sent
// to web, exclude latexmk stats from being exported by marking them
// non-enumerable. This prevents them being serialised by JSON.stringify().
Object.defineProperty(stats, 'latexmk', {
value: {},
enumerable: false,
})
}
try {
await LatexRunner.promises.runLatex(compileName, {
directory: compileDir,
@@ -257,11 +278,15 @@ async function doCompile(request, stats, timings) {
Metrics.inc('compiles-succeeded', 1, request.metricsOpts)
for (const metricKey in stats) {
const metricValue = stats[metricKey]
Metrics.count(metricKey, metricValue, 1, request.metricsOpts)
if (typeof metricValue === 'number') {
Metrics.count(metricKey, metricValue, 1, request.metricsOpts)
}
}
for (const metricKey in timings) {
const metricValue = timings[metricKey]
Metrics.timing(metricKey, metricValue, 1, request.metricsOpts)
if (typeof metricValue === 'number') {
Metrics.timing(metricKey, metricValue, 1, request.metricsOpts)
}
}
const loadavg = typeof os.loadavg === 'function' ? os.loadavg() : undefined
if (loadavg != null) {
@@ -320,6 +345,27 @@ async function doCompile(request, stats, timings) {
emitPdfStats(stats, timings, request)
}
// Record compile performance for a subset of users
if (recordPerformanceMetrics) {
logger.info(
{
userId: request.user_id,
projectId: request.project_id,
timeTaken: ts,
clsiRequest: request,
stats,
timings,
// explicitly include latexmk stats to bypass the non-enumerable property
latexmk: stats.latexmk,
loadavg1m: loadavg?.[0],
loadavg5m: loadavg?.[1],
loadavg15m: loadavg?.[2],
samplingPercentage: Settings.performanceLogSamplingPercentage,
},
'sampled performance log'
)
}
return { outputFiles, buildId }
}

View File

@@ -0,0 +1,109 @@
/**
* Converts a time string in seconds to an integer number of milliseconds.
*
* @param {string} timeStr - The time duration in seconds, represented as a string.
* @returns {number} The equivalent time in milliseconds, rounded down to the nearest integer.
*/
function convertToMs(timeStr) {
return Math.floor(parseFloat(timeStr) * 1000)
}
/* An array of metric parsers for `latexmk` time output (`-time` flag).
* Each entry is a tuple containing a metric name and a function to parse that
* metric from the `latexmk` log output.
*
* The parser functions generally take the log string as their first argument.
* Some may take additional arguments, such as the already computed stats,
* to derive new metrics.
*
* There are different formats of `latexmk` output depending on the version.
* The parsers attempt to handle these variations gracefully.
*/
const LATEX_MK_METRICS = [
// Extract individual latexmk rule times as an array of objects, each with 'rule'
// and 'time_ms' properties
[
'latexmk-rule-times',
s => {
// Each line looks like: 'pdflatex ... options ...': time = 12.34
// Take the command up to the first space as the rule name
const matches = s.matchAll(/^'([^' ]+).*': time = (\d+\.\d+)/gm)
return Array.from(matches, match => ({
rule: match[1],
time_ms: convertToMs(match[2]),
}))
},
],
// Extract a comma-separated signature of rule names from the rule times above
[
'latexmk-rule-signature',
(s, latexmkStats) => {
const times = latexmkStats['latexmk-rule-times']
if (!times) return null
// Example output: 'pdflatex,bibtex,pdflatex,pdflatex'
return times.map(t => t.rule).join(',')
},
],
// Total latexmk processing time, invoked processes time, and other time in ms
[
'latexmk-time',
s => {
const match = s.match(
/^Processing time = (\d+\.\d+), of which invoked processes = (\d+\.\d+), other = (\d+\.\d+)/m
)
if (match) {
return {
total: convertToMs(match[1]),
invoked: convertToMs(match[2]),
other: convertToMs(match[3]),
}
}
// Older format
const fallbackMatch = s.match(
/^Accumulated processing time = (\d+\.\d+)/m
)
if (fallbackMatch) {
return { total: convertToMs(fallbackMatch[1]) }
}
return null
},
],
// Total elapsed clock time in ms for latexmk
[
'latexmk-clock-time',
s => {
const match = s.match(/^Elapsed clock time = (\d+\.\d+)/m)
if (match) {
return convertToMs(match[1])
}
// not present in older versions
return null
},
],
// Number of rules run by latexmk
[
'latexmk-rules-run',
(s, latexmkStats) => {
const match = s.match(/^Number of rules run = (\d+)/m)
if (match) {
return parseInt(match[1], 10)
}
// Fallback: count number of entries in rule times if available
if (latexmkStats['latexmk-rule-times']) {
return latexmkStats['latexmk-rule-times'].length
}
return null
},
],
]
function addLatexMkMetrics(output, stats) {
for (const [stat, matcher] of LATEX_MK_METRICS) {
const match = matcher(output?.stdout || '', stats.latexmk)
if (match) {
stats.latexmk[stat] = match
}
}
}
module.exports = { addLatexMkMetrics }
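A quick sketch of the parsers in action on a captured fragment of `latexmk -time` output (callers opt in by pre-creating `stats.latexmk`, as the compile pipeline does):

```js
const { addLatexMkMetrics } = require('./LatexMetrics')

const stats = { latexmk: {} }
addLatexMkMetrics({ stdout: "'latex': time = 1.88\nNumber of rules run = 1" }, stats)

console.log(stats.latexmk['latexmk-rule-times']) // [{ rule: 'latex', time_ms: 1880 }]
console.log(stats.latexmk['latexmk-rules-run'])  // 1
```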

View File

@@ -3,15 +3,16 @@ const { promisify } = require('node:util')
const Settings = require('@overleaf/settings')
const logger = require('@overleaf/logger')
const CommandRunner = require('./CommandRunner')
const { addLatexMkMetrics } = require('./LatexMetrics')
const fs = require('node:fs')
const ProcessTable = {} // table of currently running jobs (pids or docker container names)
const TIME_V_METRICS = Object.entries({
'cpu-percent': /Percent of CPU this job got: (\d+)/m,
'cpu-time': /User time.*: (\d+.\d+)/m,
'sys-time': /System time.*: (\d+.\d+)/m,
})
const TIME_V_METRICS = [
['cpu-percent', /Percent of CPU this job got: (\d+)/m],
['cpu-time', /User time.*: (\d+.\d+)/m],
['sys-time', /System time.*: (\d+.\d+)/m],
]
const COMPILER_FLAGS = {
latex: '-pdfdvi',
@@ -75,6 +76,14 @@ function runLatex(projectId, options, callback) {
if (error) {
return callback(error)
}
if (stats.latexmk) {
try {
addLatexMkMetrics(output, stats)
} catch (err) {
logger.error({ err, projectId }, 'error adding latexmk metrics')
}
}
// number of latex runs and whether there were errors
const runs =
output?.stderr?.match(/^Run number \d+ of .*latex/gm)?.length || 0
const failed = output?.stdout?.match(/^Latexmk: Errors/m) != null ? 1 : 0
@@ -161,7 +170,8 @@ function _buildLatexCommand(mainFile, opts = {}) {
'-auxdir=$COMPILE_DIR',
'-outdir=$COMPILE_DIR',
'-synctex=1',
'-interaction=batchmode'
'-interaction=batchmode',
'-time'
)
// Stop on first error option

View File

@@ -0,0 +1,40 @@
const Path = require('node:path')
const CLSI_REQUEST_SERIALIZED_PROPERTIES = [
'compiler',
'compileFromClsiCache',
'populateClsiCache',
'enablePdfCaching',
'pdfCachingMinChunkSize',
'timeout',
'imageName',
'draft',
'stopOnFirstError',
'check',
'flags',
'compileGroup',
'syncType',
]
module.exports = {
/**
* Serializer for a CLSI request object.
* Only includes properties useful for logging.
* Excludes large, sensitive, or irrelevant properties (e.g., 'syncState', 'resources').
* To add more properties, update the allowed properties above.
*
* @param {object} clsiRequest - The original CLSI request object.
* @returns {object} A summarized version of the request object for logging.
*/
clsiRequest(clsiRequest) {
const summary = {}
for (const key of CLSI_REQUEST_SERIALIZED_PROPERTIES) {
if (key === 'imageName' && clsiRequest.imageName) {
summary.imageName = Path.basename(clsiRequest.imageName)
} else if (clsiRequest[key] !== undefined) {
summary[key] = clsiRequest[key]
}
}
return summary
},
}
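For illustration, roughly what the serializer yields (the request values are made up): `imageName` is reduced to its basename and non-allowlisted keys such as `resources` are dropped.

```js
const LoggerSerializers = require('./LoggerSerializers')

const summary = LoggerSerializers.clsiRequest({
  compiler: 'pdflatex',
  timeout: 60,
  imageName: 'quay.io/sharelatex/texlive:2024.1',
  resources: [], // not in the allowlist, so omitted from the summary
})
console.log(summary)
// { compiler: 'pdflatex', timeout: 60, imageName: 'texlive:2024.1' }
```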

View File

@@ -0,0 +1,24 @@
const crypto = require('node:crypto')
/**
* Consistently sample a keyspace with a given sample percentage.
* The same key will always produce a consistent percentile value that
* can be compared against the sample percentage.
* Example: if key is the userId and the samplePercentage is 10, then
* we see all the activity for the 10% of users who are selected.
*
* @param {string} key - The unique identifier to be hashed and checked.
* @param {number} samplePercentage - The percentage (0-100) of keys that should return true.
* @returns {boolean} - True if the key is within the sample, false otherwise.
*/
function sampleByHash(key, samplePercentage) {
if (samplePercentage <= 0) {
return false
}
const hash = crypto.createHash('md5').update(key).digest()
const hashValue = hash.readUInt32BE(0)
const percentile = Math.floor((hashValue / 0xffffffff) * 100)
return percentile < samplePercentage
}
module.exports = { sampleByHash }
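A usage sketch (the key is illustrative): the same key always lands on the same side of the threshold, so a sampled user keeps producing performance logs across compiles.

```js
const { sampleByHash } = require('./StatsManager')

if (sampleByHash('user-123', 10)) {
  // roughly 10% of keys land here, and always the same ones
}
```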

View File

@@ -81,6 +81,8 @@ module.exports = {
pdfCachingWorkerPoolBackLogLimit:
parseInt(process.env.PDF_CACHING_WORKER_POOL_BACK_LOG_LIMIT, 10) || 40,
compileConcurrencyLimit: isPreEmptible ? 32 : 64,
performanceLogSamplingPercentage:
parseFloat(process.env.CLSI_PERFORMANCE_LOG_SAMPLING) || 0,
}
if (process.env.ALLOWED_COMPILE_GROUPS) {

View File

@@ -28,7 +28,7 @@
"async": "^3.2.5",
"body-parser": "^1.20.3",
"bunyan": "^1.8.15",
"dockerode": "^4.0.7",
"dockerode": "^4.0.9",
"express": "^4.21.2",
"lodash": "^4.17.21",
"p-limit": "^3.1.0",

View File

@@ -67,4 +67,31 @@ Hello world
})
.should.equal(true)
})
it('should return only the expected keys for stats and timings', function () {
const { stats, timings } = this.body.compile
// Note: chai's all.keys assertion rejects extra keys
stats.should.have.all.keys(
'isInitialCompile',
'latexmk-errors',
'latex-runs',
'latex-runs-with-errors',
'latex-runs-2',
'latex-runs-with-errors-2',
'pdf-caching-total-ranges-size',
'pdf-caching-reclaimed-space',
'pdf-caching-new-ranges-size',
'pdf-caching-n-ranges',
'pdf-caching-n-new-ranges',
'pdf-size'
)
timings.should.have.all.keys(
'sync',
'compile',
'output',
'compileE2E',
'compute-pdf-caching',
'pdf-caching-overhead-delete-stale-hashes'
)
})
})

View File

@@ -1,12 +1,4 @@
// TODO: This file was created by bulk-decaffeinate.
// Fix any style issues and re-enable lint.
/*
* decaffeinate suggestions:
* DS102: Remove unnecessary code created because of implicit returns
* DS207: Consider shorter variations of null checks
* Full docs: https://github.com/decaffeinate/decaffeinate/blob/master/docs/suggestions.md
*/
const request = require('request')
const { fetchNothing } = require('@overleaf/fetch-utils')
const Settings = require('@overleaf/settings')
const async = require('async')
const fs = require('node:fs')
@@ -27,7 +19,7 @@ const getAverageCompileTime = function () {
return totalTime / compileTimes.length
}
const makeRequest = function (compileNumber, callback) {
const makeRequest = async function (compileNumber) {
let bulkBodyCount = 7
let bodyContent = ''
while (--bulkBodyCount) {
@@ -35,67 +27,64 @@ const makeRequest = function (compileNumber, callback) {
}
const startTime = new Date()
return request.post(
{
url: buildUrl(`project/loadcompile-${compileNumber}/compile`),
json: {
compile: {
resources: [
{
path: 'main.tex',
content: `\
try {
await fetchNothing(
buildUrl(`project/loadcompile-${compileNumber}/compile`),
{
method: 'POST',
json: {
compile: {
resources: [
{
path: 'main.tex',
content: `\
\\documentclass{article}
\\begin{document}
${bodyContent}
\\end{document}\
`,
},
],
},
],
},
},
},
},
(err, response, body) => {
if (err != null) {
failedCount++
return callback(new Error(`compile ${compileNumber} failed`))
}
if (response.statusCode !== 200) {
failedCount++
return callback(new Error(`compile ${compileNumber} failed`))
}
const totalTime = new Date() - startTime
console.log(totalTime + 'ms')
compileTimes.push(totalTime)
return callback(err)
}
)
)
const totalTime = new Date() - startTime
console.log(totalTime + 'ms')
compileTimes.push(totalTime)
} catch (error) {
console.log({ error })
failedCount++
throw new Error(`compile ${compileNumber} failed`)
}
}
const jobs = _.map(
__range__(1, totalCompiles, true),
i => cb => makeRequest(i, cb)
)
const jobs = []
for (let i = 0; i < totalCompiles; i++) {
jobs.push(() => makeRequest(i))
}
const startTime = new Date()
async.parallelLimit(jobs, concurentCompiles, err => {
if (err != null) {
console.error(err)
}
const runJob = (job, _, cb) =>
job()
.then(() => cb())
.catch(err => cb(err))
async function run() {
const startTime = new Date()
await async.eachOfLimit(jobs, concurentCompiles, runJob)
console.log(`total time taken = ${(new Date() - startTime) / 1000}s`)
console.log(`total compiles = ${totalCompiles}`)
console.log(`concurent compiles = ${concurentCompiles}`)
console.log(`average time = ${getAverageCompileTime() / 1000}s`)
console.log(`max time = ${_.max(compileTimes) / 1000}s`)
console.log(`min time = ${_.min(compileTimes) / 1000}s`)
return console.log(`total failures = ${failedCount}`)
})
function __range__(left, right, inclusive) {
const range = []
const ascending = left < right
const end = !inclusive ? right : ascending ? right + 1 : right - 1
for (let i = left; ascending ? i < end : i > end; ascending ? i++ : i--) {
range.push(i)
}
return range
console.log(`total failures = ${failedCount}`)
}
run()
.then(() => process.exit(0))
.catch(error => {
console.error(error)
process.exit(1)
})

View File

@@ -1,6 +1,8 @@
const SandboxedModule = require('sandboxed-module')
const sinon = require('sinon')
const { expect } = require('chai')
const fs = require('node:fs')
const path = require('node:path')
const MODULE_PATH = require('node:path').join(
__dirname,
@@ -87,6 +89,7 @@ describe('LatexRunner', function () {
'-outdir=$COMPILE_DIR',
'-synctex=1',
'-interaction=batchmode',
'-time',
'-f',
'-pdf',
'$COMPILE_DIR/main-file.tex',
@@ -140,6 +143,7 @@ describe('LatexRunner', function () {
'-outdir=$COMPILE_DIR',
'-synctex=1',
'-interaction=batchmode',
'-time',
'-f',
'-lualatex',
'$COMPILE_DIR/main-file.tex',
@@ -209,11 +213,75 @@ describe('LatexRunner', function () {
'-outdir=$COMPILE_DIR',
'-synctex=1',
'-interaction=batchmode',
'-time',
'-halt-on-error',
'-pdf',
'$COMPILE_DIR/main-file.tex',
])
})
})
describe('with old latexmk timing output', function () {
beforeEach(function (done) {
this.commandRunnerOutput.stdout = fs.readFileSync(
path.join(__dirname, 'fixtures', 'latexmk1.txt'),
'utf-8'
)
// pass in the `latexmk` property to signal that we want to receive parsed stats
this.stats.latexmk = {}
this.call(done)
})
it('should parse latexmk 4.52c (2017) timing information', function () {
expect(this.stats.latexmk).to.deep.equal({
'latexmk-rule-times': [
{ rule: 'makeindex', time_ms: 30 },
{ rule: 'bibtex', time_ms: 40 },
{ rule: 'latex', time_ms: 690 },
{ rule: 'makeindex', time_ms: 40 },
{ rule: 'bibtex', time_ms: 39 },
{ rule: 'latex', time_ms: 750 },
{ rule: 'makeindex', time_ms: 39 },
{ rule: 'bibtex', time_ms: 20 },
{ rule: 'latex', time_ms: 770 },
],
'latexmk-rule-signature':
'makeindex,bibtex,latex,makeindex,bibtex,latex,makeindex,bibtex,latex',
'latexmk-rules-run': 9,
'latexmk-time': { total: 2930 },
})
})
})
describe('with modern latexmk timing output', function () {
beforeEach(function (done) {
this.commandRunnerOutput.stdout = fs.readFileSync(
path.join(__dirname, 'fixtures', 'latexmk2.txt'),
'utf-8'
)
// pass in the `latexmk` property to signal that we want to receive parsed stats
this.stats.latexmk = {}
this.call(done)
})
it('should parse latexmk 4.83 (2024) timing information', function () {
expect(this.stats.latexmk).to.deep.equal({
'latexmk-rule-times': [
{ rule: 'latex', time_ms: 1880 },
{ rule: 'makeindex', time_ms: 50 },
{ rule: 'bibtex', time_ms: 50 },
{ rule: 'latex', time_ms: 2180 },
],
'latexmk-rule-signature': 'latex,makeindex,bibtex,latex',
'latexmk-time': {
total: 4770,
invoked: 4160,
other: 610,
},
'latexmk-clock-time': 4870,
'latexmk-rules-run': 4,
})
})
})
})
})

View File

@@ -0,0 +1,85 @@
const { expect } = require('chai')
const { sampleByHash } = require('../../../app/js/StatsManager')
describe('StatsManager', function () {
describe('sampleByHash', function () {
it('should always return false for a sample percentage of 0', function () {
for (let i = 0; i < 100; i++) {
const key = `test-key-${i}`
expect(sampleByHash(key, 0), `key ${key} should be false`).to.be.false
}
})
it('should always return false for a negative sample percentage', function () {
for (let i = 0; i < 100; i++) {
const key = `test-key-${i}`
expect(sampleByHash(key, -10), `key ${key} should be false`).to.be.false
}
})
it('should always return true for a sample percentage of 100', function () {
// This isn't strictly true: if the hash is exactly 0xffffffff, then the percentile is 100
// and 100 < 100 is false. But the chances of that are 1 in 4 billion.
for (let i = 0; i < 100; i++) {
const key = `test-key-${i}`
expect(sampleByHash(key, 100), `key ${key} should be true`).to.be.true
}
})
it('should return the expected number of results for a sample percentage of 75', function () {
// This isn't strictly true: if the hash is exactly 0xffffffff, then the percentile is 100
// and 100 < 100 is false. But the chances of that are 1 in 4 billion.
let count = 0
for (let i = 0; i < 100; i++) {
const key = `test-key-${i}`
count += sampleByHash(key, 75) ? 1 : 0
}
// Actual result is 74; it's deterministic, but the test allows the algorithm to change
expect(count).to.be.within(70, 80)
})
it('should return true when the hash is within the sample percentage', function () {
// The MD5 hash of 'test-key-in' gives a percentile of 13
const key = 'test-key-in'
const percentage = 40
expect(sampleByHash(key, percentage)).to.be.true
})
it('should return false when the hash is outside the sample percentage', function () {
// The MD5 hash of 'test-key-outer' gives a percentile of 47
const key = 'test-key-outer'
const percentage = 40
expect(sampleByHash(key, percentage)).to.be.false
})
it('should produce consistent results for the same key', function () {
const key = 'consistent-key'
const percentage = 50
const result1 = sampleByHash(key, percentage)
const result2 = sampleByHash(key, percentage)
expect(result1).to.equal(result2)
})
it('should handle different keys correctly', function () {
// MD5('key1') => percentile 76
// MD5('key2') => percentile 47
expect(sampleByHash('key1', 80)).to.be.true
expect(sampleByHash('key1', 70)).to.be.false
expect(sampleByHash('key2', 50)).to.be.true
expect(sampleByHash('key2', 40)).to.be.false
})
it('should be monotonic with respect to percentage', function () {
const key = 'test-key'
const percentile = 32
for (let i = 0; i <= 100; i++) {
const result = sampleByHash(key, i)
if (i <= percentile) {
expect(result, `percentage ${i} should be false`).to.be.false
} else {
expect(result, `percentage ${i} should be true`).to.be.true
}
}
})
})
})

View File

@@ -0,0 +1,12 @@
Latexmk: Found bibliography file(s) [tex.bib]
Latexmk: All targets (TeXbyTopic.dvi) are up-to-date
'makeindex -o "TeXbyTopic.ind" "TeXbyTopic.idx"': time = 0.03
'bibtex "TeXbyTopic"': time = 0.04
'latex -recorder "TeXbyTopic.tex"': time = 0.69
'makeindex -o "TeXbyTopic.ind" "TeXbyTopic.idx"': time = 0.04
'bibtex "TeXbyTopic"': time = 0.0399999999999998
'latex -recorder "TeXbyTopic.tex"': time = 0.75
'makeindex -o "TeXbyTopic.ind" "TeXbyTopic.idx"': time = 0.0399999999999996
'bibtex "TeXbyTopic"': time = 0.02
'latex -recorder "TeXbyTopic.tex"': time = 0.77
Accumulated processing time = 2.93

View File

@@ -0,0 +1,7 @@
'latex': time = 1.88
'makeindex TeXbyTopic.idx': time = 0.05
'bibtex TeXbyTopic': time = 0.05
'latex': time = 2.18
Processing time = 4.77, of which invoked processes = 4.16, other = 0.61.
Elapsed clock time = 4.87.
Number of rules run = 4

View File

@@ -74,11 +74,15 @@ async function archiveDoc(projectId, docId) {
throw error
}
const md5 = crypto.createHash('md5').update(json).digest('hex')
const stream = new ReadableString(json)
await PersistorManager.sendStream(Settings.docstore.bucket, key, stream, {
sourceMd5: md5,
})
if (Settings.docstore.backend === 's3') {
await PersistorManager.sendStream(Settings.docstore.bucket, key, stream)
} else {
await PersistorManager.sendStream(Settings.docstore.bucket, key, stream, {
sourceMd5: crypto.createHash('md5').update(json).digest('hex'),
})
}
await MongoManager.markDocAsArchived(projectId, docId, doc.rev)
}
@@ -112,25 +116,31 @@ async function unArchiveAllDocs(projectId) {
// get the doc from the PersistorManager without storing it in mongo
async function getDoc(projectId, docId) {
const key = `${projectId}/${docId}`
const sourceMd5 = await PersistorManager.getObjectMd5Hash(
Settings.docstore.bucket,
key
)
const stream = await PersistorManager.getObjectStream(
Settings.docstore.bucket,
key
)
stream.resume()
const buffer = await streamToBuffer(projectId, docId, stream)
const md5 = crypto.createHash('md5').update(buffer).digest('hex')
if (sourceMd5 !== md5) {
throw new Errors.Md5MismatchError('md5 mismatch when downloading doc', {
key,
sourceMd5,
md5,
})
}
let buffer
if (Settings.docstore.backend === 's3') {
stream.resume()
buffer = await streamToBuffer(projectId, docId, stream)
} else {
const sourceMd5 = await PersistorManager.getObjectMd5Hash(
Settings.docstore.bucket,
key
)
stream.resume()
buffer = await streamToBuffer(projectId, docId, stream)
const md5 = crypto.createHash('md5').update(buffer).digest('hex')
if (sourceMd5 !== md5) {
throw new Errors.Md5MismatchError('md5 mismatch when downloading doc', {
key,
sourceMd5,
md5,
})
}
}
return _deserializeArchivedDoc(buffer)
}

View File

@@ -33,8 +33,7 @@
"express": "^4.21.2",
"lodash": "^4.17.21",
"mongodb-legacy": "6.1.3",
"p-map": "^4.0.0",
"request": "^2.88.2"
"p-map": "^4.0.0"
},
"devDependencies": {
"@google-cloud/storage": "^6.10.1",

View File

@@ -1,16 +1,3 @@
/* eslint-disable
no-unused-vars,
*/
// TODO: This file was created by bulk-decaffeinate.
// Fix any style issues and re-enable lint.
/*
* decaffeinate suggestions:
* DS101: Remove unnecessary use of Array.from
* DS102: Remove unnecessary code created because of implicit returns
* DS207: Consider shorter variations of null checks
* Full docs: https://github.com/decaffeinate/decaffeinate/blob/master/docs/suggestions.md
*/
const Settings = require('@overleaf/settings')
const { expect } = require('chai')
const { db, ObjectId } = require('../../../app/js/mongodb')
@@ -20,17 +7,16 @@ const DocstoreClient = require('./helpers/DocstoreClient')
const { Storage } = require('@google-cloud/storage')
const Persistor = require('../../../app/js/PersistorManager')
const { ReadableString } = require('@overleaf/stream-utils')
const { callbackify } = require('node:util')
function uploadContent(path, json, callback) {
async function uploadContent(path, json) {
const stream = new ReadableString(JSON.stringify(json))
Persistor.sendStream(Settings.docstore.bucket, path, stream)
.then(() => callback())
.catch(callback)
await Persistor.sendStream(Settings.docstore.bucket, path, stream)
}
describe('Archiving', function () {
before(function (done) {
return DocstoreApp.ensureRunning(done)
before(async function () {
await DocstoreApp.ensureRunning()
})
before(async function () {
@@ -65,118 +51,100 @@ describe('Archiving', function () {
version: 4,
},
]
const jobs = Array.from(this.docs).map(doc =>
(doc => {
return callback => {
return DocstoreClient.createDoc(
this.project_id,
doc._id,
doc.lines,
doc.version,
doc.ranges,
callback
)
}
const jobs = this.docs.map(doc =>
(doc => callback => {
callbackify(DocstoreClient.createDoc)(
this.project_id,
doc._id,
doc.lines,
doc.version,
doc.ranges,
callback
)
})(doc)
)
return async.series(jobs, error => {
async.series(jobs, error => {
if (error != null) {
throw error
}
return DocstoreClient.archiveAllDoc(this.project_id, (error, res) => {
if (error) return done(error)
this.res = res
return done()
})
DocstoreClient.archiveAllDoc(this.project_id)
.then(res => {
this.res = res
done()
})
.catch(done)
})
})
it('should archive all the docs', function (done) {
this.res.statusCode.should.equal(204)
return done()
it('should archive all the docs', function () {
this.res.status.should.equal(204)
})
it('should set inS3 and unset lines and ranges in each doc', function (done) {
const jobs = Array.from(this.docs).map(doc =>
(doc => {
return callback => {
return db.docs.findOne({ _id: doc._id }, (error, doc) => {
const jobs = this.docs.map(doc =>
(
doc => callback =>
db.docs.findOne({ _id: doc._id }, (error, doc) => {
if (error) return callback(error)
expect(doc.lines).not.to.exist
expect(doc.ranges).not.to.exist
doc.inS3.should.equal(true)
return callback()
callback()
})
}
})(doc)
)(doc)
)
return async.series(jobs, done)
async.series(jobs, done)
})
it('should set the docs in s3 correctly', function (done) {
const jobs = Array.from(this.docs).map(doc =>
(doc => {
return callback => {
return DocstoreClient.getS3Doc(
this.project_id,
doc._id,
(error, s3Doc) => {
if (error) return callback(error)
const jobs = this.docs.map(doc =>
(
doc => callback =>
DocstoreClient.getS3Doc(this.project_id, doc._id)
.then(s3Doc => {
s3Doc.lines.should.deep.equal(doc.lines)
s3Doc.ranges.should.deep.equal(doc.ranges)
callback()
}
)
}
})(doc)
})
.catch(callback)
)(doc)
)
return async.series(jobs, done)
async.series(jobs, done)
})
return describe('after unarchiving from a request for the project', function () {
before(function (done) {
return DocstoreClient.getAllDocs(
this.project_id,
(error, res, fetchedDocs) => {
this.fetched_docs = fetchedDocs
if (error != null) {
throw error
}
return done()
}
)
describe('after unarchiving from a request for the project', function () {
before(async function () {
this.fetched_docs = await DocstoreClient.getAllDocs(this.project_id)
})
it('should return the docs', function (done) {
it('should return the docs', function () {
for (let i = 0; i < this.fetched_docs.length; i++) {
const doc = this.fetched_docs[i]
doc.lines.should.deep.equal(this.docs[i].lines)
}
return done()
})
return it('should restore the docs to mongo', function (done) {
const jobs = Array.from(this.docs).map((doc, i) =>
((doc, i) => {
return callback => {
return db.docs.findOne({ _id: doc._id }, (error, doc) => {
it('should restore the docs to mongo', function (done) {
const jobs = this.docs.map((doc, i) =>
(
(doc, i) => callback =>
db.docs.findOne({ _id: doc._id }, (error, doc) => {
if (error) return callback(error)
doc.lines.should.deep.equal(this.docs[i].lines)
doc.ranges.should.deep.equal(this.docs[i].ranges)
expect(doc.inS3).not.to.exist
return callback()
callback()
})
}
})(doc, i)
)(doc, i)
)
return async.series(jobs, done)
async.series(jobs, done)
})
})
})
describe('a deleted doc', function () {
beforeEach(function (done) {
beforeEach(async function () {
this.project_id = new ObjectId()
this.doc = {
_id: new ObjectId(),
@@ -184,102 +152,51 @@ describe('Archiving', function () {
ranges: {},
version: 2,
}
return DocstoreClient.createDoc(
await DocstoreClient.createDoc(
this.project_id,
this.doc._id,
this.doc.lines,
this.doc.version,
this.doc.ranges,
error => {
if (error != null) {
throw error
}
return DocstoreClient.deleteDoc(
this.project_id,
this.doc._id,
error => {
if (error != null) {
throw error
}
return DocstoreClient.archiveAllDoc(
this.project_id,
(error, res) => {
this.res = res
if (error != null) {
throw error
}
return done()
}
)
}
)
}
this.doc.ranges
)
await DocstoreClient.deleteDoc(this.project_id, this.doc._id)
this.res = await DocstoreClient.archiveAllDoc(this.project_id)
})
it('should successfully archive the docs', function (done) {
this.res.statusCode.should.equal(204)
return done()
it('should successfully archive the docs', function () {
this.res.status.should.equal(204)
})
it('should set inS3 and unset lines and ranges in each doc', function (done) {
return db.docs.findOne({ _id: this.doc._id }, (error, doc) => {
if (error != null) {
throw error
}
expect(doc.lines).not.to.exist
expect(doc.ranges).not.to.exist
doc.inS3.should.equal(true)
doc.deleted.should.equal(true)
return done()
})
it('should set inS3 and unset lines and ranges in each doc', async function () {
const doc = await db.docs.findOne({ _id: this.doc._id })
expect(doc.lines).not.to.exist
expect(doc.ranges).not.to.exist
doc.inS3.should.equal(true)
doc.deleted.should.equal(true)
})
it('should set the doc in s3 correctly', function (done) {
return DocstoreClient.getS3Doc(
this.project_id,
this.doc._id,
(error, s3Doc) => {
if (error != null) {
throw error
}
s3Doc.lines.should.deep.equal(this.doc.lines)
s3Doc.ranges.should.deep.equal(this.doc.ranges)
return done()
}
)
it('should set the doc in s3 correctly', async function () {
const s3Doc = await DocstoreClient.getS3Doc(this.project_id, this.doc._id)
s3Doc.lines.should.deep.equal(this.doc.lines)
s3Doc.ranges.should.deep.equal(this.doc.ranges)
})
describe('after unarchiving from a request for the project', function () {
beforeEach(function (done) {
return DocstoreClient.getAllDocs(
this.project_id,
(error, res, fetchedDocs) => {
this.fetched_docs = fetchedDocs
if (error != null) {
throw error
}
return done()
}
)
beforeEach(async function () {
this.fetched_docs = await DocstoreClient.getAllDocs(this.project_id)
})
it('should not include the deleted doc', function (done) {
it('should not include the deleted doc', function () {
this.fetched_docs.length.should.equal(0)
return done()
})
return it('should restore the doc to mongo', function (done) {
return db.docs.findOne({ _id: this.doc._id }, (error, doc) => {
if (error != null) {
throw error
}
doc.lines.should.deep.equal(this.doc.lines)
doc.ranges.should.deep.equal(this.doc.ranges)
expect(doc.inS3).not.to.exist
doc.deleted.should.equal(true)
return done()
})
it('should restore the doc to mongo', async function () {
const doc = await db.docs.findOne({ _id: this.doc._id })
doc.lines.should.deep.equal(this.doc.lines)
doc.ranges.should.deep.equal(this.doc.ranges)
expect(doc.inS3).not.to.exist
doc.deleted.should.equal(true)
})
})
@@ -296,42 +213,27 @@ describe('Archiving', function () {
})
describe('after unarchiving from a request for the project', function () {
beforeEach(function (done) {
DocstoreClient.getAllDocs(
this.project_id,
(error, res, fetchedDocs) => {
this.fetched_docs = fetchedDocs
if (error) {
return done(error)
}
done()
}
)
beforeEach(async function () {
this.fetched_docs = await DocstoreClient.getAllDocs(this.project_id)
})
it('should not include the deleted doc', function (done) {
it('should not include the deleted doc', function () {
this.fetched_docs.length.should.equal(0)
done()
})
it('should not have restored the deleted doc to mongo', function (done) {
db.docs.findOne({ _id: this.doc._id }, (error, doc) => {
if (error) {
return done(error)
}
expect(doc.lines).to.not.exist
expect(doc.ranges).to.not.exist
expect(doc.inS3).to.equal(true)
expect(doc.deleted).to.equal(true)
done()
})
it('should not have restored the deleted doc to mongo', async function () {
const doc = await db.docs.findOne({ _id: this.doc._id })
expect(doc.lines).to.not.exist
expect(doc.ranges).to.not.exist
expect(doc.inS3).to.equal(true)
expect(doc.deleted).to.equal(true)
})
})
})
})
describe('archiving a single doc', function () {
before(function (done) {
before(async function () {
this.project_id = new ObjectId()
this.timeout(1000 * 30)
this.doc = {
@@ -340,62 +242,36 @@ describe('Archiving', function () {
ranges: {},
version: 2,
}
DocstoreClient.createDoc(
await DocstoreClient.createDoc(
this.project_id,
this.doc._id,
this.doc.lines,
this.doc.version,
this.doc.ranges,
error => {
if (error) {
return done(error)
}
DocstoreClient.archiveDoc(
this.project_id,
this.doc._id,
(error, res) => {
this.res = res
if (error) {
return done(error)
}
done()
}
)
}
this.doc.ranges
)
this.res = await DocstoreClient.archiveDoc(this.project_id, this.doc._id)
})
it('should successfully archive the doc', function (done) {
this.res.statusCode.should.equal(204)
done()
it('should successfully archive the doc', function () {
this.res.status.should.equal(204)
})
it('should set inS3 and unset lines and ranges in the doc', function (done) {
db.docs.findOne({ _id: this.doc._id }, (error, doc) => {
if (error) {
return done(error)
}
expect(doc.lines).not.to.exist
expect(doc.ranges).not.to.exist
doc.inS3.should.equal(true)
done()
})
it('should set inS3 and unset lines and ranges in the doc', async function () {
const doc = await db.docs.findOne({ _id: this.doc._id })
expect(doc.lines).not.to.exist
expect(doc.ranges).not.to.exist
doc.inS3.should.equal(true)
})
it('should set the doc in s3 correctly', function (done) {
DocstoreClient.getS3Doc(this.project_id, this.doc._id, (error, s3Doc) => {
if (error) {
return done(error)
}
s3Doc.lines.should.deep.equal(this.doc.lines)
s3Doc.ranges.should.deep.equal(this.doc.ranges)
done()
})
it('should set the doc in s3 correctly', async function () {
const s3Doc = await DocstoreClient.getS3Doc(this.project_id, this.doc._id)
s3Doc.lines.should.deep.equal(this.doc.lines)
s3Doc.ranges.should.deep.equal(this.doc.ranges)
})
})
describe('a doc with large lines', function () {
before(function (done) {
before(async function () {
this.project_id = new ObjectId()
this.timeout(1000 * 30)
const quarterMegInBytes = 250000
@@ -408,89 +284,49 @@ describe('Archiving', function () {
ranges: {},
version: 2,
}
return DocstoreClient.createDoc(
await DocstoreClient.createDoc(
this.project_id,
this.doc._id,
this.doc.lines,
this.doc.version,
this.doc.ranges,
error => {
if (error != null) {
throw error
}
return DocstoreClient.archiveAllDoc(this.project_id, (error, res) => {
this.res = res
if (error != null) {
throw error
}
return done()
})
}
this.doc.ranges
)
this.res = await DocstoreClient.archiveAllDoc(this.project_id)
})
it('should successfully archive the docs', function (done) {
this.res.statusCode.should.equal(204)
return done()
it('should successfully archive the docs', function () {
this.res.status.should.equal(204)
})
it('should set inS3 and unset lines and ranges in each doc', function (done) {
return db.docs.findOne({ _id: this.doc._id }, (error, doc) => {
if (error != null) {
throw error
}
expect(doc.lines).not.to.exist
expect(doc.ranges).not.to.exist
doc.inS3.should.equal(true)
return done()
})
it('should set inS3 and unset lines and ranges in each doc', async function () {
const doc = await db.docs.findOne({ _id: this.doc._id })
expect(doc.lines).not.to.exist
expect(doc.ranges).not.to.exist
doc.inS3.should.equal(true)
})
it('should set the doc in s3 correctly', function (done) {
return DocstoreClient.getS3Doc(
this.project_id,
this.doc._id,
(error, s3Doc) => {
if (error != null) {
throw error
}
s3Doc.lines.should.deep.equal(this.doc.lines)
s3Doc.ranges.should.deep.equal(this.doc.ranges)
return done()
}
)
it('should set the doc in s3 correctly', async function () {
const s3Doc = await DocstoreClient.getS3Doc(this.project_id, this.doc._id)
s3Doc.lines.should.deep.equal(this.doc.lines)
s3Doc.ranges.should.deep.equal(this.doc.ranges)
})
return describe('after unarchiving from a request for the project', function () {
before(function (done) {
return DocstoreClient.getAllDocs(
this.project_id,
(error, res, fetchedDocs) => {
this.fetched_docs = fetchedDocs
if (error != null) {
throw error
}
return done()
}
)
describe('after unarchiving from a request for the project', function () {
before(async function () {
this.fetched_docs = await DocstoreClient.getAllDocs(this.project_id)
})
return it('should restore the doc to mongo', function (done) {
return db.docs.findOne({ _id: this.doc._id }, (error, doc) => {
if (error != null) {
throw error
}
doc.lines.should.deep.equal(this.doc.lines)
doc.ranges.should.deep.equal(this.doc.ranges)
expect(doc.inS3).not.to.exist
return done()
})
it('should restore the doc to mongo', async function () {
const doc = await db.docs.findOne({ _id: this.doc._id })
doc.lines.should.deep.equal(this.doc.lines)
doc.ranges.should.deep.equal(this.doc.ranges)
expect(doc.inS3).not.to.exist
})
})
})
describe('a doc with naughty strings', function () {
before(function (done) {
before(async function () {
this.project_id = new ObjectId()
this.doc = {
_id: new ObjectId(),
@@ -882,89 +718,48 @@ describe('Archiving', function () {
ranges: {},
version: 2,
}
await DocstoreClient.createDoc(
this.project_id,
this.doc._id,
this.doc.lines,
this.doc.version,
this.doc.ranges
)
this.res = await DocstoreClient.archiveAllDoc(this.project_id)
})
it('should successfully archive the docs', function () {
this.res.status.should.equal(204)
})
it('should set inS3 and unset lines and ranges in each doc', async function () {
const doc = await db.docs.findOne({ _id: this.doc._id })
expect(doc.lines).not.to.exist
expect(doc.ranges).not.to.exist
doc.inS3.should.equal(true)
})
it('should set the doc in s3 correctly', async function () {
const s3Doc = await DocstoreClient.getS3Doc(this.project_id, this.doc._id)
s3Doc.lines.should.deep.equal(this.doc.lines)
s3Doc.ranges.should.deep.equal(this.doc.ranges)
})
describe('after unarchiving from a request for the project', function () {
before(async function () {
this.fetched_docs = await DocstoreClient.getAllDocs(this.project_id)
})
it('should restore the doc to mongo', async function () {
const doc = await db.docs.findOne({ _id: this.doc._id })
doc.lines.should.deep.equal(this.doc.lines)
doc.ranges.should.deep.equal(this.doc.ranges)
})
})
})
describe('a doc with ranges', function () {
before(async function () {
this.project_id = new ObjectId()
this.doc = {
_id: new ObjectId(),
@@ -1010,90 +805,50 @@ describe('Archiving', function () {
},
],
}
await DocstoreClient.createDoc(
this.project_id,
this.doc._id,
this.doc.lines,
this.doc.version,
this.doc.ranges
)
this.res = await DocstoreClient.archiveAllDoc(this.project_id)
})
it('should successfully archive the docs', function () {
this.res.status.should.equal(204)
})
it('should set inS3 and unset lines and ranges in each doc', async function () {
const doc = await db.docs.findOne({ _id: this.doc._id })
expect(doc.lines).not.to.exist
expect(doc.ranges).not.to.exist
doc.inS3.should.equal(true)
})
it('should set the doc in s3 correctly', async function () {
const s3Doc = await DocstoreClient.getS3Doc(this.project_id, this.doc._id)
s3Doc.lines.should.deep.equal(this.doc.lines)
const ranges = JSON.parse(JSON.stringify(this.fixedRanges)) // ObjectId -> String
s3Doc.ranges.should.deep.equal(ranges)
})
describe('after unarchiving from a request for the project', function () {
before(async function () {
this.fetched_docs = await DocstoreClient.getAllDocs(this.project_id)
})
it('should restore the doc to mongo', async function () {
const doc = await db.docs.findOne({ _id: this.doc._id })
doc.lines.should.deep.equal(this.doc.lines)
doc.ranges.should.deep.equal(this.fixedRanges)
expect(doc.inS3).not.to.exist
})
})
})
describe('a doc that is archived twice', function () {
before(async function () {
this.project_id = new ObjectId()
this.doc = {
_id: new ObjectId(),
@@ -1101,95 +856,50 @@ describe('Archiving', function () {
ranges: {},
version: 2,
}
await DocstoreClient.createDoc(
this.project_id,
this.doc._id,
this.doc.lines,
this.doc.version,
this.doc.ranges
)
this.res = await DocstoreClient.archiveAllDoc(this.project_id)
this.res.status.should.equal(204)
this.res = await DocstoreClient.archiveAllDoc(this.project_id)
this.res.status.should.equal(204)
})
it('should set inS3 and unset lines and ranges in each doc', async function () {
const doc = await db.docs.findOne({ _id: this.doc._id })
expect(doc.lines).not.to.exist
expect(doc.ranges).not.to.exist
doc.inS3.should.equal(true)
})
it('should set the doc in s3 correctly', async function () {
const s3Doc = await DocstoreClient.getS3Doc(this.project_id, this.doc._id)
s3Doc.lines.should.deep.equal(this.doc.lines)
s3Doc.ranges.should.deep.equal(this.doc.ranges)
})
describe('after unarchiving from a request for the project', function () {
before(async function () {
this.fetched_docs = await DocstoreClient.getAllDocs(this.project_id)
})
it('should restore the doc to mongo', async function () {
const doc = await db.docs.findOne({ _id: this.doc._id })
doc.lines.should.deep.equal(this.doc.lines)
doc.ranges.should.deep.equal(this.doc.ranges)
expect(doc.inS3).not.to.exist
})
})
})
describe('a doc with the old schema (just an array of lines)', function () {
before(async function () {
this.project_id = new ObjectId()
this.doc = {
_id: new ObjectId(),
@@ -1197,52 +907,24 @@ describe('Archiving', function () {
ranges: {},
version: 2,
}
await uploadContent(`${this.project_id}/${this.doc._id}`, this.doc.lines)
await db.docs.insertOne({
project_id: this.project_id,
_id: this.doc._id,
rev: this.doc.version,
inS3: true,
})
this.fetched_docs = await DocstoreClient.getAllDocs(this.project_id)
})
it('should restore the doc to mongo', async function () {
const doc = await db.docs.findOne({ _id: this.doc._id })
doc.lines.should.deep.equal(this.doc.lines)
expect(doc.inS3).not.to.exist
})
it('should return the doc', function () {
this.fetched_docs[0].lines.should.deep.equal(this.doc.lines)
})
})
})
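The archiving suite above always exercises the same round trip: create a doc, archive it (lines and ranges move to the bucket and `inS3` is set), then read the project's docs to trigger unarchiving. A minimal sketch of that flow, assuming the promisified `DocstoreClient` helpers introduced later in this diff:

const { ObjectId } = require('mongodb-legacy')
const DocstoreClient = require('./helpers/DocstoreClient')

async function archiveRoundTrip() {
  const projectId = new ObjectId()
  const docId = new ObjectId()
  await DocstoreClient.createDoc(projectId, docId, ['some', 'lines'], 1, {})

  // Archive: lines/ranges are written to the bucket, mongo keeps inS3: true
  const res = await DocstoreClient.archiveAllDoc(projectId)
  console.log(res.status) // 204 on success

  // Reading the project's docs unarchives on demand and restores mongo
  return await DocstoreClient.getAllDocs(projectId)
}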


@@ -4,6 +4,9 @@ const DocstoreApp = require('./helpers/DocstoreApp')
const Errors = require('../../../app/js/Errors')
const Settings = require('@overleaf/settings')
const { Storage } = require('@google-cloud/storage')
const { promisify } = require('node:util')
const sleep = promisify(setTimeout)
const DocstoreClient = require('./helpers/DocstoreClient')
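`promisify(setTimeout)` works here because Node's timers define a custom promisify hook, so `sleep(ms)` resolves after the given delay with no `(err, result)` callback; this is what lets the soft-delete test below wait out the background flush. For example:

const { promisify } = require('node:util')
const sleep = promisify(setTimeout)

async function demo() {
  await sleep(1000) // resolves after ~1s
}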
@@ -24,85 +27,63 @@ function deleteTestSuite(deleteDoc) {
await storage.bucket(`${Settings.docstore.bucket}-deleted`).delete()
})
beforeEach(async function () {
this.project_id = new ObjectId()
this.doc_id = new ObjectId()
this.lines = ['original', 'lines']
this.version = 42
this.ranges = []
await DocstoreApp.ensureRunning()
await DocstoreClient.createDoc(
this.project_id,
this.doc_id,
this.lines,
this.version,
this.ranges
)
})
it('should show as not deleted on /deleted', async function () {
const { res, body } = await DocstoreClient.isDocDeleted(
this.project_id,
this.doc_id
)
expect(res.status).to.equal(200)
expect(body).to.have.property('deleted').to.equal(false)
})
describe('when the doc exists', function () {
beforeEach(async function () {
this.res = await deleteDoc(this.project_id, this.doc_id)
})
afterEach(async function () {
await db.docs.deleteOne({ _id: this.doc_id })
})
it('should mark the doc as deleted on /deleted', async function () {
const { res, body } = await DocstoreClient.isDocDeleted(
this.project_id,
this.doc_id
)
expect(res.status).to.equal(200)
expect(body).to.have.property('deleted').to.equal(true)
})
it('should insert a deleted doc into the docs collection', async function () {
const docs = await db.docs.find({ _id: this.doc_id }).toArray()
docs[0]._id.should.deep.equal(this.doc_id)
docs[0].lines.should.deep.equal(this.lines)
docs[0].deleted.should.equal(true)
})
it('should not export the doc to s3', async function () {
await sleep(1000)
try {
await DocstoreClient.getS3Doc(this.project_id, this.doc_id)
} catch (error) {
expect(error).to.be.instanceOf(Errors.NotFoundError)
}
})
})
@@ -116,12 +97,8 @@ function deleteTestSuite(deleteDoc) {
Settings.docstore.archiveOnSoftDelete = archiveOnSoftDelete
})
beforeEach('delete Doc', async function () {
this.res = await deleteDoc(this.project_id, this.doc_id)
})
beforeEach(function waitForBackgroundFlush(done) {
@@ -132,81 +109,54 @@ function deleteTestSuite(deleteDoc) {
db.docs.deleteOne({ _id: this.doc_id }, done)
})
it('should set the deleted flag in the doc', async function () {
const doc = await db.docs.findOne({ _id: this.doc_id })
expect(doc.deleted).to.equal(true)
})
it('should set inS3 and unset lines and ranges in the doc', async function () {
const doc = await db.docs.findOne({ _id: this.doc_id })
expect(doc.lines).to.not.exist
expect(doc.ranges).to.not.exist
expect(doc.inS3).to.equal(true)
})
it('should set the doc in s3 correctly', async function () {
const s3doc = await DocstoreClient.getS3Doc(this.project_id, this.doc_id)
expect(s3doc.lines).to.deep.equal(this.lines)
expect(s3doc.ranges).to.deep.equal(this.ranges)
})
})
describe('when the doc exists in another project', function () {
const otherProjectId = new ObjectId()
it('should show as not existing on /deleted', async function () {
await expect(DocstoreClient.isDocDeleted(otherProjectId, this.doc_id))
.to.eventually.be.rejected.and.have.property('info')
.to.contain({ status: 404 })
})
it('should return a 404 when trying to delete', async function () {
await expect(deleteDoc(otherProjectId, this.doc_id))
.to.eventually.be.rejected.and.have.property('info')
.to.contain({ status: 404 })
})
})
describe('when the doc does not exist', function () {
it('should show as not existing on /deleted', async function () {
const missingDocId = new ObjectId()
await expect(DocstoreClient.isDocDeleted(this.project_id, missingDocId))
.to.eventually.be.rejected.and.have.property('info')
.to.contain({ status: 404 })
})
it('should return a 404', async function () {
const missingDocId = new ObjectId()
await expect(deleteDoc(this.project_id, missingDocId))
.to.eventually.be.rejected.and.have.property('info')
.to.contain({ status: 404 })
})
})
}
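The 404 cases in this suite assert on rejected promises rather than on a response object: with the fetch-based client, a non-2xx response appears to surface as a thrown error whose `info` carries the HTTP status. A small sketch of the shared pattern, assuming chai-as-promised is registered; `expectStatus` is a name chosen here purely for illustration:

const chai = require('chai')
chai.use(require('chai-as-promised'))
const { expect } = chai

// Hypothetical helper wrapping the assertion used throughout this suite
async function expectStatus(promise, status) {
  await expect(promise)
    .to.eventually.be.rejected.and.have.property('info')
    .to.contain({ status })
}

// usage: await expectStatus(deleteDoc(otherProjectId, docId), 404)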
@@ -215,21 +165,17 @@ describe('Delete via PATCH', function () {
deleteTestSuite(DocstoreClient.deleteDoc)
describe('when providing a custom doc name in the delete request', function () {
beforeEach(async function () {
await DocstoreClient.deleteDocWithName(
this.project_id,
this.doc_id,
'wombat.tex'
)
})
it('should insert the doc name into the docs collection', async function () {
const docs = await db.docs.find({ _id: this.doc_id }).toArray()
expect(docs[0].name).to.equal('wombat.tex')
})
})
@@ -239,172 +185,130 @@ describe('Delete via PATCH', function () {
setTimeout(done, 5)
})
beforeEach('perform deletion with past date', async function () {
await DocstoreClient.deleteDocWithDate(
this.project_id,
this.doc_id,
this.deletedAt
)
})
it('should insert the date into the docs collection', async function () {
const docs = await db.docs.find({ _id: this.doc_id }).toArray()
expect(docs[0].deletedAt.toISOString()).to.equal(
this.deletedAt.toISOString()
)
})
})
describe('when providing no doc name in the delete request', function () {
it('should reject the request', async function () {
await expect(DocstoreClient.deleteDocWithName(this.project_id, this.doc_id))
.to.eventually.be.rejected.and.have.property('info')
.to.contain({ status: 400 })
})
})
describe('when providing no date in the delete request', function () {
it('should reject the request', async function () {
await expect(DocstoreClient.deleteDocWithDate(this.project_id, this.doc_id))
.to.eventually.be.rejected.and.have.property('info')
.to.contain({ status: 400 })
})
})
describe('before deleting anything', function () {
it('should show nothing in deleted docs response', async function () {
const deletedDocs = await DocstoreClient.getAllDeletedDocs(
this.project_id
)
expect(deletedDocs).to.deep.equal([])
})
})
describe('when the doc gets a name on delete', function () {
beforeEach(async function () {
this.deletedAt = new Date()
await DocstoreClient.deleteDocWithDate(
this.project_id,
this.doc_id,
this.deletedAt
)
})
it('should show the doc in deleted docs response', async function () {
const deletedDocs = await DocstoreClient.getAllDeletedDocs(
this.project_id
)
expect(deletedDocs).to.deep.equal([
{
_id: this.doc_id.toString(),
name: 'main.tex',
deletedAt: this.deletedAt.toISOString(),
},
])
})
describe('after deleting multiple docs', function () {
beforeEach('create doc2', async function () {
this.doc_id2 = new ObjectId()
await DocstoreClient.createDoc(
this.project_id,
this.doc_id2,
this.lines,
this.version,
this.ranges
)
})
beforeEach('delete doc2', async function () {
this.deletedAt2 = new Date()
await DocstoreClient.deleteDocWithDateAndName(
this.project_id,
this.doc_id2,
this.deletedAt2,
'two.tex'
)
})
beforeEach('create doc3', async function () {
this.doc_id3 = new ObjectId()
await DocstoreClient.createDoc(
this.project_id,
this.doc_id3,
this.lines,
this.version,
this.ranges
)
})
beforeEach('delete doc3', async function () {
this.deletedAt3 = new Date()
await DocstoreClient.deleteDocWithDateAndName(
this.project_id,
this.doc_id3,
this.deletedAt3,
'three.tex'
)
})
it('should show all the docs as deleted', async function () {
const deletedDocs = await DocstoreClient.getAllDeletedDocs(
this.project_id
)
expect(deletedDocs).to.deep.equal([
{
_id: this.doc_id3.toString(),
name: 'three.tex',
deletedAt: this.deletedAt3.toISOString(),
},
{
_id: this.doc_id2.toString(),
name: 'two.tex',
deletedAt: this.deletedAt2.toISOString(),
},
{
_id: this.doc_id.toString(),
name: 'main.tex',
deletedAt: this.deletedAt.toISOString(),
},
])
})
describe('with one more than max_deleted_docs permits', function () {
@@ -417,28 +321,23 @@ describe('Delete via PATCH', function () {
Settings.max_deleted_docs = maxDeletedDocsBefore
})
it('should omit the first deleted doc', async function () {
const deletedDocs = await DocstoreClient.getAllDeletedDocs(
this.project_id
)
expect(deletedDocs).to.deep.equal([
{
_id: this.doc_id3.toString(),
name: 'three.tex',
deletedAt: this.deletedAt3.toISOString(),
},
{
_id: this.doc_id2.toString(),
name: 'two.tex',
deletedAt: this.deletedAt2.toISOString(),
},
// dropped main.tex
])
})
})
})
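As the last two cases show, the deleted-docs listing comes back newest-first and is capped by `Settings.max_deleted_docs`, so the oldest entry drops off once the cap is exceeded. Roughly, assuming the promisified client:

const Settings = require('@overleaf/settings')
const DocstoreClient = require('./helpers/DocstoreClient')

async function listRecentlyDeleted(projectId) {
  Settings.max_deleted_docs = 2
  const deletedDocs = await DocstoreClient.getAllDeletedDocs(projectId)
  // at most 2 entries, ordered by deletedAt descending; older docs omitted
  return deletedDocs
}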
@@ -446,66 +345,54 @@ describe('Delete via PATCH', function () {
})
describe("Destroying a project's documents", function () {
beforeEach(async function () {
this.project_id = new ObjectId()
this.doc_id = new ObjectId()
this.lines = ['original', 'lines']
this.version = 42
this.ranges = []
await DocstoreApp.ensureRunning()
await DocstoreClient.createDoc(
this.project_id,
this.doc_id,
this.lines,
this.version,
this.ranges
)
})
describe('when the doc exists', function () {
beforeEach(async function () {
await DocstoreClient.destroyAllDoc(this.project_id)
})
it('should remove the doc from the docs collection', async function () {
const docs = await db.docs.find({ _id: this.doc_id }).toArray()
expect(docs).to.deep.equal([])
})
})
describe('when the doc is archived', function () {
beforeEach(async function () {
try {
await DocstoreClient.archiveAllDoc(this.project_id)
} catch (error) {
// noop
}
await DocstoreClient.destroyAllDoc(this.project_id)
})
it('should remove the doc from the docs collection', async function () {
const docs = await db.docs.find({ _id: this.doc_id }).toArray()
expect(docs).to.deep.equal([])
})
it('should remove the doc contents from s3', async function () {
try {
await DocstoreClient.getS3Doc(this.project_id, this.doc_id)
} catch (error) {
expect(error).to.be.instanceOf(Errors.NotFoundError)
}
})
})
})
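One caveat with the catch-only pattern in the s3 checks above: if `getS3Doc` unexpectedly resolves, the test body finishes without asserting anything. A stricter variant (not what this commit does) would assert the rejection directly, assuming chai-as-promised is available:

it('should remove the doc contents from s3', async function () {
  await expect(
    DocstoreClient.getS3Doc(this.project_id, this.doc_id)
  ).to.be.rejectedWith(Errors.NotFoundError)
})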


@@ -1,19 +1,7 @@
const sinon = require('sinon')
const { ObjectId } = require('mongodb-legacy')
const async = require('async')
const DocstoreApp = require('./helpers/DocstoreApp')
const { callbackify } = require('node:util')
const DocstoreClient = require('./helpers/DocstoreClient')
@@ -90,10 +78,10 @@ describe('Getting all docs', function () {
rev: 8,
}
const version = 42
const jobs = this.docs.map(doc =>
(
doc => callback =>
callbackify(DocstoreClient.createDoc)(
this.project_id,
doc._id,
doc.lines,
@@ -101,11 +89,10 @@ describe('Getting all docs', function () {
doc.ranges,
callback
)
)(doc)
)
jobs.push(cb =>
callbackify(DocstoreClient.createDoc)(
this.project_id,
this.deleted_doc._id,
this.deleted_doc.lines,
@@ -113,72 +100,48 @@ describe('Getting all docs', function () {
this.deleted_doc.ranges,
err => {
if (err) return done(err)
callbackify(DocstoreClient.deleteDoc)(
this.project_id,
this.deleted_doc._id,
cb
)
}
)
)
jobs.unshift(cb => callbackify(DocstoreApp.ensureRunning)(cb))
async.series(jobs, done)
})
it('getAllDocs should return all the (non-deleted) docs', async function () {
const docs = await DocstoreClient.getAllDocs(this.project_id)
docs.length.should.equal(this.docs.length)
for (let i = 0; i < docs.length; i++) {
const doc = docs[i]
doc.lines.should.deep.equal(this.docs[i].lines)
}
})
it('getAllRanges should return all the (non-deleted) doc ranges', async function () {
const docs = await DocstoreClient.getAllRanges(this.project_id)
docs.length.should.equal(this.docs.length)
for (let i = 0; i < docs.length; i++) {
const doc = docs[i]
doc.ranges.should.deep.equal(this.fixedRanges[i])
}
})
it('getTrackedChangesUserIds should return all the user ids from (non-deleted) ranges', async function () {
const userIds = await DocstoreClient.getTrackedChangesUserIds(
this.project_id
)
userIds.should.deep.equal(['user-id-1', 'user-id-2'])
})
it('getCommentThreadIds should return all the thread ids from (non-deleted) ranges', async function () {
const threadIds = await DocstoreClient.getCommentThreadIds(this.project_id)
threadIds.should.deep.equal({
[this.docs[0]._id.toString()]: [this.threadId1],
[this.docs[2]._id.toString()]: [this.threadId2],
})
})
})
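This file keeps `async.series` for its ordered setup jobs, so the now-promise-based helpers are adapted back to callback style with `callbackify`, the inverse of `promisify`. The shape of that bridge, as a self-contained sketch:

const { callbackify } = require('node:util')
const async = require('async')

async function createThing(name) {
  return `created ${name}`
}

// Each job is a callback-taking function; callbackify adapts the async fn
const jobs = ['a', 'b'].map(name => cb => callbackify(createThing)(name, cb))

async.series(jobs, (err, results) => {
  if (err) throw err
  console.log(results) // ['created a', 'created b']
})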


@@ -5,8 +5,8 @@ const DocstoreClient = require('./helpers/DocstoreClient')
const { Storage } = require('@google-cloud/storage')
describe('Getting A Doc from Archive', function () {
before(async function () {
await DocstoreApp.ensureRunning()
})
before(async function () {
@@ -25,7 +25,7 @@ describe('Getting A Doc from Archive', function () {
})
describe('for an archived doc', function () {
before(async function () {
this.project_id = new ObjectId()
this.timeout(1000 * 30)
this.doc = {
@@ -34,72 +34,46 @@ describe('Getting A Doc from Archive', function () {
ranges: {},
version: 2,
}
await DocstoreClient.createDoc(
this.project_id,
this.doc._id,
this.doc.lines,
this.doc.version,
this.doc.ranges
)
this.res = await DocstoreClient.archiveDoc(this.project_id, this.doc._id)
})
it('should successfully archive the doc', function () {
this.res.status.should.equal(204)
})
it('should return the doc lines and version from persistent storage', async function () {
const { res, doc } = await DocstoreClient.peekDoc(
this.project_id,
this.doc._id
)
res.status.should.equal(200)
res.headers.get('x-doc-status').should.equal('archived')
doc.lines.should.deep.equal(this.doc.lines)
doc.version.should.equal(this.doc.version)
doc.ranges.should.deep.equal(this.doc.ranges)
})
it('should return the doc lines and version from persistent storage on subsequent requests', async function () {
const { res, doc } = await DocstoreClient.peekDoc(
this.project_id,
this.doc._id
)
res.status.should.equal(200)
res.headers.get('x-doc-status').should.equal('archived')
doc.lines.should.deep.equal(this.doc.lines)
doc.version.should.equal(this.doc.version)
doc.ranges.should.deep.equal(this.doc.ranges)
})
describe('for a non-archived doc', function () {
before(async function () {
this.project_id = new ObjectId()
this.timeout(1000 * 30)
this.doc = {
@@ -108,31 +82,25 @@ describe('Getting A Doc from Archive', function () {
ranges: {},
version: 2,
}
await DocstoreClient.createDoc(
this.project_id,
this.doc._id,
this.doc.lines,
this.doc.version,
this.doc.ranges
)
})
it('should return the doc lines and version from mongo', async function () {
const { res, doc } = await DocstoreClient.peekDoc(
this.project_id,
this.doc._id
)
res.status.should.equal(200)
res.headers.get('x-doc-status').should.equal('active')
doc.lines.should.deep.equal(this.doc.lines)
doc.version.should.equal(this.doc.version)
doc.ranges.should.deep.equal(this.doc.ranges)
})
})
})
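Note the header access change in the peek tests: the promisified `peekDoc` returns a fetch-style response, so status is a number on `res.status` and headers are read with `res.headers.get(...)` rather than the old `res.headers['...']` lookup. For instance:

async function isArchived(projectId, docId) {
  const { res, doc } = await DocstoreClient.peekDoc(projectId, docId)
  // fetch-style Response: numeric status, Headers object with .get()
  return res.status === 200 && res.headers.get('x-doc-status') === 'archived'
}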


@@ -1,22 +1,11 @@
const sinon = require('sinon')
const { ObjectId } = require('mongodb-legacy')
const { expect } = require('chai')
const DocstoreApp = require('./helpers/DocstoreApp')
const DocstoreClient = require('./helpers/DocstoreClient')
describe('Getting a doc', function () {
beforeEach(async function () {
this.project_id = new ObjectId()
this.doc_id = new ObjectId()
this.lines = ['original', 'lines']
@@ -49,105 +38,63 @@ describe('Getting a doc', function () {
{ ...this.ranges.comments[0], id: this.ranges.comments[0].op.t },
],
}
await DocstoreApp.ensureRunning()
await DocstoreClient.createDoc(
this.project_id,
this.doc_id,
this.lines,
this.version,
this.ranges
)
})
describe('when the doc exists', function () {
it('should get the doc lines and version', async function () {
const doc = await DocstoreClient.getDoc(this.project_id, this.doc_id)
doc.lines.should.deep.equal(this.lines)
doc.version.should.equal(this.version)
doc.ranges.should.deep.equal(this.fixedRanges)
})
})
describe('when the doc does not exist', function () {
it('should return a 404', async function () {
const missingDocId = new ObjectId()
await expect(DocstoreClient.getDoc(this.project_id, missingDocId))
.to.eventually.be.rejected.and.have.property('info')
.to.contain({ status: 404 })
})
})
describe('when the doc is a deleted doc', function () {
beforeEach(async function () {
this.deleted_doc_id = new ObjectId()
await DocstoreClient.createDoc(
this.project_id,
this.deleted_doc_id,
this.lines,
this.version,
this.ranges
)
await DocstoreClient.deleteDoc(this.project_id, this.deleted_doc_id)
})
it('should return the doc', async function () {
const doc = await DocstoreClient.getDoc(
this.project_id,
this.deleted_doc_id,
{ include_deleted: true }
)
doc.lines.should.deep.equal(this.lines)
doc.version.should.equal(this.version)
doc.ranges.should.deep.equal(this.fixedRanges)
doc.deleted.should.equal(true)
})
it('should return a 404 when the query string is not set', async function () {
await expect(DocstoreClient.getDoc(this.project_id, this.deleted_doc_id))
.to.eventually.be.rejected.and.have.property('info')
.to.contain({ status: 404 })
})
})
})
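Deleted docs stay readable, but only on explicit request: `getDoc` takes a query-string object, and the `include_deleted` flag is what separates the two cases above. Sketch:

async function fetchEvenIfDeleted(projectId, docId) {
  // without { include_deleted: true } the same lookup rejects with a 404
  return await DocstoreClient.getDoc(projectId, docId, {
    include_deleted: true,
  })
}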


@@ -4,22 +4,19 @@ const DocstoreClient = require('./helpers/DocstoreClient')
const { expect } = require('chai')
describe('HealthChecker', function () {
beforeEach('start', async function () {
await DocstoreApp.ensureRunning()
})
beforeEach('clear docs collection', async function () {
await db.docs.deleteMany({})
})
let res
beforeEach('run health check', async function () {
res = await DocstoreClient.healthCheck()
})
it('should return 200', function () {
res.status.should.equal(200)
})
it('should not leave any cruft behind', async function () {


@@ -1,22 +1,10 @@
const sinon = require('sinon')
const { ObjectId } = require('mongodb-legacy')
const DocstoreApp = require('./helpers/DocstoreApp')
const DocstoreClient = require('./helpers/DocstoreClient')
describe('Applying updates to a doc', function () {
beforeEach(async function () {
this.project_id = new ObjectId()
this.doc_id = new ObjectId()
this.originalLines = ['original', 'lines']
@@ -45,511 +33,362 @@ describe('Applying updates to a doc', function () {
},
],
}
this.version = 42
await DocstoreApp.ensureRunning()
await DocstoreClient.createDoc(
this.project_id,
this.doc_id,
this.originalLines,
this.version,
this.originalRanges
)
})
describe('when nothing has been updated', function () {
beforeEach(async function () {
this.body = await DocstoreClient.updateDoc(
this.project_id,
this.doc_id,
this.originalLines,
this.version,
this.originalRanges
)
})
it('should return modified = false', function () {
this.body.modified.should.equal(false)
})
it('should not update the doc in the API', async function () {
const doc = await DocstoreClient.getDoc(this.project_id, this.doc_id)
doc.lines.should.deep.equal(this.originalLines)
doc.version.should.equal(this.version)
doc.ranges.should.deep.equal(this.originalRanges)
})
})
describe('when the lines have changed', function () {
beforeEach(async function () {
this.body = await DocstoreClient.updateDoc(
this.project_id,
this.doc_id,
this.newLines,
this.version,
this.originalRanges
)
})
it('should return modified = true', function () {
this.body.modified.should.equal(true)
})
it('should return the rev', function () {
this.body.rev.should.equal(2)
})
it('should update the doc in the API', async function () {
const doc = await DocstoreClient.getDoc(this.project_id, this.doc_id)
doc.lines.should.deep.equal(this.newLines)
doc.version.should.equal(this.version)
doc.ranges.should.deep.equal(this.originalRanges)
})
})
describe('when the version has changed', function () {
beforeEach(async function () {
this.body = await DocstoreClient.updateDoc(
this.project_id,
this.doc_id,
this.originalLines,
this.version + 1,
this.originalRanges
)
})
it('should return modified = true', function () {
this.body.modified.should.equal(true)
})
it('should return the rev', function () {
this.body.rev.should.equal(1)
})
it('should update the doc in the API', async function () {
const doc = await DocstoreClient.getDoc(this.project_id, this.doc_id)
doc.lines.should.deep.equal(this.originalLines)
doc.version.should.equal(this.version + 1)
doc.ranges.should.deep.equal(this.originalRanges)
})
})
describe('when the version was decremented', function () {
let statusCode
beforeEach(async function () {
try {
this.body = await DocstoreClient.updateDoc(
this.project_id,
this.doc_id,
this.newLines,
this.version - 1,
this.newRanges
)
} catch (error) {
statusCode = error.info.status
}
})
it('should return 409', function () {
statusCode.should.equal(409)
})
it('should not update the doc in the API', async function () {
const doc = await DocstoreClient.getDoc(this.project_id, this.doc_id)
doc.lines.should.deep.equal(this.originalLines)
doc.version.should.equal(this.version)
doc.ranges.should.deep.equal(this.originalRanges)
})
})
describe('when the ranges have changed', function () {
beforeEach(async function () {
this.body = await DocstoreClient.updateDoc(
this.project_id,
this.doc_id,
this.originalLines,
this.version,
this.newRanges
)
})
it('should return modified = true', function () {
this.body.modified.should.equal(true)
})
it('should return the rev', function () {
this.body.rev.should.equal(2)
})
it('should update the doc in the API', async function () {
const doc = await DocstoreClient.getDoc(this.project_id, this.doc_id)
doc.lines.should.deep.equal(this.originalLines)
doc.version.should.equal(this.version)
doc.ranges.should.deep.equal(this.newRanges)
})
})
describe('when the doc does not exist', function () {
beforeEach(async function () {
this.missing_doc_id = new ObjectId()
this.body = await DocstoreClient.updateDoc(
this.project_id,
this.missing_doc_id,
this.originalLines,
0,
this.originalRanges
)
})
it('should create the doc', function () {
this.body.rev.should.equal(1)
})
it('should be retrievable', async function () {
const doc = await DocstoreClient.getDoc(
this.project_id,
this.missing_doc_id
)
doc.lines.should.deep.equal(this.originalLines)
doc.version.should.equal(0)
doc.ranges.should.deep.equal(this.originalRanges)
})
})
describe('when malformed doc lines are provided', function () {
describe('when the lines are not an array', function () {
let statusCode
beforeEach(async function () {
try {
this.body = await DocstoreClient.updateDoc(
this.project_id,
this.doc_id,
{ foo: 'bar' },
this.version,
this.originalRanges
)
} catch (error) {
statusCode = error.info.status
}
})
it('should return 400', function () {
statusCode.should.equal(400)
})
it('should not update the doc in the API', async function () {
const doc = await DocstoreClient.getDoc(this.project_id, this.doc_id)
doc.lines.should.deep.equal(this.originalLines)
})
})
describe('when the lines are not present', function () {
let statusCode
beforeEach(async function () {
try {
this.body = await DocstoreClient.updateDoc(
this.project_id,
this.doc_id,
null,
this.version,
this.originalRanges
)
} catch (error) {
statusCode = error.info.status
}
})
it('should return 400', function () {
statusCode.should.equal(400)
})
it('should not update the doc in the API', async function () {
const doc = await DocstoreClient.getDoc(this.project_id, this.doc_id)
doc.lines.should.deep.equal(this.originalLines)
})
})
})
describe('when no version is provided', function () {
let statusCode
beforeEach(async function () {
try {
this.body = await DocstoreClient.updateDoc(
this.project_id,
this.doc_id,
this.originalLines,
null,
this.originalRanges
)
} catch (error) {
statusCode = error.info.status
}
})
it('should return 400', function () {
statusCode.should.equal(400)
})
it('should not update the doc in the API', async function () {
const doc = await DocstoreClient.getDoc(this.project_id, this.doc_id)
doc.lines.should.deep.equal(this.originalLines)
doc.version.should.equal(this.version)
})
})
describe('when the content is large', function () {
beforeEach(async function () {
const line = new Array(1025).join('x') // 1kb
this.largeLines = Array.apply(null, Array(1024)).map(() => line) // 1mb
this.body = await DocstoreClient.updateDoc(
this.project_id,
this.doc_id,
this.largeLines,
this.version,
this.originalRanges
)
})
it('should return modified = true', function () {
this.body.modified.should.equal(true)
})
it('should update the doc in the API', async function () {
const doc = await DocstoreClient.getDoc(this.project_id, this.doc_id)
doc.lines.should.deep.equal(this.largeLines)
})
})
describe('when there is a large json payload', function () {
beforeEach(async function () {
const line = new Array(1025).join('x') // 1kb
this.largeLines = Array.apply(null, Array(1024)).map(() => line) // 1mb
this.originalRanges.padding = Array.apply(null, Array(2049)).map(
() => line
) // 2mb + 1kb
this.body = await DocstoreClient.updateDoc(
this.project_id,
this.doc_id,
this.largeLines,
this.version,
this.originalRanges
)
})
it('should return modified = true', function () {
this.body.modified.should.equal(true)
})
it('should update the doc in the API', async function () {
const doc = await DocstoreClient.getDoc(this.project_id, this.doc_id)
doc.lines.should.deep.equal(this.largeLines)
})
})
describe('when the document body is too large', function () {
let statusCode, body
beforeEach(async function () {
const line = new Array(1025).join('x') // 1kb
this.largeLines = Array.apply(null, Array(2049)).map(() => line) // 2mb + 1kb
try {
this.body = await DocstoreClient.updateDoc(
this.project_id,
this.doc_id,
this.largeLines,
this.version,
this.originalRanges
)
} catch (error) {
statusCode = error.info.status
body = error.body
}
})
it('should return 413', function () {
statusCode.should.equal(413)
})
it('should report body too large', function () {
body.should.equal('document body too large')
})
it('should not update the doc in the API', async function () {
const doc = await DocstoreClient.getDoc(this.project_id, this.doc_id)
doc.lines.should.deep.equal(this.originalLines)
})
})
describe('when the json payload is too large', function () {
beforeEach(async function () {
const line = new Array(1024).join('x') // 1 KB
this.largeLines = new Array(8192).fill(line) // 8 MB
this.originalRanges.padding = new Array(6144).fill(line) // 6 MB
try {
this.body = await DocstoreClient.updateDoc(
this.project_id,
this.doc_id,
this.largeLines,
this.version,
this.originalRanges
)
} catch (error) {
// ignore error response
}
})
it('should not update the doc in the API', async function () {
const doc = await DocstoreClient.getDoc(this.project_id, this.doc_id)
doc.lines.should.deep.equal(this.originalLines)
})
})
})

View File

@@ -1,26 +1,31 @@
const app = require('../../../../app')
const Settings = require('@overleaf/settings')

function startApp() {
  return new Promise((resolve, reject) => {
    app.listen(
      Settings.internal.docstore.port,
      Settings.internal.docstore.host,
      error => {
        if (error) {
          reject(error)
        } else {
          resolve()
        }
      }
    )
  })
}

let appStartedPromise

async function ensureRunning() {
  if (!appStartedPromise) {
    appStartedPromise = startApp()
  }
  await appStartedPromise
}

module.exports = {
  ensureRunning,
}
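
Because the helper memoizes the startup promise, every suite can call ensureRunning() unconditionally and only the first caller actually boots the app; concurrent callers await the same promise. A minimal usage sketch (suite name and require path assumed):

const DocstoreApp = require('./helpers/DocstoreApp')

describe('some acceptance suite', function () {
  before(async function () {
    // first caller starts the app; later callers just await the cached promise
    await DocstoreApp.ensureRunning()
  })
})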

View File

@@ -1,5 +1,9 @@
let DocstoreClient
const {
fetchNothing,
fetchJson,
fetchJsonWithResponse,
} = require('@overleaf/fetch-utils')
const settings = require('@overleaf/settings')
const Persistor = require('../../../../app/js/PersistorManager')
@@ -19,204 +23,156 @@ async function getStringFromPersistor(persistor, bucket, key) {
}
module.exports = DocstoreClient = {
  async createDoc(projectId, docId, lines, version, ranges) {
    return await DocstoreClient.updateDoc(
      projectId,
      docId,
      lines,
      version,
      ranges
    )
  },

  async getDoc(projectId, docId, qs = {}) {
    const url = new URL(
      `http://127.0.0.1:${settings.internal.docstore.port}/project/${projectId}/doc/${docId}`
    )
    for (const [key, value] of Object.entries(qs)) {
      url.searchParams.append(key, value)
    }
    return await fetchJson(url)
  },

  async peekDoc(projectId, docId, qs = {}) {
    const url = new URL(
      `http://127.0.0.1:${settings.internal.docstore.port}/project/${projectId}/doc/${docId}/peek`
    )
    for (const [key, value] of Object.entries(qs)) {
      url.searchParams.append(key, value)
    }
    const { response, json } = await fetchJsonWithResponse(url)
    return { res: response, doc: json }
  },

  async isDocDeleted(projectId, docId) {
    const { response, json } = await fetchJsonWithResponse(
      `http://127.0.0.1:${settings.internal.docstore.port}/project/${projectId}/doc/${docId}/deleted`
    )
    return { res: response, body: json }
  },

  async getAllDocs(projectId) {
    return await fetchJson(
      `http://127.0.0.1:${settings.internal.docstore.port}/project/${projectId}/doc`
    )
  },

  async getAllDeletedDocs(projectId) {
    return await fetchJson(
      `http://127.0.0.1:${settings.internal.docstore.port}/project/${projectId}/doc-deleted`
    )
  },

  async getAllRanges(projectId) {
    return await fetchJson(
      `http://127.0.0.1:${settings.internal.docstore.port}/project/${projectId}/ranges`
    )
  },

  async getCommentThreadIds(projectId) {
    return await fetchJson(
      `http://127.0.0.1:${settings.internal.docstore.port}/project/${projectId}/comment-thread-ids`
    )
  },

  async getTrackedChangesUserIds(projectId) {
    return await fetchJson(
      `http://127.0.0.1:${settings.internal.docstore.port}/project/${projectId}/tracked-changes-user-ids`
    )
  },

  async updateDoc(projectId, docId, lines, version, ranges) {
    const res = await fetchJson(
      `http://127.0.0.1:${settings.internal.docstore.port}/project/${projectId}/doc/${docId}`,
      {
        method: 'POST',
        json: {
          lines,
          version,
          ranges,
        },
      }
    )
    return res
  },

  async deleteDoc(projectId, docId) {
    return await DocstoreClient.deleteDocWithDateAndName(
      projectId,
      docId,
      new Date(),
      'main.tex'
    )
  },

  async deleteDocWithDate(projectId, docId, date) {
    return await DocstoreClient.deleteDocWithDateAndName(
      projectId,
      docId,
      date,
      'main.tex'
    )
  },

  async deleteDocWithName(projectId, docId, name) {
    return await DocstoreClient.deleteDocWithDateAndName(
      projectId,
      docId,
      new Date(),
      name
    )
  },

  async deleteDocWithDateAndName(projectId, docId, deletedAt, name) {
    return await fetchNothing(
      `http://127.0.0.1:${settings.internal.docstore.port}/project/${projectId}/doc/${docId}`,
      { method: 'PATCH', json: { name, deleted: true, deletedAt } }
    )
  },

  async archiveAllDoc(projectId) {
    return await fetchNothing(
      `http://127.0.0.1:${settings.internal.docstore.port}/project/${projectId}/archive`,
      { method: 'POST' }
    )
  },

  async archiveDoc(projectId, docId) {
    return await fetchNothing(
      `http://127.0.0.1:${settings.internal.docstore.port}/project/${projectId}/doc/${docId}/archive`,
      { method: 'POST' }
    )
  },

  async destroyAllDoc(projectId) {
    await fetchNothing(
      `http://127.0.0.1:${settings.internal.docstore.port}/project/${projectId}/destroy`,
      { method: 'POST' }
    )
  },

  async healthCheck() {
    return await fetchNothing(
      `http://127.0.0.1:${settings.internal.docstore.port}/health_check`
    )
  },

  async getS3Doc(projectId, docId) {
    const data = await getStringFromPersistor(
      Persistor,
      settings.docstore.bucket,
      `${projectId}/${docId}`
    )
    return JSON.parse(data)
  },
}
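
With the promisified client, HTTP failures surface as rejections rather than as a status code on a response object. A hedged sketch of the pattern the 413 test above relies on (assuming @overleaf/fetch-utils rejects with a RequestFailedError carrying info.status and body):

const { RequestFailedError } = require('@overleaf/fetch-utils')

async function tryUpdate(projectId, docId, lines, version, ranges) {
  try {
    return await DocstoreClient.updateDoc(projectId, docId, lines, version, ranges)
  } catch (error) {
    if (error instanceof RequestFailedError) {
      // e.g. 413 with body 'document body too large'
      console.log(error.info.status, error.body)
      return null
    }
    throw error
  }
}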

View File

@@ -232,6 +232,25 @@ describe('DocArchiveManager', function () {
)
})
describe('with S3 persistor', function () {
beforeEach(async function () {
Settings.docstore.backend = 's3'
await DocArchiveManager.archiveDoc(projectId, mongoDocs[0]._id)
})
it('should not calculate the hex md5 sum of the content', function () {
expect(Crypto.createHash).not.to.have.been.called
expect(HashUpdate).not.to.have.been.called
expect(HashDigest).not.to.have.been.called
})
it('should not pass an md5 hash to the object persistor for verification', function () {
expect(PersistorManager.sendStream).not.to.have.been.calledWithMatch({
sourceMd5: sinon.match.any,
})
})
})
it('should pass the correct bucket and key to the persistor', async function () {
await DocArchiveManager.archiveDoc(projectId, mongoDocs[0]._id)

View File

@@ -1,145 +1,118 @@
const logger = require('@overleaf/logger')
const { promiseMapWithLimit } = require('@overleaf/promise-utils')
const Settings = require('@overleaf/settings')
const ProjectHistoryRedisManager = require('./ProjectHistoryRedisManager')
const metrics = require('./Metrics')
const { fetchNothing } = require('@overleaf/fetch-utils')
const OError = require('@overleaf/o-error')

const FLUSH_PROJECT_EVERY_N_OPS = 500
const MAX_PARALLEL_REQUESTS = 4

// flush changes in the background
function flushProjectChangesAsync(projectId) {
  flushProjectChanges(projectId, { background: true }).catch(err => {
    logger.error({ projectId, err }, 'failed to flush in background')
  })
}

// flush changes (for when we need to know the queue is flushed)
async function flushProjectChanges(projectId, options) {
  if (options.skip_history_flush) {
    logger.debug({ projectId }, 'skipping flush of project history')
    return
  }
  metrics.inc('history-flush', 1, { status: 'project-history' })
  const url = new URL(
    `${Settings.apis.project_history.url}/project/${projectId}/flush`
  )
  if (options.background) {
    // pass on the background flush option if present
    url.searchParams.set('background', 'true')
  }
  logger.debug({ projectId, url }, 'flushing doc in project history api')
  try {
    await fetchNothing(url, { method: 'POST' })
  } catch (err) {
    throw OError.tag(err, 'project history api request failed', { projectId })
  }
}

function recordAndFlushHistoryOps(projectId, ops, projectOpsLength) {
  if (ops == null) {
    ops = []
  }
  if (ops.length === 0) {
    return
  }
  // record updates for project history
  if (shouldFlushHistoryOps(projectId, projectOpsLength, ops.length)) {
    // Do this in the background since it uses HTTP and so may be too
    // slow to wait for when processing a doc update.
    logger.debug(
      { projectOpsLength, projectId },
      'flushing project history api'
    )
    flushProjectChangesAsync(projectId)
  }
}

function shouldFlushHistoryOps(
  projectId,
  length,
  opsLength,
  threshold = FLUSH_PROJECT_EVERY_N_OPS
) {
  if (Settings.shortHistoryQueues.includes(projectId)) return true
  if (!length) {
    return false
  } // don't flush unless we know the length
  // We want to flush every `threshold` ops, i.e. 500, 1000, 1500, etc
  // Find out which 'block' (i.e. 0-499, 500-999) we were in before and after
  // pushing these ops. If we've changed, then we've gone over a multiple of
  // `threshold` and should flush. (Most of the time, we will only hit the
  // threshold and then flushing will put us back to 0)
  const previousLength = length - opsLength
  const prevBlock = Math.floor(previousLength / threshold)
  const newBlock = Math.floor(length / threshold)
  return newBlock !== prevBlock
}

async function resyncProjectHistory(
  projectId,
  projectHistoryId,
  docs,
  files,
  opts
) {
  await ProjectHistoryRedisManager.promises.queueResyncProjectStructure(
    projectId,
    projectHistoryId,
    docs,
    files,
    opts
  )
  if (opts.resyncProjectStructureOnly) return
  const DocumentManager = require('./DocumentManager')
  await promiseMapWithLimit(MAX_PARALLEL_REQUESTS, docs, doc =>
    DocumentManager.promises.resyncDocContentsWithLock(
      projectId,
      doc.doc,
      doc.path,
      opts
    )
  )
}

module.exports = {
  FLUSH_PROJECT_EVERY_N_OPS,
  flushProjectChangesAsync,
  recordAndFlushHistoryOps,
  shouldFlushHistoryOps,
  promises: {
    flushProjectChanges,
    resyncProjectHistory,
  },
}
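
A quick illustration of the block-crossing check, assuming the default threshold of 500 and a project id that is not in Settings.shortHistoryQueues:

const HistoryManager = require('./HistoryManager') // path assumed

HistoryManager.shouldFlushHistoryOps('some-project', 499, 10) // false: block 0 before and after
HistoryManager.shouldFlushHistoryOps('some-project', 500, 10) // true: crossed from block 0 into block 1
HistoryManager.shouldFlushHistoryOps('some-project', 510, 10) // false: stayed in block 1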

View File

@@ -226,7 +226,11 @@ async function setDoc(req, res) {
)
timer.done()
logger.debug({ projectId, docId }, 'set doc via http')
// If the document is unchanged and hasn't been updated, `result` will be
// undefined, which leads to an invalid JSON response, so we send an empty
// object instead.
res.json(result || {})
}
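
For context on the fallback: JSON.stringify(undefined) yields undefined rather than a JSON string, so the old res.json(result) produced an empty body that clients could not parse. A one-line illustration:

JSON.stringify(undefined) // => undefined (so the response body ends up empty)
JSON.parse('') // => throws SyntaxError: Unexpected end of JSON input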
async function appendToDoc(req, res) {

View File

@@ -1,139 +1,95 @@
const { setTimeout } = require('node:timers/promises')
const Settings = require('@overleaf/settings')
const { rclient } = require('./RedisManager')
const ProjectManager = require('./ProjectManager')
const logger = require('@overleaf/logger')
const { promiseMapSettledWithLimit } = require('@overleaf/promise-utils')

const docUpdaterKeys = Settings.redis.documentupdater.key_schema

// iterate over keys asynchronously using redis scan (non-blocking)
// handle all the cluster nodes or single redis server
async function _getKeys(pattern, limit) {
  const nodes = (typeof rclient.nodes === 'function'
    ? rclient.nodes('master')
    : undefined) || [rclient]
  let keys = []
  for (const node of nodes) {
    keys = keys.concat(await _getKeysFromNode(node, pattern, limit))
  }
  return keys
}

async function _getKeysFromNode(node, pattern, limit = 1000) {
  let cursor = 0 // redis iterator
  const keySet = new Set() // use a set to avoid duplicate results
  const batchSize = Math.min(limit, 1000)
  while (true) {
    // scan over all keys looking for pattern
    const reply = await node.scan(cursor, 'MATCH', pattern, 'COUNT', batchSize)
    cursor = reply[0]
    for (const key of reply[1]) {
      keySet.add(key)
    }
    const noResults = cursor === '0' // redis returns string results not numeric
    const limitReached = keySet.size >= limit
    if (noResults || limitReached) {
      return Array.from(keySet)
    } else {
      // avoid hitting redis too hard
      await setTimeout(10)
    }
  }
}

// extract ids from keys like DocsWithHistoryOps:57fd0b1f53a8396d22b2c24b
// or docsInProject:{57fd0b1f53a8396d22b2c24b} (for redis cluster)
function _extractIds(keyList) {
  const result = []
  for (const key of keyList) {
    const m = key.match(/:\{?([0-9a-f]{24})\}?/) // extract object id
    result.push(m[1])
  }
  return result
}

async function flushAllProjects(options) {
  logger.info({ options }, 'flushing all projects')
  const projectKeys = await _getKeys(
    docUpdaterKeys.docsInProject({ project_id: '*' }),
    options.limit
  )
  const projectIds = _extractIds(projectKeys)
  if (options.dryRun) {
    return projectIds
  }
  const results = await promiseMapSettledWithLimit(
    options.concurrency,
    projectIds,
    projectId =>
      ProjectManager.promises.flushAndDeleteProjectWithLocks(projectId, {
        background: true,
      })
  )
  const success = []
  const failure = []
  for (let i = 0; i < results.length; i++) {
    if (results[i].status === 'rejected') {
      failure.push(projectIds[i])
    } else {
      success.push(projectIds[i])
    }
  }
  logger.info(
    { successCount: success.length, failureCount: failure.length },
    'finished flushing all projects'
  )
  return { success, failure }
}

module.exports = {
  _extractIds,
  promises: {
    flushAllProjects,
  },
}
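
A hedged usage sketch of the promisified flusher (option names taken from the code above; the require path is assumed):

const ProjectFlusher = require('./app/js/ProjectFlusher')

async function listDirtyProjects() {
  // dryRun: true returns the matching project ids without flushing them
  return await ProjectFlusher.promises.flushAllProjects({
    limit: 1000,
    dryRun: true,
  })
}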

View File

@@ -228,8 +228,7 @@ async function updateProjectWithLocks(
HistoryManager.shouldFlushHistoryOps(
projectId,
projectOpsLength,
updates.length
)
) {
HistoryManager.flushProjectChangesAsync(projectId)

View File

@@ -542,7 +542,7 @@ const RedisManager = {
)
// return if no projects ready to be processed
if (!projectsReady || projectsReady.length === 0) {
return {}
}
// pop the oldest entry (get and remove in a multi)
const multi = rclient.multi()
@@ -552,7 +552,7 @@ const RedisManager = {
multi.zcard(keys.flushAndDeleteQueue()) // the total length of the queue (for metrics)
const reply = await multi.exec()
if (!reply || reply.length === 0) {
return {}
}
const [key, timestamp] = reply[0]
const queueLength = reply[2]
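
Returning {} instead of undefined matters because callers destructure the result. A minimal sketch (the caller is not shown in this hunk, so the shape is assumed):

const { key, timestamp } = {} // fine: both fields are just undefined
// const { key, timestamp } = undefined // TypeError: cannot destructure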

View File

@@ -18,6 +18,7 @@
"types:check": "tsc --noEmit"
},
"dependencies": {
"@overleaf/fetch-utils": "*",
"@overleaf/logger": "*",
"@overleaf/metrics": "*",
"@overleaf/mongo-utils": "*",

View File

@@ -32,15 +32,7 @@ Options:
}
console.log('Flushing all projects with options:', options)
await ProjectFlusher.promises.flushAllProjects(options)
}
main()

View File

@@ -1,4 +1,5 @@
const sinon = require('sinon')
const { setTimeout } = require('node:timers/promises')
const Settings = require('@overleaf/settings')
const rclientProjectHistory = require('@overleaf/redis-wrapper').createClient(
Settings.redis.project_history
@@ -10,14 +11,23 @@ const MockWebApi = require('./helpers/MockWebApi')
const DocUpdaterClient = require('./helpers/DocUpdaterClient')
const DocUpdaterApp = require('./helpers/DocUpdaterApp')
async function sendProjectUpdateAndWait(projectId, userId, updates, version) {
await DocUpdaterClient.sendProjectUpdate(projectId, userId, updates, version)
// wait briefly for the updates to be processed
await setTimeout(200)
}
describe("Applying updates to a project's structure", function () {
before(async function () {
this.user_id = 'user-id-123'
this.version = 1234
await DocUpdaterApp.ensureRunning()
})
describe('renaming a file', function () {
before(async function () {
this.project_id = DocUpdaterClient.randomId()
this.fileUpdate = {
type: 'rename-file',
@@ -26,23 +36,12 @@ describe("Applying updates to a project's structure", function () {
newPathname: '/new-file-path',
}
this.updates = [this.fileUpdate]
await sendProjectUpdateAndWait(
this.project_id,
this.user_id,
this.updates,
this.version
)
})
it('should push the applied file renames to the project history api', function (done) {
@@ -70,7 +69,7 @@ describe("Applying updates to a project's structure", function () {
})
describe('deleting a file', function () {
before(async function () {
this.project_id = DocUpdaterClient.randomId()
this.fileUpdate = {
type: 'rename-file',
@@ -79,17 +78,11 @@ describe("Applying updates to a project's structure", function () {
newPathname: '',
}
this.updates = [this.fileUpdate]
await sendProjectUpdateAndWait(
this.project_id,
this.user_id,
this.updates,
this.version
)
})
@@ -129,19 +122,13 @@ describe("Applying updates to a project's structure", function () {
})
describe('when the document is not loaded', function () {
before(async function () {
this.project_id = DocUpdaterClient.randomId()
await sendProjectUpdateAndWait(
this.project_id,
this.user_id,
this.updates,
this.version
)
})
@@ -170,45 +157,29 @@ describe("Applying updates to a project's structure", function () {
})
describe('when the document is loaded', function () {
before(async function () {
this.project_id = DocUpdaterClient.randomId()
MockWebApi.insertDoc(this.project_id, this.update.id, {})
await DocUpdaterClient.preloadDoc(this.project_id, this.update.id)
sinon.spy(MockWebApi, 'getDocument')
await sendProjectUpdateAndWait(
this.project_id,
this.user_id,
this.updates,
this.version
)
})
after(function () {
MockWebApi.getDocument.restore()
})
it('should update the doc', async function () {
const doc = await DocUpdaterClient.getDoc(
this.project_id,
this.update.id
)
doc.pathname.should.equal(this.update.newPathname)
})
it('should push the applied doc renames to the project history api', function (done) {
@@ -271,19 +242,13 @@ describe("Applying updates to a project's structure", function () {
})
describe('when the documents are not loaded', function () {
before(async function () {
this.project_id = DocUpdaterClient.randomId()
await sendProjectUpdateAndWait(
this.project_id,
this.user_id,
this.updates,
this.version
)
})
@@ -348,19 +313,13 @@ describe("Applying updates to a project's structure", function () {
})
describe('when the document is not loaded', function () {
before(async function () {
this.project_id = DocUpdaterClient.randomId()
await sendProjectUpdateAndWait(
this.project_id,
this.user_id,
this.updates,
this.version
)
})
@@ -389,46 +348,29 @@ describe("Applying updates to a project's structure", function () {
})
describe('when the document is loaded', function () {
before(async function () {
this.project_id = DocUpdaterClient.randomId()
MockWebApi.insertDoc(this.project_id, this.update.id, {})
await DocUpdaterClient.preloadDoc(this.project_id, this.update.id)
sinon.spy(MockWebApi, 'getDocument')
await sendProjectUpdateAndWait(
this.project_id,
this.user_id,
this.updates,
this.version
)
})
after(function () {
MockWebApi.getDocument.restore()
})
it('should not modify the doc', async function () {
const doc = await DocUpdaterClient.getDoc(
this.project_id,
this.update.id
)
doc.pathname.should.equal('/a/b/c.tex') // default pathname from MockWebApi
})
it('should push the applied doc update to the project history api', function (done) {
@@ -457,7 +399,7 @@ describe("Applying updates to a project's structure", function () {
})
describe('adding a file', function () {
before(async function () {
this.project_id = DocUpdaterClient.randomId()
this.fileUpdate = {
type: 'add-file',
@@ -466,17 +408,11 @@ describe("Applying updates to a project's structure", function () {
url: 'filestore.example.com',
}
this.updates = [this.fileUpdate]
await sendProjectUpdateAndWait(
this.project_id,
this.user_id,
this.updates,
this.version
)
})
@@ -505,7 +441,7 @@ describe("Applying updates to a project's structure", function () {
})
describe('adding a doc', function () {
before(async function () {
this.project_id = DocUpdaterClient.randomId()
this.docUpdate = {
type: 'add-doc',
@@ -514,17 +450,11 @@ describe("Applying updates to a project's structure", function () {
docLines: 'a\nb',
}
this.updates = [this.docUpdate]
await sendProjectUpdateAndWait(
this.project_id,
this.user_id,
this.updates,
this.version
)
})
@@ -553,7 +483,7 @@ describe("Applying updates to a project's structure", function () {
})
describe('with enough updates to flush to the history service', function () {
before(async function () {
this.project_id = DocUpdaterClient.randomId()
this.user_id = DocUpdaterClient.randomId()
this.version0 = 12345
@@ -574,29 +504,19 @@ describe("Applying updates to a project's structure", function () {
// Send updates in chunks to causes multiple flushes
const projectId = this.project_id
const userId = this.project_id
await DocUpdaterClient.sendProjectUpdate(
projectId,
userId,
updates.slice(0, 250),
this.version0
)
await DocUpdaterClient.sendProjectUpdate(
projectId,
userId,
updates.slice(250),
this.version1
)
await setTimeout(200)
})
after(function () {
@@ -611,7 +531,7 @@ describe("Applying updates to a project's structure", function () {
})
describe('with too few updates to flush to the history service', function () {
before(async function () {
this.project_id = DocUpdaterClient.randomId()
this.user_id = DocUpdaterClient.randomId()
this.version0 = 12345
@@ -633,29 +553,19 @@ describe("Applying updates to a project's structure", function () {
// Send updates in chunks
const projectId = this.project_id
const userId = this.project_id
await DocUpdaterClient.sendProjectUpdate(
projectId,
userId,
updates.slice(0, 10),
this.version0
)
await DocUpdaterClient.sendProjectUpdate(
projectId,
userId,
updates.slice(10),
this.version1
)
await setTimeout(200)
})
after(function () {

View File

@@ -15,8 +15,8 @@ const rclient = require('@overleaf/redis-wrapper').createClient(
)
describe('CheckRedisMongoSyncState', function () {
beforeEach(async function () {
await DocUpdaterApp.ensureRunning()
})
beforeEach(async function () {
await rclient.flushall()
@@ -60,14 +60,14 @@ describe('CheckRedisMongoSyncState', function () {
describe('with a project', function () {
let projectId, docId
beforeEach(async function () {
projectId = DocUpdaterClient.randomId()
docId = DocUpdaterClient.randomId()
MockWebApi.insertDoc(projectId, docId, {
lines: ['mongo', 'lines'],
version: 1,
})
await DocUpdaterClient.preloadDoc(projectId, docId)
})
it('should work when in sync', async function () {
@@ -149,14 +149,14 @@ describe('CheckRedisMongoSyncState', function () {
describe('with a project', function () {
let projectId2, docId2
beforeEach(async function () {
projectId2 = DocUpdaterClient.randomId()
docId2 = DocUpdaterClient.randomId()
MockWebApi.insertDoc(projectId2, docId2, {
lines: ['mongo', 'lines'],
version: 1,
})
await DocUpdaterClient.preloadDoc(projectId2, docId2)
})
it('should work when in sync', async function () {
@@ -245,14 +245,14 @@ describe('CheckRedisMongoSyncState', function () {
describe('with more projects than the LIMIT', function () {
for (let i = 0; i < 20; i++) {
beforeEach(async function () {
const projectId = DocUpdaterClient.randomId()
const docId = DocUpdaterClient.randomId()
MockWebApi.insertDoc(projectId, docId, {
lines: ['mongo', 'lines'],
version: 1,
})
await DocUpdaterClient.preloadDoc(projectId, docId)
})
}
@@ -278,7 +278,7 @@ describe('CheckRedisMongoSyncState', function () {
describe('with partially deleted doc', function () {
let projectId, docId
beforeEach(async function () {
projectId = DocUpdaterClient.randomId()
docId = DocUpdaterClient.randomId()
MockWebApi.insertDoc(projectId, docId, {
@@ -289,10 +289,8 @@ describe('CheckRedisMongoSyncState', function () {
lines: ['mongo', 'lines'],
version: 1,
})
await DocUpdaterClient.preloadDoc(projectId, docId)
MockWebApi.clearDocs()
})
describe('with only the file-tree entry deleted', function () {
it('should flag the partial deletion', async function () {

View File

@@ -1,19 +1,12 @@
const sinon = require('sinon')
const MockProjectHistoryApi = require('./helpers/MockProjectHistoryApi')
const MockWebApi = require('./helpers/MockWebApi')
const DocUpdaterClient = require('./helpers/DocUpdaterClient')
const DocUpdaterApp = require('./helpers/DocUpdaterApp')
const { setTimeout } = require('node:timers/promises')
describe('Deleting a document', function () {
before(async function () {
this.lines = ['one', 'two', 'three']
this.version = 42
this.update = {
@@ -29,7 +22,7 @@ describe('Deleting a document', function () {
this.result = ['one', 'one and a half', 'two', 'three']
sinon.spy(MockProjectHistoryApi, 'flushProject')
await DocUpdaterApp.ensureRunning()
})
after(function () {
@@ -37,11 +30,9 @@ describe('Deleting a document', function () {
})
describe('when the updated doc exists in the doc updater', function () {
before(async function () {
this.project_id = DocUpdaterClient.randomId()
this.doc_id = DocUpdaterClient.randomId()
sinon.spy(MockWebApi, 'setDocument')
sinon.spy(MockWebApi, 'getDocument')
@@ -49,32 +40,15 @@ describe('Deleting a document', function () {
lines: this.lines,
version: this.version,
})
await DocUpdaterClient.preloadDoc(this.project_id, this.doc_id)
await DocUpdaterClient.sendUpdate(
this.project_id,
this.doc_id,
this.update
)
await setTimeout(200)
const res = await DocUpdaterClient.deleteDoc(this.project_id, this.doc_id)
this.statusCode = res.status
})
after(function () {
@@ -92,20 +66,13 @@ describe('Deleting a document', function () {
.should.equal(true)
})
it('should need to reload the doc if read again', async function () {
MockWebApi.getDocument.resetHistory()
MockWebApi.getDocument.called.should.equals(false)
await DocUpdaterClient.getDoc(this.project_id, this.doc_id)
MockWebApi.getDocument
.calledWith(this.project_id, this.doc_id)
.should.equal(true)
})
it('should flush project history', function () {
@@ -116,25 +83,16 @@ describe('Deleting a document', function () {
})
describe('when the doc is not in the doc updater', function () {
before(async function () {
this.project_id = DocUpdaterClient.randomId()
this.doc_id = DocUpdaterClient.randomId()
MockWebApi.insertDoc(this.project_id, this.doc_id, {
lines: this.lines,
})
sinon.spy(MockWebApi, 'setDocument')
sinon.spy(MockWebApi, 'getDocument')
const res = await DocUpdaterClient.deleteDoc(this.project_id, this.doc_id)
this.statusCode = res.status
})
after(function () {
@@ -150,19 +108,12 @@ describe('Deleting a document', function () {
MockWebApi.setDocument.called.should.equal(false)
})
it('should need to reload the doc if read again', async function () {
MockWebApi.getDocument.called.should.equals(false)
await DocUpdaterClient.getDoc(this.project_id, this.doc_id)
MockWebApi.getDocument
.calledWith(this.project_id, this.doc_id)
.should.equal(true)
})
it('should flush project history', function () {

View File

@@ -1,13 +1,5 @@
const sinon = require('sinon')
const { setTimeout } = require('node:timers/promises')
const MockProjectHistoryApi = require('./helpers/MockProjectHistoryApi')
const MockWebApi = require('./helpers/MockWebApi')
@@ -15,7 +7,7 @@ const DocUpdaterClient = require('./helpers/DocUpdaterClient')
const DocUpdaterApp = require('./helpers/DocUpdaterApp')
describe('Deleting a project', function () {
beforeEach(async function () {
let docId0, docId1
this.project_id = DocUpdaterClient.randomId()
this.docs = [
@@ -50,45 +42,27 @@ describe('Deleting a project', function () {
updatedLines: ['four', 'four and a half', 'five', 'six'],
},
]
for (const doc of this.docs) {
MockWebApi.insertDoc(this.project_id, doc.id, {
lines: doc.lines,
version: doc.update.v,
})
}
await DocUpdaterApp.ensureRunning()
})
describe('without updates', function () {
beforeEach(async function () {
sinon.spy(MockWebApi, 'setDocument')
sinon.spy(MockProjectHistoryApi, 'flushProject')
for (const doc of this.docs) {
await DocUpdaterClient.preloadDoc(this.project_id, doc.id)
}
await setTimeout(200)
const res = await DocUpdaterClient.deleteProject(this.project_id)
this.statusCode = res.status
})
afterEach(function () {
@@ -104,32 +78,18 @@ describe('Deleting a project', function () {
MockWebApi.setDocument.should.not.have.been.called
})
it('should need to reload the docs if read again', async function () {
sinon.spy(MockWebApi, 'getDocument')
for (const doc of this.docs) {
MockWebApi.getDocument
.calledWith(this.project_id, doc.id)
.should.equal(false)
await DocUpdaterClient.getDoc(this.project_id, doc.id)
MockWebApi.getDocument
.calledWith(this.project_id, doc.id)
.should.equal(true)
}
MockWebApi.getDocument.restore()
})
it('should flush each doc in project history', function () {
@@ -140,44 +100,16 @@ describe('Deleting a project', function () {
})
describe('with documents which have been updated', function () {
beforeEach(async function () {
sinon.spy(MockWebApi, 'setDocument')
sinon.spy(MockProjectHistoryApi, 'flushProject')
for (const doc of this.docs) {
await DocUpdaterClient.preloadDoc(this.project_id, doc.id)
await DocUpdaterClient.sendUpdate(this.project_id, doc.id, doc.update)
}
await setTimeout(200)
const res = await DocUpdaterClient.deleteProject(this.project_id)
this.statusCode = res.status
})
afterEach(function () {
@@ -190,39 +122,25 @@ describe('Deleting a project', function () {
})
it('should send each document to the web api', function () {
for (const doc of this.docs) {
MockWebApi.setDocument
.calledWith(this.project_id, doc.id, doc.updatedLines)
.should.equal(true)
}
})
it('should need to reload the docs if read again', async function () {
sinon.spy(MockWebApi, 'getDocument')
for (const doc of this.docs) {
MockWebApi.getDocument
.calledWith(this.project_id, doc.id)
.should.equal(false)
await DocUpdaterClient.getDoc(this.project_id, doc.id)
MockWebApi.getDocument
.calledWith(this.project_id, doc.id)
.should.equal(true)
}
MockWebApi.getDocument.restore()
})
it('should flush each doc in project history', function () {
@@ -233,44 +151,18 @@ describe('Deleting a project', function () {
})
describe('with the background=true parameter from realtime and no request to flush the queue', function () {
beforeEach(async function () {
sinon.spy(MockWebApi, 'setDocument')
sinon.spy(MockProjectHistoryApi, 'flushProject')
for (const doc of this.docs) {
await DocUpdaterClient.preloadDoc(this.project_id, doc.id)
await DocUpdaterClient.sendUpdate(this.project_id, doc.id, doc.update)
}
await setTimeout(200)
const res = await DocUpdaterClient.deleteProjectOnShutdown(
this.project_id
)
this.statusCode = res.status
})
afterEach(function () {
@@ -292,45 +184,21 @@ describe('Deleting a project', function () {
})
describe('with the background=true parameter from realtime and a request to flush the queue', function () {
beforeEach(async function () {
sinon.spy(MockWebApi, 'setDocument')
sinon.spy(MockProjectHistoryApi, 'flushProject')
for (const doc of this.docs) {
await DocUpdaterClient.preloadDoc(this.project_id, doc.id)
await DocUpdaterClient.sendUpdate(this.project_id, doc.id, doc.update)
}
await setTimeout(200)
const res = await DocUpdaterClient.deleteProjectOnShutdown(
this.project_id
)
this.statusCode = res.status
// after deleting the project and putting it in the queue, flush the queue
await setTimeout(2000)
await DocUpdaterClient.flushOldProjects()
})
afterEach(function () {
@@ -343,11 +211,11 @@ describe('Deleting a project', function () {
})
it('should send each document to the web api', function () {
for (const doc of this.docs) {
MockWebApi.setDocument
.calledWith(this.project_id, doc.id, doc.updatedLines)
.should.equal(true)
}
})
it('should flush to project history', function () {

View File

@@ -1,26 +1,18 @@
const sinon = require('sinon')
const { setTimeout } = require('node:timers/promises')
const MockWebApi = require('./helpers/MockWebApi')
const DocUpdaterClient = require('./helpers/DocUpdaterClient')
const DocUpdaterApp = require('./helpers/DocUpdaterApp')
describe('Flushing a project', function () {
before(async function () {
this.project_id = DocUpdaterClient.randomId()
const docId0 = DocUpdaterClient.randomId()
const docId1 = DocUpdaterClient.randomId()
this.docs = [
{
id: docId0,
lines: ['one', 'two', 'three'],
update: {
doc: docId0,
@@ -35,7 +27,7 @@ describe('Flushing a project', function () {
updatedLines: ['one', 'one and a half', 'two', 'three'],
},
{
id: docId1,
lines: ['four', 'five', 'six'],
update: {
doc: docId1,
@@ -50,92 +42,51 @@ describe('Flushing a project', function () {
updatedLines: ['four', 'four and a half', 'five', 'six'],
},
]
for (const doc of this.docs) {
MockWebApi.insertDoc(this.project_id, doc.id, {
lines: doc.lines,
version: doc.update.v,
})
}
await DocUpdaterApp.ensureRunning()
})
describe('with documents which have been updated', function () {
before(async function () {
sinon.spy(MockWebApi, 'setDocument')
for (const doc of this.docs) {
await DocUpdaterClient.preloadDoc(this.project_id, doc.id)
await DocUpdaterClient.sendUpdate(this.project_id, doc.id, doc.update)
}
await setTimeout(200)
const res = await DocUpdaterClient.flushProject(this.project_id)
this.statusCode = res.status
})
after(function () {
MockWebApi.setDocument.restore()
})
it('should return a 204 status code', function () {
this.statusCode.should.equal(204)
})
it('should send each document to the web api', function () {
for (const doc of this.docs) {
MockWebApi.setDocument
.calledWith(this.project_id, doc.id, doc.updatedLines)
.should.equal(true)
}
})
it('should update the lines in the doc updater', async function () {
for (const doc of this.docs) {
const returnedDoc = await DocUpdaterClient.getDoc(
this.project_id,
doc.id
)
returnedDoc.lines.should.deep.equal(doc.updatedLines)
}
})
})
})

View File

@@ -1,26 +1,13 @@
const sinon = require('sinon')
const { expect } = require('chai')
const { setTimeout } = require('node:timers/promises')
const MockWebApi = require('./helpers/MockWebApi')
const DocUpdaterClient = require('./helpers/DocUpdaterClient')
const DocUpdaterApp = require('./helpers/DocUpdaterApp')
describe('Flushing a doc to Mongo', function () {
before(async function () {
this.lines = ['one', 'two', 'three']
this.version = 42
this.update = {
@@ -35,83 +22,69 @@ describe('Flushing a doc to Mongo', function () {
v: this.version,
}
this.result = ['one', 'one and a half', 'two', 'three']
await DocUpdaterApp.ensureRunning()
})
describe('when the updated doc exists in the doc updater', function () {
before(async function () {
this.project_id = DocUpdaterClient.randomId()
this.doc_id = DocUpdaterClient.randomId()
sinon.spy(MockWebApi, 'setDocument')
MockWebApi.insertDoc(this.project_id, this.doc_id, {
lines: this.lines,
version: this.version,
})
await DocUpdaterClient.sendUpdates(this.project_id, this.doc_id, [
this.update,
])
await setTimeout(200)
await DocUpdaterClient.flushDoc(this.project_id, this.doc_id)
})
after(function () {
MockWebApi.setDocument.restore()
})
it('should flush the updated doc lines and version to the web api', function () {
MockWebApi.setDocument
.calledWith(this.project_id, this.doc_id, this.result, this.version + 1)
.should.equal(true)
})
it('should flush the last update author and time to the web api', function () {
const lastUpdatedAt = MockWebApi.setDocument.lastCall.args[5]
parseInt(lastUpdatedAt).should.be.closeTo(new Date().getTime(), 30000)
const lastUpdatedBy = MockWebApi.setDocument.lastCall.args[6]
lastUpdatedBy.should.equal('last-author-fake-id')
})
})
describe('when the doc does not exist in the doc updater', function () {
before(async function () {
this.project_id = DocUpdaterClient.randomId()
this.doc_id = DocUpdaterClient.randomId()
MockWebApi.insertDoc(this.project_id, this.doc_id, {
lines: this.lines,
})
sinon.spy(MockWebApi, 'setDocument')
await DocUpdaterClient.flushDoc(this.project_id, this.doc_id)
})
after(function () {
MockWebApi.setDocument.restore()
})
it('should not flush the doc to the web api', function () {
MockWebApi.setDocument.called.should.equal(false)
})
})
describe('when the web api http request takes a long time on first request', function () {
before(async function () {
this.project_id = DocUpdaterClient.randomId()
this.doc_id = DocUpdaterClient.randomId()
MockWebApi.insertDoc(this.project_id, this.doc_id, {
lines: this.lines,
version: this.version,
@@ -130,33 +103,26 @@ describe('Flushing a doc to Mongo', function () {
lastUpdatedBy,
callback
) => {
if (callback == null) {
if (!callback) {
callback = function () {}
}
setTimeout(callback, t)
return (t = 0)
t = 0
}
)
return DocUpdaterClient.preloadDoc(this.project_id, this.doc_id, done)
await DocUpdaterClient.preloadDoc(this.project_id, this.doc_id)
})
after(function () {
return MockWebApi.setDocument.restore()
MockWebApi.setDocument.restore()
})
return it('should still work', function (done) {
it('should still work', async function () {
const start = Date.now()
return DocUpdaterClient.flushDoc(
this.project_id,
this.doc_id,
(error, res, doc) => {
if (error) return done(error)
res.statusCode.should.equal(204)
const delta = Date.now() - start
expect(delta).to.be.below(20000)
return done()
}
)
const res = await DocUpdaterClient.flushDoc(this.project_id, this.doc_id)
res.status.should.equal(204)
const delta = Date.now() - start
expect(delta).to.be.below(20000)
})
})
})
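
The recurring move in this diff: a Mocha `before` hook built from nested callbacks and a bare `setTimeout` is flattened into sequential awaits. A minimal sketch of the shape, using the same helpers as the tests above (MockWebApi setup elided, so this is illustrative rather than a complete test):

    const { setTimeout } = require('node:timers/promises')
    const DocUpdaterClient = require('./helpers/DocUpdaterClient')

    before(async function () {
      // Mocha awaits the returned promise; a rejection fails the hook,
      // so no done() or explicit error plumbing is needed.
      await DocUpdaterClient.sendUpdates(this.project_id, this.doc_id, [this.update])
      await setTimeout(200) // let the applied ops settle before flushing
      await DocUpdaterClient.flushDoc(this.project_id, this.doc_id)
    })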


@@ -1,32 +1,22 @@
// TODO: This file was created by bulk-decaffeinate.
// Fix any style issues and re-enable lint.
/*
* decaffeinate suggestions:
* DS101: Remove unnecessary use of Array.from
* DS102: Remove unnecessary code created because of implicit returns
* DS207: Consider shorter variations of null checks
* Full docs: https://github.com/decaffeinate/decaffeinate/blob/master/docs/suggestions.md
*/
const sinon = require('sinon')
const { expect } = require('chai')
const MockWebApi = require('./helpers/MockWebApi')
const DocUpdaterClient = require('./helpers/DocUpdaterClient')
const DocUpdaterApp = require('./helpers/DocUpdaterApp')
const { RequestFailedError } = require('@overleaf/fetch-utils')
describe('Getting a document', function () {
before(function (done) {
before(async function () {
this.lines = ['one', 'two', 'three']
this.version = 42
return DocUpdaterApp.ensureRunning(done)
await DocUpdaterApp.ensureRunning()
})
describe('when the document is not loaded', function () {
before(function (done) {
;[this.project_id, this.doc_id] = Array.from([
DocUpdaterClient.randomId(),
DocUpdaterClient.randomId(),
])
before(async function () {
this.project_id = DocUpdaterClient.randomId()
this.doc_id = DocUpdaterClient.randomId()
sinon.spy(MockWebApi, 'getDocument')
MockWebApi.insertDoc(this.project_id, this.doc_id, {
@@ -34,87 +24,66 @@ describe('Getting a document', function () {
version: this.version,
})
return DocUpdaterClient.getDoc(
this.returnedDoc = await DocUpdaterClient.getDoc(
this.project_id,
this.doc_id,
(error, res, returnedDoc) => {
if (error) return done(error)
this.returnedDoc = returnedDoc
return done()
}
this.doc_id
)
})
after(function () {
return MockWebApi.getDocument.restore()
MockWebApi.getDocument.restore()
})
it('should load the document from the web API', function () {
return MockWebApi.getDocument
MockWebApi.getDocument
.calledWith(this.project_id, this.doc_id)
.should.equal(true)
})
it('should return the document lines', function () {
return this.returnedDoc.lines.should.deep.equal(this.lines)
this.returnedDoc.lines.should.deep.equal(this.lines)
})
return it('should return the document at its current version', function () {
return this.returnedDoc.version.should.equal(this.version)
it('should return the document at its current version', function () {
this.returnedDoc.version.should.equal(this.version)
})
})
describe('when the document is already loaded', function () {
before(function (done) {
;[this.project_id, this.doc_id] = Array.from([
DocUpdaterClient.randomId(),
DocUpdaterClient.randomId(),
])
before(async function () {
this.project_id = DocUpdaterClient.randomId()
this.doc_id = DocUpdaterClient.randomId()
MockWebApi.insertDoc(this.project_id, this.doc_id, {
lines: this.lines,
version: this.version,
})
return DocUpdaterClient.preloadDoc(
await DocUpdaterClient.preloadDoc(this.project_id, this.doc_id)
sinon.spy(MockWebApi, 'getDocument')
this.returnedDoc = await DocUpdaterClient.getDoc(
this.project_id,
this.doc_id,
error => {
if (error != null) {
throw error
}
sinon.spy(MockWebApi, 'getDocument')
return DocUpdaterClient.getDoc(
this.project_id,
this.doc_id,
(error, res, returnedDoc) => {
if (error) return done(error)
this.returnedDoc = returnedDoc
return done()
}
)
}
this.doc_id
)
})
after(function () {
return MockWebApi.getDocument.restore()
MockWebApi.getDocument.restore()
})
it('should not load the document from the web API', function () {
return MockWebApi.getDocument.called.should.equal(false)
MockWebApi.getDocument.called.should.equal(false)
})
return it('should return the document lines', function () {
return this.returnedDoc.lines.should.deep.equal(this.lines)
it('should return the document lines', function () {
this.returnedDoc.lines.should.deep.equal(this.lines)
})
})
describe('when the request asks for some recent ops', function () {
before(function (done) {
;[this.project_id, this.doc_id] = Array.from([
DocUpdaterClient.randomId(),
DocUpdaterClient.randomId(),
])
before(async function () {
this.project_id = DocUpdaterClient.randomId()
this.doc_id = DocUpdaterClient.randomId()
MockWebApi.insertDoc(this.project_id, this.doc_id, {
lines: (this.lines = ['one', 'two', 'three']),
})
@@ -125,159 +94,109 @@ describe('Getting a document', function () {
v,
}))
return DocUpdaterClient.sendUpdates(
await DocUpdaterClient.sendUpdates(
this.project_id,
this.doc_id,
this.updates,
error => {
if (error != null) {
throw error
}
sinon.spy(MockWebApi, 'getDocument')
return done()
}
this.updates
)
sinon.spy(MockWebApi, 'getDocument')
})
after(function () {
return MockWebApi.getDocument.restore()
MockWebApi.getDocument.restore()
})
describe('when the ops are loaded', function () {
before(function (done) {
return DocUpdaterClient.getDocAndRecentOps(
before(async function () {
this.returnedDoc = await DocUpdaterClient.getDocAndRecentOps(
this.project_id,
this.doc_id,
190,
(error, res, returnedDoc) => {
if (error) return done(error)
this.returnedDoc = returnedDoc
return done()
}
190
)
})
return it('should return the recent ops', function () {
it('should return the recent ops', function () {
this.returnedDoc.ops.length.should.equal(10)
return Array.from(this.updates.slice(190, -1)).map((update, i) =>
for (const [i, update] of this.updates.slice(190, -1).entries()) {
this.returnedDoc.ops[i].op.should.deep.equal(update.op)
)
}
})
})
return describe('when the ops are not all loaded', function () {
before(function (done) {
describe('when the ops are not all loaded', function () {
it('should return UnprocessableEntity', async function () {
// We only track 100 ops
return DocUpdaterClient.getDocAndRecentOps(
this.project_id,
this.doc_id,
10,
(error, res, returnedDoc) => {
if (error) return done(error)
this.res = res
this.returnedDoc = returnedDoc
return done()
}
await expect(
DocUpdaterClient.getDocAndRecentOps(this.project_id, this.doc_id, 10)
)
})
return it('should return UnprocessableEntity', function () {
return this.res.statusCode.should.equal(422)
.to.be.rejectedWith(RequestFailedError)
.and.eventually.have.nested.property('response.status', 422)
})
})
})
describe('when the document does not exist', function () {
before(function (done) {
;[this.project_id, this.doc_id] = Array.from([
DocUpdaterClient.randomId(),
DocUpdaterClient.randomId(),
])
return DocUpdaterClient.getDoc(
this.project_id,
this.doc_id,
(error, res, doc) => {
if (error) return done(error)
this.statusCode = res.statusCode
return done()
}
)
})
return it('should return 404', function () {
return this.statusCode.should.equal(404)
it('should return 404', async function () {
const projectId = DocUpdaterClient.randomId()
const docId = DocUpdaterClient.randomId()
await expect(DocUpdaterClient.getDoc(projectId, docId))
.to.be.rejectedWith(RequestFailedError)
.and.eventually.have.nested.property('response.status', 404)
})
})
describe('when the web api returns an error', function () {
before(function (done) {
;[this.project_id, this.doc_id] = Array.from([
DocUpdaterClient.randomId(),
DocUpdaterClient.randomId(),
])
before(function () {
sinon
.stub(MockWebApi, 'getDocument')
.callsFake((projectId, docId, callback) => {
if (callback == null) {
callback = function () {}
}
return callback(new Error('oops'))
callback(new Error('oops'))
})
return DocUpdaterClient.getDoc(
this.project_id,
this.doc_id,
(error, res, doc) => {
if (error) return done(error)
this.statusCode = res.statusCode
return done()
}
)
})
after(function () {
return MockWebApi.getDocument.restore()
MockWebApi.getDocument.restore()
})
return it('should return 500', function () {
return this.statusCode.should.equal(500)
it('should return 500', async function () {
const projectId = DocUpdaterClient.randomId()
const docId = DocUpdaterClient.randomId()
await expect(DocUpdaterClient.getDoc(projectId, docId))
.to.be.rejectedWith(RequestFailedError)
.and.eventually.have.nested.property('response.status', 500)
})
})
return describe('when the web api http request takes a long time', function () {
describe('when the web api http request takes a long time', function () {
before(function (done) {
this.timeout = 10000
;[this.project_id, this.doc_id] = Array.from([
DocUpdaterClient.randomId(),
DocUpdaterClient.randomId(),
])
sinon
.stub(MockWebApi, 'getDocument')
.callsFake((projectId, docId, callback) => {
if (callback == null) {
callback = function () {}
}
return setTimeout(callback, 30000)
setTimeout(callback, 30000)
})
return done()
done()
})
after(function () {
return MockWebApi.getDocument.restore()
MockWebApi.getDocument.restore()
})
return it('should return quickly(ish)', function (done) {
it('should return quickly(ish)', async function () {
const projectId = DocUpdaterClient.randomId()
const docId = DocUpdaterClient.randomId()
const start = Date.now()
return DocUpdaterClient.getDoc(
this.project_id,
this.doc_id,
(error, res, doc) => {
if (error) return done(error)
res.statusCode.should.equal(500)
const delta = Date.now() - start
expect(delta).to.be.below(20000)
return done()
}
)
await expect(DocUpdaterClient.getDoc(projectId, docId))
.to.be.rejectedWith(RequestFailedError)
.and.eventually.have.nested.property('response.status', 500)
const delta = Date.now() - start
expect(delta).to.be.below(20000)
})
})
})
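
With the client now built on @overleaf/fetch-utils, a non-2xx response no longer arrives as `(error, res, body)`: `fetchJson` rejects with a `RequestFailedError` that carries the response. The tests above assert on that rejection; a minimal sketch, assuming `chai-as-promised` is registered in the test setup (the `.rejectedWith(...).and.eventually...` chain requires it):

    const chai = require('chai')
    chai.use(require('chai-as-promised'))
    const { expect } = chai
    const { RequestFailedError } = require('@overleaf/fetch-utils')
    const DocUpdaterClient = require('./helpers/DocUpdaterClient')

    it('should return 404 for an unknown doc', async function () {
      const projectId = DocUpdaterClient.randomId()
      const docId = DocUpdaterClient.randomId()
      // The request itself is expected to fail; assert on the rejection,
      // then drill into the attached response for the status code.
      await expect(DocUpdaterClient.getDoc(projectId, docId))
        .to.be.rejectedWith(RequestFailedError)
        .and.eventually.have.nested.property('response.status', 404)
    })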


@@ -1,176 +1,77 @@
/* eslint-disable
no-unused-vars,
*/
// TODO: This file was created by bulk-decaffeinate.
// Fix any style issues and re-enable lint.
/*
* decaffeinate suggestions:
* DS101: Remove unnecessary use of Array.from
* DS102: Remove unnecessary code created because of implicit returns
* DS207: Consider shorter variations of null checks
* Full docs: https://github.com/decaffeinate/decaffeinate/blob/master/docs/suggestions.md
*/
const sinon = require('sinon')
const { expect } = require('chai')
const MockWebApi = require('./helpers/MockWebApi')
const DocUpdaterClient = require('./helpers/DocUpdaterClient')
const DocUpdaterApp = require('./helpers/DocUpdaterApp')
const { RequestFailedError } = require('@overleaf/fetch-utils')
describe('Getting documents for project', function () {
before(function (done) {
before(async function () {
this.lines = ['one', 'two', 'three']
this.version = 42
return DocUpdaterApp.ensureRunning(done)
await DocUpdaterApp.ensureRunning()
})
describe('when project state hash does not match', function () {
before(function (done) {
this.projectStateHash = DocUpdaterClient.randomId()
;[this.project_id, this.doc_id] = Array.from([
DocUpdaterClient.randomId(),
DocUpdaterClient.randomId(),
])
MockWebApi.insertDoc(this.project_id, this.doc_id, {
it('should return a 409 Conflict response', async function () {
const projectStateHash = DocUpdaterClient.randomId()
const projectId = DocUpdaterClient.randomId()
const docId = DocUpdaterClient.randomId()
MockWebApi.insertDoc(projectId, docId, {
lines: this.lines,
version: this.version,
})
return DocUpdaterClient.preloadDoc(
this.project_id,
this.doc_id,
error => {
if (error != null) {
throw error
}
return DocUpdaterClient.getProjectDocs(
this.project_id,
this.projectStateHash,
(error, res, returnedDocs) => {
if (error) return done(error)
this.res = res
this.returnedDocs = returnedDocs
return done()
}
)
}
)
})
return it('should return a 409 Conflict response', function () {
return this.res.statusCode.should.equal(409)
await DocUpdaterClient.preloadDoc(projectId, docId)
await expect(DocUpdaterClient.getProjectDocs(projectId, projectStateHash))
.to.be.rejectedWith(RequestFailedError)
.and.eventually.have.nested.property('response.status', 409)
})
})
describe('when project state hash matches', function () {
before(function (done) {
this.projectStateHash = DocUpdaterClient.randomId()
;[this.project_id, this.doc_id] = Array.from([
DocUpdaterClient.randomId(),
DocUpdaterClient.randomId(),
])
MockWebApi.insertDoc(this.project_id, this.doc_id, {
it('should return the documents', async function () {
const projectStateHash = DocUpdaterClient.randomId()
const projectId = DocUpdaterClient.randomId()
const docId = DocUpdaterClient.randomId()
MockWebApi.insertDoc(projectId, docId, {
lines: this.lines,
version: this.version,
})
return DocUpdaterClient.preloadDoc(
this.project_id,
this.doc_id,
error => {
if (error != null) {
throw error
}
return DocUpdaterClient.getProjectDocs(
this.project_id,
this.projectStateHash,
(error, res0, returnedDocs0) => {
if (error) return done(error)
// set the hash
this.res0 = res0
this.returnedDocs0 = returnedDocs0
return DocUpdaterClient.getProjectDocs(
this.project_id,
this.projectStateHash,
(error, res, returnedDocs) => {
if (error) return done(error)
// the hash should now match
this.res = res
this.returnedDocs = returnedDocs
return done()
}
)
}
)
}
await DocUpdaterClient.preloadDoc(projectId, docId)
// set the hash
await expect(DocUpdaterClient.getProjectDocs(projectId, projectStateHash))
.to.be.rejectedWith(RequestFailedError)
.and.eventually.have.nested.property('response.status', 409)
const returnedDocs1 = await DocUpdaterClient.getProjectDocs(
projectId,
projectStateHash
)
})
it('should return a 200 response', function () {
return this.res.statusCode.should.equal(200)
})
return it('should return the documents', function () {
return this.returnedDocs.should.deep.equal([
{ _id: this.doc_id, lines: this.lines, v: this.version },
// the hash should now match
returnedDocs1.should.deep.equal([
{ _id: docId, lines: this.lines, v: this.version },
])
})
})
return describe('when the doc has been removed', function () {
before(function (done) {
this.projectStateHash = DocUpdaterClient.randomId()
;[this.project_id, this.doc_id] = Array.from([
DocUpdaterClient.randomId(),
DocUpdaterClient.randomId(),
])
MockWebApi.insertDoc(this.project_id, this.doc_id, {
describe('when the doc has been removed', function () {
it('should return a 409 Conflict response', async function () {
const projectStateHash = DocUpdaterClient.randomId()
const projectId = DocUpdaterClient.randomId()
const docId = DocUpdaterClient.randomId()
MockWebApi.insertDoc(projectId, docId, {
lines: this.lines,
version: this.version,
})
return DocUpdaterClient.preloadDoc(
this.project_id,
this.doc_id,
error => {
if (error != null) {
throw error
}
return DocUpdaterClient.getProjectDocs(
this.project_id,
this.projectStateHash,
(error, res0, returnedDocs0) => {
if (error) return done(error)
// set the hash
this.res0 = res0
this.returnedDocs0 = returnedDocs0
return DocUpdaterClient.deleteDoc(
this.project_id,
this.doc_id,
(error, res, body) => {
if (error) return done(error)
// delete the doc
return DocUpdaterClient.getProjectDocs(
this.project_id,
this.projectStateHash,
(error, res1, returnedDocs) => {
if (error) return done(error)
// the hash would match, but the doc has been deleted
this.res = res1
this.returnedDocs = returnedDocs
return done()
}
)
}
)
}
)
}
)
})
return it('should return a 409 Conflict response', function () {
return this.res.statusCode.should.equal(409)
await DocUpdaterClient.preloadDoc(projectId, docId)
await expect(DocUpdaterClient.getProjectDocs(projectId, projectStateHash))
.to.be.rejectedWith(RequestFailedError)
.and.eventually.have.nested.property('response.status', 409)
await DocUpdaterClient.deleteDoc(projectId, docId)
// the hash would match, but the doc has been deleted
await expect(DocUpdaterClient.getProjectDocs(projectId, projectStateHash))
.to.be.rejectedWith(RequestFailedError)
.and.eventually.have.nested.property('response.status', 409)
})
})
})
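
The state-hash tests above follow a two-step shape: the first `getProjectDocs` call seeds the hash server-side and is expected to 409, and only the retry with the same hash returns the docs. With async/await this reads top to bottom; a sketch of the shape (setup elided, names as in the test above):

    // First call seeds the project state hash and fails with 409 Conflict.
    await expect(DocUpdaterClient.getProjectDocs(projectId, projectStateHash))
      .to.be.rejectedWith(RequestFailedError)
      .and.eventually.have.nested.property('response.status', 409)
    // The hash now matches, so the retry resolves with the docs.
    const docs = await DocUpdaterClient.getProjectDocs(projectId, projectStateHash)
    expect(docs).to.deep.equal([{ _id: docId, lines, v: version }])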


@@ -2,16 +2,18 @@ const sinon = require('sinon')
const MockWebApi = require('./helpers/MockWebApi')
const DocUpdaterClient = require('./helpers/DocUpdaterClient')
const DocUpdaterApp = require('./helpers/DocUpdaterApp')
const { expect } = require('chai')
const { RequestFailedError } = require('@overleaf/fetch-utils')
describe('Peeking a document', function () {
before(function (done) {
before(async function () {
this.lines = ['one', 'two', 'three']
this.version = 42
return DocUpdaterApp.ensureRunning(done)
await DocUpdaterApp.ensureRunning()
})
describe('when the document is not loaded', function () {
before(function (done) {
before(async function () {
this.project_id = DocUpdaterClient.randomId()
this.doc_id = DocUpdaterClient.randomId()
sinon.spy(MockWebApi, 'getDocument')
@@ -20,34 +22,22 @@ describe('Peeking a document', function () {
lines: this.lines,
version: this.version,
})
return DocUpdaterClient.peekDoc(
this.project_id,
this.doc_id,
(error, res, returnedDoc) => {
this.error = error
this.res = res
this.returnedDoc = returnedDoc
return done()
}
)
})
after(function () {
return MockWebApi.getDocument.restore()
MockWebApi.getDocument.restore()
})
it('should return a 404 response', function () {
this.res.statusCode.should.equal(404)
})
it('should not load the document from the web API', function () {
return MockWebApi.getDocument.called.should.equal(false)
it('should not load the document from the web API and should return a 404 response', async function () {
await expect(DocUpdaterClient.peekDoc(this.project_id, this.doc_id))
.to.be.rejectedWith(RequestFailedError)
.and.eventually.have.nested.property('response.status', 404)
MockWebApi.getDocument.called.should.equal(false)
})
})
describe('when the document is already loaded', function () {
before(function (done) {
before(async function () {
this.project_id = DocUpdaterClient.randomId()
this.doc_id = DocUpdaterClient.randomId()
@@ -55,46 +45,28 @@ describe('Peeking a document', function () {
lines: this.lines,
version: this.version,
})
return DocUpdaterClient.preloadDoc(
await DocUpdaterClient.preloadDoc(this.project_id, this.doc_id)
sinon.spy(MockWebApi, 'getDocument')
this.returnedDoc = await DocUpdaterClient.peekDoc(
this.project_id,
this.doc_id,
error => {
if (error != null) {
throw error
}
sinon.spy(MockWebApi, 'getDocument')
return DocUpdaterClient.getDoc(
this.project_id,
this.doc_id,
(error, res, returnedDoc) => {
if (error) return done(error)
this.res = res
this.returnedDoc = returnedDoc
return done()
}
)
}
this.doc_id
)
})
after(function () {
return MockWebApi.getDocument.restore()
})
it('should return a 200 response', function () {
this.res.statusCode.should.equal(200)
MockWebApi.getDocument.restore()
})
it('should return the document lines', function () {
return this.returnedDoc.lines.should.deep.equal(this.lines)
this.returnedDoc.lines.should.deep.equal(this.lines)
})
it('should return the document version', function () {
return this.returnedDoc.version.should.equal(this.version)
this.returnedDoc.version.should.equal(this.version)
})
it('should not load the document from the web API', function () {
return MockWebApi.getDocument.called.should.equal(false)
MockWebApi.getDocument.called.should.equal(false)
})
})
})
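
One ordering detail the flattened hooks make easy to see: the `sinon.spy` on MockWebApi is installed only after `preloadDoc` resolves, so the preload's own web-api fetch is not counted when the test later asserts `getDocument` was never called. A sketch of the shape:

    before(async function () {
      await DocUpdaterClient.preloadDoc(this.project_id, this.doc_id)
      // Spy from this point on: the preload above already hit the web api,
      // and that call must not show up in the assertions.
      sinon.spy(MockWebApi, 'getDocument')
      this.returnedDoc = await DocUpdaterClient.peekDoc(this.project_id, this.doc_id)
    })

    after(function () {
      // Always restore, or the spy leaks into unrelated tests.
      MockWebApi.getDocument.restore()
    })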

(file diff suppressed because it is too large)


@@ -8,12 +8,12 @@ const DocUpdaterApp = require('./helpers/DocUpdaterApp')
const sandbox = sinon.createSandbox()
describe('Rejecting Changes', function () {
before(function (done) {
DocUpdaterApp.ensureRunning(done)
before(async function () {
await DocUpdaterApp.ensureRunning()
})
describe('rejecting a single change', function () {
beforeEach(function (done) {
beforeEach(async function () {
this.project_id = DocUpdaterClient.randomId()
this.user_id = DocUpdaterClient.randomId()
this.doc = {
@@ -38,11 +38,10 @@ describe('Rejecting Changes', function () {
},
}
DocUpdaterClient.sendUpdate(
await DocUpdaterClient.sendUpdate(
this.project_id,
this.doc.id,
this.update,
done
this.update
)
})
@@ -50,93 +49,54 @@ describe('Rejecting Changes', function () {
sandbox.restore()
})
it('should reject the change and restore the original text', function (done) {
DocUpdaterClient.getDoc(
it('should reject the change and restore the original text', async function () {
const doc1 = await DocUpdaterClient.getDoc(this.project_id, this.doc.id)
expect(doc1.ranges.changes).to.have.length(1)
const change = doc1.ranges.changes[0]
expect(change.op).to.deep.equal({ i: 'quick ', p: 4 })
expect(change.id).to.equal(this.id_seed + '000001')
expect(doc1.lines).to.deep.equal([
'the quick brown fox jumps over the lazy dog',
])
const { rejectedChangeIds } = await DocUpdaterClient.rejectChanges(
this.project_id,
this.doc.id,
(error, res, data) => {
if (error != null) {
throw error
}
expect(data.ranges.changes).to.have.length(1)
const change = data.ranges.changes[0]
expect(change.op).to.deep.equal({ i: 'quick ', p: 4 })
expect(change.id).to.equal(this.id_seed + '000001')
expect(data.lines).to.deep.equal([
'the quick brown fox jumps over the lazy dog',
])
DocUpdaterClient.rejectChanges(
this.project_id,
this.doc.id,
[change.id],
this.user_id,
(error, res, body) => {
if (error != null) {
throw error
}
expect(res.statusCode).to.equal(200)
expect(body.rejectedChangeIds).to.be.an('array')
expect(body.rejectedChangeIds).to.include(change.id)
DocUpdaterClient.getDoc(
this.project_id,
this.doc.id,
(error, res, data) => {
if (error != null) {
throw error
}
expect(data.ranges.changes || []).to.have.length(0)
expect(data.lines).to.deep.equal([
'the brown fox jumps over the lazy dog',
])
done()
}
)
}
)
}
[change.id],
this.user_id
)
expect(rejectedChangeIds).to.be.an('array')
expect(rejectedChangeIds).to.include(change.id)
const doc2 = await DocUpdaterClient.getDoc(this.project_id, this.doc.id)
expect(doc2.ranges.changes || []).to.have.length(0)
expect(doc2.lines).to.deep.equal([
'the brown fox jumps over the lazy dog',
])
})
it('should return 200 status code with rejectedChangeIds on successful rejection', function (done) {
DocUpdaterClient.getDoc(
it('should return 200 status code with rejectedChangeIds on successful rejection', async function () {
const data = await DocUpdaterClient.getDoc(this.project_id, this.doc.id)
const changeId = data.ranges.changes[0].id
const { rejectedChangeIds } = await DocUpdaterClient.rejectChanges(
this.project_id,
this.doc.id,
(error, res, data) => {
if (error != null) {
throw error
}
const changeId = data.ranges.changes[0].id
DocUpdaterClient.rejectChanges(
this.project_id,
this.doc.id,
[changeId],
this.user_id,
(error, res, body) => {
if (error != null) {
throw error
}
expect(res.statusCode).to.equal(200)
expect(body.rejectedChangeIds).to.be.an('array')
expect(body.rejectedChangeIds).to.include(changeId)
done()
}
)
}
[changeId],
this.user_id
)
expect(rejectedChangeIds).to.be.an('array')
expect(rejectedChangeIds).to.include(changeId)
})
})
describe('rejecting multiple changes', function () {
beforeEach(function (done) {
beforeEach(async function () {
this.project_id = DocUpdaterClient.randomId()
this.user_id = DocUpdaterClient.randomId()
this.doc = {
@@ -174,11 +134,10 @@ describe('Rejecting Changes', function () {
},
]
DocUpdaterClient.sendUpdates(
await DocUpdaterClient.sendUpdates(
this.project_id,
this.doc.id,
this.updates,
done
this.updates
)
})
@@ -186,62 +145,36 @@ describe('Rejecting Changes', function () {
sandbox.restore()
})
it('should reject multiple changes in order', function (done) {
DocUpdaterClient.getDoc(
it('should reject multiple changes in order', async function () {
const data = await DocUpdaterClient.getDoc(this.project_id, this.doc.id)
expect(data.ranges.changes).to.have.length(2)
expect(data.lines).to.deep.equal([
'the quick brown fox jumps over the dog',
])
const changeIds = data.ranges.changes.map(change => change.id)
const { rejectedChangeIds } = await DocUpdaterClient.rejectChanges(
this.project_id,
this.doc.id,
(error, res, data) => {
if (error != null) {
throw error
}
expect(data.ranges.changes).to.have.length(2)
expect(data.lines).to.deep.equal([
'the quick brown fox jumps over the dog',
])
const changeIds = data.ranges.changes.map(change => change.id)
DocUpdaterClient.rejectChanges(
this.project_id,
this.doc.id,
changeIds,
this.user_id,
(error, res, body) => {
if (error != null) {
throw error
}
expect(res.statusCode).to.equal(200)
expect(body.rejectedChangeIds).to.be.an('array')
expect(body.rejectedChangeIds).to.have.length(2)
expect(body.rejectedChangeIds).to.include.members(changeIds)
DocUpdaterClient.getDoc(
this.project_id,
this.doc.id,
(error, res, data) => {
if (error != null) {
throw error
}
expect(data.ranges.changes || []).to.have.length(0)
expect(data.lines).to.deep.equal([
'the brown fox jumps over the lazy dog',
])
done()
}
)
}
)
}
changeIds,
this.user_id
)
expect(rejectedChangeIds).to.be.an('array')
expect(rejectedChangeIds).to.have.length(2)
expect(rejectedChangeIds).to.include.members(changeIds)
const data2 = await DocUpdaterClient.getDoc(this.project_id, this.doc.id)
expect(data2.ranges.changes || []).to.have.length(0)
expect(data2.lines).to.deep.equal([
'the brown fox jumps over the lazy dog',
])
})
})
describe('error cases', function () {
beforeEach(function (done) {
beforeEach(async function () {
this.project_id = DocUpdaterClient.randomId()
this.user_id = DocUpdaterClient.randomId()
this.doc = {
@@ -255,46 +188,32 @@ describe('Rejecting Changes', function () {
historyRangesSupport: true,
})
DocUpdaterApp.ensureRunning(done)
await DocUpdaterApp.ensureRunning()
})
it('should handle rejection of non-existent changes gracefully', function (done) {
it('should handle rejection of non-existent changes gracefully', async function () {
const nonExistentChangeId = 'nonexistent_change_id'
DocUpdaterClient.rejectChanges(
const { rejectedChangeIds } = await DocUpdaterClient.rejectChanges(
this.project_id,
this.doc.id,
[nonExistentChangeId],
this.user_id,
(error, res, body) => {
// Should still return 200 with empty rejectedChangeIds if no changes were found to reject
if (error != null) {
throw error
}
expect(res.statusCode).to.equal(200)
expect(body.rejectedChangeIds).to.be.an('array')
expect(body.rejectedChangeIds).to.have.length(0)
done()
}
this.user_id
)
// Should still return 200 with empty rejectedChangeIds if no changes were found to reject
expect(rejectedChangeIds).to.be.an('array')
expect(rejectedChangeIds).to.have.length(0)
})
it('should handle empty change_ids array', function (done) {
DocUpdaterClient.rejectChanges(
it('should handle empty change_ids array', async function () {
const { rejectedChangeIds } = await DocUpdaterClient.rejectChanges(
this.project_id,
this.doc.id,
[],
this.user_id,
(error, res, body) => {
if (error != null) {
throw error
}
expect(res.statusCode).to.equal(200)
expect(body.rejectedChangeIds).to.be.an('array')
expect(body.rejectedChangeIds).to.have.length(0)
done()
}
this.user_id
)
expect(rejectedChangeIds).to.be.an('array')
expect(rejectedChangeIds).to.have.length(0)
})
})
})
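
Since the promisified client resolves with the parsed response body (and a non-200 rejects instead), the old `res.statusCode` checks disappear and the body can be destructured straight off the call, as the rejection tests above do:

    const { rejectedChangeIds } = await DocUpdaterClient.rejectChanges(
      this.project_id,
      this.doc.id,
      changeIds,
      this.user_id
    )
    // A resolved call already implies a 2xx status; assert only on the payload.
    expect(rejectedChangeIds).to.be.an('array')
    expect(rejectedChangeIds).to.include.members(changeIds)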


@@ -1,5 +1,6 @@
const sinon = require('sinon')
const { expect } = require('chai')
const { setTimeout } = require('node:timers/promises')
const Settings = require('@overleaf/settings')
const docUpdaterRedis = require('@overleaf/redis-wrapper').createClient(
Settings.redis.documentupdater
@@ -10,10 +11,11 @@ const MockProjectHistoryApi = require('./helpers/MockProjectHistoryApi')
const MockWebApi = require('./helpers/MockWebApi')
const DocUpdaterClient = require('./helpers/DocUpdaterClient')
const DocUpdaterApp = require('./helpers/DocUpdaterApp')
const { RequestFailedError } = require('@overleaf/fetch-utils')
describe('Setting a document', function () {
let numberOfReceivedUpdates = 0
before(function (done) {
before(async function () {
DocUpdaterClient.subscribeToAppliedOps(() => {
numberOfReceivedUpdates++
})
@@ -36,7 +38,7 @@ describe('Setting a document', function () {
sinon.spy(MockProjectHistoryApi, 'flushProject')
sinon.spy(MockWebApi, 'setDocument')
DocUpdaterApp.ensureRunning(done)
await DocUpdaterApp.ensureRunning()
})
after(function () {
@@ -45,7 +47,7 @@ describe('Setting a document', function () {
})
describe('when the updated doc exists in the doc updater', function () {
before(function (done) {
before(async function () {
numberOfReceivedUpdates = 0
this.project_id = DocUpdaterClient.randomId()
this.doc_id = DocUpdaterClient.randomId()
@@ -53,39 +55,21 @@ describe('Setting a document', function () {
lines: this.lines,
version: this.version,
})
DocUpdaterClient.preloadDoc(this.project_id, this.doc_id, error => {
if (error) {
throw error
}
DocUpdaterClient.sendUpdate(
this.project_id,
this.doc_id,
this.update,
error => {
if (error) {
throw error
}
setTimeout(() => {
DocUpdaterClient.setDocLines(
this.project_id,
this.doc_id,
this.newLines,
this.source,
this.user_id,
false,
(error, res, body) => {
if (error) {
return done(error)
}
this.statusCode = res.statusCode
this.body = body
done()
}
)
}, 200)
}
)
})
await DocUpdaterClient.preloadDoc(this.project_id, this.doc_id)
await DocUpdaterClient.sendUpdate(
this.project_id,
this.doc_id,
this.update
)
await setTimeout(200)
this.body = await DocUpdaterClient.setDocLines(
this.project_id,
this.doc_id,
this.newLines,
this.source,
this.user_id,
false
)
})
after(function () {
@@ -93,10 +77,6 @@ describe('Setting a document', function () {
MockWebApi.setDocument.resetHistory()
})
it('should return a 200 status code', function () {
this.statusCode.should.equal(200)
})
it('should emit two updates (from sendUpdate and setDocLines)', function () {
expect(numberOfReceivedUpdates).to.equal(2)
})
@@ -107,32 +87,14 @@ describe('Setting a document', function () {
.should.equal(true)
})
it('should update the lines in the doc updater', function (done) {
DocUpdaterClient.getDoc(
this.project_id,
this.doc_id,
(error, res, doc) => {
if (error) {
return done(error)
}
doc.lines.should.deep.equal(this.newLines)
done()
}
)
it('should update the lines in the doc updater', async function () {
const doc = await DocUpdaterClient.getDoc(this.project_id, this.doc_id)
doc.lines.should.deep.equal(this.newLines)
})
it('should bump the version in the doc updater', function (done) {
DocUpdaterClient.getDoc(
this.project_id,
this.doc_id,
(error, res, doc) => {
if (error) {
return done(error)
}
doc.version.should.equal(this.version + 2)
done()
}
)
it('should bump the version in the doc updater', async function () {
const doc = await DocUpdaterClient.getDoc(this.project_id, this.doc_id)
doc.version.should.equal(this.version + 2)
})
it('should leave the document in redis', function (done) {
@@ -153,51 +115,33 @@ describe('Setting a document', function () {
})
describe('when doc has the same contents', function () {
beforeEach(function (done) {
beforeEach(async function () {
numberOfReceivedUpdates = 0
DocUpdaterClient.setDocLines(
await DocUpdaterClient.setDocLines(
this.project_id,
this.doc_id,
this.newLines,
this.source,
this.user_id,
false,
(error, res, body) => {
if (error) {
return done(error)
}
this.statusCode = res.statusCode
this.body = body
done()
}
false
)
})
it('should not bump the version in doc updater', function (done) {
DocUpdaterClient.getDoc(
this.project_id,
this.doc_id,
(error, res, doc) => {
if (error) {
return done(error)
}
doc.version.should.equal(this.version + 2)
done()
}
)
it('should not bump the version in doc updater', async function () {
const doc = await DocUpdaterClient.getDoc(this.project_id, this.doc_id)
doc.version.should.equal(this.version + 2)
})
it('should not emit any updates', function (done) {
setTimeout(() => {
expect(numberOfReceivedUpdates).to.equal(0)
done()
}, 100) // delay by 100ms: make sure we do not check too early!
it('should not emit any updates', async function () {
// delay by 100ms: make sure we do not check too early!
await setTimeout(100)
expect(numberOfReceivedUpdates).to.equal(0)
})
})
})
describe('when the updated doc exists in the doc updater (history-ot)', function () {
before(function (done) {
before(async function () {
numberOfReceivedUpdates = 0
this.project_id = DocUpdaterClient.randomId()
this.doc_id = DocUpdaterClient.randomId()
@@ -212,39 +156,21 @@ describe('Setting a document', function () {
version: this.version,
otMigrationStage: 1,
})
DocUpdaterClient.preloadDoc(this.project_id, this.doc_id, error => {
if (error) {
throw error
}
DocUpdaterClient.sendUpdate(
this.project_id,
this.doc_id,
this.historyOTUpdate,
error => {
if (error) {
throw error
}
setTimeout(() => {
DocUpdaterClient.setDocLines(
this.project_id,
this.doc_id,
this.newLines,
this.source,
this.user_id,
false,
(error, res, body) => {
if (error) {
return done(error)
}
this.statusCode = res.statusCode
this.body = body
done()
}
)
}, 200)
}
)
})
await DocUpdaterClient.preloadDoc(this.project_id, this.doc_id)
await DocUpdaterClient.sendUpdate(
this.project_id,
this.doc_id,
this.historyOTUpdate
)
await setTimeout(200)
this.body = await DocUpdaterClient.setDocLines(
this.project_id,
this.doc_id,
this.newLines,
this.source,
this.user_id,
false
)
})
after(function () {
@@ -252,10 +178,6 @@ describe('Setting a document', function () {
MockWebApi.setDocument.resetHistory()
})
it('should return a 200 status code', function () {
this.statusCode.should.equal(200)
})
it('should emit two updates (from sendUpdate and setDocLines)', function () {
expect(numberOfReceivedUpdates).to.equal(2)
})
@@ -266,32 +188,14 @@ describe('Setting a document', function () {
.should.equal(true)
})
it('should update the lines in the doc updater', function (done) {
DocUpdaterClient.getDoc(
this.project_id,
this.doc_id,
(error, res, doc) => {
if (error) {
return done(error)
}
doc.lines.should.deep.equal(this.newLines)
done()
}
)
it('should update the lines in the doc updater', async function () {
const doc = await DocUpdaterClient.getDoc(this.project_id, this.doc_id)
doc.lines.should.deep.equal(this.newLines)
})
it('should bump the version in the doc updater', function (done) {
DocUpdaterClient.getDoc(
this.project_id,
this.doc_id,
(error, res, doc) => {
if (error) {
return done(error)
}
doc.version.should.equal(this.version + 2)
done()
}
)
it('should bump the version in the doc updater', async function () {
const doc = await DocUpdaterClient.getDoc(this.project_id, this.doc_id)
doc.version.should.equal(this.version + 2)
})
it('should leave the document in redis', function (done) {
@@ -314,51 +218,33 @@ describe('Setting a document', function () {
})
describe('when doc has the same contents', function () {
beforeEach(function (done) {
beforeEach(async function () {
numberOfReceivedUpdates = 0
DocUpdaterClient.setDocLines(
this.body = await DocUpdaterClient.setDocLines(
this.project_id,
this.doc_id,
this.newLines,
this.source,
this.user_id,
false,
(error, res, body) => {
if (error) {
return done(error)
}
this.statusCode = res.statusCode
this.body = body
done()
}
false
)
})
it('should not bump the version in doc updater', function (done) {
DocUpdaterClient.getDoc(
this.project_id,
this.doc_id,
(error, res, doc) => {
if (error) {
return done(error)
}
doc.version.should.equal(this.version + 2)
done()
}
)
it('should not bump the version in doc updater', async function () {
const doc = await DocUpdaterClient.getDoc(this.project_id, this.doc_id)
doc.version.should.equal(this.version + 2)
})
it('should not emit any updates', function (done) {
setTimeout(() => {
expect(numberOfReceivedUpdates).to.equal(0)
done()
}, 100) // delay by 100ms: make sure we do not check too early!
it('should not emit any updates', async function () {
// delay by 100ms: make sure we do not check too early!
await setTimeout(100)
expect(numberOfReceivedUpdates).to.equal(0)
})
})
})
describe('when the updated doc does not exist in the doc updater', function () {
before(function (done) {
before(async function () {
this.project_id = DocUpdaterClient.randomId()
this.doc_id = DocUpdaterClient.randomId()
numberOfReceivedUpdates = 0
@@ -366,22 +252,15 @@ describe('Setting a document', function () {
lines: this.lines,
version: this.version,
})
DocUpdaterClient.setDocLines(
this.body = await DocUpdaterClient.setDocLines(
this.project_id,
this.doc_id,
this.newLines,
this.source,
this.user_id,
false,
(error, res, body) => {
if (error) {
return done(error)
}
this.statusCode = res.statusCode
this.body = body
setTimeout(done, 200)
}
false
)
await setTimeout(200)
})
after(function () {
@@ -389,10 +268,6 @@ describe('Setting a document', function () {
MockWebApi.setDocument.resetHistory()
})
it('should return a 200 status code', function () {
this.statusCode.should.equal(200)
})
it('should emit an update', function () {
expect(numberOfReceivedUpdates).to.equal(1)
})
@@ -442,7 +317,7 @@ describe('Setting a document', function () {
DOC_TOO_LARGE_TEST_CASES.forEach(testCase => {
describe(testCase.desc, function () {
before(function (done) {
before(async function () {
this.project_id = DocUpdaterClient.randomId()
this.doc_id = DocUpdaterClient.randomId()
MockWebApi.insertDoc(this.project_id, this.doc_id, {
@@ -453,21 +328,24 @@ describe('Setting a document', function () {
while (JSON.stringify(this.newLines).length <= testCase.size) {
this.newLines.push('(a long line of text)'.repeat(10000))
}
DocUpdaterClient.setDocLines(
this.project_id,
this.doc_id,
this.newLines,
this.source,
this.user_id,
false,
(error, res, body) => {
if (error) {
return done(error)
}
this.statusCode = res.statusCode
setTimeout(done, 200)
try {
await DocUpdaterClient.setDocLines(
this.project_id,
this.doc_id,
this.newLines,
this.source,
this.user_id,
false
)
this.statusCode = 200
} catch (err) {
if (err instanceof RequestFailedError) {
this.statusCode = err.response.status
} else {
throw err
}
)
}
await setTimeout(200)
})
after(function () {
@@ -490,7 +368,7 @@ describe('Setting a document', function () {
})
describe('when the updated doc is large but under the bodyParser and HTTPController size limit', function () {
before(function (done) {
before(async function () {
this.project_id = DocUpdaterClient.randomId()
this.doc_id = DocUpdaterClient.randomId()
MockWebApi.insertDoc(this.project_id, this.doc_id, {
@@ -504,22 +382,15 @@ describe('Setting a document', function () {
this.newLines.push('(a long line of text)'.repeat(10000))
}
this.newLines.pop() // remove the line which took it over the limit
DocUpdaterClient.setDocLines(
this.body = await DocUpdaterClient.setDocLines(
this.project_id,
this.doc_id,
this.newLines,
this.source,
this.user_id,
false,
(error, res, body) => {
if (error) {
return done(error)
}
this.statusCode = res.statusCode
this.body = body
setTimeout(done, 200)
}
false
)
await setTimeout(200)
})
after(function () {
@@ -527,10 +398,6 @@ describe('Setting a document', function () {
MockWebApi.setDocument.resetHistory()
})
it('should return a 200 status code', function () {
this.statusCode.should.equal(200)
})
it('should send the updated doc lines to the web api', function () {
MockWebApi.setDocument
.calledWith(this.project_id, this.doc_id, this.newLines)
@@ -563,44 +430,29 @@ describe('Setting a document', function () {
})
describe('with the undo flag', function () {
before(function (done) {
before(async function () {
this.project_id = DocUpdaterClient.randomId()
this.doc_id = DocUpdaterClient.randomId()
MockWebApi.insertDoc(this.project_id, this.doc_id, {
lines: this.lines,
version: this.version,
})
DocUpdaterClient.preloadDoc(this.project_id, this.doc_id, error => {
if (error) {
throw error
}
DocUpdaterClient.sendUpdate(
this.project_id,
this.doc_id,
this.update,
error => {
if (error) {
throw error
}
// Go back to old lines, with undo flag
DocUpdaterClient.setDocLines(
this.project_id,
this.doc_id,
this.lines,
this.source,
this.user_id,
true,
(error, res, body) => {
if (error) {
return done(error)
}
this.statusCode = res.statusCode
setTimeout(done, 200)
}
)
}
)
})
await DocUpdaterClient.preloadDoc(this.project_id, this.doc_id)
await DocUpdaterClient.sendUpdate(
this.project_id,
this.doc_id,
this.update
)
// Go back to old lines, with undo flag
await DocUpdaterClient.setDocLines(
this.project_id,
this.doc_id,
this.lines,
this.source,
this.user_id,
true
)
await setTimeout(200)
})
after(function () {
@@ -608,61 +460,36 @@ describe('Setting a document', function () {
MockWebApi.setDocument.resetHistory()
})
it('should undo the tracked changes', function (done) {
DocUpdaterClient.getDoc(
this.project_id,
this.doc_id,
(error, res, data) => {
if (error) {
throw error
}
const { ranges } = data
expect(ranges.changes).to.be.undefined
done()
}
)
it('should undo the tracked changes', async function () {
const doc = await DocUpdaterClient.getDoc(this.project_id, this.doc_id)
expect(doc.ranges.changes).to.be.undefined
})
})
describe('without the undo flag', function () {
before(function (done) {
before(async function () {
this.project_id = DocUpdaterClient.randomId()
this.doc_id = DocUpdaterClient.randomId()
MockWebApi.insertDoc(this.project_id, this.doc_id, {
lines: this.lines,
version: this.version,
})
DocUpdaterClient.preloadDoc(this.project_id, this.doc_id, error => {
if (error) {
throw error
}
DocUpdaterClient.sendUpdate(
this.project_id,
this.doc_id,
this.update,
error => {
if (error) {
throw error
}
// Go back to old lines, without undo flag
DocUpdaterClient.setDocLines(
this.project_id,
this.doc_id,
this.lines,
this.source,
this.user_id,
false,
(error, res, body) => {
if (error) {
return done(error)
}
this.statusCode = res.statusCode
setTimeout(done, 200)
}
)
}
)
})
await DocUpdaterClient.preloadDoc(this.project_id, this.doc_id)
await DocUpdaterClient.sendUpdate(
this.project_id,
this.doc_id,
this.update
)
// Go back to old lines, without undo flag
await DocUpdaterClient.setDocLines(
this.project_id,
this.doc_id,
this.lines,
this.source,
this.user_id,
false
)
await setTimeout(200)
})
after(function () {
@@ -670,19 +497,9 @@ describe('Setting a document', function () {
MockWebApi.setDocument.resetHistory()
})
it('should not undo the tracked changes', function (done) {
DocUpdaterClient.getDoc(
this.project_id,
this.doc_id,
(error, res, data) => {
if (error) {
throw error
}
const { ranges } = data
expect(ranges.changes.length).to.equal(1)
done()
}
)
it('should not undo the tracked changes', async function () {
const doc = await DocUpdaterClient.getDoc(this.project_id, this.doc_id)
expect(doc.ranges.changes.length).to.equal(1)
})
})
})
@@ -691,7 +508,8 @@ describe('Setting a document', function () {
const lines = ['one', 'one and a half', 'two', 'three']
const userId = DocUpdaterClient.randomId()
const ts = new Date().toISOString()
beforeEach(function (done) {
beforeEach(async function () {
numberOfReceivedUpdates = 0
this.newLines = ['one', 'two', 'three']
this.project_id = DocUpdaterClient.randomId()
@@ -722,32 +540,20 @@ describe('Setting a document', function () {
version: this.version,
otMigrationStage: 1,
})
DocUpdaterClient.preloadDoc(this.project_id, this.doc_id, error => {
if (error) {
throw error
}
DocUpdaterClient.sendUpdate(
this.project_id,
this.doc_id,
this.historyOTUpdate,
error => {
if (error) {
throw error
}
DocUpdaterClient.waitForPendingUpdates(
this.project_id,
this.doc_id,
done
)
}
)
})
await DocUpdaterClient.preloadDoc(this.project_id, this.doc_id)
await DocUpdaterClient.sendUpdate(
this.project_id,
this.doc_id,
this.historyOTUpdate
)
await DocUpdaterClient.waitForPendingUpdates(this.doc_id)
})
afterEach(function () {
MockProjectHistoryApi.flushProject.resetHistory()
MockWebApi.setDocument.resetHistory()
})
it('should record tracked changes', function (done) {
docUpdaterRedis.get(
Keys.docLines({ doc_id: this.doc_id }),
@@ -776,19 +582,11 @@ describe('Setting a document', function () {
)
})
it('should apply the change', function (done) {
DocUpdaterClient.getDoc(
this.project_id,
this.doc_id,
(error, res, data) => {
if (error) {
throw error
}
expect(data.lines).to.deep.equal(this.newLines)
done()
}
)
it('should apply the change', async function () {
const doc = await DocUpdaterClient.getDoc(this.project_id, this.doc_id)
expect(doc.lines).to.deep.equal(this.newLines)
})
const cases = [
{
name: 'when resetting the content',
@@ -934,22 +732,14 @@ describe('Setting a document', function () {
for (const { name, lines, want } of cases) {
describe(name, function () {
beforeEach(function (done) {
DocUpdaterClient.setDocLines(
beforeEach(async function () {
this.body = await DocUpdaterClient.setDocLines(
this.project_id,
this.doc_id,
lines,
this.source,
userId,
false,
(error, res, body) => {
if (error) {
return done(error)
}
this.statusCode = res.statusCode
this.body = body
done()
}
false
)
})
it('should update accordingly', function (done) {

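Where a test still needs the status code of an expected failure (the doc-too-large cases above), the rejection is caught and unpacked by hand, and anything that is not a `RequestFailedError` is rethrown so real bugs still fail the hook:

    try {
      await DocUpdaterClient.setDocLines(
        this.project_id,
        this.doc_id,
        this.newLines,
        this.source,
        this.user_id,
        false
      )
      this.statusCode = 200
    } catch (err) {
      if (err instanceof RequestFailedError) {
        this.statusCode = err.response.status // the failed request's HTTP status
      } else {
        throw err // unexpected error type: surface it
      }
    }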

@@ -1,13 +1,15 @@
const { expect } = require('chai')
const { setTimeout } = require('node:timers/promises')
const Settings = require('@overleaf/settings')
const MockWebApi = require('./helpers/MockWebApi')
const DocUpdaterClient = require('./helpers/DocUpdaterClient')
const DocUpdaterApp = require('./helpers/DocUpdaterApp')
const { RequestFailedError } = require('@overleaf/fetch-utils')
describe('SizeChecks', function () {
before(function (done) {
DocUpdaterApp.ensureRunning(done)
before(async function () {
await DocUpdaterApp.ensureRunning()
})
beforeEach(function () {
this.version = 0
@@ -34,40 +36,27 @@ describe('SizeChecks', function () {
})
})
it('should error when fetching the doc', function (done) {
DocUpdaterClient.getDoc(this.project_id, this.doc_id, (error, res) => {
if (error) return done(error)
expect(res.statusCode).to.equal(500)
done()
})
it('should error when fetching the doc', async function () {
await expect(DocUpdaterClient.getDoc(this.project_id, this.doc_id))
.to.be.rejectedWith(RequestFailedError)
.and.eventually.have.nested.property('response.status', 500)
})
describe('when trying to update', function () {
beforeEach(function (done) {
beforeEach(async function () {
const update = {
doc: this.doc_id,
op: this.update.op,
v: this.version,
}
DocUpdaterClient.sendUpdate(
this.project_id,
this.doc_id,
update,
error => {
if (error != null) {
throw error
}
setTimeout(done, 200)
}
)
await DocUpdaterClient.sendUpdate(this.project_id, this.doc_id, update)
await setTimeout(200)
})
it('should still error when fetching the doc', function (done) {
DocUpdaterClient.getDoc(this.project_id, this.doc_id, (error, res) => {
if (error) return done(error)
expect(res.statusCode).to.equal(500)
done()
})
it('should still error when fetching the doc', async function () {
await expect(DocUpdaterClient.getDoc(this.project_id, this.doc_id))
.to.be.rejectedWith(RequestFailedError)
.and.eventually.have.nested.property('response.status', 500)
})
})
})
@@ -91,48 +80,25 @@ describe('SizeChecks', function () {
})
})
it('should be able to fetch the doc', function (done) {
DocUpdaterClient.getDoc(
this.project_id,
this.doc_id,
(error, res, doc) => {
if (error) return done(error)
expect(doc.lines).to.deep.equal(this.lines)
done()
}
)
it('should be able to fetch the doc', async function () {
const doc = await DocUpdaterClient.getDoc(this.project_id, this.doc_id)
expect(doc.lines).to.deep.equal(this.lines)
})
describe('when trying to update', function () {
beforeEach(function (done) {
beforeEach(async function () {
const update = {
doc: this.doc_id,
op: this.update.op,
v: this.version,
}
DocUpdaterClient.sendUpdate(
this.project_id,
this.doc_id,
update,
error => {
if (error != null) {
throw error
}
setTimeout(done, 200)
}
)
await DocUpdaterClient.sendUpdate(this.project_id, this.doc_id, update)
await setTimeout(200)
})
it('should not update the doc', function (done) {
DocUpdaterClient.getDoc(
this.project_id,
this.doc_id,
(error, res, doc) => {
if (error) return done(error)
expect(doc.lines).to.deep.equal(this.lines)
done()
}
)
it('should not update the doc', async function () {
const doc = await DocUpdaterClient.getDoc(this.project_id, this.doc_id)
expect(doc.lines).to.deep.equal(this.lines)
})
})
})
@@ -146,48 +112,25 @@ describe('SizeChecks', function () {
})
})
it('should be able to fetch the doc', function (done) {
DocUpdaterClient.getDoc(
this.project_id,
this.doc_id,
(error, res, doc) => {
if (error) return done(error)
expect(doc.lines).to.deep.equal(this.lines)
done()
}
)
it('should be able to fetch the doc', async function () {
const doc = await DocUpdaterClient.getDoc(this.project_id, this.doc_id)
expect(doc.lines).to.deep.equal(this.lines)
})
describe('when trying to update', function () {
beforeEach(function (done) {
beforeEach(async function () {
const update = {
doc: this.doc_id,
op: this.update.op,
v: this.version,
}
DocUpdaterClient.sendUpdate(
this.project_id,
this.doc_id,
update,
error => {
if (error != null) {
throw error
}
setTimeout(done, 200)
}
)
await DocUpdaterClient.sendUpdate(this.project_id, this.doc_id, update)
await setTimeout(200)
})
it('should not update the doc', function (done) {
DocUpdaterClient.getDoc(
this.project_id,
this.doc_id,
(error, res, doc) => {
if (error) return done(error)
expect(doc.lines).to.deep.equal(this.lines)
done()
}
)
it('should not update the doc', async function () {
const doc = await DocUpdaterClient.getDoc(this.project_id, this.doc_id)
expect(doc.lines).to.deep.equal(this.lines)
})
})
})
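
The fixed 200ms settle (`await setTimeout(200)` from node:timers/promises) replaces the old `setTimeout(done, 200)` and keeps the hook linear, but it is still a sleep-based race. Where the pending-update queue length is observable, polling is sturdier, which is what the client's `waitForPendingUpdates` below does. A hypothetical generalization of that polling loop, for illustration only:

    const { setTimeout } = require('node:timers/promises')

    async function waitUntil(condition, { retries = 30, intervalMs = 100 } = {}) {
      // Poll a boolean condition instead of sleeping a fixed amount.
      for (let attempt = 0; attempt < retries; attempt++) {
        if (await condition()) return
        await setTimeout(intervalMs)
      }
      throw new Error('condition still false after maximum retries')
    }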


@@ -1,42 +1,26 @@
// TODO: This file was created by bulk-decaffeinate.
// Fix any style issues and re-enable lint.
/*
* decaffeinate suggestions:
* DS101: Remove unnecessary use of Array.from
* DS102: Remove unnecessary code created because of implicit returns
* DS205: Consider reworking code to avoid use of IIFEs
* DS207: Consider shorter variations of null checks
* Full docs: https://github.com/decaffeinate/decaffeinate/blob/master/docs/suggestions.md
*/
const app = require('../../../../app')
module.exports = {
running: false,
initing: false,
callbacks: [],
ensureRunning(callback) {
if (callback == null) {
callback = function () {}
}
if (this.running) {
return callback()
} else if (this.initing) {
return this.callbacks.push(callback)
}
this.initing = true
this.callbacks.push(callback)
function startApp() {
return new Promise((resolve, reject) => {
app.listen(3003, '127.0.0.1', error => {
if (error != null) {
throw error
if (error) {
reject(error)
} else {
resolve()
}
this.running = true
return (() => {
const result = []
for (callback of Array.from(this.callbacks)) {
result.push(callback())
}
return result
})()
})
},
})
}
let appStartedPromise
async function ensureRunning() {
if (!appStartedPromise) {
appStartedPromise = startApp()
}
await appStartedPromise
}
module.exports = {
ensureRunning,
}
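
The rewritten helper swaps the running/initing/callbacks bookkeeping for one memoized promise: the first caller starts `app.listen`, every later or concurrent caller awaits the same promise, and two servers can no longer race onto port 3003. Calls are idempotent:

    const DocUpdaterApp = require('./helpers/DocUpdaterApp')

    before(async function () {
      // Safe to call from every test file; only the first call boots the app.
      await DocUpdaterApp.ensureRunning()
    })

One property worth noting: a failed startup leaves the rejected promise cached, so later calls reject too instead of retrying; that is reasonable for tests, where a broken boot should fail the whole run.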


@@ -5,8 +5,8 @@ const rclient = require('@overleaf/redis-wrapper').createClient(
Settings.redis.documentupdater
)
const keys = Settings.redis.documentupdater.key_schema
const request = require('request').defaults({ jar: false })
const async = require('async')
const { fetchJson, fetchNothing } = require('@overleaf/fetch-utils')
const { setTimeout } = require('node:timers/promises')
const rclientSub = require('@overleaf/redis-wrapper').createClient(
Settings.redis.pubsub
@@ -14,6 +14,15 @@ const rclientSub = require('@overleaf/redis-wrapper').createClient(
rclientSub.subscribe('applied-ops')
rclientSub.setMaxListeners(0)
function getPendingUpdateListKey() {
const shard = _.random(0, Settings.dispatcherCount - 1)
if (shard === 0) {
return 'pending-updates-list'
} else {
return `pending-updates-list-${shard}`
}
}
module.exports = DocUpdaterClient = {
randomId() {
let str = ''
@@ -23,234 +32,177 @@ module.exports = DocUpdaterClient = {
return str
},
subscribeToAppliedOps(callback) {
rclientSub.on('message', callback)
subscribeToAppliedOps(messageHandler) {
rclientSub.on('message', messageHandler)
},
_getPendingUpdateListKey() {
const shard = _.random(0, Settings.dispatcherCount - 1)
if (shard === 0) {
return 'pending-updates-list'
} else {
return `pending-updates-list-${shard}`
}
},
sendUpdate(projectId, docId, update, callback) {
rclient.rpush(
async sendUpdate(projectId, docId, update) {
const docKey = `${projectId}:${docId}`
await rclient.rpush(
keys.pendingUpdates({ doc_id: docId }),
JSON.stringify(update),
error => {
if (error) {
return callback(error)
}
const docKey = `${projectId}:${docId}`
rclient.sadd('DocsWithPendingUpdates', docKey, error => {
if (error) {
return callback(error)
}
JSON.stringify(update)
)
await rclient.sadd('DocsWithPendingUpdates', docKey)
await rclient.rpush(getPendingUpdateListKey(), docKey)
},
rclient.rpush(
DocUpdaterClient._getPendingUpdateListKey(),
docKey,
callback
)
})
async sendUpdates(projectId, docId, updates) {
await DocUpdaterClient.preloadDoc(projectId, docId)
for (const update of updates) {
await DocUpdaterClient.sendUpdate(projectId, docId, update)
}
await DocUpdaterClient.waitForPendingUpdates(docId)
},
async waitForPendingUpdates(docId) {
const maxRetries = 30
const retryInterval = 100
for (let attempt = 0; attempt < maxRetries; attempt++) {
const length = await rclient.llen(keys.pendingUpdates({ doc_id: docId }))
if (length === 0) {
return // Success - no pending updates
}
if (attempt < maxRetries - 1) {
await setTimeout(retryInterval)
}
}
throw new Error('updates still pending after maximum retries')
},
async getDoc(projectId, docId) {
return await fetchJson(
`http://127.0.0.1:3003/project/${projectId}/doc/${docId}`
)
},
sendUpdates(projectId, docId, updates, callback) {
DocUpdaterClient.preloadDoc(projectId, docId, error => {
if (error) {
return callback(error)
}
const jobs = updates.map(update => callback => {
DocUpdaterClient.sendUpdate(projectId, docId, update, callback)
})
async.series(jobs, err => {
if (err) {
return callback(err)
}
DocUpdaterClient.waitForPendingUpdates(projectId, docId, callback)
})
})
},
waitForPendingUpdates(projectId, docId, callback) {
async.retry(
{ times: 30, interval: 100 },
cb =>
rclient.llen(keys.pendingUpdates({ doc_id: docId }), (err, length) => {
if (err) {
return cb(err)
}
if (length > 0) {
cb(new Error('updates still pending'))
} else {
cb()
}
}),
callback
async getDocAndRecentOps(projectId, docId, fromVersion) {
return await fetchJson(
`http://127.0.0.1:3003/project/${projectId}/doc/${docId}?fromVersion=${fromVersion}`
)
},
getDoc(projectId, docId, callback) {
request.get(
`http://127.0.0.1:3003/project/${projectId}/doc/${docId}`,
(error, res, body) => {
if (body != null && res.statusCode >= 200 && res.statusCode < 300) {
body = JSON.parse(body)
}
callback(error, res, body)
}
async getProjectLastUpdatedAt(projectId) {
return await fetchJson(
`http://127.0.0.1:3003/project/${projectId}/last_updated_at`
)
},
getDocAndRecentOps(projectId, docId, fromVersion, callback) {
request.get(
`http://127.0.0.1:3003/project/${projectId}/doc/${docId}?fromVersion=${fromVersion}`,
(error, res, body) => {
if (body != null && res.statusCode >= 200 && res.statusCode < 300) {
body = JSON.parse(body)
}
callback(error, res, body)
}
async preloadDoc(projectId, docId) {
await DocUpdaterClient.getDoc(projectId, docId)
},
async peekDoc(projectId, docId) {
return await fetchJson(
`http://127.0.0.1:3003/project/${projectId}/doc/${docId}/peek`
)
},
getProjectLastUpdatedAt(projectId, callback) {
request.get(
`http://127.0.0.1:3003/project/${projectId}/last_updated_at`,
(error, res, body) => {
if (body != null && res.statusCode >= 200 && res.statusCode < 300) {
body = JSON.parse(body)
}
callback(error, res, body)
}
)
},
preloadDoc(projectId, docId, callback) {
DocUpdaterClient.getDoc(projectId, docId, callback)
},
peekDoc(projectId, docId, callback) {
request.get(
`http://127.0.0.1:3003/project/${projectId}/doc/${docId}/peek`,
(error, res, body) => {
if (body != null && res.statusCode >= 200 && res.statusCode < 300) {
body = JSON.parse(body)
}
callback(error, res, body)
}
)
},
flushDoc(projectId, docId, callback) {
request.post(
async flushDoc(projectId, docId) {
return await fetchNothing(
`http://127.0.0.1:3003/project/${projectId}/doc/${docId}/flush`,
(error, res, body) => callback(error, res, body)
{ method: 'POST' }
)
},
setDocLines(projectId, docId, lines, source, userId, undoing, callback) {
request.post(
async setDocLines(projectId, docId, lines, source, userId, undoing) {
return await fetchJson(
`http://127.0.0.1:3003/project/${projectId}/doc/${docId}`,
{
url: `http://127.0.0.1:3003/project/${projectId}/doc/${docId}`,
method: 'POST',
json: {
lines,
source,
user_id: userId,
undoing,
},
},
(error, res, body) => callback(error, res, body)
)
},
deleteDoc(projectId, docId, callback) {
request.del(
`http://127.0.0.1:3003/project/${projectId}/doc/${docId}`,
(error, res, body) => callback(error, res, body)
)
},
flushProject(projectId, callback) {
request.post(`http://127.0.0.1:3003/project/${projectId}/flush`, callback)
},
deleteProject(projectId, callback) {
request.del(`http://127.0.0.1:3003/project/${projectId}`, callback)
},
deleteProjectOnShutdown(projectId, callback) {
request.del(
`http://127.0.0.1:3003/project/${projectId}?background=true&shutdown=true`,
callback
)
},
flushOldProjects(callback) {
request.get(
'http://127.0.0.1:3003/flush_queued_projects?min_delete_age=1',
callback
)
},
acceptChange(projectId, docId, changeId, callback) {
request.post(
`http://127.0.0.1:3003/project/${projectId}/doc/${docId}/change/${changeId}/accept`,
callback
)
},
acceptChanges(projectId, docId, changeIds, callback) {
request.post(
{
url: `http://127.0.0.1:3003/project/${projectId}/doc/${docId}/change/accept`,
json: { change_ids: changeIds },
},
callback
)
},
rejectChanges(projectId, docId, changeIds, userId, callback) {
request.post(
{
url: `http://127.0.0.1:3003/project/${projectId}/doc/${docId}/change/reject`,
json: { change_ids: changeIds, user_id: userId },
},
callback
)
},
removeComment(projectId, docId, comment, callback) {
request.del(
`http://127.0.0.1:3003/project/${projectId}/doc/${docId}/comment/${comment}`,
callback
)
},
getProjectDocs(projectId, projectStateHash, callback) {
request.get(
`http://127.0.0.1:3003/project/${projectId}/doc?state=${projectStateHash}`,
(error, res, body) => {
if (body != null && res.statusCode >= 200 && res.statusCode < 300) {
body = JSON.parse(body)
}
callback(error, res, body)
}
)
},
sendProjectUpdate(projectId, userId, updates, version, callback) {
request.post(
{
url: `http://127.0.0.1:3003/project/${projectId}`,
json: { userId, updates, version },
},
(error, res, body) => callback(error, res, body)
)
},
async deleteDoc(projectId, docId) {
return await fetchNothing(
`http://127.0.0.1:3003/project/${projectId}/doc/${docId}`,
{ method: 'DELETE' }
)
},
async flushProject(projectId) {
return await fetchNothing(
`http://127.0.0.1:3003/project/${projectId}/flush`,
{
method: 'POST',
}
)
},
async deleteProject(projectId) {
return await fetchNothing(`http://127.0.0.1:3003/project/${projectId}`, {
method: 'DELETE',
})
},
async deleteProjectOnShutdown(projectId) {
return await fetchNothing(
`http://127.0.0.1:3003/project/${projectId}?background=true&shutdown=true`,
{
method: 'DELETE',
}
)
},
async flushOldProjects() {
await fetchNothing(
'http://127.0.0.1:3003/flush_queued_projects?min_delete_age=1'
)
},
async acceptChange(projectId, docId, changeId) {
await fetchNothing(
`http://127.0.0.1:3003/project/${projectId}/doc/${docId}/change/${changeId}/accept`,
{ method: 'POST' }
)
},
async acceptChanges(projectId, docId, changeIds) {
await fetchNothing(
`http://127.0.0.1:3003/project/${projectId}/doc/${docId}/change/accept`,
{
method: 'POST',
json: { change_ids: changeIds },
}
)
},
async rejectChanges(projectId, docId, changeIds, userId) {
return await fetchJson(
`http://127.0.0.1:3003/project/${projectId}/doc/${docId}/change/reject`,
{
method: 'POST',
json: { change_ids: changeIds, user_id: userId },
}
)
},
async removeComment(projectId, docId, comment) {
await fetchNothing(
`http://127.0.0.1:3003/project/${projectId}/doc/${docId}/comment/${comment}`,
{ method: 'DELETE' }
)
},
async getProjectDocs(projectId, projectStateHash) {
return await fetchJson(
`http://127.0.0.1:3003/project/${projectId}/doc?state=${projectStateHash}`
)
},
async sendProjectUpdate(projectId, userId, updates, version) {
await fetchNothing(`http://127.0.0.1:3003/project/${projectId}`, {
method: 'POST',
json: { userId, updates, version },
})
},
}
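
Taken together, the promisified client removes the async.series/callback plumbing shown above; here is a minimal sketch of how a test might drive the new helpers (the ids and payload are hypothetical; randomId is the existing helper on DocUpdaterClient):

// Sketch only: exercising the async DocUpdaterClient helpers end to end.
async function exampleFlow(DocUpdaterClient) {
  const projectId = DocUpdaterClient.randomId() // existing helper
  const docId = DocUpdaterClient.randomId()
  await DocUpdaterClient.preloadDoc(projectId, docId) // loads the doc into memory
  await DocUpdaterClient.setDocLines(projectId, docId, ['hello world'], 'test', null, false)
  const doc = await DocUpdaterClient.getDoc(projectId, docId) // fetchJson parses the body
  await DocUpdaterClient.flushDoc(projectId, docId) // fetchNothing only checks the status
  return doc.lines
}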

View File

@@ -33,7 +33,7 @@ SandboxedModule.configure({
'mongodb-legacy': require('mongodb-legacy'), // for ObjectId comparisons
'overleaf-editor-core': require('overleaf-editor-core'), // does not play nice with sandbox
},
globals: { Buffer, JSON, Math, console, process },
globals: { Buffer, JSON, Math, console, process, URL },
sourceTransformers: {
removeNodePrefix: function (source) {
return source.replace(/require\(['"]node:/g, "require('")
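
For context: SandboxedModule only exposes the globals listed in this config to code loaded inside the sandbox, so adding URL here is what lets the migrated tests assert on fetchNothing being called with new URL(...). A one-line illustration (the URL value is hypothetical):

// Inside a SandboxedModule.require'd module, globals are limited to the list above;
// without URL in that list, this line would throw "URL is not defined":
const flushUrl = new URL('http://127.0.0.1:3054/project/123/flush?background=true')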

View File

@@ -1,387 +0,0 @@
/* eslint-disable
no-return-assign,
no-unused-vars,
*/
// TODO: This file was created by bulk-decaffeinate.
// Fix any style issues and re-enable lint.
/*
* decaffeinate suggestions:
* DS101: Remove unnecessary use of Array.from
* DS102: Remove unnecessary code created because of implicit returns
* DS202: Simplify dynamic range loops
* DS205: Consider reworking code to avoid use of IIFEs
* DS207: Consider shorter variations of null checks
* Full docs: https://github.com/decaffeinate/decaffeinate/blob/master/docs/suggestions.md
*/
const DocUpdaterClient = require('../../acceptance/js/helpers/DocUpdaterClient')
// MockWebApi = require "../../acceptance/js/helpers/MockWebApi"
const assert = require('node:assert')
const async = require('async')
const insert = function (string, pos, content) {
const result = string.slice(0, pos) + content + string.slice(pos)
return result
}
const transform = function (op1, op2) {
if (op2.p < op1.p) {
return {
p: op1.p + op2.i.length,
i: op1.i,
}
} else {
return op1
}
}
class StressTestClient {
constructor(options) {
if (options == null) {
options = {}
}
this.options = options
if (this.options.updateDelay == null) {
this.options.updateDelay = 200
}
this.project_id = this.options.project_id || DocUpdaterClient.randomId()
this.doc_id = this.options.doc_id || DocUpdaterClient.randomId()
this.pos = this.options.pos || 0
this.content = this.options.content || ''
this.client_id = DocUpdaterClient.randomId()
this.version = this.options.version || 0
this.inflight_op = null
this.charCode = 0
this.counts = {
conflicts: 0,
local_updates: 0,
remote_updates: 0,
max_delay: 0,
}
DocUpdaterClient.subscribeToAppliedOps((channel, update) => {
update = JSON.parse(update)
if (update.error != null) {
console.error(new Error(`Error from server: '${update.error}'`))
return
}
if (update.doc_id === this.doc_id) {
return this.processReply(update)
}
})
}
sendUpdate() {
const data = String.fromCharCode(65 + (this.charCode++ % 26))
this.content = insert(this.content, this.pos, data)
this.inflight_op = {
i: data,
p: this.pos++,
}
this.resendUpdate()
return (this.inflight_op_sent = Date.now())
}
resendUpdate() {
assert(this.inflight_op != null)
DocUpdaterClient.sendUpdate(this.project_id, this.doc_id, {
doc: this.doc_id,
op: [this.inflight_op],
v: this.version,
meta: {
source: this.client_id,
},
dupIfSource: [this.client_id],
})
return (this.update_timer = setTimeout(() => {
console.log(
`[${new Date()}] \t[${this.client_id.slice(
0,
4
)}] WARN: Resending update after 5 seconds`
)
return this.resendUpdate()
}, 5000))
}
processReply(update) {
if (update.op.v !== this.version) {
if (update.op.v < this.version) {
console.log(
`[${new Date()}] \t[${this.client_id.slice(
0,
4
)}] WARN: Duplicate ack (already seen version)`
)
return
} else {
console.error(
`[${new Date()}] \t[${this.client_id.slice(
0,
4
)}] ERROR: Version jumped ahead (client: ${this.version}, op: ${
update.op.v
})`
)
}
}
this.version++
if (update.op.meta.source === this.client_id) {
if (this.inflight_op != null) {
this.counts.local_updates++
this.inflight_op = null
clearTimeout(this.update_timer)
const delay = Date.now() - this.inflight_op_sent
this.counts.max_delay = Math.max(this.counts.max_delay, delay)
return this.continue()
} else {
return console.log(
`[${new Date()}] \t[${this.client_id.slice(
0,
4
)}] WARN: Duplicate ack`
)
}
} else {
assert(update.op.op.length === 1)
this.counts.remote_updates++
let externalOp = update.op.op[0]
if (this.inflight_op != null) {
this.counts.conflicts++
this.inflight_op = transform(this.inflight_op, externalOp)
externalOp = transform(externalOp, this.inflight_op)
}
if (externalOp.p < this.pos) {
this.pos += externalOp.i.length
}
return (this.content = insert(this.content, externalOp.p, externalOp.i))
}
}
continue() {
if (this.updateCount > 0) {
this.updateCount--
return setTimeout(
() => {
return this.sendUpdate()
},
this.options.updateDelay * (0.5 + Math.random())
)
} else {
return this.updateCallback()
}
}
runForNUpdates(n, callback) {
if (callback == null) {
callback = function () {}
}
this.updateCallback = callback
this.updateCount = n
return this.continue()
}
check(callback) {
if (callback == null) {
callback = function () {}
}
return DocUpdaterClient.getDoc(
this.project_id,
this.doc_id,
(error, res, body) => {
if (error != null) {
throw error
}
if (body.lines == null) {
return console.error(
`[${new Date()}] \t[${this.client_id.slice(
0,
4
)}] ERROR: Invalid response from get doc (${this.doc_id})`,
body
)
}
const content = body.lines.join('\n')
const { version } = body
if (content !== this.content) {
if (version === this.version) {
console.error(
`[${new Date()}] \t[${this.client_id.slice(
0,
4
)}] Error: Client content does not match server.`
)
console.error(`Server: ${content.split('a')}`)
console.error(`Client: ${this.content.split('a')}`)
} else {
console.error(
`[${new Date()}] \t[${this.client_id.slice(
0,
4
)}] Error: Version mismatch (Server: '${version}', Client: '${
this.version
}')`
)
}
}
if (!this.isContentValid(this.content)) {
const iterable = this.content.split('')
for (let i = 0; i < iterable.length; i++) {
const chunk = iterable[i]
if (chunk != null && chunk !== 'a') {
console.log(chunk, i)
}
}
throw new Error('bad content')
}
return callback()
}
)
}
isChunkValid(chunk) {
const char = 0
for (let i = 0; i < chunk.length; i++) {
const letter = chunk[i]
if (letter.charCodeAt(0) !== 65 + (i % 26)) {
console.error(
`[${new Date()}] \t[${this.client_id.slice(0, 4)}] Invalid Chunk:`,
chunk
)
return false
}
}
return true
}
isContentValid(content) {
for (const chunk of Array.from(content.split('a'))) {
if (chunk != null && chunk !== '') {
if (!this.isChunkValid(chunk)) {
console.error(
`[${new Date()}] \t[${this.client_id.slice(0, 4)}] Invalid content`,
content
)
return false
}
}
}
return true
}
}
const checkDocument = function (projectId, docId, clients, callback) {
if (callback == null) {
callback = function () {}
}
const jobs = clients.map(client => cb => client.check(cb))
return async.parallel(jobs, callback)
}
const printSummary = function (docId, clients) {
const slot = require('cluster-key-slot')
const now = new Date()
console.log(
`[${now}] [${docId.slice(0, 4)} (slot: ${slot(docId)})] ${
clients.length
} clients...`
)
return (() => {
const result = []
for (const client of Array.from(clients)) {
console.log(
`[${now}] \t[${client.client_id.slice(0, 4)}] { local: ${
client.counts.local_updates
}, remote: ${client.counts.remote_updates}, conflicts: ${
client.counts.conflicts
}, max_delay: ${client.counts.max_delay} }`
)
result.push(
(client.counts = {
local_updates: 0,
remote_updates: 0,
conflicts: 0,
max_delay: 0,
})
)
}
return result
})()
}
const CLIENT_COUNT = parseInt(process.argv[2], 10)
const UPDATE_DELAY = parseInt(process.argv[3], 10)
const SAMPLE_INTERVAL = parseInt(process.argv[4], 10)
for (const docAndProjectId of Array.from(process.argv.slice(5))) {
;(function (docAndProjectId) {
const [projectId, docId] = Array.from(docAndProjectId.split(':'))
console.log({ projectId, docId })
return DocUpdaterClient.setDocLines(
projectId,
docId,
[new Array(CLIENT_COUNT + 2).join('a')],
null,
null,
error => {
if (error != null) {
throw error
}
return DocUpdaterClient.getDoc(projectId, docId, (error, res, body) => {
let runBatch
if (error != null) {
throw error
}
if (body.lines == null) {
return console.error(
`[${new Date()}] ERROR: Invalid response from get doc (${docId})`,
body
)
}
const content = body.lines.join('\n')
const { version } = body
const clients = []
for (
let pos = 1, end = CLIENT_COUNT, asc = end >= 1;
asc ? pos <= end : pos >= end;
asc ? pos++ : pos--
) {
;(function (pos) {
const client = new StressTestClient({
doc_id: docId,
project_id: projectId,
content,
pos,
version,
updateDelay: UPDATE_DELAY,
})
return clients.push(client)
})(pos)
}
return (runBatch = function () {
const jobs = clients.map(
client => cb =>
client.runForNUpdates(SAMPLE_INTERVAL / UPDATE_DELAY, cb)
)
return async.parallel(jobs, error => {
if (error != null) {
throw error
}
printSummary(docId, clients)
return checkDocument(projectId, docId, clients, error => {
if (error != null) {
throw error
}
return runBatch()
})
})
})()
})
}
)
})(docAndProjectId)
}
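
The transform helper in this deleted script is the standard operational-transform rule for two concurrent single-insert ops: if the other op inserted earlier in the document, shift our position by its length; otherwise leave it alone. A quick worked example (values hypothetical):

// op1 wants to insert 'X' at position 5; a concurrent op2 inserted 'ab' at position 2.
const op1 = { p: 5, i: 'X' }
const op2 = { p: 2, i: 'ab' }
// op2.p < op1.p, so op1 shifts right by op2.i.length:
// transform(op1, op2) => { p: 7, i: 'X' }
// op1.p (5) is not less than op2.p (2), so op2 is unchanged:
// transform(op2, op1) => { p: 2, i: 'ab' }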

View File

@@ -13,6 +13,9 @@ describe('HistoryManager', function () {
this.HistoryManager = SandboxedModule.require(modulePath, {
requires: {
request: (this.request = {}),
'@overleaf/fetch-utils': (this.fetchUtils = {
fetchNothing: sinon.stub().resolves(),
}),
'@overleaf/settings': (this.Settings = {
shortHistoryQueues: [],
apis: {
@@ -21,74 +24,69 @@ describe('HistoryManager', function () {
},
},
}),
'./DocumentManager': (this.DocumentManager = {}),
'./DocumentManager': (this.DocumentManager = {
promises: {
resyncDocContentsWithLock: sinon.stub().resolves(),
},
}),
'./RedisManager': (this.RedisManager = {}),
'./ProjectHistoryRedisManager': (this.ProjectHistoryRedisManager = {}),
'./ProjectHistoryRedisManager': (this.ProjectHistoryRedisManager = {
promises: {
queueResyncProjectStructure: sinon.stub().resolves(),
},
}),
'./Metrics': (this.metrics = { inc: sinon.stub() }),
},
})
this.project_id = 'mock-project-id'
this.callback = sinon.stub()
})
describe('flushProjectChangesAsync', function () {
beforeEach(function () {
this.request.post = sinon
.stub()
.callsArgWith(1, null, { statusCode: 204 })
this.HistoryManager.flushProjectChangesAsync(this.project_id)
})
it('should send a request to the project history api', function () {
this.request.post
.calledWith({
url: `${this.Settings.apis.project_history.url}/project/${this.project_id}/flush`,
qs: { background: true },
})
.should.equal(true)
this.fetchUtils.fetchNothing.should.have.been.calledWith(
new URL(
`${this.Settings.apis.project_history.url}/project/${this.project_id}/flush?background=true`
)
)
})
})
describe('flushProjectChanges', function () {
describe('in the normal case', function () {
beforeEach(function (done) {
this.request.post = sinon
.stub()
.callsArgWith(1, null, { statusCode: 204 })
this.HistoryManager.flushProjectChanges(
this.project_id,
{
background: true,
},
done
)
})
beforeEach(async function () {
await this.HistoryManager.promises.flushProjectChanges(
this.project_id,
{
background: true,
}
)
})
it('should send a request to the project history api', function () {
this.request.post
.calledWith({
url: `${this.Settings.apis.project_history.url}/project/${this.project_id}/flush`,
qs: { background: true },
})
.should.equal(true)
this.fetchUtils.fetchNothing.should.have.been.calledWith(
new URL(
`${this.Settings.apis.project_history.url}/project/${this.project_id}/flush?background=true`
)
)
})
})
describe('with the skip_history_flush option', function () {
beforeEach(function (done) {
this.request.post = sinon.stub()
this.HistoryManager.flushProjectChanges(
this.project_id,
{
skip_history_flush: true,
},
done
)
})
beforeEach(async function () {
await this.HistoryManager.promises.flushProjectChanges(
this.project_id,
{
skip_history_flush: true,
}
)
})
it('should not send a request to the project history api', function () {
this.request.post.called.should.equal(false)
this.fetchUtils.fetchNothing.should.not.have.been.called
})
})
})
@@ -96,7 +94,7 @@ describe('HistoryManager', function () {
describe('recordAndFlushHistoryOps', function () {
beforeEach(function () {
this.ops = ['mock-ops']
this.project_ops_length = 10
this.project_ops_length = 500
this.HistoryManager.flushProjectChangesAsync = sinon.stub()
})
@@ -111,17 +109,12 @@ describe('HistoryManager', function () {
})
it('should not flush project changes', function () {
this.HistoryManager.flushProjectChangesAsync.called.should.equal(false)
this.fetchUtils.fetchNothing.should.not.have.been.called
})
})
describe('with enough ops to flush project changes', function () {
beforeEach(function () {
this.HistoryManager.shouldFlushHistoryOps = sinon.stub()
this.HistoryManager.shouldFlushHistoryOps
.withArgs(this.project_id, this.project_ops_length)
.returns(true)
this.HistoryManager.recordAndFlushHistoryOps(
this.project_id,
this.ops,
@@ -130,29 +123,12 @@ describe('HistoryManager', function () {
})
it('should flush project changes', function () {
this.HistoryManager.flushProjectChangesAsync
.calledWith(this.project_id)
.should.equal(true)
this.fetchUtils.fetchNothing.should.have.been.calledWith(
new URL(
`${this.Settings.apis.project_history.url}/project/${this.project_id}/flush?background=true`
)
)
})
})
describe('with enough ops to flush doc changes', function () {
beforeEach(function () {
this.HistoryManager.shouldFlushHistoryOps = sinon.stub()
this.HistoryManager.shouldFlushHistoryOps
.withArgs(this.project_id, this.project_ops_length)
.returns(false)
this.HistoryManager.recordAndFlushHistoryOps(
this.project_id,
this.ops,
this.project_ops_length
)
})
it('should not flush project changes', function () {
this.HistoryManager.flushProjectChangesAsync.called.should.equal(false)
})
})
describe('shouldFlushHistoryOps', function () {
@@ -228,78 +204,61 @@ describe('HistoryManager', function () {
url: `www.filestore.test/${this.project_id}/mock-file-id`,
},
]
this.ProjectHistoryRedisManager.queueResyncProjectStructure = sinon
.stub()
.yields()
this.DocumentManager.resyncDocContentsWithLock = sinon.stub().yields()
})
describe('full sync', function () {
beforeEach(function () {
this.HistoryManager.resyncProjectHistory(
beforeEach(async function () {
await this.HistoryManager.promises.resyncProjectHistory(
this.project_id,
this.projectHistoryId,
this.docs,
this.files,
{},
this.callback
{}
)
})
it('should queue a project structure reync', function () {
this.ProjectHistoryRedisManager.queueResyncProjectStructure
.calledWith(
this.project_id,
this.projectHistoryId,
this.docs,
this.files
)
.should.equal(true)
this.ProjectHistoryRedisManager.promises.queueResyncProjectStructure.should.have.been.calledWith(
this.project_id,
this.projectHistoryId,
this.docs,
this.files
)
})
it('should queue doc content reyncs', function () {
this.DocumentManager.resyncDocContentsWithLock
.calledWith(this.project_id, this.docs[0].doc, this.docs[0].path)
.should.equal(true)
this.DocumentManager.promises.resyncDocContentsWithLock.should.have.been.calledWith(
this.project_id,
this.docs[0].doc,
this.docs[0].path
)
})
it('should call the callback', function () {
this.callback.called.should.equal(true)
})
})
describe('resyncProjectStructureOnly=true', function () {
beforeEach(function () {
this.HistoryManager.resyncProjectHistory(
beforeEach(async function () {
await this.HistoryManager.promises.resyncProjectHistory(
this.project_id,
this.projectHistoryId,
this.docs,
this.files,
{ resyncProjectStructureOnly: true },
this.callback
{ resyncProjectStructureOnly: true }
)
})
it('should queue a project structure reync', function () {
this.ProjectHistoryRedisManager.queueResyncProjectStructure
.calledWith(
this.project_id,
this.projectHistoryId,
this.docs,
this.files,
{ resyncProjectStructureOnly: true }
)
.should.equal(true)
this.ProjectHistoryRedisManager.promises.queueResyncProjectStructure.should.have.been.calledWith(
this.project_id,
this.projectHistoryId,
this.docs,
this.files,
{ resyncProjectStructureOnly: true }
)
})
it('should not queue doc content reyncs', function () {
this.DocumentManager.resyncDocContentsWithLock.called.should.equal(
false
)
this.DocumentManager.promises.resyncDocContentsWithLock.should.not.have
.been.called
})
it('should call the callback', function () {
this.callback.called.should.equal(true)
})
})
})
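
The shouldFlushHistoryOps cases above treat the flush decision as a pure function of the queued ops length. A minimal sketch of the threshold shape the tests imply (the real HistoryManager logic may batch or sample differently; 500 is simply the value the test assigns to project_ops_length):

// Sketch only: flush once the project history queue crosses a threshold.
function shouldFlushHistoryOps(projectId, projectOpsLength, threshold = 500) {
  return projectOpsLength >= threshold
}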

View File

@@ -25,6 +25,7 @@ services:
POSTGRES_HOST: postgres
AWS_S3_ENDPOINT: https://minio:9000
AWS_S3_PATH_STYLE: 'true'
DELETE_OBJECTS_MD5_FALLBACK: true
AWS_ACCESS_KEY_ID: OVERLEAF_FILESTORE_S3_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY: OVERLEAF_FILESTORE_S3_SECRET_ACCESS_KEY
MINIO_ROOT_USER: MINIO_ROOT_USER
@@ -69,7 +70,7 @@ services:
build:
dockerfile_inline: |
FROM node:22.18.0
RUN wget -O /certgen "https://github.com/minio/certgen/releases/download/v1.3.0/certgen-linux-$(dpkg --print-architecture)"
RUN wget -O /certgen "https://github.com/minio/certgen/releases/download/v1.3.0/certgen-linux-"`dpkg --print-architecture`
RUN chmod +x /certgen
volumes:
- minio-certs:/certs
@@ -138,7 +139,10 @@ services:
"Action": [
"s3:PutObject",
"s3:GetObject",
"s3:DeleteObject"
"s3:DeleteObject",
"s3:AbortMultipartUpload",
"s3:ListMultipartUploadParts",
"s3:ListBucketMultipartUploads"
],
"Resource": "arn:aws:s3:::fake-user-files/*"
},
@@ -154,7 +158,10 @@ services:
"Action": [
"s3:PutObject",
"s3:GetObject",
"s3:DeleteObject"
"s3:DeleteObject",
"s3:AbortMultipartUpload",
"s3:ListMultipartUploadParts",
"s3:ListBucketMultipartUploads"
],
"Resource": "arn:aws:s3:::fake-user-files-dek/*"
},
@@ -170,7 +177,10 @@ services:
"Action": [
"s3:PutObject",
"s3:GetObject",
"s3:DeleteObject"
"s3:DeleteObject",
"s3:AbortMultipartUpload",
"s3:ListMultipartUploadParts",
"s3:ListBucketMultipartUploads"
],
"Resource": "arn:aws:s3:::fake-template-files/*"
},
@@ -186,7 +196,10 @@ services:
"Action": [
"s3:PutObject",
"s3:GetObject",
"s3:DeleteObject"
"s3:DeleteObject",
"s3:AbortMultipartUpload",
"s3:ListMultipartUploadParts",
"s3:ListBucketMultipartUploads"
],
"Resource": "arn:aws:s3:::random-bucket-*"
}

View File

@@ -42,6 +42,7 @@ services:
POSTGRES_HOST: postgres
AWS_S3_ENDPOINT: https://minio:9000
AWS_S3_PATH_STYLE: 'true'
DELETE_OBJECTS_MD5_FALLBACK: true
AWS_ACCESS_KEY_ID: OVERLEAF_FILESTORE_S3_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY: OVERLEAF_FILESTORE_S3_SECRET_ACCESS_KEY
MINIO_ROOT_USER: MINIO_ROOT_USER
@@ -77,7 +78,7 @@ services:
build:
dockerfile_inline: |
FROM node:22.18.0
RUN wget -O /certgen "https://github.com/minio/certgen/releases/download/v1.3.0/certgen-linux-$(dpkg --print-architecture)"
RUN wget -O /certgen "https://github.com/minio/certgen/releases/download/v1.3.0/certgen-linux-"`dpkg --print-architecture`
RUN chmod +x /certgen
volumes:
- minio-certs:/certs
@@ -146,7 +147,10 @@ services:
"Action": [
"s3:PutObject",
"s3:GetObject",
"s3:DeleteObject"
"s3:DeleteObject",
"s3:AbortMultipartUpload",
"s3:ListMultipartUploadParts",
"s3:ListBucketMultipartUploads"
],
"Resource": "arn:aws:s3:::fake-user-files/*"
},
@@ -162,7 +166,10 @@ services:
"Action": [
"s3:PutObject",
"s3:GetObject",
"s3:DeleteObject"
"s3:DeleteObject",
"s3:AbortMultipartUpload",
"s3:ListMultipartUploadParts",
"s3:ListBucketMultipartUploads"
],
"Resource": "arn:aws:s3:::fake-user-files-dek/*"
},
@@ -178,7 +185,10 @@ services:
"Action": [
"s3:PutObject",
"s3:GetObject",
"s3:DeleteObject"
"s3:DeleteObject",
"s3:AbortMultipartUpload",
"s3:ListMultipartUploadParts",
"s3:ListBucketMultipartUploads"
],
"Resource": "arn:aws:s3:::fake-template-files/*"
},
@@ -194,7 +204,10 @@ services:
"Action": [
"s3:PutObject",
"s3:GetObject",
"s3:DeleteObject"
"s3:DeleteObject",
"s3:AbortMultipartUpload",
"s3:ListMultipartUploadParts",
"s3:ListBucketMultipartUploads"
],
"Resource": "arn:aws:s3:::random-bucket-*"
}

View File

@@ -41,6 +41,7 @@ services:
POSTGRES_HOST: postgres
AWS_S3_ENDPOINT: https://minio:9000
AWS_S3_PATH_STYLE: 'true'
DELETE_OBJECTS_MD5_FALLBACK: true
AWS_ACCESS_KEY_ID: OVERLEAF_HISTORY_S3_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY: OVERLEAF_HISTORY_S3_SECRET_ACCESS_KEY
MINIO_ROOT_USER: MINIO_ROOT_USER
@@ -119,7 +120,7 @@ services:
build:
dockerfile_inline: |
FROM node:22.18.0
RUN wget -O /certgen "https://github.com/minio/certgen/releases/download/v1.3.0/certgen-linux-$(dpkg --print-architecture)"
RUN wget -O /certgen "https://github.com/minio/certgen/releases/download/v1.3.0/certgen-linux-"`dpkg --print-architecture`
RUN chmod +x /certgen
volumes:
- minio-certs:/certs
@@ -188,7 +189,10 @@ services:
"Action": [
"s3:PutObject",
"s3:GetObject",
"s3:DeleteObject"
"s3:DeleteObject",
"s3:AbortMultipartUpload",
"s3:ListMultipartUploadParts",
"s3:ListBucketMultipartUploads"
],
"Resource": "arn:aws:s3:::overleaf-test-history-chunks/*"
},
@@ -204,7 +208,10 @@ services:
"Action": [
"s3:PutObject",
"s3:GetObject",
"s3:DeleteObject"
"s3:DeleteObject",
"s3:AbortMultipartUpload",
"s3:ListMultipartUploadParts",
"s3:ListBucketMultipartUploads"
],
"Resource": "arn:aws:s3:::overleaf-test-history-deks/*"
},
@@ -220,7 +227,10 @@ services:
"Action": [
"s3:PutObject",
"s3:GetObject",
"s3:DeleteObject"
"s3:DeleteObject",
"s3:AbortMultipartUpload",
"s3:ListMultipartUploadParts",
"s3:ListBucketMultipartUploads"
],
"Resource": "arn:aws:s3:::overleaf-test-history-global-blobs/*"
},
@@ -236,7 +246,10 @@ services:
"Action": [
"s3:PutObject",
"s3:GetObject",
"s3:DeleteObject"
"s3:DeleteObject",
"s3:AbortMultipartUpload",
"s3:ListMultipartUploadParts",
"s3:ListBucketMultipartUploads"
],
"Resource": "arn:aws:s3:::overleaf-test-history-project-blobs/*"
}

View File

@@ -58,6 +58,7 @@ services:
POSTGRES_HOST: postgres
AWS_S3_ENDPOINT: https://minio:9000
AWS_S3_PATH_STYLE: 'true'
DELETE_OBJECTS_MD5_FALLBACK: true
AWS_ACCESS_KEY_ID: OVERLEAF_HISTORY_S3_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY: OVERLEAF_HISTORY_S3_SECRET_ACCESS_KEY
MINIO_ROOT_USER: MINIO_ROOT_USER
@@ -127,7 +128,7 @@ services:
build:
dockerfile_inline: |
FROM node:22.18.0
RUN wget -O /certgen "https://github.com/minio/certgen/releases/download/v1.3.0/certgen-linux-$(dpkg --print-architecture)"
RUN wget -O /certgen "https://github.com/minio/certgen/releases/download/v1.3.0/certgen-linux-"`dpkg --print-architecture`
RUN chmod +x /certgen
volumes:
- minio-certs:/certs
@@ -196,7 +197,10 @@ services:
"Action": [
"s3:PutObject",
"s3:GetObject",
"s3:DeleteObject"
"s3:DeleteObject",
"s3:AbortMultipartUpload",
"s3:ListMultipartUploadParts",
"s3:ListBucketMultipartUploads"
],
"Resource": "arn:aws:s3:::overleaf-test-history-chunks/*"
},
@@ -212,7 +216,10 @@ services:
"Action": [
"s3:PutObject",
"s3:GetObject",
"s3:DeleteObject"
"s3:DeleteObject",
"s3:AbortMultipartUpload",
"s3:ListMultipartUploadParts",
"s3:ListBucketMultipartUploads"
],
"Resource": "arn:aws:s3:::overleaf-test-history-deks/*"
},
@@ -228,7 +235,10 @@ services:
"Action": [
"s3:PutObject",
"s3:GetObject",
"s3:DeleteObject"
"s3:DeleteObject",
"s3:AbortMultipartUpload",
"s3:ListMultipartUploadParts",
"s3:ListBucketMultipartUploads"
],
"Resource": "arn:aws:s3:::overleaf-test-history-global-blobs/*"
},
@@ -244,7 +254,10 @@ services:
"Action": [
"s3:PutObject",
"s3:GetObject",
"s3:DeleteObject"
"s3:DeleteObject",
"s3:AbortMultipartUpload",
"s3:ListMultipartUploadParts",
"s3:ListBucketMultipartUploads"
],
"Resource": "arn:aws:s3:::overleaf-test-history-project-blobs/*"
}

View File

@@ -70,7 +70,6 @@ export async function downloadBlobToDir(historyId, blob, tmpDir) {
* @return {Promise<void>}
*/
export async function uploadBlobToBackup(historyId, blob, path, persistor) {
const md5 = Crypto.createHash('md5')
const filePathCompressed = path + '.gz'
let backupSource
let contentEncoding
@@ -86,7 +85,6 @@ export async function uploadBlobToBackup(historyId, blob, path, persistor) {
async function* (source) {
for await (const chunk of source) {
size += chunk.byteLength
md5.update(chunk)
yield chunk
}
},
@@ -97,10 +95,6 @@ export async function uploadBlobToBackup(historyId, blob, path, persistor) {
} else {
backupSource = path
size = blob.getByteLength()
await Stream.promises.pipeline(
fs.createReadStream(path, { highWaterMark: HIGHWATER_MARK }),
md5
)
}
const key = makeProjectKey(historyId, blob.getHash())
await persistor.sendStream(
@@ -111,7 +105,6 @@ export async function uploadBlobToBackup(historyId, blob, path, persistor) {
contentEncoding,
contentType: 'application/octet-stream',
contentLength: size,
sourceMd5: md5.digest('hex'),
ifNoneMatch: '*',
}
)
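
With the sourceMd5 bookkeeping removed, the only per-blob accounting left in this pipeline is the byte count. The async-generator transform used above is a generally useful streaming pattern; a self-contained sketch (file paths hypothetical):

// Sketch only: count bytes flowing through a pipeline without buffering them.
const Stream = require('node:stream')
const zlib = require('node:zlib')
const fs = require('node:fs')

async function gzipWithSize(inPath, outPath) {
  let size = 0
  await Stream.promises.pipeline(
    fs.createReadStream(inPath),
    async function* (source) {
      for await (const chunk of source) {
        size += chunk.byteLength // tally uncompressed bytes as they pass through
        yield chunk
      }
    },
    zlib.createGzip(),
    fs.createWriteStream(outPath)
  )
  return size
}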

View File

@@ -104,7 +104,6 @@ async function getRootKeyEncryptionKeys() {
export const backupPersistor = new PerProjectEncryptedS3Persistor({
...persistorConfig.s3SSEC,
disableMultiPartUpload: true,
dataEncryptionKeyBucketName: deksBucket,
pathToProjectFolder,
getRootKeyEncryptionKeys,

View File

@@ -221,7 +221,6 @@ async function backupChunk(
}
const key = makeChunkKey(historyId, chunkToBackup.startVersion)
logger.debug({ chunkRecord, historyId, projectId, key }, 'backing up chunk')
const md5 = Crypto.createHash('md5').update(chunkBuffer)
await chunkBackupPersistorForProject.sendStream(
chunksBucket,
makeChunkKey(historyId, chunkToBackup.startVersion),
@@ -230,7 +229,6 @@ async function backupChunk(
contentType: 'application/json',
contentEncoding: 'gzip',
contentLength: chunkBuffer.byteLength,
sourceMd5: md5.digest('hex'),
}
)
}

View File

@@ -131,7 +131,6 @@ async function backupChunk(historyId) {
historyId,
newChunkMetadata.id
)
const md5 = Crypto.createHash('md5').update(chunkBuffer)
await backupPersistor.sendStream(
chunksBucket,
path.join(
@@ -143,7 +142,6 @@ async function backupChunk(historyId) {
contentType: 'application/json',
contentEncoding: 'gzip',
contentLength: chunkBuffer.byteLength,
sourceMd5: md5.digest('hex'),
}
)
}

View File

@@ -23,7 +23,7 @@ describe('Deleting project', function () {
.get(`/api/projects/${this.historyId}/latest/history`)
.replyWithFile(200, fixture('chunks/0-3.json'))
MockHistoryStore().delete(`/api/projects/${this.historyId}`).reply(204)
await ProjectHistoryApp.promises.ensureRunning()
await ProjectHistoryApp.ensureRunning()
})
describe('when the project has no pending updates', function () {
@@ -34,16 +34,13 @@ describe('Deleting project', function () {
describe('when the project has pending updates', function () {
beforeEach(async function () {
await ProjectHistoryClient.promises.pushRawUpdate(this.projectId, {
await ProjectHistoryClient.pushRawUpdate(this.projectId, {
pathname: '/main.tex',
docLines: 'hello',
doc: this.docId,
meta: { userId: this.userId, ts: new Date() },
})
await ProjectHistoryClient.promises.setFirstOpTimestamp(
this.projectId,
Date.now()
)
await ProjectHistoryClient.setFirstOpTimestamp(this.projectId, Date.now())
await ProjectHistoryClient.deleteProject(this.projectId)
})
@@ -53,9 +50,7 @@ describe('Deleting project', function () {
})
it('clears the first op timestamp', async function () {
const ts = await ProjectHistoryClient.promises.getFirstOpTimestamp(
this.projectId
)
const ts = await ProjectHistoryClient.getFirstOpTimestamp(this.projectId)
expect(ts).to.be.null
})
})

View File

@@ -24,7 +24,7 @@ function createMockBlob(historyId, content) {
describe('Diffs', function () {
beforeEach(async function () {
await ProjectHistoryApp.promises.ensureRunning()
await ProjectHistoryApp.ensureRunning()
this.historyId = new ObjectId().toString()
this.projectId = new ObjectId().toString()
@@ -39,7 +39,7 @@ describe('Diffs', function () {
overleaf: { history: { id: this.historyId } },
})
await ProjectHistoryClient.promises.initializeProject(this.historyId)
await ProjectHistoryClient.initializeProject(this.historyId)
})
afterEach(function () {

View File

@@ -11,7 +11,7 @@ describe('DiscardingUpdates', function () {
beforeEach(async function () {
this.timestamp = new Date()
await ProjectHistoryApp.promises.ensureRunning()
await ProjectHistoryApp.ensureRunning()
this.user_id = new ObjectId().toString()
this.project_id = new ObjectId().toString()
this.doc_id = new ObjectId().toString()
@@ -22,7 +22,7 @@ describe('DiscardingUpdates', function () {
MockWeb()
.get(`/project/${this.project_id}/details`)
.reply(200, { name: 'Test Project' })
await ProjectHistoryClient.promises.initializeProject(this.project_id)
await ProjectHistoryClient.initializeProject(this.project_id)
})
it('should discard updates', async function () {
@@ -32,7 +32,7 @@ describe('DiscardingUpdates', function () {
doc: this.doc_id,
meta: { user_id: this.user_id, ts: new Date() },
}
await ProjectHistoryClient.promises.pushRawUpdate(this.project_id, update)
await ProjectHistoryClient.promises.flushProject(this.project_id)
await ProjectHistoryClient.pushRawUpdate(this.project_id, update)
await ProjectHistoryClient.flushProject(this.project_id)
})
})

View File

@@ -13,7 +13,7 @@ const sha = data => crypto.createHash('sha1').update(data).digest('hex')
describe('FileTree Diffs', function () {
beforeEach(async function () {
await ProjectHistoryApp.promises.ensureRunning()
await ProjectHistoryApp.ensureRunning()
this.historyId = new ObjectId().toString()
this.projectId = new ObjectId().toString()
@@ -28,7 +28,7 @@ describe('FileTree Diffs', function () {
overleaf: { history: { id: this.historyId } },
})
await ProjectHistoryClient.promises.initializeProject(this.historyId)
await ProjectHistoryClient.initializeProject(this.historyId)
})
afterEach(function () {

View File

@@ -17,7 +17,7 @@ describe('Flushing old queues', function () {
beforeEach(async function () {
this.timestamp = new Date()
await ProjectHistoryApp.promises.ensureRunning()
await ProjectHistoryApp.ensureRunning()
this.projectId = new ObjectId().toString()
this.docId = new ObjectId().toString()
this.fileId = new ObjectId().toString()
@@ -45,7 +45,7 @@ describe('Flushing old queues', function () {
},
},
})
await ProjectHistoryClient.promises.initializeProject(historyId)
await ProjectHistoryClient.initializeProject(historyId)
})
afterEach(function () {
@@ -68,11 +68,8 @@ describe('Flushing old queues', function () {
doc: this.docId,
meta: { user_id: this.user_id, ts: new Date() },
}
await ProjectHistoryClient.promises.pushRawUpdate(
this.projectId,
update
)
await ProjectHistoryClient.promises.setFirstOpTimestamp(
await ProjectHistoryClient.pushRawUpdate(this.projectId, update)
await ProjectHistoryClient.setFirstOpTimestamp(
this.projectId,
Date.now() - 24 * 3600 * 1000
)
@@ -131,11 +128,8 @@ describe('Flushing old queues', function () {
doc: this.docId,
meta: { user_id: this.user_id, ts: new Date() },
}
await ProjectHistoryClient.promises.pushRawUpdate(
this.projectId,
update
)
await ProjectHistoryClient.promises.setFirstOpTimestamp(
await ProjectHistoryClient.pushRawUpdate(this.projectId, update)
await ProjectHistoryClient.setFirstOpTimestamp(
this.projectId,
Date.now() - 60 * 1000
)
@@ -177,11 +171,8 @@ describe('Flushing old queues', function () {
doc: this.docId,
meta: { user_id: this.user_id, ts: new Date() },
}
await ProjectHistoryClient.promises.pushRawUpdate(
this.projectId,
update
)
await ProjectHistoryClient.promises.setFirstOpTimestamp(
await ProjectHistoryClient.pushRawUpdate(this.projectId, update)
await ProjectHistoryClient.setFirstOpTimestamp(
this.projectId,
Date.now() - 60 * 1000
)
@@ -241,16 +232,8 @@ describe('Flushing old queues', function () {
meta: { user_id: this.user_id, ts: new Date() },
}
this.startDate = Date.now()
await ProjectHistoryClient.promises.pushRawUpdate(
this.projectId,
update
)
await new Promise((resolve, reject) => {
ProjectHistoryClient.clearFirstOpTimestamp(this.projectId, err => {
if (err) reject(err)
else resolve()
})
})
await ProjectHistoryClient.pushRawUpdate(this.projectId, update)
await ProjectHistoryClient.clearFirstOpTimestamp(this.projectId)
})
it('flushes the project history queue anyway', async function () {
@@ -266,15 +249,9 @@ describe('Flushing old queues', function () {
'made calls to history service to store updates'
)
const result = await new Promise((resolve, reject) => {
ProjectHistoryClient.getFirstOpTimestamp(
this.projectId,
(err, result) => {
if (err) reject(err)
else resolve(result)
}
)
})
const result = await ProjectHistoryClient.getFirstOpTimestamp(
this.projectId
)
expect(result).to.be.null
})
})

View File

@@ -19,14 +19,13 @@ describe('GetChangesInChunkSince', function () {
beforeEach(async function () {
projectId = new ObjectId().toString()
historyId = new ObjectId().toString()
await ProjectHistoryApp.promises.ensureRunning()
await ProjectHistoryApp.ensureRunning()
MockHistoryStore().post('/api/projects').reply(200, {
projectId: historyId,
})
const olProject =
await ProjectHistoryClient.promises.initializeProject(historyId)
const olProject = await ProjectHistoryClient.initializeProject(historyId)
MockWeb()
.get(`/project/${projectId}/details`)
.reply(200, {

View File

@@ -16,7 +16,7 @@ describe('Health Check', function () {
const historyId = new ObjectId().toString()
settings.history.healthCheck = { project_id: projectId }
await ProjectHistoryApp.promises.ensureRunning()
await ProjectHistoryApp.ensureRunning()
MockHistoryStore().post('/api/projects').reply(200, {
projectId: historyId,
@@ -43,7 +43,7 @@ describe('Health Check', function () {
},
})
await ProjectHistoryClient.promises.initializeProject(historyId)
await ProjectHistoryClient.initializeProject(historyId)
})
it('should respond to the health check', async function () {

View File

@@ -12,14 +12,14 @@ const fixture = path => new URL(`../fixtures/${path}`, import.meta.url)
describe('Labels', function () {
beforeEach(async function () {
await ProjectHistoryApp.promises.ensureRunning()
await ProjectHistoryApp.ensureRunning()
this.historyId = new ObjectId().toString()
MockHistoryStore().post('/api/projects').reply(200, {
projectId: this.historyId,
})
const olProject = await ProjectHistoryClient.promises.initializeProject(
const olProject = await ProjectHistoryClient.initializeProject(
this.historyId
)
this.project_id = new ObjectId().toString()

View File

@@ -13,14 +13,14 @@ const fixture = path => new URL(`../fixtures/${path}`, import.meta.url)
describe('LatestSnapshot', function () {
beforeEach(async function () {
await ProjectHistoryApp.promises.ensureRunning()
await ProjectHistoryApp.ensureRunning()
this.historyId = new ObjectId().toString()
MockHistoryStore().post('/api/projects').reply(200, {
projectId: this.historyId,
})
const v1Project = await ProjectHistoryClient.promises.initializeProject(
const v1Project = await ProjectHistoryClient.initializeProject(
this.historyId
)
this.projectId = new ObjectId().toString()

View File

@@ -12,14 +12,14 @@ const fixture = path => new URL(`../fixtures/${path}`, import.meta.url)
describe('ReadSnapshot', function () {
beforeEach(async function () {
await ProjectHistoryApp.promises.ensureRunning()
await ProjectHistoryApp.ensureRunning()
this.historyId = new ObjectId().toString()
MockHistoryStore().post('/api/projects').reply(200, {
projectId: this.historyId,
})
const v1Project = await ProjectHistoryClient.promises.initializeProject(
const v1Project = await ProjectHistoryClient.initializeProject(
this.historyId
)
this.projectId = new ObjectId().toString()

View File

@@ -18,7 +18,7 @@ describe('Retrying failed projects', function () {
beforeEach(async function () {
this.timestamp = new Date()
await ProjectHistoryApp.promises.ensureRunning()
await ProjectHistoryApp.ensureRunning()
this.project_id = new ObjectId().toString()
this.doc_id = new ObjectId().toString()
@@ -47,7 +47,7 @@ describe('Retrying failed projects', function () {
},
},
})
await ProjectHistoryClient.promises.initializeProject(historyId)
await ProjectHistoryClient.initializeProject(historyId)
})
afterEach(function () {
@@ -71,10 +71,7 @@ describe('Retrying failed projects', function () {
meta: { user_id: this.user_id, ts: new Date() },
}
await ProjectHistoryClient.promises.pushRawUpdate(
this.project_id,
update
)
await ProjectHistoryClient.pushRawUpdate(this.project_id, update)
await ProjectHistoryClient.setFailure({
project_id: this.project_id,
attempts: 1,

View File

@@ -15,13 +15,13 @@ describe('Summarized updates', function () {
this.projectId = new ObjectId().toString()
this.historyId = new ObjectId().toString()
await ProjectHistoryApp.promises.ensureRunning()
await ProjectHistoryApp.ensureRunning()
MockHistoryStore().post('/api/projects').reply(200, {
projectId: this.historyId,
})
const olProject = await ProjectHistoryClient.promises.initializeProject(
const olProject = await ProjectHistoryClient.initializeProject(
this.historyId
)

File diff suppressed because it is too large.

View File

@@ -1,50 +1,37 @@
// TODO: This file was created by bulk-decaffeinate.
// Fix any style issues and re-enable lint.
/*
* decaffeinate suggestions:
* DS101: Remove unnecessary use of Array.from
* DS102: Remove unnecessary code created because of implicit returns
* DS205: Consider reworking code to avoid use of IIFEs
* DS207: Consider shorter variations of null checks
* Full docs: https://github.com/decaffeinate/decaffeinate/blob/master/docs/suggestions.md
*/
import { app } from '../../../../app/js/server.js'
import { mongoClient } from '../../../../app/js/mongodb.js'
import { promisify } from '@overleaf/promise-utils'
let running = false
let initing = false
const callbacks = []
let initPromise = null
export function ensureRunning(callback) {
if (callback == null) {
callback = function () {}
}
if (running) {
return callback()
} else if (initing) {
return callbacks.push(callback)
}
initing = true
callbacks.push(callback)
app.listen(3054, '127.0.0.1', error => {
if (error != null) {
throw error
}
// Wait for mongo
mongoClient.connect(error => {
if (error != null) {
throw error
}
running = true
for (callback of Array.from(callbacks)) {
callback()
}
})
})
}
export const promises = {
ensureRunning: promisify(ensureRunning),
}
async function initialize() {
try {
await new Promise((resolve, reject) => {
app.listen(3054, '127.0.0.1', error => {
if (error) return reject(error)
resolve()
})
})
// Wait for mongo
await mongoClient.connect()
running = true
} catch (error) {
initPromise = null
throw error
}
}
export async function ensureRunning() {
if (running) {
return
}
if (initPromise) {
return await initPromise
}
initPromise = initialize()
return await initPromise
}
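
The rewritten ensureRunning memoizes a single initPromise, so concurrent callers share one listen/connect attempt and a failure clears the memo for retry. The same pattern in isolation (ensureReady and start are hypothetical names):

// Sketch only: the memoized-init-promise pattern used by ensureRunning above.
let ready = null
async function ensureReady(init) {
  if (!ready) {
    ready = init().catch(err => {
      ready = null // clear the memo so a later call can retry after failure
      throw err
    })
  }
  return await ready
}
// Concurrent callers share a single init() invocation:
// await Promise.all([ensureReady(start), ensureReady(start), ensureReady(start)])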

View File

@@ -3,7 +3,6 @@ import request from 'request'
import Settings from '@overleaf/settings'
import RedisWrapper from '@overleaf/redis-wrapper'
import { db } from '../../../../app/js/mongodb.js'
import { promisify } from '@overleaf/promise-utils'
import {
fetchJson,
fetchJsonWithResponse,
@@ -19,44 +18,34 @@ export function resetDatabase(callback) {
rclient.flushdb(callback)
}
export function initializeProject(historyId, callback) {
request.post(
{
url: 'http://127.0.0.1:3054/project',
json: { historyId },
},
(error, res, body) => {
if (error) {
return callback(error)
}
expect(res.statusCode).to.equal(200)
callback(null, body.project)
}
)
}
export async function initializeProject(historyId) {
const response = await fetchJsonWithResponse(
'http://127.0.0.1:3054/project',
{
method: 'POST',
json: { historyId },
}
)
expect(response.response.status).to.equal(200)
return response.json.project
}
export function flushProject(projectId, options, callback) {
if (typeof options === 'function') {
callback = options
options = null
}
if (!options) {
options = { allowErrors: false }
}
request.post(
{
url: `http://127.0.0.1:3054/project/${projectId}/flush`,
},
(error, res, body) => {
if (error) {
return callback(error)
}
if (!options.allowErrors) {
expect(res.statusCode).to.equal(204)
}
callback(error, res)
}
)
}
export async function flushProject(projectId, options = {}) {
try {
const response = await fetchNothing(
`http://127.0.0.1:3054/project/${projectId}/flush`,
{ method: 'POST' }
)
if (!options.allowErrors) {
expect(response.status).to.equal(204)
}
return { statusCode: response.status }
} catch (error) {
if (options.allowErrors && error instanceof RequestFailedError) {
return { statusCode: error.response.status }
}
throw error
}
}
export async function getSummarizedUpdates(projectId, query) {
@@ -135,33 +124,29 @@ export async function getSnapshot(projectId, pathname, version, options = {}) {
}
}
export function pushRawUpdate(projectId, update, callback) {
rclient.rpush(
Keys.projectHistoryOps({ project_id: projectId }),
JSON.stringify(update),
callback
)
}
export async function pushRawUpdate(projectId, update) {
await rclient.rpush(
Keys.projectHistoryOps({ project_id: projectId }),
JSON.stringify(update)
)
}
export function setFirstOpTimestamp(projectId, timestamp, callback) {
rclient.set(
Keys.projectHistoryFirstOpTimestamp({ project_id: projectId }),
timestamp,
callback
)
}
export async function setFirstOpTimestamp(projectId, timestamp) {
await rclient.set(
Keys.projectHistoryFirstOpTimestamp({ project_id: projectId }),
timestamp
)
}
export function getFirstOpTimestamp(projectId, callback) {
rclient.get(
Keys.projectHistoryFirstOpTimestamp({ project_id: projectId }),
callback
)
}
export async function getFirstOpTimestamp(projectId) {
return await rclient.get(
Keys.projectHistoryFirstOpTimestamp({ project_id: projectId })
)
}
export function clearFirstOpTimestamp(projectId, callback) {
rclient.del(
Keys.projectHistoryFirstOpTimestamp({ project_id: projectId }),
callback
)
}
export async function clearFirstOpTimestamp(projectId) {
await rclient.del(
Keys.projectHistoryFirstOpTimestamp({ project_id: projectId })
)
}
@@ -179,21 +164,15 @@ export function getQueueCounts(callback) {
)
}
export function resyncHistory(projectId, callback) {
request.post(
{
url: `http://127.0.0.1:3054/project/${projectId}/resync`,
json: true,
body: { origin: { kind: 'test-origin' } },
},
(error, res, body) => {
if (error) {
return callback(error)
}
expect(res.statusCode).to.equal(204)
callback(error)
}
)
}
export async function resyncHistory(projectId) {
const response = await fetchNothing(
`http://127.0.0.1:3054/project/${projectId}/resync`,
{
method: 'POST',
json: { origin: { kind: 'test-origin' } },
}
)
expect(response.status).to.equal(204)
}
export async function createLabel(
@@ -257,11 +236,3 @@ export async function deleteProject(projectId) {
)
expect(response.status).to.equal(204)
}
export const promises = {
initializeProject: promisify(initializeProject),
pushRawUpdate: promisify(pushRawUpdate),
setFirstOpTimestamp: promisify(setFirstOpTimestamp),
getFirstOpTimestamp: promisify(getFirstOpTimestamp),
flushProject: promisify(flushProject),
}
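
One consequence of the flushProject rewrite above: error tolerance now shows up in the return value rather than in callback arguments, with RequestFailedError converted into a { statusCode } result when allowErrors is set. Hypothetical usage in a test (the 500 status is illustrative):

// Normal case: flushProject asserts the 204 internally.
await ProjectHistoryClient.flushProject(projectId)

// Error-tolerant case: a failing request becomes a result object instead of a throw.
const { statusCode } = await ProjectHistoryClient.flushProject(projectId, {
  allowErrors: true,
})
expect(statusCode).to.equal(500) // hypothetical expected failure status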

View File

@@ -1,7 +1,7 @@
const logger = require('@overleaf/logger')
const OError = require('@overleaf/o-error')
const AnalyticsRegistrationSourceHelper = require('./AnalyticsRegistrationSourceHelper')
const SessionManager = require('../../Features/Authentication/SessionManager')
import logger from '@overleaf/logger'
import OError from '@overleaf/o-error'
import AnalyticsRegistrationSourceHelper from './AnalyticsRegistrationSourceHelper.js'
import SessionManager from '../../Features/Authentication/SessionManager.js'
function setSource(medium, source) {
return function (req, res, next) {
@@ -51,7 +51,7 @@ function setInbound() {
}
}
module.exports = {
export default {
setSource,
clearSource,
setInbound,

View File

@@ -2,7 +2,7 @@ import AuthenticationController from './../Authentication/AuthenticationControll
import AnalyticsController from './AnalyticsController.mjs'
import AnalyticsProxy from './AnalyticsProxy.mjs'
import { RateLimiter } from '../../infrastructure/RateLimiter.js'
import RateLimiterMiddleware from '../Security/RateLimiterMiddleware.js'
import RateLimiterMiddleware from '../Security/RateLimiterMiddleware.mjs'
const rateLimiters = {
recordEvent: new RateLimiter('analytics-record-event', {

View File

@@ -654,10 +654,8 @@ function _afterLoginSessionSetup(req, user, callback) {
const _afterLoginSessionSetupAsync = promisify(_afterLoginSessionSetup)
function _loginAsyncHandlers(req, user, anonymousAnalyticsId, isNewUser) {
UserHandler.populateTeamInvites(user, err => {
if (err != null) {
logger.warn({ err }, 'error setting up login data')
}
UserHandler.promises.populateTeamInvites(user).catch(err => {
logger.warn({ err }, 'error setting up login data')
})
LoginRateLimiter.recordSuccessfulLogin(user.email, () => {})
AuthenticationController._recordSuccessfulLogin(user._id, () => {})

View File

@@ -1,6 +1,6 @@
// @ts-check
import { ForbiddenError, UserNotFoundError } from '../Errors/Errors.js'
import PermissionsManager from './PermissionsManager.js'
import PermissionsManager from './PermissionsManager.mjs'
import Modules from '../../infrastructure/Modules.js'
import { expressify } from '@overleaf/promise-utils'
import Features from '../../infrastructure/Features.js'
@@ -9,7 +9,7 @@ import Features from '../../infrastructure/Features.js'
* @typedef {(import('express').Request)} Request
* @typedef {(import('express').Response)} Response
* @typedef {(import('express').NextFunction)} NextFunction
* @typedef {import('./PermissionsManager').Capability} Capability
* @typedef {import('./PermissionsManager.mjs').Capability} Capability
*/
const {

View File

@@ -41,9 +41,12 @@
* }
*/
const { callbackify } = require('util')
const { ForbiddenError } = require('../Errors/Errors')
const Modules = require('../../infrastructure/Modules')
import { callbackify } from 'node:util'
import Errors from '../Errors/Errors.js'
import Modules from '../../infrastructure/Modules.js'
const { ForbiddenError } = Errors
/**
* @typedef {(import('../../../../types/capabilities').Capability)} Capability
@@ -466,7 +469,7 @@ async function checkUserListPermissions(userList, capabilities) {
return true
}
module.exports = {
export default {
validatePolicies,
registerCapability,
registerPolicy,

View File

@@ -3,9 +3,9 @@ import AuthenticationController from '../Authentication/AuthenticationController
import AuthorizationMiddleware from '../Authorization/AuthorizationMiddleware.mjs'
import CollaboratorsInviteController from './CollaboratorsInviteController.mjs'
import { RateLimiter } from '../../infrastructure/RateLimiter.js'
import RateLimiterMiddleware from '../Security/RateLimiterMiddleware.js'
import RateLimiterMiddleware from '../Security/RateLimiterMiddleware.mjs'
import CaptchaMiddleware from '../Captcha/CaptchaMiddleware.mjs'
import AnalyticsRegistrationSourceMiddleware from '../Analytics/AnalyticsRegistrationSourceMiddleware.js'
import AnalyticsRegistrationSourceMiddleware from '../Analytics/AnalyticsRegistrationSourceMiddleware.mjs'
const rateLimiters = {
inviteToProjectByProjectId: new RateLimiter(

View File

@@ -1,33 +1,35 @@
const { callbackify } = require('util')
const { callbackifyMultiResult } = require('@overleaf/promise-utils')
const {
import { callbackify } from 'node:util'
import { callbackifyMultiResult } from '@overleaf/promise-utils'
import {
fetchString,
fetchStringWithResponse,
fetchStream,
RequestFailedError,
} = require('@overleaf/fetch-utils')
const Settings = require('@overleaf/settings')
const ProjectGetter = require('../Project/ProjectGetter')
const ProjectEntityHandler = require('../Project/ProjectEntityHandler')
const logger = require('@overleaf/logger')
const OError = require('@overleaf/o-error')
const { Cookie } = require('tough-cookie')
const ClsiCookieManager = require('./ClsiCookieManager')(
} from '@overleaf/fetch-utils'
import Settings from '@overleaf/settings'
import ProjectGetter from '../Project/ProjectGetter.js'
import ProjectEntityHandler from '../Project/ProjectEntityHandler.js'
import logger from '@overleaf/logger'
import OError from '@overleaf/o-error'
import { Cookie } from 'tough-cookie'
import ClsiCookieManagerFactory from './ClsiCookieManager.js'
import ClsiStateManager from './ClsiStateManager.js'
import _ from 'lodash'
import ClsiFormatChecker from './ClsiFormatChecker.js'
import DocumentUpdaterHandler from '../DocumentUpdater/DocumentUpdaterHandler.js'
import Metrics from '@overleaf/metrics'
import Errors from '../Errors/Errors.js'
import ClsiCacheHandler from './ClsiCacheHandler.js'
import HistoryManager from '../History/HistoryManager.js'
import SplitTestHandler from '../SplitTests/SplitTestHandler.js'
import AnalyticsManager from '../Analytics/AnalyticsManager.js'
const ClsiCookieManager = ClsiCookieManagerFactory(
Settings.apis.clsi?.backendGroupName
)
const NewBackendCloudClsiCookieManager = require('./ClsiCookieManager')(
const NewBackendCloudClsiCookieManager = ClsiCookieManagerFactory(
Settings.apis.clsi_new?.backendGroupName
)
const ClsiStateManager = require('./ClsiStateManager')
const _ = require('lodash')
const ClsiFormatChecker = require('./ClsiFormatChecker')
const DocumentUpdaterHandler = require('../DocumentUpdater/DocumentUpdaterHandler')
const Metrics = require('@overleaf/metrics')
const Errors = require('../Errors/Errors')
const ClsiCacheHandler = require('./ClsiCacheHandler')
const { getFilestoreBlobURL } = require('../History/HistoryManager')
const SplitTestHandler = require('../SplitTests/SplitTestHandler')
const AnalyticsManager = require('../Analytics/AnalyticsManager')
const VALID_COMPILERS = ['pdflatex', 'latex', 'xelatex', 'lualatex']
const OUTPUT_FILE_TIMEOUT_MS = 60000
@@ -843,7 +845,7 @@ function _finaliseRequest(projectId, options, project, docs, files) {
path = path.replace(/^\//, '') // Remove leading /
resources.push({
path,
url: getFilestoreBlobURL(historyId, file.hash),
url: HistoryManager.getFilestoreBlobURL(historyId, file.hash),
modified: file.created?.getTime(),
})
}
@@ -975,7 +977,7 @@ function _getClsiServerIdFromResponse(response) {
return null
}
module.exports = {
export default {
sendRequest: callbackifyMultiResult(sendRequest, [
'status',
'outputFiles',

View File

@@ -5,7 +5,7 @@ import OError from '@overleaf/o-error'
import Metrics from '@overleaf/metrics'
import ProjectGetter from '../Project/ProjectGetter.js'
import CompileManager from './CompileManager.mjs'
import ClsiManager from './ClsiManager.js'
import ClsiManager from './ClsiManager.mjs'
import logger from '@overleaf/logger'
import Settings from '@overleaf/settings'
import Errors from '../Errors/Errors.js'

View File

@@ -4,7 +4,7 @@ import RedisWrapper from '../../infrastructure/RedisWrapper.js'
import ProjectGetter from '../Project/ProjectGetter.js'
import ProjectRootDocManager from '../Project/ProjectRootDocManager.js'
import UserGetter from '../User/UserGetter.js'
import ClsiManager from './ClsiManager.js'
import ClsiManager from './ClsiManager.mjs'
import Metrics from '@overleaf/metrics'
import { RateLimiter } from '../../infrastructure/RateLimiter.js'
import UserAnalyticsIdCache from '../Analytics/UserAnalyticsIdCache.js'

View File

@@ -2,7 +2,7 @@ import EditorHttpController from './EditorHttpController.mjs'
import AuthenticationController from '../Authentication/AuthenticationController.js'
import AuthorizationMiddleware from '../Authorization/AuthorizationMiddleware.mjs'
import { RateLimiter } from '../../infrastructure/RateLimiter.js'
import RateLimiterMiddleware from '../Security/RateLimiterMiddleware.js'
import RateLimiterMiddleware from '../Security/RateLimiterMiddleware.mjs'
const rateLimiters = {
addDocToProject: new RateLimiter('add-doc-to-project', {

View File

@@ -95,13 +95,13 @@ class SAMLAlreadyLinkedError extends OError {
class SAMLEmailNotAffiliatedError extends OError {
get i18nKey() {
return 'institution_account_tried_to_add_not_affiliated'
return 'institution_account_tried_to_add_not_affiliated_2'
}
}
class SAMLEmailAffiliatedWithAnotherInstitutionError extends OError {
get i18nKey() {
return 'institution_account_tried_to_add_affiliated_with_another_institution'
return 'institution_account_tried_to_add_affiliated_with_another_institution_2'
}
}

View File

@@ -4,7 +4,7 @@ import Settings from '@overleaf/settings'
import { RateLimiter } from '../../infrastructure/RateLimiter.js'
import AuthenticationController from '../Authentication/AuthenticationController.js'
import AuthorizationMiddleware from '../Authorization/AuthorizationMiddleware.mjs'
import RateLimiterMiddleware from '../Security/RateLimiterMiddleware.js'
import RateLimiterMiddleware from '../Security/RateLimiterMiddleware.mjs'
import HistoryController from './HistoryController.mjs'
const rateLimiters = {

View File

@@ -1,7 +1,7 @@
import AuthorizationMiddleware from '../Authorization/AuthorizationMiddleware.mjs'
import AuthenticationController from '../Authentication/AuthenticationController.js'
import { RateLimiter } from '../../infrastructure/RateLimiter.js'
import RateLimiterMiddleware from '../Security/RateLimiterMiddleware.js'
import RateLimiterMiddleware from '../Security/RateLimiterMiddleware.mjs'
import LinkedFilesController from './LinkedFilesController.mjs'
const rateLimiters = {

View File

@@ -1,6 +1,6 @@
import AuthorizationManager from '../Authorization/AuthorizationManager.js'
import CompileManager from '../Compile/CompileManager.mjs'
import ClsiManager from '../Compile/ClsiManager.js'
import ClsiManager from '../Compile/ClsiManager.mjs'
import ProjectFileAgent from './ProjectFileAgent.mjs'
import _ from 'lodash'
import LinkedFilesErrors from './LinkedFilesErrors.mjs'

View File

@@ -5,7 +5,7 @@ import OneTimeTokenHandler from '../Security/OneTimeTokenHandler.js'
import EmailHandler from '../Email/EmailHandler.js'
import AuthenticationManager from '../Authentication/AuthenticationManager.js'
import { callbackify, promisify } from 'node:util'
import PermissionsManager from '../Authorization/PermissionsManager.js'
import PermissionsManager from '../Authorization/PermissionsManager.mjs'
const assertUserPermissions = PermissionsManager.promises.assertUserPermissions

View File

@@ -2,7 +2,7 @@ import PasswordResetController from './PasswordResetController.mjs'
import AuthenticationController from '../Authentication/AuthenticationController.js'
import CaptchaMiddleware from '../../Features/Captcha/CaptchaMiddleware.mjs'
import { RateLimiter } from '../../infrastructure/RateLimiter.js'
import RateLimiterMiddleware from '../Security/RateLimiterMiddleware.js'
import RateLimiterMiddleware from '../Security/RateLimiterMiddleware.mjs'
const rateLimiter = new RateLimiter('password_reset_rate_limit', {
points: 6,

Some files were not shown because too many files have changed in this diff.