55 Commits

Author SHA1 Message Date
williamlardier
5cd70d7cf1 ARSN-267: fix failing unit test
Node.js 16.17.0 changed the error handling of TLS sockets: on error, the
connection is now closed before the response is sent. Handling the
ECONNRESET error in the affected test unblocks it until this is fixed by
Node.js, if appropriate.

(cherry picked from commit a237e38c51)
2023-05-25 17:50:00 +00:00
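A minimal sketch of the workaround described above, assuming a plain https client against the server under test; the helper name and callbacks are illustrative, not the actual test code.

```javascript
const https = require('https');

// Hypothetical helper: on Node.js >= 16.17.0 the server may reset the TLS
// socket before the HTTP error response is sent, so the client observes
// ECONNRESET instead of a response. Treat that as the expected outcome.
function requestExpectingReset(options, done) {
    const req = https.request(options, res => {
        // Pre-16.17.0 behavior: an HTTP error response is still received.
        res.resume();
        res.on('end', done);
    });
    req.on('error', err => {
        if (err.code === 'ECONNRESET') {
            return done(); // post-16.17.0 behavior: connection closed early
        }
        return done(err);
    });
    req.end();
}
```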
gaspardmoindrot
e8a409e337 [ARSN-335] Implement GHAS 2023-05-16 21:21:49 +00:00
Nicolas Humbert
7d254a0556 ARSN-105 Disjointed reduced locations 2022-03-15 14:03:54 -04:00
Vianney Rancurel
5f8c92a0a2 ft: ARSN-87 some versioning exports are still missing for Armory 2022-02-18 17:09:27 -08:00
Taylor McKinnon
6861ac477a impr(ARSN-46): Rollback changes 2022-02-14 11:10:36 -08:00
Nicolas Humbert
90d6556229 ARSN-21 update package version 2022-02-07 18:13:46 +01:00
bert-e
f7802650ee Merge branch 'feature/ARSN-21/UpgradeToNode16' into q/7.4 2022-02-07 17:06:51 +00:00
Nicolas Humbert
d0684396b6 S3C-5450 log is not accurate anymore 2022-02-04 10:45:48 +01:00
Naren
9b9a8660d9 bf: ARSN-57 log correct client ip
Check the request header 'x-forwarded-for' if there is no request
configuration.
2022-01-28 17:03:47 -08:00
Ronnie Smith
8c3f304d9b feature: ARSN-21 upgrade to node 16 2022-01-24 14:26:11 -08:00
Ronnie Smith
efb3629eb0 feature: ARSN-54 use a less strict node engine 2022-01-20 15:20:43 -08:00
Ronnie Smith
6733d30439 feature: ARSN-54 revert node 16 2022-01-20 12:18:01 -08:00
bert-e
a1e14fccb1 Merge branch 'improvement/ARSN-21-Upgrade-Node-to-16' into q/7.4 2022-01-20 00:09:23 +00:00
bert-e
030f47a88a Merge branch 'bugfix/ARSN-35/add-http-header-too-large-error' into q/7.4 2022-01-19 00:48:15 +00:00
Taylor McKinnon
fc7711cca2 impr(ARSN-46): Add isAborted flag 2022-01-13 13:51:18 -08:00
Ronnie Smith
3919808d14 feature: ARSN-21 resolve broken tests 2022-01-11 14:18:56 -08:00
Dimitri Bourreau
b1dea67eef tests: ARSN-21 remove timeout 5500 from package.json script test
Signed-off-by: Dimitri Bourreau <contact@dimitribourreau.me>
2021-12-10 02:21:36 +01:00
Dimitri Bourreau
c3196181c1 chore: ARSN-21 add ioctl as optional dependency
Signed-off-by: Dimitri Bourreau <contact@dimitribourreau.me>
2021-12-10 02:20:14 +01:00
Dimitri Bourreau
c24ad4f887 chore: ARSN-21 remove ioctl
Signed-off-by: Dimitri Bourreau <contact@dimitribourreau.me>
2021-12-10 02:15:33 +01:00
Dimitri Bourreau
ad1c623c80 chore: ARSN-21 GitHub Actions run unit tests without --silent
Signed-off-by: Dimitri Bourreau <contact@dimitribourreau.me>
2021-12-10 02:14:08 +01:00
Dimitri Bourreau
9d81cad0aa tests: ARSN-21 update ws._server.connections with _connections
Signed-off-by: Dimitri Bourreau <contact@dimitribourreau.me>
2021-12-10 02:03:12 +01:00
Dimitri Bourreau
5f72738b7f improvement: ARSN-21 upgrade uuid from 3.3.2 to 3.4.0
Signed-off-by: Dimitri Bourreau <contact@dimitribourreau.me>
2021-12-09 00:38:07 +01:00
Dimitri Bourreau
70278f86ab improvement: ARSN-21 upgrade dependencies with yarn upgrade-interactive
Signed-off-by: Dimitri Bourreau <contact@dimitribourreau.me>
2021-12-07 14:35:33 +01:00
Dimitri Bourreau
083dd7454a improvement: ARSN-21 GitHub Actions should use Node 16 instead of 10
Signed-off-by: Dimitri Bourreau <contact@dimitribourreau.me>
2021-12-07 11:50:16 +01:00
Jonathan Gramain
5ce057a498 ARSN-42 bump version to 7.4.13 2021-11-18 18:19:59 -08:00
Jonathan Gramain
8c3f88e233 improvement: ARSN-42 get/set ObjectMD.nullUploadId
Add getNullUploadId/setNullUploadId helpers to ObjectMD, to store the
null version uploadId, so that it can be passed to the metadata layer
as "replayId" when deleting the null version from another master key
2021-11-18 14:16:19 -08:00
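A minimal sketch of the new helpers, assuming arsenal's exported ObjectMD model; the upload ID below is a placeholder.

```javascript
const { ObjectMD } = require('arsenal').models;

// Store the upload ID of the MPU that created the null version...
const md = new ObjectMD().setNullUploadId('hypotheticalUploadId');

// ...so it can later be passed to the metadata layer as the "replayId"
// when deleting the null version from another master key:
const replayId = md.getNullUploadId(); // 'hypotheticalUploadId'
```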
Jonathan Gramain
04581abbf6 ARSN-38 bump arsenal version 2021-11-03 15:45:30 -07:00
Jonathan Gramain
abfbe90a57 feature: ARSN-38 introduce replay prefix hidden in listings
- Add a new DB prefix for replay keys, similar to existing v1 vformat
  prefixes

- Hide this prefix for v0 listing algos DelimiterMaster and
  DelimiterVersions: skip keys beginning with this prefix, and update
  the "skipping" value to be able to skip the entire prefix after the
  streak length is reached (similar to how regular prefixes are
  skipped; a sketch follows this entry)

- fix an existing unit test in DelimiterVersions
2021-11-02 12:01:28 -07:00
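A minimal sketch of the replay-prefix skipping described above, assuming the `\x7fR` prefix from the constants diff below; the sentinel values are stand-ins for arsenal's FILTER_ACCEPT/FILTER_SKIP/SKIP_NONE.

```javascript
const REPLAY_PREFIX = '\x7fR'; // DbPrefixes.Replay in the constants diff
const FILTER_ACCEPT = 1;       // assumed sentinel values
const FILTER_SKIP = 0;
const SKIP_NONE = undefined;

class ReplayAwareFilter {
    constructor() { this.inReplayPrefix = false; }
    filter(key) {
        if (key.startsWith(REPLAY_PREFIX)) {
            this.inReplayPrefix = true;
            return FILTER_SKIP; // increments the caller's skip streak
        }
        this.inReplayPrefix = false;
        return FILTER_ACCEPT;
    }
    // Once the streak threshold is reached, the caller seeks past the
    // value returned here, i.e. past the entire replay prefix.
    skipping() {
        return this.inReplayPrefix ? REPLAY_PREFIX : SKIP_NONE;
    }
}
```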
Jonathan Gramain
b1c9474159 feature: ARSN-37 ObjectMD getUploadId/setUploadId
Add getter/setter for the "uploadId" field, used for MPUs in progress.
2021-11-01 17:25:57 -07:00
Ilke
8e8d771a64 bugfix: ARSN-35 add http header too large error 2021-10-29 20:17:42 -07:00
Rahul Padigela
f941132c8a chore: update version 2021-10-26 14:47:21 -07:00
Rahul Padigela
2246a9fbdc bugfix: ARSN-31 return empty string for invalid requests
This returns an empty string for invalid encoding requests, for example
when duplicate query params in an HTTP URL are parsed by the Node.js
HTTP parser, which converts them into an Array and breaks the encoding
method.
2021-10-25 16:59:09 -07:00
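A short illustration of the failure mode, a sketch rather than the actual arsenal code: Node.js turns duplicate query params into an Array, which a string-only encoder must guard against.

```javascript
const { parse } = require('querystring');

const query = parse('prefix=a&prefix=b');
console.log(query.prefix); // [ 'a', 'b' ] -- an Array, not a string

// Guard in the spirit of the awsURIencode fix shown in the diff below:
function encodeSafe(input) {
    if (typeof input !== 'string') {
        return ''; // invalid request, duplicate params are unsupported
    }
    return encodeURIComponent(input);
}
console.log(encodeSafe(query.prefix)); // '' instead of a crash
```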
Rahul Padigela
86270d8495 test: test for invalid type for encoding strings 2021-10-25 16:59:03 -07:00
Thomas Carmet
4b08dd5263 ARSN-20 migrate to github actions
Co-authored-by: Ronnie <halfpint1170@gmail.com>
2021-09-23 11:37:04 -07:00
Thomas Carmet
36f6ca47e9 ARSN-17 align package.json with releases 2021-08-31 09:55:21 -07:00
Jonathan Gramain
c495ecacb0 feature: ARSN-12 bump arsenal version
Needed to ensure proper dependency update in Vault
2021-08-26 14:21:10 -07:00
anurag4DSB
8603ca5b99 feature: ARSN-12-introduce-cond-put-op
(cherry picked from commit f101a0f3a0)
2021-08-25 23:03:58 +02:00
Thomas Carmet
ef6197250c ARSN-11 update werelogs to tagged version 2021-08-12 10:03:26 -07:00
Ronnie Smith
836c65e91e bugfix: S3C-3810 Skip headers on 304 response 2021-07-30 15:24:31 -07:00
bert-e
ffbe46edfb Merge branch 'bugfix/S3C-4257_StartSeqCanBeNull' into q/7.4 2021-06-08 08:18:01 +00:00
Ronnie Smith
3ed07317e5 bugfix: S3C-4257 Start Seq can be null
* Return undefined if start seq is falsy
2021-06-07 19:49:13 -07:00
bert-e
0487a18623 Merge branch 'improvement/S3C-4336_add_BucketInfoModelVersion' into q/7.4 2021-05-10 20:18:35 +00:00
Taylor McKinnon
a4ccb94978 impr(S3C-4336): Add BucketInfoModelVersion.md from cloudserver 2021-05-10 13:01:46 -07:00
Ronnie Smith
3098fcf1e1 feature: S3C-4073 Add probe server to index 2021-05-06 21:16:48 -07:00
Ronnie Smith
41b3babc69 feature: S3C-4073 Add new probe server
* JsDocs for arsenal error
* ProbeServer as a replacement for HealthProbeServer
2021-04-30 12:53:38 -07:00
bert-e
403d9b5a08 Merge branch 'bugfix/S3C-4275-versionListingWithDelimiterInefficiency' into q/7.4 2021-04-14 01:17:37 +00:00
Jonathan Gramain
ecaf9f843a bugfix: S3C-4275 enable skip-scan for DelimiterVersions with a delimiter
Enable the skip-scan optimization to work for DelimiterVersions
listing algorithm when used with a delimiter.

For this to work, instead of returning FILTER_ACCEPT when encountering
a version that matches the master key (which resets the skip-scan
counter), return FILTER_SKIP to let the skip-scan counter increment
and eventually skip the entire listed common prefix after 100 entries.
2021-04-09 16:33:50 -07:00
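A sketch of the skip-scan idea under assumed interfaces (the iterator, threshold, and sentinel are illustrative): returning FILTER_SKIP lets the caller count consecutive skipped entries and, past a threshold, issue a single range seek past the whole common prefix.

```javascript
const FILTER_SKIP = 0;   // assumed sentinel
const MAX_STREAK = 100;  // threshold mentioned in the commit message

function listWithSkipScan(iterator, listingAlgo) {
    let streak = 0;
    for (let entry = iterator.next(); entry; entry = iterator.next()) {
        if (listingAlgo.filter(entry) === FILTER_SKIP) {
            streak += 1;
            if (streak >= MAX_STREAK) {
                // one seek replaces scanning the rest of the prefix
                iterator.seek(listingAlgo.skipping());
                streak = 0;
            }
        } else {
            streak = 0;
        }
    }
}
```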
Jonathan Gramain
3506fd9f4e bugfix: S3C-4275 more DelimiterVersions unit tests
Increase coverage for the DelimiterVersions listing algorithm to have it
on par with DelimiterMaster before attempting a fix: most existing
tests from DelimiterMaster have been copied and adapted to fit the
DelimiterVersions logic.
2021-04-09 16:32:15 -07:00
Ronnie Smith
d533bc4e0f Merge branch 'development/7.4' into feature/S3C-4262_BackportZenkoMetrics 2021-04-06 02:41:34 -07:00
Jonathan Gramain
c6976e996e build(deps-dev): Bump mocha from 2.5.3 to 8.0.1
Clean remaining references in a few test suites so that mocha does not
hang after tests complete, since mocha 4+ no longer forces exit when
there are active references.

Ref: https://boneskull.com/mocha-v4-nears-release/#mochawontforceexit
2021-04-02 11:48:27 -07:00
Ronnie Smith
1584c4acb1 feature S3C-4262 Backport zenko metrics 2021-04-01 20:03:39 -07:00
dependabot[bot]
f1345ec2ed build(deps-dev): Bump mocha from 2.5.3 to 8.0.1
Bumps [mocha](https://github.com/mochajs/mocha) from 2.5.3 to 8.0.1.
- [Release notes](https://github.com/mochajs/mocha/releases)
- [Changelog](https://github.com/mochajs/mocha/blob/master/CHANGELOG.md)
- [Commits](https://github.com/mochajs/mocha/compare/v2.5.3...v8.0.1)

Signed-off-by: dependabot[bot] <support@github.com>
2021-03-30 15:55:18 -07:00
alexandre merle
f17006b91e bugfix: S3C-3962: considering zero size as valid in stream response 2021-02-09 13:44:05 +01:00
alexandre merle
b3080e9ac6 S3C-3904: match api method with real aws s3 api call 2021-02-05 18:36:48 +01:00
alexandre merle
9484366844 bugfix: S3C-3904: better-s3-action-logs
Introduce a map meant to override default
actionMap values for S3; it will be used in logs
to monitor the S3 actions instead of the IAM
permissions needed for each action
2021-02-05 02:09:08 +01:00
44 changed files with 3029 additions and 555 deletions

.github/workflows/codeql.yaml

@@ -0,0 +1,25 @@
---
name: codeQL
on:
push:
branches: [development/*, stabilization/*, hotfix/*]
pull_request:
branches: [development/*, stabilization/*, hotfix/*]
workflow_dispatch:
jobs:
analyze:
name: Static analysis with CodeQL
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v3
- name: Initialize CodeQL
uses: github/codeql-action/init@v2
with:
languages: javascript, typescript
- name: Build and analyze
uses: github/codeql-action/analyze@v2


@@ -0,0 +1,16 @@
---
name: dependency review
on:
pull_request:
branches: [development/*, stabilization/*, hotfix/*]
jobs:
dependency-review:
runs-on: ubuntu-latest
steps:
- name: 'Checkout Repository'
uses: actions/checkout@v3
- name: 'Dependency Review'
uses: actions/dependency-review-action@v3

.github/workflows/tests.yaml

@@ -0,0 +1,47 @@
---
name: tests
on:
push:
branches-ignore:
- 'development/**'
jobs:
test:
runs-on: ubuntu-latest
services:
# Label used to access the service container
redis:
# Docker Hub image
image: redis
# Set health checks to wait until redis has started
options: >-
--health-cmd "redis-cli ping"
--health-interval 10s
--health-timeout 5s
--health-retries 5
ports:
# Maps port 6379 on service container to the host
- 6379:6379
steps:
- name: Checkout
uses: actions/checkout@v2
- uses: actions/setup-node@v2
with:
node-version: '16'
cache: 'yarn'
- name: install dependencies
run: yarn install --frozen-lockfile
- name: lint yaml
run: yarn --silent lint_yml
- name: lint javascript
run: yarn --silent lint -- --max-warnings 0
- name: lint markdown
run: yarn --silent lint_md
- name: run unit tests
run: yarn test
- name: run functional tests
run: yarn ft_test
- name: run executables tests
run: yarn install && yarn test
working-directory: 'lib/executables/pensieveCreds/'


@@ -0,0 +1,82 @@
# BucketInfo Model Version History
## Model Version 0/1
### Properties
```javascript
this._acl = aclInstance;
this._name = name;
this._owner = owner;
this._ownerDisplayName = ownerDisplayName;
this._creationDate = creationDate;
```
### Usage
No explicit references in the code, since the mdBucketModelVersion
property was not added until Model Version 2
## Model Version 2
### Properties Added
```javascript
this._mdBucketModelVersion = mdBucketModelVersion || 0;
this._transient = transient || false;
this._deleted = deleted || false;
```
### Usage
Used to determine which splitter to use (< 2 means old splitter)
## Model Version 3
### Properties Added
```javascript
this._serverSideEncryption = serverSideEncryption || null;
```
### Usage
Used to store the bucket's server-side encryption info
## Model Version 4
### Properties Added
```javascript
this._locationConstraint = locationConstraint || null;
```
### Usage
Used to store the location constraint of the bucket
## Model Version 5
### Properties Added
```javascript
this._websiteConfiguration = websiteConfiguration || null;
this._cors = cors || null;
```
### Usage
Used to store the bucket website configuration info
and to store CORS rules to apply to cross-domain requests
## Model Version 6
### Properties Added
```javascript
this._lifecycleConfiguration = lifecycleConfiguration || null;
```
### Usage
Used to store the bucket lifecycle configuration info


@@ -56,6 +56,10 @@
"code": 400,
"description": "The provided token has expired."
},
"HttpHeadersTooLarge": {
"code": 400,
"description": "Your http headers exceed the maximum allowed http headers size."
},
"IllegalVersioningConfigurationException": {
"code": 400,
"description": "Indicates that the versioning configuration specified in the request is invalid."


@@ -1,43 +0,0 @@
---
version: 0.2
branches:
default:
stage: pre-merge
stages:
pre-merge:
worker: &master-worker
type: docker
path: eve/workers/master
volumes:
- '/home/eve/workspace'
steps:
- Git:
name: fetch source
repourl: '%(prop:git_reference)s'
shallow: True
retryFetch: True
haltOnFailure: True
- ShellCommand:
name: install dependencies
command: yarn install --frozen-lockfile
- ShellCommand:
name: run lint yml
command: yarn run --silent lint_yml
- ShellCommand:
name: run lint
command: yarn run --silent lint -- --max-warnings 0
- ShellCommand:
name: run lint_md
command: yarn run --silent lint_md
- ShellCommand:
name: run test
command: yarn run --silent test
- ShellCommand:
name: run ft_test
command: yarn run ft_test
- ShellCommand:
name: run executables tests
command: yarn install && yarn test
workdir: '%(prop:builddir)s/build/lib/executables/pensieveCreds/'


@@ -1,57 +0,0 @@
FROM ubuntu:trusty
#
# Install apt packages needed by the buildchain
#
ENV LANG C.UTF-8
COPY buildbot_worker_packages.list arsenal_packages.list /tmp/
RUN apt-get update -q && apt-get -qy install curl apt-transport-https \
&& apt-get install -qy software-properties-common python-software-properties \
&& curl --silent https://deb.nodesource.com/gpgkey/nodesource.gpg.key | apt-key add - \
&& echo "deb https://deb.nodesource.com/node_10.x trusty main" > /etc/apt/sources.list.d/nodesource.list \
&& curl -sS http://dl.yarnpkg.com/debian/pubkey.gpg | apt-key add - \
&& echo "deb http://dl.yarnpkg.com/debian/ stable main" | tee /etc/apt/sources.list.d/yarn.list \
&& add-apt-repository ppa:ubuntu-toolchain-r/test \
&& apt-get update -q \
&& cat /tmp/buildbot_worker_packages.list | xargs apt-get install -qy \
&& cat /tmp/arsenal_packages.list | xargs apt-get install -qy \
&& pip install pip==9.0.1 \
&& rm -rf /var/lib/apt/lists/* \
&& rm -f /tmp/*_packages.list
#
# Install useful nodejs dependencies
#
RUN yarn global add mocha
#
# Add user eve
#
RUN adduser -u 1042 --home /home/eve --disabled-password --gecos "" eve \
&& adduser eve sudo \
&& sed -ri 's/(%sudo.*)ALL$/\1NOPASSWD:ALL/' /etc/sudoers
#
# Run buildbot-worker on startup
#
ARG BUILDBOT_VERSION=0.9.12
RUN pip install yamllint
RUN pip install buildbot-worker==$BUILDBOT_VERSION
USER eve
ENV HOME /home/eve
#
# Setup nodejs environment
#
ENV CXX=g++-4.9
ENV LANG C.UTF-8
WORKDIR /home/eve/workspace
CMD buildbot-worker create-worker . "$BUILDMASTER:$BUILDMASTER_PORT" "$WORKERNAME" "$WORKERPASS" \
&& sudo service redis-server start \
&& buildbot-worker start --nodaemon


@@ -1,4 +0,0 @@
nodejs
redis-server
g++-4.9
yarn


@@ -1,9 +0,0 @@
ca-certificates
git
libffi-dev
libssl-dev
python2.7
python2.7-dev
python-pip
software-properties-common
sudo


@@ -49,6 +49,9 @@ module.exports = {
.VersioningConstants,
Version: require('./lib/versioning/Version.js').Version,
VersionID: require('./lib/versioning/VersionID.js'),
WriteGatheringManager: require('./lib/versioning/WriteGatheringManager.js'),
WriteCache: require('./lib/versioning/WriteCache.js'),
VersioningRequestProcessor: require('./lib/versioning/VersioningRequestProcessor.js'),
},
network: {
http: {
@@ -60,6 +63,9 @@ module.exports = {
RESTServer: require('./lib/network/rest/RESTServer'),
RESTClient: require('./lib/network/rest/RESTClient'),
},
probe: {
ProbeServer: require('./lib/network/probe/ProbeServer'),
},
RoundRobin: require('./lib/network/RoundRobin'),
},
s3routes: {
@@ -118,6 +124,7 @@ module.exports = {
StatsClient: require('./lib/metrics/StatsClient'),
StatsModel: require('./lib/metrics/StatsModel'),
RedisClient: require('./lib/metrics/RedisClient'),
ZenkoMetrics: require('./lib/metrics/ZenkoMetrics'),
},
pensieve: {
credentialUtils: require('./lib/executables/pensieveCreds/utils'),


@@ -32,6 +32,7 @@ class DelimiterMaster extends Delimiter {
// non-PHD master version or a version whose master is a PHD version
this.prvKey = undefined;
this.prvPHDKey = undefined;
this.inReplayPrefix = false;
Object.assign(this, {
[BucketVersioningKeyFormat.v0]: {
@@ -61,6 +62,12 @@ class DelimiterMaster extends Delimiter {
let key = obj.key;
const value = obj.value;
if (key.startsWith(DbPrefixes.Replay)) {
this.inReplayPrefix = true;
return FILTER_SKIP;
}
this.inReplayPrefix = false;
/* Skip keys not starting with the prefix or not alphabetically
* ordered. */
if ((this.prefix && !key.startsWith(this.prefix))
@@ -80,13 +87,13 @@ class DelimiterMaster extends Delimiter {
* when a listing page ends on an accepted obj and the next page
* starts with a version of this object.
* In that case prvKey is default set to undefined
* in the constructor) and comparing to NextMarker is the only
* in the constructor and comparing to NextMarker is the only
* way to know we should not accept this version. This test is
* not redundant with the one at the beginning of this function,
* we are comparing here the key without the version suffix,
* - key startsWith the previous NextMarker happens because we set
* NextMarker to the common prefix instead of the whole key
* value. (TODO: remove this test once ZENKO-1048 is fixed. ).
* value. (TODO: remove this test once ZENKO-1048 is fixed)
* */
if (key === this.prvKey || key === this[this.nextContinueMarker] ||
(this.delimiter &&
@@ -155,7 +162,7 @@ class DelimiterMaster extends Delimiter {
return super.filter(obj);
}
skippingV0() {
skippingBase() {
if (this[this.nextContinueMarker]) {
// next marker or next continuation token:
// - foo/ : skipping foo/
@@ -170,8 +177,15 @@ class DelimiterMaster extends Delimiter {
return SKIP_NONE;
}
skippingV0() {
if (this.inReplayPrefix) {
return DbPrefixes.Replay;
}
return this.skippingBase();
}
skippingV1() {
const skipTo = this.skippingV0();
const skipTo = this.skippingBase();
if (skipTo === SKIP_NONE) {
return SKIP_NONE;
}


@@ -33,6 +33,7 @@ class DelimiterVersions extends Delimiter {
// listing results
this.NextMarker = parameters.keyMarker;
this.NextVersionIdMarker = undefined;
this.inReplayPrefix = false;
Object.assign(this, {
[BucketVersioningKeyFormat.v0]: {
@@ -163,8 +164,15 @@ class DelimiterVersions extends Delimiter {
* @return {number} - indicates if iteration should continue
*/
filterV0(obj) {
if (obj.key.startsWith(DbPrefixes.Replay)) {
this.inReplayPrefix = true;
return FILTER_SKIP;
}
this.inReplayPrefix = false;
if (Version.isPHD(obj.value)) {
return FILTER_ACCEPT; // trick repd to not increase its streak
// return accept to avoid skipping the next values in range
return FILTER_ACCEPT;
}
return this.filterCommon(obj.key, obj.value);
}
@@ -205,8 +213,9 @@ class DelimiterVersions extends Delimiter {
} else {
nonversionedKey = key.slice(0, versionIdIndex);
versionId = key.slice(versionIdIndex + 1);
// skip a version key if it is the master version
if (this.masterKey === nonversionedKey && this.masterVersionId === versionId) {
return FILTER_ACCEPT; // trick repd to not increase its streak
return FILTER_SKIP;
}
this.masterKey = undefined;
this.masterVersionId = undefined;
@@ -222,6 +231,9 @@ class DelimiterVersions extends Delimiter {
}
skippingV0() {
if (this.inReplayPrefix) {
return DbPrefixes.Replay;
}
if (this.NextMarker) {
const index = this.NextMarker.lastIndexOf(this.delimiter);
if (index === this.NextMarker.length - 1) {


@@ -35,6 +35,13 @@ function _toHexUTF8(char) {
function awsURIencode(input, encodeSlash, noEncodeStar) {
const encSlash = encodeSlash === undefined ? true : encodeSlash;
let encoded = '';
/**
* Duplicate query params are not supported by AWS S3 APIs. These params
* are parsed as Arrays by the Node.js HTTP parser, which breaks this method
*/
if (typeof input !== 'string') {
return encoded;
}
for (let i = 0; i < input.length; i++) {
const ch = input.charAt(i);
if ((ch >= 'A' && ch <= 'Z') ||


@@ -40,6 +40,7 @@ class IndexTransaction {
this.operations = [];
this.db = db;
this.closed = false;
this.conditions = [];
}
/**
@@ -118,6 +119,35 @@ class IndexTransaction {
this.push({ type: 'del', key });
}
/**
* Adds a condition for the transaction
*
* @argument {object} condition an object with the following attributes:
* {
* <condition>: the object key
* }
* example: { notExists: 'key1' }
*
* @throws {Error} an error described by the following properties
* - pushOnCommittedTransaction if already committed
* - missingCondition if the condition is empty
*
* @returns {undefined}
*/
addCondition(condition) {
if (this.closed) {
throw propError('pushOnCommittedTransaction',
'can not add conditions to already committed transaction');
}
if (condition === undefined || Object.keys(condition).length === 0) {
throw propError('missingCondition', 'missing condition for conditional put');
}
if (typeof (condition.notExists) !== 'string') {
throw propError('unsupportedConditionalOperation', 'missing key or supported condition');
}
this.conditions.push(condition);
}
/**
* Applies the queued updates in this transaction atomically.
*
@@ -138,6 +168,7 @@ class IndexTransaction {
}
this.closed = true;
writeOptions.conditions = this.conditions;
// The array-of-operations variant of the `batch` method
// allows passing options such as `sync: true` whereas the

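Hypothetical usage of the conditional-put API added above; the db handle and keys are placeholders.

```javascript
// Queue a put that only succeeds if 'key1' does not exist yet; only the
// notExists condition is supported by addCondition.
const transaction = new IndexTransaction(db);
transaction.addCondition({ notExists: 'key1' });
transaction.put('key1', 'value1');
transaction.commit(err => {
    if (err) {
        // e.g. the condition failed because 'key1' already exists
    }
});
```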

@@ -1,13 +1,65 @@
'use strict'; // eslint-disable-line strict
/**
* ArsenalError
*
* @extends {Error}
*/
class ArsenalError extends Error {
/**
* constructor.
*
* @param {string} type - Type of error or message
* @param {number} code - HTTP status code
* @param {string} desc - Verbose description of error
*/
constructor(type, code, desc) {
super(type);
/**
* HTTP status code of error
* @type {number}
*/
this.code = code;
/**
* Description of error
* @type {string}
*/
this.description = desc;
this[type] = true;
}
/**
* Output the error as a JSON string
* @returns {string} Error as JSON string
*/
toString() {
return JSON.stringify({
errorType: this.message,
errorMessage: this.description,
});
}
/**
* Write the error in an HTTP response
*
* @param { http.ServerResponse } res - Response we are responding to
* @returns {undefined}
*/
writeResponse(res) {
res.writeHead(this.code);
res.end(this.toString());
}
/**
* customizeDescription returns a new ArsenalError with a new description
* with the same HTTP code and message.
*
* @param {string} description - New error description
* @returns {ArsenalError} New error
*/
customizeDescription(description) {
return new ArsenalError(this.message, this.code, description);
}

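Illustrative usage of the documented class through arsenal's errors map, with the error name taken from the errors JSON shown earlier in this diff.

```javascript
const errors = require('arsenal').errors;

const err = errors.HttpHeadersTooLarge
    .customizeDescription('headers exceeded the configured limit');
console.log(err.code);       // 400
console.log(err.toString()); // {"errorType":"HttpHeadersTooLarge",...}
// err.writeResponse(res) would send the code plus the JSON body.
```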

@@ -153,6 +153,10 @@ class RedisClient {
clear(cb) {
return this._client.flushdb(cb);
}
disconnect() {
this._client.disconnect();
}
}
module.exports = RedisClient;


@@ -0,0 +1,40 @@
const promClient = require('prom-client');
const collectDefaultMetricsIntervalMs =
process.env.COLLECT_DEFAULT_METRICS_INTERVAL_MS !== undefined ?
Number.parseInt(process.env.COLLECT_DEFAULT_METRICS_INTERVAL_MS, 10) :
10000;
promClient.collectDefaultMetrics({ timeout: collectDefaultMetricsIntervalMs });
class ZenkoMetrics {
static createCounter(params) {
return new promClient.Counter(params);
}
static createGauge(params) {
return new promClient.Gauge(params);
}
static createHistogram(params) {
return new promClient.Histogram(params);
}
static createSummary(params) {
return new promClient.Summary(params);
}
static getMetric(name) {
return promClient.register.getSingleMetric(name);
}
static asPrometheus() {
return promClient.register.metrics();
}
static asPrometheusContentType() {
return promClient.register.contentType;
}
}
module.exports = ZenkoMetrics;

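A sketch of using the wrapper above; the metric name and label are examples, not existing Zenko metrics.

```javascript
const ZenkoMetrics = require('arsenal').metrics.ZenkoMetrics;

const httpRequests = ZenkoMetrics.createCounter({
    name: 'example_http_requests_total',
    help: 'Total number of HTTP requests',
    labelNames: ['method'],
});
httpRequests.inc({ method: 'GET' });

// A scrape endpoint would typically respond with:
//   Content-Type: ZenkoMetrics.asPrometheusContentType()
//   body:         ZenkoMetrics.asPrometheus()
```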

@@ -110,8 +110,10 @@ class ObjectMD {
// should be undefined when not set explicitly
'isNull': undefined,
'nullVersionId': undefined,
'nullUploadId': undefined,
'isDeleteMarker': undefined,
'versionId': undefined,
'uploadId': undefined,
'tags': {},
'replicationInfo': {
status: '',
@@ -572,8 +574,12 @@ class ObjectMD {
const locations = this.getLocation();
const reducedLocations = [];
let partTotal = 0;
let start;
for (let i = 0; i < locations.length; i++) {
const currPart = new ObjectMDLocation(locations[i]);
if (i === 0) {
start = currPart.getPartStart();
}
const currPartNum = currPart.getPartNumber();
let nextPartNum = undefined;
if (i < locations.length - 1) {
@@ -583,7 +589,9 @@ class ObjectMD {
partTotal += currPart.getPartSize();
if (currPartNum !== nextPartNum) {
currPart.setPartSize(partTotal);
currPart.setPartStart(start);
reducedLocations.push(currPart.getValue());
start += partTotal;
partTotal = 0;
}
}
@@ -630,6 +638,27 @@ class ObjectMD {
return this._data.nullVersionId;
}
/**
* Set metadata nullUploadId value
*
* @param {string} nullUploadId - The upload ID used to complete
* the MPU of the null version
* @return {ObjectMD} itself
*/
setNullUploadId(nullUploadId) {
this._data.nullUploadId = nullUploadId;
return this;
}
/**
* Get metadata nullUploadId value
*
* @return {string|undefined} The object nullUploadId
*/
getNullUploadId() {
return this._data.nullUploadId;
}
/**
* Set metadata isDeleteMarker value
*
@@ -680,6 +709,26 @@ class ObjectMD {
return VersionIDUtils.encode(this.getVersionId());
}
/**
* Set metadata uploadId value
*
* @param {string} uploadId - The upload ID used to complete the MPU object
* @return {ObjectMD} itself
*/
setUploadId(uploadId) {
this._data.uploadId = uploadId;
return this;
}
/**
* Get metadata uploadId value
*
* @return {string|undefined} The object uploadId
*/
getUploadId() {
return this._data.uploadId;
}
/**
* Set tags
*

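A small worked example of the part-start fix above, with made-up locations: contiguous sub-parts of part 1 merge into one entry that keeps the start offset of its first sub-part, and part 2 starts where part 1 ends.

```javascript
const { ObjectMD } = require('arsenal').models;

const md = new ObjectMD().setLocation([
    { key: 'a', size: 2, start: 0, dataStoreName: 'ds', dataStoreETag: '1:e1' },
    { key: 'b', size: 3, start: 2, dataStoreName: 'ds', dataStoreETag: '1:e1' },
    { key: 'c', size: 4, start: 5, dataStoreName: 'ds', dataStoreETag: '2:e2' },
]);
// Expected: two reduced parts, sizes 5 and 4, starting at 0 and 5.
console.log(md.getReducedLocations());
```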

@@ -55,6 +55,11 @@ class ObjectMDLocation {
return this._data.start;
}
setPartStart(start) {
this._data.start = start;
return this;
}
getPartSize() {
return this._data.size;
}


@@ -8,6 +8,7 @@ const ciphers = require('../../https/ciphers').ciphers;
const errors = require('../../errors');
const { checkSupportIPv6 } = require('./utils');
class Server {
/**
@@ -186,7 +187,7 @@ class Server {
* Function called when no handler specified in the server
*
* @param {http.IncomingMessage|https.IncomingMessage} req - Request object
* @param {http.ServerResponse|https.ServerResponse} res - Response object
* @param {http.ServerResponse} res - Response object
* @return {undefined}
*/
_noHandlerCb(req, res) {
@@ -203,8 +204,8 @@ class Server {
/**
* Function called when request received
*
* @param {http.IncomingMessage|https.IncomingMessage} req - Request object
* @param {http.ServerResponse|https.ServerResponse} res - Response object
* @param {http.IncomingMessage} req - Request object
* @param {http.ServerResponse} res - Response object
* @return {undefined}
*/
_onRequest(req, res) {
@@ -375,7 +376,7 @@ class Server {
* where the value is not 100-continue
*
* @param {http.IncomingMessage|https.IncomingMessage} req - Request object
* @param {http.ServerResponse|https.ServerResponse} res - Response object
* @param {http.ServerResponse} res - Response object
* @return {undefined}
*/
_onCheckExpectation(req, res) {
@@ -387,7 +388,7 @@ class Server {
* is received
*
* @param {http.IncomingMessage|https.IncomingMessage} req - Request object
* @param {http.ServerResponse|https.ServerResponse} res - Response object
* @param {http.ServerResponse} res - Response object
* @return {undefined}
*/
_onCheckContinue(req, res) {


@@ -0,0 +1,109 @@
const httpServer = require('../http/server');
const werelogs = require('werelogs');
const errors = require('../../errors');
const DEFAULT_LIVE_ROUTE = '/_/live';
const DEFAULT_READY_ROUTE = '/_/ready';
const DEFAULT_METRICS_ROUTE = '/_/metrics';
/**
* ProbeDelegate is used to determine if a probe is successful or
* if any errors are present.
* If everything is working as intended, it is a no-op.
* Otherwise, return a string representing what is failing.
* @callback ProbeDelegate
* @param { import('http').ServerResponse } res - HTTP response for writing
* @param {werelogs.Logger} log - Werelogs instance for logging if you choose to
* @return {(string|undefined)} String representing issues to report. An empty
* string or undefined is used to represent no issues.
*/
/**
* @typedef {Object} ProbeServerParams
* @property {number} port - Port to run server on
* @property {string} [bindAddress] - Address to bind to, defaults to localhost
*/
/**
* ProbeServer is a generic server for handling probe checks or other
* generic responses.
*
* @extends {httpServer}
*/
class ProbeServer extends httpServer {
/**
* Create a new ProbeServer with parameters
*
* @param {ProbeServerParams} params - Parameters for server
*/
constructor(params) {
const logging = new werelogs.Logger('ProbeServer');
super(params.port, logging);
this.logging = logging;
this.setBindAddress(params.bindAddress || 'localhost');
// hooking our request processing function by calling the
// parent's method for that
this.onRequest(this._onRequest);
/**
* Map of routes to callback methods
* @type {Map<string, ProbeDelegate>}
*/
this._handlers = new Map();
}
/**
* Add request handler at the path
*
* @example <caption>If service is not connected</caption>
* addHandler(DEFAULT_LIVE_ROUTE, (res, log) => {
* if (!redisConnected) {
* return 'Redis is not connected';
* }
* res.writeHead(200)
* res.end()
* })
* @param {string|string[]} pathOrPaths - URL path(s) for where the request should be handled
* @param {ProbeDelegate} handler - Callback to handle request
* @returns {undefined}
*/
addHandler(pathOrPaths, handler) {
let paths = pathOrPaths;
if (typeof paths === 'string') {
paths = [paths];
}
for (const p of paths) {
this._handlers.set(p, handler);
}
}
_onRequest(req, res) {
const log = this.logging.newRequestLogger();
log.debug('request received', { method: req.method, url: req.url });
if (req.method !== 'GET') {
errors.MethodNotAllowed.writeResponse(res);
return;
}
if (!this._handlers.has(req.url)) {
errors.InvalidURI.writeResponse(res);
return;
}
const probeResponse = this._handlers.get(req.url)(res, log);
if (probeResponse !== undefined && probeResponse !== '') {
// Return an internal error with the response
errors.InternalError
.customizeDescription(probeResponse)
.writeResponse(res);
}
}
}
module.exports = {
ProbeServer,
DEFAULT_LIVE_ROUTE,
DEFAULT_READY_ROUTE,
DEFAULT_METRICS_ROUTE,
};

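A minimal sketch of standing up the server defined above; the port is a placeholder and the require path assumes the arsenal repo layout.

```javascript
const { ProbeServer, DEFAULT_LIVE_ROUTE } =
    require('./lib/network/probe/ProbeServer');

const server = new ProbeServer({ port: 8000 });
server.addHandler(DEFAULT_LIVE_ROUTE, (res, log) => {
    // Returning a string reports a failure; otherwise answer the probe.
    res.writeHead(200);
    res.end();
});
server.start();
```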

@@ -88,6 +88,16 @@ class RESTClient {
});
}
/**
* Destroy the HTTP agent, forcing a close of the remaining open
* connections
*
* @return {undefined}
*/
destroy() {
this.httpAgent.destroy();
}
/*
* Create a dedicated logger for RESTClient, from the provided werelogs API
* instance.


@@ -214,7 +214,7 @@ class RESTServer extends httpServer {
if (req.url.startsWith(`${constants.dataFileURL}?`)) {
const queryParam = url.parse(req.url).query;
if (queryParam === 'diskUsage') {
this.dataStore.getDiskUsage((err, result) => {
return this.dataStore.getDiskUsage((err, result) => {
if (err) {
return sendError(res, log, err);
}


@@ -7,8 +7,9 @@ const ipCheck = require('../ipCheck');
* @return {string} - returns client IP from the request
*/
function getClientIp(request, s3config) {
const clientIp = request.socket.remoteAddress;
const requestConfig = s3config ? s3config.requests : {};
const remoteAddress = request.socket.remoteAddress;
const clientIp = requestConfig ? remoteAddress : request.headers['x-forwarded-for'] || remoteAddress;
if (requestConfig) {
const { trustedProxyCIDRs, extractClientIPFromHeader } = requestConfig;
/**

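A sketch of the intent behind the change above, under assumed helpers: trust 'x-forwarded-for' only when the peer is a configured proxy; `isInCIDRs` is a stand-in for arsenal's ipCheck utilities.

```javascript
function getClientIpSketch(request, trustedProxyCIDRs, isInCIDRs) {
    const remoteAddress = request.socket.remoteAddress;
    const forwarded = request.headers['x-forwarded-for'];
    if (forwarded && isInCIDRs(remoteAddress, trustedProxyCIDRs)) {
        // 'x-forwarded-for' may hold a comma-separated proxy chain; the
        // left-most entry is the originating client.
        return forwarded.split(',')[0].trim();
    }
    return remoteAddress;
}
```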

@@ -72,6 +72,49 @@ const actionMapS3 = Object.assign({
bucketPutNotification: 's3:PutBucketNotification',
}, sharedActionMap, actionMapRQ, actionMapBP);
const actionMonitoringMapS3 = {
bucketDelete: 'DeleteBucket',
bucketDeleteCors: 'DeleteBucketCors',
bucketDeleteLifecycle: 'DeleteBucketLifecycle',
bucketDeleteReplication: 'DeleteBucketReplication',
bucketDeleteWebsite: 'DeleteBucketWebsite',
bucketGet: 'ListObjects',
bucketGetACL: 'GetBucketAcl',
bucketGetCors: 'GetBucketCors',
bucketGetLifecycle: 'GetBucketLifecycleConfiguration',
bucketGetLocation: 'GetBucketLocation',
bucketGetReplication: 'GetBucketReplication',
bucketGetVersioning: 'GetBucketVersioning',
bucketGetWebsite: 'GetBucketWebsite',
bucketHead: 'HeadBucket',
bucketPut: 'CreateBucket',
bucketPutACL: 'PutBucketAcl',
bucketPutCors: 'PutBucketCors',
bucketPutLifecycle: 'PutBucketLifecycleConfiguration',
bucketPutReplication: 'PutBucketReplication',
bucketPutVersioning: 'PutBucketVersioning',
bucketPutWebsite: 'PutBucketWebsite',
completeMultipartUpload: 'CompleteMultipartUpload',
initiateMultipartUpload: 'CreateMultipartUpload',
listMultipartUploads: 'ListMultipartUploads',
listParts: 'ListParts',
multiObjectDelete: 'DeleteObjects',
multipartDelete: 'AbortMultipartUpload',
objectCopy: 'CopyObject',
objectDelete: 'DeleteObject',
objectDeleteTagging: 'DeleteObjectTagging',
objectGet: 'GetObject',
objectGetACL: 'GetObjectAcl',
objectGetTagging: 'GetObjectTagging',
objectHead: 'HeadObject',
objectPut: 'PutObject',
objectPutACL: 'PutObjectAcl',
objectPutCopyPart: 'UploadPartCopy',
objectPutPart: 'UploadPart',
objectPutTagging: 'PutObjectTagging',
serviceGet: 'ListBuckets',
};
const actionMapIAM = {
attachGroupPolicy: 'iam:AttachGroupPolicy',
attachUserPolicy: 'iam:AttachUserPolicy',
@@ -121,6 +164,7 @@ module.exports = {
actionMapRQ,
actionMapBP,
actionMapS3,
actionMonitoringMapS3,
actionMapIAM,
actionMapSSO,
actionMapSTS,


@@ -10,6 +10,8 @@ const routeOPTIONS = require('./routes/routeOPTIONS');
const routesUtils = require('./routesUtils');
const routeWebsite = require('./routes/routeWebsite');
const requestUtils = require('../../lib/policyEvaluator/requestUtils');
const routeMap = {
GET: routeGET,
PUT: routePUT,
@@ -67,7 +69,8 @@ function checkBucketAndKey(bucketName, objectKey, method, reqQuery,
return undefined;
}
function checkTypes(req, res, params, logger) {
// TODO: ARSN-59 remove assertions or restrict it to dev environment only.
function checkTypes(req, res, params, logger, s3config) {
assert.strictEqual(typeof req, 'object',
'bad routes param: req must be an object');
assert.strictEqual(typeof res, 'object',
@@ -114,6 +117,9 @@ function checkTypes(req, res, params, logger) {
});
assert.strictEqual(typeof params.dataRetrievalFn, 'function',
'bad routes param: dataRetrievalFn must be a defined function');
if (s3config) {
assert.strictEqual(typeof s3config, 'object', 'bad routes param: s3config must be an object');
}
}
/** routes - route request to appropriate method
@@ -134,9 +140,10 @@ function checkTypes(req, res, params, logger) {
* values for whether queries are supported
* @param {function} params.dataRetrievalFn - function to retrieve data
* @param {RequestLogger} logger - werelogs logger instance
* @param {object} [s3config] - S3 configuration
* @returns {undefined}
*/
function routes(req, res, params, logger) {
function routes(req, res, params, logger, s3config) {
checkTypes(req, res, params, logger, s3config);
const {
@@ -150,7 +157,7 @@ function routes(req, res, params, logger) {
} = params;
const clientInfo = {
clientIP: req.socket.remoteAddress,
clientIP: requestUtils.getClientIp(req, s3config),
clientPort: req.socket.remotePort,
httpCode: res.statusCode,
httpMessage: res.statusMessage,


@@ -85,8 +85,18 @@ const XMLResponseBackend = {
});
},
errorResponse: function errorXMLResponse(errCode, response, log,
corsHeaders) {
errorResponse: function errorXMLResponse(errCode, response, log, corsHeaders) {
setCommonResponseHeaders(corsHeaders, response, log);
// early return to avoid extra headers and XML data
if (errCode.code === 304) {
response.writeHead(errCode.code);
return response.end('', 'utf8', () => {
log.end().info('responded with empty body', {
httpCode: response.statusCode,
});
});
}
log.trace('sending error xml response', { errCode });
/*
<?xml version="1.0" encoding="UTF-8"?>
@@ -112,7 +122,6 @@ const XMLResponseBackend = {
log.addDefaultFields({
bytesSent,
});
setCommonResponseHeaders(corsHeaders, response, log);
response.writeHead(errCode.code,
{ 'Content-Type': 'application/xml',
'Content-Length': bytesSent });
@@ -286,7 +295,6 @@ function retrieveData(locations, retrieveDataFn, response, log) {
};
// the S3-client might close the connection while we are processing it
response.once('close', () => {
log.debug('received close event before response end');
responseDestroyed = true;
if (currentStream) {
currentStream.destroy();
@@ -354,11 +362,16 @@ function _responseBody(responseBackend, errCode, payload, response, log,
return undefined;
}
function _contentLengthMatchesLocations(contentLength, dataLocations) {
const sumSizes = dataLocations.reduce(
(sum, location) => (sum !== undefined && location.size ?
function _computeContentLengthFromLocation(dataLocations) {
return dataLocations.reduce(
(sum, location) => (sum !== undefined &&
(typeof location.size === 'number' || typeof location.size === 'string') ?
sum + Number.parseInt(location.size, 10) :
undefined), 0);
}
function _contentLengthMatchesLocations(contentLength, dataLocations) {
const sumSizes = _computeContentLengthFromLocation(dataLocations);
return sumSizes === undefined ||
sumSizes === Number.parseInt(contentLength, 10);
}
@@ -480,7 +493,7 @@ const routesUtils = {
okContentHeadersResponse(overrideParams, resHeaders, response,
range, log);
}
if (dataLocations === null) {
if (dataLocations === null || _computeContentLengthFromLocation(dataLocations) === 0) {
return response.end(() => {
log.end().info('responded with only metadata', {
httpCode: response.statusCode,


@@ -30,7 +30,7 @@ class RecordLogProxy extends rpc.BaseClient {
function formatSeq(seq) {
if (seq === undefined) {
if (!seq) {
return undefined;
}
return `0000000000000000${seq.toString()}`.slice(-16);


@@ -5,6 +5,7 @@ module.exports.VersioningConstants = {
DbPrefixes: {
Master: '\x7fM',
Version: '\x7fV',
Replay: '\x7fR',
},
BucketVersioningKeyFormat: {
current: 'v1',


@@ -1,9 +1,9 @@
{
"name": "arsenal",
"engines": {
"node": ">=6.9.5"
"node": ">=16"
},
"version": "7.5.0",
"version": "7.4.17",
"description": "Common utilities for the S3 project components",
"main": "index.js",
"repository": {
@@ -29,31 +29,32 @@
"level": "~5.0.1",
"level-sublevel": "~6.6.5",
"node-forge": "^0.7.1",
"prom-client": "10.2.3",
"simple-glob": "^0.2",
"socket.io": "~2.3.0",
"socket.io-client": "~2.3.0",
"utf8": "2.1.2",
"uuid": "^3.0.1",
"werelogs": "scality/werelogs#0ff7ec82",
"werelogs": "scality/werelogs#8.1.0",
"xml2js": "~0.4.23"
},
"optionalDependencies": {
"ioctl": "2.0.0"
"ioctl": "^2.0.2"
},
"devDependencies": {
"@sinonjs/fake-timers": "^6.0.1",
"eslint": "2.13.1",
"eslint-config-airbnb": "6.2.0",
"eslint-config-scality": "scality/Guidelines#ec33dfb",
"eslint-config-scality": "scality/Guidelines#7.4.11",
"eslint-plugin-react": "^4.3.0",
"mocha": "2.5.3",
"mocha": "8.0.1",
"temp": "0.9.1"
},
"scripts": {
"lint": "eslint $(git ls-files '*.js')",
"lint_md": "mdlint $(git ls-files '*.md')",
"lint_yml": "yamllint $(git ls-files '*.yml')",
"test": "mocha --recursive --timeout 5500 tests/unit",
"test": "mocha --recursive tests/unit",
"ft_test": "find tests/functional -name \"*.js\" | grep -v \"utils/\" | xargs mocha --timeout 120000"
},
"private": true


@@ -8,12 +8,14 @@ const {
FILTER_ACCEPT,
FILTER_SKIP,
SKIP_NONE,
inc,
} = require('../../../../lib/algos/list/tools');
const VSConst =
require('../../../../lib/versioning/constants').VersioningConstants;
const Version = require('../../../../lib/versioning/Version').Version;
const { generateVersionId } = require('../../../../lib/versioning/VersionID');
const { DbPrefixes } = VSConst;
const zpad = require('../../helpers').zpad;
const VID_SEP = VSConst.VersionId.Separator;
@@ -64,7 +66,6 @@ function getListingKey(key, vFormat) {
fakeLogger, vFormat);
/* Filter a master version to set NextMarker. */
// TODO: useless once S3C-1628 is fixed.
const listingKey = getListingKey(key, vFormat);
delimiter.filter({ key: listingKey, value: '' });
assert.strictEqual(delimiter.NextMarker, key);
@@ -215,8 +216,8 @@ function getListingKey(key, vFormat) {
value: Version.generatePHDVersion(generateVersionId('', '')),
};
/* When filtered, it should return FILTER_ACCEPT and set the prvKey.
* to undefined. It should not be added to result the content or common
/* When filtered, it should return FILTER_ACCEPT and set the prvKey
* to undefined. It should not be added to the result content or common
* prefixes. */
assert.strictEqual(delimiter.filter(objPHD), FILTER_ACCEPT);
assert.strictEqual(delimiter.prvKey, undefined);
@@ -238,7 +239,7 @@ function getListingKey(key, vFormat) {
* and element in result content. */
delimiter.filter({ key, value });
/* When filtered, it should return FILTER_ACCEPT and set the prvKey.
/* When filtered, it should return FILTER_ACCEPT and set the prvKey
* to undefined. It should not be added to the result content or common
* prefixes. */
assert.strictEqual(delimiter.filter(objPHD), FILTER_ACCEPT);
@@ -283,7 +284,7 @@ function getListingKey(key, vFormat) {
});
});
it('should accept a delete marker', () => {
it('should skip a delete marker version', () => {
const delimiter = new DelimiterMaster({}, fakeLogger, vFormat);
const version = new Version({ isDeleteMarker: true });
const key = 'key';
@@ -300,7 +301,7 @@ function getListingKey(key, vFormat) {
assert.deepStrictEqual(delimiter.result(), EmptyResult);
});
it('should skip version after a delete marker', () => {
it('should skip version after a delete marker master', () => {
const delimiter = new DelimiterMaster({}, fakeLogger, vFormat);
const version = new Version({ isDeleteMarker: true });
const key = 'key';
@@ -316,7 +317,7 @@ function getListingKey(key, vFormat) {
assert.deepStrictEqual(delimiter.result(), EmptyResult);
});
it('should accept a new key after a delete marker', () => {
it('should accept a new master key after a delete marker master', () => {
const delimiter = new DelimiterMaster({}, fakeLogger, vFormat);
const version = new Version({ isDeleteMarker: true });
const key1 = 'key1';
@@ -454,6 +455,39 @@ function getListingKey(key, vFormat) {
assert.strictEqual(delimiter.filter({ key, value }), FILTER_SKIP);
});
it('should return good skipping value for DelimiterMaster on replay keys', () => {
const delimiter = new DelimiterMaster(
{ delimiter: '/', v2: true },
fakeLogger, vFormat);
for (let i = 0; i < 10; i++) {
delimiter.filter({
key: `foo/${zpad(i)}`,
value: '{}',
});
}
// simulate a listing that goes through a replay key, ...
assert.strictEqual(
delimiter.filter({
key: `${DbPrefixes.Replay}xyz`,
value: 'abcdef',
}),
FILTER_SKIP);
// ...it should skip the whole replay prefix
assert.strictEqual(delimiter.skipping(), DbPrefixes.Replay);
// simulate a listing that reaches regular object keys
// beyond the replay prefix, ...
assert.strictEqual(
delimiter.filter({
key: `${inc(DbPrefixes.Replay)}foo/bar`,
value: '{}',
}),
FILTER_ACCEPT);
// ...it should return to skipping by prefix as usual
assert.strictEqual(delimiter.skipping(), `${inc(DbPrefixes.Replay)}foo/`);
});
}
});
});


@@ -3,14 +3,29 @@
const assert = require('assert');
const DelimiterVersions =
require('../../../../lib/algos/list/delimiterVersions').DelimiterVersions;
const {
FILTER_ACCEPT,
FILTER_SKIP,
SKIP_NONE,
inc,
} = require('../../../../lib/algos/list/tools');
const Werelogs = require('werelogs').Logger;
const logger = new Werelogs('listTest');
const performListing = require('../../../utils/performListing');
const zpad = require('../../helpers').zpad;
const { inc } = require('../../../../lib/algos/list/tools');
const VSConst = require('../../../../lib/versioning/constants').VersioningConstants;
const Version = require('../../../../lib/versioning/Version').Version;
const { generateVersionId } = require('../../../../lib/versioning/VersionID');
const { DbPrefixes } = VSConst;
const VID_SEP = VSConst.VersionId.Separator;
const EmptyResult = {
Versions: [],
CommonPrefixes: [],
IsTruncated: false,
NextKeyMarker: undefined,
NextVersionIdMarker: undefined,
Delimiter: undefined,
};
class Test {
constructor(name, input, genMDParams, output, filter) {
@@ -264,7 +279,7 @@ const tests = [
NextKeyMarker: undefined,
NextVersionIdMarker: undefined,
}),
new Test('bad key marker and good prefix', {
new Test('with bad key marker and good prefix', {
delimiter: '/',
prefix: 'notes/summer/',
keyMarker: 'notes/summer0',
@@ -288,7 +303,7 @@ const tests = [
NextKeyMarker: undefined,
NextVersionIdMarker: undefined,
}, (e, input) => e.key > input.keyMarker),
new Test('delimiter and prefix (related to #147)', {
new Test('with delimiter and prefix (related to #147)', {
delimiter: '/',
prefix: 'notes/',
}, {
@@ -318,7 +333,7 @@ const tests = [
NextKeyMarker: undefined,
NextVersionIdMarker: undefined,
}),
new Test('delimiter, prefix and marker (related to #147)', {
new Test('with delimiter, prefix and marker (related to #147)', {
delimiter: '/',
prefix: 'notes/',
keyMarker: 'notes/year.txt',
@@ -346,7 +361,7 @@ const tests = [
NextKeyMarker: undefined,
NextVersionIdMarker: undefined,
}, (e, input) => e.key > input.keyMarker),
new Test('all parameters 1/3', {
new Test('with all parameters 1/3', {
delimiter: '/',
prefix: 'notes/',
keyMarker: 'notes/',
@@ -372,7 +387,7 @@ const tests = [
NextVersionIdMarker: undefined,
}, (e, input) => e.key > input.keyMarker),
new Test('all parameters 2/3', {
new Test('with all parameters 2/3', {
delimiter: '/',
prefix: 'notes/', // prefix
keyMarker: 'notes/spring/',
@@ -398,7 +413,7 @@ const tests = [
NextVersionIdMarker: undefined,
}, (e, input) => e.key > input.keyMarker),
new Test('all parameters 3/3', {
new Test('with all parameters 3/3', {
delimiter: '/',
prefix: 'notes/', // prefix
keyMarker: 'notes/summer/',
@@ -426,7 +441,7 @@ const tests = [
NextVersionIdMarker: receivedData[19].versionId,
}, (e, input) => e.key > input.keyMarker),
new Test('all parameters 4/3', {
new Test('with all parameters 4/3', {
delimiter: '/',
prefix: 'notes/', // prefix
keyMarker: 'notes/year.txt',
@@ -454,7 +469,7 @@ const tests = [
NextVersionIdMarker: receivedData[20].versionId,
}, (e, input) => e.key > input.keyMarker),
new Test('all parameters 5/3', {
new Test('with all parameters 5/3', {
delimiter: '/',
prefix: 'notes/',
keyMarker: 'notes/yore.rs',
@@ -481,39 +496,79 @@ const tests = [
}, (e, input) => e.key > input.keyMarker),
];
function getListingKey(key, vFormat) {
if (vFormat === 'v0') {
return key;
}
if (vFormat === 'v1') {
const keyPrefix = key.includes(VID_SEP) ?
DbPrefixes.Version : DbPrefixes.Master;
return `${keyPrefix}${key}`;
}
return assert.fail(`bad format ${vFormat}`);
}
function getTestListing(test, data, vFormat) {
return data
.filter(e => test.filter(e, test.input))
.map(e => {
if (vFormat === 'v0') {
return e;
}
if (vFormat === 'v1') {
const keyPrefix = e.key.includes(VID_SEP) ?
DbPrefixes.Version : DbPrefixes.Master;
return {
key: `${keyPrefix}${e.key}`,
value: e.value,
};
}
return assert.fail(`bad format ${vFormat}`);
});
.map(e => ({
key: getListingKey(e.key, vFormat),
value: e.value,
}));
}
['v0', 'v1'].forEach(vFormat => {
describe(`Delimiter All Versions listing algorithm vFormat=${vFormat}`, () => {
it('Should return good skipping value for DelimiterVersions', () => {
const delimiter = new DelimiterVersions({ delimiter: '/' });
const delimiter = new DelimiterVersions({ delimiter: '/' }, logger, vFormat);
for (let i = 0; i < 100; i++) {
delimiter.filter({
key: `${vFormat === 'v1' ? DbPrefixes.Master : ''}foo/${zpad(i)}`,
value: '{}',
});
}
assert.strictEqual(delimiter.skipping(),
`${vFormat === 'v1' ? DbPrefixes.Master : ''}foo/`);
if (vFormat === 'v1') {
assert.deepStrictEqual(delimiter.skipping(), [
`${DbPrefixes.Master}foo/`,
`${DbPrefixes.Version}foo/`,
]);
} else {
assert.strictEqual(delimiter.skipping(), 'foo/');
}
});
if (vFormat === 'v0') {
it('Should return good skipping value for DelimiterVersions on replay keys', () => {
const delimiter = new DelimiterVersions({ delimiter: '/' }, logger, vFormat);
for (let i = 0; i < 10; i++) {
delimiter.filter({
key: `foo/${zpad(i)}`,
value: '{}',
});
}
// simulate a listing that goes through a replay key, ...
assert.strictEqual(
delimiter.filter({
key: `${DbPrefixes.Replay}xyz`,
value: 'abcdef',
}),
FILTER_SKIP);
// ...it should skip the whole replay prefix
assert.strictEqual(delimiter.skipping(), DbPrefixes.Replay);
// simulate a listing that reaches regular object keys
// beyond the replay prefix, ...
assert.strictEqual(
delimiter.filter({
key: `${inc(DbPrefixes.Replay)}foo/bar`,
value: '{}',
}),
FILTER_ACCEPT);
// ...it should return to skipping by prefix as usual
assert.strictEqual(delimiter.skipping(), `${inc(DbPrefixes.Replay)}foo/`);
});
}
tests.forEach(test => {
it(`Should return metadata listing params to list ${test.name}`, () => {
const listing = new DelimiterVersions(test.input, logger, vFormat);
@@ -527,5 +582,442 @@ function getTestListing(test, data, vFormat) {
assert.deepStrictEqual(res, test.output);
});
});
it('skipping() should return SKIP_NONE when NextMarker is undefined', () => {
const delimiter = new DelimiterVersions({ delimiter: '/' }, logger, vFormat);
assert.strictEqual(delimiter.NextMarker, undefined);
assert.strictEqual(delimiter.skipping(), SKIP_NONE);
});
it('skipping() should return SKIP_NONE when marker is set and ' +
'does not contain the delimiter', () => {
const key = 'foo';
const delimiter = new DelimiterVersions({ delimiter: '/', marker: key },
logger, vFormat);
/* Filter a master version to set NextMarker. */
const listingKey = getListingKey(key, vFormat);
delimiter.filter({ key: listingKey, value: '' });
assert.strictEqual(delimiter.NextMarker, 'foo');
assert.strictEqual(delimiter.skipping(), SKIP_NONE);
});
it('skipping() should return prefix to skip when marker is set and ' +
'contains the delimiter', () => {
const key = 'foo/bar';
const delimiter = new DelimiterVersions({ delimiter: '/', marker: key },
logger, vFormat);
/* Filter a master version to set NextMarker. */
const listingKey = getListingKey(key, vFormat);
delimiter.filter({ key: listingKey, value: '' });
assert.strictEqual(delimiter.NextMarker, 'foo/');
if (vFormat === 'v0') {
assert.strictEqual(delimiter.skipping(), 'foo/');
} else {
assert.deepStrictEqual(delimiter.skipping(), [
`${DbPrefixes.Master}foo/`,
`${DbPrefixes.Version}foo/`,
]);
}
});
it('skipping() should return prefix when marker is set and ' +
'ends with the delimiter', () => {
const key = 'foo/';
const delimiter = new DelimiterVersions({ delimiter: '/', marker: key },
logger, vFormat);
/* Filter a master version to set NextMarker. */
const listingKey = getListingKey(key, vFormat);
delimiter.filter({ key: listingKey, value: '' });
assert.strictEqual(delimiter.NextMarker, 'foo/');
if (vFormat === 'v0') {
assert.strictEqual(delimiter.skipping(), 'foo/');
} else {
assert.deepStrictEqual(delimiter.skipping(), [
`${DbPrefixes.Master}foo/`,
`${DbPrefixes.Version}foo/`,
]);
}
});
it('should skip entries not starting with prefix', () => {
const delimiter = new DelimiterVersions({ prefix: 'prefix' }, logger, vFormat);
const listingKey = getListingKey('wrong', vFormat);
assert.strictEqual(delimiter.filter({ key: listingKey, value: '' }), FILTER_SKIP);
assert.strictEqual(delimiter.NextMarker, undefined);
assert.strictEqual(delimiter.prvKey, undefined);
assert.deepStrictEqual(delimiter.result(), EmptyResult);
});
it('should accept a master version', () => {
const delimiter = new DelimiterVersions({}, logger, vFormat);
const key = 'key';
const value = '';
const listingKey = getListingKey(key, vFormat);
assert.strictEqual(delimiter.filter({ key: listingKey, value }), FILTER_ACCEPT);
assert.strictEqual(delimiter.NextMarker, key);
assert.deepStrictEqual(delimiter.result(), {
CommonPrefixes: [],
Versions: [{
key: 'key',
value: '',
versionId: 'null',
}],
Delimiter: undefined,
IsTruncated: false,
NextKeyMarker: undefined,
NextVersionIdMarker: undefined,
});
});
it('should return good values for entries with different common prefixes', () => {
const delimiter = new DelimiterVersions({ delimiter: '/' },
logger, vFormat);
/* Filter the first entry with a common prefix. It should be
* accepted and added to the result. */
assert.strictEqual(delimiter.filter({
key: getListingKey('commonPrefix1/key1', vFormat),
value: '',
}),
FILTER_ACCEPT);
assert.deepStrictEqual(delimiter.result(), {
CommonPrefixes: ['commonPrefix1/'],
Versions: [],
Delimiter: '/',
IsTruncated: false,
NextKeyMarker: undefined,
NextVersionIdMarker: undefined,
});
/* Filter the second entry with the same common prefix than the
* first entry. It should be skipped and not added to the result. */
assert.strictEqual(delimiter.filter({
key: getListingKey('commonPrefix1/key2', vFormat),
value: '',
}),
FILTER_SKIP);
assert.deepStrictEqual(delimiter.result(), {
CommonPrefixes: ['commonPrefix1/'],
Versions: [],
Delimiter: '/',
IsTruncated: false,
NextKeyMarker: undefined,
NextVersionIdMarker: undefined,
});
/* Filter an entry with a new common prefix. It should be accepted
* and not added to the result. */
assert.strictEqual(delimiter.filter({
key: getListingKey('commonPrefix2/key1', vFormat),
value: '',
}),
FILTER_ACCEPT);
assert.deepStrictEqual(delimiter.result(), {
CommonPrefixes: ['commonPrefix1/', 'commonPrefix2/'],
Versions: [],
Delimiter: '/',
IsTruncated: false,
NextKeyMarker: undefined,
NextVersionIdMarker: undefined,
});
});
it('should accept a delete marker version', () => {
const delimiter = new DelimiterVersions({}, logger, vFormat);
const version = new Version({ isDeleteMarker: true });
const key = 'key';
const obj = {
key: getListingKey(`${key}${VID_SEP}version`, vFormat),
value: version.toString(),
};
/* When filtered, it should return FILTER_ACCEPT and
* should be added to the result content. */
assert.strictEqual(delimiter.filter(obj), FILTER_ACCEPT);
assert.strictEqual(delimiter.NextMarker, key);
assert.deepStrictEqual(delimiter.result(), {
CommonPrefixes: [],
Versions: [{
key: 'key',
value: version.toString(),
versionId: 'version',
}],
Delimiter: undefined,
IsTruncated: false,
NextKeyMarker: undefined,
NextVersionIdMarker: undefined,
});
});
it('should accept a version after a delete marker master', () => {
const delimiter = new DelimiterVersions({}, logger, vFormat);
const version = new Version({ isDeleteMarker: true });
const key = 'key';
const versionKey = `${key}${VID_SEP}version`;
delimiter.filter({ key: getListingKey(key, vFormat), value: version.toString() });
assert.strictEqual(delimiter.filter({
key: getListingKey(versionKey, vFormat),
value: 'value',
}), FILTER_ACCEPT);
assert.strictEqual(delimiter.NextMarker, key);
assert.deepStrictEqual(delimiter.result(), {
CommonPrefixes: [],
Versions: [{
key: 'key',
value: version.toString(),
versionId: 'null',
}, {
key: 'key',
value: 'value',
versionId: 'version',
}],
Delimiter: undefined,
IsTruncated: false,
NextKeyMarker: undefined,
NextVersionIdMarker: undefined,
});
});
it('should accept a new master key w/ version after a delete marker master', () => {
const delimiter = new DelimiterVersions({}, logger, vFormat);
const version = new Version({ isDeleteMarker: true });
const key1 = 'key1';
const key2 = 'key2';
const value2 = '{"versionId":"version"}';
assert.strictEqual(delimiter.filter({
key: getListingKey(key1, vFormat),
value: version.toString(),
}), FILTER_ACCEPT);
assert.strictEqual(delimiter.filter({
key: getListingKey(key2, vFormat),
value: value2,
}), FILTER_ACCEPT);
assert.strictEqual(delimiter.NextMarker, key2);
assert.deepStrictEqual(delimiter.result(), {
CommonPrefixes: [],
Delimiter: undefined,
IsTruncated: false,
NextKeyMarker: undefined,
NextVersionIdMarker: undefined,
Versions: [{
key: 'key1',
value: '{"isDeleteMarker":true}',
versionId: 'null',
}, {
key: 'key2',
value: '{"versionId":"version"}',
versionId: 'version',
}],
});
});
it('should accept a version after skipping an object because of its commonPrefix', () => {
const commonPrefix1 = 'commonPrefix1/';
const commonPrefix2 = 'commonPrefix2/';
const prefix1Key1 = 'commonPrefix1/key1';
const prefix1Key2 = 'commonPrefix1/key2';
const prefix2VersionKey1 = `commonPrefix2/key1${VID_SEP}version`;
const value = '{"versionId":"version"}';
const delimiter = new DelimiterVersions({ delimiter: '/' },
logger, vFormat);
/* Filter the two first entries with the same common prefix to add
* it to the result and reach the state where an entry is skipped
* because of an already present common prefix in the result. */
delimiter.filter({ key: getListingKey(prefix1Key1, vFormat), value });
delimiter.filter({ key: getListingKey(prefix1Key2, vFormat), value });
/* Filter an object with a key containing a version part and a new
* common prefix. It should be accepted and the new common prefix
* added to the result. */
assert.strictEqual(delimiter.filter({
key: getListingKey(prefix2VersionKey1, vFormat),
value,
}), FILTER_ACCEPT);
assert.deepStrictEqual(delimiter.result(), {
CommonPrefixes: [commonPrefix1, commonPrefix2],
Versions: [],
Delimiter: '/',
IsTruncated: false,
NextKeyMarker: undefined,
NextVersionIdMarker: undefined,
});
});
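/* The grouping behind CommonPrefixes can be sketched as follows; the
 * helper name getCommonPrefix is illustrative, not the library's own.
 * A key's common prefix is the key truncated after the first delimiter
 * found past the listing prefix; keys mapping to a prefix already in
 * the result are skipped. */
function getCommonPrefix(key, delimiter, prefix = '') {
    const delimiterIndex = key.indexOf(delimiter, prefix.length);
    if (delimiterIndex === -1) {
        return undefined; // no delimiter: the key is listed itself
    }
    return key.slice(0, delimiterIndex + delimiter.length);
}
// Both prefix1 keys above map to 'commonPrefix1/', so the second one
// is skipped; the version key under 'commonPrefix2/' maps to a prefix
// not seen yet and is therefore accepted.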
it('should skip first version key if equal to master', () => {
const delimiter = new DelimiterVersions({}, logger, vFormat);
const masterKey = 'key';
const versionKey1 = `${masterKey}${VID_SEP}version1`;
const versionKey2 = `${masterKey}${VID_SEP}version2`;
const value2 = 'value2';
/* Filter the master version for version1 */
assert.strictEqual(delimiter.filter({
key: getListingKey(masterKey, vFormat),
value: '{"versionId":"version1"}',
}), FILTER_ACCEPT);
/* Filter the version key for version1 */
assert.strictEqual(delimiter.filter({
key: getListingKey(versionKey1, vFormat),
value: '{"versionId":"version1"}',
}), FILTER_SKIP);
/* Filter the version key for version2 */
assert.strictEqual(delimiter.filter({
key: getListingKey(versionKey2, vFormat),
value: value2,
}), FILTER_ACCEPT);
assert.deepStrictEqual(delimiter.result(), {
CommonPrefixes: [],
Versions: [{
key: 'key',
value: '{"versionId":"version1"}',
versionId: 'version1',
}, {
key: 'key',
value: 'value2',
versionId: 'version2',
}],
Delimiter: undefined,
IsTruncated: false,
NextKeyMarker: undefined,
NextVersionIdMarker: undefined,
});
});
it('should skip master and version key if under a known prefix', () => {
const commonPrefix1 = 'commonPrefix/';
const prefixKey1 = 'commonPrefix/key1';
const prefixKey2 = 'commonPrefix/key2';
const prefixVersionKey1 = `commonPrefix/key2${VID_SEP}version`;
const value = '{"versionId":"version"}';
const delimiter = new DelimiterVersions({ delimiter: '/' },
logger, vFormat);
assert.strictEqual(delimiter.filter({
key: getListingKey(prefixKey1, vFormat),
value,
}), FILTER_ACCEPT);
/* The second master key of the same common prefix should be skipped */
assert.strictEqual(delimiter.filter({
key: getListingKey(prefixKey2, vFormat),
value,
}), FILTER_SKIP);
/* The version key of the same common prefix should also be skipped */
assert.strictEqual(delimiter.filter({
key: getListingKey(prefixVersionKey1, vFormat),
value,
}), FILTER_SKIP);
assert.deepStrictEqual(delimiter.result(), {
CommonPrefixes: [commonPrefix1],
Versions: [],
Delimiter: '/',
IsTruncated: false,
NextKeyMarker: undefined,
NextVersionIdMarker: undefined,
});
});
if (vFormat === 'v0') {
it('should accept a PHD version as first input', () => {
const delimiter = new DelimiterVersions({}, logger, vFormat);
const keyPHD = 'keyPHD';
const objPHD = {
key: keyPHD,
value: Version.generatePHDVersion(generateVersionId('', '')),
};
/* When filtered, it should return FILTER_ACCEPT and set the prvKey
* to undefined. It should not be added to the result content or common
* prefixes. */
assert.strictEqual(delimiter.filter(objPHD), FILTER_ACCEPT);
assert.strictEqual(delimiter.prvKey, undefined);
assert.strictEqual(delimiter.NextMarker, undefined);
assert.deepStrictEqual(delimiter.result(), EmptyResult);
});
it('should accept a PHD version', () => {
const delimiter = new DelimiterVersions({}, logger, vFormat);
const key = 'keyA';
const value = '';
const keyPHD = 'keyBPHD';
const objPHD = {
key: keyPHD,
value: Version.generatePHDVersion(generateVersionId('', '')),
};
/* Filter a master version to set the NextMarker and add
 * an element to the result contents. */
delimiter.filter({ key, value });
/* When filtered, it should return FILTER_ACCEPT. It
* should not be added to the result content or common
* prefixes. */
assert.strictEqual(delimiter.filter(objPHD), FILTER_ACCEPT);
assert.strictEqual(delimiter.prvKey, undefined);
assert.strictEqual(delimiter.NextMarker, key);
assert.deepStrictEqual(delimiter.result(), {
CommonPrefixes: [],
Versions: [{
key: 'keyA',
value: '',
versionId: 'null',
}],
Delimiter: undefined,
IsTruncated: false,
NextKeyMarker: undefined,
NextVersionIdMarker: undefined,
});
});
it('should accept a version after a PHD', () => {
const delimiter = new DelimiterVersions({}, logger, vFormat);
const masterKey = 'key';
const keyVersion = `${masterKey}${VID_SEP}version`;
const value = '';
const objPHD = {
key: masterKey,
value: Version.generatePHDVersion(generateVersionId('', '')),
};
/* Filter the PHD object. */
delimiter.filter(objPHD);
/* Filtering the PHD object has no impact: the version is
 * accepted and added to the result. */
assert.strictEqual(delimiter.filter({
key: keyVersion,
value,
}), FILTER_ACCEPT);
assert.strictEqual(delimiter.NextMarker, masterKey);
assert.deepStrictEqual(delimiter.result(), {
CommonPrefixes: [],
Versions: [{
key: 'key',
value: '',
versionId: 'version',
}],
Delimiter: undefined,
IsTruncated: false,
NextKeyMarker: undefined,
NextVersionIdMarker: undefined,
});
});
}
});
});
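The v0-only tests above exercise PHD (place holder for deletion) master values: after a master version is deleted, the master key temporarily holds a PHD value until the next version is promoted, and the listing accepts it without emitting anything. A minimal sketch of the detection step, assuming Version.isPHD recognizes values produced by Version.generatePHDVersion (the helper name and its behavior are assumptions about the API):

function filterPHD(delimiter, obj) {
    // a PHD master is accepted but contributes nothing to the
    // result; clearing prvKey lets the following version through
    if (Version.isPHD(obj.value)) { // assumed API
        delimiter.prvKey = undefined;
        return FILTER_ACCEPT;
    }
    return null; // not a PHD: continue with the regular filtering
}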


@@ -53,4 +53,12 @@ describe('should URIencode in accordance with AWS rules', () => {
const actualOutput = awsURIencode(input);
assert.strictEqual(actualOutput, expectedOutput);
});
it('should skip invalid query params', () => {
const input = ['s3:ObjectCreated:*', 's3:ObjectRemoved:*',
's3:BucketCreated:*', 's3:BucketRemoved:*'];
const expectedOutput = '';
const actualOutput = awsURIencode(input);
assert.strictEqual(actualOutput, expectedOutput);
});
});


@@ -1,14 +1,22 @@
'use strict';// eslint-disable-line strict
const assert = require('assert');
const async = require('async');
const leveldb = require('level');
const temp = require('temp');
temp.track();
const db = require('../../index').db;
const errors = require('../../lib/errors');
const IndexTransaction = db.IndexTransaction;
const key1 = 'key1';
const key2 = 'key2';
const key3 = 'key3';
const value1 = 'value1';
const value2 = 'value2';
const value3 = 'value3';
function createDb() {
const indexPath = temp.mkdirSync();
@@ -39,6 +47,45 @@ function checkValueNotInDb(db, k, done) {
});
}
function checkKeyNotExistsInDB(db, key, cb) {
return db.get(key, (err, value) => {
if (err && !err.notFound) {
return cb(err);
}
if (value) {
return cb(errors.PreconditionFailed);
}
return cb();
});
}
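/* Test double emulating a leveldb client with conditional batch
 * support: every condition in writeOptions.conditions is verified
 * (up to 10 at a time) before the underlying batch is applied; only
 * the notExists condition is implemented here. */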
class ConditionalLevelDB {
constructor() {
this.db = createDb();
}
batch(operations, writeOptions, cb) {
return async.eachLimit(writeOptions.conditions, 10, (cond, asyncCallback) => {
switch (true) {
case ('notExists' in cond):
checkKeyNotExistsInDB(this.db, cond.notExists, asyncCallback);
break;
default:
asyncCallback(new Error('unsupported conditional operation'));
}
}, err => {
if (err) {
return cb(err);
}
return this.db.batch(operations, writeOptions, cb);
});
}
get client() {
return this.db;
}
}
describe('IndexTransaction', () => {
it('should allow put', done => {
const db = createDb();
@@ -317,4 +364,104 @@ describe('IndexTransaction', () => {
});
});
});
it('should allow batch operation with notExists condition if key does not exist', done => {
const db = new ConditionalLevelDB();
const { client } = db;
const transaction = new IndexTransaction(db);
transaction.addCondition({ notExists: key1 });
transaction.push({
type: 'put',
key: key1,
value: value1,
});
return async.series([
next => transaction.commit(next),
next => client.get(key1, next),
], (err, res) => {
assert.ifError(err);
assert.strictEqual(res[1], value1);
return done();
});
});
it('should have a working addCondition shortcut method', done => {
const db = new ConditionalLevelDB();
const { client } = db;
const transaction = new IndexTransaction(db);
transaction.put(key1, value1);
transaction.addCondition({ notExists: 'key1' });
transaction.commit(err => {
if (err) {
return done(err);
}
return checkValueInDb(client, key1, value1, done);
});
});
it('should not allow any op in a batch operation with notExists condition if key exists', done => {
const db = new ConditionalLevelDB();
const { client } = db;
const transaction = new IndexTransaction(db);
function tryPushAgain(err) {
if (err) {
return done(err);
}
transaction.addCondition({ notExists: key1 });
transaction.push({
type: 'put',
key: key1,
value: value1,
});
transaction.push({
type: 'put',
key: key2,
value: value2,
});
transaction.push({
type: 'put',
key: key3,
value: value3,
});
return transaction.commit(err => {
if (!err || !err.PreconditionFailed) {
return done(new Error('should not be able to conditional put for duplicate key'));
}
return async.parallel([
next => checkKeyNotExistsInDB(client, key2, next),
next => checkKeyNotExistsInDB(client, key3, next),
], err => {
assert.ifError(err);
return done();
});
});
}
client.batch()
.put(key1, value1)
.write(tryPushAgain);
});
it('should not allow batch operation with empty condition', done => {
const transaction = new IndexTransaction();
try {
transaction.addCondition({});
done(new Error('should fail for empty condition'));
} catch (err) {
assert.strictEqual(err.missingCondition, true);
done();
}
});
it('should not allow batch operation with unsupported condition', done => {
const transaction = new IndexTransaction();
try {
transaction.addCondition({ exists: key1 });
done(new Error('should fail for unsupported condition, currently supported - notExists'));
} catch (err) {
assert.strictEqual(err.unsupportedConditionalOperation, true);
done();
}
});
});
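As the last two tests pin down, addCondition validates its argument eagerly and throws errors flagged with missingCondition or unsupportedConditionalOperation. A sketch of that validation logic (illustrative, not the library's implementation):

function validateCondition(condition) {
    if ('notExists' in condition) {
        return; // currently the only supported conditional operation
    }
    if (Object.keys(condition).length === 0) {
        const err = new Error('missing condition');
        err.missingCondition = true;
        throw err;
    }
    const err = new Error('unsupported conditional operation');
    err.unsupportedConditionalOperation = true;
    throw err;
}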


@@ -26,4 +26,17 @@ describe('Errors: ', () => {
assert.strictEqual(error.description, 'custom-description');
assert.strictEqual(error.NoSuchEntity, true);
});
it('can be used as an http response', () => {
const fakeRes = {
writeHead: code => assert.strictEqual(code, 404),
end: msg => {
assert.strictEqual(
msg,
errors.NoSuchEntity.toString(),
);
},
};
errors.NoSuchEntity.writeResponse(fakeRes);
});
});
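The writeResponse helper exercised above makes an ArsenalError usable directly as an HTTP response: it writes the error's HTTP code and serializes the error as the body. A minimal usage sketch (the require path and port are illustrative):

const http = require('http');
const errors = require('../../lib/errors');

http.createServer((req, res) => {
    // replies 404 with the serialized NoSuchEntity error as body
    errors.NoSuchEntity.writeResponse(res);
}).listen(8000);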


@@ -28,6 +28,8 @@ describe('StatsClient class', () => {
afterEach(() => redisClient.clear(() => {}));
after(() => redisClient.disconnect());
it('should correctly record a new request by default one increment',
done => {
async.series([


@@ -0,0 +1,106 @@
const assert = require('assert');
const ZenkoMetrics = require('../../../lib/metrics/ZenkoMetrics');
describe('ZenkoMetrics', () => {
let counter;
let gauge;
let histogram;
let summary;
let petCounter;
before(() => {
counter = ZenkoMetrics.createCounter({
name: 'gizmo_counter',
help: 'Count gizmos',
});
counter.inc();
counter.inc(10);
gauge = ZenkoMetrics.createGauge({
name: 'gizmo_gauge',
help: 'Measure gizmos',
});
gauge.set(42);
gauge.inc();
gauge.dec(10);
histogram = ZenkoMetrics.createHistogram({
name: 'gizmo_histogram',
help: 'Make a histogram of gizmos',
buckets: [1, 10, 100],
});
histogram.observe(5);
histogram.observe(15);
histogram.observe(50);
histogram.observe(500);
summary = ZenkoMetrics.createSummary({
name: 'gizmo_summary',
help: 'Make a summary of gizmos',
percentiles: [0.05, 0.5, 0.95],
});
summary.observe(5);
summary.observe(50);
summary.observe(500);
petCounter = ZenkoMetrics.createCounter({
name: 'pet_counter',
help: 'Count pets',
labelNames: ['type'],
});
petCounter.inc({ type: 'kitten' });
petCounter.inc({ type: 'puppy' }, 2);
});
it('should keep created metrics objects in registry', () => {
const savedCounter = ZenkoMetrics.getMetric('gizmo_counter');
// check we get the same original counter object
assert.strictEqual(savedCounter, counter);
const savedGauge = ZenkoMetrics.getMetric('gizmo_gauge');
// check we get the same original gauge object
assert.strictEqual(savedGauge, gauge);
assert.strictEqual(ZenkoMetrics.getMetric('does_not_exist'), undefined);
});
it('should export metrics in prometheus format', () => {
const expectedLines = [
'# HELP gizmo_counter Count gizmos',
'# TYPE gizmo_counter counter',
'gizmo_counter 11',
'# HELP gizmo_gauge Measure gizmos',
'# TYPE gizmo_gauge gauge',
'gizmo_gauge 33',
'# HELP gizmo_histogram Make a histogram of gizmos',
'# TYPE gizmo_histogram histogram',
'gizmo_histogram_bucket{le="1"} 0',
'gizmo_histogram_bucket{le="10"} 1',
'gizmo_histogram_bucket{le="100"} 3',
'gizmo_histogram_bucket{le="+Inf"} 4',
'gizmo_histogram_sum 570',
'gizmo_histogram_count 4',
'# HELP gizmo_summary Make a summary of gizmos',
'# TYPE gizmo_summary summary',
'gizmo_summary{quantile="0.05"} 5',
'gizmo_summary{quantile="0.5"} 50',
'gizmo_summary{quantile="0.95"} 500',
'gizmo_summary_sum 555',
'gizmo_summary_count 3',
'# HELP pet_counter Count pets',
'# TYPE pet_counter counter',
'pet_counter{type="kitten"} 1',
'pet_counter{type="puppy"} 2',
];
const lines = {};
ZenkoMetrics.asPrometheus().split('\n').forEach(line => {
lines[line.trimRight()] = true;
});
expectedLines.forEach(expectedLine => {
assert.notStrictEqual(
lines[expectedLine], undefined,
`missing expected line in Prometheus export '${expectedLine}'`);
});
});
});
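Since asPrometheus returns the whole text exposition synchronously (the test above splits its return value on newlines), wiring it to a scrape endpoint is direct; a sketch, with the port and route as illustrative choices:

const http = require('http');
const ZenkoMetrics = require('../../../lib/metrics/ZenkoMetrics');

http.createServer((req, res) => {
    if (req.url === '/metrics') {
        res.writeHead(200, { 'Content-Type': 'text/plain' });
        return res.end(ZenkoMetrics.asPrometheus());
    }
    res.writeHead(404);
    return res.end();
}).listen(9090);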


@@ -64,6 +64,8 @@ describe('ObjectMD class setters/getters', () => {
['IsNull', true],
['NullVersionId', null, undefined],
['NullVersionId', '111111'],
['NullUploadId', null, undefined],
['NullUploadId', 'abcdefghi'],
['IsDeleteMarker', null, false],
['IsDeleteMarker', true],
['VersionId', null, undefined],
@@ -73,6 +75,8 @@ describe('ObjectMD class setters/getters', () => {
key: 'value',
}],
['Tags', null, {}],
['UploadId', null, undefined],
['UploadId', 'abcdefghi'],
['ReplicationInfo', null, {
status: '',
backends: [],
@@ -307,13 +311,246 @@ describe('getAttributes static method', () => {
'location': true,
'isNull': true,
'nullVersionId': true,
'nullUploadId': true,
'isDeleteMarker': true,
'versionId': true,
'tags': true,
'uploadId': true,
'replicationInfo': true,
'dataStoreName': true,
'last-modified': true,
'md-model-version': true };
'md-model-version': true,
};
assert.deepStrictEqual(attributes, expectedResult);
});
});
describe('ObjectMD::getReducedLocations', () => {
it('should not alter an array when each part has only one element', () => {
const md = new ObjectMD();
const locations = [
{
key: 'd1d1e055b19eb5a61adb8a665e626ff589cff233',
size: 1,
start: 0,
dataStoreName: 'file',
dataStoreETag: '1:0e5a6f42662652d44fcf978399ef5709',
},
{
key: '4e67844b674b093a9e109d42172922ea1f32ec12',
size: 3,
start: 1,
dataStoreName: 'file',
dataStoreETag: '2:9ca655158ca025aa00a818b6b81f9e48',
},
];
md.setLocation(locations);
assert.deepStrictEqual(md.getReducedLocations(), locations);
});
it('should reduce an array when first part is > 1 element', () => {
const md = new ObjectMD();
md.setLocation([
{
key: 'd1d1e055b19eb5a61adb8a665e626ff589cff233',
size: 1,
start: 0,
dataStoreName: 'file',
dataStoreETag: '1:0e5a6f42662652d44fcf978399ef5709',
},
{
key: 'deebfb287cfcee1d137b0136562d2d776ba491e1',
size: 1,
start: 1,
dataStoreName: 'file',
dataStoreETag: '1:0e5a6f42662652d44fcf978399ef5709',
},
{
key: '4e67844b674b093a9e109d42172922ea1f32ec12',
size: 3,
start: 2,
dataStoreName: 'file',
dataStoreETag: '2:9ca655158ca025aa00a818b6b81f9e48',
},
]);
assert.deepStrictEqual(md.getReducedLocations(), [
{
key: 'deebfb287cfcee1d137b0136562d2d776ba491e1',
size: 2,
start: 0,
dataStoreName: 'file',
dataStoreETag: '1:0e5a6f42662652d44fcf978399ef5709',
},
{
key: '4e67844b674b093a9e109d42172922ea1f32ec12',
size: 3,
start: 2,
dataStoreName: 'file',
dataStoreETag: '2:9ca655158ca025aa00a818b6b81f9e48',
},
]);
});
it('should reduce an array when second part is > 1 element', () => {
const md = new ObjectMD();
md.setLocation([
{
key: 'd1d1e055b19eb5a61adb8a665e626ff589cff233',
size: 1,
start: 0,
dataStoreName: 'file',
dataStoreETag: '1:0e5a6f42662652d44fcf978399ef5709',
},
{
key: 'deebfb287cfcee1d137b0136562d2d776ba491e1',
size: 1,
start: 1,
dataStoreName: 'file',
dataStoreETag: '2:9ca655158ca025aa00a818b6b81f9e48',
},
{
key: '4e67844b674b093a9e109d42172922ea1f32ec12',
size: 3,
start: 2,
dataStoreName: 'file',
dataStoreETag: '2:9ca655158ca025aa00a818b6b81f9e48',
},
]);
assert.deepStrictEqual(md.getReducedLocations(), [
{
key: 'd1d1e055b19eb5a61adb8a665e626ff589cff233',
size: 1,
start: 0,
dataStoreName: 'file',
dataStoreETag: '1:0e5a6f42662652d44fcf978399ef5709',
},
{
key: '4e67844b674b093a9e109d42172922ea1f32ec12',
size: 4,
start: 1,
dataStoreName: 'file',
dataStoreETag: '2:9ca655158ca025aa00a818b6b81f9e48',
},
]);
});
it('should reduce an array when multiple parts are > 1 element', () => {
const md = new ObjectMD();
md.setLocation([
{
key: 'd1d1e055b19eb5a61adb8a665e626ff589cff233',
size: 1,
start: 0,
dataStoreName: 'file',
dataStoreETag: '1:0e5a6f42662652d44fcf978399ef5709',
},
{
key: 'c1c1e055b19eb5a61adb8a665e626ff589cff234',
size: 2,
start: 1,
dataStoreName: 'file',
dataStoreETag: '1:0e5a6f42662652d44fcf978399ef5709',
},
{
key: 'deebfb287cfcee1d137b0136562d2d776ba491e1',
size: 1,
start: 3,
dataStoreName: 'file',
dataStoreETag: '1:0e5a6f42662652d44fcf978399ef5709',
},
{
key: '8e67844b674b093a9e109d42172922ea1f32ec14',
size: 3,
start: 4,
dataStoreName: 'file',
dataStoreETag: '2:9ca655158ca025aa00a818b6b81f9e48',
},
{
key: 'd1d1e055b19eb5a61adb8a665e626ff589cff233',
size: 10,
start: 7,
dataStoreName: 'file',
dataStoreETag: '2:9ca655158ca025aa00a818b6b81f9e48',
},
{
key: '0e67844b674b093a9e109d42172922ea1f32ec11',
size: 10,
start: 17,
dataStoreName: 'file',
dataStoreETag: '2:9ca655158ca025aa00a818b6b81f9e48',
},
{
key: '8e67844b674b093a9e109d42172922ea1f32ec14',
size: 15,
start: 27,
dataStoreName: 'file',
dataStoreETag: '3:1ca655158ca025aa00a818b6b81f9e4c',
},
{
key: '7e67844b674b093a9e109d42172922ea1f32ec1f',
size: 2,
start: 42,
dataStoreName: 'file',
dataStoreETag: '3:1ca655158ca025aa00a818b6b81f9e4c',
},
{
key: '1237844b674b093a9e109d42172922ea1f32ec19',
size: 6,
start: 44,
dataStoreName: 'file',
dataStoreETag: '4:afa655158ca025aa00a818b6b81f9e4d',
},
{
key: '4567844b674b093a9e109d42172922ea1f32ec00',
size: 4,
start: 50,
dataStoreName: 'file',
dataStoreETag: '4:afa655158ca025aa00a818b6b81f9e4d',
},
{
key: '53d7844b674b093a9e109d42172922ea1f32ec02',
size: 9,
start: 54,
dataStoreName: 'file',
dataStoreETag: '4:afa655158ca025aa00a818b6b81f9e4d',
},
{
key: '6f6d7844b674b093a9e109d42172922ea1f32ec01',
size: 2,
start: 63,
dataStoreName: 'file',
dataStoreETag: '4:afa655158ca025aa00a818b6b81f9e4d',
},
]);
assert.deepStrictEqual(md.getReducedLocations(), [
{
key: 'deebfb287cfcee1d137b0136562d2d776ba491e1',
size: 4,
start: 0,
dataStoreName: 'file',
dataStoreETag: '1:0e5a6f42662652d44fcf978399ef5709',
},
{
key: '0e67844b674b093a9e109d42172922ea1f32ec11',
size: 23,
start: 4,
dataStoreName: 'file',
dataStoreETag: '2:9ca655158ca025aa00a818b6b81f9e48',
},
{
key: '7e67844b674b093a9e109d42172922ea1f32ec1f',
size: 17,
start: 27,
dataStoreName: 'file',
dataStoreETag: '3:1ca655158ca025aa00a818b6b81f9e4c',
},
{
key: '6f6d7844b674b093a9e109d42172922ea1f32ec01',
size: 21,
start: 44,
dataStoreName: 'file',
dataStoreETag: '4:afa655158ca025aa00a818b6b81f9e4d',
},
]);
});
});
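Every expected output above follows the same rule: contiguous parts sharing a dataStoreETag collapse into a single entry that keeps the last part's key, the first part's start offset, and the summed size. A sketch of that reduction (illustrative, not the library's implementation):

function reduceLocations(locations) {
    const reduced = [];
    locations.forEach(part => {
        const last = reduced[reduced.length - 1];
        if (last && last.dataStoreETag === part.dataStoreETag) {
            // same logical part: keep the latest key, grow the size
            last.key = part.key;
            last.size += part.size;
        } else {
            reduced.push(Object.assign({}, part));
        }
    });
    return reduced;
}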


@@ -166,7 +166,7 @@ describe('network.Server: ', () => {
if (err) {
return ws.onStop(() => {
clearTimeout(requestTimeout);
if (err.code === 'EPROTO') {
if (err.code === 'EPROTO' || err.code === 'ECONNRESET') {
return done();
}
return done(err);
@@ -208,7 +208,7 @@ describe('network.Server: ', () => {
assert.strictEqual(ws._server._connections, 1);
setTimeout(() => {
// client connection should have been closed after more than 1000ms
assert.strictEqual(ws._server.connections, 0);
assert.strictEqual(ws._server._connections, 0);
ws.stop();
ws.onStop(done);
}, 200);


@@ -0,0 +1,106 @@
const assert = require('assert');
const http = require('http');
const index = require('../../../../');
const { ProbeServer } = index.network.probe.ProbeServer;
function makeRequest(method, uri, cb) {
const params = {
hostname: 'localhost',
port: 4042,
path: uri,
method,
};
const req = http.request(params);
req.setNoDelay(true);
req.on('response', res => {
cb(undefined, res);
});
// pass socket errors to the callback; the tests assert on them
req.on('error', err => cb(err)).end();
}
describe('network.probe.ProbeServer', () => {
/** @type {ProbeServer} */
let server;
beforeEach(done => {
server = new ProbeServer({ port: 4042 });
server._cbOnListening = done;
server.start();
});
afterEach(done => {
server.stop();
done();
});
it('error on bad method', done => {
makeRequest('POST', '/unused', (err, res) => {
assert.ifError(err);
assert.strictEqual(res.statusCode, 405);
done();
});
});
it('error on invalid route', done => {
makeRequest('GET', '/unused', (err, res) => {
assert.ifError(err);
assert.strictEqual(res.statusCode, 400);
done();
});
});
it('does nothing if probe successful', done => {
server.addHandler('/check', res => {
res.writeHead(200);
res.end();
});
makeRequest('GET', '/check', (err, res) => {
assert.ifError(err);
assert.strictEqual(res.statusCode, 200);
done();
});
});
it('accepts array of paths', done => {
server.addHandler(['/check', '/probe'], res => {
res.writeHead(200);
res.end();
});
let calls = 0;
makeRequest('GET', '/check', (err, res) => {
assert.ifError(err);
assert.strictEqual(res.statusCode, 200);
calls++;
if (calls === 2) {
done();
}
});
makeRequest('GET', '/probe', (err, res) => {
assert.ifError(err);
assert.strictEqual(res.statusCode, 200);
calls++;
if (calls === 2) {
done();
}
});
});
it('500 response on bad probe', done => {
server.addHandler('/check', () => 'check failed');
makeRequest('GET', '/check', (err, res) => {
assert.ifError(err);
assert.strictEqual(res.statusCode, 500);
res.setEncoding('utf8');
res.on('data', body => {
assert.strictEqual(
body,
'{"errorType":"InternalError","errorMessage":"check failed"}',
);
done();
});
});
});
});
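These tests pin down the handler contract: a handler that completes the response reports success, while returning a non-empty string marks the probe as failed and produces a 500 response with an InternalError JSON body. A usage sketch; the port, routes, and backendIsReady readiness check are illustrative:

const index = require('../../../../');
const { ProbeServer } = index.network.probe.ProbeServer;

const server = new ProbeServer({ port: 4042 });
server.addHandler(['/live', '/health'], res => {
    // ending the response reports success
    res.writeHead(200);
    res.end();
});
server.addHandler('/ready', res => {
    if (!backendIsReady()) { // hypothetical readiness check
        return 'backend not ready'; // becomes a 500 InternalError
    }
    res.writeHead(200);
    res.end();
    return undefined;
});
server.start();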


@@ -59,6 +59,7 @@ describe('REST interface for blob data storage', () => {
after(done => {
server.stop();
client.destroy();
done();
});
@@ -114,10 +115,11 @@ describe('REST interface for blob data storage', () => {
assert(usage.total > 0);
subDone();
});
}],
err => {
done(err);
});
},
],
err => {
done(err);
});
});
});


@@ -12,8 +12,7 @@ describe('requestUtils.getClientIp', () => {
const testClientIp2 = '192.168.104.0';
const testProxyIp = '192.168.100.2';
it('should return client Ip address from header ' +
'if the request comes via proxies', () => {
it('should return client Ip address from header if the request comes via proxies', () => {
const request = new DummyRequest({
headers: {
'x-forwarded-for': [testClientIp1, testProxyIp].join(','),
@@ -28,13 +27,9 @@ describe('requestUtils.getClientIp', () => {
assert.strictEqual(result, testClientIp1);
});
it('should not return client Ip address from header ' +
'if the request is not forwarded from proxies or ' +
'fails ip check', () => {
it('should return client Ip address from socket info if the request is not forwarded from proxies', () => {
const request = new DummyRequest({
headers: {
'x-forwarded-for': [testClientIp1, testProxyIp].join(','),
},
headers: {},
url: '/',
parsedHost: 'localhost',
socket: {
@@ -45,12 +40,11 @@ describe('requestUtils.getClientIp', () => {
assert.strictEqual(result, testClientIp2);
});
it('should not return client Ip address from header ' +
'if the request is forwarded from proxies, but the request ' +
it('should not return client Ip address from header if the request is forwarded from proxies, but the request ' +
'has no expected header or the header value is empty', () => {
const request = new DummyRequest({
headers: {
'x-forwarded-for': ' ',
'x-forwarded-for': '',
},
url: '/',
parsedHost: 'localhost',
@@ -61,4 +55,37 @@ describe('requestUtils.getClientIp', () => {
const result = requestUtils.getClientIp(request, configWithProxy);
assert.strictEqual(result, testClientIp2);
});
it('should return client Ip address from header if the request comes via proxies and ' +
'no request config is available', () => {
const request = new DummyRequest({
headers: {
'x-forwarded-for': testClientIp1,
},
url: '/',
parsedHost: 'localhost',
socket: {
remoteAddress: testProxyIp,
},
});
const result = requestUtils.getClientIp(request, configWithoutProxy);
assert.strictEqual(result, testClientIp1);
});
it('should return client Ip address from socket info if the request comes via proxies and ' +
'request config is available and ip check fails', () => {
const dummyRemoteIP = '221.10.221.10';
const request = new DummyRequest({
headers: {
'x-forwarded-for': testClientIp1,
},
url: '/',
parsedHost: 'localhost',
socket: {
remoteAddress: dummyRemoteIP,
},
});
const result = requestUtils.getClientIp(request, configWithProxy);
assert.strictEqual(result, dummyRemoteIP);
});
});
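Taken together, these tests pin down an order of precedence for the client ip; a sketch of the decision flow, where hasRequestConfig and isRequestViaProxy are hypothetical helpers standing in for the library's configuration and ip checks:

function getClientIpSketch(request, config) {
    const header = (request.headers['x-forwarded-for'] || '')
        .split(',')[0].trim();
    const socketIp = request.socket.remoteAddress;
    if (!hasRequestConfig(config)) {
        // no request configuration: trust the header when present
        return header || socketIp;
    }
    if (header && isRequestViaProxy(request, config)) {
        // a proxy is configured and the socket address passes the
        // ip check: report the forwarded client address
        return header;
    }
    // otherwise report the socket address itself
    return socketIp;
}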


@@ -113,6 +113,24 @@ describe('record log - persistent log of metadata operations', () => {
recordStream.on('end', done);
});
});
it('should list an empty record log with null start', done => {
logProxy.readRecords({ startSeq: null }, (err, res) => {
assert.ifError(err);
const info = res.info;
const recordStream = res.log;
assert(info);
assert.strictEqual(info.start, null);
assert.strictEqual(info.end, null);
assert(recordStream);
recordStream.on('data', () => {
assert.fail('unexpected data event');
});
recordStream.on('end', done);
});
});
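/* A usage sketch (not a test of this suite) of reading records from
 * an explicit start sequence with the logProxy used above: res.info
 * delimits the listed sequence numbers and res.log streams records. */
function readFromSequence(startSeq, cb) {
    logProxy.readRecords({ startSeq }, (err, res) => {
        if (err) {
            return cb(err);
        }
        // res.info.start and res.info.end delimit the returned
        // sequence numbers; res.log emits one record per number
        res.log.on('data', record => debug('read record', record));
        return res.log.on('end', cb);
    });
}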
it('should be able to add records and list them thereafter', done => {
debug('going to append records');
const ops = [{ type: 'put', key: 'foo', value: 'bar',
@@ -210,6 +228,11 @@ describe('record log - persistent log of metadata operations', () => {
});
});
after(done => closeRecordLog(logProxy, () => {
logProxy = undefined;
done();
}));
function checkRecord(record, seq) {
assert.strictEqual(record.entries.length, 1);
assert.deepStrictEqual(record.db, 'foobucket');

yarn.lock (1526): file diff suppressed because it is too large