Compare commits


No commits in common. "c3099d8685e9f2c5f0dd1ae4df64a5794895078e" and "e52aa89c6aff530ac93624aa6eb2732748af88b7" have entirely different histories.

15 changed files with 44 additions and 544 deletions

View File

@@ -1 +0,0 @@
node_modules

View File

@@ -1,87 +0,0 @@
# General support information
GitHub Issues are **reserved** for actionable bug reports (including
documentation inaccuracies), and feature requests.
**All questions** (regarding configuration, use cases, performance, community,
events, setup and usage recommendations, among other things) should be asked on
the **[Zenko Forum](http://forum.zenko.io/)**.
> Questions opened as GitHub issues will systematically be closed, and moved to
> the [Zenko Forum](http://forum.zenko.io/).
--------------------------------------------------------------------------------
## Avoiding duplicates
When reporting a new issue/requesting a feature, make sure that we do not have
any duplicates already open:
- search the issue list for this repository (use the search bar, select
"Issues" on the left pane after searching);
- if there is a duplicate, please do not open your issue, and add a comment
to the existing issue instead.
--------------------------------------------------------------------------------
## Bug report information
(delete this section (everything between the lines) if you're not reporting a
bug but requesting a feature)
### Description
Briefly describe the problem you are having in a few paragraphs.
### Steps to reproduce the issue
Please provide steps to reproduce, including full log output
### Actual result
Describe the results you received
### Expected result
Describe the results you expected
### Additional information
- Node.js version,
- Docker version,
- npm version,
- distribution/OS,
- optional: anything else you deem helpful to us.
--------------------------------------------------------------------------------
## Feature Request
(delete this section (everything between the lines) if you're not requesting
a feature but reporting a bug)
### Proposal
Describe the feature
### Current behavior
What currently happens
### Desired behavior
What you would like to happen
### Use case
Please provide use cases for changing the current behavior
### Additional information
- Is this request for your company? Y/N
- If Y: Company name:
- Are you using any Scality Enterprise Edition products (RING, Zenko EE)? Y/N
- Are you willing to contribute this feature yourself?
- Position/Title:
- How did you hear about us?
--------------------------------------------------------------------------------

View File

@@ -1,15 +0,0 @@
FROM node:10-slim
WORKDIR /usr/src/app
COPY . /usr/src/app
RUN apt-get update \
&& apt-get install build-essential git g++ python -y \
&& npm install --production \
&& apt-get remove git g++ -y
ENTRYPOINT ["/usr/src/app/docker-entrypoint.sh"]
CMD [ "npm", "start" ]
EXPOSE 8100

View File

@@ -1,49 +0,0 @@
# Quickstart
## Server
Starting the Utapi server can be done in two ways:
[using NPM](#using-npm-2-minutes) or
[using Docker](#using-docker-5-minutes). Either method will start a server
locally, listening on port 8100.
### Using NPM (~2 minutes)
Please use node v10.16.0 (npm v6.9.0).
1. Install dependencies:
```
npm install
```
2. Start the server:
```
$ npm start
utapi@8.0.0 start /Users/repos/scality/utapi
node start-server.js
{"name":"Utapi","time":1562008743439,"id":0,"childPid":55156,"level":"info",
"message":"Worker started","hostname":"MacBook-Pro-2.local", "pid":55155}
...
```
### Using Docker (~5 minutes)
1. Build the image:
```
$ docker build --tag utapi .
Sending build context to Docker daemon 10.79MB
Step 1/7 : FROM node:10-slim
---> bce75035da07
...
Successfully built 5699ea8e7dec
```
2. Run the image:
```
$ docker run --publish 8100:8100 --detach utapi
25fea1a990b18e7f1f6c76cc5d0c5d564fd6bffb87e1acf5f724db16d602a5b5
```
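3. (Optional) Sanity-check that the server is listening. This is only a rough
check, assuming the server started cleanly: any HTTP response on port 8100,
even an error, means the process is up. Serving actual metrics also requires a
reachable Redis, which this standalone container does not include.
```
$ curl -i http://localhost:8100/
```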

View File

@@ -3,8 +3,9 @@
![Utapi logo](res/utapi-logo.png)
[![Circle CI][badgepub]](https://circleci.com/gh/scality/utapi)
[![Scality CI][badgepriv]](http://ci.ironmann.io/gh/scality/utapi)
Service Utilization API for tracking resource usage and metrics reporting.
Service Utilization API for tracking resource usage and metrics reporting
## Design
@@ -12,7 +13,11 @@ Please refer to the [design](/DESIGN.md) for more information.
## Server
Please see the [quickstart](/QUICKSTART.md) guide.
To run the server:
```
npm start
```
## Client
@@ -83,13 +88,13 @@ Server is running.
1. Create an IAM user
```
aws iam --endpoint-url <endpoint> create-user --user-name <user-name>
aws iam --endpoint-url <endpoint> create-user --user-name utapiuser
```
2. Create access key for the user
```
aws iam --endpoint-url <endpoint> create-access-key --user-name <user-name>
aws iam --endpoint-url <endpoint> create-access-key --user-name utapiuser
```
3. Define a managed IAM policy
@@ -198,11 +203,12 @@ Server is running.
5. Attach user to the managed policy
```
aws --endpoint-url <endpoint> iam attach-user-policy --user-name
<user-name> --policy-arn <policy arn>
aws --endpoint-url <endpoint> iam attach-user-policy --user-name utapiuser
--policy-arn <policy arn>
```
Now the user has access to ListMetrics request in Utapi on all buckets.
Now the user `utapiuser` has access to ListMetrics request in Utapi on all
buckets.
### Signing request with Auth V4
@@ -218,18 +224,16 @@ following urls for reference.
You may also view examples making a request with Auth V4 using various languages
and AWS SDKs [here](/examples).
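As a small illustrative sketch (not one of the official examples), a request
like the one in the Python sample shown earlier in this diff can be signed in
Node.js with the `aws4` package already listed in this repo's devDependencies.
The credentials, bucket name, and time range below are placeholders:

```javascript
// Sketch only: sign a ListMetrics request with the `aws4` package.
// Access key, secret key, bucket name, and time range are placeholders.
const http = require('http');
const aws4 = require('aws4');

const body = JSON.stringify({
    buckets: ['utapi-test'],
    // Must be 15-minute aligned, in milliseconds (see the timestamp section).
    timeRange: [1476231300000, 1476233099999],
});

const signed = aws4.sign({
    host: 'localhost:8100',
    method: 'POST',
    path: '/buckets?Action=ListMetrics&Version=20160815',
    service: 's3',
    region: 'us-east-1',
    headers: { 'Content-Type': 'application/x-amz-json-1.0' },
    body,
}, {
    accessKeyId: '<access-key>',
    secretAccessKey: '<secret-key>',
});

const req = http.request({
    hostname: 'localhost',
    port: 8100,
    method: signed.method,
    path: signed.path,
    headers: signed.headers,
}, res => {
    res.setEncoding('utf8');
    res.on('data', chunk => process.stdout.write(chunk));
});
req.end(body);
```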
Alternatively, you can use a nifty command line tool available in Scality's
CloudServer.
Alternatively, you can use a nifty command line tool available in Scality's S3.
You can git clone the CloudServer repo from here
https://github.com/scality/cloudserver and follow the instructions in the README
to install the dependencies.
You can git clone S3 repo from here https://github.com/scality/S3.git and follow
the instructions in README to install the dependencies.
If you have CloudServer running inside a docker container you can docker exec
into the CloudServer container as
If you have S3 running inside a docker container you can docker exec into the S3
container as
```
docker exec -it <container-id> bash
docker exec -it <container id> bash
```
and then run the command
@@ -267,7 +271,7 @@ Usage: list_metrics [options]
-v, --verbose
```
An example call to list metrics for a bucket `demo` to Utapi in a https enabled
A typical call to list metrics for a bucket `demo` to Utapi in a https enabled
deployment would be
```
@@ -279,7 +283,7 @@ Both start and end times are time expressed as UNIX epoch timestamps **expressed
in milliseconds**.
Keep in mind, since Utapi metrics are normalized to the nearest 15 min.
interval, start time and end time need to be in the specific format as follows.
interval, so start time and end time need to be in specific format as follows.
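As a rough sketch of that alignment rule (hypothetical helper names; the
README's own `getStartTimestamp` and `getEndTimestamp` examples follow in the
sections below):

```javascript
// Hypothetical helpers illustrating the 15-minute alignment described above.
function toStartTimestamp(t) {
    const d = new Date(t);
    // Round down to the nearest quarter hour; zero out seconds and ms.
    d.setMinutes(d.getMinutes() - (d.getMinutes() % 15), 0, 0);
    return d.getTime();
}

function toEndTimestamp(t) {
    // Last millisecond of the quarter hour containing t, e.g. mm:59:999.
    return toStartTimestamp(t) + 15 * 60 * 1000 - 1;
}
```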
#### Start time
@@ -293,7 +297,7 @@ Date: Tue Oct 11 2016 17:35:25 GMT-0700 (PDT)
Unix timestamp (milliseconds): 1476232525320
Here's an example JS method to get a start timestamp
Here's a typical JS method to get start timestamp
```javascript
function getStartTimestamp(t) {
@@ -313,7 +317,7 @@ seconds and milliseconds set to 59 and 999 respectively. So valid end timestamps
would look something like `09:14:59:999`, `09:29:59:999`, `09:44:59:999` and
`09:59:59:999`.
Here's an example JS method to get an end timestamp
Here's a typical JS method to get end timestamp
```javascript
function getEndTimestamp(t) {
@@ -338,3 +342,4 @@ In order to contribute, please follow the
https://github.com/scality/Guidelines/blob/master/CONTRIBUTING.md).
[badgepub]: http://circleci.com/gh/scality/utapi.svg?style=svg
[badgepriv]: http://ci.ironmann.io/gh/scality/utapi.svg?style=svg

View File

@@ -1,47 +0,0 @@
#!/bin/bash
# set -e stops the execution of a script if a command or pipeline has an error
set -e
# modifying config.json
JQ_FILTERS_CONFIG="."
if [[ "$LOG_LEVEL" ]]; then
if [[ "$LOG_LEVEL" == "info" || "$LOG_LEVEL" == "debug" || "$LOG_LEVEL" == "trace" ]]; then
JQ_FILTERS_CONFIG="$JQ_FILTERS_CONFIG | .log.logLevel=\"$LOG_LEVEL\""
echo "Log level has been modified to $LOG_LEVEL"
else
echo "The log level you provided is incorrect (info/debug/trace)"
fi
fi
if [[ "$WORKERS" ]]; then
JQ_FILTERS_CONFIG="$JQ_FILTERS_CONFIG | .workers=\"$WORKERS\""
fi
if [[ "$REDIS_HOST" ]]; then
JQ_FILTERS_CONFIG="$JQ_FILTERS_CONFIG | .redis.host=\"$REDIS_HOST\""
fi
if [[ "$REDIS_PORT" ]]; then
JQ_FILTERS_CONFIG="$JQ_FILTERS_CONFIG | .redis.port=\"$REDIS_PORT\""
fi
if [[ "$VAULTD_HOST" ]]; then
JQ_FILTERS_CONFIG="$JQ_FILTERS_CONFIG | .vaultd.host=\"$VAULTD_HOST\""
fi
if [[ "$VAULTD_PORT" ]]; then
JQ_FILTERS_CONFIG="$JQ_FILTERS_CONFIG | .vaultd.port=\"$VAULTD_PORT\""
fi
if [[ "$HEALTHCHECKS_ALLOWFROM" ]]; then
JQ_FILTERS_CONFIG="$JQ_FILTERS_CONFIG | .healthChecks.allowFrom=[\"$HEALTHCHECKS_ALLOWFROM\"]"
fi
if [[ $JQ_FILTERS_CONFIG != "." ]]; then
jq "$JQ_FILTERS_CONFIG" config.json > config.json.tmp
mv config.json.tmp config.json
fi
exec "$@"

View File

@@ -1,90 +0,0 @@
import sys, os, base64, datetime, hashlib, hmac, datetime, calendar, json
import requests # pip install requests
access_key = '9EQTVVVCLSSG6QBMNKO5'
secret_key = 'T5mK/skkkwJ/mTjXZnHyZ5UzgGIN=k9nl4dyTmDH'
method = 'POST'
service = 's3'
host = 'localhost:8100'
region = 'us-east-1'
canonical_uri = '/buckets'
canonical_querystring = 'Action=ListMetrics&Version=20160815'
content_type = 'application/x-amz-json-1.0'
algorithm = 'AWS4-HMAC-SHA256'
t = datetime.datetime.utcnow()
amz_date = t.strftime('%Y%m%dT%H%M%SZ')
date_stamp = t.strftime('%Y%m%d')
# Key derivation functions. See:
# http://docs.aws.amazon.com/general/latest/gr/signature-v4-examples.html#signature-v4-examples-python
def sign(key, msg):
return hmac.new(key, msg.encode("utf-8"), hashlib.sha256).digest()
def getSignatureKey(key, date_stamp, regionName, serviceName):
kDate = sign(('AWS4' + key).encode('utf-8'), date_stamp)
kRegion = sign(kDate, regionName)
kService = sign(kRegion, serviceName)
kSigning = sign(kService, 'aws4_request')
return kSigning
def get_start_time(t):
start = t.replace(minute=t.minute - t.minute % 15, second=0, microsecond=0)
return calendar.timegm(start.utctimetuple()) * 1000;
def get_end_time(t):
end = t.replace(minute=t.minute - t.minute % 15, second=0, microsecond=0)
return calendar.timegm(end.utctimetuple()) * 1000 - 1;
start_time = get_start_time(datetime.datetime(2016, 1, 1, 0, 0, 0, 0))
end_time = get_end_time(datetime.datetime(2016, 2, 1, 0, 0, 0, 0))
# Request parameters for listing Utapi bucket metrics--passed in a JSON block.
bucketListing = {
'buckets': [ 'utapi-test' ],
'timeRange': [ start_time, end_time ],
}
request_parameters = json.dumps(bucketListing)
payload_hash = hashlib.sha256(request_parameters).hexdigest()
canonical_headers = \
'content-type:{0}\nhost:{1}\nx-amz-content-sha256:{2}\nx-amz-date:{3}\n' \
.format(content_type, host, payload_hash, amz_date)
signed_headers = 'content-type;host;x-amz-content-sha256;x-amz-date'
canonical_request = '{0}\n{1}\n{2}\n{3}\n{4}\n{5}' \
.format(method, canonical_uri, canonical_querystring, canonical_headers,
signed_headers, payload_hash)
credential_scope = '{0}/{1}/{2}/aws4_request' \
.format(date_stamp, region, service)
string_to_sign = '{0}\n{1}\n{2}\n{3}' \
.format(algorithm, amz_date, credential_scope,
hashlib.sha256(canonical_request).hexdigest())
signing_key = getSignatureKey(secret_key, date_stamp, region, service)
signature = hmac.new(signing_key, (string_to_sign).encode('utf-8'),
hashlib.sha256).hexdigest()
authorization_header = \
'{0} Credential={1}/{2}, SignedHeaders={3}, Signature={4}' \
.format(algorithm, access_key, credential_scope, signed_headers, signature)
# The 'host' header is added automatically by the Python 'requests' library.
headers = {
'Content-Type': content_type,
'X-Amz-Content-Sha256': payload_hash,
'X-Amz-Date': amz_date,
'Authorization': authorization_header
}
endpoint = 'http://' + host + canonical_uri + '?' + canonical_querystring;
r = requests.post(endpoint, data=request_parameters, headers=headers)
print (r.text)

View File

@@ -1,7 +1,6 @@
'use strict'; // eslint-disable-line strict
module.exports = {
UtapiServer: require('./lib/server'),
UtapiServer: require('./lib/server.js'),
UtapiClient: require('./lib/UtapiClient.js'),
UtapiReplay: require('./lib/UtapiReplay.js'),
UtapiReindex: require('./lib/UtapiReindex.js'),

View File

@@ -77,17 +77,6 @@ class Datastore {
return this._client.incr(key, cb);
}
/**
* increment value of a key by the provided value
* @param {string} key - key holding the value
* @param {string} value - value containing the data
* @param {callback} cb - callback
* @return {undefined}
*/
incrby(key, value, cb) {
return this._client.incrby(key, value, cb);
}
/**
* decrement value of a key by 1
* @param {string} key - key holding the value

View File

@@ -64,7 +64,6 @@ const metricObj = {
buckets: 'bucket',
accounts: 'accountId',
users: 'userId',
location: 'location',
};
class UtapiClient {
@@ -88,28 +87,19 @@ class UtapiClient {
const api = (config || {}).logApi || werelogs;
this.log = new api.Logger('UtapiClient');
// By default, we push all resource types
this.metrics = ['buckets', 'accounts', 'users', 'service', 'location'];
this.metrics = ['buckets', 'accounts', 'users', 'service'];
this.service = 's3';
this.disableOperationCounters = false;
this.enabledOperationCounters = [];
this.disableClient = true;
if (config) {
this.disableClient = false;
this.expireMetrics = config.expireMetrics;
this.expireMetricsTTL = config.expireMetricsTTL || 0;
if (config.metrics) {
const message = 'invalid property in UtapiClient configuration';
assert(Array.isArray(config.metrics), `${message}: metrics ` +
'must be an array');
assert(config.metrics.length !== 0, `${message}: metrics ` +
'array cannot be empty');
// if location is the only metric, pushMetric should be disabled
if (config.metrics.length === 1 &&
config.metrics[0] === 'location') {
this.disableClient = true;
}
this.metrics = config.metrics;
}
if (config.redis) {
@@ -131,6 +121,9 @@ class UtapiClient {
if (config.enabledOperationCounters) {
this.enabledOperationCounters = config.enabledOperationCounters;
}
this.disableClient = false;
this.expireMetrics = config.expireMetrics;
this.expireMetricsTTL = config.expireMetricsTTL || 0;
}
}
@@ -1019,69 +1012,6 @@ class UtapiClient {
});
}
/**
*
* @param {string} location - name of data location
* @param {number} updateSize - size in bytes to update location metric by,
* could be negative, indicating deleted object
* @param {string} reqUid - Request Unique Identifier
* @param {function} callback - callback to call
* @return {undefined}
*/
pushLocationMetric(location, updateSize, reqUid, callback) {
const log = this.log.newRequestLoggerFromSerializedUids(reqUid);
const params = {
level: 'location',
service: 's3',
location,
};
this._checkMetricTypes(params);
const action = (updateSize < 0) ? 'decrby' : 'incrby';
const size = (updateSize < 0) ? -updateSize : updateSize;
return this.ds[action](generateKey(params, 'locationStorage'), size,
err => {
if (err) {
log.error('error pushing metric', {
method: 'UtapiClient.pushLocationMetric',
error: err,
});
return callback(errors.InternalError);
}
return callback();
});
}
/**
*
* @param {string} location - name of data backend to get metric for
* @param {string} reqUid - Request Unique Identifier
* @param {function} callback - callback to call
* @return {undefined}
*/
getLocationMetric(location, reqUid, callback) {
const log = this.log.newRequestLoggerFromSerializedUids(reqUid);
const params = {
level: 'location',
service: 's3',
location,
};
const redisKey = generateKey(params, 'locationStorage');
return this.ds.get(redisKey, (err, bytesStored) => {
if (err) {
log.error('error getting metric', {
method: 'UtapiClient: getLocationMetric',
error: err,
});
return callback(errors.InternalError);
}
// if err and bytesStored are null, key does not exist yet
if (bytesStored === null) {
return callback(null, 0);
}
return callback(null, bytesStored);
});
}
/**
* Get storage used by bucket/account/user/service
* @param {object} params - params for the metrics

View File

@@ -58,9 +58,9 @@ const keys = {
* @return {string} - prefix for the schema key
*/
function getSchemaPrefix(params, timestamp) {
const { bucket, accountId, userId, level, service, location } = params;
const { bucket, accountId, userId, level, service } = params;
// `service` property must remain last because other objects also include it
const id = bucket || accountId || userId || location || service;
const id = bucket || accountId || userId || service;
const prefix = timestamp ? `${service}:${level}:${timestamp}:${id}:` :
`${service}:${level}:${id}:`;
return prefix;
@@ -75,13 +75,9 @@ function getSchemaPrefix(params, timestamp) {
*/
function generateKey(params, metric, timestamp) {
const prefix = getSchemaPrefix(params, timestamp);
if (params.location) {
return `${prefix}locationStorage`;
}
return keys[metric](prefix);
}
/**
* Returns a list of the counters for a metric type
* @param {object} params - object with metric type and id as a property

View File

@@ -3,7 +3,7 @@
"engines": {
"node": ">=10"
},
"version": "8.0.0",
"version": "7.4.5",
"description": "API for tracking resource utilization and reporting metrics",
"main": "index.js",
"repository": {
@@ -17,19 +17,19 @@
},
"homepage": "https://github.com/scality/utapi#readme",
"dependencies": {
"arsenal": "scality/Arsenal#dd6fde6",
"arsenal": "scality/Arsenal#b03f5b8",
"async": "^2.0.1",
"ioredis": "^4.9.5",
"node-schedule": "1.2.0",
"uuid": "^3.3.2",
"vaultclient": "scality/vaultclient#cc9ba34",
"vaultclient": "scality/vaultclient#3eaaff2",
"werelogs": "scality/werelogs#4e0d97c"
},
"devDependencies": {
"aws4": "^1.8.0",
"eslint": "^2.4.0",
"eslint-config-airbnb": "^6.0.0",
"eslint-config-scality": "scality/Guidelines#b578e709",
"eslint-config-scality": "scality/Guidelines#20dfffcc",
"eslint-plugin-react": "^4.3.0",
"express": "^4.17.1",
"mocha": "^3.0.2"

View File

@@ -263,10 +263,6 @@ class Router {
*/
_processSecurityChecks(utapiRequest, route, cb) {
const log = utapiRequest.getLog();
if (process.env.UTAPI_AUTH === 'false') {
// Zenko route request does not need to go through Vault
return this._startRequest(utapiRequest, route, cb);
}
return this._authSquared(utapiRequest, err => {
if (err) {
log.trace('error from vault', { errors: err });

View File

@@ -21,9 +21,6 @@ const config = {
localCache: redisLocal,
component: 's3',
};
const location = 'foo-backend';
const incrby = 100;
const decrby = -30;
function isSortedSetKey(key) {
return key.endsWith('storageUtilized') || key.endsWith('numberOfObjects');
@@ -136,29 +133,6 @@ describe('UtapiClient:: _isCounterEnabled', () => {
});
});
function getLocationObject(bytesValue) {
const obj = {};
obj[`s3:location:${location}:locationStorage`] = `${bytesValue}`;
return obj;
}
function testLocationMetric(c, params, expected, cb) {
const { location, updateSize } = params;
if (updateSize) {
c.pushLocationMetric(location, updateSize, REQUID, err => {
assert.equal(err, null);
assert.deepStrictEqual(memoryBackend.data, expected);
return cb();
});
} else {
c.getLocationMetric(location, REQUID, (err, bytesStored) => {
assert.equal(err, null);
assert.strictEqual(bytesStored, expected);
return cb();
});
}
}
const tests = [
{
description: 'with no custom configuration',
@@ -703,27 +677,3 @@ tests.forEach(test => {
});
});
});
describe('UtapiClient:: location quota metrics', () => {
beforeEach(function beFn() {
this.currentTest.c = new UtapiClient(config);
this.currentTest.c.setDataStore(ds);
});
afterEach(() => memoryBackend.flushDb());
it('should increment location metric', function itFn(done) {
const expected = getLocationObject(incrby);
testLocationMetric(this.test.c, { location, updateSize: incrby },
expected, done);
});
it('should decrement location metric', function itFn(done) {
const expected = getLocationObject(decrby);
testLocationMetric(this.test.c, { location, updateSize: decrby },
expected, done);
});
it('should list location metric', function itFn(done) {
const expected = 0;
testLocationMetric(this.test.c, { location }, expected, done);
});
});

View File

@@ -146,23 +146,20 @@ arsenal@scality/Arsenal#9f2e74e:
optionalDependencies:
ioctl "2.0.0"
arsenal@scality/Arsenal#dd6fde6:
version "8.0.6"
resolved "https://codeload.github.com/scality/Arsenal/tar.gz/dd6fde61bbb7a799c822b8095c67f455c7c55e8f"
arsenal@scality/Arsenal#b03f5b8:
version "7.4.3"
resolved "https://codeload.github.com/scality/Arsenal/tar.gz/b03f5b80acd6acaaf8dd2d49cd955e6b95279ab3"
dependencies:
JSONStream "^1.0.0"
ajv "4.10.0"
async "~2.1.5"
bson "2.0.4"
debug "~2.3.3"
diskusage "^1.1.1"
fcntl "github:scality/node-fcntl"
ioredis "4.9.5"
ipaddr.js "1.2.0"
joi "^10.6"
level "~5.0.1"
level-sublevel "~6.6.5"
mongodb "^3.0.1"
node-forge "^0.7.1"
simple-glob "^0.1"
socket.io "~1.7.3"
@@ -268,16 +265,6 @@ browser-stdout@1.3.0:
resolved "https://registry.yarnpkg.com/browser-stdout/-/browser-stdout-1.3.0.tgz#f351d32969d32fa5d7a5567154263d928ae3bd1f"
integrity sha1-81HTKWnTL6XXpVZxVCY9korjvR8=
bson@2.0.4:
version "2.0.4"
resolved "https://registry.yarnpkg.com/bson/-/bson-2.0.4.tgz#5b8e40f80d3d1b96a2ae55311c6612c25e586e50"
integrity sha512-e/GPy6CE0xL7MOYYRMIEwPGKF21WNaQdPIpV0YvaQDoR7oc47KUZ8c2P/TlRJVQP8RZ4CEsArGBC1NbkCRvl1w==
bson@^1.1.1:
version "1.1.1"
resolved "https://registry.yarnpkg.com/bson/-/bson-1.1.1.tgz#4330f5e99104c4e751e7351859e2d408279f2f13"
integrity sha512-jCGVYLoYMHDkOsbwJZBCqwMHyH4c+wzgI9hG7Z6SZJRXWr+x58pdIbm2i9a/jFGCkRJqRUr8eoI7lDWa0hTkxg==
buffer-from@^1.0.0:
version "1.1.1"
resolved "https://registry.yarnpkg.com/buffer-from/-/buffer-from-1.1.1.tgz#32713bc028f75c02fdb710d7c7bcec1f2c6070ef"
@@ -723,9 +710,9 @@ eslint-config-airbnb@^6.0.0:
resolved "https://registry.yarnpkg.com/eslint-config-airbnb/-/eslint-config-airbnb-6.2.0.tgz#4a28196aa4617de01b8c914e992a82e5d0886a6e"
integrity sha1-SigZaqRhfeAbjJFOmSqC5dCIam4=
eslint-config-scality@scality/Guidelines#b578e709:
version "8.0.0"
resolved "https://codeload.github.com/scality/Guidelines/tar.gz/b578e709a511f5224e236ec865c85211b4d3c2db"
eslint-config-scality@scality/Guidelines#20dfffcc:
version "1.1.0"
resolved "https://codeload.github.com/scality/Guidelines/tar.gz/20dfffcc863bdb3c6bbc93b283e99a33b2fd6136"
dependencies:
commander "1.3.2"
markdownlint "0.0.8"
@@ -863,13 +850,6 @@ fast-levenshtein@~2.0.4:
resolved "https://registry.yarnpkg.com/fast-levenshtein/-/fast-levenshtein-2.0.6.tgz#3d8a5c66883a16a30ca8643e851f19baa7797917"
integrity sha1-PYpcZog6FqMMqGQ+hR8Zuqd5eRc=
"fcntl@github:scality/node-fcntl":
version "0.1.0"
resolved "https://codeload.github.com/scality/node-fcntl/tar.gz/9108603d8881d7762dcadfde1db927a1653dfda5"
dependencies:
bindings "^1.1.1"
nan "^2.3.2"
figures@^1.3.5:
version "1.7.0"
resolved "https://registry.yarnpkg.com/figures/-/figures-1.7.0.tgz#cbe1e3affcf1cd44b80cadfed28dc793a9701d2e"
@@ -1533,11 +1513,6 @@ media-typer@0.3.0:
resolved "https://registry.yarnpkg.com/media-typer/-/media-typer-0.3.0.tgz#8710d7af0aa626f8fffa1ce00168545263255748"
integrity sha1-hxDXrwqmJvj/+hzgAWhUUmMlV0g=
memory-pager@^1.0.2:
version "1.5.0"
resolved "https://registry.yarnpkg.com/memory-pager/-/memory-pager-1.5.0.tgz#d8751655d22d384682741c972f2c3d6dfa3e66b5"
integrity sha512-ZS4Bp4r/Zoeq6+NLJpP+0Zzm0pR8whtGPf1XExKLJBAczGMnSi3It14OiNCStjQjM6NU1okjQGSxgEZN8eBYKg==
merge-descriptors@1.0.1:
version "1.0.1"
resolved "https://registry.yarnpkg.com/merge-descriptors/-/merge-descriptors-1.0.1.tgz#b00aaa556dd8b44568150ec9d1b953f3f90cbb61"
@@ -1618,25 +1593,6 @@ mocha@^3.0.2:
mkdirp "0.5.1"
supports-color "3.1.2"
mongodb-core@3.2.7:
version "3.2.7"
resolved "https://registry.yarnpkg.com/mongodb-core/-/mongodb-core-3.2.7.tgz#a8ef1fe764a192c979252dacbc600dc88d77e28f"
integrity sha512-WypKdLxFNPOH/Jy6i9z47IjG2wIldA54iDZBmHMINcgKOUcWJh8og+Wix76oGd7EyYkHJKssQ2FAOw5Su/n4XQ==
dependencies:
bson "^1.1.1"
require_optional "^1.0.1"
safe-buffer "^5.1.2"
optionalDependencies:
saslprep "^1.0.0"
mongodb@^3.0.1:
version "3.2.7"
resolved "https://registry.yarnpkg.com/mongodb/-/mongodb-3.2.7.tgz#8ba149e4be708257cad0dea72aebb2bbb311a7ac"
integrity sha512-2YdWrdf1PJgxcCrT1tWoL6nHuk6hCxhddAAaEh8QJL231ci4+P9FLyqopbTm2Z2sAU6mhCri+wd9r1hOcHdoMw==
dependencies:
mongodb-core "3.2.7"
safe-buffer "^5.1.2"
ms@0.7.1:
version "0.7.1"
resolved "https://registry.yarnpkg.com/ms/-/ms-0.7.1.tgz#9cd13c03adbff25b65effde7ce864ee952017098"
@@ -1994,24 +1950,11 @@ require-uncached@^1.0.2:
caller-path "^0.1.0"
resolve-from "^1.0.0"
require_optional@^1.0.1:
version "1.0.1"
resolved "https://registry.yarnpkg.com/require_optional/-/require_optional-1.0.1.tgz#4cf35a4247f64ca3df8c2ef208cc494b1ca8fc2e"
integrity sha512-qhM/y57enGWHAe3v/NcwML6a3/vfESLe/sGM2dII+gEO0BpKRUkWZow/tyloNqJyN6kXSl3RyyM8Ll5D/sJP8g==
dependencies:
resolve-from "^2.0.0"
semver "^5.1.0"
resolve-from@^1.0.0:
version "1.0.1"
resolved "https://registry.yarnpkg.com/resolve-from/-/resolve-from-1.0.1.tgz#26cbfe935d1aeeeabb29bc3fe5aeb01e93d44226"
integrity sha1-Jsv+k10a7uq7Kbw/5a6wHpPUQiY=
resolve-from@^2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/resolve-from/-/resolve-from-2.0.0.tgz#9480ab20e94ffa1d9e80a804c7ea147611966b57"
integrity sha1-lICrIOlP+h2egKgEx+oUdhGWa1c=
restore-cursor@^1.0.1:
version "1.0.1"
resolved "https://registry.yarnpkg.com/restore-cursor/-/restore-cursor-1.0.1.tgz#34661f46886327fed2991479152252df92daa541"
@@ -2044,7 +1987,7 @@ safe-buffer@5.1.2, safe-buffer@~5.1.0, safe-buffer@~5.1.1:
resolved "https://registry.yarnpkg.com/safe-buffer/-/safe-buffer-5.1.2.tgz#991ec69d296e0313747d59bdfd2b745c35f8828d"
integrity sha512-Gd2UZBJDkXlY7GbJxfsE8/nvKkUEU1G38c1siN6QP6a9PT9MmHB8GnpscSmMJSoF8LOIrt8ud/wPtojys4G6+g==
safe-buffer@^5.1.2, safe-buffer@~5.2.0:
safe-buffer@~5.2.0:
version "5.2.0"
resolved "https://registry.yarnpkg.com/safe-buffer/-/safe-buffer-5.2.0.tgz#b74daec49b1148f88c64b68d49b1e815c1f2f519"
integrity sha512-fZEwUGbVl7kouZs1jCdMLdt95hdIv0ZeHg6L7qPeciMZhZ+/gdesW4wgTARkrFWEpspjEATAzUGPG8N2jJiwbg==
@@ -2059,23 +2002,11 @@ safe-json-stringify@1.0.3:
resolved "https://registry.yarnpkg.com/safer-buffer/-/safer-buffer-2.1.2.tgz#44fa161b0187b9549dd84bb91802f9bd8385cd6a"
integrity sha512-YZo3K82SD7Riyi0E1EQPojLz7kpepnSQI9IyPbHHg1XXXevb5dJI7tpyN2ADxGcQbHG7vcyRHk0cbwqcQriUtg==
saslprep@^1.0.0:
version "1.0.3"
resolved "https://registry.yarnpkg.com/saslprep/-/saslprep-1.0.3.tgz#4c02f946b56cf54297e347ba1093e7acac4cf226"
integrity sha512-/MY/PEMbk2SuY5sScONwhUDsV2p77Znkb/q3nSVstq/yQzYJOH/Azh29p9oJLsl3LnQwSvZDKagDGBsBwSooag==
dependencies:
sparse-bitfield "^3.0.3"
sax@>=0.6.0:
version "1.2.4"
resolved "https://registry.yarnpkg.com/sax/-/sax-1.2.4.tgz#2816234e2378bddc4e5354fab5caa895df7100d9"
integrity sha512-NqVDv9TpANUjFm0N8uM5GxL36UgKi9/atZw+x7YFnQ8ckwFGKrl4xX4yWtrey3UJm5nP1kUbnYgLopqWNSRhWw==
semver@^5.1.0:
version "5.7.0"
resolved "https://registry.yarnpkg.com/semver/-/semver-5.7.0.tgz#790a7cf6fea5459bac96110b29b60412dc8ff96b"
integrity sha512-Ya52jSX2u7QKghxeoFGpLwCtGlt7j0oY9DYb5apt9nPlJ42ID+ulTXESnt/qAQcoSERyZ5sl3LDIOw0nAn/5DA==
semver@~5.1.0:
version "5.1.1"
resolved "https://registry.yarnpkg.com/semver/-/semver-5.1.1.tgz#a3292a373e6f3e0798da0b20641b9a9c5bc47e19"
@@ -2187,13 +2118,6 @@ socket.io@~1.7.3:
socket.io-client "1.7.4"
socket.io-parser "2.3.1"
sparse-bitfield@^3.0.3:
version "3.0.3"
resolved "https://registry.yarnpkg.com/sparse-bitfield/-/sparse-bitfield-3.0.3.tgz#ff4ae6e68656056ba4b3e792ab3334d38273ca11"
integrity sha1-/0rm5oZWBWuks+eSqzM004JzyhE=
dependencies:
memory-pager "^1.0.2"
sprintf-js@~1.0.2:
version "1.0.3"
resolved "https://registry.yarnpkg.com/sprintf-js/-/sprintf-js-1.0.3.tgz#04e6926f662895354f3dd015203633b857297e2c"
@@ -2419,9 +2343,9 @@ vary@~1.1.2:
resolved "https://registry.yarnpkg.com/vary/-/vary-1.1.2.tgz#2299f02c6ded30d4a5961b0b9f74524a18f634fc"
integrity sha1-IpnwLG3tMNSllhsLn3RSShj2NPw=
vaultclient@scality/vaultclient#cc9ba34:
vaultclient@scality/vaultclient#3eaaff2:
version "7.4.0"
resolved "https://codeload.github.com/scality/vaultclient/tar.gz/cc9ba34b9abf3aac8324e912671547f5fc31baf4"
resolved "https://codeload.github.com/scality/vaultclient/tar.gz/3eaaff280e097e012a1c49ffcf52e816304f25ed"
dependencies:
arsenal scality/Arsenal#9f2e74e
commander "2.20.0"