Compare commits


45 Commits

Author SHA1 Message Date
Guillaume Hivert 7ec4bcd1d2 ARSN-99 Fix @hapi/joi to joi 2022-03-09 16:30:36 +01:00
Guillaume Hivert 4e8a4c7186 ARSN-99 MongoDB memory server 2022-03-09 16:27:48 +01:00
Guillaume Hivert 16a2c93c49 ARSN-99 Fix Bugs 2022-03-09 15:30:50 +01:00
Guillaume Hivert 9ca6113e29 ARSN-99 Type constants 2022-03-09 15:00:18 +01:00
Guillaume Hivert 2486efe2f6 ARSN-99 Type auth/backends/ChainBackend 2022-03-09 14:55:39 +01:00
Guillaume Hivert eff561c465 ARSN-99 Type auth/backends/BaseBackend 2022-03-09 14:54:22 +01:00
Guillaume Hivert d2a917bcec ARSN-99 Moving auth/in_memory to auth/backends/in_memory 2022-03-09 14:52:34 +01:00
Guillaume Hivert 50d929c008 ARSN-99 Trigger merge to development/8.1 2022-03-09 14:45:56 +01:00
Guillaume Hivert f94256a878 ARSN-108 Fix test suites 2022-03-09 11:23:24 +01:00
Guillaume Hivert a86d78e6ca ARSN-108 Type auth/auth 2022-03-09 11:23:19 +01:00
Guillaume Hivert cb551ea6b8 ARSN-108 Type auth/v4 2022-03-09 11:23:14 +01:00
Guillaume Hivert 223e4873ec ARSN-103 Add @types/utf8 2022-03-09 11:23:14 +01:00
Guillaume Hivert 705821cb5e ARSN-103 Type auth/v2 2022-03-09 11:23:14 +01:00
Guillaume Hivert 3de2a6428c ARSN-108 Type auth/Vault 2022-03-09 11:23:09 +01:00
Guillaume Hivert db6d328601 ARSN-108 Type constants 2022-03-09 11:23:03 +01:00
Guillaume Hivert e008afb49a ARSN-108 Rename constants 2022-03-09 11:22:58 +01:00
Guillaume Hivert e069662fc1 ARSN-108 Type auth/AuthInfo 2022-03-09 11:22:52 +01:00
Guillaume Hivert 4993ebe5f5 ARSN-108 Type auth/in_memory/Indexer 2022-03-09 11:22:45 +01:00
Guillaume Hivert 99d66f6da0 ARSN-108 Type auth/in_memory/validateAuthConfig 2022-03-09 11:22:40 +01:00
Guillaume Hivert 6697b36780 ARSN-108 Add @types/async 2022-03-09 11:22:33 +01:00
Guillaume Hivert 196a31bdb8 ARSN-108 Type auth/in_memory/Backend 2022-03-09 11:22:27 +01:00
Guillaume Hivert e2f709d848 ARSN-108 Add types for auth/in_memory/vaultUtilities 2022-03-09 11:22:20 +01:00
Guillaume Hivert 5b70a76018 ARSN-108 Add simple-glob interface 2022-03-09 11:22:14 +01:00
Guillaume Hivert 759cfa5c39 ARSN-108 Type auth/in_memory/AuthLoader 2022-03-09 11:22:06 +01:00
Guillaume Hivert dfaedf258e ARSN-108 Rename all files to TS 2022-03-09 11:21:21 +01:00
Guillaume Hivert 2af82a2852 ARSN-108 Upgrade joi to fully support type-checking 2022-03-09 11:21:07 +01:00
Guillaume Hivert a951072ec4 ARSN-67 Remove now useless errors/types.ts and add some docstring 2022-03-08 10:55:01 +01:00
Guillaume Hivert 726d6bbc6f ARSN-67 Fix typo in tests 2022-03-08 10:54:37 +01:00
Guillaume Hivert 9854a4d31c ARSN-67 Inline entries 2022-03-08 10:43:59 +01:00
Guillaume Hivert 9a9f7cf5eb ARSN-67 Switch from import http to import type { ServerResponse } 2022-03-08 10:41:05 +01:00
Guillaume Hivert 508f7bf5ae ARSN-67 Fix all tests 2022-03-04 14:57:14 +01:00
Guillaume Hivert 91c6c991e6 ARSN-67 Change errors.spec.js to errors.spec.ts 2022-03-04 14:46:35 +01:00
Guillaume Hivert 494da87998 ARSN-67 Switch errors to TS 2022-03-04 14:46:22 +01:00
Guillaume Hivert 5389fa40e3 ARSN-67 Rename errors and arsenalErrors 2022-03-04 14:42:11 +01:00
Guillaume Hivert d50ba76793 ARSN-67 Upload Artifacts 2022-03-04 12:13:57 +01:00
Guillaume Hivert 19c83f9d2e ARSN-67 Switch index.ts to import/export and fix JSON import in policyValidator 2022-03-04 11:49:12 +01:00
Guillaume Hivert fc35b118e3 ARSN-67 Rename index.js to index.ts for proper future migration 2022-03-04 11:30:18 +01:00
Guillaume Hivert d42a808abe ARSN-67 Remove ignore of build for NPM
Installing from git sources in dependent projects produced only an
index.js file. This was due to .gitignore ignoring the build folder and
npm/yarn removing the ignored files after install. Adding an empty
.npmignore solves the problem, as explained here:
https://stackoverflow.com/questions/61754026/installing-npm-package-with-prepare-script-from-yarn-produces-only-index-js
2022-03-04 11:29:32 +01:00
Guillaume Hivert c37035cc9e ARSN-67 Add TypeScript and Babel, and make test suite working 2022-03-04 11:27:07 +01:00
Guillaume Hivert c87627f28a ARSN-84 Correct Jest configuration for test suites and coverage
Thanks to the file renaming, we can stay as close as possible to the
Jest default configuration. Most options are gone; we now specify only
maxWorkers (because the test suite is linear, and bugs out when run in
parallel) and the files to collect coverage from.
The coverage script itself is merged into one command instead of three
to leverage Jest's built-in coverage.
2022-03-04 10:02:16 +01:00
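For context, a minimal sketch of what such a Jest configuration could look like (maxWorkers and collectCoverageFrom are standard Jest options; the glob is assumed, since the config file itself is not shown here):
```
// jest.config.js — illustrative sketch, not the repository's actual file
module.exports = {
    // the test suite is linear and breaks when run in parallel
    maxWorkers: 1,
    // assumed coverage glob for mixed JS/TS sources
    collectCoverageFrom: ['lib/**/*.{js,ts}'],
};
```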
Guillaume Hivert 21326b0ea1 ARSN-84 Rename all test files from [name].js to [name].spec.js
In order to simplify the Jest configuration, we have to rename the
files to follow the Jest convention (a .spec.js extension for test
files).
2022-03-04 10:02:16 +01:00
Guillaume Hivert 1ad20826cd ARSN-84 Fix Jest bug in _arsenalError
You can check out the bug at
https://github.com/facebook/jest/issues/2549.
The bug is inherent to Jest and has been known for years: Jest switches
the VM from Node to its own custom VM and injects its own set of
globals. The Error provided by Jest is different from the Error
provided by Node, so the test `err instanceof Error` is false.
Error:
```
 Expected value to be equal to:
      true
 Received:
      false
```
2022-03-04 10:02:16 +01:00
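For readers unfamiliar with the issue, a self-contained repro of the cross-realm `instanceof` problem, using Node's vm module to stand in for Jest's custom VM (this is not Arsenal's actual test code):
```
const vm = require('vm');
const assert = require('assert');

// An Error built in another VM context uses that context's Error global.
const err = vm.runInNewContext('new Error("boom")');

assert.strictEqual(err instanceof Error, false); // instanceof fails across realms
assert.strictEqual(Object.prototype.toString.call(err), '[object Error]'); // yet it is an Error
```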
Guillaume Hivert aeac58560e ARSN-84 Fix redis commands in functional tests
The switch from mocha to Jest introduced some test bugs.
As far as we can tell, Jest is quicker than mocha, which creates some
weird behaviour: some commands sent to redis (with ioredis)
work, and some don't. Our conclusion is that redis needs
to queue requests offline to ride out micro-disconnections from
redis in development. Otherwise, we get the following error:
```
  - StatsModel class › should correctly record a new request by default
one increment

    assert.ifError(received, expected)

    Expected value ifError to:
      null
    Received:
      [Error: Stream isn't writeable and enableOfflineQueue options is
false]

    Message:
      ifError got unwanted exception: Stream isn't writeable and
enableOfflineQueue options is false
```
Switching enableOfflineQueue to true makes the test suite pass.
2022-03-04 10:02:16 +01:00
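A sketch of the ioredis client option this refers to (enableOfflineQueue is a real ioredis option; host and port are placeholders):
```
const Redis = require('ioredis');

// Queue commands locally while the connection is (re)establishing,
// instead of failing immediately with "Stream isn't writeable".
const redisClient = new Redis({
    host: 'localhost', // placeholder
    port: 6379, // placeholder
    enableOfflineQueue: true,
});
```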
Guillaume Hivert 940fa18b72 ARSN-84 Fix linting with correct indentation and trailing commas 2022-03-04 10:02:16 +01:00
Guillaume Hivert f0a2fbb47c ARSN-84 Introduce Jest and reconfigure ESLint
Add Jest as a test runner to replace mocha, so that TS compiles on
the fly and mixed TS/JS sources are allowed (replacing mocha's
before and after with Jest's beforeAll and afterAll, as sketched
below), and add some ESLint configuration to keep ESLint happy.
2022-03-03 18:44:17 +01:00
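A generic illustration of that hook mapping (not the project's actual test code):
```
// mocha's suite-level hooks map one-to-one onto Jest's:
beforeAll(() => {
    // was mocha's before(): one-time setup for the whole suite
});
afterAll(() => {
    // was mocha's after(): one-time teardown
});
```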
159 changed files with 4294 additions and 3178 deletions


@@ -32,6 +32,7 @@ jobs:
cache: 'yarn'
- name: install dependencies
run: yarn cache clean && yarn install --frozen-lockfile
continue-on-error: true # TODO ARSN-97 Remove it when no errors in TS
- name: lint yaml
run: yarn --silent lint_yml
- name: lint javascript
@@ -49,3 +50,31 @@ jobs:
- name: run executables tests
run: yarn install && yarn test
working-directory: 'lib/executables/pensieveCreds/'
compile:
name: Compile and upload build artifacts
needs: test
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Install NodeJS
uses: actions/setup-node@v2
with:
node-version: '16'
cache: yarn
- name: Install dependencies
run: yarn cache clean && yarn install --frozen-lockfile
continue-on-error: true # TODO ARSN-97 Remove it when no errors in TS
- name: Compile
run: yarn build
continue-on-error: true # TODO ARSN-97 Remove it when no errors in TS
- name: Upload artifacts
uses: scality/action-artifacts@v2
with:
url: https://artifacts.scality.net
user: ${{ secrets.ARTIFACTS_USER }}
password: ${{ secrets.ARTIFACTS_PASSWORD }}
source: ./build
method: upload
if: success()

.gitignore (vendored) — 3 lines changed

@@ -13,3 +13,6 @@ node_modules/
# Coverage
coverage/
.nyc_output/
# TypeScript
build/

.npmignore (new, empty file)

babel.config.js (new file) — 6 lines added

@@ -0,0 +1,6 @@
module.exports = {
presets: [
['@babel/preset-env', { targets: { node: 'current' } }],
'@babel/preset-typescript',
],
};


@@ -1,773 +0,0 @@
{
"_comment": "------------------- Amazon errors ------------------",
"AccessDenied": {
"code": 403,
"description": "Access Denied"
},
"AccessForbidden": {
"code": 403,
"description": "Access Forbidden"
},
"AccountProblem": {
"code": 403,
"description": "There is a problem with your AWS account that prevents the operation from completing successfully. Please use Contact Us."
},
"AmbiguousGrantByEmailAddress": {
"code": 400,
"description": "The email address you provided is associated with more than one account."
},
"BadDigest": {
"code": 400,
"description": "The Content-MD5 you specified did not match what we received."
},
"BucketAlreadyExists": {
"code": 409,
"description": "The requested bucket name is not available. The bucket namespace is shared by all users of the system. Please select a different name and try again."
},
"BucketAlreadyOwnedByYou": {
"code": 409,
"description": "Your previous request to create the named bucket succeeded and you already own it. You get this error in all AWS regions except US Standard, us-east-1. In us-east-1 region, you will get 200 OK, but it is no-op (if bucket exists S3 will not do anything)."
},
"BucketNotEmpty": {
"code": 409,
"description": "The bucket you tried to delete is not empty."
},
"CredentialsNotSupported": {
"code": 400,
"description": "This request does not support credentials."
},
"CrossLocationLoggingProhibited": {
"code": 403,
"description": "Cross-location logging not allowed. Buckets in one geographic location cannot log information to a bucket in another location."
},
"DeleteConflict": {
"code": 409,
"description": "The request was rejected because it attempted to delete a resource that has attached subordinate entities. The error message describes these entities."
},
"EntityTooSmall": {
"code": 400,
"description": "Your proposed upload is smaller than the minimum allowed object size."
},
"EntityTooLarge": {
"code": 400,
"description": "Your proposed upload exceeds the maximum allowed object size."
},
"ExpiredToken": {
"code": 400,
"description": "The provided token has expired."
},
"HttpHeadersTooLarge": {
"code": 400,
"description": "Your http headers exceed the maximum allowed http headers size."
},
"IllegalVersioningConfigurationException": {
"code": 400,
"description": "Indicates that the versioning configuration specified in the request is invalid."
},
"IncompleteBody": {
"code": 400,
"description": "You did not provide the number of bytes specified by the Content-Length HTTP header."
},
"IncorrectNumberOfFilesInPostRequest": {
"code": 400,
"description": "POST requires exactly one file upload per request."
},
"InlineDataTooLarge": {
"code": 400,
"description": "Inline data exceeds the maximum allowed size."
},
"InternalError": {
"code": 500,
"description": "We encountered an internal error. Please try again."
},
"InvalidAccessKeyId": {
"code": 403,
"description": "The AWS access key Id you provided does not exist in our records."
},
"InvalidAddressingHeader": {
"code": 400,
"description": "You must specify the Anonymous role."
},
"InvalidArgument": {
"code": 400,
"description": "Invalid Argument"
},
"InvalidBucketName": {
"code": 400,
"description": "The specified bucket is not valid."
},
"InvalidBucketState": {
"code": 409,
"description": "The request is not valid with the current state of the bucket."
},
"InvalidDigest": {
"code": 400,
"description": "The Content-MD5 you specified is not valid."
},
"InvalidEncryptionAlgorithmError": {
"code": 400,
"description": "The encryption request you specified is not valid. The valid value is AES256."
},
"InvalidLocationConstraint": {
"code": 400,
"description": "The specified location constraint is not valid."
},
"InvalidObjectState": {
"code": 403,
"description": "The operation is not valid for the current state of the object."
},
"InvalidPart": {
"code": 400,
"description": "One or more of the specified parts could not be found. The part might not have been uploaded, or the specified entity tag might not have matched the part's entity tag."
},
"InvalidPartOrder": {
"code": 400,
"description": "The list of parts was not in ascending order.Parts list must specified in order by part number."
},
"InvalidPartNumber": {
"code": 416,
"description": "The requested partnumber is not satisfiable."
},
"InvalidPayer": {
"code": 403,
"description": "All access to this object has been disabled."
},
"InvalidPolicyDocument": {
"code": 400,
"description": "The content of the form does not meet the conditions specified in the policy document."
},
"InvalidRange": {
"code": 416,
"description": "The requested range cannot be satisfied."
},
"InvalidRedirectLocation": {
"code": 400,
"description": "The website redirect location must have a prefix of 'http://' or 'https://' or '/'."
},
"InvalidRequest": {
"code": 400,
"description": "SOAP requests must be made over an HTTPS connection."
},
"InvalidSecurity": {
"code": 403,
"description": "The provided security credentials are not valid."
},
"InvalidSOAPRequest": {
"code": 400,
"description": "The SOAP request body is invalid."
},
"InvalidStorageClass": {
"code": 400,
"description": "The storage class you specified is not valid."
},
"InvalidTag": {
"code": 400,
"description": "The Tag you have provided is invalid"
},
"InvalidTargetBucketForLogging": {
"code": 400,
"description": "The target bucket for logging does not exist, is not owned by you, or does not have the appropriate grants for the log-delivery group."
},
"InvalidToken": {
"code": 400,
"description": "The provided token is malformed or otherwise invalid."
},
"InvalidURI": {
"code": 400,
"description": "Couldn't parse the specified URI."
},
"KeyTooLong": {
"code": 400,
"description": "Your key is too long."
},
"LimitExceeded": {
"code": 409,
"description": " The request was rejected because it attempted to create resources beyond the current AWS account limits. The error message describes the limit exceeded."
},
"MalformedACLError": {
"code": 400,
"description": "The XML you provided was not well-formed or did not validate against our published schema."
},
"MalformedPOSTRequest": {
"code": 400,
"description": "The body of your POST request is not well-formed multipart/form-data."
},
"MalformedXML": {
"code": 400,
"description": "The XML you provided was not well-formed or did not validate against our published schema."
},
"MaxMessageLengthExceeded": {
"code": 400,
"description": "Your request was too big."
},
"MaxPostPreDataLengthExceededError": {
"code": 400,
"description": "Your POST request fields preceding the upload file were too large."
},
"MetadataTooLarge": {
"code": 400,
"description": "Your metadata headers exceed the maximum allowed metadata size."
},
"MethodNotAllowed": {
"code": 405,
"description": "The specified method is not allowed against this resource."
},
"MissingAttachment": {
"code": 400,
"description": "A SOAP attachment was expected, but none were found."
},
"MissingContentLength": {
"code": 411,
"description": "You must provide the Content-Length HTTP header."
},
"MissingRequestBodyError": {
"code": 400,
"description": "Request body is empty"
},
"MissingRequiredParameter": {
"code": 400,
"description": "Your request is missing a required parameter."
},
"MissingSecurityElement": {
"code": 400,
"description": "The SOAP 1.1 request is missing a security element."
},
"MissingSecurityHeader": {
"code": 400,
"description": "Your request is missing a required header."
},
"NoLoggingStatusForKey": {
"code": 400,
"description": "There is no such thing as a logging status subresource for a key."
},
"NoSuchBucket": {
"code": 404,
"description": "The specified bucket does not exist."
},
"NoSuchCORSConfiguration": {
"code": 404,
"description": "The CORS configuration does not exist"
},
"NoSuchKey": {
"code": 404,
"description": "The specified key does not exist."
},
"NoSuchLifecycleConfiguration": {
"code": 404,
"description": "The lifecycle configuration does not exist."
},
"NoSuchObjectLockConfiguration": {
"code": 404,
"description": "The specified object does not have a ObjectLock configuration."
},
"NoSuchWebsiteConfiguration": {
"code": 404,
"description": "The specified bucket does not have a website configuration"
},
"NoSuchUpload": {
"code": 404,
"description": "The specified multipart upload does not exist. The upload ID might be invalid, or the multipart upload might have been aborted or completed."
},
"NoSuchVersion": {
"code": 404,
"description": "Indicates that the version ID specified in the request does not match an existing version."
},
"ReplicationConfigurationNotFoundError": {
"code": 404,
"description": "The replication configuration was not found"
},
"ObjectLockConfigurationNotFoundError": {
"code": 404,
"description": "The object lock configuration was not found"
},
"ServerSideEncryptionConfigurationNotFoundError" : {
"code": 404,
"description": "The server side encryption configuration was not found"
},
"NotImplemented": {
"code": 501,
"description": "A header you provided implies functionality that is not implemented."
},
"NotModified": {
"code": 304,
"description": "Not Modified."
},
"NotSignedUp": {
"code": 403,
"description": "Your account is not signed up for the S3 service. You must sign up before you can use S3. "
},
"NoSuchBucketPolicy": {
"code": 404,
"description": "The specified bucket does not have a bucket policy."
},
"OperationAborted": {
"code": 409,
"description": "A conflicting conditional operation is currently in progress against this resource. Try again."
},
"PermanentRedirect": {
"code": 301,
"description": "The bucket you are attempting to access must be addressed using the specified endpoint. Send all future requests to this endpoint."
},
"PreconditionFailed": {
"code": 412,
"description": "At least one of the preconditions you specified did not hold."
},
"Redirect": {
"code": 307,
"description": "Temporary redirect."
},
"RestoreAlreadyInProgress": {
"code": 409,
"description": "Object restore is already in progress."
},
"RequestIsNotMultiPartContent": {
"code": 400,
"description": "Bucket POST must be of the enclosure-type multipart/form-data."
},
"RequestTimeout": {
"code": 400,
"description": "Your socket connection to the server was not read from or written to within the timeout period."
},
"RequestTimeTooSkewed": {
"code": 403,
"description": "The difference between the request time and the server's time is too large."
},
"RequestTorrentOfBucketError": {
"code": 400,
"description": "Requesting the torrent file of a bucket is not permitted."
},
"SignatureDoesNotMatch": {
"code": 403,
"description": "The request signature we calculated does not match the signature you provided."
},
"_comment" : {
"note" : "This is an AWS S3 specific error. We are opting to use the more general 'ServiceUnavailable' error used throughout AWS (IAM/EC2) to have uniformity of error messages even though we are potentially compromising S3 compatibility.",
"ServiceUnavailable": {
"code": 503,
"description": "Reduce your request rate."
}
},
"ServiceUnavailable": {
"code": 503,
"description": "The request has failed due to a temporary failure of the server."
},
"SlowDown": {
"code": 503,
"description": "Reduce your request rate."
},
"TemporaryRedirect": {
"code": 307,
"description": "You are being redirected to the bucket while DNS updates."
},
"TokenRefreshRequired": {
"code": 400,
"description": "The provided token must be refreshed."
},
"TooManyBuckets": {
"code": 400,
"description": "You have attempted to create more buckets than allowed."
},
"TooManyParts": {
"code": 400,
"description": "You have attempted to upload more parts than allowed."
},
"UnexpectedContent": {
"code": 400,
"description": "This request does not support content."
},
"UnresolvableGrantByEmailAddress": {
"code": 400,
"description": "The email address you provided does not match any account on record."
},
"UserKeyMustBeSpecified": {
"code": 400,
"description": "The bucket POST must contain the specified field name. If it is specified, check the order of the fields."
},
"NoSuchEntity": {
"code": 404,
"description": "The request was rejected because it referenced an entity that does not exist. The error message describes the entity."
},
"WrongFormat": {
"code": 400,
"description": "Data entered by the user has a wrong format."
},
"Forbidden": {
"code": 403,
"description": "Authentication failed."
},
"EntityDoesNotExist": {
"code": 404,
"description": "Not found."
},
"EntityAlreadyExists": {
"code": 409,
"description": "The request was rejected because it attempted to create a resource that already exists."
},
"KeyAlreadyExists": {
"code": 409,
"description": "The request was rejected because it attempted to create a resource that already exists."
},
"ServiceFailure": {
"code": 500,
"description": "Server error: the request processing has failed because of an unknown error, exception or failure."
},
"IncompleteSignature": {
"code": 400,
"description": "The request signature does not conform to AWS standards."
},
"InternalFailure": {
"code": 500,
"description": "The request processing has failed because of an unknown error, exception or failure."
},
"InvalidAction": {
"code": 400,
"description": "The action or operation requested is invalid. Verify that the action is typed correctly."
},
"InvalidClientTokenId": {
"code": 403,
"description": "The X.509 certificate or AWS access key ID provided does not exist in our records."
},
"InvalidParameterCombination": {
"code": 400,
"description": "Parameters that must not be used together were used together."
},
"InvalidParameterValue": {
"code": 400,
"description": "An invalid or out-of-range value was supplied for the input parameter."
},
"InvalidQueryParameter": {
"code": 400,
"description": "The AWS query string is malformed or does not adhere to AWS standards."
},
"MalformedQueryString": {
"code": 404,
"description": "The query string contains a syntax error."
},
"MissingAction": {
"code": 400,
"description": "The request is missing an action or a required parameter."
},
"MissingAuthenticationToken": {
"code": 403,
"description": "The request must contain either a valid (registered) AWS access key ID or X.509 certificate."
},
"MissingParameter": {
"code": 400,
"description": "A required parameter for the specified action is not supplied."
},
"OptInRequired": {
"code": 403,
"description": "The AWS access key ID needs a subscription for the service."
},
"RequestExpired": {
"code": 400,
"description": "The request reached the service more than 15 minutes after the date stamp on the request or more than 15 minutes after the request expiration date (such as for pre-signed URLs), or the date stamp on the request is more than 15 minutes in the future."
},
"Throttling": {
"code": 400,
"description": "The request was denied due to request throttling."
},
"AccountNotFound": {
"code": 404,
"description": "No account was found in Vault, please contact your system administrator."
},
"ValidationError": {
"code": 400,
"description": "The specified value is invalid."
},
"MalformedPolicyDocument": {
"code": 400,
"description": "Syntax errors in policy."
},
"InvalidInput": {
"code": 400,
"description": "The request was rejected because an invalid or out-of-range value was supplied for an input parameter."
},
"MalformedPolicy": {
"code": 400,
"description": "This policy contains invalid Json"
},
"ReportExpired": {
"code": 410,
"description": "The request was rejected because the most recent credential report has expired. To generate a new credential report, use GenerateCredentialReport."
},
"ReportInProgress": {
"code": 404,
"description": "The request was rejected because the credential report is still being generated."
},
"ReportNotPresent": {
"code": 410,
"description": "The request was rejected because the credential report does not exist. To generate a credential report, use GenerateCredentialReport."
},
"_comment": "-------------- Special non-AWS S3 errors --------------",
"MPUinProgress": {
"code": 409,
"description": "The bucket you tried to delete has an ongoing multipart upload."
},
"LocationNotFound": {
"code": 424,
"description": "The object data location does not exist."
},
"_comment": "-------------- Internal project errors --------------",
"_comment": "----------------------- Vault -----------------------",
"_comment": "#### formatErrors ####",
"BadName": {
"description": "name not ok",
"code": 5001
},
"BadAccount": {
"description": "account not ok",
"code": 5002
},
"BadGroup": {
"description": "group not ok",
"code": 5003
},
"BadId": {
"description": "id not ok",
"code": 5004
},
"BadAccountName": {
"description": "accountName not ok",
"code": 5005
},
"BadNameFriendly": {
"description": "nameFriendly not ok",
"code": 5006
},
"BadEmailAddress": {
"description": "email address not ok",
"code": 5007
},
"BadPath": {
"description": "path not ok",
"code": 5008
},
"BadArn": {
"description": "arn not ok",
"code": 5009
},
"BadCreateDate": {
"description": "createDate not ok",
"code": 5010
},
"BadLastUsedDate": {
"description": "lastUsedDate not ok",
"code": 5011
},
"BadNotBefore": {
"description": "notBefore not ok",
"code": 5012
},
"BadNotAfter": {
"description": "notAfter not ok",
"code": 5013
},
"BadSaltedPwd": {
"description": "salted password not ok",
"code": 5014
},
"ok": {
"description": "No error",
"code": 200
},
"BadUser": {
"description": "user not ok",
"code": 5016
},
"BadSaltedPasswd": {
"description": "salted password not ok",
"code": 5017
},
"BadPasswdDate": {
"description": "password date not ok",
"code": 5018
},
"BadCanonicalId": {
"description": "canonicalId not ok",
"code": 5019
},
"BadAlias": {
"description": "alias not ok",
"code": 5020
},
"_comment": "#### internalErrors ####",
"DBPutFailed": {
"description": "DB put failed",
"code": 5021
},
"_comment": "#### alreadyExistErrors ####",
"AccountEmailAlreadyUsed": {
"description": "an other account already uses that email",
"code": 5022
},
"AccountNameAlreadyUsed": {
"description": "an other account already uses that name",
"code": 5023
},
"UserEmailAlreadyUsed": {
"description": "an other user already uses that email",
"code": 5024
},
"UserNameAlreadyUsed": {
"description": "an other user already uses that name",
"code": 5025
},
"_comment": "#### doesntExistErrors ####",
"NoParentAccount": {
"description": "parent account does not exist",
"code": 5026
},
"_comment": "#### authErrors ####",
"BadStringToSign": {
"description": "stringToSign not ok'",
"code": 5027
},
"BadSignatureFromRequest": {
"description": "signatureFromRequest not ok",
"code": 5028
},
"BadAlgorithm": {
"description": "hashAlgorithm not ok",
"code": 5029
},
"SecretKeyDoesNotExist": {
"description": "secret key does not exist",
"code": 5030
},
"InvalidRegion": {
"description": "Region was not provided or is not recognized by the system",
"code": 5031
},
"ScopeDate": {
"description": "scope date is missing, or format is invalid",
"code": 5032
},
"BadAccessKey": {
"description": "access key not ok",
"code": 5033
},
"NoDict": {
"description": "no dictionary of params provided for signature verification",
"code": 5034
},
"BadSecretKey": {
"description": "secretKey not ok",
"code": 5035
},
"BadSecretKeyValue": {
"description": "secretKey value not ok",
"code": 5036
},
"BadSecretKeyStatus": {
"description": "secretKey status not ok",
"code": 5037
},
"_comment": "#### OidcpErrors ####",
"BadUrl": {
"description": "url not ok",
"code": 5038
},
"BadClientIdList": {
"description": "client id list not ok'",
"code": 5039
},
"BadThumbprintList": {
"description": "thumbprint list not ok'",
"code": 5040
},
"BadObject": {
"description": "Object not ok'",
"code": 5041
},
"_comment": "#### RoleErrors ####",
"BadRole": {
"description": "role not ok",
"code": 5042
},
"_comment": "#### SamlpErrors ####",
"BadSamlp": {
"description": "samlp not ok",
"code": 5043
},
"BadMetadataDocument": {
"description": "metadata document not ok",
"code": 5044
},
"BadSessionIndex": {
"description": "session index not ok",
"code": 5045
},
"Unauthorized": {
"description": "not authenticated",
"code": 401
},
"_comment": "--------------------- MetaData ---------------------",
"_comment": "#### formatErrors ####",
"CacheUpdated": {
"description": "The cache has been updated",
"code": 500
},
"DBNotFound": {
"description": "This DB does not exist",
"code": 404
},
"DBAlreadyExists": {
"description": "This DB already exist",
"code": 409
},
"ObjNotFound": {
"description": "This object does not exist",
"code": 404
},
"PermissionDenied": {
"description": "Permission denied",
"code": 403
},
"BadRequest": {
"description": "BadRequest",
"code": 400
},
"RaftSessionNotLeader": {
"description": "NotLeader",
"code": 500
},
"RaftSessionLeaderNotConnected": {
"description": "RaftSessionLeaderNotConnected",
"code": 400
},
"NoLeaderForDB": {
"description": "NoLeaderForDB",
"code": 400
},
"RouteNotFound": {
"description": "RouteNotFound",
"code": 404
},
"NoMapsInConfig": {
"description": "NoMapsInConfig",
"code": 404
},
"DBAPINotReady": {
"message": "DBAPINotReady",
"code": 500
},
"NotEnoughMapsInConfig:": {
"description": "NotEnoughMapsInConfig",
"code": 400
},
"TooManyRequests": {
"description": "TooManyRequests",
"code": 429
},
"_comment": "----------------------- cdmiclient -----------------------",
"ReadOnly": {
"description": "trying to write to read only back-end",
"code": 403
},
"_comment": "----------------------- authbackend -----------------------",
"AuthMethodNotImplemented": {
"description": "AuthMethodNotImplemented",
"code": 501
}
}

index.js (deleted) — 202 lines removed

@@ -1,202 +0,0 @@
module.exports = {
auth: require('./lib/auth/auth'),
constants: require('./lib/constants'),
db: require('./lib/db'),
errors: require('./lib/errors.js'),
errorUtils: require('./lib/errorUtils'),
shuffle: require('./lib/shuffle'),
stringHash: require('./lib/stringHash'),
ipCheck: require('./lib/ipCheck'),
jsutil: require('./lib/jsutil'),
https: {
ciphers: require('./lib/https/ciphers.js'),
dhparam: require('./lib/https/dh2048.js'),
},
algorithms: {
list: require('./lib/algos/list/exportAlgos'),
listTools: {
DelimiterTools: require('./lib/algos/list/tools'),
Skip: require('./lib/algos/list/skip'),
},
cache: {
LRUCache: require('./lib/algos/cache/LRUCache'),
},
stream: {
MergeStream: require('./lib/algos/stream/MergeStream'),
},
SortedSet: require('./lib/algos/set/SortedSet'),
},
policies: {
evaluators: require('./lib/policyEvaluator/evaluator.js'),
validateUserPolicy: require('./lib/policy/policyValidator')
.validateUserPolicy,
evaluatePrincipal: require('./lib/policyEvaluator/principal'),
RequestContext: require('./lib/policyEvaluator/RequestContext.js'),
requestUtils: require('./lib/policyEvaluator/requestUtils'),
actionMaps: require('./lib/policyEvaluator/utils/actionMaps'),
},
Clustering: require('./lib/Clustering'),
testing: {
matrix: require('./lib/testing/matrix.js'),
},
versioning: {
VersioningConstants: require('./lib/versioning/constants.js')
.VersioningConstants,
Version: require('./lib/versioning/Version.js').Version,
VersionID: require('./lib/versioning/VersionID.js'),
WriteGatheringManager: require('./lib/versioning/WriteGatheringManager.js'),
WriteCache: require('./lib/versioning/WriteCache.js'),
VersioningRequestProcessor: require('./lib/versioning/VersioningRequestProcessor.js'),
},
network: {
http: {
server: require('./lib/network/http/server'),
utils: require('./lib/network/http/utils'),
},
rpc: require('./lib/network/rpc/rpc'),
level: require('./lib/network/rpc/level-net'),
rest: {
RESTServer: require('./lib/network/rest/RESTServer'),
RESTClient: require('./lib/network/rest/RESTClient'),
},
RoundRobin: require('./lib/network/RoundRobin'),
probe: {
ProbeServer: require('./lib/network/probe/ProbeServer'),
HealthProbeServer:
require('./lib/network/probe/HealthProbeServer.js'),
Utils: require('./lib/network/probe/Utils.js'),
},
kmip: require('./lib/network/kmip'),
kmipClient: require('./lib/network/kmip/Client'),
},
s3routes: {
routes: require('./lib/s3routes/routes'),
routesUtils: require('./lib/s3routes/routesUtils'),
},
s3middleware: {
userMetadata: require('./lib/s3middleware/userMetadata'),
convertToXml: require('./lib/s3middleware/convertToXml'),
escapeForXml: require('./lib/s3middleware/escapeForXml'),
objectLegalHold: require('./lib/s3middleware/objectLegalHold'),
tagging: require('./lib/s3middleware/tagging'),
checkDateModifiedHeaders:
require('./lib/s3middleware/validateConditionalHeaders')
.checkDateModifiedHeaders,
validateConditionalHeaders:
require('./lib/s3middleware/validateConditionalHeaders')
.validateConditionalHeaders,
MD5Sum: require('./lib/s3middleware/MD5Sum'),
NullStream: require('./lib/s3middleware/nullStream'),
objectUtils: require('./lib/s3middleware/objectUtils'),
azureHelper: {
mpuUtils:
require('./lib/s3middleware/azureHelpers/mpuUtils'),
ResultsCollector:
require('./lib/s3middleware/azureHelpers/ResultsCollector'),
SubStreamInterface:
require('./lib/s3middleware/azureHelpers/SubStreamInterface'),
},
prepareStream: require('./lib/s3middleware/prepareStream'),
processMpuParts: require('./lib/s3middleware/processMpuParts'),
retention: require('./lib/s3middleware/objectRetention'),
lifecycleHelpers: require('./lib/s3middleware/lifecycleHelpers'),
},
storage: {
metadata: {
MetadataWrapper: require('./lib/storage/metadata/MetadataWrapper'),
bucketclient: {
BucketClientInterface:
require('./lib/storage/metadata/bucketclient/' +
'BucketClientInterface'),
LogConsumer:
require('./lib/storage/metadata/bucketclient/LogConsumer'),
},
file: {
BucketFileInterface:
require('./lib/storage/metadata/file/BucketFileInterface'),
MetadataFileServer:
require('./lib/storage/metadata/file/MetadataFileServer'),
MetadataFileClient:
require('./lib/storage/metadata/file/MetadataFileClient'),
},
inMemory: {
metastore:
require('./lib/storage/metadata/in_memory/metastore'),
metadata: require('./lib/storage/metadata/in_memory/metadata'),
bucketUtilities:
require('./lib/storage/metadata/in_memory/bucket_utilities'),
},
mongoclient: {
MongoClientInterface:
require('./lib/storage/metadata/mongoclient/' +
'MongoClientInterface'),
LogConsumer:
require('./lib/storage/metadata/mongoclient/LogConsumer'),
},
proxy: {
Server: require('./lib/storage/metadata/proxy/Server'),
},
},
data: {
DataWrapper: require('./lib/storage/data/DataWrapper'),
MultipleBackendGateway:
require('./lib/storage/data/MultipleBackendGateway'),
parseLC: require('./lib/storage/data/LocationConstraintParser'),
file: {
DataFileStore:
require('./lib/storage/data/file/DataFileStore'),
DataFileInterface:
require('./lib/storage/data/file/DataFileInterface'),
},
external: {
AwsClient: require('./lib/storage/data/external/AwsClient'),
AzureClient: require('./lib/storage/data/external/AzureClient'),
GcpClient: require('./lib/storage/data/external/GcpClient'),
GCP: require('./lib/storage/data/external/GCP/GcpService'),
GcpUtils: require('./lib/storage/data/external/GCP/GcpUtils'),
GcpSigner: require('./lib/storage/data/external/GCP/GcpSigner'),
PfsClient: require('./lib/storage/data/external/PfsClient'),
backendUtils: require('./lib/storage/data/external/utils'),
},
inMemory: {
datastore: require('./lib/storage/data/in_memory/datastore'),
},
},
utils: require('./lib/storage/utils'),
},
models: {
BackendInfo: require('./lib/models/BackendInfo'),
BucketInfo: require('./lib/models/BucketInfo'),
BucketAzureInfo: require('./lib/models/BucketAzureInfo'),
ObjectMD: require('./lib/models/ObjectMD'),
ObjectMDLocation: require('./lib/models/ObjectMDLocation'),
ObjectMDAzureInfo: require('./lib/models/ObjectMDAzureInfo'),
ARN: require('./lib/models/ARN'),
WebsiteConfiguration: require('./lib/models/WebsiteConfiguration'),
ReplicationConfiguration:
require('./lib/models/ReplicationConfiguration'),
LifecycleConfiguration:
require('./lib/models/LifecycleConfiguration'),
LifecycleRule: require('./lib/models/LifecycleRule'),
BucketPolicy: require('./lib/models/BucketPolicy'),
ObjectLockConfiguration:
require('./lib/models/ObjectLockConfiguration'),
NotificationConfiguration:
require('./lib/models/NotificationConfiguration'),
},
metrics: {
StatsClient: require('./lib/metrics/StatsClient'),
StatsModel: require('./lib/metrics/StatsModel'),
RedisClient: require('./lib/metrics/RedisClient'),
ZenkoMetrics: require('./lib/metrics/ZenkoMetrics'),
},
pensieve: {
credentialUtils: require('./lib/executables/pensieveCreds/utils'),
},
stream: {
readJSONStreamObject: require('./lib/stream/readJSONStreamObject'),
},
patches: {
locationConstraints: require('./lib/patches/locationConstraints'),
},
};

index.ts (new file) — 191 lines added

@@ -0,0 +1,191 @@
export { default as errors } from './lib/errors';
export * as auth from './lib/auth/auth'
export * as constants from './lib/constants';
export const db = require('./lib/db');
export const errorUtils = require('./lib/errorUtils');
export const shuffle = require('./lib/shuffle');
export const stringHash = require('./lib/stringHash');
export const ipCheck = require('./lib/ipCheck');
export const jsutil = require('./lib/jsutil');
export const Clustering = require('./lib/Clustering');
export const https = {
ciphers: require('./lib/https/ciphers.js'),
dhparam: require('./lib/https/dh2048.js'),
};
export const algorithms = {
list: require('./lib/algos/list/exportAlgos'),
listTools: {
DelimiterTools: require('./lib/algos/list/tools'),
Skip: require('./lib/algos/list/skip'),
},
cache: {
LRUCache: require('./lib/algos/cache/LRUCache'),
},
stream: {
MergeStream: require('./lib/algos/stream/MergeStream'),
},
SortedSet: require('./lib/algos/set/SortedSet'),
};
export const policies = {
evaluators: require('./lib/policyEvaluator/evaluator.js'),
validateUserPolicy: require('./lib/policy/policyValidator').validateUserPolicy,
evaluatePrincipal: require('./lib/policyEvaluator/principal'),
RequestContext: require('./lib/policyEvaluator/RequestContext.js'),
requestUtils: require('./lib/policyEvaluator/requestUtils'),
actionMaps: require('./lib/policyEvaluator/utils/actionMaps'),
};
export const testing = {
matrix: require('./lib/testing/matrix.js'),
};
export const versioning = {
VersioningConstants: require('./lib/versioning/constants.js').VersioningConstants,
Version: require('./lib/versioning/Version.js').Version,
VersionID: require('./lib/versioning/VersionID.js'),
WriteGatheringManager: require('./lib/versioning/WriteGatheringManager.js'),
WriteCache: require('./lib/versioning/WriteCache.js'),
VersioningRequestProcessor: require('./lib/versioning/VersioningRequestProcessor.js'),
};
export const network = {
http: {
server: require('./lib/network/http/server'),
utils: require('./lib/network/http/utils'),
},
rpc: require('./lib/network/rpc/rpc'),
level: require('./lib/network/rpc/level-net'),
rest: {
RESTServer: require('./lib/network/rest/RESTServer'),
RESTClient: require('./lib/network/rest/RESTClient'),
},
RoundRobin: require('./lib/network/RoundRobin'),
probe: {
ProbeServer: require('./lib/network/probe/ProbeServer'),
HealthProbeServer:
require('./lib/network/probe/HealthProbeServer.js'),
Utils: require('./lib/network/probe/Utils.js'),
},
kmip: require('./lib/network/kmip'),
kmipClient: require('./lib/network/kmip/Client'),
};
export const s3routes = {
routes: require('./lib/s3routes/routes'),
routesUtils: require('./lib/s3routes/routesUtils'),
};
export const s3middleware = {
userMetadata: require('./lib/s3middleware/userMetadata'),
convertToXml: require('./lib/s3middleware/convertToXml'),
escapeForXml: require('./lib/s3middleware/escapeForXml'),
objectLegalHold: require('./lib/s3middleware/objectLegalHold'),
tagging: require('./lib/s3middleware/tagging'),
checkDateModifiedHeaders:
require('./lib/s3middleware/validateConditionalHeaders')
.checkDateModifiedHeaders,
validateConditionalHeaders:
require('./lib/s3middleware/validateConditionalHeaders').validateConditionalHeaders,
MD5Sum: require('./lib/s3middleware/MD5Sum'),
NullStream: require('./lib/s3middleware/nullStream'),
objectUtils: require('./lib/s3middleware/objectUtils'),
azureHelper: {
mpuUtils: require('./lib/s3middleware/azureHelpers/mpuUtils'),
ResultsCollector: require('./lib/s3middleware/azureHelpers/ResultsCollector'),
SubStreamInterface: require('./lib/s3middleware/azureHelpers/SubStreamInterface'),
},
prepareStream: require('./lib/s3middleware/prepareStream'),
processMpuParts: require('./lib/s3middleware/processMpuParts'),
retention: require('./lib/s3middleware/objectRetention'),
lifecycleHelpers: require('./lib/s3middleware/lifecycleHelpers'),
};
export const storage = {
metadata: {
MetadataWrapper: require('./lib/storage/metadata/MetadataWrapper'),
bucketclient: {
BucketClientInterface: require('./lib/storage/metadata/bucketclient/BucketClientInterface'),
LogConsumer: require('./lib/storage/metadata/bucketclient/LogConsumer'),
},
file: {
BucketFileInterface: require('./lib/storage/metadata/file/BucketFileInterface'),
MetadataFileServer: require('./lib/storage/metadata/file/MetadataFileServer'),
MetadataFileClient: require('./lib/storage/metadata/file/MetadataFileClient'),
},
inMemory: {
metastore: require('./lib/storage/metadata/in_memory/metastore'),
metadata: require('./lib/storage/metadata/in_memory/metadata'),
bucketUtilities: require('./lib/storage/metadata/in_memory/bucket_utilities'),
},
mongoclient: {
MongoClientInterface: require('./lib/storage/metadata/mongoclient/MongoClientInterface'),
LogConsumer: require('./lib/storage/metadata/mongoclient/LogConsumer'),
},
proxy: {
Server: require('./lib/storage/metadata/proxy/Server'),
},
},
data: {
DataWrapper: require('./lib/storage/data/DataWrapper'),
MultipleBackendGateway: require('./lib/storage/data/MultipleBackendGateway'),
parseLC: require('./lib/storage/data/LocationConstraintParser'),
file: {
DataFileStore: require('./lib/storage/data/file/DataFileStore'),
DataFileInterface: require('./lib/storage/data/file/DataFileInterface'),
},
external: {
AwsClient: require('./lib/storage/data/external/AwsClient'),
AzureClient: require('./lib/storage/data/external/AzureClient'),
GcpClient: require('./lib/storage/data/external/GcpClient'),
GCP: require('./lib/storage/data/external/GCP/GcpService'),
GcpUtils: require('./lib/storage/data/external/GCP/GcpUtils'),
GcpSigner: require('./lib/storage/data/external/GCP/GcpSigner'),
PfsClient: require('./lib/storage/data/external/PfsClient'),
backendUtils: require('./lib/storage/data/external/utils'),
},
inMemory: {
datastore: require('./lib/storage/data/in_memory/datastore'),
},
},
utils: require('./lib/storage/utils'),
};
export const models = {
BackendInfo: require('./lib/models/BackendInfo'),
BucketInfo: require('./lib/models/BucketInfo'),
BucketAzureInfo: require('./lib/models/BucketAzureInfo'),
ObjectMD: require('./lib/models/ObjectMD'),
ObjectMDLocation: require('./lib/models/ObjectMDLocation'),
ObjectMDAzureInfo: require('./lib/models/ObjectMDAzureInfo'),
ARN: require('./lib/models/ARN'),
WebsiteConfiguration: require('./lib/models/WebsiteConfiguration'),
ReplicationConfiguration: require('./lib/models/ReplicationConfiguration'),
LifecycleConfiguration: require('./lib/models/LifecycleConfiguration'),
LifecycleRule: require('./lib/models/LifecycleRule'),
BucketPolicy: require('./lib/models/BucketPolicy'),
ObjectLockConfiguration: require('./lib/models/ObjectLockConfiguration'),
NotificationConfiguration: require('./lib/models/NotificationConfiguration'),
};
export const metrics = {
StatsClient: require('./lib/metrics/StatsClient'),
StatsModel: require('./lib/metrics/StatsModel'),
RedisClient: require('./lib/metrics/RedisClient'),
ZenkoMetrics: require('./lib/metrics/ZenkoMetrics'),
};
export const pensieve = {
credentialUtils: require('./lib/executables/pensieveCreds/utils'),
};
export const stream = {
readJSONStreamObject: require('./lib/stream/readJSONStreamObject'),
};
export const patches = {
locationConstraints: require('./lib/patches/locationConstraints'),
}
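A hypothetical consumer after this change (assuming the package is imported as `arsenal`, which this diff does not show):
```
import { errors, versioning } from 'arsenal';

console.log(errors.NoSuchKey.code); // 404, per the error table above
console.log(typeof versioning.VersionID); // namespaces are re-exported as before
```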


@@ -1,6 +1,4 @@
'use strict'; // eslint-disable-line strict
const constants = require('../constants');
import * as constants from '../constants';
/**
* Class containing requester's information received from Vault
@@ -8,9 +6,15 @@ const constants = require('../constants');
* shortid, email, accountDisplayName and IAMdisplayName (if applicable)
* @return {AuthInfo} an AuthInfo instance
*/
export default class AuthInfo {
arn: string;
canonicalID: string;
shortid: string;
email: string;
accountDisplayName: string;
IAMdisplayName: string;
class AuthInfo {
constructor(objectFromVault) {
constructor(objectFromVault: any) {
// amazon resource name for IAM user (if applicable)
this.arn = objectFromVault.arn;
// account canonicalID
@@ -53,10 +57,8 @@ class AuthInfo {
return this.canonicalID.startsWith(
`${constants.zenkoServiceAccount}/`);
}
isRequesterThisServiceAccount(serviceName) {
return this.canonicalID ===
`${constants.zenkoServiceAccount}/${serviceName}`;
isRequesterThisServiceAccount(serviceName: string) {
const computedCanonicalID = `${constants.zenkoServiceAccount}/${serviceName}`;
return this.canonicalID === computedCanonicalID;
}
}
module.exports = AuthInfo;
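A hypothetical use of the now-typed class, exercising only what the diff above shows ('lifecycle' is a made-up service name; the constant comes from lib/constants):
```
import * as constants from './lib/constants';
import AuthInfo from './lib/auth/AuthInfo';

// Illustrative values only — real objects come from Vault.
const info = new AuthInfo({
    arn: 'arn:aws:iam::123456789012:user/alice',
    canonicalID: `${constants.zenkoServiceAccount}/lifecycle`,
});

console.log(info.isRequesterThisServiceAccount('lifecycle')); // true
```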


@@ -1,16 +1,22 @@
const errors = require('../errors');
const AuthInfo = require('./AuthInfo');
import { Logger } from 'werelogs';
import errors from '../errors';
import AuthInfo from './AuthInfo';
/** vaultSignatureCb parses message from Vault and instantiates
* @param {object} err - error from vault
* @param {object} authInfo - info from vault
* @param {object} log - log for request
* @param {function} callback - callback to authCheck functions
* @param {object} [streamingV4Params] - present if v4 signature;
* @param err - error from vault
* @param authInfo - info from vault
* @param log - log for request
* @param callback - callback to authCheck functions
* @param [streamingV4Params] - present if v4 signature;
* items used to calculate signature on chunks if streaming auth
* @return {undefined}
*/
function vaultSignatureCb(err, authInfo, log, callback, streamingV4Params) {
function vaultSignatureCb(
err: Error | null,
authInfo: { message: { body: any } },
log: Logger,
callback: (err: Error | null, data?: any, results?: any, params?: any) => void,
streamingV4Params?: any
) {
// vaultclient API guarantees that it returns:
// - either `err`, an Error object with `code` and `message` properties set
// - or `err == null` and `info` is an object with `message.code` and
@@ -24,11 +30,13 @@ function vaultSignatureCb(err, authInfo, log, callback, streamingV4Params) {
const info = authInfo.message.body;
const userInfo = new AuthInfo(info.userInfo);
const authorizationResults = info.authorizationResults;
const auditLog = { accountDisplayName: userInfo.getAccountDisplayName() };
const auditLog: { accountDisplayName: string, IAMdisplayName?: string } =
{ accountDisplayName: userInfo.getAccountDisplayName() };
const iamDisplayName = userInfo.getIAMdisplayName();
if (iamDisplayName) {
auditLog.IAMdisplayName = iamDisplayName;
}
// @ts-ignore
log.addDefaultFields(auditLog);
return callback(null, userInfo, authorizationResults, streamingV4Params);
}
@@ -39,43 +47,63 @@ function vaultSignatureCb(err, authInfo, log, callback, streamingV4Params) {
* authentication backends.
* @class Vault
*/
class Vault {
export default class Vault {
client: any;
implName: string;
/**
* @constructor
* @param {object} client - authentication backend or vault client
* @param {string} implName - implementation name for auth backend
*/
constructor(client, implName) {
constructor(client: any, implName: string) {
this.client = client;
this.implName = implName;
}
/**
* authenticateV2Request
*
* @param {string} params - the authentication parameters as returned by
* @param params - the authentication parameters as returned by
* auth.extractParams
* @param {number} params.version - shall equal 2
* @param {string} params.data.accessKey - the user's accessKey
* @param {string} params.data.signatureFromRequest - the signature read
* @param params.version - shall equal 2
* @param params.data.accessKey - the user's accessKey
* @param params.data.signatureFromRequest - the signature read
* from the request
* @param {string} params.data.stringToSign - the stringToSign
* @param {string} params.data.algo - the hashing algorithm used for the
* @param params.data.stringToSign - the stringToSign
* @param params.data.algo - the hashing algorithm used for the
* signature
* @param {string} params.data.authType - the type of authentication (query
* @param params.data.authType - the type of authentication (query
* or header)
* @param {string} params.data.signatureVersion - the version of the
* @param params.data.signatureVersion - the version of the
* signature (AWS or AWS4)
* @param {number} [params.data.signatureAge] - the age of the signature in
* @param [params.data.signatureAge] - the age of the signature in
* ms
* @param {string} params.data.log - the logger object
* @param params.data.log - the logger object
* @param {RequestContext []} requestContexts - an array of RequestContext
* instances which contain information for policy authorization check
* @param {function} callback - callback with either error or user info
* @returns {undefined}
* @param callback - callback with either error or user info
*/
authenticateV2Request(params, requestContexts, callback) {
authenticateV2Request(
params: {
version: 2;
log: Logger;
data: {
securityToken: string;
accessKey: string;
signatureFromRequest: string;
stringToSign: string;
algo: string;
authType: 'query' | 'header';
signatureVersion: string;
signatureAge?: number;
log: Logger;
};
},
requestContexts: any[],
callback: (err: Error | null, data?: any) => void
) {
params.log.debug('authenticating V2 request');
let serializedRCsArr;
let serializedRCsArr: any;
if (requestContexts) {
serializedRCsArr = requestContexts.map(rc => rc.serialize());
}
@@ -85,44 +113,66 @@ class Vault {
params.data.accessKey,
{
algo: params.data.algo,
// @ts-ignore
reqUid: params.log.getSerializedUids(),
logger: params.log,
securityToken: params.data.securityToken,
requestContext: serializedRCsArr,
},
(err, userInfo) => vaultSignatureCb(err, userInfo,
(err: Error | null, userInfo?: any) => vaultSignatureCb(err, userInfo,
params.log, callback),
);
}
/** authenticateV4Request
* @param {object} params - the authentication parameters as returned by
* @param params - the authentication parameters as returned by
* auth.extractParams
* @param {number} params.version - shall equal 4
* @param {string} params.data.log - the logger object
* @param {string} params.data.accessKey - the user's accessKey
* @param {string} params.data.signatureFromRequest - the signature read
* @param params.version - shall equal 4
* @param params.data.log - the logger object
* @param params.data.accessKey - the user's accessKey
* @param params.data.signatureFromRequest - the signature read
* from the request
* @param {string} params.data.region - the AWS region
* @param {string} params.data.stringToSign - the stringToSign
* @param {string} params.data.scopeDate - the timespan to allow the request
* @param {string} params.data.authType - the type of authentication (query
* @param params.data.region - the AWS region
* @param params.data.stringToSign - the stringToSign
* @param params.data.scopeDate - the timespan to allow the request
* @param params.data.authType - the type of authentication (query
* or header)
* @param {string} params.data.signatureVersion - the version of the
* @param params.data.signatureVersion - the version of the
* signature (AWS or AWS4)
* @param {number} params.data.signatureAge - the age of the signature in ms
* @param {number} params.data.timestamp - signature timestamp
* @param {string} params.credentialScope - credentialScope for signature
* @param params.data.signatureAge - the age of the signature in ms
* @param params.data.timestamp - signature timestamp
* @param params.credentialScope - credentialScope for signature
* @param {RequestContext [] | null} requestContexts -
* an array of RequestContext or null if authentication of a chunk
* in streamingv4 auth
* instances which contain information for policy authorization check
* @param {function} callback - callback with either error or user info
* @return {undefined}
* @param callback - callback with either error or user info
*/
authenticateV4Request(params, requestContexts, callback) {
authenticateV4Request(
params: {
version: 4;
log: Logger;
data: {
accessKey: string;
signatureFromRequest: string;
region: string;
stringToSign: string;
scopeDate: string;
authType: 'query' | 'header';
signatureVersion: string;
signatureAge?: number;
timestamp: number;
credentialScope: string;
securityToken: string;
algo: string;
log: Logger;
};
},
requestContexts: any[] | null,
callback: (err: Error | null, data?: any) => void
) {
params.log.debug('authenticating V4 request');
let serializedRCs;
let serializedRCs: any;
if (requestContexts) {
serializedRCs = requestContexts.map(rc => rc.serialize());
}
@@ -140,31 +190,39 @@ class Vault {
params.data.region,
params.data.scopeDate,
{
// @ts-ignore
reqUid: params.log.getSerializedUids(),
logger: params.log,
securityToken: params.data.securityToken,
requestContext: serializedRCs,
},
(err, userInfo) => vaultSignatureCb(err, userInfo,
(err: Error | null, userInfo?: any) => vaultSignatureCb(err, userInfo,
params.log, callback, streamingV4Params),
);
}
/** getCanonicalIds -- call Vault to get canonicalIDs based on email
* addresses
* @param {array} emailAddresses - list of emailAddresses
* @param {object} log - log object
* @param {function} callback - callback with either error or an array
* @param emailAddresses - list of emailAddresses
* @param log - log object
* @param callback - callback with either error or an array
* of objects with each object containing the canonicalID and emailAddress
* of an account as properties
* @return {undefined}
*/
getCanonicalIds(emailAddresses, log, callback) {
getCanonicalIds(
emailAddresses: string[],
log: Logger,
callback: (
err: Error | null,
data?: { canonicalID: string; email: string }[]
) => void
) {
log.trace('getting canonicalIDs from Vault based on emailAddresses',
{ emailAddresses });
this.client.getCanonicalIds(emailAddresses,
// @ts-ignore
{ reqUid: log.getSerializedUids() },
(err, info) => {
(err: Error | null, info?: any) => {
if (err) {
log.debug('received error message from auth provider',
{ errorMessage: err });
@@ -172,17 +230,17 @@ class Vault {
}
const infoFromVault = info.message.body;
log.trace('info received from vault', { infoFromVault });
const foundIds = [];
const foundIds: { canonicalID: string; email: string }[] = [];
for (let i = 0; i < Object.keys(infoFromVault).length; i++) {
const key = Object.keys(infoFromVault)[i];
if (infoFromVault[key] === 'WrongFormat'
|| infoFromVault[key] === 'NotFound') {
return callback(errors.UnresolvableGrantByEmailAddress);
}
const obj = {};
obj.email = key;
obj.canonicalID = infoFromVault[key];
foundIds.push(obj);
foundIds.push({
email: key,
canonicalID: infoFromVault[key],
})
}
return callback(null, foundIds);
});
@@ -190,18 +248,22 @@ class Vault {
/** getEmailAddresses -- call Vault to get email addresses based on
* canonicalIDs
* @param {array} canonicalIDs - list of canonicalIDs
* @param {object} log - log object
* @param {function} callback - callback with either error or an object
* @param canonicalIDs - list of canonicalIDs
* @param log - log object
* @param callback - callback with either error or an object
* with canonicalID keys and email address values
* @return {undefined}
*/
getEmailAddresses(canonicalIDs, log, callback) {
getEmailAddresses(
canonicalIDs: string[],
log: Logger,
callback: (err: Error | null, data?: { [key: string]: any }) => void
) {
log.trace('getting emailAddresses from Vault based on canonicalIDs',
{ canonicalIDs });
this.client.getEmailAddresses(canonicalIDs,
// @ts-ignore
{ reqUid: log.getSerializedUids() },
(err, info) => {
(err: Error | null, info?: any) => {
if (err) {
log.debug('received error message from vault',
{ errorMessage: err });
@@ -224,18 +286,22 @@ class Vault {
/** getAccountIds -- call Vault to get accountIds based on
* canonicalIDs
* @param {array} canonicalIDs - list of canonicalIDs
* @param {object} log - log object
* @param {function} callback - callback with either error or an object
* @param canonicalIDs - list of canonicalIDs
* @param log - log object
* @param callback - callback with either error or an object
* with canonicalID keys and accountId values
* @return {undefined}
*/
getAccountIds(canonicalIDs, log, callback) {
getAccountIds(
canonicalIDs: string[],
log: Logger,
callback: (err: Error | null, data?: { [key: string]: string }) => void
) {
log.trace('getting accountIds from Vault based on canonicalIDs',
{ canonicalIDs });
this.client.getAccountIds(canonicalIDs,
// @ts-ignore
{ reqUid: log.getSerializedUids() },
(err, info) => {
(err: Error | null, info?: any) => {
if (err) {
log.debug('received error message from vault',
{ errorMessage: err });
@@ -268,14 +334,19 @@ class Vault {
* @param {object} log - log object
* @param {function} callback - callback with either error or an array
* of authorization results
* @return {undefined}
*/
checkPolicies(requestContextParams, userArn, log, callback) {
checkPolicies(
requestContextParams: any[],
userArn: string,
log: Logger,
callback: (err: Error | null, data?: any[]) => void
) {
log.trace('sending request context params to vault to evaluate ' +
'policies');
this.client.checkPolicies(requestContextParams, userArn, {
// @ts-ignore
reqUid: log.getSerializedUids(),
}, (err, info) => {
}, (err: Error | null, info?: any) => {
if (err) {
log.debug('received error message from auth provider',
{ error: err });
@@ -286,13 +357,14 @@ class Vault {
});
}
checkHealth(log, callback) {
checkHealth(log: Logger, callback: (err: Error | null, data?: any) => void) {
if (!this.client.healthcheck) {
const defResp = {};
defResp[this.implName] = { code: 200, message: 'OK' };
return callback(null, defResp);
}
return this.client.healthcheck(log.getSerializedUids(), (err, obj) => {
// @ts-ignore
return this.client.healthcheck(log.getSerializedUids(), (err: Error | null, obj?: any) => {
const respBody = {};
if (err) {
log.debug(`error from ${this.implName}`, { error: err });
@@ -312,5 +384,3 @@ class Vault {
});
}
}
module.exports = Vault;


@@ -1,24 +1,23 @@
'use strict'; // eslint-disable-line strict
import * as crypto from 'crypto';
import { Logger } from 'werelogs';
import errors from '../errors';
import * as queryString from 'querystring';
import AuthInfo from './AuthInfo';
import * as v2 from './v2/authV2';
import * as v4 from './v4/authV4';
import * as constants from '../constants';
import constructStringToSignV2 from './v2/constructStringToSign';
import constructStringToSignV4 from './v4/constructStringToSign';
import { convertUTCtoISO8601 } from './v4/timeUtils';
import * as vaultUtilities from './backends/in_memory/vaultUtilities';
import * as inMemoryBackend from './backends/in_memory/Backend';
import validateAuthConfig from './backends/in_memory/validateAuthConfig';
import AuthLoader from './backends/in_memory/AuthLoader';
import Vault from './Vault';
import baseBackend from './backends/BaseBackend';
import chainBackend from './backends/ChainBackend';
const crypto = require('crypto');
const errors = require('../errors');
const queryString = require('querystring');
const AuthInfo = require('./AuthInfo');
const v2 = require('./v2/authV2');
const v4 = require('./v4/authV4');
const constants = require('../constants');
const constructStringToSignV2 = require('./v2/constructStringToSign');
const constructStringToSignV4 = require('./v4/constructStringToSign');
const convertUTCtoISO8601 = require('./v4/timeUtils').convertUTCtoISO8601;
const vaultUtilities = require('./backends/in_memory/vaultUtilities');
const inMemoryBackend = require('./backends/in_memory/Backend');
const validateAuthConfig = require('./backends/in_memory/validateAuthConfig');
const AuthLoader = require('./backends/in_memory/AuthLoader');
const Vault = require('./Vault');
const baseBackend = require('./backends/base');
const chainBackend = require('./backends/ChainBackend');
let vault = null;
let vault: Vault | null = null;
const auth = {};
const checkFunctions = {
v2: {
@@ -35,7 +34,7 @@ const checkFunctions = {
// 'All Users Group' so use this group as the canonicalID for the publicUser
const publicUserInfo = new AuthInfo({ canonicalID: constants.publicId });
function setAuthHandler(handler) {
function setAuthHandler(handler: Vault) {
vault = handler;
return auth;
}
@@ -43,25 +42,30 @@ function setAuthHandler(handler) {
/**
* This function will check validity of request parameters to authenticate
*
* @param {Http.Request} request - Http request object
* @param {object} log - Logger object
* @param {string} awsService - Aws service related
* @param {object} data - Parameters from queryString parsing or body of
* @param request - Http request object
* @param log - Logger object
* @param awsService - Aws service related
* @param data - Parameters from queryString parsing or body of
* POST request
*
* @return {object} ret
* @return {object} ret.err - arsenal.errors object if any error was found
* @return {object} ret.params - auth parameters to use later on for signature
* @return ret
* @return ret.err - arsenal.errors object if any error was found
* @return ret.params - auth parameters to use later on for signature
* computation and check
* @return {object} ret.params.version - the auth scheme version
* @return ret.params.version - the auth scheme version
* (undefined, 2, 4)
* @return {object} ret.params.data - the auth scheme's specific data
* @return ret.params.data - the auth scheme's specific data
*/
function extractParams(request, log, awsService, data) {
function extractParams(
request: any,
log: Logger,
awsService: string,
data: { [key: string]: string }
) {
log.trace('entered', { method: 'Arsenal.auth.server.extractParams' });
const authHeader = request.headers.authorization;
let version = null;
let method = null;
let version: 'v2' | 'v4' | null = null;
let method: 'query' | 'headers' | null = null;
// Identify auth version and method to dispatch to the right check function
if (authHeader) {
@@ -104,16 +108,21 @@ function extractParams(request, log, awsService, data) {
/**
* This function will check validity of request parameters to authenticate
*
* @param {Http.Request} request - Http request object
* @param {object} log - Logger object
* @param {function} cb - the callback
* @param {string} awsService - Aws service related
* @param request - Http request object
* @param log - Logger object
* @param cb - the callback
* @param awsService - Aws service related
* @param {RequestContext[] | null} requestContexts - array of RequestContext
* or null if no requestContexts to be sent to Vault (for instance,
* in multi-object delete request)
* @return {undefined}
*/
function doAuth(request, log, cb, awsService, requestContexts) {
function doAuth(
request: any,
log: Logger,
cb: (err: Error | null, data?: any) => void,
awsService: string,
requestContexts: any[] | null
) {
const res = extractParams(request, log, awsService, request.query);
if (res.err) {
return cb(res.err);
@@ -121,23 +130,31 @@ function doAuth(request, log, cb, awsService, requestContexts) {
return cb(null, res.params);
}
if (requestContexts) {
requestContexts.forEach(requestContext => {
requestContext.setAuthType(res.params.data.authType);
requestContext.setSignatureVersion(res.params
.data.signatureVersion);
requestContext.setSignatureAge(res.params.data.signatureAge);
requestContext.setSecurityToken(res.params.data.securityToken);
requestContexts.forEach((requestContext) => {
const { params } = res;
if ('data' in params) {
const { data } = params;
requestContext.setAuthType(data.authType);
requestContext.setSignatureVersion(data.signatureVersion);
requestContext.setSecurityToken(data.securityToken);
if ('signatureAge' in data) {
requestContext.setSignatureAge(data.signatureAge);
}
}
});
}
// Corner cases managed, we're left with normal auth
// TODO What's happening here?
// @ts-ignore
res.params.log = log;
if (res.params.version === 2) {
return vault.authenticateV2Request(res.params, requestContexts, cb);
// @ts-ignore
return vault!.authenticateV2Request(res.params, requestContexts, cb);
}
if (res.params.version === 4) {
return vault.authenticateV4Request(res.params, requestContexts, cb,
awsService);
// @ts-ignore
return vault!.authenticateV4Request(res.params, requestContexts, cb);
}
log.error('authentication method not found', {
@@ -149,19 +166,25 @@ function doAuth(request, log, cb, awsService, requestContexts) {
/**
* This function will generate a version 4 header
*
* @param {Http.Request} request - Http request object
* @param {object} data - Parameters from queryString parsing or body of
* @param request - Http request object
* @param data - Parameters from queryString parsing or body of
* POST request
* @param {string} accessKey - the accessKey
* @param {string} secretKeyValue - the secretKey
* @param {string} awsService - Aws service related
* @param {string} [proxyPath] - path that gets proxied by reverse proxy
* @param {string} [sessionToken] - security token if the access/secret keys
* @param accessKey - the accessKey
* @param secretKeyValue - the secretKey
* @param awsService - Aws service related
* @param [proxyPath] - path that gets proxied by reverse proxy
* @param [sessionToken] - security token if the access/secret keys
* are temporary credentials from STS
* @return {undefined}
*/
function generateV4Headers(request, data, accessKey, secretKeyValue,
awsService, proxyPath, sessionToken) {
function generateV4Headers(
request: any,
data: { [key: string]: string },
accessKey: string,
secretKeyValue: string,
awsService: string,
proxyPath: string,
sessionToken: string
) {
Object.assign(request, { headers: {} });
const amzDate = convertUTCtoISO8601(Date.now());
// get date without time
@@ -175,7 +198,7 @@ function generateV4Headers(request, data, accessKey, secretKeyValue,
let payload = '';
if (request.method === 'POST') {
payload = queryString.stringify(data, null, null, {
payload = queryString.stringify(data, undefined, undefined, {
encodeURIComponent,
});
}
@@ -205,7 +228,7 @@ function generateV4Headers(request, data, accessKey, secretKeyValue,
scopeDate,
service);
const signature = crypto.createHmac('sha256', signingKey)
.update(stringToSign, 'binary').digest('hex');
.update(stringToSign as string, 'binary').digest('hex');
const authorizationHeader = `${algorithm} Credential=${accessKey}` +
`/${credentialScope}, SignedHeaders=${signedHeaders}, ` +
`Signature=${signature}`;
@@ -213,25 +236,16 @@ function generateV4Headers(request, data, accessKey, secretKeyValue,
Object.assign(request, { headers: {} });
}
module.exports = {
setHandler: setAuthHandler,
server: {
extractParams,
doAuth,
},
client: {
generateV4Headers,
constructStringToSignV2,
},
inMemory: {
backend: inMemoryBackend,
validateAuthConfig,
AuthLoader,
},
backends: {
baseBackend,
chainBackend,
},
export const server = { extractParams, doAuth }
export const client = { generateV4Headers, constructStringToSignV2 }
export const inMemory = {
backend: inMemoryBackend,
validateAuthConfig,
AuthLoader
}
export const backends = { baseBackend, chainBackend }
export {
setAuthHandler as setHandler,
AuthInfo,
Vault,
};
Vault
}


@@ -0,0 +1,70 @@
import errors from '../../errors';
import { Callback } from './in_memory/types';
/** Base backend class */
export default class BaseBackend {
service: string;
constructor(service: string) {
this.service = service;
}
verifySignatureV2(
_stringToSign: string,
_signatureFromRequest: string,
_accessKey: string,
_options: { algo: 'SHA1' | 'SHA256' },
callback: Callback
) {
return callback(errors.AuthMethodNotImplemented);
}
verifySignatureV4(
_stringToSign: string,
_signatureFromRequest: string,
_accessKey: string,
_region: string,
_scopeDate: string,
_options: any,
callback: Callback
) {
return callback(errors.AuthMethodNotImplemented);
}
/**
* Gets canonical IDs for a list of accounts based on the email associated
* with each account. The callback will be called with either an error or an
* object with email addresses as keys and canonical IDs as values.
*/
getCanonicalIds(_emails: string[], _options: any, callback: Callback) {
return callback(errors.AuthMethodNotImplemented);
}
/**
* Gets email addresses (referred to as display names for getACLs) for a
* list of accounts based on the canonical IDs associated with each account.
* The callback will be called with either error or an object from Vault
* containing account canonicalID as each object key and an email address
* as the value (or "NotFound").
*/
getEmailAddresses(
_canonicalIDs: string[],
_options: any,
callback: Callback
) {
return callback(errors.AuthMethodNotImplemented);
}
checkPolicies(
_requestContextParams: any,
_userArn: string,
_options: any,
callback: Callback
) {
return callback(null, { message: { body: [] } });
}
healthcheck(_reqUid: string, callback: Callback) {
return callback(null, { code: 200, message: 'OK' });
}
}


@@ -1,25 +1,30 @@
'use strict'; // eslint-disable-line strict
import { Callback } from './in_memory/types';
import errors from '../../errors';
import BaseBackend from './BaseBackend';
import assert from 'assert';
import * as async from 'async';
const assert = require('assert');
const async = require('async');
const errors = require('../../errors');
const BaseBackend = require('./base');
export type Policy = {
[key: string]: any;
arn?: string;
versionId?: string;
isAllowed: boolean;
}
/**
* Class that provides an authentication backend that will verify signatures
* and retrieve emails and canonical ids associated with an account using a
* given list of authentication backends and vault clients.
*
* @class ChainBackend
*/
class ChainBackend extends BaseBackend {
export default class ChainBackend extends BaseBackend {
#clients: BaseBackend[];
/**
* @constructor
* @param {string} service - service id
* @param {object[]} clients - list of authentication backends or vault clients
*/
constructor(service, clients) {
constructor(service: string, clients: BaseBackend[]) {
super(service);
assert(Array.isArray(clients) && clients.length > 0, 'invalid client list');
@@ -31,26 +36,28 @@ class ChainBackend extends BaseBackend {
typeof client.checkPolicies === 'function' &&
typeof client.healthcheck === 'function',
), 'invalid client: missing required auth backend methods');
this._clients = clients;
this.#clients = clients;
}
/*
* try task against each client for one to be successful
*/
_tryEachClient(task, cb) {
async.tryEach(this._clients.map(client => done => task(client, done)), cb);
/** try task against each client for one to be successful */
#tryEachClient(task: (client: BaseBackend, done?: any) => void, cb: Callback) {
// @ts-ignore
async.tryEach(this.#clients.map(client => (done: any) => task(client, done)), cb);
}
/*
* apply task to all clients
*/
_forEachClient(task, cb) {
async.map(this._clients, task, cb);
/** apply task to all clients */
#forEachClient(task: (client: BaseBackend, done?: any) => void, cb: Callback) {
async.map(this.#clients, task, cb);
}
verifySignatureV2(stringToSign, signatureFromRequest, accessKey, options, callback) {
this._tryEachClient((client, done) => client.verifySignatureV2(
verifySignatureV2(
stringToSign: string,
signatureFromRequest: string,
accessKey: string,
options: any,
callback: Callback
) {
this.#tryEachClient((client, done) => client.verifySignatureV2(
stringToSign,
signatureFromRequest,
accessKey,
@@ -59,8 +66,16 @@ class ChainBackend extends BaseBackend {
), callback);
}
verifySignatureV4(stringToSign, signatureFromRequest, accessKey, region, scopeDate, options, callback) {
this._tryEachClient((client, done) => client.verifySignatureV4(
verifySignatureV4(
stringToSign: string,
signatureFromRequest: string,
accessKey: string,
region: string,
scopeDate: string,
options: any,
callback: Callback
) {
this.#tryEachClient((client, done) => client.verifySignatureV4(
stringToSign,
signatureFromRequest,
accessKey,
@@ -71,14 +86,18 @@ class ChainBackend extends BaseBackend {
), callback);
}
static _mergeObjects(objectResponses) {
static _mergeObjects(objectResponses: any[]) {
return objectResponses.reduce(
(retObj, resObj) => Object.assign(retObj, resObj.message.body),
{});
}
getCanonicalIds(emailAddresses, options, callback) {
this._forEachClient(
getCanonicalIds(
emailAddresses: string[],
options: any,
callback: Callback<{ message: { body: any } }>
) {
this.#forEachClient(
(client, done) => client.getCanonicalIds(emailAddresses, options, done),
(err, res) => {
if (err) {
@@ -93,8 +112,12 @@ class ChainBackend extends BaseBackend {
});
}
getEmailAddresses(canonicalIDs, options, callback) {
this._forEachClient(
getEmailAddresses(
canonicalIDs: string[],
options: any,
callback: Callback<{ message: { body: any } }>
) {
this.#forEachClient(
(client, done) => client.getEmailAddresses(canonicalIDs, options, done),
(err, res) => {
if (err) {
@@ -108,11 +131,9 @@ class ChainBackend extends BaseBackend {
});
}
/*
* merge policy responses into a single message
*/
static _mergePolicies(policyResponses) {
const policyMap = {};
/** merge policy responses into a single message */
static _mergePolicies(policyResponses: { message: { body: any[] } }[]) {
const policyMap: { [key: string]: Policy } = {};
policyResponses.forEach(resp => {
if (!resp.message || !Array.isArray(resp.message.body)) {
@@ -128,9 +149,9 @@ class ChainBackend extends BaseBackend {
});
});
return Object.keys(policyMap).map(key => {
const policyRes = { isAllowed: policyMap[key].isAllowed };
if (policyMap[key].arn !== '') {
return Object.keys(policyMap).map((key) => {
const policyRes: Policy = { isAllowed: policyMap[key].isAllowed };
if (policyMap[key].arn && policyMap[key].arn !== '') {
policyRes.arn = policyMap[key].arn;
}
if (policyMap[key].versionId) {
@@ -140,7 +161,7 @@ class ChainBackend extends BaseBackend {
});
}
/*
/**
response format:
{ message: {
body: [{}],
@@ -148,8 +169,13 @@ class ChainBackend extends BaseBackend {
message: string,
} }
*/
checkPolicies(requestContextParams, userArn, options, callback) {
this._forEachClient((client, done) => client.checkPolicies(
checkPolicies(
requestContextParams: any,
userArn: string,
options: any,
callback: Callback<{ message: { body: any } }>
) {
this.#forEachClient((client, done) => client.checkPolicies(
requestContextParams,
userArn,
options,
@@ -166,8 +192,8 @@ class ChainBackend extends BaseBackend {
});
}
healthcheck(reqUid, callback) {
this._forEachClient((client, done) =>
healthcheck(reqUid: string, callback: Callback) {
this.#forEachClient((client, done) =>
client.healthcheck(reqUid, (err, res) => done(null, {
error: !!err ? err : null,
status: res,
@@ -177,7 +203,7 @@ class ChainBackend extends BaseBackend {
return callback(err);
}
const isError = res.some(results => !!results.error);
const isError = res.some((results: any) => !!results.error);
if (isError) {
return callback(errors.InternalError, res);
}
@@ -185,5 +211,3 @@ class ChainBackend extends BaseBackend {
});
}
}
module.exports = ChainBackend;


@@ -1,86 +0,0 @@
'use strict'; // eslint-disable-line strict
const errors = require('../../errors');
/**
* Base backend class
*
* @class BaseBackend
*/
class BaseBackend {
/**
* @constructor
* @param {string} service - service identifier for constructing the ARN
*/
constructor(service) {
this.service = service;
}
/** verifySignatureV2
* @param {string} stringToSign - string to sign built per AWS rules
* @param {string} signatureFromRequest - signature sent with request
* @param {string} accessKey - account accessKey
* @param {object} options - contains algorithm (SHA1 or SHA256)
* @param {function} callback - callback with either error or user info
* @return {function} calls callback
*/
verifySignatureV2(stringToSign, signatureFromRequest,
accessKey, options, callback) {
return callback(errors.AuthMethodNotImplemented);
}
/** verifySignatureV4
* @param {string} stringToSign - string to sign built per AWS rules
* @param {string} signatureFromRequest - signature sent with request
* @param {string} accessKey - account accessKey
* @param {string} region - region specified in request credential
* @param {string} scopeDate - date specified in request credential
* @param {object} options - options to send to Vault
* (just contains reqUid for logging in Vault)
* @param {function} callback - callback with either error or user info
* @return {function} calls callback
*/
verifySignatureV4(stringToSign, signatureFromRequest, accessKey,
region, scopeDate, options, callback) {
return callback(errors.AuthMethodNotImplemented);
}
/**
* Gets canonical IDs for a list of accounts
* based on the email associated with each account
* @param {array} emails - list of email addresses
* @param {object} options - to send log id to vault
* @param {function} callback - callback to calling function
* @returns {function} callback with either error or
* object with email addresses as keys and canonical IDs
* as values
*/
getCanonicalIds(emails, options, callback) {
return callback(errors.AuthMethodNotImplemented);
}
/**
* Gets email addresses (referred to as display names for getACLs)
* for a list of accounts based on the canonical IDs associated with them
* @param {array} canonicalIDs - list of canonicalIDs
* @param {object} options - to send log id to vault
* @param {function} callback - callback to calling function
* @returns {function} callback with either error or
* an object from Vault containing account canonicalID
* as each object key and an email address as the value (or "NotFound")
*/
getEmailAddresses(canonicalIDs, options, callback) {
return callback(errors.AuthMethodNotImplemented);
}
checkPolicies(requestContextParams, userArn, options, callback) {
return callback(null, { message: { body: [] } });
}
healthcheck(reqUid, callback) {
return callback(null, { code: 200, message: 'OK' });
}
}
module.exports = BaseBackend;


@@ -1,223 +0,0 @@
const fs = require('fs');
const glob = require('simple-glob');
const joi = require('@hapi/joi');
const werelogs = require('werelogs');
const ARN = require('../../../models/ARN');
/**
* Load authentication information from files or pre-loaded account
* objects
*
* @class AuthLoader
*/
class AuthLoader {
constructor(logApi) {
this._log = new (logApi || werelogs).Logger('S3');
this._authData = { accounts: [] };
// null: unknown validity, true/false: valid or invalid
this._isValid = null;
this._joiKeysValidator = joi.array()
.items({
access: joi.string().required(),
secret: joi.string().required(),
})
.required();
const accountsJoi = joi.array()
.items({
name: joi.string().required(),
email: joi.string().email().required(),
arn: joi.string().required(),
canonicalID: joi.string().required(),
shortid: joi.string().regex(/^[0-9]{12}$/).required(),
keys: this._joiKeysValidator,
// backward-compat
users: joi.array(),
})
.required()
.unique('arn')
.unique('email')
.unique('canonicalID');
this._joiValidator = joi.object({ accounts: accountsJoi });
}
/**
* add one or more accounts to the authentication info
*
* @param {object} authData - authentication data
* @param {object[]} authData.accounts - array of account data
* @param {string} authData.accounts[].name - account name
* @param {string} authData.accounts[].email: email address
* @param {string} authData.accounts[].arn: account ARN,
* e.g. 'arn:aws:iam::123456789012:root'
* @param {string} authData.accounts[].canonicalID account
* canonical ID
* @param {string} authData.accounts[].shortid account ID number,
* e.g. '123456789012'
* @param {object[]} authData.accounts[].keys array of
* access/secret keys
* @param {object[]} authData.accounts[].keys[].access access key
* @param {object[]} authData.accounts[].keys[].secret secret key
* @param {string} [filePath] - optional file path info for
* logging purpose
* @return {undefined}
*/
addAccounts(authData, filePath) {
const isValid = this._validateData(authData, filePath);
if (isValid) {
this._authData.accounts =
this._authData.accounts.concat(authData.accounts);
// defer validity checking when getting data to avoid
// logging multiple times the errors (we need to validate
// all accounts at once to detect duplicate values)
if (this._isValid) {
this._isValid = null;
}
} else {
this._isValid = false;
}
}
/**
* add account information from a file
*
* @param {string} filePath - file path containing JSON
* authentication info (see {@link addAccounts()} for format)
* @return {undefined}
*/
addFile(filePath) {
const authData = JSON.parse(fs.readFileSync(filePath));
this.addAccounts(authData, filePath);
}
/**
* add account information from a filesystem path
*
* @param {string|string[]} globPattern - filesystem glob pattern,
* can be a single string or an array of glob patterns. Globs
* can be simple file paths or can contain glob matching
* characters, like '/a/b/*.json'. The matching files are
* individually loaded as JSON and accounts are added. See
* {@link addAccounts()} for JSON format.
* @return {undefined}
*/
addFilesByGlob(globPattern) {
const files = glob(globPattern);
files.forEach(filePath => this.addFile(filePath));
}
/**
* perform validation on authentication info previously
* loaded. Note that it has to be done on the entire set after an
* update to catch duplicate account IDs or access keys.
*
* @return {boolean} true if authentication info is valid
* false otherwise
*/
validate() {
if (this._isValid === null) {
this._isValid = this._validateData(this._authData);
}
return this._isValid;
}
/**
* get authentication info as a plain JS object containing all accounts
* under the "accounts" attribute, with validation.
*
* @return {object|null} the validated authentication data
* null if invalid
*/
getData() {
return this.validate() ? this._authData : null;
}
_validateData(authData, filePath) {
const res = joi.validate(authData, this._joiValidator,
{ abortEarly: false });
if (res.error) {
this._dumpJoiErrors(res.error.details, filePath);
return false;
}
let allKeys = [];
let arnError = false;
const validatedAuth = res.value;
validatedAuth.accounts.forEach(account => {
// backward-compat: ignore arn if starts with 'aws:' and log a
// warning
if (account.arn.startsWith('aws:')) {
this._log.error(
'account must have a valid AWS ARN, legacy examples ' +
'starting with \'aws:\' are not supported anymore. ' +
'Please convert to a proper account entry (see ' +
'examples at https://github.com/scality/S3/blob/' +
'master/conf/authdata.json). Also note that support ' +
'for account users has been dropped.',
{ accountName: account.name, accountArn: account.arn,
filePath });
arnError = true;
return;
}
if (account.users) {
this._log.error(
'support for account users has been dropped, consider ' +
'turning users into account entries (see examples at ' +
'https://github.com/scality/S3/blob/master/conf/' +
'authdata.json)',
{ accountName: account.name, accountArn: account.arn,
filePath });
arnError = true;
return;
}
const arnObj = ARN.createFromString(account.arn);
if (arnObj.error) {
this._log.error(
'authentication config validation error',
{ reason: arnObj.error.description,
accountName: account.name, accountArn: account.arn,
filePath });
arnError = true;
return;
}
if (!arnObj.isIAMAccount()) {
this._log.error(
'authentication config validation error',
{ reason: 'not an IAM account ARN',
accountName: account.name, accountArn: account.arn,
filePath });
arnError = true;
return;
}
allKeys = allKeys.concat(account.keys);
});
if (arnError) {
return false;
}
const uniqueKeysRes = joi.validate(
allKeys, this._joiKeysValidator.unique('access'));
if (uniqueKeysRes.error) {
this._dumpJoiErrors(uniqueKeysRes.error.details, filePath);
return false;
}
return true;
}
_dumpJoiErrors(errors, filePath) {
errors.forEach(err => {
const logInfo = { item: err.path, filePath };
if (err.type === 'array.unique') {
logInfo.reason = `duplicate value '${err.context.path}'`;
logInfo.dupValue = err.context.value[err.context.path];
} else {
logInfo.reason = err.message;
logInfo.context = err.context;
}
this._log.error('authentication config validation error',
logInfo);
});
}
}
module.exports = AuthLoader;


@@ -0,0 +1,205 @@
import * as fs from 'fs';
import glob from 'simple-glob';
import joi from 'joi';
import werelogs from 'werelogs';
import * as types from './types';
import { Account, Accounts } from './types';
const ARN = require('../../../models/ARN');
/** Load authentication information from files or pre-loaded account objects */
export default class AuthLoader {
#log: werelogs.Logger;
#authData: Accounts;
#isValid: 'waiting-for-validation' | 'valid' | 'invalid';
constructor(logApi: { Logger: typeof werelogs.Logger } = werelogs) {
this.#log = new logApi.Logger('S3');
this.#authData = { accounts: [] };
this.#isValid = 'waiting-for-validation';
}
/** Add one or more accounts to the authentication info */
addAccounts(authData: Accounts, filePath?: string) {
const isValid = this.#isAuthDataValid(authData, filePath);
if (isValid) {
this.#authData.accounts = [
...this.#authData.accounts,
...authData.accounts,
];
// defer validity checking when getting data to avoid
// logging multiple times the errors (we need to validate
// all accounts at once to detect duplicate values)
if (this.#isValid === 'valid') {
this.#isValid = 'waiting-for-validation';
}
} else {
this.#isValid = 'invalid';
}
}
/**
* Add account information from a file. Use { legacy: false } as an option
* to use the new, Promise-based version.
*
* @param filePath - file path containing JSON
* authentication info (see {@link addAccounts()} for format)
*/
addFile(filePath: string, options: { legacy: false }): Promise<void>;
/** @deprecated Please use Promise-version instead. */
addFile(filePath: string, options?: { legacy: true }): void;
addFile(filePath: string, options = { legacy: true }) {
// On deprecation, remove the legacy part and keep the promises.
const fn: any = options.legacy ? fs.readFileSync : fs.promises.readFile;
const temp = fn(filePath, 'utf8') as Promise<string> | string;
const prom = Promise.resolve(temp).then((data) => {
const authData = JSON.parse(data);
this.addAccounts(authData, filePath);
});
return options.legacy ? undefined : prom;
}
/**
* Add account information from a filesystem path
*
* @param globPattern - filesystem glob pattern,
* can be a single string or an array of glob patterns. Globs
* can be simple file paths or can contain glob matching
* characters, like '/a/b/*.json'. The matching files are
* individually loaded as JSON and accounts are added. See
* {@link addAccounts()} for JSON format.
*/
addFilesByGlob(globPattern: string | string[]) {
// FIXME switch glob to async version
const files = glob(globPattern);
files.forEach((filePath) => this.addFile(filePath));
}
/**
* Perform validation on authentication info previously
* loaded. Note that it has to be done on the entire set after an
* update to catch duplicate account IDs or access keys.
*/
validate() {
if (this.#isValid === 'waiting-for-validation') {
const isValid = this.#isAuthDataValid(this.#authData);
this.#isValid = isValid ? 'valid' : 'invalid';
}
return this.#isValid === 'valid';
}
/**
* Get authentication info as a plain JS object containing all accounts
* under the "accounts" attribute, with validation.
*/
get data() {
return this.validate() ? this.#authData : null;
}
/** backward-compat: ignore arn if starts with 'aws:' and log a warning */
#isNotLegacyAWSARN(account: Account, filePath?: string) {
if (account.arn.startsWith('aws:')) {
const { name: accountName, arn: accountArn } = account;
this.#log.error(
'account must have a valid AWS ARN, legacy examples ' +
"starting with 'aws:' are not supported anymore. " +
'Please convert to a proper account entry (see ' +
'examples at https://github.com/scality/S3/blob/' +
'master/conf/authdata.json). Also note that support ' +
'for account users has been dropped.',
{ accountName, accountArn, filePath }
);
return false;
}
return true;
}
#isValidUsers(account: Account, filePath?: string) {
if (account.users) {
const { name: accountName, arn: accountArn } = account;
this.#log.error(
'support for account users has been dropped, consider ' +
'turning users into account entries (see examples at ' +
'https://github.com/scality/S3/blob/master/conf/' +
'authdata.json)',
{ accountName, accountArn, filePath }
);
return false;
}
return true;
}
#isValidARN(account: Account, filePath?: string) {
const arnObj = ARN.createFromString(account.arn);
const { name: accountName, arn: accountArn } = account;
if (arnObj instanceof ARN) {
if (!arnObj.isIAMAccount()) {
this.#log.error('authentication config validation error', {
reason: 'not an IAM account ARN',
accountName,
accountArn,
filePath,
});
return false;
}
} else {
this.#log.error('authentication config validation error', {
reason: arnObj.error.description,
accountName,
accountArn,
filePath,
});
return false;
}
return true;
}
#isAuthDataValid(authData: any, filePath?: string) {
const options = { abortEarly: true };
const response = types.validators.accounts.validate(authData, options);
if (response.error) {
this.#dumpJoiErrors(response.error.details, filePath);
return false;
}
const validAccounts = response.value.accounts.filter(
(account: Account) =>
this.#isNotLegacyAWSARN(account, filePath) &&
this.#isValidUsers(account, filePath) &&
this.#isValidARN(account, filePath)
);
const areSomeInvalidAccounts =
validAccounts.length !== response.value.accounts.length;
if (areSomeInvalidAccounts) {
return false;
}
const keys = validAccounts.flatMap((account) => account.keys);
const uniqueKeysValidator = types.validators.keys.unique('access');
const areKeysUnique = uniqueKeysValidator.validate(keys);
if (areKeysUnique.error) {
this.#dumpJoiErrors(areKeysUnique.error.details, filePath);
return false;
}
return true;
}
#dumpJoiErrors(errors: joi.ValidationErrorItem[], filePath?: string) {
errors.forEach((err) => {
const baseLogInfo = { item: err.path, filePath };
const logInfo = () => {
if (err.type === 'array.unique') {
const reason = `duplicate value '${err.context?.path}'`;
const dupValue = err.context?.value[err.context.path];
return { ...baseLogInfo, reason, dupValue };
} else {
const reason = err.message;
const context = err.context;
return { ...baseLogInfo, reason, context };
}
};
this.#log.error(
'authentication config validation error',
logInfo()
);
});
}
}


@@ -1,14 +1,11 @@
'use strict'; // eslint-disable-line strict
import * as crypto from 'crypto';
import errors from '../../../errors';
import { calculateSigningKey, hashSignature } from './vaultUtilities';
import Indexer from './Indexer';
import { Accounts } from './types';
import BaseBackend from '../BaseBackend';
const crypto = require('crypto');
const errors = require('../../../errors');
const calculateSigningKey = require('./vaultUtilities').calculateSigningKey;
const hashSignature = require('./vaultUtilities').hashSignature;
const Indexer = require('./Indexer');
const BaseBackend = require('../base');
function _formatResponse(userInfoToSend) {
function _formatResponse(userInfoToSend: any) {
return {
message: {
body: { userInfo: userInfoToSend },
@@ -19,25 +16,28 @@ const crypto = require('crypto');
/**
* Class that provides a memory backend for verifying signatures and getting
* emails and canonical ids associated with an account.
*
* @class InMemoryBackend
*/
class InMemoryBackend extends BaseBackend {
/**
* @constructor
* @param {string} service - service identifier for constructing the ARN
* @param {Indexer} indexer - indexer instance for retrieving account info
* @param {function} formatter - function which accepts user info to send
* back and returns it in an object
*/
constructor(service, indexer, formatter) {
indexer: Indexer;
service: string;
constructor(service: string, indexer: Indexer) {
super(service);
this.service = service;
this.indexer = indexer;
this.formatResponse = formatter;
}
verifySignatureV2(stringToSign, signatureFromRequest,
accessKey, options, callback) {
// CODEQUALITY-TODO-SYNC Should be synchronous
verifySignatureV2(
stringToSign: string,
signatureFromRequest: string,
accessKey: string,
options: { algo: 'SHA256' | 'SHA1' },
callback: (
error: Error | null,
data?: ReturnType<typeof _formatResponse>
) => void
) {
const entity = this.indexer.getEntityByKey(accessKey);
if (!entity) {
return callback(errors.InvalidAccessKeyId);
@@ -52,14 +52,28 @@ class InMemoryBackend extends BaseBackend {
accountDisplayName: this.indexer.getAcctDisplayName(entity),
canonicalID: entity.canonicalID,
arn: entity.arn,
// TODO Why?
// @ts-ignore
IAMdisplayName: entity.IAMdisplayName,
};
const vaultReturnObject = this.formatResponse(userInfoToSend);
const vaultReturnObject = _formatResponse(userInfoToSend);
return callback(null, vaultReturnObject);
}
verifySignatureV4(stringToSign, signatureFromRequest, accessKey,
region, scopeDate, options, callback) {
// TODO Options not used. Why ?
// CODEQUALITY-TODO-SYNC Should be synchronous
verifySignatureV4(
stringToSign: string,
signatureFromRequest: string,
accessKey: string,
region: string,
scopeDate: string,
_options: { algo: 'SHA256' | 'SHA1' },
callback: (
err: Error | null,
data?: ReturnType<typeof _formatResponse>
) => void
) {
const entity = this.indexer.getEntityByKey(accessKey);
if (!entity) {
return callback(errors.InvalidAccessKeyId);
@@ -75,13 +89,21 @@ class InMemoryBackend extends BaseBackend {
accountDisplayName: this.indexer.getAcctDisplayName(entity),
canonicalID: entity.canonicalID,
arn: entity.arn,
// TODO Why?
// @ts-ignore
IAMdisplayName: entity.IAMdisplayName,
};
const vaultReturnObject = this.formatResponse(userInfoToSend);
const vaultReturnObject = _formatResponse(userInfoToSend);
return callback(null, vaultReturnObject);
}
getCanonicalIds(emails, log, cb) {
// TODO log not used. Why ?
// CODEQUALITY-TODO-SYNC Should be synchronous
getCanonicalIds(
emails: string[],
_log: any,
cb: (err: null, data: { message: { body: any } }) => void
) {
const results = {};
emails.forEach(email => {
const lowercasedEmail = email.toLowerCase();
@@ -101,7 +123,13 @@ class InMemoryBackend extends BaseBackend {
return cb(null, vaultReturnObject);
}
getEmailAddresses(canonicalIDs, options, cb) {
// TODO options not used. Why ?
// CODEQUALITY-TODO-SYNC Should be synchronous
getEmailAddresses(
canonicalIDs: string[],
_options: any,
cb: (err: null, data: { message: { body: any } }) => void
) {
const results = {};
canonicalIDs.forEach(canonicalId => {
const foundEntity = this.indexer.getEntityByCanId(canonicalId);
@@ -119,17 +147,24 @@ class InMemoryBackend extends BaseBackend {
return cb(null, vaultReturnObject);
}
// TODO options not used. Why ?
// CODEQUALITY-TODO-SYNC Should be synchronous
/**
* Gets accountIds for a list of accounts based on
* the canonical IDs associated with the account
* @param {array} canonicalIDs - list of canonicalIDs
* @param {object} options - to send log id to vault
* @param {function} cb - callback to calling function
* @returns {function} callback with either error or
* @param canonicalIDs - list of canonicalIDs
* @param _options - to send log id to vault
* @param cb - callback to calling function
* @returns The description below is outdated but kept for the archives:
* callback with either error or
* an object from Vault containing account canonicalID
* as each object key and an accountId as the value (or "NotFound")
*/
getAccountIds(canonicalIDs, options, cb) {
getAccountIds(
canonicalIDs: string[],
_options: any,
cb: (err: null, data: { message: { body: any } }) => void
) {
const results = {};
canonicalIDs.forEach(canonicalID => {
const foundEntity = this.indexer.getEntityByCanId(canonicalID);
@@ -148,31 +183,14 @@ class InMemoryBackend extends BaseBackend {
}
}
class S3AuthBackend extends InMemoryBackend {
/**
* @constructor
* @param {object} authdata - the authentication config file's data
* @param {object[]} authdata.accounts - array of account objects
* @param {string=} authdata.accounts[].name - account name
* @param {string} authdata.accounts[].email - account email
* @param {string} authdata.accounts[].arn - IAM resource name
* @param {string} authdata.accounts[].canonicalID - account canonical ID
* @param {string} authdata.accounts[].shortid - short account ID
* @param {object[]=} authdata.accounts[].keys - array of key objects
* @param {string} authdata.accounts[].keys[].access - access key
* @param {string} authdata.accounts[].keys[].secret - secret key
* @return {undefined}
*/
constructor(authdata) {
super('s3', new Indexer(authdata), _formatResponse);
constructor(authdata: Accounts) {
super('s3', new Indexer(authdata));
}
refreshAuthData(authData) {
refreshAuthData(authData: Accounts) {
this.indexer = new Indexer(authData);
}
}
module.exports = {
s3: S3AuthBackend,
};
export { S3AuthBackend as s3 };


@@ -1,145 +0,0 @@
/**
* Class that provides an internal indexing over the simple data provided by
* the authentication configuration file for the memory backend. This allows
* accessing the different authentication entities through various types of
* keys.
*
* @class Indexer
*/
class Indexer {
/**
* @constructor
* @param {object} authdata - the authentication config file's data
* @param {object[]} authdata.accounts - array of account objects
* @param {string=} authdata.accounts[].name - account name
* @param {string} authdata.accounts[].email - account email
* @param {string} authdata.accounts[].arn - IAM resource name
* @param {string} authdata.accounts[].canonicalID - account canonical ID
* @param {string} authdata.accounts[].shortid - short account ID
* @param {object[]=} authdata.accounts[].keys - array of key objects
* @param {string} authdata.accounts[].keys[].access - access key
* @param {string} authdata.accounts[].keys[].secret - secret key
* @return {undefined}
*/
constructor(authdata) {
this.accountsBy = {
canId: {},
accessKey: {},
email: {},
};
/*
* This may happen if the application is configured to use another
* authentication backend than in-memory.
* As such, we handle the missing data here to avoid failures downstream.
*/
if (!authdata) {
return;
}
this._build(authdata);
}
_indexAccount(account) {
const accountData = {
arn: account.arn,
canonicalID: account.canonicalID,
shortid: account.shortid,
accountDisplayName: account.name,
email: account.email.toLowerCase(),
keys: [],
};
this.accountsBy.canId[accountData.canonicalID] = accountData;
this.accountsBy.email[accountData.email] = accountData;
if (account.keys !== undefined) {
account.keys.forEach(key => {
accountData.keys.push(key);
this.accountsBy.accessKey[key.access] = accountData;
});
}
}
_build(authdata) {
authdata.accounts.forEach(account => {
this._indexAccount(account);
});
}
/**
* This method returns the account associated to a canonical ID.
*
* @param {string} canId - The canonicalId of the account
* @return {Object} account - The account object
* @return {Object} account.arn - The account's ARN
* @return {Object} account.canonicalID - The account's canonical ID
* @return {Object} account.shortid - The account's internal shortid
* @return {Object} account.accountDisplayName - The account's display name
* @return {Object} account.email - The account's lowercased email
*/
getEntityByCanId(canId) {
return this.accountsBy.canId[canId];
}
/**
* This method returns the entity (either an account or a user) associated
* to a canonical ID.
*
* @param {string} key - The accessKey of the entity
* @return {Object} entity - The entity object
* @return {Object} entity.arn - The entity's ARN
* @return {Object} entity.canonicalID - The canonical ID for the entity's
* account
* @return {Object} entity.shortid - The entity's internal shortid
* @return {Object} entity.accountDisplayName - The entity's account
* display name
* @return {Object} entity.IAMDisplayName - The user's display name
* (if the entity is an user)
* @return {Object} entity.email - The entity's lowercased email
*/
getEntityByKey(key) {
return this.accountsBy.accessKey[key];
}
/**
* This method returns the entity (either an account or a user) associated
* to an email address.
*
* @param {string} email - The email address
* @return {Object} entity - The entity object
* @return {Object} entity.arn - The entity's ARN
* @return {Object} entity.canonicalID - The canonical ID for the entity's
* account
* @return {Object} entity.shortid - The entity's internal shortid
* @return {Object} entity.accountDisplayName - The entity's account
* display name
* @return {Object} entity.IAMDisplayName - The user's display name
* (if the entity is an user)
* @return {Object} entity.email - The entity's lowercased email
*/
getEntityByEmail(email) {
const lowerCasedEmail = email.toLowerCase();
return this.accountsBy.email[lowerCasedEmail];
}
/**
* This method returns the secret key associated with the entity.
* @param {Object} entity - the entity object
* @param {string} accessKey - access key
* @returns {string} secret key
*/
getSecretKey(entity, accessKey) {
return entity.keys
.filter(kv => kv.access === accessKey)[0].secret;
}
/**
* This method returns the account display name associated with the entity.
* @param {Object} entity - the entity object
* @returns {string} account display name
*/
getAcctDisplayName(entity) {
return entity.accountDisplayName;
}
}
module.exports = Indexer;


@@ -0,0 +1,93 @@
import { Accounts, Account, Entity } from './types';
/**
* Class that provides an internal indexing over the simple data provided by
* the authentication configuration file for the memory backend. This allows
* accessing the different authentication entities through various types of
* keys.
*/
export default class Indexer {
accountsBy: {
canId: { [id: string]: Entity | undefined },
accessKey: { [id: string]: Entity | undefined },
email: { [id: string]: Entity | undefined },
}
constructor(authdata?: Accounts) {
this.accountsBy = {
canId: {},
accessKey: {},
email: {},
};
/*
* This may happen if the application is configured to use another
* authentication backend than in-memory.
* As such, we handle the missing data here to avoid failures downstream.
*/
if (!authdata) {
return;
}
this.#build(authdata);
}
#indexAccount(account: Account) {
const accountData: Entity = {
arn: account.arn,
canonicalID: account.canonicalID,
shortid: account.shortid,
accountDisplayName: account.name,
email: account.email.toLowerCase(),
keys: [],
};
this.accountsBy.canId[accountData.canonicalID] = accountData;
this.accountsBy.email[accountData.email] = accountData;
if (account.keys !== undefined) {
account.keys.forEach(key => {
accountData.keys.push(key);
this.accountsBy.accessKey[key.access] = accountData;
});
}
}
#build(authdata: Accounts) {
authdata.accounts.forEach(account => {
this.#indexAccount(account);
});
}
/** This method returns the account associated to a canonical ID. */
getEntityByCanId(canId: string): Entity | undefined {
return this.accountsBy.canId[canId];
}
/**
* This method returns the entity (either an account or a user) associated
* to a canonical ID.
* @param {string} key - The accessKey of the entity
*/
getEntityByKey(key: string): Entity | undefined {
return this.accountsBy.accessKey[key];
}
/**
* This method returns the entity (either an account or a user) associated
* to an email address.
*/
getEntityByEmail(email: string): Entity | undefined {
const lowerCasedEmail = email.toLowerCase();
return this.accountsBy.email[lowerCasedEmail];
}
/** This method returns the secret key associated with the entity. */
getSecretKey(entity: Entity, accessKey: string) {
const keys = entity.keys.filter(kv => kv.access === accessKey);
return keys[0].secret;
}
/** This method returns the account display name associated with the entity. */
getAcctDisplayName(entity: Entity) {
return entity.accountDisplayName;
}
}


@@ -0,0 +1,51 @@
import joi from 'joi';
export type Callback<Data = any> = (err: Error | null | undefined, data?: Data) => void;
export type Key = { access: string; secret: string };
export type Base = {
arn: string;
canonicalID: string;
shortid: string;
email: string;
keys: Key[];
};
export type Account = Base & { name: string; users: any[] };
export type Accounts = { accounts: Account[] };
export type Entity = Base & { accountDisplayName: string };
const keys = ((): joi.ArraySchema => {
const str = joi.string().required();
const items = { access: str, secret: str };
return joi.array().items(items).required();
})();
const account = (() => {
return joi.object<Account>({
name: joi.string().required(),
email: joi.string().email().required(),
arn: joi.string().required(),
canonicalID: joi.string().required(),
shortid: joi
.string()
.regex(/^[0-9]{12}$/)
.required(),
keys: keys,
// backward-compat
users: joi.array(),
});
})();
const accounts = (() => {
return joi.object<Accounts>({
accounts: joi
.array()
.items(account)
.required()
.unique('arn')
.unique('email')
.unique('canonicalID'),
});
})();
export const validators = { keys, account, accounts };


@@ -1,18 +0,0 @@
const AuthLoader = require('./AuthLoader');
/**
* @deprecated please use {@link AuthLoader} class instead
*
* @param {object} authdata - the authentication config file's data
* @param {werelogs.API} logApi - object providing a constructor function
* for the Logger object
* @return {boolean} true on erroneous data
* false on success
*/
function validateAuthConfig(authdata, logApi) {
const authLoader = new AuthLoader(logApi);
authLoader.addAccounts(authdata);
return !authLoader.validate();
}
module.exports = validateAuthConfig;


@@ -0,0 +1,16 @@
import { Logger } from 'werelogs';
import AuthLoader from './AuthLoader';
import { Accounts } from './types';
/**
* @deprecated please use {@link AuthLoader} class instead
* @return true on erroneous data false on success
*/
export default function validateAuthConfig(
authdata: Accounts,
logApi?: { Logger: typeof Logger }
) {
const authLoader = new AuthLoader(logApi);
authLoader.addAccounts(authdata);
return !authLoader.validate();
}


@@ -1,6 +1,4 @@
'use strict'; // eslint-disable-line strict
const crypto = require('crypto');
import * as crypto from 'crypto';
/** hashSignature for v2 Auth
* @param {string} stringToSign - built string to sign per AWS rules
@@ -8,11 +6,19 @@ const crypto = require('crypto');
* @param {string} algorithm - either SHA256 or SHA1
* @return {string} reconstructed signature
*/
function hashSignature(stringToSign, secretKey, algorithm) {
export function hashSignature(
stringToSign: string,
secretKey: string,
algorithm: 'SHA256' | 'SHA1'
): string {
const hmacObject = crypto.createHmac(algorithm, secretKey);
return hmacObject.update(stringToSign, 'binary').digest('base64');
}
const sha256 = (key: string | Buffer, data: string) => {
return crypto.createHmac('sha256', key).update(data, 'binary').digest();
};
/** calculateSigningKey for v4 Auth
* @param {string} secretKey - requester's secretKey
* @param {string} region - region included in request
@@ -20,16 +26,15 @@ function hashSignature(stringToSign, secretKey, algorithm) {
* @param {string} [service] - To specify another service than s3
* @return {string} signingKey - signingKey to calculate signature
*/
function calculateSigningKey(secretKey, region, scopeDate, service) {
const dateKey = crypto.createHmac('sha256', `AWS4${secretKey}`)
.update(scopeDate, 'binary').digest();
const dateRegionKey = crypto.createHmac('sha256', dateKey)
.update(region, 'binary').digest();
const dateRegionServiceKey = crypto.createHmac('sha256', dateRegionKey)
.update(service || 's3', 'binary').digest();
const signingKey = crypto.createHmac('sha256', dateRegionServiceKey)
.update('aws4_request', 'binary').digest();
export function calculateSigningKey(
secretKey: string,
region: string,
scopeDate: string,
service?: string
): Buffer {
const dateKey = sha256(`AWS4${secretKey}`, scopeDate);
const dateRegionKey = sha256(dateKey, region);
const dateRegionServiceKey = sha256(dateRegionKey, service || 's3');
const signingKey = sha256(dateRegionServiceKey, 'aws4_request');
return signingKey;
}
module.exports = { hashSignature, calculateSigningKey };


@@ -1,7 +1,5 @@
'use strict'; // eslint-disable-line strict
function algoCheck(signatureLength) {
let algo;
export default function algoCheck(signatureLength: number) {
let algo: 'sha256' | 'sha1';
// If the signature sent is 44 characters,
// this means that sha256 was used:
// 44 characters in base64
@@ -13,7 +11,6 @@ function algoCheck(signatureLength) {
if (signatureLength === SHA1LEN) {
algo = 'sha1';
}
// @ts-ignore
return algo;
}
module.exports = algoCheck;


@@ -1,11 +0,0 @@
'use strict'; // eslint-disable-line strict
const headerAuthCheck = require('./headerAuthCheck');
const queryAuthCheck = require('./queryAuthCheck');
const authV2 = {
header: headerAuthCheck,
query: queryAuthCheck,
};
module.exports = authV2;

lib/auth/v2/authV2.ts Normal file

@@ -0,0 +1,2 @@
export * as header from './headerAuthCheck';
export * as query from './queryAuthCheck';

View File

@@ -1,9 +1,9 @@
'use strict'; // eslint-disable-line strict
const errors = require('../../errors');
import { Logger } from 'werelogs';
import errors from '../../errors';
const epochTime = new Date('1970-01-01').getTime();
function checkRequestExpiry(timestamp, log) {
export default function checkRequestExpiry(timestamp: number, log: Logger) {
// If timestamp is before epochTime, the request is invalid and return
// errors.AccessDenied
if (timestamp < epochTime) {
@@ -32,5 +32,3 @@ function checkRequestExpiry(timestamp, log) {
return undefined;
}
module.exports = checkRequestExpiry;


@@ -1,11 +1,14 @@
'use strict'; // eslint-disable-line strict
import { Logger } from 'werelogs';
import utf8 from 'utf8';
import getCanonicalizedAmzHeaders from './getCanonicalizedAmzHeaders';
import getCanonicalizedResource from './getCanonicalizedResource';
const utf8 = require('utf8');
const getCanonicalizedAmzHeaders = require('./getCanonicalizedAmzHeaders');
const getCanonicalizedResource = require('./getCanonicalizedResource');
function constructStringToSign(request, data, log, clientType) {
export default function constructStringToSign(
request: any,
data: { [key: string]: string },
log: Logger,
clientType?: any
) {
/*
Build signature per AWS requirements:
StringToSign = HTTP-Verb + '\n' +
@@ -42,5 +45,3 @@ function constructStringToSign(request, data, log, clientType) {
+ getCanonicalizedResource(request, clientType);
return utf8.encode(stringToSign);
}
module.exports = constructStringToSign;


@@ -1,14 +1,12 @@
'use strict'; // eslint-disable-line strict
function getCanonicalizedAmzHeaders(headers, clientType) {
export default function getCanonicalizedAmzHeaders(headers: Headers, clientType: string) {
/*
Iterate through headers and pull any headers that are x-amz headers.
Need to include 'x-amz-date' here even though AWS docs
ambiguous on this.
*/
const filterFn = clientType === 'GCP' ?
val => val.substr(0, 7) === 'x-goog-' :
val => val.substr(0, 6) === 'x-amz-';
(val: string) => val.substr(0, 7) === 'x-goog-' :
(val: string) => val.substr(0, 6) === 'x-amz-';
const amzHeaders = Object.keys(headers)
.filter(filterFn)
.map(val => [val.trim(), headers[val].trim()]);
@@ -43,5 +41,3 @@ function getCanonicalizedAmzHeaders(headers, clientType) {
`${headerStr}${current[0]}:${current[1]}\n`,
'');
}
module.exports = getCanonicalizedAmzHeaders;


@@ -1,6 +1,4 @@
'use strict'; // eslint-disable-line strict
const url = require('url');
import * as url from 'url';
const gcpSubresources = [
'acl',
@@ -41,7 +39,7 @@ const awsSubresources = [
'website',
];
function getCanonicalizedResource(request, clientType) {
export default function getCanonicalizedResource(request: any, clientType: string) {
/*
This variable is used to determine whether to insert
a '?' or '&'. Once a query parameter is added to the resourceString,
@@ -117,5 +115,3 @@ function getCanonicalizedResource(request, clientType) {
}
return resourceString;
}
module.exports = getCanonicalizedResource;


@@ -1,12 +1,11 @@
'use strict'; // eslint-disable-line strict
import { Logger } from 'werelogs';
import errors from '../../errors';
import * as constants from '../../constants';
import constructStringToSign from './constructStringToSign';
import checkRequestExpiry from './checkRequestExpiry';
import algoCheck from './algoCheck';
const errors = require('../../errors');
const constants = require('../../constants');
const constructStringToSign = require('./constructStringToSign');
const checkRequestExpiry = require('./checkRequestExpiry');
const algoCheck = require('./algoCheck');
function check(request, log, data) {
export function check(request: any, log: Logger, data: { [key: string]: string }) {
log.trace('running header auth check');
const headers = request.headers;
@@ -52,6 +51,7 @@ function check(request, log, data) {
log.trace('invalid authorization header', { authInfo });
return { err: errors.MissingSecurityHeader };
}
// @ts-ignore
log.addDefaultFields({ accessKey });
const signatureFromRequest = authInfo.substring(semicolonIndex + 1).trim();
@@ -80,5 +80,3 @@ function check(request, log, data) {
},
};
}
module.exports = { check };


@@ -1,11 +1,10 @@
'use strict'; // eslint-disable-line strict
import { Logger } from 'werelogs';
import errors from '../../errors';
import * as constants from '../../constants';
import algoCheck from './algoCheck';
import constructStringToSign from './constructStringToSign';
const errors = require('../../errors');
const constants = require('../../constants');
const algoCheck = require('./algoCheck');
const constructStringToSign = require('./constructStringToSign');
function check(request, log, data) {
export function check(request: any, log: Logger, data: { [key: string]: string }) {
log.trace('running query auth check');
if (request.method === 'POST') {
log.debug('query string auth not supported for post requests');
@ -51,6 +50,7 @@ function check(request, log, data) {
return { err: errors.RequestTimeTooSkewed };
}
const accessKey = data.AWSAccessKeyId;
// @ts-ignore
log.addDefaultFields({ accessKey });
const signatureFromRequest = decodeURIComponent(data.Signature);
@ -82,5 +82,3 @@ function check(request, log, data) {
},
};
}
module.exports = { check };

View File

@ -1,11 +0,0 @@
'use strict'; // eslint-disable-line strict
const headerAuthCheck = require('./headerAuthCheck');
const queryAuthCheck = require('./queryAuthCheck');
const authV4 = {
header: headerAuthCheck,
query: queryAuthCheck,
};
module.exports = authV4;

2
lib/auth/v4/authV4.ts Normal file
View File

@ -0,0 +1,2 @@
export * as header from './headerAuthCheck';
export * as query from './queryAuthCheck';

View File

@ -1,5 +1,3 @@
'use strict'; // eslint-disable-line strict
/*
AWS's URI encoding rules:
URI encode every byte. Uri-Encode() must enforce the following rules:
@ -19,7 +17,7 @@ See http://docs.aws.amazon.com/AmazonS3/latest/API/sig-v4-header-based-auth.html
*/
// converts utf8 character to hex and pads "%" before every two hex digits
function _toHexUTF8(char) {
function _toHexUTF8(char: string) {
const hexRep = Buffer.from(char, 'utf8').toString('hex').toUpperCase();
let res = '';
hexRep.split('').forEach((v, n) => {
@ -32,7 +30,11 @@ function _toHexUTF8(char) {
return res;
}
function awsURIencode(input, encodeSlash, noEncodeStar) {
export default function awsURIencode(
input: string,
encodeSlash?: boolean,
noEncodeStar?: boolean
) {
const encSlash = encodeSlash === undefined ? true : encodeSlash;
let encoded = '';
/**
@ -76,5 +78,3 @@ function awsURIencode(input, encodeSlash, noEncodeStar) {
}
return encoded;
}
module.exports = awsURIencode;
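A few illustrative calls, with expected results following the AWS rules quoted above (import path illustrative):

import awsURIencode from './awsURIencode';

awsURIencode('photos/puppy.jpg');        // 'photos%2Fpuppy.jpg' (slash encoded by default)
awsURIencode('photos/puppy.jpg', false); // 'photos/puppy.jpg'
awsURIencode('key with *', true, true);  // 'key%20with%20*' (noEncodeStar keeps '*')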

View File

@ -1,17 +1,33 @@
'use strict'; // eslint-disable-line strict
const crypto = require('crypto');
const createCanonicalRequest = require('./createCanonicalRequest');
import * as crypto from 'crypto';
import { Logger } from 'werelogs';
import createCanonicalRequest from './createCanonicalRequest';
/**
* constructStringToSign - creates V4 stringToSign
* @param {object} params - params object
* @returns {string} - stringToSign
*/
function constructStringToSign(params) {
const { request, signedHeaders, payloadChecksum, credentialScope, timestamp,
query, log, proxyPath } = params;
export default function constructStringToSign(params: {
request: any;
signedHeaders: any;
payloadChecksum: any;
credentialScope: string;
timestamp: string;
query: { [key: string]: string };
log?: Logger;
proxyPath?: string;
awsService: string;
}): string | Error {
const {
request,
signedHeaders,
payloadChecksum,
credentialScope,
timestamp,
query,
log,
proxyPath,
} = params;
const path = proxyPath || request.path;
const canonicalReqResult = createCanonicalRequest({
@ -24,6 +40,8 @@ function constructStringToSign(params) {
service: params.awsService,
});
// TODO Why that line?
// @ts-ignore
if (canonicalReqResult instanceof Error) {
if (log) {
log.error('error creating canonicalRequest');
@ -40,5 +58,3 @@ function constructStringToSign(params) {
`${credentialScope}\n${canonicalHex}`;
return stringToSign;
}
module.exports = constructStringToSign;
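For orientation, the fixed four-line shape of the value returned above (values hypothetical):

// AWS4-HMAC-SHA256                     <- algorithm, fixed
// 20160202T220410Z                     <- timestamp
// 20160202/us-east-1/s3/aws4_request   <- credentialScope
// <hex-encoded SHA-256 of the canonical request>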

View File

@ -1,27 +1,33 @@
'use strict'; // eslint-disable-line strict
const awsURIencode = require('./awsURIencode');
const crypto = require('crypto');
const queryString = require('querystring');
import * as crypto from 'crypto';
import * as queryString from 'querystring';
import awsURIencode from './awsURIencode';
/**
* createCanonicalRequest - creates V4 canonical request
* @param {object} params - contains pHttpVerb (request type),
* @param params - contains pHttpVerb (request type),
* pResource (parsed from URL), pQuery (request query),
* pHeaders (request headers), pSignedHeaders (signed headers from request),
* payloadChecksum (from request)
* @returns {string} - canonicalRequest
* @returns - canonicalRequest
*/
function createCanonicalRequest(params) {
export default function createCanonicalRequest(
params: {
pHttpVerb: string;
pResource: string;
pQuery: { [key: string]: string };
pHeaders: any;
pSignedHeaders: any;
service: string;
payloadChecksum: string;
}
) {
const pHttpVerb = params.pHttpVerb;
const pResource = params.pResource;
const pQuery = params.pQuery;
const pHeaders = params.pHeaders;
const pSignedHeaders = params.pSignedHeaders;
const service = params.service;
let payloadChecksum = params.payloadChecksum;
if (!payloadChecksum) {
if (pHttpVerb === 'GET') {
payloadChecksum = 'e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b' +
@ -34,7 +40,7 @@ function createCanonicalRequest(params) {
if (/aws-sdk-java\/[0-9.]+/.test(pHeaders['user-agent'])) {
notEncodeStar = true;
}
let payload = queryString.stringify(pQuery, null, null, {
let payload = queryString.stringify(pQuery, undefined, undefined, {
encodeURIComponent: input => awsURIencode(input, true,
notEncodeStar),
});
@ -61,11 +67,11 @@ function createCanonicalRequest(params) {
// signed headers
const signedHeadersList = pSignedHeaders.split(';');
signedHeadersList.sort((a, b) => a.localeCompare(b));
signedHeadersList.sort((a: any, b: any) => a.localeCompare(b));
const signedHeaders = signedHeadersList.join(';');
// canonical headers
const canonicalHeadersList = signedHeadersList.map(signedHeader => {
const canonicalHeadersList = signedHeadersList.map((signedHeader: any) => {
if (pHeaders[signedHeader] !== undefined) {
const trimmedHeader = pHeaders[signedHeader]
.trim().replace(/\s+/g, ' ');
@ -87,5 +93,3 @@ function createCanonicalRequest(params) {
`${signedHeaders}\n${payloadChecksum}`;
return canonicalRequest;
}
module.exports = createCanonicalRequest;
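A worked (hypothetical) canonical request for a GET of /test.txt signed over host and x-amz-date only; note the blank line closing the canonical-headers block and the trailing hash of the empty payload, the same constant used as the default payloadChecksum above:

// GET
// /test.txt
//                                      <- empty canonical query string
// host:examplebucket.s3.amazonaws.com
// x-amz-date:20160202T220410Z
//                                      <- blank line after canonical headers
// host;x-amz-date
// e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855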

View File

@ -1,27 +1,32 @@
'use strict'; // eslint-disable-line strict
const errors = require('../../../lib/errors');
const constants = require('../../constants');
const constructStringToSign = require('./constructStringToSign');
const checkTimeSkew = require('./timeUtils').checkTimeSkew;
const convertUTCtoISO8601 = require('./timeUtils').convertUTCtoISO8601;
const convertAmzTimeToMs = require('./timeUtils').convertAmzTimeToMs;
const extractAuthItems = require('./validateInputs').extractAuthItems;
const validateCredentials = require('./validateInputs').validateCredentials;
const areSignedHeadersComplete =
require('./validateInputs').areSignedHeadersComplete;
import { Logger } from 'werelogs';
import errors from '../../../lib/errors';
import * as constants from '../../constants';
import constructStringToSign from './constructStringToSign';
import {
checkTimeSkew,
convertUTCtoISO8601,
convertAmzTimeToMs,
} from './timeUtils';
import {
extractAuthItems,
validateCredentials,
areSignedHeadersComplete,
} from './validateInputs';
/**
* V4 header auth check
* @param {object} request - HTTP request object
* @param {object} log - logging object
* @param {object} data - Parameters from queryString parsing or body of
* @param request - HTTP request object
* @param log - logging object
* @param data - Parameters from queryString parsing or body of
* POST request
* @param {string} awsService - Aws service ('iam' or 's3')
* @return {callback} calls callback
* @param awsService - Aws service ('iam' or 's3')
*/
function check(request, log, data, awsService) {
export function check(
request: any,
log: Logger,
data: { [key: string]: string },
awsService: string
) {
log.trace('running header auth check');
const token = request.headers['x-amz-security-token'];
@ -62,16 +67,16 @@ function check(request, log, data, awsService) {
log.trace('authorization header from request', { authHeader });
const signatureFromRequest = authHeaderItems.signatureFromRequest;
const credentialsArr = authHeaderItems.credentialsArr;
const signedHeaders = authHeaderItems.signedHeaders;
const signatureFromRequest = authHeaderItems.signatureFromRequest!;
const credentialsArr = authHeaderItems.credentialsArr!;
const signedHeaders = authHeaderItems.signedHeaders!;
if (!areSignedHeadersComplete(signedHeaders, request.headers)) {
log.debug('signedHeaders are incomplete', { signedHeaders });
return { err: errors.AccessDenied };
}
let timestamp;
let timestamp: string | undefined;
// check request timestamp
const xAmzDate = request.headers['x-amz-date'];
if (xAmzDate) {
@ -127,7 +132,7 @@ function check(request, log, data, awsService) {
return { err: errors.RequestTimeTooSkewed };
}
let proxyPath = null;
let proxyPath: string | null = null;
if (request.headers.proxy_path) {
try {
proxyPath = decodeURIComponent(request.headers.proxy_path);
@ -147,7 +152,7 @@ function check(request, log, data, awsService) {
timestamp,
payloadChecksum,
awsService: service,
proxyPath,
proxyPath: proxyPath!,
});
log.trace('constructed stringToSign', { stringToSign });
if (stringToSign instanceof Error) {
@ -178,5 +183,3 @@ function check(request, log, data, awsService) {
},
};
}
module.exports = { check };
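The success branch is elided in the hunks above; as a rough TypeScript sketch of the contract (shapes inferred from the visible returns, not taken from the source):

// Sketch only; these are not types exported by the module.
type ArsenalErrorLike = Error & { code: number; description: string };
type V4CheckResult =
    | { err: ArsenalErrorLike }
    | { params: { version: 4; data: Record<string, any> } };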

View File

@ -1,24 +1,18 @@
'use strict'; // eslint-disable-line strict
const constants = require('../../constants');
const errors = require('../../errors');
const constructStringToSign = require('./constructStringToSign');
const checkTimeSkew = require('./timeUtils').checkTimeSkew;
const convertAmzTimeToMs = require('./timeUtils').convertAmzTimeToMs;
const validateCredentials = require('./validateInputs').validateCredentials;
const extractQueryParams = require('./validateInputs').extractQueryParams;
const areSignedHeadersComplete =
require('./validateInputs').areSignedHeadersComplete;
import { Logger } from 'werelogs';
import * as constants from '../../constants';
import errors from '../../errors';
import constructStringToSign from './constructStringToSign';
import { checkTimeSkew, convertAmzTimeToMs } from './timeUtils';
import { validateCredentials, extractQueryParams } from './validateInputs';
import { areSignedHeadersComplete } from './validateInputs';
/**
* V4 query auth check
* @param {object} request - HTTP request object
* @param {object} log - logging object
* @param {object} data - Contains authentication params (GET or POST data)
* @return {callback} calls callback
* @param request - HTTP request object
* @param log - logging object
* @param data - Contains authentication params (GET or POST data)
*/
function check(request, log, data) {
export function check(request: any, log: Logger, data: { [key: string]: string }) {
const authParams = extractQueryParams(data, log);
if (Object.keys(authParams).length !== 5) {
@ -33,11 +27,11 @@ function check(request, log, data) {
return { err: errors.InvalidToken };
}
const signedHeaders = authParams.signedHeaders;
const signatureFromRequest = authParams.signatureFromRequest;
const timestamp = authParams.timestamp;
const expiry = authParams.expiry;
const credential = authParams.credential;
const signedHeaders = authParams.signedHeaders!;
const signatureFromRequest = authParams.signatureFromRequest!;
const timestamp = authParams.timestamp!;
const expiry = authParams.expiry!;
const credential = authParams.credential!;
if (!areSignedHeadersComplete(signedHeaders, request.headers)) {
log.debug('signedHeaders are incomplete', { signedHeaders });
@ -62,7 +56,7 @@ function check(request, log, data) {
return { err: errors.RequestTimeTooSkewed };
}
let proxyPath = null;
let proxyPath: string | null = null;
if (request.headers.proxy_path) {
try {
proxyPath = decodeURIComponent(request.headers.proxy_path);
@ -98,7 +92,7 @@ function check(request, log, data) {
credentialScope:
`${scopeDate}/${region}/${service}/${requestType}`,
awsService: service,
proxyPath,
proxyPath: proxyPath!,
});
if (stringToSign instanceof Error) {
return { err: stringToSign };
@ -122,5 +116,3 @@ function check(request, log, data) {
},
};
}
module.exports = { check };

View File

@ -1,17 +1,38 @@
const { Transform } = require('stream');
const async = require('async');
const errors = require('../../../errors');
const constructChunkStringToSign = require('./constructChunkStringToSign');
import { Transform } from 'stream';
import async from 'async';
import { Logger } from 'werelogs';
import { Callback } from '../../backends/in_memory/types';
import Vault from '../../Vault';
import errors from '../../../errors';
import constructChunkStringToSign from './constructChunkStringToSign';
/**
* This class is designed to handle the chunks sent in a streaming
* v4 Auth request
*/
class V4Transform extends Transform {
export default class V4Transform extends Transform {
log: Logger;
cb: Callback;
accessKey: string;
region: string;
/** Date parsed from headers in ISO8601. */
scopeDate: string;
/** Date parsed from headers in ISO8601. */
timestamp: string;
/** Items from auth header, plus the string 'aws4_request' joined with '/': timestamp/region/aws-service/aws4_request */
credentialScope: string;
lastSignature?: string;
currentSignature?: string;
haveMetadata: boolean;
seekingDataSize: number;
currentData?: any;
dataCursor: number;
currentMetadata: Buffer[];
lastPieceDone: boolean;
lastChunk: boolean;
vault: Vault;
/**
* @constructor
* @param {object} streamingV4Params - info for chunk authentication
* @param {string} streamingV4Params.accessKey - requester's accessKey
* @param {string} streamingV4Params.signatureFromRequest - signature
@ -27,9 +48,27 @@ class V4Transform extends Transform {
* @param {object} log - logger object
* @param {function} cb - callback to api
*/
constructor(streamingV4Params, vault, log, cb) {
const { accessKey, signatureFromRequest, region, scopeDate, timestamp,
credentialScope } = streamingV4Params;
constructor(
streamingV4Params: {
accessKey: string,
signatureFromRequest: string,
region: string,
scopeDate: string,
timestamp: string,
credentialScope: string
},
vault: Vault,
log: Logger,
cb: Callback
) {
const {
accessKey,
signatureFromRequest,
region,
scopeDate,
timestamp,
credentialScope,
} = streamingV4Params;
super({});
this.log = log;
this.cb = cb;
@ -55,8 +94,8 @@ class V4Transform extends Transform {
/**
* This function will parse the metadata portion of the chunk
* @param {Buffer} remainingChunk - chunk sent from _transform
* @return {object} response - if error, will return 'err' key with
* @param remainingChunk - chunk sent from _transform
* @return response - if error, will return 'err' key with
* arsenal error value.
* if incomplete metadata, will return 'completeMetadata' key with
* value false
@ -64,7 +103,7 @@ class V4Transform extends Transform {
* value true and the key 'unparsedChunk' with the remaining chunk without
* the parsed metadata piece
*/
_parseMetadata(remainingChunk) {
_parseMetadata(remainingChunk: Buffer) {
let remainingPlusStoredMetadata = remainingChunk;
// have metadata pieces so need to add to the front of
// remainingChunk
@ -79,33 +118,34 @@ class V4Transform extends Transform {
this.currentMetadata.push(remainingPlusStoredMetadata);
return { completeMetadata: false };
}
let fullMetadata = remainingPlusStoredMetadata.slice(0,
lineBreakIndex);
let fullMetadata = remainingPlusStoredMetadata.slice(0, lineBreakIndex);
// handle extra line break on end of data chunk
if (fullMetadata.length === 0) {
const chunkWithoutLeadingLineBreak = remainingPlusStoredMetadata
.slice(2);
const chunkWithoutLeadingLineBreak =
remainingPlusStoredMetadata.slice(2);
// find second line break
lineBreakIndex = chunkWithoutLeadingLineBreak.indexOf('\r\n');
if (lineBreakIndex < 0) {
this.currentMetadata.push(chunkWithoutLeadingLineBreak);
return { completeMetadata: false };
}
fullMetadata = chunkWithoutLeadingLineBreak.slice(0,
lineBreakIndex);
fullMetadata = chunkWithoutLeadingLineBreak.slice(
0,
lineBreakIndex
);
}
const splitMeta = fullMetadata.toString().split(';');
this.log.trace('parsed full metadata for chunk', { splitMeta });
if (splitMeta.length !== 2) {
this.log.trace('chunk body did not contain correct ' +
'metadata format');
this.log.trace(
'chunk body did not contain correct ' + 'metadata format'
);
return { err: errors.InvalidArgument };
}
let dataSize = splitMeta[0];
// chunk-size is sent in hex
dataSize = Number.parseInt(dataSize, 16);
let dataSize = Number.parseInt(splitMeta[0], 16);
if (Number.isNaN(dataSize)) {
this.log.trace('chunk body did not contain valid size');
return { err: errors.InvalidArgument };
@ -132,28 +172,32 @@ class V4Transform extends Transform {
completeMetadata: true,
// start slice at lineBreak plus 2 to remove line break at end of
// metadata piece since length of '\r\n' is 2
unparsedChunk: remainingPlusStoredMetadata
.slice(lineBreakIndex + 2),
unparsedChunk: remainingPlusStoredMetadata.slice(
lineBreakIndex + 2
),
};
}
/**
* Build the stringToSign and authenticate the chunk
* @param {Buffer} dataToSend - chunk sent from _transform or null
* @param dataToSend - chunk sent from _transform or null
* if last chunk without data
* @param {function} done - callback to _transform
* @return {function} executes callback with err if applicable
* @param done - callback to _transform
* @return executes callback with err if applicable
*/
_authenticate(dataToSend, done) {
_authenticate(dataToSend: Buffer | null, done: (err?: Error) => void) {
// use prior sig to construct new string to sign
const stringToSign = constructChunkStringToSign(this.timestamp,
this.credentialScope, this.lastSignature, dataToSend);
this.log.trace('constructed chunk string to sign',
{ stringToSign });
const stringToSign = constructChunkStringToSign(
this.timestamp,
this.credentialScope,
this.lastSignature!,
dataToSend
);
this.log.trace('constructed chunk string to sign', { stringToSign });
// once used prior sig to construct string to sign, reassign
// lastSignature to current signature
this.lastSignature = this.currentSignature;
const vaultParams = {
const vaultParams: any = {
log: this.log,
data: {
accessKey: this.accessKey,
@ -165,28 +209,30 @@ class V4Transform extends Transform {
credentialScope: this.credentialScope,
},
};
return this.vault.authenticateV4Request(vaultParams, null, err => {
return this.vault.authenticateV4Request(vaultParams, null, (err) => {
if (err) {
this.log.trace('err from vault on streaming v4 auth',
{ error: err, paramsSentToVault: vaultParams.data });
this.log.trace('err from vault on streaming v4 auth', {
error: err,
paramsSentToVault: vaultParams.data,
});
return done(err);
}
return done();
});
}
// TODO encoding unused. Why?
/**
* This function will parse the chunk into metadata and data,
* use the metadata to authenticate with vault and send the
* data on to be stored if authentication passes
*
* @param {Buffer} chunk - chunk from request body
* @param {string} encoding - Data encoding
* @param {function} callback - Callback(err, justDataChunk, encoding)
* @return {function }executes callback with err if applicable
* @param chunk - chunk from request body
* @param encoding - Data encoding
* @param callback - Callback(err, justDataChunk, encoding)
* @return executes callback with err if applicable
*/
_transform(chunk, encoding, callback) {
_transform(chunk: Buffer, _encoding: string, callback: (err?: Error) => void) {
// 'chunk' here is the node streaming chunk
// transfer-encoding chunks should be of the format:
// string(IntHexBase(chunk-size)) + ";chunk-signature=" +
@ -195,9 +241,10 @@ class V4Transform extends Transform {
if (this.lastPieceDone) {
const slice = chunk.slice(0, 10);
this.log.trace('received chunk after end.' +
'See first 10 bytes of chunk',
{ chunk: slice.toString() });
this.log.trace(
'received chunk after end.' + 'See first 10 bytes of chunk',
{ chunk: slice.toString() }
);
return callback();
}
let unparsedChunk = chunk;
@ -206,10 +253,11 @@ class V4Transform extends Transform {
// test function
() => chunkLeftToEvaluate,
// async function
done => {
(done) => {
if (!this.haveMetadata) {
this.log.trace('do not have metadata so calling ' +
'_parseMetadata');
this.log.trace(
'do not have metadata so calling ' + '_parseMetadata'
);
// need to parse our metadata
const parsedMetadataResults =
this._parseMetadata(unparsedChunk);
@ -223,11 +271,11 @@ class V4Transform extends Transform {
}
// have metadata so reset unparsedChunk to remaining
// without metadata piece
unparsedChunk = parsedMetadataResults.unparsedChunk;
unparsedChunk = parsedMetadataResults.unparsedChunk!;
}
if (this.lastChunk) {
this.log.trace('authenticating final chunk with no data');
return this._authenticate(null, err => {
return this._authenticate(null, (err) => {
if (err) {
return done(err);
}
@ -246,17 +294,18 @@ class V4Transform extends Transform {
}
// parse just the next data piece without \r\n at the end
// (therefore, minus 2)
const nextDataPiece =
unparsedChunk.slice(0, this.seekingDataSize - 2);
const nextDataPiece = unparsedChunk.slice(
0,
this.seekingDataSize - 2
);
// add parsed data piece to other currentData pieces
// so that this.currentData is the full data piece
nextDataPiece.copy(this.currentData, this.dataCursor);
return this._authenticate(this.currentData, err => {
return this._authenticate(this.currentData, (err) => {
if (err) {
return done(err);
}
unparsedChunk =
unparsedChunk.slice(this.seekingDataSize);
unparsedChunk = unparsedChunk.slice(this.seekingDataSize);
this.push(this.currentData);
this.haveMetadata = false;
this.seekingDataSize = -1;
@ -267,15 +316,13 @@ class V4Transform extends Transform {
});
},
// final callback
err => {
(err) => {
if (err) {
return this.cb(err);
return this.cb(err as any);
}
// get next chunk
return callback();
},
}
);
}
}
module.exports = V4Transform;
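A worked example of the wire format _transform parses (sizes and signature hypothetical):

// Each streaming-v4 body chunk is:
//   string(IntHexBase(chunk-size)) + ';chunk-signature=' + signature + '\r\n'
//     + chunk-data + '\r\n'
// e.g. for a 1024-byte piece (0x400):
//   400;chunk-signature=<64-char hex signature>\r\n<1024 bytes of data>\r\n
// The final chunk carries no data:
//   0;chunk-signature=<64-char hex signature>\r\n\r\n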

View File

@ -1,32 +0,0 @@
const crypto = require('crypto');
const constants = require('../../../constants');
/**
* Constructs stringToSign for chunk
* @param {string} timestamp - date parsed from headers
* in ISO 8601 format: YYYYMMDDTHHMMSSZ
* @param {string} credentialScope - items from auth
* header plus the string 'aws4_request' joined with '/':
* timestamp/region/aws-service/aws4_request
* @param {string} lastSignature - signature from headers or prior chunk
* @param {string} justDataChunk - data portion of chunk
* @returns {string} stringToSign
*/
function constructChunkStringToSign(timestamp,
credentialScope, lastSignature, justDataChunk) {
let currentChunkHash;
// for last chunk, there will be no data, so use emptyStringHash
if (!justDataChunk) {
currentChunkHash = constants.emptyStringHash;
} else {
currentChunkHash = crypto.createHash('sha256');
currentChunkHash = currentChunkHash
.update(justDataChunk, 'binary').digest('hex');
}
return `AWS4-HMAC-SHA256-PAYLOAD\n${timestamp}\n` +
`${credentialScope}\n${lastSignature}\n` +
`${constants.emptyStringHash}\n${currentChunkHash}`;
}
module.exports = constructChunkStringToSign;

View File

@ -0,0 +1,36 @@
import * as crypto from 'crypto';
import * as constants from '../../../constants';
/**
* Constructs stringToSign for chunk
* @param timestamp - date parsed from headers in ISO 8601 format: YYYYMMDDTHHMMSSZ
* @param credentialScope - items from auth header plus the string
* 'aws4_request' joined with '/':
* timestamp/region/aws-service/aws4_request
* @param lastSignature - signature from headers or prior chunk
* @param justDataChunk - data portion of chunk
*/
export default function constructChunkStringToSign(
timestamp: string,
credentialScope: string,
lastSignature: string,
justDataChunk: string | Buffer | null
): string {
let currentChunkHash: string;
// for last chunk, there will be no data, so use emptyStringHash
if (!justDataChunk) {
currentChunkHash = constants.emptyStringHash;
} else {
let hash = crypto.createHash('sha256');
currentChunkHash = (
typeof justDataChunk === 'string'
? hash.update(justDataChunk, 'binary')
: hash.update(justDataChunk)
).digest('hex');
}
return (
`AWS4-HMAC-SHA256-PAYLOAD\n${timestamp}\n` +
`${credentialScope}\n${lastSignature}\n` +
`${constants.emptyStringHash}\n${currentChunkHash}`
);
}
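A hypothetical call, showing the fixed six-line result:

// constructChunkStringToSign(
//     '20160202T220410Z',
//     '20160202/us-east-1/s3/aws4_request',
//     lastSignature,            // 64-char hex signature of the previous chunk
//     Buffer.from('hello'))
// returns:
//   AWS4-HMAC-SHA256-PAYLOAD
//   20160202T220410Z
//   20160202/us-east-1/s3/aws4_request
//   <lastSignature>
//   <constants.emptyStringHash> (sha256 of '')
//   <sha256 hex of 'hello'>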

View File

@ -1,12 +1,11 @@
'use strict'; // eslint-disable-line strict
import { Logger } from 'werelogs';
/**
* Convert timestamp to milliseconds since Unix Epoch
* @param {string} timestamp of ISO8601Timestamp format without
* @param timestamp of ISO8601Timestamp format without
* dashes or colons, e.g. 20160202T220410Z
* @return {number} number of milliseconds since Unix Epoch
*/
function convertAmzTimeToMs(timestamp) {
export function convertAmzTimeToMs(timestamp: string) {
const arr = timestamp.split('');
// Convert to YYYY-MM-DDTHH:mm:ss.sssZ
const ISO8601time = `${arr.slice(0, 4).join('')}-${arr[4]}${arr[5]}` +
@ -15,13 +14,12 @@ function convertAmzTimeToMs(timestamp) {
return Date.parse(ISO8601time);
}
/**
* Convert UTC timestamp to ISO 8601 timestamp
* @param {string} timestamp of UTC form: Fri, 10 Feb 2012 21:34:55 GMT
* @return {string} ISO8601 timestamp of form: YYYYMMDDTHHMMSSZ
* @param timestamp of UTC form: Fri, 10 Feb 2012 21:34:55 GMT
* @return ISO8601 timestamp of form: YYYYMMDDTHHMMSSZ
*/
function convertUTCtoISO8601(timestamp) {
export function convertUTCtoISO8601(timestamp: string | number) {
// convert to ISO string: YYYY-MM-DDTHH:mm:ss.sssZ.
const converted = new Date(timestamp).toISOString();
// Remove "-"s and "."s and milliseconds
@ -30,13 +28,13 @@ function convertUTCtoISO8601(timestamp) {
/**
* Check whether timestamp predates request or is too old
* @param {string} timestamp of ISO8601Timestamp format without
* @param timestamp of ISO8601Timestamp format without
* dashes or colons, e.g. 20160202T220410Z
* @param {number} expiry - number of seconds signature should be valid
* @param {object} log - log for request
* @return {boolean} true if there is a time problem
* @param expiry - number of seconds signature should be valid
* @param log - log for request
* @return true if there is a time problem
*/
function checkTimeSkew(timestamp, expiry, log) {
export function checkTimeSkew(timestamp: string, expiry: number, log: Logger) {
const currentTime = Date.now();
const fifteenMinutes = (15 * 60 * 1000);
const parsedTimestamp = convertAmzTimeToMs(timestamp);
@ -56,5 +54,3 @@ function checkTimeSkew(timestamp, expiry, log) {
}
return false;
}
module.exports = { convertAmzTimeToMs, convertUTCtoISO8601, checkTimeSkew };
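Illustrative calls (import path illustrative):

import { convertAmzTimeToMs, convertUTCtoISO8601 } from './timeUtils';

convertAmzTimeToMs('20160202T220410Z');
// -> 1454450650000, i.e. Date.parse('2016-02-02T22:04:10.000Z')
convertUTCtoISO8601('Fri, 10 Feb 2012 21:34:55 GMT');
// -> '20120210T213455Z'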

View File

@ -1,17 +1,19 @@
'use strict'; // eslint-disable-line strict
const errors = require('../../../lib/errors');
import { Logger } from 'werelogs';
import errors from '../../../lib/errors';
/**
* Validate Credentials
* @param {array} credentials - contains accessKey, scopeDate,
* @param credentials - contains accessKey, scopeDate,
* region, service, requestType
* @param {string} timestamp - timestamp from request in
* @param timestamp - timestamp from request in
* the format of ISO 8601: YYYYMMDDTHHMMSSZ
* @param {object} log - logging object
* @return {boolean} true if credentials are correct format, false if not
* @param log - logging object
*/
function validateCredentials(credentials, timestamp, log) {
export function validateCredentials(
credentials: [string, string, string, string, string],
timestamp: string,
log: Logger
): Error | {} {
if (!Array.isArray(credentials) || credentials.length !== 5) {
log.warn('credentials in improper format', { credentials });
return errors.InvalidArgument;
@ -58,12 +60,21 @@ function validateCredentials(credentials, timestamp, log) {
/**
* Extract and validate components from query object
* @param {object} queryObj - query object from request
* @param {object} log - logging object
* @return {object} object containing extracted query params for authV4
* @param queryObj - query object from request
* @param log - logging object
* @return object containing extracted query params for authV4
*/
function extractQueryParams(queryObj, log) {
const authParams = {};
export function extractQueryParams(
queryObj: { [key: string]: string | undefined },
log: Logger
) {
const authParams: {
signedHeaders?: string;
signatureFromRequest?: string;
timestamp?: string;
expiry?: number;
credential?: [string, string, string, string, string];
} = {};
// Do not need the algorithm sent back
if (queryObj['X-Amz-Algorithm'] !== 'AWS4-HMAC-SHA256') {
@ -99,7 +110,7 @@ function extractQueryParams(queryObj, log) {
return authParams;
}
const expiry = Number.parseInt(queryObj['X-Amz-Expires'], 10);
const expiry = Number.parseInt(queryObj['X-Amz-Expires'] ?? 'nope', 10);
const sevenDays = 604800;
if (expiry && (expiry > 0 && expiry <= sevenDays)) {
authParams.expiry = expiry;
@ -110,6 +121,7 @@ function extractQueryParams(queryObj, log) {
const credential = queryObj['X-Amz-Credential'];
if (credential && credential.length > 28 && credential.indexOf('/') > -1) {
// @ts-ignore
authParams.credential = credential.split('/');
} else {
log.warn('invalid credential param', { credential });
@ -121,14 +133,17 @@ function extractQueryParams(queryObj, log) {
/**
* Extract and validate components from auth header
* @param {string} authHeader - authorization header from request
* @param {object} log - logging object
* @return {object} object containing extracted auth header items for authV4
* @param authHeader - authorization header from request
* @param log - logging object
* @return object containing extracted auth header items for authV4
*/
function extractAuthItems(authHeader, log) {
const authItems = {};
const authArray = authHeader
.replace('AWS4-HMAC-SHA256 ', '').split(',');
export function extractAuthItems(authHeader: string, log: Logger) {
const authItems: {
credentialsArr?: [string, string, string, string, string];
signedHeaders?: string;
signatureFromRequest?: string;
} = {};
const authArray = authHeader.replace('AWS4-HMAC-SHA256 ', '').split(',');
if (authArray.length < 3) {
return authItems;
@ -138,8 +153,12 @@ function extractAuthItems(authHeader, log) {
const signedHeadersStr = authArray[1];
const signatureStr = authArray[2];
log.trace('credentials from request', { credentialStr });
if (credentialStr && credentialStr.trim().startsWith('Credential=')
&& credentialStr.indexOf('/') > -1) {
if (
credentialStr &&
credentialStr.trim().startsWith('Credential=') &&
credentialStr.indexOf('/') > -1
) {
// @ts-ignore
authItems.credentialsArr = credentialStr
.trim().replace('Credential=', '').split('/');
} else {
@ -166,11 +185,11 @@ function extractAuthItems(authHeader, log) {
/**
* Checks whether the signed headers include the host header
* and all x-amz- and x-scal- headers in request
* @param {string} signedHeaders - signed headers sent with request
* @param {object} allHeaders - request.headers
* @return {boolean} true if all x-amz-headers included and false if not
* @param signedHeaders - signed headers sent with request
* @param allHeaders - request.headers
* @return true if all x-amz-headers included and false if not
*/
function areSignedHeadersComplete(signedHeaders, allHeaders) {
export function areSignedHeadersComplete(signedHeaders: string, allHeaders: Headers) {
const signedHeadersList = signedHeaders.split(';');
if (signedHeadersList.indexOf('host') === -1) {
return false;
@ -185,6 +204,3 @@ function areSignedHeadersComplete(signedHeaders, allHeaders) {
}
return true;
}
module.exports = { validateCredentials, extractQueryParams,
areSignedHeadersComplete, extractAuthItems };
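A hypothetical Authorization header and what extractAuthItems would roughly pull out of it (access key and signature are the well-known AWS documentation examples, not real credentials):

const exampleAuthHeader = 'AWS4-HMAC-SHA256 ' +
    'Credential=AKIAIOSFODNN7EXAMPLE/20160202/us-east-1/s3/aws4_request,' +
    'SignedHeaders=host;x-amz-date,' +
    'Signature=fe5f80f77d5fa3beca038a248ff027d0445342fe2855ddc963176630326f1024';
// extractAuthItems(exampleAuthHeader, log) yields roughly:
// {
//   credentialsArr: ['AKIAIOSFODNN7EXAMPLE', '20160202', 'us-east-1', 's3', 'aws4_request'],
//   signedHeaders: 'host;x-amz-date',
//   signatureFromRequest: 'fe5f80f7...6f1024',
// }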

View File

@ -1,151 +0,0 @@
'use strict'; // eslint-disable-line strict
const crypto = require('crypto');
// The min value here is to manage further backward compat if we
// need it
// Default value
const vaultGeneratedIamSecurityTokenSizeMin = 128;
// Safe to assume that a typical token size is less than 8192 bytes
const vaultGeneratedIamSecurityTokenSizeMax = 8192;
// Base-64
const vaultGeneratedIamSecurityTokenPattern = /^[A-Za-z0-9/+=]*$/;
module.exports = {
// info about the iam security token
iamSecurityToken: {
min: vaultGeneratedIamSecurityTokenSizeMin,
max: vaultGeneratedIamSecurityTokenSizeMax,
pattern: vaultGeneratedIamSecurityTokenPattern,
},
// PublicId is used as the canonicalID for a request that contains
// no authentication information. Requestor can access
// only public resources
publicId: 'http://acs.amazonaws.com/groups/global/AllUsers',
zenkoServiceAccount: 'http://acs.zenko.io/accounts/service',
metadataFileNamespace: '/MDFile',
dataFileURL: '/DataFile',
passthroughFileURL: '/PassthroughFile',
// AWS states max size for user-defined metadata
// (x-amz-meta- headers) is 2 KB:
// http://docs.aws.amazon.com/AmazonS3/latest/API/RESTObjectPUT.html
// In testing, AWS seems to allow up to 88 more bytes,
// so we do the same.
maximumMetaHeadersSize: 2136,
emptyFileMd5: 'd41d8cd98f00b204e9800998ecf8427e',
// Version 2 changes the format of the data location property
// Version 3 adds the dataStoreName attribute
// Version 4 adds the Creation-Time and Content-Language attributes,
// and adds support for x-ms-meta-* headers in UserMetadata
// Version 5 adds the azureInfo structure
mdModelVersion: 5,
/*
* Splitter is used to build the object name for the overview of a
* multipart upload and to build the object names for each part of a
* multipart upload. These objects with large names are then stored in
* metadata in a "shadow bucket" to a real bucket. The shadow bucket
* contains all ongoing multipart uploads. We include in the object
* name some of the info we might need to pull about an open multipart
* upload or about an individual part with each piece of info separated
* by the splitter. We can then extract each piece of info by splitting
* the object name string with this splitter.
* For instance, assuming a splitter of '...!*!',
* the name of the upload overview would be:
* overview...!*!objectKey...!*!uploadId
* For instance, the name of a part would be:
* uploadId...!*!partNumber
*
* The sequence of characters used in the splitter should not occur
* elsewhere in the pieces of info to avoid splitting where not
* intended.
*
* Splitter is also used in adding bucketnames to the
* namespaceusersbucket. The object names added to the
* namespaceusersbucket are of the form:
* canonicalID...!*!bucketname
*/
splitter: '..|..',
usersBucket: 'users..bucket',
// MPU Bucket Prefix is used to create the name of the shadow
// bucket used for multipart uploads. There is one shadow mpu
// bucket per bucket and its name is the mpuBucketPrefix followed
// by the name of the final destination bucket for the object
// once the multipart upload is complete.
mpuBucketPrefix: 'mpuShadowBucket',
// since aws s3 does not allow capitalized buckets, these may be
// used for special internal purposes
permittedCapitalizedBuckets: {
METADATA: true,
},
// Setting a lower object key limit to account for:
// - Mongo key limit of 1012 bytes
// - Version ID in Mongo Key if versioned of 33
// - Max bucket name length if bucket match false of 63
// - Extra prefix slash for bucket prefix if bucket match of 1
objectKeyByteLimit: 915,
/* delimiter for location-constraint. The location constraint will be able
* to include the ingestion flag
*/
zenkoSeparator: ':',
/* eslint-disable camelcase */
externalBackends: { aws_s3: true, azure: true, gcp: true, pfs: true },
replicationBackends: { aws_s3: true, azure: true, gcp: true },
// hex digest of sha256 hash of empty string:
emptyStringHash: crypto.createHash('sha256')
.update('', 'binary').digest('hex'),
mpuMDStoredExternallyBackend: { aws_s3: true, gcp: true },
// AWS sets a minimum size limit for parts except for the last part.
// http://docs.aws.amazon.com/AmazonS3/latest/API/mpUploadComplete.html
minimumAllowedPartSize: 5242880,
gcpMaximumAllowedPartCount: 1024,
// GCP Object Tagging Prefix
gcpTaggingPrefix: 'aws-tag-',
productName: 'APN/1.0 Scality/1.0 Scality CloudServer for Zenko',
legacyLocations: ['sproxyd', 'legacy'],
// healthcheck default call from nginx is every 2 seconds
// for external backends, don't call unless at least 1 minute
// (60,000 milliseconds) since last call
externalBackendHealthCheckInterval: 60000,
// some of the available data backends (if called directly rather
// than through the multiple backend gateway) need a key provided
// as a string as first parameter of the get/delete methods.
clientsRequireStringKey: { sproxyd: true, cdmi: true },
hasCopyPartBackends: { aws_s3: true, gcp: true },
versioningNotImplBackends: { azure: true, gcp: true },
// user metadata applied on zenko-created objects
zenkoIDHeader: 'x-amz-meta-zenko-instance-id',
// Default expiration value of the S3 pre-signed URL duration
// 604800 seconds (seven days).
defaultPreSignedURLExpiry: 7 * 24 * 60 * 60,
// Regex for ISO-8601 formatted date
shortIso8601Regex: /\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}Z/,
longIso8601Regex: /\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}.\d{3}Z/,
supportedNotificationEvents: new Set([
's3:ObjectCreated:*',
's3:ObjectCreated:Put',
's3:ObjectCreated:Copy',
's3:ObjectCreated:CompleteMultipartUpload',
's3:ObjectRemoved:*',
's3:ObjectRemoved:Delete',
's3:ObjectRemoved:DeleteMarkerCreated',
]),
notificationArnPrefix: 'arn:scality:bucketnotif',
// HTTP server keep-alive timeout is set to a higher value than
// client's free sockets timeout to avoid the risk of triggering
// ECONNRESET errors if the server closes the connection at the
// exact moment clients attempt to reuse an established connection
// for a new request.
//
// Note: the ability to close inactive connections on the client
// after httpClientFreeSocketsTimeout milliseconds requires the
// use of "agentkeepalive" module instead of the regular node.js
// http.Agent.
httpServerKeepAliveTimeout: 60000,
httpClientFreeSocketTimeout: 55000,
supportedLifecycleRules: [
'expiration',
'noncurrentVersionExpiration',
'abortIncompleteMultipartUpload',
],
};

lib/constants.ts Normal file (149 lines)
View File

@ -0,0 +1,149 @@
import * as crypto from 'crypto';
// The min value here is to manage further backward compat if we
// need it
// Default value
const vaultGeneratedIamSecurityTokenSizeMin = 128;
// Safe to assume that a typical token size is less than 8192 bytes
const vaultGeneratedIamSecurityTokenSizeMax = 8192;
// Base-64
const vaultGeneratedIamSecurityTokenPattern = /^[A-Za-z0-9/+=]*$/;
// info about the iam security token
export const iamSecurityToken = {
min: vaultGeneratedIamSecurityTokenSizeMin,
max: vaultGeneratedIamSecurityTokenSizeMax,
pattern: vaultGeneratedIamSecurityTokenPattern,
};
// PublicId is used as the canonicalID for a request that contains
// no authentication information. Requestor can access
// only public resources
export const publicId = 'http://acs.amazonaws.com/groups/global/AllUsers';
export const zenkoServiceAccount = 'http://acs.zenko.io/accounts/service';
export const metadataFileNamespace = '/MDFile';
export const dataFileURL = '/DataFile';
export const passthroughFileURL = '/PassthroughFile';
// AWS states max size for user-defined metadata
// (x-amz-meta- headers) is 2 KB:
// http://docs.aws.amazon.com/AmazonS3/latest/API/RESTObjectPUT.html
// In testing, AWS seems to allow up to 88 more bytes,
// so we do the same.
export const maximumMetaHeadersSize = 2136;
export const emptyFileMd5 = 'd41d8cd98f00b204e9800998ecf8427e';
// Version 2 changes the format of the data location property
// Version 3 adds the dataStoreName attribute
// Version 4 adds the Creation-Time and Content-Language attributes,
// and adds support for x-ms-meta-* headers in UserMetadata
// Version 5 adds the azureInfo structure
export const mdModelVersion = 5;
/*
* Splitter is used to build the object name for the overview of a
* multipart upload and to build the object names for each part of a
* multipart upload. These objects with large names are then stored in
* metadata in a "shadow bucket" to a real bucket. The shadow bucket
* contains all ongoing multipart uploads. We include in the object
* name some of the info we might need to pull about an open multipart
* upload or about an individual part with each piece of info separated
* by the splitter. We can then extract each piece of info by splitting
* the object name string with this splitter.
* For instance, assuming a splitter of '...!*!',
* the name of the upload overview would be:
* overview...!*!objectKey...!*!uploadId
* For instance, the name of a part would be:
* uploadId...!*!partNumber
*
* The sequence of characters used in the splitter should not occur
* elsewhere in the pieces of info to avoid splitting where not
* intended.
*
* Splitter is also used in adding bucketnames to the
* namespaceusersbucket. The object names added to the
* namespaceusersbucket are of the form:
* canonicalID...!*!bucketname
*/
export const splitter = '..|..';
export const usersBucket = 'users..bucket';
// MPU Bucket Prefix is used to create the name of the shadow
// bucket used for multipart uploads. There is one shadow mpu
// bucket per bucket and its name is the mpuBucketPrefix followed
// by the name of the final destination bucket for the object
// once the multipart upload is complete.
export const mpuBucketPrefix = 'mpuShadowBucket';
// since aws s3 does not allow capitalized buckets, these may be
// used for special internal purposes
export const permittedCapitalizedBuckets = {
METADATA: true,
};
// Setting a lower object key limit to account for:
// - Mongo key limit of 1012 bytes
// - Version ID in Mongo Key if versioned of 33
// - Max bucket name length if bucket match false of 63
// - Extra prefix slash for bucket prefix if bucket match of 1
export const objectKeyByteLimit = 915;
/* delimiter for location-constraint. The location constraint will be able
* to include the ingestion flag
*/
export const zenkoSeparator = ':';
/* eslint-disable camelcase */
export const externalBackends = { aws_s3: true, azure: true, gcp: true, pfs: true };
export const replicationBackends = { aws_s3: true, azure: true, gcp: true };
// hex digest of sha256 hash of empty string:
export const emptyStringHash = crypto.createHash('sha256')
.update('', 'binary').digest('hex');
export const mpuMDStoredExternallyBackend = { aws_s3: true, gcp: true };
// AWS sets a minimum size limit for parts except for the last part.
// http://docs.aws.amazon.com/AmazonS3/latest/API/mpUploadComplete.html
export const minimumAllowedPartSize = 5242880;
export const gcpMaximumAllowedPartCount = 1024;
// GCP Object Tagging Prefix
export const gcpTaggingPrefix = 'aws-tag-';
export const productName = 'APN/1.0 Scality/1.0 Scality CloudServer for Zenko';
export const legacyLocations = ['sproxyd', 'legacy'];
// healthcheck default call from nginx is every 2 seconds
// for external backends, don't call unless at least 1 minute
// (60,000 milliseconds) since last call
export const externalBackendHealthCheckInterval = 60000;
// some of the available data backends (if called directly rather
// than through the multiple backend gateway) need a key provided
// as a string as first parameter of the get/delete methods.
export const clientsRequireStringKey = { sproxyd: true, cdmi: true };
export const hasCopyPartBackends = { aws_s3: true, gcp: true };
export const versioningNotImplBackends = { azure: true, gcp: true };
// user metadata applied on zenko-created objects
export const zenkoIDHeader = 'x-amz-meta-zenko-instance-id';
// Default expiration value of the S3 pre-signed URL duration
// 604800 seconds (seven days).
export const defaultPreSignedURLExpiry = 7 * 24 * 60 * 60;
// Regex for ISO-8601 formatted date
export const shortIso8601Regex = /\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}Z/;
export const longIso8601Regex = /\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}.\d{3}Z/;
export const supportedNotificationEvents = new Set([
's3:ObjectCreated:*',
's3:ObjectCreated:Put',
's3:ObjectCreated:Copy',
's3:ObjectCreated:CompleteMultipartUpload',
's3:ObjectRemoved:*',
's3:ObjectRemoved:Delete',
's3:ObjectRemoved:DeleteMarkerCreated',
]);
export const notificationArnPrefix = 'arn:scality:bucketnotif';
// HTTP server keep-alive timeout is set to a higher value than
// client's free sockets timeout to avoid the risk of triggering
// ECONNRESET errors if the server closes the connection at the
// exact moment clients attempt to reuse an established connection
// for a new request.
//
// Note: the ability to close inactive connections on the client
// after httpClientFreeSocketsTimeout milliseconds requires the
// use of "agentkeepalive" module instead of the regular node.js
// http.Agent.
export const httpServerKeepAliveTimeout = 60000;
export const httpClientFreeSocketTimeout = 55000;
export const supportedLifecycleRules = [
'expiration',
'noncurrentVersionExpiration',
'abortIncompleteMultipartUpload',
];
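A short illustration of the splitter convention described in the comment above (object key and upload id hypothetical; import path illustrative):

import { splitter } from './constants';

const overviewKey = ['overview', 'photos/puppy.jpg', 'uploadId-1234'].join(splitter);
// -> 'overview..|..photos/puppy.jpg..|..uploadId-1234'
const [tag, objectKey, uploadId] = overviewKey.split(splitter);
// -> ['overview', 'photos/puppy.jpg', 'uploadId-1234']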

View File

@ -1,87 +0,0 @@
'use strict'; // eslint-disable-line strict
/**
* ArsenalError
*
* @extends {Error}
*/
class ArsenalError extends Error {
/**
* constructor.
*
* @param {string} type - Type of error or message
* @param {number} code - HTTP status code
* @param {string} desc - Verbose description of error
*/
constructor(type, code, desc) {
super(type);
/**
* HTTP status code of error
* @type {number}
*/
this.code = code;
/**
* Description of error
* @type {string}
*/
this.description = desc;
this[type] = true;
}
/**
* Output the error as a JSON string
* @returns {string} Error as JSON string
*/
toString() {
return JSON.stringify({
errorType: this.message,
errorMessage: this.description,
});
}
/**
* Write the error in an HTTP response
*
* @param { http.ServerResponse } res - Response we are responding to
* @returns {undefined}
*/
writeResponse(res) {
res.writeHead(this.code);
res.end(this.toString());
}
/**
* customizeDescription returns a new ArsenalError with a new description
* with the same HTTP code and message.
*
* @param {string} description - New error description
* @returns {ArsenalError} New error
*/
customizeDescription(description) {
return new ArsenalError(this.message, this.code, description);
}
}
/**
* Generate an Errors instances object.
*
* @returns {Object.<string, ArsenalError>} - object field by arsenalError
* instances
*/
function errorsGen() {
const errors = {};
const errorsObj = require('../errors/arsenalErrors.json');
Object.keys(errorsObj)
.filter(index => index !== '_comment')
.forEach(index => {
errors[index] = new ArsenalError(index, errorsObj[index].code,
errorsObj[index].description);
});
return errors;
}
module.exports = errorsGen();

lib/errors/arsenalErrors.ts Normal file (1033 lines)

File diff suppressed because it is too large.

lib/errors/index.ts Normal file (99 lines)
View File

@ -0,0 +1,99 @@
import type { ServerResponse } from 'http';
import * as rawErrors from './arsenalErrors';
/** All possible error names. */
export type Name = keyof typeof rawErrors;
/** Object containing all error names. It has the format { [Name]: "Name" } */
export type Names = { [Name_ in Name]: Name_ };
/** Mapping used to determine an error type. It has the format { [Name]: boolean } */
export type Is = { [_ in Name]: boolean };
/** Mapping of all possible Errors. It has the format { [Name]: Error } */
export type Errors = { [_ in Name]: ArsenalError };
// This contains some metaprog. Be careful.
// Proxy can be found on MDN.
// https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Proxy
// While it could seem better to avoid metaprogramming, this allows us to enforce
// type-checking properly while avoiding errors that could happen at runtime.
const createIs = (type: Name) => {
const get = (_: {}, value: string | symbol) => type === value;
return new Proxy({}, { get }) as Is;
};
export class ArsenalError extends Error {
/** HTTP status code. Example: 401, 403, 500, ... */
#code: number;
/** Text description of the error. */
#description: string;
/** Type of the error. */
#type: Name;
/** Object used to determine the error type.
* Example: error.is.InternalError */
#is: Is;
private constructor(type: Name, code: number, description: string) {
super(type);
this.#code = code;
this.#description = description;
this.#type = type;
this.#is = createIs(type);
}
/** Output the error as a JSON string */
toString() {
const errorType = this.message;
const errorMessage = this.#description;
return JSON.stringify({ errorType, errorMessage });
}
/** Write the error in an HTTP response */
writeResponse(res: ServerResponse) {
res.writeHead(this.#code);
const asStr = this.toString();
res.end(asStr);
}
/** Clone the error with a new description.*/
customizeDescription(description: string): ArsenalError {
const type = this.#type;
const code = this.#code;
return new ArsenalError(type, code, description);
}
/** Used to determine the error type. Example: error.is.InternalError */
get is() {
return this.#is;
}
/** HTTP status code. Example: 401, 403, 500, ... */
get code() {
return this.#code;
}
/** Text description of the error. */
get description() {
return this.#description;
}
/**
* Type of the error, belonging to Name. is should be preferred over
* type in day-to-day use, but type remains accessible for future use. */
get type() {
return this.#type;
}
/** Generate all possible errors. An instance is created by default. */
static errors() {
return Object.entries(rawErrors).reduce((acc, value) => {
const name = value[0] as Name;
const error = value[1];
const { code, description } = error;
const err = new ArsenalError(name, code, description);
return { ...acc, [name]: err };
}, {} as Errors);
}
}
/** Mapping of all possible Errors.
* Use them with errors[error].customizeDescription for any customization. */
export default ArsenalError.errors();
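Putting the Proxy-backed is to work, a consumer (import path illustrative) can write:

import errors, { ArsenalError } from '../errors';

errors.NoSuchKey.is.NoSuchKey;     // true, answered by the Proxy above
errors.NoSuchKey.is.InternalError; // false
errors.NoSuchKey.code;             // 404
const custom = errors.NoSuchKey.customizeDescription('no such upload');
custom instanceof ArsenalError;    // true; type and code are preserved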

View File

@ -1,4 +1,4 @@
const errors = require('../errors');
const errors = require('../errors').default;
const validServices = {
aws: ['s3', 'iam', 'sts', 'ring'],
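This and the long run of one-line changes below stem from the interop between the new ES default export of lib/errors and its not-yet-migrated CommonJS consumers; roughly:

// lib/errors/index.ts now ends with:  export default ArsenalError.errors();
// so a CommonJS file must unwrap the default export explicitly:
const errors = require('../errors').default;
// while a TypeScript/ESM file gets it for free:
// import errors from '../errors';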

View File

@ -1,6 +1,6 @@
const assert = require('assert');
const errors = require('../errors');
const errors = require('../errors').default;
const { validateResourcePolicy } = require('../policy/policyValidator');
/**

View File

@ -1,7 +1,7 @@
const assert = require('assert');
const UUID = require('uuid');
const errors = require('../errors');
const errors = require('../errors').default;
const LifecycleRule = require('./LifecycleRule');
const escapeForXml = require('../s3middleware/escapeForXml');

View File

@ -5,7 +5,7 @@ const {
supportedNotificationEvents,
notificationArnPrefix,
} = require('../constants');
const errors = require('../errors');
const errors = require('../errors').default;
/**
* Format of xml request:

View File

@ -1,6 +1,6 @@
const assert = require('assert');
const errors = require('../errors');
const errors = require('../errors').default;
/**
* Format of xml request:

View File

@ -2,7 +2,7 @@ const assert = require('assert');
const UUID = require('uuid');
const escapeForXml = require('../s3middleware/escapeForXml');
const errors = require('../errors');
const errors = require('../errors').default;
const { isValidBucketName } = require('../s3routes/routesUtils');
const MAX_RULES = 1000;

View File

@ -5,7 +5,7 @@ const https = require('https');
const assert = require('assert');
const dhparam = require('../../https/dh2048').dhparam;
const ciphers = require('../../https/ciphers').ciphers;
const errors = require('../../errors');
const errors = require('../../errors').default;
const { checkSupportIPv6 } = require('./utils');

View File

@ -1,7 +1,7 @@
'use strict'; // eslint-disable-line
const os = require('os');
const errors = require('../../errors');
const errors = require('../../errors').default;
/**
* Parse the Range header into an object

View File

@ -3,7 +3,7 @@
const async = require('async');
const errors = require('../../errors');
const errors = require('../../errors').default;
const TTLVCodec = require('./codec/ttlv.js');
const TlsTransport = require('./transport/tls.js');
const KMIP = require('.');

View File

@ -1,6 +1,6 @@
const httpServer = require('../http/server');
const werelogs = require('werelogs');
const errors = require('../../errors');
const errors = require('../../errors').default;
const ZenkoMetrics = require('../../metrics/ZenkoMetrics');
const { sendSuccess, sendError } = require('./Utils');

View File

@ -1,6 +1,6 @@
const httpServer = require('../http/server');
const werelogs = require('werelogs');
const errors = require('../../errors');
const errors = require('../../errors').default;
const DEFAULT_LIVE_ROUTE = '/_/live';
const DEFAULT_READY_ROUTE = '/_/ready';

View File

@ -6,7 +6,7 @@ const werelogs = require('werelogs');
const constants = require('../../constants');
const utils = require('./utils');
const errors = require('../../errors');
const errors = require('../../errors').default;
const HttpAgent = require('agentkeepalive');

View File

@ -9,7 +9,7 @@ const httpServer = require('../http/server');
const constants = require('../../constants');
const { parseURL } = require('./utils');
const httpUtils = require('../http/utils');
const errors = require('../../errors');
const errors = require('../../errors').default;
function setContentLength(response, contentLength) {
response.setHeader('Content-Length', contentLength.toString());

View File

@ -1,6 +1,6 @@
'use strict'; // eslint-disable-line
const errors = require('../../errors');
const errors = require('../../errors').default;
const constants = require('../../constants');
const url = require('url');

View File

@ -10,7 +10,7 @@ const EventEmitter = require('events').EventEmitter;
const flattenError = require('./utils').flattenError;
const reconstructError = require('./utils').reconstructError;
const errors = require('../../errors');
const errors = require('../../errors').default;
const jsutil = require('../../jsutil');
const DEFAULT_CALL_TIMEOUT_MS = 30000;

View File

@ -1,9 +1,9 @@
'use strict'; // eslint-disable-line strict
const Ajv = require('ajv');
const userPolicySchema = require('./userPolicySchema');
const resourcePolicySchema = require('./resourcePolicySchema');
const errors = require('../errors');
const userPolicySchema = require('./userPolicySchema.json');
const resourcePolicySchema = require('./resourcePolicySchema.json');
const errors = require('../errors').default;
const ajValidate = new Ajv({ allErrors: true });
ajValidate.addMetaSchema(require('ajv/lib/refs/json-schema-draft-06.json'));

View File

@ -6,7 +6,7 @@ const ResultsCollector = require('./ResultsCollector');
const SubStreamInterface = require('./SubStreamInterface');
const objectUtils = require('../objectUtils');
const MD5Sum = require('../MD5Sum');
const errors = require('../../errors');
const errors = require('../../errors').default;
const azureMpuUtils = {};

View File

@ -1,5 +1,5 @@
const { parseString } = require('xml2js');
const errors = require('../errors');
const errors = require('../errors').default;
/*
Format of the xml request:

View File

@ -1,6 +1,6 @@
const { parseString } = require('xml2js');
const errors = require('../errors');
const errors = require('../errors').default;
/*
Format of xml request:

View File

@ -1,4 +1,4 @@
const errors = require('../errors');
const errors = require('../errors').default;
const crypto = require('crypto');
const constants = require('../constants');

View File

@ -1,6 +1,6 @@
const { parseString } = require('xml2js');
const errors = require('../errors');
const errors = require('../errors').default;
const escapeForXml = require('./escapeForXml');
const errorInvalidArgument = errors.InvalidArgument

View File

@ -1,5 +1,5 @@
const constants = require('../constants');
const errors = require('../errors');
const errors = require('../errors').default;
const userMetadata = {};
/**

View File

@ -1,4 +1,4 @@
const errors = require('../errors');
const errors = require('../errors').default;
function _matchesETag(item, contentMD5) {
return (item === contentMD5 || item === '*' || item === `"${contentMD5}"`);

View File

@ -1,6 +1,6 @@
const assert = require('assert');
const errors = require('../errors');
const errors = require('../errors').default;
const routeGET = require('./routes/routeGET');
const routePUT = require('./routes/routePUT');
const routeDELETE = require('./routes/routeDELETE');

View File

@ -1,5 +1,5 @@
const routesUtils = require('../routesUtils');
const errors = require('../../errors');
const errors = require('../../errors').default;
function routeDELETE(request, response, api, log, statsClient) {
log.debug('routing request', { method: 'routeDELETE' });
@ -82,7 +82,7 @@ function routeDELETE(request, response, api, log, statsClient) {
* the object, the errors NoSuchKey and NoSuchVersion should not
* be sent back as a response.
*/
if (err && !err.NoSuchKey && !err.NoSuchVersion) {
if (err && !err.is.NoSuchKey && !err.is.NoSuchVersion) {
return routesUtils.responseNoBody(err, corsHeaders,
response, null, log);
}

View File

@ -1,4 +1,4 @@
const errors = require('../../errors');
const errors = require('../../errors').default;
const routesUtils = require('../routesUtils');
function routerGET(request, response, api, log, statsClient,

View File

@ -1,4 +1,4 @@
const errors = require('../../errors');
const errors = require('../../errors').default;
const routesUtils = require('../routesUtils');
function routeHEAD(request, response, api, log, statsClient) {

View File

@ -1,4 +1,4 @@
const errors = require('../../errors');
const errors = require('../../errors').default;
const routesUtils = require('../routesUtils');
function routeOPTIONS(request, response, api, log, statsClient) {

View File

@@ -1,4 +1,4 @@
-const errors = require('../../errors');
+const errors = require('../../errors').default;
 const routesUtils = require('../routesUtils');
 /* eslint-disable no-param-reassign */

View File

@@ -1,4 +1,4 @@
-const errors = require('../../errors');
+const errors = require('../../errors').default;
 const routesUtils = require('../routesUtils');
 /* eslint-disable no-param-reassign */

View File

@@ -1,4 +1,4 @@
-const errors = require('../../errors');
+const errors = require('../../errors').default;
 const routesUtils = require('../routesUtils');
 function routerWebsite(request, response, api, log, statsClient,

View File

@@ -1,6 +1,6 @@
 const url = require('url');
 const ipCheck = require('../ipCheck');
-const errors = require('../errors');
+const errors = require('../errors').default;
 const constants = require('../constants');
 const { eachSeries } = require('async');

lib/simple-glob.d.ts (new vendored file, +8 lines)
View File

@@ -0,0 +1,8 @@
+// This module declares the interface for simple-glob.
+// simple-glob should probably be discarded in favor of node-glob:
+// node-glob is an up-to-date glob implementation, supports both sync and
+// async usage, and is well maintained by the community.
+// node-glob is performance-oriented and a little lighter than simple-glob.
+declare module 'simple-glob' {
+    export default function (pattern: string | string[]): string[];
+}
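With this ambient declaration in place, TypeScript call sites can import the otherwise untyped package and get a checked signature. A minimal usage sketch, with a hypothetical glob pattern:

    import glob from 'simple-glob';

    // Returns the paths matching the given pattern(s), synchronously.
    const files: string[] = glob('conf/*.json');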

View File

@@ -2,7 +2,7 @@ const async = require('async');
 const PassThrough = require('stream').PassThrough;
 const assert = require('assert');
-const errors = require('../../errors');
+const errors = require('../../errors').default;
 const MD5Sum = require('../../s3middleware/MD5Sum');
 const NullStream = require('../../s3middleware/nullStream');
 const RelayMD5Sum = require('./utils/RelayMD5Sum');

View File

@@ -1,6 +1,6 @@
 const async = require('async');
-const errors = require('../../errors');
+const errors = require('../../errors').default;
 const { parseTagFromQuery } = require('../../s3middleware/tagging');
 const { externalBackendHealthCheckInterval } = require('../../constants');
 const DataFileBackend = require('./file/DataFileInterface');

View File

@@ -1,7 +1,7 @@
 const AWS = require('aws-sdk');
 const werelogs = require('werelogs');
-const errors = require('../../../errors');
+const errors = require('../../../errors').default;
 const MD5Sum = require('../../../s3middleware/MD5Sum');
 const getMetaHeaders =
     require('../../../s3middleware/userMetadata').getMetaHeaders;

View File

@@ -1,7 +1,7 @@
 const url = require('url');
 const azure = require('azure-storage');
-const errors = require('../../../errors');
+const errors = require('../../../errors').default;
 const azureMpuUtils = require('../../../s3middleware/azureHelpers/mpuUtils');
 const { validateAndFilterMpuParts } =
     require('../../../s3middleware/processMpuParts');

View File

@@ -1,4 +1,4 @@
-const errors = require('../../../../../errors');
+const errors = require('../../../../../errors').default;
 const MpuHelper = require('./mpuHelper');
 const { createMpuKey, logger } = require('../GcpUtils');
 const { logHelper } = require('../../utils');

View File

@@ -1,5 +1,5 @@
 const async = require('async');
-const errors = require('../../../../../errors');
+const errors = require('../../../../../errors').default;
 const MpuHelper = require('./mpuHelper');
 const { createMpuKey, logger } = require('../GcpUtils');
 const { logHelper } = require('../../utils');

View File

@@ -1,5 +1,5 @@
 const uuid = require('uuid/v4');
-const errors = require('../../../../../errors');
+const errors = require('../../../../../errors').default;
 const { createMpuKey, logger, getPutTagsMetadata } = require('../GcpUtils');
 const { logHelper } = require('../../utils');

View File

@@ -1,4 +1,4 @@
-const errors = require('../../../../../errors');
+const errors = require('../../../../../errors').default;
 const { createMpuKey, logger } = require('../GcpUtils');
 const { logHelper } = require('../../utils');

View File

@@ -1,5 +1,5 @@
 const async = require('async');
-const errors = require('../../../../../errors');
+const errors = require('../../../../../errors').default;
 const { processTagSet } = require('../GcpUtils');

View File

@@ -1,4 +1,4 @@
-const errors = require('../../../../../errors');
+const errors = require('../../../../../errors').default;
 const { getPartNumber, createMpuKey, logger } = require('../GcpUtils');
 const { logHelper } = require('../../utils');

View File

@@ -1,4 +1,4 @@
-const errors = require('../../../../../errors');
+const errors = require('../../../../../errors').default;
 const { getPartNumber, createMpuKey, logger } = require('../GcpUtils');
 const { logHelper } = require('../../utils');

View File

@@ -1,7 +1,7 @@
 const async = require('async');
 const assert = require('assert');
 const stream = require('stream');
-const errors = require('../../../../errors');
+const errors = require('../../../../errors').default;
 const { minimumAllowedPartSize, gcpMaximumAllowedPartCount } =
     require('../../../../constants');
 const { createMpuList, logger } = require('./GcpUtils');

View File

@@ -1,7 +1,7 @@
 const AWS = require('aws-sdk');
 const Service = AWS.Service;
-const errors = require('../../../../errors');
+const errors = require('../../../../errors').default;
 const GcpApis = require('./GcpApis');
 const GcpServiceSetup = require('./GcpServiceSetup');
 const GcpManagedUpload = require('./GcpManagedUpload');

View File

@@ -1,6 +1,6 @@
 const werelogs = require('werelogs');
-const errors = require('../../../../errors');
+const errors = require('../../../../errors').default;
 const { parseTagFromQuery } = require('../../../../s3middleware/tagging');
 const { gcpTaggingPrefix } = require('../../../../constants');

View File

@@ -1,6 +1,6 @@
 const async = require('async');
-const errors = require('../../../errors');
+const errors = require('../../../errors').default;
 const MD5Sum = require('../../../s3middleware/MD5Sum');
 const { GCP, GcpUtils } = require('./GCP');
 const { createMpuKey } = GcpUtils;

View File

@@ -1,4 +1,4 @@
-const errors = require('../../../errors');
+const errors = require('../../../errors').default;
 const { createLogger, logHelper } = require('./utils');
 const RESTClient = require('../../../network/rest/RESTClient');

View File

@@ -7,7 +7,7 @@ const async = require('async');
 const diskusage = require('diskusage');
 const werelogs = require('werelogs');
-const errors = require('../../../errors');
+const errors = require('../../../errors').default;
 const stringHash = require('../../../stringHash');
 const jsutil = require('../../../jsutil');
 const storageUtils = require('../../utils');

View File

@@ -1,7 +1,7 @@
 const stream = require('stream');
 const werelogs = require('werelogs');
-const errors = require('../../../errors');
+const errors = require('../../../errors').default;
 const logger = new werelogs.Logger('MemDataBackend');

View File

@@ -1,4 +1,4 @@
-const errors = require('../../errors');
+const errors = require('../../errors').default;
 const BucketInfo = require('../../models/BucketInfo');

Some files were not shown because too many files have changed in this diff.