Compare commits


229 Commits

Author SHA1 Message Date
bert-e 4525e01d95 Merge branch 'dependabot/npm_and_yarn/development/7.4/socket.io-4.4.1' into tmp/octopus/w/7.10/dependabot/npm_and_yarn/development/7.4/socket.io-4.4.1 2022-01-20 00:27:10 +00:00
dependabot[bot] 10c9c62178 build(deps): bump socket.io from 2.3.0 to 4.4.1
Bumps [socket.io](https://github.com/socketio/socket.io) from 2.3.0 to 4.4.1.
- [Release notes](https://github.com/socketio/socket.io/releases)
- [Changelog](https://github.com/socketio/socket.io/blob/master/CHANGELOG.md)
- [Commits](https://github.com/socketio/socket.io/compare/2.3.0...4.4.1)

---
updated-dependencies:
- dependency-name: socket.io
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-01-20 00:27:05 +00:00
bert-e e7869d832e Merge branches 'w/7.10/improvement/ARSN-21-Upgrade-Node-to-16' and 'q/1649/7.4/improvement/ARSN-21-Upgrade-Node-to-16' into tmp/octopus/q/7.10 2022-01-20 00:09:24 +00:00
bert-e 0c17c748fe Merge branches 'w/7.10/bugfix/ARSN-35/add-http-header-too-large-error' and 'q/1611/7.4/bugfix/ARSN-35/add-http-header-too-large-error' into tmp/octopus/q/7.10 2022-01-19 00:48:16 +00:00
bert-e 9c185007a2 Merge branch 'bugfix/ARSN-35/add-http-header-too-large-error' into tmp/octopus/w/7.10/bugfix/ARSN-35/add-http-header-too-large-error 2022-01-18 17:43:37 +00:00
Taylor McKinnon d7a4bef3b3 Merge remote-tracking branch 'origin/improvement/ARSN-46/add_isAborted_flag' into w/7.10/improvement/ARSN-46/add_isAborted_flag 2022-01-13 13:53:41 -08:00
Ronnie Smith 79699324d9 Merge remote-tracking branch 'origin/improvement/ARSN-21-Upgrade-Node-to-16' into w/7.10/improvement/ARSN-21-Upgrade-Node-to-16 2022-01-11 14:26:12 -08:00
Alexander Chan 8aa0f9d030 ARSN-33: add s3 lifecycle helpers 2021-11-19 18:01:05 -08:00
Jonathan Gramain 3b0ea3d7a1 Merge remote-tracking branch 'origin/improvement/ARSN-42-addNullUploadIdField' into w/7.10/improvement/ARSN-42-addNullUploadIdField 2021-11-18 18:24:33 -08:00
Jonathan Gramain 8c2db870c7 Merge remote-tracking branch 'origin/feature/ARSN-38-replayPrefixHiddenInListings' into w/7.10/feature/ARSN-38-replayPrefixHiddenInListings 2021-11-04 15:22:29 -07:00
bert-e 67e5cc770d Merge branch 'feature/ARSN-37-addUploadId' into tmp/octopus/w/7.10/feature/ARSN-37-addUploadId 2021-11-02 00:28:00 +00:00
Rahul Padigela 07a110ff86 chore: update version 2021-10-26 14:52:35 -07:00
Rahul Padigela c696f9a38b Merge remote-tracking branch 'origin/improvement/ARSN-31-update-version' into w/7.10/improvement/ARSN-31-update-version 2021-10-26 14:52:19 -07:00
bert-e c0825231e9 Merge branch 'bugfix/ARSN-31-invalid-query-params' into tmp/octopus/w/7.10/bugfix/ARSN-31-invalid-query-params 2021-10-26 00:27:15 +00:00
Thomas Carmet e52330b935 Merge branch 'feature/ARSN-20-migrate-github-actions' into w/7.10/feature/ARSN-20-migrate-github-actions 2021-09-23 11:37:29 -07:00
Thomas Carmet ce7bba1f8d ARSN-17 fixup version mistake for dev/7.10 2021-08-31 10:44:52 -07:00
Thomas Carmet 46338119b6 Merge remote-tracking branch 'origin/feature/ARSN-17-setup-package.json' into w/7.10/feature/ARSN-17-setup-package.json 2021-08-31 09:57:28 -07:00
bert-e cd50d46162 Merge branch 'feature/ARSN-12-bumpArsenalVersion-stabilization' into tmp/octopus/w/7.10/feature/ARSN-12-bumpArsenalVersion-stabilization 2021-08-26 21:48:36 +00:00
Jonathan Gramain 016107500f feature: ARSN-12 bump arsenal version
Needed to ensure proper dependency update in Vault

(cherry picked from commit c495ecacb0)
2021-08-26 14:47:18 -07:00
Jonathan Gramain 04ebaa8d8f Merge remote-tracking branch 'origin/feature/ARSN-12-bumpArsenalVersion' into w/7.10/feature/ARSN-12-bumpArsenalVersion 2021-08-26 14:24:27 -07:00
bert-e 3f702c29cd Merge branch 'feature/ARSN-12-condition-put-backport' into tmp/octopus/w/7.10/feature/ARSN-12-condition-put-backport 2021-08-25 21:07:37 +00:00
bert-e 7b4e65eaf1 Merge branch 'feature/ARSN-12-introduce-cond-put' into tmp/octopus/w/7.10/feature/ARSN-12-introduce-cond-put 2021-08-25 20:54:20 +00:00
anurag4DSB f101a0f3a0 feature: ARSN-12-introduce-cond-put-op 2021-08-25 22:50:23 +02:00
bert-e e0b95fe931 Merge branch 'w/7.10/feature/ARSN-11-bump-werelogs' into tmp/octopus/q/7.10 2021-08-13 17:56:09 +00:00
naren-scality db7d8b0b45 improvement: ARSN-13 expose isResourceApplicable for policy evaluation 2021-08-12 20:06:19 -07:00
bert-e 46d3a1e53c Merge branch 'feature/ARSN-11-bump-werelogs' into tmp/octopus/w/7.10/feature/ARSN-11-bump-werelogs 2021-08-12 17:06:27 +00:00
Jonathan Gramain 9aa8710a57 ARSN-9 KMIP deep healthcheck
Add a healthcheck() function in the KMIP client that creates a dummy
bucket key on the KMS, then deletes it, to ensure basic functionality
is working
2021-08-04 11:51:23 -07:00
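The ARSN-9 deep-healthcheck idea above can be sketched as below. This is a hedged illustration only: `client`, `createBucketKey`, and `destroyBucketKey` are illustrative stand-ins, not Arsenal's actual KMIP client API.

```javascript
// Create a throwaway bucket key on the KMS, then destroy it, so that both
// the create and destroy round-trips are exercised by the healthcheck.
function kmipDeepHealthcheck(client, logger, cb) {
    const dummyBucket = 'healthcheck-dummy-bucket';
    client.createBucketKey(dummyBucket, logger, (err, keyId) => {
        if (err) {
            return cb(err); // KMS unreachable or key creation failed
        }
        // Deleting the freshly created key proves basic functionality.
        return client.destroyBucketKey(keyId, logger, cb);
    });
}
```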
Ronnie Smith 735c6f2fb5 bugfix: ARSN-8 Remove response code and message from log
* The response has not been computed yet, so this always
returns 200, which is inaccurate and confusing
2021-08-02 19:02:44 -07:00
bert-e 942c6c2a1e Merge branch 'bugfix/ARSN-7_SkipHeadersOn304' into tmp/octopus/w/7.10/bugfix/ARSN-7_SkipHeadersOn304 2021-07-30 23:46:09 +00:00
bert-e 4a6b69247b Merge branch 'feature/ARSN-5/addBucketInfoUIDField' into q/7.10 2021-07-28 16:58:33 +00:00
Gregoire Doumergue 66a48f44da Revert "S3C-656: Remove the expect header hack"
This reverts commit 3e1d8c8ed7.
2021-07-28 14:50:00 +02:00
Gregoire Doumergue fa3ec78e25 Revert "ARSN-3: Remove the test for the old hack"
This reverts commit 8f4453862d.
2021-07-28 14:49:17 +02:00
Alexander Chan 112cee9118 ARSN-5: add BucketInfo field UID 2021-07-27 16:58:12 -07:00
Jonathan Gramain 6fdfbcb223 bugfix: ARSN-4 rework KMIP connection handling
Rework KMIP connection handling to catch all errors, including before
the connection is established, and return the error to each pending
command response.

In particular, setup the 'error' listener (also 'data' and 'end'
listeners) as soon as the TLS client socket is created instead of
waiting for the connection to be established to set the listeners.
2021-07-21 18:26:39 -07:00
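The fix described in this commit body can be sketched as follows, under stated assumptions: `createSocket` and `failPending` are hypothetical helpers, not Arsenal's real KMIP transport API. The point is the ordering — listeners go on the socket object the moment it exists.

```javascript
// Attach the 'error' (and 'end') listeners as soon as the client socket
// object is created, instead of waiting for the connection to be
// established, so pre-connect failures reach pending command responses
// rather than crashing the process with an unhandled 'error' event.
function openTransport(createSocket, failPending) {
    const socket = createSocket();
    // Attached immediately -- before any 'connect'/'secureConnect' event.
    socket.on('error', err => failPending(err));
    socket.on('end', () => failPending(new Error('connection ended')));
    return socket;
}
```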
Jonathan Gramain c41f1ca4b3 bugfix: [test] ARSN-4 reproduce issue in func tests
- Change existing KMIP transport test to trigger issue: Modify the
  EchoChannel socket mock to use standard EventEmitter, which triggers
  an exception when an error event is emitted.

- Add a new test for TLS transport that raises the same TLS connection
  exception as witnessed in the lab
2021-07-21 18:00:52 -07:00
Jonathan Gramain 888273bb2f improvement: S3C-4312 fix ObjectMDLocation.setDataLocation()
Fix ObjectMDLocation.setDataLocation() behavior when cryptoScheme and
cipheredDataKey location params are undefined: instead of setting the
attributes as undefined, remove the attributes.

The previous situation made some backbeat tests fail due to those
attributes existing, and it's cleaner this way.
2021-07-21 11:04:22 -07:00
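The behavior change described above can be illustrated with a minimal sketch (this is not Arsenal's real `ObjectMDLocation` class; the property names follow the commit message):

```javascript
// When the new location carries no encryption info, remove the attributes
// entirely instead of storing `undefined` values, so that consumers
// checking for the presence of the properties are not confused.
class LocationSketch {
    constructor() {
        this._data = {};
    }
    setDataLocation({ key, cryptoScheme, cipheredDataKey }) {
        this._data.key = key;
        if (cryptoScheme !== undefined) {
            this._data.cryptoScheme = cryptoScheme;
            this._data.cipheredDataKey = cipheredDataKey;
        } else {
            // Deleting keeps `'cryptoScheme' in loc._data` false, unlike
            // assigning undefined, which leaves the property present.
            delete this._data.cryptoScheme;
            delete this._data.cipheredDataKey;
        }
        return this;
    }
}
```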
Jonathan Gramain 1978405fb9 improvement: S3C-4312 backport + adapt ObjectMDLocation unit test
Backport and adapt to 7.x branch the ObjectMDLocation unit tests from
development/8.1 branch
2021-07-20 14:41:58 -07:00
Jonathan Gramain d019076854 improvement: S3C-4312 encryption info in ObjectMDLocation.setDataLocation()
Support setting encryption info in ObjectMDLocation with the method
setDataLocation(), used by backbeat to set the new target location
before writing metadata on the target.
2021-07-20 14:41:37 -07:00
Gregoire Doumergue 8f4453862d ARSN-3: Remove the test for the old hack 2021-07-12 16:19:27 +02:00
Gregoire Doumergue 3e1d8c8ed7 S3C-656: Remove the expect header hack 2021-07-12 15:13:21 +02:00
Rached Ben Mustapha a41d4db1c4 chore: bump version 2021-07-08 16:37:52 -07:00
Rached Ben Mustapha 00d9c9af0c bf: fix user arn validation with path 2021-07-08 16:37:52 -07:00
Rahul Padigela 7aafd05b74 bugfix: ARSN-1 conditionally check for content-md5 2021-07-06 16:17:33 -07:00
bert-e 5540afa194 Merge branch 'feature/S3C-4614/assumerole' into q/7.10 2021-06-29 21:40:05 +00:00
Rached Ben Mustapha 6b9e7fc11f chore: bump version 2021-06-29 20:11:44 +00:00
Nicolas Humbert 058455061d ft: S3C-4614 AssumeRole cross account with user as principal 2021-06-29 20:11:44 +00:00
vrancurel d1e4c8dbb9 ft: S3C-4552 remove duplicate test 2021-06-29 13:02:36 -07:00
bert-e e87198f7ba Merge branch 'feature/S3C-4552-tiny-version-ids' into q/7.10 2021-06-29 19:27:15 +00:00
vrancurel a7bfedfa2b ft: S3C-4552 tiny version IDs
Will be enabled on new buckets only.
2021-06-29 11:13:39 -07:00
bert-e 2794fe0636 Merge branch 'improvement/S3C-4110/backport' into q/7.10 2021-06-29 12:15:38 +00:00
Jonathan Gramain 6347358cc2 bugfix: S3C-3744 fix bucket encryption related actions
Changes made to match the AWS reference:
https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html

- change "bucketDeleteEncryption" action to "s3:PutEncryptionConfiguration"

- rename PUT and GET actions to PutEncryptionConfiguration and
  GetEncryptionConfiguration and add missing 's3:' prefix
2021-06-21 16:12:59 -07:00
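The renames listed in this commit body amount to a small action-map change; an illustrative fragment is sketched below. Only the mappings named in the commit message are shown, and the surrounding object shape is an assumption.

```javascript
// S3 route-to-IAM-action mappings after the S3C-3744 rename. Per the AWS
// reference there is no separate Delete action for bucket encryption, so
// the delete route maps onto s3:PutEncryptionConfiguration as well.
const actionMapBucketEncryption = {
    bucketGetEncryption: 's3:GetEncryptionConfiguration',
    bucketPutEncryption: 's3:PutEncryptionConfiguration',
    bucketDeleteEncryption: 's3:PutEncryptionConfiguration',
};
```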
Nicolas Humbert 739f0a709c S3C-4110 backport lifecycle expiration - add tests 2021-06-09 16:07:28 -05:00
bert-e ea6e0c464b Merge branches 'w/7.10/bugfix/S3C-4257_StartSeqCanBeNull' and 'q/1472/7.4/bugfix/S3C-4257_StartSeqCanBeNull' into tmp/octopus/q/7.10 2021-06-08 08:18:01 +00:00
bert-e 4948e3a75e Merge branch 'bugfix/S3C-4257_StartSeqCanBeNull' into tmp/octopus/w/7.10/bugfix/S3C-4257_StartSeqCanBeNull 2021-06-08 02:49:44 +00:00
philipyoo 13f8d796b4 bf: apply multiple lifecycle filter tags if exists 2021-06-02 17:43:29 -05:00
Bennett Buchanan 9bdc330e9b feature: ZENKO-1317 AWS lifecycle compat 2021-06-02 17:43:25 -05:00
bert-e bcb6836a23 Merge branch 'feature/S3C-3754_add_bucketDeleteEncryption_route' into q/7.10 2021-05-17 17:31:24 +00:00
Taylor McKinnon cd15540cb9 ft(S3C-3754): Add bucketDeleteEncryption route and support code 2021-05-17 10:27:52 -07:00
Ilke fe264673e1 bf: S3C-4358 add versioned object lock actions 2021-05-12 16:10:59 -07:00
bert-e e022fc9b99 Merge branches 'w/7.10/improvement/S3C-4336_add_BucketInfoModelVersion' and 'q/1436/7.4/improvement/S3C-4336_add_BucketInfoModelVersion' into tmp/octopus/q/7.10 2021-05-10 20:18:36 +00:00
Taylor McKinnon 5e1fe450f6 add BucketInfo versions 7-9 2021-05-10 13:06:49 -07:00
bert-e 8a1987ba69 Merge branch 'improvement/S3C-4336_add_BucketInfoModelVersion' into tmp/octopus/w/7.10/improvement/S3C-4336_add_BucketInfoModelVersion 2021-05-10 20:02:51 +00:00
bert-e fa47c5045b Merge branch 'feature/S3C-4073_AddProbeServerToIndex' into tmp/octopus/w/7.10/feature/S3C-4073_AddProbeServerToIndex 2021-05-07 04:18:11 +00:00
bert-e cd9949cb11 Merge branch 'feature/S3C-4073_add-new-probe-server' into tmp/octopus/w/7.10/feature/S3C-4073_add-new-probe-server 2021-04-30 19:56:03 +00:00
Taylor McKinnon 990987bb6a ft(S3C-3748): Add PutBucketEncryption route 2021-04-29 09:34:45 -07:00
Taylor McKinnon faab2347f9 ft(S3C-3751): Add GetBucketEncryption route 2021-04-21 11:41:32 -07:00
bert-e 9a2b01c92e Merge branches 'w/7.10/bugfix/S3C-4275-versionListingWithDelimiterInefficiency' and 'q/1399/7.4/bugfix/S3C-4275-versionListingWithDelimiterInefficiency' into tmp/octopus/q/7.10 2021-04-14 01:17:38 +00:00
Taylor McKinnon 71c1c01b35 add BypassGovernanceRetention to action map 2021-04-13 13:25:16 -07:00
naren-scality 941b644e9e bf S3C-4239 log consumer callback error fix
A guard is added to ensure that the callback is called only once in the
event of an error while reading records in log consumer.
2021-04-12 10:47:31 -07:00
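The guard described in the S3C-4239 commit body is a classic call-once wrapper; a hedged sketch is below (`wrapOnce` is an illustrative name, not Arsenal's helper).

```javascript
// Wrap a callback so that a second invocation -- e.g. an error event
// firing after a normal completion while reading records -- is ignored
// instead of invoking the caller's callback twice.
function wrapOnce(cb) {
    let called = false;
    return (...args) => {
        if (called) {
            return; // duplicate invocation: swallow it
        }
        called = true;
        cb(...args);
    };
}
```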
bert-e 7a92327da2 Merge branch 'bugfix/S3C-4275-versionListingWithDelimiterInefficiency' into tmp/octopus/w/7.10/bugfix/S3C-4275-versionListingWithDelimiterInefficiency 2021-04-10 00:16:30 +00:00
bert-e bf4c40dfb8 Merge branch 'feature/S3C-4262_BackportZenkoMetrics' into tmp/octopus/w/7.10/feature/S3C-4262_BackportZenkoMetrics 2021-04-06 09:45:40 +00:00
Jonathan Gramain 4aa5071a0d Merge remote-tracking branch 'origin/dependabot/npm_and_yarn/development/7.4/mocha-8.0.1' into w/7.10/dependabot/npm_and_yarn/development/7.4/mocha-8.0.1 2021-04-02 12:44:06 -07:00
vrancurel 147946747c ft: S3C-4172 custom filter
Perform an optional filter on customAttributes sub-object with filterKey
and filterKeyStartWith optional parameters on basic filter.
2021-03-18 15:21:31 -07:00
bert-e 6eacd79f07 Merge branch 'bugfix/S3C-3962-zero-size-stream' into tmp/octopus/w/7.9/bugfix/S3C-3962-zero-size-stream 2021-02-10 17:31:07 +00:00
alexandre merle 65966f5ddf S3C-3904: more s3 action logs
Add 7.9 actions
2021-02-05 20:57:48 +01:00
bert-e f6223d1472 Merge branch 'bugfix/S3C-3904-better-s3-action-logs' into tmp/octopus/w/7.9/bugfix/S3C-3904-better-s3-action-logs 2021-02-05 18:15:28 +00:00
bert-e 7d58ca38ce Merge branch 'bugfix/S3C-3904-better-s3-action-logs' into tmp/octopus/w/7.9/bugfix/S3C-3904-better-s3-action-logs 2021-02-05 01:10:08 +00:00
alexandre merle b8bef65f00 Merge remote-tracking branch 'origin/bugfix/S3C-2201-econnreset-rest-client-keep-alive' into w/7.9/bugfix/S3C-2201-econnreset-rest-client-keep-alive 2021-01-25 20:33:09 +01:00
bert-e 26a00babb4 Merge branch 'w/7.9/bugfix/S3C-3425-client-ip-extraction-logic' into tmp/octopus/q/7.9 2020-12-31 20:26:18 +00:00
Dora Korpar 03521ac8ce bf: S3C-3581 add bucket notif apis for policy eval 2020-12-22 10:24:34 -08:00
bert-e f2bf36a2eb Merge branch 'bugfix/S3C-3425-client-ip-extraction-logic' into tmp/octopus/w/7.9/bugfix/S3C-3425-client-ip-extraction-logic 2020-12-17 18:01:26 +00:00
bert-e c84d41c06f Merge branch 'w/7.4/improvement/S3C-3653-add-fields' into tmp/octopus/w/7.8/improvement/S3C-3653-add-fields 2020-12-02 07:29:04 +00:00
Dora Korpar 38cc5d65d1 Merge remote-tracking branch 'origin/improvement/S3C-3475-add-actions-in-logs' into w/7.8/improvement/S3C-3475-add-actions-in-logs 2020-11-13 16:01:28 -08:00
Jonathan Gramain ed446c569c Merge remote-tracking branch 'origin/bugfix/S3C-3388-httpServerKeepAliveTimeoutOption' into w/7.8/bugfix/S3C-3388-httpServerKeepAliveTimeoutOption 2020-10-15 12:27:12 -07:00
bert-e af92067069 Merge branch 'w/7.8/feature/S3C-3185-CredentialReport-policy-check' into tmp/octopus/q/7.8 2020-10-08 22:24:20 +00:00
bert-e 2ec26f23b0 Merge branch 'bugfix/S3C-3402-removeWrongErrorLog' into tmp/octopus/w/7.8/bugfix/S3C-3402-removeWrongErrorLog 2020-10-08 20:48:43 +00:00
Anurag Mittal edbb4770bf feature: S3C-3185-CredentialReport-policy-and-errors 2020-10-08 12:56:59 +02:00
Dora Korpar 096407487b ft: S3-3177 policy tag condition keys 2020-09-29 16:56:18 -07:00
Dora Korpar 2d28231e97 bf: S3C-3303 empty notif config ok 2020-09-02 14:25:22 -07:00
Anurag Mittal 2e1f689344 feature: S3C-3183-policy-getAccessKeyLastUsed
Policy support for GetAccessKeyLastUsed
2020-08-31 13:48:58 +02:00
Dora Korpar 236c72d2df ft:S3C-2798 get bucketnotif queuearn and filter 2020-08-26 13:13:55 -07:00
Dora Korpar aa9c9e54ff ft: S3C-2797 queue arn parsing 2020-08-21 13:13:54 -07:00
Dora Korpar 775f380a6c ft: S3C-2797 export notification config model 2020-08-20 13:49:16 -07:00
Dora Korpar 645902ac42 ft: S3C-2797-bucket-info-notifconfig-update 2020-08-20 10:49:22 -07:00
Dora Korpar 3d219c208d ft:S3C-2798 get bucket notification 2020-08-20 10:23:28 -07:00
bert-e fb08fa36fc Merge branch 'feature/S3C-2797-put-bucket-notifications' into q/7.8 2020-08-20 17:15:22 +00:00
Dora Korpar 694553c752 ft: S3C-2797 bucket notification model 2020-08-19 22:15:49 -07:00
Dora Korpar 6fff00d088 ft: S3C-3229 add originOp objMD 2020-08-10 13:05:42 -07:00
bert-e 18aa07f49e Merge branches 'w/7.8/dependabot/npm_and_yarn/development/7.4/lolex-6.0.0' and 'q/1160/7.7/dependabot/npm_and_yarn/development/7.4/lolex-6.0.0' into tmp/octopus/q/7.8 2020-07-21 00:44:30 +00:00
bert-e 5c7664e5d2 Merge branches 'w/7.7/dependabot/npm_and_yarn/development/7.4/lolex-6.0.0' and 'q/1160/7.4/dependabot/npm_and_yarn/development/7.4/lolex-6.0.0' into tmp/octopus/q/7.7 2020-07-21 00:44:29 +00:00
bert-e 718c8ba461 Merge branch 'w/7.7/dependabot/npm_and_yarn/development/7.4/debug-2.6.9' into tmp/octopus/w/7.8/dependabot/npm_and_yarn/development/7.4/debug-2.6.9 2020-07-20 22:44:47 +00:00
bert-e 899415dce9 Merge branch 'dependabot/npm_and_yarn/development/7.4/debug-2.6.9' into tmp/octopus/w/7.7/dependabot/npm_and_yarn/development/7.4/debug-2.6.9 2020-07-20 22:44:47 +00:00
bert-e 3dac99da94 Merge branch 'w/7.7/dependabot/npm_and_yarn/development/7.4/lolex-6.0.0' into tmp/octopus/w/7.8/dependabot/npm_and_yarn/development/7.4/lolex-6.0.0 2020-07-20 22:44:39 +00:00
Jonathan Gramain e6180b769a Merge remote-tracking branch 'origin/dependabot/npm_and_yarn/development/7.4/lolex-6.0.0' into w/7.7/dependabot/npm_and_yarn/development/7.4/lolex-6.0.0 2020-07-20 15:42:16 -07:00
bert-e f295bcafa5 Merge branch 'w/7.7/bugfix/S3C-3130_handleObjectLockDisabledCaseForBucket' into tmp/octopus/w/7.8/bugfix/S3C-3130_handleObjectLockDisabledCaseForBucket 2020-07-07 23:38:26 +00:00
bert-e 580e25a9e8 Merge branch 'bugfix/S3C-3130_handleObjectLockDisabledCaseForBucket' into tmp/octopus/w/7.7/bugfix/S3C-3130_handleObjectLockDisabledCaseForBucket 2020-07-07 23:38:26 +00:00
Ilke e6622dfdce bf: S3C-3130 obj lock config should pass without rule 2020-07-07 16:37:30 -07:00
Ilke 91bb3ea291 bf: S3C-3130 obj lock config fails without rule 2020-07-06 16:38:16 -07:00
bert-e 478904116f Merge branch 'w/7.7/feature/S3C-3118-flatten-retention-objmd' into tmp/octopus/w/7.8/feature/S3C-3118-flatten-retention-objmd 2020-07-01 22:36:31 +00:00
bert-e 9048f31618 Merge branch 'feature/S3C-3118-flatten-retention-objmd' into tmp/octopus/w/7.7/feature/S3C-3118-flatten-retention-objmd 2020-07-01 22:36:30 +00:00
Dora Korpar b5853078c6 ft: S3C-3118-flatten-objmd-retentioninfo 2020-07-01 15:33:19 -07:00
bert-e fa8f705452 Merge branches 'w/7.8/dependabot/npm_and_yarn/development/7.4/ajv-6.12.2' and 'q/1157/7.7/dependabot/npm_and_yarn/development/7.4/ajv-6.12.2' into tmp/octopus/q/7.8 2020-07-01 19:33:59 +00:00
bert-e e12e0a3a5c Merge branches 'q/1157/7.4/dependabot/npm_and_yarn/development/7.4/ajv-6.12.2' and 'w/7.7/dependabot/npm_and_yarn/development/7.4/ajv-6.12.2' into tmp/octopus/q/7.7 2020-07-01 19:33:59 +00:00
bert-e 31f92ebcef Merge branch 'w/7.7/dependabot/npm_and_yarn/development/7.4/ajv-6.12.2' into tmp/octopus/w/7.8/dependabot/npm_and_yarn/development/7.4/ajv-6.12.2 2020-07-01 18:39:20 +00:00
Jonathan Gramain 438001cf60 build(deps): ajv dep bump: updates for compatibility with version 6
- Run migration tool on resourcePolicySchema.json to json-schema draft-06:
  `ajv migrate -s resourcePolicySchema.json`
2020-07-01 11:35:40 -07:00
bert-e 32fc05e04b Merge branch 'dependabot/npm_and_yarn/development/7.4/ajv-6.12.2' into tmp/octopus/w/7.7/dependabot/npm_and_yarn/development/7.4/ajv-6.12.2 2020-07-01 18:30:54 +00:00
bert-e 9f90e1ea26 Merge branches 'w/7.8/feature/S3C-3112_ObjectLockEnabledSetterForBucket' and 'q/1174/7.7/feature/S3C-3112_ObjectLockEnabledSetterForBucket' into tmp/octopus/q/7.8 2020-06-30 21:39:40 +00:00
bert-e 86ed244d7a Merge branch 'w/7.7/feature/S3C-3112_ObjectLockEnabledSetterForBucket' into tmp/octopus/q/7.7 2020-06-30 21:39:39 +00:00
bert-e f8888b9338 Merge branches 'w/7.8/dependabot/npm_and_yarn/development/7.4/temp-0.9.1' and 'q/1156/7.7/dependabot/npm_and_yarn/development/7.4/temp-0.9.1' into tmp/octopus/q/7.8 2020-06-30 20:09:44 +00:00
bert-e 1073bac469 Merge branches 'w/7.7/dependabot/npm_and_yarn/development/7.4/temp-0.9.1' and 'q/1156/7.4/dependabot/npm_and_yarn/development/7.4/temp-0.9.1' into tmp/octopus/q/7.7 2020-06-30 20:09:43 +00:00
bert-e e8e9e00f11 Merge branches 'development/7.8' and 'w/7.7/dependabot/npm_and_yarn/development/7.4/temp-0.9.1' into tmp/octopus/w/7.8/dependabot/npm_and_yarn/development/7.4/temp-0.9.1 2020-06-30 20:05:02 +00:00
bert-e 89b950a7e8 Merge branch 'development/7.7' into tmp/octopus/w/7.7/dependabot/npm_and_yarn/development/7.4/temp-0.9.1 2020-06-30 20:05:02 +00:00
bert-e de50c62825 Merge branch 'w/7.7/dependabot/npm_and_yarn/development/7.4/temp-0.9.1' into tmp/octopus/w/7.8/dependabot/npm_and_yarn/development/7.4/temp-0.9.1 2020-06-30 20:03:45 +00:00
Jonathan Gramain 6fb57f3271 Merge remote-tracking branch 'origin/dependabot/npm_and_yarn/development/7.4/temp-0.9.1' into w/7.7/dependabot/npm_and_yarn/development/7.4/temp-0.9.1 2020-06-30 13:02:37 -07:00
bert-e d6bf1ab748 Merge branches 'w/7.8/dependabot/npm_and_yarn/development/7.4/ipaddr.js-1.9.1' and 'q/1155/7.7/dependabot/npm_and_yarn/development/7.4/ipaddr.js-1.9.1' into tmp/octopus/q/7.8 2020-06-30 19:59:55 +00:00
bert-e e93af8ad45 Merge branches 'w/7.7/dependabot/npm_and_yarn/development/7.4/ipaddr.js-1.9.1' and 'q/1155/7.4/dependabot/npm_and_yarn/development/7.4/ipaddr.js-1.9.1' into tmp/octopus/q/7.7 2020-06-30 19:59:54 +00:00
bert-e 248ea9cea5 Merge branch 'w/7.7/dependabot/npm_and_yarn/development/7.4/socket.io-2.3.0' into tmp/octopus/w/7.8/dependabot/npm_and_yarn/development/7.4/socket.io-2.3.0 2020-06-30 02:19:43 +00:00
bert-e 1a00552657 Merge branch 'dependabot/npm_and_yarn/development/7.4/socket.io-2.3.0' into tmp/octopus/w/7.7/dependabot/npm_and_yarn/development/7.4/socket.io-2.3.0 2020-06-30 02:19:43 +00:00
bert-e b95b8b6cd3 Merge branches 'w/7.8/dependabot/npm_and_yarn/development/7.4/socket.io-client-2.3.0' and 'q/1153/7.7/dependabot/npm_and_yarn/development/7.4/socket.io-client-2.3.0' into tmp/octopus/q/7.8 2020-06-30 00:53:54 +00:00
bert-e 1e377c8801 Merge branches 'w/7.7/dependabot/npm_and_yarn/development/7.4/socket.io-client-2.3.0' and 'q/1153/7.4/dependabot/npm_and_yarn/development/7.4/socket.io-client-2.3.0' into tmp/octopus/q/7.7 2020-06-30 00:53:53 +00:00
bert-e 5e39c4c2c8 Merge branches 'w/7.8/dependabot/npm_and_yarn/development/7.4/simple-glob-0.2.0' and 'q/1152/7.7/dependabot/npm_and_yarn/development/7.4/simple-glob-0.2.0' into tmp/octopus/q/7.8 2020-06-30 00:52:17 +00:00
bert-e 60fe8f09cc Merge branches 'w/7.7/dependabot/npm_and_yarn/development/7.4/simple-glob-0.2.0' and 'q/1152/7.4/dependabot/npm_and_yarn/development/7.4/simple-glob-0.2.0' into tmp/octopus/q/7.7 2020-06-30 00:52:17 +00:00
bert-e 1e47b00568 Merge branch 'w/7.7/feature/S3C-3112_ObjectLockEnabledSetterForBucket' into tmp/octopus/w/7.8/feature/S3C-3112_ObjectLockEnabledSetterForBucket 2020-06-29 21:30:51 +00:00
bert-e c0aee417f9 Merge branch 'feature/S3C-3112_ObjectLockEnabledSetterForBucket' into tmp/octopus/w/7.7/feature/S3C-3112_ObjectLockEnabledSetterForBucket 2020-06-29 21:30:51 +00:00
Ilke 55b6ceadab ft: S3C-3112 add object lock enabled setter to bucket 2020-06-29 14:29:46 -07:00
Ilke 321bb400d3 ft: S3C-3112 bucket should have object lock enabled setter 2020-06-29 14:29:46 -07:00
bert-e ea3c09957d Merge branch 'w/7.7/dependabot/npm_and_yarn/development/7.4/simple-glob-0.2.0' into tmp/octopus/w/7.8/dependabot/npm_and_yarn/development/7.4/simple-glob-0.2.0 2020-06-29 18:57:42 +00:00
Jonathan Gramain 53a49c3747 Merge remote-tracking branch 'origin/dependabot/npm_and_yarn/development/7.4/simple-glob-0.2.0' into w/7.7/dependabot/npm_and_yarn/development/7.4/simple-glob-0.2.0 2020-06-29 11:54:09 -07:00
bert-e eab66494cf Merge branch 'w/7.7/dependabot/npm_and_yarn/development/7.4/ipaddr.js-1.9.1' into tmp/octopus/w/7.8/dependabot/npm_and_yarn/development/7.4/ipaddr.js-1.9.1 2020-06-29 03:30:42 +00:00
bert-e 01e9b7c80e Merge branch 'dependabot/npm_and_yarn/development/7.4/ipaddr.js-1.9.1' into tmp/octopus/w/7.7/dependabot/npm_and_yarn/development/7.4/ipaddr.js-1.9.1 2020-06-29 03:30:41 +00:00
bert-e ff4afb6c0f Merge branch 'w/7.7/dependabot/npm_and_yarn/development/7.4/socket.io-client-2.3.0' into tmp/octopus/w/7.8/dependabot/npm_and_yarn/development/7.4/socket.io-client-2.3.0 2020-06-29 03:30:16 +00:00
bert-e 76498cf31c Merge branch 'dependabot/npm_and_yarn/development/7.4/socket.io-client-2.3.0' into tmp/octopus/w/7.7/dependabot/npm_and_yarn/development/7.4/socket.io-client-2.3.0 2020-06-29 03:30:16 +00:00
bert-e ef87129383 Merge branch 'w/7.7/dependabot/npm_and_yarn/development/7.4/xml2js-0.4.23' into tmp/octopus/w/7.8/dependabot/npm_and_yarn/development/7.4/xml2js-0.4.23 2020-06-29 03:29:51 +00:00
bert-e 003b4cfd27 Merge branch 'dependabot/npm_and_yarn/development/7.4/xml2js-0.4.23' into tmp/octopus/w/7.7/dependabot/npm_and_yarn/development/7.4/xml2js-0.4.23 2020-06-29 03:29:51 +00:00
bert-e db793c6e07 Merge branch 'w/7.7/dependabot/add-v2-config-file' into tmp/octopus/w/7.8/dependabot/add-v2-config-file 2020-06-29 03:14:01 +00:00
bert-e b4763b541e Merge branch 'dependabot/add-v2-config-file' into tmp/octopus/w/7.7/dependabot/add-v2-config-file 2020-06-29 03:14:00 +00:00
bert-e 8cd5b714c0 Merge branches 'w/7.8/feature/S3C-3040-object-lock-iam-policies' and 'q/1133/7.7/feature/S3C-3040-object-lock-iam-policies' into tmp/octopus/q/7.8 2020-06-26 23:12:28 +00:00
bert-e 6f5614e461 Merge branch 'feature/S3C-3040-object-lock-iam-policies' into q/7.7 2020-06-26 23:12:28 +00:00
bert-e 4617d66cb8 Merge branch 'feature/S3C-3040-object-lock-iam-policies' into tmp/octopus/w/7.8/feature/S3C-3040-object-lock-iam-policies 2020-06-26 23:09:40 +00:00
Rahul Padigela b2c054e7c7 improvement: update package.json version 2020-06-26 15:54:39 -07:00
Dora Korpar 9716781cbe ft: S3C-3040 add obj lock to iam policies 2020-06-25 10:23:56 -07:00
bert-e a61c1914d6 Merge branch 'bugfix/S3C-2987-helperForJsonStreamParsing' into tmp/octopus/w/7.7/bugfix/S3C-2987-helperForJsonStreamParsing 2020-06-25 00:10:14 +00:00
naren-scality 1f5d33f006 ft: S3C-3069 policy support for new APIs 2020-06-19 15:02:32 -07:00
bert-e 65065dd4e3 Merge branches 'w/7.7/bugfix/S3C-2987-add-v0v1-vFormat' and 'q/1108/7.4/bugfix/S3C-2987-add-v0v1-vFormat' into tmp/octopus/q/7.7 2020-06-17 20:31:03 +00:00
bert-e 3f82448a67 Merge branch 'bugfix/S3C-2987-add-v0v1-vFormat' into tmp/octopus/w/7.7/bugfix/S3C-2987-add-v0v1-vFormat 2020-06-17 18:34:03 +00:00
bert-e 6530f0ace4 Merge branch 'improvement/S3C-3044-add-audit-log-from-vault' into tmp/octopus/w/7.7/improvement/S3C-3044-add-audit-log-from-vault 2020-06-15 13:21:44 +00:00
Dora Korpar 16c4464864 ft: S3C-2787-iso-parse 2020-06-12 16:07:17 -07:00
Dora Korpar 41c2ebcd61 ft: S3C-2787 retention parsing 2020-06-12 16:06:54 -07:00
Dora Korpar 48eeb1bc72 ft: S3C-2788 add get object retention route 2020-06-12 12:32:25 -07:00
Dora Korpar b77199b085 ft: S3C-2788 get obj retention 2020-06-05 21:17:42 -07:00
Dora Korpar 9b82caf129 ft: S3C 2787 put object retention 2020-06-05 20:28:05 -07:00
Ilke 9c12ff241e bugfix: S3C-2945 fix get legal hold route 2020-06-02 15:02:42 -07:00
bert-e 2125465761 Merge branch 'bugfix/S3C-2899-mergeStreamDestroy' into tmp/octopus/w/7.7/bugfix/S3C-2899-mergeStreamDestroy 2020-06-01 05:41:06 +00:00
Ilke b98c4b6dfd ft: S3C-2945 get object legal hold route 2020-05-29 11:41:44 -07:00
Ilke d06989a149 ft: S3C-2944 put object legal hold 2020-05-29 08:16:51 -07:00
bert-e 0d49eff7e4 Merge branches 'w/7.7/bugfix/S3C-2899-vformatV1delimiterVersions' and 'q/1031/7.4/bugfix/S3C-2899-vformatV1delimiterVersions' into tmp/octopus/q/7.7 2020-05-21 22:39:43 +00:00
Ilke 5d78367d1c ft: S3C-2790 get object lock configuration 2020-05-21 15:23:19 -07:00
bert-e b30da5ca67 Merge branches 'w/7.7/bugfix/S3C-2899-vformatV1delimiterMaster' and 'q/1028/7.4/bugfix/S3C-2899-vformatV1delimiterMaster' into tmp/octopus/q/7.7 2020-05-20 22:39:26 +00:00
bert-e d699f78f91 Merge branch 'w/7.7/bugfix/S3C-2899-vformatV1MPU' into tmp/octopus/q/7.7 2020-05-20 21:03:53 +00:00
bert-e 53cc766032 Merge branch 'bugfix/S3C-2899-vformatV1delimiterVersions' into tmp/octopus/w/7.7/bugfix/S3C-2899-vformatV1delimiterVersions 2020-05-19 23:47:34 +00:00
bert-e a82f9a2b70 Merge branch 'bugfix/S3C-2899-vformatV1delimiterMaster' into tmp/octopus/w/7.7/bugfix/S3C-2899-vformatV1delimiterMaster 2020-05-19 23:47:20 +00:00
bert-e d0367eb6d0 Merge branch 'bugfix/S3C-2899-vformatV1MPU' into tmp/octopus/w/7.7/bugfix/S3C-2899-vformatV1MPU 2020-05-19 23:47:07 +00:00
bert-e 9cac91c413 Merge branch 'bugfix/S3C-2899-helperForListingAlgoGenMDParams' into tmp/octopus/w/7.7/bugfix/S3C-2899-helperForListingAlgoGenMDParams 2020-05-19 23:46:02 +00:00
bert-e 6c62091622 Merge branch 'bugfix/S3C-2899-passVformatToListingParams' into tmp/octopus/w/7.7/bugfix/S3C-2899-passVformatToListingParams 2020-05-16 07:13:08 +00:00
Dora Korpar ef4a2dc077 ft: S3C-2789 object lock configuration 2020-05-15 14:22:17 -07:00
bert-e 5dff968096 Merge branch 'feature/S3C-2789-put-objlock-bucketinfo' into q/7.7 2020-05-15 18:16:22 +00:00
bert-e 2676b8384b Merge branch 'bugfix/S3C-2899-mergeStreamTooling' into tmp/octopus/w/7.7/bugfix/S3C-2899-mergeStreamTooling 2020-05-13 22:40:45 +00:00
bert-e 4544239269 Merge branch 'bugfix/S3C-2899-versioningKeyFormatConstants' into tmp/octopus/w/7.7/bugfix/S3C-2899-versioningKeyFormatConstants 2020-05-11 22:30:25 +00:00
Dora Korpar cc5b5e1971 ft: S3C-2789 put objlock bucketinfo update
[squash] bucket info unit tests
2020-05-05 13:03:09 -07:00
Ilke f988270a0c ft S3C-2785 objectLock check to bucketInfo model 2020-05-04 18:27:23 -07:00
bert-e 2b9ac57230 Merge branch 'w/7.6/bugfix/S3C-2726-removeSomeDefaultAttributesFromObjectMD' into tmp/octopus/w/7.7/bugfix/S3C-2726-removeSomeDefaultAttributesFromObjectMD 2020-04-22 21:33:27 +00:00
bert-e 336e42a9e0 Merge branch 'bugfix/S3C-2726-removeSomeDefaultAttributesFromObjectMD' into tmp/octopus/w/7.6/bugfix/S3C-2726-removeSomeDefaultAttributesFromObjectMD 2020-04-22 21:33:26 +00:00
bert-e fc0123ea5e Merge branch 'w/7.6/bugfix/S3C-2668_allow_utf8_characters_in_tags' into tmp/octopus/w/7.7/bugfix/S3C-2668_allow_utf8_characters_in_tags 2020-04-14 19:46:07 +00:00
bert-e 4d54b49c03 Merge branch 'bugfix/S3C-2668_allow_utf8_characters_in_tags' into tmp/octopus/w/7.6/bugfix/S3C-2668_allow_utf8_characters_in_tags 2020-04-14 19:46:06 +00:00
Ilke 65e92ebd92 improvement/S3C-2749 unit tests for url duration
Includes minor follow-up changes for the customization of the
S3 pre-signed URL duration, such as extracting urlExpiry as a
constant, as well as six unit tests.
2020-04-09 16:43:50 -07:00
Ilke d350f3db82 feature: S3C-2729 customize s3 pre-sign url
Customizing the S3 pre-sign URL duration by adding
an environment variable to extend it.
2020-04-03 17:17:45 -07:00
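The S3C-2729 change above (an environment variable extending the pre-signed URL duration) can be sketched as follows. The variable name and default value here are assumptions for illustration, not Scality's actual settings.

```javascript
// Fall back to the historical 7-day expiry when the environment variable
// is unset or does not parse to a positive number of milliseconds.
const DEFAULT_PRE_SIGN_URL_EXPIRY_MS = 604800000; // 7 days

function getUrlExpiryMs(env) {
    const raw = Number.parseInt(env.PRE_SIGN_URL_EXPIRY, 10);
    return Number.isFinite(raw) && raw > 0 ? raw : DEFAULT_PRE_SIGN_URL_EXPIRY_MS;
}
```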
bert-e c848d1f13d Merge branches 'w/7.6/bugfix/S3C-2502-vault-req-ip-header-port' and 'q/953/7.6.0/bugfix/S3C-2502-vault-req-ip-header-port' into tmp/octopus/q/7.6 2020-02-26 17:51:07 +00:00
bert-e eeb3ba970c Merge branch 'bugfix/S3C-2502-vault-req-ip-header-port' into q/7.6.0 2020-02-26 17:51:07 +00:00
bert-e c322c3b887 Merge branch 'bugfix/S3C-2604-listMultipleBucketMetrics' into tmp/octopus/w/7.6/bugfix/S3C-2604-listMultipleBucketMetrics 2020-02-26 09:27:05 +00:00
Anurag Mittal 2c892835cb bugfix: S3C-2604-handle-multiple-specific-resources 2020-02-26 10:25:40 +01:00
bert-e 04b063da70 Merge branch 'bugfix/S3C-2502-vault-req-ip-header-port' into tmp/octopus/w/7.6/bugfix/S3C-2502-vault-req-ip-header-port 2020-02-25 21:38:59 +00:00
Dora Korpar 3d0c3bea2e bf: S3C-2502 move ip util to arsenal 2020-02-25 13:32:35 -08:00
bert-e 0d4efa67eb Merge branch 'w/7.6/bugfix/S3C-2604-list-multiple-bucket-metrics' into tmp/octopus/q/7.6 2020-02-25 19:25:13 +00:00
bert-e 3068ce38a0 Merge branch 'bugfix/S3C-2604-list-multiple-bucket-metrics' into tmp/octopus/w/7.6/bugfix/S3C-2604-list-multiple-bucket-metrics 2020-02-24 15:45:41 +00:00
bert-e 030a3f33f1 Merge branch 'bugfix/S3C-2623_Explicit_socket_destroyed_check-port' into tmp/octopus/w/7.6/bugfix/S3C-2623_Explicit_socket_destroyed_check-port 2020-02-24 05:28:55 +00:00
Taylor McKinnon ed1cc0f1bf bf(S3C-2623): Add explicit socket.destroyed check
(cherry picked from commit 80d231a3fa)
2020-02-23 21:28:00 -08:00
Taylor McKinnon 80d231a3fa bf(S3C-2623): Add explicit socket.destroyed check 2020-02-21 14:46:10 -08:00
bert-e 2940500db6 Merge branch 'bugfix/S3C-2502-vault-req-ip-header' into tmp/octopus/w/7.6/bugfix/S3C-2502-vault-req-ip-header 2020-02-05 22:59:33 +00:00
bert-e 7aedc5f1f7 Merge branch 'bugfix/S3C-2541-algo-LRUCache' into tmp/octopus/w/7.6/bugfix/S3C-2541-algo-LRUCache 2019-12-27 23:35:22 +00:00
bert-e b99577eaeb Merge branch 'bugfix/S3C-2269/ArnMatch_case_sensitive_check' into tmp/octopus/w/7.6/bugfix/S3C-2269/ArnMatch_case_sensitive_check 2019-10-08 19:39:03 +00:00
bert-e 7f63022caa Merge branch 'bugfix/S3C-1805/bucket_name_with_consecutive_hyphens' into tmp/octopus/w/7.6/bugfix/S3C-1805/bucket_name_with_consecutive_hyphens 2019-10-03 22:20:28 +00:00
Dora Korpar 61d779083f bf: S3C-2440 fix get bucket policy xml error 2019-09-23 12:14:56 -07:00
Dora Korpar b0e56d64cd bf: S3C 2435 fix object action parse 2019-09-17 15:10:06 -07:00
Dora Korpar 12ad2d9423 bf: S3C 2396 fix bucket policy action parsing 2019-08-19 11:18:29 -07:00
Dora Korpar 32c895b21a bf: S3C 2276 bucketinfo should store object not json 2019-08-09 13:31:52 -07:00
Dora Korpar 006f77dd28 ft: S3C 2276 bucket policy models 2019-08-08 15:10:15 -07:00
bert-e c789d38df0 Merge branch 'improvement/S3C-2352-install-yarn-frozen-lockfile' into tmp/octopus/w/7.5/improvement/S3C-2352-install-yarn-frozen-lockfile 2019-08-08 18:18:50 +00:00
Dora Korpar 3b705a9434 ft: S3C 2282 bucket policy schema and validation 2019-08-01 13:15:15 -07:00
bert-e 6c7de4124d Merge branch 'improvement/S3C-2352-install-yarn' into tmp/octopus/w/7.5/improvement/S3C-2352-install-yarn 2019-07-30 18:30:21 +00:00
bert-e 59803d7b67 Merge branch 'improvement/S3C-2351-update-joi' into tmp/octopus/w/7.5/improvement/S3C-2351-update-joi 2019-07-29 22:55:49 +00:00
bert-e 98737a69ba Merge branch 'feature/S3C-2216-bump-tags-limit' into tmp/octopus/w/7.5/feature/S3C-2216-bump-tags-limit 2019-07-26 23:34:37 +00:00
Dora Korpar 94653a14c4 ft: S3C-2346 add bucket policy routes 2019-07-25 15:28:55 -07:00
bert-e 0f53c78ccd Merge branch 'bugfix/S3C-2335-fixDataServerCloseSync' into tmp/octopus/w/7.5/bugfix/S3C-2335-fixDataServerCloseSync 2019-07-17 23:13:17 +00:00
bert-e b03f5b80ac Merge branch 'improvement/S3C-2127-upgrade-node' into tmp/octopus/w/7.5/improvement/S3C-2127-upgrade-node 2019-06-27 22:59:58 +00:00
bert-e 933dc1da17 Merge branch 'bugfix/S3C-2172-bucket-error' into tmp/octopus/w/7.5/bugfix/S3C-2172-bucket-error 2019-05-22 23:59:46 +00:00
bert-e ae8dd1bb0e Merge branch 'improvement/S3C-2034-bump-ioredis' into tmp/octopus/w/7.5/improvement/S3C-2034-bump-ioredis 2019-05-20 21:49:24 +00:00
Guillaume Gimenez c6e06cc235 feature: S3C-2031: KMIP uses arsenal errors 2019-03-14 16:10:50 -07:00
bert-e 57c971ef0f Merge branches 'w/7.5/feature/S3C-2002-admin-service' and 'q/722/7.4/feature/S3C-2002-admin-service' into tmp/octopus/q/7.5 2019-03-08 00:17:05 +00:00
bert-e d8320da1bb Merge branch 'w/7.4/feature/S3C-2002-admin-service' into tmp/octopus/w/7.5/feature/S3C-2002-admin-service 2019-03-07 19:29:21 +00:00
bert-e d5d6243c01 Merge branch 'bugfix/S3C-2017-berte-fix' into tmp/octopus/w/7.5/bugfix/S3C-2017-berte-fix 2019-03-07 18:15:10 +00:00
Guillaume Gimenez 7fb16cbca6 feature: S3C-1968: usage of deprecated Buffer ctor 2019-03-01 16:50:21 -08:00
Guillaume Gimenez 2a8a5dcb94 feature: S3C-1968: Loopback Test KMIP Server 2019-03-01 16:50:21 -08:00
Guillaume Gimenez ff5d62f7de feature: S3C-1968: KMIP High Level Cloudserver Driver 2019-03-01 16:50:21 -08:00
bert-e 97035596e1 Merge branch 'feature/S3C-1967/kmip-lowlevel-driver' into q/7.5 2019-03-02 00:46:12 +00:00
bert-e 8c19dcdc7c Merge branch 'w/7.4/bugfix/S3C-2006-listing-filter-value-fix' into tmp/octopus/w/7.5/bugfix/S3C-2006-listing-filter-value-fix 2019-03-01 19:23:51 +00:00
Guillaume Gimenez cae763669b feature: S3C-1967: KMIP Low Level Driver 2019-02-28 12:03:14 -08:00
Guillaume Gimenez b3598c5d0e feature: S3C-1966: KMIP TLS Transport 2019-02-28 11:49:22 -08:00
bert-e ac365eef18 Merge branch 'feature/S3C-1925/kmip-ttlv-codec' into q/7.5 2019-02-22 00:49:09 +00:00
Guillaume Gimenez f7aa22f9a6 feature: S3C-1925: KMIP TTLV Codec 2019-02-21 16:27:24 -08:00
76 changed files with 10494 additions and 488 deletions


@ -85,6 +85,56 @@ Used to store the bucket lifecycle configuration info
### Properties Added
```javascript
this._objectLockEnabled = objectLockEnabled || false;
this._objectLockConfiguration = objectLockConfiguration || null;
```
### Usage
Used to determine whether object lock capabilities are enabled on a bucket and
to store the object lock configuration of the bucket
## Model version 8
### Properties Added
```javascript
this._notificationConfiguration = notificationConfiguration || null;
```
### Usage
Used to store the bucket notification configuration info
## Model version 9
### Properties Added
```javascript
this._serverSideEncryption.configuredMasterKeyId = configuredMasterKeyId || undefined;
```
### Usage
Used to store the user's configured KMS key id
## Model version 10
### Properties Added
```javascript
this._uid = uid || uuid();
```
### Usage
Used to set a unique identifier on a bucket
## Model version 11
### Properties Added
```javascript
this._data.isAborted = true || false;
```


@ -256,6 +256,10 @@
"code": 404,
"description": "The lifecycle configuration does not exist."
},
"NoSuchObjectLockConfiguration": {
"code": 404,
"description": "The specified object does not have a ObjectLock configuration."
},
"NoSuchWebsiteConfiguration": {
"code": 404,
"description": "The specified bucket does not have a website configuration"
@ -272,6 +276,14 @@
"code": 404,
"description": "The replication configuration was not found"
},
"ObjectLockConfigurationNotFoundError": {
"code": 404,
"description": "The object lock configuration was not found"
},
"ServerSideEncryptionConfigurationNotFoundError" : {
"code": 404,
"description": "The server side encryption configuration was not found"
},
"NotImplemented": {
"code": 501,
"description": "A header you provided implies functionality that is not implemented."
@ -467,6 +479,22 @@
"code": 400,
"description": "The request was rejected because an invalid or out-of-range value was supplied for an input parameter."
},
"MalformedPolicy": {
"code": 400,
"description": "This policy contains invalid Json"
},
"ReportExpired": {
"code": 410,
"description": "The request was rejected because the most recent credential report has expired. To generate a new credential report, use GenerateCredentialReport."
},
"ReportInProgress": {
"code": 404,
"description": "The request was rejected because the credential report is still being generated."
},
"ReportNotPresent": {
"code": 410,
"description": "The request was rejected because the credential report does not exist. To generate a credential report, use GenerateCredentialReport."
},
"_comment": "-------------- Special non-AWS S3 errors --------------",
"MPUinProgress": {
"code": 409,


@ -64,6 +64,8 @@ module.exports = {
ProbeServer: require('./lib/network/probe/ProbeServer'),
},
RoundRobin: require('./lib/network/RoundRobin'),
kmip: require('./lib/network/kmip'),
kmipClient: require('./lib/network/kmip/Client'),
},
s3routes: {
routes: require('./lib/s3routes/routes'),
@ -73,6 +75,7 @@ module.exports = {
userMetadata: require('./lib/s3middleware/userMetadata'),
convertToXml: require('./lib/s3middleware/convertToXml'),
escapeForXml: require('./lib/s3middleware/escapeForXml'),
objectLegalHold: require('./lib/s3middleware/objectLegalHold'),
tagging: require('./lib/s3middleware/tagging'),
validateConditionalHeaders:
require('./lib/s3middleware/validateConditionalHeaders')
@ -88,6 +91,8 @@ module.exports = {
SubStreamInterface:
require('./lib/s3middleware/azureHelpers/SubStreamInterface'),
},
retention: require('./lib/s3middleware/objectRetention'),
lifecycleHelpers: require('./lib/s3middleware/lifecycleHelpers'),
},
storage: {
metadata: {
@ -116,6 +121,12 @@ module.exports = {
require('./lib/models/ReplicationConfiguration'),
LifecycleConfiguration:
require('./lib/models/LifecycleConfiguration'),
LifecycleRule: require('./lib/models/LifecycleRule'),
BucketPolicy: require('./lib/models/BucketPolicy'),
ObjectLockConfiguration:
require('./lib/models/ObjectLockConfiguration'),
NotificationConfiguration:
require('./lib/models/NotificationConfiguration'),
},
metrics: {
StatsClient: require('./lib/metrics/StatsClient'),


@ -2,7 +2,7 @@
const Extension = require('./Extension').default;
const { checkLimit, FILTER_END, FILTER_ACCEPT } = require('./tools');
const { checkLimit, FILTER_END, FILTER_ACCEPT, FILTER_SKIP } = require('./tools');
const DEFAULT_MAX_KEYS = 10000;
/**
@ -21,6 +21,8 @@ class List extends Extension {
this.res = [];
if (parameters) {
this.maxKeys = checkLimit(parameters.maxKeys, DEFAULT_MAX_KEYS);
this.filterKey = parameters.filterKey;
this.filterKeyStartsWith = parameters.filterKeyStartsWith;
} else {
this.maxKeys = DEFAULT_MAX_KEYS;
}
@ -44,6 +46,43 @@ class List extends Extension {
return params;
}
/**
* Filters customAttributes sub-object if present
*
* @param {String} value - The JSON value of a listing item
*
* @return {Boolean} Returns true if matches, else false.
*/
customFilter(value) {
let _value;
try {
_value = JSON.parse(value);
} catch (e) {
// Prefer returning unfiltered data rather than
// stopping the service in case of a parsing failure.
// The risk of this approach is a potential
// reproduction of MD-692, where too much memory is
// used by repd.
this.logger.warn(
'Could not parse Object Metadata while listing',
{ err: e.toString() });
return false;
}
if (_value.customAttributes !== undefined) {
for (const key of Object.keys(_value.customAttributes)) {
if (this.filterKey !== undefined &&
key === this.filterKey) {
return true;
}
if (this.filterKeyStartsWith !== undefined &&
key.startsWith(this.filterKeyStartsWith)) {
return true;
}
}
}
return false;
}
/**
* Function apply on each element
* Just add it to the array
@ -56,6 +95,12 @@ class List extends Extension {
if (this.keys >= this.maxKeys) {
return FILTER_END;
}
if ((this.filterKey !== undefined ||
this.filterKeyStartsWith !== undefined) &&
typeof elem === 'object' &&
!this.customFilter(elem.value)) {
return FILTER_SKIP;
}
if (typeof elem === 'object') {
this.res.push({
key: elem.key,


@ -222,6 +222,40 @@ class Vault {
});
}
/** getAccountIds -- call Vault to get accountIds based on
* canonicalIDs
* @param {array} canonicalIDs - list of canonicalIDs
* @param {object} log - log object
* @param {function} callback - callback with either error or an object
* with canonicalID keys and accountId values
* @return {undefined}
*/
getAccountIds(canonicalIDs, log, callback) {
log.trace('getting accountIds from Vault based on canonicalIDs',
{ canonicalIDs });
this.client.getAccountIds(canonicalIDs,
{ reqUid: log.getSerializedUids() },
(err, info) => {
if (err) {
log.debug('received error message from vault',
{ errorMessage: err });
return callback(err);
}
const infoFromVault = info.message.body;
log.trace('info received from vault', { infoFromVault });
const result = {};
/* If the accountId was not found in Vault, do not
send the canonicalID back to the API */
Object.keys(infoFromVault).forEach(key => {
if (infoFromVault[key] !== 'NotFound' &&
infoFromVault[key] !== 'WrongFormat') {
result[key] = infoFromVault[key];
}
});
return callback(null, result);
});
}
/** checkPolicies -- call Vault to evaluate policies
* @param {object} requestContextParams - parameters needed to construct
* requestContext in Vault


@ -157,6 +157,34 @@ class Backend {
};
return cb(null, vaultReturnObject);
}
/**
* Gets accountIds for a list of accounts based on
* the canonical IDs associated with the account
* @param {array} canonicalIDs - list of canonicalIDs
* @param {object} options - to send log id to vault
* @param {function} cb - callback to calling function
* @returns {function} callback with either error or
* an object from Vault containing account canonicalID
* as each object key and an accountId as the value (or "NotFound")
*/
getAccountIds(canonicalIDs, options, cb) {
const results = {};
canonicalIDs.forEach(canonicalID => {
const foundEntity = this.indexer.getEntityByCanId(canonicalID);
if (!foundEntity || !foundEntity.shortid) {
results[canonicalID] = 'Not Found';
} else {
results[canonicalID] = foundEntity.shortid;
}
});
const vaultReturnObject = {
message: {
body: results,
},
};
return cb(null, vaultReturnObject);
}
}


@ -34,8 +34,13 @@ function check(request, log, data) {
}
const currentTime = Date.now();
// 604800000 ms (seven days).
if (expirationTime > currentTime + 604800000) {
const preSignedURLExpiry = process.env.PRE_SIGN_URL_EXPIRY
&& !Number.isNaN(process.env.PRE_SIGN_URL_EXPIRY)
? Number.parseInt(process.env.PRE_SIGN_URL_EXPIRY, 10)
: constants.defaultPreSignedURLExpiry * 1000;
if (expirationTime > currentTime + preSignedURLExpiry) {
log.debug('expires parameter too far in future',
{ expires: request.query.Expires });
return { err: errors.AccessDenied };


@ -72,6 +72,22 @@ module.exports = {
permittedCapitalizedBuckets: {
METADATA: true,
},
// Default expiration value of the S3 pre-signed URL duration
// 604800 seconds (seven days).
defaultPreSignedURLExpiry: 7 * 24 * 60 * 60,
// Regex for ISO-8601 formatted date
shortIso8601Regex: /\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}Z/,
longIso8601Regex: /\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}.\d{3}Z/,
supportedNotificationEvents: new Set([
's3:ObjectCreated:*',
's3:ObjectCreated:Put',
's3:ObjectCreated:Copy',
's3:ObjectCreated:CompleteMultipartUpload',
's3:ObjectRemoved:*',
's3:ObjectRemoved:Delete',
's3:ObjectRemoved:DeleteMarkerCreated',
]),
notificationArnPrefix: 'arn:scality:bucketnotif',
// HTTP server keep-alive timeout is set to a higher value than
// client's free sockets timeout to avoid the risk of triggering
// ECONNRESET errors if the server closes the connection at the
@ -84,4 +100,9 @@ module.exports = {
// http.Agent.
httpServerKeepAliveTimeout: 60000,
httpClientFreeSocketTimeout: 55000,
supportedLifecycleRules: [
'expiration',
'noncurrentVersionExpiration',
'abortIncompleteMultipartUpload',
],
};


@ -1,10 +1,16 @@
const assert = require('assert');
const uuid = require('uuid/v4');
const { WebsiteConfiguration } = require('./WebsiteConfiguration');
const ReplicationConfiguration = require('./ReplicationConfiguration');
const LifecycleConfiguration = require('./LifecycleConfiguration');
const ObjectLockConfiguration = require('./ObjectLockConfiguration');
const BucketPolicy = require('./BucketPolicy');
const NotificationConfiguration = require('./NotificationConfiguration');
// WHEN UPDATING THIS NUMBER, UPDATE MODELVERSION.MD CHANGELOG
const modelVersion = 6;
// WHEN UPDATING THIS NUMBER, UPDATE BucketInfoModelVersion.md CHANGELOG
// BucketInfoModelVersion.md can be found in the root of this repository
const modelVersion = 10;
class BucketInfo {
/**
@ -27,6 +33,8 @@ class BucketInfo {
* algorithm to use
* @param {string} serverSideEncryption.masterKeyId -
* key to get master key
* @param {string} serverSideEncryption.configuredMasterKeyId -
* custom KMS key id specified by user
* @param {boolean} serverSideEncryption.mandatory -
* true for mandatory encryption
* bucket has been made
@ -47,12 +55,19 @@ class BucketInfo {
* @param {string[]} [cors[].exposeHeaders] - headers expose to applications
* @param {object} [replicationConfiguration] - replication configuration
* @param {object} [lifecycleConfiguration] - lifecycle configuration
* @param {object} [bucketPolicy] - bucket policy
* @param {string} [uid] - unique identifier for the bucket, necessary
* @param {boolean} [objectLockEnabled] - true when object lock enabled
* @param {object} [objectLockConfiguration] - object lock configuration
* @param {object} [notificationConfiguration] - bucket notification configuration
*/
constructor(name, owner, ownerDisplayName, creationDate,
mdBucketModelVersion, acl, transient, deleted,
serverSideEncryption, versioningConfiguration,
locationConstraint, websiteConfiguration, cors,
replicationConfiguration, lifecycleConfiguration) {
replicationConfiguration, lifecycleConfiguration,
bucketPolicy, uid, objectLockEnabled, objectLockConfiguration,
notificationConfiguration) {
assert.strictEqual(typeof name, 'string');
assert.strictEqual(typeof owner, 'string');
assert.strictEqual(typeof ownerDisplayName, 'string');
@ -70,12 +85,15 @@ class BucketInfo {
}
if (serverSideEncryption) {
assert.strictEqual(typeof serverSideEncryption, 'object');
const { cryptoScheme, algorithm, masterKeyId, mandatory } =
serverSideEncryption;
const { cryptoScheme, algorithm, masterKeyId,
configuredMasterKeyId, mandatory } = serverSideEncryption;
assert.strictEqual(typeof cryptoScheme, 'number');
assert.strictEqual(typeof algorithm, 'string');
assert.strictEqual(typeof masterKeyId, 'string');
assert.strictEqual(typeof mandatory, 'boolean');
if (configuredMasterKeyId !== undefined) {
assert.strictEqual(typeof configuredMasterKeyId, 'string');
}
}
if (versioningConfiguration) {
assert.strictEqual(typeof versioningConfiguration, 'object');
@ -112,6 +130,19 @@ class BucketInfo {
if (lifecycleConfiguration) {
LifecycleConfiguration.validateConfig(lifecycleConfiguration);
}
if (bucketPolicy) {
BucketPolicy.validatePolicy(bucketPolicy);
}
if (uid) {
assert.strictEqual(typeof uid, 'string');
assert.strictEqual(uid.length, 36);
}
if (objectLockConfiguration) {
ObjectLockConfiguration.validateConfig(objectLockConfiguration);
}
if (notificationConfiguration) {
NotificationConfiguration.validateConfig(notificationConfiguration);
}
const aclInstance = acl || {
Canned: 'private',
FULL_CONTROL: [],
@ -137,6 +168,11 @@ class BucketInfo {
this._replicationConfiguration = replicationConfiguration || null;
this._cors = cors || null;
this._lifecycleConfiguration = lifecycleConfiguration || null;
this._bucketPolicy = bucketPolicy || null;
this._uid = uid || uuid();
this._objectLockEnabled = objectLockEnabled || false;
this._objectLockConfiguration = objectLockConfiguration || null;
this._notificationConfiguration = notificationConfiguration || null;
return this;
}
/**
@ -160,6 +196,11 @@ class BucketInfo {
cors: this._cors,
replicationConfiguration: this._replicationConfiguration,
lifecycleConfiguration: this._lifecycleConfiguration,
bucketPolicy: this._bucketPolicy,
uid: this._uid,
objectLockEnabled: this._objectLockEnabled,
objectLockConfiguration: this._objectLockConfiguration,
notificationConfiguration: this._notificationConfiguration,
};
if (this._websiteConfiguration) {
bucketInfos.websiteConfiguration =
@ -180,7 +221,9 @@ class BucketInfo {
obj.creationDate, obj.mdBucketModelVersion, obj.acl,
obj.transient, obj.deleted, obj.serverSideEncryption,
obj.versioningConfiguration, obj.locationConstraint, websiteConfig,
obj.cors, obj.replicationConfiguration, obj.lifecycleConfiguration);
obj.cors, obj.replicationConfiguration, obj.lifecycleConfiguration,
obj.bucketPolicy, obj.uid, obj.objectLockEnabled,
obj.objectLockConfiguration, obj.notificationConfiguration);
}
/**
@ -203,7 +246,9 @@ class BucketInfo {
data._transient, data._deleted, data._serverSideEncryption,
data._versioningConfiguration, data._locationConstraint,
data._websiteConfiguration, data._cors,
data._replicationConfiguration, data._lifecycleConfiguration);
data._replicationConfiguration, data._lifecycleConfiguration,
data._bucketPolicy, data._uid, data._objectLockEnabled,
data._objectLockConfiguration, data._notificationConfiguration);
}
/**
@ -331,6 +376,57 @@ class BucketInfo {
this._lifecycleConfiguration = lifecycleConfiguration;
return this;
}
/**
* Get bucket policy statement
* @return {object|null} bucket policy statement or `null` if the bucket
* does not have a bucket policy
*/
getBucketPolicy() {
return this._bucketPolicy;
}
/**
* Set bucket policy statement
* @param {object} bucketPolicy - bucket policy
* @return {BucketInfo} - bucket info instance
*/
setBucketPolicy(bucketPolicy) {
this._bucketPolicy = bucketPolicy;
return this;
}
/**
* Get object lock configuration
* @return {object|null} object lock configuration information or `null` if
* the bucket does not have an object lock configuration
*/
getObjectLockConfiguration() {
return this._objectLockConfiguration;
}
/**
* Set object lock configuration
* @param {object} objectLockConfiguration - object lock information
* @return {BucketInfo} - bucket info instance
*/
setObjectLockConfiguration(objectLockConfiguration) {
this._objectLockConfiguration = objectLockConfiguration;
return this;
}
/**
* Get notification configuration
* @return {object|null} notification configuration information or `null` if
* the bucket does not have a notification configuration
*/
getNotificationConfiguration() {
return this._notificationConfiguration;
}
/**
* Set notification configuration
* @param {object} notificationConfiguration - bucket notification information
* @return {BucketInfo} - bucket info instance
*/
setNotificationConfiguration(notificationConfiguration) {
this._notificationConfiguration = notificationConfiguration;
return this;
}
/**
* Get cors resource
* @return {object[]} cors
@ -521,6 +617,38 @@ class BucketInfo {
return this._versioningConfiguration &&
this._versioningConfiguration.Status === 'Enabled';
}
/**
* Get unique id of bucket.
* @return {string} - unique id
*/
getUid() {
return this._uid;
}
/**
* Set unique id of bucket.
* @param {string} uid - unique identifier for the bucket
* @return {BucketInfo} - bucket info instance
*/
setUid(uid) {
this._uid = uid;
return this;
}
/**
* Check if object lock is enabled.
* @return {boolean} - depending on whether object lock is enabled
*/
isObjectLockEnabled() {
return !!this._objectLockEnabled;
}
/**
* Set the value of objectLockEnabled field.
* @param {boolean} enabled - true if object lock enabled else false.
* @return {BucketInfo} - bucket info instance
*/
setObjectLockEnabled(enabled) {
this._objectLockEnabled = enabled;
return this;
}
}
module.exports = BucketInfo;

lib/models/BucketPolicy.js (new file, 143 lines)

@ -0,0 +1,143 @@
const assert = require('assert');
const errors = require('../errors');
const { validateResourcePolicy } = require('../policy/policyValidator');
/**
* Format of json policy:
* {
* "Id": "Policy id",
* "Version": "version date",
* "Statement": [
* {
* "Sid": "Statement id",
* "Effect": "Allow",
* "Principal": "*",
* "Action": "s3:*",
* "Resource": "arn:aws:s3:::examplebucket/bucket2/object"
* },
* {
* "Sid": "Statement id",
* "Effect": "Deny",
* "Principal": {
* "AWS": ["arn:aws:iam::<account_id>", "different_account_id"]
* },
* "Action": [ "s3:*" ],
* "Resource": [
* "arn:aws:s3:::examplebucket", "arn:aws:s3:::otherbucket/*"],
* "Condition": {
* "StringNotLike": {
* "aws:Referer": [
* "http://www.example.com/", "http://example.com/*"]
* }
* }
* }
* ]
* }
*/
const objectActions = [
's3:AbortMultipartUpload',
's3:DeleteObject',
's3:DeleteObjectTagging',
's3:GetObject',
's3:GetObjectAcl',
's3:GetObjectTagging',
's3:ListMultipartUploadParts',
's3:PutObject',
's3:PutObjectAcl',
's3:PutObjectTagging',
];
class BucketPolicy {
/**
* Create a Bucket Policy instance
* @param {string} json - the json policy
* @return {object} - BucketPolicy instance
*/
constructor(json) {
this._json = json;
this._policy = {};
}
/**
* Get the bucket policy
* @return {object} - the bucket policy or error
*/
getBucketPolicy() {
const policy = this._getPolicy();
return policy;
}
/**
* Get the bucket policy array
* @return {object} - contains error if policy validation fails
*/
_getPolicy() {
if (!this._json || this._json === '') {
return { error: errors.MalformedPolicy.customizeDescription(
'request json is empty or undefined') };
}
const validSchema = validateResourcePolicy(this._json);
if (validSchema.error) {
return validSchema;
}
this._setStatementArray();
const valAcRes = this._validateActionResource();
if (valAcRes.error) {
return valAcRes;
}
return this._policy;
}
_setStatementArray() {
this._policy = JSON.parse(this._json);
if (!Array.isArray(this._policy.Statement)) {
const statement = this._policy.Statement;
this._policy.Statement = [statement];
}
}
/**
* Validate action and resource are compatible
* @return {error} - contains error or empty obj
*/
_validateActionResource() {
const invalid = this._policy.Statement.every(s => {
const actions = typeof s.Action === 'string' ?
[s.Action] : s.Action;
const resources = typeof s.Resource === 'string' ?
[s.Resource] : s.Resource;
const objectAction = actions.some(a =>
a.includes('Object') || objectActions.includes(a));
// wildcardObjectAction checks for actions such as 's3:*' or
// 's3:Put*' but will return false for actions such as
// 's3:PutBucket*'
const wildcardObjectAction = actions.some(
a => a.includes('*') && !a.includes('Bucket'));
const objectResource = resources.some(r => r.includes('/'));
return ((objectAction && !objectResource) ||
(objectResource && !objectAction && !wildcardObjectAction));
});
if (invalid) {
return { error: errors.MalformedPolicy.customizeDescription(
'Action does not apply to any resource(s) in statement') };
}
return {};
}
/**
* Call resource policy schema validation function
* @param {object} policy - the bucket policy object to validate
* @return {undefined}
*/
static validatePolicy(policy) {
// only the BucketInfo constructor calls this function
// and BucketInfo will always be passed an object
const validated = validateResourcePolicy(JSON.stringify(policy));
assert.deepStrictEqual(validated, { error: null, valid: true });
}
}
module.exports = BucketPolicy;


@ -2,6 +2,8 @@ const assert = require('assert');
const UUID = require('uuid');
const errors = require('../errors');
const LifecycleRule = require('./LifecycleRule');
const escapeForXml = require('../s3middleware/escapeForXml');
/**
* Format of xml request:
@ -148,6 +150,36 @@ class LifecycleConfiguration {
return rules;
}
/**
* Check that the prefix is valid
* @param {string} prefix - The prefix to check
* @return {object|null} - The error or null
*/
_checkPrefix(prefix) {
if (prefix.length > 1024) {
const msg = 'The maximum size of a prefix is 1024';
return errors.InvalidRequest.customizeDescription(msg);
}
return null;
}
/**
* Parses the prefix of the config
* @param {string} prefix - The prefix to parse
* @return {object} - Contains error if parsing returned an error, otherwise
* it contains the parsed rule object
*/
_parsePrefix(prefix) {
const error = this._checkPrefix(prefix);
if (error) {
return { error };
}
return {
propName: 'prefix',
prefix,
};
}
/**
* Check that each xml rule is valid
* @param {object} rule - a rule object from Rule array from this._parsedXml
@ -201,14 +233,18 @@ class LifecycleConfiguration {
customizeDescription('Rule xml does not include Status');
return ruleObj;
}
const subFilter = rule.Filter ? rule.Filter[0] : rule.Prefix;
const id = this._parseID(rule.ID);
const status = this._parseStatus(rule.Status[0]);
const filter = this._parseFilter(subFilter);
const actions = this._parseAction(rule);
const rulePropArray = [id, status, filter, actions];
const rulePropArray = [id, status, actions];
if (rule.Prefix) {
// Backward compatibility with deprecated top-level prefix.
const prefix = this._parsePrefix(rule.Prefix[0]);
rulePropArray.push(prefix);
} else if (rule.Filter) {
const filter = this._parseFilter(rule.Filter[0]);
rulePropArray.push(filter);
}
for (let i = 0; i < rulePropArray.length; i++) {
const prop = rulePropArray[i];
if (prop.error) {
@ -218,7 +254,11 @@ class LifecycleConfiguration {
const propName = prop.propName;
// eslint-disable-next-line no-param-reassign
delete prop.propName;
ruleObj[propName] = prop[propName] || prop;
if (prop[propName] !== undefined) {
ruleObj[propName] = prop[propName];
} else {
ruleObj[propName] = prop;
}
}
}
return ruleObj;
@ -250,12 +290,14 @@ class LifecycleConfiguration {
_parseFilter(filter) {
const filterObj = {};
filterObj.propName = 'filter';
// if no Rule Prefix or Filter, rulePrefix is empty string
filterObj.rulePrefix = '';
if (Array.isArray(filter)) {
// if Prefix was included, not Filter, filter will be Prefix array
// if more than one Prefix is included, we ignore all but the last
filterObj.rulePrefix = filter.pop();
filterObj.rulePrefix = filter[filter.length - 1];
const error = this._checkPrefix(filterObj.rulePrefix);
if (error) {
filterObj.error = error;
}
return filterObj;
}
if (filter.And && (filter.Prefix || filter.Tag) ||
@ -265,11 +307,15 @@ class LifecycleConfiguration {
return filterObj;
}
if (filter.Prefix) {
filterObj.rulePrefix = filter.Prefix.pop();
filterObj.rulePrefix = filter.Prefix[filter.Prefix.length - 1];
const error = this._checkPrefix(filterObj.rulePrefix);
if (error) {
filterObj.error = error;
}
return filterObj;
}
if (filter.Tag) {
const tagObj = this._parseTags(filter.Tag[0]);
const tagObj = this._parseTags(filter.Tag);
if (tagObj.error) {
filterObj.error = tagObj.error;
return filterObj;
@ -285,9 +331,14 @@ class LifecycleConfiguration {
return filterObj;
}
if (andF.Prefix && andF.Prefix.length >= 1) {
filterObj.rulePrefix = andF.Prefix.pop();
filterObj.rulePrefix = andF.Prefix[andF.Prefix.length - 1];
const error = this._checkPrefix(filterObj.rulePrefix);
if (error) {
filterObj.error = error;
return filterObj;
}
}
const tagObj = this._parseTags(andF.Tag[0]);
const tagObj = this._parseTags(andF.Tag);
if (tagObj.error) {
filterObj.error = tagObj.error;
return filterObj;
@ -320,31 +371,33 @@ class LifecycleConfiguration {
// reset _tagKeys to empty because keys cannot overlap within a rule,
// but different rules can have the same tag keys
this._tagKeys = [];
if (!tags.Key || !tags.Value) {
tagObj.error = errors.MissingRequiredParameter.customizeDescription(
'Tag XML does not contain both Key and Value');
return tagObj;
}
if (tags.Key.length !== tags.Value.length) {
tagObj.error = errors.MalformedXML.customizeDescription(
'Tag XML should contain same number of Keys and Values');
return tagObj;
}
for (let i = 0; i < tags.Key.length; i++) {
if (tags.Key[i].length < 1 || tags.Key[i].length > 128) {
tagObj.error = errors.InvalidRequest.customizeDescription(
'Tag Key must be a length between 1 and 128 char');
for (let i = 0; i < tags.length; i++) {
if (!tags[i].Key || !tags[i].Value) {
tagObj.error =
errors.MissingRequiredParameter.customizeDescription(
'Tag XML does not contain both Key and Value');
break;
}
if (this._tagKeys.includes(tags.Key[i])) {
if (tags[i].Key[0].length < 1 || tags[i].Key[0].length > 128) {
tagObj.error = errors.InvalidRequest.customizeDescription(
'A Tag\'s Key must be a length between 1 and 128');
break;
}
if (tags[i].Value[0].length < 0 || tags[i].Value[0].length > 256) {
tagObj.error = errors.InvalidRequest.customizeDescription(
'A Tag\'s Value must be a length between 0 and 256');
break;
}
if (this._tagKeys.includes(tags[i].Key[0])) {
tagObj.error = errors.InvalidRequest.customizeDescription(
'Tag Keys must be unique');
break;
}
this._tagKeys.push(tags.Key[i]);
this._tagKeys.push(tags[i].Key[0]);
const tag = {
key: tags.Key[i],
val: tags.Value[i],
key: tags[i].Key[0],
val: tags[i].Value[0],
};
tagObj.tags.push(tag);
}
@ -631,20 +684,24 @@ class LifecycleConfiguration {
const rules = config.rules;
assert.strictEqual(Array.isArray(rules), true);
rules.forEach(rule => {
const { ruleID, ruleStatus, filter, actions } = rule;
const { ruleID, ruleStatus, prefix, filter, actions } = rule;
assert.strictEqual(typeof ruleID, 'string');
assert.strictEqual(typeof ruleStatus, 'string');
assert.strictEqual(typeof filter, 'object');
assert.strictEqual(Array.isArray(actions), true);
if (filter.rulePrefix) {
assert.strictEqual(typeof filter.rulePrefix, 'string');
}
if (filter.tags) {
assert.strictEqual(Array.isArray(filter.tags), true);
filter.tags.forEach(t => {
assert.strictEqual(typeof t.key, 'string');
assert.strictEqual(typeof t.val, 'string');
});
if (prefix !== undefined) {
assert.strictEqual(typeof prefix, 'string');
} else {
assert.strictEqual(typeof filter, 'object');
assert.strictEqual(Array.isArray(actions), true);
if (filter.rulePrefix) {
assert.strictEqual(typeof filter.rulePrefix, 'string');
}
if (filter.tags) {
assert.strictEqual(Array.isArray(filter.tags), true);
filter.tags.forEach(t => {
assert.strictEqual(typeof t.key, 'string');
assert.strictEqual(typeof t.val, 'string');
});
}
}
actions.forEach(a => {
assert.strictEqual(typeof a.actionName, 'string');
@ -669,26 +726,36 @@ class LifecycleConfiguration {
static getConfigXml(config) {
const rules = config.rules;
const rulesXML = rules.map(rule => {
const { ruleID, ruleStatus, filter, actions, prefix } = rule;
const ID = `<ID>${escapeForXml(ruleID)}</ID>`;
const Status = `<Status>${ruleStatus}</Status>`;
let rulePrefix;
if (prefix !== undefined) {
rulePrefix = prefix;
} else {
rulePrefix = filter.rulePrefix;
}
const tags = filter && filter.tags;
const Prefix = rulePrefix !== undefined ?
`<Prefix>${rulePrefix}</Prefix>` : '';
let tagXML = '';
if (tags) {
tagXML = tags.map(t => {
const { key, val } = t;
const Tag = `<Tag><Key>${key}</Key>` +
`<Value>${val}</Value></Tag>`;
return Tag;
}).join('');
}
let Filter;
if (prefix !== undefined) {
// Prefix is in the top-level of the config, so we can skip the
// filter property.
Filter = Prefix;
} else if (filter.rulePrefix !== undefined && !tags) {
Filter = `<Filter>${Prefix}</Filter>`;
} else if (tags &&
(filter.rulePrefix !== undefined || tags.length > 1)) {
Filter = `<Filter><And>${Prefix}${tagXML}</And></Filter>`;
} else {
// remaining condition is if only one or no tag
@@ -723,6 +790,62 @@ class LifecycleConfiguration {
`${rulesXML}` +
'</LifecycleConfiguration>';
}
/**
* Get JSON representation of lifecycle configuration object
* @param {object} config - Lifecycle configuration object
* @return {object} - JSON representation of config
*/
static getConfigJson(config) {
const rules = config.rules;
const rulesJSON = rules.map(rule => {
const { ruleID, ruleStatus, filter, actions, prefix } = rule;
const entry = new LifecycleRule(ruleID, ruleStatus);
if (prefix !== undefined) {
entry.addPrefix(prefix);
} else if (filter && filter.rulePrefix !== undefined) {
entry.addPrefix(filter.rulePrefix);
}
const tags = filter && filter.tags;
if (tags) {
tags.forEach(tag => entry.addTag(tag.key, tag.val));
}
actions.forEach(action => {
const { actionName, days, date, deleteMarker } = action;
if (actionName === 'AbortIncompleteMultipartUpload') {
entry.addAbortMPU(days);
return;
}
if (actionName === 'NoncurrentVersionExpiration') {
entry.addNCVExpiration(days);
return;
}
if (actionName === 'Expiration') {
if (days !== undefined) {
entry.addExpiration('Days', days);
return;
}
if (date !== undefined) {
entry.addExpiration('Date', date);
return;
}
if (deleteMarker !== undefined) {
entry.addExpiration('ExpiredObjectDeleteMarker', deleteMarker);
return;
}
}
});
return entry.build();
});
return { Rules: rulesJSON };
}
}
module.exports = LifecycleConfiguration;
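The Expiration branch of `getConfigJson` above maps each parsed action onto a distinct AWS-style JSON key. A minimal standalone sketch of that mapping (`toExpirationJson` is a hypothetical helper for illustration, not part of Arsenal):

```javascript
// Hypothetical helper mirroring the Expiration branch of
// getConfigJson: exactly one of days, date or deleteMarker is set on
// a parsed action, and each maps to a distinct JSON key.
function toExpirationJson(action) {
    const { days, date, deleteMarker } = action;
    if (days !== undefined) {
        return { Days: days };
    }
    if (date !== undefined) {
        return { Date: date };
    }
    if (deleteMarker !== undefined) {
        // stored as the string 'true'/'false', surfaced as a boolean
        return { ExpiredObjectDeleteMarker: JSON.parse(deleteMarker) };
    }
    return {};
}
```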

lib/models/LifecycleRule.js

@@ -0,0 +1,138 @@
const uuid = require('uuid/v4');
/**
* @class LifecycleRule
*
* @classdesc Simple get/set class to build a single Rule
*/
class LifecycleRule {
constructor(id, status) {
// defaults
this.id = id || uuid();
this.status = status === 'Disabled' ? 'Disabled' : 'Enabled';
this.tags = [];
}
build() {
const rule = {};
rule.ID = this.id;
rule.Status = this.status;
if (this.expiration) {
rule.Expiration = this.expiration;
}
if (this.ncvExpiration) {
rule.NoncurrentVersionExpiration = this.ncvExpiration;
}
if (this.abortMPU) {
rule.AbortIncompleteMultipartUpload = this.abortMPU;
}
if (this.transitions) {
rule.Transitions = this.transitions;
}
const filter = {};
if ((this.prefix && this.tags.length) || (this.tags.length > 1)) {
// And rule
const andRule = {};
if (this.prefix) {
andRule.Prefix = this.prefix;
}
andRule.Tags = this.tags;
filter.And = andRule;
} else {
if (this.prefix) {
filter.Prefix = this.prefix;
}
if (this.tags.length) {
filter.Tag = this.tags[0];
}
}
if (Object.keys(filter).length > 0) {
rule.Filter = filter;
} else {
rule.Prefix = '';
}
return rule;
}
addID(id) {
this.id = id;
return this;
}
disable() {
this.status = 'Disabled';
return this;
}
addPrefix(prefix) {
this.prefix = prefix;
return this;
}
addTag(key, value) {
this.tags.push({
Key: key,
Value: value,
});
return this;
}
/**
* Expiration
* @param {string} prop - Property must be defined in `validProps`
* @param {string|integer|boolean} value - date string for `Date`,
* integer for `Days`, or boolean for `ExpiredObjectDeleteMarker`
* @return {LifecycleRule} - this
*/
addExpiration(prop, value) {
const validProps = ['Date', 'Days', 'ExpiredObjectDeleteMarker'];
if (validProps.indexOf(prop) > -1) {
this.expiration = this.expiration || {};
if (prop === 'ExpiredObjectDeleteMarker') {
this.expiration[prop] = JSON.parse(value);
} else {
this.expiration[prop] = value;
}
}
return this;
}
/**
* NoncurrentVersionExpiration
* @param {integer} days - NoncurrentDays
* @return {LifecycleRule} - this
*/
addNCVExpiration(days) {
this.ncvExpiration = { NoncurrentDays: days };
return this;
}
/**
* AbortIncompleteMultipartUpload
* @param {integer} days - DaysAfterInitiation
* @return {LifecycleRule} - this
*/
addAbortMPU(days) {
this.abortMPU = { DaysAfterInitiation: days };
return this;
}
/**
* Transitions
* @param {array} transitions - transitions
* @return {LifecycleRule} - this
*/
addTransitions(transitions) {
this.transitions = transitions;
return this;
}
}
module.exports = LifecycleRule;

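The `build()` method above only emits an `<And>`-style filter when one is actually required. A standalone sketch of that decision (`buildFilter` is a hypothetical helper extracted for illustration, not part of Arsenal):

```javascript
// Mirrors the filter logic in LifecycleRule.build(): an And wrapper
// is needed only when a prefix is combined with tags, or when there
// are two or more tags; otherwise Prefix/Tag are emitted directly.
function buildFilter(prefix, tags) {
    const filter = {};
    if ((prefix && tags.length) || tags.length > 1) {
        const andRule = {};
        if (prefix) {
            andRule.Prefix = prefix;
        }
        andRule.Tags = tags;
        filter.And = andRule;
    } else {
        if (prefix) {
            filter.Prefix = prefix;
        }
        if (tags.length) {
            filter.Tag = tags[0];
        }
    }
    return filter;
}
```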

@@ -0,0 +1,311 @@
const assert = require('assert');
const UUID = require('uuid');
const {
supportedNotificationEvents,
notificationArnPrefix,
} = require('../constants');
const errors = require('../errors');
/**
* Format of xml request:
*
* <NotificationConfiguration>
* <QueueConfiguration>
* <Event>array</Event>
* <Filter>
* <S3Key>
* <FilterRule>
* <Name>string</Name>
* <Value>string</Value>
* </FilterRule>
* </S3Key>
* </Filter>
* <Id>string</Id>
* <Queue>string</Queue>
* </QueueConfiguration>
* </NotificationConfiguration>
*/
/**
* Format of config:
*
* config = {
* queueConfig: [
* {
* events: array,
* queueArn: string,
* filterRules: [
* {
* name: string,
* value: string
* },
* {
* name: string,
* value:string
* },
* ],
* id: string
* }
* ]
* }
*/
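To connect the two formats above: assuming an xml2js-style parser (where every element value is wrapped in an array), a single QueueConfiguration parses into a nested-array shape that validation then unwraps into the flat config. A hedged sketch (the ARN value, id, and filter values here are made up for illustration):

```javascript
// Hypothetical xml2js-style parse result for one QueueConfiguration;
// note that every element value is wrapped in an array.
const parsedXml = {
    NotificationConfiguration: {
        QueueConfiguration: [{
            Event: ['s3:ObjectCreated:*'],
            Queue: ['arn:scality:bucketnotif:::target1'],
            Id: ['notification-id'],
            Filter: [{
                S3Key: [{
                    FilterRule: [{ Name: ['Prefix'], Value: ['logs/'] }],
                }],
            }],
        }],
    },
};
// Unwrapping mirrors what the _parse* helpers below do, minus the
// error handling.
const queueConfig = parsedXml.NotificationConfiguration
    .QueueConfiguration.map(c => ({
        events: c.Event,
        queueArn: c.Queue[0],
        id: c.Id[0],
        filterRules: c.Filter[0].S3Key[0].FilterRule.map(r => ({
            name: r.Name[0],
            value: r.Value[0],
        })),
    }));
```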
class NotificationConfiguration {
/**
* Create a Notification Configuration instance
* @param {string} xml - parsed configuration xml
* @return {object} - NotificationConfiguration instance
*/
constructor(xml) {
this._parsedXml = xml;
this._config = {};
this._ids = new Set([]);
}
/**
* Get notification configuration
* @return {object} - contains error if parsing failed
*/
getValidatedNotificationConfiguration() {
const validationError = this._parseNotificationConfig();
if (validationError) {
this._config.error = validationError;
}
return this._config;
}
/**
* Check that notification configuration is valid
* @return {error | null} - error if parsing failed, else null
*/
_parseNotificationConfig() {
if (!this._parsedXml || this._parsedXml === '') {
return errors.MalformedXML.customizeDescription(
'request xml is undefined or empty');
}
const notificationConfig = this._parsedXml.NotificationConfiguration;
if (!notificationConfig || notificationConfig === '') {
return errors.MalformedXML.customizeDescription(
'request xml does not include NotificationConfiguration');
}
const queueConfig = notificationConfig.QueueConfiguration;
if (!queueConfig || !queueConfig[0]) {
// if undefined or empty QueueConfiguration, notif configuration is deleted
return null;
}
this._config.queueConfig = [];
let parseError;
for (let i = 0; i < queueConfig.length; i++) {
const eventObj = this._parseEvents(queueConfig[i].Event);
const filterObj = this._parseFilter(queueConfig[i].Filter);
const idObj = this._parseId(queueConfig[i].Id);
const arnObj = this._parseArn(queueConfig[i].Queue);
if (eventObj.error) {
parseError = eventObj.error;
this._config = {};
break;
}
if (filterObj.error) {
parseError = filterObj.error;
this._config = {};
break;
}
if (idObj.error) {
parseError = idObj.error;
this._config = {};
break;
}
if (arnObj.error) {
parseError = arnObj.error;
this._config = {};
break;
}
this._config.queueConfig.push({
events: eventObj.events,
queueArn: arnObj.arn,
id: idObj.id,
filterRules: filterObj.filterRules,
});
}
return parseError;
}
/**
* Check that events array is valid
* @param {array} events - event array
* @return {object} - contains error if parsing failed or events array
*/
_parseEvents(events) {
const eventsObj = {
events: [],
};
if (!events || !events[0]) {
eventsObj.error = errors.MalformedXML.customizeDescription(
'each queue configuration must contain an event');
return eventsObj;
}
events.forEach(e => {
if (!supportedNotificationEvents.has(e)) {
eventsObj.error = errors.MalformedXML.customizeDescription(
'event array contains invalid or unsupported event');
} else {
eventsObj.events.push(e);
}
});
return eventsObj;
}
/**
* Check that filter array is valid
* @param {array} filter - filter array
* @return {object} - contains error if parsing failed or filter array
*/
_parseFilter(filter) {
if (!filter || !filter[0]) {
return {};
}
if (!filter[0].S3Key || !filter[0].S3Key[0]) {
return { error: errors.MalformedXML.customizeDescription(
'if included, queue configuration filter must contain S3Key') };
}
const filterRules = filter[0].S3Key[0];
if (!filterRules.FilterRule || !filterRules.FilterRule[0]) {
return { error: errors.MalformedXML.customizeDescription(
'if included, queue configuration filter must contain a rule') };
}
const filterObj = {
filterRules: [],
};
const ruleArray = filterRules.FilterRule;
for (let i = 0; i < ruleArray.length; i++) {
if (!ruleArray[i].Name
|| !ruleArray[i].Name[0]
|| !ruleArray[i].Value
|| !ruleArray[i].Value[0]) {
return { error: errors.MalformedXML.customizeDescription(
'each included filter must contain a name and value') };
}
if (!['Prefix', 'Suffix'].includes(ruleArray[i].Name[0])) {
return { error: errors.MalformedXML.customizeDescription(
'filter Name must be one of Prefix or Suffix') };
}
filterObj.filterRules.push({
name: ruleArray[i].Name[0],
value: ruleArray[i].Value[0],
});
}
return filterObj;
}
/**
* Check that id string is valid
* @param {string} id - id string (optional)
* @return {object} - contains error if parsing failed or id
*/
_parseId(id) {
if (id && id[0].length > 255) {
return { error: errors.InvalidArgument.customizeDescription(
'queue configuration ID is greater than 255 characters long') };
}
let validId;
if (!id || !id[0]) {
// id is optional property, so create one if not provided or is ''
// We generate 48-character alphanumeric, unique id for rule
validId = Buffer.from(UUID.v4()).toString('base64');
} else {
validId = id[0];
}
// Each ID in a list of rules must be unique.
if (this._ids.has(validId)) {
return { error: errors.InvalidRequest.customizeDescription(
'queue configuration ID must be unique') };
}
this._ids.add(validId);
return { id: validId };
}
/**
* Check that arn string is valid
* @param {string} arn - queue arn
* @return {object} - contains error if parsing failed or queue arn
*/
_parseArn(arn) {
if (!arn || !arn[0]) {
return { error: errors.MalformedXML.customizeDescription(
'each queue configuration must contain a queue arn'),
};
}
const splitArn = arn[0].split(':');
const slicedArn = arn[0].slice(0, 23);
if (splitArn.length !== 6 || slicedArn !== notificationArnPrefix) {
return { error: errors.MalformedXML.customizeDescription(
'queue arn is invalid') };
}
// remaining 3 parts of arn are evaluated in cloudserver
return { arn: arn[0] };
}
/**
* Get XML representation of notification configuration object
* @param {object} config - notification configuration object
* @return {string} - XML representation of config
*/
static getConfigXML(config) {
const xmlArray = [];
if (config && config.queueConfig) {
config.queueConfig.forEach(c => {
xmlArray.push('<QueueConfiguration>');
xmlArray.push(`<Id>${c.id}</Id>`);
xmlArray.push(`<Queue>${c.queueArn}</Queue>`);
c.events.forEach(e => {
xmlArray.push(`<Event>${e}</Event>`);
});
if (c.filterRules) {
xmlArray.push('<Filter><S3Key>');
c.filterRules.forEach(r => {
xmlArray.push(`<FilterRule><Name>${r.name}</Name>` +
`<Value>${r.value}</Value></FilterRule>`);
});
xmlArray.push('</S3Key></Filter>');
}
xmlArray.push('</QueueConfiguration>');
});
}
const queueConfigXML = xmlArray.join('');
return '<?xml version="1.0" encoding="UTF-8"?>' +
'<NotificationConfiguration ' +
'xmlns="http://s3.amazonaws.com/doc/2006-03-01/">' +
`${queueConfigXML}` +
'</NotificationConfiguration>';
}
/**
* Validate the bucket metadata notification configuration structure and
* value types
* @param {object} config - The notificationconfiguration to validate
* @return {undefined}
*/
static validateConfig(config) {
assert.strictEqual(typeof config, 'object');
if (!config.queueConfig) {
return;
}
config.queueConfig.forEach(q => {
const { events, queueArn, filterRules, id } = q;
events.forEach(e => assert.strictEqual(typeof e, 'string'));
assert.strictEqual(typeof queueArn, 'string');
if (filterRules) {
filterRules.forEach(f => {
assert.strictEqual(typeof f.name, 'string');
assert.strictEqual(typeof f.value, 'string');
});
}
assert.strictEqual(typeof id, 'string');
});
return;
}
}
module.exports = NotificationConfiguration;
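The ARN check in `_parseArn` above boils down to two conditions. A standalone sketch (the prefix string is hardcoded here for illustration; in Arsenal it comes from `../constants` as `notificationArnPrefix`):

```javascript
// Mirrors _parseArn: a queue ARN must have six colon-separated parts
// and start with the expected 23-character prefix.
const notificationArnPrefix = 'arn:scality:bucketnotif';

function isValidQueueArn(arn) {
    const splitArn = arn.split(':');
    const slicedArn = arn.slice(0, 23);
    return splitArn.length === 6 && slicedArn === notificationArnPrefix;
}
```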


@@ -0,0 +1,238 @@
const assert = require('assert');
const errors = require('../errors');
/**
* Format of xml request:
*
* <ObjectLockConfiguration>
* <ObjectLockEnabled>Enabled</ObjectLockEnabled>
* <Rule>
* <DefaultRetention>
* <Mode>GOVERNANCE|COMPLIANCE</Mode>
* <Days>1</Days>
* <Years>1</Years>
* </DefaultRetention>
* </Rule>
* </ObjectLockConfiguration>
*/
/**
* Format of config:
*
* config = {
* rule: {
* mode: GOVERNANCE|COMPLIANCE,
* days|years: integer,
* }
* }
*/
class ObjectLockConfiguration {
/**
* Create an Object Lock Configuration instance
* @param {string} xml - the parsed configuration xml
* @return {object} - ObjectLockConfiguration instance
*/
constructor(xml) {
this._parsedXml = xml;
this._config = {};
}
/**
* Get the object lock configuration
* @return {object} - contains error if parsing failed
*/
getValidatedObjectLockConfiguration() {
const validConfig = this._parseObjectLockConfig();
if (validConfig.error) {
this._config.error = validConfig.error;
}
return this._config;
}
/**
* Check that mode is valid
* @param {array} mode - array containing mode value
* @return {object} - contains error if parsing failed
*/
_parseMode(mode) {
const validMode = {};
const expectedModes = ['GOVERNANCE', 'COMPLIANCE'];
if (!mode || !mode[0]) {
validMode.error = errors.MalformedXML.customizeDescription(
'request xml does not contain Mode');
return validMode;
}
if (mode.length > 1) {
validMode.error = errors.MalformedXML.customizeDescription(
'request xml contains more than one Mode');
return validMode;
}
if (!expectedModes.includes(mode[0])) {
validMode.error = errors.MalformedXML.customizeDescription(
'Mode request xml must be one of "GOVERNANCE", "COMPLIANCE"');
return validMode;
}
validMode.mode = mode[0];
return validMode;
}
/**
* Check that time limit is valid
* @param {object} dr - DefaultRetention object containing days or years
* @return {object} - contains error if parsing failed
*/
_parseTime(dr) {
const validTime = {};
if (dr.Days && dr.Years) {
validTime.error = errors.MalformedXML.customizeDescription(
'request xml contains both Days and Years');
return validTime;
}
const timeType = dr.Days ? 'Days' : 'Years';
if (!dr[timeType] || !dr[timeType][0]) {
validTime.error = errors.MalformedXML.customizeDescription(
'request xml does not contain Days or Years');
return validTime;
}
if (dr[timeType].length > 1) {
validTime.error = errors.MalformedXML.customizeDescription(
'request xml contains more than one retention period');
return validTime;
}
const timeValue = Number.parseInt(dr[timeType][0], 10);
if (Number.isNaN(timeValue)) {
validTime.error = errors.MalformedXML.customizeDescription(
'request xml does not contain valid retention period');
return validTime;
}
if (timeValue < 1) {
validTime.error = errors.InvalidArgument.customizeDescription(
'retention period must be a positive integer');
return validTime;
}
if ((timeType === 'Days' && timeValue > 36500) ||
(timeType === 'Years' && timeValue > 100)) {
validTime.error = errors.InvalidArgument.customizeDescription(
'retention period is too large');
return validTime;
}
validTime.timeType = timeType.toLowerCase();
validTime.timeValue = timeValue;
return validTime;
}
/**
* Check that object lock configuration is valid
* @return {object} - contains error if parsing failed
*/
_parseObjectLockConfig() {
const validConfig = {};
if (!this._parsedXml || this._parsedXml === '') {
validConfig.error = errors.MalformedXML.customizeDescription(
'request xml is undefined or empty');
return validConfig;
}
const objectLockConfig = this._parsedXml.ObjectLockConfiguration;
if (!objectLockConfig || objectLockConfig === '') {
validConfig.error = errors.MalformedXML.customizeDescription(
'request xml does not include ObjectLockConfiguration');
return validConfig;
}
const objectLockEnabled = objectLockConfig.ObjectLockEnabled;
if (!objectLockEnabled || objectLockEnabled[0] !== 'Enabled') {
validConfig.error = errors.MalformedXML.customizeDescription(
'request xml does not include valid ObjectLockEnabled');
return validConfig;
}
const ruleArray = objectLockConfig.Rule;
if (ruleArray) {
if (ruleArray.length > 1) {
validConfig.error = errors.MalformedXML.customizeDescription(
'request xml contains more than one rule');
return validConfig;
}
const drArray = ruleArray[0].DefaultRetention;
if (!drArray || !drArray[0] || drArray[0] === '') {
validConfig.error = errors.MalformedXML.customizeDescription(
'Rule request xml does not contain DefaultRetention');
return validConfig;
}
if (!drArray[0].Mode || (!drArray[0].Days && !drArray[0].Years)) {
validConfig.error = errors.MalformedXML.customizeDescription(
'DefaultRetention request xml does not contain Mode or ' +
'retention period (Days or Years)');
return validConfig;
}
const validMode = this._parseMode(drArray[0].Mode);
if (validMode.error) {
validConfig.error = validMode.error;
return validConfig;
}
const validTime = this._parseTime(drArray[0]);
if (validTime.error) {
validConfig.error = validTime.error;
return validConfig;
}
this._config.rule = {};
this._config.rule.mode = validMode.mode;
this._config.rule[validTime.timeType] = validTime.timeValue;
}
return validConfig;
}
/**
* Validate the bucket metadata object lock configuration structure and
* value types
* @param {object} config - The object lock configuration to validate
* @return {undefined}
*/
static validateConfig(config) {
assert.strictEqual(typeof config, 'object');
const rule = config.rule;
if (rule) {
assert.strictEqual(typeof rule, 'object');
assert.strictEqual(typeof rule.mode, 'string');
if (rule.days) {
assert.strictEqual(typeof rule.days, 'number');
} else {
assert.strictEqual(typeof rule.years, 'number');
}
}
}
/**
* Get the XML representation of the configuration object
* @param {object} config - The bucket object lock configuration
* @return {string} - The XML representation of the configuration
*/
static getConfigXML(config) {
// object lock is enabled on the bucket but object lock configuration
// not set
if (config.rule === undefined) {
return '<?xml version="1.0" encoding="UTF-8"?>' +
'<ObjectLockConfiguration ' +
'xmlns="http://s3.amazonaws.com/doc/2006-03-01/">' +
'<ObjectLockEnabled>Enabled</ObjectLockEnabled>' +
'</ObjectLockConfiguration>';
}
const { days, years, mode } = config.rule;
const Mode = `<Mode>${mode}</Mode>`;
const Days = days !== undefined ? `<Days>${days}</Days>` : '';
const Years = years !== undefined ? `<Years>${years}</Years>` : '';
const Time = Days || Years;
return '<?xml version="1.0" encoding="UTF-8"?>' +
'<ObjectLockConfiguration ' +
'xmlns="http://s3.amazonaws.com/doc/2006-03-01/">' +
'<ObjectLockEnabled>Enabled</ObjectLockEnabled>' +
'<Rule>' +
'<DefaultRetention>' +
`${Mode}` +
`${Time}` +
'</DefaultRetention>' +
'</Rule>' +
'</ObjectLockConfiguration>';
}
}
module.exports = ObjectLockConfiguration;
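The bounds that `_parseTime` above enforces can be sketched standalone (`validateRetentionPeriod` is a hypothetical helper returning an error message or null, not part of Arsenal):

```javascript
// Mirrors the numeric checks in _parseTime: the retention period must
// be a positive integer, capped at 36500 days or 100 years.
function validateRetentionPeriod(timeType, rawValue) {
    const timeValue = Number.parseInt(rawValue, 10);
    if (Number.isNaN(timeValue)) {
        return 'request xml does not contain valid retention period';
    }
    if (timeValue < 1) {
        return 'retention period must be a positive integer';
    }
    if ((timeType === 'Days' && timeValue > 36500) ||
        (timeType === 'Years' && timeValue > 100)) {
        return 'retention period is too large';
    }
    return null;
}
```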


@@ -126,6 +126,7 @@ class ObjectMD {
dataStoreVersionId: '',
},
'dataStoreName': '',
'originOp': '',
'isAborted': undefined,
};
}
@@ -927,6 +928,78 @@ class ObjectMD {
return this;
}
/**
* Set object legal hold status
* @param {boolean} legalHold - true if legal hold is 'ON' false if 'OFF'
* @return {ObjectMD} itself
*/
setLegalHold(legalHold) {
this._data.legalHold = legalHold || false;
return this;
}
/**
* Get object legal hold status
* @return {boolean} legal hold status
*/
getLegalHold() {
return this._data.legalHold || false;
}
/**
* Set object retention mode
* @param {string} mode - should be one of 'GOVERNANCE', 'COMPLIANCE'
* @return {ObjectMD} itself
*/
setRetentionMode(mode) {
this._data.retentionMode = mode;
return this;
}
/**
* Set object retention retain until date
* @param {string} date - date in ISO-8601 format
* @return {ObjectMD} itself
*/
setRetentionDate(date) {
this._data.retentionDate = date;
return this;
}
/**
* Returns object retention mode
* @return {string} retention mode string
*/
getRetentionMode() {
return this._data.retentionMode;
}
/**
* Returns object retention retain until date
* @return {string} retention date string
*/
getRetentionDate() {
return this._data.retentionDate;
}
/**
* Set origin operation for object
* @param {string} op - name of origin operation
* @return {ObjectMD} itself
*/
setOriginOp(op) {
this._data.originOp = op;
return this;
}
/**
* Returns origin operation of object
* @return {string} origin operation string
*/
getOriginOp() {
return this._data.originOp;
}
/**
* Returns metadata object
*


@@ -14,6 +14,10 @@ class ObjectMDLocation {
* @param {string} locationObj.dataStoreName - type of data store
* @param {string} locationObj.dataStoreETag - internal ETag of
* data part
* @param {number} [locationObj.cryptoScheme] - if location data is
* encrypted: the encryption scheme version
* @param {string} [locationObj.cipheredDataKey] - if location data
* is encrypted: the base64-encoded ciphered data key
*/
constructor(locationObj) {
this._data = {
@@ -23,6 +27,10 @@
dataStoreName: locationObj.dataStoreName,
dataStoreETag: locationObj.dataStoreETag,
};
if (locationObj.cryptoScheme) {
this._data.cryptoScheme = locationObj.cryptoScheme;
this._data.cipheredDataKey = locationObj.cipheredDataKey;
}
}
getKey() {
@@ -33,9 +41,31 @@
return this._data.dataStoreName;
}
/**
* Update data location with new info
*
* @param {object} location - single data location info
* @param {string} location.key - data backend key
* @param {string} location.dataStoreName - type of data store
* @param {number} [location.cryptoScheme] - if location data is
* encrypted: the encryption scheme version
* @param {string} [location.cipheredDataKey] - if location data
* is encrypted: the base64-encoded ciphered data key
* @return {ObjectMDLocation} return this
*/
setDataLocation(location) {
this._data.key = location.key;
this._data.dataStoreName = location.dataStoreName;
[
'key',
'dataStoreName',
'cryptoScheme',
'cipheredDataKey',
].forEach(attrName => {
if (location[attrName] !== undefined) {
this._data[attrName] = location[attrName];
} else {
delete this._data[attrName];
}
});
return this;
}
@@ -64,6 +94,14 @@
return this;
}
getCryptoScheme() {
return this._data.cryptoScheme;
}
getCipheredDataKey() {
return this._data.cipheredDataKey;
}
getValue() {
return this._data;
}
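The attribute loop in `setDataLocation` above both copies and deletes, so stale encryption info cannot survive a location update. A standalone sketch of those semantics (`syncLocation` is a hypothetical helper, not part of Arsenal):

```javascript
// Mirrors setDataLocation: attributes present on the new location
// are copied onto the stored data, and attributes absent from it are
// deleted, so e.g. crypto fields from a previous location are dropped.
function syncLocation(data, location) {
    ['key', 'dataStoreName', 'cryptoScheme', 'cipheredDataKey']
        .forEach(attrName => {
            if (location[attrName] !== undefined) {
                data[attrName] = location[attrName];
            } else {
                delete data[attrName];
            }
        });
    return data;
}
```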

lib/network/kmip/Client.js

@@ -0,0 +1,603 @@
'use strict'; // eslint-disable-line
/* eslint new-cap: "off" */
const async = require('async');
const errors = require('../../errors');
const TTLVCodec = require('./codec/ttlv.js');
const TlsTransport = require('./transport/tls.js');
const KMIP = require('.');
const CRYPTOGRAPHIC_OBJECT_TYPE = 'Symmetric Key';
const CRYPTOGRAPHIC_ALGORITHM = 'AES';
const CRYPTOGRAPHIC_CIPHER_MODE = 'CBC';
const CRYPTOGRAPHIC_PADDING_METHOD = 'PKCS5';
const CRYPTOGRAPHIC_LENGTH = 256;
const CRYPTOGRAPHIC_USAGE_MASK = ['Encrypt', 'Decrypt'];
const CRYPTOGRAPHIC_DEFAULT_IV = Buffer.alloc(16).fill(0);
const searchFilter = {
protocolVersionMajor:
'Response Message/Batch Item/' +
'Response Payload/Protocol Version/' +
'Protocol Version Major',
protocolVersionMinor:
'Response Message/Batch Item/' +
'Response Payload/Protocol Version/' +
'Protocol Version Minor',
extensionName:
'Response Message/Batch Item/Response Payload' +
'/Extension Information/Extension Name',
extensionTag:
'Response Message/Batch Item/Response Payload' +
'/Extension Information/Extension Tag',
vendorIdentification:
'Response Message/Batch Item/Response Payload/Vendor Identification',
serverInformation:
'Response Message/Batch Item/Response Payload/Server Information',
operation:
'Response Message/Batch Item/Response Payload/Operation',
objectType:
'Response Message/Batch Item/Response Payload/Object Type',
uniqueIdentifier:
'Response Message/Batch Item/Response Payload/Unique Identifier',
data:
'Response Message/Batch Item/Response Payload/Data',
};
/**
* Normalize errors according to arsenal definitions
* @param {string | Error} err - an Error instance or a message string
* @returns {arsenal.errors} - arsenal error
*/
function _arsenalError(err) {
const messagePrefix = 'KMIP:';
if (typeof err === 'string') {
return errors.InternalError
.customizeDescription(`${messagePrefix} ${err}`);
} else if (err instanceof Error) {
return errors.InternalError
.customizeDescription(`${messagePrefix} ${err.message}`);
}
return errors.InternalError
.customizeDescription(`${messagePrefix} Unspecified error`);
}
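`_arsenalError` above collapses three possible input shapes into one prefixed message. A standalone sketch of just the message normalization (returning plain strings instead of arsenal error objects):

```javascript
// Mirrors _arsenalError's dispatch: a string is used as-is, an Error
// contributes its message, and anything else yields a generic message.
function normalizeKmipMessage(err) {
    const messagePrefix = 'KMIP:';
    if (typeof err === 'string') {
        return `${messagePrefix} ${err}`;
    } else if (err instanceof Error) {
        return `${messagePrefix} ${err.message}`;
    }
    return `${messagePrefix} Unspecified error`;
}
```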
/**
* Negotiate with the server the use of a recent version of the protocol and
* update the low level driver with this new knowledge.
* @param {Object} client - The Client instance
* @param {Object} logger - Werelog logger object
* @param {Function} cb - The callback triggered after the negotiation.
* @returns {undefined}
*/
function _negotiateProtocolVersion(client, logger, cb) {
return client.kmip.request(logger, 'Discover Versions', [
KMIP.Structure('Protocol Version', [
KMIP.Integer('Protocol Version Major', 1),
KMIP.Integer('Protocol Version Minor', 4),
]),
KMIP.Structure('Protocol Version', [
KMIP.Integer('Protocol Version Major', 1),
KMIP.Integer('Protocol Version Minor', 3),
]),
KMIP.Structure('Protocol Version', [
KMIP.Integer('Protocol Version Major', 1),
KMIP.Integer('Protocol Version Minor', 2),
]),
], (err, response) => {
if (err) {
const error = _arsenalError(err);
logger.error('KMIP::negotiateProtocolVersion',
{ error,
vendorIdentification: client.vendorIdentification });
return cb(error);
}
const majorVersions =
response.lookup(searchFilter.protocolVersionMajor);
const minorVersions =
response.lookup(searchFilter.protocolVersionMinor);
if (majorVersions.length === 0 ||
majorVersions.length !== minorVersions.length) {
const error = _arsenalError('No suitable protocol version');
logger.error('KMIP::negotiateProtocolVersion',
{ error,
vendorIdentification: client.vendorIdentification });
return cb(error);
}
client.kmip.changeProtocolVersion(majorVersions[0], minorVersions[0]);
return cb();
});
}
/**
* Obtain from the server the various extensions defined by the vendor
* and update the low level driver with this new knowledge.
* @param {Object} client - The Client instance
* @param {Object} logger - Werelog logger object
* @param {Function} cb - The callback triggered after the extension mapping
* @returns {undefined}
*/
function _mapExtensions(client, logger, cb) {
return client.kmip.request(logger, 'Query', [
KMIP.Enumeration('Query Function', 'Query Extension Map'),
], (err, response) => {
if (err) {
const error = _arsenalError(err);
logger.error('KMIP::mapExtensions',
{ error,
vendorIdentification: client.vendorIdentification });
return cb(error);
}
const extensionNames = response.lookup(searchFilter.extensionName);
const extensionTags = response.lookup(searchFilter.extensionTag);
if (extensionNames.length !== extensionTags.length) {
const error = _arsenalError('Inconsistent extension list');
logger.error('KMIP::mapExtensions',
{ error,
vendorIdentification: client.vendorIdentification });
return cb(error);
}
extensionNames.forEach((extensionName, idx) => {
client.kmip.mapExtension(extensionName, extensionTags[idx]);
});
return cb();
});
}
/**
* Query the Server information and identify its vendor
* @param {Object} client - The Client instance
* @param {Object} logger - Werelog logger object
* @param {Function} cb - The callback triggered after the information discovery
* @returns {undefined}
*/
function _queryServerInformation(client, logger, cb) {
client.kmip.request(logger, 'Query', [
KMIP.Enumeration('Query Function', 'Query Server Information'),
], (err, response) => {
if (err) {
const error = _arsenalError(err);
logger.warn('KMIP::queryServerInformation',
{ error });
/* no error returned, caller can keep going */
return cb();
}
client._setVendorIdentification(
response.lookup(searchFilter.vendorIdentification)[0]);
client._setServerInformation(
JSON.stringify(response.lookup(searchFilter.serverInformation)[0]));
logger.info('KMIP Server identified',
{ vendorIdentification: client.vendorIdentification,
serverInformation: client.serverInformation,
negotiatedProtocolVersion: client.kmip.protocolVersion });
return cb();
});
}
/**
* Query the Server for the supported operations and managed object types.
* The fact that a server doesn't announce the support for a required feature
* is not a show stopper because some vendors support more or less what they
* announce. If a subsequent request fails, this information can be used to
* figure out the reason for the failure.
* @param {Object} client - The Client instance
* @param {Object} logger - Werelog logger object
* @param {Function} cb - The callback triggered after the information discovery
* @returns {undefined}
*/
function _queryOperationsAndObjects(client, logger, cb) {
return client.kmip.request(logger, 'Query', [
KMIP.Enumeration('Query Function', 'Query Operations'),
KMIP.Enumeration('Query Function', 'Query Objects'),
], (err, response) => {
if (err) {
const error = _arsenalError(err);
logger.error('KMIP::queryOperationsAndObjects',
{ error,
vendorIdentification: client.vendorIdentification });
return cb(error);
}
const supportedOperations = response.lookup(searchFilter.operation);
const supportedObjectTypes = response.lookup(searchFilter.objectType);
const supportsEncrypt = supportedOperations.includes('Encrypt');
const supportsDecrypt = supportedOperations.includes('Decrypt');
const supportsActivate = supportedOperations.includes('Activate');
const supportsRevoke = supportedOperations.includes('Revoke');
const supportsCreate = supportedOperations.includes('Create');
const supportsDestroy = supportedOperations.includes('Destroy');
const supportsQuery = supportedOperations.includes('Query');
const supportsSymmetricKeys =
supportedObjectTypes.includes('Symmetric Key');
if (!supportsEncrypt || !supportsDecrypt ||
!supportsActivate || !supportsRevoke ||
!supportsCreate || !supportsDestroy ||
!supportsQuery || !supportsSymmetricKeys) {
/* This should not be considered as an error since some vendors
* are not consistent between what they really support and what
* they announce to support.
*/
logger.warn('KMIP::queryOperationsAndObjects: ' +
'The KMIP Server announces that it ' +
'does not support all of the required features',
{ vendorIdentification: client.vendorIdentification,
serverInformation: client.serverInformation,
supportsEncrypt, supportsDecrypt,
supportsActivate, supportsRevoke,
supportsCreate, supportsDestroy,
supportsQuery, supportsSymmetricKeys });
} else {
logger.info('KMIP Server provides the necessary feature set',
{ vendorIdentification: client.vendorIdentification });
}
return cb();
});
}
class Client {
/**
* Construct a high level KMIP driver suitable for cloudserver
* @param {Object} options - Instance options
* @param {Object} options.kmip - Low level driver options
* @param {Object} options.kmip.client - This high level driver options
* @param {Boolean} options.kmip.client.compoundCreateActivate -
Depends on the server's capabilities. False offers the best
compatibility. True does not offer a significant
performance gain, but can be useful in case of unreliable
time synchronization between the client and the server.
* @param {String} options.kmip.client.bucketNameAttributeName -
Depends on the server's capabilities. Not specifying this
offers the best compatibility and disables the attachment
of the bucket name as a key attribute.
* @param {Object} options.kmip.codec - KMIP Codec options
* @param {Object} options.kmip.transport - KMIP Transport options
* @param {Class} CodecClass - diversion for the Codec class,
* defaults to TTLVCodec
* @param {Class} TransportClass - diversion for the Transport class,
* defaults to TlsTransport
*/
constructor(options, CodecClass, TransportClass) {
this.options = options.kmip.client || {};
this.vendorIdentification = '';
this.serverInformation = [];
this.kmip = new KMIP(CodecClass || TTLVCodec,
TransportClass || TlsTransport,
options);
this.kmip.registerHandshakeFunction((logger, cb) => {
this._kmipHandshake(logger, cb);
});
}
/**
* Update this client with the vendor identification of the server
* @param {String} vendorIdentification - Vendor identification string
* @returns {undefined}
*/
_setVendorIdentification(vendorIdentification) {
this.vendorIdentification = vendorIdentification;
}
/**
* Update this client with the information about the server
* @param {Object} serverInformation - Server information object
* @returns {undefined}
*/
_setServerInformation(serverInformation) {
this.serverInformation = serverInformation;
}
/**
* Perform the KMIP level handshake with the server
* @param {Object} logger - Werelogs logger object
* @param {Function} cb - Callback to be triggered at the end of the
* handshake. cb(err: Error)
* @returns {undefined}
*/
_kmipHandshake(logger, cb) {
return async.waterfall([
next => _negotiateProtocolVersion(this, logger, next),
next => _mapExtensions(this, logger, next),
next => _queryServerInformation(this, logger, next),
next => _queryOperationsAndObjects(this, logger, next),
], cb);
}
/**
* Activate a cryptographic key managed by the server,
* for a specific bucket. This is a required action to perform after
* the key creation.
* @param {string} keyIdentifier - The bucket key Id
* @param {object} logger - Werelogs logger object
* @param {function} cb - The callback(err: Error)
* @returns {undefined}
*/
_activateBucketKey(keyIdentifier, logger, cb) {
return this.kmip.request(logger, 'Activate', [
KMIP.TextString('Unique Identifier', keyIdentifier),
], (err, response) => {
if (err) {
const error = _arsenalError(err);
logger.error('KMIP::_activateBucketKey',
{ error,
serverInformation: this.serverInformation });
return cb(error);
}
const uniqueIdentifier =
response.lookup(searchFilter.uniqueIdentifier)[0];
if (uniqueIdentifier !== keyIdentifier) {
const error = _arsenalError(
'Server did not return the expected identifier');
logger.error('KMIP::_activateBucketKey',
{ error, uniqueIdentifier });
return cb(error);
}
return cb(null, keyIdentifier);
});
}
/**
* Create a new cryptographic key managed by the server,
* for a specific bucket
* @param {string} bucketName - The bucket name
* @param {object} logger - Werelogs logger object
* @param {function} cb - The callback(err: Error, bucketKeyId: String)
* @returns {undefined}
*/
createBucketKey(bucketName, logger, cb) {
const attributes = [];
if (this.options.bucketNameAttributeName) {
attributes.push(KMIP.Attribute('TextString',
this.options.bucketNameAttributeName,
bucketName));
}
attributes.push(...[
KMIP.Attribute('Enumeration', 'Cryptographic Algorithm',
CRYPTOGRAPHIC_ALGORITHM),
KMIP.Attribute('Integer', 'Cryptographic Length',
CRYPTOGRAPHIC_LENGTH),
KMIP.Attribute('Integer', 'Cryptographic Usage Mask',
this.kmip.encodeMask('Cryptographic Usage Mask',
CRYPTOGRAPHIC_USAGE_MASK))]);
if (this.options.compoundCreateActivate) {
attributes.push(KMIP.Attribute('Date-Time', 'Activation Date',
new Date()));
}
return this.kmip.request(logger, 'Create', [
KMIP.Enumeration('Object Type', CRYPTOGRAPHIC_OBJECT_TYPE),
KMIP.Structure('Template-Attribute', attributes),
], (err, response) => {
if (err) {
const error = _arsenalError(err);
logger.error('KMIP::createBucketKey',
{ error,
serverInformation: this.serverInformation });
return cb(error);
}
const createdObjectType =
response.lookup(searchFilter.objectType)[0];
const uniqueIdentifier =
response.lookup(searchFilter.uniqueIdentifier)[0];
if (createdObjectType !== CRYPTOGRAPHIC_OBJECT_TYPE) {
const error = _arsenalError(
'Server created an object of wrong type');
logger.error('KMIP::createBucketKey',
{ error, createdObjectType });
return cb(error);
}
if (!this.options.compoundCreateActivate) {
return this._activateBucketKey(uniqueIdentifier, logger, cb);
}
return cb(null, uniqueIdentifier);
});
}
/**
* Revoke a cryptographic key managed by the server, for a specific bucket.
* This is a required action to perform before being able to destroy the
* managed key.
* @param {string} bucketKeyId - The bucket key Id
* @param {object} logger - Werelogs logger object
* @param {function} cb - The callback(err: Error)
* @returns {undefined}
*/
_revokeBucketKey(bucketKeyId, logger, cb) {
return this.kmip.request(logger, 'Revoke', [
KMIP.TextString('Unique Identifier', bucketKeyId),
KMIP.Structure('Revocation Reason', [
KMIP.Enumeration('Revocation Reason Code',
'Cessation of Operation'),
KMIP.TextString('Revocation Message',
'About to be deleted'),
]),
], (err, response) => {
if (err) {
const error = _arsenalError(err);
logger.error('KMIP::_revokeBucketKey',
{ error,
serverInformation: this.serverInformation });
return cb(error);
}
const uniqueIdentifier =
response.lookup(searchFilter.uniqueIdentifier)[0];
if (uniqueIdentifier !== bucketKeyId) {
const error = _arsenalError(
'Server did not return the expected identifier');
logger.error('KMIP::_revokeBucketKey',
{ error, uniqueIdentifier });
return cb(error);
}
return cb();
});
}
/**
* Destroy a cryptographic key managed by the server, for a specific bucket.
* @param {string} bucketKeyId - The bucket key Id
* @param {object} logger - Werelogs logger object
* @param {function} cb - The callback(err: Error)
* @returns {undefined}
*/
destroyBucketKey(bucketKeyId, logger, cb) {
return this._revokeBucketKey(bucketKeyId, logger, err => {
if (err) {
const error = _arsenalError(err);
logger.error('KMIP::destroyBucketKey: revocation failed',
{ error,
serverInformation: this.serverInformation });
return cb(error);
}
return this.kmip.request(logger, 'Destroy', [
KMIP.TextString('Unique Identifier', bucketKeyId),
], (err, response) => {
if (err) {
const error = _arsenalError(err);
logger.error('KMIP::destroyBucketKey',
{ error,
serverInformation: this.serverInformation });
return cb(error);
}
const uniqueIdentifier =
response.lookup(searchFilter.uniqueIdentifier)[0];
if (uniqueIdentifier !== bucketKeyId) {
const error = _arsenalError(
'Server did not return the expected identifier');
logger.error('KMIP::destroyBucketKey',
{ error, uniqueIdentifier });
return cb(error);
}
return cb();
});
});
}
/**
*
* @param {number} cryptoScheme - crypto scheme version number
* @param {string} masterKeyId - key to retrieve master key
* @param {buffer} plainTextDataKey - data key
* @param {object} logger - werelogs logger object
* @param {function} cb - callback
* @returns {undefined}
* @callback called with (err, cipheredDataKey: Buffer)
*/
cipherDataKey(cryptoScheme,
masterKeyId,
plainTextDataKey,
logger,
cb) {
return this.kmip.request(logger, 'Encrypt', [
KMIP.TextString('Unique Identifier', masterKeyId),
KMIP.Structure('Cryptographic Parameters', [
KMIP.Enumeration('Block Cipher Mode',
CRYPTOGRAPHIC_CIPHER_MODE),
KMIP.Enumeration('Padding Method',
CRYPTOGRAPHIC_PADDING_METHOD),
KMIP.Enumeration('Cryptographic Algorithm',
CRYPTOGRAPHIC_ALGORITHM),
]),
KMIP.ByteString('Data', plainTextDataKey),
KMIP.ByteString('IV/Counter/Nonce', CRYPTOGRAPHIC_DEFAULT_IV),
], (err, response) => {
if (err) {
const error = _arsenalError(err);
logger.error('KMIP::cipherDataKey',
{ error,
serverInformation: this.serverInformation });
return cb(error);
}
const uniqueIdentifier =
response.lookup(searchFilter.uniqueIdentifier)[0];
const data = response.lookup(searchFilter.data)[0];
if (uniqueIdentifier !== masterKeyId) {
const error = _arsenalError(
'Server did not return the expected identifier');
logger.error('KMIP::cipherDataKey',
{ error, uniqueIdentifier });
return cb(error);
}
return cb(null, data);
});
}
/**
*
* @param {number} cryptoScheme - crypto scheme version number
* @param {string} masterKeyId - key to retrieve master key
* @param {buffer} cipheredDataKey - data key
* @param {object} logger - werelogs logger object
* @param {function} cb - callback
* @returns {undefined}
* @callback called with (err, plainTextDataKey: Buffer)
*/
decipherDataKey(cryptoScheme,
masterKeyId,
cipheredDataKey,
logger,
cb) {
return this.kmip.request(logger, 'Decrypt', [
KMIP.TextString('Unique Identifier', masterKeyId),
KMIP.Structure('Cryptographic Parameters', [
KMIP.Enumeration('Block Cipher Mode',
CRYPTOGRAPHIC_CIPHER_MODE),
KMIP.Enumeration('Padding Method',
CRYPTOGRAPHIC_PADDING_METHOD),
KMIP.Enumeration('Cryptographic Algorithm',
CRYPTOGRAPHIC_ALGORITHM),
]),
KMIP.ByteString('Data', cipheredDataKey),
KMIP.ByteString('IV/Counter/Nonce', CRYPTOGRAPHIC_DEFAULT_IV),
], (err, response) => {
if (err) {
const error = _arsenalError(err);
logger.error('KMIP::decipherDataKey',
{ error,
serverInformation: this.serverInformation });
return cb(error);
}
const uniqueIdentifier =
response.lookup(searchFilter.uniqueIdentifier)[0];
const data = response.lookup(searchFilter.data)[0];
if (uniqueIdentifier !== masterKeyId) {
const error = _arsenalError(
'Server did not return the expected identifier');
logger.error('KMIP::decipherDataKey',
{ error, uniqueIdentifier });
return cb(error);
}
return cb(null, data);
});
}
healthcheck(logger, cb) {
// the bucket does not have to exist; a fixed test bucket name is passed here
this.createBucketKey('kmip-healthcheck-test-bucket', logger, (err, bucketKeyId) => {
if (err) {
logger.error('KMIP::healthcheck: failure to create a test bucket key', {
error: err,
});
return cb(err);
}
logger.debug('KMIP::healthcheck: success creating a test bucket key');
this.destroyBucketKey(bucketKeyId, logger, err => {
if (err) {
logger.error('KMIP::healthcheck: failure to remove the test bucket key', {
bucketKeyId,
error: err,
});
}
});
// no need to have the healthcheck wait until the
// destroyBucketKey() call finishes, as the healthcheck
// already succeeded by creating the bucket key
return cb();
});
}
}
module.exports = Client;

'use strict'; // eslint-disable-line
const assert = require('assert');
function _lookup(decodedTTLV, path) {
const xpath = path.split('/').filter(word => word.length > 0);
const canonicalPath = xpath.join('/');
const obj = decodedTTLV;
let res = [];
assert(Array.isArray(obj));
for (let current = xpath.shift(); current; current = xpath.shift()) {
for (let i = 0; i < obj.length; ++i) {
const cell = obj[i];
if (cell[current]) {
if (xpath.length === 0) {
/* The search path is fully consumed:
* collect the matching value */
res.push(cell[current].value);
} else {
const subPath = xpath.join('/');
assert(current.length + 1 + subPath.length ===
canonicalPath.length);
const intermediate =
_lookup(cell[current].value, subPath);
res = res.concat(intermediate);
}
}
}
}
return res;
}
class Message {
/**
* Construct a new abstract Message
* @param {Object} content - the content of the message
*/
constructor(content) {
this.content = content;
}
/**
* Lookup the values corresponding to the provided path
* @param {String} path - the path in the hierarchy of the values
* of interest
* @return {Object} - an array of the values matching the provided path
*/
lookup(path) {
return _lookup(this.content, path);
}
}
module.exports = Message;
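The `lookup` traversal above can be illustrated with a self-contained sketch. The `lookup` function below re-implements the same walk inline for demonstration, and the sample `content` structure is hypothetical:

```javascript
// Minimal stand-in for Message.lookup: each path component selects a
// tag name; nested Structures are arrays of single-key objects.
function lookup(fields, path) {
    const [head, ...rest] = path.split('/').filter(s => s.length > 0);
    let res = [];
    for (const cell of fields) {
        if (cell[head]) {
            if (rest.length === 0) {
                // Path fully consumed: collect the value
                res.push(cell[head].value);
            } else {
                res = res.concat(lookup(cell[head].value, rest.join('/')));
            }
        }
    }
    return res;
}

const content = [
    { 'Response Message': { type: 'Structure', value: [
        { 'Response Header': { type: 'Structure', value: [
            { 'Protocol Version': { type: 'Structure', value: [
                { 'Protocol Version Major': { type: 'Integer', value: 1 } },
                { 'Protocol Version Minor': { type: 'Integer', value: 4 } },
            ] } },
        ] } },
    ] } },
];

console.log(lookup(content,
    'Response Message/Response Header/Protocol Version/Protocol Version Major'));
// → [ 1 ]
```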

lib/network/kmip/README.md
# KMIP
Key Management Interoperability Protocol
## Preliminary usage example
```javascript
const {
kmipServerHostName,
clientKey,
clientCert,
serverCert,
rootCa
} = require('./myconfiguration.js');
const assert = require('assert');
const fs = require('fs');
const tls = require('tls');
const werelogs = require('werelogs');
const KMIP = require('arsenal').network.kmip;
const logger = new werelogs.Logger('kmiptest');
const kmip = new KMIP;
const options = {
host: kmipServerHostName,
key: fs.readFileSync(clientKey),
cert: fs.readFileSync(clientCert),
ca: [ fs.readFileSync(serverCert),
fs.readFileSync(rootCa), ],
checkServerIdentity: console.log,
};
const message = KMIP.Message([
KMIP.Structure('Request Message', [
KMIP.Structure('Request Header', [
KMIP.Structure('Protocol Version', [
KMIP.Integer('Protocol Version Major', 1),
KMIP.Integer('Protocol Version Minor', 3),
]),
KMIP.Integer('Maximum Response Size', 3456),
KMIP.Integer('Batch Count', 1),
]),
KMIP.Structure('Batch Item', [
KMIP.Enumeration('Operation', 'Query'),
KMIP.Structure('Request Payload', [
KMIP.Enumeration('Query Function', 'Query Operations'),
KMIP.Enumeration('Query Function', 'Query Objects'),
KMIP.Enumeration('Query Function', 'Query Server Information'),
KMIP.Enumeration('Query Function', 'Query Extension Map'),
]),
]),
])
]);
const encodedMessage = kmip.encodeMessage(logger, message);
const socket = tls.connect(5696, options, () => {
socket.write(encodedMessage);
});
socket.on('data', (data) => {
const decodedMessage = kmip.decodeMessage(logger, data);
const summary = {
major: decodedMessage.lookup(
'Response Message/Response Header/' +
'Protocol Version/Protocol Version Major')[0],
minor: decodedMessage.lookup(
'Response Message/Response Header/' +
'Protocol Version/Protocol Version Minor')[0],
supportedOperations: decodedMessage.lookup(
'Response Message/Batch Item/Response Payload/Operation'),
supportedObjectTypes: decodedMessage.lookup(
'Response Message/Batch Item/Response Payload/Object Type'),
serverInformation: decodedMessage.lookup(
'Response Message/Batch Item/Response Payload/Server Information'),
};
console.log(JSON.stringify(summary));
//console.log(JSON.stringify(decodedMessage.content));
//console.log(data.toString('hex'));
    assert(summary.supportedOperations.includes('Encrypt'));
    assert(summary.supportedOperations.includes('Decrypt'));
    assert(summary.supportedOperations.includes('Create'));
    assert(summary.supportedOperations.includes('Destroy'));
    assert(summary.supportedOperations.includes('Query'));
    assert(summary.supportedObjectTypes.includes('Symmetric Key'));
    assert(summary.major >= 2 ||
           (summary.major === 1 && summary.minor >= 2));
socket.end();
});
socket.on('end', () => {
console.log('server ends connection');
});
```

# KMIP codecs
The KMIP protocol is based on the exchange of structured messages between a
client and a server.
About the structure of the messages:
* A message is composed of fields and nested fields.
* Its structure varies depending on the context of the request and on which
  party emits the message.
* It follows the same encoding rules for both the client and the server.
* The set of primitive types is the cornerstone of the protocol: the structure
  description is carried within the messages along with the actual payload.
* The set of defined tags is the keystone of the protocol: it gives meaning to
  the fields of a structured message.
The role of the codec is twofold:
* decode a message from a particular encoding into an abstract
  representation of the KMIP structured message;
* encode a message from its abstract representation into a particular
  encoding.
The codecs are not responsible for sending the messages on the wire;
that task is delegated to the transport layer.
## Abstract representation
The primitive data types defined by the protocol are represented internally
as data structures following the form
```javascript
const abstractKmipField = {
[tagName]: {
type,
value
}
};
```
The tag name `tagName` is a string. It is decoded from the tag value
using the KMIP nomenclature and identifies the meaning of the field in the
message.
The type name `type` is a string and is one of the primitive types
defined by the KMIP protocol. This element of a field also implicitly carries
the length information for fixed-size data types.
The value `value` is decoded from the payload of the KMIP field. This
element carries the length information for variable-size data types.
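As a concrete instance of this form, a hand-built 'Protocol Version Major' field looks like the following (the field itself is illustrative):

```javascript
// A hand-built abstract field following the form described above.
const abstractField = {
    'Protocol Version Major': {
        type: 'Integer',
        value: 1,
    },
};
// The tag name keys the field; type and value carry the payload.
console.log(Object.keys(abstractField)[0]); // → 'Protocol Version Major'
console.log(abstractField['Protocol Version Major'].type); // → 'Integer'
```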
## Constructing an abstract Message
```javascript
const msg = KMIP.Message(content);
```
The static method `KMIP.Message` instantiates an object of the class
`Message`. Message objects wrap the content of the message without
alteration and offer a `lookup` method to search the message for
named fields.
### Structure
```javascript
const field =
KMIP.Structure('Request Header', [
field_1,
...,
field_n,
]);
console.log(field);
{
'Request Header': {
type: 'Structure',
value: [
field_1,
...,
field_n
]
}
}
```
Fields in the array parameter must be provided in the order defined by the
specification for the considered structure name.
### Integer
```javascript
const field = KMIP.Integer('Protocol Version Minor', 3);
console.log(field);
{
'Protocol Version Minor': {
type: "Integer",
value: 3
}
}
```
Integers are encoded as four-byte long (32 bit) binary signed numbers in 2's
complement notation, transmitted big-endian.
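The value encoding described above can be sketched with a small helper (illustrative only, not part of the codec API):

```javascript
// Encode a KMIP Integer value: 32-bit big-endian two's complement.
function encodeIntegerValue(n) {
    const buf = Buffer.alloc(4);
    buf.writeInt32BE(n);
    return buf;
}
console.log(encodeIntegerValue(3).toString('hex'));  // → '00000003'
console.log(encodeIntegerValue(-1).toString('hex')); // → 'ffffffff'
```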
### LongInteger
```javascript
const field = KMIP.LongInteger('Usage Limits Total', 10 ** 42);
console.log(field);
{
'Usage Limits Total': {
type: 'LongInteger',
value: 1e+42
}
}
```
Long Integers are encoded as eight-byte long (64 bit) binary signed numbers in
2's complement notation, transmitted big-endian.
Due to the accuracy limitation of JavaScript's number representation,
`LongInteger` values cannot exceed 2^53. The codec is expected to throw an
error when attempting to transcode a `LongInteger` greater than this value.
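The 64-bit split and the 2^53 safety check can be sketched as follows (an illustrative helper handling non-negative values only):

```javascript
// Encode a non-negative LongInteger value as two big-endian 32-bit
// halves, rejecting values beyond Number.MAX_SAFE_INTEGER (2^53 - 1).
const UINT32_MAX = 2 ** 32;
function encodeLongIntegerValue(n) {
    if (n > Number.MAX_SAFE_INTEGER) {
        throw Error('LongInteger value exceeds the 53-bit safe range');
    }
    const buf = Buffer.alloc(8);
    buf.writeUInt32BE(Math.floor(n / UINT32_MAX), 0);
    buf.writeUInt32BE(n % UINT32_MAX, 4);
    return buf;
}
console.log(encodeLongIntegerValue(2 ** 40).toString('hex'));
// → '0000010000000000'
```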
### BigInteger
```javascript
const field = KMIP.BigInteger('Usage Limits Total', value);
console.log(field);
{
'Usage Limits Total': {
type: 'BigInteger',
value: <Buffer ab cd ef ...>
}
}
```
Big Integers are encoded as a sequence of eight-bit bytes, in two's complement
notation, transmitted big-endian. If the length of the sequence is not a
multiple of eight bytes, then Big Integers SHALL be padded with the minimal
number of leading sign-extended bytes to make the length a multiple of eight
bytes. These padding bytes are part of the Item Value and SHALL be counted in
the Item Length.
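The sign-extended padding rule quoted above can be sketched like this (illustrative helper; `padBigInteger` is not part of the codec API):

```javascript
// Pad a two's complement byte sequence to a multiple of eight bytes
// with leading sign-extended bytes, as the specification requires.
function padBigInteger(buf) {
    const padLen = (Math.ceil(buf.length / 8) * 8) - buf.length;
    if (padLen === 0) {
        return buf;
    }
    // Negative numbers (high bit set) are padded with 0xff, others with 0x00.
    const signByte = (buf[0] & 0x80) ? 0xff : 0x00;
    return Buffer.concat([Buffer.alloc(padLen, signByte), buf]);
}
console.log(padBigInteger(Buffer.from([0x01, 0x02])).length); // → 8
```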
### Enumeration
```javascript
const field = KMIP.Enumeration('Operation', 'Discover Versions');
console.log(field);
{
'Operation': {
type: 'Enumeration',
value: 'Discover Versions'
}
}
```
### Boolean
```javascript
const field = KMIP.Boolean('Asynchronous Indicator', false);
console.log(field);
{
'Asynchronous Indicator': {
type: 'Boolean',
value: false
}
}
```
### TextString
```javascript
const field = KMIP.TextString('Username', 'alice');
console.log(field);
{
'Username': {
type: 'TextString',
value: 'alice'
}
}
```
Text Strings are sequences of bytes that encode character values according to
the UTF-8 encoding standard. There SHALL NOT be null-termination at the end of
such strings.
### ByteString
```javascript
const field = KMIP.ByteString('Asynchronous Correlation Value', buffer);
console.log(field);
{
'Asynchronous Correlation Value': {
type: 'ByteString',
value: <Buffer ab cd ef ...>
}
}
```
Byte Strings are sequences of bytes containing individual unspecified eight-bit
binary values, and are interpreted in the same sequence order.
### DateTime
```javascript
const field = KMIP.DateTime('Activation Date', new Date);
console.log(field);
{
'Activation Date': {
type: 'Date-Time',
value: <Date 2019-01-10T20:41:36.914Z>
}
}
```
DateTime takes a Date object as its second parameter. The millisecond part of
the date is silently discarded and not sent over the network.
In this particular example, the 'Activation Date' tag is used for illustration
purposes. This is not the appropriate way to instantiate this attribute value:
the special function `KMIP.Attribute` must be used instead of `KMIP.DateTime`.
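The second-resolution behaviour can be demonstrated directly (a minimal sketch; the variable names are illustrative):

```javascript
// The Date-Time encoding keeps whole seconds only, so the millisecond
// part of the Date is dropped before encoding.
const stamp = new Date('2019-01-10T20:41:36.914Z');
const seconds = Math.floor(stamp.getTime() / 1000);
const decoded = new Date(seconds * 1000);
console.log(decoded.toISOString()); // → '2019-01-10T20:41:36.000Z'
```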
### Interval
```javascript
const field = KMIP.Interval('Lease Time', 42);
console.log(field);
{
'Lease Time': {
type: "Interval",
value: 42
}
}
```
Intervals are encoded as four-byte long (32 bit) binary unsigned numbers,
transmitted big-endian. They have a resolution of one second.
### Special types
#### Bit Mask
Bit masks are encoded using the `Integer` primitive type relative to an instance
of the KMIP class (i.e. `encodeMask` and `decodeMask` are not static class
functions but regular methods).
```javascript
const kmip = new KMIP;
const mask = ['Encrypt', 'Decrypt'];
const bitMask = kmip.encodeMask('Cryptographic Usage Mask', mask);
const decodedMask = kmip.decodeMask('Cryptographic Usage Mask', bitMask);
assert.deepStrictEqual(decodedMask, mask);
assert(bitMask === 12);
```
#### Attribute
Attribute names and values are managed in a way that deviates from the general
rule, particularly when it comes to associating the value of an enumeration
with its tag. In the nominal form, the value of an enumeration in a field is
retrieved from the tag of this field. In the case of an Attribute, the tag of
the enumeration is referenced in the `Attribute Name` as a `TextString` and the
encoded enumeration value is stored in the `Attribute Value`, hence
disconnecting the value from its tag.
```javascript
const cryptographicAlgorithm =
KMIP.Attribute('Enumeration', 'Cryptographic Algorithm', 'AES');
const requestPayload =
KMIP.Structure('Request Payload', [
KMIP.Enumeration('Object Type', 'Symmetric Key'),
KMIP.Structure('Template-Attribute', [
KMIP.Attribute('TextString', 'x-Name', 's3-thekey'),
cryptographicAlgorithm,
KMIP.Attribute('Integer', 'Cryptographic Length', 256),
KMIP.Attribute('Integer', 'Cryptographic Usage Mask',
kmip.encodeMask('Cryptographic Usage Mask',
['Encrypt', 'Decrypt'])),
KMIP.Attribute('Date-Time', 'Activation Date', new Date),
]),
]);
console.log(cryptographicAlgorithm);
{
'Attribute': {
type: 'Structure',
value: [
{
'Attribute Name': {
type: 'TextString',
value: 'Cryptographic Algorithm'
}
},
{
'Attribute Value': {
type: 'Enumeration',
value: 'AES',
diversion: 'Cryptographic Algorithm'
}
}
]
}
}
```
The `diversion` attribute in the `Attribute Value` structure is used by the
codec to identify the `Enumeration` the value relates to.
## Codec Interface
```javascript
class MyCodec {
/**
* Construct a new instance of the codec
*/
constructor() {}
/**
* Encode a bitmask
* @param {String} tagName - name of the bit mask defining tag
* @param {Array of Strings} value - array of named bits to set in the mask
* @return {Integer} Integer encoded bitmask
*/
encodeMask(tagName, value) {}
/**
* Decode a bitmask
* @param {string} tagName - name of the bit mask defining tag
* @param {Integer} givenMask - bit mask to decode
* @return {Array of Strings} array of named bits set in the given bit mask
*/
decodeMask(tagName, givenMask) {}
/**
* Encode an abstract message
* @param {Object} message - Instance of a KMIP.Message
* @return {Buffer} the encoded message suitable for the transport layer
*/
encode(message) {}
/**
* Decode a raw message, usually received from the transport layer
* @param {Object} logger - a Logger instance
* @param {Buffer} rawMessage - the message to decode
* @return {Object} the decoded message as an instance of KMIP.Message
*/
decode(logger, rawMessage) {}
/**
* Amend the tag nomenclature with a vendor specific extension
* @param {String} tagName - Name of the tag to record
* @param {Integer} tagValue - Tag value represented as an integer
*/
mapExtension(tagName, tagValue) {}
}
```
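A toy codec honoring this interface might look like the following. It is purely illustrative: it serializes the abstract representation as JSON rather than any real KMIP encoding, `FakeMessage` stands in for `KMIP.Message`, and the mask methods are left unimplemented:

```javascript
// Minimal stand-in for KMIP.Message: a plain content wrapper.
class FakeMessage {
    constructor(content) { this.content = content; }
}
// Toy codec implementing the interface above by (de)serializing the
// abstract representation as JSON. NOT a real KMIP encoding.
class JsonCodec {
    encodeMask(tagName, value) { throw Error('not implemented'); }
    decodeMask(tagName, givenMask) { throw Error('not implemented'); }
    encode(message) {
        return Buffer.from(JSON.stringify(message.content), 'utf8');
    }
    decode(logger, rawMessage) {
        return new FakeMessage(JSON.parse(rawMessage.toString('utf8')));
    }
    mapExtension(tagName, tagValue) {}
}

const codec = new JsonCodec();
const content = [{ 'Batch Count': { type: 'Integer', value: 1 } }];
const decoded = codec.decode(null, codec.encode(new FakeMessage(content)));
console.log(JSON.stringify(decoded.content));
// → '[{"Batch Count":{"type":"Integer","value":1}}]'
```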
## Encoding specification links
### TTLV Encoding Baseline Profile
[TTLV Encoding Specification](http://docs.oasis-open.org/kmip/spec/v1.4/os/kmip-spec-v1.4-os.html#_Toc490660911)
### XML Encoding Profile
[XML Encoding Profile Specification](http://docs.oasis-open.org/kmip/profiles/v1.4/csprd01/kmip-profiles-v1.4-csprd01.html#_Toc479342078)
### JSON Encoding Profile
[JSON Encoding Profile Specification](http://docs.oasis-open.org/kmip/profiles/v1.4/csprd01/kmip-profiles-v1.4-csprd01.html#_Toc479342090)

'use strict'; // eslint-disable-line
/* eslint dot-notation: "off" */
const KMIPTags = require('../tags.json');
const KMIPMessage = require('../Message.js');
const UINT32_MAX = Math.pow(2, 32);
function _ttlvPadVector(vec) {
let length = 0;
vec.forEach(buf => {
if (!(buf instanceof Buffer)) {
throw Error('Not a Buffer');
}
length += buf.length;
});
const paddingLength = (Math.ceil(length / 8) * 8) - length;
if (paddingLength > 0) {
vec.push(Buffer.alloc(paddingLength).fill(0));
}
return vec;
}
function _throwError(logger, msg, data) {
logger.error(msg, data);
throw Error(msg);
}
function TTLVCodec() {
if (!new.target) {
return new TTLVCodec();
}
const TagDecoder = JSON.parse(JSON.stringify(KMIPTags));
const TagEncoder = {};
const TypeDecoder = {};
const TypeEncoder = {};
const PrimitiveTypes = {
'01': {
name: 'Structure',
decode: (logger, unusedTag, value) => {
const funcName = 'Structure::decode';
const length = value.length;
let i = 0;
const result = [];
let diversion = null;
while (i < length) {
const element = {};
const elementTag = value.slice(i, i + 3).toString('hex');
const elementType =
value.slice(i + 3, i + 4).toString('hex');
const elementLength = value.readUInt32BE(i + 4);
const property = {};
if (!TypeDecoder[elementType]) {
_throwError(logger,
'Unknown element type',
{ funcName, elementTag, elementType });
}
const elementValue = value.slice(i + 8,
i + 8 + elementLength);
if (elementValue.length !== elementLength) {
_throwError(logger, 'BUG: Wrong buffer size',
{ funcName, elementLength,
bufferLength: elementValue.length });
}
property.type = TypeDecoder[elementType].name;
property.value = TypeDecoder[elementType]
.decode(logger, elementTag, elementValue, diversion);
if (diversion) {
property.diversion = diversion;
diversion = null;
}
const tagInfo = TagDecoder[elementTag];
if (!tagInfo) {
logger.debug('Unknown element tag',
{ funcName, elementTag });
property.tag = elementTag;
element['Unknown Tag'] = property;
} else {
element[tagInfo.name] = property;
if (tagInfo.name === 'Attribute Name') {
if (property.type !== 'TextString') {
_throwError(logger,
'Invalid type',
{ funcName, type: property.type });
}
diversion = property.value;
}
}
i += Math.ceil((8 + elementLength) / 8.0) * 8;
result.push(element);
}
return result;
},
encode: (tagName, value) => {
const tag = Buffer.from(TagEncoder[tagName].value, 'hex');
const type = Buffer.from(TypeEncoder['Structure'].value, 'hex');
const length = Buffer.alloc(4);
let vectorLength = 0;
let encodedValue = [];
value.forEach(item => {
Object.keys(item).forEach(key => {
const itemTagName = key;
const itemType = item[key].type;
const itemValue = item[key].value;
const itemDiversion = item[key].diversion;
if (!TagEncoder[itemTagName]) {
throw Error(`Unknown Tag '${itemTagName}'`);
}
if (!TypeEncoder[itemType]) {
throw Error(`Unknown Type '${itemType}'`);
}
const itemResult =
TypeEncoder[itemType].encode(itemTagName,
itemValue,
itemDiversion);
encodedValue = encodedValue
.concat(_ttlvPadVector(itemResult));
});
});
encodedValue = _ttlvPadVector(encodedValue);
encodedValue.forEach(buf => { vectorLength += buf.length; });
length.writeUInt32BE(vectorLength);
return _ttlvPadVector([tag, type, length, ...encodedValue]);
},
},
'02': {
name: 'Integer',
decode: (logger, tag, value) => {
const funcName = 'Integer::decode';
const fixedLength = 4;
if (fixedLength !== value.length) {
_throwError(logger,
'Length mismatch',
{ funcName, fixedLength,
bufferLength: value.length });
}
return value.readUInt32BE(0);
},
encode: (tagName, value) => {
const tag = Buffer.from(TagEncoder[tagName].value, 'hex');
const type = Buffer.from(TypeEncoder['Integer'].value, 'hex');
const length = Buffer.alloc(4);
length.writeUInt32BE(4);
const encodedValue = Buffer.alloc(4);
encodedValue.writeUInt32BE(value);
return _ttlvPadVector([tag, type, length, encodedValue]);
},
},
'03': {
name: 'LongInteger',
decode: (logger, tag, value) => {
const funcName = 'LongInteger::decode';
const fixedLength = 8;
if (fixedLength !== value.length) {
_throwError(logger,
'Length mismatch',
{ funcName, fixedLength,
bufferLength: value.length });
}
const longUInt = UINT32_MAX * value.readUInt32BE(0) +
value.readUInt32BE(4);
if (longUInt > Number.MAX_SAFE_INTEGER) {
_throwError(logger,
'53-bit overflow',
{ funcName, longUInt });
}
return longUInt;
},
encode: (tagName, value) => {
const tag = Buffer.from(TagEncoder[tagName].value, 'hex');
const type =
Buffer.from(TypeEncoder['LongInteger'].value, 'hex');
const length = Buffer.alloc(4);
length.writeUInt32BE(8);
const encodedValue = Buffer.alloc(8);
encodedValue.writeUInt32BE(Math.floor(value / UINT32_MAX), 0);
encodedValue.writeUInt32BE(value % UINT32_MAX, 4);
return _ttlvPadVector([tag, type, length, encodedValue]);
},
},
'04': {
name: 'BigInteger',
decode: (logger, tag, value) => value,
encode: (tagName, value) => {
const tag = Buffer.from(TagEncoder[tagName].value, 'hex');
const type =
Buffer.from(TypeEncoder['BigInteger'].value, 'hex');
const length = Buffer.alloc(4);
length.writeUInt32BE(value.length);
return _ttlvPadVector([tag, type, length, value]);
},
},
'05': {
name: 'Enumeration',
decode: (logger, tag, value, diversion) => {
const funcName = 'Enumeration::decode';
const fixedLength = 4;
if (fixedLength !== value.length) {
_throwError(logger,
'Length mismatch',
{ funcName, fixedLength,
bufferLength: value.length });
}
const enumValue = value.toString('hex');
const actualTag = diversion ? TagEncoder[diversion].value : tag;
const enumInfo = TagDecoder[actualTag];
if (!enumInfo ||
!enumInfo.enumeration ||
!enumInfo.enumeration[enumValue]) {
return { tag,
value: enumValue,
message: 'Unknown enumeration value',
diversion,
};
}
return enumInfo.enumeration[enumValue];
},
encode: (tagName, value, diversion) => {
const tag = Buffer.from(TagEncoder[tagName].value, 'hex');
const type =
Buffer.from(TypeEncoder['Enumeration'].value, 'hex');
const length = Buffer.alloc(4);
length.writeUInt32BE(4);
const actualTag = diversion || tagName;
const encodedValue =
Buffer.from(TagEncoder[actualTag].enumeration[value],
'hex');
return _ttlvPadVector([tag, type, length, encodedValue]);
},
},
'06': {
name: 'Boolean',
decode: (logger, tag, value) => {
const funcName = 'Boolean::decode';
const fixedLength = 8;
if (fixedLength !== value.length) {
_throwError(logger,
'Length mismatch',
{ funcName, fixedLength,
bufferLength: value.length });
}
const msUInt = value.readUInt32BE(0);
const lsUInt = value.readUInt32BE(4);
return !!(msUInt | lsUInt);
},
encode: (tagName, value) => {
const tag = Buffer.from(TagEncoder[tagName].value, 'hex');
const type = Buffer.from(TypeEncoder['Boolean'].value, 'hex');
const length = Buffer.alloc(4);
length.writeUInt32BE(8);
const encodedValue = Buffer.alloc(8);
encodedValue.writeUInt32BE(0, 0);
encodedValue.writeUInt32BE(value ? 1 : 0, 4);
return _ttlvPadVector([tag, type, length, encodedValue]);
},
},
'07': {
name: 'TextString',
decode: (logger, tag, value) => value.toString('utf8'),
encode: (tagName, value) => {
const tag = Buffer.from(TagEncoder[tagName].value, 'hex');
const type =
Buffer.from(TypeEncoder['TextString'].value, 'hex');
const length = Buffer.alloc(4);
length.writeUInt32BE(Buffer.byteLength(value, 'utf8'));
return _ttlvPadVector([tag, type, length,
Buffer.from(value, 'utf8')]);
},
},
'08': {
name: 'ByteString',
decode: (logger, tag, value) => value,
encode: (tagName, value) => {
const tag = Buffer.from(TagEncoder[tagName].value, 'hex');
const type =
Buffer.from(TypeEncoder['ByteString'].value, 'hex');
const length = Buffer.alloc(4);
length.writeUInt32BE(value.length);
return _ttlvPadVector([tag, type, length, value]);
},
},
'09': {
name: 'Date-Time',
decode: (logger, tag, value) => {
const funcName = 'Date-Time::decode';
const fixedLength = 8;
if (fixedLength !== value.length) {
_throwError(logger,
'Length mismatch',
{ funcName, fixedLength,
bufferLength: value.length });
}
const d = new Date(0);
const utcSeconds = UINT32_MAX * value.readUInt32BE(0) +
value.readUInt32BE(4);
if (utcSeconds > Number.MAX_SAFE_INTEGER) {
_throwError(logger,
'53-bit overflow',
{ funcName, utcSeconds });
}
d.setUTCSeconds(utcSeconds);
return d;
},
encode: (tagName, value) => {
const tag = Buffer.from(TagEncoder[tagName].value, 'hex');
const type = Buffer.from(TypeEncoder['Date-Time'].value, 'hex');
const length = Buffer.alloc(4);
length.writeUInt32BE(8);
const encodedValue = Buffer.alloc(8);
const ts = value.getTime() / 1000;
encodedValue.writeUInt32BE(Math.floor(ts / UINT32_MAX), 0);
encodedValue.writeUInt32BE(ts % UINT32_MAX, 4);
return _ttlvPadVector([tag, type, length, encodedValue]);
},
},
'0a': {
name: 'Interval',
decode: (logger, tag, value) => {
const funcName = 'Interval::decode';
const fixedLength = 4;
if (fixedLength !== value.length) {
_throwError(logger,
'Length mismatch',
{ funcName, fixedLength,
bufferLength: value.length });
}
return value.readInt32BE(0);
},
encode: (tagName, value) => {
const tag = Buffer.from(TagEncoder[tagName].value, 'hex');
const type = Buffer.from(TypeEncoder['Interval'].value, 'hex');
const length = Buffer.alloc(4);
length.writeUInt32BE(4);
const encodedValue = Buffer.alloc(4);
encodedValue.writeUInt32BE(value);
return _ttlvPadVector([tag, type, length, encodedValue]);
},
},
};
/* Construct TagDecoder */
Object.keys(TagDecoder).forEach(key => {
const element = {};
element.value = key;
if (TagDecoder[key]['enumeration']) {
const enumeration = {};
Object.keys(TagDecoder[key]['enumeration']).forEach(enumValue => {
const enumKey = TagDecoder[key]['enumeration'][enumValue];
enumeration[enumKey] = enumValue;
});
element.enumeration = enumeration;
}
TagEncoder[TagDecoder[key].name] = element;
});
/* Construct TypeDecoder and TypeEncoder */
Object.keys(PrimitiveTypes).forEach(value => {
const name = PrimitiveTypes[value].name;
const encode = PrimitiveTypes[value].encode;
const decode = PrimitiveTypes[value].decode;
TypeDecoder[value] = { name, decode };
TypeEncoder[name] = { value, encode };
});
/* Public Methods Definition */
this.encodeMask = (tagName, value) => {
let mask = 0;
value.forEach(item => {
const enumValue = TagEncoder[tagName].enumeration[item];
if (!enumValue) {
throw Error('Invalid bit name');
}
mask |= parseInt(enumValue, 16);
});
return mask;
};
this.decodeMask = (tagName, givenMask) => {
let mask = givenMask;
const value = [];
const tag = TagEncoder[tagName].value;
Object.keys(TagDecoder[tag].enumeration).forEach(key => {
const bit = Buffer.from(key, 'hex').readUInt32BE(0);
if (bit & mask) {
mask &= ~bit;
value.push(TagDecoder[tag].enumeration[key]);
}
});
return value;
};
this.decode = (logger, rawMessage) => {
const messageContent =
TypeDecoder['01'].decode(logger, null, rawMessage);
return new KMIPMessage(messageContent);
};
this.encode = message => {
const value = message.content;
let result = [];
value.forEach(item => {
Object.keys(item).forEach(key => {
if (!TagEncoder[key]) {
throw Error(`Unknown Tag '${key}'`);
}
const type = item[key].type;
if (!TypeEncoder[type]) {
throw Error(`Unknown Type '${type}'`);
}
const itemValue = TypeEncoder[type].encode(key,
item[key].value,
item[key].diversion);
result = result.concat(_ttlvPadVector(itemValue));
});
});
return Buffer.concat(_ttlvPadVector(result));
};
this.mapExtension = (tagName, tagValue) => {
const tagValueStr = tagValue.toString(16);
TagDecoder[tagValueStr] = { name: tagName };
TagEncoder[tagName] = { value: tagValueStr };
};
return this;
}
module.exports = TTLVCodec;

lib/network/kmip/index.js (new file, 350 lines)
'use strict'; // eslint-disable-line
/* eslint new-cap: "off" */
const uuidv4 = require('uuid/v4');
const Message = require('./Message.js');
/* This client requires at least a KMIP 1.2 compatible server */
const DEFAULT_PROTOCOL_VERSION_MAJOR = 1;
const DEFAULT_PROTOCOL_VERSION_MINOR = 2;
/* A response holds the result of a single operation; consider raising
 * this value when batching multiple operations per request */
const DEFAULT_MAXIMUM_RESPONSE_SIZE = 8000;
function _uniqueBatchItemID() {
const theUUID = Buffer.alloc(16);
return uuidv4(null, theUUID);
}
function _PrimitiveType(tagName, type, value) {
return { [tagName]: { type, value } };
}
class KMIP {
/**
 * Construct a new KMIP object
 * @param {Class} Codec - the codec class (e.g. TTLVCodec)
 * @param {Class} Transport - the transport class (e.g. TransportTemplate)
 * @param {Object} options - instance options; must provide a `kmip`
 *                           section with `codec` and `transport` options
 */
constructor(Codec, Transport, options) {
this.protocolVersion = {
major: DEFAULT_PROTOCOL_VERSION_MAJOR,
minor: DEFAULT_PROTOCOL_VERSION_MINOR,
};
this.maximumResponseSize = DEFAULT_MAXIMUM_RESPONSE_SIZE;
this.options = options.kmip;
this.codec = new Codec(options.kmip.codec);
this.transport = new Transport(options.kmip.transport);
}
/* Static class methods */
/**
* Create a new abstract message instance
* @param {Object} content - Most likely a call to KMIP.Structure
* with 'Request Message' as tagName
* @returns {Object} an instance of Message
*/
static Message(content) {
return new Message(content);
}
/**
* Create a KMIP Structure field instance
* @param {String} tagName - Name of the KMIP field
* @param {Array} value - array of KMIP fields
* @returns {Object} an abstract KMIP field
*/
static Structure(tagName, value) {
return _PrimitiveType(tagName, 'Structure', value);
}
/**
* Create a KMIP Integer field instance
* @param {String} tagName - Name of the KMIP field
* @param {Number} value - a number
* @returns {Object} an abstract KMIP field
*/
static Integer(tagName, value) {
return _PrimitiveType(tagName, 'Integer', value);
}
/**
* Create a KMIP Long Integer field instance
* @param {String} tagName - Name of the KMIP field
* @param {Number} value - a number (beware of the 53-bit limitation)
* @returns {Object} an abstract KMIP field
*/
static LongInteger(tagName, value) {
return _PrimitiveType(tagName, 'LongInteger', value);
}
/**
* Create a KMIP Big Integer field instance
* @param {String} tagName - Name of the KMIP field
* @param {Buffer} value - buffer containing the big integer
* @returns {Object} an abstract KMIP field
*/
static BigInteger(tagName, value) {
if (value.length % 8 !== 0) {
throw Error('Big Integer value length must be a multiple of 8');
}
return _PrimitiveType(tagName, 'BigInteger', value);
}
/**
* Create a KMIP Enumeration field instance
* @param {String} tagName - Name of the KMIP Enumeration
* @param {String} value - Name of the KMIP Enumeration value
* @returns {Object} an abstract KMIP field
*/
static Enumeration(tagName, value) {
return _PrimitiveType(tagName, 'Enumeration', value);
}
/**
* Create a KMIP Boolean field instance
* @param {String} tagName - Name of the KMIP field
* @param {Boolean} value - truthy or falsy value (coerced to a Boolean)
* @returns {Object} an abstract KMIP field
*/
static Boolean(tagName, value) {
return _PrimitiveType(tagName, 'Boolean', !!value);
}
/**
* Create a KMIP Text String field instance
* @param {String} tagName - Name of the KMIP field
* @param {String} value - the text string
* @returns {Object} an abstract KMIP field
*/
static TextString(tagName, value) {
return _PrimitiveType(tagName, 'TextString', value);
}
/**
* Create a KMIP Byte String field instance
* @param {String} tagName - Name of the KMIP field
* @param {Buffer} value - buffer containing the byte string
* @returns {Object} an abstract KMIP field
*/
static ByteString(tagName, value) {
return _PrimitiveType(tagName, 'ByteString', value);
}
/**
* Create a KMIP Date-Time field instance
* @param {String} tagName - Name of the KMIP field
* @param {Date} value - instance of a Date (ms are discarded)
* @returns {Object} an abstract KMIP field
*/
static DateTime(tagName, value) {
value.setMilliseconds(0);
return _PrimitiveType(tagName, 'Date-Time', value);
}
/**
* Create a KMIP Interval field instance
* @param {String} tagName - Name of the KMIP field
* @param {Integer} value - number of seconds of the interval
* @returns {Object} an abstract KMIP field
*/
static Interval(tagName, value) {
return _PrimitiveType(tagName, 'Interval', value);
}
/**
* Create a KMIP Attribute field instance
* @param {String} type - type of the attribute value
* @param {String} name - Name of the attribute or KMIP field
* @param {Object} value - value of the field suitable for the
* specified type
* @returns {Object} an abstract KMIP field
*/
static Attribute(type, name, value) {
if (type === 'Date-Time') {
value.setMilliseconds(0);
}
return {
Attribute: {
type: 'Structure',
value: [
{
'Attribute Name': {
type: 'TextString',
value: name,
},
},
{
'Attribute Value': {
type,
value,
diversion: name,
},
},
],
},
};
}
/* Object methods */
/**
* Register a higher level handshake function to be called
* after the connection is initialized and before the first
* message is sent.
* @param {Function} handshakeFunction - (logger: Object, cb: Function(err))
* @returns {undefined}
*/
registerHandshakeFunction(handshakeFunction) {
this.transport.registerHandshakeFunction(handshakeFunction);
}
/**
* Decode a raw message, usually received from the transport layer
* @param {Object} logger - a Logger instance
* @param {Buffer} rawMessage - the message to decode
* @returns {Object} the decoded message as an instance of KMIP.Message
*/
_decodeMessage(logger, rawMessage) {
return this.codec.decode(logger, rawMessage);
}
/**
* Encode a message
* @param {Object} message - Instance of a KMIP.Message
* @returns {Buffer} the encoded message suitable for the transport layer
*/
_encodeMessage(message) {
return this.codec.encode(message);
}
/**
* Decode a bitmask
* @param {string} tagName - name of the bit mask defining tag
* @param {Integer} givenMask - bit mask to decode
* @returns {Array} array of named bits set in the given bit mask
*/
decodeMask(tagName, givenMask) {
return this.codec.decodeMask(tagName, givenMask);
}
/**
* Encode a bitmask
* @param {String} tagName - name of the bit mask defining tag
* @param {Array} value - array of named bits to set in the mask
* @returns {Integer} Integer encoded bitmask
*/
encodeMask(tagName, value) {
return this.codec.encodeMask(tagName, value);
}
/**
* Amend the tag nomenclature with a vendor specific extension
* @param {String} extensionName - Name of the tag to record
* @param {Integer} extensionTag - Tag value represented as an integer
* @returns {undefined}
*/
mapExtension(extensionName, extensionTag) {
return this.codec.mapExtension(extensionName, extensionTag);
}
changeProtocolVersion(major, minor) {
this.protocolVersion = { major, minor };
}
/**
* Send an operation request message to the KMIP Server
* @param {Object} logger - Werelogs logger object
* @param {String} operation - The name of the operation as defined in
* the KMIP protocol specification.
* @param {Object} payload - payload of the operation request. Specifically
* the content of the Request Payload as defined
* by the KMIP protocol specification.
* @param {Function} cb - The callback(error: Object, response: Object)
* @returns {undefined}
*/
request(logger, operation, payload, cb) {
const uuid = _uniqueBatchItemID();
const message = KMIP.Message([
KMIP.Structure('Request Message', [
KMIP.Structure('Request Header', [
KMIP.Structure('Protocol Version', [
KMIP.Integer('Protocol Version Major',
this.protocolVersion.major),
KMIP.Integer('Protocol Version Minor',
this.protocolVersion.minor)]),
KMIP.Integer('Maximum Response Size',
this.maximumResponseSize),
KMIP.Integer('Batch Count', 1)]),
KMIP.Structure('Batch Item', [
KMIP.Enumeration('Operation', operation),
KMIP.ByteString('Unique Batch Item ID', uuid),
KMIP.Structure('Request Payload', payload),
])])]);
const encodedMessage = this._encodeMessage(message);
this.transport.send(
logger, encodedMessage,
(err, conversation, rawResponse) => {
if (err) {
logger.error('KMIP::request: Failed to send message',
{ error: err });
return cb(err);
}
const response = this._decodeMessage(logger, rawResponse);
const performedOperation =
response.lookup('Response Message/' +
'Batch Item/Operation')[0];
const resultStatus =
response.lookup('Response Message/' +
'Batch Item/Result Status')[0];
const resultUniqueBatchItemID =
response.lookup('Response Message/' +
'Batch Item/Unique Batch Item ID')[0];
if (!resultUniqueBatchItemID ||
resultUniqueBatchItemID.compare(uuid) !== 0) {
this.transport.abortPipeline(conversation);
const error = Error('Invalid batch item ID returned');
logger.error('KMIP::request: failed',
{ resultUniqueBatchItemID, uuid, error });
return cb(error);
}
if (performedOperation !== operation) {
this.transport.abortPipeline(conversation);
const error = Error('Operation mismatch');
logger.error('KMIP::request: Operation mismatch',
{ error,
got: performedOperation,
expected: operation });
return cb(error);
}
if (resultStatus !== 'Success') {
const resultReason =
response.lookup(
'Response Message/Batch Item/Result Reason')[0];
const resultMessage =
response.lookup(
'Response Message/Batch Item/Result Message')[0];
const error = Error('KMIP request failure');
logger.error('KMIP::request: request failed',
{ error, resultStatus,
resultReason, resultMessage });
return cb(error);
}
return cb(null, response);
});
}
}
module.exports = KMIP;

lib/network/kmip/tags.json (new file, 579 lines)
{
"420006": {
"name": "Asynchronous Correlation Value"
},
"420007": {
"name": "Asynchronous Indicator"
},
"420008": {
"name": "Attribute"
},
"420009": {
"name": "Attribute Index"
},
"42000a": {
"name": "Attribute Name"
},
"42000b": {
"name": "Attribute Value"
},
"42000c": {
"name": "Authentication"
},
"42000d": {
"name": "Batch Count"
},
"42000f": {
"name": "Batch Item"
},
"420011": {
"name": "Block Cipher Mode",
"enumeration": {
"00000001": "CBC",
"00000002": "ECB",
"00000003": "PCBC",
"00000004": "CFB",
"00000005": "OFB",
"00000006": "CTR",
"00000007": "CMAC",
"00000008": "CCM",
"00000009": "GCM",
"0000000a": "CBC-MAC",
"0000000b": "XTS",
"0000000c": "AESKeyWrapPadding",
"0000000d": "NISTKeyWrap",
"0000000e": "X9.102 AESKW",
"0000000f": "X9.102 TDKW",
"00000010": "X9.102 AKW1",
"00000011": "X9.102 AKW2",
"00000012": "AEAD"
}
},
"420028": {
"name": "Cryptographic Algorithm",
"enumeration": {
"00000001": "DES",
"00000002": "3DES",
"00000003": "AES",
"00000004": "RSA",
"00000005": "DSA",
"00000006": "ECDSA",
"00000007": "HMAC-SHA1",
"00000008": "HMAC-SHA224",
"00000009": "HMAC-SHA256",
"0000000a": "HMAC-SHA384",
"0000000b": "HMAC-SHA512",
"0000000c": "HMAC-MD5",
"0000000d": "DH",
"0000000e": "ECDH",
"0000000f": "ECMQV",
"00000010": "Blowfish",
"00000011": "Camellia",
"00000012": "CAST5",
"00000013": "IDEA",
"00000014": "MARS",
"00000015": "RC2",
"00000016": "RC4",
"00000017": "RC5",
"00000018": "SKIPJACK",
"00000019": "Twofish",
"0000001a": "EC",
"0000001b": "One Time Pad",
"0000001c": "ChaCha20",
"0000001d": "Poly1305",
"0000001e": "ChaCha20Poly1305",
"0000001f": "SHA3-224",
"00000020": "SHA3-256",
"00000021": "SHA3-384",
"00000022": "SHA3-512",
"00000023": "HMAC-SHA3-224",
"00000024": "HMAC-SHA3-256",
"00000025": "HMAC-SHA3-384",
"00000026": "HMAC-SHA3-512",
"00000027": "SHAKE-128",
"00000028": "SHAKE-256"
}
},
"42002b": {
"name": "Cryptographic Parameters"
},
"42002c": {
"name": "Cryptographic Usage Mask",
"enumeration": {
"00000001": "Sign",
"00000002": "Verify",
"00000004": "Encrypt",
"00000008": "Decrypt",
"00000010": "Wrap Key",
"00000020": "Unwrap Key",
"00000040": "Export",
"00000080": "MAC Generate",
"00000100": "MAC Verify",
"00000200": "Derive Key",
"00000400": "Content Commitment",
"00000800": "Key Agreement",
"00001000": "Certificate Sign",
"00002000": "CRL Sign",
"00004000": "Generate Cryptogram",
"00008000": "Validate Cryptogram",
"00010000": "Translate Encrypt",
"00020000": "Translate Decrypt",
"00040000": "Translate Wrap",
"00080000": "Translate Unwrap"
}
},
"42003d": {
"name": "IV/Counter/Nonce"
},
"420050": {
"name": "Maximum Response Size"
},
"420054": {
"name": "Name Type",
"enumeration": {
"00000001": "Uninterpreted Text String",
"00000002": "URI"
}
},
"420057": {
"name": "Object Type",
"enumeration": {
"00000001": "Certificate",
"00000002": "Symmetric Key",
"00000003": "Public Key",
"00000004": "Private Key",
"00000005": "Split Key",
"00000006": "Template",
"00000007": "Secret Data",
"00000008": "Opaque Object",
"00000009": "PGP Key"
}
},
"42005c": {
"name": "Operation",
"enumeration": {
"00000001": "Create",
"00000002": "Create Key Pair",
"00000003": "Register",
"00000004": "Re-key",
"00000005": "Derive Key",
"00000006": "Certify",
"00000007": "Re-certify",
"00000008": "Locate",
"00000009": "Check",
"0000000a": "Get",
"0000000b": "Get Attributes",
"0000000c": "Get Attribute List",
"0000000d": "Add Attribute",
"0000000e": "Modify Attribute",
"0000000f": "Delete Attribute",
"00000010": "Obtain Lease",
"00000011": "Get Usage Allocation",
"00000012": "Activate",
"00000013": "Revoke",
"00000014": "Destroy",
"00000015": "Archive",
"00000016": "Recover",
"00000017": "Validate",
"00000018": "Query",
"00000019": "Cancel",
"0000001a": "Poll",
"0000001b": "Notify",
"0000001c": "Put",
"0000001d": "Re-key Key Pair",
"0000001e": "Discover Versions",
"0000001f": "Encrypt",
"00000020": "Decrypt",
"00000021": "Sign",
"00000022": "Signature Verify",
"00000023": "MAC",
"00000024": "MAC Verify",
"00000025": "RNG Retrieve",
"00000026": "RNG Seed",
"00000027": "Hash",
"00000028": "Create Split Key",
"00000029": "Join Split Key",
"0000002a": "Import",
"0000002b": "Export"
}
},
"42005f": {
"name": "Padding Method",
"enumeration": {
"00000001": "None",
"00000002": "OAEP",
"00000003": "PKCS5",
"00000004": "SSL3",
"00000005": "Zeros",
"00000006": "ANSI X9.23",
"00000007": "ISO 10126",
"00000008": "PKCS1 v1.5",
"00000009": "X9.31",
"0000000a": "PSS"
}
},
"420069": {
"name": "Protocol Version"
},
"42006a": {
"name": "Protocol Version Major"
},
"42006b": {
"name": "Protocol Version Minor"
},
"420074": {
"name": "Query Function",
"enumeration": {
"00000001": "Query Operations",
"00000002": "Query Objects",
"00000003": "Query Server Information",
"00000004": "Query Application Namespaces",
"00000005": "Query Extension List",
"00000006": "Query Extension Map",
"00000007": "Query Attestation Types",
"00000008": "Query RNGs",
"00000009": "Query Validations",
"0000000a": "Query Profiles",
"0000000b": "Query Capabilities",
"0000000c": "Query Client Registration Methods"
}
},
"420077": {
"name": "Request Header"
},
"420078": {
"name": "Request Message"
},
"420079": {
"name": "Request Payload"
},
"42007a": {
"name": "Response Header"
},
"42007b": {
"name": "Response Message"
},
"42007c": {
"name": "Response Payload"
},
"42007d": {
"name": "Result Message"
},
"42007e": {
"name": "Result Reason",
"enumeration": {
"00000001": "Item Not Found",
"00000002": "Response Too Large",
"00000003": "Authentication Not Successful",
"00000004": "Invalid Message",
"00000005": "Operation Not Supported",
"00000006": "Missing Data",
"00000007": "Invalid Field",
"00000008": "Feature Not Supported",
"00000009": "Operation Canceled By Requester",
"0000000a": "Cryptographic Failure",
"0000000b": "Illegal Operation",
"0000000c": "Permission Denied",
"0000000d": "Object archived",
"0000000e": "Index Out of Bounds",
"0000000f": "Application Namespace Not Supported",
"00000010": "Key Format Type Not Supported",
"00000011": "Key Compression Type Not Supported",
"00000012": "Encoding Option Error",
"00000013": "Key Value Not Present",
"00000014": "Attestation Required",
"00000015": "Attestation Failed",
"00000016": "Sensitive",
"00000017": "Not Extractable",
"00000018": "Object Already Exists",
"00000100": "General Failure"
}
},
"42007f": {
"name": "Result Status",
"enumeration": {
"00000000": "Success",
"00000001": "Operation Failed",
"00000002": "Operation Pending",
"00000003": "Operation Undone"
}
},
"420080": {
"name": "Revocation Message"
},
"420081": {
"name": "Revocation Reason"
},
"420082": {
"name": "Revocation Reason Code",
"enumeration": {
"00000001": "Unspecified",
"00000002": "Key Compromise",
"00000003": "CA Compromise",
"00000004": "Affiliation Changed",
"00000005": "Superseded",
"00000006": "Cessation of Operation",
"00000007": "Privilege Withdrawn"
}
},
"420088": {
"name": "Server Information"
},
"420091": {
"name": "Template-Attribute"
},
"420092": {
"name": "Time Stamp"
},
"420093": {
"name": "Unique Batch Item ID"
},
"420094": {
"name": "Unique Identifier"
},
"42009d": {
"name": "Vendor Identification"
},
"4200a4": {
"name": "Extension Information"
},
"4200a5": {
"name": "Extension Name"
},
"4200a6": {
"name": "Extension Tag"
},
"4200a7": {
"name": "Extension Type"
},
"4200c2": {
"name": "Data"
},
"4200eb": {
"name": "Profile Information"
},
"4200ec": {
"name": "Profile Name",
"enumeration": {
"00000001": "Baseline Server Basic KMIP v1.2",
"00000002": "Baseline Server TLS v1.2 KMIP v1.2",
"00000003": "Baseline Client Basic KMIP v1.2",
"00000004": "Baseline Client TLS v1.2 KMIP v1.2",
"00000005": "Complete Server Basic KMIP v1.2",
"00000006": "Complete Server TLS v1.2 KMIP v1.2",
"00000007": "Tape Library Client KMIP v1.0",
"00000008": "Tape Library Client KMIP v1.1",
"00000009": "Tape Library Client KMIP v1.2",
"0000000a": "Tape Library Server KMIP v1.0",
"0000000b": "Tape Library Server KMIP v1.1",
"0000000c": "Tape Library Server KMIP v1.2",
"0000000d": "Symmetric Key Lifecycle Client KMIP v1.0",
"0000000e": "Symmetric Key Lifecycle Client KMIP v1.1",
"0000000f": "Symmetric Key Lifecycle Client KMIP v1.2",
"00000010": "Symmetric Key Lifecycle Server KMIP v1.0",
"00000011": "Symmetric Key Lifecycle Server KMIP v1.1",
"00000012": "Symmetric Key Lifecycle Server KMIP v1.2",
"00000013": "Asymmetric Key Lifecycle Client KMIP v1.0",
"00000014": "Asymmetric Key Lifecycle Client KMIP v1.1",
"00000015": "Asymmetric Key Lifecycle Client KMIP v1.2",
"00000016": "Asymmetric Key Lifecycle Server KMIP v1.0",
"00000017": "Asymmetric Key Lifecycle Server KMIP v1.1",
"00000018": "Asymmetric Key Lifecycle Server KMIP v1.2",
"00000019": "Basic Cryptographic Client KMIP v1.2",
"0000001a": "Basic Cryptographic Server KMIP v1.2",
"0000001b": "Advanced Cryptographic Client KMIP v1.2",
"0000001c": "Advanced Cryptographic Server KMIP v1.2",
"0000001d": "RNG Cryptographic Client KMIP v1.2",
"0000001e": "RNG Cryptographic Server KMIP v1.2",
"0000001f": "Basic Symmetric Key Foundry Client KMIP v1.0",
"00000020": "Intermediate Symmetric Key Foundry Client KMIP v1.0",
"00000021": "Advanced Symmetric Key Foundry Client KMIP v1.0",
"00000022": "Basic Symmetric Key Foundry Client KMIP v1.1",
"00000023": "Intermediate Symmetric Key Foundry Client KMIP v1.1",
"00000024": "Advanced Symmetric Key Foundry Client KMIP v1.1",
"00000025": "Basic Symmetric Key Foundry Client KMIP v1.2",
"00000026": "Intermediate Symmetric Key Foundry Client KMIP v1.2",
"00000027": "Advanced Symmetric Key Foundry Client KMIP v1.2",
"00000028": "Symmetric Key Foundry Server KMIP v1.0",
"00000029": "Symmetric Key Foundry Server KMIP v1.1",
"0000002a": "Symmetric Key Foundry Server KMIP v1.2",
"0000002b": "Opaque Managed Object Store Client KMIP v1.0",
"0000002c": "Opaque Managed Object Store Client KMIP v1.1",
"0000002d": "Opaque Managed Object Store Client KMIP v1.2",
"0000002e": "Opaque Managed Object Store Server KMIP v1.0",
"0000002f": "Opaque Managed Object Store Server KMIP v1.1",
"00000030": "Opaque Managed Object Store Server KMIP v1.2",
"00000031": "Suite B minLOS_128 Client KMIP v1.0",
"00000032": "Suite B minLOS_128 Client KMIP v1.1",
"00000033": "Suite B minLOS_128 Client KMIP v1.2",
"00000034": "Suite B minLOS_128 Server KMIP v1.0",
"00000035": "Suite B minLOS_128 Server KMIP v1.1",
"00000036": "Suite B minLOS_128 Server KMIP v1.2",
"00000037": "Suite B minLOS_192 Client KMIP v1.0",
"00000038": "Suite B minLOS_192 Client KMIP v1.1",
"00000039": "Suite B minLOS_192 Client KMIP v1.2",
"0000003a": "Suite B minLOS_192 Server KMIP v1.0",
"0000003b": "Suite B minLOS_192 Server KMIP v1.1",
"0000003c": "Suite B minLOS_192 Server KMIP v1.2",
"0000003d": "Storage Array with Self Encrypting Drive Client KMIP v1.0",
"0000003e": "Storage Array with Self Encrypting Drive Client KMIP v1.1",
"0000003f": "Storage Array with Self Encrypting Drive Client KMIP v1.2",
"00000040": "Storage Array with Self Encrypting Drive Server KMIP v1.0",
"00000041": "Storage Array with Self Encrypting Drive Server KMIP v1.1",
"00000042": "Storage Array with Self Encrypting Drive Server KMIP v1.2",
"00000043": "HTTPS Client KMIP v1.0",
"00000044": "HTTPS Client KMIP v1.1",
"00000045": "HTTPS Client KMIP v1.2",
"00000046": "HTTPS Server KMIP v1.0",
"00000047": "HTTPS Server KMIP v1.1",
"00000048": "HTTPS Server KMIP v1.2",
"00000049": "JSON Client KMIP v1.0",
"0000004a": "JSON Client KMIP v1.1",
"0000004b": "JSON Client KMIP v1.2",
"0000004c": "JSON Server KMIP v1.0",
"0000004d": "JSON Server KMIP v1.1",
"0000004e": "JSON Server KMIP v1.2",
"0000004f": "XML Client KMIP v1.0",
"00000050": "XML Client KMIP v1.1",
"00000051": "XML Client KMIP v1.2",
"00000052": "XML Server KMIP v1.0",
"00000053": "XML Server KMIP v1.1",
"00000054": "XML Server KMIP v1.2",
"00000055": "Baseline Server Basic KMIP v1.3",
"00000056": "Baseline Server TLS v1.2 KMIP v1.3",
"00000057": "Baseline Client Basic KMIP v1.3",
"00000058": "Baseline Client TLS v1.2 KMIP v1.3",
"00000059": "Complete Server Basic KMIP v1.3",
"0000005a": "Complete Server TLS v1.2 KMIP v1.3",
"0000005b": "Tape Library Client KMIP v1.3",
"0000005c": "Tape Library Server KMIP v1.3",
"0000005d": "Symmetric Key Lifecycle Client KMIP v1.3",
"0000005e": "Symmetric Key Lifecycle Server KMIP v1.3",
"0000005f": "Asymmetric Key Lifecycle Client KMIP v1.3",
"00000060": "Asymmetric Key Lifecycle Server KMIP v1.3",
"00000061": "Basic Cryptographic Client KMIP v1.3",
"00000062": "Basic Cryptographic Server KMIP v1.3",
"00000063": "Advanced Cryptographic Client KMIP v1.3",
"00000064": "Advanced Cryptographic Server KMIP v1.3",
"00000065": "RNG Cryptographic Client KMIP v1.3",
"00000066": "RNG Cryptographic Server KMIP v1.3",
"00000067": "Basic Symmetric Key Foundry Client KMIP v1.3",
"00000068": "Intermediate Symmetric Key Foundry Client KMIP v1.3",
"00000069": "Advanced Symmetric Key Foundry Client KMIP v1.3",
"0000006a": "Symmetric Key Foundry Server KMIP v1.3",
"0000006b": "Opaque Managed Object Store Client KMIP v1.3",
"0000006c": "Opaque Managed Object Store Server KMIP v1.3",
"0000006d": "Suite B minLOS_128 Client KMIP v1.3",
"0000006e": "Suite B minLOS_128 Server KMIP v1.3",
"0000006f": "Suite B minLOS_192 Client KMIP v1.3",
"00000070": "Suite B minLOS_192 Server KMIP v1.3",
"00000071": "Storage Array with Self Encrypting Drive Client KMIP v1.3",
"00000072": "Storage Array with Self Encrypting Drive Server KMIP v1.3",
"00000073": "HTTPS Client KMIP v1.3",
"00000074": "HTTPS Server KMIP v1.3",
"00000075": "JSON Client KMIP v1.3",
"00000076": "JSON Server KMIP v1.3",
"00000077": "XML Client KMIP v1.3",
"00000078": "XML Server KMIP v1.3",
"00000079": "Baseline Server Basic KMIP v1.4",
"0000007a": "Baseline Server TLS v1.2 KMIP v1.4",
"0000007b": "Baseline Client Basic KMIP v1.4",
"0000007c": "Baseline Client TLS v1.2 KMIP v1.4",
"0000007d": "Complete Server Basic KMIP v1.4",
"0000007e": "Complete Server TLS v1.2 KMIP v1.4",
"0000007f": "Tape Library Client KMIP v1.4",
"00000080": "Tape Library Server KMIP v1.4",
"00000081": "Symmetric Key Lifecycle Client KMIP v1.4",
"00000082": "Symmetric Key Lifecycle Server KMIP v1.4",
"00000083": "Asymmetric Key Lifecycle Client KMIP v1.4",
"00000084": "Asymmetric Key Lifecycle Server KMIP v1.4",
"00000085": "Basic Cryptographic Client KMIP v1.4",
"00000086": "Basic Cryptographic Server KMIP v1.4",
"00000087": "Advanced Cryptographic Client KMIP v1.4",
"00000088": "Advanced Cryptographic Server KMIP v1.4",
"00000089": "RNG Cryptographic Client KMIP v1.4",
"0000008a": "RNG Cryptographic Server KMIP v1.4",
"0000008b": "Basic Symmetric Key Foundry Client KMIP v1.4",
"0000008c": "Intermediate Symmetric Key Foundry Client KMIP v1.4",
"0000008d": "Advanced Symmetric Key Foundry Client KMIP v1.4",
"0000008e": "Symmetric Key Foundry Server KMIP v1.4",
"0000008f": "Opaque Managed Object Store Client KMIP v1.4",
"00000090": "Opaque Managed Object Store Server KMIP v1.4",
"00000091": "Suite B minLOS_128 Client KMIP v1.4",
"00000092": "Suite B minLOS_128 Server KMIP v1.4",
"00000093": "Suite B minLOS_192 Client KMIP v1.4",
"00000094": "Suite B minLOS_192 Server KMIP v1.4",
"00000095": "Storage Array with Self Encrypting Drive Client KMIP v1.4",
"00000096": "Storage Array with Self Encrypting Drive Server KMIP v1.4",
"00000097": "HTTPS Client KMIP v1.4",
"00000098": "HTTPS Server KMIP v1.4",
"00000099": "JSON Client KMIP v1.4",
"0000009a": "JSON Server KMIP v1.4",
"0000009b": "XML Client KMIP v1.4",
"0000009c": "XML Server KMIP v1.4"
}
},
"4200ed": {
"name": "Server URI"
},
"4200ee": {
"name": "Server Port"
},
"4200ef": {
"name": "Streaming Capability"
},
"4200f0": {
"name": "Asynchronous Capability"
},
"4200f1": {
"name": "Attestation Capability"
},
"4200f2": {
"name": "Unwrap Mode",
"enumeration": {
"00000001": "Unspecified",
"00000002": "Processed",
"00000003": "Not Processed"
}
},
"4200f3": {
"name": "Destroy Action",
"enumeration": {
"00000001": "Unspecified",
"00000002": "Key Material Deleted",
"00000003": "Key Material Shredded",
"00000004": "Meta Data Deleted",
"00000005": "Meta Data Shredded",
"00000006": "Deleted",
"00000007": "Shredded"
}
},
"4200f4": {
"name": "Shredding Algorithm",
"enumeration": {
"00000001": "Unspecified",
"00000002": "Cryptographic",
"00000003": "Unsupported"
}
},
"4200f5": {
"name": "RNG Mode",
"enumeration": {
"00000001": "Unspecified",
"00000002": "Shared Instantiation",
"00000003": "Non-Shared Instantiation"
}
},
"4200f6": {
"name": "Client Registration Method"
},
"4200f7": {
"name": "Capability Information"
},
"420105": {
"name": "Client Correlation Value"
},
"420106": {
"name": "Server Correlation Value"
}
}

(new file, 174 lines)
'use strict'; // eslint-disable-line
const assert = require('assert');
const DEFAULT_PIPELINE_DEPTH = 8;
const DEFAULT_KMIP_PORT = 5696;
class TransportTemplate {
/**
* Construct a new object of the TransportTemplate class
* @param {Object} channel - Typically the tls object
* @param {Object} options - Instance options
* @param {Number} options.pipelineDepth - depth of the pipeline
* @param {Object} options.tls - Standard TLS socket initialization
* parameters
* @param {Number} options.tls.port - TLS server port to connect to
*/
constructor(channel, options) {
this.channel = channel;
this.options = options;
this.pipelineDepth = Math.max(1, options.pipelineDepth ||
DEFAULT_PIPELINE_DEPTH);
this.callbackPipeline = [];
this.deferedRequests = [];
this.pipelineDrainedCallback = null;
this.handshakeFunction = null;
this.socket = null;
}
/**
* Drain the outstanding and deferred request queues by
* calling the associated callback with an error
* @param {Error} error - the error to call the callback function with.
* @returns {undefined}
*/
_drainQueuesWithError(error) {
this.callbackPipeline.forEach(queuedCallback => {
queuedCallback(error);
});
this.deferedRequests.forEach(deferedRequest => {
deferedRequest.cb(error);
});
this.callbackPipeline = [];
this.deferedRequests = [];
}
/**
* Register a higher level handshake function to be called
* after the connection is initialized and before the first
* message is sent.
* @param {Function} handshakeFunction - (logger: Object, cb: Function(err))
* @returns {undefined}
*/
registerHandshakeFunction(handshakeFunction) {
this.handshakeFunction = handshakeFunction;
}
/**
* Create a new conversation (e.g. a socket) between the client
* and the server.
* @param {Object} logger - Werelogs logger object
* @param {Function} readyCallback - callback function to call when the
* conversation is ready to be initiated
* func(err: Error)
* @returns {undefined}
*/
_createConversation(logger, readyCallback) {
try {
const socket = this.channel.connect(
this.options.tls.port || DEFAULT_KMIP_PORT,
this.options.tls,
() => {
if (this.handshakeFunction) {
this.handshakeFunction(logger, readyCallback);
} else {
readyCallback(null);
}
});
socket.on('data', data => {
const queuedCallback = this.callbackPipeline.shift();
queuedCallback(null, socket, data);
if (this.callbackPipeline.length <
this.pipelineDepth &&
this.deferedRequests.length > 0) {
const deferedRequest = this.deferedRequests.shift();
process.nextTick(() => {
this.send(logger,
deferedRequest.encodedMessage,
deferedRequest.cb);
});
} else if (this.callbackPipeline.length === 0 &&
this.deferedRequests.length === 0 &&
this.pipelineDrainedCallback) {
this.pipelineDrainedCallback();
this.pipelineDrainedCallback = null;
}
});
socket.on('end', () => {
const error = Error('Conversation interrupted');
this.socket = null;
this._drainQueuesWithError(error);
});
socket.on('error', err => {
this._drainQueuesWithError(err);
});
this.socket = socket;
} catch (err) {
logger.error(err);
this._drainQueuesWithError(err);
readyCallback(err);
}
}
_doSend(logger, encodedMessage, cb) {
this.callbackPipeline.push(cb);
if (this.socket === null || this.socket.destroyed) {
this._createConversation(logger, () => {});
}
const socket = this.socket;
if (socket) {
socket.cork();
socket.write(encodedMessage);
socket.uncork();
}
return undefined;
}
/**
* Send an encoded message to the server
* @param {Object} logger - Werelogs logger object
* @param {Buffer} encodedMessage - the encoded message to send to the
* server
* @param {Function} cb - (err, conversation, rawResponse)
* @returns {undefined}
*/
send(logger, encodedMessage, cb) {
if (this.callbackPipeline.length >= this.pipelineDepth) {
return this.deferedRequests.push({ encodedMessage, cb });
}
assert(encodedMessage.length !== 0);
return this._doSend(logger, encodedMessage, cb);
}
/**
* Gracefully interrupt the conversation. If the caller keeps sending
* messages after calling this function, the conversation won't
* converge to its end.
* @returns {undefined}
*/
end() {
if (!this.socket) {
return;
}
if (this.callbackPipeline.length !== 0 ||
this.deferedRequests.length !== 0) {
this.pipelineDrainedCallback = this.socket.end.bind(this.socket);
} else {
this.socket.end();
}
}
/**
* Abruptly interrupt the conversation and cancel the outstanding and
* deferred requests
* @param {Object} conversation - the conversation to abort
* @returns {undefined}
*/
abortPipeline(conversation) {
conversation.end();
}
}
module.exports = TransportTemplate;
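The pipelining strategy above (at most `pipelineDepth` requests in flight, extras pushed onto the deferred queue and replayed as responses arrive) can be sketched in isolation. `PipelineSketch` below is a hypothetical stand-in for illustration, not the library class:

```javascript
// Hypothetical sketch of the TransportTemplate pipelining strategy:
// up to `depth` callbacks are in flight; extra requests are deferred
// and replayed as responses arrive.
class PipelineSketch {
    constructor(depth) {
        this.depth = depth;
        this.inFlight = []; // callbacks awaiting a server response
        this.deferred = []; // requests queued beyond the pipeline depth
    }
    send(msg, cb) {
        if (this.inFlight.length >= this.depth) {
            this.deferred.push({ msg, cb });
            return;
        }
        this.inFlight.push(cb);
        // a real transport would write `msg` to the socket here
    }
    onResponse(data) {
        const cb = this.inFlight.shift();
        cb(null, data);
        if (this.inFlight.length < this.depth && this.deferred.length > 0) {
            const next = this.deferred.shift();
            this.send(next.msg, next.cb);
        }
    }
}

const p = new PipelineSketch(2);
const results = [];
for (let i = 0; i < 3; i++) {
    p.send(`req${i}`, (err, data) => results.push(data));
}
console.log(p.deferred.length); // 1: third request exceeded the depth
p.onResponse('resp0');
console.log(p.deferred.length); // 0: slot freed, deferred request replayed
```

As in `send`/`_doSend`, the deferral check happens before the write, so the pipeline depth bounds the number of unanswered requests on the socket.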

View File

@ -0,0 +1,12 @@
'use strict'; // eslint-disable-line
const tls = require('tls');
const TransportTemplate = require('./TransportTemplate.js');
class TlsTransport extends TransportTemplate {
constructor(options) {
super(tls, options);
}
}
module.exports = TlsTransport;

View File

@ -2,12 +2,14 @@
const Ajv = require('ajv');
const userPolicySchema = require('./userPolicySchema');
const resourcePolicySchema = require('./resourcePolicySchema');
const errors = require('../errors');
const ajValidate = new Ajv({ allErrors: true });
ajValidate.addMetaSchema(require('ajv/lib/refs/json-schema-draft-06.json'));
// compiles schema to functions and caches them for all cases
const userPolicyValidate = ajValidate.compile(userPolicySchema);
const resourcePolicyValidate = ajValidate.compile(resourcePolicySchema);
const errDict = {
required: {
@ -25,33 +27,38 @@ const errDict = {
};
// parse ajv errors and return early with the first relevant error
function _parseErrors(ajvErrors) {
// a copy is needed as we have to assign a custom error description
const parsedErr = Object.assign({}, errors.MalformedPolicyDocument);
parsedErr.description = 'Syntax errors in policy.';
function _parseErrors(ajvErrors, policyType) {
let parsedErr;
if (policyType === 'user') {
// a copy is needed as we have to assign a custom error description
parsedErr = Object.assign({}, errors.MalformedPolicyDocument);
}
if (policyType === 'resource') {
parsedErr = Object.assign({}, errors.MalformedPolicy);
}
ajvErrors.some(err => {
const resource = err.dataPath;
const field = err.params ? err.params.missingProperty : undefined;
const errType = err.keyword;
if (errType === 'type' && (resource === '.Statement' ||
resource === '.Statement.Resource' ||
resource === '.Statement.NotResource')) {
resource.includes('.Resource') ||
resource.includes('.NotResource'))) {
// skip this error, as it doesn't carry enough context
return false;
}
if (err.keyword === 'required' && field && errDict.required[field]) {
parsedErr.description = errDict.required[field];
} else if (err.keyword === 'pattern' &&
(resource === '.Statement.Action' ||
resource === '.Statement.NotAction')) {
(resource.includes('.Action') ||
resource.includes('.NotAction'))) {
parsedErr.description = errDict.pattern.Action;
} else if (err.keyword === 'pattern' &&
(resource === '.Statement.Resource' ||
resource === '.Statement.NotResource')) {
(resource.includes('.Resource') ||
resource.includes('.NotResource'))) {
parsedErr.description = errDict.pattern.Resource;
} else if (err.keyword === 'minItems' &&
(resource === '.Statement.Resource' ||
resource === '.Statement.NotResource')) {
(resource.includes('.Resource') ||
resource.includes('.NotResource'))) {
parsedErr.description = errDict.minItems.Resource;
}
return true;
@ -78,12 +85,24 @@ function _validatePolicy(type, policy) {
}
userPolicyValidate(parseRes);
if (userPolicyValidate.errors) {
return { error: _parseErrors(userPolicyValidate.errors),
return { error: _parseErrors(userPolicyValidate.errors, 'user'),
valid: false };
}
return { error: null, valid: true };
}
// TODO: add support for resource policies
if (type === 'resource') {
const parseRes = _safeJSONParse(policy);
if (parseRes instanceof Error) {
return { error: Object.assign({}, errors.MalformedPolicy),
valid: false };
}
resourcePolicyValidate(parseRes);
if (resourcePolicyValidate.errors) {
return { error: _parseErrors(resourcePolicyValidate.errors,
'resource'), valid: false };
}
return { error: null, valid: true };
}
return { error: errors.NotImplemented, valid: false };
}
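The `_parseErrors` change above dispatches on the new `policyType` argument to pick the base error before scanning the Ajv errors. A minimal, self-contained sketch of that dispatch pattern (hypothetical error objects and messages, not Arsenal's `errors` module):

```javascript
// Hypothetical sketch of the policyType dispatch in _parseErrors:
// pick a base error per policy type, then take the first ajv error
// that carries enough context to produce a useful description.
const baseErrors = {
    user: { code: 400, message: 'MalformedPolicyDocument' },
    resource: { code: 400, message: 'MalformedPolicy' },
};
const errDict = {
    required: { Version: 'Policy document must have Version' },
    pattern: { Action: 'Actions/Conditions must be prefaced by a vendor' },
};
function parseErrors(ajvErrors, policyType) {
    // shallow copy so the shared base error is never mutated
    const parsedErr = Object.assign({}, baseErrors[policyType]);
    ajvErrors.some(err => {
        const resource = err.dataPath;
        const field = err.params ? err.params.missingProperty : undefined;
        if (err.keyword === 'type' && resource.includes('.Statement')) {
            return false; // not enough context, look at the next error
        }
        if (err.keyword === 'required' && field && errDict.required[field]) {
            parsedErr.description = errDict.required[field];
        } else if (err.keyword === 'pattern' && resource.includes('.Action')) {
            parsedErr.description = errDict.pattern.Action;
        }
        return true;
    });
    return parsedErr;
}

console.log(parseErrors([
    { keyword: 'required', dataPath: '', params: { missingProperty: 'Version' } },
], 'user').description); // Policy document must have Version
```

The `.some()` idiom mirrors the real code: it stops at the first error deemed relevant, so only one description is reported per validation failure.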
/**

View File

@ -0,0 +1,491 @@
{
"$schema": "http://json-schema.org/draft-06/schema#",
"type": "object",
"title": "AWS Bucket Policy schema.",
"description": "This schema describes a bucket policy per AWS policy grammar rules",
"definitions": {
"principalService": {
"type": "object",
"properties": {
"Service": {
"type": "string",
"const": "backbeat"
}
},
"additionalProperties": false
},
"principalCanonicalUser": {
"type": "object",
"properties": {
"CanonicalUser": {
"type": "string",
"pattern": "^[0-9a-z]{64}$"
}
},
"additionalProperties": false
},
"principalAnonymous": {
"type": "string",
"pattern": "^\\*$"
},
"principalAWSAccountID": {
"type": "string",
"pattern": "^[0-9]{12}$"
},
"principalAWSAccountArn": {
"type": "string",
"pattern": "^arn:aws:iam::[0-9]{12}:root$"
},
"principalAWSUserArn": {
"type": "string",
"pattern": "^arn:aws:iam::[0-9]{12}:user/(?!\\*)[\\w+=,.@ -/]{1,64}$"
},
"principalAWSRoleArn": {
"type": "string",
"pattern": "^arn:aws:iam::[0-9]{12}:role/[\\w+=,.@ -]{1,64}$"
},
"principalAWSItem": {
"type": "object",
"properties": {
"AWS": {
"oneOf": [
{
"$ref": "#/definitions/principalAWSAccountID"
},
{
"$ref": "#/definitions/principalAnonymous"
},
{
"$ref": "#/definitions/principalAWSAccountArn"
},
{
"$ref": "#/definitions/principalAWSUserArn"
},
{
"$ref": "#/definitions/principalAWSRoleArn"
},
{
"type": "array",
"minItems": 1,
"items": {
"$ref": "#/definitions/principalAWSAccountID"
}
},
{
"type": "array",
"minItems": 1,
"items": {
"$ref": "#/definitions/principalAWSAccountArn"
}
},
{
"type": "array",
"minItems": 1,
"items": {
"$ref": "#/definitions/principalAWSRoleArn"
}
},
{
"type": "array",
"minItems": 1,
"items": {
"$ref": "#/definitions/principalAWSUserArn"
}
}
]
}
},
"additionalProperties": false
},
"principalItem": {
"oneOf": [
{
"$ref": "#/definitions/principalAWSItem"
},
{
"$ref": "#/definitions/principalAnonymous"
},
{
"$ref": "#/definitions/principalService"
},
{
"$ref": "#/definitions/principalCanonicalUser"
}
]
},
"actionItem": {
"type": "string",
"pattern": "^[^*:]+:([^:])+|^\\*$"
},
"resourceItem": {
"type": "string",
"pattern": "^\\*|arn:(aws|scality)(:(\\*{1}|[a-z0-9\\*\\-]{2,})*?){3}:((?!\\$\\{\\}).)*?$"
},
"conditionKeys": {
"properties": {
"aws:CurrentTime": true,
"aws:EpochTime": true,
"aws:MultiFactorAuthAge": true,
"aws:MultiFactorAuthPresent": true,
"aws:PrincipalArn": true,
"aws:PrincipalOrgId": true,
"aws:PrincipalTag/${TagKey}": true,
"aws:PrincipalType": true,
"aws:Referer": true,
"aws:RequestTag/${TagKey}": true,
"aws:RequestedRegion": true,
"aws:SecureTransport": true,
"aws:SourceAccount": true,
"aws:SourceArn": true,
"aws:SourceIp": true,
"aws:SourceVpc": true,
"aws:SourceVpce": true,
"aws:TagKeys": true,
"aws:TokenIssueTime": true,
"aws:UserAgent": true,
"aws:userid": true,
"aws:username": true,
"s3:ExistingJobOperation": true,
"s3:ExistingJobPriority": true,
"s3:ExistingObjectTag/<key>": true,
"s3:JobSuspendedCause": true,
"s3:LocationConstraint": true,
"s3:RequestJobOperation": true,
"s3:RequestJobPriority": true,
"s3:RequestObjectTag/<key>": true,
"s3:RequestObjectTagKeys": true,
"s3:VersionId": true,
"s3:authtype": true,
"s3:delimiter": true,
"s3:locationconstraint": true,
"s3:max-keys": true,
"s3:object-lock-legal-hold": true,
"s3:object-lock-mode": true,
"s3:object-lock-remaining-retention-days": true,
"s3:object-lock-retain-until-date": true,
"s3:prefix": true,
"s3:signatureage": true,
"s3:signatureversion": true,
"s3:versionid": true,
"s3:x-amz-acl": true,
"s3:x-amz-content-sha256": true,
"s3:x-amz-copy-source": true,
"s3:x-amz-grant-full-control": true,
"s3:x-amz-grant-read": true,
"s3:x-amz-grant-read-acp": true,
"s3:x-amz-grant-write": true,
"s3:x-amz-grant-write-acp": true,
"s3:x-amz-metadata-directive": true,
"s3:x-amz-server-side-encryption": true,
"s3:x-amz-server-side-encryption-aws-kms-key-id": true,
"s3:x-amz-storage-class": true,
"s3:x-amz-website-redirect-location": true
},
"additionalProperties": false
},
"conditions": {
"type": "object",
"properties": {
"ArnEquals": {
"type": "object"
},
"ArnEqualsIfExists": {
"type": "object"
},
"ArnLike": {
"type": "object"
},
"ArnLikeIfExists": {
"type": "object"
},
"ArnNotEquals": {
"type": "object"
},
"ArnNotEqualsIfExists": {
"type": "object"
},
"ArnNotLike": {
"type": "object"
},
"ArnNotLikeIfExists": {
"type": "object"
},
"BinaryEquals": {
"type": "object"
},
"BinaryEqualsIfExists": {
"type": "object"
},
"BinaryNotEquals": {
"type": "object"
},
"BinaryNotEqualsIfExists": {
"type": "object"
},
"Bool": {
"type": "object"
},
"BoolIfExists": {
"type": "object"
},
"DateEquals": {
"type": "object"
},
"DateEqualsIfExists": {
"type": "object"
},
"DateGreaterThan": {
"type": "object"
},
"DateGreaterThanEquals": {
"type": "object"
},
"DateGreaterThanEqualsIfExists": {
"type": "object"
},
"DateGreaterThanIfExists": {
"type": "object"
},
"DateLessThan": {
"type": "object"
},
"DateLessThanEquals": {
"type": "object"
},
"DateLessThanEqualsIfExists": {
"type": "object"
},
"DateLessThanIfExists": {
"type": "object"
},
"DateNotEquals": {
"type": "object"
},
"DateNotEqualsIfExists": {
"type": "object"
},
"IpAddress": {
"type": "object"
},
"IpAddressIfExists": {
"type": "object"
},
"NotIpAddress": {
"type": "object"
},
"NotIpAddressIfExists": {
"type": "object"
},
"Null": {
"type": "object"
},
"NumericEquals": {
"type": "object"
},
"NumericEqualsIfExists": {
"type": "object"
},
"NumericGreaterThan": {
"type": "object"
},
"NumericGreaterThanEquals": {
"type": "object"
},
"NumericGreaterThanEqualsIfExists": {
"type": "object"
},
"NumericGreaterThanIfExists": {
"type": "object"
},
"NumericLessThan": {
"type": "object"
},
"NumericLessThanEquals": {
"type": "object"
},
"NumericLessThanEqualsIfExists": {
"type": "object"
},
"NumericLessThanIfExists": {
"type": "object"
},
"NumericNotEquals": {
"type": "object"
},
"NumericNotEqualsIfExists": {
"type": "object"
},
"StringEquals": {
"type": "object"
},
"StringEqualsIfExists": {
"type": "object"
},
"StringEqualsIgnoreCase": {
"type": "object"
},
"StringEqualsIgnoreCaseIfExists": {
"type": "object"
},
"StringLike": {
"type": "object"
},
"StringLikeIfExists": {
"type": "object"
},
"StringNotEquals": {
"type": "object"
},
"StringNotEqualsIfExists": {
"type": "object"
},
"StringNotEqualsIgnoreCase": {
"type": "object"
},
"StringNotEqualsIgnoreCaseIfExists": {
"type": "object"
},
"StringNotLike": {
"type": "object"
},
"StringNotLikeIfExists": {
"type": "object"
}
},
"additionalProperties": false
}
},
"properties": {
"Version": {
"type": "string",
"const": "2012-10-17"
},
"Statement": {
"oneOf": [
{
"type": [
"array"
],
"minItems": 1,
"items": {
"type": "object",
"properties": {
"Sid": {
"type": "string",
"pattern": "^[a-zA-Z0-9]+$"
},
"Action": {
"oneOf": [
{
"$ref": "#/definitions/actionItem"
},
{
"type": "array",
"items": {
"$ref": "#/definitions/actionItem"
}
}
]
},
"Effect": {
"type": "string",
"enum": [
"Allow",
"Deny"
]
},
"Principal": {
"$ref": "#/definitions/principalItem"
},
"Resource": {
"oneOf": [
{
"$ref": "#/definitions/resourceItem"
},
{
"type": "array",
"items": {
"$ref": "#/definitions/resourceItem"
},
"minItems": 1
}
]
},
"Condition": {
"$ref": "#/definitions/conditions"
}
},
"required": [
"Action",
"Effect",
"Principal",
"Resource"
]
}
},
{
"type": [
"object"
],
"properties": {
"Sid": {
"type": "string",
"pattern": "^[a-zA-Z0-9]+$"
},
"Action": {
"oneOf": [
{
"$ref": "#/definitions/actionItem"
},
{
"type": "array",
"items": {
"$ref": "#/definitions/actionItem"
}
}
]
},
"Effect": {
"type": "string",
"enum": [
"Allow",
"Deny"
]
},
"Principal": {
"$ref": "#/definitions/principalItem"
},
"Resource": {
"oneOf": [
{
"$ref": "#/definitions/resourceItem"
},
{
"type": "array",
"items": {
"$ref": "#/definitions/resourceItem"
},
"minItems": 1
}
]
},
"Condition": {
"$ref": "#/definitions/conditions"
}
},
"required": [
"Action",
"Effect",
"Resource",
"Principal"
]
}
]
}
},
"required": [
"Version",
"Statement"
],
"additionalProperties": false
}
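A few of the principal patterns in the schema above can be exercised directly. Note how the `(?!\*)` lookahead in `principalAWSUserArn` rejects a user name that starts with a wildcard. The regex literals below are transcribed from the schema's `pattern` strings:

```javascript
// Regexes transcribed from the bucket policy schema's principal definitions.
const accountId = /^[0-9]{12}$/;        // principalAWSAccountID
const userArn =                          // principalAWSUserArn
    /^arn:aws:iam::[0-9]{12}:user\/(?!\*)[\w+=,.@ -\/]{1,64}$/;
const canonicalUser = /^[0-9a-z]{64}$/;  // principalCanonicalUser

console.log(accountId.test('123456789012')); // true
console.log(userArn.test('arn:aws:iam::123456789012:user/Alice')); // true
console.log(userArn.test('arn:aws:iam::123456789012:user/*')); // false: lookahead
console.log(canonicalUser.test('a'.repeat(64))); // true
```

The lookahead only rejects a leading `*` in the user path; later characters still fall inside the ` -/` range of the character class, matching the schema as written.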

View File

@ -1,7 +1,7 @@
{
"$schema": "http://json-schema.org/draft-06/schema#",
"type": "object",
"title": "AWS Policy schema.",
"title": "AWS User Policy schema.",
"description": "This schema describes a user policy per AWS policy grammar rules",
"definitions": {
"principalService": {
@ -28,7 +28,7 @@
},
"principalAWSUserArn": {
"type": "string",
"pattern": "^arn:aws:iam::[0-9]{12}:user/[\\w+=,.@ -]{1,64}$"
"pattern": "^arn:aws:iam::[0-9]{12}:user/(?!\\*)[\\w+=,.@ -/]{1,64}$"
},
"principalAWSRoleArn": {
"type": "string",
@ -566,4 +566,4 @@
"Statement"
],
"additionalProperties": false
}
}

View File

@ -134,7 +134,7 @@ class RequestContext {
requesterIp, sslEnabled, apiMethod,
awsService, locationConstraint, requesterInfo,
signatureVersion, authType, signatureAge, securityToken, policyArn,
action) {
action, postXml) {
this._headers = headers;
this._query = query;
this._requesterIp = requesterIp;
@ -163,6 +163,10 @@ class RequestContext {
this._policyArn = policyArn;
this._action = action;
this._needQuota = _actionNeedQuotaCheck[apiMethod] === true;
this._postXml = postXml;
this._requestObjTags = null;
this._existingObjTag = null;
this._needTagEval = false;
return this;
}
@ -191,6 +195,10 @@ class RequestContext {
securityToken: this._securityToken,
policyArn: this._policyArn,
action: this._action,
postXml: this._postXml,
requestObjTags: this._requestObjTags,
existingObjTag: this._existingObjTag,
needTagEval: this._needTagEval,
};
return JSON.stringify(requestInfo);
}
@ -216,7 +224,7 @@ class RequestContext {
obj.apiMethod, obj.awsService, obj.locationConstraint,
obj.requesterInfo, obj.signatureVersion,
obj.authType, obj.signatureAge, obj.securityToken, obj.policyArn,
obj.action);
obj.action, obj.postXml);
}
/**
@ -559,6 +567,86 @@ class RequestContext {
isQuotaCheckNeeded() {
return this._needQuota;
}
/**
* Set request post
*
* @param {string} postXml - request post
* @return {RequestContext} itself
*/
setPostXml(postXml) {
this._postXml = postXml;
return this;
}
/**
* Get request post
*
* @return {string} request post
*/
getPostXml() {
return this._postXml;
}
/**
* Set request object tags
*
* @param {string} requestObjTags - object tag(s) included in request in query string form
* @return {RequestContext} itself
*/
setRequestObjTags(requestObjTags) {
this._requestObjTags = requestObjTags;
return this;
}
/**
* Get request object tags
*
* @return {string} request object tag(s)
*/
getRequestObjTags() {
return this._requestObjTags;
}
/**
* Set info on existing tag on object included in request
*
* @param {string} existingObjTag - existing object tag in query string form
* @return {RequestContext} itself
*/
setExistingObjTag(existingObjTag) {
this._existingObjTag = existingObjTag;
return this;
}
/**
* Get existing object tag
*
* @return {string} existing object tag
*/
getExistingObjTag() {
return this._existingObjTag;
}
/**
* Set whether IAM policy tag condition keys should be evaluated
*
* @param {boolean} needTagEval - whether to evaluate tags
* @return {RequestContext} itself
*/
setNeedTagEval(needTagEval) {
this._needTagEval = needTagEval;
return this;
}
/**
* Get needTagEval param
*
* @return {boolean} needTagEval - whether IAM policy tags condition keys should be evaluated
*/
getNeedTagEval() {
return this._needTagEval;
}
}
module.exports = RequestContext;
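The new tag-related fields follow the class's fluent-setter convention (each setter returns `this`) and are included in `serialize()`. A hypothetical minimal sketch of just those fields, not the real RequestContext, which carries many more:

```javascript
// Hypothetical sketch of the fluent tag-related setters added above:
// fields default to null/false and setters chain through `this`.
class RequestContextSketch {
    constructor() {
        this._requestObjTags = null;
        this._existingObjTag = null;
        this._needTagEval = false;
    }
    setRequestObjTags(tags) { this._requestObjTags = tags; return this; }
    setExistingObjTag(tag) { this._existingObjTag = tag; return this; }
    setNeedTagEval(need) { this._needTagEval = need; return this; }
    serialize() {
        return JSON.stringify({
            requestObjTags: this._requestObjTags,
            existingObjTag: this._existingObjTag,
            needTagEval: this._needTagEval,
        });
    }
}

const rc = new RequestContextSketch()
    .setRequestObjTags('key1=value1')
    .setNeedTagEval(true);
console.log(rc.serialize());
// {"requestObjTags":"key1=value1","existingObjTag":null,"needTagEval":true}
```

Serializing the new fields matters because a context is round-tripped through `serialize()`/`deSerialize()` between services; a field left out of `serialize()` would silently reset on the other side.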

View File

@ -6,6 +6,7 @@ const conditions = require('./utils/conditions.js');
const findConditionKey = conditions.findConditionKey;
const convertConditionOperator = conditions.convertConditionOperator;
const checkArnMatch = require('./utils/checkArnMatch.js');
const { transformTagKeyValue } = require('./utils/objectTags');
const evaluators = {};
@ -16,6 +17,7 @@ const operatorsWithVariables = ['StringEquals', 'StringNotEquals',
const operatorsWithNegation = ['StringNotEquals',
'StringNotEqualsIgnoreCase', 'StringNotLike', 'ArnNotEquals',
'ArnNotLike', 'NumericNotEquals'];
const tagConditions = new Set(['s3:ExistingObjectTag', 's3:RequestObjectTagKey', 's3:RequestObjectTagKeys']);
/**
@ -26,7 +28,7 @@ const operatorsWithNegation = ['StringNotEquals',
* @param {object} log - logger
* @return {boolean} true if applicable, false if not
*/
function isResourceApplicable(requestContext, statementResource, log) {
evaluators.isResourceApplicable = (requestContext, statementResource, log) => {
const resource = requestContext.getResource();
if (!Array.isArray(statementResource)) {
// eslint-disable-next-line no-param-reassign
@ -57,7 +59,7 @@ function isResourceApplicable(requestContext, statementResource, log) {
{ requestResource: resource });
// If no match found, no resource is applicable
return false;
}
};
/**
* Check whether action in policy statement applies to request
@ -67,7 +69,7 @@ function isResourceApplicable(requestContext, statementResource, log) {
* @param {Object} log - logger
* @return {boolean} true if applicable, false if not
*/
function isActionApplicable(requestAction, statementAction, log) {
evaluators.isActionApplicable = (requestAction, statementAction, log) => {
if (!Array.isArray(statementAction)) {
// eslint-disable-next-line no-param-reassign
statementAction = [statementAction];
@ -89,27 +91,34 @@ function isActionApplicable(requestAction, statementAction, log) {
{ requestAction });
// If no match found, return false
return false;
}
};
/**
* Check whether request meets policy conditions
* @param {RequestContext} requestContext - info about request
* @param {Object} statementCondition - Condition statement from policy
* @param {Object} log - logger
* @return {boolean} true if meet conditions, false if not
* @return {Object} contains whether conditions are allowed and whether they
* contain any tag condition keys
*/
evaluators.meetConditions = (requestContext, statementCondition, log) => {
// The Condition portion of a policy is an object with different
// operators as keys
const conditionEval = {};
const operators = Object.keys(statementCondition);
const length = operators.length;
for (let i = 0; i < length; i++) {
const operator = operators[i];
const hasPrefix = operator.includes(':');
const hasIfExistsCondition = operator.endsWith('IfExists');
// If has "IfExists" added to operator name, find operator name
// without "IfExists"
const bareOperator = hasIfExistsCondition ? operator.slice(0, -8) :
// If has "IfExists" added to operator name, or operator has "ForAnyValue" or
// "For All Values" prefix, find operator name without "IfExists" or prefix
let bareOperator = hasIfExistsCondition ? operator.slice(0, -8) :
operator;
let prefix;
if (hasPrefix) {
[prefix, bareOperator] = bareOperator.split(':');
}
const operatorCanHaveVariables =
operatorsWithVariables.indexOf(bareOperator) > -1;
const isNegationOperator =
@ -118,6 +127,9 @@ evaluators.meetConditions = (requestContext, statementCondition, log) => {
// Note: this should be the actual operator name, not the bareOperator
const conditionsWithSameOperator = statementCondition[operator];
const conditionKeys = Object.keys(conditionsWithSameOperator);
if (conditionKeys.some(key => tagConditions.has(key)) && !requestContext.getNeedTagEval()) {
conditionEval.tagConditions = true;
}
const conditionKeysLength = conditionKeys.length;
for (let j = 0; j < conditionKeysLength; j++) {
const key = conditionKeys[j];
@ -130,14 +142,18 @@ evaluators.meetConditions = (requestContext, statementCondition, log) => {
value = value.map(item =>
substituteVariables(item, requestContext));
}
// if condition key is RequestObjectTag or ExistingObjectTag,
// tag key is included in condition key and needs to be
// moved to value for evaluation, otherwise key/value are unchanged
const [transformedKey, transformedValue] = transformTagKeyValue(key, value);
// Pull key using requestContext
// TODO: If applicable to S3, handle policy set operations
// where a keyBasedOnRequestContext returns multiple values and
// condition has "ForAnyValue" or "ForAllValues".
// (see http://docs.aws.amazon.com/IAM/latest/UserGuide/
// reference_policies_multi-value-conditions.html)
const keyBasedOnRequestContext =
findConditionKey(key, requestContext);
let keyBasedOnRequestContext =
findConditionKey(transformedKey, requestContext);
// Handle IfExists and negation operators
if ((keyBasedOnRequestContext === undefined ||
keyBasedOnRequestContext === null) &&
@ -154,22 +170,27 @@ evaluators.meetConditions = (requestContext, statementCondition, log) => {
bareOperator !== 'Null') {
log.trace('condition not satisfied due to ' +
'missing info', { operator,
conditionKey: key, policyValue: value });
return false;
conditionKey: transformedKey, policyValue: transformedValue });
return { allow: false };
}
// If condition operator prefix is included, the key should be an array
if (prefix && !Array.isArray(keyBasedOnRequestContext)) {
keyBasedOnRequestContext = [keyBasedOnRequestContext];
}
// Translate the operator into a function using bareOperator
const operatorFunction = convertConditionOperator(bareOperator);
// Note: Wildcards are handled in the comparison operator function
// itself since StringLike, StringNotLike, ArnLike and ArnNotLike
// are the only operators where wildcards are allowed
if (!operatorFunction(keyBasedOnRequestContext, value)) {
if (!operatorFunction(keyBasedOnRequestContext, transformedValue, prefix)) {
log.trace('did not satisfy condition', { operator: bareOperator,
keyBasedOnRequestContext, policyValue: value });
return false;
keyBasedOnRequestContext, policyValue: transformedValue });
return { allow: false };
}
}
}
return true;
conditionEval.allow = true;
return conditionEval;
};
/**
@ -195,35 +216,36 @@ evaluators.evaluatePolicy = (requestContext, policy, log) => {
const currentStatement = policy.Statement[i];
// If affirmative resource is in policy and request resource is
// not applicable, move on to next statement
if (currentStatement.Resource && !isResourceApplicable(requestContext,
if (currentStatement.Resource && !evaluators.isResourceApplicable(requestContext,
currentStatement.Resource, log)) {
continue;
}
// If NotResource is in policy and resource matches NotResource
// in policy, move on to next statement
if (currentStatement.NotResource &&
isResourceApplicable(requestContext,
evaluators.isResourceApplicable(requestContext,
currentStatement.NotResource, log)) {
continue;
}
// If affirmative action is in policy and request action is not
// applicable, move on to next statement
if (currentStatement.Action &&
!isActionApplicable(requestContext.getAction(),
!evaluators.isActionApplicable(requestContext.getAction(),
currentStatement.Action, log)) {
continue;
}
// If NotAction is in policy and action matches NotAction in policy,
// move on to next statement
if (currentStatement.NotAction &&
isActionApplicable(requestContext.getAction(),
evaluators.isActionApplicable(requestContext.getAction(),
currentStatement.NotAction, log)) {
continue;
}
const conditionEval = currentStatement.Condition ?
evaluators.meetConditions(requestContext, currentStatement.Condition, log) :
null;
// If do not meet conditions move on to next statement
if (currentStatement.Condition &&
!evaluators.meetConditions(requestContext,
currentStatement.Condition, log)) {
if (conditionEval && !conditionEval.allow) {
continue;
}
if (currentStatement.Effect === 'Deny') {
@ -235,6 +257,9 @@ evaluators.evaluatePolicy = (requestContext, policy, log) => {
// If statement is applicable, conditions are met and Effect is
// to Allow, set verdict to Allow
verdict = 'Allow';
if (conditionEval && conditionEval.tagConditions) {
verdict = 'NeedTagConditionEval';
}
}
log.trace('result of evaluating single policy', { verdict });
return verdict;
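`meetConditions` now returns an object instead of a boolean, which lets `evaluatePolicy` distinguish a plain `Allow` from one whose conditions reference tag keys that CloudServer has not supplied yet. A hypothetical condensed sketch of that verdict logic (not the full evaluator):

```javascript
// Hypothetical sketch of the new tri-state verdict: a statement that
// allows the request but references unevaluated tag condition keys
// yields 'NeedTagConditionEval' instead of 'Allow'.
const tagConditions = new Set(
    ['s3:ExistingObjectTag', 's3:RequestObjectTagKey', 's3:RequestObjectTagKeys']);

function meetConditionsSketch(condition, needTagEval) {
    const result = { allow: true };
    Object.values(condition).forEach(keysForOperator => {
        if (Object.keys(keysForOperator).some(k => tagConditions.has(k))
            && !needTagEval) {
            result.tagConditions = true;
        }
    });
    return result;
}

function verdictFor(statement, needTagEval) {
    const conditionEval = statement.Condition ?
        meetConditionsSketch(statement.Condition, needTagEval) : null;
    if (conditionEval && !conditionEval.allow) {
        return 'Neutral'; // conditions not met: statement does not apply
    }
    if (statement.Effect === 'Deny') {
        return 'Deny';
    }
    let verdict = 'Allow';
    if (conditionEval && conditionEval.tagConditions) {
        verdict = 'NeedTagConditionEval';
    }
    return verdict;
}

const stmt = {
    Effect: 'Allow',
    Condition: { StringEquals: { 's3:ExistingObjectTag': 'env=prod' } },
};
console.log(verdictFor(stmt, false)); // NeedTagConditionEval
console.log(verdictFor(stmt, true));  // Allow
```

This matches the two-pass flow described in the conditions file: the first evaluation runs without CloudServer's tag data, and `NeedTagConditionEval` signals that a second pass with `needTagEval` set is required.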

View File

@ -35,7 +35,8 @@ class Principal {
// In case of anonymous NotPrincipal, this will neutralize everyone
return 'Neutral';
}
if (!Principal._evaluateCondition(params, statement)) {
const conditionEval = Principal._evaluateCondition(params, statement);
if (!conditionEval || conditionEval.allow === false) {
return 'Neutral';
}
return statement.Effect;
@ -65,7 +66,8 @@ class Principal {
if (reverse) {
return 'Neutral';
}
if (!Principal._evaluateCondition(params, statement)) {
const conditionEval = Principal._evaluateCondition(params, statement);
if (!conditionEval || conditionEval.allow === false) {
return 'Neutral';
}
return statement.Effect;
@ -76,7 +78,8 @@ class Principal {
if (reverse) {
return 'Neutral';
}
if (!Principal._evaluateCondition(params, statement)) {
const conditionEval = Principal._evaluateCondition(params, statement);
if (!conditionEval || conditionEval.allow === false) {
return 'Neutral';
}
return statement.Effect;
@ -140,6 +143,7 @@ class Principal {
AWS: [
account,
accountArn,
requesterArn,
],
};
checkAction = true;

View File

@ -1,21 +1,33 @@
const sharedActionMap = {
bucketDelete: 's3:DeleteBucket',
// the "s3:PutEncryptionConfiguration" action also governs DELETE
bucketDeleteEncryption: 's3:PutEncryptionConfiguration',
bucketDeletePolicy: 's3:DeleteBucketPolicy',
bucketDeleteWebsite: 's3:DeleteBucketWebsite',
bucketGet: 's3:ListBucket',
bucketGetACL: 's3:GetBucketAcl',
bucketGetCors: 's3:GetBucketCORS',
bucketGetEncryption: 's3:GetEncryptionConfiguration',
bucketGetLifecycle: 's3:GetLifecycleConfiguration',
bucketGetLocation: 's3:GetBucketLocation',
bucketGetNotification: 's3:GetBucketNotificationConfiguration',
bucketGetObjectLock: 's3:GetBucketObjectLockConfiguration',
bucketGetPolicy: 's3:GetBucketPolicy',
bucketGetReplication: 's3:GetReplicationConfiguration',
bucketGetVersioning: 's3:GetBucketVersioning',
bucketGetWebsite: 's3:GetBucketWebsite',
bucketHead: 's3:ListBucket',
bucketPutACL: 's3:PutBucketAcl',
bucketPutCors: 's3:PutBucketCORS',
bucketPutEncryption: 's3:PutEncryptionConfiguration',
bucketPutLifecycle: 's3:PutLifecycleConfiguration',
bucketPutNotification: 's3:PutBucketNotificationConfiguration',
bucketPutObjectLock: 's3:PutBucketObjectLockConfiguration',
bucketPutPolicy: 's3:PutBucketPolicy',
bucketPutReplication: 's3:PutReplicationConfiguration',
bucketPutVersioning: 's3:PutBucketVersioning',
bucketPutWebsite: 's3:PutBucketWebsite',
bypassGovernanceRetention: 's3:BypassGovernanceRetention',
listMultipartUploads: 's3:ListBucketMultipartUploads',
listParts: 's3:ListMultipartUploadParts',
multipartDelete: 's3:AbortMultipartUpload',
@ -23,9 +35,13 @@ const sharedActionMap = {
objectDeleteTagging: 's3:DeleteObjectTagging',
objectGet: 's3:GetObject',
objectGetACL: 's3:GetObjectAcl',
objectGetLegalHold: 's3:GetObjectLegalHold',
objectGetRetention: 's3:GetObjectRetention',
objectGetTagging: 's3:GetObjectTagging',
objectPut: 's3:PutObject',
objectPutACL: 's3:PutObjectAcl',
objectPutLegalHold: 's3:PutObjectLegalHold',
objectPutRetention: 's3:PutObjectRetention',
objectPutTagging: 's3:PutObjectTagging',
};
@ -51,20 +67,12 @@ const actionMapRQ = Object.assign({
objectPutTaggingVersion: 's3:PutObjectVersionTagging',
serviceGet: 's3:ListAllMyBuckets',
objectReplicate: 's3:ReplicateObject',
objectPutRetentionVersion: 's3:PutObjectVersionRetention',
objectPutLegalHoldVersion: 's3:PutObjectVersionLegalHold',
}, sharedActionMap);
// action map used for bucket policies
const actionMapBP = Object.assign({
bucketDeletePolicy: 's3:DeleteBucketPolicy',
bucketGetObjectLock: 's3:GetBucketObjectLockConfiguration',
bucketGetPolicy: 's3:GetBucketPolicy',
bucketPutObjectLock: 's3:PutBucketObjectLockConfiguration',
bucketPutPolicy: 's3:PutBucketPolicy',
objectGetLegalHold: 's3:GetObjectLegalHold',
objectGetRetention: 's3:GetObjectRetention',
objectPutLegalHold: 's3:PutObjectLegalHold',
objectPutRetention: 's3:PutObjectRetention',
}, sharedActionMap);
const actionMapBP = Object.assign({}, sharedActionMap);
// action map for all relevant s3 actions
const actionMapS3 = Object.assign({
@ -75,7 +83,9 @@ const actionMapS3 = Object.assign({
const actionMonitoringMapS3 = {
bucketDelete: 'DeleteBucket',
bucketDeleteCors: 'DeleteBucketCors',
bucketDeleteEncryption: 'DeleteBucketEncryption',
bucketDeleteLifecycle: 'DeleteBucketLifecycle',
bucketDeletePolicy: 'DeleteBucketPolicy',
bucketDeleteReplication: 'DeleteBucketReplication',
bucketDeleteWebsite: 'DeleteBucketWebsite',
bucketGet: 'ListObjects',
@ -83,16 +93,24 @@ const actionMonitoringMapS3 = {
bucketGetCors: 'GetBucketCors',
bucketGetLifecycle: 'GetBucketLifecycleConfiguration',
bucketGetLocation: 'GetBucketLocation',
bucketGetNotification: 'GetBucketNotificationConfiguration',
bucketGetObjectLock: 'GetObjectLockConfiguration',
bucketGetPolicy: 'GetBucketPolicy',
bucketGetReplication: 'GetBucketReplication',
bucketGetVersioning: 'GetBucketVersioning',
bucketGetEncryption: 'GetBucketEncryption',
bucketGetWebsite: 'GetBucketWebsite',
bucketHead: 'HeadBucket',
bucketPut: 'CreateBucket',
bucketPutACL: 'PutBucketAcl',
bucketPutCors: 'PutBucketCors',
bucketPutLifecycle: 'PutBucketLifecycleConfiguration',
bucketPutNotification: 'PutBucketNotificationConfiguration',
bucketPutObjectLock: 'PutObjectLockConfiguration',
bucketPutPolicy: 'PutBucketPolicy',
bucketPutReplication: 'PutBucketReplication',
bucketPutVersioning: 'PutBucketVersioning',
bucketPutEncryption: 'PutBucketEncryption',
bucketPutWebsite: 'PutBucketWebsite',
completeMultipartUpload: 'CompleteMultipartUpload',
initiateMultipartUpload: 'CreateMultipartUpload',
@ -105,12 +123,16 @@ const actionMonitoringMapS3 = {
objectDeleteTagging: 'DeleteObjectTagging',
objectGet: 'GetObject',
objectGetACL: 'GetObjectAcl',
objectGetLegalHold: 'GetObjectLegalHold',
objectGetRetention: 'GetObjectRetention',
objectGetTagging: 'GetObjectTagging',
objectHead: 'HeadObject',
objectPut: 'PutObject',
objectPutACL: 'PutObjectAcl',
objectPutCopyPart: 'UploadPartCopy',
objectPutLegalHold: 'PutObjectLegalHold',
objectPutPart: 'UploadPart',
objectPutRetention: 'PutObjectRetention',
objectPutTagging: 'PutObjectTagging',
serviceGet: 'ListBuckets',
};
@ -145,6 +167,12 @@ const actionMapIAM = {
listUsers: 'iam:ListUsers',
putGroupPolicy: 'iam:PutGroupPolicy',
removeUserFromGroup: 'iam:RemoveUserFromGroup',
updateAccessKey: 'iam:UpdateAccessKey',
updateGroup: 'iam:UpdateGroup',
updateUser: 'iam:UpdateUser',
getAccessKeyLastUsed: 'iam:GetAccessKeyLastUsed',
generateCredentialReport: 'iam:GenerateCredentialReport',
getCredentialReport: 'iam:GetCredentialReport',
};
const actionMapSSO = {

View File

@@ -4,6 +4,7 @@
const checkIPinRangeOrMatch = require('../../ipCheck').checkIPinRangeOrMatch;
const handleWildcards = require('./wildcards.js').handleWildcards;
const checkArnMatch = require('./checkArnMatch.js');
const { getTagKeys } = require('./objectTags');
const conditions = {};
/**
@@ -146,6 +147,25 @@ conditions.findConditionKey = (key, requestContext) => {
headers['x-amz-meta-scal-location-constraint']);
map.set('sts:ExternalId', requestContext.getRequesterExternalId());
map.set('iam:PolicyArn', requestContext.getPolicyArn());
// s3:ExistingObjectTag - Used to check that existing object tag has
// specific tag key and value. Extraction of correct tag key is done in CloudServer.
// On first pass of policy evaluation, CloudServer information will not be included,
// so evaluation should be skipped
map.set('s3:ExistingObjectTag', requestContext.getNeedTagEval() ? requestContext.getExistingObjTag() : undefined);
// s3:RequestObjectTagKey - Used to limit putting object tags to a
// specific tag key and value.
// Requires information from CloudServer.
// On first pass of policy evaluation, CloudServer information will not be included,
// so evaluation should be skipped
map.set('s3:RequestObjectTagKey', requestContext.getNeedTagEval() ? requestContext.getRequestObjTags() : undefined);
// s3:RequestObjectTagKeys - Used to limit putting object tags to specific tag keys.
// Requires information from CloudServer.
// On first pass of policy evaluation, CloudServer information will not be included,
// so evaluation should be skipped
map.set('s3:RequestObjectTagKeys',
requestContext.getNeedTagEval() && requestContext.getRequestObjTags()
? getTagKeys(requestContext.getRequestObjTags())
: undefined);
return map.get(key);
};
@@ -232,12 +252,21 @@ conditions.convertConditionOperator = operator => {
// eslint-disable-next-line new-cap
return !operatorMap.StringEqualsIgnoreCase(key, value);
},
StringLike: function stringLike(key, value) {
return value.some(item => {
const wildItem = handleWildcards(item);
const wildRegEx = new RegExp(wildItem);
return wildRegEx.test(key);
});
StringLike: function stringLike(key, value, prefix) {
function policyValRegex(testKey) {
return value.some(item => {
const wildItem = handleWildcards(item);
const wildRegEx = new RegExp(wildItem);
return wildRegEx.test(testKey);
});
}
if (prefix === 'ForAnyValue') {
return key.some(policyValRegex);
}
if (prefix === 'ForAllValues') {
return key.every(policyValRegex);
}
return policyValRegex(key);
},
StringNotLike: function stringNotLike(key, value) {
// eslint-disable-next-line new-cap

View File

@@ -0,0 +1,33 @@
/**
* Removes tag key value from condition key and adds it to value if needed
* @param {string} key - condition key
* @param {string} value - condition value
* @return {array} key/value pair to use
*/
function transformTagKeyValue(key, value) {
const patternKeys = ['s3:ExistingObjectTag/', 's3:RequestObjectTagKey/'];
if (!patternKeys.some(k => key.includes(k))) {
return [key, value];
}
// if key is RequestObjectTag or ExistingObjectTag,
// remove tag key from condition key and add to value
// and transform value into query string
const [conditionKey, tagKey] = key.split('/');
const transformedValue = [tagKey, value].join('=');
return [conditionKey, [transformedValue]];
}
/**
* Gets array of tag key names from request tag query string
* @param {string} tagQuery - request tags in query string format
* @return {array} array of tag key names
*/
function getTagKeys(tagQuery) {
return tagQuery.split('&')
.map(tag => tag.split('=')[0]);
}
module.exports = {
transformTagKeyValue,
getTagKeys,
};
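To see how the two helpers cooperate, here is a standalone demo; the function bodies are copied from the `objectTags` module above so the snippet runs on its own, and the tag names (`env`, `team`) are made up for illustration.

```javascript
// Copied from the objectTags helpers above, for a standalone demo.
function transformTagKeyValue(key, value) {
    const patternKeys = ['s3:ExistingObjectTag/', 's3:RequestObjectTagKey/'];
    if (!patternKeys.some(k => key.includes(k))) {
        return [key, value];
    }
    // fold the tag key embedded in the condition key into the value
    const [conditionKey, tagKey] = key.split('/');
    return [conditionKey, [[tagKey, value].join('=')]];
}

function getTagKeys(tagQuery) {
    return tagQuery.split('&').map(tag => tag.split('=')[0]);
}

// A condition key carrying a tag key is rewritten into a query-string value:
const pair = transformTagKeyValue('s3:ExistingObjectTag/env', 'prod');
// pair -> ['s3:ExistingObjectTag', ['env=prod']]

// Tag keys are extracted from the request's query-string tag set:
const keys = getTagKeys('env=prod&team=storage');
// keys -> ['env', 'team']
```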

View File

@@ -15,7 +15,7 @@ azureMpuUtils.overviewMpuKey = 'azure_mpu';
azureMpuUtils.maxSubPartSize = 104857600;
azureMpuUtils.zeroByteETag = crypto.createHash('md5').update('').digest('hex');
// TODO: S3C-4657
azureMpuUtils.padString = (str, category) => {
const _padFn = {
left: (str, padString) =>
@@ -124,7 +124,8 @@ log, cb) => {
`Error returned from Azure: ${err.message}`)
);
}
const eTag = objectUtils.getHexMD5(result.headers['content-md5']);
const md5 = result.headers['content-md5'] || '';
const eTag = objectUtils.getHexMD5(md5);
return cb(null, eTag, size);
}], log, cb);
};
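The change above guards against Azure responses that omit the `content-md5` header. A minimal sketch of why the guard matters, assuming `objectUtils.getHexMD5` is a base64-to-hex conversion (as sketched here; the real helper lives in the s3middleware `objectUtils` module):

```javascript
// Assumed shape of objectUtils.getHexMD5: decode a base64 MD5 to hex.
// Buffer.from(undefined, 'base64') throws, which is what the diff avoids.
function getHexMD5(base64MD5) {
    return Buffer.from(base64MD5, 'base64').toString('hex');
}

const headers = {}; // Azure response with no content-md5 header
const md5 = headers['content-md5'] || ''; // the fix: fall back to ''
const eTag = getHexMD5(md5); // '' instead of a TypeError

// With the header present, the conversion behaves as before:
const ok = getHexMD5(Buffer.from('abc').toString('base64')); // '616263'
```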

View File

@@ -0,0 +1,54 @@
const oneDay = 24 * 60 * 60 * 1000; // Milliseconds in a day.
class LifecycleDateTime {
constructor(params = {}) {
this._transitionOneDayEarlier = params.transitionOneDayEarlier;
this._expireOneDayEarlier = params.expireOneDayEarlier;
}
getCurrentDate() {
const timeTravel = this._expireOneDayEarlier ? oneDay : 0;
return Date.now() + timeTravel;
}
/**
* Helper method to get total Days passed since given date
* @param {Date} date - date object
* @return {number} Days passed
*/
findDaysSince(date) {
const now = this.getCurrentDate();
const diff = now - date;
return Math.floor(diff / (1000 * 60 * 60 * 24));
}
/**
* Get the Unix timestamp of the given date.
* @param {string} date - The date string to convert to a Unix timestamp
* @return {number} - The Unix timestamp
*/
getTimestamp(date) {
return new Date(date).getTime();
}
/**
* Find the Unix time at which the transition should occur.
* @param {object} transition - A transition from the lifecycle transitions
* @param {string} lastModified - The object's last modified date
* @return {number|undefined} - The normalized transition timestamp
*/
getTransitionTimestamp(transition, lastModified) {
if (transition.Date !== undefined) {
return this.getTimestamp(transition.Date);
}
if (transition.Days !== undefined) {
const lastModifiedTime = this.getTimestamp(lastModified);
const timeTravel = this._transitionOneDayEarlier ? -oneDay : 0;
return lastModifiedTime + (transition.Days * oneDay) + timeTravel;
}
return undefined;
}
}
module.exports = LifecycleDateTime;
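The interesting part of `LifecycleDateTime` is how a transition resolves to a timestamp: an absolute `Date` wins, otherwise `Days` is counted from the object's last-modified time. This standalone sketch condenses that method out of the class (dates chosen arbitrarily for the demo):

```javascript
const oneDay = 24 * 60 * 60 * 1000; // milliseconds in a day

// Condensed from LifecycleDateTime.getTransitionTimestamp above.
function getTransitionTimestamp(transition, lastModified, transitionOneDayEarlier = false) {
    if (transition.Date !== undefined) {
        return new Date(transition.Date).getTime();
    }
    if (transition.Days !== undefined) {
        const timeTravel = transitionOneDayEarlier ? -oneDay : 0;
        return new Date(lastModified).getTime() + transition.Days * oneDay + timeTravel;
    }
    return undefined;
}

const lastModified = '2022-01-01T00:00:00.000Z';
const byDays = getTransitionTimestamp({ Days: 30 }, lastModified);
const byDate = getTransitionTimestamp({ Date: '2022-01-31T00:00:00.000Z' }, lastModified);
// Both resolve to the same instant: 30 days after Jan 1 is Jan 31.
```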

View File

@@ -0,0 +1,228 @@
const assert = require('assert');
const LifecycleDateTime = require('./LifecycleDateTime');
const { supportedLifecycleRules } = require('../../constants');
class LifecycleUtils {
constructor(supportedRules, datetime) {
if (supportedRules) {
assert(Array.isArray(supportedRules));
}
if (datetime) {
assert(datetime instanceof LifecycleDateTime);
}
this._supportedRules = supportedRules || supportedLifecycleRules;
this._datetime = datetime || new LifecycleDateTime();
}
/**
* Compare two transition rules and return the one that is most recent.
* @param {object} params - The function parameters
* @param {object} params.transition1 - A transition from the current rule
* @param {object} params.transition2 - A transition from the previous rule
* @param {string} params.lastModified - The object's last modified
* date
* @return {object} The most applicable transition rule
*/
compareTransitions(params) {
const { transition1, transition2, lastModified } = params;
if (transition1 === undefined) {
return transition2;
}
if (transition2 === undefined) {
return transition1;
}
return this._datetime.getTransitionTimestamp(transition1, lastModified)
> this._datetime.getTransitionTimestamp(transition2, lastModified)
? transition1 : transition2;
}
/**
* Find the most relevant transition rule for the given transitions array
* and any previously stored transition from another rule.
* @param {object} params - The function parameters
* @param {array} params.transitions - Array of lifecycle rule transitions
* @param {object} params.store - Previously applicable rules, if any
* @param {string} params.lastModified - The object's last modified date
* @param {Date|string} params.currentDate - The date to evaluate transitions against
* @return {object} The most applicable transition rule
*/
getApplicableTransition(params) {
const {
transitions, store, lastModified, currentDate,
} = params;
const transition = transitions.reduce((result, transition) => {
const isApplicable = // Is the transition time in the past?
this._datetime.getTimestamp(currentDate) >=
this._datetime.getTransitionTimestamp(transition, lastModified);
if (!isApplicable) {
return result;
}
return this.compareTransitions({
transition1: transition,
transition2: result,
lastModified,
});
}, undefined);
return this.compareTransitions({
transition1: transition,
transition2: store.Transition,
lastModified,
});
}
/**
* Filter out all rules based on `Status` and `Filter` (Prefix and Tags)
* @param {array} bucketLCRules - array of bucket lifecycle rules
* @param {object} item - represents a single object, version, or upload
* @param {object} objTags - all tags for given `item`
* @return {array} list of all filtered rules that apply to `item`
*/
filterRules(bucketLCRules, item, objTags) {
/*
All tags on a bucket lifecycle rule must be present on the object.
So if a rule carries tag "key1/value1" and an object carries tags
"key1/value1, key2/value2", the rule applies to that object.
Conversely, if the rule carries "key1/value1, key2/value2" and the
object only carries "key1/value1", the rule does not apply to that
object.
*/
function deepCompare(rTags, oTags) {
// check to make sure object tags length matches or is greater
if (rTags.length > oTags.length) {
return false;
}
// all key/value tags of bucket rules must be within object tags
for (let i = 0; i < rTags.length; i++) {
const oTag = oTags.find(pair => pair.Key === rTags[i].Key);
if (!oTag || rTags[i].Value !== oTag.Value) {
return false;
}
}
return true;
}
return bucketLCRules.filter(rule => {
if (rule.Status === 'Disabled') {
return false;
}
// check all locations where prefix could possibly be
const prefix = rule.Prefix
|| (rule.Filter && (rule.Filter.And
? rule.Filter.And.Prefix
: rule.Filter.Prefix));
if (prefix && !item.Key.startsWith(prefix)) {
return false;
}
if (!rule.Filter) {
return true;
}
const tags = rule.Filter.And
? rule.Filter.And.Tags
: (rule.Filter.Tag && [rule.Filter.Tag]);
if (tags && !deepCompare(tags, objTags.TagSet || [])) {
return false;
}
return true;
});
}
/**
* For all filtered rules, get rules that apply the earliest
* @param {array} rules - list of filtered rules that apply to a specific
* object, version, or upload
* @param {object} metadata - metadata about the object to transition
* @return {object} all applicable rules with earliest dates of action
* i.e. { Expiration: { Date: <DateObject>, Days: 10 },
* NoncurrentVersionExpiration: { NoncurrentDays: 5 } }
*/
getApplicableRules(rules, metadata) {
// Declare the current date before the reducing function so that all
// rule comparisons use the same date.
const currentDate = new Date();
/* eslint-disable no-param-reassign */
const applicableRules = rules.reduce((store, rule) => {
// filter and find earliest dates
if (rule.Expiration && this._supportedRules.includes('expiration')) {
if (!store.Expiration) {
store.Expiration = {};
}
if (rule.Expiration.Days) {
if (!store.Expiration.Days || rule.Expiration.Days
< store.Expiration.Days) {
store.Expiration.ID = rule.ID;
store.Expiration.Days = rule.Expiration.Days;
}
}
if (rule.Expiration.Date) {
if (!store.Expiration.Date || rule.Expiration.Date
< store.Expiration.Date) {
store.Expiration.ID = rule.ID;
store.Expiration.Date = rule.Expiration.Date;
}
}
const eodm = rule.Expiration.ExpiredObjectDeleteMarker;
if (eodm !== undefined) {
// preference for later rules in list of rules
store.Expiration.ID = rule.ID;
store.Expiration.ExpiredObjectDeleteMarker = eodm;
}
}
if (rule.NoncurrentVersionExpiration
&& this._supportedRules.includes('noncurrentVersionExpiration')) {
// Names are long, so obscuring a bit
const ncve = 'NoncurrentVersionExpiration';
const ncd = 'NoncurrentDays';
if (!store[ncve]) {
store[ncve] = {};
}
if (!store[ncve][ncd] || rule[ncve][ncd] < store[ncve][ncd]) {
store[ncve].ID = rule.ID;
store[ncve][ncd] = rule[ncve][ncd];
}
}
if (rule.AbortIncompleteMultipartUpload
&& this._supportedRules.includes('abortIncompleteMultipartUpload')) {
// Names are long, so obscuring a bit
const aimu = 'AbortIncompleteMultipartUpload';
const dai = 'DaysAfterInitiation';
if (!store[aimu]) {
store[aimu] = {};
}
if (!store[aimu][dai] || rule[aimu][dai] < store[aimu][dai]) {
store[aimu].ID = rule.ID;
store[aimu][dai] = rule[aimu][dai];
}
}
const hasTransitions = Array.isArray(rule.Transitions) && rule.Transitions.length > 0;
if (hasTransitions && this._supportedRules.includes('transitions')) {
store.Transition = this.getApplicableTransition({
transitions: rule.Transitions,
lastModified: metadata.LastModified,
store,
currentDate,
});
}
// TODO: Add support for NoncurrentVersionTransitions.
return store;
}, {});
// Do not transition to a location where the object is already stored.
if (applicableRules.Transition
&& applicableRules.Transition.StorageClass === metadata.StorageClass) {
applicableRules.Transition = undefined;
}
return applicableRules;
/* eslint-enable no-param-reassign */
}
}
module.exports = LifecycleUtils;
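The tag-subset check inside `filterRules` is the subtle part of rule filtering. This is a standalone restatement of the `deepCompare` helper above (renamed here for clarity; the tag values are illustrative):

```javascript
// Standalone version of deepCompare(): a rule applies only when every
// one of its tags appears, with the same value, on the object.
function ruleTagsMatchObject(ruleTags, objectTags) {
    if (ruleTags.length > objectTags.length) {
        return false; // rule asks for more tags than the object carries
    }
    return ruleTags.every(rTag => {
        const oTag = objectTags.find(pair => pair.Key === rTag.Key);
        return oTag !== undefined && oTag.Value === rTag.Value;
    });
}

const objTags = [{ Key: 'key1', Value: 'value1' }, { Key: 'key2', Value: 'value2' }];
// Rule tags are a subset of the object's tags: the rule applies.
const applies = ruleTagsMatchObject([{ Key: 'key1', Value: 'value1' }], objTags);
// Rule demands more tags than the object has: it does not apply.
const doesNotApply = ruleTagsMatchObject(objTags, [{ Key: 'key1', Value: 'value1' }]);
```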

View File

@@ -0,0 +1,4 @@
module.exports = {
LifecycleDateTime: require('./LifecycleDateTime'),
LifecycleUtils: require('./LifecycleUtils'),
};

View File

@@ -0,0 +1,112 @@
const { parseString } = require('xml2js');
const errors = require('../errors');
/*
Format of the xml request:
<LegalHold>
<Status>ON|OFF</Status>
</LegalHold>
*/
/**
* @param {string[]} status - legal hold status parsed from xml to be validated
* @return {Error|object} - legal hold status or error
*/
function _validateStatus(status) {
const validatedStatus = {};
const expectedValues = new Set(['OFF', 'ON']);
if (!status || status[0] === '') {
validatedStatus.error = errors.MalformedXML.customizeDescription(
'request xml does not contain Status');
return validatedStatus;
}
if (status.length > 1) {
validatedStatus.error = errors.MalformedXML.customizeDescription(
'request xml contains more than one Status');
return validatedStatus;
}
if (!expectedValues.has(status[0])) {
validatedStatus.error = errors.MalformedXML.customizeDescription(
'Status request xml must be one of "ON", "OFF"');
return validatedStatus;
}
validatedStatus.status = status[0];
return validatedStatus;
}
/**
* validate legal hold - validates legal hold xml
* @param {object} parsedXml - parsed legal hold xml object
* @return {object} - object with status or error
*/
function _validateLegalHold(parsedXml) {
const validatedLegalHold = {};
if (!parsedXml) {
validatedLegalHold.error = errors.MalformedXML.customizeDescription(
'request xml is undefined or empty');
return validatedLegalHold;
}
if (!parsedXml.LegalHold) {
validatedLegalHold.error = errors.MalformedXML.customizeDescription(
'request xml does not contain LegalHold');
return validatedLegalHold;
}
const validatedStatus = _validateStatus(parsedXml.LegalHold.Status);
if (validatedStatus.error) {
validatedLegalHold.error = validatedStatus.error;
return validatedLegalHold;
}
validatedLegalHold.status = validatedStatus.status;
return validatedLegalHold;
}
/**
* parse object legal hold - parse and validate xml body
* @param {string} xml - xml body to parse and validate
* @param {object} log - werelogs logger
* @param {function} cb - callback to server
* @return {undefined} - calls callback with legal hold status or error
*/
function parseLegalHoldXml(xml, log, cb) {
parseString(xml, (err, result) => {
if (err) {
log.debug('xml parsing failed', {
error: { message: err.message },
method: 'parseLegalHoldXml',
xml,
});
return cb(errors.MalformedXML);
}
const validatedLegalHold = _validateLegalHold(result);
const validatedLegalHoldStatus = validatedLegalHold.status === 'ON';
if (validatedLegalHold.error) {
log.debug('legal hold validation failed', {
error: { message: validatedLegalHold.error.message },
method: 'parseLegalHoldXml',
xml,
});
return cb(validatedLegalHold.error);
}
return cb(null, validatedLegalHoldStatus);
});
}
/**
* convert to xml - generates legal hold xml
* @param {(boolean|undefined)} legalHold - true if legal hold is on,
* false if legal hold is off, undefined if legal hold is not set
* @return {string} - returns legal hold xml
*/
function convertToXml(legalHold) {
if (!legalHold && legalHold !== false) {
return '';
}
const xml = '<?xml version="1.0" encoding="UTF-8" standalone="yes"?>' +
`<LegalHold><Status>${legalHold ? 'ON' : 'OFF'}</Status></LegalHold>`;
return xml;
}
module.exports = {
convertToXml,
parseLegalHoldXml,
};
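The three-way behavior of `convertToXml` (ON, OFF, or never set) is easy to miss because `false` is falsy. A standalone copy of the function from the file above, exercised over all three inputs:

```javascript
// Copied from the legal hold module above: undefined means "never set"
// and yields an empty body, while false still serializes as OFF.
function convertToXml(legalHold) {
    if (!legalHold && legalHold !== false) {
        return '';
    }
    return '<?xml version="1.0" encoding="UTF-8" standalone="yes"?>' +
        `<LegalHold><Status>${legalHold ? 'ON' : 'OFF'}</Status></LegalHold>`;
}

const on = convertToXml(true);      // contains <Status>ON</Status>
const off = convertToXml(false);    // contains <Status>OFF</Status>
const unset = convertToXml(undefined); // '' — legal hold was never set
```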

View File

@@ -0,0 +1,156 @@
const { parseString } = require('xml2js');
const constants = require('../constants');
const errors = require('../errors');
/*
Format of xml request:
<Retention>
<Mode>COMPLIANCE|GOVERNANCE</Mode>
<RetainUntilDate>2020-05-20T04:58:45.413000Z</RetainUntilDate>
</Retention>
*/
/**
* validateMode - validate retention mode
* @param {array} mode - parsed xml mode array
* @return {object} - contains mode or error
*/
function validateMode(mode) {
const modeObj = {};
const expectedModes = new Set(['GOVERNANCE', 'COMPLIANCE']);
if (!mode || !mode[0]) {
modeObj.error = errors.MalformedXML.customizeDescription(
'request xml does not contain Mode');
return modeObj;
}
if (mode.length > 1) {
modeObj.error = errors.MalformedXML.customizeDescription(
'request xml contains more than one Mode');
return modeObj;
}
if (!expectedModes.has(mode[0])) {
modeObj.error = errors.MalformedXML.customizeDescription(
'Mode request xml must be one of "GOVERNANCE", "COMPLIANCE"');
return modeObj;
}
modeObj.mode = mode[0];
return modeObj;
}
/**
* validateRetainDate - validate retain until date
* @param {array} retainDate - parsed xml retention date array
* @return {object} - contains retain until date or error
*/
function validateRetainDate(retainDate) {
const dateObj = {};
if (!retainDate || !retainDate[0]) {
dateObj.error = errors.MalformedXML.customizeDescription(
'request xml does not contain RetainUntilDate');
return dateObj;
}
if (!constants.shortIso8601Regex.test(retainDate[0]) &&
!constants.longIso8601Regex.test(retainDate[0])) {
dateObj.error = errors.InvalidRequest.customizeDescription(
'RetainUntilDate timestamp must be ISO-8601 format');
return dateObj;
}
const date = new Date(retainDate[0]);
if (date < Date.now()) {
dateObj.error = errors.InvalidRequest.customizeDescription(
'RetainUntilDate must be in the future');
return dateObj;
}
dateObj.date = retainDate[0];
return dateObj;
}
/**
* validate retention - validate retention xml
* @param {object} parsedXml - parsed retention xml object
* @return {object} - contains retention information on success,
* error on failure
*/
function validateRetention(parsedXml) {
const retentionObj = {};
if (!parsedXml) {
retentionObj.error = errors.MalformedXML.customizeDescription(
'request xml is undefined or empty');
return retentionObj;
}
const retention = parsedXml.Retention;
if (!retention) {
retentionObj.error = errors.MalformedXML.customizeDescription(
'request xml does not contain Retention');
return retentionObj;
}
const modeObj = validateMode(retention.Mode);
if (modeObj.error) {
retentionObj.error = modeObj.error;
return retentionObj;
}
const dateObj = validateRetainDate(retention.RetainUntilDate);
if (dateObj.error) {
retentionObj.error = dateObj.error;
return retentionObj;
}
retentionObj.mode = modeObj.mode;
retentionObj.date = dateObj.date;
return retentionObj;
}
/**
* parseRetentionXml - Parse and validate xml body, returning callback with
* object retentionObj: { mode: <value>, date: <value> }
* @param {string} xml - xml body to parse and validate
* @param {object} log - Werelogs logger
* @param {function} cb - callback to server
* @return {undefined} - calls callback with object retention or error
*/
function parseRetentionXml(xml, log, cb) {
parseString(xml, (err, result) => {
if (err) {
log.trace('xml parsing failed', {
error: err,
method: 'parseRetentionXml',
});
log.debug('invalid xml', { xml });
return cb(errors.MalformedXML);
}
const retentionObj = validateRetention(result);
if (retentionObj.error) {
log.debug('retention validation failed', {
error: retentionObj.error,
method: 'validateRetention',
xml,
});
return cb(retentionObj.error);
}
return cb(null, retentionObj);
});
}
/**
* convertToXml - Convert retention info object to xml
* @param {string} mode - retention mode
* @param {string} date - retention retain until date
* @return {string} - returns retention information xml string
*/
function convertToXml(mode, date) {
if (!(mode && date)) {
return '';
}
const xml = [];
xml.push('<Retention xmlns="http://s3.amazonaws.com/doc/2006-03-01/">');
xml.push(`<Mode>${mode}</Mode>`);
xml.push(`<RetainUntilDate>${date}</RetainUntilDate>`);
xml.push('</Retention>');
return xml.join('');
}
module.exports = {
parseRetentionXml,
convertToXml,
};
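A quick usage sketch of the retention `convertToXml` above; the function body is restated here so the snippet is self-contained (behavior matches the original: both `mode` and `date` are required, otherwise the result is empty), and the mode/date values are illustrative:

```javascript
// Self-contained restatement of the retention convertToXml above.
function convertToXml(mode, date) {
    if (!(mode && date)) {
        return ''; // both fields are required for a retention document
    }
    return '<Retention xmlns="http://s3.amazonaws.com/doc/2006-03-01/">' +
        `<Mode>${mode}</Mode><RetainUntilDate>${date}</RetainUntilDate></Retention>`;
}

const xml = convertToXml('GOVERNANCE', '2030-01-01T00:00:00.000Z');
const empty = convertToXml('GOVERNANCE', undefined); // '' — date missing
```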

View File

@@ -152,8 +152,6 @@ function routes(req, res, params, logger) {
const clientInfo = {
clientIP: req.socket.remoteAddress,
clientPort: req.socket.remotePort,
httpCode: res.statusCode,
httpMessage: res.statusMessage,
httpMethod: req.method,
httpURL: req.url,
endpoint: req.endpoint,

View File

@@ -45,6 +45,20 @@ function routeDELETE(request, response, api, log, statsClient) {
return routesUtils.responseNoBody(err, corsHeaders,
response, 204, log);
});
} else if (request.query.policy !== undefined) {
return api.callApiMethod('bucketDeletePolicy', request,
response, log, (err, corsHeaders) => {
routesUtils.statsReport500(err, statsClient);
return routesUtils.responseNoBody(err, corsHeaders,
response, 204, log);
});
} else if (request.query.encryption !== undefined) {
return api.callApiMethod('bucketDeleteEncryption', request,
response, log, (err, corsHeaders) => {
routesUtils.statsReport500(err, statsClient);
return routesUtils.responseNoBody(err, corsHeaders,
response, 204, log);
});
}
api.callApiMethod('bucketDelete', request, response, log,
(err, corsHeaders) => {

View File

@@ -51,11 +51,11 @@ function routerGET(request, response, api, log, statsClient, dataRetrievalFn) {
});
} else if (request.query.lifecycle !== undefined) {
api.callApiMethod('bucketGetLifecycle', request, response, log,
(err, xml, corsHeaders) => {
routesUtils.statsReport500(err, statsClient);
routesUtils.responseXMLBody(err, xml, response, log,
corsHeaders);
});
(err, xml, corsHeaders) => {
routesUtils.statsReport500(err, statsClient);
routesUtils.responseXMLBody(err, xml, response, log,
corsHeaders);
});
} else if (request.query.uploads !== undefined) {
// List MultipartUploads
api.callApiMethod('listMultipartUploads', request, response, log,
@@ -71,6 +71,34 @@ function routerGET(request, response, api, log, statsClient, dataRetrievalFn) {
return routesUtils.responseXMLBody(err, xml, response, log,
corsHeaders);
});
} else if (request.query.policy !== undefined) {
api.callApiMethod('bucketGetPolicy', request, response, log,
(err, xml, corsHeaders) => {
routesUtils.statsReport500(err, statsClient);
return routesUtils.responseXMLBody(err, xml, response,
log, corsHeaders);
});
} else if (request.query['object-lock'] !== undefined) {
api.callApiMethod('bucketGetObjectLock', request, response, log,
(err, xml, corsHeaders) => {
routesUtils.statsReport500(err, statsClient);
return routesUtils.responseXMLBody(err, xml, response,
log, corsHeaders);
});
} else if (request.query.notification !== undefined) {
api.callApiMethod('bucketGetNotification', request, response, log,
(err, xml, corsHeaders) => {
routesUtils.statsReport500(err, statsClient);
return routesUtils.responseXMLBody(err, xml, response,
log, corsHeaders);
});
} else if (request.query.encryption !== undefined) {
api.callApiMethod('bucketGetEncryption', request, response, log,
(err, xml, corsHeaders) => {
routesUtils.statsReport500(err, statsClient);
return routesUtils.responseXMLBody(err, xml, response,
log, corsHeaders);
});
} else {
// GET bucket
api.callApiMethod('bucketGet', request, response, log,
@@ -81,7 +109,6 @@ function routerGET(request, response, api, log, statsClient, dataRetrievalFn) {
});
}
} else {
/* eslint-disable no-lonely-if */
if (request.query.acl !== undefined) {
// GET object ACL
api.callApiMethod('objectGetACL', request, response, log,
@@ -90,8 +117,14 @@ function routerGET(request, response, api, log, statsClient, dataRetrievalFn) {
return routesUtils.responseXMLBody(err, xml, response, log,
corsHeaders);
});
} else if (request.query['legal-hold'] !== undefined) {
api.callApiMethod('objectGetLegalHold', request, response, log,
(err, xml, corsHeaders) => {
routesUtils.statsReport500(err, statsClient);
return routesUtils.responseXMLBody(err, xml, response, log,
corsHeaders);
});
} else if (request.query.tagging !== undefined) {
// GET object Tagging
api.callApiMethod('objectGetTagging', request, response, log,
(err, xml, corsHeaders) => {
routesUtils.statsReport500(err, statsClient);
@@ -106,6 +139,13 @@ function routerGET(request, response, api, log, statsClient, dataRetrievalFn) {
return routesUtils.responseXMLBody(err, xml, response, log,
corsHeaders);
});
} else if (request.query.retention !== undefined) {
api.callApiMethod('objectGetRetention', request, response, log,
(err, xml, corsHeaders) => {
routesUtils.statsReport500(err, statsClient);
return routesUtils.responseXMLBody(err, xml, response, log,
corsHeaders);
});
} else {
// GET object
api.callApiMethod('objectGet', request, response, log,
@@ -121,7 +161,6 @@ function routerGET(request, response, api, log, statsClient, dataRetrievalFn) {
range, log);
});
}
/* eslint-enable */
}
}

View File

@@ -16,7 +16,6 @@ function routePUT(request, response, api, log, statsClient) {
return routesUtils.responseNoBody(
errors.BadRequest, null, response, null, log);
}
// PUT bucket ACL
if (request.query.acl !== undefined) {
api.callApiMethod('bucketPutACL', request, response, log,
@@ -60,6 +59,34 @@ function routePUT(request, response, api, log, statsClient) {
routesUtils.responseNoBody(err, corsHeaders, response, 200,
log);
});
} else if (request.query.policy !== undefined) {
api.callApiMethod('bucketPutPolicy', request, response, log,
(err, corsHeaders) => {
routesUtils.statsReport500(err, statsClient);
routesUtils.responseNoBody(err, corsHeaders, response, 200,
log);
});
} else if (request.query['object-lock'] !== undefined) {
api.callApiMethod('bucketPutObjectLock', request, response, log,
(err, corsHeaders) => {
routesUtils.statsReport500(err, statsClient);
routesUtils.responseNoBody(err, corsHeaders, response, 200,
log);
});
} else if (request.query.notification !== undefined) {
api.callApiMethod('bucketPutNotification', request, response, log,
(err, corsHeaders) => {
routesUtils.statsReport500(err, statsClient);
routesUtils.responseNoBody(err, corsHeaders, response, 200,
log);
});
} else if (request.query.encryption !== undefined) {
api.callApiMethod('bucketPutEncryption', request, response, log,
(err, corsHeaders) => {
routesUtils.statsReport500(err, statsClient);
return routesUtils.responseNoBody(err, corsHeaders,
response, 200, log);
});
} else {
// PUT bucket
return api.callApiMethod('bucketPut', request, response, log,
@@ -73,8 +100,8 @@ function routePUT(request, response, api, log, statsClient) {
});
}
} else {
// PUT object, PUT object ACL, PUT object multipart or
// PUT object copy
// PUT object, PUT object ACL, PUT object multipart,
// PUT object copy or PUT object legal hold
// if content-md5 is not present in the headers, try to
// parse content-md5 from meta headers
@@ -132,6 +159,13 @@ function routePUT(request, response, api, log, statsClient) {
return routesUtils.responseNoBody(err, resHeaders,
response, 200, log);
});
} else if (request.query['legal-hold'] !== undefined) {
api.callApiMethod('objectPutLegalHold', request, response, log,
(err, resHeaders) => {
routesUtils.statsReport500(err, statsClient);
return routesUtils.responseNoBody(err, resHeaders,
response, 200, log);
});
} else if (request.query.tagging !== undefined) {
api.callApiMethod('objectPutTagging', request, response, log,
(err, resHeaders) => {
@@ -139,6 +173,13 @@ function routePUT(request, response, api, log, statsClient) {
return routesUtils.responseNoBody(err, resHeaders,
response, 200, log);
});
} else if (request.query.retention !== undefined) {
api.callApiMethod('objectPutRetention', request, response, log,
(err, resHeaders) => {
routesUtils.statsReport500(err, statsClient);
return routesUtils.responseNoBody(err, resHeaders,
response, 200, log);
});
} else if (request.headers['x-amz-copy-source']) {
return api.callApiMethod('objectCopy', request, response, log,
(err, xml, additionalHeaders) => {
@@ -160,7 +201,6 @@ function routePUT(request, response, api, log, statsClient) {
log.end().addDefaultFields({
contentLength: request.parsedContentLength,
});
api.callApiMethod('objectPut', request, response, log,
(err, resHeaders) => {
routesUtils.statsReport500(err, statsClient);

View File

@@ -6,6 +6,7 @@ const jsonStream = require('JSONStream');
const werelogs = require('werelogs');
const errors = require('../../../errors');
const jsutil = require('../../../jsutil');
class ListRecordStream extends stream.Transform {
constructor(logger) {
@@ -87,6 +88,7 @@ class LogConsumer {
readRecords(params, cb) {
const recordStream = new ListRecordStream(this.logger);
const _params = params || {};
const cbOnce = jsutil.once(cb);
this.bucketClient.getRaftLog(
this.raftSession, _params.startSeq, _params.limit,
@@ -96,26 +98,26 @@ class LogConsumer {
// no such raft session, log and ignore
this.logger.warn('raft session does not exist yet',
{ raftId: this.raftSession });
return cb(null, { info: { start: null,
return cbOnce(null, { info: { start: null,
end: null } });
}
if (err.code === 416) {
// requested range not satisfiable
this.logger.debug('no new log record to process',
{ raftId: this.raftSession });
return cb(null, { info: { start: null,
return cbOnce(null, { info: { start: null,
end: null } });
}
this.logger.error(
'Error handling record log request', { error: err });
return cb(err);
return cbOnce(err);
}
// setup a temporary listener until the 'header' event
// is emitted
recordStream.on('error', err => {
this.logger.error('error receiving raft log',
{ error: err.message });
return cb(errors.InternalError);
return cbOnce(errors.InternalError);
});
const jsonResponse = stream.pipe(jsonStream.parse('log.*'));
jsonResponse.pipe(recordStream);
@@ -124,7 +126,7 @@ class LogConsumer {
.on('header', header => {
// remove temporary listener
recordStream.removeAllListeners('error');
return cb(null, { info: header.info,
return cbOnce(null, { info: header.info,
log: recordStream });
})
.on('error', err => recordStream.emit('error', err));
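The diff above wraps `cb` with `jsutil.once` so that a late stream `'error'` cannot fire the callback a second time after `'header'` already did. A minimal sketch of what such a `once()` wrapper does (this is an illustration of the pattern, not Arsenal's actual `jsutil.once` implementation):

```javascript
// Minimal once() wrapper: the first call wins, later calls become no-ops.
function once(fn) {
    let called = false;
    return function onceWrapper(...args) {
        if (called) {
            return undefined; // swallow duplicate invocations
        }
        called = true;
        return fn(...args);
    };
}

let calls = 0;
const cbOnce = once(err => { calls += 1; return err ? 'error' : 'ok'; });
const first = cbOnce(null);                            // delivered: 'ok'
const second = cbOnce(new Error('late stream error')); // undefined, swallowed
```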

View File

@@ -6,6 +6,10 @@
// - rep_group_id 07 bytes replication group identifier
// - other_information arbitrary user input, such as a unique string
const base62Integer = require('base62');
const BASE62 = '0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ';
const base62String = require('base-x')(BASE62);
// the lengths of the components in bytes
const LENGTH_TS = 14; // timestamp: epoch in ms
const LENGTH_SEQ = 6; // position in ms slot
@@ -127,7 +131,7 @@ function generateVersionId(info, replicationGroupId) {
* @param {string} str - the versionId to encode
* @return {string} - the encoded versionId
*/
function encode(str) {
function hexEncode(str) {
return Buffer.from(str, 'utf8').toString('hex');
}
@@ -138,7 +142,7 @@ function encode(str) {
* @param {string} str - the encoded versionId to decode
* @return {(string|Error)} - the decoded versionId or an error
*/
function decode(str) {
function hexDecode(str) {
try {
const result = Buffer.from(str, 'hex').toString('utf8');
if (result === '') {
@@ -152,4 +156,98 @@ function decode(str) {
}
}
module.exports = { generateVersionId, getInfVid, encode, decode };
/* base62 version Ids constants:
*
* Note: base62Integer() cannot encode integers larger than 15 digits
* so we assume that B62V_TOTAL <= 30 and we cut it in half. Please
* revisit if B62V_TOTAL is greater than 30.
*/
const B62V_TOTAL = LENGTH_TS + LENGTH_SEQ;
const B62V_HALF = B62V_TOTAL / 2;
const B62V_EPAD = '0'.repeat(Math.ceil(B62V_HALF * (Math.log(10) / Math.log(62))));
const B62V_DPAD = '0'.repeat(B62V_HALF);
/**
* Encode a versionId in base62 to obscure the internal information
* it contains (the encoded output is less than 40 bytes).
*
* @param {string} str - the versionId to encode
* @return {string} - the encoded base62VersionId
*/
function base62Encode(str) {
const part1 = Number(str.substring(0, B62V_HALF));
const part2 = Number(str.substring(B62V_HALF, B62V_TOTAL));
const part3 = Buffer.from(str.substring(B62V_TOTAL));
const enc1 = base62Integer.encode(part1);
const enc2 = base62Integer.encode(part2);
const enc3 = base62String.encode(part3);
return (B62V_EPAD + enc1).slice(-B62V_EPAD.length) +
(B62V_EPAD + enc2).slice(-B62V_EPAD.length) +
enc3;
}
/**
* Decode a base62VersionId. May return an error if the input is not a
* valid base62 string or decodes to an invalid value.
*
* @param {string} str - the encoded base62VersionId to decode
* @return {(string|Error)} - the decoded versionId or an error
*/
function base62Decode(str) {
try {
let start = 0;
const enc1 = str.substring(start, start + B62V_EPAD.length);
const orig1 = base62Integer.decode(enc1);
start += B62V_EPAD.length;
const enc2 = str.substring(start, start + B62V_EPAD.length);
const orig2 = base62Integer.decode(enc2);
start += B62V_EPAD.length;
const enc3 = str.substring(start);
const orig3 = base62String.decode(enc3);
return (B62V_DPAD + orig1.toString()).slice(-B62V_DPAD.length) +
(B62V_DPAD + orig2.toString()).slice(-B62V_DPAD.length) +
orig3.toString();
} catch (err) {
// in case of exceptions caused by base62 libs
return err;
}
}
const ENC_TYPE_HEX = 0; // legacy (large) encoding
const ENC_TYPE_BASE62 = 1; // new (tiny) encoding
/**
* Encode a versionId to obscure internal information contained
* in a version ID.
*
* @param {string} str - the versionId to encode
* @param {Number} [encType=ENC_TYPE_HEX] - encode format
* @return {string} - the encoded versionId
*/
function encode(str, encType = ENC_TYPE_HEX) {
if (encType === ENC_TYPE_BASE62) {
return base62Encode(str);
}
return hexEncode(str);
}
/**
* Decode a versionId. May return an error if the input string is in an
* invalid format or decodes to an invalid value. The function
* automatically determines the encoding format with a heuristic.
*
* @param {string} str - the encoded versionId to decode
* @return {(string|Error)} - the decoded versionId or an error
*/
function decode(str) {
if (str.length < 40) {
return base62Decode(str);
}
return hexDecode(str);
}
module.exports = { generateVersionId, getInfVid,
hexEncode, hexDecode,
base62Encode, base62Decode,
encode, decode,
ENC_TYPE_HEX, ENC_TYPE_BASE62 };
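The base62 scheme above encodes the 20-digit numeric head of a version ID as two padded 10-digit halves. The padding width computed by `B62V_EPAD`, and the `length < 40` heuristic used by `decode()`, can be checked with a small standalone sketch (plain Node.js; the names here are invented for illustration and are not the library code):

```javascript
// Mirror the constants from the diff above.
const LENGTH_TS = 14;  // timestamp digits
const LENGTH_SEQ = 6;  // sequence digits
const B62V_TOTAL = LENGTH_TS + LENGTH_SEQ; // 20 decimal digits
const B62V_HALF = B62V_TOTAL / 2;          // 10 digits per half

// Each half of 10 decimal digits needs ceil(10 * log62(10)) base62 digits.
const padWidth = Math.ceil(B62V_HALF * (Math.log(10) / Math.log(62)));

// Minimal base62 conversion for non-negative integers (illustrative only).
const ALPHABET =
    '0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ';
function toBase62(n) {
    let s = '';
    do {
        s = ALPHABET[n % 62] + s;
        n = Math.floor(n / 62);
    } while (n > 0);
    return s;
}

// The largest 10-digit half still fits in padWidth base62 characters.
const maxHalf = 10 ** B62V_HALF - 1;
console.log(padWidth);                  // 6
console.log(toBase62(maxHalf).length);  // 6
```

Note that hex-encoding just the 20-character numeric head already yields 40 characters (2 hex characters per byte), so any hex-encoded version ID is at least 40 characters long, which is why `decode()` can safely treat shorter strings as base62.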


@@ -3,7 +3,7 @@
"engines": {
"node": ">=16"
},
"version": "7.4.13",
"version": "7.10.4",
"description": "Common utilities for the S3 project components",
"main": "index.js",
"repository": {
@@ -22,6 +22,8 @@
"agentkeepalive": "^4.1.3",
"ajv": "6.12.2",
"async": "~2.1.5",
"base62": "2.0.1",
"base-x": "3.0.8",
"debug": "~2.6.9",
"diskusage": "^1.1.1",
"ioredis": "4.9.5",
@@ -31,7 +33,7 @@
"node-forge": "^0.7.1",
"prom-client": "10.2.3",
"simple-glob": "^0.2",
"socket.io": "~2.3.0",
"socket.io": "~2.4.0",
"socket.io-client": "~2.3.0",
"utf8": "2.1.2",
"uuid": "^3.0.1",
@@ -48,6 +50,7 @@
"eslint-config-scality": "scality/Guidelines#ec33dfb",
"eslint-plugin-react": "^4.3.0",
"mocha": "8.0.1",
"sinon": "^9.0.2",
"temp": "0.9.1"
},
"scripts": {


@@ -0,0 +1,118 @@
'use strict'; // eslint-disable-line strict
const assert = require('assert');
const crypto = require('crypto');
const async = require('async');
const TTLVCodec = require('../../../lib/network/kmip/codec/ttlv.js');
const LoopbackServerChannel =
require('../../utils/kmip/LoopbackServerChannel.js');
const TransportTemplate =
require('../../../lib/network/kmip/transport/TransportTemplate.js');
const TlsTransport =
require('../../../lib/network/kmip/transport/tls.js');
const KMIP = require('../../../lib/network/kmip');
const KMIPClient = require('../../../lib/network/kmip/Client.js');
const {
logger,
} = require('../../utils/kmip/ersatz.js');
class LoopbackServerTransport extends TransportTemplate {
constructor(options) {
super(new LoopbackServerChannel(KMIP, TTLVCodec), options);
}
}
describe('KMIP High Level Driver', () => {
[null, 'dummyAttributeName'].forEach(bucketNameAttributeName => {
[false, true].forEach(compoundCreateActivate => {
const options = {
kmip: {
client: {
bucketNameAttributeName,
compoundCreateActivate,
},
codec: {},
transport: {
pipelineDepth: 8,
tls: {
port: 5696,
},
},
},
};
it('should work with' +
` x-name attribute: ${!!bucketNameAttributeName},` +
` compound creation: ${compoundCreateActivate}`,
done => {
const kmipClient = new KMIPClient(options, TTLVCodec,
LoopbackServerTransport);
const plaintext = Buffer.from(crypto.randomBytes(32));
async.waterfall([
next => kmipClient.createBucketKey('plop', logger, next),
(id, next) =>
kmipClient.cipherDataKey(1, id, plaintext,
logger, (err, ciphered) => {
next(err, id, ciphered);
}),
(id, ciphered, next) =>
kmipClient.decipherDataKey(
1, id, ciphered, logger, (err, deciphered) => {
assert(plaintext
.compare(deciphered) === 0);
next(err, id);
}),
(id, next) =>
kmipClient.destroyBucketKey(id, logger, next),
], done);
});
});
});
it('should succeed healthcheck with working KMIP client and server', done => {
const options = {
kmip: {
client: {
bucketNameAttributeName: null,
compoundCreateActivate: false,
},
codec: {},
transport: {
pipelineDepth: 8,
tls: {
port: 5696,
},
},
},
};
const kmipClient = new KMIPClient(options, TTLVCodec,
LoopbackServerTransport);
kmipClient.healthcheck(logger, err => {
assert.ifError(err);
done();
});
});
it('should fail healthcheck with KMIP server not running', done => {
const options = {
kmip: {
client: {
bucketNameAttributeName: null,
compoundCreateActivate: false,
},
codec: {},
transport: {
pipelineDepth: 8,
tls: {
port: 5696,
},
},
},
};
const kmipClient = new KMIPClient(options, TTLVCodec, TlsTransport);
kmipClient.healthcheck(logger, err => {
assert(err);
assert(err.InternalError);
assert(err.description.includes('ECONNREFUSED'));
done();
});
});
});


@@ -0,0 +1,52 @@
'use strict'; // eslint-disable-line strict
const assert = require('assert');
const TTLVCodec = require('../../../lib/network/kmip/codec/ttlv.js');
const TransportTemplate =
require('../../../lib/network/kmip/transport/TransportTemplate.js');
const KMIP = require('../../../lib/network/kmip');
const {
logger,
MirrorChannel,
} = require('../../utils/kmip/ersatz.js');
const lowlevelFixtures = require('../../utils/kmip/lowlevelFixtures.js');
class MirrorTransport extends TransportTemplate {
constructor(options) {
super(new MirrorChannel(KMIP, TTLVCodec), options);
}
}
const options = {
kmip: {
codec: {},
transport: {
pipelineDepth: 8,
tls: {
port: 5696,
},
},
},
};
describe('KMIP Low Level Driver', () => {
lowlevelFixtures.forEach((fixture, n) => {
it(`should work with fixture #${n}`, done => {
const kmip = new KMIP(TTLVCodec, MirrorTransport, options);
const requestPayload = fixture.payload(kmip);
kmip.request(logger, fixture.operation,
requestPayload, (err, response) => {
if (err) {
return done(err);
}
const responsePayload = response.lookup(
'Response Message/Batch Item/Response Payload'
)[0];
assert.deepStrictEqual(responsePayload,
requestPayload);
return done();
});
});
});
});


@@ -0,0 +1,43 @@
const assert = require('assert');
const net = require('net');
const tls = require('tls');
const TransportTemplate =
require('../../../lib/network/kmip/transport/TransportTemplate.js');
const { logger } = require('../../utils/kmip/ersatz.js');
describe('KMIP Connection Management', () => {
let server;
before(done => {
server = net.createServer(conn => {
// abort the connection as soon as it is accepted
conn.destroy();
});
server.listen(5696);
server.on('listening', done);
});
after(done => {
server.close(done);
});
it('should gracefully handle connection errors', done => {
const transport = new TransportTemplate(
tls,
{
pipelineDepth: 1,
tls: {
port: 5696,
},
});
const request = Buffer.alloc(10).fill(6);
/* The server aborts the connection as soon as it is accepted,
* so the callback gets stuck in the conversation queue and is
* unwound with an error. That is the purpose of this test. */
transport.send(logger, request, (err, conversation, response) => {
assert(err);
assert(!response);
done();
});
transport.end();
});
});


@@ -0,0 +1,82 @@
'use strict'; // eslint-disable-line
const async = require('async');
const assert = require('assert');
const TransportTemplate =
require('../../../lib/network/kmip/transport/TransportTemplate.js');
const { logger, EchoChannel } = require('../../utils/kmip/ersatz.js');
describe('KMIP Transport Template Class', () => {
const pipelineDepths = [1, 2, 4, 8, 16, 32];
const requestNumbers = [1, 37, 1021, 8191];
pipelineDepths.forEach(pipelineDepth => {
requestNumbers.forEach(iterations => {
it(`should survive ${iterations} iterations` +
` with ${pipelineDepth}-way pipeline`,
done => {
const transport = new TransportTemplate(
new EchoChannel,
{
pipelineDepth,
tls: {
port: 5696,
},
});
const request = Buffer.alloc(10).fill(6);
async.times(iterations, (n, next) => {
transport.send(logger, request,
(err, conversation, response) => {
if (err) {
return next(err);
}
if (request.compare(response) !== 0) {
return next(Error('arg'));
}
return next();
});
}, err => {
transport.end();
done(err);
});
});
[true, false].forEach(doEmit => {
it('should report errors to outstanding requests.' +
` w:${pipelineDepth}, i:${iterations}, e:${doEmit}`,
done => {
const echoChannel = new EchoChannel;
echoChannel.clog();
const transport = new TransportTemplate(
echoChannel,
{
pipelineDepth,
tls: {
port: 5696,
},
});
const request = Buffer.alloc(10).fill(6);
/* By using a for loop here instead of anything asynchronous,
* the callbacks get stuck in the conversation queue and are
* unwound with an error. That is the purpose of this test. */
for (let i = 0; i < iterations; ++i) {
transport.send(
logger, request,
(err, conversation, response) => {
assert(err);
assert(!response);
});
}
if (doEmit) {
echoChannel.emit('error', new Error('awesome'));
} else {
transport.abortPipeline(echoChannel);
}
transport.end();
done();
});
});
});
});
});


@@ -65,4 +65,63 @@ describe('Basic listing algorithm', () => {
Basic, { maxKeys: 1 }, logger);
assert.deepStrictEqual(res, ['key1']);
});
const attr1 = {
key: 'key1',
value: '{"foo": "bar"}',
};
const attr2 = {
key: 'key2',
value: '{"customAttributes": {"foo": "bar"}}',
};
const attr3 = {
key: 'key3',
value: `{"customAttributes": {
"cd_tenant_id%3D%3D6a84c782-8766-11eb-b0a1-d7238b6e9579": "",
"cd_tenant_id%3D%3Dc486659c-8761-11eb-87c2-8b0faea3c595": ""
}}`,
};
const attr4 = {
key: 'key4',
value: `{"customAttributes": {
"cd_tenant_id%3D%3D6a84c782-8766-11eb-b0a1-d7238b6e9579": ""
}}`,
};
const input = [attr1, attr2, attr3, attr4];
it('Shall ignore custom attributes if no filter is specified', () => {
const output = input;
const res = performListing(
input, Basic,
{},
logger);
assert.deepStrictEqual(res, output);
});
it('Shall report nothing if filter does not match', () => {
const output = [];
const res = performListing(
input, Basic,
{ filterKey: 'do not exist' },
logger);
assert.deepStrictEqual(res, output);
});
it('Shall find key in custom attributes', () => {
const output = [attr3];
const res = performListing(
input, Basic,
{ filterKey: 'cd_tenant_id%3D%3Dc486659c-8761-11eb-87c2-8b0faea3c595' },
logger);
assert.deepStrictEqual(res, output);
});
it('Shall find key starting with a prefix in custom attributes', () => {
const output = [attr3, attr4];
const res = performListing(
input, Basic,
{ filterKeyStartsWith: 'cd_tenant_id%3D%3D' },
logger);
assert.deepStrictEqual(res, output);
});
});
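The four tests above pin down the custom-attribute filter semantics: `filterKey` requires an exact key match inside the entry's `customAttributes`, while `filterKeyStartsWith` requires a prefix match, and entries without `customAttributes` are dropped whenever a filter is given. A minimal sketch of that predicate (the helper name `matchesFilter` is invented here; this is not the Arsenal implementation):

```javascript
// Return true when an entry passes the (optional) custom-attribute filter.
function matchesFilter(entry, params) {
    if (!params.filterKey && !params.filterKeyStartsWith) {
        return true; // no filter specified: keep everything
    }
    let attrs;
    try {
        attrs = JSON.parse(entry.value).customAttributes;
    } catch (e) {
        return false; // unparseable value cannot match
    }
    if (!attrs) {
        return false; // no customAttributes: filtered out
    }
    const keys = Object.keys(attrs);
    if (params.filterKey) {
        return keys.includes(params.filterKey); // exact key match
    }
    return keys.some(k => k.startsWith(params.filterKeyStartsWith));
}

const input = [
    { key: 'key1', value: '{"foo": "bar"}' },
    { key: 'key2',
        value: '{"customAttributes": {"cd_tenant_id%3D%3Dabc": ""}}' },
];
const out = input.filter(e =>
    matchesFilter(e, { filterKeyStartsWith: 'cd_tenant_id' }));
console.log(out.map(e => e.key)); // [ 'key2' ]
```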


@@ -16,6 +16,11 @@ const expectCanId =
const searchEmail2 = 'sampleaccount4@sampling.com';
const expectCanId2 = 'newCanId';
const searchCanId = '79a59df900b949e55d96a1e698fbacedfd6e09d98eacf8f8d5218e7cd47ef2be';
const expectAccountId = '123456789012';
const invalidAccountId = 'doesnotexist';
describe('S3 in_memory auth backend', () => {
it('should find an account', done => {
const backend = new Backend(JSON.parse(JSON.stringify(ref)));
@@ -26,6 +31,26 @@ describe('S3 in_memory auth backend', () => {
});
});
it('should find an account\'s accountId from canonicalId', done => {
const backend = new Backend(JSON.parse(JSON.stringify(ref)));
backend.getAccountIds([searchCanId], log, (err, res) => {
assert.ifError(err);
assert.strictEqual(res.message.body[searchCanId],
expectAccountId);
done();
});
});
it('should return "Not Found" for missing accounts', done => {
const backend = new Backend(JSON.parse(JSON.stringify(ref)));
backend.getAccountIds([invalidAccountId], log, (err, res) => {
assert.ifError(err);
assert.strictEqual(res.message.body[invalidAccountId],
'Not Found');
done();
});
});
it('should clear old account authdata on refresh', done => {
const backend = new Backend(JSON.parse(JSON.stringify(ref)));
backend.refreshAuthData(obj2);


@@ -1,4 +1,5 @@
const assert = require('assert');
const sinon = require('sinon');
const queryAuthCheck =
require('../../../../lib/auth/v2/queryAuthCheck').check;
@@ -26,3 +27,97 @@ describe('v2: queryAuthCheck', () => {
}
}));
});
describe('v2: queryAuthCheck', () => {
let clock;
beforeEach(() => {
clock = sinon.useFakeTimers();
});
afterEach(() => {
process.env.PRE_SIGN_URL_EXPIRY = 604800000;
clock.restore();
});
it('URL should not expire before 7 days with default expiry', () => {
const currentTime = Date.now() / 1000;
const expires = currentTime + 604799; // in seconds
const mockRequest = {
method: 'GET',
url: 'mockurl',
query: {
Expires: expires,
},
headers: {
'Content-MD5': 'c',
},
};
const data = {
Expires: expires,
AWSAccessKeyId: 'keyId',
Signature: 'sign',
};
const res = queryAuthCheck(mockRequest, log, data);
assert.notStrictEqual(res.err.AccessDenied, true);
assert.notStrictEqual(res.err.RequestTimeTooSkewed, true);
});
it('URL should expire after 7 days with default expiry', () => {
clock.tick(604800000); // take time 604800000ms (7 days) ahead
const currentTime = Date.now();
const request = { method: 'GET', query: { Expires: currentTime } };
const data = { Expires: currentTime };
const res = queryAuthCheck(request, log, data);
assert.notStrictEqual(res.err, null);
assert.notStrictEqual(res.err, undefined);
assert.strictEqual(res.err.AccessDenied, true);
});
it('URL should not expire before 7 days with custom expiry', () => {
process.env.PRE_SIGN_URL_EXPIRY = 31556952000; // in ms (1 year)
const currentTime = Date.now() / 1000;
const expires = currentTime + 604799; // in seconds
const mockRequest = {
method: 'GET',
url: 'mockurl',
query: {
Expires: expires,
},
headers: {
'Content-MD5': 'c',
},
};
const data = {
Expires: expires,
AWSAccessKeyId: 'keyId',
Signature: 'sign',
};
const res = queryAuthCheck(mockRequest, log, data);
assert.notStrictEqual(res.err.AccessDenied, true);
assert.notStrictEqual(res.err.RequestTimeTooSkewed, true);
});
it('URL should still not expire after 7 days with custom expiry', () => {
clock.tick(604800000); // take time 604800000ms (7 days) ahead
process.env.PRE_SIGN_URL_EXPIRY = 31556952000; // in ms (1 year)
const currentTime = Date.now() / 1000;
const request = { method: 'GET', query: { Expires: currentTime } };
const data = { Expires: currentTime };
const res = queryAuthCheck(request, log, data);
assert.notStrictEqual(res.err.AccessDenied, true);
});
it('should return RequestTimeTooSkewed with current time > expiry', () => {
clock.tick(123);
const expires = 0;
const request = { method: 'GET', query: { Expires: expires } };
const data = { Expires: expires };
const res = queryAuthCheck(request, log, data);
assert.notStrictEqual(res.err, null);
assert.notStrictEqual(res.err, undefined);
assert.strictEqual(res.err.RequestTimeTooSkewed, true);
});
it('should return MissingSecurityHeader with invalid expires param', () => {
const request = { method: 'GET', query: { Expires: 'a string' } };
const data = { Expires: 'a string' };
const res = queryAuthCheck(request, log, data);
assert.notStrictEqual(res.err, null);
assert.notStrictEqual(res.err, undefined);
assert.strictEqual(res.err.MissingSecurityHeader, true);
});
});
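The expiry behaviour these tests assert can be summarised in a small sketch. This is inferred from the test expectations only, not from the actual `queryAuthCheck` code (which also validates signatures and headers); `checkExpiry` is an invented helper. `Expires` is in epoch seconds, `PRE_SIGN_URL_EXPIRY` in milliseconds:

```javascript
// Sketch of the pre-signed URL expiry rules the tests exercise:
// - non-numeric Expires          -> MissingSecurityHeader
// - current time past Expires    -> RequestTimeTooSkewed
// - Expires further out than the
//   configured window             -> AccessDenied
function checkExpiry(expires, nowMs, preSignUrlExpiryMs) {
    if (Number.isNaN(Number.parseInt(expires, 10))) {
        return { MissingSecurityHeader: true };
    }
    const expiresMs = Number(expires) * 1000; // Expires is epoch seconds
    if (nowMs > expiresMs) {
        return { RequestTimeTooSkewed: true };
    }
    if (expiresMs - nowMs > preSignUrlExpiryMs) {
        return { AccessDenied: true };
    }
    return null; // URL still valid
}

const WEEK_MS = 604800000; // default 7-day window

// 604799s ahead of "now" with the default window: accepted
console.log(checkExpiry(604799, 0, WEEK_MS)); // null
// current time already past expiry: RequestTimeTooSkewed
console.log(checkExpiry(0, 123, WEEK_MS));
// non-numeric Expires parameter: MissingSecurityHeader
console.log(checkExpiry('a string', 0, WEEK_MS));
```

Raising `PRE_SIGN_URL_EXPIRY` (e.g. to one year, as in the custom-expiry tests) only widens the `AccessDenied` window; a URL whose `Expires` lies in the past is rejected regardless.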


@@ -0,0 +1,172 @@
'use strict'; // eslint-disable-line strict
/* eslint new-cap: "off" */
const assert = require('assert');
const TTLVCodec = require('../../../lib/network/kmip/codec/ttlv.js');
const KMIP = require('../../../lib/network/kmip');
const ttlvFixtures = require('../../utils/kmip/ttlvFixtures');
const badTtlvFixtures = require('../../utils/kmip/badTtlvFixtures');
const messageFixtures = require('../../utils/kmip/messageFixtures');
const { logger } = require('../../utils/kmip/ersatz.js');
function newKMIP() {
return new KMIP(TTLVCodec,
class DummyTransport {},
{ kmip: {} }, () => {});
}
describe('KMIP TTLV Codec', () => {
it('should map, encode and decode an extension', done => {
const kmip = newKMIP();
kmip.mapExtension('Dummy Extension', 0x54a000);
const msg = KMIP.Message([
KMIP.TextString('Dummy Extension', 'beautiful'),
]);
const encodedMsg = kmip._encodeMessage(msg);
const decodedMsg = kmip._decodeMessage(logger, encodedMsg);
assert.deepStrictEqual(msg, decodedMsg);
done();
});
ttlvFixtures.forEach((item, idx) => {
['request', 'response'].forEach(fixture => {
it(`should decode the TTLV ${fixture} fixture[${idx}]`, done => {
const kmip = newKMIP();
const msg = kmip._decodeMessage(logger, item[fixture]);
if (!item.degenerated) {
const encodedMsg = kmip._encodeMessage(msg);
assert(encodedMsg.compare(item[fixture]) === 0);
}
done();
});
});
});
it('should validate supported operations', done => {
const kmip = newKMIP();
const msg = kmip._decodeMessage(logger, ttlvFixtures[1].response);
const supportedOperations =
msg.lookup('Response Message/Batch Item/' +
'Response Payload/Operation');
const supportedObjectTypes =
msg.lookup('Response Message/Batch Item/' +
'Response Payload/Object Type');
const protocolVersionMajor =
msg.lookup('Response Message/Response Header/' +
'Protocol Version/Protocol Version Major');
const protocolVersionMinor =
msg.lookup('Response Message/Response Header/' +
'Protocol Version/Protocol Version Minor');
assert(supportedOperations.includes('Encrypt'));
assert(supportedOperations.includes('Decrypt'));
assert(supportedOperations.includes('Create'));
assert(supportedOperations.includes('Destroy'));
assert(supportedOperations.includes('Query'));
assert(supportedObjectTypes.includes('Symmetric Key'));
assert(protocolVersionMajor[0] >= 2 ||
(protocolVersionMajor[0] === 1 &&
protocolVersionMinor[0] >= 2));
done();
});
it('should detect unsupported operations', done => {
const kmip = newKMIP();
const msg = kmip._decodeMessage(logger, ttlvFixtures[2].response);
const supportedOperations =
msg.lookup('Response Message/Batch Item/' +
'Response Payload/Operation');
assert(!supportedOperations.includes('Encrypt'));
assert(!supportedOperations.includes('Decrypt'));
done();
});
it('should support non canonical search path', done => {
const kmip = newKMIP();
const msg = kmip._decodeMessage(logger, ttlvFixtures[1].response);
const supportedOperations =
msg.lookup('/Response Message/Batch Item/' +
'Response Payload/Operation');
const supportedObjectTypes =
msg.lookup('Response Message/Batch Item/' +
'Response Payload/Object Type/');
const protocolVersionMajor =
msg.lookup('Response Message//Response Header///' +
'Protocol Version////Protocol Version Major');
const protocolVersionMinor =
msg.lookup('/Response Message////Response Header///' +
'Protocol Version//Protocol Version Minor/');
assert(supportedOperations.includes('Encrypt'));
assert(supportedOperations.includes('Decrypt'));
assert(supportedOperations.includes('Create'));
assert(supportedOperations.includes('Destroy'));
assert(supportedOperations.includes('Query'));
assert(supportedObjectTypes.includes('Symmetric Key'));
assert(protocolVersionMajor[0] >= 2 ||
(protocolVersionMajor[0] === 1 &&
protocolVersionMinor[0] >= 2));
done();
});
it('should return nothing with an empty search path', done => {
const kmip = newKMIP();
const msg = kmip._decodeMessage(logger, ttlvFixtures[2].response);
const empty1 = msg.lookup('');
const empty2 = msg.lookup('////');
assert(empty1.length === 0);
assert(empty2.length === 0);
done();
});
it('should encode/decode a bit mask', done => {
const kmip = newKMIP();
const usageMask = ['Encrypt', 'Decrypt', 'Export'];
const decodedMask =
kmip.decodeMask('Cryptographic Usage Mask',
kmip.encodeMask('Cryptographic Usage Mask',
usageMask));
assert.deepStrictEqual(usageMask.sort(), decodedMask.sort());
done();
});
it('should detect invalid bit name', done => {
const kmip = newKMIP();
const usageMask = ['Encrypt', 'Decrypt', 'Exprot'];
try {
kmip.encodeMask('Cryptographic Usage Mask', usageMask);
done(Error('Must not succeed'));
} catch (e) {
done();
}
});
messageFixtures.forEach((item, idx) => {
it(`should encode the KMIP message fixture[${idx}]`, done => {
const kmip = newKMIP();
const encodedMessage = kmip._encodeMessage(item);
const decodedMessage = kmip._decodeMessage(logger, encodedMessage);
assert.deepStrictEqual(item.content, decodedMessage.content);
done();
});
});
badTtlvFixtures.forEach((rawMessage, idx) => {
it(`should fail to parse invalid TTLV message fixture[${idx}]`,
done => {
const kmip = newKMIP();
try {
kmip._decodeMessage(logger, rawMessage);
done(Error('Must not succeed'));
} catch (e) {
done();
}
});
});
});
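The "encode/decode a bit mask" and "detect invalid bit name" tests above can be illustrated with a generic round-trip sketch: names map to bit positions, are OR-ed into an integer, then recovered by testing each bit. The bit values below are illustrative assumptions, not necessarily the codec's actual Cryptographic Usage Mask table:

```javascript
// Hypothetical name -> bit table (assumed values for illustration).
const MASK_BITS = { Encrypt: 0x4, Decrypt: 0x8, Export: 0x40 };

// OR the named bits together; unknown names throw, like the
// "should detect invalid bit name" test expects.
function encodeMask(names) {
    return names.reduce((mask, name) => {
        if (!(name in MASK_BITS)) {
            throw new Error(`invalid bit name: ${name}`);
        }
        return mask | MASK_BITS[name];
    }, 0);
}

// Recover the names whose bit is set in the mask.
function decodeMask(mask) {
    return Object.keys(MASK_BITS).filter(name => mask & MASK_BITS[name]);
}

const usageMask = ['Encrypt', 'Decrypt', 'Export'];
const roundTrip = decodeMask(encodeMask(usageMask));
console.log(roundTrip.sort()); // [ 'Decrypt', 'Encrypt', 'Export' ]
// A misspelled name such as 'Exprot' throws, mirroring the test above.
```

The tests compare the sorted arrays because decoding only guarantees the set of names, not their original order.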


@@ -115,8 +115,56 @@ const testLifecycleConfiguration = {
},
],
};
// create a dummy bucket to test getters and setters
const testUid = '99ae3446-7082-4c17-ac97-52965dc004ec';
const testBucketPolicy = {
Version: '2012-10-17',
Statement: [
{
Effect: 'Allow',
Principal: '*',
Resource: 'arn:aws:s3:::examplebucket',
Action: 's3:*',
},
],
};
const testobjectLockEnabled = false;
const testObjectLockConfiguration = {
rule: {
mode: 'GOVERNANCE',
days: 1,
},
};
const testNotificationConfiguration = {
queueConfig: [
{
events: ['s3:ObjectCreated:*'],
queueArn: 'arn:scality:bucketnotif:::target1',
filterRules: [
{
name: 'prefix',
value: 'logs/',
},
{
name: 'suffix',
value: '.log',
},
],
id: 'test-queue-config-1',
},
{
events: ['s3:ObjectRemoved:Delete', 's3:ObjectCreated:Copy'],
queueArn: 'arn:scality:bucketnotif:::target2',
id: 'test-queue-config-2',
},
],
};
// create a dummy bucket to test getters and setters
Object.keys(acl).forEach(
aclObj => describe(`different acl configurations : ${aclObj}`, () => {
const dummyBucket = new BucketInfo(
@@ -132,7 +180,12 @@ Object.keys(acl).forEach(
testWebsiteConfiguration,
testCorsConfiguration,
testReplicationConfiguration,
testLifecycleConfiguration);
testLifecycleConfiguration,
testBucketPolicy,
testUid,
testobjectLockEnabled,
testObjectLockConfiguration,
testNotificationConfiguration);
describe('serialize/deSerialize on BucketInfo class', () => {
const serialized = dummyBucket.serialize();
@@ -158,6 +211,12 @@ Object.keys(acl).forEach(
dummyBucket._replicationConfiguration,
lifecycleConfiguration:
dummyBucket._lifecycleConfiguration,
bucketPolicy: dummyBucket._bucketPolicy,
uid: dummyBucket._uid,
objectLockEnabled: dummyBucket._objectLockEnabled,
objectLockConfiguration:
dummyBucket._objectLockConfiguration,
notificationConfiguration: dummyBucket._notificationConfiguration,
};
assert.strictEqual(serialized, JSON.stringify(bucketInfos));
done();
@@ -174,15 +233,16 @@ Object.keys(acl).forEach(
});
describe('constructor', () => {
it('this should have the right BucketInfo types',
() => {
assert.strictEqual(typeof dummyBucket.getName(), 'string');
assert.strictEqual(typeof dummyBucket.getOwner(), 'string');
assert.strictEqual(typeof dummyBucket.getOwnerDisplayName(),
'string');
assert.strictEqual(typeof dummyBucket.getCreationDate(),
'string');
});
it('this should have the right BucketInfo types', () => {
assert.strictEqual(typeof dummyBucket.getName(), 'string');
assert.strictEqual(typeof dummyBucket.getOwner(), 'string');
assert.strictEqual(typeof dummyBucket.getOwnerDisplayName(),
'string');
assert.strictEqual(typeof dummyBucket.getCreationDate(),
'string');
assert.strictEqual(typeof dummyBucket.isObjectLockEnabled(),
'boolean');
});
it('this should have the right acl\'s types', () => {
assert.strictEqual(typeof dummyBucket.getAcl(), 'object');
assert.strictEqual(
@@ -257,6 +317,25 @@ Object.keys(acl).forEach(
assert.deepStrictEqual(dummyBucket.getLifecycleConfiguration(),
testLifecycleConfiguration);
});
it('getBucketPolicy should return policy', () => {
assert.deepStrictEqual(
dummyBucket.getBucketPolicy(), testBucketPolicy);
});
it('getUid should return unique id of bucket', () => {
assert.deepStrictEqual(dummyBucket.getUid(), testUid);
});
it('object lock should be disabled by default', () => {
assert.deepStrictEqual(
dummyBucket.isObjectLockEnabled(), false);
});
it('getObjectLockConfiguration should return configuration', () => {
assert.deepStrictEqual(dummyBucket.getObjectLockConfiguration(),
testObjectLockConfiguration);
});
it('getNotificationConfiguration should return configuration', () => {
assert.deepStrictEqual(dummyBucket.getNotificationConfiguration(),
testNotificationConfiguration);
});
});
describe('setters on BucketInfo class', () => {
@@ -378,6 +457,67 @@ Object.keys(acl).forEach(
assert.deepStrictEqual(dummyBucket.getLifecycleConfiguration(),
newLifecycleConfig);
});
it('setBucketPolicy should set bucket policy', () => {
const newBucketPolicy = {
Version: '2012-10-17',
Statement: [
{
Effect: 'Deny',
Principal: '*',
Resource: 'arn:aws:s3:::examplebucket',
Action: 's3:*',
},
],
};
dummyBucket.setBucketPolicy(newBucketPolicy);
assert.deepStrictEqual(
dummyBucket.getBucketPolicy(), newBucketPolicy);
});
it('setObjectLockConfiguration should set object lock ' +
'configuration', () => {
const newObjectLockConfig = {
rule: {
mode: 'COMPLIANCE',
years: 1,
},
};
dummyBucket.setObjectLockConfiguration(newObjectLockConfig);
assert.deepStrictEqual(dummyBucket.getObjectLockConfiguration(),
newObjectLockConfig);
});
[true, false].forEach(bool => {
it('setObjectLockEnabled should set object lock status', () => {
dummyBucket.setObjectLockEnabled(bool);
assert.deepStrictEqual(dummyBucket.isObjectLockEnabled(),
bool);
});
});
it('setNotificationConfiguration should set notification configuration', () => {
const newNotifConfig = {
queueConfig: [
{
events: ['s3:ObjectRemoved:*'],
queueArn: 'arn:scality:bucketnotif:::target3',
filterRules: [
{
name: 'prefix',
value: 'configs/',
},
],
id: 'test-config-3',
},
],
};
dummyBucket.setNotificationConfiguration(newNotifConfig);
assert.deepStrictEqual(
dummyBucket.getNotificationConfiguration(), newNotifConfig);
});
it('setUid should set bucket uid', () => {
const testUid = '7751ec04-da87-44a1-99b4-95ebb345d40e';
dummyBucket.setUid(testUid);
assert.deepStrictEqual(
dummyBucket.getUid(), testUid);
});
});
})
);


@@ -0,0 +1,105 @@
const assert = require('assert');
const BucketPolicy = require('../../../lib/models/BucketPolicy');
const testBucketPolicy = {
Version: '2012-10-17',
Statement: [
{
Effect: 'Allow',
Principal: '*',
Resource: 'arn:aws:s3:::examplebucket',
Action: 's3:GetBucketLocation',
},
],
};
const mismatchErr = 'Action does not apply to any resource(s) in statement';
function createPolicy(key, value) {
const newPolicy = Object.assign({}, testBucketPolicy);
newPolicy.Statement[0][key] = value;
return newPolicy;
}
function checkErr(policy, err, message) {
assert.strictEqual(policy.error[err], true);
assert.strictEqual(policy.error.description, message);
}
describe('BucketPolicy class getBucketPolicy', () => {
beforeEach(() => {
testBucketPolicy.Statement[0].Resource = 'arn:aws:s3:::examplebucket';
testBucketPolicy.Statement[0].Action = 's3:GetBucketLocation';
});
it('should return MalformedPolicy error if request json is empty', done => {
const bucketPolicy = new BucketPolicy('').getBucketPolicy();
const errMessage = 'request json is empty or undefined';
checkErr(bucketPolicy, 'MalformedPolicy', errMessage);
done();
});
it('should return MalformedPolicy error if request action is for objects ' +
'but resource refers to bucket', done => {
const newPolicy = createPolicy('Action', 's3:GetObject');
const bucketPolicy = new BucketPolicy(JSON.stringify(newPolicy))
.getBucketPolicy();
checkErr(bucketPolicy, 'MalformedPolicy', mismatchErr);
done();
});
it('should return MalformedPolicy error if request action is for objects ' +
'but doesn\'t include \'Object\' and resource refers to bucket', done => {
const newPolicy = createPolicy('Action', 's3:AbortMultipartUpload');
const bucketPolicy = new BucketPolicy(JSON.stringify(newPolicy))
.getBucketPolicy();
checkErr(bucketPolicy, 'MalformedPolicy', mismatchErr);
done();
});
it('should return MalformedPolicy error if request action is for objects ' +
'(with wildcard) but resource refers to bucket', done => {
const newPolicy = createPolicy('Action', 's3:GetObject*');
const bucketPolicy = new BucketPolicy(JSON.stringify(newPolicy))
.getBucketPolicy();
checkErr(bucketPolicy, 'MalformedPolicy', mismatchErr);
done();
});
it('should return MalformedPolicy error if request resource refers to ' +
'object but action is for buckets', done => {
const newPolicy = createPolicy('Resource',
'arn:aws:s3:::examplebucket/*');
const bucketPolicy = new BucketPolicy(JSON.stringify(newPolicy))
.getBucketPolicy();
checkErr(bucketPolicy, 'MalformedPolicy', mismatchErr);
done();
});
it('should return MalformedPolicy error if request resource refers to ' +
'object but action is for buckets (with wildcard)', done => {
const newPolicy = createPolicy('Resource',
'arn:aws:s3:::examplebucket/*');
newPolicy.Statement[0].Action = 's3:GetBucket*';
const bucketPolicy = new BucketPolicy(JSON.stringify(newPolicy))
.getBucketPolicy();
checkErr(bucketPolicy, 'MalformedPolicy', mismatchErr);
done();
});
it('should successfully get a valid policy', done => {
const bucketPolicy = new BucketPolicy(JSON.stringify(testBucketPolicy))
.getBucketPolicy();
assert.deepStrictEqual(bucketPolicy, testBucketPolicy);
done();
});
it('should successfully get a valid policy with wildcard in action',
done => {
const newPolicy = createPolicy('Action', 's3:Get*');
const bucketPolicy = new BucketPolicy(JSON.stringify(newPolicy))
.getBucketPolicy();
assert.deepStrictEqual(bucketPolicy, newPolicy);
done();
});
});
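The MalformedPolicy cases above all come down to one compatibility rule: object-level actions need an object resource (an ARN with a `/` path), and bucket-level actions need a bucket resource. A rough, hypothetical heuristic reproducing just the cases these tests exercise (the real check in `BucketPolicy` is more thorough):

```javascript
// Does this action name apply to this resource ARN?
// Invented helper for illustration only.
function actionAppliesToResource(action, resource) {
    const isObjectResource = resource.includes('/'); // e.g. ...bucket/*
    const name = action.replace(/^s3:/, '');
    // A broad wildcard that stops before "Object"/"Bucket" can match
    // actions of either scope (e.g. 's3:*', 's3:Get*').
    if (name === '*' || name === 'Get*') {
        return true;
    }
    // Object-scoped actions mention "Object" or are known object
    // operations such as AbortMultipartUpload.
    const isObjectAction =
        name.includes('Object') || name === 'AbortMultipartUpload';
    return isObjectAction === isObjectResource;
}

console.log(actionAppliesToResource('s3:GetBucketLocation',
    'arn:aws:s3:::examplebucket'));   // true  (bucket/bucket)
console.log(actionAppliesToResource('s3:GetObject',
    'arn:aws:s3:::examplebucket'));   // false (object action, bucket arn)
console.log(actionAppliesToResource('s3:GetBucket*',
    'arn:aws:s3:::examplebucket/*')); // false (bucket action, object arn)
```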


@@ -112,7 +112,11 @@ const invalidFilters = [
{ tag: 'Tag', label: 'no-value', error: 'MissingRequiredParameter',
errMessage: 'Tag XML does not contain both Key and Value' },
{ tag: 'Tag', label: 'key-too-long', error: 'InvalidRequest',
errMessage: 'Tag Key must be a length between 1 and 128 char' }];
errMessage: 'A Tag\'s Key must be a length between 1 and 128' },
{ tag: 'Tag', label: 'value-too-long', error: 'InvalidRequest',
errMessage: 'A Tag\'s Value must be a length between 0 and 256' },
{ tag: 'Tag', label: 'prefix-too-long', error: 'InvalidRequest',
errMessage: 'The maximum size of a prefix is 1024' }];
function generateAction(errorTag, tagObj) {
const xmlObj = {};
@@ -158,6 +162,9 @@ function generateFilter(errorTag, tagObj) {
if (tagObj.label === 'only-prefix') {
middleTags = '<And><Prefix></Prefix></And>';
}
if (tagObj.label === 'empty-prefix') {
middleTags = '<Prefix></Prefix>';
}
if (tagObj.label === 'single-tag') {
middleTags = '<And><Tags><Key>fo</Key><Value></Value></Tags></And>';
}
@@ -171,10 +178,26 @@ function generateFilter(errorTag, tagObj) {
const longKey = 'a'.repeat(129);
middleTags = `<Tag><Key>${longKey}</Key><Value></Value></Tag>`;
}
if (tagObj.label === 'value-too-long') {
const longValue = 'b'.repeat(257);
middleTags = `<Tag><Key>a</Key><Value>${longValue}</Value></Tag>`;
}
if (tagObj.label === 'prefix-too-long') {
const longValue = 'a'.repeat(1025);
middleTags = `<Prefix>${longValue}</Prefix>`;
}
if (tagObj.label === 'mult-prefixes') {
middleTags = '<Prefix>foo</Prefix><Prefix>bar</Prefix>' +
`<Prefix>${tagObj.lastPrefix}</Prefix>`;
}
if (tagObj.label === 'mult-tags') {
middleTags = '<And><Tag><Key>color</Key><Value>blue</Value></Tag>' +
'<Tag><Key>shape</Key><Value>circle</Value></Tag></And>';
}
if (tagObj.label === 'not-unique-key-tag') {
middleTags = '<And><Tag><Key>color</Key><Value>blue</Value></Tag>' +
'<Tag><Key>color</Key><Value>red</Value></Tag></And>';
}
Filter = `<Filter>${middleTags}</Filter>`;
if (tagObj.label === 'also-prefix') {
Filter = '<Filter></Filter><Prefix></Prefix>';
@ -349,4 +372,379 @@ describe('LifecycleConfiguration class getLifecycleConfiguration', () => {
done();
});
});
it('should return InvalidRequest if tag key is not unique', done => {
tagObj.label = 'not-unique-key-tag';
const errMessage = 'Tag Keys must be unique';
generateParsedXml('Filter', tagObj, parsedXml => {
checkError(parsedXml, 'InvalidRequest', errMessage, done);
});
});
it('should include prefix in the response even if it is an empty string', done => {
tagObj.label = 'empty-prefix';
const expectedPrefix = '';
generateParsedXml('Filter', tagObj, parsedXml => {
const lcConfig = new LifecycleConfiguration(parsedXml).
getLifecycleConfiguration();
assert.strictEqual(expectedPrefix,
lcConfig.rules[0].filter.rulePrefix);
done();
});
});
});
describe('LifecycleConfiguration::getConfigJson', () => {
const tests = [
[
'without prefix and tags',
{
rules: [
{
ruleID: 'test-id',
ruleStatus: 'Enabled',
actions: [
{ actionName: 'Expiration', days: 1 },
],
},
],
},
{
Rules: [
{
ID: 'test-id',
Status: 'Enabled',
Prefix: '',
Expiration: { Days: 1 },
},
],
},
],
[
'with prefix and no tags',
{
rules: [
{
ruleID: 'test-id',
ruleStatus: 'Enabled',
actions: [
{ actionName: 'Expiration', days: 1 },
],
prefix: 'prefix',
},
],
},
{
Rules: [
{
ID: 'test-id',
Status: 'Enabled',
Filter: { Prefix: 'prefix' },
Expiration: { Days: 1 },
},
],
},
],
[
'with filter.prefix and no tags',
{
rules: [
{
ruleID: 'test-id',
ruleStatus: 'Enabled',
actions: [
{ actionName: 'Expiration', days: 1 },
],
filter: { rulePrefix: 'prefix' },
},
],
},
{
Rules: [
{
ID: 'test-id',
Status: 'Enabled',
Expiration: { Days: 1 },
Filter: { Prefix: 'prefix' },
},
],
},
],
[
'with prefix and at least one tag',
{
rules: [
{
ruleID: 'test-id',
ruleStatus: 'Enabled',
actions: [
{ actionName: 'Expiration', days: 1 },
],
filter: {
tags: [
{ key: 'key', val: 'val' },
],
},
prefix: 'prefix',
},
],
},
{
Rules: [
{
ID: 'test-id',
Status: 'Enabled',
Filter: {
And: {
Prefix: 'prefix',
Tags: [
{ Key: 'key', Value: 'val' },
],
},
},
Expiration: { Days: 1 },
},
],
},
],
[
'with filter.prefix and at least one tag',
{
rules: [
{
ruleID: 'test-id',
ruleStatus: 'Enabled',
actions: [
{ actionName: 'Expiration', days: 1 },
],
filter: {
rulePrefix: 'prefix',
tags: [
{ key: 'key', val: 'val' },
],
},
},
],
},
{
Rules: [
{
ID: 'test-id',
Status: 'Enabled',
Filter: {
And: {
Prefix: 'prefix',
Tags: [
{ Key: 'key', Value: 'val' },
],
},
},
Expiration: { Days: 1 },
},
],
},
],
[
'with no prefix and multiple tags',
{
rules: [
{
ruleID: 'test-id',
ruleStatus: 'Enabled',
actions: [
{ actionName: 'Expiration', days: 1 },
],
filter: {
tags: [
{ key: 'key1', val: 'val' },
{ key: 'key2', val: 'val' },
],
},
},
],
},
{
Rules: [
{
ID: 'test-id',
Status: 'Enabled',
Filter: {
And: {
Tags: [
{ Key: 'key1', Value: 'val' },
{ Key: 'key2', Value: 'val' },
],
},
},
Expiration: { Days: 1 },
},
],
},
],
[
'single action Expiration',
{
rules: [
{
ruleID: 'test-id',
ruleStatus: 'Enabled',
actions: [
{ actionName: 'Expiration', deleteMarker: 'true' },
],
prefix: '',
},
],
},
{
Rules: [
{
ID: 'test-id',
Status: 'Enabled',
Prefix: '',
Expiration: { ExpiredObjectDeleteMarker: true },
},
],
},
],
[
'single action Expiration days',
{
rules: [
{
ruleID: 'test-id',
ruleStatus: 'Enabled',
actions: [
{ actionName: 'Expiration', days: 10 },
],
prefix: '',
},
],
},
{
Rules: [
{
ID: 'test-id',
Status: 'Enabled',
Prefix: '',
Expiration: { Days: 10 },
},
],
},
],
[
'single action Expiration date',
{
rules: [
{
ruleID: 'test-id',
ruleStatus: 'Enabled',
actions: [
{
actionName: 'Expiration',
date: 'Fri, 21 Dec 2012 00:00:00 GMT',
},
],
prefix: '',
},
],
},
{
Rules: [
{
ID: 'test-id',
Status: 'Enabled',
Prefix: '',
Expiration: { Date: 'Fri, 21 Dec 2012 00:00:00 GMT' },
},
],
},
],
[
'single action NoncurrentVersionExpiration',
{
rules: [
{
ruleID: 'test-id',
ruleStatus: 'Enabled',
actions: [
{ actionName: 'NoncurrentVersionExpiration', days: 10 },
],
prefix: '',
},
],
},
{
Rules: [
{
ID: 'test-id',
Status: 'Enabled',
Prefix: '',
NoncurrentVersionExpiration: { NoncurrentDays: 10 },
},
],
},
],
[
'single action AbortIncompleteMultipartUpload days',
{
rules: [
{
ruleID: 'test-id',
ruleStatus: 'Enabled',
actions: [
{ actionName: 'AbortIncompleteMultipartUpload', days: 10 },
],
prefix: '',
},
],
},
{
Rules: [
{
ID: 'test-id',
Status: 'Enabled',
Prefix: '',
AbortIncompleteMultipartUpload: { DaysAfterInitiation: 10 },
},
],
},
],
[
'multiple actions',
{
rules: [
{
ruleID: 'test-id',
ruleStatus: 'Enabled',
actions: [
{ actionName: 'AbortIncompleteMultipartUpload', days: 10 },
{ actionName: 'NoncurrentVersionExpiration', days: 1 },
{ actionName: 'Expiration', deleteMarker: 'true' },
],
prefix: '',
},
],
},
{
Rules: [
{
ID: 'test-id',
Status: 'Enabled',
Prefix: '',
AbortIncompleteMultipartUpload: { DaysAfterInitiation: 10 },
NoncurrentVersionExpiration: { NoncurrentDays: 1 },
Expiration: { ExpiredObjectDeleteMarker: true },
},
],
},
],
];
tests.forEach(([msg, input, expected]) => it(
`should return correct configuration: ${msg}`, () => {
assert.deepStrictEqual(
LifecycleConfiguration.getConfigJson(input),
expected
);
}));
});

View File

@ -0,0 +1,214 @@
const assert = require('assert');
const { parseString } = require('xml2js');
const NotificationConfiguration =
require('../../../lib/models/NotificationConfiguration.js');
function checkError(parsedXml, err, errMessage, cb) {
const config = new NotificationConfiguration(parsedXml).
getValidatedNotificationConfiguration();
assert.strictEqual(config.error[err], true);
assert.strictEqual(config.error.description, errMessage);
cb();
}
function generateEvent(testParams) {
const event = [];
if (testParams.key === 'Event') {
if (Array.isArray(testParams.value)) {
testParams.value.forEach(v => {
event.push(`<Event>${v}</Event>`);
});
} else {
event.push(`<Event>${testParams.value}</Event>`);
}
} else {
event.push('<Event>s3:ObjectCreated:*</Event>');
}
return event.join('');
}
function generateFilter(testParams) {
let filter = '';
if (testParams.key === 'Filter') {
filter = `<Filter>${testParams.value}</Filter>`;
}
if (testParams.key === 'S3Key') {
filter = `<Filter><S3Key>${testParams.value}</S3Key></Filter>`;
}
if (testParams.key === 'FilterRule') {
if (Array.isArray(testParams.value)) {
testParams.value.forEach(v => {
filter = `${filter}<Filter><S3Key><FilterRule>${v}` +
'</FilterRule></S3Key></Filter>';
});
} else {
filter = `<Filter><S3Key><FilterRule>${testParams.value}` +
'</FilterRule></S3Key></Filter>';
}
}
return filter;
}
function generateXml(testParams) {
const id = testParams.key === 'Id' ? `<Id>${testParams.value}</Id>` : '<Id>queue-id</Id>';
const arn = testParams.key === 'QueueArn' ?
`<Queue>${testParams.value}</Queue>` :
'<Queue>arn:scality:bucketnotif:::target</Queue>';
const event = generateEvent(testParams);
const filter = generateFilter(testParams);
let queueConfig = `<QueueConfiguration>${id}${arn}${event}${filter}` +
'</QueueConfiguration>';
if (testParams.key === 'QueueConfiguration') {
if (testParams.value === 'double') {
queueConfig = `${queueConfig}${queueConfig}`;
} else {
queueConfig = testParams.value;
}
}
const notification = testParams.key === 'NotificationConfiguration' ? '' :
`<NotificationConfiguration>${queueConfig}</NotificationConfiguration>`;
return notification;
}
function generateParsedXml(testParams, cb) {
const xml = generateXml(testParams);
parseString(xml, (err, parsedXml) => {
assert.equal(err, null, 'Error parsing xml');
cb(parsedXml);
});
}
const failTests = [
{
name: 'fail with empty configuration',
params: { key: 'NotificationConfiguration' },
error: 'MalformedXML',
errorMessage: 'request xml is undefined or empty',
},
{
name: 'fail with invalid id',
params: { key: 'Id', value: 'a'.repeat(256) },
error: 'InvalidArgument',
errorMessage: 'queue configuration ID is greater than 255 characters long',
},
{
name: 'fail with repeated id',
params: { key: 'QueueConfiguration', value: 'double' },
error: 'InvalidRequest',
errorMessage: 'queue configuration ID must be unique',
},
{
name: 'fail with empty QueueArn',
params: { key: 'QueueArn', value: '' },
error: 'MalformedXML',
errorMessage: 'each queue configuration must contain a queue arn',
},
{
name: 'fail with invalid QueueArn',
params: { key: 'QueueArn', value: 'arn:scality:bucketnotif:target' },
error: 'MalformedXML',
errorMessage: 'queue arn is invalid',
},
{
name: 'fail with invalid QueueArn partition',
params: { key: 'QueueArn', value: 'arn:aws:bucketnotif:::target' },
error: 'MalformedXML',
errorMessage: 'queue arn is invalid',
},
{
name: 'fail with empty event',
params: { key: 'Event', value: '' },
error: 'MalformedXML',
errorMessage: 'each queue configuration must contain an event',
},
{
name: 'fail with invalid event',
params: { key: 'Event', value: 's3:BucketCreated:Put' },
error: 'MalformedXML',
errorMessage: 'event array contains invalid or unsupported event',
},
{
name: 'fail with unsupported event',
params: { key: 'Event', value: 's3:Replication:OperationNotTracked' },
error: 'MalformedXML',
errorMessage: 'event array contains invalid or unsupported event',
},
{
name: 'fail with filter that does not contain S3Key',
params: { key: 'Filter', value: '<FilterRule><Name>Prefix</Name><Value>logs/</Value></FilterRule>' },
error: 'MalformedXML',
errorMessage: 'if included, queue configuration filter must contain S3Key',
},
{
name: 'fail with filter that does not contain a rule',
params: { key: 'S3Key', value: '<Name>Prefix</Name><Value>logs/</Value>' },
error: 'MalformedXML',
errorMessage: 'if included, queue configuration filter must contain a rule',
},
{
name: 'fail with filter rule that does not contain name and value',
params: { key: 'FilterRule', value: '<Value>noname</Value>' },
error: 'MalformedXML',
errorMessage: 'each included filter must contain a name and value',
},
{
name: 'fail with invalid name in filter rule',
params: { key: 'FilterRule', value: '<Name>Invalid</Name><Value>logs/</Value>' },
error: 'MalformedXML',
errorMessage: 'filter Name must be one of Prefix or Suffix',
},
];
const passTests = [
{
name: 'pass with empty QueueConfiguration',
params: { key: 'QueueConfiguration', value: '[]' },
},
{
name: 'pass with multiple events in one queue configuration',
params: {
key: 'Event', value: ['s3:ObjectCreated:Put', 's3:ObjectCreated:Copy'],
},
},
{
name: 'pass with multiple filter rules',
params: {
key: 'FilterRule',
value: ['<Name>Prefix</Name><Value>logs/</Value>', '<Name>Suffix</Name><Value>.pdf</Value>'] },
},
{
name: 'pass with no id',
params: { key: 'Id', value: '' },
},
{
name: 'pass with basic config', params: {},
},
];
describe('NotificationConfiguration class getValidatedNotificationConfiguration',
() => {
it('should return MalformedXML error if request xml is empty', done => {
const errMessage = 'request xml is undefined or empty';
checkError('', 'MalformedXML', errMessage, done);
});
failTests.forEach(test => {
it(`should ${test.name}`, done => {
generateParsedXml(test.params, xml => {
checkError(xml, test.error, test.errorMessage, done);
});
});
});
passTests.forEach(test => {
it(`should ${test.name}`, done => {
generateParsedXml(test.params, xml => {
const config = new NotificationConfiguration(xml).
getValidatedNotificationConfiguration();
assert.ifError(config.error);
done();
});
});
});
});

View File

@ -0,0 +1,265 @@
const assert = require('assert');
const { parseString } = require('xml2js');
const ObjectLockConfiguration =
require('../../../lib/models/ObjectLockConfiguration.js');
function checkError(parsedXml, err, errMessage, cb) {
const config = new ObjectLockConfiguration(parsedXml).
getValidatedObjectLockConfiguration();
assert.strictEqual(config.error[err], true);
assert.strictEqual(config.error.description, errMessage);
cb();
}
function generateRule(testParams) {
if (testParams.key === 'Rule') {
return `<Rule>${testParams.value}</Rule>`;
}
if (testParams.key === 'DefaultRetention') {
return `<Rule><DefaultRetention>${testParams.value} ` +
'</DefaultRetention></Rule>';
}
const mode = testParams.key === 'Mode' ?
`<Mode>${testParams.value}</Mode>` : '<Mode>GOVERNANCE</Mode>';
let time = '<Days>1</Days>';
if (testParams.key === 'Days') {
time = `<Days>${testParams.value}</Days>`;
}
if (testParams.key === 'Years') {
time = `<Years>${testParams.value}</Years>`;
}
if (testParams.key === 'NoRule') {
return '';
}
return `<Rule><DefaultRetention>${mode}${time}</DefaultRetention></Rule>`;
}
function generateXml(testParams) {
const Enabled = testParams.key === 'ObjectLockEnabled' ?
`<ObjectLockEnabled>${testParams.value}</ObjectLockEnabled>` :
'<ObjectLockEnabled>Enabled</ObjectLockEnabled>';
const Rule = generateRule(testParams);
const ObjectLock = testParams.key === 'ObjectLockConfiguration' ? '' :
`<ObjectLockConfiguration>${Enabled}${Rule}` +
'</ObjectLockConfiguration>';
return ObjectLock;
}
function generateParsedXml(testParams, cb) {
const xml = generateXml(testParams);
parseString(xml, (err, parsedXml) => {
assert.equal(err, null, 'Error parsing xml');
cb(parsedXml);
});
}
const expectedXml = (daysOrYears, time, mode) =>
'<?xml version="1.0" encoding="UTF-8"?>' +
'<ObjectLockConfiguration ' +
'xmlns="http://s3.amazonaws.com/doc/2006-03-01/">' +
'<ObjectLockEnabled>Enabled</ObjectLockEnabled>' +
'<Rule><DefaultRetention>' +
`<Mode>${mode}</Mode>` +
`<${daysOrYears}>${time}</${daysOrYears}>` +
'</DefaultRetention></Rule>' +
'</ObjectLockConfiguration>';
const failTests = [
{
name: 'fail with empty configuration',
params: { key: 'ObjectLockConfiguration' },
error: 'MalformedXML',
errorMessage: 'request xml is undefined or empty',
},
{
name: 'fail with empty ObjectLockEnabled',
params: { key: 'ObjectLockEnabled', value: '' },
error: 'MalformedXML',
errorMessage: 'request xml does not include valid ObjectLockEnabled',
},
{
name: 'fail with invalid value for ObjectLockEnabled',
params: { key: 'ObjectLockEnabled', value: 'Disabled' },
error: 'MalformedXML',
errorMessage: 'request xml does not include valid ObjectLockEnabled',
},
{
name: 'fail with empty rule',
params: { key: 'Rule', value: '' },
error: 'MalformedXML',
errorMessage: 'Rule request xml does not contain DefaultRetention',
},
{
name: 'fail with empty DefaultRetention',
params: { key: 'DefaultRetention', value: '' },
error: 'MalformedXML',
errorMessage: 'DefaultRetention request xml does not contain Mode or ' +
'retention period (Days or Years)',
},
{
name: 'fail with empty mode',
params: { key: 'Mode', value: '' },
error: 'MalformedXML',
errorMessage: 'request xml does not contain Mode',
},
{
name: 'fail with invalid mode',
params: { key: 'Mode', value: 'COMPLOVERNANCE' },
error: 'MalformedXML',
errorMessage: 'Mode request xml must be one of "GOVERNANCE", ' +
'"COMPLIANCE"',
},
{
name: 'fail with lowercase mode',
params: { key: 'Mode', value: 'governance' },
error: 'MalformedXML',
errorMessage: 'Mode request xml must be one of "GOVERNANCE", ' +
'"COMPLIANCE"',
},
{
name: 'fail with empty retention period',
params: { key: 'Days', value: '' },
error: 'MalformedXML',
errorMessage: 'request xml does not contain Days or Years',
},
{
name: 'fail with NaN retention period',
params: { key: 'Days', value: 'one' },
error: 'MalformedXML',
errorMessage: 'request xml does not contain valid retention period',
},
{
name: 'fail with retention period less than 1',
params: { key: 'Days', value: 0 },
error: 'InvalidArgument',
errorMessage: 'retention period must be a positive integer',
},
{
name: 'fail with Days retention period greater than 36500',
params: { key: 'Days', value: 36501 },
error: 'InvalidArgument',
errorMessage: 'retention period is too large',
},
{
name: 'fail with Years retention period greater than 100',
params: { key: 'Years', value: 101 },
error: 'InvalidArgument',
errorMessage: 'retention period is too large',
},
];
const passTests = [
{
name: 'pass with GOVERNANCE retention mode and valid Days ' +
'retention period',
params: {},
},
{
name: 'pass with COMPLIANCE retention mode',
params: { key: 'Mode', value: 'COMPLIANCE' },
},
{
name: 'pass with valid Years retention period',
params: { key: 'Years', value: 1 },
},
{
name: 'pass without Rule',
params: { key: 'NoRule' },
},
];
const passTestsGetConfigXML = [
{
config: {
rule: {
mode: 'COMPLIANCE',
days: 90,
},
},
expectedXml: expectedXml('Days', 90, 'COMPLIANCE'),
description: 'with COMPLIANCE retention mode ' +
'and valid Days retention period',
},
{
config: {
rule: {
mode: 'GOVERNANCE',
days: 30,
},
},
expectedXml: expectedXml('Days', 30, 'GOVERNANCE'),
description: 'with GOVERNANCE retention mode ' +
'and valid Days retention period',
},
{
config: {
rule: {
mode: 'COMPLIANCE',
years: 1,
},
},
expectedXml: expectedXml('Years', 1, 'COMPLIANCE'),
description: 'with COMPLIANCE retention mode ' +
'and valid Years retention period',
},
{
config: {
rule: {
mode: 'GOVERNANCE',
years: 2,
},
},
expectedXml: expectedXml('Years', 2, 'GOVERNANCE'),
description: 'with GOVERNANCE retention mode ' +
'and valid Years retention period',
},
{
config: {},
expectedXml: '<?xml version="1.0" encoding="UTF-8"?>' +
'<ObjectLockConfiguration ' +
'xmlns="http://s3.amazonaws.com/doc/2006-03-01/">' +
'<ObjectLockEnabled>Enabled</ObjectLockEnabled>' +
'</ObjectLockConfiguration>',
description: 'without rule if object lock ' +
'configuration has not been set',
},
];
describe('ObjectLockConfiguration class getValidatedObjectLockConfiguration',
() => {
it('should return MalformedXML error if request xml is empty', done => {
const errMessage = 'request xml is undefined or empty';
checkError('', 'MalformedXML', errMessage, done);
});
failTests.forEach(test => {
it(`should ${test.name}`, done => {
generateParsedXml(test.params, xml => {
checkError(xml, test.error, test.errorMessage, done);
});
});
});
passTests.forEach(test => {
it(`should ${test.name}`, done => {
generateParsedXml(test.params, xml => {
const config = new ObjectLockConfiguration(xml).
getValidatedObjectLockConfiguration();
assert.ifError(config.error);
done();
});
});
});
});
describe('ObjectLockConfiguration class getConfigXML', () => {
passTestsGetConfigXML.forEach(test => {
const { config, description, expectedXml } = test;
it(`should return correct XML ${description}`, () => {
const responseXml = ObjectLockConfiguration.getConfigXML(config);
assert.strictEqual(responseXml, expectedXml);
});
});
});

View File

@ -0,0 +1,64 @@
const assert = require('assert');
const ObjectMDLocation = require('../../../lib/models/ObjectMDLocation');
describe('ObjectMDLocation', () => {
it('class getters/setters', () => {
const locValue = {
key: 'fookey',
start: 42,
size: 100,
dataStoreName: 'awsbackend',
dataStoreETag: '2:abcdefghi',
cryptoScheme: 1,
cipheredDataKey: 'CiPhErEdDaTaKeY',
};
const location = new ObjectMDLocation(locValue);
assert.strictEqual(location.getKey(), 'fookey');
assert.strictEqual(location.getDataStoreName(), 'awsbackend');
assert.strictEqual(location.getDataStoreETag(), '2:abcdefghi');
assert.strictEqual(location.getPartNumber(), 2);
assert.strictEqual(location.getPartETag(), 'abcdefghi');
assert.strictEqual(location.getPartStart(), 42);
assert.strictEqual(location.getPartSize(), 100);
assert.strictEqual(location.getCryptoScheme(), 1);
assert.strictEqual(location.getCipheredDataKey(), 'CiPhErEdDaTaKeY');
assert.deepStrictEqual(location.getValue(), locValue);
location.setPartSize(200);
assert.strictEqual(location.getPartSize(), 200);
});
it('ObjectMDLocation::setDataLocation()', () => {
const location = new ObjectMDLocation({
key: 'fookey',
start: 42,
size: 100,
dataStoreName: 'awsbackend',
dataStoreETag: '2:abcdefghi',
cryptoScheme: 1,
cipheredDataKey: 'CiPhErEdDaTaKeY',
});
location.setDataLocation({ key: 'secondkey',
dataStoreName: 'gcpbackend' });
assert.strictEqual(location.getKey(), 'secondkey');
assert.strictEqual(location.getDataStoreName(), 'gcpbackend');
assert.strictEqual(location.getCryptoScheme(), undefined);
assert.strictEqual(location.getCipheredDataKey(), undefined);
assert.deepStrictEqual(location.getValue(), {
dataStoreETag: '2:abcdefghi',
dataStoreName: 'gcpbackend',
key: 'secondkey',
size: 100,
start: 42,
});
location.setDataLocation({ key: 'thirdkey',
dataStoreName: 'azurebackend',
cryptoScheme: 1,
cipheredDataKey: 'NeWcIpHeReDdAtAkEy' });
assert.strictEqual(location.getKey(), 'thirdkey');
assert.strictEqual(location.getDataStoreName(), 'azurebackend');
assert.strictEqual(location.getCryptoScheme(), 1);
assert.strictEqual(location.getCipheredDataKey(), 'NeWcIpHeReDdAtAkEy');
});
});

View File

@ -2,6 +2,11 @@ const assert = require('assert');
const ObjectMD = require('../../../lib/models/ObjectMD');
const constants = require('../../../lib/constants');
const retainDate = new Date();
retainDate.setDate(retainDate.getDate() + 1);
const laterDate = new Date();
laterDate.setDate(laterDate.getDate() + 5);
describe('ObjectMD class setters/getters', () => {
let md = null;
@ -103,6 +108,11 @@ describe('ObjectMD class setters/getters', () => {
dataStoreVersionId: '',
}],
['DataStoreName', null, ''],
['LegalHold', null, false],
['LegalHold', true],
['RetentionMode', 'GOVERNANCE'],
['RetentionDate', retainDate.toISOString()],
['OriginOp', null, ''],
['IsAborted', null, undefined],
['IsAborted', true],
].forEach(test => {
@ -198,6 +208,21 @@ describe('ObjectMD class setters/getters', () => {
assert.strictEqual(
md.getReplicationSiteDataStoreVersionId('zenko'), 'a');
});
it('ObjectMD::set/getRetentionMode', () => {
md.setRetentionMode('COMPLIANCE');
assert.deepStrictEqual(md.getRetentionMode(), 'COMPLIANCE');
});
it('ObjectMD::set/getRetentionDate', () => {
md.setRetentionDate(laterDate.toISOString());
assert.deepStrictEqual(md.getRetentionDate(), laterDate.toISOString());
});
it('ObjectMD::set/getOriginOp', () => {
md.setOriginOp('Copy');
assert.deepStrictEqual(md.getOriginOp(), 'Copy');
});
});
describe('ObjectMD import from stored blob', () => {
@ -322,6 +347,7 @@ describe('getAttributes static method', () => {
'dataStoreName': true,
'last-modified': true,
'md-model-version': true,
'originOp': true,
'isAborted': true,
};
assert.deepStrictEqual(attributes, expectedResult);

View File

@ -4,8 +4,9 @@ const assert = require('assert');
const policyValidator = require('../../../lib/policy/policyValidator');
const errors = require('../../../lib/errors');
const validateUserPolicy = policyValidator.validateUserPolicy;
const validateResourcePolicy = policyValidator.validateResourcePolicy;
const successRes = { error: null, valid: true };
const samplePolicy = {
const sampleUserPolicy = {
Version: '2012-10-17',
Statement: {
Sid: 'FooBar1234',
@ -15,6 +16,19 @@ const samplePolicy = {
Condition: { NumericLessThanEquals: { 's3:max-keys': '10' } },
},
};
const sampleResourcePolicy = {
Version: '2012-10-17',
Statement: [
{
Sid: 'ResourcePolicy1',
Effect: 'Allow',
Action: 's3:ListBucket',
Resource: 'arn:aws:s3:::example-bucket',
Condition: { StringLike: { 's3:prefix': 'foo' } },
Principal: '*',
},
],
};
const errDict = {
required: {
@ -30,45 +44,84 @@ const errDict = {
Resource: 'Policy statement must contain resources.',
},
};
let policy;
function failRes(errDescription) {
const error = Object.assign({}, errors.MalformedPolicyDocument);
function failRes(policyType, errDescription) {
let error;
if (policyType === 'user') {
error = Object.assign({}, errors.MalformedPolicyDocument);
}
if (policyType === 'resource') {
error = Object.assign({}, errors.MalformedPolicy);
}
error.description = errDescription || error.description;
return { error, valid: false };
}
function check(input, expected) {
const result = validateUserPolicy(JSON.stringify(input));
function check(input, expected, policyType) {
let result;
if (policyType === 'user') {
result = validateUserPolicy(JSON.stringify(input));
}
if (policyType === 'resource') {
result = validateResourcePolicy(JSON.stringify(input));
}
assert.deepStrictEqual(result, expected);
}
let userPolicy;
let resourcePolicy;
const user = 'user';
const resource = 'resource';
beforeEach(() => {
policy = JSON.parse(JSON.stringify(samplePolicy));
userPolicy = JSON.parse(JSON.stringify(sampleUserPolicy));
resourcePolicy = JSON.parse(JSON.stringify(sampleResourcePolicy));
});
describe('Policies validation - Invalid JSON', () => {
it('should return error for invalid JSON', () => {
it('should return error for invalid user policy JSON', () => {
const result = validateUserPolicy('{"Version":"2012-10-17",' +
'"Statement":{"Effect":"Allow""Action":"s3:PutObject",' +
'"Resource":"arn:aws:s3*"}}');
assert.deepStrictEqual(result, failRes());
assert.deepStrictEqual(result, failRes(user));
});
it('should return error for invalid resource policy JSON', () => {
const result = validateResourcePolicy('{"Version":"2012-10-17",' +
'"Statement":{"Effect":"Allow""Action":"s3:PutObject",' +
'"Resource":"arn:aws:s3*"}}');
assert.deepStrictEqual(result, failRes(resource));
});
});
describe('Policies validation - Version', () => {
it('should validate with version date 2012-10-17', () => {
check(policy, successRes);
it('should validate user policy with version date 2012-10-17', () => {
check(userPolicy, successRes, user);
});
it('should return error for other dates', () => {
policy.Version = '2012-11-17';
check(policy, failRes());
it('should validate resource policy with version date 2012-10-17', () => {
check(resourcePolicy, successRes, resource);
});
it('should return error if Version field is missing', () => {
policy.Version = undefined;
check(policy, failRes(errDict.required.Version));
it('user policy should return error for other dates', () => {
userPolicy.Version = '2012-11-17';
check(userPolicy, failRes(user), user);
});
it('resource policy should return error for other dates', () => {
resourcePolicy.Version = '2012-11-17';
check(resourcePolicy, failRes(resource), resource);
});
it('should return error if Version field in user policy is missing', () => {
userPolicy.Version = undefined;
check(userPolicy, failRes(user, errDict.required.Version), user);
});
it('should return error if Version field in resource policy is missing',
() => {
resourcePolicy.Version = undefined;
check(resourcePolicy, failRes(resource, errDict.required.Version),
resource);
});
});
@ -77,20 +130,24 @@ describe('Policies validation - Principal', () => {
{
name: 'an account id',
value: { AWS: '111111111111' },
policyType: [user, resource],
},
{
name: 'anonymous user AWS form',
value: { AWS: '*' },
policyType: [user, resource],
},
{
name: 'an account arn',
value: { AWS: 'arn:aws:iam::111111111111:root' },
policyType: [user, resource],
},
{
name: 'multiple account id',
value: {
AWS: ['111111111111', '111111111112'],
},
policyType: [user, resource],
},
{
name: 'multiple account arn',
@ -100,14 +157,22 @@ describe('Policies validation - Principal', () => {
'arn:aws:iam::111111111112:root',
],
},
policyType: [user, resource],
},
{
name: 'anonymous user as string',
value: '*',
policyType: [user, resource],
},
{
name: 'user arn',
value: { AWS: 'arn:aws:iam::111111111111:user/alex' },
policyType: [user, resource],
},
{
name: 'user arn with path',
value: { AWS: 'arn:aws:iam::111111111111:user/path/in/org/leaf' },
policyType: [user, resource],
},
{
name: 'multiple user arns',
@ -117,12 +182,14 @@ describe('Policies validation - Principal', () => {
'arn:aws:iam::111111111111:user/thibault',
],
},
policyType: [user, resource],
},
{
name: 'role arn',
value: {
AWS: 'arn:aws:iam::111111111111:role/dev',
},
policyType: [user, resource],
},
{
name: 'multiple role arn',
@ -132,6 +199,7 @@ describe('Policies validation - Principal', () => {
'arn:aws:iam::111111111111:role/prod',
],
},
policyType: [user, resource],
},
{
name: 'saml provider',
@ -139,57 +207,84 @@ describe('Policies validation - Principal', () => {
Federated:
'arn:aws:iam::111111111111:saml-provider/mysamlprovider',
},
policyType: [user],
},
{
name: 'with backbeat service',
value: { Service: 'backbeat' },
policyType: [user, resource],
},
{
name: 'with canonical user id',
value: { CanonicalUser:
'1examplecanonicalid12345678909876' +
'54321qwerty12345asdfg67890z1x2c' },
policyType: [resource],
},
].forEach(test => {
it(`should allow principal field with ${test.name}`, () => {
policy.Statement.Principal = test.value;
delete policy.Statement.Resource;
check(policy, successRes);
});
if (test.policyType.includes(user)) {
it(`should allow user policy principal field with ${test.name}`,
() => {
userPolicy.Statement.Principal = test.value;
delete userPolicy.Statement.Resource;
check(userPolicy, successRes, user);
});
it(`should allow notPrincipal field with ${test.name}`, () => {
policy.Statement.NotPrincipal = test.value;
delete policy.Statement.Resource;
check(policy, successRes);
});
it(`should allow user policy notPrincipal field with ${test.name}`,
() => {
userPolicy.Statement.NotPrincipal = test.value;
delete userPolicy.Statement.Resource;
check(userPolicy, successRes, user);
});
}
if (test.policyType.includes(resource)) {
it(`should allow resource policy principal field with ${test.name}`,
() => {
resourcePolicy.Statement[0].Principal = test.value;
check(resourcePolicy, successRes, resource);
});
}
});
[
{
name: 'wrong format account id',
value: { AWS: '11111111111z' },
policyType: [user, resource],
},
{
name: 'empty string',
value: '',
policyType: [user, resource],
},
{
name: 'anonymous user federated form',
value: { federated: '*' },
policyType: [user, resource],
},
{
name: 'wildcard in ressource',
name: 'wildcard in resource',
value: { AWS: 'arn:aws:iam::111111111111:user/*' },
policyType: [user, resource],
},
{
name: 'a malformed account arn',
value: { AWS: 'arn:aws:iam::111111111111:' },
policyType: [user, resource],
},
{
name: 'multiple malformed account id',
value: {
AWS: ['1111111111z1', '1111z1111112'],
},
policyType: [user, resource],
},
{
name: 'multiple anonymous',
value: {
AWS: ['*', '*'],
},
policyType: [user, resource],
},
{
name: 'multiple malformed account arn',
@ -199,18 +294,22 @@ describe('Policies validation - Principal', () => {
'arn:aws:iam::111111111112:',
],
},
policyType: [user, resource],
},
{
name: 'account id as a string',
value: '111111111111',
policyType: [user, resource],
},
{
name: 'account arn as a string',
value: 'arn:aws:iam::111111111111:root',
policyType: [user, resource],
},
{
name: 'user arn as a string',
value: 'arn:aws:iam::111111111111:user/alex',
policyType: [user, resource],
},
{
name: 'multiple malformed user arns',
@ -220,12 +319,14 @@ describe('Policies validation - Principal', () => {
'arn:aws:iam::111111111111:user/',
],
},
policyType: [user, resource],
},
{
name: 'malformed role arn',
value: {
AWS: 'arn:aws:iam::111111111111:role/',
},
policyType: [user, resource],
},
{
name: 'multiple malformed role arn',
@ -235,36 +336,84 @@ describe('Policies validation - Principal', () => {
'arn:aws:iam::11111111z111:role/prod',
],
},
policyType: [user, resource],
},
{
name: 'saml provider as a string',
value: 'arn:aws:iam::111111111111:saml-provider/mysamlprovider',
policyType: [user],
},
{
name: 'with other service than backbeat',
value: { Service: 'non-existent-service' },
policyType: [user, resource],
},
{
name: 'invalid canonical user',
value: { CanonicalUser:
'12345invalid-canonical-id$$$//098' +
'7654321poiu1q2w3e4r5t6y7u8i9o0p' },
policyType: [resource],
},
].forEach(test => {
it(`should fail with ${test.name}`, () => {
policy.Statement.Principal = test.value;
delete policy.Statement.Resource;
check(policy, failRes());
});
if (test.policyType.includes(user)) {
it(`user policy should fail with ${test.name}`, () => {
userPolicy.Statement.Principal = test.value;
delete userPolicy.Statement.Resource;
check(userPolicy, failRes(user), user);
});
}
if (test.policyType.includes(resource)) {
it(`resource policy should fail with ${test.name}`, () => {
resourcePolicy.Statement[0].Principal = test.value;
check(resourcePolicy, failRes(resource), resource);
});
}
});
it('should not allow Resource field', () => {
policy.Statement.Principal = '*';
check(policy, failRes());
userPolicy.Statement.Principal = '*';
check(userPolicy, failRes(user), user);
});
});
describe('Policies validation - Statement', () => {
it('should succeed for a valid object', () => {
check(policy, successRes);
[
{
name: 'should return error for undefined',
value: undefined,
},
{
name: 'should return an error for an empty list',
value: [],
},
{
name: 'should return an error for an empty object',
value: {},
errMessage: errDict.required.Action,
},
].forEach(test => {
it(`user policy ${test.name}`, () => {
userPolicy.Statement = test.value;
check(userPolicy, failRes(user, test.errMessage), user);
});
it(`resource policy ${test.name}`, () => {
resourcePolicy.Statement = test.value;
check(resourcePolicy, failRes(resource, test.errMessage), resource);
});
});
it('should succeed for a valid array', () => {
policy.Statement = [
it('user policy should succeed for a valid object', () => {
check(userPolicy, successRes, user);
});
it('resource policy should succeed for a valid object', () => {
check(resourcePolicy, successRes, resource);
});
it('user policy should succeed for a valid object', () => {
userPolicy.Statement = [
{
Effect: 'Allow',
Action: 's3:PutObject',
@@ -276,255 +425,373 @@ describe('Policies validation - Statement', () => {
Resource: 'arn:aws:s3:::my_bucket/uploads/widgetco/*',
},
];
check(policy, successRes);
check(userPolicy, successRes, user);
});
it('should return an error for undefined', () => {
policy.Statement = undefined;
check(policy, failRes());
it('resource policy should succeed for a valid object', () => {
resourcePolicy.Statement = [
{
Effect: 'Allow',
Action: 's3:PutObject',
Resource: 'arn:aws:s3:::my_bucket/uploads/widgetco/*',
Principal: '*',
},
{
Effect: 'Deny',
Action: 's3:DeleteObject',
Resource: 'arn:aws:s3:::my_bucket/uploads/widgetco/*',
Principal: '*',
},
];
check(resourcePolicy, successRes, resource);
});
it('should return an error for an empty list', () => {
policy.Statement = [];
check(policy, failRes());
});
[
{
name: 'should return error for missing a required field - Action',
toDelete: ['Action'],
expected: 'fail',
errMessage: errDict.required.Action,
},
{
name: 'should return error for missing a required field - Effect',
toDelete: ['Effect'],
expected: 'fail',
},
{
name: 'should return error for missing required field - Resource',
toDelete: ['Resource'],
expected: 'fail',
},
{
name: 'should return error for missing multiple required fields',
toDelete: ['Effect', 'Resource'],
expected: 'fail',
},
{
name: 'should succeed w optional fields missing - Sid, Condition',
toDelete: ['Sid', 'Condition'],
expected: successRes,
},
].forEach(test => {
it(`user policy ${test.name}`, () => {
test.toDelete.forEach(p => delete userPolicy.Statement[p]);
if (test.expected === 'fail') {
check(userPolicy, failRes(user, test.errMessage), user);
} else {
check(userPolicy, test.expected, user);
}
});
it('should return an error for an empty object', () => {
policy.Statement = {};
check(policy, failRes(errDict.required.Action));
});
it('should return an error for missing a required field - Action', () => {
delete policy.Statement.Action;
check(policy, failRes(errDict.required.Action));
});
it('should return an error for missing a required field - Effect', () => {
delete policy.Statement.Effect;
check(policy, failRes());
});
it('should return an error for missing a required field - Resource', () => {
delete policy.Statement.Resource;
check(policy, failRes());
});
it('should return an error for missing multiple required fields', () => {
delete policy.Statement.Effect;
delete policy.Statement.Resource;
check(policy, failRes());
});
it('should succeed with optional fields missing - Sid, Condition', () => {
delete policy.Statement.Sid;
delete policy.Statement.Condition;
check(policy, successRes);
it(`resource policy ${test.name}`, () => {
test.toDelete.forEach(p => delete resourcePolicy.Statement[0][p]);
if (test.expected === 'fail') {
check(resourcePolicy, failRes(resource, test.errMessage),
resource);
} else {
check(resourcePolicy, test.expected, resource);
}
});
});
});
describe('Policies validation - Statement::Sid_block', () => {
it('should succeed if Sid is any alphanumeric string', () => {
check(policy, successRes);
it('user policy should succeed if Sid is any alphanumeric string', () => {
check(userPolicy, successRes, user);
});
it('should fail if Sid is not a valid format', () => {
policy.Statement.Sid = 'foo bar()';
check(policy, failRes());
it('resource policy should succeed if Sid is any alphanumeric string',
() => {
check(resourcePolicy, successRes, resource);
});
it('should fail if Sid is not a string', () => {
policy.Statement.Sid = 1234;
check(policy, failRes());
it('user policy should fail if Sid is not a valid format', () => {
userPolicy.Statement.Sid = 'foo bar()';
check(userPolicy, failRes(user), user);
});
it('resource policy should fail if Sid is not a valid format', () => {
resourcePolicy.Statement[0].Sid = 'foo bar()';
check(resourcePolicy, failRes(resource), resource);
});
it('user policy should fail if Sid is not a string', () => {
userPolicy.Statement.Sid = 1234;
check(userPolicy, failRes(user), user);
});
it('resource policy should fail if Sid is not a string', () => {
resourcePolicy.Statement[0].Sid = 1234;
check(resourcePolicy, failRes(resource), resource);
});
});
describe('Policies validation - Statement::Effect_block', () => {
it('should succeed for Allow', () => {
check(policy, successRes);
it('user policy should succeed for Allow', () => {
check(userPolicy, successRes, user);
});
it('should succeed for Deny', () => {
policy.Statement.Effect = 'Deny';
check(policy, successRes);
it('resource policy should succeed for Allow', () => {
check(resourcePolicy, successRes, resource);
});
it('should fail for strings other than Allow/Deny', () => {
policy.Statement.Effect = 'Reject';
check(policy, failRes());
it('user policy should succeed for Deny', () => {
userPolicy.Statement.Effect = 'Deny';
check(userPolicy, successRes, user);
});
it('should fail if Effect is not a string', () => {
policy.Statement.Effect = 1;
check(policy, failRes());
it('resource policy should succeed for Deny', () => {
resourcePolicy.Statement[0].Effect = 'Deny';
check(resourcePolicy, successRes, resource);
});
it('user policy should fail for strings other than Allow/Deny', () => {
userPolicy.Statement.Effect = 'Reject';
check(userPolicy, failRes(user), user);
});
it('resource policy should fail for strings other than Allow/Deny', () => {
resourcePolicy.Statement[0].Effect = 'Reject';
check(resourcePolicy, failRes(resource), resource);
});
it('user policy should fail if Effect is not a string', () => {
userPolicy.Statement.Effect = 1;
check(userPolicy, failRes(user), user);
});
it('resource policy should fail if Effect is not a string', () => {
resourcePolicy.Statement[0].Effect = 1;
check(resourcePolicy, failRes(resource), resource);
});
});
describe('Policies validation - Statement::Action_block/' +
const actionTests = [
{
name: 'should succeed for foo:bar',
value: 'foo:bar',
expected: successRes,
},
{
name: 'should succeed for foo:*',
value: 'foo:*',
expected: successRes,
},
{
name: 'should succeed for *',
value: '*',
expected: successRes,
},
{
name: 'should fail for **',
value: '**',
expected: 'fail',
errMessage: errDict.pattern.Action,
},
{
name: 'should fail for foobar',
value: 'foobar',
expected: 'fail',
errMessage: errDict.pattern.Action,
},
];
describe('User policies validation - Statement::Action_block/' +
'Statement::NotAction_block', () => {
beforeEach(() => {
policy.Statement.Action = undefined;
policy.Statement.NotAction = undefined;
userPolicy.Statement.Action = undefined;
userPolicy.Statement.NotAction = undefined;
});
it('should succeed for foo:bar', () => {
policy.Statement.Action = 'foo:bar';
check(policy, successRes);
actionTests.forEach(test => {
it(`${test.name}`, () => {
userPolicy.Statement.Action = test.value;
if (test.expected === 'fail') {
check(userPolicy, failRes(user, test.errMessage), user);
} else {
check(userPolicy, test.expected, user);
}
policy.Statement.Action = undefined;
policy.Statement.NotAction = 'foo:bar';
check(policy, successRes);
});
it('should succeed for foo:*', () => {
policy.Statement.Action = 'foo:*';
check(policy, successRes);
policy.Statement.Action = undefined;
policy.Statement.NotAction = 'foo:*';
check(policy, successRes);
});
it('should succeed for *', () => {
policy.Statement.Action = '*';
check(policy, successRes);
policy.Statement.Action = undefined;
policy.Statement.NotAction = '*';
check(policy, successRes);
});
it('should fail for **', () => {
policy.Statement.Action = '**';
check(policy, failRes(errDict.pattern.Action));
policy.Statement.Action = undefined;
policy.Statement.NotAction = '**';
check(policy, failRes(errDict.pattern.Action));
});
it('should fail for foobar', () => {
policy.Statement.Action = 'foobar';
check(policy, failRes(errDict.pattern.Action));
policy.Statement.Action = undefined;
policy.Statement.NotAction = 'foobar';
check(policy, failRes(errDict.pattern.Action));
userPolicy.Statement.Action = undefined;
userPolicy.Statement.NotAction = test.value;
if (test.expected === 'fail') {
check(userPolicy, failRes(user, test.errMessage), user);
} else {
check(userPolicy, test.expected, user);
}
});
});
});
describe('Policies validation - Statement::Resource_block' +
describe('Resource policies validation - Statement::Action_block', () => {
actionTests.forEach(test => {
it(`${test.name}`, () => {
resourcePolicy.Statement[0].Action = test.value;
if (test.expected === 'fail') {
check(resourcePolicy, failRes(resource, test.errMessage),
resource);
} else {
check(resourcePolicy, test.expected, resource);
}
});
});
});
const resourceTests = [
{
name: 'should succeed for arn:aws:s3:::*',
value: 'arn:aws:s3:::*',
expected: successRes,
},
{
name: 'should succeed for arn:aws:s3:::test/home/${aws:username}',
value: 'arn:aws:s3:::test/home/${aws:username}',
expected: successRes,
},
{
name: 'should succeed for arn:aws:ec2:us-west-1:1234567890:vol/*',
value: 'arn:aws:ec2:us-west-1:1234567890:vol/*',
expected: successRes,
},
{
name: 'should succeed for *',
value: '*',
expected: successRes,
},
{
name: 'should fail for arn:aws:ec2:us-west-1:vol/* - missing account id',
value: 'arn:aws:ec2:us-west-1:vol/*',
expected: 'fail',
errMessage: errDict.pattern.Resource,
},
{
name: 'should fail for arn:aws:ec2:us-west-1:123456789:v/${} - ${}',
value: 'arn:aws:ec2:us-west-1:123456789:v/${}',
expected: 'fail',
errMessage: errDict.pattern.Resource,
},
{
name: 'should fail for ec2:us-west-1:123456789012:vol/* - missing arn:aws:',
value: 'ec2:us-west-1:123456789012:vol/*',
expected: 'fail',
errMessage: errDict.pattern.Resource,
},
];
describe('User policies validation - Statement::Resource_block' +
'Statement::NotResource_block', () => {
beforeEach(() => {
policy.Statement.Resource = undefined;
policy.Statement.NotResource = undefined;
userPolicy.Statement.Resource = undefined;
userPolicy.Statement.NotResource = undefined;
});
it('should succeed for arn:aws:s3:::*', () => {
policy.Statement.Resource = 'arn:aws:s3:::*';
check(policy, successRes);
resourceTests.forEach(test => {
it(`${test.name}`, () => {
userPolicy.Statement.Resource = test.value;
if (test.expected === 'fail') {
check(userPolicy, failRes(user, test.errMessage), user);
} else {
check(userPolicy, test.expected, user);
}
policy.Statement.Resource = undefined;
policy.Statement.NotResource = 'arn:aws:s3:::*';
check(policy, successRes);
});
it('should succeed for arn:aws:s3:::test/home/${aws:username}', () => {
policy.Statement.Resource = 'arn:aws:s3:::test/home/${aws:username}';
check(policy, successRes);
policy.Statement.Resource = undefined;
policy.Statement.NotResource = 'arn:aws:s3:::test/home/${aws:username}';
check(policy, successRes);
});
it('should succeed for arn:aws:ec2:us-west-1:1234567890:vol/*', () => {
policy.Statement.Resource = 'arn:aws:ec2:us-west-1:1234567890:vol/*';
check(policy, successRes);
policy.Statement.Resource = undefined;
policy.Statement.NotResource = 'arn:aws:ec2:us-west-1:1234567890:vol/*';
check(policy, successRes);
});
it('should succeed for *', () => {
policy.Statement.Resource = '*';
check(policy, successRes);
policy.Statement.Resource = undefined;
policy.Statement.NotResource = '*';
check(policy, successRes);
});
it('should fail for arn:aws:ec2:us-west-1:vol/* - missing region', () => {
policy.Statement.Resource = 'arn:aws:ec2:1234567890:vol/*';
check(policy, failRes(errDict.pattern.Resource));
policy.Statement.Resource = undefined;
policy.Statement.NotResource = 'arn:aws:ec2:1234567890:vol/*';
check(policy, failRes(errDict.pattern.Resource));
});
it('should fail for arn:aws:ec2:us-west-1:123456789:v/${} - ${}', () => {
policy.Statement.Resource = 'arn:aws:ec2:us-west-1:123456789:v/${}';
check(policy, failRes(errDict.pattern.Resource));
policy.Statement.Resource = undefined;
policy.Statement.NotResource = 'arn:aws:ec2:us-west-1:123456789:v/${}';
check(policy, failRes(errDict.pattern.Resource));
});
it('should fail for ec2:us-west-1:qwerty:vol/* - missing arn:aws:', () => {
policy.Statement.Resource = 'ec2:us-west-1:123456789012:vol/*';
check(policy, failRes(errDict.pattern.Resource));
policy.Statement.Resource = undefined;
policy.Statement.NotResource = 'ec2:us-west-1:123456789012:vol/*';
check(policy, failRes(errDict.pattern.Resource));
userPolicy.Statement.Resource = undefined;
userPolicy.Statement.NotResource = test.value;
if (test.expected === 'fail') {
check(userPolicy, failRes(user, test.errMessage), user);
} else {
check(userPolicy, test.expected, user);
}
});
});
it('should fail for empty list of resources', () => {
policy.Statement.Resource = [];
check(policy, failRes(errDict.minItems.Resource));
userPolicy.Statement.Resource = [];
check(userPolicy, failRes(user, errDict.minItems.Resource), user);
});
});
describe('Resource policies validation - Statement::Resource_block', () => {
resourceTests.forEach(test => {
it(`${test.name}`, () => {
resourcePolicy.Statement[0].Resource = test.value;
if (test.expected === 'fail') {
check(resourcePolicy, failRes(resource, test.errMessage),
resource);
} else {
check(resourcePolicy, test.expected, resource);
}
});
});
it('should fail for empty list of resources', () => {
resourcePolicy.Statement[0].Resource = [];
check(resourcePolicy, failRes(resource, errDict.minItems.Resource),
resource);
});
});
describe('Policies validation - Statement::Condition_block', () => {
it('should succeed for single Condition', () => {
check(policy, successRes);
it('user policy should succeed for single Condition', () => {
check(userPolicy, successRes, user);
});
it('should succeed for multiple Conditions', () => {
policy.Statement.Condition = {
StringNotLike: { 's3:prefix': ['Development/*'] },
Null: { 's3:prefix': false },
};
check(policy, successRes);
it('resource policy should succeed for single Condition', () => {
check(resourcePolicy, successRes, resource);
});
it('should fail when Condition is not an Object', () => {
policy.Statement.Condition = 'NumericLessThanEquals';
check(policy, failRes());
});
[
{
name: 'should succeed for multiple Conditions',
value: {
StringNotLike: { 's3:prefix': ['Development/*'] },
Null: { 's3:prefix': false },
},
expected: successRes,
},
{
name: 'should fail when Condition is not an Object',
value: 'NumericLessThanEquals',
expected: 'fail',
},
{
name: 'should fail for an invalid Condition',
value: {
SomethingLike: { 's3:prefix': ['Development/*'] },
},
expected: 'fail',
},
{
name: 'should fail when one of the multiple conditions is invalid',
value: {
Null: { 's3:prefix': false },
SomethingLike: { 's3:prefix': ['Development/*'] },
},
expected: 'fail',
},
{
name: 'should fail when invalid property is assigned',
value: {
SomethingLike: { 's3:prefix': ['Development/*'] },
},
expected: 'fail',
},
].forEach(test => {
it(`user policy ${test.name}`, () => {
userPolicy.Statement.Condition = test.value;
if (test.expected === 'fail') {
check(userPolicy, failRes(user), user);
} else {
check(userPolicy, test.expected, user);
}
});
it('should fail for an invalid Condition', () => {
policy.Statement.Condition = {
SomethingLike: { 's3:prefix': ['Development/*'] },
};
check(policy, failRes());
});
it('should fail when one of the multiple conditions is invalid', () => {
policy.Statement.Condition = {
Null: { 's3:prefix': false },
SomethingLike: { 's3:prefix': ['Development/*'] },
};
check(policy, failRes());
});
it('should fail when invalid property is assigned', () => {
policy.Condition = {
SomethingLike: { 's3:prefix': ['Development/*'] },
};
check(policy, failRes());
it(`resource policy ${test.name}`, () => {
resourcePolicy.Statement[0].Condition = test.value;
if (test.expected === 'fail') {
check(resourcePolicy, failRes(resource), resource);
} else {
check(resourcePolicy, test.expected, resource);
}
});
});
});


@@ -472,7 +472,7 @@ describe('Principal evaluator', () => {
},
},
{
name: 'should deny user as principal if account is different',
name: 'should allow user as principal if account is different',
statement: [
{
Principal: {
@@ -491,7 +491,7 @@ describe('Principal evaluator', () => {
accountId: defaultAccountId,
},
result: {
result: 'Deny',
result: 'Allow',
checkAction: true,
},
},


@@ -1161,6 +1161,85 @@ describe('policyEvaluator', () => {
};
check(requestContext, rcModifiers, policy, 'Neutral');
});
it('should allow with StringEquals operator and ExistingObjectTag ' +
'key if meet condition', () => {
policy.Statement.Condition = {
StringEquals: { 's3:ExistingObjectTag/tagKey': 'tagValue' },
};
const rcModifiers = {
_existingObjTag: 'tagKey=tagValue',
_needTagEval: true,
};
check(requestContext, rcModifiers, policy, 'Allow');
});
it('should allow StringEquals operator and RequestObjectTag ' +
'key if meet condition', () => {
policy.Statement.Condition = {
StringEquals: { 's3:RequestObjectTagKey/tagKey': 'tagValue' },
};
const rcModifiers = {
_requestObjTags: 'tagKey=tagValue',
_needTagEval: true,
};
check(requestContext, rcModifiers, policy, 'Allow');
});
it('should allow with ForAnyValue prefix if meet condition', () => {
policy.Statement.Condition = {
'ForAnyValue:StringLike': { 's3:RequestObjectTagKeys': ['tagOne', 'tagTwo'] },
};
const rcModifiers = {
_requestObjTags: 'tagOne=keyOne&tagThree=keyThree',
_needTagEval: true,
};
check(requestContext, rcModifiers, policy, 'Allow');
});
it('should allow with ForAllValues prefix if meet condition', () => {
policy.Statement.Condition = {
'ForAllValues:StringLike': { 's3:RequestObjectTagKeys': ['tagOne', 'tagTwo'] },
};
const rcModifiers = {
_requestObjTags: 'tagOne=keyOne&tagTwo=keyTwo',
_needTagEval: true,
};
check(requestContext, rcModifiers, policy, 'Allow');
});
it('should not allow with ForAnyValue prefix if do not meet condition', () => {
policy.Statement.Condition = {
'ForAnyValue:StringLike': { 's3:RequestObjectTagKeys': ['tagOne', 'tagTwo'] },
};
const rcModifiers = {
_requestObjTags: 'tagThree=keyThree&tagFour=keyFour',
_needTagEval: true,
};
check(requestContext, rcModifiers, policy, 'Neutral');
});
it('should not allow with ForAllValues prefix if do not meet condition', () => {
policy.Statement.Condition = {
'ForAllValues:StringLike': { 's3:RequestObjectTagKeys': ['tagOne', 'tagTwo'] },
};
const rcModifiers = {
_requestObjTags: 'tagThree=keyThree&tagFour=keyFour',
_needTagEval: true,
};
check(requestContext, rcModifiers, policy, 'Neutral');
});
it('should be neutral with StringEquals if condition key does not exist', () => {
policy.Statement.Condition = {
StringEquals: { 's3:Foobar/tagKey': 'tagValue' },
};
const rcModifiers = {
_requestObjTags: 'tagKey=tagValue',
_needTagEval: true,
};
check(requestContext, rcModifiers, policy, 'Neutral');
});
});
});


@@ -0,0 +1,856 @@
const assert = require('assert');
const LifecycleRule = require('../../../lib/models/LifecycleRule');
const { LifecycleUtils } = require('../../../lib/s3middleware/lifecycleHelpers');
// 5 days prior to CURRENT
const PAST = new Date(2018, 1, 5);
const CURRENT = new Date(2018, 1, 10);
// 5 days after CURRENT
const FUTURE = new Date(2018, 1, 15);
// Get the date from the number of days given.
function getDate(params) {
const numberOfDaysFromNow = params.numberOfDaysFromNow || 0;
const oneDay = 24 * 60 * 60 * 1000; // Milliseconds in a day.
const milliseconds = numberOfDaysFromNow * oneDay;
const timestamp = Date.now() + milliseconds;
return new Date(timestamp);
}
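The helper above offsets "now" by a signed number of days; a self-contained restatement of the same pattern (plain Node.js, no assumptions beyond the standard Date API) shows that negative offsets produce past dates:

```javascript
// Restatement of the getDate helper: offset the current time by a
// number of days; negative offsets yield dates in the past.
function getDate(params) {
    const numberOfDaysFromNow = params.numberOfDaysFromNow || 0;
    const oneDay = 24 * 60 * 60 * 1000; // milliseconds in a day
    return new Date(Date.now() + numberOfDaysFromNow * oneDay);
}

const past = getDate({ numberOfDaysFromNow: -2 });
const now = getDate({});
console.log(past < now); // true: Date objects compare via their timestamps
```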
// Get the metadata object.
function getMetadataObject(lastModified, storageClass) {
return {
LastModified: lastModified,
StorageClass: storageClass || 'STANDARD',
};
}
// Get all rule IDs, sorted.
function getRuleIDs(rules) {
return rules.map(rule => rule.ID).sort();
}
describe('LifecycleUtils::getApplicableRules', () => {
let lutils;
before(() => {
lutils = new LifecycleUtils([
'expiration',
'noncurrentVersionExpiration',
'abortIncompleteMultipartUpload',
'transitions',
]);
});
it('should return earliest applicable expirations', () => {
const filteredRules = [
new LifecycleRule().addID('task-1').addExpiration('Date', FUTURE)
.build(),
new LifecycleRule().addID('task-2').addExpiration('Days', 10).build(),
new LifecycleRule().addID('task-3').addExpiration('Date', PAST)
.build(),
new LifecycleRule().addID('task-4').addExpiration('Date', CURRENT)
.build(),
new LifecycleRule().addID('task-5').addExpiration('Days', 5).build(),
];
const res1 = lutils.getApplicableRules(filteredRules);
assert.strictEqual(res1.Expiration.Date, PAST);
assert.strictEqual(res1.Expiration.Days, 5);
// remove `PAST` from rules
filteredRules.splice(2, 1);
const res2 = lutils.getApplicableRules(filteredRules);
assert.strictEqual(res2.Expiration.Date, CURRENT);
});
it('should return earliest applicable rules', () => {
const filteredRules = [
new LifecycleRule().addID('task-1').addExpiration('Date', FUTURE)
.build(),
new LifecycleRule().addID('task-2').addAbortMPU(18).build(),
new LifecycleRule().addID('task-3').addExpiration('Date', PAST)
.build(),
new LifecycleRule().addID('task-4').addNCVExpiration(3).build(),
new LifecycleRule().addID('task-5').addNCVExpiration(12).build(),
new LifecycleRule().addID('task-6').addExpiration('Date', CURRENT)
.build(),
new LifecycleRule().addID('task-7').addNCVExpiration(7).build(),
new LifecycleRule().addID('task-8').addAbortMPU(4).build(),
new LifecycleRule().addID('task-9').addAbortMPU(22).build(),
];
const res = lutils.getApplicableRules(filteredRules);
assert.deepStrictEqual(Object.keys(res.Expiration), ['ID', 'Date']);
assert.deepStrictEqual(res.Expiration, { ID: 'task-3', Date: PAST });
assert.strictEqual(
res.AbortIncompleteMultipartUpload.DaysAfterInitiation, 4);
assert.strictEqual(
res.NoncurrentVersionExpiration.NoncurrentDays, 3);
});
it('should return Transition with Days', () => {
const applicableRules = [
new LifecycleRule()
.addTransitions([
{
Days: 1,
StorageClass: 'zenko',
},
])
.build(),
];
const lastModified = getDate({ numberOfDaysFromNow: -2 });
const object = getMetadataObject(lastModified);
const rules = lutils.getApplicableRules(applicableRules, object);
assert.deepStrictEqual(rules, {
Transition: {
Days: 1,
StorageClass: 'zenko',
},
});
});
it('should return Transition when multiple rule transitions', () => {
const applicableRules = [
new LifecycleRule()
.addTransitions([
{
Days: 1,
StorageClass: 'zenko-1',
},
{
Days: 3,
StorageClass: 'zenko-3',
},
])
.build(),
];
const lastModified = getDate({ numberOfDaysFromNow: -4 });
const object = getMetadataObject(lastModified);
const rules = lutils.getApplicableRules(applicableRules, object);
assert.deepStrictEqual(rules, {
Transition: {
Days: 3,
StorageClass: 'zenko-3',
},
});
});
it('should return Transition with Date', () => {
const applicableRules = [
new LifecycleRule()
.addTransitions([{
Date: 0,
StorageClass: 'zenko',
}])
.build(),
];
const lastModified = getDate({ numberOfDaysFromNow: -1 });
const object = getMetadataObject(lastModified);
const rules = lutils.getApplicableRules(applicableRules, object);
assert.deepStrictEqual(rules, {
Transition: {
Date: 0,
StorageClass: 'zenko',
},
});
});
it('should return Transition across many rules: first rule', () => {
const applicableRules = [
new LifecycleRule()
.addTransitions([{
Days: 1,
StorageClass: 'zenko-1',
}])
.build(),
new LifecycleRule()
.addTransitions([{
Days: 3,
StorageClass: 'zenko-3',
}])
.build(),
];
const lastModified = getDate({ numberOfDaysFromNow: -2 });
const object = getMetadataObject(lastModified);
const rules = lutils.getApplicableRules(applicableRules, object);
assert.deepStrictEqual(rules, {
Transition: {
Days: 1,
StorageClass: 'zenko-1',
},
});
});
it('should return Transition across many rules: second rule', () => {
const applicableRules = [
new LifecycleRule()
.addTransitions([{
Days: 1,
StorageClass: 'zenko-1',
}])
.build(),
new LifecycleRule()
.addTransitions([{
Days: 3,
StorageClass: 'zenko-3',
}])
.build(),
];
const lastModified = getDate({ numberOfDaysFromNow: -4 });
const object = getMetadataObject(lastModified);
const rules = lutils.getApplicableRules(applicableRules, object);
assert.deepStrictEqual(rules, {
Transition: {
Days: 3,
StorageClass: 'zenko-3',
},
});
});
it('should return Transition across many rules: first rule with ' +
'multiple transitions', () => {
const applicableRules = [
new LifecycleRule()
.addTransitions([{
Days: 1,
StorageClass: 'zenko-1',
}, {
Days: 3,
StorageClass: 'zenko-3',
}])
.build(),
new LifecycleRule()
.addTransitions([{
Days: 4,
StorageClass: 'zenko-4',
}])
.build(),
];
const lastModified = getDate({ numberOfDaysFromNow: -2 });
const object = getMetadataObject(lastModified);
const rules = lutils.getApplicableRules(applicableRules, object);
assert.deepStrictEqual(rules, {
Transition: {
Days: 1,
StorageClass: 'zenko-1',
},
});
});
it('should return Transition across many rules: second rule with ' +
'multiple transitions', () => {
const applicableRules = [
new LifecycleRule()
.addTransitions([{
Days: 1,
StorageClass: 'zenko-1',
}, {
Days: 3,
StorageClass: 'zenko-3',
}])
.build(),
new LifecycleRule()
.addTransitions([{
Days: 4,
StorageClass: 'zenko-4',
}, {
Days: 6,
StorageClass: 'zenko-6',
}])
.build(),
];
const lastModified = getDate({ numberOfDaysFromNow: -5 });
const object = getMetadataObject(lastModified);
const rules = lutils.getApplicableRules(applicableRules, object);
assert.deepStrictEqual(rules, {
Transition: {
Days: 4,
StorageClass: 'zenko-4',
},
});
});
it('should return Transition across many rules: combination of Date ' +
'and Days gets Date result', () => {
const applicableDate = getDate({ numberOfDaysFromNow: -1 });
const applicableRules = [
new LifecycleRule()
.addTransitions([{
Days: 1,
StorageClass: 'zenko-1',
}])
.build(),
new LifecycleRule()
.addTransitions([{
Date: applicableDate,
StorageClass: 'zenko-3',
}])
.build(),
];
const lastModified = getDate({ numberOfDaysFromNow: -4 });
const object = getMetadataObject(lastModified);
const rules = lutils.getApplicableRules(applicableRules, object);
assert.deepStrictEqual(rules, {
Transition: {
Date: applicableDate,
StorageClass: 'zenko-3',
},
});
});
it('should return Transition across many rules: combination of Date ' +
'and Days gets Days result', () => {
const applicableRules = [
new LifecycleRule()
.addTransitions([{
Days: 3,
StorageClass: 'zenko-1',
}])
.build(),
new LifecycleRule()
.addTransitions([{
Date: getDate({ numberOfDaysFromNow: -4 }),
StorageClass: 'zenko-3',
}])
.build(),
];
const lastModified = getDate({ numberOfDaysFromNow: -4 });
const object = getMetadataObject(lastModified);
const rules = lutils.getApplicableRules(applicableRules, object);
assert.deepStrictEqual(rules, {
Transition: {
Days: 3,
StorageClass: 'zenko-1',
},
});
});
it('should not return transition when Transitions has no applicable ' +
'rule: Days', () => {
const applicableRules = [
new LifecycleRule()
.addTransitions([
{
Days: 3,
StorageClass: 'zenko',
},
])
.build(),
];
const lastModified = getDate({ numberOfDaysFromNow: -2 });
const object = getMetadataObject(lastModified);
const rules = lutils.getApplicableRules(applicableRules, object);
assert.strictEqual(rules.Transition, undefined);
});
it('should not return transition when Transitions has no applicable ' +
'rule: Date', () => {
const applicableRules = [
new LifecycleRule()
.addTransitions([{
Date: new Date(getDate({ numberOfDaysFromNow: 1 })),
StorageClass: 'zenko',
}])
.build(),
];
const lastModified = getDate({ numberOfDaysFromNow: 0 });
const object = getMetadataObject(lastModified);
const rules = lutils.getApplicableRules(applicableRules, object);
assert.strictEqual(rules.Transition, undefined);
});
it('should not return transition when Transitions is an empty ' +
'array', () => {
const applicableRules = [
new LifecycleRule()
.addTransitions([])
.build(),
];
const rules = lutils.getApplicableRules(applicableRules, {});
assert.strictEqual(rules.Transition, undefined);
});
it('should not return transition when Transitions is undefined', () => {
const applicableRules = [
new LifecycleRule()
.addExpiration('Days', 1)
.build(),
];
const rules = lutils.getApplicableRules(applicableRules, {});
assert.strictEqual(rules.Transition, undefined);
});
describe('transitioning to the same storage class', () => {
it('should not return transition when applicable transition is ' +
'already stored at the destination', () => {
const applicableRules = [
new LifecycleRule()
.addTransitions([
{
Days: 1,
StorageClass: 'zenko',
},
])
.build(),
];
const lastModified = getDate({ numberOfDaysFromNow: -2 });
const object = getMetadataObject(lastModified, 'zenko');
const rules = lutils.getApplicableRules(applicableRules, object);
assert.strictEqual(rules.Transition, undefined);
});
it('should not return transition when applicable transition is ' +
'already stored at the destination: multiple rules', () => {
const applicableRules = [
new LifecycleRule()
.addTransitions([
{
Days: 2,
StorageClass: 'zenko',
},
])
.build(),
new LifecycleRule()
.addTransitions([
{
Days: 1,
StorageClass: 'STANDARD',
},
])
.build(),
];
const lastModified = getDate({ numberOfDaysFromNow: -3 });
const object = getMetadataObject(lastModified, 'zenko');
const rules = lutils.getApplicableRules(applicableRules, object);
assert.strictEqual(rules.Transition, undefined);
});
});
});
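The expectations in the suite above rely on `getApplicableRules` keeping the earliest applicable entry per rule type (earliest `Date`, smallest `Days`). A standalone sketch of that selection for expirations (a hypothetical reduction for illustration, not the Arsenal implementation):

```javascript
// Hypothetical helper: across rules, keep the earliest expiration Date
// and, independently, the smallest expiration Days value.
function earliestExpiration(rules) {
    return rules.reduce((acc, rule) => {
        const exp = rule.Expiration;
        if (!exp) {
            return acc;
        }
        if (exp.Date !== undefined &&
            (acc.Date === undefined || exp.Date < acc.Date)) {
            acc.Date = exp.Date;
        }
        if (exp.Days !== undefined &&
            (acc.Days === undefined || exp.Days < acc.Days)) {
            acc.Days = exp.Days;
        }
        return acc;
    }, {});
}

// Mirrors the first test case: mixed Date and Days expirations.
const res = earliestExpiration([
    { Expiration: { Date: new Date(2018, 1, 15) } }, // FUTURE
    { Expiration: { Days: 10 } },
    { Expiration: { Date: new Date(2018, 1, 5) } },  // PAST
    { Expiration: { Days: 5 } },
]);
console.log(res.Days); // 5
```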
describe('LifecycleUtils::filterRules', () => {
let lutils;
before(() => {
lutils = new LifecycleUtils();
});
it('should filter out Status disabled rules', () => {
const mBucketRules = [
new LifecycleRule().addID('task-1').build(),
new LifecycleRule().addID('task-2').disable().build(),
new LifecycleRule().addID('task-3').build(),
new LifecycleRule().addID('task-4').build(),
new LifecycleRule().addID('task-2').disable().build(),
];
const item = {
Key: 'example-item',
LastModified: PAST,
};
const objTags = { TagSet: [] };
const res = lutils.filterRules(mBucketRules, item, objTags);
const expected = mBucketRules.filter(rule =>
rule.Status === 'Enabled');
assert.deepStrictEqual(getRuleIDs(res), getRuleIDs(expected));
});
it('should filter out unmatched prefixes', () => {
const mBucketRules = [
new LifecycleRule().addID('task-1').addPrefix('atask/').build(),
new LifecycleRule().addID('task-2').addPrefix('atasker/').build(),
new LifecycleRule().addID('task-3').addPrefix('cat-').build(),
new LifecycleRule().addID('task-4').addPrefix('xatask/').build(),
new LifecycleRule().addID('task-5').addPrefix('atask').build(),
new LifecycleRule().addID('task-6').addPrefix('Atask/').build(),
new LifecycleRule().addID('task-7').addPrefix('atAsk/').build(),
new LifecycleRule().addID('task-8').build(),
];
const item1 = {
Key: 'atask/example-item',
LastModified: CURRENT,
};
const item2 = {
Key: 'cat-test',
LastModified: CURRENT,
};
const objTags = { TagSet: [] };
const res1 = lutils.filterRules(mBucketRules, item1, objTags);
assert.strictEqual(res1.length, 3);
const expRes1 = getRuleIDs(mBucketRules.filter(rule => {
if (!rule.Filter || !rule.Filter.Prefix) {
return true;
}
if (item1.Key.startsWith(rule.Filter.Prefix)) {
return true;
}
return false;
}));
assert.deepStrictEqual(expRes1, getRuleIDs(res1));
const res2 = lutils.filterRules(mBucketRules, item2, objTags);
assert.strictEqual(res2.length, 2);
const expRes2 = getRuleIDs(mBucketRules.filter(rule =>
(rule.Filter && rule.Filter.Prefix && rule.Filter.Prefix.startsWith('cat-'))
|| (!rule.Filter || !rule.Filter.Prefix)));
assert.deepStrictEqual(expRes2, getRuleIDs(res2));
});
it('should filter out unmatched single tags', () => {
const mBucketRules = [
new LifecycleRule().addID('task-1').addTag('tag1', 'val1').build(),
new LifecycleRule().addID('task-2').addTag('tag3-1', 'val3')
.addTag('tag3-2', 'val3').build(),
new LifecycleRule().addID('task-3').addTag('tag3-1', 'val3').build(),
new LifecycleRule().addID('task-4').addTag('tag1', 'val1').build(),
new LifecycleRule().addID('task-5').addTag('tag3-2', 'val3')
.addTag('tag3-1', 'false').build(),
new LifecycleRule().addID('task-6').addTag('tag3-2', 'val3')
.addTag('tag3-1', 'val3').build(),
];
const item = {
Key: 'example-item',
LastModified: CURRENT,
};
const objTags1 = { TagSet: [{ Key: 'tag1', Value: 'val1' }] };
const res1 = lutils.filterRules(mBucketRules, item, objTags1);
assert.strictEqual(res1.length, 2);
const expRes1 = getRuleIDs(mBucketRules.filter(rule =>
(rule.Filter && rule.Filter.Tag &&
rule.Filter.Tag.Key === 'tag1' &&
rule.Filter.Tag.Value === 'val1')
));
assert.deepStrictEqual(expRes1, getRuleIDs(res1));
const objTags2 = { TagSet: [{ Key: 'tag3-1', Value: 'val3' }] };
const res2 = lutils.filterRules(mBucketRules, item, objTags2);
assert.strictEqual(res2.length, 1);
const expRes2 = getRuleIDs(mBucketRules.filter(rule =>
rule.Filter && rule.Filter.Tag &&
rule.Filter.Tag.Key === 'tag3-1' &&
rule.Filter.Tag.Value === 'val3'
));
assert.deepStrictEqual(expRes2, getRuleIDs(res2));
});
it('should filter out unmatched multiple tags', () => {
const mBucketRules = [
new LifecycleRule().addID('task-1').addTag('tag1', 'val1')
.addTag('tag2', 'val1').build(),
new LifecycleRule().addID('task-2').addTag('tag1', 'val1').build(),
new LifecycleRule().addID('task-3').addTag('tag2', 'val1').build(),
new LifecycleRule().addID('task-4').addTag('tag2', 'false').build(),
new LifecycleRule().addID('task-5').addTag('tag2', 'val1')
.addTag('tag1', 'false').build(),
new LifecycleRule().addID('task-6').addTag('tag2', 'val1')
.addTag('tag1', 'val1').build(),
new LifecycleRule().addID('task-7').addTag('tag2', 'val1')
.addTag('tag1', 'val1').addTag('tag3', 'val1').build(),
new LifecycleRule().addID('task-8').addTag('tag2', 'val1')
.addTag('tag1', 'val1').addTag('tag3', 'false').build(),
new LifecycleRule().addID('task-9').build(),
];
const item = {
Key: 'example-item',
LastModified: CURRENT,
};
const objTags1 = { TagSet: [
{ Key: 'tag1', Value: 'val1' },
{ Key: 'tag2', Value: 'val1' },
] };
const res1 = lutils.filterRules(mBucketRules, item, objTags1);
assert.strictEqual(res1.length, 5);
assert.deepStrictEqual(getRuleIDs(res1), ['task-1', 'task-2',
'task-3', 'task-6', 'task-9']);
const objTags2 = { TagSet: [
{ Key: 'tag2', Value: 'val1' },
] };
const res2 = lutils.filterRules(mBucketRules, item, objTags2);
assert.strictEqual(res2.length, 2);
assert.deepStrictEqual(getRuleIDs(res2), ['task-3', 'task-9']);
const objTags3 = { TagSet: [
{ Key: 'tag2', Value: 'val1' },
{ Key: 'tag1', Value: 'val1' },
{ Key: 'tag3', Value: 'val1' },
] };
const res3 = lutils.filterRules(mBucketRules, item, objTags3);
assert.strictEqual(res3.length, 6);
assert.deepStrictEqual(getRuleIDs(res3), ['task-1', 'task-2',
'task-3', 'task-6', 'task-7', 'task-9']);
});
it('should filter correctly for an object with no tags', () => {
const mBucketRules = [
new LifecycleRule().addID('task-1').addTag('tag1', 'val1')
.addTag('tag2', 'val1').build(),
new LifecycleRule().addID('task-2').addTag('tag1', 'val1').build(),
new LifecycleRule().addID('task-3').addTag('tag2', 'val1').build(),
new LifecycleRule().addID('task-4').addTag('tag2', 'false').build(),
new LifecycleRule().addID('task-5').addTag('tag2', 'val1')
.addTag('tag1', 'false').build(),
new LifecycleRule().addID('task-6').addTag('tag2', 'val1')
.addTag('tag1', 'val1').build(),
new LifecycleRule().addID('task-7').addTag('tag2', 'val1')
.addTag('tag1', 'val1').addTag('tag3', 'val1').build(),
new LifecycleRule().addID('task-8').addTag('tag2', 'val1')
.addTag('tag1', 'val1').addTag('tag3', 'false').build(),
new LifecycleRule().addID('task-9').build(),
];
const item = {
Key: 'example-item',
LastModified: CURRENT,
};
const objTags = { TagSet: [] };
const objNoTagSet = {};
[objTags, objNoTagSet].forEach(obj => {
const res = lutils.filterRules(mBucketRules, item, obj);
assert.strictEqual(res.length, 1);
assert.deepStrictEqual(getRuleIDs(res), ['task-9']);
});
});
});
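The Status, prefix, and tag cases asserted above can be condensed into a small sketch. This is a hypothetical helper, not the Arsenal `LifecycleUtils.filterRules` implementation, and the `Filter.And.Tags` shape assumed for multi-tag rules is an assumption:

```javascript
// Hypothetical sketch of the filtering behavior the cases above assert on.
// Not the Arsenal implementation; the Filter.And.Tags shape for multi-tag
// rules is an assumption.
function sketchFilterRules(rules, item, objTags) {
    const tagSet = (objTags && objTags.TagSet) || [];
    const hasTag = t =>
        tagSet.some(o => o.Key === t.Key && o.Value === t.Value);
    return rules.filter(rule => {
        // Disabled rules never apply.
        if (rule.Status !== 'Enabled') {
            return false;
        }
        const filter = rule.Filter || {};
        // Prefix match is a case-sensitive startsWith on the object key.
        if (filter.Prefix && !item.Key.startsWith(filter.Prefix)) {
            return false;
        }
        // Every tag on the rule must be present on the object with the
        // same value; an object may carry extra tags.
        const tags = filter.Tag ? [filter.Tag]
            : (filter.And && filter.And.Tags) || [];
        return tags.every(hasTag);
    });
}
```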
describe('LifecycleUtils::getApplicableTransition', () => {
let lutils;
before(() => {
lutils = new LifecycleUtils();
});
describe('using Days time type', () => {
it('should return undefined if no rules given', () => {
const result = lutils.getApplicableTransition({
transitions: [],
currentDate: '1970-01-03T00:00:00.000Z',
lastModified: '1970-01-01T00:00:00.000Z',
store: {},
});
assert.deepStrictEqual(result, undefined);
});
it('should return undefined when no rule applies', () => {
const result = lutils.getApplicableTransition({
transitions: [
{
Days: 1,
StorageClass: 'zenko',
},
],
currentDate: '1970-01-01T23:59:59.999Z',
lastModified: '1970-01-01T00:00:00.000Z',
store: {},
});
assert.deepStrictEqual(result, undefined);
});
it('should return a single rule if it applies', () => {
const result = lutils.getApplicableTransition({
transitions: [
{
Days: 1,
StorageClass: 'zenko',
},
],
currentDate: '1970-01-02T00:00:00.000Z',
lastModified: '1970-01-01T00:00:00.000Z',
store: {},
});
const expected = {
Days: 1,
StorageClass: 'zenko',
};
assert.deepStrictEqual(result, expected);
});
it('should return the most applicable rule: last rule', () => {
const result = lutils.getApplicableTransition({
transitions: [
{
Days: 1,
StorageClass: 'zenko',
},
{
Days: 10,
StorageClass: 'zenko',
},
],
currentDate: '1970-01-11T00:00:00.000Z',
lastModified: '1970-01-01T00:00:00.000Z',
store: {},
});
const expected = {
Days: 10,
StorageClass: 'zenko',
};
assert.deepStrictEqual(result, expected);
});
it('should return the most applicable rule: middle rule', () => {
const result = lutils.getApplicableTransition({
transitions: [
{
Days: 1,
StorageClass: 'zenko',
},
{
Days: 4,
StorageClass: 'zenko',
},
{
Days: 10,
StorageClass: 'zenko',
},
],
currentDate: '1970-01-05T00:00:00.000Z',
lastModified: '1970-01-01T00:00:00.000Z',
store: {},
});
const expected = {
Days: 4,
StorageClass: 'zenko',
};
assert.deepStrictEqual(result, expected);
});
});
describe('using Date time type', () => {
it('should return undefined if no rules given', () => {
const result = lutils.getApplicableTransition({
transitions: [],
currentDate: '1970-01-03T00:00:00.000Z',
lastModified: '1970-01-01T00:00:00.000Z',
store: {},
});
assert.deepStrictEqual(result, undefined);
});
it('should return undefined when no rule applies', () => {
const result = lutils.getApplicableTransition({
transitions: [
{
Date: '1970-01-02T00:00:00.000Z',
StorageClass: 'zenko',
},
],
currentDate: '1970-01-01T23:59:59.999Z',
lastModified: '1970-01-01T00:00:00.000Z',
store: {},
});
assert.deepStrictEqual(result, undefined);
});
it('should return a single rule if it applies', () => {
const result = lutils.getApplicableTransition({
transitions: [
{
Date: '1970-01-02T00:00:00.000Z',
StorageClass: 'zenko',
},
],
currentDate: '1970-01-02T00:00:00.000Z',
lastModified: '1970-01-01T00:00:00.000Z',
store: {},
});
const expected = {
Date: '1970-01-02T00:00:00.000Z',
StorageClass: 'zenko',
};
assert.deepStrictEqual(result, expected);
});
it('should return the most applicable rule', () => {
const result = lutils.getApplicableTransition({
transitions: [
{
Date: '1970-01-02T00:00:00.000Z',
StorageClass: 'zenko',
},
{
Date: '1970-01-10T00:00:00.000Z',
StorageClass: 'zenko',
},
],
currentDate: '1970-01-11T00:00:00.000Z',
lastModified: '1970-01-01T00:00:00.000Z',
store: {},
});
const expected = {
Date: '1970-01-10T00:00:00.000Z',
StorageClass: 'zenko',
};
assert.deepStrictEqual(result, expected);
});
});
});
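The selection logic these Days/Date cases pin down can be sketched as follows. This is a hypothetical helper, not the Arsenal implementation; it assumes each transition's cutoff is `lastModified + Days` (or the absolute `Date`), and the applicable transition with the latest cutoff wins:

```javascript
// Hypothetical sketch: among transitions whose cutoff has passed, pick the
// one with the latest cutoff; return undefined when none applies.
const DAY_MS = 24 * 60 * 60 * 1000;
function sketchGetApplicableTransition({ transitions, currentDate,
    lastModified }) {
    const now = new Date(currentDate).getTime();
    const lm = new Date(lastModified).getTime();
    let best;
    let bestCutoff = -Infinity;
    transitions.forEach(t => {
        // Date rules are absolute; Days rules are relative to lastModified.
        const cutoff = t.Date !== undefined
            ? new Date(t.Date).getTime()
            : lm + t.Days * DAY_MS;
        if (cutoff <= now && cutoff > bestCutoff) {
            bestCutoff = cutoff;
            best = t;
        }
    });
    return best;
}
```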
describe('LifecycleUtils::compareTransitions', () => {
let lutils;
before(() => {
lutils = new LifecycleUtils();
});
it('should return undefined if no rules given', () => {
const result = lutils.compareTransitions({ });
assert.strictEqual(result, undefined);
});
it('should return first rule if second rule is not given', () => {
const transition1 = {
Days: 1,
StorageClass: 'zenko',
};
const result = lutils.compareTransitions({ transition1 });
assert.deepStrictEqual(result, transition1);
});
it('should return second rule if first rule is not given', () => {
const transition2 = {
Days: 1,
StorageClass: 'zenko',
};
const result = lutils.compareTransitions({ transition2 });
assert.deepStrictEqual(result, transition2);
});
it('should return the first rule if older than the second rule', () => {
const transition1 = {
Days: 2,
StorageClass: 'zenko',
};
const transition2 = {
Days: 1,
StorageClass: 'zenko',
};
const result = lutils.compareTransitions({
transition1,
transition2,
lastModified: '1970-01-01T00:00:00.000Z',
});
assert.deepStrictEqual(result, transition1);
});
it('should return the second rule if older than the first rule', () => {
const transition1 = {
Days: 1,
StorageClass: 'zenko',
};
const transition2 = {
Days: 2,
StorageClass: 'zenko',
};
const result = lutils.compareTransitions({
transition1,
transition2,
lastModified: '1970-01-01T00:00:00.000Z',
});
assert.deepStrictEqual(result, transition2);
});
});


@ -0,0 +1,92 @@
const assert = require('assert');
const { convertToXml, parseLegalHoldXml } =
require('../../../lib/s3middleware/objectLegalHold');
const DummyRequestLogger = require('../helpers').DummyRequestLogger;
const log = new DummyRequestLogger();
const failTests = [
{
description: 'should fail with empty status',
params: { status: '' },
error: 'MalformedXML',
errMessage: 'request xml does not contain Status',
},
{
description: 'should fail with invalid status "on"',
params: { status: 'on' },
error: 'MalformedXML',
errMessage: 'Status request xml must be one of "ON", "OFF"',
},
{
description: 'should fail with invalid status "On"',
params: { status: 'On' },
error: 'MalformedXML',
errMessage: 'Status request xml must be one of "ON", "OFF"',
},
{
description: 'should fail with invalid status "off"',
params: { status: 'off' },
error: 'MalformedXML',
errMessage: 'Status request xml must be one of "ON", "OFF"',
},
{
description: 'should fail with invalid status "Off"',
params: { status: 'Off' },
error: 'MalformedXML',
errMessage: 'Status request xml must be one of "ON", "OFF"',
},
];
const generateXml = status =>
'<?xml version="1.0" encoding="UTF-8" standalone="yes"?>' +
`<LegalHold><Status>${status}</Status></LegalHold>`;
describe('object legal hold helpers: parseLegalHoldXml', () => {
failTests.forEach(test => {
it(test.description, done => {
const status = test.params.status;
parseLegalHoldXml(generateXml(status), log, err => {
assert(err[test.error]);
assert.strictEqual(err.description, test.errMessage);
done();
});
});
});
it('should pass with legal hold status "ON"', done => {
parseLegalHoldXml(generateXml('ON'), log, (err, result) => {
assert.ifError(err);
assert.strictEqual(result, true);
done();
});
});
it('should pass with legal hold status "OFF"', done => {
parseLegalHoldXml(generateXml('OFF'), log, (err, result) => {
assert.ifError(err);
assert.strictEqual(result, false);
done();
});
});
});
describe('object legal hold helpers: convertToXml', () => {
it('should return correct xml when legal hold status "ON"', () => {
const xml = convertToXml(true);
const expectedXml = generateXml('ON');
assert.strictEqual(xml, expectedXml);
});
it('should return correct xml when legal hold status "OFF"', () => {
const xml = convertToXml(false);
const expectedXml = generateXml('OFF');
assert.strictEqual(xml, expectedXml);
});
it('should return empty string when legal hold not set', () => {
const xml = convertToXml(undefined);
const expectedXml = '';
assert.strictEqual(xml, expectedXml);
});
});
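The round-trip these legal-hold tests cover maps a boolean onto `<Status>ON|OFF</Status>` and an unset hold onto the empty string. A minimal sketch (hypothetical `sketchLegalHoldToXml`, not the s3middleware `convertToXml` itself):

```javascript
// Hypothetical sketch of the legal-hold serialization asserted above:
// true -> ON, false -> OFF, undefined -> ''.
function sketchLegalHoldToXml(isOn) {
    if (isOn === undefined) {
        return '';
    }
    return '<?xml version="1.0" encoding="UTF-8" standalone="yes"?>' +
        `<LegalHold><Status>${isOn ? 'ON' : 'OFF'}</Status></LegalHold>`;
}
```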


@ -0,0 +1,127 @@
const assert = require('assert');
const {
parseRetentionXml,
convertToXml,
} = require('../../../lib/s3middleware/objectRetention');
const DummyRequestLogger = require('../helpers').DummyRequestLogger;
const log = new DummyRequestLogger();
const date = new Date();
date.setDate(date.getDate() + 1);
const failDate = new Date('05/01/2020');
const passDate = new Date();
passDate.setDate(passDate.getDate() + 2);
function buildXml(key, value) {
const mode = key === 'Mode' ?
`<Mode>${value}</Mode>` :
'<Mode>GOVERNANCE</Mode>';
const retainDate = key === 'RetainDate' ?
`<RetainUntilDate>${value}</RetainUntilDate>` :
`<RetainUntilDate>${date}</RetainUntilDate>`;
const retention = key === 'Retention' ?
`<Retention>${value}</Retention>` :
`<Retention>${mode}${retainDate}</Retention>`;
return retention;
}
const expectedRetention = {
mode: 'GOVERNANCE',
date: passDate.toISOString(),
};
const expectedXml =
'<Retention xmlns="http://s3.amazonaws.com/doc/2006-03-01/">' +
'<Mode>GOVERNANCE</Mode>' +
`<RetainUntilDate>${passDate.toISOString()}</RetainUntilDate>` +
'</Retention>';
const failTests = [
{
name: 'should fail with empty retention',
params: { key: 'Retention', value: '' },
error: 'MalformedXML',
errMessage: 'request xml does not contain Retention',
},
{
name: 'should fail with empty mode',
params: { key: 'Mode', value: '' },
error: 'MalformedXML',
errMessage: 'request xml does not contain Mode',
},
{
name: 'should fail with empty retain until date',
params: { key: 'RetainDate', value: '' },
error: 'MalformedXML',
errMessage: 'request xml does not contain RetainUntilDate',
},
{
name: 'should fail with invalid mode',
params: { key: 'Mode', value: 'GOVERPLIANCE' },
error: 'MalformedXML',
errMessage: 'Mode request xml must be one of "GOVERNANCE", ' +
'"COMPLIANCE"',
},
{
name: 'should fail with retain until date in UTC format',
params: { key: 'RetainDate', value: `${date.toUTCString()}` },
error: 'InvalidRequest',
errMessage: 'RetainUntilDate timestamp must be ISO-8601 format',
},
{
name: 'should fail with retain until date in GMT format',
params: { key: 'RetainDate', value: `${date.toString()}` },
error: 'InvalidRequest',
errMessage: 'RetainUntilDate timestamp must be ISO-8601 format',
},
{
name: 'should fail with retain until date in past',
params: { key: 'RetainDate', value: failDate.toISOString() },
error: 'InvalidRequest',
errMessage: 'RetainUntilDate must be in the future',
},
];
describe('object Retention validation', () => {
failTests.forEach(t => {
it(t.name, done => {
parseRetentionXml(buildXml(t.params.key, t.params.value), log,
err => {
assert(err[t.error]);
assert.strictEqual(err.description, t.errMessage);
done();
});
});
});
it('should pass with valid retention', done => {
parseRetentionXml(buildXml('RetainDate', passDate.toISOString()), log,
(err, result) => {
assert.ifError(err);
assert.deepStrictEqual(result, expectedRetention);
done();
});
});
});
describe('object Retention xml', () => {
it('should return empty string if no retention date', done => {
const xml = convertToXml('GOVERNANCE', '');
assert.equal(xml, '');
done();
});
it('should return empty string if no retention mode', done => {
const xml = convertToXml('', passDate.toISOString());
assert.equal(xml, '');
done();
});
it('should return xml string', done => {
const xml = convertToXml('GOVERNANCE', passDate.toISOString());
assert.strictEqual(xml, expectedXml);
done();
});
});
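The conversion asserted above requires both a mode and an ISO-8601 retain-until date, returning an empty string when either is missing. A minimal sketch (hypothetical `sketchRetentionToXml`, not the s3middleware implementation):

```javascript
// Hypothetical sketch of the retention serialization asserted above:
// both inputs are required, otherwise the result is ''.
function sketchRetentionToXml(mode, retainUntilDate) {
    if (!mode || !retainUntilDate) {
        return '';
    }
    return '<Retention xmlns="http://s3.amazonaws.com/doc/2006-03-01/">' +
        `<Mode>${mode}</Mode>` +
        `<RetainUntilDate>${retainUntilDate}</RetainUntilDate>` +
        '</Retention>';
}
```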


@ -28,14 +28,63 @@ describe('test generating versionIds', () => {
// nodejs 10 no longer returns error for non-hex string versionIds
it.skip('should return error decoding non-hex string versionIds', () => {
const encoded = vids.map(vid => VID.hexEncode(vid));
const decoded = encoded.map(vid => VID.hexDecode(`${vid}foo`));
decoded.forEach(result => assert(result instanceof Error));
});
it('should encode and decode versionIds', () => {
const encoded = vids.map(vid => VID.hexEncode(vid));
const decoded = encoded.map(vid => VID.hexDecode(vid));
assert.strictEqual(vids.length, count);
assert.deepStrictEqual(vids, decoded);
});
it('simple base62 version test', () => {
const vid = '98376906954349999999RG001 145.20.5';
const encoded = VID.base62Encode(vid);
assert.strictEqual(encoded, 'aJLWKz4Ko9IjBBgXKj5KQT2G9UHv0g7P');
const decoded = VID.base62Decode(encoded);
assert.strictEqual(vid, decoded);
});
it('base62 version test with smaller part1 number', () => {
const vid = '00000000054349999999RG001 145.20.5';
const encoded = VID.base62Encode(vid);
const decoded = VID.base62Decode(encoded);
assert.strictEqual(vid, decoded);
});
it('base62 version test with smaller part2 number', () => {
const vid = '98376906950000099999RG001 145.20.5';
const encoded = VID.base62Encode(vid);
const decoded = VID.base62Decode(encoded);
assert.strictEqual(vid, decoded);
});
it('base62 version test with smaller part3', () => {
const vid = '98376906950000099999R1 145.20.5';
const encoded = VID.base62Encode(vid);
const decoded = VID.base62Decode(encoded);
assert.strictEqual(vid, decoded);
});
it('base62 version test with smaller part3 - 2', () => {
const vid = '98376906950000099999R1x';
const encoded = VID.base62Encode(vid);
const decoded = VID.base62Decode(encoded);
assert.strictEqual(vid, decoded);
});
it('error case: when invalid base62 key part 3 has invalid base62 character', () => {
const invalidBase62VersionId = 'aJLWKz4Ko9IjBBgXKj5KQT.G9UHv0g7P';
const decoded = VID.base62Decode(invalidBase62VersionId);
assert(decoded instanceof Error);
});
it('should encode and decode base62 versionIds', () => {
const encoded = vids.map(vid => VID.base62Encode(vid));
const decoded = encoded.map(vid => VID.base62Decode(vid));
assert.strictEqual(vids.length, count);
assert.deepStrictEqual(vids, decoded);
});


@ -0,0 +1,372 @@
'use strict'; // eslint-disable-line strict
/* eslint new-cap: "off" */
/* eslint dot-notation: "off" */
const assert = require('assert');
const crypto = require('crypto');
const uuidv4 = require('uuid/v4');
const {
EchoChannel,
logger,
} = require('./ersatz.js');
const expectedObjectType = 'Symmetric Key';
const expectedAlgorithm = 'AES';
const expectedLength = 256;
const expectedBlockCipherMode = 'CBC';
const expectedPaddingMethod = 'PKCS5';
const expectedIV = Buffer.alloc(16).fill(0);
const versionMajor = 1;
const versionMinor = 4;
const vendorIdentification = 'Scality Loopback KMIP Server';
const serverExtensions = [
{
name: 'Security Level',
tag: 0x541976,
type: 7,
value: 'Gangsta Grade',
},
{
name: 'Prefered Algorithm',
tag: 0x542008,
type: 7,
value: 'Kevin Bacon',
},
{
name: 'Yo Dawg',
tag: 0x542011,
type: 7,
value: 'I heard you like kmip, so I put a server in your client ' +
'so you can do both ends of the conversation while you are ' +
'talking about server side encryption',
},
];
class DummyServerTransport {
registerHandshakeFunction() {
throw new Error('DummyServerTransport::registerHandshakeFunction: ' +
'Client side operations not implemented');
}
send() {
throw new Error('DummyServerTransport::send: ' +
'Client side operations not implemented');
}
}
class LoopbackServerChannel extends EchoChannel {
constructor(KMIPClass, Codec, options) {
super();
this.options = options || {
kmip: {
codec: {},
transport: {},
},
};
this.KMIP = KMIPClass;
this.kmip = new KMIPClass(Codec, DummyServerTransport,
this.options);
serverExtensions.forEach(extension => {
this.kmip.mapExtension(extension.name, extension.tag);
});
this.managedKeys = {};
}
write(data) {
const request = this.kmip._decodeMessage(logger, data);
const requestOperation = request.lookup(
'Request Message/Batch Item/Operation')[0];
this.routeRequest(
requestOperation, request, (err, responsePayload) => {
const uniqueBatchItemID = request.lookup(
'Request Message/Batch Item/Unique Batch Item ID')[0];
const requestProtocolVersionMinor = request.lookup(
'Request Message/Request Header/Protocol Version/' +
'Protocol Version Minor')[0];
const requestProtocolVersionMajor = request.lookup(
'Request Message/Request Header/Protocol Version/' +
'Protocol Version Major')[0];
let result;
if (err) {
logger.error('Request processing failed', { error: err });
result = err;
} else {
result = [
this.KMIP.Enumeration('Result Status', 'Success'),
this.KMIP.Structure('Response Payload',
responsePayload),
];
}
const response = this.KMIP.Message([
this.KMIP.Structure('Response Message', [
this.KMIP.Structure('Response Header', [
this.KMIP.Structure('Protocol Version', [
this.KMIP.Integer('Protocol Version Major',
requestProtocolVersionMajor),
this.KMIP.Integer('Protocol Version Minor',
requestProtocolVersionMinor),
]),
this.KMIP.DateTime('Time Stamp', new Date),
this.KMIP.Integer('Batch Count', 1),
]),
this.KMIP.Structure('Batch Item', [
this.KMIP.Enumeration('Operation',
requestOperation),
this.KMIP.ByteString('Unique Batch Item ID',
uniqueBatchItemID),
...result,
]),
]),
]);
super.write(this.kmip._encodeMessage(response));
});
return this;
}
errorResponse(reason, message) {
return [
this.KMIP.Enumeration('Result Status', 'Operation Failed'),
this.KMIP.Enumeration('Result Reason', reason),
this.KMIP.Enumeration('Result Message', message),
];
}
routeRequest(operation, request, cb) {
switch (operation) {
case 'Query': return this.routeQuery(request, cb);
case 'Discover Versions':
return this.routeDiscoverVersions(request, cb);
case 'Create': return this.routeCreate(request, cb);
case 'Activate': return this.routeActivate(request, cb);
case 'Encrypt': return this.routeEncrypt(request, cb);
case 'Decrypt': return this.routeDecrypt(request, cb);
case 'Revoke': return this.routeRevoke(request, cb);
case 'Destroy': return this.routeDestroy(request, cb);
default: return cb(new Error(`Unknown Operation: ${operation}`));
}
}
routeQuery(request, cb) {
const queryFunctions = request.lookup(
'Request Message/Batch Item/Request Payload/Query Function');
const response = [];
if (queryFunctions.includes('Query Operations')) {
response.push(
this.KMIP.Enumeration('Operation', 'Query'),
this.KMIP.Enumeration('Operation', 'Discover Versions'),
this.KMIP.Enumeration('Operation', 'Create'),
this.KMIP.Enumeration('Operation', 'Activate'),
this.KMIP.Enumeration('Operation', 'Encrypt'),
this.KMIP.Enumeration('Operation', 'Decrypt'),
this.KMIP.Enumeration('Operation', 'Revoke'),
this.KMIP.Enumeration('Operation', 'Destroy'));
}
if (queryFunctions.includes('Query Objects')) {
response.push(
this.KMIP.Enumeration('Object Type', 'Symmetric Key'));
}
if (queryFunctions.includes('Query Server Information')) {
response.push(
this.KMIP.TextString('Vendor Identification',
vendorIdentification),
this.KMIP.Structure('Server Information',
serverExtensions.map(extension =>
this.KMIP.TextString(
extension.name,
extension.value)
)));
}
if (queryFunctions.includes('Query Extension Map')) {
serverExtensions.forEach(extension => {
response.push(
this.KMIP.Structure('Extension Information', [
this.KMIP.TextString('Extension Name', extension.name),
this.KMIP.Integer('Extension Tag', extension.tag),
/* 7 is KMIP TextString, not really used anyway in
* this implementation, it could be anything
* without changing the behavior of the client code. */
this.KMIP.Integer('Extension Type', 7),
]));
});
}
return cb(null, response);
}
routeDiscoverVersions(request, cb) {
const response = [
this.KMIP.Structure('Protocol Version', [
this.KMIP.Integer('Protocol Version Major', versionMajor),
this.KMIP.Integer('Protocol Version Minor', versionMinor),
]),
];
return cb(null, response);
}
routeCreate(request, cb) {
let cryptographicAlgorithm;
let cryptographicLength;
let cryptographicUsageMask;
let activationDate;
const attributes = request.lookup(
'Request Message/Batch Item/Request Payload/Template-Attribute')[0];
attributes.forEach(item => {
const attribute = item['Attribute'];
const attributeValue = attribute.value[1]['Attribute Value'];
const diversion = attributeValue.diversion;
const value = attributeValue.value;
switch (diversion) {
case 'Cryptographic Algorithm':
assert(!cryptographicAlgorithm);
cryptographicAlgorithm = value;
break;
case 'Cryptographic Length':
assert(!cryptographicLength);
cryptographicLength = value;
break;
case 'Cryptographic Usage Mask':
assert(!cryptographicUsageMask);
cryptographicUsageMask = value;
break;
case 'Activation Date':
assert(!activationDate);
activationDate = value;
break;
default:
}
});
const decodedUsageMask =
this.kmip.decodeMask('Cryptographic Usage Mask',
cryptographicUsageMask);
assert(cryptographicAlgorithm === expectedAlgorithm);
assert(cryptographicLength === expectedLength);
assert(decodedUsageMask.includes('Encrypt'));
assert(decodedUsageMask.includes('Decrypt'));
const key = Buffer.from(crypto.randomBytes(cryptographicLength / 8));
const keyIdentifier = uuidv4();
this.managedKeys[keyIdentifier] = {
key,
activationDate,
};
const response = [
this.KMIP.Enumeration('Object Type', expectedObjectType),
this.KMIP.TextString('Unique Identifier', keyIdentifier),
];
return cb(null, response);
}
routeActivate(request, cb) {
const keyIdentifier = (
request.lookup(
'Request Message/Batch Item/Request Payload/' +
'Unique Identifier') || [undefined])[0];
this.managedKeys[keyIdentifier].activationDate =
new Date;
const response = [
this.KMIP.TextString('Unique Identifier', keyIdentifier),
];
return cb(null, response);
}
_getIvCounterNonce(request) {
/* Having / in the path is not a good idea for the server side.
* Because of this, Message::lookup() cannot be directly used to
* extract the IV, hence this function */
const requestPayload = (
request.lookup(
'Request Message/Batch Item/Request Payload')
|| [undefined])[0];
let ivCounterNonce;
requestPayload.forEach(attr => {
const ivCounterNonceAttr = attr['IV/Counter/Nonce'];
if (ivCounterNonceAttr) {
ivCounterNonce = ivCounterNonceAttr.value;
}
});
return ivCounterNonce;
}
_transform(cipherFunc, request, cb) {
const keyIdentifier = (
request.lookup(
'Request Message/Batch Item/Request Payload/' +
'Unique Identifier') || [undefined])[0];
const blockCipherMode = (
request.lookup(
'Request Message/Batch Item/Request Payload/' +
'Cryptographic Parameters/Block Cipher Mode')
|| [undefined])[0];
const paddingMethod = (
request.lookup(
'Request Message/Batch Item/Request Payload/' +
'Cryptographic Parameters/Padding Method')
|| [undefined])[0];
const cryptographicAlgorithm = (
request.lookup(
'Request Message/Batch Item/Request Payload/' +
'Cryptographic Parameters/Cryptographic Algorithm')
|| [undefined])[0];
const ivCounterNonce = this._getIvCounterNonce(request);
const data = (
request.lookup(
'Request Message/Batch Item/Request Payload/' +
'Data')
|| [undefined])[0];
assert(blockCipherMode === expectedBlockCipherMode);
assert(paddingMethod === expectedPaddingMethod);
assert(cryptographicAlgorithm === expectedAlgorithm);
assert(expectedIV.compare(ivCounterNonce) === 0);
const keySpec = this.managedKeys[keyIdentifier];
const now = new Date;
assert(keySpec.activationDate && keySpec.activationDate <= now);
const cipher = cipherFunc('aes-256-cbc', keySpec.key, ivCounterNonce);
let cipheredData = cipher.update(data);
const final = cipher.final();
if (final.length !== 0) {
cipheredData = Buffer.concat([cipheredData, final]);
}
const response = [
this.KMIP.TextString('Unique Identifier', keyIdentifier),
this.KMIP.ByteString('Data', cipheredData),
];
return cb(null, response);
}
routeEncrypt(request, cb) {
return this._transform(crypto.createCipheriv, request, cb);
}
routeDecrypt(request, cb) {
return this._transform(crypto.createDecipheriv, request, cb);
}
routeRevoke(request, cb) {
const keyIdentifier = (
request.lookup(
'Request Message/Batch Item/Request Payload/' +
'Unique Identifier') || [undefined])[0];
this.managedKeys[keyIdentifier].activationDate = null;
const response = [
this.KMIP.TextString('Unique Identifier', keyIdentifier),
];
return cb(null, response);
}
routeDestroy(request, cb) {
const keyIdentifier = (
request.lookup(
'Request Message/Batch Item/Request Payload/' +
'Unique Identifier') || [undefined])[0];
assert(!this.managedKeys[keyIdentifier].activationDate);
this.managedKeys[keyIdentifier] = null;
const response = [
this.KMIP.TextString('Unique Identifier', keyIdentifier),
];
return cb(null, response);
}
}
module.exports = LoopbackServerChannel;


@ -0,0 +1,127 @@
'use strict'; // eslint-disable-line strict
module.exports = [
/* Invalid type */
Buffer.from('2100000000000000', 'hex'),
Buffer.from('2100000b00000000', 'hex'),
/* Structure */
// too short
Buffer.from('42', 'hex'),
Buffer.from('4200', 'hex'),
Buffer.from('420078', 'hex'),
Buffer.from('42007801', 'hex'),
Buffer.from('4200780100', 'hex'),
Buffer.from('420078010000', 'hex'),
Buffer.from('4200780100000001', 'hex'),
Buffer.from('420078010000000100', 'hex'),
Buffer.from('4200780100000008', 'hex'),
Buffer.from('420078010000000800', 'hex'),
Buffer.from('42007801000000080000', 'hex'),
Buffer.from('4200780100000008000000', 'hex'),
Buffer.from('420078010000000800000000', 'hex'),
Buffer.from('4200780100000010', 'hex'),
/* Integer */
// too short
Buffer.from('4200780200000004', 'hex'),
Buffer.from('420078020000000400', 'hex'),
Buffer.from('42007802000000040000', 'hex'),
Buffer.from('4200780200000004000000', 'hex'),
// invalid length for the type
Buffer.from('42007802000000080000000000000000', 'hex'),
Buffer.from('42007802000000020000000000000000', 'hex'),
Buffer.from('42007802000000000000000000000000', 'hex'),
/* Long Integer */
// too short
Buffer.from('4200780300000008', 'hex'),
Buffer.from('420078030000000810', 'hex'),
Buffer.from('42007803000000081000', 'hex'),
Buffer.from('4200780300000008100000', 'hex'),
Buffer.from('420078030000000810000000', 'hex'),
Buffer.from('42007803000000081000000000', 'hex'),
Buffer.from('4200780300000008100000000000', 'hex'),
Buffer.from('420078030000000810000000000000', 'hex'),
// 53bit overflow
Buffer.from('42007803000000081000000000000000', 'hex'),
Buffer.from('4200780300000008ffffffffffffffff', 'hex'),
// invalid length for the type
Buffer.from('420078030000000400000001', 'hex'),
Buffer.from('42007803000000100000000000000000100000000000000000', 'hex'),
/* Big Integer */
// too short
Buffer.from('4200780400000001', 'hex'),
Buffer.from('420078040000000200', 'hex'),
/* Enumeration */
// too short
Buffer.from('4200740500000004', 'hex'),
Buffer.from('4200740500000004000000', 'hex'),
// invalid length for the type
Buffer.from('42007405000000020000', 'hex'),
Buffer.from('4200740500000006000000000000', 'hex'),
// non existing tag and enum value with invalid length
Buffer.from('45007405000000020000', 'hex'),
Buffer.from('4500740500000006000000000000', 'hex'),
/* Boolean */
// too short
Buffer.from('4200740600000008', 'hex'),
Buffer.from('420074060000000800', 'hex'),
Buffer.from('42007406000000080000', 'hex'),
Buffer.from('4200740600000008000000', 'hex'),
Buffer.from('420074060000000800000000', 'hex'),
Buffer.from('42007406000000080000000000', 'hex'),
Buffer.from('4200740600000008000000000000', 'hex'),
// invalid length
Buffer.from('420074060000000400000000', 'hex'),
Buffer.from('420074060000001000000000000000000000000000000000', 'hex'),
/* TextString */
// too short
Buffer.from('4200740700000008', 'hex'),
Buffer.from('420074070000000800', 'hex'),
Buffer.from('42007407000000080000', 'hex'),
Buffer.from('4200740700000008000000', 'hex'),
Buffer.from('420074070000000800000000', 'hex'),
Buffer.from('42007407000000080000000000', 'hex'),
Buffer.from('4200740700000008000000000000', 'hex'),
/* ByteString */
// too short
Buffer.from('4200740800000008', 'hex'),
Buffer.from('420074080000000800', 'hex'),
Buffer.from('42007408000000080000', 'hex'),
Buffer.from('4200740800000008000000', 'hex'),
Buffer.from('420074080000000800000000', 'hex'),
Buffer.from('42007408000000080000000000', 'hex'),
Buffer.from('4200740800000008000000000000', 'hex'),
/* Date-Time */
// too short
Buffer.from('4200740900000008', 'hex'),
Buffer.from('420074090000000800', 'hex'),
Buffer.from('42007409000000080000', 'hex'),
Buffer.from('4200740900000008000000', 'hex'),
Buffer.from('420074090000000800000000', 'hex'),
Buffer.from('42007409000000080000000000', 'hex'),
Buffer.from('4200740900000008000000000000', 'hex'),
// invalid length
Buffer.from('420074090000000400000000', 'hex'),
Buffer.from('420074090000001000000000000000000000000000000000', 'hex'),
/* Interval */
// too short
Buffer.from('4200780a00000004', 'hex'),
Buffer.from('4200780a0000000400', 'hex'),
Buffer.from('4200780a000000040000', 'hex'),
Buffer.from('4200780a00000004000000', 'hex'),
// invalid length for the type
Buffer.from('4200780a000000080000000000000000', 'hex'),
Buffer.from('4200780a000000020000000000000000', 'hex'),
Buffer.from('4200780a000000000000000000000000', 'hex'),
];
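The malformed buffers above all violate the same fixed layout: every TTLV item begins with a 3-byte tag, a 1-byte type and a 4-byte big-endian value length, and the value is then padded to an 8-byte boundary. A minimal sketch of pulling that header apart (`readTTLVHeader` is an illustrative helper, not part of the codec under test):

```javascript
// Sketch only: 'readTTLVHeader' is a hypothetical helper, not the codec's
// API. A TTLV item starts with a 3-byte tag, a 1-byte type and a 4-byte
// big-endian value length; the value itself is padded to a multiple of 8.
function readTTLVHeader(buf, offset = 0) {
    return {
        tag: buf.readUIntBE(offset, 3),
        type: buf.readUInt8(offset + 3),
        length: buf.readUInt32BE(offset + 4),
    };
}

// One of the well-formed fixtures: tag 0x42000f, type 0x08, value length 4.
const header = readTTLVHeader(
    Buffer.from('42000f08000000040000000100000000', 'hex'));
console.log(header.tag.toString(16), header.type, header.length);
// → 42000f 8 4
```

The "too short" fixtures stop before the value (or even the length) is complete, while the "invalid length for the type" ones declare a length that contradicts the fixed-size types.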

tests/utils/kmip/ersatz.js (new file, 102 lines)

@ -0,0 +1,102 @@
/* eslint new-cap: "off" */

const { EventEmitter } = require('events');

const logger = {
    info: () => {},
    debug: () => {},
    warn: () => {},
    error: () => {},
};

/* Fake tls and socket objects (duck-typed) */
class EchoChannel extends EventEmitter {
    constructor() {
        super();
        this.clogged = false;
    }

    /* tls object member substitutes */
    connect(port, options, cb) {
        process.nextTick(cb);
        return this;
    }

    /* socket object member substitutes */
    cork() {
        return this;
    }

    uncork() {
        return this;
    }

    write(data) {
        if (!this.clogged) {
            return process.nextTick(() => this.emit('data', data));
        }
        return this;
    }

    end() {
        return this.emit('end');
    }

    /* Instrumentation member functions */
    clog() {
        this.clogged = true;
        return this;
    }
}

class MirrorChannel extends EchoChannel {
    constructor(KMIPClass, Codec) {
        super();
        this.codec = new Codec({});
        this.KMIP = KMIPClass;
    }

    write(data) {
        const request = this.codec.decode(logger, data);
        const uniqueBatchItemID = request.lookup(
            'Request Message/Batch Item/Unique Batch Item ID')[0];
        const requestPayload = request.lookup(
            'Request Message/Batch Item/Request Payload')[0];
        const requestProtocolVersionMinor = request.lookup(
            'Request Message/Request Header/Protocol Version/' +
            'Protocol Version Minor')[0];
        const requestProtocolVersionMajor = request.lookup(
            'Request Message/Request Header/Protocol Version/' +
            'Protocol Version Major')[0];
        const requestOperation = request.lookup(
            'Request Message/Batch Item/Operation')[0];
        const response = this.KMIP.Message([
            this.KMIP.Structure('Response Message', [
                this.KMIP.Structure('Response Header', [
                    this.KMIP.Structure('Protocol Version', [
                        this.KMIP.Integer('Protocol Version Major',
                            requestProtocolVersionMajor),
                        this.KMIP.Integer('Protocol Version Minor',
                            requestProtocolVersionMinor),
                    ]),
                    this.KMIP.DateTime('Time Stamp', new Date()),
                    this.KMIP.Integer('Batch Count', 1),
                ]),
                this.KMIP.Structure('Batch Item', [
                    this.KMIP.Enumeration('Operation', requestOperation),
                    this.KMIP.ByteString('Unique Batch Item ID',
                        uniqueBatchItemID),
                    this.KMIP.Enumeration('Result Status', 'Success'),
                    this.KMIP.Structure('Response Payload', requestPayload),
                ]),
            ]),
        ]);
        super.write(this.codec.encode(response));
        return this;
    }
}

module.exports = { logger, EchoChannel, MirrorChannel };


@ -0,0 +1,82 @@
'use strict'; // eslint-disable-line strict
/* eslint new-cap: "off" */

const KMIP = require('../../../lib/network/kmip');

module.exports = [
    {
        operation: 'Query',
        payload: () => [
            KMIP.Enumeration('Query Function', 'Query Operations'),
            KMIP.Enumeration('Query Function', 'Query Objects'),
        ],
    },
    {
        operation: 'Query',
        payload: () => [
            KMIP.Enumeration('Query Function', 'Query Operations'),
            KMIP.Enumeration('Query Function', 'Query Objects'),
            KMIP.Enumeration('Query Function',
                'Query Server Information'),
            KMIP.Enumeration('Query Function', 'Query Profiles'),
            KMIP.Enumeration('Query Function', 'Query Capabilities'),
            KMIP.Enumeration('Query Function',
                'Query Application Namespaces'),
            KMIP.Enumeration('Query Function', 'Query Extension List'),
            KMIP.Enumeration('Query Function', 'Query Extension Map'),
            KMIP.Enumeration('Query Function',
                'Query Attestation Types'),
            KMIP.Enumeration('Query Function', 'Query RNGs'),
            KMIP.Enumeration('Query Function', 'Query Validations'),
            KMIP.Enumeration('Query Function',
                'Query Client Registration Methods'),
        ],
    },
    {
        operation: 'Discover Versions',
        payload: () => [
            KMIP.Structure('Protocol Version', [
                KMIP.Integer('Protocol Version Major', 2),
                KMIP.Integer('Protocol Version Minor', 0),
            ]),
            KMIP.Structure('Protocol Version', [
                KMIP.Integer('Protocol Version Major', 1),
                KMIP.Integer('Protocol Version Minor', 4),
            ]),
            KMIP.Structure('Protocol Version', [
                KMIP.Integer('Protocol Version Major', 1),
                KMIP.Integer('Protocol Version Minor', 3),
            ]),
            KMIP.Structure('Protocol Version', [
                KMIP.Integer('Protocol Version Major', 1),
                KMIP.Integer('Protocol Version Minor', 2),
            ]),
            KMIP.Structure('Protocol Version', [
                KMIP.Integer('Protocol Version Major', 1),
                KMIP.Integer('Protocol Version Minor', 1),
            ]),
            KMIP.Structure('Protocol Version', [
                KMIP.Integer('Protocol Version Major', 1),
                KMIP.Integer('Protocol Version Minor', 0),
            ]),
        ],
    },
    {
        operation: 'Create',
        payload: kmip => [
            KMIP.Enumeration('Object Type', 'Symmetric Key'),
            KMIP.Structure('Template-Attribute', [
                KMIP.Attribute('TextString', 'x-Name', 's3-thekey'),
                KMIP.Attribute('Enumeration', 'Cryptographic Algorithm',
                    'AES'),
                KMIP.Attribute('Integer', 'Cryptographic Length', 256),
                KMIP.Attribute('Integer', 'Cryptographic Usage Mask',
                    kmip.encodeMask(
                        'Cryptographic Usage Mask',
                        ['Encrypt', 'Decrypt'])),
                KMIP.Attribute('Date-Time', 'Activation Date',
                    new Date()),
            ]),
        ],
    },
];
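The Create fixture relies on `kmip.encodeMask` to turn attribute names into an integer bit mask. The real implementation resolves names through the codec's enumeration tables; as a standalone sketch, assuming the KMIP 1.x 'Cryptographic Usage Mask' bit assignments (Sign 0x01, Verify 0x02, Encrypt 0x04, Decrypt 0x08) and a hypothetical helper name:

```javascript
// Standalone sketch of a usage-mask encoder. The bit values below are the
// KMIP 1.x 'Cryptographic Usage Mask' assignments, and 'encodeUsageMask' is
// a hypothetical stand-in for kmip.encodeMask.
const USAGE_MASK_BITS = {
    Sign: 0x01,
    Verify: 0x02,
    Encrypt: 0x04,
    Decrypt: 0x08,
};

const encodeUsageMask = names =>
    names.reduce((mask, name) => mask | USAGE_MASK_BITS[name], 0);

console.log(encodeUsageMask(['Encrypt', 'Decrypt'])); // → 12 (0x0c)
```

OR-ing the bits is what lets a single Integer attribute carry several permissions at once, which is why the fixture passes an array of names.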


@ -0,0 +1,120 @@
'use strict'; // eslint-disable-line strict
/* eslint new-cap: "off" */

const TTLVCodec = require('../../../lib/network/kmip/codec/ttlv.js');
const KMIP = require('../../../lib/network/kmip');

const kmip = new KMIP(TTLVCodec,
    class DummyTransport {},
    { kmip: {} }, () => {});

module.exports = [
    KMIP.Message([
        KMIP.Structure('Request Message', [
            KMIP.Structure('Request Header', [
                KMIP.Structure('Protocol Version', [
                    KMIP.Integer('Protocol Version Major', 1),
                    KMIP.Integer('Protocol Version Minor', 3),
                ]),
                KMIP.Integer('Maximum Response Size', 256),
                KMIP.Integer('Batch Count', 1),
            ]),
            KMIP.Structure('Batch Item', [
                KMIP.Enumeration('Operation', 'Query'),
                KMIP.Structure('Request Payload', [
                    KMIP.Enumeration('Query Function', 'Query Operations'),
                    KMIP.Enumeration('Query Function', 'Query Objects'),
                ]),
            ]),
        ]),
    ]),
    KMIP.Message([
        KMIP.Structure('Request Message', [
            KMIP.Structure('Request Header', [
                KMIP.Structure('Protocol Version', [
                    KMIP.Integer('Protocol Version Major', 1),
                    KMIP.Integer('Protocol Version Minor', 2),
                ]),
                KMIP.Integer('Maximum Response Size', 2048),
                KMIP.Boolean('Asynchronous Indicator', false),
                KMIP.Integer('Batch Count', 3),
                KMIP.ByteString('Asynchronous Correlation Value',
                    Buffer.from('Arrggg...', 'utf8')),
            ]),
            KMIP.Structure('Batch Item', [
                KMIP.Enumeration('Operation', 'Query'),
                KMIP.ByteString('Unique Batch Item ID',
                    Buffer.from('foo', 'utf8')),
                KMIP.Structure('Request Payload', [
                    KMIP.Enumeration('Query Function', 'Query Operations'),
                    KMIP.Enumeration('Query Function', 'Query Objects'),
                    KMIP.Enumeration('Query Function',
                        'Query Server Information'),
                    KMIP.Enumeration('Query Function', 'Query Profiles'),
                    KMIP.Enumeration('Query Function', 'Query Capabilities'),
                    KMIP.Enumeration('Query Function',
                        'Query Application Namespaces'),
                    KMIP.Enumeration('Query Function', 'Query Extension List'),
                    KMIP.Enumeration('Query Function', 'Query Extension Map'),
                    KMIP.Enumeration('Query Function',
                        'Query Attestation Types'),
                    KMIP.Enumeration('Query Function', 'Query RNGs'),
                    KMIP.Enumeration('Query Function', 'Query Validations'),
                    KMIP.Enumeration('Query Function',
                        'Query Client Registration Methods'),
                ]),
            ]),
            KMIP.Structure('Batch Item', [
                KMIP.Enumeration('Operation', 'Discover Versions'),
                KMIP.ByteString('Unique Batch Item ID',
                    Buffer.from('bar', 'utf8')),
                KMIP.Structure('Request Payload', [
                    KMIP.Structure('Protocol Version', [
                        KMIP.Integer('Protocol Version Major', 2),
                        KMIP.Integer('Protocol Version Minor', 0),
                    ]),
                    KMIP.Structure('Protocol Version', [
                        KMIP.Integer('Protocol Version Major', 1),
                        KMIP.Integer('Protocol Version Minor', 4),
                    ]),
                    KMIP.Structure('Protocol Version', [
                        KMIP.Integer('Protocol Version Major', 1),
                        KMIP.Integer('Protocol Version Minor', 3),
                    ]),
                    KMIP.Structure('Protocol Version', [
                        KMIP.Integer('Protocol Version Major', 1),
                        KMIP.Integer('Protocol Version Minor', 2),
                    ]),
                    KMIP.Structure('Protocol Version', [
                        KMIP.Integer('Protocol Version Major', 1),
                        KMIP.Integer('Protocol Version Minor', 1),
                    ]),
                    KMIP.Structure('Protocol Version', [
                        KMIP.Integer('Protocol Version Major', 1),
                        KMIP.Integer('Protocol Version Minor', 0),
                    ]),
                ]),
            ]),
            KMIP.Structure('Batch Item', [
                KMIP.Enumeration('Operation', 'Create'),
                KMIP.ByteString('Unique Batch Item ID',
                    Buffer.from('baz', 'utf8')),
                KMIP.Structure('Request Payload', [
                    KMIP.Enumeration('Object Type', 'Symmetric Key'),
                    KMIP.Structure('Template-Attribute', [
                        KMIP.Attribute('TextString', 'x-Name', 's3-thekey'),
                        KMIP.Attribute('Enumeration', 'Cryptographic Algorithm',
                            'AES'),
                        KMIP.Attribute('Integer', 'Cryptographic Length', 256),
                        KMIP.Attribute('Integer', 'Cryptographic Usage Mask',
                            kmip.encodeMask(
                                'Cryptographic Usage Mask',
                                ['Encrypt', 'Decrypt'])),
                        KMIP.Attribute('Date-Time', 'Activation Date',
                            new Date()),
                    ]),
                ]),
            ]),
        ]),
    ]),
];


@ -0,0 +1,247 @@
'use strict'; // eslint-disable-line strict
module.exports = [
{
request: Buffer.from('42007801000000904200770100000048' +
'420069010000002042006a0200000004' +
'000000010000000042006b0200000004' +
'00000003000000004200500200000004' +
'000001000000000042000d0200000004' +
'000000010000000042000f0100000038' +
'42005c05000000040000001800000000' +
'42007901000000204200740500000004' +
'00000001000000004200740500000004' +
'0000000200000000', 'hex'),
response: Buffer.from('42007b01000000a042007a0100000048' +
'420069010000002042006a0200000004' +
'000000010000000042006b0200000004' +
'00000003000000004200920900000008' +
'00000000568a5be242000d0200000004' +
'000000010000000042000f0100000048' +
'42005c05000000040000001800000000' +
'42007f05000000040000000100000000' +
'42007e05000000040000000200000000' +
'42007d0700000009544f4f5f4c415247' +
'4500000000000000', 'hex'),
},
{
request: Buffer.from('42007801000000904200770100000048' +
'420069010000002042006a0200000004' +
'000000010000000042006b0200000004' +
'00000003000000004200500200000004' +
'000008000000000042000d0200000004' +
'000000010000000042000f0100000038' +
'42005c05000000040000001800000000' +
'42007901000000204200740500000004' +
'00000001000000004200740500000004' +
'0000000200000000', 'hex'),
response: Buffer.from('42007b010000038042007a0100000048' +
'420069010000002042006a0200000004' +
'000000010000000042006b0200000004' +
'00000003000000004200920900000008' +
'00000000568a5be242000d0200000004' +
'000000010000000042000f0100000328' +
'42005c05000000040000001800000000' +
'42007f05000000040000000000000000' +
'42007c010000030042005c0500000004' +
'000000180000000042005c0500000004' +
'000000080000000042005c0500000004' +
'000000140000000042005c0500000004' +
'0000000a0000000042005c0500000004' +
'000000010000000042005c0500000004' +
'000000030000000042005c0500000004' +
'0000000b0000000042005c0500000004' +
'0000000c0000000042005c0500000004' +
'0000000d0000000042005c0500000004' +
'0000000e0000000042005c0500000004' +
'0000000f0000000042005c0500000004' +
'000000120000000042005c0500000004' +
'000000130000000042005c0500000004' +
'0000001a0000000042005c0500000004' +
'000000190000000042005c0500000004' +
'000000090000000042005c0500000004' +
'000000110000000042005c0500000004' +
'000000020000000042005c0500000004' +
'000000040000000042005c0500000004' +
'000000150000000042005c0500000004' +
'000000160000000042005c0500000004' +
'000000100000000042005c0500000004' +
'0000001d0000000042005c0500000004' +
'000000060000000042005c0500000004' +
'000000070000000042005c0500000004' +
'0000001e0000000042005c0500000004' +
'0000001b0000000042005c0500000004' +
'0000001c0000000042005c0500000004' +
'000000250000000042005c0500000004' +
'000000260000000042005c0500000004' +
'0000001f0000000042005c0500000004' +
'000000200000000042005c0500000004' +
'000000210000000042005c0500000004' +
'000000220000000042005c0500000004' +
'000000230000000042005c0500000004' +
'000000240000000042005c0500000004' +
'000000270000000042005c0500000004' +
'000000280000000042005c0500000004' +
'00000029000000004200570500000004' +
'00000001000000004200570500000004' +
'00000002000000004200570500000004' +
'00000007000000004200570500000004' +
'00000003000000004200570500000004' +
'00000004000000004200570500000004' +
'00000006000000004200570500000004' +
'00000008000000004200570500000004' +
'00000005000000004200570500000004' +
'0000000900000000', 'hex'),
},
{
request: Buffer.from('42007801000000d04200770100000048' +
'420069010000002042006a0200000004' +
'000000010000000042006b0200000004' +
'00000003000000004200500200000004' +
'000008000000000042000d0200000004' +
'000000020000000042000f0100000048' +
'4200930800000003666f6f0000000000' +
'42005c05000000040000001800000000' +
'42007901000000204200740500000004' +
'00000001000000004200740500000004' +
'000000020000000042000f0100000028' +
'42005c05000000040000001800000000' +
'42007901000000104200740500000004' +
'0000000300000000', 'hex'),
response: Buffer.from('42007b010000028042007a0100000048' +
'420069010000002042006a0200000004' +
'000000010000000042006b0200000004' +
'00000003000000004200920900000008' +
'000000005c2d3df242000d0200000004' +
'000000020000000042000f0100000188' +
'42005c05000000040000001800000000' +
'4200930800000003666f6f0000000000' +
'42007f05000000040000000000000000' +
'42007c010000015042005c0500000004' +
'000000010000000042005c0500000004' +
'000000020000000042005c0500000004' +
'000000030000000042005c0500000004' +
'000000080000000042005c0500000004' +
'0000000a0000000042005c0500000004' +
'0000000b0000000042005c0500000004' +
'0000000c0000000042005c0500000004' +
'0000000d0000000042005c0500000004' +
'0000000e0000000042005c0500000004' +
'0000000f0000000042005c0500000004' +
'000000120000000042005c0500000004' +
'000000130000000042005c0500000004' +
'000000140000000042005c0500000004' +
'000000180000000042005c0500000004' +
'0000001e000000004200570500000004' +
'00000001000000004200570500000004' +
'00000002000000004200570500000004' +
'00000003000000004200570500000004' +
'00000004000000004200570500000004' +
'00000006000000004200570500000004' +
'000000070000000042000f0100000098' +
'42005c05000000040000001800000000' +
'42009308000000036261720000000000' +
'42007f05000000040000000000000000' +
'42007c010000006042009d070000000c' +
'4b4b4b4b4b4b4b4b4b4b4b4b00000000' +
'4200880100000040420088070000000d' +
'4b4b4b4b4b4b4b4b4b4b4b4b4b000000' +
'420088070000000c4b4b4b4b4b4b4b4b' +
'4b4b4b4b000000004200880700000005' +
'4b4b4b4b4b000000', 'hex'),
},
{
request: Buffer.from('', 'hex'),
response: Buffer.from('', 'hex'),
},
{
request: Buffer.from('4200780100000000', 'hex'),
response: Buffer.from('4200780100000000', 'hex'),
},
{
/* Valid message with unknown tag */
request: Buffer.from('56000002000000040000000100000000', 'hex'),
response: Buffer.from('56000006000000080000000000000001', 'hex'),
degenerated: true,
},
{
/* Valid message with unknown enum */
/* on a non-enumeration tag */
request: Buffer.from('42007805000000040000000100000000', 'hex'),
/* on an enumeration tag */
response: Buffer.from('42007405000000040000000000000000', 'hex'),
degenerated: true,
},
{
/* padding is missing at the end of the message */
request: Buffer.from('42000f080000000400000001', 'hex'),
response: Buffer.from('42000f0a0000000400000001', 'hex'),
degenerated: true,
},
{
request: Buffer.from('42000f08000000040000000100000000', 'hex'),
response: Buffer.from('42000f0a000000040000000100000000', 'hex'),
degenerated: false,
},
{
/* chained messages: shouldn't happen, but validates the structure loop */
request: Buffer.from('42000f08000000040000000100000000' +
'42000f08000000040000000100000000' +
'42000f08000000040000000100000000' +
'42000f08000000040000000100000000'
, 'hex'),
response: Buffer.from('42000f0a000000040000000100000000' +
'42000f0a000000040000000100000000' +
'42000f0a000000040000000100000000' +
'42000f0a000000040000000100000000'
, 'hex'),
degenerated: false,
},
{
/* zero-length payload */
request: Buffer.from('4200780400000000', 'hex'),
response: Buffer.from('4200780400000000', 'hex'),
degenerated: false,
},
{
/* no padding */
request: Buffer.from('420078040000000100', 'hex'),
response: Buffer.from('420078040000000100', 'hex'),
degenerated: true,
},
{
request: Buffer.from('42007406000000080000000000000001', 'hex'),
response: Buffer.from('42007406000000080000000000000000', 'hex'),
degenerated: false,
},
{
request: Buffer.from('42007406000000081111111111111111', 'hex'),
response: Buffer.from('42007406000000080101010101010101', 'hex'),
degenerated: true,
},
{
request: Buffer.from('4200740700000000', 'hex'),
response: Buffer.from('42007407000000010100000000000000', 'hex'),
degenerated: false,
},
{
request: Buffer.from('42007407000000010000000000000000', 'hex'),
response: Buffer.from('42007407000000020100000000000000', 'hex'),
degenerated: false,
},
{
request: Buffer.from('4200740800000000', 'hex'),
response: Buffer.from('42007408000000010100000000000000', 'hex'),
degenerated: false,
},
{
request: Buffer.from('42007408000000010000000000000000', 'hex'),
response: Buffer.from('42007408000000020100000000000000', 'hex'),
degenerated: false,
},
{
request: Buffer.from('42007409000000080000000000000001', 'hex'),
response: Buffer.from('42007409000000080000000000000000', 'hex'),
degenerated: false,
},
];
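A recurring theme in the `degenerated` fixtures above is value padding: a TTLV value must be padded out to an 8-byte boundary after the 8-byte tag/type/length header. A hedged sketch of that check (`isWellPadded` is illustrative, not the codec's API):

```javascript
// Illustrative check only: does a single TTLV item carry enough bytes to
// cover its declared value length plus the mandatory 8-byte padding?
function isWellPadded(buf) {
    if (buf.length < 8) {
        return false; // the tag/type/length header alone is 8 bytes
    }
    const valueLength = buf.readUInt32BE(4);
    const paddedLength = Math.ceil(valueLength / 8) * 8;
    return buf.length >= 8 + paddedLength;
}

// 1-byte value with the padding missing: degenerated
console.log(isWellPadded(Buffer.from('420078040000000100', 'hex'))); // → false
// zero-length payload: well-formed
console.log(isWellPadded(Buffer.from('4200780400000000', 'hex'))); // → true
```

This matches the fixtures above: the zero-length payload is accepted while the unpadded single-byte value is flagged as degenerated.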

yarn.lock (226 lines changed)

@ -34,20 +34,42 @@
dependencies:
"@hapi/hoek" "^8.3.0"
"@sinonjs/commons@^1.7.0":
version "1.8.0"
resolved "https://registry.yarnpkg.com/@sinonjs/commons/-/commons-1.8.0.tgz#c8d68821a854c555bba172f3b06959a0039b236d"
integrity sha512-wEj54PfsZ5jGSwMX68G8ZXFawcSglQSXqCftWX3ec8MDUzQdHgcKvw97awHbY0efQEL5iKUOAmmVtoYgmrSG4Q==
"@sinonjs/commons@^1", "@sinonjs/commons@^1.6.0", "@sinonjs/commons@^1.7.0", "@sinonjs/commons@^1.7.2":
version "1.7.2"
resolved "https://registry.yarnpkg.com/@sinonjs/commons/-/commons-1.7.2.tgz#505f55c74e0272b43f6c52d81946bed7058fc0e2"
integrity sha512-+DUO6pnp3udV/v2VfUWgaY5BIE1IfT7lLfeDzPVeMT1XKkaAp9LgSI9x5RtrFQoZ9Oi0PgXQQHPaoKu7dCjVxw==
dependencies:
type-detect "4.0.8"
"@sinonjs/fake-timers@^6.0.1":
"@sinonjs/fake-timers@^6.0.0", "@sinonjs/fake-timers@^6.0.1":
version "6.0.1"
resolved "https://registry.yarnpkg.com/@sinonjs/fake-timers/-/fake-timers-6.0.1.tgz#293674fccb3262ac782c7aadfdeca86b10c75c40"
integrity sha512-MZPUxrmFubI36XS1DI3qmI0YdN1gks62JtFZvxR67ljjSNCeK6U08Zx4msEWOXuofgqUt6zPHSi1H9fbjR/NRA==
dependencies:
"@sinonjs/commons" "^1.7.0"
"@sinonjs/formatio@^5.0.1":
version "5.0.1"
resolved "https://registry.yarnpkg.com/@sinonjs/formatio/-/formatio-5.0.1.tgz#f13e713cb3313b1ab965901b01b0828ea6b77089"
integrity sha512-KaiQ5pBf1MpS09MuA0kp6KBQt2JUOQycqVG1NZXvzeaXe5LGFqAKueIS0bw4w0P9r7KuBSVdUk5QjXsUdu2CxQ==
dependencies:
"@sinonjs/commons" "^1"
"@sinonjs/samsam" "^5.0.2"
"@sinonjs/samsam@^5.0.2", "@sinonjs/samsam@^5.0.3":
version "5.0.3"
resolved "https://registry.yarnpkg.com/@sinonjs/samsam/-/samsam-5.0.3.tgz#86f21bdb3d52480faf0892a480c9906aa5a52938"
integrity sha512-QucHkc2uMJ0pFGjJUDP3F9dq5dx8QIaqISl9QgwLOh6P9yv877uONPGXh/OH/0zmM3tW1JjuJltAZV2l7zU+uQ==
dependencies:
"@sinonjs/commons" "^1.6.0"
lodash.get "^4.4.2"
type-detect "^4.0.8"
"@sinonjs/text-encoding@^0.7.1":
version "0.7.1"
resolved "https://registry.yarnpkg.com/@sinonjs/text-encoding/-/text-encoding-0.7.1.tgz#8da5c6530915653f3a1f38fd5f101d8c3f8079c5"
integrity sha512-+iTbntw2IZPb/anVDbypzfQa+ay64MW0Zo8aJ8gZPWMMK6/OubMVb6lUPMagqjOPnmtauXnFCACVl3O7ogjeqQ==
JSONStream@^1.0.0:
version "1.3.5"
resolved "https://registry.yarnpkg.com/JSONStream/-/JSONStream-1.3.5.tgz#3208c1f08d3a4d99261ab64f92302bc15e111ca0"
@ -223,28 +245,28 @@ balanced-match@^1.0.0:
resolved "https://registry.yarnpkg.com/balanced-match/-/balanced-match-1.0.0.tgz#89b4d199ab2bee49de164ea02b89ce462d71b767"
integrity sha1-ibTRmasr7kneFk6gK4nORi1xt2c=
base-x@3.0.8:
version "3.0.8"
resolved "https://registry.yarnpkg.com/base-x/-/base-x-3.0.8.tgz#1e1106c2537f0162e8b52474a557ebb09000018d"
integrity sha512-Rl/1AWP4J/zRrk54hhlxH4drNxPJXYUaKffODVI53/dAsV4t9fBxyxYKAVPU1XBHxYwOWP9h9H0hM2MVw4YfJA==
dependencies:
safe-buffer "^5.0.1"
base62@2.0.1:
version "2.0.1"
resolved "https://registry.yarnpkg.com/base62/-/base62-2.0.1.tgz#729cfe179ed34c61e4a489490105b44ce4ea1197"
integrity sha512-4t4WQK7mdbcWzqEBiq6tfo2qDCeIZGXvjifJZyxHIVcjQkZJxpFtu/pa2Va69OouCkg6izZ08hKnPxroeDyzew==
base64-arraybuffer@0.1.4:
version "0.1.4"
resolved "https://registry.yarnpkg.com/base64-arraybuffer/-/base64-arraybuffer-0.1.4.tgz#9818c79e059b1355f97e0428a017c838e90ba812"
integrity sha1-mBjHngWbE1X5fgQooBfIOOkLqBI=
base64-arraybuffer@0.1.5:
version "0.1.5"
resolved "https://registry.yarnpkg.com/base64-arraybuffer/-/base64-arraybuffer-0.1.5.tgz#73926771923b5a19747ad666aa5cd4bf9c6e9ce8"
integrity sha1-c5JncZI7Whl0etZmqlzUv5xunOg=
base64id@2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/base64id/-/base64id-2.0.0.tgz#2770ac6bc47d312af97a8bf9a634342e0cd25cb6"
integrity sha512-lGe34o6EHj9y3Kts9R4ZYs/Gr+6N7MCaMlIFA3F1R2O5/m7K06AxfSeO5530PEERE6/WyEg3lsuyw4GHlPZHog==
better-assert@~1.0.0:
version "1.0.2"
resolved "https://registry.yarnpkg.com/better-assert/-/better-assert-1.0.2.tgz#40866b9e1b9e0b55b481894311e68faffaebc522"
integrity sha1-QIZrnhueC1W0gYlDEeaPr/rrxSI=
dependencies:
callsite "1.0.0"
binary-extensions@^2.0.0:
version "2.2.0"
resolved "https://registry.yarnpkg.com/binary-extensions/-/binary-extensions-2.2.0.tgz#75f502eeaf9ffde42fc98829645be4ea76bd9e2d"
@ -329,11 +351,6 @@ caller-path@^0.1.0:
dependencies:
callsites "^0.2.0"
callsite@1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/callsite/-/callsite-1.0.0.tgz#280398e5d664bd74038b6f0905153e6e8af1bc20"
integrity sha1-KAOY5dZkvXQDi28JBRU+borxvCA=
callsites@^0.2.0:
version "0.2.0"
resolved "https://registry.yarnpkg.com/callsites/-/callsites-0.2.0.tgz#afab96262910a7f33c19a5775825c69f34e350ca"
@ -474,10 +491,10 @@ concat-stream@^1.4.6:
readable-stream "^2.2.2"
typedarray "^0.0.6"
cookie@0.3.1:
version "0.3.1"
resolved "https://registry.yarnpkg.com/cookie/-/cookie-0.3.1.tgz#e7e0a1f9ef43b4c8ba925c5c5a96e806d16873bb"
integrity sha1-5+Ch+e9DtMi6klxcWpboBtFoc7s=
cookie@~0.4.1:
version "0.4.1"
resolved "https://registry.yarnpkg.com/cookie/-/cookie-0.4.1.tgz#afd713fe26ebd21ba95ceb61f9a8116e50a537d1"
integrity sha512-ZwrFkGJxUR3EIoXtO+yVE69Eb7KlixbaeAWfBQB9vVsNn/o+Yw69gBWSSDK825hQNdN+wF8zELf3dFNl/kxkUA==
core-util-is@~1.0.0:
version "1.0.2"
@ -569,7 +586,7 @@ depd@^1.1.2:
resolved "https://registry.yarnpkg.com/depd/-/depd-1.1.2.tgz#9bcd52e14c097763e749b274c4346ed2e560b5a9"
integrity sha1-m81S4UwJd2PnSbJ0xDRu0uVgtak=
diff@4.0.2:
diff@4.0.2, diff@^4.0.2:
version "4.0.2"
resolved "https://registry.yarnpkg.com/diff/-/diff-4.0.2.tgz#60f3aecb89d5fae520c11aa19efc2bb982aade7d"
integrity sha512-58lmxKSA4BNyLz+HHMUzlOEpg09FV+ev6ZMe3vJihgdxzgcwZ8VoEEPmALCZG9LmqfVoNMMKpttIYTVG6uDY7A==
@ -622,6 +639,23 @@ engine.io-client@~3.4.0:
xmlhttprequest-ssl "~1.5.4"
yeast "0.1.2"
engine.io-client@~3.5.0:
version "3.5.2"
resolved "https://registry.yarnpkg.com/engine.io-client/-/engine.io-client-3.5.2.tgz#0ef473621294004e9ceebe73cef0af9e36f2f5fa"
integrity sha512-QEqIp+gJ/kMHeUun7f5Vv3bteRHppHH/FMBQX/esFj/fuYfjyUKWGMo3VCvIP/V8bE9KcjHmRZrhIz2Z9oNsDA==
dependencies:
component-emitter "~1.3.0"
component-inherit "0.0.3"
debug "~3.1.0"
engine.io-parser "~2.2.0"
has-cors "1.1.0"
indexof "0.0.1"
parseqs "0.0.6"
parseuri "0.0.6"
ws "~7.4.2"
xmlhttprequest-ssl "~1.6.2"
yeast "0.1.2"
engine.io-parser@~2.2.0:
version "2.2.1"
resolved "https://registry.yarnpkg.com/engine.io-parser/-/engine.io-parser-2.2.1.tgz#57ce5611d9370ee94f99641b589f94c97e4f5da7"
@ -633,17 +667,17 @@ engine.io-parser@~2.2.0:
blob "0.0.5"
has-binary2 "~1.0.2"
engine.io@~3.4.0:
version "3.4.2"
resolved "https://registry.yarnpkg.com/engine.io/-/engine.io-3.4.2.tgz#8fc84ee00388e3e228645e0a7d3dfaeed5bd122c"
integrity sha512-b4Q85dFkGw+TqgytGPrGgACRUhsdKc9S9ErRAXpPGy/CXKs4tYoHDkvIRdsseAF7NjfVwjRFIn6KTnbw7LwJZg==
engine.io@~3.5.0:
version "3.5.0"
resolved "https://registry.yarnpkg.com/engine.io/-/engine.io-3.5.0.tgz#9d6b985c8a39b1fe87cd91eb014de0552259821b"
integrity sha512-21HlvPUKaitDGE4GXNtQ7PLP0Sz4aWLddMPw2VTyFz1FVZqu/kZsJUO8WNpKuE/OCL7nkfRaOui2ZCJloGznGA==
dependencies:
accepts "~1.3.4"
base64id "2.0.0"
cookie "0.3.1"
cookie "~0.4.1"
debug "~4.1.0"
engine.io-parser "~2.2.0"
ws "^7.1.2"
ws "~7.4.2"
entities@~1.1.1:
version "1.1.2"
@ -1030,7 +1064,7 @@ glob-parent@~5.1.0:
dependencies:
is-glob "^4.0.1"
glob@7.1.6, glob@^7.0.3, glob@^7.1.2, glob@^7.1.3:
glob@7.1.6, glob@^7.0.3, glob@^7.1.2:
version "7.1.6"
resolved "https://registry.yarnpkg.com/glob/-/glob-7.1.6.tgz#141f33b81a7c2492e125594307480c46679278a6"
integrity sha512-LwaxwyZ72Lk7vZINtNNrywX0ZuLyStrdDtabefZKAY5ZGJhVtgdznluResxNmPitE0SAO+O26sWTHeKSI2wMBA==
@ -1042,6 +1076,18 @@ glob@7.1.6, glob@^7.0.3, glob@^7.1.2, glob@^7.1.3:
once "^1.3.0"
path-is-absolute "^1.0.0"
glob@^7.1.3:
version "7.1.4"
resolved "https://registry.yarnpkg.com/glob/-/glob-7.1.4.tgz#aa608a2f6c577ad357e1ae5a5c26d9a8d1969255"
integrity sha512-hkLPepehmnKk41pUGm3sYxoFs/umurYfYJCerbXEyFIWcAzvpipAgVkBqqT9RBKMGjnq6kMuyYwha6csxbiM1A==
dependencies:
fs.realpath "^1.0.0"
inflight "^1.0.4"
inherits "2"
minimatch "^3.0.4"
once "^1.3.0"
path-is-absolute "^1.0.0"
globals@^9.2.0:
version "9.18.0"
resolved "https://registry.yarnpkg.com/globals/-/globals-9.18.0.tgz#aa3896b3e69b487f17e31ed2143d69a8e30c2d8a"
@ -1417,6 +1463,11 @@ jsonpointer@^4.0.0:
resolved "https://registry.yarnpkg.com/jsonpointer/-/jsonpointer-4.0.1.tgz#4fd92cb34e0e9db3c89c8622ecf51f9b978c6cb9"
integrity sha1-T9kss04OnbPInIYi7PUfm5eMbLk=
just-extend@^4.0.2:
version "4.1.0"
resolved "https://registry.yarnpkg.com/just-extend/-/just-extend-4.1.0.tgz#7278a4027d889601640ee0ce0e5a00b992467da4"
integrity sha512-ApcjaOdVTJ7y4r08xI5wIqpvwS48Q0PBG4DJROcEkH1f8MdAiNFyFxz3xoL0LWAVwjrwPYZdVHHxhRHcx/uGLA==
keypress@0.1.x:
version "0.1.0"
resolved "https://registry.yarnpkg.com/keypress/-/keypress-0.1.0.tgz#4a3188d4291b66b4f65edb99f806aa9ae293592a"
@ -1575,6 +1626,11 @@ lodash.flatten@^4.4.0:
resolved "https://registry.yarnpkg.com/lodash.flatten/-/lodash.flatten-4.4.0.tgz#f31c22225a9632d2bbf8e4addbef240aa765a61f"
integrity sha1-8xwiIlqWMtK7+OSt2+8kCqdlph8=
lodash.get@^4.4.2:
version "4.4.2"
resolved "https://registry.yarnpkg.com/lodash.get/-/lodash.get-4.4.2.tgz#2d177f652fa31e939b4438d5341499dfa3825e99"
integrity sha1-LRd/ZS+jHpObRDjVNBSZ36OCXpk=
lodash.union@^4.6.0:
version "4.6.0"
resolved "https://registry.yarnpkg.com/lodash.union/-/lodash.union-4.6.0.tgz#48bb5088409f16f1821666641c44dd1aaae3cd88"
@ -1754,6 +1810,17 @@ next-tick@~1.0.0:
resolved "https://registry.yarnpkg.com/next-tick/-/next-tick-1.0.0.tgz#ca86d1fe8828169b0120208e3dc8424b9db8342c"
integrity sha1-yobR/ogoFpsBICCOPchCS524NCw=
nise@^4.0.1:
version "4.0.3"
resolved "https://registry.yarnpkg.com/nise/-/nise-4.0.3.tgz#9f79ff02fa002ed5ffbc538ad58518fa011dc913"
integrity sha512-EGlhjm7/4KvmmE6B/UFsKh7eHykRl9VH+au8dduHLCyWUO/hr7+N+WtTvDUwc9zHuM1IaIJs/0lQ6Ag1jDkQSg==
dependencies:
"@sinonjs/commons" "^1.7.0"
"@sinonjs/fake-timers" "^6.0.0"
"@sinonjs/text-encoding" "^0.7.1"
just-extend "^4.0.2"
path-to-regexp "^1.7.0"
node-forge@^0.7.1:
version "0.7.6"
resolved "https://registry.yarnpkg.com/node-forge/-/node-forge-0.7.6.tgz#fdf3b418aee1f94f0ef642cd63486c77ca9724ac"
@ -1779,11 +1846,6 @@ object-assign@^4.0.1, object-assign@^4.1.0:
resolved "https://registry.yarnpkg.com/object-assign/-/object-assign-4.1.1.tgz#2109adc7965887cfc05cbbd442cac8bfbb360863"
integrity sha1-IQmtx5ZYh8/AXLvUQsrIv7s2CGM=
object-component@0.0.3:
version "0.0.3"
resolved "https://registry.yarnpkg.com/object-component/-/object-component-0.0.3.tgz#f0c69aa50efc95b866c186f400a33769cb2f1291"
integrity sha1-8MaapQ78lbhmwYb0AKM3acsvEpE=
object-inspect@^1.9.0:
version "1.9.0"
resolved "https://registry.yarnpkg.com/object-inspect/-/object-inspect-1.9.0.tgz#c90521d74e1127b67266ded3394ad6116986533a"
@ -1874,25 +1936,11 @@ p-try@^2.0.0:
resolved "https://registry.yarnpkg.com/p-try/-/p-try-2.2.0.tgz#cb2868540e313d61de58fafbe35ce9004d5540e6"
integrity sha512-R4nPAVTAU0B9D35/Gk3uJf/7XYbQcyohSKdvAxIRSNghFl4e71hVoGnBNQz9cWaXxO2I10KTC+3jMdvvoKw6dQ==
parseqs@0.0.5:
version "0.0.5"
resolved "https://registry.yarnpkg.com/parseqs/-/parseqs-0.0.5.tgz#d5208a3738e46766e291ba2ea173684921a8b89d"
integrity sha1-1SCKNzjkZ2bikbouoXNoSSGouJ0=
dependencies:
better-assert "~1.0.0"
parseqs@0.0.6:
version "0.0.6"
resolved "https://registry.yarnpkg.com/parseqs/-/parseqs-0.0.6.tgz#8e4bb5a19d1cdc844a08ac974d34e273afa670d5"
integrity sha512-jeAGzMDbfSHHA091hr0r31eYfTig+29g3GKKE/PPbEQ65X0lmMwlEoqmhzu0iztID5uJpZsFlUPDP8ThPL7M8w==
parseuri@0.0.5:
version "0.0.5"
resolved "https://registry.yarnpkg.com/parseuri/-/parseuri-0.0.5.tgz#80204a50d4dbb779bfdc6ebe2778d90e4bce320a"
integrity sha1-gCBKUNTbt3m/3G6+J3jZDkvOMgo=
dependencies:
better-assert "~1.0.0"
parseuri@0.0.6:
version "0.0.6"
resolved "https://registry.yarnpkg.com/parseuri/-/parseuri-0.0.6.tgz#e1496e829e3ac2ff47f39a4dd044b32823c4a25a"
@ -1918,6 +1966,13 @@ path-is-inside@^1.0.1:
resolved "https://registry.yarnpkg.com/path-is-inside/-/path-is-inside-1.0.2.tgz#365417dede44430d1c11af61027facf074bdfc53"
integrity sha1-NlQX3t5EQw0cEa9hAn+s8HS9/FM=
path-to-regexp@^1.7.0:
version "1.8.0"
resolved "https://registry.yarnpkg.com/path-to-regexp/-/path-to-regexp-1.8.0.tgz#887b3ba9d84393e87a0a0b9f4cb756198b53548a"
integrity sha512-n43JRhlUKUAlibEJhPeir1ncUID16QnEjNpwzNdO3Lm4ywrBpBZ5oLD0I6br9evr1Y9JTqwRtAh7JLoOzAQdVA==
dependencies:
isarray "0.0.1"
picomatch@^2.0.4, picomatch@^2.0.7:
version "2.2.2"
resolved "https://registry.yarnpkg.com/picomatch/-/picomatch-2.2.2.tgz#21f333e9b6b8eaff02468f5146ea406d345f4dad"
@ -2184,6 +2239,19 @@ simple-glob@^0.2:
lodash.flatten "^4.4.0"
lodash.union "^4.6.0"
sinon@^9.0.2:
version "9.0.2"
resolved "https://registry.yarnpkg.com/sinon/-/sinon-9.0.2.tgz#b9017e24633f4b1c98dfb6e784a5f0509f5fd85d"
integrity sha512-0uF8Q/QHkizNUmbK3LRFqx5cpTttEVXudywY9Uwzy8bTfZUhljZ7ARzSxnRHWYWtVTeh4Cw+tTb3iU21FQVO9A==
dependencies:
"@sinonjs/commons" "^1.7.2"
"@sinonjs/fake-timers" "^6.0.1"
"@sinonjs/formatio" "^5.0.1"
"@sinonjs/samsam" "^5.0.3"
diff "^4.0.2"
nise "^4.0.1"
supports-color "^7.1.0"
slice-ansi@0.0.4:
version "0.0.4"
resolved "https://registry.yarnpkg.com/slice-ansi/-/slice-ansi-0.0.4.tgz#edbf8903f66f7ce2f8eafd6ceed65e264c831b35"
@ -2194,23 +2262,20 @@ socket.io-adapter@~1.1.0:
resolved "https://registry.yarnpkg.com/socket.io-adapter/-/socket.io-adapter-1.1.2.tgz#ab3f0d6f66b8fc7fca3959ab5991f82221789be9"
integrity sha512-WzZRUj1kUjrTIrUKpZLEzFZ1OLj5FwLlAFQs9kuZJzJi5DKdU7FsWc36SNmA8iDOtwBQyT8FkrriRM8vXLYz8g==
socket.io-client@2.3.0:
version "2.3.0"
resolved "https://registry.yarnpkg.com/socket.io-client/-/socket.io-client-2.3.0.tgz#14d5ba2e00b9bcd145ae443ab96b3f86cbcc1bb4"
integrity sha512-cEQQf24gET3rfhxZ2jJ5xzAOo/xhZwK+mOqtGRg5IowZsMgwvHwnf/mCRapAAkadhM26y+iydgwsXGObBB5ZdA==
socket.io-client@2.4.0:
version "2.4.0"
resolved "https://registry.yarnpkg.com/socket.io-client/-/socket.io-client-2.4.0.tgz#aafb5d594a3c55a34355562fc8aea22ed9119a35"
integrity sha512-M6xhnKQHuuZd4Ba9vltCLT9oa+YvTsP8j9NcEiLElfIg8KeYPyhWOes6x4t+LTAC8enQbE/995AdTem2uNyKKQ==
dependencies:
backo2 "1.0.2"
base64-arraybuffer "0.1.5"
component-bind "1.0.0"
component-emitter "1.2.1"
debug "~4.1.0"
engine.io-client "~3.4.0"
component-emitter "~1.3.0"
debug "~3.1.0"
engine.io-client "~3.5.0"
has-binary2 "~1.0.2"
has-cors "1.1.0"
indexof "0.0.1"
object-component "0.0.3"
parseqs "0.0.5"
parseuri "0.0.5"
parseqs "0.0.6"
parseuri "0.0.6"
socket.io-parser "~3.3.0"
to-array "0.1.4"
@ -2249,16 +2314,16 @@ socket.io-parser@~3.4.0:
debug "~4.1.0"
isarray "2.0.1"
socket.io@~2.3.0:
version "2.3.0"
resolved "https://registry.yarnpkg.com/socket.io/-/socket.io-2.3.0.tgz#cd762ed6a4faeca59bc1f3e243c0969311eb73fb"
integrity sha512-2A892lrj0GcgR/9Qk81EaY2gYhCBxurV0PfmmESO6p27QPrUK1J3zdns+5QPqvUYK2q657nSj0guoIil9+7eFg==
socket.io@~2.4.0:
version "2.4.0"
resolved "https://registry.yarnpkg.com/socket.io/-/socket.io-2.4.0.tgz#01030a2727bd8eb2e85ea96d69f03692ee53d47e"
integrity sha512-9UPJ1UTvKayuQfVv2IQ3k7tCQC/fboDyIK62i99dAQIyHKaBsNdTpwHLgKJ6guRWxRtC9H+138UwpaGuQO9uWQ==
dependencies:
debug "~4.1.0"
engine.io "~3.4.0"
engine.io "~3.5.0"
has-binary2 "~1.0.2"
socket.io-adapter "~1.1.0"
socket.io-client "2.3.0"
socket.io-client "2.4.0"
socket.io-parser "~3.4.0"
sprintf-js@~1.0.2:
@@ -2371,7 +2436,7 @@ strip-json-comments@~1.0.1:
resolved "https://registry.yarnpkg.com/strip-json-comments/-/strip-json-comments-1.0.4.tgz#1e15fbcac97d3ee99bf2d73b4c656b082bbafb91"
integrity sha1-HhX7ysl9Pumb8tc7TGVrCCu6+5E=
supports-color@7.1.0:
supports-color@7.1.0, supports-color@^7.1.0:
version "7.1.0"
resolved "https://registry.yarnpkg.com/supports-color/-/supports-color-7.1.0.tgz#68e32591df73e25ad1c4b49108a2ec507962bfd1"
integrity sha512-oRSIpR8pxT1Wr2FquTNnGet79b3BWljqOuoW/h4oBhxJ/HUbX5nX6JSruTkvXDCFMwDPvsaTTbvMLKZWSy0R5g==
@@ -2445,7 +2510,7 @@ type-check@~0.3.2:
dependencies:
prelude-ls "~1.1.2"
type-detect@4.0.8:
type-detect@4.0.8, type-detect@^4.0.8:
version "4.0.8"
resolved "https://registry.yarnpkg.com/type-detect/-/type-detect-4.0.8.tgz#7646fb5f18871cfbb7749e69bd39a6388eb7450c"
integrity sha512-0fr/mIH1dlO+x7TlcMy+bIDqKPsw/70tVyeHW787goQjhmqaZe10uwLujubK9q9Lg6Fiho1KUKDYz0Z7k7g5/g==
@@ -2600,11 +2665,6 @@ write@^0.2.1:
dependencies:
mkdirp "^0.5.1"
ws@^7.1.2:
version "7.3.0"
resolved "https://registry.yarnpkg.com/ws/-/ws-7.3.0.tgz#4b2f7f219b3d3737bc1a2fbf145d825b94d38ffd"
integrity sha512-iFtXzngZVXPGgpTlP1rBqsUK82p9tKqsWRPg5L56egiljujJT3vGAYnHANvFxBieXrTFavhzhxW52jnaWV+w2w==
ws@~6.1.0:
version "6.1.4"
resolved "https://registry.yarnpkg.com/ws/-/ws-6.1.4.tgz#5b5c8800afab925e94ccb29d153c8d02c1776ef9"
@@ -2612,6 +2672,11 @@ ws@~6.1.0:
dependencies:
async-limiter "~1.0.0"
ws@~7.4.2:
version "7.4.6"
resolved "https://registry.yarnpkg.com/ws/-/ws-7.4.6.tgz#5654ca8ecdeee47c33a9a4bf6d28e2be2980377c"
integrity sha512-YmhHDO4MzaDLB+M9ym/mDA5z0naX8j7SIlT8f8z+I0VtzsRbekxEutHSme7NPS2qE8StCYQNUnfWdXta/Yu85A==
xml2js@~0.4.23:
version "0.4.23"
resolved "https://registry.yarnpkg.com/xml2js/-/xml2js-0.4.23.tgz#a0c69516752421eb2ac758ee4d4ccf58843eac66"
@@ -2630,6 +2695,11 @@ xmlhttprequest-ssl@~1.5.4:
resolved "https://registry.yarnpkg.com/xmlhttprequest-ssl/-/xmlhttprequest-ssl-1.5.5.tgz#c2876b06168aadc40e57d97e81191ac8f4398b3e"
integrity sha1-wodrBhaKrcQOV9l+gRkayPQ5iz4=
xmlhttprequest-ssl@~1.6.2:
version "1.6.3"
resolved "https://registry.yarnpkg.com/xmlhttprequest-ssl/-/xmlhttprequest-ssl-1.6.3.tgz#03b713873b01659dfa2c1c5d056065b27ddc2de6"
integrity sha512-3XfeQE/wNkvrIktn2Kf0869fC0BN6UpydVasGIeSm2B1Llihf7/0UfZM+eCkOw3P7bP4+qPgqhm7ZoxuJtFU0Q==
xtend@^4.0.0, xtend@~4.0.0:
version "4.0.2"
resolved "https://registry.yarnpkg.com/xtend/-/xtend-4.0.2.tgz#bb72779f5fa465186b1f438f674fa347fdb5db54"