Compare commits


9 Commits

Author SHA1 Message Date
Antonin Coulibaly c75435ec8c wip 2017-07-05 12:07:13 +02:00
Antonin Coulibaly 8ade0f71e9 wip 2017-06-20 14:56:33 +02:00
Antonin Coulibaly 3268fa1cc7 wip 2017-06-16 10:13:54 +02:00
David Pineau 6987e9031a
FT: DefaultFields: Add a memory management helper
Due to the DefaultFields design, we set up cyclic references that a garbage-collected
language such as Node.JS has no way of resolving on its own, so the associated
memory is never released.

As such, this commit provides a helper function that breaks those references. It
can be called manually, and is also called automatically from the end logger's
logging methods.

This choice came from an analysis of the possible alternatives (which yielded
close to nothing) and from the context of the feature: the DefaultFields class
is aimed at managing the relationships between the various Module Loggers and
the Request Loggers:
 1. The RequestLogger has a lifetime contained within that of its natural
    parent, the Module Logger
 2. The Module Logger has a lifetime matching that of the program, since it is
    created once and for all at the initialization of the associated module
As such, it was deemed acceptable to never release the Module Logger's memory,
and to automate the release of the RequestLogger's memory by calling the
destroy function at the end of its lifespan, i.e. when the end logger's logging
method is called (it should be called only once, as the last logging entry of
the RequestLogger, after which the RequestLogger must not be used anymore).

TL;DR: this kind of OOP design interacts poorly with memory-managed languages,
so here is our own workaround to make the memory leak issue "not that bad".
2017-05-26 11:49:32 +02:00
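
As a rough illustration of the lifecycle described in this commit message (a sketch only: the service name, the `handleRequest` function and its fields are invented for the example), the intended usage looks like this:

```js
const { Logger } = require('werelogs');

// The Module Logger lives as long as the program; its memory is never
// reclaimed, which the commit above deems acceptable.
const moduleLog = new Logger({ name: 'my-service' });

function handleRequest(req) {
    // The RequestLogger is a child of the Module Logger.
    const reqLog = moduleLog.newRequestLogger();
    reqLog.info('request received', { path: req.path });

    // Calling the end logger's logging method exactly once triggers the
    // memory management helper internally, breaking the parent/child
    // references so the GC can reclaim reqLog. It must not be used after.
    reqLog.end('request processed', { status: 200 });
}
```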
David Pineau 42a8dcc968
FT: Add a parent/child relationship between Logger and RequestLogger
This feature is enabled through the use of the DefaultFields class, which
provides a parent/child relationship between its inheritors. This way, the
RequestLogger now includes the Module Logger's default fields dynamically, so
they can be updated at runtime instead of only at object instantiation time.
2017-05-26 11:49:32 +02:00
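
A short sketch of the dynamic behaviour this enables (field names and values are illustrative; the calls rely on the `addDefaultFields` and `newRequestLogger` methods visible in the diff below):

```js
const { Logger } = require('werelogs');

const moduleLog = new Logger({ name: 'data-layer' });
const reqLog = moduleLog.newRequestLogger();

// Updating the Module Logger's default fields after the RequestLogger was
// created is now reflected in the child's log entries as well, instead of
// the fields being frozen at instantiation time.
moduleLog.addDefaultFields({ region: 'eu-west-1' });
reqLog.info('fetching object'); // entry contains both `name` and `region`
```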
David Pineau 812d2bec67
feat(hierarchy): add backwards/forward compat
Warning: An important feature update is provided, and we should normally
signal it through a minor version bump, but as we are not following
semantic versioning we cannot hint at that. This feature deprecates some
function prototypes, as the API is bound to change in a way that breaks
backwards compatibility. Backwards compatibility is currently maintained
through deprecated versions of those functions, but they should disappear
from the API starting with the next major version.

This commit also introduces the backward/forward compatibility for the
Logger's constructor, as well as the additional tests validating the
forward-compatibility of the API.
2017-05-26 11:49:21 +02:00
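
The two constructor forms this refers to, sketched below (the level and dump values are only examples): the string-name form is the deprecated backwards-compatible one, the fields-object form is the forward-compatible one.

```js
const { Logger } = require('werelogs');

// Deprecated form, kept for backwards compatibility until the next major
// version: the string is turned into a `name` default field internally.
const oldStyle = new Logger('my-module');

// Forward-compatible form: a default-fields object, plus an optional
// werelogs configuration object.
const newStyle = new Logger(
    { name: 'my-module' },
    { level: 'info', dump: 'error' },
);
```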
David Pineau f0391eb5d1
FT: Use DefaultFields in Logger.js
Provide the DefaultFields class's logic to the Logger class.
This will enable new features such as linking Module loggers together to
provide a more consistent logging scheme.
2017-05-26 10:51:43 +02:00
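
Since Logger now mixes in DefaultFields, linking module loggers could look like the following sketch (hypothetical usage, not part of this changeset; names and fields are invented):

```js
const { Logger } = require('werelogs');

const appLog = new Logger({ name: 'app', tier: 'backend' });
const dbLog = new Logger({ name: 'app-db' });

// setParent() is inherited from DefaultFields: dbLog now aggregates
// appLog's default fields with its own (its own `name` wins).
dbLog.setParent(appLog);
dbLog.info('connected'); // entry carries `tier` from appLog and `name: 'app-db'`
```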
David Pineau 93291d55e4
RF: Use DefaultFields in RequestLogger.js
Replace the default fields logic in RequestLogger.js (the RequestLogger and
EndLogger classes) with the use and extension of the newly written
DefaultFields class.
2017-05-26 10:51:43 +02:00
David Pineau bb63fc629c
FT: Provide the DefaultFields class
This new utility class provides two concepts:
 - 'Default fields' that already existed within the RequestLogger
 - 'Hierarchy relationships', with override capability and automatic
    caching and updating of the aggregated default fields provided by the
    whole hierarchy

This class is aimed at being used by other pieces of werelogs, and is provided
with its behavioral tests.
2017-05-26 10:51:36 +02:00
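
A sketch of those two concepts using the class added below in lib/DefaultFields.js (the require path and field values are assumed for illustration; `_dfGetFields` and `_dfDestroy` are the internal accessors shown in that file):

```js
const DefaultFields = require('werelogs/lib/DefaultFields.js');

const parent = new DefaultFields();
const child = new DefaultFields();
child.setParent(parent);

parent.addDefaultFields({ component: 'api', region: 'us-east-1' });
child.addDefaultFields({ region: 'eu-west-1' }); // overrides the parent's value

// The aggregation is precomputed and kept up to date whenever a node in
// the hierarchy changes; the child's own fields take precedence.
child._dfGetFields(); // { component: 'api', region: 'eu-west-1' }

// Break the cyclic parent/child references once the node is done with.
child._dfDestroy();
```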
31 changed files with 1425 additions and 1235 deletions


@ -1,31 +0,0 @@
name: Tests
on:
push:
branches-ignore:
- development/**
- q/*/**
jobs:
tests:
runs-on: ubuntu-latest
steps:
- name: Checkokut
uses: actions/checkout@v4
- name: Install deps
run: sudo apt-get update -q
- uses: actions/setup-node@v4
with:
node-version: '16'
- name: Install Yarn
run: npm install -g yarn
- name: install dependencies
run: yarn install --frozen-lockfile
- name: run lint
run: echo "linter is disabled temporarily ()" || yarn run --silent lint -- --max-warnings 0
- name: run lint_md
run: yarn --silent lint_md
- name: run test
run: yarn test
- name: run coverage
run: yarn coverage


@ -1,5 +1,6 @@
# WereLogs # WereLogs
[![CircleCI][badgepub]](https://circleci.com/gh/scality/werelogs)
[![Scality CI][badgepriv]](http://ci.ironmann.io/gh/scality/werelogs) [![Scality CI][badgepriv]](http://ci.ironmann.io/gh/scality/werelogs)
This repository provides a NodeJS Library that aims to be an efficient logging This repository provides a NodeJS Library that aims to be an efficient logging
@ -19,7 +20,7 @@ https://github.com/scality/Guidelines/blob/master/CONTRIBUTING.md).
In order to install WereLogs, you can use NPM with github's HTTP url, and save In order to install WereLogs, you can use NPM with github's HTTP url, and save
it in your own package.json: it in your own package.json:
```sh ```
$> npm i --save scality/werelogs $> npm i --save scality/werelogs
``` ```
@ -157,3 +158,11 @@ In order to find out the known issues, it is advised to take a look at the
[project's github page](http://github.com/scality/werelogs). There, you should [project's github page](http://github.com/scality/werelogs). There, you should
be able to find the issues, tagged with the releases they are impacting, be able to find the issues, tagged with the releases they are impacting,
whether they're open or closed. whether they're open or closed.
## Contributing
The contributing rules for this project are defined in the associated
CONTRIBUTING.md file.
[badgepub]: https://circleci.com/gh/scality/werelogs.svg?style=svg
[badgepriv]: http://ci.ironmann.io/gh/scality/werelogs.svg?style=svg&circle-token=a946e81ad65b99814403b5e57f017d9ecbe93f0a

circle.yml (new file, 23 lines)

@ -0,0 +1,23 @@
general:
branches:
ignore:
- /^ultron\/.*/ # Ignore ultron/* branches
artifacts:
- coverage/
- doc/
machine:
node:
version: 6.9.5
test:
override:
- npm run lint_md
- npm run lint -- --max-warnings 0
- npm run gendoc
- npm run coverage
# running ft_test packs werelogs and installs it + deps into
# tests/functional. Pack is like publishing werelogs in a local tgz
# archive that can be installed.
# This step shall ensure that no issue is encountered when installing
# the package, and allows to functionally test werelogs.
- npm run ft_test

index.d.ts (vendored, 84 changed lines)

@ -1,65 +1,45 @@
interface WerelogsConfigOptions { interface config {}
level?: 'trace' | 'debug' | 'info' | 'warn' | 'error' | 'fatal';
dump?: 'trace' | 'debug' | 'info' | 'warn' | 'error' | 'fatal';
streams?: object[];
}
declare class WerelogsConfig { interface LogDictionnary {
constructor(config?: WerelogsConfigOptions);
reset(): WerelogsConfig;
update(config: WerelogsConfig): WerelogsConfig;
logger: any;
level: 'trace' | 'debug' | 'info' | 'warn' | 'error' | 'fatal';
dump: 'trace' | 'debug' | 'info' | 'warn' | 'error' | 'fatal';
end: 'trace' | 'debug' | 'info' | 'warn' | 'error' | 'fatal';
}
interface LogDictionary {
httpMethod?: string; httpMethod?: string;
httpURL?: string; httpURL?: string;
[field: string]: any; [field: string]: any;
} }
declare module 'werelogs' { declare class RequestLogger {
export class RequestLogger { constructor(
constructor( logger: any,
logger: any, logLevel: string,
logLevel: string, dumpThreshold: string,
dumpThreshold: string, endLevel: string,
endLevel: string, uids?: string|Array<string>
uids?: string|Array<string> );
); getUids(): Array<string>;
getUids(): Array<string>; getSerializedUids(): string;
getSerializedUids(): string; addDefaultFields(fields: LogDictionnary): LogDictionnary;
addDefaultFields(fields: LogDictionary): LogDictionary; trace(msg: string, data?: LogDictionnary): void;
trace(msg: string, data?: LogDictionary): void; debug(msg: string, data?: LogDictionnary): void;
debug(msg: string, data?: LogDictionary): void; info(msg: string, data?: LogDictionnary): void;
info(msg: string, data?: LogDictionary): void; warn(msg: string, data?: LogDictionnary): void;
warn(msg: string, data?: LogDictionary): void; error(msg: string, data?: LogDictionnary): void;
error(msg: string, data?: LogDictionary): void; fatal(msg: string, data?: LogDictionnary): void;
fatal(msg: string, data?: LogDictionary): void; end(msg: string, data?: LogDictionnary): void;
end(msg: string, data?: LogDictionary): void; }
errorEnd(msg: string, data?:LogDictionary): void;
}
declare module 'werelogs' {
export class Logger { export class Logger {
name: string; name: string;
constructor(name: string);
constructor(name: string, config?: config);
setLevel(levelName: string): void;
setDumpLevelThreshold(levelName: string): void;
newRequestLogger(uids?: string|Array<string>): RequestLogger; newRequestLogger(uids?: string|Array<string>): RequestLogger;
newRequestLoggerFromSerializedUids(uids: string): RequestLogger; newRequestLoggerFromSerializedUids(uids: string): RequestLogger;
trace(msg: string, data?: LogDictionary): void; trace(msg: string, data?: LogDictionnary): void;
debug(msg: string, data?: LogDictionary): void; debug(msg: string, data?: LogDictionnary): void;
info(msg: string, data?: LogDictionary): void; info(msg: string, data?: LogDictionnary): void;
warn(msg: string, data?: LogDictionary): void; warn(msg: string, data?: LogDictionnary): void;
error(msg: string, data?: LogDictionary): void; error(msg: string, data?: LogDictionnary): void;
fatal(msg: string, data?: LogDictionary): void; fatal(msg: string, data?: LogDictionnary): void;
}
export function configure(config: WerelogsConfigOptions): void;
export class API {
constructor(config: WerelogsConfigOptions);
reconfigure(config: WerelogsConfigOptions): void;
Logger: Logger;
} }
} }


@ -1,51 +1 @@
const API = require('./lib/api.js'); module.exports = { Logger: require('./lib/Logger.js') };
const stderrUtils = require('./lib/stderrUtils');
/*
* For convenience purposes, we provide an already instanciated API; so that
* old uses of the imported Logger class can be kept as-is. For quick logging,
* this also provides a hassle-free way to log using werelogs.
*/
const werelogs = new API();
module.exports = {
Logger: werelogs.Logger,
configure: werelogs.reconfigure.bind(werelogs),
Werelogs: API,
/**
* Timestamp logs going to stderr
*
* @example <caption>Simplest usage</caption>
* ```
* const { stderrUtils } = require('werelogs');
* stderrUtils.catchAndTimestampStderr();
* ```
*
* @example <caption>Manage process exit</caption>
* ```
* const { stderrUtils } = require('werelogs');
* // set exitCode to null to keep process running on uncaughtException
* stderrUtils.catchAndTimestampStderr(undefined, null);
* // application init
* process.on('uncaughtException', (err) => {
* // custom handling, close connections, files
* this.worker.kill(); // or process.exit(1);
* });
* // Note you could use prependListener to execute your callback first
* // and then let stderrUtils exit the process.
* ```
*
* @example <caption>Custom listener</caption>
* ```
* const { stderrUtils } = require('werelogs');
* stderrUtils.catchAndTimestampWarning();
* // application init
* process.on('uncaughtException', (err, origin) => {
* stderrUtils.printErrorWithTimestamp(err, origin);
* // close and stop everything
* process.exit(1);
* });
* ```
*/
stderrUtils,
};


@ -1,5 +1,4 @@
'use strict'; // eslint-disable-line strict
// eslint-disable-line strict
const LogLevel = require('./LogLevel.js'); const LogLevel = require('./LogLevel.js');
const SimpleLogger = require('./SimpleLogger.js'); const SimpleLogger = require('./SimpleLogger.js');
@ -15,31 +14,10 @@ class Config {
* This is the default constructor of the Config Object, and the only way * This is the default constructor of the Config Object, and the only way
* to instanciate it (with default parameters). * to instanciate it (with default parameters).
* *
* @param {object} conf - A configuration object for werelogs.
* @param {string} conf.level - The string name of the logging level
* ('trace', 'debug', 'info', 'warn',
* 'error' and 'fatal' in order of
* importance.)
* @param {string} conf.dump - The string name of the log dumping
* level ('trace', 'debug', 'info',
* 'warn', 'error' and 'fatal' in order
* of importance.)
* @param {object[]} conf.streams - The array of streams into which to
* log. This is an Array of objects
* which have a field named 'stream',
* which is writeable.
*
* @returns {undefined} * @returns {undefined}
*/ */
constructor(conf) { constructor() {
this.logLevel = 'info'; this.reset();
this.dumpThreshold = 'error';
this.endLevel = 'info';
this.streams = [{ level: 'trace', stream: process.stdout }];
this.simpleLogger = new SimpleLogger('werelogs', this.streams);
if (conf) {
this.update(conf);
}
} }
/** /**
@ -72,9 +50,11 @@ class Config {
* 'error' and 'fatal' in order of * 'error' and 'fatal' in order of
* importance.) * importance.)
* @param {object[]} config.streams - The array of streams into which to * @param {object[]} config.streams - The array of streams into which to
* log. This is an Array of objects * log. Their configuration is directly
* which have a field named 'stream', * related to the expected bunyan
* which is writeable. * streams array, for compatibility
* purposes (except that the 'level'
* field is not accounted for)
* *
* @see [Bunyan's documentation]{@link * @see [Bunyan's documentation]{@link
* https://github.com/trentm/node-bunyan/blob/master/README.md#streams} for * https://github.com/trentm/node-bunyan/blob/master/README.md#streams} for
@ -89,43 +69,29 @@ class Config {
const checkedConfig = config || {}; const checkedConfig = config || {};
if (Object.prototype.hasOwnProperty.call(checkedConfig, 'level')) { if (checkedConfig.hasOwnProperty('level')) {
LogLevel.throwIfInvalid(checkedConfig.level); LogLevel.throwIfInvalid(checkedConfig.level);
}
if (Object.prototype.hasOwnProperty.call(checkedConfig, 'dump')) {
LogLevel.throwIfInvalid(checkedConfig.dump);
}
// for check log level vs. log dump level
const newLogLevel = checkedConfig.level || this.logLevel;
const newLogDumpLevel = checkedConfig.dump || this.dumpThreshold;
if (newLogDumpLevel
&& !LogLevel.shouldLog(newLogDumpLevel, newLogLevel)) {
throw new Error(
'Logging level should be at most logging dump level',
);
}
if (Object.prototype.hasOwnProperty.call(checkedConfig, 'level')) {
this.logLevel = checkedConfig.level; this.logLevel = checkedConfig.level;
} }
if (Object.prototype.hasOwnProperty.call(checkedConfig, 'dump')) { if (checkedConfig.hasOwnProperty('dump')) {
LogLevel.throwIfInvalid(checkedConfig.dump);
this.dumpThreshold = checkedConfig.dump; this.dumpThreshold = checkedConfig.dump;
} }
if (Object.prototype.hasOwnProperty.call(checkedConfig, 'end')) { if (checkedConfig.hasOwnProperty('end')) {
LogLevel.throwIfInvalid(checkedConfig.end); LogLevel.throwIfInvalid(checkedConfig.end);
this.endLevel = checkedConfig.end; this.endLevel = checkedConfig.end;
} }
if (Object.prototype.hasOwnProperty.call(checkedConfig, 'streams')) { if (checkedConfig.hasOwnProperty('streams')) {
if (!Array.isArray(checkedConfig.streams)) { if (!Array.isArray(checkedConfig.streams)) {
throw new TypeError('WereLogs config.streams must be an Array ' throw new Error('WereLogs config.streams must be an Array ' +
+ 'of Writeable Streams.'); 'of Writeable Streams.');
} }
if (!checkedConfig.streams.length) { if (!checkedConfig.streams.length) {
throw new Error('Werelogs config.streams must contain at ' throw new Error('Werelogs config.streams must contain at ' +
+ 'least one stream.'); 'least one stream.');
} }
this.streams = checkedConfig.streams.map(stream => { this.streams = checkedConfig.streams.map(stream => {
stream.level = 'trace'; // eslint-disable-line no-param-reassign stream.level = 'trace'; // eslint-disable-line no-param-reassign
@ -188,4 +154,4 @@ class Config {
} }
} }
module.exports = Config; module.exports = new Config();

lib/DefaultFields.js (new file, 247 lines)

@ -0,0 +1,247 @@
'use strict'; // eslint-disable-line strict
const Utils = require('./Utils.js');
const objectCopy = Utils.objectCopy;
/**
* This class manages the internal `default fields` for logging classes, with a
* notion of parent/child relationships and hierarchy.
* A Child element will inherit the fields from the Parent element, and
* complete (or even override part of it) with its own fields. For instance,
* a child can redefine (and thus override in the final result) a field that is
* already defined by its parent.
*
* This class shall be used embedded within loggers, to provide the `Default
* fields` and hierarchy logics.
*
* @private @property {DefaultFields} parent - The parent node in the
* hierarchy
* @private @property {DefaultFields[]} children - The list of children nodes
* in the hierarchy
* @private @property {Object} fields - The dictionary of fields
* defined for this node
* @private @property {Object} parentFields - The dictionary of fields
* inherited by this node from
* its parent node
* @private @property {Object} precomputedFields - The prepared dictionary of
* aggregated fields containing
* the parent's fields
* overriden with the node's
* own fields. This is used as
* a trick to reduce the
* computational need.
*
* @mixin
*/
class DefaultFields {
constructor() {
this.parent = null;
this.children = [];
this.fields = {};
this.parentFields = {};
this.precomputedFields = {};
}
/**
* This function allows to manually destroy a node, so that all references
* to parents or children are released. This helps in avoiding the creation
* of memory leaks due to our internal cyclic references that cannot be
* automatically released due to the mechanics of GC in Node.JS, and its
* shitty prevention of meaningful OOP designs.
*
* All hail << The Coming of the Great White Handkerchief >> that will
* wipe this misery out of our lives !
*
* @returns {undefined}
*/
_dfDestroy() {
this.children.forEach(child => {
child.setParent(null);
});
this.setParent(null);
this.children = [];
}
/**
* Add a node to the list of children of the current node, effectively
* making it one of the node's children.
*
* @param {DefaultFields} child - The node to add as a child of the current
* node
*
* @return {undefined}
*/
_dfRegisterChild(child) {
this.children.push(child);
this._dfNotifyChild(child);
}
/**
* Remove a node from the list of children of the current node, effectively
* cutting off the relationship between the two.
*
* @param {DefaultFields} toRemove - The node to remove from the list of
* children of the current node.
*
* @return {undefined}
*/
_dfUnregisterChild(toRemove) {
this.children = this.children.filter(child => child !== toRemove);
}
/**
* Utility function to notify one child node of an update of the node's
* precomputed fields.
*
* @param {DefaultField} child - The child node to notify
*
* @return {undefined}
*/
_dfNotifyChild(child) {
child._dfSetParentFields(this.precomputedFields);
}
/**
* Utility function to notify every children node of an update of the
* node's precomputed fields.
*
* @return {undefined}
*/
_dfNotifyChildren() {
this.children.forEach(child => this._dfNotifyChild(child));
}
/**
* This function allows changing the Parent node of the current node,
* consequently changing the resulting aggregation of the hierarchy of
* fields. This can be used for a temporary switch of parent node.
*
* @param {DefaultFields} parent - The new parent node to set for the
* current node
*
* @return {DefaultFields|null} The previous parent
*/
setParent(parent) {
const oldParent = this.parent;
if (parent === oldParent) {
return oldParent;
}
if (oldParent) {
oldParent._dfUnregisterChild(this);
}
this.parent = parent || null;
if (this.parent) {
this.parent._dfRegisterChild(this);
} else {
this._dfSetParentFields();
}
return oldParent;
}
/**
* Internal function to partially recompute the precomputedFields through
* inclusion of the newly defined fields into the precomputed ones. This
* inclusion may override some already defined fields.
*
* This function shall always be the last one called when updating the
* internal fields, as it also triggers the update of the children nodes.
*
* @param {Object} newFields - The dictionary of newFields to include into
* the precomputedFields
*
* @return {undefined}
*/
_dfAugmentPrecomputedFields() {
objectCopy(this.precomputedFields, this.fields);
this._dfNotifyChildren();
}
/**
* Internal function to update the fields provided by the parent node in
* the DefaultFields hierarchy. It serves as a notification hook to refresh
* the precomputed fields depending on the parent node's fields.
* Two situations may lead to calling this function:
* 1. The parent node's updated its preComputed fields and notified its
* children, including the current node
* 2. The node reset its own fields, and we must re-compute to remove
* obsolete fields previously provided by the current node.
*
* @param {Object} parentFields - the precomputedFields from the parent node
*
* @return {undefined}
*/
_dfSetParentFields(parentFields) {
this.parentFields = parentFields || {};
this.precomputedFields = objectCopy({}, this.parentFields);
this._dfAugmentPrecomputedFields();
}
/**
* This function allows to reset the fields managed by the DefaultFields
* instance. It automatically triggers the re-computation of the
* precomputed fields, cascading through the node and its children.
*
* /!\ This function may lead to an important use of the computational
* resources if over-used.
*
* @return {undefined}
*/
resetDefaultFields() {
const oldFields = this.fields;
this.fields = {};
// Automatically triggers the recomputation of precomputedFields
this._dfSetParentFields(this.parentFields);
return oldFields;
}
/**
* This function allows the user to remove one or more items from the
* defaultFields's dict.
*
* @param {String[]} fields - List of the names of the fields to be removed
* from the internal dictionary of default
* fields
*
* @return {Object} The previous set of default fields
*/
removeDefaultFields(fields) {
const toRemove = {};
fields.forEach(key => {
toRemove[key] = undefined;
});
return this.addDefaultFields(toRemove);
}
/**
* This function allows the user to add default fields to include into all
* JSON log entries generated through this request logger. As this function
* attempt not to modify the provided fields object, it copies the field
* into a new object for safe keeping.
*
* @param {Object} fields The dictionnary of additional fields to include
* by default for this instance of the
* RequestLogger.
*
* @return {Object} The previous set of default fields (can be
* safely ignored).
*/
addDefaultFields(fields) {
const oldFields = this.fields;
this.fields = objectCopy({}, this.fields, fields);
this._dfAugmentPrecomputedFields(fields);
return oldFields;
}
/**
* This function returns the node's precomputed fields, that includes all
* of its hierarchy's fields and its own. This is intended to retrieve the
* final form of the fields managed by the object.
*
* @return {Object} The precomputed fields to be added to a log entry
*/
_dfGetFields() {
return this.precomputedFields;
}
}
module.exports = DefaultFields;


@ -1,5 +1,4 @@
'use strict'; // eslint-disable-line strict
// eslint-disable-line strict
const logLevels = [ const logLevels = [
'trace', 'trace',
@ -19,15 +18,10 @@ const logLevels = [
* *
* @throw {Error} A human-readable message that tells which log * @throw {Error} A human-readable message that tells which log
* levels are supported. * levels are supported.
* @throw {TypeError} A human-readable message indicating that the
* provided logging level was not a string
* *
* @returns {undefined} * @returns {undefined}
*/ */
function throwIfInvalid(level) { function throwIfInvalid(level) {
if (typeof level !== 'string') {
throw new TypeError('Logging level should be a string');
}
if (logLevels.indexOf(level) === -1) { if (logLevels.indexOf(level) === -1) {
throw new RangeError(`Invalid logging level: ${level} is neither` throw new RangeError(`Invalid logging level: ${level} is neither`
+ ` ${logLevels.join(', ')}.`); + ` ${logLevels.join(', ')}.`);


@ -1,32 +1,64 @@
'use strict'; // eslint-disable-line strict
// eslint-disable-line strict
const LogLevel = require('./LogLevel.js'); const LogLevel = require('./LogLevel.js');
const RequestLogger = require('./RequestLogger.js'); const RequestLogger = require('./RequestLogger.js');
const { unserializeUids } = require('./Utils.js'); const Utils = require('./Utils.js');
const Config = require('./Config.js'); const Config = require('./Config.js');
const DefaultFields = require('./DefaultFields.js');
const unserializeUids = Utils.unserializeUids;
const objectCopy = Utils.objectCopy;
class Logger extends DefaultFields {
class Logger {
/** /**
* This is the constructor of the Logger class. It takes optional * This is the constructor of the Logger class. It takes optional
* configuration parameters, that allow to modify its behavior. * configuration parameters, that allow to modify its behavior.
* *
* @param {Werelogs.Config} config - An instanciated Werelogs Config object * @param {string|Object} fields - Deprecated: {string}: The name of the
* Logger that will be included in the log
* entries
* Advised: {Object} A set of fields that
* will be used as the default fields for
* this Logger, and will be included in
* all later log entries.
* *
* @param {string} name - The name of the Logger. It can be found later on * @param {object} config - A configuration object for werelogs.
* in the log entries. * @param {string} config.level - The string name of the logging level
* ('trace', 'debug', 'info', 'warn',
* 'error' and 'fatal' in order of
* importance.)
* @param {string} config.dump - The string name of the log dumping level
* ('trace', 'debug', 'info', 'warn',
* 'error' and 'fatal' in order of
* importance.)
* *
* @returns {undefined} * @returns {undefined}
*/ */
constructor(config, name) { constructor(fields, config) {
if (!(config instanceof Config)) { super();
throw new TypeError('Invalid parameter Type for "config".'); /* TODO XXX FIXME Remove starting at version 8.0 FIXME XXX TODO
* vvvvvvvvvvvvvvvvvvvvvvvv */
if (typeof fields === 'string') {
this.addDefaultFields({ name: fields });
} else if (typeof fields === 'object') {
/* ^^^^^^^^^^^^^^^^^^^^^^^
* TODO XXX FIXME Remove starting at version 8.0 FIXME XXX TODO */
this.addDefaultFields(fields);
/* TODO XXX FIXME Remove starting at version 8.0 FIXME XXX TODO
* vvvvvvvvvvvvvvvvvvvvvvvv */
} }
if (!(typeof name === 'string' || name instanceof String)) { /* ^^^^^^^^^^^^^^^^^^^^^^^
throw new TypeError('Invalid parameter Type for "name".'); * TODO XXX FIXME Remove starting at version 8.0 FIXME XXX TODO */
} Config.update(config);
this.config = config; }
this.name = name;
setLevel(levelName) {
Config.update({ level: levelName });
}
setDumpThreshold(levelName) {
Config.update({ dump: levelName });
} }
/** /**
@ -38,10 +70,10 @@ class Logger {
* @returns {RequestLogger} A Valid Request Logger * @returns {RequestLogger} A Valid Request Logger
*/ */
newRequestLogger(uids) { newRequestLogger(uids) {
const rLog = new RequestLogger(this.config.logger, const rLog = new RequestLogger(Config.logger,
this.config.level, this.config.dump, Config.level, Config.dump, Config.end,
this.config.end, uids); uids);
rLog.addDefaultFields({ name: this.name }); rLog.setParent(this);
return rLog; return rLog;
} }
@ -54,18 +86,16 @@ class Logger {
* @returns {RequestLogger} A Valid Request Logger * @returns {RequestLogger} A Valid Request Logger
*/ */
newRequestLoggerFromSerializedUids(serializedUids) { newRequestLoggerFromSerializedUids(serializedUids) {
const rLog = new RequestLogger(this.config.logger, const rLog = new RequestLogger(Config.logger,
this.config.level, this.config.dump, Config.level, Config.dump, Config.end,
this.config.end, unserializeUids(serializedUids));
unserializeUids(serializedUids)); rLog.setParent(this);
rLog.addDefaultFields({ name: this.name });
return rLog; return rLog;
} }
_doLog(levelName, msg, data) { _doLog(levelName, msg, data) {
const sLogger = this.config.logger; const sLogger = Config.logger;
const finalData = { name: this.name, time: Date.now() }; if (!LogLevel.shouldLog(levelName, Config.level)) {
if (!LogLevel.shouldLog(levelName, this.config.level)) {
return; return;
} }
if (data !== undefined && typeof data !== 'object') { if (data !== undefined && typeof data !== 'object') {
@ -74,10 +104,12 @@ class Logger {
callparams: [msg, data], callparams: [msg, data],
}, },
'Werelogs API was mis-used.' 'Werelogs API was mis-used.'
+ ' This development error should be fixed ASAP.', + ' This development error should be fixed ASAP.');
);
return; return;
} }
const finalData = objectCopy({},
this._dfGetFields(),
{ time: Date.now() });
if (data) { if (data) {
Object.keys(data).forEach(k => { Object.keys(data).forEach(k => {
finalData[k] = data[k]; finalData[k] = data[k];


@ -1,49 +1,54 @@
'use strict'; // eslint-disable-line strict
// eslint-disable-line strict const assert = require('assert');
const LogLevel = require('./LogLevel.js'); const LogLevel = require('./LogLevel.js');
const Utils = require('./Utils.js'); const Utils = require('./Utils.js');
const DefaultFields = require('./DefaultFields.js');
const { serializeUids, generateUid, objectCopy } = Utils; const serializeUids = Utils.serializeUids;
const generateUid = Utils.generateUid;
const objectCopy = Utils.objectCopy;
function ensureUidValidity(uid) { function ensureUidValidity(uid) {
if (uid.indexOf(':') !== -1) { if (uid.indexOf(':') !== -1) {
throw new Error(`RequestLogger UID "${uid}" contains an illegal ` throw new Error(`RequestLogger UID "${uid}" contains an illegal ` +
+ 'character: \':\'.'); 'character: \':\'.');
} }
return uid; return uid;
} }
class EndLogger { /*
* @mixes DefaultFields
*/
class EndLogger extends DefaultFields {
constructor(reqLogger) { constructor(reqLogger) {
super();
this.logger = reqLogger; this.logger = reqLogger;
this.fields = {}; this.setParent(reqLogger);
} }
augmentedLog(level, msg, data) { augmentedLog(level, msg, data) {
// We can alter current instance, as it won't be usable after this assert.strictEqual(this.logger.elapsedTime, null, 'The logger\'s'
// call. + 'end() wrapper should not be called more than'
this.fields = objectCopy(this.fields, data || {}); + ' once.');
return this.logger.log(level, msg, this.fields, true); const fields = objectCopy(this.precomputedFields, data || {});
} const res = this.logger.log(level, msg, fields, true);
/* XXX NOTE
/** *
* This function allows the user to add default fields to include into all * This is our "manual" memory management.
* JSON log entries generated through this request logger. As this function * Since child and parent `DefaultFields` are actually refering to each
* attempt not to modify the provided fields object, it copies the field * other, the GC cannot release the child logger.
* into a new object for safe keeping. * Also, we have no means to automatically unregister the child from
* * its parent, nor to have the parent notify the child of its update
* @param {object} fields The dictionnary of additional fields to include * without keeping those references. So here we go, manual memory
* by default for this instance of the * management in Garbage-collected languages !
* RequestLogger. *
* * All hail << The Coming of the Great White Handkerchief >> that will
* @returns {object} The previous set of default fields (can be * wipe this misery out of our lives !
* safely ignored). *
*/ * XXX NOTE */
addDefaultFields(fields) { this._dfDestroy();
const oldFields = this.fields; return res;
this.fields = objectCopy({}, this.fields, fields);
return oldFields;
} }
/** /**
@ -137,7 +142,8 @@ class EndLogger {
* the global log level; and is used to track the log events for one given * the global log level; and is used to track the log events for one given
* request. * request.
*/ */
class RequestLogger { class RequestLogger extends DefaultFields {
/** /**
* Constructor of the WereLogs Request Logger. * Constructor of the WereLogs Request Logger.
* This function takes a logger instance, a logging level, and a last * This function takes a logger instance, a logging level, and a last
@ -173,7 +179,8 @@ class RequestLogger {
* @returns {undefined} * @returns {undefined}
*/ */
constructor(logger, logLevel, dumpThreshold, endLevel, uids) { constructor(logger, logLevel, dumpThreshold, endLevel, uids) {
let uidList; super();
let uidList = undefined;
if (!LogLevel.shouldLog(dumpThreshold, logLevel)) { if (!LogLevel.shouldLog(dumpThreshold, logLevel)) {
throw new Error('Logging Dump level should be equal or' throw new Error('Logging Dump level should be equal or'
@ -183,13 +190,12 @@ class RequestLogger {
if (uids !== undefined && Array.isArray(uids)) { if (uids !== undefined && Array.isArray(uids)) {
uidList = uids.map(uid => ensureUidValidity(uid)); uidList = uids.map(uid => ensureUidValidity(uid));
uidList.push(generateUid()); uidList.push(generateUid());
} else if (uids !== undefined && typeof uids === 'string') { } else if (uids !== undefined && typeof(uids) === 'string') {
uidList = [ensureUidValidity(uids)]; uidList = [ensureUidValidity(uids)];
} }
this.uids = uidList || [generateUid()]; this.uids = uidList || [generateUid()];
this.entries = []; this.entries = [];
this.fields = {};
this.logLevel = logLevel; this.logLevel = logLevel;
this.dumpThreshold = dumpThreshold; this.dumpThreshold = dumpThreshold;
this.endLevel = endLevel; this.endLevel = endLevel;
@ -229,25 +235,6 @@ class RequestLogger {
return serializeUids(this.uids); return serializeUids(this.uids);
} }
/**
* This function allows the user to add default fields to include into all
* JSON log entries generated through this request logger. As this function
* attempt not to modify the provided fields object, it copies the field
* into a new object for safe keeping.
*
* @param {object} fields The dictionnary of additional fields to include
* by default for this instance of the
* RequestLogger.
*
* @returns {object} The previous set of default fields (can be
* safely ignored).
*/
addDefaultFields(fields) {
const oldFields = this.fields;
this.fields = objectCopy({}, this.fields, fields);
return oldFields;
}
/** /**
* Logging function to write a trace-level log entry. * Logging function to write a trace-level log entry.
* *
@ -365,6 +352,8 @@ class RequestLogger {
if (msg === undefined && data === undefined) { if (msg === undefined && data === undefined) {
return this.endLogger; return this.endLogger;
} }
assert.strictEqual(this.elapsedTime, null, 'The "end()" logging method '
+ 'should not be called more than once.');
return this.log(this.endLevel, msg, data, true); return this.log(this.endLevel, msg, data, true);
} }
@ -381,6 +370,8 @@ class RequestLogger {
* @returns {undefined} * @returns {undefined}
*/ */
errorEnd(msg, data) { errorEnd(msg, data) {
assert.strictEqual(this.elapsedTime, null, 'The "end()" logging method '
+ 'should not be called more than once.');
return this.log('error', msg, data, true); return this.log('error', msg, data, true);
} }
@ -422,11 +413,10 @@ class RequestLogger {
+ ' This development error should be fixed ASAP.', + ' This development error should be fixed ASAP.',
{ {
callparams: [msg, logFields], callparams: [msg, logFields],
}, });
);
return; return;
} }
const fields = objectCopy({}, this.fields, logFields || {}); const fields = objectCopy({}, this._dfGetFields(), logFields || {});
const endFlag = isEnd || false; const endFlag = isEnd || false;
/* /*
@ -440,12 +430,6 @@ class RequestLogger {
// eslint-disable-next-line camelcase // eslint-disable-next-line camelcase
fields.req_id = serializeUids(this.uids); fields.req_id = serializeUids(this.uids);
if (endFlag) { if (endFlag) {
if (this.elapsedTime !== null) {
// reset elapsedTime to avoid an infinite recursion
// while logging the error
this.elapsedTime = null;
this.error('RequestLogger.end() has been called more than once');
}
this.elapsedTime = process.hrtime(this.startTime); this.elapsedTime = process.hrtime(this.startTime);
// eslint-disable-next-line camelcase // eslint-disable-next-line camelcase
fields.elapsed_ms = this.elapsedTime[0] * 1000 fields.elapsed_ms = this.elapsedTime[0] * 1000
@ -466,6 +450,10 @@ class RequestLogger {
} else if (LogLevel.shouldLog(level, this.logLevel)) { } else if (LogLevel.shouldLog(level, this.logLevel)) {
this.doLogIO(logEntry); this.doLogIO(logEntry);
} }
if (isEnd) {
this.setParent(null);
}
} }
/** /**


@ -1,17 +1,7 @@
'use strict'; // eslint-disable-line strict
// eslint-disable-line strict
const os = require('os'); const os = require('os');
const safeJSONStringify = require('safe-json-stringify'); const safeJSONStringify = require('safe-json-stringify');
const fastJSONStringify = require('fast-safe-stringify')
function errorStackReplacer(key, value) {
if (value instanceof Error) {
return value.stack;
}
return value;
}
/* /*
* This function safely stringifies JSON. If an exception occcurs (due to * This function safely stringifies JSON. If an exception occcurs (due to
* circular references, exceptions thrown from object getters etc.), the module * circular references, exceptions thrown from object getters etc.), the module
@ -23,13 +13,12 @@ function errorStackReplacer(key, value) {
function safeStringify(obj) { function safeStringify(obj) {
let str; let str;
try { try {
// Try to stringify the object (fast version) str = JSON.stringify(obj);
str = fastJSONStringify(obj, errorStackReplacer);
} catch (e) { } catch (e) {
// fallback to remove circular object references or other exceptions // fallback to remove circular object references or other exceptions
// eslint-disable-next-line no-param-reassign // eslint-disable-next-line no-param-reassign
obj.unsafeJSON = true; obj.unsafeJSON = true;
return safeJSONStringify(obj, errorStackReplacer); return safeJSONStringify(obj);
} }
return str; return str;
} }
@ -45,11 +34,11 @@ function isWriteableStream(s) {
class SimpleLogger { class SimpleLogger {
constructor(name, streams) { constructor(name, streams) {
this.name = name; this.name = name;
this.streams = [{ level: 'trace', stream: process.stdout }]; this.streams = [{ level: 'info', stream: process.stdout }];
if (streams) { if (streams) {
if (!Array.isArray(streams)) { if (!Array.isArray(streams)) {
throw new Error('Invalid streams. streams must be an array' throw new Error('Invalid streams. streams must be an array' +
+ ' list of writeable streams'); ' list of writeable streams');
} }
/* /*
* This is for backwards compatibility. current config in projects * This is for backwards compatibility. current config in projects
@ -79,9 +68,8 @@ class SimpleLogger {
logFields.hostname = this.hostname; logFields.hostname = this.hostname;
logFields.pid = process.pid; logFields.pid = process.pid;
const safeString = safeStringify(logFields);
this.streams.forEach(s => s.stream this.streams.forEach(s => s.stream
.write(`${safeString}\n`)); .write(`${safeStringify(logFields)}\n`));
} }
info(fields, message) { info(fields, message) {


@ -1,5 +1,4 @@
'use strict'; // eslint-disable-line strict
// eslint-disable-line strict
/** /**
* @constant * @constant


@ -1,72 +0,0 @@
// eslint-disable-line strict
const Config = require('./Config.js');
const Logger = require('./Logger.js');
class API {
/**
* This is the constructor of the Logger class. It takes optional
* configuration parameters, that allow to modify its behavior.
*
* @param {object} config - A configuration object for werelogs.
* @param {string} config.level - The name of the logging level ('trace',
* 'debug', 'info', 'warn', 'error' and
* 'fatal' in order of importance.)
* @param {string} config.dump - The name of the log dumping level
* ('trace', 'debug', 'info', 'warn',
* 'error' and 'fatal' in order of
* importance.)
* @param {object[]} config.streams - The streams into which to log. This
* is an Array of objects which have a
* field named 'stream', which is
* writeable.
*/
constructor(config) {
this.config = new Config(config);
this.preboundLogger = Logger.bind(null, this.config);
}
/**
* This is a thunk function that allows reconfiguring the streams and log
* levels of all Logger and future RequestLogger objects. Note that
* existing RequestLogger will live their lifespan retaining the old
* configuration.
* If the provided configuration is erroneous, the function may throw
* exceptions depending on the detected configuration error. Please see the
* Config class's documentation about that.
*
* @throws {TypeError} - One of the provided arguments is not of the
* expected type
* @throws {RangeError} - The logging level provided is not part of the
* supported logging levels
* @throws {Error} - A human-readable message providing details about
* a logic error due to the input parameters
* provided.
*
* @param {object} config - A configuration object for werelogs.
* @param {string} config.level - The name of the logging level ('trace',
* 'debug', 'info', 'warn', 'error' and
* 'fatal' in order of importance.)
* @param {string} config.dump - The name of the log dumping level
* ('trace', 'debug', 'info', 'warn',
* 'error' and 'fatal' in order of
* importance.)
* @param {object[]} config.streams - The streams into which to log. This
* is an Array of objects which have a
* field named 'stream', which is
* writeable.
*
* @returns {undefined}
*
*/
reconfigure(config) {
this.config.update(config);
}
get Logger() {
return this.preboundLogger;
}
}
module.exports = API;


@ -1,106 +0,0 @@
/**
* @returns {string} a timestamp in ISO format YYYY-MM-DDThh:mm:ss.sssZ
*/
const defaultTimestamp = () => new Date().toISOString();
/**
* Prints on stderr a timestamp, the origin and the error
*
* If no other instructions are needed on uncaughtException,
* consider using `catchAndTimestampStderr` directly.
*
* @example
* process.on('uncaughtException', (err, origin) => {
* printErrorWithTimestamp(err, origin);
* // server.close();
* // file.close();
* process.nextTick(() => process.exit(1));
* });
* // Don't forget to timestamp warning
* catchAndTimestampWarning();
* @param {Error} err see process event uncaughtException
* @param {uncaughtException|unhandledRejection} origin see process event
* @param {string} [date=`defaultTimestamp()`] Date to print
* @returns {boolean} see process.stderr.write
*/
function printErrorWithTimestamp(
err, origin, date = defaultTimestamp(),
) {
return process.stderr.write(`${date}: ${origin}:\n${err.stack}\n`);
}
/**
* Prefer using `catchAndTimestampStderr` instead of this function.
*
* Adds listener for uncaughtException to print with timestamp.
*
* If you want to manage the end of the process, you can set exitCode to null.
* Or use `printErrorWithTimestamp` in your own uncaughtException listener.
*
* @param {Function} [dateFct=`defaultTimestamp`] Fct returning a formatted date
* @param {*} [exitCode=1] On uncaughtException, if not null, `process.exit`
* will be called with this value
* @returns {undefined}
*/
function catchAndTimestampUncaughtException(
dateFct = defaultTimestamp, exitCode = 1,
) {
process.on('uncaughtException', (err, origin) => {
printErrorWithTimestamp(err, origin, dateFct());
if (exitCode !== null) {
process.nextTick(() => process.exit(exitCode));
}
});
}
/**
* Forces the use of `--trace-warnings` and adds a date in warning.detail
* The warning will be printed by the default `onWarning`
*
* @param {string} [dateFct=`defaultTimestamp`] Fct returning a formatted date
* @returns {undefined}
*/
function catchAndTimestampWarning(dateFct = defaultTimestamp) {
process.traceProcessWarnings = true;
// must be executed first, before the default `onWarning`
process.prependListener('warning', warning => {
if (warning.detail) {
// eslint-disable-next-line no-param-reassign
warning.detail += `\nAbove Warning Date: ${dateFct()}`;
} else {
// eslint-disable-next-line no-param-reassign
warning.detail = `Above Warning Date: ${dateFct()}`;
}
});
}
/**
* Adds listener for uncaughtException and warning to print them with timestamp.
*
* If you want to manage the end of the process, you can set exitCode to null.
* Or use `printErrorWithTimestamp` in your own uncaughtException listener.
*
* @example
* const { stderrUtils } = require('werelogs');
* // first instruction in your index.js or entrypoint
* stderrUtils.catchAndTimestampStderr();
*
* @param {Function} [dateFct=`defaultTimestamp`] Fct returning a formatted date
* @param {*} [exitCode=1] On uncaughtException, if not null, `process.exit`
* will be called with this value
* @returns {undefined}
*/
function catchAndTimestampStderr(
dateFct = defaultTimestamp, exitCode = 1,
) {
catchAndTimestampUncaughtException(dateFct, exitCode);
catchAndTimestampWarning(dateFct);
}
module.exports = {
defaultTimestamp,
printErrorWithTimestamp,
catchAndTimestampUncaughtException,
catchAndTimestampWarning,
catchAndTimestampStderr,
};


@ -1,18 +1,18 @@
{ {
"name": "werelogs", "name": "werelogs",
"engines": { "engines": {
"node": ">=10" "node": "6.9.5"
}, },
"version": "8.1.5", "version": "7.0.0",
"description": "An efficient raw JSON logging library aimed at micro-services architectures.", "description": "An efficient raw JSON logging library aimed at micro-services architectures.",
"main": "index.js", "main": "index.js",
"scripts": { "scripts": {
"gendoc": "jsdoc $(git ls-files 'lib/*.js') -d doc", "gendoc": "jsdoc $(git ls-files 'lib/*.js') -d doc",
"lint": "eslint $(git ls-files '*.js')", "lint": "eslint $(git ls-files '*.js')",
"lint_md": "markdownlint $(git ls-files '*.md')", "lint_md": "mdlint $(git ls-files '*.md')",
"test": "mocha tests/unit/", "test": "mocha tests/unit/",
"ft_test": "(npm pack && cp werelogs-*.tgz tests/functional && cd tests/functional && cp -R ../../node_modules/ node_modules/ && npm install werelogs-*.tgz && ./node_modules/.bin/mocha . multi-modules/ && rm -rf tests/functional/node_modules tests/functional/werelogs-*.tgz tests/functional/*lock*)", "ft_test": "rm -rf tests/functional/node_modules && npm pack && cp -R node_modules werelogs-*.tgz tests/functional && cd tests/functional && npm install werelogs-*.tgz && ./node_modules/.bin/mocha . multi-modules/ && cd -",
"coverage": "nyc ./node_modules/.bin/_mocha tests/unit" "coverage": "istanbul cover ./node_modules/.bin/_mocha tests/unit"
}, },
"repository": { "repository": {
"type": "git", "type": "git",
@ -33,20 +33,17 @@
}, },
"homepage": "https://github.com/scality/werelogs#readme", "homepage": "https://github.com/scality/werelogs#readme",
"dependencies": { "dependencies": {
"fast-safe-stringify": "^2.1.1", "safe-json-stringify": "^1.0.3"
"safe-json-stringify": "^1.2.0"
}, },
"devDependencies": { "devDependencies": {
"eslint": "^7.32.0", "eslint-plugin-react": "^4.3.0",
"eslint-config-airbnb": "^18.2.1", "eslint": "^2.4.0",
"eslint-config-scality": "git+https://git.yourcmc.ru/vitalif/zenko-eslint-config-scality.git", "eslint-config-airbnb": "^6.0.0",
"eslint-plugin-import": "^2.22.1", "eslint-config-scality": "scality/Guidelines",
"eslint-plugin-jsx-a11y": "^6.4.1", "istanbul": "^1.0.0-alpha",
"eslint-plugin-react": "^7.26.0", "istanbul-api": "==1.0.0-alpha.9",
"eslint-plugin-react-hooks": "^4.2.0", "jsdoc": "^3.4.0",
"jsdoc": "^3.4.3", "mdlint": "^0.1.0",
"markdownlint-cli": "^0.27.1", "mocha": "^3.2.0"
"mocha": ">=3.1.2",
"nyc": "^15.1.0"
} }
} }


@ -1,11 +1,11 @@
'use strict'; // eslint-disable-line strict
// eslint-disable-line strict
const assert = require('assert'); const assert = require('assert');
const LogLevel = require('../lib/LogLevel.js'); const LogLevel = require('../lib/LogLevel.js');
class DummyLogger { class DummyLogger {
constructor() { constructor() {
this.ops = []; this.ops = [];
this.counts = { this.counts = {
@ -57,8 +57,8 @@ function computeBehavior(filterLevel, logLevel, testLevel) {
return { return {
value, value,
msg: `Expected ${logLevel} to be called ${value} times with ` msg: `Expected ${logLevel} to be called ${value} times with ` +
+ `filter level ${filterLevel}.`, `filter level ${filterLevel}.`,
}; };
} }
@ -107,10 +107,9 @@ function loggingMisuseGenerator(test, createLogger) {
logger.info.apply(logger, test.args); logger.info.apply(logger, test.args);
}, },
Error, Error,
`Werelogs should not throw with ${test.desc}`, `Werelogs should not throw with ${test.desc}`);
);
assert(dummyLogger.ops[0][0], 'fatal', assert(dummyLogger.ops[0][0], 'fatal',
'Expected the Module Logger to have logged a fatal message.'); 'Expected the Module Logger to have logged a fatal message.');
done(); done();
}; };
} }


@ -1,12 +1,10 @@
'use strict'; // eslint-disable-line strict
// eslint-disable-line strict
const assert = require('assert'); const assert = require('assert');
const { PassThrough } = require('stream'); const PassThrough = require('stream').PassThrough;
const pass = new PassThrough;
const pass = new PassThrough(); const Logger = require('werelogs').Logger;
const werelogs = require('werelogs'); // eslint-disable-line
// With PassThrough, SimpleLogger can use it as Writeable stream and all the // With PassThrough, SimpleLogger can use it as Writeable stream and all the
// data being written can be read into a variable // data being written can be read into a variable
@ -17,23 +15,23 @@ pass.on('data', data => {
logBuffer.records.push(data.toString()); logBuffer.records.push(data.toString());
}); });
werelogs.configure({ function createModuleLogger(fields) {
level: 'info', const defaultFields = fields || 'FT-test';
dump: 'error', return new Logger(defaultFields, {
streams: [{ level: 'info',
stream: pass, dump: 'error',
type: 'raw', streams: [{
}], stream: pass,
}); type: 'raw',
}],
function createModuleLogger() { });
return new werelogs.Logger('FT-test');
} }
function checkFields(fields) { function checkFields(fields) {
const record = JSON.parse(logBuffer.records[0].trim()); const record = JSON.parse(logBuffer.records[0].trim());
Object.keys(fields).forEach(k => { Object.keys(fields).forEach(k => {
if (Object.prototype.hasOwnProperty.call(fields, k)) { if (fields.hasOwnProperty(k)) {
assert.deepStrictEqual(record[k], fields[k]); assert.deepStrictEqual(record[k], fields[k]);
} }
}); });
@ -52,7 +50,7 @@ describe('Werelogs is usable as a dependency', () => {
assert.doesNotThrow( assert.doesNotThrow(
createModuleLogger, createModuleLogger,
Error, Error,
'Werelogs threw an exception trying to create a ModuleLogger.', 'Werelogs threw an exception trying to create a ModuleLogger.'
); );
done(); done();
}); });
@ -68,11 +66,8 @@ describe('Werelogs is usable as a dependency', () => {
it('Should be able to log a message and additional fields', done => { it('Should be able to log a message and additional fields', done => {
const logger = createModuleLogger(); const logger = createModuleLogger();
const msg = 'This is a message with added fields'; const msg = 'This is a message with added fields';
const fields = { const fields = { errorCode: 9, description: 'TestError',
errorCode: 9, options: { dump: false } };
description: 'TestError',
options: { dump: false },
};
logger.info(msg, fields); logger.info(msg, fields);
assert.strictEqual(parseLogEntry().message, msg); assert.strictEqual(parseLogEntry().message, msg);
checkFields(fields); checkFields(fields);
@ -88,7 +83,7 @@ describe('Werelogs is usable as a dependency', () => {
assert.doesNotThrow( assert.doesNotThrow(
() => createModuleLogger().newRequestLogger(), () => createModuleLogger().newRequestLogger(),
Error, Error,
'Werelogs threw an exception trying to create a ModuleLogger.', 'Werelogs threw an exception trying to create a ModuleLogger.'
); );
done(); done();
}); });
@ -104,15 +99,42 @@ describe('Werelogs is usable as a dependency', () => {
it('Should be able to log a message and additional fields', done => { it('Should be able to log a message and additional fields', done => {
const logger = createModuleLogger().newRequestLogger(); const logger = createModuleLogger().newRequestLogger();
const msg = 'This is a message with added fields'; const msg = 'This is a message with added fields';
const fields = { const fields = { errorCode: 9, description: 'TestError',
errorCode: 9, options: { dump: false } };
description: 'TestError',
options: { dump: false },
};
logger.info(msg, fields); logger.info(msg, fields);
assert.strictEqual(parseLogEntry().message, msg); assert.strictEqual(parseLogEntry().message, msg);
checkFields(fields); checkFields(fields);
done(); done();
}); });
it('Should not log a removed field', done => {
const logger = createModuleLogger().newRequestLogger();
const msg = 'This is a message with no fields(removed)';
const fields = { errorCode: 0, description: 'TestNotFailing' };
logger.addDefaultFields(fields);
logger.removeDefaultFields(['errorCode', 'description']);
logger.info(msg);
assert.strictEqual(parseLogEntry().message, msg);
assert(!parseLogEntry().hasOwnProperty('errorCode'));
assert(!parseLogEntry().hasOwnProperty('description'));
done();
});
it('Should include the parent Loggers default fields', done => {
const mFields = {
name: 'TestModule',
submodule: 'functional',
};
const logger = createModuleLogger(mFields);
const rLog = logger.newRequestLogger();
const msg =
"This is a message including the module's default fields";
rLog.info(msg);
assert.strictEqual(parseLogEntry().message, msg);
assert.deepStrictEqual(parseLogEntry().name, mFields.name);
assert.deepStrictEqual(parseLogEntry().submodule,
mFields.submodule);
done();
});
}); });
}); });


@ -1,14 +1,13 @@
const assert = require('assert'); const assert = require('assert');
const { PassThrough } = require('stream'); const PassThrough = require('stream').PassThrough;
const Werelogs = require('werelogs'); // eslint-disable-line const Werelogs = require('werelogs').Logger;
const modules = [ const modules = [
require('./module1.js'), require('./module1.js'),
require('./module2.js'), require('./module2.js'),
require('./module3.js'), require('./module3.js'),
]; ];
const pass = new PassThrough;
const pass = new PassThrough();
const logBuffer = { const logBuffer = {
records: [], records: [],
@ -17,10 +16,10 @@ pass.on('data', data => {
logBuffer.records.push(JSON.parse(data.toString().trim())); logBuffer.records.push(JSON.parse(data.toString().trim()));
}); });
describe('Config is shared and unique within one API', () => { describe('Config is shared and unique within one program', () => {
it('should find all log entries in the RingBuffer with the right ' it('should find all log entries in the RingBuffer with the right ' +
+ 'module name', done => { 'module name', done => {
Werelogs.configure({ const log = new Werelogs('test-index', {
level: 'debug', level: 'debug',
dump: 'fatal', dump: 'fatal',
streams: [{ streams: [{
@ -28,7 +27,6 @@ describe('Config is shared and unique within one API', () => {
stream: pass, stream: pass,
}], }],
}); });
const log = new Werelogs.Logger('test-index');
modules.forEach(mod => { mod(); }); modules.forEach(mod => { mod(); });
log.warn('Logging as warn'); log.warn('Logging as warn');
const rLog = log.newRequestLogger(); const rLog = log.newRequestLogger();


@ -1,4 +1,4 @@
const Werelogs = require('werelogs').Logger; // eslint-disable-line const Werelogs = require('werelogs').Logger;
const log = new Werelogs('test-mod1'); const log = new Werelogs('test-mod1');


@ -1,4 +1,4 @@
const Werelogs = require('werelogs').Logger; // eslint-disable-line const Werelogs = require('werelogs').Logger;
const log = new Werelogs('test-mod2'); const log = new Werelogs('test-mod2');


@ -1,4 +1,4 @@
const Werelogs = require('werelogs').Logger; // eslint-disable-line const Werelogs = require('werelogs').Logger;
const log = new Werelogs('test-mod3'); const log = new Werelogs('test-mod3');


@ -5,54 +5,51 @@ const assert = require('assert');
const Config = require('../../lib/Config.js'); const Config = require('../../lib/Config.js');
describe('Config', () => { describe('Config', () => {
const config = new Config();
beforeEach(() => { beforeEach(() => {
config.reset(); Config.reset();
}); });
it('should work with default configuration', done => { it('should work with default configuration', done => {
assert.doesNotThrow( assert.doesNotThrow(
() => { () => {
config.logger.info('test message'); Config.logger.info('test message');
}, },
Error, Error);
);
done(); done();
}); });
it('log level should be updateable', done => { it('log level should be updateable', done => {
config.update({ level: 'debug' }); Config.update({ level: 'debug' });
assert.strictEqual(config.level, 'debug', 'Expected config\'s log level to be updated.'); assert.strictEqual(Config.level, 'debug', 'Expected Config\'s log level to be updated.');
done(); done();
}); });
it('dump threshold should be updateable', done => { it('dump threshold should be updateable', done => {
const origDump = config.dump; const origDump = Config.dump;
assert.notStrictEqual(origDump, 'warn', 'Expected original config.dump to differ from value to update.'); assert.notStrictEqual(origDump, 'warn', 'Expected original Config.dump to differ from value to update.');
config.update({ dump: 'warn' }); Config.update({ dump: 'warn' });
assert.strictEqual(config.dump, 'warn', 'Expected config\'s dump threshold to be updated.'); assert.strictEqual(Config.dump, 'warn', 'Expected Config\'s dump threshold to be updated.');
done(); done();
}); });
it('end logging level should be updateable', done => { it('end logging level should be updateable', done => {
const origEnd = config.end; const origEnd = Config.end;
assert.notStrictEqual(origEnd, 'trace', 'Expected original config.end to differ from value to update.'); assert.notStrictEqual(origEnd, 'trace', 'Expected original Config.end to differ from value to update.');
config.update({ end: 'trace' }); Config.update({ end: 'trace' });
assert.strictEqual(config.end, 'trace', 'Expected config\'s end log level to be updated.'); assert.strictEqual(Config.end, 'trace', 'Expected Config\'s end log level to be updated.');
done(); done();
}); });
it('should not be modified by an empty config object', done => { it('should not be modified by an empty config object', done => {
const origLevel = config.level; const origLevel = Config.level;
const origDump = config.dump; const origDump = Config.dump;
const origLogger = config.logger; const origLogger = Config.logger;
const origStreams = config.streams; const origStreams = Config.streams;
config.update({}); Config.update({});
assert.deepStrictEqual(origLevel, config.level, 'Expected logging level not to have changed.'); assert.deepStrictEqual(origLevel, Config.level, 'Expected logging level not to have changed.');
assert.deepStrictEqual(origDump, config.dump, 'Expected dump threshold not to have changed.'); assert.deepStrictEqual(origDump, Config.dump, 'Expected dump threshold not to have changed.');
assert.strictEqual(origLogger, config.logger, 'Expected logger not to have changed.'); assert.strictEqual(origLogger, Config.logger, 'Expected logger not to have changed.');
assert.deepStrictEqual(origStreams, config.streams, 'Expected streams not to have changed.'); assert.deepStrictEqual(origStreams, Config.streams, 'Expected streams not to have changed.');
done(); done();
}); });
}); });
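The hunk above captures the API difference driving most of this file's changes: the left-hand column instantiates Config and updates that instance, while the right-hand column drives a module-level Config singleton. An illustrative sketch of the two call styles (values are arbitrary):

    const Config = require('../../lib/Config.js');

    // Left-hand column: per-instance configuration
    const config = new Config();
    config.reset();
    config.update({ level: 'debug', dump: 'warn' });

    // Right-hand column: shared singleton updated in place
    // Config.reset();
    // Config.update({ level: 'debug', dump: 'warn' });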

tests/unit/DefaultFields.js (new file, 515 lines)

@ -0,0 +1,515 @@
'use strict'; // eslint-disable-line strict
const assert = require('assert');
const DefaultFields = require('../../lib/DefaultFields.js');
describe('class DefaultFields', () => {
describe('Basic Fields logic', () => {
it('Can get the resulting fields', done => {
const df = new DefaultFields();
const fields = df._dfGetFields();
assert(fields !== null && typeof fields === 'object');
done();
});
it('No fields are set by default', done => {
const df = new DefaultFields();
assert.deepStrictEqual(df._dfGetFields(), {});
done();
});
it('Validate defensive coding prevents undefined parentFields',
done => {
const df = new DefaultFields();
df._dfSetParentFields();
assert.deepStrictEqual(df._dfGetFields(), {});
done();
});
describe('With fields', () => {
let df = null;
const fields = {
name: 'Testing',
testing: true,
count: 4,
data: 'doggy',
};
beforeEach(done => {
df = new DefaultFields();
df.addDefaultFields(fields);
done();
});
it('Can add a dictionary of new fields', done => {
assert.deepStrictEqual(df._dfGetFields(), fields);
done();
});
it('Can remove a list of invalid fields', done => {
df.removeDefaultFields(['invalid', 'notthere']);
done();
});
it('Can remove a list of fields', done => {
df.removeDefaultFields(['data', 'count']);
assert.strictEqual(df._dfGetFields().data, undefined);
assert.strictEqual(df._dfGetFields().count, undefined);
assert.strictEqual(df._dfGetFields().name, 'Testing');
assert.strictEqual(df._dfGetFields().testing, true);
done();
});
it('Can reset the fields', done => {
df.resetDefaultFields();
assert.deepStrictEqual(df._dfGetFields(), {});
done();
});
});
});
describe('Basic Parent-Child logic', () => {
let parentNode = null;
beforeEach(done => {
parentNode = new DefaultFields();
done();
});
function linkedChild(pNode) {
const childNode = new DefaultFields();
childNode.setParent(pNode);
return childNode;
}
it('Can set a parent to a node', done => {
const childNode = linkedChild(parentNode);
assert.strictEqual(childNode.parent, parentNode);
assert.notStrictEqual(parentNode.children.indexOf(childNode), -1);
done();
});
it('Can remove a child from a node', done => {
const childNode = linkedChild(parentNode);
parentNode._dfUnregisterChild(childNode);
assert.strictEqual(parentNode.children.indexOf(childNode), -1);
done();
});
it('Can reset the parent of a node (w/ undefined)', done => {
const childNode = linkedChild(parentNode);
childNode.setParent(undefined);
assert.strictEqual(childNode.parent, null);
assert.strictEqual(parentNode.children.indexOf(childNode), -1);
done();
});
it('Can reset the parent of a node (w/ null)', done => {
const childNode = linkedChild(parentNode);
childNode.setParent(null);
assert.strictEqual(childNode.parent, null);
assert.strictEqual(parentNode.children.indexOf(childNode), -1);
done();
});
});
describe('Single parent-child relationship', () => {
let parentNode = null;
let childNode = null;
beforeEach(done => {
parentNode = new DefaultFields();
childNode = new DefaultFields();
childNode.setParent(parentNode);
done();
});
it('Child can define its own fields', done => {
const fields = {
child: true,
parent: false,
test: 1,
};
childNode.addDefaultFields(fields);
assert.deepStrictEqual(childNode._dfGetFields(), fields);
done();
});
it('Parent can define its own fields', done => {
const fields = {
child: false,
parent: true,
test: 2,
};
parentNode.addDefaultFields(fields);
assert.deepStrictEqual(parentNode._dfGetFields(), fields);
done();
});
it('Child inherits parents fields', done => {
const fields = {
child: true,
parent: false,
test: 3,
};
parentNode.addDefaultFields(fields);
assert.deepStrictEqual(childNode._dfGetFields(), fields);
done();
});
it('Child inherits successive parent field updates', done => {
const pFields1 = {
parent: true,
test: 4,
};
const rFields1 = {
parent: true,
test: 4,
};
const pFields2 = { child: false };
const rFields2 = {
parent: true,
test: 4,
child: false,
};
const pFields3 = {
data: 'pouet',
};
const rFields3 = {
parent: true,
test: 4,
child: false,
data: 'pouet',
};
parentNode.addDefaultFields(pFields1);
assert.deepStrictEqual(childNode._dfGetFields(), rFields1);
parentNode.addDefaultFields(pFields2);
assert.deepStrictEqual(childNode._dfGetFields(), rFields2);
parentNode.addDefaultFields(pFields3);
assert.deepStrictEqual(childNode._dfGetFields(), rFields3);
done();
});
it('Child inherits reset parent fields', done => {
const pFields = {
parent: true,
test: 5,
};
parentNode.addDefaultFields(pFields);
assert.deepStrictEqual(childNode._dfGetFields(), pFields);
parentNode.resetDefaultFields();
assert.deepStrictEqual(childNode._dfGetFields(), {});
done();
});
it('Child mixes parent and own fields', done => {
const pFields = { parent: true };
const cFields = {
child: true,
test: 6,
};
const rFields = {
parent: true,
child: true,
test: 6,
};
parentNode.addDefaultFields(pFields);
childNode.addDefaultFields(cFields);
assert.deepStrictEqual(childNode._dfGetFields(), rFields);
done();
});
it('Child overrides conflicting parent fields', done => {
const pFields = {
parent: true,
child: false,
test: 0,
};
const cFields = {
child: true,
test: 7,
};
const rFields = {
parent: true,
child: true,
test: 7,
};
parentNode.addDefaultFields(pFields);
childNode.addDefaultFields(cFields);
assert.deepStrictEqual(childNode._dfGetFields(), rFields);
done();
});
});
describe('Multiple-level parent-child relationship', () => {
let ggpNode = null;
let gpNode = null;
let pNode = null;
let childNode = null;
beforeEach(done => {
ggpNode = new DefaultFields();
gpNode = new DefaultFields();
gpNode.setParent(ggpNode);
pNode = new DefaultFields();
pNode.setParent(gpNode);
childNode = new DefaultFields();
childNode.setParent(pNode);
done();
});
it('Child inherits from whole hierarchy (newer -> older)',
done => {
const ggpFields = { great: true };
const gpFields = { grand: true };
const pFields = { parent: true };
const rFields = {
great: true,
grand: true,
parent: true,
};
pNode.addDefaultFields(pFields);
gpNode.addDefaultFields(gpFields);
ggpNode.addDefaultFields(ggpFields);
assert.deepStrictEqual(childNode._dfGetFields(), rFields);
done();
});
it('Child inherits from whole hierarchy (older -> newer)',
done => {
const ggpFields = { great: true };
const gpFields = { grand: true };
const pFields = { parent: true };
const rFields = {
great: true,
grand: true,
parent: true,
};
ggpNode.addDefaultFields(ggpFields);
gpNode.addDefaultFields(gpFields);
pNode.addDefaultFields(pFields);
assert.deepStrictEqual(childNode._dfGetFields(), rFields);
done();
});
it('Nodes inherit in-hierarchy fields reset', done => {
const ggpFields = { great: true };
const gpFields = { grand: true };
const pFields = { parent: true };
const rFields = {
great: true,
// grand: true, // Part 'reset'
parent: true,
};
ggpNode.addDefaultFields(ggpFields);
gpNode.addDefaultFields(gpFields);
pNode.addDefaultFields(pFields);
gpNode.resetDefaultFields();
assert.deepStrictEqual(childNode._dfGetFields(), rFields);
done();
});
it('Field overriding is cascading through generations (newer -> older)',
done => {
const ggpFields = { generation: 0 };
const gpFields = { generation: 1 };
const pFields = { generation: 2 };
const cFields = { generation: 3 };
childNode.addDefaultFields(cFields);
pNode.addDefaultFields(pFields);
gpNode.addDefaultFields(gpFields);
ggpNode.addDefaultFields(ggpFields);
assert.deepStrictEqual(childNode._dfGetFields(), cFields);
assert.deepStrictEqual(pNode._dfGetFields(), pFields);
assert.deepStrictEqual(gpNode._dfGetFields(), gpFields);
assert.deepStrictEqual(ggpNode._dfGetFields(), ggpFields);
done();
});
it('Field overriding is cascading through generations (older -> newer)',
done => {
const ggpFields = { generation: 0 };
const gpFields = { generation: 1 };
const pFields = { generation: 2 };
const cFields = { generation: 3 };
ggpNode.addDefaultFields(ggpFields);
gpNode.addDefaultFields(gpFields);
pNode.addDefaultFields(pFields);
childNode.addDefaultFields(cFields);
assert.deepStrictEqual(childNode._dfGetFields(), cFields);
assert.deepStrictEqual(pNode._dfGetFields(), pFields);
assert.deepStrictEqual(gpNode._dfGetFields(), gpFields);
assert.deepStrictEqual(ggpNode._dfGetFields(), ggpFields);
done();
});
it('Destroying intermediate level breaks relationships', done => {
const ggpFields = { ggp: 1 };
const gpFields = { gp: 1 };
const pFields = { p: 1 };
const cFields = { c: 1 };
ggpNode.addDefaultFields(ggpFields);
gpNode.addDefaultFields(gpFields);
pNode.addDefaultFields(pFields);
childNode.addDefaultFields(cFields);
pNode._dfDestroy();
assert.strictEqual(gpNode.children.indexOf(pNode), -1);
assert.strictEqual(pNode.parent, null);
assert.strictEqual(pNode.children.indexOf(childNode), -1);
assert.strictEqual(childNode.parent, null);
assert.deepStrictEqual(pNode._dfGetFields(), pFields);
assert.deepStrictEqual(childNode._dfGetFields(), cFields);
done();
});
it('Destroying intermediate level(2) breaks relationships', done => {
const ggpFields = { ggp: 1 };
const gpFields = { gp: 1 };
const pFields = { p: 1 };
const cFields = { c: 1 };
const rCFields = { p: 1, c: 1 };
ggpNode.addDefaultFields(ggpFields);
gpNode.addDefaultFields(gpFields);
pNode.addDefaultFields(pFields);
childNode.addDefaultFields(cFields);
gpNode._dfDestroy();
assert.strictEqual(ggpNode.children.indexOf(gpNode), -1);
assert.strictEqual(gpNode.parent, null);
assert.strictEqual(gpNode.children.indexOf(pNode), -1);
assert.strictEqual(pNode.parent, null);
assert.deepStrictEqual(gpNode._dfGetFields(), gpFields);
assert.deepStrictEqual(childNode._dfGetFields(), rCFields);
done();
});
});
describe('Topology changes', () => {
let ggpNode1 = null;
let ggpNode2 = null;
let gpNode1 = null;
let gpNode2 = null;
let pNode1 = null;
let pNode2 = null;
let cNode = null;
const ggp1Fields = { ggp1: true, generation: 0 };
const ggp2Fields = { ggp2: true, generation: 0 };
const gp1Fields = { gp1: true, generation: 1 };
const gp2Fields = { gp2: true, generation: 1 };
const p1Fields = { p1: true, generation: 2 };
const p2Fields = { p2: true, generation: 2 };
const cFields = { c: true, generation: 3 };
const startFields = {
ggp1: true,
gp1: true,
p1: true,
c: true,
generation: 3,
};
beforeEach(done => {
ggpNode1 = new DefaultFields();
gpNode1 = new DefaultFields();
gpNode1.setParent(ggpNode1);
pNode1 = new DefaultFields();
pNode1.setParent(gpNode1);
ggpNode2 = new DefaultFields();
gpNode2 = new DefaultFields();
gpNode2.setParent(ggpNode2);
pNode2 = new DefaultFields();
pNode2.setParent(gpNode2);
cNode = new DefaultFields();
cNode.setParent(pNode1);
ggpNode1.addDefaultFields(ggp1Fields);
ggpNode2.addDefaultFields(ggp2Fields);
gpNode1.addDefaultFields(gp1Fields);
gpNode2.addDefaultFields(gp2Fields);
pNode1.addDefaultFields(p1Fields);
pNode2.addDefaultFields(p2Fields);
cNode.addDefaultFields(cFields);
done();
});
it('Nodes are updated accordingly when a parent changes '
+ '(change whole upper hierarchy)',
done => {
const rFields = {
ggp2: true,
gp2: true,
p2: true,
c: true,
generation: 3,
};
assert.deepStrictEqual(cNode._dfGetFields(), startFields);
cNode.setParent(pNode2);
assert.deepStrictEqual(cNode._dfGetFields(), rFields);
done();
});
it('Nodes are updated accordingly when a parent changes '
+ '(change part of the upper hierarchy)',
done => {
const rFields = {
ggp2: true,
gp2: true,
p1: true,
c: true,
generation: 3,
};
assert.deepStrictEqual(cNode._dfGetFields(), startFields);
pNode1.setParent(gpNode2);
assert.deepStrictEqual(cNode._dfGetFields(), rFields);
done();
});
it('Nodes are updated accordingly when a parent changes '
+ '(shortcut hierarchy)',
done => {
const rFields = {
ggp2: true,
c: true,
generation: 3,
};
assert.deepStrictEqual(cNode._dfGetFields(), startFields);
cNode.setParent(ggpNode2);
assert.deepStrictEqual(cNode._dfGetFields(), rFields);
done();
});
it('Nodes are updated accordingly when a parent is unset '
+ '(remove whole hierarchy)',
done => {
const rFields = {
c: true,
generation: 3,
};
assert.deepStrictEqual(cNode._dfGetFields(), startFields);
cNode.setParent(null);
assert.deepStrictEqual(cNode._dfGetFields(), rFields);
done();
});
it('Nodes are updated accordingly when a parent is unset '
+ '(remove upper hierarchy)',
done => {
const rFields = {
gp1: true,
p1: true,
c: true,
generation: 3,
};
assert.deepStrictEqual(cNode._dfGetFields(), startFields);
gpNode1.setParent(null);
assert.deepStrictEqual(cNode._dfGetFields(), rFields);
done();
});
});
});
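Taken together, the suite above pins down the DefaultFields contract. A condensed sketch of the behaviour being asserted, using only method names that appear in the tests (lib/DefaultFields.js itself is not part of this excerpt):

    const DefaultFields = require('../../lib/DefaultFields.js');

    const parent = new DefaultFields();
    const child = new DefaultFields();
    child.setParent(parent);                 // registers child with its parent

    parent.addDefaultFields({ parent: true, test: 0 });
    child.addDefaultFields({ child: true, test: 7 });

    // Child fields win over conflicting parent fields:
    // child._dfGetFields() => { parent: true, child: true, test: 7 }

    parent.resetDefaultFields();             // resets propagate to children
    child._dfDestroy();                      // breaks the parent/child references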


@ -1,5 +1,4 @@
'use strict'; // eslint-disable-line strict
// eslint-disable-line strict
const assert = require('assert'); const assert = require('assert');
@ -13,8 +12,7 @@ function generateValidThrowTest(level) {
}, },
Error, Error,
'Expected level to be valid and ' 'Expected level to be valid and '
+ 'the function not to throw an Error.', + 'the function not to throw an Error.');
);
done(); done();
}; };
} }
@ -28,28 +26,27 @@ describe('LogLevel', () => {
}, },
RangeError, RangeError,
'Expected function to throw an Error instance due to ' 'Expected function to throw an Error instance due to '
+ 'invalid log level.', + 'invalid log level.');
);
done(); done();
}); });
it('should not throw on "trace" level', it('should not throw on "trace" level',
generateValidThrowTest('trace')); generateValidThrowTest('trace'));
it('should not throw on "debug" level', it('should not throw on "debug" level',
generateValidThrowTest('debug')); generateValidThrowTest('debug'));
it('should not throw on "info" level', it('should not throw on "info" level',
generateValidThrowTest('info')); generateValidThrowTest('info'));
it('should not throw on "warn" level', it('should not throw on "warn" level',
generateValidThrowTest('warn')); generateValidThrowTest('warn'));
it('should not throw on "error" level', it('should not throw on "error" level',
generateValidThrowTest('error')); generateValidThrowTest('error'));
it('should not throw on "fatal" level', it('should not throw on "fatal" level',
generateValidThrowTest('fatal')); generateValidThrowTest('fatal'));
}); });
describe('shouldLog(level, floor)', () => { describe('shouldLog(level, floor)', () => {
@ -57,8 +54,7 @@ describe('LogLevel', () => {
assert.strictEqual( assert.strictEqual(
LogLevel.shouldLog('trace', 'trace'), LogLevel.shouldLog('trace', 'trace'),
true, true,
'Expected trace floor to allow logging trace level.', 'Expected trace floor to allow logging trace level.');
);
done(); done();
}); });
@ -66,8 +62,7 @@ describe('LogLevel', () => {
assert.strictEqual( assert.strictEqual(
LogLevel.shouldLog('debug', 'debug'), LogLevel.shouldLog('debug', 'debug'),
true, true,
'Expected debug floor to allow logging debug level.', 'Expected debug floor to allow logging debug level.');
);
done(); done();
}); });
@ -75,8 +70,7 @@ describe('LogLevel', () => {
assert.strictEqual( assert.strictEqual(
LogLevel.shouldLog('info', 'info'), LogLevel.shouldLog('info', 'info'),
true, true,
'Expected info floor to allow logging info level.', 'Expected info floor to allow logging info level.');
);
done(); done();
}); });
@ -84,8 +78,7 @@ describe('LogLevel', () => {
assert.strictEqual( assert.strictEqual(
LogLevel.shouldLog('warn', 'warn'), LogLevel.shouldLog('warn', 'warn'),
true, true,
'Expected warn floor to allow logging warn level.', 'Expected warn floor to allow logging warn level.');
);
done(); done();
}); });
@ -93,8 +86,7 @@ describe('LogLevel', () => {
assert.strictEqual( assert.strictEqual(
LogLevel.shouldLog('error', 'error'), LogLevel.shouldLog('error', 'error'),
true, true,
'Expected error floor to allow logging error level.', 'Expected error floor to allow logging error level.');
);
done(); done();
}); });
@ -102,8 +94,7 @@ describe('LogLevel', () => {
assert.strictEqual( assert.strictEqual(
LogLevel.shouldLog('fatal', 'fatal'), LogLevel.shouldLog('fatal', 'fatal'),
true, true,
'Expected fatal floor to allow logging fatal level.', 'Expected fatal floor to allow logging fatal level.');
);
done(); done();
}); });
}); });
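For quick reference while reading the shouldLog assertions above, a tiny sketch of the check they exercise (the require path is assumed by analogy with the other suites in this diff):

    const LogLevel = require('../../lib/LogLevel.js');

    LogLevel.shouldLog('info', 'info');  // true: a level equal to the floor is logged
    LogLevel.shouldLog('info', 'trace'); // a 'trace' floor also lets higher levels through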


@ -1,15 +1,15 @@
'use strict'; // eslint-disable-line strict
// eslint-disable-line strict
const assert = require('assert'); const assert = require('assert');
const { genericFilterGenerator, loggingMisuseGenerator, DummyLogger } = require('../Utils'); const Utils = require('../Utils.js');
const genericFilterGenerator = Utils.genericFilterGenerator;
const loggingMisuseGenerator = Utils.loggingMisuseGenerator;
const DummyLogger = Utils.DummyLogger;
const Config = require('../../lib/Config.js'); const Config = require('../../lib/Config.js');
const RequestLogger = require('../../lib/RequestLogger.js'); const RequestLogger = require('../../lib/RequestLogger.js');
const Logger = require('../../lib/Logger.js'); const Logger = require('../../index.js').Logger;
const config = new Config();
/* /*
* This function is a thunk-function calling the Utils' filterGenerator with * This function is a thunk-function calling the Utils' filterGenerator with
@ -18,14 +18,18 @@ const config = new Config();
*/ */
function filterGenerator(logLevel, callLevel) { function filterGenerator(logLevel, callLevel) {
function createModuleLogger(dummyLogger, filterLevel) { function createModuleLogger(dummyLogger, filterLevel) {
const logger = new Logger('TestModuleLogger',
{
level: filterLevel,
dump: 'fatal',
});
/* /*
* Here, patch the config by setting a specifically designed dummyLogger * Here, patch the Config by setting a specificly designed dummyLogger
* for testing purposes that will help us collect runtime data. * for testing purposes that will help us collect runtime data.
*/ */
const testConfig = new Config({ level: filterLevel, dump: 'fatal' }); Config.simpleLogger = dummyLogger;
testConfig.simpleLogger = dummyLogger;
return new Logger(testConfig, 'TestModuleLogger'); return logger;
} }
return genericFilterGenerator(logLevel, callLevel, createModuleLogger); return genericFilterGenerator(logLevel, callLevel, createModuleLogger);
@ -33,92 +37,146 @@ function filterGenerator(logLevel, callLevel) {
function checkFields(src, result) { function checkFields(src, result) {
Object.keys(src).forEach(k => { Object.keys(src).forEach(k => {
if (Object.prototype.hasOwnProperty.call(src, k)) { if (src.hasOwnProperty(k)) {
assert.deepStrictEqual(result[k], src[k]); assert.deepStrictEqual(result[k], src[k]);
} }
}); });
assert.ok(Object.prototype.hasOwnProperty.call(result, 'time')); assert.ok(result.hasOwnProperty('time'));
// Time field should be current give or take 1s // Time field should be current give or take 1s
assert.ok((Date.now() - result.time) < 1000); assert.ok((Date.now() - result.time) < 1000);
} }
describe('Logger is usable:', () => { describe('WereLogs Logger is usable:', () => {
beforeEach(() => { beforeEach(() => {
config.reset(); Config.reset();
}); });
it('Cannot be instanciated without parameters', done => { it('Can be instanciated with only a name', done => {
assert.throws( assert.doesNotThrow(
() => new Logger(),
TypeError,
'Logger Instanciation should not succeed without parameter.',
);
done();
});
it('Cannot be instanciated with only a config', done => {
assert.throws(
() => new Logger(config),
TypeError,
'Logger Instanciation should not be succeed without a name.',
);
done();
});
it('Cannot be instanciated with a bad config type', done => {
assert.throws(
() => new Logger({ level: 'info' }, 'WereLogsTest'),
TypeError,
'Logger Instanciation should not succeed with a bad config type.',
);
done();
});
it('Cannot be instanciated with only a name', done => {
assert.throws(
() => new Logger('WereLogsTest'), () => new Logger('WereLogsTest'),
TypeError, Error,
'Logger Instanciation should not succeed with only a name.', 'WereLogs Instanciation should not throw any kind of error.');
); done();
});
it('Cannot be instanciated with invalid log level', done => {
assert.throws(
() => new Logger('test', { level: 'invalidlevel' }),
RangeError,
// eslint-disable-next-line max-len
'WereLogs should not be instanciable without the proper logging levels.');
done();
});
it('Cannot be instanciated with invalid dump threshold level', done => {
assert.throws(
() => new Logger('test', { level: 'trace', dump: 'invalidlevel' }),
RangeError,
// eslint-disable-next-line max-len
'WereLogs should not be instanciable without the proper dumping threshold levels.');
done();
});
it('Cannot be instanciated with a non-Array in config.streams', done => {
assert.throws(
() => new Logger('test', { streams: process.stdout }),
Error,
// eslint-disable-next-line max-len
'Werelogs should not be instanciable with a stream option that is not an array.');
done();
});
it('Cannot be instanciated with an empty Array in config.streams', done => {
assert.throws(
() => new Logger('test', { streams: [] }),
Error,
// eslint-disable-next-line max-len
'Werelogs should not be instanciable with an empty array for the streams option.');
done();
});
it('Cannot set logging level to invalid level at runtime', done => {
const logger = new Logger('test');
assert.throws(
() => {
logger.setLevel('invalidLevel');
},
RangeError,
// eslint-disable-next-line max-len
'WereLogs should not be able to set log level to an invalid level.');
done();
});
it('Can set logging level at runtime', done => {
const logger = new Logger('test');
assert.doesNotThrow(
() => {
logger.setLevel('fatal');
},
RangeError,
'WereLogs should be able to set log level at runtime.');
done();
});
it('Cannot set dump threshold to invalid level at runtime', done => {
const logger = new Logger('test');
assert.throws(
() => {
logger.setDumpThreshold('invalidLevel');
},
RangeError,
// eslint-disable-next-line max-len
'WereLogs should not be able to set dump threshold to an invalid level.');
done();
});
it('Can set dump threshold at runtime', done => {
const logger = new Logger('test');
assert.doesNotThrow(
() => {
logger.setDumpThreshold('fatal');
},
RangeError,
'WereLogs should be able to set dump threshold at runtime.');
done(); done();
}); });
it('Can create Per-Request Loggers', done => { it('Can create Per-Request Loggers', done => {
const logger = new Logger(config, 'test'); const logger = new Logger('test');
assert.doesNotThrow( assert.doesNotThrow(
() => { () => {
logger.newRequestLogger(); logger.newRequestLogger();
}, },
Error, Error,
'Werelogs should not throw when creating a request logger.', 'Werelogs should not throw when creating a request logger.');
); const reqLogger = logger.newRequestLogger();
assert(reqLogger instanceof RequestLogger, 'RequestLogger');
done(); done();
}); });
it('Can create Per-Request Loggers from a Serialized UID Array', done => { it('Can create Per-Request Loggers from a Serialized UID Array', done => {
const logger = new Logger(config, 'test'); const logger = new Logger('test');
assert.doesNotThrow( assert.doesNotThrow(
() => { () => {
logger.newRequestLogger(); logger.newRequestLogger();
}, },
Error, Error,
// eslint-disable-next-line max-len // eslint-disable-next-line max-len
'Werelogs should not throw when creating a request logger from a Serialized UID Array.', 'Werelogs should not throw when creating a request logger from a Serialized UID Array.');
);
const reqLogger = logger.newRequestLoggerFromSerializedUids( const reqLogger = logger.newRequestLoggerFromSerializedUids(
'OneUID:SecondUID:TestUID:YouWinUID', 'OneUID:SecondUID:TestUID:YouWinUID');
);
assert(reqLogger instanceof RequestLogger, 'RequestLogger'); assert(reqLogger instanceof RequestLogger, 'RequestLogger');
assert.deepStrictEqual(reqLogger.getUids().slice(0, -1), assert.deepStrictEqual(reqLogger.getUids().slice(0, -1),
['OneUID', 'SecondUID', 'TestUID', 'YouWinUID']); ['OneUID', 'SecondUID', 'TestUID', 'YouWinUID']);
done(); done();
}); });
it('Uses the additional fields as expected', done => { it('Uses the additional fields as expected', done => {
const dummyLogger = new DummyLogger(); const dummyLogger = new DummyLogger();
config.simpleLogger = dummyLogger; const logger = new Logger('test');
const logger = new Logger(config, 'test'); Config.simpleLogger = dummyLogger;
const fields = { const fields = {
ip: '127.0.0.1', ip: '127.0.0.1',
method: 'GET', method: 'GET',
@ -134,27 +192,27 @@ describe('Logger is usable:', () => {
describe('Does not crash and logs a fatal message when mis-using its logging API', () => { describe('Does not crash and logs a fatal message when mis-using its logging API', () => {
const testValues = [ const testValues = [
{ desc: 'a string as second argument', args: ['test', 'second-param-string'] }, { desc: 'a string as second argument', args: ['test', 'second-param-string'] },
{ desc: 'a function as second argument', args: ['test', () => { }] }, // eslint-disable-line arrow-body-style { desc: 'a function as second argument', args: ['test', () => { return; }] }, // eslint-disable-line arrow-body-style
{ desc: 'a Number as second argument', args: ['test', 1] }, { desc: 'a Number as second argument', args: ['test', 1] },
{ desc: 'more than 2 arguments', args: ['test', 2, 3, 4] }, { desc: 'more than 2 arguments', args: ['test', 2, 3, 4] },
]; ];
/* eslint-enable max-len */ /* eslint-enable max-len */
function createMisusableLogger(dummyLogger) { function createMisusableLogger(dummyLogger) {
config.simpleLogger = dummyLogger; const logger = new Logger('test');
const logger = new Logger(config, 'test'); Config.simpleLogger = dummyLogger;
return logger; return logger;
} }
for (let i = 0; i < testValues.length; ++i) { for (let i = 0; i < testValues.length; ++i) {
const test = testValues[i]; const test = testValues[i];
it(`Does not crash with ${test.desc}`, it(`Does not crash with ${test.desc}`,
loggingMisuseGenerator(test, createMisusableLogger)); loggingMisuseGenerator(test, createMisusableLogger));
} }
}); });
}); });
/* eslint-disable no-multi-spaces, max-len */ /* eslint-disable no-multi-spaces, max-len */
describe('Logger can log as specified by the log level', () => { describe('Werelogs Module-level Logger can log as specified by the log level', () => {
it('Trace level does not filter trace level out', filterGenerator('trace', 'trace')); it('Trace level does not filter trace level out', filterGenerator('trace', 'trace'));
it('Trace level does not filter debug level out', filterGenerator('trace', 'debug')); it('Trace level does not filter debug level out', filterGenerator('trace', 'debug'));
it('Trace level does not filter info level out', filterGenerator('trace', 'info')); it('Trace level does not filter info level out', filterGenerator('trace', 'info'));
@ -198,3 +256,85 @@ describe('Logger can log as specified by the log level', () => {
it('Fatal level does not filter fatal level out', filterGenerator('fatal', 'fatal')); it('Fatal level does not filter fatal level out', filterGenerator('fatal', 'fatal'));
}); });
/* eslint-enable no-multi-spaces, max-len */ /* eslint-enable no-multi-spaces, max-len */
describe('Werelogs Module-level Logger provide the DefaultFields logic', () => {
const loggerConfig = {
level: 'info',
dump: 'fatal',
end: 'info',
};
it('should not modify the object passed as a parameter', done => {
const defaultFields = {
name: 'TestDefaultFields1',
data: 0,
};
const add1 = {
attr1: 0,
};
const add2 = {
attr2: 'string',
};
const dummyLogger = new DummyLogger();
Config.simpleLogger = dummyLogger;
const logger = new Logger(defaultFields, loggerConfig);
logger.addDefaultFields(add1);
logger.addDefaultFields(add2);
assert.deepStrictEqual(add1, { attr1: 0 });
assert.deepStrictEqual(add2, { attr2: 'string' });
done();
});
it('should add one added default field to the log entries', done => {
const defaultFields = {
name: 'TestDefaultFields2',
data: 0,
};
const clientInfo = {
clientIP: '127.0.0.1',
};
const dummyLogger = new DummyLogger();
Config.simpleLogger = dummyLogger;
const logger = new Logger(defaultFields, loggerConfig);
logger.addDefaultFields(clientInfo);
logger.info('test message');
assert.strictEqual(dummyLogger.ops[0][1][0].clientIP,
clientInfo.clientIP);
done();
});
it('should add multiple added default fields to the log entries',
done => {
const defaultFields = {
name: 'TestDefaultFields3',
data: 0,
};
const clientInfo = {
clientIP: '127.0.0.1',
clientPort: '1337',
};
const requestInfo = {
object: '/tata/self.txt',
creator: 'Joddy',
};
const dummyLogger = new DummyLogger();
Config.simpleLogger = dummyLogger;
const logger = new Logger(defaultFields, loggerConfig);
logger.addDefaultFields(clientInfo);
logger.addDefaultFields(requestInfo);
logger.info('test message');
assert.strictEqual(dummyLogger.ops[0][1][0].clientIP,
clientInfo.clientIP);
assert.strictEqual(dummyLogger.ops[0][1][0].clientPort,
clientInfo.clientPort);
assert.strictEqual(dummyLogger.ops[0][1][0].object,
requestInfo.object);
assert.strictEqual(dummyLogger.ops[0][1][0].creator,
requestInfo.creator);
assert.strictEqual(dummyLogger.ops[0][1][0].name,
defaultFields.name);
assert.strictEqual(dummyLogger.ops[0][1][0].data,
defaultFields.data);
done();
});
});


@ -1,9 +1,11 @@
'use strict'; // eslint-disable-line strict
// eslint-disable-line strict
const assert = require('assert'); const assert = require('assert');
const { DummyLogger, genericFilterGenerator, loggingMisuseGenerator } = require('../Utils.js'); const Utils = require('../Utils.js');
const DummyLogger = Utils.DummyLogger;
const genericFilterGenerator = Utils.genericFilterGenerator;
const loggingMisuseGenerator = Utils.loggingMisuseGenerator;
const RequestLogger = require('../../lib/RequestLogger.js'); const RequestLogger = require('../../lib/RequestLogger.js');
@ -22,7 +24,7 @@ function filterGenerator(logLevel, callLevel) {
function runLoggingDumpTest(commandHistory, expectedHistory, expectedCounts, function runLoggingDumpTest(commandHistory, expectedHistory, expectedCounts,
done) { done) {
const dummyLogger = new DummyLogger(); const dummyLogger = new DummyLogger();
const reqLogger = new RequestLogger(dummyLogger, 'trace', 'error', 'info'); const reqLogger = new RequestLogger(dummyLogger, 'trace', 'error', 'info');
@ -44,9 +46,9 @@ function runLoggingDumpTest(commandHistory, expectedHistory, expectedCounts,
expectedHistory.every((val, index) => { expectedHistory.every((val, index) => {
assert.strictEqual(dummyLogger.ops[index][0], val[0], assert.strictEqual(dummyLogger.ops[index][0], val[0],
'Expected log entry levels to match.'); 'Expected log entry levels to match.');
assert.strictEqual(dummyLogger.ops[index][1][1], val[1], assert.strictEqual(dummyLogger.ops[index][1][1], val[1],
'Expected log entry values to match.'); 'Expected log entry values to match.');
return true; return true;
}); });
assert.deepEqual(dummyLogger.counts, expectedCounts); assert.deepEqual(dummyLogger.counts, expectedCounts);
@ -60,8 +62,7 @@ describe('RequestLogger', () => {
assert.throws( assert.throws(
() => new RequestLogger(undefined, 'fatal', 'debug', 'info'), () => new RequestLogger(undefined, 'fatal', 'debug', 'info'),
Error, Error,
'Dump level "debug" should not be valid with logging level "fatal".', 'Dump level "debug" should not be valid with logging level "fatal".');
);
done(); done();
}); });
@ -69,8 +70,7 @@ describe('RequestLogger', () => {
assert.doesNotThrow( assert.doesNotThrow(
() => new RequestLogger(undefined, 'debug', 'fatal', 'info'), () => new RequestLogger(undefined, 'debug', 'fatal', 'info'),
Error, Error,
'Dump level "fatal" should be valid with logging level "debug".', 'Dump level "fatal" should be valid with logging level "debug".');
);
done(); done();
}); });
}); });
@ -98,8 +98,7 @@ describe('RequestLogger', () => {
assert.throws( assert.throws(
() => new RequestLogger(undefined, 'debug', 'fatal', 'info', 'pouet:tata'), () => new RequestLogger(undefined, 'debug', 'fatal', 'info', 'pouet:tata'),
Error, Error,
'UID string "pouet:tata" should be rejected by the RequestLogger constructor.', 'UID string "pouet:tata" should be rejected by the RequestLogger constructor.');
);
done(); done();
}); });
@ -117,8 +116,7 @@ describe('RequestLogger', () => {
assert.throws( assert.throws(
() => new RequestLogger(undefined, 'debug', 'fatal', 'info', ['OneUID', 'SecondUID', 'Test:DashUID']), () => new RequestLogger(undefined, 'debug', 'fatal', 'info', ['OneUID', 'SecondUID', 'Test:DashUID']),
Error, Error,
'UID string "Test:DashUID" should be rejected by the RequestLogger constructor.', 'UID string "Test:DashUID" should be rejected by the RequestLogger constructor.');
);
done(); done();
}); });
}); });
@ -188,7 +186,7 @@ describe('RequestLogger', () => {
describe('Does not crash when mis-using its logging API', () => { describe('Does not crash when mis-using its logging API', () => {
const testValues = [ const testValues = [
{ desc: 'a string as second argument', args: ['test', 'second-param-string'] }, { desc: 'a string as second argument', args: ['test', 'second-param-string'] },
{ desc: 'a function as second argument', args: ['test', function f() { }] }, { desc: 'a function as second argument', args: ['test', function f() { return; }] },
{ desc: 'a Number as second argument', args: ['test', 1] }, { desc: 'a Number as second argument', args: ['test', 1] },
{ desc: 'more than 2 arguments', args: ['test', 2, 3, 4] }, { desc: 'more than 2 arguments', args: ['test', 2, 3, 4] },
]; ];
@ -199,7 +197,7 @@ describe('RequestLogger', () => {
for (let i = 0; i < testValues.length; ++i) { for (let i = 0; i < testValues.length; ++i) {
const test = testValues[i]; const test = testValues[i];
it(`Does not crash with ${test.desc}`, it(`Does not crash with ${test.desc}`,
loggingMisuseGenerator(test, createMisusableRequestLogger)); loggingMisuseGenerator(test, createMisusableRequestLogger));
} }
}); });
@ -250,35 +248,35 @@ describe('RequestLogger', () => {
describe('Logging API regression testing', () => { describe('Logging API regression testing', () => {
it('Should not alter the input fields when not actually logging', it('Should not alter the input fields when not actually logging',
done => { done => {
const dummyLogger = new DummyLogger(); const dummyLogger = new DummyLogger();
const reqLogger = new RequestLogger(dummyLogger, const reqLogger = new RequestLogger(dummyLogger,
'info', 'fatal', 'info'); 'info', 'fatal', 'info');
const refFields = { hits: 45, count: 32 }; const refFields = { hits: 45, count: 32 };
const usedFields = { ...refFields }; const usedFields = Object.assign({}, refFields);
reqLogger.debug('test', usedFields); reqLogger.debug('test', usedFields);
assert.deepStrictEqual(usedFields, refFields); assert.deepStrictEqual(usedFields, refFields);
done(); done();
}); });
it('Should not alter the input fields when actually logging', it('Should not alter the input fields when actually logging',
done => { done => {
const dummyLogger = new DummyLogger(); const dummyLogger = new DummyLogger();
const reqLogger = new RequestLogger(dummyLogger, const reqLogger = new RequestLogger(dummyLogger,
'info', 'fatal', 'info'); 'info', 'fatal', 'info');
const refFields = { hits: 45, count: 32 }; const refFields = { hits: 45, count: 32 };
const usedFields = { ...refFields }; const usedFields = Object.assign({}, refFields);
reqLogger.info('test', usedFields); reqLogger.info('test', usedFields);
assert.deepStrictEqual(usedFields, refFields); assert.deepStrictEqual(usedFields, refFields);
done(); done();
}); });
it('Should not alter the input fields when dumping', done => { it('Should not alter the input fields when dumping', done => {
const dummyLogger = new DummyLogger(); const dummyLogger = new DummyLogger();
const reqLogger = new RequestLogger(dummyLogger, const reqLogger = new RequestLogger(dummyLogger,
'info', 'fatal', 'info'); 'info', 'fatal', 'info');
const refFields = { hits: 45, count: 32 }; const refFields = { hits: 45, count: 32 };
const usedFields = { ...refFields }; const usedFields = Object.assign({}, refFields);
reqLogger.error('test', usedFields); reqLogger.error('test', usedFields);
assert.deepStrictEqual(usedFields, refFields); assert.deepStrictEqual(usedFields, refFields);
done(); done();
@ -295,7 +293,7 @@ describe('RequestLogger', () => {
}; };
const dummyLogger = new DummyLogger(); const dummyLogger = new DummyLogger();
const reqLogger = new RequestLogger(dummyLogger, const reqLogger = new RequestLogger(dummyLogger,
'info', 'fatal', 'info'); 'info', 'fatal', 'info');
reqLogger.addDefaultFields(add1); reqLogger.addDefaultFields(add1);
reqLogger.addDefaultFields(add2); reqLogger.addDefaultFields(add2);
assert.deepStrictEqual(add1, { attr1: 0 }); assert.deepStrictEqual(add1, { attr1: 0 });
@ -309,70 +307,71 @@ describe('RequestLogger', () => {
}; };
const dummyLogger = new DummyLogger(); const dummyLogger = new DummyLogger();
const reqLogger = new RequestLogger(dummyLogger, const reqLogger = new RequestLogger(dummyLogger,
'info', 'fatal', 'info'); 'info', 'fatal', 'info');
reqLogger.addDefaultFields(clientInfo); reqLogger.addDefaultFields(clientInfo);
reqLogger.info('test message'); reqLogger.info('test message');
assert.strictEqual(clientInfo.clientIP, assert.strictEqual(clientInfo.clientIP,
dummyLogger.ops[0][1][0].clientIP); dummyLogger.ops[0][1][0].clientIP);
done(); done();
}); });
it('should add multiple added default fields to the log entries', it('should add multiple added default fields to the log entries',
done => { done => {
const clientInfo = { const clientInfo = {
clientIP: '127.0.0.1', clientIP: '127.0.0.1',
clientPort: '1337', clientPort: '1337',
}; };
const requestInfo = { const requestInfo = {
object: '/tata/self.txt', object: '/tata/self.txt',
creator: 'Joddy', creator: 'Joddy',
}; };
const dummyLogger = new DummyLogger(); const dummyLogger = new DummyLogger();
const reqLogger = new RequestLogger(dummyLogger, const reqLogger = new RequestLogger(dummyLogger,
'info', 'fatal', 'info'); 'info', 'fatal', 'info');
reqLogger.addDefaultFields(clientInfo); reqLogger.addDefaultFields(clientInfo);
reqLogger.addDefaultFields(requestInfo); reqLogger.addDefaultFields(requestInfo);
reqLogger.info('test message'); reqLogger.info('test message');
assert.strictEqual(clientInfo.clientIP, assert.strictEqual(clientInfo.clientIP,
dummyLogger.ops[0][1][0].clientIP); dummyLogger.ops[0][1][0].clientIP);
assert.strictEqual(clientInfo.clientPort, assert.strictEqual(clientInfo.clientPort,
dummyLogger.ops[0][1][0].clientPort); dummyLogger.ops[0][1][0].clientPort);
assert.strictEqual(requestInfo.object, assert.strictEqual(requestInfo.object,
dummyLogger.ops[0][1][0].object); dummyLogger.ops[0][1][0].object);
assert.strictEqual(requestInfo.creator, assert.strictEqual(requestInfo.creator,
dummyLogger.ops[0][1][0].creator); dummyLogger.ops[0][1][0].creator);
done(); done();
}); });
}); });
describe('Automatic Elapsed Time computation', () => { describe('Automatic Elapsed Time computation', () => {
describe('Deprecated API:', () => { describe('Deprecated API:', () => {
it('should include an "elapsed_ms" field in the last log entry', it('should include an "elapsed_ms" field in the last log entry',
done => { done => {
const dummyLogger = new DummyLogger(); const dummyLogger = new DummyLogger();
const reqLogger = new RequestLogger(dummyLogger, const reqLogger = new RequestLogger(dummyLogger,
'info', 'fatal', 'info'); 'info', 'fatal', 'info');
reqLogger.end('Last message'); reqLogger.end('Last message');
assert.strictEqual(dummyLogger.ops[0][1][1], 'Last message'); assert.strictEqual(dummyLogger.ops[0][1][1], 'Last message');
assert.notStrictEqual(dummyLogger.ops[0][1][0].elapsed_ms, assert.notStrictEqual(dummyLogger.ops[0][1][0].elapsed_ms,
undefined); undefined);
assert.strictEqual(typeof dummyLogger.ops[0][1][0] assert.strictEqual(typeof(dummyLogger.ops[0][1][0]
.elapsed_ms, 'number'); .elapsed_ms),
done(); 'number');
}); done();
});
// eslint-disable-next-line max-len // eslint-disable-next-line max-len
it('should include an "elapsed_ms" field in the last log entry and be error level', () => { it('should include an "elapsed_ms" field in the last log entry and be error level', () => {
const dummyLogger = new DummyLogger(); const dummyLogger = new DummyLogger();
const reqLogger = new RequestLogger(dummyLogger, const reqLogger = new RequestLogger(dummyLogger,
'info', 'fatal', 'info'); 'info', 'fatal', 'info');
reqLogger.errorEnd('Last message failed'); reqLogger.errorEnd('Last message failed');
assert.strictEqual(dummyLogger.ops[0][1][1], assert.strictEqual(dummyLogger.ops[0][1][1],
'Last message failed'); 'Last message failed');
assert.notStrictEqual(dummyLogger.ops[0][1][0].elapsed_ms, assert.notStrictEqual(dummyLogger.ops[0][1][0].elapsed_ms,
undefined); undefined);
assert.strictEqual(typeof dummyLogger.ops[0][1][0].elapsed_ms, assert.strictEqual(typeof(dummyLogger.ops[0][1][0].elapsed_ms),
'number'); 'number');
assert.strictEqual(dummyLogger.ops[0][0], 'error'); assert.strictEqual(dummyLogger.ops[0][0], 'error');
}); });
}); });
@ -393,7 +392,7 @@ describe('RequestLogger', () => {
endLogging[level](reqLogger.end())('Last message'); endLogging[level](reqLogger.end())('Last message');
assert.strictEqual(dummyLogger.ops[0][1][1], 'Last message'); assert.strictEqual(dummyLogger.ops[0][1][1], 'Last message');
assert.notStrictEqual(dummyLogger.ops[0][1][0].elapsed_ms, undefined); assert.notStrictEqual(dummyLogger.ops[0][1][0].elapsed_ms, undefined);
assert.strictEqual(typeof dummyLogger.ops[0][1][0].elapsed_ms, 'number'); assert.strictEqual(typeof(dummyLogger.ops[0][1][0].elapsed_ms), 'number');
assert.strictEqual(dummyLogger.ops[0][0], level); assert.strictEqual(dummyLogger.ops[0][0], level);
done(); done();
}); });
@ -408,87 +407,54 @@ describe('RequestLogger', () => {
// end() logging // end() logging
reqLogger.end().error('Test Augmented END', { endValue: 42 }); reqLogger.end().error('Test Augmented END', { endValue: 42 });
assert.strictEqual(dummyLogger.ops[0][1][1], 'Test Augmented END'); assert.strictEqual(dummyLogger.ops[0][1][1], 'Test Augmented END');
assert.strictEqual(typeof dummyLogger.ops[0][1][0].elapsed_ms, assert.strictEqual(typeof(dummyLogger.ops[0][1][0].elapsed_ms),
'number'); 'number');
assert.strictEqual(dummyLogger.ops[0][1][0].endFlag, true); assert.strictEqual(dummyLogger.ops[0][1][0].endFlag, true);
assert.strictEqual(dummyLogger.ops[0][1][0].endValue, 42); assert.strictEqual(dummyLogger.ops[0][1][0].endValue, 42);
done(); done();
}); });
it('should log an error in addition to request logs when end() called more than once',
done => {
const dummyLogger = new DummyLogger();
const reqLogger = new RequestLogger(dummyLogger, 'trace', 'fatal');
reqLogger.end().info('after first call to end()');
reqLogger.end().debug('after second call to end()');
assert.strictEqual(dummyLogger.ops.length, 3);
assert.strictEqual(dummyLogger.ops[0][0], 'info');
assert.strictEqual(dummyLogger.ops[0][1][1], 'after first call to end()');
assert.strictEqual(dummyLogger.ops[1][0], 'error');
assert.strictEqual(dummyLogger.ops[2][0], 'debug');
assert.strictEqual(dummyLogger.ops[2][1][1], 'after second call to end()');
done();
});
}); });
describe('Log History dumped when logging floor level reached', () => { describe('Log History dumped when logging floor level reached', () => {
it('Dumping duplicates log entries', done => { it('Dumping duplicates log entries', done => {
const commandHistory = ['info', 'error']; const commandHistory = ['info', 'error'];
const expectedHistory = [['info', 0], ['info', 0], ['error', 1]]; const expectedHistory = [['info', 0], ['info', 0], ['error', 1]];
const expectedCounts = { const expectedCounts = { trace: 0, debug: 0, info: 2, warn: 0,
trace: 0, error: 1, fatal: 0 };
debug: 0,
info: 2,
warn: 0,
error: 1,
fatal: 0,
};
runLoggingDumpTest(commandHistory, expectedHistory, expectedCounts, runLoggingDumpTest(commandHistory, expectedHistory, expectedCounts,
done); done);
done(); done();
}); });
it('Dumping Keeps logging history order', done => { it('Dumping Keeps logging history order', done => {
const commandHistory = ['trace', 'info', 'debug', 'error']; const commandHistory = ['trace', 'info', 'debug', 'error'];
const expectedHistory = [['trace', 0], ['info', 1], ['debug', 2], const expectedHistory = [['trace', 0], ['info', 1], ['debug', 2],
['trace', 0], ['info', 1], ['debug', 2], ['trace', 0], ['info', 1], ['debug', 2],
['error', 3]]; ['error', 3]];
const expectedCounts = { const expectedCounts = { trace: 2, debug: 2, info: 2, warn: 0,
trace: 2, error: 1, fatal: 0 };
debug: 2,
info: 2,
warn: 0,
error: 1,
fatal: 0,
};
runLoggingDumpTest(commandHistory, expectedHistory, expectedCounts, runLoggingDumpTest(commandHistory, expectedHistory, expectedCounts,
done); done);
done(); done();
}); });
it('Dumping multiple times does not re-dump already-dumped entries', it('Dumping multiple times does not re-dump already-dumped entries',
done => { done => {
const commandHistory = ['trace', 'info', 'debug', 'error', const commandHistory = ['trace', 'info', 'debug', 'error',
'warn', 'debug', 'fatal']; 'warn', 'debug', 'fatal'];
const expectedHistory = [['trace', 0], ['info', 1], ['debug', 2], const expectedHistory = [['trace', 0], ['info', 1], ['debug', 2],
['trace', 0], ['info', 1], ['debug', 2], ['trace', 0], ['info', 1], ['debug', 2],
['error', 3], ['warn', 4], ['debug', 5], ['error', 3], ['warn', 4], ['debug', 5],
['warn', 4], ['debug', 5], ['warn', 4], ['debug', 5],
['fatal', 6]]; ['fatal', 6]];
const expectedCounts = { const expectedCounts = { trace: 2, debug: 4, info: 2, warn: 2,
trace: 2, error: 1, fatal: 1 };
debug: 4,
info: 2,
warn: 2,
error: 1,
fatal: 1,
};
runLoggingDumpTest(commandHistory, expectedHistory, runLoggingDumpTest(commandHistory, expectedHistory,
expectedCounts, done); expectedCounts, done);
done(); done();
}); });
}); });
}); });
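As a reading aid for the elapsed-time and dump-history tests above, a small sketch of the request-lifetime pattern they exercise (DummyLogger is the test sink from ../Utils.js; field values are illustrative):

    const { DummyLogger } = require('../Utils.js');
    const RequestLogger = require('../../lib/RequestLogger.js');

    // Constructor signature used throughout the suite:
    // (sink, logLevel, dumpThreshold, endLevel)
    const reqLogger = new RequestLogger(new DummyLogger(), 'info', 'fatal', 'info');

    reqLogger.addDefaultFields({ clientIP: '127.0.0.1' });
    reqLogger.info('handling request');

    // The end logger stamps an elapsed_ms field onto the final entry
    reqLogger.end().error('request failed', { endValue: 42 });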


@ -1,17 +1,17 @@
'use strict'; // eslint-disable-line strict
// eslint-disable-line strict
const assert = require('assert'); const assert = require('assert');
const Utils = require('../../lib/Utils.js');
const { const generateUid = Utils.generateUid;
generateUid, serializeUids, unserializeUids, objectCopy, const serializeUids = Utils.serializeUids;
} = require('../../lib/Utils'); const unserializeUids = Utils.unserializeUids;
const objectCopy = Utils.objectCopy;
describe('Utils: generateUid', () => { describe('Utils: generateUid', () => {
it('generates a string-typed ID', done => { it('generates a string-typed ID', done => {
const uid = generateUid(); const uid = generateUid();
assert.strictEqual(typeof uid, 'string', assert.strictEqual(typeof(uid), 'string',
`The generated ID is not a String (${typeof uid})`); `The generated ID is not a String (${typeof(uid)})`);
done(); done();
}); });
it('generate roughly unique IDs', done => { it('generate roughly unique IDs', done => {
@ -24,8 +24,8 @@ describe('Utils: generateUid', () => {
} }
Object.keys(generated).every(uid => { Object.keys(generated).every(uid => {
assert.strictEqual(generated[uid], 1, assert.strictEqual(generated[uid], 1,
`Uid ${uid} was generated ${generated[uid]} ` `Uid ${uid} was generated ${generated[uid]} ` +
+ 'times: It is not even remotely unique.'); 'times: It is not even remotely unique.');
return {}; return {};
}); });
done(); done();
@ -37,7 +37,7 @@ describe('Utils: serializeUids', () => {
const uidList = ['FirstUID', 'SecondUID', 'ThirdUID']; const uidList = ['FirstUID', 'SecondUID', 'ThirdUID'];
const serializedUIDs = serializeUids(uidList); const serializedUIDs = serializeUids(uidList);
assert.strictEqual(serializedUIDs, 'FirstUID:SecondUID:ThirdUID', assert.strictEqual(serializedUIDs, 'FirstUID:SecondUID:ThirdUID',
'Serialized UID List should match expected value.'); 'Serialized UID List should match expected value.');
done(); done();
}); });
@ -54,12 +54,8 @@ describe('Utils: objectCopy', () => {
it('copies all the properties from source to target object', done => { it('copies all the properties from source to target object', done => {
const target = { foo: 'bar' }; const target = { foo: 'bar' };
const source = { id: 1, name: 'demo', value: { a: 1, b: 2, c: 3 } }; const source = { id: 1, name: 'demo', value: { a: 1, b: 2, c: 3 } };
const result = { const result = { foo: 'bar', id: 1, name: 'demo',
foo: 'bar', value: { a: 1, b: 2, c: 3 } };
id: 1,
name: 'demo',
value: { a: 1, b: 2, c: 3 },
};
objectCopy(target, source); objectCopy(target, source);
assert.deepStrictEqual(target, result, assert.deepStrictEqual(target, result,
'target should have the same properties as source'); 'target should have the same properties as source');
@ -67,32 +63,21 @@ describe('Utils: objectCopy', () => {
}); });
it('copies all the properties from multiple sources to target object', it('copies all the properties from multiple sources to target object',
done => { done => {
const target = { foo: 'bar' }; const target = { foo: 'bar' };
const source1 = { const source1 = { id: 1, name: 'demo1',
id: 1, value: { a: 1, b: 2, c: 3 } };
name: 'demo1', // eslint-disable-next-line camelcase
value: { a: 1, b: 2, c: 3 }, const source2 = { req_id: 2, method: 'test',
}; err: { code: 'error', msg: 'test' } };
// eslint-disable-next-line camelcase const result = { foo: 'bar', id: 1, name: 'demo1',
const source2 = { value: { a: 1, b: 2, c: 3 },
req_id: 2, // eslint-disable-next-line camelcase
method: 'test', req_id: 2, method: 'test',
err: { code: 'error', msg: 'test' }, err: { code: 'error', msg: 'test' } };
}; objectCopy(target, source1, source2);
const result = { assert.deepStrictEqual(target, result,
foo: 'bar', 'target should have the same properties as source');
id: 1, done();
name: 'demo1', });
value: { a: 1, b: 2, c: 3 },
// eslint-disable-next-line camelcase
req_id: 2,
method: 'test',
err: { code: 'error', msg: 'test' },
};
objectCopy(target, source1, source2);
assert.deepStrictEqual(target, result,
'target should have the same properties as source');
done();
});
}); });
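The objectCopy cases above amount to a shallow, multi-source copy onto a target object, roughly in the spirit of Object.assign; a compact sketch:

    const { objectCopy } = require('../../lib/Utils.js');

    const target = { foo: 'bar' };
    objectCopy(target, { id: 1 }, { name: 'demo' });
    // target is now { foo: 'bar', id: 1, name: 'demo' }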


@ -1,17 +0,0 @@
#!/usr/bin/env node
// Convert string args into primitive value
const fromStr = (str, primitive) => (str === `${primitive}` ? primitive : str);
const date = fromStr(process.argv[2], undefined);
const exitCode = fromStr(fromStr(process.argv[3], null), undefined);
const { stderrUtils } = require('../../../../index');
stderrUtils.catchAndTimestampStderr(
date ? () => date : undefined,
exitCode,
);
process.emitWarning('TestWarningMessage');
// This will print warning after printing error before exit
throw new Error('TestingError');


@ -1,23 +0,0 @@
#!/usr/bin/env node
// Convert string args into primitive value
const fromStr = (str, primitive) => (str === `${primitive}` ? primitive : str);
const date = fromStr(process.argv[2], undefined);
const exitCode = fromStr(fromStr(process.argv[3], null), undefined);
const promise = fromStr(process.argv[4], true);
const { stderrUtils } = require('../../../../index');
stderrUtils.catchAndTimestampUncaughtException(
date ? () => date : undefined,
exitCode,
);
// Executed if process does not exit, process is in undefined behavior (bad)
// eslint-disable-next-line no-console
setTimeout(() => console.log('EXECUTED AFTER UNCAUGHT EXCEPTION'), 1);
if (promise === true) {
Promise.reject();
} else {
throw new Error('TestingError');
}


@ -1,38 +0,0 @@
#!/usr/bin/env node
// Convert string args into primitive value
const fromStr = (str, primitive) => (str === `${primitive}` ? primitive : str);
const date = fromStr(process.argv[2], undefined);
const name = fromStr(process.argv[3], undefined);
const code = fromStr(process.argv[4], undefined);
const detail = fromStr(process.argv[5], undefined);
const { stderrUtils } = require('../../../../index');
stderrUtils.catchAndTimestampWarning(
date ? () => date : undefined,
);
const warning = new Error('TestWarningMessage');
if (name) warning.name = name;
if (code) warning.code = code;
if (detail) warning.detail = detail;
process.emitWarning(warning);
/*
Examples:
(node:203831) Error: TestWarningMessage
at Object.<anonymous> (catchWarning.js:15:17)
...
at node:internal/main/run_main_module:22:47
Above Warning Date: 2024-06-26T16:32:55.505Z
(node:205151) [TEST01] CUSTOM: TestWarningMessage
at Object.<anonymous> (catchWarning.js:15:17)
...
at node:internal/main/run_main_module:22:47
Some additional detail
Above Warning Date: Tue, 31 Dec 2024 10:20:30 GMT
*/


@ -1,309 +0,0 @@
const assert = require('assert');
const { execFile } = require('child_process');
const stderrUtils = require('../../lib/stderrUtils');
/** Simple regex for ISO YYYY-MM-DDThh:mm:ss.sssZ */
// eslint-disable-next-line max-len
const defaultDateRegex = /\d{4}-[01]\d-[0-3]\dT[0-2]\d:[0-5]\d:[0-5]\d\.\d+(?:[+-][0-2]\d:[0-5]\d|Z)/;
// eslint-disable-next-line valid-jsdoc
/** another format: Tue, 31 Dec 2024 10:20:30 GMT */
const customDate = () => new Date('2024-12-31T10:20:30.444Z').toUTCString();
describe('stderrUtils', () => {
const errStackRegex = /Error: TestingError\n(?:.*\sat\s.*\n)+/;
describe('defaultTimestamp', () => {
it('should match ISO format', () => {
assert.match(stderrUtils.defaultTimestamp(), defaultDateRegex);
});
});
describe('printErrorWithTimestamp', () => {
let stderrText;
const originalStderrWrite = process.stderr.write;
const mockedStderrWrite = text => { stderrText = text; return true; };
const err = new Error('TestingError');
const origin = 'uncaughtException';
beforeEach(() => {
stderrText = undefined;
process.stderr.write = mockedStderrWrite;
});
afterEach(() => {
process.stderr.write = originalStderrWrite;
stderrText = undefined;
});
it(
'should write to stderr with current date, origin and stacktrace',
() => {
const written = stderrUtils
.printErrorWithTimestamp(err, origin);
assert.strictEqual(written, true);
const [firstLine, errStack] = stderrText.split(':\n');
const [errDate, errOrigin] = firstLine.split(': ');
assert.match(errDate, defaultDateRegex);
assert.strictEqual(errOrigin, origin);
assert.strictEqual(errStack, `${err.stack}\n`);
},
);
it(
'should write to stderr with custom date, origin and stacktrace',
() => {
const written = stderrUtils
.printErrorWithTimestamp(err, origin, customDate());
assert.strictEqual(written, true);
const [firstLine, errStack] = stderrText.split(':\n');
const [errDate, errOrigin] = firstLine.split(': ');
assert.strictEqual(errDate, customDate());
assert.strictEqual(errOrigin, origin);
assert.strictEqual(errStack, `${err.stack}\n`);
},
);
});
const execOptions = {
cwd: __dirname,
// The subprocess should always stop on its own,
// but just in case, kill it after 500ms.
// Leave enough time for `nyc`, which runs slower.
timeout: 500,
};
// Execute fixtures in a separate process so the process exit can be observed;
// these therefore read more like functional tests.
const timeoutHint = (ms, retries) =>
`Test fixture process timed out after ${ms}ms with ${retries} retries left.\n` +
'nyc coverage slows down the first run.\nIncrease execOptions.timeout to fix.';
describe('catchAndTimestampUncaughtException', () => {
[
{ desc: 'with default date' },
{ desc: 'with custom date', date: customDate() },
{ desc: 'with custom exitCode 42', exitCode: 42 },
{ desc: 'without exit on uncaught exception', exitCode: null },
{ desc: 'for unhandled promise', promise: true },
].forEach(({
desc, date, exitCode, promise,
}) => describe(desc, () => {
/** Manual retry counter; the before-all hook does not support this.retries() */
let retries = 4;
let err;
let stdout;
let stderr;
let errStack;
let errDate;
let errOrigin;
before('run process catchUncaughtException', function beforeAllHook(done) {
execFile(
'./fixtures/stderrUtils/catchUncaughtException.js',
[`${date}`, `${exitCode}`, `${promise}`],
execOptions,
(subErr, subStdout, subStderr) => {
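// The child was killed by the timeout: retry with a doubled timeout (nyc's first run can be slow)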
if (subErr?.killed) {
retries--;
if (retries <= 0) {
assert.fail(timeoutHint(execOptions.timeout, retries));
}
execOptions.timeout *= 2;
return beforeAllHook(done);
}
err = subErr;
stdout = subStdout;
stderr = subStderr;
let firstLine;
[firstLine, errStack] = stderr.split(':\n');
[errDate, errOrigin] = firstLine.split(': ');
done();
},
);
});
if (exitCode === null) {
it('should not be an error (or timeout)',
() => assert.ifError(err));
it('should have stdout (printed after uncaught exception)',
() => assert.match(stdout,
/^.*EXECUTED AFTER UNCAUGHT EXCEPTION(?:.|\n)*$/));
} else {
it('should be an error',
() => assert.ok(err));
it(`should have exitCode ${exitCode || 1}`,
() => assert.strictEqual(err.code, exitCode || 1));
it('should have empty stdout',
() => assert.strictEqual(stdout, ''));
}
it('should have stderr',
() => assert.ok(stderr));
it('should have date in stderr first line',
() => (date
? assert.strictEqual(errDate, date)
: assert.match(errDate, defaultDateRegex)));
it('should have origin in stderr first line',
() => (promise === true
? assert.strictEqual(errOrigin, 'unhandledRejection')
: assert.strictEqual(errOrigin, 'uncaughtException')));
if (!promise) {
it('should have stack trace on stderr',
() => assert.match(errStack, errStackRegex));
}
}));
});
describe('catchAndTimestampWarning (also tests node onWarning)', () => {
[
{ desc: 'with default date' },
{ desc: 'with custom date', date: customDate() },
{ desc: 'with deprecation warning', name: 'DeprecationWarning' },
{
desc: 'with custom warning',
name: 'CUSTOM',
code: 'TEST01',
detail: 'Some additional detail',
},
].forEach(({
desc, date, name, code, detail,
}) => describe(desc, () => {
/** Manual retry counter; the before-all hook does not support this.retries() */
let retries = 4;
let err;
let stdout;
let stderr;
before('run process catchWarning', function beforeAllHook(done) {
execFile(
'./fixtures/stderrUtils/catchWarning.js',
[`${date}`, `${name}`, `${code}`, `${detail}`],
execOptions,
(subErr, subStdout, subStderr) => {
if (subErr?.killed) {
retries--;
if (retries <= 0) {
assert.fail(timeoutHint(execOptions.timeout, retries));
}
execOptions.timeout *= 2;
return beforeAllHook(done);
}
err = subErr;
stdout = subStdout;
stderr = subStderr;
done();
},
);
});
it('should not be an error (or timeout)',
() => assert.ifError(err));
it('should have empty stdout',
() => assert.strictEqual(stdout, ''));
it('should have stderr',
() => assert.ok(stderr));
it('should have message on stderr first line, then stack trace',
() => assert.match(stderr,
/^.*TestWarningMessage\n(?:\s+at\s.*\n)+/));
if (code) {
it('should have code on stderr first line',
() => assert.match(stderr, new RegExp(`^.*\\[${code}\\]`)));
}
if (name) {
it('should have name on stderr first line',
() => assert.match(stderr, new RegExp(`^.*${name}:`)));
}
if (detail) {
it('should have detail on stderr',
() => assert.match(stderr, new RegExp(`.*${detail}.*`)));
}
it(`should have ${date ? 'custom' : 'default'} date on stderr`,
() => assert.match(stderr, new RegExp(
`\nAbove Warning Date: ${
date || defaultDateRegex.source}\n`,
)));
}));
});
describe('catchAndTimestampStderr', () => {
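// The catchStderr fixture emits a warning and then throws, so both the warning
// and the uncaught exception output are asserted below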
[
{ desc: 'with default date' },
{ desc: 'with custom date', date: customDate() },
{ desc: 'with exit code', exitCode: 42 },
].forEach(({
desc, date, exitCode,
}) => describe(desc, () => {
/** Manual retry counter; the before-all hook does not support this.retries() */
let retries = 4;
let err;
let stdout;
let stderr;
before('run process catchStderr', function beforeAllHook(done) {
execFile(
'./fixtures/stderrUtils/catchStderr.js',
[`${date}`, `${exitCode}`],
execOptions,
(subErr, subStdout, subStderr) => {
if (subErr?.killed) {
retries--;
if (retries <= 0) {
assert.fail(timeoutHint(execOptions.timeout, retries));
}
execOptions.timeout *= 2;
return beforeAllHook(done);
}
err = subErr;
stdout = subStdout;
stderr = subStderr;
done();
},
);
});
it('should be an error',
() => assert.ok(err));
it(`should have exitCode ${exitCode || 1}`,
() => assert.strictEqual(err.code, exitCode || 1));
it('should have empty stdout',
() => assert.strictEqual(stdout, ''));
it('should have stderr',
() => assert.ok(stderr));
// 2024-06-26T15:04:55.364Z: uncaughtException:
// Error: TestingError
// at Object.<anonymous> (catchStderr.js:16:7)
// at node:internal/main/run_main_module:22:47
it('should have error date, origin and stacktrace in stderr',
() => assert.match(stderr,
new RegExp(`${date || defaultDateRegex.source
}: uncaughtException:\n${errStackRegex.source}`)));
// (node:171245) Warning: TestWarningMessage
// at Object.<anonymous> (catchStderr.js:14:9)
// at node:internal/main/run_main_module:22:47
// Above Warning Date: 2024-06-26T15:04:55.365Z
it('should have warning with stacktrace in stderr', () => {
const trace = 'Warning: TestWarningMessage\n(?:\\s+at\\s.*\n)+';
const detail = `(?:.|\n)*?(?<=\n)Above Warning Date: ${
date || defaultDateRegex.source}\n`;
assert.match(stderr,
new RegExp(`${trace}${detail}`));
});
}));
});
});