---
url: 'https://loglayer.dev/introduction.md'
description: Learn more about LogLayer and how it unifies your logging experience
---

# Introduction

`loglayer` is a unified logger that routes logs to various logging libraries, cloud providers, and OpenTelemetry while providing a fluent API for specifying log messages, metadata and errors, enhancing and standardizing the developer experience around writing logs.

## Why LogLayer?

Choosing, using, and maintaining the right logger across projects is a common challenge. While most loggers offer the usual methods like `info`, `warn`, and `error`, they vary significantly in how they handle structured metadata or `Error` objects. This can lead to ad-hoc solutions, like serializing errors or writing custom pipelines, just to get logs formatted correctly.

LogLayer was built to address these pain points by introducing a fluent, expressive API. With methods like `withMetadata` and `withError`, **LogLayer separates object injection from the log message itself, making logging code both cleaner and more maintainable.**

Logs are processed through a LogLayer Transport, which acts as an adapter for the preferred logging library. This design offers several key advantages:

* **Multi-Transport Support**: Send logs to multiple destinations (e.g., [DataDog](/transports/datadog) and [New Relic](/transports/new-relic)) simultaneously. This can be used to ship logs directly to DataDog without relying on their APM package or sidecars.
* **Easy Logger Swapping**: For example, Pino does not work out of the box with Next.js production builds without webpack hacks. With LogLayer, you can swap in a better-suited library without touching your logging code.

## Battle Tested

LogLayer has been in production use for at least three years at [Airtop.ai](https://airtop.ai) (formerly Switchboard) in multiple backend and frontend systems.
*LogLayer is not affiliated with Airtop.*

## Bring Your Own Logger

LogLayer is designed to sit on top of your logging library (or libraries) of choice, such as `pino`, `winston`, `bunyan`, and more.

Learn more about logging [transports](/transports/).

## Consistent API

No need to remember different parameter orders or method names between logging libraries:

```typescript
// With loglayer - consistent API regardless of logging library
log.withMetadata({ some: 'data' }).info('my message')

// Without loglayer - different APIs for different libraries
winston.info('my message', { some: 'data' })  // winston
bunyan.info({ some: 'data' }, 'my message')   // bunyan
```

Start with [basic logging](/logging-api/basic-logging).

## Standardized Error Handling

`loglayer` provides consistent error handling across all logging libraries:

```typescript
// Error handling works the same way regardless of logging library
log.withError(new Error('test')).error('Operation failed')
```

See more about [error handling](/logging-api/error-handling).

## Powerful Plugin System

Extend functionality with plugins:

```typescript
const log = new LogLayer({
  plugins: [{
    onBeforeDataOut: (params) => {
      // Redact sensitive information before logging
      if (params.data?.password) {
        params.data.password = '***'
      }
      return params.data
    }
  }]
})
```

See more about using and creating [plugins](/plugins/).
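Plugin logic is easiest to reuse and unit test when the transformation lives in a plain function. A minimal sketch of that pattern (the `redactFields` helper and `redactionPlugin` names are hypothetical, not part of LogLayer; only the `onBeforeDataOut` hook shown above is assumed):

```typescript
// Hypothetical helper: masks the named top-level fields if present.
function redactFields(
  data: Record<string, unknown> | undefined,
  fields: string[],
): Record<string, unknown> | undefined {
  if (!data) return data
  const out = { ...data }
  for (const field of fields) {
    if (field in out) out[field] = '***'
  }
  return out
}

// Wiring it into a plugin object, using the same hook as the example above.
const redactionPlugin = {
  onBeforeDataOut: (params: { data?: Record<string, unknown> }) =>
    redactFields(params.data, ['password', 'apiKey']),
}
```

The plugin object can then be passed in the `plugins` array exactly as in the example above, and the pure helper can be tested without constructing a logger.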
## Multiple Logger Support

Send your logs to multiple destinations simultaneously:

```typescript
import { LogLayer } from 'loglayer'
import { PinoTransport } from "@loglayer/transport-pino"
import { DatadogBrowserLogsTransport } from "@loglayer/transport-datadog-browser-logs"
import { datadogLogs } from '@datadog/browser-logs'
import pino from 'pino'

// Initialize Datadog
datadogLogs.init({
  clientToken: '',
  site: '',
  forwardErrorsToLogs: true,
})

const log = new LogLayer({
  transport: [
    new PinoTransport({
      logger: pino()
    }),
    new DatadogBrowserLogsTransport({
      id: "datadog",
      logger: datadogLogs
    })
  ]
})

// Logs will be sent to both Pino and Datadog
log.info('User logged in successfully')
```

See more about [multi-transport support](/transports/multiple-transports).

## Easy Testing

Built-in mocks make testing a breeze:

```typescript
import { MockLogLayer } from 'loglayer'

// Use MockLogLayer in your tests - no real logging will occur
const log = new MockLogLayer()
```

See more about [testing](/logging-api/unit-testing).

---

---
url: 'https://loglayer.dev/transports.md'
description: Logging libraries that you can use with LogLayer
---

# Transports

Transports are the way LogLayer sends logs to a logging library.

## Available Transports

---

---
url: 'https://loglayer.dev/transports/aws-lambda-powertools.md'
description: Logging for AWS Lambdas with the LogLayer logging library
---

# AWS Lambda Powertools Transport

[![NPM Version](https://img.shields.io/npm/v/%40loglayer%2Ftransport-aws-lambda-powertools)](https://www.npmjs.com/package/@loglayer/transport-aws-lambda-powertools)

A LogLayer transport for [AWS Lambda Powertools Logger](https://docs.powertools.aws.dev/lambda/typescript/latest/core/logger/).
[Transport Source](https://github.com/loglayer/loglayer/tree/master/packages/transports/aws-lambda-powertools)

## Installation

Install the required packages:

::: code-group

```sh [npm]
npm i loglayer @loglayer/transport-aws-lambda-powertools @aws-lambda-powertools/logger
```

```sh [pnpm]
pnpm add loglayer @loglayer/transport-aws-lambda-powertools @aws-lambda-powertools/logger
```

```sh [yarn]
yarn add loglayer @loglayer/transport-aws-lambda-powertools @aws-lambda-powertools/logger
```

:::

## Setup

::: warning
The Logger utility from `@aws-lambda-powertools/logger` must always be instantiated outside the Lambda handler.
:::

```typescript
import { Logger } from '@aws-lambda-powertools/logger';
import { LogLayer } from 'loglayer';
import { PowertoolsTransport } from '@loglayer/transport-aws-lambda-powertools';

// Create a new Powertools logger instance
const powertoolsLogger = new Logger({
  serviceName: 'my-service',
  logLevel: 'INFO'
});

// Create LogLayer instance with Powertools transport
const log = new LogLayer({
  transport: new PowertoolsTransport({
    logger: powertoolsLogger
  })
});

// Use LogLayer as normal
log.withMetadata({ customField: 'value' }).info('Hello from Lambda!');
```

## Log Level Mapping

| LogLayer | Powertools |
|----------|------------|
| trace    | DEBUG      |
| debug    | DEBUG      |
| info     | INFO       |
| warn     | WARN       |
| error    | ERROR      |
| fatal    | ERROR      |

## Changelog

View the changelog [here](./changelogs/aws-lambda-powertools-changelog.md).

---

---
url: 'https://loglayer.dev/transports/axiom.md'
description: Send logs to Axiom cloud logging platform with the LogLayer logging library
---

# Axiom Transport

[![NPM Version](https://img.shields.io/npm/v/%40loglayer%2Ftransport-axiom)](https://www.npmjs.com/package/@loglayer/transport-axiom)

[Transport Source](https://github.com/loglayer/loglayer/blob/master/packages/transports/axiom)

The Axiom transport allows you to send logs to [Axiom.co](https://axiom.co), a cloud-native logging and observability platform.
It uses the [Axiom JavaScript SDK](https://github.com/axiomhq/axiom-js).

## Installation

::: code-group

```sh [npm]
npm install @loglayer/transport-axiom @axiomhq/js serialize-error loglayer
```

```sh [pnpm]
pnpm add @loglayer/transport-axiom @axiomhq/js serialize-error loglayer
```

```sh [yarn]
yarn add @loglayer/transport-axiom @axiomhq/js serialize-error loglayer
```

:::

## Usage

```typescript
import { LogLayer } from "loglayer";
import { AxiomTransport } from "@loglayer/transport-axiom";
import { serializeError } from "serialize-error";
import { Axiom } from "@axiomhq/js";

// Create the Axiom client
const axiom = new Axiom({
  token: process.env.AXIOM_TOKEN,
  // Optional: other Axiom client options
  // orgId: 'your-org-id',
  // url: 'https://cloud.axiom.co',
});

// Create the LogLayer instance with AxiomTransport
const logger = new LogLayer({
  errorSerializer: serializeError,
  transport: new AxiomTransport({
    logger: axiom,
    dataset: "your-dataset",
  }),
});

// Start logging
logger.info("Hello from LogLayer!");
```

## Configuration Options

### Required Parameters

| Option | Type | Description |
|--------|------|-------------|
| `logger` | `Axiom` | Instance of the Axiom client |
| `dataset` | `string` | The Axiom dataset name to send logs to |

### Optional Parameters

| Option | Type | Description | Default |
|--------|------|-------------|---------|
| `fieldNames` | `AxiomFieldNames` | Custom field names for log entries | See [Field Names](#field-names) |
| `timestampFn` | `() => string \| number` | Function to generate timestamps | `() => new Date().toISOString()` |
| `onError` | `(error: Error) => void` | Callback for error handling | `undefined` |
| `level` | `"trace" \| "debug" \| "info" \| "warn" \| "error" \| "fatal"` | Minimum log level to process | `"trace"` |
| `levelMap` | `AxiomLevelMap` | Custom mapping for log levels | `undefined` |

### Field Names

The `fieldNames` object allows you to customize the field names in the log entry JSON:

| Field | Type | Description | Default |
|-------|------|-------------|---------|
| `level` | `string` | Field name for the log level | `"level"` |
| `message` | `string` | Field name for the log message | `"message"` |
| `timestamp` | `string` | Field name for the timestamp | `"timestamp"` |

### Level Mapping

The `levelMap` object allows you to map each log level to either a string or number:

| Level | Type | Example (Numeric) | Example (String) |
|-------|------|-------------------|------------------|
| `debug` | `string \| number` | 20 | `"DEBUG"` |
| `error` | `string \| number` | 50 | `"ERROR"` |
| `fatal` | `string \| number` | 60 | `"FATAL"` |
| `info` | `string \| number` | 30 | `"INFO"` |
| `trace` | `string \| number` | 10 | `"TRACE"` |
| `warn` | `string \| number` | 40 | `"WARNING"` |

## Log Format

Each log entry is written as a JSON object with the following format:

```json5
{
  "level": "info",
  "message": "Log message",
  "timestamp": "2024-01-17T12:34:56.789Z",
  // metadata / context / error data will depend on your LogLayer configuration
  "userId": "123",
  "requestId": "abc-123"
}
```

## Log Level Filtering

You can set a minimum log level to filter out less important logs:

```typescript
const logger = new LogLayer({
  transport: new AxiomTransport({
    logger: axiom,
    dataset: "your-dataset",
    level: "warn", // Only process warn, error, and fatal logs
  }),
});

logger.debug("This won't be sent");  // Filtered out
logger.info("This won't be sent");   // Filtered out
logger.warn("This will be sent");    // Included
logger.error("This will be sent");   // Included
```

## Error Handling

The transport provides error handling through the `onError` callback:

```typescript
const logger = new LogLayer({
  transport: new AxiomTransport({
    logger: axiom,
    dataset: "your-dataset",
    onError: (error) => {
      // Custom error handling
      console.error("Failed to send log to Axiom:", error);
    },
  }),
});
```

## Changelog

View the changelog [here](./changelogs/axiom-changelog.md).
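Putting the optional parameters together, here is a sketch of a transport configured with custom field names and numeric level mapping. The dataset name and the chosen values are placeholders; option behavior is as described in the Field Names and Level Mapping tables above.

```typescript
import { LogLayer } from "loglayer";
import { AxiomTransport } from "@loglayer/transport-axiom";
import { Axiom } from "@axiomhq/js";

const axiom = new Axiom({ token: process.env.AXIOM_TOKEN });

const logger = new LogLayer({
  transport: new AxiomTransport({
    logger: axiom,
    dataset: "your-dataset",
    // Rename the default fields in each log entry
    fieldNames: { level: "severity", message: "msg", timestamp: "ts" },
    // Emit numeric levels instead of strings
    levelMap: { trace: 10, debug: 20, info: 30, warn: 40, error: 50, fatal: 60 },
  }),
});

logger.info("Mapped and renamed");
```

With this configuration, per the tables above, an `info` entry would carry `severity: 30` and place the message under `msg` and the timestamp under `ts`.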
---

---
url: 'https://loglayer.dev/transports/bunyan.md'
description: Send logs to Bunyan with the LogLayer logging library
---

# Bunyan Transport

[![NPM Version](https://img.shields.io/npm/v/%40loglayer%2Ftransport-bunyan)](https://www.npmjs.com/package/@loglayer/transport-bunyan)

[Bunyan](https://github.com/trentm/node-bunyan) is a JSON logging library for Node.js services.

[Transport Source](https://github.com/loglayer/loglayer/tree/master/packages/transports/bunyan)

## Installation

Install the required packages:

::: code-group

```sh [npm]
npm i loglayer @loglayer/transport-bunyan bunyan
```

```sh [pnpm]
pnpm add loglayer @loglayer/transport-bunyan bunyan
```

```sh [yarn]
yarn add loglayer @loglayer/transport-bunyan bunyan
```

:::

## Setup

```typescript
import bunyan from 'bunyan'
import { LogLayer } from 'loglayer'
import { BunyanTransport } from "@loglayer/transport-bunyan"

const b = bunyan.createLogger({
  name: "my-logger",
  level: "trace", // Show all log levels
  serializers: {
    err: bunyan.stdSerializers.err // Use Bunyan's error serializer
  }
})

const log = new LogLayer({
  errorFieldName: "err", // Match Bunyan's error field name
  transport: new BunyanTransport({
    logger: b
  })
})
```

## Log Level Mapping

| LogLayer | Bunyan |
|----------|--------|
| trace    | trace  |
| debug    | debug  |
| info     | info   |
| warn     | warn   |
| error    | error  |
| fatal    | fatal  |

## Changelog

View the changelog [here](./changelogs/bunyan-changelog.md).

---

---
url: 'https://loglayer.dev/logging-api/child-loggers.md'
description: Learn how to create child loggers in LogLayer
---

# Child Loggers

Child loggers allow you to create new logger instances that inherit configuration, context, and plugins from their parent logger. This is particularly useful for creating loggers with additional context for specific components or modules while maintaining the base configuration.
## Creating Child Loggers

Use the `child()` method to create a child logger:

```typescript
const parentLog = new LogLayer({
  transport: new ConsoleTransport({ logger: console })
})

const childLog = parentLog.child()
```

## Inheritance Behavior

Child loggers inherit:

1. Configuration from the parent
2. Context data (as a shallow copy by default, or shared reference if configured)
3. Plugins

### Configuration Inheritance

All configuration options are inherited from the parent:

```typescript
const parentLog = new LogLayer({
  transport: new ConsoleTransport({ logger: console }),
  contextFieldName: 'context',
  metadataFieldName: 'metadata',
  errorFieldName: 'error'
})

// Child inherits all configuration
const childLog = parentLog.child()
```

### Context Inheritance

By default, context data is shallow copied from the parent:

```typescript
const parentLog = new LogLayer({}).withContext({
  app: 'myapp',
  version: '1.0.0'
})

// Child inherits parent's context
const childLog = parentLog.child()
childLog.info('Hello')
// Output includes: { app: 'myapp', version: '1.0.0' }

// Add additional context to child
childLog.withContext({ module: 'users' })
childLog.info('User created')
// Output includes: { app: 'myapp', version: '1.0.0', module: 'users' }

// Parent's context remains unchanged
parentLog.info('Parent log')
// Output includes: { app: 'myapp', version: '1.0.0' }
```

---

---
url: 'https://loglayer.dev/transports/configuration.md'
description: Learn how to configure LogLayer transports
---

# Transport Configuration

All LogLayer transports share a common set of configuration options that control their behavior. These options are passed to the transport constructor when creating a new transport instance.

## Common Configuration Options

```typescript
interface TransportConfig {
  /**
   * A unique identifier for the transport. If not provided, a random ID will
   * be generated. This is used if you need to call getLoggerInstance() on
   * the LogLayer instance.
   */
  id?: string;

  /**
   * If false, the transport will not send any logs to the logger.
   * Useful for temporarily disabling a transport.
   * @default true
   */
  enabled?: boolean;

  /**
   * If true, the transport will also log messages to the console.
   * Useful for debugging transport behavior.
   * @default false
   */
  consoleDebug?: boolean;
}
```

## Example Usage

Here's an example of configuring a transport with common options:

```typescript
import { LogLayer } from 'loglayer'
import { PinoTransport } from "@loglayer/transport-pino"
import pino from 'pino'

const pinoLogger = pino()

const transport = new PinoTransport({
  // Custom identifier for the transport
  id: 'main-pino-transport',
  // Your configured logger instance
  logger: pinoLogger,
  // Disable the transport temporarily
  enabled: process.env.NODE_ENV !== 'test',
  // Enable console debugging
  consoleDebug: process.env.DEBUG === 'true'
})

const log = new LogLayer({
  transport
})
```

---

---
url: 'https://loglayer.dev/configuration.md'
description: Learn how to configure LogLayer to customize its behavior
---

# Configuration

LogLayer can be configured with various options to customize its behavior. Here's a comprehensive guide to all available configuration options.

## Basic Configuration

When creating a new LogLayer instance, you can pass a configuration object:

```typescript
import { LogLayer, ConsoleTransport } from 'loglayer'

const log = new LogLayer({
  transport: new ConsoleTransport({
    logger: console,
  }),
  // ... other options
})
```

## Configuration Options

### Transport Configuration

The `transport` option is the only required configuration. It specifies which logging library to use:

```typescript
{
  // Can be a single transport or an array of transports
  transport: new ConsoleTransport({
    logger: console,
  })
}
```

You can also pass an array of transports to the `transport` option. This is useful if you want to send logs to multiple destinations.
```typescript
{
  transport: [
    new ConsoleTransport({ logger: console }),
    new DatadogBrowserLogsTransport({ logger: datadogBrowserLogs })
  ],
}
```

For more transport options, see the [Transport Configuration](./transports/configuration) section.

### Message Prefixing

You can add a prefix to all log messages:

```typescript
{
  // Will prepend "[MyApp]" to all log messages
  prefix: '[MyApp]'
}
```

### Logging Control

Control whether logging is enabled:

```typescript
{
  // Set to false to disable all logging (default: true)
  enabled: true
}
```

You can also enable/disable logging programmatically:

```typescript
log.enableLogging()  // Enable logging
log.disableLogging() // Disable logging
```

### Debugging

If you're implementing a transport, you can set the `consoleDebug` option to `true` to output to the console before sending to the logging library:

```typescript
{
  // Useful for debugging - will output to console before sending to logging library
  consoleDebug: true
}
```

This is useful when:

* Debugging why logs aren't appearing in your logging library
* Verifying the data being sent to the logging library
* Testing log formatting

### Error Handling Configuration

Configure how errors are handled and serialized:

```typescript
{
  // Function to transform Error objects (useful if logging library doesn't handle errors well)
  errorSerializer: (err) => ({ message: err.message, stack: err.stack }),
  // Field name for errors (default: 'err')
  errorFieldName: 'err',
  // Copy error.message to log message when using errorOnly() (default: false)
  copyMsgOnOnlyError: true,
  // Include error in metadata instead of root level (default: false)
  errorFieldInMetadata: false
}
```

### Data Structure Configuration

::: tip
See [error handling configuration](#error-handling-configuration) for configuring the error field name and placement.
:::

Control how context and metadata are structured in log output:

```typescript
{
  // Put context data in a specific field (default: flattened)
  contextFieldName: 'context',
  // Put metadata in a specific field (default: flattened)
  metadataFieldName: 'metadata',
  // Disable context/metadata in log output
  muteContext: false,
  muteMetadata: false
}
```

Example output with field names configured:

```json
{
  "level": 30,
  "time": 1638138422796,
  "msg": "User logged in",
  "context": { "requestId": "123" },
  "metadata": { "userId": "456" }
}
```

Example output with flattened fields (default):

```json
{
  "level": 30,
  "time": 1638138422796,
  "msg": "User logged in",
  "requestId": "123",
  "userId": "456"
}
```

### Plugin System

Plugins are used to modify logging behavior. See the [Plugins](./plugins/index) section for more information.

## Complete Configuration Example

Here's an example showing all configuration options:

```typescript
const log = new LogLayer({
  // Required: Transport configuration
  transport: new ConsoleTransport({
    logger: console,
  }),

  // Optional configurations
  prefix: '[MyApp]',
  enabled: true,
  consoleDebug: false,

  // Error handling
  errorSerializer: (err) => ({ message: err.message, stack: err.stack }),
  errorFieldName: 'error',
  copyMsgOnOnlyError: true,
  errorFieldInMetadata: false,

  // Data structure
  contextFieldName: 'context',
  metadataFieldName: 'metadata',
  muteContext: false,
  muteMetadata: false,

  // Plugins
  plugins: [
    {
      id: 'timestamp-plugin',
      onBeforeDataOut: ({ data }) => {
        if (data) {
          data.timestamp = Date.now()
        }
        return data
      }
    }
  ]
})
```

---

---
url: 'https://loglayer.dev/transports/consola.md'
description: Send logs to Consola with the LogLayer logging library
---

# Consola Transport

[![NPM Version](https://img.shields.io/npm/v/%40loglayer%2Ftransport-consola)](https://www.npmjs.com/package/@loglayer/transport-consola)

[Consola](https://github.com/unjs/consola) is an elegant and configurable console logger for Node.js and the browser.
[Transport Source](https://github.com/loglayer/loglayer/tree/master/packages/transports/consola)

## Important Notes

* The default log level is `3`, which excludes `debug` and `trace`
* Set the level to `5` to enable all log levels
* Consola provides additional methods not available through LogLayer

## Installation

Install the required packages:

::: code-group

```sh [npm]
npm i loglayer @loglayer/transport-consola consola
```

```sh [pnpm]
pnpm add loglayer @loglayer/transport-consola consola
```

```sh [yarn]
yarn add loglayer @loglayer/transport-consola consola
```

:::

## Setup

```typescript
import { createConsola } from 'consola'
import { LogLayer } from 'loglayer'
import { ConsolaTransport } from "@loglayer/transport-consola"

const log = new LogLayer({
  transport: new ConsolaTransport({
    logger: createConsola({
      level: 5 // Enable all log levels
    })
  })
})
```

## Log Level Mapping

| LogLayer | Consola |
|----------|---------|
| trace    | trace   |
| debug    | debug   |
| info     | info    |
| warn     | warn    |
| error    | error   |
| fatal    | fatal   |

## Changelog

View the changelog [here](./changelogs/consola-changelog.md).

---

---
url: 'https://loglayer.dev/transports/console.md'
description: Send logs to the console with the LogLayer logging library
---

# Console Transport

The simplest integration is with the built-in `console` object, which is available in both Node.js and browser environments.
[Transport Source](https://github.com/loglayer/loglayer/blob/master/packages/core/loglayer/src/transports/ConsoleTransport.ts)

## Installation

No additional packages needed beyond the core `loglayer` package:

::: code-group

```sh [npm]
npm i loglayer
```

```sh [pnpm]
pnpm add loglayer
```

```sh [yarn]
yarn add loglayer
```

:::

## Setup

```typescript
import { LogLayer, ConsoleTransport } from 'loglayer'

const log = new LogLayer({
  transport: new ConsoleTransport({
    logger: console,
    // Optional: control where object data appears in log messages
    appendObjectData: false // default: false - object data appears first
  })
})
```

## Configuration Options

### `level`

Sets the minimum log level to process. Messages with a lower priority level will be ignored.

* Type: `"trace" | "debug" | "info" | "warn" | "error" | "fatal"`
* Default: `"trace"` (processes all log levels)

Example with minimum level set to `"info"`:

```typescript
const log = new LogLayer({
  transport: new ConsoleTransport({
    logger: console,
    level: "info" // Will only process info, warn, error, and fatal logs
  })
});

log.debug('This message will be ignored');
log.info('This message will be logged');
```

### `appendObjectData`

Controls where object data (metadata, context, errors) appears in the log messages:

* `false` (default): Object data appears as the first parameter
* `true`: Object data appears as the last parameter

*Has no effect if `messageField` is defined.*

Example with `appendObjectData: false` (default):

```typescript
log.withMetadata({ user: 'john' }).info('User logged in');
// console.info({ user: 'john' }, 'User logged in')
```

Example with `appendObjectData: true`:

```typescript
log.withMetadata({ user: 'john' }).info('User logged in');
// console.info('User logged in', { user: 'john' })
```

### `messageField`

If defined, this option will:

* Place the message into the specified field in the log object
* Join multi-parameter messages with a space (use the sprintf plugin for formatted messages)
* Only log the object to the console

Example with `messageField: 'msg'`:

```typescript
const log = new LogLayer({
  transport: new ConsoleTransport({
    logger: console,
    messageField: 'msg'
  })
});

log.withMetadata({ user: 'john' }).info('User logged in', 'successfully');
// console.info({ user: 'john', msg: 'User logged in successfully' })
```

## Log Level Mapping

| LogLayer | Console |
|----------|---------|
| trace    | debug   |
| debug    | debug   |
| info     | info    |
| warn     | warn    |
| error    | error   |
| fatal    | error   |

---

---
url: 'https://loglayer.dev/logging-api/context.md'
description: Learn how to create logs with context data in LogLayer
---

# Logging with Context

Context allows you to add persistent data that will be included with every log message. This is particularly useful for adding request IDs, user information, or any other data that should be present across multiple log entries.

::: info Message field name
The output examples use `msg` as the message field. The name of this field may vary depending on the logging library you are using. In the `console` logger, this field does not exist, and the message is printed directly.
:::

## Adding Context

Use the `withContext` method to add context data:

```typescript
log.withContext({
  requestId: '123',
  userId: 'user_456'
})

// Context will be included in all subsequent log messages
log.info('Processing request')
log.warn('User quota exceeded')
```

By default, context data is flattened into the root of the log object:

```json
{
  "msg": "Processing request",
  "requestId": "123",
  "userId": "user_456"
}
```

::: warning Clearing context
Passing an empty value (`null`, `undefined`, or an empty object) to `withContext` will *not* clear the context; it does nothing. Use the `clearContext()` method to remove all context data.
:::

## Structuring Context

### Using a Dedicated Context Field

You can configure LogLayer to place context data in a dedicated field:

```typescript
const log = new LogLayer({
  contextFieldName: 'context',
})

log.withContext({
  requestId: '123',
  userId: 'user_456'
}).info('Processing request')
```

This produces:

```json
{
  "msg": "Processing request",
  "context": {
    "requestId": "123",
    "userId": "user_456"
  }
}
```

### Combining Context and Metadata Fields

If you set the same field name for both context and metadata, they will be merged:

```typescript
const log = new LogLayer({
  contextFieldName: 'data',
  metadataFieldName: 'data',
})

log.withContext({ requestId: '123' })
  .withMetadata({ duration: 1500 })
  .info('Request completed')
```

This produces:

```json
{
  "msg": "Request completed",
  "data": {
    "requestId": "123",
    "duration": 1500
  }
}
```

## Managing Context

### Getting Current Context

You can retrieve the current context data:

```typescript
log.withContext({ requestId: '123' })

const context = log.getContext()
// Returns: { requestId: '123' }
```

### Clearing Context

You can clear context data:

```typescript
log.clearContext()
```

### Muting Context

You can temporarily disable context logging:

```typescript
// Via configuration
const log = new LogLayer({
  muteContext: true,
})

// Or via methods
log.muteContext()   // Disable context
log.unMuteContext() // Re-enable context
```

This is useful for development or troubleshooting when you want to reduce log verbosity.
## Combining Context with Other Features

### With Errors

Context data is included when logging errors:

```typescript
log.withContext({ requestId: '123' })
  .withError(new Error('Not found'))
  .error('Failed to fetch user')
```

### With Metadata

Context can be combined with per-message metadata:

```typescript
log.withContext({ requestId: '123' })
  .withMetadata({ userId: 'user_456' })
  .info('User logged in')
```

---

---
url: 'https://loglayer.dev/context-managers.md'
description: Learn how to create and use context managers with LogLayer
---

# Context Managers

*New in LogLayer v6*. Context managers in LogLayer are responsible for managing contextual data that gets included with log entries. They provide a way to store and retrieve context data that will be automatically included with every log message.

::: tip Do you need to specify a context manager?
Context managers are an advanced feature of LogLayer. Unless you need to manage context data in a specific way, you can use the default context manager, which is used automatically when creating a new LogLayer instance.
:::

## Context Manager Management

### Using a custom context manager

You can set a custom context manager using the `withContextManager()` method. Example usage:

```typescript
import { MyCustomContextManager } from './MyCustomContextManager';

const logger = new LogLayer()
  .withContextManager(new MyCustomContextManager());
```

::: tip
Use the `withContextManager()` method right after creating the LogLayer instance. Using it after the context has already been set will drop the existing context data.
:::

### Obtaining the current context manager

You can get the current context manager instance using the `getContextManager()` method:

```typescript
const contextManager = logger.getContextManager();
```

You can also type the return value when getting a specific context manager implementation:

```typescript
const linkedContextManager = logger.getContextManager<LinkedContextManager>();
```

---

---
url: 'https://loglayer.dev/context-managers/creating-context-managers.md'
description: Learn how to create a custom context manager for LogLayer
---

# Creating Context Managers

::: warning Using async libraries
LogLayer is a synchronous library, so context managers must perform synchronous operations only. Integrations that use promises, callbacks, or other asynchronous patterns to set and fetch context data are not supported or recommended, unless you are making those calls out-of-band for other reasons.
:::

## The IContextManager Interface

To create a custom context manager, you'll first need to install the base package:

::: code-group

```bash [npm]
npm install @loglayer/context-manager
```

```bash [yarn]
yarn add @loglayer/context-manager
```

```bash [pnpm]
pnpm add @loglayer/context-manager
```

:::

Then implement the `IContextManager` interface:

```typescript
import type { IContextManager, ILogLayer } from '@loglayer/context-manager';

interface OnChildLoggerCreatedParams {
  /**
   * The parent logger instance
   */
  parentLogger: ILogLayer;
  /**
   * The child logger instance
   */
  childLogger: ILogLayer;
  /**
   * The parent logger's context manager
   */
  parentContextManager: IContextManager;
  /**
   * The child logger's context manager
   */
  childContextManager: IContextManager;
}

interface IContextManager {
  // Sets the context data. Set to undefined to clear the context.
  setContext(context?: Record<string, any>): void;

  // Appends context data to existing context
  appendContext(context: Record<string, any>): void;

  // Returns the current context data
  getContext(): Record<string, any>;

  // Returns true if there is context data present
  hasContextData(): boolean;

  // Called when a child logger is created
  onChildLoggerCreated(params: OnChildLoggerCreatedParams): void;

  // Creates a new instance with the same context data
  clone(): IContextManager;
}
```

## Context Manager Lifecycle

When using a context manager with a LogLayer logger instance:

* When the logger is first created, the [Default Context Manager](/context-managers/default) is automatically attached to it
* A context manager is attached to a logger using [`withContextManager()`](/context-managers/#using-a-custom-context-manager)
* If the context manager being replaced implements [`Disposable`](https://www.typescriptlang.org/docs/handbook/release-notes/typescript-5-2.html#using-declarations-and-explicit-resource-management), its dispose method will be called to clean up resources
* When `withContext()` is called on the logger, it calls `appendContext()` on the context manager
* When a child logger is created:
  * `clone()` is called on the parent's context manager and the cloned context manager is attached to the child logger
  * `onChildLoggerCreated()` is called on the parent's context manager
* When LogLayer needs to obtain context data, it first calls `hasContextData()` to check if context is present, then calls `getContext()` to get the context data if it is

## Resource Cleanup with Disposable

If your context manager needs to clean up resources (like file handles, memory, or external connections), you can implement the [`Disposable`](https://www.typescriptlang.org/docs/handbook/release-notes/typescript-5-2.html#using-declarations-and-explicit-resource-management) interface. LogLayer will automatically call the dispose method, if defined, when the context manager is replaced using `withContextManager()`.
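The `Disposable` contract itself is plain TypeScript 5.2+ and worth seeing in isolation before wiring it into a context manager. A standalone sketch (no LogLayer involved; `TempResource` is a hypothetical class, and native `Symbol.dispose` is assumed, i.e. Node 20.5+ or the TypeScript polyfill):

```typescript
// A minimal disposable resource: the [Symbol.dispose]() method is what
// LogLayer (or a `using` declaration) invokes to trigger cleanup.
class TempResource {
  closed = false;

  [Symbol.dispose](): void {
    if (this.closed) return; // guard against double-disposal
    this.closed = true;
  }
}

const res = new TempResource();
res[Symbol.dispose](); // what a replaced context manager receives
```

The same shape, a disposed flag plus guarded methods, is what a full context manager implementation follows.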
### Implementing Disposable

To make your context manager disposable:

1. Add `Disposable` to your class implementation
2. Implement the `[Symbol.dispose]()` method
3. Add a flag to track the disposed state
4. Guard your methods against calls after disposal

Here's an example:

```typescript
export class MyContextManager implements IContextManager, Disposable {
  private isDisposed = false;
  private hasContext = false;
  private context: Record<string, any> = {};
  private someResource: any;

  // ... other methods ...

  hasContextData(): boolean {
    if (this.isDisposed) return false;
    return this.hasContext;
  }

  setContext(context?: Record<string, any>): void {
    if (this.isDisposed) return;
    // Implementation
  }

  getContext(): Record<string, any> {
    if (this.isDisposed) return {};
    return this.context;
  }

  [Symbol.dispose](): void {
    if (this.isDisposed) return;

    // Clean up resources
    this.someResource?.close();
    this.context = {};
    this.isDisposed = true;
  }
}
```

:::tip
Always implement `Disposable` if your context manager holds onto resources that need cleanup. This ensures proper resource management and prevents memory leaks.
:::

## Example Implementation

Here's an example of a simple file-based context manager that saves context to a file.

::: warning Don't try this at home
This example is for educational purposes only; in actual usage it would have a significant performance impact and possible race conditions.
:::

```typescript
import { openSync, closeSync, readSync, writeSync, fstatSync } from 'node:fs';
import type { IContextManager, OnChildLoggerCreatedParams } from '@loglayer/context-manager';

/**
 * Example context manager that persists context to a file using a file handle.
 * Implements both IContextManager for context management and Disposable for cleanup.
 *
 * This example demonstrates proper resource cleanup by maintaining an open file handle
 * that needs to be properly closed when the context manager is disposed.
 */
export class FileContextManager implements IContextManager, Disposable {
  // In-memory storage of context data
  private context: Record<string, any> = {};
  // Flag to track if we have any context data
  private hasContext = false;
  // Path to the file where context is persisted
  private filePath: string;
  // File handle for persistent storage
  private fileHandle: number | null = null;
  // Flag to track if this manager has been disposed
  private isDisposed = false;

  constructor(filePath: string) {
    this.filePath = filePath;

    // Open an existing file for reading and writing, or create it if it doesn't exist.
    // ('r+' is used rather than 'a+' because positioned writes are ignored in append mode.)
    try {
      try {
        this.fileHandle = openSync(filePath, 'r+');
      } catch {
        this.fileHandle = openSync(filePath, 'w+');
      }
      this.loadContext();
    } catch (err) {
      // Handle error gracefully - continue with empty context
      this.context = {};
      this.hasContext = false;
    }
  }

  /**
   * Loads context from the file system into memory using the file handle.
   * Called during initialization and after file changes.
   */
  private loadContext() {
    if (this.isDisposed || this.fileHandle === null) return;

    try {
      // Get file size
      const stats = fstatSync(this.fileHandle);

      if (stats.size === 0) {
        this.context = {};
        this.hasContext = false;
        return;
      }

      // Read entire file content
      const buffer = Buffer.alloc(stats.size);
      readSync(this.fileHandle, buffer, 0, stats.size, 0);

      // Parse content
      const data = buffer.toString('utf8');
      this.context = JSON.parse(data);
      this.hasContext = Object.keys(this.context).length > 0;
    } catch (err) {
      // Handle error gracefully - initialize empty context
      this.context = {};
      this.hasContext = false;
    }
  }

  /**
   * Persists the current in-memory context to the file system using the file handle.
   * Called after any context modifications.
   */
  private saveContext() {
    if (this.isDisposed || this.fileHandle === null) return;

    try {
      const data = JSON.stringify(this.context);
      const buffer = Buffer.from(data);
      // Overwrite the file from the start (note: stale bytes beyond the
      // new length are not truncated in this simplified example)
      writeSync(this.fileHandle, buffer, 0, buffer.length, 0);
    } catch (err) {
      // Handle error gracefully - continue with in-memory context
    }
  }

  /**
   * Sets the entire context, replacing any existing context.
   * Passing undefined clears the context.
   */
  setContext(context?: Record<string, any>): void {
    if (this.isDisposed) return;

    if (!context) {
      this.context = {};
      this.hasContext = false;
    } else {
      this.context = { ...context };
      this.hasContext = true;
    }

    this.saveContext();
  }

  /**
   * Merges new context data with existing context.
   * Any matching keys will be overwritten with new values.
   */
  appendContext(context: Record<string, any>): void {
    if (this.isDisposed) return;

    this.context = { ...this.context, ...context };
    this.hasContext = true;
    this.saveContext();
  }

  /**
   * Returns the current context data.
   * Returns empty object if disposed.
   */
  getContext(): Record<string, any> {
    if (this.isDisposed) return {};
    return this.context;
  }

  /**
   * Checks if there is any context data present.
   * Returns false if disposed.
   */
  hasContextData(): boolean {
    if (this.isDisposed) return false;
    return this.hasContext;
  }

  /**
   * Called when a child logger is created to handle context inheritance.
   * Copies parent context to child if parent has context data.
   */
  onChildLoggerCreated({ parentContextManager, childContextManager }: OnChildLoggerCreatedParams): void {
    if (this.isDisposed) return;

    // Copy parent context to child if parent has context
    if (parentContextManager.hasContextData()) {
      const parentContext = parentContextManager.getContext();
      childContextManager.setContext({ ...parentContext });
    }
  }

  /**
   * Creates a new instance with a copy of the current context.
   * Note: This implementation most likely has issues since the same file is being manipulated.
* This could potentially introduce a race condition when this method is called via child() */ clone(): IContextManager { return new FileContextManager(this.filePath); } /** * Implements the Disposable interface for cleanup. * Properly closes the file handle and cleans up memory resources. * This is critical to prevent file handle leaks in the operating system. */ [Symbol.dispose](): void { if (this.isDisposed) return; // Clean up in-memory resources this.context = {}; this.hasContext = false; // Close the file handle if it's open if (this.fileHandle !== null) { try { closeSync(this.fileHandle); } catch (err) { // Handle cleanup error gracefully } this.fileHandle = null; } this.isDisposed = true; } } ``` You can use this context manager like this: ```typescript import { LogLayer } from 'loglayer'; import { FileContextManager } from './FileContextManager'; // The context manager will maintain an open file handle until disposed const logger = new LogLayer() .withContextManager(new FileContextManager('./context.json')); logger.withContext({ user: 'alice' }); logger.info('User logged in'); // Will include { user: 'alice' } in context ``` ## Best Practices When implementing a context manager: * Make all operations synchronous * Handle errors gracefully without throwing exceptions * Implement proper cleanup in stateful context managers with `Disposable` --- --- url: 'https://loglayer.dev/transports/creating-transports.md' description: Learn how to create custom transports for LogLayer --- # Creating Transports To integrate a logging library with LogLayer, you must create a transport. A transport is a class that translates LogLayer's standardized logging format into the format expected by your target logging library or service. ## Installation To implement a transport, you must install the `@loglayer/transport` package. 
::: code-group

```bash [npm]
npm install @loglayer/transport
```

```bash [yarn]
yarn add @loglayer/transport
```

```bash [pnpm]
pnpm add @loglayer/transport
```

:::

## Implementing a Transport

The key requirement for any transport is extending the `BaseTransport` or `LoggerlessTransport` class and implementing the `shipToLogger` method. This method is called by LogLayer whenever a log needs to be sent, and it's where you transform LogLayer's standardized format into the format your target library or service expects.

## Types of Transports

LogLayer supports two types of transports:

### Logger-Based Transports

For libraries that follow a common logging interface with methods like `info()`, `warn()`, `error()`, `debug()`, etc., extend the `BaseTransport` class. The `BaseTransport` class provides a `logger` property where users pass in their logging library instance:

```typescript
import {
  BaseTransport,
  LogLevel,
  type LogLayerTransportConfig,
  type LogLayerTransportParams,
} from "@loglayer/transport";

export interface CustomLoggerTransportConfig extends LogLayerTransportConfig {
  // Add configuration options here if necessary
}

export class CustomLoggerTransport extends BaseTransport {
  constructor(config: CustomLoggerTransportConfig) {
    super(config);
  }

  shipToLogger({ logLevel, messages, data, hasData }: LogLayerTransportParams) {
    if (data && hasData) {
      // Most logging libraries expect data as first or last parameter
      messages.unshift(data); // or messages.push(data);
    }

    switch (logLevel) {
      case LogLevel.info:
        this.logger.info(...messages);
        break;
      case LogLevel.warn:
        this.logger.warn(...messages);
        break;
      case LogLevel.error:
        this.logger.error(...messages);
        break;
      // ...
handle other log levels } return messages; } } ``` To use this transport, you must provide a logger instance when creating it: ```typescript import { LogLayer } from 'loglayer'; import { YourLogger } from 'your-logger-library'; // Initialize your logging library const loggerInstance = new YourLogger(); // Create LogLayer instance with the transport const log = new LogLayer({ transport: new CustomLoggerTransport({ logger: loggerInstance // Required: the logger instance is passed here }) }); ``` ### Loggerless Transports For services or libraries that don't follow the common logging interface (e.g., analytics services, monitoring tools), extend the `LoggerlessTransport` class. Unlike `BaseTransport`, `LoggerlessTransport` doesn't provide a `logger` property since these services typically don't require a logger instance. Instead, you'll usually initialize your service in the constructor: ::: info All loggerless transports have an optional `level` input as part of configuration. This is used by the `LoggerlessTransport` class to filter out logs that are below the specified level. You do not need to do any work around filtering based on level. ::: ```typescript import { LoggerlessTransport, type LogLayerTransportParams, type LoggerlessTransportConfig } from "@loglayer/transport"; export interface CustomServiceTransportConfig extends LoggerlessTransportConfig { // Add configuration options here if necessary } export class CustomServiceTransport extends LoggerlessTransport { private service: YourServiceType; constructor(config: CustomServiceTransportConfig) { super(config); this.service = new YourServiceType(config); } shipToLogger({ logLevel, messages, data, hasData }: LogLayerTransportParams) { const payload = { level: logLevel, message: messages.join(" "), timestamp: new Date().toISOString(), ...(data && hasData ? 
data : {})
    };

    // Send to your service
    this.service.send(payload);

    return messages;
  }
}
```

To use this transport, you only need to provide the configuration for your service:

```typescript
import { LogLayer } from 'loglayer';

// Create LogLayer instance with the transport
const log = new LogLayer({
  transport: new CustomServiceTransport({
    // No logger property needed, just your service configuration
    apiKey: 'your-api-key',
    endpoint: 'https://api.yourservice.com/logs'
  })
});
```

## `shipToLogger` Parameters

LogLayer calls the `shipToLogger` method of a transport at the end of log processing to send the log to the target logging library. It receives a `LogLayerTransportParams` object with these fields:

```typescript
interface LogLayerTransportParams {
  /**
   * The log level of the message
   */
  logLevel: LogLevel;
  /**
   * The parameters that were passed to the log message method (eg: info / warn / debug / error)
   */
  messages: any[];
  /**
   * Object data such as metadata, context, and / or error data
   */
  data?: Record<string, any>;
  /**
   * If true, the data object is included in the message parameters
   */
  hasData?: boolean;
}
```

For example, if a user does the following:

```typescript
logger.withMetadata({foo: 'bar'}).info('hello world', 'foo');
```

The parameters passed to `shipToLogger` would be:

```typescript
{
  logLevel: LogLevel.info,
  messages: ['hello world', 'foo'],
  data: {foo: 'bar'},
  hasData: true
}
```

## Resource Cleanup with Disposable

If your transport needs to clean up resources (like network connections, file handles, or external service connections), you can implement the [`Disposable`](https://www.typescriptlang.org/docs/handbook/release-notes/typescript-5-2.html#using-declarations-and-explicit-resource-management) interface. LogLayer will automatically call the dispose method when:

* The transport is replaced using `withFreshTransports()`

### Implementing Disposable

To make your transport disposable:

1. Add `Disposable` to your class implementation
2.
Implement the `[Symbol.dispose]()` method 3. Add a flag to track the disposed state 4. Guard your methods against calls after disposal Here's an example: ```typescript export class MyTransport extends LoggerlessTransport implements Disposable { private isDisposed = false; private client: ExternalServiceClient; constructor(config: MyTransportConfig) { super(config); this.client = new ExternalServiceClient(config); } shipToLogger({ logLevel, messages, data, hasData }: LogLayerTransportParams) { if (this.isDisposed) return messages; // Implementation this.client.send({ level: logLevel, message: messages.join(" "), ...(data && hasData ? data : {}) }); return messages; } [Symbol.dispose](): void { if (this.isDisposed) return; // Clean up resources this.client?.close(); this.isDisposed = true; } } ``` :::tip Always implement `Disposable` if your transport maintains connections or holds onto resources that need cleanup. This ensures proper resource management and prevents memory leaks. ::: For a real-world example of a transport that implements `Disposable`, see the [Log Rotation Transport Source](/transports/log-file-rotation) implementation which properly manages file handles. 
## Examples ### Logger-Based Example: Console Transport ```typescript import { BaseTransport, LogLevel, LogLayerTransportParams } from "@loglayer/transport"; export class ConsoleTransport extends BaseTransport { shipToLogger({ logLevel, messages, data, hasData }: LogLayerTransportParams) { if (data && hasData) { // put object data as the first parameter messages.unshift(data); } switch (logLevel) { case LogLevel.info: this.logger.info(...messages); break; case LogLevel.warn: this.logger.warn(...messages); break; case LogLevel.error: this.logger.error(...messages); break; case LogLevel.trace: this.logger.trace(...messages); break; case LogLevel.debug: this.logger.debug(...messages); break; case LogLevel.fatal: this.logger.error(...messages); break; } return messages; } } ``` ### Loggerless Example: DataDog Transport For an example of a loggerless transport that sends logs to a third-party service, see the [Datadog Transport](https://github.com/loglayer/loglayer/blob/master/packages/transports/datadog/src/DataDogTransport.ts) implementation. ## Boilerplate / Template Code A sample project that you can use as a template is provided here: [GitHub Boilerplate Template](https://github.com/loglayer/loglayer-transport-boilerplate) --- --- url: 'https://loglayer.dev/plugins/creating-plugins.md' description: Learn how to create custom plugins for LogLayer --- # Creating Plugins ## Overview A plugin is a plain object that implements the `LogLayerPlugin` interface from `@loglayer/plugin` or `loglayer`: ```typescript interface LogLayerPlugin { /** * Unique identifier for the plugin. Used for selectively disabling / enabling * and removing the plugin. */ id?: string; /** * If true, the plugin will skip execution */ disabled?: boolean; /** * Called after the assembly of the data object that contains * metadata / context / error data before being sent to the logging library. 
   */
  onBeforeDataOut?(params: PluginBeforeDataOutParams, loglayer: ILogLayer): Record<string, any> | null | undefined;

  /**
   * Called to modify message data before it is sent to the logging library.
   */
  onBeforeMessageOut?(params: PluginBeforeMessageOutParams, loglayer: ILogLayer): MessageDataType[];

  /**
   * Controls whether the log entry should be sent to the logging library.
   */
  shouldSendToLogger?(params: PluginShouldSendToLoggerParams, loglayer: ILogLayer): boolean;

  /**
   * Called when withMetadata() or metadataOnly() is called.
   */
  onMetadataCalled?(metadata: Record<string, any>, loglayer: ILogLayer): Record<string, any> | null | undefined;

  /**
   * Called when withContext() is called.
   */
  onContextCalled?(context: Record<string, any>, loglayer: ILogLayer): Record<string, any> | null | undefined;
}
```

## In-Project

If you want to quickly write a plugin for your own project, you can use the `loglayer` package to get the TypeScript types for the plugin interface.

### Example

```typescript
import { LogLayer, ConsoleTransport } from 'loglayer'
import type { LogLayerPlugin, PluginBeforeMessageOutParams, ILogLayer } from 'loglayer'

// Create a timestamp plugin
const timestampPlugin: LogLayerPlugin = {
  onBeforeMessageOut: ({ messages }: PluginBeforeMessageOutParams, loglayer: ILogLayer): string[] => {
    // Add timestamp prefix to each message
    return messages.map(msg => `[${new Date().toISOString()}] ${msg}`)
  }
}

// Create LogLayer instance with console transport and timestamp plugin
const log = new LogLayer({
  transport: new ConsoleTransport({
    logger: console
  }),
  plugins: [timestampPlugin]
})

// Usage example
log.info('Hello world!')
// Output: [2024-01-17T12:34:56.789Z] Hello world!
```

## As an NPM Package

If you're creating an npm package, you should use the `@loglayer/plugin` package to get the TypeScript types for the plugin interface instead of making `loglayer` a dependency.

::: info
We recommend it as a `dependency` and not a `devDependency` as `@loglayer/plugin` may not be types-only in the future.
:::

### Installation

::: code-group

```sh [npm]
npm install @loglayer/plugin
```

```sh [pnpm]
pnpm add @loglayer/plugin
```

```sh [yarn]
yarn add @loglayer/plugin
```

:::

### Example

```typescript
import type { LogLayerPlugin, PluginBeforeMessageOutParams, LogLayerPluginParams, ILogLayer } from '@loglayer/plugin'

// LogLayerPluginParams provides the common options for the plugin
export interface TimestampPluginOptions extends LogLayerPluginParams {
  /**
   * Format of the timestamp. If not provided, uses ISO string
   */
  format?: 'iso' | 'locale'
}

export const createTimestampPlugin = (options: TimestampPluginOptions = {}): LogLayerPlugin => {
  return {
    // Copy over the common options
    id: options.id,
    disabled: options.disabled,
    // Implement the onBeforeMessageOut lifecycle method
    onBeforeMessageOut: ({ messages }: PluginBeforeMessageOutParams, loglayer: ILogLayer): string[] => {
      const timestamp = options.format === 'locale'
        ? new Date().toLocaleString()
        : new Date().toISOString()

      return messages.map(msg => `[${timestamp}] ${msg}`)
    }
  }
}
```

## Plugin Lifecycle Methods

### onBeforeDataOut

Allows you to modify or transform the data object containing metadata, context, and error information before it's sent to the logging library. This is useful for adding additional fields, transforming data formats, or filtering sensitive information.
**Method Signature:**

```typescript
onBeforeDataOut?(params: PluginBeforeDataOutParams, loglayer: ILogLayer): Record<string, any> | null | undefined
```

**Parameters:**

```typescript
interface PluginBeforeDataOutParams {
  data?: Record<string, any>;  // The object containing metadata / context / error data
  logLevel: LogLevel;          // Log level of the data
}
```

**Example:**

```typescript
const dataEnrichmentPlugin = {
  onBeforeDataOut: ({ data, logLevel }: PluginBeforeDataOutParams, loglayer: ILogLayer) => {
    return {
      ...(data || {}),
      environment: process.env.NODE_ENV,
      timestamp: new Date().toISOString(),
      logLevel
    }
  }
}
```

### onBeforeMessageOut

Allows you to modify or transform the message content before it's sent to the logging library. This is useful for adding prefixes, formatting messages, or transforming message content.

**Method Signature:**

```typescript
onBeforeMessageOut?(params: PluginBeforeMessageOutParams, loglayer: ILogLayer): MessageDataType[]
```

**Parameters:**

```typescript
interface PluginBeforeMessageOutParams {
  messages: any[];     // Message data that is copied from the original
  logLevel: LogLevel;  // Log level of the message
}
```

**Example:**

```typescript
const messageFormatterPlugin = {
  onBeforeMessageOut: ({ messages, logLevel }: PluginBeforeMessageOutParams, loglayer: ILogLayer) => {
    return messages.map(msg => `[${logLevel.toUpperCase()}][${new Date().toISOString()}] ${msg}`)
  }
}
```

### shouldSendToLogger

Controls whether a log entry should be sent to the logging library. This is useful for implementing log filtering, rate limiting, or environment-specific logging.
**Method Signature:**

```typescript
shouldSendToLogger?(params: PluginShouldSendToLoggerParams, loglayer: ILogLayer): boolean
```

**Parameters:**

```typescript
interface PluginShouldSendToLoggerParams {
  transportId?: string;        // ID of the transport that will send the log
  messages: any[];             // Message data that is copied from the original
  logLevel: LogLevel;          // Log level of the message
  data?: Record<string, any>;  // The object containing metadata / context / error data
}
```

**Example:**

```typescript
const productionFilterPlugin = {
  shouldSendToLogger: ({ logLevel, data }: PluginShouldSendToLoggerParams, loglayer: ILogLayer) => {
    // Filter out debug logs in production
    if (process.env.NODE_ENV === 'production') {
      return logLevel !== 'debug'
    }

    // Rate limit error logs (isRateLimited() is a user-supplied helper, not part of LogLayer)
    if (logLevel === 'error') {
      return !isRateLimited('error-logs')
    }

    return true
  }
}
```

**Example:**

```typescript
const transportFilterPlugin = {
  shouldSendToLogger: ({ transportId, logLevel, data }: PluginShouldSendToLoggerParams, loglayer: ILogLayer) => {
    // don't send logs if the transportId is 'console'
    if (transportId === 'console') {
      return false
    }

    return true
  }
}
```

### onMetadataCalled

Intercepts and modifies metadata when `withMetadata()` or `metadataOnly()` is called. This is useful for transforming or enriching metadata before it's attached to logs. Returning `null` or `undefined` will prevent the metadata from being added to the log.

**Method Signature:**

```typescript
onMetadataCalled?(metadata: Record<string, any>, loglayer: ILogLayer): Record<string, any> | null | undefined
```

**Parameters:**

* `metadata`: `Record<string, any>` - The metadata object being added
* `loglayer`: `ILogLayer` - The LogLayer instance

**Example:**

```typescript
const metadataEnrichmentPlugin = {
  onMetadataCalled: (metadata: Record<string, any>, loglayer: ILogLayer) => {
    return {
      ...metadata,
      enrichedAt: new Date().toISOString(),
      userId: getCurrentUser()?.id
    }
  }
}
```

### onContextCalled

Intercepts and modifies context when `withContext()` is called.
This is useful for transforming or enriching context data before it's used in logs. Returning `null` or `undefined` will prevent the context from being added to the log.

**Method Signature:**

```typescript
onContextCalled?(context: Record<string, any>, loglayer: ILogLayer): Record<string, any> | null | undefined
```

**Parameters:**

* `context`: `Record<string, any>` - The context object being added
* `loglayer`: `ILogLayer` - The LogLayer instance

**Example:**

```typescript
const contextEnrichmentPlugin = {
  onContextCalled: (context: Record<string, any>, loglayer: ILogLayer) => {
    return {
      ...context,
      environment: process.env.NODE_ENV,
      processId: process.pid,
      timestamp: new Date().toISOString()
    }
  }
}
```

## Boilerplate / Template Code

A sample project that you can use as a template is provided here: [GitHub Boilerplate Template](https://github.com/loglayer/loglayer-plugin-boilerplate)

---

---
url: 'https://loglayer.dev/example-integrations/nextjs.md'
description: Learn how to implement LogLayer with Next.js
---

# Custom logging in Next.js

## Installation

This guide assumes you already have [Next.js](https://nextjs.org/) set up. First, install the required packages.
You can use any transport you prefer - we'll use [Pino](/transports/pino) in this example:

::: code-group

```sh [npm]
npm i loglayer @loglayer/transport-pino pino serialize-error
```

```sh [pnpm]
pnpm add loglayer @loglayer/transport-pino pino serialize-error
```

```sh [yarn]
yarn add loglayer @loglayer/transport-pino pino serialize-error
```

:::

## Setup

```typescript
// logger.ts
import { LogLayer } from 'loglayer'
import { PinoTransport } from '@loglayer/transport-pino'
import { serializeError } from 'serialize-error'
import pino from 'pino'

// Create a Pino instance (only needs to be done once)
const pinoLogger = pino({
  level: 'trace' // Set to desired log level
})

const log = new LogLayer({
  errorSerializer: serializeError,
  transport: new PinoTransport({
    logger: pinoLogger
  })
})

export function getLogger() {
  return log;
}
```

We expose a function called `getLogger()` to get the logger instance. We do this in the event that you want to mock the logger in your tests, where you can override `getLogger()` to return the LogLayer mock, [MockLogLayer](/logging-api/unit-testing).

At this point you should be able to call `getLogger()` anywhere in your Next.js app to get the logger instance and write logs.

```typescript
// page.tsx
import { getLogger } from './logger'

export default function Page() {
  const log = getLogger()

  log.withMetadata({ some: "data" }).info('Hello, world!')

  return <div>Hello, world!</div>
} ``` ## Distinguish between server and client logs If you use transports that are only client-side or server-side (such as the [DataDog](/transports/datadog) and [DataDog Browser](/transports/datadog-browser-logs) Transports), you can conditionally enable them based on the environment. Add a const to detect if the code is running on the server or client: ```typescript const isServer = typeof window === 'undefined' ``` Modify your transport to run only on the server: ```typescript const isServer = typeof window === 'undefined' const log = new LogLayer({ errorSerializer: serializeError, transport: new PinoTransport({ enabled: isServer, // runs server-side only logger: pinoLogger }), plugins: [ { // Add a plugin to label the log entry as coming from the server or client onBeforeMessageOut(params: PluginBeforeMessageOutParams) { const tag = isServer ? "Server" : "Client"; if (params.messages && params.messages.length > 0) { if (typeof params.messages[0] === "string") { params.messages[0] = `[${tag}] ${params.messages[0]}`; } } return params.messages; }, }, ] }) // Can also add to context data too; would be stamped on every log entry log.withContext({ isServer }) ``` ## Handling server-side uncaught exceptions and rejections Next.js [does not](https://github.com/vercel/next.js/discussions/63787) have a way to use a custom logger for server-side uncaught exceptions and rejections. To use LogLayer for this, you will need to create an [instrumentation file](https://nextjs.org/docs/app/building-your-application/optimizing/instrumentation) in the root of your project. 
Here's an example using the [Pino](/transports/pino) and [DataDog](/transports/datadog) transports:

```typescript
// instrumentation.ts
import { LogLayer, type ILogLayer } from 'loglayer';
import { DataDogTransport } from "@loglayer/transport-datadog";
import { PinoTransport } from "@loglayer/transport-pino";
import pino from "pino";
import { serializeError } from "serialize-error";

/**
 * Strip ANSI codes from a string, which is something Next.js likes to inject.
 */
function stripAnsiCodes(str: string): string {
  return str.replace(
    /[\u001b\u009b][[()#;?]*(?:[0-9]{1,4}(?:;[0-9]{0,4})*)?[0-9A-ORZcf-nqry=><]/g,
    "",
  );
}

/**
 * Create a console method that logs to LogLayer
 */
function createConsoleMethod(log: ILogLayer, method: "error" | "info" | "warn" | "debug" | "log") {
  let mappedMethod = method;

  if (method === "log") {
    mappedMethod = "info";
  }

  return (...args: unknown[]) => {
    const data: Record<string, unknown> = {};
    let hasData = false;
    let error: Error | null = null;
    const messages: string[] = [];

    for (const arg of args) {
      if (arg instanceof Error) {
        error = arg;
        continue;
      }

      if (typeof arg === "object" && arg !== null) {
        Object.assign(data, arg);
        hasData = true;
        continue;
      }

      if (typeof arg === "string") {
        messages.push(arg);
      }
    }

    let finalMessage = stripAnsiCodes(messages.join(" ")).trim();

    // next.js uses an "x" for the error message when it's an error object
    if (finalMessage === "⨯" && error) {
      finalMessage = error?.message || "";
    }

    if (error && hasData && messages.length > 0) {
      log.withError(error).withMetadata(data)[mappedMethod](finalMessage);
    } else if (error && messages.length > 0) {
      log.withError(error)[mappedMethod](finalMessage);
    } else if (hasData && messages.length > 0) {
      log.withMetadata(data)[mappedMethod](finalMessage);
    } else if (error && hasData && messages.length === 0) {
      log.withError(error).withMetadata(data)[mappedMethod]("");
    } else if (error && messages.length === 0) {
      log.errorOnly(error);
    } else if (hasData && messages.length === 0) {
log.metadataOnly(data); } else { log[mappedMethod](finalMessage); } }; } export async function register() { const logger = new LogLayer({ errorSerializer: serializeError, transport: [ new PinoTransport({ logger: pino(), }), new DataDogTransport(...), ] }) if (process.env.NEXT_RUNTIME === "nodejs") { console.error = createConsoleMethod(logger, "error"); console.log = createConsoleMethod(logger, "log"); console.info = createConsoleMethod(logger, "info"); console.warn = createConsoleMethod(logger, "warn"); console.debug = createConsoleMethod(logger, "debug"); } } ``` If you threw an error from `page.tsx` that is uncaught, you should see this in the terminal: ```json lines {"err":{"type":"Object","message":"test","stack":"Error: test\n at Page (webpack-internal:///(rsc)/./src/app/page.tsx:12:11)","digest":"699232626","name":"Error"},"msg":"test"} ``` --- --- url: 'https://loglayer.dev/transports/datadog-browser-logs.md' description: Send logs using the DataDog Browser Logs library with LogLayer --- # DataDog Browser Logs Transport [![NPM Version](https://img.shields.io/npm/v/%40loglayer%2Ftransport-datadog-browser-logs)](https://www.npmjs.com/package/@loglayer/transport-datadog-browser-logs) [@datadog/browser-logs](https://docs.datadoghq.com/logs/log_collection/javascript/) is Datadog's official browser-side logging library. [Transport Source](https://github.com/loglayer/loglayer/tree/master/packages/transports/datadog-browser-logs) ## Important Notes * Only works in browser environments (not in Node.js) * For server-side logging, use the [`@loglayer/transport-datadog`](/transports/datadog.html) package * You will not get any console output since this sends directly to DataDog. Use the `onDebug` option to log out messages. 
## Installation

Install the required packages:

::: code-group

```sh [npm]
npm i loglayer @loglayer/transport-datadog-browser-logs @datadog/browser-logs
```

```sh [pnpm]
pnpm add loglayer @loglayer/transport-datadog-browser-logs @datadog/browser-logs
```

```sh [yarn]
yarn add loglayer @loglayer/transport-datadog-browser-logs @datadog/browser-logs
```

:::

## Setup

```typescript
import { datadogLogs } from '@datadog/browser-logs'
import { LogLayer } from 'loglayer'
import { DatadogBrowserLogsTransport } from "@loglayer/transport-datadog-browser-logs"

// Initialize Datadog
datadogLogs.init({
  clientToken: '',
  site: '',
  forwardErrorsToLogs: true,
  sampleRate: 100
})

// Basic setup
const log = new LogLayer({
  transport: new DatadogBrowserLogsTransport({
    logger: datadogLogs
  })
})

// Or with a custom logger instance
const logger = datadogLogs.createLogger('my-logger')

const logWithCustomLogger = new LogLayer({
  transport: new DatadogBrowserLogsTransport({
    logger
  })
})
```

## Configuration Options

| Option | Type | Default | Description |
|--------|------|---------|-------------|
| `logger` | `DatadogLogs` | - | **Required.** The DataDog browser logs instance |
| `enabled` | `boolean` | `true` | Whether the transport is enabled |
| `level` | `"trace" \| "debug" \| "info" \| "warn" \| "error" \| "fatal"` | `"trace"` | Minimum log level to process. Logs below this level will be filtered out |

## Changelog

View the changelog [here](./changelogs/datadog-browser-logs-changelog.md).

## Log Level Mapping

| LogLayer | Datadog |
|----------|---------|
| trace | debug |
| debug | debug |
| info | info |
| warn | warn |
| error | error |
| fatal | error |
--- --- url: 'https://loglayer.dev/transports/datadog.md' description: Send logs to DataDog with the LogLayer logging library --- # DataDog Transport [![NPM Version](https://img.shields.io/npm/v/%40loglayer%2Ftransport-datadog)](https://www.npmjs.com/package/@loglayer/transport-datadog) Ships logs server-side to Datadog using the [datadog-transport-common](https://www.npmjs.com/package/datadog-transport-common) library. [Transport Source](https://github.com/loglayer/loglayer/tree/master/packages/transports/datadog) ## Important Notes * Only works server-side (not in browsers) * For browser-side logging, use the [`@loglayer/transport-datadog-browser-logs`](/transports/datadog-browser-logs) package * You will not get any console output since this sends directly to DataDog. Use the `onDebug` option to log out messages. ## Installation Install the required packages (`datadog-transport-common` is installed as part of `@loglayer/transport-datadog`): ::: code-group ```sh [npm] npm i loglayer @loglayer/transport-datadog serialize-error ``` ```sh [pnpm] pnpm add loglayer @loglayer/transport-datadog serialize-error ``` ```sh [yarn] yarn add loglayer @loglayer/transport-datadog serialize-error ``` ::: ## Usage Example ```typescript import { LogLayer } from 'loglayer' import { DataDogTransport } from "@loglayer/transport-datadog" import { serializeError } from "serialize-error"; const log = new LogLayer({ errorSerializer: serializeError, transport: new DataDogTransport({ options: { ddClientConf: { authMethods: { apiKeyAuth: "YOUR_API_KEY", }, }, ddServerConf: { // Note: This must match the site you use for your DataDog login - See below for more info site: "datadoghq.eu" }, onDebug: (msg) => { console.log(msg); }, onError: (err, logs) => { console.error(err, logs); }, }, }) }) ``` ## Transport Configuration ```typescript interface DatadogTransportConfig { /** * Whether the transport is enabled. Default is true. */ enabled?: boolean /** * The field name to use for the message. 
Default is "message". */ messageField?: string; /** * The field name to use for the log level. Default is "level". */ levelField?: string; /** * The field name to use for the timestamp. Default is "time". */ timestampField?: string; /** * A custom function to stamp the timestamp. The default timestamp uses the ISO 8601 format. */ timestampFunction?: () => any; /** * Minimum log level to process. Logs below this level will be filtered out. Default is "trace". */ level?: "trace" | "debug" | "info" | "warn" | "error" | "fatal"; /** * The options for the transport. */ options: DDTransportOptions } ``` ## Changelog View the changelog [here](./changelogs/datadog-changelog.md). --- --- url: 'https://loglayer.dev/context-managers/default.md' description: The default context manager used in LogLayer. --- # Default Context Manager [![NPM Version](https://img.shields.io/npm/v/%40loglayer%2Fcontext-manager)](https://www.npmjs.com/package/@loglayer/context-manager) [Context Manager Source](https://github.com/loglayer/loglayer/tree/master/packages/core/context-manager) The Default Context Manager is the base context manager used by LogLayer. It provides a simple key-value store for managing context data with independent context for each logger instance. ::: info Batteries included This context manager is automatically used when creating a new LogLayer instance. You should not need to use this context manager directly. ::: ## Installation This package is included with the `loglayer` package, so you don't need to install it separately. 
It is, however, available as a standalone package: ::: code-group ```bash [npm] npm install @loglayer/context-manager ``` ```bash [yarn] yarn add @loglayer/context-manager ``` ```bash [pnpm] pnpm add @loglayer/context-manager ``` ::: ## Usage ### Basic Usage ```typescript import { LogLayer, ConsoleTransport } from "loglayer"; import { DefaultContextManager } from "@loglayer/context-manager"; const logger = new LogLayer({ transport: new ConsoleTransport({ logger: console }) // NOTE: This is redundant and unnecessary since DefaultContextManager is already // the default context manager when LogLayer is created. }).withContextManager(new DefaultContextManager()); // Set context logger.setContext({ requestId: "123", userId: "456" }); // Log with context logger.info("User action"); // Will include requestId and userId in the log entry ``` ### Child Loggers When creating child loggers, the Default Context Manager will: 1. Copy the parent's context to the child logger at creation time 2. Maintain independent context after creation ```typescript parentLogger.setContext({ requestId: "123" }); const childLogger = parentLogger.child(); // Child inherits parent's context at creation via shallow-copy childLogger.info("Initial log"); // Includes requestId: "123" // Child can modify its context independently childLogger.appendContext({ userId: "456" }); childLogger.info("User action"); // Includes requestId: "123" and userId: "456" // Parent's context remains unchanged parentLogger.info("Parent log"); // Only includes requestId: "123" ``` --- --- url: 'https://loglayer.dev/transports/dynatrace.md' description: Send logs to Dynatrace with the LogLayer logging library --- # Dynatrace Transport [![NPM Version](https://img.shields.io/npm/v/%40loglayer%2Ftransport-dynatrace)](https://www.npmjs.com/package/@loglayer/transport-dynatrace) [Transport Source](https://github.com/loglayer/loglayer/tree/master/packages/transports/dynatrace) The Dynatrace transport sends logs to Dynatrace 
using their [Log Monitoring API v2](https://docs.dynatrace.com/docs/discover-dynatrace/references/dynatrace-api/environment-api/log-monitoring-v2/post-ingest-logs).

::: warning
See the "Limitations" section of the Dynatrace documentation for ingest limits. This transport does not check these limits itself, so it is up to you to ensure you do not exceed them. Although the limits are fairly generous, it is advised to define the `onError` callback to handle any errors that may occur.
:::

## Installation

::: code-group

```bash [npm]
npm install loglayer @loglayer/transport-dynatrace serialize-error
```

```bash [yarn]
yarn add loglayer @loglayer/transport-dynatrace serialize-error
```

```bash [pnpm]
pnpm add loglayer @loglayer/transport-dynatrace serialize-error
```

:::

## Usage

You will need an access token with the `logs.ingest` scope. See the [access token documentation](https://docs.dynatrace.com/docs/discover-dynatrace/references/dynatrace-api/basics/dynatrace-api-authentication) for more details.

```typescript
import { LogLayer } from 'loglayer'
import { DynatraceTransport } from "@loglayer/transport-dynatrace"
import { serializeError } from "serialize-error"

const log = new LogLayer({
  errorSerializer: serializeError,
  transport: new DynatraceTransport({
    url: "https://your-environment-id.live.dynatrace.com/api/v2/logs/ingest",
    ingestToken: "your-api-token",
    onError: (error) => {
      console.error('Failed to send log to Dynatrace:', error)
    }
  })
})

log.info('Hello world')
```

## Configuration

The transport accepts the following configuration options:

### Required Options

* `url`: The URL to post logs to.
Should be in one of these formats:
  * `https://{your-environment-id}.live.dynatrace.com/api/v2/logs/ingest`
  * `https://{your-activegate-domain}:9999/e/{your-environment-id}/api/v2/logs/ingest`
* `ingestToken`: An API token with the `logs.ingest` scope

### Optional Options

* `onError`: A callback function that will be called when there's an error sending logs to Dynatrace
* `enabled`: If set to `false`, the transport will not send any logs (defaults to `true`)
* `consoleDebug`: If set to `true`, logs will also be output to the console (defaults to `false`)
* `level`: Minimum log level to process. Logs below this level will be filtered out (defaults to `"trace"`)

## Log Format

The transport sends logs to Dynatrace in the following format:

```json5
{
  "content": "Your log message",
  "severity": "info|warn|error|debug",
  "timestamp": "2024-01-01T00:00:00.000Z",
  // Any additional metadata fields
}
```

## Changelog

View the changelog [here](./changelogs/dynatrace-changelog.md).

---

---
url: 'https://loglayer.dev/transports/electron-log.md'
description: Send logs to electron-log with the LogLayer logging library
---

# Electron-log Transport

[![NPM Version](https://img.shields.io/npm/v/%40loglayer%2Ftransport-electron-log)](https://www.npmjs.com/package/@loglayer/transport-electron-log)

[Electron-log](https://github.com/megahertz/electron-log) is a logging library designed specifically for Electron applications.
[Transport Source](https://github.com/loglayer/loglayer/tree/master/packages/transports/electron-log) ## Installation Install the required packages: ::: code-group ```sh [npm] npm i loglayer @loglayer/transport-electron-log electron-log ``` ```sh [pnpm] pnpm add loglayer @loglayer/transport-electron-log electron-log ``` ```sh [yarn] yarn add loglayer @loglayer/transport-electron-log electron-log ``` ::: ## Setup ```typescript // Main process logger import log from 'electron-log/src/main' // Or for Renderer process // import log from 'electron-log/src/renderer' import { LogLayer } from 'loglayer' import { ElectronLogTransport } from "@loglayer/transport-electron-log" const logger = new LogLayer({ transport: new ElectronLogTransport({ logger: log }) }) ``` ## Log Level Mapping | LogLayer | Electron-log | |----------|--------------| | trace | silly | | debug | debug | | info | info | | warn | warn | | error | error | | fatal | error | ## Changelog View the changelog [here](./changelogs/electron-log-changelog.md). --- --- url: 'https://loglayer.dev/logging-api/error-handling.md' description: Learn how to pass errors to LogLayer for logging --- # Error Handling LogLayer provides robust error handling capabilities with flexible configuration options for how errors are logged and serialized. 
## Basic Error Logging ### With a Message The most common way to log an error is using the `withError` method along with a message: ```typescript const error = new Error('Database connection failed') log.withError(error).error('Failed to process request') ``` You can use any log level with error logging: ```typescript // Log error with warning level log.withError(error).warn('Database connection unstable') // Log error with info level log.withError(error).info('Retrying connection') ``` ### Error-Only Logging When you just want to log an error without an additional message: ```typescript // Default log level is 'error' log.errorOnly(new Error('Database connection failed')) // With custom log level log.errorOnly(new Error('Connection timeout'), { logLevel: LogLevel.warn }) ``` ## Error Configuration ### Error Field Name By default, errors are logged under the `err` field. You can customize this: ```typescript const log = new LogLayer({ errorFieldName: 'error', // Default is 'err' }) log.errorOnly(new Error('test')) // Output: { "error": { "message": "test", "stack": "..." } } ``` ### Error Serialization Some logging libraries don't handle Error objects well. You can provide a custom error serializer: ```typescript const log = new LogLayer({ errorSerializer: (err) => ({ message: err.message, stack: err.stack, code: err.code }), }) ``` For libraries like `roarr` that require error serialization, you can use a package like `serialize-error`: ```typescript import { serializeError } from 'serialize-error' const log = new LogLayer({ errorSerializer: serializeError, transport: new RoarrTransport({ logger: roarr }) }) ``` ::: tip Use serialize-error We strongly recommend the use of `serialize-error` for error serialization. 
::: ### Error Message Copying You can configure LogLayer to automatically copy the error's message as the log message: ```typescript const log = new LogLayer({ copyMsgOnOnlyError: true, }) // Will include error.message as the log message log.errorOnly(new Error('Connection failed')) ``` You can override this behavior per call: ```typescript // Disable message copying for this call log.errorOnly(new Error('test'), { copyMsg: false }) // Enable message copying for this call even if disabled globally log.errorOnly(new Error('test'), { copyMsg: true }) ``` ### Error in Metadata You can configure errors to be included in the metadata field instead of at the root level: ```typescript const log = new LogLayer({ errorFieldInMetadata: true, metadataFieldName: 'metadata', }) log.errorOnly(new Error('test')) // Output: { "metadata": { "err": { "message": "test", "stack": "..." } } } ``` ## Combining Errors with Other Data ### With Metadata You can combine errors with metadata: ```typescript log.withError(new Error('Query failed')) .withMetadata({ query: 'SELECT * FROM users', duration: 1500 }) .error('Database error') ``` ### With Context Errors can be combined with context data: ```typescript log.withContext({ requestId: '123' }) .withError(new Error('Not found')) .error('Resource not found') ``` --- --- url: 'https://loglayer.dev/plugins/filter.md' description: 'Filter logs using string patterns, regular expressions, or JSON Queries' --- # Filter Plugin [![NPM Version](https://img.shields.io/npm/v/%40loglayer%2Fplugin-filter)](https://www.npmjs.com/package/@loglayer/plugin-filter) [Plugin Source](https://github.com/loglayer/loglayer/tree/master/packages/plugins/filter) A plugin that filters log messages. You can filter logs using string patterns, regular expressions, or [JSON Queries](https://jsonquerylang.org/). 
## Installation

::: code-group

```bash [npm]
npm install @loglayer/plugin-filter
```

```bash [yarn]
yarn add @loglayer/plugin-filter
```

```bash [pnpm]
pnpm add @loglayer/plugin-filter
```

:::

## Usage

```typescript
import { filterPlugin } from '@loglayer/plugin-filter';

// Create a filter that only allows error messages
const filter = filterPlugin({
  // checks the assembled message using an includes()
  messages: ['error'],
});

// Checks the level of the log
const levelFilter = filterPlugin({
  queries: ['.level == "error" or .level == "warn"'],
});
```

### Configuration

The plugin accepts the following configuration options:

| Option | Type | Description |
|--------|------|-------------|
| `messages` | `Array` | Optional. Array of string patterns or regular expressions to match against log messages |
| `queries` | `string[]` | Optional. Array of JSON queries to filter logs. A JSON Query `filter()` is applied, with each item being part of an OR condition |
| `debug` | `boolean` | Optional. Enable debug mode for troubleshooting |
| `disabled` | `boolean` | Optional. Disable the plugin |

## Message Pattern Matching

You can filter logs using string patterns or regular expressions:

```typescript
// Using string patterns
const filter = filterPlugin({
  messages: ['error', 'warning'],
});

// Using regular expressions
const regexFilter = filterPlugin({
  messages: [/error/i, /warning\d+/],
});

// Mixed patterns
const mixedFilter = filterPlugin({
  messages: ['error', /warning\d+/],
});
```

## Query-Based Filtering

You can use [JSON Queries](https://jsonquerylang.org/) to filter logs based on any field.
### Usage

```typescript
const filter = filterPlugin({
  // each item is used as an OR condition
  queries: [
    // Filter by log level
    '.level == "error"',
    // Filter by data properties
    '.data.userId == 123',
    // Complex conditions
    '(.level == "error") and (.data.retryCount > 3)',
  ],
});
```

::: tip
For joining conditions, wrap them in parentheses.
:::

This would translate in JSON Query to:

```text
filter((.level == "error") or (.data.userId == 123) or ((.level == "error") and (.data.retryCount > 3)))
```

::: info
* `filter()` is added around the queries by the plugin.
* Single-quotes are converted to double-quotes.
:::

### Query Context

The queries are executed against an array containing an object that is defined as the following:

```typescript
[{
  level: string;   // Log level
  message: string; // Combined log message
  data: object;    // Additional log data, which includes error data, context data, and metadata
}]
```

If you did the following:

```typescript
log.withMetadata({ userId: '123' }).error('Failed to process request');
```

Then the query context would be:

```typescript
{
  level: 'error',
  message: 'Failed to process request',
  data: {
    userId: '123'
  }
}
```

### Example Queries

```text
// Filter by log level
[".level == 'error'"]

// Filter by message content
// see: https://github.com/jsonquerylang/jsonquery/blob/main/reference/functions.md#regex
["regex(.message, 'test', 'i')"]

// Filter by data properties
[".data.user.age == 25"]

// Complex conditions
["(.level == 'error') and (.data.retryCount > 3)"]
```

## Debug Mode

Enable debug mode to see detailed information about the filtering process:

```typescript
const filter = filterPlugin({
  messages: ['error'],
  queries: ['.level == "error"'],
  debug: true,
});
```

## Filter Logic

The plugin follows this logic when filtering logs:

1. If no filters are defined (no messages and no queries), allow all logs
2. If message patterns are defined, check them first
   * If any pattern matches, allow the log
3.
If no message patterns match (or none defined) and queries are defined: * Execute queries * If any query matches, allow the log 4. If no patterns or queries match, filter out the log ## Changelog View the changelog [here](./changelogs/filter-changelog.md). --- --- url: 'https://loglayer.dev/getting-started.md' description: Learn how to install and use LogLayer in your project --- # Getting Started ## Installation ::: code-group ```sh [npm] npm install loglayer ``` ```sh [pnpm] pnpm add loglayer ``` ```sh [yarn] yarn add loglayer ``` ::: ## Basic Usage with Console Transport The simplest way to get started is to use the built-in console transport, which uses the standard `console` object for logging: ```typescript import { LogLayer, ConsoleTransport } from 'loglayer' const log = new LogLayer({ transport: new ConsoleTransport({ logger: console, }), }) // Basic logging log.info('Hello world!') // Logging with metadata log.withMetadata({ user: 'john' }).info('User logged in') // Logging with context (persists across log calls) log.withContext({ requestId: '123' }) log.info('Processing request') // Will include requestId // Logging errors log.withError(new Error('Something went wrong')).error('Failed to process request') ``` ## Next steps * Optionally [configure](/configuration) LogLayer to further customize logging behavior. * Start exploring the [Logging API](/logging-api/basic-logging) section for more advanced logging features. * See the [Transports](/transports/) section for more ways to ship logs to different destinations. 
---

---
url: 'https://loglayer.dev/transports/google-cloud-logging.md'
description: Send logs to Google Cloud Logging with the LogLayer logging library
---

# Google Cloud Logging Transport

[![NPM Version](https://img.shields.io/npm/v/%40loglayer%2Ftransport-google-cloud-logging)](https://www.npmjs.com/package/@loglayer/transport-google-cloud-logging)

[Transport Source](https://github.com/loglayer/loglayer/tree/master/packages/transports/google-cloud-logging)

Implements the [Google Cloud Logging library](https://www.npmjs.com/package/@google-cloud/logging). This transport sends logs to [Google Cloud Logging](https://cloud.google.com/logging) (formerly known as Stackdriver Logging).

## Configuration Options

| Option | Type | Default | Description |
|--------|------|---------|-------------|
| `logger` | `Log` | - | **Required.** The Google Cloud Logging instance |
| `level` | `"trace" \| "debug" \| "info" \| "warn" \| "error" \| "fatal"` | `"trace"` | The minimum log level to process. Logs below this level will be filtered out |
| `rootLevelData` | `Record` | - | Data to be included in the metadata portion of the log entry |
| `onError` | `(error: Error) => void` | - | Error handling callback |

## Installation

::: code-group

```bash [npm]
npm install @loglayer/transport-google-cloud-logging @google-cloud/logging serialize-error
```

```bash [yarn]
yarn add @loglayer/transport-google-cloud-logging @google-cloud/logging serialize-error
```

```bash [pnpm]
pnpm add @loglayer/transport-google-cloud-logging @google-cloud/logging serialize-error
```

:::

## Usage

::: info
This transport uses `log.entry(metadata, data)` as described in the library documentation.

* The `metadata` portion is not the data from `withMetadata()` or `withContext()`. See the `rootLevelData` option for this transport on how to modify this value.
* The `data` portion is actually the `jsonPayload`, which is what the transport uses for all LogLayer data.
* The message data is stored in `jsonPayload.message` For more information, see [Structured Logging](https://cloud.google.com/logging/docs/structured-logging), specifically [LogEntry](https://cloud.google.com/logging/docs/reference/v2/rest/v2/LogEntry). ::: ```typescript import { LogLayer } from "loglayer"; import { GoogleCloudLoggingTransport } from "@loglayer/transport-google-cloud-logging"; import { Logging } from '@google-cloud/logging'; import { serializeError } from "serialize-error"; // Create the logging client const logging = new Logging({ projectId: "GOOGLE_CLOUD_PLATFORM_PROJECT_ID" }); const log = logging.log('my-log'); // Create LogLayer instance with the transport const logger = new LogLayer({ errorSerializer: serializeError, transport: new GoogleCloudLoggingTransport({ logger: log, }) }); // The logs will include the default metadata logger.info("Hello from Cloud Run!"); ``` ## Configuration ### `rootLevelData` The root level data to include for all log entries. This is not the same as using `withContext()`, which would be included as part of the `jsonPayload`. The `rootLevelData` option accepts any valid [Google Cloud LogEntry](https://cloud.google.com/logging/docs/reference/v2/rest/v2/LogEntry) fields except for `severity`, `timestamp`, and `jsonPayload` which are managed by the transport. 
```typescript const logger = new LogLayer({ transport: new GoogleCloudLoggingTransport({ logger: log, rootLevelData: { resource: { type: "cloud_run_revision", labels: { project_id: "my-project", service_name: "my-service", revision_name: "my-revision", }, }, labels: { environment: "production", version: "1.0.0", }, }, }), }); ``` ## Log Level Mapping LogLayer log levels are mapped to Google Cloud Logging severity levels as follows: | LogLayer Level | Google Cloud Logging Severity | |---------------|------------------------------| | `fatal` | `CRITICAL` | | `error` | `ERROR` | | `warn` | `WARNING` | | `info` | `INFO` | | `debug` | `DEBUG` | | `trace` | `DEBUG` | ## Changelog View the changelog [here](./changelogs/google-cloud-logging-changelog.md). --- --- url: 'https://loglayer.dev/context-managers/linked.md' description: Share context between parent and child logs in LogLayer. --- # Linked Context Manager [![NPM Version](https://img.shields.io/npm/v/%40loglayer%2Fcontext-manager-linked)](https://www.npmjs.com/package/@loglayer/context-manager-linked) [Context Manager Source](https://github.com/loglayer/loglayer/tree/master/packages/context-managers/linked) A context manager that keeps context linked between parent and child loggers. This means that changes to the context in the parent / child / child of child loggers will affect all loggers. 
## Installation ::: code-group ```bash [npm] npm install @loglayer/context-manager-linked ``` ```bash [yarn] yarn add @loglayer/context-manager-linked ``` ```bash [pnpm] pnpm add @loglayer/context-manager-linked ``` ::: ## Usage ```typescript import { LogLayer, ConsoleTransport } from "loglayer"; import { LinkedContextManager } from '@loglayer/context-manager-linked'; const parentLog = new LogLayer({ transport: new ConsoleTransport({ logger: console }), }).withContextManager(new LinkedContextManager()); const childLog = parentLog.child(); childLog.withContext({ module: 'users' }); parentLog.withContext({ app: 'myapp' }); parentLog.info('Parent log'); childLog.info('Child log'); // Output includes: { module: 'users', app: 'myapp' } // for both parentLog and childLog ``` ## Changelog View the changelog [here](./changelogs/linked-changelog.md). --- --- url: 'https://loglayer.dev/transports/log-file-rotation.md' description: >- Write logs to files with automatic rotation based on size or time with the LogLayer logging library --- # Log File Rotation Transport [![NPM Version](https://img.shields.io/npm/v/%40loglayer%2Ftransport-log-file-rotation)](https://www.npmjs.com/package/@loglayer/transport-log-file-rotation) [Transport Source](https://github.com/loglayer/loglayer/blob/master/packages/transports/log-file-rotation) The Log File Rotation transport writes logs to files with automatic rotation based on size or time. This transport is built on top of [`file-stream-rotator`](https://github.com/rogerc/file-stream-rotator/), a library for handling log file rotation in Node.js applications. 
## Features * Automatic log file rotation based on time (hourly, daily) * Support for date patterns in filenames using numerical values * Size-based rotation with support for KB, MB, and GB units * Compression of rotated log files * Maximum file count or age-based retention * Automatic cleanup of old log files * Batch processing of logs for improved performance (must be enabled) ## Installation ::: code-group ```sh [npm] npm i loglayer @loglayer/transport-log-file-rotation serialize-error ``` ```sh [pnpm] pnpm add loglayer @loglayer/transport-log-file-rotation serialize-error ``` ```sh [yarn] yarn add loglayer @loglayer/transport-log-file-rotation serialize-error ``` ::: ## Usage ```typescript import { LogLayer } from "loglayer"; import { LogFileRotationTransport } from "@loglayer/transport-log-file-rotation"; import { serializeError } from "serialize-error"; const logger = new LogLayer({ errorSerializer: serializeError, transport: [ new LogFileRotationTransport({ filename: "./logs/app.log" }), ], }); logger.info("Application started"); ``` ::: warning Filename Uniqueness Each instance of `LogFileRotationTransport` must have a unique filename to prevent possible race conditions. If you try to create multiple transport instances with the same filename, an error will be thrown. If you need multiple loggers to write to the same file, they should share the same transport instance: ```typescript // Create a single transport instance const fileTransport = new LogFileRotationTransport({ filename: "./logs/app-%DATE%.log", dateFormat: "YMD", frequency: "daily" }); // Share it between multiple loggers const logger1 = new LogLayer({ transport: [fileTransport] }); const logger2 = new LogLayer({ transport: [fileTransport] }); ``` Child loggers do not have this problem as they inherit the transport instance from their parent logger. 
::: ## Configuration Options ### Required Parameters | Option | Type | Description | |--------|------|------------------------------------------------------------------------------------------------------------------------------------| | `filename` | `string` | The filename pattern to use for the log files. Supports date format using numerical values (e.g., `"./logs/application-%DATE%.log"`) | ### Optional Parameters | Option | Type | Description | Default | |--------|------|----------------------------------------------------------------------------------------------------------------------------------------------------|---------| | `auditFile` | `string` | Location to store the log audit file | None | | `auditHashType` | `"md5" \| "sha256"` | Hashing algorithm for audit file. Use 'sha256' for FIPS compliance | `"md5"` | | `batch` | `object` | Batch processing configuration. See [Batch Configuration](#batch-configuration) for details | None | | `callbacks` | `object` | Event callbacks for various file stream events. See [Callbacks](#callbacks) for details | None | | `compressOnRotate` | `boolean` | Whether to compress rotated log files using gzip | `false` | | `createSymlink` | `boolean` | Create a tailable symlink to the current active log file | `false` | | `dateFormat` | `string` | The date format to use in the filename. Uses single characters: 'Y' (full year), 'M' (month), 'D' (day), 'H' (hour), 'm' (minutes), 's' (seconds) | `"YMD"` | | `delimiter` | `string` | Delimiter between log entries | `"\n"` | | `extension` | `string` | File extension to be appended to the filename | None | | `fieldNames` | `object` | Custom field names for the log entry JSON. See [Field Names](#field-names) for details | See below | | `fileMode` | `number` | File mode (permissions) to be used when creating log files | `0o640` | | `fileOptions` | `object` | Options passed to fs.createWriteStream | `{ flags: 'a' }` | | `frequency` | `string` | The frequency of rotation. 
Can be 'daily', 'date', '\[1-30]m' for minutes, or '\[1-12]h' for hours | None | | `levelMap` | `object` | Custom mapping for log levels. See [Level Mapping](#level-mapping) for details | None | | `maxLogs` | `string \| number` | Maximum number of logs to keep. Can be a number of files or days (e.g., "10d" for 10 days) | None | | `size` | `string` | The size at which to rotate. Must include a unit suffix: "k"/"K" for kilobytes, "m"/"M" for megabytes, "g"/"G" for gigabytes (e.g., "10M", "100K") | None | | `staticData` | `(() => Record) \| Record` | Static data to be included in every log entry. Can be either a function that returns an object, or a direct object. If it's a function, it's called for each log entry. | None | | `symlinkName` | `string` | Name to use when creating the symbolic link | `"current.log"` | | `timestampFn` | `() => string \| number` | Custom function to generate timestamps | `() => new Date().toISOString()` | | `utc` | `boolean` | Use UTC time for date in filename | `false` | | `verbose` | `boolean` | Whether to enable verbose mode in the underlying file-stream-rotator. 
See [Verbose Mode](#verbose-mode) for details | `false` | ### Field Names The `fieldNames` object allows you to customize the field names in the log entry JSON: | Field | Type | Description | Default | |-------|------|------------------------------------------------------------------|---------| | `level` | `string` | Field name for the log level | `"level"` | | `message` | `string` | Field name for the log message | `"message"` | | `timestamp` | `string` | Field name for the timestamp | `"timestamp"` | ### Callbacks The `callbacks` object supports the following event handlers: | Callback | Parameters | Description | |----------|------------|-------------| | `onClose` | `() => void` | Called when a log file is closed | | `onError` | `(error: Error) => void` | Called when an error occurs | | `onFinish` | `() => void` | Called when the stream is finished | | `onLogRemoved` | `(info: { date: number; name: string; hash: string }) => void` | Called when a log file is removed due to retention policy | | `onNew` | `(newFile: string) => void` | Called when a new log file is created | | `onOpen` | `() => void` | Called when a log file is opened | | `onRotate` | `(oldFile: string, newFile: string) => void` | Called when a log file is rotated | ### Level Mapping The `levelMap` object allows you to map each log level to either a string or number: | Level | Type | Example (Numeric) | Example (String) | |-------|------|------------------|------------------| | `debug` | `string \| number` | 20 | `"DEBUG"` | | `error` | `string \| number` | 50 | `"ERROR"` | | `fatal` | `string \| number` | 60 | `"FATAL"` | | `info` | `string \| number` | 30 | `"INFO"` | | `trace` | `string \| number` | 10 | `"TRACE"` | | `warn` | `string \| number` | 40 | `"WARNING"` | ### Batch Configuration The `batch` option enables batching of log entries to improve performance by reducing disk writes. When enabled, logs are queued in memory and written to disk in batches. 
The configuration accepts the following options: | Option | Type | Description | Default | |--------|------|-------------|---------| | `size` | `number` | Maximum number of log entries to queue before writing | `1000` | | `timeout` | `number` | Maximum time in milliseconds to wait before writing queued logs | `5000` | Queued logs are automatically flushed in the following situations: * When the batch size is reached * When the batch timeout is reached * When the transport is disposed * When the process exits (including SIGINT and SIGTERM signals) Example usage: ```typescript new LogFileRotationTransport({ filename: "./logs/app", frequency: "daily", dateFormat: "YMD", extension: ".log", batch: { size: 1000, // Write after 1000 logs are queued timeout: 5000 // Or after 5 seconds, whichever comes first } }); ``` ::: tip Performance Tuning For high-throughput applications, you might want to adjust the batch settings based on your needs: * Increase `batch.size` for better throughput at the cost of higher memory usage * Decrease `batch.timeout` to reduce the risk of losing logs in case of crashes ::: ## Verbose Mode It is recommended to enable `verbose` when configuring log rotation rules. This option allows troubleshooting and debugging of the rotation settings. Once properly configured, it can be removed / disabled. ```typescript new LogFileRotationTransport({ filename: "./logs/daily/test-%DATE%.log", frequency: "1h", dateFormat: "YMD", // using hourly frequency, but missing the "m" part verbose: true, }); ``` If there is something wrong with the configuration, you will get something like: ```bash [FileStreamRotator] Date format not suitable for X hours rotation. 
Changing date format to 'YMDHm' ``` This helps you identify the issue and correct it: ```typescript new LogFileRotationTransport({ filename: "./logs/daily/test-%DATE%.log", frequency: "1h", dateFormat: "YMDHm", }); ``` ## Log Format Each log entry is written as a JSON object with the following format: ```json5 { "level": "info", "message": "Log message", "timestamp": "2024-01-17T12:34:56.789Z", // metadata / context / error data will depend on your LogLayer configuration "userId": "123", "requestId": "abc-123" } ``` ## Adding Static Data to Every Log Entry ```typescript import { hostname } from "node:os"; // Using a function
new LogFileRotationTransport({ filename: "./logs/app-%DATE%.log", frequency: "daily", dateFormat: "YMD", staticData: () => ({ hostname: hostname(), // Add the server's hostname pid: process.pid, // Add the process ID environment: process.env.NODE_ENV || "development" }) }); // Using a direct object
new LogFileRotationTransport({ filename: "./logs/app-%DATE%.log", frequency: "daily", dateFormat: "YMD", staticData: { hostname: hostname(), pid: process.pid, environment: process.env.NODE_ENV || "development" } }); ``` This will add the hostname, process ID, and environment to every log entry: ```json { "level": "info", "message": "Application started", "timestamp": "2024-01-17T12:34:56.789Z", "hostname": "my-server", "pid": 12345, "environment": "production" } ``` ::: tip Static Data Performance When using static values that don't change during the lifetime of your application (like hostname and process ID), it's better to use a direct object instead of a function: ```typescript // Better performance: object is created once
new LogFileRotationTransport({ filename: "./logs/app-%DATE%.log", staticData: { hostname: hostname(), pid: process.pid, environment: process.env.NODE_ENV || "development" } }); // Use a function only if you need dynamic values
new LogFileRotationTransport({ filename: "./logs/app-%DATE%.log", staticData: () => ({ timestamp:
Date.now(), // Dynamic value that changes hostname: hostname(), // Static value pid: process.pid // Static value }) }); ``` ::: ## Rotation Examples ::: tip Date Format Requirements The transport requires specific date formats based on the rotation frequency: * For daily rotation: use `dateFormat: "YMD"` * For hourly rotation: use `dateFormat: "YMDHm"` * For minute rotation: use `dateFormat: "YMDHm"` These formats ensure proper rotation timing and file naming. ::: ### Daily Rotation ```typescript new LogFileRotationTransport({ filename: "./logs/daily/test-%DATE%.log", frequency: "daily", dateFormat: "YMD", // Required for daily rotation }); ``` ### Hourly Rotation ```typescript new LogFileRotationTransport({ filename: "./logs/hourly/test-%DATE%.log", frequency: "1h", dateFormat: "YMDHm", // Required for hourly rotation }); ``` ### Minute-based Rotation ```typescript new LogFileRotationTransport({ filename: "./logs/minutes/test-%DATE%.log", frequency: "5m", dateFormat: "YMDHm", // Required for minute rotation }); ``` ### Size-based Rotation ```typescript new LogFileRotationTransport({ filename: "./logs/size/app.log", size: "50k", maxLogs: 5, }); ``` ## Changelog View the changelog [here](./changelogs/log-file-rotation-changelog.md). --- --- url: 'https://loglayer.dev/transports/log4js.md' description: Send logs to Log4js with the LogLayer logging library --- # Log4js Transport [![NPM Version](https://img.shields.io/npm/v/%40loglayer%2Ftransport-log4js)](https://www.npmjs.com/package/@loglayer/transport-log4js) [Log4js-node](https://log4js-node.github.io/log4js-node/) is a conversion of the Log4j framework to Node.js. 
[Transport Source](https://github.com/loglayer/loglayer/tree/master/packages/transports/log4js-node) ## Important Notes * Log4js only works in Node.js environments (not in browsers) * By default, logging is disabled and must be configured via `level` or advanced configuration * Consider using Winston as an alternative if Log4js configuration is too complex ## Installation Install the required packages: ::: code-group ```sh [npm] npm i loglayer @loglayer/transport-log4js log4js ``` ```sh [pnpm] pnpm add loglayer @loglayer/transport-log4js log4js ``` ```sh [yarn] yarn add loglayer @loglayer/transport-log4js log4js ``` ::: ## Setup ```typescript import log4js from 'log4js' import { LogLayer } from 'loglayer' import { Log4JsTransport } from "@loglayer/transport-log4js" const logger = log4js.getLogger() // Enable logging output logger.level = "trace" const log = new LogLayer({ transport: new Log4JsTransport({ logger }) }) ``` ## Log Level Mapping | LogLayer | Log4js | |----------|---------| | trace | trace | | debug | debug | | info | info | | warn | warn | | error | error | | fatal | fatal | ## Changelog View the changelog [here](./changelogs/log4js-node-changelog.md). --- --- url: 'https://loglayer.dev/logging-api/basic-logging.md' description: Learn how to log messages at different severity levels with LogLayer --- # Basic Logging LogLayer provides a simple and consistent API for logging messages at different severity levels. This guide covers the basics of logging messages. ## Log Levels LogLayer supports six standard log levels, each with its own method: * `info()` - For general information messages * `warn()` - For warning messages * `error()` - For error messages * `debug()` - For debug information * `trace()` - For detailed debugging information * `fatal()` - For critical errors that require immediate attention ::: info Unsupported Log Levels Some logging libraries may not support all levels. 
In such cases: * `trace` is mapped to `debug` * `fatal` is mapped to `error` ::: ## Basic Message Logging The simplest way to log a message is to use one of the log level methods: ```typescript // Basic info message log.info('User logged in successfully') // Warning message log.warn('API rate limit approaching') // Error message log.error('Failed to connect to database') // Debug message log.debug('Processing request payload') // Trace message (detailed debugging) log.trace('Entering authentication function') // Fatal message (critical errors) log.fatal('System out of memory') ``` ## Message Parameters All log methods accept multiple parameters, which can be strings, booleans, numbers, null, or undefined: ```typescript // Multiple parameters log.info('User', 123, 'logged in') // With string formatting log.info('User %s logged in from %s', 'john', 'localhost') ``` ::: tip sprintf-style formatting The logging library you use may or may not support sprintf-style string formatting. If it does not, you can use the [sprintf plugin](/plugins/sprintf) to enable support. 
::: ## Message Prefixing You can add a prefix to all log messages either through configuration or using the `withPrefix` method: ```typescript // Via configuration const log = new LogLayer({ prefix: '[MyApp]', transport: new ConsoleTransport({ logger: console }) }) // Via method const prefixedLogger = log.withPrefix('[MyApp]') // Output: "[MyApp] User logged in" prefixedLogger.info('User logged in') ``` ## Enabling/Disabling Logging You can control whether logs are output using these methods: ```typescript // Disable all logging log.disableLogging() // Enable logging again log.enableLogging() ``` You can also configure this through the initial configuration: ```typescript const log = new LogLayer({ enabled: false, // Starts with logging disabled transport: new ConsoleTransport({ logger: console }) }) ``` --- --- url: 'https://loglayer.dev/logging-api/metadata.md' description: Learn how to log structured metadata with your log messages in LogLayer --- # Logging with Metadata Metadata allows you to add structured data to your log messages. LogLayer provides several ways to include metadata in your logs. ::: info The output examples use `msg` as the message field. The name of this field may vary depending on the logging library you are using. In the `console` logger, this field does not exist, and the message is printed directly. ::: ## Adding Metadata to Messages The most common way to add metadata is using the `withMetadata` method: ```typescript log.withMetadata({ userId: '123', action: 'login', browser: 'Chrome' }).info('User logged in') ``` By default, this produces a flattened log entry: ```json { "msg": "User logged in", "userId": "123", "action": "login", "browser": "Chrome" } ``` ::: info Passing empty metadata Passing an empty value (`null`, `undefined`, or an empty object) to `withMetadata` will not add any metadata or call related plugins. ::: ## Logging Metadata Only Sometimes you want to log metadata without a message. 
Use `metadataOnly` for this: ```typescript // Default log level is 'info'
log.metadataOnly({ status: 'healthy', memory: '512MB', cpu: '45%' }) // Or specify a different log level
log.metadataOnly({ status: 'warning', memory: '1024MB', cpu: '90%' }, LogLevel.warn) ``` ::: info Passing empty metadata Passing an empty value (`null`, `undefined`, or an empty object) to `metadataOnly` will not add any metadata or call related plugins. ::: ## Structuring Metadata By default, metadata is flattened into the root of the log object. You can change this by configuring a dedicated metadata field: ```typescript const log = new LogLayer({ metadataFieldName: 'metadata', transport: new ConsoleTransport({ logger: console }) }) log.withMetadata({ userId: '123', action: 'login' }).info('User logged in') ``` This produces: ```json { "msg": "User logged in", "metadata": { "userId": "123", "action": "login" } } ``` ## Combining Metadata with Other Data ### With Context Metadata can be combined with context data: ```typescript log.withContext({ requestId: 'abc' }) .withMetadata({ userId: '123' }) .info('Processing request') ``` With dedicated context and metadata field names configured, this produces: ```json { "msg": "Processing request", "context": { "requestId": "abc" }, "metadata": { "userId": "123" } } ``` ### With Errors Metadata can be combined with error logging: ```typescript log.withError(new Error('Database connection failed')) .withMetadata({ dbHost: 'localhost', retryCount: 3 }) .error('Failed to connect') ``` ## Controlling Metadata Output ### Muting Metadata You can temporarily disable metadata output: ```typescript // Via configuration
const log = new LogLayer({ muteMetadata: true, transport: new ConsoleTransport({ logger: console }) }) // Or via methods
log.muteMetadata() // Disable metadata
log.unMuteMetadata() // Re-enable metadata ``` --- --- url: 'https://loglayer.dev/logging-api/typescript.md' description: Notes on using LogLayer with TypeScript --- # TypeScript Tips ## Use `ILogLayer` if you need to type your logger
`ILogLayer` is the interface implemented by `LogLayer`. By using this interface, you will also be able to use the mock `MockLogLayer` class for unit testing. ```typescript import { LogLayer, type ILogLayer } from 'loglayer' const logger: ILogLayer = new LogLayer() ``` ## Use `LogLevel` if you need to type your log level when creating a logger ```typescript import { LogLayer, ConsoleTransport, type LogLevel } from 'loglayer' const logger = new LogLayer({ transport: new ConsoleTransport({ level: process.env.LOG_LEVEL as LogLevel }) }) ``` ## Use `LogLayerTransport` if you need to type an array of transports ```typescript import { LogLayer, ConsoleTransport, type LogLayerTransport, type LogLevel } from 'loglayer' const transports: LogLayerTransport[] = [ new ConsoleTransport({ level: process.env.LOG_LEVEL as LogLevel }), new FileTransport({ level: process.env.LOG_LEVEL as LogLevel }) // FileTransport shown as a second illustrative transport ] const logger = new LogLayer({ transport: transports, }) ``` --- --- url: 'https://loglayer.dev/transports/loglevel.md' description: Send logs to loglevel with the LogLayer logging library --- # loglevel Transport [![NPM Version](https://img.shields.io/npm/v/%40loglayer%2Ftransport-loglevel)](https://www.npmjs.com/package/@loglayer/transport-loglevel) [loglevel](https://github.com/pimterry/loglevel) is a minimal, lightweight logging library for JavaScript. It provides a simple logging API that works in both Node.js and browser environments.
[Transport Source](https://github.com/loglayer/loglayer/blob/master/packages/transports/loglevel) ## Installation ::: code-group ```sh [npm] npm i @loglayer/transport-loglevel loglevel ``` ```sh [pnpm] pnpm add @loglayer/transport-loglevel loglevel ``` ```sh [yarn] yarn add @loglayer/transport-loglevel loglevel ``` ::: ## Setup ```typescript import { LogLayer } from 'loglayer'; import { LogLevelTransport } from '@loglayer/transport-loglevel'; import log from 'loglevel'; const logger = log.getLogger('myapp'); logger.setLevel('trace'); // Enable all log levels const loglayer = new LogLayer({ transport: new LogLevelTransport({ logger, // Optional: control where object data appears in log messages appendObjectData: false // default: false - object data appears first }) }); ``` ## Configuration Options ### `appendObjectData` Controls where object data (metadata, context, errors) appears in the log messages: * `false` (default): Object data appears as the first parameter * `true`: Object data appears as the last parameter Example with `appendObjectData: false` (default): ```typescript loglayer.withMetadata({ user: 'john' }).info('User logged in'); // logger.info({ user: 'john' }, 'User logged in') ``` Example with `appendObjectData: true`: ```typescript loglayer.withMetadata({ user: 'john' }).info('User logged in'); // logger.info('User logged in', { user: 'john' }) ``` ## Log Level Mapping | LogLayer | loglevel | |----------|----------| | trace | trace | | debug | debug | | info | info | | warn | warn | | error | error | | fatal | error | ## Changelog View the changelog [here](./changelogs/loglevel-changelog.md). 
--- --- url: 'https://loglayer.dev/transports/new-relic.md' description: Send logs to New Relic with the LogLayer logging library --- # New Relic Transport [![NPM Version](https://img.shields.io/npm/v/%40loglayer%2Ftransport-new-relic)](https://www.npmjs.com/package/@loglayer/transport-new-relic) The New Relic transport allows you to send logs directly to New Relic's [Log API](https://docs.newrelic.com/docs/logs/log-api/introduction-log-api/). It provides robust features including compression, retry logic, rate limiting support, and validation of New Relic's API constraints. [Transport Source](https://github.com/loglayer/loglayer/tree/master/packages/transports/new-relic) ## Installation ::: code-group ```bash [npm] npm install loglayer @loglayer/transport-new-relic serialize-error ``` ```bash [pnpm] pnpm add loglayer @loglayer/transport-new-relic serialize-error ``` ```bash [yarn] yarn add loglayer @loglayer/transport-new-relic serialize-error ``` ::: ## Basic Usage ```typescript import { LogLayer } from 'loglayer' import { NewRelicTransport } from "@loglayer/transport-new-relic" import { serializeError } from "serialize-error"; const log = new LogLayer({ errorSerializer: serializeError, transport: new NewRelicTransport({ apiKey: "YOUR_NEW_RELIC_API_KEY", endpoint: "https://log-api.newrelic.com/log/v1", // optional, this is the default useCompression: true, // optional, defaults to true maxRetries: 3, // optional, defaults to 3 retryDelay: 1000, // optional, base delay in ms, defaults to 1000 respectRateLimit: true, // optional, defaults to true onError: (err) => { console.error('Failed to send logs to New Relic:', err); }, onDebug: (entry) => { console.log('Log entry being sent:', entry); }, }) }) // Use the logger log.info("This is a test message"); log.withMetadata({ userId: "123" }).error("User not found"); ``` ## Configuration Options | Option | Type | Default | Description | |--------|------|---------|-------------| | `apiKey` | `string` | - | **Required.** 
Your New Relic API key | | `endpoint` | `string` | `"https://log-api.newrelic.com/log/v1"` | The New Relic Log API endpoint | | `useCompression` | `boolean` | `true` | Whether to use gzip compression | | `maxRetries` | `number` | `3` | Maximum number of retry attempts | | `retryDelay` | `number` | `1000` | Base delay between retries (ms) | | `respectRateLimit` | `boolean` | `true` | Whether to respect rate limiting | | `onError` | `(err: Error) => void` | - | Error handling callback | | `onDebug` | `(entry: Record<string, any>) => void` | - | Debug callback for inspecting log entries | | `enabled` | `boolean` | `true` | Whether the transport is enabled | | `level` | `"trace" \| "debug" \| "info" \| "warn" \| "error" \| "fatal"` | `"trace"` | Minimum log level to process. Logs below this level will be filtered out | ## Features ### Compression The transport uses gzip compression by default to reduce bandwidth usage. You can disable this if needed: ```typescript new NewRelicTransport({ apiKey: "YOUR_API_KEY", useCompression: false }) ``` ### Retry Logic The transport includes a sophisticated retry mechanism with exponential backoff and jitter: ```typescript new NewRelicTransport({ apiKey: "YOUR_API_KEY", maxRetries: 5, // Increase max retries retryDelay: 2000 // Increase base delay to 2 seconds }) ``` The actual delay between retries is calculated using: ``` delay = baseDelay * (2 ^ attemptNumber) + random(0-200)ms ``` ### Rate Limiting The transport handles New Relic's rate limiting in two ways: 1. **Respect Rate Limits (Default)** ```typescript new NewRelicTransport({ apiKey: "YOUR_API_KEY", respectRateLimit: true // This is the default }) ``` * Waits for the duration specified in the `Retry-After` header * Rate limit retries don't count against `maxRetries` * Uses 60 seconds as default wait time if no header is present 2.
**Ignore Rate Limits** ```typescript new NewRelicTransport({ apiKey: "YOUR_API_KEY", respectRateLimit: false, onError: (err) => { if (err.name === "RateLimitError") { // Handle rate limit error } } }) ``` * Fails immediately when rate limited * Calls `onError` with a `RateLimitError` ### Validation The transport automatically validates logs against New Relic's constraints: ```typescript // This will be validated:
log.withMetadata({ ["x".repeat(300)]: "value", // Will throw ValidationError (attribute name longer than 255 characters) normalKey: "x".repeat(5000) // Will be truncated to 4094 characters }).info("Test message") ``` Validation includes: * Maximum payload size of 1MB (before and after compression) * Maximum of 255 attributes per log entry * Maximum attribute name length of 255 characters * Automatic truncation of attribute values longer than 4094 characters ### Error Handling The transport provides detailed error information through the `onError` callback: ```typescript new NewRelicTransport({ apiKey: "YOUR_API_KEY", onError: (err) => { switch (err.name) { case "ValidationError": // Handle validation errors (payload size, attribute limits) break; case "RateLimitError": { // Handle rate limiting errors const rateLimitErr = err as RateLimitError; console.log(`Rate limited. Retry after: ${rateLimitErr.retryAfter}s`); break; } default: // Handle other errors (network, API errors) console.error("Failed to send logs:", err.message); } } }) ``` ### Debug Callback The transport includes a debug callback that allows you to inspect log entries before they are sent to New Relic: ```typescript new NewRelicTransport({ apiKey: "YOUR_API_KEY", onDebug: (entry) => { // Log the entry being sent console.log('Sending log entry:', JSON.stringify(entry, null, 2)); } }) ``` ## Best Practices 1. **Error Handling**: Always provide an `onError` callback to handle failures gracefully. 2. **Compression**: Keep compression enabled unless you have a specific reason to disable it. 3.
**Rate Limiting**: Use the default rate limit handling unless you have a custom rate limiting strategy. 4. **Retry Configuration**: Adjust `maxRetries` and `retryDelay` based on your application's needs: * Increase for critical logs that must be delivered * Decrease for high-volume, less critical logs 5. **Validation**: Be aware of the attribute limits when adding metadata to avoid validation errors. ## TypeScript Support The transport is written in TypeScript and provides full type definitions: ```typescript import type { NewRelicTransportConfig } from "@loglayer/transport-new-relic" const config: NewRelicTransportConfig = { apiKey: "YOUR_API_KEY", // TypeScript will enforce correct options } ``` ## Changelog View the changelog [here](./changelogs/new-relic-changelog.md). --- --- url: 'https://loglayer.dev/plugins/opentelemetry.md' description: Add OpenTelemetry trace context to logs using LogLayer --- # OpenTelemetry Plugin [![NPM Version](https://img.shields.io/npm/v/%40loglayer%2Fplugin-opentelemetry)](https://www.npmjs.com/package/@loglayer/plugin-opentelemetry) [Plugin Source](https://github.com/loglayer/loglayer/tree/master/packages/plugins/opentelemetry) The OpenTelemetry plugin for [LogLayer](https://loglayer.dev) uses [`@opentelemetry/api`](https://www.npmjs.com/package/@opentelemetry/api) to store the following in the log context: * `trace_id` * `span_id` * `trace_flags` ::: info If you are using OpenTelemetry with log processors, use the [OpenTelemetry Transport](/transports/opentelemetry). If you don't know what that is, then you'll want to use this plugin instead of the transport.
::: ## Installation ::: code-group ```bash [npm] npm install loglayer @loglayer/plugin-opentelemetry ``` ```bash [yarn] yarn add loglayer @loglayer/plugin-opentelemetry ``` ```bash [pnpm] pnpm add loglayer @loglayer/plugin-opentelemetry ``` ::: ## Usage Follow the [OpenTelemetry Getting Started Guide](https://opentelemetry.io/docs/languages/js/getting-started/nodejs/) to set up OpenTelemetry in your application. ```typescript import { LogLayer, ConsoleTransport } from 'loglayer' import { openTelemetryPlugin } from '@loglayer/plugin-opentelemetry' const logger = new LogLayer({ transport: [ new ConsoleTransport({ logger: console }), ], plugins: [ openTelemetryPlugin() ] }); ``` ## Configuration The plugin accepts the following configuration options: ```typescript interface OpenTelemetryPluginParams { /** * If specified, all trace fields will be nested under this key */ traceFieldName?: string; /** * Field name for the trace ID. Defaults to 'trace_id' */ traceIdFieldName?: string; /** * Field name for the span ID. Defaults to 'span_id' */ spanIdFieldName?: string; /** * Field name for the trace flags. Defaults to 'trace_flags' */ traceFlagsFieldName?: string; /** * Whether the plugin is disabled */ disabled?: boolean; } ``` ### Example with Custom Configuration ```typescript const logger = new LogLayer({ transport: [ new ConsoleTransport({ logger: console }), ], plugins: [ openTelemetryPlugin({ // Nest all trace fields under 'trace' traceFieldName: 'trace', // Custom field names traceIdFieldName: 'traceId', spanIdFieldName: 'spanId', traceFlagsFieldName: 'flags' }) ] }); ``` This would output logs with the following structure: ```json { "trace": { "traceId": "8de71fcab951aad172f1148c74d0877e", "spanId": "349623465c6dfc1b", "flags": "01" } } ``` ## Example with Express This example has been tested to work with the plugin. 
* It sets up `express`-based instrumentation using OpenTelemetry * Going to the root endpoint will log a message with the request context ### Installation ::: info This setup assumes you have TypeScript configured and have `tsx` installed as a dev dependency. ::: ::: code-group ```bash [npm] npm install express loglayer @loglayer/plugin-opentelemetry serialize-error \ @opentelemetry/instrumentation-express @opentelemetry/instrumentation-http \ @opentelemetry/resources @opentelemetry/sdk-node \ @opentelemetry/semantic-conventions ``` ```bash [yarn] yarn add express loglayer @loglayer/plugin-opentelemetry serialize-error \ @opentelemetry/instrumentation-express @opentelemetry/instrumentation-http \ @opentelemetry/resources @opentelemetry/sdk-node \ @opentelemetry/semantic-conventions ``` ```bash [pnpm] pnpm add express loglayer @loglayer/plugin-opentelemetry serialize-error \ @opentelemetry/instrumentation-express @opentelemetry/instrumentation-http \ @opentelemetry/resources @opentelemetry/sdk-node \ @opentelemetry/semantic-conventions ``` ::: ### Files #### instrumentation.ts ```typescript // instrumentation.ts
import { ExpressInstrumentation } from "@opentelemetry/instrumentation-express"; import { HttpInstrumentation } from "@opentelemetry/instrumentation-http"; import { Resource } from "@opentelemetry/resources"; import { NodeSDK } from "@opentelemetry/sdk-node"; import { ATTR_SERVICE_NAME, ATTR_SERVICE_VERSION } from "@opentelemetry/semantic-conventions"; const sdk = new NodeSDK({ resource: new Resource({ [ATTR_SERVICE_NAME]: "yourServiceName", [ATTR_SERVICE_VERSION]: "1.0", }), instrumentations: [ // Express instrumentation expects HTTP layer to be instrumented new HttpInstrumentation(), new ExpressInstrumentation(), ], }); sdk.start(); ``` #### app.ts ```typescript // app.ts
import express from "express"; import { type ILogLayer, LogLayer } from "loglayer"; import { serializeError } from "serialize-error"; import { openTelemetryPlugin } from
"@loglayer/plugin-opentelemetry"; import { ConsoleTransport } from "loglayer"; const app = express(); // Add types for the req.log property declare global { namespace Express { interface Request { log: ILogLayer; } } } // Define logging middleware app.use((req, res, next) => { // Create a new LogLayer instance for each request req.log = new LogLayer({ transport: new ConsoleTransport({ logger: console, }), errorSerializer: serializeError, plugins: [openTelemetryPlugin()], }).withContext({ reqId: crypto.randomUUID(), // Add unique request ID method: req.method, path: req.path, }); next(); }); function sayHelloWorld(req: express.Request) { req.log.info("Printing hello world"); return "Hello world!"; } // Use the logger in your routes app.get("/", (req, res) => { req.log.info("Processing request to root endpoint"); // Add additional context for specific logs req.log.withContext({ query: req.query }).info("Request includes query parameters"); res.send(sayHelloWorld(req)); }); // Error handling middleware app.use((err: Error, req: express.Request, res: express.Response, next: express.NextFunction) => { req.log.withError(err).error("An error occurred while processing the request"); res.status(500).send("Internal Server Error"); }); app.listen(3000, () => { console.log("Server started on http://localhost:3000"); }); ``` ### Running the Example ```bash npx tsx --import ./instrumentation.ts ./app.ts ``` Then visit `http://localhost:3000` in your browser. ### Sample Output Output might look like this: ```text { reqId: 'c34ab246-fc51-4b69-9ba6-5e0dfa150e5a', method: 'GET', path: '/', query: {}, trace_id: '8de71fcab951aad172f1148c74d0877e', span_id: '349623465c6dfc1b', trace_flags: '01' } Printing hello world ``` ## Changelog View the changelog [here](./changelogs/opentelemetry-changelog.md). 
--- --- url: 'https://loglayer.dev/transports/opentelemetry.md' description: Send logs to OpenTelemetry with the LogLayer logging library --- # OpenTelemetry Transport [![NPM Version](https://img.shields.io/npm/v/%40loglayer%2Ftransport-opentelemetry)](https://www.npmjs.com/package/@loglayer/transport-opentelemetry) [Transport Source](https://github.com/loglayer/loglayer/tree/master/packages/transports/opentelemetry) The OpenTelemetry transport sends logs using the [OpenTelemetry Logs SDK](https://www.npmjs.com/package/@opentelemetry/sdk-logs). This allows you to integrate logs with OpenTelemetry's observability ecosystem. Compatible with OpenTelemetry JS API and SDK `1.0+`. ::: info In most cases, you should use the [OpenTelemetry Plugin](/plugins/opentelemetry) instead as it stamps logs with trace context. Use this transport if you are using OpenTelemetry log processors, where the log processors do the actual shipping of logs. ::: ### Acknowledgements A lot of the code is based on the [@opentelemetry/winston-transport](https://github.com/open-telemetry/opentelemetry-js-contrib/tree/main/packages/winston-transport) code, which is licensed under Apache 2.0. ## Installation ::: code-group ```bash [npm] npm install loglayer @loglayer/transport-opentelemetry serialize-error ``` ```bash [yarn] yarn add loglayer @loglayer/transport-opentelemetry serialize-error ``` ```bash [pnpm] pnpm add loglayer @loglayer/transport-opentelemetry serialize-error ``` ::: ## Usage Follow the [OpenTelemetry Getting Started Guide](https://opentelemetry.io/docs/languages/js/getting-started/nodejs/) to set up OpenTelemetry in your application. 
```typescript import { LogLayer } from 'loglayer' import { OpenTelemetryTransport } from '@loglayer/transport-opentelemetry' import { serializeError } from 'serialize-error' const logger = new LogLayer({ errorSerializer: serializeError, // This will send logs to the OpenTelemetry SDK
// Where it sends to depends on the configured logRecordProcessors in the SDK
transport: [ new OpenTelemetryTransport({ // Optional: provide a custom error handler onError: (error) => console.error('OpenTelemetry logging error:', error), // Optional: disable the transport enabled: process.env.NODE_ENV !== 'test', // Optional: enable console debugging consoleDebug: process.env.DEBUG === 'true' }), ], }); ``` ## Configuration Options ### `level` Minimum log level to process. Logs below this level will be filtered out. * Type: `'trace' | 'debug' | 'info' | 'warn' | 'error' | 'fatal'` * Default: `'trace'` (allows all logs) Example: ```typescript // Only process info logs and above (info, warn, error, fatal)
new OpenTelemetryTransport({ level: 'info' }) // Process all logs (default behavior)
new OpenTelemetryTransport() ``` ### `onError` Callback to handle errors that occur when logging. * Type: `(error: any) => void` * Default: `undefined` ### `enabled` Enable or disable the transport. * Type: `boolean` * Default: `true` ### `consoleDebug` Enable console debugging for the transport. * Type: `boolean` * Default: `false` ## Example with Express This example has been tested to work with the `OpenTelemetryTransport`. * It uses the OpenTelemetry SDK to send logs to the console (via `logRecordProcessors`) * It sets up `express`-based instrumentation using OpenTelemetry * Going to the root endpoint will log a message with the request context ### Installation ::: info This setup assumes you have TypeScript configured and have `tsx` installed as a dev dependency.
::: ::: code-group ```bash [npm] npm install express loglayer @loglayer/transport-opentelemetry serialize-error \ @opentelemetry/instrumentation-express @opentelemetry/instrumentation-http \ @opentelemetry/resources @opentelemetry/sdk-logs @opentelemetry/sdk-node \ @opentelemetry/sdk-trace-node @opentelemetry/semantic-conventions ``` ```bash [yarn] yarn add express loglayer @loglayer/transport-opentelemetry serialize-error \ @opentelemetry/instrumentation-express @opentelemetry/instrumentation-http \ @opentelemetry/resources @opentelemetry/sdk-logs @opentelemetry/sdk-node \ @opentelemetry/sdk-trace-node @opentelemetry/semantic-conventions ``` ```bash [pnpm] pnpm add express loglayer @loglayer/transport-opentelemetry serialize-error \ @opentelemetry/instrumentation-express @opentelemetry/instrumentation-http \ @opentelemetry/resources @opentelemetry/sdk-logs @opentelemetry/sdk-node \ @opentelemetry/sdk-trace-node @opentelemetry/semantic-conventions ``` ::: ### Files #### instrumentation.ts ```typescript // instrumentation.ts import { ExpressInstrumentation } from "@opentelemetry/instrumentation-express"; import { HttpInstrumentation } from "@opentelemetry/instrumentation-http"; import { Resource } from "@opentelemetry/resources"; import { ConsoleLogRecordExporter, SimpleLogRecordProcessor } from "@opentelemetry/sdk-logs"; import { NodeSDK } from "@opentelemetry/sdk-node"; import { ConsoleSpanExporter } from "@opentelemetry/sdk-trace-node"; import { ATTR_SERVICE_NAME, ATTR_SERVICE_VERSION } from "@opentelemetry/semantic-conventions"; const sdk = new NodeSDK({ resource: new Resource({ [ATTR_SERVICE_NAME]: "yourServiceName", [ATTR_SERVICE_VERSION]: "1.0", }), traceExporter: new ConsoleSpanExporter(), logRecordProcessors: [new SimpleLogRecordProcessor(new ConsoleLogRecordExporter())], instrumentations: [ // Express instrumentation expects HTTP layer to be instrumented new HttpInstrumentation(), new ExpressInstrumentation(), ], }); sdk.start(); ``` #### app.ts 
```typescript // app.ts import express from "express"; import { type ILogLayer, LogLayer } from "loglayer"; import { serializeError } from "serialize-error"; import { OpenTelemetryTransport } from "@loglayer/transport-opentelemetry"; const app = express(); // Add types for the req.log property declare global { namespace Express { interface Request { log: ILogLayer; } } } // Define logging middleware app.use((req, res, next) => { // Create a new LogLayer instance for each request req.log = new LogLayer({ transport: [new OpenTelemetryTransport({ // Optional: provide a custom error handler onError: (error) => console.error('OpenTelemetry logging error:', error), // Optional: provide a custom ID id: 'otel-transport', // Optional: disable the transport enabled: process.env.NODE_ENV !== 'test', // Optional: enable console debugging consoleDebug: process.env.DEBUG === 'true' })], errorSerializer: serializeError, }).withContext({ reqId: crypto.randomUUID(), // Add unique request ID method: req.method, path: req.path, }); next(); }); function sayHelloWorld(req: express.Request) { req.log.info("Printing hello world"); return "Hello world!"; } // Use the logger in your routes app.get("/", (req, res) => { req.log.info("Processing request to root endpoint"); // Add additional context for specific logs req.log.withContext({ query: req.query }).info("Request includes query parameters"); res.send(sayHelloWorld(req)); }); // Error handling middleware app.use((err: Error, req: express.Request, res: express.Response, next: express.NextFunction) => { req.log.withError(err).error("An error occurred while processing the request"); res.status(500).send("Internal Server Error"); }); app.listen(3000, () => { console.log("Server started on http://localhost:3000"); }); ``` ### Running the Example ```bash npx tsx --import ./instrumentation.ts ./app.ts ``` Then visit `http://localhost:3000` in your browser. 
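Conceptually, the transport turns the request-scoped context (and any per-log metadata) into flat attributes on the emitted log record. The sketch below is illustrative only, not the transport's actual code; `buildAttributes` is a hypothetical name used to show the merge:

```typescript
// Illustrative sketch: context fields and per-log metadata end up as flat
// attributes on the emitted OpenTelemetry log record.
type Attributes = Record<string, string | number | boolean>;

function buildAttributes(context: Attributes, metadata: Attributes = {}): Attributes {
  // Per-log metadata takes precedence over longer-lived request context on key conflicts.
  return { ...context, ...metadata };
}

const attrs = buildAttributes(
  { reqId: "3164c21c-d195-49be-b757-3e056881f3d6", method: "GET", path: "/" },
  { query: "a=1" }
);
// attrs now contains reqId, method, path, and query
```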
### Sample Output

Output might look like this:

```json
{
  "resource": {
    "attributes": {
      "service.name": "yourServiceName",
      "telemetry.sdk.language": "nodejs",
      "telemetry.sdk.name": "opentelemetry",
      "telemetry.sdk.version": "1.30.0",
      "service.version": "1.0"
    }
  },
  "instrumentationScope": {
    "name": "loglayer",
    "version": "5.1.1",
    "schemaUrl": "undefined"
  },
  "timestamp": 1736730221608000,
  "traceId": "c738a5f750b89f988d679235405e1b3b",
  "spanId": "676f11075b9785d9",
  "traceFlags": 1,
  "severityText": "info",
  "severityNumber": 9,
  "body": "Printing hello world",
  "attributes": {
    "reqId": "3164c21c-d195-49be-b757-3e056881f3d6",
    "method": "GET",
    "path": "/"
  }
}
```

---

---
url: 'https://loglayer.dev/transports/pino.md'
description: Send logs to Pino with the LogLayer logging library
---

# Pino Transport

[![NPM Version](https://img.shields.io/npm/v/%40loglayer%2Ftransport-pino)](https://www.npmjs.com/package/@loglayer/transport-pino)

[Pino](https://github.com/pinojs/pino) is a very low overhead Node.js logger, focused on performance.

[Transport Source](https://github.com/loglayer/loglayer/tree/master/packages/transports/pino)

## Installation

Install the required packages:

::: code-group

```sh [npm]
npm i loglayer @loglayer/transport-pino pino
```

```sh [pnpm]
pnpm add loglayer @loglayer/transport-pino pino
```

```sh [yarn]
yarn add loglayer @loglayer/transport-pino pino
```

:::

## Setup

```typescript
import pino from 'pino'
import { LogLayer } from 'loglayer'
import { PinoTransport } from "@loglayer/transport-pino"

const p = pino({
  level: 'trace' // Enable all log levels
})

const log = new LogLayer({
  transport: new PinoTransport({
    logger: p
  })
})
```

## Log Level Mapping

| LogLayer | Pino  |
|----------|-------|
| trace    | trace |
| debug    | debug |
| info     | info  |
| warn     | warn  |
| error    | error |
| fatal    | fatal |

## Changelog

View the changelog [here](./changelogs/pino-changelog.md).
--- --- url: 'https://loglayer.dev/plugins.md' description: Learn how to create and use plugins with LogLayer --- # Plugins LogLayer's plugin system allows you to extend and modify logging behavior at various points in the log lifecycle. Plugins can modify data and messages before they're sent to the logging library, control whether logs should be sent, and intercept metadata calls. ## Plugin Management ### Adding Plugins You can add plugins when creating the LogLayer instance: ```typescript const log = new LogLayer({ transport: new ConsoleTransport({ logger: console }), plugins: [ timestampPlugin(), { // id is optional id: 'sensitive-data-filter', onBeforeDataOut(params) { // a simple plugin that does something return params.data } } ] }) ``` Or add them later: ```typescript log.addPlugins([timestampPlugin()]) log.addPlugins([{ onBeforeDataOut(params) { // a simple plugin that does something return params.data } }]) ``` ### Enabling/Disabling Plugins Plugins can be enabled or disabled at runtime using their ID (if defined): ```typescript // Disable a plugin log.disablePlugin('sensitive-data-filter') // Enable a plugin log.enablePlugin('sensitive-data-filter') ``` ### Removing Plugins Remove a plugin using its ID (if defined): ```typescript log.removePlugin('sensitive-data-filter') ``` ### Replacing All Plugins Replace all existing plugins with new ones: ```typescript log.withFreshPlugins([ timestampPlugin(), { onBeforeDataOut(params) { // do something return params.data } } ]) ``` When used with child loggers, this only affects the current logger instance and does not modify the parent's plugins. ::: warning Potential Performance Impact Replacing plugins at runtime may have a performance impact if you are frequently creating new plugins. It is recommended to re-use the same plugin instance(s) where possible. 
:::

---

---
url: 'https://loglayer.dev/transports/pretty-terminal.md'
description: Interact with pretty printed logs in the terminal
---

# Pretty Terminal Transport

[![NPM Version](https://img.shields.io/npm/v/%40loglayer%2Ftransport-pretty-terminal)](https://www.npmjs.com/package/@loglayer/transport-pretty-terminal)

[Transport Source](https://github.com/loglayer/loglayer/tree/master/packages/transports/pretty-terminal)

The Pretty Terminal Transport provides interactive, pretty-printed logs in the terminal, with interactive browsing, text search, detailed viewing for large logs, and themes.

## Features

* 🔍 **Interactive Selection Mode** - Browse and inspect logs in a full-screen interactive view
* 📝 **Detailed Log Inspection** - Examine individual log entries with formatted data and context
* 🔎 **Search/Filter Functionality** - Find specific logs with powerful filtering capabilities
* 💅 **JSON Pretty Printing** - Beautifully formatted structured data with syntax highlighting
* 🎭 **Configurable Themes** - Choose from pre-built themes or customize your own colors

## Installation

::: warning Compatibility Note
Pretty Terminal has only been tested on macOS with the native Terminal app and [Warp](https://www.warp.dev/). It may not work as expected in other terminal emulators or operating systems.
:::

::: code-group

```bash [npm]
npm install loglayer @loglayer/transport-pretty-terminal serialize-error
```

```bash [pnpm]
pnpm add loglayer @loglayer/transport-pretty-terminal serialize-error
```

```bash [yarn]
yarn add loglayer @loglayer/transport-pretty-terminal serialize-error
```

:::

## Basic Usage

::: warning Development Only
Pretty Terminal is designed for local development in a terminal only. It should not be used in production environments. It is recommended that you disable other transports when using Pretty Terminal to avoid duplicate log output.
::: ```typescript import { LogLayer, ConsoleTransport } from 'loglayer'; import { getPrettyTerminal } from '@loglayer/transport-pretty-terminal'; import { serializeError } from "serialize-error"; // Create LogLayer instance with the transport const log = new LogLayer({ errorSerializer: serializeError, transport: [ new ConsoleTransport({ // Example of how to enable a transport for non-development environments enabled: process.env.NODE_ENV !== 'development', }), getPrettyTerminal({ // Only enable Pretty Terminal in development enabled: process.env.NODE_ENV === 'development', }) ], }); // Start logging! log.withMetadata({ foo: 'bar' }).info('Hello from Pretty Terminal!'); ``` ::: warning Single-Instance Only Because Pretty Terminal is an interactive transport, it may not work well if you run multiple applications in the same terminal window that share the same output stream. If you need to run multiple applications that use Pretty Terminal in the same terminal window, you can: 1. Use the `disableInteractiveMode` option to disable keyboard input and navigation features 2. Keep interactive mode enabled in only one application and disable it in others The transport is designed to work as a single interactive instance. `getPrettyTerminal()` can be safely used multiple times in the same application as it uses the same transport reference. ::: ::: warning Performance Note Logs are stored using an in-memory SQLite database by default. For long-running applications or large log volumes, consider using a persistent storage file using the `logFile` option to avoid out of memory issues. 
::: ## Keyboard Controls The Pretty Terminal Transport provides an interactive interface with three main modes: ### Simple View (Default) ![Simple View](/images/pretty-terminal/simple-view.webp) The default view shows real-time log output with the following controls: * `P`: Toggle pause/resume of log output * `C`: Cycle through view modes (full → truncated → condensed) * `↑/↓`: Enter selection mode When paused, new logs are buffered and a counter shows how many logs are waiting. Resuming will display all buffered logs. The view has three modes: * **Full View** (default): Shows all information with complete data structures (no truncation) * **Truncated View**: Shows complete log information including timestamp, ID, level, message, with data structures truncated based on `maxInlineDepth` and `maxInlineLength` settings * **Condensed View**: Shows only the timestamp, log level and message for a cleaner output (no data shown) When entering selection mode while paused: * Only logs that were visible before pause are shown initially * Buffered logs from pause are tracked as new logs * The notification shows how many new logs are available * Pressing ↓ at the bottom will reveal new logs ### Selection Mode ![Selection Mode](/images/pretty-terminal/selection-mode.webp) An interactive mode for browsing and filtering logs: * `↑/↓`: Navigate through logs * `ENTER`: View detailed log information (preserves current filter) * `TAB`: Return to simple view * Type to filter logs (searches through all log content) * `BACKSPACE`: Edit/clear filter text When filtering is active: * Only matching logs are displayed * The filter persists when entering detail view * Navigation (↑/↓) only moves through filtered results * New logs that match the filter are automatically included Each log entry in selection mode shows: * Timestamp and log ID * Log level with color coding * Complete message * Full structured data inline (like simple view's full mode) * Selected entry is highlighted with `►` ### 
Detail View ![Detail View](/images/pretty-terminal/detail-view.webp) A full-screen view showing comprehensive log information: * `↑/↓`: Scroll through log content line by line * `Q/W`: Page up/down through content * `←/→`: Navigate to previous/next log entry (respects active filter) * `A/S`: Jump to first/last log entry * `C`: Toggle array collapse in JSON data * `J`: Toggle raw JSON view (for easy copying) * `TAB`: Return to selection view (or return to detail view from JSON view) Features in Detail View: * Shows full timestamp and log level * Displays complete structured data with syntax highlighting * Shows context (previous and next log entries) * Shows active filter in header when filtering is enabled * Auto-updates when viewing latest log (respects current filter) * Pretty-prints JSON data with color coding * Collapsible arrays for better readability * Raw JSON view for easy copying ## Configuration The Pretty Terminal Transport can be customized with various options: ```typescript import { getPrettyTerminal, moonlight } from '@loglayer/transport-pretty-terminal'; const transport = getPrettyTerminal({ // Maximum depth for inline data display in truncated mode maxInlineDepth: 4, // Maximum length for inline data in truncated mode maxInlineLength: 120, // Custom theme configuration (default is moonlight) theme: moonlight, // Optional path to SQLite file for persistent storage logFile: 'path/to/logs.sqlite', // Enable/disable the transport (defaults to true) enabled: process.env.NODE_ENV === 'development', // Disable interactive mode for multi-app terminal output (defaults to false) disableInteractiveMode: false, }); ``` ### Configuration Options | Option | Type | Default | Description | |--------|------|---------|-------------| | `maxInlineDepth` | number | 4 | Maximum depth for displaying nested data inline. Only applies in truncated view mode. Selection mode and detail view always show full depth. 
| | `maxInlineLength` | number | 120 | Maximum length for inline data before truncating. Only applies in truncated view mode. Selection mode and detail view always show full content. | | `theme` | PrettyTerminalTheme | moonlight | Theme configuration for colors and styling | | `logFile` | string | ":memory:" | Path to SQLite file for persistent storage. Relative paths are resolved from the current working directory. If not provided, uses in-memory database | | `enabled` | boolean | true | Whether the transport is enabled. If false, all operations will no-op | | `disableInteractiveMode` | boolean | false | Whether to disable interactive mode (keyboard input and navigation). Useful when multiple applications need to print to the same terminal | ::: warning Security Note If using the `logFile` option, be aware that: 1. All logs will be stored in the specified SQLite database file. 2. The file will be purged of any existing data when the transport initializes 3. Relative paths (e.g., "logs/app.db") are resolved from the current working directory 4. It is recommended to add the `logFile` path to your `.gitignore` file to avoid committing sensitive log data 5. Do not use the same logfile path in another application (as in two separate applications running the transport against the same file) to avoid data corruption. If you do have sensitive data that shouldn't be logged in general, use the [Redaction Plugin](/plugins/redaction) to filter out sensitive information before logging. ::: ## Themes The transport comes with several built-in themes to match your terminal style: ### Moonlight Theme (Default) A dark theme with cool blue tones, perfect for night-time coding sessions and modern IDEs. 
![Moonlight Theme](/images/pretty-terminal/moonlight.webp) ```typescript import { getPrettyTerminal, moonlight } from '@loglayer/transport-pretty-terminal'; const transport = getPrettyTerminal({ theme: moonlight, }); ``` ### Sunlight Theme A light theme with warm tones, ideal for daytime use, high-glare environments, and printed documentation. ![Sunlight Theme](/images/pretty-terminal/sunlight.webp) ```typescript import { getPrettyTerminal, sunlight } from '@loglayer/transport-pretty-terminal'; const transport = getPrettyTerminal({ theme: sunlight, }); ``` ### Neon Theme A vibrant, cyberpunk-inspired theme with electric colors and high contrast, perfect for modern tech-focused applications. ![Neon Theme](/images/pretty-terminal/neon.webp) ```typescript import { getPrettyTerminal, neon } from '@loglayer/transport-pretty-terminal'; const transport = getPrettyTerminal({ theme: neon, }); ``` ### Nature Theme A light theme with organic, earthy colors inspired by forest landscapes. Great for nature-inspired interfaces and applications focusing on readability. ![Nature Theme](/images/pretty-terminal/nature.webp) ```typescript import { getPrettyTerminal, nature } from '@loglayer/transport-pretty-terminal'; const transport = getPrettyTerminal({ theme: nature, }); ``` ### Pastel Theme A soft, calming theme with gentle colors inspired by watercolor paintings. Perfect for long coding sessions and reduced visual stress. 
![Pastel Theme](/images/pretty-terminal/pastel.webp) ```typescript import { getPrettyTerminal, pastel } from '@loglayer/transport-pretty-terminal'; const transport = getPrettyTerminal({ theme: pastel, }); ``` ## Custom Themes You can create your own theme by implementing the `PrettyTerminalTheme` interface, which uses [`chalk`](https://github.com/chalk/chalk) for color styling: ```typescript import { getPrettyTerminal, chalk } from '@loglayer/transport-pretty-terminal'; const myCustomTheme = { // Configuration for the default log view shown in real-time simpleView: { // Color configuration for different log levels colors: { trace: chalk.gray, // Style for trace level logs debug: chalk.blue, // Style for debug level logs info: chalk.green, // Style for info level logs warn: chalk.yellow, // Style for warning level logs error: chalk.red, // Style for error level logs fatal: chalk.bgRed.white, // Style for fatal level logs - background red with white text }, logIdColor: chalk.dim, // Style for the unique log identifier dataValueColor: chalk.white, // Style for the actual values in structured data dataKeyColor: chalk.dim, // Style for the keys/property names in structured data selectorColor: chalk.cyan, // Style for the selection indicator (►) in selection mode }, detailedView: { // Inherits all options from simpleView, plus additional detailed view options colors: { trace: chalk.gray, debug: chalk.blue, info: chalk.green, warn: chalk.yellow, error: chalk.red, fatal: chalk.bgRed.white, }, logIdColor: chalk.dim, dataValueColor: chalk.white, dataKeyColor: chalk.dim, // Additional detailed view specific options headerColor: chalk.bold.cyan, // Style for section headers labelColor: chalk.bold, // Style for field labels (e.g., "Timestamp:", "Level:") separatorColor: chalk.dim, // Style for visual separators // Configuration for JSON pretty printing jsonColors: { keysColor: chalk.dim, // Style for JSON property names dashColor: chalk.dim, // Style for array item dashes 
numberColor: chalk.yellow, // Style for numeric values stringColor: chalk.green, // Style for string values multilineStringColor: chalk.green, // Style for multiline strings positiveNumberColor: chalk.yellow, // Style for positive numbers negativeNumberColor: chalk.red, // Style for negative numbers booleanColor: chalk.cyan, // Style for boolean values nullUndefinedColor: chalk.gray, // Style for null/undefined values dateColor: chalk.magenta, // Style for date values }, }, }; const transport = getPrettyTerminal({ theme: myCustomTheme, }); ``` --- --- url: 'https://loglayer.dev/plugins/redaction.md' description: Learn how to use the redaction plugin to protect sensitive data in your logs --- # Redaction Plugin [![NPM Version](https://img.shields.io/npm/v/%40loglayer%2Fplugin-redaction)](https://www.npmjs.com/package/@loglayer/plugin-redaction) [Plugin Source](https://github.com/loglayer/loglayer/tree/master/packages/plugins/redaction) The redaction plugin for LogLayer provides a simple way to redact sensitive information from your logs using [fast-redact](https://www.npmjs.com/package/fast-redact). It currently only performs redaction on metadata. ## Installation ::: code-group ```sh [npm] npm install @loglayer/plugin-redaction ``` ```sh [pnpm] pnpm add @loglayer/plugin-redaction ``` ```sh [yarn] yarn add @loglayer/plugin-redaction ``` ::: ## Basic Usage ```typescript import { LogLayer, ConsoleTransport } from 'loglayer' import { redactionPlugin } from '@loglayer/plugin-redaction' const log = new LogLayer({ transport: new ConsoleTransport({ logger: console, }), plugins: [ redactionPlugin({ paths: ["password"], }), ], }) // The password will be redacted in the output log.metadataOnly({ password: "123456", }) ``` ## Configuration Options ```typescript interface RedactionPluginOptions { /** * Unique identifier for the plugin. Used for selectively disabling / enabling * and removing the plugin. 
*/ id?: string; /** * If true, the plugin will skip execution */ disabled?: boolean; /** * An array of strings describing the nested location of a key in an object. * See https://www.npmjs.com/package/fast-redact for path syntax. */ paths?: string[]; /** * This is the value which overwrites redacted properties. * Default: "[REDACTED]" */ censor?: string | ((v: any) => any); /** * When set to true, will cause keys to be removed from the serialized output. * Default: false */ remove?: boolean; /** * When set to true, will cause the redactor function to throw if instead of an object it finds a primitive. * Default: false */ strict?: boolean; } ``` ## Examples ### Basic Path Redaction ```typescript const log = new LogLayer({ transport: new ConsoleTransport({ logger: console, }), plugins: [ redactionPlugin({ paths: ["password", "creditCard", "ssn"], }), ], }) log.metadataOnly({ user: "john", password: "secret123", creditCard: "4111111111111111", ssn: "123-45-6789" }) // Output: // { // "user": "john", // "password": "[REDACTED]", // "creditCard": "[REDACTED]", // "ssn": "[REDACTED]" // } ``` ### Nested Path Redaction ```typescript const log = new LogLayer({ transport: new ConsoleTransport({ logger: console, }), plugins: [ redactionPlugin({ paths: ["user.password", "payment.*.number"], }), ], }) log.metadataOnly({ user: { name: "john", password: "secret123" }, payment: { credit: { number: "4111111111111111", expiry: "12/24" }, debit: { number: "4222222222222222", expiry: "01/25" } } }) // Output: // { // "user": { // "name": "john", // "password": "[REDACTED]" // }, // "payment": { // "credit": { // "number": "[REDACTED]", // "expiry": "12/24" // }, // "debit": { // "number": "[REDACTED]", // "expiry": "01/25" // } // } // } ``` ### Custom Censor Value ```typescript const log = new LogLayer({ transport: new ConsoleTransport({ logger: console, }), plugins: [ redactionPlugin({ paths: ["password"], censor: "***", }), ], }) log.metadataOnly({ user: "john", password: 
"secret123" }) // Output: // { // "user": "john", // "password": "***" // } ``` ### Remove Instead of Redact ```typescript const log = new LogLayer({ transport: new ConsoleTransport({ logger: console, }), plugins: [ redactionPlugin({ paths: ["password"], remove: true, }), ], }) log.metadataOnly({ user: "john", password: "secret123" }) // Output: // { // "user": "john" // } ``` ### Custom Censor Function ```typescript const log = new LogLayer({ transport: new ConsoleTransport({ logger: console, }), plugins: [ redactionPlugin({ paths: ["creditCard"], censor: (value) => { if (typeof value === 'string') { return value.slice(-4).padStart(value.length, '*') } return '[REDACTED]' }, }), ], }) log.metadataOnly({ user: "john", creditCard: "4111111111111111" }) // Output: // { // "user": "john", // "creditCard": "************1111" // } ``` ## Changelog View the changelog [here](./changelogs/redaction-changelog.md). --- --- url: 'https://loglayer.dev/transports/roarr.md' description: Send logs to Roarr with the LogLayer logging library --- # Roarr Transport [![NPM Version](https://img.shields.io/npm/v/%40loglayer%2Ftransport-roarr)](https://www.npmjs.com/package/@loglayer/transport-roarr) [Roarr](https://github.com/gajus/roarr) is a JSON logger for Node.js and browser environments. 
[Transport Source](https://github.com/loglayer/loglayer/tree/master/packages/transports/roarr) ## Installation Install the required packages: ::: code-group ```sh [npm] npm i loglayer @loglayer/transport-roarr roarr serialize-error ``` ```sh [pnpm] pnpm add loglayer @loglayer/transport-roarr roarr serialize-error ``` ```sh [yarn] yarn add loglayer @loglayer/transport-roarr roarr serialize-error ``` ::: ## Setup Roarr requires environment configuration to enable logging: ### Node.js ```bash ROARR_LOG=true node your-app.js ``` ### Browser ```typescript window.ROARR = { enabled: true } ``` ### Implementation ```typescript import { Roarr as r } from 'roarr' import { LogLayer } from 'loglayer' import { RoarrTransport } from "@loglayer/transport-roarr" import { serializeError } from 'serialize-error' const log = new LogLayer({ transport: new RoarrTransport({ logger: r }), errorSerializer: serializeError // Roarr requires error serialization }) ``` ## Changelog View the changelog [here](./changelogs/roarr-changelog.md). --- --- url: 'https://loglayer.dev/transports/signale.md' description: Send logs to Signale with the LogLayer logging library --- # Signale Transport [![NPM Version](https://img.shields.io/npm/v/%40loglayer%2Ftransport-signale)](https://www.npmjs.com/package/@loglayer/transport-signale) [Signale](https://github.com/klaussinani/signale) is a highly configurable logging utility designed for CLI applications. [Transport Source](https://github.com/loglayer/loglayer/tree/master/packages/transports/signale) ## Important Notes * Signale only works in Node.js environments (not in browsers) * It is primarily designed for CLI applications * LogLayer only integrates with standard log levels (not CLI-specific levels like `success`, `await`, etc.) 
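The routing of LogLayer levels onto Signale's standard methods can be pictured as a simple lookup, since Signale has no direct `trace` or `fatal` equivalents (this mirrors the Log Level Mapping table below; it is an illustrative sketch, not the transport's actual code):

```typescript
// Illustrative: how LogLayer levels route to Signale's standard methods.
const levelMap = {
  trace: "debug", // Signale has no trace level; routed to debug
  debug: "debug",
  info: "info",
  warn: "warn",
  error: "error",
  fatal: "error", // Signale has no fatal level; routed to error
} as const;

console.log(levelMap.trace); // "debug"
```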
## Installation Install the required packages: ::: code-group ```sh [npm] npm i loglayer @loglayer/transport-signale signale ``` ```sh [pnpm] pnpm add loglayer @loglayer/transport-signale signale ``` ```sh [yarn] yarn add loglayer @loglayer/transport-signale signale ``` ::: ## Setup ```typescript import { Signale } from 'signale' import { LogLayer } from 'loglayer' import { SignaleTransport } from "@loglayer/transport-signale" const signale = new Signale() const log = new LogLayer({ transport: new SignaleTransport({ logger: signale }) }) ``` ## Log Level Mapping | LogLayer | Signale | |----------|---------| | trace | debug | | debug | debug | | info | info | | warn | warn | | error | error | | fatal | error | ## Changelog View the changelog [here](./changelogs/signale-changelog.md). --- --- url: 'https://loglayer.dev/plugins/sprintf.md' description: Printf-style string formatting support for LogLayer --- # Sprintf Plugin [![NPM Version](https://img.shields.io/npm/v/%40loglayer%2Fplugin-sprintf)](https://www.npmjs.com/package/@loglayer/plugin-sprintf) [Plugin Source](https://github.com/loglayer/loglayer/tree/master/packages/plugins/sprintf) The sprintf plugin provides printf-style string formatting support using [sprintf-js](https://www.npmjs.com/package/sprintf-js). It allows you to format your log messages using familiar printf-style placeholders if a logging library does not support this behavior. ::: warning LogLayer does not allow passing items that are not strings, booleans, or numbers into message methods like `info`, `error`, etc. 
**It is recommended to only use string / boolean / number specifiers in your format strings.** ::: ## Installation ::: code-group ```bash [npm] npm install @loglayer/plugin-sprintf ``` ```bash [yarn] yarn add @loglayer/plugin-sprintf ``` ```bash [pnpm] pnpm add @loglayer/plugin-sprintf ``` ::: ## Usage ```typescript import { LogLayer, ConsoleTransport } from 'loglayer' import { sprintfPlugin } from '@loglayer/plugin-sprintf' const log = new LogLayer({ transport: new ConsoleTransport({ logger: console }), plugins: [ sprintfPlugin() ] }) // Example usage log.info("Hello %s!", "world") // Output: Hello world! log.info("Number: %d", 42) // Output: Number: 42 ``` ## Changelog View the changelog [here](./changelogs/sprintf-changelog.md). --- --- url: 'https://loglayer.dev/transports/sumo-logic.md' description: Send logs to Sumo Logic with the LogLayer logging library --- # Sumo Logic Transport [![NPM Version](https://img.shields.io/npm/v/%40loglayer%2Ftransport-sumo-logic)](https://www.npmjs.com/package/@loglayer/transport-sumo-logic) [Transport Source](https://github.com/loglayer/loglayer/tree/master/packages/transports/sumo-logic) The Sumo Logic transport sends logs to [Sumo Logic](https://www.sumologic.com/) via their [HTTP Source](https://help.sumologic.com/docs/send-data/hosted-collectors/http-source/logs-metrics/upload-logs/). ## Installation Using npm: ```bash npm install @loglayer/transport-sumo-logic serialize-error ``` Using yarn: ```bash yarn add @loglayer/transport-sumo-logic serialize-error ``` Using pnpm: ```bash pnpm add @loglayer/transport-sumo-logic serialize-error ``` ## Usage First, you'll need to [create an HTTP Source in Sumo Logic](https://help.sumologic.com/docs/send-data/hosted-collectors/http-source/logs-metrics/#configure-an-httplogs-and-metrics-source). 
Once you have the URL, you can configure the transport:

```typescript
import { LogLayer } from "loglayer";
import { SumoLogicTransport } from "@loglayer/transport-sumo-logic";
import { serializeError } from "serialize-error";

const transport = new SumoLogicTransport({
  url: "YOUR_SUMO_LOGIC_HTTP_SOURCE_URL",
});

const logger = new LogLayer({
  errorSerializer: serializeError, // Important for proper error serialization
  transport
});

logger.info("Hello from LogLayer!");
```

## Configuration Options

### Required Options

| Option | Type | Description |
|--------|------|-------------|
| `url` | `string` | The URL of your HTTP Source endpoint |

### Optional Options

| Option | Type | Default | Description |
|--------|------|---------|-------------|
| `useCompression` | `boolean` | `true` | Whether to use gzip compression |
| `sourceCategory` | `string` | - | Source category to assign to the logs |
| `sourceName` | `string` | - | Source name to assign to the logs |
| `sourceHost` | `string` | - | Source host to assign to the logs |
| `fields` | `Record<string, string>` | `{}` | Fields to be added as the X-Sumo-Fields header |
| `headers` | `Record<string, string>` | `{}` | Custom headers to be added to the request |
| `messageField` | `string` | `"message"` | Field name to use for the log message |
| `onError` | `(error: Error \| string) => void` | - | Callback for error handling |
| `level` | `"trace" \| "debug" \| "info" \| "warn" \| "error" \| "fatal"` | `"trace"` | Minimum log level to process. Logs below this level will be filtered out |

### Retry Configuration

| Option | Type | Default | Description |
|--------|------|---------|-------------|
| `retryConfig.maxRetries` | `number` | `3` | Maximum number of retry attempts |
| `retryConfig.initialRetryMs` | `number` | `1000` | Initial retry delay in milliseconds |

## Examples

### With Source Information

```typescript
const transport = new SumoLogicTransport({
  url: "YOUR_SUMO_LOGIC_HTTP_SOURCE_URL",
  sourceCategory: "backend",
  sourceName: "api-server",
  sourceHost: "prod-api-1"
});
```

### With Custom Fields

```typescript
const transport = new SumoLogicTransport({
  url: "YOUR_SUMO_LOGIC_HTTP_SOURCE_URL",
  fields: {
    environment: "production",
    team: "platform",
    region: "us-west-2"
  }
});
```

### With Custom Message Field

```typescript
const transport = new SumoLogicTransport({
  url: "YOUR_SUMO_LOGIC_HTTP_SOURCE_URL",
  messageField: "log_message" // Messages will be sent as { log_message: "..." }
});
```

### With Retry Configuration

```typescript
const transport = new SumoLogicTransport({
  url: "YOUR_SUMO_LOGIC_HTTP_SOURCE_URL",
  retryConfig: {
    maxRetries: 5,
    initialRetryMs: 500
  }
});
```

### With Custom Headers

```typescript
const transport = new SumoLogicTransport({
  url: "YOUR_SUMO_LOGIC_HTTP_SOURCE_URL",
  headers: {
    "X-Custom-Header": "value"
  }
});
```

## Log Format

The transport sends logs to Sumo Logic in the following format:

```typescript
{
  message?: string;  // Present only if there are string messages
  severity: string;  // The log level (e.g., "INFO", "ERROR")
  timestamp: string; // ISO 8601 timestamp
  ...metadata        // Any additional metadata passed to the logger
}
```

### Custom Fields

Custom fields specified in the `fields` option are sent as an `X-Sumo-Fields` header in the format:

```
X-Sumo-Fields: key1=value1,key2=value2
```

This allows for better indexing and searching in Sumo Logic.

## Size Limits

The transport enforces Sumo Logic's 1MB payload size limit. If a payload exceeds this limit: 1.
The transport will not send the log 2. The `onError` callback will be called with an error message 3. The error will include the actual size that exceeded the limit This applies to both raw and compressed payloads. ## Changelog View the changelog [here](./changelogs/sumo-logic-changelog.md). --- --- url: 'https://loglayer.dev/plugins/testing-plugins.md' description: Learn how to write tests for your LogLayer plugins --- # Testing Plugins LogLayer provides a `TestTransport` and `TestLoggingLibrary` that make it easy to test your plugins. Here's an example of how to test a plugin that adds a timestamp to metadata: ```typescript import { LogLayer, TestLoggingLibrary, TestTransport } from "loglayer"; import { describe, expect, it } from "vitest"; describe("timestamp plugin", () => { it("should add timestamp to metadata", () => { // Create a test logger to capture output const logger = new TestLoggingLibrary(); // Create the timestamp plugin const timestampPlugin = { id: "timestamp", onMetadataCalled: (metadata) => ({ ...metadata, timestamp: "2024-01-01T00:00:00.000Z" }) }; // Create LogLayer instance with the plugin const log = new LogLayer({ transport: new TestTransport({ logger, }), plugins: [timestampPlugin], }); // Test the plugin by adding some metadata log.metadataOnly({ message: "test message" }); // Get the logged line and verify the timestamp was added const line = logger.popLine(); expect(line.data[0].timestamp).toBe("2024-01-01T00:00:00.000Z"); expect(line.data[0].message).toBe("test message"); }); it("should handle empty metadata", () => { const logger = new TestLoggingLibrary(); const timestampPlugin = { id: "timestamp", onMetadataCalled: (metadata) => ({ ...metadata, timestamp: "2024-01-01T00:00:00.000Z" }) }; const log = new LogLayer({ transport: new TestTransport({ logger, }), plugins: [timestampPlugin], }); log.metadataOnly({}); const line = logger.popLine(); expect(line.data[0].timestamp).toBe("2024-01-01T00:00:00.000Z"); }); }); ``` ## 
TestLoggingLibrary API The `TestLoggingLibrary` provides several methods and properties to help you test your plugins: ### Properties * `lines`: An array containing all logged lines. Each line has a `level` (`LogLevel`) and `data` (an array of the parameters passed to the log method). ### Methods * `getLastLine()`: Returns the most recent log line without removing it. Returns `null` if no lines exist. * `popLine()`: Returns and removes the most recent log line. Returns `null` if no lines exist. * `clearLines()`: Removes all logged lines, resetting the library to its initial state. Each logged line has the following structure: ```typescript { level: LogLevel; // The log level (info, warn, error, etc.) data: any[]; // Array of parameters passed to the log method } ``` --- --- url: 'https://loglayer.dev/transports/testing-transports.md' description: Learn how to test transports for LogLayer --- # Testing Transports ## Unit testing There is no prescribed way to unit test transport implementations, because a transport's implementation is highly dependent on the target logging library. At a minimum, you will want to test that calls to LogLayer reach the intended method of the destination logger. * Some loggers allow you to specify an output stream, which you can usually use to end-to-end test the logger output. Many loggers support this; see the unit tests for the LogLayer transports for examples. * If a logger doesn't support an output stream, replace it with a mock and test that the mock is called with the correct parameters. ## Live testing ### With `testTransportOutput` Live testing verifies that the transport actually works with the target logger. The `@loglayer/transport` library exports a `testTransportOutput(label: string, loglayer: LogLayer)` function for this purpose.
It calls the commonly used methods on the `loglayer` instance and outputs what the result is to the console. Lots of transports have a `src/__tests__/livetest.ts` file that you can look at to see how to use it. Here is an example of how to use it from the bunyan transport: ```typescript // livetest.ts import { testTransportOutput } from "@loglayer/transport"; import bunyan from "bunyan"; import { LogLayer } from "loglayer"; import { BunyanTransport } from "../BunyanTransport.js"; const b = bunyan.createLogger({ name: "my-logger", level: "trace", // Show all log levels serializers: { err: bunyan.stdSerializers.err, // Use Bunyan's error serializer }, }); const log = new LogLayer({ errorFieldName: "err", // Match Bunyan's error field name transport: new BunyanTransport({ logger: b, }), }); testTransportOutput("Bunyan logger", log); ``` Then you can use `pnpm run livetest` / `npx tsx livetest.ts` to run the test. ### For cloud providers For cloud provider-sent logs, you'll have to use the cloud provider's log console to verify that the logs are being sent correctly. This was done for the DataDog transports, where the logs were sent to DataDog and verified in the DataDog console. --- --- url: 'https://loglayer.dev/transports/tracer.md' description: Send logs to Tracer with the LogLayer logging library --- # Tracer Transport [![NPM Version](https://img.shields.io/npm/v/%40loglayer%2Ftransport-tracer)](https://www.npmjs.com/package/@loglayer/transport-tracer) [Tracer](https://www.npmjs.com/package/tracer) is a powerful and customizable logging library for Node.js. 
[Transport Source](https://github.com/loglayer/loglayer/tree/master/packages/transports/tracer) ## Installation Install the required packages: ::: code-group ```sh [npm] npm i loglayer @loglayer/transport-tracer tracer ``` ```sh [pnpm] pnpm add loglayer @loglayer/transport-tracer tracer ``` ```sh [yarn] yarn add loglayer @loglayer/transport-tracer tracer ``` ::: ## Setup ```typescript import { LogLayer } from 'loglayer' import { TracerTransport } from '@loglayer/transport-tracer' import tracer from 'tracer' // Create a tracer logger instance const logger = tracer.console() const log = new LogLayer({ transport: new TracerTransport({ id: 'tracer', logger }) }) ``` ## Log Level Mapping | LogLayer | Tracer | |----------|---------| | trace | trace | | debug | debug | | info | info | | warn | warn | | error | error | | fatal | error | ## Changelog View the changelog [here](./changelogs/tracer-changelog.md). --- --- url: 'https://loglayer.dev/logging-api/transport-management.md' description: How to manage transports in LogLayer --- # Transport Management ## Replacing Transports `withFreshTransports(transports: LogLayerTransport | Array<LogLayerTransport>): ILogLayer` Replaces the existing transports with the provided transport(s). This can be useful for dynamically changing transports at runtime. This only replaces the transports for the current logger instance, so if you are replacing transports for a child logger, it will not affect the parent logger and vice versa. ```typescript // Replace with a single transport logger.withFreshTransports(new PinoTransport({ logger: pino() })) // Replace with multiple transports logger.withFreshTransports([ new ConsoleTransport({ logger: console }), new PinoTransport({ logger: pino() }) ]) ``` ::: warning Potential Performance Impact Replacing transports at runtime may have a performance impact if you are frequently creating new transports. It is recommended to re-use the same transport instance(s) where possible.
::: ## Obtaining the underlying logger instance You can get the underlying logger for a transport if you've assigned an ID to it: ```typescript const log = new LogLayer({ transport: new ConsoleTransport({ logger: console, id: 'console' }) }) const consoleLogger = log.getLoggerInstance('console') ``` ```typescript import { type P, pino } from "pino"; import { PinoTransport } from "@loglayer/transport-pino"; const log = new LogLayer({ transport: new PinoTransport({ logger: pino(), id: 'pino' }) }) const pinoLogger = log.getLoggerInstance('pino') ``` ::: info Not all transports have a logger instance attached to them. In those cases, `getLoggerInstance()` returns `null`. A transport can return such an instance if its constructor accepts a `logger` parameter. ::: --- --- url: 'https://loglayer.dev/transports/tslog.md' description: Send logs to TsLog with the LogLayer logging library --- # tslog Transport [![NPM Version](https://img.shields.io/npm/v/%40loglayer%2Ftransport-tslog)](https://www.npmjs.com/package/@loglayer/transport-tslog) [tslog](https://tslog.js.org/) is a powerful TypeScript logging library that provides beautiful logging with full TypeScript support.
[Transport Source](https://github.com/loglayer/loglayer/tree/master/packages/transports/tslog) ## Installation Install the required packages: ::: code-group ```sh [npm] npm i loglayer @loglayer/transport-tslog tslog ``` ```sh [pnpm] pnpm add loglayer @loglayer/transport-tslog tslog ``` ```sh [yarn] yarn add loglayer @loglayer/transport-tslog tslog ``` ::: ## Setup ```typescript import { Logger } from "tslog" import { LogLayer } from 'loglayer' import { TsLogTransport } from "@loglayer/transport-tslog" const tslog = new Logger() const log = new LogLayer({ transport: new TsLogTransport({ logger: tslog }) }) ``` ## Log Level Mapping | LogLayer | tslog | |----------|--------| | trace | trace | | debug | debug | | info | info | | warn | warn | | error | error | | fatal | fatal | ## Changelog View the changelog [here](./changelogs/tslog-changelog.md). --- --- url: 'https://loglayer.dev/example-integrations/async-context.md' description: Learn how to implement LogLayer across async contexts --- # Asynchronous context tracking with LogLayer This document will explain how to use LogLayer across async contexts using [`AsyncLocalStorage`](https://nodejs.org/api/async_context.html#class-asynclocalstorage). ## Why use `AsyncLocalStorage`? Trevor Lasn in his article, [AsyncLocalStorage: Simplify Context Management in Node.js ](https://www.trevorlasn.com/blog/node-async-local-storage), says it best: *AsyncLocalStorage gives you a way to maintain context across your async operations without manually passing data through every function. 
Think of it like having a secret storage box that follows your request around, carrying important information that any part of your code can access.* ### Addresses context tracking hell Just as promises addressed [callback hell](https://medium.com/@raihan_tazdid/callback-hell-in-javascript-all-you-need-to-know-296f7f5d3c1), `AsyncLocalStorage` addresses the problem of context tracking hell: ```typescript // An example of context management hell using express async function myExternalFunction(log: ILogLayer) { log.info('Doing something') // need to pass that logger down await someNestedFunction(log) } async function someNestedFunction(log: ILogLayer) { log.info('Doing something else') } // Define logging middleware app.use((req, res, next) => { // Create a new LogLayer instance for each request req.log = new LogLayer({ transport: new ConsoleTransport({ logger: console }) }) next() }) // Use the logger in your routes app.get('/', async (req, res) => { req.log.info('Processing request to root endpoint') // You have to pass in the logger here await myExternalFunction(req.log) res.send('Hello World!') }) ``` * In the above example, we have to pass the `log` (or a context) object to every function that needs it, which leads to a lot of boilerplate code. * Using `AsyncLocalStorage`, we can avoid this and make our code cleaner. ## Do not use async hooks You may have heard of async hooks as a way to address this problem, but they have been superseded by async local storage. The documentation for [async hooks](https://nodejs.org/api/async_hooks.html) has listed them as experimental for years, citing "usability issues, safety risks, and performance implications", and recommends using `AsyncLocalStorage` instead. ## Integration with `AsyncLocalStorage` The following example has been tested to work. It uses the [`express`](./express) framework, but you can use the `async-local-storage.ts` and `logger.ts` code for any other framework.
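Before wiring this into a framework, the core `AsyncLocalStorage` behavior can be sketched in isolation. This is a minimal, framework-free sketch using only the Node.js API; the `RequestContext`/`requestId` store shape is illustrative, while the integration below stores `{ logger: ILogLayer }` instead:

```typescript
import { AsyncLocalStorage } from "node:async_hooks";

// Illustrative store shape; the integration below stores a logger instead
interface RequestContext {
  requestId: string;
}

const als = new AsyncLocalStorage<RequestContext>();

// Deeply nested code can read the context without it being passed as a parameter
function deeplyNested(): string {
  return als.getStore()?.requestId ?? "no-context";
}

// run() makes the store visible to everything called (directly or indirectly)
// inside the callback, and returns the callback's result
const inside = als.run({ requestId: "req-1" }, () => deeplyNested());

// Outside of run(), no store is set
const outside = deeplyNested();

console.log(inside); // "req-1"
console.log(outside); // "no-context"
```

The same mechanism carries the store across `await` boundaries, which is what allows a request-scoped logger to be retrieved from anywhere in the call chain without threading it through function arguments.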
### Create a file for the `AsyncLocalStorage` instance ```typescript // async-local-storage.ts import { AsyncLocalStorage } from "node:async_hooks"; import type { ILogLayer } from "loglayer"; export const asyncLocalStorage = new AsyncLocalStorage<{ logger: ILogLayer }>(); ``` ### Create a file to get the logger instance from the storage ```typescript // logger.ts import { asyncLocalStorage } from "./async-local-storage"; import { ConsoleTransport, LogLayer } from "loglayer"; import type { ILogLayer } from "loglayer"; export function createLogger() { return new LogLayer({ transport: new ConsoleTransport({ logger: console, }), }) } // Create a default logger for non-request contexts const defaultLogger = createLogger(); export function getLogger(): ILogLayer { const store = asyncLocalStorage.getStore(); if (!store) { // Use non-request specific logger // Remove these console logs once you're convinced it works console.log("using non-async local storage logger"); return defaultLogger; } console.log("Using async local storage logger"); return store.logger; } ``` ### Register the logger per-request to the storage ```typescript // app.ts import express from 'express'; import { asyncLocalStorage } from "./async-local-storage"; import { getLogger, createLogger } from "./logger"; import type { ILogLayer } from "loglayer"; // Extend Express Request type to include log property declare global { namespace Express { interface Request { log: ILogLayer; } } } // Initialize Express app const app = express(); // no need to pass in the logger now that we can use async local storage async function myExternalFunction() { // Will use the request-specific logger if being called // from the context of a request getLogger().info('Doing something') await someNestedFunction() } async function someNestedFunction() { getLogger().info('Doing something else') } // Define logging middleware app.use((req, res, next) => { const logger = createLogger() req.log = logger; // Stores the 
request-specific logger into storage asyncLocalStorage.run({ logger }, next); }) // Use the logger in your routes app.get('/', async (req, res) => { // You can also use getLogger() instead req.log.info('Processing request to root endpoint') await myExternalFunction() res.send('Hello World!') }) // Start the server app.listen(3000, () => { console.log('Server is running on port 3000'); }); ``` ### Output ``` Processing request to root endpoint Using async local storage logger Doing something Using async local storage logger Doing something else ``` --- --- url: 'https://loglayer.dev/example-integrations/express.md' description: Learn how to implement LogLayer with Express --- # LogLayer with Express LogLayer can be easily integrated with Express as middleware to provide request-scoped logging via `req.log`. This guide will show you how to set it up. ## Installation First, install the required packages. You can use any transport you prefer - we'll use Pino in this example: ::: code-group ```sh [npm] npm i loglayer @loglayer/transport-pino pino express serialize-error ``` ```sh [pnpm] pnpm add loglayer @loglayer/transport-pino pino express serialize-error ``` ```sh [yarn] yarn add loglayer @loglayer/transport-pino pino express serialize-error ``` ::: ## Example ```typescript import express from 'express' import pino from 'pino' import { ILogLayer, LogLayer } from 'loglayer' import { PinoTransport } from '@loglayer/transport-pino' import { serializeError } from 'serialize-error'; // Create a Pino instance (only needs to be done once) const pinoLogger = pino({ level: 'trace' // Set to desired log level }) const app = express() // Add types for the req.log property declare global { namespace Express { interface Request { log: ILogLayer } } } // Define logging middleware app.use((req, res, next) => { // Create a new LogLayer instance for each request req.log = new LogLayer({ transport: new PinoTransport({ logger: pinoLogger }), errorSerializer: serializeError, }).withContext({ reqId: 
crypto.randomUUID(), // Add unique request ID method: req.method, path: req.path, ip: req.ip }) next() }) // Use the logger in your routes app.get('/', (req, res) => { req.log.info('Processing request to root endpoint') // Add additional context for specific logs req.log .withContext({ query: req.query }) .info('Request includes query parameters') res.send('Hello World!') }) // Error handling middleware app.use((err: Error, req: express.Request, res: express.Response, next: express.NextFunction) => { req.log.withError(err).error('An error occurred while processing the request') res.status(500).send('Internal Server Error') }) app.listen(3000, () => { console.log('Server started on port 3000') }) ``` ## Using Async Local Storage You will most likely want to use async local storage to avoid passing the logger around in your code. See an example of how to do this [here](./async-context). --- --- url: 'https://loglayer.dev/example-integrations/fastify.md' description: Learn how to implement LogLayer with Fastify --- # LogLayer with Fastify ## Installation First, install the required packages. 
Pino is the default logger for Fastify, so we'll use it in this example: ::: code-group ```sh [npm] npm i loglayer @loglayer/transport-pino pino fastify serialize-error ``` ```sh [pnpm] pnpm add loglayer @loglayer/transport-pino pino fastify serialize-error ``` ```sh [yarn] yarn add loglayer @loglayer/transport-pino pino fastify serialize-error ``` ::: ## Example ```typescript import Fastify from 'fastify' import { type P, pino } from "pino"; import { ILogLayer, LogLayer } from 'loglayer' import { PinoTransport } from '@loglayer/transport-pino' import { serializeError } from 'serialize-error'; declare module 'fastify' { interface FastifyBaseLogger extends ILogLayer {} } const p: P.Logger = pino(); const logger = new LogLayer({ transport: new PinoTransport({ logger: p, }), errorSerializer: serializeError, }); const fastify = Fastify({ // @ts-ignore LogLayer doesn't implement some of Fastify's logger interface // but we've found this hasn't been an issue in production usage loggerInstance: logger, // Fastify's built-in request logging is extremely verbose, so it is // disabled here; re-enable it only when debugging disableRequestLogging: true, }) // Add request path to logs fastify.addHook('onRequest', async (request, reply) => { // @ts-ignore LogLayer doesn't implement some of Fastify's logger interface request.log = request.log.withContext({ path: request.url }); }); // Declare a route fastify.get('/', function (request, reply) { request.log.info('hello world') reply.send({ hello: 'world' }) }) // Run the server! fastify.listen({ port: 3000 }, function (err, address) { if (err) { fastify.log.withError(err).error("error starting server") process.exit(1) } // Server is now listening on ${address} }) ``` ## Example repo You can find a complete example of using LogLayer with Fastify in the [fastify-starter-turbo-monorepo](https://github.com/theogravity/fastify-starter-turbo-monorepo) example. It uses [`AsyncLocalStorage`](./async-context) to store the logger for use in request contexts.
--- --- url: 'https://loglayer.dev/logging-api/unit-testing.md' description: Learn how to silence logging during unit tests using MockLogLayer --- # Working with LogLayer in Testing ## No-op Mock LogLayer for Unit Testing LogLayer provides a `MockLogLayer` class that implements the same `ILogLayer` interface as `LogLayer`, but with every method as a no-op (it does nothing). This makes it easy to silence logging when testing services that log. The following example tests a `UserService` that logs through an injected `ILogLayer`. ```typescript import { describe, it, expect } from 'vitest' import { MockLogLayer, ILogLayer } from 'loglayer' // Example service that uses logging class UserService { private logger: ILogLayer constructor(logger: ILogLayer) { this.logger = logger } async createUser(username: string, email: string) { try { // Simulate user creation this.logger.withMetadata({ username, email }).info('Creating new user') if (!email.includes('@')) { const error = new Error('Invalid email format') this.logger.withError(error).error('Failed to create user') throw error } // Simulate successful creation this.logger.withContext({ userId: '123' }).info('User created successfully') return { id: '123', username, email } } catch (error) { this.logger.errorOnly(error) throw error } } } describe('UserService', () => { it('should create a user successfully', async () => { // Create a mock logger const mockLogger = new MockLogLayer() const userService = new UserService(mockLogger) const result = await userService.createUser('testuser', 'test@example.com') expect(result).toEqual({ id: '123', username: 'testuser', email: 'test@example.com' }) }) it('should throw error for invalid email', async () => { const mockLogger = new MockLogLayer() const userService = new UserService(mockLogger) await expect( userService.createUser('testuser', 'invalid-email') ).rejects.toThrow('Invalid email format') }) // Example showing that the mock logger implements all methods but doesn't actually log 
it('should handle all logging methods without throwing errors', () => { const mockLogger = new MockLogLayer() // All these calls should work without throwing errors mockLogger.info('test message') mockLogger.error('error message') mockLogger.warn('warning message') mockLogger.debug('debug message') mockLogger.trace('trace message') mockLogger.fatal('fatal message') // Method chaining should work mockLogger .withContext({ userId: '123' }) .withMetadata({ action: 'test' }) .info('test with context and metadata') // Error logging should work mockLogger.withError(new Error('test error')).error('error occurred') mockLogger.errorOnly(new Error('standalone error')) // All these calls should complete without throwing errors expect(true).toBe(true) }) }) ``` ## Writing Tests Against LogLayer Directly When a new instance of `MockLogLayer` is created, it also internally creates a new instance of a [`MockLogBuilder`](https://github.com/loglayer/loglayer/blob/master/packages/core/loglayer/src/MockLogBuilder.ts), which is used when chaining methods like `withMetadata`, `withError`, etc. `MockLogLayer` has three methods to help with directly testing the logger itself: * `getMockLogBuilder(): ILogBuilder`: Returns the underlying `MockLogBuilder` instance. * `resetMockLogBuilder()`: Tells `MockLogLayer` to create a new internal instance of the `MockLogBuilder`. * `setMockLogBuilder(builder: ILogBuilder)`: Sets the mock log builder instance to be used if you do not want to use the internal instance. The following example shows how you can use these methods to write tests against the logger directly. 
```typescript import { describe, expect, it, vi } from "vitest"; import { MockLogLayer, MockLogBuilder } from "loglayer"; describe("MockLogLayer tests", () => { it("should be able to mock a log message method", () => { const logger = new MockLogLayer(); logger.info = vi.fn(); logger.info("testing"); expect(logger.info).toBeCalledWith("testing"); }); it("should be able to spy on a log message method", () => { const logger = new MockLogLayer(); const infoSpy = vi.spyOn(logger, "info"); logger.info("testing"); expect(infoSpy).toBeCalledWith("testing"); }); it("should be able to spy on a chained log message method", () => { const logger = new MockLogLayer(); // Get the mock builder instance const builder = logger.getMockLogBuilder(); const infoSpy = vi.spyOn(builder, "info"); logger.withMetadata({ test: "test" }).info("testing"); expect(infoSpy).toBeCalledWith("testing"); }); it("should be able to mock a log message method when using withMetadata", () => { const logger = new MockLogLayer(); const builder = logger.getMockLogBuilder(); // to be able to chain withMetadata with info, we need to // make sure the withMetadata method returns the builder builder.withMetadata = vi.fn().mockReturnValue(builder); builder.info = vi.fn(); logger.withMetadata({ test: "test" }).info("testing"); expect(builder.withMetadata).toBeCalledWith({ test: "test" }); expect(builder.info).toBeCalledWith("testing"); }); it("should be able to spy on a log message method when using withMetadata", () => { const logger = new MockLogLayer(); const builder = logger.getMockLogBuilder(); // to be able to chain withMetadata with info, we need to // make sure the withMetadata method returns the builder const metadataSpy = vi.spyOn(builder, "withMetadata"); const infoSpy = vi.spyOn(builder, "info"); logger.withMetadata({ test: "test" }).info("testing"); expect(metadataSpy).toBeCalledWith({ test: "test" }); expect(infoSpy).toBeCalledWith("testing"); }); it('should be able to spy on a multi-chained log 
message method', () => { const logger = new MockLogLayer(); const builder = logger.getMockLogBuilder(); const error = new Error('test error'); const metadataSpy = vi.spyOn(builder, 'withMetadata'); const errorSpy = vi.spyOn(builder, 'withError'); const infoSpy = vi.spyOn(builder, 'info'); logger .withMetadata({ test: 'test' }) .withError(error) .info('testing'); expect(metadataSpy).toBeCalledWith({ test: 'test' }); expect(errorSpy).toBeCalledWith(error); expect(infoSpy).toBeCalledWith('testing'); }); it("should use a custom MockLogBuilder", () => { const builder = new MockLogBuilder(); const logger = new MockLogLayer(); // Get the mock builder instance logger.setMockLogBuilder(builder); builder.withMetadata = vi.fn().mockReturnValue(builder); builder.info = vi.fn(); logger.withMetadata({ test: "test" }).info("testing"); expect(builder.withMetadata).toBeCalledWith({ test: "test" }); expect(builder.info).toBeCalledWith("testing"); }); it("should be able to mock errorOnly", () => { const error = new Error("testing"); const logger = new MockLogLayer(); logger.errorOnly = vi.fn(); logger.errorOnly(error); expect(logger.errorOnly).toBeCalledWith(error); }); }); ``` ## References * [MockLogLayer](https://github.com/loglayer/loglayer/blob/master/packages/core/loglayer/src/MockLogLayer.ts) * [MockLogBuilder](https://github.com/loglayer/loglayer/blob/master/packages/core/loglayer/src/MockLogBuilder.ts) --- --- url: 'https://loglayer.dev/transports/multiple-transports.md' description: Learn how to use multiple transports with LogLayer --- # Multiple Transports You can use multiple logging libraries simultaneously: ```typescript import { LogLayer } from 'loglayer' import { PinoTransport } from "@loglayer/transport-pino" import { WinstonTransport } from "@loglayer/transport-winston" const log = new LogLayer({ transport: [ new PinoTransport({ logger: pinoLogger }), new WinstonTransport({ logger: winstonLogger }) ] }) ``` --- --- url: 'https://loglayer.dev/transports/winston.md' 
description: Send logs to Winston with the LogLayer logging library --- # Winston Transport [![NPM Version](https://img.shields.io/npm/v/%40loglayer%2Ftransport-winston)](https://www.npmjs.com/package/@loglayer/transport-winston) [Winston](https://github.com/winstonjs/winston) is "a logger for just about everything." [Transport Source](https://github.com/loglayer/loglayer/tree/master/packages/transports/winston) ## Installation Install the required packages: ::: code-group ```sh [npm] npm i loglayer @loglayer/transport-winston winston ``` ```sh [pnpm] pnpm add loglayer @loglayer/transport-winston winston ``` ```sh [yarn] yarn add loglayer @loglayer/transport-winston winston ``` ::: ## Setup ```typescript import winston from 'winston' import { LogLayer } from 'loglayer' import { WinstonTransport } from "@loglayer/transport-winston" const w = winston.createLogger({}) const log = new LogLayer({ transport: new WinstonTransport({ logger: w }) }) ``` ## Log Level Mapping | LogLayer | Winston | |----------|---------| | trace | silly | | debug | debug | | info | info | | warn | warn | | error | error | | fatal | error | ## Changelog View the changelog [here](./changelogs/winston-changelog.md).