What is it

- Add support for Azure Log Analytics as a telemetry sink
- Align with the existing `application-insights` and future `file-sink` syntax
- Support both `shared-key` and `entra-id` authentication modes
Added properties

```json
{
  "runtime": {
    "telemetry": {
      "azure-log-analytics": { ... }
    }
  }
}
```

Properties

```jsonc
"azure-log-analytics": {
  "enabled": true,                    // bool, default: false
  "auth": {
    "custom-table-name": "string",    // string, required
    "dcr-immutable-id": "string",     // string, required if type == "entra-id"
    "dce-endpoint": "string"          // string, required if type == "entra-id"
  },
  "dab-identifier": "DabLogs",        // string, default: "DabLogs"
  "flush-interval-seconds": 5         // integer, default: 5
}
```

| Setting | Meaning | Default |
|---|---|---|
| `enabled` | Enables the sink | `false` |
| `auth.custom-table-name` | Name of the Azure Log Analytics workspace table | (required) |
| `auth.dcr-immutable-id` | DCR immutable ID for `entra-id` mode | (required) |
| `auth.dce-endpoint` | DCE endpoint for `entra-id` mode | (required) |
| `dab-identifier` | Custom name for logs sent by DAB | `DabLogs` |
| `flush-interval-seconds` | Interval between log batch pushes (in seconds) | `5` |
`5` is an important default for `flush-interval-seconds` because our containers are often ephemeral.
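To illustrate why the flush interval matters, here is a minimal sketch of a timed batch flush (hypothetical names, not DAB's internal implementation): records accumulate in a buffer and are pushed as one batch when the interval elapses, so a short interval limits how many logs an ephemeral container can lose on shutdown.

```python
import threading

class LogBatcher:
    """Buffers log records and flushes them as a batch on a fixed interval.

    Hypothetical sketch; DAB's real sink is internal to the engine.
    """

    def __init__(self, send, flush_interval_seconds=5):
        self._send = send               # callable that receives a list of records
        self._interval = flush_interval_seconds
        self._buffer = []
        self._lock = threading.Lock()
        self._timer = None

    def add(self, record):
        with self._lock:
            self._buffer.append(record)
            if self._timer is None:
                # schedule one flush per interval, not one per record
                self._timer = threading.Timer(self._interval, self.flush)
                self._timer.daemon = True
                self._timer.start()

    def flush(self):
        with self._lock:
            batch, self._buffer = self._buffer, []
            self._timer = None
        if batch:
            self._send(batch)
```

A sink shutting down would call `flush()` once more to drain anything buffered since the last tick.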
JSON Schema

Update the DAB JSON schema to support the new azure-log-analytics sink with conditional requirements based on the selected auth type.

Schema structure:

```json
{
  "azure-log-analytics": {
    "type": "object",
    "properties": {
      "enabled": {
        "type": "boolean",
        "default": false
      },
      "auth": {
        "type": "object",
        "properties": {
          "custom-table-name": {
            "type": "string"
          },
          "dcr-immutable-id": {
            "type": "string"
          },
          "dce-endpoint": {
            "type": "string"
          }
        }
      },
      "dab-identifier": {
        "type": "string",
        "default": "DabLogs"
      },
      "flush-interval-seconds": {
        "type": "integer",
        "default": 5
      }
    },
    "required": ["enabled", "auth"]
  }
}
```

Notes:

- Enforce the `dab-identifier` pattern using `"pattern": "^[A-Za-z][A-Za-z0-9_]{0,99}$"`
- Default values apply if fields are omitted.
- Add the schema definitions under `/runtime/telemetry/azure-log-analytics` in the main schema structure.
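The conditional requirements can also be mirrored in startup validation. A minimal sketch (hypothetical helper names, assuming the `auth` object carries a `type` of `shared-key` or `entra-id` as implied above):

```python
import re

# pattern from the schema notes above
IDENTIFIER_RE = re.compile(r"^[A-Za-z][A-Za-z0-9_]{0,99}$")

def validate_sink_config(cfg):
    """Validate an azure-log-analytics sink block; returns a list of error strings.

    Sketch only: mirrors the conditional rules in the schema, including the
    entra-id-only fields and the dab-identifier pattern.
    """
    errors = []
    if not cfg.get("enabled", False):
        return errors  # a disabled sink needs no further checks
    auth = cfg.get("auth", {})
    if not auth.get("custom-table-name"):
        errors.append("auth.custom-table-name is required")
    if auth.get("type") == "entra-id":
        for field in ("dcr-immutable-id", "dce-endpoint"):
            if not auth.get(field):
                errors.append(f"auth.{field} is required for entra-id")
    ident = cfg.get("dab-identifier", "DabLogs")
    if not IDENTIFIER_RE.match(ident):
        errors.append("dab-identifier must match [A-Za-z][A-Za-z0-9_]{0,99}")
    return errors
```

A non-empty result would map to the fail-to-start behavior described in the Errors section.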
CLI Updates
Add support to dab configure:
```shell
dab configure --runtime.telemetry.azure-log-analytics.enabled
dab configure --runtime.telemetry.azure-log-analytics.auth.custom-table-name
dab configure --runtime.telemetry.azure-log-analytics.auth.dcr-immutable-id
dab configure --runtime.telemetry.azure-log-analytics.auth.dce-endpoint
dab configure --runtime.telemetry.azure-log-analytics.dab-identifier
dab configure --runtime.telemetry.azure-log-analytics.flush-interval-seconds
```
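For context, a configuration session might look like the following. The values are illustrative only, and this assumes `dab configure` accepts a value after each option, as with other runtime settings; confirm the exact syntax against the CLI docs.

```shell
dab configure --runtime.telemetry.azure-log-analytics.enabled true
dab configure --runtime.telemetry.azure-log-analytics.auth.custom-table-name "DabLogsTable"
dab configure --runtime.telemetry.azure-log-analytics.dab-identifier "DabLogs"
dab configure --runtime.telemetry.azure-log-analytics.flush-interval-seconds 5
```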
Considerations

- Respect `log-level` filters from the global config
- Logs will duplicate across sinks when multiple sinks are explicitly enabled
- Must handle retries and backoff for ingestion failures (should be part of the built-in library)
Errors

- If `enabled` is true but required fields are missing, fail to start and log an error
- If ingestion fails, retry up to 3 times with exponential backoff, then log and continue
- If an invalid `dab-identifier` is used (it must match `[A-Za-z][A-Za-z0-9_]{0,99}`), fail to start
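The retry policy above can be sketched as follows (an illustrative helper, not the built-in library's API): up to 3 attempts with a doubling delay, and on final failure the error is logged and processing continues rather than crashing the engine.

```python
import time

def send_with_retry(send, batch, max_attempts=3, base_delay=1.0):
    """Try to ingest a batch, retrying with exponential backoff.

    Returns True on success; on final failure, logs and returns False
    (the engine continues running).
    """
    for attempt in range(max_attempts):
        try:
            send(batch)
            return True
        except Exception as exc:
            if attempt == max_attempts - 1:
                print(f"ingestion failed after {max_attempts} attempts: {exc}")
                return False
            # 1s, 2s, 4s, ... between attempts (scaled by base_delay)
            time.sleep(base_delay * (2 ** attempt))
```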
Flow

```mermaid
sequenceDiagram
    participant Engine
    participant Config
    participant LogAnalytics
    participant TelemetrySink
    Engine->>Config: Load config
    Config-->>Engine: Config object
    Engine->>TelemetrySink: Initialize Log Analytics sink
    TelemetrySink->>LogAnalytics: Authenticate using auth.type
    TelemetrySink->>LogAnalytics: Send batched logs every 5 seconds
```
Comments for the future docs

If running in a container:

- Recommend storing credentials via `@env()` or `@akv()`
- For `entra-id`, use a system-assigned or user-assigned managed identity
- Logs are sent over HTTPS; no persistent file storage is required
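For `shared-key` mode, ingestion typically goes through the Azure Monitor HTTP Data Collector API, whose request signing can be sketched as follows. This is a minimal sketch based on that API's documented signing scheme, not DAB's implementation; function and parameter names are illustrative. `entra-id` mode would instead obtain an AAD token (e.g. via managed identity) and post to the DCE endpoint.

```python
import base64
import hashlib
import hmac

def build_shared_key_header(workspace_id, shared_key_b64, body_len, date_rfc1123):
    """Build the Authorization header for shared-key ingestion.

    The string-to-sign covers the method, content length, content type,
    x-ms-date header, and resource path, signed with HMAC-SHA256 using
    the base64-decoded workspace shared key.
    """
    string_to_sign = (
        f"POST\n{body_len}\napplication/json\n"
        f"x-ms-date:{date_rfc1123}\n/api/logs"
    )
    key = base64.b64decode(shared_key_b64)
    digest = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    signature = base64.b64encode(digest).decode("utf-8")
    return f"SharedKey {workspace_id}:{signature}"
```

Because the signature is derived entirely from the request, no state needs to persist between batches, which fits ephemeral containers.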