Agent
A software component installed on systems to collect and forward telemetry data. Can be managed (traditionally installed) or auto-managed (automatically deployed).
Agentless
Collection method that does not require installing software on target systems, typically using existing protocols like SSH or WinRM.
Aggregation
Process of combining and grouping multiple data points based on common properties.
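Aggregation can be sketched in a few lines. This illustrative helper (not part of any specific product API) groups data points by a shared property and combines their values:

```python
from collections import defaultdict

def aggregate(points, key, combine=sum):
    """Group data points by a common property and combine their values."""
    groups = defaultdict(list)
    for p in points:
        groups[p[key]].append(p["value"])
    return {k: combine(v) for k, v in groups.items()}

points = [
    {"host": "web-1", "value": 10},
    {"host": "web-1", "value": 5},
    {"host": "db-1", "value": 7},
]
totals = aggregate(points, key="host")   # {"web-1": 15, "db-1": 7}
```

Passing a different `combine` function (e.g. `max` or `statistics.mean`) yields other common aggregations over the same grouping.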
Alert
Notification of significant system events.
Archival
Long-term data storage process.
Audit
Review of logs to verify that IT policies are correctly implemented.
Audit Trail
A chronological record of all system access events and the actions performed.
Authentication
Verification of the identity of users, devices, or systems attempting to access network resources.
Access Control
A security mechanism that regulates who or what can view, use, or modify resources in a computing environment based on predefined policies and permissions.
Agent-Director Coordination
Enterprise architecture where lightweight Agents receive centralized configuration from Directors via websocket connections, enabling large-scale network management without per-endpoint configuration.
Authorization
Permission to access specific resources.
External system that receives processed telemetry data from DataStream targets, such as analytics platforms, archival systems, or compliance tools.
Content Hub
Pre-built pipeline templates and business scenarios that provide ready-made processing logic for common telemetry use cases, eliminating the need for custom configuration development.
Calibration
Measurement accuracy adjustment.
Cost Optimization
Strategic configuration of telemetry processing to minimize data processing costs by selecting only business-essential data fields and avoiding expensive operations on unnecessary data.
Collector
System component that receives and processes telemetry data.
Compression
Technique for reducing data size to save storage space and transmission bandwidth.
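For repetitive telemetry payloads, even stdlib compression gives large savings. A minimal sketch using Python's `gzip` module:

```python
import gzip

# A repetitive payload, typical of structured log data.
raw = b'{"event": "login", "user": "alice"}' * 100

compressed = gzip.compress(raw)          # shrink for storage/transmission
restored = gzip.decompress(compressed)   # lossless: original bytes return
ratio = len(compressed) / len(raw)       # well below 1.0 for repetitive data
```

Lossless codecs like gzip are the norm for logs, since the original records must be recoverable for audits.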
Connection Pooling
Maintaining a cache of database connections that can be reused when future requests to the database are required, reducing the overhead of creating new database connections for each request.
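A minimal sketch of the idea, assuming a fixed-size pool backed by a queue (the `ConnectionPool` class is illustrative, not a real library API):

```python
import sqlite3
from queue import Queue

class ConnectionPool:
    """Fixed-size pool: connections are created once and reused."""
    def __init__(self, database, size=4):
        self._pool = Queue(maxsize=size)
        for _ in range(size):
            self._pool.put(sqlite3.connect(database, check_same_thread=False))

    def acquire(self):
        return self._pool.get()    # blocks if every connection is in use

    def release(self, conn):
        self._pool.put(conn)       # return the connection for reuse

pool = ConnectionPool(":memory:", size=2)
conn = pool.acquire()
conn.execute("CREATE TABLE t (x INTEGER)")
pool.release(conn)                 # reused by the next acquire, not closed
```

Production pools add health checks and timeouts, but the core saving is the same: connection setup cost is paid once per pool slot, not once per request.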
Checkpoint Recovery
Using periodic snapshots (checkpoints) of the database state to restore system consistency after a failure by only needing to replay transactions that occurred after the most recent checkpoint.
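A toy sketch of the replay logic, using hypothetical checkpoint and log structures (sequence numbers and a single balance field):

```python
# Periodic snapshot: state as of transaction seq 3.
checkpoint = {"seq": 3, "state": {"balance": 120}}

# Full transaction log; only entries after the checkpoint need replaying.
log = [
    {"seq": 1, "delta": 100},
    {"seq": 2, "delta": -30},
    {"seq": 3, "delta": 50},
    {"seq": 4, "delta": -20},   # occurred after the checkpoint
    {"seq": 5, "delta": 10},
]

def recover(checkpoint, log):
    state = dict(checkpoint["state"])
    for txn in log:
        if txn["seq"] > checkpoint["seq"]:   # skip already-checkpointed work
            state["balance"] += txn["delta"]
    return state

recovered = recover(checkpoint, log)   # 120 - 20 + 10 = 110
```

Replaying two transactions instead of five is a small saving here, but the same principle bounds recovery time in systems with millions of logged transactions.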
Data Collection Rule (DCR)
Configuration settings that define how data should be collected and processed in Azure Monitor.
Data Enrichment
The process of enhancing data with additional context or information from external sources.
Data Retention Period
Duration for which telemetry data is stored.
Data Resolution
Level of detail used in measurements.
Data Partitioning
Dividing a large dataset into smaller, more manageable sections. Frequently used in parallel computing and distributed systems to improve performance and scalability.
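A simple sketch of even partitioning, as used to fan a dataset out to parallel workers (the `partition` helper is illustrative):

```python
def partition(items, n):
    """Split a dataset into n near-equal contiguous chunks."""
    size, rem = divmod(len(items), n)
    parts, start = [], 0
    for i in range(n):
        end = start + size + (1 if i < rem else 0)   # spread the remainder
        parts.append(items[start:end])
        start = end
    return parts

chunks = partition(list(range(10)), 3)   # [[0,1,2,3], [4,5,6], [7,8,9]]
```

Each chunk can then be handed to a separate worker; no element is dropped or duplicated, so results can be merged afterwards.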
Data Point
Single measurement at a specific time.
Diagnostics
System health monitoring tools and methods.
Device
A source of log data, which can be physical hardware, virtual machines, or software components.
Director
Central orchestration service (vmetric-director.exe) that runs telemetry processing pipelines and coordinates Agent configuration across enterprise networks.
Parser
Component that separates parts of raw data into semantic units.
Payload
Actual data content being transmitted.
Pipeline
Pre-configured processing logic for a data stream.
Polling Interval
Time between consecutive data collection attempts.
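A minimal polling loop, assuming a callable data source and a fixed interval (names are illustrative):

```python
import time

def poll(source, interval, max_polls):
    """Collect from `source` every `interval` seconds, `max_polls` times."""
    samples = []
    for _ in range(max_polls):
        samples.append(source())   # one collection attempt
        time.sleep(interval)       # wait out the polling interval
    return samples

readings = poll(lambda: 42, interval=0.01, max_polls=3)
```

Shorter intervals give fresher data at the cost of more load on the source; choosing the interval is a trade-off between resolution and overhead.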
Processor
A component that performs a specific operation on streaming data.
Postprocessing
Stage 5 of DataStream processing where pipeline output is converted to target-specific formats with schema enforcement and compression optimization.
Predefined Definitions
Ready-made collection patterns for common data sources (such as Windows security events) that Directors provide to Agents, eliminating manual configuration requirements.
Preprocessing
Stage 2 of DataStream processing where device-specific formats are transformed to standardized pipeline input using VMFL binary encoding.
Resource Management
Control and optimization of system resources (CPU, memory, disk) during data processing.
RBAC (Role-Based Access Control)
Access management based on user roles.
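The core of RBAC is a mapping from roles to permission sets; a user's access is checked against their role, not their identity. A minimal sketch with hypothetical roles:

```python
# Roles map to sets of permitted actions (illustrative policy).
ROLE_PERMISSIONS = {
    "admin":  {"read", "write", "delete"},
    "viewer": {"read"},
}

def is_allowed(role, action):
    """Authorize an action by role membership, not individual identity."""
    return action in ROLE_PERMISSIONS.get(role, set())
```

Adding a user then only requires assigning a role, and changing a role's permissions updates every user holding it at once.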
Route
YAML orchestration configuration that connects devices, pipelines, and targets to create complete data flow processing chains for specific business purposes.
Routing
Sending or directing a data stream to a destination for further processing or analysis.
Schema Drift
Gradual change over time of the layout and structure of data underlying a processing system.
Schema Validation
The process of verifying that data conforms to a predefined structure, format, and rules (schema) before it is processed or stored in a system.
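A minimal validator, assuming a schema expressed as required field names and expected types (the schema shown is illustrative):

```python
# Illustrative schema: required fields and their expected types.
SCHEMA = {"timestamp": str, "host": str, "value": (int, float)}

def validate(record, schema=SCHEMA):
    """Return a list of violations; an empty list means the record conforms."""
    errors = []
    for field, expected in schema.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected):
            errors.append(f"wrong type for {field}")
    return errors

ok = validate({"timestamp": "2024-01-01T00:00:00Z", "host": "web-1", "value": 3.5})
bad = validate({"host": 1})   # missing timestamp and value, host is not a str
```

Real systems typically use declarative schema languages (e.g. JSON Schema) rather than hand-rolled checks, but the pass/fail contract is the same.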
Size-Based Rotation
The practice of automatically creating new log/data files when the current file reaches a specified size limit, helping to manage storage and prevent individual files from becoming too large to handle efficiently.
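Python's standard library implements this directly via `logging.handlers.RotatingFileHandler`; the size limit and backup count below are example values:

```python
import logging
import os
from logging.handlers import RotatingFileHandler

# Rotate app.log when it reaches ~1 KB, keeping up to 3 old files
# (app.log.1, app.log.2, app.log.3); older files are discarded.
handler = RotatingFileHandler("app.log", maxBytes=1024, backupCount=3)
logger = logging.getLogger("rotation-demo")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

for i in range(200):                       # enough output to force rotation
    logger.info("event %d: some telemetry payload", i)

handler.close()
rotated = [f for f in os.listdir(".") if f.startswith("app.log.")]
```

The bounded `backupCount` is what prevents unbounded disk growth: total log footprint stays near `maxBytes * (backupCount + 1)`.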
SNMP (Simple Network Management Protocol)
Standard protocol for network monitoring.
Socket Reuse
Optimization technique allowing multiple processes to share network sockets for improved performance.
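One concrete form of this is the `SO_REUSEPORT` socket option (available on Linux and BSD, not universally), which lets multiple sockets bind the same address so several worker processes can accept connections on one port:

```python
import socket

def reusable_listener(host, port):
    """Create a TCP socket that can share (host, port) with other sockets."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    # Must be set on every socket sharing the port, before bind().
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEPORT, 1)
    s.bind((host, port))
    return s

s1 = reusable_listener("127.0.0.1", 0)       # port 0 = pick any free port
port = s1.getsockname()[1]
s2 = reusable_listener("127.0.0.1", port)    # same port, no "address in use"
same_port = s1.getsockname()[1] == s2.getsockname()[1]
s1.close()
s2.close()
```

With both sockets listening, the kernel balances incoming connections between them, which is how multi-process servers scale accept throughput on a single port.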
Store-and-Forward
A technique where data is temporarily stored locally before being transmitted, ensuring data preservation during network issues.
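A file-backed sketch of the pattern, with a hypothetical spool file name (`buffer.jsonl`) and a pluggable `send` function standing in for the network:

```python
import json
import os

BUFFER = "buffer.jsonl"   # local spool file (hypothetical name)

if os.path.exists(BUFFER):
    os.remove(BUFFER)     # start clean for the demo

def store(event):
    """Persist the event locally before any transmission attempt."""
    with open(BUFFER, "a") as f:
        f.write(json.dumps(event) + "\n")

def forward(send):
    """Try to transmit buffered events; keep failures for a later retry."""
    if not os.path.exists(BUFFER):
        return
    with open(BUFFER) as f:
        events = [json.loads(line) for line in f]
    pending = [e for e in events if not send(e)]   # send returns True on success
    with open(BUFFER, "w") as f:
        for e in pending:
            f.write(json.dumps(e) + "\n")

store({"event": "login"})
store({"event": "logout"})
delivered = []
forward(lambda e: delivered.append(e) or True)   # network is up: all sent
```

Because every event touches durable storage before transmission, a network outage only delays delivery; `forward` simply retries the pending entries on the next run.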
Supervisor
The component that monitors process health, restarting stopped services, cleaning up the temp folder, and performing similar maintenance tasks.