Glossary

A

Agent
A software component installed on systems to collect and forward telemetry data. Can be managed (traditionally installed) or auto-managed (automatically deployed).
Agentless
Collection method that does not require installing software on target systems, typically using existing protocols like SSH or WinRM.
Aggregation
Process of combining and grouping multiple data points based on common properties.
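A minimal Python sketch of the idea: data points are grouped by a shared property and an average is computed per group (the `host` and `cpu` field names are hypothetical).

```python
from collections import defaultdict

def aggregate(points: list[dict], group_by: str, value_field: str) -> dict:
    """Group data points by a common property and average the values per group."""
    groups: dict[str, list[float]] = defaultdict(list)
    for point in points:
        groups[point[group_by]].append(point[value_field])
    return {key: sum(values) / len(values) for key, values in groups.items()}

# Average CPU usage per host.
print(aggregate(
    [{"host": "web-1", "cpu": 40}, {"host": "web-1", "cpu": 60}, {"host": "db-1", "cpu": 10}],
    group_by="host", value_field="cpu"))
```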
Alert
Notification of significant system events.
Archival
Long-term data storage process.
Audit
Review of logs to verify that IT policies are correctly implemented.
Audit Trail
A record of system access events and the actions performed.
Authentication
Verification of the identity of users, devices, or systems attempting to access network resources.
Authorization
Permission to access specific resources.
Access Control
A security mechanism that regulates who or what can view, use, or modify resources in a computing environment based on predefined policies and permissions.

B

Batch Processing
Sequential processing of a large set of records that all require the same operations.
Bandwidth
Data transmission capacity per unit time.
Baseline
Reference values for normal operation used for comparison across multiple cases.
Buffer
Temporary storage for telemetry data.
Buffer Management
Control of memory buffers used for temporary data storage during processing, including size control and flush intervals.
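A minimal Python sketch, assuming a hypothetical `send_batch()` downstream sender; the record limit and flush interval are illustrative defaults.

```python
import time

def send_batch(records: list[dict]) -> None:
    """Hypothetical downstream sender."""
    print(f"flushing {len(records)} records")

class TelemetryBuffer:
    """Hold records in memory and flush when either the size limit or the flush interval is hit."""
    def __init__(self, max_records: int = 1000, flush_interval: float = 5.0):
        self.max_records = max_records
        self.flush_interval = flush_interval
        self._records: list[dict] = []
        self._last_flush = time.monotonic()

    def add(self, record: dict) -> None:
        self._records.append(record)
        if (len(self._records) >= self.max_records
                or time.monotonic() - self._last_flush >= self.flush_interval):
            self.flush()

    def flush(self) -> None:
        if self._records:
            send_batch(self._records)
            self._records.clear()
        self._last_flush = time.monotonic()
```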

C

Cache
Fast-access temporary data storage.
Calibration
Measurement accuracy adjustment.
Collector
System component that receives and processes telemetry data.
Compression
Data size reduction technique used for saving storage space.
Connection Pooling
Maintaining a cache of database connections that can be reused when future requests to the database are required, reducing the overhead of creating new database connections for each request.
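A simplified Python sketch of the pattern, using SQLite purely for illustration; a production pool would also handle timeouts and broken connections.

```python
import queue
import sqlite3

class ConnectionPool:
    """Keep a fixed set of open connections and hand them out for reuse."""
    def __init__(self, size: int = 5, dsn: str = "telemetry.db"):
        self._pool: queue.Queue = queue.Queue(maxsize=size)
        for _ in range(size):
            self._pool.put(sqlite3.connect(dsn, check_same_thread=False))

    def acquire(self) -> sqlite3.Connection:
        return self._pool.get()      # blocks until a connection is free

    def release(self, conn: sqlite3.Connection) -> None:
        self._pool.put(conn)         # return the connection instead of closing it

pool = ConnectionPool(size=3)
conn = pool.acquire()
conn.execute("SELECT 1")
pool.release(conn)
```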
Checkpoint Recovery
Using periodic snapshots (checkpoints) of the database state to restore system consistency after a failure by only needing to replay transactions that occurred after the most recent checkpoint.
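A schematic Python sketch with hypothetical `checkpoint` and write-ahead-log (`wal`) structures: state is restored from the snapshot, and only transactions newer than the checkpoint are replayed.

```python
def recover(checkpoint: dict, wal: list[dict]) -> dict:
    """Rebuild state from the latest checkpoint plus the transactions logged after it."""
    state = dict(checkpoint["snapshot"])            # start from the checkpointed snapshot
    for txn in wal:
        if txn["id"] > checkpoint["last_txn_id"]:   # skip work already in the snapshot
            state[txn["key"]] = txn["value"]        # re-apply only the newer transaction
    return state

checkpoint = {"last_txn_id": 2, "snapshot": {"a": 1, "b": 2}}
wal = [{"id": 2, "key": "b", "value": 2}, {"id": 3, "key": "c", "value": 9}]
print(recover(checkpoint, wal))   # {'a': 1, 'b': 2, 'c': 9}
```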

D

Dashboard
Visual interface for telemetry data display.
Data Collection Rules (DCR)
Configuration settings that define how data should be collected and processed in Azure Monitor.
Data Enrichment
The process of enhancing data with additional context or information from external sources.
Data Retention Period
Duration for which telemetry data is stored.
Data Resolution
Level of detail used in measurements.
Data Partitioning
Dividing a large dataset into smaller, more manageable sections. Frequently used in parallel computing and distributed systems to improve performance and scalability.
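A minimal Python sketch of hash-based partitioning; the `host` key and partition count are illustrative.

```python
def partition(records: list[dict], key: str, n_partitions: int) -> list[list[dict]]:
    """Split records into partitions by hashing a key, so each partition can be processed independently."""
    partitions: list[list[dict]] = [[] for _ in range(n_partitions)]
    for record in records:
        partitions[hash(record[key]) % n_partitions].append(record)
    return partitions

shards = partition([{"host": "web-1"}, {"host": "db-2"}, {"host": "web-1"}], key="host", n_partitions=4)
```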
Data Point
Single measurement at a specific time.
Diagnostics
System health monitoring tools and methods.
Device
A source of log data, which can be physical hardware, virtual machines, or software components.

E

Encryption
Data security achieved by encoding data so that it cannot be read without the corresponding decryption key.
Event
Any observable occurrence in a computer or network system that is recorded, typically as a signal sent or received by a component.

F

Failover
Automatic activation of a backup system when a primary hardware resource becomes unusable.
Field Mapping
The process of translating field names from one format to another during data normalization.
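A minimal Python sketch using a hypothetical mapping from source field names to a target schema; unmapped fields pass through unchanged.

```python
FIELD_MAP = {"src_ip": "source.ip", "dst_ip": "destination.ip", "msg": "message"}

def map_fields(event: dict) -> dict:
    """Rename source field names to the target schema's names."""
    return {FIELD_MAP.get(name, name): value for name, value in event.items()}

print(map_fields({"src_ip": "10.0.0.1", "msg": "login ok"}))
# {'source.ip': '10.0.0.1', 'message': 'login ok'}
```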
Flow
A sequence of related packets in network traffic, typically representing a single connection or transaction.

G

Gateway
Interface between different network segments.
Geohash
A system for encoding geographic locations into short strings of letters and numbers.
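A compact Python sketch of the standard geohash encoding: longitude and latitude bits are interleaved and packed into base-32 characters.

```python
_BASE32 = "0123456789bcdefghjkmnpqrstuvwxyz"

def geohash_encode(lat: float, lon: float, precision: int = 9) -> str:
    """Encode a latitude/longitude pair as a geohash string."""
    lat_range, lon_range = [-90.0, 90.0], [-180.0, 180.0]
    bits, use_lon = [], True                 # bit order alternates: lon, lat, lon, ...
    while len(bits) < precision * 5:
        rng, value = (lon_range, lon) if use_lon else (lat_range, lat)
        mid = (rng[0] + rng[1]) / 2
        if value >= mid:
            bits.append(1)
            rng[0] = mid
        else:
            bits.append(0)
            rng[1] = mid
        use_lon = not use_lon
    chars = []
    for i in range(0, len(bits), 5):         # every 5 bits become one base-32 character
        index = 0
        for bit in bits[i:i + 5]:
            index = (index << 1) | bit
        chars.append(_BASE32[index])
    return "".join(chars)

print(geohash_encode(57.64911, 10.40744))    # 9-character geohash for a point in Denmark
```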
Geotile
A method of dividing geographic areas into tiles based on zoom levels and coordinates.
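A minimal Python sketch using the widely used slippy-map tiling scheme, which derives integer tile indices from longitude, latitude, and zoom level.

```python
import math

def lat_lon_to_tile(lat: float, lon: float, zoom: int) -> tuple[int, int]:
    """Convert a coordinate to tile x/y indices at the given zoom level."""
    n = 2 ** zoom                                        # number of tiles per axis
    x = int((lon + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(math.radians(lat))) / math.pi) / 2.0 * n)
    return x, y

print(lat_lon_to_tile(52.52, 13.405, zoom=10))           # tile containing Berlin at zoom 10
```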

H

Heartbeat
Periodic signal sent to confirm system operation.
Historical Data
Previously collected measurements.

I

Ingestion
Collection of data from a source for a specific use, such as analysis or monitoring.

L

Latency
Time delay in data transmission.
Load Balancing
Distribution of processing load among multiple hardware resources.
Logging
Recording of system events.

M

Management
The policy that determines how log lines are handled after they have been processed.
Metrics
Quantifiable measurements of system attributes.
MIB (Management Information Base)
Definition files for network object structures used for automated management.
Multi-Worker Architecture
Design pattern where multiple worker processes handle tasks concurrently for improved performance.
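A minimal Python sketch using the standard library's `multiprocessing.Pool`; `parse_line` is a hypothetical stand-in for real per-record work.

```python
from multiprocessing import Pool

def parse_line(line: str) -> dict:
    """Hypothetical CPU-bound parsing step applied to each raw log line."""
    return {"length": len(line), "line": line.strip()}

if __name__ == "__main__":
    lines = ["alpha 1", "bravo 22", "charlie 333"]
    with Pool(processes=4) as workers:          # four worker processes share the work
        results = workers.map(parse_line, lines)
    print(results)
```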

N

Node
Individual point in a telemetry network.
Normalization
The process of transforming data into a consistent format that is easier for a consumer to use.

O

OID (Object Identifier)
A unique identifier for a network object defined in a MIB.

P

Parser
Component that separates parts of raw data into semantic units.
Payload
Actual data content being transmitted.
Pipeline
Pre-configured processing logic for a data stream.
Polling Interval
Time between consecutive data collection attempts.
Processor
A component that performs a specific operation on streaming data.
Protocol
Rules governing data communication and exchange.

Q

Query
Request for specific telemetry information.

R

Real-time Monitoring
Immediate observation of system state and events.
Redundancy
Backup systems used for reliability.
Resource Management
Control and optimization of system resources (CPU, memory, disk) during data processing.
RBAC (Role-Based Access Control)
Access management based on user roles.
Routing
Sending or directing a data stream to a destination for further processing or analysis.

S

Sampling Rate
Frequency at which measurements are taken.
Schema Evolution
The gradual change over time in the structure and layout of the data on which a processing system relies.
Schema Validation
The process of verifying that data conforms to a predefined structure, format, and rules (schema) before it is processed or stored in a system.
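A minimal Python sketch using the third-party `jsonschema` package (assumed to be installed); the event schema shown is hypothetical.

```python
from jsonschema import ValidationError, validate   # pip install jsonschema

EVENT_SCHEMA = {
    "type": "object",
    "properties": {
        "timestamp": {"type": "string"},
        "severity": {"type": "integer", "minimum": 0, "maximum": 7},
        "message": {"type": "string"},
    },
    "required": ["timestamp", "message"],
}

def is_valid(event: dict) -> bool:
    """Return True only if the event conforms to the schema."""
    try:
        validate(instance=event, schema=EVENT_SCHEMA)
        return True
    except ValidationError:
        return False

print(is_valid({"timestamp": "2024-01-01T00:00:00Z", "message": "ok"}))   # True
print(is_valid({"message": "missing timestamp field"}))                   # False
```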
Size-Based Rotation
The practice of automatically creating new log/data files when the current file reaches a specified size limit, helping to manage storage and prevent individual files from becoming too large to handle efficiently.
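For example, Python's standard library implements size-based rotation in `logging.handlers.RotatingFileHandler`; the file name and limits below are illustrative.

```python
import logging
from logging.handlers import RotatingFileHandler

# Roll over to app.log.1, app.log.2, ... once app.log reaches ~10 MB,
# keeping at most five rotated files.
handler = RotatingFileHandler("app.log", maxBytes=10 * 1024 * 1024, backupCount=5)
logger = logging.getLogger("telemetry")
logger.addHandler(handler)
logger.warning("buffer flushed to disk")
```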
SNMP (Simple Network Management Protocol)
Standard protocol for network monitoring.
Socket Reuse
Optimization technique allowing multiple processes to share network sockets for improved performance.
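A minimal Python sketch of the underlying socket options; `SO_REUSEPORT` is only available on platforms that support it (such as Linux), and the port number is illustrative.

```python
import socket

# SO_REUSEADDR lets a restarted collector rebind its port immediately;
# SO_REUSEPORT lets several worker processes bind the same port so the
# kernel can spread incoming connections across them.
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
if hasattr(socket, "SO_REUSEPORT"):
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEPORT, 1)
sock.bind(("0.0.0.0", 5140))
sock.listen()
```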
Store-and-Forward
A technique where data is temporarily stored locally before being transmitted, ensuring data preservation during network issues.
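A schematic Python sketch with a hypothetical `send()` function standing in for network transmission: records are written to a local spool file first and only removed once delivery succeeds.

```python
import json
import os

SPOOL = "spool.jsonl"

def send(record: dict) -> bool:
    """Hypothetical network sender; returns False while the destination is unreachable."""
    return False

def store_and_forward(record: dict) -> None:
    # Persist first, so a crash or network outage never loses the record.
    with open(SPOOL, "a", encoding="utf-8") as spool:
        spool.write(json.dumps(record) + "\n")
    flush_spool()

def flush_spool() -> None:
    """Try to deliver everything in the spool; keep whatever could not be sent."""
    if not os.path.exists(SPOOL):
        return
    with open(SPOOL, encoding="utf-8") as spool:
        pending = [json.loads(line) for line in spool if line.strip()]
    remaining = [record for record in pending if not send(record)]
    with open(SPOOL, "w", encoding="utf-8") as spool:
        spool.writelines(json.dumps(record) + "\n" for record in remaining)
```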
Supervisor
The component that monitors the health of processes: it restarts stopped services, cleans up the temporary folder, and performs similar housekeeping tasks.
Synchronization
Alignment of time-based data.

T

Telemetry
Remote measurement and data collection system.
Template
Container object with selection and/or processing logic for an incoming data stream.
Threshold
Predefined limit that, when crossed, triggers alerts or actions.
Timestamp
Time marker associated with collected data.
Time Series
Data points collected over sequential time periods.
TLS (Transport Layer Security)
Cryptographic protocol providing secure communication over networks.
Topology
Network structure and connections; describes the characteristics of components and the relationships between them.
Trend Analysis
Study of data patterns over time.

V

Validation
Data accuracy verification.
Vectorized Processing
Using all available cores in a system to spread the processing load across an operation.
VIP (Variable Information Period)
Configurable time interval for data sampling.