
IBM Event Streams

IBM Cloud Message Queue

Send processed telemetry data to the IBM Event Streams managed Kafka service.

Synopsis

The IBM Event Streams target writes log messages to IBM's managed Kafka service on IBM Cloud with full Kafka API compatibility. IBM Event Streams provides enterprise-grade messaging with automatic scaling, high availability, and IBM Cloud integration. Configuration follows Apache Kafka patterns with IBM Cloud-specific authentication.

Schema

targets:
  - name: <string>
    type: kafka
    properties:
      address: <string>
      port: <integer>
      client_id: <string>
      topic: <string>
      algorithm: <string>
      username: <string>
      password: <string>
      compression: <string>
      compression_level: <string>
      acknowledgments: <string>
      allow_auto_topic_creation: <boolean>
      disable_idempotent_write: <boolean>
      max_bytes: <integer>
      max_events: <integer>
      tls:
        status: <boolean>
        insecure_skip_verify: <boolean>
        min_tls_version: <string>
        max_tls_version: <string>
        cert_name: <string>
        key_name: <string>
        passphrase: <string>
      field_format: <string>
      interval: <string/numeric>
      cron: <string>

Configuration

Base Target Fields

| Field | Type | Required | Description |
|-------|------|----------|-------------|
| name | string | Y | Unique identifier for this target |
| description | string | N | Human-readable description |
| type | string | Y | Must be kafka |
| pipelines | array | N | Pipeline names to apply before sending |
| status | boolean | N | Enable (true) or disable (false) this target |
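
As a sketch, the base fields combine with the required connection properties like this; the pipeline name normalize-logs is a placeholder for a pipeline defined elsewhere in your configuration:

targets:
  - name: ibm-event-streams
    description: Forward processed telemetry to IBM Event Streams
    type: kafka
    status: true
    pipelines:
      - normalize-logs
    properties:
      address: broker-0.kafka.svc01.us-south.eventstreams.cloud.ibm.com
      topic: application-logs
      username: "${IBM_ES_USERNAME}"
      password: "${IBM_ES_PASSWORD}"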

IBM Event Streams Connection

| Field | Type | Required | Description |
|-------|------|----------|-------------|
| address | string | Y | IBM Event Streams broker address (from service credentials) |
| port | integer | N | Kafka broker port. Default: 9093 |
| client_id | string | N | Client identifier for connection tracking |
| topic | string | Y | Kafka topic name for message delivery |

Authentication

| Field | Type | Required | Description |
|-------|------|----------|-------------|
| algorithm | string | N | Authentication mechanism. Default: scram-sha-512 |
| username | string | Y | IBM Event Streams username (from service credentials) |
| password | string | Y | IBM Event Streams password (from service credentials) |

Producer Settings

| Field | Type | Required | Description |
|-------|------|----------|-------------|
| compression | string | N | Message compression (none, gzip, snappy, lz4, zstd). Default: none |
| compression_level | string | N | Compression level (algorithm-specific) |
| acknowledgments | string | N | Acknowledgment level (none, leader, all). Default: leader |
| allow_auto_topic_creation | boolean | N | Allow automatic topic creation. Default: false |
| disable_idempotent_write | boolean | N | Disable idempotent producer. Default: false |

Batch Configuration

| Field | Type | Required | Description |
|-------|------|----------|-------------|
| max_bytes | integer | N | Maximum batch size in bytes (0 = unlimited). Default: 0 |
| max_events | integer | N | Maximum number of events per batch. Default: 1000 |

TLS Configuration

| Field | Type | Required | Description |
|-------|------|----------|-------------|
| tls.status | boolean | N | Enable TLS encryption. Default: true (required for IBM Cloud) |
| tls.insecure_skip_verify | boolean | N | Skip TLS certificate verification. Default: false |
| tls.min_tls_version | string | N | Minimum TLS version. Default: tls1.2 |
| tls.max_tls_version | string | N | Maximum TLS version. Default: tls1.3 |
| tls.cert_name | string | N | Client certificate filename (PEM format) |
| tls.key_name | string | N | Client private key filename (PEM format) |
| tls.passphrase | string | N | Private key passphrase if encrypted |
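
Most deployments only need tls.status: true, since IBM Event Streams presents publicly trusted server certificates. If your environment additionally requires a client certificate, the remaining fields might be used as in this sketch of the tls block (the filenames and passphrase variable are placeholders):

tls:
  status: true
  min_tls_version: tls1.2
  cert_name: client-cert.pem
  key_name: client-key.pem
  passphrase: "${TLS_KEY_PASSPHRASE}"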

Normalization

| Field | Type | Required | Description |
|-------|------|----------|-------------|
| field_format | string | N | Apply format normalization (ECS, ASIM, UDM) |

Scheduler

| Field | Type | Required | Description |
|-------|------|----------|-------------|
| interval | string/numeric | N | Execution frequency (realtime by default) |
| cron | string | N | Cron expression for scheduled execution |
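
The target normally runs in realtime. As a sketch, scheduled delivery could instead be driven by cron (the hourly expression below is illustrative); interval accepts a frequency value as an alternative:

targets:
  - name: ibm-event-streams-scheduled
    type: kafka
    properties:
      address: broker-0.kafka.svc01.us-south.eventstreams.cloud.ibm.com
      port: 9093
      topic: application-logs
      username: "${IBM_ES_USERNAME}"
      password: "${IBM_ES_PASSWORD}"
      cron: "0 * * * *"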

Details

Authentication

SASL/SCRAM-SHA-512:

  • IBM Event Streams uses SASL/SCRAM-SHA-512 authentication
  • Set algorithm: scram-sha-512 (default for IBM Event Streams)
  • Username and password from IBM Cloud service credentials
  • TLS encryption required for authentication
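
As a sketch, these settings map onto the target's properties block as follows (the environment variable names are placeholders):

algorithm: scram-sha-512          # SASL/SCRAM-SHA-512, the default
username: "${IBM_ES_USERNAME}"    # user value from the service credentials
password: "${IBM_ES_PASSWORD}"    # password value from the service credentials
tls:
  status: true                    # TLS is required for authentication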

Service Credentials:

  • Obtain credentials from IBM Cloud console
  • Navigate to Event Streams instance → Service Credentials
  • Create new credentials with appropriate permissions
  • Extract kafka_brokers_sasl, user, and password values

Service Credential Format

IBM Event Streams service credentials include:

  • kafka_brokers_sasl: Array of broker addresses
  • user: SASL username for authentication
  • password: SASL password for authentication
  • Use any broker address from the array
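
An abbreviated credential document has roughly this shape; the values below are placeholders, and real credentials contain additional fields:

{
  "kafka_brokers_sasl": [
    "broker-0.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093",
    "broker-1.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093"
  ],
  "user": "<sasl-username>",
  "password": "<sasl-password>"
}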

Connection Configuration

Broker Addresses:

  • IBM Event Streams provides multiple broker endpoints
  • Format: broker-0.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093
  • Use any broker from service credentials
  • Default port: 9093 (SASL_SSL)

TLS Requirements:

  • TLS encryption mandatory for IBM Cloud connections
  • Set tls.status: true (default)
  • IBM Event Streams uses valid certificates
  • No need for custom CA certificates
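
Putting this together, a broker endpoint from the credentials splits into address and port, and TLS stays at its defaults; a minimal properties fragment:

address: broker-0.kafka.svc01.us-south.eventstreams.cloud.ibm.com
port: 9093
tls:
  status: true        # mandatory for IBM Cloud; no custom CA bundle needed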

Topic Management

Topic Creation:

  • Pre-create topics in IBM Cloud console
  • Configure allow_auto_topic_creation: true for automatic creation (not recommended)
  • Topic configuration managed through IBM Cloud UI or CLI

Topic Permissions:

  • Service credentials grant topic-level permissions
  • Writer role required for producing messages
  • Configure permissions in IBM Cloud console

Performance Optimization

Batch Configuration:

  • Larger batches improve throughput
  • Balance batch size against latency requirements
  • IBM Event Streams handles high-throughput workloads

Compression:

  • Enable compression to reduce bandwidth costs
  • Recommended: snappy (fast) or zstd (high compression)
  • Compression reduces network transfer and storage
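
As a sketch, a throughput-oriented tuning might combine both knobs; the values are illustrative starting points rather than universal recommendations:

compression: zstd           # or snappy for lower CPU overhead
compression_level: "3"
max_events: 5000            # larger batches improve throughput at the cost of latency
max_bytes: 1048576          # cap batches at 1 MiB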

Connection Pooling:

  • Maintains persistent connection to IBM Event Streams
  • Automatic reconnection on connection loss
  • Configurable client ID for connection tracking

IBM Cloud Pricing

IBM Event Streams charges based on throughput and storage. Enable compression and tune batch sizes to optimize costs.

Kafka API Compatibility

DataStream uses the standard Kafka Producer API with support for:

  • Idempotent writes
  • Batch compression
  • SASL authentication
  • TLS encryption
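
For the strongest delivery guarantees, leave idempotent writes enabled (the default) and require acknowledgment from all in-sync replicas; a minimal properties fragment:

acknowledgments: all              # wait for all in-sync replicas
disable_idempotent_write: false   # keep the idempotent producer enabled (default)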

Security Best Practices

Credential Management:

  • Store service credentials in environment variables
  • Rotate credentials periodically
  • Use separate credentials for different environments

TLS Configuration:

  • Always enable TLS for production (mandatory for IBM Cloud)
  • Verify server certificates (insecure_skip_verify: false)
  • IBM Event Streams uses valid public certificates

Access Control:

  • Use IBM Cloud IAM for fine-grained permissions
  • Create service-specific credentials
  • Monitor credential usage through IBM Cloud

Examples

Basic Configuration

Sending logs to IBM Event Streams using SASL/SCRAM authentication...

targets:
  - name: ibm-event-streams
    type: kafka
    properties:
      address: broker-0.kafka.svc01.us-south.eventstreams.cloud.ibm.com
      port: 9093
      topic: application-logs
      algorithm: scram-sha-512
      username: "${IBM_ES_USERNAME}"
      password: "${IBM_ES_PASSWORD}"
      tls:
        status: true

With Compression

Enabling Snappy compression for bandwidth efficiency...

targets:
  - name: ibm-event-streams-compressed
    type: kafka
    properties:
      address: broker-0.kafka.svc01.us-south.eventstreams.cloud.ibm.com
      port: 9093
      topic: telemetry-events
      algorithm: scram-sha-512
      username: "${IBM_ES_USERNAME}"
      password: "${IBM_ES_PASSWORD}"
      compression: snappy
      tls:
        status: true

High-Volume Configuration

Optimizing for high-volume ingestion with larger batches and compression...

targets:
  - name: ibm-event-streams-high-volume
    type: kafka
    properties:
      address: broker-0.kafka.svc01.us-south.eventstreams.cloud.ibm.com
      port: 9093
      topic: metrics-stream
      algorithm: scram-sha-512
      username: "${IBM_ES_USERNAME}"
      password: "${IBM_ES_PASSWORD}"
      compression: zstd
      compression_level: "3"
      max_events: 1000
      max_bytes: 1048576
      acknowledgments: all
      tls:
        status: true

With Client Identification

Using client ID for connection tracking and monitoring...

targets:
  - name: ibm-event-streams-identified
    type: kafka
    properties:
      address: broker-0.kafka.svc01.us-south.eventstreams.cloud.ibm.com
      port: 9093
      client_id: datastream-director-01
      topic: security-logs
      algorithm: scram-sha-512
      username: "${IBM_ES_USERNAME}"
      password: "${IBM_ES_PASSWORD}"
      tls:
        status: true

Multi-Topic Publishing

Publishing different event types to separate topics...

targets:
  - name: ibm-es-security
    type: kafka
    properties:
      address: broker-0.kafka.svc01.us-south.eventstreams.cloud.ibm.com
      port: 9093
      topic: security-events
      algorithm: scram-sha-512
      username: "${IBM_ES_USERNAME}"
      password: "${IBM_ES_PASSWORD}"
      tls:
        status: true

  - name: ibm-es-application
    type: kafka
    properties:
      address: broker-0.kafka.svc01.us-south.eventstreams.cloud.ibm.com
      port: 9093
      topic: application-events
      algorithm: scram-sha-512
      username: "${IBM_ES_USERNAME}"
      password: "${IBM_ES_PASSWORD}"
      tls:
        status: true

With Normalization

Applying ECS normalization before sending to IBM Event Streams...

targets:
  - name: ibm-event-streams-normalized
    type: kafka
    properties:
      address: broker-0.kafka.svc01.us-south.eventstreams.cloud.ibm.com
      port: 9093
      topic: normalized-events
      algorithm: scram-sha-512
      username: "${IBM_ES_USERNAME}"
      password: "${IBM_ES_PASSWORD}"
      field_format: ECS
      compression: zstd
      tls:
        status: true

Production Configuration

Production-ready IBM Event Streams configuration with compression, batching, TLS, and full acknowledgments...

targets:
  - name: ibm-event-streams-production
    type: kafka
    properties:
      address: broker-0.kafka.svc01.us-south.eventstreams.cloud.ibm.com
      port: 9093
      client_id: datastream-production-01
      topic: production-telemetry
      algorithm: scram-sha-512
      username: "${IBM_ES_USERNAME}"
      password: "${IBM_ES_PASSWORD}"
      compression: zstd
      compression_level: "3"
      acknowledgments: all
      max_events: 1000
      max_bytes: 1048576
      field_format: ASIM
      tls:
        status: true
        min_tls_version: tls1.2