Kafka Exporter
Created: 2024-11-05 · Last Modified: 2024-11-05
This document was translated by ChatGPT
# 1. Functionality
Using Kafka, you can export the metrics, flow logs, call logs, and I/O events generated by DeepFlow to external platforms.
# 2. Metrics Overview
Within DeepFlow, metrics can be categorized into two types:
- Application Performance Metrics: refer to the details; corresponds to the `flow_metrics.application*` tables in ClickHouse
- Network Performance Metrics: refer to the details; corresponds to the `flow_metrics.network*` tables in ClickHouse
# 3. Kafka Export
Data is exported to Kafka as JSON-encoded messages.
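Since each exported record is a JSON document, a consumer only needs to decode the message payload. A minimal sketch of that decoding step is shown below; the field names in the sample payload (`tag`, `metrics`, `pod_name`, `request`) are illustrative assumptions, since the actual fields depend on the data-source and the configured `export-fields` (`$tag`, `$metrics`):

```python
import json

def parse_export_record(raw: bytes) -> dict:
    """Decode one JSON message produced by the Kafka exporter."""
    return json.loads(raw)

# Hypothetical payload; real field names depend on the data-source
# and the configured export-fields ($tag, $metrics).
raw = b'{"tag": {"pod_name": "web-0"}, "metrics": {"request": 12}}'
record = parse_export_record(raw)
```

In a real consumer, `raw` would be the value of each message polled from the topic (for example via a Kafka client library); the parsing step itself stays the same.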
# 4. DeepFlow Server Configuration Guide
To enable metrics export, add the following configuration under the Server settings:
```yaml
ingester:
  exporters:
    - protocol: kafka
      enabled: true
      endpoints: [broker1.example.com:9092, broker2.example.com:9092]
      data-sources:
        - flow_log.l7_flow_log
        # - flow_log.l4_flow_log
        # - flow_metrics.application_map.1s
        # - flow_metrics.application_map.1m
        # - flow_metrics.application.1s
        # - flow_metrics.application.1m
        # - flow_metrics.network_map.1s
        # - flow_metrics.network_map.1m
        # - flow_metrics.network.1s
        # - flow_metrics.network.1m
        # - event.perf_event
      queue-count: 4
      queue-size: 100000
      batch-size: 1024
      flush-timeout: 10
      tag-filters:
      export-fields:
        - $tag
        - $metrics
      export-empty-tag: false
      export-empty-metrics-disabled: false
      enum-translate-to-name-disabled: false
      universal-tag-translate-to-name-disabled: false
      sasl:
        enabled: false
        security-protocol: SASL_SSL # currently only supports: SASL_SSL
        sasl-mechanism: PLAIN # currently only supports: PLAIN
        username: aaa
        password: bbb
      topic:
```
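Before deploying, it can help to sanity-check the exporter settings programmatically. The sketch below mirrors a few of the configuration keys above as a Python dict and validates the required fields; the `validate` helper is a hypothetical illustration, not part of DeepFlow:

```python
# A minimal sanity-check of the exporter settings, expressed as a Python
# dict mirroring the YAML above (keys follow the configuration names).
exporter = {
    "protocol": "kafka",
    "enabled": True,
    "endpoints": ["broker1.example.com:9092", "broker2.example.com:9092"],
    "data-sources": ["flow_log.l7_flow_log"],
    "batch-size": 1024,
}

def validate(cfg: dict) -> list:
    """Return a list of problems; an empty list means the required fields look sane."""
    problems = []
    for required in ("protocol", "endpoints", "data-sources"):
        if not cfg.get(required):
            problems.append("missing required field: " + required)
    if cfg.get("protocol") and cfg["protocol"] != "kafka":
        problems.append("protocol must be the fixed value 'kafka'")
    return problems
```

Running `validate(exporter)` on the dict above returns an empty list; dropping `endpoints` or `data-sources` would flag them as missing.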
# 5. Detailed Parameter Description
| Field | Type | Required | Description |
| --- | --- | --- | --- |
| protocol | string | Yes | Fixed value `kafka`. |
| data-sources | strings | Yes | Data sources, drawn from the ClickHouse `flow_metrics.*`, `flow_log.*`, and `event.perf_event` tables; also used in the default Kafka topic names. |
| endpoints | strings | Yes | Kafka broker addresses; the exporter randomly selects one it can send to successfully. |
| batch-size | int | No | Batch size; when this many records accumulate, they are sent as one batch. Default: 1024. |
| export-fields | strings | Yes | Recommended configuration: `[$tag, $metrics]`. |
| sasl | struct | No | Kafka connection authentication; currently only `SASL_SSL` with the `PLAIN` mechanism is supported. |
| topic | string | No | Kafka topic name. If empty, defaults to `deepflow.$data-source`, e.g. `deepflow.flow_log.l7_flow_log`. |
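The default topic rule from the last row can be sketched as a small helper; `default_topic` is a hypothetical name used here only to make the `deepflow.$data-source` convention concrete:

```python
def default_topic(data_source: str, prefix: str = "deepflow") -> str:
    """Default Kafka topic when 'topic' is left empty: deepflow.$data-source."""
    return prefix + "." + data_source

# e.g. the data-source flow_log.l7_flow_log lands on topic
# deepflow.flow_log.l7_flow_log
topic = default_topic("flow_log.l7_flow_log")
```

Each entry in `data-sources` therefore maps to its own topic unless an explicit `topic` is configured, in which case all records go to that single topic.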