This topic describes the data consumption format definitions and examples for the real-time data subscription feature.
Prerequisites
You have completed the client-side configuration (Kafka / Prometheus monitoring system / API address) and public network connectivity is working. The inbound ports on the client server must also allow access.
Note: This topic uses Kafka data consumption as an example. The basic steps are as follows:
Install the Kafka client and modify the relevant configuration.
Configure the inbound network rules on the client server to allow access.
Start the client Kafka program.
In the Cloud Monitor console, create a data subscription task (fill in the client Kafka configuration).
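For reference, below is a minimal Java consumer sketch for the uncompressed alarm/event topics. The broker address, topic name, and consumer group shown here are placeholders and must be replaced with the values configured in your subscription task; metric messages are zstd-compressed binary payloads and should be consumed with a ByteArrayDeserializer instead (see the metric data format section below).
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class SubscriptionConsumerDemo {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "your-kafka-broker:9092");   // placeholder broker address
        props.put("group.id", "cloud-monitor-subscription-demo");   // placeholder consumer group
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("your-subscription-topic")); // placeholder topic
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("offset=%d value=%s%n", record.offset(), record.value());
                }
            }
        }
    }
}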
Data consumption format definitions
Metric data format (Kafka message queue)
Alarm data is not compressed and can be consumed directly.
Metric data written to Kafka by vminsert is compressed with zstd, and a single message may contain multiple metrics. After consuming a message, decompress it first; the decompressed data looks like the following:
{
"metrics":[{
"tags":{
"__report_by__":"harvest",
"cpu":"cpu-total",
"host":"test-2dd4bfrv",
"idc":"neimengaz03",
"job":"virtual_machine",
"__name__":"cpu_util",
"uuid":"test-8194e630-fec1"
},
"fields":{
"cpu_util":0.26697814117055
},
"name":"prometheus_remote_write",
"timestamp":1713482161
}]
}
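As a reference, here is a minimal decoding sketch for the compressed metric messages, assuming the zstd-jni library (com.github.luben:zstd-jni) for decompression and Jackson for JSON parsing; the class and method names are illustrative only.
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.github.luben.zstd.Zstd;

public class MetricMessageDecoder {
    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Decompress one Kafka message value (consumed as byte[]) and print every metric it contains.
    public static void decode(byte[] compressedValue) throws Exception {
        // The zstd frame header normally records the original content size.
        int originalSize = (int) Zstd.decompressedSize(compressedValue);
        byte[] json = Zstd.decompress(compressedValue, originalSize);
        JsonNode root = MAPPER.readTree(json);
        for (JsonNode metric : root.get("metrics")) {
            String name = metric.get("tags").get("__name__").asText();
            long timestamp = metric.get("timestamp").asLong();
            System.out.printf("%s @ %d fields=%s%n", name, timestamp, metric.get("fields"));
        }
    }
}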
Metric data format (remotewrite-API)
{"metric":{"__name__":"cpu_util","__report_by__":"harvest","cpu":"cpu-total","from":"subscription_translate","host":"ecm-dd23","idc":"neimengaz03","job":"virtual_machine","region_id":"test","uuid":"test-8194e630-fec1"},"value":[1743406897.472,"0.2334889928784665"]}
Alarm data format
[
{
"service":"cstor_sfs",
"dimension":"oceanfs",
"region_id":"test",
"idc":"neimengaz03",
"key":"fd8ef95badd54206bc243cd4fe6da114",
"model_id":"405f3438-fba8-52d1-a7e5-8e67f26f92b6",
"issue_id":"67b0aead8d4ae0276248dd21",
"info_id":"67b0aead8d4ae0276248dd22",
"name":"資源分組海量文件規則",
"alarm_type":"series",
"status":0,
"ctime":1739632301,
"value":104857600,
"resource":[{
"name":"uuid",
"value":"56m0a9zta02dkxx7"
},{
"name":"instancename",
"value":"oceanfs-ac40"
},{
"name":"console_resource_id",
"value":"56m0a9zta02dkxx7"
}],
"metric":"fs_capacity_total",
"alarm_name":"資源分組海量文件規則",
"threshold":"0",
"operator":"ge",
"unit":"MB"
},
...
{
"service":"cstor_sfs",
"dimension":"oceanfs",
"region_id":"81f7728662dd11ec810800155d307d5b",
"idc":"neimengaz03",
"key":"fd8ef95badd54206bc243cd4fe6da114",
"model_id":"405f3438-fba8-52d1-a7e5-8e67f26f92b6",
"issue_id":"67b0aead8d4ae0276248dd21",
"info_id":"67b0aead8d4ae0276248dd22",
"name":"資源分組海量文件規則",
"alarm_type":"series",
"status":0,
"ctime":1739632301,
"value":104857600,
"resource":[{
"name":"uuid",
"value":"56m0a9zta02dkxx7"
},{
"name":"instancename",
"value":"oceanfs-ac40"
},{
"name":"console_resource_id",
"value":"56m0a9zta02dkxx7"
}],
"metric":"fs_capacity_total",
"alarm_name":"資源分組海量文件規則",
"threshold":"0",
"operator":"ge",
"unit":"MB"
}
]
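A minimal parsing sketch for the alarm payload, assuming Jackson; the class name AlarmMessageParser and the fields printed are illustrative choices, not part of the subscription API.
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class AlarmMessageParser {
    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Parse one uncompressed alarm message (a JSON array of alarm objects).
    public static void parse(String messageValue) throws Exception {
        for (JsonNode alarm : MAPPER.readTree(messageValue)) {
            // The resource array lists name/value pairs identifying the affected resource.
            String uuid = "";
            for (JsonNode res : alarm.get("resource")) {
                if ("uuid".equals(res.get("name").asText())) {
                    uuid = res.get("value").asText();
                }
            }
            System.out.printf("alarm '%s' on %s: metric=%s value=%s status=%d%n",
                    alarm.get("alarm_name").asText(), uuid,
                    alarm.get("metric").asText(), alarm.get("value").asText(),
                    alarm.get("status").asInt());
        }
    }
}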
Event data format
[
{
"specversion":"1.0",
"id":"test-8194e630-fec1",
"source":"ctyun.site_monitor",
"type":"site_monitor:response_timeout",
"subject":"ctyun.site_monitor:fdda0184-2211-40dd-8098-02bd4e1f0452:response_timeout",
"datacontenttype":"application/json",
"time":"2025-01-16T14:06:52.368536895Z",
"data":{
"taskID":"fdda0184-2211-40dd-8098-02bd4e1f0452",
"pointID":"1e138ae8-b991-4c25-a139-b0ad3ac2f22a",
"targetAddr":"203.83.233.26",
"targetIP":"",
"dnsServer":null,
"responseTime":"",
"accountID":"",
"protocol":"ping",
"interval":60,
"err":"fdda0184-2211-40dd-8098-02bd4e1f0452 Failed to sent any pkg or receive any valid pkg, sent: 10, receive: 0"
},
"reportidc":"guizhou03",
"ctyunregion":"guizhou03"
},
...
{
"specversion":"1.0",
"id":"1d7554df-b7c4-4109-be17-7da11a09dd6a",
"source":"ctyun.site_monitor",
"type":"site_monitor:response_timeout",
"subject":"ctyun.site_monitor:fdda0184-2211-40dd-8098-02bd4e1f0452:response_timeout",
"datacontenttype":"application/json",
"time":"2025-01-16T14:06:52.368536895Z",
"data":{
"taskID":"fdda0184-2211-40dd-8098-02bd4e1f0452",
"pointID":"1e138ae8-b991-4c25-a139-b0ad3ac2f22a",
"targetAddr":"203.83.233.26",
"targetIP":"",
"dnsServer":null,
"responseTime":"",
"accountID":"",
"protocol":"ping",
"interval":60,
"err":"fdda0184-2211-40dd-8098-02bd4e1f0452 Failed to sent any pkg or receive any valid pkg, sent: 10, receive: 0"
},
"reportidc":"guizhou03",
"ctyunregion":"guizhou03"
}
]
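The envelope fields (specversion, id, source, type, subject, datacontenttype, time, data) match the CloudEvents 1.0 attribute names, with reportidc and ctyunregion carried as additional fields. Below is a short routing sketch, assuming Jackson and an illustrative handler for the site_monitor:response_timeout type shown above.
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class EventMessageParser {
    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Parse one uncompressed event message (a JSON array of event envelopes) and route by type.
    public static void parse(String messageValue) throws Exception {
        for (JsonNode event : MAPPER.readTree(messageValue)) {
            String type = event.get("type").asText();
            JsonNode data = event.get("data");
            if ("site_monitor:response_timeout".equals(type)) {
                System.out.printf("task %s timed out: %s%n",
                        data.get("taskID").asText(), data.get("err").asText());
            } else {
                System.out.printf("unhandled event type: %s%n", type);
            }
        }
    }
}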
Notes
Subscription data falls into three categories: metric data, event data, and alarm data.
Metric data: quantitative data collected in real time or at regular intervals about various characteristics of systems, services, or resources. It is usually expressed as numeric values and describes a system's operating status, performance, and resource usage.
Event data: discrete records of state changes in cloud resources or services. They are usually generated automatically by the system and describe an operation or state change that occurred at a specific moment.
Alarm data: data generated from metric or event data when the system detects that a specific alarm rule has been triggered. An alarm record usually includes the alarm type, severity, occurrence time, related resource information, and a description of the alarm.
Garbled characters in consumed data
For Java producers and consumers, you can set the character encoding in the configuration:
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("serializer.encoding", "UTF-8"); // 設置編碼格式
props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
props.put("deserializer.encoding", "UTF-8"); // 設置編碼格式
If you are using the command-line tools, make sure the console's character encoding matches the encoding of the Kafka messages.
On Windows, you can switch the console's default code page to UTF-8:
chcp 65001
On Linux, you can adjust the character encoding by setting an environment variable:
export LANG=en_US.UTF-8