Anomaly detection of multivariate time series by adaptive graph and multi-level attention

  • Abstract: Multivariate time series (MTS) anomaly detection aims to identify abnormal patterns in time series data that do not conform to general rules. Existing methods often neglect the importance of different variables at each time step, as well as the long-term stable and short-term dynamic associations among variables, and they struggle to balance long-term and local temporal dependencies when extracting features from data, leading to insufficient feature extraction and inaccurate anomaly detection results. To address this, we propose an MTS anomaly detection method that integrates an adaptive graph and multi-level attention. First, a graph embedding model learns the long-term stable and short-term dynamic associations between variables from node embeddings and node inputs, respectively, and a graph loss mechanism is added to guide the learning of the long-term stable associations. Then, a multi-level attention mechanism is designed: a variable-level attention mechanism captures the key variables at the current time step, while a sequence-level weighted attention mechanism overcomes the global dispersion issue that can arise in traditional self-attention, effectively capturing both long-term and local temporal dependencies and thereby reconstructing the normal patterns of the multivariate time series. Finally, experiments on several real-world datasets show that the average F1-score of the proposed method is higher than 0.96, significantly outperforming state-of-the-art methods.
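The abstract describes learning variable associations from trainable node embeddings together with a graph loss. A minimal PyTorch-style sketch of that idea is given below; the cosine-similarity scoring, top-k sparsification, and the Frobenius-norm regulariser are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class AdaptiveGraph(nn.Module):
    """Learns a sparse adjacency matrix from trainable node embeddings.

    Sketch only: the scoring and sparsification choices are assumptions,
    not the paper's exact design.
    """

    def __init__(self, num_nodes: int, emb_dim: int = 64, top_k: int = 15):
        super().__init__()
        self.node_emb = nn.Parameter(torch.randn(num_nodes, emb_dim))
        self.top_k = top_k

    def forward(self) -> torch.Tensor:
        # Pairwise cosine similarity between node embeddings.
        emb = F.normalize(self.node_emb, dim=-1)
        scores = emb @ emb.t()                       # (N, N)
        scores.fill_diagonal_(float("-inf"))         # no self-loops
        # Keep only the k strongest neighbours per node (long-term stable graph).
        topk_idx = scores.topk(self.top_k, dim=-1).indices
        mask = torch.zeros_like(scores).scatter_(-1, topk_idx, 1.0)
        adj = torch.softmax(scores.masked_fill(mask == 0, float("-inf")), dim=-1)
        return adj

    def graph_loss(self, adj: torch.Tensor, prior_adj: torch.Tensor) -> torch.Tensor:
        # Placeholder for the paper's graph loss: a penalty that keeps the
        # learned graph close to a slowly updated prior adjacency.
        return ((adj - prior_adj) ** 2).mean()


# Usage sketch: 25 variables, row-normalised sparse adjacency.
graph = AdaptiveGraph(num_nodes=25)
adj = graph()
```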

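The sequence-level weighted attention is described as counteracting the global dispersion of plain self-attention so that both long-term and local dependencies are captured. One plausible (assumed) realisation is scaled dot-product attention with a learnable Gaussian locality bias, sketched below; the paper's actual weighting scheme may differ.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F


class LocalityWeightedAttention(nn.Module):
    """Self-attention with a distance-based weighting of the attention scores.

    Sketch only: the Gaussian bias is one assumed way to emphasise local
    context while keeping long-range links reachable.
    """

    def __init__(self, d_model: int, init_sigma: float = 5.0):
        super().__init__()
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.out = nn.Linear(d_model, d_model)
        self.log_sigma = nn.Parameter(torch.tensor(math.log(init_sigma)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, d_model)
        B, T, D = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        scores = q @ k.transpose(-2, -1) / math.sqrt(D)      # (B, T, T)
        # Gaussian bias: positions far from the query are down-weighted,
        # so local context dominates without removing long-range links.
        pos = torch.arange(T, device=x.device)
        dist = (pos[None, :] - pos[:, None]).float().abs()   # (T, T)
        sigma = self.log_sigma.exp()
        bias = -(dist ** 2) / (2 * sigma ** 2)
        attn = F.softmax(scores + bias, dim=-1)
        return self.out(attn @ v)
```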