keras-anomaly-detection Code Analysis: Essentially SAE and LSTM Time-Series Prediction

keras-anomaly-detection

Anomaly detection implemented in Keras

The source code for the recurrent, convolutional, and feedforward auto-encoders used for anomaly detection can be found in keras_anomaly_detection/library/recurrent.py, keras_anomaly_detection/library/convolutional.py, and keras_anomaly_detection/library/feedforward.py.

Anomaly detection is implemented with auto-encoders built from convolutional, feedforward, and recurrent networks, and can be applied to:

  • time-series data, to detect time windows with anomalous patterns (the two input layouts are sketched right after this list)
  • structured data (i.e., tabular data), to detect anomalous records
    • Conv1DAutoEncoder in keras_anomaly_detection/library/convolutional.py
    • FeedforwardAutoEncoder in keras_anomaly_detection/library/feedforward.py
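
To make the two input layouts concrete, here is a minimal sketch (my own illustration, not code from the repository): the time-series auto-encoders consume sliding windows of shape (n_windows, time_window_size, 1), while the feedforward auto-encoder consumes flat records of shape (n_records, n_features). The sizes below are arbitrary.

    import numpy as np

    # Hypothetical data, used only to illustrate the two input shapes.
    series = np.random.randn(1000)      # a raw univariate time series
    time_window_size = 64

    # Sliding windows for the LSTM/Conv1D auto-encoders: (n_windows, time_window_size, 1)
    windows = np.stack([series[i:i + time_window_size]
                        for i in range(len(series) - time_window_size + 1)])
    windows = np.expand_dims(windows, axis=-1)
    print(windows.shape)                # (937, 64, 1)

    # Flat records for the feedforward auto-encoder: (n_records, n_features),
    # already scaled to [0, 1] here because the decoder shown later ends in a relu.
    records = np.random.rand(500, 14)
    print(records.shape)                # (500, 14)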
       
Let's look at the LSTM model first:
    from keras.models import Sequential
    from keras.layers import Dense, LSTM

    def create_model(time_window_size, metric):
        model = Sequential()
        # Encoder: compress the whole window into a single 128-dim hidden state.
        model.add(LSTM(units=128, input_shape=(time_window_size, 1), return_sequences=False))
        # Decoder: reconstruct the original window from that hidden state.
        model.add(Dense(units=time_window_size, activation='linear'))

        model.compile(optimizer='adam', loss='mean_squared_error', metrics=[metric])
        print(model.summary())
        return model
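
A minimal training sketch of my own (not repository code), reusing the windows array from the shape sketch above: the target is the window itself, flattened to match the Dense output, which is exactly what makes this an auto-encoder rather than a forecaster.

    # Train the LSTM auto-encoder to reproduce each window.
    window_model = create_model(time_window_size=64, metric='mae')

    # Targets: the same windows, flattened to (n_windows, time_window_size).
    targets = windows.reshape(windows.shape[0], -1)
    window_model.fit(windows, targets, epochs=20, batch_size=64, validation_split=0.1)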

Next, the feedforward model:

    from keras import regularizers
    from keras.layers import Dense, Input
    from keras.models import Model

    def create_model(self, input_dim):
        # Method of the FeedforwardAutoEncoder class (hence the self argument).
        encoding_dim = 14
        input_layer = Input(shape=(input_dim,))

        # Encoder: the L1 activity regularizer pushes activations toward sparsity,
        # which is what makes this a sparse auto-encoder (the "SAE" in the title).
        encoder = Dense(encoding_dim, activation="tanh",
                        activity_regularizer=regularizers.l1(10e-5))(input_layer)
        encoder = Dense(encoding_dim // 2, activation="relu")(encoder)

        # Decoder: expand back to the original feature dimension.
        decoder = Dense(encoding_dim // 2, activation='tanh')(encoder)
        decoder = Dense(input_dim, activation='relu')(decoder)

        model = Model(inputs=input_layer, outputs=decoder)
        model.compile(optimizer='adam',
                      loss='mean_squared_error',
                      metrics=['accuracy'])
        return model
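
A minimal usage sketch of my own for the tabular case, reusing the records array from the shape sketch above and assuming the training records are (mostly) normal. In the repository create_model is a method, so a placeholder is passed for self here.

    # Train the feedforward auto-encoder to reconstruct its own input.
    tabular_model = create_model(None, input_dim=records.shape[1])   # None stands in for self
    tabular_model.fit(records, records, epochs=50, batch_size=32,
                      shuffle=True, validation_split=0.1)

Note that the relu on the output layer can only produce non-negative reconstructions, so in practice the input features are scaled, e.g. min-max normalised to [0, 1], before training.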

And the CNN (Conv1D) model:

    from keras.layers import Conv1D, Dense, GlobalMaxPool1D
    from keras.models import Sequential

    def create_model(time_window_size, metric):
        model = Sequential()
        # Convolve over the window, then collapse the time axis with global max pooling.
        model.add(Conv1D(filters=256, kernel_size=5, padding='same', activation='relu',
                         input_shape=(time_window_size, 1)))
        model.add(GlobalMaxPool1D())

        # Reconstruct the full window from the pooled feature vector.
        model.add(Dense(units=time_window_size, activation='linear'))

        model.compile(optimizer='adam', loss='mean_squared_error', metrics=[metric])
        print(model.summary())
        return model
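
The Conv1D and LSTM versions are interchangeable: both map a (time_window_size, 1) window back to the window itself. A small scoring sketch of my own, reusing window_model, windows, and targets from the sketches above:

    import numpy as np

    # Reconstruction error per window; large values are candidate anomalies.
    # window_model can be built from either the LSTM or the Conv1D create_model.
    reconstructed = window_model.predict(windows)             # (n_windows, time_window_size)
    errors = np.mean((reconstructed - targets) ** 2, axis=1)  # one MSE per window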

In all three models the target output is simply the input itself; anomalies are the points whose prediction (reconstruction) error is large, i.e., the ones that deviate from the roughly 90% of samples that reconstruct well.
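
A minimal thresholding sketch of my own, following the 90% figure described above (in practice this ratio is an estimate of the fraction of normal samples and would be tuned per dataset):

    import numpy as np

    # Flag windows whose reconstruction error exceeds the 90th percentile
    # of errors observed on (mostly normal) training data.
    threshold = np.percentile(errors, 90.0)
    anomalies = np.where(errors > threshold)[0]
    print('threshold = %.4f, %d anomalous windows flagged' % (threshold, len(anomalies)))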
