Timeseries anomaly detection using an Autoencoder

Setup
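
The code in this post assumes the keras3 and tidyverse packages are attached; bind_on_rows(), used later to materialize a TF Dataset as an array, is assumed to come from the listarrays package. A minimal setup sketch:

library(keras3)
library(tidyverse)   # readr, dplyr, ggplot2
library(listarrays)  # assumed provider of bind_on_rows()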

Load the data

get_data <- function(url_suffix) {
  url_root <- "https://raw.githubusercontent.com/numenta/NAB/master/data/"
  url <- paste0(url_root, url_suffix)
  file <- get_file(origin = url) # cache file locally
  # parse csv; 2 columns with types: datetime (T), double (d)
  readr::read_csv(file, col_types = "Td")
}

df_small_noise   <- get_data("artificialNoAnomaly/art_daily_small_noise.csv")
df_daily_jumpsup <- get_data("artificialWithAnomaly/art_daily_jumpsup.csv")

Visualize the data

plot_ts <- function(df) {
  ggplot(df, aes(x = timestamp, y = value)) + geom_line() +
    scale_x_datetime(date_breaks = "1 day", date_labels = "%b-%d")
}

plot_ts(df_small_noise) + ggtitle("Without Anomaly")
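
For comparison, the series that contains the anomaly can be plotted the same way:

plot_ts(df_daily_jumpsup) + ggtitle("With Anomaly")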

Standardize the training series (using its own mean and standard deviation) and work with windows of TIME_STEPS = 288 observations, i.e. one day of 5-minute readings.

df_train <- df_small_noise |>
  mutate(value = (value - mean(value)) / sd(value))
TIME_STEPS <- 288

as_dataset <- function(df) {
  x <- as.matrix(df$value)
  # overlapping windows of length TIME_STEPS; targets = NULL, so inputs only
  ds <- timeseries_dataset_from_array(x, NULL, sequence_length = TIME_STEPS)
  # materialize the dataset as a single array of shape (samples, TIME_STEPS, 1)
  ds |> as_array_iterator() |> iterate() |> bind_on_rows()
}

x_train <- as_dataset(df_train)
dim(x_train)
[1] 3745  288    1

Each of the 3745 overlapping windows covers 288 time steps of a single feature (4032 - 288 + 1 = 3745).

Build a model

Convolutional Reconstruction Autoencoder Model

  • Convolutional layer: instead of a fully connected layer, a filter (kernel) slides over the input and learns local patterns

  • Reconstruction: the input is rebuilt so that only its important patterns are kept and the noise is removed

  • Autoencoder: the input is compressed (encoded) into a low-dimensional latent space and then reconstructed (decoded), so the model learns to produce output that closely resembles the original

  • layer_conv_1d

    • filters: the number of output channels; each filter learns to detect a particular temporal pattern

    • kernel_size: the length of the filter, i.e. how many time steps it looks at in one pass

    • padding: how the sequence boundaries are handled

      • valid: positions that would extend past the boundary are dropped

      • same: zeros are added at the boundary so the output length is the input length divided by strides

    • strides: the step size by which the filter moves (a quick shape check follows this list)
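
As a quick check on padding and strides, the throwaway single-layer model below (illustrative only; the name tmp is not part of the tutorial) shows that a "same"-padded Conv1D with strides = 2 maps 288 time steps to ceiling(288 / 2) = 144, matching the encoder shapes in the model summary further down.

tmp <- keras_model_sequential(input_shape = c(TIME_STEPS, 1)) |>
  layer_conv_1d(filters = 8, kernel_size = 7, padding = "same", strides = 2)
summary(tmp)  # the Conv1D output shape is (None, 144, 8)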

model <- keras_model_sequential(input_shape = c(TIME_STEPS, 1)) |> # the batch dimension is omitted from input_shape

  # encoder: two strided convolutions compress 288 time steps down to 72
  layer_conv_1d(
    filters = 32,
    kernel_size = 7,
    padding = "same",
    strides = 2,
    activation = "relu"
  ) |>
  layer_dropout(0.2) |>

  layer_conv_1d(
    filters = 16,
    kernel_size = 7,
    padding = "same",
    strides = 2,
    activation = "relu"
  ) |>

  # decoder: transposed convolutions upsample back to the original length
  layer_conv_1d_transpose(
    filters = 16,
    kernel_size = 7,
    padding = "same",
    strides = 2,
    activation = "relu"
  ) |>
  layer_dropout(0.2) |>

  layer_conv_1d_transpose(
    filters = 32,
    kernel_size = 7,
    padding = "same",
    strides = 2,
    activation = "relu"
  ) |>

  # output layer: reconstruct a single value per time step
  layer_conv_1d_transpose(
    filters = 1,
    kernel_size = 7,
    padding = "same"
  )

model |> compile(
  optimizer = optimizer_adam(learning_rate = 0.001),
  loss = "mse"
)

summary(model)
Model: "sequential"
┌───────────────────────────────────┬──────────────────────────┬───────────────
│ Layer (type)                      │ Output Shape             │       Param # 
├───────────────────────────────────┼──────────────────────────┼───────────────
│ conv1d (Conv1D)                   │ (None, 144, 32)          │           256 
├───────────────────────────────────┼──────────────────────────┼───────────────
│ dropout (Dropout)                 │ (None, 144, 32)          │             0 
├───────────────────────────────────┼──────────────────────────┼───────────────
│ conv1d_1 (Conv1D)                 │ (None, 72, 16)           │         3,600 
├───────────────────────────────────┼──────────────────────────┼───────────────
│ conv1d_transpose                  │ (None, 144, 16)          │         1,808 
│ (Conv1DTranspose)                 │                          │               
├───────────────────────────────────┼──────────────────────────┼───────────────
│ dropout_1 (Dropout)               │ (None, 144, 16)          │             0 
├───────────────────────────────────┼──────────────────────────┼───────────────
│ conv1d_transpose_1                │ (None, 288, 32)          │         3,616 
│ (Conv1DTranspose)                 │                          │               
├───────────────────────────────────┼──────────────────────────┼───────────────
│ conv1d_transpose_2                │ (None, 288, 1)           │           225 
│ (Conv1DTranspose)                 │                          │               
└───────────────────────────────────┴──────────────────────────┴───────────────
 Total params: 9,505 (37.13 KB)
 Trainable params: 9,505 (37.13 KB)
 Non-trainable params: 0 (0.00 B)

Train the model

Because this is a reconstruction model, the training data and the target data are identical. The purpose of the model is not to predict the future but to reconstruct past data from past data as faithfully as possible. If the reconstruction error (MSE) after training is low, the model has captured the structure of the data well, and this is what makes it possible to detect anomalies.

history <- model |> fit(
  x_train, x_train,
  epochs = 50,
  validation_split = 0.1,
  callbacks = c(
    callback_early_stopping(
      monitor = "val_loss",
      patience = 5,
      mode = "min"
    )
  )
)
Epoch 1/50
106/106 - 2s - 17ms/step - loss: 0.1890 - val_loss: 0.0300
Epoch 2/50
106/106 - 1s - 5ms/step - loss: 0.0376 - val_loss: 0.0235
Epoch 3/50
106/106 - 0s - 4ms/step - loss: 0.0277 - val_loss: 0.0213
Epoch 4/50
106/106 - 0s - 5ms/step - loss: 0.0227 - val_loss: 0.0244
Epoch 5/50
106/106 - 0s - 5ms/step - loss: 0.0194 - val_loss: 0.0183
Epoch 6/50
106/106 - 0s - 4ms/step - loss: 0.0169 - val_loss: 0.0161
Epoch 7/50
106/106 - 0s - 4ms/step - loss: 0.0151 - val_loss: 0.0171
Epoch 8/50
106/106 - 0s - 4ms/step - loss: 0.0135 - val_loss: 0.0145
Epoch 9/50
106/106 - 0s - 5ms/step - loss: 0.0123 - val_loss: 0.0131
Epoch 10/50
106/106 - 0s - 5ms/step - loss: 0.0113 - val_loss: 0.0117
Epoch 11/50
106/106 - 1s - 5ms/step - loss: 0.0103 - val_loss: 0.0099
Epoch 12/50
106/106 - 0s - 4ms/step - loss: 0.0095 - val_loss: 0.0089
Epoch 13/50
106/106 - 0s - 5ms/step - loss: 0.0088 - val_loss: 0.0082
Epoch 14/50
106/106 - 0s - 5ms/step - loss: 0.0082 - val_loss: 0.0078
Epoch 15/50
106/106 - 0s - 5ms/step - loss: 0.0077 - val_loss: 0.0067
Epoch 16/50
106/106 - 1s - 5ms/step - loss: 0.0073 - val_loss: 0.0066
Epoch 17/50
106/106 - 1s - 6ms/step - loss: 0.0069 - val_loss: 0.0057
Epoch 18/50
106/106 - 1s - 5ms/step - loss: 0.0065 - val_loss: 0.0065
Epoch 19/50
106/106 - 1s - 5ms/step - loss: 0.0062 - val_loss: 0.0072
Epoch 20/50
106/106 - 1s - 6ms/step - loss: 0.0059 - val_loss: 0.0061
Epoch 21/50
106/106 - 1s - 5ms/step - loss: 0.0056 - val_loss: 0.0059
Epoch 22/50
106/106 - 1s - 5ms/step - loss: 0.0053 - val_loss: 0.0072

Detecting anomalies

  1. Compute the maximum MAE loss on the training data.
  2. Use that maximum MAE loss as the anomaly detection threshold.
  3. If a sample's MAE loss exceeds the threshold, flag it as an anomalous pattern.

x_train_pred <- model |> predict(x_train)
118/118 - 0s - 3ms/step
train_mae_loss <- apply(abs(x_train_pred - x_train), 1, mean)  # mean absolute error per sequence

hist(train_mae_loss, breaks = 50)

threshold <- max(train_mae_loss)  # worst reconstruction error seen on the training data

cat("Reconstruction error threshold: ", threshold, "\n")
Reconstruction error threshold:  0.07027409 

Check how well the first training sequence is reconstructed:

plot(NULL, NULL, ylab = "Value",
     xlim = c(0, TIME_STEPS),
     ylim = range(c(x_train[1,,], x_train_pred[1,,])))
lines(x_train[1,,])
lines(x_train_pred[1,,], col = 'red')
legend("topleft", lty = 1, legend = c("actual", "predicted"), col = c("black", "red"))

Prepare test data

# scale the test series with the training mean and sd
df_test <- df_daily_jumpsup |>
  mutate(value = (value - mean(df_small_noise$value)) / sd(df_small_noise$value))

plot_ts(df_test)

x_test <- as_dataset(df_test)

x_test_pred <- model |> predict(x_test)
118/118 - 0s - 2ms/step
test_mae_loss <- apply(abs(x_test_pred - x_test), 1, mean)

hist(test_mae_loss, breaks = 50, xlab = "test MAE loss", ylab = "No of samples")

anomalies <- test_mae_loss > threshold

cat("Number of anomaly samples:", sum(anomalies), "\n")
Number of anomaly samples: 587 
cat("Indices of anomaly samples", which(anomalies), "\n", fill = TRUE)
Indices of anomaly samples 107 187 215 216 218 439 442 443 506 771 773 774 775 
776 777 778 779 781 789 791 792 793 794 795 796 801 805 809 813 913 917 933 971 
972 975 977 981 1005 1595 1619 1623 1626 1627 1656 1660 1941 1942 1943 1944 
1945 1946 1949 1950 1953 1957 1958 1970 1974 1978 1982 1986 1990 1994 1997 1998 
2001 2002 2005 2006 2008 2009 2010 2012 2013 2014 2016 2017 2018 2020 2021 2022 
2024 2025 2026 2029 2030 2033 2034 2036 2037 2038 2040 2041 2042 2044 2045 2046 
2048 2049 2050 2052 2053 2054 2056 2057 2058 2060 2061 2062 2064 2065 2066 2068 
2069 2070 2072 2073 2074 2076 2077 2078 2080 2081 2082 2084 2085 2086 2088 2089 
2090 2092 2093 2094 2096 2097 2098 2100 2101 2102 2104 2105 2106 2108 2109 2110 
2112 2113 2114 2116 2117 2118 2120 2121 2122 2123 2124 2126 2129 2130 2132 2133 
2134 2137 2138 2140 2141 2142 2144 2145 2146 2148 2149 2150 2152 2154 2156 2158 
2160 2166 2519 2520 2522 2523 2699 2700 2702 2703 2704 2705 2706 2707 2708 2709 
2710 2711 2712 2713 2714 2715 2716 2717 2718 2719 2720 2721 2722 2723 2724 2725 
2726 2727 2728 2729 2730 2731 2732 2733 2734 2735 2736 2737 2738 2739 2740 2741 
2742 2743 2744 2745 2746 2747 2748 2749 2750 2751 2752 2753 2754 2755 2756 2757 
2758 2759 2760 2761 2762 2763 2764 2765 2766 2767 2768 2769 2770 2771 2772 2773 
2774 2775 2776 2777 2778 2779 2780 2781 2782 2783 2784 2785 2786 2787 2788 2789 
2790 2791 2792 2793 2794 2795 2796 2797 2798 2799 2800 2801 2802 2803 2804 2805 
2806 2807 2808 2809 2810 2811 2812 2813 2814 2815 2816 2817 2818 2819 2820 2821 
2822 2823 2824 2825 2826 2827 2828 2829 2830 2831 2832 2833 2834 2835 2836 2837 
2838 2839 2840 2841 2842 2843 2844 2845 2846 2847 2848 2849 2850 2851 2852 2853 
2854 2855 2856 2857 2858 2859 2860 2861 2862 2863 2864 2865 2866 2867 2868 2869 
2870 2871 2872 2873 2874 2875 2876 2877 2878 2879 2880 2881 2882 2883 2884 2885 
2886 2887 2888 2889 2890 2891 2892 2893 2894 2895 2896 2897 2898 2899 2900 2901 
2902 2903 2904 2905 2906 2907 2908 2909 2910 2911 2912 2913 2914 2915 2916 2917 
2918 2919 2920 2921 2922 2923 2924 2925 2926 2927 2928 2929 2930 2931 2932 2933 
2934 2935 2936 2937 2938 2939 2940 2941 2942 2943 2944 2945 2946 2947 2948 2949 
2950 2951 2952 2953 2954 2955 2956 2957 2958 2959 2960 2961 2962 2963 2964 2965 
2966 2967 2968 2969 2970 2971 2972 2973 2974 2975 2976 2977 2978 2979 2980 2981 
2982 2983 2984 2985 2986 2987 2988 2989 2990 2991 2992 2993 2994 2995 2996 2997 
2998 2999 3000 3001 3002 3003 3004 3005 3006 3007 3008 3009 3010 3011 3012 3013 
3014 3015 3016 3017 3018 3019 3020 3021 3022 3023 3024 3025 3026 3027 3028 3029 
3030 3031 3032 3033 3034 3035 3036 3037 3038 3039 3040 3041 3042 3043 3044 3045 
3046 3047 3048 3049 3050 3051 3052 3053 3054 3055 3056 3057 3058 3059 3060 3061 
3062 3063 3064 3065 3066 3067 3068 3069 3070 3071 3072 3073 3074 3075 3076 3077 
3078 3079 3080 3081 3082 3083 3084 3085 3086 3087 3088 3089 3090 3091 3092 3093 
3094 3095 3096 3097 3098 3099 3100 3101 3102 3103
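
As a possible follow-up (a hedged sketch, not part of the output above), every timestamp covered by at least one anomalous window can be marked and overlaid on the test series; the names point_anomaly and df_plot are illustrative, and stricter criteria (e.g. requiring all covering windows to be anomalous) are equally valid.

# mark every timestamp covered by at least one anomalous window
point_anomaly <- rep(FALSE, nrow(df_test))
for (i in which(anomalies)) {
  point_anomaly[i:(i + TIME_STEPS - 1)] <- TRUE
}

# overlay the flagged points on the test series
df_plot <- df_test |> mutate(anomaly = point_anomaly)
ggplot(df_plot, aes(timestamp, value)) +
  geom_line() +
  geom_point(data = filter(df_plot, anomaly), colour = "red", size = 0.5) +
  ggtitle("Detected anomalous regions")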