Introduction
Related work
Traditional methods
Deep learning methods
Proposed method
Overview
Two-path spatio-temporal information extraction module (TSIEM)
Multi-level feature enhancement module (MFEM)
Experiments
Datasets and parameter settings
Modules | Acc ↑ | Pre ↑ | Rec ↑ | F1 ↑ | PWC ↓ | FPR ↓ | FNR ↓ | Sp ↑ | AUC ↑
---|---|---|---|---|---|---|---|---|---
Baseline (BL) | 0.9725 | 0.7848 | 0.7515 | 0.7603 | 1.3545 | 0.0082 | 0.2485 | 0.9918 | 0.9476
BL + IMFEM (TSIEM) | 0.9735 | 0.8324 | 0.8099 | 0.8125 | 1.1616 | 0.0077 | 0.1901 | 0.9923 | 0.9703
TSIEM + EDDC | 0.9749 | 0.8562 | 0.8257 | 0.8338 | 0.8721 | 0.0053 | 0.1743 | 0.9949 | 0.9724
TSIEM + MFEM (ours) | 0.9773 | 0.8786 | 0.8547 | 0.8609 | 0.7173 | 0.0042 | 0.1454 | 0.9960 | 0.9739
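All of the ablation metrics derive from per-pixel confusion-matrix counts (TP, FP, TN, FN). A minimal reference sketch of the standard change-detection definitions (the function name is ours; PWC is reported as a percentage, matching the magnitudes in the table):

```python
def change_detection_metrics(tp, fp, tn, fn):
    """Standard change-detection metrics from confusion-matrix counts.

    tp/fp/tn/fn: true/false positive and negative pixel counts.
    """
    total = tp + fp + tn + fn
    acc = (tp + tn) / total                # Accuracy
    pre = tp / (tp + fp)                   # Precision
    rec = tp / (tp + fn)                   # Recall (sensitivity)
    f1 = 2 * pre * rec / (pre + rec)       # F-measure
    pwc = 100.0 * (fp + fn) / total        # Percentage of wrong classifications
    fpr = fp / (fp + tn)                   # False positive rate
    fnr = fn / (tp + fn)                   # False negative rate
    sp = tn / (tn + fp)                    # Specificity
    return dict(acc=acc, pre=pre, rec=rec, f1=f1,
                pwc=pwc, fpr=fpr, fnr=fnr, sp=sp)
```

Note that Sp and FPR are complementary (Sp = 1 - FPR), and FNR = 1 - Rec, which is why the table's FNR and Rec columns mirror each other.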
Ablation study
Comparison with state-of-the-art methods
Methods | IBS | ICA | IIL | IMB | IOC | ISI | OCL | ORA | OSN | OSU | Avg
---|---|---|---|---|---|---|---|---|---|---|---
Zivkovic [62]-2004 | 0.53 | 0.83 | 0.24 | 0.87 | 0.95 | 0.91 | 0.88 | 0.83 | 0.38 | 0.71 | 0.71
Maddalena 1 [55]-2008 | 0.42 | 0.85 | 0.61 | 0.76 | 0.91 | 0.87 | 0.88 | 0.84 | 0.58 | 0.80 | 0.75
Maddalena 2 [56]-2012 | 0.40 | 0.86 | 0.21 | 0.91 | 0.95 | 0.95 | 0.87 | 0.85 | 0.81 | 0.88 | 0.77
Haines [57]-2014 | 0.68 | 0.89 | 0.85 | 0.84 | 0.92 | 0.89 | 0.83 | 0.86 | 0.17 | 0.86 | 0.78
Cuevas [58]-2018 | 0.66 | 0.84 | 0.65 | 0.93 | 0.78 | 0.88 | 0.93 | 0.87 | 0.78 | 0.72 | 0.81
FgSegNet-S-55 [16]-2018 | 0.22 | 0.60 | 0.39 | 0.60 | 0.23 | 0.39 | 0.23 | 0.15 | 0.13 | 0.37 | 0.33
FgSegNet-M-55 [16]-2018 | 0.21 | 0.69 | 0.32 | 0.71 | 0.31 | 0.43 | 0.22 | 0.18 | 0.19 | 0.25 | 0.35
MSFS-51 [59]-2020 | 0.22 | 0.60 | 0.32 | 0.50 | 0.30 | 0.44 | 0.31 | 0.24 | 0.28 | 0.38 | 0.36
MSFS-55 [59]-2020 | 0.36 | 0.40 | 0.35 | 0.64 | 0.37 | 0.39 | 0.41 | 0.35 | 0.31 | 0.37 | 0.40
3CDC-51 [60]-2021 | 0.81 | 0.76 | 0.90 | 0.90 | 0.90 | 0.91 | 0.89 | 0.89 | 0.72 | 0.85 | 0.85
3CDC-55 [60]-2021 | 0.72 | 0.82 | 0.92 | 0.89 | 0.91 | 0.87 | 0.87 | 0.90 | 0.69 | 0.85 | 0.84
BSUV-Net 2.0 [61]-2021 | 0.77 | 0.68 | 0.88 | 0.81 | 0.96 | 0.92 | 0.93 | 0.94 | 0.84 | 0.79 | 0.85
Ours | 0.85 | 0.90 | 0.94 | 0.96 | 0.95 | 0.83 | 0.94 | 0.94 | 0.79 | 0.84 | 0.89
Training video | Training challenge | Testing video | Testing challenge | Pre ↑ | Rec ↑ | F1 ↑
---|---|---|---|---|---|---
I_BS_01 | Bootstrap | I_CA_01 | Camouflage | 0.6035 | 0.7268 | 0.6417
I_IL_01 | Illumination change | I_SI_01 | Simple sequence | 0.9010 | 0.8754 | 0.8805
O_CL_01 | Cloudy | O_SN_01 | Snowy | 0.7926 | 0.7049 | 0.7356
I_CA_02 | Camouflage | I_MB_01 | Modified background | 0.5637 | 0.6113 | 0.5763
O_CL_01 | Cloudy | O_RA_01 | Rainy | 0.8574 | 0.8265 | 0.8375
I_CA_01 | Camouflage | I_OC_02 | Occlusion | 0.7366 | 0.5031 | 0.5842
O_SN_01 | Snowy | O_RA_01 | Rainy | 0.8572 | 0.7438 | 0.7851
I_IL_01 | Illumination change | I_CA_01 | Camouflage | 0.8914 | 0.5985 | 0.7069
I_IL_02 | Illumination change | I_MB_01 | Modified background | 0.8483 | 0.9256 | 0.8746
O_CL_02 | Cloudy | O_SU_02 | Sunny | 0.6109 | 0.5264 | 0.5514
Average | | | | 0.7663 | 0.7042 | 0.7174
Videos | Zivkovic [62]-2004 | PAWCS [63]-2015 | MBS [24]-2017 | DeepBS [64]-2018 | BMN-BSN [35]-2019 | RT-SBS-V1 [65]-2020 | RT-SBS-V2 [65]-2020 | SPAMOD [11]-2021 | BSUV-Net 2.0 [61]-2021 | Ours
---|---|---|---|---|---|---|---|---|---|---
Highway (HW) | 0.9038 | 0.9436 | 0.9217 | 0.9655 | 0.9542 | 0.9493 | 0.9672 | 0.9265 | 0.9648 | 0.9268
Office (OC) | 0.6564 | 0.9375 | 0.9719 | 0.9780 | 0.9666 | 0.9347 | 0.9534 | 0.9445 | 0.9913 | 0.9448
PETS2006 (PS) | 0.8327 | 0.9315 | 0.8648 | 0.9425 | 0.9244 | 0.8769 | 0.9178 | 0.9211 | 0.9727 | 0.9242
Blizzard (BZ) | 0.7585 | 0.7737 | 0.8572 | 0.6115 | 0.8395 | 0.7607 | 0.6958 | 0.9603 | 0.7850 | 0.9609
Skating (ST) | 0.8644 | 0.8984 | 0.9223 | 0.9669 | 0.9722 | 0.8928 | 0.9279 | 0.9689 | 0.9143 | 0.9570
Snowfall (SF) | 0.7631 | 0.8393 | 0.8782 | 0.8648 | 0.9041 | 0.8738 | 0.8974 | 0.9481 | 0.9501 | 0.9452
Traffic (TF) | 0.6137 | 0.8278 | 0.6781 | 0.8776 | 0.4488 | 0.7761 | 0.8543 | 0.9165 | 0.8655 | 0.9133
Badminton (BM) | 0.6669 | 0.8920 | 0.9021 | 0.9527 | 0.8176 | 0.8695 | 0.9001 | 0.8870 | 0.9334 | 0.8729
Canoe (CO) | 0.8851 | 0.9379 | 0.9345 | 0.9794 | 0.8206 | 0.9420 | 0.9422 | 0.9171 | 0.9717 | 0.9095
Sofa | 0.6524 | 0.7247 | 0.8455 | 0.8134 | 0.9122 | 0.7706 | 0.9104 | 0.9266 | 0.9112 | 0.9270
Turnpike_0_5fps (TP) | 0.7729 | 0.9146 | 0.8901 | 0.4917 | 0.7203 | 0.8952 | 0.9176 | 0.8108 | 0.9691 | 0.8140
Cubicle (CI) | 0.6480 | 0.8713 | 0.5613 | 0.9427 | 0.6264 | 0.9702 | 0.9702 | 0.9016 | 0.9715 | 0.9034
Copymachine (CM) | 0.6597 | 0.9143 | 0.8711 | 0.9534 | 0.9620 | 0.9541 | 0.9541 | 0.9513 | 0.9657 | 0.9506
Park | 0.6989 | 0.8286 | 0.7099 | 0.8741 | 0.7210 | 0.6454 | 0.7903 | 0.7712 | 0.9136 | 0.8274
Lakeside (LS) | 0.5221 | 0.6147 | 0.6541 | 0.6535 | 0.5082 | 0.5699 | 0.7725 | 0.8419 | 0.7849 | 0.8434
Diningroom (DR) | 0.7925 | 0.8997 | 0.8531 | 0.8970 | 0.8659 | 0.7838 | 0.9482 | 0.9548 | 0.9226 | 0.9576
Library (LA) | 0.4247 | 0.9390 | 0.9594 | 0.4773 | 0.9320 | 0.8920 | 0.9564 | 0.9706 | 0.9215 | 0.9554
Turbulence0 (T0) | 0.0673 | 0.1326 | 0.1435 | 0.7971 | 0.0200 | 0.6319 | 0.7237 | 0.8390 | 0.9429 | 0.8428
Turbulence1 (T1) | 0.3118 | 0.8083 | 0.5413 | 0.7698 | 0.5588 | 0.1411 | 0.5573 | 0.8431 | 0.6416 | 0.8485
Turbulence3 (T3) | 0.7405 | 0.7356 | 0.8329 | 0.9340 | 0.8714 | 0.7865 | 0.7865 | 0.7791 | 0.8053 | 0.8024
Avg | 0.6618 | 0.8183 | 0.7897 | 0.8371 | 0.7673 | 0.7958 | 0.8672 | 0.8990 | 0.9051 | 0.9014
Methods | Acc ↑ | Rec ↑ | Sp ↑ | AUC ↑
---|---|---|---|---
STSM [66]-2015 | 0.75 | 0.70 | 0.28 | 0.70
Akula-CNN [32]-2016 | 0.79 | 0.73 | 0.26 | 0.73
DL [67]-2017 | 0.80 | 0.75 | 0.20 | 0.74
MRF [68]-2018 | 0.81 | 0.79 | 0.19 | 0.78
Qiu [54]-2019 | 0.83 | 0.80 | 0.16 | 0.81
SPAMOD [11]-2021 | 0.98 | 0.62 | 0.90 | 0.90
Ours | 0.98 | 0.77 | 0.99 | 0.96
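AUC in the table above is the area under the ROC curve over per-pixel foreground scores. A self-contained sketch using the rank-sum (Mann-Whitney) formulation, independent of any particular evaluation toolkit (the function name is illustrative):

```python
def roc_auc(scores, labels):
    """ROC AUC via the rank-sum (Mann-Whitney U) formulation.

    scores: per-pixel foreground scores; labels: 0/1 ground truth.
    Tied scores are handled by averaging their ranks.
    """
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    ranks = [0.0] * len(scores)
    i = 0
    while i < len(order):
        # Find the run of tied scores starting at position i.
        j = i
        while j + 1 < len(order) and scores[order[j + 1]] == scores[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average 1-based rank for the tie group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    pos = sum(labels)
    neg = len(labels) - pos
    rank_sum = sum(r for r, y in zip(ranks, labels) if y == 1)
    return (rank_sum - pos * (pos + 1) / 2) / (pos * neg)
```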
Methods | F1 ↑ | mIoU ↑
---|---|---
CNN-feat [69]-2015 | 0.287 | 0.535
WS-Net [70]-2017 | 0.145 | 0.389
DeconvNet [71]-2018 | 0.341 | 0.625
Mask-CDNet [72]-2020 | 0.370 | 0.649
SASCNet [45]-2020 | 0.867 | 0.780
Ours | 0.889 | 0.801
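F1 and mIoU here are computed on binary segmentation masks. A minimal sketch, assuming mIoU averages the foreground and background IoU (a common convention; the exact averaging used in the compared works may differ):

```python
import numpy as np

def mask_f1_miou(pred, gt):
    """F-measure and mean IoU for binary foreground masks.

    pred, gt: boolean NumPy arrays of the same shape.
    mIoU averages foreground and background IoU (assumption).
    """
    tp = np.logical_and(pred, gt).sum()
    fp = np.logical_and(pred, ~gt).sum()
    fn = np.logical_and(~pred, gt).sum()
    tn = np.logical_and(~pred, ~gt).sum()
    f1 = 2 * tp / (2 * tp + fp + fn)
    iou_fg = tp / (tp + fp + fn)          # IoU of the foreground class
    iou_bg = tn / (tn + fp + fn)          # IoU of the background class
    return f1, (iou_fg + iou_bg) / 2
```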
Method | FPS | #Param
---|---|---
FgSegNet [16]-2018 | 18 | 2.60 M
3CDC [60]-2021 | 25 | 0.13 M
BSUV-Net 2.0 [61]-2021 | 6 | NA
MFCN [17]-2018 | 27 | 20.83 M
RT-SBS [65]-2020 | 25 | NA
SASCNet [45]-2020 | 10 | NA
SPAMOD [11]-2021 | 13 | 35.1 M
Ours | 24 | 5.27 M
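FPS figures such as these depend heavily on hardware, so they are best read as relative. A simple framework-agnostic way to estimate throughput is to time repeated forward passes with a warm-up phase excluded (the `infer` callable and `frame` input below are placeholders, not any method's actual code):

```python
import time

def measure_fps(infer, frame, warmup=10, iters=100):
    """Estimate frames per second of an inference callable.

    `infer` and `frame` are placeholders for a model forward pass and
    its input. Warm-up iterations are excluded so one-time setup
    (e.g. GPU kernel compilation) does not skew the average.
    """
    for _ in range(warmup):
        infer(frame)
    start = time.perf_counter()
    for _ in range(iters):
        infer(frame)
    elapsed = time.perf_counter() - start
    return iters / elapsed
```

For PyTorch models, the `#Param` column corresponds to `sum(p.numel() for p in model.parameters())`.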