
Search results

  1. Isolation forest - Wikipedia

    en.wikipedia.org/wiki/Isolation_forest

    Isolation Forest is an algorithm for data anomaly detection using binary trees. It was developed by Fei Tony Liu in 2008. [1] It has linear time complexity and low memory use, which makes it well suited to high-volume data.
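
    A minimal sketch of this idea in Python, using scikit-learn's IsolationForest (the toy data, contamination rate, and tree count below are illustrative assumptions, not taken from the article):

    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(42)
    normal = rng.normal(0.0, 1.0, size=(200, 2))      # dense inlier cluster
    outliers = rng.uniform(-6.0, 6.0, size=(10, 2))   # scattered anomalies
    X = np.vstack([normal, outliers])

    # Isolation trees split on random features and thresholds; anomalies are
    # isolated in few splits, which is what keeps the method fast.
    forest = IsolationForest(n_estimators=100, contamination=0.1, random_state=0)
    labels = forest.fit_predict(X)   # -1 = anomaly, 1 = normal
    print("anomalies flagged:", int((labels == -1).sum()))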

  2. Anomaly detection - Wikipedia

    en.wikipedia.org/wiki/Anomaly_detection

    Anomaly detection is crucial in the petroleum industry for monitoring critical machinery. [20] Martí et al. used a novel segmentation algorithm to analyze sensor data for real-time anomaly detection. [20] This approach helps promptly identify and address any irregularities in sensor readings, ensuring the reliability and safety of petroleum operations.

  3. ELKI - Wikipedia

    en.wikipedia.org/wiki/ELKI

    Version 0.2 (July 2009) added functionality for time series analysis, in particular distance functions for time series. [13] Version 0.3 (March 2010) extended the choice of anomaly detection algorithms and visualization modules.

  4. Change detection - Wikipedia

    en.wikipedia.org/wiki/Change_detection

    In statistical analysis, change detection or change point detection tries to identify times when the probability distribution of a stochastic process or time series changes. In general, the problem concerns both detecting whether or not a change (or several changes) has occurred, and identifying the times of any such changes.
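
    A minimal sketch of the offline version of this problem in Python: scan every candidate split point and keep the one that best divides the series into two constant-mean segments. This is an illustrative least-squares criterion, not one of the specific algorithms from the article:

    import numpy as np

    def best_change_point(x):
        best_t, best_cost = None, float("inf")
        for t in range(1, len(x)):
            left, right = x[:t], x[t:]
            # Cost: residual sum of squares around each segment's mean.
            cost = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
            if cost < best_cost:
                best_t, best_cost = t, cost
        return best_t

    rng = np.random.default_rng(0)
    series = np.concatenate([rng.normal(0, 1, 100), rng.normal(3, 1, 100)])
    print("estimated change point:", best_change_point(series))  # near index 100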

  5. Local outlier factor - Wikipedia

    en.wikipedia.org/wiki/Local_outlier_factor

    In anomaly detection, the local outlier factor (LOF) is an algorithm proposed by Markus M. Breunig, Hans-Peter Kriegel, Raymond T. Ng and Jörg Sander in 2000 for finding anomalous data points by measuring the local deviation of a given data point with respect to its neighbours.
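
    A minimal sketch in Python using scikit-learn's LocalOutlierFactor (the neighbourhood size and toy data are illustrative assumptions):

    import numpy as np
    from sklearn.neighbors import LocalOutlierFactor

    rng = np.random.default_rng(1)
    X = np.vstack([
        rng.normal(0.0, 0.5, size=(100, 2)),   # dense cluster
        [[4.0, 4.0], [-4.0, 3.5]],             # two isolated points
    ])

    lof = LocalOutlierFactor(n_neighbors=20)
    labels = lof.fit_predict(X)              # -1 = outlier, 1 = inlier
    scores = -lof.negative_outlier_factor_   # higher = more anomalous locally
    print("outlier indices:", np.where(labels == -1)[0])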

  6. DBSCAN - Wikipedia

    en.wikipedia.org/wiki/DBSCAN

    DBSCAN is one of the most commonly used and cited clustering algorithms. [2] In 2014, the algorithm was awarded the Test of Time Award (an award given to algorithms which have received substantial attention in theory and practice) at the leading data mining conference, ACM SIGKDD. [3]

  7. CUSUM - Wikipedia

    en.wikipedia.org/wiki/CUSUM

    The low CUSUM value, detecting a negative anomaly: S_lo(n+1) = min(0, S_lo(n) + x_n + ω), where ω is a critical level parameter (tunable, like the threshold T) used to adjust the sensitivity of change detection: a larger ω makes CUSUM less sensitive to the change, and vice versa.
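
    A minimal sketch of a two-sided CUSUM detector in Python, following the update rule above; the drift ω and threshold T values are illustrative:

    import numpy as np

    def cusum(x, omega=0.5, T=5.0):
        s_hi, s_lo, alarms = 0.0, 0.0, []
        for n, xn in enumerate(x):
            s_hi = max(0.0, s_hi + xn - omega)   # high side: positive anomaly
            s_lo = min(0.0, s_lo + xn + omega)   # low side: negative anomaly
            if s_hi > T or s_lo < -T:
                alarms.append(n)
                s_hi, s_lo = 0.0, 0.0            # restart after an alarm
        return alarms

    rng = np.random.default_rng(2)
    data = np.concatenate([rng.normal(0, 1, 100), rng.normal(-2, 1, 50)])
    print("alarms at:", cusum(data))  # should fire shortly after index 100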

  8. Random sample consensus - Wikipedia

    en.wikipedia.org/wiki/Random_sample_consensus

    RANSAC can also be interpreted as an outlier detection method. [1] It is a non-deterministic algorithm in the sense that it produces a reasonable result only with a certain probability, and this probability increases as more iterations are allowed. The algorithm was first published by Fischler and Bolles at SRI International in 1981.
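
    A minimal sketch of RANSAC for 2-D line fitting in Python (the iteration count and inlier tolerance are illustrative parameters): repeatedly fit a line to a random pair of points and keep the model with the largest consensus set.

    import numpy as np

    def ransac_line(points, n_iters=200, tol=0.5, seed=0):
        rng = np.random.default_rng(seed)
        best_model, best_inliers = None, 0
        for _ in range(n_iters):
            i, j = rng.choice(len(points), size=2, replace=False)
            (x1, y1), (x2, y2) = points[i], points[j]
            if x1 == x2:
                continue                       # skip a degenerate vertical pair
            slope = (y2 - y1) / (x2 - x1)
            intercept = y1 - slope * x1
            residuals = np.abs(points[:, 1] - (slope * points[:, 0] + intercept))
            inliers = int((residuals < tol).sum())
            if inliers > best_inliers:         # consensus improves with more iterations
                best_model, best_inliers = (slope, intercept), inliers
        return best_model, best_inliers

    rng = np.random.default_rng(3)
    xs = rng.uniform(0, 10, 100)
    inline = np.column_stack([xs, 2 * xs + 1 + rng.normal(0, 0.2, 100)])
    noise = rng.uniform(0, 20, size=(20, 2))
    model, count = ransac_line(np.vstack([inline, noise]))
    print("slope, intercept:", model, "| inliers:", count)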