BPM-Filtering

The content of this page was converted from the presentation "OMC3 bad BPM detection", given on 31.03.2025.

OMC-Analysis

  • To get reliable and reproducible optics measurements, we cannot trust all of the BPM data that we receive.
  • At different stages of the code, we try to determine the BPMs' trustworthiness.
  • Some BPMs are fully excluded; others get error bars based on their noise level (these are often used as weights, e.g. for correction calculations).

Automatic

  • First, the turn-by-turn data is checked for "obvious" signs of malfunction.
    • EXACT_ZERO: The data contains an exact zero (might be a false positive, but this is unlikely if it happens in multiple datasets).
    • NANS: The data contains NaN values (happens in SPS BPMs).
    • FLAT: The peak-to-peak value is below a given threshold (default: 10 nm).
    • SPIKY: The data contains a spike above a given threshold (default: 2 cm).
  • To reduce noise, we perform an SVD decomposition and keep only the strongest modes.
    • SVD_PEAK: The BPM has a mode in the U-matrix above a given threshold (default: 0.925).
  • Most of the information about the optics comes from a spectral analysis.
    • NO_TUNE: The tune line could not be found in the spectrum.
    • TUNE_CLEAN: The tune line found was too far from the average of the other BPMs (default: > \(10^{-5}\)).
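The turn-by-turn checks and the SVD-peak criterion above can be sketched in a few lines of NumPy. This is an illustrative sketch, not the omc3 implementation; the threshold constants simply mirror the defaults quoted in the list.

```python
import numpy as np

# Thresholds mirroring the defaults quoted above (illustrative values).
FLAT_PTP = 10e-9    # 10 nm peak-to-peak
SPIKE_MAX = 2e-2    # 2 cm
SVD_PEAK = 0.925    # maximum |U| entry per BPM

def flag_bpms(tbt: np.ndarray) -> dict[str, np.ndarray]:
    """Flag suspicious rows of a (bpms x turns) turn-by-turn matrix.

    Returns a dict of boolean masks (True = BPM flagged).
    Sketch only, not the omc3 implementation.
    """
    flags = {
        "EXACT_ZERO": np.any(tbt == 0.0, axis=1),
        "NANS": np.any(np.isnan(tbt), axis=1),
        "FLAT": np.ptp(tbt, axis=1) < FLAT_PTP,
        "SPIKY": np.nanmax(np.abs(tbt), axis=1) > SPIKE_MAX,
    }
    # SVD-based check: a single BPM dominating one mode of U is suspicious,
    # since physical modes should be spread over many BPMs.
    centered = np.nan_to_num(tbt - np.nanmean(tbt, axis=1, keepdims=True))
    u, _, _ = np.linalg.svd(centered, full_matrices=False)
    flags["SVD_PEAK"] = np.any(np.abs(u) > SVD_PEAK, axis=1)
    return flags
```

In practice the masks would be combined and the flagged BPM names written to the per-measurement bad-BPM files described below.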

Manual

  • KNOWN: Identified manually; these BPMs regularly stick out in spectra or optics.
    • Large error bars.
    • Nonsensical data points.
    • Cause analysis issues (e.g. phase offsets, negative \(\beta\), NaNs).
    • Calibration issues:
      • \(\beta\)-from-phase looks normal, but there is a peak in \(\beta\)-from-amplitude.
      • \(\beta\)-ratios are large.
      • Measure in different optics to confirm the BPM issue.
  • Good hint: filtering them solves the observed issues.
Observed beta-beating with some large error bars (e.g. IP1 and IP5) and unrealistic beating (around 2200 m in Y).
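The calibration check above (comparing \(\beta\)-from-amplitude against \(\beta\)-from-phase) amounts to a simple ratio test. A minimal sketch, assuming the two \(\beta\)-arrays come from the optics output and using a hypothetical example tolerance:

```python
import numpy as np

def suspicious_calibration(beta_amp, beta_phase, tolerance=0.1):
    """Flag BPMs whose beta-from-amplitude deviates strongly from
    beta-from-phase, i.e. whose beta-ratio is far from 1 -- a typical
    signature of a calibration issue.

    Sketch only; the 10% tolerance is an assumed example value.
    """
    ratio = np.asarray(beta_amp, dtype=float) / np.asarray(beta_phase, dtype=float)
    return np.abs(ratio - 1.0) > tolerance
```

If a BPM is flagged this way in several different optics, a real calibration problem (rather than a measurement fluke) becomes the likely explanation.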

Isolation Forest

  • Using machine-learning techniques to identify bad BPMs.
    • IFOREST: BPM identified as an outlier in tune, noise and/or amplitude.

Under Re-evaluation

This functionality is currently under re-evaluation and has only been used on a small amount of data in 2021-2024!

Isolation Forest Example

E. Fol - Machine Learning for BPM failure detection
E. Fol - Isolation Forest for bad BPMs: performance evaluation

Bad BPMs Summary Script

  • Bad BPMs are written out to files, one per analysed TbT dataset.
  • A script is available to gather and summarize them and to compute statistics.
usage: bad_bpms_summary.py [-h] --dates DATES [DATES ...] [--root ROOT] [--outfile OUTFILE]
                           [--print_percentage PRINT_PERCENTAGE] [--accel_glob ACCEL_GLOB]

options:
  --dates DATES [DATES ...]
                        Dates to include in analysis. This should be either subfolders in
                        `root` or glob-patterns for those.
  --root ROOT           Path to the root directory, containing the dates.
                        Default: `/user/slops/data/LHC_DATA/OP_DATA/BetaBeat/`
  --outfile OUTFILE     Path to the file to write out.
  --print_percentage PRINT_PERCENTAGE
                        Print out BPMs that appear in more than this percentage of measurements.
  --accel_glob ACCEL_GLOB
                        Accelerator name (glob for the sub-directories).

Example: all bad BPMs from 2025, written to a file, with those appearing in more than 50% of measurements printed to the terminal.

python -m omc3.scripts.bad_bpms_summary --dates 2025-* \
                                        --accel_glob LHCB* \
                                        --outfile bad_bpms_2025.txt \
                                        --print_percentage 50

A Bad BPM