offline_filterscripts

  • Maintainer: Erik Blaufuss (blaufuss @ umd.edu) et al.

Overview

This project contains the new offline filters for the 2023 season and beyond. Highlights include:

* New WG filter selections in the North, replacing the old L1/L2
  filters with selections aimed at getting closer to "final level" in a
  simpler and more direct manner.

For simulation, this software is set up to mimic normal data processing, with a few changes that remove compression and other space-saving steps.

Important IceTray segments

  • read_superdst_files - intended to read files from pole as the first step of northern processing

    • location: python/read_superdst_files.py

    • This segment prepares the compact data files from pole for reconstruction and filtering (a usage sketch follows this list):

      • Read input GCD file and input file(s)

      • Drop frames that lack the SuperDST/DSTTriggers/I3EventHeader objects needed for further processing

      • Mask out readouts from DOMs on the per-run Bad DOM lists (contained in the GCD)

      • Generate InIce and IceTop Masked versions of RecoPulses

      • Split for InIce processing (InIceSplit) with TriggerSplitter

      • Split for IceTop processing (IceTopSplit) with I3TopHLCClusterCleaning

      • Null split for processing of entire DAQ events
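
A minimal sketch of how a driver script might apply this segment is shown below. The import path, segment name, and parameter names are assumptions based on the file location above; check python/read_superdst_files.py for the actual interface.

    from icecube import icetray, dataio  # noqa: F401  (dataio provides I3Writer)
    from icecube.icetray import I3Tray

    # Assumed import path and segment name, based on python/read_superdst_files.py
    from icecube.offline_filterscripts.read_superdst_files import read_superdst_files

    tray = I3Tray()

    # The segment reads the GCD and SuperDST input file(s), drops frames missing
    # SuperDST/DSTTriggers/I3EventHeader, masks out bad-DOM readouts, and creates
    # the InIceSplit, IceTopSplit, and null sub-event streams described above.
    # The parameter names and file paths below are placeholders.
    tray.AddSegment(
        read_superdst_files, "read_superdst",
        gcd_filename="GCD_example.i3.zst",
        input_filenames=["PFDST_example.i3.zst"],
    )

    # Write everything back out so the rehydrated frames can be inspected.
    tray.Add("I3Writer", "writer",
             Filename="rehydrated_example.i3.zst",
             Streams=[icetray.I3Frame.DAQ, icetray.I3Frame.Physics])

    tray.Execute()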

More Details

offline_filterscripts is very much a work in progress. Please don’t hesitate to contact me (#new_processing_development or @blaufuss on Slack) if you have more questions.

Writing a filter

You should aim to write a self-contained IceTray segment that performs the needed reconstructions and event selections. These go in python/filter_segments. There is an example, ‘example_filter.py’, that implements “ExampleFilter_22”.

For now, the output of your filter should be an I3Bool in the frame carrying your filter’s decision. Please try to stick to Python for the filter implementation; contact the experts if you find this is not workable.
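
As a rough illustration only (not the project’s actual example), a filter segment along these lines might look like the sketch below; the segment name, the output key, and the trivial hit-count cut are hypothetical, so follow python/filter_segments/example_filter.py for the real pattern.

    from icecube import icetray, dataclasses


    @icetray.traysegment
    def MyFilterSegment(tray, name, pulses="SplitInIcePulses"):
        """Run the selections for a hypothetical filter and flag passing events."""

        def my_filter(frame, pulses=pulses):
            # Decide whether the event passes; a trivial hit-count cut on the
            # split pulse map stands in here for real reconstructions/selections.
            passed = False
            if pulses in frame:
                pulse_map = dataclasses.I3RecoPulseSeriesMap.from_frame(frame, pulses)
                passed = len(pulse_map) >= 8
            # The filter decision goes into the frame as an I3Bool.
            frame["MyFilter_23_Bool"] = icetray.I3Bool(passed)

        tray.Add(my_filter, name + "_decision",
                 Streams=[icetray.I3Frame.Physics])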

Useful scripts

There are a few useful “driver” scripts, located in resources/scripts, where you can add filters for testing and process data.

  • UnpackDST.py - Intended to be the first processing step applied to data from pole. Currently it just “rehydrates” the data using the ReadSuperDSTFiles tray segment; it does not yet apply the ML selections. You can include a test filter here if the ML information is not needed.
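
    For example, a test filter segment could be dropped in after the rehydration step, roughly as follows; the import path, segment, and pulse-map name are the hypothetical ones from the sketch above, not real project code.

      # Hypothetical: run a test filter after ReadSuperDSTFiles in a driver
      # such as UnpackDST.py.  Import path, segment, and pulse-map name are
      # illustrative placeholders only.
      from icecube.offline_filterscripts.filter_segments.my_filter import MyFilterSegment

      tray.AddSegment(MyFilterSegment, "my_test_filter",
                      pulses="SplitInIcePulses")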

Guidelines

When writing filters and Python code here, please follow these guidelines:

  • Make sure your filter IceTray segment cleans up after itself. Any stray objects created that are not intended for long-term preservation should be deleted from the frame at the end of the segment (see the example… and the cleanup sketch after this list).

  • Please match the formatting style of the other files. Here, flake8 (https://flake8.pycqa.org/en/latest/) is used; please check your code with this tool.

  • Please make a branch of icetray/main, develop your filter, and make a pull request (PR) when it is ready for inclusion (it should be assigned to Erik automatically). It will be reviewed before merging. For details on branching, PRs, etc., see the Git Guide (https://github.com/icecube/icecube.github.io/wiki/GitGuide%3AGitHub-in-IceCube).
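
For the first guideline above, one common way to clean up (a sketch with hypothetical key names; the project’s example_filter.py shows the actual pattern) is to end the segment with icetray’s Delete module:

    # At the end of a filter segment: drop intermediate frame objects that are
    # not meant for long-term preservation.  Key names are placeholders.
    tray.Add("Delete", name + "_cleanup",
             Keys=["MyFilter_23_TempPulses", "MyFilter_23_ScratchSeed"])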

Resources