.. _finallevel_filter_diffusenumu-main:

******************************
Finallevel Filter Diffuse-NuMu
******************************

Description
==============================

This project contains the processing scripts for the "Diffuse NuMu / Northern
Tracks" event selection, i.e. the additional stages of processing used to get
from the ~1 Hz L3 rate down to a neutrino-dominated sample. It is divided into
two filter levels (L4 and L5), mainly to provide some structure; it should not
be necessary to run these individually. Additionally, some "post-L5" variables
can be calculated.

This project is based on
https://code.icecube.wisc.edu/projects/icecube/browser/IceCube/projects/finallevel_filter_diffusenumu/trunk
but has been updated to run within the py3-v4.1.1 environment from cvmfs.

Level 4
------------------------------

The Level 4 scripts are located in ``python/level4``. They apply some fixed
cuts and re-run some reconstructions without the DeepCore strings (the new
keys carry an "IC" tag) as a preparation step for the BDT-based cuts.

Level 5
------------------------------

Scripts are located in ``python/level5``. The final zenith cut is applied
using the re-run SplineMPE reconstruction. From the other reconstructions,
the input variables for the BDTs are calculated. Two BDT scores are computed,
a cascade score and a track score, and cuts maximizing purity are applied to
both. Finally, the Millipede and Paraboloid reconstructions are run for the
events that pass.

Post-L5
------------------------------

The scripts in ``python/post_level5`` add some optional additional variables
but contain no further cuts.

Usage & Requirements
==============================

To evaluate the BDTs (part of the Level 5 scripts), scikit-learn 0.24.1 must
be installed. The BDTs were originally trained with an older version of
scikit-learn and have been manually ported to a Python-3-compatible version.
The correct behaviour of the migrated BDTs has been verified.
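As an illustration of the two-score selection pattern, the following sketch
evaluates two scikit-learn BDTs and cuts on both scores. The toy features,
thresholds, and model settings here are purely hypothetical; the actual BDTs,
their input variables, and the purity-optimized cut values are defined in
``python/level5`` and are not reproduced here.

.. code-block:: python

    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier

    rng = np.random.default_rng(42)

    # Toy stand-ins for the cascade and track BDTs (hypothetical features).
    X = rng.normal(size=(200, 4))
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

    cascade_bdt = GradientBoostingClassifier(n_estimators=20, random_state=0).fit(X, y)
    track_bdt = GradientBoostingClassifier(n_estimators=20, random_state=1).fit(X, y)

    # The signal-class probability serves as the BDT score.
    cascade_score = cascade_bdt.predict_proba(X)[:, 1]
    track_score = track_bdt.predict_proba(X)[:, 1]

    # Hypothetical thresholds; the real cuts are tuned for purity in level5.
    passed = (cascade_score > 0.5) & (track_score > 0.5)
    print(f"{passed.sum()} of {len(X)} toy events pass both score cuts")

Because the serialized models are version-sensitive, evaluating them with a
scikit-learn release other than the pinned 0.24.1 may fail or silently change
the scores.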
To test your setup for the processing, run
``resources/test/test_finallevel_diffusenumu.py``. To run the processing, run
``python/process_L45.py``. If starting from Level2 data, additionally run the
MuonLevel3 processing via ``python/process_L345.py``. To build .hdf files from
the .i3 files, use ``python/write_hdf.py``.

Collection of Analysis links
==============================

* Analysis wiki page (9.5 years):
  https://user-web.icecube.wisc.edu/~jstettner/DiffuseExtensions_wiki/html/index.html
* 9.5-year analysis, PhD thesis by Jöran Stettner:
  https://www.institut3b.physik.rwth-aachen.de/global/show_document.asp?id=aaaaaaaaawyqakk
* 6-year analysis paper: :arxiv:`1607.08006`
* Wiki explaining the event selection:
  https://user-web.icecube.wisc.edu/~lraedel/html/multi_year_diffuse/event_selections.html
* Detailed information on the BDT training and cuts (chapters 5 and 6), PhD
  thesis by Leif Rädel: https://publications.rwth-aachen.de/record/709576
* PhD thesis by Sebastian Schoenen:
  https://publications.rwth-aachen.de/record/696221