L1TriggerAnalysis

Level 1 Trigger Analysis for gFEX

This uses the atlas_jets package, which I have also been writing, as a wrapper around the incoming trigger data loaded from the files. The Python packages are bundled up and can be copied over via XRootD; see the instructions under Obtaining the python packages below.

Quick Start

ssh -Y kratsg@uct3.uchicago.edu
setupATLAS
cd Work/gFEX/
git clone https://github.com/kratsg/L1TriggerAnalysis L1TriggerAnalysis
cd L1TriggerAnalysis/
localSetupFAX --rootVersion=current-SL6 && voms-proxy-init -voms atlas
xrdcp root://faxbox.usatlas.org//user/kratsg/L1TriggerAnalysis/local.tar.gz local.tar.gz
tar -xzf local.tar.gz 
./scaffold.sh testAnalysis

Flocking via Condor

cd testAnalysis/
ruby make_config.rb
condor_submit config

To run locally

. ./setupROOTandPython
cd testAnalysis
python main.py --processNum=0 --file=input.root --start=120 --numEvents=80 --seedEt=15 --towerThresh=6 --noiseFilter=0 --digitization=256

Scaffolding

One can quickly scaffold a new physics analysis by running

./scaffold.sh newAnalysis

which will set up a new directory newAnalysis/ with the correct symlinked files and example configuration files copied over for you to edit.

File Structure

At the top level, there is one folder for each physics analysis to be done. The same code applies to all of them, apart from the configuration files datasets.yml and plot_configs.py. The former defines the list of files for your jobs to run over, and the latter defines extra options for make_plots.py. As of right now, the only plotting option is dataSetStr, which is placed on every plot made.
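To make the layout concrete, a checkout set up with the commands in this README looks roughly like the sketch below. Which files scaffold.sh symlinks versus copies is not spelled out here, so treat this as illustrative rather than exact:

L1TriggerAnalysis/
    scaffold.sh               # creates a new analysis directory
    setupROOTandPython        # environment setup for running locally
    local.tar.gz              # bundled Python packages (extracts into .local/)
    testAnalysis/             # one directory per physics analysis
        datasets.yml          # input files for the Condor jobs
        plot_configs.py       # extra options for make_plots.py (dataSetStr)
        make_config.rb        # generates the Condor config file
        config                # generated Condor submit file
        main.sh               # job wrapper (symlinked)
        main.py               # the analysis driver
        merge.py              # merges job output for plotting
        make_plots.py         # makes the plots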

To generate a Condor config file, you must run

ruby make_config.rb

which will build a config file for you. Run ruby make_config.rb -h for the available options. Once you have made your config file, you can submit it to Condor. Every time you change datasets.yml, you need to regenerate the config file. After the jobs are done, you can merge as many output files as you'd like for plotting

python merge.py --*kwargs

where you can get a list of keyword arguments from python merge.py -h. Once this is done, you can just make some plots via

python make_plots.py --*kwargs

and get the list of keyword arguments from python make_plots.py -h. Notice that a base set of keyword arguments is shared between the two scripts, to help with consistency and copy-paste if needed.

Submitting a job

All you need to do is edit main.sh from any physics analysis directory (this file is symlinked across analyses), then resubmit the Condor job. The job submission process is built around the idea that you generate a job with a specific configuration once, then simply cd into the other directories and resubmit via condor_submit config without any extra work. I've set this up to do exactly that. Each directory is kept separate because the datasets differ and the plots need to be distinct.
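In practice that workflow is just a couple of commands; the directory name otherAnalysis below is only a placeholder for whatever analyses you have scaffolded:

vim main.sh                              # edit the job once; the change is visible everywhere it is symlinked
cd ../otherAnalysis/ && condor_submit config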

Obtaining the python packages

If you run ruby make_config.rb and it reports that local.tar.gz is missing (this 62 MB file is not included in this repository), you can grab a copy from faxbox::/user/kratsg/L1TriggerAnalysis/local.tar.gz or via XRootD

xrdcp root://faxbox.usatlas.org//user/kratsg/L1TriggerAnalysis/local.tar.gz local.tar.gz

Setting up Dataset configurations for Condor config

In particular, I provide a datasets.yml file which make_config.rb expects by default (you can pass in a different file using command-line flags). This is a YAML (YAML Ain't Markup Language) file. Parameters can be set globally or inline on a specific file, and inline parameters override the globally set ones. For example, if all the files are located within the same directory, set a prefix

prefix: root://faxbox.usatlas.org//user/kratsg/LArStudies/

which gets prepended to every file listed below it. Similarly, if most or all of your files have the same number of events, feel free to set

numEvents: 10000

globally. Then, for specific files, you can set numEvents to a different value if that particular file has a different number of events.

You also specify how many jobs are run per file, which defines how the file is automatically partitioned. For example, if we set numEvents: 10000 and then

numJobs: 10

Each job will run over 1000 events (10000 / 10) in the file. Like numEvents, numJobs can also be set to a different value on a file-by-file basis.
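Putting those pieces together, a datasets.yml might look like the sketch below. Note that the layout of the per-file entries (a files: list with a path plus optional overrides) is my assumption, not something this README specifies, so check make_config.rb or an existing analysis directory for the exact schema:

prefix: root://faxbox.usatlas.org//user/kratsg/LArStudies/
numEvents: 10000                 # global default
numJobs: 10                      # each job covers 10000/10 = 1000 events

files:
  - path: TTbar_sample.root      # hypothetical filename; inherits the global settings
  - path: ZHnunubb_sample.root   # hypothetical filename; inline values override the globals
    numEvents: 5000
    numJobs: 5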

Setting up Plot configurations

I also provide a plot_configs.py, which currently contains only one variable, dataSetStr. This is just a LaTeX string that gets rendered when making plots (such as for TTbar or ZH->nu nu bbar). Matplotlib can parse it correctly with standard LaTeX, but if you need help making this string, contact Giordon Stark with questions, or file an issue.
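For reference, a minimal plot_configs.py along those lines could be as small as the following; the sample label itself is only an example:

# plot_configs.py -- example only: dataSetStr is the lone option described above
# use a raw string so the LaTeX backslashes survive; matplotlib renders it on every plot
dataSetStr = r'$t\bar{t},\ \sqrt{s} = 14\ \mathrm{TeV}$'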

Making plots

Now, making plots requires a specific environment. The best way to do this (at least the way I've done it) is to define a bash function which sets up ROOT and then updates the python path. First you need to extract the local.tar.gz file:

tar -xzvf local.tar.gz

and then you just need to run

localSetupROOT 5.34.18-x86_64-slc6-gcc4.7
export PYTHONPATH=$HOME/L1TriggerAnalysis/.local/lib/python2.7/site-packages:$PYTHONPATH

which sets up ROOT 5.34.18 with Python 2.7 and adds the location of the packages you just extracted to the Python path, giving you access to NumPy, Matplotlib, atlas_jets, and rootpy.
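Wrapped up as the bash function mentioned above (the name setupPlots and putting it in your ~/.bashrc are just suggestions), this might look like:

# hypothetical helper; adjust the path if you extracted local.tar.gz elsewhere
setupPlots() {
    localSetupROOT 5.34.18-x86_64-slc6-gcc4.7
    export PYTHONPATH=$HOME/L1TriggerAnalysis/.local/lib/python2.7/site-packages:$PYTHONPATH
}

Then run setupPlots once per shell before calling make_plots.py.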

Contact

File an issue or contact Giordon Stark with any questions.
