For increased portability and ease of use, we are moving toward storing halo data in SQLite database files rather than ASCII tables. Code for creating the simulation databases from ASCII files can be found in the /migration subdirectory.
Dependencies:
- numpy
- pandas
- astropy (for utility functions and merger tree code)
- cosmolopy (for utility functions)
- Download this package to a location in your Python path.
- (Optional) Add SIM_ENV="omega" or SIM_ENV="local" to your .bashrc or .bash_profile to set the default directory for the simulation database, or see the note below.
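For reference, setting the environment variable in your shell startup file might look like the following (a minimal sketch; the exact value depends on which machine you are on):

```shell
# In ~/.bashrc or ~/.bash_profile: pick the environment the code should
# assume when locating the database ("omega" or "local").
export SIM_ENV="omega"
```

Remember to open a new shell or `source` the file for the change to take effect.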
- Import the database interface:

    import caps.io.reader as db
- Load your database:

    sim = db.Simulation("L500_NR_0")
- Get halo properties by passing a list of column-name strings and a list of halo ids:

    property_list = ["Mtotal_500c", "r500c"]
    clusters = [1, 3, 10]
    halos = sim.get_halo_properties(clusters, property_list, aexp)
- Or get profiles using the same argument format. Use the table argument to request profiles from a table other than the default "profiles" table. Note: 'rmid' values are automatically pulled from the database when you request profiles.

    profile_list = ["Mgas", "Mdm"]
    clusters = [1, 3, 10]
    halos = sim.get_halo_profiles(clusters, profile_list, aexp)

    profile_list = ["Ptherm", "Prand"]
    clusters = [1, 3, 10]
    halos = sim.get_halo_profiles(clusters, profile_list, aexp, table="prand_profiles")
- Global data and profiles are returned as pandas DataFrames:

    M500 = halos["Mtotal_500c"]
    M500 = halos[halos["id"] == id]["Mtotal_500c"]
    bins = profiles[profiles["id"] == id]["rmid"]
    Pth = profiles[profiles["id"] == id]["Ptherm"]
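To illustrate the filtering pattern above, here is a self-contained sketch using a toy DataFrame in place of the one returned by get_halo_properties (the column names follow the examples above; the values are made up):

```python
import pandas as pd

# Toy stand-in for the DataFrame returned by get_halo_properties.
halos = pd.DataFrame({
    "id": [1, 3, 10],
    "Mtotal_500c": [1.2e14, 3.4e14, 8.9e13],
    "r500c": [0.9, 1.3, 0.8],
})

# All M500 values as a Series, and the value for a single halo id.
M500_all = halos["Mtotal_500c"]
M500_3 = halos.loc[halos["id"] == 3, "Mtotal_500c"].iloc[0]  # 3.4e14
```

Using `.loc` with a boolean mask and a column label is equivalent to the chained `halos[halos["id"] == id]["Mtotal_500c"]` form, but avoids pandas chained-assignment warnings if you later modify the result.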
If you set SIM_ENV in your .bashrc or .bash_profile to omega or local, the code will automatically look in group_scratch/[catalog]/ or ~/Data/[catalog]/, respectively, for a file named [catalog].db. If you would like to point the code to an alternate location for the database, specify it with the keyword argument db_dir:
    sim = db.Simulation(catalog, db_dir="/path/to/database/directory")
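The lookup logic described above could be sketched as follows (a hypothetical helper for illustration only; `resolve_db_path` is not part of caps.io.reader, and the actual implementation may differ):

```python
import os

def resolve_db_path(catalog, db_dir=None):
    """Sketch of the database-location rules: an explicit db_dir wins;
    otherwise SIM_ENV selects between the omega and local defaults."""
    if db_dir is None:
        env = os.environ.get("SIM_ENV", "local")
        if env == "omega":
            db_dir = os.path.join("group_scratch", catalog)
        else:
            db_dir = os.path.expanduser(os.path.join("~", "Data", catalog))
    return os.path.join(db_dir, catalog + ".db")

# An explicit db_dir overrides the environment-based default.
path = resolve_db_path("L500_NR_0", db_dir="/tmp")
```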
There is a problem with the Python 2.7.2 installation on Omega that prevents sqlite3 from working. The HPC staff recommend using the following module stack instead:
module load Langs/Python/2.7.5
module load Libs/SCIPY
module load Libs/NUMPY
module load Libs/PANDAS
module load Libs/MATPLOTLIB
/mergertree - contains code to generate a merger tree from the halo_particles data and a database with full halo catalog data. Writes directly to the mergertree and mergers tables.
/migration - contains scripts to load the halo catalogs generated by cart_hfind and the profiles generated by cart_halo_profile into a new database.
- TODO: Add an option to join profile tables, to reduce the number of queries?
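As a sketch of what that join could look like on the client side, two profile tables sharing (id, rmid) keys can be merged in pandas (the column names follow the earlier examples; the data are made up, and this is not implemented in the package):

```python
import pandas as pd

# Toy stand-ins for two profile tables returned by get_halo_profiles.
profiles = pd.DataFrame({"id": [1, 1], "rmid": [0.1, 0.2],
                         "Mgas": [1.0, 2.0]})
prand = pd.DataFrame({"id": [1, 1], "rmid": [0.1, 0.2],
                      "Prand": [5.0, 6.0]})

# Inner join on the shared keys, so one frame carries columns from both.
joined = pd.merge(profiles, prand, on=["id", "rmid"])
```

An equivalent join could instead be pushed into SQLite itself (a JOIN on id and rmid), which would cut the number of round trips to the database.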