Extended source sensitivity calculator
This notebook shows how to compute the extended source sensitivity for a specific model. In the current state of cosipy, we are limited by the available responses, i.e., at 511, 1157, 1173, 1333 and 1809 keV, plus a coarse continuum response. Keep in mind that, due to the coarseness of the current responses, the sensitivity might be underestimated.
The collaboration is currently developing a neural-network approach to the response, to be used with an unbinned analysis. This will allow us to fully exploit COSI's capabilities.
[1]:
import os
import logging
import sys
logger = logging.getLogger('cosipy')
logger.setLevel(logging.INFO)
logger.addHandler(logging.StreamHandler(sys.stdout))
os.environ["OMP_NUM_THREADS"] = "1"
os.environ["MKL_NUM_THREADS"] = "1"
os.environ["NUMEXPR_NUM_THREADS"] = "1"
from cosipy.spacecraftfile import SpacecraftHistory
from cosipy.response.FullDetectorResponse import FullDetectorResponse
from cosipy.response import PointSourceResponse,ExtendedSourceResponse
from cosipy.util import fetch_wasabi_file
from cosipy.statistics import PoissonLikelihood
from cosipy.background_estimation import FreeNormBinnedBackground
from cosipy.interfaces import ThreeMLPluginInterface
from cosipy.response import BinnedThreeMLModelFolding, BinnedInstrumentResponse, BinnedThreeMLExtendedSourceResponse
from cosipy.data_io import EmCDSBinnedData
from cosipy.source_injector import SourceInjector
from cosipy.threeml.custom_functions import SpatialTemplate_2D_Healpix
from histpy import Histogram
from mhealpy import HealpixMap
from scoords import SpacecraftFrame
from astropy.time import Time
import astropy.units as u
from astropy.coordinates import SkyCoord, Galactic
import numpy as np
import matplotlib.pyplot as plt
from threeML import *
from astromodels import *
from pathlib import Path
%matplotlib inline
15:21:22 WARNING The naima package is not available. Models that depend on it will not be functions.py:43 available
WARNING The GSL library or the pygsl wrapper cannot be loaded. Models that depend on it functions.py:65 will not be available.
WARNING The ebltable package is not available. Models that depend on it will not be absorption.py:33 available
15:21:23 INFO Starting 3ML! __init__.py:44
WARNING WARNINGs here are NOT errors __init__.py:45
WARNING but are inform you about optional packages that can be installed __init__.py:46
WARNING to disable these messages, turn off start_warning in your config file __init__.py:47
WARNING no display variable set. using backend for graphics without display (agg) __init__.py:53
WARNING ROOT minimizer not available minimization.py:1208
WARNING Multinest minimizer not available minimization.py:1218
WARNING PyGMO is not available minimization.py:1228
15:21:24 WARNING Could not import plugin FermiLATLike.py. Do you have the relative instrument __init__.py:126 software installed and configured?
WARNING No fermitools installed lat_transient_builder.py:44
Download the data
[3]:
data_path = Path("") # /path/to/files. Current dir by default
[ ]:
#Download the Orientation file
fetch_wasabi_file('COSI-SMEX/develop/Data/Orientation/DC3_final_530km_3_month_with_slew_15sbins_GalacticEarth_SAA.fits', checksum = 'e86df2407eb052cf0c1db4a8e7598727')
[3]:
#Download the extended source response
#Choose the response depending on your model
#Model for a 511 keV line
#fetch_wasabi_file('COSI-SMEX/DC3/Data/Responses/extended_source_response/extended_source_response_511_merged.h5.gz', unzip=True,checksum = 'afcf551d3a8f800a268bef1490d0d48a')
#Model for a 1157 keV line
#fetch_wasabi_file('COSI-SMEX/DC3/Data/Responses/extended_source_response/extended_source_response_Ti44_merged.h5.gz',unzip=True,checksum = '7ac6e1d9651ae33cccf94102f5356d9e')
#Model for a 1173 keV line
#fetch_wasabi_file('COSI-SMEX/DC3/Data/Responses/extended_source_response/extended_source_response_Fe60_low_merged.h5.gz', unzip=True ,checksum = '2562f2ac0be44a60ea2d1adcea8e3b4a')
#Model for a 1333 keV line
#fetch_wasabi_file('COSI-SMEX/DC3/Data/Responses/extended_source_response/extended_source_response_Fe60_high_merged.h5.gz', unzip=True ,checksum = '037217db2e8ca8a9a74a247e8fab9eb3')
#Model for a 1809 keV line
fetch_wasabi_file('COSI-SMEX/DC3/Data/Responses/extended_source_response/extended_source_response_Al26_merged.h5.gz',unzip=True,checksum = '218d088159c6e11d0ddecc03ca9fa399')
#Model for another line or a continuum source
#fetch_wasabi_file('COSI-SMEX/DC3/Data/Responses/extended_source_response/extended_source_response_continuum_merged.h5.gz' ,unzip=True,checksum = 'a3a4a57acfbfe00378507441fa7a5891')
Downloading cosi-pipeline-public/COSI-SMEX/DC3/Data/Responses/extended_source_response/extended_source_response_Al26_merged.h5
[ ]:
#Download the background model
#Choose the file depending on your model
#Model for a 511 keV line
#fetch_wasabi_file('COSI-SMEX/DC4/Data/Backgrounds/gal_DC4_bkg_511_binned_smoothed_fwhm_10.0deg.hdf5', checksum = '1b437be9b4d9e85a230dfe4f4d2ca67e')
#Model for a 1157 keV line
#fetch_wasabi_file('COSI-SMEX/DC4/Data/Backgrounds/gal_DC4_bkg_Ti44_binned_smoothed_fwhm_10.0deg.hdf5',checksum = 'bd62bfde46628a6e68434a6c68c7167d')
#Model for a 1173 keV line
#fetch_wasabi_file('COSI-SMEX/DC4/Data/Backgrounds/gal_DC4_bkg_Fe60_low_binned_smoothed_fwhm_10.0deg.hdf5',checksum = '57f972b0a1c2ece4626c85ced3cc050f')
#Model for a 1333 keV line
#fetch_wasabi_file('COSI-SMEX/DC4/Data/Backgrounds/gal_DC4_bkg_Fe60_high_binned_smoothed_fwhm_10.0deg.hdf5',checksum = 'db56a9fd9d8fc37cb03fba5e907a80c3')
#Model for a 1809 keV line
fetch_wasabi_file('COSI-SMEX/DC4/Data/Backgrounds/gal_DC4_bkg_Al26_binned_smoothed_fwhm_10.0deg.hdf5',checksum = 'bd6a32f8abee5b989b87149111487faa')
#Model for another line or a continuum source
#fetch_wasabi_file('COSI-SMEX/DC4/Data/Backgrounds/gal_DC4_bkg_continuum_binned_smoothed_fwhm_10.0deg.hdf5',checksum = '1acceb619b0d7e43eea8cb29ffe74279')
Define your source model
For an extended source we need to define a spectral shape and a spatial shape:
Spectrum Shape
For this example we use a Gaussian, but other functions can be used (see https://astromodels.readthedocs.io/en/latest/notebooks/function_list.html#Included-1D-Functions)
Additionally, you can provide your own spectrum template from a file. See Template (Table) Models in: https://threeml.readthedocs.io/en/stable/notebooks/spectral_models.html
3ML expects the spectral shape to have units of 1/cm2/s/keV.
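As a quick, pure-NumPy sanity check of these units (independent of astromodels; the numbers mirror the line parameters set in the next cell), a Gaussian line spectrum in 1/cm2/s/keV should integrate back to the total line flux F:

```python
import numpy as np

# Line parameters matching the values used in the next cell:
# total flux F in ph/cm2/s, centroid mu and width sigma in keV.
F, mu, sigma = 3e-5, 1809.0, 0.3

def dNdE(E):
    """Differential photon flux in ph/cm2/s/keV: a unit-area Gaussian scaled by F."""
    return F / (np.sqrt(2.0 * np.pi) * sigma) * np.exp(-0.5 * ((E - mu) / sigma) ** 2)

# Integrating the spectral shape over energy must recover the total flux F
E = np.linspace(mu - 10.0 * sigma, mu + 10.0 * sigma, 10001)  # keV grid around the line
total = np.sum(dNdE(E)) * (E[1] - E[0])  # simple Riemann sum, in ph/cm2/s
```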
[4]:
F = 3e-5 / u.cm / u.cm / u.s
mu = 1809*u.keV
sigma = 0.3*u.keV
spectrum = Gaussian()
spectrum.F.value = F.value
spectrum.F.unit = F.unit
spectrum.mu.value = mu.value
spectrum.mu.unit = mu.unit
spectrum.sigma.value = sigma.value
spectrum.sigma.unit = sigma.unit
# Set spectral parameters for fitting:
spectrum.F.free = True
spectrum.mu.free = False
spectrum.sigma.free = False
#set limits for the parameter
spectrum.F.min_value = 0
spectrum.F.max_value = 1
Spatial shape
For the spatial shape you need to provide a HEALPix map in a FITS file. The map should be in units of 1/sr and normalized such that sum(map) * pixel_area = 1.
[5]:
#For this example we create the spatial shape model here, but the fits file containing
#the spatial distribution can come from any source (e.g., CLUMPY)
skymap = HealpixMap(nside = 16, scheme = "ring", dtype = float, coordsys='G')
skymap[:] = 1 #fill the map with 1, i.e., no particular spatial structure
#normalize the map to 1/sr
pix_area = skymap.pixarea().value
skymap[:] = skymap[:]/(np.sum(skymap)*pix_area)
#save the example map to a fits file
skymap.write_map("HealpixMap_example.fits",overwrite=True)
#declare the spatial shape
spatial_shape = SpatialTemplate_2D_Healpix(fits_file="HealpixMap_example.fits")
Extended source model
Now that we have declared our spectral and spatial shapes, we can create the extended source model.
[6]:
source = ExtendedSource('source', spectral_shape=spectrum, spatial_shape=spatial_shape)
model = Model(source) # Model with single source. If we had multiple sources, we would do Model(source1, source2, ...)
Open the Response, Orientation and background files
[7]:
#open the Response file
#replace the background and response files below with the ones you want to use
extendedsourceresponse = "extended_source_response_Al26_merged.h5"
bkgfile = "gal_DC4_bkg_Al26_binned_smoothed_fwhm_10.0deg.hdf5"
sc_orientation_path = "DC3_final_530km_3_month_with_slew_15sbins_GalacticEarth_SAA.fits"
#open the extended source response
dr = ExtendedSourceResponse.open(data_path / extendedsourceresponse )
#Create the extended source injector
injector = SourceInjector(response_path=data_path / extendedsourceresponse, response_frame = "galactic")
#open the background model
bg_model = Histogram.open(data_path / bkgfile)
#open the Orientation file
sc_orientation = SpacecraftHistory.open(data_path / sc_orientation_path)
WARNING: XDG_CACHE_HOME is set to '/localscratch/.cache/sgallego', but the default location, /uni-mainz.de/homes/sgallego/.astropy/cache, already exists, and takes precedence. This environment variable will be ignored. [astropy.config.paths]
WARNING: XDG_CACHE_HOME is set to '/localscratch/.cache/sgallego', but the default location, /uni-mainz.de/homes/sgallego/.astropy/cache, already exists, and takes precedence. This environment variable will be ignored. [astropy.config.paths]
Compute the sensitivity
The likelihood (LH) fit is run a hundred times (more would be better; depending on how fast the fit is, you could increase this number). At each iteration a new signal dataset is created by convolving the model with the response and applying Poisson fluctuations (this is the role of the source injector). Similarly, a background dataset is created by drawing a Poisson sample from the background model.
For each iteration the LH ratio test statistic is computed:
\(\mathrm{TS} = -2\ln\frac{L_0}{L_1} = -2(\ln L_0 - \ln L_1)\)
with \(L_0\) being the LH value for the background-only hypothesis and \(L_1\) for background+signal. Assuming Wilks' theorem applies, the LH ratio test statistic should follow a \(\chi^2\) distribution with the number of degrees of freedom equal to the number of free parameters. The median of the TS distribution tells us whether or not COSI is sensitive to this flux value.
For our example we test only one parameter: the signal strength. So, for a 3 \(\sigma\) sensitivity, we want the median to be greater than or equal to 9.
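The mapping between the TS threshold and the significance can be verified directly from the \(\chi^2\) distribution (a small scipy check, not part of the cosipy workflow): with one free parameter, \(\sqrt{\mathrm{TS}}\) is the Gaussian significance, so TS = 9 corresponds to 3 \(\sigma\):

```python
from scipy.stats import chi2, norm

# Under Wilks' theorem with one free parameter, TS ~ chi2(df=1),
# so sqrt(TS) is the Gaussian significance and TS = 9 maps to 3 sigma.
p_ts9 = chi2.sf(9, df=1)      # probability of TS >= 9 under the null hypothesis
p_3sigma = 2 * norm.sf(3)     # two-sided 3 sigma Gaussian tail probability

print(p_ts9, p_3sigma)  # both ~2.7e-3
```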
Keep in mind that we are testing only 3 months of observations; we will later scale the flux to 2 years of observation if the model is already detectable after 3 months.
⚠ !! WARNING !! ⚠ In this example we use a predefined Gaussian function from 3ML, so the free parameter is F, accessed via spectrum.F.value. If you change the spectral shape model, replace F in the code below with the name of your new parameter. To find it, you can use model.display()
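Before running the full cosipy loop, the logic of the procedure can be illustrated with a cosipy-free toy: a single Poisson bin with a known background, where the maximum-likelihood TS has a closed form. All numbers here are arbitrary toy values, not COSI rates:

```python
import numpy as np

rng = np.random.default_rng(0)

b, s_true = 100.0, 40.0   # toy expected background and injected signal counts
niter = 1000

ts = np.empty(niter)
for i in range(niter):
    # "inject" signal on top of background with Poisson fluctuations
    n = rng.poisson(b + s_true)
    # For one bin with known background b and signal s >= 0, the ML estimate
    # is s_hat = max(0, n - b), and TS = -2 (ln L0 - ln L1) reduces to:
    ts[i] = 0.0 if n <= b else 2.0 * (n * np.log(n / b) - (n - b))

median_ts = np.median(ts)
print(f"median TS = {median_ts:.1f} (3 sigma threshold is 9)")
```

The same inject, fit, compare cycle is what the cosipy loop below performs on the full binned data space.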
[8]:
#Calculate the response once
#create signal dataset
signal = injector.inject_model(model = model, make_spectrum_plot = False ,data_save_path = None)#,fluctuate=True)
#create a bkg dataset from the bkg model
bkg_poissonsamp = Histogram(bg_model.axes)
bkg_poissonsamp[:] += np.random.poisson(lam = bg_model.to_dense()[:])
#combine the data + the bck like we would get for real data
data = EmCDSBinnedData(signal.project('Em', 'Phi', 'PsiChi') + bkg_poissonsamp.project('Em', 'Phi', 'PsiChi'))
#declare the extended source response
esr = BinnedThreeMLExtendedSourceResponse(data = data,
precomputed_psr = dr,
polarization_axis = dr.axes['Pol'] if 'Pol' in dr.axes.labels else None)
response = BinnedThreeMLModelFolding(data = data, extended_source_response = esr)
[10]:
# We start with 100 iterations; more would be better,
# to be adjusted depending on how fast the LH fit is.
# You can also run several scripts in parallel and save the likelihood ratio test values instead of using a for loop
nbiter = 100
LHratiotest = []
for i in range(nbiter) :
print(f"ITERATION {i}")
#create a signal dataset with the source injector
spectrum.F.value = F.value
spectrum.F.free = True
signal = injector.inject_model(model = model, make_spectrum_plot = False ,data_save_path = None)#,fluctuate=True)
#create a bkg dataset from the bkg model
bkg_poissonsamp = Histogram(bg_model.axes)
bkg_poissonsamp[:] += np.random.poisson(lam = bg_model.to_dense()[:])
# Set background parameter, which is used to fit the amplitude of the background, and instantiate the COSI 3ML plugin
# here we define only one bkg model but we could add more
bkg_dist = {"Background":bg_model.project('Em', 'Phi', 'PsiChi')}
# Workaround to avoid inf values. Our bkg should be smooth, but currently it's not.
for bckfile in bkg_dist.keys() :
bkg_dist[bckfile] += sys.float_info.min
#combine the data + the bck like we would get for real data
newdata = EmCDSBinnedData(signal.project('Em', 'Phi', 'PsiChi')
+ bkg_poissonsamp.project('Em', 'Phi', 'PsiChi')
)
bkg = FreeNormBinnedBackground(bkg_dist,
sc_history=sc_orientation,
copy = True)
#change the data of the extended source response
esr._data = newdata
##====
like_fun = PoissonLikelihood(newdata, response, bkg)
cosi = ThreeMLPluginInterface('cosi',
like_fun,
response,
bkg)
# Nuisance parameter guess, bounds, etc.
cosi.bkg_parameter['Background'] = Parameter("Background", # background parameter
1, # initial value of parameter
min_value=0, # minimum value of parameter
max_value=10, # maximum value of parameter
delta=0.05, # initial step used by fitting engine
unit = u.Hz
)
# Set the source Model
cosi.set_model(model)
plugins = DataList(cosi) # If we had multiple instruments, we would do e.g. DataList(cosi, lat, hawc, ...)
like = JointLikelihood(model, plugins, verbose = False)
#do the fit
like.fit(compute_covariance = False,quiet=True)
#get L1
L1 = cosi.get_log_like()
spectrum.F.value = 0 #null hypothesis (no signal)
spectrum.F.free = False
#do the fit again
like.fit(compute_covariance = False,quiet=True)
#Get L0
L0 = cosi.get_log_like()
#save LH ratio test
LHratiotest.append( -2 * (L0-L1) )
ITERATION 0
13:45:49 INFO set the minimizer to minuit joint_likelihood.py:1017
ITERATION 1
13:46:50 INFO set the minimizer to minuit joint_likelihood.py:1017
ITERATION 2
13:47:52 INFO set the minimizer to minuit joint_likelihood.py:1017
Plot the LH ratio test
When testing a high flux value for your model, some “islands” (peaks at very high TS) can appear in the TS distribution. This happens when a signal event lands in a bin where the estimated background was originally 0 and was then filled with an arbitrary floor value. This is not a problem for the sensitivity estimation algorithm, since it does not significantly affect the median of the TS distribution for a flux near threshold, so the estimated sensitivity is still sound. See this issue for more information: https://github.com/cositools/cosipy/issues/598
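The mechanism behind these islands can be reproduced in isolation (a small scipy sketch, mirroring the sys.float_info.min floor used in the loop above): a Poisson log-likelihood with an expected rate of exactly zero but one observed count is \(-\infty\), while the tiny floor keeps it finite, although hugely negative:

```python
import sys
import numpy as np
from scipy.stats import poisson

# One observed count in a bin where the background model predicts exactly zero:
ll_zero = poisson.logpmf(1, mu=0.0)              # -inf: the likelihood collapses

# The same bin after adding the tiny floor stays finite, but the log-likelihood
# is extremely negative, which can inflate the TS and create the "islands":
ll_floor = poisson.logpmf(1, mu=sys.float_info.min)

print(ll_zero, ll_floor)
```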
[13]:
if np.median(LHratiotest) < 9 :
print(f"COSI is not sensitive to this model with a flux of {F.value} ph/cm2/s. Please try a higher flux")
else :
print(f"COSI is 3 sigma sensitive to {F.value} ph/cm2/s. You can continue to the next cell")
fig,ax = plt.subplots()
ax.hist(LHratiotest,bins=30)
ax.axvline(np.median(LHratiotest),label=f"median {np.median(LHratiotest):.3f}",color="r",linestyle="--")
ax.legend()
ax.set_xlabel("LH ratio test")
ax.set_ylabel("counts")
ax.set_title(f"Model - flux {F.value} ph/cm2/s")
plt.show()
COSI is 3 sigma sensitive to 3e-05 ph/cm2/s. You can continue to the next cell
Extrapolate the sensitivity to 2 years
The DC4 simulations include only 3 months of orbit. To get the sensitivity after 2 years (the nominal lifetime of COSI), as a first approximation we can scale the flux by \(\frac{1}{\sqrt{T}}\)
[14]:
# 3 * 8 = 24 months = 2 years
sensi_2y = F.value/np.sqrt(8)
print(f"3 sigma sensitivity after 2 years of COSI observation : {sensi_2y}")
3 sigma sensitivity after 2 years of COSI observation : 1.0606601717798212e-05