-
Large sky surveys in time-domain astronomy are becoming common
-
Large amounts of incoming data:
- require new methods to detect objects
- require more accurate methods to study objects, as precision increases
-
This brings to the fore survey science and data science methods.
-
How can preparation be done without the data? Test analysis methods using:
- Reprocessed old datasets: important as they include real characteristics, but lack the size necessary to study such effects.
- Simulated datasets: provide the appropriate size for testing precision effects, but the challenge is including appropriate correlations and subtle effects.
- Important for analysis methods and survey design: e.g. survey strategy requirements.
-
Simulating Data :
- The astrophysical sources:
- Transients/variables are among the richest and most complex astrophysical systems, and are therefore of interest to frontier research in astrophysics: they are typically generated by computationally intensive codes involving solutions of partial differential equations describing transport phenomena, dynamics, and nucleosynthesis.
- Simulating transients from data-driven models: fast, with low computational requirements; given sufficient data, often better tuned to describing realistic populations of astrophysical transients.
- Equally important is the survey science aspect:
- Representing a survey appropriately, including the proper cadence, seeing, and depths. Emphasized here as this work is intended to be general.
- This misses effects present in the image-processing pipeline: emulators are necessary.
- Which data product to emulate?
- Forced photometry results
- Forced photometry of detected objects, or of all objects? This helps in understanding the impact of selection. Over what time scales? Data-size issues arise.
- What information should be recorded?
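As an illustration of the data-driven route above, a toy parametric light-curve form (an exponential rise and decay, similar in spirit to common empirical parametrizations) can be evaluated cheaply on a time grid. All names and parameter values here are illustrative, not the model actually used in this work:

```python
import numpy as np

def toy_transient_flux(t, a, t0, t_rise, t_fall):
    """Toy data-driven transient model: flux vs. time (days), with an
    exponential rise (timescale t_rise) and decay (timescale t_fall).
    Illustrative only; a real data-driven model would be trained on data."""
    x = np.asarray(t, dtype=float)
    return a * np.exp(-(x - t0) / t_fall) / (1.0 + np.exp(-(x - t0) / t_rise))

# Evaluate on a dense time grid; for t_rise << t_fall the peak lies a few
# rise times after t0.
times = np.linspace(-20.0, 60.0, 200)
flux = toy_transient_flux(times, a=1.0, t0=0.0, t_rise=3.0, t_fall=20.0)
```

Evaluating such a closed-form model costs microseconds per epoch, in contrast to the radiative-transfer codes mentioned above.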
-
Code architecture choices:
- Parallelizability
- Reproducibility
- Library vs script
- Choice of primitives
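One way the reproducibility and parallelizability choices interact: give each simulation quantum its own independent random stream spawned from a single root seed, so results do not depend on which node processes which quantum. A minimal sketch using NumPy's `SeedSequence` (the function name `simulate_quantum` is hypothetical):

```python
import numpy as np

def simulate_quantum(seed_seq, n):
    """Draw n parameter values for one simulation quantum from its own
    independent, reproducible random stream."""
    rng = np.random.default_rng(seed_seq)
    return rng.normal(size=n)

# One root seed; each quantum (e.g. a sky tile) gets a spawned child
# stream. Re-running with the same root seed reproduces every quantum,
# regardless of the order or node on which quanta are processed.
root = np.random.SeedSequence(2023)
children = root.spawn(4)
draws = [simulate_quantum(c, 5) for c in children]
```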
-
Inputs and Outputs in this work and relationship to other work:
- Intended to work with LSST; must interact with LSST data products and the software stack
- Output of survey strategy in OpSim outputs
- SNCosmo
- SNANA
- MAF
- a model that knows the time evolution of the transient. The model is expected to provide the correct time evolution in different bands.
- Such a model may depend on a set of parameters. For a particular set of parameters, it describes a particular class of transients as observed from the top of the Earth's atmosphere.
- In order to produce light curves as observed in a survey, this model requires an input of the survey strategy, as well as observational parameters.
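To make the model-plus-survey-strategy combination concrete: given a cadence table of visit times, bands, and 5-sigma depths (the kind of information an OpSim output provides), a model's fluxes can be sampled at the visit times with depth-dependent noise. This is a hedged sketch, not the actual interface of this code; `observe`, the zero point, and the toy cadence are all assumptions:

```python
import numpy as np

def observe(model_flux, mjds, bands, m5, rng, zp=27.0):
    """Sample a transient model at the survey's visit times, adding
    Gaussian noise set by each visit's 5-sigma limiting magnitude m5.
    model_flux(t, band) -> true flux; zp is an assumed zero point."""
    true = np.array([model_flux(t, b) for t, b in zip(mjds, bands)])
    # Flux corresponding to the 5-sigma depth; one sigma is a fifth of it.
    f5 = 10.0 ** (-0.4 * (m5 - zp))
    sigma = f5 / 5.0
    return true + rng.normal(0.0, sigma), sigma

# Toy cadence and a flat-spectrum, exponentially declining toy model.
mjds = np.array([0.0, 3.0, 6.0, 9.0])
bands = ["g", "r", "g", "r"]
m5 = np.array([24.5, 24.2, 24.5, 24.2])
rng = np.random.default_rng(0)
flux, err = observe(lambda t, b: 100.0 * np.exp(-t / 10.0), mjds, bands, m5, rng)
```

The same pattern extends to per-visit seeing and other observational parameters by adding columns to the cadence table.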
- a model, and a sequence of parameters. The sequence of parameters may be drawn from a stochastic distribution or a deterministic list.
- It should include coordinate positions (RA, Dec) for the calculation of MW extinction. However, we have a Population class which does not include this.
- There is further support of primitives to build up a population object, in terms of rate distributions, cosmological volume calculations, etc.
- While such a sequence may include all objects of interest in the universe, or in the survey, reproducibility demands the definition of a quantum which will never be split between nodes during simulations.
- A convenient identification of such quanta, for variables of roughly uniform density across the sky, is over quasi-equal spatial areas.
- For objects whose spatial density depends on the location in the sky (e.g. variable stars), a similar method may be chosen, but it may help to identify quanta based on positions.
- Both cases may be handled easily through hierarchical tessellations of the sky.
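A minimal stand-in for such a tessellation, to illustrate the quantum idea: tiles built from rings uniform in sin(Dec) crossed with uniform RA bins are exactly equal-area, so a uniform population lands evenly across them. In practice a hierarchical scheme such as HEALPix would be used; the function below is a toy sketch, not part of this code:

```python
import numpy as np

def equal_area_tile(ra_deg, dec_deg, n_ring=8, n_ra=16):
    """Assign sky positions to equal-area tiles: rings uniform in
    sin(dec) crossed with uniform RA bins. A toy stand-in for a
    hierarchical tessellation such as HEALPix."""
    ra = np.asarray(ra_deg, dtype=float) % 360.0
    s = np.sin(np.radians(np.asarray(dec_deg, dtype=float)))
    ring = np.clip((0.5 * (s + 1.0) * n_ring).astype(int), 0, n_ring - 1)
    col = np.clip((ra / 360.0 * n_ra).astype(int), 0, n_ra - 1)
    return ring * n_ra + col  # tile id in [0, n_ring * n_ra)

# Draw a population uniform on the sphere and bin it into tiles; each
# tile is then a candidate simulation quantum.
rng = np.random.default_rng(1)
ra = rng.uniform(0.0, 360.0, 10000)
dec = np.degrees(np.arcsin(rng.uniform(-1.0, 1.0, 10000)))
tiles = equal_area_tile(ra, dec)
```

For populations with position-dependent densities, the same tile ids still define the quanta; only the per-tile object counts change.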
- In order to simulate a population, the code
- Adding chips functionality might be desirable. Check on Mangle help for this.
- Should include a discussion of parallelization and reproducibility when parallelizing over SNe vs. tiles.
- Keep all 3 codes together
- ApJ Supplement