LVV-T23 - Verify implementation of Test Storing Approximations of Per-pixel Metadata (DMS-REQ-0326)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Simon Krughoff |
Image depth and mask information shall be available in a parameterized approximate form in addition to a full per-pixel form.
- LVV-157 - DMS-REQ-0326-V-01: Storing Approximations of Per-pixel Metadata
Test data: A data repository containing a full DRP data reduction of the HSC PDR dataset.
Step 1
The DM Stack shall be initialized using the loadLSST script (as described in DRP-00-00).
Step 2
A “Data Butler” will be initialized to access the repository.
Step 3
For each of the expected data product types (listed in the Test Items section, §4.3.2) and each of the expected units (PVIs, coadds, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
Step 4
Create the coadd pixel level depth map for the HSC PDR dataset from step 1.
Step 5
Generate compressed representation of the pixel level depth map.
Step 6
Create the coadd pixel level mask map for the HSC PDR dataset from step 1.
Step 7
Generate compressed representation of the mask map.
Step 8
Sample randomly from both the pixel level and compressed depth maps. Compare the distribution of depths sampled from the pixel level depth map to that sampled from the compressed representation.
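Step 8's comparison can be sketched with synthetic arrays standing in for the two depth maps; the map values, the sample size, and the plain two-sample KS statistic used here are illustrative choices, not prescribed by the test case:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-ins for the per-pixel depth map and its compressed
# approximation (values mimic 5-sigma limiting magnitudes).
depth_map = rng.normal(26.0, 0.2, size=(512, 512))
compressed_map = depth_map + rng.normal(0.0, 0.01, size=depth_map.shape)

def sample_depths(depth, n=5000):
    """Draw n random pixel samples from a 2-D depth map."""
    ys = rng.integers(0, depth.shape[0], n)
    xs = rng.integers(0, depth.shape[1], n)
    return depth[ys, xs]

def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic (max distance between CDFs)."""
    grid = np.sort(np.concatenate([a, b]))
    cdf_a = np.searchsorted(np.sort(a), grid, side="right") / a.size
    cdf_b = np.searchsorted(np.sort(b), grid, side="right") / b.size
    return float(np.max(np.abs(cdf_a - cdf_b)))

d_pixel = sample_depths(depth_map)
d_comp = sample_depths(compressed_map)
stat = ks_statistic(d_pixel, d_comp)
print(f"KS statistic between pixel-level and compressed samples: {stat:.3f}")
```

A small statistic indicates the compressed representation preserves the depth distribution.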
Step 9
Divide the mask planes into two groups: INFO and BAD. BAD flags are any that would cause a particular pixel to be excluded from processing: e.g. EDGE, SAT, BAD. Sample masks from both the pixel level mask map and the compressed mask map.
For each sample, compute sum(mask_pixel xor mask_compressed). Produce the distribution of the number of bits that differ between the samples.
Repeat for both the INFO flags and the BAD flags.
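Step 9's per-plane bit comparison can be sketched as follows; the mask-plane bit positions and the synthetic mask arrays are assumptions for illustration, not the real afw mask layout:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical bit assignments for the two plane groups named above.
BAD_PLANES = {"BAD": 0, "SAT": 1, "EDGE": 4}
INFO_PLANES = {"DETECTED": 5, "DETECTED_NEGATIVE": 6}

def plane_bitmask(planes):
    """OR together the bits of the named mask planes."""
    mask = 0
    for bit in planes.values():
        mask |= 1 << bit
    return mask

# Stand-in integer mask maps; the "compressed" map flips ~1% of samples.
mask_pixel = rng.integers(0, 128, size=10_000, dtype=np.uint32)
flip = np.where(rng.random(10_000) < 0.01,
                rng.integers(0, 128, size=10_000, dtype=np.uint32), 0)
mask_compressed = mask_pixel ^ flip.astype(np.uint32)

def bit_diff_counts(a, b, planes):
    """Per-sample count of differing bits, restricted to the given planes."""
    diff = (a ^ b) & plane_bitmask(planes)
    return np.array([bin(int(v)).count("1") for v in diff])

for label, planes in (("BAD", BAD_PLANES), ("INFO", INFO_PLANES)):
    counts = bit_diff_counts(mask_pixel, mask_compressed, planes)
    print(label, "mean differing bits per sample:", counts.mean())
```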
LVV-T24 - Verify implementation of Computing Derived Quantities (DMS-REQ-0331)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Melissa Graham |
Common derived quantities shall be made available to end-users by either providing pre-computed columns or providing functions that can be used dynamically in queries. These should at least include the ability to calculate the reduced chi-squared of fitted models and make it as easy as possible to calculate color-color diagrams.
- LVV-162 - DMS-REQ-0331-V-01: Computing Derived Quantities
Step 1
The DM Stack shall be initialized using the loadLSST script (as described in DRP-00-00).
Step 2
A “Data Butler” will be initialized to access the repository.
Step 3
For each of the expected data product types (listed in the Test Items section, §4.3.2) and each of the expected units (PVIs, coadds, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
Step 4
Load the data products into the DPDD+Science Platform.
Step 5
Construct a color-color diagram and fit the stellar locus in the Science Platform.
Step 6
Invite three members of the commissioning team to create color-color diagrams from the coadd catalogs, based on the merged coadd reference catalog.
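The stellar-locus fit in Step 5 can be sketched with a synthetic catalog; the locus slope, offset, and scatter below are invented stand-ins for real coadd photometry:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for star photometry from a merged coadd catalog:
# colors drawn around an assumed linear stellar locus in (g-r, r-i).
n = 2000
g_r = rng.uniform(0.2, 1.2, n)
r_i = 0.45 * g_r - 0.05 + rng.normal(0.0, 0.02, n)  # assumed slope/offset

# Fit the stellar locus with a straight line (least squares).
slope, offset = np.polyfit(g_r, r_i, 1)
print(f"fitted stellar locus: r-i = {slope:.3f}*(g-r) {offset:+.3f}")
```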
LVV-T25 - Verify implementation of Denormalizing Database Tables (DMS-REQ-0332)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Colin Slater |
The database tables shall contain views presented to the users that will be appropriately denormalized for ease of use.
- LVV-163 - DMS-REQ-0332-V-01: Denormalizing Database Tables
Step 1
Log host and networking details of the client host to be used. Log the Web browser version to be used.
Step 2
Establish VPN connectivity to the PDAC at NCSA.
Step 3
Manually perform, and log, the following steps against the Portal Aspect:
- Navigate to the PDAC Portal. Log the URL used to do so.
- Perform a cone search around (ra=0,dec=0), radius 300 arcseconds, in each of the Object-like, ForcedSource-like, and a Source-like catalog. Choose a row from each search and record the primary key value for each for later use. Take screen shots of the search form and of the results of the three searches. Record the wall clock time required for the searches, if long enough to measure.
- Perform a multi-object cone search based on the coordinates in the file LDM-540/test_scripts/lsp-00-15.coords in the Object-like table. Take screen shots of the search form and of the results of the search. Record the wall clock time required for the searches, if long enough to measure.
- Perform a search on each of the Object-like, ForcedSource-like, and Source-like catalogs for the IDs previously saved. Confirm that each search is successful and returns the same information as in the original search from which the ID was taken. Perform a search on the ForcedSource-like catalog using the ID from the Object-like catalog. Confirm that a time series of measurements of that object in multiple epochs is returned. Take screen shots of the search forms and of the results of the searches. Record the wall clock time required for the searches, if long enough to measure.
- On each of the Object-like catalog and a Source-like catalog, by performing searches over small regions of sky and exploring the results, choose a set of attributes and search parameters which should select a relatively small number of rows (< 100,000) when applied to the entire sky. This may require some iterative experimentation at increasingly larger scales. Take screen shots of the final search forms and of the results of the searches. Record the wall clock time required for the searches, if long enough to measure.
Step 4
List the available views in the database.
Step 5
Take 20 sampled queries and determine which are easily done on views and which require complicated joins. Discuss the complicated ones and determine if any could be simplified by adding additional views.
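Step 4's view listing can be sketched against an in-memory SQLite database; the table and view names are illustrative, and the production catalog database would use its own dialect's catalog query:

```python
import sqlite3

# Minimal sketch: create two base tables and one denormalized view,
# then list the views, as Step 4 calls for.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE Object (objectId INTEGER PRIMARY KEY, ra REAL, dec REAL);
    CREATE TABLE Source (sourceId INTEGER PRIMARY KEY, objectId INTEGER, psfFlux REAL);
    -- A denormalized view joining Source measurements to Object positions.
    CREATE VIEW SourceWithPosition AS
        SELECT s.sourceId, s.psfFlux, o.ra, o.dec
        FROM Source s JOIN Object o USING (objectId);
""")
views = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='view'")]
print("available views:", views)
```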
LVV-T26 - Verify implementation of Maximum Likelihood Values and Covariances (DMS-REQ-0333)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Quantities delivered by all measurement algorithms shall include maximum likelihood values and covariances.
- LVV-164 - DMS-REQ-0333-V-01: Maximum Likelihood Values and Covariances
Step 1
The DM Stack shall be initialized using the loadLSST script (as described in DRP-00-00).
Step 2
A “Data Butler” will be initialized to access the repository.
Step 3
For each of the expected data product types (listed in the Test Items section, §4.3.2) and each of the expected units (PVIs, coadds, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
Step 4
Verify that maximum likelihood values and covariances are provided. Test and manually inspect that they are reasonable (finite, appropriately normed).
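A minimal sketch of the reasonableness check in Step 4, assuming a covariance matrix counts as "reasonable" when it is finite, symmetric, and positive semi-definite:

```python
import numpy as np

def covariance_is_reasonable(cov, atol=1e-10):
    """Check that a covariance matrix is finite, symmetric, and
    positive semi-definite (smallest eigenvalue >= -atol)."""
    cov = np.asarray(cov, dtype=float)
    if not np.all(np.isfinite(cov)):
        return False
    if not np.allclose(cov, cov.T):
        return False
    return bool(np.linalg.eigvalsh(cov).min() >= -atol)

good = np.array([[0.04, 0.01], [0.01, 0.09]])       # valid 2x2 covariance
bad = np.array([[0.04, np.nan], [np.nan, 0.09]])    # non-finite entries
print(covariance_is_reasonable(good), covariance_is_reasonable(bad))
```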
LVV-T27 - Verify implementation of Data Availability (DMS-REQ-0346)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
All raw data used to generate any public data product (raw exposures, calibration frames, telemetry, configuration metadata, etc.) shall be kept and made available for download.
- LVV-177 - DMS-REQ-0346-V-01: Data Availability
Step 1
Invite two reviewers to review the plan for archiving and providing raw data, and confirm that it is reasonable.
Step 2
Pass a set of HSC data (equal in size to the first public data release) through the data backbone, exercising both ingest and the provision interface.
Step 3
Track the ingestion of AuxTel data during one month in 2018-2019 and verify delivery and test download.
LVV-T28 - Verify implementation of Measurements in catalogs (DMS-REQ-0347)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Colin Slater |
All catalogs shall record source measurements in flux units.
- LVV-178 - DMS-REQ-0347-V-01: Measurements in catalogs
Step 1
The DM Stack and Alert Processing packages shall be initialized as described in LVV-T17 (AG-00-00).
Step 2
The alert generation processing will be executed using the verification cluster:
```bash
python ap_verify/bin/prepare_demo_slurm_files.py
# At present we must run a single ccd+visit to handle ingestion before
# parallel processing can begin
./ap_verify/bin/exec_demo_run_1ccd.sh 410915 25
ln -s ap_verify/bin/demo_run.sl
ln -s ap_verify/bin/demo_cmds.conf
sbatch demo_run.sl
```

Any errors or failures will be reported.
Step 3
A “Data Butler” will be initialized to access the repository.
Step 4
For each of the expected data product types (listed in §4.2.2) and each of the expected units (PVIs, catalogs, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
Step 5
DIAObjects are currently only stored in a database, without shims to the Butler, so the existence of the database table and its non-empty contents will be verified by directly accessing it using sqlite3 and executing appropriate SQL queries.
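The direct-SQL existence check in Step 5 might look like the following sketch; the in-memory database and the DiaObject schema here are stand-ins for the actual Level 1 database file:

```python
import sqlite3

# Stand-in for opening the Level 1 database produced by ap_verify.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE DiaObject (diaObjectId INTEGER PRIMARY KEY, ra REAL, dec REAL)")
conn.execute("INSERT INTO DiaObject VALUES (1, 150.1, 2.2)")

# The actual check: the table exists and is non-empty.
(count,) = conn.execute("SELECT COUNT(*) FROM DiaObject").fetchone()
assert count > 0, "DiaObject table exists but is empty"
print("DiaObject rows:", count)
```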
Step 6
The DM Stack shall be initialized using the loadLSST script (as described in DRP-00-00).
Step 7
A “Data Butler” will be initialized to access the repository.
Step 8
For each of the expected data product types (listed in the Test Items section, §4.3.2) and each of the expected units (PVIs, coadds, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
Step 9
Verify that each of the single-visit, coadd, and difference image catalogs from HSC reprocessing and HiTS reprocessing (which may be the first source of regular difference images) provides measurements in flux units.
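A sketch of the flux-units check in Step 9, assuming column units are available as schema metadata; the column names and the nJy unit follow DPDD-style conventions but are illustrative here:

```python
# Stand-in mapping from catalog column name to its declared unit.
catalog_units = {
    "psfFlux": "nJy",
    "psfFluxErr": "nJy",
    "apFlux": "nJy",
    "ra": "deg",
}

FLUX_UNITS = {"nJy", "Jy"}

def flux_columns_in_flux_units(units):
    """True if every *Flux / *FluxErr column is expressed in a flux unit."""
    flux_cols = [c for c in units if c.lower().endswith(("flux", "fluxerr"))]
    return bool(flux_cols) and all(units[c] in FLUX_UNITS for c in flux_cols)

print(flux_columns_in_flux_units(catalog_units))
```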
LVV-T29 - Verify implementation of Raw Science Image Data Acquisition (DMS-REQ-0018)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Kian-Tat Lim |
Delegate to Prompt Services (Ingest raw data from L1 Test Stand DAQ while simulating all modes)
- LVV-8 - DMS-REQ-0018-V-01: Raw Science Image Data Acquisition
Step 1
Ingest raw data from L1 Test Stand DAQ, simulating each observing mode
Step 2
Observe image metadata is present and queryable
LVV-T30 - Verify implementation of Wavefront Sensor Data Acquisition (DMS-REQ-0020)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Kian-Tat Lim |
Delegate to Prompt Services (Ingest wavefront sensor data from L1 Test Stand DAQ while simulating all modes)
- LVV-9 - DMS-REQ-0020-V-01: Wavefront Sensor Data Acquisition
Step 1
Ingest wavefront sensor data from L1 Test Stand DAQ while simulating all modes
Step 2
Observe wavefront sensor data and metadata archived
LVV-T31 - Verify implementation of Crosstalk Corrected Science Image Data Acquisition (DMS-REQ-0022)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Kian-Tat Lim |
Image and EFD Archiving, Prompt Processing & Delegate to Prompt Services (Ingest crosstalk corrected data from L1 Test Stand DAQ while simulating all modes)
- LVV-10 - DMS-REQ-0022-V-01: Crosstalk Corrected Science Image Data Acquisition
Step 1
Inject signals of different relative strength
Step 2
Apply Camera cross-talk correction
Step 3
Verify that the DMS can import the crosstalk-corrected images
Step 4
Verify that images are corrected for crosstalk
LVV-T32 - Verify implementation of Raw Image Assembly (DMS-REQ-0024)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Kian-Tat Lim |
Delegate to Prompt Services (Ingest raw data from L1 Test Stand DAQ, observe image and metadata output)
- LVV-11 - DMS-REQ-0024-V-01: Raw Image Assembly
Step 1
Ingest data from L1 Camera Test Stand DAQ
Step 2
Simulate all different modes
Step 3
Verify that a raw image is constructed in the correct format
Step 4
Verify that a raw image is constructed with the correct metadata
LVV-T33 - Verify implementation of Raw Science Image Metadata (DMS-REQ-0068)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Kian-Tat Lim |
Ingest raw data from L1 Test Stand DAQ, observe image metadata is present and queryable
- LVV-28 - DMS-REQ-0068-V-01: Raw Science Image Metadata
Step 1
Ingest raw data from L1 Test Stand DAQ, simulating each observing mode
Step 2
Observe image metadata is present and queryable
Step 3
Ingest data from L1 Camera Test Stand DAQ
Step 4
Simulate all different modes
Step 5
Verify that a raw image is constructed in the correct format
Step 6
Verify that a raw image is constructed with the correct metadata
Step 7
Verify that the exposure start/end times, site metadata, telescope metadata, and camera metadata are stored in the DMS.
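Step 7's metadata check can be sketched as a required-keyword test; the FITS-style keyword names and header values below are placeholders, not the authoritative LSST raw-data keywords:

```python
# Assumed keyword set covering exposure start/end, observation ID,
# telescope, and camera metadata.
REQUIRED_KEYS = {"DATE-BEG", "DATE-END", "OBSID", "TELESCOP", "INSTRUME"}

header = {  # stand-in for a raw image header pulled from the DMS
    "DATE-BEG": "2022-11-01T03:14:15.9",
    "DATE-END": "2022-11-01T03:14:30.9",
    "OBSID": "AT_O_20221101_000042",
    "TELESCOP": "LSST AuxTelescope",
    "INSTRUME": "LATISS",
}

missing = REQUIRED_KEYS - header.keys()
assert not missing, f"missing required metadata: {sorted(missing)}"
print("all required metadata present")
```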
LVV-T34 - Verify implementation of Guider Calibration Data Acquisition (DMS-REQ-0265)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Kian-Tat Lim |
Ingest calibration frames from L1 Test Stand DAQ, execute CPP payloads, observe guider calibration products
- LVV-96 - DMS-REQ-0265-V-01: Guider Calibration Data Acquisition
Step 1
Ingest calibration frames from L1 Test Stand DAQ
Step 2
Execute CPP payloads
Step 3
Observe guider calibration products
LVV-T35 - Verify implementation of Nightly Data Accessible Within 24 hrs (DMS-REQ-0004)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Eric Bellm |
With the exception of alerts and Solar System Objects, all Level 1 Data Products shall be made public within time L1PublicT (LSR-REQ-0104) of the acquisition of the raw image data. Alerts shall be made available within time OTT1 (LSR-REQ-0101) from the conclusion of readout of the raw exposures used to generate each alert to the distribution of the alert to community distribution mechanisms. Solar System Object orbits shall, on average, be calculated before the following night’s observing finishes and the results shall be made available within time L1PublicT of those calculations being completed.
- LVV-4 - DMS-REQ-0004-V-01: Nightly Data Accessible Within 24 hrs
Step 1
The DM Stack and Alert Processing packages shall be initialized as described in LVV-T17 (AG-00-00).
Step 2
The alert generation processing will be executed using the verification cluster:
```bash
python ap_verify/bin/prepare_demo_slurm_files.py
# At present we must run a single ccd+visit to handle ingestion before
# parallel processing can begin
./ap_verify/bin/exec_demo_run_1ccd.sh 410915 25
ln -s ap_verify/bin/demo_run.sl
ln -s ap_verify/bin/demo_cmds.conf
sbatch demo_run.sl
```

Any errors or failures will be reported.
Step 3
A “Data Butler” will be initialized to access the repository.
Step 4
For each of the expected data product types (listed in §4.2.2) and each of the expected units (PVIs, catalogs, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
Step 5
DIAObjects are currently only stored in a database, without shims to the Butler, so the existence of the database table and its non-empty contents will be verified by directly accessing it using sqlite3 and executing appropriate SQL queries.
Step 6
Note: these instructions will need to be adjusted for Kubernetes.
Copy alert packets to local storage and change to that directory.
Step 7
Step 8
Start a consumer that monitors the full stream and logs only end-of-partition status messages:

```bash
docker service create \
  --name monitor_full \
  --network alert_stream_default \
  --constraint node.role==worker \
  -e PYTHONUNBUFFERED=0 \
  alert_stream python bin/monitorStream.py full-stream > monitor_log.txt
```
Step 9
Start a consumer that monitors the full stream and logs a deserialized version of every Nth packet (the command below should be adjusted to print only every Nth packet, and is given a distinct service name so it does not collide with the monitor service):

```bash
docker service create \
  --name print_full \
  --network alert_stream_default \
  --constraint node.role==worker \
  -e PYTHONUNBUFFERED=0 \
  alert_stream python bin/printStream.py full-stream > packet_log.txt
```
Step 10
Start a producer that reads alert packets from disk and loads them into the Kafka queue:

```bash
docker service create \
  --name sender \
  --network alert_stream_default \
  -v $PWD:/home/alert_stream/data:ro \
  -e PYTHONUNBUFFERED=0 \
  alert_stream python bin/sendAlertStream.py full-stream
```
Step 11
Examine output log files.
The monitor log should show end-of-partition messages such as:

```
topic:full-stream, partition:0, status:end, offset:1000, key:None, time:1528496269.734
```

and the packet log should show deserialized alert packets with contents matching the input packets.
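For reference, the end-of-partition status line can be parsed mechanically into its fields:

```python
# Parse one end-of-partition status line from the monitor log.
line = ("topic:full-stream, partition:0, status:end, "
        "offset:1000, key:None, time:1528496269.734")

# Each comma-separated field is a key:value pair; split only on the
# first colon so values like timestamps survive intact.
fields = dict(part.split(":", 1) for part in line.split(", "))
offset = int(fields["offset"])
print(fields["topic"], fields["status"], offset)
```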
Step 12
Time processing of data starting from (pre-ingested) raw files until an alert is available for distribution; verify that this time is less than OTT1.
Step 13
Time processing of data starting from (pre-ingested) raw files until the required data products are available in the Science Platform. Verify that this time is less than L1PublicT.
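The latency checks in Steps 12-13 reduce to timestamp arithmetic; the OTT1 and L1PublicT values below are assumed placeholders (the authoritative values live in LSR-REQ-0101 and LSR-REQ-0104), and the timestamps are invented:

```python
from datetime import datetime, timedelta

# Assumed requirement values for illustration only.
OTT1 = timedelta(seconds=60)
L1_PUBLIC_T = timedelta(hours=24)

# Invented stand-in timestamps for one processed visit.
readout_end = datetime(2022, 11, 1, 3, 14, 30)
alert_available = datetime(2022, 11, 1, 3, 15, 12)
products_available = datetime(2022, 11, 1, 9, 0, 0)

alert_latency = alert_available - readout_end
product_latency = products_available - readout_end
print("alert latency OK:", alert_latency <= OTT1)
print("product latency OK:", product_latency <= L1_PUBLIC_T)
```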
Step 14
Run MOPS on precursor data equivalent to one night of LSST observing and verify that Solar System Object orbits can be updated within 24 hours.
Step 15
Record time between completion of MOPS processing and availability of the updated SSObject catalogue through the Science Platform; verify this time is less than L1PublicT.
LVV-T36 - Verify implementation of Difference Exposures (DMS-REQ-0010)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Eric Bellm |
The DMS shall create a Difference Exposure from each Processed Visit Image by subtracting a re-projected, scaled, PSF-matched Template Image in the same passband.
- LVV-7 - DMS-REQ-0010-V-01: Difference Exposures
Step 1
The DM Stack and Alert Processing packages shall be initialized as described in LVV-T17 (AG-00-00).
Step 2
The alert generation processing will be executed using the verification cluster:
```bash
python ap_verify/bin/prepare_demo_slurm_files.py
# At present we must run a single ccd+visit to handle ingestion before
# parallel processing can begin
./ap_verify/bin/exec_demo_run_1ccd.sh 410915 25
ln -s ap_verify/bin/demo_run.sl
ln -s ap_verify/bin/demo_cmds.conf
sbatch demo_run.sl
```

Any errors or failures will be reported.
Step 3
A “Data Butler” will be initialized to access the repository.
Step 4
For each of the expected data product types (listed in §4.2.2) and each of the expected units (PVIs, catalogs, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
Step 5
DIAObjects are currently only stored in a database, without shims to the Butler, so the existence of the database table and its non-empty contents will be verified by directly accessing it using sqlite3 and executing appropriate SQL queries.
Step 6
Demonstrate successful creation of a template image from HSC PDR and DECam HiTS data. Demonstrate successful creation of a Difference Exposure for at least 10 other images from the survey, ideally over a range of airmasses. In particular, HiTS has 2013A u-band data. While the Blanco 4-m does have an ADC, there are still some chromatic effects, and we should demonstrate that we can successfully produce Difference Exposures and templates for different airmass bins.
LVV-T37 - Verify implementation of Difference Exposure Attributes (DMS-REQ-0074)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Eric Bellm |
For each Difference Exposure, the DMS shall store: the identity of the input exposures and related provenance information, and a set of metadata attributes including at least a representation of the PSF matching kernel used in the differencing.
- LVV-32 - DMS-REQ-0074-V-01: Difference Exposure Attributes
Step 1
The DM Stack and Alert Processing packages shall be initialized as described in LVV-T17 (AG-00-00).
Step 2
The alert generation processing will be executed using the verification cluster:
```bash
python ap_verify/bin/prepare_demo_slurm_files.py
# At present we must run a single ccd+visit to handle ingestion before
# parallel processing can begin
./ap_verify/bin/exec_demo_run_1ccd.sh 410915 25
ln -s ap_verify/bin/demo_run.sl
ln -s ap_verify/bin/demo_cmds.conf
sbatch demo_run.sl
```

Any errors or failures will be reported.
Step 3
A “Data Butler” will be initialized to access the repository.
Step 4
For each of the expected data product types (listed in §4.2.2) and each of the expected units (PVIs, catalogs, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
Step 5
DIAObjects are currently only stored in a database, without shims to the Butler, so the existence of the database table and its non-empty contents will be verified by directly accessing it using sqlite3 and executing appropriate SQL queries.
Step 6
For each of the HSC PDR and DECam HiTS data: set up three different templates and run subtractions on 10 different images from at least two different filters. Verify that we can recover the provenance information about which template was used for each subtraction and which input images were used for that template, and that we can successfully extract the PSF matching kernel.
LVV-T38 - Verify implementation of Processed Visit Images (DMS-REQ-0069)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Eric Bellm |
The DMS shall produce Processed Visit Images, in which the corresponding raw sensor array data has been trimmed of overscan and corrected for instrumental signature. Images obtained in pairs during a standard visit are combined.
- LVV-29 - DMS-REQ-0069-V-01: Processed Visit Images
Step 1
Process HSC data and DECam data. Verify that Processed Visit Images are generated at the correct size and with significant instrumental artifacts removed.
Step 2
Run camera test stand data through full acquisition+backbone+ISR.
Step 3
Run simulated LSST data with calibrations through the prompt processing system and inspect Processed Visit Images to verify that they have been cleaned of significant artifacts and are of the correct shape and described orientation.
LVV-T39 - Verify implementation of Generate Photometric Zeropoint for Visit Image (DMS-REQ-0029)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Jim Bosch |
Alert Production & Delegate to Alert Production
- LVV-12 - DMS-REQ-0029-V-01: Generate Photometric Zeropoint for Visit Image
Step 1
LVV-T40 - Verify implementation of Generate WCS for Visit Images (DMS-REQ-0030)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Jim Bosch |
Alert Production & Delegate to Alert Production
- LVV-13 - DMS-REQ-0030-V-01: Generate WCS for Visit Images
Step 1
LVV-T41 - Verify implementation of Generate PSF for Visit Images (DMS-REQ-0070)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Jim Bosch |
Alert Production & Delegate to Alert Production
- LVV-30 - DMS-REQ-0070-V-01: Generate PSF for Visit Images
Step 1
LVV-T42 - Verify implementation of Processed Visit Image Content (DMS-REQ-0072)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Jim Bosch |
Alert Production & Delegate to Alert Production
- LVV-31 - DMS-REQ-0072-V-01: Processed Visit Image Content
Step 1
LVV-T43 - Verify implementation of Background Model Calculation (DMS-REQ-0327)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Jim Bosch |
Alert Production & Delegate to Alert Production
- LVV-158 - DMS-REQ-0327-V-01: Background Model Calculation
Step 1
LVV-T44 - Verify implementation of Documenting Image Characterization (DMS-REQ-0328)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Jim Bosch |
Alert Production & Delegate to Alert Production
- LVV-159 - DMS-REQ-0328-V-01: Documenting Image Characterization
Step 1
LVV-T45 - Verify implementation of Level 1 Data Quality Report Definition (DMS-REQ-0097)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Eric Bellm |
QC System, Alert Production & Ingest raw data from L1 Test Stand DAQ, execute AP, load Prompt QC, observe telemetry and report
- LVV-39 - DMS-REQ-0097-V-01: Level 1 Data Quality Report Definition
Step 1
LVV-T46 - Verify implementation of Level 1 Performance Report Definition (DMS-REQ-0099)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Eric Bellm |
Prompt Processing, QC System & Execute single-day operations rehearsal, observe report
- LVV-41 - DMS-REQ-0099-V-01: Level 1 Performance Report Definition
Step 1
LVV-T47 - Verify implementation of Level 1 Calibration Report Definition (DMS-REQ-0101)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Eric Bellm |
OCS Driven Batch, Raw Calibration Validation, Daily Calibration Products Update & Execute single-day operations rehearsal, observe report
- LVV-43 - DMS-REQ-0101-V-01: Level 1 Calibration Report Definition
Step 1
LVV-T48 - Verify implementation of Exposure Catalog (DMS-REQ-0266)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Jim Bosch |
Precursor data, execute AP, load results, observe catalog contents
- LVV-97 - DMS-REQ-0266-V-01: Exposure Catalog
Step 1
Verify that Exposure Catalogs contain the required elements
Step 2
The DM Stack shall be initialized using the loadLSST script (as described in DRP-00-00).
Step 3
A “Data Butler” will be initialized to access the repository.
Step 4
For each of the expected data product types (listed in the Test Items section, §4.3.2) and each of the expected units (PVIs, coadds, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
LVV-T49 - Verify implementation of DIASource Catalog (DMS-REQ-0269)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Eric Bellm |
Precursor data, execute AP, load results, observe catalog contents
- LVV-100 - DMS-REQ-0269-V-01: DIASource Catalog
Step 1
Verify that products are produced for the DIASource catalog
Step 2
The DM Stack and Alert Processing packages shall be initialized as described in LVV-T17 (AG-00-00).
Step 3
The alert generation processing will be executed using the verification cluster:
```bash
python ap_verify/bin/prepare_demo_slurm_files.py
# At present we must run a single ccd+visit to handle ingestion before
# parallel processing can begin
./ap_verify/bin/exec_demo_run_1ccd.sh 410915 25
ln -s ap_verify/bin/demo_run.sl
ln -s ap_verify/bin/demo_cmds.conf
sbatch demo_run.sl
```

Any errors or failures will be reported.
Step 4
A “Data Butler” will be initialized to access the repository.
Step 5
For each of the expected data product types (listed in §4.2.2) and each of the expected units (PVIs, catalogs, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
Step 6
DIAObjects are currently only stored in a database, without shims to the Butler, so the existence of the database table and its non-empty contents will be verified by directly accessing it using sqlite3 and executing appropriate SQL queries.
LVV-T50 - Verify implementation of Faint DIASource Measurements (DMS-REQ-0270)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Eric Bellm |
Delegate to Alert Production (Precursor data, execute AP, observe measurements are present)
- LVV-101 - DMS-REQ-0270-V-01: Faint DIASource Measurements
Input Data
DECam HiTS data.
Step 1
The DM Stack and Alert Processing packages shall be initialized as described in LVV-T17 (AG-00-00).
Step 2
The alert generation processing will be executed using the verification cluster:
```bash
python ap_verify/bin/prepare_demo_slurm_files.py
# At present we must run a single ccd+visit to handle ingestion before
# parallel processing can begin
./ap_verify/bin/exec_demo_run_1ccd.sh 410915 25
ln -s ap_verify/bin/demo_run.sl
ln -s ap_verify/bin/demo_cmds.conf
sbatch demo_run.sl
```

Any errors or failures will be reported.
Step 3
A “Data Butler” will be initialized to access the repository.
Step 4
For each of the expected data product types (listed in §4.2.2) and each of the expected units (PVIs, catalogs, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
Step 5
DIAObjects are currently only stored in a database, without shims to the Butler, so the existence of the database table and its non-empty contents will be verified by directly accessing it using sqlite3 and executing appropriate SQL queries.
Step 6
As an example of selecting with constraints, re-run source detection as an afterburner to select isolated sources (defined as more than 2 arcseconds away from any other object in the single-image-depth catalog) that are fainter than the fiducial transSNR cut.
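The isolation criterion in Step 6 can be sketched with a brute-force pairwise separation check; the positions are synthetic, and the flat-sky approximation is an assumption that holds only for a small patch:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic catalog positions (degrees) over a ~0.1 deg patch.
ra = rng.uniform(150.0, 150.1, 500)
dec = rng.uniform(2.0, 2.1, 500)

ISOLATION_RADIUS_ARCSEC = 2.0

def isolated_mask(ra, dec, radius_arcsec):
    """True where a source has no neighbour within radius_arcsec.
    Flat-sky approximation with a cos(dec) correction on RA offsets."""
    cosd = np.cos(np.deg2rad(dec.mean()))
    dra = (ra[:, None] - ra[None, :]) * cosd
    ddec = dec[:, None] - dec[None, :]
    sep = np.hypot(dra, ddec) * 3600.0  # degrees -> arcseconds
    np.fill_diagonal(sep, np.inf)       # ignore each source's self-distance
    return sep.min(axis=1) > radius_arcsec

mask = isolated_mask(ra, dec, ISOLATION_RADIUS_ARCSEC)
print(f"{int(mask.sum())} of {mask.size} sources are isolated")
```

A KD-tree would replace the O(n²) pairwise matrix for full-catalog scales.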
LVV-T51 - Verify implementation of DIAObject Catalog (DMS-REQ-0271)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Eric Bellm |
Precursor data, execute AP, load results, observe catalog contents
- LVV-102 - DMS-REQ-0271-V-01: DIAObject Catalog
Step 1
The DM Stack shall be initialized using the loadLSST script (as described in LVV-T17 - AG-00-00).
Step 2
A “Data Butler” will be initialized to access the repository.
Step 3
DIASource records will be accessed by querying the Butler, then examined interactively at a Python prompt.
Step 4
The DM Stack and Alert Processing packages shall be initialized as described in LVV-T17 (AG-00-00).
Step 5
The alert generation processing will be executed using the verification cluster:
```bash
python ap_verify/bin/prepare_demo_slurm_files.py
# At present we must run a single ccd+visit to handle ingestion before
# parallel processing can begin
./ap_verify/bin/exec_demo_run_1ccd.sh 410915 25
ln -s ap_verify/bin/demo_run.sl
ln -s ap_verify/bin/demo_cmds.conf
sbatch demo_run.sl
```

Any errors or failures will be reported.
Step 6
A “Data Butler” will be initialized to access the repository.
Step 7
For each of the expected data product types (listed in §4.2.2) and each of the expected units (PVIs, catalogs, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
Step 8
DIAObjects are currently only stored in a database, without shims to the Butler, so the existence of the database table and its non-empty contents will be verified by directly accessing it using sqlite3 and executing appropriate SQL queries.
Step 9
The DM Stack shall be initialized using the loadLSST script (as described in LVV-T17 - AG-00-00).
Step 10
sqlite3 or Python’s sqlalchemy module will be used to access the Level 1 database.
Step 11
Verify that DIAObjects have diaNearbyObjMaxStar and diaNearbyObjMaxGalaxies entries that point to the Object catalog and are within diaNearbyObjRadius; that the probability of association is recorded; and that the required DIAObject properties are present.
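Step 11's consistency check can be sketched in SQL; the stand-in schema below uses simplified column names (nearbyObjId, etc.) rather than the exact DPDD names, and the radius value is assumed:

```python
import sqlite3

# Minimal stand-in for the Level 1 database with an Object table and
# a DiaObject table carrying nearby-object associations.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE Object (objectId INTEGER PRIMARY KEY);
    CREATE TABLE DiaObject (
        diaObjectId INTEGER PRIMARY KEY,
        nearbyObjId INTEGER REFERENCES Object(objectId),
        nearbyObjSeparation REAL,   -- arcseconds
        nearbyObjAssocProb REAL
    );
    INSERT INTO Object VALUES (10), (11);
    INSERT INTO DiaObject VALUES (1, 10, 0.3, 0.98), (2, 11, 0.8, 0.90);
""")

RADIUS = 2.0  # assumed association radius in arcseconds

# Count rows violating any constraint: dangling Object reference,
# separation beyond the radius, or an out-of-range probability.
(bad,) = conn.execute("""
    SELECT COUNT(*) FROM DiaObject d
    LEFT JOIN Object o ON d.nearbyObjId = o.objectId
    WHERE o.objectId IS NULL
       OR d.nearbyObjSeparation > ?
       OR d.nearbyObjAssocProb NOT BETWEEN 0 AND 1
""", (RADIUS,)).fetchone()
print("DiaObject rows violating association constraints:", bad)
```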
LVV-T52 - Verify implementation of DIAObject Attributes (DMS-REQ-0272)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Eric Bellm |
Alert Production & Delegate to Alert Production
- LVV-103 - DMS-REQ-0272-V-01: DIAObject Attributes
Step 1
The DM Stack and Alert Processing packages shall be initialized as described in LVV-T17 (AG-00-00).
Step 2
The alert generation processing will be executed using the verification cluster, and any errors or failures will be reported:

```bash
python ap_verify/bin/prepare_demo_slurm_files.py
# At present we must run a single ccd+visit to handle ingestion before
# parallel processing can begin
./ap_verify/bin/exec_demo_run_1ccd.sh 410915 25
ln -s ap_verify/bin/demo_run.sl
ln -s ap_verify/bin/demo_cmds.conf
sbatch demo_run.sl
```
Step 3
A “Data Butler” will be initialized to access the repository.
Step 4
For each of the expected data product types (listed in §4.2.2) and each of the expected units (PVIs, catalogs, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
Step 5
DIAObjects are currently only stored in a database, without shims to the Butler, so the existence of the database table and its non-empty contents will be verified by directly accessing it using sqlite3 and executing appropriate SQL queries.
LVV-T53 - Verify implementation of SSObject Catalog (DMS-REQ-0273)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Eric Bellm |
Execute AP on precursor data, execute MOPS, load results, observe catalog contents
- LVV-104 - DMS-REQ-0273-V-01: SSObject Catalog
Step 1
The DM Stack and Alert Processing packages shall be initialized as described in LVV-T17 (AG-00-00).
Step 2
The alert generation processing will be executed using the verification cluster, and any errors or failures will be reported:

```bash
python ap_verify/bin/prepare_demo_slurm_files.py
# At present we must run a single ccd+visit to handle ingestion before
# parallel processing can begin
./ap_verify/bin/exec_demo_run_1ccd.sh 410915 25
ln -s ap_verify/bin/demo_run.sl
ln -s ap_verify/bin/demo_cmds.conf
sbatch demo_run.sl
```
Step 3
A “Data Butler” will be initialized to access the repository.
Step 4
For each of the expected data product types (listed in §4.2.2) and each of the expected units (PVIs, catalogs, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
Step 5
DIAObjects are currently only stored in a database, without shims to the Butler, so the existence of the database table and its non-empty contents will be verified by directly accessing it using sqlite3 and executing appropriate SQL queries.
Step 6
Run the MOPS pipeline on the Prompt Products database.
Step 7
Inspect the SSObject catalog and verify the presence of the required elements (LVV-104).
LVV-T54 - Verify implementation of Alert Content (DMS-REQ-0274)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Eric Bellm |
Delegate to Alert Production
- LVV-105 - DMS-REQ-0274-V-01: Alert Content
Step 1
The DM Stack and Alert Processing packages shall be initialized as described in LVV-T17 (AG-00-00).
Step 2
The alert generation processing will be executed using the verification cluster, and any errors or failures will be reported:

```bash
python ap_verify/bin/prepare_demo_slurm_files.py
# At present we must run a single ccd+visit to handle ingestion before
# parallel processing can begin
./ap_verify/bin/exec_demo_run_1ccd.sh 410915 25
ln -s ap_verify/bin/demo_run.sl
ln -s ap_verify/bin/demo_cmds.conf
sbatch demo_run.sl
```
Step 3
A “Data Butler” will be initialized to access the repository.
Step 4
For each of the expected data product types (listed in §4.2.2) and each of the expected units (PVIs, catalogs, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
Step 5
DIAObjects are currently only stored in a database, without shims to the Butler, so the existence of the database table and its non-empty contents will be verified by directly accessing it using sqlite3 and executing appropriate SQL queries.
Step 6
Note: these instructions need to be adjusted for Kubernetes.
Copy the alert packets to local storage and change to that directory.
Step 7
Step 8
Start a consumer that monitors the full stream and logs only End of Partition status messages:
```bash
docker service create \
  --name monitor_full \
  --network alert_stream_default \
  --constraint node.role==worker \
  -e PYTHONUNBUFFERED=0 \
  alert_stream python bin/monitorStream.py full-stream > monitor_log.txt
```
Step 9
Start a consumer that monitors the full stream and logs a deserialized version of every Nth packet. (The command below is copied from the monitoring step and must be adjusted to print every Nth packet; the service name as written duplicates the monitor service's and will also need to be changed.)

```bash
docker service create \
  --name monitor_full \
  --network alert_stream_default \
  --constraint node.role==worker \
  -e PYTHONUNBUFFERED=0 \
  alert_stream python bin/printStream.py full-stream > packet_log.txt
```
Step 10
Start a producer that reads alert packets from disk and loads them into the Kafka queue:
```bash
docker service create \
  --name sender \
  --network alert_stream_default \
  -v $PWD:/home/alert_stream/data:ro \
  -e PYTHONUNBUFFERED=0 \
  alert_stream python bin/sendAlertStream.py full-stream
```
Step 11
Examine output log files.
The monitor log should show end-of-partition messages such as:

```
topic:full-stream, partition:0, status:end, offset:1000, key:None, time:1528496269.734
```

and the packet log should show deserialized alert packets whose contents match the input packets.
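The status lines are simple enough to check programmatically. The sketch below parses one such line into fields; the parser's assumptions about the layout are inferred from the sample line above, not from the alert_stream code:

```python
def parse_status(line):
    """Parse a 'key:value, key:value' status line into a dict."""
    fields = {}
    for part in line.split(", "):
        key, _, value = part.partition(":")
        fields[key] = value
    return fields

msg = ("topic:full-stream, partition:0, status:end, "
       "offset:1000, key:None, time:1528496269.734")
fields = parse_status(msg)
print(fields["status"], int(fields["offset"]))  # end 1000
```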
Step 12
Examine the serialized alert packets to confirm the presence of the required elements (LVV-105).
LVV-T55 - Verify implementation of DIAForcedSource Catalog (DMS-REQ-0317)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Eric Bellm |
Precursor data, execute AP, load results, observe catalog contents
- LVV-148 - DMS-REQ-0317-V-01: DIAForcedSource Catalog
Step 1
The DM Stack and Alert Processing packages shall be initialized as described in LVV-T17 (AG-00-00).
Step 2
The alert generation processing will be executed using the verification cluster, and any errors or failures will be reported:

```bash
python ap_verify/bin/prepare_demo_slurm_files.py
# At present we must run a single ccd+visit to handle ingestion before
# parallel processing can begin
./ap_verify/bin/exec_demo_run_1ccd.sh 410915 25
ln -s ap_verify/bin/demo_run.sl
ln -s ap_verify/bin/demo_cmds.conf
sbatch demo_run.sl
```
Step 3
A “Data Butler” will be initialized to access the repository.
Step 4
For each of the expected data product types (listed in §4.2.2) and each of the expected units (PVIs, catalogs, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
Step 5
DIAObjects are currently only stored in a database, without shims to the Butler, so the existence of the database table and its non-empty contents will be verified by directly accessing it using sqlite3 and executing appropriate SQL queries.
LVV-T56 - Verify implementation of Characterizing Variability (DMS-REQ-0319)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Eric Bellm |
Delegate to Alert Production
- LVV-150 - DMS-REQ-0319-V-01: Characterizing Variability
Step 1
The DM Stack and Alert Processing packages shall be initialized as described in LVV-T17 (AG-00-00).
Step 2
The alert generation processing will be executed using the verification cluster, and any errors or failures will be reported:

```bash
python ap_verify/bin/prepare_demo_slurm_files.py
# At present we must run a single ccd+visit to handle ingestion before
# parallel processing can begin
./ap_verify/bin/exec_demo_run_1ccd.sh 410915 25
ln -s ap_verify/bin/demo_run.sl
ln -s ap_verify/bin/demo_cmds.conf
sbatch demo_run.sl
```
Step 3
A “Data Butler” will be initialized to access the repository.
Step 4
For each of the expected data product types (listed in §4.2.2) and each of the expected units (PVIs, catalogs, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
Step 5
DIAObjects are currently only stored in a database, without shims to the Butler, so the existence of the database table and its non-empty contents will be verified by directly accessing it using sqlite3 and executing appropriate SQL queries.
Step 6
Verify that the issued alerts contain measurements made within the diaCharacterizationCutoff window.
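An illustrative sketch of this check; the 365-day cutoff value, the field names, and the MJD values are hypothetical stand-ins for the actual diaCharacterizationCutoff and alert packet contents:

```python
# Cutoff in days before the alert epoch; a stand-in value.
DIA_CHARACTERIZATION_CUTOFF_DAYS = 365.0

def measurements_in_window(measurements, alert_mjd,
                           cutoff=DIA_CHARACTERIZATION_CUTOFF_DAYS):
    """Return the measurements taken within `cutoff` days before the alert."""
    return [m for m in measurements if 0.0 <= alert_mjd - m["mjd"] <= cutoff]

history = [{"mjd": 59000.0, "flux": 1.2},
           {"mjd": 59300.0, "flux": 1.5},
           {"mjd": 58500.0, "flux": 0.9}]
recent = measurements_in_window(history, alert_mjd=59350.0)
print(len(recent))  # 2
```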
LVV-T57 - Verify implementation of Calculating SSObject Parameters (DMS-REQ-0323)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Eric Bellm |
Precursor data, execute MOPS, load results, observe functions usable in queries
- LVV-154 - DMS-REQ-0323-V-01: Calculating SSObject Parameters
Step 1
Run the MOPS pipeline on the Prompt Products database.
Step 2
Step 3
Inspect the SSObject catalog and verify the presence of the required elements (LVV-104).
Step 4
Compute the phase angle and the reduced and absolute asteroid magnitudes for objects identified in the SSObject catalog.
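The quantities in this step follow from standard relations: the phase angle from the Sun-asteroid-observer triangle, the reduced magnitude by scaling the apparent magnitude to unit helio- and geocentric distances, and the absolute magnitude via the (H, G) phase function. A sketch with illustrative distances (not values from the test dataset):

```python
import math

def phase_angle_deg(r_au, delta_au, r_earth_au=1.0):
    """Sun-asteroid-observer angle from the law of cosines."""
    cos_alpha = (r_au**2 + delta_au**2 - r_earth_au**2) / (2 * r_au * delta_au)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_alpha))))

def reduced_magnitude(m_app, r_au, delta_au):
    """Apparent magnitude scaled to unit helio- and geocentric distance."""
    return m_app - 5.0 * math.log10(r_au * delta_au)

def absolute_magnitude_hg(m_app, r_au, delta_au, alpha_deg, g=0.15):
    """H in the standard (H, G) phase-function system."""
    a = math.tan(math.radians(alpha_deg) / 2.0)
    phi1 = math.exp(-3.33 * a**0.63)
    phi2 = math.exp(-1.87 * a**1.22)
    return (reduced_magnitude(m_app, r_au, delta_au)
            + 2.5 * math.log10((1 - g) * phi1 + g * phi2))

alpha = phase_angle_deg(r_au=2.5, delta_au=1.6)
print(round(alpha, 1), round(absolute_magnitude_hg(18.0, 2.5, 1.6, alpha), 2))
```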
LVV-T58 - Verify implementation of Matching DIASources to Objects (DMS-REQ-0324)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Eric Bellm |
Execute DRP and AP on precursor data, load results, confirm crossmatch table or view is present
- LVV-155 - DMS-REQ-0324-V-01: Matching DIASources to Objects
Step 1
The DM Stack shall be initialized using the loadLSST script (as described in DRP-00-00).
Step 2
A “Data Butler” will be initialized to access the repository.
Step 3
For each of the expected data product types (listed in Test Items section §4.3.2) and each of the expected units (PVIs, coadds, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
Step 4
The DM Stack and Alert Processing packages shall be initialized as described in LVV-T17 (AG-00-00).
Step 5
The alert generation processing will be executed using the verification cluster, and any errors or failures will be reported:

```bash
python ap_verify/bin/prepare_demo_slurm_files.py
# At present we must run a single ccd+visit to handle ingestion before
# parallel processing can begin
./ap_verify/bin/exec_demo_run_1ccd.sh 410915 25
ln -s ap_verify/bin/demo_run.sl
ln -s ap_verify/bin/demo_cmds.conf
sbatch demo_run.sl
```
Step 6
A “Data Butler” will be initialized to access the repository.
Step 7
For each of the expected data product types (listed in §4.2.2) and each of the expected units (PVIs, catalogs, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
Step 8
DIAObjects are currently only stored in a database, without shims to the Butler, so the existence of the database table and its non-empty contents will be verified by directly accessing it using sqlite3 and executing appropriate SQL queries.
Step 9
Verify that a cross-match table between the Prompt DIASources and DRP Objects is available.
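A minimal sketch of checking that the cross-match relation is present, using an in-memory sqlite3 fixture; the table, view, and column names are hypothetical stand-ins for the actual schema:

```python
import sqlite3

def relation_exists(conn, name):
    """True if `name` is present as a table or a view."""
    row = conn.execute(
        "SELECT 1 FROM sqlite_master WHERE type IN ('table','view') AND name=?",
        (name,)).fetchone()
    return row is not None

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE DiaSource (diaSourceId INTEGER PRIMARY KEY, objectId INTEGER);
CREATE TABLE Object (objectId INTEGER PRIMARY KEY);
CREATE VIEW DiaSource_Object_Match AS
    SELECT d.diaSourceId, o.objectId
    FROM DiaSource d JOIN Object o ON d.objectId = o.objectId;
""")
print(relation_exists(conn, "DiaSource_Object_Match"))  # True
```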
LVV-T59 - Verify implementation of Regenerating L1 Data Products During Data Release Processing (DMS-REQ-0325)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Kian-Tat Lim |
Data Release Production & Delegate to DRP
- LVV-156 - DMS-REQ-0325-V-01: Regenerating L1 Data Products During Data Release Processing
Step 1
Execute DRP
Step 2
Observe production of difference image data products
LVV-T60 - Verify implementation of Publishing predicted visit schedule (DMS-REQ-0353)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Eric Bellm |
Delegate to PPP
Pointing Prediction Publishing
- LVV-184 - DMS-REQ-0353-V-01: Publishing predicted visit schedule
Step 1
LVV-T61 - Verify implementation of Associate Sources to Objects (DMS-REQ-0034)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Jim Bosch |
Data Release Production & Delegate to DRP
- LVV-16 - DMS-REQ-0034-V-01: Associate Sources to Objects
Step 1
The DM Stack shall be initialized using the loadLSST script (as described in DRP-00-00).
Step 2
A “Data Butler” will be initialized to access the repository.
Step 3
For each of the expected data product types (listed in Test Items section §4.3.2) and each of the expected units (PVIs, coadds, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
Step 4
Verify that sources are associated with objects.
Step 5
Verify that each object lists sources that are reasonably near it.
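A toy sketch of these two checks: every source's object pointer must resolve, and each source should lie near its object. The flat-sky separation, the 1-arcsecond threshold, and the records themselves are illustrative assumptions:

```python
# Hypothetical in-memory stand-ins for Object and Source records.
objects = {100: (150.00, 2.00), 200: (150.10, 2.10)}   # objectId -> (ra, dec)
sources = [  # (sourceId, objectId, ra, dec)
    (1, 100, 150.0001, 2.0000),
    (2, 100, 149.9999, 2.0001),
    (3, 200, 150.1000, 2.0999),
]

MAX_SEP_DEG = 1.0 / 3600.0  # 1 arcsec, an assumed sanity threshold

def check_associations(sources, objects, max_sep=MAX_SEP_DEG):
    for src_id, obj_id, ra, dec in sources:
        assert obj_id in objects, f"source {src_id} points at a missing object"
        ora, odec = objects[obj_id]
        sep = ((ra - ora) ** 2 + (dec - odec) ** 2) ** 0.5  # flat-sky approx
        assert sep <= max_sep, f"source {src_id} is too far from its object"

check_associations(sources, objects)
print("all sources associated with nearby objects")
```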
LVV-T62 - Verify implementation of Provide PSF for Coadded Images (DMS-REQ-0047)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Jim Bosch |
Data Release Production & Delegate to DRP
- LVV-20 - DMS-REQ-0047-V-01: Provide PSF for Coadded Images
Fully covered by preconditions for LVV-T16.
Step 1
Step 2
The DM Stack shall be initialized using the loadLSST script (as described in LVV-T10 - DRP-00-00)
Step 3
A “Data Butler” will be initialized to access the repository.
Step 4
For each combination of tract/patch/filter, the PVI will be retrieved from the Butler, and the existence of all components described in Test items section §4.6.2 will be verified.
Step 5
Scripts from the pipe_analysis package will be run on every visit to check for the presence of data products and to make plots.
Step 6
Ten patches will be chosen at random and inspected by eye for unmasked artifacts.
Step 7
Select Objects classified as point sources on 10 different coadd images (including all bands). Evaluate the PSF model at the positions of these Objects, and verify that subtracting a scaled version of the PSF model from the coadd image yields residuals consistent with pure noise.
LVV-T63 - Verify implementation of Produce Images for EPO (DMS-REQ-0103)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Gregory Dubois-Felsmann |
Delegate to DRP
- LVV-45 - DMS-REQ-0103-V-01: Produce Images for EPO
This test will verify that the image data products called out in LSE-131
have been produced. In order for that to be successful, as a
precondition the inputs to that production must exist.
As the only currently mandated image data production in LSE-131 is that
of a color all-sky HiPS map down to 1 arcsecond resolution, the
prerequisite inputs to that must be the single-filter coadds in the
bands required by the yet-to-be-specified color prescription.
Depending on the test dataset used for different runs of this test over time, e.g., precursor or LSST-commissioning data, the size of the resulting HiPS image map will vary.
Step 1
The DM Stack shall be initialized using the loadLSST script (as described in DRP-00-00).
Step 2
A “Data Butler” will be initialized to access the repository.
Step 3
For each of the expected data product types (listed in Test Items section §4.3.2) and each of the expected units (PVIs, coadds, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
Step 4
Verify that a HiPS image map covering the LSST survey area, with a limiting depth yielding 1 arcsecond resolution, has been produced matching the color prescriptions provided by EPO (in updates to LSE-131 which are expected to be made "once ComCam data is available").
Step 5
Place the image map in a location accessible to a Firefly and an Aladin Lite client, ideally with the client running in the EPO data systems environment.
Step 6
Use Firefly to manually explore the image map at the largest scales to verify coverage of the entire sky. Sample in various locations to confirm the 1 arcsecond maximum depth. Confirm using Aladin Lite that the format of the image map is supported by this common community tool.
Step 7
Verify programmatically, perhaps both by sampling a variety of locations, and by counting the tiles created at the 1-arcsecond-resolution depth, that the map is complete and meets its specifications.
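The tile count at the required depth can be estimated from HiPS geometry, assuming the standard 512x512-pixel HEALPix tiles. The sketch below finds the lowest HiPS order whose pixel scale reaches 1 arcsecond; it is only an estimate of the expected count, to be compared against the tiles actually produced:

```python
import math

TILE_PIX = 512  # HiPS tiles are 512x512 pixels by convention

def hips_pixel_scale_arcsec(order):
    """Mean angular size of one HiPS pixel at a given order."""
    n_pix = 12 * 4**order * TILE_PIX**2   # HEALPix pixels over the sphere
    omega_sr = 4.0 * math.pi / n_pix      # solid angle per pixel
    return math.degrees(math.sqrt(omega_sr)) * 3600.0

def order_for_scale(target_arcsec):
    """Smallest order whose pixel scale is at or below the target."""
    order = 0
    while hips_pixel_scale_arcsec(order) > target_arcsec:
        order += 1
    return order

order = order_for_scale(1.0)
print(order, 12 * 4**order)  # order and total tile count at that order
```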
Step 8
Apply an IVOA-community HiPS service validation tool, if available, to the service location.
Step 9
Verify that the HiPS map created is in a location accessible to the EPO data systems.
LVV-T64 - Verify implementation of Coadded Image Provenance (DMS-REQ-0106)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Jim Bosch |
Delegate to DRP
- LVV-46 - DMS-REQ-0106-V-01: Coadded Image Provenance
Step 1
Step 2
The DM Stack shall be initialized using the loadLSST script (as described in DRP-00-00).
Step 3
A “Data Butler” will be initialized to access the repository.
Step 4
For each of the expected data product types (listed in Test Items section §4.3.2) and each of the expected units (PVIs, coadds, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
Step 5
Query and verify the provenance of the input images and the software versions that went into producing the coadds.
Step 6
Test regenerating 10 different coadd tract+patch combinations from the recorded provenance information.
LVV-T65 - Verify implementation of Source Catalog (DMS-REQ-0267)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Jim Bosch |
Data Release Production, Parallel Distributed Database & Precursor data, execute DRP, load results, observe catalog contents
- LVV-98 - DMS-REQ-0267-V-01: Source Catalog
Step 1
Step 2
The DM Stack shall be initialized using the loadLSST script (as described in DRP-00-00).
Step 3
A “Data Butler” will be initialized to access the repository.
Step 4
For each of the expected data product types (listed in Test Items section §4.3.2) and each of the expected units (PVIs, coadds, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
LVV-T66 - Verify implementation of Forced-Source Catalog (DMS-REQ-0268)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Jim Bosch |
Precursor data, execute DRP, load results, observe catalog contents
- LVV-99 - DMS-REQ-0268-V-01: Forced-Source Catalog
Step 1
The DM Stack and Alert Processing packages shall be initialized as described in LVV-T17 (AG-00-00).
Step 2
The alert generation processing will be executed using the verification cluster, and any errors or failures will be reported:

```bash
python ap_verify/bin/prepare_demo_slurm_files.py
# At present we must run a single ccd+visit to handle ingestion before
# parallel processing can begin
./ap_verify/bin/exec_demo_run_1ccd.sh 410915 25
ln -s ap_verify/bin/demo_run.sl
ln -s ap_verify/bin/demo_cmds.conf
sbatch demo_run.sl
```
Step 3
A “Data Butler” will be initialized to access the repository.
Step 4
For each of the expected data product types (listed in §4.2.2) and each of the expected units (PVIs, catalogs, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
Step 5
DIAObjects are currently only stored in a database, without shims to the Butler, so the existence of the database table and its non-empty contents will be verified by directly accessing it using sqlite3 and executing appropriate SQL queries.
Step 6
The DM Stack shall be initialized using the loadLSST script (as described in DRP-00-00).
Step 7
A “Data Butler” will be initialized to access the repository.
Step 8
For each of the expected data product types (listed in Test Items section §4.3.2) and each of the expected units (PVIs, coadds, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
Step 9
Verify that entries exist in the forced-photometry table for every coadd object, for each PVI on which the object should appear.
Step 10
Verify that entries exist in a forced-photometry table for all DIAObjects in each image.
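A sketch of the completeness check for the forced-photometry table, using an in-memory sqlite3 fixture with hypothetical table names; for simplicity it assumes every object should appear on every visit:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Object (objectId INTEGER PRIMARY KEY);
CREATE TABLE Visit  (visitId INTEGER PRIMARY KEY);
CREATE TABLE ForcedSource (objectId INTEGER, visitId INTEGER);
INSERT INTO Object VALUES (1), (2);
INSERT INTO Visit  VALUES (10), (11);
INSERT INTO ForcedSource VALUES (1,10),(1,11),(2,10),(2,11);
""")

-- = None  # placeholder removed; see query below
```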
LVV-T67 - Verify implementation of Object Catalog (DMS-REQ-0275)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Jim Bosch |
Precursor data, execute DRP, load results, observe catalog contents
- LVV-106 - DMS-REQ-0275-V-01: Object Catalog
Input Data
DECam HiTS data (raw science images and master calibrations)
HSC "RC2" data (raw science images and master calibrations)
Step 1
Load the LSST DM Stack.
Step 2
Run the single-frame processing and self-calibration steps of the DRP pipeline.
Step 3
Insert simulated sources into all single-frame images, including:
- static objects (e.g. galaxies), including some too faint to be detectable in single-epoch images;
- objects with static positions that are sufficiently bright and variable that they should be detectable in single-epoch difference images;
- transient objects that appear in only a few epochs;
- stars with significant proper motions and parallaxes, some below the single-epoch detection limit;
- simulated solar system objects with orbits that can be constrained from just the epochs in the test dataset.
Step 4
Run all remaining DRP pipeline steps.
Step 5
Load the data into the DRP database.
Step 6
Verify that the injected simulated objects are recovered at a rate consistent with their S/N when not blended with each other or real objects, and that flags indicating how each Object was detected are consistent with their properties:
- static objects should be detected in coadds only (not difference images)
- static-position/variable-flux objects should be detected in coadds and possibly difference images
- transient objects should be detected in difference images only
- stars with significant proper motions may be detected in either coadds or difference images
- solar system objects should be detected in difference images only.
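The recovery-rate check above can be sketched as a positional crossmatch between injected and detected positions. The flat-sky metric, the 1-arcsecond tolerance, and the toy coordinates below are illustrative assumptions, not values from the test dataset:

```python
import math

def match_recovered(injected, detected, tol_deg=1.0 / 3600.0):
    """Fraction of injected positions with a detection within tolerance
    (flat-sky nearest-neighbour match)."""
    recovered = 0
    for ra, dec in injected:
        if any(math.hypot(ra - r, dec - d) <= tol_deg for r, d in detected):
            recovered += 1
    return recovered / len(injected)

injected = [(150.0, 2.0), (150.01, 2.0), (150.02, 2.0)]
detected = [(150.00001, 2.00001), (150.01, 2.00002)]  # third source missed
rate = match_recovered(injected, detected)
print(rate)
```

The measured rate would then be compared against the completeness expected for each source's S/N.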
LVV-T68 - Verify implementation of Provide Photometric Redshifts of Galaxies (DMS-REQ-0046)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Jim Bosch |
Precursor data, execute DRP, load results, observe catalog contents
- LVV-19 - DMS-REQ-0046-V-01: Provide Photometric Redshifts of Galaxies
Input Data
HSC Public Data Release (raw science images, master calibrations)
Assorted public spectroscopic catalogs and high-accuracy photometric
redshift catalogs in the HSC PDR footprint.
Step 1
Run DRP processing steps through (at least) final galaxy photometry measurements.
Step 2
Train photometric redshift algorithm(s) on spectroscopic and high-accuracy photometric redshift catalogs.
Step 3
Estimate photometric redshifts for all Objects generated by DRP processing.
Step 4
Load the results into the DRP database.
Step 5
Inspect the database to verify that photometric redshifts are present for all objects.
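A sketch of the inspection against an in-memory sqlite3 stand-in for the DRP database; the Object table layout and the `photoZ` column name are assumptions for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Object (objectId INTEGER PRIMARY KEY, photoZ REAL);
INSERT INTO Object VALUES (1, 0.45), (2, 1.10), (3, 0.02);
""")

(total,) = conn.execute("SELECT COUNT(*) FROM Object").fetchone()
(missing,) = conn.execute(
    "SELECT COUNT(*) FROM Object WHERE photoZ IS NULL").fetchone()
# Every object should carry a photometric redshift, so `missing` must be 0.
print(total, missing)
```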
LVV-T69 - Verify implementation of Object Characterization (DMS-REQ-0276)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Jim Bosch |
Data Release Production, Parallel Distributed Database & Precursor data, execute DRP, load results, observe catalog contents
- LVV-107 - DMS-REQ-0276-V-01: Object Characterization
Step 1
LVV-T70 - Verify implementation of Coadd Source Catalog (DMS-REQ-0277)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Jim Bosch |
Precursor data, execute DRP, load results, observe catalog contents
- LVV-108 - DMS-REQ-0277-V-01: Coadd Source Catalog
Step 1
The DM Stack shall be initialized using the loadLSST script (as described in DRP-00-00).
Step 2
A “Data Butler” will be initialized to access the repository.
Step 3
For each of the expected data product types (listed in Test Items section §4.3.2) and each of the expected units (PVIs, coadds, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
Step 4
Verify that there exists a catalog of merged sources.
LVV-T71 - Verify implementation of Detecting extended low surface brightness objects (DMS-REQ-0349)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Jim Bosch |
Delegate to DRP
- LVV-180 - DMS-REQ-0349-V-01: Detecting extended low surface brightness objects
Input Data
HSC "RC2" data (raw science images and master calibrations)
Step 1
Load the LSST DM Stack.
Step 2
Run the single-frame processing and self-calibration steps of the DRP pipeline.
Step 3
Insert simulated low-surface-brightness galaxies (with exponential profiles) consistently into all calibrated single-epoch images.
Step 4
Run all remaining DRP pipeline steps.
Step 5
Load the data into the DRP database.
Step 6
Verify that the injected simulated objects are recovered at a rate consistent with their S/N and true profile when not blended with each other or real objects.
LVV-T72 - Verify implementation of Coadd Image Method Constraints (DMS-REQ-0278)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Jim Bosch |
Delegate to DRP. Verify the implementation of how Coadd images are created.
- LVV-109 - DMS-REQ-0278-V-01: Coadd Image Method Constraints
Step 1
The DM Stack shall be initialized using the loadLSST script (as described in DRP-00-00).
Step 2
A “Data Butler” will be initialized to access the repository.
Step 3
For each of the expected data product types (listed in Test Items section §4.3.2) and each of the expected units (PVIs, coadds, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
Step 4
Verify that the coadds were created following the specification.
LVV-T73 - Verify implementation of Deep Detection Coadds (DMS-REQ-0279)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Jim Bosch |
Data Release Production & Delegate to DRP
- LVV-110 - DMS-REQ-0279-V-01: Deep Detection Coadds
Step 1
The DM Stack shall be initialized using the loadLSST script (as described in DRP-00-00).
Step 2
A “Data Butler” will be initialized to access the repository.
Step 3
For each of the expected data product types (listed in Test Items section §4.3.2) and each of the expected units (PVIs, coadds, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
Step 4
Verify through inspection that per-filter coadds exist for every possible tract+patch combination.
Step 5
Verify through inspection that the images used to generate those coadds met the specified conditions.
Step 6
Visually inspect a subset of the coadds to verify that they appear reasonable and derive from good-quality data.
LVV-T74 - Verify implementation of Template Coadds (DMS-REQ-0280)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Eric Bellm |
Precursor or simulate data, execute Template Generation, observe image products
- LVV-111 - DMS-REQ-0280-V-01: Template Coadds
Step 1
Step 2
The DM Stack and Alert Processing packages shall be initialized as described in LVV-T17 (AG-00-00).
Step 3
The alert generation processing will be executed using the verification cluster, and any errors or failures will be reported:

```bash
python ap_verify/bin/prepare_demo_slurm_files.py
# At present we must run a single ccd+visit to handle ingestion before
# parallel processing can begin
./ap_verify/bin/exec_demo_run_1ccd.sh 410915 25
ln -s ap_verify/bin/demo_run.sl
ln -s ap_verify/bin/demo_cmds.conf
sbatch demo_run.sl
```
Step 4
A “Data Butler” will be initialized to access the repository.
Step 5
For each of the expected data product types (listed in §4.2.2) and each of the expected units (PVIs, catalogs, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
Step 6
DIAObjects are currently only stored in a database, without shims to the Butler, so the existence of the database table and its non-empty contents will be verified by directly accessing it using sqlite3 and executing appropriate SQL queries.
LVV-T75 - Verify implementation of Multi-band Coadds (DMS-REQ-0281)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Jim Bosch |
Delegate to DRP
- LVV-112 - DMS-REQ-0281-V-01: Multi-band Coadds
Step 1
Step 2
The DM Stack shall be initialized using the loadLSST script (as described in DRP-00-00).
Step 3
A “Data Butler” will be initialized to access the repository.
Step 4
For each of the expected data product types (listed in Test Items section §4.3.2) and each of the expected units (PVIs, coadds, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
Step 5
The DM Stack shall be initialized using the loadLSST script (as described in LVV-T10 - DRP-00-00)
Step 6
A “Data Butler” will be initialized to access the repository.
Step 7
For each combination of tract/patch/filter, the PVI will be retrieved from the Butler, and the existence of all components described in Test items section §4.6.2 will be verified.
Step 8
Scripts from the pipe_analysis package will be run on every visit to check for the presence of data products and to make plots.
Step 9
Ten patches will be chosen at random and inspected by eye for unmasked artifacts.
Step 10
Verify that deep detection coadds exist based on all filters.
LVV-T76 - Verify implementation of All-Sky Visualization of Data Releases (DMS-REQ-0329)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Simon Krughoff |
Delegate to DRP
- LVV-160 - DMS-REQ-0329-V-01: All-Sky Visualization of Data Releases
Input Data
Dataset of perhaps ~100 square degrees. The first HSC Public Data
Release will be used for this test. Larger (in sky area) datasets
should be identified for further testing.
Step 1
The DM Stack shall be initialized using the loadLSST script (as described in DRP-00-00).
Step 2
A “Data Butler” will be initialized to access the repository.
Step 3
For each of the expected data product types (listed in Test Items section §4.3.2) and each of the expected units (PVIs, coadds, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
Step 4
Run the all-sky tile generation task to produce the data products necessary for serving the all-sky visualization.
Step 5
Manually perform, and log (including timing where applicable), the following steps against the all-sky visualization application. At each step, take special care to note any missing or unrendered image tiles:
- Navigate to the all sky viewer and log the URL, browser and version.
- Zoom to native pixel display (1 image pixel per display pixel)
- Zoom to fit the full PDR footprint
- Zoom to 1/4x native resolution
- Pan to eastern edge of the footprint.
- Pan to western edge of the footprint.
- Navigate to the middle of the footprint.
- Zoom to max magnification
LVV-T77 - Verify implementation of Best Seeing Coadds (DMS-REQ-0330)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Jim Bosch |
Delegate to DRP
- LVV-161 - DMS-REQ-0330-V-01: Best Seeing Coadds
Step 1
Step 2
The DM Stack shall be initialized using the loadLSST script (as described in DRP-00-00).
Step 3
A “Data Butler” will be initialized to access the repository.
Step 4
For each of the expected data product types (listed in Test Items section §4.3.2) and each of the expected units (PVIs, coadds, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
Step 5
Explicitly create a coadd for a specified seeing range in each filter.
Step 6
Verify that these coadds exist.
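The selection underlying a best-seeing coadd can be sketched as a simple filter over per-visit seeing measurements. The visit ids, seeing values, and 0.7-arcsecond upper bound below are illustrative assumptions:

```python
# visitId -> measured seeing FWHM in arcsec (hypothetical values)
visit_seeing = {101: 0.55, 102: 0.92, 103: 0.61, 104: 1.40, 105: 0.58}

def visits_in_seeing_range(seeing, lo, hi):
    """Select the visit ids whose seeing falls inside [lo, hi]."""
    return sorted(v for v, s in seeing.items() if lo <= s <= hi)

best = visits_in_seeing_range(visit_seeing, 0.0, 0.7)
print(best)  # [101, 103, 105]
```

The coadd built from `best` would then be checked for existence as in the preceding step.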
LVV-T78 - Verify implementation of Persisting Data Products (DMS-REQ-0334)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Kian-Tat Lim |
Verify that per-band deep coadds and best-seeing coadds are present, kept, and available.
- LVV-165 - DMS-REQ-0334-V-01: Persisting Data Products
Precursor data from HSC PDR.
Step 1
Produce some relevant coadds and store them in the Archive
Step 2
Examine the data retention policies for those products
LVV-T79 - Verify implementation of PSF-Matched Coadds (DMS-REQ-0335)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Jim Bosch |
Delegate to DRP
- LVV-166 - DMS-REQ-0335-V-01: PSF-Matched Coadds
Step 1
Step 2
The DM Stack shall be initialized using the loadLSST script (as described in DRP-00-00).
Step 3
A “Data Butler” will be initialized to access the repository.
Step 4
For each of the expected data product types (listed in the Test Items section, §4.3.2) and each of the expected units (PVIs, coadds, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
Step 5
Verify that PSF-matched coadds were created.
LVV-T80 - Verify implementation of Detecting faint variable objects (DMS-REQ-0337)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Melissa Graham |
Delegate to DRP
- LVV-168 - DMS-REQ-0337-V-01: Detecting faint variable objects
Input Data
DECam HiTS data.
Gaia catalog of faint moving objects.
Catalog of spectroscopically confirmed quasars.
(Alternative: input data injected with faint variable sources).
Step 1
The DM Stack and Alert Processing package shall be initialized as described in LVV-T17 (AG-00-00).
Step 2
The alert generation processing will be executed using the verification cluster:
```bash
python ap_verify/bin/prepare_demo_slurm_files.py
# At present we must run a single ccd+visit to handle ingestion before
# parallel processing can begin
./ap_verify/bin/exec_demo_run_1ccd.sh 410915 25
ln -s ap_verify/bin/demo_run.sl
ln -s ap_verify/bin/demo_cmds.conf
sbatch demo_run.sl
```
and any errors or failures reported.
Step 3
A “Data Butler” will be initialized to access the repository.
Step 4
For each of the expected data product types (listed in §4.2.2) and each of the expected units (PVIs, catalogs, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
Step 5
DIAObjects are currently only stored in a database, without shims to the Butler, so the existence of the database table and its non-empty contents will be verified by directly accessing it using sqlite3 and executing appropriate SQL queries.
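The direct database check in this step can be sketched with Python's built-in `sqlite3` module. An in-memory database stands in for the actual association database, and the column names are illustrative; the two queries (schema lookup, row count) are the substance of the verification.

```python
import sqlite3

# Stand-in database with a DiaObject table (invented columns and rows).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE DiaObject (diaObjectId INTEGER PRIMARY KEY, ra REAL, decl REAL)")
conn.executemany(
    "INSERT INTO DiaObject VALUES (?, ?, ?)",
    [(1, 150.1, 2.2), (2, 150.3, 2.4)],
)

# Existence: the table is present in the schema.
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")]
assert "DiaObject" in tables

# Non-empty contents.
(n_rows,) = conn.execute("SELECT COUNT(*) FROM DiaObject").fetchone()
assert n_rows > 0
```

Against the real database, only the connection string and table/column names change; the same two queries apply.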
Step 6
Identify 100 objects from Gaia with proper motions high enough to have detectably moved during the HiTS observations.
Step 7
Measure reported proper motion of these objects in DM Stack processing. Verify that it is consistent with Gaia objects.
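A minimal sketch of the consistency check in Steps 6-7: flag any object whose measured proper motion differs from Gaia's by more than the combined uncertainty. The catalog values, uncertainties, and 3-sigma threshold are invented for illustration.

```python
import math

def pm_consistent(pm_measured, pm_gaia, sigma_measured, sigma_gaia, n_sigma=3.0):
    """True if the two proper motions agree within n_sigma of the
    quadrature-summed uncertainties."""
    sigma = math.hypot(sigma_measured, sigma_gaia)
    return abs(pm_measured - pm_gaia) <= n_sigma * sigma

# (measured, gaia, sigma_measured, sigma_gaia) in mas/yr, invented values.
samples = [
    (102.0, 100.0, 1.0, 0.2),   # 2 mas/yr off: consistent at 3 sigma
    (120.0, 100.0, 1.0, 0.2),   # 20 mas/yr off: inconsistent
]
flags = [pm_consistent(*s) for s in samples]
assert flags == [True, False]
```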
Step 8
Identify 100 quasars from color-space or existing extragalactic spectroscopic catalog.
Step 9
Measure lightcurves of these quasars. Determine whether the structure function is reasonable (it may require at least a year of data to determine whether the structure function of 100 quasars is "reasonable").
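One way to compute a first-order structure function, SF(tau) = ⟨|m(t+tau) - m(t)|⟩ binned by time lag, is sketched below. The light curve values and bin edges are synthetic; a real test would use the measured DRP lightcurves, and the choice of estimator (first-order here) is an assumption.

```python
from itertools import combinations

def structure_function(times, mags, bin_edges):
    """Mean absolute magnitude difference per time-lag bin.

    Returns one value per bin, or None for bins with no pairs.
    """
    sums = [0.0] * (len(bin_edges) - 1)
    counts = [0] * (len(bin_edges) - 1)
    for (t1, m1), (t2, m2) in combinations(zip(times, mags), 2):
        lag = abs(t2 - t1)
        for i in range(len(bin_edges) - 1):
            if bin_edges[i] <= lag < bin_edges[i + 1]:
                sums[i] += abs(m2 - m1)
                counts[i] += 1
                break
    return [s / c if c else None for s, c in zip(sums, counts)]

# Synthetic light curve: times in days, magnitudes (invented).
times = [0, 1, 2, 10, 11, 20]
mags = [20.0, 20.1, 19.9, 20.4, 20.5, 19.6]
sf = structure_function(times, mags, bin_edges=[0, 5, 15, 25])

# For a variable source, amplitude generally grows with lag; here the
# mid-lag bin exceeds the short-lag bin.
assert sf[1] > sf[0]
```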
Step 10
(Alternative: if faint variable sources can be injected into the input data, test whether they are recovered.)
LVV-T81 - Verify implementation of Targeted Coadds (DMS-REQ-0338)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Jim Bosch |
Remove DR from disk, observe retention of designated coadd sections, observe accessibility
- LVV-169 - DMS-REQ-0338-V-01: Targeted Coadds
Step 1
Remove DR from disk
Step 2
Observe retention of designated coadd sections
Step 3
Observe accessibility of designated coadd sections via simulated DAC LSP instance
LVV-T82 - Verify implementation of Tracking Characterization Changes Between Data Releases (DMS-REQ-0339)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Jim Bosch |
Remove DR from disk, observe retention of designated catalog sections, observe accessibility
- LVV-170 - DMS-REQ-0339-V-01: Tracking Characterization Changes Between Data Releases
Step 1
The DM Stack shall be initialized using the loadLSST script (as described in DRP-00-00).
Step 2
A “Data Butler” will be initialized to access the repository.
Step 3
For each of the expected data product types (listed in the Test Items section, §4.3.2) and each of the expected units (PVIs, coadds, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
Step 4
The DM Stack shall be initialized using the loadLSST script (as described in LVV-T10 - DRP-00-00).
Step 5
A “Data Butler” will be initialized to access the repository.
Step 6
Scripts from the pipe_analysis package will be run on every visit to check for the presence of data products and make plots.
Step 7
The DM Stack shall be initialized using the loadLSST script (as described in LVV-T10 - DRP-00-00).
Step 8
A “Data Butler” will be initialized to access the repository.
Step 9
Scripts from the pipe_analysis package will be run on every tract to check for the presence of data products and make plots.
Step 10
The DM Stack shall be initialized using the loadLSST script (as described in LVV-T10 - DRP-00-00).
Step 11
A “Data Butler” will be initialized to access the repository.
Step 12
For each processed CCD, the PVI will be retrieved from the Butler, and the existence of all components described in the Test Items section (§4.6.2) will be verified.
Step 13
Scripts from the pipe_analysis package will be run on every visit to check for the presence of data products and make plots.
Step 14
Five sensors will be chosen at random from each of two visits and inspected by eye for unmasked artifacts.
Step 15
The DM Stack shall be initialized using the loadLSST script (as described in LVV-T10 - DRP-00-00).
Step 16
A “Data Butler” will be initialized to access the repository.
Step 17
For each combination of tract/patch/filter, the PVI will be retrieved from the Butler, and the existence of all components described in the Test Items section (§4.6.2) will be verified.
Step 18
Scripts from the pipe_analysis package will be run on every visit to check for the presence of data products and make plots.
Step 19
Ten patches will be chosen at random and inspected by eye for unmasked artifacts.
Step 20
Prepare a second DRP run (through to DPDD products) with different configuration parameters for this second test Data Release.
Step 21
Stage subset of products from first test Data Release to separate storage.
Step 22
Scientifically compare the results for that region of sky from the first test Data Release subset to those in the second test Data Release, using the results of the DRP Scientific Verification tests.
LVV-T83 - Verify implementation of Bad Pixel Map (DMS-REQ-0059)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Robert Lupton |
Daily Calibration Products Update, Periodic Calibration Products, Annual Calibration Products & Delegate to CPP
- LVV-22 - DMS-REQ-0059-V-01: Bad Pixel Map
Step 1
LVV-T84 - Verify implementation of Bias Residual Image (DMS-REQ-0060)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Robert Lupton |
Daily Calibration Products Update, Periodic Calibration Products, Annual Calibration Products & Delegate to CPP
- LVV-23 - DMS-REQ-0060-V-01: Bias Residual Image
Step 1
LVV-T85 - Verify implementation of Crosstalk Correction Matrix (DMS-REQ-0061)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Robert Lupton |
Daily Calibration Products Update, Periodic Calibration Products, Annual Calibration Products & Delegate to CPP
- LVV-24 - DMS-REQ-0061-V-01: Crosstalk Correction Matrix
Step 1
LVV-T86 - Verify implementation of Illumination Correction Frame (DMS-REQ-0062)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Robert Lupton |
Daily Calibration Products Update, Periodic Calibration Products, Annual Calibration Products & Delegate to CPP
- LVV-25 - DMS-REQ-0062-V-01: Illumination Correction Frame
Step 1
LVV-T87 - Verify implementation of Monochromatic Flatfield Data Cube (DMS-REQ-0063)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Robert Lupton |
Daily Calibration Products Update, Periodic Calibration Products, Annual Calibration Products & Delegate to CPP
- LVV-26 - DMS-REQ-0063-V-01: Monochromatic Flatfield Data Cube
Step 1
LVV-T88 - Verify implementation of Calibration Data Products (DMS-REQ-0130)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Robert Lupton |
Data Backbone, Daily Calibration Products Update, Periodic Calibration Products, Annual Calibration Products & Delegate to CPP
- LVV-57 - DMS-REQ-0130-V-01: Calibration Data Products
Step 1
LVV-T89 - Verify implementation of Calibration Image Provenance (DMS-REQ-0132)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Robert Lupton |
Batch Production, Managed Database, Daily Calibration Products Update, Periodic Calibration Products, Annual Calibration Products & Precursor data, execute CPP, observe provenance
- LVV-59 - DMS-REQ-0132-V-01: Calibration Image Provenance
Step 1
LVV-T90 - Verify implementation of Dark Current Correction Frame (DMS-REQ-0282)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Robert Lupton |
Daily Calibration Products Update, Periodic Calibration Products, Annual Calibration Products & Delegate to CPP
- LVV-113 - DMS-REQ-0282-V-01: Dark Current Correction Frame
Step 1
LVV-T91 - Verify implementation of Fringe Correction Frame (DMS-REQ-0283)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Robert Lupton |
Daily Calibration Products Update, Periodic Calibration Products, Annual Calibration Products & Delegate to CPP
- LVV-114 - DMS-REQ-0283-V-01: Fringe Correction Frame
Step 1
LVV-T92 - Verify implementation of Processing of Data From Special Programs (DMS-REQ-0320)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Melissa Graham |
Special Programs Production, Data Release Production, Task Framework, Workload and Orchestration & Precursor data, execute representative Special Programs pipelines
*DMS-REQ-0320 will read as follows once LCR 1309 (accepted) is implemented:*
It shall be possible for special programs to trigger their own data processing recipes. It shall also be possible for special programs data to be processed with the prompt- and/or annual-release pipelines alongside data from the main survey.
LSST will provide these recipes for processing Special Programs data when possible, which includes cases where DM can run original or reconfigured versions of existing pipelines, and excludes cases where the development of new algorithms, or the allocation of significant additional computational resources, is required. The data from Special Programs should only be included in the prompt- and/or annual-release processing along with data from the wide-fast-deep main survey when it is (a) possible for DM to do so without additional effort and (b) beneficial to LSST's main science objectives.
The requirement for prompt Special Programs data processing to be completed within 24 hours is covered by LVV-T93.
- LVV-151 - DMS-REQ-0320-V-01: Processing of Data From Special Programs
A variety of imaging data from Special Programs, including these scenarios:
(1) Special Programs data that can be processed by the Prompt pipeline (i.e., standard visits).
(2) Special Programs data that requires 'real-time' (~24 h) processing with a reconfigured pipeline (e.g., a DDF imaging sequence).
(3) Special Programs data that can (should) be processed by the Data Release pipeline (e.g., North Ecliptic Spur standard visits).
Step 1
(1) Special Programs data that can be processed by the Prompt pipeline (i.e., standard visits). Check that all images with the header keyword for SP were processed by the Prompt pipeline. Check that the Prompt pipeline's data products -- the DIASource and DIAObject catalogs and the Alerts -- contain items flagged as originating from that SP.
Step 2
(2) Special Programs data that requires 'real-time' (~24 h) processing with a reconfigured pipeline (e.g., a DDF imaging sequence). Check that all images with the header keywords for a given SP were processed by their reconfigured pipeline. Check that the pipeline's data products have been updated and have passed their QA.
Step 3
(3) Special Programs data that can (should) be processed by the Data Release pipeline (e.g., North Ecliptic Spur standard visits). SP data would be added manually to the DRP processing. Check that the DRP's data products -- Source, Object, CoAdds -- contain items flagged as originating in that SP.
LVV-T93 - Verify implementation of Level 1 Processing of Special Programs Data (DMS-REQ-0321)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Melissa Graham |
Prompt Processing, Alert Production, MOPS and Forced Photometry & Execute single-day operations rehearsal, observe processing completed in time
*The relevant requirement here is this one:*
All Level 1 processing from special programs shall be completed before data arrives from the following night’s observations
This refers to Special Programs that require prompt processing either with the Prompt pipeline (Alert Generation) or with a reconfigured pipeline.
The requirement for Special Programs data to be automatically processed with the Prompt (and/or prompt reconfigured) pipelines is covered by LVV-T92.
- LVV-152 - DMS-REQ-0321-V-01: Level 1 Processing of Special Programs Data
Imaging data obtained under a Special Program: for example, a sequence of consecutive images of a deep drilling field.
Step 1
If imaging data for a Special Program that requires processing with the Prompt pipeline was obtained the previous night, check that there exist DIASources/Objects/Alerts with flags that they originated from the Special Program.
Step 2
If imaging data for a Special Program that requires prompt processing with a reconfigured pipeline was obtained the previous night, check that the relevant data products have been updated.
LVV-T94 - Verify implementation of Special Programs Database (DMS-REQ-0322)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Melissa Graham |
Data Backbone, Managed Database, LSP Web APIs, Special Programs Production, Parallel Distributed Database & Precursor data, execute representative Special Programs pipelines, load results, observe distinct database
The relevant requirement here is:
Data products for special programs shall be stored in databases that are distinct from those used to store standard Level 1 and Level 2 data products. It shall be possible for these databases to be federated with the Level 1 and Level 2 databases to allow cross-queries and joins.
The requirement that these databases be created by reconfigured DM pipelines is covered by LVV-T92.
- LVV-153 - DMS-REQ-0322-V-01: Special Programs Database
Databases created by reconfigured pipelines for processing Special Programs data (e.g., DIAObject/DIASource catalogs for a Deep Drilling Field)
Step 1
SP data product: DDF DIAObject catalog. Non-SP data product: WFD DIAObject catalog. Test: join the two catalogs by coordinate (e.g., to get a longer time baseline for variable stars in the DDF).
Step 2
SP data product: DDF Object catalog. Non-SP data product: WFD DIAObject catalog. Test: join the two catalogs by coordinate to identify faint host galaxies of transients found in WFD.
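The coordinate joins in the two steps above can be sketched with a runnable SQLite example. A simple box match in flat coordinates stands in for a proper spherical crossmatch, and the table names, columns, and 1-arcsecond match radius are illustrative, not the federated database's actual schema.

```python
import sqlite3

# Stand-in SP (DDF) and non-SP (WFD) catalogs with invented rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ddf_DiaObject (id INTEGER, ra REAL, decl REAL)")
conn.execute("CREATE TABLE wfd_DiaObject (id INTEGER, ra REAL, decl REAL)")
conn.executemany("INSERT INTO ddf_DiaObject VALUES (?,?,?)",
                 [(1, 150.00010, 2.00010), (2, 151.5, 2.5)])
conn.executemany("INSERT INTO wfd_DiaObject VALUES (?,?,?)",
                 [(10, 150.00015, 2.00005), (11, 149.0, 1.0)])

radius_deg = 1.0 / 3600.0  # 1 arcsec match box (illustrative)
matches = conn.execute(
    """SELECT d.id, w.id
         FROM ddf_DiaObject AS d
         JOIN wfd_DiaObject AS w
           ON ABS(d.ra - w.ra) < :r AND ABS(d.decl - w.decl) < :r""",
    {"r": radius_deg},
).fetchall()

# Only the coincident pair should match.
assert matches == [(1, 10)]
```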
Step 3
LVV-T95 - Verify implementation of Constraints on Level 1 Special Program Products Generation (DMS-REQ-0344)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Melissa Graham |
Execute single-day operations rehearsal, observe data products generated in time
1.6.3 Constraints on Level 1 Special Program Products Generation. ID: DMS-REQ-0344 (Priority: 2). Specification:
The publishing of Level 1 data products from Special Programs shall be subject to the same performance requirements as the standard Level 1 system, in particular L1PublicT and OTT1:
L1PublicT = 24 hours
OTT1 = 1 minute
When data from Special Programs is processed by the Prompt pipeline, it is treated the same way as WFD data, and the equivalent requirement for WFD data is DMS-REQ-0004. Therefore, the test scripts for the implementation of DMS-REQ-0004, LVV-T35, apply here.
- LVV-175 - DMS-REQ-0344-V-01: Constraints on Level 1 Special Program Products Generation
Step 1
Time processing of data starting from (pre-ingested) raw files until an alert is available for distribution; verify that this time is less than OTT1.
Step 2
Time processing of data starting from (pre-ingested) raw files until the required data products are available in the Science Platform. Verify that this time is less than L1PublicT.
Step 3
Run MOPS on the equivalent of one night of LSST observing, using precursor data, and verify that Solar System Object orbits can be updated within 24 hours.
Step 4
Step 5
Step 6
Record time between completion of MOPS processing and availability of the updated SSObject catalogue through the Science Platform; verify this time is less than L1PublicT.
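The latency checks in the steps above reduce to comparing event timestamps against the OTT1 (1 minute) and L1PublicT (24 hour) budgets, as sketched below. The timestamps are invented; in the real test they would come from processing logs.

```python
from datetime import datetime, timedelta

# Budgets from the requirement text.
OTT1 = timedelta(minutes=1)
L1_PUBLIC_T = timedelta(hours=24)

# Hypothetical event times extracted from logs.
readout_done = datetime(2025, 1, 1, 3, 0, 0)
alert_sent = datetime(2025, 1, 1, 3, 0, 45)      # alert available
products_public = datetime(2025, 1, 1, 20, 0, 0)  # products in the LSP

# Verify both latency budgets are met.
assert alert_sent - readout_done <= OTT1
assert products_public - readout_done <= L1_PUBLIC_T
```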
LVV-T96 - Verify implementation of Query Repeatability (DMS-REQ-0291)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Colin Slater |
Load multiple DRs and PPDB, observe repeatability
Data Backbone, Managed Database, LSP Web APIs, Parallel Distributed Database
- LVV-122 - DMS-REQ-0291-V-01: Query Repeatability
Step 1
Select and download a deterministic (seeded) random subsample of records from the Data Release Object and Source tables.
Step 2
Select and download a random subsample of the PPDB DIAObject and DIASource tables.
Step 3
As appropriate, wait for some amount of non-trivial database usage to occur, such as Prompt Processing ingestion or ingestion of other DRP database tables.
Step 4
Re-run the queries in steps 1 and 2 and verify that the resulting data are identical.
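The repeatability test above can be sketched end-to-end with SQLite. A modulus on the primary key stands in for the deterministic "random" subsample (reproducible by construction), ingestion into an unrelated table simulates the intervening database activity, and the two result sets are compared row for row. Table and column names are illustrative.

```python
import sqlite3

# Stand-in Data Release table with invented rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Object (objectId INTEGER PRIMARY KEY, mag REAL)")
conn.executemany("INSERT INTO Object VALUES (?, ?)",
                 [(i, 20.0 + i * 0.01) for i in range(100)])

# Deterministic subsample: same rows every run.
QUERY = "SELECT * FROM Object WHERE objectId % 7 = 0 ORDER BY objectId"
first = conn.execute(QUERY).fetchall()

# Simulate ongoing ingestion into a *different* table between the runs.
conn.execute("CREATE TABLE DiaSource (id INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO DiaSource VALUES (?)", [(i,) for i in range(10)])

# Re-run and verify the results are identical.
second = conn.execute(QUERY).fetchall()
assert first == second and len(first) == 15  # objectId 0, 7, ..., 98
```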
LVV-T97 - Verify implementation of Uniqueness of IDs Across Data Releases (DMS-REQ-0292)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Kian-Tat Lim |
Load multiple DRs and PPDB, observe uniqueness of IDs
- LVV-123 - DMS-REQ-0292-V-01: Uniqueness of IDs Across Data Releases
Step 1
Load multiple DRs and PPDB
Step 2
Observe uniqueness of IDs
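The uniqueness check can be sketched as below. One common scheme reserves distinct high bits of the ID per release; the release numbers and bit layout here are illustrative assumptions, not the actual LSST ID scheme, but the disjointness test at the end is the observation this step calls for.

```python
def make_id(release, counter, release_bits=8, counter_bits=48):
    """Pack a release number into the high bits of a per-release counter."""
    assert release < 2 ** release_bits and counter < 2 ** counter_bits
    return (release << counter_bits) | counter

# IDs from two Data Releases and the PPDB (release 0), invented counts.
dr1_ids = {make_id(1, i) for i in range(1000)}
dr2_ids = {make_id(2, i) for i in range(1000)}
ppdb_ids = {make_id(0, i) for i in range(1000)}

# Pairwise disjoint: no ID appears in more than one release. If any ID
# collided, the union would be smaller than the sum of the parts.
all_ids = dr1_ids | dr2_ids | ppdb_ids
assert len(all_ids) == len(dr1_ids) + len(dr2_ids) + len(ppdb_ids)
```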
LVV-T98 - Verify implementation of Selection of Datasets (DMS-REQ-0293)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Kian-Tat Lim |
Load DR, observe retrieval of representative datasets
- LVV-124 - DMS-REQ-0293-V-01: Selection of Datasets
Step 1
Load DR
Step 2
Observe retrieval of single PVI with metadata
Step 3
Observe retrieval of multiple PVIs with metadata
Step 4
Observe retrieval of coadd patch with metadata
Step 5
Observe retrieval of subset of rows in each catalog
LVV-T99 - Verify implementation of Processing of Datasets (DMS-REQ-0294)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Kian-Tat Lim |
Execute AP and DRP, simulate failures, observe correct processing
- LVV-125 - DMS-REQ-0294-V-01: Processing of Datasets
Step 1
Execute AP and DRP
Step 2
Simulate failures
Step 3
Observe correct processing
LVV-T100 - Verify implementation of Transparent Data Access (DMS-REQ-0295)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Kian-Tat Lim |
Observe dataset retrieval from multiple LSP instances
- LVV-126 - DMS-REQ-0295-V-01: Transparent Data Access
Step 1
Observe dataset retrieval from multiple LSP instances
LVV-T101 - Verify implementation of Transient Alert Distribution (DMS-REQ-0002)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Kian-Tat Lim |
Precursor or simulated data, execute AP, observe distribution to simulated clients using standard protocols
- LVV-3 - DMS-REQ-0002-V-01: Transient Alert Distribution
Obtain precursor or simulated data
Step 1
Execute AP
Step 2
Observe distribution to simulated clients using standard protocols
LVV-T102 - Verify implementation of Solar System Objects Available Within Specified Time (DMS-REQ-0089)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Kian-Tat Lim |
Execute single-day operations rehearsal, observe data products generated in time
- LVV-36 - DMS-REQ-0089-V-01: Solar System Objects Available Within Specified Time
Step 1
Execute single-day operations rehearsal
Step 2
Observe data products generated in time
LVV-T103 - Verify implementation of Generate Data Quality Report Within Specified Time (DMS-REQ-0096)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Kian-Tat Lim |
QC System & Delegate to Prompt QC
- LVV-38 - DMS-REQ-0096-V-01: Generate Data Quality Report Within Specified Time
Step 1
Execute single-day operations rehearsal
Step 2
Observe data quality report is generated on time and with correct contents
LVV-T104 - Verify implementation of Generate DMS Performance Report Within Specified Time (DMS-REQ-0098)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Kian-Tat Lim |
QC System & Delegate to Prompt QC
- LVV-40 - DMS-REQ-0098-V-01: Generate DMS Performance Report Within Specified Time
Step 1
Execute single-day operations rehearsal
Step 2
Observe performance report is generated on time and with correct contents
LVV-T105 - Verify implementation of Generate Calibration Report Within Specified Time (DMS-REQ-0100)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Kian-Tat Lim |
QC System, Daily Calibration Products Update & Delegate to Prompt QC
- LVV-42 - DMS-REQ-0100-V-01: Generate Calibration Report Within Specified Time
Step 1
Execute single-day operations rehearsal
Step 2
Observe calibration report is generated on time and with correct contents
LVV-T106 - Verify implementation of Calibration Images Available Within Specified Time (DMS-REQ-0131)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Kian-Tat Lim |
Execute single-day operations rehearsal, observe data products generated
- LVV-58 - DMS-REQ-0131-V-01: Calibration Images Available Within Specified Time
Step 1
Execute single-day operations rehearsal
Step 2
Observe data products generated
LVV-T107 - Verify implementation of Level-1 Production Completeness (DMS-REQ-0284)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Eric Bellm |
Image and EFD Archiving, Prompt Processing, Observatory Operations Data & Ingest raw data while simulating failures and outages, observe eventual recovery
- LVV-115 - DMS-REQ-0284-V-01: Level-1 Production Completeness
Step 1
LVV-T108 - Verify implementation of Level 1 Source Association (DMS-REQ-0285)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Eric Bellm |
Alert Production & Delegate to AP
- LVV-116 - DMS-REQ-0285-V-01: Level 1 Source Association
Step 1
LVV-T109 - Verify implementation of SSObject Precovery (DMS-REQ-0286)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Eric Bellm |
MOPS and Forced Photometry & Delegate to AP
- LVV-117 - DMS-REQ-0286-V-01: SSObject Precovery
Step 1
LVV-T110 - Verify implementation of DIASource Precovery (DMS-REQ-0287)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Eric Bellm |
LSP Web APIs, MOPS and Forced Photometry & Execute single-day operations rehearsal, observe data products generated in time
- LVV-118 - DMS-REQ-0287-V-01: DIASource Precovery
Step 1
LVV-T111 - Verify implementation of Use of External Orbit Catalogs (DMS-REQ-0288)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Eric Bellm |
Prompt Processing, Alert Production, MOPS and Forced Photometry & Delegate to AP
- LVV-119 - DMS-REQ-0288-V-01: Use of External Orbit Catalogs
Step 1
LVV-T112 - Verify implementation of Alert Filtering Service (DMS-REQ-0342)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Eric Bellm |
Alert Filtering, LSP Portal & Simulated alert stream, observe ability to define filters and proper filter results
- LVV-173 - DMS-REQ-0342-V-01: Alert Filtering Service
Step 1
LVV-T113 - Verify implementation of Performance Requirements for LSST Alert Filtering Service (DMS-REQ-0343)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Eric Bellm |
Alert Distribution, Alert Filtering, LSP Portal & Simulated alert stream, observe ability to support specified load
- LVV-174 - DMS-REQ-0343-V-01: Performance Requirements for LSST Alert Filtering Service
Step 1
LVV-T114 - Verify implementation of Pre-defined alert filters (DMS-REQ-0348)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Eric Bellm |
Alert Filtering, LSP Portal & Simulated alert stream, observe predefined filter existence and proper filter results
- LVV-179 - DMS-REQ-0348-V-01: Pre-defined alert filters
Step 1
LVV-T115 - Verify implementation of Calibration Production Processing (DMS-REQ-0289)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Kian-Tat Lim |
Execute CPP on a variety of representative cadences
- LVV-120 - DMS-REQ-0289-V-01: Calibration Production Processing
Step 1
Execute CPP on a variety of representative cadences
Step 2
Observe lack of failures and expected data products
LVV-T116 - Verify implementation of Associating Objects across data releases (DMS-REQ-0350)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Kian-Tat Lim |
Load DR, observe queryable association
- LVV-181 - DMS-REQ-0350-V-01: Associating Objects across data releases
Step 1
Load DR
Step 2
Observe queryable association
LVV-T117 - Verify implementation of DAC resource allocation for Level 3 processing (DMS-REQ-0119)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Colin Slater |
Observe resource allocation in PDAC
Batch Computing, Containerized Application Management, Identity Management, LSP Portal, LSP JupyterLab, LSP Web APIs
- LVV-47 - DMS-REQ-0119-V-01: DAC resource allocation for Level 3 processing
Step 1
Create a test user account for the Science Platform.
Step 2
Set the LSP resource allocations for the test user to very low values.
Step 3
Initiate example batch jobs and notebook sessions that will exceed the specified resource limits.
Step 4
Transfer data volumes into the user workspace and MyDB tables sufficient to exceed the resource quotas.
Step 5
Reset the user resource quotas to normal values.
Step 6
Initiate the same example batch jobs and notebook sessions that previously caused an error.
Step 7
Transfer the same data volumes into the user workspace and MyDB tables that previously caused an error.
LVV-T118 - Verify implementation of Level 3 Data Product Self Consistency (DMS-REQ-0120)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Colin Slater |
Data Backbone, LSP Web APIs, Workload and Orchestration & Execute representative processing on DR in PDAC, observe consistency
- LVV-48 - DMS-REQ-0120-V-01: Level 3 Data Product Self Consistency
Step 1
LVV-T119 - Verify implementation of Provenance for Level 3 processing at DACs (DMS-REQ-0121)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Colin Slater |
Data Backbone, LSP Web APIs, Data Butler Access Client, Task Framework, Workload and Orchestration & Execute representative processing on DR in PDAC, observe provenance recording
- LVV-49 - DMS-REQ-0121-V-01: Provenance for Level 3 processing at DACs
Step 1
LVV-T120 - Verify implementation of Software framework for Level 3 catalog processing (DMS-REQ-0125)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Colin Slater |
Data Backbone, LSP JupyterLab, LSP Web APIs, Data Butler Access Client, Task Framework, Workload and Orchestration & Execute representative processing on DR in PDAC, observe recognition of a
- LVV-53 - DMS-REQ-0125-V-01: Software framework for Level 3 catalog processing
Step 1
LVV-T121 - Verify implementation of Software framework for Level 3 image processing (DMS-REQ-0128)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Colin Slater |
Data Backbone, LSP JupyterLab, LSP Web APIs, Data Butler Access Client, Task Framework, Workload and Orchestration & Execute representative processing on DR in PDAC, observe recognition of and
- LVV-56 - DMS-REQ-0128-V-01: Software framework for Level 3 image processing
Step 1
LVV-T122 - Verify implementation of Level 3 Data Import (DMS-REQ-0290)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Colin Slater |
Load representative files into PDAC
Data Backbone, LSP Web APIs, Parallel Distributed Database
- LVV-121 - DMS-REQ-0290-V-01: Level 3 Data Import
Step 1
Use the Science Platform catalog upload tool to ingest a small example FITS table.
Step 2
Use the Science Platform catalog upload tool to ingest a small example CSV table.
Step 3
Use the Science Platform catalog upload tool to ingest a large FITS table that needs to be spatially sharded in the database.
Step 4
Perform example queries on each of the three tables to verify that all data is present.
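The CSV leg of this test (Steps 2 and 4) can be sketched in a runnable form. A plain SQLite load stands in for the Science Platform catalog upload tool, and the table contents are invented; the final count query is the "all data is present" verification.

```python
import csv
import io
import sqlite3

# Small example CSV table (invented rows).
csv_text = "objectId,ra,decl\n1,150.1,2.2\n2,150.3,2.4\n3,150.5,2.6\n"
rows = list(csv.DictReader(io.StringIO(csv_text)))

# Ingest into a database table (stand-in for the upload tool).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE user_upload (objectId INTEGER, ra REAL, decl REAL)")
conn.executemany(
    "INSERT INTO user_upload VALUES (:objectId, :ra, :decl)", rows)

# Example query: verify every ingested row is present.
(count,) = conn.execute("SELECT COUNT(*) FROM user_upload").fetchone()
assert count == len(rows) == 3
```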
LVV-T123 - Verify implementation of Access Controls of Level 3 Data Products (DMS-REQ-0340)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Data Backbone, IT Security, Identity Management, LSP Portal, Parallel Distributed Database & Configure representative access controls in PDAC, observe proper restrictions
- LVV-171 - DMS-REQ-0340-V-01: Access Controls of Level 3 Data Products
Step 1
LVV-T124 - Verify implementation of Software Architecture to Enable Community Re-Use (DMS-REQ-0308)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Simon Krughoff |
Execution of algorithms on batch cluster and desktop.
- LVV-139 - DMS-REQ-0308-V-01: Software Architecture to Enable Community Re-Use
Step 1
Using curated test datasets for multiple precursor instruments, verify and log that the prototype DRP pipelines execute successfully in three contexts:
- The CI system
- On a single user system: laptop, desktop, or notebook running in the Notebook aspect of the LSP.
Step 2
Using a template testing notebook in the Notebook aspect of the LSP, verify and log the following:
- Individual pipeline steps (tasks) are importable and executable on their own; this is not comprehensive, but demonstrative.
- Individual pipeline steps may be overridden by configuration.
- Users can implement a custom pipeline step and insert it into the processing flow via configuration.
Step 3
The DM Stack shall be initialized using the loadLSST script (as described in DRP-00-00).
Step 4
A “Data Butler” will be initialized to access the repository.
Step 5
For each of the expected data product types (listed in the Test Items section §4.3.2) and each of the expected units (PVIs, coadds, etc.), the data product will be retrieved from the Butler and verified to be non-empty.
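The non-empty check in Steps 3-5 can be sketched as follows; a plain dictionary stands in for the Data Butler (the real call would be `butler.get(datasetType, dataId)`), and the dataset types, data IDs, and contents are illustrative, not the repository's actual holdings:

```python
# Stand-in for the Butler: maps (dataset type, data ID) to a product.
fake_butler = {
    ("calexp", "visit=1234"): [[1.0, 2.0], [3.0, 4.0]],   # a PVI's pixels
    ("deepCoadd", "tract=0,patch=1,1"): [[5.0]],          # a coadd patch
    ("src", "visit=1234"): [{"id": 1}, {"id": 2}],        # a source catalog
}

# Retrieve each expected product and record any that are missing or empty.
failures = []
for (dataset_type, data_id), product in fake_butler.items():
    if product is None or len(product) == 0:
        failures.append((dataset_type, data_id))

# Every expected data product must be retrievable and non-empty.
assert not failures
print(len(fake_butler), "products checked,", len(failures), "failures")
```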
Step 6
Run a subset of the full DRP from the previous step on an individual node. Was this organizationally easy? Did the performance scale appropriately?
Step 7
Re-run aperture correction on the subset. Verify that the same results as the DRP run are achieved.
Step 8
Re-run the photometric redshift estimation algorithm on the subset coadd catalogs. Verify that the same results are achieved as from the full DRP.
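The "same results" comparisons in Steps 7 and 8 reduce to checking the re-run's measurements against the original DRP values. A minimal sketch, with invented flux values keyed by source ID:

```python
# Invented measurements from the original DRP run and from the re-run.
drp_fluxes   = {1: 120.004, 2: 57.310, 3: 999.120}
rerun_fluxes = {1: 120.004, 2: 57.310, 3: 999.120}

# The two runs must cover exactly the same sources.
assert drp_fluxes.keys() == rerun_fluxes.keys()

# Largest relative discrepancy between the runs.
max_rel_diff = max(
    abs(rerun_fluxes[k] - drp_fluxes[k]) / abs(drp_fluxes[k])
    for k in drp_fluxes)

# Identical inputs and code on the same platform should agree closely.
assert max_rel_diff < 1e-9
print("max relative difference:", max_rel_diff)
```

In practice the tolerance would be chosen to allow for floating-point differences across CPU architectures.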
LVV-T125 - Verify implementation of Simulated Data (DMS-REQ-0009)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Alert Production, Data Release Production & Delegate to AP and DRP
- LVV-6 - DMS-REQ-0009-V-01: Simulated Data
Step 1
LVV-T126 - Verify implementation of Image Differencing (DMS-REQ-0032)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Alert Production, Data Release Production, Science Algorithms & Delegate to AP and DRP
- LVV-14 - DMS-REQ-0032-V-01: Image Differencing
Step 1
LVV-T127 - Verify implementation of Provide Source Detection Software (DMS-REQ-0033)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Alert Production, Data Release Production, Science Algorithms & Delegate to AP and DRP
- LVV-15 - DMS-REQ-0033-V-01: Provide Source Detection Software
Step 1
LVV-T128 - Verify implementation of Provide Astrometric Model (DMS-REQ-0042)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Alert Production, Data Release Production, Science Algorithms, Science Primitives & Delegate to AP and DRP
- LVV-17 - DMS-REQ-0042-V-01: Provide Astrometric Model
Step 1
LVV-T129 - Verify implementation of Provide Calibrated Photometry (DMS-REQ-0043)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Alert Production, Data Release Production, Science Algorithms, Science Primitives & Delegate to AP and DRP
- LVV-18 - DMS-REQ-0043-V-01: Provide Calibrated Photometry
Step 1
LVV-T130 - Verify implementation of Enable a Range of Shape Measurement Approaches (DMS-REQ-0052)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Alert Production, Data Release Production, Science Algorithms, Science Primitives & Delegate to AP and DRP
- LVV-21 - DMS-REQ-0052-V-01: Enable a Range of Shape Measurement Approaches
Step 1
LVV-T131 - Verify implementation of Provide User Interface Services (DMS-REQ-0160)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Gregory Dubois-Felsmann |
Delegate to LSP: it is intended that the detailed test coverage for this requirement will largely arise from testing at the Science Platform level.
- LVV-63 - DMS-REQ-0160-V-01: Provide User Interface Services
- Testing this requirement relies on a set of data products meeting
the data model implied by the DPDD existing in a deployment of the
Science Platform and its underlying database and file services.
- In particular, both image and catalog data products are required.
- From the specific language of the underlying requirement, it appears clear that coadded data products are required, but in practice single-epoch data products should be included in the test as well.
- Depending on when this requirement is tested, the tests may involve either or both of precursor data and LSST commissioning data. The use of the latter is ultimately essential to ensure that the tests are performed with as LSST-like a dataset as possible.
Step 1
Establishment of test coordinates: Establish sky positions and surrounding regions (e.g., cones or polygons), field sizes, filter bands, and temporal epochs for the tests that are consistent with the known content of the test dataset, whether precursor or LSST commissioning data. Establishing sky positions should include pre-determining the corresponding LSST "tract and patch" identifiers. If the plan to not keep all calibrated single-epoch images on disk is still in place at the time of the test, identify for use in the test both images that are, and are not, on disk. Establish target image boundaries, projections, and pixel scales to be used for resampling tests. Ensure that at least some of these test conditions include coadded image boundaries that cross tract and patch boundaries, and single-epoch image boundaries that cross focal plane raft boundaries.
Step 2
Butler image access: From within the Notebook Aspect, verify that coadded images for the identified regions of sky and filter bands are accessible via the Butler. Verify that the same images are available whether obtained by direct reference to the previously established tract/patch identifiers or by the use of LSST stack code for retrieving images based on sky coordinates. From within the Notebook Aspect, verify that single-epoch raw images for the selected locations and times are available. Verify that calibrated images (PVIs) for the selected locations and times are available; depending on the details of the test dataset, verify that PVIs still on disk can be retrieved immediately. Verify that lists or tables of image metadata, not just individual images, can be retrieved, e.g., a list of all the single-epoch images covering a selected sky location.
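The final metadata-list check can be sketched as follows; axis-aligned ra/dec bounding boxes stand in for real WCS footprints (real code would query the Butler registry), and the visit identifiers and boxes are invented:

```python
# Invented per-image footprints: (ra_min, ra_max, dec_min, dec_max), degrees.
images = {
    "visit-100": (10.0, 11.0, -4.0, -3.0),
    "visit-101": (10.5, 11.5, -3.5, -2.5),
    "visit-102": (50.0, 51.0, 20.0, 21.0),
}

def covering(ra, dec, boxes):
    """Return identifiers of images whose bounding box contains (ra, dec)."""
    return sorted(
        name for name, (r0, r1, d0, d1) in boxes.items()
        if r0 <= ra <= r1 and d0 <= dec <= d1)

# Both visit-100 and visit-101 cover this position; visit-102 does not.
print(covering(10.7, -3.2, images))
```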
Step 3
Programmatic PVI re-creation: From within the Notebook Aspect, verify that the recreation on demand of a PVI can be performed. Ideally, this should be done as follows:
- Verify that recreation of a PVI that is still available works and that it reproduces the original PVI exactly (except for provenance metadata that must be different) or within the reasonable ability of processing systems to do so (e.g., taking into account that the original calibration and the recreation may have run on different CPU architectures).
- The test conditions should ensure the verification that a recreation was actually performed, i.e., that the still-available PVI was not returned instead.
- Note that it does not appear to be a requirement that at Butler level recreation on demand of PVIs is a completely transparent process. If this is decided to be a requirement, the test must also verify that it has been satisfied. If it is not a requirement, verify that adequate documentation on the PVI-recreation process (e.g., the SuperTasks and configuration to be used) is available.
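The comparison described in the first bullet can be sketched as an equality check that excludes provenance metadata; the header keys, the set of fields allowed to differ, and the values are all invented for illustration:

```python
# Metadata fields that are permitted (indeed expected) to differ
# between the original PVI and its on-demand recreation.
PROVENANCE_KEYS = {"RUNID", "DATE-PROC", "HOST"}

original  = {"pixels": [1, 2, 3], "RUNID": "r1",
             "DATE-PROC": "2018-01-01", "SEEING": 0.7}
recreated = {"pixels": [1, 2, 3], "RUNID": "r2",
             "DATE-PROC": "2018-06-01", "SEEING": 0.7}

def equal_up_to_provenance(a, b):
    """True if a and b agree on every field outside PROVENANCE_KEYS."""
    keys = (set(a) | set(b)) - PROVENANCE_KEYS
    return all(a.get(k) == b.get(k) for k in keys)

assert equal_up_to_provenance(original, recreated)
# The provenance itself must differ, proving a recreation actually ran
# rather than the still-available PVI being returned.
assert original["RUNID"] != recreated["RUNID"]
print("recreation verified")
```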
Step 4
Butler catalog access: From within the Notebook Aspect, verify that all the catalog data products described in the DPDD can be retrieved for the coordinates selected above via the Butler. (This test should include access to SSObject data, but the details of how such a test would depend on the coordinate selections require additional thought.)
Step 5
LSST-stack-based resampling/reprojection: Verify the availability of software in the LSST stack, and associated documentation, that permits the resampling of LSST images to different pixel grids and projections. Exercise this capability for the test conditions selected in Step 1 above. Perform photometric and astrometric tests on the resulting resampled images to provide evidence that the transformations performed were correct to the accuracy supported by the data.
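The photometric part of the check can be illustrated with a flux-conservation test; a 2x2 block-sum stands in for the stack's warping code, and the pixel values are invented:

```python
# Invented 4x4 image; a flux-conserving resampling must preserve the
# total flux (a real test would compare aperture photometry before and
# after warping with the LSST stack).
image = [
    [1.0, 2.0, 3.0, 4.0],
    [5.0, 6.0, 7.0, 8.0],
    [9.0, 1.0, 2.0, 3.0],
    [4.0, 5.0, 6.0, 7.0],
]

def block_sum_2x2(img):
    """Downsample by summing each 2x2 block (trivially flux-conserving)."""
    return [
        [img[2*i][2*j] + img[2*i][2*j+1]
         + img[2*i+1][2*j] + img[2*i+1][2*j+1]
         for j in range(len(img[0]) // 2)]
        for i in range(len(img) // 2)
    ]

resampled = block_sum_2x2(image)
total_in  = sum(map(sum, image))
total_out = sum(map(sum, resampled))
assert abs(total_in - total_out) < 1e-12   # flux conserved
print(total_in, total_out)
```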
Step 6
Comment: The following API Aspect test steps should be carried out on the required "offsite-like" test platform, to ensure that their success does not reflect any privileged access given to processes inside the Data Access Center or other Science Platform instance. However, at least a small sampling of them should also be carried out within the Science Platform environment, i.e., in the Notebook Aspect, and the results compared.
Step 7
API Aspect image access: Using IVOA services such as the Registry and ObsTAP, from the "offsite-like" test platform, verify that the existence of the classes of image data products foreseen in the DPDD can be determined. Verify that ObsTAP and/or SIAv2 can be used to find the same images and lists of images for the established test coordinates that were retrieved via the Butler in Step 2 above. Verify that the selected images are retrievable from the Web services. Verify that the retrieved images are identical in their pixel content and metadata. The tests must include both coadded and single-epoch images.
Step 8
API Aspect image transformations: Verify that image cutouts and resamplings can be performed via the IVOA SODA service, and that the results are identical to those obtained for the same parameters from the LSST-stack-based tests in Step 5. (The requirements for supported reprojections, if any, in the SODA service have not been established at the time of writing.)
Step 9
API Aspect catalog data access: Verify that the IVOA Registry, RegTAP, TAP_SCHEMA, and other relevant mechanisms can be used to discover the existence of all the catalog data products foreseen in the DPDD. Using the IVOA TAP service, verify that all the catalog data products foreseen in the DPDD can be retrieved for the coordinates determined in Step 1. Verify that their scientific content is the same as when they are retrieved via the Butler.
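The catalog retrieval in this step would typically be driven by an ADQL cone search submitted to the TAP service (e.g., via pyvo's `TAPService.search`). A sketch of constructing such a query; the table and column names are placeholders, not the actual DPDD schema:

```python
# Build an ADQL cone search per the IVOA ADQL geometry functions.
# Table and column names (ra, decl) are illustrative placeholders.
def cone_search_adql(table, ra, dec, radius_deg):
    return (
        f"SELECT * FROM {table} "
        f"WHERE 1 = CONTAINS(POINT('ICRS', ra, decl), "
        f"CIRCLE('ICRS', {ra}, {dec}, {radius_deg}))"
    )

query = cone_search_adql("dp01.Object", 10.7, -3.2, 0.05)
assert "CONTAINS" in query and "CIRCLE" in query
print(query)
```

The same query, run via the Butler-side access of Step 4 and via TAP, should return scientifically identical rows.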
Step 10
Comment: The Portal Aspect tests below should be carried out from a web browser on an "offsite-like" test platform, to ensure that no privileged access provided to intra-data-center clients is relied upon.
Step 11
Portal Aspect data browsing: Verify that the Portal Aspect can be used to discover the existence of all the data products foreseen in the DPDD. Verify that the UI permits locating the data for the coordinates selected in Step 1 by visual means, e.g., by zooming and panning in from an all-sky view. Verify that the UI permits locating the data by typing in coordinates as well.
Step 12
Portal Aspect image access: Verify that the Portal Aspect allows both the retrieval of "original" image data, i.e., in its native LSST pixel projection and with full metadata, as well as retrieval of on-demand UI cutouts of coadded image data for selected locations.
Step 13
Portal Aspect catalog query and visualization: Verify that the Portal Aspect allows graphical querying of DPDD catalog data, both coadded and single-epoch, for selected regions of sky and/or with selected properties, and supports the visualization of the results (including histogramming, scatterplots, time series, table manipulations, and overplotting on image data). (Note that the Science Platform requirements, LDM-554, lay out a detailed set of requirements on the selection and visualization of catalog data.)
Step 14
Portal Aspect data download: Verify that data identified and/or visualized in the Portal Aspect can be downloaded to the remote system running the web browser in which the Portal is displayed, as well as to the User Workspace.
LVV-T132 - Verify implementation of Pre-cursor, and Real Data (DMS-REQ-0296)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Science Algorithms, Data Butler Access Client & Execute AP and DRP on non-LSST data
- LVV-127 - DMS-REQ-0296-V-01: Pre-cursor, and Real Data
Step 1
LVV-T133 - Verify implementation of Provide Beam Projector Coordinate Calculation Software (DMS-REQ-0351)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Science Primitives & Delegate to CPP
- LVV-182 - DMS-REQ-0351-V-01: Provide Beam Projector Coordinate Calculation Software
Step 1
LVV-T134 - Verify implementation of Provide Image Access Services (DMS-REQ-0065)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
LSP Portal, LSP Web APIs & Delegate to LSP
- LVV-27 - DMS-REQ-0065-V-01: Provide Image Access Services
Step 1
LVV-T135 - Verify implementation of Provide Data Access Services (DMS-REQ-0155)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
LSP Web APIs & Delegate to LSP
- LVV-60 - DMS-REQ-0155-V-01: Provide Data Access Services
Step 1
LVV-T136 - Verify implementation of Data Product and Raw Data Access (DMS-REQ-0298)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Data Backbone, Managed Database, LSP Portal, LSP Web APIs & Delegate to LSP
- LVV-129 - DMS-REQ-0298-V-01: Data Product and Raw Data Access
Step 1
LVV-T137 - Verify implementation of Data Product Ingest (DMS-REQ-0299)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Data Backbone, LSP Web APIs & Delegate to DBB
- LVV-130 - DMS-REQ-0299-V-01: Data Product Ingest
Step 1
LVV-T138 - Verify implementation of Bulk Download Service (DMS-REQ-0300)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Robert Gruendl |
Delegate to Bulk Download
- LVV-131 - DMS-REQ-0300-V-01: Bulk Download Service
A large dataset (at least a few TB) must be available.
Requires identity management to confirm bulk download use.
While this can be tested and shown to work using the LSST DAC, Chilean DAC,
and IN2P3 endpoints, it should also be tested to demonstrate expected
throughput for outside users (e.g., FNAL and NERSC sites could be tested).
Step 1
Set up a large transfer request and examine the data transfer rates achieved.
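The rate examination reduces to computing achieved throughput from transferred bytes and elapsed wall-clock time; the numbers below are illustrative, not measured values or requirements:

```python
# Illustrative measurements for a bulk transfer test.
bytes_transferred = 4 * 1024**4          # a 4 TB test dataset
elapsed_seconds   = 12 * 3600            # measured over 12 hours

# Achieved throughput in gigabits per second.
throughput_gbps = bytes_transferred * 8 / elapsed_seconds / 1e9
print(f"achieved throughput: {throughput_gbps:.2f} Gb/s")

# In the real test this would be compared against the requirement's
# target rate rather than merely checked for positivity.
assert throughput_gbps > 0
```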
Step 2
The test should be repeated while observing in firehose mode (with LSSTCam) during science verification, to ensure that bulk transfer does not compromise normal nightly operations.
LVV-T139 - Verify implementation of Provide Pipeline Execution Services (DMS-REQ-0156)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Batch Production, Workload and Orchestration & Verify subsidiary requirements
- LVV-61 - DMS-REQ-0156-V-01: Provide Pipeline Execution Services
Step 1
LVV-T140 - Verify implementation of Production Orchestration (DMS-REQ-0302)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Batch Production, Workload and Orchestration & Delegate to Batch Production
- LVV-133 - DMS-REQ-0302-V-01: Production Orchestration
Step 1
LVV-T141 - Verify implementation of Production Monitoring (DMS-REQ-0303)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Service Management and Monitoring, Workload and Orchestration & Observe monitoring during DRP execution
- LVV-134 - DMS-REQ-0303-V-01: Production Monitoring
Step 1
LVV-T142 - Verify implementation of Production Fault Tolerance (DMS-REQ-0304)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Batch Production, Task Framework, Workload and Orchestration & Execute AP and DRP, simulate failures, observe correct processing
- LVV-135 - DMS-REQ-0304-V-01: Production Fault Tolerance
Step 1
LVV-T143 - Verify implementation of Provide Pipeline Construction Services (DMS-REQ-0158)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Task Framework & Delegate to Middleware
- LVV-62 - DMS-REQ-0158-V-01: Provide Pipeline Construction Services
Step 1
LVV-T144 - Verify implementation of Task Specification (DMS-REQ-0305)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Kian-Tat Lim |
Delegate to Middleware
Task Framework
- LVV-136 - DMS-REQ-0305-V-01: Task Specification
Step 1
Inspect the software architecture. Verify that there exist Tasks that can be run and configured without re-compilation.
Step 2
Verify that an example science algorithm can be run through one of these Tasks. Use three examples from different areas: source measurement, image subtraction, and photometric-redshift estimation.
LVV-T145 - Verify implementation of Task Configuration (DMS-REQ-0306)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Task Framework & Delegate to Middleware
- LVV-137 - DMS-REQ-0306-V-01: Task Configuration
Step 1
Inspect software design to verify that one can define the configuration for a Task.
Step 2
Run a Task with a known invalid configuration. Verify that the error is caught before the science algorithm executes.
Step 3
Run a simple Task with two different configurations that make a material difference, e.g., specify a different source detection threshold. Verify that the configuration differs between the two runs through differences in recorded provenance and in results.
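Steps 2 and 3 can be sketched with a hypothetical Task whose configuration is validated before any science code runs; the class, config keys, and pixel data are invented for illustration and are not the Task Framework API:

```python
class DetectTask:
    """Hypothetical Task: detect pixel values above a configured threshold."""

    def __init__(self, config):
        # Step 2 behaviour: reject an invalid configuration up front,
        # before the science algorithm executes.
        threshold = config.get("threshold")
        if not isinstance(threshold, (int, float)) or threshold <= 0:
            raise ValueError("invalid config: positive threshold required")
        self.config = config

    def run(self, pixel_values):
        t = self.config["threshold"]
        return [v for v in pixel_values if v > t]

pixels = [1.0, 4.0, 6.0, 11.0]

# Step 2: an invalid configuration is caught before run() is reachable.
try:
    DetectTask({"threshold": -5})
except ValueError as e:
    print("caught:", e)

# Step 3: two valid configurations produce materially different results.
low  = DetectTask({"threshold": 3.0}).run(pixels)
high = DetectTask({"threshold": 5.0}).run(pixels)
assert low != high
print(low, high)
```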
LVV-T146 - Verify implementation of DMS Initialization Component (DMS-REQ-0297)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Compute/Storage/LAN & Power-cycle all DM systems at each Facility, observe recovery
- LVV-128 - DMS-REQ-0297-V-01: DMS Initialization Component
Step 1
LVV-T147 - Verify implementation of Control of Level-1 Production (DMS-REQ-0301)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Image and EFD Archiving, Pointing Prediction Publishing, Prompt Processing, OCS Driven Batch & Observe existence and capability of Prompt DMCS
- LVV-132 - DMS-REQ-0301-V-01: Control of Level-1 Production
Step 1
LVV-T148 - Verify implementation of Unique Processing Coverage (DMS-REQ-0307)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Batch Production, Workload and Orchestration & Execute representative processing, observe lack of duplicates or missing rows even in the presence of failures
- LVV-138 - DMS-REQ-0307-V-01: Unique Processing Coverage
Step 1
LVV-T149 - Verify implementation of Catalog Queries (DMS-REQ-0075)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Colin Slater |
Managed Database, LSP Portal, LSP JupyterLab, LSP Web APIs, Parallel Distributed Database & Delegate to LSP
- LVV-33 - DMS-REQ-0075-V-01: Catalog Queries
Step 1
LVV-T150 - Verify implementation of Maintain Archive Publicly Accessible (DMS-REQ-0077)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Colin Slater |
Data Backbone, Managed Database, Service Management and Monitoring, LSP Portal, LSP JupyterLab, LSP Web APIs, Parallel Distributed Database & Observe access to prior DR on tape
- LVV-34 - DMS-REQ-0077-V-01: Maintain Archive Publicly Accessible
Step 1
LVV-T151 - Verify implementation of Catalog Export Formats (DMS-REQ-0078)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Colin Slater |
LSP Portal, LSP JupyterLab, LSP Web APIs & Delegate to LSP
- LVV-35 - DMS-REQ-0078-V-01: Catalog Export Formats
Step 1
LVV-T152 - Verify implementation of Keep Historical Alert Archive (DMS-REQ-0094)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Data Backbone, LSP Portal, LSP JupyterLab, LSP Web APIs & Simulated alert stream, load Alert DB, observe access to Alert DB
- LVV-37 - DMS-REQ-0094-V-01: Keep Historical Alert Archive
Step 1
LVV-T153 - Verify implementation of Provide Engineering and Facility Database Archive (DMS-REQ-0102)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Image and EFD Archiving, Data Backbone, Managed Database & Execute single-day operations rehearsal, observe data products generated in time
- LVV-44 - DMS-REQ-0102-V-01: Provide Engineering & Facility Database Archive
Step 1
LVV-T154 - Verify implementation of Raw Data Archiving Reliability (DMS-REQ-0309)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Image and EFD Archiving, Data Backbone, Managed Database & Analyze sources of loss or corruption after mitigation to compute estimated reliability
- LVV-140 - DMS-REQ-0309-V-01: Raw Data Archiving Reliability
Step 1
LVV-T155 - Verify implementation of Un-Archived Data Product Cache (DMS-REQ-0310)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Robert Gruendl |
Data Backbone & Delegate to DBB
- LVV-141 - DMS-REQ-0310-V-01: Un-Archived Data Product Cache
Step 1
LVV-T156 - Verify implementation of Regenerate Un-archived Data Products (DMS-REQ-0311)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Robert Gruendl |
Delegate to LSP (note: here and below, LSP test must be without shims for DBB)
- LVV-142 - DMS-REQ-0311-V-01: Regenerate Un-archived Data Products
Step 1
Run a small DRP processing job and download unarchived data products.
Step 2
Wait for (or force) a processing stack change so that the subsequent re-processing will be forced to use an older software build.
Step 3
Using provenance information from the products in Step 1, request a re-processing and compare results with previously unarchived products.
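Step 3 can be sketched as re-running with the software version pinned from the recorded provenance and comparing against the originally downloaded products; the "pipeline" here is a deterministic stand-in, and all names and values are invented:

```python
# Provenance recorded with the products downloaded in Step 1.
provenance = {"dataset": "deepDiff-42", "stack_version": "w_2018_22"}

def reprocess(dataset, stack_version):
    """Stand-in pipeline: output is deterministic in (dataset, version)."""
    return hash((dataset, stack_version)) % 10_000

original = reprocess(provenance["dataset"], provenance["stack_version"])

# Even after the default stack has moved on (Step 2), re-running with
# the version pinned from provenance must reproduce the original product.
regenerated = reprocess(provenance["dataset"], provenance["stack_version"])
assert regenerated == original
print("regenerated product matches original")
```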
LVV-T157 - Verify implementation of Level 1 Data Product Access (DMS-REQ-0312)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Prompt Processing, Data Backbone, Managed Database, LSP Portal, LSP JupyterLab, LSP Web APIs, Alert Production & Delegate to LSP
- LVV-143 - DMS-REQ-0312-V-01: Level 1 Data Product Access
Step 1
LVV-T158 - Verify implementation of Level 1 and 2 Catalog Access (DMS-REQ-0313)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Data Backbone, LSP Portal, LSP JupyterLab, LSP Web APIs, Parallel Distributed Database & Delegate to LSP
- LVV-144 - DMS-REQ-0313-V-01: Level 1 & 2 Catalog Access
Step 1
LVV-T159 - Verify implementation of Regenerating Data Products from Previous Data Releases (DMS-REQ-0336)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Robert Gruendl |
Data Backbone, LSP Web APIs & Delegate to LSP
- LVV-167 - DMS-REQ-0336-V-01: Regenerating Data Products from Previous Data Releases
Step 1
LVV-T160 - Verify implementation of Providing a Precovery Service (DMS-REQ-0341)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Robert Gruendl |
LSP Portal, MOPS and Forced Photometry & Precursor data, execute representative requests, observe correct results generated in time
- LVV-172 - DMS-REQ-0341-V-01: Providing a Precovery Service
- DECam HiTS data could be an appropriate set for this activity.
- Precovery pipelines for follow-on to alert processing must exist and be made available as a containerized version within the Science Platform.
- Determine the limitations over which general precovery is supported. I would suggest that precovery services be limited to the current (or last two) DRP campaigns, possibly also including non-DRP products to encompass observations over the preceding year (does this then require a means to re-generate PVIs from Alert Production in addition to DRP?).
- Could re-use elements of LVV-T80 where quasars are used to test faint object detection.
Step 1
Run Precovery within follow-on Alert Production (i.e., daily post-processing on the 30-day store).
Step 2
Within the Science Platform, initiate a request to perform precovery for a list of sources over the same period (and longer). Include among the precovery sources the quasars from LVV-T80.
Step 3
Examine the results. Compare the results for the period that overlaps the precovery run, and compare the quasar photometry with that from LVV-T80, to verify that the user service performs as the production service does.
LVV-T161 - Verify implementation of Logging of catalog queries (DMS-REQ-0345)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Managed Database, LSP Web APIs, Parallel Distributed Database & Delegate to LSP
- LVV-176 - DMS-REQ-0345-V-01: Logging of catalog queries
Step 1
LVV-T162 - Verify implementation of Access to Previous Data Releases (DMS-REQ-0363)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Robert Gruendl |
Delegate to LSP
Data Backbone, Managed Database, LSP Portal, LSP JupyterLab, LSP Web APIs, Parallel Distributed Database
- LVV-189 - DMS-REQ-0363-V-01: Access to Previous Data Releases
Requires two (fake) releases within DAC (or PDAC) with common area/observations (preferably with some differing results but could use metadata identifying provenance).
Step 1
From the Science Platform, initiate a request for image and catalog products from one of the two release sets.
Step 2
From the Science Platform, re-issue the same request, specifying the alternate/earlier release set.
Step 3
Compare the results and verify that differences germane to the relevant Data Release sets are found.
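The comparison in Step 3 can be sketched as a keyed diff of the two release catalogs; the objectIds and magnitudes below are invented stand-ins for the two (fake) release sets:

```python
# Invented magnitudes for the same objects in the two release sets.
release_a = {1: 21.30, 2: 19.85, 3: 22.10}   # earlier release
release_b = {1: 21.30, 2: 19.80, 3: 22.10}   # later release

# Compare objects present in both releases and collect any that differ.
common = release_a.keys() & release_b.keys()
differences = {
    oid: (release_a[oid], release_b[oid])
    for oid in sorted(common)
    if abs(release_a[oid] - release_b[oid]) > 1e-6
}
print("objects compared:", len(common), "differing:", len(differences))
```

A real comparison would also use provenance metadata to confirm which release each product came from.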
LVV-T163 - Verify implementation of Data Access Services (DMS-REQ-0364)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Data Backbone, Managed Database, LSP Portal, LSP JupyterLab, LSP Web APIs & Delegate to LSP
- LVV-190 - DMS-REQ-0364-V-01: Data Access Services
Step 1
LVV-T164 - Verify implementation of Operations Subsets (DMS-REQ-0365)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Data Backbone, Managed Database, LSP Portal, LSP JupyterLab, LSP Web APIs, Parallel Distributed Database & Delegate to LSP
- LVV-191 - DMS-REQ-0365-V-01: Operations Subsets
Step 1
LVV-T165 - Verify implementation of Subsets Support (DMS-REQ-0366)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Data Backbone, Managed Database, LSP Portal, LSP JupyterLab, LSP Web APIs, Parallel Distributed Database & Delegate to LSP
- LVV-192 - DMS-REQ-0366-V-01: Subsets Support
Step 1
LVV-T166 - Verify implementation of Access Services Performance (DMS-REQ-0367)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Data Backbone, Managed Database, Compute/Storage/LAN, LSP Portal, LSP JupyterLab, LSP Web APIs, Parallel Distributed Database & Delegate to LSP
- LVV-193 - DMS-REQ-0367-V-01: Access Services Performance
Step 1
LVV-T167 - Verify implementation of Implementation Provisions (DMS-REQ-0368)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Data Backbone, Managed Database, LSP Portal, LSP JupyterLab, LSP Web APIs, Parallel Distributed Database & Delegate to LSP
- LVV-194 - DMS-REQ-0368-V-01: Implementation Provisions
Step 1
LVV-T168 - Verify implementation of Evolution (DMS-REQ-0369)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Data Backbone, Managed Database, Service Management and Monitoring, LSP Portal, LSP JupyterLab, LSP Web APIs, Parallel Distributed Database & Delegate to LSP
- LVV-195 - DMS-REQ-0369-V-01: Evolution
Step 1
LVV-T169 - Verify implementation of Older Release Behavior (DMS-REQ-0370)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Data Backbone, Managed Database, LSP Portal, LSP JupyterLab, LSP Web APIs, Parallel Distributed Database & Delegate to LSP
- LVV-196 - DMS-REQ-0370-V-01: Older Release Behavior
Step 1
LVV-T170 - Verify implementation of Query Availability (DMS-REQ-0371)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Managed Database, LSP Portal, LSP JupyterLab, LSP Web APIs, Parallel Distributed Database & Delegate to LSP
- LVV-197 - DMS-REQ-0371-V-01: Query Availability
Step 1
LVV-T171 - Verify implementation of Pipeline Availability (DMS-REQ-0008)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Image and EFD Archiving, Prompt Processing, OCS Driven Batch, Telemetry Gateway, Alert Distribution, Alert Filtering, Batch Production, Data Backbone, Compute/Storage/LAN, Inter-Site Networks, Service Management and Mo
- LVV-5 - DMS-REQ-0008-V-01: Pipeline Availability
Step 1
LVV-T172 - Verify implementation of Optimization of Cost, Reliability and Availability (DMS-REQ-0161)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Pointing Prediction Publishing, Alert Distribution, Alert Filtering, Data Backbone, Compute/Storage/LAN, Service Management and Monitoring, LSP Portal, LSP JupyterLab, LSP Web APIs
- LVV-64 - DMS-REQ-0161-V-01: Optimization of Cost, Reliability and Availability in Order
Step 1
LVV-T173 - Verify implementation of Pipeline Throughput (DMS-REQ-0162)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Image and EFD Archiving, Prompt Processing, OCS Driven Batch, Alert Distribution, Alert Filtering, Data Backbone, Compute/Storage/LAN, Service Management and Monitoring & Execute single-day operations rehearsal, observe throughput
- LVV-65 - DMS-REQ-0162-V-01: Pipeline Throughput
Step 1
LVV-T174 - Verify implementation of Re-processing Capacity (DMS-REQ-0163)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Robert Gruendl |
Batch Production, Data Backbone, Batch Computing, Compute/Storage/LAN, Inter-Site Networks, Service Management and Monitoring & Analyze sizing model; execute DRP, observe scaling
- LVV-66 - DMS-REQ-0163-V-01: Re-processing Capacity
Step 1
LVV-T175 - Verify implementation of Temporary Storage for Communications Links (DMS-REQ-0164)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Data Backbone, Compute/Storage/LAN & Analyze sizing model and network/storage design
- LVV-67 - DMS-REQ-0164-V-01: Temporary Storage for Communications Links
Step 1
LVV-T176 - Verify implementation of Infrastructure Sizing for "catching up" (DMS-REQ-0165)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Image and EFD Archiving, Data Backbone, Batch Computing, Compute/Storage/LAN, Inter-Site Networks & Execute single-day operations rehearsal including catch-up after failure, observe data products generated
- LVV-68 - DMS-REQ-0165-V-01: Infrastructure Sizing for "catching up"
Step 1
LVV-T177 - Verify implementation of Incorporate Fault-Tolerance (DMS-REQ-0166)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Data Backbone, Containerized Application Management, Compute/Storage/LAN, Inter-Site Networks & Analyze design; execute single-day operations rehearsal including failures, observe recovery without loss of data
- LVV-69 - DMS-REQ-0166-V-01: Incorporate Fault-Tolerance
Step 1
LVV-T178 - Verify implementation of Incorporate Autonomics (DMS-REQ-0167)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Image and EFD Archiving, Prompt Processing, Observatory Operations Data, Alert Distribution, Alert Filtering, Batch Production, Data Backbone, Containerized Application Management, Compute/Storage/LAN, Inter-Site Networks
- LVV-70 - DMS-REQ-0167-V-01: Incorporate Autonomics
Step 1
LVV-T179 - Verify implementation of Compute Platform Heterogeneity (DMS-REQ-0314)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Image and EFD Archiving, Prompt Processing, Observatory Operations Data, OCS Driven Batch, Telemetry Gateway, Alert Distribution, Alert Filtering, Batch Production, Bulk Distribution, Developer Services, Integ
- LVV-145 - DMS-REQ-0314-V-01: Compute Platform Heterogeneity
Step 1
LVV-T180 - Verify implementation of Data Management Unscheduled Downtime (DMS-REQ-0318)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Image and EFD Archiving, Prompt Processing, Observatory Operations Data, OCS Driven Batch, Telemetry Gateway, Alert Distribution, Alert Filtering, Batch Production, Bulk Distribution, Data Backbone, Managed Database
- LVV-149 - DMS-REQ-0318-V-01: Data Management Unscheduled Downtime
Step 1
LVV-T181 - Verify implementation of Summit Facility Data Communications (DMS-REQ-0168)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Inter-Site Networks & Delegate to Networks
- LVV-71 - DMS-REQ-0168-V-01: Summit Facility Data Communications
Step 1
LVV-T182 - Verify implementation of Prefer Computing and Storage Down (DMS-REQ-0170)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Compute/Storage/LAN & Analyze design
- LVV-72 - DMS-REQ-0170-V-01: Prefer Computing and Storage Down
Step 1
LVV-T183 - Verify implementation of DMS Communication with OCS (DMS-REQ-0315)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Image and EFD Archiving, Prompt Processing, Observatory Operations Data, OCS Driven Batch, Telemetry Gateway & Delegate to IIP
- LVV-146 - DMS-REQ-0315-V-01: DMS Communication with OCS
Step 1
LVV-T184 - Verify implementation of Summit to Base Network (DMS-REQ-0171)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Inter-Site Networks & Delegate to Networks
- LVV-73 - DMS-REQ-0171-V-01: Summit to Base Network
Step 1
LVV-T185 - Verify implementation of Summit to Base Network Availability (DMS-REQ-0172)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Inter-Site Networks & Delegate to Networks
- LVV-74 - DMS-REQ-0172-V-01: Summit to Base Network Availability
Step 1
LVV-T186 - Verify implementation of Summit to Base Network Reliability (DMS-REQ-0173)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Inter-Site Networks & Delegate to Networks
- LVV-75 - DMS-REQ-0173-V-01: Summit to Base Network Reliability
Step 1
LVV-T187 - Verify implementation of Summit to Base Network Secondary Link (DMS-REQ-0174)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Inter-Site Networks & Delegate to Networks
- LVV-76 - DMS-REQ-0174-V-01: Summit to Base Network Secondary Link
Step 1
LVV-T188 - Verify implementation of Summit to Base Network Ownership and Operation (DMS-REQ-0175)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Inter-Site Networks & Delegate to Networks
- LVV-77 - DMS-REQ-0175-V-01: Summit to Base Network Ownership and Operation
Step 1
LVV-T189 - Verify implementation of Base Facility Infrastructure (DMS-REQ-0176)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Data Backbone, Compute/Storage/LAN & Analyze design and sizing model
- LVV-78 - DMS-REQ-0176-V-01: Base Facility Infrastructure
Step 1
LVV-T190 - Verify implementation of Base Facility Co-Location with Existing Facility (DMS-REQ-0178)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Base Facility & Analyze design
- LVV-80 - DMS-REQ-0178-V-01: Base Facility Co-Location with Existing Facility
Step 1
LVV-T191 - Verify implementation of Commissioning Cluster (DMS-REQ-0316)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Commissioning Cluster, Compute/Storage/LAN, Base Facility & Analyze design and budget
- LVV-147 - DMS-REQ-0316-V-01: Commissioning Cluster
Step 1
LVV-T192 - Verify implementation of Base Wireless LAN (WiFi) (DMS-REQ-0352)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Compute/Storage/LAN & Delegate to Networks
- LVV-183 - DMS-REQ-0352-V-01: Base Wireless LAN (WiFi)
Step 1
LVV-T193 - Verify implementation of Base to Archive Network (DMS-REQ-0180)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Inter-Site Networks & Delegate to Networks
- LVV-81 - DMS-REQ-0180-V-01: Base to Archive Network
Step 1
LVV-T194 - Verify implementation of Base to Archive Network Availability (DMS-REQ-0181)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Inter-Site Networks & Delegate to Networks
- LVV-82 - DMS-REQ-0181-V-01: Base to Archive Network Availability
Step 1
LVV-T195 - Verify implementation of Base to Archive Network Reliability (DMS-REQ-0182)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Inter-Site Networks & Delegate to Networks
- LVV-83 - DMS-REQ-0182-V-01: Base to Archive Network Reliability
Step 1
LVV-T196 - Verify implementation of Base to Archive Network Secondary Link (DMS-REQ-0183)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Inter-Site Networks & Delegate to Networks
- LVV-84 - DMS-REQ-0183-V-01: Base to Archive Network Secondary Link
Step 1
LVV-T197 - Verify implementation of Archive Center (DMS-REQ-0185)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Bulk Distribution, Data Backbone, Service Management and Monitoring, NCSA Facility & Analyze design and sizing model
- LVV-85 - DMS-REQ-0185-V-01: Archive Center
Step 1
LVV-T198 - Verify implementation of Archive Center Disaster Recovery (DMS-REQ-0186)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Bulk Distribution, Data Backbone, Compute/Storage/LAN, Service Management and Monitoring & Analyze design; simulate storage failure, observe restore from disaster recovery
- LVV-86 - DMS-REQ-0186-V-01: Archive Center Disaster Recovery
Step 1
LVV-T199 - Verify implementation of Archive Center Co-Location with Existing Facility (DMS-REQ-0187)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
NCSA Facility & Analyze design
- LVV-87 - DMS-REQ-0187-V-01: Archive Center Co-Location with Existing Facility
Step 1
LVV-T200 - Verify implementation of Archive to Data Access Center Network (DMS-REQ-0188)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Inter-Site Networks & Delegate to Networks
- LVV-88 - DMS-REQ-0188-V-01: Archive to Data Access Center Network
Step 1
LVV-T201 - Verify implementation of Archive to Data Access Center Network Availability (DMS-REQ-0189)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Inter-Site Networks & Delegate to Networks
- LVV-89 - DMS-REQ-0189-V-01: Archive to Data Access Center Network Availability
Step 1
LVV-T202 - Verify implementation of Archive to Data Access Center Network Reliability (DMS-REQ-0190)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Inter-Site Networks & Delegate to Networks
- LVV-90 - DMS-REQ-0190-V-01: Archive to Data Access Center Network Reliability
Step 1
LVV-T203 - Verify implementation of Archive to Data Access Center Network Secondary Link (DMS-REQ-0191)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Kian-Tat Lim |
Inter-Site Networks & Delegate to Networks
- LVV-91 - DMS-REQ-0191-V-01: Archive to Data Access Center Network Secondary Link
Step 1
Take primary network link down
Step 2
Observe operations support over secondary link
Step 3
Bring primary network link back up
Step 4
Observe catch-up capability over secondary link
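The failover and catch-up behaviour exercised by the four steps above can be sketched as a minimal model; the link names, visit identifiers, and the backlog-draining logic are illustrative assumptions, not the project's actual network tooling.

```python
# Hypothetical sketch of the failover behaviour this test exercises.
# Link names and the catch-up model are illustrative assumptions,
# not the project's actual network tooling.

def active_link(primary_up: bool) -> str:
    """Steps 1-2: traffic moves to the secondary link when the primary is down."""
    return "primary" if primary_up else "secondary"

def drain_backlog(backlog: list, link_up: bool) -> tuple:
    """Steps 3-4: once a link is available, queued transfers catch up."""
    if not link_up:
        return list(backlog), []      # nothing transferred yet
    return [], list(backlog)          # entire backlog transferred

# Steps 1-2: take the primary down; operations continue on the secondary
assert active_link(primary_up=False) == "secondary"

# Steps 3-4: with a link available, observe catch-up of queued data
remaining, transferred = drain_backlog(["visit_001", "visit_002"], link_up=True)
assert remaining == [] and transferred == ["visit_001", "visit_002"]
```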
LVV-T204 - Verify implementation of Access to catalogs for external Level 3 processing (DMS-REQ-0122)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Kian-Tat Lim |
Execute bulk distribution of DRP catalogs, observe correct transfer and use of maintenance/validation tools
- LVV-50 - DMS-REQ-0122-V-01: Access to catalogs for external Level 3 processing
Step 1
Execute bulk distribution of DRP catalogs
Step 2
Observe correct transfer and use of maintenance/validation tools
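Step 2's transfer validation could be sketched as a checksum comparison against a manifest; the file name and manifest format are illustrative assumptions, not the actual maintenance/validation tools.

```python
# Hypothetical validation sketch for Step 2: confirm that bulk-transferred
# catalog files match a checksum manifest. The file name and manifest
# format are illustrative assumptions.
import hashlib
import os
import tempfile

def sha256_of(path: str) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def validate_transfer(manifest: dict, received_dir: str) -> list:
    """Return names of files that are missing or whose checksum differs."""
    bad = []
    for name, expected in manifest.items():
        path = os.path.join(received_dir, name)
        if not os.path.exists(path) or sha256_of(path) != expected:
            bad.append(name)
    return bad

# Simulate a transfer into a temporary directory, then corrupt it.
with tempfile.TemporaryDirectory() as d:
    src = os.path.join(d, "object_catalog.parquet")
    with open(src, "wb") as f:
        f.write(b"catalog bytes")
    manifest = {"object_catalog.parquet": sha256_of(src)}
    assert validate_transfer(manifest, d) == []                    # clean transfer
    with open(src, "wb") as f:
        f.write(b"corrupted")
    assert validate_transfer(manifest, d) == ["object_catalog.parquet"]
```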
LVV-T205 - Verify implementation of Access to input catalogs for DAC-based Level 3 processing (DMS-REQ-0123)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Bulk Distribution, Data Backbone, LSP Portal, LSP JupyterLab, LSP Web APIs & Load Prompt and DR catalogs into PDAC, observe access via LSP
- LVV-51 - DMS-REQ-0123-V-01: Access to input catalogs for DAC-based Level 3 processing
Step 1
LVV-T206 - Verify implementation of Federation with external catalogs (DMS-REQ-0124)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Colin Slater |
Load external catalog into PDAC (using VO if possible), observe federation with other catalogs via LSP
- LVV-52 - DMS-REQ-0124-V-01: Federation with external catalogs
Step 1
LVV-T207 - Verify implementation of Access to images for external Level 3 processing (DMS-REQ-0126)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Kian-Tat Lim |
Execute bulk distribution of DRP images, observe correct transfer and use of maintenance/validation tools
- LVV-54 - DMS-REQ-0126-V-01: Access to images for external Level 3 processing
Step 1
Execute bulk distribution of DRP images
Step 2
Observe correct transfer and use of maintenance/validation tools
LVV-T208 - Verify implementation of Access to input images for DAC-based Level 3 processing (DMS-REQ-0127)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Kian-Tat Lim |
Load Prompt and DR images into PDAC, observe access via LSP
- LVV-55 - DMS-REQ-0127-V-01: Access to input images for DAC-based Level 3 processing
Step 1
Load Prompt and DR images into PDAC
Step 2
Observe access via LSP
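Step 2's image access could be exercised through a VO-style positional query against the LSP Web APIs; the endpoint URL below is an illustrative assumption, and only the query construction is shown.

```python
# Hypothetical sketch of Step 2: building a SIAv2-style positional image
# query against an LSP Web APIs endpoint. The base URL is an illustrative
# assumption; no request is actually issued here.
from urllib.parse import urlencode

def sia_query_url(base: str, ra_deg: float, dec_deg: float, radius_deg: float) -> str:
    """Build a SIAv2 positional query (CIRCLE, all values in degrees)."""
    params = {
        "POS": f"CIRCLE {ra_deg} {dec_deg} {radius_deg}",
        "MAXREC": 10,
    }
    return f"{base}?{urlencode(params)}"

# Example: a small cutout search near RA=150.1, Dec=2.2
url = sia_query_url("https://lsp.example.org/api/sia2/query", 150.1, 2.2, 0.05)
```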
LVV-T209 - Verify implementation of Data Access Centers (DMS-REQ-0193)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Analysis | False | Kian-Tat Lim |
Analyze design
- LVV-92 - DMS-REQ-0193-V-01: Data Access Centers
Step 1
Analyze design
LVV-T210 - Verify implementation of Data Access Center Simultaneous Connections (DMS-REQ-0194)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Kian-Tat Lim |
Simulate data access to PDAC, observe scaling
- LVV-93 - DMS-REQ-0194-V-01: Data Access Center Simultaneous Connections
Step 1
Simulate data access to PDAC
Step 2
Observe scaling
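The two steps above amount to a load harness: drive many simultaneous connections and confirm they all complete. A minimal sketch follows; the client count and the stub query are illustrative assumptions, not the actual simulation tooling.

```python
# Hypothetical sketch of Steps 1-2: simulate simultaneous connections to a
# service stub and confirm every one completes. The worker count and stub
# query are illustrative assumptions for a load harness.
from concurrent.futures import ThreadPoolExecutor
import time

def stub_query(i: int) -> int:
    """Stand-in for one client connection issuing a query."""
    time.sleep(0.005)
    return i

def run_load(n_clients: int, n_queries: int) -> list:
    """Issue n_queries across n_clients concurrent workers."""
    with ThreadPoolExecutor(max_workers=n_clients) as pool:
        return list(pool.map(stub_query, range(n_queries)))

# Step 1: simulate data access; Step 2: observe that all queries completed
results = run_load(n_clients=50, n_queries=200)
assert len(results) == 200
```

Scaling would then be observed by sweeping `n_clients` upward and recording latency at each level.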
LVV-T211 - Verify implementation of Data Access Center Geographical Distribution (DMS-REQ-0196)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Analysis | False | Kian-Tat Lim |
Analyze design
- LVV-94 - DMS-REQ-0196-V-01: Data Access Center Geographical Distribution
Step 1
Analyze design
LVV-T212 - Verify implementation of No Limit on Data Access Centers (DMS-REQ-0197)
Version | Status | Priority | Verification Type | Critical Event | Owner |
---|---|---|---|---|---|
1 | Draft | Normal | Test | False | Leanne Guy |
Data Backbone, LSP Portal, LSP JupyterLab, LSP Web APIs & Analyze design; instantiate and load simulated DAC, observe correct functioning
- LVV-95 - DMS-REQ-0197-V-01: No Limit on Data Access Centers
Step 1