diff --git a/changes/140.phrosty.rst b/changes/140.phrosty.rst new file mode 100644 index 0000000..a0a11a2 --- /dev/null +++ b/changes/140.phrosty.rst @@ -0,0 +1 @@ +Massive refactor to use snappl for getting images and objects, removing phrosty explicit OU2024 dependence. diff --git a/docs/development.rst b/docs/development.rst index 998200c..e0d4e74 100644 --- a/docs/development.rst +++ b/docs/development.rst @@ -5,7 +5,7 @@ Development =========== .. contents:: - + If you're one of the phrosty developers, or are otherwise interested in contributing, here is some useful information. @@ -20,7 +20,7 @@ Running Tests Running all of the tests requires an NVIDIA GPU with enough memory. We are able to run them on 40GB NVIDIA GPUs, a GPU with only 12GB is not enough. (TODO: figure out the actual cutoff.) -To run the tests, forst make sure you've set up your environment and pulled down the necessary docker images as described in :ref:`phrosty installation prerequisites`. +To run the tests, first make sure you've set up your environment and pulled down the necessary docker images as described in :ref:`phrosty installation prerequisites`. If you haven't already, get a copy of phrosty:: @@ -34,7 +34,7 @@ Make a couple of necessary directories:: mkdir dia_out_dir mkdir phrosty_temp - + Run the container with:: docker run --gpus=all -it \ @@ -74,7 +74,7 @@ Run the container with:: Once inside the container, cd into ``/home/phrosty/phrosty/tests`` and run:: - SNPIT_CONFIG=phrosty_test_config.yaml pytest -v + pytest -v Manually running a test lightcurve @@ -126,7 +126,7 @@ With these directories in place, run a container with:: /bin/bash If you placed any of the new directories anywhere other than underneath your current working directory, modify the ``source=...`` parts of the command above to reflect that. - + Inside the container, cd into ``/home/phrosty`` and try running:: nvidia-smi diff --git a/docs/usage.rst b/docs/usage.rst index d8783d1..0d63b83 100644 --- a/docs/usage.rst +++ b/docs/usage.rst @@ -58,15 +58,15 @@ Next, try running:: SNPIT_CONFIG=phrosty/tests/phrosty_test_config.yaml python phrosty/pipeline.py --help | less -You should see all the options you can pass to phrosty. There are a lot, because there are (verbose) options for everything that's in the config file. Press ``q`` to get out of ``less``. +You should see all the options you can pass to phrosty. There are a lot, because there are (verbose) options for everything that's in the config file. The options you need to think about most are at the top. Press ``q`` to get out of ``less``. Try running:: SNPIT_CONFIG=phrosty/tests/phrosty_test_config.yaml python phrosty/pipeline.py \ + --oc ou2024 \ --oid 20172782 \ - --ra 7.551093401915147 \ - --dec -44.80718106491529 \ -b Y106 \ + --ic ou2024 \ -t phrosty/tests/20172782_instances_templates_1.csv \ -s phrosty/tests/20172782_instances_science_2.csv \ -p 3 -w 3 \ @@ -182,16 +182,16 @@ The main Python executable for running the pipeline is ``phrosty/phrosty/pipelin SNPIT_CONFIG=phrosty/examples/perlmutter/phrosty_config.yaml python phrosty/phrosty/pipeline.py --help -to see how it works, and to see what the various parameters you can specify are. +to see how it works, and to see what the various parameters you can specify are. The output will be long, because everything that's in the config file is included as something you can override on the command line. The arguments near the top are the ones you're more likely to want to think about. 
You might want to pipe the output of this ``--help`` into ``less`` so you can see what's going on. Run this on your example lightcurve with:: python phrosty/phrosty/pipeline.py \ -c phrosty/examples/perlmutter/phrosty_config.yaml \ + --oc ou2024 \ --oid 20172782 \ - -r 7.551093401915147 \ - -d -44.80718106491529 \ -b R062 \ + --ic ou2024 \ -t phrosty/examples/perlmutter/20172782_instances_templates_1.csv \ -s phrosty/examples/perlmutter/20172782_instances_science_2.csv \ -p 3 \ @@ -222,20 +222,22 @@ When developing/debugging the pipeline, it's useful to run with a profiler, so y --trace=cuda,nvtx,cublas,cusparse,cudnn,cudla,cusolver,opengl,openacc,openmp,osrt,mpi,nvvideo,vulkan,python-gil \ python phrosty/phrosty/pipeline.py \ -c phrosty/examples/perlmutter/phrosty_config.yaml \ + --oc ou2024 \ --oid 20172782 \ - -r 7.551093401915147 \ - -d -44.80718106491529 \ -b R062 \ + --ic ou2024 \ -t phrosty/examples/perlmutter/20172782_instances_templates_1.csv \ -s phrosty/examples/perlmutter/20172782_instances_science_2.csv \ -p 3 \ -w 3 -*Ideally*, this would create a file ``report1.nsys-rep``, but something about that is broken; I'm not sure what. It does create a file ``report1.qdstrm``, which you can then somehow somewhere else convert to a ``.nsys-rep`` file. On a Linux system, if you've installed the ``nsight-compute`` and ``nsight-systems`` packages (see `Nvidia's Nsight Systems installation guide `_), you can download the ``.qdstrm`` file to your system and run:: +*Ideally*, this would create a file ``report1.nsys-rep`` (or ``report2.nsys-rep``, or higher numbers based on what files are already in the directory), but something about that is broken; I'm not sure what. If that file is created, be happy. If not, it should leave behind a file ``report.qdstrm``. You can copy that file to another system and manually convert it to a ``.nsys-rep`` file. On a Linux system, if you've installed the ``nsight-compute`` and ``nsight-systems`` packages (see `Nvidia's Nsight Systems installation guide `_), you can download the ``.qdstrm`` file to your system and run:: - /opt/nvidia/nsight-systems/2024.4.2/host-linux-x64/QdstrmImporter -i .qdstrm + /opt/nvidia/nsight-systems/2025.3.2/host-linux-x64/QdstrmImporter -i .qdstrm -where ``.qstrm`` is the file you downloaded. (Note that the directory may have something other than ``2024.4.2`` in it, depending on what version you've installed. For best comptibility with the version of Nsight in the current (as of this writing) snpit docker image, I recommend trying to install something close to ``nsight-compute-2024.3.1`` and ``nsight-systems-2024.4.2``; exactly what is avialable seems to vary with time.) This should produce a file ``.nsys-rep``. Then, on your local desktop, run:: +where ``.qdstrm`` is the file you downloaded. (Note that the directory may have something other than ``2025.3.2`` in it, depending on what version you've installed. For best compatibility with the version of Nsight in the current (as of this writing) snpit docker image, I recommend trying to install something close to ``nsight-compute-2025.3.0`` and ``nsight-systems-2025.3.2``; exactly what is available seems to vary with time.) This should produce a file ``.nsys-rep``. 
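For concreteness, here is a sketch of that download-and-convert step. The username, hostname, remote directory, and report name below are placeholders (adjust them to wherever your profiling run actually wrote its output), and the Nsight version directory should match whatever you actually installed::

    # Copy the profiler output from the HPC system to your local machine
    # (user, host, and path are placeholders)
    scp <user>@<hpc-login-node>:/path/to/run/dir/report1.qdstrm .

    # Convert it to a .nsys-rep file with the importer that ships with Nsight Systems
    /opt/nvidia/nsight-systems/2025.3.2/host-linux-x64/QdstrmImporter -i report1.qdstrm
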
+ +Once you, somehow, have a ``.nsys-rep`` file, copy it down to your local desktop if it's not there already, and run:: nsys-ui .nsys-rep @@ -304,6 +306,75 @@ and, ideally, there should be no lines anywhere in the file with ``ERROR`` near Note that ``/lc_out_dir/...`` is the absolute path _inside_ the container; it maps to ``lc_out_dir/...`` underneath your working directory where you ran ``sbatch``. You will find the lightcurve in that ``.csv`` file. There will also be a number of files written to the ``dia_out_dir`` directory. +Running on an HPC system that uses apptainer/singularity +------------------------------------------------------- + +Everything below assumes that the apptainer/singularity executable is named ``apptainer``. That's the newer name; the older name was singularity. If your system doesn't have ``apptainer``, try running ``singularity`` in its place. + +Set up the environment +^^^^^^^^^^^^^^^^^^^^^^ + +You need to import the Roman SNPIT docker environment into singularity. First, cd to a directory that will be fast to read or write to (this may be a scratch partition or some such on your cluster) and run:: + + apptainer pull docker://rknop/roman-snpit-env:cuda-dev + +(You can also try pulling from ``registry.nersc.gov``, but to do that you'll have to figure out how to log into the repository with ``apptainer``; try something like ``apptainer remote login --username docker://registry.nersc.gov``.) + +That will take a long time. When it's done, there should be a file:: + + roman-snpit-env_cuda-dev.sif + +Pick a directory to work in; I will henceforth call this your "parent" directory. Make some necessary directories here:: + + mkdir phrosty_temp + mkdir dia_out_dir + mkdir lc_out_dir + mkdir ou2024_images + +Copy the data +^^^^^^^^^^^^^ + +This example assumes you're just going to use the data available in ``photometry_test_data``. Pull it down with:: + + git clone https://github.com/Roman-Supernova-PIT/photometry_test_data + +If you want to run on more than the images that are there, figure out which images you're going to run on. Look at the ``.csv`` files you'll be feeding to phrosty. For all of the images in those files, copy the data file from the host system to ``ou2024_images`` (preserving the relative path that's in the ``.csv`` files). On Perlmutter, you can find the files underneath:: + + /dvs_ro/cfs/cdirs/lsst/shared/external/roman-desc-sims/Roman_data/RomanTDS/images/simple_model/ + + +Check out the code +^^^^^^^^^^^^^^^^^^ + +In your "parent" directory, get a copy of phrosty:: + + git clone https://github.com/Roman-Supernova-PIT/phrosty.git + +Run the container +^^^^^^^^^^^^^^^^^ + +Do:: + + apptainer shell --nv \ + --bind $PWD:/home \ + --bind $PWD/photometry_test_data:/photometry_test_data \ + --bind $PWD/dia_out_dir:/dia_out_dir \ + --bind $PWD/lc_out_dir:/lc_out_dir \ + --bind $PWD/phrosty_temp:/phrosty_temp \ + --env LD_LIBRARY_PATH=/usr/lib64:/usr/lib/x86_64-linux-gnu:/usr/local/cuda/lib64:/usr/local/cuda/lib64/stubs \ + --env PYTHONPATH=/roman_imsim \ + --env OPENBLAS_NUM_THREADS=1 \ + --env MKL_NUM_THREADS=1 \ + --env NUMEXPR_NUM_THREADS=1 \ + --env OMP_NUM_THREADS=1 \ + --env VECLIB_MAXIMUM_THREADS=1 \ + --env TERM=xterm \ + roman-snpit-env_cuda-dev.sif + + +If you ran the ``apptainer pull`` command above in a different place from where you are now, replace ``roman-snpit-env_cuda-dev.sif`` above with the full path to that ``.sif`` file. 
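Once you have a shell inside the container, running the pipeline looks essentially the same as in the docker examples earlier in this document. The following is only a sketch: it assumes you started the container from your "parent" directory (so the phrosty checkout shows up at ``/home/phrosty``), and it reuses the test config and CSV files from the earlier example; you may need to adapt the config file and paths to the data and bind mounts you actually used::

    cd /home/phrosty
    SNPIT_CONFIG=phrosty/tests/phrosty_test_config.yaml python phrosty/pipeline.py \
        --oc ou2024 \
        --oid 20172782 \
        -b Y106 \
        --ic ou2024 \
        -t phrosty/tests/20172782_instances_templates_1.csv \
        -s phrosty/tests/20172782_instances_science_2.csv \
        -p 3 -w 3
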
+ + Phrosty Functionality ===================== diff --git a/examples/perlmutter/20172782_instances_science.csv b/examples/perlmutter/20172782_instances_science.csv index 5fef7f3..75779c6 100644 --- a/examples/perlmutter/20172782_instances_science.csv +++ b/examples/perlmutter/20172782_instances_science.csv @@ -1,54 +1,54 @@ path pointing sca mjd band -simple_model/R062/35083/Roman_TDS_simple_model_R062_35083_8.fits.gz 35083 8 62455.174 R062 -simple_model/R062/35088/Roman_TDS_simple_model_R062_35088_2.fits.gz 35088 2 62455.19 R062 -simple_model/R062/35857/Roman_TDS_simple_model_R062_35857_3.fits.gz 35857 3 62465.187 R062 -simple_model/R062/36242/Roman_TDS_simple_model_R062_36242_5.fits.gz 36242 5 62470.187 R062 -simple_model/R062/36626/Roman_TDS_simple_model_R062_36626_15.fits.gz 36626 15 62475.185 R062 -simple_model/R062/37391/Roman_TDS_simple_model_R062_37391_6.fits.gz 37391 6 62485.168 R062 -simple_model/R062/37395/Roman_TDS_simple_model_R062_37395_16.fits.gz 37395 16 62485.182 R062 -simple_model/R062/37396/Roman_TDS_simple_model_R062_37396_8.fits.gz 37396 8 62485.185 R062 -simple_model/R062/38155/Roman_TDS_simple_model_R062_38155_11.fits.gz 38155 11 62495.15 R062 -simple_model/R062/38160/Roman_TDS_simple_model_R062_38160_4.fits.gz 38160 4 62495.166 R062 -simple_model/R062/38535/Roman_TDS_simple_model_R062_38535_15.fits.gz 38535 15 62500.134 R062 -simple_model/R062/38920/Roman_TDS_simple_model_R062_38920_3.fits.gz 38920 3 62505.134 R062 -simple_model/R062/39680/Roman_TDS_simple_model_R062_39680_15.fits.gz 39680 15 62515.102 R062 -simple_model/R062/40060/Roman_TDS_simple_model_R062_40060_18.fits.gz 40060 18 62520.086 R062 -simple_model/R062/40820/Roman_TDS_simple_model_R062_40820_10.fits.gz 40820 10 62530.053 R062 -simple_model/R062/40825/Roman_TDS_simple_model_R062_40825_15.fits.gz 40825 15 62530.07 R062 -simple_model/R062/41200/Roman_TDS_simple_model_R062_41200_1.fits.gz 41200 1 62535.037 R062 -simple_model/R062/41205/Roman_TDS_simple_model_R062_41205_12.fits.gz 41205 12 62535.053 R062 -simple_model/R062/42356/Roman_TDS_simple_model_R062_42356_18.fits.gz 42356 18 62550.04 R062 -simple_model/R062/43122/Roman_TDS_simple_model_R062_43122_13.fits.gz 43122 13 62560.027 R062 -simple_model/R062/43507/Roman_TDS_simple_model_R062_43507_17.fits.gz 43507 17 62565.027 R062 -simple_model/R062/43893/Roman_TDS_simple_model_R062_43893_1.fits.gz 43893 1 62570.029 R062 -simple_model/R062/44278/Roman_TDS_simple_model_R062_44278_17.fits.gz 44278 17 62575.029 R062 -simple_model/R062/44279/Roman_TDS_simple_model_R062_44279_9.fits.gz 44279 9 62575.032 R062 -simple_model/R062/44664/Roman_TDS_simple_model_R062_44664_12.fits.gz 44664 12 62580.032 R062 -simple_model/R062/44669/Roman_TDS_simple_model_R062_44669_5.fits.gz 44669 5 62580.048 R062 -simple_model/R062/45054/Roman_TDS_simple_model_R062_45054_14.fits.gz 45054 14 62585.048 R062 -simple_model/R062/47004/Roman_TDS_simple_model_R062_47004_7.fits.gz 47004 7 62610.128 R062 -simple_model/R062/47394/Roman_TDS_simple_model_R062_47394_8.fits.gz 47394 8 62615.144 R062 -simple_model/R062/47784/Roman_TDS_simple_model_R062_47784_5.fits.gz 47784 5 62620.16 R062 -simple_model/R062/48174/Roman_TDS_simple_model_R062_48174_11.fits.gz 48174 11 62625.176 R062 -simple_model/R062/48943/Roman_TDS_simple_model_R062_48943_9.fits.gz 48943 9 62635.174 R062 -simple_model/R062/48948/Roman_TDS_simple_model_R062_48948_3.fits.gz 48948 3 62635.19 R062 -simple_model/R062/49333/Roman_TDS_simple_model_R062_49333_1.fits.gz 49333 1 62640.19 R062 
-simple_model/R062/50102/Roman_TDS_simple_model_R062_50102_2.fits.gz 50102 2 62650.187 R062 -simple_model/R062/50487/Roman_TDS_simple_model_R062_50487_8.fits.gz 50487 8 62655.187 R062 -simple_model/R062/50871/Roman_TDS_simple_model_R062_50871_11.fits.gz 50871 11 62660.185 R062 -simple_model/R062/51256/Roman_TDS_simple_model_R062_51256_4.fits.gz 51256 4 62665.185 R062 -simple_model/R062/51635/Roman_TDS_simple_model_R062_51635_17.fits.gz 51635 17 62670.166 R062 -simple_model/R062/52015/Roman_TDS_simple_model_R062_52015_14.fits.gz 52015 14 62675.15 R062 -simple_model/R062/52020/Roman_TDS_simple_model_R062_52020_1.fits.gz 52020 1 62675.166 R062 -simple_model/R062/52395/Roman_TDS_simple_model_R062_52395_18.fits.gz 52395 18 62680.134 R062 -simple_model/R062/52400/Roman_TDS_simple_model_R062_52400_2.fits.gz 52400 2 62680.15 R062 -simple_model/R062/52780/Roman_TDS_simple_model_R062_52780_12.fits.gz 52780 12 62685.134 R062 -simple_model/R062/53160/Roman_TDS_simple_model_R062_53160_15.fits.gz 53160 15 62690.118 R062 -simple_model/R062/53540/Roman_TDS_simple_model_R062_53540_18.fits.gz 53540 18 62695.102 R062 -simple_model/R062/53920/Roman_TDS_simple_model_R062_53920_17.fits.gz 53920 17 62700.086 R062 -simple_model/R062/54300/Roman_TDS_simple_model_R062_54300_16.fits.gz 54300 16 62705.07 R062 -simple_model/R062/54305/Roman_TDS_simple_model_R062_54305_15.fits.gz 54305 15 62705.086 R062 -simple_model/R062/54685/Roman_TDS_simple_model_R062_54685_14.fits.gz 54685 14 62710.07 R062 -simple_model/R062/55065/Roman_TDS_simple_model_R062_55065_11.fits.gz 55065 11 62715.053 R062 -simple_model/R062/55440/Roman_TDS_simple_model_R062_55440_8.fits.gz 55440 8 62720.021 R062 -simple_model/R062/55445/Roman_TDS_simple_model_R062_55445_3.fits.gz 55445 3 62720.037 R062 +R062/35083/Roman_TDS_simple_model_R062_35083_8.fits.gz 35083 8 62455.174 R062 +R062/35088/Roman_TDS_simple_model_R062_35088_2.fits.gz 35088 2 62455.19 R062 +R062/35857/Roman_TDS_simple_model_R062_35857_3.fits.gz 35857 3 62465.187 R062 +R062/36242/Roman_TDS_simple_model_R062_36242_5.fits.gz 36242 5 62470.187 R062 +R062/36626/Roman_TDS_simple_model_R062_36626_15.fits.gz 36626 15 62475.185 R062 +R062/37391/Roman_TDS_simple_model_R062_37391_6.fits.gz 37391 6 62485.168 R062 +R062/37395/Roman_TDS_simple_model_R062_37395_16.fits.gz 37395 16 62485.182 R062 +R062/37396/Roman_TDS_simple_model_R062_37396_8.fits.gz 37396 8 62485.185 R062 +R062/38155/Roman_TDS_simple_model_R062_38155_11.fits.gz 38155 11 62495.15 R062 +R062/38160/Roman_TDS_simple_model_R062_38160_4.fits.gz 38160 4 62495.166 R062 +R062/38535/Roman_TDS_simple_model_R062_38535_15.fits.gz 38535 15 62500.134 R062 +R062/38920/Roman_TDS_simple_model_R062_38920_3.fits.gz 38920 3 62505.134 R062 +R062/39680/Roman_TDS_simple_model_R062_39680_15.fits.gz 39680 15 62515.102 R062 +R062/40060/Roman_TDS_simple_model_R062_40060_18.fits.gz 40060 18 62520.086 R062 +R062/40820/Roman_TDS_simple_model_R062_40820_10.fits.gz 40820 10 62530.053 R062 +R062/40825/Roman_TDS_simple_model_R062_40825_15.fits.gz 40825 15 62530.07 R062 +R062/41200/Roman_TDS_simple_model_R062_41200_1.fits.gz 41200 1 62535.037 R062 +R062/41205/Roman_TDS_simple_model_R062_41205_12.fits.gz 41205 12 62535.053 R062 +R062/42356/Roman_TDS_simple_model_R062_42356_18.fits.gz 42356 18 62550.04 R062 +R062/43122/Roman_TDS_simple_model_R062_43122_13.fits.gz 43122 13 62560.027 R062 +R062/43507/Roman_TDS_simple_model_R062_43507_17.fits.gz 43507 17 62565.027 R062 +R062/43893/Roman_TDS_simple_model_R062_43893_1.fits.gz 43893 1 62570.029 R062 
+R062/44278/Roman_TDS_simple_model_R062_44278_17.fits.gz 44278 17 62575.029 R062 +R062/44279/Roman_TDS_simple_model_R062_44279_9.fits.gz 44279 9 62575.032 R062 +R062/44664/Roman_TDS_simple_model_R062_44664_12.fits.gz 44664 12 62580.032 R062 +R062/44669/Roman_TDS_simple_model_R062_44669_5.fits.gz 44669 5 62580.048 R062 +R062/45054/Roman_TDS_simple_model_R062_45054_14.fits.gz 45054 14 62585.048 R062 +R062/47004/Roman_TDS_simple_model_R062_47004_7.fits.gz 47004 7 62610.128 R062 +R062/47394/Roman_TDS_simple_model_R062_47394_8.fits.gz 47394 8 62615.144 R062 +R062/47784/Roman_TDS_simple_model_R062_47784_5.fits.gz 47784 5 62620.16 R062 +R062/48174/Roman_TDS_simple_model_R062_48174_11.fits.gz 48174 11 62625.176 R062 +R062/48943/Roman_TDS_simple_model_R062_48943_9.fits.gz 48943 9 62635.174 R062 +R062/48948/Roman_TDS_simple_model_R062_48948_3.fits.gz 48948 3 62635.19 R062 +R062/49333/Roman_TDS_simple_model_R062_49333_1.fits.gz 49333 1 62640.19 R062 +R062/50102/Roman_TDS_simple_model_R062_50102_2.fits.gz 50102 2 62650.187 R062 +R062/50487/Roman_TDS_simple_model_R062_50487_8.fits.gz 50487 8 62655.187 R062 +R062/50871/Roman_TDS_simple_model_R062_50871_11.fits.gz 50871 11 62660.185 R062 +R062/51256/Roman_TDS_simple_model_R062_51256_4.fits.gz 51256 4 62665.185 R062 +R062/51635/Roman_TDS_simple_model_R062_51635_17.fits.gz 51635 17 62670.166 R062 +R062/52015/Roman_TDS_simple_model_R062_52015_14.fits.gz 52015 14 62675.15 R062 +R062/52020/Roman_TDS_simple_model_R062_52020_1.fits.gz 52020 1 62675.166 R062 +R062/52395/Roman_TDS_simple_model_R062_52395_18.fits.gz 52395 18 62680.134 R062 +R062/52400/Roman_TDS_simple_model_R062_52400_2.fits.gz 52400 2 62680.15 R062 +R062/52780/Roman_TDS_simple_model_R062_52780_12.fits.gz 52780 12 62685.134 R062 +R062/53160/Roman_TDS_simple_model_R062_53160_15.fits.gz 53160 15 62690.118 R062 +R062/53540/Roman_TDS_simple_model_R062_53540_18.fits.gz 53540 18 62695.102 R062 +R062/53920/Roman_TDS_simple_model_R062_53920_17.fits.gz 53920 17 62700.086 R062 +R062/54300/Roman_TDS_simple_model_R062_54300_16.fits.gz 54300 16 62705.07 R062 +R062/54305/Roman_TDS_simple_model_R062_54305_15.fits.gz 54305 15 62705.086 R062 +R062/54685/Roman_TDS_simple_model_R062_54685_14.fits.gz 54685 14 62710.07 R062 +R062/55065/Roman_TDS_simple_model_R062_55065_11.fits.gz 55065 11 62715.053 R062 +R062/55440/Roman_TDS_simple_model_R062_55440_8.fits.gz 55440 8 62720.021 R062 +R062/55445/Roman_TDS_simple_model_R062_55445_3.fits.gz 55445 3 62720.037 R062 diff --git a/examples/perlmutter/20172782_instances_science_2.csv b/examples/perlmutter/20172782_instances_science_2.csv index beebb7e..40f1b17 100644 --- a/examples/perlmutter/20172782_instances_science_2.csv +++ b/examples/perlmutter/20172782_instances_science_2.csv @@ -1,3 +1,3 @@ path pointing sca mjd band -simple_model/R062/35083/Roman_TDS_simple_model_R062_35083_8.fits.gz 35083 8 62455.174 R062 -simple_model/R062/41205/Roman_TDS_simple_model_R062_41205_12.fits.gz 41205 12 62535.053 R062 +R062/35083/Roman_TDS_simple_model_R062_35083_8.fits.gz 35083 8 62455.174 R062 +R062/41205/Roman_TDS_simple_model_R062_41205_12.fits.gz 41205 12 62535.053 R062 diff --git a/examples/perlmutter/20172782_instances_templates_1.csv b/examples/perlmutter/20172782_instances_templates_1.csv index 3f3ddca..9383359 100644 --- a/examples/perlmutter/20172782_instances_templates_1.csv +++ b/examples/perlmutter/20172782_instances_templates_1.csv @@ -1,2 +1,2 @@ path pointing sca mjd band -simple_model/R062/6/Roman_TDS_simple_model_R062_6_17.fits.gz 6 17 62000.04011 R062 
+R062/6/Roman_TDS_simple_model_R062_6_17.fits.gz 6 17 62000.04011 R062 diff --git a/examples/perlmutter/20172782_instances_templates_10.csv b/examples/perlmutter/20172782_instances_templates_10.csv index 52641f6..2fcac57 100644 --- a/examples/perlmutter/20172782_instances_templates_10.csv +++ b/examples/perlmutter/20172782_instances_templates_10.csv @@ -1,11 +1,11 @@ path pointing sca mjd band -simple_model/R062/6/Roman_TDS_simple_model_R062_6_17.fits.gz 6 17 62000.04011 R062 -simple_model/R062/1928/Roman_TDS_simple_model_R062_1928_10.fits.gz 1928 10 62025.0294 R062 -simple_model/R062/2314/Roman_TDS_simple_model_R062_2314_6.fits.gz 2314 6 62030.0321 R062 -simple_model/R062/2319/Roman_TDS_simple_model_R062_2319_8.fits.gz 2319 8 62030.0481 R062 -simple_model/R062/2709/Roman_TDS_simple_model_R062_2709_7.fits.gz 2709 7 62035.0642 R062 -simple_model/R062/3089/Roman_TDS_simple_model_R062_3089_17.fits.gz 3089 17 62040.0481 R062 -simple_model/R062/3094/Roman_TDS_simple_model_R062_3094_10.fits.gz 3094 10 62040.0642 R062 -simple_model/R062/4264/Roman_TDS_simple_model_R062_4264_7.fits.gz 4264 7 62055.1123 R062 -simple_model/R062/5044/Roman_TDS_simple_model_R062_5044_9.fits.gz 5044 9 62065.1444 R062 -simple_model/R062/5429/Roman_TDS_simple_model_R062_5429_4.fits.gz 5429 4 62070.1444 R062 +R062/6/Roman_TDS_simple_model_R062_6_17.fits.gz 6 17 62000.04011 R062 +R062/1928/Roman_TDS_simple_model_R062_1928_10.fits.gz 1928 10 62025.0294 R062 +R062/2314/Roman_TDS_simple_model_R062_2314_6.fits.gz 2314 6 62030.0321 R062 +R062/2319/Roman_TDS_simple_model_R062_2319_8.fits.gz 2319 8 62030.0481 R062 +R062/2709/Roman_TDS_simple_model_R062_2709_7.fits.gz 2709 7 62035.0642 R062 +R062/3089/Roman_TDS_simple_model_R062_3089_17.fits.gz 3089 17 62040.0481 R062 +R062/3094/Roman_TDS_simple_model_R062_3094_10.fits.gz 3094 10 62040.0642 R062 +R062/4264/Roman_TDS_simple_model_R062_4264_7.fits.gz 4264 7 62055.1123 R062 +R062/5044/Roman_TDS_simple_model_R062_5044_9.fits.gz 5044 9 62065.1444 R062 +R062/5429/Roman_TDS_simple_model_R062_5429_4.fits.gz 5429 4 62070.1444 R062 diff --git a/examples/perlmutter/20172782_slurm_demo.sh b/examples/perlmutter/20172782_slurm_demo.sh index c0cad8a..a8eb91a 100644 --- a/examples/perlmutter/20172782_slurm_demo.sh +++ b/examples/perlmutter/20172782_slurm_demo.sh @@ -40,10 +40,10 @@ podman-hpc run --gpu \ registry.nersc.gov/m4385/rknop/roman-snpit-env:cuda-dev \ python phrosty/phrosty/pipeline.py \ -c phrosty/examples/perlmutter/phrosty_config.yaml \ + --oc ou2024 \ --oid 20172782 \ - -r 7.551093401915147 \ - -d -44.80718106491529 \ -b R062 \ + --ic ou2024 -t phrosty/examples/perlmutter/20172782_instances_templates_1.csv \ -s phrosty/examples/perlmutter/20172782_instances_science.csv \ -p 9 \ diff --git a/examples/perlmutter/phrosty_config.yaml b/examples/perlmutter/phrosty_config.yaml index bd2c04e..72b8cfa 100644 --- a/examples/perlmutter/phrosty_config.yaml +++ b/examples/perlmutter/phrosty_config.yaml @@ -2,12 +2,13 @@ ou24: tds_base: /ou2024/RomanTDS config_file: /home/phrosty/examples/perlmutter/tds.yaml sn_truth_dir: /ou2024_snana - images: /ou2024/RomanTDS/images + images: /ou2024/RomanTDS/images/simple_model sims_sed_library: /ou2024_sims_sed_library photometry: snappl: A25ePSF_path: /snappl/snappl/tests/testdata/A25ePSF + temp_dir: /phrosty_temp phrosty: image_type: ou2024fits diff --git a/phrosty/auxiliary/tds.yaml b/phrosty/auxiliary/tds.yaml deleted file mode 100644 index d03289f..0000000 --- a/phrosty/auxiliary/tds.yaml +++ /dev/null @@ -1,161 +0,0 @@ -# Default settings for roman 
simulation -# Includes creation of noisless oversampled images (including PSF) -# -- processing of other detector and instrument effects are still handled in the -# python postprocessing layer to enable things not currently in galsim.roman - -modules: - - # Including galsim.roman in the list of modules to import will add a number of Roman-specific - # functions and classes that we will use here. - - roman_imsim - - galsim.roman - - # We need this for one of our Eval items. GalSim does not by default import datetime into - # the globals dict it uses when evaluating Eval items, so we can tell it to import it here. - - datetime - -# Define some other information about the images -image: - - # A special Image type that knows all the Roman SCA geometry, WCS, gain, etc. - # It also by default applies a number of detector effects, but these can be turned - # off if desired by setting some parameters (given below) to False. - type: roman_sca - - wcs: - type: RomanWCS - SCA: '@image.SCA' - ra: { type: ObSeqData, field: ra } - dec: { type: ObSeqData, field: dec } - pa: { type: ObSeqData, field: pa } - mjd: { type: ObSeqData, field: mjd } - max_sun_angle: 50 - force_cvz: True - - bandpass: - type: RomanBandpass - name: { type: ObSeqData, field: filter } - - # When you want to have multiple images generate the same random galaxies, then - # you can set up multiple random number generators with different update cadences - # by making random_seed a list. - # The default behavior is just to have the random seeds for each object go in order by - # object number across all images, but this shows how to set it up so we use two separate - # cadences. - # The first one behaves normally, which will be used for things like noise on the image. - # The second one sets the initial seed for each object to repeat to the same starting value - # at the start of each filter. If we were doing more than 3 total files, it would then - # move on to another sequence for the next 3 and so on. - random_seed: - # Used for noise and nobjects. - - { type: ObSeqData, field: visit } - - # Used for objects. Repeats sequence for each filter - # Note: Don't use $ shorthand here, since that will implicitly be evaluated once and then - # treated the same way as an integer (i.e. making a regular sequence starting from that - # value). Using an explicit dict with an Eval type means GalSim will leave it alone and - # evaluate it as is for each object. - - - # We're just doing one SCA here. - # If you wanted to do all of them in each of three filters (given below), you could use: - # - # SCA: - # type: Sequence - # first: 1 - # last: 18 - # repeat: 3 # repeat each SCA num 3 times before moving on, for the 3 filters. - # - SCA: 10 - mjd: { type: ObSeqData, field: mjd } - filter: { type: ObSeqData, field: filter } - exptime: { type: ObSeqData, field: exptime } - - # Photon shooting is way faster for chromatic objects than fft, especially when most of them - # are fairly faint. The cross-over point for achromatic objects is generally of order - # flux=1.e6 or so (depending on the profile). Most of our objects here are much fainter than - # that. The fft rendering for chromatic is a factor of 10 or so slower still, whereas - # chromatic photon shooting is only slighly slower than achromatic, so the difference - # is even more pronounced in this case. 
- draw_method: 'auto' - - # These are all by default turned on, but you can turn any of them off if desired: - ignore_noise: True - stray_light: False - thermal_background: False - reciprocity_failure: False - dark_current: False - nonlinearity: False - ipc: False - read_noise: False - sky_subtract: False - -stamp: - type: Roman_stamp - world_pos: - type: SkyCatWorldPos - exptime: { type: ObSeqData, field: exptime } - - photon_ops: - - - type: ChargeDiff - -# psf: -# type: roman_psf -# # If omitted, it would figure this out automatically, because we are using the RomanSCA image -# # type. But if we weren't, you'd have to tell it which SCA to build the PSF for. -# SCA: '@image.SCA' -# # n_waves defines how finely to sample the PSF profile over the bandpass. -# # Using 10 wavelengths usually gives decent accuracy. -# n_waves: 10 - -# Define the galaxy type and positions to use -gal: - type: SkyCatObj - -input: - obseq_data: - file_name: /cwork/mat90/RomanDESC_sims_2024/RomanTDS/Roman_TDS_obseq_11_6_23.fits - visit: 80840 - SCA: '@image.SCA' - roman_psf: - SCA: '@image.SCA' - n_waves: 5 - sky_catalog: - file_name: /cwork/mat90/RomanDESC_sims_2024/roman_rubin_cats_v1.1.2_faint/skyCatalog.yaml - edge_pix: 512 - mjd: { type: ObSeqData, field: mjd } - exptime: { type: ObSeqData, field: exptime } - obj_types: ['diffsky_galaxy', 'star', 'snana'] - -output: - - nfiles: 1 - dir: /cwork/mat90/RomanDESC_sims_2024/RomanTDS/images/truth - file_name: - type: FormattedStr - format: "Roman_TDS_truth_%s_%i_%i.fits.gz" - items: - - { type: ObSeqData, field: filter } - - { type: ObSeqData, field: visit } - - '@image.SCA' - - truth: - dir: /cwork/mat90/RomanDESC_sims_2024/RomanTDS/truth - file_name: - type: FormattedStr - format: "Roman_TDS_index_%s_%i_%i.txt" - items: - - { type: ObSeqData, field: filter } - - { type: ObSeqData, field: visit } - - '@image.SCA' - columns: - object_id: "@object_id" - ra: "$sky_pos.ra.deg" - dec: "$sky_pos.dec.deg" - x: "$image_pos.x" - y: "$image_pos.y" - realized_flux: "@realized_flux" - flux: "@flux" - mag: "@mag" - obj_type: "@object_type" diff --git a/phrosty/unused.py b/phrosty/deprecated/unused.py similarity index 100% rename from phrosty/unused.py rename to phrosty/deprecated/unused.py diff --git a/phrosty/deprecated/utils.py b/phrosty/deprecated/utils.py new file mode 100644 index 0000000..dce4a13 --- /dev/null +++ b/phrosty/deprecated/utils.py @@ -0,0 +1,633 @@ +# ruff: noqa + +# This module should no longer be needed, and should be deleted soon. +# +# All the functions are left here commented out for now so that when we +# get to fixing anything that's in plotting.py or plotaesthetics.py, we know +# what they were trying to do before, and can figure out how to get what we +# need out of snappl. 
+ +# __all__ = [ 'get_roman_bands', 'read_truth_txt', 'radec_isin', 'get_corners', +# 'get_transient_radec', 'get_transient_mjd', 'get_transient_zcmb', +# 'get_transient_peakmjd', 'get_transient_info', 'transient_in_or_out', +# 'get_mjd_limits', 'get_radec_limits', 'get_mjd', 'pointings_near_mjd', +# 'get_mjd_info', 'get_exptime', 'make_object_table' ] + +# # IMPORTS Standard: +# import os +# import os.path as pa +# import numpy as np +# import pandas as pd +# import requests +# import warnings +# from glob import glob + +# # IMPORTS Astro: +# from astropy.coordinates import SkyCoord +# from astropy.io import fits +# from astropy.wcs import WCS, FITSFixedWarning +# import astropy.wcs.wcs +# from astropy.wcs.utils import skycoord_to_pixel +# from astropy.table import Table +# from astropy import units as u + +# # IMPORTS snpit +# from snpit_utils.config import Config +# from snpit_utils.logger import SNLogger + +# # The FITSFixedWarning is consequenceless and it will get thrown every single time you deal with a WCS. +# warnings.simplefilter('ignore', category=FITSFixedWarning) + + +# def _build_filepath(path, band, pointing, sca, filetype, rootdir=None): +# """Builds the filepath to an OpenUniverse simulation file. + +# Parameters +# ---------- +# path: Path +# If the path to the file is already known, this overrides the rest of the function and returns +# the input to kwarg 'path'. +# band: str +# Filter associated with target file. +# pointing: str +# Pointing number associated with target file. +# sca: str +# SCA number associated with target file. +# filetype: str +# The type of target file within the OpenUniverse simulations that you are looking for. Valid +# values are 'image' (*.fits.gz), 'truth' (*.fits.gz), and 'truthtxt' (*.txt). +# rootdir: Path, default None +# Root directory where OpenUniverse files are stored. + +# Returns +# ------- +# path: Path +# Path to target file. + +# Raises +# ------ +# ValueError +# if filetype is not 'image', 'truth', or 'truthtxt', a ValueError is raised. +# ValueError +# if (band is None) or (pointing is None) or (sca is None), +# you need to specify band, pointing, and sca if you do not provide a full filepath. +# ValueError +# if (path is None) and (band is None) and (pointing is None) and (sca is None), +# you need to provide either the full image path, or the band, pointing, and SCA. + +# """ + +# rootdir = Config.get().value( 'ou24.tds_base' ) if rootdir is None else rootdir + +# # First, what kind of file are we looking for? +# neededtypes = [ 'image', 'truth', 'truthtxt' ] +# if filetype not in neededtypes: +# raise ValueError(f"filetype must be in {neededtypes}.") +# elif filetype == 'image': +# subdir = 'images/simple_model' +# prefix = 'simple_model' +# extension = 'fits.gz' +# elif filetype == 'truth': +# subdir = 'images/truth' +# prefix = 'truth' +# extension = 'fits.gz' +# elif filetype == 'truthtxt': +# subdir = 'truth' +# prefix = 'index' +# extension = 'txt' + +# # Did you already provide a path? 
+# if path is not None: +# return path +# elif (band is None) or (pointing is None) or (sca is None): +# raise ValueError('You need to specify band, pointing, and sca if you do not provide a full filepath.') +# elif (band is not None) and (pointing is not None) and (sca is not None): +# path = pa.join(rootdir, subdir, band, str(pointing), +# f'Roman_TDS_{prefix}_{band}_{str(pointing)}_{str(sca)}.{extension}') +# return path + +# elif (path is None) and (band is None) and (pointing is None) and (sca is None): +# raise ValueError('You need to provide either the full image path, or the band, pointing, and SCA.') + + +# def ou2024_obseq_path( path=None ): +# """Retrieve the path to the OpenUniverse obseq file. + +# Parameters +# ---------- +# path: Path, default None +# Path to file. If not provided, use config file. + +# Returns +# ------- +# Path to OpenUniverse obseq file. + +# """ +# return ( os.path.join( Config.get().value('ou24.tds_base'), 'Roman_TDS_obseq_11_6_23.fits' ) +# if path is None else path ) + + +# def get_roman_bands(): +# """Get roman passbands. + +# Returns +# ------- +# List of bands included in the Roman-DESC TDS simulations. + +# """ +# return ['R062', 'Z087', 'Y106', 'J129', 'H158', 'F184', 'K213'] + + +# def read_truth_txt(path=None, band=None, pointing=None, sca=None): +# """Reads in the txt versions of the OpenUniverse truth files as convenient astropy tables. + +# Parameters +# ---------- +# truthpath: str, default None +# Path to txt catalog version of truth file. If you do not +# provide this, you need to specify the arguments +# band, pointing, and sca. +# band: str, default None +# Roman filter. If you do not provide this, you need to provide truthpath. +# pointing: str, default None +# Pointing ID. If you do not provide this, you need to provide truthpath. +# sca: str, default None +# SCA ID. If you do not provide this, you need to provide truthpath. + +# Returns +# ------- +# truth: astropy.table.Table +# Astropy table with contents of the specified catalog txt file. + +# """ + +# _truthpath = _build_filepath(path=path, band=band, pointing=pointing, sca=sca, filetype='truthtxt') +# truth_colnames = ['object_id', 'ra', 'dec', 'x', 'y', 'realized_flux', 'flux', 'mag', 'obj_type'] +# truth_pd = pd.read_csv(_truthpath, comment='#', skipinitialspace=True, sep=' ', names=truth_colnames) +# truth = Table.from_pandas(truth_pd) + +# return truth + + +# def radec_isin(ra, dec, path=None, band=None, pointing=None, sca=None): +# """Check if a given RA, dec coordinate is in a given target image. + +# Parameters +# ---------- +# ra : float +# RA in degrees. +# dec : float +# Dec in degrees. +# path : Path, default None +# Path to image to check. +# band : str, default None +# Filter assocated with target image. +# pointing : str, default None +# Pointing associated with target image. +# sca : str, default None +# SCA associated with target image. + +# Returns +# ------- +# res: boolean +# True if provided RA, Dec is in the image. False if not. 
+ +# """ + +# _imgpath = _build_filepath(path, band, pointing, sca, 'image') +# with fits.open(_imgpath) as hdu: +# wcs = WCS(hdu[1].header) +# worldcoords = SkyCoord(ra=ra*u.deg, dec=dec*u.deg) +# try: +# x, y = skycoord_to_pixel(worldcoords, wcs) +# except astropy.wcs.wcs.NoConvergence: +# return False +# pxradec = np.array([x, y]) +# if np.logical_or(any(pxradec < 0), any(pxradec > 4088)): +# res = False +# else: +# res = True + +# return res + + +# def get_corners(path=None, band=None, pointing=None, sca=None): +# """Retrieves the RA, dec of the corners of the specified SCA in degrees. + +# :param band: Roman filter. +# :type band: str +# :param pointing: Pointing ID. +# :type pointing: str +# :param sca: SCA ID. +# :type sca: str +# :return: Tuple containing four numpy arrays, each with the RA and dec of the corner +# of the specified image in degrees. +# :rtype: tuple +# """ +# _imgpath = _build_filepath(path, band, pointing, sca, 'image') +# with fits.open(_imgpath) as hdu: +# wcs = WCS(hdu[1].header) +# corners = [[0, 0], [0, 4088], [4088, 0], [4088, 4088]] +# wcs_corners = wcs.pixel_to_world_values(corners) + +# return wcs_corners + + +# # TODO clean this up for style +# # TODO write something to clear this out +# _parquet_cache = {} + + +# def _read_parquet( file ): +# global _parquet_cache + +# if file not in _parquet_cache: +# SNLogger.info( f"**** Reading parquet file {file}" ) +# _parquet_cache[file] = pd.read_parquet( file ) + +# totm = 0 +# for f, df in _parquet_cache.items(): +# totm += df.memory_usage(index=True).sum() + +# SNLogger.info( f"**** Done reading parquet file {file}; cache using {totm/1024/1024} MiB" ) +# return _parquet_cache[ file ] + + +# def get_transient_radec(oid): +# """Retrieve RA, dec of a transient based on its object ID in the OpenUniverse sims. + +# Parameters +# ---------- +# oid : int +# Object ID of target transient. + +# Returns +# ------- +# ra : float +# RA in degrees of target transient. +# dec : float +# Dec in degrees of target transient. + +# """ + +# oid = int(oid) +# snana_pq_path = os.path.join( Config.get().value('ou24.sn_truth_dir'), 'snana_*.parquet' ) +# file_list = glob(snana_pq_path) +# for file in file_list: +# # Read the Parquet file +# df = _read_parquet(file) +# if len(df[df['id'] == oid]) != 0: +# ra = df[df['id'] == oid]['ra'].values[0] +# dec = df[df['id'] == oid]['dec'].values[0] +# return ra, dec + + +# def get_transient_mjd(oid): +# """Retrieve start and end dates of a transient in the OpenUniverse sims based on its object ID. + +# Parameters +# ---------- +# oid : int +# Object ID of target transient. + +# Returns +# ------- +# start : float +# Simulated start MJD of target transient. + +# end : float +# Simulated end MJD of target transient. + +# """ +# oid = int(oid) +# snana_pq_path = os.path.join( Config.get().value('ou24.sn_truth_dir'), 'snana_*.parquet' ) +# file_list = glob(snana_pq_path) +# for file in file_list: +# # Read the Parquet file +# df = _read_parquet(file) +# if len(df[df['id'] == oid]) != 0: +# start = df[df['id'] == oid]['start_mjd'].values[0] +# end = df[df['id'] == oid]['end_mjd'].values[0] + +# return start, end + + +# def get_transient_zcmb(oid): + +# """Retrieve z_CMB of a transient in the OpenUniverse sims based on its object ID. + +# Parameters +# ---------- +# oid : int +# Object ID of target transient. + +# Returns +# ------- +# z : float +# z_CMB of target transient. 
+ +# """ + +# oid = int(oid) +# snana_pq_path = os.path.join( Config.get().value('ou24.sn_truth_dir'), 'snana_*.parquet' ) +# file_list = glob(snana_pq_path) +# for file in file_list: +# # Read the Parquet file +# df = _read_parquet(file) +# if len(df[df['id'] == oid]) != 0: +# z = float(df[df['id'] == oid]['z_CMB'].values[0]) + +# return z + + +# def get_transient_peakmjd(oid): + +# """Retrieve MJD of peak brightness for a transient object in the OpenUniverse sims. + +# Parameters +# ---------- +# oid : int +# Object ID of target transient. + +# Returns +# ------- +# mjd : float +# Peak MJD of target transient. + +# """ + +# oid = int(oid) +# snana_pq_path = os.path.join( Config.get().value('ou24.sn_truth_dir'), 'snana_*.parquet' ) +# file_list = glob(snana_pq_path) +# for file in file_list: +# # Read the Parquet file +# df = _read_parquet(file) +# if len(df[df['id'] == oid]) != 0: +# mjd = df[df['id'] == oid]['peak_mjd'].values[0] + +# return mjd + + +# def get_transient_info(oid): +# """Retrieve RA, Dec, start MJD, and end MJD for a specified object in the OpenUnvierse sims. + +# This function calls get_transient_radec() and get_transient_mjd(). + +# Parameters +# ---------- +# oid : int +# Object ID of target transient. + +# Returns +# ------- +# ra : float +# RA in degrees of target transient. + +# dec : float +# Dec in degrees of target transient. + +# start : float +# Simulated start MJD of target transient. + +# end : float +# Simulated end MJD of target transient. + +# """ + +# SNLogger.info( "*** calling get_transient_radec" ) +# ra, dec = get_transient_radec(oid) +# SNLogger.info( "*** calling get_transient_mjd" ) +# start, end = get_transient_mjd(oid) +# SNLogger.info( "*** Done with get_transient_info" ) + +# return ra, dec, start, end + + +# def transient_in_or_out(oid, start, end, band): +# """Retrieve pointings that contain and do not contain the specified SN, per the truth files by MJD. + +# transient_info_filepath is the output of make_object_table. + +# Returns a tuple of astropy tables (images with the SN, images without the SN). +# """ +# tab = Table.from_pandas( make_object_table( oid ) ) + +# tab.sort('pointing') +# tab = tab[tab['filter'] == band] + +# in_all = get_mjd_info(start, end) +# in_rows = np.where(np.isin(tab['pointing'], in_all['pointing']))[0] +# in_tab = tab[in_rows] + +# out_all = get_mjd_info(start, end, return_inverse=True) +# out_rows = np.where(np.isin(tab['pointing'], out_all['pointing']))[0] +# out_tab = tab[out_rows] + +# return in_tab, out_tab + + +# def get_mjd_limits(obseq_path=None): +# """Retrive the earliest and latest MJD in the OpenUniverse TDS simulations. + +# Parameters +# ---------- +# obseq_path : Path, default None +# Path to obseq file Roman_TDS_obseq_11_6_23.fits. + +# Returns +# ------- +# start : float +# Simulated start MJD of OpenUniverse TDS survey. + +# end : float +# Simulated end MJD OpenUniverse TDS survey. + +# """ + +# with fits.open(ou2024_obseq_path(obseq_path)) as obs: +# obseq = Table(obs[1].data) + +# start = min(obseq['date']) +# end = max(obseq['date']) + +# return start, end + + +# def get_radec_limits(obseq_path=None): +# """Retrieve the RA, dec limits of the boresight coordinates of the simulated TDS OpenUniverse survey in degrees. + +# Parameters +# ---------- +# obseq_path : Path, default None +# Path to obseq file Roman_TDS_obseq_11_6_23.fits. + +# Returns +# ------- +# Dictionary with keys 'ra' and 'dec'. 
Each key identifies a list +# with [minimum RA, maximum RA] and [minimum Dec, maximum Dec], +# respectively. + +# """ +# with fits.open(ou2024_obseq_path(obseq_path)) as obs: +# obseq = Table(obs[1].data) + +# ra_min = min(obseq['ra']) +# ra_max = max(obseq['ra']) + +# dec_min = min(obseq['dec']) +# dec_max = max(obseq['dec']) + +# return {'ra': [ra_min, ra_max], 'dec': [dec_min, dec_max]} + + +# def get_mjd(pointing, obseq_path=None): +# """Retrieve the MJD of a given pointing in the OpenUniverse TDS simulation. + +# Parameters +# ---------- +# pointing : int +# Pointing number. +# obseq_path: Path, default None +# Path to obseq file Roman_TDS_obseq_11_6_23.fits. + +# Returns +# ------- +# mjd : float +# MJD of the specified input pointing. + +# """ + +# with fits.open(ou2024_obseq_path(obseq_path)) as obs: +# obseq = Table(obs[1].data) +# mjd = float(obseq['date'][int(pointing)]) + +# return mjd + + +# def pointings_near_mjd(mjd, window=3, obseq_path=None): +# """Retrieve pointings near given MJD. + +# Parameters +# ---------- +# mjd : float +# Central MJD to search around. +# window : float, default 3 +# Number of days around central MJD to include in search. +# obseq_path : Path, default None +# Path to obseq file Roman_TDS_obseq_11_6_23.fits. + +# Returns +# ------- +# pointings : list +# List of pointings within specified MJD range. + +# """ + +# with fits.open(ou2024_obseq_path(obseq_path)) as obs: +# obseq = Table(obs[1].data) + +# pointings = np.where(np.logical_and(obseq['date'] < mjd + window, obseq['date'] > mjd - window))[0] +# return pointings + + +# def get_mjd_info(mjd_start=0, mjd_end=np.inf, return_inverse=False, obseq_path=None): +# """Get all pointings and corresponding filters between two MJDs. + +# Returns an astropy table with columns 'filter' and 'pointing'. +# Does not return an 'sca' column because every sca belonging to a +# given pointing satisfies an MJD requirement. + +# Parameters +# ---------- +# mjd_start : float, default 0 +# Start MJD +# mjd_end : float, default np.inf +# End MJD +# return_inverse: boolean, default False +# If true, returns all pointings outside the MJD range specified instead of inside. +# obseq_path: Path, default None +# Path to obseq file Roman_TDS_obseq_11_6_23.fits. + +# Returns +# ------- +# mjd_tab : astropy.table.Table +# Astropy table with pointing numbers and corresponding filters that satisfy the +# MJD requirements. + +# """ +# with fits.open(ou2024_obseq_path(obseq_path)) as obs: +# obseq = Table(obs[1].data) + +# if not return_inverse: +# mjd_idx = np.where((obseq['date'] > float(mjd_start)) & (obseq['date'] < float(mjd_end)))[0] +# elif return_inverse: +# mjd_idx = np.where(~((obseq['date'] > float(mjd_start)) & (obseq['date'] < float(mjd_end))))[0] + +# mjd_tab = Table([obseq['filter'][mjd_idx], mjd_idx], names=('filter', 'pointing')) + +# return mjd_tab + + +# def get_exptime(band=None): + +# """Retrieves exposure times from the TDS OpenUniverse sims. + +# https://arxiv.org/abs/2501.05632 + +# Parameters +# ---------- +# band : str, default None +# Band for which to retrieve exposure time. + +# Returns +# ------- +# exptime : dict or float +# If a band is specified, a float with the exposure time in seconds is +# returned. If no band is specified, a dictionary with the exposure time +# for all bands is returned. 
+# """ + +# exptime = {'F184': 901.175, +# 'J129': 302.275, +# 'H158': 302.275, +# 'K213': 901.175, +# 'R062': 161.025, +# 'Y106': 302.275, +# 'Z087': 101.7} + +# if band in exptime.keys(): +# return exptime[band] +# else: +# return exptime + + +# def make_object_table(oid): +# """Retrieves a table with all images that contain the RA, Dec coordinates of a specified transient. + +# Parameters +# ---------- +# oid : int +# Object ID of target transient. + +# Returns +# ------- +# objs : pd.DataFrame +# Table with columns ['filter', 'pointing', 'sca']. + +# Raises +# ------ +# RuntimeError +# Error is raised if indexing fails. It will probably work if you +# run it again. + +# """ +# ra, dec = get_transient_radec(oid) + +# server_url = 'https://roman-desc-simdex.lbl.gov' +# req = requests.Session() +# result = req.post(f'{server_url}/findromanimages/containing=({ra}, {dec})') +# if result.status_code != 200: +# raise RuntimeError(f"Got status code {result.status_code}\n{result.text}") + +# objs = pd.DataFrame(result.json())[['filter', 'pointing', 'sca']] +# return objs diff --git a/phrosty/imagesubtraction.py b/phrosty/imagesubtraction.py index de097ab..7ce2c9c 100644 --- a/phrosty/imagesubtraction.py +++ b/phrosty/imagesubtraction.py @@ -1,126 +1,103 @@ -__all__ = [ 'gz_and_ext', 'sky_subtract', 'stampmaker' ] +__all__ = [ 'sky_subtract', 'stampmaker' ] # IMPORTS Standard: -import os -import io import numpy as np -import gzip -import multiprocessing -import shutil import pathlib - -# IMPORTS Astro: -from astropy.io import fits -from astropy.coordinates import SkyCoord -from astropy.wcs.utils import skycoord_to_pixel -import astropy.units as u -from astropy.wcs import WCS +import random # IMPORTS SFFT: from sfft.utils.SExSkySubtract import SEx_SkySubtract from sfft.utils.StampGenerator import Stamp_Generator # IMPORTS internal +import snappl.image from snpit_utils.logger import SNLogger from snpit_utils.config import Config -def gz_and_ext(in_path, out_path): - """Utility function that unzips the original file and turns - it into a single-extension FITS file. - - Parameters - ---------- - in_path: Path - The input path of the file to unzip and flatten. - - out_path: Path - The desired output path of the file that is unzipped and flattened, once it is saved. - - Returns - ------- - out_path: Path - The output path of the file that has been unzipped and flattened. - - """ - - bio = io.BytesIO() - with gzip.open(in_path, 'rb') as f_in: - shutil.copyfileobj(f_in, bio) - bio.seek(0) - - with fits.open(bio) as hdu: - newhdu = fits.HDUList([fits.PrimaryHDU(data=hdu[1].data, header=hdu[0].header)]) - SNLogger.debug( f"Process {multiprocessing.current_process().pid} Writing extracted HDU 1 to {out_path}..." ) - newhdu.writeto(out_path, overwrite=True) - SNLogger.debug( f"...process {multiprocessing.current_process().pid} done writing " - f"extracted HDU 1 to {out_path}." ) - - return out_path - - -def sky_subtract( inpath, skysubpath, detmaskpath, temp_dir=pathlib.Path("/tmp"), force=False ): +def sky_subtract( img, temp_dir=None ): """Subtracts background, found with Source Extractor. Parameters ---------- - inpath: Path - Original FITS image - - skysubpath: Path - Sky-subtracted FITS image + img: snappl.image.Image + Original image. - detmaskpath: Path - Detection Mask FITS Image. (Will be uint8, I think.) - - temp_dir: Path + temp_dir: Path, default None Already-existing directory where we can write a temporary file. 
- (If the image is .gz compressed, source-extractor can't handle - that, so we have to write a decompressed version.) - - force: bool, default False - If False, and outpath already exists, do nothing. If True, - clobber the existing file and recalculate it. + Defaults to photometry.snappl.temp_dir from the config. Returns ------- - skyrms: float - Median of the skyrms image calculated by source-extractor - - """ - - if ( not force ) and ( skysubpath.is_file() ) and ( detmaskpath.is_file() ): - with fits.open( skysubpath ) as hdul: - skyrms = hdul[0].header['SKYRMS'] - return skyrms + skysubim: snappl.image.FITSImageOnDisk, detmask: snappl.image.FITSImageOnDisk, skyrms: float - # do_skysub = force or ( ( not force ) and ( not skysubpath.is_file() ) and ( detmaskpath.is_file() ) ) + skysubim is the sky-subtracted image. Its location on disk + will be underneath temp_dir. It's the caller's responsibility + to clean this up. The file will have been written, so you + can pass skysubim.path to anything that needs the path of a + single-HDU FITS image. - if inpath.name[-3:] == '.gz': - decompressed_path = temp_dir / inpath.name[:-3] - gz_and_ext( inpath, decompressed_path ) - else: - decompressed_path = inpath + detmask is the detection mask. Its location on disk will be + underneath temp_dir. It's the caller's responsibility to clean + this up. The file will have been written, so you can pass + detmask.path to anything that needs the path of a single-HDU + FITS image. + skyrms is the median of the skyrms image calculated by source-extractor. - SNLogger.debug( "Calling SEx_SkySubtract.SSS..." ) - ( _SKYDIP, _SKYPEAK, _PixA_skysub, - _PixA_sky, PixA_skyrms ) = SEx_SkySubtract.SSS(FITS_obj=decompressed_path, - FITS_skysub=skysubpath, - FITS_detmask=detmaskpath, - FITS_sky=None, FITS_skyrms=None, - ESATUR_KEY='ESATUR', - BACK_SIZE=64, BACK_FILTERSIZE=3, - DETECT_THRESH=1.5, DETECT_MINAREA=5, - DETECT_MAXAREA=0, - VERBOSE_LEVEL=2, MDIR=None) - SNLogger.debug( "...back from SEx_SkySubtract.SSS" ) + """ - return np.median( PixA_skyrms ) + # SEx_SkySubtract.SSS requires FITS files to chew on. At some point + # we should refactor this so that we can pass data to it. However, + # for now, write a snappl.image.FITSImageOnDisk so we have something + # to give to it. + + temp_dir = pathlib.Path( temp_dir if temp_dir is not None else Config.get().value( 'photometry.snappl.temp_dir' ) ) + barf = "".join( random.choices( "0123456789abcdef", k=10 ) ) + tmpimpath = temp_dir / f"{barf}_im.fits" + tmpsubpath = temp_dir / f"{barf}_sub.fits" + tmpdetmaskpath = temp_dir / f"{barf}_detmask.fits" + + origimg = img + try: + if isinstance( origimg, snappl.image.FITSImageOnDisk ): + img = origimg.uncompressed_version( include=['data'] ) + else: + img = snappl.image.FITSImageOnDisk( path=tmpimpath ) + img.data = origimg.data + img.save_data() + + SNLogger.debug( "Calling SEx_SkySubtract.SSS..." 
) + ( _SKYDIP, _SKYPEAK, _PixA_skysub, + _PixA_sky, PixA_skyrms ) = SEx_SkySubtract.SSS(FITS_obj=img.path, + FITS_skysub=tmpsubpath, + FITS_detmask=tmpdetmaskpath, + FITS_sky=None, FITS_skyrms=None, + ESATUR_KEY='ESATUR', + BACK_SIZE=64, BACK_FILTERSIZE=3, + DETECT_THRESH=1.5, DETECT_MINAREA=5, + DETECT_MAXAREA=0, + VERBOSE_LEVEL=2, MDIR=None) + SNLogger.debug( "...back from SEx_SkySubtract.SSS" ) + + subim = snappl.image.FITSImageOnDisk( path=tmpsubpath ) + detmaskim = snappl.image.FITSImageOnDisk( path=tmpdetmaskpath ) + skyrms = np.median( PixA_skyrms ) + return subim, detmaskim, skyrms + + finally: + # Clean up the image temp file if necessary + if img.path != origimg.path: + img.path.unlink( missing_ok=True ) + + +def stampmaker(ra, dec, shape, img, savedir=None, savename=None): + """Make stamps. + TODO : pass an array of ra and dec to make this more efficient; + otherwise, we may be writing and deleting a FITS image over and over + again! -def stampmaker(ra, dec, shape, imgpath, savedir=None, savename=None): - """Make stamps. Parameters ---------- @@ -133,14 +110,15 @@ def stampmaker(ra, dec, shape, imgpath, savedir=None, savename=None): shape: np.array Shape of stamp. must be a numpy array. e.g. np.array([100,100]) - imgpath: Path - Path to image that stamp will be extracted from. + img: snappl.image.Image + Image from whose data the stamp will be extracted. - savedir: Path - Directory stamp will be saved to. + savedir: Path, default None + Directory stamp will be saved to. Defaults to "stamps" underneath + photometry.phrosty.paths.dia_out_dir from the config. - savename: Path - Base name of stamp filepath. + savename: Path, default None + Base name of stamp filepath; defaults to the name from img's path. Returns ------- @@ -156,21 +134,29 @@ def stampmaker(ra, dec, shape, imgpath, savedir=None, savename=None): savedir = pathlib.Path( savedir ) savedir.mkdir( parents=True, exist_ok=True ) - if savename is None: - savename = f'stamp_{os.path.basename(imgpath)}.fits' + savepath = savedir / ( savename if savename is not None else f'stamp_{img.path.stem}.fits' ) - savepath = savedir / savename - - coord = SkyCoord(ra=ra*u.deg, dec=dec*u.deg) - with fits.open(imgpath) as hdu: - hdun = 1 if str(imgpath)[-3:] in ('.fz', '.gz') else 0 - wcs = WCS(hdu[hdun].header) - - x, y = skycoord_to_pixel(coord, wcs) + x, y = img.get_wcs().world_to_pixel( ra, dec ) pxradec = np.array([[x, y]]) - # TODO : if Stamp_Generator.SG can take a Path in FITS_StpLst, remove the str() - Stamp_Generator.SG(FITS_obj=imgpath, EXTINDEX=hdun, COORD=pxradec, COORD_TYPE='IMAGE', - STAMP_IMGSIZE=shape, FILL_VALUE=np.nan, FITS_StpLst=str(savepath)) - - return savepath + # Give Stamp_Generator.SG a FITS image to chew on + origimg = img + try: + if isinstance( origimg, snappl.image.FITSImageOnDisk ): + img = origimg.uncompressed_version( include=['data'] ) + else: + barf = "".join( random.choices( "0123456789abcdef", k=10 ) ) + img = snappl.image.FITSImageOnDisk( path=savedir / f"{barf}.fits" ) + img.data = origimg.data + img.save_data() + + # TODO : if Stamp_Generator.SG can take a Path in FITS_StpLst, remove the str() + Stamp_Generator.SG(FITS_obj=img.path, EXTINDEX=0, COORD=pxradec, COORD_TYPE='IMAGE', + STAMP_IMGSIZE=shape, FILL_VALUE=np.nan, FITS_StpLst=str(savepath)) + + return savepath + + finally: + # Clean up temp file if necessary + if img.path != origimg.path: + img.path.unlink( missing_ok=True ) diff --git a/phrosty/photometry.py b/phrosty/photometry.py deleted file mode 100644 index 335ac9a..0000000 --- 
a/phrosty/photometry.py +++ /dev/null @@ -1,116 +0,0 @@ -__all__ = [ 'ap_phot', 'psfmodel', 'psf_phot' ] - -# IMPORTS Standard: -import numpy as np - -# IMPORTS Astro: -from astropy.table import hstack -from astropy.modeling.fitting import NonFiniteValueError -from photutils.aperture import CircularAperture, aperture_photometry, ApertureStats -# from photutils.background import LocalBackground, MMMBackground, Background2D -from photutils.psf import PSFPhotometry, FittableImageModel # EPSFBuilder, extract_stars, -from photutils.background import LocalBackground, MMMBackground - - -""" -NOTE: This module assumes images have already had background subtracted. -""" - -roman_bands = ['R062', 'Z087', 'Y106', 'J129', 'H158', 'F184', 'K213'] - - -def ap_phot(scienceimage, coords, - ap_r=9, method='subpixel', subpixels=5, - merge_tables=True, **kwargs): - """Does aperture photometry on the input science image and - specified coordinates. - - :param scienceimage: Array containing science image. - :type scienceimage: array-like - :param coords: Table with columns 'x' and 'y' representing pixel - coordinates for the detected sources. Table can contain - more than just 'x' and 'y', and this may be useful if you - use merge_tables=True. - :type coords: astropy.table.Table - :param ap_r: Aperture radius to use for aperture photometry. Defaults to 3. - :type ap_r: int, optional - :param method: _description_. Defaults to 'subpixel'. - :type method: str, optional - :param subpixels: _description_. Defaults to 5. - :type subpixels: int, optional - :param merge_tables: If true, output is merged coords and results from aperture - photometry. Defaults to True. - :type merge_tables: boolean, optional - - Returns: - astropy.table.Table: Table containing results from aperture photometry. - """ - x = np.array(coords['x']) - y = np.array(coords['y']) - photcoords = np.transpose(np.vstack([x, y])) - apertures = CircularAperture(photcoords, r=ap_r) - - ap_results = aperture_photometry(scienceimage, apertures, method=method, subpixels=subpixels, **kwargs) - apstats = ApertureStats(scienceimage, apertures) - ap_results['max'] = apstats.max - - # Needs to be 'xcentroid' and 'ycentroid' for PSF photometry. - # Same with 'flux'. 
- ap_results.rename_column('xcenter', 'xcentroid') - ap_results.rename_column('ycenter', 'ycentroid') - # ap_results.rename_column('aperture_sum', 'flux') - - for col in ap_results.colnames: - ap_results[col] = ap_results[col].value - - if merge_tables: - ap_results = hstack([ap_results, coords]) - # These are duplicates: - ap_results.remove_columns(['x', 'y']) - - return ap_results - - -def psfmodel(psfimg): - psf = FittableImageModel(psfimg) - return psf - - -def psf_phot(scienceimage, noise, psf, init_params, wcs=None, - forced_phot=True, fwhm=3.0, fit_shape=(5, 5), - oversampling=3, maxiters=10): - - # mean, median, stddev = sigma_clipped_stats(scienceimage) - # daofind = DAOStarFinder(fwhm=fwhm, threshold = 5.*(stddev)) - - if 'flux_init' not in init_params.colnames: - raise Exception('Astropy table passed to kwarg init_params must contain column \"flux_init\".') - - if forced_phot: - print('x, y are fixed!') - psf.x_0.fixed = True - psf.y_0.fixed = True - else: - print('x, y are fitting parameters!') - psf.x_0.fixed = False - psf.x_0.fixed = False - - try: - bkgfunc = LocalBackground(15, 25, MMMBackground()) - psfphot = PSFPhotometry(psf, fit_shape, localbkg_estimator=bkgfunc) - psf_results = psfphot(scienceimage, error=noise, init_params=init_params) - - if wcs is not None: - init_radec = wcs.pixel_to_world(psf_results['x_init'], psf_results['y_init']) - radec = wcs.pixel_to_world(psf_results['x_fit'], psf_results['y_fit']) - - psf_results['ra_init'] = init_radec.ra.value - psf_results['dec_init'] = init_radec.dec.value - - psf_results['ra'] = radec.ra.value - psf_results['dec'] = radec.dec.value - - return psf_results - - except NonFiniteValueError: - print('fit_shape overlaps with edge of image, and therefore encloses NaNs! Photometry cancelled.') diff --git a/phrosty/pipeline.py b/phrosty/pipeline.py index cd745fd..5b790cf 100644 --- a/phrosty/pipeline.py +++ b/phrosty/pipeline.py @@ -6,7 +6,6 @@ import cupy as cp from functools import partial import logging -import matplotlib.pyplot as plt from multiprocessing import Pool import numpy as np import nvtx @@ -19,46 +18,37 @@ # Imports ASTRO from astropy.coordinates import SkyCoord from astropy.io import fits -import astropy.table from astropy.table import Table import astropy.units as u -from astropy.wcs import WCS from astropy.wcs.utils import skycoord_to_pixel -from galsim import roman # Imports INTERNAL from phrosty.imagesubtraction import sky_subtract, stampmaker -from phrosty.photometry import ap_phot, psfmodel, psf_phot -from phrosty.utils import get_exptime, read_truth_txt from sfft.SpaceSFFTCupyFlow import SpaceSFFT_CupyFlow -from snappl.image import OpenUniverse2024FITSImage +from snappl.diaobject import DiaObject +from snappl.imagecollection import ImageCollection +from snappl.image import FITSImageOnDisk from snappl.psf import PSF from snpit_utils.config import Config from snpit_utils.logger import SNLogger class PipelineImage: - """Holds a snappl.image.Image, with some other stuff the pipeline needs.""" - def __init__( self, imagepath, pointing, sca, pipeline ): + def __init__( self, image, pipeline ): """Create a PipelineImage Parameters: ----------- - imagepath : str or Path - A path to the image. This will be passed on to an Image - subclass constructor; which subclass depends on the config - option photometry.phrosty.image_type - - pointing : str or int - An identifier of the pointing of this image. Used e.g. to pull PSFs. + image : snappl.image.Image + The image we're encapsulating. Pass either this or imagepath. 
pipeline : phrosty.pipeline.Pipeline The pipeline that owns this image. """ - # self.psf is a object of a subclass of snappl.psf.PSF + self.config = Config.get() self.temp_dir = pipeline.temp_dir self.keep_intermediate = self.config.value( 'photometry.phrosty.keep_intermediate' ) @@ -67,22 +57,18 @@ def __init__( self, imagepath, pointing, sca, pipeline ): elif not self.keep_intermediate: self.save_dir = self.temp_dir - if self.config.value( 'photometry.phrosty.image_type' ) == 'ou2024fits': - self.image = OpenUniverse2024FITSImage( imagepath, None, sca ) - else: - raise RuntimeError( "At the moment, phrosty only works with ou2024fits images. " - "We hope this will change soon." ) - - self.pointing = pointing - self.band = pipeline.band + self.image = image + if self.image.band != pipeline.band: + raise ValueError( f"Image {self.image.path.name} has a band {self.image.band}, " + f"which is different from the pipeline band {pipeline.band}" ) # Intermediate files if self.keep_intermediate: # Set to None. The path gets defined later on. # They have to be defined here in __init__ so that they exist # and are accessible in later functions. - self.skysub_path = None - self.detmask_path = None + self.skysub_img = None + self.detmask_img = None self.input_sci_psf_path = None self.input_templ_psf_path = None self.aligned_templ_img_path = None @@ -108,41 +94,16 @@ def __init__( self, imagepath, pointing, sca, pipeline ): self.psf_data = None def run_sky_subtract( self, mp=True ): - # Eventually, we may not want to save the sky subtracted image, but keep - # it in memory. (Reduce I/O.) Will require SFFT changes. - # (This may not be practical, as it will increase memory usage a *lot*. - # We may still need to write files.) try: - imname = self.image.name - # HACK ALERT : we're stripping the .gz off of the end of filenames - # if they have them, and making sure filenames end in .fits, - # because that's what SFFT needs. This can go away if we - # refactor to pass data. - if imname[-3:] == '.gz': - imname = imname[:-3] - if imname[-5:] != '.fits': - imname = f'{imname}.fits' - # Hey Rob--the below is broken. The pipeline runs with mp if I use the - # example science image file from examples/perlmutter, but not if I use - # my own file. If I run with one process, my own file runs. 
- # if mp: - # SNLogger.multiprocessing_replace() - SNLogger.debug( f"run_sky_subtract on {imname}" ) - - self.skysub_path = self.save_dir / f"skysub_{imname}" - self.detmask_path = self.save_dir / f"detmask_{imname}" - self.skyrms = sky_subtract( self.image.path, self.skysub_path, self.detmask_path, temp_dir=self.save_dir, - force=self.config.value( 'photometry.phrosty.force_sky_subtract' ) ) - SNLogger.debug( f"...done running sky subtraction on {self.image.name}" ) - return ( self.skysub_path, self.detmask_path, self.skyrms ) + return sky_subtract( self.image ) except Exception as ex: SNLogger.exception( ex ) raise def save_sky_subtract_info( self, info ): SNLogger.debug( f"Saving sky_subtract info for path {info[0]}" ) - self.skysub_path = info[0] - self.detmask_path = info[1] + self.skysub_img = info[0] + self.detmask_img = info[1] self.skyrms = info[2] def get_psf( self, ra, dec ): @@ -165,8 +126,10 @@ def get_psf( self, ra, dec ): if self.psfobj is None: psftype = self.config.value( 'photometry.phrosty.psf.type' ) - self.psfobj = PSF.get_psf_object( psftype, band=self.band, pointing=self.pointing, sca=self.image.sca, - x=x, y=y ) + self.psfobj = PSF.get_psf_object( psftype, x=x, y=y, + band=self.image.band, + pointing=self.image.pointing, + sca=self.image.sca ) stamp = self.psfobj.get_stamp( x, y ) if self.keep_intermediate: @@ -181,31 +144,51 @@ def keep_psf_data( self, psf_data ): def free( self ): """Try to free memory. More might be done here.""" self.image.free() + self.skysub_img.free() + self.detmask_img.free() + self.psf_data = None class Pipeline: """Phrosty's top-level pipeline""" - def __init__( self, object_id, ra, dec, band, science_images, template_images, nprocs=1, nwrite=5, + def __init__( self, diaobj, imgcol, band, + science_images=None, + template_images=None, + science_csv=None, + template_csv=None, + nprocs=1, nwrite=5, verbose=False ): """Create the a pipeline object. Parameters ---------- - object_id: int - - ra, dec: float - Position of transient in decimal degrees + diaobj : DiaObject + The object we're building a lightcurve for band: str One of R062, Z087, Y106, J129, H158, F184, K213 - science_images: list of tuple - ( path_to_image, pointing, sca ) + science_images: list of snappl.image.Image + The science images. + + template_images: list of snappl.image.Image + The template images. + + science_csv: Path or str + CSV file with the science images. The first line must be:: + + path pointing sca mjd band - template_images: list of tuple - ( path_to_image, pointing, sca ) + subsequent lines must have that information for all the + science images. path must be relative to ou24.images in + config. Pipeline will extract the images from this file + whose band matches the band of the pipeline (and ignore + the rest) + + template_csv: Path or str + CSV file with template images. Same format as science_csv. nprocs: int, default 1 Number of cpus for the CPU multiprocessing segments of the pipeline. @@ -214,13 +197,14 @@ def __init__( self, object_id, ra, dec, band, science_images, template_images, n nwrite: int, default 5 Number of asynchronous FITS writer processes. 
- """ SNLogger.setLevel( logging.DEBUG if verbose else logging.INFO ) self.config = Config.get() - self.tds_base_dir = pathlib.Path( self.config.value( 'ou24.tds_base' ) ) - self.image_base_dir = self.tds_base_dir / 'images' + self.imgcol = imgcol + self.diaobj = diaobj + self.band = band + self.dia_out_dir = pathlib.Path( self.config.value( 'photometry.phrosty.paths.dia_out_dir' ) ) self.scratch_dir = pathlib.Path( self.config.value( 'photometry.phrosty.paths.scratch_dir' ) ) self.temp_dir_parent = pathlib.Path( self.config.value( 'photometry.phrosty.paths.temp_dir' ) ) @@ -228,14 +212,19 @@ def __init__( self, object_id, ra, dec, band, science_images, template_images, n self.temp_dir.mkdir() self.ltcv_dir = pathlib.Path( self.config.value( 'photometry.phrosty.paths.ltcv_dir' ) ) - self.object_id = object_id - self.ra = ra - self.dec = dec - self.band = band - self.science_images = ( [ PipelineImage( self.image_base_dir / ppsmb[0], ppsmb[1], ppsmb[2], self ) - for ppsmb in science_images if ppsmb[4] == self.band ] ) - self.template_images = ( [ PipelineImage( self.image_base_dir / ppsmb[0], ppsmb[1], ppsmb[2], self ) - for ppsmb in template_images if ppsmb[4] == self.band ] ) + if ( science_images is None) == ( science_csv is None ): + raise ValueError( "Pass exactly one of science_images or science_csv" ) + if science_csv is not None: + science_images = self._read_csv( science_csv ) + + if ( template_images is None ) == ( template_csv is None ): + raise ValueError( "Pass exactly one of template_images or template_csv" ) + if template_csv is not None: + template_images = self._read_csv( template_csv ) + + self.science_images = [ PipelineImage( i, self ) for i in science_images ] + self.template_images = [ PipelineImage( i, self ) for i in template_images ] + self.nprocs = nprocs self.nwrite = nwrite @@ -244,19 +233,35 @@ def __init__( self, object_id, ra, dec, band, science_images, template_images, n self.mem_trace = self.config.value( 'photometry.phrosty.mem_trace' ) + def _read_csv( self, csvfile ): + imlist = [] + with open( csvfile ) as ifp: + hdrline = ifp.readline() + if not re.search( r"^\s*path\s+pointing\s+sca\s+mjd\s+band\s*$", hdrline ): + raise ValueError( f"First line of list file {csvfile} didn't match what was expected." ) + for line in ifp: + path, pointing, sca, mjd, band = line.split() + if band == self.band: + # This should yell at us if the pointing or sca doesn't match what is read from the path + imlist.append( self.imgcol.get_image( path=path, pointing=pointing, sca=sca, band=band ) ) + return imlist + + def sky_sub_all_images( self ): # Currently, this writes out a bunch of FITS files. Further refactoring needed # to support more general image types. all_imgs = self.science_images.copy() # shallow copy all_imgs.extend( self.template_images ) + self._omg = False + def log_error( img, x ): SNLogger.error( f"Sky subtraction subprocess failure: {x} for image {img.image.path}" ) + self._omg = True if self.nprocs > 1: with Pool( self.nprocs ) as pool: for img in all_imgs: - pool.apply_async( img.run_sky_subtract, (), {}, callback=img.save_sky_subtract_info, error_callback=partial(log_error, img) ) @@ -266,23 +271,34 @@ def log_error( img, x ): for img in all_imgs: img.save_sky_subtract_info( img.run_sky_subtract( mp=False ) ) + if self._omg: + raise RuntimeError( "Sky subtraction errors." 
) + def get_psfs( self ): all_imgs = self.science_images.copy() # shallow copy all_imgs.extend( self.template_images ) + self._omg = False + + def log_error( x ): + SNLogger.error( f"get_psf subprocess failure: {x}" ) + self._omg = True # noqa: F841 + if self.nprocs > 1: with Pool( self.nprocs ) as pool: for img in all_imgs: # callback_partial = partial( img.save_psf_path, all_imgs ) - pool.apply_async( img.get_psf, (self.ra, self.dec), {}, - img.keep_psf_data, - lambda x: SNLogger.error( f"get_psf subprocess failure: {x}" ) ) + pool.apply_async( img.get_psf, (self.diaobj.ra, self.diaobj.dec), {}, + img.keep_psf_data, log_error ) pool.close() pool.join() else: for img in all_imgs: - img.keep_psf_data( img.get_psf(self.ra, self.dec) ) + img.keep_psf_data( img.get_psf(self.diaobj.ra, self.diaobj.dec) ) + + if self._omg: + raise RuntimeError( "get_psf errors." ) def align_and_pre_convolve(self, templ_image, sci_image ): @@ -290,35 +306,47 @@ def align_and_pre_convolve(self, templ_image, sci_image ): Parameters ---------- - sci_image: phrosty.Image + sci_image: phrosty.PipelineImage The science (new) image. - templ_image: phrosty.Image + templ_image: phrosty.PipelineImage The template (ref) image that will be subtracted from sci_image. + Returns + ------- + sfftifier: SpaceSFFT_CupyFlow + Use this object for further SFFT work. Be sure to + dereference it to free the prodigious amount of memory it + allocates. + """ - with fits.open( sci_image.skysub_path ) as hdul: - hdr_sci = hdul[0].header - data_sci = cp.array( np.ascontiguousarray(hdul[0].data.T), dtype=cp.float64 ) - with fits.open( templ_image.skysub_path ) as hdul: - hdr_templ = hdul[0].header - data_templ = cp.array( np.ascontiguousarray(hdul[0].data.T), dtype=cp.float64 ) + # SFFT needs FITS headers with a WCS and with NAXIS[12] + hdr_sci = sci_image.image.get_wcs().get_astropy_wcs().to_header( relax=True ) + hdr_sci.insert( 0, ('NAXIS', 2) ) + hdr_sci.insert( 'NAXIS', ('NAXIS1', sci_image.image.data.shape[1] ), after=True ) + hdr_sci.insert( 'NAXIS1', ('NAXIS2', sci_image.image.data.shape[0] ), after=True ) + data_sci = cp.array( np.ascontiguousarray(sci_image.image.data.T), dtype=cp.float64 ) + noise_sci = cp.array( np.ascontiguousarray(sci_image.image.noise.T), dtype=cp.float64 ) + + hdr_templ = templ_image.image.get_wcs().get_astropy_wcs().to_header( relax=True ) + hdr_templ.insert( 0, ('NAXIS', 2) ) + hdr_templ.insert( 'NAXIS', ('NAXIS1', templ_image.image.data.shape[1] ), after=True ) + hdr_templ.insert( 'NAXIS1', ('NAXIS2', templ_image.image.data.shape[0] ), after=True ) + data_templ = cp.array( np.ascontiguousarray(templ_image.image.data.T), dtype=cp.float64 ) + noise_templ = cp.array( np.ascontiguousarray(templ_image.image.noise.T), dtype=cp.float64 ) sci_psf = cp.ascontiguousarray( cp.array( sci_image.psf_data.T, dtype=cp.float64 ) ) templ_psf = cp.ascontiguousarray( cp.array( templ_image.psf_data.T, dtype=cp.float64 ) ) - with fits.open( sci_image.detmask_path ) as hdul: - sci_detmask = cp.array( np.ascontiguousarray( hdul[0].data.T ) ) - - with fits.open( templ_image.detmask_path ) as hdul: - templ_detmask = cp.array( np.ascontiguousarray( hdul[0].data.T ) ) + sci_detmask = cp.array( np.ascontiguousarray( sci_image.detmask_img.data.T ) ) + templ_detmask = cp.array( np.ascontiguousarray( templ_image.detmask_img.data.T ) ) sfftifier = SpaceSFFT_CupyFlow( hdr_sci, hdr_templ, sci_image.skyrms, templ_image.skyrms, data_sci, data_templ, - cp.array( sci_image.image.noise ), cp.array( templ_image.image.noise ), + noise_sci,
noise_templ, sci_detmask, templ_detmask, sci_psf, templ_psf, KerPolyOrder=Config.get().value('photometry.phrosty.kerpolyorder') @@ -329,13 +357,43 @@ def align_and_pre_convolve(self, templ_image, sci_image ): return sfftifier - def phot_at_coords( self, img, err, psf, pxcoords=(50, 50), ap_r=4 ): - """Do photometry at forced set of pixel coordinates.""" + def phot_at_coords( self, img, psf, pxcoords=(50, 50), ap_r=4 ): + """Do photometry at forced set of pixel coordinates. + + Parameters + ---------- + img: snappl.image.Image + The image on which to do the photometry + + psf: snappl.psf.PSF + The PSF. + + pxcoords: tuple of (int, int) + The position on the image to do the photometry + + ap_r: float + Radius of aperture. + + Returns + ------- + results: dict + Keys and values are: + * 'aperture_sum': flux in aperture of radius ap_r + * 'flux_fit': flux from PSF photometry + * 'flux_fit_err': uncertainty on flux_fit + * 'mag_fit': instrumental magnitude (i.e. no zeropoint) from flux_fit + * 'mag_fit_err': uncertainty on mag_fit + + All values are floats. + + """ forcecoords = Table([[float(pxcoords[0])], [float(pxcoords[1])]], names=["x", "y"]) - init = ap_phot(img, forcecoords, ap_r=ap_r) - init['flux_init'] = init['aperture_sum'] - final = psf_phot(img, err, psf, init, forced_phot=True) + init = img.ap_phot( forcecoords, ap_r=ap_r ) + init.rename_column( 'aperture_sum', 'flux_init' ) + init.rename_column( 'xcenter', 'xcentroid' ) + init.rename_column( 'ycenter', 'ycentroid' ) + final = img.psf_phot( init, psf, forced_phot=True ) flux = final['flux_fit'][0] flux_err = final['flux_err'][0] @@ -343,7 +401,7 @@ def phot_at_coords( self, img, err, psf, pxcoords=(50, 50), ap_r=4 ): mag_err = (2.5 / np.log(10)) * np.abs(final["flux_err"][0] / final["flux_fit"][0]) results_dict = { - 'aperture_sum': init['aperture_sum'][0], + 'aperture_sum': init['flux_init'][0], 'flux_fit': flux, 'flux_fit_err': flux_err, 'mag_fit': mag, @@ -352,169 +410,59 @@ def phot_at_coords( self, img, err, psf, pxcoords=(50, 50), ap_r=4 ): return results_dict - def get_stars(self, truthpath, nx=4088, ny=4088, transform=False, wcs=None): - """Get the stars in the science images. + def make_phot_info_dict( self, sci_image, templ_image, ap_r=4 ): + """Do forced photometry on the difference stamp for one science/template image pair. + + Parameters + ---------- + sci_image: PipelineImage + The science image wrapper. + + templ_image: PipelineImage + The template image wrapper. + + ap_r: float, default 4 + Radius of the aperture used for the forced aperture photometry on the difference stamp. + + Returns + ------- + results_dict: dict + Photometry results and metadata for this science/template pair (fluxes, magnitudes, zeropoint, pointing and SCA information, and a 'success' flag), ready to be merged into the lightcurve table. - Optional to transform to another WCS. """ - truth_tab = read_truth_txt(path=truthpath) - truth_tab['mag'].name = 'mag_truth' - truth_tab['flux'].name = 'flux_truth' - - if transform: - assert wcs is not None, 'You need to provide a WCS to transform to!'
- truth_tab['x'].name, truth_tab['y'].name = 'x_orig', 'y_orig' - worldcoords = SkyCoord(ra=truth_tab['ra'] * u.deg, dec=truth_tab['dec'] * u.deg) - x, y = skycoord_to_pixel(worldcoords, wcs) - truth_tab['x'] = x - truth_tab['y'] = y - - if not transform: - truth_tab['x'] -= 1 - truth_tab['y'] -= 1 - - idx = np.where(truth_tab['obj_type'] == 'star')[0] - stars = truth_tab[idx] - stars = stars[np.logical_and(stars["x"] < nx, stars["x"] > 0)] - stars = stars[np.logical_and(stars["y"] < ny, stars["y"] > 0)] - - return stars - - def get_galsim_values(self): - exptime = get_exptime(self.band) - area_eff = roman.collecting_area - gs_zpt = roman.getBandpasses()[self.band].zeropoint - - return {'exptime': exptime, 'area_eff': area_eff, 'gs_zpt': gs_zpt} - - def get_zpt(self, zptimg, err, psf, band, stars, ap_r=4, - zpt_plot=None, oid=None, sci_pointing=None, sci_sca=None): - - # TODO : Need to move this code all over into snappl Image. It sounds like - # for Roman images we may have to do our own zeropoints (which is what - # is happening here), but for actual Roman we're going to use - # calibration information we get from elsewhere, so we don't want to bake - # doing the calibration into the pipeline as we do here. - # - # Also Issue #70 - - """Get the zeropoint based on the stars.""" - - # First, need to do photometry on the stars. - init_params = ap_phot(zptimg, stars, ap_r=ap_r) - init_params['flux_init'] = init_params['aperture_sum'] - final_params = psf_phot(zptimg, err, psf, init_params, forced_phot=True) - - # Do not need to cross match. Can just merge tables because they - # will be in the same order. - photres = astropy.table.join(stars, init_params, keys=['object_id', 'ra', 'dec', 'realized_flux', - 'flux_truth', 'mag_truth', 'obj_type']) - photres = astropy.table.join(photres, final_params, keys=['id']) - - # Get the zero point. - galsim_vals = self.get_galsim_values() - star_ap_mags = -2.5 * np.log10(photres['aperture_sum']) - star_fit_mags = -2.5 * np.log10(photres['flux_fit']) - star_truth_mags = ( -2.5 * np.log10(photres['flux_truth']) + galsim_vals['gs_zpt'] - + 2.5 * np.log10(galsim_vals['exptime'] * galsim_vals['area_eff']) ) - - # Eventually, this should be a S/N cut, not a mag cut. - zpt_mask = np.logical_and(star_truth_mags > 19, star_truth_mags < 21.5) - zpt = np.nanmedian(star_truth_mags[zpt_mask] - star_fit_mags[zpt_mask]) - ap_zpt = np.nanmedian(star_truth_mags[zpt_mask] - star_ap_mags[zpt_mask]) - - if zpt_plot is not None: - assert oid is not None, 'If zpt_plot=True, oid must be provided.' - assert sci_pointing is not None, 'If zpt_plot=True, sci_pointing must be provided.' - assert sci_sca is not None, 'If zpt_plot=True, sci_sca must be provided.' 
- - savedir = self.ltcv_dir / f'figs/{oid}/zpt_plots' - savedir.mkdir(parents=True, exist_ok=True) - savepath = savedir / f'zpt_stars_{band}_{sci_pointing}_{sci_sca}.png' - - plt.figure(figsize=(8, 8)) - yaxis = star_fit_mags + zpt - star_truth_mags - - plt.plot(star_truth_mags, yaxis, marker='o', linestyle='') - plt.axhline(0, linestyle='--', color='k') - plt.xlabel('Truth mag') - plt.ylabel('Fit mag - zpt + truth mag') - plt.title(f'{band} {sci_pointing} {sci_sca}') - plt.savefig(savepath, dpi=300, bbox_inches='tight') - plt.close() - - SNLogger.info(f'zpt debug plot saved to {savepath}') - - # savepath = os.path.join(savedir, f'hist_truth-fit_{band}_{sci_pointing}_{sci_sca}.png') - # plt.hist(star_truth_mags[zpt_mask] - star_fit_mags[zpt_mask]) - # plt.title(f'{band} {sci_pointing} {sci_sca}') - # plt.xlabel('star_truth_mags[zpt_mask] - star_fit_mags[zpt_mask]') - # plt.savefig(savepath, dpi=300, bbox_inches='tight') - # plt.close() - - return zpt, ap_zpt - def make_phot_info_dict( self, sci_image, templ_image, ap_r=4 ): - # Do photometry on stamp because it will read faster - diff_img_stamp_path = sci_image.diff_stamp_path[ templ_image.image.name ] - diff_img_var_stamp_path = sci_image.diff_var_stamp_path[ templ_image.image.name ] + # Do photometry on stamp because it will read faster. + # (We hope. But CFS latency will kill you at 1 byte.) + SNLogger.debug( "...make_phot_info_dict reading stamp and psf" ) + diff_img = FITSImageOnDisk( sci_image.diff_stamp_path[ templ_image.image.name ], + noisepath=sci_image.diff_var_stamp_path[ templ_image.image.name ] ) + psf_img = FITSImageOnDisk( sci_image.decorr_psf_path[ templ_image.image.name ] ) results_dict = {} results_dict['sci_name'] = sci_image.image.name results_dict['templ_name'] = templ_image.image.name results_dict['success'] = False - results_dict['ra'] = self.ra - results_dict['dec'] = self.dec + results_dict['ra'] = self.diaobj.ra + results_dict['dec'] = self.diaobj.dec results_dict['mjd'] = sci_image.image.mjd results_dict['filter'] = self.band - results_dict['pointing'] = sci_image.pointing + results_dict['pointing'] = sci_image.image.pointing results_dict['sca'] = sci_image.image.sca - results_dict['template_pointing'] = templ_image.pointing + results_dict['template_pointing'] = templ_image.image.pointing results_dict['template_sca'] = templ_image.image.sca - if diff_img_stamp_path.is_file(): - # Load in the difference image stamp. - with fits.open(diff_img_stamp_path) as diff_hdu: - diffimg = diff_hdu[0].data - wcs = WCS(diff_hdu[0].header) - - with fits.open( diff_img_var_stamp_path ) as var_hdu: - err = np.sqrt( var_hdu[0].data ) - - # Load in the decorrelated PSF. - psfpath = sci_image.decorr_psf_path[ templ_image.image.name ] - with fits.open( psfpath ) as hdu: - psf = psfmodel( hdu[0].data ) - coord = SkyCoord(ra=self.ra * u.deg, dec=self.dec * u.deg) - pxcoords = skycoord_to_pixel(coord, wcs) - results_dict.update( self.phot_at_coords(diffimg, err, psf, pxcoords=pxcoords, ap_r=ap_r) ) - - # Get the zero point from the decorrelated, convolved science image. - # First, get the table of known stars. - - # TODO -- take this galsim-specific code out, move it to a separate module. 
Define a general - # zeropointing interface, of which the galsim-speicifc one will be one instance - truthpath = str( self.tds_base_dir / - f'truth/{self.band}/{sci_image.pointing}/' - f'Roman_TDS_index_{self.band}_{sci_image.pointing}_{sci_image.image.sca}.txt' ) - stars = self.get_stars(truthpath) - # Now, calculate the zero point based on those stars. - zptimg_path = sci_image.decorr_zptimg_path[ templ_image.image.name ] - with fits.open(zptimg_path) as hdu: - zptimg = hdu[0].data - - zpt, ap_zpt = self.get_zpt(zptimg, sci_image.image.noise, psf, self.band, stars, oid=self.object_id, - sci_pointing=sci_image.pointing, sci_sca=sci_image.image.sca) - - # Add additional info to the results dictionary so it can be merged into a nice file later. - results_dict['zpt'] = zpt - results_dict['ap_zpt'] = ap_zpt - results_dict['success'] = True - - else: + try: + # Make sure the files are there. Has side effect of loading the header and data. + diff_img.get_data( which='data', cache=True ) + diff_img.get_data( which='noise', cache=True ) + # The thing written to disk was actually variance, so fix that + diff_img.noise = np.sqrt( diff_img.noise ) + psf_img.get_data( which='data', cache=True ) + except Exception: + # TODO : change this to a more specific exception SNLogger.warning( f"Post-processed image files for " - f"{self.band}_{sci_image.pointing}_{sci_image.image.sca}-" - f"{self.band}_{templ_image.pointing}_{templ_image.image.sca} " + f"{self.band}_{sci_image.image.pointing}_{sci_image.image.sca}-" + f"{self.band}_{templ_image.image.pointing}_{templ_image.image.sca} " f"do not exist. Skipping." ) results_dict['zpt'] = np.nan results_dict['ap_zpt'] = np.nan @@ -524,11 +472,28 @@ def make_phot_info_dict( self, sci_image, templ_image, ap_r=4 ): results_dict['mag_fit'] = np.nan results_dict['mag_fit_err'] = np.nan + SNLogger.debug( "...make_phot_info_dict getting psf" ) + coord = SkyCoord(ra=self.diaobj.ra * u.deg, dec=self.diaobj.dec * u.deg) + pxcoords = skycoord_to_pixel( coord, diff_img.get_wcs().get_astropy_wcs() ) + psf = PSF.get_psf_object( 'OversampledImagePSF', + x=diff_img.data.shape[1]/2., y=diff_img.data.shape[1]/2., + oversample_factor=1., + data=psf_img.data ) + SNLogger.debug( "...make_phot_info_dict doing photometry" ) + results_dict.update( self.phot_at_coords( diff_img, psf, pxcoords=pxcoords, ap_r=ap_r) ) + + # Add additional info to the results dictionary so it can be merged into a nice file later. + SNLogger.debug( "...make_phot_info_dict getting zeropoint" ) + results_dict['zpt'] = sci_image.image.zeropoint + results_dict['success'] = True + + SNLogger.debug( "...make_phot_info_dict done." 
) return results_dict def add_to_results_dict( self, one_pair ): for key, arr in self.results_dict.items(): arr.append( one_pair[ key ] ) + SNLogger.debug( "Done adding to results dict" ) def save_stamp_paths( self, sci_image, templ_image, paths ): sci_image.zpt_stamp_path[ templ_image.image.name ] = paths[0] @@ -537,23 +502,26 @@ def save_stamp_paths( self, sci_image, templ_image, paths ): def do_stamps( self, sci_image, templ_image ): - zptname = sci_image.decorr_zptimg_path[ templ_image.image.name ] - zpt_stampname = stampmaker( self.ra, self.dec, np.array([100, 100]), - zptname, + zptim = FITSImageOnDisk( sci_image.decorr_zptimg_path[ templ_image.image.name ] ) + zpt_stampname = stampmaker( self.diaobj.ra, self.diaobj.dec, np.array([100, 100]), + zptim, savedir=self.dia_out_dir, - savename=f"stamp_{zptname.name}" ) + savename=f"stamp_{zptim.path.name}" ) + zptim.free() - diffname = sci_image.decorr_diff_path[ templ_image.image.name ] - diff_stampname = stampmaker( self.ra, self.dec, np.array([100, 100]), - diffname, + diffim = FITSImageOnDisk( sci_image.decorr_diff_path[ templ_image.image.name ] ) + diff_stampname = stampmaker( self.diaobj.ra, self.diaobj.dec, np.array([100, 100]), + diffim, savedir=self.dia_out_dir, - savename=f"stamp_{diffname.name}" ) + savename=f"stamp_{diffim.path.name}" ) + diffim.free() - diffvarname = sci_image.diff_var_path[ templ_image.image.name ] - diffvar_stampname = stampmaker( self.ra, self.dec, np.array([100, 100]), - diffvarname, + diffvarim = FITSImageOnDisk( sci_image.diff_var_path[ templ_image.image.name ] ) + diffvar_stampname = stampmaker( self.diaobj.ra, self.diaobj.dec, np.array([100, 100]), + diffvarim, savedir=self.dia_out_dir, - savename=f"stamp_{diffvarname.name}" ) + savename=f"stamp_{diffvarim.path.name}" ) + diffvarim.free() SNLogger.info(f"Decorrelated diff stamp path: {pathlib.Path( diff_stampname )}") SNLogger.info(f"Zpt image stamp path: {pathlib.Path( zpt_stampname )}") @@ -574,7 +542,6 @@ def make_lightcurve( self ): 'template_pointing': [], 'template_sca': [], 'zpt': [], - 'ap_zpt': [], 'aperture_sum': [], 'flux_fit': [], 'flux_fit_err': [], @@ -582,15 +549,19 @@ def make_lightcurve( self ): 'mag_fit_err': [], } + self._omg = False + + def log_error( x ): + SNLogger.error( f"make_phot_info_dict subprocess failure: {x}" ) + self._omg = True + if self.nprocs > 1: with Pool( self.nprocs ) as pool: for sci_image in self.science_images: for templ_image in self.template_images: pool.apply_async( self.make_phot_info_dict, (sci_image, templ_image), {}, self.add_to_results_dict, - error_callback=lambda x: SNLogger.error( f"make_phot_info_dict " - f"subprocess failure: {x}" ) - ) + error_callback=log_error ) pool.close() pool.join() else: @@ -599,16 +570,26 @@ def make_lightcurve( self ): for templ_image in self.template_images: self.add_to_results_dict( self.make_phot_info_dict( sci_image, templ_image ) ) + if self._omg: + raise RuntimeError( "make_phot_info_dict failed" ) + + SNLogger.debug( "Saving results..." 
) results_tab = Table(self.results_dict) results_tab.sort('mjd') - results_savedir = self.ltcv_dir / 'data' / str(self.object_id) + results_savedir = self.ltcv_dir / 'data' / str(self.diaobj.id) results_savedir.mkdir( exist_ok=True, parents=True ) - results_savepath = results_savedir / f'{self.object_id}_{self.band}_all.csv' + results_savepath = results_savedir / f'{self.diaobj.id}_{self.band}_all.csv' results_tab.write(results_savepath, format='csv', overwrite=True) SNLogger.info(f'Results saved to {results_savepath}') + return results_savepath + def write_fits_file( self, data, header, savepath ): - fits.writeto( savepath, data, header=header, overwrite=True ) + try: + fits.writeto( savepath, data, header=header, overwrite=True ) + except Exception as e: + SNLogger.exception( f"Exception writing FITS image {savepath}: {e}" ) + raise def clear_contents( self, directory ): for f in directory.iterdir(): @@ -622,6 +603,30 @@ def clear_contents( self, directory ): print( f'Oops! Deleting {f} from {directory} did not work.\nReason: {e}' ) def __call__( self, through_step=None ): + """Run the pipeline. + + Parameters + ---------- + through_step: str, default None + Which step to run through? Runs them all if not given. + + Steps in order are: + * sky_subtract + * get_psfs + * align_and_preconvolve + * subtract + * find_decorrelation + * apply_decorrelation + * make_stamps + * make_lightcurve + + Returns + ------- + ltcvpath : pathlib.Path or None + The path to the output lightcurve file if make_lightcurve + was run, otherwise None. + + """ if self.mem_trace: tracemalloc.start() tracemalloc.reset_peak() @@ -684,13 +689,11 @@ def log_fits_write_error( savepath, x ): SNLogger.info( "...generate variance image" ) with nvtx.annotate( "variance", color=0x44ccff ): diff_var = sfftifier.create_variance_image() - mess = ( f"{self.band}_{sci_image.pointing}_{sci_image.image.sca}_-" - f"_{self.band}_{templ_image.pointing}_{templ_image.image.sca}.fits" ) + mess = f"{sci_image.image.name}-{templ_image.image.name}" diff_var_path = self.dia_out_dir / f"diff_var_{mess}" if 'apply_decorrelation' in steps: - mess = ( f"{self.band}_{sci_image.pointing}_{sci_image.image.sca}_-" - f"_{self.band}_{templ_image.pointing}_{templ_image.image.sca}.fits" ) + mess = f"{sci_image.image.name}-{templ_image.image.name}" decorr_psf_path = self.dia_out_dir / f"decorr_psf_{mess}" decorr_zptimg_path = self.dia_out_dir / f"decorr_zptimg_{mess}" decorr_diff_path = self.dia_out_dir / f"decorr_diff_{mess}" @@ -724,42 +727,39 @@ def log_fits_write_error( savepath, x ): # In the future, we may want to write these things right after they happen # instead of saving it all for the end of the SFFT stuff.
- sci_filepathpart = f'{sci_image.band}_{sci_image.pointing}_{sci_image.image.sca}' - templ_filepathpart = f'{templ_image.band}_{templ_image.pointing}_{templ_image.image.sca}' - write_filepaths = {'aligned': [['img', - f'{templ_filepathpart}_-_{sci_filepathpart}.fits', + f'{templ_image.image.name}_-_{sci_image.image.name}.fits', cp.asnumpy(sfftifier.PixA_resamp_object_GPU.T), sfftifier.hdr_target], ['var', - f'{templ_filepathpart}_-_{sci_filepathpart}.fits', + f'{templ_image.image.name}_-_{sci_image.image.name}.fits', cp.asnumpy(sfftifier.PixA_resamp_objectVar_GPU.T), sfftifier.hdr_target], ['psf', - f'{templ_filepathpart}_-_{sci_filepathpart}.fits', + f'{templ_image.image.name}_-_{sci_image.image.name}.fits', cp.asnumpy(sfftifier.PSF_resamp_object_GPU.T), sfftifier.hdr_target], ['detmask', - f'{sci_filepathpart}_-_{templ_filepathpart}.fits', + f'{sci_image.image.name}_-_{templ_image.image.name}.fits', cp.asnumpy(sfftifier.PixA_resamp_object_DMASK_GPU.T), sfftifier.hdr_target] ], 'convolved': [['img', - f'{sci_filepathpart}_-_{templ_filepathpart}.fits', + f'{sci_image.image.name}_-_{templ_image.image.name}.fits', cp.asnumpy(sfftifier.PixA_Ctarget_GPU.T), sfftifier.hdr_target], ['img', - f'{templ_filepathpart}_-_{sci_filepathpart}.fits', + f'{templ_image.image.name}_-_{sci_image.image.name}.fits', cp.asnumpy(sfftifier.PixA_Cresamp_object_GPU.T), sfftifier.hdr_target] ], 'diff': [['img', - f'{sci_filepathpart}_-_{templ_filepathpart}.fits', + f'{sci_image.image.name}_-_{templ_image.image.name}.fits', cp.asnumpy(sfftifier.PixA_DIFF_GPU.T), sfftifier.hdr_target] ], 'decorr': [['kernel', - f'{sci_filepathpart}_-_{templ_filepathpart}.fits', + f'{sci_image.image.name}_-_{templ_image.image.name}.fits', cp.asnumpy(sfftifier.FKDECO_GPU.T), sfftifier.hdr_target] ] @@ -790,7 +790,13 @@ def log_fits_write_error( savepath, x ): if 'make_stamps' in steps: SNLogger.info( "Starting to make stamps..." ) with nvtx.annotate( "make stamps", color=0xff8888 ): - partialstamp = partial(stampmaker, self.ra, self.dec, np.array([100, 100])) + self._omg = False + + def log_stamp_err( x ): + SNLogger.error( f"do_stamps subprocess failure: {x} " ) + self._omg = True + + partialstamp = partial(stampmaker, self.diaobj.ra, self.diaobj.dec, np.array([100, 100])) # template path, savedir, savename templstamp_args = ( (ti.image.path, self.dia_out_dir, f'stamp_{str(ti.image.name)}') for ti in self.template_images ) @@ -808,10 +814,7 @@ def log_fits_write_error( savepath, x ): sci_stamp_pool.apply_async( self.do_stamps, pair, {}, callback = partial(self.save_stamp_paths, sci_image, templ_image), - error_callback=partial(SNLogger.error, - "do_stamps subprocess failure: {x}") - ) - + error_callback=log_stamp_err ) sci_stamp_pool.close() sci_stamp_pool.join() @@ -824,15 +827,18 @@ def log_fits_write_error( savepath, x ): stamp_paths = self.do_stamps( sci_image, templ_image) self.save_stamp_paths( sci_image, templ_image, stamp_paths ) + if self._omg: + raise RuntimeError( "Error making stamps." 
) + SNLogger.info('...finished making stamps.') if self.mem_trace: SNLogger.info( f"After make_stamps, memory usage = {tracemalloc.get_traced_memory()[1]/(1024**2):.2f} MB" ) + lightcurve_path = None if 'make_lightcurve' in steps: - SNLogger.info( "Making lightcurve" ) with nvtx.annotate( "make_lightcurve", color=0xff8888 ): - self.make_lightcurve() + lightcurve_path = self.make_lightcurve() if self.mem_trace: SNLogger.info( f"After make_lightcurve, memory usage = \ @@ -841,7 +847,10 @@ def log_fits_write_error( savepath, x ): if self.remove_temp_dir: self.clear_contents( self.temp_dir ) - # ====================================================================== + return lightcurve_path + + +# ====================================================================== def main(): @@ -859,7 +868,7 @@ def main(): if str(e) == 'No default config defined yet; run Config.init(configfile)': sys.stderr.write( "Error, no configuration file defined.\n" "Either run phrosty with -c \n" - "or set the SNPIT_CONFIG environment varaible.\n" ) + "or set the SNPIT_CONFIG environment variable.\n" ) sys.exit(1) else: raise @@ -867,15 +876,26 @@ def main(): parser = argparse.ArgumentParser() # Put in the config_file argument, even though it will never be found, so it shows up in help parser.add_argument( '-c', '--config-file', help="Location of the .yaml config file" ) - parser.add_argument( '--oid', type=int, required=True, help="Object ID" ) - parser.add_argument( '-r', '--ra', type=float, required=True, help="Object RA" ) - parser.add_argument( '-d', '--dec', type=float, required=True, help="Object Dec" ) + parser.add_argument( '--object-collection', '--oc', required=True, + help='Collection of the object. Currently only "ou2024" and "manual" supported.' ) + parser.add_argument( '--object-subset', '--os', default=None, + help="Collection subset. Not used by all collections." ) + parser.add_argument( '--oid', type=int, required=True, + help="Object ID. Meaning is collection-dependent." ) + parser.add_argument( '-r', '--ra', type=float, default=None, + help="Object RA. By default, uses the one found for the object." ) + parser.add_argument( '-d', '--dec', type=float, default=None, + help="Object Dec. By default, uses the one found for the object." ) parser.add_argument( '-b', '--band', type=str, required=True, help="Band: R062, Z087, Y106, J129, H158, F184, or K213" ) + parser.add_argument( '--image-collection', '--ic', required=True, help="Collection of the images we're using" ) + parser.add_argument( '--image-subset', '--is', default=None, help="Image collection subset" ) + parser.add_argument( '--base-path', type=str, default=None, + help='Base path for images. Required for "manual_fits" image collection' ) parser.add_argument( '-t', '--template-images', type=str, required=True, - help="Path to file with, per line, ( path_to_image, pointing, sca )" ) + help="Path to file with, per line, ( path_to_image, pointing, sca, mjd, band )" ) parser.add_argument( '-s', '--science-images', type=str, required=True, - help="Path to file with, per line, ( path_to_image, pointing, sca )" ) + help="Path to file with, per line, ( path_to_image, pointing, sca, mjd, band )" ) parser.add_argument( '-p', '--nprocs', type=int, default=1, help="Number of process for multiprocessing steps (e.g. 
skysub)" ) parser.add_argument( '-w', '--nwrite', type=int, default=5, help="Number of parallel FITS writing processes" ) @@ -887,19 +907,33 @@ def main(): args = parser.parse_args( leftovers ) cfg.parse_args( args ) - science_images = [] - template_images = [] - for infile, imlist in zip( [ args.science_images, args.template_images ], - [ science_images, template_images ] ): - with open( infile ) as ifp: - hdrline = ifp.readline() - if not re.search( r"^\s*path\s+pointing\s+sca\s+mjd\s+band\s*$", hdrline ): - raise ValueError( f"First line of list file {infile} didn't match what was expected." ) - for line in ifp: - img, point, sca, mjd, band = line.split() - imlist.append( ( pathlib.Path(img), int(point), int(sca), float(mjd), band ) ) - - pipeline = Pipeline( args.oid, args.ra, args.dec, args.band, science_images, template_images, + # Get the DiaObject, update the RA and Dec + + diaobjs = DiaObject.find_objects( collection=args.object_collection, subset=args.object_subset, + id=args.oid, ra=args.ra, dec=args.dec ) + if len( diaobjs ) == 0: + raise ValueError( f"Could not find DiaObject with id={args.id}, ra={args.ra}, dec={args.dec}." ) + if len( diaobjs ) > 1: + raise ValueError( f"Found multiple DiaObject with id={args.id}, ra={args.ra}, dec={args.dec}." ) + diaobj = diaobjs[0] + if args.ra is not None: + if np.fabs( args.ra - diaobj.ra ) > 1. / 3600. / np.cos( diaobj.dec * np.pi / 180. ): + SNLogger.warning( f"Given RA {args.ra} is far from DiaObject nominal RA {diaobj.ra}" ) + diaobj.ra = args.ra + if args.dec is not None: + if np.fabs( args.dec - diaobj.dec ) > 1. / 3600.: + SNLogger.warning( f"Given Dec {args.dec} is far from DiaObject nominal Dec {diaobj.dec}" ) + diaobj.dec = args.dec + + # Get the image collection + + imgcol = ImageCollection.get_collection( collection=args.image_collection, subset=args.image_subset, + base_path=args.base_path ) + + # Create and launch the pipeline + + pipeline = Pipeline( diaobj, imgcol, args.band, + science_csv=args.science_images, template_csv=args.template_images, nprocs=args.nprocs, nwrite=args.nwrite, verbose=args.verbose ) pipeline( args.through_step ) diff --git a/phrosty/tests/20172782_instances_science_2.csv b/phrosty/tests/20172782_instances_science_2.csv index 495ac20..364c820 100644 --- a/phrosty/tests/20172782_instances_science_2.csv +++ b/phrosty/tests/20172782_instances_science_2.csv @@ -1,3 +1,3 @@ path pointing sca mjd band -simple_model/Y106/35198/Roman_TDS_simple_model_Y106_35198_2.fits.gz 35198 2 62455.669 Y106 -simple_model/Y106/39790/Roman_TDS_simple_model_Y106_39790_15.fits.gz 39790 15 62515.527 Y106 +Y106/35198/Roman_TDS_simple_model_Y106_35198_2.fits.gz 35198 2 62455.669 Y106 +Y106/39790/Roman_TDS_simple_model_Y106_39790_15.fits.gz 39790 15 62515.527 Y106 diff --git a/phrosty/tests/20172782_instances_templates_1.csv b/phrosty/tests/20172782_instances_templates_1.csv index e85338e..0d25969 100644 --- a/phrosty/tests/20172782_instances_templates_1.csv +++ b/phrosty/tests/20172782_instances_templates_1.csv @@ -1,2 +1,2 @@ path pointing sca mjd band -simple_model/Y106/5934/Roman_TDS_simple_model_Y106_5934_3.fits.gz 5934 6 62075.648 Y106 +Y106/5934/Roman_TDS_simple_model_Y106_5934_3.fits.gz 5934 3 62075.648 Y106 diff --git a/phrosty/tests/conftest.py b/phrosty/tests/conftest.py index 29d448a..6a21a55 100644 --- a/phrosty/tests/conftest.py +++ b/phrosty/tests/conftest.py @@ -5,46 +5,106 @@ from tox.pytest import init_fixture # noqa: F401 from snpit_utils.config import Config +from snappl.image import FITSImageOnDisk +from 
snappl.diaobject import DiaObject +from snappl.imagecollection import ImageCollection -import phrosty.imagesubtraction -direc = pathlib.Path( __file__ ).parent +_direc = pathlib.Path( __file__ ).parent.resolve() -@pytest.fixture( scope='session' ) -def dia_out_dir(): - return pathlib.Path( Config.get().value( 'photometry.phrosty.paths.dia_out_dir' ) ) +@pytest.fixture( scope='session', autouse=True ) +def config(): + """Set the global config for phrosty tests. + Tests ignore SNPIT_CONFIG env var, but will always use + phrosty/tests/phrosty_test_config. -@pytest.fixture( scope='session' ) -def sims_dir(): - return pathlib.Path( Config.get().value( 'ou24.tds_base' ) ) / 'images' + Output directories are made underneath test_output. This fixture + empties out all those output directories when the tests finish. + """ -# @pytest.fixture( scope='session' ) -# def sn_info_dir(): -# return pathlib.Path( os.getenv( "SN_INFO_DIR", direc / "sn_info_dir" ) ) + # Directories we'll use for test outputs + temp_dir = _direc / 'test_output/temp' + scratch_dir = temp_dir + dia_out_dir = _direc / 'test_output/dia_out_dir' + ltcv_dir = _direc / 'test_output/lc_out_dir' + # Make sure they exist + for path in [ temp_dir, scratch_dir, dia_out_dir, ltcv_dir ]: + path.mkdir( exist_ok=True, parents=True ) + + try: + # We're going to mangle the config to move the output + # directories underneath tests so that we can clean them up + # when done. Ideally, each test should clean itself up. + # This means cheating with the config so we can modify it. The + # first call to Config.get() gets a pointer to the global + # singleton config object (as it defaults to static=True . + # (We don't just do this in the phrosty_test_config.yaml file + # because some examples use that and need access to the + # directories that the tests don't destroy.) + cfg = Config.get( _direc / 'phrosty_test_config.yaml', setdefault=True ) + # Now we're going to poke inside the Config object so we can + # modify this global singleton. We're not supposed to do + # that. If this was Java, it totally wouldn't let us. But + # here we go. + cfg._static = False + # Edit the config. + cfg.set_value( 'photometry.snappl.temp_dir', str(temp_dir) ) + cfg.set_value( 'photometry.phrosty.paths.scratch_dir', str(temp_dir) ) + cfg.set_value( 'photometry.phrosty.paths.temp_dir', str(temp_dir) ) + cfg.set_value( 'photometry.phrosty.paths.dia_out_dir', str(dia_out_dir) ) + cfg.set_value( 'photometry.phrosty.paths.ltcv_dir', str(ltcv_dir) ) + # Reset the config to static + cfg._static = True + + yield cfg + + finally: + # Clean up the output directories. (Don't delete the + # directories themselves, but do delete all files and + # subdirectories.) + # TODO: we could put a check here that they're actually empty, + # and yell if they're not. That means some test didn't clean + # up after itself. 
+ def nukedir( path ): + for f in path.iterdir(): + if f.is_dir(): + nukedir( f ) + f.rmdir() + else: + f.unlink( missing_ok=True ) + + nukedir( temp_dir ) + nukedir( scratch_dir ) + nukedir( dia_out_dir ) + nukedir( ltcv_dir ) @pytest.fixture( scope='session' ) def template_csv(): - return direc / "20172782_instances_templates_1.csv" + return _direc / "20172782_instances_templates_1.csv" @pytest.fixture( scope='session' ) def two_science_csv(): - return direc / "20172782_instances_science_2.csv" + return _direc / "20172782_instances_science_2.csv" -@pytest.fixture( scope='session' ) -def test_dia_image(): - # This is the first image from the csv file in 20172782_instances_science_2.csv - return { 'relpath': 'simple_model/Y106/35198/Roman_TDS_simple_model_Y106_35198_2.fits.gz', - 'pointing': 35198, - 'sca': 2, - 'mjd': 62455.669, - 'band': 'Y106' - } +@pytest.fixture( scope="session" ) +def dia_out_dir( config ): + return pathlib.Path( config.value( 'photometry.phrosty.paths.dia_out_dir' ) ) + +# @pytest.fixture( scope='session' ) +# def test_dia_image(): +# # This is the first image from the csv file in 20172782_instances_science_2.csv +# return { 'relpath': 'simple_model/Y106/35198/Roman_TDS_simple_model_Y106_35198_2.fits.gz', +# 'pointing': 35198, +# 'sca': 2, +# 'mjd': 62455.669, +# 'band': 'Y106' +# } # @pytest.fixture( scope='session' ) # def test_truth_file(): @@ -53,60 +113,64 @@ def test_dia_image(): # 'Y106' / '35198' / 'Roman_TDS_index_Y106_35198_2.txt' -@pytest.fixture( scope='session' ) -def test_sn(): - # This object is on the science images in 20172782_instances_science_2.csv - return { 'oid': 20172782, - 'mjd0': 62450.0, - 'mjd1': 62881.0, - 'ra': 7.551093401915147, - 'dec': -44.80718106491529, - 'zcmb': 0.3601, - 'peakmjd': 62476.507812 - } - +# @pytest.fixture( scope='session' ) +# def test_sn(): +# # This object is on the science images in 20172782_instances_science_2.csv +# return { 'oid': 20172782, +# 'mjd0': 62450.0, +# 'mjd1': 62881.0, +# 'ra': 7.551093401915147, +# 'dec': -44.80718106491529, +# 'zcmb': 0.3601, +# 'peakmjd': 62476.507812 +# } -@pytest.fixture( scope='session' ) -def compressed_template_image_path( sims_dir, template_csv ): - with open( template_csv ) as ifp: - line = ifp.readline() - assert line.split() == [ 'path', 'pointing', 'sca', 'mjd', 'band' ] - line = ifp.readline() - img, _point, _sca, _mjd, _band = line.split() - return sims_dir / img +@pytest.fixture( scope="session" ) +def object_for_tests(): + return DiaObject.find_objects( collection='ou2024', id=20172782 )[0] -@pytest.fixture( scope='session' ) -def one_compressed_science_image_path( sims_dir, two_science_csv ): - with open( two_science_csv ) as ifp: - line = ifp.readline() - assert line.split() == [ 'path', 'pointing', 'sca', 'mjd', 'band' ] - line = ifp.readline() - img, _point, _sca, _mjd, _band = line.split() +@pytest.fixture( scope="session" ) +def ou2024_image_collection(): + return ImageCollection.get_collection( "ou2024" ) - return sims_dir / img +# These next two are session scope fixtures, so be sure not to modify +# the things that you get from them. 
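To make that warning concrete, here is a hedged sketch (a hypothetical test, not part of the patch) of the safe pattern: a test that wants to alter pixel data from one of these shared session-scoped fixtures should work on a copy, so that tests running later still see the original:

    import numpy as np

    def test_uses_shared_image( one_science_image ):   # hypothetical test name
        # Copy the pixel data before modifying it so the shared fixture stays pristine.
        data = np.array( one_science_image.data, copy=True )
        data -= np.median( data )
        assert data.shape == one_science_image.data.shape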
+@pytest.fixture( scope="session" ) +def one_science_image(): + try: + img = FITSImageOnDisk( path=('/photometry_test_data/ou2024/images/simple_model/' + 'Y106/35198/Roman_TDS_simple_model_Y106_35198_2.fits.gz'), + imhdu=1, + pointing=35198, + sca=2 ).uncompressed_version() + yield img + finally: + img.path.unlink( missing_ok=True ) -@pytest.fixture( scope='session' ) -def template_image_path( compressed_template_image_path, dia_out_dir ): - out_path = dia_out_dir / "templ.fits" - phrosty.imagesubtraction.gz_and_ext( compressed_template_image_path, out_path ) +@pytest.fixture( scope="session" ) +def one_template_image(): try: - yield out_path - + img = FITSImageOnDisk( path=('/photometry_test_data/ou2024/images/simple_model/' + 'Y106/5934/Roman_TDS_simple_model_Y106_5934_3.fits.gz' ), + imhdu=1, + pointing=5934, + sca=3 ).uncompressed_version() + yield img finally: - out_path.unlink( missing_ok=True ) + img.path.unlink( missing_ok=True ) -@pytest.fixture( scope='session' ) -def one_science_image_path( one_compressed_science_image_path, dia_out_dir ): - out_path = dia_out_dir / "sci.fits" - phrosty.imagesubtraction.gz_and_ext( one_compressed_science_image_path, out_path ) +@pytest.fixture +def one_ou2024_template_image( ou2024_image_collection ): + return ou2024_image_collection.get_image( pointing=5934, sca=3, band='Y106' ) - try: - yield out_path - finally: - out_path.unlink( missing_ok=True ) +@pytest.fixture +def two_ou2024_science_images( ou2024_image_collection ): + img1 = ou2024_image_collection.get_image( pointing=35198, sca=2, band='Y106' ) + img2 = ou2024_image_collection.get_image( pointing=39790, sca=15, band='Y106' ) + return img1, img2 diff --git a/phrosty/tests/phrosty_test_config.yaml b/phrosty/tests/phrosty_test_config.yaml index 06059f1..6061bca 100644 --- a/phrosty/tests/phrosty_test_config.yaml +++ b/phrosty/tests/phrosty_test_config.yaml @@ -5,8 +5,8 @@ ou24: sn_truth_dir: /photometry_test_data/ou2024/snana_truth # sn_truth_dir: /snana_pq_dir - images: /photometry_test_data/ou2024/images - + images: /photometry_test_data/ou2024/images/simple_model + sims_sed_library: /photometry_test_data/ou2024/sims_sed_library # Comment from Cole & Rob: This is a path to the SEDs that were used to generate the # OU24 simulations.
Only a subset of the files are stored in @@ -19,6 +19,7 @@ ou24: photometry: snappl: A25ePSF_path: /snappl/snappl/tests/testdata/A25ePSF + temp_dir: /phrosty_temp phrosty: image_type: ou2024fits @@ -27,7 +28,7 @@ photometry: remove_temp_dir: false mem_trace: false kerpolyorder: 2 - + paths: scratch_dir: /phrosty_temp temp_dir: /phrosty_temp diff --git a/phrosty/tests/test_imagesubtraction.py b/phrosty/tests/test_imagesubtraction.py index 9d9a1f4..e3dd7d4 100644 --- a/phrosty/tests/test_imagesubtraction.py +++ b/phrosty/tests/test_imagesubtraction.py @@ -2,27 +2,16 @@ import pathlib import pytest +import numpy as np +import numpy.random as random + from astropy.io import fits from astropy.wcs import WCS +from snappl.image import FITSImageOnDisk import phrosty import phrosty.imagesubtraction -import numpy as np -import numpy.random as random - - -def test_gz_and_ext( dia_out_dir, compressed_template_image_path ): - out_path = dia_out_dir / "gunzipped.fits" - try: - assert out_path == phrosty.imagesubtraction.gz_and_ext( compressed_template_image_path, out_path ) - - with fits.open( out_path ) as hdu: - assert len(hdu) == 1 - assert hdu[0].data.shape == (4088, 4088) - - finally: - out_path.unlink( missing_ok=True ) @pytest.mark.skipif( os.getenv("SKIP_GPU_TESTS", 0), reason="SKIP_GPU_TESTS is set") @@ -34,16 +23,14 @@ def test_sky_subtract( dia_out_dir ): try: rng = random.default_rng( 42 ) imdata = rng.normal( 100., 10., ( 512, 512 ) ) - fits.writeto( in_path, imdata ) + hdr = fits.header.Header() + fits.writeto( in_path, imdata, header=hdr ) + img = FITSImageOnDisk( path=in_path ) - skymedrms = phrosty.imagesubtraction.sky_subtract( in_path, skysub_path, detmask_path, temp_dir=dia_out_dir ) - # Surprised it's not closer to 10 given the number of pixels, but whatevs + subim, detmask, skymedrms = phrosty.imagesubtraction.sky_subtract( img, temp_dir=dia_out_dir ) assert skymedrms == pytest.approx( 10., abs=0.2 ) - with fits.open( skysub_path ) as hdul: - data = hdul[0].data - assert data.mean() == pytest.approx( 0., abs=5 * 10. / 512. ) # 3σ - assert data.std() == pytest.approx( 10., rel=0.05 ) - assert hdul[0].header['SKYRMS'] == pytest.approx( 10., abs=0.2 ) + assert subim.data.mean() == pytest.approx( 0., abs=5 * 10. / 512. 
) # 3σ + assert subim.data.std() == pytest.approx( 10., rel=0.05 ) finally: for f in ( in_path, skysub_path, detmask_path ): @@ -84,16 +71,16 @@ def _check_resampled_image( templ, resamp ): @pytest.mark.skipif( os.getenv("SKIP_GPU_TESTS", 0), reason="SKIP_GPU_TESTS is set") -def test_stampmaker( dia_out_dir, test_dia_image, test_sn, one_science_image_path ): +def test_stampmaker( object_for_tests, one_science_image, dia_out_dir ): savedir = dia_out_dir savename = 'stamp.fits' - ra = test_sn[ 'ra' ] - dec = test_sn[ 'dec' ] + ra = object_for_tests.ra + dec = object_for_tests.dec shape = np.array( [ 100, 100 ] ) savepath = None try: - savepath = pathlib.Path( phrosty.imagesubtraction.stampmaker( ra, dec, shape, one_science_image_path, + savepath = pathlib.Path( phrosty.imagesubtraction.stampmaker( ra, dec, shape, one_science_image, savedir=savedir, savename=savename ) ) with fits.open( savepath ) as stamp: assert stamp[0].data.shape == tuple( shape ) @@ -106,122 +93,3 @@ def test_stampmaker( dia_out_dir, test_dia_image, test_sn, one_science_image_pat finally: if savepath is not None: savepath.unlink( missing_ok = True ) - -# @pytest.mark.skipif( os.getenv("SKIP_GPU_TESTS", 0), reason="SKIP_GPU_TESTS is set") -# def test_run_resample( dia_out_dir, template_image_path, one_science_image_path ): -# sci = one_science_image_path -# templ = template_image_path -# resamp = dia_out_dir / "resamp.fits" - -# try: -# phrosty.imagesubtraction.run_resample( templ, sci, resamp ) -# _check_resampled_image( templ, resamp ) -# finally: -# resamp.unlink( missing_ok=True ) - - -# def test_imalign( dia_out_dir, template_image_path, one_science_image_path ): -# sci = dia_out_dir / "sci.fits" -# templ = dia_out_dir / "templ.fits" -# resamp = dia_out_dir / "align/resamp.fits" -# phrosty.imagesubtraction.gz_and_ext( one_science_image_path, sci ) -# phrosty.imagesubtraction.gz_and_ext( template_image_path, templ ) - -# # BROKEN, FIX PATH PASSING - -# try: -# t0 = time.perf_counter() -# import pdb; pdb.set_trace() -# phrosty.imagesubtraction.imalign( templ, sci, out_path=resamp ) -# firsttime = time.perf_counter() - t0 -# _check_resampled_image( templ, resamp ) - -# # Rerun again with force false, it should be faster -# # This is a bit scary, because we're depending on the -# # filesystem being fast, and that may be bad, because -# # GPUs are fast too. -# t0 = time.perf_counter() -# phrosty.imagesubtraction.imalign( templ, sci, out_path=resamp, force=False ) -# forcefalsetime = time.perf_counter() - t0 -# assert forcefalsetime < firsttime / 10. 
-# _check_resampled_image( templ, resamp ) - -# # Now run with force, it should be slow again -# t0 = time.perf_counter() -# phrosty.imagesubtraction.imalign( templ, sci, out_path=resamp, force=True ) -# forcetruetime = time.perf_counter() - t0 -# assert forcetruetime / firsttime == pytest.approx( 1., rel=0.2 ) -# _check_resampled_image( templ, resamp ) - -# finally: -# resamp.unlink( missing_ok=True ) -# sci.unlink( missing_ok=True ) -# templ.unlink( missing_ok=True ) - - -# TODO : move these next two tests to effective test a test of snappl's ou24PSF.get_stamp -# def test_get_imsim_psf( sims_dir, sn_info_dir, dia_out_dir, test_dia_image, test_sn, one_science_image_path ): -# impath = one_science_image_path -# ra = test_sn[ 'ra' ] -# dec = test_sn[ 'dec' ] -# band = test_dia_image[ 'band' ] -# pointing = test_dia_image[ 'pointing' ] -# sca = test_dia_image[ 'sca' ] -# config_yaml = sn_info_dir / "tds.yaml" -# psf_path = dia_out_dir / "psf.fits" -# size = 201 - -# phrosty.imagesubtraction.get_imsim_psf( impath, ra, dec, band, pointing, sca, size=size, -# config_yaml_file=config_yaml, psf_path=psf_path ) -# with fits.open( psf_path ) as psf: -# ctr = size // 2 -# # Center pixel should be way brighter, it's undersampled -# for dx, dy in zip( [ -1, 1, 0, 0 ], [ 0, 0, -1, 1 ] ): -# assert psf[0].data[ ctr, ctr ] > 10.* ( psf[0].data[ ctr+dx, ctr+dy ] ) -# # OMG the PSF isn't normalized, I hope we deal with this right -# assert psf[0].data.sum() == pytest.approx( 2.33, abs=0.01 ) - -# # TODO test force and all that - - -# def test_get_imsim_psf_photonOps( sims_dir, sn_info_dir, dia_out_dir, -# test_dia_image, test_sn, one_science_image_path ): -# impath = one_science_image_path -# ra = test_sn[ 'ra' ] -# dec = test_sn[ 'dec' ] -# band = test_dia_image[ 'band' ] -# pointing = test_dia_image[ 'pointing' ] -# sca = test_dia_image[ 'sca' ] -# config_yaml = sn_info_dir / "tds.yaml" -# psf_path = dia_out_dir / "psf.fits" -# size = 201 -# photonOps = True -# n_phot = 1e6 -# oversampling_factor = 1 - -# phrosty.imagesubtraction.get_imsim_psf( impath, ra, dec, band, pointing, sca, size=size, -# config_yaml_file=config_yaml, psf_path=psf_path, -# include_photonOps=photonOps, n_phot=n_phot, -# oversampling_factor=oversampling_factor ) -# with fits.open( psf_path ) as psf: -# ctr = size // 2 -# # Center pixel should be way brighter, it's undersampled -# # Photon shooting seems to blur out the PSF a bit, so the -# # condition is center pixel is >8× neighbors, whereas it -# # was 10x in the previous test. -# for dx, dy in zip( [ -1, 1, 0, 0 ], [ 0, 0, -1, 1 ] ): -# assert psf[0].data[ ctr, ctr ] > 3.* ( psf[0].data[ ctr+dx, ctr+dy ] ) -# # ...but it looks like it's normalized now! -# # ...or is it? It's about .21% away from 1. -# # ...it's different very time, because the photonOps -# # uses an RNG. Until such a time as we have -# # an argument to seed that, we will need -# # to accept that sometimes this test passes, -# # sometimes it fails. (The fact that -# # it varies by up to half a percent makes -# # me scared that we aren't using enough photons, -# # but, then, long term we aren't going to use -# # galsim psfs anyway.) 
-#         assert psf[0].data.sum() == pytest.approx( 1.000, abs=0.005 )
-
-#     # TODO test force and all that
diff --git a/phrosty/tests/test_pipeline.py b/phrosty/tests/test_pipeline.py
new file mode 100644
index 0000000..bbd69cc
--- /dev/null
+++ b/phrosty/tests/test_pipeline.py
@@ -0,0 +1,69 @@
+import os
+import pytest
+from phrosty.pipeline import Pipeline
+
+# TODO : separate tests for PipelineImage, for all the functions in
+# PipelineImage and Pipeline. Right now we just have this regression
+# test.
+
+
+@pytest.mark.skipif( os.getenv("SKIP_GPU_TESTS", 0 ), reason="SKIP_GPU_TESTS is set" )
+def test_pipeline_run( object_for_tests, ou2024_image_collection,
+                       one_ou2024_template_image, two_ou2024_science_images ):
+    pip = Pipeline( object_for_tests, ou2024_image_collection, 'Y106',
+                    science_images=two_ou2024_science_images,
+                    template_images=[one_ou2024_template_image],
+                    nprocs=2, nwrite=3 )
+    ltcv = pip()
+
+    with open( ltcv ) as ifp:
+        hdrline = ifp.readline().strip()
+        assert hdrline == ( 'ra,dec,mjd,filter,pointing,sca,template_pointing,template_sca,zpt,'
+                            'aperture_sum,flux_fit,flux_fit_err,mag_fit,mag_fit_err' )
+        kws = hdrline.split( "," )
+        pairs = []
+        for line in ifp:
+            data = line.strip().split( "," )
+            pairs.append( { kw: val for kw, val in zip( kws, data ) } )
+
+    assert len(pairs) == 2
+    for pair, img in zip( pairs, two_ou2024_science_images ):
+        assert float(pair['ra']) == pytest.approx( object_for_tests.ra, abs=0.1/3600. )
+        assert float(pair['dec']) == pytest.approx( object_for_tests.dec, abs=0.1/3600. )
+        assert float(pair['mjd']) == pytest.approx( img.mjd, abs=1e-3 )
+        assert pair['filter'] == 'Y106'
+        assert int(pair['pointing']) == int(img.pointing)
+        assert int(pair['sca']) == int(img.sca)
+        assert int(pair['template_pointing']) == int(one_ou2024_template_image.pointing)
+        assert int(pair['template_sca']) == int(one_ou2024_template_image.sca)
+        # TODO : fix the zeropoint!
+        assert pair['zpt'] == ''
+
+    # Tests aren't exactly reproducible from one run to the next,
+    # because some classes (including the galsim PSF that we use right
+    # now) have random numbers in them, and at the moment we aren't
+    # controlling the seed. So, we can only test for approximately
+    # consistent results. Going to do 0.3 times the uncertainty,
+    # because a difference by that much is not all that meaningful
+    # a change, and empirically they vary by that much. (Which is
+    # alarming, but what can you do.)
+
+    dflux = float( pairs[0]['flux_fit_err'] )
+    assert dflux == pytest.approx( 25., rel=0.3 )
+    dmag = float( pairs[0]['mag_fit_err'] )
+    assert dmag == pytest.approx( 0.15, abs=0.1 )
+    assert float( pairs[0]['aperture_sum'] ) == pytest.approx( 1006.8969554640036, abs=0.3*dflux )
+    assert float( pairs[0]['flux_fit'] ) == pytest.approx( 181.9182196835094, abs=0.3*dflux )
+    assert float( pairs[0]['mag_fit'] ) == pytest.approx( -5.64, abs=max( 0.3*dmag, 0.01 ) )
+
+    dflux = float( pairs[1]['flux_fit_err'] )
+    assert dflux == pytest.approx( 27., rel=0.3 )
+    dmag = float( pairs[1]['mag_fit_err'] )
+    assert dmag == pytest.approx( 0.05, abs=0.1 )
+    assert float( pairs[1]['aperture_sum'] ) == pytest.approx( 4112, abs=0.3*dflux )
+    assert float( pairs[1]['flux_fit'] ) == pytest.approx( 721, abs=0.3*dflux )
+    assert float( pairs[1]['mag_fit'] ) == pytest.approx( -7.14, abs=max( 0.3*dmag, 0.01 ) )
+
+    # TODO : clean up output directories! This is scary if you're using the same
+    # directories for tests and for running... so don't do that... but the
+    # way we're set up right now, you probably are.
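The regression test above also serves as the simplest illustration of driving the refactored pipeline from Python: build a ``Pipeline`` from an object, an image collection, a band, and lists of template and science images, then call it to get a lightcurve CSV. A minimal sketch of that call pattern follows; ``obj``, ``collection``, ``sci_images``, and ``templ_images`` are placeholder names for whatever object, image collection, and image lists your environment provides (the test gets them from fixtures), not a documented API::

    from phrosty.pipeline import Pipeline

    def run_one_lightcurve( obj, collection, sci_images, templ_images ):
        # obj, collection, sci_images, and templ_images are placeholders for
        # the object, image collection, and image lists you have in hand
        # (the test above obtains them from pytest fixtures).
        pip = Pipeline( obj, collection, 'Y106',
                        science_images=sci_images,
                        template_images=templ_images,
                        nprocs=2, nwrite=3 )
        ltcv = pip()   # per the test above, this can be opened as a CSV lightcurve
        # The CSV header asserted in the test is:
        # ra,dec,mjd,filter,pointing,sca,template_pointing,template_sca,zpt,
        # aperture_sum,flux_fit,flux_fit_err,mag_fit,mag_fit_err
        return ltcv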
diff --git a/phrosty/tests/test_utils.py b/phrosty/tests/test_utils.py index df7ea69..2fb0b5d 100644 --- a/phrosty/tests/test_utils.py +++ b/phrosty/tests/test_utils.py @@ -1,219 +1,219 @@ -# IMPORTS Standard: -import numpy as np -import os -import os.path -import re -import pytest +# # IMPORTS Standard: +# import numpy as np +# import os +# import os.path +# import re +# import pytest -# IMPORTS Astro: -from astropy.io import fits -from astropy.table import Table +# # IMPORTS Astro: +# from astropy.io import fits +# from astropy.table import Table -# IMPORTS Internal: -import phrosty.utils -from snpit_utils.config import Config +# # IMPORTS Internal: +# import phrosty.utils +# from snpit_utils.config import Config -def test_build_filepath( test_dia_image ): - band = test_dia_image[ 'band' ] - pointing = test_dia_image[ 'pointing' ] - sca = test_dia_image[ 'sca' ] +# def test_build_filepath( test_dia_image ): +# band = test_dia_image[ 'band' ] +# pointing = test_dia_image[ 'pointing' ] +# sca = test_dia_image[ 'sca' ] - with pytest.raises( ValueError, match=r"filetype must be in \['image', 'truth', 'truthtxt'\]" ): - phrosty.utils._build_filepath( None, band, pointing, sca, 'foo' ) +# with pytest.raises( ValueError, match=r"filetype must be in \['image', 'truth', 'truthtxt'\]" ): +# phrosty.utils._build_filepath( None, band, pointing, sca, 'foo' ) - for args in [ [ None, None, pointing, sca, 'image' ], - [ None, band, None, sca, 'image' ], - [ None, band, pointing, None, 'image' ] ]: - with pytest.raises( ValueError, match="You need to specify band" ): - phrosty.utils._build_filepath( *args ) +# for args in [ [ None, None, pointing, sca, 'image' ], +# [ None, band, None, sca, 'image' ], +# [ None, band, pointing, None, 'image' ] ]: +# with pytest.raises( ValueError, match="You need to specify band" ): +# phrosty.utils._build_filepath( *args ) - types = [ 'image', 'truth', 'truthtxt' ] - for typ in types: - fp = phrosty.utils._build_filepath( None, band, pointing, sca, typ ) - assert re.search( f"^Roman_TDS_.*_{band}_{pointing}_{sca}", os.path.basename(fp) ) +# types = [ 'image', 'truth', 'truthtxt' ] +# for typ in types: +# fp = phrosty.utils._build_filepath( None, band, pointing, sca, typ ) +# assert re.search( f"^Roman_TDS_.*_{band}_{pointing}_{sca}", os.path.basename(fp) ) - fp = phrosty.utils._build_filepath( "/foo/bar", None, None, None, 'image' ) - assert fp == '/foo/bar' +# fp = phrosty.utils._build_filepath( "/foo/bar", None, None, None, 'image' ) +# assert fp == '/foo/bar' -def test_ou2024_obseq_path(): - test_path_none = os.path.join( Config.get().value('ou24.tds_base'), 'Roman_TDS_obseq_11_6_23.fits' ) - test_path_arg = '/foo/bar' +# def test_ou2024_obseq_path(): +# test_path_none = os.path.join( Config.get().value('ou24.tds_base'), 'Roman_TDS_obseq_11_6_23.fits' ) +# test_path_arg = '/foo/bar' - path_none = phrosty.utils.ou2024_obseq_path() - path_arg = phrosty.utils.ou2024_obseq_path( test_path_arg ) +# path_none = phrosty.utils.ou2024_obseq_path() +# path_arg = phrosty.utils.ou2024_obseq_path( test_path_arg ) - assert path_none == test_path_none - assert path_arg == test_path_arg +# assert path_none == test_path_none +# assert path_arg == test_path_arg -def test_get_roman_bands(): - assert phrosty.utils.get_roman_bands() == [ 'R062', 'Z087', 'Y106', 'J129', 'H158', 'F184', 'K213' ] +# def test_get_roman_bands(): +# assert phrosty.utils.get_roman_bands() == [ 'R062', 'Z087', 'Y106', 'J129', 'H158', 'F184', 'K213' ] -def test_read_truth_txt( test_dia_image ): - band = 
test_dia_image[ 'band' ] - pointing = test_dia_image[ 'pointing' ] - sca = test_dia_image[ 'sca' ] - truth = phrosty.utils.read_truth_txt( None, band, pointing, sca ) - assert len(truth) == 25622 - assert set(truth.columns) == {'dec', 'ra', 'obj_type', 'flux', 'x', 'realized_flux', 'object_id', 'mag', 'y'} +# def test_read_truth_txt( test_dia_image ): +# band = test_dia_image[ 'band' ] +# pointing = test_dia_image[ 'pointing' ] +# sca = test_dia_image[ 'sca' ] +# truth = phrosty.utils.read_truth_txt( None, band, pointing, sca ) +# assert len(truth) == 25622 +# assert set(truth.columns) == {'dec', 'ra', 'obj_type', 'flux', 'x', 'realized_flux', 'object_id', 'mag', 'y'} -def test_radec_isin( test_sn, test_dia_image ): - band = test_dia_image[ 'band' ] - pointing = test_dia_image[ 'pointing' ] - sca = test_dia_image[ 'sca' ] - rain = test_sn[ 'ra' ] - decin = test_sn[ 'dec' ] - # These next few are based on knowledge of what test_dia_image is - raout = 7.4106 - decout = -44.7131 - rawayout = 8 - decwayout = 32 +# def test_radec_isin( test_sn, test_dia_image ): +# band = test_dia_image[ 'band' ] +# pointing = test_dia_image[ 'pointing' ] +# sca = test_dia_image[ 'sca' ] +# rain = test_sn[ 'ra' ] +# decin = test_sn[ 'dec' ] +# # These next few are based on knowledge of what test_dia_image is +# raout = 7.4106 +# decout = -44.7131 +# rawayout = 8 +# decwayout = 32 - assert phrosty.utils.radec_isin( rain, decin, path=None, band=band, pointing=pointing, sca=sca ) - assert not phrosty.utils.radec_isin( raout, decout, path=None, band=band, pointing=pointing, sca=sca ) - assert not phrosty.utils.radec_isin( rawayout, decwayout, path=None, band=band, pointing=pointing, sca=sca ) +# assert phrosty.utils.radec_isin( rain, decin, path=None, band=band, pointing=pointing, sca=sca ) +# assert not phrosty.utils.radec_isin( raout, decout, path=None, band=band, pointing=pointing, sca=sca ) +# assert not phrosty.utils.radec_isin( rawayout, decwayout, path=None, band=band, pointing=pointing, sca=sca ) -def test_get_corners( test_dia_image ): - band = test_dia_image[ 'band' ] - pointing = test_dia_image[ 'pointing' ] - sca = test_dia_image[ 'sca' ] +# def test_get_corners( test_dia_image ): +# band = test_dia_image[ 'band' ] +# pointing = test_dia_image[ 'pointing' ] +# sca = test_dia_image[ 'sca' ] - corners = phrosty.utils.get_corners( path=None, band=band, pointing=pointing, sca=sca ) - expected = ( np.array([ 7.58004938, -44.71290687]), - np.array([ 7.5896352 , -44.83369681]), - np.array([ 7.4075831 , -44.71566849]), - np.array([ 7.4168088 , -44.83647236]) ) - assert np.all( np.abs(c-e) < 0.00028 for c, e in zip( corners, expected ) ) +# corners = phrosty.utils.get_corners( path=None, band=band, pointing=pointing, sca=sca ) +# expected = ( np.array([ 7.58004938, -44.71290687]), +# np.array([ 7.5896352 , -44.83369681]), +# np.array([ 7.4075831 , -44.71566849]), +# np.array([ 7.4168088 , -44.83647236]) ) +# assert np.all( np.abs(c-e) < 0.00028 for c, e in zip( corners, expected ) ) -# TEST _read_parquet? +# # TEST _read_parquet? 
-# This one also implicitly tests make_object_table -def test_get_transient_radec( test_sn ): - trns = test_sn[ 'oid' ] - coords = ( test_sn['ra'], test_sn['dec'] ) - assert phrosty.utils.get_transient_radec(trns) == pytest.approx( coords, abs=0.000028 ) +# # This one also implicitly tests make_object_table +# def test_get_transient_radec( test_sn ): +# trns = test_sn[ 'oid' ] +# coords = ( test_sn['ra'], test_sn['dec'] ) +# assert phrosty.utils.get_transient_radec(trns) == pytest.approx( coords, abs=0.000028 ) -def test_get_transient_mjd( test_sn ): - trns = test_sn[ 'oid' ] - endpoints = (test_sn['mjd0'], test_sn['mjd1']) - assert phrosty.utils.get_transient_mjd(trns) == pytest.approx( endpoints, abs=0.1 ) +# def test_get_transient_mjd( test_sn ): +# trns = test_sn[ 'oid' ] +# endpoints = (test_sn['mjd0'], test_sn['mjd1']) +# assert phrosty.utils.get_transient_mjd(trns) == pytest.approx( endpoints, abs=0.1 ) -def test_get_transient_zcmb( test_sn ): - oid = test_sn[ 'oid' ] - zcmb = np.float32( test_sn['zcmb'] ) +# def test_get_transient_zcmb( test_sn ): +# oid = test_sn[ 'oid' ] +# zcmb = np.float32( test_sn['zcmb'] ) - tzcmb = phrosty.utils.get_transient_zcmb(oid) +# tzcmb = phrosty.utils.get_transient_zcmb(oid) - assert zcmb == pytest.approx( tzcmb, rel=1e-5 ) +# assert zcmb == pytest.approx( tzcmb, rel=1e-5 ) -def test_get_transient_peakmjd( test_sn ): - oid = test_sn[ 'oid' ] - mjd = np.float32( test_sn[ 'peakmjd' ] ) +# def test_get_transient_peakmjd( test_sn ): +# oid = test_sn[ 'oid' ] +# mjd = np.float32( test_sn[ 'peakmjd' ] ) - tmjd = phrosty.utils.get_transient_peakmjd(oid) +# tmjd = phrosty.utils.get_transient_peakmjd(oid) - assert mjd == pytest.approx( tmjd, abs=0.001 ) +# assert mjd == pytest.approx( tmjd, abs=0.001 ) -def test_get_transient_info( test_sn ): - oid = test_sn[ 'oid' ] - coords = (test_sn['ra'], test_sn['dec'], test_sn['mjd0'], test_sn['mjd1']) - assert phrosty.utils.get_transient_info( oid ) == pytest.approx( coords, rel=1e-6 ) +# def test_get_transient_info( test_sn ): +# oid = test_sn[ 'oid' ] +# coords = (test_sn['ra'], test_sn['dec'], test_sn['mjd0'], test_sn['mjd1']) +# assert phrosty.utils.get_transient_info( oid ) == pytest.approx( coords, rel=1e-6 ) -def test_transient_in_or_out( test_sn, test_dia_image ): - oid = test_sn[ 'oid' ] - mjd0 = test_sn[ 'mjd0' ] - mjd1 = test_sn[ 'mjd1' ] - band = test_dia_image[ 'band' ] +# def test_transient_in_or_out( test_sn, test_dia_image ): +# oid = test_sn[ 'oid' ] +# mjd0 = test_sn[ 'mjd0' ] +# mjd1 = test_sn[ 'mjd1' ] +# band = test_dia_image[ 'band' ] - in_tab, out_tab = phrosty.utils.transient_in_or_out( oid, mjd0, mjd1, band ) - # These numbers are based on the choice of object in test_sn - assert len(in_tab) == 53 - assert len(out_tab) == 82 - assert set(in_tab.columns) == { 'filter', 'pointing', 'sca' } - assert set(out_tab.columns) == { 'filter', 'pointing', 'sca' } +# in_tab, out_tab = phrosty.utils.transient_in_or_out( oid, mjd0, mjd1, band ) +# # These numbers are based on the choice of object in test_sn +# assert len(in_tab) == 53 +# assert len(out_tab) == 82 +# assert set(in_tab.columns) == { 'filter', 'pointing', 'sca' } +# assert set(out_tab.columns) == { 'filter', 'pointing', 'sca' } -@pytest.mark.skip( reason="Not used in pipeline.py, test is a TODO" ) -def test_get_templates(): - assert False +# @pytest.mark.skip( reason="Not used in pipeline.py, test is a TODO" ) +# def test_get_templates(): +# assert False -@pytest.mark.skip( reason="Not used in pipeline.py, test is a TODO" ) -def 
test_get_science(): - assert False +# @pytest.mark.skip( reason="Not used in pipeline.py, test is a TODO" ) +# def test_get_science(): +# assert False -def test_get_mjd_limits(): - start, end = phrosty.utils.get_mjd_limits() - assert ( start, end ) == pytest.approx( (62000.02139, 63563.0579), abs=0.1 ) +# def test_get_mjd_limits(): +# start, end = phrosty.utils.get_mjd_limits() +# assert ( start, end ) == pytest.approx( (62000.02139, 63563.0579), abs=0.1 ) -def test_get_radec_limits(): - val = phrosty.utils.get_radec_limits() - assert val['ra'] == pytest.approx( [6.97879, 12.0204], abs=0.001 ) - assert val['dec'] == pytest.approx( [-46.5199, -41.4786], abs=0.001 ) +# def test_get_radec_limits(): +# val = phrosty.utils.get_radec_limits() +# assert val['ra'] == pytest.approx( [6.97879, 12.0204], abs=0.001 ) +# assert val['dec'] == pytest.approx( [-46.5199, -41.4786], abs=0.001 ) -def test_get_mjd(): - pointing = 0 - test_path = os.path.join(os.path.dirname(__file__), 'testdata', 'Roman_TDS_obseq_11_6_23.fits') - with fits.open(test_path) as tp: - tobseq = Table(tp[1].data) - tmjd = float(tobseq['date'][int(pointing)]) - mjd = phrosty.utils.get_mjd(pointing) +# def test_get_mjd(): +# pointing = 0 +# test_path = os.path.join(os.path.dirname(__file__), 'testdata', 'Roman_TDS_obseq_11_6_23.fits') +# with fits.open(test_path) as tp: +# tobseq = Table(tp[1].data) +# tmjd = float(tobseq['date'][int(pointing)]) +# mjd = phrosty.utils.get_mjd(pointing) - assert tmjd == mjd +# assert tmjd == mjd -def test_pointings_near_mjd(): - obseq = Table.read( phrosty.utils.ou2024_obseq_path() ) +# def test_pointings_near_mjd(): +# obseq = Table.read( phrosty.utils.ou2024_obseq_path() ) - ptgs = phrosty.utils.pointings_near_mjd( 62000.04011 ) - assert len(ptgs) == 383 - assert np.all( obseq[ptgs]['date'] > 62000.04011 - 3 ) - assert np.all( obseq[ptgs]['date'] < 62000.04011 + 3 ) - ptgs = phrosty.utils.pointings_near_mjd( 62000.04011, window=1 ) - assert len(ptgs) == 220 - assert np.all( obseq[ptgs]['date'] > 62000.04011 - 1 ) - assert np.all( obseq[ptgs]['date'] < 62000.04011 + 1 ) +# ptgs = phrosty.utils.pointings_near_mjd( 62000.04011 ) +# assert len(ptgs) == 383 +# assert np.all( obseq[ptgs]['date'] > 62000.04011 - 3 ) +# assert np.all( obseq[ptgs]['date'] < 62000.04011 + 3 ) +# ptgs = phrosty.utils.pointings_near_mjd( 62000.04011, window=1 ) +# assert len(ptgs) == 220 +# assert np.all( obseq[ptgs]['date'] > 62000.04011 - 1 ) +# assert np.all( obseq[ptgs]['date'] < 62000.04011 + 1 ) - ptgs = phrosty.utils.pointings_near_mjd( 10000 ) - assert len(ptgs) == 0 +# ptgs = phrosty.utils.pointings_near_mjd( 10000 ) +# assert len(ptgs) == 0 -def test_get_mjd_info(): - mjd0 = 62000.04011 - 3 - mjd1 = 62000.04011 + 3 +# def test_get_mjd_info(): +# mjd0 = 62000.04011 - 3 +# mjd1 = 62000.04011 + 3 - info = phrosty.utils.get_mjd_info( mjd0, mjd1 ) - assert len(info) == 383 - assert set( info['filter'] ) == { 'H158', 'F184', 'R062', 'K213', 'Y106', 'Z087', 'J129' } +# info = phrosty.utils.get_mjd_info( mjd0, mjd1 ) +# assert len(info) == 383 +# assert set( info['filter'] ) == { 'H158', 'F184', 'R062', 'K213', 'Y106', 'Z087', 'J129' } -def test_get_exptime(): - # These are the exposure times that were used in the OpenUniverse sims - # (See Troxel et al. 2025 ) - expected = {'F184': 901.175, - 'J129': 302.275, - 'H158': 302.275, - 'K213': 901.175, - 'R062': 161.025, - 'Y106': 302.275, - 'Z087': 101.7} +# def test_get_exptime(): +# # These are the exposure times that were used in the OpenUniverse sims +# # (See Troxel et al. 
2025 ) +# expected = {'F184': 901.175, +# 'J129': 302.275, +# 'H158': 302.275, +# 'K213': 901.175, +# 'R062': 161.025, +# 'Y106': 302.275, +# 'Z087': 101.7} - assert phrosty.utils.get_exptime() == expected +# assert phrosty.utils.get_exptime() == expected - for band, t in expected.items(): - assert phrosty.utils.get_exptime( band ) == pytest.approx( t, abs=0.001 ) +# for band, t in expected.items(): +# assert phrosty.utils.get_exptime( band ) == pytest.approx( t, abs=0.001 ) diff --git a/phrosty/utils.py b/phrosty/utils.py deleted file mode 100644 index 9dda844..0000000 --- a/phrosty/utils.py +++ /dev/null @@ -1,624 +0,0 @@ -__all__ = [ 'get_roman_bands', 'read_truth_txt', 'radec_isin', 'get_corners', - 'get_transient_radec', 'get_transient_mjd', 'get_transient_zcmb', - 'get_transient_peakmjd', 'get_transient_info', 'transient_in_or_out', - 'get_mjd_limits', 'get_radec_limits', 'get_mjd', 'pointings_near_mjd', - 'get_mjd_info', 'get_exptime', 'make_object_table' ] - -# IMPORTS Standard: -import os -import os.path as pa -import numpy as np -import pandas as pd -import requests -import warnings -from glob import glob - -# IMPORTS Astro: -from astropy.coordinates import SkyCoord -from astropy.io import fits -from astropy.wcs import WCS, FITSFixedWarning -import astropy.wcs.wcs -from astropy.wcs.utils import skycoord_to_pixel -from astropy.table import Table -from astropy import units as u - -# IMPORTS snpit -from snpit_utils.config import Config -from snpit_utils.logger import SNLogger - -# The FITSFixedWarning is consequenceless and it will get thrown every single time you deal with a WCS. -warnings.simplefilter('ignore', category=FITSFixedWarning) - - -def _build_filepath(path, band, pointing, sca, filetype, rootdir=None): - """Builds the filepath to an OpenUniverse simulation file. - - Parameters - ---------- - path: Path - If the path to the file is already known, this overrides the rest of the function and returns - the input to kwarg 'path'. - band: str - Filter associated with target file. - pointing: str - Pointing number associated with target file. - sca: str - SCA number associated with target file. - filetype: str - The type of target file within the OpenUniverse simulations that you are looking for. Valid - values are 'image' (*.fits.gz), 'truth' (*.fits.gz), and 'truthtxt' (*.txt). - rootdir: Path, default None - Root directory where OpenUniverse files are stored. - - Returns - ------- - path: Path - Path to target file. - - Raises - ------ - ValueError - if filetype is not 'image', 'truth', or 'truthtxt', a ValueError is raised. - ValueError - if (band is None) or (pointing is None) or (sca is None), - you need to specify band, pointing, and sca if you do not provide a full filepath. - ValueError - if (path is None) and (band is None) and (pointing is None) and (sca is None), - you need to provide either the full image path, or the band, pointing, and SCA. - - """ - - rootdir = Config.get().value( 'ou24.tds_base' ) if rootdir is None else rootdir - - # First, what kind of file are we looking for? - neededtypes = [ 'image', 'truth', 'truthtxt' ] - if filetype not in neededtypes: - raise ValueError(f"filetype must be in {neededtypes}.") - elif filetype == 'image': - subdir = 'images/simple_model' - prefix = 'simple_model' - extension = 'fits.gz' - elif filetype == 'truth': - subdir = 'images/truth' - prefix = 'truth' - extension = 'fits.gz' - elif filetype == 'truthtxt': - subdir = 'truth' - prefix = 'index' - extension = 'txt' - - # Did you already provide a path? 
- if path is not None: - return path - elif (band is None) or (pointing is None) or (sca is None): - raise ValueError('You need to specify band, pointing, and sca if you do not provide a full filepath.') - elif (band is not None) and (pointing is not None) and (sca is not None): - path = pa.join(rootdir, subdir, band, str(pointing), - f'Roman_TDS_{prefix}_{band}_{str(pointing)}_{str(sca)}.{extension}') - return path - - elif (path is None) and (band is None) and (pointing is None) and (sca is None): - raise ValueError('You need to provide either the full image path, or the band, pointing, and SCA.') - - -def ou2024_obseq_path( path=None ): - """Retrieve the path to the OpenUniverse obseq file. - - Parameters - ---------- - path: Path, default None - Path to file. If not provided, use config file. - - Returns - ------- - Path to OpenUniverse obseq file. - - """ - return ( os.path.join( Config.get().value('ou24.tds_base'), 'Roman_TDS_obseq_11_6_23.fits' ) - if path is None else path ) - - -def get_roman_bands(): - """Get roman passbands. - - Returns - ------- - List of bands included in the Roman-DESC TDS simulations. - - """ - return ['R062', 'Z087', 'Y106', 'J129', 'H158', 'F184', 'K213'] - - -def read_truth_txt(path=None, band=None, pointing=None, sca=None): - """Reads in the txt versions of the OpenUniverse truth files as convenient astropy tables. - - Parameters - ---------- - truthpath: str, default None - Path to txt catalog version of truth file. If you do not - provide this, you need to specify the arguments - band, pointing, and sca. - band: str, default None - Roman filter. If you do not provide this, you need to provide truthpath. - pointing: str, default None - Pointing ID. If you do not provide this, you need to provide truthpath. - sca: str, default None - SCA ID. If you do not provide this, you need to provide truthpath. - - Returns - ------- - truth: astropy.table.Table - Astropy table with contents of the specified catalog txt file. - - """ - - _truthpath = _build_filepath(path=path, band=band, pointing=pointing, sca=sca, filetype='truthtxt') - truth_colnames = ['object_id', 'ra', 'dec', 'x', 'y', 'realized_flux', 'flux', 'mag', 'obj_type'] - truth_pd = pd.read_csv(_truthpath, comment='#', skipinitialspace=True, sep=' ', names=truth_colnames) - truth = Table.from_pandas(truth_pd) - - return truth - - -def radec_isin(ra, dec, path=None, band=None, pointing=None, sca=None): - """Check if a given RA, dec coordinate is in a given target image. - - Parameters - ---------- - ra : float - RA in degrees. - dec : float - Dec in degrees. - path : Path, default None - Path to image to check. - band : str, default None - Filter assocated with target image. - pointing : str, default None - Pointing associated with target image. - sca : str, default None - SCA associated with target image. - - Returns - ------- - res: boolean - True if provided RA, Dec is in the image. False if not. - - """ - - _imgpath = _build_filepath(path, band, pointing, sca, 'image') - with fits.open(_imgpath) as hdu: - wcs = WCS(hdu[1].header) - worldcoords = SkyCoord(ra=ra*u.deg, dec=dec*u.deg) - try: - x, y = skycoord_to_pixel(worldcoords, wcs) - except astropy.wcs.wcs.NoConvergence: - return False - pxradec = np.array([x, y]) - if np.logical_or(any(pxradec < 0), any(pxradec > 4088)): - res = False - else: - res = True - - return res - - -def get_corners(path=None, band=None, pointing=None, sca=None): - """Retrieves the RA, dec of the corners of the specified SCA in degrees. - - :param band: Roman filter. 
- :type band: str - :param pointing: Pointing ID. - :type pointing: str - :param sca: SCA ID. - :type sca: str - :return: Tuple containing four numpy arrays, each with the RA and dec of the corner - of the specified image in degrees. - :rtype: tuple - """ - _imgpath = _build_filepath(path, band, pointing, sca, 'image') - with fits.open(_imgpath) as hdu: - wcs = WCS(hdu[1].header) - corners = [[0, 0], [0, 4088], [4088, 0], [4088, 4088]] - wcs_corners = wcs.pixel_to_world_values(corners) - - return wcs_corners - - -# TODO clean this up for style -# TODO write something to clear this out -_parquet_cache = {} - - -def _read_parquet( file ): - global _parquet_cache - - if file not in _parquet_cache: - SNLogger.info( f"**** Reading parquet file {file}" ) - _parquet_cache[file] = pd.read_parquet( file ) - - totm = 0 - for f, df in _parquet_cache.items(): - totm += df.memory_usage(index=True).sum() - - SNLogger.info( f"**** Done reading parquet file {file}; cache using {totm/1024/1024} MiB" ) - return _parquet_cache[ file ] - - -def get_transient_radec(oid): - """Retrieve RA, dec of a transient based on its object ID in the OpenUniverse sims. - - Parameters - ---------- - oid : int - Object ID of target transient. - - Returns - ------- - ra : float - RA in degrees of target transient. - dec : float - Dec in degrees of target transient. - - """ - - oid = int(oid) - snana_pq_path = os.path.join( Config.get().value('ou24.sn_truth_dir'), 'snana_*.parquet' ) - file_list = glob(snana_pq_path) - for file in file_list: - # Read the Parquet file - df = _read_parquet(file) - if len(df[df['id'] == oid]) != 0: - ra = df[df['id'] == oid]['ra'].values[0] - dec = df[df['id'] == oid]['dec'].values[0] - return ra, dec - - -def get_transient_mjd(oid): - """Retrieve start and end dates of a transient in the OpenUniverse sims based on its object ID. - - Parameters - ---------- - oid : int - Object ID of target transient. - - Returns - ------- - start : float - Simulated start MJD of target transient. - - end : float - Simulated end MJD of target transient. - - """ - oid = int(oid) - snana_pq_path = os.path.join( Config.get().value('ou24.sn_truth_dir'), 'snana_*.parquet' ) - file_list = glob(snana_pq_path) - for file in file_list: - # Read the Parquet file - df = _read_parquet(file) - if len(df[df['id'] == oid]) != 0: - start = df[df['id'] == oid]['start_mjd'].values[0] - end = df[df['id'] == oid]['end_mjd'].values[0] - - return start, end - - -def get_transient_zcmb(oid): - - """Retrieve z_CMB of a transient in the OpenUniverse sims based on its object ID. - - Parameters - ---------- - oid : int - Object ID of target transient. - - Returns - ------- - z : float - z_CMB of target transient. - - """ - - oid = int(oid) - snana_pq_path = os.path.join( Config.get().value('ou24.sn_truth_dir'), 'snana_*.parquet' ) - file_list = glob(snana_pq_path) - for file in file_list: - # Read the Parquet file - df = _read_parquet(file) - if len(df[df['id'] == oid]) != 0: - z = float(df[df['id'] == oid]['z_CMB'].values[0]) - - return z - - -def get_transient_peakmjd(oid): - - """Retrieve MJD of peak brightness for a transient object in the OpenUniverse sims. - - Parameters - ---------- - oid : int - Object ID of target transient. - - Returns - ------- - mjd : float - Peak MJD of target transient. 
- - """ - - oid = int(oid) - snana_pq_path = os.path.join( Config.get().value('ou24.sn_truth_dir'), 'snana_*.parquet' ) - file_list = glob(snana_pq_path) - for file in file_list: - # Read the Parquet file - df = _read_parquet(file) - if len(df[df['id'] == oid]) != 0: - mjd = df[df['id'] == oid]['peak_mjd'].values[0] - - return mjd - - -def get_transient_info(oid): - """Retrieve RA, Dec, start MJD, and end MJD for a specified object in the OpenUnvierse sims. - - This function calls get_transient_radec() and get_transient_mjd(). - - Parameters - ---------- - oid : int - Object ID of target transient. - - Returns - ------- - ra : float - RA in degrees of target transient. - - dec : float - Dec in degrees of target transient. - - start : float - Simulated start MJD of target transient. - - end : float - Simulated end MJD of target transient. - - """ - - SNLogger.info( "*** calling get_transient_radec" ) - ra, dec = get_transient_radec(oid) - SNLogger.info( "*** calling get_transient_mjd" ) - start, end = get_transient_mjd(oid) - SNLogger.info( "*** Done with get_transient_info" ) - - return ra, dec, start, end - - -def transient_in_or_out(oid, start, end, band): - """Retrieve pointings that contain and do not contain the specified SN, per the truth files by MJD. - - transient_info_filepath is the output of make_object_table. - - Returns a tuple of astropy tables (images with the SN, images without the SN). - """ - tab = Table.from_pandas( make_object_table( oid ) ) - - tab.sort('pointing') - tab = tab[tab['filter'] == band] - - in_all = get_mjd_info(start, end) - in_rows = np.where(np.isin(tab['pointing'], in_all['pointing']))[0] - in_tab = tab[in_rows] - - out_all = get_mjd_info(start, end, return_inverse=True) - out_rows = np.where(np.isin(tab['pointing'], out_all['pointing']))[0] - out_tab = tab[out_rows] - - return in_tab, out_tab - - -def get_mjd_limits(obseq_path=None): - """Retrive the earliest and latest MJD in the OpenUniverse TDS simulations. - - Parameters - ---------- - obseq_path : Path, default None - Path to obseq file Roman_TDS_obseq_11_6_23.fits. - - Returns - ------- - start : float - Simulated start MJD of OpenUniverse TDS survey. - - end : float - Simulated end MJD OpenUniverse TDS survey. - - """ - - with fits.open(ou2024_obseq_path(obseq_path)) as obs: - obseq = Table(obs[1].data) - - start = min(obseq['date']) - end = max(obseq['date']) - - return start, end - - -def get_radec_limits(obseq_path=None): - """Retrieve the RA, dec limits of the boresight coordinates of the simulated TDS OpenUniverse survey in degrees. - - Parameters - ---------- - obseq_path : Path, default None - Path to obseq file Roman_TDS_obseq_11_6_23.fits. - - Returns - ------- - Dictionary with keys 'ra' and 'dec'. Each key identifies a list - with [minimum RA, maximum RA] and [minimum Dec, maximum Dec], - respectively. - - """ - with fits.open(ou2024_obseq_path(obseq_path)) as obs: - obseq = Table(obs[1].data) - - ra_min = min(obseq['ra']) - ra_max = max(obseq['ra']) - - dec_min = min(obseq['dec']) - dec_max = max(obseq['dec']) - - return {'ra': [ra_min, ra_max], 'dec': [dec_min, dec_max]} - - -def get_mjd(pointing, obseq_path=None): - """Retrieve the MJD of a given pointing in the OpenUniverse TDS simulation. - - Parameters - ---------- - pointing : int - Pointing number. - obseq_path: Path, default None - Path to obseq file Roman_TDS_obseq_11_6_23.fits. - - Returns - ------- - mjd : float - MJD of the specified input pointing. 
- - """ - - with fits.open(ou2024_obseq_path(obseq_path)) as obs: - obseq = Table(obs[1].data) - mjd = float(obseq['date'][int(pointing)]) - - return mjd - - -def pointings_near_mjd(mjd, window=3, obseq_path=None): - """Retrieve pointings near given MJD. - - Parameters - ---------- - mjd : float - Central MJD to search around. - window : float, default 3 - Number of days around central MJD to include in search. - obseq_path : Path, default None - Path to obseq file Roman_TDS_obseq_11_6_23.fits. - - Returns - ------- - pointings : list - List of pointings within specified MJD range. - - """ - - with fits.open(ou2024_obseq_path(obseq_path)) as obs: - obseq = Table(obs[1].data) - - pointings = np.where(np.logical_and(obseq['date'] < mjd + window, obseq['date'] > mjd - window))[0] - return pointings - - -def get_mjd_info(mjd_start=0, mjd_end=np.inf, return_inverse=False, obseq_path=None): - """Get all pointings and corresponding filters between two MJDs. - - Returns an astropy table with columns 'filter' and 'pointing'. - Does not return an 'sca' column because every sca belonging to a - given pointing satisfies an MJD requirement. - - Parameters - ---------- - mjd_start : float, default 0 - Start MJD - mjd_end : float, default np.inf - End MJD - return_inverse: boolean, default False - If true, returns all pointings outside the MJD range specified instead of inside. - obseq_path: Path, default None - Path to obseq file Roman_TDS_obseq_11_6_23.fits. - - Returns - ------- - mjd_tab : astropy.table.Table - Astropy table with pointing numbers and corresponding filters that satisfy the - MJD requirements. - - """ - with fits.open(ou2024_obseq_path(obseq_path)) as obs: - obseq = Table(obs[1].data) - - if not return_inverse: - mjd_idx = np.where((obseq['date'] > float(mjd_start)) & (obseq['date'] < float(mjd_end)))[0] - elif return_inverse: - mjd_idx = np.where(~((obseq['date'] > float(mjd_start)) & (obseq['date'] < float(mjd_end))))[0] - - mjd_tab = Table([obseq['filter'][mjd_idx], mjd_idx], names=('filter', 'pointing')) - - return mjd_tab - - -def get_exptime(band=None): - - """Retrieves exposure times from the TDS OpenUniverse sims. - - https://arxiv.org/abs/2501.05632 - - Parameters - ---------- - band : str, default None - Band for which to retrieve exposure time. - - Returns - ------- - exptime : dict or float - If a band is specified, a float with the exposure time in seconds is - returned. If no band is specified, a dictionary with the exposure time - for all bands is returned. - """ - - exptime = {'F184': 901.175, - 'J129': 302.275, - 'H158': 302.275, - 'K213': 901.175, - 'R062': 161.025, - 'Y106': 302.275, - 'Z087': 101.7} - - if band in exptime.keys(): - return exptime[band] - else: - return exptime - - -def make_object_table(oid): - """Retrieves a table with all images that contain the RA, Dec coordinates of a specified transient. - - Parameters - ---------- - oid : int - Object ID of target transient. - - Returns - ------- - objs : pd.DataFrame - Table with columns ['filter', 'pointing', 'sca']. - - Raises - ------ - RuntimeError - Error is raised if indexing fails. It will probably work if you - run it again. 
- - """ - ra, dec = get_transient_radec(oid) - - server_url = 'https://roman-desc-simdex.lbl.gov' - req = requests.Session() - result = req.post(f'{server_url}/findromanimages/containing=({ra}, {dec})') - if result.status_code != 200: - raise RuntimeError(f"Got status code {result.status_code}\n{result.text}") - - objs = pd.DataFrame(result.json())[['filter', 'pointing', 'sca']] - return objs