Luca Caucci
 Assistant Professor, Research Scholar Track
 Assistant Research Professor, Optical Sciences
Contact
 (520) 626-4162
 AHSC, Rm. 1343
 Tucson, AZ 85724-5067
 caucci@arizona.edu
Degrees
 Ph.D. Optical Sciences
 University of Arizona, Tucson, Arizona, United States
 Task Performance with List-Mode Data
 M.S. Optical Sciences
 University of Arizona, Tucson, Arizona, United States
 M.S. Electrical and Computer Engineering
 University of Arizona, Tucson, Arizona, United States
 Point Detection and Hotelling Discriminant: An Application in Adaptive Optics
Work Experience
 University of Arizona, Tucson, Arizona (2015 - Ongoing)
Awards
 Senior Member
 National Academy of Inventors, Summer 2020
 Outstanding Graduate Student Award
 College of Optical Sciences, University of Arizona, Spring 2011
Interests
Research
Detection, estimation, list-mode data, emission tomography, parallel computing
Courses
2021-22 Courses

Intro to Image Science
OPTI 536 (Spring 2022)
2020-21 Courses

Advanced Medical Imaging
OPTI 638 (Spring 2021)
2019-20 Courses

Intro to Image Science
OPTI 536 (Spring 2020)
2018-19 Courses

Advanced Medical Imaging
OPTI 638 (Spring 2019)
2017-18 Courses

Advanced Medical Imaging
BME 638 (Spring 2018) 
Advanced Medical Imaging
OPTI 638 (Spring 2018)
2015-16 Courses

Advanced Medical Imaging
BME 638 (Spring 2016) 
Advanced Medical Imaging
OPTI 638 (Spring 2016)
Scholarly Contributions
Books
 Caucci, L. (2012). Task Performance with List-Mode Data. Ph.D. dissertation, The University of Arizona.
Chapters
 Caucci, L., Ding, Y., & Barrett, H. H. (2018). Computational Methods for Photon-Counting and Photon-Processing Detectors. In Photon Counting: Fundamentals and Applications. IntechOpen. doi:10.5772/INTECHOPEN.72151
Journals/Publications
 Barrett, H. H., & Caucci, L. (2021). Erratum: Publisher's Note: Stochastic models for objects and images in oncology and virology: application to PI3K-Akt-mTOR signaling and COVID-19 disease. Journal of Medical Imaging (Bellingham, Wash.), 8(Suppl 1), 019801. doi:10.1117/1.jmi.8.s1.019801. [This corrects the article DOI: 10.1117/1.JMI.8.S1.S16001.]
 Barrett, H. H., & Caucci, L. (2021). Stochastic models for objects and images in oncology and virology: application to PI3K-Akt-mTOR signaling and COVID-19 disease. Journal of Medical Imaging (Bellingham, Wash.), 8(Suppl 1), S16001. doi:10.1117/1.jmi.8.s1.s16001. Purpose: The goal of this research is to develop innovative methods of acquiring simultaneous multidimensional molecular images of several different physiological random processes (PRPs) that might all be active in a particular disease such as COVID-19. Approach: Our study is part of an ongoing effort at the University of Arizona to derive biologically accurate yet mathematically tractable models of the objects of interest in molecular imaging and of the images they produce. In both cases, the models are fully stochastic, in the sense that they provide ways to estimate any estimable property of the object or image. The mathematical tool we use for images is the characteristic function, which can be calculated if the multivariate probability density function for the image data is known. For objects, which are functions of continuous variables rather than discrete pixels or voxels, the characteristic function becomes infinite-dimensional, and we refer to it as the characteristic functional. Results: Several innovative mathematical results are derived, in particular for simultaneous imaging of multiple PRPs. The application of these methods to cancers that disrupt the mammalian target of rapamycin signaling pathway and to COVID-19 is then discussed qualitatively. One reason for choosing these two problems is that they both involve lipid rafts. Conclusions: We found that it was necessary to employ a new algorithm for energy estimation to do simultaneous single-photon emission computed tomography imaging of a large number of different tracers. With this caveat, however, we expect to be able to acquire and analyze an unprecedented amount of molecular imaging data for an individual COVID-19 patient.
 Barrett, H. H., Henscheid, N., & Caucci, L. (2020). Abstract 1656: Quantifying task performance with photon-processing detectors. Cancer Research, 80(16 Suppl), 1656. doi:10.1158/1538-7445.AM2020-1656. Purpose: To estimate task performance (e.g., the ability to correctly detect a small lesion in normal tissue) when the imaging system uses a new class of detectors, called photon-processing detectors. Experimental Procedure: We simulated a single-photon emission computed tomography (SPECT) system in which multiple cameras image a 3D object from different angles. Each camera uses multiple measurements (e.g., photomultiplier tube outputs) to estimate event parameters (or "attributes") of each detected gamma-ray photon. Photon attributes, such as position, energy and direction of propagation, are stored at full precision. The likelihood ratio applied to photon-processing data is used to assess task performance for the detection of a signal buried in a random background. Statistical methods (including Markov chain Monte Carlo estimation) are used to calculate the performance of the likelihood ratio on this task. Results: Our results show an improvement with respect to conventional (i.e., pixelated) detectors when photon-processing detectors are used. The area under the receiver operating characteristic (ROC) curve is used to assess task performance, and we use this figure of merit to compare photon-processing detectors with conventional detectors. Conclusions: Increased performance is observed in the detection of a small lesion when photon-processing detectors are used instead of pixelated detectors. Applications that will benefit from photon-processing detectors include tumor diagnosis, drug development, therapy assessment and the study of tumor metabolism. Citation Format: Luca Caucci, Nicholas P. Henscheid, Harrison H. Barrett. Quantifying task performance with photon-processing detectors [abstract]. In: Proceedings of the Annual Meeting of the American Association for Cancer Research 2020; 2020 Apr 27-28 and Jun 22-24. Philadelphia (PA): AACR; Cancer Res 2020;80(16 Suppl):Abstract nr 1656.
 Kupinski, M., Omer, K., & Caucci, L. (2020). CNN performance dependence on linear image processing. Electronic Imaging, 2020(10), 310-1–310-7. doi:10.2352/issn.2470-1173.2020.10.ipas-310. Fast-track article for the IS&T International Symposium on Electronic Imaging 2020: Image Processing: Algorithms and Systems proceedings.
 Omer, K., Kupinski, M., & Caucci, L. (2020). Limitations of CNNs for Approximating the Ideal Observer Despite Quantity of Training Data or Depth of Network. Journal of Imaging Science and Technology, 64(6), 060408-1–060408-11. doi:10.2352/j.imagingsci.technol.2020.64.6.060408. The performance of a convolutional neural network (CNN) on an image-texture detection task is investigated as a function of linear image processing and the number of training images. Performance is quantified by the area under the receiver operating characteristic (ROC) curve (AUC). The Ideal Observer (IO) maximizes AUC but depends on high-dimensional image likelihoods. In many cases, the CNN performance can approximate the IO performance. This work demonstrates counterexamples where a full-rank linear transform degrades the CNN performance below the IO in the limit of large quantities of training data and network layers. A subsequent linear transform changes the images' correlation structure, improves the AUC, and again demonstrates the CNN's dependence on linear processing. Compression strictly decreases or maintains the IO detection performance, while compression can increase the CNN performance, especially for small quantities of training data. Results indicate an optimal compression ratio for the CNN based on task difficulty, compression method, and number of training images.
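The AUC figure of merit used in this line of work can be estimated nonparametrically from two sets of observer scores via the normalized Mann-Whitney U statistic. A minimal sketch with synthetic Gaussian scores (the score distributions and sample sizes here are illustrative assumptions, not data from the paper):

```python
import numpy as np

def empirical_auc(t_signal, t_noise):
    # Fraction of (signal-present, signal-absent) score pairs ranked
    # correctly, counting ties as 1/2: the normalized Mann-Whitney U.
    t_signal = np.asarray(t_signal, dtype=float)
    t_noise = np.asarray(t_noise, dtype=float)
    greater = np.sum(t_signal[:, None] > t_noise[None, :])
    ties = np.sum(t_signal[:, None] == t_noise[None, :])
    return (greater + 0.5 * ties) / (t_signal.size * t_noise.size)

rng = np.random.default_rng(0)
t_noise = rng.normal(0.0, 1.0, 5000)   # scores, signal-absent class
t_signal = rng.normal(1.0, 1.0, 5000)  # scores, signal-present class
auc = empirical_auc(t_signal, t_noise)
# For equal-variance normal scores with separation d', AUC = Phi(d'/sqrt(2)),
# so the empirical value here should be close to 0.76.
```

An AUC of 0.5 corresponds to guessing and 1.0 to perfect discrimination, which is why both the IO and CNN observers above are compared on this scale.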
 Caucci, L., Liu, Z., Jha, A. K., Han, H., Furenlid, L. R., & Barrett, H. H. (2019). Towards continuous-to-continuous 3D imaging in the real world. Physics in Medicine and Biology, 64(18), 185007. Imaging systems are often modeled as continuous-to-discrete mappings that map the object (i.e., a function of continuous variables such as space, time, energy, wavelength, etc.) to a finite set of measurements. When it comes to reconstruction, some discretized version of the object is almost always assumed, leading to a discrete-to-discrete representation of the imaging system. In this paper, we discuss a method for single-photon emission computed tomography (SPECT) imaging that avoids discrete representations of the object or the imaging system, thus allowing reconstruction on an arbitrarily fine set of points.
 Ding, Y., Caucci, L., & Barrett, H. H. (2017). Charged-particle emission tomography. Medical Physics, 44(6), 2478-2489. Conventional charged-particle imaging techniques, such as autoradiography, provide only two-dimensional (2D) ex vivo images of thin tissue slices. In order to get volumetric information, images of multiple thin slices are stacked. This process is time-consuming and prone to distortions, as registration of the 2D images is required. We propose a direct three-dimensional (3D) autoradiography technique, which we call charged-particle emission tomography (CPET). This 3D imaging technique enables imaging of thick tissue sections, thus increasing laboratory throughput and eliminating distortions due to registration. CPET also has the potential to enable in vivo charged-particle imaging with a window chamber or an endoscope.
 Ding, Y., Caucci, L., & Barrett, H. H. (2017). Null functions in three-dimensional imaging of alpha and beta particles. Scientific Reports, 7(1), 15807. Null functions of an imaging system are functions in the object space that give exactly zero data. Hence, they represent the intrinsic limitations of the imaging system. Null functions exist in all digital imaging systems, because these systems map continuous objects to discrete data. However, the emergence of detectors that measure continuous data, e.g., particle-processing (PP) detectors, has the potential to eliminate null functions. PP detectors process signals produced by each particle and estimate particle attributes, which include two position coordinates and three components of momentum, as continuous variables. We consider charged-particle emission tomography (CPET), which relies on data collected by a PP detector to reconstruct the 3D distribution of a radioisotope that emits alpha or beta particles, and show empirically that the null functions are significantly reduced for alpha particles if three or more attributes are measured, and for beta particles when five attributes are measured.
 Barrett, H. H., Woolfenden, J. M., Hoppin, J. W., Caucci, L., & Alberts, D. S. (2016). Therapy operating characteristic curves: tools for precision chemotherapy. Journal of Medical Imaging (Bellingham, Wash.), 3(2), 023502. doi:10.1117/1.jmi.3.2.023502. The therapy operating characteristic (TOC) curve, developed in the context of radiation therapy, is a plot of the probability of tumor control versus the probability of normal-tissue complications as the overall radiation dose level is varied, e.g., by varying the beam current in external-beam radiotherapy or the total injected activity in radionuclide therapy. This paper shows how the TOC can be applied to chemotherapy with the administered drug dosage as the variable. The area under a TOC curve (AUTOC) can be used as a figure of merit for therapeutic efficacy, analogous to the area under an ROC curve (AUROC), which is a figure of merit for diagnostic efficacy. In radiation therapy, AUTOC can be computed for a single patient by using image data along with radiobiological models for tumor response and adverse side effects. The mathematical analogy between the response of observers to images and the response of tumors to distributions of a chemotherapy drug is exploited to obtain linear discriminant functions from which AUTOC can be calculated. Methods for using mathematical models of drug delivery and tumor response with imaging data to estimate patient-specific parameters needed for the calculation of AUTOC are outlined. The implications of this viewpoint for clinical trials are discussed.
 Caucci, L., Myers, K. J., & Barrett, H. H. (2016). Radiance and photon noise: imaging in geometrical optics, physical optics, quantum optics and radiology. Optical Engineering, 55(1), 013102. doi:10.1117/1.OE.55.1.013102.
 Barrett, H. H., Alberts, D. S., Woolfenden, J. M., Liu, Z., Caucci, L., & Hoppin, J. W. (2015). Quantifying and Reducing Uncertainties in Cancer Therapy. Proceedings of SPIE, the International Society for Optical Engineering, 9412. There are two basic sources of uncertainty in cancer chemotherapy: how much of the therapeutic agent reaches the cancer cells, and how effective it is in reducing or controlling the tumor when it gets there. There is also a concern about adverse effects of the therapy drug. Similarly, in external-beam radiation therapy or radionuclide therapy, there are two sources of uncertainty: delivery and efficacy of the radiation absorbed dose, and again there is a concern about radiation damage to normal tissues. The therapy operating characteristic (TOC) curve, developed in the context of radiation therapy, is a plot of the probability of tumor control vs. the probability of normal-tissue complications as the overall radiation dose level is varied, e.g., by varying the beam current in external-beam radiotherapy or the total injected activity in radionuclide therapy. The TOC can be applied to chemotherapy with the administered drug dosage as the variable. The area under a TOC curve (AUTOC) can be used as a figure of merit for therapeutic efficacy, analogous to the area under an ROC curve (AUROC), which is a figure of merit for diagnostic efficacy. In radiation therapy, AUTOC can be computed for a single patient by using image data along with radiobiological models for tumor response and adverse side effects. In this paper we discuss the potential of using mathematical models of drug delivery and tumor response with imaging data to estimate AUTOC for chemotherapy, again for a single patient. This approach provides a basis for truly personalized therapy and for rigorously assessing and optimizing the therapy regimen for the particular patient. A key role is played by emission computed tomography (PET or SPECT) of radiolabeled chemotherapy drugs.
 Barrett, H. H., Myers, K. J., & Caucci, L. (2015). Phase-space optics, one photon at a time. Frontiers in Optics. doi:10.1364/fio.2015.fw5g.5. The relationship between the Wigner distribution function and the spectral photon radiance is discussed for both physical optics and quantum optics. Computational methods and applications are surveyed, and photon-by-photon phase-space imagers are introduced.
 Furenlid, L. R., & Caucci, L. (2015). GPU programming for biomedical imaging. Proceedings of SPIE, 9594. doi:10.1117/12.2195217. Scientific computing is rapidly advancing due to the introduction of powerful new computing hardware, such as graphics processing units (GPUs). Affordable thanks to mass production, GPU processors enable the transition to efficient parallel computing by bringing the performance of a supercomputer to a workstation. We elaborate on some of the capabilities and benefits that GPU technology offers to the field of biomedical imaging. As practical examples, we consider a GPU algorithm for the estimation of the position of interaction from photomultiplier tube (PMT) data, as well as a GPU implementation of the MLEM algorithm for iterative image reconstruction.
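The MLEM update mentioned in this entry is compact enough to sketch on the CPU. This is a toy illustration with a random system matrix and arbitrary count levels, not the GPU implementation from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
H = rng.random((64, 32))              # toy system matrix: 32 voxels -> 64 bins
f_true = rng.random(32)               # ground-truth activity (arbitrary units)
g = rng.poisson(50.0 * (H @ f_true))  # Poisson-noisy projection data

s = H.sum(axis=0)                     # sensitivity of each voxel
f = np.ones(32)                       # uniform, strictly positive start
for _ in range(200):
    ratio = g / np.maximum(H @ f, 1e-12)  # measured / predicted counts
    f *= (H.T @ ratio) / s                # multiplicative MLEM update
# The update preserves non-negativity and, after the first iteration,
# matches the total predicted counts to the total measured counts.
```

The multiplicative form is what makes MLEM attractive on GPUs: each iteration is two matrix-vector products plus elementwise operations, all of which parallelize naturally.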
 Jha, A. K., Barrett, H. H., Frey, E. C., Clarkson, E., Caucci, L., & Kupinski, M. A. (2015). Singular value decomposition for photon-processing nuclear imaging systems and applications for reconstruction and computing null functions. Physics in Medicine and Biology, 60(18), 7359-7385. Recent advances in technology are enabling a new class of nuclear imaging systems consisting of detectors that use real-time maximum-likelihood (ML) methods to estimate the interaction position, deposited energy, and other attributes of each photon-interaction event and store these attributes in a list format. This class of systems, which we refer to as photon-processing (PP) nuclear imaging systems, can be described by a fundamentally different mathematical imaging operator that allows processing of the continuous-valued photon attributes on a per-photon basis. Unlike conventional photon-counting (PC) systems that bin the data into images, PP systems do not have any binning-related information loss. Mathematically, while PC systems have an infinite-dimensional null space due to dimensionality considerations, PP systems do not necessarily suffer from this issue. Therefore, PP systems have the potential to provide improved performance in comparison to PC systems. To study these advantages, we propose a framework to perform the singular-value decomposition (SVD) of the PP imaging operator. We use this framework to perform the SVD of operators that describe a general two-dimensional (2D) planar linear shift-invariant (LSIV) PP system and a hypothetical continuously rotating 2D single-photon emission computed tomography (SPECT) PP system. We then discuss two applications of the SVD framework. The first application is to decompose the object being imaged by the PP imaging system into measurement and null components. We compare these components to the measurement and null components obtained with PC systems. In the process, we also present a procedure to compute the null functions for a PC system. The second application is designing analytical reconstruction algorithms for PP systems. The proposed analytical approach exploits the fact that PP systems acquire data in a continuous domain to estimate a continuous object function. The approach is parallelizable and implemented for graphics processing units (GPUs). Further, this approach leverages another important advantage of PP systems, namely the possibility to perform photon-by-photon real-time reconstruction. We demonstrate the application of the approach to perform reconstruction in a simulated 2D SPECT system. The results help to validate and demonstrate the utility of the proposed method and show that PP systems can help overcome the aliasing artifacts that are otherwise intrinsically present in PC systems.
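For a discrete (photon-counting-style) system matrix, the measurement/null decomposition discussed in this entry can be illustrated with an ordinary matrix SVD. The random toy system below is an assumption for illustration only; the paper's PP operators act on continuous attribute spaces and require the framework developed there:

```python
import numpy as np

rng = np.random.default_rng(3)
H = rng.normal(size=(12, 40))   # toy discrete system: 40 voxels -> 12 bins
f = rng.normal(size=40)         # a discretized object vector

U, sing, Vt = np.linalg.svd(H, full_matrices=True)
r = int(np.sum(sing > 1e-10))   # numerical rank of the system
V_meas = Vt[:r].T               # right singular vectors spanning measurement space

f_meas = V_meas @ (V_meas.T @ f)  # component the system can "see"
f_null = f - f_meas               # null component: produces (almost) zero data
# H @ f_null vanishes up to round-off, so f and f_meas yield identical data.
```

Because the toy system has far fewer measurements than voxels, most of the object lies in the null space, which is exactly the information loss that PP systems aim to reduce.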
 Barrett, H. H., Myers, K. J., & Caucci, L. (2014). Radiance and photon noise: imaging in geometrical optics, physical optics, quantum optics and radiology. Proceedings of SPIE, the International Society for Optical Engineering, 9193. doi:10.1117/12.2066715. A fundamental way of describing a photon-limited imaging system is in terms of a Poisson random process in spatial, angular and wavelength variables. The mean of this random process is the spectral radiance. The principle of conservation of radiance then allows a full characterization of the noise in the image (conditional on viewing a specified object). To elucidate these connections, we first review the definitions and basic properties of radiance as defined in terms of geometrical optics, radiology, physical optics and quantum optics. The propagation and conservation laws for radiance in each of these domains are reviewed. Then we distinguish four categories of imaging detectors that all respond in some way to the incident radiance, including the new category of photon-processing detectors. The relation between the radiance and the statistical properties of the detector output is discussed and related to task-based measures of image quality and the information content of a single detected photon.
 Barrett, H. H., & Caucci, L. (2013). Information content of a photon and how to extract it. SPIE Newsroom. doi:10.1117/2.1201305.004887
 Caucci, L., & Barrett, H. H. (2012). Objective assessment of image quality. V. Photon-counting detectors and list-mode data. Journal of the Optical Society of America A: Optics, Image Science, and Vision, 29(6), 1003-1016. A theoretical framework for detection or discrimination tasks with list-mode data is developed. The object and imaging system are rigorously modeled via three random mechanisms: randomness of the object being imaged, randomness in the attribute vectors, and, finally, randomness in the attribute-vector estimates due to noise in the detector outputs. By considering the list-mode data themselves, the theory developed in this paper yields a manageable expression for the likelihood of the list-mode data given the object being imaged. This, in turn, leads to an expression for the optimal Bayesian discriminant. Figures of merit for detection tasks via the ideal and optimal linear observers are derived. A concrete example discusses the detection performance of the optimal linear observer for the case of a known signal buried in a random lumpy background.
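The optimal linear observer named in this entry applies the template K^{-1}s, a prewhitening matched filter, to the data. A minimal binned-data sketch follows; the covariance model, signal shape, and sample sizes are illustrative assumptions (the paper itself treats list-mode rather than binned data):

```python
import numpy as np

rng = np.random.default_rng(2)
n_pix = 16
s = np.zeros(n_pix)
s[6:10] = 1.0                      # known signal profile (assumed shape)

A = 0.3 * rng.normal(size=(n_pix, n_pix))
def backgrounds(n):
    # Correlated Gaussian background plus white measurement noise.
    return rng.normal(size=(n, n_pix)) @ A.T + rng.normal(scale=0.5, size=(n, n_pix))

K = np.cov(backgrounds(20000), rowvar=False)  # signal-absent data covariance
w = np.linalg.solve(K, s)                     # Hotelling template K^{-1} s

t0 = backgrounds(1000) @ w                    # test statistic, signal absent
t1 = (backgrounds(1000) + s) @ w              # test statistic, signal present
snr2 = float(s @ w)                           # Hotelling SNR^2 = s^T K^{-1} s
# The class means of t1 and t0 differ by SNR^2, and each class has
# variance close to SNR^2, so detectability scales as sqrt(snr2).
```

Estimating K from samples, as done here, is exactly the step that becomes the dominant practical difficulty in high dimensions, which motivates the channelized and list-mode variants appearing elsewhere in this list.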
 Maire, J., Barrett, H. H., Savransky, D., Pueyo, L., Poyneer, L., Perrin, M. D., Pearson, I., Mugnier, L. M., Mouillet, D., Mawet, D., Marois, C., Lawson, P. R., Krist, J., Guyon, O., Gladysz, S., Furenlid, L. R., Frazin, R. A., Devaney, N., Caucci, L., et al. (2012). On Advanced Estimation Techniques for Exoplanet Detection and Characterization Using Ground-based Coronagraphs. Proceedings of SPIE, the International Society for Optical Engineering, 8447. doi:10.1117/12.925099. The direct imaging of planets around nearby stars is exceedingly difficult. Only about 14 exoplanets have been imaged to date that have masses less than 13 times that of Jupiter. The next generation of planet-finding coronagraphs, including VLT-SPHERE, the Gemini Planet Imager, Palomar P1640, and Subaru HiCIAO, have predicted contrast performance roughly a thousand times less than what would be needed to detect Earth-like planets. In this paper we review the state of the art in exoplanet imaging, most notably the method of Locally Optimized Combination of Images (LOCI), and we investigate the potential of improving the detectability of faint exoplanets through the use of advanced statistical methods based on the concepts of the ideal observer and the Hotelling observer. We propose a formal comparison of techniques using a blind data challenge, with an evaluation of performance using the receiver operating characteristic (ROC) and localization ROC (LROC) curves. We place particular emphasis on the understanding and modeling of realistic sources of measurement noise in ground-based, AO-corrected coronagraphs. The work reported in this paper is the result of interactions between the co-authors during a week-long workshop on exoplanet imaging that was held in Squaw Valley, California, in March of 2012.
 Spaletta, G., & Caucci, L. (2012). Constrained iterations for blind deconvolution and convexity issues. Journal of Computational and Applied Mathematics, 197(1), 29-43.
 Hesterman, J. Y., Caucci, L., Kupinski, M. A., Barrett, H. H., & Furenlid, L. R. (2010). Maximum-Likelihood Estimation With a Contracting-Grid Search Algorithm. IEEE Transactions on Nuclear Science, 57(3), 1077-1084. A fast search algorithm capable of operating in multidimensional spaces is introduced. As a sample application, we demonstrate its utility in the 2D and 3D maximum-likelihood position-estimation problem that arises in the processing of PMT signals to derive interaction locations in compact gamma cameras. We demonstrate that the algorithm can be parallelized in pipelines, and thereby efficiently implemented in specialized hardware, such as field-programmable gate arrays (FPGAs). A 2D implementation of the algorithm is achieved in Cell/BE processors, resulting in processing speeds above one million events per second, a 20× increase over a conventional desktop machine. Graphics processing units (GPUs) are used for a 3D application of the algorithm, resulting in processing speeds of nearly 250,000 events per second, a 250× increase over a conventional desktop machine. These implementations indicate the viability of the algorithm for use in real-time imaging applications.
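The contracting-grid search in this entry evaluates the objective on a coarse grid, re-centers on the best grid point, shrinks the grid, and repeats. A toy 2-D sketch; the smooth surrogate "likelihood", grid size, and contraction factor below are illustrative assumptions, not the Cell/GPU code from the paper:

```python
import numpy as np

def contracting_grid_max(f, center, span, n=5, iters=8, shrink=0.5):
    # Evaluate f on an n x n grid, move the center to the best grid
    # point, contract the grid span, and repeat.
    cx, cy = center
    for _ in range(iters):
        xs = cx + np.linspace(-span / 2, span / 2, n)
        ys = cy + np.linspace(-span / 2, span / 2, n)
        X, Y = np.meshgrid(xs, ys, indexing="ij")
        vals = f(X, Y)                     # one vectorized evaluation per pass
        i, j = np.unravel_index(np.argmax(vals), vals.shape)
        cx, cy = xs[i], ys[j]
        span *= shrink
    return cx, cy

# Surrogate log-likelihood: a smooth peak at the true event position.
true_pos = (0.3, -0.2)
log_like = lambda x, y: -((x - true_pos[0]) ** 2 + (y - true_pos[1]) ** 2)
x_hat, y_hat = contracting_grid_max(log_like, center=(0.0, 0.0), span=2.0)
# The estimate lands within about one final-grid spacing of (0.3, -0.2).
```

Each pass needs only n² likelihood evaluations with no data dependence between them, which is why the method maps so well onto FPGA pipelines and GPU threads.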
 Caucci, L., Barrett, H. H., & Rodriguez, J. J. (2009). Spatiotemporal Hotelling observer for signal detection from image sequences. Optics Express, 17(13), 10946-10958. Detection of signals in noisy images is necessary in many applications, including astronomy and medical imaging. The optimal linear observer for performing a detection task, called the Hotelling observer in the medical literature, can be regarded as a generalization of the familiar prewhitening matched filter. Performance on the detection task is limited by randomness in the image data, which stems from randomness in the object, randomness in the imaging system, and randomness in the detector outputs due to photon and readout noise; the Hotelling observer accounts for all of these effects in an optimal way. If multiple temporal frames of images are acquired, the resulting data set is a spatiotemporal random process, and the Hotelling observer becomes a spatiotemporal linear operator. This paper discusses the theory of the spatiotemporal Hotelling observer and the estimation of the required spatiotemporal covariance matrices. It also presents a parallel implementation of the observer on a cluster of Sony PLAYSTATION 3 gaming consoles. As an example, we consider the use of the spatiotemporal Hotelling observer for exoplanet detection.
 Caucci, L., Furenlid, L. R., & Barrett, H. H. (2009). Maximum-Likelihood Event Estimation and List-Mode Image Reconstruction on GPU Hardware. IEEE Nuclear Science Symposium Conference Record, 2009, 4072. The scintillation detectors commonly used in SPECT and PET imaging and in Compton cameras require estimation of the position and energy of each gamma-ray interaction. Ideally, this process would yield images with no spatial distortion and the best possible spatial resolution. In addition, especially for Compton cameras, the computation must yield the best possible estimate of the energy of each interacting gamma ray. These goals can be achieved by use of maximum-likelihood (ML) estimation of the event parameters, but in the past the search for an ML estimate has not been computationally feasible. Now, however, graphics processing units (GPUs) make it possible to produce optimal, real-time estimates of position and energy, even from scintillation cameras with a large number of photodetectors. In addition, the mathematical properties of ML estimates make them very attractive for use as list entries in list-mode ML image reconstruction. This two-step ML process (using ML estimation once to get the list data and again to reconstruct the object) allows accurate modeling of the detector blur and, potentially, considerable improvement in reconstructed spatial resolution.
 Caucci, L., Kupinski, M. A., Freed, M., Furenlid, L. R., Wilson, D. W., & Barrett, H. H. (2008). Adaptive SPECT for Tumor Necrosis Detection. IEEE Nuclear Science Symposium Conference Record, 2008, 5548-5551. More info: In this paper, we consider a prototype of an adaptive SPECT system, and we use simulation to objectively assess the system's performance with respect to a conventional, non-adaptive SPECT system. Objective performance assessment is investigated for a clinically relevant task: the detection of tumor necrosis at a known location and in a random lumpy background. The iterative maximum-likelihood expectation-maximization (MLEM) algorithm is used to perform image reconstruction. We carried out human observer studies on the reconstructed images and compared the probability of correct detection when the data are generated with the adaptive system as opposed to the non-adaptive system. Task performance is also assessed by using a channelized Hotelling observer, with the area under the receiver operating characteristic curve as the figure of merit for the detection task. Our results show a large performance improvement of adaptive systems over non-adaptive systems and motivate further research in adaptive medical imaging.
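The MLEM algorithm mentioned above has a compact multiplicative update, f ← f · Hᵀ(g / Hf) / Hᵀ1. A toy 1-D sketch follows; the system matrix, sizes, and counts are hypothetical stand-ins, not the paper's SPECT simulation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy system: H maps a 1-D "object" to detector bins; all sizes are hypothetical.
n_obj, n_det = 32, 48
H = rng.uniform(0.0, 1.0, size=(n_det, n_obj))
H /= H.sum(axis=0)                       # column-normalized sensitivities

f_true = np.zeros(n_obj); f_true[10:14] = 100.0
g = rng.poisson(H @ f_true)              # Poisson projection data

# Classic MLEM update: f <- f * H^T (g / Hf) / H^T 1
f = np.ones(n_obj)
sens = H.T @ np.ones(n_det)
for _ in range(200):
    ratio = g / np.maximum(H @ f, 1e-12)
    f *= (H.T @ ratio) / sens

# MLEM preserves total counts: sum(H f) matches sum(g) after the first update
print(abs((H @ f).sum() - g.sum()))
```

The update keeps the estimate nonnegative by construction and, with these normalized sensitivities, conserves total counts exactly, two properties that make MLEM well suited to emission data.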
 Whitaker, M. K., Gladysz, S., Devaney, N., Caucci, L., Burke, D., & Barrett, H. H. (2008). Optimal linear estimation of binary star parameters. Proceedings of SPIE, 7015. doi:10.1117/12.788973. More info: We propose a new post-processing technique for the detection of faint companions and the estimation of their parameters from adaptive optics (AO) observations. We apply the optimal linear detector, which is the Hotelling observer, to perform detection, astrometry and photometry on real and simulated data. The real data were obtained from the AO system on the 3 m Lick telescope. The Hotelling detector, which is a prewhitening matched filter, calculates the Hotelling test statistic, which is then compared to a threshold. If the test statistic is greater than the threshold, the algorithm decides that a companion is present. This decision is the main task performed by the Hotelling observer. After a detection is made, the location and intensity of the companion which maximise this test statistic are taken as the estimated values. We compare the Hotelling approach with current detection algorithms widely used in astronomy. We discuss the use of the estimation receiver operating characteristic (EROC) curve in quantifying the performance of the algorithm with no prior estimate of the companion's location or intensity. The robustness of this technique to errors in point spread function (PSF) estimation is also investigated.
 Caucci, L., Barrett, H. H., Devaney, N., & Rodríguez, J. J. (2007). Application of the Hotelling and ideal observers to detection and localization of exoplanets. Journal of the Optical Society of America A: Optics, Image Science, and Vision, 24(12), B13-24. More info: The ideal linear discriminant, or Hotelling observer, is widely used for detection tasks and image-quality assessment in medical imaging, but it has had little application in other imaging fields. We apply it to detection of planets outside of our solar system with long-exposure images obtained from ground-based or space-based telescopes. The statistical limitations in this problem include Poisson noise arising mainly from the host star, electronic noise in the image detector, randomness or uncertainty in the point-spread function (PSF) of the telescope, and possibly a random background. PSF randomness is reduced but not eliminated by the use of adaptive optics. We concentrate here on the effects of Poisson and electronic noise, but we also show how to extend the calculation to include a random PSF. For the case where the PSF is known exactly, we compare the Hotelling observer to other observers commonly used for planet detection; the comparison is based on receiver operating characteristic (ROC) and localization ROC (LROC) curves.
 Barrett, H. H., Myers, K. J., Devaney, N., Dainty, J. C., & Caucci, L. (2006). Task Performance in Astronomical Adaptive Optics. Proceedings of SPIE - The International Society for Optical Engineering, 6272, 62721W. More info: In objective or task-based assessment of image quality, figures of merit are defined by the performance of some specific observer on some task of scientific interest. This methodology is well established in medical imaging but is just beginning to be applied in astronomy. In this paper we survey the theory needed to understand the performance of ideal or ideal-linear (Hotelling) observers on detection tasks with adaptive-optical data. The theory is illustrated by discussing its application to detection of exoplanets from a sequence of short-exposure images.
 Spaletta, G., & Caucci, L. (2006). Constrained iterations for blind deconvolution and convexity issues. Journal of Computational and Applied Mathematics, 197(1), 29-43. doi:10.1016/j.cam.2005.10.020. More info: The need for image restoration arises in many applications of various scientific disciplines, such as medicine and astronomy and, in general, whenever an unknown image must be recovered from blurred and noisy data [M. Bertero, P. Boccacci, Introduction to Inverse Problems in Imaging, Institute of Physics Publishing, Philadelphia, PA, USA, 1998]. The algorithm studied in this work restores the image without knowledge of the blur, using a little a priori information and a blind inverse filter iteration. It represents a variation of the methods proposed in Kundur and Hatzinakos [A novel blind deconvolution scheme for image restoration using recursive filtering, IEEE Trans. Signal Process. 46(2) (1998) 375-390] and Ng et al. [Regularization of RIF blind image deconvolution, IEEE Trans. Image Process. 9(6) (2000) 1130-1134]. The problem of interest here is an inverse one that cannot be solved by simple filtering, since it is ill-posed. The imaging system is assumed to be linear and space-invariant: this allows a simplified relationship between unknown and observed images, described by a point spread function modeling the distortion. The blurring, though, makes the restoration ill-conditioned: regularization is therefore also needed, obtained by adding constraints to the formulation. The restoration is modeled as a constrained minimization: particular attention is given here to the analysis of the objective function and to establishing whether or not it is a convex function, whose minima can be located by classic optimization techniques and descent methods. Numerical examples are applied to simulated data and to real data derived from various applications. Comparison with the behavior of the methods of Kundur and Hatzinakos and of Ng et al. shows the effectiveness of our variant.
Proceedings Publications
 Caucci, L. (2020, Jan). Comparing training variability of CNN and optimal linear data reduction on image textures. In IS&T International Symposium on Electronic Imaging.
 Caucci, L. (2019, Oct). A Real-Time Adaptive Strategy for Gamma-Ray Imaging Systems. In IEEE NSS/MIC Conference.
 Barrett, H. H., Woolfenden, J. M., Miller, B. W., Han, L., Furenlid, L. R., & Caucci, L. (2017). System Calibration for FastSPECT III: An Ultra-High-Resolution CCD-Based Pinhole SPECT System. In 2017 IEEE Nuclear Science Symposium and Medical Imaging Conference (NSS/MIC). More info: FastSPECT III is a recently developed ultra-high-resolution small-animal SPECT imaging system. With 20 CCD-based intensified quantum-imaging (iQID) cameras and 250-micron-diameter platinum pinhole apertures, this stationary SPECT system offers ~350 microns isotropic linear resolution. This paper presents a novel system calibration method for FastSPECT III and other high-resolution stationary pinhole SPECT systems. The performance of the new system calibration method was evaluated using multi-bed-position MLEM reconstruction and helically scanned objects. Originally designed for high-resolution rodent brain imaging to study neurological pathologies, FastSPECT III now offers whole-body mouse imaging capabilities with ultra-high spatial resolution.
 Caucci, L., Barrett, H. H., Ding, J., & Henscheid, N. (2017, Oct). Particle-processing detectors for charged-particle emission tomography. In IEEE Nuclear Science Symposium and Medical Imaging Conference Record (NSS/MIC).
 Zubal, G. I., Mukherjee, J. M., Konik, A., King, M. A., Kalluri, K. S., Goding, J. C., Furenlid, L. R., Caucci, L., & Banerjee, S. (2017). Preliminary Investigation to Improve Point Spread Function Modeling for a Multi-Pinhole SPECT Camera. In 2017 IEEE Nuclear Science Symposium and Medical Imaging Conference (NSS/MIC). More info: Herein we report on the mathematical modeling of the simulated point spread functions (PSFs) of pinhole apertures for clinical I-123 DaTscan imaging on a dual-head SPECT system consisting of fan and multi-pinhole (MPH) collimators on separate heads. The PSFs can be measured sparsely by translating a point source within the volume of interest (VOI). These PSFs were generated using the GATE Monte Carlo simulation software and were then modeled using a standard 2D Gaussian with 6 parameters, as well as three other models using higher-order polynomial terms and cross terms in the exponential. The goal is to efficiently store the parameters of the modeled PSF, measured across the VOI, and then interpolate them on the fly during reconstruction. It has been shown that MPH reconstruction can be improved with accurate modeling of the PSF. For our application, however, it has been determined that improved accuracy in PSF modeling (reduced NRMSE) can be obtained by incorporating more polynomial terms in the exponential than employed by the standard 2D Gaussian, especially with increased pinhole angulations. In this paper we introduce higher-order polynomial terms (degree 3 and 4) as an extension to the Gaussian model and observe that these added terms can significantly reduce the NRMSE.
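To make the model class concrete, here is an illustrative 6-parameter 2D Gaussian PSF (amplitude, two centers, two widths, one cross term) together with the NRMSE figure used above; the parameter values, grid size, and function names are hypothetical, not from the study:

```python
import numpy as np

# Hypothetical 6-parameter 2D Gaussian PSF model:
# p = (amplitude A, centers x0, y0, widths sx, sy, cross term c)
def gauss2d(x, y, A, x0, y0, sx, sy, c):
    dx, dy = x - x0, y - y0
    return A * np.exp(-(dx**2 / (2 * sx**2) + c * dx * dy + dy**2 / (2 * sy**2)))

def nrmse(model, data):
    # Normalized root-mean-square error between model and data
    return np.sqrt(np.mean((model - data) ** 2)) / (data.max() - data.min())

y, x = np.mgrid[0:32, 0:32].astype(float)
psf = gauss2d(x, y, 1.0, 16.2, 15.7, 2.0, 3.0, 0.01)

# A slightly mis-centered model with no cross term leaves residual error
# that the NRMSE quantifies.
model = gauss2d(x, y, 1.0, 16.0, 16.0, 2.0, 3.0, 0.0)
print(round(nrmse(model, psf), 3))
```

The higher-order extensions described in the abstract would add degree-3 and degree-4 polynomial terms to the exponent of this same model.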
 Ruiz-Gonzalez, M., Furenlid, L. R., & Caucci, L. (2016). Joint amplitude and timing estimation for scintillation pulses on GPU. In 2016 IEEE Nuclear Science Symposium, Medical Imaging Conference and Room-Temperature Semiconductor Detector Workshop (NSS/MIC/RTSD). More info: We present an implementation on a graphics processing unit (GPU) of a maximum-likelihood (ML) contracting-grid algorithm that performs joint estimation of the amplitude and time of scintillation pulses. ML estimation consists of finding the parameters that maximize the likelihood. We propose a multivariate normal model for the likelihood, which implies that we need mean pulses and covariance matrices for every parameter variation. In previous work, we performed a Fisher information analysis to determine the minimum number of samples that contains the most timing information in a digital pulse for a given acquisition rate. Here, we use just enough samples to reduce the size of the mean pulses and covariance matrices. To reduce the number of precomputed mean pulses and covariance matrices, we take advantage of the linearity of scintillation pulses, which allows us to scale a normalized pulse to obtain mean pulses of any amplitude. We use this information to limit the amount of computation and shared memory used in our GPU code, while preserving full timing performance. We developed our code on an Nvidia Tesla C2075 GPU and were able to process 1.5 million events per second. The estimation algorithm and GPU implementation can be used in real time for systems that require high temporal resolution, such as time-of-flight positron emission tomography, or in processes that have a high gamma-ray count rate.
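The contracting-grid search named above evaluates the likelihood on a coarse grid, re-centers a shrunken grid on the best point, and repeats until the grid is fine enough. A 1-D illustrative sketch follows; the toy log-likelihood, grid sizes, and shrink factor are hypothetical, not the paper's GPU implementation:

```python
import numpy as np

def contracting_grid_max(f, lo, hi, n_points=8, n_stages=6, shrink=0.5):
    """Contracting-grid search: evaluate f on a grid spanning the current
    interval, re-center a shrunken grid on the best point, and repeat.
    Returns an approximate argmax of f on [lo, hi]."""
    center = 0.5 * (lo + hi)
    width = hi - lo
    for _ in range(n_stages):
        grid = center + width * np.linspace(-0.5, 0.5, n_points)
        center = grid[np.argmax(f(grid))]
        width *= shrink
    return center

# Toy unimodal log-likelihood with a peak at t0 = 3.7 (values are illustrative)
t0 = 3.7
loglike = lambda t: -(t - t0) ** 2

t_hat = contracting_grid_max(loglike, 0.0, 10.0)
print(round(t_hat, 2))
```

The appeal of this scheme on a GPU is that every grid point in a stage can be evaluated in parallel, and the number of likelihood evaluations is fixed (n_points × n_stages) regardless of the search range.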
 Caucci, L., & Furenlid, L. R. (2015, August). Graphics Processing Units for biomedical imaging. In Proceedings of SPIE, 9594, 95940G.
 Caucci, L., Barrett, H. H., Liu, Z., Han, H., & Furenlid, L. R. (2015, September). Towards Continuous-to-Continuous 3D Data Reconstruction. In 13th International Meeting on Fully Three-Dimensional Image Reconstruction in Radiology and Nuclear Medicine.
 Barrett, H. H., Ding, Y., & Caucci, L. (2014). Directional charged-particle detector with a two-layer ultra-thin phosphor foil. In 2014 IEEE Nuclear Science Symposium and Medical Imaging Conference (NSS/MIC), 1-4. More info: Current charged-particle detectors are able to estimate the position and energy of a particle, but not its direction. This study is aimed at developing a detector capable of measuring the direction of a charged particle as well as its position. The detector uses an image intensifier and a lens-coupled CMOS (complementary metal-oxide-semiconductor) camera to capture the scintillation light excited by a charged particle traversing a phosphor assembly. The phosphor assembly is made of two layers of ultra-thin phosphor foils separated by an air gap. The performance of the detector is illustrated by simulation, theory and experiment.
 Barrett, H. H., Ding, Y., & Caucci, L. (2014). αET: Alpha Emission Tomography. In 2014 IEEE Nuclear Science Symposium and Medical Imaging Conference (NSS/MIC). More info: Current techniques available for imaging of alpha particles, such as alpha autoradiography, are only able to produce planar (2D) images of 3D tissue samples containing radioactive substances. We introduce a new imaging technique, which we call Alpha Emission Tomography (αET), for the reconstruction of the volumetric (3D) distribution of radiotracers that emit alpha particles. Our imaging system is able to measure the position and energy of each detected alpha particle. The list of measured positions and energies is fed to an iterative algorithm for 3D reconstruction. We predict that αET will become a valuable imaging technique for molecular biology.
 Barrett, H. H., Kupinski, M. A., Jha, A. K., Furenlid, L. R., Clarkson, E. W., & Caucci, L. (2013). Image Science with Photon-Processing Detectors. In IEEE Nuclear Science Symposium Conference Record, 2013. More info: We introduce and discuss photon-processing detectors and compare them with photon-counting detectors. By estimating a relatively small number of attributes for each collected photon, photon-processing detectors may help understand and solve a fundamental theoretical problem of any imaging system based on photon-counting detectors, namely null functions. We argue that photon-processing detectors can improve task performance by estimating position, energy, and time of arrival for each collected photon. We consider a continuous-to-continuous linear operator to relate the object being imaged to the collected data, and discuss how this operator can be analyzed to derive properties of the imaging system. Finally, we derive an expression for the characteristic functional of an imaging system that produces list-mode data.
 Barrett, H. H., Myers, K. J., Caucci, L., Gregory, G., & Davis, A. (2014). Radiance and Photon Noise: Imaging in geometrical optics, physical optics, quantum optics and radiology. In Novel Optical Systems Design and Optimization XVII, 9193.
 Caucci, L., Hunter, W. C., Furenlid, L. R., & Barrett, H. H. (2010). List-mode MLEM Image Reconstruction from 3D ML Position Estimates. In 2010 IEEE Nuclear Science Symposium Conference Record (NSS/MIC), 2643-2647.
 Burke, D., Devaney, N., Gladysz, S., Barrett, H. H., Whitaker, M. K., Caucci, L., Hubin, N., Max, C., & Wizinowich, P. (2008). Optimal Linear Estimation of Binary Star Parameters. In Adaptive Optics Systems, Pts 1-3, 7015.
 Caucci, L., Furenlid, L. R., Barber, H., & Roehrig, H. (2015). GPU programming for biomedical imaging. In Medical Applications of Radiation Detectors V, 9594.
 Barrett, H. H., Rodriguez, J. J., Devaney, N., & Caucci, L. (2007). Statistical Decision Theory and Adaptive Optics: A Rigorous Approach to Exoplanet Detection. In Adaptive Optics: Analysis and Methods / Computational Optical Sensing and Imaging / Information Photonics / Signal Recovery and Synthesis Topical Meetings on CD-ROM. More info: Statistical decision theory is applied to the problem of exoplanet detection with AO. We derive optimal observers for the detection of exoplanets in AO images. Theoretical results are verified by running simulation tests.
 Caucci, L., Jha, A. K., Furenlid, L. R., Clarkson, E. W., Kupinski, M. A., & Barrett, H. H. (2013). Image Science with Photon-Processing Detectors. In 2013 IEEE Nuclear Science Symposium and Medical Imaging Conference (NSS/MIC).
 Lawson, P. R., Poyneer, L., Barrett, H., Frazin, R., Caucci, L., Devaney, N., Furenlid, L., Gladysz, S., Guyon, O., Krist, J., Maire, J., Marois, C., Mawet, D., Mouillet, D., Mugnier, L., Pearson, I., Perrin, M., Pueyo, L., Savransky, D., Ellerbroek, B., et al. (2012). On Advanced Estimation Techniques for Exoplanet Detection and Characterization Using Ground-based Coronagraphs. In Adaptive Optics Systems III, 8447.
Presentations
 Caucci, L., Henscheid, N., & Barrett, H. H. (2020, Apr). Quantifying task performance with photon-processing detectors. Annual Meeting of the American Association for Cancer Research. Virtual (due to COVID-19): American Association for Cancer Research.
 Caucci, L. (2019, Fall). Synthetic Imaging Systems. College of Medicine Data Blitz Seminar. Tucson, AZ: College of Medicine.
Other Teaching Materials
 Caucci, L. (2020). High-Performance Computing for Medical Imaging on Graphics Processing Units (GPU) with CUDA. SPIE. More info: Short course given at the SPIE Medical Imaging Conference, 15-20 February 2020, Houston, TX.
Others
 Caucci, L. (2020, Nov). Contracting-grid Search Algorithm on CPU, GPU and FPGA. github.com. https://github.com/caucci/get_data_proj/blob/master/summary.pdf