Charles M Higgins
- Associate Professor, Neuroscience
- Associate Professor, BIO5 Institute
- Associate Professor, Electrical and Computer Engineering
- Associate Professor, Applied Mathematics - GIDP
- Associate Professor, Entomology / Insect Science - GIDP
- Associate Professor, Neuroscience - GIDP
- Member of the Graduate Faculty
- (520) 621-6604
- Gould-Simpson, Rm. 430
- Tucson, AZ 85721
- higgins@neurobio.arizona.edu
Biography
Charles Higgins is an Associate Professor in the Department of Neuroscience at the University of Arizona, with a joint appointment in Electrical Engineering; he joined the University in 1999. Though he began his career as an electrical engineer (earning his PhD from Caltech in 1993), his fascination with the natural world has led him to study insect vision and visual processing and to meld the worlds of robotics and biology.
His research spans software simulations of brain circuits, human sleep, models of cognition, and interfacing live insect brains with robots, but his driving interest remains building truly intelligent machines.
Degrees
- Postdoctoral Fellow, Neuroscience
- California Institute of Technology, Pasadena, California, United States
- Ph.D. Electrical / Computer Engineering
- California Institute of Technology, Pasadena, California, United States
- Classification and Approximation with Rule-Based Networks
- M.S. Electrical / Computer Engineering
- Georgia Institute of Technology, Atlanta, Georgia, United States
- B.S. Electrical / Computer Engineering
- Louisiana State University, Baton Rouge, Louisiana, United States
Work Experience
- Arete Associates (2009 - 2011)
- University of Arizona, Tucson, Arizona (2005 - Ongoing)
- Physical Sciences, Inc (2002 - 2003)
- Computational Sensors Corporation (2000 - 2002)
- University of Arizona, Tucson, Arizona (1999 - 2005)
- MIT Lincoln Laboratory (1993 - 1996)
- B. V. Bhivani, Inc (1992 - 1993)
- IBM Cambridge Scientific Center (1987 - 1990)
Awards
- Innovation in Teaching Award
- University of Arizona College of Science, Fall 2014
Interests
Teaching
Electronics, VLSI design, computational modeling, visual neuroscience, neuroscience, electrophysiology, robotics, brain-machine interfacing
Research
Computational modeling, visual neuroscience, neuroscience, electrophysiology, robotics, brain-machine interfacing
Courses
2024-25 Courses
- Electrophysiology Laboratory, NROS 415 (Spring 2025)
- Scientific Programming Matlab, NROS 311 (Spring 2025)
- Electrophysiology Laboratory, NROS 415 (Fall 2024)
- Preceptorship, NROS 491 (Fall 2024)
- Scientific Programming Matlab, NROS 311 (Fall 2024)
2023-24 Courses
- Scientific Programming Matlab, NROS 311 (Summer I 2024)
- Electrophysiology Laboratory, NROS 415 (Spring 2024)
- Preceptorship, NROS 491 (Spring 2024)
- Scientific Programming Matlab, NROS 311 (Spring 2024)
- Special Topics in Science, HNRS 195I (Spring 2024)
- Electrophysiology Laboratory, NROS 415 (Fall 2023)
- Honors Independent Study, NROS 199H (Fall 2023)
- Preceptorship, NROS 491 (Fall 2023)
- Scientific Programming Matlab, NROS 311 (Fall 2023)
2022-23 Courses
- Scientific Programming Matlab, NROS 311 (Summer I 2023)
- Electrophysiology Laboratory, NROS 415 (Spring 2023)
- Independent Study, NROS 499 (Spring 2023)
- Preceptorship, NSCS 491 (Spring 2023)
- Scientific Programming Matlab, NROS 311 (Spring 2023)
- Special Topics in Science, HNRS 195I (Spring 2023)
- Directed Research, NROS 492 (Fall 2022)
- Preceptorship, NROS 491 (Fall 2022)
- Scientific Programming Matlab, NSCS 311 (Fall 2022)
2021-22 Courses
- Scientific Programming Matlab, NSCS 311 (Summer I 2022)
- Electrophysiology Laboratory, NROS 415 (Spring 2022)
- Preceptorship, NSCS 491 (Spring 2022)
- Scientific Programming Matlab, NSCS 311 (Spring 2022)
- Special Topics in Science, HNRS 195I (Spring 2022)
- Preceptorship, NSCS 491 (Fall 2021)
- Scientific Programming Matlab, NSCS 311 (Fall 2021)
2020-21 Courses
- Scientific Programming Matlab, NSCS 311 (Summer I 2021)
- Electrophysiology Laboratory, NROS 415 (Spring 2021)
- Preceptorship, NSCS 491 (Spring 2021)
- Scientific Programming Matlab, NSCS 311 (Spring 2021)
- Special Topics in Science, HNRS 195I (Spring 2021)
- Engaging Topics in NSCS, NSCS 195B (Fall 2020)
- Preceptorship, NSCS 491 (Fall 2020)
- Scientific Programming Matlab, NSCS 311 (Fall 2020)
2019-20 Courses
- Electrophysiology Laboratory, NROS 415 (Spring 2020)
- Special Topics in Science, HNRS 195I (Spring 2020)
- Directed Research, NSCS 492 (Fall 2019)
- Engaging Topics in NSCS, NSCS 195B (Fall 2019)
- Honors Preceptorship, NSCS 491H (Fall 2019)
- Preceptorship, NSCS 491 (Fall 2019)
- Scientific Programming Matlab, NSCS 311 (Fall 2019)
2018-19 Courses
- Electrophysiology Laboratory, NROS 415 (Spring 2019)
- Honors Thesis, NSCS 498H (Spring 2019)
- Special Topics in Science, HNRS 195I (Spring 2019)
- Engaging Topics in NSCS, NSCS 195B (Fall 2018)
- Honors Preceptorship, NSCS 491H (Fall 2018)
- Honors Thesis, NSCS 498H (Fall 2018)
- Independent Study, NSCS 399 (Fall 2018)
- Preceptorship, NSCS 491 (Fall 2018)
- Rsrch Meth Biomed Engr, BME 597G (Fall 2018)
- Scientific Programming Matlab, NROS 311 (Fall 2018)
2017-18 Courses
- Directed Research, NSCS 392 (Spring 2018)
- Electrophysiology Laboratory, NROS 415 (Spring 2018)
- Honors Independent Study, NSCS 499H (Spring 2018)
- Special Topics in Science, HNRS 195I (Spring 2018)
- Directed Research, NSCS 392 (Fall 2017)
- Preceptorship, NSCS 491 (Fall 2017)
- Scientific Programming Matlab, NROS 311 (Fall 2017)
2016-17 Courses
- Directed Research, PSIO 492 (Spring 2017)
- Electrophysiology Laboratory, NROS 415 (Spring 2017)
- Honors Independent Study, NSCS 399H (Spring 2017)
- Independent Study, NSCS 299 (Spring 2017)
- Independent Study, NSCS 499 (Spring 2017)
- Intro to Electrophysiology, NROS 215 (Spring 2017)
- Honors Independent Study, NSCS 399H (Fall 2016)
- Independent Study, NSCS 399 (Fall 2016)
- Independent Study, NSCS 499 (Fall 2016)
- Intriguing Topics in NSCS, NSCS 495 (Fall 2016)
- Methods in Neuroscience, NSCS 315B (Fall 2016)
- Preceptorship, NSCS 491 (Fall 2016)
- Rsrch Meth Biomed Engr, BME 597G (Fall 2016)
- Special Topics in Science, HNRS 195I (Fall 2016)
2015-16 Courses
- Dissertation, ECE 920 (Spring 2016)
- Electrophysiology Laboratory, NROS 415 (Spring 2016)
- Honors Independent Study, NSCS 299H (Spring 2016)
- Intro to Electrophysiology, NROS 215 (Spring 2016)
Scholarly Contributions
Journals/Publications
- Northcutt, B. D., & Higgins, C. M. (2017). An Insect-Inspired Model for Visual Binding II: Functional Analysis and Visual Attention. Biological Cybernetics, 111(2), 207-227.
- Northcutt, B. D., Dyhr, J. P., & Higgins, C. M. (2017). An Insect-Inspired Model for Visual Binding I: Learning Objects and Their Characteristics. Biological Cybernetics, 111(2), 185-206.
- Pant, V., & Higgins, C. M. (2012). Tracking improves performance of biological collision avoidance models. Biological Cybernetics, 106(4-5), 307-322. PMID: 22744199. Abstract: Collision avoidance models derived from the study of insect brains do not perform universally well in practical collision scenarios, although the insects themselves may perform well in similar situations. In this article, we present a detailed simulation analysis of two well-known collision avoidance models and illustrate their limitations. In doing so, we present a novel continuous-time implementation of a neuronally based collision avoidance model. We then show that visual tracking can improve performance of these models by allowing a relative computation of the distance between the obstacle and the observer. We compare the results of simulations of the two models with and without tracking to show how tracking improves the ability of the model to detect an imminent collision. We present an implementation of one of these models processing imagery from a camera to show how it performs in real-world scenarios. These results suggest that insects may track looming objects with their gaze.
- Rivera-Alvidrez, Z., Lin, I., & Higgins, C. M. (2011). A neuronally based model of contrast gain adaptation in fly motion vision. Visual Neuroscience, 28(5). Abstract: Motion-sensitive neurons in the visual systems of many species, including humans, exhibit a depression of motion responses immediately after being exposed to rapidly moving images. This motion adaptation has been extensively studied in flies, but a neuronal mechanism that explains the most prominent component of adaptation, which occurs regardless of the direction of motion of the visual stimulus, has yet to be proposed. We identify a neuronal mechanism, namely frequency-dependent synaptic depression, which explains a number of the features of adaptation in mammalian motion-sensitive neurons and use it to model fly motion adaptation. While synaptic depression has been studied mainly in spiking cells, we use the same principles to develop a simple model for depression in a graded synapse. By incorporating this synaptic model into a neuronally based model for elementary motion detection, along with the implementation of a center-surround spatial band-pass filtering stage that mimics the interactions among a subset of visual neurons, we show that we can predict with remarkable success most of the qualitative features of adaptation observed in electrophysiological experiments. Our results support the idea that diverse species share common computational principles for processing visual motion and suggest that such principles could be neuronally implemented in very similar ways. (An illustrative sketch of the synaptic depression mechanism appears after this list.)
- Dyhr, J. P., & Higgins, C. M. (2010). Non-directional motion detectors can be used to mimic optic flow dependent behaviors. Biological Cybernetics, 103(6), 433-446. PMID: 21161268. Abstract: Insect navigational behaviors including obstacle avoidance, grazing landings, and visual odometry are dependent on the ability to estimate flight speed based only on visual cues. In honeybees, this visual estimate of speed is largely independent of both the direction of motion and the spatial frequency content of the image. Electrophysiological recordings from the motion-sensitive cells believed to underlie these behaviors have long supported spatio-temporally tuned correlation-type models of visual motion detection whose speed tuning changes as the spatial frequency of a stimulus is varied. The result is an apparent conflict between behavioral experiments and the electrophysiological and modeling data. In this article, we demonstrate that conventional correlation-type models are sufficient to reproduce some of the speed-dependent behaviors observed in honeybees when square wave gratings are used, contrary to the theoretical predictions. However, these models fail to match the behavioral observations for sinusoidal stimuli. Instead, we show that non-directional motion detectors, which underlie the correlation-based computation of directional motion, can be used to mimic these same behaviors even when narrowband gratings are used. The existence of such non-directional motion detectors is supported both anatomically and electrophysiologically, and they have been hypothesized to be critical in the Dipteran elementary motion detector (EMD) circuit.
- Dyhr, J. P., & Higgins, C. M. (2010). The spatial frequency tuning of optic-flow-dependent behaviors in the bumblebee Bombus impatiens. The Journal of Experimental Biology, 213(Pt 10). Abstract: Insects use visual estimates of flight speed for a variety of behaviors, including visual navigation, odometry, grazing landings and flight speed control, but the neuronal mechanisms underlying speed detection remain unknown. Although many models and theories have been proposed for how the brain extracts the angular speed of the retinal image, termed optic flow, we lack the detailed electrophysiological and behavioral data necessary to conclusively support any one model. One key property by which different models of motion detection can be differentiated is their spatiotemporal frequency tuning. Numerous studies have suggested that optic-flow-dependent behaviors are largely insensitive to the spatial frequency of a visual stimulus, but they have sampled only a narrow range of spatial frequencies, have not always used narrowband stimuli, and have yielded slightly different results between studies based on the behaviors being investigated. In this study, we present a detailed analysis of the spatial frequency dependence of the centering response in the bumblebee Bombus impatiens using sinusoidal and square wave patterns.
- Johnson, L. A., & Higgins, C. M. (2006). A navigation aid for the blind using tactile-visual sensory substitution. Annual International Conference of the IEEE Engineering in Medicine and Biology - Proceedings, 6289-6292. Abstract: The objective of this study is to improve the quality of life for the visually impaired by restoring their ability to self-navigate. In this paper we describe a compact, wearable device that converts visual information into a tactile signal. This device, constructed entirely from commercially available parts, enables the user to perceive distant objects via a different sensory modality. Preliminary data suggest that this device is useful for object avoidance in simple environments.
- Özalevli, E., Hasler, P., & Higgins, C. M. (2006). Winner-take-all-based visual motion sensors. IEEE Transactions on Circuits and Systems II: Express Briefs, 53(8), 717-721. Abstract: We present a novel analog VLSI implementation of visual motion computation based on the lateral inhibition and positive feedback mechanisms that are inherent in the hysteretic winner-take-all circuit. By use of an input-dependent bias current and threshold mechanism, the circuit resets itself to prepare for another motion computation. This implementation was inspired by the Barlow-Levick model of direction selectivity in the rabbit retina. Each pixel uses 33 transistors and two small capacitors to detect the direction of motion and can be altered with the addition of six more transistors to measure the interpixel transit time. Simulation results and measurements from fabricated VLSI designs are presented to show the operation of the circuits.
- Higgins, C. M., Pant, V., & Deutschmann, R. (2005). Analog VLSI implementation of spatio-temporal frequency tuned visual motion algorithms. IEEE Transactions on Circuits and Systems I: Regular Papers, 52(3), 489-502. Abstract: The computation of local visual motion can be accomplished very efficiently in the focal plane with custom very large-scale integration (VLSI) hardware. Algorithms based on measurement of the spatial and temporal frequency content of the visual motion signal, since they incorporate no thresholding operation, allow highly sensitive responses to low contrast and low-speed visual motion stimuli. We describe analog VLSI implementations of the three most prominent spatio-temporal frequency-based visual motion algorithms, present characterizations of their performance, and compare the advantages of each on an equal basis. This comparison highlights important issues in the design of analog VLSI sensors, including the effects of circuit design on power consumption, the tradeoffs of subthreshold versus above-threshold MOSFET biasing, and methods of layout for focal plane vision processing arrays. The presented sensors are capable of distinguishing the direction of motion of visual stimuli to less than 5% contrast, while consuming as little as 1 μW of electrical power. These visual motion sensors are useful in embedded applications where minimum power consumption, size, and weight are crucial.
- Melano, T., & Higgins, C. M. (2005). The neuronal basis of direction selectivity in lobula plate tangential cells. Neurocomputing, 65-66 (Special Issue), 153-159. Abstract: Using a neuronally based computational model of the fly's visual elementary motion detection (EMD) system, the effects of picrotoxin, a GABA receptor antagonist, were modeled to investigate the role of various GABAergic cells in direction selectivity. By comparing the results of our simulation of an anatomically correct model to previously published electrophysiological results, this study supports the hypothesis that EMD outputs integrated into tangential cells are weakly directional, although the tangential cells themselves respond to moving stimuli in a strongly directional manner.
- Rivera-Alvidrez, Z., & Higgins, C. M. (2005). Contrast saturation in a neuronally-based model of elementary motion detection. Neurocomputing, 65-66 (Special Issue), 173-179. Abstract: The Hassenstein-Reichardt (HR) correlation model is commonly used to model elementary motion detection in the fly. Recently, a neuronally-based computational model was proposed which, unlike the HR model, is based on identified neurons. The response of both models increases as the square of contrast, although the response of insect neurons saturates at high contrasts. We introduce a saturating nonlinearity into the neuronally-based model in order to produce contrast saturation and discuss the neuronal implications of these elements. Furthermore, we show that features of the contrast sensitivity of movement-detecting neurons are predicted by the modified model. (An illustrative elementary-motion-detector sketch appears after this list.)
- Özalevli, E., & Higgins, C. M. (2005). Reconfigurable biologically inspired visual motion systems using modular neuromorphic VLSI chips. IEEE Transactions on Circuits and Systems I: Regular Papers, 52(1), 79-92. Abstract: Visual motion information provides a variety of clues that enable biological organisms from insects to primates to efficiently navigate in unstructured environments. We present modular mixed-signal very large-scale integration (VLSI) implementations of the three most prominent biological models of visual motion detection. A novel feature of these designs is the use of spike integration circuitry to implement the necessary temporal filtering. We show how such modular VLSI building blocks make it possible to build highly powerful and flexible vision systems. These three biomimetic motion algorithms are fully characterized and compared in performance. The visual motion detection models are each implemented on separate VLSI chips, but utilize a common silicon retina chip to transmit changes in contrast, and thus four separate mixed-signal VLSI designs are described. Characterization results of these sensors show that each has a saturating response to contrast to moving stimuli, and that the direction of motion of a sinusoidal grating can be detected down to less than 5% contrast, and over more than an order of magnitude in velocity, while retaining modest power consumption.
- Higgins, C. M. (2004). Nondirectional motion may underlie insect behavioral dependence on image speed. Biological Cybernetics, 91(5), 326-332. PMID: 15490223. Abstract: Behavioral experiments suggest that insects make use of the apparent image speed on their compound eyes to navigate through obstacles, control flight speed, land smoothly, and measure the distance they have flown. However, the vast majority of electrophysiological recordings from motion-sensitive insect neurons show responses which are tuned in spatial and temporal frequency and are thus unable to unambiguously represent image speed. We suggest that this contradiction may be resolved at an early stage of visual motion processing using nondirectional motion sensors that respond proportionally to image speed until their peak response. We describe and characterize a computational model of these sensors and propose a model by which a spatial collation of such sensors could be used to generate speed-dependent behavior.
- Higgins, C. M., & Pant, V. (2004). A biomimetic VLSI sensor for visual tracking of small moving targets. IEEE Transactions on Circuits and Systems I: Regular Papers, 51(12), 2384-2394. Abstract: Taking inspiration from the visual system of the fly, we describe and characterize a monolithic analog very large-scale integration sensor, which produces control signals appropriate for the guidance of an autonomous robot to visually track a small moving target. This sensor is specifically designed to allow such tracking even from a moving imaging platform which experiences complex background optical flow patterns. Based on relative visual motion of the target and background, the computational model implemented by this sensor emphasizes any small-field motion which is inconsistent with the wide-field background motion.
- Higgins, C. M., & Pant, V. (2004). An elaborated model of fly small-target tracking. Biological Cybernetics, 91(6), 417-428. PMID: 15597180. Abstract: Flies have the capability to visually track small moving targets, even across cluttered backgrounds. Previous computational models, based on figure detection (FD) cells identified in the fly, have suggested how this may be accomplished at a neuronal level based on information about relative motion between the target and the background. We experimented with the use of this "small-field system model" for the tracking of small moving targets by a simulated fly in a cluttered environment and discovered some functional limitations. As a result of these experiments, we propose elaborations of the original small-field system model to support stronger effects of background motion on small-field responses, proper accounting for more complex optical flow fields, and more direct guidance toward the target. We show that the elaborated model achieves much better tracking performance than the original model in complex visual environments and discuss the biological implications of our elaborations. The elaborated model may help to explain recent electrophysiological data on FD cells that seem to contradict the original model.
- Pant, V., & Higgins, C. M. (2004). A biomimetic VLSI architecture for small target tracking. Proceedings - IEEE International Symposium on Circuits and Systems, 3, III5-III8. Abstract: Tracking of a target in a cluttered environment requires extensive computational architecture. However, even a small housefly is adept at pursuing its prey. Biomimetic algorithms suggest a novel way of looking at this problem. In the lobula plate of a fly's brain, a neural circuit is hypothesized based on a tangential cell called the figure detection (FD) cell. The proposed small target fixation algorithm based on electrophysiological recordings does not take into account the translation of the pursuer during pursuit. We have modified the biological algorithm to include this aspect of tracking. In this paper, we present the elaborated biological algorithm for small target tracking, and an analog VLSI implementation of this algorithm.
- Özalevli, E., & Higgins, C. M. (2003). Multi-chip implementation of a biomimetic VLSI vision sensor based on the Adelson-Bergen algorithm. Lecture Notes in Computer Science, 2714, 433-440. Abstract: Biological motion sensors found in the retinas of species ranging from flies to primates are tuned to specific spatio-temporal frequencies to determine the local motion vectors in their visual field and perform complex motion computations. In this study, we present a novel implementation of a silicon retina based on the Adelson-Bergen spatio-temporal energy model of primate cortical cells. By employing a multi-chip strategy, we successfully implemented the model without much sacrifice of the fill factor of the photoreceptors in the front-end chip. In addition, the characterization results proved that this spatio-temporal frequency tuned silicon retina can detect the direction of motion of a sinusoidal input grating down to 10 percent contrast, and over more than an order of magnitude in velocity. This multi-chip biomimetic vision sensor will allow complex visual motion computations to be performed in real time.
- Higgins, C. M., & Shams, S. A. (2002). A biologically inspired modular VLSI system for visual measurement of self-motion. IEEE Sensors Journal, 2(6), 508-528. doi:10.1109/jsen.2002.807304. Abstract: We introduce a biologically inspired computational architecture for small-field detection and wide-field spatial integration of visual motion based on the general organizing principles of visual motion processing common to organisms from insects to primates. This highly parallel architecture begins with two-dimensional (2-D) image transduction and signal conditioning, performs small-field motion detection with a number of parallel motion arrays, and then spatially integrates the small-field motion units to synthesize units sensitive to complex wide-field patterns of visual motion. We present a theoretical analysis demonstrating the architecture's potential in discrimination of wide-field motion patterns such as those which might be generated by self-motion. A custom VLSI hardware implementation of this architecture is also described, incorporating both analog and digital circuitry. The individual custom VLSI elements are analyzed and characterized, and system-level test results demonstrate the ability of the system to selectively respond to certain motion patterns, such as those that might be encountered in self-motion, at the exclusion of others.
- Higgins, C. M. (2001). Sensory architectures for biologically inspired autonomous robotics. Biological Bulletin, 200(2), 235-242. PMID: 11341590. Abstract: Engineers have a lot to gain from studying biology. The study of biological neural systems alone provides numerous examples of computational systems that are far more complex than any man-made system and perform real-time sensory and motor tasks in a manner that humbles the most advanced artificial systems. Despite the evolutionary genesis of these systems and the vast apparent differences between species, there are common design strategies employed by biological systems that span taxa, and engineers would do well to emulate these strategies. However, biologically-inspired computational architectures, which are continuous-time and parallel in nature, do not map well onto conventional processors, which are discrete-time and serial in operation. Rather, an implementation technology that is capable of directly realizing the layered parallel structure and nonlinear elements employed by neurobiology is required for power- and space-efficient implementation. Custom neuromorphic hardware meets these criteria and yields low-power dedicated sensory systems that are small, light, and ideal for autonomous robot applications. As examples of how this technology is applied, this article describes both a low-level neuromorphic hardware emulation of an elementary visual motion detector, and a large-scale, system-level spatial motion integration system.
- Higgins, C. M., & Koch, C. (2000). Modular multi-chip neuromorphic architecture for real-time visual motion processing. Analog Integrated Circuits and Signal Processing, 24(3), 195-211. Abstract: The extent of pixel-parallel focal plane image processing is limited by pixel area and imager fill factor. In this paper, we describe a novel multi-chip neuromorphic VLSI visual motion processing system which combines analog circuitry with an asynchronous digital interchip communications protocol to allow more complex pixel-parallel motion processing than is possible in the focal plane. This multi-chip system retains the primary advantages of focal plane neuromorphic image processors: low-power consumption, continuous-time operation, and small size. The two basic VLSI building blocks are a photosensitive sender chip which incorporates a 2D imager array and transmits the position of moving spatial edges, and a receiver chip which computes a 2D optical flow vector field from the edge information. The elementary two-chip motion processing system consisting of a single sender and receiver is first characterized. Subsequently, two three-chip motion processing systems are described. The first three-chip system uses two sender chips to compute the presence of motion only at a particular stereoscopic depth from the imagers. The second three-chip system uses two receivers to simultaneously compute a linear and polar topographic mapping of the image plane, resulting in information about image translation, rotation, and expansion. These three-chip systems demonstrate the modularity and flexibility of the multi-chip neuromorphic approach.
- Higgins, C. M., Deutschmann, R. A., & Koch, C. (1999). Pulse-based 2-D motion sensors. IEEE Transactions on Circuits and Systems II: Analog and Digital Signal Processing, 46(6), 677-687. Abstract: We present two compact CMOS integrated circuits for computing the two-dimensional (2-D) local direction of motion of an image focused directly onto the chip. These circuits incorporate onboard photoreceptors and focal plane motion processing. With fully functional 14 x 13 and 12 x 13 implementations consuming less than 50 μW per pixel, we conclude that practical pixel resolutions of at least 64 x 64 are easily achievable. Measurements characterizing the elementary one-dimensional motion detectors are presented along with a discussion of 2-D performance and example 2-D motion vector fields. As an example application of the sensor, it is shown that the array as fabricated can directly compute the focus of expansion of a 2-D motion vector field.
- Higgins, C. M., & Goodman, R. M. (1994). Fuzzy rule-based networks for control. IEEE Transactions on Fuzzy Systems, 2(1), 82-88. Abstract: We present a method for learning fuzzy logic membership functions and rules to approximate a numerical function from a set of examples of the function's independent variables and the resulting function value. This method uses a three-step approach to building a complete function approximation system: first, learning the membership functions and creating a cell-based rule representation; second, simplifying the cell-based rules using an information-theoretic approach for induction of rules from discrete-valued data; and, finally, constructing a computational (neural) network to compute the function value given its independent variables. This function approximation system is demonstrated with a simple control example: learning the truck-and-trailer backer-upper control system. (An illustrative rule-based approximation sketch appears after this list.)
- Smyth, P., Miller, J. W., Higgins, C. M., & Goodman, R. M. (1992). Rule-based neural networks for classification and probability estimation. Neural Computation, 4(6), 781-804. doi:10.1162/neco.1992.4.6.781. Abstract: In this paper we propose a network architecture that combines a rule-based approach with that of the neural network paradigm. Our primary motivation for this is to ensure that the knowledge embodied in the network is explicitly encoded in the form of understandable rules. This enables the network's decision to be understood, and provides an audit trail of how that decision was arrived at. We utilize an information theoretic approach to learning a model of the domain knowledge from examples. This model takes the form of a set of probabilistic conjunctive rules between discrete input evidence variables and output class variables. These rules are then mapped onto the weights and nodes of a feedforward neural network resulting in a directly specified architecture. The network acts as a parallel Bayesian classifier, but more importantly, can also output posterior probability estimates of the class variables. Empirical tests on a number of data sets show that the rule-based classifier performs comparably with standard neural network classifiers, while possessing unique advantages in terms of knowledge representation and probability estimation.
- Higgins, C. M., & Goodman, R. M. (1991). Incremental learning with rule-based neural networks. Proceedings, IJCNN-91-Seattle: International Joint Conference on Neural Networks, 875-880. Abstract: A classifier for discrete-valued variable classification problems is presented. The system utilizes an information-theoretic algorithm for constructing informative rules from example data. These rules are then used to construct a neural network to perform parallel inference and posterior probability estimation. The network can be grown incrementally, so that new data can be incorporated without repeating the training on previous data. It is shown that this technique performs as well as other techniques such as backpropagation while having unique advantages in incremental learning capability, training efficiency, knowledge representation, and hardware implementation suitability.
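Several of the modeling papers above (e.g., Rivera-Alvidrez & Higgins, 2005; Higgins, 2004) build on correlation-type elementary motion detection. The following is a minimal NumPy sketch of a Hassenstein-Reichardt elementary motion detector with an optional saturating input nonlinearity; it is an illustrative toy rather than code from any of the publications, and the tanh saturation, time constant, and stimulus parameters are arbitrary choices.

```python
import numpy as np

def hr_emd(left, right, dt=1e-3, tau=0.05):
    """Hassenstein-Reichardt correlation EMD on two photoreceptor signals.

    Each input is low-pass filtered (first-order, time constant tau) to act as
    the delayed arm, multiplied by the undelayed signal from the neighboring
    receptor, and the two half-detector products are subtracted to yield a
    signed, direction-selective output.
    """
    alpha = dt / (tau + dt)                 # first-order low-pass coefficient
    lp_left = np.zeros_like(left)
    lp_right = np.zeros_like(right)
    for t in range(1, len(left)):           # causal low-pass filtering
        lp_left[t] = lp_left[t - 1] + alpha * (left[t] - lp_left[t - 1])
        lp_right[t] = lp_right[t - 1] + alpha * (right[t] - lp_right[t - 1])
    return lp_left * right - lp_right * left

def saturating_emd(left, right, gain=5.0, **kwargs):
    """EMD preceded by a tanh contrast saturation at each input."""
    return hr_emd(np.tanh(gain * left), np.tanh(gain * right), **kwargs)

# Example: a sinusoidal grating drifting in the detector's preferred direction.
t = np.arange(0.0, 1.0, 1e-3)
phase = np.pi / 4                            # phase lag between adjacent receptors
left = np.sin(2 * np.pi * 4.0 * t)
right = np.sin(2 * np.pi * 4.0 * t - phase)
print("mean EMD output:      ", hr_emd(left, right).mean())
print("with input saturation:", saturating_emd(left, right).mean())
```

Swapping the two inputs reverses the stimulus direction and flips the sign of the mean output, which is the basic direction-selective property the papers' models elaborate on.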
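The contrast gain adaptation paper (Rivera-Alvidrez, Lin, & Higgins, 2011) attributes the direction-independent component of motion adaptation to frequency-dependent depression at a graded synapse. The sketch below illustrates that general class of mechanism with a single depleting "resource" variable; the recovery and depletion constants are invented for illustration and are not taken from the paper.

```python
import numpy as np

def depressing_graded_synapse(presyn, dt=1e-3, tau_rec=0.5, depletion=2.0):
    """Toy frequency-dependent depression for a graded (non-spiking) synapse.

    A transmission 'resource' recovers toward 1 with time constant tau_rec and
    is depleted in proportion to the presynaptic drive, so sustained or rapidly
    varying input is transmitted with progressively lower gain.
    """
    resource = 1.0
    out = np.empty_like(presyn)
    for i, x in enumerate(presyn):
        drive = max(float(x), 0.0)                  # rectified graded drive
        out[i] = resource * drive                   # transmitted signal
        d_resource = (1.0 - resource) / tau_rec - depletion * drive * resource
        resource = min(max(resource + dt * d_resource, 0.0), 1.0)
    return out

# Example: the response to a strong 8 Hz drive adapts over a couple of seconds.
t = np.arange(0.0, 2.0, 1e-3)
drive = 0.5 * (1.0 + np.sin(2.0 * np.pi * 8.0 * t))
response = depressing_graded_synapse(drive)
print("early mean response:", response[:250].mean())
print("late mean response: ", response[-250:].mean())
```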
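The early rule-based network papers (Higgins & Goodman, 1991, 1994) learn membership functions and rules from examples and then compile them into a network. Their information-theoretic rule induction is more involved than a few lines, so the sketch below only illustrates the general idea of rule-based function approximation: fixed triangular memberships over the input, one constant consequent per rule fit by least squares, and a weighted-average output. Function names, the number of rules, and the sin(x) example are illustrative assumptions, not the published algorithm.

```python
import numpy as np

def tri_memberships(x, centers):
    """Triangular fuzzy membership of scalar x in rules centered at `centers`."""
    width = centers[1] - centers[0]                 # assumes uniform spacing
    return np.clip(1.0 - np.abs(x - centers) / width, 0.0, None)

def fit_rule_consequents(xs, ys, centers):
    """Least-squares fit of one constant consequent per fuzzy rule."""
    A = np.array([tri_memberships(x, centers) for x in xs])
    A /= A.sum(axis=1, keepdims=True)               # normalized rule firing strengths
    consequents, *_ = np.linalg.lstsq(A, ys, rcond=None)
    return consequents

def fuzzy_approx(x, centers, consequents):
    """Approximate f(x) as the membership-weighted average of rule consequents."""
    w = tri_memberships(x, centers)
    return float(np.dot(w, consequents) / w.sum())

# Example: approximate sin(x) on [0, pi] from noisy samples using 7 rules.
rng = np.random.default_rng(0)
xs = rng.uniform(0.0, np.pi, 200)
ys = np.sin(xs) + 0.05 * rng.standard_normal(200)
centers = np.linspace(0.0, np.pi, 7)
consequents = fit_rule_consequents(xs, ys, centers)
print("approx f(1.0):", fuzzy_approx(1.0, centers, consequents), "true:", np.sin(1.0))
```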
Proceedings Publications
- Pham, T. T., & Higgins, C. M. (2014, August). A visual motion detecting module for dragonfly-controlled robots. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 2014, 1666-1669. Abstract: When imitating biological sensors, we do not yet understand the early stages of sensory processing well enough to reproduce them artificially. Building hybrid systems with both artificial and real biological components is a promising solution. For example, when a dragonfly is used as a living sensor, the early processing of visual information is performed fully in the brain of the dragonfly. The only significant remaining tasks are recording and processing neural signals in software and/or hardware. Based on existing work focused on recording neural signals, this paper proposes a software application for neural information processing to serve as a visual processing module for dragonfly hybrid bio-robots. After a neural signal is recorded in real time, action potentials can be detected and matched against predefined templates to determine when and which descending neurons fire. The output of the proposed system will be used to control other parts of the robot platform. (A toy spike-detection sketch appears after this list.)
- Pant, V., & Higgins, C. M. (2007). A Biomimetic Focal Plane Speed Computation Architecture. In Adaptive Optics: Analysis and Methods / Computational Optical Sensing and Imaging / Information Photonics / Signal Recovery and Synthesis Topical Meetings on CD-ROM. Abstract: A sensor was designed to compute speed at the image focal plane for robotic navigation. It employs an array of parallel sensing and computing blocks, and outputs a signal that varies linearly with image speed.
- Pant, V., & Higgins, C. M. (2004). A biomimetic VLSI architecture for small target tracking. In 2004 IEEE International Symposium on Circuits and Systems (IEEE Cat. No.04CH37512), 3, 5-8. Abstract: Tracking of a target in a cluttered environment requires extensive computational architecture. However, even a small housefly is adept at pursuing its prey. Biomimetic algorithms suggest a way of looking at this problem. In the lobula plate of a fly's brain, a neural circuit is hypothesized based on a tangential cell called the figure detection (FD) cell. The proposed small target fixation algorithm based on electrophysiological recordings does not take into account the translation of the pursuer during pursuit. We have modified the biological algorithm to include this aspect of tracking. In this paper, we present the elaborated biological algorithm for small target tracking, and an analog VLSI implementation of this algorithm.
- Koch, C., & Higgins, C. M. (1999). Multi-chip neuromorphic motion processing. In Proceedings 20th Anniversary Conference on Advanced Research in VLSI, 309-323. Abstract: We describe a multi-chip CMOS VLSI visual motion processing system which combines analog circuitry with an asynchronous digital interchip communications protocol to allow more complex motion processing than is possible with all the circuitry in the focal plane. The two basic VLSI building blocks are a sender chip which incorporates a 2D imager array and transmits the position of moving spatial edges, and a receiver chip which computes a 2D optical flow vector field from the edge information. The elementary two-chip motion processing system consisting of a single sender and receiver is first characterized. Subsequently, two three-chip motion processing systems are described. The first such system uses two sender chips to compute the presence of motion only at a particular stereoscopic disparity. The second such system uses two receivers to simultaneously compute a linear and polar topographic mapping of the image plane, resulting in information about image translation, rotation, and expansion. These three-chip systems demonstrate the modularity and flexibility of the multi-chip neuromorphic approach.
- Higgins, C. M., & Koch, C. (1998). An Integrated Vision Sensor for the Computation of Optical Flow Singular Points. In Advances in Neural Information Processing Systems, 11, 699-705. Abstract: A robust, integrative algorithm is presented for computing the position of the focus of expansion or axis of rotation (the singular point) in optical flow fields such as those generated by self-motion. Measurements are shown of a fully parallel CMOS analog VLSI motion sensor array which computes the direction of local motion (sign of optical flow) at each pixel and can directly implement this algorithm. The flow field singular point is computed in real time with a power consumption of less than 2 mW. Computation of the singular point for more general flow fields requires measures of field expansion and rotation, which it is shown can also be computed in real-time hardware, again using only the sign of the optical flow field. These measures, along with the location of the singular point, provide robust real-time self-motion information for the visual guidance of a moving platform such as a robot. (A rough software analogue of the singular-point estimate appears after this list.)
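Pham & Higgins (2014) describe detecting action potentials in a dragonfly neural recording in real time and matching them against predefined templates to identify which descending neuron fired. The sketch below shows one conventional way such a pipeline can be structured, with threshold crossing plus a refractory period for detection and least-squares template matching for classification; the function names, window length, and threshold are hypothetical and not taken from the paper.

```python
import numpy as np

def detect_spikes(signal, fs, threshold, refractory_ms=2.0):
    """Return sample indices of upward threshold crossings, with a refractory gap."""
    min_gap = int(refractory_ms * 1e-3 * fs)
    crossings = np.flatnonzero((signal[1:] >= threshold) & (signal[:-1] < threshold)) + 1
    kept, last = [], -min_gap
    for idx in crossings:
        if idx - last >= min_gap:
            kept.append(idx)
            last = idx
    return np.array(kept, dtype=int)

def classify_spikes(signal, spike_indices, templates, window=32):
    """Assign each detected spike to the template with the smallest squared error.

    `templates` maps a unit label (e.g. one per identified descending neuron)
    to a length-`window` waveform sampled at the same rate as `signal`.
    """
    labels = []
    for idx in spike_indices:
        snippet = signal[idx:idx + window]
        if len(snippet) < window:
            continue                                 # skip truncated snippet at the end
        errors = {name: float(np.sum((snippet - tpl) ** 2)) for name, tpl in templates.items()}
        labels.append(min(errors, key=errors.get))
    return labels
```

In a hybrid bio-robot of the kind the paper describes, the resulting per-neuron event stream, rather than the raw voltage trace, would then drive the robot's control loop.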
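Higgins & Koch (1998) compute the optical flow singular point (the focus of expansion or axis of rotation) using only the sign of the local flow. As a rough software analogue of that idea, and not the chip's algorithm, the sketch below locates the focus of expansion of a purely expanding field at the boundary where the averaged sign of the horizontal and vertical flow components flips; it assumes the singular point lies inside the field of view.

```python
import numpy as np

def focus_of_expansion_from_sign(u, v):
    """Estimate the focus of expansion of an expanding 2D flow field from sign alone.

    For pure expansion, sign(u) is negative left of the FOE and positive to its
    right (and likewise for sign(v) vertically), so the FOE sits at the boundary
    where the column-averaged sign of u and the row-averaged sign of v change sign.
    u, v: 2D arrays (rows x cols) of horizontal and vertical flow components.
    """
    mean_sign_u = np.sign(u).mean(axis=0)            # one value per column
    mean_sign_v = np.sign(v).mean(axis=1)            # one value per row
    col = np.flatnonzero(mean_sign_u >= 0)[0] - 0.5  # boundary between - and + columns
    row = np.flatnonzero(mean_sign_v >= 0)[0] - 0.5  # boundary between - and + rows
    return row, col

# Example: synthetic expansion about (row, col) = (25.6, 67.2).
rows, cols = 64, 96
yy, xx = np.mgrid[0:rows, 0:cols]
u = xx - 67.2
v = yy - 25.6
print(focus_of_expansion_from_sign(u, v))            # prints values near (25.6, 67.2)
```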
Presentations
- Higgins, C. M. (2021, October). Emergent Intelligence from a Neuronal Model of Insect Vision. MARC (Maximizing Access to Research Careers) invited seminar.
- Higgins, C. M. (2020, April). A matched oscillator theory for auditory recognition. Department of Neuroscience faculty meeting talk.
Poster Presentations
- Miller, J. E., & Higgins, C. M. (2023, September). Altered Basal Ganglia Gene Networks for Vocal Function in Normative Aging and Parkinson’s Disease. Arizona Alzheimer’s Consortium Conference.