Thomas Powers, Research Assistant, tcpowers@apl.washington.edu

Publications
2000-present and while at APL-UW

Building recurrent networks by unfolding iterative thresholding for sequential sparse recovery Wisdom, S., T. Powers, J. Pitton, and L. Atlas, "Building recurrent networks by unfolding iterative thresholding for sequential sparse recovery," Proc., IEEE International Conference on Acoustics, Speech and Signal Processing, 5-9 March, New Orleans, LA, 4346-4350, doi:10.1109/ICASSP.2017.7952977 (IEEE, 2017).
19 Jun 2017

Historically, sparse methods and neural networks, particularly modern deep learning methods, have been relatively disparate areas. Sparse methods are typically used for signal enhancement, compression, and recovery, usually in an unsupervised framework, while neural networks commonly rely on a supervised training set. In this paper, we use the specific problem of sequential sparse recovery, which models a sequence of observations over time using a sequence of sparse coefficients, to show how algorithms for sparse modeling can be combined with supervised deep learning to improve sparse recovery. Specifically, we show that the iterative soft-thresholding algorithm (ISTA) for sequential sparse recovery corresponds to a stacked recurrent neural network (RNN) under specific architecture and parameter constraints. Then we demonstrate the benefit of training this RNN with backpropagation using supervised data for the task of column-wise compressive sensing of images. This training corresponds to adaptation of the original iterative thresholding algorithm and its parameters. Thus, we show by example that sparse modeling can provide a rich source of principled and structured deep network architectures that can be trained to improve performance on specific tasks.
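The ISTA iteration the abstract refers to is a standard algorithm; a minimal sketch of its non-sequential form is below (the paper's sequential variant and the learned RNN weights are not reproduced here). Each loop iteration — a gradient step on the quadratic data term followed by soft-thresholding — is the operation that, when unrolled with learnable parameters, becomes one layer of the recurrent network described in the paper. The problem sizes and the regularization weight `lam` are illustrative choices, not values from the paper.

```python
import numpy as np

def soft_threshold(x, lam):
    # Elementwise soft-thresholding: the proximal operator of the L1 norm.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def ista(A, y, lam=0.1, n_iters=300):
    """ISTA for min_h 0.5 * ||y - A h||^2 + lam * ||h||_1.

    Unrolled, each iteration is one 'layer': a linear map of the current
    estimate plus a fixed input term, followed by a thresholding
    nonlinearity -- the structure the paper trains as an RNN.
    """
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1 / Lipschitz constant of the gradient
    h = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ h - y)             # gradient of the quadratic term
        h = soft_threshold(h - step * grad, step * lam)
    return h
```

In the unfolded-network view, `A.T` and the implicit `(I - step * A.T A)` become trainable weight matrices and `lam` a trainable threshold, which backpropagation then adapts to the task.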

Constrained robust submodular sensor selection with applications to multistatic sonar arrays Powers, T., J. Bilmes, D.W. Krout, and L. Atlas, "Constrained robust submodular sensor selection with applications to multistatic sonar arrays," Proc., FUSION — 19th International Conference on Information Fusion, 5-8 July, 2179-2185 (IEEE, 2016).
4 Aug 2016

We develop a framework to select a subset of sensors from a field in which the sensors have an ingrained independence structure. Given an arbitrary independence pattern, we construct a graph that denotes pairwise independence between sensors, which means those sensors may operate simultaneously. The set of all fully-connected subgraphs (cliques) of this independence graph forms the independent sets of a matroid over which we maximize the minimum of a set of submodular objective functions. We propose a novel algorithm called MatSat that exploits submodularity and, as a result, returns a near-optimal solution with approximation guarantees that are within a small factor of the average-case scenario. We apply this framework to ping sequence optimization for active multistatic sonar arrays by maximizing sensor coverage and derive lower bounds for minimum probability of detection for a fractional number of targets. In these ping sequence optimization simulations, MatSat exceeds the fractional lower bounds and reaches near-optimal performance.
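The MatSat algorithm itself is specific to this paper, but the coverage objective it optimizes is a classic monotone submodular function. A minimal sketch of the standard greedy algorithm for such a function under a cardinality constraint is below; this is the textbook baseline (with the Nemhauser–Wolsey–Fisher 1 − 1/e guarantee), not the paper's robust, matroid-constrained MatSat. The sensor IDs and target sets are invented illustration data.

```python
def greedy_coverage(coverage, k):
    """Greedy sensor selection maximizing covered targets, |selection| <= k.

    coverage: dict mapping sensor id -> set of targets that sensor covers.
    Coverage is monotone submodular (marginal gains shrink as the covered
    set grows), so greedy selection achieves at least (1 - 1/e) of the
    optimal coverage for a cardinality constraint.
    """
    selected, covered = [], set()
    candidates = set(coverage)
    for _ in range(min(k, len(candidates))):
        # Pick the sensor with the largest marginal gain: most new targets.
        best = max(candidates, key=lambda s: len(coverage[s] - covered))
        if not coverage[best] - covered:
            break  # no remaining sensor adds any new coverage
        selected.append(best)
        covered |= coverage[best]
        candidates.remove(best)
    return selected, covered
```

The paper's setting replaces the cardinality constraint with the matroid induced by the independence graph's cliques, and maximizes the *minimum* of several such objectives for robustness.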