Partners on Campus
| PROJECT / COLLABORATORS | DESCRIPTION |
Tomographic Network Measurement Funding: NSF
| This project is developing tomographic network measurement techniques that enable packet delay and loss characteristics to be isolated to specific links in the wide area. Tomography is the use of information from simultaneous probes to estimate properties of shared resources. Analytical models for these techniques, as well as tools that can be used in the Internet by both researchers and network operators, are also being developed. Testing and evaluation of the tools are being conducted in WAIL. |
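The core identity behind this kind of loss tomography can be shown with a small simulation. The sketch below is not the project's tools or models; it is a minimal, hypothetical two-receiver example assuming independent per-link losses, in which the shared link's success rate is recovered from end-to-end observations alone:

```python
import random

def simulate_probes(n, a_shared, a_left, a_right, seed=0):
    """Simulate n probes through a two-receiver tree: each probe crosses a
    shared link, then branches to receivers A and B on separate links."""
    rng = random.Random(seed)
    recv = []
    for _ in range(n):
        shared = rng.random() < a_shared       # probe survives the shared link
        a = shared and rng.random() < a_left   # ...and the branch to A
        b = shared and rng.random() < a_right  # ...and the branch to B
        recv.append((a, b))
    return recv

def estimate_shared_success(recv):
    """Estimate the shared link's success rate from end-to-end observations
    only, using P(A)P(B)/P(A and B) = a_shared, which holds because losses
    on the two branches are conditionally independent given the shared link."""
    n = len(recv)
    p_a = sum(a for a, _ in recv) / n
    p_b = sum(b for _, b in recv) / n
    p_ab = sum(a and b for a, b in recv) / n
    return p_a * p_b / p_ab

recv = simulate_probes(200_000, a_shared=0.95, a_left=0.90, a_right=0.85)
print(round(estimate_shared_success(recv), 3))  # close to 0.95
```

The estimate uses only what the endpoints see, which is the point of the tomographic approach: no router cooperation is needed.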
Next Generation Network UW Collaborators: DoIT
| We are consulting with DoIT on a variety of issues that concern management and operation of the UW campus network, including different aspects of storage, monitoring, security, and external connectivity. I also advise DoIT on the activities of the National LambdaRail effort, for which I am currently serving as a board member. |
Automated Analysis of Chest CT Images for Detecting Cystic Fibrosis in Children UW Collaborator: Philip Farrell (Medical School) Funding: NIH and UW
| This project is developing algorithms and systems that take a high-resolution computed tomography (HRCT) image of a child's chest and perform automated analysis to evaluate the degree of cystic fibrosis (CF) in the patient. This is an important problem for improving the early diagnosis of CF through neonatal screening. |
Image Analysis of Diffusion MRI of the Human Brain UW Collaborator: Andrew Alexander (Medical Physics) Funding: NIH | Diffusion-tensor MRI (DT-MRI) is a relatively new imaging method with many clinical applications, such as analyzing the human brain to detect tumors, multiple sclerosis, autism, injury, and depression. In this project we are developing innovative DT-MRI image analysis approaches for the quantitative evaluation of diffusion measurements in the human brain, including (1) probabilistic algorithms for DT-MRI brain segmentation, and (2) statistical parametric mapping. |
Face Recognition using Integral Invariants and Cryptology UW Collaborators: Nigel Boston (Mathematics) and Yu-Hen Hu (ECE) Funding: NSF | While current human face recognition methods are limited in the conditions under which they work, recent advances in the mathematical theory of invariants and recent connections with cryptology provide new insights into the face recognition problem that may help broaden the generality and robustness of face recognition. The goals of this project include: (1) establishing a common mathematical foundation between cryptology and image analysis for object recognition, (2) investigating the mathematical theory of integral invariants for 3D objects, and (3) developing algorithms and prototype systems for recognizing moving faces using multiple cameras. |
Cancer Radiotherapy UW Collaborators: R. Mackie and R. Jeraj (Medical Physics) Funding: NSF | The application of optimization techniques to radiation treatment planning has become an extremely active area of research. The latest generation of radiation devices allows the delivery of highly optimized and accurate treatment plans. However, the new capabilities of these devices, particularly their ability to obtain data at each treatment session on the delivered dose and on the movement of organs in the patient's body, are not exploited by current planning methodologies. Neither do these methodologies take account of the many errors and uncertainties that arise during the planning process in, for example, patient positioning, organ movement, and tumor location and extent. Even a highly optimized plan can turn out to be ineffective in practice if the tumor moves from its presumed location during the treatment period. |
Model Building with Likelihood Basis Pursuit UW Collaborators: Meta M. Voelker (formerly MACE program, now AlphaTech), Hao Helen Zhang (formerly Statistics, now NC State) and Grace Wahba (Statistics) | We consider a nonparametric penalized likelihood approach for model building called likelihood basis pursuit (LBP) that determines the probabilities of binary outcomes given explanatory vectors, while automatically selecting important features. The LBP model involves parameters that balance the competing goals of maximizing the log likelihood and minimizing the penalized basis pursuit terms. These parameters are selected to minimize a proxy of misclassification error, namely the randomized generalized approximate cross validation (ranGACV) function. The ranGACV function is not easily represented in compact form; its functional values can only be obtained by solving two instances of the LBP model, which may be computationally expensive. Application of this method, computational efficiencies and implementations are considered. |
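As a rough illustration of the penalized-likelihood trade-off (not the LBP algorithm itself, which works with basis functions and tunes its balance parameters via ranGACV), the toy sketch below fits a binary model by minimizing a negative log likelihood plus an L1 penalty using proximal gradient descent; all data and parameter values are invented:

```python
import math, random

def soft_threshold(w, t):
    """Proximal operator of the L1 penalty: shrink each weight toward zero by t."""
    return [math.copysign(max(abs(v) - t, 0.0), v) for v in w]

def fit_l1_logistic(xs, ys, lam=0.02, step=0.5, iters=2000):
    """Minimize (1/n) * negative log-likelihood + lam * ||w||_1
    by proximal gradient descent (ISTA)."""
    n, d = len(xs), len(xs[0])
    w = [0.0] * d
    for _ in range(iters):
        grad = [0.0] * d
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-sum(wj * xj for wj, xj in zip(w, x))))
            for j in range(d):
                grad[j] += (p - y) * x[j] / n
        w = soft_threshold([wj - step * gj for wj, gj in zip(w, grad)],
                           step * lam)
    return w

# Synthetic data: only the first of three features carries any signal.
rng = random.Random(1)
xs = [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(300)]
ys = [1 if x[0] > 0 else 0 for x in xs]
w = fit_l1_logistic(xs, ys)
print([round(v, 2) for v in w])  # large first weight, others near zero
```

The L1 term plays the feature-selection role described above: weights on uninformative features are driven to (or very near) zero rather than merely made small.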
Imaging and Analysis of the Microvascular Architecture of the Mouse Brain UW Collaborator: Garet Lahvis (Surgery) Funding: NIH (to be submitted) | Our goal is to understand the micro-vascular architecture (the structure of the network of blood vessels) of the mouse brain and its role in development and disease. The size and complexity of brain vasculature demand a new methodology for study. We have successfully developed techniques that allow us to image and examine micro-vascular architecture at an unprecedented scale. We have demonstrated a histological methodology, combining novel preparation methods, serial sectioning, and bright-light fluorescent montage microscopy, that allows us to image every capillary in the entire mouse brain. We have developed a custom-designed suite of visualization and analysis software that permits biologists to examine these large datasets. We have used these tools both to confirm hypotheses about brain development and to discover anomalies that deserve further study. |
GLOW Distributed Computing System at the University of Wisconsin UW Collaborators: Juan Depablo (Chemical Engineering), David Schwartz (Chemistry), Paul Barford and Remzi Arpaci-Dusseau (CS), Paul Deluca and Robert Jera (Medical Physics), Sridhara Dasu, Francis Halzen, Albrecht Karle, Don Reeder and Wesley Smith (Physics) Funding: NSF | This project is constructing GLOW, the Grid Laboratory Of Wisconsin, to serve as a crucial scientific computing resource at the UW in the years to come. This multidisciplinary effort spans the fields of Computer Science, Genomics, High Energy Physics, Material Science, Neutrino Astrophysics, and Medical Physics. GLOW is a campus-wide grid laboratory that combines and enhances autonomous sites of computing resources. Each site will be managed locally and will be configured to meet the specific scientific needs of the site. GLOW will refine and scale up the methods of implementing large-scale collaborative computing environments that the CHTC of the UW Department of Computer Sciences has pioneered. The distributed nature of GLOW and the local autonomy of its sites will enable us to address the sociological and technological challenges of grid computing in a real-life setting. While each of the sites will focus on addressing the computing needs of the local community of users and will maintain full control over the local resources, computing power and storage space will be shared across site boundaries according to a mutually defined policy. These shared resources will serve the ever-growing computing needs on the UW campus. |
HTCondor Miron Livny | In addition to GLOW, CHTC supports HTCondor pools in several other UW departments, and there are also ongoing collaborations between HTCondor and other research groups. |
A Cyberinfrastructure for Automated Radiation Treatment Planning UW Collaborator: Leyuan Shi (Industrial Engineering) Funding: NSF | The number of cancer patients in the US treated with radiation is currently more than one-half million annually, and worldwide this number is greater than one million. Given the aging population and the development of increasingly effective radiation therapies, these numbers are expected to increase rapidly in coming years. The research focuses on the development of a cyberinfrastructure to support the development, integration, and dissemination of optimization methods for radiation treatment planning (RTP). Roughly speaking, the goal of RTP software is to construct treatment plans that approximate as closely as possible ideal treatments, in which a prescribed radiation dose is delivered to the tumor while the dose to non-cancerous tissues is kept below specified thresholds. The Internet provides an ideal medium for an information technology infrastructure that will support the coordination of disparate worldwide research efforts to develop effective approaches to the extremely difficult computational problems that arise in planning the safe and effective delivery of radiation therapies of ever-increasing complexity. |
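The flavor of the underlying optimization can be suggested with a deliberately tiny, hypothetical example: two beamlet weights, one tumor voxel, one organ-at-risk voxel, and a quadratic penalty on dose above the threshold. This is not the project's methodology, just a sketch of the "prescribed dose to the tumor, capped dose elsewhere" trade-off, with invented geometry and numbers:

```python
def plan(prescribed=60.0, limit=20.0, mu=10.0, step=0.05, iters=5000):
    """Toy fluence optimization over two beamlet weights w1, w2.
    Both beamlets deposit dose in the tumor voxel; beamlet 1 also passes
    through an organ-at-risk (OAR) voxel. Minimize squared deviation from
    the prescribed tumor dose plus a penalty on OAR dose above the limit."""
    w1 = w2 = 0.0
    for _ in range(iters):
        tumor = 1.0 * w1 + 1.0 * w2       # dose delivered to the tumor voxel
        oar = 0.8 * w1 + 0.1 * w2         # dose delivered to the OAR voxel
        over = max(0.0, oar - limit)      # amount of threshold violation
        g_t = 2.0 * (tumor - prescribed)  # gradient of the tumor-dose term
        g1 = g_t + 2.0 * mu * over * 0.8
        g2 = g_t + 2.0 * mu * over * 0.1
        w1 = max(0.0, w1 - step * g1)     # beamlet weights stay nonnegative
        w2 = max(0.0, w2 - step * g2)
    return w1, w2

w1, w2 = plan()
# Tumor dose ends near the prescription; OAR dose stays under the cap.
print(round(w1 + w2, 1), round(0.8 * w1 + 0.1 * w2, 1))
```

The optimizer shifts weight away from the beamlet that traverses the organ at risk, which is exactly the behavior the prose describes at toy scale.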
DRACO Parallelization and Tuning UW Collaborators: Paul Wilson and Greg Moses (Engineering Physics) Funding: DOE | The goal of this project is to parallelize DRACO, a critical hydrodynamics simulation code developed in the Engineering Physics Department. DRACO uses the arbitrary Lagrangian-Eulerian (ALE) technique to perform one-, two-, or three-dimensional simulations, and several physics modules, such as laser-energy deposition and radiation transport, have been incorporated into it. The Paradyn group has actively worked with the developers to help parallelize and tune this code to run on Linux clusters. |
Modeling and Simulation for Critical Infrastructure Protection UW Collaborators: Vicki Bier (Industrial Engineering & Engineering Physics), Pascale Carayon (Industrial Engineering), Thomas Kurtz (Math and Statistics), and Mary Vernon (CS) Funding: DOD | This project examines vulnerabilities in networked, interacting systems, particularly systems employing hybrid human, physical and informational architectures. |
Planning Under Uncertainty: Methods and Applications UW Collaborator: Andrew Miller (Industrial Engineering) Funding: DOD | This project is developing and applying powerful optimization technology to deal with problems of resource allocation in applied settings, with particular emphasis on problems with uncertain data. Areas of focus include: (1) applying optimization techniques to problems in radiation therapy for the treatment of cancer, with special attention paid to issues related to fractionation of delivery; (2) using optimization methods efficiently to solve various kinds of robust network design problems; and (3) developing and applying stochastic optimization methods for various facets of the air mobility optimization problem. We are also developing theoretical foundations and computational tools to solve these and related problems. The theoretical and computational aspects of this program relate to stochastic programming, in which decisions must be made under uncertainty, and integer programming, in which some decisions must be represented by integer-valued variables, and to the intersection of these two fields. While researchers have made considerable progress in recent years in applying optimization methods to a variety of problems in these two areas, there are many applications in both areas in which the current state of the art is insufficient to address decision support needs. |
Advanced Analysis of NMR Spectroscopy UW Collaborator: John Markley (Biochemistry) Funding: NIH | NMR is a technology that can be used to determine the structure of proteins in solution (the only other technique is crystallography). Spectroscopy is a step in the structure identification in which one studies signals acquired during the experiments and tries to find a set of "active frequencies". The goal of this project is to analyze and compare various existing algorithms, including recent nonlinear methods such as filter diagonalization and maximum entropy. Such representations can also substantially improve existing NMR spectroscopy techniques, either with regard to computational speed or signal/noise separation. More ambitiously, such representations may lead to innovative spectroscopic algorithms. Past experience suggests the effort will have feedback effects, with widespread ramifications within mathematics, and will stimulate new, intrinsic studies in this field. |
Image Compression for 4-D Biology Imagery UW Collaborator: John White (Molecular Biology) Funding: NSF | The focus of this project is the development of real-time, wavelet-based compression techniques that allow distant users to view, via the Internet, 4D imagery of an embryo's development captured with photon microscopes. |
MRI Reconstruction from Severely Undersampled Data UW Collaborators: Charles Mistretta (Medical School) Funding (to be submitted): NIH | The focus of this effort is to obtain MRI imagery of blood vessels. Due to technical limitations it is only possible to obtain imagery from a small number of angles from which the final image must be reconstructed. We are working on new reconstruction algorithms that will produce high-quality images even with substantial undersampling. |
EDAM UW Collaborator: James Schauer (Civil/Environmental Engineering and Atmospheric and Oceanic Sciences) Funding: NSF | The EDAM project is a collaborative effort between computer scientists and environmental chemists at Carleton College and UW-Madison. The goal is to develop data mining techniques for advancing the state of the art in analyzing atmospheric aerosol datasets. There is a great need to better understand the sources, dynamics, and compositions of atmospheric aerosols. The traditional approach for particle measurement, which is the collection of bulk samples of particulates on filters, is not adequate for studying particle dynamics and real-time correlations. This has led to the development of a new generation of real-time instruments that provide continuous or semi-continuous streams of data about certain aerosol properties. However, these instruments have added a significant level of complexity to atmospheric aerosol data, and dramatically increased the amounts of data to be collected, managed, and analyzed. In the EDAM project, we are investigating techniques for automatically labeling mass spectra from different kinds of aerosol mass spectrometers, and then analyzing and exploring the rich spatiotemporal information collected from multiple geographically distributed instruments. |
Structural Biology via Machine Learning and Visualization UW Collaborator: George Phillips (Biochemistry) Funding: NIH | One of the most time-consuming steps in determining a protein's structure via x-ray crystallography is interpretation of the electron density map. This can be viewed as a computer-vision problem, since a density map is simply a three-dimensional image of a protein. However, due to the intractably large space of conformations the protein can adopt, building a protein model to match the density map is extremely difficult. This project addresses the use of pictorial structures to build a flexible protein model from the protein's amino acid sequence. A pictorial structure is a way of representing an object as a collection of parts connected, pairwise, by deformable springs. Model parameters are learned from training data. Using an efficient algorithm to match the model to the density map, the most probable arrangement of the protein's atoms can be found in a reasonable running time. |
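For a simple chain of parts (rather than a full protein model), the spring-based matching described above can be solved exactly by dynamic programming. The sketch below is a generic, hypothetical pictorial-structures example with invented costs, not the project's algorithm:

```python
def match_chain(unary, k=1.0, rest=2):
    """Find the minimum-energy placement of a chain of parts.
    unary[i][x] is part i's mismatch cost at position x; consecutive parts
    are joined by a spring of rest length `rest` and stiffness k.
    Solved exactly by dynamic programming (Viterbi) over positions."""
    n, m = len(unary), len(unary[0])
    cost = list(unary[0])           # best energy for part 0 at each position
    back = []
    for i in range(1, n):
        new_cost, ptr = [], []
        for x in range(m):
            # Best predecessor position for the previous part in the chain.
            best = min(range(m),
                       key=lambda p: cost[p] + k * (x - p - rest) ** 2)
            new_cost.append(cost[best] + k * (x - best - rest) ** 2
                            + unary[i][x])
            ptr.append(best)
        cost, back = new_cost, back + [ptr]
    x = min(range(m), key=lambda p: cost[p])
    placement = [x]
    for ptr in reversed(back):      # trace the optimal chain backwards
        x = ptr[x]
        placement.append(x)
    return placement[::-1]

# Three parts over 10 positions; each part's unary cost prefers one spot.
unary = [[abs(x - p) for x in range(10)] for p in (1, 3, 5)]
print(match_chain(unary))  # [1, 3, 5]
```

The same unary-plus-spring energy generalizes from a chain to tree-structured part models, which is what makes the matching step tractable.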
Knowledge-intensive, Interactive and Efficient Relational Pattern Learning UW Collaborator: David Page (Biostatistics and Medical Informatics) Funding: DARPA | Link discovery is an important task in data mining for counter-terrorism and is the focus of DARPA's Evidence Extraction and Link Discovery (EELD) research program. Link discovery concerns the identification of complex relational patterns that indicate potentially threatening activities in large amounts of relational data. Most data-mining methods assume data is in the form of a feature-vector (a single relational table) and cannot handle multi-relational data. Inductive logic programming is a form of relational data mining that discovers rules in first-order logic from multi-relational data. This project applies ILP to learning patterns for link discovery. |
Training Bayesian Networks to Diagnose Breast Cancer UW Collaborators: Beth Burnside (Radiology) and David Page (Biostatistics and Medical Informatics) Funding: UW | We are developing a system based on machine learning and Bayesian statistics that can provide accurate automated imaging–histologic correlations to aid radiologists in assessing the concordance of mammographic findings with the results of imaging-guided breast biopsies. We are using Breast Imaging Reporting and Data System (BI-RADS) descriptors to convey the level of suspicion of mammographic abnormalities. Our system links BI-RADS descriptors with diseases of the breast using probabilities derived both from the literature and from actual cases. |
Training in Computation and Informatics in Biology and Medicine UW Collaborators: George Phillips, Director, (Biochemistry) and Fred Blattner (Genetics) Funding: NIH | The University of Wisconsin - Madison runs an interdisciplinary predoctoral and postdoctoral bioinformatics training program that supports 16 pre-doctoral and 4 post-doctoral students each year (plus 6 summer internships). The program's mission is to provide modern training for a new generation of researchers wishing to solve biomedical problems requiring strengths in both computational and biological science. Ph.D. students who are eligible for this interdisciplinary training include those majoring in computer sciences, statistics, genetics, biochemistry, engineering, and other computational areas and biological science disciplines from five colleges across campus. By participating in the Computation and Informatics Training Program, computer scientists, statisticians, and engineers receive cross-disciplinary training in biological sciences. Similarly, biologists receive cross-disciplinary training in statistics, engineering, and computational areas related to biomedical research problems. |
Connecting the Quantum Dots: Theory of Quantum Computing in a Solid-state Implementation UW Collaborators: E. Bach (CS), S. Coppersmith (Physics), M. Friesen (Material Sciences), R. Joynt (Physics). Funding: NSF | This project studies the theoretical aspects of a proposal for a physical implementation of a quantum computer that uses the spin of individual electrons trapped in potential wells within semiconductor material to represent a qubit. The promise of this approach is that, in contrast to currently existing implementations, scalability will not be an issue: once we can get a single nontrivial gate to work, classical semiconductor technology can be used to obtain much larger circuits. The challenge is to get two qubits to interact in a controlled way. We are also investigating other topics in quantum computation, such as quantum walks and algorithms for graph isomorphism. |
Quantum Information Processing with Two-dimensional Atomic Arrays UW Collaborators: M. Saffman (Physics), T. Walker (Physics). Funding: NSF | The goal of this project is to build a quantum computer with 32 qubits represented by the states of neutral atoms trapped in a two-dimensional optical grid. The physicists in the group expect single-gate operations to be feasible relatively soon, and expect the approach to be fairly scalable. The anticipated bottleneck is the laser power needed to create the optical lattice. |
Optimization in Model Predictive Control UW Collaborator: Jim Rawlings (Chemical Engineering) Funding: NSF | Control of the operation of large, complex industrial plants can no longer be performed efficiently by means of simple control devices and methodologies. The "control loop" must include computers and sophisticated algorithms. Optimization algorithms are particularly important in model predictive control, a modern methodology that requires the repeated solution of optimization problems in real time. |
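A minimal sketch of the receding-horizon idea, using an invented scalar system rather than a real plant model: at each time step a finite-horizon optimization problem is solved, only the first control is applied, and the problem is re-solved from the new state.

```python
def solve_horizon(x0, n=5, r=0.1, iters=200, step=0.05):
    """Solve the finite-horizon problem for the toy system x[k+1] = x[k] + u[k]
    with cost sum(x[k]^2 + r*u[k]^2), by gradient descent on the controls."""
    u = [0.0] * n
    for _ in range(iters):
        # Forward pass: states reached under the current control sequence.
        xs = [x0]
        for uk in u:
            xs.append(xs[-1] + uk)
        # Gradient with respect to each control: u[j] affects every later
        # state, so sum those contributions plus the control penalty.
        for j in range(n):
            g = 2.0 * r * u[j] + sum(2.0 * xk for xk in xs[j + 1:])
            u[j] -= step * g
    return u

def mpc(x0=5.0, steps=30):
    """Receding-horizon loop: re-solve at each step, apply only the first
    control, then repeat from the newly measured state."""
    x = x0
    for _ in range(steps):
        u = solve_horizon(x)
        x = x + u[0]   # in practice the plant moves; here we just simulate it
    return x

print(round(mpc(), 3))  # regulated close to 0
```

Re-solving from the measured state at every step is what gives MPC its feedback character, and it is why the repeated real-time optimization that the prose mentions is unavoidable.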
Optimization Algorithms in Statistics UW Collaborator: Grace Wahba (Statistics) Funding: NSF | Various problems arising in statistics require the formulation and solution of large optimization problems. Recent work includes kernel estimation from dissimilarity data, using semidefinite programming and second-order cone programming formulations, and identification of sparse bases from extremely large dictionaries, via minimization of log-likelihood functionals with nonsmooth regularization terms. In each case, modern optimization algorithms have led to highly efficient software for solving these problems. |
Extracting Background Knowledge from the Scientific Literature to Improve Xiaojin Zhu Collaborator: Mark Craven (Biostatistics and Medical Informatics) Funding: UW | Traditionally the structure of gene regulatory networks was inferred from
|
