Research Highlights

Below is a list of research highlights currently featured on the home page, along with items featured in the past.

Air Quality Balloon

Air Quality Modeling at CHPC

Collaboration between the Utah Division of Air Quality (UDAQ) and the University of Utah’s Center for High Performance Computing (CHPC) now gives the air quality modeling group at UDAQ access to advanced computing resources.  This cooperative agreement began with a request from the Utah office of the Bureau of Land Management (BLM) for consultation on air quality modeling to support environmental impact analysis needed to process natural gas drilling permits in the Uintah Basin.  This collaboration between UDAQ and CHPC is now a critical element in UDAQ’s ability to conduct air quality modeling for a wide variety of applications throughout the state, from the urbanized Wasatch Front to energy production areas of the Uintah Basin. 

HIPAA cluster

HIPAA-compliant Servers at CHPC

CHPC’s security expert Wayne Bradford, along with John Hurdle, Bernie LaSalle, and Julio Facelli at BMI, has published a case study of the HIPAA-compliant environment at CHPC. The study shows “how an HPC can be re-engineered to accommodate clinical data while retaining its utility in computationally intensive tasks such as data mining, machine learning, and statistics.” Access to CHPC’s secured servers requires two layers of authentication: first, users must have access to the CHPC virtual private network; second, users must be listed in the HPC network information service directory. Additional security is provided by housing the physical hardware in our data center, which has controlled room access. All CHPC employees and users of the secured servers are required to take the University’s Health Insurance Portability and Accountability Act training. The HIPAA-compliant environment is put to good use: Wayne reports that “in the first 3 years, researcher count has increased from 6 to 58.”

Read the full case study.

Mapping the Universe with CHPC Resources

The Sloan Digital Sky Survey makes use of the University of Utah's Center for High Performance Computing (CHPC) parallel computing resources to help with its mission to map the Universe, from our Galaxy and beyond. Building on fourteen years of discovery, the fourth phase of SDSS (SDSS-IV) recently had its first public data release, DR13. In SDSS-IV the survey expands its reach in three different ways:

  1. We observe millions of stars in both the Northern and Southern skies by including a second telescope in Chile. Now SDSS uses both the 2.5m SDSS telescope in New Mexico, and the 2.5m du Pont Telescope in Las Campanas, Chile.
  2. We observe millions of galaxies and quasars at previously unexplored distances to map the large-scale structure in the Universe 5 billion years ago, and to understand the nature of Dark Energy.
  3. We use new instrumentation to collect high resolution spectra within 10,000 nearby galaxies, to discover how galaxies grow and evolve over billions of years of cosmic history.

University of Utah astronomers are a core part of this international collaboration. Joel Brownstein, Professor of Physics and Astronomy, is the Science Archive Scientist, making sure that the SDSS data reduction pipelines run smoothly, and that the data products are easily accessible both within the team and publicly. Prof. Kyle Dawson and postdoctoral fellows are also involved, working on instrumentation to map the distant Universe.

Autism children

Autism Research within CHPC’s Protected Environment

The Utah Autism and Developmental Disabilities Monitoring Project (UT-ADDM), headed by Deborah Bilder, M.D. and William McMahon, M.D. in the Department of Psychiatry at the University of Utah’s School of Medicine, uses CHPC’s robust protected environment, which allows researchers working with protected health information (PHI) to gather, process, and store data while increasing productivity and compliance. In addition to high performance computing power, researchers using PHI benefit from CHPC handling systems management issues: rapid response to electrical power problems, reliable cooling and heating, VPN support for a work-anywhere computing experience, and a hardened, secure environment compared to office computers or departmental servers. For the institution, this resource improves compliance and reduces the risk of PHI exposure.


A New Role for Proteins

DNA encodes RNAs and RNAs encode proteins. This flow of cellular information is commonly referred to as the Central Dogma of Molecular Biology. However, a team of researchers discovered a notable exception to this rule, in which a protein directs the synthesis of another protein without an RNA template. This unusual mode of protein synthesis occurs only after normal protein synthesis has failed, and appears to send a distress signal to the cell that something has gone awry.

The researchers first detected template-free protein synthesis by visualizing it directly with a technique known as electron cryo-microscopy (cryo-EM). The image analysis, performed on the University of Utah Center for High Performance Computing cluster, required processing hundreds of thousands of 2D images to compute a 3D reconstruction of the cellular assembly. Once the researchers analyzed the structure and performed follow-up biochemical experiments, they knew they had stumbled upon an unexpected discovery. "In this case, we have a protein playing a role similar to that filled by messenger RNA," says Adam Frost, M.D., Ph.D., assistant professor at University of California, San Francisco (UCSF) and adjunct professor of biochemistry at the University of Utah, who led the research team. "I love this story because it blurs the lines of what we thought proteins could do." This work was featured in the January 2, 2015 issue of Science.


Uncovering the secrets of Darwin’s pigeons

Centuries of selective breeding have generated tremendous diversity among the 300+ breeds of domestic rock pigeon (Columba livia), an organism that Charles Darwin proposed as an exemplary model for evolution under selection. This variation gives us the opportunity to better understand the processes of vertebrate development, and to investigate how those processes evolve over time. Until recently, however, the specific molecular mechanisms responsible for phenotypic differences among pigeon breeds were a complete mystery.

To identify genetic changes responsible for novel traits among pigeon breeds, researchers from the Shapiro lab in the University of Utah Department of Biology utilized resources at CHPC to catalog genetic differences in the genomes of over 100 diverse pigeons, then compared those genomes to pinpoint which differences are most closely associated with specific traits. These analyses require high performance computing to process terabytes of genomic data. Without the large scale computing resources available at CHPC, identifying genetic differences among pigeons would take years rather than days. Through these studies, surprising links have been found between genes responsible for trait evolution in pigeons and genes responsible for genetic disorders in humans. These results help researchers better understand mechanisms of evolution on a level that Charles Darwin could only have imagined.
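
The core idea of associating genetic differences with traits can be sketched in a few lines: compare allele frequencies between birds that have a trait and birds that do not, and rank variant sites by how strongly the two groups differ. The genotypes and site names below are hypothetical, and the Shapiro lab's actual analyses use whole-genome statistical pipelines rather than this toy comparison.

```python
# Toy allele-frequency comparison between two groups of genomes.
# Genotypes are coded 0/1/2 (copies of the alternate allele) per variant site.
# Real analyses use whole-genome pipelines, not this sketch.

def allele_freq(genotypes):
    """Alternate-allele frequency for one site across diploid individuals."""
    return sum(genotypes) / (2 * len(genotypes))

def rank_sites_by_divergence(group_a, group_b):
    """Rank variant sites by absolute allele-frequency difference.

    group_a, group_b: dicts mapping site -> list of genotypes (0, 1, or 2).
    Returns site names sorted from most to least divergent between groups.
    """
    diffs = {
        site: abs(allele_freq(group_a[site]) - allele_freq(group_b[site]))
        for site in group_a
    }
    return sorted(diffs, key=diffs.get, reverse=True)

# Hypothetical genotypes at three sites for crested vs. plain-headed birds.
crested = {"siteA": [2, 2, 2, 1], "siteB": [0, 1, 0, 1], "siteC": [1, 1, 2, 0]}
plain   = {"siteA": [0, 0, 0, 1], "siteB": [0, 1, 1, 0], "siteC": [1, 0, 1, 2]}

print(rank_sites_by_divergence(crested, plain))  # siteA stands out
```

In real data, millions of such sites are scored at once, which is where the high performance computing comes in.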



Yellowstone Supervolcano

Imaging Magma Reservoir beneath Yellowstone Park

The supervolcano that lies beneath Yellowstone National Park is one of the world’s largest active volcanoes. University of Utah seismologists Fan-Chi Lin, Hsin-Hua Huang, Robert B. Smith and Jamie Farrell have used advanced seismic imaging techniques to develop a more complete view of the magma chamber beneath this supervolcano, extending the known range from 12 miles underground to 28 miles. For the study the researchers used new methods to combine the seismic information from two sources. Data from local quakes and shallower crust were provided by University of Utah Seismographic Stations surrounding Yellowstone. Information on the deeper structures was provided by the NSF-funded EarthScope array of seismometers across the US.

Their recent study, as reported in the May 15, 2015 issue of Science, reveals that along with the previously known upper magma chamber there is also a second, previously unknown reservoir that is deeper and nearly 5 times larger than the upper chamber, as depicted in the cross-section illustration, which cuts from the southwest to the northeast under Yellowstone. This study provides the first complete view of the plumbing system that supplies hot and partly molten rock from the Yellowstone hotspot to the Yellowstone supervolcano. Together these chambers have enough magma to fill the Grand Canyon nearly 14 times. Using resources at the Center for High Performance Computing, new 3D models are being developed to provide greater insight into the potential seismic and volcanic hazards presented by this supervolcano.


Computational Fluid Dynamic Simulation of a Novel Flash Ironmaking Technology

The U.S. steel industry needs a new technology that produces steel from iron ore with lower greenhouse gas emissions and energy consumption. At the University of Utah, Prof. Hong Yong Sohn and his team have conceived a radically new alternative, the Flash Ironmaking Technology, to replace the century-old blast furnace process. This new technology eliminates the highly problematic cokemaking and pelletization/sintering steps of current ironmaking processes by directly utilizing iron ore concentrates, which are abundant in the United States.

Using CHPC resources, the Sohn group is developing high-resolution computational fluid dynamics (CFD) simulations to select the optimal operating conditions for testing and subsequently reduce the testing time and effort. Simulation results will be used to interpret the data from the flash reactors. Also of high importance, the simulations will assist in the subsequent design of an industrial-scale pilot facility and an eventual full-scale commercial plant.

food mapping

An Analysis of Tobacco and Food Purchases

Professor John Hurdle, Biomedical Informatics,  has developed QualMART, a tool for helping grocery stores promote healthy eating habits for their customers.  To validate the effectiveness of this tool, the group conducted a study that compared tobacco purchases and the quality of food purchases.  They classified household grocery transactions in the Atlanta region, based on whether shoppers had ever purchased tobacco, and then applied their novel food purchase quality scoring measure to evaluate the household food environment. The master database with 15 months’ shopping activity from over 100,000 households nationally is housed on a HIPAA-compliant cluster at CHPC (accessed via Swasey).

The graphic shows that the difference between ‘ever’ and ‘never’ tobacco-purchasing groups is significant, with green areas indicating higher food quality scores and grey and red indicating lower scores, aggregated by the zip code of the grocery shopping location.

This study validated the group's data-driven grocery food quality scoring design as the findings reproduce results from other studies in the scientific literature showing that tobacco users have lower overall diet quality compared to people who do not use tobacco.    


Prediction of Crystal Structures from First Principle Calculations

Using CHPC resources, a team of researchers from the University of Utah and the University of Buenos Aires has demonstrated that it is possible to predict the crystal structures of a biomedically relevant molecule using first-principles calculations alone. The results on glycine polymorphs shown in the figure were obtained using the genetic algorithm search implemented in Modified Genetic Algorithm for Crystals (MGAC), coupled with the local optimization and energy evaluation provided by Quantum Espresso. All three ambient-pressure stable glycine polymorphs were found, in the same energetic ordering as observed experimentally. The agreement between the experimental and predicted structures is of such accuracy that they are visually almost indistinguishable.
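
The overall shape of a genetic-algorithm structure search can be illustrated with a toy example that evolves candidate parameter vectors against a stand-in energy function. Everything here is a simplified sketch: the real MGAC workflow evaluates trial crystal structures with density functional theory energies from Quantum Espresso, not an analytic function.

```python
import random

# Toy genetic-algorithm energy minimization. A "structure" is just a pair of
# lattice-like parameters; the stand-in energy surface below replaces the
# expensive DFT evaluation used in the actual MGAC workflow.

def energy(x):
    # Stand-in energy surface with its minimum at (1.0, 2.0).
    return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

def evolve(pop_size=40, generations=60, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5), rng.uniform(-5, 5)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=energy)               # rank candidates by energy
        survivors = pop[: pop_size // 2]   # selection: keep the fittest half
        children = []
        for _ in range(pop_size - len(survivors)):
            a, b = rng.sample(survivors, 2)
            child = [(a[i] + b[i]) / 2 for i in range(2)]   # crossover
            child = [c + rng.gauss(0, 0.1) for c in child]  # mutation
            children.append(child)
        pop = survivors + children
    return min(pop, key=energy)

best = evolve()
print(best, energy(best))
```

Because survivors are carried over unchanged, the best candidate never worsens, and over many generations the population converges toward the global minimum, which is the same logic that lets MGAC locate low-energy polymorphs.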

The ability to accomplish this goal has far reaching implications well beyond just intellectual curiosity.  Crystal structure prediction can be used to obtain an understanding of the principles that control crystal growth.  More practically, the ability to successfully predict crystal structures and energetics based on computation alone will have a significant impact in many industries for which crystal structure and stability plays a critical role in product formulation and manufacturing, including pharmaceuticals, agrochemicals, pigments, dyes and explosives.

Lund AM, Pagola GI, Orendt AM, Ferraro MB, Facelli JC (2015). Crystal structure prediction from first principles: The crystal structure of glycine. Chemical Physics Letters, 626, 20-24.

Figure 1: Snapshots from simulations of two types of nanomaterials. (a) A highly porous metal-organic framework (ZIF-8), consisting of Zn ions (yellow spheres) and methylimidazolate linkers (nitrogen atoms are colored blue, carbon atoms are colored gray, hydrogen atoms are not shown). (b) A superstructure formed from octahedral silver nanocrystals. The pink frame indicates the boundaries of the simulated region. A few nanocrystals are colored yellow and blue to highlight features of the complex structure they form.

Watching Nanomaterials Assemble at CHPC

By Prof. Michael Grünwald, Grünwald Research Group, Department of Chemistry

My son and I like to build remote control cars. The path that leads from a disordered pile of plastic parts and metal screws to a new race car is straightforward and fun: step after step, we collect the pieces that need to be assembled and put them together according to the instructions. In fact, this assembly strategy is the blueprint for much human building activity and applies almost generally to the construction of houses, machines, furniture (in particular the Swedish kind), and many other objects of our daily lives.

Large objects, that is. Building small things, as it turns out, requires a strikingly different approach. Consider, for instance, the "objects" illustrated in Figure 1: A porous crystal structure made from intricately arranged metal ions and organic molecules (a "metal-organic framework"), and an ordered arrangement of nanoparticles (a "superstructure"), which themselves consist of many thousands of atoms. These structures are examples of "nanomaterials", objects that derive their unusual properties from their fascinating microscopic structure. Because of their large pores, metal-organic frameworks like the one in Figure 1a can be used to store hydrogen gas, filter CO2, or separate molecules by shape. Depending on the kinds of nanoparticles used, superstructures such as the one in Figure 1b can be used to alter the direction of light, or act as new kinds of solar cells.

Read the full article in the newsletter

Interactions of Insects and Plants

Cataloging the Interactions of Insects and Plants

By Thomas Kursar, Department of Biology

For our NSF-funded project, “Dimensions: Coexistence, Herbivore Host Choice, and Plant-Herbivore Evolution in the Recently Radiated and Speciose Neotropical Tree Genus, Inga,” we developed a data repository that is based at CHPC. Our NSF project addresses critical, long-standing questions in tropical ecology: the origin and maintenance of diversity. We focus on the herbivorous insects that consume leaves and the defenses of plants against these invertebrates. We hypothesize that the diversity among plant species in their anti-herbivore defenses is exceedingly high. If each species has a distinct defensive niche, this could explain how tropical forests maintain so many coexisting species. Additionally, we hypothesize that the arms race between plants and herbivores may drive divergence and speciation, thus explaining the origin of diversity in tropical rainforests. Our repository supports this project by storing data and images on plants, herbivores, and the toxic metabolites that plants make in order to defend themselves.

Multiscale Modeling of Anion-exchange Membrane for Fuel Cells

By Jibao Lu, Liam Jacobson, Justin Hooper, Hongchao Pan, Dmitry Bedrov, and Valeria Molinero, University of Utah; Kyle Grew and Joshua McClure, US Army Research Laboratory; and Wei Zhang and Adri Duin, Pennsylvania State University

To our knowledge, this is the first coarse-grained (CG) model that explicitly includes each water molecule and ion, and that accounts for hydrophobic, ionic, and intramolecular interactions explicitly parameterized to reproduce multiple properties of interest for hydrated polyelectrolyte membranes. The CG model of polyphenylene oxide/trimethylamine is about 100 times faster than the reference atomistic GAFF model. The strategy implemented here can also be used to parameterize CG models for other substances, such as biomolecular systems and membranes for desalination, water purification, and redox flow batteries. We anticipate that the large spatial and temporal simulations made possible by the CG model will advance the quest for anion-exchange membranes with improved transport and mechanical properties.

Analyzing and Predicting Stream Properties

By Milada Majerova and Bethany Neilson, Utah Water Research Laboratory, Utah State University

The stream temperature regime is an important and very complex component of habitat quality. Introducing beaver dams into a stream system changes its hydraulic properties, making these processes even more complicated and difficult to predict. Beaver dams increase spatial and temporal variability in temperature and flow, and increase baseflow and groundwater gains during summer months. This variability could play an important role for fish and other aquatic organisms under changing conditions, as summers are predicted to become hotter and longer with less precipitation throughout the year. Stream temperature quantification and modeling thus become essential tools for understanding, predicting, and managing our stream systems. CHPC resources play an indispensable role in the effort to model and predict stream hydraulic properties and temperature variability.

Clean Coal: Powered by Exascale

By Philip J. Smith and Michal Hradisky, CCMSC

The mission of the Carbon-Capture Multidisciplinary Simulation Center (CCMSC) at the University of Utah is to demonstrate the use of exascale uncertainty quantification (UQ) predictive simulation science to accelerate deployment of low-cost, low-emission electric power generation to meet the growing energy needs in the United States and throughout the world. The two main objectives, advancing simulation science to exascale with UQ-predictivity in real engineering systems and use of high-performance computing (HPC) and predictive science to achieve a societal impact, are linked together through an overarching problem: simulation of an existing 1,000 MW coal-fired ultra-supercritical (USC) boiler and simulation of a design 500 MW oxy-coal advanced ultra-supercritical (AUSC) boiler.

Read the full article in the newsletter

Changes in Neuronal Membrane Properties Lead to Suppression of Hippocampal Ripples

By Eric D. Melonakos, John A. White, and Fernando R. Fernandez, Department of Bioengineering

Center for High Performance Computing resources were used to study the effects of cholinergic inputs to the hippocampus on patterns of brain activity.

Ripples (140–220 Hz) are patterns of brain activity, seen in the local field potential of the hippocampus, that are important for memory consolidation. Cholinergic inputs to the hippocampus from neurons in the medial septum-diagonal band of Broca cause a marked reduction in ripple incidence as rodents switch from memory consolidation to memory encoding behaviors. The mechanism for this disruption in ripple power is not fully understood. Among the major effects of acetylcholine (or carbachol, a cholinomimetic) on hippocampal neurons are 1) an increase in membrane potential, 2) a decrease in the size of the spike afterhyperpolarization (AHP), and 3) an increase in membrane resistance. Using an existing model of hippocampal ripples that includes 5000 interconnected neurons (Brunel and Wang, 2003), we manipulated these parameters and observed their effects on ripple power. Shown here, the network firing rate and ripple power of the original model (top row; pyramidal neuron data in red, interneuron data in black) undergo marked changes following a decrease in pyramidal neuron AHP size, as well as an increase in the membrane voltage of both types of neurons. These changes could be the means whereby cholinergic input suppresses hippocampal ripples.

Tackling Large Medical Genomics Datasets

By Barry Moore, USTAR Center for Genetic Discovery

The University of Utah has a long and rich history of genetic research that spans decades and has led to the discovery of over 30 genes linked to genetic disease. These Utah discoveries range from relatively common and well-known heritable disease, such as breast cancer linked to BRCA1/BRCA2 genes, to the truly obscure Ogden syndrome, which in 2010 became the first new genetic disease to be described based on genome sequencing. The Utah Genome Project (UGP), together with clinical investigators across the University of Utah, is continuing this tradition of cutting edge genetic research in Utah by launching several large medical genomics projects over the last year. The USTAR Center for Genetic Discovery (UCGD)—the computational engine for the UGP—has partnered with the University’s Center for High Performance Computing (CHPC) to tackle the massive datasets and the large scale computing requirements associated with these projects.

Read the full article in the newsletter

Role of Stacking Disorder in Nucleation, Growth and Stability of Ice

By Laura Lupi, Arpa Hudait, Baron Peters, and Valeria Molinero, Department of Chemistry

Accurate forecasts of changes in weather and climate rely on accurate predictions of the properties of clouds. Rates of ice nucleation in the temperature ranges relevant for the atmosphere are usually based on extrapolations using classical nucleation theory (CNT), which assumes that the structure of nanometer-sized ice nuclei corresponds to that of bulk hexagonal ice. Here we use molecular dynamics simulations and free energy calculations to show that stacking-disordered ice is the stable phase for critical-sized ice nuclei. This finding implies nucleation rates over three orders of magnitude higher than CNT predictions and should have a strong impact on climate models.
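
For context on why the structure of the nucleus matters so much, classical nucleation theory relates the rate to a free-energy barrier. The standard CNT expressions (textbook forms, not equations taken from this study) are:

```latex
J = A \exp\!\left( -\frac{\Delta G^{*}}{k_{B} T} \right),
\qquad
\Delta G^{*} = \frac{16 \pi \gamma^{3}}{3\, \rho_{s}^{2}\, |\Delta\mu|^{2}}
```

where A is a kinetic prefactor, γ the ice-liquid interfacial free energy, ρ_s the number density of the solid, and Δμ the chemical potential difference driving freezing. Because the barrier enters the rate exponentially, stabilization of critical nuclei by stacking disorder lowers ΔG* and raises the predicted rate J by orders of magnitude.
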
Niwot Ridge, Colorado

Understanding the Carbon Cycle Through Climate Models

By Brett Raczka, Department of Biology

Land surface models are useful tools to quantify contemporary and future climate impact on terrestrial carbon cycle processes, provided they can be appropriately constrained and tested with observations. Stable carbon isotopes of CO2 offer the potential to improve model representation of the coupled carbon and water cycles because they are strongly influenced by stomatal function. Recently, a representation of stable carbon isotope discrimination was incorporated into the Community Land Model component of the Community Earth System Model. Here, we tested the model's capability to simulate whole-forest isotope discrimination in a subalpine conifer forest at Niwot Ridge, Colorado, USA.

Read the paper in Biogeosciences

Data Assimilation for Improving WRF Performance in Simulating Wintertime Thermal Inversions in the Uintah Basin

By Trang Tran, Huy Tran, and Erik Crosman, Utah State University and University of Utah

Meteorological models for simulating atmospheric properties (e.g., temperature and wind) during thermal inversions are important for simulating winter ozone pollution in the Uintah Basin. The Weather Research and Forecasting (WRF) meteorological model supports "observational nudging," i.e., a technique in which the model is biased to conform to available observational data. We recently performed two WRF simulations, one nudged with temperature, wind field, and humidity data, and one without nudging, for the period of Jan 16 to Feb 9, 2013. Contrary to expectations, the nudged model produced an unrealistic inversion structure that was too intense and shallow, confining most pollutants to a shallow layer at the bottom of the Basin. The non-nudged WRF model, on the other hand, produced a weaker, deeper inversion layer and too much vertical mixing.

Tracking Pressure Features

By Alexander Jacques, MesoWest/SynopticLabs and Atmospheric Sciences

Center for High Performance Computing resources were used to model the progression of a mesoscale gravity wave generated by a large storm system on April 26–27, 2011.

A mesoscale gravity wave, generated by a large storm system in the southern United States, moved northward through the central United States causing short-term changes in surface wind speed and direction. This animation shows efforts to detect and evaluate the negative mesoscale surface pressure perturbation generated by this wave. Detected positive (red contours) and negative (blue contours) perturbations are determined from perturbation analysis grids, generated every 5 minutes, using USArray Transportable Array surface pressure observations (circle markers). Best-track paths for the perturbations are shown via the dotted trajectories. To identify physical phenomena associated with the perturbations, conventional radar imagery was also leveraged. It can be seen here that the detected feature migrates north away from the majority of the precipitation, which is often seen with mesoscale gravity wave features.
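
The perturbation grids described above depend on separating slow synoptic pressure changes from short-period fluctuations. A minimal sketch of that idea, removing a running mean from a single station's pressure series, is shown below; the station trace and window length are synthetic stand-ins for the bandpass-style analysis applied to the full USArray network.

```python
# Sketch: extract short-period pressure perturbations at one station by
# removing a running-mean background. A stand-in for the filtering used in
# the actual USArray perturbation analyses.

def running_mean(series, window):
    """Centered running mean; window should be odd."""
    half = window // 2
    out = []
    for i in range(len(series)):
        lo, hi = max(0, i - half), min(len(series), i + half + 1)
        out.append(sum(series[lo:hi]) / (hi - lo))
    return out

def perturbations(series, window=11):
    """Series minus its slowly varying background."""
    background = running_mean(series, window)
    return [p - b for p, b in zip(series, background)]

# Synthetic 5-minute pressure trace (hPa): a slow trend plus a brief dip.
pressure = [1000.0 + 0.01 * t for t in range(60)]
for t in range(25, 30):
    pressure[t] -= 2.0  # passing negative perturbation

pert = perturbations(pressure)
print(min(pert))  # strongly negative during the dip
```

The gridded version of this calculation, repeated every 5 minutes across hundreds of stations, is what produces the red and blue contours in the animation.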

Understanding Wind Energy

By Gerard Cortina and Marc Calaf, Wind Energy & Turbulence, Department of Mechanical Engineering

The Wind Energy and Turbulence laboratory was established to improve the current understanding of wind energy harvesting. To achieve this goal we dedicate much of our effort to developing new knowledge of the turbulent atmospheric boundary layer. Our focus is on high-resolution numerical simulations, run with the help of the Center for High Performance Computing at the University of Utah, which we complement with the analysis of experimental data.

Currently we mainly use Large Eddy Simulations, which are capable of resolving most of the atmospheric turbulent scales as well as the wind turbines, and which give very good results when compared with experimental data. We are highly interested in improving the current understanding of land-atmosphere energy exchanges, and our work strives to fill the gaps in that understanding. Only by properly capturing the land-atmosphere coupling that forces the atmospheric flow aloft will we be able to reproduce the atmospheric flow with high accuracy.

Tracking Pressure Perturbations Resulting From Thunderstorm Complexes

By Alexander Jacques, MesoWest/SynopticLabs and Atmospheric Sciences

Two strong thunderstorm complexes moved across the north-central plains of the United States late on August 11 into August 12, 2011. This animation shows research efforts to detect and evaluate large mesoscale surface pressure perturbation features generated by these complexes. The detected positive (red contours) and negative (blue contours) perturbations are determined from perturbation analysis grids, generated every 5 minutes, using USArray Transportable Array surface pressure observations (circle markers). Best-track paths for perturbations are shown via the dotted trajectories. To identify physical phenomena associated with the perturbations, conventional radar imagery was also leveraged to identify regions of thunderstorm and precipitation activity. It can be seen here that the two distinct thunderstorm complexes are co-located with several of the detected pressure perturbation features.

Modeling Ozone Concentration

By Brian Blaylock, Department of Atmospheric Sciences

A strong lake breeze with impact on air quality was observed on 18 June 2015 in the Salt Lake Valley. The progression of this lake breeze was simulated using the Weather Research and Forecasting (WRF) model, initialized with hourly analyses from the High Resolution Rapid Refresh model. Shown in the videos above are the concentrations of atmospheric tracers released near the surface at the north (red) and south (blue) ends of the Salt Lake Valley. Tracers are released every time step from the source regions and then transported by the wind field. The development and passage of the simulated lake breeze is recognizable in the simulation on 18 June 2015 at 1830 UTC.
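
The tracer transport described above can be illustrated with a one-dimensional upwind-advection sketch, in which a tracer is released at one grid cell every time step and carried downwind. The grid spacing, wind speed, and source strength below are arbitrary stand-ins, not values from the WRF simulation.

```python
# Minimal 1-D upwind advection of a tracer released continuously at one
# grid cell, as a sketch of how simulated tracers are carried by the wind.

def advect(n_cells=50, steps=100, u=1.0, dx=1.0, dt=0.5, src_cell=5, src_rate=1.0):
    c = [0.0] * n_cells
    cfl = u * dt / dx                 # must be <= 1 for a stable update
    assert cfl <= 1.0
    for _ in range(steps):
        c[src_cell] += src_rate * dt  # continuous release each time step
        # First-order upwind update for wind blowing toward higher indices.
        new = c[:]
        for i in range(1, n_cells):
            new[i] = c[i] - cfl * (c[i] - c[i - 1])
        c = new
    return c

conc = advect()
print(f"downwind concentration at cell 20: {conc[20]:.2f}")
```

The real model does this in three dimensions with a time-varying wind field, but the essential picture, a plume that stays downwind of its source, is the same.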
Comparative genomics and signatures of social behavior in bees

Genomic Insights Through Computation

By Karen Kapheim, Kapheim Lab, Utah State University

The primary focus of research in the Kapheim Lab is understanding how social behavior evolves in bees. We take an integrative approach to this question, merging ecology, behavior, neuroscience, comparative genomics, and molecular biology. We conduct experiments in the field with live bees, process samples in our molecular biology lab, and then analyze the sequence data using the CHPC. Ongoing projects include using metabarcoding to characterize the role of the microbiome in the social behavior and health of bees. We have sequenced a portion of the bacterial 16S rRNA gene in DNA extracted from the guts of bees during various life stages, and we are processing these sequences on the CHPC. As a side project, we are applying similar computational methods to metabarcodes sequenced from the guts of carrion flies to characterize the mammal community on a tropical island where we work. Other projects involve comparative genomics of bee genomes to look for signatures of evolutionary transitions between solitary and social lifestyles. We are also using the CHPC to analyze microRNA expression differences among bees that vary in social behavior, and in response to hormone treatments. In each of these projects the CHPC staff and resources have been extremely valuable, as genomic datasets are particularly large and these analyses would not be possible on desktop computers.

Modeling the Unexpected Formation of a Gyroid

By Carlos Chu-Jon, Grünwald Research Group, Department of Chemistry

You mix lemon, water, and sugar, and you expect lemonade, not cider. Here we show the unexpected formation of a gyroid from the components that make up the porous metal-organic framework ZIF-8. Although the formation of this structure was not our original intent, its geometric intricacy and simple beauty make it a worthwhile specimen.

Quantifying Contributions from Natural and Non-local Sources to Uintah Basin Ozone

By Huy Tran, Seth Lyman, Trang Tran, and Marc Mansfield, Bingham Entrepreneurship & Energy Research Center

Ozone in the lowest layer of the atmosphere (the troposphere) results in large part from human activity: Pollutants already present in the atmosphere are converted to ozone by the action of sunlight. However, there are also natural sources of ozone, such as wildfires and a phenomenon known as a "stratospheric intrusion," when strong vertical mixing pulls ozone from the stratospheric ozone layer down to the surface. Using the GEOS-Chem global chemical model, we have successfully demonstrated that a stratospheric ozone intrusion event occurred on June 8–9, 2015, which caused surface ozone in the Uintah Basin to exceed the 70-ppb national standard. We have also identified many other cases in which natural or non-local sources contributed a large portion of the surface ozone in the Basin, especially during spring and summer, although at levels not exceeding the national standard. The ability to distinguish human-caused local, human-caused non-local, and natural ozone events is important for planning and evaluating ozone mitigation strategies.

Linking Frost Timing to Circulation Patterns

Atmospheric sciences professor Courtenay Strong and Gregory McCabe of the United States Geological Survey studied how frost timing (specifically, the lengthening of the frost-free season) is influenced by global warming and local atmospheric circulation by utilizing objective-clustering algorithms and optimization techniques. By discovering the circulations responsible for frost timing in different climatic regions of the conterminous United States, they found that atmospheric circulation patterns account for between 25 and 48 percent of variation in frost timing.
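
As an illustration of the clustering step, a minimal k-means over synthetic two-dimensional "circulation pattern" vectors is sketched below. The data and distance measure are stand-ins; the study's objective-clustering and optimization techniques operate on real circulation fields and are considerably more sophisticated.

```python
import random

# Minimal k-means sketch: group days by their (toy) circulation-pattern
# vectors. The actual study's objective clustering of circulation fields
# is far more involved; this only illustrates the idea.

def kmeans(points, k, iters=20, seed=1):
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centers[i])))
            clusters[j].append(p)
        # Move each center to the mean of its cluster.
        for i, cl in enumerate(clusters):
            if cl:
                centers[i] = tuple(sum(coord) / len(cl) for coord in zip(*cl))
    return centers, clusters

# Two synthetic "regimes": one family of days near (5, 5), another near (-5, -5).
data_rng = random.Random(7)
days = [(5 + data_rng.uniform(-1, 1), 5 + data_rng.uniform(-1, 1)) for _ in range(20)]
days += [(-5 + data_rng.uniform(-1, 1), -5 + data_rng.uniform(-1, 1)) for _ in range(20)]

centers, clusters = kmeans(days, k=2)
print(sorted(round(c[0]) for c in centers))  # roughly [-5, 5]
```

Once days are grouped into circulation regimes like this, one can ask how frost dates vary across regimes, which is the spirit of the published analysis.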

Read the paper in Nature Communications or read the article in UNews

Sea level pressure analysis from the operational High Resolution Rapid Refresh at 1 PM March 14, 2017 with unusually low pressure associated with a major New England snowstorm

Efficient Storage and Data Mining of Atmospheric Model Output

By Brian Blaylock and John Horel, Department of Atmospheric Sciences

Our group … purchased 30TB in CHPC’s pando [archive storage] system to test its suitability for several research projects. We have relied extensively over the years on other CHPC storage media such as the tape archive system and currently have over 100TB of network file system disk storage. However, the pando system is beginning to meet several of our interwoven needs that are less practical using other data archival approaches: (1) efficient expandable storage for thousands of large data files; (2) data analysis using fast retrieval of user selectable byte-ranges within those data files; and (3) the ability to have the data accessible to the atmospheric science research community.
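
The byte-range retrieval in item (2) can be sketched as follows: given an index that records where each field starts inside a large model-output file, a client can compute the span of one field and request only those bytes. The index format and numbers below are hypothetical stand-ins, loosely modeled on GRIB2 ".idx" files.

```python
# Sketch of byte-range access: given an index of record start offsets
# within a large model-output file, compute the span of one field and the
# HTTP Range header needed to fetch only those bytes. The index format is
# a hypothetical stand-in, loosely modeled on GRIB2 ".idx" files.

def parse_index(text):
    """Parse 'record:offset:field' lines into a list of (field, offset)."""
    records = []
    for line in text.strip().splitlines():
        _, offset, field = line.split(":")
        records.append((field, int(offset)))
    return records

def byte_range(records, field, file_size):
    """Return the (start, end) byte span of `field`; end is exclusive."""
    for i, (name, offset) in enumerate(records):
        if name == field:
            end = records[i + 1][1] if i + 1 < len(records) else file_size
            return offset, end
    raise KeyError(field)

index_text = """\
1:0:TMP
2:52000:UGRD
3:98500:VGRD
"""

records = parse_index(index_text)
start, end = byte_range(records, "UGRD", file_size=150000)
header = {"Range": f"bytes={start}-{end - 1}"}
print(header)  # {'Range': 'bytes=52000-98499'}
```

Because object stores honor standard HTTP Range requests, a client can pull a single field out of a multi-hundred-megabyte file without downloading the rest, which is what makes the analysis workflows described above fast.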

The CHPC pando storage archive has made it possible for us to efficiently archive, access, and analyze a large volume of atmospheric model output. Several researchers outside the University of Utah have already discovered its utility in the short time that the archive has been available.

Read the full article in the newsletter

Modeling Pollution in Utah's Valleys

By Christopher Pennell, Utah Division of Air Quality

The Utah Division of Air Quality simulated a high pollution episode that occurred during the first eleven days of January, 2011. Using CHPC resources, we produced a high resolution, hourly animation showing when levels of fine particulate matter (PM2.5) far exceeded federal standards in Northern Utah.

Air pollution builds up during the day with the onset of sunlight and human activity. Pollution levels greatly decrease in the late evening, except when a persistent temperature inversion becomes established in Utah’s valleys. As inversion conditions persist, air pollution steadily accumulates over several days, triggering public health concerns. We are left waiting for a strong winter storm that can destroy surface air stability and bring in fresh, clean air.

Our pollution modeling not only accounts for human activity, but also for the mechanisms that make particulate pollution from emitted gases. The computational power provided by CHPC allows the State of Utah to model the complex relationship between meteorology, human activity, and air chemistry with impressive precision.