
2018 Spotlights


Demand for nuclear power has increased steadily since the first commercial nuclear power station came online in the 1950s. Nuclear energy is now responsible for approximately 11 percent of all power generated in the world. That growing reliance has placed greater emphasis on research related to nuclear power generation, specifically the materials used in and around reactors and the waste those materials generate.

Safe disposal of nuclear waste is of primary concern, given its potential effects on the environment and surrounding communities should a leak occur. One of the methods used to combat this possibility is vitrification—the conversion of waste into a stable glass. The glass serves as a safe containment method for the spent fuel over hundreds of thousands of years. However, the effects of radiation, one of the key drivers of microstructural evolution of these types of glass, are not fully understood. This is where Maik Lang, associate professor of nuclear engineering and Pietro F. Pasqua Fellow, and his JDRD team come in.

The goal of Lang’s JDRD project is to develop an understanding of the structure of these glasses using ORNL’s Spallation Neutron Source (SNS) and Integrated Computational Environment–Modeling and Analysis for Neutrons, or ICE-MAN. Now in its second year, Lang’s team has pivoted to focus its modeling efforts on the nuclear fuel itself.

“In the renewal, we shifted away from glasses to nuclear fuel materials,” he said. “The overall goal is still to understand the glass structure, but we think we first have to understand defect formation in crystalline materials. Once we have a good handle on that, we can move on to the more complicated glass system, which is an aperiodic material.”
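
Lang’s actual analysis runs through neutron-scattering data and the ICE-MAN platform, but the crystalline-versus-aperiodic distinction he describes can be illustrated with a standard structural fingerprint: the pair distribution function g(r). The Python sketch below is illustrative only and assumes nothing about ICE-MAN itself; a crystal produces sharp g(r) peaks at its lattice spacings, while a glass produces broad peaks that wash out at long range.

```python
# Illustrative sketch: g(r) is one standard way to distinguish a periodic
# crystal from an aperiodic glass. All names and parameters are assumptions
# for demonstration, independent of the ICE-MAN platform.
import numpy as np

def pair_distribution(positions, box_length, n_bins=200):
    """g(r) for atoms in a cubic periodic box (positions: (N, 3) array)."""
    n = len(positions)
    r_max = box_length / 2                    # valid range under periodicity
    edges = np.linspace(0.0, r_max, n_bins + 1)
    counts = np.zeros(n_bins)
    for i in range(n - 1):
        d = positions[i + 1:] - positions[i]
        d -= box_length * np.round(d / box_length)  # minimum-image convention
        r = np.linalg.norm(d, axis=1)
        counts += np.histogram(r[r < r_max], bins=edges)[0]
    # Normalize each spherical shell against the ideal-gas expectation.
    rho = n / box_length**3
    shell_volumes = (4.0 / 3.0) * np.pi * (edges[1:]**3 - edges[:-1]**3)
    g = 2.0 * counts / (n * rho * shell_volumes)
    return 0.5 * (edges[:-1] + edges[1:]), g

# A random (gas-like) configuration gives g(r) near 1 at all distances.
rng = np.random.default_rng(0)
r, g = pair_distribution(rng.uniform(0, 10, size=(200, 3)), box_length=10)
```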

Lang’s team is working alongside ORNL’s Anibal Ramirez-Cuesta, Chemical Spectroscopy Group leader, and Matt Tucker, Diffraction Group leader, to further develop the ICE-MAN platform and to use it to decode their data. In turn, Lang’s data is serving as a test bed for the platform itself, setting the stage for future collaborations.


Machine learning algorithms allow a computer to learn and make predictions based on existing data, and they are already part of many people’s everyday lives. Netflix, for example, uses machine learning algorithms to make recommendations based on the viewing habits of its approximately 125 million subscribers. Machine learning is becoming more widespread in a variety of arenas, including health care and finance. Steven Johnston, assistant professor of physics and astronomy, thinks it can also be applied to physics.

Johnston works with quantum Monte Carlo simulations. These simulations start from a configuration or set of parameters and repeatedly propose random changes, accepting or rejecting each one in the search for the best arrangement.
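
As a rough illustration of that propose-and-decide loop, here is a minimal Metropolis-style sketch using a toy classical spin chain. Johnston’s actual work involves quantum Monte Carlo, where the accept/reject test is far more expensive; the function names and the energy model below are illustrative assumptions only.

```python
# Minimal Metropolis-style loop: propose a random change, then accept or
# reject it based on an energy calculation. Toy classical model, not
# Johnston's quantum Monte Carlo code.
import math
import random

def metropolis_step(config, energy, temperature):
    """Propose flipping one spin; accept or reject the proposal."""
    trial = config.copy()
    trial[random.randrange(len(trial))] *= -1   # the proposed random change
    delta = energy(trial) - energy(config)      # the expensive part in practice
    if delta <= 0 or random.random() < math.exp(-delta / temperature):
        return trial                            # accept
    return config                               # reject

def ising_energy(spins):
    """Toy energy: a 1D Ising chain with nearest-neighbor coupling."""
    return -sum(spins[i] * spins[i + 1] for i in range(len(spins) - 1))

spins = [random.choice([-1, 1]) for _ in range(100)]
for _ in range(10_000):
    spins = metropolis_step(spins, ising_energy, temperature=2.0)
```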

“The difficulty is that deciding whether or not you accept that proposed change is incredibly expensive computationally,” he said. “We need to find a way to do this cheaper and faster.”

Johnston’s JDRD team is using machine learning to train a neural network to guess whether a proposed change will be accepted. Benchmark tests run by his team show the machine learning approach completing calculations significantly faster than conventional methods.
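
The sketch below conveys that surrogate idea in the simplest possible terms: a cheap learned estimate screens proposals so the costly test runs only on promising ones. It is a toy, not Johnston’s code; published surrogate-assisted Monte Carlo schemes also add a correction step, omitted here, to keep the sampled distribution exact.

```python
# Surrogate screening sketch: a cheap model filters proposals before the
# expensive Metropolis test. Illustrative assumptions throughout.
import random

def propose_flip(config):
    """Propose a random change: flip one spin."""
    trial = config.copy()
    trial[random.randrange(len(trial))] *= -1
    return trial

def screened_step(config, expensive_accept, surrogate_prob, threshold=0.1):
    """Run the costly acceptance test only on proposals the model likes."""
    trial = propose_flip(config)
    if surrogate_prob(config, trial) < threshold:
        return config                       # cheap rejection, no full test
    if expensive_accept(config, trial):     # the expensive part, now run rarely
        return trial
    return config

# Stand-ins for demonstration; a real surrogate would be a trained network.
spins = [random.choice([-1, 1]) for _ in range(100)]
always_plausible = lambda old, new: 0.9
coin_flip_test = lambda old, new: random.random() < 0.5
spins = screened_step(spins, coin_flip_test, always_plausible)
```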

“The neural network approach can do in about six hours what used to take five or six days. The idea is now that we can do it faster, we can make the problem bigger and use the same computing time as before,” said Johnston.

He hopes to have completely benchmarked an algorithm by the end of the funding year, determining how well it performs against a conventional algorithm. He also plans to carry out at least one comparative study to confirm that both conventional and machine learning algorithms produce the same results.

Johnston’s ORNL collaborator, Markus Eisenbach of the Center for Computational Sciences, is using the same machine learning techniques to tackle a different set of algorithms. The work of both teams could contribute meaningfully to the future use of machine learning in physics.


DNA testing and whole genome sequencing have become increasingly important tools in the arsenal of public health professionals. Specifically with regard to foodborne illness outbreaks, these techniques have been employed to successfully trace the sources of infection. Thus far, they have been used mostly in Salmonella outbreaks, but Jeremiah Johnson, assistant professor of microbiology, hopes to extend their capability to trace other bacteria.

“Something that’s actually starting to happen nationally, in terms of public health, is that they’re moving away from more archaic genomic analyses and actually mandating that all state public health departments start using whole genome sequencing,” he said. 

This is where his team steps in. While many health departments are acquiring the equipment necessary to conduct this kind of sequencing, they often lack the in-house expertise needed to conduct the work. Johnson’s team possesses the bioinformatics experience required to analyze the genomes of the subject bacteria—in this case, Campylobacter.

Campylobacter was recently identified by the CDC as a serious threat to public health, as it is becoming increasingly resistant to antibiotics. It also takes a far smaller dose of the bacterium to make a person sick than is typical of other foodborne pathogens.

“With Campylobacter, it only takes a few hundred bacterial cells to make a person sick,” said Johnson. Another hurdle with Campylobacter is how rapidly the bacteria mutate and evolve, making some older genomic techniques ineffective.
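
One way modern genomic pipelines cope with that pace of change is alignment-free comparison of whole genomes. The Python sketch below is a toy version of the k-mer approach used by tools such as Mash, not the team’s actual pipeline; the sequences and parameters are illustrative assumptions.

```python
# Toy alignment-free genome comparison: estimate similarity between two
# genomes from their shared k-mers. Illustrative only, not the team's pipeline.
import random

def kmers(seq, k=21):
    """The set of all length-k substrings of a genome sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def jaccard(seq_a, seq_b, k=21):
    """Fraction of k-mers two sequences share (Mash-style similarity)."""
    a, b = kmers(seq_a, k), kmers(seq_b, k)
    return len(a & b) / len(a | b)

random.seed(0)
source = "".join(random.choice("ACGT") for _ in range(5000))  # outbreak strain
sub = "A" if source[2500] != "A" else "C"
isolate = source[:2500] + sub + source[2501:]                 # one mutation
unrelated = "".join(random.choice("ACGT") for _ in range(5000))

# An isolate from the same source scores near 1; an unrelated strain falls
# near 0, which is how clustering can point back toward a common origin.
print(jaccard(source, isolate), jaccard(source, unrelated))
```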

In the first year of support, Johnson’s team focused on acquiring Campylobacter samples from a variety of sources. The cooperative effort saw contributions from UT’s College of Veterinary Medicine and the US Food and Drug Administration. Team members also ventured out into other areas of East Tennessee to collect additional samples for sequencing.

“We’ve collected samples from area farmers markets and grocery stores. We’ve done local waterways, including the Tennessee River and the Hiwassee River. We’ve also done animals around here, like sheep, pigs, cows, and some birds,” said Johnson.

In the second year of the project, the team is hoping to establish proof of concept by running a transmission study with Johnson’s ORNL partner, computational biologist Dan Jacobson. The goal, based on genomic sequencing conducted on the front end, is to see if the ORNL collaborators can identify the original strain of Campylobacter once it has passed through the transmission study.


The adoption of new technology into society does not happen all at once. Take cars, for instance. Initially only a few consumers owned the Model T. Now automobiles are ubiquitous, with 95 percent of American households owning at least one. There was, however, a time when both cars and horse-drawn wagons existed on the road simultaneously, creating a series of unique challenges for drivers and pedestrians alike. 

Similar challenges occur with more recent major technological changes. With his JDRD project, Professor of Electrical Engineering and Computer Science Seddik Djouadi hopes to address the challenges happening right now in renewable energy.  

“Renewable energy and the penetration of renewable energy sources into the modern grid require power converters that interface with the current devices in the power grid,” said Djouadi. “But these sources of renewable energy—like solar energy and wind turbines—are variable, meaning the power throughput is variable.”

According to Djouadi, this variability can cause instability and inconsistent performance in the power grid, which is designed to handle a steady flow of electricity. His solution is to design better controllers for the power converters that integrate renewable energy into the grid.

“It’s basically designing a switching control so if there’s a disturbance in the system it triggers a mechanism that can ensure the safe operation of the power grid,” said Djouadi. “These energy sources introduce variability in frequency and voltage, and improved controllers will help maintain consistency in both these areas.”
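
A hedged sketch of that switching idea: monitor the grid frequency, and when a disturbance pushes it outside a tolerance band, switch the converter from its normal schedule to a stabilizing response. The thresholds, gain, and droop-style response below are illustrative assumptions, not the team’s actual design.

```python
# Illustrative switching control for a grid-tied converter. All constants
# and mode names are assumptions for demonstration.
NOMINAL_HZ = 60.0
DEADBAND_HZ = 0.05          # disturbances smaller than this are ignored
DROOP_GAIN = 0.8            # extra power per Hz of deviation (per unit)

def converter_setpoint(freq_hz, scheduled_power):
    """Return (mode, power setpoint) for one control step."""
    error = NOMINAL_HZ - freq_hz
    if abs(error) <= DEADBAND_HZ:
        return "normal", scheduled_power       # track the schedule
    # Disturbance detected: switch to a droop-style stabilizing mode that
    # injects (or absorbs) power in proportion to the frequency deviation.
    return "stabilize", scheduled_power + DROOP_GAIN * error

print(converter_setpoint(59.90, 1.0))  # under-frequency: inject more power
print(converter_setpoint(60.01, 1.0))  # inside deadband: normal operation
```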

Djouadi and his team hope the design will encourage further integration and use of renewable energy sources.

“From my perspective, this kind of control design will show people in industry that we have the capability to achieve this type of performance,” said doctoral student Yichen Zhang, who is working on Djouadi’s team. “From the industrial point of view, they may think this design will cost a lot for them—to change from their traditional controller—but the duty of our research is to let them know this can be done.”

At the conclusion of their first year, Djouadi and his team hope to have developed some successful algorithms and made advances in designing their proposed controller.


For generations, the human brain has served as a source of inspiration for artists and scientists alike. Composed of neurons, blood vessels, and glial cells, the brain governs all the functions of a human body. Billions of individual pieces come together to make a person who they are, all in a relatively small package using a minimal amount of energy. Unsurprisingly, the brain has become a model for a relatively new area of computational science.

Neuromorphic computing is an approach to computation based on the model of the human brain with widespread potential applications, from medicine to autonomous vehicle development. In order to have such far-reaching effects, neuromorphic computing has to be both efficient and scalable. Mark Dean, interim dean of the Tickle College of Engineering and Fisher Distinguished Professor of Electrical Engineering and Computer Science, hopes to address these requirements with his JDRD work.

From the outside, neuromorphic computing systems look like any other computer, according to Dean. They might even have chips like traditional computers. The content of those chips, however, makes new computational capabilities possible.

“On the chip you might see artificial neurons and synapses built from traditional digital logic, but you might also see new forms of devices,” he said. “This means of storing information allows them to be used like synaptic elements.”

Dean suggests this way of storing and transferring information could affect the very functionality of such computing systems, allowing them to learn and improve over time.

“Right now, computers are pretty static. You program them to do something and that’s all they do,” he said. “Neuromorphic computing would be more flexible than that. It would be able to deal with variations in information and come up with a set of insights that traditional computers just couldn’t do.”
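
The flavor of the artificial neurons and synapses Dean describes can be conveyed with a textbook building block: a leaky integrate-and-fire neuron, in which synaptic weights store information and spikes carry it forward. The sketch below is a standard classroom model, not the team’s hardware, and every constant in it is an illustrative assumption.

```python
# Leaky integrate-and-fire neuron: a textbook neuromorphic building block.
# Constants are illustrative, not drawn from the team's designs.
def lif_neuron(input_spikes, weights, leak=0.9, threshold=1.0):
    """Integrate weighted input spikes over time; emit a spike at threshold."""
    v, output = 0.0, []
    for spikes in input_spikes:          # one tuple of 0/1 inputs per time step
        v = leak * v + sum(w * s for w, s in zip(weights, spikes))
        if v >= threshold:
            output.append(1)
            v = 0.0                      # reset the membrane after firing
        else:
            output.append(0)
    return output

# Two input lines; the stronger synapse (0.6) drives the neuron to fire sooner.
print(lif_neuron([(1, 0), (1, 1), (0, 1), (1, 1)], weights=[0.6, 0.3]))
```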

His team is currently working to develop low-power interconnects for neuromorphic elements to support the work of his LDRD partner Raphael Pooser, Quantum Sensing Team lead at ORNL. Dean’s goal is to show that neuromorphic elements can be connected in a way that will maximize efficiency without losing functionality. 

“Our expectation is that we will demonstrate how neuromorphic components can be connected together in an efficient way that minimizes power consumption and optimizes scale,” he said. “We’re hoping to show that it can be done and done well.”