How Supercomputers are Becoming a Scientist’s Best Friend
Oak Ridge is a small city in the state of Tennessee with a big claim to fame. Home to about 30,000 residents, the city also hosts a national laboratory that houses the world’s most powerful supercomputer.
The supercomputer in Oak Ridge is a technological feat, crunching data faster than any other supercomputer in the world in 2018. The sheer numbers are astounding. A capability of 200 petaflops (200,000 trillion floating point operations per second)1. Power efficiency of nearly 15 GFlops/watt (the 3rd most energy efficient supercomputer in the world in 2018)2. Perhaps the most impressive metric, though, is that it was the first system to reach exaop speed – a million trillion calculations per second – on a scientific application.
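As a quick sanity check on those headline figures, the unit conversions fit in a few lines of Python. The efficiency value below is the article’s approximate number, not an official specification:

```python
# Back-of-envelope check on the article's figures (approximate values).
PETA = 10**15
EXA = 10**18

peak_flops = 200 * PETA                  # 200 petaflops peak capability
print(f"{peak_flops:.2e} ops/s")         # 2.00e+17, i.e. 200,000 trillion per second

# ~15 GFLOPS/watt efficiency implies the power draw at peak:
efficiency = 14.7e9                      # assumed "nearly 15" GFLOPS per watt
watts = peak_flops / efficiency
print(f"~{watts / 1e6:.1f} MW at peak")  # roughly 13-14 megawatts

# An exaop is a thousand petaflops:
print(EXA // PETA)                       # 1000
```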
Supercomputers Make Scientists Research Superheroes
When it comes to research, it is no surprise that scientists are using supercomputers to address the world’s biggest challenges. From modeling the global climate to discovering alternative energy sources to simulating aircraft and more, the ability to process massive amounts of data in parallel has pushed, and continues to push, the frontiers of science. In this blog post, we cover ten ways that scientific researchers are buddying up with supercomputers to take on problems that benefit humanity.
#1: Modeling Climate around the World
As the weather changes, computer simulations help climate scientists understand changing conditions in the atmosphere, on land, and in the oceans. The simulations are fed with massive amounts of weather data from satellites orbiting Earth, as well as observational data from the ground. From there, supercomputers build and refine a mathematical model of the climate using a two-step process known as data assimilation.
First, the system projects how atmospheric conditions will change along a pre-defined time span. While weather models usually deal with projections in the time span of days, climate models typically look at trends over several years. Then, the system is continuously fine-tuned with real-time observational data. These complex models are made possible by supercomputers being able to perform data analytics on trillions of calculations every second3.
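The two-step cycle above can be sketched as a toy forecast-and-correct loop. The dynamics, gain, and observations below are invented placeholders, not a real atmospheric model:

```python
# A minimal sketch of the forecast/correct cycle behind data assimilation.
# The "physics" and all numbers here are illustrative, not real climate code.

def forecast(state):
    """Step 1: project the state forward in time with a toy dynamics model."""
    return 0.9 * state + 2.0   # placeholder physics

def assimilate(state, observation, gain=0.4):
    """Step 2: nudge the forecast toward a real-world observation (Kalman-style)."""
    return state + gain * (observation - state)

state = 10.0
observations = [11.2, 11.8, 12.1]   # stand-ins for satellite/ground readings
for obs in observations:
    state = forecast(state)             # project forward along the time span
    state = assimilate(state, obs)      # fine-tune with fresh observational data
print(round(state, 2))
```

Real assimilation systems do this over millions of grid-point variables at once, which is where the trillions of calculations per second come in.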
#2: Simulating Events in Solid Earth Geophysics
Summit, the world’s most powerful supercomputer in 2018, is helping scientists who study the earth see beneath its thin outer crust. Led by a Professor of Geology from an Ivy League university, the Tennessee-based research team is on a mission to create a three-dimensional map of the earth’s mantle – the thick, middle layer between our planet’s core and crust4.
To do so, the group of scientists is using measurements from millions of data points, collected by seismic instruments around the world. These sensors measure the speed at which seismic waves, generated by earthquakes, move through Earth’s mantle. Variations in that speed help the researchers image differences in the temperature and composition of materials in our planet’s interior. With supercomputers, geophysicists have both the numerical simulation and advanced visualization capabilities to explore new depths around the globe.
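The core inference is simple enough to sketch: given a path length and a travel time, compute the wave speed and compare it against a reference. Slower-than-reference usually means hotter, softer rock; faster means colder. The numbers below are illustrative stand-ins, not real seismic data:

```python
# Toy version of seismic tomography's core idea: compare measured wave speeds
# against a reference to infer temperature/composition anomalies.
# All values are illustrative, not real mantle measurements.

REFERENCE_SPEED_KM_S = 8.0   # assumed background P-wave speed

def speed(path_km, travel_s):
    """Average wave speed along a ray path."""
    return path_km / travel_s

def anomaly_pct(measured):
    """Percent deviation from the reference speed."""
    return 100.0 * (measured - REFERENCE_SPEED_KM_S) / REFERENCE_SPEED_KM_S

# Two hypothetical 1600 km ray paths:
fast = speed(1600, 196.0)    # arrives early -> likely colder material
slow = speed(1600, 205.0)    # arrives late  -> likely hotter material
print(round(anomaly_pct(fast), 2), round(anomaly_pct(slow), 2))
```

The supercomputer’s job is to invert millions of such measurements simultaneously into a consistent 3D map.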
#3: Replicating Biomolecular Behavior
For medical researchers, supercomputers are a potential solution to deal with the rapid rise in healthcare data. By 2020, this data is estimated to reach approximately 2.314 zettabytes (ZB)5. For comparison, this number is roughly equivalent to the data storage capacity of nearly 9 billion high-end smartphones (each with 256 GB).
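The comparison is straightforward arithmetic (decimal units assumed):

```python
# Back-of-envelope check on the storage comparison (decimal units).
ZB = 10**21
GB = 10**9

healthcare_bytes = 2.314 * ZB       # projected healthcare data by 2020
phone_bytes = 256 * GB              # one high-end smartphone

phones = healthcare_bytes / phone_bytes
print(f"{phones:.2e}")              # ~9.04e+09, i.e. roughly 9 billion phones
```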
When it comes to medical research, high-performance computing is helping make new discoveries in life sciences. In drug discovery, supercomputers are helping predict molecular interactions between potential drug therapies and the proteins they are designed to bind6. For companies involved in bioinformatics, supercomputers use parallel processing to crunch whole human genomes – a person’s entire DNA. Supercomputers are also well-suited to model the behavior of large molecular systems, from thousands to millions of individual particles.
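The genome-crunching pattern is “embarrassingly parallel”: split the sequence into chunks, process each independently, and merge the results. A toy sketch, with a made-up sequence and chunk size standing in for the roughly 3 billion bases of a real genome:

```python
# Sketch of the chunk-and-merge pattern used to parallelize genome analysis.
# The sequence and chunk size are invented for illustration.
from concurrent.futures import ThreadPoolExecutor

def gc_count(chunk):
    """Count G/C bases in one chunk (GC content is a common genomics statistic)."""
    return sum(1 for base in chunk if base in "GC")

genome = "ATGCGCGTATGCCGTA" * 1000                 # stand-in for a real genome
chunks = [genome[i:i + 4000] for i in range(0, len(genome), 4000)]

with ThreadPoolExecutor() as pool:                 # a supercomputer spreads this
    total_gc = sum(pool.map(gc_count, chunks))     # work across thousands of cores

print(total_gc / len(genome))                      # 0.5625 GC fraction
```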
#4: Crafting Better Aircraft and Spacecraft
When one of the top schools for aviation and aerospace in America wanted to advance its applied research, it turned to a supercomputer. The four-rack cluster was installed in the university’s engineering and technology center in 20177. Filled with over 3,000 processing cores, it models everything from aircraft structures to propulsion systems to aerodynamic efficiency. The high-performance computing system is also used to research the effects of atmospheric conditions on flight.
On the national level, supercomputers are helping researchers prepare for the launch of future space missions. These plans include expeditions to the Moon, Mars, and beyond. By using a powerful computer, researchers were able to accurately model the launch environment for space shuttles and other spacecraft. With this data, they traced the root cause of high wall pressures at launch to an issue with the water-based sound suppression system used on the pad. The team used over 5,000 cores on a specialized supercomputer to simulate a mesh of nearly 144 million cells, taking over 5 million processor hours to complete8.
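The reported figures imply a rough wall-clock time for the run:

```python
# Rough wall-clock estimate implied by the simulation figures above.
cores = 5000
processor_hours = 5_000_000

wall_hours = processor_hours / cores           # hours if all cores ran the whole time
print(wall_hours, round(wall_hours / 24, 1))   # 1000.0 hours, ~41.7 days
```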
#5: Researching Particle Physics
Supercomputers help particle physicists see what the universe is made of. Take the world’s most powerful particle accelerator, located near Geneva in Switzerland. Filled with superconducting magnets and accelerating structures, the 27-kilometer ring has helped scientific researchers experiment with antimatter – matter made of antiparticles, which match ordinary particles in mass but carry the opposite charge – and discover a well-known boson.
In these particle collisions, an enormous amount of data is generated. Each day, the European research organization processes an average of 1 petabyte of data9. In a typical year, its particle accelerator experiments store more than 50 petabytes, with an additional 25 petabytes generated by the group’s non-accelerator experiments9. To preserve the experimental results of high-energy physics activities, high-density magnetic tape is used to archive the data.
#6: Conducting 3D Tests of Nuclear Weapons
So long, underground nuclear testing. Farewell, above ground warhead detonation practice. Welcome, advanced computer simulations of nuclear weapons. In the United States, nuclear weapon scientists are using two supercomputers to support the country’s stockpile stewardship mission. Together, these data crunching behemoths have nearly 3 petabytes of total memory and over 1.5 million compute cores10,11. One use of the supercomputer pair is to estimate the performance of nuclear weapons through numerical simulations. Another is to fine-tune the weapons models that become part of their design codes.
With supercomputers, scientists have been able to create more advanced and detailed nuclear simulations. In one case, a research team with members from a national research lab and a prestigious university in Pennsylvania was able to track the most energetic particles during a trillion-particle simulation, thanks to supercomputers12. Following the path of these particles helped the researchers simulate how plasma – a state of matter made of free ions and electrons – is produced during nuclear bomb explosions.
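The particle-tracking step is, at heart, a selection problem: out of an enormous population, keep only the most energetic particles for follow-up analysis. A deliberately tiny sketch, with random energies standing in for real simulation output:

```python
# Toy version of the post-processing step described above: from a large
# particle population, keep the most energetic ones for trajectory analysis.
# Energies are random stand-ins for real simulation output.
import random

random.seed(42)
energies = [random.expovariate(1.0) for _ in range(100_000)]  # fake spectrum

threshold = sorted(energies)[-100]           # energy of the 100th-hottest particle
hot = [e for e in energies if e >= threshold]

print(len(hot))                              # 100 tracked particles
```

At trillion-particle scale, even this simple filter must run distributed across thousands of nodes, which is why supercomputers are essential.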
#7: Testing Renewable Energy Sources
Sustainability and supercomputers are also coming closer together. In early 2019, the world’s largest supercomputer dedicated to alternative energy research came online. The high-performance system can perform up to 8 million-billion (8 quadrillion) calculations per second13. With over 76,000 cores, 296 terabytes of random access memory and 14 petabytes of total storage, the supercomputer is opening a new chapter in the nation’s energy transformation14.
Currently, scientists are using the system to make advances across the renewable energy industry15. In solar energy, researchers are exploring the different photovoltaic materials, devices and processes that make up solar panels and grids. For wind energy, it helps scientists look at different paths for wind technology development, from land-based to distributed to offshore wind. The supercomputer also aids research in alternative energy related to bioenergy, hydropower, transportation and much more.
#8: Mimicking the Human Brain
Researchers are years away from a computer simulation of a full human brain. But, that doesn’t mean progress hasn’t been made. For example, a Swiss brain research initiative has been using supercomputers to create biologically accurate, digital reconstructions of rodent brains – with the ultimate goal of simulating the human brain.
To this end, the organization used imaging data from thousands of whole brain tissue stains to map the 737 regions of the mouse brain16. The end result? A first-ever digital 3D cell atlas was released. It works like a search engine, enabling neuroscientists to visualize and study specific areas of a rodent’s brain, from wide regions down to individual cells. Whereas previous studies had only mapped about 4 percent of cells in the mouse brain, the Swiss brain research initiative was able to categorize the major types, numbers, and positions of the remaining 96 percent of cells16.
#9: Spotting Cybersecurity Threats
Keeping data safe from increasingly sophisticated cyber attacks takes a new approach. To this end, cybersecurity analysts are using artificial intelligence (AI) on supercomputers to proactively spot and analyze potential threats.
First, the AI uses machine learning and deep learning on billions of data points – from research papers to blog posts and more – to learn the signatures of online threats17. With this knowledge, the AI can then reason about potential threats. These risks are captured and consolidated across an enterprise’s various applications, devices and endpoints. Insights from this risk analysis are sent to the cybersecurity team, where they can trigger automatically orchestrated responses.
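A deliberately tiny sketch of the underlying scoring idea: learn term weights from labeled examples, then flag new text whose score leans malicious. Real systems learn from billions of documents with deep neural networks; the word lists and weights here are invented:

```python
# Minimal, made-up sketch of threat scoring from labeled text examples.
# Real systems use deep learning over billions of documents.
from collections import Counter

threat_docs = ["ransomware payload encrypts files", "phishing link steals credentials"]
benign_docs = ["quarterly report attached", "team lunch on friday"]

threat_terms = Counter(w for d in threat_docs for w in d.split())
benign_terms = Counter(w for d in benign_docs for w in d.split())

def score(text):
    """Positive score = leans malicious, negative = leans benign."""
    return sum(threat_terms[w] - benign_terms[w] for w in text.split())

print(score("payload encrypts the report"))   # 1 -> flagged as suspicious
```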
#10: Exploring Oil & Gas Reservoirs
In the search for new oil and gas resources, faster computing is helping geologists work smarter. A French oil giant lays claim to the most powerful computer in the oil and gas sector18. For context, the data-crunching machine has the combined computing power of nearly 170,000 personal laptops. Compared to the company’s previous supercomputer, the new system is estimated to process data nearly 10 times faster.
The computer is being used mainly to search for hydrocarbons, a key component of oil and natural gas. The supercomputer does this by using imaging algorithms to crunch complex data about the sub-surface of Earth18. With this extra processing power, the oil group from Paris hopes to find additional success in exploring, appraising, developing, and drilling – saving millions of dollars in the process.
Learn More about Big Data Analysis
- This government-funded medical lab translates big data into precision medicine
- Analytics is just one pillar to build a robust data strategy framework
- For enterprises, Big Data analytics uncovers trends, patterns and associations
FORWARD-LOOKING STATEMENTS: This article contains forward-looking statements, including statements relating to expectations for storage products, the market for storage products, product development efforts, and the capacities, capabilities and applications of Western Digital products. These forward-looking statements are subject to risks and uncertainties that could cause actual results to differ materially from those expressed in the forward-looking statements, including development challenges or delays, supply chain and logistics issues, changes in markets, demand, global economic conditions and other risks and uncertainties listed in Western Digital Corporation’s most recent quarterly and annual reports filed with the Securities and Exchange Commission, to which your attention is directed. Readers are cautioned not to place undue reliance on these forward-looking statements and we undertake no obligation to update these forward-looking statements to reflect subsequent events or circumstances.
1. Move Over, China: U.S. Is Again Home to World’s Speediest Supercomputer. https://www.nytimes.com/2018/06/08/technology/supercomputer-china-us.html
2. November 2018 | TOP500 Supercomputer Sites. https://www.top500.org/green500/lists/2018/11/
3. Supercomputing the Climate. https://svs.gsfc.nasa.gov/vis/a010000/a010500/a010563/index.html
4. Summit Supercomputer Clears Path To Seismic Discoveries. https://www.nextplatform.com/2019/03/04/summit-supercomputer-clears-path-to-seismic-discoveries/
5. Stanford Medicine 2017 Health Trends Report – Harnessing the Power of Data in Health. https://med.stanford.edu/content/dam/sm/sm-news/documents/StanfordMedicineHealthTrendsWhitePaper2017.pdf
6. Big Data Life Sciences: Healthcare Data Analytics & NCS – Applications – Cray. https://www.cray.com/solutions/life-sciences?tab=applications
7. Embry-Riddle Acquires Cray Supercomputer to Advance Research. https://news.erau.edu/headlines/embry-riddle-acquires-cray-supercomputer-to-advance-research
8. Simulations Give NASA Code Green Light for Space Launch System Testing. https://www.nas.nasa.gov/publications/articles/feature_LociCHEM_launch_sound_suppression.html
9. Storage | CERN. https://home.cern/science/computing/storage
10. Sequoia | Computation. https://computation.llnl.gov/computers/sequoia
11. Sierra | Computation. https://computation.llnl.gov/computers/sierra
12. This Bomb-Simulating US Supercomputer Broke a World Record. https://www.wired.com/story/this-bomb-simulating-us-supercomputer-broke-a-world-record/
13. NREL’s New Supercomputer, Eagle, Takes Flight. https://www.nrel.gov/news/program/2019/nrel-new-supercomputer-eagle-takes-flight.html
14. Infographic: Eagle vs. Peregrine Supercomputer Stack-Up. https://www.nrel.gov/computational-science/eagle-peregrine-infographic.html
15. Research Areas | NREL. https://www.nrel.gov/research/areas.html
16. Blue Brain Project releases first-ever digital 3D brain cell atlas. https://www.epfl.ch/research/domains/bluebrain/blue-brain/news/blue-brain-project-releases-first-ever-digital-3d-brain-cell-atlas/
17. Artificial intelligence for a smarter kind of cybersecurity. https://www.ibm.com/security/artificial-intelligence
18. Oil group Total hopes new supercomputer will help it find oil faster and more cheaply. https://www.reuters.com/article/us-total-supercomputer/oil-group-total-hopes-new-supercomputer-will-help-it-find-oil-faster-and-more-cheaply-idUSKCN1TJ0FQ