Supercomputing

All articles tagged with #supercomputing

mathematics · 20 days ago

The 9th Dedekind Number: A 32-Year Quest and the Elusive 10th

Dedekind numbers, first studied by Richard Dedekind, count the monotone Boolean functions of n variables and grow at a doubly exponential rate, which makes them extraordinarily difficult to compute beyond the eighth term. The ninth Dedekind number was only recently found through advanced computational methods, and the tenth may remain out of reach for the foreseeable future given the astronomical computing power required.
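To give a sense of why the sequence explodes, here is a minimal brute-force sketch that counts the first few Dedekind numbers directly (the n-th Dedekind number counts the monotone Boolean functions of n variables); the function name `dedekind` is illustrative, not from any article:

```python
from itertools import product

def dedekind(n):
    """Count monotone Boolean functions of n variables by brute force.

    A function f: {0,1}^n -> {0,1} is monotone if flipping any input
    bit from 0 to 1 never flips the output from 1 to 0.
    """
    points = 1 << n  # number of possible inputs, 2^n
    count = 0
    for bits in product((0, 1), repeat=points):  # every truth table
        # Monotone iff f(x) <= f(x with one extra bit set) for all x, bit
        if all(bits[x] <= bits[x | (1 << b)]
               for x in range(points) for b in range(n)):
            count += 1
    return count

for n in range(5):
    print(n, dedekind(n))  # 2, 3, 6, 20, 168
```

Even this tiny search must scan 2^(2^n) truth tables, so it stalls already at n = 5; the record-setting computations of the eighth and ninth numbers relied on far cleverer mathematics plus supercomputing, not enumeration.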

technology · 2 months ago

AMD Gains as U.S. Department of Energy Announces $1B Supercomputing Partnership

AMD has partnered with the U.S. Department of Energy in a $1 billion project to develop two supercomputers, Lux and Discovery, aimed at advancing scientific research in areas like nuclear energy, cancer treatment, and national security. The project enhances AMD's role in high-performance computing and AI infrastructure, with the first system, Lux, launching within six months, and the second, Discovery, expected in 2029. AMD stock has gained over 1% following the announcement.

science-and-technology · 1 year ago

Frontier Supercomputer Sets New Record in Universe Simulation

Researchers at the Department of Energy's Argonne National Laboratory have used the Frontier supercomputer to conduct the largest astrophysical simulation of the universe to date. This groundbreaking simulation, which incorporates both atomic and dark matter, sets a new standard for cosmological hydrodynamics simulations. The project, part of the Exascale Computing Project, utilized the HACC code, significantly upgraded for exascale machines, achieving speeds nearly 300 times faster than previous benchmarks. This advancement allows for unprecedented simulations of the universe's evolution, comparable to large telescope surveys.

technology · 1 year ago

NVIDIA Unveils Powerhouse GB200 NVL4 with Quad GPUs and Dual CPUs

Nvidia has unveiled its latest high-performance computing (HPC) and AI chip, the GB200 NVL4, which integrates four Blackwell GPUs and two Grace CPUs on a single board, consuming 5.4 kilowatts of power. This configuration, showcased at the SC24 supercomputing conference in Atlanta, delivers significant compute power without relying on Nvidia's proprietary interconnects, making it compatible with existing HPC systems from companies like HPE and Eviden. The GB200 NVL4 can deliver up to 10 petaFLOPS of FP64 compute per cabinet, although AMD-based systems still offer higher floating-point performance. Nvidia also announced the H200 NVL, a PCIe-based configuration that supports up to 13.3 petaFLOPS of FP8 performance with sparsity, emphasizing flexibility and compatibility with standard server racks.

science-and-technology · 1 year ago

How Supercomputing Unlocks Cosmic Mysteries

NASA is leveraging supercomputing to advance understanding of the universe, from improving astronaut safety on Artemis II missions to refining aircraft designs for efficiency. At the International Conference for High Performance Computing, NASA will showcase projects like AI-based weather models, neutron star simulations, and solar activity studies. These efforts highlight the role of supercomputers in space exploration, climate science, and aerospace engineering.

science · 1 year ago

Physicists Collaborate to Decode Unstable Sigma Meson

Researchers at the U.S. Department of Energy's Thomas Jefferson National Accelerator Facility have used supercomputers to better understand the unstable sigma meson particle, which plays a crucial role in nuclear physics. By simulating pion-pion reactions based on quantum chromodynamics, they have made significant progress in describing the sigma meson, despite its brief existence and complex nature. This collaborative effort, involving advanced computational techniques, paves the way for further studies of similar particles and deeper insights into the strong interaction.

science-and-technology · 1 year ago

Creating Super Diamonds: Squeezing Harder Crystals with Supercomputer Simulations

Physicists have used supercomputing to simulate the behavior of diamond under extreme pressure and temperature, yielding new insight into the elusive BC8 phase of carbon, which is predicted to be even harder than diamond. After years of failed synthesis attempts, the simulations pinpoint the specific high-pressure, high-temperature conditions needed to push carbon atoms into this unusual structure, potentially paving the way for creating it in the lab. The BC8 phase, thought to exist in high-pressure environments deep inside exoplanets, could open up new research and material applications if it can be stabilized closer to home.

science-and-technology · 1 year ago

NASA's Mission Delays Linked to Outdated Supercomputers

An audit of NASA's high-end computing capabilities reveals that the agency's supercomputers are oversubscribed, out-of-date, and lack strong security controls, leading to delays in missions and increased costs. The audit recommends senior leadership reform how supercomputing is administered and implement a "tiger team" to address technology gaps, improve asset allocations, evaluate cyber risks, and formalize procedures for hardware and software life-cycle management. NASA management has agreed to implement the tiger team and reform its supercomputing management apparatus to address these issues.

astronomy · technology · 1 year ago

Filtering Satellite Interference for Square Kilometre Array Precursor

The Australian Square Kilometre Array Pathfinder (ASKAP) is developing techniques to mitigate satellite interference as it copes with increased satellite traffic, particularly in the bands used for its observations. The project's head of data operations, Dr. Matthew Whiting, highlighted concerns about the impact of satellite signals on ASKAP's operations and discussed efforts to predict and mitigate interference. ASKAP's use of Pawsey's "Setonix" supercomputer has also presented challenges due to the project's high data rate, producing four terabytes an hour at its peak.