Editor's Note: In this weekly series, LiveScience explores how technology drives scientific exploration and discovery.
Today's supercomputers are marvels of computational power, and they are being used to tackle some of the world's biggest scientific problems.
Current models are tens of thousands of times faster than the average desktop computer. They achieve these lightning-fast speeds via parallel processing, in which many computer processors perform computations simultaneously. Supercomputers are used for everything from forecasting weather to modeling the human brain.
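To give a flavor of what "parallel processing" means in practice, here is a minimal Python sketch that splits one big summation across several worker processes. This is a toy for a single multicore machine, not how production supercomputer code is written; real systems coordinate thousands of nodes with frameworks such as MPI.

```python
# A toy illustration of parallel processing: one big job is carved into
# chunks, and several processes compute their chunks simultaneously.
from multiprocessing import Pool

def partial_sum(bounds):
    """Sum the integers in [start, stop) -- one worker's share of the job."""
    start, stop = bounds
    return sum(range(start, stop))

if __name__ == "__main__":
    n = 10_000_000
    workers = 4
    step = n // workers
    # One equal-sized chunk per worker; the last chunk absorbs any remainder.
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    chunks[-1] = (chunks[-1][0], n)

    with Pool(workers) as pool:
        # Each chunk is summed in its own process, in parallel.
        total = sum(pool.map(partial_sum, chunks))

    assert total == n * (n - 1) // 2  # sanity check against the closed form
    print(total)
```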
What sets supercomputers apart is the size and difficulty of the tasks they can tackle and solve, said Jack Wells, director of science at the Oak Ridge Leadership Computing Facility at Oak Ridge National Laboratory in Tennessee. [9 Super-Cool Uses for Supercomputers]
"Supercomputers can do supersize problems," Wells said.
Supercomputers are often built from the same components as regular computers, but they're integrated so they can work together, Wells told LiveScience.
The first supercomputers were developed in the 1960s, designed by electrical engineer Seymour Cray of Control Data Corporation (CDC). In 1964, the company released the CDC 6600, often considered to be the world's first supercomputer. Cray later formed his own company, which made the Cray-1 in 1976 and Cray-2 in 1985.
These early supercomputers had only a few processors, but by the 1990s, the United States and Japan were making ones with thousands of processors. The Intel Paragon edged into the lead in 1993; Fujitsu's Numerical Wind Tunnel became the fastest supercomputer in 1994 with 166 processors, followed by the Hitachi SR2201 in 1996, with more than 2,000 processors. As of June 2013, China's Tianhe-2 was the world's fastest supercomputer.
Supercomputer performance is measured in "flops," short for floating-point operations per second. Today's machines can achieve speeds in petaflops, or quadrillions of flops.
The TOP500 is a ranking of the world's 500 most powerful supercomputers. China's Tianhe-2 achieves 33.86 petaflops, while the Cray Titan reaches 17.59 petaflops, and IBM's Sequoia ranks third at 17.17 petaflops.
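To put those figures in perspective, a little back-of-the-envelope arithmetic helps. The Tianhe-2 speed below comes from the TOP500 list; the desktop speed and the size of the job are illustrative assumptions, not benchmarks.

```python
# Rough arithmetic on the figures above. The desktop speed is an
# illustrative assumption; Tianhe-2's speed is from the TOP500 list.
PETA = 1e15                   # 1 petaflop/s = one quadrillion flops
tianhe2 = 33.86 * PETA        # flops
desktop = 100e9               # assume ~100 gigaflops for a fast desktop

operations = 1e18             # a hypothetical quintillion-operation job
print(f"Tianhe-2: {operations / tianhe2:.1f} seconds")      # ~29.5 seconds
print(f"Desktop:  {operations / desktop / 86400:.0f} days")  # ~116 days
```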
Solving supersize problems
Researchers have harnessed the number-crunching power of supercomputers to work on complex problems in fields ranging from astrophysics to neuroscience.
These computational behemoths have been used to answer questions about the creation of the universe during the Big Bang. Researchers at the Texas Advanced Computing Center (TACC) simulated how the first galaxies formed, and scientists at NASA Ames Research Center in Mountain View, Calif., simulated the birth of stars. Using computers like IBM's Roadrunner at Los Alamos National Laboratory, physicists have probed the mysteries of dark matter, the unseen substance that makes up roughly 25 percent of the mass of the universe. [101 Astronomy Images That Will Blow Your Mind]
Weather forecasting is another area that relies heavily on supercomputing. For example, forecasters used the TACC supercomputer Ranger to determine the path of Hurricane Ike in 2008, improving the five-day hurricane forecast by 15 percent. Climate scientists use supercomputers to model global climate change, a challenging task involving hundreds of variables.
Testing nuclear weapons has been banned in the United States since 1992, but supercomputer simulations ensure that the nation's nukes remain safe and functional. IBM's Sequoia supercomputer at Lawrence Livermore National Laboratory in California is designed to replace testing of nuclear explosions with improved simulations.
Increasingly, neuroscientists have turned their attention to the daunting task of modeling the human brain. The Blue Brain project at the École Polytechnique Fédérale de Lausanne in Switzerland, led by Henry Markram, aims to create a complete, virtual human brain. The project scientists are using an IBM Blue Gene supercomputer to simulate the molecular structures of real mammalian brains. In 2006, Blue Brain successfully simulated a complete column of neurons in the rat brain.
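Blue Brain's models are vastly more detailed, but the basic idea of stepping a neuron's state forward in time can be sketched with the classic leaky integrate-and-fire model. The sketch below is a textbook simplification with made-up parameter values, not the project's actual method.

```python
# A leaky integrate-and-fire neuron: a drastically simplified sketch of
# the time-stepped simulations that brain models are built on.
# (Blue Brain uses far more detailed, biophysically realistic models.)
dt = 0.1          # time step, ms
tau = 10.0        # membrane time constant, ms
v_rest = -65.0    # resting potential, mV
v_thresh = -50.0  # spike threshold, mV
v_reset = -70.0   # post-spike reset potential, mV
drive = 20.0      # input drive, mV (strong enough to push past threshold)

v = v_rest
spikes = []
for step in range(1000):  # simulate 100 ms
    # The membrane potential decays toward rest while the input pushes it up.
    v += dt / tau * (v_rest - v + drive)
    if v >= v_thresh:     # threshold crossed: the neuron fires
        spikes.append(round(step * dt, 1))
        v = v_reset
print(f"{len(spikes)} spikes at times (ms): {spikes}")
```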
Sharing the load
The quintessential supercomputer fills a large data center with many machines that are physically linked together. But distributed computing could also be considered a form of supercomputing; it consists of many individual computers connected by a network (such as the Internet) that devote some portion of their processing power to a large problem.
A well-known example is the SETI@home (Search for Extraterrestrial Intelligence at home) project, in which millions of people run a program on their computers that looks for signs of intelligent life in radio signals. Another is Folding@home, a project to predict the 3D structure of proteins (the biological workhorses that perform vital tasks in our bodies) from the sequence of molecular chains from which they're made.
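The common thread in such projects is that the big job splits into independent work units, which volunteer machines fetch, process, and return. Here is a minimal sketch of that pattern; the "signal" data and the analysis are invented for illustration and bear no relation to SETI@home's actual algorithms.

```python
# The volunteer-computing pattern in miniature: a coordinator carves a
# dataset into independent work units, and each "volunteer" processes its
# unit in isolation. The data and analysis are invented for illustration.
import random

def make_work_units(n_units, samples_per_unit):
    """Coordinator side: split the job into independent chunks."""
    return [[random.gauss(0.0, 1.0) for _ in range(samples_per_unit)]
            for _ in range(n_units)]

def process_unit(unit):
    """Volunteer side: flag a unit whose signal looks anomalously strong."""
    peak = max(abs(x) for x in unit)
    return peak > 3.5   # toy "interesting signal" threshold

work_units = make_work_units(n_units=100, samples_per_unit=1000)
# In a real project each unit goes to a different machine over the network;
# here we just loop, since the units are fully independent anyway.
results = [process_unit(u) for u in work_units]
print(f"{sum(results)} of {len(results)} units flagged for follow-up")
```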
In the future, supercomputers will edge toward "exascale" capabilities, about 50 times faster than current systems, Wells said. These machines will demand far more energy, so energy efficiency will likely become an important goal of future systems. Another trend will be integrating large amounts of data for applications like discovering new materials and biotechnologies, Wells said.
Follow Tanya Lewis on Twitter and Google+. Follow us @livescience, Facebook & Google+. Original article on LiveScience.com.
Copyright 2013 LiveScience, a TechMediaNetwork company. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.