Image Source Location: http://commons.wikimedia.org/wiki/File:Roadrunner_supercomputer_HiRes.jpg
Supercomputer
The Blue Gene/P supercomputer at Argonne National Lab runs over 250,000 processors using normal data center air conditioning, grouped in 72 racks/cabinets connected by a high-speed optical network.

A supercomputer is a computer at the frontline of contemporary processing capacity – particularly speed of calculation, with individual operations completing in nanoseconds. Supercomputers were introduced in the 1960s, made initially and, for decades, primarily by Seymour Cray at Control Data Corporation (CDC), Cray Research and subsequent companies bearing his name or monogram. While the supercomputers of the 1970s used only a few processors, in the 1990s machines with thousands of processors began to appear and, by the end of the 20th century, massively parallel supercomputers with tens of thousands of "off-the-shelf" processors were the norm. As of November 2013, China's Tianhe-2 supercomputer is the fastest in the world at 33.86 petaFLOPS, or 33.86 quadrillion floating-point operations per second.

Systems with massive numbers of processors generally take one of two paths. In one approach (e.g., distributed computing), a large number of discrete computers (e.g., laptops) distributed across a network (e.g., the Internet) devote some or all of their time to solving a common problem; each individual computer (client) receives and completes many small tasks, reporting the results to a central server which integrates the task results from all the clients into the overall solution. In another approach, a large number of dedicated processors are placed in close proximity to each other (e.g., in a computer cluster); this saves considerable time moving data around and makes it possible for the processors to work together (rather than on separate tasks), for example in mesh and hypercube architectures.
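The first of those two paths can be sketched in miniature: a coordinating "server" splits a job into many small tasks, hands them to independent "clients," and integrates the partial results into the overall solution. This is a hypothetical illustration in Python (using threads on one machine to stand in for computers on a network), not the protocol of any real distributed-computing project.

```python
from concurrent.futures import ThreadPoolExecutor


def client_task(chunk):
    """Each 'client' completes one small task: a partial sum of squares."""
    return sum(x * x for x in chunk)


def server(data, n_clients=4):
    """The 'server' splits the job, farms chunks out, and integrates results."""
    # Deal the data out round-robin so every client gets a small task.
    chunks = [data[i::n_clients] for i in range(n_clients)]
    with ThreadPoolExecutor(max_workers=n_clients) as pool:
        partials = pool.map(client_task, chunks)
    # Integrate the clients' partial results into the overall solution.
    return sum(partials)


print(server(list(range(1000))))  # same answer a single machine would compute: 332833500
```

In a real system such as a volunteer-computing grid, the chunks would travel over the Internet and the clients would be independent machines; the split/compute/integrate shape is the same.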
The use of multi-core processors combined with centralization is an emerging trend; one can think of this as a small cluster (the multicore processor in a smartphone, tablet, laptop, etc.) that both depends upon and contributes to the cloud. Supercomputers play an important role in the field of computational science, and are used for a wide range of computationally intensive tasks in various fields, including quantum mechanics, weather forecasting, climate research, oil and gas exploration, molecular modeling (computing the structures and properties of chemical compounds, biological macromolecules, polymers, and crystals), and physical simulations (such as simulations of the early moments of the universe, airplane and spacecraft aerodynamics, the detonation of nuclear weapons, and nuclear fusion). Throughout their history, they have been essential in the field of cryptanalysis.

We Are Being Tracked
The Beast is tracking us - it started around the year 1974.
In 1974, a crisis meeting was called in Brussels, Belgium, by leaders of the Common Market. The topic was concern about economic corruption and chaos in the world. At this crisis meeting, scientists, advisors and economic analysts gathered, including one Dr. Hanrick Eldeman. The doctor revealed at this meeting that "The Beast," a three-story-high supercomputer, was up and running.
The Beast was programmed a certain way so that it became a self-programming (heuristic) supercomputer, and now it is tracking all of us, gathering data through our spending habits. In the near future, we will all be assigned a number to replace our need for credit cards.
Eventually, the digits assigned to us will be tattooed or otherwise embedded in our skin to effect an invisible yet permanent mark on either our foreheads or the backs of our hands. This will only be visible under special infrared scanners, and will eliminate many common credit card problems for both credit card holders and credit card companies.
In essence, once we are tattooed, we will become walking credit cards, each and every one of us!
Dr. Eldeman asserted that by allowing the supercomputer to assign numbers in three entries of six digits each, everyone in the world could be marked with and assigned his or her own unique credit card identification number.
Think this is a joke? That no computer is capable of counting everyone in the world?
Well, The Beast is three stories high and has over a hundred data entry sources plugging information into the supercomputer, day and night. It is a machine, so it never sleeps...
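Whatever one makes of the legend itself, the arithmetic behind the numbering claim is easy to check: three entries of six digits each make an 18-digit number, which is vastly more combinations than there are people on Earth. A quick check (the population figure is a rough modern approximation, not from the legend):

```python
# Three entries of six digits each -> an 18-digit identifier.
digits = 6 * 3
unique_ids = 10 ** digits            # 10^18 possible numbers

world_population = 8_000_000_000     # roughly 8 billion people (approximate)

print(f"{unique_ids:,} possible IDs")
print(unique_ids // world_population)  # 125,000,000 IDs available per person
```

So a scheme of that shape could number every person on the planet many millions of times over; the capacity of the numbering system was never the implausible part.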
History of supercomputing
A Cray-1 preserved at the Deutsches Museum
The history of supercomputing goes back to the 1960s, with the Atlas at the University of Manchester and a series of computers at Control Data Corporation (CDC), designed by Seymour Cray. These used innovative designs and parallelism to achieve superior computational peak performance.
The Atlas was a joint venture between Ferranti and the University of Manchester and was designed to operate at processing speeds approaching one microsecond per instruction, about one million instructions per second. The first Atlas was officially commissioned on 7 December 1962 as one of the world's first supercomputers – considered to be the most powerful computer in the world at that time by a considerable margin, and equivalent to four IBM 7094s.
The CDC 6600, released in 1964, was designed by Cray to be the fastest in the world by a large margin. Cray switched from germanium to silicon transistors, which he ran very fast, solving the overheating problem by introducing refrigeration. Given that the 6600 outran all computers of the time by about 10 times, it was dubbed a supercomputer and defined the supercomputing market when one hundred computers were sold at $8 million each.
Cray left CDC in 1972 to form his own company. Four years later, in 1976, he delivered the 80 MHz Cray-1, which became one of the most successful supercomputers in history. The Cray-2, released in 1985, was an eight-processor liquid-cooled computer; Fluorinert was pumped through it as it operated. It performed at 1.9 gigaflops and was the world's fastest until 1990.
Although I've asked this question about Christian scriptures, I don't mean to discount that something similar to what this urban legend tells of is actually part of our "improving technology."
This legend surfaced long before computers were as high-tech and widespread as they are in the 2000s.
In part, this legend may have been a mixture of natural societal fear about the exploding development of technology, Christian viewpoints and interpretations of Scripture, and broader fears about mechanical and technological machines that have the potential to harm us. Above all, it reflects the fear that budding technologies and machines will fall into the wrong hands: the hands of people who would use them to harm the world at large...
While the supercomputers of the 1980s used only a few processors, in the 1990s machines with thousands of processors began to appear both in the United States and in Japan, setting new computational performance records. Fujitsu's Numerical Wind Tunnel supercomputer used 166 vector processors to gain the top spot in 1994 with a peak speed of 1.7 gigaflops per processor. The Hitachi SR2201 obtained a peak performance of 600 gigaflops in 1996 by using 2048 processors connected via a fast three-dimensional crossbar network. The Intel Paragon could have 1000 to 4000 Intel i860 processors in various configurations, and was ranked the fastest in the world in 1993. The Paragon was a MIMD machine which connected processors via a high-speed two-dimensional mesh, allowing processes to execute on separate nodes, communicating via the Message Passing Interface (MPI).
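The Paragon's two-dimensional mesh can be illustrated with a small sketch: number the nodes row by row and compute each node's north/south/east/west neighbors, the links along which messages would hop. This is a simplified toy model of a 2-D mesh topology, not Intel's actual routing scheme.

```python
def mesh_neighbors(rank, rows, cols):
    """Return the N/S/E/W neighbor ranks of a node in a rows x cols 2-D mesh.

    Nodes are numbered row by row, left to right; nodes on the edge of the
    mesh have no neighbor in that direction (None).
    """
    r, c = divmod(rank, cols)
    return {
        "north": rank - cols if r > 0 else None,
        "south": rank + cols if r < rows - 1 else None,
        "west":  rank - 1 if c > 0 else None,
        "east":  rank + 1 if c < cols - 1 else None,
    }


# Node 5 in a 4x4 mesh sits at row 1, column 1, so it has all four neighbors:
print(mesh_neighbors(5, 4, 4))  # {'north': 1, 'south': 9, 'west': 4, 'east': 6}
```

In a real MPI mesh, a message between distant nodes is forwarded hop by hop through intermediate neighbors, which is why keeping communicating processes close together on the mesh matters for performance.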
Hardware and architecture
A Blue Gene/L cabinet showing the stacked blades, each holding many processors
Approaches to supercomputer architecture have taken dramatic turns since the earliest systems were introduced in the 1960s. Early supercomputer architectures pioneered by Seymour Cray relied on compact innovative designs and local parallelism to achieve superior computational peak performance. However, in time the demand for increased computational power ushered in the age of massively parallel systems.
While the supercomputers of the 1970s used only a few processors, in the 1990s, machines with thousands of processors began to appear and by the end of the 20th century, massively parallel supercomputers with tens of thousands of "off-the-shelf" processors were the norm. Supercomputers of the 21st century can use over 100,000 processors (some being graphic units) connected by fast connections.
Throughout the decades, the management of heat density has remained a key issue for most centralized supercomputers. The large amount of heat generated by a system may also have other effects, e.g. reducing the lifetime of other system components. There have been diverse approaches to heat management, from pumping Fluorinert through the system, to hybrid liquid-air cooling systems, to air cooling with normal air conditioning temperatures.
In my opinion, despite the notable Christian influences within this legend, it is a legend that derives from WELL-PLACED FEARS...
By 2010, machines and technologies ARE very much harming people, and are often in the wrong hands: the hands of people whose motive in using machines is to oppress and harm others...
Some of our technologies are harmful because their amazing capabilities are underestimated by large groups of people who do not realize the capacity these machines and technologies have to harm people when not used for good purposes.
This story is one urban legend that is coming true - perhaps not in exactly the same way mentioned in the legend, perhaps not with the same motive as stated in the legend, but with at least equal harm possible as narrated in the legend.