Having influenced the
consumer electronics field during the 1970s and the
automotive world during the 1980s, the Japanese had developed a strong reputation. The launch of the FGCS project spread the belief that parallel computing was the future of all performance gains, producing a wave of apprehension in the computer field. Soon parallel projects were set up in the US as the
Strategic Computing Initiative and the
Microelectronics and Computer Technology Corporation (MCC), in the UK as
Alvey, and in Europe as the
European Strategic Program on Research in Information Technology (ESPRIT), as well as the
European Computer‐Industry Research Centre (ECRC) in
Munich, a collaboration between
ICL in Britain,
Bull in France, and
Siemens in Germany. The project ran from 1982 to 1994, spending a little under ¥57 billion (about US$320 million) in total. Per-year spending was less than 1% of the entire R&D expenditure of the electronics and communications equipment industry: even in the project's highest-spending year, 1991, it spent about ¥7.2 billion, whereas IBM alone spent US$1.5 billion (¥370 billion) in 1982, and the industry as a whole spent ¥2,150 billion in 1990.

During the project, Ehud Shapiro introduced Concurrent Prolog in work that presented a Concurrent Prolog
interpreter written in Prolog. Shapiro's work on Concurrent Prolog inspired a change in the direction of the FGCS from focusing on a parallel implementation of Prolog to a focus on
concurrent logic programming as the software foundation for the project. The project found that the benefits of
logic programming were largely negated by the use of committed choice: once a clause is committed to, the alternative solutions that Prolog's backtracking search would have explored are discarded. Another problem was that existing CPU performance quickly overcame the barriers that experts had anticipated in the 1980s, and the value of parallel computing dropped to the point where it was for some time used only in niche situations. Although a number of
workstations of increasing capacity were designed and built over the project's lifespan, they were generally soon outperformed by "off the shelf" units available commercially. The project also failed to incorporate outside innovations: during its lifespan,
GUIs became mainstream in computers; the
internet made it possible for locally stored databases to become distributed; and even simple research projects provided better real-world results in data mining. The FGCS workstations had no appeal in a market where general-purpose systems could replace and outperform them. This parallels the fate of the Lisp machine market, where rule-based systems such as
CLIPS could run on general-purpose computers, making expensive Lisp machines unnecessary.
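The committed-choice trade-off mentioned above can be illustrated with a small sketch. This is hypothetical Python, not FGCS code; the clause representation and function names are invented for illustration:

```python
# Hypothetical sketch: why committed choice negates a key benefit of
# logic programming. A "clause" here is a (guard, answer) pair; the
# representation and names are invented for illustration.

def solve_backtracking(clauses, goal):
    """Prolog-style search: every matching clause contributes an
    answer, as backtracking would eventually find them all."""
    return [answer for guard, answer in clauses if guard(goal)]

def solve_committed_choice(clauses, goal):
    """Committed choice: the first clause whose guard succeeds is
    chosen, and all alternatives are discarded."""
    for guard, answer in clauses:
        if guard(goal):
            return [answer]  # committed; no backtracking to other clauses
    return []

# Two clauses both apply to the goal 4.
clauses = [
    (lambda n: n % 2 == 0, "even"),
    (lambda n: n >= 0, "non-negative"),
]

print(solve_backtracking(clauses, 4))      # ['even', 'non-negative']
print(solve_committed_choice(clauses, 4))  # ['even']
```

Committing makes concurrent execution tractable, since a committed process never needs to be undone, but the completeness of the declarative reading is lost.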
== Ahead of its time ==
In summary, the Fifth-Generation project was revolutionary and accomplished some basic research that anticipated future research directions. Many papers and patents were published. MITI established a committee that assessed the FGCS project as having made major contributions to computing, in particular to eliminating bottlenecks in parallel-processing software and to realizing intelligent interactive processing based on large knowledge bases. However, the committee had a strong incentive to justify the project, so this assessment overstates the actual results. Many of the themes seen in the Fifth-Generation project are now being re-interpreted in current technologies, as the hardware limitations foreseen in the 1980s were finally reached in the 2000s. When
clock speeds of CPUs began to move into the 3–5 GHz range,
CPU power dissipation and other problems became more important. The ability of
industry to produce ever-faster single CPU systems (linked to
Moore's Law about the periodic doubling of transistor counts) began to be threatened. In the early 21st century, many flavors of
parallel computing began to proliferate, including
multi-core architectures at the low-end and
massively parallel processing at the high end. Ordinary consumer machines and
game consoles began to have parallel processors like the
Intel Core,
AMD K10, and
Cell.
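The multi-core shift described above can be sketched with a toy example: one computation split into chunks that worker processes compute in parallel. This is an illustrative sketch, not production code; it assumes a Unix-like system so that the `fork` start method is available.

```python
# Illustrative sketch of multi-core parallelism: a large sum split
# into chunks computed by independent worker processes. Assumes a
# Unix-like system (the "fork" start method).
import multiprocessing as mp
from concurrent.futures import ProcessPoolExecutor

def partial_sum(bounds):
    lo, hi = bounds
    return sum(range(lo, hi))

def parallel_sum(n, workers=4):
    step = -(-n // workers)  # ceiling division: size of each chunk
    chunks = [(i, min(i + step, n)) for i in range(0, n, step)]
    ctx = mp.get_context("fork")  # fork keeps local functions visible to workers
    with ProcessPoolExecutor(max_workers=workers, mp_context=ctx) as pool:
        return sum(pool.map(partial_sum, chunks))

print(parallel_sum(1_000_000))  # same result as sum(range(1_000_000))
```

The structure is the same regardless of core count: performance gains come from adding workers rather than from a faster single CPU, which is precisely the trade the industry was forced into.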
Graphics card companies such as Nvidia and AMD began introducing programming platforms like
CUDA and
OpenCL to expose their massively parallel hardware.

== See also ==