The use of the transistor in computers in the late 1950s marked the advent of smaller, faster, and more versatile logical elements than were possible with vacuum-tube machines. Because transistors use much less power and have a much longer life, this development alone was responsible for the improved machines called second-generation computers. Components became smaller, as did inter-component spacings, and the systems became much less expensive to build.
D. Integrated Circuits
Late in the 1960s the integrated circuit, or IC, was introduced, making it possible for many transistors to be fabricated on one silicon substrate, with interconnecting wires plated in place. The IC resulted in a further reduction in price, size, and failure rate. The microprocessor became a reality in the mid-1970s with the introduction of the large-scale integrated (LSI) circuit and, later, the very large-scale integrated (VLSI) circuit (microchip), with many thousands of interconnected transistors etched into a single silicon substrate.
To return, then, to the switching capabilities of a modern computer: computers in the 1970s were generally able to handle eight switches at a time. That is, they could deal with eight binary digits, or bits, of data at every cycle. A group of eight bits is called a byte; each byte can represent 256 possible patterns of ONs and OFFs (or 1s and 0s). Each pattern is the equivalent of an instruction, a part of an instruction, or a particular type of datum, such as a number, a character, or a graphics symbol. The pattern 11010010, for example, might be binary data—in this case, the decimal number 210—or it might be an instruction telling the computer to compare data stored in its switches to data stored in a certain memory-chip location.
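To make the arithmetic above concrete, here is a minimal Python sketch (not part of the original article) showing that an 8-bit byte admits 2^8 = 256 distinct ON/OFF patterns, and that the example pattern 11010010, read as an unsigned binary number, is the decimal value 210.

```python
# Illustrative sketch: how an 8-bit pattern maps to a decimal value.

BITS_PER_BYTE = 8

# Number of distinct ON/OFF patterns a byte can hold: 2 ** 8 == 256.
pattern_count = 2 ** BITS_PER_BYTE
print(pattern_count)  # -> 256

# The example pattern from the text, written as a string of 1s and 0s.
pattern = "11010010"

# Interpret the pattern as an unsigned binary number.
value = int(pattern, 2)
print(value)  # -> 210

# The same conversion done by hand: each bit contributes bit * 2**position,
# counting positions from the rightmost (least significant) bit.
value_by_hand = sum(int(bit) << pos for pos, bit in enumerate(reversed(pattern)))
assert value_by_hand == value == 210
```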
The development of processors that can handle 16, 32, and 64 bits of data at a time has increased the speed of computers. The complete collection of recognizable patterns—the total list of operations—of which a computer is capable is called its instruction set. Both factors—the number of bits that can be handled at one time, and the size of instruction sets—continue to increase with the ongoing development of modern digital computers.
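As a purely hypothetical illustration of the instruction-set idea, the sketch below maps a few 8-bit patterns to named operations and treats any other pattern as data. The patterns and operation names are invented for this example and are not taken from any real processor.

```python
# Toy "instruction set": a fixed table mapping a few 8-bit patterns to
# named operations, in the spirit of the text's example where one pattern
# may be data and another an instruction.

TOY_INSTRUCTION_SET = {
    0b00000001: "LOAD",     # copy a value from memory into a register
    0b00000010: "STORE",    # copy a register back to memory
    0b11010010: "COMPARE",  # compare a register against a memory location
}

def decode(byte: int) -> str:
    """Return the operation name for an 8-bit pattern, or treat it as data."""
    return TOY_INSTRUCTION_SET.get(byte, f"DATA({byte})")

print(decode(0b11010010))  # -> COMPARE (when interpreted as an instruction)
print(decode(0b11010011))  # -> DATA(211) (no matching instruction pattern)
```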
History
The first adding machine, a precursor of the digital computer, was devised in 1642 by the French scientist, mathematician, and philosopher Blaise Pascal. This device used a series of ten-toothed wheels, each tooth representing a digit from 0 to 9. The wheels were connected so that numbers could be added to each other by advancing the wheels by the correct number of teeth. In the 1670s the German philosopher and mathematician Gottfried Wilhelm Leibniz improved on this machine by devising one that could also multiply.
The French inventor Joseph-Marie Jacquard, in designing an automatic loom, used thin perforated wooden boards to control the weaving of complicated designs. During the 1880s the American statistician Herman Hollerith conceived the idea of using perforated cards, similar to Jacquard's boards, to process data. By employing a system that passed punched cards over electrical contacts, he was able to compile statistical information for the 1890 United States census.
1. The Analytical Engine
Also in the nineteenth century, the British mathematician and inventor Charles Babbage worked out the principles of the modern digital computer. He conceived a number of machines, such as the Difference Engine, designed to handle complicated mathematical problems. Many historians consider Babbage and his associate, the mathematician Augusta Ada Byron, the true pioneers of the modern digital computer. One of Babbage's designs, the Analytical Engine, had many of the features of a modern computer. It had an input stream in the form of a deck of punched cards, a "store" for saving data, a "mill" for carrying out arithmetic operations, and a printer that produced a permanent record. Babbage failed to put this idea into practice, though it may well have been technically possible at that time.