iLiR Posted November 22, 2012 (edited November 23, 2012 by ilir)
https://gtaforums.com/topic/535381-history-of-computing/

Hello and welcome to the History of Computing thread. This guide will tour you through the history of computing and early computer hardware, from the first known working computers to the machines of today. As you can see from the contents below, the guide is divided into chapters to make it easier and more interactive to read. All I can say is that I hope you WILL enjoy reading it and learn more about the computers we use today. As I'm only human, it's normal that I make mistakes, so if you notice an error in this guide, or want to offer suggestions/ideas/praise/complaints, feel free to PM me. ENJOY.

1. The start
 1.1 Content
2. The word "Computer"
 2.1 The meaning
 2.2 First usage
 2.3 Computing hardware
3. First computers
 3.1 Earliest true hardware (part 1)
 3.1 Earliest true hardware (part 2)
 3.2 Punched card technology
 3.3 Desktop calculators
 3.4 Advanced analog computers
4. Modern computers
 4.1 Limited-function early computers
 4.2 First general-purpose computers
5. Generations of computers
 5.1 First generation computers
 5.2 Second generation computers
 5.3 Third generation computers
 5.4 Fourth generation computers
6. Classes of computers
 6.1 Classes based on principles of operation
 6.2 Classes by size and configuration
 6.3 Classes by function
7. Computer history Timeline
Here I will explain the meaning of the word "computer", as well as its first usage, and some of the old computing hardware that led to the creation of more advanced computers, the analog computers.

[top] - [content]

• A computer is a general-purpose device that can be programmed to carry out a finite set of arithmetic or logical operations. In its most basic form, a computer is any device which aids humans in performing various kinds of computations or calculations. Since a sequence of operations can be readily changed, the computer can solve more than one kind of problem.

• Conventionally, a computer consists of at least one processing element, typically a central processing unit (CPU), and some form of memory. The processing element carries out arithmetic and logic operations, and a sequencing and control unit can change the order of operations based on stored information. Peripheral devices allow information to be retrieved from an external source, and the results of operations to be saved and retrieved.

• The first electronic digital computers were developed between 1940 and 1945 in the United Kingdom and the United States. Originally they were the size of a large room, consuming as much power as several hundred modern personal computers (PCs). In that era mechanical analog computers were still used for military applications; modern computers perform such calculations electronically, which enables them to carry out a vastly greater number of computations in less time.

• Despite the fact that we currently use computers to process images, sound, text and other non-numerical forms of data, all of it depends on nothing more than basic numerical calculations. Graphics, sound etc. are merely abstractions of the numbers being crunched within the machine; in digital computers these are the ones and zeros, representing electrical on and off states, and endless combinations of those.
In other words, every image, every sound, and every word has a corresponding binary code.

Binary system

• While the abacus may technically have been the first computer, most people today associate the word "computer" with the electronic computers invented in the last century, which have evolved into the modern computers we know today.

[top] - [content]

• The first recorded use of the word "computer" was in 1613, referring to a person who carried out calculations, or computations, and the word continued to carry that meaning until the middle of the 20th century. From the end of the 19th century the word began to take on its more familiar meaning: a machine that carries out computations.

A woman working as a "computer"

• The earliest computer was the abacus, used to perform basic arithmetic operations. Every computer supports some form of input, processing, and output. This is less obvious on a primitive device such as the abacus, where input, output and processing are simply the acts of moving the pebbles into new positions, seeing the changed positions, and counting. Regardless, this is what computing is all about, in a nutshell: we input information, the computer processes it according to its basic logic or the program currently running, and outputs the results.

A very old abacus

[top] - [content]

• Computing hardware evolved from machines that needed separate manual action to perform each arithmetic operation, to punched card machines, and then to stored-program computers. The history of stored-program computers relates first to computer architecture, that is, the organization of the units to perform input and output, to store data, and to operate as an integrated mechanism.

• Before the development of the general-purpose computer, most calculations were done by humans. Mechanical tools to help humans with digital calculations were then called "calculating machines" (see picture), by proprietary names, or, as they are now, calculators.
It was the humans who used those machines who were then called computers. Aside from written numerals, the first aids to computation were purely mechanical devices which required the operator to set up the initial values of an elementary arithmetic operation, then manipulate the device to obtain the result.

• A sophisticated (and comparatively recent) example is the slide rule, in which numbers are represented as lengths on a logarithmic scale and computation is performed by setting a cursor and aligning sliding scales, thus adding those lengths. Numbers could also be represented in a continuous "analog" form, for instance as a voltage or some other physical property set to be proportional to the number.

Slide rule

• Analog computers, like those designed and built by Vannevar Bush before World War II, were of this type. Alternatively, numbers could be represented in the form of digits, automatically manipulated by a mechanical mechanism. Although this last approach often required more complex mechanisms, it made for greater precision of results.

Vannevar Bush (1890–1974)

• The invention of electronic amplifiers made calculating machines much faster than their mechanical or electromechanical predecessors. Vacuum tube (thermionic valve) amplifiers gave way to solid-state transistors, and then rapidly to integrated circuits, which continue to improve, placing millions of electrical switches (typically transistors) on a single elaborately manufactured piece of semiconductor the size of a fingernail. By defeating the tyranny of numbers, integrated circuits made high-speed, low-cost digital computers a widespread commodity. There is an ongoing effort to make computer hardware faster, cheaper, and capable of storing more data.

Old transistor
Old vacuum tubes

• Computing hardware has become a platform for uses other than mere computation, such as process automation, electronic communications, equipment control, entertainment, education, etc.
Each field in turn has imposed its own requirements on the hardware, which has evolved in response to those requirements, such as the role of the touch screen in creating a more intuitive and natural user interface.

• As all computers rely on digital storage, and tend to be limited by the size and speed of memory, the history of computer data storage is tied to the development of computers.

[top] - [content]
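The point above about all data reducing to ones and zeros can be illustrated with a short sketch (Python, purely for illustration; it is not part of the original guide):

```python
# Every character has a numeric code (here: its Unicode/ASCII value),
# and every number has a binary representation of ones and zeros.
def to_binary(text):
    # Map each character to its 8-bit binary code.
    return [format(ord(ch), "08b") for ch in text]

print(to_binary("Hi"))  # ['01001000', '01101001']
```

The same idea extends to images and sound: pixel intensities and audio samples are just more numbers, stored as the same kind of bit patterns.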
This is the section where the computing history begins. Here you will find the oldest true computing devices, which were used to do real work. Read on and you will understand what I mean.

[top] - [content]

Part 1

• Devices have been used to aid computation for thousands of years, mostly using one-to-one correspondence with our fingers. The earliest counting device was probably a form of tally stick. Later record-keeping aids throughout the Fertile Crescent included calculi (clay spheres, cones, etc.), which represented counts of items, probably livestock or grains, sealed in containers. The use of counting rods is another example.

Tally sticks
Yang Hui's triangle (Pascal's triangle) drawn with rod numerals

• Several analog computers were constructed in ancient and medieval times to perform astronomical calculations. These include the Antikythera mechanism and the astrolabe from ancient Greece (c. 150–100 BC), which are generally regarded as the earliest known mechanical analog computers.

Antikythera mechanism
The astrolabe

• Hero of Alexandria (c. 10–70 AD) made many complex mechanical devices, including automata and a programmable cart.

"Automata"

• Other early mechanical devices used to perform one or another type of calculation include the planisphere and other computing devices invented by Abū Rayhān al-Bīrūnī (c. AD 1000); the equatorium and the universal latitude-independent astrolabe by Abū Ishāq Ibrāhīm al-Zarqālī (c. AD 1015); the astronomical analog computers of other medieval Muslim astronomers and engineers; and the astronomical clock tower of Su Song (c. AD 1090) during the Song Dynasty.
Abū Rayhān al-Bīrūnī (973–1048)
Planisphere
Equatorium of al-Zarqālī
Su Song (1020–1101)
Astronomical clock tower of Su Song

• Scottish mathematician and physicist John Napier noted that multiplication and division of numbers could be performed by addition and subtraction, respectively, of the logarithms of those numbers. While producing the first logarithmic tables Napier needed to perform many multiplications, and it was at this point that he designed Napier's bones, an abacus-like device used for multiplication and division. Since real numbers can be represented as distances or intervals on a line, the slide rule was invented in the 1620s to allow multiplication and division to be carried out significantly faster than was previously possible. Slide rules were used by generations of engineers and other mathematically involved professionals until the invention of the pocket calculator.

John Napier (1550–1617)
Napier's bones

• Wilhelm Schickard, a German polymath, designed a calculating clock in 1623. It made use of a single-tooth gear that was not an adequate solution for a general carry mechanism.[8] A fire destroyed the machine during its construction in 1624 and Schickard abandoned the project. Two sketches of it were discovered in 1957, too late to have any impact on the development of mechanical calculators.

Wilhelm Schickard (1592–1635)

[top] - [content]
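Napier's insight above, that log(a·b) = log a + log b turns multiplication into addition, is easy to demonstrate (a Python sketch added for illustration, not part of the original guide):

```python
import math

# Multiply two numbers using only addition, via logarithms --
# the same trick a slide rule performs with physical lengths.
def multiply_via_logs(a, b):
    return math.exp(math.log(a) + math.log(b))

print(round(multiply_via_logs(6, 7)))  # 42
```

A slide rule does exactly this mechanically: the two sliding scales are marked logarithmically, so laying one length after the other adds the logs and the cursor reads off the product.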
Part 2

• In 1642, while still a teenager, Blaise Pascal started some pioneering work on calculating machines, and after three years of effort and 50 prototypes he invented the mechanical calculator. He built twenty of these machines (called Pascal's calculator or Pascaline) in the following ten years. Nine Pascalines have survived, most of which are on display in European museums.

Blaise Pascal (1623–1662)
Pascaline

• Gottfried Wilhelm von Leibniz invented the Stepped Reckoner and his famous stepped-drum cylinders around 1672, adding direct multiplication and division to the capabilities of the Pascaline. Leibniz once said: "It is unworthy of excellent men to lose hours like slaves in the labour of calculation which could safely be relegated to anyone else if machines were used."

Gottfried Wilhelm Leibniz (1646–1716)
Stepped Reckoner

• Around 1820, Charles Xavier Thomas created the first successful, mass-produced mechanical calculator, the Thomas Arithmometer, which could add, subtract, multiply, and divide. It was mainly based on Leibniz's work. Mechanical calculators, like the base-ten Addiator, the Comptometer, the Monroe, the Curta and the Addo-X, remained in use until the 1970s. Leibniz also described the binary numeral system, a central ingredient of all modern computers. However, up to the 1940s many subsequent designs (including Charles Babbage's machines of 1822 and even the ENIAC of 1945) were based on the decimal system; ENIAC's ring counters emulated the operation of the digit wheels of a mechanical adding machine.

Charles Xavier Thomas (1785–1870)
The Arithmometer
Addo-X

• In Japan, Ryōichi Yazu patented a mechanical calculator called the Yazu Arithmometer in 1903. It consisted of a single cylinder and 22 gears, and employed the mixed base-2 and base-5 number system familiar to users of the soroban (Japanese abacus). Carry and end of calculation were determined automatically.
More than 200 units were sold, mainly to government agencies such as the Ministry of War and agricultural experiment stations.

Ryōichi Yazu and his patent
Yazu Arithmometer

[top] - [content]
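The mixed base-5/base-2 ("bi-quinary") representation mentioned above can be sketched in a couple of lines (Python, purely illustrative):

```python
# A soroban represents each decimal digit with one "heaven" bead
# worth 5 and up to four "earth" beads worth 1: d = 5*heaven + earth.
def soroban_digit(d):
    assert 0 <= d <= 9
    return divmod(d, 5)  # (heaven beads set, earth beads set)

print([soroban_digit(d) for d in (3, 7)])  # [(0, 3), (1, 2)]
```

Splitting each digit this way means no column ever needs more than five movable parts, which is what made the scheme attractive for compact mechanical calculators.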
• In 1801, Joseph-Marie Jacquard developed a loom in which the pattern being woven was controlled by punched cards. The series of cards could be changed without changing the mechanical design of the loom. This was a landmark achievement in programmability. His machine was an improvement over similar weaving looms. Punched cards were preceded by punched bands, as in the machine proposed by Basile Bouchon. These bands would inspire information recording for automatic pianos and, more recently, NC machine tools.

Joseph-Marie Jacquard (1752–1834)
A punched card
Jacquard's loom

• In 1833, Charles Babbage moved on from developing his difference engine (for navigational calculations) to a general-purpose design, the Analytical Engine, which drew directly on Jacquard's punched cards for its program storage. In 1837 Babbage described the Analytical Engine in detail. It was a general-purpose programmable computer, employing punched cards for input and a steam engine for power, and using the positions of gears and shafts to represent numbers. His initial idea had been to use punched cards to control a machine that could calculate and print logarithmic tables with great precision (a special-purpose machine); this soon developed into the idea of a general-purpose programmable computer.

• While his design was sound and the plans were probably correct, or at least debuggable, the project was slowed by various problems, including disputes with the chief machinist building parts for it. Babbage was a difficult man to work with and argued with everyone. All the parts for his machine had to be made by hand, and small errors in each item could sometimes sum to cause large discrepancies. In a machine with thousands of parts, which required these parts to be made to much tighter tolerances than was usual at the time, this was a major problem.
The project dissolved in disputes with the artisan who built the parts, and ended with the decision of the British Government to cease funding. Ada Lovelace, Lord Byron's daughter, translated and added notes to the "Sketch of the Analytical Engine" by Luigi Federico Menabrea. This appears to be the first published description of programming.

Charles Babbage (1791–1871)
Analytical Engine of Babbage

• A reconstruction of the Difference Engine II, an earlier, more limited design, has been operational since 1991 at the London Science Museum. With a few trivial changes, it works exactly as Babbage designed it, and shows that Babbage's design ideas were correct, merely too far ahead of his time. The museum used computer-controlled machine tools to construct the necessary parts, using tolerances a good machinist of the period would have been able to achieve. Babbage's failure to complete the Analytical Engine can be chiefly attributed not only to difficulties of politics and financing, but also to his desire to develop an increasingly sophisticated computer and to move ahead faster than anyone else could follow.

Difference Engine II

• A machine based on Babbage's difference engine was built in 1843 by Per Georg Scheutz and his son Edvard. An improved Scheutzian calculation engine was sold to the British government, and a later model was sold to the American government; these were used successfully in the production of logarithmic tables.

Per Georg Scheutz (1785–1873)
Scheutzian mechanical calculator

• Following Babbage, although unaware of his earlier work, was Percy Ludgate, an accountant from Dublin, Ireland. He independently designed a programmable mechanical computer, which he described in a work that was published in 1909.

Percy Ludgate (1883–1922)

[top] - [content]
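The method behind Babbage's difference engine is worth a short sketch: for a polynomial of degree n, the n-th finite differences are constant, so a whole table can be produced using nothing but repeated addition. A minimal digital rendering (Python, illustrative only; the real engine did this with columns of gear wheels):

```python
# Babbage's difference engine tabulated polynomials using only addition:
# each column is repeatedly incremented by the column to its right.
def difference_engine(initial_diffs, steps):
    diffs = list(initial_diffs)  # [f(0), delta f(0), delta^2 f(0), ...]
    values = []
    for _ in range(steps):
        values.append(diffs[0])
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return values

# f(x) = x^2 has f(0)=0, delta f(0)=1, delta^2 f(0)=2
# (since (x+1)^2 - x^2 = 2x + 1, whose difference is the constant 2).
print(difference_engine([0, 1, 2], 6))  # [0, 1, 4, 9, 16, 25]
```

No multiplication ever happens, which is exactly why the scheme suited a purely mechanical adding machine.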
• By the 20th century, earlier mechanical calculators, cash registers, accounting machines, and so on were redesigned to use electric motors, with gear position as the representation of the state of a variable. The word "computer" was a job title assigned to people who used these calculators to perform mathematical calculations. By the 1920s Lewis Fry Richardson's interest in weather prediction led him to propose human computers and numerical analysis to model the weather; to this day, the most powerful computers on Earth are needed to adequately model the weather using the Navier–Stokes equations.

Lewis Fry Richardson (1881–1953)
Navier–Stokes equations

• Companies like Friden, Marchant Calculator and Monroe made desktop mechanical calculators from the 1930s that could add, subtract, multiply and divide. During the Manhattan Project, future Nobel laureate Richard Feynman was the supervisor of human computers, who understood the use of the differential equations being solved for the war effort.

Differential equations

• In 1948 the Curta was introduced. This was a small, portable mechanical calculator that was about the size of a pepper grinder. Over time, during the 1950s and 1960s, a variety of different brands of mechanical calculators appeared on the market. The first all-electronic desktop calculator was the British ANITA Mk.VII, which used a Nixie tube display and 177 subminiature thyratron tubes. In June 1963, Friden introduced the four-function EC-130. It had an all-transistor design, 13-digit capacity on a 5-inch (130 mm) CRT, and introduced Reverse Polish Notation (RPN) to the calculator market, at a price of $2200. The EC-132 model added square root and reciprocal functions. In 1965, Wang Laboratories produced the LOCI-2, a 10-digit transistorized desktop calculator that used a Nixie tube display and could compute logarithms.
In the early days of binary vacuum-tube computers, their reliability was poor enough to justify marketing a mechanical octal version ("Binary-Octal") of the Marchant desktop calculator, intended to check and verify the calculation results of such computers.

Curta
ANITA Mk.VII
EC-132
LOCI-2

[top] - [content]
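The Reverse Polish Notation that the Friden EC-130 brought to calculators keeps operands on a stack and applies each operator to the top two entries, so no parentheses are needed. A minimal sketch (Python, purely illustrative; this is not the EC-130's actual logic):

```python
# Evaluate a Reverse Polish Notation expression with a stack:
# operands are pushed; an operator pops two values and pushes the result.
def eval_rpn(tokens):
    stack = []
    ops = {
        "+": lambda a, b: a + b,
        "-": lambda a, b: a - b,
        "*": lambda a, b: a * b,
        "/": lambda a, b: a / b,
    }
    for tok in tokens:
        if tok in ops:
            b = stack.pop()
            a = stack.pop()
            stack.append(ops[tok](a, b))
        else:
            stack.append(float(tok))
    return stack.pop()

# (3 + 4) * 2 written in RPN:
print(eval_rpn("3 4 + 2 *".split()))  # 14.0
```

The stack discipline is the same reason RPN suited early calculator hardware: each keystroke triggers at most one pop-pop-push cycle, with no expression parser required.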
• Before World War II, mechanical and electrical analog computers were considered the "state of the art", and many thought they were the future of computing. Analog computers take advantage of the strong similarities between the mathematics of small-scale properties (the position and motion of wheels, or the voltage and current of electronic components) and the mathematics of other physical phenomena, for example ballistic trajectories, inertia, resonance, energy transfer, momentum, and so forth. They model physical phenomena with electrical voltages and currents as the analog quantities.

• Centrally, these analog systems work by creating electrical 'analogs' of other systems, allowing users to predict the behavior of the systems of interest by observing the electrical analogs. The most useful of the analogies was the way small-scale behavior could be represented with integral and differential equations, and could thus be used to solve those equations. An ingenious example of such a machine, using water as the analog quantity, was the water integrator built in 1928; an electrical example is the Mallock machine built in 1941. A planimeter is a device which computes integrals, using distance as the analog quantity. Unlike modern digital computers, analog computers are not very flexible, and need to be rewired manually to switch from working on one problem to another. Analog computers had an advantage over early digital computers in that they could solve complex problems using behavioral analogues, while the earliest attempts at digital computers were quite limited.

Water integrator
Mallock machine
Planimeter

• Some of the most widely deployed analog computers included devices for aiming weapons, such as the Norden bombsight, and fire-control systems, such as Arthur Pollen's Argo system for naval vessels.
Some stayed in use for decades after World War II; the Mark I Fire Control Computer was deployed by the United States Navy on a variety of ships, from destroyers to battleships. Other analog computers included the Heathkit EC-1 and the hydraulic MONIAC Computer, which modeled econometric flows.

Norden bombsight
Mark I Fire Control Computer
Heathkit EC-1
MONIAC Computer

• The art of mechanical analog computing reached its zenith with the differential analyzer, built by H. L. Hazen and Vannevar Bush at MIT (Massachusetts Institute of Technology) starting in 1927, which in turn built on the mechanical integrators invented in 1876 by James Thomson and the torque amplifiers invented by H. W. Nieman. A dozen of these devices were built before their obsolescence became obvious; the most powerful was constructed at the University of Pennsylvania's Moore School of Electrical Engineering, where the ENIAC was built. Digital electronic computers like the ENIAC spelled the end for most analog computing machines, but hybrid analog computers, controlled by digital electronics, remained in substantial use into the 1950s and 1960s, and later in some specialized applications.

James Thomson (1822–1892)

[top] - [content]
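A differential analyzer solved equations by chaining mechanical integrators; the digital counterpart of one integrator is simply a running sum. A toy sketch (Python, illustrative only, and a modern numerical method rather than how the analyzer actually worked), stepping through dy/dt = -y:

```python
# One mechanical integrator accumulates its input over time.
# Digitally, that is a running sum: y(t + dt) ~ y(t) + f(y) * dt.
def integrate(dy_dt, y0, dt, steps):
    y = y0
    for _ in range(steps):
        y += dy_dt(y) * dt
    return y

# Solve dy/dt = -y with y(0) = 1 up to t = 1;
# the exact answer is e^-1, about 0.3679.
approx = integrate(lambda y: -y, 1.0, 0.001, 1000)
print(round(approx, 3))
```

Where this sketch adds a small number each step, the analyzer turned a wheel continuously; the feedback of the output back into the input is the same in both cases.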
The evolution from early calculating machines to modern computers was long and difficult. Inventors initially focused on creating automatic calculators that would make clerical and commercial work a lot easier. There were many other reasons to want automatic machines, and this wealth of inventions eventually carried computing into the modern era.

[top] - [content]

• The history of the modern computer begins with two separate technologies, automated calculation and programmability, but no single device can be identified as the earliest computer, partly because of the inconsistent application of that term. A few devices are worth mentioning, though. Some mechanical aids to computing were very successful and survived for centuries until the advent of the electronic calculator: the Sumerian abacus, designed around 2500 BC, a descendant of which won a speed competition against a modern desk calculating machine in Japan in 1946; the slide rule, invented in the 1620s, which was carried on five Apollo space missions, including to the Moon; and arguably the astrolabe and the Antikythera mechanism, an ancient astronomical computer built by the Greeks around 80 BC. The Greek mathematician Hero of Alexandria (c. 10–70 AD) built a mechanical theater which performed a play lasting 10 minutes, operated by a complex system of ropes and drums that might be considered a means of deciding which parts of the mechanism performed which actions, and when. This is the essence of programmability.

Apollo program

• Around the end of the 10th century, the French monk Gerbert d'Aurillac brought back from Spain the drawings of a machine, invented by the Moors, that answered either Yes or No to the questions it was asked.
Again in the 13th century, the monks Albertus Magnus and Roger Bacon reportedly built talking androids, without any further development (Albertus Magnus complained that he had wasted forty years of his life when Thomas Aquinas, terrified by his machine, destroyed it).

Pope Sylvester II (born Gerbert d'Aurillac) (946–1003)
Albertus Magnus (1206–1280)
Roger Bacon (1214–1294)

• In 1642 the Renaissance saw the invention of Pascal's mechanical calculator, a device that could perform all four arithmetic operations without relying on human intelligence. The mechanical calculator was at the root of the development of computers in two separate ways. Initially, it was in trying to develop more powerful and more flexible calculators that the computer was first theorized, by Charles Babbage, and then developed. Secondly, the development of a low-cost electronic calculator, successor to the mechanical calculator, resulted in the development by Intel of the first commercially available microprocessor integrated circuit.

Old Intel logo

[top] - [content]
[top] - [content]

• In 1801, Joseph Marie Jacquard made an improvement to the textile loom by introducing a series of punched paper cards as a template, which allowed his loom to weave intricate patterns automatically. The resulting Jacquard loom was an important step in the development of computers because the use of punched cards to define woven patterns can be viewed as an early, albeit limited, form of programmability.

• In the late 1880s, Herman Hollerith invented the recording of data on a machine-readable medium. Earlier uses of machine-readable media had been for control, not data. "After some initial trials with paper tape, he settled on punched cards ..." To process these punched cards he invented the tabulator and the keypunch machine. These three inventions were the foundation of the modern information processing industry. Large-scale automated data processing of punched cards was performed for the 1890 United States Census by Hollerith's company, which later became the core of IBM. By the end of the 19th century a number of ideas and technologies that would later prove useful in the realization of practical computers had begun to appear: Boolean algebra, the vacuum tube (thermionic valve), punched cards and tape, and the teleprinter.

Herman Hollerith (1860–1929)
Hollerith's tabulator

• During the first half of the 20th century, many scientific computing needs were met by increasingly sophisticated analog computers, which used a direct mechanical or electrical model of the problem as the basis for computation. However, these were not programmable and generally lacked the versatility and accuracy of modern digital computers.

• Alan Turing is widely regarded as the father of modern computer science.
In 1936 Turing provided an influential formalisation of the concepts of algorithm and computation with the Turing machine, providing a blueprint for the electronic digital computer. Of his role in the creation of the modern computer, Time magazine, in naming Turing one of the 100 most influential people of the 20th century, stated: "The fact remains that everyone who taps at a keyboard, opening a spreadsheet or a word-processing program, is working on an incarnation of a Turing machine".

Alan Turing (1912–1954)
Illustration of a Turing machine

• The Atanasoff–Berry Computer (ABC) was the world's first electronic digital computer, albeit not programmable; Atanasoff is considered to be one of the fathers of the computer. Conceived in 1937 by Iowa State College physics professor John Atanasoff, and built with the assistance of graduate student Clifford Berry, the machine was designed only to solve systems of linear equations. The computer did, however, employ parallel computation. A 1973 court ruling in a patent dispute found that the patent for the 1946 ENIAC computer derived from the Atanasoff–Berry Computer.

John Vincent Atanasoff (1903–1995)
Clifford Berry (1918–1963)
ABC (conceived in 1937, tested in 1942)

• The first program-controlled computer was invented by Konrad Zuse, who built the Z3, an electromechanical computing machine, in 1941.

Konrad Zuse (1910–1995)
The Z3

• The first programmable electronic computer was the Colossus, built in 1943 by Tommy Flowers.

Tommy Flowers (1905–1998)
The Colossus

• George Stibitz is internationally recognized as a father of the modern digital computer. While working at Bell Labs in November 1937, Stibitz invented and built a relay-based calculator he dubbed the "Model K" (for "kitchen table", on which he had assembled it), which was the first to use binary circuits to perform an arithmetic operation. Later models added greater sophistication, including complex arithmetic and programmability.
George Stibitz (1904–1995)
"Model K"

• A succession of steadily more powerful and flexible computing devices was constructed in the 1930s and 1940s, gradually adding the key features seen in modern computers. The use of digital electronics (largely invented by Claude Shannon in 1937) and more flexible programmability were vitally important steps, but defining one point along this road as "the first digital electronic computer" is difficult. Notable achievements include:

Konrad Zuse's electromechanical "Z machines". The Z3 (1941) was the first working machine featuring binary arithmetic, including floating-point arithmetic and a measure of programmability. In 1998 the Z3 was proved to be Turing complete, making it arguably the world's first operational computer.

The non-programmable Atanasoff–Berry Computer (commenced in 1937, completed in 1941), which used vacuum tube based computation, binary numbers, and regenerative capacitor memory. The use of regenerative memory allowed it to be much more compact than its peers (approximately the size of a large desk or workbench), since intermediate results could be stored and then fed back into the same set of computation elements.

The secret British Colossus computers (1943),[29] which had limited programmability but demonstrated that a device using thousands of tubes could be reasonably reliable and electronically reprogrammable. They were used for breaking German wartime codes.

The Harvard Mark I (1944), a large-scale electromechanical computer with limited programmability.

The U.S. Army's Ballistic Research Laboratory ENIAC (1946), which used decimal arithmetic and is sometimes called the first general-purpose electronic computer (since Konrad Zuse's Z3 of 1941 used electromagnets instead of electronics). Initially, however, ENIAC had an inflexible architecture which essentially required rewiring to change its programming.
Claude Shannon (1916–2001)

[top] - [content]
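Turing's 1936 formalisation mentioned above reduces all computation to a head reading and writing symbols on a tape under a finite table of rules. A minimal simulator (Python, illustrative only; the machine below is a made-up example that flips a string of bits):

```python
# A Turing machine rule table: (state, symbol) -> (write, move, next state).
# This example machine inverts every bit on the tape, then halts on a blank.
RULES = {
    ("scan", "0"): ("1", 1, "scan"),
    ("scan", "1"): ("0", 1, "scan"),
    ("scan", "_"): ("_", 0, "halt"),  # blank cell: stop
}

def run(tape):
    tape = list(tape) + ["_"]   # "_" marks the blank end of the tape
    state, head = "scan", 0
    while state != "halt":
        write, move, state = RULES[(state, tape[head])]
        tape[head] = write
        head += move
    return "".join(tape).rstrip("_")

print(run("1011"))  # 0100
```

Everything a modern CPU does can, in principle, be reduced to a (much larger) rule table of this shape, which is why the Turing machine serves as the reference model of computability.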
iLiR Posted November 23, 2012 (Author)

The history of computers is divided into four generations. First generation computers used vacuum tubes. The invention of the transistor marked the start of the second generation. The third generation began with the invention of integrated circuits, also known as microchips. Microprocessors marked the fourth and latest generation, on which today's computers are based.

[top] - [content]

VACUUM TUBES

• The first electronic computers used vacuum tubes, and they were huge and very complex. The first general purpose electronic computer was ENIAC (Electronic Numerical Integrator And Computer), designed by J. Presper Eckert and John Mauchly. It was digital, although it didn't operate with binary code, and was reprogrammable to solve a complete range of computing problems. It was programmed using plugboards and switches, supporting input from an IBM card reader and output to an IBM card punch. It took up 167 square meters, weighed 27 tons, and consumed 160 kilowatts of power. It used 18,000 vacuum tubes, 7,200 crystal diodes, 1,500 relays, 70,000 resistors, 10,000 capacitors, and 5 million hand-soldered joints.

John Presper Eckert Jr. (1919–1995) — ENIAC (announced in 1946)

• The first non-general-purpose electronic computer was the ABC (Atanasoff–Berry Computer), invented by John Vincent Atanasoff. Other computers of this era included the German Z3, the ten British Colossus computers, LEO, the Harvard Mark I, UNIVAC and others.

John Vincent Atanasoff (1903–1995) — ABC (conceived in 1937, tested in 1942)

[top] - [content]

TRANSISTORS

• The second generation of computers came about thanks to the transistor, which started replacing vacuum tubes in computer design. The bipolar transistor was invented in 1947; from 1955 onwards transistors replaced vacuum tubes in computer designs, giving rise to the "second generation" of computers.
Initially the only devices available were germanium point-contact transistors, which, although less reliable than the vacuum tubes they replaced, had the advantage of consuming far less power. Transistor computers consumed far less power, produced far less heat, and were much smaller than the first generation, albeit still big by today's standards.

Transistor

• The first transistor computer was created at the University of Manchester in 1953. The most popular transistor computer was the IBM 1401. IBM also created the first hard disk drive in 1956, as part of the IBM 305 RAMAC system.

IBM 1401 — IBM 305 RAMAC

[top] - [content]

INTEGRATED CIRCUITS

• The invention of integrated circuits (ICs), also known as microchips, paved the way for computers as we know them today. Making circuits out of single pieces of silicon, which is a semiconductor, allowed them to be much smaller and more practical to produce. This also started the ongoing process of integrating an ever larger number of transistors onto a single microchip. During the sixties microchips started making their way into computers, but the process was gradual, and the second generation of computers still held on.

Microchip

• Minicomputers appeared first; the earliest of these were still based on discrete (non-microchip) transistors, while later versions were hybrids, based on both transistors and microchips, such as IBM's System/360. They were much smaller and cheaper than the first and second generation machines, also known as mainframes. Minicomputers can be seen as a bridge between mainframes and microcomputers, which came later as the proliferation of microchips in computers grew.

IBM System/360

[top] - [content]

MICROPROCESSORS

• The first microchip-based central processing units consisted of multiple microchips for different CPU components. The drive for ever greater integration and miniaturization led towards single-chip CPUs, where all of the necessary CPU components were put onto a single microchip, called a microprocessor.
The first single-chip CPU, or microprocessor, was the Intel 4004, released in 1971.

Intel 4004

The advent of the microprocessor spawned the evolution of microcomputers, the kind that would eventually become the personal computers we are familiar with today.

[top] - [content]
iLiR Posted November 23, 2012 (Author)

Computers can be classified in many ways. Most commonly, they are classified into:

• Classes based on principles of operation
• Classes by size and configuration
• Classes by function

All three classes are explained below.

[top] - [content]

There are three different types of computers according to their principles of operation:

Analog Computers
Digital Computers
Hybrid Computers

Analog Computers

- An analog computer is a computing device that works on a continuous range of values. The results given by analog computers will only be approximate, since they deal with quantities that vary continuously. They generally deal with physical variables such as voltage, pressure, temperature, speed, etc. Mechanical analog computers were very important in gun fire control in World War II, the Korean War and well past the Vietnam War; they were made in significant numbers. The development of transistors made electronic analog computers practical, and until digital computers had developed sufficiently, they continued to be commonly used in science and industry.

- Analog computers can have a very wide range of complexity. Slide rules and nomographs are the simplest, while naval gunfire control computers and large hybrid digital/analog computers were among the most complicated.

Analog computers

Digital Computers

- A digital computer, on the other hand, operates on digital data such as numbers. It uses the binary number system, in which there are only two digits, 0 and 1; each binary digit is called a bit.

- The digital computer is designed using digital circuits in which there are two levels for an input or output signal. These two levels are known as logic 0 and logic 1. Digital computers can give more accurate and faster results, and are well suited to solving complex problems in engineering and technology.
Hence digital computers have increasing use in the fields of design, research and data processing.

- Based on their purpose, digital computers can be further classified as:

General Purpose Computers
Special Purpose Computers

- A special purpose computer is one that is built for a specific application. General purpose computers can be used for any type of application: they can store different programs and do jobs as per the instructions specified in those programs. Most of the computers that we see today are general purpose computers.

- A general purpose computer is a computer designed to perform, or that is capable of performing, in a reasonably efficient manner, the functions required by both scientific and business applications. A general purpose computer is often understood to be a large system, capable of supporting remote terminal operations, but it may also be a smaller computer, e.g. a desktop workstation.

Digital computers

Hybrid Computers

- A hybrid computer combines the desirable features of analog and digital computers. It is mostly used for the automatic operation of complicated physical processes and machines. Nowadays analog-to-digital and digital-to-analog converters are used for transforming the data into a suitable form for either type of computation.

- For example, in a hospital's ICU, analog devices might measure the patient's temperature, blood pressure and other vital signs. These analog measurements might then be converted into numbers and supplied to the digital components in the system, which monitor the patient's vital signs and send signals if any abnormal readings are detected. Hybrid computers are mainly used for specialized tasks.

Hybrid computers

[top] - [content]
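The binary representation described above is easy to see in practice. Here is a minimal sketch in plain Python (the function name and the example number 202 are just illustrative) showing how a decimal value breaks down into the on/off bit states a digital circuit holds, and how those bits reassemble into the original number:

```python
# A digital computer stores every value as bits: binary digits 0 and 1,
# corresponding to the electrical off and on states of its circuits.

def to_bits(n, width=8):
    """Return the binary digits of a non-negative integer, most significant first."""
    return [(n >> i) & 1 for i in reversed(range(width))]

bits = to_bits(202)
print(bits)  # [1, 1, 0, 0, 1, 0, 1, 0]  -> 202 is 11001010 in binary

# Reassemble: each bit contributes bit * 2^position.
value = sum(bit << i for i, bit in enumerate(reversed(bits)))
print(value)  # 202
```

The same idea underlies everything the guide mentions: text, images and sound are all stored as long runs of exactly these 0/1 states.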
iLiR Posted November 23, 2012 (Author)

There are four different classes of computers by size and configuration:

Supercomputers
Mainframe computers
Minicomputers (Midrange computers)
Microcomputers (Personal computers)

Supercomputers

- A supercomputer is a computer that is at the frontline of current processing capacity, particularly speed of calculation. The term supercomputer itself is rather fluid, and the speed of today's supercomputers tends to become typical of tomorrow's ordinary computers.

- A supercomputer is focused on performing tasks involving intense numerical calculations such as weather forecasting, fluid dynamics, nuclear simulations, theoretical astrophysics, and complex scientific computations.

- Supercomputer processing speeds are measured in floating point operations per second, or FLOPS. A floating point operation is a single arithmetic operation, such as an addition or multiplication, carried out on real numbers. In terms of computational capability, memory size and speed, I/O technology, and topological issues such as bandwidth and latency, supercomputers are the most powerful; they are also very expensive, and not cost-effective just to perform batch or transaction processing. Transaction processing is handled by less powerful computers such as server computers or mainframes.

Old supercomputer — Modern supercomputer

Mainframe computers

- The term mainframe computer was created to distinguish the traditional, large, institutional computer intended to service multiple users from the smaller, single-user machines. These computers are capable of handling and processing very large amounts of data quickly.

- Mainframe computers are used in large institutions such as government, banks and large corporations. Their performance is measured in MIPS (Million Instructions Per Second), and they can respond to hundreds of millions of users at a time.
Old mainframe — Modern mainframe

Minicomputers (Midrange computers)

- A minicomputer (colloquially, mini) is a class of multi-user computer that lies in the middle range of the computing spectrum, between the largest multi-user systems (mainframe computers) and the smallest single-user systems (microcomputers or personal computers).

- The contemporary term for this class of system is midrange computer, such as the higher-end SPARC-, POWER- and Itanium-based systems from Oracle Corporation, IBM and Hewlett-Packard (e.g. laboratory computers).

Old minicomputer — Modern minicomputer

Microcomputers (Personal computers)

- Microcomputers are the most common type of computer used by people today, whether in a workplace, at school or on the desk at home. The term "microcomputer" was introduced with the advent of single-chip microprocessors, and is itself now practically an anachronism.

Old microcomputer

- These computers include:

Desktop computers – A case and a display, put under and on a desk.
In-car computers ("carputers") – Built into a car, for entertainment, navigation, etc.
Game consoles – Fixed computers specialized for entertainment purposes (video games).

Desktop computer — Carputer — Video game console

- A separate class is that of mobile devices:

Laptops, notebook computers and palmtop computers – Portable and all in one case. Varying sizes, but other than smartbooks expected to be "full" computers without limitations.
Tablet computers – Like laptops, but with a touch-screen, sometimes entirely replacing the physical keyboard.
Smartphones, smartbooks and PDAs (personal digital assistants) – Small handheld computers with limited hardware.
Programmable calculators – Like small handhelds, but specialised for mathematical work.
Handheld game consoles – The same as game consoles, but small and portable.
Dell laptop — iPad tablet computer — Acer PDA — PSP handheld game console

[top] - [content]
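The FLOPS figure used above to rate supercomputers lends itself to a quick back-of-the-envelope calculation. This sketch uses illustrative round numbers (the ratings below are assumptions for the example, not benchmark results) to show why certain workloads only make sense on a supercomputer:

```python
# FLOPS = floating point operations per second. Given a machine's sustained
# FLOPS rating, we can estimate how long a numerical workload would take.

def seconds_to_finish(total_flop, flops_rating):
    """Time in seconds to execute `total_flop` floating point operations."""
    return total_flop / flops_rating

workload = 1e18   # a simulation needing 10^18 floating point operations

desktop = 1e11    # ~100 gigaFLOPS, roughly a desktop CPU (illustrative)
super_pc = 1e16   # ~10 petaFLOPS, a top supercomputer of the early 2010s (illustrative)

print(seconds_to_finish(workload, desktop) / 86400)  # about 115.7 days
print(seconds_to_finish(workload, super_pc))         # 100.0 seconds
```

The five-orders-of-magnitude gap in the ratings translates directly into the gap between "days" and "minutes", which is exactly why weather forecasting and nuclear simulation are supercomputer tasks.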
iLiR Posted November 23, 2012 (Author)

There are four different classes of computers by function:

Servers
Workstations
Information appliances
Embedded computers

Servers

- Server usually refers to a computer that is dedicated to providing a service. For example, a computer dedicated to a database may be called a "database server". "File servers" manage large collections of computer files. "Web servers" serve web pages and web applications.

- Many smaller servers are actually personal computers that have been dedicated to providing services for other computers.

Servers

Workstations

- A workstation is a high-end microcomputer designed for technical or scientific applications. Intended primarily to be used by one person at a time, workstations are commonly connected to a local area network and run multi-user operating systems. The term "workstation" has also been used to refer to a mainframe computer terminal or a PC connected to a network.

- Workstations are intended to serve one user and may contain special hardware enhancements not found on a personal computer.

Workstations

Information appliances

- Information appliances are computers usable for the purposes of computing, telecommunicating, reproducing, and presenting encoded information in myriad forms and applications. They are also specially designed to perform a specific user-friendly function such as playing music, photography, or editing text.

- The term is most commonly applied to mobile devices, though there are also portable and desktop devices of this class.

Information appliance computers

Embedded computers

- Embedded computers are computers that are part of a machine or device. Embedded computers generally execute a program that is stored in non-volatile memory and is only intended to operate a specific machine or device. Embedded computers are very common.
- Embedded computers are typically required to operate continuously without being reset or rebooted, and once employed in their task the software usually cannot be modified. An automobile may contain a number of embedded computers, whereas a washing machine or a DVD player would typically contain only one.

- The central processing units (CPUs) used in embedded computers are often sufficient only for the computational requirements of the specific application, and may be slower and cheaper than the CPUs found in a personal computer.

ADSL modem and DVD player as embedded computers

[top] - [content]
iLiR Posted November 23, 2012 (Author)

I created this Computer History Timeline to cover almost all important events from the first known calculator, the abacus, till today. Furthermore, the timeline covers some events I didn't mention above. Please notify me of any mistakes. Hope you enjoy reading it.

Year | Invention | Inventor
2400 BC | Abacus, the first known calculator. | Invented in Babylonia
87 BC | Antikythera Mechanism: built in Rhodes to track the movement of the stars. | Ancient Greeks
60 AD | A machine which follows a series of instructions is invented. | Heron of Alexandria
724 | The first fully mechanical clock is invented. | Liang Lingzan
1492 | Drawings by Leonardo da Vinci depict inventions such as flying machines, including a helicopter, the first mechanical calculator and one of the first programmable robots. | Leonardo da Vinci
1614 | A system of moveable rods (Napier's Rods) based on logarithms, able to multiply, divide and calculate square and cube roots, is invented. | John Napier
1622 | The slide rule is developed. | William Oughtred
1623 | The Calculating Clock is invented. | Wilhelm Schickard
1642 | The "Pascaline", a mechanical adding machine, is invented. | Blaise Pascal
1671 | The Stepped Reckoner, a mechanical calculator, is designed by Gottfried Leibniz, one of the founding fathers of calculus. | Gottfried Leibniz
1801 | An automatic loom controlled by punched cards is invented. | Joseph-Marie Jacquard
1820 | The Arithmometer, the first mass-produced calculator, is introduced. | Charles Xavier Thomas
1822 | Charles Babbage designs his first mechanical computer, the Difference Engine. | Charles Babbage
1834 | The Analytical Engine is designed. | Charles Babbage
1835 | Morse code is invented. | Samuel Morse
1848 | Boolean algebra is invented. | George Boole
1853 | Per Georg Scheutz and his son Edvard build a printing difference engine. | Per Georg Scheutz
1869 | A practical logic machine is designed. | William Stanley Jevons
1878 | A fast calculator with an internal multiplication table is invented. | Ramon Verea
1880 | The Photophone, a telephone that carries sound on a beam of light, is invented. | Alexander Graham Bell
1884 | The Comptometer, a calculator operated by pressing keys, is invented. | Dorr Eugene Felt
1890 | A counting machine which increments mechanical counters is invented. | Herman Hollerith
1895 | The first radio signals are transmitted. | Guglielmo Marconi
1898 | The remote control is invented. | Nikola Tesla
1906 | Lee De Forest invents the triode electronic tube (Audion). | Lee De Forest
1911 | The Computing-Tabulating-Recording Company, later renamed International Business Machines Corporation (IBM), is formed on June 15, 1911. | IBM
1923 | Electronic television is conceived. | Philo Farnsworth
1924 | The first working (mechanical) television apparatus is demonstrated. | John Logie Baird
1924 | Walther Bothe develops the logic gate and invents the coincidence circuit. | Walther Bothe
1930 | Vannevar Bush develops the Differential Analyzer, a partly electronic analog computer. | Vannevar Bush
1932 | Magnetic drum memory is invented. | Gustav Tauschek
1937 | Alan Turing develops the concept of a theoretical computing machine. | Alan Turing
1937 | The Atanasoff-Berry Computer (ABC) is conceived. | John Vincent Atanasoff, Clifford Berry
1938 | Konrad Zuse creates the Z1 Computer, a binary digital computer using punch tape. | Konrad Zuse
1939 | The Complex Number Calculator, a foundation for digital computers, is developed. | George Stibitz
1939 | The Bombe is produced. | UK Government Code and Cypher School (GC&CS)
1939 | Hewlett-Packard is founded. | William Hewlett, David Packard
1943 | Tommy Flowers designs the code-breaking machine Colossus. | Tommy Flowers
1944 | The Harvard MARK series of computers is designed at Harvard University. | Howard Aiken, Grace Hopper
1945 | ENIAC (Electronic Numerical Integrator and Computer) is invented. | John Presper Eckert, John W. Mauchly
1945 | The term 'computer bug' is first used. | Grace Hopper
1946 | F. C. Williams develops his cathode-ray tube (CRT) storage device, the forerunner of Random-Access Memory (RAM). | Frederic Calland Williams
1947 | Donald Watts Davies joins Alan Turing to build the fastest digital computer in England at the time, the Pilot ACE. | Donald Watts Davies
1947 | The transistor is invented at Bell Labs. | John Bardeen, Walter Brattain, William Shockley
1947 | Douglas Engelbart theorises on interactive computing with keyboard and screen display instead of punchcards. | Douglas Engelbart
1949 | Claude Shannon builds the first machine that plays chess. | Claude Shannon
1949 | The Harvard MARK III is developed. | Howard Aiken
1950 | The first electronic computer is created in Japan. | Hideo Yamachito
1951 | The first business computer, the Lyons Electronic Office (LEO), is developed at Lyons Co. | T. Raymond Thompson, John Simmons
1951 | UNIVAC I (UNIVersal Automatic Computer I), the first commercial computer made in the United States, is introduced. | John Presper Eckert, John W. Mauchly
1951 | The EDVAC (Electronic Discrete Variable Automatic Computer) begins performing basic tasks. Unlike the ENIAC, it is binary rather than decimal. | John Presper Eckert, John W. Mauchly
1954 | John Backus and IBM develop the FORTRAN programming language. | John Backus, IBM
1955 | Bell Labs introduces its first transistor computer. | Bell Labs
1956 | Optical fiber is invented. | Basil Hirschowitz, C. Wilbur Peters, Lawrence E. Curtiss
1958 | The first integrated circuit, or silicon chip, is produced. | Jack Kilby, Robert Noyce
1960 | The Common Business-Oriented Language (COBOL) programming language is released. | CODASYL committee
1961 | General Motors puts the first industrial robot, Unimate, to work in a New Jersey factory. | General Motors
1962 | Spacewar!, one of the first computer games, is invented. | Steve Russell, MIT
1963 | Douglas Engelbart invents the first computer mouse (nicknamed the mouse because the tail came out the end). | Douglas Engelbart
1963 | The American Standard Code for Information Interchange (ASCII) is developed to standardize data exchange among computers. | American Standards Association
1964 | IBM introduces the first word processor. | IBM
1964 | The Beginner's All-purpose Symbolic Instruction Code (BASIC) language is developed. | John Kemeny, Thomas Kurtz
1965 | The Compact Disc (CD) is invented in the United States. | James T. Russell
1967 | IBM creates the first floppy disk. | IBM
1969 | Seymour Cray develops the CDC 7600, the fastest supercomputer of its day. | Seymour Cray
1969 | Gary Starkweather invents the laser printer whilst working at Xerox. | Gary Starkweather
1970 | Intel introduces the world's first commercially available dynamic RAM (random-access memory) chip, the Intel 1103. | Intel
1971 | Intel introduces the first microprocessor, the Intel 4004. | Intel
1971 | E-mail is invented. | Ray Tomlinson
1971 | The Liquid Crystal Display (LCD) is invented. | James Fergason
1971 | The pocket calculator is invented. | Sharp Corporation
1971 | The floppy disk, nicknamed the "floppy" for its flexibility, is introduced. | David Noble, IBM
1972 | Atari releases Pong, the first commercially successful video game. | Atari
1973 | Ethernet, a Local-Area Network (LAN) protocol, is created. | Robert Metcalfe, David Boggs
1973 | Vint Cerf and Bob Kahn develop gateway routing computers to negotiate between the various national networks. | Vint Cerf, Bob Kahn
1974 | IBM develops SEQUEL (Structured English Query Language), now known as SQL. | IBM
1974 | Charles Simonyi coins the term WYSIWYG (What You See Is What You Get) to describe the ability to display a file or document exactly as it will be printed or viewed. | Charles Simonyi
1975 | MITS releases the Altair 8800, the first commercially successful personal computer kit. | MITS
1975 | The Microsoft Corporation is founded on April 4, 1975 to develop and sell BASIC interpreters for the Altair 8800. | Bill Gates, Paul Allen
1976 | Apple Computer is founded. | Steve Wozniak, Steve Jobs
1977 | Apple Computer's Apple II, the first personal computer with color graphics, is demonstrated. | Apple Computer
1977 | Ward Christensen writes the program "MODEM", allowing two microcomputers to exchange files with each other over a phone line. | Ward Christensen
1980 | IBM contracts Paul Allen and Bill Gates to create an operating system for a new PC. They buy the rights to a simple operating system manufactured by Seattle Computer Products and use it as a template to develop DOS. | IBM, Microsoft
1982 | WordPerfect Corporation introduces WordPerfect 1.0, a word processing program. | WordPerfect Corporation
1982 | The Commodore 64 is released; it goes on to become the best-selling computer model of all time. | Commodore International
1982 | SMTP (Simple Mail Transfer Protocol) is introduced. | Jon Postel
1982 | The Domain Name System (DNS) is pioneered. Seven 'top-level' domain names are initially introduced: edu, com, gov, mil, net, org and int. | Jon Postel, Paul Mockapetris, Craig Partridge
1984 | Apple introduces the Macintosh with mouse and window interface. | Apple
1985 | Microsoft releases Windows 1.0, eliminating the need to type each command, as in MS-DOS, by using a mouse to navigate through drop-down menus, tabs and icons. | Microsoft
1985 | Paul Brainerd introduces PageMaker for the Macintosh, creating the desktop publishing field. | Paul Brainerd
1985 | The Nintendo Entertainment System makes its debut. | Nintendo
1987 | Microsoft introduces Microsoft Works. | Microsoft
1987 | Larry Wall introduces Perl 1.0, a high-level dynamic programming language. | Larry Wall
1990 | Tim Berners-Lee and Robert Cailliau propose a 'hypertext' system, the start of the World Wide Web. | Tim Berners-Lee, Robert Cailliau
1991 | The World Wide Web (WWW) is launched to the public on August 6, 1991. | Tim Berners-Lee
1993 | At the beginning of the year only 50 World Wide Web servers are known to exist. | —
1994 | The World Wide Web Consortium is founded to help with the development of common protocols for the evolution of the World Wide Web. | Tim Berners-Lee
1994 | Yahoo! is created in April 1994. | Jerry Yang, David Filo
1994 | The PlayStation (PS) video game console is released on December 3, 1994 in Japan. | Ken Kutaragi
1995 | Java is introduced. | James Gosling
1995 | Amazon.com is founded. | Jeff Bezos
1995 | eBay is founded. | Pierre Omidyar
1995 | Hotmail is started. | Jack Smith, Sabeer Bhatia
1995 | WebTV is introduced. | Steve Perlman, Bruce Leak, Phil Goldman
1997 | AltaVista introduces its free online translator Babel Fish. | AltaVista
1997 | Microsoft acquires Hotmail. | Microsoft
1998 | Google is founded on September 7, 1998. | Sergey Brin, Larry Page
1998 | PayPal is founded. | Peter Thiel, Max Levchin
2001 | Bill Gates introduces the Xbox on January 7, 2001. | Bill Gates

[top] - [content]
iLiR Posted November 23, 2012 (Author)

Reserved.
iLiR Posted November 23, 2012 (Author)

Reserved, and you're free to post. ENJOY
TheGodfather. Posted November 23, 2012

Too... good... You must have so much patience for this. This thread is better than the books we are taught from...
ccrogers15 Posted November 23, 2012

You better be giving MASTER OF SANANDREAS some credit. You basically ripped his thread and changed it to be about computers! It's so obvious: http://www.gtaforums.com/index.php?showtopic=529269
sivispacem Posted November 23, 2012

That's just the page formatting, which, looking at the quote codes, isn't that difficult anyway. If I'm being pedantic, Colossus was the first non-general-purpose digital computer, and the Bombe, despite not being a true "computer", could have been worth a mention, but an interesting read.
Master of San Andreas Posted November 23, 2012

Great guide iLiR, definitely worthy of a medal and commendation. @cc: He didn't 'rip it off', he was just inspired by the guide. To be honest, I myself was inspired by miromiro's newbie guide.
Shoumaker Posted November 23, 2012

I was waiting for this thread. Great job with this thread buddy, I'll be reading through it all night.
iLiR Posted November 23, 2012 (Author)

Thank you all, I really appreciate it. This was really time-consuming, as it took me nearly 3 months to complete. Just one notice: I haven't finished my guide yet, as I'm going to add more sections (that's why I reserved two posts), write the credits, write some last words, etc. My main point with this guide was the accuracy of the information rather than the formatting. That's not to say I didn't focus on formatting and making it more appealing and readable, but accuracy was my big point. Had too many problems with tables though.

@sivispacem: Actually the Atanasoff-Berry Computer was the first non-general-purpose computer, as it was conceived in 1937 and successfully tested in 1942, albeit non-programmable. The Colossus, or to be more precise the Colossus Mark I, was built in December 1943 and was operational two months later (in February 1944) at Bletchley Park. Its main use was breaking codes in World War II, helping in the cryptanalysis of the Lorenz cipher. Unlike the Atanasoff-Berry Computer, the Colossus was programmable. I also read about the Bombe; it was an electromechanical device rather than a computer, but I'll mention it in the Computer History Timeline. Thank you for your suggestion.
Celestail Posted November 23, 2012

ilir, you have been working really hard I see. Nice work mate, your guide looks amazing.
sivispacem Posted November 23, 2012

You're right to say Colossus was not the first digital computing device, but I would argue that what distinguishes a computer from other similar devices is that it can be programmed.
iLiR Posted November 23, 2012 (Author)

@Celestail: Thank you very much.

@sivispacem: It's not me saying it, sivispacem, that's what history says. Don't mix up the first non-general-purpose computer with the first digital computer. The ABC and Colossus were both among the first digital computers; however, Colossus was programmable and the ABC wasn't. That's it. If you want to discuss this further, feel free to PM me.
ExtremoMania Posted November 24, 2012

Well I'll be. You really do deserve a cookie.
sivispacem Posted November 24, 2012

My point was more about how you define a computer. I'd argue that if it lacks programmability it isn't a computer, but hey, it's your topic, I was just commenting.
Celestail Posted November 24, 2012

@sivispacem: I see your point, but I was taught that all such machines are computers, whether they can be programmed or not.
sivispacem Posted November 24, 2012

It really is a question of semantics. I feel somewhat bad for partially derailing the topic, but the Oxford English Dictionary definition of a "computer" in this context -

an electronic device which is capable of receiving information (data) in a particular form and of performing a sequence of operations in accordance with a predetermined but variable set of procedural instructions (program) to produce a result in the form of information or signals.

- refers directly to the programmability of computers as a defining characteristic. I was originally merely pointing out that, depending on how you define a computer, the first (digital) one isn't necessarily the same.
Space Cowboy Posted November 24, 2012

I just came across this, really good work, such a detailed guide. I'll be reading it later tonight, good job mate.