History of Computing

37 replies to this topic
iLiR

    Spinnin'

  • Members
  • Joined: 13 Jan 2009
  • Macedonia

#1

Posted 22 November 2012 - 11:15 PM Edited by ilir, 23 November 2012 - 01:17 AM.

Hello and welcome to the History of Computing thread. This guide will take you through the history of computing and early computer hardware, from the very first known working computers to the machines of today. As you can see from the contents below, the guide is divided into chapters to make it easier to read and more interactive. All I can say is that I hope you WILL enjoy reading it and learn more about the computers we use today. As I'm only human, it's normal that I make mistakes, so if you notice something wrong in the guide, or want to share suggestions/ideas/praise/complaints, feel free to PM me. ENJOY.



iLiR

    Spinnin'

  • Members
  • Joined: 13 Jan 2009
  • Macedonia

#2

Posted 22 November 2012 - 11:42 PM Edited by ilir, 23 November 2012 - 01:23 AM.

Here I will explain the meaning of the word "computer", its first recorded use, and some of the old computing hardware that led to more advanced machines such as the analog computers.



A computer is a general purpose device that can be programmed to carry out a finite set of arithmetic or logical operations. In its most basic form a computer is any device which aids humans in performing various kinds of computations or calculations. Since a sequence of operations can be readily changed, the computer can solve more than one kind of problem.

• Conventionally, a computer consists of at least one processing element, typically a central processing unit (CPU), and some form of memory. The processing element carries out arithmetic and logic operations, while a sequencing and control unit can change the order of operations based on stored information. Peripheral devices allow information to be retrieved from an external source, and the results of operations to be saved and retrieved.

• The first electronic digital computers were developed between 1940 and 1945 in the United Kingdom and United States. Originally they were the size of a large room, consuming as much power as several hundred modern personal computers (PCs). In that era, mechanical analog computers were used for military applications. Modern computers perform their calculations electronically, which enables them to carry out a vastly greater number of calculations or computations in less time.

• Despite the fact that we currently use computers to process images, sound, text and other non-numerical forms of data, all of it depends on nothing more than basic numerical calculations. Graphics, sound and so on are merely abstractions of the numbers being crunched within the machine; in digital computers these are ones and zeros, representing electrical on and off states, and endless combinations of those. In other words, every image, every sound, and every word has a corresponding binary code.

[Image: Binary system]
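To make the point about binary codes concrete, here is a small Python sketch of my own (not part of the original post) showing how a character and a pixel value reduce to patterns of ones and zeros:

```python
# Every piece of data a digital computer handles is ultimately a pattern of bits.
# Characters are mapped to numbers (here via their Unicode/ASCII code points),
# and those numbers are stored as binary digits.

def to_bits(value: int, width: int = 8) -> str:
    """Return the value as a fixed-width string of ones and zeros."""
    return format(value, f"0{width}b")

text = "Hi"
for ch in text:
    print(ch, "->", ord(ch), "->", to_bits(ord(ch)))

# A colour or a sound sample is just another number:
red_level = 200            # e.g. the red channel of one pixel, 0-255
print("pixel red:", to_bits(red_level))
```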

• While the abacus may technically have been the first computer, most people today associate the word "computer" with the electronic computers invented in the last century, which have evolved into the modern computers we know today.




• The first use of the word "computer" was recorded in 1613, referring to a person who carried out calculations, or computations, and the word continued with the same meaning until the middle of the 20th century. From the end of the 19th century the word began to take on its more familiar meaning, a machine that carries out computations.

[Image: A woman who does calculations]

• The earliest computer was the abacus, used to perform basic arithmetic operations. Every computer supports some form of input, processing, and output. This is less obvious on a primitive device such as the abacus where input, output and processing are simply the act of moving the pebbles into new positions, seeing the changed positions, and counting. Regardless, this is what computing is all about, in a nutshell. We input information, the computer processes it according to its basic logic or the program currently running, and outputs the results.
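The input-process-output cycle described above can be mirrored in a few lines of Python; this is just an illustrative sketch of the idea, where the "program" is a simple summing rule:

```python
# A computer, however primitive, takes input, processes it, and outputs a result.
# Here the "program currently running" is nothing more than an addition rule.

def process(values):
    """Processing step: apply the program's logic (here, summing)."""
    return sum(values)

user_input = [3, 7, 12]        # input: the values an operator would enter
result = process(user_input)   # processing
print("output:", result)       # output: 22
```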


[Image: A very old abacus]


Computing hardware evolved from machines that needed separate manual action to perform each arithmetic operation, to punched card machines, and then to stored-program computers. The history of stored-program computers relates first to computer architecture, that is, the organization of the units to perform input and output, to store data and to operate as an integrated mechanism.

• Before the development of the general-purpose computer, most calculations were done by humans. Mechanical tools to help humans with digital calculations were then called "calculating machines" (see picture), by proprietary names, or even as they are now, calculators. It was those humans who used the machines who were then called computers. Aside from written numerals, the first aids to computation were purely mechanical devices which required the operator to set up the initial values of an elementary arithmetic operation, then manipulate the device to obtain the result.

• A sophisticated (and comparatively recent) example is the slide rule in which numbers are represented as lengths on a logarithmic scale and computation is performed by setting a cursor and aligning sliding scales, thus adding those lengths. Numbers could be represented in a continuous "analog" form, for instance a voltage or some other physical property was set to be proportional to the number.
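The trick a slide rule exploits, multiplying numbers by adding logarithmic lengths, can be checked with a short Python sketch (my own illustration, assuming base-10 scales as on a typical slide rule):

```python
import math

# On a slide rule, each number is laid out at a distance proportional to its
# logarithm, so putting two lengths end to end adds the logarithms and
# therefore multiplies the numbers.

a, b = 3.0, 4.0
length_a = math.log10(a)           # "distance" of a along the fixed scale
length_b = math.log10(b)           # "distance" of b along the sliding scale
combined = length_a + length_b     # physically: sliding one scale against the other

product = 10 ** combined           # reading the result back off the scale
print(product)                     # ~12.0, i.e. 3 x 4
```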

[Image: Slide rule]

Analog computers, like those designed and built by Vannevar Bush before World War II, were of this type. Numbers could also be represented in the form of digits, automatically manipulated by a mechanical mechanism. Although this last approach required more complex mechanisms in many cases, it made for greater precision of results.

[Image: Vannevar Bush (1890-1974)]

• The invention of electronic amplifiers made calculating machines much faster than their mechanical or electromechanical predecessors. Vacuum tube (thermionic valve) amplifiers gave way to solid state transistors, and then rapidly to integrated circuits which continue to improve, placing millions of electrical switches (typically transistors) on a single elaborately manufactured piece of semi-conductor the size of a fingernail. By defeating the tyranny of numbers, integrated circuits made high-speed and low-cost digital computers a widespread commodity. There is an ongoing effort to make computer hardware faster, cheaper, and capable of storing more data.

[Image: Old transistor]
[Image: Old vacuum tubes]

• Computing hardware has become a platform for uses other than mere computation, such as process automation, electronic communications, equipment control, entertainment, education, etc. Each field in turn has imposed its own requirements on the hardware, which has evolved in response to those requirements, such as the role of the touch screen to create a more intuitive and natural user interface.

• As all computers rely on digital storage, and tend to be limited by the size and speed of memory, the history of computer data storage is tied to the development of computers.



iLiR

    Spinnin'

  • Members
  • Joined: 13 Jan 2009
  • Macedonia

#3

Posted 22 November 2012 - 11:48 PM Edited by ilir, 23 November 2012 - 01:19 AM.

This is the section where the computing history begins. Here you will find the oldest true computing devices, which were used to do real work. Read further and you will understand what I mean.


Part 1

• Devices have been used to aid computation for thousands of years, mostly using one-to-one correspondence with our fingers. The earliest counting device was probably a form of tally stick. Later record keeping aids throughout the Fertile Crescent included calculi (clay spheres, cones, etc.) which represented counts of items, probably livestock or grains, sealed in containers. The use of counting rods is one example.

[Image: Tally sticks]
[Image: Yang Hui (Pascal's triangle) using rod numerals]

• Several analog computers were constructed in ancient and medieval times to perform astronomical calculations. These include the Antikythera mechanism and The Astrolabe from ancient Greece (c. 150–100 BC), which are generally regarded as the earliest known mechanical analog computers.

[Image: Antikythera mechanism]
[Image: The Astrolabe]

Hero of Alexandria (c. 10–70 AD) made many complex mechanical devices including Automata and a programmable cart.

[Image: "Automata"]

• Other early versions of mechanical devices used to perform one or another type of calculation include the Planisphere and other mechanical computing devices invented by Abū Rayhān al-Bīrūnī (c. AD 1000); the equatorium and universal latitude-independent astrolabe by Abū Ishāq Ibrāhīm al-Zarqālī (c. AD 1015); the astronomical analog computers of other medieval Muslim astronomers and engineers; and the astronomical clock tower of Su Song (c. AD 1090) during the Song Dynasty.

[Image: Abū Rayhān al-Bīrūnī (973-1048)]
[Image: Planisphere]
[Image: Equatorium of al-Zarqālī]
[Image: Su Song (1020-1101)]
[Image: Astronomical clock tower of Su Song]

• Scottish mathematician and physicist John Napier noted that multiplication and division of numbers could be performed by addition and subtraction, respectively, of the logarithms of those numbers. While producing the first logarithmic tables Napier needed to perform many multiplications, and it was at this point that he designed Napier's bones, an abacus-like device used for multiplication and division. Since real numbers can be represented as distances or intervals on a line, the slide rule was invented in the 1620s to allow multiplication and division operations to be carried out significantly faster than was previously possible. Slide rules were used by generations of engineers and other mathematically involved professionals, until the invention of the pocket calculator.

[Image: John Napier (1550–1617)]
[Image: Napier's bones]

Wilhelm Schickard, a German polymath, designed a calculating clock in 1623. It made use of a single-tooth gear that was not an adequate solution for a general carry mechanism. A fire destroyed the machine during its construction in 1624 and Schickard abandoned the project. Two sketches of it were discovered in 1957, too late to have any impact on the development of mechanical calculators.


[Image: Wilhelm Schickard (1592-1635)]

iLiR

    Spinnin'

  • Members
  • Joined: 13 Jan 2009
  • Macedonia

#4

Posted 22 November 2012 - 11:51 PM Edited by ilir, 23 November 2012 - 01:24 AM.

Part 2


• In 1642, while still a teenager, Blaise Pascal started some pioneering work on calculating machines and after three years of effort and 50 prototypes he invented the mechanical calculator. He built twenty of these machines (called Pascal's Calculator or Pascaline) in the following ten years. Nine Pascalines have survived, most of which are on display in European museums.

[Image: Blaise Pascal (1623-1662)]
[Image: Pascaline]

Gottfried Wilhelm von Leibniz invented the Stepped Reckoner and his famous cylinders around 1672 while adding direct multiplication and division to the Pascaline. Leibniz once said "It is unworthy of excellent men to lose hours like slaves in the labour of calculation which could safely be relegated to anyone else if machines were used."

[Image: Gottfried Wilhelm Leibniz (1646-1716)]
[Image: Stepped Reckoner]

• Around 1820, Charles Xavier Thomas created the first successful, mass-produced mechanical calculator, the Thomas Arithmometer, which could add, subtract, multiply, and divide. It was mainly based on Leibniz' work. Mechanical calculators, like the base-ten Addiator, the Comptometer, the Monroe, the Curta and the Addo-X, remained in use until the 1970s. Leibniz also described the binary numeral system, a central ingredient of all modern computers. However, up to the 1940s, many subsequent designs (including Charles Babbage's machines of 1822 and even ENIAC of 1945) were based on the decimal system; ENIAC's ring counters emulated the operation of the digit wheels of a mechanical adding machine.
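That remark about ring counters and digit wheels can be illustrated with a small Python sketch (my own toy model, not ENIAC's actual circuitry): each decimal digit is a wheel that wraps from 9 back to 0 and passes a carry to its neighbour.

```python
# A toy model of decimal digit wheels (or ENIAC-style ring counters):
# each position counts 0-9 and passes a carry to the next position when it
# wraps around, exactly like the wheels of a mechanical adding machine.

def add_on_wheels(wheels, amount):
    """Add `amount` to a little-endian list of decimal digits, with carries."""
    carry = amount
    position = 0
    while carry:
        if position == len(wheels):
            wheels.append(0)
        total = wheels[position] + carry
        wheels[position] = total % 10    # the wheel's new face
        carry = total // 10              # what spills over to the next wheel
        position += 1
    return wheels

register = [9, 9, 0]                     # represents 099, least significant digit first
print(add_on_wheels(register, 7))        # [6, 0, 1] -> 106
```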

[Image: Charles Xavier Thomas (1785-1870)]
[Image: The Arithmometer]
[Image: Addo-X]

• In Japan, Ryōichi Yazu patented a mechanical calculator called the Yazu Arithmometer in 1903. It consisted of a single cylinder and 22 gears, and employed the mixed base-2 and base-5 number system familiar to users of the soroban (Japanese abacus). Carry and end of calculation were determined automatically. More than 200 units were sold, mainly to government agencies such as the Ministry of War and agricultural experiment stations.


[Image: Ryōichi Yazu and his patent, the Yazu Arithmometer]



iLiR

    Spinnin'

  • Members
  • Joined: 13 Jan 2009
  • Macedonia

#5

Posted 22 November 2012 - 11:53 PM Edited by ilir, 23 November 2012 - 01:26 AM.


• In 1801, Joseph-Marie Jacquard developed a loom in which the pattern being woven was controlled by punched cards. The series of cards could be changed without changing the mechanical design of the loom. This was a landmark achievement in programmability. His machine was an improvement over similar weaving looms. Punch cards were preceded by punch bands, as in the machine proposed by Basile Bouchon. These bands would inspire information recording for automatic pianos and more recently NC machine-tools.

[Image: Joseph-Marie Jacquard (1752-1834)]
[Image: A punched card]
[Image: Jacquard's loom]

• In 1833, Charles Babbage moved on from developing his difference engine (for navigational calculations) to a general purpose design, the Analytical Engine, which drew directly on Jacquard's punched cards for its program storage. In 1837, Babbage described his analytical engine. It was a general-purpose programmable computer, employing punch cards for input and a steam engine for power, using the positions of gears and shafts to represent numbers. His initial idea was to use punch-cards to control a machine that could calculate and print logarithmic tables with huge precision (a special purpose machine). Babbage's idea soon developed into a general-purpose programmable computer.

• While his design was sound and the plans were probably correct, or at least debuggable, the project was slowed by various problems including disputes with the chief machinist building parts for it. Babbage was a difficult man to work with and argued with everyone. All the parts for his machine had to be made by hand. Small errors in each item might sometimes sum to cause large discrepancies. In a machine with thousands of parts, which required these parts to be much better than the usual tolerances needed at the time, this was a major problem. The project dissolved in disputes with the artisan who built parts and ended with the decision of the British Government to cease funding. Ada Lovelace, Lord Byron's daughter, translated and added notes to the "Sketch of the Analytical Engine" by Federico Luigi, Conte Menabrea. This appears to be the first published description of programming.

[Image: Charles Babbage (1791-1871)]
[Image: Babbage's Analytical Engine]

• A reconstruction of the Difference Engine II, an earlier, more limited design, has been operational since 1991 at the London Science Museum. With a few trivial changes, it works exactly as Babbage designed it and shows that Babbage's design ideas were correct, merely too far ahead of his time. The museum used computer-controlled machine tools to construct the necessary parts, using tolerances a good machinist of the period would have been able to achieve. Babbage's failure to complete the analytical engine can be chiefly attributed to difficulties not only of politics and financing, but also to his desire to develop an increasingly sophisticated computer and to move ahead faster than anyone else could follow.

[Image: Difference Engine II]

• A machine based on Babbage's difference engine was built in 1843 by Per Georg Scheutz and his son Edward. An improved Scheutzian calculation engine was sold to the British government and a later model was sold to the American government and these were used successfully in the production of logarithmic tables.

[Image: Per Georg Scheutz (1785-1873)]
[Image: Scheutzian mechanical calculator]

• Following Babbage, although unaware of his earlier work, was Percy Ludgate, an accountant from Dublin, Ireland. He independently designed a programmable mechanical computer, which he described in a work that was published in 1909.


[Image: Percy Ludgate (1883-1922)]

iLiR

    Spinnin'

  • Members
  • Joined: 13 Jan 2009
  • Macedonia

#6

Posted 22 November 2012 - 11:55 PM Edited by ilir, 23 November 2012 - 01:27 AM.


• By the 20th century, earlier mechanical calculators, cash registers, accounting machines, and so on were redesigned to use electric motors, with gear position as the representation for the state of a variable. The word "computer" was a job title assigned to people who used these calculators to perform mathematical calculations. By the 1920s, Lewis Fry Richardson's interest in weather prediction led him to propose human computers and numerical analysis to model the weather; to this day, the most powerful computers on Earth are needed to adequately model the weather using the Navier–Stokes equations.

[Image: Lewis Fry Richardson (1881-1953)]
[Image: The Navier-Stokes equations]

• Companies like Friden, Marchant Calculator and Monroe made desktop mechanical calculators from the 1930s that could add, subtract, multiply and divide. During the Manhattan Project, future Nobel laureate Richard Feynman was the supervisor of the human computers who solved the differential equations needed for the war effort.

[Image: Differential equations]

• In 1948, the Curta was introduced. This was a small, portable, mechanical calculator that was about the size of a pepper grinder. Over time, during the 1950s and 1960s, a variety of different brands of mechanical calculators appeared on the market. The first all-electronic desktop calculator was the British ANITA Mk.VII, which used a Nixie tube display and 177 subminiature thyratron tubes. In June 1963, Friden introduced the four-function EC-130. It had an all-transistor design, 13-digit capacity on a 5-inch (130 mm) CRT, and introduced Reverse Polish notation (RPN) to the calculator market at a price of $2200. The EC-132 model added square root and reciprocal functions. In 1965, Wang Laboratories produced the LOCI-2, a 10-digit transistorized desktop calculator that used a Nixie tube display and could compute logarithms.
In the early days of binary vacuum-tube computers, their reliability was poor enough to justify marketing a mechanical octal version ("Binary Octal") of the Marchant desktop calculator. It was intended to check and verify calculation results of such computers.


[Image: Curta]
[Image: ANITA Mk.VII]
[Image: EC-132]
[Image: LOCI-2]

iLiR

    Spinnin'

  • Members
  • Joined: 13 Jan 2009
  • Macedonia

#7

Posted 22 November 2012 - 11:57 PM Edited by ilir, 23 November 2012 - 01:28 AM.


• Before World War II, mechanical and electrical analog computers were considered the "state of the art", and many thought they were the future of computing. Analog computers take advantage of the strong similarities between the mathematics of small-scale properties—the position and motion of wheels or the voltage and current of electronic components—and the mathematics of other physical phenomena, for example, ballistic trajectories, inertia, resonance, energy transfer, momentum, and so forth. They model physical phenomena with electrical voltages and currents as the analog quantities.

• Centrally, these analog systems work by creating electrical 'analogs' of other systems, allowing users to predict behavior of the systems of interest by observing the electrical analogs. The most useful of the analogies was the way the small-scale behavior could be represented with integral and differential equations, and could be thus used to solve those equations. An ingenious example of such a machine, using water as the analog quantity, was the water integrator built in 1928; an electrical example is the Mallock machine built in 1941. A planimeter is a device which does integrals, using distance as the analog quantity. Unlike modern digital computers, analog computers are not very flexible, and need to be rewired manually to switch them from working on one problem to another. Analog computers had an advantage over early digital computers in that they could be used to solve complex problems using behavioral analogues while the earliest attempts at digital computers were quite limited.
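What a differential analyzer did mechanically, integrating a differential equation step by step, can be imitated digitally. The sketch below is my own simplified illustration (Euler-style integration of simple harmonic motion), not a model of any particular machine.

```python
# A differential analyzer solved equations like d2x/dt2 = -x by wiring
# mechanical integrators together.  Digitally, we can imitate one integrator
# per quantity and step time forward in small increments.

dt = 0.001
x, v = 1.0, 0.0          # initial position and velocity

for step in range(int(6.283 / dt)):   # roughly one full oscillation (2*pi)
    a = -x               # the "equation" being modelled: acceleration = -position
    v += a * dt          # first integrator: velocity from acceleration
    x += v * dt          # second integrator: position from velocity

print(round(x, 2), round(v, 2))       # close to the starting state (1.0, 0.0)
```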

[Image: Water integrator]
[Image: Mallock machine]
[Image: Planimeter]

• Some of the most widely deployed analog computers included devices for aiming weapons, such as the Norden bombsight, and fire-control systems, such as Arthur Pollen's Argo system for naval vessels. Some stayed in use for decades after World War II; the Mark I Fire Control Computer was deployed by the United States Navy on a variety of ships from destroyers to battleships. Other analog computers included the Heathkit EC-1, and the hydraulic MONIAC Computer which modeled econometric flows.

[Image: Norden bombsight]
[Image: Mark I Fire Control Computer]
[Image: Heathkit EC-1]
[Image: MONIAC Computer]

• The art of mechanical analog computing reached its zenith with the differential analyzer, built by H. L. Hazen and Vannevar Bush at MIT (Massachusetts Institute of Technology) starting in 1927, which in turn built on the mechanical integrators invented in 1876 by James Thomson and the torque amplifiers invented by H. W. Nieman. A dozen of these devices were built before their obsolescence was obvious; the most powerful was constructed at the University of Pennsylvania's Moore School of Electrical Engineering, where the ENIAC was built. Digital electronic computers like the ENIAC spelled the end for most analog computing machines, but hybrid analog computers, controlled by digital electronics, remained in substantial use into the 1950s and 1960s, and later in some specialized applications.


[Image: James Thomson (1822–1892)]

iLiR

    Spinnin'

  • Members
  • Joined: 13 Jan 2009
  • Macedonia

#8

Posted 23 November 2012 - 12:04 AM

The evolution from early to modern computers was long and difficult. The inventors were focused on creating automatic calculators that would make a seller's job a lot easier. There were many other reasons for wanting automatic machines, and the sheer number of inventions eventually carried computers into the modern era.



• The history of the modern computer begins with two separate technologies, automated calculation and programmability, but no single device can be identified as the earliest computer, partly because of the inconsistent application of that term. A few devices are worth mentioning, though. Some mechanical aids to computing were very successful and survived for centuries until the advent of the electronic calculator: the Sumerian abacus, designed around 2500 BC, a descendant of which won a speed competition against a modern desk calculating machine in Japan in 1946; the slide rule, invented in the 1620s, which was carried on five Apollo space missions, including to the Moon; and arguably the astrolabe and the Antikythera mechanism, an ancient astronomical computer built by the Greeks around 80 BC. The Greek mathematician Hero of Alexandria (c. 10–70 AD) built a mechanical theater which performed a play lasting 10 minutes, operated by a complex system of ropes and drums that might be considered a means of deciding which parts of the mechanism performed which actions and when. This is the essence of programmability.

[Image: Apollo program]

• Around the end of the 10th century, the French monk Gerbert d'Aurillac brought back from Spain the drawings of a machine invented by the Moors that answered either Yes or No to the questions it was asked. Again in the 13th century, the monks Albertus Magnus and Roger Bacon built talking androids without any further development (Albertus Magnus complained that he had wasted forty years of his life when Thomas Aquinas, terrified by his machine, destroyed it).

[Image: Pope Sylvester II (born Gerbert d'Aurillac) (946-1003)]
[Image: Albertus Magnus (1206-1280)]
[Image: Roger Bacon (1214–1294)]

• In 1642, the Renaissance saw the invention of Pascal's mechanical calculator, a device that could perform all four arithmetic operations without relying on human intelligence. The mechanical calculator was at the root of the development of computers in two separate ways. Initially, it was in trying to develop more powerful and more flexible calculators that the computer was first theorized by Charles Babbage and then developed. Secondly, development of a low-cost electronic calculator, successor to the mechanical calculator, resulted in the development by Intel of the first commercially available microprocessor integrated circuit.


[Image: Old Intel logo]

iLiR

    Spinnin'

  • Members
  • Joined: 13 Jan 2009
  • Macedonia

#9

Posted 23 November 2012 - 12:06 AM Edited by ilir, 23 November 2012 - 01:10 AM.


• In 1801, Joseph Marie Jacquard made an improvement to the textile loom by introducing a series of punched paper cards as a template which allowed his loom to weave intricate patterns automatically. The resulting Jacquard loom was an important step in the development of computers because the use of punched cards to define woven patterns can be viewed as an early, albeit limited, form of programmability.

• In the late 1880s, Herman Hollerith invented the recording of data on a machine-readable medium. Earlier uses of machine-readable media had been for control, not data. "After some initial trials with paper tape, he settled on punched cards ..." To process these punched cards he invented the tabulator and the keypunch machine. These three inventions were the foundation of the modern information processing industry. Large-scale automated data processing of punched cards was performed for the 1890 United States Census by Hollerith's company, which later became the core of IBM. By the end of the 19th century a number of ideas and technologies that would later prove useful in the realization of practical computers had begun to appear: Boolean algebra, the vacuum tube (thermionic valve), punched cards and tape, and the teleprinter.
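To give a feel for what Hollerith's tabulator automated, here is a small, purely illustrative Python sketch (the card fields are invented for the example) that tallies census-style records much as a tabulator counted punched cards:

```python
from collections import Counter

# Each "card" is a record with a few punched fields; the tabulator's job was
# simply to count how many cards carried a given combination of punches.

cards = [
    {"state": "NY", "occupation": "farmer"},
    {"state": "NY", "occupation": "clerk"},
    {"state": "OH", "occupation": "farmer"},
    {"state": "NY", "occupation": "farmer"},
]

by_state = Counter(card["state"] for card in cards)
by_occupation = Counter(card["occupation"] for card in cards)

print(by_state)        # Counter({'NY': 3, 'OH': 1})
print(by_occupation)   # Counter({'farmer': 3, 'clerk': 1})
```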

[Image: Herman Hollerith (1860-1929)]
[Image: Hollerith's Tabulator]

• During the first half of the 20th century, many scientific computing needs were met by increasingly sophisticated analog computers, which used a direct mechanical or electrical model of the problem as a basis for computation. However, these were not programmable and generally lacked the versatility and accuracy of modern digital computers.

Alan Turing is widely regarded as the father of modern computer science. In 1936 Turing provided an influential formalisation of the concept of the algorithm and computation with the Turing machine, providing a blueprint for the electronic digital computer. Of his role in the creation of the modern computer, Time magazine in naming Turing one of the 100 most influential people of the 20th century, states: "The fact remains that everyone who taps at a keyboard, opening a spreadsheet or a word-processing program, is working on an incarnation of a Turing machine".
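A tiny simulator makes Turing's abstraction tangible. The Python sketch below is only an illustration of the general idea; the machine it runs (which flips bits until it reaches a blank) is an invented example, not one of Turing's.

```python
# A minimal Turing machine: a tape, a head position, a current state, and a
# transition table mapping (state, symbol) -> (new symbol, move, new state).

def run_turing_machine(tape, rules, state="start", blank=" "):
    tape = list(tape)
    head = 0
    while state != "halt":
        symbol = tape[head] if head < len(tape) else blank
        new_symbol, move, state = rules[(state, symbol)]
        if head == len(tape):
            tape.append(blank)
        tape[head] = new_symbol
        head += 1 if move == "R" else -1
    return "".join(tape)

# Example rules: walk right, flipping 0 <-> 1, and halt at the first blank.
rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", " "): (" ", "R", "halt"),
}

print(run_turing_machine("1011", rules))   # prints "0100" followed by a blank
```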

[Image: Alan Turing (1912-1954)]
[Image: Illustration of a Turing machine]

• The Atanasoff–Berry Computer (ABC) was the world's first electronic digital computer, albeit not programmable. Atanasoff is considered to be one of the fathers of the computer. Conceived in 1937 by Iowa State College physics professor John Atanasoff, and built with the assistance of graduate student Clifford Berry, the machine was not programmable, being designed only to solve systems of linear equations. The computer did employ parallel computation. A 1973 court ruling in a patent dispute found that the patent for the 1946 ENIAC computer derived from the Atanasoff–Berry Computer.
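Solving a small system of linear equations, the one job the ABC was built for, can be sketched in a few lines of Python. This is a generic Gaussian-elimination illustration added by me, not a description of the ABC's own electromechanical procedure.

```python
# Gaussian elimination for a small system Ax = b, the kind of problem the
# Atanasoff-Berry Computer was designed to solve.

def solve(a, b):
    n = len(b)
    # Forward elimination: zero out entries below each pivot.
    for col in range(n):
        pivot = a[col][col]
        for row in range(col + 1, n):
            factor = a[row][col] / pivot
            for k in range(col, n):
                a[row][k] -= factor * a[col][k]
            b[row] -= factor * b[col]
    # Back substitution.
    x = [0.0] * n
    for row in reversed(range(n)):
        s = sum(a[row][k] * x[k] for k in range(row + 1, n))
        x[row] = (b[row] - s) / a[row][row]
    return x

# 2x + y = 5,  x + 3y = 10  ->  x = 1, y = 3
print(solve([[2.0, 1.0], [1.0, 3.0]], [5.0, 10.0]))
```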

[Image: John Vincent Atanasoff (1903-1995)]
[Image: Clifford Berry (1918-1963)]
[Image: ABC (conceived in 1937, tested in 1942)]

• The first program-controlled computer was invented by Konrad Zuse, who built the Z3, an electromechanical computing machine, in 1941.

[Image: Konrad Zuse (1910–1995)]
[Image: The Z3]

• The first programmable electronic computer was the Colossus, built in 1943 by Tommy Flowers.

[Image: Tommy Flowers (1905-1998)]
[Image: The Colossus]

George Stibitz is internationally recognized as a father of the modern digital computer. While working at Bell Labs in November 1937, Stibitz invented and built a relay-based calculator he dubbed the "Model K" (for "kitchen table", on which he had assembled it), which was the first to use binary circuits to perform an arithmetic operation. Later models added greater sophistication including complex arithmetic and programmability.
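The kind of binary arithmetic the Model K performed with relays can be sketched in plain boolean logic; the one-bit full adder below, chained into a ripple-carry adder, is my own illustrative example rather than a circuit diagram of the Model K.

```python
# A one-bit full adder built only from boolean operations, the same kind of
# logic a relay-based calculator realises electromechanically.

def full_adder(a, b, carry_in):
    sum_bit = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return sum_bit, carry_out

def add_binary(x_bits, y_bits):
    """Ripple-carry addition of two equal-length little-endian bit lists."""
    carry = 0
    result = []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    result.append(carry)
    return result

# 6 (110) + 3 (011), least significant bit first:
print(add_binary([0, 1, 1], [1, 1, 0]))   # [1, 0, 0, 1] -> 9
```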

[Image: George Stibitz (1904-1995)]
[Image: The "Model K"]

• A succession of steadily more powerful and flexible computing devices were constructed in the 1930s and 1940s, gradually adding the key features that are seen in modern computers. The use of digital electronics (largely invented by Claude Shannon in 1937) and more flexible programmability were vitally important steps, but defining one point along this road as "the first digital electronic computer" is difficult. Notable achievements include:
  • Konrad Zuse's electromechanical "Z machines". The Z3 (1941) was the first working machine featuring binary arithmetic, including floating point arithmetic and a measure of programmability. In 1998 the Z3 was proved to be Turing complete, therefore being the world's first operational computer.

  • The non-programmable Atanasoff–Berry Computer (commenced in 1937, completed in 1941), which used vacuum tube based computation, binary numbers, and regenerative capacitor memory. The use of regenerative memory allowed it to be much more compact than its peers (being approximately the size of a large desk or workbench), since intermediate results could be stored and then fed back into the same set of computation elements.

  • The secret British Colossus computers (1943), which had limited programmability but demonstrated that a device using thousands of tubes could be reasonably reliable and electronically reprogrammable. It was used for breaking German wartime codes.

  • The Harvard Mark I (1944), a large-scale electromechanical computer with limited programmability.

  • The U.S. Army's Ballistic Research Laboratory ENIAC (1946), which used decimal arithmetic and is sometimes called the first general purpose electronic computer (since Konrad Zuse's Z3 of 1941 used electromagnets instead of electronics). Initially, however, ENIAC had an inflexible architecture which essentially required rewiring to change its programming.

[Image: Claude Shannon (1916-2001)]

iLiR

    Spinnin'

  • Members
  • Joined: 13 Jan 2009
  • Macedonia

#10

Posted 23 November 2012 - 12:14 AM Edited by ilir, 23 November 2012 - 01:08 AM.

The history of computers is divided into four generations. First-generation computers used vacuum tubes. The invention of the transistor marked the start of the second generation. The third generation began with the invention of integrated circuits, also known as microchips. Microprocessors marked the fourth and, so far, last generation, on which today's computers are based.


VACUUM TUBES

• The first electronic computers used vacuum tubes, and they were huge and very complex. The first general-purpose electronic computer was the ENIAC (Electronic Numerical Integrator And Computer), designed by J. Presper Eckert and John Mauchly. It was digital, although it didn't operate with binary code, and was reprogrammable to solve a complete range of computing problems. It was programmed using plugboards and switches, supporting input from an IBM card reader and output to an IBM card punch. It took up 167 square meters, weighed 27 tons, and consumed 160 kilowatts of power. It used 18,000 vacuum tubes, 7,200 crystal diodes, 1,500 relays, 70,000 resistors, 10,000 capacitors, and around 5 million hand-soldered joints.

[Image: John Adam Presper Eckert Jr. (1919-1995)]
[Image: ENIAC (announced in 1946)]
• The first non-general-purpose electronic computer was the ABC (Atanasoff–Berry Computer), invented by John Vincent Atanasoff. Other computers of this era included the German Z3, the ten British Colossus machines, LEO, the Harvard Mark I, UNIVAC and others.


[Image: John Vincent Atanasoff (1903-1995)]
[Image: ABC (conceived in 1937, tested in 1942)]

TRANSISTORS

• The second generation of computers came about thanks to the invention of the transistor, which then started replacing vacuum tubes in computer design. The bipolar transistor was invented in 1947, and from 1955 onwards transistors replaced vacuum tubes in computer designs, giving rise to the "second generation" of computers. Initially the only devices available were germanium point-contact transistors, which, although less reliable than the vacuum tubes they replaced, had the advantage of consuming far less power. Transistor computers also produced far less heat and were much smaller than the first generation, albeit still big by today's standards.

[Image: Transistor]
• The first transistor computer was created at the University of Manchester in 1953. The most popular of the transistor computers was the IBM 1401. IBM also created the first disk drive in 1956, as part of the IBM 305 RAMAC.


[Image: IBM 1401]
[Image: IBM 305 RAMAC]

INTEGRATED CIRCUITS

• The invention of integrated circuits (ICs), also known as microchips, paved the way for computers as we know them today. Making circuits out of single pieces of silicon, which is a semiconductor, allowed them to be much smaller and more practical to produce. This also started the ongoing process of integrating an ever larger number of transistors onto a single microchip. During the sixties, microchips started making their way into computers, but the process was gradual, and the second generation of computers still held on.

[Image: Microchip]
• Minicomputers appeared first; the earliest of these were still based on non-microchip transistors, and later versions were hybrids of transistors and microchips, such as IBM's System/360. They were much smaller and cheaper than the first and second generations of computers, also known as mainframes. Minicomputers can be seen as a bridge between mainframes and microcomputers, which came later as the proliferation of microchips in computers grew.


[Image: IBM System/360]

MICROPROCESSORS

• The first microchip-based central processing units consisted of multiple microchips for different CPU components. The drive for ever greater integration and miniaturization led towards single-chip CPUs, where all of the necessary CPU components were put onto a single microchip, called a microprocessor. The first single-chip CPU, or microprocessor, was the Intel 4004.
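What "all the CPU components on one chip" amounts to is a fetch-decode-execute loop. The toy interpreter below is a deliberately simplified illustration of that loop with an invented instruction set; it is not a model of the 4004.

```python
# A toy CPU: fetch an instruction, decode it, execute it, repeat.
# The registers and the instruction set here are invented for illustration.

program = [
    ("LOAD", "A", 2),       # A = 2
    ("LOAD", "B", 5),       # B = 5
    ("ADD",  "A", "B"),     # A = A + B
    ("PRINT", "A", None),
    ("HALT", None, None),
]

registers = {"A": 0, "B": 0}
pc = 0                       # program counter

while True:
    op, x, y = program[pc]   # fetch + decode
    pc += 1
    if op == "LOAD":         # execute
        registers[x] = y
    elif op == "ADD":
        registers[x] = registers[x] + registers[y]
    elif op == "PRINT":
        print(x, "=", registers[x])   # prints "A = 7"
    elif op == "HALT":
        break
```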

[Image: Intel 4004]
The advent of the microprocessor spawned the evolution of the microcomputers, the kind that would eventually become personal computers that we are familiar with today.



iLiR

    Spinnin'

  • Members
  • Joined: 13 Jan 2009
  • Macedonia

#11

Posted 23 November 2012 - 12:21 AM Edited by ilir, 23 November 2012 - 01:11 AM.

Computers can be classified, or typed, in many ways. The most common classifications are:
• Classes based on principles of operation
• Classes by size and configuration
• Classes by function
All three classifications are explained below.



There are three different types of computers according to the principles of operation. Those three types of computers are:
  • Analog Computers
  • Digital Computers
  • Hybrid Computers

Analog Computers

- An analog computer is a computing device that works on a continuous range of values. The results given by analog computers are only approximate, since they deal with quantities that vary continuously. They generally deal with physical variables such as voltage, pressure, temperature, speed, etc. Mechanical analog computers were very important in gun fire control in World War II, the Korean War and well past the Vietnam War; they were made in significant numbers. The development of transistors made electronic analog computers practical, and until digital computers had developed sufficiently, they continued to be commonly used in science and industry.

- Analog computers can have a very wide range of complexity. Slide rules and nomographs are the simplest, while naval gunfire control computers and large hybrid digital/analog computers were among the most complicated.


[Image: Analog computers]



Digital Computers

- A digital computer, on the other hand, operates on digital data such as numbers. It uses the binary number system, in which there are only two digits, 0 and 1. Each of these is called a bit.

- The digital computer is designed using digital circuits in which there are two levels for an input or output signal. These two levels are known as logic 0 and logic 1. Digital computers can give more accurate and faster results.
Digital computers are well suited to solving complex problems in engineering and technology, and hence see increasing use in design, research and data processing.

- Based on the purpose, Digital computers can be further classified as:
  • General Purpose Computers
  • Special Purpose Computers
- A special-purpose computer is one that is built for a specific application. General-purpose computers can be used for many types of applications: they can store different programs and do different jobs according to the instructions specified in those programs. Most of the computers that we see today are general-purpose computers.

- General purpose computer is a computer designed to perform, or that is capable of performing, in a reasonably efficient manner, the functions required by both scientific and business applications. A general purpose computer is often understood to be a large system, capable of supporting remote terminal operations, but it may also be a smaller computer, e.g., a desktop workstation.


[Image: Digital computers]



Hybrid Computers

- A hybrid computer combines the desirable features of analog and digital computers. It is mostly used for the automatic operation of complicated physical processes and machines. Nowadays, analog-to-digital and digital-to-analog converters are used to transform the data into a form suitable for either type of computation.

- For example, in a hospital's ICU, analog devices might measure a patient's temperature, blood pressure and other vital signs. These analog measurements might then be converted into numbers and supplied to the digital components of the system, which monitor the vital signs and send signals if any abnormal readings are detected. Hybrid computers are mainly used for such specialized tasks.
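The analog-to-digital step mentioned above can be illustrated with a short Python sketch. The voltage range, resolution and sensor reading are made-up numbers, just to show the quantisation idea:

```python
# Turning a continuous analog reading into a digital number: scale it into the
# converter's range and round to the nearest representable step.

def adc(voltage, v_min=0.0, v_max=5.0, bits=8):
    """Quantise a voltage into an integer code of the given bit width."""
    levels = 2 ** bits - 1
    clamped = max(v_min, min(v_max, voltage))
    return round((clamped - v_min) / (v_max - v_min) * levels)

sensor_reading = 1.87        # an analog reading in volts (invented value)
code = adc(sensor_reading)
print(code)                  # 95 -- the number a digital monitor would process
```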

[Image: Hybrid computers]



iLiR

    Spinnin'

  • Members
  • Joined: 13 Jan 2009
  • Macedonia

#12

Posted 23 November 2012 - 12:31 AM Edited by ilir, 23 November 2012 - 01:14 AM.


There are four different classes of computers by size and configuration:
  • Supercomputer
  • Mainframe computers
  • Minicomputers (Midrange computers)
  • Microcomputers (Personal computers)

Supercomputer

- A supercomputer is a computer that is at the frontline of current processing capacity, particularly speed of calculation. The term supercomputer itself is rather fluid, and the speed of today's supercomputers tends to become typical of tomorrow's ordinary computer.

- A supercomputer is focused on performing tasks involving intense numerical calculations such as weather forecasting, fluid dynamics, nuclear simulations, theoretical astrophysics, and complex scientific computations.

- Supercomputer processing speeds are measured in floating point operations per second or FLOPS. An example of a floating point operation is the calculation of mathematical equations in real numbers. In terms of computational capability, memory size and speed, I/O technology, and topological issues such as bandwidth and latency, supercomputers are the most powerful, are very expensive, and not cost-effective just to perform batch or transaction processing. Transaction processing is handled by less powerful computers such as server computers or mainframes.


[Image: Old supercomputer]
[Image: Modern supercomputer]


Mainframe computers

- The term mainframe computer was created to distinguish the traditional, large, institutional computer intended to service multiple users from the smaller, single user machines. These computers are capable of handling and processing very large amounts of data quickly.

- Mainframe computers are used in large institutions such as governments, banks and large corporations. Their performance is measured in MIPS (Million Instructions Per Second) and they can respond to hundreds of millions of users at a time.


[Image: Old mainframe]
[Image: Modern mainframe]


Minicomputers (Midrange computers)

- A minicomputer (colloquially, mini) is a class of multi-user computers that lies in the middle range of the computing spectrum, in between the smallest multi-user systems (mainframe computers) and the largest single-user systems (microcomputers or personal computers).

- The contemporary term for this class of system is midrange computer, such as the higher-end SPARC-, POWER- and Itanium-based systems from Oracle Corporation, IBM and Hewlett-Packard (e.g. laboratory computers).


[Image: Old minicomputer]
[Image: Modern minicomputer]


Microcomputers (Personal computers)

- Microcomputers are the most common type of computers used by people today, whether in a workplace, at school or on the desk at home. The term “microcomputer” was introduced with the advent of single chip microprocessors. The term "microcomputer" itself is now practically an anachronism.

[Image: Old microcomputer]

- These computers include:
  • Desktop computers – A case and a display, put under and on a desk.
  • In-car computers (“carputers”) – Built into a car, for entertainment, navigation, etc.
  • Game consoles – Fixed computers specialized for entertainment purposes (video games).
[Image: Desktop computer]
[Image: Carputer]
[Image: Video game console]

- A separate class is that of mobile devices:

[Image: Dell laptop]
[Image: iPad tablet computer]
[Image: Acer PDA]
[Image: PSP handheld game console]

iLiR

    Spinnin'

  • Members
  • Joined: 13 Jan 2009
  • Macedonia

#13

Posted 23 November 2012 - 12:39 AM Edited by ilir, 23 November 2012 - 01:12 AM.


There are four different classes of computers by function:
  • Servers
  • Workstations
  • Information appliances
  • Embedded computers

Servers

- Server usually refers to a computer that is dedicated to providing a service. For example, a computer dedicated to a database may be called a "database server". "File servers" manage a large collection of computer files. "Web servers" process web pages and web applications.

- Many smaller servers are actually personal computers that have been dedicated to provide services for other computers.
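As a concrete, deliberately minimal example of a computer dedicated to providing a service, the standard-library Python sketch below answers every web request with the same small page. It is only an illustration, not production server code; the port number is arbitrary.

```python
# A minimal "web server": it listens on a port and answers every request
# with the same small page.  Real servers add routing, security, logging, etc.

from http.server import BaseHTTPRequestHandler, HTTPServer

class HelloHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"<html><body>Hello from a tiny server</body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Serve on localhost:8000 until interrupted.
    HTTPServer(("localhost", 8000), HelloHandler).serve_forever()
```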


[Image: Servers]



Workstations

- A workstation is a high-end microcomputer designed for technical or scientific applications. Intended primarily to be used by one person at a time, workstations are commonly connected to a local area network and run multi-user operating systems. The term "workstation" has also been used to refer to a mainframe computer terminal or a PC connected to a network.

- Workstations are intended to serve one user and may contain special hardware enhancements not found on a personal computer.


[Image: Workstations]



Information appliances

- Information appliances are computers that are usable for the purposes of computing, telecommunicating, reproducing, and presenting encoded information in myriad forms and applications. They're also specially designed to perform a specific user-friendly function such as playing music, photography, or editing text.

- The term is most commonly applied to mobile devices, though there are also portable and desktop devices of this class.


[Image: Information appliance computers]



Embedded computers

- Embedded computers are computers that are a part of a machine or device. Embedded computers generally execute a program that is stored in non-volatile memory and is only intended to operate a specific machine or device. Embedded computers are very common.

- Embedded computers are typically required to operate continuously without being reset or rebooted, and once employed in their task the software usually cannot be modified. An automobile may contain a number of embedded computers; however, a washing machine and a DVD player would contain only one.

- The central processing units (CPUs) used in embedded computers are often sufficient only for the computational requirements of the specific application and may be slower and cheaper than the CPUs found in a personal computer.
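The "run one fixed program forever" nature of an embedded computer can be sketched as a simple control loop. The thermostat-style Python example below is an invented illustration, with a fake sensor standing in for real hardware:

```python
import random
import time

# An embedded controller typically runs a single loop forever: read sensors,
# decide, drive an actuator.  The sensor here is faked with random numbers.

TARGET_TEMP = 60.0          # degrees C the (imaginary) washing machine wants

def read_temperature():
    return random.uniform(50.0, 70.0)   # stand-in for a real sensor driver

def set_heater(on):
    print("heater", "ON" if on else "OFF")

for _ in range(5):          # a real device would loop until powered off
    temp = read_temperature()
    set_heater(temp < TARGET_TEMP)      # simple on/off (bang-bang) control
    time.sleep(0.1)         # a real controller would pace itself similarly
```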


[Image: ADSL modem and DVD player as embedded computers]


iLiR

    Spinnin'

  • Members
  • Joined: 13 Jan 2009
  • Macedonia

#14

Posted 23 November 2012 - 12:47 AM Edited by ilir, 23 November 2012 - 01:15 PM.


I created this Computer History Timeline to cover almost all the important events from the first known computing device, the abacus, until today. Furthermore, this timeline covers some events I didn't mention above. Please notify me if you spot any mistakes. I hope you enjoy reading it.


Year
Invention
Inventor
2400 BC Abacus , the first known calculator.Invented in Babylonia
87 BC Antikythera Mechanism : Built in Rhodes to track movement
of the stars.
Ancient Greeks
60 AD
Invented a machine which follow a series of instructions.
Heron of Alexandria
724
The first fully mechanical clock was invented.
Liang Lingzan

1492
Drawings by Leonardo da Vinci depict inventions such as
flying machines, including a helicopter, the first mechanical
calculator and one of the first programmable robots.

Leonardo da Vinci

1614
Invented a system of moveable rods ( Napier's Rods ) based on
logarithms which was able to multiply, divide and calculate
square and cube roots.

John Napier
1622
William Oughtred develops slide rules .
William Oughtred
1623
Calculating Clock was invented.
Wilhelm Schickard
1642
The " Pascaline ", a mechanical adding machine was invented.
Blaise Pascal
1671
Gottfried Leibniz is known as one of the founding fathers of calculus.
Gottfried Leibniz
1801
An automatic loom controlled by punched cards was invented.
Joseph-Marie Jacquard
1820
The Arithmometer was the first mass-produced calculator.
Charles Xavier Thomas
1822
Charles Babbage designs his first mechanical computer.
Charles Babbage
1834
The Analytical Engine was invented.
Charles Babbage
1835
The Morse code was invented.
Samuel Morse
1848
Boolean algebra is invented.
George Boole
1853
Per Georg Scheutz and his son Edvard invent the Tabulating Machine .
Per Georg Scheutz
1869
A practical logic machine is designed.
William Stanley Jevons
1878
Invented a fast calculator with an internal multiplication table.
Ramon Verea
1880
Invented a telephone called the Photophone.
Alexander Graham Bell
1884
The Comptometer is an invention of Dorr E. Felt which is operated by
pressing keys.
Dorr Eugene Felt
1890
A counting machine was invented which increment mechanical counters.
Herman Hollerith
1895
Radio signals were invented.
Guglielmo Marconi
1898
Remote control was invented.
Nikola Tesla
1906
Lee De Forest invents the electronic tube.
Lee De Forest
1911
International Business Machines Corporation (IBM) is formed on
June 15, 1911.
IBM
1923
Television Electronic was invented.
Philo Farnsworth
1924
The world's first fully electronic colour television was invented.
John Logie Baird
1924
Walther Bothe develops the logic gate and invents coincidence circuit.
Walther Bothe
1930
Vannevar Bush develops a partly electronic Difference Engine.
Vannevar Bush
1932
Magnetic drum memory was invented.
Gustav Tauschek
1937
Alan Turing develops the concept of a theoretical computing machine .
Alan Turing
1937
Atanasoff-Berry Computer (ABC) was conceived.
John Vincent Atanasoff
Clifford Berry
1938
Konrad Zuse creates the Z1 Computer, a binary digital computer using
punch tape.
Konrad Zuse
1939
The Complex Number Calculator was developed - a foundation for
digital computers.
George Stibitz
1939
The Bombe was produced.
UK Government Code and
Cypher School (GC&CS)
1939
Hewlett Packard was founded.
William Hewlett
David Packard
1943
Tommy Flowers designed the code-breaking machine Colossus .
Tommy Flowers
1944
A MARK series of computers were designed at Harvard University.
Howard Aiken
Grace Hopper
1945
ENIAC (Electronic Numerical Integrator and Computer) was invented.
John Presper Eckert
John W. Mauchly
1945
The term ‘computer bug’ as computer bug was first used by
Grace Hopper.
Grace Hopper
1946
F. C. Williams develops his cathode-ray tube (CRT) storing device the
forerunner to Random-Access Memory (RAM).
Frederic Calland Williams
1947
Donald Watts Davies joins Alan Turing to build the fastest digital computer
in England at the time, the Pilot ACE.
Donald Watts Davies
1947
William Shockley invents the transistor at Bell Labs.
William Shockley
1947
Douglas Engelbart theorises on interactive computing with keyboard
and screen display instead of on punchcards.
Douglas Engelbart
1949
Claude Shannon builds the first machine that plays chess.
Claude Shannon
1949
The Harvard-MARK III was developed.
Howard Aiken
1950
The first electronic computer is created in Japan.
Hideo Yamachito
1951
T. Raymond Thompson and John Simmons develop the first business
computer, the Lyons Electronic Office (LEO) at Lyons Co.
T. Raymond Thompson
John Simmons
1951
UNIVAC I (UNIVersal Automatic Computer I) was introduced - the first
commercial computer made in the United States.
John Presper Eckert
John W. Mauchly

1951
The EDVAC (Electronic Discrete Variable Automatic Computer) begins
performing basic tasks. Unlike the ENIAC, it was binary rather than
decimal.
John Presper Eckert
John W. Mauchly
1954
John Backus & IBM develop the FORTRAN Computer Programming
Language.
John Backus
IBM
1955
Bell Labs introduces its first transistor computer.
Bell Labs

1956

Optical fiber was invented.
Basil Hirschowitz
C. Wilbur Peters
Lawrence E. Curtiss
1958
The first integrated circuit, or silicon chip, is produced.
Jack Kilby
Robert Noyce
1960
The Common Business-Oriented Language (COBOL) programming
language is invented.
Private researchers
1961
General Motors puts the first industrial robot, Unimate, to work in a
New Jersey factory.
General Motors
1962
The first computer game Spacewar was invented.
Steve Russell
MIT
1963
Douglas Engelbart invents and patents the first computer mouse
(nicknamed the mouse because the tail came out the end).
Douglas Engelbart
1963
The American Standard Code for Information Interchange (ASCII) is
developed to standardize data exchange among computers.
America Standards Institute
1964
IBM introduces the first word processor.
IBM
1964
Developed a Beginner’s All-purpose Symbolic Instruction Language
(BASIC).
John Kemeny
Thomas Kurtz
1965
The Compact Disc (CD) is invented in the United States.
James T. Russell
1967
IBM creates the first floppy disk.
IBM
1969
Seymour Cray develops the CDC 7600, the first supercomputer.
Seymour Cray
1969
Gary Starkweather invents the laser printer whilst working with Xerox.
Gary Starkweather

1970
Intel introduces the world's first available dynamic RAM
(random-access memory)
chip and the first microprocessor,
the Intel 4004 .

Intel
1971
E-mail was invented.
Ray Tomlinson
1971
Liquid Crystal Display (LCD) was invented.
James Fergason
1971
Pocket calculator was invented.
Sharp Corporation
1971
Floppy Disk was invented by David Noble with IBM - Nicknamed the
"Floppy" for its flexibility.
David Noble
IBM
1972
Atari releases Pong, the first commercial video game.
Atari
1973
The Ethernet is created, a Local-Area Network (LAN) protocol.
Robert Metcalfe
David Boggs
1973
Vint Cerf and Bob Kahn develop gateway routing computers to
negotiate between the various national networks.
Vint Cerf
Bob Kahn
1974
IBM develops SEQUEL (Structured English Query Language), now
known as SQL.
IBM

1974
Charles Simonyi coins the term WYSIWYG (What You See Is What
You Get) to describe the ability of being able to display a file or
document exactly how it is going to be printed or viewed.

Charles Simonyi
1975
Altair produces the first portable computer.
Altair
1975
The Microsoft Corporation was founded April 4, 1975 to develop and
sell BASIC interpreters for the Altair 8800.
Bill Gates
Paul Allen
1976
Apple Computers was founded by Steve Wozniak and Steve Jobs.
Steve Wozniak
Steve Jobs
1977
Apple Computer’s Apple II, the first personal computer with color
graphics is demonstrated.
Apple Computers
1977
Ward Christensen writes the programme "MODEM" allowing two
microcomputers to exchange files with each other over a phone line.
Ward Christensen

1980
IBM hires Paul Allen and Bill Gates to create an operating system for a
new PC. They buy the rights to a simple operating system manufactured
by Seattle Computer Products and use it as a template to develop DOS.

IBM
1982
WordPerfect Corporation introduces WordPerfect 1.0, a word
processing program.
WordPerfect Corporation
1982
The Commodore 64 becomes the best-selling computer of all time.
Commodore International
1982
SMTP (Simple Mail Transfer Protocol) is introduced.
No information

1982 | The Domain Name System (DNS) is pioneered by Jon Postel, Paul Mockapetris and Craig Partridge. Seven 'top-level' domain names are initially introduced: edu, com, gov, mil, net, org and int. (See the name-lookup sketch after this timeline.) | Jon Postel, Paul Mockapetris, Craig Partridge
1983 | Microsoft announces Windows, which eliminates the need for a user to type each command, as in MS-DOS, by using a mouse to navigate through drop-down menus, tabs and icons. | Microsoft
1984 | Apple introduces the Macintosh, with a mouse and window interface. | Apple
1985 | Paul Brainerd introduces PageMaker for the Macintosh, creating the desktop publishing field. | Paul Brainerd
1985 | The Nintendo Entertainment System makes its debut. | Nintendo
1987 | Microsoft introduces Microsoft Works. | Microsoft
1987 | Larry Wall introduces Perl 1.0, a high-level dynamic programming language. | Larry Wall
1990 | Tim Berners-Lee and Robert Cailliau propose a 'hypertext' system, the starting point of the World Wide Web. | Tim Berners-Lee, Robert Cailliau
1991 | The World Wide Web (WWW) is opened to the public on August 6, 1991. | No information
1993 | At the beginning of the year, only 50 World Wide Web servers are known to exist. | No information
1994 | The World Wide Web Consortium (W3C) is founded to help develop common protocols for the evolution of the World Wide Web. | Tim Berners-Lee
1994 | Yahoo! is created in April 1994. | Jerry Yang, David Filo
1994 | The PlayStation (PS) video game console is released in Japan on December 3, 1994. | Ken Kutaragi
1995 | Java is introduced. | James Gosling
1995 | Amazon.com is founded by Jeff Bezos. | Jeff Bezos
1995 | eBay is founded by Pierre Omidyar. | Pierre Omidyar
1995 | Hotmail is started by Jack Smith and Sabeer Bhatia. | Jack Smith, Sabeer Bhatia
1995 | WebTV is introduced. | Steve Perlman, Bruce Leak, Phil Goldman
1997 | AltaVista introduces its free online translator, Babel Fish. | AltaVista
1997 | Microsoft acquires Hotmail. | No information
1998 | Google is founded by Sergey Brin and Larry Page on September 7, 1998. | Sergey Brin, Larry Page
1998 | PayPal is founded. | Peter Thiel, Max Levchin
2001 | Bill Gates introduces the Xbox on January 7, 2001. | Bill Gates
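As a companion to the 1974 SEQUEL/SQL entry above, here is a minimal sketch of what such a "structured English" query looks like today, using Python's built-in sqlite3 module. The computers table, its rows and the in-memory database are invented purely for illustration, not taken from any real system.

CODE
# Minimal sketch of a modern SQL query, using Python's built-in sqlite3 module.
# The "computers" table and its rows are invented purely for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")  # throwaway in-memory database
conn.execute("CREATE TABLE computers (name TEXT, year INTEGER)")
conn.executemany(
    "INSERT INTO computers (name, year) VALUES (?, ?)",
    [("Altair 8800", 1975), ("Apple II", 1977), ("Commodore 64", 1982)],
)

# The query itself reads almost like English, which was the point of SEQUEL's design.
for name, year in conn.execute(
    "SELECT name, year FROM computers WHERE year >= 1977 ORDER BY year"
):
    print(name, year)

conn.close()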
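Since SMTP also turns up in the timeline (1982), here is a minimal sketch of handing a message to a mail server with Python's standard smtplib and email modules. The host "localhost", port 1025 and both e-mail addresses are placeholders I made up; you would need a real or local test SMTP server listening there for anything to actually be delivered.

CODE
# Minimal sketch of sending a message over SMTP with Python's standard library.
# "localhost", port 1025 and both addresses are placeholders; point them at a
# real or local test SMTP server before running.
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "alice@example.com"
msg["To"] = "bob@example.com"
msg["Subject"] = "Hello over SMTP"
msg.set_content("SMTP has been carrying mail between servers since 1982.")

with smtplib.SMTP("localhost", 1025) as server:
    server.send_message(msg)  # the MAIL FROM / RCPT TO / DATA exchange happens here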
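Finally, for the DNS entry, a minimal sketch of a name lookup using only Python's standard socket module. "example.com" is just a sample host name, and the addresses returned depend on whichever resolver your machine is configured to use.

CODE
# Minimal sketch of a DNS name lookup via the standard socket module.
# "example.com" is a sample host name; results depend on your local resolver.
import socket

host = "example.com"
for family, _type, _proto, _canonname, sockaddr in socket.getaddrinfo(host, None):
    # sockaddr[0] is the resolved IP address (IPv4 or IPv6)
    print(family.name, sockaddr[0])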


iLiR
  • iLiR

    Spinnin'

  • Members
  • Joined: 13 Jan 2009
  • Macedonia

#15

Posted 23 November 2012 - 12:52 AM

Reserved.

iLiR
  • iLiR

    Spinnin'

  • Members
  • Joined: 13 Jan 2009
  • Macedonia

#16

Posted 23 November 2012 - 12:53 AM Edited by ilir, 23 November 2012 - 01:01 AM.

Reserved, and you're free to post. ENJOY!

TheGodfather.
  • TheGodfather.

    Nobody likes me here...I think...

  • Members
  • Joined: 19 Jun 2012

#17

Posted 23 November 2012 - 02:53 AM

Too good... You must have a lot of patience for this. This thread is better than the books we are taught from...

ccrogers15
  • ccrogers15

    REQUESTED BAN

  • BUSTED!
  • Joined: 26 Jul 2010

#18

Posted 23 November 2012 - 03:08 AM

You had better give MASTER OF SANANDREAS some credit. You basically ripped off his thread and changed it to be about computers! It's so obvious: http://www.gtaforums...howtopic=529269

sivispacem
  • sivispacem

    I shall revoke the throne, atop the stellar tree

  • Moderator
  • Joined: 14 Feb 2011
  • European-Union
  • Contribution Award [D&D, General Chat]
    Most Knowledgeable [Vehicles] 2013
    Best Debater 2015, 2014, 2013, 2012, 2011

#19

Posted 23 November 2012 - 08:27 AM

QUOTE (ccrogers15 @ Friday, Nov 23 2012, 04:08)
You had better give MASTER OF SANANDREAS some credit. You basically ripped off his thread and changed it to be about computers! It's so obvious: http://www.gtaforums...howtopic=529269

That's just the page formatting, which, looking at the quote codes, isn't that difficult to replicate anyway.

If I'm being pedantic, Colossus was the first non-general-purpose digital computer, and the Bombe, despite not being a true "computer", would have been worth a mention. An interesting read, though.

Master of San Andreas
  • Master of San Andreas

    Leaving with a big bang, you guys rock.

  • BUSTED!
  • Joined: 07 Jul 2012

#20

Posted 23 November 2012 - 11:44 AM

Great guide, iLiR, definitely worthy of a medal and commendation.

@cc: He didn't 'rip it off'; he was just inspired by the guide. To be honest, I myself was inspired by miromiro's newbie guide.

Shou
  • Shou

    y

  • The Yardies
  • Joined: 09 Jul 2012
  • Jamaica
  • Best Signature 2013
    Best Signature 2012

#21

Posted 23 November 2012 - 12:03 PM

I was waiting for this thread. Great job with it, buddy; I'll be reading through it all night.

iLiR
  • iLiR

    Spinnin'

  • Members
  • Joined: 13 Jan 2009
  • Macedonia

#22

Posted 23 November 2012 - 01:09 PM Edited by ilir, 23 November 2012 - 05:02 PM.

Thank you all. I really appreciate it. This was really time-consuming, as it took me nearly three months to complete.
Just one note: I haven't finished my guide yet, as I'm going to add more sections (that's why I reserved two posts), write the credits, write some last words, etc.
The main point of this guide was the accuracy of the information rather than the formatting. That's not to say I didn't work on the formatting and on making it more appealing and readable, but accuracy was my big focus.
Had too many problems with tables, though.

@sivispacem: Actually, the Atanasoff-Berry Computer was the first non-general-purpose digital computer, as it was conceived in 1937 and successfully tested in 1942, albeit non-programmable.
The Colossus, or to be more precise the Colossus Mark 1, was built in December 1943 and was operational at Bletchley Park by February 1944. Its main use was code-breaking in World War II, helping with the cryptanalysis of the Lorenz cipher. Unlike the Atanasoff-Berry Computer, the Colossus was programmable.
I also read about the Bombe; it was an electromechanical device rather than a computer, but I'll mention it in the Computer History Timeline. Thank you for your suggestion.

Celestail
  • Celestail

    Foot Soldier

  • BUSTED!
  • Joined: 27 Sep 2012

#23

Posted 23 November 2012 - 04:15 PM

iLiR, you have been working really hard, I see. Nice work, mate; your guide looks amazing.

sivispacem
  • sivispacem

    I shall revoke the throne, atop the stellar tree

  • Moderator
  • Joined: 14 Feb 2011
  • European-Union
  • Contribution Award [D&D, General Chat]
    Most Knowledgeable [Vehicles] 2013
    Best Debater 2015, 2014, 2013, 2012, 2011

#24

Posted 23 November 2012 - 05:02 PM

You're right to say Colossus was not the first digital computing device, but I would argue that what distinguishes a computer from other similar devices is that it can be programmed.

iLiR
  • iLiR

    Spinnin'

  • Members
  • Joined: 13 Jan 2009
  • Macedonia

#25

Posted 23 November 2012 - 05:30 PM

@Celestail: Thank you very much.

@sivispacem: It's not just me saying it, sivispacem; that's what history says. Don't confuse the first non-general-purpose computer with the first digital computer. The ABC and Colossus were both early digital computers; however, Colossus was programmable and the ABC wasn't. That's it. If you want to discuss this further, feel free to PM me.

ExtremoMania
  • ExtremoMania

    Been saving it for a rainy day...

  • Members
  • Joined: 04 Apr 2012
  • Philippines

#26

Posted 24 November 2012 - 08:10 AM

Well, I'll be. You really do deserve a cookie.

sivispacem
  • sivispacem

    I shall revoke the throne, atop the stellar tree

  • Moderator
  • Joined: 14 Feb 2011
  • European-Union
  • Contribution Award [D&D, General Chat]
    Most Knowledgeable [Vehicles] 2013
    Best Debater 2015, 2014, 2013, 2012, 2011

#27

Posted 24 November 2012 - 09:44 AM

QUOTE (ilir @ Friday, Nov 23 2012, 18:30)
@sivispacem: It's not just me saying it, sivispacem; that's what history says. Don't confuse the first non-general-purpose computer with the first digital computer. The ABC and Colossus were both early digital computers; however, Colossus was programmable and the ABC wasn't. That's it. If you want to discuss this further, feel free to PM me.

My point was more about how you define a computer. I'd argue that if it lacks programmability it isn't a computer, but hey, it's your topic; I was just commenting.

Celestail
  • Celestail

    Foot Soldier

  • BUSTED!
  • Joined: 27 Sep 2012

#28

Posted 24 November 2012 - 12:48 PM

@sivispacem: I see your point, but I was taught that all such machines are computers whether they can be programmed or not.

sivispacem
  • sivispacem

    I shall revoke the throne, atop the stellar tree

  • Moderator
  • Joined: 14 Feb 2011
  • European-Union
  • Contribution Award [D&D, General Chat]
    Most Knowledgeable [Vehicles] 2013
    Best Debater 2015, 2014, 2013, 2012, 2011

#29

Posted 24 November 2012 - 03:31 PM

QUOTE (Celestail @ Saturday, Nov 24 2012, 13:48)
@sivispacem: I see your point, but I was taught that all such machines are computers whether they can be programmed or not.

It really is a question of semantics. I feel somewhat bad for partially derailing the topic, but the Oxford English Dictionary definition of a "Computer", in this context-

QUOTE (The Oxford English Dictionary)
an electronic device which is capable of receiving information (data) in a particular form and of performing a sequence of operations in accordance with a predetermined but variable set of procedural instructions (program) to produce a result in the form of information or signals.

refers directly to the programmability of computers as a defining characteristic. I was originally merely pointing out that, depending on how you define a computer, the first (digital) one isn't necessarily the same machine.

0909090
  • 0909090

  • Members
  • Joined: 14 Mar 2012
  • Slovakia

#30

Posted 24 November 2012 - 04:08 PM

I just came across this. Really good work, such a detailed guide. I'll be reading it later tonight. Good job, mate.



