| Hello and welcome to the History of Computing thread. This guide will take you on a tour through the history of computing and its hardware, from the very first known working computers to the machines of today. As you can see from the Contents below, the guide is split into chapters to make it easier and more interactive to read. All I can say is that I hope you WILL enjoy reading it and learn more about the computers we use today. I'm only human, so mistakes are possible; if you notice something wrong, or want to offer suggestions/ideas/praise/complaints about this guide, feel free to PM me. ENJOY. |
History of Computing
Posted 22 November 2012 - 11:15 PM Edited by ilir, 23 November 2012 - 01:17 AM.
|Here I will explain the meaning of the word "computer", as well as its first usage and some of the old computing hardware that led to more advanced machines, the analog computers.|
• The first use of the word "computer" was recorded in 1613, referring to a person who carried out calculations, or computations, and the word continued with the same meaning until the middle of the 20th century. From the end of the 19th century the word began to take on its more familiar meaning, a machine that carries out computations.
• The earliest computer was the abacus, used to perform basic arithmetic operations. Every computer supports some form of input, processing, and output. This is less obvious on a primitive device such as the abacus where input, output and processing are simply the act of moving the pebbles into new positions, seeing the changed positions, and counting. Regardless, this is what computing is all about, in a nutshell. We input information, the computer processes it according to its basic logic or the program currently running, and outputs the results.
• Computing hardware evolved from machines that needed separate manual action to perform each arithmetic operation, to punched card machines, and then to stored-program computers. The history of stored-program computers relates first to computer architecture, that is, the organization of the units to perform input and output, to store data and to operate as an integrated mechanism.
• Before the development of the general-purpose computer, most calculations were done by humans. Mechanical tools to help humans with digital calculations were then called "calculating machines", known by proprietary names, or even, as they are now, calculators. It was the humans who used the machines who were then called computers. Aside from written numerals, the first aids to computation were purely mechanical devices which required the operator to set up the initial values of an elementary arithmetic operation, then manipulate the device to obtain the result.
• A sophisticated (and comparatively recent) example is the slide rule in which numbers are represented as lengths on a logarithmic scale and computation is performed by setting a cursor and aligning sliding scales, thus adding those lengths. Numbers could be represented in a continuous "analog" form, for instance a voltage or some other physical property was set to be proportional to the number.
• Analog computers, like those designed and built by Vannevar Bush before World War II, were of this type. Numbers could also be represented in the form of digits, automatically manipulated by a mechanical mechanism. Although this latter approach required more complex mechanisms in many cases, it made for greater precision of results.
• The invention of electronic amplifiers made calculating machines much faster than their mechanical or electromechanical predecessors. Vacuum tube (thermionic valve) amplifiers gave way to solid state transistors, and then rapidly to integrated circuits which continue to improve, placing millions of electrical switches (typically transistors) on a single elaborately manufactured piece of semi-conductor the size of a fingernail. By defeating the tyranny of numbers, integrated circuits made high-speed and low-cost digital computers a widespread commodity. There is an ongoing effort to make computer hardware faster, cheaper, and capable of storing more data.
• Computing hardware has become a platform for uses other than mere computation, such as process automation, electronic communications, equipment control, entertainment, education, etc. Each field in turn has imposed its own requirements on the hardware, which has evolved in response to those requirements, such as the role of the touch screen to create a more intuitive and natural user interface.
• As all computers rely on digital storage, and tend to be limited by the size and speed of memory, the history of computer data storage is tied to the development of computers.
|This is the section where the computing history begins. Here you will find the oldest true computing devices, which were used to do real work. Read on and you will understand what I mean.|
• Devices have been used to aid computation for thousands of years, mostly using one-to-one correspondence with our fingers. The earliest counting device was probably a form of tally stick. Later record keeping aids throughout the Fertile Crescent included calculi (clay spheres, cones, etc.) which represented counts of items, probably livestock or grains, sealed in containers. The use of counting rods is one example.
• Several analog computers were constructed in ancient and medieval times to perform astronomical calculations. These include the Antikythera mechanism and the astrolabe from ancient Greece (c. 150–100 BC), which are generally regarded as the earliest known mechanical analog computers.
• Hero of Alexandria (c. 10–70 AD) made many complex mechanical devices including Automata and a programmable cart.
• Other early versions of mechanical devices used to perform one or another type of calculations include the planisphere and other mechanical computing devices invented by Abū Rayhān al-Bīrūnī (c. AD 1000); the equatorium and universal latitude-independent astrolabe by Abū Ishāq Ibrāhīm al-Zarqālī (c. AD 1015); the astronomical analog computers of other medieval Muslim astronomers and engineers; and the astronomical clock tower of Su Song (c. AD 1090) during the Song Dynasty.
• Scottish mathematician and physicist John Napier noted multiplication and division of numbers could be performed by addition and subtraction, respectively, of logarithms of those numbers. While producing the first logarithmic tables Napier needed to perform many multiplications, and it was at this point that he designed Napier's bones, an abacus-like device used for multiplication and division. Since real numbers can be represented as distances or intervals on a line, the slide rule was invented in the 1620's to allow multiplication and division operations to be carried out significantly faster than was previously possible. Slide rules were used by generations of engineers and other mathematically involved professional workers, until the invention of the pocket calculator.
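Napier's trick, that the logarithm turns multiplication into addition, is easy to sketch in Python (a modern illustration of the principle, not anything Napier or the slide rule makers built; the function name is mine):

```python
import math

def slide_rule_multiply(a, b):
    """Multiply two positive numbers the way Napier's logarithms
    (and later the slide rule) do it: add the logs, take the antilog."""
    return math.exp(math.log(a) + math.log(b))

# log(6) + log(7) = log(42), so the "addition machine" multiplies:
print(slide_rule_multiply(6.0, 7.0))  # ≈ 42.0
```

A slide rule performs the same addition physically, by laying two logarithmic scales end to end.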
• Wilhelm Schickard, a German polymath, designed a calculating clock in 1623. It made use of a single-tooth gear that was not an adequate solution for a general carry mechanism. A fire destroyed the machine during its construction in 1624 and Schickard abandoned the project. Two sketches of it were discovered in 1957, too late to have any impact on the development of mechanical calculators.
• In 1642, while still a teenager, Blaise Pascal started some pioneering work on calculating machines and after three years of effort and 50 prototypes he invented the mechanical calculator. He built twenty of these machines (called Pascal's Calculator or Pascaline) in the following ten years. Nine Pascalines have survived, most of which are on display in European museums.
• Gottfried Wilhelm von Leibniz invented the Stepped Reckoner and his famous cylinders around 1672 while adding direct multiplication and division to the Pascaline. Leibniz once said "It is unworthy of excellent men to lose hours like slaves in the labour of calculation which could safely be relegated to anyone else if machines were used."
• Around 1820, Charles Xavier Thomas created the first successful, mass-produced mechanical calculator, the Thomas Arithmometer, that could add, subtract, multiply, and divide. It was mainly based on Leibniz' work. Mechanical calculators, like the base-ten addiator, the comptometer, the Monroe, the Curta and the Addo-X remained in use until the 1970's. Leibniz also described the binary numeral system, a central ingredient of all modern computers. However, up to the 1940's, many subsequent designs (including Charles Babbage's machines of 1822 and even ENIAC of 1945) were based on the decimal system; ENIAC's ring counters emulated the operation of the digit wheels of a mechanical adding machine.
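The difference between Leibniz's binary representation and ENIAC's decimal ring counters can be illustrated with a short Python sketch (my own illustration; the function names are made up for this guide):

```python
def binary_digits(n):
    """Leibniz's binary system: represent n using only the digits 0 and 1."""
    digits = []
    while n:
        digits.append(n % 2)
        n //= 2
    return digits[::-1] or [0]

def decimal_wheels(n, wheels=10):
    """ENIAC-style decimal storage: one ten-state 'wheel' per decimal digit."""
    return [int(d) for d in str(n).zfill(wheels)]

print(binary_digits(13))      # [1, 1, 0, 1]
print(decimal_wheels(13, 5))  # [0, 0, 0, 1, 3]
```

The same value needs only two-state devices in binary, which is part of why the binary system won out once reliable electronic switches existed.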
• In Japan, Ryōichi Yazu patented a mechanical calculator called the Yazu Arithmometer in 1903. It consisted of a single cylinder and 22 gears, and employed the mixed base-2 and base-5 number system familiar to users of the soroban (Japanese abacus). Carry and end of calculation were determined automatically. More than 200 units were sold, mainly to government agencies such as the Ministry of War and agricultural experiment stations.
• In 1801, Joseph-Marie Jacquard developed a loom in which the pattern being woven was controlled by punched cards. The series of cards could be changed without changing the mechanical design of the loom. This was a landmark achievement in programmability. His machine was an improvement over similar weaving looms. Punch cards were preceded by punch bands, as in the machine proposed by Basile Bouchon. These bands would inspire information recording for automatic pianos and more recently NC machine-tools.
• In 1833, Charles Babbage moved on from developing his difference engine (for navigational calculations) to a general purpose design, the Analytical Engine, which drew directly on Jacquard's punched cards for its program storage. In 1837, Babbage described his analytical engine. It was a general-purpose programmable computer, employing punch cards for input and a steam engine for power, using the positions of gears and shafts to represent numbers. His initial idea was to use punch-cards to control a machine that could calculate and print logarithmic tables with huge precision (a special purpose machine). Babbage's idea soon developed into a general-purpose programmable computer.
• While his design was sound and the plans were probably correct, or at least debuggable, the project was slowed by various problems including disputes with the chief machinist building parts for it. Babbage was a difficult man to work with and argued with everyone. All the parts for his machine had to be made by hand. Small errors in each item might sometimes sum to cause large discrepancies. In a machine with thousands of parts, which required these parts to be much better than the usual tolerances needed at the time, this was a major problem. The project dissolved in disputes with the artisan who built parts and ended with the decision of the British Government to cease funding. Ada Lovelace, Lord Byron's daughter, translated and added notes to the "Sketch of the Analytical Engine" by Luigi Federico Menabrea. This appears to be the first published description of programming.
• A reconstruction of the Difference Engine II, an earlier, more limited design, has been operational since 1991 at the London Science Museum. With a few trivial changes, it works exactly as Babbage designed it and shows that Babbage's design ideas were correct, merely too far ahead of his time. The museum used computer-controlled machine tools to construct the necessary parts, using tolerances a good machinist of the period would have been able to achieve. Babbage's failure to complete the analytical engine can be chiefly attributed to difficulties not only of politics and financing, but also to his desire to develop an increasingly sophisticated computer and to move ahead faster than anyone else could follow.
• A machine based on Babbage's difference engine was built in 1843 by Per Georg Scheutz and his son Edward. An improved Scheutzian calculation engine was sold to the British government and a later model was sold to the American government and these were used successfully in the production of logarithmic tables.
• Following Babbage, although unaware of his earlier work, was Percy Ludgate, an accountant from Dublin, Ireland. He independently designed a programmable mechanical computer, which he described in a work that was published in 1909.
• By the 20th century, earlier mechanical calculators, cash registers, accounting machines, and so on were redesigned to use electric motors, with gear position as the representation for the state of a variable. The word "computer" was a job title assigned to people who used these calculators to perform mathematical calculations. By the 1920's Lewis Fry Richardson's interest in weather prediction led him to propose human computers and numerical analysis to model the weather; to this day, the most powerful computers on Earth are needed to adequately model its weather using the Navier–Stokes equations.
• Companies like Friden, Marchant Calculator and Monroe made desktop mechanical calculators from the 1930's that could add, subtract, multiply and divide. During the Manhattan project, future Nobel laureate Richard Feynman was the supervisor of human computers who understood the use of differential equations which were being solved for the war effort.
• In 1948, the Curta was introduced. This was a small, portable, mechanical calculator that was about the size of a pepper grinder. Over time, during the 1950's and 1960's a variety of different brands of mechanical calculators appeared on the market. The first all-electronic desktop calculator was the British ANITA Mk.VII, which used a Nixie tube display and 177 subminiature thyratron tubes. In June 1963, Friden introduced the four-function EC-130. It had an all-transistor design, 13-digit capacity on a 5-inch (130 mm) CRT, and introduced Reverse Polish notation (RPN) to the calculator market at a price of $2200. The EC-132 model added square root and reciprocal functions. In 1965, Wang Laboratories produced the LOCI-2, a 10-digit transistorized desktop calculator that used a Nixie tube display and could compute logarithms.
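Reverse Polish notation, which the EC-130 brought to the calculator market, evaluates expressions with a stack and no parentheses. A minimal evaluator might look like this (a sketch of the technique, not Friden's actual logic):

```python
def eval_rpn(tokens):
    """Evaluate a Reverse Polish notation expression using a stack."""
    ops = {"+": lambda a, b: a + b,
           "-": lambda a, b: a - b,
           "*": lambda a, b: a * b,
           "/": lambda a, b: a / b}
    stack = []
    for tok in tokens:
        if tok in ops:
            b = stack.pop()  # operands come off in reverse order
            a = stack.pop()
            stack.append(ops[tok](a, b))
        else:
            stack.append(float(tok))
    return stack.pop()

# (3 + 4) * 2, entered RPN-style with no parentheses:
print(eval_rpn("3 4 + 2 *".split()))  # 14.0
```

Entering operands first and the operator last is exactly how an RPN calculator's keyboard works.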
• In the early days of binary vacuum-tube computers, their reliability was poor enough to justify marketing a mechanical octal version ("Binary Octal") of the Marchant desktop calculator. It was intended to check and verify the calculation results of such computers.
• Before World War II, mechanical and electrical analog computers were considered the "state of the art", and many thought they were the future of computing. Analog computers take advantage of the strong similarities between the mathematics of small-scale properties—the position and motion of wheels or the voltage and current of electronic components—and the mathematics of other physical phenomena, for example, ballistic trajectories, inertia, resonance, energy transfer, momentum, and so forth. They model physical phenomena with electrical voltages and currents as the analog quantities.
• Centrally, these analog systems work by creating electrical 'analogs' of other systems, allowing users to predict behavior of the systems of interest by observing the electrical analogs. The most useful of the analogies was the way the small-scale behavior could be represented with integral and differential equations, and could be thus used to solve those equations. An ingenious example of such a machine, using water as the analog quantity, was the water integrator built in 1928; an electrical example is the Mallock machine built in 1941. A planimeter is a device which does integrals, using distance as the analog quantity. Unlike modern digital computers, analog computers are not very flexible, and need to be rewired manually to switch them from working on one problem to another. Analog computers had an advantage over early digital computers in that they could be used to solve complex problems using behavioral analogues while the earliest attempts at digital computers were quite limited.
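What a planimeter or a wheel-and-disc integrator did with continuous physical motion, a digital computer approximates by accumulating many small steps. A rough Python sketch of the idea (mine, for illustration; this is not how any analog machine was built internally):

```python
def integrate(f, a, b, steps=10000):
    """Accumulate the area under f between a and b with the midpoint rule,
    a discrete stand-in for a mechanical integrator's continuous wheel."""
    h = (b - a) / steps
    return sum(f(a + (i + 0.5) * h) * h for i in range(steps))

# Area under y = x from 0 to 1 (exactly 0.5):
print(round(integrate(lambda x: x, 0.0, 1.0), 6))  # 0.5
```

An analog integrator produced this sum continuously and instantly; the digital version trades that immediacy for flexibility, since changing the problem means changing `f`, not rewiring a machine.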
• Some of the most widely deployed analog computers included devices for aiming weapons, such as the Norden bombsight, and fire-control systems, such as Arthur Pollen's Argo system for naval vessels. Some stayed in use for decades after World War II; the Mark I Fire Control Computer was deployed by the United States Navy on a variety of ships from destroyers to battleships. Other analog computers included the Heathkit EC-1, and the hydraulic MONIAC Computer which modeled econometric flows.
• The art of mechanical analog computing reached its zenith with the differential analyzer, built by H. L. Hazen and Vannevar Bush at MIT (Massachusetts Institute of Technology) starting in 1927, which in turn built on the mechanical integrators invented in 1876 by James Thomson and the torque amplifiers invented by H. W. Nieman. A dozen of these devices were built before their obsolescence was obvious; the most powerful was constructed at the University of Pennsylvania's Moore School of Electrical Engineering, where the ENIAC was built. Digital electronic computers like the ENIAC spelled the end for most analog computing machines, but hybrid analog computers, controlled by digital electronics, remained in substantial use into the 1950's and 1960's, and later in some specialized applications.
|The evolution from old to modern computers was long and difficult. The inventors were focused on creating an automatic calculator that would make the seller's job a lot easier. There were many other reasons to want automatic machines, and the sheer number of inventions eventually carried computing into the modern era.|
• The history of the modern computer begins with two separate technologies, automated calculation and programmability, but no single device can be identified as the earliest computer, partly because of the inconsistent application of that term. A few devices are worth mentioning though: some mechanical aids to computing were very successful and survived for centuries until the advent of the electronic calculator. The Sumerian abacus, designed around 2500 BC, had a descendant that won a speed competition against a modern desk calculating machine in Japan in 1946; the slide rules, invented in the 1620's, were carried on five Apollo space missions, including to the moon; and arguably the astrolabe and the Antikythera mechanism, an ancient astronomical computer built by the Greeks around 80 BC, belong on the list too. The Greek mathematician Hero of Alexandria (c. 10–70 AD) built a mechanical theater which performed a play lasting 10 minutes, operated by a complex system of ropes and drums that might be considered a means of deciding which parts of the mechanism performed which actions and when. This is the essence of programmability.
• Around the end of the 10th century, the French monk Gerbert d'Aurillac brought back from Spain the drawings of a machine invented by the Moors that answered either Yes or No to the questions it was asked. Again in the 13th century, the monks Albertus Magnus and Roger Bacon built talking androids without any further development (Albertus Magnus complained that he had wasted forty years of his life when Thomas Aquinas, terrified by his machine, destroyed it).
• In 1642, the Renaissance saw the invention of Pascal's mechanical calculator , a device that could perform all four arithmetic operations without relying on human intelligence. The mechanical calculator was at the root of the development of computers in two separate ways. Initially, it was in trying to develop more powerful and more flexible calculators that the computer was first theorized by Charles Babbage and then developed. Secondly, development of a low-cost electronic calculator, successor to the mechanical calculator, resulted in the development by Intel of the first commercially available microprocessor integrated circuit.
• In 1801, Joseph Marie Jacquard made an improvement to the textile loom by introducing a series of punched paper cards as a template which allowed his loom to weave intricate patterns automatically. The resulting Jacquard loom was an important step in the development of computers because the use of punched cards to define woven patterns can be viewed as an early, albeit limited, form of programmability.
• In the late 1880's, Herman Hollerith invented the recording of data on a machine-readable medium. Earlier uses of machine-readable media had been for control, not data. "After some initial trials with paper tape, he settled on punched cards ..." To process these punched cards he invented the tabulator and the keypunch machines. These three inventions were the foundation of the modern information processing industry. Large-scale automated data processing of punched cards was performed for the 1890 United States Census by Hollerith's company, which later became the core of IBM. By the end of the 19th century a number of ideas and technologies that would later prove useful in the realization of practical computers had begun to appear: Boolean algebra, the vacuum tube (thermionic valve), punched cards and tape, and the teleprinter.
• During the first half of the 20th century, many scientific computing needs were met by increasingly sophisticated analog computers, which used a direct mechanical or electrical model of the problem as a basis for computation. However, these were not programmable and generally lacked the versatility and accuracy of modern digital computers.
• Alan Turing is widely regarded as the father of modern computer science. In 1936 Turing provided an influential formalisation of the concept of the algorithm and computation with the Turing machine, providing a blueprint for the electronic digital computer. Of his role in the creation of the modern computer, Time magazine in naming Turing one of the 100 most influential people of the 20th century, states: "The fact remains that everyone who taps at a keyboard, opening a spreadsheet or a word-processing program, is working on an incarnation of a Turing machine".
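Turing's 1936 abstraction is concrete enough to simulate in a few lines. Here is a minimal sketch in Python (my own toy machine, for illustration only):

```python
def run_turing_machine(tape, rules, state="start", halt="halt", blank="_"):
    """A bare-bones Turing machine: a tape, a head, a current state,
    and a rule table mapping (state, symbol) -> (write, move, next state)."""
    cells = dict(enumerate(tape))
    head = 0
    while state != halt:
        write, move, state = rules[(state, cells.get(head, blank))]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# A one-state machine that flips every bit, then halts on the blank:
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_turing_machine("1011", flip))  # 0100
```

The rule table is the "program"; change the table and the same machinery computes something else, which is the insight behind every stored-program computer.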
• The Atanasoff–Berry Computer (ABC) was the world's first electronic digital computer, albeit not programmable. Atanasoff is considered to be one of the fathers of the computer. Conceived in 1937 by Iowa State College physics professor John Atanasoff, and built with the assistance of graduate student Clifford Berry, the machine was not programmable, being designed only to solve systems of linear equations. The computer did employ parallel computation. A 1973 court ruling in a patent dispute found that the patent for the 1946 ENIAC computer derived from the Atanasoff–Berry Computer.
• The first program-controlled computer was invented by Konrad Zuse, who built the Z3, an electromechanical computing machine, in 1941.
• The first programmable electronic computer was the Colossus, built in 1943 by Tommy Flowers.
• George Stibitz is internationally recognized as a father of the modern digital computer. While working at Bell Labs in November 1937, Stibitz invented and built a relay-based calculator he dubbed the "Model K" (for "kitchen table", on which he had assembled it), which was the first to use binary circuits to perform an arithmetic operation. Later models added greater sophistication including complex arithmetic and programmability.
• A succession of steadily more powerful and flexible computing devices were constructed in the 1930's and 1940's, gradually adding the key features that are seen in modern computers. The use of digital electronics (largely invented by Claude Shannon in 1937) and more flexible programmability were vitally important steps, but defining one point along this road as "the first digital electronic computer" is difficult. Notable achievements include:
|The history of computers is divided into four generations. First generation computers used vacuum tubes. The invention of the transistor marked the start of the second generation. The third generation began with the invention of integrated circuits, also known as microchips. Microprocessors marked the fourth and, so far, last generation, on which today's computers are based.|
| VACUUM TUBES |
• The first electronic computers used vacuum tubes, and they were huge and very complex. The first general purpose electronic computer was the ENIAC (Electronic Numerical Integrator And Computer), designed by J. Presper Eckert and John Mauchly. It was digital, although it didn't operate with binary code, and was reprogrammable to solve a complete range of computing problems. It was programmed using plugboards and switches, supporting input from an IBM card reader and output to an IBM card punch. It took up 167 square meters, weighed 27 tons, and consumed 160 kilowatts of power. It used 18,000 vacuum tubes, 7,200 crystal diodes, 1,500 relays, 70,000 resistors, 10,000 capacitors, and around 5 million hand-soldered joints.
| TRANSISTORS |
• The second generation of computers came about thanks to the transistor, invented in 1947, which then started replacing vacuum tubes in computer design. From 1955 onwards transistors replaced vacuum tubes in computer designs, giving rise to the "second generation" of computers. Initially the only devices available were germanium point-contact transistors, which, although less reliable than the vacuum tubes they replaced, had the advantage of consuming far less power. Transistor computers also produced far less heat and were much smaller than the first generation, albeit still big by today's standards.
| INTEGRATED CIRCUITS |
• The invention of integrated circuits (ICs), also known as microchips, paved the way for computers as we know them today. Making circuits out of single pieces of silicon, which is a semiconductor, allowed them to be much smaller and more practical to produce. This also started the ongoing process of integrating an ever larger number of transistors onto a single microchip. During the sixties microchips started making their way into computers, but the process was gradual, and the second generation of computers still held on.
| MICROPROCESSORS |
• The first microchip-based central processing units consisted of multiple microchips for different CPU components. The drive for ever greater integration and miniaturization led towards single-chip CPUs, where all of the necessary CPU components were put onto a single microchip, called a microprocessor. The first single-chip CPU, or microprocessor, was the Intel 4004.
| Computers can be classified, or typed, in many ways. The most common classifications are:|
• Classes based on principles of information
• Classes by size and configuration
• Classes by function
Below I explain all three of those classes.
There are three different types of computers according to the principles of operation. Those three types of computers are:
- An analog computer is a computing device that works on a continuous range of values. The results given by analog computers will only be approximate, since they deal with quantities that vary continuously. They generally deal with physical variables such as voltage, pressure, temperature, speed, etc. Mechanical analog computers were very important in gun fire control in World War II, the Korean War and well past the Vietnam War; they were made in significant numbers. The development of transistors made electronic analog computers practical, and until digital computers had developed sufficiently, they continued to be commonly used in science and industry.
- Analog computers can have a very wide range of complexity. Slide rules and nomographs are the simplest, while naval gunfire control computers and large hybrid digital/analog computers were among the most complicated.
- A digital computer, on the other hand, operates on digital data such as numbers. It uses the binary number system, in which there are only two digits, 0 and 1. Each digit is called a bit.
- The digital computer is designed using digital circuits in which there are two levels for an input or output signal. These two levels are known as logic 0 and logic 1. Digital Computers can give more accurate and faster results.
The digital computer is well suited for solving complex problems in engineering and technology. Hence digital computers see increasing use in the fields of design, research and data processing.
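The two logic levels described above are enough to build arithmetic. As a sketch (my illustration, not any specific circuit), here is one bit of binary addition composed purely from logic operations, rippled across a word:

```python
def full_adder(a, b, carry_in):
    """One bit of binary addition from AND, OR and XOR alone."""
    s = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return s, carry_out

def add_bits(x, y, width=8):
    """Ripple-carry addition: chain 'width' full adders together."""
    result, carry = 0, 0
    for i in range(width):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

print(add_bits(13, 29))  # 42
```

Real CPUs build exactly this kind of adder out of transistors acting as switches, just vastly miniaturized.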
- Based on the purpose, Digital computers can be further classified as:
- General purpose computer is a computer designed to perform, or that is capable of performing, in a reasonably efficient manner, the functions required by both scientific and business applications. A general purpose computer is often understood to be a large system, capable of supporting remote terminal operations, but it may also be a smaller computer, e.g., a desktop workstation.
- A hybrid computer combines the desirable features of analog and digital computers. It is mostly used for automatic operations of complicated physical processes and machines. Nowadays analog-to-digital and digital-to-analog converters are used for transforming the data into suitable form for either type of computation.
- For example, in a hospital's ICU, analog devices might measure the patient's temperature, blood pressure and other vital signs. These analog measurements might then be converted into numbers and supplied to the digital components in the system, which monitor the patient's vital signs and send signals if any abnormal readings are detected. Hybrid computers are mainly used for specialized tasks.
There are four different classes of computers by size and configuration:
- A supercomputer is a computer that is at the frontline of current processing capacity, particularly speed of calculation. The term supercomputer itself is rather fluid, and the speed of today's supercomputers tends to become typical of tomorrow's ordinary computer.
- A supercomputer is focused on performing tasks involving intense numerical calculations such as weather forecasting, fluid dynamics, nuclear simulations, theoretical astrophysics, and complex scientific computations.
- Supercomputer processing speeds are measured in floating point operations per second or FLOPS. An example of a floating point operation is the calculation of mathematical equations in real numbers. In terms of computational capability, memory size and speed, I/O technology, and topological issues such as bandwidth and latency, supercomputers are the most powerful, are very expensive, and not cost-effective just to perform batch or transaction processing. Transaction processing is handled by less powerful computers such as server computers or mainframes.
- The term mainframe computer was created to distinguish the traditional, large, institutional computer intended to service multiple users from the smaller, single user machines. These computers are capable of handling and processing very large amounts of data quickly.
- Mainframe computers are used in large institutions such as governments, banks and large corporations. Their performance is measured in MIPS (million instructions per second), and they can respond to hundreds of millions of users at a time.
- A minicomputer (colloquially, mini) is a class of multi-user computer that lies in the middle range of the computing spectrum, between the smallest mainframe computers and the largest single-user systems (microcomputers or personal computers).
- The contemporary term for this class of system is midrange computer, such as the higher-end SPARC, POWER and Itanium-based systems from Oracle Corporation, IBM and Hewlett-Packard (e.g. laboratory computers).
- Microcomputers are the most common type of computers used by people today, whether in a workplace, at school or on the desk at home. The term “microcomputer” was introduced with the advent of single chip microprocessors. The term "microcomputer" itself is now practically an anachronism.
- These computers include desktop computers, laptops and similar personal machines.
- A separate class is that of mobile devices, such as smartphones, tablets and PDAs.
Posted 23 November 2012 - 12:39 AM Edited by ilir, 23 November 2012 - 01:12 AM.
There are four different classes of computers by function:
- Server usually refers to a computer that is dedicated to providing a service. For example, a computer dedicated to a database may be called a "database server"; "file servers" manage large collections of computer files; "web servers" serve web pages and web applications.
- Many smaller servers are actually personal computers that have been dedicated to providing services for other computers.
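The idea that a server is simply a computer dedicated to answering requests can be sketched with Python's standard socket module: a toy "service" listens on a port, accepts one connection, and replies to whatever it receives. The one-shot design and the names here are illustrative only; real servers loop forever and handle many clients concurrently.

```python
import socket
import threading

def serve_once(host="127.0.0.1", port=0):
    """Start a toy single-request TCP service; return the port it listens on."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))          # port 0 lets the OS pick a free port
    srv.listen(1)
    actual_port = srv.getsockname()[1]

    def handle():
        conn, _ = srv.accept()
        request = conn.recv(1024)               # the client's request
        conn.sendall(b"HELLO " + request)       # the service's response
        conn.close()
        srv.close()

    threading.Thread(target=handle, daemon=True).start()
    return actual_port

# Client side: connect to the service, send a request, read the reply
port = serve_once()
cli = socket.create_connection(("127.0.0.1", port))
cli.sendall(b"world")
reply = cli.recv(1024)   # -> b"HELLO world"
cli.close()
```

A "database server" or "web server" is this same pattern scaled up: the machine's whole job is to sit on a network and answer requests from other computers.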
- A workstation is a high-end microcomputer designed for technical or scientific applications. Intended primarily to be used by one person at a time, workstations are commonly connected to a local area network and run multi-user operating systems. The term "workstation" has also been used to refer to a mainframe computer terminal or to a PC connected to a network.
- Workstations are intended to serve one user and may contain special hardware enhancements not found on a personal computer.
- Information appliances are computers usable for computing, telecommunication, and reproducing and presenting encoded information in myriad forms and applications. They are typically designed to perform a specific user-friendly function, such as playing music, taking photographs, or editing text.
- The term is most commonly applied to mobile devices, though there are also portable and desktop devices of this class.
- Embedded computers are computers that are a part of a machine or device. Embedded computers generally execute a program that is stored in non-volatile memory and is only intended to operate a specific machine or device. Embedded computers are very common.
- Embedded computers are typically required to operate continuously without being reset or rebooted, and once deployed their software usually cannot be modified. An automobile may contain a number of embedded computers, whereas a washing machine or a DVD player would contain only one.
- The central processing units (CPUs) used in embedded computers are often only just sufficient for the computational requirements of the specific application, and may be slower and cheaper than the CPUs found in a personal computer.
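The description above (a fixed program operating one specific machine) can be sketched as a toy state machine for a hypothetical washing-machine controller. The states and cycle order are invented for illustration; a real controller would also read sensors and drive motors, but the shape of the program is the same: a small fixed loop burned into non-volatile memory.

```python
# A toy fixed-cycle controller: the program only ever runs this one machine.
STATES = ["fill", "wash", "rinse", "spin", "done"]

def next_state(state):
    """Advance the fixed wash cycle one step; 'done' is terminal."""
    i = STATES.index(state)
    return STATES[min(i + 1, len(STATES) - 1)]

cycle = ["fill"]
while cycle[-1] != "done":
    cycle.append(next_state(cycle[-1]))
# cycle == ["fill", "wash", "rinse", "spin", "done"]
```

Unlike a general-purpose PC, nothing here can be repurposed: the states, transitions and endpoint are baked in, which is exactly what makes a cheap, slow CPU sufficient.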
Posted 23 November 2012 - 08:27 AM
|QUOTE (ccrogers15 @ Friday, Nov 23 2012, 04:08)|
|You better be giving MASTER OF SANANDREAS some credit. You basically ripped his thread and changed it about computers! Its so obvious!: http://www.gtaforums...howtopic=529269|
That's just the page formatting, which, looking at the quote codes, isn't that difficult anyway.
If I'm being pedantic, Colossus was the first non-general-purpose digital computer, and the Bombe, despite not being a true "computer", could have been worth a mention. Still, an interesting read.
Posted 23 November 2012 - 11:44 AM
@cc - He didn't 'rip it off', he was just inspired by the guide. To be honest, I myself was inspired by miromiro's newbie guide.
Posted 23 November 2012 - 01:09 PM Edited by ilir, 23 November 2012 - 05:02 PM.
Just one notice: I haven't finished my guide yet, as I'm going to add more sections (that's why I reserved two posts), write the credits, write some last words, etc.
My main focus in this guide was the accuracy of the information rather than the formatting. That's not to say I didn't work on the formatting and on making it more appealing and readable, but accuracy was my main concern.
Had too many problems with tables, though.
@sivispacem: Actually, the Atanasoff-Berry Computer was the first non-general-purpose digital computer, as it was conceived in 1937 and successfully tested in 1942, albeit non-programmable.
The Colossus, or to be more precise, the Colossus Mark I, was built in December 1943 and became operational at Bletchley Park in February 1944. Its main use was breaking codes in World War II, helping in the cryptanalysis of the Lorenz cipher. Unlike the Atanasoff-Berry Computer, the Colossus was programmable.
I also read about the Bombe; it was an electromechanical device rather than a computer, but I'll mention it in the Computer History Timeline. Thank you for your suggestion.
Posted 23 November 2012 - 05:30 PM
@sivispacem: It's not me saying it, sivispacem, that's what history says. Don't mix up the first non-general-purpose computer with the first digital computer. The ABC and Colossus were both among the first digital computers; however, Colossus was programmable and the ABC wasn't. That's it. If you wanna discuss this further, feel free to PM me.
Posted 24 November 2012 - 09:44 AM
|QUOTE (ilir @ Friday, Nov 23 2012, 18:30)|
|@sivispacem: It's not me saying it, sivispacem, that's what history says. Don't mix up the first non-general-purpose computer with the first digital computer. The ABC and Colossus were both among the first digital computers; however, Colossus was programmable and the ABC wasn't. That's it. If you wanna discuss this further, feel free to PM me.|
My point was more about how you define a computer. I'd argue that if it lacks programmability it isn't a computer, but hey, it's your topic; I was just commenting.
Posted 24 November 2012 - 03:31 PM
|QUOTE (Celestail @ Saturday, Nov 24 2012, 13:48)|
|@sivispacem I see your point, but I was taught that all machines are computers whether they can be programmed or not|
It really is a question of semantics. I feel somewhat bad for partially derailing the topic, but the Oxford English Dictionary definition of a "Computer", in this context-
|QUOTE (The Oxford English Dictionary)|
|an electronic device which is capable of receiving information (data) in a particular form and of performing a sequence of operations in accordance with a predetermined but variable set of procedural instructions (program) to produce a result in the form of information or signals.|
refers directly to the programmability of computers as a defining characteristic. I was originally merely pointing out that, depending on how you define a computer, the first (digital) one isn't necessarily the same.