

"אפשר בבקשה חומר באנגלית על ההתפתחות של המחשב."
Thread #8075
SwitcH
Member since 30.10.03
18,527 posts
   21:59   25.11.03   
  Could I please get some material in English on the development of the computer.
 
Last edited on 25.11.03 at 22:01. Regards, SwitcH
 
Really from the very beginning up to the present day..
Please, I have a matriculation project to hand in in two weeks..

I simply have no idea where to look..



  Thread     Author     Date written     No.
  Please!!..it's important!! SwitcH 26.11.03 12:30 1
  Down SwitcH 26.11.03 15:52 2
  Left SwitcH 28.11.03 13:04 3
  Right SwitcH 28.11.03 18:36 4
  This is a small part of the material I have... the_jackass 28.11.03 18:54 5
     Bring it, bro.. SwitcH 28.11.03 21:23 6
         Umm... the_jackass 29.11.03 03:48 7
             Here: the_jackass 29.11.03 14:45 8
                 Thanks a lot!! SwitcH 29.11.03 18:50 9

       
SwitcH
Member since 30.10.03
18,527 posts
   12:30   26.11.03   
  1. Please!!..it's important!!
In reply to message number 0
 


SwitcH
Member since 30.10.03
18,527 posts
   15:52   26.11.03   
  2. Down  
In reply to message number 0
 


SwitcH
Member since 30.10.03
18,527 posts
   13:04   28.11.03   
  3. Left  
In reply to message number 0
 


SwitcH
Member since 30.10.03
18,527 posts
   18:36   28.11.03   
  4. Right  
In reply to message number 0
 
Last edited on 28.11.03 at 18:43. Regards, SwitcH
 


the_jackass

   18:54   28.11.03   
  5. This is a small part of the material I have...
In reply to message number 0
 
Tell me if this is good and I'll bring everything...


What follows covers the contribution of major individuals, machines, and ideas to the development of computing.

A computer might be described with deceptive simplicity as “an apparatus that performs routine calculations automatically.” Such a definition would owe its deceptiveness to a naive and narrow view of calculation as a strictly mathematical process. In fact, calculation underlies many activities that are not normally thought of as mathematical. Walking across a room, for instance, requires many complex, albeit subconscious, calculations. Computers, too, have proved capable of solving a vast array of problems, from balancing a checkbook to even—in the form of guidance systems for robots—walking across a room.

Before the true power of computing could be realized, therefore, the naive view of calculation had to be overcome. The inventors who laboured to bring the computer into the world had to learn that the thing they were inventing was not just a number cruncher, not merely a calculator. For example, they had to learn that it was not necessary to invent a new computer for every new calculation and that a computer could be designed to solve numerous problems, even problems not yet imagined when the computer was built. They also had to learn how to tell such a general problem-solving computer what problem to solve. In other words, they had to invent programming.

They had to solve all the heady problems of developing such a device, of implementing the design, of actually building the thing. The history of the solving of these problems is the history of the computer. That history is covered in this article, and links are provided to entries on many of the individuals and companies mentioned. In addition, separate articles exist on computer science and supercomputer.

Early history

Computer precursors

The abacus

The earliest known calculating device is probably the abacus. It dates back at least to 1100 BC and is still in use today, particularly in Asia. Now, as then, it typically consists of a rectangular frame with thin parallel rods strung with beads. Long before any systematic positional notation was adopted for the writing of numbers, the abacus assigned different units, or weights, to each rod. This scheme allowed a wide range of numbers to be represented by just a few beads and, together with the invention of zero in India, may have inspired the invention of the Hindu-Arabic number system. In any case, abacus beads can be readily manipulated to perform the common arithmetical operations—addition, subtraction, multiplication, and division—that are useful for commercial transactions and in bookkeeping.

The abacus is a digital device; that is, it represents values discretely. A bead is either in one predefined position or another, representing unambiguously, say, one or zero.
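To make the per-rod weights concrete, here is a minimal Python sketch (the specific weights, list layout, and function name are illustrative assumptions, not taken from the text above):

# Illustrative sketch only: each rod carries a fixed positional weight, so a
# handful of small bead counts can stand for a large number.
ROD_WEIGHTS = [10000, 1000, 100, 10, 1]   # one weight per rod, most significant first

def abacus_value(bead_counts):
    # Combine per-rod bead counts (0-9 each) into a single number.
    return sum(count * weight for count, weight in zip(bead_counts, ROD_WEIGHTS))

print(abacus_value([0, 4, 0, 7, 2]))      # prints 4072

With only five rods and at most nine beads per rod, any value up to 99,999 can be represented.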


Analog calculators: From Napier's logarithms to the slide rule

Calculating devices took a different turn when John Napier, a Scottish mathematician, published his discovery of logarithms in 1614. As any person can attest, adding two 10-digit numbers is much simpler than multiplying them together, and the transformation of a multiplication problem into an addition problem is exactly what logarithms enable. This simplification is possible because of the following logarithmic property: the logarithm of the product of two numbers is equal to the sum of the logarithms of the numbers. By 1624 tables with 14 significant digits were available for the logarithms of numbers from 1 to 20,000, and scientists quickly adopted the new labour-saving tool for tedious astronomical calculations.
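As a quick, hedged illustration of that property (the numbers are arbitrary, and math.log/math.exp stand in for the printed logarithm tables of the period):

import math

# log(a * b) = log(a) + log(b): a lookup-and-add replaces a multiplication.
a, b = 3141592653, 2718281828                     # two 10-digit numbers
exact = a * b                                     # direct multiplication
via_logs = math.exp(math.log(a) + math.log(b))    # add the logarithms, then invert

print(exact)                    # the exact product
print(f"{via_logs:.6e}")        # approximately the same, limited by precision

The small discrepancy between the two results mirrors the limited precision of a 14-significant-digit table.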

Most significant for the development of computing, the transformation of multiplication into addition greatly simplified the possibility of mechanization. Analog calculating devices based on Napier's logarithms—representing digital values with analogous physical lengths—soon appeared. In 1620 Edmund Gunter, an English mathematician who coined the terms cosine and cotangent, built a device for performing navigational calculations: the Gunter Scale or, as navigators simply called it, the gunter. Around 1632 an English clergyman and mathematician named William Oughtred built the first slide rule, drawing on Napier's ideas. That first slide rule was circular, but Oughtred also built the first rectangular one in 1633.

The analog devices of Gunter and Oughtred had various advantages and disadvantages compared with digital devices such as the abacus. What is important is that the consequences of these design decisions were being tested in the real world.


SwitcH
Member since 30.10.03
18,527 posts
   21:23   28.11.03   
  6. Bring it, bro..
In reply to message number 5
 
Thanks a lot..
If you have something more focused among the material..
that would be good..
Thanks!


the_jackass

   03:48   29.11.03   
  7. Umm...
In reply to message number 6
 
I have roughly 20 times as much as what I wrote in the previous reply...
I'll try to pull out the important parts tomorrow morning...


the_jackass

   14:45   29.11.03   
  8. Here:
In reply to message number 7
 

The first computer

By the second decade of the 19th century, a number of ideas necessary for the invention of the computer were in the air. First, the potential benefits to science and industry of being able to automate routine calculations were appreciated, as they had not been a century earlier. Specific methods to make automated calculation more practical, such as doing multiplication by adding logarithms or by repeating addition, had been invented, and experience with both analog and digital devices had shown some of the benefits of each approach. Finally (as described in the previous section, Computer precursors), the Jacquard loom had shown the benefits of directing a multipurpose device through coded instructions, and it had demonstrated how punched cards could be used to modify those instructions quickly and flexibly. It was a mathematical genius in England who began to put all of these pieces together.


The Difference Engine

Charles Babbage was an English mathematician and inventor: he invented the cowcatcher, reformed the British postal system, and was a pioneer in the fields of operations research and actuarial science. It was Babbage who first suggested that the weather of years past could be read from tree rings. He also had a lifelong fascination with keys, ciphers, and mechanical dolls.

As a founding member of the Royal Astronomical Society, Babbage had seen a clear need to design and build a mechanical device that could automate long, tedious astronomical calculations. He began by writing a letter in 1822 to Sir Humphry Davy, president of the Royal Society, about the possibility of automating the construction of mathematical tables—specifically, logarithm tables for use in navigation. He then wrote a paper, “On the Theoretical Principles of the Machinery for Calculating Tables,” which he read to the society later that year. (It won the Royal Astronomical Society's first Gold Medal in 1823.) Tables then in use often contained errors, which could be a life-and-death matter for sailors at sea, and Babbage argued that by automating the production of the tables he could assure their accuracy. Having gained support in the society for his Difference Engine, as he called it, Babbage next turned to the British government to fund development, obtaining one of the first government grants for research and technological development anywhere in the world.

Babbage approached the project very seriously: he hired a master machinist, set up a fireproof workshop, and built a dust-proof environment for testing the device. Up until then calculations were rarely carried out to more than 6 digits; Babbage planned routinely to produce 20- or 30-digit results.

The Difference Engine was a digital device: it operated on discrete digits rather than smooth quantities, and the digits were decimal (0–9), represented by positions on toothed wheels, rather than the binary digits that Leibniz favoured (but did not use). When one of the toothed wheels turned from 9 to 0, it caused the next wheel to advance one position, carrying the digit, just as Leibniz's Step Reckoner calculator had.

The Difference Engine was more than a simple calculator, however; it mechanized not just a single calculation but a whole series of calculations on a number of variables to solve a complex problem. It went far beyond calculators in other ways, as well. Like modern computers, the Difference Engine had storage—that is, a place where data could be held temporarily for later processing—and it was designed to stamp its output into soft metal, which could later be used to produce a printing plate.

Nevertheless, the Difference Engine performed only one operation. The operator would set up all of its data registers with the original data, and then the single operation would be repeatedly applied to all of the registers, ultimately producing a solution. Still, in complexity and audacity of design it dwarfed any calculating device then in existence.
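As a rough sketch of that principle in Python (the polynomial and its starting differences are an assumed example, not Babbage's own notation), the single repeated operation is addition across a small set of registers:

# Tabulating f(x) = x*x + x + 1 by repeated addition only, the way a
# difference engine does. The registers start with f(0), the first
# difference, and the constant second difference.
registers = [1, 2, 2]              # [value, first difference, second difference]

for x in range(8):
    print(x, registers[0])         # the tabulated value f(x)
    registers[0] += registers[1]   # add the first difference to the value
    registers[1] += registers[2]   # add the constant second difference

Each pass applies nothing but addition to the registers, yet the printed column reproduces the polynomial exactly.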

The full engine, designed to be room-sized, was never built, at least not by Babbage. Although he sporadically received several government grants—governments changed, funding often ran out, and he had to personally bear some of the financial costs—he was working at or near the tolerances of the construction methods of the day, and he ran into numerous construction difficulties. All design and construction ceased in 1833, when Joseph Clement, the machinist responsible for actually building the machine, refused to continue unless he was prepaid. (The completed portion of the Difference Engine is on permanent exhibition at the Science Museum in London.)

Early business machines

Throughout the 19th century, business machines were coming into common use. Calculators became available as a tool of commerce in 1820 (see the earlier section Digital calculators), and in 1874 the Remington Arms Company, Inc., sold the first commercially viable typewriter. Other machines were invented for other specific business tasks. None of these machines was a computer, but they did advance the state of practical mechanical knowledge—knowledge that would be used in computers later.

One of these machines was invented in response to a sort of constitutional crisis in the United States: the census tabulator.


Herman Hollerith's census tabulator

The U.S. Constitution mandates that a census of the population be performed every 10 years. The first attempt at any mechanization of the census was in 1870, when statistical data were transcribed onto a rolling paper tape displayed through a small slotted window. As the size of America's population exploded in the 19th century, and the number of census questions expanded, the urgency of further mechanization became increasingly clear.

After graduating from the Columbia University School of Mines, New York City, in 1879, Herman Hollerith obtained his first job with one of his former professors, William P. Trowbridge, who had received a commission as a special agent for the 1880 census. It was while employed at the Census Office that Hollerith first saw the pressing need for automating the tabulation of statistical data.

Over the next 10 years Hollerith refined his ideas, obtaining his first patent in 1884 for a machine to punch and count cards. He then organized the health records for Baltimore, Maryland, for New York City, and for the state of New Jersey—all in preparation for winning the contract to tabulate the 1890 U.S. Census. The success of the U.S. census opened European governments to Hollerith's machines. Most notably, a contract with the Russian government, signed on December 15, 1896, may have induced him to incorporate as the Tabulating Machine Company on December 5, 1896.

As the technology for realizing a computer was being honed by the business machine companies in the early 20th century, the theoretical foundations were being laid in academia. During the 1930s, two important strains of computer-related research were being pursued at two universities in Cambridge, Massachusetts. One strain produced the Differential Analyzer, the other a series of devices ending with the Harvard Mark IV.


Vannevar Bush's Differential Analyzer

In 1930 an engineer named Vannevar Bush at the Massachusetts Institute of Technology (MIT) developed the first modern analog computer. The Differential Analyzer, as he called it, was an analog calculator that could be used to solve certain classes of differential equations, a type of problem common in physics and engineering applications that is often very tedious to solve. Variables were represented by shaft motion, and addition and multiplication were accomplished by feeding the values into a set of gears. Integration was carried out by means of a knife-edged wheel rotating at a variable radius on a circular table. The individual mechanical integrators were then interconnected to solve a set of differential equations.
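A loose software analogy (an assumption for illustration, not a description of Bush's hardware) is that each mechanical integrator accumulates small increments of its input; the Euler-style loop below does the same for the simple differential equation dy/dt = -y:

# Loose analogy only: accumulate small increments, as a mechanical
# integrator does, to approximate the solution of dy/dt = -y with y(0) = 1.
dt = 0.001            # size of each increment (the "shaft rotation" per step)
y, t = 1.0, 0.0

while t < 1.0:
    y += -y * dt      # integrate: add the increment dy = f(y) * dt
    t += dt

print(y)              # roughly 0.368, close to the exact value exp(-1)

Like the Differential Analyzer itself, the result is approximate but practically useful, and the accuracy depends on how finely the increments are taken.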

The Differential Analyzer proved highly useful, and a number of them were built and used at various universities. Still the device was limited to solving this one class of problem, and, as is the case for all analog devices, it produced approximate, albeit practical, solutions. Nevertheless, important applications for analog computers, and analog-digital hybrid computers, still exist, particularly for simulating complicated dynamical systems such as aircraft flight, nuclear power plant operations, and chemical reactions.

The age of Big Iron

A snapshot of computer development in the early 1950s would have to show a number of companies and laboratories in competition—technological competition, and increasingly earnest business competition—to produce the few computers then demanded for scientific research. Several computer-building projects had been launched immediately after the end of World War II in 1945, primarily in the United States and Britain. These projects were inspired chiefly by a 1946 document, “Preliminary Discussion of the Logical Design of an Electronic Digital Computing Instrument,” produced by a group working under the direction of mathematician John von Neumann of the Institute for Advanced Study in Princeton, New Jersey. The IAS paper, as von Neumann's document became known, articulated the concept of the stored program—a concept that has been called the single largest innovation in the history of the computer. (Von Neumann's principles are described earlier, in the section Toward the classical computer.) Most computers built in the years following the paper's distribution were designed according to its plan, yet by 1950 there were still only a handful of working stored-program computers.

Business use at this time was marginal because the machines were so hard to use. Although computer makers such as Remington Rand, the Burroughs Adding Machine Company, and the International Business Machines Corporation (IBM) had begun building machines to the IAS specifications, it was not until 1954 that a real market for business computers began to emerge. The IBM 650, delivered at the end of 1954 for colleges and businesses, was a decimal implementation of the IAS design. With this low-cost magnetic drum computer, which sold for about $200,000 apiece (compared with about $1,000,000 for the scientific model, the IBM 701), IBM had a hit, eventually selling about 1,800 of them. In addition, by offering universities that taught computer science courses around the IBM 650 an academic discount program (with price reductions of up to 60 percent), IBM established a cadre of engineers and programmers for their machines. (Apple Computer later used a similar discount strategy in American grade schools to capture a large proportion of the early microcomputer market.)

A snapshot of the era would also have to show what could be called the sociology of computing. The actual use of computers was restricted to a small group of trained experts, and there was resistance to the idea that this group should be expanded by making the machines easier to use. Machine time was expensive, more expensive than the time of the mathematicians and scientists who needed to use the machines, and computers could only process one problem at a time. As a result, the machines were in a sense held in higher regard than the scientists. If a task could be done by a person, it was thought that the machine's time should not be wasted with it. The public's perception of computers was not positive either. If motion pictures of the time can be used as a guide, the popular image was of a room-filling brain attended by white-coated technicians, mysterious and somewhat frightening—about to eliminate jobs through automation.

Yet the machines of the early 1950s were not much more capable than Charles Babbage's Analytical Engine of the 1830s (although they were much faster). Although in principle these were general-purpose computers, they were still largely restricted to doing tough math problems. They often lacked the means to perform logical operations, and they had little text-handling capability—for example, lower-case letters were not even representable in the machines, even if there were devices capable of printing them.

These machines could only be operated by experts, and preparing a problem for computation (what would be called programming today) took a long time. With only one person at a time able to use a machine, major bottlenecks were created. Problems lined up like experiments waiting for a cyclotron or the Space Shuttle. Much of the machine's precious time was wasted because of this one-at-a-time protocol.

In sum, the machines were expensive and the market was still small. To be useful in a broader business market, or even in a broader scientific market, computers would need application programs: word processors, database programs, and so on. These applications in turn would require programming languages in which to write them and operating systems to manage them.


Before 1970, computers were big machines requiring thousands of separate transistors. They were operated by specialized technicians, who were often dressed in white lab coats and commonly referred to as a computer priesthood. The machines were expensive and difficult to use. Few people came in direct contact with them, not even their programmers. The typical interaction was as follows: a programmer coded instructions and data on preformatted paper, a keypunch operator transferred the data onto punch cards, a computer operator fed the cards into a card reader, and, finally, the computer executed the instructions or stored the cards' information for later processing. Advanced installations might allow users limited interaction with the computer more directly, but still remotely, via time-sharing through the use of cathode-ray tube terminals or teletype machines.

At the beginning of the 1970s there were essentially two types of computers. There were room-sized mainframes, costing hundreds of thousands of dollars, that were built one at a time by companies such as International Business Machines Corporation (IBM) and Control Data Corporation. There also were smaller, cheaper, mass-produced minicomputers, costing tens of thousands of dollars, that were built by a handful of companies, such as Digital Equipment Corp. and Hewlett-Packard Company, for scientific laboratories and businesses.

Still, most people had no direct contact with either type of computer, and the machines were popularly viewed as impersonal giant brains that threatened to eliminate jobs through automation. The idea that anyone would have his or her own desktop computer was generally regarded as far-fetched. Nevertheless, with advances in integrated circuit technology, the necessary building blocks for desktop computing began to emerge in the early 1970s.

These are summaries of the main chapter headings.
If you need more detail on anything, just ask (preferably by private message).


SwitcH
Member since 30.10.03
18,527 posts
   18:50   29.11.03   
  9. Thanks a lot!!
In reply to message number 8
 


