
Friday, December 13, 2013

1962 First computer game & word processor

Steve Russell at MIT invents Spacewar!, the first computer game, running on a DEC PDP-1.
Because the PDP-1 had a typewriter interface, editors like TECO (Text Editor and Corrector) were written for it.
Steve Piner and L. Peter Deutsch produced the first word processor, called Expensive Typewriter (MIT's PDP-1 cost $100,000).



Sunday, October 13, 2013

1960 – First commercial transistorized computers

DEC introduced the PDP-1 and IBM released the 7090, which was the fastest computer in the world at the time.



1954 – FORTRAN and 1958 – Integrated Circuit computer History.

1954 FORTRAN
John Backus & IBM invent the first successful high level programming language, and compiler, that ran on IBM 701 computers.
FORmula TRANslation was designed to make calculating the answers to  scientific and    math problems easier.


1958  Integrated Circuit
Jack Kilby at Texas Instruments & Robert Noyce at Fairchild Semiconductor independently invent the first integrated circuit, or chip.
Jack Kilby was awarded the National Medal of Science, was inducted into the National Inventors Hall of Fame, and received the 2000 Nobel Prize in Physics for his work on the integrated circuit.

Grace Hopper 1906-1992 computer History

Grace Hopper 1906-1992
Developed the first compiler (A-0, later ARITH-MATIC, MATH-MATIC and FLOW-MATIC) while working at the Remington Rand Corporation on the UNIVAC I.
Later returned to the Navy, where she worked on COBOL and was eventually promoted to Rear Admiral.



Grace Hopper 1906-1992
Rear Admiral Grace Hopper, US Navy, and other programmers at a UNIVAC console - 1957
Some of Grace Hopper's Awards
She won the first "man of the year" award from the Data Processing Management Association in 1969.
She became the first person from the United States and the first woman of any nationality to be made a Distinguished Fellow of the British Computer Society in 1973.
Upon her retirement she received the Defense Distinguished Service Medal in 1986.
She received the National Medal of Technology in 1991.



Thursday, October 10, 2013

1953 – IBM 701 EDPM Computer

IBM enters the market with its first large-scale electronic computer.
It was designed to be incompatible with IBM's existing punch-card processing systems, so that it would not cut into IBM's existing profit sources.


Computing History 1947-1951

1947 The transistor
Invented by William Shockley, John Bardeen & Walter Brattain at Bell Labs.
The transistor replaces bulky vacuum tubes with a smaller, more reliable, and power-saving solid-state circuit.
1951  UNIVAC
First commercial computer -  Between 1951 and 1958, 47 UNIVAC I computers were delivered.


1951 UNIVAC Mercury delay unit (1 of 7)
UNIVAC mercury delay units containing 18 delay lines, each of which stored 120 bits. Total of 2,160 bits, or 144 fifteen-bit words per memory unit.



1951  UNIVAC
UNIVAC tape units.


1951  UNIVAC

UNIVAC tube board and individual vacuum tube.



Computing History 1943/1944/1946

1943/1944 Colossus Mark I & II.
The Colossus Mark I & II are widely acknowledged as the first programmable electronic computers, and were used at Bletchley Park to break German messages encrypted by the Lorenz SZ40/42.


1946 John Eckert & John W. Mauchly ENIAC 1 Computer.
ENIAC was short for Electronic Numerical Integrator And Computer. It was the first general purpose (programmable to solve any problem) electronic computer. It contained over 17,000 vacuum tubes, weighed 27 tonnes and drew 150 kW of power to operate.


Computing History 1936, 1944.

1936 Konrad Zuse Z1 Computer
The first freely programmable computer, with electro-mechanical punched-tape control.


1944 Howard Aiken & Grace Hopper
Harvard Mark I Computer
The IBM Automatic Sequence Controlled Calculator (ASCC) Computer was created by IBM for Harvard University, which called it the Mark I. First universal calculator.


Alan Turing 1912-1954 Computing History.

1.    British mathematician and cryptographer.
2.    Father of theoretical computer science.
3.    Contributions include:
     a.    Turing Machine
     b.    Turing Test (for AI)
     c.    First detailed design of a stored program computer (never built)
4.    The Turing Machine is a simpler, equivalent alternative to Kurt Gödel's formal systems for defining computation.
5.    Showed that the halting problem is undecidable.

Wednesday, October 9, 2013

Alonzo Church 1903-1995 computing history

American mathematician and logician. Developed lambda calculus, directly implemented by LISP and other functional programming languages. Showed the existence of an undecidable problem. Lambda calculus was proven to be equivalent to the Turing machine by Church and Turing.
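Since the post notes that lambda calculus is directly implemented by LISP and other functional languages, here is a minimal illustrative sketch in Python, whose lambda keyword descends from Church's notation. The names ZERO, SUCC, ADD and to_int are our own, chosen for the example; this is not from the original post.

# Church numerals: the number n is the function that applies f to x exactly n times.
ZERO = lambda f: lambda x: x
SUCC = lambda n: lambda f: lambda x: f(n(f)(x))
ADD = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    # Convert a Church numeral to an ordinary Python integer by
    # applying "add one" n times, starting from 0.
    return n(lambda k: k + 1)(0)

ONE = SUCC(ZERO)
TWO = SUCC(ONE)
print(to_int(ADD(TWO)(TWO)))  # prints 4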





Kurt Gödel 1906-1978 computing History.


Kurt Gödel 1906-1978

Famous for his incompleteness theorem.

This theorem implies that not all mathematical questions are computable (that is, solvable by a mechanical procedure).







The Right Honourable Augusta Ada, Countess of Lovelace computing history





The Right Honourable Augusta Ada, Countess of Lovelace

Created a program for Babbage's (theoretical) Analytical Engine which would have calculated Bernoulli numbers.
Widely recognized as the first programmer.




Computing History 1837 – Analytical Engine.

Charles Babbage first described a general purpose analytical engine in 1837, but worked on the design until his death in 1871. It was never built. As designed, it would have been programmed using punch-cards and would have included features such as sequential control, loops, conditionals and branching. If constructed, it would have been the first “computer” as we think of them today.


History of Computing in 1822 – Difference Engine

Numerical tables were constructed by hand using large numbers of human computers (a “computer” being one who computes).

Annoyed by the many human errors this produced, Charles Babbage designed a difference engine that could calculate values of polynomial functions. 
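To make the idea concrete, here is a small sketch (our own illustration, not Babbage's notation) of the method of finite differences that the Difference Engine mechanized: once the initial values and differences of a polynomial are set up, every further table entry needs only additions. The function name difference_table and the sample polynomial are assumptions made for the example.

# Method of finite differences: extend a table of polynomial values using
# only additions, the same trick the Difference Engine performed with gears.
def difference_table(initial_values, count):
    # Build the columns of successive differences from the starting values.
    diffs = [list(initial_values)]
    while len(diffs[-1]) > 1:
        prev = diffs[-1]
        diffs.append([b - a for a, b in zip(prev, prev[1:])])

    # For a polynomial, the last column is constant, so each new value
    # is produced purely by repeated addition.
    table = list(initial_values)
    lasts = [col[-1] for col in diffs]
    for _ in range(count):
        for i in range(len(lasts) - 2, -1, -1):
            lasts[i] += lasts[i + 1]
        table.append(lasts[0])
    return table

# Example: x*x + x + 41, evaluated at x = 0, 1, 2 to seed the table.
print(difference_table([41, 43, 47], 4))  # [41, 43, 47, 53, 61, 71, 83]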



History of Computing Charles Babbage 1791-1871.

1.  English mathematician, engineer, philosopher and inventor.
2.  Originated the concept of the programmable computer, and designed one.
3.  Could also be a Jerk.



Computer History 1805 - Jacquard Loom.

1.    First fully automated and programmable loom.

2.    Used punch cards to “program” the pattern to be woven into cloth.


Computer History Jacques de Vaucanson 1709-1782

1.    Gifted French artist and inventor
2.    Son of a glove-maker, aspired to be a clock-maker
3.    1727-1743 Created a series of mechanical automatons that simulated life.
4.    Best remembered is the Digesting Duck, which had over 400 parts.

5.    Also worked to automate looms, creating the first automated loom in 1745.                  

History of Computer Network Era (Late 50s to present)

1.    Timesharing, the concept of linking a large number of users to a single computer via remote terminals, is developed at MIT in the late 50s and early 60s.
2.    Paul Baran of RAND develops the idea of distributed, packet-switching networks.
3.    ARPANET goes online in 1969.
4.    Bob Kahn and Vint Cerf develop the basic ideas of the Internet in 1973.
5.    In 1974 BBN opens the first public packet-switched network – Telenet.
6.    A UUCP link between the University of North Carolina at Chapel Hill and Duke University establishes USENET in 1979.
7.    TCP/IP (Transmission Control Protocol and Internet Protocol) is established as the standard for ARPANET in 1982.
8.    The number of network hosts breaks 10,000 in 1987; two years later, the number of hosts breaks 100,000.
9.    Tim Berners-Lee develops the World Wide Web. CERN releases the first Web server in 1991.
10. By 1992, the number of network hosts breaks 1,000,000.

11. The World Wide Web sports a growth rate of 341,634% in service traffic in its third year--1993.

History of Computer Micro Era 1971-1989.

1.    Bill Gates and Paul Allen form Traf-O-Data in 1971 to sell their computer traffic-analysis systems.
2.    Gary Kildall writes PL/M, the first high-level programming language for the Intel Microprocessor.
3.    Steve Jobs and Steve Wozniak are building and selling “blue boxes” in Southern California in 1971.
4.    Intel introduces the 8008, the first 8-bit microprocessor, in April of 1972.
5.    Jonathan A. Titus designs the Mark-8 and is featured in the July 1974 Radio Electronics.
6.    In January 1975 Popular Electronics features the MITS Altair 8800; it is hailed as the first “personal” computer.
7.    Paul Allen and Bill Gates develop BASIC for the Altair 8800. Microsoft is born!!!
8.    By 1977 Apple is selling its Apple II for $1,195, including 16K of RAM but no monitor.
9.    Software Arts develops the first spreadsheet program, VisiCalc, by the spring of 1979. 500 copies per month are shipped in 1979 and sales increase to 12,000 per month by 1981.
10. By 1980 Apple has captured 50% of the personal computer market.
11. In 1980 Microsoft is approached by IBM to develop BASIC for its personal computer project. The IBM PC is released in August, 1981.
12. The Apple Macintosh, featuring a simple graphical interface using the 8-MHz, 32-bit Motorola 68000 CPU and a built-in 9-inch B/W screen, debuts in 1984.
13. Microsoft Windows 1.0 ships in November, 1985.
14. Microsoft’s sales for 1989 reach $1 billion.

History of Computer Mini Era (1959-1970)

1.    The Mini Era began with the development of the integrated circuit in 1959 by Texas Instruments and Fairchild Semiconductor.
2.    Ivan Sutherland demonstrates a program called Sketchpad (makes engineering drawings with a light pen) on a TX-2 mainframe at MIT’s Lincoln Labs in 1962.
3.    By 1965, an integrated circuit that cost $1,000 in 1959 now costs less than $10.
4.    Doug Engelbart demonstrates a word processor in 1968.
5.    Also in 1968, Gordon Moore and Robert Noyce found a company called Intel.
6.    Xerox creates its Palo Alto Research Center (Xerox PARC) in 1969.
7.    Fairchild Semiconductor introduces a 256-bit RAM chip in 1970.
8.    In late 1970 Intel introduces a 1K RAM chip and the 4004, a 4-bit microprocessor. Two years later comes the 8008, an 8-bit processor.

Electronics Era 1900-1964

1.    In 1926, Dr. Julius Edgar Lilienfeld from New York filed for a patent on a transistor.
2.    Konrad Zuse, a German engineer, completes the 1st general purpose programmable calculator in 1941.
3.    Colossus, a British computer used for code-breaking, is operational by the end of 1943.
4.    ENIAC (Electronic Numerical Integrator And Computer) is developed for the Ballistic Research Laboratory in Maryland, built by the University of Pennsylvania, and completed in 1945.
5.    The transistor is developed by Bell Telephone Laboratories in 1947.
6.    UNIVAC (Universal Automatic Computer) is developed in 1951 and can store 12,000 digits in random access mercury-delay lines.
7.    EDVAC (Electronic Discrete Variable Automatic Computer) is completed for the Ordnance Department in 1952.
8.    Texas Instruments and Fairchild Semiconductor both announce the integrated circuit in 1959.
9.    The IBM 360 is introduced in April of 1964 and quickly becomes the standard institutional mainframe computer. By the mid-80s the 360 and its descendants have generated more than $100 billion in revenue for IBM.

Pre-History Era 4th century B.C. to 1930s.

  1. The abacus is believed to have been invented in the 4th century B.C.
  2. The Antikythera mechanism, a device used for registering and predicting the motion of the stars and planets, is dated to the 1st century B.C.
  3. Arabic numerals were introduced in Europe in the 8th and 9th centuries A.D., gradually replacing Roman numerals, which remained in use in some places until the 17th century.
  4. John Napier of Scotland invents logarithms in 1614 to allow multiplication and division to be converted to addition and subtraction.
  5. Wilhelm Schickard, a professor at the University of Tübingen, Germany, builds a mechanical calculator in 1623 with a 6-digit capacity. The machine worked, but it never made it beyond the prototype stage.
  6. Leonardo Da Vinci is now given credit for designing the first mechanical calculator around 1500. Evidence of Da Vinci’s machine was not found until papers were discovered in 1967.
  7. Blaise Pascal builds a mechanical calculator in 1642 with an 8-digit capacity.
  8. Joseph-Marie Jacquard invents an automatic loom controlled by punch-cards in the early 1800s.
  9. Charles Babbage designs a “Difference Engine” in 1820 or 1821, a massive calculator designed to print astronomical tables. The British government cancelled the project in 1842; Babbage then conceives the “Analytical Engine”, a mechanical computer that can solve any mathematical problem and uses punch-cards.
  10. Augusta Ada Byron, Countess of Lovelace and daughter of English poet Lord Byron, worked with Babbage and created a program for the Analytical Engine. Ada is now credited as being the 1st computer programmer.
  11. Samuel Morse invents the Electric Telegraph in 1837.
  12. George Boole invents Boolean Algebra in the late 1840s. Boolean Algebra was destined to remain largely unknown and unused for the better part of a century, until a young student called Claude E. Shannon recognized its relevance to electronics design.
  13. In 1857, only twenty years after the invention of the telegraph, Sir Charles Wheatstone (the inventor of the concertina) introduced the first application of paper tapes as a medium for the preparation, storage, and transmission of data.
  14. The first practical typewriting machine was conceived by three American inventors and friends, Christopher Latham Sholes, Carlos Glidden, and Samuel W. Soule, who spent their evenings tinkering together.
  15. The friends sold their design to Remington and Sons, who hired William K. Jenne to perfect the prototype, resulting in the release of the first commercial typewriter in 1874.
  16. Herman Hollerith’s Tabulating Machines were used for the 1890 census; the machines used Jacquard’s punched cards.

What skills are needed

Programming is an important activity, as people's lives and livelihoods can depend on the programs one writes. Hence, while programming, one should (a brief illustration follows this list):
o  Pay attention to detail
o  Think about reusability
o  Think about the user interface
o  Understand that computers are stupid (they do exactly what they are told)
o  Comment the code liberally
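As that brief illustration, here is a tiny sketch (our own example, not part of the original notes): a small, reusable, liberally commented Python function that states its assumptions instead of trusting the computer to guess.

def average(values):
    # Reusability: works for any non-empty sequence of numbers,
    # not just the one case we happened to need today.
    # Attention to detail: an empty sequence has no average, and the
    # computer is too "stupid" to guess what we meant, so we say so.
    if not values:
        raise ValueError("average() requires at least one value")
    return sum(values) / len(values)

print(average([3, 5, 10]))  # prints 6.0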

Why Programming is important

The question most people ask is: why should we learn to program when there is so much application software and there are so many code generators available to do the task for us? Well, the answer, as given by Matthias Felleisen in the book ‘How to Design Programs’, is:
“The answer consists of two parts. First, it is indeed true that traditional forms of programming are useful for just a few people. But, programming as we the authors understand it is useful for everyone: the administrative secretary who uses spreadsheets as well as the high-tech programmer. In other words, we have a broader notion of programming in mind than the traditional one. We explain our notion in a moment. Second, we teach our idea of programming with a technology that is based on the principle of minimal intrusion. Hence, our notion of programming teaches problem analysis and problem-solving skills without imposing the overhead of traditional programming notations and tools.”
Hence learning to program is important because it develops analytical and problem-solving abilities. It is a creative activity and provides us a means to express abstract ideas. Thus programming is fun and is much more than a vocational skill. By designing programs, we learn many skills that are important for all professions. These skills can be summarized as:
o  Critical reading 
o  Analytical thinking 
o  Creative synthesis 

Introduction to computer programming

Summary 
o  What is programming 
o  Why programming is important 
o  What skills are needed 
o  Develop a basic recipe for writing programs 
o  Points to remember 
What is programming 
As this course is titled “Introduction to programming”, it is essential and appropriate to understand what programming really means. Let us first see a widely known definition of programming.
Definition: “A program is a precise sequence of steps to solve a particular problem.” It means that when we say that we have a program, we actually mean that we know about a complete set of activities to be performed in a particular order. The purpose of these activities is to solve a given problem.
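To illustrate the definition (this example is ours, not from the text), the “precise sequence of steps” for finding the largest of three numbers can be written out so that a computer can follow it:

def largest_of_three(a, b, c):
    # Step 1: assume the first number is the largest so far.
    largest = a
    # Step 2: if the second number is bigger, remember it instead.
    if b > largest:
        largest = b
    # Step 3: if the third number is bigger still, remember that one.
    if c > largest:
        largest = c
    # Step 4: report the result.
    return largest

print(largest_of_three(7, 12, 5))  # prints 12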
Alan Perlis, a professor at Yale University, says:
"It goes against the grain of modern education to teach children to program. What fun is there in making plans, acquiring discipline in organizing thoughts, devoting attention to detail and learning to be self-critical?"
It is a sarcastic statement about modern education, and it means that modern education is not developing critical skills like planning, organizing and paying attention to detail. Practically, in our day-to-day lives we are constantly planning, organizing and paying attention to fine details (if we want our plans to succeed). And it is also fun to do these activities. For example, for a picnic trip we plan where to go, what to wear, what to take for lunch, organize travel details and have a good time while doing so.
When we talk about computer programming, then, as Steve Summit puts it:
“At its most basic level, programming a computer simply means telling it what to do, and this vapid-sounding definition is not even a joke. There are no other truly fundamental aspects of computer programming; everything else we talk about will simply be the details of a particular, usually artificial, mechanism for telling a computer what to do. Sometimes these mechanisms are chosen because they have been found to be convenient for programmers (people) to use; other times they have been chosen because they're easy for the computer to understand. The first hard thing about programming is to learn, become comfortable with, and accept these artificial mechanisms, whether they make "sense" to you or not.”