
History of the computer from its beginnings to date

The history of the computer dates back to the period of the scientific revolution (i.e. 1543 – 1678). The calculating machine invented by Blaise Pascal in 1642 and that of Gottfried Leibniz marked the genesis of the application of machines in industry.
This progressed up to the period 1760 – 1830, the era of the Industrial Revolution in Great Britain, when the use of machines for production altered British society and the Western world. During this period Joseph Jacquard invented the weaving loom (a machine used in the textile industry).
The computer was born not for entertainment or email but out of a need to solve a serious number-crunching crisis. By 1880, the United States (U.S.) population had grown so large that it took more than seven years to tabulate the U.S. Census results. The government sought a faster way to get the job done, giving rise to punch-card-based computers that took up entire rooms. Today, we carry more computing power on our smartphones than was available in these early models. The following brief history of computing is a timeline of how computers evolved from their humble beginnings to the machines of today that surf the Internet, play games and stream multimedia in addition to crunching numbers. The following are key historical events in the development of the computer.
1623: Wilhelm Schickard designed and constructed the first working mechanical calculator.
1673: Gottfried Leibniz demonstrated a digital mechanical calculator, called the Stepped Reckoner. He may be considered the first computer scientist and information theorist, for, among other reasons, documenting the binary number system.
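As a quick modern aside, the binary notation Leibniz documented, which writes every number using only the digits 0 and 1, is the same notation digital computers use today. The tiny Python snippet below is purely illustrative and is not tied to any historical machine:

```python
# Illustrative only: print the first few whole numbers in the 0/1
# positional (binary) notation that Leibniz described.
for n in range(6):
    print(f"{n} -> {n:b}")
# Prints: 0 -> 0, 1 -> 1, 2 -> 10, 3 -> 11, 4 -> 100, 5 -> 101
```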
1801: In France, Joseph Marie Jacquard invents a loom that uses punched wooden cards to automatically weave fabric designs. Early computers would use similar punch cards.
1820: Thomas de Colmar launched the mechanical calculator industry when he released his simplified arithmometer, which was the first calculating machine strong enough and reliable enough to be used daily in an office environment.
1822: English mathematician Charles Babbage, regarded as the father of the computer, conceives of a steam-driven calculating machine that would be able to compute tables of numbers. The project, funded by the English government, is a failure. More than a century later, however, the world's first computer was actually built.
1843: During the translation of a French article on the Analytical Engine, Ada Lovelace wrote, in one of the many notes she included, an algorithm to compute the Bernoulli numbers, which is considered to be the first published algorithm ever specifically tailored for implementation on a computer.
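As an aside for curious readers, the sketch below shows what an algorithm for the same task can look like in a modern language. It is a minimal Python illustration using the standard Bernoulli-number recurrence, not a transcription of Lovelace's original Note G program, and the function name `bernoulli_numbers` is just an illustrative choice:

```python
from fractions import Fraction
from math import comb

def bernoulli_numbers(n):
    """Return the Bernoulli numbers B_0..B_n as exact fractions,
    using the recurrence sum_{k=0}^{m} C(m+1, k) * B_k = 0 for m >= 1
    (the convention in which B_1 = -1/2)."""
    B = [Fraction(0)] * (n + 1)
    B[0] = Fraction(1)
    for m in range(1, n + 1):
        total = sum(comb(m + 1, k) * B[k] for k in range(m))
        B[m] = Fraction(-1, m + 1) * total
    return B

print([str(b) for b in bernoulli_numbers(8)])
# ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42', '0', '-1/30']
```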
1885: Herman Hollerith invented the tabulator, which used punched cards to process statistical information; eventually his company became part of IBM.
1890: Herman Hollerith designs a punch card system to tabulate the 1890 census, accomplishing the task in just three years and saving the government $5 million. He establishes a company that would ultimately become IBM.
1936: Alan Turing presents the notion of a universal machine, later called the Turing machine, capable of computing anything that is computable. The central 
concept of the modern computer was based on his ideas. 
1937: J.V. Atanasoff, a professor of physics and mathematics at Iowa State University, attempts to build the first computer without gears, cams, belts or shafts.
1937: One hundred years after Babbage's impossible dream, Howard Aiken convinced IBM, which was making all kinds of punched card equipment and was also in the calculator business, to develop his giant programmable calculator, the ASCC/Harvard Mark I, based on Babbage's Analytical Engine,
which itself used cards and a central computing unit. When the machine was finished, some hailed it as "Babbage's dream come true".
1939: Hewlett-Packard is founded by David Packard and Bill Hewlett in a Palo Alto, California, garage, according to the Computer History Museum.
1941: Atanasoff and his graduate student, Clifford Berry, design a computer that can solve 29 equations simultaneously. This marks the first time a computer is 
able to store information in its main memory.
1943-1944: Two University of Pennsylvania professors, John Mauchly and J. Presper Eckert, build the Electronic Numerical Integrator and Computer (ENIAC). Considered the grandfather of digital computers, it fills a 20-foot by 40-foot room and has 18,000 vacuum tubes.
1946: Mauchly and Eckert leave the University of Pennsylvania and receive funding from the Census Bureau to build the UNIVAC, the first commercial computer for business and government applications.
1947: William Shockley, John Bardeen and Walter Brattain of Bell Laboratories invent the transistor. They discovered how to make an electric switch with solid materials and no need for a vacuum.
1953: Grace Hopper develops the first computer language, which eventually becomes known as COBOL. Thomas Johnson Watson Jr., son of IBM CEO 
Thomas Johnson Watson Sr., conceives the IBM 701 EDPM to help the United Nations keep tabs on Korea during the war. 
1954: The FORTRAN programming language, an acronym for FORmula 
TRANslation, is developed by a team of programmers at IBM led by John 
Backus, according to the University of Michigan.
1958: Jack Kilby and Robert Noyce unveil the integrated circuit, known as the computer chip. Kilby was awarded the Nobel Prize in Physics in 2000 for his work. 
1964: Douglas Engelbart shows a prototype of the modern computer, with a mouse and a graphical user interface (GUI). This marks the evolution of the computer from a specialized machine for scientists and mathematicians to technology that is more accessible to the general public. 
1969: A group of developers at Bell Labs produce UNIX, an operating system that addressed compatibility issues. Written in the C programming language, UNIX was portable across multiple platforms and became the operating system of choice among mainframes at large companies and government entities. Due to the slow nature of the system, it never quite gained traction among home PC users.
1970: The newly formed Intel unveils the Intel 1103, the first Dynamic Random Access Memory (DRAM) chip.
1971: Alan Shugart leads a team of IBM engineers who invent the "floppy disk," allowing data to be shared among computers. 
1973: Robert Metcalfe, a member of the research staff for Xerox, develops Ethernet for connecting multiple computers and other hardware.
1974-1977: A number of personal computers hit the market, including the Scelbi, the Mark-8, the Altair, the IBM 5100, Radio Shack's TRS-80 (affectionately known as the "Trash 80") and the Commodore PET.
1975: The January issue of Popular Electronics magazine features the Altair 8800, described as the "world's first minicomputer kit to rival commercial models." Two "computer geeks," Paul Allen and Bill Gates, offer to write software for the Altair, using the new Beginner's All-Purpose Symbolic Instruction Code (BASIC) language. On April 4, after the success of this first endeavor, the two childhood friends form their own software company,
Microsoft.
1976: Steve Jobs and Steve Wozniak start Apple Computer on April Fool's Day and roll out the Apple I, the first computer with a single-circuit board,
according to Stanford University.
1977: Radio Shack's initial production run of the TRS-80 was just 3,000. It sold like crazy. For the first time, non-geeks could write programs and make a computer do what they wished.
1977: Jobs and Wozniak incorporate Apple and show the Apple II at the first West Coast Computer Faire. It offers color graphics and incorporates an audio cassette drive for storage. 

1978: Accountants rejoice at the introduction of VisiCalc, the first computerized spreadsheet program. 
1979: Word processing becomes a reality as MicroPro International releases WordStar. "The defining change was to add margins and word wrap," said creator Rob Barnaby in email to Mike Petrie in 2000. "Additional changes 
included getting rid of command mode and adding a print function. I was the technical brains — I figured out how to do it, and did it, and documented it."
"The first IBM personal computer, introduced on Aug. 12, 1981, used the MS-DOS operating system. (Image: © IBM).
1981: The first IBM personal computer, code-named "Acorn," is introduced. It uses Microsoft's MS-DOS operating system. It has an Intel chip, two floppy disks and an optional color monitor. Sears & Roebuck and ComputerLand sell the machines, marking the first time a computer is available through outside distributors. It also popularizes the term PC.
1983: Apple's Lisa is the first personal computer with a graphical user interface (GUI). It also features a drop-down menu and icons. It flops but eventually evolves into the Macintosh. The Gavilan SC is the first portable computer with the familiar flip form factor and the first to be marketed as a "laptop."
1985: Microsoft announces Windows, according to Encyclopedia Britannica. 
This was the company's response to Apple's graphical user interface (GUI). 
Commodore unveils the Amiga 1000, which features advanced audio and video capabilities.
1985: The first dot-com domain name is registered on March 15, years before 
the World Wide Web would mark the formal beginning of Internet history. The Symbolics Computer Company, a small Massachusetts computer manufacturer, registers Symbolics.com. More than two years later, only 100 dot-coms had been registered.
1986: Compaq brings the “Deskpro 386” to market. Its 32-bit architecture provides speed comparable to mainframes.
1990: Tim Berners-Lee, a researcher at CERN, the high-energy physics 
laboratory in Geneva, develops HyperText Markup Language (HTML), giving rise to the World Wide Web.
1993: The Pentium microprocessor advances the use of graphics and music on 
PCs.
1994: PCs become gaming machines as "Command & Conquer," "Alone in the 
Dark 2," "Theme Park," "Magic Carpet," "Descent" and "Little Big Adventure" 
are among the games to hit the market. 
1996: Sergey Brin and Larry Page develop the Google search engine at 
Stanford University. 
1997: Microsoft invests $150 million in Apple, which was struggling at the 
time, ending Apple's court case against Microsoft in which it alleged that Microsoft copied the "look and feel" of its operating system. 
1999: The term Wi-Fi becomes part of the computing language and users begin connecting to the Internet without wires. 
2001: Apple unveils the Mac OS X operating system, which provides protected memory architecture and pre-emptive multi-tasking, among other benefits. Not to be outdone, Microsoft rolls out Windows XP, which has a significantly redesigned graphical user interface (GUI).
2003: The first 64-bit processor, AMD's Athlon 64, becomes available to the consumer market. 
2004: Mozilla's Firefox 1.0 challenges Microsoft's Internet Explorer, the dominant Web browser. Facebook, a social networking site, launches. 
2005: YouTube, a video sharing service, is founded. Google acquires Android, a Linux-based mobile phone operating system.
2006: Apple introduces the MacBook Pro, its first Intel-based, dual-core mobile computer, as well as an Intel-based iMac. Nintendo's Wii game console hits the market. 
2007: The iPhone brings many computer functions to the smartphone.
2009: Microsoft launches Windows 7, which offers the ability to pin 
applications to the taskbar and advances in touch and handwriting recognition, among other features.
2010: Apple unveils the iPad, changing the way consumers view media and jumpstarting the dormant tablet computer segment. 
2011: Google releases the Chromebook, a laptop that runs the Google Chrome 
OS.
2012: Facebook gains 1 billion users on October 4. 
2015: Apple releases the Apple Watch. Microsoft releases Windows 10. 
2016: The first reprogrammable quantum computer was created. "Until now, there hasn't been any quantum-computing platform that had the capability to program new algorithms into their system. They're usually each tailored to attack a particular algorithm," said study lead author Shantanu Debnath, a
quantum physicist and optical engineer at the University of Maryland, College 
Park. 
2017: The Defense Advanced Research Projects Agency (DARPA) is developing a new "Molecular Informatics" program that uses molecules as computers. "Chemistry offers a rich set of properties that we may be able to harness for rapid, scalable information storage and processing," Anne Fischer, 
program manager in DARPA's Defense Sciences Office, said in a statement. 
"Millions of molecules exist, and each molecule has a unique three-dimensional atomic structure as well as variables such as shape, size, or even color. This richness provides a vast design space for exploring novel and multi-value ways to encode and process data beyond the 0s and 1s of current logic-based, digital 
architectures." [Computers of the Future May Be Minuscule Molecular 
Machines].
The history of the computer is often described in terms of computer generations, from the first generation to the fifth generation.
In the 19th century, the English mathematics professor Charles Babbage, referred to as the "Father of the Computer", designed the Analytical Engine, and it is on this design that the basic framework of today's computers is based. Generally speaking, computers can be classified into five generations. Each generation lasted for a certain period of time, and each gave us either a new and improved computer or an improvement to the existing computer.
Thanks for reading
