Introduction
Computers today are relevant and applicable nearly everywhere, but they are especially useful for storing and disseminating information (Rao, 2017). Among the best-known reasons computers are helpful in every area are their accuracy, readiness, and speed. This usefulness has made it standard for organizations to adopt them, and modern organizations find it necessary to establish computer departments that serve the entire organization. Thus, every individual requires computer literacy to survive in almost any kind of employment. The age of computer evolution is characterized by computer generations, meaning that the innovation and development of computers were achieved in stages. The machines available in today's societies have therefore been developed over time by different scientists since their first invention many years ago. This gradual invention and development of computers into the machines of today is the main focus of this paper.
The history of their invention can be traced back more than 2,500 years to the abacus, a simple calculator made from wires and beads (Zhezhnych et al., 2017). Today, organizations employ computer professionals to manage their computer departments, and employers in every sector recruit computer-literate workers because computer literacy is regarded as a prerequisite for most jobs (Copeland & Long, 2017). The computer has turned the world into a global village, especially in the 21st century: technology has facilitated the development and spread of globalization in various aspects, which in turn has led to the introduction of ever more modern techniques.
The abacus is still found useful in some parts of the world to date (Zhezhnych et al., 2017). The distinctions between modern computers and the ancient abacus are broad. Early machines lacked most modern features, which made them more complex to operate in some disciplines; for instance, logarithmic calculations remained laborious even as the need for devices that could handle them was becoming imminent. Despite these differences, the underlying principle, performing many repeated calculations faster than the human brain, is precisely the same.
The illustrious history of the computer dates back to the era of the scientific revolution, from 1543 to 1678 (Way & Robin, 2016). The calculating machine invented in 1642 by Blaise Pascal, together with the one developed by Gottfried Leibniz, initiated the industrial application of machines (Way & Robin, 2016). This progression continued into Great Britain's industrial revolution, which lasted from 1760 to 1830, when the adoption of machine production transformed the society of Great Britain and that of the Western world (Rao, 2017). The introduction of computing machines in the region is one of the major reasons behind the rapid industrialization that took place in Britain: machine production drastically accelerated development through the increased efficiency it brought to task performance.
The first computers were designed to solve a number-crunching crisis, not to send written messages via email or to provide entertainment. The population of the United States of America had grown so large by 1880 that tabulating the American census results took more than seven years (Copeland & Long, 2017). The government needed a quicker way of doing the job, a need that gave rise to computers based on punch cards, machines so large that they took up whole rooms. Nowadays, individuals carry that much computing power in their smartphones, a privilege that never existed before.
Since their invention, computers have progressed from simple beginnings to the many sophisticated machines we have today. It took many complicated stages, contributed by many scientists, to make computers what they are now. Today's computers can browse the internet, perform complicated mathematical calculations, play games, entertain through video and audio media, and act as a library, among other capabilities (Taoka & Suzuki, 2019).
In 1623, the first working mechanical calculator was designed and constructed by Wilhelm Schickard (Rao, 2017). In 1673, Gottfried Leibniz demonstrated the Stepped Reckoner, a digital mechanical calculator (Zhezhnych et al., 2017). For this and other achievements, such as documenting the binary number system, he is regarded as the first computer scientist and information theorist. Later, in the course of early computer development, the French inventor Joseph Marie Jacquard invented a loom that used punched wooden cards to weave fabric designs automatically; early computing machines adopted punched cards for the same efficiency.
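The binary system Leibniz documented remains the foundation of every digital computer. As a modern illustration (a Python sketch, not part of the historical record), converting a decimal number to binary requires only repeated division by two:

```python
def to_binary(n: int) -> str:
    """Convert a non-negative integer to its binary digit string
    by repeated division by 2 (the number system Leibniz documented)."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # remainder gives the next (lowest) binary digit
        n //= 2
    return "".join(reversed(bits))

print(to_binary(13))  # → 1101, i.e. 8 + 4 + 1
```

Every value a computer stores, from census tables to video, ultimately reduces to such strings of 0s and 1s.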
Thomas de Colmar launched the mechanical calculator industry with the release of his simplified arithmometer in 1820 (Copeland & Long, 2017). It was the first calculating machine adopted by offices across the world because it was reliable and strong enough for daily use. Charles Babbage, an English mathematician regarded as the father of the computer, conceived a steam-driven calculating machine in 1822 (Copeland & Long, 2017). The machine was intended to compute tables of numbers automatically. However, the project failed despite being funded by the English government; it was only a century later that the first computer in the world was actually built.
While translating a French article about the Analytical Engine in 1843, Ada Lovelace drafted an algorithm for computing the Bernoulli numbers (Zhezhnych et al., 2017). It is considered the first published algorithm intended to be implemented on a computer. In 1885, Herman Hollerith invented a tabulator that used punched cards to process statistical information (Taoka & Suzuki, 2019). He then designed a punch-card system for calculating the 1890 census (Rao, 2017), accomplishing the task in just three years and saving the government a total of USD 5 million (Way & Robin, 2016). The company he founded eventually became part of IBM.
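Lovelace's note described, step by step, how the Analytical Engine could compute Bernoulli numbers. A modern Python sketch of the same task, using the standard recurrence rather than her exact procedure, shows how short the algorithm is:

```python
from fractions import Fraction
from math import comb

def bernoulli(n: int) -> list:
    """Return the Bernoulli numbers B_0 .. B_n as exact fractions, via the
    recurrence B_m = -1/(m+1) * sum_{j<m} C(m+1, j) * B_j."""
    B = [Fraction(1)]  # B_0 = 1
    for m in range(1, n + 1):
        B.append(-Fraction(1, m + 1)
                 * sum(comb(m + 1, j) * B[j] for j in range(m)))
    return B

print(bernoulli(6))  # B_2 = 1/6, B_4 = -1/30, B_6 = 1/42
```

What Lovelace expressed as a table of engine operations now fits in a few lines, but the logic, repeatedly combining previously computed values, is the same.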
The development of the computer into a more powerful, effective, and sophisticated machine with additional features and functions continued through the 19th and into the 20th century. In this period, the notion of a universal device, later referred to as the Turing machine, was created. It had the capability of computing anything computable, in every aspect of life, field, or discipline.
In 1937, J. V. Atanasoff, a professor at the State University of Iowa, attempted to develop the first computer without gears, belts, and cams (Way & Robin, 2016). A century after the failure of Charles Babbage's invention, Howard Aiken convinced IBM to develop his enormous programmable calculator. Ideas from Babbage's Analytical Engine, including the use of punched cards and a central programming unit, were incorporated into this giant programmable computer, the ASCC/Harvard Mark I (Taoka & Suzuki, 2019). When the machine was completed, some referred to it as the eventual success of Babbage's failed dream.
In 1939, David Packard and Bill Hewlett founded Hewlett-Packard, in a garage in Palo Alto, California, according to the Computer History Museum (Copeland & Long, 2017). Atanasoff and his graduate student, Clifford Berry, designed a computer that could solve up to 29 equations simultaneously; their achievement marked the first time a computer stored information in its main memory. From 1943 to 1944, John Mauchly and J. Presper Eckert, both professors at the University of Pennsylvania, built the Electronic Numerical Integrator and Calculator (ENIAC) (Taoka & Suzuki, 2016).
The ENIAC, which occupied a 20-foot by 40-foot room and contained 18,000 vacuum tubes, was and is still considered the grandfather of digital computers (Taoka & Suzuki, 2016). Mauchly and Eckert left the University of Pennsylvania in 1946 and received funding from the Bureau of the Census to build the UNIVAC, the first commercial computer for business and government use (Rao, 2017).
In 1947, Walter Brattain, William Shockley, and John Bardeen, all of whom worked at Bell Laboratories, invented the transistor (Zhezhnych et al., 2017). Their design made an electric switch out of solid materials, with no need for a vacuum. Grace Hopper developed the first computer language in 1953, which later became COBOL (Zhezhnych et al., 2017). In the same year, Thomas Johnson Watson Jr., son of the then CEO of IBM, Thomas Johnson Watson Sr., conceived the IBM 701 EDPM to help the United Nations keep tabs on Korea during the war. In 1954, a team of IBM programmers led by John Backus developed the FORTRAN (FORmula TRANslation) programming language, according to the University of Michigan (Rao, 2017).
In 1958, Robert Noyce and Jack Kilby unveiled the integrated circuit (IC), known as the computer chip (Copeland & Long, 2017). Kilby was later awarded the 2000 Nobel Prize in Physics for this work (Rao, 2017). In 1964, Douglas Engelbart demonstrated a prototype of the modern computer, complete with a mouse and a graphical user interface (GUI) (Rao, 2017). The prototype marked the evolution of the computer from a specialized machine meant for mathematicians and scientists into a technology that was useful and accessible to the general public (Way & Robin, 2016).
In 1969, UNIX was created by a group of developers at Bell Laboratories (Way & Robin, 2016). It was developed as an operating system that addressed compatibility issues across different computers. Because UNIX was written in the C programming language, it was portable across many platforms, and it therefore became the operating system of choice for mainframe computers in large companies and even government entities.
However, it did not gain much attention from home PC users because it was a slow operating system. In 1970, the newly founded Intel Corporation unveiled the Intel 1103, the first Dynamic Random-Access Memory (DRAM) chip (Rao, 2017). In 1971, Alan Shugart led a group of IBM engineers in inventing the floppy disk, a device that permitted the sharing of data between computers (Rao, 2017). Ethernet, which connects multiple machines and other hardware, was developed in 1973 by Robert Metcalfe, then a member of the Xerox research staff (Way & Robin, 2016). From 1974 to 1977, many personal computers, such as the IBM 5100, the Scelbi, the Mark-8, the Altair, Radio Shack's TRS-80 (affectionately referred to as the "Trash 80"), and the Commodore PET, were developed and released into the market (Zhezhnych et al., 2017).
In 1975, the Altair 8800 was developed. It became popular after being featured in the January issue of Popular Electronics, which regarded it as the first minicomputer kit in the world to shake the market fo...
Research Paper on Computer's Relevance: Accuracy, Readiness and Speed. (2023, May 22). Retrieved from https://proessays.net/essays/research-paper-on-computers-relevance-accuracy-readiness-and-speed