Definition

computer

Contributor(s): Vicki-lynn Brunskill

A computer is a device that accepts information (in the form of digitized data) and manipulates it for some result based on a program, software or sequence of instructions that tells it how the data is to be processed.

Complex computers include the means for storing data (including the program, which is also a form of data) for some necessary duration. A program may be invariable and built into the computer's hardware (as logic circuitry, as it is on microprocessors), or different programs may be provided to the computer (loaded into its storage and then started by an administrator or user). Today's computers have both kinds of programming.
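To make the definition concrete, here is a minimal sketch in Python of a "program" as a sequence of instructions applied to input data, one step at a time, to produce a result. The language choice and the instructions themselves are illustrative assumptions, not part of the definition above.

# A minimal sketch of the definition above: a computer accepts input data
# and transforms it according to a sequence of instructions (a program).
# The instructions below are invented for illustration.

def run(program, data):
    # Apply each instruction in the program to the data, in order.
    for instruction in program:
        data = instruction(data)
    return data

# A tiny "program": three instructions applied to the input in sequence.
program = [
    lambda x: x * 2,     # double the value
    lambda x: x + 3,     # add three
    lambda x: x ** 2,    # square the result
]

print(run(program, 5))   # input 5 -> (5*2 + 3)**2 = 169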

Major types of computers

Analog computer - represents data by continuously variable, measurable quantities
Desktop computer - a personal computer that fits on a desk, often used for business or gaming
Digital computer - operates with numbers expressed as discrete digits
Hybrid computer - combines features of both analog and digital computers
Laptop (notebook) - an easily transported computer that is smaller than a briefcase
Mainframe (big iron) computer - a centralized computer used for large-scale computing
Microcomputer - a computer built around a single integrated semiconductor chip (microprocessor); generally referred to as a PC (personal computer)
Minicomputer - a now antiquated term for a computer smaller than a mainframe and larger than a microcomputer
Netbook - a smaller, less powerful version of a laptop
Personal computer (PC) - a digital computer designed to be used by one person at a time
Smartphone - a cellular telephone with an integrated computer
Supercomputer - a high-performance computer that operates at extremely high speeds
Tablet computer (tablet PC) - a wireless personal computer with a touchscreen
Workstation - a computer designed for a single user to perform specialized technical or scientific tasks

History of the modern computer

Most histories of the modern computer begin with the Analytical Engine envisioned by Charles Babbage, and with the mathematical ideas of George Boole, the mathematician who first stated the principles of logic inherent in today's digital computer. Babbage's collaborator, Ada Lovelace, is said to have introduced the ideas of program loops and subroutines and is sometimes considered the first programmer. Apart from mechanical calculators, the first really usable computers began with the vacuum tube and accelerated with the invention of the transistor, which was then embedded in large numbers in integrated circuits, ultimately making possible the relatively low-cost personal computer.

Modern computers inherently follow the stored-program concept laid out by John von Neumann in 1945. Essentially, the program is read by the computer one instruction at a time, an operation is performed, and the computer then reads the next instruction.
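As a rough illustration of that cycle, below is a toy stored-program machine sketched in Python. The three-instruction opcode set and the sample program are invented for illustration; real machines encode instructions in binary, but the fetch-and-execute loop is the same in spirit.

# A toy machine in the spirit of the von Neumann stored-program design:
# program and data share one memory, and a program counter fetches one
# instruction at a time, executes it, then moves on to the next.
# The opcodes (LOAD, ADD, HALT) are an invented illustration.

def execute(memory):
    accumulator = 0
    pc = 0                             # program counter: address of the next instruction
    while True:
        opcode, operand = memory[pc]   # fetch the instruction
        pc += 1                        # advance to the next instruction
        if opcode == "LOAD":           # copy a value into the accumulator
            accumulator = operand
        elif opcode == "ADD":          # add a value to the accumulator
            accumulator += operand
        elif opcode == "HALT":         # stop and return the result
            return accumulator

# The program is itself data held in memory, one instruction per cell.
memory = [("LOAD", 2), ("ADD", 40), ("HALT", None)]
print(execute(memory))   # -> 42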

From the 1940s to the present, the advancement of computers is commonly divided into five generations. While the year span for each generation varies depending on the reference source, the most widely recognized generational timeline is below.

1940 to 1956

First generation computers were room-sized machines that used vacuum tubes for circuitry and magnetic drums for limited internal storage. These machines used punched cards for data input and programs written in binary machine code. Examples of first generation computers include the ABC (Atanasoff-Berry Computer), Colossus, the IBM 650, UNIVAC I (Universal Automatic Computer I) and the EDVAC (Electronic Discrete Variable Automatic Computer).

1956 to 1963

Second generation computers replaced vacuum tubes with transistors, which drew less power and generated less heat. They used magnetic tape for increased storage capacity, were programmed in assembly languages such as BAL (Basic Assembly Language) and continued to use punched cards for input. Examples of second generation computers include the IBM 7090, IBM 7094, IBM 1401 and the transistorized UNIVAC 1107.

1964 to 1971

Third generation computers used ICs (integrated circuits), which combined multiple transistors on a single chip, and MOS (metal oxide semiconductor) memory. Smaller, cheaper and faster than their predecessors, these computers used keyboards for input and monitors for output, and employed high-level programming languages such as FORTRAN (Formula Translation) and COBOL (Common Business-Oriented Language). Examples of third generation computers include the IBM System/360 and System/370 series.

1972 to 2010

Fourth generation computers used microprocessors fabricated with VLSI (very large-scale integration), RAM (random access memory), ROM (read-only memory) and high-level programming languages, including C and C++. The creation and expansion of the World Wide Web and of cloud computing (the delivery of hosted services over the Internet) significantly enhanced computing capabilities during this period. Examples of fourth generation computers include Apple's Macintosh and IBM's PC.

2010 and beyond

Fifth generation computers are based on AI (artificial intelligence), use large-scale integrated chips and employ more than one CPU (processor). Fifth generation computers respond to natural language input, solve highly complex problems, make decisions through logical (human-like) reasoning, and draw on quantum computing and nanotechnology (molecular manufacturing). Fifth generation computers and programs allow multiple programs (and computers) to work on the same problem at the same time, in parallel.
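As a small illustration of that kind of parallelism, the Python sketch below splits one problem across several worker processes and combines the partial results. The workload (summing squares) and the choice of four workers are invented for the example.

# A minimal sketch of several processes working on one problem in parallel.
# Each worker handles a slice of the input; the partial results are combined.
from multiprocessing import Pool

def sum_of_squares(chunk):
    # Each worker computes its share of the overall problem.
    return sum(n * n for n in chunk)

if __name__ == "__main__":
    numbers = list(range(1, 1001))
    # Split the problem into four slices, one per worker process.
    chunks = [numbers[i::4] for i in range(4)]
    with Pool(processes=4) as pool:
        partial_results = pool.map(sum_of_squares, chunks)
    # The combined result matches the serial computation of the same sum.
    print(sum(partial_results))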

The advent of the Internet, cloud computing and high-bandwidth data transmission enables programs and data to be distributed over a network quickly and efficiently, while application software makes computers the tools of choice for word processing, databases, spreadsheets, presentations, ERP (enterprise resource planning), simulations, education, CMS (content management systems), gaming and engineering.

This was last updated in April 2019


Join the conversation

11 comments


A computer, at least MY computer, has long since stopped being a "device" and developed into an extension of my brain. It's how I organize my thoughts, gather information, process it and ultimately communicate with the world.

Computers are already faster than I am, they have better storage, and they're right on the verge of becoming lots smarter, too. Much, much smarter. And now that we're about to unleash the IoT (Internet of Things), we can expect an explosion of content and resources at our fingertips. At our computer's chip-tips.

I hate to anthropomorphize, but, hell, I do anyway. This tin box of circuits and chips has developed over the years into something beyond "device". Let's face it, my assistant "accepts information (in any form I provide) and manipulates it for some result based on a sequence of instructions." But I'd be loath to call him a device.

He has a name, he's an entity. I trust and respect him. I value his know-how and input. I believe in his work and rely on it every day. Just like my computer.

Despite all that, neither one will make a decent cup of coffee for me...
Haha, I loved the bit about the cup of coffee :) Yeah, I totally agree with your sentiment and feel the same about all my computers and phones and tablets: respect, gratitude and appreciation for everything they do for me. I'm attached to them in a way. They are cool. Thumbs up!
A computer is an electronic device that accepts data and instructions as input, processes the data according to the given instructions and produces information as output.
OK. But shouldn't we update the definition so that the information a computer receives can even be analog data, now that we're using devices that can interact with the world around them through visuals or audio?
No, we can't. Even this analog data first has to be converted into a series of bits and bytes in order to be processed.
People can define things based on their composition, function, application, analogy...
We can, for example, say that a house is a structure built according to design patterns and consisting of building materials such as bricks or timber. Or we can say that a house is a structure where people live.

Being a software tester, I think of computer systems as a medium allowing people to communicate and collaborate with each other across distance and time.
A computer is an electronic device used to process data, in small to extremely large amounts, in a structured way via a set of commands in a program to produce a desired result. This can be another set of data or a report. It can be done much faster than trying to accomplish the same task manually. Computers have changed vastly over the last 50 years. What used to fill a room you can now wear on your wrist.
Yes, computers are great multitasking tools. I have played music files while writing code and developing 3D models for game use at the same time. I have also done the same while video chatting. Throw in a download or two, and that is just further multitasking.
The full form of computer is:
C: Commonly
O: Operating
M: Machine
P: Practically
U: Used in
T: Technology
E: Education
R: Research
The computer is a common system used all over the world, though it cannot always be carried from one place to another. Today the computer is a very useful device for work.
Thank you for sharing this article. Really, these are awesome tips; thanks for guiding me.
