Techno Page
The five phases
The history of computer development can be divided into five generations.
Each generation is characterised by a major technological development
that fundamentally changed the way computers operate, resulting
in increasingly smaller, cheaper, more powerful, more efficient
and more reliable devices.
First generation
- 1940-1956
Breakthrough: Vacuum tubes.
The first computers used vacuum tubes for circuitry and magnetic
drums for memory. They were enormous, taking up entire rooms and
sometimes even buildings. They were very expensive to operate and,
because they used a vast amount of electricity, generated a lot
of heat, which often led to malfunctions. First-generation computers
relied on machine language to perform their tasks, and they could
only solve one problem at a time. Input was based on punched cards
and paper tape, and output was displayed on printouts. The UNIVAC
and ENIAC computers are examples of first-generation computing devices.
The UNIVAC was the first commercially available computer; it was
delivered to its first client, the U.S. Census Bureau, in 1951.
Second generation
- 1956-1963
Breakthrough: Transistors replaced vacuum tubes and ushered
in the second generation of computers. The transistor was invented
in 1947 but was not used in computers until the late 1950s. The transistor
was far superior to the vacuum tube, allowing computers to become
smaller, faster, cheaper, more energy-efficient and more reliable
than their predecessors. Though the transistor still generated a
great deal of heat that could damage the computer, it was a
vast improvement over the vacuum tube. Second-generation computers
still relied on punched cards for input and printouts for output.
They also moved from machine language to assembly languages, which
allowed programmers to specify instructions in words. High-level
programming languages, such as early versions of COBOL and FORTRAN,
were also being developed at this time. These were also the first computers that stored
their instructions in their memory, which moved from a magnetic
drum to magnetic core technology.
Third generation
- 1964-1971
Breakthrough: Integrated circuits.
The development of the integrated circuit was the hallmark of the
third generation of computers. Transistors were miniaturised and
placed on silicon chips, which drastically increased the speed and
efficiency of computers. Instead of punched cards and printouts,
users interacted with third-generation computers through keyboards
and monitors, with the luxury of an operating system. This allowed
the device to run many different applications at one time with a
central programme that monitored the memory and other resources.
Computers for the first time became accessible to a mass audience
because they were smaller and cheaper than their predecessors.
Fourth generation
- 1971-present
Breakthrough: The microprocessor brought the fourth generation
of computers, as thousands of integrated circuits were built onto
a single silicon chip. What in the first generation filled an entire
room could now fit in the palm of the hand. The Intel 4004 chip,
developed in 1971, located all the components of the computer -
from the central processing unit and memory to input/output controls
- on a single chip. In 1981, IBM introduced its first computer for
the home user, and in 1984, Apple introduced the Macintosh. Microprocessors
also moved out of the realm of desktop computers and into many areas
of life as more and more everyday products began to use microprocessors.
As these small computers became more powerful, they could be linked
together to form networks, which eventually led to the development
of the Internet. Fourth generation computers also saw the development
of GUIs, the mouse and handheld devices.
Fifth generation
- present and beyond
Breakthrough: Fifth generation computing devices, based on
artificial intelligence, are still in development, though there
are some applications, such as voice recognition, that are being
used today. The use of parallel processing and superconductors is
helping to make artificial intelligence a reality. Quantum computation
and molecular and nanotechnology will radically change the face
of computers in the years to come. The goal of fifth-generation
computing is to develop devices that respond to natural language
commands and are capable of self-learning.
Understanding
CD burner speeds
Have you ever wondered what all those numbers mean when you
hear people talk about CD burners? When you see a configuration
that looks like 2x12x24, the numbers indicate the speeds
of the CD drive. The key point is that the "x" stands
for a transfer rate of 150 KB of data per second, and each number represents
a different action that the CD drive can take. A CD-R drive has
two actions: recording onto and reading from compact discs. A CD-RW
drive has three actions: recording, rewriting (erasing and recording
over) and reading. When talking about drive speeds, the first number
("2" in the example above) indicates the speed at which
the CD drive will record data onto a CD-R compact disc. So, the
CD drive will record data at 2 times 150 KB/second. The second number
("12" in the above example) indicates the speed at which
the CD drive will rewrite data onto a compact disc. So in the above
example, the CD drive will rewrite data onto the compact disc at
12 times 150 KB/second. The last number ("24" in the above
example) indicates the speed at which the drive will read data from
a compact disc. So the CD drive will read data from a compact disc
at 24 times 150 KB/second.
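
To make the arithmetic concrete, here is a minimal Python sketch
that decodes a rating such as "2x12x24" into the three speeds
described above (the function name and labels are illustrative,
not part of any standard tool):

BASE_RATE_KB_S = 150  # one "x" = 150 KB of data per second

def decode_rating(rating):
    # Split a rating such as "2x12x24" into its three multipliers.
    record, rewrite, read = (int(n) for n in rating.split("x"))
    return {
        "record (CD-R)": record * BASE_RATE_KB_S,
        "rewrite (CD-RW)": rewrite * BASE_RATE_KB_S,
        "read": read * BASE_RATE_KB_S,
    }

for action, speed in decode_rating("2x12x24").items():
    print(action + ":", speed, "KB/second")

Running this prints 300, 1,800 and 3,600 KB/second for recording,
rewriting and reading respectively, matching the worked figures above.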
The difference
between the Internet and the World Wide Web
Many people use the terms Internet and World Wide Web (also
known as The Web) without exactly knowing what they mean, but in
fact, the two terms are not synonymous. The Internet and the Web
are two separate but related things.
The Internet
is a massive network of networks: a combination of many kinds of networking
infrastructure. It connects millions of computers together globally,
forming a network in which any computer can communicate with any
other computer as long as they are both connected to the Internet.
Information that travels over the Internet does so via a variety
of languages known as protocols.
The World Wide
Web is a way of accessing information over the Internet. It is an
information-sharing model that is built on top of the Internet.
The Web uses the HTTP protocol, only one of the languages spoken
over the Internet, to transmit data. The Web also utilises browsers,
such as Internet Explorer or Netscape, to access Web documents called
Web pages that are linked to each other via hyperlinks. Web documents
also contain graphics, sounds, text and video.
The Web is
just one of the ways that information can be distributed over the
Internet. The Internet (not the Web) is also used for e-mail, which
relies on SMTP, as well as for newsgroups, instant messaging and FTP.
So the Web is just a portion of the Internet, albeit a large one;
the two terms are not synonymous and should not be confused.
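
Since HTTP is just one of the languages spoken over the Internet, a
short sketch can make the idea concrete. The following Python snippet
(using example.com purely as a placeholder host) sends a raw HTTP
request over an ordinary Internet socket connection and prints the
first line of the server's reply:

import socket

host = "example.com"  # placeholder host for illustration

with socket.create_connection((host, 80)) as sock:
    # An HTTP/1.1 request is plain text: a request line, some
    # headers, and a blank line marking the end of the request.
    request = (
        f"GET / HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        f"Connection: close\r\n"
        f"\r\n"
    )
    sock.sendall(request.encode("ascii"))

    # Read the raw response until the server closes the connection.
    response = b""
    while True:
        chunk = sock.recv(4096)
        if not chunk:
            break
        response += chunk

# The first line of the reply is the HTTP status line.
print(response.split(b"\r\n", 1)[0].decode())

The same socket connection could just as easily carry SMTP or FTP
traffic; only the "language" exchanged over it would differ.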
What is Ethernet?
Ethernet is a local-area network (LAN) architecture developed by
Xerox Corporation together with DEC and Intel in 1976. Ethernet
uses a bus or star topology and supports data transfer rates of
10 Mbps. It is one of the most widely implemented LAN standards.
A newer version
of Ethernet, called 100Base-T (or Fast Ethernet), supports data
transfer rates of 100 Mbps. And the newest version, Gigabit Ethernet,
supports data rates of 1 gigabit (1,000 megabits) per second.
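
To put these rates in perspective, here is a rough Python sketch
estimating how long a file transfer would take at each speed. The
100 MB file size is an arbitrary assumption, a megabyte is taken as
1,000,000 bytes, and protocol overhead is ignored:

FILE_SIZE_BITS = 100 * 1_000_000 * 8  # a 100 MB file, in bits

rates_mbps = {
    "Ethernet": 10,
    "Fast Ethernet (100Base-T)": 100,
    "Gigabit Ethernet": 1_000,
}

for name, mbps in rates_mbps.items():
    # Divide the file size by the link rate in bits per second.
    seconds = FILE_SIZE_BITS / (mbps * 1_000_000)
    print(name + ":", seconds, "seconds")

Under these assumptions the transfer takes 80, 8 and 0.8 seconds
respectively, showing the tenfold jump between each generation.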