Modern computing
We, as human beings, are born with a sense of curiosity that makes us want to explore the world around us and understand 'how things work'. These explorations are carried out through the perceptions of our five senses. All five senses, namely sight, sound, taste, smell and touch, feed our curiosity and help us understand the world, its objects and its ideas.
Computers, during the early years, were complex, very difficult to use and limited in their capabilities. One of the factors that contributed to their complexity was that they employed a limited range of media, often text alone, to communicate with their environment, which in turn was their human users. Over the years, computers have evolved to become more user-friendly, mainly because of an increased capacity to use multiple forms of media, such as imagery and sound, to communicate with and give feedback to their users.
This realisation led to the dawn of the multimedia age with the advent, in the mid-1990s, of microprocessors capable of executing special multimedia-related instructions. The computer's ability to process audio and video information has grown enormously over the years since.
As computers have grown in user-friendliness and accessibility, so have the number and range of their users. Most users of modern computers are not programmers or researchers, but ordinary people such as executives, secretaries and students who have minimal knowledge, if any, of programming or the inner workings of the machines.
Thus, it has been vital that the interface between the computer and the user be capable of fostering efficient communication and provide the user with a comfortable environment in which to accomplish a given task. This has been possible only with the advent and advancement of multimedia technology, which has grown into one of the main fields in computer science, and one that has not been affected by the economic downturns of the recent past, unlike fields such as hardware and software that felt the adverse effects of the slowing economy. As a result, jobs in the multimedia sector showed growth while the majority of sectors had to cut jobs to stay afloat.
Multimedia applications have been utilised to provide interactive interfaces to computer software used in areas such as Computer Based Training (CBT), interactive websites on the World Wide Web, advertising, movies, cartoons and entertainment services, computer games and many more. But the development of these applications requires the rarest and most highly priced component: creativity. So if 'creative' is your middle name and you want to get into IT, you might just want to consider keeping up to date in the field of multimedia.
We invite those engaged in the field of multimedia, and those aspiring to be, to write in and share their views on any of the aspects discussed above. Write in to Technopage and share your knowledge and experience with everyone.
What is overclocking?
Overclocking involves running hardware, such as processors and RAM, at higher speeds than the manufacturer has rated (and guaranteed) the device to run at.
This can potentially
result in increased performance of the hardware and the entire system
at no additional cost.
Generally, overclocking is achieved by increasing the clock speed of the device. Its success relies heavily on a number of variables, such as architecture, yields, temperatures, tolerances and cooling, that determine how far the speed of the device can be raised, ideally without spending additional money.
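If you do raise the clock, it helps to verify what speed the system actually reports. Below is a minimal sketch, assuming a Linux machine that exposes the standard cpufreq sysfs interface; the file path is a property of that interface, not of every system:

```python
# Sketch: report the current clock speed of a CPU core, e.g. to check
# that an overclock has taken effect. Assumes Linux with cpufreq sysfs.
from pathlib import Path

def current_cpu_mhz(cpu: int = 0) -> float:
    """Return the reported clock speed of the given core in MHz."""
    freq_file = Path(f"/sys/devices/system/cpu/cpu{cpu}/cpufreq/scaling_cur_freq")
    khz = int(freq_file.read_text().strip())  # the kernel reports kHz
    return khz / 1000.0

if __name__ == "__main__":
    print(f"CPU 0 is currently running at {current_cpu_mhz():.0f} MHz")
```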
As with running any piece of hardware outside the manufacturer's guaranteed specifications and parameters, there is always the possibility of damaging it. You risk shortening the lifespan of the system as well as compromising its stability.
As long as your hardware is cooled properly, and you don't go overboard trying to feed it too much voltage, the risk of permanent damage is minimised. Additional heat and voltage do increase the wear and tear on the processor, shortening its life. Fortunately, most people will have long since replaced the component before its operational life comes to an end.
Remember that your mileage may vary; the fact that someone else's hardware has reached a certain speed in no way guarantees that yours will reach the same speed. You should always thoroughly research the hardware you are going to overclock before cranking up the speed.
Sent in by
Vishwa Jayasinghe
Snippets
Sun Microsystems drops plans for its own distribution of Linux
Sun Microsystems has decided to drop its own brand of Linux in favour of existing distributions. Sun said the reason for the change came from customers, who were unhappy about having to contend with yet another version of Linux. Sun will instead partner with an existing Linux vendor such as Red Hat or SuSE. It has not announced which vendor it plans to pursue talks with, and no timeline has been set by the company.
Sun still plans to offer a Linux desktop by June. The desktops were to have been loaded with Sun's version of Linux, but it is not clear whether they will now ship with that version or with another distribution.
Broadband
handshake...
Scientists in Japan are currently researching the electric signals produced by the human body. The technology they are investigating consists of wireless communication devices powered by these signals. The results could herald a new wave of communication in which information is passed from one device to another by way of a handshake.
The company behind this research is Nippon Telegraph and Telephone (NTT), which has confirmed that such a technology would allow data transfer at broadband speeds while relying solely on the electrical signals produced by the person wearing the device.
As well as human-to-human
data transfer, NTT also sees human-to-machine interaction as feasible,
with identification being given just by touching a surface. The
technology has yet to be perfected, and therefore costs and specific
details have yet to be disclosed.
Improve your
computer literacy
Vampire tap: A cable connection used to connect transceivers
to a Thicknet coaxial cable in an Ethernet network using a bus topology.
Instead of cutting the cable and attaching connectors to both ends
of the severed coaxial cable, a vampire tap pierces through (hence
the name vampire) the insulating layer of the cable and makes direct
contact with the cable's conducting core.
Companding: Formed from the words compressing and expanding. A PCM compression technique in which analog signal values are rounded on a non-linear scale. The data is compressed before being sent and then expanded at the receiving end using the same non-linear scale. Companding reduces the noise and crosstalk levels at the receiver.
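The non-linear scale in question is usually logarithmic; the mu-law curve used in digital telephony is one common instance. As an illustrative sketch in Python (the glossary entry above names no specific scale, so mu-law here is an assumption for demonstration):

```python
import math

MU = 255  # mu-law parameter used in North American and Japanese telephony

def compress(x: float, mu: int = MU) -> float:
    """Map a sample in [-1, 1] onto a logarithmic (non-linear) scale."""
    return math.copysign(math.log1p(mu * abs(x)) / math.log1p(mu), x)

def expand(y: float, mu: int = MU) -> float:
    """Invert the compression at the receiving end using the same scale."""
    return math.copysign(((1 + mu) ** abs(y) - 1) / mu, y)

# Quiet samples are stretched and loud ones squeezed, so quantisation
# noise at the receiver is reduced for low-level signals.
for sample in (0.01, 0.1, 0.5, 1.0):
    c = compress(sample)
    print(f"{sample:5.2f} -> compressed {c:.3f} -> expanded {expand(c):.3f}")
```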
COCOMO: Short
for Constructive Cost Model, a method for evaluating and/or estimating
the cost of software development. There are three levels in the
COCOMO hierarchy:
* Basic COCOMO: computes software development effort and cost as a function of programme size expressed in estimated DSIs (Delivered Source Instructions); see the sketch at the end of this entry. There are three modes within Basic COCOMO:
Organic Mode: Development projects are typically uncomplicated and involve small, experienced teams. The planned software is not considered innovative and requires a relatively small number of DSIs (typically under 50,000).
Semidetached Mode: Development projects are typically more complicated than in Organic Mode and involve teams of people with mixed levels of experience. The software requires no more than 300,000 DSIs. The project has characteristics of both Organic Mode and Embedded Mode projects.
Embedded Mode: Development projects must fit into a rigid set of requirements because the software is to be embedded in a strongly coupled complex of hardware, software, regulations and operating procedures.
* Intermediate COCOMO: an extension of the Basic model that computes software development effort by adding a set of "cost drivers" that determine the effort and duration of the project, such as assessments of personnel and hardware.
* Detailed COCOMO: an extension of the Intermediate model that adds effort multipliers for each phase of the project to determine each cost driver's impact on each step.
COCOMO was developed
by Barry Boehm in his 1981 book, Software Engineering Economics.
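As a small worked example of the Basic level described above: Basic COCOMO estimates effort as a x (KDSI)^b person-months and schedule as c x (effort)^d months, with coefficients that depend on the mode. The figures below are the standard ones from Boehm's 1981 text; the helper function itself is only an illustrative sketch.

```python
# Basic COCOMO: effort = a * (KDSI ** b) person-months,
# schedule = c * (effort ** d) months (coefficients from Boehm, 1981).
COEFFICIENTS = {
    # mode: (a, b, c, d)
    "organic":      (2.4, 1.05, 2.5, 0.38),
    "semidetached": (3.0, 1.12, 2.5, 0.35),
    "embedded":     (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(dsi: int, mode: str):
    """Return (effort in person-months, schedule in months) for a
    project of `dsi` delivered source instructions in the given mode."""
    a, b, c, d = COEFFICIENTS[mode]
    kdsi = dsi / 1000.0  # the model works in thousands of DSIs
    effort = a * kdsi ** b
    schedule = c * effort ** d
    return effort, schedule

# A 32,000-DSI organic-mode project: roughly 91 person-months
# delivered over about 14 months.
effort, schedule = basic_cocomo(32000, "organic")
print(f"Effort: {effort:.1f} person-months, schedule: {schedule:.1f} months")
```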