Graphics Controller
The graphics controller is the remarkable device that drives your screen.
Your choice of controller will have an impact on the overall performance of your system, especially when it comes to playing
your favourite games.
The difficulty in choosing the right video accelerator card comes
from the different needs we have for this piece of hardware. As usual, we'd prefer a card that can do everything at
an excellent level, ideally for a low price as well. However, the miraculous cheap all-round card isn't
out yet, and it may take a long time until all our needs are met. Hence we have to make up our minds
about what is most important to us and how much money we are willing to spend.
The first question we have to ask ourselves is whether we will use
our system mainly for professional work or mainly for games. Most professional cards are not great at games, and vice versa.
If you've already got a video card in your system, ask yourself if you're pleased with its 2D performance for professional
work and if you just want to purchase an add-on card for games. In this case you keep the professional performance of
your current video card and add some real gaming performance with the add-on 3D card. You will need an additional PCI
slot, though.
Complicating the market further is the growing popularity of PCI Express as
the graphics interface for new cards. PCI Express offers much higher bandwidth than the latest AGP format.
It is not easy to measure pure 3D performance, because there are so many different
ways a 3D engine can be used. Most official benchmarks use the Direct3D engine of DirectX, e.g. ZD's 3D WinBench
or VNU's Final Reality. These benchmarks can only show you the card's Direct3D performance, i.e. how well the driver translates
Direct3D into the chip's own 3D engine. NVIDIA's RIVA 128 doesn't need this 'translator'; it uses Direct3D as its
own API. This is one reason why the RIVA scores best by far in Direct3D benchmarks. However, some games written for the
specific 3D engine of a chip can run much faster than the 3D WinBench score would lead you to expect. VQuake for Rendition's
Verite 1000 is one good old example: the Verite 1000 never scored well in 3D WinBench, but VQuake looked good and ran
fast.
3D Quality!
Now 3D performance is only one thing; 3D quality is another. There
are a lot of 3D features in use nowadays, most of them supported and used by DirectX 5, and there will be even more 3D features
implemented in DirectX 6. A 3D chip can only support a certain set of 3D features; others are either not supported at all,
or special drivers are used to emulate them. In my latest test I came across only one chip that supports virtually
every current 3D feature properly, and that is 3Dfx's Voodoo chip. The big letdown of the Voodoo chip leads to the other aspect
of quality, the 3D screen resolution. The Voodoo chip can only do 640x480 with a 2 MB frame buffer (4 MB cards),
as in most Voodoo cards, or at most 800x600 if the card comes with 6 MB RAM (e.g. Quantum 3D Obsidian 100SB), 4 MB of which serves as frame buffer. NVIDIA's RIVA 128 chip has a similar problem:
it can't support more than 4 MB of onboard memory, only good for a 3D resolution of at most 800x600. Now, it doesn't have to be
that bad, since we are quite pleased with our good old television as well, which has a lower resolution than 800x600. The
3D chip and the system CPU have to be powerful enough to run smoothly at this resolution as well. However, I've seen
'Forsaken' at 1024x768 on a PII 300 with an ATI XPERT card, and it looks pretty awesome.
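A rough calculation shows why frame-buffer size caps the 3D resolution. The sketch below assumes 16-bit colour, double buffering, and a 16-bit z-buffer – three buffers of width × height × 2 bytes each – which is a simplification, not the exact memory layout of any of these chips:

```python
def framebuffer_bytes(width, height, bytes_per_pixel=2, buffers=3):
    """Memory for front buffer, back buffer, and z-buffer,
    each width * height * bytes_per_pixel bytes (16-bit colour/depth)."""
    return width * height * bytes_per_pixel * buffers

MB = 1024 * 1024
for w, h in [(640, 480), (800, 600), (1024, 768)]:
    print(f"{w}x{h}: {framebuffer_bytes(w, h) / MB:.2f} MB")
# 640x480 needs ~1.76 MB (fits in 2 MB); 800x600 ~2.75 MB; 1024x768 ~4.50 MB
```

This is why a 2 MB frame buffer tops out at 640x480, while 800x600 needs the larger frame-buffer configurations mentioned above.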
Which Cards?
NVIDIA provides top-to-bottom
solutions for every type of desktop PC user. For work or play, gaming or Web surfing, NVIDIA GPUs (graphics processing units),
NVIDIA MCPs (media and communications processors), platform processors, and multimedia solutions deliver the power and
performance PC users require. NVIDIA desktop solutions are ideal for gamers, video enthusiasts, families, professionals, or
home office users.
http://www.nvidia.com/page/geforce6.html
The RADEON® X850 series is the most extreme gaming graphics card
technology ever created by ATI, with up to 16 pipelines, the fastest frame rates and ATI's industry-leading 3D image enhancement
technology. The RADEON® X850 series delivers further on the promise of High-Definition Gaming.
http://www.ati.com/products/radeonx850/index.html
Towers, Mini-Towers, Desktops
Although it is relatively unimportant, there are a few distinctions in the classification
of PC cases. The case of the PC is the exterior body, or enclosure, of the motherboard and all connected components. This
includes the power supply, cooling equipment, hard drives, optical drives, and any expansion cards. The vast majority of cases
today are beige, mid-tower ATX enclosures known as a “beige box” to describe the generic qualities of this case
(this is not to be confused with the phone-hacking term “beige box” as I am sure there is an enlightened person
reading this with some other connotations in mind concerning the term). There are some general and some more specific classifications
regarding all cases. The first classification is by one of three basic types: towers, mini (or mid) towers, and desktops.
Most standalone servers come in large towers, which sit upright and are generally very large to accommodate a large number
of hard drives, cooling equipment, and dual processors. Some high-end workstations and gaming rigs these days are enclosed
by full towers. Mid-Towers are like towers in that they sit upright, but are generally smaller in depth, length, and height.
The vast majority of computers these days are mid-towers. The third class is the desktop. This is a computer that lies flat
on the desk, usually with the monitor on top of it. Although most consumer PCs no longer utilize this enclosure, many
corporate computers use this space-saving design.
There is also another widely known means of classification – by motherboard form factor. However, motherboard form factor has much more relevance than just its
physical dimensions.
Motherboard Form Factors
There are a multitude of motherboard “form factors.” Form factors
of a motherboard describe the physical shape and appearance, as well as features and circuitry, of a motherboard. There are
newer and older form factors, just as there are newer and older computer components.
XT
The XT, or Extended Technology, form factor was developed for the Intel 8088
CPU. All you really need to know is that it is very old. It utilizes a 16-bit internal and an 8-bit external bus.
The XT form factor also could not be configured electronically – rather, to configure BIOS settings, you had
to set jumpers manually.
AT
Advanced Technology, or AT boards were the next step up from XT motherboards.
Though physically (dimensionally) the same as its predecessor, the AT form factor motherboard is distinguished by a number
of features. The first of these is a 16-bit (or in some cases, greater) external bus. It is important to understand what is
being referred to when a bus is called “16-bit” or “32-bit.” The width of the bus, known as the datapath,
is measured in bits. One bit represents a single digit, one or zero, in the binary system. With 8 bits, one can form 8 1’s
or 0’s. Correspondingly, those 8 digits can form a number between 0 and 255 in the decimal system. The greater the size
of the datapath, the more information can be pushed from one system component to another, including such things as hard drives
and expansion cards. A serial bus, for example, transfers information serially, that is, one bit at a time, whereas the parallel
bus transfers information eight bits at a time. Note carefully that although the datapath of one bus may be wider
than another's, this does not make the former faster. For example, USB is a serial bus, yet it transfers information
hundreds of times faster than an eight-bit parallel bus. In addition to the 16-bit external bus, AT utilizes a keyboard connector
that has five pins and is larger than the PS/2 keyboard connector that would arrive later. This five-pin DIN connector
came to be known as the AT connector, synonymous with “old keyboard connector,” because the AT form
factor is a number of years older than the more modern form factors. The last major advantage of AT over XT was its use of
CMOS – Complementary Metal Oxide Semiconductor – for configuring the BIOS – Basic Input/Output System –
in place of jumpers and switches.
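The datapath arithmetic above can be sketched in a couple of lines (illustrative only, not tied to any particular bus):

```python
def bus_values(bits):
    """Number of distinct values a bus of the given width can carry per transfer."""
    return 2 ** bits

print(bus_values(8))   # 256 -> an 8-bit datapath carries a value from 0 to 255
print(bus_values(16))  # 65536 -> a 16-bit datapath carries twice the data per transfer
```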
Baby AT
The Baby AT board was a response to those manufacturers and consumers who longed
for a smaller yet similarly capable form factor.
LPX/Mini-LPX
Although the Baby AT motherboard form factor was very slim, it still was not
quite small enough to fit into a slimline desktop unit. To alleviate this, the LPX and Mini-LPX form factors were utilized
and although they share many features with the AT design, they also have a few advantages over it. One, obviously, is
physical shape and size: the AT board was more than twice as big in area! Less obvious is the ability of the LPX and Mini-LPX
boards to use a riser card, which allows all of the expansion cards to lie flat, rather than tall. This reduces the height
of the desktop by the width of the riser card. In addition, the connectors on these motherboards were in well-fitting, standard
placements. Because the AT five-pin DIN connector was too large, a smaller and more versatile six-pin Mini-DIN connector was
created. This was to be deemed the PS/2 connector.
ATX/Mini-ATX
The majority of today’s PCs utilize the ATX form factor. Smaller
dimensionally than the AT form factor, and with many more features, the ATX form factor is practical in today’s PC applications.
First of all, it is physically smaller and eliminates drive overlap in most cases. Second, all of the I/O connectors are built
right into the board. Third, the ATX utilizes the PS/2 keyboard connector. Fourth, the CPU placement and cooling options are
designated for a cooler computing experience. Fifth, there is a single 20-pin power connector. Finally, the ATX form factor
runs at a lower voltage than before, 3.3V, which is now an accepted standard for computer components.
NLX
NLX is essentially to the LPX and Mini-LPX what ATX is to AT. It
is found in newer retail slimline and desktop PCs.
Processors
Processors, or CPUs, are the central processing units of the computer.
They perform all of the calculations necessary for normal operation. They also control a large number of I/O functions. Locating
the CPU is rather easy – look for a fan on the motherboard rising high above the motherboard and underneath that fan
and heatsink will lie a large chip. That chip is the CPU of the computer. Most PCs utilize a single CPU. However, a
PC may utilize more than one CPU, usually meaning it is a dual-CPU computer. There are even some machines that can
manage four processors. For the purposes of this guide, we will focus on individual processors, as server-oriented multiprocessor
parts such as Intel’s Xeon and Itanium and AMD’s Athlon MP are not focused upon in this test.
There are several ways to classify processors. The first, major way is by the
family of processor. There exist several major families of processors, including the 80X86, Pentiums I, II, III, and IV, and
various lines of AMD processor. You will find that the A+ test focuses very much on typical IBM architecture, which up until
recently was built entirely around Intel CPUs. Therefore, most questions dealing with CPUs on this test
are focused upon the Intel architecture. However, even within the family of chips, there are a number of discerning characteristics
between processors. Some processors are derivatives of other processors, adding or removing features from the original
as part of a marketing strategy. In addition, these derivative chips differ in clock speed from one another. A major testing
point seems to also be the presence or lack thereof of a floating point unit, or FPU. Another major testing point concerns
the presence and amount of internal cache. Finally, it is important to determine the width of the data paths in the CPU, both
internal and external, as described previously.
80X86 Processors
The 80X86 processors are considered the first modern CPUs for IBM computers.
Designed and manufactured by Intel, the 80X86 line began with the 8088 and ended with the 80486 derivative processors. Each
of these added on features, quality, and clock speed.
The 8088 Processor
The Intel 8088 processor chip was technically not the first IBM compatible CPU
in existence. Rather, the 8086 was the first IBM-compatible CPU developed for the mass market. However, for marketing reasons, the
8088 was the first to hit the market. It is thus the traditional starting point in the history of IBM CPUs. It was
developed and designed for the XT architecture, and clocked at about 4.77 MHz. For those of you who are not familiar with
the term hertz, one hertz is one tick, or cycle, per second. As the number of hertz increases, the number of ticks per
second rises proportionally, which is why hertz is associated with speed. One megahertz, or MHz, is one million ticks
per second (the figure 1,048,576, or 1024 squared, applies to binary units of memory, not to clock speed). Later,
the 8088 was clocked at 8 MHz, roughly double the original speed. It featured a 16-bit internal and an 8-bit external bus, which
fit nicely with the XT architecture. It also had a 20-bit address bus to access 1 MB of RAM. To execute an instruction,
the 8088 took about 12 cycles, on average.
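The cycles-per-instruction figure can be turned into a rough throughput estimate. This sketch uses only the numbers quoted above and ignores real-world effects such as memory wait states:

```python
def instructions_per_second(clock_mhz, cycles_per_instruction):
    """Rough throughput: clock ticks per second divided by cycles per instruction."""
    return clock_mhz * 1_000_000 / cycles_per_instruction

print(f"{instructions_per_second(4.77, 12):,.0f}")  # ~397,500 for the original 8088
print(f"{instructions_per_second(8.0, 12):,.0f}")   # ~666,667 for the later 8 MHz part
```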
The Failed 8086
Recall that the first processor developed was the 8086. However, due to pricing concerns,
Intel and IBM felt that the majority of consumers would not pay the extra money for the features of the
8086, including a true 16-bit internal bus and a true 16-bit external bus. Because it could communicate at 16 bits with other
components of the computer, the 8086 achieved roughly a 20% performance gain over the 8088.
Partial Success: The 80186 and the 80188
Both the 80186 and the 80188 suffered similar problems. Both still addressed
memory with 20 bits, limiting the potential RAM to a single megabyte. Additionally, although they integrated many
system I/O components, most consumers and PC manufacturers did not believe these gains justified a purchase. Some
consumers, however, appreciated the integration and bought these processors.
The 80286 Processor
The 80286 processor represents a turning point in computing. First, the release
of the AT motherboard coincided with the release of this processor (an IBM and Intel cooperation), and this combination came
to be referred to as the IBM clone. This association led to the standardization of the AT motherboard. Moreover, the 80286
was able to address 16 entire megabytes of memory – nothing by today’s standards, but a revolutionary achievement
at the time. However, programs are highly specific in the way they access memory, and thus the processor
instructions for these programs differ when there is a different mechanism for addressing memory. Intel wisely made the
80286 capable of running older 8088/8086-compatible programs, which were designed to access only a single megabyte of memory.
In addition, the 80286 required on average 4.5 cycles per instruction, rather than the 12 required by the 8088/8086 processors.
The 80286 was also available in many clock speeds, including 8, 10, 12, and 20 MHz, though motherboards of the era were often
limited to 16 MHz, even when the chip was clocked higher.
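The jump from 1 MB to 16 MB follows directly from address-bus width: the 8088 had 20 address lines, the 80286 has 24, and each extra line doubles the reachable memory:

```python
def addressable_bytes(address_bits):
    """Maximum memory reachable with the given address-bus width."""
    return 2 ** address_bits

MB = 1024 * 1024
print(addressable_bytes(20) // MB)  # 1  -> the 8088's 20-bit address bus: 1 MB
print(addressable_bytes(24) // MB)  # 16 -> the 80286's 24-bit address bus: 16 MB
```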
The most remarkable achievement of the 80286, however, was the introduction of Protected
Mode. The 80286, which is capable of addressing 16 megabytes of memory, was the first chip to distinguish between “Real”
and “Protected” mode. Real Mode is the mode in which the processor addresses only the first 1,024 kilobytes (1 MB) of memory as
conventional memory and assigns them real, physical locations, allowing backwards compatibility. The 286 was capable of addressing 16
megabytes of memory, as previously mentioned, though it could extend that number through a feature known as swapping. This
feature would prove to be very important in the future of computing. Swapping memory is the process of using the hard
drive as a holding space for RAM. In other words, if the system requires more RAM, it can copy blocks of
memory to the hard drive, use the freed space, and then, when the swapped-out blocks are needed again,
return them to physical memory. Through swapping, software could access up to one gigabyte (1024 cubed bytes)
of virtual memory. However, swapping was only available in “Protected Mode,” which was not fully utilized until later.
Protected Mode, in theory, allows a program to crash, or stop responding to the CPU, without causing the entire system
to fail. This is because in Protected Mode, each program’s memory addresses are fully independent of one another. This also allowed
the CPU to swap memory to the hard drive. In practice, however, software developers would not make use of swapping until a
bit later, and even today, under a Pentium 4 and Windows XP, one program can still cause the entire system to crash. Despite
these realities, Protected Mode was a very important development in the CPU.
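The swapping idea can be illustrated with a toy simulation. Everything here – the two-block “RAM”, the block names, the oldest-first eviction – is made up for illustration and is not how the 286 or any real operating system implements it:

```python
from collections import OrderedDict

RAM_CAPACITY = 2      # pretend only two blocks fit in physical memory
ram = OrderedDict()   # block_id -> data, oldest first
disk = {}             # blocks swapped out to the "hard drive"

def access(block_id):
    """Return a block, swapping the oldest one out to disk if RAM is full."""
    if block_id in ram:
        ram.move_to_end(block_id)                      # mark as recently used
        return ram[block_id]
    data = disk.pop(block_id, f"data-{block_id}")      # reload from disk, or allocate
    if len(ram) >= RAM_CAPACITY:
        victim, victim_data = ram.popitem(last=False)  # evict the oldest block
        disk[victim] = victim_data                     # ...by writing it to disk
    ram[block_id] = data
    return data

access("A"); access("B"); access("C")  # loading C forces A out to disk
print(sorted(ram), sorted(disk))       # ['B', 'C'] ['A']
```

The hardware only provides the addressing machinery; deciding which blocks to evict, as sketched above, is the operating system's job.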
Some sites on motherboards:
www.asrock.com.tw
www.asus.com
www.soltek.com.tw
www.gigabyte.com.tw
www.intel.com.au
Power Supply
In a personal computer (PC), the power supply is the metal box usually found in a corner of the case. It is visible
from the back of many systems because it contains the power-cord receptacle.
Power supplies, often referred to as "switching power supplies", use switcher
technology to convert the AC input to lower DC voltages. The typical voltages supplied are:
- 3.3 volts
- 5 volts
- 12 volts
The 3.3-volt and 5-volt supplies are typically used by digital circuits, while the 12-volt supply
runs the motors in disk drives and fans. The main specification of a power supply is its wattage. A watt is the
product of the voltage in volts and the current in amperes, or amps.
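Since a watt is volts times amps, a supply's total rating is just the sum over its rails. The currents below are made-up illustrative figures, not any real supply's specification:

```python
def watts(volts, amps):
    """Power in watts is voltage times current (W = V * A)."""
    return volts * amps

rails = [(3.3, 20.0), (5.0, 30.0), (12.0, 15.0)]   # (volts, rated amps) per rail
total = sum(watts(v, a) for v, a in rails)
print(f"{total:.0f} W")  # 66 + 150 + 180 = 396 W
```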
If you have been around PCs for many years, you probably remember that the original
PCs had large red toggle switches that had a good bit of heft to them. When you turned the PC on or off, you knew you were
doing it. These switches actually controlled the flow of 120 volt power to the power supply.
Today you turn on the power with a little push button, and you turn off the
machine with a menu option. These capabilities were added to standard power supplies several years ago. The operating system
can send a signal to the power supply to tell it to turn off. The push button sends a 5-volt signal to the power supply to
tell it when to turn on. The power supply also has a circuit that supplies 5 volts, called VSB for "standby voltage",
even when the machine is officially "off", so that the button will work.
Power Supply Problems
The PC power supply is probably the most failure-prone item in a personal computer. It heats and cools each time it
is used and receives the first in-rush of AC current when the PC is switched on. Typically, a stalled cooling fan is a predictor
of a power supply failure due to subsequent overheated components. All devices in a PC receive their DC power via the power
supply.
A typical failure of a PC power supply is often noticed as a burning smell just
before the computer shuts down. Another problem could be the failure of the vital cooling fan, which allows components in
the power supply to overheat. Failure symptoms include random rebooting or failure in Windows for no apparent reason.
For any problems you suspect to be the fault of the power supply, use the documentation
that came with your computer. If you have ever removed the case from your personal computer to add an adapter card or memory,
you can change a power supply. Make sure you remove the power cord first, since voltages are present even though
your computer is off.
Some sites on power supplies:
www.fortunetec.com.au
www.anyware.com.au
www.thermaltake.com