NVidia Developer Relations
It's all very well having an absolutely amazing
piece of graphics hardware, but without some
background information and developer resources
it's never going to be used to its full
potential. When using APIs such as OpenGL and
Direct3D this information isn't going to come from one
source, but rather from a collection of sources - Microsoft
(for the D3D API), game development websites,
newsgroups, web forums and so on.
However, NVidia do provide a very thorough site
packed with white papers and downloadable demos
showing how to use the more advanced features
available (bear in mind it is usually only the
advanced parts they show off). You do have to be
at the more advanced/experienced end of the
spectrum to make full use of all the white papers
and demos available, but there are still more
than enough to get beginner/intermediate
programmers up and running with some quality
graphics.
NVidia's developer site and developer
relations stand out in one particular way that
no other 3D card manufacturer seems to have
managed: NVidia is a big player. This is a
broad term, but it encompasses the fact that NVidia's
engineers and developers are present for many of
the big decision-making processes (they worked
closely with Microsoft on Direct3D 8, and are probably
doing the same for Direct3D 9), and their developers are often
responsible for pushing the graphics
programming industry forward.
The latest thing to rock the graphics world (at
the time of writing) is NVidia's 'Cg' compiler - a
truly remarkable piece of software and thinking. Cg
is basically "C for graphics" - a high-level
shader language and compiler designed to simplify the
dark art that is shader programming. Anyone who's
looked into pixel/vertex shaders properly will
have realized that it is far from a trivial task -
even the simplest programs can prove to be
rather difficult (unless you're at home with
assembly-level programming). With Cg you write
your shaders in a C-like language (as the name
suggests), and then let the compiler reduce them down
to the complicated assembly-level instructions.
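To give a flavour of what that looks like in practice, here is a minimal Cg-style vertex shader - an illustrative sketch following the shape of NVidia's basic examples, not code taken from their SDK (the structure and parameter names here are arbitrary):

```cg
// Minimal Cg vertex shader - an illustrative sketch, not from NVidia's SDK.
// It transforms each vertex by the model-view-projection matrix and passes
// the vertex colour straight through to the rasterizer.

struct VertexOutput {
    float4 position : POSITION;  // clip-space position
    float4 color    : COLOR;     // vertex colour
};

VertexOutput main(float4 position : POSITION,
                  float4 color    : COLOR,
                  uniform float4x4 modelViewProj)
{
    VertexOutput OUT;
    OUT.position = mul(modelViewProj, position);  // matrix transform in one line
    OUT.color    = color;                         // pass colour through unchanged
    return OUT;
}
```

The single mul() call compiles down to a run of dp4 (dot product) vertex-shader assembly instructions that you would otherwise have to write by hand - which is exactly the drudgery Cg is designed to remove.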
High-Level Shader Languages (HLSL) as an idea
aren't anything new really - various parties have
been working on them for a while now - but NVidia's
Cg toolkit is the first significant launch.
Programming with Cg is still a tricky business
at times, but the alternatives are far worse!
NVidia have an 80MB downloadable archive for those
interested, and assuming you compile to DirectX 8
standards you can still use the compiler with non-NVidia
cards (although it won't take advantage of the
Radeon's ps1.4 capability). The downloadable SDK
comes with a good selection of examples (some of
which are very impressive) - ideally you should
see them running in real-time to appreciate them.
[Screenshots of the Cg SDK examples]
Useful Links:
• NVidia's main developer website.
• NVidia's Cg toolkit page.
Conclusion
From the benchmarks earlier in this review, it
would appear that the GeForce 4 Ti4200 is the
faster card when compared to the Radeon 8500 -
and given that the Ti4200 is the slowest of the
three variants, it would be a logical assumption that
the Ti4600 is a phenomenally powerful piece
of hardware (although having not tested it myself,
I can't guarantee this!). However, there are two
things to bear in mind with this comparison:
firstly, the Radeon 8500 reviewed before was not
the fastest Radeon available (read the review for
more details); and secondly, the Ti4200 on review
here is an engineering release from NVidia that
you can't buy in the shops - so a Ti4200 you
purchase at retail could well have marginally
different performance.
For the Ti4200
the speed increase over the Radeon 8500 is not
that significant, but the Radeon 8500 does have a
slightly more advanced Direct3D feature set -
this, in my opinion, tips the balance slightly.
When deciding between a Ti4200 and a Radeon 8500
you should really only be considering features and
price; only when you start looking at the Ti4400 and
Ti4600 do you really start to appreciate the
GeForce 4's speed advantage.
Developer support from NVidia is very, very
good, and given that they are constantly pushing
the limits of real-time 3D graphics, you can be
assured that if you have one of their latest
graphics cards you'll be in good hands, with
access to most of the latest and greatest
graphics technology.
The Ti4600 (at the time of writing) is an expensive
piece of equipment, so the Ti4200 may well provide
the better price:feature ratio for most, but
whichever of the GeForce 4s you get (if you
get any!) will not disappoint you - all of them
are powerful, and short of the cutting-edge
version 1.4 pixel shaders they are a fully
featured series of graphics cards.
Good Points:
• The Ti4200 is a fast card; provisionally that would make the Ti4600 an exceptionally powerful card.
• Absolutely excellent developer relations department.
• A very capable graphics card, given its features.
• Comes from a very strong company dedicated to pushing the limits of 3D graphics.
• Good price:feature ratio for the Ti4200.
• nView implementation is an amazingly useful tool if you have two monitors available.

Bad Points:
• Possibly some small driver issues lurking around (although they will be the exception).
• You'll need a fairly recent computer to make full use of the card (bandwidth/processor speed).
• Doesn't have every single latest feature for cutting-edge design/programming.
• Higher-specification GeForce 4s are quite expensive.