ATI has announced a partnership with Silicon Image, whose SiI 1390 transmitters are claimed to be solutions designed to interface directly with the video and audio interfaces of PC platforms for access to high-definition content. What is the main reason that has driven ATI and NVIDIA to equip graphics cards with an HDMI interface?
There are currently no graphics adapters that I know of with an HDMI interface. Although some reference designs have appeared, we have not yet released a board that incorporates an HDMI interface (and I am also not aware that any of our partners have).
The HDMI interface supports both digital audio and video. Will a future scenario include graphics cards that also support audio?
Please see my answer to the previous question. Beyond that, I cannot talk about any future plans; I can really only talk about products that are available now or hint at upcoming products.
Would a dual-core GPU concept, like the one used in PC CPUs, be a reasonable course of action for future graphics cards?
The concept behind dual-core CPUs is to parallelize the workload by splitting it into two or more threads. In graphics processing, we have been using this kind of concept for years: we run several pipelines in parallel.
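The parallel-pipelines idea can be sketched in software. The following is a hypothetical illustration only (a thread pool standing in for pixel pipelines, with a made-up shade() function); it is not how ATI's hardware is actually built:

```python
# Sketch: several "pipelines" (threads) each process a disjoint set of
# scanlines of the same frame, analogous to a GPU running multiple pixel
# pipelines in parallel. shade() and all names here are hypothetical.
from concurrent.futures import ThreadPoolExecutor

WIDTH, HEIGHT = 64, 48

def shade(x, y):
    # Stand-in for a pixel shader: any pure per-pixel computation.
    return (x * 31 + y * 17) % 256

def render_rows(rows):
    # One pipeline's share of the work: a disjoint set of scanlines.
    return [(y, [shade(x, y) for x in range(WIDTH)]) for y in rows]

def render_parallel(num_pipelines=4):
    # Interleave scanlines across pipelines, then merge the results.
    chunks = [range(i, HEIGHT, num_pipelines) for i in range(num_pipelines)]
    frame = [None] * HEIGHT
    with ThreadPoolExecutor(max_workers=num_pipelines) as pool:
        for result in pool.map(render_rows, chunks):
            for y, row in result:
                frame[y] = row
    return frame

def render_serial():
    # Reference: the same frame computed by a single pipeline.
    return [[shade(x, y) for x in range(WIDTH)] for y in range(HEIGHT)]
```

Because every pixel is independent, the parallel and serial renders produce identical frames; only the throughput differs, which is exactly why graphics workloads parallelize so naturally.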
What about a dual GPU on a single PCB? Should we expect performance
comparable to a CrossFire configuration?
Yes, two GPUs on one PCB would essentially offer the same performance as a CrossFire configuration. In fact, there are some advantages if both GPUs could be integrated on one PCB, but there are also a lot of drawbacks, such as power draw, space, etc.
Are we going to see solutions driving four or even more displays from a single board, after CrossFire?
Our FireMV line of products already supports up to four monitors on a single graphics adapter. However, most users who want to drive two, three, or four monitors usually don't use these systems for gaming but for applications in, for example, stock trading, banking, etc.