
Wednesday, September 21, 2005
Interview With Mr Rene Froeleke of ATI

Page 5

ATI has announced that it will incorporate Silicon Image's SiI 1390 transmitters into its products. These transmitters are claimed to be solutions designed to interface directly with the video and audio interfaces of PC platforms to enable secure access to high-definition content. What is the main reason that has driven ATI and NVIDIA to equip graphics cards with an HDMI interface?

There are currently no graphics adapters that I know of with an HDMI interface. Although some reference designs have appeared, we have not yet released a board that incorporates an HDMI interface (and I am not aware of any of our partners having one, either).

The HDMI interface supports both digital audio and video. Will a future scenario include graphics cards that also support audio?

Please see my answer to the previous question. In addition to that, I cannot talk about any future plans; I can really only talk about products that are available now or hint at upcoming products.

Would the use of a dual-core GPU concept, like the one used for PC CPUs, be a reasonable course of action for future graphics cards?

The concept behind dual-core CPUs is to parallelize the workload and split it into two or more threads. In graphics processing, we have been using this kind of concept for years, i.e. we are running several pipelines in parallel.
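To make the analogy concrete, here is a minimal sketch (a hypothetical Python illustration, not ATI hardware or driver code) of the idea described above: a frame is split into independent slices, and each "pipeline" shades its slice in parallel.

    # Illustrative sketch only: a GPU's parallel pixel pipelines, modelled as a
    # pool of workers that each shade a horizontal slice of the frame.
    # All names and the shading function are made up for this example.
    from concurrent.futures import ThreadPoolExecutor

    WIDTH, HEIGHT = 640, 480
    NUM_PIPELINES = 4  # analogous to running several pipelines in parallel

    def shade_pixel(x, y):
        # Stand-in for real per-pixel work (texturing, lighting, etc.)
        return (x * 255 // WIDTH, y * 255 // HEIGHT, 128)

    def shade_rows(row_range):
        # One "pipeline" processes its own slice of rows independently.
        return [[shade_pixel(x, y) for x in range(WIDTH)] for y in row_range]

    def render_frame():
        rows_per_worker = HEIGHT // NUM_PIPELINES
        slices = [range(i * rows_per_worker, (i + 1) * rows_per_worker)
                  for i in range(NUM_PIPELINES)]
        with ThreadPoolExecutor(max_workers=NUM_PIPELINES) as pool:
            parts = list(pool.map(shade_rows, slices))
        # Reassemble the slices into a full frame.
        return [row for part in parts for row in part]

    if __name__ == "__main__":
        frame = render_frame()
        print(len(frame), "rows rendered,", len(frame[0]), "pixels per row")

The design point mirrors the answer: because each slice is independent, adding more workers (pipelines) scales throughput directly, whereas dual-core CPUs require software to be explicitly split into threads before it benefits.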

What about dual GPUs on a single PCB? Should we expect performance comparable to a CrossFire configuration?

Yes, two GPUs on one PCB would essentially offer the same performance as a CrossFire configuration. In fact, there are some advantages to integrating both GPUs on one PCB, but there are also a lot of drawbacks, such as power draw, space, etc.

After CrossFire, are we going to see solutions that drive four or even more displays from a single board?

Our FireMV line of products already supports up to four monitors on a single graphics adapter. However, most users who want to drive two, three or four monitors usually don't use these systems for gaming but for applications in, for example, the stock market, banking, etc.



