Originally posted by I-like-flings(m):ok new question.. hehe.... so what is the difference between dual DVI and HDMI? always read about it for GPUs and monitors.... if it doesn't come with HDMI, only with DVI, is there any way to convert it to HDMI, and is any quality lost?
basically, HDMI was developed based on DVI... the video signal that HDMI carries is exactly the same as DVI's... but the advantage of HDMI is that it can also carry audio signals at the same time, thus eliminating the ugly sight of extra wires...
so since the video signals of DVI and HDMI are the same, there are DVI-HDMI converters in the market that allow you to switch from one format to the other... that's how I connect my Radeon X1950GT, which has only DVI out and HDTV (component) out, to the LCD TV in my living room... however, if you connect this way, then you'll have to carry the audio over a separate cable...
Originally posted by MyPillowTalks:If you are talking about graphics cards, I think when they put "dual DVI, HDMI" it means 2 DVI outputs and 1 HDMI output. In the box there would be converters from DVI to VGA, or DVI to HDMI.
They are called dongles
my ATI Radeon HD3870X2 comes with 3 DVI, 1 HDMI and 1 S-Video output on board, and on the box they say: "triple DVI, HDMI"
An HDMI cable comes included too. If you are talking about Nvidia graphics cards, I heard that the HDMI dongle does not support audio, while the ATI HD series ones support audio out of the HDMI port at 1080p.
most ATi Radeon HD cards come with HDMI because the GPU has an HD audio codec integrated, so audio from the sound card can be passed to the GPU over PCI-E and out through the HDMI port...
for Nvidia cards, you'll need to link the graphics card and sound card with an S/PDIF cable in order to get audio over HDMI... all Nvidia cards are designed to push sound out through the DVI connector as well, but a plain DVI cable doesn't carry sound, so they require a special dongle/converter in order to output both sound and video through HDMI...
however, it seems that ATi is also adopting this approach (minus the S/PDIF part) lately with the launch of the Radeon HD 4850 (ie it needs a dongle to connect through HDMI), which is not surprising to me since for every HDMI port they build, they have to pay a royalty fee...
Originally posted by IT-Newbie-Logy:TS might wanna look at the E7200 or the E8400 processor, coupled with an ATi HD4850,
which is freaking cheap at about $279, the same price as a new 9600GT. Cheaper than an
8800GT.
Board, I have no idea, depends on your needs. RAM, I prefer Corsair XMS2 or Crucial Tracers.
Some add-ons/info on the 4850: some brands come packed with a DVI to HDMI adapter.
The 4850's DDR3 core clock is lower than the 4870's. The rest, like transistors and memory, is the same.
So,what the hell is TS waiting for? =D
the reference design of the HD4870 uses GDDR5... that's what you're paying the extra money for compared to the HD4850...
theoretically, GDDR5 has at least twice the bandwidth of GDDR3 at the same base clock... the fastest GDDR5 that I know of can run stable at 6000MHz effective (1500MHz x4)... the fastest GDDR3? barely above 2200MHz effective...
in fact the HD4870's GDDR5 puts its bandwidth (roughly 115GB/s at reference clocks) in the same league as the GeForce GTX 280's GDDR3 (roughly 142GB/s)... do take note that the former achieves this on a 256-bit memory interface while the latter needs a 512-bit memory interface to get there...
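the arithmetic behind those numbers is simple: bandwidth = effective transfer rate x bus width in bytes. here's a quick sketch in Python, using the reference memory clocks as I recall them (900MHz GDDR5 on the HD4870, 1107MHz GDDR3 on the GTX 280), so treat those figures as my own assumptions, not official spec sheets:

```python
# Rough memory-bandwidth arithmetic for the cards discussed above.
# Clock figures are the reference specs as I remember them -- assumptions, not gospel.

def bandwidth_gb_s(effective_mt_s: float, bus_width_bits: int) -> float:
    """GB/s = (million transfers per second) * (bytes moved per transfer)."""
    return effective_mt_s * 1e6 * (bus_width_bits / 8) / 1e9

# HD 4870: 900MHz GDDR5, quad data rate -> 3600 MT/s effective, 256-bit bus
hd4870 = bandwidth_gb_s(3600, 256)   # ~115.2 GB/s

# GTX 280: 1107MHz GDDR3, double data rate -> 2214 MT/s effective, 512-bit bus
gtx280 = bandwidth_gb_s(2214, 512)   # ~141.7 GB/s

print(f"HD 4870: {hd4870:.1f} GB/s, GTX 280: {gtx280:.1f} GB/s")
```

run it and you can see the trade-off: GDDR3 only stays competitive on the GTX 280 because of the 512-bit bus, while GDDR5's doubled per-pin rate lets ATi get similar bandwidth from a cheaper 256-bit board.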