1080p?
airbornflght
Houston, TX Icrontian
So my parents got a new LCD TV (pics coming later) that I got installed today with the new home theater system. My mom wants the computer hooked up to the TV (me too), but the current video card only has one output, and it's VGA.
Right now their computer is a P4 3.2 HT, 512MB of DDR400, and a Radeon 9200SE. The Radeon doesn't have a DVI output, so I'm looking for a new video card.
I was looking on Newegg for a video card, either AGP or PCI, with at least one DVI socket. But then I got scared reading some of the reviews about 1080p output.
They will more than likely never play a real 1080p file, as they generally stream low-bitrate movies over the internet (you know the sites). My questions are: if they were to play a 720p or 1080p file, would the hardware handle it, and what video card would you recommend?
Keep in mind I'm on a budget here. I'm trying to stay well under $50, and definitely not exceed it. I don't need anything fancy, as there will be no gaming done on this computer. Just casual use: internet/email, office, etc.
I don't want to pop this card in, have the video stutter, and look like a fool. Some of the posts I read made it sound like you need a heavy-duty computer to play 1080p, but my own computer doesn't even flinch, so I figured the P4 might be at 30-40% usage.
Comments
Most mid-range GPUs will eat up most games at pretty high resolution; my Radeon X1950 will still burn through TF2, Call of Duty 4, etc. without blinking. Now surely gaming, with its dynamic nature and 3D rendering, requires much more processing power than streaming recorded video. So why can't mid-range PCs handle HD video, when the player knows what content is coming and can build up a substantial buffer?
The HD4350 accelerates H.264 decoding for $40-$50 and provides audio over HDMI, which makes it ideal for HTPC applications if there is a PCI-E slot on the motherboard.
In other words, it should play 1080p Blu-ray (and of course normal XviD/DivX movies) with no problems on the existing P4 3.2 HT system, unless picture-in-picture is a factor.
In my experience with my 3850, video cards still leave a good bit of work for the CPU. I think it depends on the setup and the software used.
I know 720p XviD plays fine on a 2.4C Northwood overclocked to 3.4GHz. (I had an X800 XT PE in it at the time; this was on my old Toshiba rear-projection 720p TV.)
Your best bet is really just to build a new HTPC. I have a Samsung 550-series 52" LCD (1080p).
Anyway, my media center is the following: Asus M3N78-VM, AMD Phenom 9550, 4GB of OCZ Gold, ATI HD 3870 (HDMI for video), and a Razer Barracuda for sound.
It plays 1080p videos no problem, and I can game on the TV as well.
We need to draw a distinction between H.264 and 1080p. The former, H.264, is a specific encoding format which is hardware-accelerated by newer GPUs. The latter, 1080p, is only a resolution and does not refer to any encoding format. Some video files can be 1080p but encoded in an older format like DivX or XviD (MPEG-4 ASP). Those encoding formats are not hardware-accelerated by any consumer GPU, so their playback is wholly CPU-dependent.
In other words, it's not accurate to say that "1080p uses CPU" or other such statements. It could, sure, but if the file is encoded in the H.264 format then it can be GPU-accelerated, in which case tests show that CPU usage during playback is usually around 2-12%. H.264 playback is not CPU-dependent with the right GPU, such as any Radeon HD 4000-series card.
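The decision described above, codec vs. resolution, can be sketched as a tiny bit of logic (the codec lists here are just the examples named in this thread, not an exhaustive table of what any particular card accelerates):

```python
# Which component carries the decode load, per the explanation above:
# H.264 is offloaded to a capable GPU (e.g. an HD 4000-series card),
# while DivX/XviD/MPEG-2 fall back entirely on the CPU.
GPU_ACCELERATED = {"h264"}
CPU_ONLY = {"divx", "xvid", "mpeg2"}

def playback_load(codec: str, resolution: str) -> str:
    """Return "GPU" or "CPU" for a given file.

    Note that `resolution` (720p, 1080p, ...) never changes the answer;
    only the encoding format matters.
    """
    if codec.lower() in GPU_ACCELERATED:
        return "GPU"
    return "CPU"

# A 1080p H.264 file is decoded by the GPU; a 1080p XviD file
# makes the P4 do all the work, even though both are "1080p".
print(playback_load("h264", "1080p"))  # GPU
print(playback_load("xvid", "1080p"))  # CPU
```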
On another note, there is a workaround for this issue: any video file can be converted to H.264, after which playback will be completely hardware-accelerated on a video card like the HD4350.
Blu-ray movies can also be downsampled to lower-bitrate H.264 files and stored on a local computer without using up 35-50GB of hard drive space each.
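As a rough illustration of the space savings (the bitrates below are assumed for the example, not figures from this thread): file size is just bitrate times duration.

```python
# size (GB) = bitrate (Mbit/s) * seconds / 8 (bits -> bytes) / 1000 (MB -> GB)
def video_size_gb(bitrate_mbps: float, hours: float) -> float:
    """Approximate size of a constant-bitrate video file, in GB."""
    return bitrate_mbps * hours * 3600 / 8 / 1000

# A 2-hour movie at a typical Blu-ray video bitrate of ~40 Mbit/s:
print(video_size_gb(40, 2))  # 36.0 GB -- in line with the 35-50GB above
# The same movie re-encoded to 8 Mbit/s H.264:
print(video_size_gb(8, 2))   # 7.2 GB
```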
I hope this clears things up.
You can get a Radeon 3650 for AGP. They are about $70 and have HDMI output (via an adapter).
The problem is, if you were to build a new machine with a PCI-E slot, you could get a better video card for $45. Everything in that machine is end-of-life.
Although those complaints usually have to do with games not working correctly with the ancient drivers shipped on the CD or posted on the manufacturer's website. So long as the drivers you have access to enable whatever hardware decoding the card offers, I suppose it'd be worth a shot to buy a $50 card and see if it works.
Better to save your nickels for a month and upgrade the core system. Besides, that would ensure smooth playback of non-H.264 files too.