1080p?

airbornflght Houston, TX Icrontian
edited December 2008 in Science & Tech
So my parents got a new LCD TV (pics coming later) that I got installed today with the new home theater system. My mom wants to have the computer hooked up to the TV (me too), but the current vid card only has one output and it's VGA.

Right now their computer is a P4 3.2 HT, 512MB DDR400, and a Radeon 9200SE. The Radeon doesn't have DVI output, so I'm looking for a new vid card.

I was looking on Newegg for a video card, either AGP or PCI, that has at least one DVI socket. But then I got scared reading some of the reviews about 1080p output.

They will more than likely never play a real 1080p file, as they generally stream low-bitrate movies over the internet (you know the sites). My questions are: if they were to play a 720p or 1080p file, would the hardware handle it, and what video card would you recommend?

Keep in mind I'm on a budget here: I'm trying to stay well under $50, and definitely not exceed it. I don't need anything fancy, as there will be no gaming done on this computer. Just casual use: internet/email, office, etc.

I don't want to pop this card in, have the video stutter, and look like a fool. :vimp: Some people's posts I read acted like you need a heavy-duty computer to play 1080p, but my computer doesn't even flinch, so I figured a P4 might be at 30-40% usage.

Comments

  • Thrax 🐌 Austin, TX Icrontian
    edited December 2008
    You're not going to find a card in that price range that will capably handle 1080p decoding. 720p maybe. I'll do some research.
  • airbornflght Houston, TX Icrontian
    edited December 2008
    Could the processor not lend a hand with the heavy lifting?
  • primesuspect Beepin n' Boopin Detroit, MI Icrontian
    edited December 2008
    No.
  • shwaip bluffin' with my muffin Icrontian
    edited December 2008
    To do 1080p you either need a video card with 1080p decode acceleration or a fast dual-core processor - you can't really get there with half of each.
  • RichD Essex, UK
    edited December 2008
    I must admit this whole subject confuses me a little.

    Most mid-range GPUs will eat up most games at pretty high resolution; my Radeon X1950 will still burn up TF2, Call of Duty 4, etc. without blinking. Now surely gaming, due to its dynamic nature and 3D rendering, requires much higher processing power than streaming recorded video. So why can't mid-range PCs handle HD video, when the player knows what content is coming and can build in a substantial buffer?
  • Khaos New Hampshire
    edited December 2008
    airbornflght wrote:
    I was looking on Newegg for a video card, either AGP or PCI, that has at least one DVI socket.
    When you say PCI, do you mean legacy PCI or PCI Express?
    Thrax wrote:
    You're not going to find a card in that price range that will capably handle 1080p decoding. 720p maybe. I'll do some research.
    The HD4350 accelerates H.264 for $40-$50 and provides audio over HDMI, which makes it ideal for HTPC applications if there is a PCI-E slot on the motherboard.
  • Thrax 🐌 Austin, TX Icrontian
    edited December 2008
    Yes, but the 4350 still requires a significant CPU. It's not a dedicated H.264 decoder card.
  • Khaos New Hampshire
    edited December 2008
    Thrax wrote:
    Yes, but the 4350 still requires a significant CPU. It's not a dedicated H.264 decoder card.
    Not speaking from experience here, but supposedly it is just that: a dedicated H.264 decoder card.
    Anandtech wrote:
    AMD states that while both the 4550 and 4350 support full hardware Blu-ray decode acceleration, the 4350 may not be able to fully accelerate a high bitrate 2nd stream for picture in picture scenarios. We didn't have the opportunity to test this on our 4350 but we'll be following up with more HD decode tests in a future HTPC article. Both GPUs should decode a single stream 1080p Blu-ray movie without any issues, offloading 100% of the decode pipeline to the GPU; we confirmed that the 4550 works as expected and we're assuming the 4350 is the same given that the hardware is identical.

    In other words, it should play 1080p BluRay (and of course normal XviD/DivX movies) with no problems on the existing P4 3.2HT system unless Picture-in-Picture is a factor.
  • edited December 2008
    Khaos wrote:
    Not speaking from experience here, but supposedly it is just that: a dedicated H.264 decoder card.

    In other words, it should play 1080p BluRay (and of course normal XviD/DivX movies) with no problems on the existing P4 3.2HT system unless Picture-in-Picture is a factor.

    In my experience with my 3850, video cards still require a good bit of work from the CPU. I think it depends on the setup and software used?
  • GrayFox /dev/urandom Member
    edited December 2008
    That PC won't ever handle 1080p.

    I know 720p XviD plays fine on a 2.4C Northwood overclocked to 3.4GHz (I had an X800 XT PE in it at the time; this was on my old Toshiba rear-projection 720p TV).

    Your best bet is really just to build a new HTPC. I have a Samsung 550-series 52" LCD (1080p).

    Anyway, my media center is the following: Asus M3N78-VM, AMD Phenom 9550, 4GB of OCZ Gold, ATI HD 3870 (HDMI for video), and a Razer Barracuda for sound.

    It plays 1080p videos no problem, and I can game on the TV as well.
  • Khaos New Hampshire
    edited December 2008
    Editing this down to try to condense it...

    We need to draw a distinction between H.264 and 1080p. The former, H.264, is a specific encoding format which is hardware-accelerated by newer GPUs. The latter, 1080p, is only a resolution and does not refer to any particular encoding format. Some video files can be 1080p but encoded with an older MPEG-4 Part 2 codec like DivX or XviD. Those formats are not hardware-accelerated by any consumer-type GPU, so playback is wholly CPU-dependent.

    In other words, it's not accurate to say that "1080p uses CPU" or other such statements. It could, sure, but if the file is encoded in H.264 (which is MPEG-4 Part 10/AVC, not MPEG-4 Part 2), then it can be GPU-accelerated, in which case tests show CPU usage for H.264 playback is usually around 2-12%. H.264 playback is not CPU-dependent with the right GPU, such as any Radeon HD4000-series card. If you want to verify what a file actually uses, see the sketch below.
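
    A minimal sketch of that check, assuming ffprobe (from the ffmpeg project) is installed and on the PATH; the filename is only an example:

    ```python
    import subprocess

    def video_codec(path):
        """Return the codec name of the first video stream, per ffprobe."""
        out = subprocess.check_output([
            "ffprobe", "-v", "error",
            "-select_streams", "v:0",                     # first video stream
            "-show_entries", "stream=codec_name",         # just the codec field
            "-of", "default=noprint_wrappers=1:nokey=1",  # bare value output
            path,
        ])
        return out.decode().strip()

    codec = video_codec("movie.mkv")  # hypothetical filename
    # "h264" can be offloaded to a UVD-capable GPU; "mpeg4" (DivX/XviD) cannot.
    print("GPU-decodable" if codec == "h264" else "CPU-bound: " + codec)
    ```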


    On another note, there is a workaround for this issue: all video files can be converted to H.264, and then they will be completely hardware-accelerated during playback using a video card like the HD4350. A sketch of the conversion follows.
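
    For example, wrapping a one-line ffmpeg transcode (a sketch, assuming an ffmpeg build with libx264 and AAC support; the file names and CRF quality value are placeholders, and older builds may want the -vcodec/-acodec spellings):

    ```python
    import subprocess

    def to_h264(src, dst, crf=20):
        """Re-encode an input file to H.264 video + AAC audio via ffmpeg."""
        subprocess.check_call([
            "ffmpeg", "-i", src,
            "-c:v", "libx264",   # H.264 video, GPU-decodable on playback
            "-crf", str(crf),    # constant-quality rate control
            "-c:a", "aac",       # AAC audio
            dst,
        ])

    to_h264("home_movie.avi", "home_movie.mp4")  # hypothetical files
    ```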

    BluRay movies can also be downsampled to lower-bitrate H.264 files and stored on a local computer without using up 35-50GB of hard drive space.
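
    As a rough back-of-the-envelope example (assuming a 2-hour movie re-encoded at 8 Mbit/s): 8,000,000 bits/s × 7,200 s ÷ 8 bits/byte ≈ 7.2GB, versus the 35-50GB of a full BluRay rip.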

    I hope this clears things up.
  • edcentric near Milwaukee, Wisconsin Icrontian
    edited December 2008
    Since you will probably never be playing true 1080p, you don't need to be too worried.
    You can get a Radeon 3650 for AGP. They are about $70. It has HDMI output (via adapter).

    The problem is, if you were to build a new machine with a PCI-E slot, you could get a better video card for $45. Everything in your machine is end-of-life.
  • Garg Purveyor of Lincoln Nightmares Icrontian
    edited December 2008
    Normally I support finding a use for legacy hardware, but there are lots of threads out there on the interwebs relating stories of driver gloom with AGP-bridged ATI cards.

    Those usually have to do with games not working correctly with the ancient drivers shipped on the CD or on the manufacturer's website, though. So long as the drivers you have access to enable whatever hardware decoding is available, I suppose it'd be worth a shot to buy a $50 card and see if it works.
  • Khaos New Hampshire
    edited December 2008
    Considering that you can get a dual-core, low-power AMD CPU and a motherboard supporting PCI-E for about $110 total, I see no reason to take chances with legacy hardware. Especially when a PCI-E HD4350 is $40.

    Better to save nickels for a month and upgrade the core system. Besides, that would ensure smooth playback of non-H.264 files.