AMD Ryzen


Comments

  • AlexDeGruven Wut? Meechigan Icrontian

    Hmmm, I use HDMI audio and I don't do OC'ing, so it looks like that does everything I need for the most part.

  • Thrax 🐌 Austin, TX Icrontian

    That HDMI port wouldn't connect to anything on Ryzen. Ryzen does not have a GPU, and therefore no audio controller. The HDMI port is there because socket AM4 also supports 7th-generation (today) and 8th-generation (2017) APUs as well.

  • AlexDeGruven Wut? Meechigan Icrontian

    HDMI from my video card (R9 380). IIRC that doesn't use the onboard chipset.

  • Thrax 🐌 Austin, TX Icrontian

    That is correct.

  • primesuspect Beepin n' Boopin Detroit, MI Icrontian

  • MAGIC Doot Doot Furniture City, Michigan Icrontian

    But they're sold out everywhere :-/

  • Obsidian Michigan Icrontian

    Can someone please explain to me why Ryzen is such a big deal at the moment? It's obviously good news for AMD but from what I can tell the biggest selling point seems to be that the 1800x can just about match a 6900K in most applications for half the price. From what I can tell though, Intel only charges $1000+ for their best processors because they know nothing can match them. Is the hype then stemming from the fact that AMD is finally competitive again for the moment and will force Intel to drop their prices? Do people expect AMD to sell a ton of them in workstations and make market share back there?

    I'm failing to see why people who play games in particular, where Ryzen seems to be struggling, would be so excited right now. From what I can tell, AMD's answer for gamers is that at 1440p and 4K, performance is close enough to Intel's that it doesn't really matter, because performance is almost completely dependent on your GPU. That's still kind of an iffy answer though, especially because the majority of people are still playing at 1080p, where you do see a difference in most games. Is there any reason why Ryzen would actually be a superior product for a gamer, especially one who typically upgrades their GPU more often than their CPU? Wouldn't you then potentially see a real performance difference two years down the line, when you get a graphics card that's twice as powerful and your CPU actually does become a potential limiting factor?

  • primesuspect Beepin n' Boopin Detroit, MI Icrontian

    It's half the price

  • Efficiency of the platform. They can offer a sixteen-thread chip inside a 65-watt envelope. Intel is nowhere near that right now.

    I think if you are just straight gaming, the future parts are going to be a better deal. I think the Ryzen 5 six-core, twelve-thread chips are going to be in a stellar spot price-to-performance-wise for gamers, even gamers who want to live stream. The current sixteen-thread chips are for very specialized workloads that most of us will not run, but for someone who needs them, getting all that kick at under 100 watts TDP is a dream: cooler, less fan noise, more efficient, and able to run longer without any noticeable drop in performance. Ryzen is a massive leap in efficiency.

  • primesuspect Beepin n' Boopin Detroit, MI Icrontian

    There are three non-gaming tasks that are very important to me where Ryzen will be a huge benefit:

    1) Photo processing and editing. I routinely edit and batch-process big groups of photos that are anywhere from 8–25 MB each. The faster this goes, the better.

    2) Folding

    3) BIG data set number-crunching for gear sims for WoW. Seriously, there's a program called SimulationCraft that my current computer literally cannot run if I put a big data set into it. These data sets can be 2+ GB of raw math that my FX-8350 crashes out on.

  • GHoosdum Icrontian

    @primesuspect said:
    3) BIG data set number-crunching for gear sims for WoW. Seriously, there's a program called SimulationCraft that my current computer literally cannot run if I put a big data set into it. These data sets can be 2+ GB of raw math that my FX-8350 crashes out on.

    Is this like theorycrafting, but with big data?

  • Obsidian Michigan Icrontian

    @Cliff_Forster
    Thanks for the explanation. Nothing I've seen so far has really screamed "Buy me!" yet but it'll be interesting to see what AMD has left to unveil and how Intel responds. It's hard not to be a little unimpressed so far if you're a typical gamer though.

  • Tushon I'm scared, Coach Alexandria, VA Icrontian
    edited March 2017

    @Obsidian said:
    @Cliff_Forster
    Thanks for the explanation. Nothing I've seen so far has really screamed "Buy me!" yet but it'll be interesting to see what AMD has left to unveil and how Intel responds. It's hard not to be a little unimpressed so far if you're a typical gamer though.

    I'm a little confused by "typical gamer". I don't think a typical gamer is an enthusiast who wants an extra few percentage points of performance for multiple hundreds of dollars in extra cost. I totally agree with @Obsidian's point that Intel charged what they wanted because they were wanting in competition, and now they've got it AFAICT.

  • I had the same reaction as some when I looked at the benchmarks. I wanted them to punch Intel in the face in every single aspect, especially with a $500 chip, which to me is beyond what I'm willing to pay for any single PC component. All that said, look at the incredible amount of multitasking you can do inside a 95-watt TDP, and that is only when the X chips dynamically overclock themselves. The base model: sixteen threads, 65-watt TDP. Now I don't know where the Ryzen 5 chips will land in efficiency, but if they dip below that, it will be really exciting for a gaming-rig shrink. Now if @Thrax could just call the motherboard partners and convince them to make me a Micro ATX board that does not suck by that time, I'll be living it up!

  • primesuspect Beepin n' Boopin Detroit, MI Icrontian

    @GHoosdum said:

    Is this like theorycrafting, but with big data?

    Yeah, it's an absurd number of calculations. The sims take each of your possible combinations of gear and run 20,000 simulated raid fights with the data; they use tens of thousands of logs from actual WoW battles to get a baseline for what a 'typical' raid fight looks like from a numbers perspective (this character cast this spell, then waited one second and cast this one, then hit this damage buff, then got hit by this boss's ability, and then did this and then that... etc.). Each piece of gear has a bunch of stats on it, and you're wearing 15 pieces of gear. Your weapon has abilities that empower your spells and other abilities. It gets tremendously complicated very fast. If you want to sim 40 different pieces of gear, the combinations multiply out like 40x39x38x37, etc. You end up with an 8 GB data dump of every possible gear combination with what you have in your inventory. Then you run a full sim with EACH of those combinations, and what you end up with is a job that shuts your computer down, no joke.
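    The blow-up described above is easy to see with a little arithmetic. A minimal sketch, assuming hypothetical numbers (40 candidate pieces in your inventory, 15 worn slots) rather than anything specific to SimulationCraft:

    ```python
    from math import comb

    # Hypothetical illustration: with 40 candidate gear pieces and 15 worn
    # slots, the number of distinct gear sets is "40 choose 15".
    candidates = 40
    slots = 15
    gear_sets = comb(candidates, slots)
    print(f"{gear_sets:,}")  # 40,225,345,056 distinct sets

    # At 20,000 simulated fights per set, the total fight count is astronomical:
    fights = gear_sets * 20_000
    print(f"{fights:,}")
    ```

    Even before a single fight is simulated, just enumerating the candidate sets is a ten-figure job, which is presumably why a big enough input swamps a machine.
    
    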
