Programs that determine ideal cluster + stripe sizes?

ginipig OH, NOES
edited December 2003 in Hardware
I'll probably get flamed quite a bit, considering how many Short-Medians have had enough real-world experience to warrant co-writing a bestselling 'Guide to RAID.' Nevertheless:
I'll most likely have to go through multiple reinstallations of W2k plus ATTO benchmarks to attempt any RAID tweaking. I was just wondering if there is a program available (either commercial or, preferably, open source) for users looking to monitor their hard-drive usage. In essence, a program that could determine the optimum stripe and cluster sizes to use when building an array, based on algorithms that reflect my computer's data read/write patterns.

Patterns:
  • I would listen to monkey.audio files, surf the web, and encode music files all at the same time.
  • I would play Counter-strike and have a cd-burner running in the background.
  • I would edit large amounts of footage for 300zx-tuning guides whilst listening to music and chatting with overseas correspondents.

Basically, I multitask every chance I get, and I was hoping that this program could document every step I take with regards to file access.
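
Just to make the idea concrete, here's a rough sketch of the kind of logging I have in mind. It's Python using the modern psutil library (nothing like it shipped with W2k, so treat it as illustration only), and the "average transfer size" heuristic is my own assumption, not an established sizing rule:

    import time
    import psutil  # modern cross-platform library; purely illustrative here

    # Sample the system-wide disk counters over a window of normal
    # multitasking, then estimate the average transfer size. The idea:
    # the average I/O size hints at a cluster/stripe ballpark.
    before = psutil.disk_io_counters()
    time.sleep(60)  # go do the usual mix: music, web surfing, encoding...
    after = psutil.disk_io_counters()

    reads = after.read_count - before.read_count
    writes = after.write_count - before.write_count

    if reads:
        print(f"avg read size : {(after.read_bytes - before.read_bytes) / reads / 1024:.1f} KB")
    if writes:
        print(f"avg write size: {(after.write_bytes - before.write_bytes) / writes / 1024:.1f} KB")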

If any of you know whether or not this is possible/completely absurd, let me know.

Don't be too harsh though.

Comments

  • primesuspect Beepin n' Boopin Detroit, MI Icrontian
    edited December 2003
    I think it borders on three steps below completely absurd, more in line with "perhaps remotely possible" :D

    Honestly, I really don't think there's a piece of software made that can do what you want. I'm pretty sure all of us here have done the old "create/benchmark/reformat/repeat" method to find the optimal settings.

    Remember one thing though - benchmarks can be somewhat deceptive. For example, if your 64/64 array pulls a 48000 in ATTO, and then you try 16/64 and it pulls 49000, there is no way in hell you could EVER tell the difference in real-world performance. It won't make any difference to you as a user. That's why they call these benchmarks 'synthetic' - they don't actually mean much in the real world. They're just numbers. So, in order to save you what will probably be days' worth of testing, just go with a 64K stripe and 16K clusters and you'll be happy with the results.
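
    To put rough numbers on that - a quick sketch, where the 5% noise figure is an assumed run-to-run variance, not a measurement:

        baseline = 48000    # ATTO score, hypothetical 64/64 array
        candidate = 49000   # ATTO score, hypothetical 16/64 array
        NOISE_PCT = 5.0     # assumed run-to-run variance of the benchmark

        delta_pct = (candidate - baseline) / baseline * 100
        print(f"difference: {delta_pct:.1f}%")  # ~2.1%
        if delta_pct < NOISE_PCT:
            print("within noise - you'd never feel this as a user")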

    * primesuspect dons flame suit and prepares for a barrage from the texperts. ;D
  • ginipig OH, NOES
    edited December 2003
    64K stripe and 16K clusters. OhhTays :wave:

    I'll try to do a little more research on the topic (i.e. pm Tex :respect: )
  • RWB Icrontian
    edited December 2003
    On my system I found 4K/16K the best and fastest overall. However, my understanding is that the smaller sizes overwork the HDD, so all in all 16K/64K is nearly as good, without overworking the drive.
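
    A back-of-envelope sketch of the "overworked" part (the 1 MB transfer is an arbitrary example, and this ignores caching and queuing entirely):

        TRANSFER_BYTES = 1024 * 1024  # one hypothetical 1 MB sequential transfer

        # Smaller stripes split the same transfer across more stripe
        # segments, i.e. more work bounced between the drives.
        for stripe_kb in (4, 16, 64):
            segments = TRANSFER_BYTES // (stripe_kb * 1024)
            print(f"{stripe_kb:>2}K stripe -> {segments:>3} segments per 1 MB transfer")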
  • ginipig OH, NOES
    edited December 2003
    RWB, what kind of files do you usually deal with?
  • Enverex Worcester, UK Icrontian
    edited December 2003
    I think your bottleneck is going to be the processor and the entire FSB trying to do all that at the same time; stripe size will be the least of your problems...
  • MediaMan Powered by loose parts.
    edited December 2003
    ginipig,

    What you can best hope for is an average result. There are too many variables to reach a de facto conclusion. Not only do you have to contend with the variables of what you plan to do, but each time you do it, every test will be uniquely different. For example, playing Counter-Strike will change each time you play it, and that will impact the video card, the processor, etc. differently each time.

    There is no specific stripe and cluster size that is absolutely best for all systems. The more popular settings are 64K/16K and 16K/16K. Tex, for the longest time, waved the 16K/16K flag of hard-drive gospel.

    Standing back from the issue a bit, it can safely be said that 99% of the planet deals with small-file access, compared to the other 1% who deal with larger files for professional graphics and video/multimedia editing. By larger I mean single files of 300 MB and above.

    Even games don't really deal with HUGE files; more accurately, they deal with information compiled from many smaller calculations "simultaneously".

    If you were to spend the time to map out, as best you could, exactly repeatable tests with the same activity happening at the same times, you could establish a constant control. Each test would then be repeated at least 5 times, so that an averaged result would begin to show. You'd have to defrag the drives between each test, and so forth.
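
    Something like this hypothetical harness is what I mean - here a simulated benchmark (a base score plus random run-to-run jitter) stands in for a real ATTO run, just to show the bookkeeping:

        import random
        import statistics

        # Stand-in for a real benchmark run: base score plus jitter.
        def run_benchmark(base=48000, jitter=0.03):
            return base * random.uniform(1 - jitter, 1 + jitter)

        # Repeat the same test at least 5 times (defragging between
        # runs on a real system), then look at the average and spread.
        scores = [run_benchmark() for _ in range(5)]
        print("runs :", [round(s) for s in scores])
        print("mean :", round(statistics.mean(scores)))
        print("stdev:", round(statistics.stdev(scores)))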

    You could then come to a more accurate conclusion that "given these circumstances, this setting proved best." But not all of us multitask like that all the time. You could at least conclude that under heavy CPU, GPU, and HDD activity, a particular setting works best.

    But that's just a lot of work that would take days. :)
  • ginipig OH, NOES
    edited December 2003
    "But that's just a lot of work that would take days."

    I broke a sweat from just reading your post!

    Well, the thing is, I multitask constantly. There is a lot of repetition in the kind of work I do (documenting engine swaps, how-to guides for ignition mastering); if it weren't for the strict regimen I developed, I would have amassed nowhere near as much data as I have now. I've been doing this for the past six years, so it's really hard for me to adjust.

    I'll most likely have to set aside those few days to tweak my system, but not just yet. I figure it'll take about a month for NYE's effects to be thoroughly cleansed from my body. College notwithstanding, I'll do it simply because I know it's a necessity.

    Look out in early '04 for my results.

    Thanks for the motivation, though. Previously, I had been searching for an easy way out. You got me on the right path :thumbsup: