Testing Methodology, Revised

For those of you who aren't familiar with our case reviews, or who just don't remember, here's a brief primer on how we've been testing cases to get you up to speed.

Acoustic testing is standardized at one foot from the front of the case, using the Extech SL10 against an ambient noise floor of ~32dB. For reference, that's what my silent apartment measures with nothing running when I test acoustics in the dead of night (usually between 1am and 3am). Since a lot of us sit about a foot away from our computers, this should be a fairly accurate representation of the kind of noise the case generates, and it's close enough that case noise should register above ambient.
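The one-foot distance matters because sound pressure falls off quickly with distance. As a rough illustration only (this is the textbook inverse-square law for an idealized point source in a free field, not anything from our test procedure; real rooms reflect sound), the falloff can be sketched in Python:

```python
import math

def spl_at_distance(spl_ref_db, ref_dist_ft, dist_ft):
    """Estimate sound pressure level at a new distance from a point
    source in a free field (inverse-square law). Real rooms reflect
    sound, so treat this as an idealized lower bound."""
    return spl_ref_db - 20 * math.log10(dist_ft / ref_dist_ft)

# Under this model, SPL drops by ~6 dB for every doubling of distance,
# so a case measuring 40 dB at one foot reads about 34 dB at two feet.
print(round(spl_at_distance(40.0, 1.0, 2.0), 1))
```

In other words, a reading taken at one foot is close to the worst case a user sitting at that distance would hear.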

Thermal testing is run after the computer has idled at the desktop for fifteen minutes, and again after the computer has run both Furmark (where applicable) and Prime95 (less one thread when a GPU is being used) for fifteen minutes. I've found that leaving one thread open in Prime95 allows the processor to heat up enough while making sure Furmark isn't CPU-limited. We're using the thermal diodes included with the hardware to keep everything standardized, and ambient testing temperature is always between 71F and 74F. Processor temperatures are reported as the average of the CPU cores.

That all seems fairly reasonable, but over time subtle issues have crept in that we're taking the opportunity to correct.

For starters, we've found that while the Extech SL10 is perfectly fine for measuring sound levels of about 37dBA and up, it's downright lousy at handling anything designed for silent running. The meter only has an official noise floor of 40dB, which is frankly a bit loud. To produce more accurate results, we've switched over to the beefier Extech SL130, which is rated to go as low as 30dB (basically the lowest any reasonably priced sound meter will go). In addition, I've actually moved since I started doing case reviews, and my new apartment is much quieter than the old one, with an ambient noise floor well below 30dB. Unlike the SL10, the SL130 also won't make "an educated guess" about sound levels below its rated floor. I'm continuing to test acoustics with the microphone a foot directly in front of the top of the enclosure to ensure consistent readings on that front. Anything below 30dB still rates as "near silent," but that's a big step down from 40dB.
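The meter's floor matters so much because near that floor, ambient noise contributes a meaningful share of the reading, and decibels can't simply be subtracted; you have to subtract the underlying sound power. As a hedged illustration using standard acoustics arithmetic (the function below is illustrative, not part of our test tooling):

```python
import math

def source_level_db(measured_db, ambient_db):
    """Back out a source's SPL from a measurement contaminated by
    ambient noise. Decibels are logarithmic, so the subtraction has
    to happen in the (linear) power domain, not in dB directly."""
    if measured_db <= ambient_db:
        raise ValueError("measurement is at or below the noise floor")
    return 10 * math.log10(10 ** (measured_db / 10) - 10 ** (ambient_db / 10))

# A 33 dB reading over a 32 dB ambient floor implies the source itself
# is far quieter than the raw number suggests.
print(round(source_level_db(33.0, 32.0), 1))
```

This is why a reading only a decibel or two above ambient says very little about how loud the case actually is, and why a lower noise floor makes quiet cases measurable at all.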

Meanwhile, thermal testing has proven to be a bit trickier than initially anticipated. Maintaining a consistent interior temperature in an apartment (or even just one room) is easier said than done, and even the variation in ambient temperatures I mentioned before can color results. As a result, instead of using the absolute temperatures reported by the hardware's thermal diodes during testing, I'm now reporting the delta over ambient temperature. Ambient temperature is measured at the beginning of each test cycle (after fifteen minutes of idle, and before the fifteen minutes of burn-in).
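The arithmetic behind the reported numbers is simple: average the CPU core diodes, then subtract the ambient reading taken at the start of the cycle. A minimal sketch (the function name and shape are illustrative, not our actual logging tooling):

```python
def report_temps(core_temps_c, gpu_temp_c, ambient_c):
    """Convert absolute diode readings into the deltas reported in
    the reviews: average the CPU cores, then express both CPU and GPU
    temperatures relative to the ambient measured at the start of
    the test cycle."""
    cpu_avg = sum(core_temps_c) / len(core_temps_c)
    return {
        "cpu_delta": round(cpu_avg - ambient_c, 1),
        "gpu_delta": round(gpu_temp_c - ambient_c, 1),
    }

# Four cores at load with an ambient of 22.5 C:
print(report_temps([61.0, 63.0, 60.0, 62.0], 74.0, 22.5))
```

Reporting deltas this way means a case tested on a warm afternoon and one tested on a cold night can still be compared fairly.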

Test cycles are also being ever so slightly modified. I'm continuing to use seven threads of Prime95 to stress the processor, but GPU stress is now being handled by EVGA's OC Scanner instead of Furmark. Furmark is an odd duck that I've found to be unreliable as a GPU stress testing tool: it consumes a lot of power, but it doesn't actually load the GPU the way something like OC Scanner does. We've also seen driver tweaks from both AMD and NVIDIA over the years designed specifically to throttle Furmark, so it's best to use something else.

As before, I'm continuing to use the thermal diodes of the internal components rather than separate thermal sensors, and CPU temperature is reported as the average of the four cores. SSD temperature will continue to be included as representative of how well the enclosure cools installed drives, but chipset and RAM temperatures are no longer going to be included. RAM thermals are really only relevant in extreme cases, and modern chipsets just aren't the heat generators that old dogs like the X58 were.

One new wrinkle I'm adding is fan speed. Since the CPU and GPU fans are both thermally controlled, it may be useful to see just how hard these fans have to work in any given enclosure. These results won't be strictly comparable between enclosures due to variations in ambient temperature, but they should be a reasonable starting point.

Comments

  • Coup27 - Thursday, March 29, 2012 - link

    clicking "testing hardware (mini-itx) revised" takes you to the home page.
  • JarredWalton - Thursday, March 29, 2012 - link

    It works for me. Can you refresh, clear cache, and/or try a different browser? If you're still having problems, please post details.
  • Coup27 - Thursday, March 29, 2012 - link

It happened earlier, and seconds afterward the article disappeared and was scrubbed from the home page as well. It's back now so not really sure what was going on. Links are working now tho.
  • holotech - Thursday, March 29, 2012 - link

    does the same for me.
    firefox, updated fully.
  • holotech - Thursday, March 29, 2012 - link

    i lied, sorry, its working fine now.
  • Coup27 - Thursday, March 29, 2012 - link

    I have a feeling the article was pulled and then reposted?
  • Impulses - Thursday, March 29, 2012 - link

    Some cases really merit an extra testing round with an alternate fan configuration from what they ship with... Most enthusiasts often end up adding a fan or repositioning some, it'd add a lot of value to the reviews to allow for that.

    I know a lot of people are still gonna complain you didn't test the exact configuration they'd use or whatever, and you can't realistically test more than one alternative... But I still think it's worth thinking about, doesn't even have to happen for every review, just the ones that really merit it (particularly value cases that skimp on fans or highly customizable cases).

    I understand why doing this is a logistics nightmare but as long as everyone understands any extra testing is at the reviewer's discretion, I think it'd be a great addition. I know testing them only as they ship is fairer and more representative of the case's value, BUT I think a lot of enthusiasts view cases as an investment that lasts thru multiple builds, so value is really skewed by what you can ultimately do with it after $20-40 and several years later.
  • casteve - Thursday, March 29, 2012 - link

    Congrats on the new test setup and the quieter living arrangements. If this trend continues, I expect you be living in an underground climate controlled man cave in a few years. :)
  • VampyrByte - Thursday, March 29, 2012 - link

    I totally agree with using temperature deltas rather than absolute values. However, at least in the 550D article posted alongside this one, I found it hard at first to source the value for the ambient temperature. This information should really be in the title of the graph somewhere.
  • JarredWalton - Thursday, March 29, 2012 - link

    The ambient temperature will likely be slightly different for each case tested, so as the graphs get filled in with more data points we can't simply list one ambient.
