System Performance

One thing I'd like readers to keep in mind when looking at these charts is that the Alienware X51 review unit we have on hand is just $999, and the $949 version with only 6GB of RAM will probably perform comparably. That means the desktops it's competing with here are almost all two to three times as expensive; the only one in the same price bracket is the WarFactory Sentinel, and that system was reviewed nearly a year ago.

Futuremark PCMark 7

Futuremark PCMark Vantage

PCMark certainly takes the X51 to task, but keep in mind that every other system tested here is sporting an SSD while the X51 is making do with a mechanical hard disk. PCMark skews very heavily towards SSD-enabled systems; that's why the AMD Phenom II X4 955 in WarFactory's tower is able to post a lead on the Alienware's substantially faster i5-2320. Not that we're disputing how big of an impact that can have in the real world--in many use cases, a moderate Phenom II system with an SSD will feel snappier than a faster Core i5/i7 with a hard drive.

3D Rendering - CINEBENCH R10

3D Rendering - CINEBENCH R10

3D Rendering - CINEBENCH R11.5

Video Encoding - x264

Video Encoding - x264

Once we get to the more CPU-centric application tests, the X51 fares a bit better, and again it's competing with desktops that are almost all running processors overclocked to 4GHz or better. CPU performance is actually quite good, and the i5 definitely offers better value now than the Phenom II in WarFactory's machine did.

Futuremark 3DMark 11

Futuremark 3DMark Vantage

Futuremark 3DMark06

On the other hand, the GeForce GTX 460 in the WarFactory machine is definitely a faster video card than the X51's GTX 555, but it's not the most massive lead in the world, and in 3DMark06 the Sentinel winds up being CPU limited. The GTX 555 may very well have its work cut out for it, though; Alienware promises the X51 is able to deliver a compelling gaming experience, but is it going to be able to hack 1080p gaming?

59 Comments

  • Anonymous Blowhard - Friday, February 17, 2012 - link

    "With such a compact design one would expect the X51 to be both loud and hot, but surprisingly this isn't the case. Quite the opposite actually; the X51 is cooler and quieter at both idle and load than the first-generation Xbox 360 was."

    I'm pretty sure I've heard quieter power tools than a first-gen 360. That's not exactly shooting for the moon there.

    How far away is that 40dB measurement being taken from? This makes the difference between "gaming capable HTPC" and "banned from the living room."
  • haukionkannel - Friday, February 17, 2012 - link

    This is something like a paragon of "the best you can get" when thinking about next-generation consoles.
    The consoles will most probably be even more crippled by power consumption, and this would be too expensive, so they would also require cheaper parts...
    It will be nice to see, when the Xbox 720 comes out, how it compares to this...
  • A5 - Saturday, February 18, 2012 - link

    Take this and replace the GPU with something with DX11.1 support and similar thermals (a 6850 with DX11.1 features added seems reasonable instead of a 7770), and you're probably in the ballpark.

    Good-looking console games come from the incredible amount of optimization possible due to a single hardware configuration, not from the power of the hardware.
  • A5 - Saturday, February 18, 2012 - link

    You'd also replace the CPU with some kind of PPC variant, if the rumors are to be believed.
  • tipoo - Saturday, February 18, 2012 - link

    The first-revision 360 had a 200W maximum power draw; this has a 172W draw, so I think they could do it. But I think Microsoft at least, and probably Sony too, will rethink the selling-at-a-loss strategy this round, as it took them a looong time to recoup losses. There's a rumor the Nextbox will use a 6670-like card, but I think (and hope) that is false, as the original 360 dev kits used an old X800 graphics card before they finally came with the X1900-like chip in the 360.
  • Traciatim - Friday, February 17, 2012 - link

    It's really unfortunate that you couldn't have done the gaming benchmarks with the i3, i5, and i7 models to see how much of a difference each step makes in a variety of games.
  • Wolfpup - Friday, February 17, 2012 - link

    The answer is power gating, not switchable graphics. Until power gating improves, we need the GPU acting as a GPU.

    These articles keep acting like it's fine, and in practice, it's one person after another getting blue screens, driver weirdness, difficulty installing Nvidia or AMD's drivers, etc., that you just don't see on most systems without switchable graphics.

    Articles like this that keep promoting it have casual users trying to buy stuff confused, when you've got 10 people on a forum trying to talk them out of it.

    I'm used to Anandtech being dead on with everything, so this Optimus push of the last few years is BIZARRE.
  • TrackSmart - Friday, February 17, 2012 - link

    Switchable graphics makes a lot of sense for a mobile system, where an extra couple of watts of power draw can mean an extra hour or two of battery life. I'm already amazed at how little energy *very powerful* modern graphics cards use when idling. How much lower do you think they can realistically go? Until they can get within range of their mobile parts at idle, switchable graphics will continue to be a compelling feature for keeping laptops running longer.

    If you are talking specifically about desktop computers, then I agree that the benefits are minimal, aside from access to Quick Sync for those few people who would use it.
  • JarredWalton - Friday, February 17, 2012 - link

    "...in practice, it's one person after another getting blue screens, driver weirdness, difficulty installing Nvidia or AMD's drivers, etc., that you just don't see on most systems without switchable graphics..."

    I disagree. I've had very few BSODs, taking all of the Optimus laptops I've tested/used together over the past few years. I'm sure there are probably exceptions, but certainly within the last 18 months I've had no complaints that I can think of with Optimus on my personal laptops.

    I don't think Optimus fills a major need for a desktop, but posts like yours claiming that Optimus is essentially driver hell are, in my experience, the rantings of someone who either had one bad experience or simply hasn't used it.

    But let's put it another way: what specific laptops have you used/tested with Optimus where there were clear problems with Optimus working properly, where drivers couldn't be updated, etc.?
  • TrackSmart - Friday, February 17, 2012 - link

    Gamers are the target audience, yet a marginally bigger case would have allowed for a more powerful GPU. Or a similarly powerful GPU for a lot less money. This is not a mobile system where every square cm of space counts, so why force the consumer to make such large compromises in price:performance?

    Obviously I'm not the target audience. Just like I will never own an "all in one" desktop computer that has the performance of a laptop. It just doesn't make sense unless you have absurd space limitations.
