146 Comments

  • Wreckage - Monday, October 28, 2013 - link

    So AMD's big Hawaii show was just wiped out by a simple price cut.
  • techxx - Monday, October 28, 2013 - link

    Sure sounds like it. Their R9 280X parade was sweet, but short. I suppose there will always be those who want the 10% performance advantage, even at the cost of build quality, noise, and heat levels.
  • Gigaplex - Monday, October 28, 2013 - link

    Noise and heat levels, sure, but build quality? I've seen far more build quality issues with NVIDIA.
  • just4U - Tuesday, October 29, 2013 - link

    Build quality across both companies appears to be about equal. Nvidia's high-end reference cooler takes the cake, though, if only its partners would use it more. Hopefully AMD takes note of that. One of the sticking points for many early adopters (of either company) is always the cooler. Having something really special that will still be enticing after custom designs come out was a great move by Nvidia, and one I hope AMD also makes.
  • meacupla - Monday, October 28, 2013 - link

    Not quite yet?
    We already know the 290X trades blows with Titan and actually beats it at ultra-high resolutions, while 780 Ti performance is still just speculation at this point.

    And we are still waiting on word for 290 pricing and performance.

    Either way, this competition is good for the customer.
  • Sancus - Monday, October 28, 2013 - link

    There's a big problem with using the 290X for ultra-high resolutions, though. If you were to do that, you'd want at least 2 or 3 of them, because one isn't enough even if it is a few percent faster than an overclocked 780, and that's where the noise comes in. One 290X is noisy but not a big deal; 2 or 3 of them go from "noisy" to "completely intolerable". Plus Crossfire is still a mess, especially with Eyefinity.
  • Spunjji - Monday, October 28, 2013 - link

    You clearly haven't read the 290X crossfire articles, then, because it works fine for the 290X. Also consider that anyone spending the money on those cards and the accompanying setup would have to be startlingly dim to use the stock coolers on any of these devices when superior options are available.
  • The Von Matrices - Monday, October 28, 2013 - link

    This is a common misconception. Eyefinity does not have to be expensive. You can get three 1080P monitors for the same price as the R9 290X. For those people (I am one of them) the price of swapping the stock heatsink is actually a significant portion of the purchase cost.
  • Sancus - Monday, October 28, 2013 - link

    Next you'll tell me that SLI works just as well as a single GPU.
  • TheJian - Tuesday, October 29, 2013 - link

    So you're saying the price of an AMD card is card + fan to avoid the noise/heat... so not $550 then, huh? Last I checked it still doesn't come with 3 AAA games either, and it pulls ~50-60 more watts.
  • jnad32 - Monday, October 28, 2013 - link

    Correct me if I'm wrong, but I thought every review I read said that quiet mode was basically inaudible. I also haven't really read anything about bad crossfire performance from the 290X's in any reviews either.
  • MrSpadge - Monday, October 28, 2013 - link

    It's "not noisy" only while being idle. Under load in quiet mode it's "okay, better than HD7970GE", but very clearly audible. Unless you've got some other massive noise source nearby.. which would make talking about noise almost pointless.
  • Torashin - Monday, October 28, 2013 - link

    What, like the game audio? *facepalm*
  • Gigaplex - Monday, October 28, 2013 - link

    Not all games have loud sounds at all times. And some of us don't turn the volume up all that loud.
  • Klimax - Wednesday, October 30, 2013 - link

    Not even a game with massive explosions will drown out that noise, and that level of loudness is already quite problematic...
  • Gadgety - Monday, October 28, 2013 - link

    "One 290X is noisy but not a big deal, 2 or 3 of them goes from "noisy" to "completely intolerable"." Well, going multi GPU I would definitely put them under water. In my opinion, multi GPU are all completely intolreable regardless of brand.
  • eanazag - Monday, October 28, 2013 - link

    We'll know how the Ti fares after it is released. It is interesting that AMD has laid its cards out and Nvidia still has the Titan priced so high. The Titan didn't get a price cut yet and likely should. I am having a hard time believing that the Ti will outperform the Titan. Yet what we do know is that there is plenty of thermal room for Nvidia to ratchet up their existing lineup to parity with AMD and edge them out on build quality with a performance advantage.

    I guess what we need to see are performance, heat, and noise numbers for a GTX 780 with a TDP of 300W - before the 7th.

    Back on Titan - it is too easy to get two 290Xs for the price of a Titan. And the numbers are very much in favor of AMD.

    All-in-all this is great for customers, especially compared to the drought we are getting on the CPU side.
  • Gadgety - Monday, October 28, 2013 - link

    "The Titan didn't get a price cut yet and likely should. I am having a hard time believing that the Ti will out perform the Titan." Well, not necessarily. The Titan offers high capacity double precision, which I assume the 780ti won't. My impression is in the compute/workstation segment the Titan already is a high value option. No price cut necessary.
  • b3nzint - Monday, October 28, 2013 - link

    I think you are wrong; take a look at this: http://www.sisoftware.co.uk/?d=qa&f=gpu_financ...
    The R9 290X also shines in the compute area!
  • TheJian - Tuesday, October 29, 2013 - link

    Can you make money with SiSoftware, Folding@home, bitcoin mining, etc.? NOPE. CUDA is where the money is in pro apps. No AMD card can do it, and Titan excels at it for $1500 off Tesla pricing ($2500).

    From your link...LOL
    "Note: OpenCL was used as it is supported by all GPUs/APUs. The tests are also available through CUDA which provides better optimisations for nV hardware."

    Gee, let's run NV cards in their worst-case scenario, ignoring CUDA, so AMD looks reasonable... Let's run them in that crap that's not funded by anybody called OpenCL. Why not test CUDA vs. OpenCL here? They are already telling you they can do it, but then NV would blow away AMD... Toms/Anandtech both refuse to pit them against each other while it is EASY to do with any pro app (Adobe, Cinema 4D, 3ds Max, Blender, etc.). Just swap plugins and bench. LuxRender for AMD and Furryball/Octane etc. for NV. What excuse is there for never pitting them against each other? OpenCL for AMD (or OpenGL, whichever is needed) vs. CUDA in the same app for NV. They all hate CUDA and love OPEN crap, which is why they avoid this scenario that can easily be shown. CUDA is taught in 600+ universities for a reason... LOL.

    Your link is running NV in CRAP MODE. Nice try. What moron purposely turns off 7 years of CUDA funding and runs Titan (or any NV card) without CUDA when, as he notes, it is already in there? TURN IT ON. Useless results.
  • dukejukem - Tuesday, October 29, 2013 - link

    I've read your sentence twice now, and I'm still not sure what point you're trying to make here.

    The only attempt at a fact that you've put in this mess of a paragraph is that 600 universities teach CUDA, yet you have failed to mention what their specific application of CUDA is, and why they teach it in the first place.

    I think you may have to realise that people buy video cards to suit their needs. Some people buy them because they like having the most expensive card on the market to brag to their friends about; some people just want to play games within a reasonable budget; other people, like myself, buy certain cards based on the functionality each manufacturer builds into them.
  • Da W - Tuesday, October 29, 2013 - link

    CUDA is like AMD's Mantle, for compute. If Mantle is bad, so is CUDA. If CUDA is good, so is Mantle.
  • Klimax - Wednesday, October 30, 2013 - link

    CUDA is not like Mantle. CUDA manages quite a few things, like core allocation and such.
  • Bubafet - Wednesday, October 30, 2013 - link

    Do you know how many times you said CUDA? All that effort and sass over what? How much money do you make with it? I hope it justifies your effort. If you were listened to elsewhere, you wouldn't be a random tool on the internet. And curiously, do you think the odd wannabe creator, who boasts long before any display of real talent (not childish obsession), outweighs the masses that just want to game? You act like your profession, or attempted profession since you haven't mentioned any credible work or employment, is anywhere near the main purpose or use for the product. People who aim at knowing more, like yourself, are always the first to pat themselves on the back for knowing the first tier and claiming legend status to people who don't care. And comparing yourself to the two reputable tech groups you've mentioned: sad, sad. Please make a game that makes your rant less awkward. If you're a fanboy, try sticking up for NV's past six months of driver shame. Even they are backpedaling and hiding at the moment. I'm an NV fan, always have been, but no more.
  • blitzninja - Saturday, November 2, 2013 - link

    I know he sounds like a tool, but he does make one good point. OpenCL (and OpenGL by extension) doesn't run as efficiently on nV hardware as CUDA does, since nV compute hardware is built with CUDA in mind.

    The crap he says about OpenCL being shit is obviously false, and it's clear he has no clue whatsoever about OpenCL coding. OpenCL is actually really efficient, and OpenGL--since he mentioned it--is actually faster than DirectX 11. The only reason people choose DirectX is that it has more support/advertising (Microsoft) and more entry-level-friendly API documentation; otherwise, OpenGL is faster and runs on every platform.

    Honestly, run CUDA on nV vs. OpenCL on AMD, because if someone buys an nV card they're not going to be running OpenCL if CUDA support is available, so it's a more realistic comparison.
  • TheJian - Tuesday, October 29, 2013 - link

    As gadgety said, Titan is already a Tesla with a $1500 price cut, down from $2500. Pros still laugh about the value of a Titan, and that is why it sells. If you bought one just for gaming, you're either rich or just don't get the point of the product. A great gamer, yes, but also a PRO CARD replacement on a budget. Think CUDA, and you'll get the point. They don't need a price drop, but I'd suggest one anyway, to say $800, and make a faster Ultra model with another SMX enabled and 100MHz more. It shouldn't hurt Tesla much, since if you're really making serious cash with your card you'll want driver support for the other $1500. Over years of use making money, it means nothing.
  • just4U - Tuesday, October 29, 2013 - link

    From what I know about compute stuff... doesn't AMD sort of lead in this area across most of their cards? In theory, wouldn't they be the budget pro card?
  • abhaxus - Wednesday, October 30, 2013 - link

    I believe AMD does not do double precision with the same speed as the Titan. For basic compute (consumer level) like bitcoin mining, folding, etc, the AMD cards are faster. But for prosumer and professional tasks, the Titan is the king. Not an expert, but I had been confused by that as well.
  • just4U - Tuesday, October 29, 2013 - link

    There seems to be a hole at $399... I'd lay money that is where AMD plans to position the 290... unless they want it to directly compete with the 770; then they'd have to knock it down a bit more, but there would still be that hole in the $400 segment.
  • Spunjji - Monday, October 28, 2013 - link

    Or not, if you live in the real world! ;)
  • Da W - Monday, October 28, 2013 - link

    I'd say AMD hit a hard one and Nvidia was left with no choice but to cut prices.
  • tuklap - Tuesday, October 29, 2013 - link

    Indeed they are. ^_^

    I guess it's a sweet battle for us.
  • krutou - Thursday, October 31, 2013 - link

    Same thing that happened to AMD with the 7000 series around this time last year.

    I don't see the problem.
  • danjw - Monday, October 28, 2013 - link

    Well, that and the introduction of a new card. As someone who won't buy AMD graphics cards, this makes me laugh. While I would like more competition in the marketplace, AMD's graphics division really annoyed me when I was working at a small gaming company and we were trying to resolve an issue related to their drivers. Nvidia was always there for us; AMD wasn't even showing up to the game.
  • DMCalloway - Monday, October 28, 2013 - link

    Not to worry, they'll be there for the next gen of consoles.
  • axien86 - Monday, October 28, 2013 - link

    Since Nvidia's GPU chips are around 25% bigger than AMD's Hawaii, the price competition is good for consumers and extremely bad for Nvidia's bottom line. That's even before the launch of the AMD R9 290 in two days, along with one of the biggest worldwide gaming promotions for Battlefield 4.

    Given that Mantle tech will be shown at the Nov 11 AMD APU event along with the December Battlefield 4 update, Nvidia looks to be fighting against a better equipped competitor.

    http://www.dsogaming.com/news/pcars-developer-mant...

    Fudzilla just reported that more Nvidia top executives are leaving the company and they have inside information that will only become evident to the public in the future.
  • EJS1980 - Monday, October 28, 2013 - link

    WOW.....more Nvidia doom n' gloom forecasting, huh? How original, man.
  • MySchizoBuddy - Monday, October 28, 2013 - link

    A $150 price cut will definitely hurt Nvidia's profits. But I don't see any doom and gloom in its future.
  • krutou - Thursday, October 31, 2013 - link

    "Since Nvidia's GPU chips are around 25% bigger than AMD Hawaii, the price competition is good for consumers and extremely bad for Nvidia's bottom line."

    The 7.1B transistor count for GK110 represents a chip with no SMXes disabled. The 780 is a GK110 with 12 of 15 total SMXes enabled and represents the lowest bin for GK110. Higher-binned chips are sold for $1000 as the Titan or over $2000 as the Tesla.

    Nvidia isn't losing money compared to AMD with their "bigger chips" because they are not "bigger" in terms of usable transistors.
  • The Von Matrices - Monday, October 28, 2013 - link

    The price cut is great, but alone it won't sell NVidia cards. What will sell NVidia cards is the shortage of R9 290X supply combined with the lack of custom R9 290X's compared to custom GTX 780's.
  • just4U - Tuesday, October 29, 2013 - link

    TBH, if Nvidia partners put out more reference models they'd likely sell more. Out of everything that has come out for the Titan/780 and even the 770, I think that cooler is its biggest selling point. But likely due to its expense, partners have opted instead for their own custom designs.
  • abhaxus - Wednesday, October 30, 2013 - link

    The writing on the wall that I'm reading is that AMD is not allowing custom R9 290X cards because they purposely thermally limited the stock cooling solution. When the 780 Ti comes out, all of a sudden 290Xs will appear with custom cooling that allows them to run at their actual maximum speeds. Think 780 HOF from Galaxy.
  • insurgent - Tuesday, October 29, 2013 - link

    Or Nvidia's profits were wiped out by competition, saving some fantards a couple of hundred bucks.
  • Jumangi - Thursday, October 31, 2013 - link

    They forced Nvidia to drop the 780 by $150, I don't see how that's a bad thing for consumers.
  • piroroadkill - Tuesday, November 5, 2013 - link

    Not even slightly. They have created competition. It's good for everyone.
  • kmob - Monday, October 28, 2013 - link

    Pretty impressive response from nVidia - definitely puts the 780 back in the "value for $" game around the $500 price point. Especially with G-Sync on the horizon, this takes away a lot of the clear advantage that the R9 290X had at launch (last week...).

    A big question for me... where the heck does the Titan fit here? The 780 -> Titan performance gap wasn't that significant (ignoring the double-precision FP capability). How can they insert the 780 Ti in between at $700 and still have ANY value proposition for the Titan?
  • sherlockwing - Monday, October 28, 2013 - link

    My guess is Titan just goes EOL; the 780 Ti will probably be a 3GB Titan or better (2880 cores, maybe).
  • neoraiden - Monday, October 28, 2013 - link

    Titan is for price-is-no-object 3-way SLI...
  • MySchizoBuddy - Monday, October 28, 2013 - link

    Titan is for compute professionals using CUDA, with its really high double-precision FLOPS.
  • JarredWalton - Monday, October 28, 2013 - link

    No, I think Titan will stay more or less where it is (possibly a $100 price cut), as it basically has professional level features (full performance 64-bit FP). GTX 780 Ti could then be Titan but without the full performance 64-bit functionality, and maybe higher clocks than Titan. If Titan also had the full OpenGL performance of the Quadro cards, it would be a no-brainer, but that doesn't appear to be the case.
  • MySchizoBuddy - Monday, October 28, 2013 - link

    Its 64-bit FP rate is 1/3 of its 32-bit FP rate. It isn't full performance but 1/3 of full performance.
  • Friendly0Fire - Monday, October 28, 2013 - link

    You're assuming double-precision can be as fast as single-precision. Hint: it can't. You're gonna take a hit regardless, and it usually hovers between 50 and 75% of single-precision performance.

    As far as I'm aware there is no throttling whatsoever with Titan; it's running DP at its maximum possible speed.
  • chizow - Monday, October 28, 2013 - link

    That's correct, Titan runs double precision at full speed, same as the GK110-based Tesla, which is only 1/3rd of single-precision speed. GK104 chips run DP at only 1/24th, more of a hardware limitation than willful castration.
  • 1Angelreloaded - Tuesday, October 29, 2013 - link

    I don't know if you guys actually remember, but during the Quadro years people were actually using server boards that could support 4 double-wide cards; the reasoning was that for 4x $500 EVGA 9800 GX2s you got really, really good workstation speed at a large discount. I believe a firm in Europe was actually selling systems built around that setup. The Titan serves as a HYBRID card, something a workstation card is bad at; Titan is for the professional who wants a workstation that can also be used for normal to extreme consumer-level gaming.

    I repeat to all the retards... the GTX Titan is a hybrid workstation/consumer-level GPU, meaning you can do both on one system. Not that you couldn't game on Tesla cards; however, the diminishing returns on FPS are ridiculous and they aren't made to accommodate gaming. Titan changes that, and that is its market segment.
  • chizow - Monday, October 28, 2013 - link

    Yes, there are plenty of other options available now that don't force people to pay the ridiculous compute gimmick tax. Those who didn't feel they needed compute unloaded their Titans long ago; the poor few that didn't sell when the 780 launched are now trying to unload them for $700. Once the 780 Ti launches for $700 on November 7th, they will be lucky to sell them for $600. Great for the few hundred who actually want a Titan for compute, I guess.
  • blitzninja - Saturday, November 2, 2013 - link

    You really have no idea how many people want hybrid compute/gaming stations, do you?
  • ZeDestructor - Monday, October 28, 2013 - link

    Titan has fully enabled FP64. The 780 Ti likely will not. As such, the Titan will retain its $1000 price point as the poor man's Quadro K6000.
  • NextGen_Gamer - Monday, October 28, 2013 - link

    I can see the 780 Ti being completely identical to the Titan in every way except two: FP64 and memory size. This would make sense to me. Keep the clockspeeds & functional units the same, but just cut-out the full FP64 capability & bring the memory down to 3GB. So you can get what is essentially Titan performance (from a gamer's perspective) at $700, or pay $300 more to get the FP64 and 6GB/future-proof memory size.
  • zeock9 - Monday, October 28, 2013 - link

    Simply keeping it at Titan-level performance will not be enough to justify its $150 premium over the 290X competition, since nVidia, this time around, seems willing to play a very competitive p/$ game with AMD, as evidenced by their recent aggressive price cuts.

    A hefty premium over the 290X, as the article points out, should be a hint that the new 780 Ti could be much faster than Titan, not just an identical part with compute units disabled.
  • Minion4Hire - Monday, October 28, 2013 - link

    And what about clock speed? Titan has some headroom in it yet, especially since it is a more mature part now. A fully enabled higher clocked GTX 780 might just push past the 290X given how close the two already are.
  • Da W - Monday, October 28, 2013 - link

    The 290X also has headroom to spare. If you read correctly, the problem is the cooler; the chip basically never runs at full speed, to keep the fan noise down.
    Get a 290X with a custom liquid cooler and it's gonna rock!
  • mfergus - Monday, October 28, 2013 - link

    How much headroom can the 290X with how much power it is already using?
  • mfergus - Monday, October 28, 2013 - link

    ^have
  • EJS1980 - Monday, October 28, 2013 - link

    I agree, as an OC'd 780 destroys a Titan (and the 290X, for that matter). So in order for the 780 Ti to even make sense, it would have to readily beat the Titan out of the box. Factor in that Jen-Hsun Huang stated that the 780 Ti will be the fastest GPU they've ever released, and I think all signs point to the 780 Ti being the new UNDISPUTED king of the castle.
  • Flunk - Monday, October 28, 2013 - link

    That's not quite true; it could easily be just an overclocked GTX 780, since they don't take user overclocking into account. Sometimes they ship identical cards with different clocks and claim one is better.
  • YazX_ - Monday, October 28, 2013 - link

    Titan is a one-year-old card manufactured in limited quantities, so Titan is history now. Now the 780 Ti will be placed above the 780 (and most probably will wipe the R9 290X), and then a Titan Ultra will take the crown, placed above the 780 Ti and priced at $1K; same story as last year.
  • Spunjji - Monday, October 28, 2013 - link

    You haven't quite explained how they make the 780 Ti perform better than the Titan, though, which would be required for it to "wipe" the 290X.
  • Flunk - Monday, October 28, 2013 - link

    Full compute cores, more RAM and a higher clock. There you go, that's just too easy.
  • Byte - Wednesday, October 30, 2013 - link

    An overclocked 780 can easily match Titan in many games. Just look at the reviews of pre-overclocked cards and you can see this. The 780 Ti will definitely beat Titan in gaming, but not compute.
  • FITCamaro - Monday, October 28, 2013 - link

    Titan was never meant to be just for games. It was built for doing GPU computing.
  • Da W - Tuesday, October 29, 2013 - link

    Most people don't get it; the difference between AMD and Nvidia is precisely that Nvidia segregates compute and gaming cards, and AMD doesn't. In doing so, Nvidia is able to remove some compute functionality from its gaming cards and keep power usage lower than AMD's for the same performance. AMD is aiming at compute for the mainstream market with its APUs. At its core, neither architecture is fundamentally better than the other.
  • Da W - Tuesday, October 29, 2013 - link

    This penis contest is pointless.
  • Margalus - Wednesday, October 30, 2013 - link

    The Titan doesn't matter; you can't ignore what it was really made for. It doesn't fit anywhere at all with gaming cards like the 290X or 780 Ti. Titan is not a gaming card; it is a professional compute card that just happens to play games very well. It is not intended for gaming consumers like the 780 Ti is. The 780 Ti may just come along and blow Titan out of the water in gaming performance. But that will not affect the Titan's price, since it is NOT intended for gamers.
  • blitzninja - Saturday, November 2, 2013 - link

    It is intended for gamers who also do compute. Try playing a game on a Tesla card; your FPS will be shit (especially for the price), and the diminishing returns aren't worth it (you can get good FPS if the game supports your 2/3-way SLI Tesla cards, but even then it's just good, nothing better).

    The Titan fills these combined niche categories in one go:
    - People who can't afford Tesla.
    - People who want to compute and game on the same desktop.
    - People who have Tesla workstations at work but want a lighter Tesla card at home for that little extra work.

    This is where Titan shines. As far as nVidia is concerned, the Titan is produced from the wafer dies that have a failed SMX unit; people don't seem to consider this with all their "enable the extra SMX unit and 100MHz" comments (this is why they're produced in limited quantities, btw; fully enabled dies become Tesla cards most of the time), and the market is there since the card is usually sold out.
  • mattyc - Monday, October 28, 2013 - link

    So I buy a GTX 680... the 780 is released a month later, but I was lucky enough to get in on the EVGA Step-Up program... A month after I get my 780, the Ti comes out and the 780 is reduced to what I had to pay for the step up. I need a friend who's in the industry so I don't have to deal with this ****.
  • iniudan - Monday, October 28, 2013 - link

    You're the one who bought a card that had been on the market for over a year, in a market where you get an annual release of a new series of devices. It's not like a quick search can't give you the release date of what's currently available.
  • The Von Matrices - Monday, October 28, 2013 - link

    Your timeline makes no sense. The GTX 780 was released on May 23, 2013, which according to you is when you traded up. The GTX 780Ti will be released on November 7. I don't see how that's a month after you get your GTX 780. That's a 6 month difference, which is a perfectly reasonable lifespan as the top card.
  • mattyc - Monday, October 28, 2013 - link

    Did you not read what I said? "Step up program"
  • kedesh83 - Monday, October 28, 2013 - link

    Since I purchased my 780 SC about 20 days ago, I wonder if I can use the Step-Up program to get the Ti. If that doesn't work, it looks like I'll be selling it on Craigslist for 400 dollars, hopefully.
  • rakunSA - Monday, October 28, 2013 - link

    I just stepped up my 770 to the 780. EVGA updated it already. You should be able to as soon as Nov 7 rolls around.
  • Da W - Monday, October 28, 2013 - link

    My only requirement is to run my 3 1920x1200 screens in portrait with no issues. Is GTX 780 Surround comparable to Eyefinity, INCLUDING working in desktop mode (I'm a stock trader)?

    Cause I sure would prefer Nvidia's game bundle.
  • The Von Matrices - Monday, October 28, 2013 - link

    NVidia does Surround better than AMD does Eyefinity if you ask me (I've had both). AMD believes in switching profiles between eyefinity and separate desktops. NVidia has a hybrid mode so you don't need to switch; you still play games on all three monitors but in desktop mode windows maximize to single monitors. The only disadvantage to the NVidia implementation is that full screen video maximizes across all three monitors. This is OK if you have 16:9 monitors, but if you're like me with 16:10 monitors that means a 16:9 video scales up and displays some content on the side monitors.
  • The Von Matrices - Monday, October 28, 2013 - link

    I'm not sure about NVidia's support for rotated displays; you'll have to look that up.
  • ZeDestructor - Monday, October 28, 2013 - link

    Rotated displays work fine on my GTX670. For non-Surround setups at least.
  • colonelclaw - Monday, October 28, 2013 - link

    Does anyone know how the logistics work with regard to retailers? I just checked on my favourite web site and, sure enough, all the 680 prices have dropped dramatically, and nearly all the cards are in stock. Does this mean Nvidia credits the retailer with the difference? I would assume all the cards on sale were originally purchased by the retailer for more than they are now selling for. Or is there some other transaction?
  • The Von Matrices - Monday, October 28, 2013 - link

    Typically this works with NVidia (or whatever manufacturer) providing discounts on future purchases.
  • extide - Monday, October 28, 2013 - link

    It could go either way; I mean, it is not unheard of for a manufacturer to cut a check to a retailer because of a price cut.
  • firewall597 - Monday, October 28, 2013 - link

    What upsets me the most is that Nvidia has simply been charging what they want until now due to lack of competition. It takes a competitive release from AMD to bring Nvidia prices down to normalcy, while they've been raking in stupid profit margins up 'til now and calling it "value".
  • EJS1980 - Monday, October 28, 2013 - link

    Welcome to the wonderful world of BUSINESS!
    Every company ever has done this, and AMD is NO DIFFERENT. Unfortunately, when a company has a great product and no competition, they can charge whatever they feel their customers will tolerate, and Nvidia made boatloads off Titan. This is why I'll never understand the idiotic fanboys who want only one company to win. Apparently, these fools don't realize what would happen if only Nvidia or only AMD were making chips (hint: abysmal innovation and extortionate pricing)... ;(
  • TheJian - Tuesday, October 29, 2013 - link

    NV hasn't made as much as they did in 2007 in 6 years. AMD, meanwhile, has lost $6 billion in 10 years. Neither is charging enough, or they would be making at least as much as they did in 2007 (well, maybe AMD would be making a yearly profit at least), which for NV was ~$850 million if memory serves. Let me know when they make more than in 2007 again; then you can say gouging is going on. Take a look at the 10-year summaries for Intel, Apple, and MSFT, which are all up 50% in profits since 2007 and have doubled or more their asset values. You could argue they are all gouging you, but not NV or AMD.
  • mfergus - Monday, October 28, 2013 - link

    AMD did the same thing when the 7970 was released. It's just business as usual.
  • Hrel - Monday, October 28, 2013 - link

    The GTX 760 needs a $50 price cut.
  • aakash_sin - Tuesday, October 29, 2013 - link

    +1
  • Conduit - Monday, October 28, 2013 - link

    The custom-cooled R9 290X editions will be the ones to go for; they will bring the temps back down to normal levels while significantly outperforming the GTX 780.

    I find it funny that Nvidia has bundled a bunch of poorly received games (reviews were generally not favorable for these games) and a Shield coupon. Let's face it, who the hell wants to stream PC games to a tiny little screen with poor controls when you can just connect an HDMI cable to a TV?
  • Sandcat - Monday, October 28, 2013 - link

    Exactly. As it is, the 290X can't overclock worth a crap, so a 780 with a nice OC negates any performance difference. A non-reference cooler should help ameliorate that.
  • The Von Matrices - Monday, October 28, 2013 - link

    So you're going to argue that AMD charging $30 for a BF4 bundle is better than NVidia giving away three recent AAA games for free?

    Also, how can you even say that the bundled games' reviews are generally not favorable? Splinter Cell: Blacklist and Batman: Arkham Origins got "generally favorable" reviews on Metacritic, and Assassin's Creed IV isn't even released yet. I don't get your argument.

    I also don't necessarily agree with the purpose of Shield, but you don't have to buy it so why complain about a discount?

    Custom cooled R9 290X's won't be here for a month or two. There's no point in waiting for one when you can get a $499 GTX 780 right now.
  • Conduit - Monday, October 28, 2013 - link

    I like the GTX 780 at $499, but personally, if I were to buy an enthusiast-class GPU, I would just get the 290X with the BF4 bundle for Mantle (which will allow it to embarrass the GTX 780 in Mantle-supported games), as well as for guaranteed future-proofing due to next-gen consoles having GCN GPUs. When all the next-gen games are released, AMD cards will have better performance.

    Also, I wasn't talking about Metacritic's "generally favourable" definition. I was talking about my definition: 79 is not a good score for me. I know it sounds very critical, but I probably wouldn't buy a game scoring less than 85, which is pretty much barely cutting it for me.

    I'm not saying people shouldn't buy 780s now, but I really dislike the fact that Nvidia was ripping consumers off, and I hope they lose profit this quarter as a lesson to them.
  • The Von Matrices - Monday, October 28, 2013 - link

    I appreciate the reasoned arguments, but I fail to see how free bundled games diminish a card's value, especially when the alternative is paying $30 extra for a bundle. I also think you are putting too much faith in Mantle, given that there is only one guaranteed game and just a promise from AMD that there will be more. If you are the type of person who replaces cards every 1 or 2 years, then even in the best-case scenario Mantle will only be hitting its prime once the card is obsolete.

    As far as ripping off consumers, I like lower prices too, but I dislike how the blame for the high prices is placed solely on NVidia. Sure, NVidia set the prices, but AMD kept the 7970 as their high end product for a ridiculous 22 months. At least NVidia refreshed their lineup with GK110. If there's anyone to blame for NVidia's prices it's AMD for waiting two years to replace the 7970.
  • EJS1980 - Monday, October 28, 2013 - link

    Splinter Cell: Blacklist has a Metacritic score of 85, Arkham Origins is around 80, and AC4 hasn't even been released/reviewed yet, even though anyone with a pulse could tell you that it will score pretty well (like EVERY AC to date!).
    So your ridiculous attempt to disparage the Nvidia games bundle is as petty as it is incorrect. :(
  • Footman36 - Monday, October 28, 2013 - link

    Just bought an EVGA GTX 780 Classified from Newegg for $579, down from $699, and it comes with the holiday season game bundle. That's what I call a good deal. The Classified is as fast as the stock R9 290X, and if I need more speed I guess I can step up next month to the GTX 780 Ti. Price drops available now on Newegg...
  • RoninX - Monday, October 28, 2013 - link

    And this is why, despite being an Nvidia fan, I'm happy that AMD remains competitive...
  • dwade123 - Monday, October 28, 2013 - link

    Anyone with a brain would just grab the more efficient GTX 780 over the power guzzler aka room heater that is the 290x.
  • Footman36 - Monday, October 28, 2013 - link

    Isn't that what they called the GTX 480 and 470 when they were released? :-)
  • Friendly0Fire - Monday, October 28, 2013 - link

    As with a lot of things, it's cyclical. New architecture means inefficiencies to iron out and doesn't bring out its maximal performance until a few generations later.

    It's no coincidence that the 400 series (the first series using Fermi, which was a brand new architecture for NVIDIA) was loud and consumed a lot of power, just as it is now for AMD with GCN. I expect AMD's power consumption to stabilize and drop in the next generation or two (compare with the 5000 series, which was in many ways the peak of AMD's old VLIW architecture and consumed very little power), just like I expect NVIDIA might fluctuate with their inevitable new architecture.

    We should be glad that this cyclical nature exists here as it's what's keeping either company in check in terms of pricing and innovation. The CPU market doesn't have that cycle and Intel's able to go crazy on prices as a result.
  • The Von Matrices - Monday, October 28, 2013 - link

    AMD had two years to perfect GCN between the initial 7970 release and the R9 290X release; GCN is far from new at this point. That's actually the same time frame NVidia had between Fermi and Kepler, and look what that brought.
  • Da W - Tuesday, October 29, 2013 - link

    If you live in Canada, a room heater like the 290X is not a bad choice.
  • krutou - Thursday, October 31, 2013 - link

    Sorry to burst your bubble, but gas heaters are more cost effective than electric heaters.
  • blitzninja - Saturday, November 2, 2013 - link

    Best post of the day. Laughed so hard at these two.
  • MrWizzy2002 - Monday, October 28, 2013 - link

    Does anyone know what kind of DRM the game bundle uses? I didn't see it in the bundle article. Origin, Steam, DRM-free?
  • zeock9 - Monday, October 28, 2013 - link

    Steam.
  • conorvansmack - Monday, October 28, 2013 - link

    Impatient GTX 770 owner says, "Dammit."
  • aakash_sin - Tuesday, October 29, 2013 - link

    :D
  • zeock9 - Monday, October 28, 2013 - link

    I think most of us are misinterpreting this price cut on the 780, assuming that nVidia is back in the p/$ game against AMD - they are still not.

    The 780 is going to be a direct competitor against the upcoming 290, not the 290X, and as such should still be a somewhat more expensive option than AMD's solution, which is expected to cost around $450.

    780Ti vs 290x = $150 premium over AMD
    780 vs 290 = ~$50 premium over AMD
  • Friendly0Fire - Monday, October 28, 2013 - link

    We don't know the performance of the Ti or the 290, so I'd suggest waiting before declaring anyone a winner...
  • zeock9 - Monday, October 28, 2013 - link

    Nothing I said implied anything about performance figures; however, what we do know is the price of each solution.

    I simply stated the fact that, from a price point perspective, nVidia still commands hefty premiums over their direct competition, unlike most of us who immediately jumped to the conclusion that nVidia's solutions are now cheaper - they are not.
  • The Von Matrices - Monday, October 28, 2013 - link

    We'll see about the performance of the unreleased cards, but from what I am reading in this thread there are a lot of people who don't want to wait for announced products to release before they buy (as evidenced by the numerous people complaining in this forum that they bought their 780's within the past month). To them, what matters is the price and performance now, and at the moment the GTX 780 is the high end card to get.
  • mfergus - Monday, October 28, 2013 - link

    If the 780 is faster than the 290, it deserves to cost more. It can't be overpriced if it's only $50 more but closer to the 290x in performance.
  • madwolfa - Monday, October 28, 2013 - link

    Thinking about what to do with my GTX 670 now (gaming on a single 1440p screen).
  • R-Type - Monday, October 28, 2013 - link

    The GTX 670 is a great performer for the power it consumes. Get another; you'll love gaming at 1440p in SLI!

    I feel bad for anyone who's bought a GTX 780 in the past month (HA HA!) but this is a necessary adjustment. The previous high price was an "early adopter" tax for the best equipment. NOW is the time to buy a GTX 780 if you're so inclined. The game bundle makes it an incredible value.
  • legalaliens5 - Monday, October 28, 2013 - link

    So we know the 2GB version of the GTX 770 will be $330, how much will the 4GB version be?
  • zeock9 - Monday, October 28, 2013 - link

    Just subtracting $70 from what they cost now would be your best estimate.
  • chizow - Monday, October 28, 2013 - link

    Well, that was certainly a swift and decisive reaction from Nvidia. Still, it pains me to see Nvidia basically charging $500 for what amounts to a GTX 465/GTX 560 Ti 448.
  • The Von Matrices - Monday, October 28, 2013 - link

    I'm assuming that you are arguing based on the idea that these are third-tier chips. But if you ask me, all that matters is price/performance, and the GTX 780 excels at that with no downsides (unlike the outrageous power consumption relative to performance of the GTX 465 and, to a lesser extent, the 560 Ti 448).
  • Vinny DePaul - Monday, October 28, 2013 - link

    I knew it. I have been waiting for the price drop. I was hoping for AMD but nVidia has better drivers.
  • Milan232 - Monday, October 28, 2013 - link

    The 290X has more memory and a wider memory bus than the 780. Cooling-wise, you can already purchase a waterblock for the 290X. A WB will bring better temps and overclocks than the 780. It's a great buy for $50 more tomorrow.
  • The Von Matrices - Monday, October 28, 2013 - link

    ...but a waterblock is another $150, so the price difference is really $200 in that case.
  • Milan232 - Monday, October 28, 2013 - link

    r9 290x water block http://www.performance-pcs.com/catalog/index.php?m...

    gtx 780 http://www.performance-pcs.com/catalog/index.php?m...

    R9 290X card $550 + waterblock $118 = $668
    GTX 780 card $500 + waterblock $115 = $615

    ~$50 difference; which would you buy, the 780 or the 290X?
  • Milan232 - Monday, October 28, 2013 - link

    $53 to be exact.
  • Milan232 - Monday, October 28, 2013 - link

    With shipping on some websites, it's probably more like a $65 difference.
  • mfergus - Monday, October 28, 2013 - link

    His point was you don't need a water block for a 780. With that money you could just get a 780 Ti.
  • krutou - Thursday, October 31, 2013 - link

    Problem is that the 780 doesn't need a waterblock to be a solid performer.
  • tonyn84 - Monday, October 28, 2013 - link

    Alright AMD, there's a nice sweet spot at $399 for the R9 290, you know you want to.
  • Laststop311 - Monday, October 28, 2013 - link

    That GTX 780 for $499 is an incredibly great value. It depresses me about the $1,050 I spent on the Titan the first week it came out, but that's the price you pay for being an early adopter. I avoid SLI/Crossfire setups at all costs; single GPUs perform more stably. But I have to give Nvidia credit: they did promise the Titan would be their highest-performing card for at least a full year, and I am glad they are holding to that. I spent a lot, but it gives me their best card for a good long while.
  • Laststop311 - Monday, October 28, 2013 - link

    And one more thing: the AMD cards may have the performance advantage, but they run hot as all hell and are noisy as hell. It's not worth the slight performance advantage to have a hurricane blowing inside your PC.
  • aakash_sin - Tuesday, October 29, 2013 - link

    $50 price cut on the nVidia GTX 760, please :|
    Its price is like an island in the sea...
  • tuklap - Tuesday, October 29, 2013 - link

    Nice Price "Upper Cuts" right there. I can see christmas to be very good for all of us. ^_^

    I just hope they'll turn the Shield discount into other game bundles. Who's buying the Nvidia Shield when you have a PC? Nvidia's CEO is not that wise, I guess.

    I wonder when this price cut will take effect? November? Just in time for Christmas?
  • tuklap - Tuesday, October 29, 2013 - link

    It's funny how people set their own standards for temperature; when the designer says it runs safely at this temperature, they will still say it is hot and not that good.

    I just hope non-reference designs come out that can maintain temperatures around 70 degrees, which would make the R9 290X a monster OCer.

    And yes, I guess AMD will have price cuts of their own on the 290X and 290 later this year or early next year, with the added benefit of games using Mantle; it's a clear winner for AMD.
  • joewaldo - Tuesday, October 29, 2013 - link

    "price drop" yup thats why its the same price it was before all over
  • Futureman666 - Tuesday, October 29, 2013 - link

    Finally. I was expecting an NVidia price drop way sooner than this, but they kept milking everyone.
    I know I will not buy an ATI card, that's for sure, for the moment. The R9 290X is nice but runs way too hot, and in the end some of the results are not impressive; maybe a good manufacturer will be able to provide better cooling for this BBQ card. The thing that does not correlate well with the price drop is the price of the GTX 770 4GB cards; they don't seem to have dropped and are way too close to the price of the new GTX 780, which is ridiculous for now. I'll wait a little more so the prices can adjust accordingly. I'm glad I waited, and thanks to ATI for making Nvidia sweat.
  • C'DaleRider - Tuesday, October 29, 2013 - link

    "...NVIDIA’s kicker – their superior build quality..."

    Thanks, Ryan, now we all know whose pocket you are in. Superior build quality... such a laugh. This comment was made despite AMD using more layers in their PCBs, caps of equal quality, historically faster memory from excellent vendors, and on and on.

    I don't think you can quantify that superior build quality except to bring up the cooler. And if that's all you have for superior build quality, then you are completely bought and paid for by Nvidia. (This being written by someone who bought a 780 and a Titan as his last two GPU purchases.)
  • krutou - Thursday, October 31, 2013 - link

    And then look back to rampant coil whine with reference 7950s and 7970s.
  • krutou - Thursday, October 31, 2013 - link

    It's hard to make unbiased judgments when you've never actually owned an AMD card.
  • krutou - Thursday, October 31, 2013 - link

    I'm pretty sure Ryan is talking specifically about the reference 780 build quality vs the reference R9-290X build quality. Nvidia clearly wins in this comparison.
  • rogerthat1945 - Thursday, October 31, 2013 - link

    Last week (and all last month, at least), the ASUS GTX 780 price was around $745 USD (in yen) on Amazon JP.

    I put one in my shopping basket and browsed some more for extra items (Zx Evo headset considerations), then heard about the NVidia price drop coming for the GTX 780 range, so I held off going to checkout. However, this morning when I went to pay at the advertised reduced price, I found that Amazon has JACKED UP the price to $867 US. :no:
    http://www.amazon.co.jp/ASUSTeK-GTX780%E3%83%81%E3...
    Question is:

    "Where can I buy this card for a `proper` price (which popular site) where they will POST it via Air Mail to Japan (not a US military address)? :ange:

    Every site I tried, from California to China, does not post to Japan.

    Amazon Japan, you are Kraaayyy-Zee crayon users. :pt1cable:
  • Milan232 - Thursday, October 31, 2013 - link

    Will a 290X with a waterblock give better performance than a 780 with a waterblock? I'm also confused about the whole 95-degree target. If I have a waterblock on it, will the temp still be 95 but with much better frames?
  • Milan232 - Thursday, October 31, 2013 - link

    If I have a 780 and a 290X both on waterblocks, which will be better in the end?
