The NVIDIA GeForce GTX 1650 Super Review, Feat. Zotac Gaming: Bringing Balance To 1080p
by Ryan Smith on December 20, 2019 9:00 AM EST
Meet The ZOTAC Gaming GeForce GTX 1650 Super
Since this latest GTX 1650 series card launch is a virtual launch like the others, the board partners are once again stepping up to the plate to provide samples. For the GTX 1650 Super launch, we received Zotac’s Gaming GeForce GTX 1650 Super card, which is a fairly straightforward entry-level card for the series.
GeForce GTX 1650 Super Card Comparison

| | GeForce GTX 1650 Super | Zotac Gaming GeForce GTX 1650 Super |
|---|---|---|
| Memory Clock | 12Gbps GDDR6 | 12Gbps GDDR6 |
| GPU Power Limit | 100W | 100W |
| Cooler Type | N/A | Open Air, Dual Fan |
For their sole GTX 1650 Super card, Zotac has opted to keep things simple, not unlike their regular GTX 1650 cards. In particular, Zotac has designed their card to maximize compatibility, even going as far as advertising it as being compatible with 99% of systems. The end result is that rather than building a large card that may not fit everywhere, Zotac has gone with a relatively small 6.2-inch long card that would be easily at home in a Mini-ITX system build.
Fittingly, there is no factory overclock to speak of here. With GPU and memory speeds identical to NVIDIA’s reference specifications, Zotac’s card is as close as you can get to an actual reference card. Which is very fitting for our generalized look at the GeForce GTX 1650 Super as a whole.
Digging down, we start with Zotac’s cooler. The company often shifts between single fan and dual fan designs in this segment of the market, and for the GTX 1650 Super they’ve settled on a dual fan design. Given the overall small size of the card, the fans are equally small, with a diameter of just 65mm each. This is something to keep in mind for our look at noise testing, as small fans are often a liability there. Meanwhile the fans are fed by a single 2-pin power connector, so there isn’t any advanced PWM fan control or even RPM monitoring available for the fan. In this respect it’s quite basic, but typical for an NVIDIA xx50 series card.
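Because the card’s 2-pin fan connector provides neither PWM control nor a tachometer signal, the driver has no fan speed to report. As a quick way to see what a given card does expose, here is a minimal sketch that parses the CSV output of NVIDIA’s `nvidia-smi` tool (the `--query-gpu` and `--format=csv` options are real, documented flags; `nvidia-smi` prints "[N/A]" for sensors it cannot read, and the sample string below is hypothetical):

```python
import subprocess

def query_fan_speed(csv_text=None):
    """Parse `nvidia-smi --query-gpu=name,fan.speed --format=csv,noheader`
    output. Returns a list of (name, fan_speed_or_None) tuples; a card
    whose fan reports no telemetry shows up as "[N/A]", mapped to None."""
    if csv_text is None:
        # Query the live system when no captured output is supplied.
        csv_text = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=name,fan.speed",
             "--format=csv,noheader"], text=True)
    results = []
    for line in csv_text.strip().splitlines():
        name, fan = (field.strip() for field in line.split(","))
        results.append((name, None if fan == "[N/A]" else fan))
    return results

# Hypothetical captured output for a card with no fan telemetry:
sample = "NVIDIA GeForce GTX 1650 SUPER, [N/A]"
print(query_fan_speed(sample))  # [('NVIDIA GeForce GTX 1650 SUPER', None)]
```

On cards with a standard 4-pin fan header, the same query returns a duty-cycle percentage instead of "[N/A]".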
Underneath the fans is an aluminum heatsink that runs most of the length of the card. With a TDP of just 100 Watts – and no option to further increase the power limit – there’s no need for heatpipes or the like here. Though the heatsink’s base is big enough that Zotac has been able to cover both the GPU and the GDDR6 memory, bridging the latter via thermal pads. The fins are arranged vertically, so the card tends to push air out of the top and bottom.
The small PCB housing the GPU and related components is otherwise unremarkable. Zotac has done a good job here of seating such a large GPU without requiring a larger PCB. As we usually see for such short cards, the VRM components have been moved up to the front of the board. The MOSFETs themselves are covered with a small aluminum heatsink, though with most of the airflow from the fans blocked by the primary heatsink, I don’t expect the VRMs are getting much in the way of airflow.
For power, the card relies on a 6-pin external PCIe power cable, as well as PCIe slot power. The power connector is inverted – that is, the tab is on the inside of the card – which helps to keep it clear of the shroud, but may catch system builders (or video card editors) off-guard the first time they install the card.
Finally for hardware features, for display I/O we’re looking at the same configuration we’ve seen in most GTX 1650 cards: a DisplayPort, an HDMI port, and a DL-DVI-D port. While DVI ports have long been banished from new products, there are still a lot of DVI monitors out there, particularly in China where NVIDIA’s xx50 cards tend to dominate. The tradeoff, as always, is that the DVI port is taking up space that could otherwise be filled by more DisplayPorts, so you’re only going to be able to drive up to two modern monitors with Zotac’s GTX 1650 Super. Of course, one could argue that a DL-DVI port shouldn’t even be necessary – this lower-end card isn’t likely to be driving a 1440p DL-DVI display – but I suspect this is a case where simplicity wins the day.
Comments
StevoLincolnite - Sunday, December 22, 2019 - link
But if you only paid $100 and it only lasts for a couple years, it's still worth it for that tier of performance, no?
Yojimbo - Monday, December 23, 2019 - link
We don't have any data on this, which is why I would avoid a used mining card. We'll never get any data on this, either. The thing is, although one can't tell a well-cared-for gaming card from one not well-cared for, over the years a general knowledge of the expectation of a used part has been built up. In the case of mining it is a big unknown in my view. You don't really know which cards are mining cards and which are gaming, so any card that has been popular with miners is suspect, in my view, unless you know who you are buying from.
eastcoast_pete - Sunday, December 22, 2019 - link
Which poses this question: Is there a program ("app") that can run a health check on a card? In addition to any "custom BIOS", I would also be concerned about simple aging with intense, ongoing use. When manufacturers bin chips and assign them to target speeds, they supposedly do so also based on life expectancy, at least for CPUs. So, is there a way to test how much life the GPU and RAM of a card have left in them?
flyingpants265 - Sunday, December 22, 2019 - link
The elephant in the room for the RX580 8GB, and AMD videocards in general, is the almost 200W power draw on a "1080p card", whereas the 1650 uses 75W. It may really suck for the price, but it uses less than half the power. Obviously the RX570 is a great choice as well.
Then there's reliability. I've seen statistics from Puget Systems and some big online retailer, and AMD had some obscenely high failure rates. AMD is a much smaller company, they might have less oversight, and heat causes a lot of damage to complex electronics. Not exactly reliable info but I wouldn't really be surprised if it were somewhat accurate. I believe all consumer products are cheaply made, so I'd rather go with the lower-power, lower-heat, larger company. Too bad I don't have any hard data to back that up.
Not really interested in anecdotal evidence either.
Spunjji - Monday, December 23, 2019 - link
"Too bad I don't have any hard data to back that up.
Not really interested in anecdotal evidence either."
Next time start with that pitch, so folks can ignore the self-confessedly uninformed speculative rambling that follows.
If the power draw is a bother on the RX580, a little undervolting will go a very long way without noticeably affecting frame rates at 1080p. It'll also help with longevity. Regardless, none of this is particularly crucial when you're saving ~$50 and getting a faster card with more VRAM.
Spunjji - Monday, December 23, 2019 - link
RX 580 is still good for 1440p too, if you're not obsessed with hitting "Max" on every setting just because it's there.
khanikun - Monday, December 30, 2019 - link
Reminds me of back in the day, when I moved to ATI for a very short while. 9700 Pro. Started overheating after a year, then broke. 9600 XT as temporary card. Started overheating in less than a year. 9800 XT. Started overheating in a couple months. Went back to Nvidia and haven't had a reason to look at ATI/AMD cards since.
Qasar - Wednesday, January 1, 2020 - link
i have some of those cards from then, 9800pro, 9600pro, used them not too long ago to see if they still work.. and they still do.. hehehehe
WetKneeHouston - Monday, January 20, 2020 - link
I think it makes sense to think that heat and power would lead to reliability issues. That's why I went with the 1650 Super. It's still too powerful of a card for the low tier (I can't notice the difference with higher settings, I suspect it's a scam lol) 1080p gaming I do, so I probably should have gotten the regular 1650, but they're basically the same price.
Yojimbo - Friday, December 20, 2019 - link
Yeah maybe if it cost $50 it would be worth it for running dosbox.