The NVIDIA GeForce GTX 1650 Super Review, Feat. Zotac Gaming: Bringing Balance To 1080p
by Ryan Smith on December 20, 2019 9:00 AM EST
Meet The ZOTAC Gaming GeForce GTX 1650 Super
Since this latest GTX 1650 series card launch is a virtual launch like the others, the board partners are once again stepping up to the plate to provide samples. For the GTX 1650 Super launch, we received Zotac’s Gaming GeForce GTX 1650 Super card, which is a fairly straightforward entry-level card for the series.
| GeForce GTX 1650 Super Card Comparison | GeForce GTX 1650 Super (Reference) | Zotac Gaming GeForce GTX 1650 Super |
|---|---|---|
| Memory Clock | 12Gbps GDDR6 | 12Gbps GDDR6 |
| GPU Power Limit | 100W | 100W |
| Cooler Type | N/A | Open Air, Dual Fan |
For their sole GTX 1650 Super card, Zotac has opted to keep things simple, not unlike their regular GTX 1650 cards. In particular, Zotac has designed the card to maximize compatibility, even going as far as advertising it as being compatible with 99% of systems. The end result is that rather than building a large card that may not fit everywhere, Zotac has gone with a relatively small 6.2-inch long card that would be easily at home in a Mini-ITX system build.
Fittingly, there is no factory overclock to speak of here. With GPU and memory speeds identical to NVIDIA’s reference specifications, Zotac’s card is as close as you can get to an actual reference card. Which is very fitting for our generalized look at the GeForce GTX 1650 Super as a whole.
Digging down, we start with Zotac’s cooler. The company often shifts between single fan and dual fan designs in this segment of the market, and for the GTX 1650 Super they’ve settled on a dual fan design. Given the overall small size of the card, the fans are equally small, with a diameter of just 65mm each. This is something to keep in mind for our look at noise testing, as small fans are often a liability there. Meanwhile the fans are fed by a single 2-pin power connector, so there isn’t any advanced PWM fan control or even RPM monitoring available for the fan. In this respect it’s quite basic, but typical for an NVIDIA xx50 series card.
Underneath the fans is an aluminum heatsink that runs most of the length of the card. With a TDP of just 100 Watts – and no option to further increase the power limit – there’s no need for heatpipes or the like here. Though the heatsink’s base is big enough that Zotac has been able to cover both the GPU and the GDDR6 memory, bridging the latter via thermal pads. The fins are arranged vertically, so the card tends to push air out of the top and bottom.
The small PCB housing the GPU and related components is otherwise unremarkable. Zotac has done a good job here seating such a large GPU without requiring a larger PCB. As we usually see for such short cards, the VRM components have been moved up to the front of the board. The MOSFETs themselves are covered with a small aluminum heatsink, though with most of the airflow from the fans blocked by the primary heatsink, I don’t expect the VRMs are getting much in the way of airflow.
For power, the card relies on a 6-pin external PCIe power cable, as well as PCIe slot power. The power connector is inverted – that is, the tab is on the inside of the card – which helps to keep it clear of the shroud, but may catch system builders (or video card editors) off-guard the first time they install the card.
Finally for hardware features, for display I/O we’re looking at the same configuration we’ve seen on most GTX 1650 cards: a DisplayPort, an HDMI port, and a DL-DVI-D port. While DVI ports have long been banished from new products, there are still a lot of DVI monitors out there, particularly in China where NVIDIA’s xx50 cards tend to dominate. The tradeoff, as always, is that the DVI port is taking up space that could otherwise be filled by more DisplayPorts, so you’re only going to be able to drive up to two modern monitors with Zotac’s GTX 1650 Super. Of course, one could argue that a DL-DVI port shouldn’t even be necessary – this lower-end card isn’t likely to be driving a 1440p DL-DVI display – but I suspect this is a case where simplicity wins the day.
Comments
Korguz - Sunday, December 22, 2019
why do you think the games will target ps4 ?? is this just your own opinion??
Kangal - Sunday, December 22, 2019
Because there's a lot of PS4 units hooked up to TVs right now, and they will still be hooked up until 2022. When the PS4 launched, the PS3 was slightly ahead of the Xbox 360, yet sales were nothing like the PS4's. And the PS3 was very outdated back in 2014, whereas in 2020, the PS4 is not nearly as outdated... so there's more longevity in there.
So with all those factors and history, there's a high probability (certainty?) that Game Publishers will still target the PS4 as their baseline. This is good news for Gaming PC's with only 8GB RAM and 4GB VRAM, and performance below that of a RX 5700. Regardless, it's always easier to upgrade a PC's GPU than it is to upgrade the entire console.
...that's why Ryan is not quite right
Korguz - Sunday, December 22, 2019
um yea ok sure... and you have numbers to confirm this ?? seems plausible, but also, just personal opinion
Kangal - Monday, December 23, 2019
During the launch of the PS4 back in 2014, the older PS3 was 8 YEARS OLD at the time, and hadn't aged well, but it did a commendable sales of 85 Million consoles.
I was surprised by the Xbox 360 which was 9.5 YEARS OLD, which understandably was more outdated, and it did a surprising sales of 75 Million consoles.
Because both consoles weren't very modern/quite outdated, and marketing was strong, the initial sales of the PS4 and Xbox One were very strong in 2014. Despite this there were about another 5 Million budget sales of the PS3 and Xbox 360 made in this period. And it took until Early-2016 for Game Publishers to ditch the PS3 and Xbox 360. So about 1.5 Years, and about 40 Million sales (PS4) or 25 Million sales (Xbox One) later. During this period people using 2GB VRAM Graphic Cards (GTX 960, AMD R9 370X) were in the clear. Only after 2016 were they really outdated, but it was a simple GPU Swap for most people.
So that's what happened, that's our history.
Now let's examine the current/upcoming events!
The PS4 has sold a whopping 105 Million consoles, and the Xbox One has a commendable 50 Million units sold. These consoles should probably reach 110 Million and 55 Million respectively when the PS5 and Xbox X release. And within 2 years they will probably settle on a total of 120 Million and 60 Million sales total. That's a huge player base for companies to ignore, and is actually better than the previous generation. However, this current gen will have both consoles much less outdated than the previous gen, and it's understandable since both consoles will only be 6 YEARS OLD. So by the end of 2022, it should (will !!) be viable to use a lower-end card, something that "only" has 4GB VRAM such as the RX 5500XT or the GTX 1650-Super. And after that, it's a simple GPU Swap to fix that problem anyway so it's no big deal.
Ryan thinks these 4GB VRAM cards will be obsolete within 6 Months. He's wrong about the timing. It should take 2 Years, or about x4 as much time. If he or you disagree, that's fine, but I'm going off past behavior and other factors. I will see Ryan in 6 Months and see if he was right or wrong.... if I remember to revisit this article/comment that is : )
Korguz - Monday, December 23, 2019
and yet... i know some friends that sold their playstations.. and got xboxes... go figure....
for game makers to make a game for a console to port it to a comp = a crappy game for the most part.. supreme commander 2, is a prime example of this....
flyingpants265 - Sunday, December 22, 2019
Most benchmarks on this site are pretty bad and missing a lot of cards.
Bench is OK but the recent charts are missing a lot of cards and a lot of tests.
PCPartPicker is working on a better version of Bench, they've got dozens of PCs running benchmarks, 24/7 year-round, to test every possible combination of hardware and create a comprehensive benchmark list. Kind of an obvious solution, and I'm surprised nobody has bothered to do this for... 20-30 years or longer..
Korguz - Sunday, December 22, 2019
hmmmmmm could it be because of, oh, let me guess... cost ?????????????????
sheh - Saturday, December 21, 2019
In the buffer compression tests the 1650S fares worse than both the non-S cards and the 1050 Ti.
Curiously, the 1660S is even worse than the 1650S.
catavalon21 - Saturday, December 21, 2019
Guessing it's ratio differences not rated to absolute performance. A more comprehensive chart in BENCH of the INT8 Buffer Compression test shows the 2080Ti with a far lower score than any of the recent mid-range offerings.
catavalon21 - Sunday, December 22, 2019
* not related to