Now that Nvidia have finally pulled their heads out of RTX, they’re working on something for the masses. Not too long ago we found out that the GTX 1660 Ti was in the pipeline from Nvidia. Soon after, the tech giant confirmed that they are indeed working on a GTX 16XX series. The series aims to be the replacement for the GTX 10 series; however, we have yet to see any GTX 1670 or GTX 1680 cards. Perhaps they decided that anyone who could opt for such cards would naturally go for RTX rather than GTX. As for what we do know, the GTX 1660 Ti will be launching tomorrow at $279. The month after that will see the launch of the GTX 1660 at a lower price of $250. What’s interesting is that there are whispers in the tech world of yet another GTX 16 card joining the lineup.
The GTX 1650
Twitter user and hardware informant APISAK has tweeted about a GTX 1650 card with a base clock of 1,485 MHz. He also responded to a comment asking about the VRAM of the card, saying it would have 4 GB.
Base clock 1,485 MHz
— APISAK (@TUM_APISAK) February 21, 2019
If all this is true, then it looks like Nvidia are ready to make a significant impact on the budget market. If this card gets priced at anything below $200, it will make for very interesting market readings. Furthermore, performance is expected to be much better than its predecessor’s, because the GTX 1650 will be sporting the new Turing architecture. Like its bigger brothers, however, it will have no ray tracing or tensor cores of any sort; that’s a feature reserved for the RTX club.
There are no leaks or confirmations on core count and power draw at this time. However, speculation suggests that the GTX 1650 will have somewhere between 890 and 1,050 cores. Furthermore, as this is a budget card, it is expected to go into budget builds with budget power supplies. That being said, it should not draw more than the standard 75 W found on its predecessors.
The card is expected to be unveiled next month alongside the GTX 1660, so you can see it for yourselves then. Until then, let’s see if AMD can come up with an interesting rebuttal.