Tuesday, 20 June 2017 15:08

Is HBM a viable technology for GPUs? Yes, yes it is… just not right now


Over the last couple of days, we have received information that would indicate nVidia is not moving to HBM2 for their consumer GPUs (outside of some extremely high-end models). Instead, they appear to be focusing on the improvements found in GDDR5X and GDDR6. Conversely, AMD appears to be focusing on HBM for many of their high-end and even some mid-range cards. The two very different paths have sparked something of a debate amongst fans of both products (as you can imagine). The questions are: why choose one over the other at this point, and is HBM a truly viable option for AMD?

When AMD made the decision to move to HBM, there was a lot of chatter about potential performance improvements and how the move would position AMD above nVidia. The reality was that while there was a significant performance gain to be had by moving to HBM, AMD was plagued with supply issues which made the cards very hard to get. This gave nVidia time to respond, and to do so without having to fight through the supply challenges that HBM brings to the table. We saw high-end cards leveraging GDDR5X to pull off some amazing performance, although at higher prices in the end.

AMD, for their part, fell back on their old stand-by: offer the same or similar performance at a lower price. The logic seemed to be that this would keep their cards positioned as a lower-cost alternative to nVidia for gaming and for VR/AR while the headset makers worked on reducing prices. This makes sense on the surface, as an AMD-based system will cost you much less than an Intel/nVidia one if you are looking at VR/AR. The problem is that VR/AR is just not there yet for most consumers. The price of the headsets, as well as the changing landscape, means it is not the driving force behind most PC builds. Pure gaming performance is what the majority is looking for at this point.

Now this does not mean there is no market for AMD; quite the opposite, in fact. AMD GPUs, when combined with HBM, are well positioned for cryptocurrency mining. They do a great job compared to nVidia and are currently in high demand among people looking to convert PCs into mining machines. Some say that buying these cards is too much of an investment compared to custom ASICs, but AMD is not going to warn potential buyers away from the endeavor.

Now that I have digressed far off of the point, I will try to get back on topic.

HBM as a technology is fantastic. It provides a significant increase in bandwidth over GDDR of any flavor and can increase the overall throughput of data for the right kind of processor. It marries up very well with the highly parallel GPUs from AMD and allows their full potential to be realized. Moving to HBM was, and still is, a no-brainer for AMD, which is where the challenge lies. AMD's high-end GPUs would probably not do well with GDDR5(X), so they have almost no choice when it comes to memory. nVidia, on the other hand, has options due to their current GPU designs. They can use GDDR5, GDDR5X and GDDR6 to keep high-performing products in consumers' hands and take advantage of HBM only in the segments where they want it. AMD, currently, does not have that option if they want to remain performance competitive.
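To put the bandwidth claim in rough numbers: peak memory bandwidth is simply bus width times per-pin data rate. Here is a quick back-of-the-envelope sketch (the card figures below are the published specs for the Fury X and GTX 1080; the helper function itself is our own illustration):

```python
# Peak memory bandwidth (GB/s) = bus width (bits) x per-pin rate (Gbps) / 8
def peak_bandwidth_gbs(bus_width_bits: int, pin_rate_gbps: float) -> float:
    return bus_width_bits * pin_rate_gbps / 8

# HBM1 as used on AMD's Fury X: four 1024-bit stacks at 1 Gbps per pin
print(peak_bandwidth_gbs(4 * 1024, 1.0))   # 512.0 GB/s

# GDDR5X as used on nVidia's GTX 1080: 256-bit bus at 10 Gbps per pin
print(peak_bandwidth_gbs(256, 10.0))       # 320.0 GB/s

# A single HBM2 stack: 1024 bits at 2 Gbps per pin
print(peak_bandwidth_gbs(1024, 2.0))       # 256.0 GB/s
```

The wide-but-slow HBM bus is also part of why it pairs so well with highly parallel GPUs: many concurrent memory channels feed many compute units at a lower clock, and at lower power per bit, than narrow, fast GDDR can manage.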

As we said last year, HBM supply issues are going to impact adoption of this technology for GPUs and eventually CPUs. Until these are addressed, we would not expect to see widespread adoption. AMD, on their own, do not generate enough sales pressure to make manufacturers want to fix things just yet. With improvements in Ryzen and the potential gaming impact of Threadripper (with its 64 PCIe 3.0 lanes), things could change as consumers see the advantage of an all-AMD platform. Sadly, those changes would not really have an impact on HBM supply until 2H 2018 at the earliest. Still, if AMD can grab some market share from both Intel and nVidia over the next few months, we could see a significant shift in the ecosystem for future AMD GPUs. This, in turn, would force nVidia to respond, and we all win.
