Over the last couple of days, we have received information indicating that NVIDIA is not moving to HBM2 for its consumer GPUs (outside of some extremely high-end models). Instead, it appears to be focusing on the improvements found in GDDR5X and GDDR6. Conversely, AMD appears to be focusing on HBM for many of its high-end and even some mid-range cards. These two very different paths have sparked something of a debate among fans of both products (as you can imagine). The questions are: why choose one over the other at this point, and is HBM a truly viable option for AMD?
The experts have all weighed in: 2016 will be the year of Virtual Reality. The problem is that the experts are very often wrong. Still, that has not stopped multiple companies from pushing out new VR headsets, APIs, development kits and more. The craze has gone so far as to start affecting the way that companies design core hardware. We already know that AMD is pushing for VR mastery with new products and by showing which existing products also offer a level of VR support.
It is said that nature abhors a vacuum, and that is certainly true: something will come along to fill the void if we let nature take its course. Unfortunately, this law is a little mutated in the consumer electronics market, and especially in the PC component world. Here it reads: the market cannot stand not having an "It" technology, so we must create one. For the last few years we have been watching this happen.
On the 19th of January, Samsung announced that it had begun mass production of its 4GB HBM 2.0 3D memory. This announcement was the starting gun for the next big GPU race. As we know, both AMD and NVIDIA are racing to get viable products to market in time for Oculus and HTC to launch the consumer versions of their VR headsets. Until now we have really only seen the developer kits, and while these have been impressive, they are not what most are hoping for in the final product.
Yesterday we talked about the possibility that AMD will launch a dual-GPU R9 Fury X card geared for 4K and VR. This is certainly welcome news for most AMD fans and for fans of virtual reality. It was no coincidence that the first time we saw this card in operation was at a big VR event in LA, or that the launch is rumored to coincide with the launch of the Oculus and HTC Vive headsets. This move would put a very high-end AMD card on the market around April/May of this year.
Remember that little patent squabble that NVIDIA and Samsung got into last year? Well, some things have happened, and they are not all that good for NVIDIA. If you have already forgotten about this incident (we do not blame you), we will fill you in. NVIDIA decided to file a complaint with the ITC against Samsung and Qualcomm. The claim was that Samsung was using technology that violated patents NVIDIA owned (programmable shaders, parallel processing, etc.). NVIDIA also filed a patent lawsuit at the same time.
The average GPU is a pretty powerful computational device. Its highly parallel design and efficient memory structure mean that it can execute operations at a rate that puts most CPUs to shame. With the advent of CUDA and OpenCL, the door was opened for developers to push workloads to the GPU and get back some pretty nice returns. Microsoft and many others joined in and began making access to the GPU simpler, starting with DirectX 10.
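To give a sense of what "pushing a workload to the GPU" looks like in practice, here is a minimal CUDA sketch of a vector addition. This is a generic, hypothetical example (the array names and sizes are ours, not from any product discussed here): every element is handled by its own GPU thread, which is exactly the kind of highly parallel work that GPUs finish far faster than CPUs.

```cuda
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Kernel: each GPU thread adds one pair of elements.
// Thousands of these threads execute in parallel.
__global__ void vecAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                 // one million elements
    size_t bytes = n * sizeof(float);

    // Host-side buffers with some sample data.
    float *ha = (float *)malloc(bytes);
    float *hb = (float *)malloc(bytes);
    float *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Allocate device memory and copy the inputs over.
    float *da, *db, *dc;
    cudaMalloc(&da, bytes);
    cudaMalloc(&db, bytes);
    cudaMalloc(&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    vecAdd<<<(n + 255) / 256, 256>>>(da, db, dc, n);

    // Copy the result back to the host.
    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f\n", hc[0]);

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}
```

The pattern (copy in, launch many threads, copy out) is the same whether the workload is graphics, physics, or general compute; OpenCL and DirectCompute expose the same idea through different APIs.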
As most people are aware, AMD shipped the first GPUs to utilize HBM (High Bandwidth Memory). These GPUs use a form of HBM called 2.5D, which requires an interposer layer that both the memory and the GPU sit on. This is opposed to a true 3D stack, in which the memory sits directly on top of the processor that owns it. Traditional 3D stacking provides significant performance benefits, but it would require a different chip for every memory density you plan on releasing. In the GPU world this can be a big problem, which is why both AMD and NVIDIA have opted for the 2.5D method.
It seems that at least one person is rather annoyed at AMD for claims it made about certain FX series CPUs running Bulldozer cores. On November 4th, the news went out that Tony Dickey had filed a class-action lawsuit on behalf of himself and others. The suit was actually filed back on October 26th and alleges violations of the Consumer Legal Remedies Act, which covers misrepresentation and false advertising. Dickey alleges that AMD knowingly misled consumers about the number of functional cores Bulldozer CPUs have: AMD claims that Bulldozer has eight independent cores, while Dickey says that only four are functional.
The idea of the “cloud” is nothing new and has, in fact, been around for a number of years in one form or another. The concept goes back to the use of small “dumb” terminals that were nothing more than display devices for computing done in a central location. After it became possible to put more power into the systems we used, the cloud faded into the background. With the production of mobile devices that did not typically have the same power and capacity as a desktop, the cloud returned. It had a major resurgence when the smartphone and tablet leaped onto the scene, and now it seems that everything is trying to become cloud based, including gaming.