This ability to shift manufacturing processes for R&D while maintaining production levels has always been one of the strengths Intel has used to maintain market dominance. AMD, on the other hand, gave up that ability when it sold off its fabs to recover from the ATi purchase. However, Intel does have some pretty big obstacles to overcome.
As anyone that has read the news lately knows, Intel's Ivy Bridge is getting some bad press for heat issues when overclocking. Some are still claiming the culprit is the thermal interface material Intel used, while others are pointing at process issues. Of the two, we side with the latter. You see, as you shrink the process you run into what is called current leakage (also known as semiconductor leakage).
Current leakage is the small amount of current (measured in microamperes) that leaks out of the transistors, the interconnects, and even the source and drain terminals of a semiconductor. To combat this in its 32nm process, Intel used high-k metal gate transistors. At 22nm, Intel bet that 3D (tri-gate) transistors, in combination with high-k gates, would be the key to reducing leakage. However, no matter how you cut it, shrinking the process brings leakage, and that leakage increases exponentially as you force more current into the system (overclocking) and the heat in the silicon rises. The two are now so interconnected that the harder you push, the more you leak.
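To get a feel for how heat and leakage feed each other, here is a rough sketch using the textbook subthreshold-leakage approximation. Every constant in it (the threshold voltage, its temperature slope, the ideality factor, the reference current) is an assumed round number for illustration, not an Intel process figure:

```python
import math

# Boltzmann constant divided by electron charge, in V/K
K_B_OVER_Q = 8.617e-5

def thermal_voltage(temp_c: float) -> float:
    """kT/q in volts at the given die temperature."""
    return K_B_OVER_Q * (temp_c + 273.15)

def leakage_current(temp_c: float, vth: float = 0.35, n: float = 1.5,
                    i0: float = 1e-6) -> float:
    """Relative subthreshold leakage of an 'off' transistor (Vgs = 0).

    Classic approximation: I ~ i0 * exp(-Vth / (n * kT/q)).
    Vth itself drops as the die heats up (assumed -1 mV/degree C here),
    so leakage rises with temperature on two fronts at once.
    """
    vth_t = vth - 0.001 * (temp_c - 25.0)
    return i0 * math.exp(-vth_t / (n * thermal_voltage(temp_c)))

if __name__ == "__main__":
    base = leakage_current(40)
    for t in (40, 60, 80, 100):
        print(f"{t:3d} C -> relative leakage {leakage_current(t) / base:.1f}x")
```

Even with these toy numbers, the exponential does the talking: letting the die climb from 40 C to 100 C multiplies leakage by an order of magnitude, and that extra leakage is itself more heat, which is exactly the feedback loop overclockers run into.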
For many this is the obvious issue with Ivy Bridge and its overclocking/heat problems. If you consider that you have to dump in 1.5V+ to get to 4.9-5GHz, you are going to increase both heat and leakage. Conversely, the more you cool the CPU the less it leaks, which gives you more stability. With the mature 32nm process we saw in Sandy Bridge, getting to 5GHz+ with 1.45-1.475V was not uncommon.
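A little back-of-the-envelope math shows why that extra voltage hurts so much. CMOS dynamic power scales roughly as C·V²·f, so here is a quick comparison of an assumed stock operating point of about 1.05V at 3.5GHz against a 1.50V, 5GHz overclock (illustrative numbers, not measurements of any specific chip):

```python
def dynamic_power_ratio(v_new: float, f_new: float,
                        v_ref: float, f_ref: float) -> float:
    """Relative CMOS dynamic power: P ~ C * V^2 * f, with C held constant."""
    return (v_new / v_ref) ** 2 * (f_new / f_ref)

# Assumed stock point: ~1.05 V at 3.5 GHz; overclock: 1.50 V at 5.0 GHz.
ratio = dynamic_power_ratio(1.50, 5.0, 1.05, 3.5)
print(f"~{ratio:.1f}x the dynamic power")
```

That works out to nearly three times the stock dynamic power, and it is only the switching half of the story: the leakage component rises on top of it, and faster still, as the die heats up.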
So this is what Intel is facing as it moves down to 14nm and below. Of course, it does have some new materials to play with, like carbon nanotubes and graphene, and both should be available before long. The last time we asked Intel about these (in 2007 or so), it said they were 5-8 years away from being viable for CPU manufacturing. That puts them in the 2013-2015 timeframe, which is right around when Intel expects to release CPUs at 14nm and below. We are guessing that new CPUs will feature nanotubes for interconnects and graphene as the CPU material. We are just interested in seeing how Intel implements this moving forward.
Discuss this in our Forum