AMD to de-couple Moore's Law for efficient energy in future information technology
Advanced Micro Devices (AMD) has announced an ambitious goal to increase the 'typical use' energy efficiency of its entire line of mobile processors by 25 times from 2014 to 2020.
As we obsessively use our handheld devices, few of us think of the amazing IT infrastructure that it takes to put all of that information into the palm of our hands. And even fewer people think of all the power it takes to run the infrastructure.
The closest many of us come to considering IT energy efficiency is hoping that the battery in our devices will make it to the end of the day. From streaming videos or music, to sharing photos, social media, tracking our workouts and reviewing restaurants, we are more connected than ever before. And there is no end in sight.
Smartphones took the world by storm starting with the introduction of the iPhone in 2007. As developers made applications for anything we could think of, our smartphones became an indispensable part of our lives.
For example, as reported in 2012, an astonishing 90 percent of 18 to 29-year-olds sleep with their smartphones. Very quickly, computing devices are becoming part of the fabric of everyday life.
Next on the horizon are wearable technologies such as Google Glass, smartwatches and an assortment of fitness and health-monitoring devices, all vying for a foothold in the market.
And all of this is happening at the dawn of the super-connected 'Internet of Things', with a multitude of devices or appliances being Internet-connected, and 'Surround Computing' where we will be immersed in computational power that will anticipate our needs and seamlessly deliver the information relevant to our environment.
Surround Computing is really a superset of the Internet of Things because it also describes how we will interact naturally with the technology and how the technology will enable us in ways that are still unfolding.
But, as technology enables us in these new and exciting ways, important questions are being raised about the energy needed to power this growing infrastructure.
The energy used to power IT is large and growing
According to the MIT Energy Initiative, 'worldwide, 3 billion personal computers use more than 1 percent of all energy consumed, and 30 million computer servers use an added 1.5 percent of all electricity at an annual cost of US$14-18 billion. Expanded use of the Internet, smartphones and the network to connect everything is causing all of those numbers to escalate.'
Similarly, the United States Department of Energy estimates that, for the US, 'IT and telecommunications facilities annually consume roughly 120 billion kilowatt hours of electricity, or 3 percent of all US electricity use.'
Energy efficiency and IT
The good news is that, as discussed by Dr. Jonathan Koomey in MIT Technology Review, 'as the performance of computers has shown remarkable and steady growth, doubling every year-and-a-half since the 1970s, the electrical efficiency of computing [the number of computations that can be completed per kilowatt-hour of electricity used] has also doubled every year-and-a-half since the dawn of the computer age.'
Passively cooled laptops, mobile phones and tablets are part of this trend, which has led to rapid reductions in the power consumed by battery-powered computing devices.
Dr. Koomey also noted that 'the power needed to perform a task requiring a fixed number of computations has been observed to fall by half every one-and-a-half years [or a factor of 100 every decade].'
If this sounds familiar, it should. This is the same exponential improvement trend that Gordon Moore observed in 1965, now widely known as Moore's Law.
Moore's Law accurately predicted that the number of transistors on a central processing unit (CPU) would double every two years. Peak power efficiency follows the same pattern because, as we pack more transistors into a processor, the distance that electricity has to travel through the device shrinks, transmission speeds increase, and the power needed to perform a given unit of computing falls.
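The two doubling rates at work here differ slightly: transistor counts double roughly every two years, while computations per kilowatt-hour double roughly every year and a half. A quick back-of-the-envelope sketch (illustrative arithmetic only, with figures taken from the trends quoted above) shows why the efficiency trend compounds to about a factor of 100 per decade:

```python
# Illustrative arithmetic only: Moore's Law (transistors doubling every
# ~2 years) versus the observed efficiency trend (computations per kWh
# doubling every ~1.5 years).
def doublings(years, period):
    """Growth factor after `years`, given one doubling per `period` years."""
    return 2 ** (years / period)

# Over one decade:
transistor_growth = doublings(10, 2.0)   # ~32x more transistors
efficiency_growth = doublings(10, 1.5)   # ~100x computations per kWh
print(f"{transistor_growth:.0f}x transistors, {efficiency_growth:.0f}x efficiency")
```

The ~100x-per-decade result matches the "factor of 100 every decade" figure Dr. Koomey cites.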
But, this near-steady pace of improvement in energy efficiency has actually been slowing over the last decade and now significantly trails the Moore's Law prediction. The question now is: how best to get back on track?
De-coupling from Moore's Law: The future of energy-efficient IT
Going forward, the power efficiency of IT is expected to continue to improve but in very different ways.
For example, AMD recently announced an ambitious goal to increase the 'typical use' energy efficiency of its entire line of mobile processors by 25 times from 2014 to 2020. It plans to achieve this through a combination of accelerated performance and reductions in energy consumption.
If this goal is reached, it means that, in 2020, computers running on AMD technology could accomplish a computing task in one fifth of the time of today's PCs while consuming less than one fifth of the power on average.
To put this in perspective, try to envision getting the same performance improvement in the car you drive.
Using a similar performance-to-energy use ratio: If you drive a 100 horsepower car that achieves 30 miles per gallon today and were able to increase its performance by 25 times in six years, you would be driving a 500 horsepower car that gets 150 miles per gallon in 2020.
This ambitious goal comes on the heels of a tenfold energy efficiency improvement in the product line since 2008. The difference going forward is that many of the gains will come from outside the traditional silicon 'shrink' cycle, or what industry insiders call the 'race to the next process node'.
Rather than waiting for the next generation of silicon technology to come online, AMD's approach is to aggressively design energy efficiency through processor architecture and intelligent power management. And, the energy efficiency gains made by achieving this goal would outpace the Moore's Law efficiency trend by at least 70 percent between 2014 and 2020.
Here are a few of the key design innovations that will help propel the future of AMD energy-efficient IT:
AMD's accelerated processing units (APUs) include both CPUs and graphic processing units (GPUs) on the same piece of silicon. Combining CPUs and GPUs on the same chip saves energy by eliminating connections between discrete chips. AMD extracts even more energy savings by enabling APUs to seamlessly shift the computing workloads between the CPU and GPU to optimize efficiency, part of the Heterogeneous Systems Architecture that is now being widely adopted in the industry.
This approach could be described as a 'race to idle' because the energy advantages are primarily derived from finishing a job quickly and efficiently to enable a faster return to the ultra-low power idle state.
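The 'race to idle' tradeoff can be sketched with a few hypothetical figures: a chip that bursts to higher power but finishes sooner can use less total energy over a fixed window, because it spends more of that window in a low-power idle state. The numbers below are illustrative assumptions, not AMD measurements:

```python
# "Race to idle" sketch (hypothetical figures): a faster, higher-power
# burst can still cost less total energy, because the chip returns
# sooner to an ultra-low-power idle state.
def job_energy(active_w, active_s, idle_w, window_s):
    """Total energy (joules) over a fixed window: active burst, then idle."""
    return active_w * active_s + idle_w * (window_s - active_s)

window = 10.0  # seconds observed
slow = job_energy(active_w=5.0, active_s=8.0, idle_w=0.5, window_s=window)
fast = job_energy(active_w=8.0, active_s=3.0, idle_w=0.5, window_s=window)
print(slow, fast)  # 41.0 27.5 -> the faster burst uses less total energy
```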
Inter-frame power gating, per-part adaptive voltage, voltage islands, further integration of system components, and other techniques still in the development stage will continue to yield accelerated energy-efficiency gains going forward. AMD has implemented an 'ambidextrous' product offering to cover both ARM and x86 instruction sets, so the same power management approach can be applied to the vast majority of IT use cases (the market for ARM- and x86-based processors is expected to grow to more than $85 billion by 2017).
Why energy efficiency matters
The stakes in AMD's quest for energy-efficient IT are high. By 2020, the number of connected devices is estimated to be nearly five times higher than the earth's population, resulting in increased energy demand. It follows that energy-efficient technology is essential to achieve the promise of an IT-enabled society.
And, with the massive projected increases in connected devices, there is a strong environmental motive to pursue energy-efficient IT.
The International Energy Agency (IEA) referred to energy efficiency as 'the world's first fuel'.
Similarly, the Alliance to Save Energy stated 'energy efficiency is one of the most important tools for avoiding climate change by reducing use of fossil fuels.'
While power-efficient IT alone cannot fully address climate change, it is an important part of the solution.
Improvement in the power efficiency of IT devices is just part of the story. IT-enabled devices can also help make other systems more energy efficient.
A recent study projected that IT-enabled devices could cut global greenhouse gas (GHG) emissions by 16.5 percent in 2020. These gains would be achieved through many different applications, ranging from smart power grids, to sophisticated heating, ventilation, and air conditioning (HVAC) systems, to sensor-driven intelligent traffic management and more.
The study predicted that the saving from IT-enabled devices would amount to $1.9 trillion in gross energy and fuel savings and a reduction of 9.1 gigatonnes of carbon dioxide-equivalent of greenhouse gases by 2020.
Mark Papermaster is senior vice president and chief technology officer of AMD, the world's only company designing x86/ARM microprocessors and graphics, which power millions of intelligent devices such as personal computers, tablets, game consoles and cloud servers that define the new era of Surround Computing.