New Energy-Efficient Computer Chips May Reduce Data Center Electricity Use

Researchers at Baylor University and Oregon State University have found a way to reduce the energy consumption of the photonic chips used in data centers and supercomputers.

The results are significant because, according to the U.S. Department of Energy, a data center can use up to 50 times more energy per square foot of floor space than a conventional office building.

A data center houses an organization’s information technology operations and infrastructure, and it processes, distributes, and stores data and applications. Data centers account for roughly 2% of all electricity use in the United States, the DOE says.

The U.S. International Trade Commission reports that the number of data centers has grown rapidly as demand for data has increased. The United States, home to many firms that produce and consume vast amounts of data, including Facebook, Amazon, Microsoft and Google, has more than 2,600 data centers.

The advance by John Conley of the OSU College of Engineering, former Oregon State colleague Alan Wang, now of Baylor, and OSU graduate students Wei-Che Hsu, Ben Kupp and Nabila Nujhat involves a new, ultra-energy-efficient method to compensate for temperature variations that degrade photonic chips.

Such chips “will form the high-speed communication backbone of future data centers and supercomputers,” Conley said.

In contrast to traditional computer chips, which use electrons for their circuitry, photonic chips use photons. Because photons travel at the speed of light, they allow data to be transmitted extremely fast and with high energy efficiency.

The problem with photonic chips is that until now they have required significant energy to maintain stable temperatures and high performance. But the team led by Wang has demonstrated that the energy required for temperature regulation can be cut by a factor of more than a million.

“Alan is an expert in photonic materials and devices and my area of expertise is atomic layer deposition and electronic devices,” Conley said. “We were able to make working prototypes that show temperature can be controlled via gate voltage, which means using virtually no electric current.”
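To get a feel for why gate-voltage control is so frugal, here is a rough back-of-envelope sketch comparing a conventional resistive micro-heater, which dissipates power continuously, with a capacitively gated tuner that draws only a tiny leakage current. The specific resistance, voltage, and leakage values are illustrative assumptions, not figures from the study.

```python
# Rough comparison of steady-state tuning power: resistive heater vs. gate-voltage control.
# All numbers below are illustrative assumptions, not measurements from the study.

def heater_power_w(current_a: float, resistance_ohm: float) -> float:
    """Power dissipated by a resistive thermal heater: P = I^2 * R."""
    return current_a ** 2 * resistance_ohm

def gate_power_w(voltage_v: float, leakage_current_a: float) -> float:
    """Power for gate-voltage tuning: only a tiny leakage current flows, P = V * I_leak."""
    return voltage_v * leakage_current_a

# Assumed example values: a heater drawing roughly 3 mW, a gate held at a few volts
# with only nanoamp-scale leakage.
p_heater = heater_power_w(current_a=5.5e-3, resistance_ohm=100.0)  # ~3 mW
p_gate = gate_power_w(voltage_v=3.0, leakage_current_a=1e-9)       # ~3 nW

print(f"heater: {p_heater * 1e3:.2f} mW, gate: {p_gate * 1e9:.2f} nW, "
      f"reduction: ~{p_heater / p_gate:,.0f}x")
```

With these assumed values, the steady-state draw falls from milliwatts to nanowatts, a reduction on the order of a million times.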

Presently, Wang said, “the photonics industry exclusively relies on components known as ‘thermal heaters’ to fine tune the working wavelengths of high-speed, electro-optic devices and optimize their performance. These thermal heaters consume several milliwatts of electricity per device.”

“That might not sound like much considering that a typical LED lightbulb uses 6 to 10 watts,” Wang said. “However, multiply those several milliwatts by millions of devices and they add up quickly, so that approach faces challenges as systems scale up and become bigger and more powerful.”
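A quick aggregate calculation shows how those milliwatts compound across a large system. The per-device power and device count below are assumed example values rather than figures from the article.

```python
# Back-of-envelope scaling of per-device tuning power across a large photonic system.
# The values below are illustrative assumptions, not measurements from the study.

MILLIWATT = 1e-3  # watts per milliwatt

def total_power_w(per_device_mw: float, device_count: int) -> float:
    """Aggregate tuning power in watts for a system with many photonic devices."""
    return per_device_mw * MILLIWATT * device_count

# e.g. 3 mW per thermal heater across 2 million devices -> 6 kW just for tuning,
# roughly the draw of a thousand 6 W LED bulbs running continuously.
print(f"{total_power_w(3.0, 2_000_000) / 1e3:.1f} kW")
```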

“Our method is much more acceptable for the planet,” Conley added. “It will one day allow data centers to keep getting faster and more powerful while using less energy so that we can access ever more powerful applications driven by machine learning, such as ChatGPT, without feeling guilty.”