100 years: Deluge and Drought

Lessons in how to wedge data into smaller spaces and build a smarter energy grid


In 2010 alone, humans generated enough new data on our digital devices—from photos on Facebook and Flickr to movies streaming on Netflix—to fill more than 75 billion iPads. As more and more people join the global conversation, this flood of data will only grow. Which is why, for years, the work of Professor of Computer Engineering Ahmed Amer has focused on developing techniques to manage the mind-boggling (and growing) amount of data being stored on computers, whether locally or across the cloud.

That might be enough to keep him busy. But since arriving at Santa Clara University in 2009, Amer has also sought to bring the same techniques he has been developing to handle this data deluge to bear on a different kind of scarcity: the energy in the physical world that powers not just server farms but everything on the planet that’s part of a power grid.

“Operating systems are wonderful and useful liars.”

In both these settings, Amer focuses on the level of the operating system—the interface between a programmer’s instructions and the computer hardware (or grid of energy resources) that is to carry them out. Operating systems handle the nitty-gritty of the underlying devices, allowing programmers to think on a higher, more intuitive level. When he gets wound up talking about his research, Amer’s courtly, formal tone takes on a breathy quality—as if we’re embarking on a big climb together. He’s set his sights on a bigger goal than just making programmers’ lives easier, however. He aims to design operating system software that overcomes the underlying system’s physical limitations, making it more efficient, robust, and easily scalable.

A WONDERFUL LIAR

Amer’s recent work includes developing software techniques for a next-generation data storage technology known as “shingled magnetic recording.” Ordinary magnetic memory devices—of the kind whirring away inside most computers or stacked up in big data storage centers—have managed to fit more data into less space with every passing year. Now, though, the technology is approaching its fundamental physical limit—if the tiny magnetic bits (1s and 0s) that encode the data were to get much smaller, the natural motion of their individual molecules would be enough to destabilize and erase the bits.

Shingled magnetic memory squeezes extra bits of data into a given amount of memory real estate by employing a clever trick. It structures the bits like a shingled roof: Only a tiny sliver of a given sequence of bits peeks out from underneath the subsequent bits “shingled” over it. Current shingled magnetic recording prototypes are expected to fit in twice as much data as ordinary magnetic memory, and that number may ultimately grow considerably, Amer says. But this data density comes at a price: There is often no way to change a bit’s value without also erasing the values of the bits shingled over it.
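The cost of shingling can be captured in a toy model. The sketch below is purely illustrative—the band size, the list-of-tracks layout, and the all-or-nothing clobbering rule are simplifying assumptions, not how any real drive firmware behaves—but it shows why an in-place update destroys the tracks shingled over it.

```python
# Toy model of the shingled-write constraint: track i is partially
# overlapped by track i+1, like roof shingles, so rewriting track i
# destroys everything shingled after it in the band.

class ShingledBand:
    """A band of overlapping tracks (sizes are illustrative)."""

    def __init__(self, num_tracks):
        self.tracks = [None] * num_tracks

    def write(self, track, data):
        # The write head is wider than the exposed sliver of a track,
        # so writing track i clobbers tracks i+1..end of the band.
        self.tracks[track] = data
        for i in range(track + 1, len(self.tracks)):
            self.tracks[i] = None  # destroyed by the shingled write

    def read(self, track):
        return self.tracks[track]


band = ShingledBand(4)
for i, d in enumerate(["a", "b", "c", "d"]):
    band.write(i, d)        # writing in order, back to front, is safe
band.write(1, "B")          # but updating track 1 in place...
print(band.tracks)          # ['a', 'B', None, None] -- tracks 2-3 are lost
```

Sequential, append-style writes cost nothing; it is only the random in-place update that triggers the destructive rewrite—which is exactly the case the operating system trickery described next is designed to hide.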

To tackle this problem, Amer and collaborators are designing a novel storage system that essentially lies to the programmer—for his or her own good, of course—about how much memory it possesses and where it is writing the programmer’s bits.

Their operating system software essentially sets aside a portion of the memory that it keeps secret from the programmer. Then, if the programmer asks the operating system to change the value of a bit that has other bits shingled over it, the system instead stores the new value, using part of the hidden memory. If the programmer asks for the value of this bit later on, the operating system retrieves it from the new location, leaving the programmer none the wiser that it was not in his or her specified location.
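That hide-and-redirect scheme can be sketched as a small translation layer. This is a minimal illustration of the general idea, not Amer’s actual design: the class name, the block-count parameters, and the stand-in “would this clobber neighbors?” check are all assumptions made for the example.

```python
# Sketch of the "useful liar": advertise less space than the device
# really has, keep the rest hidden, and silently redirect destructive
# in-place updates into the hidden reserve.

class RemappingLayer:
    def __init__(self, device_blocks, hidden_blocks):
        self.store = {}                               # physical block -> value
        self.visible = device_blocks - hidden_blocks  # all the programmer sees
        self.next_hidden = self.visible               # first hidden block
        self.remap = {}                               # logical -> physical

    def write(self, logical, value):
        if logical in self.remap or self._would_clobber(logical):
            # Redirect the update into hidden space instead of
            # rewriting shingled neighbors in place.
            phys = self.remap.get(logical, self.next_hidden)
            if logical not in self.remap:
                self.remap[logical] = phys
                self.next_hidden += 1
            self.store[phys] = value
        else:
            self.store[logical] = value

    def read(self, logical):
        # The caller never learns the data may live somewhere else.
        return self.store[self.remap.get(logical, logical)]

    def _would_clobber(self, logical):
        # Stand-in check: treat any rewrite of an already-written
        # block as one that would disturb its shingled neighbors.
        return logical in self.store


layer = RemappingLayer(8, 4)
layer.write(0, "old")
layer.write(0, "new")       # destructive update, quietly redirected
print(layer.read(0))        # "new", fetched from the hidden reserve
```

The programmer sees only logical addresses; the `remap` table is the lie that makes the shingled constraint invisible.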

“I like to tell my students that operating systems are wonderful and useful liars,” Amer says.

BETTER. STRONGER. FASTER.

This approach to data storage technology is emblematic of Amer’s overall strategy: Create operating system software that works beneath the programmer’s level of awareness, acting as a benevolent dictator in order to make the best use of the underlying physical resources. He is quick to point out that this is the basic line of thought behind much, if not all, systems software “written by much brighter people than myself.” But Amer has turned this mindset on a wide variety of computational challenges. For example, he is currently studying how operating system software can work behind the scenes to manage the layout of data in large data centers like Amazon’s corner of the cloud. His goal is to reduce the risk of massive outages, such as the storm-related breakdown that took out Netflix, Instagram, and Pinterest (all of which store their data on Amazon’s cloud) for several hours on June 29, 2012.

“A lot of the techniques we currently have for making systems more robust don’t scale up well to big data storage systems,” Amer says.

In the past few years, Amer has also started turning the power of his approach on new territory: the problem of improving the robustness and efficiency of the energy grid.

The energy grid has become considerably more complex in recent years. In the past, with power supplied by large, centralized sources, managing the energy grid was a comparatively straightforward task of sending energy downstream to a collection of end users. Today, by contrast, the process is evolving into an intricate dance of different energy resources, some renewable and some not, including home energy devices such as solar panels and electric car batteries that could allow homeowners to store energy for later use or even send it to the grid. So far, however, the choreography of this dance is mired in the past.

“Right now, the energy grid control centers tend to be centralized and hierarchical—there’s some guy in a control room looking at the world and making decisions,” Amer says.

Currently Amer is working on designing an energy grid operating system that could allocate energy resources in a more distributed manner, allowing give-and-take between all the players in the grid and “making smarter decisions than any human could do in real time,” he says. Such a system could improve the grid’s robustness and efficiency, he argues. What’s more, it could benefit the environment, since its reliability could reduce the need to fire up quick-start power generators, which are typically less clean and efficient.
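To make the give-and-take idea concrete, here is one round of balancing sketched in code. Everything here is an assumption for illustration—the node names, the kilowatt-hour figures, and the simple greedy matching rule—and none of it reflects Amer’s actual system; it only shows the kind of supply-to-demand matching such an operating system would automate.

```python
# Illustrative sketch: one round of give-and-take, greedily matching
# players with surplus energy (solar panels, car batteries) to players
# with a deficit. Values are net kWh: positive = surplus, negative = deficit.

def balance(nodes):
    """Return a list of (giver, taker, kWh) transfers."""
    givers = [[n, v] for n, v in nodes.items() if v > 0]
    takers = [[n, -v] for n, v in nodes.items() if v < 0]
    transfers = []
    gi = ti = 0
    while gi < len(givers) and ti < len(takers):
        giver, taker = givers[gi], takers[ti]
        amount = min(giver[1], taker[1])      # move what both sides allow
        transfers.append((giver[0], taker[0], amount))
        giver[1] -= amount
        taker[1] -= amount
        if giver[1] == 0:
            gi += 1                            # this surplus is used up
        if taker[1] == 0:
            ti += 1                            # this deficit is filled
    return transfers


grid = {"solar_home": 3.0, "ev_battery": 2.0, "office": -4.0, "apartment": -1.0}
print(balance(grid))
```

A real grid controller would of course weigh prices, line capacities, and forecasts; the point of the sketch is only that the matching is mechanical, and so can run continuously and at machine speed rather than waiting on a person in a control room.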

“This is a project that I’m particularly passionate about, because it has a huge potential impact for good,” Amer says.

He adds that he has been inspired by Santa Clara’s commitment to research on sustainability. “In the School of Engineering, we like to say that we are ‘Engineering with a Mission,’” he says, in punning reference to the Mission Church at the heart of the campus. “But it’s really not a joke—it’s something genuine that you feel in all the faculty, staff, and students.”
