05/01/12 San Francisco Chronicle — UC Berkeley has won a $60 million grant to establish a worldwide center for theoretical computer science to explore high-level mathematical algorithms that could help a variety of scientific fields, including health care, climate modeling and economics. The award from the Simons Foundation of New York will be used to start the Simons Institute for the Theory of Computing, which is expected to start operations in July.
05/01/12 — The Simons Institute for the Theory of Computing is coming to Calvin Hall. Funded by a $60 million Simons Foundation grant, the institute will create a hub for theoretical computer science.
04/10/12 Stanford University — Berkeley Engineering professor Scott Shenker is co-director of the new Open Networking Research Center, which is exploring software-defined networking (SDN) as a paradigm for making networks simpler and less expensive while expanding their capacities. Industry sponsors include Cisco, Google, Hewlett-Packard and Intel.
04/09/12 The New York Times — Michael Franklin, a professor of computer science and director of the AMP Lab, talks about the challenges of working with Big Data in the New York Times. Last month, the National Science Foundation awarded $10 million to Berkeley's AMP Expedition.
04/05/12 Penn News — UC Berkeley engineers, led by computer scientist Ras Bodik, will join the University of Pennsylvania and seven other research institutions in a project to make computer programming faster, easier and more intuitive. Dubbed ExCAPE, the project is led by Penn and funded by a five-year, $10 million grant from the National Science Foundation's Expeditions in Computing program.
12/05/11 The New York Times — Berkeley Engineering professor David Patterson discusses how computer scientists will fight the war on cancer by taking on the Big Data challenges of information processing, genome sequencing, cloud computing, crowd-sourcing and other complex tasks. Patterson argues, "Given that millions of people do have and will get cancer, if there is a chance that computer scientists may have the best skill set to fight cancer today, as moral people aren't we obligated to try?"
08/03/11 Intel — Aimed at shaping the future of cloud computing and how increasing numbers of everyday devices will add computing capabilities, Intel Labs announced the latest Intel Science and Technology Centers (ISTCs), both headquartered at Carnegie Mellon University. The centers combine top researchers from Carnegie Mellon University, Georgia Institute of Technology, University of California, Berkeley, Princeton University, and Intel. The researchers will explore technology that will have important future implications for the cloud.
XSEDE project brings advanced cyberinfrastructure, digital services and expertise to scientists and engineers
07/25/11 National Science Foundation — The NSF has launched a massive five-year, $121 million project involving 17 institutions, including UC Berkeley, to bring advanced digital services to the nation's scientists and engineers. Collectively known as the Extreme Science and Engineering Discovery Environment (XSEDE), the new project replaces the TeraGrid, which for 10 years provided researchers with computational and data resources in an open infrastructure to support scientific discovery.
05/04/11 — Modern computing has a looming data traffic problem. Sometime in the next decade, experts say, processors will not be able to deliver better performance, because integrated circuits will have reached their capacity. Commonly described as the interconnect bottleneck, this phenomenon means that computers, regardless of their processing speed, will be incapable of moving data any faster. But Berkeley engineers, led by Connie Chang-Hasnain, have recently developed a groundbreaking process that could solve the vexing problem of the interconnect bottleneck and lead to a new class of faster, more efficient microprocessors.
04/08/11 — Two new research ventures at Berkeley Engineering have boundary-shattering visions for the future of computing. Jointly unveiled at the recent Berkeley EECS Annual Research Symposium (BEARS), these labs have distinct missions. The Swarm Lab will advance work in tiny wireless sensors capable of linking our homes, cities and bodies to the cyber world. The AMPLab will focus on solutions to the growing challenge of storing, accessing and analyzing a deluge of data that has begun overwhelming today's technology.
01/08/11 The New York Times — AT&T's dial tone was engineered so that 99.999 percent of the time, you could successfully make a phone call. Can we realistically expect that such availability will ever come to Internet services? "Google doesn't have the luxury of scheduled downtime for maintenance," says Armando Fox, an adjunct associate professor in the College of Engineering at UC Berkeley. Nor can it take down the service, he says, to install upgrades. "It is not uncommon for a place like Google to push out a major release every week," he said, adding that such frequency is "unprecedented" for the software industry.
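As a back-of-the-envelope illustration of what the "five nines" standard in the item above actually demands, here is a small Python sketch (the function name and structure are mine, not from the article) converting an availability percentage into allowed downtime per year:

```python
# Convert an availability target (e.g. "five nines" = 99.999%)
# into the downtime budget it allows per year.
SECONDS_PER_YEAR = 365 * 24 * 3600  # ignoring leap years

def allowed_downtime_seconds(availability: float) -> float:
    """Seconds of downtime per year permitted at the given availability."""
    return SECONDS_PER_YEAR * (1 - availability)

# AT&T's 99.999% dial-tone standard permits only about 5.3 minutes
# of downtime per year; 99.9% ("three nines") permits about 8.8 hours.
five_nines_minutes = allowed_downtime_seconds(0.99999) / 60
three_nines_hours = allowed_downtime_seconds(0.999) / 3600
print(round(five_nines_minutes, 1))  # ~5.3 minutes/year
print(round(three_nines_hours, 1))   # ~8.8 hours/year
```

The gap between those two budgets is why Fox's point about weekly releases matters: at five nines, even a brief maintenance window consumes most of a year's downtime allowance.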
03/15/10 Wall Street Journal — A connection to the University of California at Berkeley, and a lengthy record of innovation, seem to be winning attributes in this year's big computing prizes. Eric Brewer and Charles Thacker have both.
12/15/09 — "Fun. Easy to learn. Can relate to it." That's what students were saying about a new introductory computing course at Berkeley, established by Dan Garcia, Brian Harvey, Colleen Lewis (B.S.'05 EECS) and George Wang, that will alter the way young people perceive the field. Called "The Beauty and Joy of Computing," the two-unit freshman/sophomore seminar teaches non-majors basic programming skills while exploring big picture topics such as abstraction, world-changing applications and the social implications of computing.
05/02/08 — There are about six billion base pairs in the human genome, and our family tree includes about six billion living humans. So, although DNA sequencing begins in a laboratory, it requires research-level computer science and statistics to crunch the resulting mass of data and make sense of the results. As EECS and statistics professor Yun Song remarks, “Just 15 years ago, it was very difficult for population genetics researchers to run their computationally intensive analyses on desktop computers. It's thanks to relatively recent improvements in computers and algorithms that these problems have become tractable.”