David Patterson Says It’s Time for New Computer Architectures and Software Languages

But last year, he said, “single program performance only grew 3 percent—so it’s doubling every 20 years. If you are just sitting there waiting for chips to get faster, you are going to have to wait a long time.”

For a computer architect like Patterson, this is actually good news. It’s also good news for innovative software engineers, he pointed out. “Revolutionary new hardware architectures and new software languages, tailored to dealing with specific kinds of computing problems, are just waiting to be developed,” he said. “There are Turing Awards waiting to be picked up if people would just work on these things.”

As an example on the software side, Patterson noted that rewriting Python code in C yields roughly a 50-fold speedup. Add in various optimization techniques, and the speedup increases dramatically. It wouldn’t be too much of a stretch, he said, “to make an improvement of a factor of 1,000 in Python.”
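The flavor of that claim can be seen without leaving Python: moving an inner loop from interpreted bytecode into compiled C is where the first big factor comes from. The sketch below (an illustration of the principle, not Patterson’s benchmark; the exact factor depends on workload and machine) times a hand-written Python summation loop against the built-in `sum()`, whose inner loop runs in C.

```python
import timeit

# One million floats to sum.
data = [float(i) for i in range(1_000_000)]

def python_loop():
    """Sum the list one element at a time in interpreted Python."""
    total = 0.0
    for x in data:
        total += x
    return total

# Time the interpreted loop against the C-implemented built-in sum().
t_py = timeit.timeit(python_loop, number=10)
t_c = timeit.timeit(lambda: sum(data), number=10)
print(f"Python loop: {t_py:.3f}s  C-backed sum(): {t_c:.3f}s  "
      f"speedup: {t_py / t_c:.1f}x")
```

Both versions compute the same result; only where the loop executes changes. Rewriting the whole program in C, then layering on parallelism, memory tuning, and vector instructions, is how the speedup compounds toward the factor-of-1,000 territory Patterson describes.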

On the hardware front, Patterson argued that domain-specific architectures simply run better: “it’s not magic, there are just things we can do.” For example, not all applications require computing to the same level of accuracy. For some, he said, you could use floating-point arithmetic of lower precision than the widely used IEEE 754 double- and single-precision formats.
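To make the precision trade-off concrete, here is a small sketch (my illustration, not from the talk) that round-trips a value through IEEE 754 binary16 “half precision,” which keeps only 10 fraction bits versus the 52 of the double precision Python uses natively. The error introduced is exactly the accuracy a lower-precision accelerator gives up:

```python
import struct

def to_half_and_back(x):
    """Round x to the nearest IEEE 754 binary16 value and back to a
    Python float, exposing the precision lost in the narrow format."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

print(to_half_and_back(0.1))  # 0.0999755859375 -- 0.1 is no longer exact
print(to_half_and_back(0.5))  # 0.5 -- powers of two survive unchanged
```

For many machine-learning workloads, an error in the fourth decimal place is invisible in the final result, which is why accelerators like Google’s TPU lean on reduced-precision arithmetic to fit more multipliers in the same silicon.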

The biggest area of opportunity right now for applying such new architectures and languages is machine learning, Patterson said. “If you are a hardware person,” he said, “you want friends who desperately need more computers.” And machine learning is “ravenous for computing, which we just love.”

Today, he said, there’s a vigorous debate surrounding which type of computer architecture is best for machine learning, with many companies placing their bets. Google has its Tensor Processing Unit (TPU), with one core per chip and software-controlled memory instead of caches; Nvidia’s GPU has 80-plus cores; and Microsoft is taking an FPGA approach.

And Intel, he said, “is trying to make all the bets”: marketing traditional CPUs for machine learning, purchasing Altera (the company that provides FPGAs to Microsoft), and buying Nervana, with its specialized neural network processor (similar in approach to Google’s TPU).

Along with these major companies offering different architectures for machine learning, Patterson says there are at least 45 hardware startups tackling the problem. Ultimately, he said, the market will decide.

“This,” he says, “is a golden age for computer architecture.”

Source: IEEE Semiconductors