We don’t yet know why deep neural networks achieve such success on so many tasks; the discipline has a paucity of theory to explain its empirical results. As Facebook’s Yann LeCun has put it, deep learning is like the steam engine, which preceded the underlying theory of thermodynamics by many years. But some deep thinkers have been plugging away at the matter of theory for several years now, and on Wednesday one group presented a proof of deep learning’s superior ability to simulate the computations involved in quantum computing. According to these researchers, the redundancy of information in two of the most successful neural network types, convolutional neural networks (CNNs) and recurrent neural networks (RNNs), makes all the difference.

Amnon Shashua, president and chief executive of Mobileye, the autonomous-driving technology company bought by chip giant Intel last year for $14.1 billion, presented the findings on Wednesday at a conference in Washington, D.C., called the Science of Deep Learning, hosted by the National Academy of Sciences. In addition to being a senior vice president at Intel, Shashua is a professor of computer science at the Hebrew University of Jerusalem. The paper is co-authored with colleagues from there, lead author Yoav Levine and Or Sharir, and with Nadav Cohen of the Institute for Advanced Study in Princeton.

The report, “Quantum Entanglement in Deep Learning Architectures,” was published this week in the prestigious journal Physical…