The Deep Learning Revolution

by Terrence J. Sejnowski

"The Deep Learning Revolution" explores the transformative impact of deep learning, arguing that its recent success stems primarily from its remarkable scalability. Author Terrence J. Sejnowski highlights that foundational algorithms, like backpropagation, existed for decades but only achieved real-world impact with the advent of powerful computing and vast datasets. Unlike many AI algorithms that falter beyond "toy problems," neural networks demonstrate exceptional scaling, with performance continuously improving as network size and layers increase. This phenomenon is likened to the evolutionary expansion of the cerebral cortex and the growth of the internet, both systems that scaled dramatically once fundamental protocols or mechanisms were established. A key insight is that training multiple deep learning networks on identical data yields diverse yet similarly performing models, underscoring the robustness and adaptability of this paradigm.

1 popular highlight from this book

Key Insights & Memorable Quotes

Below is the most popular highlight from The Deep Learning Revolution:

“It’s All about Scaling

Most of the current learning algorithms were discovered more than twenty-five years ago, so why did it take so long for them to have an impact on the real world? With the computers and labeled data that were available to researchers in the 1980s, it was only possible to demonstrate proof of principle on toy problems. Despite some promising results, we did not know how well network learning and performance would scale as the number of units and connections increased to match the complexity of real-world problems. Most algorithms in AI scale badly and never went beyond solving toy problems. We now know that neural network learning scales well and that performance continues to increase with the size of the network and the number of layers. Backprop, in particular, scales extremely well. Should we be surprised? The cerebral cortex is a mammalian invention that mushroomed in primates and especially in humans. And as it expanded, more capacity became available and more layers were added in association areas for higher-order representations. There are few complex systems that scale this well. The Internet is one of the few engineered systems whose size has also been scaled up by a million times. The Internet evolved once the protocols were established for communicating packets, much like the genetic code for DNA made it possible for cells to evolve. Training many deep learning networks with the same set of data results in a large number of different networks that have roughly the same average level of performance.”
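The closing observation in this passage can be sketched in code. Below is a minimal, hypothetical toy example (not from the book): several small backprop networks are trained on the same XOR data from different random initializations. Their final weights differ from run to run, yet their final errors end up roughly comparable, illustrating the point that identical data can yield many distinct but similarly performing networks.

```python
# Toy illustration (assumption: a tiny pure-Python MLP on XOR, not the
# author's setup): train the same architecture on the same data from
# several random seeds and compare final errors and weights.
import math
import random

XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train(seed, hidden=6, epochs=4000, lr=0.5):
    """Train a 2-hidden-layer-unit sigmoid net on XOR via backprop (SGD)."""
    rng = random.Random(seed)
    # Hidden layer: each unit has 2 input weights + 1 bias.
    w1 = [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(hidden)]
    # Output unit: one weight per hidden unit + 1 bias.
    w2 = [rng.uniform(-1, 1) for _ in range(hidden + 1)]
    for _ in range(epochs):
        for (x1, x2), t in XOR:
            # Forward pass.
            h = [sigmoid(w[0] * x1 + w[1] * x2 + w[2]) for w in w1]
            y = sigmoid(sum(w2[j] * h[j] for j in range(hidden)) + w2[-1])
            # Backward pass: output delta, then hidden deltas.
            dy = (y - t) * y * (1 - y)
            dh = [dy * w2[j] * h[j] * (1 - h[j]) for j in range(hidden)]
            # Gradient-descent updates.
            for j in range(hidden):
                w2[j] -= lr * dy * h[j]
            w2[-1] -= lr * dy
            for j in range(hidden):
                w1[j][0] -= lr * dh[j] * x1
                w1[j][1] -= lr * dh[j] * x2
                w1[j][2] -= lr * dh[j]
    # Mean squared error on the training data after learning.
    err = 0.0
    for (x1, x2), t in XOR:
        h = [sigmoid(w[0] * x1 + w[1] * x2 + w[2]) for w in w1]
        y = sigmoid(sum(w2[j] * h[j] for j in range(hidden)) + w2[-1])
        err += (y - t) ** 2
    return err / len(XOR), w1

losses, weights = [], []
for seed in range(5):
    loss, w1 = train(seed)
    losses.append(loss)
    weights.append(w1)
# Different seeds produce different weight matrices, but the final
# errors land in roughly the same range.
```

Each run starts from a different random point in weight space and descends to a different solution, yet all solutions fit the same data about equally well, which is the diversity-with-equal-performance property the quote describes.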
