Tag Archives: Rectified Linear Units

LM101-030: How to Improve Deep Learning Performance with Artificial Brain Damage (Dropout and Model Averaging)

Episode Summary: Deep learning machine technology has rapidly developed over the past five years, due in part to a variety of factors such as better technology, convolutional net algorithms, rectified linear units, and a relatively new learning strategy called “dropout” in which hidden…
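
Since the episode centers on dropout, a minimal NumPy sketch of the basic idea may help: hidden units are randomly zeroed during training (the "artificial brain damage"), and the surviving activations are rescaled so the unmodified network can be used at test time as an implicit average over the damaged sub-networks. The layer sizes and the 0.5 drop probability below are illustrative assumptions, not values from the episode.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_forward(h, drop_prob=0.5, training=True):
    """Apply (inverted) dropout to a layer of hidden-unit activations.

    During training each hidden unit is independently zeroed with
    probability drop_prob, and the survivors are rescaled by
    1 / (1 - drop_prob) so that at test time the full network can be
    used unchanged -- an implicit model average over sub-networks.
    """
    if not training:
        return h  # test time: use all hidden units
    mask = (rng.random(h.shape) >= drop_prob).astype(h.dtype)
    return h * mask / (1.0 - drop_prob)

# Illustrative use on one hidden layer with rectified linear units.
x = rng.standard_normal((4, 10))           # a small batch of 4 inputs
W = rng.standard_normal((10, 8)) * 0.1     # assumed hidden-layer weights
hidden = np.maximum(0.0, x @ W)            # ReLU activations
hidden_dropped = dropout_forward(hidden, drop_prob=0.5, training=True)
```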

LM101-029: How to Modernize Deep Learning with Rectilinear Units, Convolutional Nets, and Max-Pooling

Episode Summary: This podcast discusses rectilinear units, convolutional nets, and max-pooling, topics relevant to deep learning; the discussion was inspired by my recent visit to the 3rd International Conference on Learning Representations (May 7-9, 2015) in San Diego. Specifically, commonly used techniques shared by…
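
Because the episode's three topics form a standard pipeline (convolution, then rectified linear units, then max-pooling), a small self-contained NumPy sketch of that pipeline may help fix the ideas. The image size, kernel size, and pooling window below are illustrative assumptions rather than details from the episode.

```python
import numpy as np

def relu(x):
    """Rectified linear unit: pass positive activations, zero out the rest."""
    return np.maximum(0.0, x)

def conv2d_valid(image, kernel):
    """Naive 'valid' 2-D convolution (cross-correlation, as in most conv nets):
    slide the kernel over the image and take dot products at each position."""
    H, W = image.shape
    kH, kW = kernel.shape
    out = np.empty((H - kH + 1, W - kW + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kH, j:j + kW] * kernel)
    return out

def max_pool(feature_map, size=2):
    """Non-overlapping max-pooling: keep only the largest activation in each
    size-by-size block, giving a coarser, translation-tolerant feature map."""
    H, W = feature_map.shape
    H, W = H - H % size, W - W % size          # drop any ragged border
    blocks = feature_map[:H, :W].reshape(H // size, size, W // size, size)
    return blocks.max(axis=(1, 3))

# Illustrative pipeline: convolution -> rectified linear units -> max-pooling.
rng = np.random.default_rng(0)
image = rng.standard_normal((8, 8))    # toy 8x8 input "image"
kernel = rng.standard_normal((3, 3))   # one assumed 3x3 feature detector
features = max_pool(relu(conv2d_valid(image, kernel)), size=2)
print(features.shape)                  # (3, 3) pooled feature map
```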