Category Archives: Features

LM101-080: Ch2: How to Represent Knowledge using Set Theory

Episode Summary: This podcast covers Chapter 2 of my new book “Statistical Machine Learning: A Unified Framework,” with an expected publication date of May 2020. Chapter 2, titled “Set Theory for Concept Modeling,” discusses how to represent knowledge using set theory notation. Show… Read More »
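As a toy illustration of the chapter's theme (my own example, not taken from the book), basic set operations can encode relationships between concepts:

```python
# Toy illustration: concepts modeled as sets of the entities satisfying them.
animals = {"robin", "penguin", "bat", "trout"}
birds = {"robin", "penguin"}
flies = {"robin", "bat"}

# Set operations express logical relationships between concepts.
flying_birds = birds & flies       # intersection: birds that fly
flightless_birds = birds - flies   # difference: birds that do not fly
is_subset = birds <= animals       # subset test: every bird is an animal

print(flying_birds)      # {'robin'}
print(flightless_birds)  # {'penguin'}
print(is_subset)         # True
```

Here set membership plays the role of predication ("robin is a bird"), and subset relations capture concept hierarchies.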

LM101-074: How to Represent Knowledge using Logical Rules (remix)

Episode Summary: In this episode we will learn how to use “rules” to represent knowledge. We discuss how this works in practice, and we explain how these ideas are implemented in a special architecture called the production system. The challenges of representing knowledge using rules are also discussed. Specifically,… Read More »

LM101-052: How to Use the Kernel Trick to Make Hidden Units Disappear

Episode Summary: Today we discuss a simple yet powerful idea that became popular in the machine learning literature in the 1990s, called “The Kernel Trick”. The basic idea behind “The Kernel Trick” is that an impossible machine learning problem can be transformed into an… Read More »
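The sense in which the trick makes hidden units "disappear" can be shown with a toy example of my own (not from the episode): a degree-2 polynomial kernel computes exactly the inner product that an explicit quadratic feature map would produce, without ever constructing that feature vector.

```python
import numpy as np

def phi(x):
    # Explicit quadratic feature map for 2-D input: the "hidden units".
    return np.array([x[0]**2, x[1]**2, np.sqrt(2) * x[0] * x[1]])

def poly_kernel(x, z):
    # Kernel function: the same inner product, computed directly in the
    # original 2-D space, with no feature vector ever built.
    return (x @ z) ** 2

x = np.array([1.0, 2.0])
z = np.array([3.0, 0.5])

print(phi(x) @ phi(z))    # 16.0
print(poly_kernel(x, z))  # 16.0
```

Any learning algorithm that touches the data only through inner products can therefore swap in the kernel and work implicitly in the higher-dimensional feature space.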

LM101-030: How to Improve Deep Learning Performance with Artificial Brain Damage (Dropout and Model Averaging)

Episode Summary: Deep learning machine technology has rapidly developed over the past five years, due in part to a variety of factors such as better technology, convolutional net algorithms, rectified linear units, and a relatively new learning strategy called “dropout,” in which hidden… Read More »
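The “dropout” strategy the summary introduces can be sketched in a few lines, assuming the standard “inverted dropout” formulation (the `dropout` helper below is my own illustration, not code from the episode):

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(activations, rate=0.5, training=True):
    # Inverted dropout: randomly zero each hidden unit during training,
    # rescaling survivors so the expected activation is unchanged.
    if not training or rate == 0.0:
        return activations
    mask = rng.random(activations.shape) >= rate
    return activations * mask / (1.0 - rate)

h = np.ones(10)                                   # hidden-unit activations
h_train = dropout(h, rate=0.5, training=True)     # some units zeroed, rest doubled
h_test = dropout(h, rate=0.5, training=False)     # unchanged at test time
```

Training with a different random mask on each pass amounts to averaging over an ensemble of thinned networks, which is the connection to model averaging in the episode title.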

LM101-029: How to Modernize Deep Learning with Rectilinear units, Convolutional Nets, and Max-Pooling

Episode Summary: This podcast discusses rectilinear units, convolutional nets, and max-pooling, topics relevant to deep learning that were inspired by my recent visit to the 3rd International Conference on Learning Representations (May 7-9, 2015) in San Diego. Specifically, commonly used techniques shared by… Read More »
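For readers unfamiliar with the terms, two of the operations named in the title can be sketched as follows (a minimal NumPy illustration of my own, using a 1-D signal for simplicity):

```python
import numpy as np

def relu(x):
    # Rectified linear unit: passes positive values, zeroes the rest.
    return np.maximum(x, 0.0)

def max_pool_1d(x, size=2):
    # Non-overlapping max-pooling: keep the largest value in each window,
    # shrinking the signal while preserving the strongest responses.
    trimmed = x[: len(x) // size * size]
    return trimmed.reshape(-1, size).max(axis=1)

x = np.array([-1.0, 2.0, 3.0, -4.0, 5.0, 0.5])
print(relu(x))               # [0.  2.  3.  0.  5.  0.5]
print(max_pool_1d(relu(x)))  # [2. 3. 5.]
```

In a convolutional net these operations are applied to the outputs of learned filters slid across the input.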

LM101-024: How to Use Genetic Algorithms to Breed Learning Machines (Stochastic Model Search and Selection)

Episode Summary: In this episode we explore the concept of evolutionary learning machines. That is, learning machines that reproduce themselves in the hope of evolving into smarter learning machines. Show Notes: Hello everyone! Welcome to the twenty-fourth podcast in the podcast series Learning Machines 101. In this series of podcasts my goal is to discuss… Read More »
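The select/crossover/mutate loop at the heart of a genetic algorithm can be sketched on a toy problem (maximizing the number of 1 bits in a string; the fitness function and parameters are invented for illustration, not the episode's learning-machine breeding scheme):

```python
import random

random.seed(0)
GENES, POP, GENERATIONS = 20, 30, 40

def fitness(bits):
    # Toy fitness: count of 1 bits (more ones = "smarter" individual).
    return sum(bits)

def crossover(a, b):
    # Single-point crossover: splice a prefix of one parent onto the other.
    cut = random.randrange(1, GENES)
    return a[:cut] + b[cut:]

def mutate(bits, rate=0.02):
    # Flip each bit with small probability.
    return [1 - b if random.random() < rate else b for b in bits]

population = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]
for _ in range(GENERATIONS):
    # Selection: keep the fitter half, then refill by breeding random survivors.
    population.sort(key=fitness, reverse=True)
    survivors = population[: POP // 2]
    children = [mutate(crossover(random.choice(survivors), random.choice(survivors)))
                for _ in range(POP - len(survivors))]
    population = survivors + children

print(fitness(max(population, key=fitness)))  # best fitness after evolution
```

Stochastic model search and selection generalizes this idea: candidate learning machines are the individuals, and predictive performance is the fitness.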

LM101-015: How to Build a Machine that Can Learn Anything (The Perceptron)

Episode Summary: In this episode we describe how to build a machine that can learn any given pattern of inputs and generate any desired pattern of outputs when it is possible to do so! Show Notes: Hello everyone! Welcome to the fifteenth podcast in the podcast series Learning Machines 101. In this series of podcasts my goal is… Read More »
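The classic perceptron learning rule the episode describes can be sketched as follows (using logical AND as the target input-output pattern, my choice of example):

```python
import numpy as np

# Perceptron learning rule on a linearly separable problem (logical AND).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1])  # desired outputs

w = np.zeros(2)
b = 0.0
for _ in range(20):  # the rule converges whenever a separating line exists
    for xi, target in zip(X, y):
        pred = 1 if xi @ w + b > 0 else 0
        # Update only on mistakes: nudge the decision boundary toward the error.
        w += (target - pred) * xi
        b += (target - pred)

print([1 if xi @ w + b > 0 else 0 for xi in X])  # [0, 0, 0, 1]
```

The "when it is possible to do so" caveat in the summary is exactly the linear-separability condition: the rule is guaranteed to converge only when such a separating boundary exists.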

LM101-014: How to Build a Machine that Can Do Anything (Function Approximation)

Episode Summary: In this episode we describe how to build a machine that can take any given pattern of inputs and generate any desired pattern of outputs! Show Notes: Hello everyone! Welcome to the fourteenth podcast in the podcast series Learning Machines 101. In this series of podcasts my goal is to discuss important concepts of artificial intelligence… Read More »

LM101-009: How to Enhance Intelligence with a Robotic Body (Embodied Cognition)

Episode Summary: Embodied cognition emphasizes that the design of complex artificially intelligent systems may be both vastly simplified and vastly enhanced if we view the robotic bodies of artificially intelligent systems as important contributors to intelligent behavior. Show Notes: Hello everyone! Welcome to the ninth podcast in the podcast series Learning Machines 101. In this series of podcasts my… Read More »

LM101-007: How to Reason About Uncertain Events using Fuzzy Set Theory and Fuzzy Measure Theory

Episode Summary: In real life, there is no certainty. There are always exceptions. In this episode, two methods are discussed for making inferences in uncertain environments. In fuzzy set theory, a smart machine has certain beliefs about imprecisely defined concepts. In fuzzy measure theory, a smart machine has beliefs about precisely defined concepts but some beliefs are stronger… Read More »
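The fuzzy-set idea of graded beliefs about imprecisely defined concepts can be sketched like this (the `tall` and `heavy` membership functions are invented for illustration; the min/max/complement rules are the standard fuzzy logic operators):

```python
def tall(height_cm):
    # Graded membership in the imprecise concept "tall":
    # 0 below 160 cm, 1 above 190 cm, linear in between.
    return min(1.0, max(0.0, (height_cm - 160) / 30))

def heavy(weight_kg):
    # Graded membership in the imprecise concept "heavy".
    return min(1.0, max(0.0, (weight_kg - 60) / 40))

h, w = 175, 80
tall_and_heavy = min(tall(h), heavy(w))  # fuzzy AND (min rule)
tall_or_heavy = max(tall(h), heavy(w))   # fuzzy OR (max rule)
not_tall = 1.0 - tall(h)                 # fuzzy NOT (complement)

print(tall(h), heavy(w))         # 0.5 0.5
print(tall_and_heavy, not_tall)  # 0.5 0.5
```

Fuzzy measure theory, by contrast, keeps the concepts crisp and instead grades the strength of belief in them, which is the distinction the episode develops.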

LM101-003: How to Represent Knowledge using Logical Rules

Episode Summary: In this episode we will learn how to use “rules” to represent knowledge. We discuss how this works in practice and we explain how these ideas are implemented in a special architecture called the production system. The challenges of representing knowledge using rules are also discussed. Specifically, these challenges include: issues of feature representation, having an… Read More »
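The production-system architecture mentioned above can be sketched as a simple forward-chaining loop over IF-THEN rules (the rules and facts shown are invented for illustration, not the episode's example):

```python
# Minimal forward-chaining production system.
# Each rule: IF all conditions are in working memory THEN add a conclusion.
rules = [
    ({"has_feathers"}, "is_bird"),
    ({"is_bird", "has_large_wings"}, "can_fly"),
]

def forward_chain(facts, rules):
    # Repeatedly fire any rule whose conditions hold in working memory
    # until no rule adds a new fact.
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(forward_chain({"has_feathers", "has_large_wings"}, rules))
```

The challenges the summary lists show up even in this sketch: the rules are only as good as the features (like `has_feathers`) chosen to represent the world.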

LM101-002: How to Build a Machine that Learns to Play Checkers

Episode Summary: In this episode, we explain how to build a machine that learns to play checkers. The solution to this problem involves several key ideas which are fundamental to building systems which are artificially intelligent. Show Notes: Hello everyone! Welcome to the second podcast in the podcast series Learning Machines 101. In this series of podcasts my… Read More »