**Description**

- Access 41 lectures & 5.5 hours of content 24/7
- Incorporate ideas from Bayesian Machine Learning, Reinforcement Learning, & Game Theory
- Discuss variational autoencoder architecture
- Discover GAN basics

The Lazy Programmer is a data scientist, big data engineer, and full stack software engineer. For his master's thesis he worked on brain-computer interfaces using machine learning. These assist non-verbal and non-mobile persons to communicate with their family and caregivers.

He has worked in online advertising and digital media as both a data scientist and big data engineer, and has built various high-throughput web services around that data. He has created big data pipelines using Hadoop, Pig, and MapReduce; built machine learning models to predict click-through rate and news feed recommender systems using linear regression, Bayesian Bandits, and collaborative filtering; and validated the results using A/B testing.

He has taught undergraduate and graduate students in data science, statistics, machine learning, algorithms, calculus, computer graphics, and physics at universities such as Columbia University, NYU, Humber College, and The New School.

Multiple businesses have benefited from his web programming expertise. He handles all backend (server), frontend (HTML/JS/CSS), and operations/deployment work. Technologies he has used include Python, Ruby/Rails, PHP, Bootstrap, jQuery (JavaScript), Backbone, and Angular; for storage/databases he has used MySQL, Postgres, Redis, MongoDB, and more.

**Details & Requirements**

- Length of time users can access this course: lifetime
- Access options: web streaming, mobile streaming
- Certification of completion not included
- Redemption deadline: redeem your code within 30 days of purchase
- Experience level required: all levels, but knowledge of calculus, probability, object-oriented programming, Python, Numpy, linear regression, gradient descent, and how to build feedforward and convolutional neural networks in Theano and TensorFlow is expected
- All code for this course is available for download *here*, in the directory unsupervised_class3

**Compatibility**

- Internet required

**Terms**

- Instant digital redemption

- Introduction and Outline
  - Welcome (4:33)
  - Where does this course fit into your deep learning studies? (5:00)
  - Where to get the code and data (3:51)
  - How to succeed in this course (5:19)

- Generative Modeling Review
  - What does it mean to Sample? (4:57)
  - Sampling Demo: Bayes Classifier (3:57)
  - Gaussian Mixture Model Review (10:31)
  - Sampling Demo: Bayes Classifier with GMM (3:54)
  - Why do we care about generating samples?
  - Neural Network and Autoencoder Review (7:26)
  - Tensorflow Warmup (4:07)
  - Theano Warmup (4:54)
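
To give a taste of the Generative Modeling Review material above, here is a minimal numpy sketch of what "sampling" from a Gaussian Mixture Model means: pick a mixture component according to its weight, then draw from that component's Gaussian. The weights, means, and standard deviations below are made-up toy values, not course code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D mixture parameters (illustrative only).
weights = np.array([0.3, 0.7])   # mixing coefficients (sum to 1)
means = np.array([-2.0, 3.0])    # component means
stds = np.array([0.5, 1.0])      # component standard deviations

def sample_gmm(n):
    """Draw n samples: choose a component per sample, then sample its Gaussian."""
    components = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(means[components], stds[components])

samples = sample_gmm(10_000)
```

The same two-stage recipe (sample the latent component, then sample the observation) is the conceptual core of every generative model covered later in the course.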

- Variational Autoencoders
  - Variational Autoencoders Section Introduction (5:39)
  - Variational Autoencoder Architecture (5:57)
  - Parameterizing a Gaussian with a Neural Network (8:00)
  - The Latent Space, Predictive Distributions and Samples (5:13)
  - Cost Function (7:28)
  - Tensorflow Implementation (pt 1) (7:18)
  - Tensorflow Implementation (pt 2) (2:29)
  - Tensorflow Implementation (pt 3) (9:55)
  - The Reparameterization Trick (5:05)
  - Theano Implementation (10:52)
  - Visualizing the Latent Space (3:09)
  - Bayesian Perspective (3:09)
  - Variational Autoencoder Section Summary (4:02)
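
The reparameterization trick listed above is the key idea that makes VAEs trainable by gradient descent. A hedged numpy sketch (the encoder outputs `mu` and `log_var` below are made-up placeholder values, not from a trained network): instead of sampling z ~ N(mu, sigma²) directly, which is not differentiable with respect to mu and sigma, sample eps ~ N(0, 1) and compute z = mu + sigma · eps.

```python
import numpy as np

rng = np.random.default_rng(42)

# Pretend these came from the encoder network (placeholder values).
mu = np.array([0.5, -1.0])       # predicted latent means
log_var = np.array([0.0, -2.0])  # predicted latent log-variances

def reparameterize(mu, log_var, n_samples):
    """Draw latent vectors via z = mu + sigma * eps, eps ~ N(0, 1)."""
    sigma = np.exp(0.5 * log_var)
    eps = rng.standard_normal((n_samples, mu.shape[0]))
    return mu + sigma * eps

z = reparameterize(mu, log_var, 100_000)
```

Because the randomness lives entirely in `eps`, gradients can flow through `mu` and `sigma` back into the encoder, which is exactly why the trick gets its own lecture.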

- Generative Adversarial Networks (GANs)
  - GAN - Basic Principles (5:13)
  - GAN Cost Function (pt 1) (7:23)
  - GAN Cost Function (pt 2) (4:56)
  - DCGAN (7:38)
  - Batch Normalization Review (8:01)
  - Fractionally-Strided Convolution (8:35)
  - Tensorflow Implementation Notes (13:23)
  - Tensorflow Implementation (18:13)
  - Theano Implementation Notes (7:26)
  - Theano Implementation (19:47)
  - GAN Summary (9:43)
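
The GAN cost function lectures above center on two losses. As an illustrative numpy sketch (the discriminator outputs below are made-up placeholders, not from a trained network): the discriminator minimizes binary cross-entropy between its guesses and the real/fake labels, while the commonly used "non-saturating" generator loss maximizes log D(G(z)).

```python
import numpy as np

def discriminator_loss(d_real, d_fake):
    """Binary cross-entropy: -E[log D(x)] - E[log(1 - D(G(z)))]."""
    return -np.mean(np.log(d_real)) - np.mean(np.log(1.0 - d_fake))

def generator_loss(d_fake):
    """Non-saturating form: -E[log D(G(z))]."""
    return -np.mean(np.log(d_fake))

# Placeholder discriminator outputs: probabilities that inputs are real.
d_real = np.array([0.9, 0.8, 0.95])  # on real images
d_fake = np.array([0.1, 0.2, 0.05])  # on generated images

d_loss = discriminator_loss(d_real, d_fake)
g_loss = generator_loss(d_fake)
```

Note that when the discriminator is maximally confused (outputs 0.5 everywhere), its loss sits at 2·log 2, the equilibrium value derived in the cost-function lectures.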

- Appendix
  - How to install Numpy, Theano, Tensorflow, etc... (17:32)
  - How to Succeed in this Course (Long Version) (5:55)
  - How to Code by Yourself (part 1) (15:54)
  - How to Code by Yourself (part 2) (9:23)
  - Where to get discount coupons and FREE deep learning material (2:20)