File:Infinitely wide neural network.webm
Summary
Description |
English: Left: a Bayesian neural network with two hidden layers, transforming a 3-dimensional input (bottom) into a 2-dimensional output (top). Right: the output probability density function induced by the random weights of the network. Video: as the width of the network increases, the output distribution simplifies, ultimately converging to a multivariate normal in the infinite-width limit. Note: the video is an artistic depiction of this progression, not an actual simulation. |
Date | |
Source | https://ai.googleblog.com/2020/03/fast-and-easy-infinitely-wide-networks.html |
Author | Tom Small |
Presented at https://iclr.cc/virtual_2020/poster_SklD9yrFPS.html
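The convergence the video depicts can be checked numerically. The sketch below (an illustration written for this page, not the code behind the video) samples many one-hidden-layer networks with random Gaussian weights at a fixed 3-dimensional input and measures how Gaussian the scalar output becomes as the hidden width grows; the 1/sqrt(width) scaling of the output weights is the standard choice that keeps the output variance order one, and by the central limit theorem the excess kurtosis should shrink toward zero:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_network_output(width, x, n_draws=20000):
    """Sample the scalar output of n_draws random one-hidden-layer networks."""
    # Input-to-hidden weights: shape (n_draws, width, input_dim)
    w1 = rng.normal(size=(n_draws, width, x.size))
    # Hidden-to-output weights, scaled by 1/sqrt(width) so the
    # output variance stays O(1) as the width grows
    w2 = rng.normal(size=(n_draws, width)) / np.sqrt(width)
    h = np.tanh(w1 @ x)                   # hidden activations, (n_draws, width)
    return np.einsum("nw,nw->n", w2, h)   # scalar outputs, (n_draws,)

def excess_kurtosis(y):
    """Zero for a Gaussian; positive for heavier-tailed distributions."""
    return np.mean((y - y.mean()) ** 4) / y.var() ** 2 - 3.0

x = np.array([0.2, -1.0, 0.5])  # a fixed 3-dimensional input, as in the video
for width in (1, 10, 1000):
    y = random_network_output(width, x)
    print(f"width={width:5d}  excess kurtosis={excess_kurtosis(y):+.3f}")
```

At width 1 the output is a product of a Gaussian weight and a bounded activation, which is visibly non-Gaussian; by width 1000 the excess kurtosis is close to zero, mirroring the simplification shown in the right panel of the video.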
Licensing
This file is licensed under the Creative Commons Attribution-Share Alike 4.0 International license.
- You are free:
- to share – to copy, distribute and transmit the work
- to remix – to adapt the work
- Under the following conditions:
- attribution – You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.
- share alike – If you remix, transform, or build upon the material, you must distribute your contributions under the same or compatible license as the original.
Category:CC-BY-SA-4.0
Category:Central limit theorem
Category:Deep learning
Category:Gaussian processes
Category:Kernel methods for machine learning
Category:Machine learning research
Category:Neural networks (computer)
Category:Videos of computer science
Category:Videos of machine learning
Category:Videos of machine learning algorithms