In physics, Feynman diagrams are used to compute correlation functions. Correlation functions, also known as kernels, also play an important role in machine learning, and computing them is of great theoretical and practical interest. In the first half of the talk, I will give an overview of the physics of deep neural networks and their phase transitions from the perspective of correlation functions, as well as their impact on the practice of deep learning. In the second half, I will introduce a new tool, Tensor Programs, that enables the computation of correlation functions in deep learning, just as Feynman diagrams do in particle physics. I will highlight a few of the many consequences of this tool, among which is the universality of Gaussian process behavior in wide neural networks of any architecture.
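
To make the last point concrete, below is a minimal numerical sketch, not taken from the talk, of what "Gaussian process behavior in wide networks" means: for a randomly initialized one-hidden-layer ReLU network, the outputs at a fixed set of inputs define a correlation function (kernel), and as the width grows the joint distribution of those outputs approaches a multivariate Gaussian. The function name `random_mlp_outputs`, the 1/sqrt(fan-in) weight scaling, and all parameter choices are illustrative assumptions, not details from the talk.

```python
# A minimal sketch (illustrative assumptions throughout): sample wide random
# ReLU MLPs, estimate the correlation function (kernel) of their outputs, and
# check that the output distribution becomes Gaussian as the width grows.
import numpy as np


def random_mlp_outputs(X, width, n_samples, rng):
    """Sample scalar outputs of one-hidden-layer ReLU MLPs at inputs X.

    Weights use the standard 1/sqrt(fan_in) scaling, under which the
    infinite-width limit is a Gaussian process.
    """
    n, d = X.shape
    outs = np.empty((n_samples, n))
    for s in range(n_samples):
        W1 = rng.standard_normal((d, width)) / np.sqrt(d)
        w2 = rng.standard_normal(width) / np.sqrt(width)
        outs[s] = np.maximum(X @ W1, 0.0) @ w2  # ReLU hidden layer, linear readout
    return outs


rng = np.random.default_rng(0)
X = rng.standard_normal((4, 3))  # four inputs in three dimensions

for width in (4, 4096):
    f = random_mlp_outputs(X, width, n_samples=20000, rng=rng)
    # Empirical correlation function ("kernel"): covariance of the outputs
    # across random initializations.
    K = np.cov(f, rowvar=False)
    # Excess kurtosis of the output at the first input: near zero means the
    # marginal distribution is close to Gaussian, as the GP limit predicts.
    z = f[:, 0]
    excess_kurtosis = np.mean(z**4) / np.mean(z**2) ** 2 - 3.0
    print(f"width={width:5d}  K[0,0]={K[0, 0]:.3f}  excess kurtosis={excess_kurtosis:+.3f}")
```

Running this, the narrow network shows a clear deviation from Gaussian statistics, while the wide network's excess kurtosis is close to zero; the Tensor Programs framework discussed in the talk extends this kind of Gaussian process limit to essentially arbitrary architectures.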