Simons Institute | Data is as Data Does: The Influence of Computation on Inference
John Patrick Cunningham (Columbia University)
https://simons.berkeley.edu/talks/john-patrick-cunningham-columbia-university-2024-06-14
AI≡Science: Strengthening the Bond Between the Sciences and Artificial Intelligence

Probabilistic models remain a hugely popular class of techniques in modern machine learning, and their expressiveness has been extended by modern large-scale compute. While exciting, these extensions almost always come with approximations, and researchers typically ignore the fundamental influence of those computational approximations. As a result, the outputs of modern probabilistic methods become as much a product of the approximation method as of the data and the model, undermining both the Bayesian principle and the practical utility of inference in probabilistic models for real applications in science and industry.

To expose this issue and to demonstrate how to do approximate inference correctly in at least one model class, in this talk I will derive a new type of Gaussian process approximation that provides consistent estimation of the combined posterior arising from both the finite number of data observed *and* the finite amount of computation expended. The most common GP approximations map to instances of this class, including methods based on the Cholesky factorization, conjugate gradients, and inducing points. I will show the consequences of ignoring computational uncertainty, and prove that implicitly modeling it improves generalization performance. I will show how to do model selection while considering computation, and I will describe an application to neurobiological data.
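To make the idea concrete, below is a minimal numpy sketch of a computation-aware GP posterior. This is an illustration of the general principle, not the speaker's implementation: the function names (`computation_aware_gp`, `rbf_kernel`), the RBF kernel, and the choice of unit-vector solver actions are assumptions made here for clarity. The key point is that the posterior covariance subtracts only the part of the prior that the finite computation has actually explained, so the returned variance reflects both finite data and finite compute.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    """Squared-exponential kernel matrix between the rows of A and B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-0.5 * d2 / lengthscale**2)

def computation_aware_gp(X, y, Xstar, noise=0.1, n_iters=10, lengthscale=1.0):
    """Sketch of a computation-aware GP posterior (illustrative, not the talk's method).

    After n_iters solver iterations with actions S (here: coordinate
    directions, i.e. a partial Cholesky-like sweep), the posterior
    covariance subtracts only what the computation has explained, so the
    remaining variance combines mathematical (finite-data) and
    computational (finite-iteration) uncertainty.
    """
    n = X.shape[0]
    K_hat = rbf_kernel(X, X, lengthscale) + noise * np.eye(n)   # K + sigma^2 I
    K_star = rbf_kernel(Xstar, X, lengthscale)
    K_ss = rbf_kernel(Xstar, Xstar, lengthscale)

    # Actions: the first n_iters coordinate directions; other action
    # choices would mimic CG- or inducing-point-style approximations.
    S = np.eye(n)[:, :n_iters]
    # C = S (S^T K_hat S)^{-1} S^T is the "computed" part of K_hat^{-1};
    # it never overstates what the solver has actually resolved.
    C = S @ np.linalg.solve(S.T @ K_hat @ S, S.T)

    mean = K_star @ (C @ y)
    cov = K_ss - K_star @ C @ K_star.T   # combined posterior covariance
    return mean, cov

# Tiny usage example on synthetic data.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
Xstar = np.linspace(-3, 3, 5)[:, None]
m, V = computation_aware_gp(X, y, Xstar, n_iters=10)
print(m, np.diag(V))
```

With few iterations the predictive variance stays close to the prior (large computational uncertainty); as n_iters approaches the number of data points, it shrinks toward the exact GP posterior variance.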