# Introduction to Scientific Machine Learning

This course provides an introduction to data analytics for individuals with no prior knowledge of data science or machine learning. The course starts with an extensive review of probability theory as the language of uncertainty, discusses Monte Carlo sampling for uncertainty propagation, and covers the basics of supervised learning (Bayesian generalized linear regression, logistic regression, Gaussian processes, deep neural networks, convolutional neural networks), unsupervised learning (k-means clustering, principal component analysis, Gaussian mixtures), and state-space models (Kalman filters). The course also reviews the state of the art in physics-informed deep learning and ends with a discussion of automated Bayesian inference using probabilistic programming (Markov chain Monte Carlo, sequential Monte Carlo, and variational inference). Throughout the course, the instructor follows a probabilistic perspective that highlights the first principles behind the presented methods, with the ultimate goal of teaching students how to create and fit their own models.
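For a flavor of the Monte Carlo uncertainty-propagation material mentioned above, here is a minimal sketch: push samples of an uncertain input through a simple physical model and estimate the induced uncertainty in a quantity of interest. The spring model, the numbers, and all variable names are illustrative assumptions, not course material:

```python
import numpy as np

rng = np.random.default_rng(0)

# Uncertain input: spring stiffness k ~ N(2.0, 0.1^2) (illustrative numbers)
k = rng.normal(2.0, 0.1, size=100_000)

# Physical model: displacement under a unit load F = 1, u = F / k
u = 1.0 / k

# Monte Carlo estimates of the induced uncertainty in u
mean_u = u.mean()
std_u = u.std()
print(f"E[u] ~ {mean_u:.4f}, Std[u] ~ {std_u:.4f}")
```

The Monte Carlo estimates themselves carry sampling error that shrinks like 1/sqrt(N), which is the kind of quantification the course covers under "Quantify Uncertainty in Monte Carlo Estimates."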

## What you'll learn

• Represent uncertainty in parameters in engineering or scientific models using probability theory
• Propagate uncertainty through physical models to quantify the induced uncertainty in quantities of interest
• Solve basic supervised learning tasks, such as regression, classification, and filtering
• Solve basic unsupervised learning tasks, such as clustering, dimensionality reduction, and density estimation
• Create new models that encode physical information and other causal assumptions
• Calibrate arbitrary models using data
• Apply various Python coding skills
• Load and visualize data sets in Jupyter notebooks
• Visualize uncertainty in Jupyter notebooks
• Recognize basic Python libraries (e.g., Pandas, NumPy, SciPy, scikit-learn) and advanced Python libraries (e.g., PyMC3, PyTorch, Pyro, TensorFlow) commonly used in data analytics
## Syllabus

• Introduction to Predictive Modeling
• Basics of Probability Theory
• Discrete Random Variables
• Continuous Random Variables
• Collections of Random Variables
• Random Vectors
• Basic Sampling
• The Monte Carlo Method for Estimating Expectations
• Monte Carlo Estimates of Various Statistics
• Quantify Uncertainty in Monte Carlo Estimates
• Linear Regression Via Least Squares
• Bayesian Linear Regression
• Advanced Topics in Bayesian Linear Regression
• Classification
• Clustering and Density Estimation
• Dimensionality Reduction
• State-Space Models – Filtering Basics
• State-Space Models – Kalman Filters
• Gaussian Process Regression – Priors on Function Spaces
• Gaussian Process Regression – Conditioning on Data
• Bayesian Global Optimization
• Deep Neural Networks
• Deep Neural Networks Continued
• Physics-Informed Deep Neural Networks
• Sampling Methods
• Variational Inference
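To illustrate the Bayesian linear regression topic listed above, here is a minimal conjugate-posterior sketch on synthetic data. The prior precision `alpha`, noise precision `beta`, and the toy data set are assumptions for illustration, not course code:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: y = 1.5 * x + Gaussian noise (illustrative)
x = np.linspace(0.0, 1.0, 50)
X = np.column_stack([np.ones_like(x), x])   # design matrix with intercept
y = 1.5 * x + rng.normal(0.0, 0.1, size=x.size)

# Conjugate Bayesian linear regression: prior w ~ N(0, alpha^-1 I),
# Gaussian likelihood with known noise precision beta
alpha, beta = 1.0, 100.0
S_inv = alpha * np.eye(2) + beta * X.T @ X  # posterior precision
S = np.linalg.inv(S_inv)                    # posterior covariance
m = beta * S @ X.T @ y                      # posterior mean

print("posterior mean weights:", m)
```

Unlike least squares, the posterior covariance `S` quantifies how uncertain each weight remains after seeing the data, which is the probabilistic perspective the course emphasizes.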


Rating: Not enough ratings
Length: 8 weeks
Effort: 6–7 hours per week
Schedule: On Demand (start anytime)
Price: \$2,250
Provider: Purdue University via edX
Instructor: Ilias Bilionis
Devices: All desktop and mobile devices
Language: English
Subject: Engineering

## Careers

An overview of related careers and their average salaries in the US.

Counseling Theories & Models Part-Time Faculty \$17k

Trainer of Evidence Based Models \$54k

Assistant Adjunct Professor Statistical Models \$122k

Risk Analytics Tools and Models Program Manager \$136k
