In this course you learn to build, refine, extrapolate, and, in some cases, interpret models designed for a single, sequential series. Three modeling approaches are presented. The traditional Box-Jenkins approach to modeling time series is covered in the first part of the course. This presentation moves students from models for stationary data (ARMA) to models for trend and seasonality (ARIMA) and concludes with specifying transfer function components in an ARIMAX, or time series regression, model. A Bayesian approach to modeling time series is considered next. The basic Bayesian framework is extended to accommodate autoregressive variation in the data as well as dynamic input variable effects. Machine learning algorithms for time series are the third approach. Gradient boosting and recurrent neural network algorithms are particularly well suited to accommodating nonlinear relationships in the data. Examples are provided to build intuition about the effective use of these algorithms.
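As an illustration of the Box-Jenkins progression described above, the sketch below fits ARMA-, ARIMA-, and ARIMAX-style models with the open-source statsmodels library. The simulated sales series, the ad_spend input variable, and the chosen model orders are hypothetical assumptions for illustration only, not data or settings from the course.

```python
# A minimal, illustrative sketch of the ARMA -> ARIMA -> ARIMAX progression
# using the open-source statsmodels library. The simulated "sales" series,
# the "ad_spend" input, and the model orders are hypothetical assumptions.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(42)
n = 120
ad_spend = pd.Series(rng.gamma(2.0, 1.0, n))                    # hypothetical input variable
sales = pd.Series(np.linspace(0, 10, n)                         # trend
                  + 0.5 * ad_spend.to_numpy()                   # dynamic input effect
                  + rng.normal(0, 1, n))                        # noise

# ARMA: model the differenced (stationary) series, order = (p, 0, q)
arma_fit = ARIMA(sales.diff().dropna(), order=(1, 0, 1)).fit()

# ARIMA: let differencing handle the trend, order = (p, d, q)
arima_fit = ARIMA(sales, order=(1, 1, 1)).fit()

# ARIMAX / time series regression: add an exogenous input variable
arimax_fit = ARIMA(sales, exog=ad_spend, order=(1, 1, 1)).fit()
print(arimax_fit.summary())
```

In practice the orders and inputs would be chosen by examining the series (for example, its autocorrelation structure) rather than fixed in advance as they are in this sketch.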
The course concludes by considering how forecasting precision can be improved by combining the strengths of the different approaches. The final lesson includes demonstrations of creating combined (or ensemble) and hybrid model forecasts.
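As a minimal illustration of a combined forecast, the sketch below averages two hypothetical model forecasts over the same horizon with equal weights; the forecast values and weights are assumptions, not the course's specific ensembling recipe.

```python
# A minimal sketch of a combined (ensemble) forecast: a weighted average of
# forecasts from two models over the same horizon. The forecast values and
# the equal weights are hypothetical assumptions.
import numpy as np

arima_forecast = np.array([102.1, 103.4, 104.0])    # hypothetical 3-step-ahead forecasts
ml_forecast = np.array([101.5, 104.2, 104.8])

weights = np.array([0.5, 0.5])                      # equal weights as a simple baseline
ensemble_forecast = weights[0] * arima_forecast + weights[1] * ml_forecast
print(ensemble_forecast)                            # [101.8 103.8 104.4]
```

Unequal weights (for example, chosen from holdout-sample accuracy) are a common refinement of this simple average.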
This course is appropriate for analysts who want to augment their machine learning skills with tools for assaying, modifying, modeling, forecasting, and managing data consisting of variables collected over time.
This course uses a variety of software tools. Familiarity with Base SAS, SAS/ETS, SAS/STAT, and SAS Visual Forecasting, as well as open-source tools for sequential data handling and modeling, is helpful but not required. The lessons on Bayesian analysis and machine learning models assume some prior knowledge of these topics. One way that students can acquire this background is by completing these SAS Education courses: Bayesian Analyses Using SAS and Machine Learning Using SAS Viya.