We may earn an affiliate commission when you visit our partners.
Instructor: Joseph W. Cutrone, PhD

This specialization, Linear Algebra from Elementary to Advanced, is a three-course sequence offered on Coursera that covers the main topics of undergraduate linear algebra. Defined simply, linear algebra is the branch of mathematics that studies vectors, matrices, lines, and the areas and spaces they create. These concepts are foundational to almost every industry and discipline, earning linear algebra the informal name "The Theory of Everything". The specialization assumes no prior knowledge of linear algebra and requires no calculus or similar courses as prerequisites. The first course starts with the study of linear equations and matrices. Matrices and their properties, such as the determinant and eigenvalues, are covered next. The specialization ends with the theory of symmetric matrices and quadratic forms. Theory, applications, and examples are presented throughout, with examples and pictures given in low dimensions before abstracting to higher dimensions. Equal emphasis is placed on algebraic manipulation and on geometric understanding of the concepts of linear algebra. Upon completion, students will be prepared for advanced topics in data science, AI, machine learning, finance, mathematics, computer science, or economics.
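To make a couple of the objects named above concrete, here is a minimal NumPy sketch (our own illustration, not part of the specialization materials) that builds a small matrix and computes the determinant and eigenvalues mentioned in the description:

```python
import numpy as np

# A small 2x2 matrix, the kind of low-dimensional example the courses begin with.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# The determinant: a single number that signals whether A is invertible
# (nonzero determinant) and how A scales areas in the plane.
print("det(A) =", np.linalg.det(A))        # 2*3 - 1*1 = 5

# Eigenvalues and eigenvectors: the directions that A merely stretches or shrinks.
eigenvalues, eigenvectors = np.linalg.eig(A)
print("eigenvalues:", eigenvalues)
print("eigenvectors (as columns):\n", eigenvectors)
```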

Enroll now

What's inside

Three courses

Linear Algebra: Linear Systems and Matrix Equations

This first course of the three-course specialization introduces linear algebra, one of the most important and fundamental areas of mathematics, with many real-life applications. This foundational material provides both theory and applications for topics in mathematics, engineering, and the sciences.
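As a taste of what "linear systems and matrix equations" means in practice, the short sketch below (our own example, using NumPy rather than anything taken from the course) writes two equations in two unknowns as the matrix equation Ax = b and solves it:

```python
import numpy as np

# The system    x + 2y = 5
#              3x -  y = 4
# written as the single matrix equation A @ x = b.
A = np.array([[1.0,  2.0],
              [3.0, -1.0]])
b = np.array([5.0, 4.0])

x = np.linalg.solve(A, b)          # solves A @ x = b
print("solution (x, y):", x)       # approximately [1.857, 1.571]

# Substituting back recovers the right-hand side b.
np.testing.assert_allclose(A @ x, b)
```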

Linear Algebra: Matrix Algebra, Determinants, & Eigenvectors

This course continues the Linear Algebra Specialization by developing techniques to study matrices as linear transformations on vectors. We'll focus on manipulating matrices algebraically to analyze and solve systems of linear equations. We'll also study eigenvalues and eigenvectors to understand the geometry of matrix transformations. Applications include Markov chains and the Google PageRank algorithm.
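The sketch below illustrates the PageRank application in miniature (a made-up three-page link graph and the commonly cited damping factor of 0.85, both our own assumptions): power iteration drives any starting distribution toward the dominant eigenvector of the "Google matrix", and that eigenvector is the ranking.

```python
import numpy as np

# Column-stochastic link matrix for a tiny, made-up web of three pages:
# column j gives the probabilities of following a link out of page j.
# Page 0 links to pages 1 and 2; page 1 links to page 2; page 2 links to page 0.
L = np.array([[0.0, 0.0, 1.0],
              [0.5, 0.0, 0.0],
              [0.5, 1.0, 0.0]])

damping = 0.85                                           # commonly cited damping factor
n = L.shape[0]
G = damping * L + (1 - damping) / n * np.ones((n, n))    # the "Google matrix"

# Power iteration: repeatedly applying G pushes any starting distribution
# toward the eigenvector with eigenvalue 1, which holds the page ranks.
rank = np.full(n, 1.0 / n)
for _ in range(100):
    rank = G @ rank

print("PageRank scores:", rank)   # sums to 1; page 2, linked to the most, ranks highest
```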

Linear Algebra: Orthogonality and Diagonalization

This course focuses on orthogonal vectors, transformations, and bases. It culminates in the theory of symmetric matrices, linking their algebraic properties with geometric counterparts. These matrices are common in applications, including AI and machine learning.
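To hint at the endpoint of the specialization, the sketch below (again our own illustration) orthogonally diagonalizes a small symmetric matrix: NumPy's eigh routine returns real eigenvalues and orthonormal eigenvectors, so A factors as Q D Q^T with Q orthogonal and D diagonal.

```python
import numpy as np

# A small symmetric matrix (it equals its own transpose).
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])

# For symmetric matrices the eigenvalues are real and the eigenvectors can be
# chosen orthonormal; eigh is NumPy's routine specialized to this case.
eigenvalues, Q = np.linalg.eigh(A)
D = np.diag(eigenvalues)

# Q is orthogonal (Q.T @ Q is the identity) and A = Q @ D @ Q.T.
np.testing.assert_allclose(Q.T @ Q, np.eye(2), atol=1e-12)
np.testing.assert_allclose(Q @ D @ Q.T, A, atol=1e-12)

print("eigenvalues:", eigenvalues)
print("A rebuilt from Q D Q^T:\n", Q @ D @ Q.T)
```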

Our mission

OpenCourser helps millions of learners each year. People visit us to learn workplace skills, ace their exams, and nurture their curiosity.

Our extensive catalog contains over 50,000 courses and twice as many books. Browse by search, by topic, or even by career interests. We'll match you to the right resources quickly.

Find this site helpful? Tell a friend about us.

Affiliate disclosure

We're supported by our community of learners. When you purchase or subscribe to courses and programs or purchase books, we may earn a commission from our partners.

Your purchases help us maintain our catalog and keep our servers humming without ads.

Thank you for supporting OpenCourser.

© 2016 - 2024 OpenCourser