Understanding Artificial Intelligence through Algorithmic Information Theory

Artificial Intelligence is more than just a collection of brilliant, innovative methods to solve problems.

If you are interested in machine learning or are planning to explore it, this course will make you see machine learning in an entirely new way. You will learn how to formulate optimal hypotheses for a learning task, and you will be able to analyze learning techniques such as clustering or neural networks as ways of compressing information.

If you are interested in reasoning, you will understand that reasoning by analogy, reasoning by induction, explaining, proving, and so on are all alike: they all amount to providing more compact descriptions of situations.

If you are interested in mathematics, you will be amazed that crucial notions such as probability and randomness can be redefined in terms of algorithmic information. You will also understand that there are theoretical limits to what artificial intelligence can do.

If you are interested in human intelligence, you will find some intriguing results in this course. Thanks to algorithmic information, notions such as unexpectedness, interest and, to a certain extent, aesthetics can be formally defined and computed, and this may change your views on what artificial intelligence can achieve in the future.

Half a century ago, three mathematicians (Ray Solomonoff, Andrei Kolmogorov, and Gregory Chaitin) made the same discovery independently. They understood that the concept of information belonged to computer science; that computer science could say what information means. Algorithmic Information Theory was born.

Algorithmic information is what is left when all redundancy has been removed. This makes sense, as redundant content cannot add any useful information. Removing redundancy to extract meaningful information is something computer scientists are good at.
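This idea can be made concrete with an ordinary compressor. The sketch below is illustrative only (it is not course material): it uses zlib's compressed length as a computable stand-in for the incomputable Kolmogorov complexity, showing that a redundant string shrinks dramatically while structureless bytes barely shrink at all.

```python
import os
import zlib

def compressed_size(data: bytes) -> int:
    """Length of a zlib-compressed encoding of `data`.

    A computable upper bound on the (incomputable) Kolmogorov
    complexity: the shortest program printing `data` can never be
    much longer than its best compressed form.
    """
    return len(zlib.compress(data, level=9))

redundant = b"ab" * 500    # 1000 bytes of pure repetition
noise = os.urandom(1000)   # 1000 bytes with no exploitable structure

print(compressed_size(redundant))  # tiny: the redundancy is squeezed out
print(compressed_size(noise))      # near 1000: nothing left to remove
```

Any real compressor only approximates algorithmic information from above, but the contrast between the two sizes is exactly the point of the definition: information is what survives redundancy removal.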

Algorithmic information is a great conceptual tool. It describes what artificial intelligence actually does, and what it should do to make optimal choices. It also says what artificial intelligence can't do. Algorithmic information is an essential component in the theoretical foundations of AI.

Keywords:

Algorithmic information, Kolmogorov complexity, theoretical computer science, universal Turing machine, coding, compression, semantic distance, Zipf’s law, probability theory, algorithmic probability, computability, incomputability, random sequences, incompleteness theorem, machine learning, Occam's razor, minimum description length, induction, cognitive science, relevance.

What you'll learn

• Complexity as code length
• Conditional Complexity
• Complexity and frequency
• Meaning distance
• Algorithmic probability, Randomness
• Gödel’s theorem
• Universal induction - MDL
• Analogy & Machine Learning as complexity minimization
• Simplicity & coincidences
• Subjective probability
• Relevance
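Several of these topics have simple practical counterparts. For instance, the "meaning distance" item is closely related to the normalized compression distance (NCD), which can be sketched with any real compressor standing in for Kolmogorov complexity. This is an illustrative sketch under that assumption, not code from the course:

```python
import os
import zlib

def C(x: bytes) -> int:
    """Compressed length: a practical stand-in for Kolmogorov complexity."""
    return len(zlib.compress(x, level=9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance between x and y.

    Near 0 when one string is easily described given the other;
    near 1 when they share no compressible structure.
    """
    cx, cy = C(x), C(y)
    return (C(x + y) - min(cx, cy)) / max(cx, cy)

phrase = b"the quick brown fox jumps over the lazy dog " * 20
print(ncd(phrase, phrase))           # close to 0: maximally similar
print(ncd(phrase, os.urandom(900)))  # close to 1: nothing in common
```

The design choice is the one the course emphasizes: similarity is compressibility. Two objects are close when describing one given the other takes few extra bits.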


• Rating: Not enough ratings
• Length: 5 weeks
• Effort: 4 - 8 hours per week
• Availability: On Demand (Start anytime)
• Price: $49
• Provider: IMT via edX
• Instructor: Jean-Louis Dessalles
• Devices: All desktop and mobile devices
• Language: English
• Subjects: Programming, Computer Science

Careers

An overview of related careers and their average salaries in the US.

• Algorithmic Execution Desk Support: $67k
• Information Resources: $69k
• Information Security 1: $72k
• Information Service: $73k
• Information Artist: $77k
• Equity Algorithmic Quant Analyst: $83k
• Information Engineer 3: $92k
• Algorithmic Trading Developer - C++: $104k
• Information Coordinator 3: $105k
• Algorithmic Software Engineer: $111k
• Information Architect 2: $118k
