Start-Tech Academy

If you are a developer, data scientist, or AI enthusiast who wants to build and run large language models (LLMs) locally on your system, this course is for you. Do you want to harness the power of LLMs without sending your data to the cloud? Are you looking for secure, private solutions that leverage powerful tools like Python, Ollama, and LangChain? This course will show you how to build secure and fully functional LLM applications right on your own machine.

In this course, you will:

  • Set up Ollama and download the Llama LLM model for local use.

  • Customize models and save modified versions using command-line tools.

  • Develop Python-based LLM applications with Ollama for total control over your models (see the short sketch after this list).

  • Use Ollama's REST API to integrate models into your applications.

  • Leverage LangChain to build Retrieval-Augmented Generation (RAG) systems for efficient document processing.

  • Create end-to-end LLM applications that answer user questions with precision using the power of LangChain and Ollama.
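
To make the Python side of this concrete, here is a minimal sketch of one chat turn with a local model through the Ollama Python library. It assumes the ollama package is installed (pip install ollama), the Ollama server is running, and a model such as llama3.1 has already been pulled; the model and prompts used in the course may differ.

    # Minimal sketch: one chat turn with a locally running Ollama model.
    # Assumes: pip install ollama, the Ollama server is running, and
    # "ollama pull llama3.1" has been run beforehand.
    import ollama

    response = ollama.chat(
        model="llama3.1",
        messages=[{"role": "user", "content": "In one sentence, what is a local LLM?"}],
    )
    print(response["message"]["content"])

Because the call goes to the local Ollama server, no prompt or document data leaves your machine.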

Why build local LLM applications? For one, local applications ensure complete data privacy—your data never leaves your system. Additionally, the flexibility and customization of running models locally means you are in total control, without the need for cloud dependencies.

Throughout the course, you’ll build, customize, and deploy models using Python, and implement key features like prompt engineering, retrieval techniques, and model integration—all within the comfort of your local setup.

What sets this course apart is its focus on privacy, control, and hands-on experience using cutting-edge tools like Ollama and LangChain. By the end, you’ll have a fully functioning LLM application and the skills to build secure AI systems on your own.

Ready to build your own private LLM applications? Enroll now and get started.

What's inside

Learning objectives

  • Download and install Ollama for running LLM models on your local machine
  • Set up and configure the Llama LLM model for local use
  • Customize LLM models using command-line options to meet specific application needs
  • Save and deploy modified versions of LLM models in your local environment
  • Develop Python-based applications that interact with Ollama models securely
  • Call and integrate models via Ollama's REST API for seamless interaction with external systems (see the sketch after this list)
  • Explore OpenAI compatibility within Ollama to extend the functionality of your models
  • Build a Retrieval-Augmented Generation (RAG) system to process and query large documents efficiently
  • Create fully functional LLM applications using LangChain, Ollama, and tools like agents and retrieval systems to answer user queries
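
As a rough illustration of the REST API objective above, the sketch below calls Ollama's HTTP endpoint directly from Python using the requests library. It assumes the Ollama server is listening on its default port (11434) and that a model tagged llama3.1 is available locally; the course also covers Ollama's OpenAI-compatible endpoint, which is called in a similar way.

    # Minimal sketch: calling the Ollama REST API with the requests library.
    # Assumes the Ollama server is running on localhost:11434 and a model
    # tagged "llama3.1" has been pulled locally.
    import requests

    payload = {
        "model": "llama3.1",
        "prompt": "Explain retrieval-augmented generation in two sentences.",
        "stream": False,  # return a single JSON object instead of a token stream
    }
    r = requests.post("http://localhost:11434/api/generate", json=payload, timeout=120)
    r.raise_for_status()
    print(r.json()["response"])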

Syllabus

Getting started with local models
Introduction
Downloading and Installing Ollama
Setting up Ollama and downloading Llama LLM model
Model Customization options in CMD or terminal
Creating, saving and using a modified Ollama model
Using Ollama with Python
Installing and Setting up Python
Using Ollama library in Python
Calling the Model using Ollama REST API
Ollama OpenAI Compatibility
Using LangChain in Python for LLM applications
What is LangChain and why are we using it
Basics of LangChain - Prompt Templates and LLM Models
Basics of LangChain - Formatting the output
Quiz
Building Retrieval Augmented Generation - RAG applications
Concept of Retrieval Augmented Generation RAG System
What is the RAG Process
Loading and Chunking the document using LangChain and Ollama
Embedding chunks using LangChain and Ollama
Building complete RAG application for answering user questions - Part 1
Building complete RAG application for answering user questions - Part 2
Building Tools and Agents based applications
Understanding Tools and Agents
Tool calling with LangChain and Llama 3.1
Agents using LangChain and Llama 3.1
Conclusion
About your certificate
Bonus lecture
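
The final syllabus section above covers tool calling and agents with LangChain and Llama 3.1. As a rough sketch of what tool calling can look like (one plausible arrangement; the course's own code may differ), assuming the langchain-core and langchain-ollama packages and a tool-calling-capable model pulled locally:

    # Minimal tool-calling sketch with LangChain and a local Ollama model.
    # Assumes: pip install langchain-core langchain-ollama, and a
    # tool-calling-capable model such as llama3.1 pulled locally.
    from langchain_core.tools import tool
    from langchain_ollama import ChatOllama

    @tool
    def add(a: int, b: int) -> int:
        """Add two integers and return the sum."""
        return a + b

    llm = ChatOllama(model="llama3.1").bind_tools([add])
    reply = llm.invoke("Use the add tool to compute 19 + 23.")
    print(reply.tool_calls)  # the tool invocations the model requested, if any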

Good to know

Know what's good, what to watch for, and possible dealbreakers
Focuses on building secure and private LLM solutions, which is crucial for professionals handling sensitive data and those prioritizing data governance
Emphasizes hands-on experience with cutting-edge tools like Ollama and LangChain, which are essential for staying current in the rapidly evolving field of AI
Covers Retrieval-Augmented Generation (RAG) systems, which are increasingly important for efficient document processing and knowledge retrieval in various industries
Requires installing and setting up Python, which may pose a challenge for learners without prior programming experience, but is a foundational skill for AI development
Uses Llama 3.1, which may be updated in the near future, requiring learners to adapt their knowledge to newer versions of the model as they are released
Teaches Ollama, which is a relatively new tool, so learners should be prepared for potential updates, changes, and a smaller community support base compared to more established tools

Activities

Be better prepared before your course. Deepen your understanding during and after it. Supplement your coursework and achieve mastery of the topics covered in Build local LLM applications using Python and Ollama with these activities:
Review Python Fundamentals
Reinforce your understanding of Python syntax, data structures, and control flow before diving into LLM application development.
  • Review Python syntax and data types.
  • Practice writing basic Python functions.
  • Work through introductory Python tutorials.
Review REST API Concepts
Familiarize yourself with REST API concepts to effectively interact with Ollama's API endpoints.
  • Study the basics of RESTful architecture.
  • Learn about HTTP methods (GET, POST, PUT, DELETE).
  • Explore API documentation and examples.
Read 'Natural Language Processing with Python'
Gain a deeper understanding of NLP concepts to enhance your LLM application development skills.
  • Read the introductory chapters on NLP fundamentals.
  • Explore the sections on text processing and analysis.
  • Experiment with the NLTK library for NLP tasks.
Follow LangChain Tutorials
Enhance your LangChain skills by working through official tutorials and example projects; a short prompt-template sketch follows these steps.
  • Explore the LangChain documentation website.
  • Work through tutorials on prompt engineering.
  • Implement a basic RAG application using LangChain.
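
As a small example of the prompt-template step, here is a minimal LangChain sketch that drives a local Ollama model. It is illustrative only: it assumes the langchain-core and langchain-ollama packages and a locally pulled llama3.1 model, and the course's own prompts and chain style may differ.

    # Minimal prompt-template sketch with LangChain and a local Ollama model.
    # Assumes: pip install langchain-core langchain-ollama, and llama3.1 pulled locally.
    from langchain_core.prompts import ChatPromptTemplate
    from langchain_ollama import ChatOllama

    prompt = ChatPromptTemplate.from_messages([
        ("system", "You are a concise assistant. Answer in one short paragraph."),
        ("human", "{question}"),
    ])
    llm = ChatOllama(model="llama3.1")
    chain = prompt | llm  # LCEL pipe: fill the template, then call the model
    print(chain.invoke({"question": "Why run an LLM locally instead of in the cloud?"}).content)
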
Build a Simple Question Answering App
Solidify your understanding by building a question-answering application using Ollama, LangChain, and a local LLM; a minimal end-to-end sketch follows these steps.
  • Set up Ollama and download a suitable LLM.
  • Implement a RAG pipeline using LangChain.
  • Create a user interface for querying the application.
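
The steps above can be stitched together in a few lines of Python. Below is a minimal end-to-end sketch (one plausible arrangement, not the course's exact code). It assumes the langchain-community, langchain-text-splitters, and langchain-ollama packages, a running Ollama server with llama3.1 and the nomic-embed-text embedding model pulled, and a hypothetical local file named notes.txt to query; a user interface would sit on top of this.

    # Minimal RAG question-answering sketch with LangChain and Ollama.
    # Assumes: pip install langchain-community langchain-text-splitters langchain-ollama,
    # plus "ollama pull llama3.1" and "ollama pull nomic-embed-text", and a local
    # file named notes.txt (a hypothetical placeholder) to ask questions about.
    from langchain_community.document_loaders import TextLoader
    from langchain_core.vectorstores import InMemoryVectorStore
    from langchain_ollama import ChatOllama, OllamaEmbeddings
    from langchain_text_splitters import RecursiveCharacterTextSplitter

    # 1. Load the document and split it into overlapping chunks.
    docs = TextLoader("notes.txt").load()
    chunks = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50).split_documents(docs)

    # 2. Embed the chunks and index them in an in-memory vector store.
    store = InMemoryVectorStore.from_documents(chunks, OllamaEmbeddings(model="nomic-embed-text"))

    # 3. Retrieve the most relevant chunks and answer from them only.
    question = "What are the key points in these notes?"
    context = "\n\n".join(d.page_content for d in store.similarity_search(question, k=3))
    llm = ChatOllama(model="llama3.1")
    answer = llm.invoke(f"Answer using only this context:\n{context}\n\nQuestion: {question}")
    print(answer.content)
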
Contribute to Ollama Documentation
Deepen your understanding of Ollama by contributing to its open-source documentation.
  • Explore the Ollama GitHub repository.
  • Identify areas in the documentation that need improvement.
  • Submit a pull request with your documentation changes.
Write a Blog Post on Local LLMs
Share your knowledge by writing a blog post about building local LLM applications with Python and Ollama.
  • Choose a specific topic related to local LLMs.
  • Research and gather information for your blog post.
  • Write, edit, and publish your blog post.

Career center

Learners who complete Build local LLM applications using Python and Ollama will develop knowledge and skills that may be useful to these careers:
Artificial Intelligence Developer
An Artificial Intelligence Developer designs and implements AI solutions, and this course provides strong support for that ambition. You will learn to build and deploy large language models locally, which is frequently required for AI development. By focusing on tools like Ollama, Python, and LangChain, this course prepares you to handle various AI development tasks, including creating retrieval-augmented generation systems. Developing Python-based LLM applications and using Ollama's REST API are skills directly transferable to the role of Artificial Intelligence Developer.
AI Research Engineer
An AI Research Engineer designs and conducts experiments on AI models and algorithms. This course is a natural fit for any AI Research Engineer interested in developing and experimenting with large language models locally. By learning how to build, customize, and deploy models using Python, Ollama, and LangChain, this course provides tools for applied researchers in a private environment that they control completely. Creating retrieval-augmented generation systems specifically enhances capabilities in utilizing large amounts of data.
Machine Learning Engineer
A Machine Learning Engineer builds, tests, and deploys machine learning models, and this course is highly relevant to this endeavor. This course focuses on running large language models locally, teaching skills in model customization and deployment, mirroring the work of a Machine Learning Engineer who is responsible for model development and management. The course focuses on tools like Python, Ollama, and LangChain, technologies often used to build and deploy local applications. The hands-on experience in building Retrieval-Augmented Generation systems and creating complete LLM applications directly contributes to skills needed in this engineering role.
AI Product Manager
An AI Product Manager is responsible for guiding the development and strategy of AI products, and this course provides a strong foundation for this role. The course covers the practical aspects of building local LLM applications, which helps the product manager appreciate the possibilities and limitations of this technology. The ability to understand the capabilities of tools like Ollama, Python, and LangChain lets the product manager make informed decisions about product development. The focus on privacy and local deployment is also especially important for the AI product manager.
Machine Learning Operations Engineer
A Machine Learning Operations Engineer focuses on deployment, maintenance, and scaling of machine learning models, and this course directly addresses key aspects of that role. Focusing on the use of Python, Ollama, and LangChain, the course develops the ability to both deploy and customize language models in a controlled environment. The course's emphasis on building tools and integrating them through the REST API is also central to the work of a Machine Learning Operations Engineer, making this course a great fit.
Natural Language Processing Engineer
A Natural Language Processing Engineer focuses on building systems that enable computers to understand and process human language. This course may be useful for a Natural Language Processing Engineer because it covers essential aspects of local LLM development. You will learn core skills in building LLM applications using Python and LangChain, and will gain experience in customizing models. The core focus on local deployment makes this course a potential resource for any engineer focused on applications that must remain private or air-gapped.
Prompt Engineer
A Prompt Engineer specializes in crafting effective prompts for large language models, and this course may be useful for that purpose. The course provides a solid grounding in LLM interaction, which is a core skill for any prompt engineer. By learning how to build and run language models locally through Ollama, the learner gains hands-on experience in how LLMs respond to different inputs. This hands-on experience, including the development of retrieval augmented generation systems, develops a deeper understanding of how LLMs function.
Data Scientist
A Data Scientist uses data analysis and machine learning to gain insights and build predictive models. This course may be useful for a Data Scientist interested in exploring or implementing local large language models into their analytics toolkit. By learning how to run models locally using tools such as Ollama and Python, a data scientist can gain a deeper understanding of LLM behavior. The course also provides skills for building retrieval augmented generation systems, expanding a data scientist's ability to process unstructured text data.
Software Engineer
A Software Engineer designs, develops, and maintains software systems. This course may be useful for a Software Engineer who seeks to incorporate local large language models into their applications. The focus on using Python, Ollama, and LangChain to create LLM functionality is directly applicable to software development projects. Creating end-to-end LLM applications and using the Ollama REST API are core functions of the software engineer who desires to incorporate LLM-based tools into their products.
Computational Linguist
A Computational Linguist uses computers to analyze and process human language, and this course may be useful for those doing projects that must be kept local. This course teaches tools like Python, LangChain and Ollama that are used to build language processing applications. The course provides valuable hands-on experience that is applicable to computational linguistics, while the focus on local deployment gives the learner capabilities that are especially valuable for projects where data must remain private or in an air-gapped setting.
Data Engineer
A Data Engineer builds and maintains the infrastructure for data storage and processing, and this course may be useful for data engineers who are working with AI tools. The course covers the practical application of large language models and their local deployment through tools like Python and Ollama. The ability to build retrieval-augmented generation systems may provide a data engineer with unique and applicable opportunities in their work. Likewise, the ability to integrate local models into other tools via a REST API also aligns well with the work of a Data Engineer.
Research Scientist
A Research Scientist conducts research to discover new knowledge or improve existing products and processes. This course may be useful for a Research Scientist interested in large language models, especially those who need to run experiments locally for privacy or customization reasons. The course specifically focuses on the practical application of language models in a local environment, allowing a researcher to test hypotheses and build sophisticated applications without cloud dependencies. Skills gained building retrieval augmented generation systems are useful for enhancing research capabilities by improving processing of unstructured data.
Solutions Architect
A Solutions Architect designs and oversees the implementation of technology solutions for an organization, and this course may be useful for an architect interested in incorporating local LLM technology. The course will develop skills necessary to evaluate and integrate LLMs in a local environment, as well as understand how they can be incorporated into a wider system. Creating end-to-end LLM applications and integrating them through the REST API are core skills that the solutions architect should possess.
Robotics Engineer
A Robotics Engineer designs, develops, and tests robots and robotic systems. This course may be useful for a Robotics Engineer seeking to incorporate local large language models into their projects for enhanced interaction or data processing. The ability to build local LLM applications with Python and Ollama may enable the robotics engineer to create sophisticated functionalities in a self-contained environment. This could include richer interfaces for human operators, or local processing of environmental data for more intelligent behavior.
Data Analyst
A Data Analyst interprets data and provides insights to drive decision making. This course may be useful for a Data Analyst who wishes to explore the application of large language models in data analysis workflows. Skills developed in this course, such as the ability to build retrieval augmented generation systems which can process unstructured data, may prove useful. Gaining a hands-on understanding of how local LLMs function will give a data analyst a deeper appreciation of the possibilities.

Reading list

We've selected one book that we think will supplement your learning. Use it to develop background knowledge, enrich your coursework, and gain a deeper understanding of the topics covered in Build local LLM applications using Python and Ollama.
Natural Language Processing with Python provides a comprehensive introduction to NLP using Python and the NLTK library. While the course focuses on Ollama and LangChain, understanding the fundamentals of NLP will greatly enhance your ability to build effective LLM applications. This book is particularly useful for understanding text processing techniques and basic NLP concepts. It serves as a valuable reference for those new to the field.
