We may earn an affiliate commission when you visit our partners.
Rajeev Sakhuja

Are you interested in learning generative AI, but feel intimidated by the complexities of AI and ML?

If your answer is YES, then this course is for you.

I structured this course based on my own journey learning generative AI technology. Having faced the challenges firsthand, I've designed it to make the learning process easier and more accessible. This course is tailored specifically for those without an AI or ML background, helping you quickly get up to speed with generative AI.


Designed specifically for IT professionals, developers, and architects with no prior AI/ML background, this course will empower you to build intelligent, innovative applications using Large Language Models (LLMs). You’ll gain practical, hands-on experience in applying cutting-edge generative AI technologies without the steep learning curve of mastering complex algorithms or mathematical theories.

Here is an overview of course structure & coverage:

Generative AI Foundations: Dive into the core concepts of Large Language Models (LLMs), and learn how to work with powerful models like Google Gemini, Anthropic Claude, OpenAI GPT, and multiple open-source/Hugging Face LLMs.

Building Generative AI Applications: Discover practical techniques for creating generative AI applications, including prompting techniques, inference control, in-context learning, RAG patterns (naive and advanced), agentic RAG, vector databases & much more.

Latest Tools and Frameworks: Gain practical experience with cutting-edge tools like LangChain, Streamlit, Hugging Face, and popular vector databases like Pinecone and ChromaDB.

Try out multiple LLMs: The course doesn't depend on a single LLM for hands-on exercises; learners are encouraged to use multiple models so they learn the nuances of each model's behavior.

Learning Reinforcement: After each set of conceptual lessons, students are given exercises, projects, and quizzes to solidify their understanding and reinforce the material covered in previous lessons.

Harnessing the Power of Hugging Face: Master the Hugging Face platform, including its tools, libraries, and community resources, to effectively utilize pre-trained models and build custom applications.

Advanced Techniques: Delve into advanced topics like embeddings, search algorithms, model architecture, and fine-tuning to enhance your AI capabilities.

Real-World Projects: Apply your knowledge through hands-on projects, such as building a movie recommendation engine and a creative writing workbench.

Course Features

  • 18+ Hours of Video Content

  • Hands-On Projects and Coding Exercises

  • Real-World Examples

  • Quizzes for Learning Reinforcement

  • GitHub Repository with Solutions

  • Web-Based Course Guide

By the end of this course, you'll be well-equipped to leverage Generative AI for a wide range of applications, from natural language processing to content generation and beyond.

Who Is This Course For?

This course is perfect for:

  • IT professionals, application developers, and architects looking to integrate generative AI into their applications.

  • Students or professionals preparing for interviews for roles related to generative AI.

  • Those with no prior experience in AI/ML who want to stay competitive in today’s rapidly evolving tech landscape.

  • Anyone interested in learning how to build intelligent systems that solve real-world business problems using AI.

Why Choose This Course?

Raj structured this course based on his own experience learning generative AI technology. He applied his firsthand knowledge of the challenges involved to create a structured course that makes it simple for anyone without an AI/ML background to get up to speed with generative AI fast.

  • No AI/ML Background Needed: This course is designed for non-experts and beginners in AI/ML.

  • Hands-On Learning: Engage in practical, real-world projects and coding exercises that bring AI concepts to life.

  • Expert Guidance: Learn from Rajeev Sakhuja, a seasoned IT consultant with over 20 years of industry experience.

  • Comprehensive Curriculum: Over 18 hours of video lessons, quizzes, and exercises, plus a web-based course guide to support you throughout your learning journey.

  • Latest Tools and Frameworks: Gain practical experience with cutting-edge tools like LangChain, Streamlit, Hugging Face, and popular vector databases like Pinecone.

This course is NOT for:

  • Folks looking for a deep dive into the internals of generative AI models

  • Those looking to gain an understanding of the mathematics behind the models

  • IT professionals interested in data science roles


What's inside

Learning objectives

  • Learners will master the core concepts and principles behind generative AI, how LLMs work, and how they can be applied in real-world use cases.
  • Develop practical skills to design, architect, and implement smart applications leveraging a wide range of LLMs from both open and closed sources.
  • Learn how to effectively use popular tools including LangChain, Hugging Face, Streamlit, Ollama, and more.
  • Understand and implement best practices for efficiency, scalability, and responsible use of generative AI in production environments.
  • Identify and navigate challenges specific to generative AI applications.
  • Prepare for intermediate-level technical interviews for roles in generative AI.
  • Learn how in-context learning, RAG, and fine-tuning work under the covers.
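To give a flavor of the last objective, the naive RAG pattern can be sketched in a few lines of plain Python. This is an illustrative toy only (the documents, the word-overlap scoring heuristic, and the prompt wording are all invented for this sketch, not the course's code); real pipelines use embeddings and a vector database such as Pinecone or ChromaDB.

```python
import re

def words(text):
    """Lowercase word set, stripping punctuation (toy scoring helper)."""
    return set(re.findall(r"[a-z]+", text.lower()))

def retrieve(question, documents):
    """Naive retrieval: return the document with the most word overlap."""
    return max(documents, key=lambda d: len(words(question) & words(d)))

def build_prompt(question, context):
    """Assemble the augmented prompt: retrieved context, then the question."""
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

documents = [
    "Pinecone is a managed vector database.",
    "Streamlit builds simple web UIs in Python.",
]
context = retrieve("What is Pinecone?", documents)
prompt = build_prompt("What is Pinecone?", context)
print(context)  # the Pinecone document wins the overlap score
```

The prompt string would then be sent to an LLM; the model answers from the supplied context rather than from its training data alone.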

Syllabus

Introduction

Meet your instructor!!!

Course outline, tips etc.

Learners will install the tools that they will use in the course

Provides an overview of what's covered in this section.

In this video you will follow the instructions to set up the tools and course repository on your machine.

Hands on experience is a key part of this course. In this lesson you will learn about the various ways in which the course will enhance your learning experience.

In this lesson I will go over the options to access the models.

Learners will be able to explain the concepts of Generative AI, discuss the working of LLMs at a high level, and describe the relationship between classical ML and Gen AI.

Discusses the objective and lessons covered in this section

Explore the evolution of Artificial Intelligence over the past two decades. This lesson provides an overview of AI technologies like Machine Learning (ML), Neural Networks, and Generative AI, laying the foundation for deeper understanding.

Delve into the basic building blocks of Generative AI—neurons and neural networks. Understand how deep learning networks work and why they are pivotal to AI models.

Interact with a neural network to solve mathematical problems, demystifying the underlying mechanisms. This hands-on exercise helps reinforce your understanding of how these networks operate.

Gain insight into how a Generative AI model functions from an external perspective. This lesson simplifies complex AI models by exploring their behavior without diving into technical intricacies.

Test your knowledge of Generative AI and its core concepts through this quiz, reinforcing your understanding of the material covered so far.

Learn how to build Generative AI applications. Understand the process of accessing models, and explore the differences between open-source and closed-source models.

Experience setting up access to a Google Gemini hosted model. This hands-on exercise teaches you how to integrate these models into your code for real-world applications.

Discover the capabilities of Hugging Face, a leading platform for AI models. Learn about its inference endpoints, gated models, and libraries essential for building AI applications.

Walk through the Hugging Face portal to familiarize yourself with its features. This exercise will help you navigate its tools and understand how to leverage its resources effectively.

Create an account on Hugging Face, request access to gated models, and generate access tokens. This exercise will enable you to interact with models using your tokens.

Check your understanding of Generative AI and Hugging Face with this quiz, designed to review key concepts and practical skills you’ve acquired.

Learn the fundamentals of Natural Language Processing (NLP) and its subsets, Natural Language Understanding (NLU) and Natural Language Generation (NLG). This lesson introduces the key concepts that power AI language models.


Explore how Large Language Models (LLMs) handle NLP tasks. Understand the basics of transformer architecture and the differences between encoder-only and decoder-only models.

Use the Hugging Face portal to find and apply models for specific NLP tasks. This exercise helps solidify your understanding of LLMs in practical applications.

Test your grasp of NLP concepts, including NLP, NLU, NLG, and how LLMs execute these tasks, with this knowledge-check quiz.

Section begins by explaining how models are named, providing insight into the structure and capabilities of different models. It then delves into various model types, including instruct, embedding, and chat models.

Provides an overview of the topics covered in this section.

In this lesson I will introduce you to Ollama, a platform for hosting models locally.

In this lesson you will learn how to host models with HTTP endpoints using Ollama. In addition, I will demonstrate pre-built chat apps for Ollama.

Learn how creators or providers assign names to AI models, and what these names reveal about their architecture, capabilities, and intended use cases.

Explore the key differences between instruct models, embedding models, and chat models, and see how platforms like Hugging Face use them to build AI applications.

Test your understanding of base, instruct, embedding, and chat models by completing this quiz and reinforcing key concepts from the lessons.

Discover how language models predict the next word in a sequence and tackle the fill-mask task, a common NLP challenge that evaluates a model’s vocabulary knowledge.

Dive into decoding parameters and understand how they shape a model’s output, with a walkthrough of commonly used controls in transformer-based models.

Understand how randomness is controlled in model outputs using hyperparameters like temperature, top-p, and top-k, to fine-tune creative or deterministic outputs.
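To make these controls concrete, here is a pure-Python sketch of the sampling step (a toy three-token vocabulary with invented logit values; real libraries implement the same idea over tensors, and exact API names vary by framework):

```python
import math
import random

def sample_next_token(logits, temperature=1.0, top_k=None, top_p=None, seed=None):
    """Sample one token id from raw logits using common decoding controls.
    Toy illustration of temperature, top-k, and top-p (nucleus) sampling."""
    # Temperature rescales logits: values < 1 sharpen the distribution
    # (more deterministic), values > 1 flatten it (more creative).
    scaled = [l / temperature for l in logits]
    exp = [math.exp(s - max(scaled)) for s in scaled]  # stable softmax
    total = sum(exp)
    probs = sorted(((p / total, i) for i, p in enumerate(exp)), reverse=True)
    # top-k keeps only the k most likely tokens.
    if top_k is not None:
        probs = probs[:top_k]
    # top-p keeps the smallest set whose cumulative probability reaches p.
    if top_p is not None:
        kept, cum = [], 0.0
        for p, i in probs:
            kept.append((p, i))
            cum += p
            if cum >= top_p:
                break
        probs = kept
    # Sample from the surviving (renormalized) candidates.
    rng = random.Random(seed)
    r = rng.uniform(0, sum(p for p, _ in probs))
    for p, i in probs:
        r -= p
        if r <= 0:
            return i
    return probs[-1][1]

logits = [2.0, 1.0, 0.1]  # toy vocabulary of three tokens
print(sample_next_token(logits, temperature=0.01))  # near-greedy: token 0
```

With a very low temperature or top_k=1 the output collapses to the most likely token; raising temperature or top_p widens the candidate pool and increases variety.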

Get hands-on with the Cohere API, register for a key, and explore randomness control by adjusting key parameters to impact model output.


Learn how to use frequency penalty and decoding penalty to manage the diversity of responses generated by a model.

Explore how max output tokens and stop sequences help control the length of the model’s generated content for more focused results.

Apply what you’ve learned by tuning decoding parameters in real-world tasks to see how they affect the model’s behavior and outputs.


Check your understanding of decoding hyperparameters like temperature, max tokens, and others by taking this quiz.


Learn how In-Context Learning allows models to mimic human learning by using examples, including techniques like zero-shot and few-shot learning.
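A quick illustrative sketch of the difference: few-shot prompting simply prepends labeled examples to the prompt so the model can infer the task pattern in context. The reviews, labels, and prompt wording below are invented for illustration.

```python
def zero_shot(text):
    """Zero-shot: task instruction only, no examples."""
    return (
        "Classify the sentiment of this review as positive or negative.\n"
        f"Review: {text}\nSentiment:"
    )

def few_shot(text, examples):
    """Few-shot: labeled examples first, then the new input to complete."""
    shots = "\n".join(f"Review: {r}\nSentiment: {s}" for r, s in examples)
    return f"{shots}\nReview: {text}\nSentiment:"

examples = [
    ("Loved every minute.", "positive"),
    ("A total waste of time.", "negative"),
]
print(few_shot("Surprisingly good.", examples))
```

The model sees two worked examples and is expected to continue the pattern for the third review.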


Assess your knowledge of In-Context Learning, and concepts like zero-shot, few-shot, and fine-tuning through this comprehensive quiz.

Section begins with a hands-on exercise on installing and using the Hugging Face Transformers library in Python, followed by an exploration of task pipelines and pipeline classes. The section also covers the Hugging Face Hub, inference classes, a summarization task, and the Hugging Face CLI.

Provides an overview of topics covered in the section.

Get an overview of the Hugging Face Transformers library, followed by a step-by-step guide on how to install it and use it in Python for building AI applications.

Understand how task pipelines work in Hugging Face, explore key pipeline classes, and see practical demonstrations of their use for tasks like text classification and translation.

Test your knowledge of the Hugging Face Transformers library, including how to use task pipelines effectively in various applications.

Learn how to interact with the Hugging Face Hub to access model endpoints, manage model repositories, and integrate them into your projects for inference tasks.

Check your understanding of the Hugging Face Hub, its endpoints, and the inference classes used to streamline model interaction.

Explore both abstractive and extractive summarization methods, then apply Hugging Face models to implement a summarization task and experiment with real data.

Learn how to use the Hugging Face CLI to manage tasks, including model caching and cache cleanup, while streamlining workflows with locally stored models.

The content in this section is OPTIONAL; you may come back to it later. The section starts with an exploration of tensors, the multi-dimensional arrays fundamental to neural networks, and how pipeline classes transform them into meaningful task outputs.

Lesson provides an overview of lessons covered in this section.

Learn the foundational concept of tensors, which represent the multi-dimensional arrays produced by neural networks. Understand how pipeline classes transform tensors into meaningful task outputs.

Explore model configuration classes to compare and understand the underlying architecture of Hugging Face models, including parameters like hidden layers and vector dimensions.

Dive into the critical role of tokenizers in converting text into input for models. This lesson explains what tokenizers are and demonstrates how to use Hugging Face tokenizer classes effectively.
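To illustrate the tokenizer contract (this is not Hugging Face's implementation, which uses learned subword vocabularies like BPE or WordPiece), a toy word-level tokenizer might look like this:

```python
class ToyTokenizer:
    """Toy word-level tokenizer illustrating the encode/decode contract:
    text in -> integer ids out, and back again."""

    def __init__(self):
        self.vocab = {"[UNK]": 0}  # id 0 reserved for unknown tokens

    def encode(self, text):
        ids = []
        for word in text.lower().split():
            if word not in self.vocab:
                self.vocab[word] = len(self.vocab)  # grow vocab on the fly
            ids.append(self.vocab[word])
        return ids

    def decode(self, ids):
        inverse = {i: w for w, i in self.vocab.items()}
        return " ".join(inverse.get(i, "[UNK]") for i in ids)

tok = ToyTokenizer()
ids = tok.encode("hello generative ai hello")
print(ids)              # [1, 2, 3, 1] -- a repeated word reuses its id
print(tok.decode(ids))  # "hello generative ai hello"
```

Real tokenizers also handle padding, truncation, and special tokens, but the core mapping between text and ids is the same idea.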

Learn what logits represent in machine learning, and explore their use in Hugging Face task-specific classes. This lesson includes a code walkthrough showing logits in action.
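Concretely, logits are raw model scores that a softmax turns into probabilities. A minimal sketch, with toy logit values invented for illustration:

```python
import math

def softmax(logits):
    """Convert raw logits into a probability distribution.
    Subtracting the max keeps exp() numerically stable."""
    m = max(logits)
    exp = [math.exp(l - m) for l in logits]
    total = sum(exp)
    return [e / total for e in exp]

# Toy logits for three candidate classes.
logits = [3.2, 1.1, -0.4]
probs = softmax(logits)
print(round(sum(probs), 6))     # 1.0 -- probabilities sum to one
print(probs.index(max(probs)))  # 0   -- the largest logit wins
```

Task-specific classes apply exactly this kind of post-processing to the tensors a model returns.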

Discover the flexibility of auto model classes, which automatically load appropriate models for various tasks. See how they simplify working with different Hugging Face models in practice.

Test your knowledge of Hugging Face tokenizers, model configurations, and auto model classes with this quiz, designed to reinforce key concepts covered in the lessons.

Learn about different types of Question/Answering tasks, then design and implement your own question-answering system using Hugging Face models, combining theory with hands-on practice.

This section starts with a discussion of common challenges with LLMs. A major part of the section is dedicated to prompting techniques.

Provides overview of topics covered in this section.

Learn about LangChain template classes for creating complex and reusable templates.

Explore ICL from the LLM challenges perspective. Understand prompt engineering practices, transfer learning, and fine-tuning.

Find domain-adapted models on Hugging Face for specific industries or tasks.

Learn about prompt structure and general best practices.

Continue discussing prompt engineering best practices.

Test your understanding of prompt engineering and practice fixing prompts.

Understand how LLMs learn from few-shot prompts and the data requirements for ICL, fine-tuning, and pre-training. Learn best practices for few-shot and zero-shot prompts.

Test your knowledge of few-shot and zero-shot prompting and practice fixing prompts for Named Entity Recognition (NER).

Learn about the Chain of Thought (CoT) technique and how it enhances LLM responses.

Test your understanding of the CoT technique.

Learn about the self-consistency technique and how it enhances LLM responses.
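The core of self-consistency is a majority vote over several independently sampled answers. A toy sketch (the sampled answers below are invented; in practice each comes from a separate high-temperature chain-of-thought run of the same prompt):

```python
from collections import Counter

def self_consistency(answers):
    """Majority vote over independently sampled final answers."""
    return Counter(answers).most_common(1)[0][0]

# Pretend these final answers were extracted from five CoT samples.
samples = ["42", "42", "41", "42", "40"]
print(self_consistency(samples))  # "42"
```

Even if individual reasoning chains occasionally go wrong, the most frequent final answer tends to be the most reliable one.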

Learn how the tree of thoughts technique can be used for solving reasoning and logical problems. Compare it to other techniques.

Test your knowledge of various prompting techniques and apply them to solve a task.

Use your knowledge of prompting techniques to build a creative workbench for a marketing team.

This section provides a comprehensive overview of LangChain, a powerful framework for building and managing LLM applications. It covers key concepts such as prompt templates, few-shot prompt templates, LangChain Expression Language (LCEL), and Runnable classes.

Lesson provides an overview of the topics covered in this section.

Explore LangChain FewShotPromptTemplate and example selector classes.


Understand that there’s no universal prompt for all LLMs and learn how to address this challenge.

Learn how to invoke LLMs, stream responses, implement batch jobs, and use Fake LLMs for development.

Practice invoking, streaming, and batching with LLMs, and experiment with Fake LLMs.


Understand how the LLM client utility is implemented.

Test your knowledge of prompt templates, LLMs, and Fake LLMs.

Learn about LangChain chains and components, LangChain Expression Language (LCEL), and a demo of LCEL usage.


Build a compound sequential chain using LCEL and the pipe operator.
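The pipe-operator idea can be illustrated with a toy re-implementation. This is not LangChain code — a real LCEL chain pipes Runnable objects, as in `prompt | llm | parser` — but tiny wrappers around plain functions show the same left-to-right composition:

```python
class Step:
    """Toy stand-in for a Runnable: wraps a function and supports `|`."""

    def __init__(self, fn):
        self.fn = fn

    def __or__(self, other):
        # `a | b` builds a new Step that runs a, then feeds its output to b.
        return Step(lambda x: other.fn(self.fn(x)))

    def invoke(self, x):
        return self.fn(x)

# Invented stand-ins for a prompt template, an LLM, and an output parser.
make_prompt = Step(lambda topic: f"Tell me a joke about {topic}")
fake_llm = Step(lambda prompt: f"FAKE-LLM({prompt})")
parse = Step(lambda text: text.upper())

chain = make_prompt | fake_llm | parse
print(chain.invoke("cats"))  # FAKE-LLM(TELL ME A JOKE ABOUT CATS)
```

Each stage's output becomes the next stage's input, which is exactly the mental model LCEL's pipe syntax encodes.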

Learn about essential Runnable classes for building gen AI task chains.


Continue learning about essential Runnable classes for building gen AI task chains.

Familiarize yourself with common LCEL patterns using the LCEL cheatsheet and how-tos documentation.

Re-write the creative writing workbench project using LCEL and Runnable classes.

Test your knowledge of LCEL, Runnables, and chains.

This section focuses on the challenges and techniques associated with obtaining structured responses from large language models (LLMs). It starts by comparing different data formats and highlighting the need for structured LLM responses, then covers LangChain output parsers.

Compare structured, unstructured, and semi-structured data. Understand the need for structured LLM responses and best practices for achieving them.


Learn about LangChain output parsers and how to use different types.

Write code to use the LangChain EnumOutputParser.
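For intuition, the parser contract looks roughly like this toy enum parser. It mimics the behavior of an enum-style output parser (format instructions go into the prompt; the raw LLM text is validated on the way out) but is not LangChain's actual EnumOutputParser code:

```python
class ToyEnumParser:
    """Toy output parser constraining a response to a fixed set of values."""

    def __init__(self, allowed):
        self.allowed = allowed

    def get_format_instructions(self):
        # Text you would append to the prompt to steer the LLM's output.
        return "Respond with exactly one of: " + ", ".join(self.allowed)

    def parse(self, text):
        # Normalize the raw LLM response, then reject anything off-list.
        value = text.strip().lower()
        if value not in self.allowed:
            raise ValueError(f"unexpected response: {text!r}")
        return value

parser = ToyEnumParser(["positive", "negative", "neutral"])
print(parser.parse("  Positive\n"))  # normalizes to "positive"
```

Pydantic-based parsers extend the same pattern to whole structured objects instead of a single enum value.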

Write code to use the LangChain PydanticOutputParser.

Understand the application requirements and your tasks for the creative writing workbench project.

Step-by-step solution for the creative writing workbench project (part 1).

Good to know

Know what's good, what to watch for, and possible dealbreakers:
  • Provides hands-on experience with tools like LangChain, Streamlit, and Hugging Face, which are widely used in the generative AI application development space
  • Covers RAG patterns and vector databases like Pinecone and ChromaDB, which are essential for building advanced generative AI applications
  • Includes hands-on projects like building a movie recommendation engine and a creative writing workbench, which allow learners to apply their knowledge in practical scenarios
  • Explores the Hugging Face platform, which is a central hub for pre-trained models and community resources, enabling learners to leverage existing AI capabilities
  • Requires learners to install tools and set up a course repository, which may pose a challenge for those unfamiliar with software development environments
  • Teaches LangChain, which is a rapidly evolving framework, so learners should be prepared to adapt to potential changes and updates in the tool


Activities

Be better prepared before your course. Deepen your understanding during and after it. Supplement your coursework and achieve mastery of the topics covered in Generative AI application design and development with these activities:
Review Core Python Concepts
Strengthen your Python foundation to better understand the code examples and exercises used throughout the course.
Browse courses on Python Basics
  • Review data types, control flow, and functions.
  • Practice writing simple Python scripts.
  • Familiarize yourself with common Python libraries.
Review 'Building Applications with LLMs through LangChain'
Deepen your understanding of LangChain, a crucial tool for building generative AI applications, by studying a dedicated resource.
  • Read the chapters related to prompt engineering and chains.
  • Experiment with the code examples provided in the book.
  • Compare the book's examples with the course's examples.
Experiment with Decoding Parameters
Master the art of controlling LLM outputs by systematically experimenting with decoding parameters like temperature, top-p, and frequency penalty.
  • Choose a specific LLM and a simple prompting task.
  • Vary the decoding parameters one at a time.
  • Observe and document the changes in the model's output.
  • Analyze the impact of each parameter on the output's quality and diversity.
Create a Cheat Sheet for LangChain LCEL
Improve your fluency with LangChain Execution Language (LCEL) by creating a concise cheat sheet that summarizes key syntax and patterns.
  • Review the course materials and LangChain documentation on LCEL.
  • Identify the most frequently used LCEL syntax and patterns.
  • Organize the information into a clear and concise cheat sheet format.
  • Include examples of how to use each syntax and pattern.
Create a Blog Post on Prompt Engineering Techniques
Solidify your understanding of prompt engineering by explaining different techniques and their impact on LLM outputs in a blog post.
  • Research different prompt engineering techniques.
  • Write a clear and concise explanation of each technique.
  • Provide examples of how each technique affects LLM outputs.
  • Publish the blog post on a platform like Medium or your personal website.
Build a Simple Chatbot with Hugging Face
Apply your knowledge of Hugging Face Transformers to create a functional chatbot, reinforcing your understanding of model integration and NLP tasks.
  • Choose a pre-trained conversational model from Hugging Face Hub.
  • Implement a basic chatbot interface using Streamlit.
  • Integrate the model with the interface to generate responses.
  • Test and refine the chatbot's performance.
Review 'Generative AI with LangChain'
Gain a broader perspective on generative AI and LangChain by exploring a comprehensive guide that covers both fundamental and advanced concepts.
  • Read the chapters related to RAG and agentic workflows.
  • Compare the book's approach to RAG with the course's approach.
  • Identify new techniques and ideas presented in the book.

Career center

Learners who complete Generative AI application design and development will develop knowledge and skills that may be useful to these careers:
Generative AI Application Developer
A Generative AI Application Developer designs, builds, and implements applications that leverage generative artificial intelligence models. This role involves practical application of tools like LangChain, Streamlit, and Hugging Face, which this course covers, to create innovative solutions. The course's focus on building applications using large language models, coupled with its hands-on projects like the movie recommendation engine and creative writing workbench, makes it a great fit for someone pursuing this career. The course's coverage on prompt engineering, inference control, and RAG patterns are particularly useful skills for a generative AI application developer.
Prompt Engineer
A prompt engineer specializes in crafting precise and effective prompts to elicit desired outputs from large language models. This course provides an extensive look at prompt engineering techniques, including in-context learning, chain-of-thought prompting, and self-consistency, all of which are essential for succeeding as a prompt engineer. The practical exercises in fixing and improving prompts throughout the course reinforce these skills. The course's focus on using multiple LLMs for exercises, rather than just one, would be quite beneficial in this role.
AI Software Engineer
An AI Software Engineer develops software that integrates artificial intelligence, often involving the use of generative AI. The course emphasizes hands-on experience with tools and frameworks relevant to this role, such as LangChain, Streamlit, and Hugging Face, and it covers a diverse set of topics, from model architecture to building real-world applications, which will help build a solid foundation. This course is specifically designed, and therefore beneficial to, those from non-AI backgrounds, which allows them to get up to speed quickly.
AI Solutions Architect
An AI Solutions Architect designs and oversees the implementation of AI systems and solutions, often integrating various AI technologies. This course helps build a foundation in generative AI, equipping a future architect with knowledge of LLMs, various model types, and techniques to apply them with tools and frameworks. The practical experience gained through projects in model implementation and use prepares one to architect AI solutions. This course is uniquely tailored for those without previous AI/ML experience, helping them quickly grasp the essentials needed for this role.
Natural Language Processing Engineer
A Natural Language Processing Engineer works on systems that process and understand human language. This course provides a great introduction to the field, covering key concepts like NLP, NLU, and NLG, along with how large language models perform these tasks. The course utilizes the Hugging Face library extensively, giving practical experience with model endpoints and inference classes. The course's focus on practical application rather than theory sets aspiring engineers up for success.
AI Consultant
An AI Consultant advises organizations on adopting and implementing AI technologies by providing valuable guidance and practical solutions. The hands-on nature of this course, with its many exercises and real-world projects, prepares a consultant with the practical knowledge needed for this role. The course's focus on multiple LLMs and frameworks also broadens their ability to advise on various AI solutions. The course is great for someone looking to transition into this role, who might not have a background in AI.
Machine Learning Engineer
A Machine Learning Engineer builds and maintains machine learning systems and infrastructure, which may include generative AI. Although this course does not go into the mathematical underpinnings of these systems, it helps build a foundation in the practical side of generative AI by covering topics such as how to use various model types, decoding parameters, in-context learning, and the Hugging Face library. The hands-on experience with applications such as movie recommendation engines will be invaluable to a Machine Learning Engineer.
Software Architect
A Software Architect designs and oversees the technical aspects of software projects. This course may be useful for understanding the landscape of generative AI and how to integrate it into applications. The course focuses on building applications with tools such as LangChain which would be useful in designing software systems. This course provides a base understanding for software architects interested in integrating AI.
Solutions Engineer
A Solutions Engineer combines technical knowledge with client relations to create custom solutions. This course may be useful for a solutions engineer looking to understand the practical applications of generative AI. The course focuses on how to implement AI using various tools and frameworks, which is important for delivering solutions. The course also exposes students to several different LLMs and technologies, which is useful for creating custom solutions tailored to particular clients and applications.
Technical Project Manager
A Technical Project Manager oversees technology projects, often requiring an understanding of the underlying technology. This course may be useful for understanding the processes and tools used in generative AI project development. The course's overview of many different frameworks may be helpful in understanding the scope and effort involved in these types of projects. The course's focus on real world projects provides a good overview of the challenges and processes involved.
Data Scientist
A Data Scientist uses data analysis and machine learning to draw insights and solve problems. This course may be useful for acquiring skills in generative AI, a subfield of machine learning, and it demonstrates how to build a variety of applications using LLMs and tools like LangChain, which is helpful in modeling. The course will provide a glimpse into the practical use of these frameworks in building applications. This is a solid initial learning experience for a data scientist.
Research Scientist
A Research Scientist in the field of artificial intelligence conducts research to advance the field. This may include generative AI. The course may be useful in providing a hands-on introduction to frameworks used in generative AI such as LangChain and Hugging Face. The course focuses on the practical side of generative AI which is a useful perspective for a research scientist who is more focused on the mathematics and science of it. Those with an interest in research may find that this course fills a gap in their knowledge. A research scientist typically holds an advanced degree.
Computational Linguist
A computational linguist develops computational models of human language, often using machine learning. This course may be useful in providing an introduction to large language models and various prompting techniques. This course also shows how to use Hugging Face model endpoints, which are used for various NLP tasks. This course also introduces concepts such as Natural Language Processing, Natural Language Understanding, and Natural Language Generation.
Technology Analyst
A Technology Analyst evaluates and advises on technology trends and their potential impacts. This course may be useful for gaining practical knowledge of generative AI, which is useful for a technology professional to understand. The course highlights the use of various tools and frameworks, such as Hugging Face, which would help someone understand current technologies. The hands-on approach, with exercises and real world applications, will help an analyst gain practical understanding of the field.
Data Analyst
A Data Analyst interprets data to identify trends and insights, which can inform business decisions. This course may be useful for analysts looking to gain exposure to AI, a field which is becoming increasingly relevant to the field of data analysis. This course may be useful for analysts who want to use LLMs to automate some of the data analysis process. The course's coverage of language models and tools such as Hugging Face might help analysts understand how to use AI in their work.

Reading list

We've selected one book that we think will supplement your learning. Use it to develop background knowledge, enrich your coursework, and gain a deeper understanding of the topics covered in Generative AI application design and development.
Provides a practical guide to building applications using LangChain, a key framework covered in the course. It offers hands-on examples and detailed explanations of LangChain's features and capabilities. Reading this book will significantly enhance your ability to implement generative AI solutions, and it is a valuable resource for understanding the practical aspects of LangChain.


Our mission

OpenCourser helps millions of learners each year. People visit us to learn workplace skills, ace their exams, and nurture their curiosity.

Our extensive catalog contains over 50,000 courses and twice as many books. Browse by search, by topic, or even by career interests. We'll match you to the right resources quickly.

Find this site helpful? Tell a friend about us.

Affiliate disclosure

We're supported by our community of learners. When you purchase or subscribe to courses and programs or purchase books, we may earn a commission from our partners.

Your purchases help us maintain our catalog and keep our servers humming without ads.

Thank you for supporting OpenCourser.

© 2016 - 2025 OpenCourser