Good job on landing on this no-code automation course. You can check out the full curriculum below. :)

I think you're going to like this course, because it breaks down the steep learning curve involved in creating automations. I'm not one to make you sit through theoretical lectures. I'm a no-code builder who's good at helping people learn by doing. So much so that I created this no-code automation course.

As the creator of this course (and Command Codeless), I've built numerous automations far more cost-effectively and quickly than it would take someone to 'code' them.

Now, chances are you've tried creating automations with code, but you're not happy with your progress. To put it simply, you're not following the path of least resistance. You don't need to join this course, but if you're like many others, there's a strong chance you want to build automations that save you time - without a technical partner.

I promise that if you put in the time, you'll walk away from this course building automations quicker than it would take you to learn how to code. You'll grasp many programming concepts faster through this course too.

Instead of theory, you'll learn practical skills as we work through projects such as:

- Creating Browser Automations
- Automatically Populating Databases With Data
- Creating Data-Driven Content Idea Generators

And more.

This course would have saved me a lot of confusion, money (I spent $5,000+ on a coding bootcamp; don't do the same), and wasted hours on my programming journey.

Go from zero to building up a portfolio of automations, and gain lifetime access to the lectures and worksheets this course offers.

For a small investment, you can reignite your passion for building and create automations that save you time. And if you don't love the course, take advantage of Udemy's refund policy.

It's time to decide: will you keep struggling for time, or will you dive in with zero risk and everything to gain by learning how to automate?
Welcome. I can't wait to help you build automations without code.
I have either created or tested these resources. They come with my seal of approval, and I hope they serve as a valuable resource for you.
These mini no-code projects are great complementary material to this course.
To make sure you have the best learning experience on this course, please review the statements in this lecture to tailor the course to your specific learning needs.
Here, you'll be learning about the fundamentals of the no-code (automation) ecosystem. Later in this course, as you begin building your projects, an understanding of these fundamentals will be beneficial.
In this lecture, I want to clarify no-code automation: what it is, how it works, and so on.
In this lecture, let's talk about no-code stacks, i.e. no-code tools.
In this lecture, I want to discuss adopting a systems mindset with you.
For this project, we’ll build a web scraper for competitor price monitoring. Specifically, our scraper will be extracting the name and price of each book from a fictional online book store.
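If you're curious what the same extraction would look like in code, here's a minimal Python sketch. It assumes the fictional store is the public practice site books.toscrape.com (the course doesn't confirm which site it uses), and the course itself builds this visually, without writing any code.

```python
# Rough code equivalent of the no-code scraper built in this project.
# Assumes the fictional book store is the practice site books.toscrape.com.
import requests
from bs4 import BeautifulSoup

def scrape_books(url="https://books.toscrape.com/"):
    """Return a list of (name, price) pairs for every book on the page."""
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")

    books = []
    for item in soup.select("article.product_pod"):
        name = item.h3.a["title"]                      # full title lives in the link's title attribute
        price = item.select_one("p.price_color").text  # e.g. "£51.77"
        books.append((name, price))
    return books

if __name__ == "__main__":
    for name, price in scrape_books():
        print(f"{price}\t{name}")
```

The no-code version does the same thing through point-and-click selection instead of CSS selectors.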
You'll need these resources to complete this project.
In this lecture, we’ll set up our web scraper.
In this lecture, we'll save our web scraper.
In this lecture, we’ll set up a database in Airtable.
In this lecture, we'll be scraping multiple pages for our data.
Congratulations on completing this project. Now, why not challenge your capabilities? Try implementing one of these suggestions. Or try your own.
In this lecture, we’ll configure Twitter in Zapier.
For this project, we’ll be scraping Quotes To Scrape, specifically the text and author of each quote. We’ll then turn our web scraper and scraped data into an API, which adds a level of automation, since we can query the API for the data whenever we need it.
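To make "turning the scraped data into an API" concrete, here's a small sketch of what querying such an endpoint looks like. The URL and JSON shape are placeholders, not the real course endpoint; Parabola handles this step for us in the lectures.

```python
# Conceptual sketch: querying a (hypothetical) API that serves our scraped quotes.
# The endpoint URL and response shape below are made up for illustration only.
import requests

API_URL = "https://example.com/api/quotes"  # placeholder, not the real course endpoint

def fetch_quotes():
    """Request the scraped data on demand instead of re-running the scraper."""
    response = requests.get(API_URL, timeout=10)
    response.raise_for_status()
    return response.json()  # e.g. [{"quote": "...", "author": "Albert Einstein"}, ...]

if __name__ == "__main__":
    for entry in fetch_quotes():
        print(f'{entry["author"]}: {entry["quote"]}')
```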
In this lecture, we'll set up our API in Parabola.
In this lecture, we'll be filtering unnecessary data.
In this lecture, we'll export the data to Google Sheets.
In this lecture, we'll publish the flow.
For this project, we'll be creating an automated data enrichment workflow, i.e. a workflow that enriches existing data with more valuable data. Specifically, we'll feed the workflow a list of company URLs, and it will return the logo for each of those companies without any manual intervention.
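As a rough mental model of what "enrichment" means here: for each company URL we look up extra data (in this case a logo) from an external service. The sketch below assumes a hypothetical logo service that serves an image at a predictable URL per domain; the actual workflow in the lectures is built without code.

```python
# Conceptual sketch of the enrichment step: start with company URLs,
# end with a logo image URL for each one. The logo service below is
# hypothetical; swap in whichever provider the workflow actually uses.
from urllib.parse import urlparse

LOGO_SERVICE = "https://logo.example.com/{domain}"  # placeholder enrichment service

def enrich_with_logos(company_urls):
    """Map each company URL to a (domain, logo_url) record."""
    enriched = []
    for url in company_urls:
        domain = urlparse(url).netloc.removeprefix("www.")
        enriched.append({"domain": domain, "logo_url": LOGO_SERVICE.format(domain=domain)})
    return enriched

if __name__ == "__main__":
    for row in enrich_with_logos(["https://www.airtable.com", "https://zapier.com"]):
        print(row)
```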
In this lecture, we’ll set up the Google Sheet that will house the prospects data.
For this project, we’ll be automatically populating an Airtable database with Elon Musk’s tweets. This project will teach you how to use Airtable as a database and how to automate manual tasks in Airtable.
In this lecture, we’ll configure Airtable in Zapier.
For this project, we’ll be creating an automated sales pipeline database in Airtable. It will be a simple two-stage sales pipeline: a contact is moved to the next stage automatically once they’re marked as contacted, and the fictional sales team gets an email notification too.
In this lecture, we’ll source the links for data on prospects.
In this lecture, we’ll set up Airtable as a sales database.
In this lecture, we'll set up the email automation in Airtable.
In this lecture, we'll set up an automation to move leads.
For this project, we'll be building an automated Reddit content curator in Zapier. Our automation will extract the hottest posts from a subreddit and add them to a Google Sheet, creating our automated content curation tool.
In this lecture, we'll set up a Phantom in Phantombuster.
In this lecture, we’ll set up the Google Sheet that will house our outputs.
In this lecture, we’ll configure Reddit in Zapier.
In this lecture, we'll configure Google Sheets in Zapier.
For this project, we'll create a Twitter post scheduler. Our automation will keep track of any updates to a Google Sheet. If it finds an update, it will post it as a tweet to Twitter automatically. This project will teach you how to use Zapier to automate your social media operations.
In this lecture, we’ll set up the Google Sheet that will house our posts.
In this lecture, we’ll configure Google Sheets in Zapier.
In this lecture, we'll configure Twitter in Zapier.
For this project, we’ll be automating a marketing (prospect generation) task in Phantombuster. Our automation will gather numerous prospects and return crucial data: phone, email, address, etc. We’ll be collecting the data from Google Maps. Done manually, this would take tens of hours.
In this lecture, we'll clean the CSV data in Parabola.
In this lecture, we'll publish the automation.
In this lecture, we'll insert an IF/ELSE column to convert the sentiment score.
For this project, we’ll be creating a data-driven content idea generator. The automation will gather popular topics from various sources and see which content type has the best chance for success. Specifically, we’ll be picking popular topics from Medium and Reddit.
In this lecture, we'll edit the Reviews Sentiment Score column.
In this lecture, we’ll set up the Google Sheet that will house the content insights.
In this lecture, we’ll source the links we wish to analyse for content insights.
In this lecture, we'll set up Medium in Phantombuster.
In this lecture, we'll set up Reddit in Phantombuster.
In this lecture, we'll clean the data for insights in Parabola.
For this project, we'll be creating a sentiment analyzer. The sentiment analyzer will score the sentiment of a string of text automatically. It will tell us whether a string of text is positive, negative, or neutral. Specifically, we'll be running a bunch of online reviews through our sentiment analysis tool.
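If you'd like to see the underlying idea in code, here's a minimal sketch using the open-source VADER sentiment library, which returns a compound score between -1 and +1 that we can then map to positive, negative, or neutral, much like the IF/ELSE column does in this project. The thresholds follow VADER's usual convention but are illustrative; the course builds the analyzer itself in Parabola, without code.

```python
# Minimal sentiment-scoring sketch using the VADER library
# (pip install vaderSentiment). Thresholds follow VADER's usual convention.
from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer

analyzer = SentimentIntensityAnalyzer()

def score_review(text):
    """Return (compound_score, label) for a piece of review text."""
    compound = analyzer.polarity_scores(text)["compound"]  # -1.0 (negative) to +1.0 (positive)
    if compound >= 0.05:
        label = "positive"
    elif compound <= -0.05:
        label = "negative"
    else:
        label = "neutral"
    return compound, label

if __name__ == "__main__":
    reviews = [
        "Absolutely loved it, works exactly as described!",
        "Terrible quality, broke after two days.",
        "It arrived on Tuesday.",
    ]
    for review in reviews:
        print(score_review(review), review)
```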
In this lecture, we'll be setting up Parabola and importing our data.
In this lecture, we'll insert a column for enriched data.
In this lecture, we'll insert the sentiment analysis function.
In this lecture, we'll format the numbers in the sentiment analysis columns.
In this lecture, we'll insert the API enrichment function.
In this lecture, we'll remove unnecessary columns.
For this project, we'll be creating a browser automation to auto-follow Twitter users. Functionality-wise, our browser automation will go to Twitter, log in, and then loop through each user we want to follow and follow them.
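For context, here's what the same loop looks like as a Selenium script in Python. It's a sketch only: it assumes an already-authenticated browser session, and the CSS selector for the follow button is a placeholder, since Twitter's markup changes frequently, which is part of why the course builds this with a no-code browser-automation tool instead.

```python
# Conceptual Selenium sketch of the auto-follow loop (pip install selenium).
# Assumes an already-authenticated browser session; the CSS selector for the
# follow button is a placeholder, not a guaranteed-current Twitter selector.
import time
from selenium import webdriver
from selenium.webdriver.common.by import By

USERS_TO_FOLLOW = ["user_one", "user_two", "user_three"]  # example handles
FOLLOW_BUTTON_SELECTOR = "[data-testid$='-follow']"       # placeholder selector

driver = webdriver.Chrome()
try:
    for handle in USERS_TO_FOLLOW:
        driver.get(f"https://twitter.com/{handle}")  # visit each profile
        time.sleep(3)                                # crude wait for the page to render
        driver.find_element(By.CSS_SELECTOR, FOLLOW_BUTTON_SELECTOR).click()
finally:
    driver.quit()
```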