Google Cloud Training
Data pipelines typically follow one of three patterns: extract and load (EL); extract, load, and transform (ELT); or extract, transform, and load (ETL). The course describes which approach is appropriate for batch data in which situation. It also covers several Google Cloud Platform technologies for data transformation, including BigQuery, running Spark on Cloud Dataproc, pipeline graphs in Cloud Data Fusion, and serverless data processing with Cloud Dataflow. In Qwiklabs, learners then build components of a data pipeline on Google Cloud Platform themselves.
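As a rough, self-contained illustration of the difference between these patterns (not taken from the course materials), the sketch below uses Python's built-in sqlite3 module as a stand-in for a warehouse such as BigQuery; the table, columns, and data are invented for the example.

```python
import sqlite3

# Illustrative stand-in only: sqlite3 plays the role of a cloud data
# warehouse such as BigQuery. Table and column names are made up.
raw_rows = [("2024-01-01", "eur", "12,50"), ("2024-01-02", "usd", "7,00")]

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE raw_sales (day TEXT, currency TEXT, amount TEXT)")

# EL (and the first two steps of ELT): load the extracted rows unchanged.
con.executemany("INSERT INTO raw_sales VALUES (?, ?, ?)", raw_rows)

# The "T" of ELT runs as SQL inside the warehouse, *after* loading.
# In ETL, the same cleanup would happen in application code before the load.
con.execute("""
    CREATE TABLE sales AS
    SELECT day,
           UPPER(currency) AS currency,
           CAST(REPLACE(amount, ',', '.') AS REAL) AS amount
    FROM raw_sales
""")

rows = con.execute("SELECT * FROM sales ORDER BY day").fetchall()
print(rows)  # [('2024-01-01', 'EUR', 12.5), ('2024-01-02', 'USD', 7.0)]
```

The choice the course examines is essentially where that cleanup step runs: inside the warehouse after loading (ELT), in code before loading (ETL), or not at all (EL).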

Good to know

Know what's good, what to watch for, and possible dealbreakers
Examines multiple data processing and transformation techniques suited for batch data
Delves into different data transformation technologies available on Google Cloud Platform
Provides hands-on experience in setting up a data pipeline on Google Cloud Platform using Qwiklabs
Suitable for learners with a basic understanding of data pipelines and cloud computing concepts
Assumes learners have prior knowledge of cloud platforms such as Google Cloud Platform


Activities

Be better prepared before your course. Deepen your understanding during and after it. Supplement your coursework and achieve mastery of the topics covered in Building Batch Data Pipelines on GCP auf Deutsch with these activities:
Review the course syllabus and readings
Get a head start on the course by reviewing the syllabus and readings.
  • Read the course syllabus.
  • Skim the required readings for the first few weeks of the course.
Review basic data modeling concepts
Refresh your understanding of data modeling to ensure you have a strong foundation for the course.
  • Review your notes from a previous course on data modeling.
  • Work through a few practice problems on data modeling.
Complete the Google Cloud Platform Data Engineering Fundamentals quest
Deepen your understanding of data engineering fundamentals on Google Cloud Platform.
  • Follow the step-by-step instructions in the quest.
  • Complete the hands-on exercises in the quest.
Solve data transformation problems using BigQuery
Practice writing SQL queries to transform data in BigQuery.
  • Find a dataset on BigQuery Public Data.
  • Write a SQL query to transform the data.
  • Test your query and make sure it returns the expected results.
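A sketch of what these steps might look like in Python, assuming the google-cloud-bigquery client library is installed and application-default credentials are configured. `bigquery-public-data.usa_names` is one real public dataset you might find in step one; the query itself is just an example aggregation, not part of the course.

```python
# Hypothetical sketch of the activity above. Requires the
# google-cloud-bigquery package and configured GCP credentials to run.
QUERY = """
SELECT name, SUM(number) AS total
FROM `bigquery-public-data.usa_names.usa_1910_2013`
WHERE state = 'TX'
GROUP BY name
ORDER BY total DESC
LIMIT 5
"""

def run_query(sql: str):
    # Imported inside the function so the sketch (and the query) can be
    # reviewed without the dependency installed.
    from google.cloud import bigquery
    client = bigquery.Client()  # picks up default project and credentials
    return [dict(row) for row in client.query(sql).result()]
```

Checking the results (the last step) then amounts to inspecting the returned rows, e.g. asserting that totals are positive and the row count matches the LIMIT.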
Attend a workshop on data engineering
Expand your knowledge of data engineering by attending a workshop.
  • Find a workshop that covers topics relevant to the course.
  • Register for the workshop.
  • Attend the workshop and participate actively.
Build a data pipeline using Cloud Dataflow
Gain hands-on experience building a data pipeline using Cloud Dataflow.
  • Design your data pipeline.
  • Write the code for your data pipeline.
  • Deploy your data pipeline.
  • Monitor your data pipeline.
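Cloud Dataflow runs pipelines written with the Apache Beam SDK. The design-write-deploy steps above might look roughly like this sketch; it assumes the apache-beam package is installed, and the paths and step names are illustrative, not from the course.

```python
def build_wordcount(input_path: str, output_path: str):
    """Return a no-argument function that runs a classic batch pipeline.

    Hypothetical sketch. With no options it runs locally on the
    DirectRunner; passing DataflowRunner pipeline options deploys the
    same code to Cloud Dataflow, where it can then be monitored.
    """
    def run():
        import apache_beam as beam  # deferred so the sketch reads without it

        with beam.Pipeline() as p:  # blocks until the batch job finishes
            (
                p
                | "Read" >> beam.io.ReadFromText(input_path)
                | "Split" >> beam.FlatMap(str.split)
                | "PairWithOne" >> beam.Map(lambda w: (w, 1))
                | "CountPerWord" >> beam.CombinePerKey(sum)
                | "Format" >> beam.MapTuple(lambda w, n: f"{w},{n}")
                | "Write" >> beam.io.WriteToText(output_path)
            )
    return run
```

Once deployed, the final step (monitoring) happens in the Dataflow section of the Google Cloud console, which shows the pipeline graph and per-step throughput.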
Help other students in the course
Reinforce your understanding of the course material by helping other students.
  • Answer questions in the course discussion forum.
  • Participate in study groups.

Career center

Learners who complete Building Batch Data Pipelines on GCP auf Deutsch will develop knowledge and skills that may be useful to these careers:
Big Data Engineer
Big Data Engineers design and implement systems that store and process very large volumes of data, translating business requirements into scalable data platforms. Batch pipelines are usually the backbone of those platforms, so the course's coverage of EL, ELT, and ETL patterns, Spark on Cloud Dataproc, and Cloud Dataflow maps directly onto this role.
Data Engineer
Data Engineers design, build, and maintain solutions to manage and process large volumes of data. This often involves designing the architecture of a data management system, as well as writing code to extract, transform, and load data into a data warehouse, data lake, or other storage system. This course addresses the core of the role: choosing between EL, ELT, and ETL and building batch pipelines on Google Cloud Platform.
Data Scientist
Data Scientists develop and apply statistical and machine learning models to large datasets to help businesses make decisions based on data. Although they rarely build production pipelines themselves, understanding how data is extracted, transformed, and loaded helps them judge the quality, lineage, and freshness of the data their models depend on.
Data Architect
Data Architects design the overall structure of an organization's data systems and work with data engineers to realize that design. Choosing between EL, ELT, and ETL, and selecting the right transformation technology for each workload, are core architectural decisions that this course addresses directly.
Data Integration Engineer
Data Integration Engineers design and implement systems that combine data from disparate sources into a consistent whole. Their day-to-day work is essentially pipeline work, so the course's treatment of batch pipeline patterns and of tools such as Cloud Data Fusion applies directly.
Machine Learning Engineer
Machine Learning Engineers build, deploy, and maintain machine learning models in production, working with data scientists on the models and with software engineers on deployment. Because models are only as good as the data that feeds them, ML engineers routinely build or consume the batch pipelines that prepare training data, which is exactly what this course covers.
Data Analyst
In today's data-driven world, Data Analysts help businesses make sense of the vast amounts of information they gather from a wide variety of sources. They are responsible for collecting, cleaning, and analyzing data. The course is particularly useful for anyone entering this field, as it covers the three data pipeline patterns: EL, ELT, and ETL.
Cloud Architect
Cloud Architects design and implement cloud computing solutions that meet business requirements, working with cloud engineers to deliver them. Data movement and transformation are a recurring part of cloud designs, and this course shows which Google Cloud service fits which batch processing scenario.
Data Governance Analyst
Data Governance Analysts develop and implement policies and procedures for how an organization's data is managed. Knowing how data flows through pipelines, and where it is transformed along the way, is essential for tracing data lineage and enforcing those policies.
Data Privacy Analyst
Data Privacy Analysts develop and implement data privacy policies and procedures. Understanding where personal data is extracted, transformed, and stored within a pipeline helps them identify the points at which privacy controls must be applied.
Business Analyst
Business Analysts help businesses understand their data and make better decisions, identifying needs and working with data teams to meet them. They rarely build pipelines themselves, but knowing how data reaches the reports they rely on makes it easier to assess its accuracy and to communicate requirements to data engineers.
DevOps Engineer
DevOps Engineers work with development and operations teams to ensure that software is built and deployed quickly and reliably. Data pipelines are deployed, monitored, and automated like any other workload, so familiarity with services such as Cloud Dataflow and Cloud Dataproc helps DevOps engineers operate them in production.
Database Administrator (DBA)
Database Administrators (DBAs) are responsible for the performance, availability, and security of an organization's databases. They work with database software and hardware, and with other IT professionals, to keep data accessible and secure. Because pipelines frequently read from and write to the databases they manage, aspiring DBAs also benefit from understanding how batch pipelines work.
Software Engineer
Software Engineers design, develop, and maintain software applications that help businesses achieve their goals through technology. Applications frequently produce or consume data that flows through batch pipelines, so understanding the EL, ELT, and ETL patterns covered in this course helps engineers integrate cleanly with data platforms.

Reading list

We've selected seven books that we think will supplement your learning. Use these to develop background knowledge, enrich your coursework, and gain a deeper understanding of the topics covered in Building Batch Data Pipelines on GCP auf Deutsch.
Covers the basics of machine learning and provides some insights into how machine learning can be used for data engineering tasks, which are not covered by the course.
Provides a high-level overview of data-intensive applications and their design patterns. It could provide useful background knowledge for the course.
Provides a practical guide to data engineering on Google Cloud Platform. It covers some of the same topics as the course, but it has a different focus and is more hands-on.
Provides a practical guide to machine learning engineering with Python. It covers some of the same topics as the course, but it has a different focus and is more hands-on.
Provides a practical guide to building RESTful APIs with ASP.NET Core 3. It is not directly related to data pipelines, but it could be helpful for learners who want to develop web applications that interact with data pipelines.


Similar courses

Here are nine courses similar to Building Batch Data Pipelines on GCP auf Deutsch.
  • Daten für die Erkundung Vorbereiten
  • Google Cloud Platform Big Data and Machine Learning...
  • Essential Cloud Infrastructure: Foundation auf Deutsch
  • Google Cloud Platform Fundamentals: Core Infrastructure...
  • Modernizing Data Lakes and Data Warehouses with GCP auf...
  • Elastic Cloud Infrastructure: Scaling and Automation auf...
  • Architecting with Google Kubernetes Engine: Foundations...
  • Serverless Machine Learning with Tensorflow on Google...
  • Building Resilient Streaming Analytics Systems on GCP auf...
Our mission

OpenCourser helps millions of learners each year. People visit us to learn workplace skills, ace their exams, and nurture their curiosity.

Our extensive catalog contains over 50,000 courses and twice as many books. Browse by search, by topic, or even by career interests. We'll match you to the right resources quickly.

Find this site helpful? Tell a friend about us.

Affiliate disclosure

We're supported by our community of learners. When you purchase or subscribe to courses and programs or purchase books, we may earn a commission from our partners.

Your purchases help us maintain our catalog and keep our servers humming without ads.

Thank you for supporting OpenCourser.

© 2016 - 2024 OpenCourser