Databricks becomes much easier to adopt when it is integrated seamlessly into a development environment. This course looks at how to accomplish this for the SQL Workbench/J client and the Prophecy service.
For any organization that uses Databricks, integrating this big data platform into its own tool environment can prove a complex task. In this course, Integrating SQL and ETL Tools with Databricks, you'll learn how two specific tools - SQL Workbench/J and Prophecy - can be linked with a Databricks workspace. First, you'll discover the need for tool integrations, how they can help engineers be more productive, and how they can avoid adding to the complexity of a tooling environment. Then, you'll explore linking an Azure Databricks workspace with a popular SQL client - namely, SQL Workbench/J. Finally, you'll learn the steps involved in integrating a Prophecy workflow with Databricks. Once you complete this course, you will be well-versed in the types of integrations that are possible with Databricks and how to link up two popular tools with this big data service.
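As background for the SQL Workbench/J module: the client talks to Databricks over JDBC, so the same connection can be exercised from any JVM program. The sketch below is a minimal connection smoke test, assuming the Databricks JDBC driver is on the classpath; the hostname, HTTP path, and URL parameters are illustrative placeholders and may differ by driver version (older Simba-based drivers use a jdbc:spark:// prefix and extra transport settings).

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.Properties;

public class DatabricksJdbcSmokeTest {
    public static void main(String[] args) throws Exception {
        // Hypothetical placeholder values -- copy the real server hostname and
        // HTTP path from your cluster's or SQL warehouse's JDBC/ODBC settings.
        String host = "adb-1234567890123456.7.azuredatabricks.net";
        String httpPath = "sql/protocolv1/o/1234567890123456/0123-456789-abcdefgh";

        // AuthMech=3 selects username/password authentication, where the username
        // is the literal string "token" and the password is a personal access token.
        String url = "jdbc:databricks://" + host + ":443;httpPath=" + httpPath + ";AuthMech=3";

        Properties props = new Properties();
        props.put("UID", "token");
        props.put("PWD", System.getenv("DATABRICKS_TOKEN")); // personal access token

        // A trivial query to confirm the connection works end to end.
        try (Connection conn = DriverManager.getConnection(url, props);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT 1 AS ok")) {
            while (rs.next()) {
                System.out.println("Connection OK, result = " + rs.getInt("ok"));
            }
        }
    }
}
```

In SQL Workbench/J itself, roughly the same three pieces of information (the driver JAR, the JDBC URL, and the access token) are entered in a connection profile rather than in code, which is the workflow the course walks through.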