Integrating Azure Databricks with Local Development Environments
This course delves into the use of the Databricks Connect utility to link local development environments such as Jupyter Notebooks or PyCharm with Azure Databricks. This allows you to develop Spark applications locally and run them in the cloud.
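As a rough illustration of what this linkage looks like in practice, consider the minimal sketch below. It assumes the classic databricks-connect package has already been installed and configured (for example, via pip install databricks-connect followed by databricks-connect configure, which prompts for the workspace URL, a personal access token, and a cluster ID); all connection details are placeholders you would supply yourself.

# A minimal sketch, assuming databricks-connect is installed and
# configured to point at an Azure Databricks workspace and cluster.

from pyspark.sql import SparkSession

# With databricks-connect on the Python path, getOrCreate() returns a
# session whose queries are sent to the remote Databricks cluster
# rather than to a local Spark instance.
spark = SparkSession.builder.getOrCreate()

# This DataFrame is defined locally, but the computation below is
# executed on the cloud-hosted cluster.
df = spark.range(100)
print(df.filter(df.id % 2 == 0).count())  # expected output: 50

The key design point is that the code is ordinary PySpark: nothing in the application itself references the cloud, so the same script runs against a local Spark installation or a remote Databricks cluster depending only on the configured environment.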
For any developer working with Databricks, it is important to make working with this technology as seamless as possible. In this course, Integrating Azure Databricks with Local Development Environments, you'll explore integrations that let you develop Spark applications in your own local environment while running the jobs you define on the Azure Databricks service in the cloud. First, you'll learn how to set up the Databricks Connect utility, which creates a local environment linked to a cloud-hosted Databricks workspace. Next, you'll discover how to use that environment to link a local Jupyter notebook with a workspace on the Azure cloud. Finally, you'll use the Databricks Connect environment to spawn jobs from an application developed in the PyCharm IDE and execute them on the Azure cloud; a sketch of such a job follows this description. Once you complete this course, you will have the skills required to combine the convenience of building apps in your own development environment with the compute power of Azure Databricks to build robust and highly performant applications.
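To make the Jupyter and PyCharm steps concrete, here is a hedged sketch of the kind of script the course builds toward: a small job written in a local IDE but executed on an Azure Databricks cluster. The sample data and column names are hypothetical stand-ins for a real source such as a DBFS path or workspace table, and running databricks-connect test beforehand is the usual way to verify connectivity.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

def run_job():
    # The session is remote-backed when databricks-connect is
    # configured; the same code runs unchanged in a notebook.
    spark = SparkSession.builder.getOrCreate()

    # Hypothetical in-memory data standing in for a real dataset.
    data = [("alice", 3), ("bob", 5), ("alice", 7)]
    df = spark.createDataFrame(data, ["user", "events"])

    # The aggregation is planned locally in the IDE but executed on
    # the cloud-hosted cluster.
    totals = df.groupBy("user").agg(F.sum("events").alias("total_events"))
    totals.show()

if __name__ == "__main__":
    run_job()

Because the entry point is a plain Python script, it can be launched from a PyCharm run configuration or imported into a Jupyter notebook cell, which is exactly the workflow the course's later modules walk through.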
Author Name: Kishan Iyer
Author Description:
I have a Master's in Computer Science from Columbia University and have worked previously as a developer and DevOps engineer. I now work at Loonycorn, a studio for high-quality video content. My interests lie in the broad categories of Big Data, ML, and Cloud.