1. Install Python on Your Local System
If you plan to use the Databricks CLI from your local machine, ensure you have Python installed.
Optionally, install Anaconda to make it easier to manage packages and environments.
Steps to Install Python
1. Download and install Python from the official website (python.org).
2. Verify Python installation by running:
python --version
3. Install Anaconda (Optional): Download and install Anaconda from the official Anaconda website; you can then create a dedicated environment for the CLI, as shown below.
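If you use Anaconda, you can keep the CLI in its own environment. A minimal sketch (the environment name databricks-cli and the Python version are just examples):
conda create -n databricks-cli python=3.10
conda activate databricks-cli
python --version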
2. Install Azure Databricks CLI
After setting up Python and pip, install the Azure Databricks CLI with the following command:
pip install databricks-cli
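To confirm the CLI installed correctly, print its version (the exact version string depends on the release pip resolved):
databricks --version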
3. Configure the Databricks CLI
Step 1: Generate a Databricks Access Token
1. From the Azure Portal, open your Databricks workspace resource and launch the workspace.
2. Click on your profile icon in the top-right corner and select User Settings.
3. Under Access Tokens, generate a new token and copy it.
Step 2: Configure the CLI
Open the Azure Cloud Shell or your local terminal and run the following command:
databricks configure --token
You’ll be prompted to provide:
o Databricks Host URL: Enter your Databricks workspace URL (e.g., https://<databricks-instance>.azuredatabricks.net).
o Access Token: Paste the access token you generated.
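The CLI saves these values to a profile file, by default ~/.databrickscfg, which looks roughly like this (host and token are placeholders):
[DEFAULT]
host = https://<databricks-instance>.azuredatabricks.net
token = <your-access-token>
A quick way to verify the configuration is to list the workspace root:
databricks workspace ls /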
4. Export Notebook from Source Databricks Account
Step 1: Set Up the Azure Databricks Account
1. Ensure that all necessary resources are deployed in the source account, including:
o Subscription
o Resource group
o Compute cluster
Step 2: Export the Notebook
Use the Databricks CLI to export the notebook from the source workspace to your local
machine:
databricks workspace export --format SOURCE /Workspace/Users/<user-email>/SamplePOC ./SamplePOC.py
This command exports the notebook from the source workspace to your local directory as
a .py file.
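If you need to migrate a whole folder of notebooks rather than a single file, the CLI also provides a recursive variant; a sketch with placeholder paths:
databricks workspace export_dir /Workspace/Users/<user-email> ./exported-notebooks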
NOTE:
If you do not wish to export the notebook to your local machine, configure a storage account that the CLI can write to and use it as the intermediate location instead, as sketched below.
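For example, if you run the CLI from Azure Cloud Shell backed by a storage account, the file share mounted at ~/clouddrive can hold the exported file instead of your local machine. A hedged sketch, assuming the mount exists in your shell:
cd ~/clouddrive
databricks workspace export --format SOURCE /Workspace/Users/<user-email>/SamplePOC ./SamplePOC.py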
5. Import the Notebook to the Target Account
Step 1: Set Up Target Databricks Account
Provision all required resources in the target Azure Databricks account (a scripted example follows this list):
o Subscription
o Resource group
o Compute cluster
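If you prefer to script the provisioning, the Azure CLI can create the resource group and workspace (the compute cluster is then created from the workspace UI or the Databricks CLI). A minimal sketch, assuming the databricks extension is available and all names, locations, and SKUs are placeholders:
az extension add --name databricks
az group create --name rg-databricks-target --location eastus
az databricks workspace create --resource-group rg-databricks-target --name target-workspace --location eastus --sku standard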
Step 2: Import the Notebook
Once the resources are set up, import the notebook into the target Databricks account
from your local machine (or storage):
databricks workspace import --language PYTHON --format SOURCE ./SamplePOC.py /Workspace/Users/<user-email>/SamplePOC
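To confirm the notebook landed in the target workspace, list the destination folder; if the notebook already exists there, re-run the import with the --overwrite flag:
databricks workspace ls /Workspace/Users/<user-email>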
6. Run the Notebook
After importing, open the Databricks workspace in the target account and run the
notebook to test the migration.
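If you prefer to validate from the CLI rather than the UI, you can submit a one-off run against an existing cluster. A hedged sketch: the cluster ID is a placeholder and run-spec.json is a file you create yourself:
cat > run-spec.json <<'EOF'
{
  "run_name": "smoke-test-SamplePOC",
  "existing_cluster_id": "<cluster-id>",
  "notebook_task": { "notebook_path": "/Workspace/Users/<user-email>/SamplePOC" }
}
EOF
databricks runs submit --json-file run-spec.json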
Summary of Key Commands
Install Databricks CLI:
pip install databricks-cli
Configure Databricks CLI:
databricks configure --token
Export Notebook:
databricks workspace export --format SOURCE /Workspace/Users/<user-email>/SamplePOC ./SamplePOC.py
Import Notebook:
databricks workspace import --language PYTHON --format SOURCE ./SamplePOC.py /Workspace/Users/<user-email>/SamplePOC
This workflow guides you through migrating Databricks notebooks between Azure Databricks
workspaces using the Databricks CLI.