  1. Databricks: How do I get path of current notebook?

    Nov 29, 2018 · The issue is that Databricks does not have integration with VSTS. A workaround is to download the notebook locally using the CLI and then use git locally. I would, however, …
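
    A minimal sketch of the usual answer to the title question, assuming it runs inside a Databricks notebook where dbutils is injected automatically:

        # Read the current notebook's workspace path from the notebook context.
        notebook_path = (
            dbutils.notebook.entry_point.getDbutils()
            .notebook()
            .getContext()
            .notebookPath()
            .get()
        )
        print(notebook_path)  # e.g. /Users/someone@example.com/my_notebook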

  2. Connecting C# Application to Azure Databricks - Stack Overflow

    The Data Lake is hooked up to Azure Databricks. The requirement is that Azure Databricks be connected to a C# application so that queries can be run and results retrieved, all from the C# …

  3. Databricks: managed tables vs. external tables - Stack Overflow

    Jun 21, 2024 · While Databricks manages the metadata for external tables, the actual data remains in the specified external location, providing flexibility and control over the data storage …
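
    A short sketch of the difference, assuming Delta tables and a hypothetical ADLS path for the external location:

        # Managed table: Databricks owns both the metadata and the underlying files;
        # DROP TABLE deletes the data as well.
        spark.sql("CREATE TABLE demo_managed (id INT, name STRING)")

        # External table: only the metadata is registered; the files stay at the
        # LOCATION you supply, and DROP TABLE leaves them untouched.
        spark.sql("""
            CREATE TABLE demo_external (id INT, name STRING)
            USING DELTA
            LOCATION 'abfss://container@account.dfs.core.windows.net/tables/demo_external'
        """)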

  4. Databricks: Download a dbfs:/FileStore File to my Local Machine?

    Feb 28, 2018 · I am using Databricks Community Edition to teach an undergraduate module in Big Data Analytics in college. I have Windows 7 installed on my local machine. I have checked that …
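
    One common route is the Databricks CLI's fs cp command; a rough sketch driving it from Python, assuming the CLI is installed and configured, with a hypothetical file name:

        import subprocess

        # Copy a file from DBFS to the local working directory via the Databricks CLI.
        subprocess.run(
            ["databricks", "fs", "cp", "dbfs:/FileStore/tables/sample.csv", "./sample.csv"],
            check=True,
        )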

  5. List databricks secret scope and find referred keyvault in azure ...

    Jun 23, 2022 · I found the fastest way to identify the key vault that a scope points to is to use the Secrets API. First, in the Databricks workspace, go to Settings → Developer → Manage Access …
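
    A sketch of that Secrets API call, with a hypothetical workspace URL and token; Azure Key Vault-backed scopes carry the vault's DNS name and resource id in the response:

        import requests

        host = "https://adb-1234567890123456.7.azuredatabricks.net"  # hypothetical
        token = "dapi..."  # personal access token

        resp = requests.get(
            f"{host}/api/2.0/secrets/scopes/list",
            headers={"Authorization": f"Bearer {token}"},
        )
        resp.raise_for_status()

        for scope in resp.json().get("scopes", []):
            if scope.get("backend_type") == "AZURE_KEYVAULT":
                meta = scope.get("keyvault_metadata", {})
                print(scope["name"], meta.get("dns_name"), meta.get("resource_id"))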

  6. python - How to pass the script path to %run magic command as …

    Aug 22, 2021 · I want to run a notebook in Databricks from another notebook using %run. I also want to be able to send the path of the notebook that I'm running to the main notebook as a …
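
    Since %run only takes a literal path, the usual workaround is dbutils.notebook.run, which accepts a path held in a variable plus a dictionary of arguments; a sketch with hypothetical paths and argument names:

        # Run the child notebook with a 60-second timeout and pass the caller's path along.
        child_path = "/Users/someone@example.com/child_notebook"
        result = dbutils.notebook.run(child_path, 60, {"caller_path": "/Users/someone@example.com/main"})
        print(result)  # whatever the child returns via dbutils.notebook.exit(...)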

  7. REST API to query Databricks table - Stack Overflow

    Jul 24, 2022 · Update, April 2023: there is a new SQL Execution API for querying Databricks SQL tables via REST. It's possible to use Databricks for that, although it heavily …
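
    A sketch of a call to that API (the SQL Statement Execution endpoint), assuming a hypothetical workspace URL, token, warehouse id, and table:

        import requests

        host = "https://adb-1234567890123456.7.azuredatabricks.net"  # hypothetical
        token = "dapi..."
        payload = {
            "warehouse_id": "abcdef1234567890",  # a running SQL warehouse
            "statement": "SELECT * FROM samples.nyctaxi.trips LIMIT 10",
            "wait_timeout": "30s",
        }

        resp = requests.post(
            f"{host}/api/2.0/sql/statements",
            headers={"Authorization": f"Bearer {token}"},
            json=payload,
        )
        resp.raise_for_status()
        body = resp.json()
        print(body["status"]["state"])                        # e.g. SUCCEEDED
        print(body.get("result", {}).get("data_array", []))   # rows as lists of strings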

  8. How to use python variable in SQL Query in Databricks?

    Jun 4, 2022 · Two other ways to access the variable are: 1. the spark.sql way you mentioned, e.g. spark.sql(f"select * from tdf where var={max_date2}"); 2. to create a …
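
    A sketch of the f-string approach from the snippet; the table tdf comes from the question and the value is hypothetical:

        # In a Databricks notebook `spark` is already defined; the f-string simply
        # interpolates the Python value into the SQL text before it is parsed.
        max_date2 = 20220604
        df = spark.sql(f"select * from tdf where var = {max_date2}")
        df.show()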

  9. Installing multiple libraries 'permanently' on Databricks' cluster ...

    Feb 28, 2024 · The easiest way is to use the Databricks CLI's libraries command for an existing cluster (or the create job command, specifying the appropriate parameters for your job cluster). You can also use the REST …
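
    A sketch of the REST route, assuming a hypothetical workspace URL, token, and cluster id; the Libraries API installs the listed packages on the cluster so they persist across restarts:

        import requests

        host = "https://adb-1234567890123456.7.azuredatabricks.net"  # hypothetical
        token = "dapi..."
        payload = {
            "cluster_id": "0228-123456-abcdefgh",
            "libraries": [
                {"pypi": {"package": "pandas==2.2.0"}},
                {"maven": {"coordinates": "com.crealytics:spark-excel_2.12:0.13.5"}},
            ],
        }

        resp = requests.post(
            f"{host}/api/2.0/libraries/install",
            headers={"Authorization": f"Bearer {token}"},
            json=payload,
        )
        resp.raise_for_status()  # the endpoint returns an empty body on success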

  10. databricks: writing spark dataframe directly to excel

    Nov 29, 2019 · In order to run the above code, you need to install the com.crealytics:spark-excel_2.12:0.13.5 library (or a more recent version, of course), for …
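
    A sketch of the write itself, assuming the spark-excel library is already attached to the cluster and using a hypothetical output path:

        # Build a small demo DataFrame and write it as an .xlsx file via spark-excel.
        df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])
        (
            df.write.format("com.crealytics.spark.excel")
            .option("header", "true")
            .mode("overwrite")
            .save("dbfs:/FileStore/output/report.xlsx")
        )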
