  1. Is there a way to use parameters in Databricks in SQL with …

    Sep 29, 2024 · There is a lot of confusion with respect to the use of parameters in SQL, but I see Databricks has started harmonizing heavily (for example, 3 months back, IDENTIFIER() didn't work with catalog; now it does). Check my answer for a working solution.

  2. Printing secret value in Databricks - Stack Overflow

    Nov 11, 2021 · First, install the Databricks Python SDK and configure authentication per the docs here. pip install databricks-sdk Then you can use the approach below to print out secret values. Because the code doesn't run in Databricks, the secret values aren't redacted. For my particular use case, I wanted to print values for all secrets in a given scope.
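The approach described above can be sketched as a small helper. This is a minimal sketch, not the answer's exact code: `collect_scope_secrets` is a hypothetical name, and it assumes a configured `databricks.sdk.WorkspaceClient`, whose secrets API returns values base64-encoded.

```python
import base64

def collect_scope_secrets(client, scope_name):
    """Return {key: decoded value} for every secret in a scope.

    `client` is assumed to be a databricks.sdk.WorkspaceClient
    (pip install databricks-sdk, with auth configured); the secrets
    API returns values base64-encoded, so we decode them here.
    """
    secrets = {}
    for metadata in client.secrets.list_secrets(scope=scope_name):
        response = client.secrets.get_secret(scope=scope_name, key=metadata.key)
        secrets[metadata.key] = base64.b64decode(response.value).decode("utf-8")
    return secrets
```

Because this runs outside Databricks, printing the returned values (e.g. `print(collect_scope_secrets(WorkspaceClient(), "my-scope"))`) shows them unredacted.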

  3. Databricks: managed tables vs. external tables - Stack Overflow

    Jun 21, 2024 · While Databricks manages the metadata for external tables, the actual data remains in the specified external location, providing flexibility and control over the data storage lifecycle. This setup allows users to leverage existing data storage infrastructure while utilizing Databricks' processing capabilities.

  4. Databricks: How do I get path of current notebook?

    Nov 29, 2018 · Databricks is smart and all, but how do you identify the path of your current notebook? The guide on the website does not help. It suggests: %scala dbutils.notebook.getContext.notebookPath res1: ...

  5. Databricks - Download a dbfs:/FileStore file to my Local Machine

    Method 3: Using a third-party tool named DBFS Explorer. DBFS Explorer was created as a quick way to upload and download files to the Databricks filesystem (DBFS). This works with both AWS and Azure instances of Databricks. You will need to create a bearer token in the web interface in order to connect.

  6. databricks: writing spark dataframe directly to excel

    Nov 29, 2019 · Is there any method to write a Spark dataframe directly to xls/xlsx format? Most of the examples on the web are for pandas dataframes, but I would like to use a spark datafr...

  7. How to use python variable in SQL Query in Databricks?

    Jun 4, 2022 · I am trying to convert a SQL stored procedure to a Databricks notebook. In the stored procedure below, 2 statements are to be implemented. Here the tables 1 and 2 are delta lake tables in databricks c...
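As a hedged sketch of what this question covers (the table and column names below are made up, not from the question), there are two common ways to feed a Python variable into a SQL statement in a Databricks notebook:

```python
# Hypothetical table and column names, for illustration only.
min_amount = 100

# Option 1: interpolate with an f-string (simple, but beware SQL
# injection if the value comes from untrusted input).
query = f"SELECT * FROM table1 WHERE amount > {min_amount}"

# Option 2: named parameter markers (Spark 3.4+ / recent Databricks
# runtimes), which spark.sql() binds safely. In a notebook:
#   df = spark.sql(
#       "SELECT * FROM table1 WHERE amount > :min_amount",
#       args={"min_amount": min_amount},
#   )
```

Option 2 keeps the value out of the SQL text entirely, which is why recent answers tend to prefer it.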

  8. azure devops - How can I pass parameters to databricks.yml in ...

    Nov 24, 2023 · Background: I have a separate Databricks Workspace for each environment, and I am building an Azure DevOps pipeline to deploy Databricks Asset Bundles to these environments. Question: The asset bundle is configured in a databricks.yml file. How do I pass parameters to this file so I can change variables depending on the environment?
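One documented pattern for this (sketched here with hypothetical bundle, variable, and target names) is to declare variables in databricks.yml and override them per target:

```yaml
# Minimal databricks.yml sketch; names are illustrative.
bundle:
  name: my_bundle

variables:
  catalog:
    description: Target catalog for this environment
    default: dev_catalog

targets:
  dev:
    default: true
  prod:
    variables:
      catalog: prod_catalog
```

Elsewhere in the bundle the value is referenced as ${var.catalog}; deploying with `databricks bundle deploy -t prod` picks up the prod override, and `--var="catalog=..."` can override it ad hoc from the pipeline.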

  9. How to read xlsx or xls files as spark dataframe - Stack Overflow

    Jun 3, 2019 · Can anyone let me know how we can read xlsx or xls files as a Spark dataframe without converting them? I have already tried reading with pandas and then converting to a Spark dataframe, but got...

  10. Saving a file locally in Databricks PySpark - Stack Overflow

    Sep 3, 2017 · Asked 7 years, 11 months ago; modified 2 years ago; viewed 24k times.
