13. Use Python to Manage OneLake in Fabric

  • Published: 19 Oct 2024

Comments • 10

  • @syednayyar
    @syednayyar 2 months ago

    Great video! Can you also please make a video on sending a JSON file from an Android app (fetching the file from the local Downloads folder on Android) to the OneLake root folder, or any other folder? I have tried but it's not working.

  • @NarayanJangid-q6i
    @NarayanJangid-q6i 25 days ago

    Hi @WafaStudies, I am getting "sampleLakehouse.Lakehouse is not found in the Workspace". Why? I have rechecked the workspace and lakehouse names and they are all correct.
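
One thing worth double-checking here: with the azure-storage-file-datalake SDK, the workspace name is the OneLake "file system" and the lakehouse is addressed as a directory named "<lakehouseName>.Lakehouse" inside it. A minimal sketch (placeholder workspace/lakehouse names, interactive browser auth assumed) that lists the paths so you can confirm the names actually resolve:

    from azure.identity import InteractiveBrowserCredential
    from azure.storage.filedatalake import DataLakeServiceClient

    # Placeholder names below; interactive browser auth is assumed
    credential = InteractiveBrowserCredential()
    service_client = DataLakeServiceClient(
        "https://onelake.dfs.fabric.microsoft.com", credential=credential
    )

    # The workspace is the file system; the lakehouse is a directory "<name>.Lakehouse"
    file_system_client = service_client.get_file_system_client("myWorkspace")

    # Listing paths under the lakehouse Files folder confirms the names resolve
    for path in file_system_client.get_paths(path="sampleLakehouse.Lakehouse/Files"):
        print(path.name)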

  • @deepjyotimitra1340
    @deepjyotimitra1340 7 months ago

    What an explanation 👏 👌 Loved it.
    For a live project, can we use an SPN for auth?

  • @cheeliAmarnath
    @cheeliAmarnath 5 months ago +1

    I am not able to install the packages.
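
If the installs are failing, the two PyPI packages this approach relies on are azure-storage-file-datalake and azure-identity; assuming a Fabric or Jupyter notebook (drop the % when running pip from a terminal), they are typically installed like this:

    # Inside a Fabric/Jupyter notebook cell
    %pip install azure-storage-file-datalake azure-identity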

  • @zubair489
    @zubair489 8 months ago +1

    How do you use a service principal instead of logging in interactively?

    • @BharathKumar-ch4tp
      @BharathKumar-ch4tp 7 months ago

      import os
      from azure.identity import ClientSecretCredential
      from azure.storage.filedatalake import (
          DataLakeServiceClient,
          DataLakeDirectoryClient,
      )

      # Set your account, workspace, and item path here
      ACCOUNT_NAME = "onelake"
      WORKSPACE_NAME = ""
      DATA_PATH = ""
      LOCAL_FILE_PATH = r""

      def upload_file_to_directory(directory_client: DataLakeDirectoryClient, local_file_path: str):
          # Derive the target file name from the local path and upload, overwriting if present
          file_name = os.path.basename(local_file_path)
          file_client = directory_client.get_file_client(file_name)
          with open(local_file_path, mode="rb") as data:
              file_client.upload_data(data, overwrite=True)

      def main():
          # Create a service client using service principal credentials
          client_id = ""
          tenant_id = ""
          client_secret = ""
          # OneLake DFS endpoint (note the https:// scheme on the account URL)
          account_url = f"https://{ACCOUNT_NAME}.dfs.fabric.microsoft.com"
          credential = ClientSecretCredential(tenant_id, client_id, client_secret)
          service_client = DataLakeServiceClient(account_url, credential=credential)

          # Create a file system client for the workspace
          file_system_client = service_client.get_file_system_client(WORKSPACE_NAME)

          # Get the directory client for the specified data path
          directory_client = file_system_client.get_directory_client(DATA_PATH)

          # Upload the local file to the specified directory in the Data Lake storage
          upload_file_to_directory(directory_client, LOCAL_FILE_PATH)

      if __name__ == "__main__":
          main()

      If you already know how to create a service principal, use this code.

    • @BharathKumar-ch4tp
      @BharathKumar-ch4tp 7 months ago

      This is for loading data from a local computer into a Lakehouse using Python.

  • @rogerbheeshmaa3606
    @rogerbheeshmaa3606 8 months ago

    How can we fetch data from OneLake? Please post a video on that.
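
For the read direction, a minimal sketch with the same SDK (placeholder workspace, lakehouse, and file names; interactive browser auth assumed) that downloads a file from OneLake to the local machine:

    from azure.identity import InteractiveBrowserCredential
    from azure.storage.filedatalake import DataLakeServiceClient

    # Connect to the OneLake DFS endpoint (placeholder names below)
    credential = InteractiveBrowserCredential()
    service_client = DataLakeServiceClient(
        "https://onelake.dfs.fabric.microsoft.com", credential=credential
    )
    file_system_client = service_client.get_file_system_client("myWorkspace")
    file_client = file_system_client.get_file_client("myLakehouse.Lakehouse/Files/sample.csv")

    # Download the file contents and write them to a local copy
    downloaded = file_client.download_file()
    with open("sample.csv", "wb") as local_file:
        local_file.write(downloaded.readall())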

  • @kartiktak6270
    @kartiktak6270 7 months ago

    When will you add more videos?