Thanks a lot for this very clear video. I spent hours trying to do this until I luckily stumbled across your video. I agree that this video should definitely have more views!!
I'm glad it helped. Thanks for the support!
Videos like yours should have way more views. Thank you for what you do.
Thanks so much! I really appreciate the support
Fantastic video. Very clear explanation and clean code for us to follow. Thank you!
What happens when you only have a SAS token on hand? Can it be used in place of the account key?
Hi @investing3370, try changing line 16 to:
sas_token = 'your sas token'
connect_str = 'DefaultEndpointsProtocol=https;AccountName=your_account_name;SharedAccessSignature=' + sas_token + ';EndpointSuffix=core.windows.net'
(The AccountName field is needed so the client can work out the endpoint URL.) You won't need lines 11 to 13.
Let me know if that works!
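For anyone following along, here is a minimal sketch of that approach end to end; the account, container, and file names are placeholders, not from the video:

from azure.storage.blob import BlobServiceClient

sas_token = 'your sas token'
connect_str = ('DefaultEndpointsProtocol=https;AccountName=your_account_name;'
               'SharedAccessSignature=' + sas_token + ';EndpointSuffix=core.windows.net')

# from here on, the client behaves the same as one built from an account key
blob_service_client = BlobServiceClient.from_connection_string(connect_str)
container_client = blob_service_client.get_container_client('your-container')
data = container_client.download_blob('your-file.csv').readall()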
What if I have multiple directories inside the container and blob files are present inside those directories?
I have a Python script that reads an EDI file and, from there, creates unique data tags and elements (basically a CSV file with one tag and data field per line). I need a process to load this into Azure and, for the outbound, to extract it into the same tags+data. This looks close. Anyone interested in giving me a quote for this (can you show it working?). Thanks.
I have an image dataset stored in Azure datastores file storage, and a model in Azure ML Studio. How do I access the dataset?
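One route that should work with the v1 azureml-core SDK is registering the files as a FileDataset and downloading them where the model runs; the datastore name and path below are placeholders, not something from the video:

from azureml.core import Workspace, Datastore, Dataset

# assumes a config.json for your workspace is present
ws = Workspace.from_config()
datastore = Datastore.get(ws, 'your_file_datastore')  # placeholder name
dataset = Dataset.File.from_files(path=[(datastore, 'images/**')])
local_paths = dataset.download(target_path='./data', overwrite=True)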
Ooh. I support a SaaS app and _hate_ Azure Storage Explorer with a burning passion. If I can access logs etc. from Python instead of ASE, that would be a very happy rabbit hole to go down. I suspect I don't have access to those keys, though.
Mega like, thank you so much!
Glad it helped!
Do you have any suggestions for how to then write a file to the storage blob in a similar fashion?
I have the same question.
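In case it helps, a minimal sketch using upload_blob; the container and file names are placeholders, and connect_str is built the same way as in the video:

from azure.storage.blob import BlobServiceClient

connect_str = 'your connection string'  # placeholder, as built in the video
service = BlobServiceClient.from_connection_string(connect_str)
container_client = service.get_container_client('your-container')

# upload a local file; overwrite=True replaces an existing blob of the same name
with open('results.csv', 'rb') as f:
    container_client.upload_blob(name='results.csv', data=f, overwrite=True)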
Thank you for this video! It saved me time.
What if you did not want to bring the files down to the local machine? How would you process the files up on Azure, and run the Python code on Azure? For instance, the files were placed in blob storage and now you want to process them, clean them up, and then save the results back out to blob storage. The Python code is not complicated; I just want to know what the pieces/configuration on Azure are.
You'd probably need a VM
I'm trying to do this in Azure Data Factory; a custom activity in the Data Factory can execute Python files that are saved on a blob. You will need an Azure Batch account with a pool that has Python installed on it (pools are based on VMs, and some VMs have Python pre-installed).
Another way could be Azure Function Apps, but I have not tried that enough.
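Whichever compute you pick (Function, Batch pool, or a VM), the read-process-write pattern itself can stay entirely in memory so nothing touches a local disk. A rough sketch with placeholder names:

import io
import pandas as pd
from azure.storage.blob import BlobServiceClient

connect_str = 'your connection string'  # placeholder, as built in the video
service = BlobServiceClient.from_connection_string(connect_str)
container_client = service.get_container_client('your-container')

# download straight into memory, clean up, and write the result back
raw = container_client.download_blob('incoming/input.csv').readall()
df = pd.read_csv(io.BytesIO(raw))
df = df.dropna()  # stand-in for the real clean-up logic
container_client.upload_blob(name='cleaned/output.csv',
                             data=df.to_csv(index=False),
                             overwrite=True)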
@dotpi5907 can we load a saved Hugging Face model like this? If so, can you guide me how? Or is there an alternate solution? Many thanks in advance. Cheers
Hi Nik K, thanks for the comment! That sounds like a great idea for a video. I'm away for a few days, but when I get back I'll look into it and (all things going well) make a video on it.
getting HTTP Error 403: This request is not authorized to perform this operation using this resource type
Hi @satyakipradhan2359, thanks for the comment. 403 means that your connection is working but you don't have permission, so your Azure account might have extra security on it.
Try changing a few of the other options in the 'Generate SAS' tab, such as adding your IP address to 'Allowed IP addresses' and checking that you have read permission in the 'Permissions' dropdown. Hope that helps!
thank you for this video
Thanks for watching!
thanks! super helpful 👍
That's great to hear @AndresPapaquiNotario
Can we do a similar thing to load video files from Azure blob storage using libraries like OpenCV?
I want to load and analyze videos from blob storage inside Azure Machine Learning Studio.
Hi sumit Sp, thanks for the comment. That sounds like an interesting project, and it should be doable. I'll have a play around and let you know if I figure it out, maybe this weekend.
@dotpi5907 Thank you for the reply. I tried the above and I am able to do it now. We can read videos from Azure blob storage by providing the correct URI path, and then we can convert them into frames and store them in another location to use in our ML models.
Now I'm looking for a way to read live video coming directly from a camera 😄
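For anyone else trying this, a rough sketch of the approach described above; the SAS URL is a placeholder, and OpenCV needs its FFmpeg backend to read HTTP sources:

import os
import cv2

# a blob URL with a SAS token appended, as generated in the portal
sas_url = 'https://youraccount.blob.core.windows.net/videos/clip.mp4?<sas-token>'

os.makedirs('frames', exist_ok=True)
cap = cv2.VideoCapture(sas_url)
frame_count = 0
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break  # end of stream or read error
    cv2.imwrite(f'frames/frame_{frame_count:05d}.png', frame)
    frame_count += 1
cap.release()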
Can we do the same for JSON files stored in blob storage?
I have the same question. Have you found a solution?
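Since the video builds a SAS URL and passes it to pd.read_csv, the same pattern should work for JSON blobs too; the URL below is a placeholder:

import pandas as pd

# sas_url built the same way as for the CSV files in the video
sas_url = 'https://youraccount.blob.core.windows.net/data/file.json?<sas-token>'
df = pd.read_json(sas_url)  # add lines=True for JSON Lines files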
Is there a way to select files/blobs from an Azure container using a Flask application for further use in the application, just like request.files.getlist('files') helps in selecting files from the local directory? Can someone help me with this?
this worked for me:
blob_list = []
for blob in container_client.list_blobs():
    if blob.name.startswith('resumes/') and blob.name.endswith('.pdf'):
        blob_list.append(blob.name)
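A small refinement, in case the container is large: list_blobs can filter by prefix on the server side with name_starts_with, so only the 'resumes/' blobs are returned at all:

blob_list = []
# the service only returns blobs whose names start with the prefix
for blob in container_client.list_blobs(name_starts_with='resumes/'):
    if blob.name.endswith('.pdf'):
        blob_list.append(blob.name)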
Hi Sir,
Which version of pandas did you use? Also, can we load into a PySpark DataFrame instead of a pandas DataFrame? If yes, please share the syntax.
Hi @learner-df2ns, try replacing these lines:
df = pd.read_csv(sas_url)
df_list.append(df)
df_combined = pd.concat(df_list, ignore_index=True)
with these lines:
from functools import reduce
from pyspark.sql import SparkSession
spark = SparkSession.builder.appName("CSVtoDataFrame").getOrCreate()
# note: Spark can't read a plain https SAS URL, so csv_file_path needs to be
# a path Spark can reach (e.g. a wasbs:// or abfss:// URI, or a local file)
df = spark.read.csv(csv_file_path, header=True, inferSchema=True)
df_list.append(df)
df_combined = reduce(lambda df1, df2: df1.union(df2), df_list)
spark.stop()  # only stop the session once you're done with the DataFrames
You da real MVP
Thanks @charlieevert7666!
Well explained! Actually, I want to read a ".docx" file from the blob. How can I do that?
You will probably have to read it as bytes and store it locally, or create a BytesIO object first and then pass it to python-docx.
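Something like this should do it, assuming python-docx is installed; the blob SAS URL is a placeholder:

import io
from docx import Document  # pip install python-docx
from azure.storage.blob import BlobClient

# download the blob into memory and hand python-docx a file-like object
blob_client = BlobClient.from_blob_url(
    'https://youraccount.blob.core.windows.net/docs/report.docx?<sas-token>')
doc = Document(io.BytesIO(blob_client.download_blob().readall()))
text = '\n'.join(p.text for p in doc.paragraphs)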
can we set the expiry time to be infinite?
Hi Niraj, thank you for the comment and sorry for the late reply. I don't think you can set the expiry date to be infinite, unfortunately. The main reason is that if someone outside your organization were to get hold of your SAS key, they would be able to access the file for as long as the key is valid or until the file is deleted.
I don't know much more on this subject, but there is some more info here: learn.microsoft.com/en-us/azure/storage/common/sas-expiration-policy?tabs=azure-portal.
Hope that helps!
Thank you so much!!!!
Glad it helped!