Want to master Clean Architecture? Go here: bit.ly/3PupkOJ
Want to unlock Modular Monoliths? Go here: bit.ly/3SXlzSt
Great video Milan!
Thanks a lot!
Great content as always. Thanks for sharing!
Much appreciated!
Thanks for your great content! I think we need a video on "How do I write a test for the Azure Blob service"
Great suggestion!
Can Azure Blob Storage handle access rights for files, or is that something we have to implement in the web application?
It can
In my practice project, I use Cloudinary (I think it's an equivalent of Blob Storage). I save files in this storage, and in the database I save only the paths to these files. What do you think about this approach?
Works like a charm
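For anyone curious, the pattern from the comment above (binary goes to object storage, only the path goes to the database) can be sketched like this. This is purely illustrative: the `Dictionary` stands in for a real database table, and a temp directory stands in for Cloudinary/Blob Storage; all names are made up for the example.

```csharp
using System;
using System.Collections.Generic;
using System.IO;

// Sketch of "store the file, persist only its path".
// A temp directory plays the role of Cloudinary/Blob Storage,
// and the Dictionary plays the role of a database table.
class FileStore
{
    private readonly string _root;
    private readonly Dictionary<Guid, string> _paths = new(); // the "database"

    public FileStore(string root)
    {
        _root = root;
        Directory.CreateDirectory(root);
    }

    public Guid Save(string fileName, byte[] content)
    {
        var id = Guid.NewGuid();
        var path = Path.Combine(_root, id + "_" + fileName);
        File.WriteAllBytes(path, content); // the bytes go to "storage"
        _paths[id] = path;                 // only the path goes to the "DB"
        return id;
    }

    public byte[] Load(Guid id) => File.ReadAllBytes(_paths[id]);
}
```

The upside is that the database stays small and cheap to back up; the trade-off is that the path and the file can drift out of sync, so deletes need to touch both.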
Thanks for sharing, @Milan Jovanović. What should I consider if I have to upload very large files (up to 10 GB)?
Consider uploading directly to Blob Storage with pre-signed URLs
@MilanJovanovicTech Could you please explain more? I believe we need to chunk the file into smaller pieces and upload them in parallel or something, but it doesn't look simple. Thoughts?
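For what it's worth, large uploads to Azure Block Blobs are built on exactly that idea: the file is split into blocks (staged with Put Block, possibly in parallel) and then committed as a whole with Put Block List. The Azure SDK's upload methods do the chunking and parallelism for you, so you rarely write this by hand. Below is a self-contained sketch of just the chunking step, with illustrative names; Azure requires block IDs to be base64-encoded and of equal length within one commit.

```csharp
using System;
using System.Collections.Generic;
using System.Text;

// Illustrative sketch of the chunking behind large Block Blob uploads:
// split the payload into fixed-size blocks and give each a base64 block ID.
// In real code the Azure SDK stages (Put Block) and commits (Put Block List)
// these for you; this only shows the mechanics.
static List<(string BlockId, byte[] Chunk)> ChunkForUpload(byte[] data, int blockSize)
{
    var blocks = new List<(string, byte[])>();
    for (int offset = 0, i = 0; offset < data.Length; offset += blockSize, i++)
    {
        int length = Math.Min(blockSize, data.Length - offset);
        var chunk = new byte[length];
        Array.Copy(data, offset, chunk, 0, length);
        // Equal-length, base64-encoded IDs, as Block Blob commits require.
        string blockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(i.ToString("d6")));
        blocks.Add((blockId, chunk));
    }
    return blocks;
}
```

Because each block is staged independently, failed chunks can be retried individually, which is what makes multi-GB uploads practical.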
Great content.
Can this be implemented in a dockerized application on a linux server?
Yes - but I don't think you want to run your own instance of Azurite.
Could you please make a video on sending a file from Angular and storing it locally in Azure Storage?
Sure
Are you going to create a course about Azure?
Unlikely
I did all this, but I do not see "Files" in my swagger document???
No idea, it should just "work".
@MilanJovanovicTech I just used the classic Azure Function pattern and did it that way. I kind of gave up on the minimalist approach. Good video though, it showed me how to do what I was trying to do.
Please make a video on microservices 🙏
Will consider
Blog -> Blob
Rofl, that autocorrect 🥲😂 Thanks for saving the day!
No need for Docker, it now runs within Visual Studio
How so?
This is bad practice for big files. You simply proxy the file through the server/API instead of going directly to Blob Storage.
If you need to upload multi-GB files, you bottleneck bandwidth and CPU time for no reason.
There's also pre-signed URL so you can hit Blob storage directly
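To illustrate the principle behind a pre-signed URL (not Azure's actual SAS string-to-sign format; in real code use `BlobSasBuilder` from the Azure SDK): the server signs "resource + expiry" with a secret it never hands out, and whoever holds the resulting URL can hit storage directly until it expires, so the API never proxies the bytes. A generic, hypothetical sketch:

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

// Generic pre-signed URL sketch: HMAC-sign the resource and expiry with a
// server-side secret. Azure SAS tokens follow the same principle, but with
// their own string-to-sign format; this is only the concept.
static string Sign(string resource, long expiresUnix, byte[] key)
{
    using var hmac = new HMACSHA256(key);
    return Convert.ToBase64String(
        hmac.ComputeHash(Encoding.UTF8.GetBytes($"{resource}\n{expiresUnix}")));
}

// Compose the URL the client would receive and use directly against storage.
static string PreSignUrl(string resource, long expiresUnix, byte[] key) =>
    $"{resource}?se={expiresUnix}&sig={Uri.EscapeDataString(Sign(resource, expiresUnix, key))}";

// Storage-side check: recompute the signature and reject expired tokens.
static bool Verify(string resource, long expiresUnix, string sig, byte[] key) =>
    expiresUnix > DateTimeOffset.UtcNow.ToUnixTimeSeconds()
        && Sign(resource, expiresUnix, key) == sig;
```

The key point is that verification needs no database lookup or session, which is why storage services can validate these URLs cheaply at the edge.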
Unhappy why not modular content
Have to keep it varied