- Videos: 89
- Views: 202,587
Pytalista
Joined 11 Oct 2022
Create new user account with Rest API Django rest-framework [Part4]
GitHub repository: github.com/pedrojunqueira/django_rest_tutorial
Views: 66
Videos
Adding permissions to API end point Django rest-framework [Part3]
24 views · 1 month ago
GitHub repository: github.com/pedrojunqueira/django_rest_tutorial
Add Swagger UI to a Django Rest Framework [Part2]
134 views · 1 month ago
In this video I will teach you how to add a Swagger UI to a Django Rest Framework project. Git repository: github.com/pedrojunqueira/django_rest_tutorial Python package: drf-yasg.readthedocs.io/en/stable/readme.html
Create a CRUD API with Django Rest API [PART 1]
103 views · 1 month ago
In this video, the first of a 4-part series, I am going to teach you step by step how to create a Django Rest Framework backend application. Git repository: github.com/pedrojunqueira/django_rest_tutorial Django documentation: docs.djangoproject.com/en/5.1/ Django Rest Framework documentation: www.django-rest-framework.org/#quickstart
I failed Microsoft Certification Exam AZ-400
106 views · 2 months ago
This is a reflection on how and why I failed the AZ-400 twice, and what I am doing about it. I also posted about it on LinkedIn. Other failure stories: ruclips.net/video/XN-ay0VW-ts/видео.html ruclips.net/video/vZdAY17tvHI/видео.html
Azure DevOps Tutorial for Beginners Python Example CICD
352 views · 2 months ago
In this quick tutorial I will teach you how to create your first CI/CD pipeline in Azure DevOps Services. GitHub code: github.com/microsoft/python-sample-vscode-flask-tutorial Tutorial link: learn.microsoft.com/en-us/azure/devops/pipelines/ecosystems/python?view=azure-devops Create an Azure DevOps organization: learn.microsoft.com/en-us/azure/devops/pipelines/get-started/pipelines-sign-up?view=azur...
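As a rough sketch of what such a pipeline can look like (this file is illustrative and assumes a Python project with `requirements.txt` and pytest tests; it is not taken from the video):

```yaml
# Hypothetical azure-pipelines.yml for a Python app; names and versions are illustrative.
trigger:
  - main

pool:
  vmImage: ubuntu-latest

steps:
  - task: UsePythonVersion@0
    inputs:
      versionSpec: '3.11'

  - script: |
      python -m pip install --upgrade pip
      pip install -r requirements.txt
    displayName: 'Install dependencies'

  - script: |
      pip install pytest
      pytest
    displayName: 'Run tests'
```

Committing this file to the repository root is enough for Azure DevOps to pick it up when you create the pipeline from the repo.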
How to create a custom GitHub Action with Docker [Python]
190 views · 3 months ago
In this video I will teach you step by step how to create a custom GitHub Action with Docker to run a Python script that calls the Databricks API. Code: github.com/pedrojunqueira/github-custom-docker GitHub Actions Documentation: docs.github.com/en/actions/sharing-automations/creating-actions/creating-a-docker-container-action
How to use tags to create a release in GitHub with Actions
630 views · 3 months ago
In this video I am going to explain the difference between a tag, a commit, and a branch, and also create a release using GitHub Actions. Code on GitHub: github.com/pedrojunqueira/git-tag-demo
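The tag mechanics discussed in the video can be sketched in a throwaway repo (the push line, which is what would trigger a tag-filtered release workflow, is left commented out since it needs a remote):

```shell
# Work in a temporary repo so the demo is self-contained
repo=$(mktemp -d) && cd "$repo"
git init -q
git config user.email "demo@example.com"
git config user.name "Demo"
git commit -q --allow-empty -m "initial commit"

# An annotated tag is its own object with a tagger, date, and message,
# pointing at a commit; a branch is just a movable pointer to a commit.
git tag -a v1.0.0 -m "Release v1.0.0"
git tag --list

# Pushing the tag is what triggers a tag-filtered GitHub Actions workflow:
# git push origin v1.0.0
```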
Setting Permissions in Databricks Asset Bundle - Advanced Example
279 views · 4 months ago
In this video I will take you through an example of how to implement permissions in a Databricks Asset Bundle. Documentation: learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/permissions Code example: github.com/pedrojunqueira/PytalistaYT/tree/master/Python/databricks-asset-bundle-permission/test_permissions
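For orientation, a minimal top-level `permissions` block in a `databricks.yml` has this shape (the principal names below are hypothetical):

```yaml
# Hypothetical databricks.yml fragment; principal names are made up.
bundle:
  name: permissions_demo

# Top-level permissions apply to all resources deployed by the bundle.
permissions:
  - level: CAN_VIEW
    group_name: data-readers
  - level: CAN_MANAGE
    user_name: admin@example.com
  - level: CAN_RUN
    service_principal_name: "00000000-0000-0000-0000-000000000000"
```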
How to Enable and Use Databricks Serverless Compute
810 views · 4 months ago
I am going to teach you how to get started with Databricks Serverless Compute: enabling it, then examples in notebooks and workflows. #databricks #code #serverlesscomputing #spark #python Documentation: learn.microsoft.com/en-us/azure/databricks/compute/serverless/ Limitations: learn.microsoft.com/en-us/azure/databricks/compute/serverless/limitations?so...
Unable to access Account console under Azure Databricks
1K views · 4 months ago
Unable to access the Account console under Azure Databricks because of the error: "Selected user account does not exist in tenant 'Microsoft Services' and cannot access the application '2ff814a6-3304-4ab8-85cb-cd0e6f879c1d' in that tenant. The account needs to be added as an external user in the tenant first. Please use a different account." Solution from this video: community.databricks.com/t5/administration-arch...
How to ssh from a linux client into a Azure Linux VM
249 views · 4 months ago
In this video I will do a quick demo of how to create a Linux VM in Azure and ssh into it from your local Linux client. Azure docs: Create SSH keys: learn.microsoft.com/en-us/azure/virtual-machines/linux/create-ssh-keys-detailed Connect to the VM: learn.microsoft.com/en-us/azure/virtual-machines/linux-vm-connect?tabs=Linux
Using Variables in Databricks Asset Bundle - Advanced Example
748 views · 4 months ago
In this video I will show how to use variables in Databricks Asset Bundles effectively: keeping different values per environment, overriding target resources, and looking up resources in your Databricks workspaces. Code: github.com/pedrojunqueira/PytalistaYT/tree/master/Python/databricks-asset-bundle-variable/demo Documentation: learn.microsoft.com/en-us/azure/databricks/dev-tools/...
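For orientation, a sketch of what such a `databricks.yml` fragment can look like (the bundle, variable, and warehouse names below are hypothetical):

```yaml
# Hypothetical databricks.yml fragment; names are illustrative.
bundle:
  name: demo_bundle

variables:
  catalog:
    description: Target catalog for the deployment
    default: dev_catalog
  warehouse_id:
    description: SQL warehouse to run against
    lookup:
      warehouse: "Shared Warehouse"   # resolved from the workspace at deploy time

targets:
  dev:
    default: true
  prod:
    variables:
      catalog: prod_catalog           # per-target override of the default
```

Elsewhere in the bundle the values are referenced as `${var.catalog}` and `${var.warehouse_id}`.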
Finally Monetized after 1000 subscribers
111 views · 5 months ago
Deploy Terraform with GitHub Actions [Terraform Cloud API]
379 views · 5 months ago
Deploy Databricks Workspace with Terraform
859 views · 6 months ago
Deploy a Unity Catalog Cluster in Azure Databricks using Terraform
354 views · 6 months ago
Deploy Resources in Azure with Terraform
564 views · 7 months ago
Deploy Databricks Asset Bundles using GitHub Actions [DevOps]
2.6K views · 7 months ago
Build Python Packages in a Databricks Asset Bundle
1.6K views · 7 months ago
How to create and deploy Azure Function Using VS Code NO MUSIC
1.7K views · 7 months ago
How to deploy Databricks Asset Bundle Projects
3.3K views · 7 months ago
Deploy Storage Account with Bicep and GitHub Actions CI/CD [VS code]
242 views · 7 months ago
How to build your ChatGPT clone with Django and HTMX
412 views · 8 months ago
Demo how to CDC with Debezium into Kafka in the cloud
662 views · 8 months ago
How to read Kafka Topic with Databricks [Confluent]
1.5K views · 8 months ago
How to run multiple notebooks in a thread in Databricks [Python]
798 views · 8 months ago
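The pattern this video's title describes is usually a thread pool fanning out over `dbutils.notebook.run`. A minimal sketch, with a stand-in function since `dbutils` only exists inside a Databricks runtime:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def run_notebook(path: str, timeout: int = 600) -> str:
    # Stand-in for dbutils.notebook.run(path, timeout);
    # inside Databricks you would call the real API here.
    return f"done: {path}"

notebooks = ["/Repos/demo/etl_bronze", "/Repos/demo/etl_silver", "/Repos/demo/etl_gold"]

results = {}
with ThreadPoolExecutor(max_workers=3) as pool:
    # Submit all notebooks at once, then collect results as each finishes
    futures = {pool.submit(run_notebook, nb): nb for nb in notebooks}
    for fut in as_completed(futures):
        results[futures[fut]] = fut.result()  # re-raises if the notebook failed

print(results)
```

Notebook paths and the worker count are illustrative; in practice `max_workers` is bounded by what the cluster can run concurrently.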
How to send Python logs to Applications Insights (Azure Monitor)
3.5K views · 8 months ago
Using Change Data Feed and Structured Streaming in Fabric [PySpark]
467 views · 9 months ago
How to do Slow Changing Dimension in Delta Tables [Python]
238 views · 10 months ago
Perfect!!! I could not thank you enough. You solved a problem that ChatGPT could not solve even after I broke my head over it for hours. So helpful.
thank you!
A bit confusing. I think it would be better to prepare more in advance so that concentration is not broken while following the presentation.
Thank you so much
Thanks
I am still unable to get to the account console. I don't have access to Entra ID and I am using a free subscription. Is there any other way to get into the account console?
Only via an admin account in Azure. If you are using the free tier, use the global admin account.
I don't see the Feature Enablement tab in my Azure Databricks settings.
Are you an admin for the tenant?
Make a new ID by copying that principal ID and use "forgot password"; you'll get the OTP on Gmail only, then move forward.
Comment only ?
thanks for the help man
Welcome 🙏🏻
I am really glad to have landed on your channel. You are the only one on YouTube who has explained the practicals on asset bundles so nicely. Really, thanks for that!!! I also found many other videos on the channel about topics that no one really talks about. I am really thankful that you make such videos and help us understand the ins and outs of those topics.

Just one request: the organization of the videos could really be improved. It took me more than 15 minutes to realize that there is just one playlist, which has videos on Databricks, DevOps, PySpark, and some Python and Docker. I believe there is a good chance of cleaning up and organizing the videos under different tags and playlists.

I barely ever comment on a channel/video, but I am doing it on yours because I found your videos to be pure and fresh and made with love. I would like to help you if you feel I could; for example, I could arrange and organize your channel videos first of all, haha!!! If you feel I could help in any manner, I could put my email here. (Going to comment on multiple videos of yours to come under your view.)
Hi, thank you very much for the feedback, really glad you liked it. You can email me at pedrocj@gmail.com. I want to work harder on the channel in '25; all suggestions are welcome. 🙏🏻
Thanks for this!
Welcome 🙏🏻
Hi Pedro, thank you for your videos. My question is: how can I connect a function like this as an activity in Azure Data Factory? Best
Hi, I have a video for this: ruclips.net/video/6sXL8Ly2EeQ/видео.htmlsi=je49qSABSNWVGWUs
background music is loud
Thanks
Hi, I tried on Windows; I am not able to install the CLI using winget.
Look for the instructions on installing it on Windows.
Unfortunately, when we execute: `streamTestDf.writeStream.option("checkpointLocation", checkpointPath).foreachBatch(parseAvroDataWithSchemaId).queryName("clickStreamTestFromConfluent").start()` we get the error `AttributeError: 'DataStreamReader' object has no attribute 'writeStream'`
Make sure there is no typo. Also check the original code. From the video.
@@pytalista I did; the code is the same, same error.
Same code should work.
How do I continue to sync the data?
This example just copies the whole table.
Thank you!!! Great video!
Thanks 🙏🏻
Hi Tutor, what password do we have to give there while signing in?
Use the password of your admin account, as instructed in the video walkthrough.
@pytalista Thanks mate 👍
perfect. I was able to fix mine using this video
Glad it helped
Very little about VS Code.
A basics video usually covers very little of it.
@@pytalista Then change the title (i.e., drop VS Code).
This is why it is “the basics” not “the complete guide”
Thanks for the great content.
Thanks 🙏🏻
Excellent video Pedro! Congrats man.
Thanks 🤩
Excellent video Pedro, congrats!
Thanks for watching 😊
Thank you for sharing
🙏🏻
Thank you very much. This was really helpful.
Thanks ☺️
Thank you for the video ❤
Welcome 🙏🏻
Summary:
- He shows how logging works 'normally' in Python:
  - He creates a demo Azure Notebook.
  - He goes to the Azure documentation.
  - Import the Python logging package: `import logging`
  - Initialize the logger: `logger = logging.getLogger(__name__)`
  - Set the log level: `logger.setLevel(logging.INFO)`
  - Create the stream handler: `stream_handler = logging.StreamHandler()`
  - Set up the log formatter.
  - Add the stream handler to the logger: `logger.addHandler(stream_handler)`
  - He does an example log message, `logger.info('logging INFO')`, and shows that it shows up in STDOUT in the notebook.
- He sets up logging to Azure:
  - He goes to Azure and creates an Application Insights resource.
  - He creates a variable in the notebook named `APPLICATIONINSIGHTS_CONNECTION_STRING` and gets the value from the 'Connection String' on the Application Insights resource 'Overview' page.
  - He adds a new Azure handler to his logger:
    - He does `pip install opencensus-ext-azure`.
    - He imports the Azure log handler library: `from opencensus.ext.azure.log_exporter import AzureLogHandler`
    - He adds the handler to the existing logger: `logger.addHandler(AzureLogHandler(connection_string=APPLICATION_INSIGHTS_CONNECTION_STRING))`
  - He then finds the log messages in Azure by going to the Application Insights resource --> Logs and running a query for just the string 'traces'.
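The steps in that summary can be condensed into one snippet. The connection string below is a placeholder, and the Azure handler setup is guarded so the code still runs if `opencensus-ext-azure` is not installed:

```python
import logging

# Placeholder; in the video this is copied from the Application Insights
# resource 'Overview' page.
APPLICATIONINSIGHTS_CONNECTION_STRING = (
    "InstrumentationKey=00000000-0000-0000-0000-000000000000"
)

logger = logging.getLogger(__name__)
logger.setLevel(logging.INFO)

# Plain stream handler: log records show up in the notebook's STDOUT
stream_handler = logging.StreamHandler()
stream_handler.setFormatter(
    logging.Formatter("%(asctime)s %(levelname)s %(message)s")
)
logger.addHandler(stream_handler)

# Azure handler: requires `pip install opencensus-ext-azure`; guarded here
# so the snippet still runs without the package (or without network access).
try:
    from opencensus.ext.azure.log_exporter import AzureLogHandler
    logger.addHandler(
        AzureLogHandler(connection_string=APPLICATIONINSIGHTS_CONNECTION_STRING)
    )
except Exception:
    pass

logger.info("logging INFO")
```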
Thank you! At 16:48 you mentioned that you would make a video on creating a CI/CD pipeline to deploy to prod env. Is there an expected timeline for when it will be?
No. But basically, to deploy to prod, just switch the mode to production.
Make a video using Azurite to add a watermark on a downloaded image.
Interesting 🤔
This is the best Azure Function tutorial on RUclips. The music is amazing !!!!
Thank you! Glad you liked the music! Lots of people hate it 😀
thank you so much
You are so welcome!
Thanks a lot for the video
Thanks mate !
Hey there Pytalista sir, I am trying to run this code using VS Code and Python, but it is not working for me. I tried everything. I am facing this issue: `py exited with code 1 (0x1). AttributeError: '_SixMetaPathImporter' object has no attribute '_path'` (repeated three times), and my path is like this: `blobtrigger/test/{name}`. Please help me.
Hi, make sure you follow all prerequisites. Did you install everything you need in your environment?
@pytalista all prerequisites installed
thanks bro
Welcome 🙏🏻
👍👍👍👍👍👍👍👍👍👍👍👍👍👍👍👍👍👍👍👍👍👍👍👍👍👍👍👍👍👍👍👍
What about connection strings for the data lake and semantic models? Is it possible?
This is only for notebooks. I don't think it is possible on those scenarios
super
Super thanks
keep going
thanks !
You are the best teacher ever
thanks appreciated 👍👍👍👍👍👍👍👍
I can use this method trigger queue but inside an azure container apps?
This is a good question. I have never done it. I would guess not; I think the library wouldn't work outside a serverless context. You can try.
Great video . Please make some real time scenarios on Databricks and ADF 😊.
I did a Databricks video; look in the channel. Not ADF. You do not need ADF; Databricks can read the Kafka topic directly.
Any ways to run only particular job failed notebook workflow?
I would think that in this case, no.
When people say "simple" many times during a demonstration, you know it's exactly the opposite.
😀
let me know if you are stuck and where you did not understand.
Thank you! This helped me move on to the actual coding :)
Glad I could help!
Thanks for these videos. If we're using storage accounts like ADLS, how can we use these variables?
Thanks. In this case, variables are meant to be applied across your bundle files.
Thank you very much. It was really helpful!
You're welcome!
Do you have an example using telemetry?
No sorry. I will think about it.
In my local setup it is working, but when I deploy it is not working.
Check if there are any environment variables in your code. Look at the logs to debug.