105. Databricks | Pyspark | Pyspark Development: Spark/Databricks Interview Question Series - V
- Published: 7 Sep 2024
- Azure Databricks Learning: Pyspark Development: Spark/Databricks Interview Question Series - V
=================================================================================
Are you learning Spark/Databricks to become a Big Data Engineer? Are you preparing for an interview for the role of Spark/Databricks data engineer?
PySpark development is a core area of any Spark/Databricks project, and you can expect many questions on this topic.
Follow this video for a list of questions on PySpark development concepts, along with directions on how to answer them in an interview.
#SparkInterviewQuestions, #DatabricksInterviewQuestion, #SparkInterviewSeries, #DatabricksInterviewSeries, #SparkDevelopment, #SparkLakeHouse, #DatabricksDevelopment, #DatabricksInternals, #DatabricksPyspark, #PysparkTips, #DatabricksRealtime, #PysparkPerformanceOptimization, #DatabricksTutorial, #AzureDatabricks, #Databricks, #Databricksforbeginners
Your content is really awesome; the way you explained it really impressed me. Thanks, Raja, for your efforts.
Hi Ashok, thanks for your kind words. Glad to know it helps data engineers.
Good Bro
Thanks!
CI/CD with Git and Azure DevOps also, please.
Sure, will create a video on this requirement
Can you create a video on CI/CD in Databricks?
Sure, will create a video on this requirement
Kindly make a video on push and pull notebook requests from a Git account, referring to a real project. 👍👍
Noted. Will create a video on this concept. Thanks for the suggestion!
Your content is very helpful for me; I gained more knowledge here than from the institutes where I took courses.
Your channel is more than enough to gain knowledge of PySpark and Databricks.
Keep up the good work. ❤
Glad to hear that! Thanks for your comment
If we have more than one notebook, say three, and I want to run them one by one, how do I do it? And how do I add or connect a notebook to an ongoing data pipeline?
We can use the %run command multiple times, but for this requirement it's better to go with a Workflow. Using Workflows, we can create an orchestration schedule with serial or parallel executions, etc.
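As a quick sketch of the serial-execution option above: inside a Databricks driver notebook you can call `dbutils.notebook.run(path, timeout_seconds, arguments)` once per child notebook, in order. The notebook paths below are hypothetical, and `dbutils` is only available inside a Databricks notebook session, so the actual call is shown commented out; the small helper just captures the "run one by one, stop on failure" idea.

```python
# Hypothetical child notebooks to run one after another (serial order).
notebook_paths = [
    "/Workspace/etl/notebook1",
    "/Workspace/etl/notebook2",
    "/Workspace/etl/notebook3",
]

def run_serially(paths, runner):
    """Run each notebook in list order via the given runner callable.

    If any runner call raises, execution stops there, so a failed
    notebook blocks the ones after it - the usual serial-pipeline behavior.
    """
    results = []
    for path in paths:
        results.append(runner(path))
    return results

# Inside Databricks, the runner would be the real Databricks Utilities call:
# results = run_serially(
#     notebook_paths,
#     lambda p: dbutils.notebook.run(p, 3600, {}),  # 3600 s timeout per notebook
# )
```

For production pipelines, a Databricks Workflow (Job) with one task per notebook and task dependencies is the more robust choice, since it gives retries, scheduling, and parallel branches without custom driver code.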
Can you please also create AWS videos, based on AWS data engineering?
Sure, will start AWS as well once most of the topics in Azure data engineering are completed.
Will you teach Azure online?