No words, just awesome. Please cover more such concepts... we are with you!
If you like my content, please make sure to share it with your network over LinkedIn 👍
Highly underrated series. Keep up the good work!
Thank you so much for your lovely comment! ❤️ I hope my playlist made it easier for you to learn PySpark.
To help me grow, please make sure to share with your network over LinkedIn 👍
Crisp and clear 👌
Thanks ❤️ Please make sure to share with your network over LinkedIn
really good sessions
Glad you like them! Please make sure to share with your network on LinkedIn ❤️
Learned lots of new things today 👍
AWESOME !
Hi Shubham,
Great content, I am following your series in the Databricks environment. When we read a file, it generates a job to get the metadata. When I check the execution metrics in the Databricks UI, it does not show input size/records, but in your Docker container it does. Where can we check that info in Databricks?
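(For anyone wondering why the read shows up as a job at all, here is a minimal sketch of that behaviour; the file path and options are just placeholders, not from the video. With inferSchema=True, Spark has to scan the file up front to work out the column types, and that scan is the job you see in the Spark UI.)

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("metadata-job-demo").getOrCreate()

# This read alone can appear as a job in the Spark UI, because Spark reads
# (part of) the file to pick up the header and infer the schema.
df = spark.read.csv("data/sales.csv", header=True, inferSchema=True)

df.printSchema()  # the inferred metadata: column names and data types
```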
3:00 What do you mean by identifying the metadata? What's the use of it in this context?
Metadata means the information about the column names and their data types.
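(A small sketch of what that metadata looks like; the column names and types here are made up for illustration. If you supply the schema yourself, Spark does not need to scan the file to infer it.)

```python
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.getOrCreate()

# Declaring the schema up front avoids the inference scan over the file.
schema = StructType([
    StructField("product", StringType()),
    StructField("price", DoubleType()),
])

df = spark.read.csv("data/sales.csv", header=True, schema=schema)
print(df.dtypes)  # [('product', 'string'), ('price', 'double')]
```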
Can you make some videos about PySpark interview questions?
Sure, will definitely create some on it. Make sure to share this with your network.
PySpark Interview Series for the top companies
ruclips.net/p/PLqGLh1jt697zXpQy8WyyDr194qoCLNg_0&si=m82ejHBVkhSLWFET
@yo_793 thank you
Please make a video on how to write production-grade code, unit testing, etc. These things are not available on YouTube... can you please make it?
Will surely make a video on that. Thanks for following ❤️
PySpark Interview Series of Top Companies
ruclips.net/p/PLqGLh1jt697zXpQy8WyyDr194qoCLNg_0&si=m82ejHBVkhSLWFET
Can you please do it in Databricks?
Hello,
You can lift and shift the same code into Databricks and it will work. The only difference: you don't need to create a Spark session in a Databricks notebook, it creates one for you.
Hope this helps.
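(A minimal sketch of that difference; the app name is just an example. Locally or in Docker you build the session yourself, while a Databricks notebook already gives you a session named spark, so the rest of the code stays the same.)

```python
from pyspark.sql import SparkSession

# Local / Docker setup: create the session yourself.
spark = (
    SparkSession.builder
    .appName("pyspark-series")
    .master("local[*]")
    .getOrCreate()
)

# In a Databricks notebook, `spark` already exists, so you would skip the
# builder above and use the session directly:
df = spark.range(5)
df.show()
```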
PySpark Interview Series for the Top Companies
ruclips.net/p/PLqGLh1jt697zXpQy8WyyDr194qoCLNg_0&si=m82ejHBVkhSLWFET
Such a long gap 😅
The series has now resumed. New videos are being published every 3 days. Thanks for following ❤️