Very interesting approach to introduce DLT by putting it in the context of a scale-out architecture. Thanks!
Great explanation of DLT. I particularly liked the feature call out at the end.
Thanks, Bryan, for the video.
I've been using Delta Live Tables and pipelines for a while and haven't found them as useful, TBH. The main disadvantage is that they are proprietary, which brings a lot of limitations:
- you cannot set up the environment in which your code runs, which is the biggest pain
- you can't properly unit test the code you send to Databricks
- you don't have proper access to, or understanding of, what is running behind the scenes when Databricks sets up the pipeline and tables
I know you can come up with hacky solutions to deal with these downsides, but in the end I found it easier and more transparent to code up manually the things you mentioned as pros (managing checkpoints, triggering, great expectations).
OK. Bear in mind, Databricks is investing a lot in DLT, and the current limitations are likely short term. Granular control means more work. In the long run, not moving to DLT may mean you lose out on future new features. But yes, there are some limitations. Thanks for your comments!
Completely agree, the development experience on Delta Live Tables is not as intuitive.
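[Editor's note] For readers weighing the trade-off discussed above: the pieces the commenter lists as things to hand-code (checkpoints, triggering, expectations) are exactly what DLT declares for you. A minimal sketch of a DLT streaming table with a data-quality expectation, in Databricks DLT SQL syntax; the table and column names (clean_orders, raw_orders, order_id) are made up for illustration:

```sql
-- Hypothetical DLT table: checkpointing and triggering are managed by the pipeline,
-- and the expectation replaces a hand-rolled data-quality check.
CREATE OR REFRESH STREAMING LIVE TABLE clean_orders (
  -- Drop offending rows instead of failing the whole pipeline.
  CONSTRAINT valid_order_id EXPECT (order_id IS NOT NULL) ON VIOLATION DROP ROW
)
AS SELECT * FROM STREAM(LIVE.raw_orders);
```

The declarative form is shorter, but as the commenter notes, it also hides the environment and execution details that manual Structured Streaming code exposes.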
I love how you ask questions to yourself with a funny voice and then answer them
Thanks Bryan, as always crystal-clear explanation.
Hi Bryan, thanks for all the hard work you are putting in to make these videos. Can you please upload a video about Unity Catalog, which is a new evolution on Databricks? Thanks
It's on my list but could get a bit costly since I have to pay for the Azure services out of pocket and Unity Catalog is a multi workspace solution.
@@BryanCafferky Thanks for replying, Bryan. We are currently performing a POC at our client location on Unity Catalog + Lakehouse.
Hopefully, one day, you'll get to talk about the move away from the jvm in your timeline.
Hi Bryan, I want to do DML operations on Delta tables. Is there any benefit to using Delta Live Tables instead of plain Delta tables?
Note that this DML happens in batches, not streaming.
I'm inclined not to use Delta Live because I don't have any inter-dependent tasks between tables like a pipeline; I'm just running a set of DML statements. On the other hand, I'm tempted by Delta Live because of its auto maintenance.
What are your views on this? Please help me out!
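[Editor's note] For context on the question above: plain Delta tables support batch DML directly, with no DLT pipeline required. A minimal sketch of a batch upsert using standard Delta Lake MERGE syntax; the table and column names (target, updates, id, amount) are placeholders:

```sql
-- Hypothetical batch upsert into a plain Delta table.
MERGE INTO target t
USING updates u
  ON t.id = u.id
WHEN MATCHED THEN
  UPDATE SET t.amount = u.amount
WHEN NOT MATCHED THEN
  INSERT (id, amount) VALUES (u.id, u.amount);
```

If the workload is just independent batch DML like this, the main thing DLT would add is the managed orchestration and maintenance, not DML capability.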
Superb Explanation .... became your fan 😀
superb explanation
I love your videos. Keep up the good work! 😊
Need a tutorial on Terraform for Databricks + AWS multiple workspaces and Unity Catalog.
Bryan - I'm here just for the way you pose the question with that face you make :-D Love it! Always makes me imagine a class of little dwarfs sitting looking up to you and throwing those questions at you :-D Keep going ma man! God bless you!
Thank you Bryan
Great content
So Live tables are smart autonomous tables ;)
Subject to the magic of DLT.
Oh, so now everything is Databricks, like the era of Cloudera for Hadoop. I'm sure it won't create any issues like it happened for Cloudera.
The only difference is success. SQL Server has been successful for 40+ years. Databricks created Apache Spark and are the largest open source contributor. I think they will be just fine. Of course, when to use any platform is dependent upon your requirements.
@@BryanCafferky I do have a problem with this, though, especially being so tied to a framework whose vendor can raise the cost of using it whenever they want.
@ A good alternative is dbt. You can do so many of the same things, and there is an open source version.
Tell me nothing by saying a lot