If you need data infra or data strategy help, then feel free to set-up a free consultation! - calendly.com/seattledataguy/30-minute-meeting-requirements-yt
Love this. I'm a Solutions Architect and the information you give out is priceless and accurate. Well done!
Spark (and therefore Databricks) is really a game changer
Snowflake is for BI and more traditional analytics: it excels at data warehousing, storage, and analytics.
Best fit: data warehouse engineers and data analysts.
Databricks is for big data processing (machine learning and AI workloads).
Best fit: data engineers and data scientists.
Ty!
Thanks for the video. Also thanks for the relevant title, rather than clickbait or one tailored for youtube's algos. I found you because Codestrap mentioned you (youtube's algos are useless for finding people who know what they're talking about).
our boy has been hitting the gym.
loved the philosophical background on them at the beginning.
Just a little
I must be watching too many of your videos lately - searching for databricks landed me almost straight here
That's pretty impressive considering I just uploaded this 👀
@@SeattleDataGuy you were top on my recommendations too; the youtube algo knows. 🏃
Thanks for your support!
Micropartitions are just another name for... partitions, aka shards, something all databases use to scale out (because that's the only way to do it). There have been dozens of databases doing this for data warehouse/OLAP use cases for decades. Snowflake (like BigQuery) was more powerful because of cloud-native scaling rather than provisioning real hardware, not just because of separating storage and compute. The other thing was the usability that came from being in the cloud and running on object stores, with features like zero-copy clones, sharing datasets across companies, etc.
Snowflake provides credit usage per second for each T-shirt size, with no separate VM costs to run the service. Databricks has its DBU cost per second for its service, plus there is an underlying cloud VM cost to run Databricks. This could be a plus or a minus, because Databricks allows you to select the type of VM you want: compute-optimized vs. memory-optimized.
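To make the two billing models concrete, here is a minimal back-of-the-envelope sketch. The Medium-warehouse credit rate follows Snowflake's published ladder (XS = 1 credit/hour, doubling per size), but every dollar figure, DBU rate, and node count below is a hypothetical placeholder, not a published price:

```python
# Hypothetical cost comparison of the two billing models; all rates marked
# below are made-up placeholders, not published prices.
HOURS = 2.5  # runtime of the workload

# Snowflake: a fixed credit burn per T-shirt size, no separate VM bill.
snowflake_credits_per_hour = 4     # Medium warehouse (Snowflake's XS=1 ladder)
price_per_credit = 3.00            # hypothetical $/credit; varies by edition/region
snowflake_cost = HOURS * snowflake_credits_per_hour * price_per_credit

# Databricks: DBUs for the service PLUS the cloud provider's VM bill,
# which depends on the VM family you pick (compute- vs memory-optimized).
dbu_per_node_hour = 0.75           # hypothetical DBU rate for the chosen VM type
price_per_dbu = 0.55               # hypothetical $/DBU; varies by tier/workload
vm_price_per_hour = 0.60           # hypothetical cloud VM $/hour
nodes = 8
databricks_cost = HOURS * nodes * (dbu_per_node_hour * price_per_dbu + vm_price_per_hour)

print(f"Snowflake:  ${snowflake_cost:,.2f}")   # 2.5 * 4 * 3.00   = $30.00
print(f"Databricks: ${databricks_cost:,.2f}")  # 2.5 * 8 * 1.0125 = $20.25

# The point is not the totals (the inputs are invented) but the shape:
# Databricks has two meters running, and the VM meter is one you can tune.
```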
Well done! You very succinctly summarized the essence of both platforms.
"Using tools for what they're for", - yes, 100% ❤
loved your explanation 🤟
This is great. Very helpful. Thanks.
thank you!
Thanks for the overview. Our organization went through a very contentious internal evaluation between these two platforms. We ultimately ended up going with Snowflake due to internal politics. Would also love to see a video comparing all four: BigQuery, Redshift, Snowflake, and Databricks.
Which would you have gone with?
What would you have gone with without the politics?
@@gj4king1 probably Databricks.
@@jamaswin88 probably Databricks.
That'd be an intense video!
Big like from India... nice presentation.
Coming from an SSIS background, I figured I'd end up landing in Snowflake when I got to working with cloud tech, but I've really, really taken to Spark and Databricks, even if writing pipeline code in notebooks scares me sometimes.
SSIS is more aligned with what Azure Data Factory does and ADF is tightly integrated with Azure Databricks.
Nice video detailing two very popular data products, thanks !
Personally a big fan of notebook dev, but as you explain both products are solid and can serve customers well.
Awesome video and comparison! It was missing some details on their features in terms of functionality, e.g. Delta Lake has time travel, while Snowflake has some pretty advanced SQL functionality that one might use for GDPR hashing. I would love to see some content on these "implementation" details. Also, BigQuery vs Snowflake would be very interesting, I guess.
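Since time travel came up, here is a minimal sketch of what it looks like on the Delta Lake side; the table path is a hypothetical placeholder:

```python
# Minimal Delta Lake time travel sketch; the table path is hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # already provided on Databricks

path = "/mnt/datalake/events"  # hypothetical Delta table location

# Read the table as of an earlier version number...
v0 = spark.read.format("delta").option("versionAsOf", 0).load(path)

# ...or as of a point in time.
snapshot = (
    spark.read.format("delta")
    .option("timestampAsOf", "2024-01-01")
    .load(path)
)
snapshot.show()
```

(Snowflake offers its own Time Travel in SQL as well, as another commenter notes further down, so this is less of a differentiator than it may seem.)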
I primarily do ETL activities for a product-based org and would like to know when to use Databricks vs Snowflake, or rather how Databricks/Snowflake can help ETL engineers who migrate data from one product to another.
I couldn't understand this video properly because I was expecting a one-line answer on when one should use Databricks/Snowflake.
I feel Streamlit has helped increase the depth and specialty application possibilities for Snowflake while keeping the streamlined usability. Do you feel this is true?
Been loving Snowpark so far. It's like a wrapper of Spark, but by Snowflake.
Any cool use cases that you have implemented? I haven't got to really play as deep in snowpark as I would like.
@@SeattleDataGuy Yeah, I love the new UDTF functions in Snowpark, where you can call Python-written UDTFs in SnowSQL with your .py files or other packages included. It's more like macros in dbt. I was able to join multiple tables on different Snowflake servers using Snowpark and push the result to a prod warehouse. I'm thinking of implementing complex UDFs like I did with pandas; hopefully that works.
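For anyone curious what that looks like, here is a minimal sketch of registering and calling a Python UDTF with Snowpark; the connection parameters and the SplitWords example are hypothetical:

```python
# Minimal Snowpark Python UDTF sketch; the credentials and the UDTF itself
# are hypothetical examples.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import lit, udtf
from snowflake.snowpark.types import StringType, StructField, StructType

connection_parameters = {
    "account": "<account>",  # hypothetical placeholders
    "user": "<user>",
    "password": "<password>",
}
session = Session.builder.configs(connection_parameters).create()

# A UDTF is a handler class whose `process` method yields output rows.
class SplitWords:
    def process(self, text: str):
        for word in text.split():
            yield (word,)

split_words = udtf(
    SplitWords,
    output_schema=StructType([StructField("word", StringType())]),
    input_types=[StringType()],
)

# Call it like a table function from the DataFrame API (or from SQL by name).
session.table_function(split_words(lit("snowpark udtfs feel like dbt macros"))).show()
```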
It's not Spark underneath.
Thank you for this video. Looking at options for my team and this is really useful!
Databricks is clearly the winner here. Snowflake offers simplicity, but at the cost of customization. If you have an app and want to optimize the cost of usage by selecting specific hardware for processing big data, you can't do it in Snowflake, as you can't touch the back end. I am certified in both, and by now there isn't even a realistic comparison. Snowflake plays catch-up to Spark. That's not even mentioning the Python-native API you can use with Databricks to do anything, vs. the SQL-like syntax you need to use with Snowflake.
What do you think about Palantir?
Indeed, Databricks is giving the big cloud guys a run for their money at the moment.
You make, however, a great argument in favor of Snowflake in general. Simplicity and hassle-free data warehousing are hugely attractive to companies struggling to attract and retain data engineering talent. Just pull in tables with Stitch, Fivetran, or Airbyte and let a group of SQL monkeys dbt the data into shape. Pay the higher cloud bill, because the SQL monkeys hired obviously can't optimize queries, but it doesn't matter for the most part; it's cheaper than hiring experienced data engineers who can turn all the knobs of your DWH to optimize it.
@@alfredsfutterkiste7534 you can do exactly that in Databricks too, though. It is actually more flexible for talent diversity, as it supports all of the most popular languages used by analysts, including SQL. It has the simplicity that Snowflake offers, but also far more depth and customization.
@@alfredsfutterkiste7534 well, I disagree with the statement that hassle-free is better if you get limited fine-tuning opportunity. Yes, if you work with 2 GB of data you don't care, but as soon as you start working with TBs of data that have to feed into an app a user is frequently using, then you feel the limitations. In my humble opinion, you can have simple (aka default) settings but still leave the option for fine-tuning.
You need both Databricks and Snowflake for any enterprise application. Databricks ingests data from various sources, processes it, and stores it in a database such as Snowflake or Postgres. Snowflake is mainly used for OLAP applications. We don't store serving data in a Delta table in Databricks; it takes more time to return data from a Delta table due to the extra processing steps involved in Spark (jobs, stages, tasks, partitions, and cores). Those kinds of processing steps are not involved in Snowflake, and Snowflake returns the data fastest.
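A rough illustration of that pattern (Databricks processes, Snowflake serves): a minimal sketch of a Spark job landing its output in Snowflake via the Spark connector. All paths, credentials, and table names here are hypothetical placeholders:

```python
# Minimal sketch: process data in Databricks, land the result in Snowflake.
# All connection values, paths, and table names below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided for you on Databricks

# 1) Ingest raw data from the lake and do the heavy processing in Spark.
raw = spark.read.json("/mnt/raw/orders")  # hypothetical source path
daily = raw.groupBy("order_date").agg(F.sum("amount").alias("revenue"))

# 2) Store the serving copy in Snowflake for OLAP/BI queries.
sf_options = {
    "sfUrl": "<account>.snowflakecomputing.com",
    "sfUser": "<user>",
    "sfPassword": "<password>",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "LOAD_WH",
}
(daily.write.format("snowflake")
    .options(**sf_options)
    .option("dbtable", "DAILY_REVENUE")
    .mode("overwrite")
    .save())
```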
have you tried Databricks SQL warehouse?
You missed core differences. Databricks has a strong, flexible offering that includes ML, geospatial, etc., plus serverless and ephemeral clusters. Snowflake does hosted data warehousing well; that's what makes it good. If you don't want to mess with managing complexity, use Snowflake (with Databricks); if not, use Databricks.
Very informative.
Curious, which Netflix show's clip did you put in the scaling section?
I got into a LinkedIn thread dispute where a Snowflake executive replied, insisting that Snowflake is not proprietary and that, from his viewpoint, whoever calls it that is spreading a canard.
Excellent video, thanks
thank you!
In terms of data consulting, do you find more clients in need of Snowflake or Databricks expertise? Is there any correlation with the size / price point of the project?
Hi SeattleDataGuy, which product has the best support, from the vendor, partner, and community perspectives? Thanks!
Can you make any comparisons or parallels with Palantir?? Not sure how Palantir competes with these companies
Been working to get access to palantir
I have experience with all 3. Palantir is Databricks but with a UI and much more of a walled garden. Snowflake is a good warehouse but starts to fail at massive scale. You need Iceberg or Delta with Trino or Spark to handle many trillions of rows.
@Thomas Adams I would agree with the Palantir-to-Databricks similarity. It's the fact that they treat everything like files 😆
@@thomasadams6860 what do you mean walled garden?
@@deemahdee Palantir wants you to stay strictly in their ecosystem for everything and makes it very difficult to use other tools easily.
I suggest that for topics such as this, which are heavy on content, it is better to go a little slower.
Can you also make a video on the differences between Databricks, Snowflake, and Solix Technologies?
I am using a pivot on data in Snowflake, and it's causing the values in the rows to show up as strings. How do I reference these columns? How can I remove the quotes after the pivot?
One thing that I have not understood so far is whether Databricks can be perceived as a data virtualization tool (since they are not promoting it as such). Yet to my understanding, Databricks is a perfect example of data virtualization. Am I wrong here?
What does Databricks do differently that GCP, Azure, or AWS can't do?
Good analysis, but I want to add, as we have been heavy Snowflake users: features like replication, Time Travel, and Data Shares add a lot of value to our ecosystem.
Data sharing is honestly great! I have seen several teams save so much development time because of that feature!
this is amazing thank you
Waking up to another nugget of gold here
Glad you enjoyed it!
For data engineer beginner, aws is better or azure?
They are the same, but there are more AWS jobs
AWS has some fantastic free learning resources on EdX - I'd start with the cloud practitioner course if you know nothing.
Microsoft runs the enterprise, AWS runs the internet. Now it's your choice, depending on your career vision.
Azure data factory is very easy to learn and use
Aws
Nicely explained.
Glad it was helpful!
So what's the verdict: Snowflake or Databricks?
Hello SDG, would you suggest IBM Data Engineering followed by IBM Data Warehousing, with GCP last, on Coursera, or the DataCamp Data Engineering track? Time isn't a problem. Yes, I'm switching careers; I'm already learning Python and SQL.
Snowflake also has clustered warehouses... not more work... and also, the bigger the warehouse, the faster the performance, and therefore the same cost, just data retrieved faster... you need to catch up.
Hi there, thank you so much for this great video. Can you please tell me how Databricks/Snowflake could help me as a Power BI/Tableau developer, and why we should consider them instead of a simpler ETL tool like Power Query, for example? Thanks again.
Great video.
Thank you for the great video.
So if I have Databricks with Delta Lake, do I need a separate data warehouse?
I mean, will I be missing anything by not having a DW?
I'm thinking about having my team get started with Databricks, but I'm concerned about the cost. I think we get 3,000 a month to spend, but after that we start getting billed. Does anyone know how quickly that bill starts to add up? We're not doing any kind of data streaming; we'd mostly be using it to run jobs once a day. Some of the queries can take anywhere from 5 to 10 minutes to run, and then they output to various tables. I just don't want my department getting hit with a 30,000 dollar bill or something at month end.
Have you talked to a Databricks account executive? When you talk to them, they should give you some perspective on controlling costs.
Also, there are things you can do, such as auto-termination (similar to Snowflake), which will shut down your clusters when they're idle. There are lots of other things you can do to avoid costs as well, such as putting limits in place. Personally, I know a lot more about Snowflake cost management (which, based on what I have looked up, is far easier). But overall, they both want you to spend as much as possible, though they don't want you to leave. So when you talk to an AE, make sure they give you some documentation on that (otherwise you have a terrible AE).
docs.databricks.com/clusters/clusters-manage.html
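As a concrete example of the auto-termination knob mentioned above, here is a minimal sketch of creating a cluster through the Databricks REST API; the workspace URL, token, runtime version, and node type are hypothetical placeholders:

```python
# Minimal sketch: create a Databricks cluster with auto-termination via the
# REST API. Workspace URL, token, runtime, and node type are hypothetical.
import requests

WORKSPACE = "https://<your-workspace>.cloud.databricks.com"  # hypothetical
TOKEN = "<personal-access-token>"                            # hypothetical

cluster_spec = {
    "cluster_name": "nightly-etl",
    "spark_version": "13.3.x-scala2.12",  # pick a current LTS runtime
    "node_type_id": "i3.xlarge",          # compute- vs memory-optimized choice
    "num_workers": 2,
    "autotermination_minutes": 20,        # shut down after 20 idle minutes
}

resp = requests.post(
    f"{WORKSPACE}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=cluster_spec,
)
resp.raise_for_status()
print(resp.json()["cluster_id"])
```

For a once-a-day batch like Joe describes, a job cluster that spins up for the run and terminates right after is usually the cheaper pattern than a long-lived all-purpose cluster.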
@@SeattleDataGuy hey, thanks. Yeah, I need to set up a meeting with one of the account managers. Hopefully we can implement it without incurring too many costs. It looks like a great product.
@@JoeG2324 Are you going the machine learning route or the data warehouse route? I am curious why the databricks choice vs other options? What was the tipping point or feature that really made it make sense?
@@SeattleDataGuy thanks, we're going the data warehouse route; my team doesn't handle any machine learning. The reason we're going with Databricks is that my company already has a relationship with them, as many other groups in the company use it. They have much larger budgets than my team, so cost is definitely a concern.
@@JoeG2324 Hi Joe, once you schedule your jobs, the Databricks clusters start and stop automatically, and they also autoscale, so if your data volume varies from day to day you won't have to oversize or risk undersizing your clusters. I'm sure you're going to have fun setting up your first use case. If you have any trouble finding the right contacts at Databricks, feel free to PM me and I'll get you connected.
I must disagree with the diminishing returns claim about Snowflake.
From XS to 4XL I have thus far experienced a consistent inverse linear relationship, i.e., increasing the cluster by one T-shirt size cuts compute time by 50%. Occasionally the performance of the bigger cluster was even better, because more RAM means less disk spill for certain jobs, but I never noticed the inverse. This held for small jobs (a couple thousand rows moved between tables) all the way to our currently largest runs (joining around 1B rows against 200B+ rows).
Do much content around Data Security?
Not currently. But I am aiming to dig into a few options in this space.
could we get an updated video on this lol
Like what you do. :)
Thank you!
Both platforms are top-notch, but at the end of the day Databricks takes the cake. And if done correctly, it is MUCH more cost-effective than Snowflake.
@Audrey Delou until you're stuck with Snowflake's proprietary data formats. Without being biased, I'll stick with the open-source route. Snowflake's cost is everyone's main complaint.
Scrap all of this and get Oracle Autonomous Database.
Spark is so 2010... there has to be a better 2024 solution for big data ETL and exploration. Lazy execution is not suited to analytical/exploratory work (Databricks sits on Spark, as does Snowpark).
Hey SDG, have you looked at Cloudera Data Platform (CDP)?
What are your thoughts on it?
CDP is based on Spark and other open-source solutions. We reviewed it but gave up, as there is a lot of management work. It's better to head to Snowflake or Databricks, depending on the current situation of your data: whether it's in the cloud or on-prem.
Snowflake is simply structured data for analytics.
Azure Synapse is a great contender
But it's expensive compared to Snowflake
I felt this was a bit unstructured...
Snowflake is on AWS, Azure, and GCP now.... just FYI
The issue with Snowflake, in my opinion, is that it is too abstract and too oversimplified. I mean, honestly, it's good to work with, and you can do great things really fast.
But then, after the rush of migration, you want to scale and optimize the cost and performance of your workloads, because the platform starts to be used for real. Then you realize that the only way to get better performance is to change the warehouse size.
Of course you can try to tune your queries, you'd say, but the query plan is not very chatty; it's very minimalistic, to say the least.
You cannot tune the engine to fit your needs, as you can with Spark: partition-related tuning, playing with different join strategies, and so on.
You have absolutely no control over things like codecs, compression, or data formats...
And when you ask Snowflake support to learn more about what is going on underneath, you understand that it is not going anywhere.
My point is that you'll have a very limited number of cost-free options for optimizing your jobs and queries.
And that can be OK for some companies, because at the end of the day you pay for what you get, but it can also be very frustrating for others.
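For contrast, a minimal sketch of the kind of Spark-side knobs this comment is referring to; the table names, partition counts, and paths are hypothetical examples, not recommendations:

```python
# A taste of the Spark knobs the comment refers to; table names, paths,
# and partition counts here are hypothetical examples, not recommendations.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.getOrCreate()

# Control the shuffle parallelism for this workload.
spark.conf.set("spark.sql.shuffle.partitions", "400")

facts = spark.table("events")     # hypothetical large table
dims = spark.table("dim_users")   # hypothetical small table

# Force a broadcast (map-side) join instead of a shuffle join.
joined = facts.join(broadcast(dims), "user_id")

# Control the physical file layout on write.
(joined.repartition(200, "event_date")
    .write.mode("overwrite")
    .partitionBy("event_date")
    .parquet("/mnt/curated/events_by_date"))
```

None of these levers have a direct Snowflake equivalent; you trade them away for the managed simplicity.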
PALANTIR?
Just filmed a video, need to edit and get approved
Databricks is light years ahead of Snowflake
Different people have different perspectives on this, and it all depends on use case anyway.
A light year is a distance, not a time, sir
@@manichand1996 cope
The only thing that can challenge Databricks now is Fabric
where is palantir??
I finished my first video with them, waiting for some reviews from them
@@SeattleDataGuy okey thanks buddy
Unfortunate that Snowflake does not support the R language.
Databricks completely defeated Snowflake.
This video is a waste of time
Both are inferior solutions. I don't see either of these companies offering natural language processing applications for no-code operational data analysis, as other, more innovative ML- and AI-native companies have.
Please do share some of the solutions you prefer!
@@SeattleDataGuy I am not as generous as you are 😉
@@kylelarson5074 Well at least you're honest 😆
I would like to connect & talk with you on LinkedIn.