Cloud Data Warehouse Benchmark: Redshift vs Snowflake vs BigQuery | Fivetran

  • Published: 11 Jul 2024
  • Get the slides: www.datacouncil.ai/talks/clou...
    ABOUT THE TALK:
    Benchmarks are all about making choices: what kind of data will I use? How much? What kind of queries will users run? How you make these choices matters a lot: change your assumptions and the fastest warehouse can become the slowest.
    As a data pipeline provider that supports all three warehouses as destinations, Fivetran conducted an independent benchmark that is representative of real-world users. In this talk, we'll dive into our methodology and results and compare them to other similar benchmarks.
    ABOUT THE SPEAKER:
    George Fraser is Co-founder and CEO of Fivetran, a fully managed data pipeline built for analysts. Fivetran is Y Combinator-backed, with over 300 customers relying on it to centralize their data. When Fivetran began in 2012, the founders realized that ETL tools were ill-equipped for modern companies that rely on cloud applications and databases.
    To meet the needs of analysts in this new era, Fivetran built the only zero-maintenance data pipeline on the market and is now part of a growing ecosystem of cloud infrastructure that gives organizations control of their data without heavy engineering.
    ABOUT DATA COUNCIL:
    Data Council (www.datacouncil.ai/) is a community and conference series that provides data professionals with the learning and networking opportunities they need to grow their careers. Make sure to subscribe to our channel for more videos, including DC_THURS, our series of live online interviews with leading data professionals from top open source projects and startups.
    FOLLOW DATA COUNCIL:
    Twitter: / datacouncilai
    LinkedIn: / datacouncil-ai
    Facebook: / datacouncilai
    Eventbrite: www.eventbrite.com/o/data-cou...
  • Science

Comments • 49

  • @Jack-lg9mq · 2 months ago · +1

    Good presentation. Also nice to see that Jimmi Simpson is expanding his horizons.

  • @larpin1 · 5 years ago · +8

    Scan speed is extremely important when the data set is huge and cannot all fit in memory. On a large warehouse, the time spent scanning will usually dwarf the compute time of queries. So I agree that on a tiny 100 GB benchmark, complex queries are more meaningful, but on a larger warehouse, scan speed and re-distribution speed are the differentiators.

  • @SanjayKattimani-tech · 5 years ago · +19

    While most comparisons only focus on speed or cost, you covered a number of parameters in detail. Thanks for sharing.

  • @sau002 · 3 years ago · +6

    Excellent video. I really like the detailed approach to the pricing calculations (20:00 onwards), e.g. BigQuery actually being more expensive than it appears to be.
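    A minimal back-of-envelope sketch in Python of the kind of arithmetic behind that pricing point, assuming BigQuery's on-demand model of roughly $5 per TB scanned (the published rate around the time of the talk; current rates differ) and a made-up interactive BI workload:

      # Hypothetical back-of-envelope for on-demand, scan-priced warehousing.
      PRICE_PER_TB_SCANNED = 5.00  # USD; assumed on-demand rate circa the talk

      def monthly_cost(tb_per_query: float, queries_per_day: int, days: int = 30) -> float:
          """Dollars spent = TB scanned per query * queries * price per TB."""
          return tb_per_query * queries_per_day * days * PRICE_PER_TB_SCANNED

      # Made-up workload: dashboards scanning 0.1 TB per query, 20 analysts
      # running ~50 queries each per day.
      print(f"${monthly_cost(0.1, 20 * 50):,.0f} per month")  # -> $15,000

    At these assumed numbers the per-query cost is only about $0.50, which is why the workload looks cheap until the monthly total is added up.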

  • @jmf10024 · 8 months ago

    Thank you very much for this presentation. It was very well done and I appreciate the explanation of your choices.

  • @Berkov1 · 5 years ago

    Great comparison, thanks!

  • @ericlinden7214 · 5 years ago · +3

    I first used Sybase IQ in 1996. It was a hugely successful implementation. I would say this was the first columnar DB, which stemmed from an MIT group, if I recall.

    • @michaellazar6125 · 4 years ago

      I joined Sybase in 1992 having been a Sybase customer since 1988.

  • @seccat · 6 years ago · +14

    OLTP vs OLAP @2:30 👍

  • @NicatBehbudov · 4 years ago

    Great, great video!

  • @northcraftanalytics · 4 years ago

    I think it's better to frame OLTP vs. OLAP as insert/update/delete architectural optimization vs. query (select) optimization. The select example you gave seems to be more of a difference between operational reports and analytical reports. BUT - good stuff!
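    A minimal Python sketch of the storage-layout intuition behind that framing: a row-oriented (OLTP-style) layout touches one whole record at a time, while a column-oriented (OLAP-style) layout scans one attribute across all records. The tiny in-memory "tables" below are illustrative only.

      # Row-oriented layout: each record stored together (suits point reads/updates).
      rows = [
          {"order_id": 1, "customer": "acme",   "amount": 120.0},
          {"order_id": 2, "customer": "globex", "amount": 75.5},
          {"order_id": 3, "customer": "acme",   "amount": 42.0},
      ]
      # OLTP-style access: update a single order by key, touching one record.
      next(r for r in rows if r["order_id"] == 2)["amount"] = 80.0

      # Column-oriented layout: each attribute stored together (suits scans/aggregates).
      columns = {
          "order_id": [1, 2, 3],
          "customer": ["acme", "globex", "acme"],
          "amount":   [120.0, 80.0, 42.0],
      }
      # OLAP-style access: aggregate one column without reading the others.
      print(sum(columns["amount"]))  # 242.5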

  • @andyandurkar7814 · 3 years ago · +2

    Great comparison & presentation!

  • @StoneZhong · 3 years ago · +1

    How about Databricks? Or using Spark SQL to query data stored in Parquet files, either in HDFS or in S3 via a connector?
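    For the Spark SQL route mentioned above, a minimal PySpark sketch might look like this; the bucket path and table/column names are placeholders, and it assumes a Spark build with the hadoop-aws (S3A) connector and AWS credentials already configured:

      from pyspark.sql import SparkSession

      spark = SparkSession.builder.appName("parquet-query-sketch").getOrCreate()

      # Read Parquet straight from S3 (an "hdfs://..." path works the same way).
      orders = spark.read.parquet("s3a://my-bucket/warehouse/orders/")  # placeholder path

      # Register the DataFrame and run a warehouse-style aggregate with Spark SQL.
      orders.createOrReplaceTempView("orders")
      spark.sql("""
          SELECT customer, SUM(amount) AS total_amount
          FROM orders
          GROUP BY customer
          ORDER BY total_amount DESC
      """).show()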

  • @jpestana · 4 years ago · +1

    Great presentation

  • @sau002 · 3 years ago

    Nicely presented.

  • @smallclips4164 · 3 years ago

    Wonderful video. It should include Azure as well.

  • @nosh3019 · 5 years ago

    nice talk!

  • @gokukanishka · 3 years ago

    Next time you do benchmark testing, please include Teradata as well.

  • @deepallep · 4 years ago

    Another big cloud data warehouse provider is Alibaba Cloud MaxCompute; is that product going to be included?

  • @georgefraser6310 · 5 years ago · +3

    The latest version of our warehouse benchmark is at fivetran.com/blog/warehouse-benchmark

  • @The_Pavanputra · 2 years ago

    Legends are still waiting to receive the presentation by email, a day after registering via the link.

  • @joesoap8125 · 4 years ago · +4

    I don't want to throw a spanner in the works, but... why remove the best-performing aspects of a data warehouse in order to perform a benchmark test? Removing distribution, clustering, and sort/partition keys doesn't, in my opinion, produce a usable test, because you removed the best and most important parts. Data can and should be distributed and redistributed as copies in a warehouse. Re-sorting/restructuring has been used for variable data requirements for decades, and the most effective way is to create multiple copies (which can also be materialized views). Isn't disk space cheap relative to CPU+RAM? And won't a complex data model (with no indexing) cause problems, coupled with filtering and no partitioning or distribution? And why test with a small data set on a platform that is built for very large data sets? (A sketch of these tuning options follows this thread.)

    • @Kentama · 3 years ago · +1

      I'm going to guess that these data warehouses are becoming so broadly available and cheap that they're edging out traditional data storage platforms, and are becoming more frequently used by smaller organizations. So a benchmark like this, while not necessarily helpful for large companies that would fully leverage the capabilities of a cloud storage architecture, is still extremely useful for a larger number of small companies looking to use agile storage services at a competitive price.
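    For readers unfamiliar with the tuning options that thread refers to, here is a rough sketch of the per-warehouse physical table options (distribution, sort, partition, and clustering keys); the table and column names are placeholders and the DDL is only indicative of each engine's syntax:

      # Indicative DDL for the physical tuning options discussed above;
      # table/column names are placeholders.
      TUNED_DDL = {
          "redshift": """
              CREATE TABLE orders (
                  order_id   BIGINT,
                  customer   VARCHAR(64),
                  order_date DATE,
                  amount     DOUBLE PRECISION
              )
              DISTKEY (customer)     -- how rows are distributed across nodes
              SORTKEY (order_date);  -- keep rows sorted to prune block scans
          """,
          "snowflake": """
              ALTER TABLE orders CLUSTER BY (order_date);  -- clustering key over micro-partitions
          """,
          "bigquery": """
              CREATE TABLE dataset.orders (
                  order_id   INT64,
                  customer   STRING,
                  order_date DATE,
                  amount     FLOAT64
              )
              PARTITION BY order_date  -- prune partitions on date filters
              CLUSTER BY customer;     -- co-locate rows sharing a customer
          """,
      }

      for engine, ddl in TUNED_DDL.items():
          print(f"-- {engine}{ddl}")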

  • @sau002 · 3 years ago

    Excellent video. Please include Azure SQL Data Warehouse (Azure Synapse Analytics) as well.

  • @chrisellis3931 · 3 years ago

    Partitioning is a huge part of Snowflake's architectural magic... Isn't it a silly thing to exclude from the benchmark testing??

  • @dekim7979 · 1 year ago

    Is he the same person from the Fivetran ETL company?

  • @richardmei2000 · 5 years ago

    Sybase IQ appeared as a column-store database in the '90s and is still in use today. Yet sadly nobody knows about it. Neither Sybase nor SAP (which acquired Sybase) bothered to market it.

    • @OliverSteadman · 5 years ago

      That's really interesting - how did you first encounter it? I'd never heard of it but will be checking it out.

    • @karimdatoo · 5 years ago

      Richard Mei cc

    • @michaellazar6125 · 4 years ago

      I loved Sybase as a customer and employee. But we could not market our way out of a paper bag. In 1996 Oracle was 3 times our size in revenue, but in new license sales we were nearly even. We were the 6th largest independent software company in the world. Oracle was 6th. Traveling on a plane, I often had the following conversation (I like to talk to people): "...I work for Sybase, a major database company." Other passenger: "I never heard of them, but I don't know anything about technology." Me: "I bet you a dollar you've heard of my competitor Oracle." Them: "Oh, yes I have."

  • @themorbidhero2987 · 5 years ago

    I think BigQuery is better for me.

  • @softwaremom · 5 years ago · +2

    Why no Azure SQL Data Warehouse?

    • @georgewfraser · 5 years ago · +5

      It's in the latest version: fivetran.com/blog/warehouse-benchmark

  • @Yooh_Music · 4 years ago · +1

    The really big problem with BigQuery is data governance. Permissions in BigQuery are horrible; BigQuery only has dataset-level permission granularity. (A sketch of this dataset-level access model follows this thread.)

    • @Chekmate99 · 3 years ago · +1

      Is this still the case or has BigQuery security improved since last year? Thx

    • @Yooh_Music · 3 years ago · +1

      Yes, a lot has changed since last year; there are already ACLs for BigQuery in beta.

    • @Chekmate99 · 3 years ago

      Yuri Soares, thanks, will research this as we are considering BigQuery.

    • @Yooh_Music · 3 years ago

      @Chekmate99 Good! BigQuery integrates well with GCP products, but nowadays the best value-for-money data warehouse is Snowflake. If you want to do some crazy ML stuff, BigQuery could be the way to go; otherwise Snowflake is much better.

    • @Chekmate99 · 3 years ago

      Yuri Soares thanks! We are also looking at Snowflake and Azure solutions. Our situation is similar to this video's case study: we are consolidating data from several key OLTP systems into a warehouse. Years ago we used Cisco's Data Virtualization tool to accomplish something similar, but now we want to leverage the cloud. The biggest challenge we've had in the past was getting the business user community on board to actually use these solutions (get away from Excel spreadsheets, etc.).
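    On the dataset-level permission point in this thread, a minimal sketch with the google-cloud-bigquery Python client, granting a reader on an entire dataset (project, dataset, and email are placeholders, and application default credentials are assumed):

      from google.cloud import bigquery

      client = bigquery.Client(project="my-project")  # placeholder project
      dataset = client.get_dataset("my-project.analytics")  # placeholder dataset

      # Access is a list of entries on the dataset itself; this grant gives the
      # user read access to every table inside it.
      entries = list(dataset.access_entries)
      entries.append(
          bigquery.AccessEntry(
              role="READER",
              entity_type="userByEmail",
              entity_id="analyst@example.com",
          )
      )
      dataset.access_entries = entries
      client.update_dataset(dataset, ["access_entries"])  # persist the new ACL

    The "ACL in beta" mentioned in the reply most likely refers to the table-level access controls BigQuery later layered on top of this dataset-level model.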

  • @kenjia.busybee8192 · 4 years ago

    Good comparison, very informative.
    However, I don't believe this CEO has the in-depth knowledge of each technology to answer questions like the one at @34:27.

  • @m.x. · 5 years ago · +3

    "My instinct is that in general ahhhh..." Really? If you don't know for a fact you should just admit that you don't know. Period.

  • @BlurSpeedyt · 4 years ago · +2

    1 minute 30 seconds to get in 20 "uh"s. You failed the "Um" game.

    • @MC-8 · 1 year ago

      Poor guy is nervous

  • @cjcastellini2903 · 5 years ago · +2

    ahh ... like .. ahhh ... like .... way way way.. ahh ... like ... 28 min. Content 6 min.