Spark Session vs Spark Context | Spark Internals

  • Published: 6 Feb 2019
  • This video is part of the Spark learning series. Spark application, Spark context, and Spark session are some of the least understood concepts among beginners, so in this video we cover the following:
    what is Spark Session
    the need for Spark Session
    how Spark Session is different from Spark Context
    what is Spark Context
    the need for Spark Context
    #apachespark #sparktutorial #bigdata
    #spark #hadoop #hive

Comments • 95

  • @arundhingra4536 · 5 years ago +11

    Very useful video. I have been working with Spark for more than two years now but never really bothered about SparkSession vs SparkContext. For me it's just the entry point and you go from there. But the idea of having multiple SparkSessions with a single underlying SparkContext makes great sense and was an eye-opener. Thanks
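
    A minimal PySpark sketch of that idea (illustrative, not from the video): newSession() gives each user isolated session state on top of one shared SparkContext.

        from pyspark.sql import SparkSession

        # One application, one underlying SparkContext.
        spark_a = SparkSession.builder.appName("shared-app").getOrCreate()

        # A second session: its own temp views, UDFs and SQL conf,
        # but the very same context (and cluster resources).
        spark_b = spark_a.newSession()

        print(spark_a.sparkContext is spark_b.sparkContext)  # True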

  • @The1Rafvas · 5 years ago +7

    I would say that this is by far the best explanation I have found after hours of search on the topic. Congrats!!!

  • @randommoments1263 · 5 years ago +5

    Detailed, Clear and straightforward, all at the same time. Superb..!

  • @swatisneha9393 · 5 years ago +11

    Today I understood the exact meaning of SparkContext and SparkSession.
    Thanks a lot, your video helped!!!

    • @DataSavvy · 5 years ago +1

      Thanks... I am happy that our video is useful... Please provide your feedback on other videos of this channel

  • @anandhusk7794 · 4 years ago

    very clear and simple explanation. Thanks :)

  • @Smoked93POL · 2 years ago

    Short and to the point. I like your explanation.

  • @rakeshchaudhary8255 · 4 months ago

    Still relevant as of today and frequently asked. The practical on Databricks made things crystal clear.

  • @SatishKumar-yz4tn · 3 years ago

    Nicely explained. Thank you!!

  • @saurav0777 · 5 years ago +5

    In older versions of Spark, like Spark 1.6, the entry point of a Spark application was the Spark context (sc). In later versions, like Spark 2.0, the separate contexts were superseded: they were moved one level up in the abstraction and wrapped into SparkSession, which contains the Spark context, SQL context, Hive context, etc.

    • @shayshaswishes373 · 5 years ago

      Can you please post a video on how to add SparkSession.builder to existing code?
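
      A hedged sketch of that migration (app name illustrative): build a SparkSession as the single entry point and reach the legacy SparkContext through it, so existing RDD code keeps working.

          from pyspark.sql import SparkSession

          spark = (SparkSession.builder
                   .appName("existing-app")   # illustrative name
                   .enableHiveSupport()       # optional: covers the old HiveContext uses
                   .getOrCreate())

          sc = spark.sparkContext            # legacy entry point, still reachable
          rdd = sc.parallelize([1, 2, 3])
          print(rdd.count())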

  • @saurabhgarud6690 · 3 years ago

    Very useful stuff thank you so much

  • @Pramodkumar-mn6jd · 2 years ago

    Nice, and it was a very clear explanation, thank you sir 🙏

  • @rahulberry4806 · 3 years ago

    thanks, clearly explained

  • @NiRmAlKuMaRindia · 5 years ago

    Great details

  • @nashaeshire6534 · 2 years ago

    Thx a lot, really clear.

  • @kthiru5168 · 3 years ago

    Nice Explanation.

  • @kneelakanta8137 · 11 months ago

    Very good information. Can you please help in clarifying these doubts:
    1. What is included in the configurations and properties of the different Spark sessions of a Spark context, and what is their effect on the cluster?
    2. What is the purpose of the Spark context, and what exactly is the Spark context responsible for?
    Can you make a video explaining the Spark context in full?

  • @unmeshasreeveni · 4 years ago +1

    The best explanation. Congrats

  • @tirupatiraosambangij607 · 5 years ago +1

    Nice explanation.. Thank you

    • @DataSavvy · 5 years ago

      Thanks for the appreciation :)

  • @meenakshisundaraar7267 · 6 months ago +1

    Neat and clean presentation... 😊

    • @DataSavvy · 6 months ago

      Thanks a lot 😊

  • @jayashankarnallam6945 · 3 years ago +5

    What is the advantage of creating multiple Spark sessions instead of having multiple Spark contexts?

  • @truptiwaghmare3376 · 2 years ago +1

    What if the same table were being updated by two users at a time? Which one would be updated? Let's say we change the datatype of a column, rename it to the same name as the previous column, and store it back to the table again. And by table I mean a global table.

  • @narendrak36 · 5 years ago +2

    This is exactly what I was looking for. Now I know the exact difference between Context and Session. Thank you dude.
    Do you know which is the best certification on Spark for a Spark developer?

  • @alphacharith · 3 years ago

    Thank You

  • @saurabh7337 · 3 years ago +2

    Under which scenarios would it be meaningful to have a separate SparkContext for each user?

  • @kushagra_nigam95 · 3 years ago +1

    Best explanation till date 👍

  • @ravulapallivenkatagurnadha9605

    Nice

  • @priteshpatel4316 · 3 years ago

    Hi Harjeet, thanks for the clear and simple explanations in all your videos. Can you upload a serial-wise PySpark tutorial if you have one? Most of the tutorials around start with creating a Spark dataframe using SparkSession and operations on the dataframe. You could also suggest any tutorial/blog to read on PySpark. Thanks man... your explanations are great

  • @santhoshsandySanthosh · 5 years ago

    What is the tool or software used in this demo for creating sessions? Is it Python-based or Scala?

  • @nandhannandhan8155 · 3 years ago

    Superb

  • @snehakavinkar2240 · 3 years ago +4

    Sorry, but I am a little confused here. What do you mean when you say every Spark context represents one application? When I submit a Spark application, am I not the only user attached to that application? How do multiple users make configuration changes to my Spark application? Don't they have to submit their own copy of the Spark application again with the config they wish to set? Thank you!

    • @DataSavvy · 3 years ago +1

      Imagine you already have a running app on the cluster, and whatever code needs to run arrives at runtime... That is a good use case for multiple Spark sessions... Drop me an email at aforalgo@gmail.com and I will share more content to read on this

    • @N12SR48SLC · 3 years ago

      stackoverflow.com/questions/52410267/how-many-spark-session-to-create#:~:text=4%20Answers&text=No%2C%20you%20don't%20create,in%20the%20same%20spark%20job.
      Why is it saying 1 SparkSession per application then?

  • @karthikram1954 · 2 years ago

    Great video sir. Just one question: on which node do the Spark context and Spark session run?

  • @sudheeryarramaneni2218 · 5 years ago +2

    I have a doubt: can we apply actions directly on an RDD without transformations?

    • @DataSavvy · 5 years ago +2

      Loading a file and creating an RDD is also a transformation... So logically you cannot run an action without a transformation... If you don't count creating an RDD as a transformation, then you can say that you can run an action without a transformation
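
      A tiny PySpark illustration of that point (illustrative, not from the video):

          from pyspark.sql import SparkSession

          spark = SparkSession.builder.getOrCreate()
          sc = spark.sparkContext

          rdd = sc.parallelize(range(100))  # creating the RDD is itself the "loading" step
          print(rdd.count())                # an action with no further transformation applied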

  • @dineshedala · 5 years ago +4

    In one of my interviews I faced this question: what happens if an executor crashes unexpectedly after it has already processed 50 records? Will it continue from 51 or from 0?
    Do we have any service that tracks the execution status of an executor?

    • @rahuldey1182 · 4 years ago

      Yes, by creating a checkpoint and specifying the checkpoint folder location in your program
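
      A minimal sketch of that (paths illustrative): checkpointing materializes an RDD to reliable storage, so recomputation after a failure restarts from the checkpoint instead of the original source.

          from pyspark.sql import SparkSession

          spark = SparkSession.builder.getOrCreate()
          sc = spark.sparkContext

          sc.setCheckpointDir("hdfs:///tmp/checkpoints")   # illustrative path

          rdd = sc.textFile("hdfs:///data/input.txt").map(str.upper)
          rdd.checkpoint()   # written out on the next action
          rdd.count()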

    • @indiannewyorkbabies6872 · 2 years ago

      Don't RDDs store that lineage information? When an executor fails, the lineage lets a new executor recompute and restart the execution...!! That's why RDDs are fault tolerant

  • @jasbirkumar7770 · 6 days ago

    Sir, can you tell me something about housekeeping executive Spark data? I don't understand the word Spark. The facility company JLL requires Spark experience.

  • @krish808 · 3 years ago

    Excellent content in a simple and easy format. Are you providing any training on Databricks? If so, how do I contact you?

  • @rajrajan51 · 5 years ago +1

    Thanks for the video bro.
    I have a doubt: suppose user 1 is sharing table 1 and user 2 updates a value in a column of table 1. Will the change also get updated in user 1's shared table?

    • @srikanthchillapalli1037 · 5 years ago

      It won't happen, as user1 and user2 have sessions isolated from one another, so one user's operation doesn't have any impact on the other user's table. Actually, you can have different data for both these users even though the table name is the same.
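
      A short PySpark sketch of that isolation (illustrative data): temp views are scoped per session, so the same name can hold different data for different users.

          from pyspark.sql import SparkSession

          spark = SparkSession.builder.getOrCreate()
          user1 = spark.newSession()
          user2 = spark.newSession()

          user1.range(5).createOrReplaceTempView("shared_table")
          user2.range(100).createOrReplaceTempView("shared_table")

          print(user1.table("shared_table").count())  # 5
          print(user2.table("shared_table").count())  # 100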

  • @priyachauhan813 · 2 years ago +1

    Hi, thanks for the nice explanation.
    Scala works with Datasets and Python with DataFrames, and they both generate RDDs as the end result. Is my understanding correct?

  • @deepakkini3835 · 5 years ago +1

    I had an interview and was asked about the Spark process. Could you please explain what happens when a Spark job is stopped midway through execution? Will it start from the beginning or from where it left off?

    • @DataSavvy · 5 years ago +1

      It depends on how the job was stopped... Do you mean that you killed the Spark context and stopped it, or that only the running action failed... Recovery will depend on this...

    • @DataSavvy · 5 years ago +1

      It will also depend on whether you have any checkpoints in your job

  • @gurumoorthysivakolunthu9878 · 1 year ago

    Hi Sir... Very useful topic and very well explained... Thank you, Sir...
    1. "Each user can have a different Spark session..." -- Does this mean different job submissions...?
    That means only one Spark Context for the entire cluster, which handles many jobs... Right...?
    2. Then what about the Driver... Is it similar to Spark Context... Only one Driver for all jobs...?
    3. In the demo you showed creating many Spark sessions in the same job... Each session is different within the same job itself... Am I right...? But why create different sessions in the same code / job...?
    Thank you, Sir...

  • @ankan1627 · 2 years ago

    So what happens when different users create their own Spark context (say, before Spark session was introduced)? Are multiple Spark contexts created in such cases? If yes, what are we gaining by moving the abstraction away from Spark context to Spark session?

    • @ajithkannan522 · 1 year ago

      Only one Spark context is available. You can create multiple SparkSessions under that Spark context

  • @SagarSingh-ie8tx · 1 year ago

    Yes it’s nice

  • @phanikumar4915 · 4 years ago

    Can we call stop on a Spark session? What will happen if we do?

  • @sachinhugar · 5 years ago +1

    Hi Harjeet, when does this type of use case come up? Any example? Because in batch processing one Spark session is enough.

    • @DataSavvy · 5 years ago

      When you want your users to have a live connection for data analysis, etc.

  • @projjalchakraborty1806 · 5 years ago +1

    Hi Harjeet.. why are we using multiple SparkSessions instead of multiple SparkContexts.... is there any advantage???

    • @DataSavvy · 5 years ago

      It makes it easier to share tables and cluster resources among your users... As you well know, starting a different application for each user usually causes cluster contention

  • @vijaydas2962 · 5 years ago +1

    Very informative content... I have a doubt... I opened 2 separate spark2-shells using 2 different IDs... When I ran spark.sparkContext in the two terminals, the reference numbers were different. Shouldn't they be the same, as you explained at the beginning of this video where multiple users share the same sparkContext object?

    • @atnafudargaso8374 · 4 years ago

      same here

    • @yashdeepkumar2495 · 2 years ago

      He is talking about working in a clustered environment with more than one worker node, I think... usually that will be the scenario. If you open 2 Spark shells and check, it will create two separate contexts. I am new to this, so please let me know if you found the correct answer to your question after two years.

    • @sandeshhegde2847 · 1 year ago

      If you open 2 shells, they're 2 different applications. This video talks about having multiple Spark sessions within a single application

  • @vijaybigdata752 · 4 years ago +1

    I have a doubt: in this scenario, if we have 4 Spark sessions for a single Spark context, when the Spark context goes down, will all 4 Spark sessions be killed? Please confirm.

    • @DataSavvy · 4 years ago

      Yes Vijay... all Spark sessions will be killed

  • @rvalusa · 3 years ago +1

    Thanks, Sir, for a wonderful video explaining the differences.
    One quick question: when we close/stop a SparkSession created from a SparkContext, does this stop the other SparkSessions created from the same SparkContext as well?

    • @rvalusa · 3 years ago

      Found this, which is a weird implementation and apparently a bug in Spark - apache-spark-developers-list.1001551.n3.nabble.com/Closing-a-SparkSession-stops-the-SparkContext-td26932.html
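
      A short sketch of the behaviour that thread describes (assuming stop() delegates to the shared context):

          from pyspark.sql import SparkSession

          s1 = SparkSession.builder.getOrCreate()
          s2 = s1.newSession()

          s1.stop()              # stops the shared SparkContext, not just this session
          # s2.range(1).count()  # would now fail: the underlying context is gone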

  • @max6447 · 3 years ago

    By executor, do you mean the node manager?

  • @sreepaljsp · 4 years ago

    You said something at 2:39... I did not get that word... The sentence is "I can ___ a Spark context per user". What is that missing word?

    • @srikd9829 · 4 years ago

      I think the missing word is: spun. It's the past tense of the word spin. Generally the phrase 'spun up a server' means introducing a new server or node, or starting or booting it. This is because starting or booting a server spins the hard disk to load the OS. That is how the phrase came into practice. Hope this helps.

  • @srikanthchillapalli1037 · 5 years ago

    I don't think we can create multiple Spark contexts in Spark 1.x either. There is a parameter spark.driver.allowMultipleContexts=true, but it is only used in test scripts and cannot be used to create multiple contexts when coding in an IDE. And in Spark 2.x we create multiple Spark sessions. Please let me know if I'm wrong.

    • @murifedontrun3363 · 5 years ago +1

      There can be only one SparkContext per JVM process. If there were multiple SCs running in the same JVM, it would be very difficult to handle GC tuning, communication overhead among the executors, etc.

    • @srikanthchillapalli1037 · 5 years ago

      @@murifedontrun3363 Yes, but here in the video the tutor explained that in old versions multiple Spark contexts were created. So I have a doubt about how that is possible.

  • @vijaypandey5371 · 4 years ago

    What will happen if the driver program fails in Spark? And how do we recover it?

    • @DataSavvy · 4 years ago

      It depends on what settings you have for that job. If you have checkpoints and retries enabled, Spark will start to recreate those objects... otherwise the job will fail.
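
      An illustrative sketch of those opt-in settings (the retry config assumes a YARN deployment; paths illustrative):

          from pyspark.sql import SparkSession

          spark = (SparkSession.builder
                   .config("spark.yarn.maxAppAttempts", "2")  # YARN restarts a failed driver once
                   .getOrCreate())

          spark.sparkContext.setCheckpointDir("hdfs:///tmp/checkpoints")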

  • @ravikrish006 · 3 years ago

    Can we have multiple contexts? Could you show this with some examples?

  • @vamshi878 · 5 years ago

    Hi Harjeet, can you make a video on how to read HBase table data into a Spark dataframe, and how to insert a Spark dataframe into an HBase table? Is there any Spark-HBase connector available for Cloudera?

    • @DataSavvy · 5 years ago +2

      Sure Vamsi... I will add this to my to-do list... Thanks for the suggestion :)

  • @rohi1350 · 5 years ago

    We can say SparkSession is similar to sqlContext in Spark 1.6

  • @souravsardar · 3 years ago +1

    @datasavvy thanks for the video. Could you please make a video on where we can practice production-level scenarios in PySpark?

    • @DataSavvy · 3 years ago +1

      Sure Saurav... Let me know if you have a list of scenarios you want me to cover. Drop me an email at aforalgo@gmail.com

  • @TheVijju89 · 4 years ago

    Please post the python code sheet..

  • @mohdrayyankhan6623 · 2 years ago

    What are the users here ???

  • @mineb1842 · 1 year ago

    Plz

  • @MyTravelingJourney · 5 years ago +2

    I think you missed many points

    • @DataSavvy · 5 years ago

      Please suggest what you are pointing towards... Will cover it in another video

    • @dalwindersingh9282 · 5 years ago

      Please suggest - raise a few of them.

  • @underlecht · 3 years ago

    3 ads in 5 minutes. did not finish.

    • @DataSavvy · 3 years ago +1

      YouTube has increased the ads based on user stats. I unfortunately don't have a way to decrease that. Any suggestion is welcome (except a complete switch-off)