Want to learn more Big Data technology? You can get lifetime access to our courses on the Udemy platform. Visit the link below for discounts and coupon codes.
www.learningjournal.guru/courses/
I can't express how perfectly you explained this, and your slides are so apt.
This is the simplest and best video on Map-Reduce I have seen so far. Your way of explaining complex concepts simply is amazing. You take the right approach: breaking down the complex concept and answering it step by step.
By far the best video on this topic that I have seen. Your effort in correlating every assertion directly with code is really commendable. Please keep up the good work.
Literally clapped 👏 at the end of the video. Superb explanation with code!!
Superb, fantastic, fabulous. Amazing, mind-blowing video; your explanation technique is superb. Thank you for sharing.
I am from Indonesia and just started learning about big data, and I want to say thank you for sharing your knowledge.
Thanks a lot for making these videos. The way you teach is awesome. You explain even the complicated concepts so simply. I am learning a lot.
Sir, I have seen all the videos and gained good knowledge. The content delivery is excellent. I have never seen a trainer like you for a Hadoop course. Thank you so much, Sir.
It is really awesome. The way you explain everything word by word gives comprehensive knowledge.
Really, it's very simple, and the way of explanation is excellent.
Thank you!
I can sense your simplicity and knowledge.... your teaching is simply awesome.
One of the best videos on Map Reduce I have seen so far....... keep up the good work!!
Excellent! Your understanding and explanation is really great. Look forward to learning more from you. Thank you.
Very easy to understand. Thanks a lot.
Thank you very much. I have seen a lot of online Big Data videos, but these are very nice, explained in simple language with technical detail. Thanks again!
Sir. Your teaching style is simply awesome.
Great knowledge and excellent explanation skills!
Explained very well! Thank you very much, sir.
Hi Sir, can you also make a video on how the Tez framework works in the background for a Hive job?
Great explanation. I understand the concepts and the code now. Thank you.
great video
Thank you for all your videos; they are very helpful. Please upload a few videos on Apache Spark as well.
This was great, thanks for the tutorial, it was super simple!
Got good knowledge from you, sir.
Learning made easy.. Thank you.
Excellent tutorial
Thank you!
Brilliant
Thanks for the amazing video :)
Good stuff, sir.
Sir, your tutorials are very helpful.
But my system has only 2 GB of RAM, and as a student I cannot afford the paid services.
Is there any way I can practice the Hadoop concepts for free, without a paid setup?
Can you make a video tutorial course for NoSQL databases?
Stay subscribed. NoSQL will come soon, and many more things will also follow.
Good Material
Thank you so much, sir.
I am just confused about replication: will it count lines of replicated blocks? Please answer.
No. Replication is a kind of backup copy.
@ScholarNest But the file is stored on HDFS, we run MapReduce tasks on its blocks, and the blocks get replicated. So how does the framework know that it does not need to run on the replicated blocks?
thanks :)
Could you do personal training?
Can you please share these documents/PPTs? If possible, I will share my email ID.
I don't know who those people are who gave a thumbs down!! :O
Please insert subtitles next time :)
:-) All the newer videos have subtitles.
Data is replicated, so we will have the same blocks on some other systems as well, and those will also be counted. I guess this will not give the right answer, as we have multiple counts for the same block of data.
Even though the data is replicated, the MapReduce framework will ignore the multiple copies and run the map function on a single input split.
It follows that only one input split is created for a particular block.
This is taken care of by the MapReduce framework with the help of the metadata information in the NameNode.
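To make the point above concrete, here is a minimal sketch (in plain Java, not the real Hadoop API; all class and method names are illustrative) of how split planning can work: the planner reads each logical block's metadata from the NameNode and creates exactly one input split per block, regardless of how many replicas exist. The replica hosts are kept only as locality hints so the scheduler can run each map task near one copy of the data.

```java
import java.util.ArrayList;
import java.util.List;

// Toy model of MapReduce input-split planning (illustrative, not Hadoop's API).
public class SplitPlanner {

    // Block metadata as the NameNode would report it: one logical block,
    // physically stored on several DataNodes.
    record BlockInfo(long offset, long length, List<String> replicaHosts) {}

    // One unit of work for a single map task.
    record InputSplit(long offset, long length, List<String> preferredHosts) {}

    static List<InputSplit> planSplits(List<BlockInfo> blocks) {
        List<InputSplit> splits = new ArrayList<>();
        for (BlockInfo b : blocks) {
            // One split per LOGICAL block; extra replicas do not add splits.
            // Replica hosts travel along only as scheduling hints.
            splits.add(new InputSplit(b.offset(), b.length(), b.replicaHosts()));
        }
        return splits;
    }

    public static void main(String[] args) {
        // A 256 MB file with 128 MB blocks and replication factor 3:
        // 2 logical blocks -> 6 physical copies, but only 2 splits,
        // so each record is counted exactly once.
        List<BlockInfo> blocks = List.of(
            new BlockInfo(0L,         134217728L, List.of("dn1", "dn2", "dn3")),
            new BlockInfo(134217728L, 134217728L, List.of("dn2", "dn4", "dn5")));

        System.out.println("splits=" + planSplits(blocks).size()); // prints "splits=2"
    }
}
```

So even with three replicas per block, the framework schedules one map task per split, which is why replicated blocks are never double-counted.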
Sir, today I paid 159 rupees, but it is still asking me to join the channel.
Kindly check and do the needful.
Thanks,
Nitin Kaushal
It should come through to you. If not, please try reaching out to YouTube support. Payments are managed by YouTube, and I don't have visibility into or control over payments. YouTube support should be able to help.
Thank you very much