For online classes, visit our website:
technologicalgeeks.com/
Course details: ruclips.net/video/KBK85ETH5nI/видео.html
So cool! I wish my tutors had taught Hadoop in such a straightforward and easy way for the hefty sum they charged. Out of 8 class lectures, it took them 5 just to set up the environment. Kudos, guruji!
Perfect, perfect, perfect.... What brilliance... what energy... you've stolen the show, guru :)
Brother, you are too good.. very nice method of teaching.. appreciated.
The way you are teaching is fabulous... Thank you so much.
I have an assessment tomorrow and your videos are saving me! 😍
Same 😅
You will have a million subscribers soon. Your style of explaining is just what most of us need, and it is unique.
You are so cool, making complex things easy and fun.
I am from a commerce background but understood 70% of it.
Simply awesome... best tutorials for beginners.... Thanks, brother.
I like your videos because you include little bits of fun, which keeps my brain active and helps me concentrate on learning. # "This world is greedy" 😂😂😂😂
You are awesome. The way you teach is amazing and nobody can replace you. Thank you so much. But I have one request: please do continue with your videos.
Chotu... you are very, very good. Keep it up, brother.
The best, brother..
All my doubts got cleared, anna 😀
Not from college, but your videos helped me a lot in giving a seminar on this particular topic...
Thank you, brother..
Brother, the clone concept is really interesting 😄
The exam is done... this diagram came up in 2 questions.. Thank you.
Awesome explanation! Thank you so much for creating such good content.
God level explanation sir
You are amazingly different! Awesome video!
Next level explanation sir
Thanks Sir. It is very helpful and an amazing tutorial
Sir, you are amazing at your work!! Hats off.
Thanks for the good knowledge.
Very well explained! Thanks.
Super, explained very well.
Carry on, sir ji..!! One day you will beat Technical Guruji.
Will the client node and the name node be on the same system or on different systems?
Thanks for all the videos.
Very nice.👌👌
Does this involve coding too, or do you just type queries like in SQL?
Small correction: the default block size in HDFS is 64 MB.
Yes, you are right; however, the 64 MB block size is for the Hadoop 1.x series, which is outdated. For Hadoop 2.x it is 128 MB.
Ohh! Thanks, Sandeep, for correcting me!!
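For anyone who wants to verify this on their own setup, here is a minimal Java sketch, assuming a plain Hadoop 2.x client with core-site.xml and hdfs-site.xml on the classpath; the class name and printed message are only illustrative:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class BlockSizeCheck {
        public static void main(String[] args) throws Exception {
            // Picks up core-site.xml / hdfs-site.xml from the classpath,
            // so this reflects the cluster's dfs.blocksize setting.
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf);

            // Default block size used for new files (128 MB on a stock
            // Hadoop 2.x install; 64 MB was the old Hadoop 1.x default).
            long blockSize = fs.getDefaultBlockSize(new Path("/"));
            System.out.println("Default block size: "
                    + (blockSize / (1024 * 1024)) + " MB");
        }
    }

Against an unmodified Hadoop 2.x cluster this should print 128 MB; on an old 1.x cluster it would show 64 MB.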
Sir, could you upload videos on the Internet of Things?
If you have already made them, please share the link here.
Awesome, please keep it up, sir.
Wonderful teaching.
Can you make a video on HDFS and Linux commands? It will help everyone a lot.... please.
Sandip, where are you? Create more content like this, it helps us.
Simply..... Thank you
Very nice video. Can you please tell me which editing software you use?
+Sanal Malhotra Thank you, brother. I use Sony Vegas Pro 13 for editing 😊
Amazing..and interesting..keep it up..👍
Wow.. Amazing
Great work, keep it up. Big like!!!!!!
+Siddharth Gosai Thank you very much 😊
Technological Geeks Hindi, please send me the entire course PDF if possible, sir: jayendra_901@yahoo.co.in
How is data stored in a distributed way? Please explain.
One idle guy, a Hadoop admin :-)
Awesome video, sir... really.
Awesome bro 👌
When will you make the next video?
Sir, can we learn Hadoop on a laptop with 4 GB of RAM?
+Aryan Bainsla If you set it up as a dual boot, that is sufficient for learning; if you use VMware and allocate 2 GB of RAM, you will face performance issues.
Wow, Sandeep bhai, wow!
Can I learn Hadoop, sir?
Subscription is free? Wow
Where does the concept of active and standby NameNodes come in?
It's great, brother. I enjoyed it.
Thank you, brother..
Brother, please make a video on how to run Hadoop on Windows 7, 8, or 10.
simple and clear
Great, brother.
+gujar nitesh Thank you, brother 😊😊
Great Sir
I beg you, I touch your feet: please cover the following in big data: introduction to “R”, analyzing and exploring data with “R”, and statistics for model building and evaluation.
Sure, Achin sir, I will try to do it after completing the Hadoop series. You can tell me frankly, sir; after all, you are my first subscriber and I respect you. Consider me your friend and ask as a matter of right, I won't mind at all....
NICE
Brother, please share the slides.
great video
+shakeel ahmed Thank you very much 😊
this one is great
Please, sir, send the PPT link.
Amazing!!
Brother, please cover Sqoop, ZooKeeper, and Mahout too.
Yes, definitely... I will upload them very soon...
It got a bit too fast, brother.
These things take some time to learn.
You have crammed absolutely everything into a single video.
nice one
Thank you very much 😊
But sir, I have read that the default block size is 64 MB.
+Aryan Bainsla That is for Hadoop 1; for Hadoop 2 the default block size is 128 MB.
Thank you, sir...
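The default is only a default, by the way: the block size can also be set per file at write time. A small sketch under the same Hadoop 2.x client assumptions; the path, replication factor, and 64 MB value are just example choices:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class CustomBlockSizeWrite {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf);

            Path file = new Path("/tmp/custom-blocksize.txt"); // example path
            long blockSize = 64L * 1024 * 1024;                // 64 MB instead of the 128 MB default
            short replication = 3;                             // example replication factor
            int bufferSize = 4096;

            // create() overload that takes an explicit block size for this file.
            try (FSDataOutputStream out =
                     fs.create(file, true, bufferSize, replication, blockSize)) {
                out.writeUTF("block size demo");
            }

            // Confirm the block size actually recorded for the file.
            System.out.println("File block size: "
                    + (fs.getFileStatus(file).getBlockSize() / (1024 * 1024)) + " MB");
        }
    }

The per-file value simply overrides dfs.blocksize for that one file; replication and splitting into blocks behave as usual.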
Sense of humor
Please explain in English.
waaao man
*I didn't understand the practical part at all, brother*
try Bollywood
Very nice... your videos help me a lot... can I have your email ID? And please upload the complete Hadoop course videos...
Thank you very much, Arun bhai. My email ID is sdp117@gmail.com ...
Brother, please share your contact number.
The knowledge is excellent, but the style has started to feel boring now.... no offense!
I literally didn't understand anything.
Brother, please don't fool around so much while teaching, it gets irritating.... please, it's a request...