Hey Riddhi, I watched your Collection Framework and Multithreading videos a week before my interview, and I am very happy to tell you that I got selected, out of 10 candidates, at that company for a Java profile. Your teaching style is just awesome. Please keep posting new videos, brother.
May God give you all the happiness 💝🙏🏻
So happy for you. Congratulations
@@rite2riddhi There are no explanation videos on YouTube as deep as the ones you have made, in the past as well as now, with your superb teaching style. Thank you, sir.
Excellent explanation without any kind of repetition. Great content on YouTube.
Brilliant explanation bhaiya! 💯💯
Really the best explanation ever on hashmap implementation ❤👍
Amazing explanation, and kudos for your deep knowledge and expertise.
Good explanation with code; very intuitive.
❤❤ If you ask me, I must say that while watching the full video, your explanation did not feel like I was watching via YouTube. ❤❤ It felt like we were interacting directly.
Super, Riddhi da ✨. We need more tidbits like this. Kudos.
Thank you, Sourish.
It's a good video. I haven't found content like this elsewhere.
Your videos are worth watching 💯.. but I'm shocked to see so few views. Do keep posting more videos on Java.
Finally, the Java master is back on YouTube. Good to see you.
Haha.
Amazing!!
A great explanation, Riddhi. Thanks for explaining in such depth, as it is very important to know about the load factor and initial capacity. We could also provide a constructor that takes initial capacity and load factor as arguments, right? As is done in the standard HashMap class. Again, thanks for your effort.
Yes, correct. Thank you.
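For reference, java.util.HashMap does expose exactly such a constructor. A minimal sketch of using it (the class name here is made up for illustration):

```java
import java.util.HashMap;
import java.util.Map;

public class CapacityDemo {
    public static void main(String[] args) {
        // HashMap(int initialCapacity, float loadFactor) is part of the real API:
        Map<String, Integer> tuned = new HashMap<>(32, 0.60f);
        tuned.put("answer", 42);

        // The no-arg constructor uses the defaults: capacity 16, load factor 0.75.
        Map<String, Integer> plain = new HashMap<>();
        plain.put("answer", 42);

        System.out.println(tuned.equals(plain)); // true: capacity never affects the contents
    }
}
```

A custom implementation can mirror this by validating both arguments and sizing its bucket array in that constructor, then delegating to it from the no-arg constructor.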
Great video, Riddhi, but just to point out: the hashCode function does not actually give the memory address of the object.
Just an hour ago I solved this very problem using a HashMap. It takes a lot more memory than the nested for loops used to find the number of occurrences of an element. Basically, which should I prioritize when designing the algorithm: space complexity or time complexity?
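Since this trade-off comes up a lot, here is a minimal sketch (the array and names are made up): the HashMap pass is O(n) time but O(n) extra space, while the nested loops are O(n^2) time but O(1) extra space. As n grows, quadratic time usually hurts far more than linear memory, which is why the HashMap version is generally preferred unless memory is truly scarce.

```java
import java.util.HashMap;
import java.util.Map;

public class OccurrenceDemo {
    public static void main(String[] args) {
        int[] a = {1, 2, 2, 3, 2};

        // HashMap approach: O(n) time, O(n) extra space.
        Map<Integer, Integer> counts = new HashMap<>();
        for (int x : a) {
            counts.merge(x, 1, Integer::sum); // add 1, starting from 1 if absent
        }
        System.out.println(counts); // {1=1, 2=3, 3=1}

        // Nested-loop approach: O(n^2) time, O(1) extra space.
        for (int i = 0; i < a.length; i++) {
            int count = 0;
            for (int j = 0; j < a.length; j++) {
                if (a[j] == a[i]) count++;
            }
            System.out.println(a[i] + " occurs " + count + " times"); // duplicates reported per occurrence
        }
    }
}
```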
I haven't seen any explanation of HashMap with this much depth.
But it would have been even better if you had discussed that the worst-case time complexity under heavy hash collisions is O(n) in this linked-list implementation, and that it can be reduced to O(log n) using a balanced BST.
Can you please cover, in a separate video, the balanced BST implementation that kicks in once TREEIFY_THRESHOLD has been crossed?
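Until such a video exists, here is a rough sketch of the Java 8+ behaviour. The three constant values below match java.util.HashMap; the decision logic around them is a simplified illustration, not the JDK source:

```java
public class TreeifySketch {
    // These values match java.util.HashMap (Java 8+); the logic below is simplified.
    static final int TREEIFY_THRESHOLD = 8;     // a chain this long is a candidate for treeification
    static final int UNTREEIFY_THRESHOLD = 6;   // on shrinking, a tree this small reverts to a list
    static final int MIN_TREEIFY_CAPACITY = 64; // below this table size, resize instead of treeify

    // What a bucket holding 'chainLength' nodes does in a table of 'tableLength' slots.
    static String decision(int chainLength, int tableLength) {
        if (chainLength < TREEIFY_THRESHOLD) return "stay a linked list: O(chain) lookup";
        if (tableLength < MIN_TREEIFY_CAPACITY) return "resize the table instead of treeifying";
        return "treeify: a red-black tree gives O(log chain) lookup";
    }

    public static void main(String[] args) {
        System.out.println(decision(5, 16));  // stay a linked list
        System.out.println(decision(9, 16));  // resize instead
        System.out.println(decision(9, 128)); // treeify
    }
}
```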
Great explanation. I have got a different level of clarity on the internal working of HashMap. But I still have one doubt regarding the reduction in collisions. Suppose the load factor crosses the threshold, so we double the bucket array size and rehashing is done. Even then, only the bucket index changes, because of the change in the compression function; objects that had the same bucket index before rehashing will still share a bucket index, just a different one now.
So, after rehashing, how does the hash function reduce collisions?
Hi Rohit,
Let's suppose that initially the bucket size is 10 and we have two objects, A and B, with hash codes 21 and 31 respectively. When we calculate their bucket indices, we get 1 in both cases, since 21 % 10 == 31 % 10 == 1, so both are stored at the same index.
Now let's increase the bucket size to 20 and calculate the bucket indices again. For A the index is still 21 % 20 == 1, but for B it is now 31 % 20 == 11. The two objects land at different indices, so the collision is resolved.
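The same arithmetic in runnable form. The sizes 10 and 20 are just the numbers from this example; the real HashMap uses power-of-two sizes and bit masking, but plain modulo shows the idea:

```java
public class RehashDemo {
    public static void main(String[] args) {
        int hashA = 21, hashB = 31;

        // Before the resize: both hash codes compress to the same bucket.
        int oldSize = 10;
        System.out.println(hashA % oldSize + " vs " + hashB % oldSize); // 1 vs 1 -> collision

        // After doubling: the two keys now land in different buckets.
        int newSize = 20;
        System.out.println(hashA % newSize + " vs " + hashB % newSize); // 1 vs 11 -> no collision
    }
}
```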
Hi Riddhi, it was a really great explanation, but there is a small change needed in the code: the hash should be wrapped as Math.abs(key.hashCode()). Hash codes can be negative, and a negative number will cause an error when used as the index into the table (the bucket array).
True..... in the case of strings, the hash computation can overflow the integer range and return a negative value. Good catch, though.
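One caveat worth adding: Math.abs alone is not quite safe either, because Math.abs(Integer.MIN_VALUE) overflows and stays negative. A minimal sketch of the pitfall and two safe fixes for a modulo-based implementation like the one in the video (java.util.HashMap itself sidesteps this by masking the hash against a power-of-two table size):

```java
public class BucketIndexDemo {
    public static void main(String[] args) {
        int capacity = 10;

        // Java's % keeps the sign of the dividend, so a negative hash breaks indexing:
        System.out.println(-7 % capacity); // -7 -> ArrayIndexOutOfBoundsException if used

        // Math.abs almost works, but overflows on the one value it cannot negate:
        int h = Integer.MIN_VALUE;
        System.out.println(Math.abs(h) % capacity); // -8 -> still negative!

        // Two safe alternatives:
        System.out.println((h & 0x7fffffff) % capacity); // clear the sign bit -> 0
        System.out.println(Math.floorMod(h, capacity));  // floored modulo -> 2
    }
}
```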
Please continue the Java series.
Thanks, Riddhi.
Hi Riddhi,
Can you also talk about why you were not using the this keyword to set/update the values? For example, this.size = 0 and this.capacity = INITIAL_CAPACITY.
You can do that, but it is not required.
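For context, a minimal sketch of the one case where this is actually required: when a parameter or local variable shadows the field (names here are made up):

```java
public class Counter {
    private int size;

    // No shadowing: 'size = 0' and 'this.size = 0' are equivalent here.
    public void reset() {
        size = 0;
    }

    // Shadowing: the parameter 'size' hides the field, so 'this.' is required.
    public void setSize(int size) {
        this.size = size; // without 'this.', the parameter would just be assigned to itself
    }

    public static void main(String[] args) {
        Counter c = new Counter();
        c.setSize(5);
        c.reset();
        System.out.println(c.size); // 0
    }
}
```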
Sir, please make a full playlist on Spring and Spring Boot.
I need advice. I'm a fresher struggling to get an entry-level job using the Java stack.
What I want to know is: should I be doing the advanced Java topics (the ones you teach seem a little advanced), or should I stick to core Java and only practice that?
First, strengthen your basics.
@riddhi Can you share the code?
Hi sir, I am currently working as a Flutter developer at Renault Nissan.
I am also preparing more DSA for top product-based companies. Can you please tell me whether my tech stack, i.e. Flutter, will be acceptable to companies like FAANG? Please, sir.
They don't care much about tech stack.
@@rite2riddhi Then I can concentrate on DSA, system design, and projects for my resume, right, sir?
Please make a tutorial on Spring Boot.
Where do you work currently?
I won't tell.
Your voice is echoing; I thought that was due to your room.... Can you please fix it if possible?
Sure. Thanks for pointing that out.
@@rite2riddhi My pleasure, sir.... Very useful videos; keep going, we will always support you....😇😇