💡 GRAPH PLAYLIST: ruclips.net/video/EgI5nU9etnU/видео.html
This algorithm is not working anymore, any thoughts on the latest test cases?
leetcode 743
I like the fact that you mention that it is indeed difficult and you also came across lot of bugs while writing this code. It makes us feel we are not the only ones struggling. Conducive to learning!
I ran the same code, but instead of 't = max(t,w1)' I just put 't = w1', and it worked with all test cases passing. That max() operation felt redundant; I broke my head trying to understand why we needed it in the first place, then I tried omitting it myself and it worked. It was also faster by 200 ms this way. Anyway, I really appreciate your solution. Thanks Neetcode!
Yeah, same here. I hit my head a lot trying to understand why that max operation is needed; couldn't find any scenario where it makes any difference. If anyone else can throw some light on this.
Hmm yes, I think I agree, thanks for sharing. Since we are always popping the minimum value from the minHeap, and the values in the minHeap only ever increase, I don't think t will ever be bigger than w1. The only reason I suspect this is because you said it worked without it lol. Neetcode said he tried many times with previous solutions having bugs, so I guess this was just left over from that.
Yeah, the max(t,w1) is not needed.
I think they might have updated the test cases? This no longer works.
The max is not needed because when we visit the last node in the loop, that last node will have the max time, since it's a min heap.
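A minimal sketch of the variant being discussed (assuming the usual adjacency-list setup; this is not the exact code from the video): since the min heap pops weights in non-decreasing order, a plain assignment can stand in for the max().

import heapq

def networkDelayTime(times, n, k):
    # build adjacency list: node -> list of (neighbor, weight)
    adj = {i: [] for i in range(1, n + 1)}
    for u, v, w in times:
        adj[u].append((v, w))

    t, visit, heap = 0, set(), [(0, k)]
    while heap:
        w1, n1 = heapq.heappop(heap)
        if n1 in visit:
            continue
        visit.add(n1)
        t = w1  # weights leave the min heap in non-decreasing order
        for n2, w2 in adj[n1]:
            if n2 not in visit:
                heapq.heappush(heap, (w1 + w2, n2))
    return t if len(visit) == n else -1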
I didn't graduate from CS, and this is the first time I've learned Dijkstra's algorithm. Thank you so much for your explanation.
Bellman-Ford solution:
(At most n-1 iterations; stop iterating if none of the values change from the previous iteration.)
class Solution:
    def networkDelayTime(self, a: List[List[int]], n: int, k: int) -> int:
        # Bellman-Ford
        dp = [float("inf") for i in range(n + 1)]
        dp[k] = 0
        dp[0] = 0  # dummy slot so max(dp) ignores the unused index 0
        stop = True
        j = 0
        # at most n-1 passes; stop early once a full pass changes nothing
        while j < n - 1 and stop:
            stop = False
            for src, dest, wt in a:
                if dp[src] + wt < dp[dest]:
                    dp[dest] = dp[src] + wt
                    stop = True
            j += 1
        if max(dp) == float("inf"):
            return -1
        else:
            return max(dp)
How does this help with time complexity? I get that it works. You are basically iterating over all the edges and propagating time information, and stopping only when that time array "dp" is stable. But this feels less efficient than the solution proposed in the video. What do you think?
After doing some research myself, it seems that the main benefit of doing Bellman-Ford is that it won't break in the case of negative edge weights.
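A tiny sketch of that point with a made-up edge list (the weights in this problem are positive, so this situation can't actually occur here): with a negative edge, a Dijkstra that marks nodes visited on first pop locks in node 3 too early, while Bellman-Ford's repeated relaxation still finds the cheaper route.

edges = [(1, 2, 5), (1, 3, 2), (2, 3, -4)]  # 1->2->3 costs 1, direct 1->3 costs 2
n, src = 3, 1
INF = float("inf")
dist = {v: INF for v in range(1, n + 1)}
dist[src] = 0
for _ in range(n - 1):  # Bellman-Ford: relax every edge up to n-1 times
    for u, v, w in edges:
        if dist[u] + w < dist[v]:
            dist[v] = dist[u] + w
print(dist)  # {1: 0, 2: 5, 3: 1}; a visited-set Dijkstra would report 3 -> 2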
Just wanted to say thanks. I don't know how or why, but I understand all your solutions on the first go, which I can never say about the actual LeetCode premium solutions or others in the discussion section.
the course dijkstra's solution was more than sufficient. Thank you
import heapq

class Solution(object):
    def networkDelayTime(self, times, n, k):
        # build adjacency list: node -> list of (neighbor, weight)
        adj = {}
        for i in range(1, n + 1):
            adj[i] = []
        for s, d, w in times:
            adj[s].append((d, w))

        shortest = {}  # node -> shortest time from k
        min_heap = [(0, k)]
        while min_heap:
            w1, n1 = heapq.heappop(min_heap)
            if n1 in shortest:
                continue
            shortest[n1] = w1
            for ne, w in adj[n1]:
                if ne not in shortest:
                    heapq.heappush(min_heap, (w + w1, ne))
        return -1 if len(shortest) != n else max(shortest.values())
This entire algorithm has been explained in detail. After watching, you can clearly understand Dijkstra's algorithm and apply it to the problem. Thanks!!
I've been calling it [D ai ks tra] throughout uni!! 😆 Love the algo, thank you for the detailed explanation!
I think that is the correct pronunciation :)
Yes, that is definitely the correct pronunciation.
pls don't start calling it jikstra xD
He also spells it djikstra when it's really dijkstra; that's where the underlying issue lies, I think 🧐
Regarding complexity analysis:
Since the number of edges `E` is given as an argument (`times`), what was the motivation for analyzing the algorithm in terms of `E` and `V`, instead of just E (`E * log(E)`) or just V (`V^2 * log(V)`)?
Hey, bro!
I am just loving what you do, thanks a lot!
It would be nice to mention why the values in the minHeap must be in this exact order (weight, node). Due to how the minHeap in Python works, it orders heap entries by the first value (the weight). If you place the values the other way around (node, weight), it won't work correctly.
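A quick sketch of that point with made-up values: heapq compares tuples element by element, so the first field decides the pop order.

import heapq

h = []
heapq.heappush(h, (5, 2))  # (weight, node)
heapq.heappush(h, (1, 7))
heapq.heappush(h, (3, 4))
print(heapq.heappop(h))    # (1, 7): the smallest weight pops first
# With (node, weight) tuples the heap would order by node id instead,
# and Dijkstra's would pop entries in the wrong order.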
Actually, we can add a few optimizations to this code.
1) We can stop the while loop when the visit set reaches n. Reason: since we are using a minHeap, we are guaranteed to get the minimum path to all nodes, and we can stop once we have reached all of them.
2) Also, we can remove the continue block, as we are checking whether a node is in the visit set before adding it to the minHeap.
class Solution:
    def networkDelayTime(self, times: List[List[int]], n: int, k: int) -> int:
        adjList = {i: [] for i in range(1, n + 1)}
        for u, v, w in times:
            adjList[u].append((v, w))

        time = 0
        minHeap = [(0, k)]
        visit = set()
        while minHeap and len(visit) < n:
            w1, n1 = heapq.heappop(minHeap)
            visit.add(n1)
            time = max(time, w1)
            for n2, w2 in adjList[n1]:
                if n2 not in visit:
                    heapq.heappush(minHeap, (w1 + w2, n2))
        return time if len(visit) == n else -1
After adding these optimizations, the runtime decreased by 300ms for me.
Optimization 2 may not work: a node may not be visited yet when it is added to the heap, but it can already be visited by the time it is popped.
@janmejayadas6514 Yeah. Optimization 2 (not totally sure it's an optimization, frankly) works only when used together with optimization 1 (which is legit): the moment you visit all the nodes, you have the right `t` and you need to stop; otherwise, you're going to overwrite `t` in the very next iteration and then it's gone. I wouldn't use it either way, because it seems to me that there's simply no need to process a node we have already processed.
The 2nd one is actually the opposite of an optimization. We mark a node as visited only when we pop it from the heap. By that time we could've already added it basically v times (think it's the farthest node and connected to every other, closer node). So once we pop the shortest path to it, we mark it visited, so on every subsequent pop we don't consider it anymore.
Basically a node could be added to the heap twice with increasing weights before getting visited even once. So when we visit it the first time, we have the right t, and we shouldn't process it for the 2nd heap entry
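A small made-up trace of that situation: node 3 ends up with two heap entries before it is ever visited, and the stale one has to be skipped when it is popped later.

import heapq

heap, visited = [], set()
heapq.heappush(heap, (4, 3))  # direct edge 1 -> 3 with weight 4
heapq.heappush(heap, (2, 3))  # cheaper route 1 -> 2 -> 3 with total weight 2
while heap:
    w, node = heapq.heappop(heap)
    if node in visited:
        print("skip stale entry", (w, node))  # (4, 3) is ignored
        continue
    visited.add(node)
    print("visit node", node, "at time", w)   # node 3 is visited at time 2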
You sir are a treasure
As always, nice explanation :) I think you could omit the t variable and use w1 instead in the return statement, since the last w1 will be the solution. So instead of t = 0, we could do w1 = 0 before the loop.
Wow, hats off to NeetCode. This is a really complicated problem and you solved it so perfectly. Thanks
Super helpful. Thanks for trudging through Dijkstra's to make it easier for us!
nav you were right, i got the intuition by myself but had a lot of confusion writing the algorithm.
U r a God bro. super videos and easy to understand . super bro . super.
The examples provided thus far don't address the case where there are simultaneous routes along the shortest paths. Although it works and passes, I'm not sure how the code addresses the following case.
ex:
[[1,2,1],[2,3,2],[1,3,2]]
naive output: 3 ; you might think this is the answer, since it results from 1->2 (weight 1), then 1->3 (weight 2)
expected: 2 ; since 1->2 and 1->3 are happening simultaneously, the weight 2 is picked.
this is a pretty smart way to figure this out
For this problem, we need to notice that what we finally need is the parallel running time, which means we need to know the running time to the farthest node. That is mentioned at 4:02.
The premise of the problem was a little confusing for me until I checked the example.
If the problem asked us to return the total time from the start node to each node, then we would just need to change t = max(t, w1) to t += w1. Then the 1st example would return 4.
t += w1 is not going to work unless there's only a single path, and only if you push just w2 to the queue (line 19).
More importantly, every path should maintain its own time/weight separately to get a correct result for multi-path graphs, where there are multiple ways to get to a destination.
Thank God NeetCode exists!!
7:16 Getting minimum from min-heap is not log(N) but O(1).
Good point!
It is not. He's deleting from the heap, not just reading the top. A delete is O(log n). BTW, the whole algorithm could be implemented optimally with a Fibonacci heap, making it O(E + VlogV) instead of O(ElogV + VlogV).
@@orellavie6233 but it is always deleting the minimum, therefore the the delete operation will always be the best case which is O(1)
@IAmCosmikGaming Deleting the min requires fixing up the heap accordingly; the heap elements have to be restructured, which takes O(log n).
Really like how you have explained it so easily.
Dijkstra's seems much less scary now, thanks
Why is max(t, w) required? Since we don't visit an already visited node, and there are no negative time values, won't the new value from the minHeap always be greater than t?
someone else mentioned this too. I think it's unnecessary
Might be the best way to highlight Dijkstra's algorithm.
I think you can optimize this by adding
if len(visit) == n:
    return t
under the while minheap:
Inspired by the previous question: min cost to connect all points.
You can also omit the t variable and just return w1 at the end.
one of the best explanations so far
You could also exit the while loop earlier by checking for len(visit) == n
Thanks for the great video. What is the reason for t = max(t, w1)? Don't we want the shorter time? Shouldn't it be min(t, w1)? I am confused lol
oh I guess because we are using minheap, we are looking at the shorter path first and then if there is another route to the same node with greater weight we will just skip that loop. is this correct?
we want the time at which all the nodes got the signal.
@@sunginjung3854 this is correct
I still did not get why t = max(t, w1) :(
@@veliea5160 t = w1 is enough, because w1 is always increasing
You can use an array instead of a heap to get a complexity of V^2. This is good for dense graphs.
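A minimal sketch of that array-based variant (same signature assumed as in the problem; not code from the video): a linear scan for the closest unvisited node replaces the heap pop, giving O(V^2) overall.

def networkDelayTime(times, n, k):
    INF = float("inf")
    adj = {i: [] for i in range(1, n + 1)}
    for u, v, w in times:
        adj[u].append((v, w))

    dist = {i: INF for i in range(1, n + 1)}
    dist[k] = 0
    visited = set()
    for _ in range(n):
        # linear scan instead of a heap pop: O(V) per step, O(V^2) total
        u = min((v for v in dist if v not in visited), key=lambda v: dist[v])
        if dist[u] == INF:
            break  # remaining nodes are unreachable
        visited.add(u)
        for v, w in adj[u]:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w

    ans = max(dist.values())
    return ans if ans < INF else -1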
pronounces: dɛɪkstra
J is actually a vowel in Dutch and ij is like a long i or a sound no non-Dutch can pronounce
Thank you so much as always! Minor problem in line 5: should be edges[u].append((w, v)), w is first, before v.
Great explanation, thank you!
In the official LeetCode solution, BFS was used with a time complexity of O(V+E). How come BFS is more optimal than Dijkstra's approach, which has a time complexity of O(V + ElogV)?
I think we don't need "if n2 not in visited" at 17:43 🤔
Nice explanation! One thing i'd like to point out is that it's pronounced "DIEK-struh"
While the classic algorithm uses a min heap, there's no downside to using a binary tree instead (no upside either). Since there are no peek() or heapify() operations, there's no advantage.
using max variable was quite impressive
So it's like BFS but using a priority queue instead of a regular queue?
great explanations sir!
Extremely nice introduction
great explanation
I love this channel a lot :)
Do we really need `t = max(t, w1)` ? Can't it just be `t = w1` ? This is because, we are already adding the previous times and w1 always reflects the new updated time. (popped from the min heap)
I like your content so much. Can you please make a separate playlist for backtracking problems? I am really facing a hard time trying to solve this type of question.
Thanks for the suggestion, just created it: 💡 BACKTRACKING PLAYLIST: ruclips.net/video/pfiQ_PS1g8E/видео.html
Was what you said at 7:19 a mistake?
"Every time you want to get the minimum value from a min heap, it's log N".
I think you meant to say it's just O(1) time to retrieve the minimum value, but it takes log N time for insertion (worst case scenario).
Love your videos though!
It's O(1) to look at the min, log(N) to pop the min, and log(N) to insert
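A quick illustration with Python's heapq (toy values):

import heapq

h = [5, 1, 3]
heapq.heapify(h)          # O(n) one-time build
print(h[0])               # O(1): peek at the minimum
print(heapq.heappop(h))   # O(log n): remove the minimum and restore the heap
heapq.heappush(h, 2)      # O(log n): insert a new value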
Just curious, did you submit this problem and it worked? Can you please check.
clear explanation as always!
Dijkstra*
Thanks for the great explanation!
had this in a new grad amazon onsite last month. Why didn't I get to this problem sooner. FML
Bit confused on the use of the "seen" set. If you are adding each node to seen as you visit it, how do you account for a situation where you encounter a node previously seen, but has a different path sum than when it was previously encountered?
Nope, removing the minimum value in a minheap and "fixing" the heap so that it still remains a minheap requires O(logn) time
similar to heappush which also requires O(logn) time
I still have this doubt. Did you understand how the seen set works in the case of two routes?
Why do you max(t, w1)?
I have the same question; if you know, please answer.
your videos are awesome
7:20 it should be O(1), right?
Shouldn't the overall time complexity be O(V*logV + E*logV)? Why do we ignore the part where we loop through the vertices and pop from the heap?
thanks for the big o explanation
Correct me if I am wrong, but isn't the time complexity of getting the min element O(1)? (referring to 7:18)
Viewing the top value is O(1). Popping the top value is O(log n) because you have to rearrange "log n" number of values to reestablish the min heap property.
@@misterimpulsism gotcha. Thanks.
Thank you
Javascript version + Generic Heap implementation:
/**
* @param {number[][]} times
* @param {number} n
* @param {number} k
* @return {number}
*/
var networkDelayTime = function (times, n, k) {
  // Build adjacency list: node -> list of [weight, neighbor] pairs
  const adj = new Map();
  for (let i = 1; i <= n; i++) adj.set(i, []);
  for (const [u, v, w] of times) adj.get(u).push([w, v]);

  const shortest = new Map();
  let totalTime = 0;
  const minHeap = new Heap((a, b) => a[0] - b[0]);
  minHeap.push([0, k]);
  while (minHeap.size() > 0) {
    const [w1, n1] = minHeap.pop();
    if (shortest.has(n1)) continue;
    shortest.set(n1, w1);
    totalTime = w1;
    for (const [w2, n2] of adj.get(n1)) {
      if (shortest.has(n2)) continue;
      minHeap.push([w1 + w2, n2]);
    }
  }
  if (shortest.size < n) return -1;
  return totalTime;
};
class Heap {
  constructor(comparator = (a, b) => a - b) {
    this.heap = [null];
    this.comp = comparator;
  }

  push(value) {
    this.heap.push(value);
    this.#heapifyUp();
  }

  pop() {
    if (this.heap.length === 1) return null;
    if (this.heap.length === 2) return this.heap.pop();
    const value = this.heap[1];
    this.heap[1] = this.heap.pop();
    this.#heapifyDown(1);
    return value;
  }

  size() {
    return this.heap.length - 1;
  }

  #higherPriority(a, b) {
    return this.comp(this.heap[a], this.heap[b]) < 0;
  }

  #heapifyDown(parent) {
    const size = this.heap.length;
    while (true) {
      const left = parent * 2;
      const right = parent * 2 + 1;
      let priority = parent;
      if (left < size && this.#higherPriority(left, priority)) priority = left;
      if (right < size && this.#higherPriority(right, priority)) priority = right;
      if (parent === priority) break;
      this.#swap(parent, priority);
      parent = priority;
    }
  }

  #heapifyUp() {
    let child = this.heap.length - 1;
    let parent = Math.floor(child / 2);
    while (child > 1 && this.#higherPriority(child, parent)) {
      this.#swap(parent, child);
      child = parent;
      parent = Math.floor(child / 2);
    }
  }

  #swap(a, b) {
    [this.heap[a], this.heap[b]] = [this.heap[b], this.heap[a]];
  }
}
In a classic Dijkstra's there is a "relaxation" of the edges. Where are we doing that here?
The min heap is an optimization, right? Because you could just use a linear search as the simplest solution.
New to DSA here. Is Dijkstra's needed here? Wouldn't a simple DFS suffice?
At line 19, is it w1+w2? or t+w2?
Neet algorithm as always!
Isn't finding the minimum in the min heap going to be O(1)? (instead of logN) I thought only insertion was logN
insertions and deletions both run in log N. You are not just finding min, you are popping it from heap. So, heap needs to be restructured to restore heap property. If you were just looking up the min without popping out, the complexity for that is O(1)
Yes, but popping it (removing) will be O(log N). I got confused as well.
@@deep.space.12 Makes sense!
In the question description, it does not mention finding the minimum time it takes. So why are we trying to find the shortest path, then?
why is this problem not on your 150 list Nav?
Why do you calc the max of t, w1? In my solution I set t = w1 and it works?
why "t = max(t, w1)"?
We reach nodes in order of time of arrival. The last node we reach will have the largest time of arrival. So should it not be t = w1?
Pls let me know, If you know the anwer
legend
I copied your code line by line, but it doesn't seem to work for the test case
[[1,2,1]]
2
1
bro just check the condition:
if len(vis) == n:
    return t
else:
    return -1
that's all...
How is this different from Prim's algorithm?
My man said jigstra's~
Are both conditions on line 12 and line 18 absolutely necessary? Can't the algorithm work with just one of those conditions?
It can't.
Line 18 will allow for pushing the not-yet-processed node into the heap with all the edges going into it, so that the heap can give you back the smallest edge.
However, the moment you process that node, you will have the shortest path to that node, and from that moment onward, you do want to disregard all the incoming edges going toward that node, that ended up in the heap prior. This is what the line 12 condition is there for.
For the love of CS, it's not Jikstra, it's Dijkstra pronounced (DYEKSTRA)
amazing
The time and space complexity explanation is not clear; otherwise, nice explanation!
we don't need set
Why can't this problem be solved by BFS?
Sorry if the question is too naive.
This is a greedy algorithm. If instead of using a min heap (where we draw only one element) we used a queue, we would have to explore *EVERY* path. Here we only go to the one edge with the lowest weight.
It's pronounced dye-kstra's :)
Sir! Love the videos, but please have some respect for Dijkstra, at least try to pronounce his name, and please also add a correction about the DIJ (not DJI) spelling :(
The pronunciation will be easier once you spell it correctly (Dijk, not Djik) 😅
just wow
An interesting way to pronounce "Djikstra" lol
Do people still comment "first" XD
🍪
do people still use "XD" ?
Dike-struh.
What a poor explanation of E = V^2 loool
your teaching method is so gay
Question, any reason why we don't need to call heapq.heapify() ?
Dijkstra's algorithm is read as /ˈdaɪkstrəz/ DYKE-strəz, not Jikstra's
import java.util.*;

class Solution {
    public int networkDelayTime(int[][] times, int n, int k) {
        // Initialize the graph: node -> list of {neighbor, travel time}
        List<int[]>[] graph = new List[n + 1];
        for (int node = 1; node <= n; node++) {
            graph[node] = new ArrayList<>();
        }
        for (int[] time : times) {
            graph[time[0]].add(new int[]{time[1], time[2]});
        }
        // Min-heap of {time, node}, ordered by time
        PriorityQueue<int[]> pq = new PriorityQueue<>(Comparator.comparingInt(a -> a[0]));
        // Initialize visited array
        boolean[] visited = new boolean[n + 1];
        Arrays.fill(visited, false);
        // Start with the source node
        pq.offer(new int[]{0, k});
        // Initialize variables for result
        int minimumTime = 0;
        int nodesCovered = 0;
        while (!pq.isEmpty()) {
            int[] currNode = pq.poll();
            int currTime = currNode[0];
            int currentNode = currNode[1];
            // Skip if the node is already visited
            if (visited[currentNode]) continue;
            // Mark the node as visited
            visited[currentNode] = true;
            // Update result variables
            nodesCovered++;
            minimumTime = currTime;
            // Explore neighbors
            for (int[] neighbor : graph[currentNode]) {
                int neighborNode = neighbor[0];
                int travelTime = neighbor[1];
                // Add unvisited neighbors to the priority queue
                if (!visited[neighborNode]) {
                    pq.offer(new int[]{currTime + travelTime, neighborNode});
                }
            }
        }
        // Check if all nodes are covered
        return nodesCovered == n ? minimumTime : -1;
    }
}
Note: the max of currTime and minimumTime is not required.
Thanks for great explanation!
No problem!