This is not just a course on discrete mathematics, but also familiarization with mathematics. Congratulations, professor.
You're almost there, keep going..
This was probably the most challenging lecture in this series
There is a small issue with the last application, the job-server distribution. The professor works under the assumption that the job lengths L1, …, Ln are deterministic, so the only randomness in the process is the random assignment of servers, which follows i.i.d. Bernoulli trials. But as he mentioned at the beginning, if we treat the arrival of jobs as the product of two random variables (the length of each job and the server assignment for each job), then the length variables are not mutually independent of each other. I can see that if we assume the lengths are deterministic we can apply the theory, but in practice we can't unless we assume either 1) the length random variables are also independent (which would make the random server assignment redundant, since even a round-robin assignment would have the same effect), or 2) we accept some discounting of the result.
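To make the worry concrete, here is a quick simulation sketch (my own parameters, not from the lecture): jobs are assigned to servers uniformly at random, once with deterministic unit lengths and once with random exponential lengths of the same mean, and we compare the average worst-case server load.

```python
import random
import statistics

random.seed(0)

def avg_max_load(n_jobs, n_servers, draw_lengths, trials=500):
    """Average, over many trials, of the heaviest server's total load
    when each job goes to a uniformly random server."""
    worst = []
    for _ in range(trials):
        loads = [0.0] * n_servers
        for length in draw_lengths(n_jobs):
            loads[random.randrange(n_servers)] += length
        worst.append(max(loads))
    return statistics.mean(worst)

n, m = 1000, 10
deterministic = avg_max_load(n, m, lambda k: [1.0] * k)
random_lengths = avg_max_load(n, m, lambda k: [random.expovariate(1.0) for _ in range(k)])

# Both cases have expected load 100 per server, but random lengths add
# variance, so the heaviest server tends to end up heavier.
print(deterministic, random_lengths)
```

This only illustrates that randomness in the lengths changes the tail behavior; the bounds from the lecture apply cleanly when the lengths are fixed constants.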
Great stuff. I wish there were exercises with answers available to help remember the material better. The available problem sets are a bit hard to work on alone, without guidance or people to consult with. But it's a great series; I just look up whatever's unclear. Thanks. I hope to see more math lecture captions in the future.
This will make you practice searching, a valuable skill in computer science. BUT there are official free solutions to all problem sets of 6.042J (2010 version), and they're not too difficult to find.
Read the book Mathematics for Computer Science by Eric Lehman, Tom Leighton, and Albert R. Meyer.
41:30 - Chernoff Bound
Absolutely fantastic lecture.
Real-world problems go to "Recitation". Good.
I was hoping to see the problem he mentioned at the end of the lecture.
If Chernoff's is derived using Markov, how did we suddenly get so much tighter bound on the probability? Which step in the proof enabled this? EDIT: He answers this @1:11:15
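For anyone who gets stuck before the answer at 1:11:15: the step that buys the extra tightness is applying Markov not to X itself but to the exponentiated variable e^{sX} (my notation, a standard sketch rather than the lecture's exact derivation):

```latex
\Pr[X \ge a] \;=\; \Pr\!\left[e^{sX} \ge e^{sa}\right]
\;\le\; \frac{\mathbb{E}\!\left[e^{sX}\right]}{e^{sa}}, \qquad s > 0.
```

Because X is a sum of independent terms, \(\mathbb{E}[e^{sX}]\) factors into a product of per-term expectations, and optimizing over s > 0 turns Markov's 1/a decay into an exponentially decaying bound.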
This is another great lecture by Professor Tom Leighton. This lecture was more theoretical compared to past lectures in discrete mathematics, but MIT students in computer science should gain more knowledge and insight for this powerful profession. Probability and statistics continue to grow in importance in all parts of computer science. I really enjoyed this lecture.
How did you get the number of standard deviations at 31:22?
The SD is 15, as mentioned before; 150 = 10 * 15, so that's 10 standard deviations.
Thanks, I missed the part where he mentioned it.
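Spelling out the arithmetic from the reply above (using the SD of 15 from the lecture), plus the Chebyshev bound that makes the number useful:

```python
# Chebyshev's inequality: P(|X - mu| >= k * sigma) <= 1 / k^2.
sigma = 15          # standard deviation quoted in the lecture
deviation = 150     # the deviation being asked about

k = deviation / sigma   # number of standard deviations
bound = 1 / k ** 2      # Chebyshev's bound on the tail probability

print(k)      # 10.0
print(bound)  # 0.01
```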
I'm a business informatics student, so I am taking introduction to computer science and computer programming, and so far I hate everything that has to do with this major aside from these two courses, which I am really enjoying. So I am considering changing to a computer science and engineering major, but I want to make sure this is something I will enjoy before I make the change, so I am researching topics related to computer science online, one of which is discrete mathematics. Do you think discrete math gives good insight into what CS is like? How about 'Theory of Computation' and 'Intro to Artificial Intelligence'? Do you have any other recommendations for courses or subjects that DO give good insight into CS and do not have any prerequisites, so I can look into them and be able to follow? Thanks in advance to those who take the time to answer and help :) Apologies if this has spammed you on more than one video; I'm just trying to get as much information as possible.
What did you end up doing?
bro what did you do?
When did we prove the law of total expectation used @3:20?
This is a great lecture series, but the audio and video seem to be out of sync in lecture 24, which is distracting. Would it be possible to fix this problem and upload it again?
I'd suggest downloading the video (from a third-party website) and then adjusting the audio sync in VLC. Quick fix; it just takes a couple of minutes.
This was really cool - quick question about Markov's Thm though - it assumes/needs a non-negative variable, but if you have a negative variable, does it still work if you increase every possible amount by the most negative number's amount such that it becomes zero? Basically, if you move a distribution all the way into the positive side, can you then use Markov's Thm on that distribution?
Never mind, my question was answered in the lecture, thanks!
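For anyone else who had the same question: yes, shifting works, as long as you also shift the threshold. If X >= -c always, then Y = X + c is non-negative and Pr[X >= a] = Pr[Y >= a + c] <= E[Y] / (a + c). A small sketch with made-up numbers:

```python
import random

random.seed(1)

# X is uniform on [-5, 5], so it can be negative; shift by c = 5.
samples = [random.uniform(-5, 5) for _ in range(100_000)]
c = 5.0   # shift that makes X + c non-negative
a = 3.0   # threshold for the original variable X

mean_shifted = sum(x + c for x in samples) / len(samples)
markov_bound = mean_shifted / (a + c)          # Markov applied to X + c
empirical_tail = sum(x >= a for x in samples) / len(samples)

# The bound holds, though it is loose: roughly 0.625 vs a true tail of 0.2.
print(empirical_tail, markov_bound)
```

Note the bound degrades as c grows (it tends to 1), which is one reason Chebyshev or Chernoff bounds are usually preferred for variables that can be very negative.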
excellent
@1:08:28 yea, MST is worse than this, this is just algebra🤣
kool