Great content, especially for beginners like me entering the Julia world. Very excited to work in Julia for Machine learning tasks. Thank you !
When a package is imported with the keyword "using", the Package.function syntax is not necessary for the functions the package exports.
For example:
Plots.scatter(x, y)
It can be written this way:
scatter(x, y)
Hi Elias, I use the dot notation to help new Julia programmers know where the imported functions are coming from. But you are right! Maybe next episode I should mention that the dot notation is not necessary and that I use it for this specific reason.
@DrRandyDavila Now I get it, it's just to simplify for beginners.
I also think it's important to make that detail clear in the next episode.
Beginners may be confused as Julia code is not normally written like this.
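The difference between the two styles can be shown with a standard-library package (Statistics is used here only because it needs no installation, unlike Plots):

```julia
# With `import`, only the module name is brought into scope,
# so every call must be qualified:
import Statistics
Statistics.mean([1, 2, 3])   # works

# With `using`, the package's exported names are also brought
# into scope, so the unqualified call works too:
using Statistics
mean([1, 2, 3])              # also works

# The qualified form Statistics.mean(...) still works after `using`,
# which is why writing Plots.scatter(x, y) is valid but optional.
```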
I think Stochastic Gradient Descent was implemented incorrectly.
We had:
data = [(x_train, y_train)] # a list containing 1 tuple
Later we had:
N = length(data) # so N = 1
And in the loop, for each epoch we have:
i = rand(1:N) # so rand(1:1) which is always 1
Hi L W, yes, very good catch! This is something we will address in next week's episode. Thank you again for pointing this out!
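A minimal sketch of the fix described above (the variable names follow the comment; the dummy training values and the loop body are illustrative, not from the video). The bug is that `data` wraps the whole training set in a single tuple, so `N == 1` and `rand(1:N)` always returns 1; storing one `(x, y)` pair per sample makes each SGD step draw a genuinely random sample:

```julia
x_train = [1.0, 2.0, 3.0, 4.0]
y_train = [2.0, 4.0, 6.0, 8.0]

# Buggy version: one tuple containing the entire dataset
data_buggy = [(x_train, y_train)]
length(data_buggy)             # 1, so rand(1:1) is always 1

# Fixed version: one (x, y) tuple per training sample
data = collect(zip(x_train, y_train))
N = length(data)               # 4 here, one entry per sample

for epoch in 1:3
    i = rand(1:N)              # now a real random sample index
    x, y = data[i]
    # ... gradient step on the single sample (x, y) would go here ...
end
```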
Same feeling! I am learning ML theory, and somehow the Flux/Julia syntax matches the intuition. I'm never going back to Python unless there's a package Julia doesn't have!
OMG! Is the Pluto output sane now? Does it show below the cells, like 100% of the rest of the world's agreed-upon standard for command output? I'm gonna cry!