First time I understood this LSTM. You have explained the LSTM very easily and in a very crisp manner.
I asked my teacher where should I start if I want to apply machine learning for stocks and he sent me this video.
You are AMAZING. Thanks a lot sir!!
Try to apply machine learning to coin toss or ask your teacher how to do that... lol.
@@rozsadnymarek5988 lol 50 percent accuracy
@@sayantanmazumdar9371 No more than in financial markets. ML is useles in forex. Believe me. I have 10 years of experience.
@@rozsadnymarek5988 Hi dis ML works in stocks market or gives a good probability ?
@@aminemihoub5269 *Unfortunately no!* You can try but you'll fail like everybody else. Sorry for destroying your dream of wealth. I tried many ultra-fancy ideas and the most advanced algos and I was never able to get *stable and significant* accuracy. Stocks are a bit easier to predict, but they are still unprofitable.
Krish, the best part in your teaching is... somehow, you match the audience frequency!
Very much impressed by the way you presented it. So simple and informative, and nobody shares this "secret" of predicting the next 30 days. Only a few people share it on YouTube. Hats off to you. Thanks, and all the best for your future videos. May God bless you.
I hope you understand that the reason nobody shares the „secret“ on how to predict the next 30 days of a stock price is… that these predictions are really really bad, right? I mean you get that?
"Wow!!" is the word for explanation skill.
Let me congratulate you first: I saw other videos explaining the same method using LSTM, but you were the only one who made it predict 30 days into the future, amazing! If you can make more videos like this, maybe improving this method and explaining it with all the details and the logic behind it, I'll watch even if the video lasts 2 or 3 hours. Finally, one idea for this method is to use more features, for example volume together with close price, instead of only one feature. Amazing.
Really true! I would like to have the code
Yeah I've done this with the features you r suggesting, it gave good accuracy.
@@swatigupta3173 Did you have to predict the other features as well to feed the predictions back to the model?
@@swatigupta3173 could you plese help me ?is there any way to convert those 30 values to actual stock price.
Easily the best video about the LSTM in RUclips. Incredible explanation skill. Thank you for this priceless source Sir!
Sir, you have done data leakage twice.
1) Scaling should be applied after splitting the data, not before
2) You are using the test data as validation data while fitting the model, which should not be done
Correct me if I'm wrong
1) Scaling 'should' be done after the split to both training and testing datasets in the same way as he did it...
2) not sure
@@arshdeepsingh5950 He did not perform the data processing correctly and data leakage is present, which will cause the model's performance metrics to be overly optimistic. The data was scaled prior to splitting, so information from the test set was present at training time.
You are correct, he did the data leakage twice; that's why the prediction came out so accurate and the new 30-day forecast is a smoothed-out curve.
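For anyone who wants the leak-free variant, here is a minimal sketch, assuming the same names as in the video (df1 holding the close prices, a 65/35 split); the exact variable names are only illustrative:

import numpy as np
from sklearn.preprocessing import MinMaxScaler

close = np.array(df1).reshape(-1, 1)              # raw close prices, shape (n, 1)
training_size = int(len(close) * 0.65)
train_raw, test_raw = close[:training_size], close[training_size:]

scaler = MinMaxScaler(feature_range=(0, 1))
train_data = scaler.fit_transform(train_raw)      # fit on the training period only
test_data = scaler.transform(test_raw)            # reuse the train statistics, no leakage

# validate on a slice carved out of the training period instead of on the test set
val_cut = int(len(train_data) * 0.9)
train_data, val_data = train_data[:val_cut], train_data[val_cut:]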
This is good from an academic point of view and for learning about LSTMs and their application, but I think a lot more goes into building a robust, profitable automated trading system, especially given the current situation (volatile market). Anyway, an informative video, keep up the good work!
Yes I also agree
@@krishnaik06 sir, can we do it for Indian market ?
Yes we can
It's doable if the data is available at 10-minute intervals for every day over the last 7+ years
Does it really require that much data??
Hi, this is a good tutorial about using LSTM in Python. BUT: it is the wrong application. When you zoom into the curve, e.g. day 1150 to day 1258, you can see that the "prediction" works pretty much like a low-pass filter. That means there is a lot of delay. Also, when you compare the performance of your example (using RMSE) to a simple 1-step delay (predicted price for the next day = actual price of the current day), the 1-step delay gives better results.
adding to this comment, the 'naive' forecast should be used as a threshold performance before evaluating any ML model
Great achievement and proof that the stock market can be predicted. Good luck!! ✊
It can, proven, and consistently.. Search for Jim Simons.
may god give you more strength to make such valuable video man!! so grateful for you.
haha I'm reading Jason's book at the same time so this is perfect.
Did you buy Jason's computer vision book?
And how are Jason Brownlee's time series books?
When you compute the mean squared error you used "train_predict", whose values are around 200-201 (not scaled), against "y_train", whose values are still scaled (between 0 and 1), so you get a big mean_squared_error value.
^ more people need to see this (saved me)
Yes, I kind of figured that out when I got a very large MSE and RMSE. Do you know how this can be fixed?
@@badalsoni8000 2 solutions :
- inverse scale y_train.
- scale train_predict.
@@youssefkhachaf8258 On inverse scaling y_train it shows error saying "Expected 2D array, got 1D array instead", how do we solve this?
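For the "Expected 2D array, got 1D array" error: inverse_transform wants a column vector, so reshape first. A small sketch, assuming the fitted scaler, y_train and train_predict from the video:

import math
from sklearn.metrics import mean_squared_error

y_train_inv = scaler.inverse_transform(y_train.reshape(-1, 1))    # back to price scale
rmse = math.sqrt(mean_squared_error(y_train_inv, train_predict))  # both sides now in the same units
print(rmse)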
When you run the model on scaled data, your predictions should come out between 0 and 1 (in the case of a min-max scaler), so you can directly use both series to calculate MSE and MAPE.
The best video from which you can learn Machine Learning. Thank you for the video!
If I may add an edit to the code... the mean squared error you compared mixes scaler-transformed and non-transformed data, e.g. the y_train values are still scaler-transformed while train_predict has already been inverse-transformed. Hence you got an RMSE value of 100+. Compare the root mean squared error before inverse transforming the predicted data; an RMSE of 0 to 0.5 is quite acceptable.
Shouldn't we check the RMSE after inverse transforming both y_train and y_predict? Because that would give us the actual RMSE values in a practical sense.
Very Impressive!!! Another gem from your treasury.
Very good one in great detail!!
Doing great job!!
Keep Posting more..
Thanks for the video. This is the first video this concept is explained really well. Now I understand the code and can use it
This made my Day. Thank you so much Sir, for this Wonderful Tutorial.More power to you.
Aren’t you supposed to apply the Min Max scaler after the train split? I believe what you did might result in a data leak.
It doesn't really matter does it? split then scale or scale then split..
@@doji-san It matters. You have to split first, then fit MinMaxScaler() on the train set. After that, you transform both the train set and the test set with the MinMaxScaler() instance that was fitted on the train set. Fitting on all the data from the beginning leads to look-ahead bias, so the results will look better than they should.
Exactly, I was thinking the same. And he even used the test data as validation while fitting the model, which I guess will also result in data leakage.
If you transform before you split, and it's a data-wide transformation (not sample-specific), you are setting global reference points using the test set, thus biasing your transformation. Always ensure your test set mimics the real, unknown future, home slice.
hi
i have been watching some videos and so far this is the top related to this topic. keep going
Very good presentation. So clear and understandable. Thankyou
Hi Krish, at 16:03. Shouldn't that be X_test ? Why are you calling it Y_train ?
Hey Krish,
The video showcases the implementation of the LSTM beautifully.
It was easy, will definitely try to implement and check the result with tweaks.
Keep it up.
Thanks for the upload.
You can also use Yahoo to fetch the stock data. It doesn't have API call restrictions.
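If that route is of interest, a minimal sketch with the third-party yfinance package (not the Tiingo client used in the video):

import yfinance as yf

df = yf.download("AAPL", period="5y")   # roughly 5 years of daily OHLCV, no API key needed
close = df["Close"]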
🌟
scaling using a min-max scaler before the train and test split is a kind of data leakage for the test set. Isn't it?
Yes it is. I noticed that too.
At 29:20, we also need to inverse transform y_train and y_test before calculating the RMSE
Exactly
Please send the code how to do it?
even normal inverse transform is giving error for me since it is in 3d shape
Thanks so much...you really explained well...from the fundamentals to the applications.
Sir, you should have created a data generator instead of the create_dataset function; it would save memory and time. Love your videos, keep doing it.
Can you elaborate a bit on this?
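Presumably something like Keras's TimeseriesGenerator is meant here: it yields the (100-step window, next value) pairs on the fly instead of materialising them all in memory. A sketch, assuming the scaled train_data array and the model from the video:

from tensorflow.keras.preprocessing.sequence import TimeseriesGenerator

time_step = 100
train_gen = TimeseriesGenerator(train_data, train_data,
                                length=time_step, batch_size=64)
# each batch: X with shape (64, 100, 1), y with shape (64, 1)
model.fit(train_gen, epochs=100)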
After the prediction, please show how the real market moved in the next 30 days to compare how well the model predicted it!! Because it looks like it is just flattening out at the end.
Naik sir, you are great 🙏💯
Really, WONDERFUL . Many-many thanks sir.
Dear Sir, if this should not be used for a real-world project, then please also mention what steps should be considered in a real-world case study for stock price prediction.
Please provide your valuable feedback.
Great video man, really inspired me to make a version 2 code of predicting stock market.
Thanks for the wonderful video.
I tried it on the Google stock today.
One clarification: on lines 311 and 312 we need to inverse transform y_train & y_test back to the non-scaled version before taking the RMSE for comparison with the predicted values.
When I did that the RMSE came down drastically.
But on inverse transforming y_train and y_test it shows the error 'Expected 2D array, got 1D array instead', since y_train and y_test are only 1D arrays. How to solve this??
@@rajatrautan5005
Hi, it's been quite some time since I did this. If I have your email I can share the notebook.
Please explain how to do the transform and send the code.
First comment! Sir, please make videos on GSoC and projects related to it to which we can contribute. I have mailed you regarding the same. Thank you for such valuable content, sir...
Thanks Krish, please upload Deep Learning further videos.
Thanks for the video. Just one observation. I believe 817th element will be part of the testing data set,not the training.
One problem is that you are min-max scaling the price between 0 and 1. That creates a problem because you can't encode a true "maximum price". Maybe use % change instead.
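One way to act on this suggestion, sketched here and not taken from the video: train on day-over-day percent changes (which have no fixed ceiling) and rebuild prices from the predicted returns; predicted_returns stands for whatever the model outputs.

import numpy as np
import pandas as pd

close = pd.Series(np.array(df1).ravel())                      # df1 as in the video (assumption)
returns = close.pct_change().dropna().values.reshape(-1, 1)   # unbounded, no artificial "roof"

# ...train the LSTM on windows of `returns` instead of scaled prices...

last_price = close.iloc[-1]
predicted_prices = last_price * np.cumprod(1 + predicted_returns.ravel())   # hypothetical model output -> prices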
Beautiful Explanation, Thanks for video.
Beautifully explained...thanks a lot!!
Krish Naik sir == improvisation
Sanyam Jain True
very well demonstrated and useful content....
At this step:
model=Sequential()
model.add(LSTM(50,return_sequences=True,input_shape=(100,1)))
model.add(LSTM(50,return_sequences=True))
model.add(LSTM(50))
model.add(Dense(1))
model.compile(loss='mean_squared_error',optimizer='adam')
I am getting the following error:
NotImplementedError: Cannot convert a symbolic Tensor (lstm/strided_slice:0) to a numpy array. This error may indicate that you're trying to pass a Tensor to a NumPy call, which is not supported
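(For anyone hitting the same thing: this particular NotImplementedError is widely reported as a NumPy/TensorFlow version clash, NumPy 1.20+ with older TF 2.x builds. Pinning NumPy below 1.20, e.g. pip install "numpy<1.20", or upgrading TensorFlow usually clears it.)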
Important to note that real stock prediction takes in MANY different data sources, the time-series data of the stock price alone is not sufficient to make accurate predictions, you must take into account the sentiment of news around a company etc
First of all, thanks for a wonderful session. One question about scaling though: shouldn't the MinMaxScaler be used to fit_transform the training data, and then the "fitted" scaler applied to the test data?
Same question for me, because in Jason Brownlee's material I've seen him apply the scaler fitted on the training values to the test data.
I am going to implement this in my project great job sir
Krish.. can you please upload multivariate time series video
I followed your Keras Tuner model for univariate time series forecasting, but it shows an error like 'input 0 of layer not compatible'.
Data Scaling: fit only train data & transform train and test data. You cannot use the entire dataset to fit. This way you're simply cheating. Also, the evaluation is a bit cheating because you construct lagged features from known target data, i.e. data leakage. That's why it now looks great. You need to predict and rebuild the lagged features in a stepwise way.
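Roughly what the stepwise rebuild looks like: predict one step, append that prediction to the window, slide, repeat, so later predictions never see true future values. A sketch, assuming the fitted model, the scaled test_data and time_step=100 from the video:

import numpy as np

time_step = 100
window = list(test_data[:time_step].ravel())        # seed with the first 100 known points
stepwise_preds = []

for _ in range(len(test_data) - time_step):
    x = np.array(window[-time_step:]).reshape(1, time_step, 1)
    yhat = model.predict(x, verbose=0)[0, 0]
    stepwise_preds.append(yhat)
    window.append(yhat)                              # feed the prediction back in, not the true value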
Hi Krish Sir, thank you for such an amazing explanation. Really liked it. It helped me in my project too.
Thanks for your video. Could you explain why scaling is important here? It doesn't make sense why it would matter since we are only using one column. If there were more than one column it would make sense, since two columns have different scales, but why for one column?
Because the LSTM uses sigmoid/tanh activation functions whose outputs are bounded roughly between 0 and 1, which is why we have to transform the data into the 0-1 range.
Why we have transformed the data into range 0 - 1? What if we don't?
After the data is split, please check the train size and the test size: the total size is 1258 (counting from zero) but train and test combined is 1256, so two data points are missing. Also there is a lag in the prediction.
@16:57 it should say X_test instead of y_train
thank you very much for the video
:)
Thanks for the video! Just one thing: how can we put dates on the x-axis of the graph, and even future dates for the prediction? Please reply!
Set the DATE column as the index. And for the future dates, you need to use the DateOffset method in a range.
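A small sketch of both parts, assuming the date column was set as the DataFrame index and that predicted_prices holds the 30 forecast values (both names are assumptions):

import pandas as pd
import matplotlib.pyplot as plt

dates = df.index                                                          # DatetimeIndex after set_index on the date column
future_dates = pd.date_range(dates[-1] + pd.DateOffset(days=1), periods=30)

plt.plot(dates[-100:], df['close'].values[-100:], label='history')
plt.plot(future_dates, predicted_prices, label='next 30 days')
plt.legend()
plt.show()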
Thank you very much. It was very helpful.
Amazing... At 22:43 there is a table showing the model summary... Can you please explain what each and every entry in this table tells us?
Hi... Please address Multi time series forecasting using LSTM.
@Srikanth Rachakulla, did you find out any ways to implement this?
@@rahulm774 nope I haven't got any
@@srikanthrachakulla5488 Hello Srikanth. Are you trying to build a trade bot ? I am also trying to build one. I am beginner in coding, will be happy to connect with you
1. Non-normalised data doesn't work so well, but when you normalise it you create a specific "roof" that acts as a barrier the neural network finds hard to jump over. In simple words - it's hard to predict values higher than the historic data.
2. How can you expect a network trained on so few samples to predict so many days ahead? The prediction error grows with each day, and in my opinion after 3-4 days the error is much bigger than the value being predicted. It makes sense to have the network predict one or two days ahead and that's it. (And even then, I believe, with at most 70-80% precision, not more.)
Hello my friend, thank you very much for this video. I have a question: what do you think is the best model for stock market prediction?
Hi, I'm having trouble running it with the API key, as it comes back with 'invalid decimal literal'. Could anyone please help?
same
Thank you sir... nicely explained, and thanks for sharing the code.
This was my final year project... If someone asks me in an interview why I chose this project, what should I say??
Hi, this is a very useful video. What changes do I need to make for a multivariate LSTM?
Shouldn't we split the data before scaling it?
Yeah, create another set for mock prediction
Krish Sir, dataset is not declared before this function
import numpy
# convert an array of values into a dataset matrix
def create_dataset(dataset, time_step=1):
    dataX, dataY = [], []
    for i in range(len(dataset)-time_step-1):
        a = dataset[i:(i+time_step), 0]   ### i=0 -> rows 0..99, label = row 100
        dataX.append(a)
        dataY.append(dataset[i + time_step, 0])
    return numpy.array(dataX), numpy.array(dataY)
please correct the code
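For what it's worth, dataset there is just the function's parameter, so it only has to exist when the function is called. A usage sketch with the scaled arrays from the video:

time_step = 100
X_train, y_train = create_dataset(train_data, time_step)
X_test, y_test = create_dataset(test_data, time_step)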
Thanks for the video. I think the Python explanation was good for understanding how to put things together, but I'm afraid the technical approach is very similar to an auto-regressive time series model. Using only the history of closing prices might not be enough for a good prediction model. Thanks anyway.
So, I ran your code tested like 5 different tickers. And guess what...in the next 30 days we will have a financial crisis and all stocks will drop 80% or more.
I got the same problem. Does it have something to do with the fact that Tiingo only gives data with a length of 1257 instead of 1258 as in the video? It's the only difference I can find.
@@TheBjornAlmighty I don’t think the prediction is accurate at all. Tesla, Apple, Amazon all the stocks drop down to like $50... :-) in the next 30 days
@@keylomoon for sure it's wrong. But our result shouldn't differ that much from his. Just look at the loss: mine is way above 1000 in training and 2000 in test, on all stocks. So I think our input data is fed in wrong somehow.
@@TheBjornAlmighty I find that candlestick pattern recognition is way better for predicting future stock moves compared to this. My accuracy predicting bullish stocks is like 80%. Try that! This kind of number prediction I think is basically impossible, because it's like guessing human minds - I don't think that's possible.
@@keylomoon Cool! Do you have a video/article you recommend??
Thanks Krish,
you have explained it step by step so anyone can easily follow.
I have gone through other videos related to stock market prediction, but everybody says not to use this technique for regular stock market predictions.
Can you please suggest any specific methods that I can try for regular stock market prediction?
Fantastic, one of the best tutorials.
Hi sir, can you make a video on web scraping with Python if possible? I've searched a lot on the internet but no one teaches it from the basics.
Does the library support extraction of NSE and BSE stocks too?
I loved it !!
It's a good video
Hi can you please take this in detail in live stream? It is very interesting big fan of your work !
Great insight, love the content. I'd like a piece of advice though! Would you recommend using the Multi-Step LSTM model for more than one output, or re-using the single output as done above?
Exactly he will cache the previous lstm cell states for that
Sir, please make video using bi-Directional lstm.
Thanks
Hi Krish, can you make a video with multiple inputs & outputs?
Please focus on the difference between one-step forecasting and multi-step forecasting.
Yeah, this just predicts one day at a time based on previous real days. right?
@ Krish Naik sir please throw some light on multivariate time series analysis
in min 16.26 , it should be ( X_test f1 f2 f3 o/p(y_test) ) not ( y_train ....)
It is good and helpful work. I have one question: since the model is predicting, why is the Date/Time value a decimal number rather than a Date?
excellent explanation PROF:)
Hi, can you tell why you didn't do inverse scaling on y_train & y_test before calculating RMSE?
If I understood your question correctly, it does not matter if we do inverse scaling or not, the RMSE value is going to be the same
hi Krish Naik. Thanks for your videos, you are amazing bro.
I found an error in your code: when you compute the metrics you run mean_squared_error(y_train, train_predict) and the error comes out too high because you didn't unscale y_train. That's why the metric results looked bad, while the plot you made later comes out right because there you did unscale it. HOPE this HELPS GUYS.
To fix it:
y_train = scaler.inverse_transform(y_train.reshape(-1,1))
I don't understand: we reshaped X_train to 3 dimensions.
X_train = X_train.reshape(X_train.shape[0], X_train.shape[1], 1)
But in the LSTM layer we give input_shape as 2 dimensions, why?
model.add(LSTM(50,return_sequences=True,input_shape=(100,1)))
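In case it helps anyone else: Keras's input_shape describes a single sample and leaves out the batch dimension, so (100, 1) means (timesteps, features) and matches the 3-D training array of shape (samples, 100, 1):

# X_train has shape (n_samples, 100, 1); each individual sample is (100, 1)
model.add(LSTM(50, return_sequences=True, input_shape=(X_train.shape[1], 1)))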
Thanks for a wonderful explanation. Could I ask you to explain how to predict the next unseen n days for multivariate LSTM models?
inverse_transform should also be applied to y_train and y_test before computing the score, since scaling was applied to those two as well, right?
NotImplementedError: Cannot convert a symbolic Tensor (lstm_6/strided_slice:0) to a numpy array. This error may indicate that you're trying to pass a Tensor to a NumPy call, which is not supported
Is anyone getting this error? If yes, how did you fix it? Thanks
Krish Sir,
while calculating the RMSE value train_predict is inverse transformed, but y_train is not inversely transformed. Same with the test data. Please clarify if I am missing something.
Thank you sir for making video on stock prediction.
Firstly, thanks for this valuable and informative video. I just want to know: you have used only the 'close' feature to predict the 'close' values for the next 30 days. What about the other features? Don't they have an impact on the 'close' values? Why aren't they used for predicting the close value for the next month, or during training? Can anyone answer please... I'm just starting off on this project, but this question isn't letting me move forward.
You are just making the train/test split more complicated!
For reference you can do this to split: from sklearn.model_selection import train_test_split
train_x, test_x, train_y, test_y = train_test_split(x, y, test_size=0.35, random_state=0)
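One caveat if you go this route: train_test_split shuffles rows by default, which mixes future prices into the training set; for a time series you would want to keep the temporal order, e.g.:

from sklearn.model_selection import train_test_split
train_x, test_x, train_y, test_y = train_test_split(x, y, test_size=0.35, shuffle=False)  # no shuffling: past -> train, future -> test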
Thank you for your video
I have something I want to ask.
So we don't need to care about metrics=['accuracy'], we only care about loss?
Thank you
Thanks, this video actually explains time series data very well. I am just a bit confused: apart from the Stacked LSTM part, this video shouldn't really be in the NLP playlist.
How can we predict future dates if we have more than one input, like open, low, high, close?
calculate the average of high and low
Good one bro, can I know the logic behind the next thirty days? It may hold true for the next two to three days or one week.
Hi Krish.!
Great Info as usual.
Can you please advise how we can check the accuracy of this LSTM model?
you can use MAPE
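A sketch of that, assuming scikit-learn 0.24+ and that y_test_inv and test_predict are both on the original price scale (names are illustrative):

from sklearn.metrics import mean_absolute_percentage_error

mape = mean_absolute_percentage_error(y_test_inv, test_predict)   # fraction, e.g. 0.03 = 3% average error
print(mape)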
Excellent! Thank you. Don't y_train and y_test need to be reshaped from one dimension to two dimensions? Or can we leave y_train and y_test as one dimension?
Please, make a video about probabilistic Bayesian LSTM.