This content is sponsored by my Udemy courses. Level up your skills by learning to turn papers into code. See the links in the description.
followed this video step by step, love it. Keep on keepin on!
I'm getting this error: AttributeError: 'NoneType' object has no attribute 'endswith' on the line with: model.load_weights(model.build(tf.TensorShape([1, None])))
This is the full error:
Traceback (most recent call last):
File "shakespear.py", line 85, in
model.load_weights(epochC)
File "/home/zachary/.local/lib/python3.8/site-packages/keras/utils/traceback_utils.py", line 67, in error_handler
raise e.with_traceback(filtered_tb) from None
File "/home/zachary/.local/lib/python3.8/site-packages/keras/saving/saving_utils.py", line 322, in is_hdf5_filepath
return (filepath.endswith('.h5') or filepath.endswith('.keras') or
AttributeError: 'NoneType' object has no attribute 'endswith'
I'm using Ubuntu, thank you for your help.
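For reference, Keras's Model.build() returns None, so model.load_weights(model.build(...)) ends up passing None as the filepath, which is exactly what the endswith check is complaining about. A minimal sketch of the intended pattern, assuming the weights were saved as TensorFlow checkpoints; both the toy model and the training_checkpoints directory name are placeholders:

```python
import tensorflow as tf

# Placeholder stand-in for the tutorial's model; use the real architecture here.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(65, 256),
    tf.keras.layers.GRU(1024, return_sequences=True),
    tf.keras.layers.Dense(65),
])

# build() defines the variable shapes and returns None, so its return value
# must never be passed into load_weights().
model.build(tf.TensorShape([1, None]))

# load_weights() expects an actual checkpoint path (a string), e.g. the latest one.
checkpoint_dir = './training_checkpoints'  # placeholder path
model.load_weights(tf.train.latest_checkpoint(checkpoint_dir))
```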
You are hands down the best ML/AI YouTuber. Keep up the good work. Best of luck for the future!
Appreciate that
Do you know why padded_batch in TF 2.x returns [None] or [] as its TensorShape? When we pad in Keras using pad_sequences we specify a predefined length, which makes sense, but how does this work in TF 2.x when there is no specified padding shape?
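For what it's worth, tf.data's padded_batch pads each batch only to the length of the longest element in that batch, so the only static shape it can promise is [None] rather than a fixed length like pad_sequences gives you. A small sketch with made-up sequences:

```python
import tensorflow as tf

# Three variable-length "sentences" of integer ids (toy data).
sequences = [[1, 2, 3], [4, 5], [6, 7, 8, 9]]

ds = tf.data.Dataset.from_generator(
    lambda: sequences,
    output_signature=tf.TensorSpec(shape=[None], dtype=tf.int32),
)

# padded_shapes=[None] means "pad to the longest element in this batch",
# so the static length stays unknown (None) even though each batch is rectangular.
batched = ds.padded_batch(2, padded_shapes=[None])

for batch in batched:
    print(batch.shape)  # (2, 3) for the first batch, (1, 4) for the second
```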
Great video! I am running into a problem, however. After the program writes the generated story, it outputs "Exception ignored in: ". Any idea what this might mean?
Great tutorial, thanks a lot!
I'm interested in writing an AI writing assistant. Any tips for libraries or tutorials I could check out?
Hi Phil, I was recently assigned a project on topic modeling (using latent Dirichlet allocation). I was wondering if you will cover topic modeling later in this NLP series?
I'm facing a problem where I am grid searching for the optimal hyperparameters, and the best model always has the smallest number of topics in the grid. This means the best hyperparameters always end up with only 1 topic (n_components=1 in scikit-learn), which pretty much defeats the point of finding clusters/topics.
I think the dataset I give the algorithm is just too small, but I am not sure. It would be wonderful if you could have a tip for me or mention it in an upcoming video (if you plan on covering topic modeling anyway).
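For context, a minimal sketch of that kind of grid search (toy documents and a made-up parameter grid): GridSearchCV scores LatentDirichletAllocation with its default score(), an approximate log-likelihood of the held-out fold, and on a very small corpus that score can indeed peak at the smallest n_components.

```python
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.model_selection import GridSearchCV

# Toy corpus; a real run would use far more documents.
docs = [
    "the cat sat on the mat",
    "dogs and cats make good pets",
    "stock markets fell sharply today",
    "investors worry about interest rates",
]

# Bag-of-words counts, the usual input for LDA.
counts = CountVectorizer().fit_transform(docs)

# Grid over the number of topics; the default scoring is LDA's own score(),
# an approximate log-likelihood on the held-out split.
grid = GridSearchCV(
    LatentDirichletAllocation(random_state=0),
    param_grid={"n_components": [1, 2, 5]},
    cv=2,
)
grid.fit(counts)
print(grid.best_params_)  # on a tiny corpus this often comes back as 1 topic
```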
The play is Coriolanus
Thank you!
Hey Phil. Can you please explain, or point to a guide on, how to make a smart compose feature like Gmail's using an n-gram model?
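Not a full answer, but the core idea of an n-gram smart compose is just "given the previous word(s), suggest the most frequent continuation". A toy bigram sketch with a made-up corpus:

```python
from collections import Counter, defaultdict

# Tiny made-up corpus; a real assistant would be trained on far more text.
corpus = "machine learning with phil machine learning is fun learning is great"
tokens = corpus.split()

# Count which word follows which (a bigram model).
bigrams = defaultdict(Counter)
for prev, nxt in zip(tokens, tokens[1:]):
    bigrams[prev][nxt] += 1

def suggest(prev_word):
    """Suggest the most likely next word, smart-compose style."""
    if prev_word not in bigrams:
        return ""
    return bigrams[prev_word].most_common(1)[0][0]

print(suggest("machine"))   # -> "learning"
print(suggest("learning"))  # -> "is"
```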
Is tf.data.Dataset.from_tensor_slices equivalent to a PyTorch Dataset, and dataset.batch equivalent to the DataLoader? Thanks!
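Roughly, yes: from_tensor_slices plays the role of a map-style Dataset, and the batching, shuffling and prefetching you would get from a DataLoader are chained onto the same tf.data pipeline. A small sketch with toy arrays:

```python
import numpy as np
import tensorflow as tf

features = np.arange(10, dtype=np.float32).reshape(10, 1)
labels = np.arange(10, dtype=np.int32)

# Roughly the PyTorch Dataset role: a source of individual (x, y) examples.
ds = tf.data.Dataset.from_tensor_slices((features, labels))

# Roughly the DataLoader role: shuffling, batching and prefetching are
# just more transformations chained onto the same object.
loader_like = ds.shuffle(10).batch(4).prefetch(tf.data.AUTOTUNE)

for x, y in loader_like:
    print(x.shape, y.shape)  # (4, 1) (4,) ... then a final partial batch
```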
What sort of speed do you get on the 2080? I'm getting 190 seconds per epoch on a GTX 1070.
Just realised it's running on the CPU, not the GPU. I guess that's because the 1070 doesn't have tensor cores, so it's reverting to the CPU (even though I installed the GPU version of TensorFlow). I wondered why my front room was getting warm :D
No. This is a config issue. Make sure you only have tensorflow-gpu installed. If you install the base package after the GPU package, it reverts to the CPU.
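As a quick sanity check (standard TensorFlow calls, nothing specific to this video), you can ask TensorFlow what it actually sees:

```python
import tensorflow as tf

# An empty list here means TensorFlow will silently fall back to the CPU.
print("GPUs visible:", tf.config.list_physical_devices('GPU'))

# Log where each op is placed; the matmul below should report /GPU:0 if the GPU is used.
tf.debugging.set_log_device_placement(True)
a = tf.random.normal((1000, 1000))
print(tf.reduce_sum(tf.matmul(a, a)).numpy())
```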
@@MachineLearningwithPhil hmm, weird. I only installed the GPU version using:
conda create -n tf-gpu tensorflow-gpu
conda activate tf-gpu
100% CPU usage, 31% GPU load.
@@MachineLearningwithPhil github.com/tensorflow/models/issues/1942
Comment for the Algorithm! Nice one Phil!
I know this is an old video, but I really hope you see this. What are you using to write the code in?
It's vim!
Love it!
What keyboard are you using?
The Cooler Master Storm. It's an old model from 2014, so they have probably updated it since then.
@@MachineLearningwithPhil thanks!
Thanks. Another environment appears to be GPT-3.
Or better yet...did Shakespeare write Shakespeare?
Why don't you just use Jupyter notebooks?
Why would I? They're slower, more cumbersome, less portable, and the state persistence leads to some difficult-to-diagnose bugs.
@@MachineLearningwithPhil I'm just learning and using Spyder with Anaconda. It seems great; do you know of any drawbacks with it?
The persistent memory in the notebooks can lead to bugs. It's just another bit of software that runs in the background, and requires using a browser. It's inferior in every way IMO.
@@MachineLearningwithPhil Thanks, what are you using?
Just the terminal and vim for text editing