My coding skill just increased. Thank you bro!
Amazing tutorial, thanks!
Really nice video. Can you make video for server sent event when api callback?
Hi, these are real gold nuggets!!! Thank you, thank you very much!!! ❤
Hi! I grasp the importance of testing the application locally. However, once the local testing phase is complete, I'm unsure of the steps to deploy it on an internet host. I've been unable to locate any instructional videos outlining this process.
Thanks!🤔
You can check out my video "Deploying a PHP web app on AWS EC2" to get an idea of how deployment can be done. In it I set up a whole server, though, so a simpler way might be using a "regular" web hosting provider. I might make a video on that at some point, too.
This is just awesome. Thanks!
Sensei, I want to ask you: occasionally I receive the text "Sorry, but I don't know how to answer that" in the middle of the text. It can still generate a response, but it includes "Sorry, but I don't know how to answer that" inside it. Below is a sample of the answer. Is it because of special characters it generates?
"Cobra is a venomous snake, known for its distinct hood, which itSorry, but I don't know how to answer that. flare out when threatened. There are more than 20 species of cobras found in different parts of the world,"
That happened to me too while making the video, but I'm not able to replicate it right now. It has something to do with the way I check for errors (badly) in the response. You could try adding an error_log("Weird error:".print_r($json, true)); after line 69 in message.php where that text is printed and see what is logged when that happens. Or you can just set $content to an empty string on line 69. I need to look into this further.
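For anyone hitting this, the kind of check described above might look something like this. This is a sketch, not the exact code from message.php; the `extract_content` helper and the chunk shape are assumptions based on the OpenAI streaming API:

```php
<?php
// Sketch: extract text from one decoded streaming chunk without
// injecting an error string into the visible answer.
// $json is assumed to be json_decode() of one SSE data line.
function extract_content( $json ): string {
    // If the API returned an error object, log the raw chunk
    // for debugging and return an empty string instead
    if ( isset( $json->error ) ) {
        error_log( "Weird error: " . print_r( $json, true ) );
        return "";
    }
    // Normal case: pull the text delta out of the first choice
    return $json->choices[0]->delta->content ?? "";
}
```

With this, an error chunk gets logged instead of leaking "Sorry, but I don't know how to answer that" into the middle of the response text.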
I was trying to learn PHP streams, and now I have found a gold mine in your video. Please make a video on server-sent events.
happy for you
Hey, hope you're fine. I am facing an issue when converting Markdown to HTML. When I did it without the streaming method, it displayed perfectly after converting to HTML, but now it's causing problems. For example, when the response contains Markdown for a table, it's not converted to an HTML table. How does ChatGPT handle this?
Hi, I received a "414 Request-URI Too Long" error. When I checked the GET message, it includes all previous messages: "message=Alright%20thank%20you%0A&context=%5B%5B"hi%5Cn"%2C"Hello!%20How%20can%20I%20assist%20you%20today%3". How can we clear this GET message so that it always refreshes and only sends the message from the latest request?
I will have to fix this in a future video, so that the message history is not sent in every request, but instead saved in a session.
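A rough sketch of that session approach, assuming the endpoint is message.php and the client sends only its newest message in the query string (`append_to_context` is a hypothetical helper, not code from the repo):

```php
<?php
// Sketch: keep the conversation history in the PHP session so the
// browser only sends the newest message (avoids 414 errors from
// an ever-growing query string).
session_start();

// history is a list of [role, content] pairs stored server-side
$_SESSION['context'] = $_SESSION['context'] ?? [];

// pure helper so the append step is easy to test in isolation
function append_to_context( array $context, string $message ): array {
    $context[] = [ "user", $message ];
    return $context;
}

$message = $_GET['message'] ?? "";
$_SESSION['context'] = append_to_context( $_SESSION['context'], $message );

// $_SESSION['context'] (not the query string) is then sent to the API
```

The query string then stays a constant size no matter how long the conversation gets.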
@unconv Thanks, Sensei. I have learned a lot from you. Really appreciate it.
Hi, I downloaded the chat-wtf GitHub repo and ran it on my localhost, but it doesn't seem to stream? Is the code the latest version? Thanks!
Does it not respond at all or just doesn't stream?
@unconv Yes, it does respond, but just in one block, not streaming each character like ChatGPT.
@unconv Another update on this. If I ask chat-wtf to produce a large output (e.g. write me a short story with 10 paragraphs), I've realised it shows about 5 paragraphs after a few seconds and then the remaining 5 a few seconds after that. So it seems like it's streaming very large chunks of the output rather than individual characters.
Interesting... Can you check the request to message.php in the Chrome dev tools Network tab and see what is shown in the EventStream tab? It might also be a Markdown rendering performance issue.
Hi, I am having the same issue. It is responding, but there is no stream. I'm running on IIS 10 with PHP 8.3. I read that it is related to compression and disabled it, but it is still not working. Did you manage to fix it?
Amazing!
Awesome.
Please make one with the Anthropic API stream!
Are you aware that your mic is terrible?
Yes