This is the best explanation of CoT I have seen by far. I learned to do these things long ago, but now I have names for them. For example, I built a book maker, at first using least-to-most prompting and then adding in the other methods as well. I even had it output LaTeX for typesetting. More recently, I have implemented past and future contemplation to identify what might have been but was missed, or what will be needed to reach a goal. I also use substitution problem solving: identifying commonalities in a missing component and finding other things with the same commonalities. And I have a mechanism for free will. That is: derive options, weigh them against each other in terms of likelihood multiplied by efficacy, and go with the highest. Contemplation adds to the number of options that can be derived.
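The option-weighing mechanism described above (derive options, score each by likelihood multiplied by efficacy, pick the highest) can be sketched in a few lines. This is a minimal illustration, not the commenter's actual implementation; the option names and scores here are made up for the example.

```python
# Hypothetical sketch of the "free will" selection mechanism described above:
# score each candidate option by likelihood * efficacy and pick the highest.

from dataclasses import dataclass

@dataclass
class Option:
    name: str
    likelihood: float  # estimated probability the option succeeds (0..1)
    efficacy: float    # estimated value of the outcome if it succeeds

def choose_option(options):
    """Return the option with the highest expected value (likelihood * efficacy)."""
    return max(options, key=lambda o: o.likelihood * o.efficacy)

# Example options with invented scores:
options = [
    Option("retry with smaller batch", likelihood=0.9, efficacy=0.4),  # 0.36
    Option("switch algorithms",        likelihood=0.5, efficacy=0.9),  # 0.45
    Option("ask for more context",     likelihood=0.8, efficacy=0.7),  # 0.56
]

best = choose_option(options)
print(best.name)  # -> ask for more context
```

Contemplation, as the commenter notes, would feed this mechanism by enlarging the `options` list before selection.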
Fantastic job explaining this. Thank you
Glad it was helpful!
Can I use CoT prompting techniques when fine-tuning an LLM?
Do you have examples (open source, etc.) of multimodal CoT that I can test?