Well this is a wonderful video. Thanks!
How do i prompt Any Node to load the IPAdapter model?
In this case you should just load the ipadapter. AnyNode doesnt spawn other nodes.... yet ^_^
I wonder if you could use this for automatic astrophotography image processing.
Hey great work does this work with LM Studio?
It will work with any API that has an OpenAI-compatible ChatCompletions endpoint. So that seems like a yes. Just use the Local LLM AnyNode and you can point it.
I cannot establish a connection to my local instance of Ollama (running in Docker). I have a custom server URL (managed by Traefik). I have no idea how to fill in these inputs on AnyNode (especially the API key and server). I am not exactly a technical person.
Here is the error I get for debugging:

Last Error: None
Generating Node function...
INPUT 30
LE: None
Imports in code: []
Stored script:
An error occurred: API request failed with status code 405: {"detail":"Method Not Allowed"}
An error occurred:
Traceback (most recent call last):
  File "E:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\anynode\nodes\any.py", line 221, in safe_exec
    exec(code_string, globals_dict, locals_dict)
  File "<string>", line 1
    An error occurred: API request failed with status code 405: {"detail":"Method Not Allowed"}
       ^^^^^
SyntaxError: invalid syntax
!!! Exception during processing!!! invalid syntax (<string>, line 1)
Traceback (most recent call last):
  File "E:\ComfyUI\ComfyUI_windows_portable\ComfyUI\execution.py", line 151, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\ComfyUI\ComfyUI_windows_portable\ComfyUI\execution.py", line 81, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\ComfyUI\ComfyUI_windows_portable\ComfyUI\execution.py", line 74, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\anynode\nodes\any.py", line 260, in go
    raise e
  File "E:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\anynode\nodes\any.py", line 252, in go
    self.safe_exec(self.script, globals_dict, locals_dict)
  File "E:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\anynode\nodes\any.py", line 225, in safe_exec
    raise e
  File "E:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\anynode\nodes\any.py", line 221, in safe_exec
    exec(code_string, globals_dict, locals_dict)
  File "<string>", line 1
    An error occurred: API request failed with status code 405: {"detail":"Method Not Allowed"}
       ^^^^^
SyntaxError: invalid syntax
Come on in to the discord if you will: discord.gg/f9HhNQdV7h
For Ollama, the API key can be left as the default... and the server should be localhost:11434, like the default... When you start Ollama with Docker, what command do you use?
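A quick way to rule out the proxy is to check whether the URL AnyNode should use actually answers OpenAI-style chat completions. This is just a hedged sketch, not AnyNode's own code: the host/port, model name, and placeholder key below are assumptions — substitute your own Traefik route and a model you have pulled.

```python
import json

# Assumed local Ollama address -- replace with your Traefik route.
# Ollama's OpenAI-compatible API is served under /v1; a 405 usually
# means the request hit a path that only accepts a different method.
server = "http://localhost:11434/v1"
endpoint = server + "/chat/completions"

payload = {
    "model": "llama3",  # hypothetical: use a model you have pulled
    "messages": [{"role": "user", "content": "Say hello"}],
}
headers = {
    "Content-Type": "application/json",
    "Authorization": "Bearer ollama",  # placeholder; Ollama ignores the key
}

body = json.dumps(payload)
print(endpoint)

# With the `requests` package installed and the server running, send it:
# import requests
# resp = requests.post(endpoint, headers=headers, data=body)
# print(resp.status_code, resp.json()["choices"][0]["message"]["content"])
```

If the endpoint returns 405 here too, the request is reaching the wrong path through the proxy, not a problem with AnyNode itself.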
It's on Manager already!
Those are pecan nuts, much tastier than walnuts
This would be very interesting, but your going off on tangents and jumping around in a mental maze makes it really hard to follow and grasp what you want to say. Maybe a script for your videos would help. 🤔🤷 I will still subscribe in the hope that it was just the two videos I watched…
RAW output = possible?!
The LLM's output is a function (if that is what you mean), but that function can output whatever you want. In the latest video there's a part where I just ask the AnyNode to `output -1.5` because I can't find a node that outputs negative floats, so I use it as a float variable output to pipe into my NormalMap AnyNode as a depth parameter.
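For anyone curious what that looks like under the hood, here is a sketch (not the actual generated code — the real output may differ in name and shape) of the kind of function the LLM writes for a prompt like `output -1.5`:

```python
# Hypothetical illustration of an LLM-generated AnyNode function for the
# prompt "output -1.5"; the node then executes this and exposes the
# return value as its output.
def generated_function(input_data):
    # Ignore the input and return the constant the prompt asked for.
    return -1.5

print(generated_function(None))  # -1.5
```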
Fella this is insane.
An offline LLM = best
Except your sobel filter has color in it. so.... how useful are not correct outputs?
I ask it to change it to be black and white and it does. I mean, correct/incorrect is up to you in how you further iterate on what it outputs. You can just ask in a different way and it will use the old code to change/refactor it however you want.
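For reference, a black-and-white Sobel looks something like this — a minimal sketch in plain NumPy (not AnyNode's generated code): converting to luminance first guarantees a single-channel edge map, so the result cannot have color in it.

```python
import numpy as np

def sobel_grayscale(img):
    """Sobel edge magnitude on a grayscale copy of an RGB image.

    img: float array of shape (H, W, 3) with values in [0, 1].
    Returns an (H, W) edge-magnitude map -- single channel, so the
    output is black and white by construction.
    """
    # Standard luminance weights collapse RGB to one channel.
    gray = img @ np.array([0.299, 0.587, 0.114])
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T  # vertical-gradient kernel is the transpose
    pad = np.pad(gray, 1, mode="edge")
    h, w = gray.shape
    gx = np.zeros_like(gray)
    gy = np.zeros_like(gray)
    for i in range(h):
        for j in range(w):
            win = pad[i:i + 3, j:j + 3]
            gx[i, j] = np.sum(win * kx)
            gy[i, j] = np.sum(win * ky)
    return np.hypot(gx, gy)  # gradient magnitude
```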
Very interesting.
thanks
Oh my god.
Oh my god
ohhhkayyyyy......
Your videos are not very structured. You're making it up on the fly, it seems. If you wrote a script beforehand and stuck to it, your videos would get more views.
If you believe you can do better, then make the videos yourself
@@DOCTOR-FLEX I don't think I can do better. It was meant as constructive feedback.