Wolfram Function Repository
Instant-use add-on functions for the Wolfram Language
Function Repository Resource: OllamaSynthesize
Interact with local AI/LLM models via an Ollama server
ResourceFunction["OllamaSynthesize"][prompt] generates a response to the given prompt using an AI model. | |
ResourceFunction["OllamaSynthesize"][prompt,images] generates a response for the given prompt and list of images. | |
ResourceFunction["OllamaSynthesize"][list] generates a response for a list containing textual prompts and images. |
After installing Ollama and downloading a model, try a basic question:
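The original input and output cells survive only as images, so the calls in this section are illustrative sketches: the prompts, example images, and model names are assumptions, while the calling patterns follow the usage notes above. A basic call returns the model's answer as text:

In[1]:= ResourceFunction["OllamaSynthesize"]["Why is the sky blue?"] (* illustrative prompt; any text prompt works *)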
Ask a question about an image:
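A sketch using a built-in test image as a stand-in for whatever picture appeared in the original cell:

In[2]:= ResourceFunction["OllamaSynthesize"]["What is shown in this picture?", {ExampleData[{"TestImage", "House"}]}] (* "House" is a stand-in example image *)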
Ask another question about a different image:
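Another sketch, this time with the "Mandrill" test image as a stand-in:

In[3]:= ResourceFunction["OllamaSynthesize"]["Describe the animal in this image.", {ExampleData[{"TestImage", "Mandrill"}]}] (* stand-in example image *)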
Mix the question and image(s) in a single list:
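A sketch of the list form, combining an assumed prompt with two stand-in test images in a single argument:

In[4]:= ResourceFunction["OllamaSynthesize"][{"How do these two pictures differ?", ExampleData[{"TestImage", "House"}], ExampleData[{"TestImage", "Mandrill"}]}] (* prompt and images mixed in one list *)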
The default model is "Llava". Use the "OllamaModel" option to specify another one:
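For example, assuming a model such as "llama3" has already been downloaded with the Ollama command-line tool (the model name here is illustrative; any locally available model works):

In[5]:= ResourceFunction["OllamaSynthesize"]["Write a haiku about mountains.", "OllamaModel" -> "llama3"] (* assumes "llama3" was pulled locally *)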
If you specify a model that does not exist or has not been downloaded locally, an error is raised:
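For instance (the model name below is deliberately fictitious, and the exact form of the error in the original output is not recoverable; the call is expected to fail rather than return a response):

In[6]:= ResourceFunction["OllamaSynthesize"]["Hello!", "OllamaModel" -> "not-a-real-model"] (* fictitious model name; expect an error *)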
Requires Wolfram Language 14.0 (January 2024) or above.
This work is licensed under a Creative Commons Attribution 4.0 International License