AntonAntonov/ Chatnik


CLI scripts for conversing with persistent LLM personas

Contributed by: Anton Antonov

Command Line Interface (CLI) scripts to manage and interact with multiple, persistent Large Language Model (LLM) chat objects. Access to repository prompts is provided via automatic prompt expansion. Access to chat object elements (history, models, etc.) facilitates the creation of useful CLI pipelines.

Installation Instructions

To install this paclet in your Wolfram Language environment, evaluate this code:
PacletInstall["AntonAntonov/Chatnik"]


To load the code after installation, evaluate this code:
Needs["AntonAntonov`Chatnik`"]

Details

The paclet provides Command Line Interface (CLI) scripts for conversing with and managing persistent Large Language Model (LLM) personas.
The script "LLMChat" is used to interact with the chat objects.
The script "LLMChatMeta" is used to manage the chat objects: deletion, getting the context, retrieving the first or last message, clearing of messages, etc.
The script "LLMPrompt" is used to get the texts of repository prompts in the shell environment.
"Chatnik" uses the Wolfram Language persistent values functionalities in order to maintain persistent interaction with multiple LLM chat objects.
"Chatnik" can be seen as a package that "moves" the LLM-chat objects interaction system of the paclet "Chatbook" into typical OS shell interaction. (I.e. an OS shell is used instead of a Wolfram notebook.)
There are several consequences of this approach: (1) multiple LLMs and LLM providers can be used, (2) chat messages can use the prompts collection and prompt-spec DSL provided by the Wolfram Language, (3) OS shell functionalities are easily accessible.
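Chatnik's on-disk storage layout is internal to the paclet; for orientation only, here is a minimal sketch of the Wolfram Language persistent-values mechanism mentioned above. The key "Chatnik/demo" and the stored association are illustrative, not Chatnik's actual schema:

```wolfram
(* Store a value that survives across kernel sessions
   (e.g. across separate wolframscript invocations): *)
PersistentSymbol["Chatnik/demo"] = <|"ChatID" -> "yoda1", "Messages" -> {}|>;

(* In a later session the value is still available: *)
PersistentSymbol["Chatnik/demo"]["ChatID"]

(* Remove the persistent value when done: *)
PersistentSymbol["Chatnik/demo"] =.
```

Because the values live in the persistent store rather than in a notebook session, each CLI invocation can pick up the chat objects left behind by previous invocations.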

Paclet Guide

Examples

Basic Examples (4) 

The script "LLMChat" is used to interact with the chat objects. Create and chat with an LLM persona named "yoda1" (using the Yoda chat persona):

LLMChat 'hi, who are you?' --i=yoda1 --prompt=@Yoda # Hmmm. Yoda, I am. Jedi Master, wise and old. Much to teach, I have. Yes, hmmm.

Continue the conversation with "yoda1":

LLMChat 'since when do you use a green light saber?' --chat-id=yoda1 # Green, my lightsaber is, yes. Symbol of a Jedi Consular, it is. Wisdom and harmony, it represents. Since my training as a Jedi, I have wielded it. A sign of my connection with the Force, green blades are. Hmm. Use it wisely, I do.

Execute a chat object management command using the script "LLMChatMeta": view the whole interaction with "yoda1":

LLMChatMeta full-text --chat-id=yoda1

Clear the messages of "yoda1":

LLMChatMeta clear --chat-id=yoda1

Scope (4) 

Installation MacOSX & Linux (2) 

Copy the paclet's scripts to a directory that is already on the shell's PATH, for example "~/.local/bin":

On macOS, if no directory is specified for ChatnikCopyScripts (i.e. the directory is Automatic), then the paclet's scripts are copied into the "~/Applications" directory:
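A hedged sketch of these two calls, assuming ChatnikCopyScripts takes the target directory (or Automatic) as its argument; the exact signature is not shown on this page and may differ:

```wolfram
Needs["AntonAntonov`Chatnik`"]

(* Copy the CLI scripts into a directory already on PATH
   (argument form assumed for illustration): *)
ChatnikCopyScripts["~/.local/bin"]

(* With Automatic, on macOS the scripts go to "~/Applications": *)
ChatnikCopyScripts[Automatic]
```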

Chat pipelines (1) 

Longer CLI pipelines make "Chatnik" very useful. For example, summarize a README and then translate the summary into Russian:

cat README.md | LLMChat - --prompt=@Summarize && LLMChat "&Translate|Russian^"
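In the pipeline above, `-` makes LLMChat read its message from standard input, `@Summarize` expands to the repository's Summarize prompt, and `&Translate|Russian^` appears to use the Wolfram prompt-spec markup to apply the Translate prompt (with parameter Russian) to the preceding content. The shell plumbing itself is ordinary POSIX piping; here a trivial `tr` filter stands in for the LLMChat call, purely to show the pipeline shape (no LLM involved):

```shell
# 'tr' stands in for "LLMChat - --prompt=@Summarize";
# the '-' convention means "read the message body from stdin".
printf 'chatnik pipes text through llm personas\n' | tr 'a-z' 'A-Z'
# → CHATNIK PIPES TEXT THROUGH LLM PERSONAS
```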

Chat objects management (1)

The script "LLMChatMeta" is used to manage the chat objects. Here is its usage message:

LLMChatMeta --help
# Chat with persistent LLM-chat objects.
#
# * Mandatory positional arguments:
#   NAME      DOCUMENTATION
#   command   Command, one of: card, clear, context, delete, file, first-message, last-message, list, load-llm-personas, message, messages.
#
# * Optional arguments (must be passed as --name=... in any order):
#   NAME      DEFAULT     DOCUMENTATION
#   chat-id               Chat ID.
#   all       false       Whether to apply the command to all chat objects or not.
#   n         -Infinity   Messages to clear. (For 'clear' and 'messages' only.)
#   index     -1          Message index. (For 'message' only.)
#   format                Format of the result. (For 'list' and 'context' only.)

List the chat objects in JSON format:

LLMChatMeta list --format=json
# {
#   "yoda1":{
#     "ChatID":"aa02f6c7-9223-4ebe-b216-607253b09fa7",
#     "Messages":7,
#     "LLMConfiguration":{
#       "Service":"OpenAI",
#       "Name":"gpt-4.1-mini"
#     },
#     "Usage":"274 tokens"
#   }
# }

Publisher

Anton Antonov

Disclosures

  • External services
  • Wolfram Language system configuration
  • Paclet dependencies
  • Local system interactions

Compatibility

Wolfram Language Version 14.3

Version History

  • 1.0.3 – 12 May 2026
  • 1.0.2 – 12 May 2026
  • 1.0.1 – 12 May 2026
  • 1.0.0 – 11 May 2026

License Information

MIT License
