@Tindell Tindell commented Mar 21, 2023

To implement this, I first changed how the prompt and file arguments are processed, so that supplying either one turns off the default interactive mode and color output. Then I added the provided prompt to embd_inp after the instruction and prompt prefix, as sketched below.
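Roughly, the assembly of the initial context looks like the sketch below. This is a minimal, self-contained illustration, not the project's actual code: `tokenize_text`, `params_sketch`, and the field names are placeholders for the vocab-based tokenize helper and parameter struct in utils.h.

```cpp
#include <cstdio>
#include <string>
#include <vector>

// Dummy tokenizer used only to keep this sketch self-contained; the real code
// uses the vocab-based tokenize helper from utils.h.
static std::vector<int> tokenize_text(const std::string & text) {
    std::vector<int> ids;
    for (unsigned char c : text) {
        ids.push_back((int) c);
    }
    return ids;
}

// Placeholder parameter struct; field names and default strings are illustrative.
struct params_sketch {
    std::string instruct_prefix = "Below is an instruction that describes a task.";
    std::string prompt_prefix   = "\n\n### Instruction:\n\n";
    std::string prompt;          // filled from -p, or read from the -f file
    bool interactive = true;     // switched off when -p / -f is supplied
};

// Build the initial context: instruction text, prompt prefix, then the user's prompt.
static std::vector<int> build_embd_inp(const params_sketch & params) {
    std::vector<int> embd_inp;

    auto append = [&](const std::string & text) {
        const std::vector<int> toks = tokenize_text(text);
        embd_inp.insert(embd_inp.end(), toks.begin(), toks.end());
    };

    append(params.instruct_prefix);
    append(params.prompt_prefix);
    append(params.prompt); // the provided prompt goes after the prefixes

    return embd_inp;
}

int main() {
    params_sketch params;
    params.prompt      = "write a python script that prints the current time";
    params.interactive = false; // a prompt was supplied, so no interactive loop

    const auto embd_inp = build_embd_inp(params);
    std::printf("initial context has %zu tokens\n", embd_inp.size());
    return 0;
}
```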

I also moved some default configuration settings into utils.h for consistency with llama.cpp (see the sketch below).
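llama.cpp keeps its defaults as member initializers on a `gpt_params` struct in utils.h, so the argument parser and the main program share one set of values. The sketch below shows that pattern only; the fields and default values are illustrative and may not match exactly what this PR moves.

```cpp
// utils.h (sketch): default runtime/sampling settings live on one params struct
// so they are defined in a single place. Values here are illustrative only.
#pragma once

#include <cstdint>
#include <string>
#include <thread>

struct gpt_params_sketch {
    int32_t seed          = -1;   // RNG seed; -1 means pick one at runtime
    int32_t n_threads     = (int32_t) std::thread::hardware_concurrency();
    int32_t n_predict     = 128;  // max tokens to generate
    int32_t repeat_last_n = 64;   // window for the repetition penalty

    float top_p          = 0.95f;
    float temp           = 0.80f;
    float repeat_penalty = 1.30f;

    std::string model = "ggml-alpaca-7b-q4.bin"; // default model path (illustrative)
    std::string prompt;                          // set via -p / -f

    bool interactive = true; // default chat loop; disabled when a prompt is given
};
```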

For example, you can run `./chat -p "write a python script that prints the current time"` or `./chat -f prompt.txt` to make it print just the response to that prompt. Or run `./chat -p "write a python script that prints the current time" --interactive` to respond to the first prompt and then continue interactively.

See issues #68 and #95.

@antimatter15 antimatter15 merged commit 9116ae9 into antimatter15:master Mar 21, 2023