Simple Ollama hosting
Examples
Llama 3.1 70B Instruct
# bash
curl https://ollama.ncsa.ai/api/chat -d '{
  "model": "llama3.1:70b",
  "messages": [
    { "role": "user", "content": "Write a long detailed bash program" }
  ]
}'
# python
from ollama import Client

client = Client(host='https://ollama.ncsa.ai')
response = client.chat(model='llama3.1:70b', messages=[
    {
        'role': 'user',
        'content': 'Why is the sky blue?',
    },
])
# the assistant's reply text is under message.content
print(response['message']['content'])
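The same client can also stream the reply as it is generated rather than waiting for the full message. A minimal sketch, assuming the same host and the llama3.1:70b model from the example above:

# python
from ollama import Client

client = Client(host='https://ollama.ncsa.ai')

# stream=True returns the response incrementally as a generator of chunks
stream = client.chat(
    model='llama3.1:70b',
    messages=[{'role': 'user', 'content': 'Why is the sky blue?'}],
    stream=True,
)
for chunk in stream:
    # each chunk carries a partial piece of the assistant's reply
    print(chunk['message']['content'], end='', flush=True)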