cURL

```shell
curl --request POST \
  --url https://api.ragbuddy.ai/proxy/{cache_type}/{version}/{path} \
  --header 'Authorization: Bearer <token>' \
  --header 'Content-Type: application/json' \
  --header 'helvia-rag-buddy-token: <helvia-rag-buddy-token>' \
  --data '{
    "prompt": "<string>",
    "model": "<string>"
  }'
```
```json
{
  "id": "<string>",
  "choices": [
    {
      "finish_reason": "stop",
      "index": 123,
      "logprobs": {
        "text_offset": [123],
        "token_logprobs": [123],
        "tokens": ["<string>"],
        "top_logprobs": [{}]
      },
      "text": "<string>"
    }
  ],
  "created": 123,
  "model": "<string>",
  "object": "<unknown>",
  "system_fingerprint": "<string>",
  "usage": {
    "completion_tokens": 123,
    "prompt_tokens": 123,
    "total_tokens": 123
  }
}
```
RAG-Buddy Proxy endpoint that transparently reads from and writes to the cache, and forwards requests to the LLM API.
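The cURL call above can be sketched in Python. The helper below only assembles the request pieces (URL, headers, JSON body) from the documented fields; the `version` and `path` values, the model name, and the commented-out `requests.post` call are illustrative placeholders, not part of the official docs.

```python
import json

# Base URL from the endpoint description above.
BASE_URL = "https://api.ragbuddy.ai/proxy"


def build_proxy_request(cache_type: str, version: str, path: str,
                        prompt: str, model: str,
                        token: str, rag_buddy_token: str) -> dict:
    """Assemble the URL, headers, and JSON body for a proxy call."""
    # cache_type options listed in the path-parameter section of this page.
    if cache_type not in ("ragc", "tc", "sem"):
        raise ValueError(f"unknown cache_type: {cache_type}")
    return {
        "url": f"{BASE_URL}/{cache_type}/{version}/{path}",
        "headers": {
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
            "helvia-rag-buddy-token": rag_buddy_token,
        },
        "body": json.dumps({"prompt": prompt, "model": model}),
    }


# "v1" and "completions" are hypothetical values for {version} and {path}.
req = build_proxy_request("ragc", "v1", "completions",
                          prompt="Hello", model="<string>",
                          token="<token>",
                          rag_buddy_token="<helvia-rag-buddy-token>")
# The request could then be sent with e.g.:
#   requests.post(req["url"], headers=req["headers"], data=req["body"])
```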
Authorization (header): Bearer authentication header of the form `Bearer <token>`, where `<token>` is your auth token.
cache_type (path). Available options: `ragc`, `tc`, `sem`
Successful Response
finish_reason. Available options: `stop`, `length`, `content_filter`
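As a rough illustration (not part of the official docs), a client might branch on `finish_reason` when reading the response body. The sample payload below simply mirrors the response schema shown above with made-up values:

```python
import json

# Minimal sample mirroring the documented response schema; values are illustrative.
sample = """
{
  "id": "cmpl-1",
  "choices": [{"finish_reason": "length", "index": 0, "text": "partial out"}],
  "created": 123,
  "model": "<string>",
  "usage": {"completion_tokens": 16, "prompt_tokens": 8, "total_tokens": 24}
}
"""

resp = json.loads(sample)
choice = resp["choices"][0]

# "stop": natural finish; "length": token limit hit; "content_filter": output withheld.
if choice["finish_reason"] == "length":
    note = "truncated: consider a larger token budget"
elif choice["finish_reason"] == "content_filter":
    note = "filtered: no usable text returned"
else:
    note = "complete"
```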