Hi all,
I have a locally hosted Llama 3.1 70B model running on our server at `10.1xx.1xx.50:8084/generate`. Since no API key is required for this setup, I use the following code to get responses from the model:
response = requests.post("http://10.1xx.1xx.50:8084/generate", headers=headers, json=payload)
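For context, here is a minimal, self-contained sketch of how I call the endpoint. The payload shape is an assumption on my part: a `/generate` route usually suggests a TGI-style (text-generation-inference) server with an `inputs`/`parameters` body, but your server's schema may differ, so treat the field names below as placeholders.

```python
import requests

def build_payload(prompt, max_new_tokens=256):
    # Assumed TGI-style request body -- adjust field names to match
    # whatever schema your /generate endpoint actually expects.
    return {"inputs": prompt, "parameters": {"max_new_tokens": max_new_tokens}}

def query_model(prompt, base_url="http://10.1xx.1xx.50:8084"):
    # POST the JSON payload; raise on any non-2xx status so failures
    # surface immediately instead of returning an error body silently.
    resp = requests.post(f"{base_url}/generate", json=build_payload(prompt), timeout=60)
    resp.raise_for_status()
    return resp.json()
```

(The `1xx` octets above are redacted placeholders from my setup, not real values.)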
How can I integrate this endpoint into Jupyter-AI?
Any help would be really appreciated!