Well,
Something has changed since I opened this topic, but I’ll put it here until it’s obvious where it belongs. I haven’t changed any Jupyter or jupyter-ai settings, but I did install some packages into the conda environment and restart JupyterLab: a bunch of ML libraries (PyTorch, TensorFlow, and all their dependencies). I then got this weird error, along with a notification that JupyterLab could be updated. I shut down the server, updated, and restarted, and got the same error.
Traceback (most recent call last):
File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/jupyter_ai/chat_handlers/base.py", line 181, in on_message
await self.process_message(message)
File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/jupyter_ai/chat_handlers/ask.py", line 75, in process_message
result = await self.llm_chain.acall({"question": query})
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain_core/_api/deprecation.py", line 177, in awarning_emitting_wrapper
return await wrapped(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain/chains/base.py", line 433, in acall
return await self.ainvoke(
^^^^^^^^^^^^^^^^^^^
File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain/chains/base.py", line 217, in ainvoke
raise e
File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain/chains/base.py", line 208, in ainvoke
await self._acall(inputs, run_manager=run_manager)
File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain/chains/conversational_retrieval/base.py", line 224, in _acall
answer = await self.combine_docs_chain.arun(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain_core/_api/deprecation.py", line 177, in awarning_emitting_wrapper
return await wrapped(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain/chains/base.py", line 682, in arun
await self.acall(
File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain_core/_api/deprecation.py", line 177, in awarning_emitting_wrapper
return await wrapped(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain/chains/base.py", line 433, in acall
return await self.ainvoke(
^^^^^^^^^^^^^^^^^^^
File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain/chains/base.py", line 217, in ainvoke
raise e
File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain/chains/base.py", line 208, in ainvoke
await self._acall(inputs, run_manager=run_manager)
File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain/chains/combine_documents/base.py", line 154, in _acall
output, extra_return_dict = await self.acombine_docs(
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain/chains/combine_documents/stuff.py", line 267, in acombine_docs
return await self.llm_chain.apredict(callbacks=callbacks, **inputs), {}
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain/chains/llm.py", line 335, in apredict
return (await self.acall(kwargs, callbacks=callbacks))[self.output_key]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain_core/_api/deprecation.py", line 177, in awarning_emitting_wrapper
return await wrapped(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain/chains/base.py", line 433, in acall
return await self.ainvoke(
^^^^^^^^^^^^^^^^^^^
File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain/chains/base.py", line 217, in ainvoke
raise e
File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain/chains/base.py", line 208, in ainvoke
await self._acall(inputs, run_manager=run_manager)
File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain/chains/llm.py", line 300, in _acall
response = await self.agenerate([inputs], run_manager=run_manager)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain/chains/llm.py", line 167, in agenerate
return await self.llm.agenerate_prompt(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 724, in agenerate_prompt
return await self.agenerate(
^^^^^^^^^^^^^^^^^^^^^
File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 684, in agenerate
raise exceptions[0]
File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 883, in _agenerate_with_cache
result = await self._agenerate(
^^^^^^^^^^^^^^^^^^^^^^
File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain_openai/chat_models/base.py", line 752, in _agenerate
response = await self.async_client.create(**payload)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/openai/resources/chat/completions.py", line 1339, in create
return await self._post(
^^^^^^^^^^^^^^^^^
File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/openai/_base_client.py", line 1815, in post
return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/openai/_base_client.py", line 1509, in request
return await self._request(
^^^^^^^^^^^^^^^^^^^^
File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/openai/_base_client.py", line 1610, in _request
raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404 - {'error': {'message': 'Invalid URL (POST /v1/embeddings/chat/completions)', 'type': 'invalid_request_error', 'param': None, 'code': None}}
Either I don’t have the right URL after all, or it’s getting mangled somehow before the POST is actually made. I tried removing the URL and restarting the server, then adjusting the URL and restarting again, to no avail; each time I get some variation of that message. I don’t know what happened.
Peace
EDIT: The URL in the original post was wrong! I was going to go back and edit it, since it was already marked as the solution, but this will have to stand instead: the “Base API URL (optional)” field should be https://api.openai.com/v1 in order to work with both OpenAI chat and embedding models. I think. Godspeed.
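For what it’s worth, the 404 path in the traceback is consistent with that diagnosis: the client appends each endpoint’s path onto whatever base URL you configure, so a base that already ends in /v1/embeddings sends chat requests to /v1/embeddings/chat/completions. A minimal sketch of that joining behavior (plain string handling for illustration, not the actual client code):

```python
def build_url(base_url: str, endpoint: str) -> str:
    """Append an endpoint path to a configured base URL (illustrative only)."""
    return base_url.rstrip("/") + "/" + endpoint.lstrip("/")

# Correct base: each kind of request lands on its own endpoint.
good = "https://api.openai.com/v1"
print(build_url(good, "/chat/completions"))  # https://api.openai.com/v1/chat/completions
print(build_url(good, "/embeddings"))        # https://api.openai.com/v1/embeddings

# Base that already ends in /embeddings: chat requests get the exact
# bad path from the 404 error above.
bad = "https://api.openai.com/v1/embeddings"
print(build_url(bad, "/chat/completions"))   # https://api.openai.com/v1/embeddings/chat/completions
```

So the symptom (“Invalid URL (POST /v1/embeddings/chat/completions)”) is just the chat endpoint being grafted onto an embeddings-specific base URL, which is why pointing the field at the plain /v1 base fixes both model types at once.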