Jupyter-ai extension issue: No option for embedding model API key

Howdy,

I have a Jupyter instance running on MX Linux, installed via mamba from Miniconda as a dependency of ipympl and jupyter-ai, and the jupyter-ai chat works well enough for me to have back-and-forth with OpenAI models. However, I cannot use the Jupyternaut /commands with OpenAI embedding models to work with local data, seemingly because the embedding and language models share the same API key in jupyter-ai. I tried asking ChatGPT about it, and it recommended setting an environment variable for each key, but the names it suggested are nowhere to be found in the jupyter-ai documentation. The screenshot shows a field for OPENAI_API_KEY, which I set via the web GUI, but no separate field for the embedding model.


I know that jupyter-ai is an official Jupyter project, so I hope you can show me what I’m missing.

Much obliged,

Upset_MOSFET

Hi @Upset_MOSFET, welcome to our Discourse!

I wanted to check whether you were able to resolve this issue.

From looking at the jupyter-ai GitHub repo, I found this issue that seems closely related to yours, albeit with Azure keys instead of OpenAI ones. If you still haven’t found a solution, perhaps you could open an issue on the repo and link to that one for reference?

Thanks!


Cheers,
I haven’t made any progress yet, but I think you’ve pointed me in the right direction. I’ll have to look into the Base URL setting and learn how all of that works on both ends. I’ll update with anything useful.


Best of luck, keep us posted!

Okay,

For anybody else who has the same or a similar problem, the answer is straightforward but not obvious. Check the OpenAI API Reference for how to create embeddings. Right now, it’s an HTTP POST to the following URL: https://api.openai.com/v1/embeddings
Just fill in the URL above in the Base API URL (optional) field, select your embedding model, and use the one key. Once I did that, Jupyternaut was able to /learn, and I was able to /ask. So jupyter-ai does not need a separate OpenAI API key for each of the language and embedding models, despite that suggestion coming from error messages and from ChatGPT itself.
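
For the curious, here’s roughly what that request looks like from Python. This is only a sketch using the requests library (the model name is just an example), not what jupyter-ai actually runs internally:

import os
import requests

# The same OPENAI_API_KEY that the chat models use; no second key needed.
api_key = os.environ["OPENAI_API_KEY"]

response = requests.post(
    "https://api.openai.com/v1/embeddings",
    headers={"Authorization": f"Bearer {api_key}"},
    json={"model": "text-embedding-ada-002", "input": "some local data"},
)
response.raise_for_status()
embedding = response.json()["data"][0]["embedding"]
print(len(embedding))  # dimensionality of the embedding vector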

Thank you for the update. This might be worth contributing to the jupyter-ai documentation.

Well,

Something has changed since I opened this topic, but I’ll put it here until it’s obvious where it belongs. I haven’t changed any Jupyter or jupyter-ai settings, but I installed some packages into the conda environment and restarted JupyterLab. I installed a bunch of ML libraries: PyTorch, TensorFlow, and all their dependencies. I got the weird error below, along with a notification that JupyterLab could be updated. I shut down the server, updated, and restarted, then got the same error.

Traceback (most recent call last):
  File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/jupyter_ai/chat_handlers/base.py", line 181, in on_message
    await self.process_message(message)
  File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/jupyter_ai/chat_handlers/ask.py", line 75, in process_message
    result = await self.llm_chain.acall({"question": query})
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain_core/_api/deprecation.py", line 177, in awarning_emitting_wrapper
    return await wrapped(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain/chains/base.py", line 433, in acall
    return await self.ainvoke(
           ^^^^^^^^^^^^^^^^^^^
  File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain/chains/base.py", line 217, in ainvoke
    raise e
  File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain/chains/base.py", line 208, in ainvoke
    await self._acall(inputs, run_manager=run_manager)
  File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain/chains/conversational_retrieval/base.py", line 224, in _acall
    answer = await self.combine_docs_chain.arun(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain_core/_api/deprecation.py", line 177, in awarning_emitting_wrapper
    return await wrapped(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain/chains/base.py", line 682, in arun
    await self.acall(
  File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain_core/_api/deprecation.py", line 177, in awarning_emitting_wrapper
    return await wrapped(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain/chains/base.py", line 433, in acall
    return await self.ainvoke(
           ^^^^^^^^^^^^^^^^^^^
  File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain/chains/base.py", line 217, in ainvoke
    raise e
  File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain/chains/base.py", line 208, in ainvoke
    await self._acall(inputs, run_manager=run_manager)
  File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain/chains/combine_documents/base.py", line 154, in _acall
    output, extra_return_dict = await self.acombine_docs(
                                ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain/chains/combine_documents/stuff.py", line 267, in acombine_docs
    return await self.llm_chain.apredict(callbacks=callbacks, **inputs), {}
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain/chains/llm.py", line 335, in apredict
    return (await self.acall(kwargs, callbacks=callbacks))[self.output_key]
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain_core/_api/deprecation.py", line 177, in awarning_emitting_wrapper
    return await wrapped(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain/chains/base.py", line 433, in acall
    return await self.ainvoke(
           ^^^^^^^^^^^^^^^^^^^
  File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain/chains/base.py", line 217, in ainvoke
    raise e
  File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain/chains/base.py", line 208, in ainvoke
    await self._acall(inputs, run_manager=run_manager)
  File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain/chains/llm.py", line 300, in _acall
    response = await self.agenerate([inputs], run_manager=run_manager)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain/chains/llm.py", line 167, in agenerate
    return await self.llm.agenerate_prompt(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 724, in agenerate_prompt
    return await self.agenerate(
           ^^^^^^^^^^^^^^^^^^^^^
  File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 684, in agenerate
    raise exceptions[0]
  File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 883, in _agenerate_with_cache
    result = await self._agenerate(
             ^^^^^^^^^^^^^^^^^^^^^^
  File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/langchain_openai/chat_models/base.py", line 752, in _agenerate
    response = await self.async_client.create(**payload)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/openai/resources/chat/completions.py", line 1339, in create
    return await self._post(
           ^^^^^^^^^^^^^^^^^
  File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/openai/_base_client.py", line 1815, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/openai/_base_client.py", line 1509, in request
    return await self._request(
           ^^^^^^^^^^^^^^^^^^^^
  File "/home/Upset_MOSFET/miniforge3/envs/my-project/lib/python3.12/site-packages/openai/_base_client.py", line 1610, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404 - {'error': {'message': 'Invalid URL (POST /v1/embeddings/chat/completions)', 'type': 'invalid_request_error', 'param': None, 'code': None}}

Either I don’t have the right URL after all, or something is being appended to it before the POST is actually made: the path in the error, /v1/embeddings/chat/completions, looks like the chat endpoint tacked onto the end of my base URL. I tried removing the URL and restarting the server, then adjusting the URL and restarting again, to no avail. I get some variation of that message every time. I don’t know what changed.
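
To illustrate what I think is happening (just a sketch of the URL joining, not jupyter-ai's actual code):

# The OpenAI client appends each endpoint's path to whatever base URL it is given.
base_url = "https://api.openai.com/v1/embeddings"  # what I put in the Base API URL field
chat_endpoint = "/chat/completions"                # appended for every chat request
print(base_url + chat_endpoint)
# -> https://api.openai.com/v1/embeddings/chat/completions, the invalid path in the 404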

Peace

EDIT: The URL in the original post was wrong! I was going to go back and edit it, since it was already marked as the solution, but this will have to stand instead: the Base API URL (optional) field should be https://api.openai.com/v1 in order to work with both OpenAI chat and embedding models. I think.
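
If you want to sanity-check that base URL outside of JupyterLab, here’s a sketch using the openai Python package (the model names are just examples) that exercises both endpoints with the one key and the one base URL:

from openai import OpenAI

# Reads OPENAI_API_KEY from the environment; one base URL serves both endpoints.
client = OpenAI(base_url="https://api.openai.com/v1")

chat = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "ping"}],
)
emb = client.embeddings.create(model="text-embedding-ada-002", input="ping")

print(chat.choices[0].message.content)
print(len(emb.data[0].embedding))  # same key, same base URL, both endpoints work

Godspeed.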