#33 In short, it'll:
1. [Frontend] Recognize that the user is trying to add a citation (trigger text is `\cite{`)
2. [Frontend] Temporarily suppress the default Overleaf dropdown suggestions
3. [Frontend] Get the last sentence as context for the LLM
4. [Backend] Fetch the bibliography in `.bib` files as raw text, and remove irrelevant fields to save tokens
5. [Backend] Call XtraMCP to get the paper abstract, using the paper title as key
6. [Backend] Query a fast LLM (hardcoded to `gpt-5.2` for now) to get at most 3 citation keys
7. [Frontend] Suppress default Overleaf tab-completion to allow users to accept citation suggestions
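Step 4 above (trimming irrelevant `.bib` fields to save tokens) could look roughly like the sketch below. The helper name `stripBibFields` and the list of dropped fields are illustrative assumptions, not the PR's actual implementation:

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// dropField matches BibTeX fields assumed to be low-signal for citation
// matching (field list is an assumption, not the PR's code).
var dropField = regexp.MustCompile(`(?i)^\s*(url|doi|file|abstract|note|issn|isbn)\s*=\s*\{[^}]*\},?\s*$`)

// stripBibFields removes those fields line by line from raw .bib text,
// shrinking the prompt sent to the LLM.
func stripBibFields(bib string) string {
	var kept []string
	for _, line := range strings.Split(bib, "\n") {
		if dropField.MatchString(line) {
			continue
		}
		kept = append(kept, line)
	}
	return strings.Join(kept, "\n")
}

func main() {
	entry := "@article{vaswani2017,\n  title = {Attention Is All You Need},\n  url = {https://example.org},\n}"
	fmt.Println(stripBibFields(entry))
}
```

A real implementation would likely parse entries properly (multi-line field values break a line-oriented filter), but the token-saving idea is the same.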
Co-authored-by: andre <95348273+4ndrelim@users.noreply.github.com>
Co-authored-by: Junyi Hou <junyi@xtras3.tail08d22c.ts.net>
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
### Key changes:
- Select models by ID instead of slug so users can have multiple API keys for the same slug
- Add loading spinner for save/edit actions
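Selecting by a unique ID rather than the slug might look like the sketch below; the struct fields and the `findByID` helper are hypothetical, not the PR's actual schema:

```go
package main

import "fmt"

// CustomModel is a hypothetical shape for a user-configured model. ID is
// unique per entry, so two entries may share a Slug (same provider model,
// different API keys).
type CustomModel struct {
	ID     string
	Slug   string
	APIKey string
}

// findByID selects a model by its unique ID instead of its (possibly
// duplicated) slug; returns nil if no entry matches.
func findByID(models []CustomModel, id string) *CustomModel {
	for i := range models {
		if models[i].ID == id {
			return &models[i]
		}
	}
	return nil
}

func main() {
	models := []CustomModel{
		{ID: "m1", Slug: "gpt-5.2", APIKey: "key-a"},
		{ID: "m2", Slug: "gpt-5.2", APIKey: "key-b"}, // same slug, different key
	}
	fmt.Println(findByID(models, "m2").APIKey)
}
```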
@wjiayis Hi Jia Yi, I recall you had some great feedback to improve the user experience / make it more intuitive for new users. Could you kindly check whether your concerns have been addressed too? Thanks!
4ndrelim left a comment
Clarification required.
Also, do we know the issue with DeepSeek and GLM? If not, no worries; we can proceed, but let's raise an issue for those two first.
```go
if customModel != nil {
	params := openaiv3.ChatCompletionNewParams{
		Model:       customModel.Slug,
		Temperature: openaiv3.Float(float64(customModel.Temperature)),
```
Have you tested with varying temperatures for all the models? We might have to handle some corner cases here. For example, I believe GPT-5.1 only allows a temperature setting of 1.0; any attempt to configure it otherwise will lead to an error. I am unsure whether the other models have this peculiar behaviour too.
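One defensive option is to resolve the temperature per model before building the request. The restricted-model table below is purely an assumption based on the comment above, not verified provider behaviour:

```go
package main

import "fmt"

// fixedTemperature lists models assumed to accept only a single temperature
// value (e.g. GPT-5.1 reportedly requires 1.0). Illustrative only.
var fixedTemperature = map[string]float64{
	"gpt-5.1": 1.0,
}

// effectiveTemperature returns the temperature to send, falling back to the
// model's required value when the requested one would be rejected.
func effectiveTemperature(slug string, requested float64) float64 {
	if t, ok := fixedTemperature[slug]; ok {
		return t
	}
	return requested
}

func main() {
	fmt.Println(effectiveTemperature("gpt-5.1", 0.2)) // clamped to the required value
	fmt.Println(effectiveTemperature("some-other-model", 0.2))
}
```

Alternatively, the field could simply be omitted from the request for such models, letting the provider apply its default.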
```go
ModelName string
Endpoint  string
APIKey    string
ModelName string
```
I recall from our last discussion that this field is to be unique so users can differentiate between different API keys of the same slug. Is this still the case? If so, how do we ensure it is unique?
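If `ModelName` is meant to be unique, one way is to validate on save (ideally backed by a unique constraint in storage as well). The helper and struct shape below are hypothetical, not the PR's code:

```go
package main

import "fmt"

// savedModel is a hypothetical shape for a stored custom model.
type savedModel struct {
	ModelName string
	Slug      string
}

// modelNameTaken reports whether a proposed ModelName already exists among
// the user's saved models, so names stay usable as disambiguators for API
// keys that share a slug.
func modelNameTaken(models []savedModel, name string) bool {
	for _, m := range models {
		if m.ModelName == name {
			return true
		}
	}
	return false
}

func main() {
	models := []savedModel{{ModelName: "work-key", Slug: "gpt-5.2"}}
	fmt.Println(modelNameTaken(models, "work-key"))
	fmt.Println(modelNameTaken(models, "personal-key"))
}
```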
I noticed in the screenshot here that there are 3 custom models. May I verify the expected behaviour:

Summary
Adds BYOK features (including base URL, API key, and parameter configuration); also incorporates changes suggested in #157.
Tested Providers
Stable: GPT, Claude, Gemini, MiniMax, OpenRouter
Unstable: DeepSeek, GLM
Screenshots
Closes #118 Closes #149 Closes #157