Llm client logging pii#1188

Merged
jeffbl merged 2 commits into main from llm_client_logging_pii on Mar 5, 2026
Conversation

@jeffbl (Member) commented Mar 5, 2026

Not fully tested, but this hopefully fixes at least some of the issues in #1182. Requires testing before deployment to production.

PROBLEM: LLM results were not being logged.
SOLUTION: I think logging was not set up properly in utils/client.py, so I've updated it; with PII logging enabled, you can now see the raw results coming back from the LLM.
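For reviewers, here is a minimal sketch of the gating described above. The flag name `log_pii` and the helper names are placeholders, not the actual identifiers in utils/client.py:

```python
import logging

logger = logging.getLogger("llm_client")

# "log_pii" is a hypothetical flag name; the real setting in
# utils/client.py may be spelled differently.
def format_llm_log(raw_response: dict, log_pii: bool) -> str:
    """Build the log line for an LLM result, redacting unless PII logging is on."""
    if log_pii:
        # Raw responses can contain user-supplied images and text,
        # so emitting them verbatim is strictly opt-in.
        return f"raw LLM response: {raw_response!r}"
    return "LLM response received (PII logging disabled)"

def log_llm_response(raw_response: dict, log_pii: bool) -> None:
    """Emit the (possibly redacted) log line through the module logger."""
    logger.info(format_llm_log(raw_response, log_pii))
```

The point is that the raw payload only ever reaches the log sink behind the explicit PII flag.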

PROBLEM: qwen3.2-vl worked when hitting vLLM directly on pegasus, but failed on mallory when going through open-webui in front of ollama.
SOLUTION: removed stop_tokens in client.py, which were preventing the LLM from generating a message field in the response - it was only outputting a reasoning field containing its internal discussion about object detection.

GUESS: the actual problem with some photos not returning may have been a tiny context window on the LLM. I saw log output on mallory indicating it was hitting context window limits. In open-webui, I've raised num_ctx to 16000; it was originally around 2000, which is a very small window.
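For reference, the same setting can be made per-request via Ollama's `options` field. This PR changed it in open-webui's model settings instead, so the sketch below is illustrative only (model name and defaults are assumptions):

```python
# Sketch of raising the context window per-request through Ollama's
# "options" field, as an alternative to changing it in open-webui.
def build_ollama_request(model: str, messages: list, num_ctx: int = 16000) -> dict:
    return {
        "model": model,
        "messages": messages,
        "stream": False,
        # Ollama's default context window is small (historically 2048);
        # image-heavy vision prompts can overflow it, silently truncating
        # the prompt and producing empty or nonsensical results.
        "options": {"num_ctx": num_ctx},
    }
```

Note that a larger num_ctx also increases memory use and can slow generation, which may be relevant to the slowness observed below.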

Tested on qwen3.5:4b against several photos on the IMAGE examples page, and against the problematic photos in #1182; they all worked. Note it is pretty slow - not sure if that is just mallory with its 3090, the larger context window, or something else.

Assistance: Used GPT-5 mini via github copilot to assist with finding changes.

Please ensure you've followed the checklist and provide all the required information before requesting a review.
If you do not have everything applicable to your PR, it will not be reviewed!
If you don't know what something is or if it applies to you, ask!

Please note that PRs from external contributors who have not agreed to our Contributor License Agreement will not be considered.
To accept it, include I agree to the [current Contributor License Agreement](/CLA.md) in this pull request.

Don't delete below this line.


Required Information

  • I referenced the issue addressed in this PR.
  • I described the changes made and how these address the issue.
  • I described how I tested these changes.

Coding/Commit Requirements

  • I followed applicable coding standards where appropriate (e.g., PEP8).
  • I have not committed any models or other large files.

New Component Checklist (mandatory for new microservices)

  • I added an entry to docker-compose.yml and build.yml.
  • I created a CI workflow under .github/workflows.
  • I have created a README.md file that describes what the component does and what it depends on (other microservices, ML models, etc.).

OR

  • I have not added a new component in this PR.

jeffbl self-assigned this Mar 5, 2026
jeffbl merged commit dd7b8b9 into main Mar 5, 2026
2 checks passed
jeffbl deleted the llm_client_logging_pii branch March 5, 2026 20:18