You may be able to find me on other platforms by the same name!

Mastodon: specialwall@woof.tech

Contact me on SimpleX or Signal!

  • 1 Post
  • 80 Comments
Joined 11 months ago
Cake day: March 5th, 2025

  • If Gemini truly can’t see PII (no way to add “notes,” for example), then I don’t think that would be too big of a concern for most people, at least for those who don’t have a disdain for LLMs in the first place. Though I do feel that people with “high threat models” (it would be good to be precise about what a “high threat model” means in this instance) would prefer a local app that talks to a local Ollama API rather than an internet-connected service.

    What precisely is Gemini “calculating” here and why can’t its function be replaced on a lightweight local LLM?

    Edit: After reading the information from the website, it sounds like there are a lot of opportunities for users to accidentally identify themselves to AI providers or open up de-anonymization attack vectors. If I were very concerned about my identity being linked to my recovery behavior, I would probably not use this service as it is now.
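    The local alternative mentioned above could look something like this. It is a hypothetical sketch, not part of the service under discussion: it assumes Ollama’s default port (11434) and a pulled model named “llama3”, so the prompt text, and any PII in it, never leaves the machine.

    ```python
    import json
    import urllib.request

    # Assumption: a local Ollama instance on its default port; nothing here
    # reaches the internet, so prompt contents (including PII) stay local.
    OLLAMA_URL = "http://localhost:11434/api/generate"

    def build_request(prompt: str, model: str = "llama3") -> dict:
        # stream=False asks Ollama for one complete JSON response
        # instead of a sequence of streamed chunks.
        return {"model": model, "prompt": prompt, "stream": False}

    def local_generate(prompt: str, model: str = "llama3") -> str:
        body = json.dumps(build_request(prompt, model)).encode()
        req = urllib.request.Request(
            OLLAMA_URL, data=body,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]

    if __name__ == "__main__":
        # Only works when an Ollama server is actually running locally.
        print(local_generate("Summarize my week in one sentence."))
    ```

    Whatever Gemini is “calculating” here, if it fits in a prompt, a lightweight local model reachable over this kind of API could presumably do the same job without the de-anonymization exposure.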

  • If the user trusts the server to serve safe JavaScript every time they connect with an empty cache (which privacy-conscious users clear often), I’m not sure how this differs much from the Trust On First Use security model used by many other apps, even if the app itself implements secure MITM mitigations using data from shared links.

    When you have an app with dedicated updates, it is possible to verify that it is genuinely from the developer or maintainer. Web browsers’ certificate validation protects against connecting to a fake server, but it does not protect the user if the server is compromised when they first connect.

    The most security-conscious users are going to end up hosting the JavaScript in a webserver on localhost, and at that point it might as well be a dedicated application.
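    A user going the localhost route could also pin the bundle the first time they audit it, which is Trust On First Use made explicit. This is a minimal sketch under assumed names (the pinned digest and file contents are placeholders): record a SHA-256 of the audited JavaScript once, then refuse any copy that no longer matches.

    ```python
    import hashlib

    # Hypothetical pinned digest, recorded when the bundle was first audited
    # (Trust On First Use). Any later change to the served file is detected.
    AUDITED_BUNDLE = b"console.log('hello');\n"
    PINNED_SHA256 = hashlib.sha256(AUDITED_BUNDLE).hexdigest()

    def verify_bundle(js_bytes: bytes, pinned: str = PINNED_SHA256) -> bool:
        """Return True only if the fetched JavaScript matches the pinned hash."""
        return hashlib.sha256(js_bytes).hexdigest() == pinned

    # Unchanged bundle passes; a tampered one fails.
    assert verify_bundle(AUDITED_BUNDLE)
    assert not verify_bundle(b"console.log('evil');\n")
    ```

    Of course, once you are hashing files and serving them yourself, you have rebuilt most of what a signed, dedicated application’s update channel already gives you.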