Run your own locally, but beware of copyright violations. My company uses some form of Copilot that is supposedly guaranteed to be copyright-safe. Personally I think it's very bad policy to make your skills dependent on something proprietary, which is exactly what these LLM providers are trying to do to make back their billions in investment.
This is one of those things I just gave up on, like fully "degoogling". I ran my own Ollama server for a while on my machine, for code completion, but it's just garbage without GPU acceleration, so I just use ChatGPT and pretend I believe their privacy settings.
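(For anyone curious, the setup itself is trivial; the problem is purely speed. A minimal sketch of what "code completion against a local Ollama server" looks like, assuming the default port 11434 and that you've already pulled a code model, here "codellama" as an example:)

    # Ask a locally running Ollama server to complete a code snippet.
    # Assumes Ollama is listening on its default port (11434) and that a
    # code model (e.g. "codellama") has been pulled beforehand.
    import json
    import urllib.request

    payload = json.dumps({
        "model": "codellama",          # example model name, swap for whatever you pulled
        "prompt": "def fizzbuzz(n):",  # the partial code you want completed
        "stream": False,               # return one JSON object instead of a stream
    }).encode()

    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["response"])

On CPU only, even a request like this takes long enough that it's useless as inline completion, which is why I gave up.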
I do not trust an agent near client code or even PII. Multiple VS Code windows still seem to activate them from time to time, or after a forced update. This is why I do not enable them on my local machine, only in a VM/container that contains nothing but the codebase under edit.
I avoid cloud-based stuff, though I use GitHub Copilot, largely because I've had it since the early days and it's a great tool (and my employer pays for it). But otherwise, local LLM or gtfo.
I use public LLMs the same way I'd use a search engine: to look up public information only, never proprietary data. Gemini in Google Workspace is the exception that can be used for proprietary information, but of course only when Gemini suits the task.
As for coding models, my view is that they're directly violating my copyright, especially since I checked one "open training data" corpus and it confirmed that my code is part of the corpus, without my license being honored.
Honestly? I'm somewhat OK with Mistral, in that it's a French company and therefore has to comply (on paper, anyway) with the GDPR, and if they don't, there are serious consequences. But overall: run locally. If it is not running on your hardware and it requires an internet connection to work, then the data is not yours. For the time being, building an AI rig at home is relatively cheap on eBay: if you play your cards well, it is doable for 1000 bucks or less.