What Kinds of Data Do AI Chatbots Collect?
-
Are there tutorials on how to do this? Should it be set up on a server on my local network??? How hard is it to set up? I have so many questions.
Check out Ollama; it's probably the easiest way to get started these days. It provides tooling and an API that different chat frontends can connect to.
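A minimal sketch of what that looks like, assuming you've installed Ollama and picked the small deepseek-r1:7b model mentioned further down the thread (swap in whatever model you like):

# download the model and chat with it in the terminal
ollama pull deepseek-r1:7b
ollama run deepseek-r1:7b

# or talk to the local HTTP API that chat frontends connect to (default port 11434)
curl http://localhost:11434/api/generate -d '{
  "model": "deepseek-r1:7b",
  "prompt": "Why is the sky blue?",
  "stream": false
}'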
-
Which is good (for now). Glad I don't use that shit
On that note, anyone reading this comment, [email protected] (and similar) exists. Take your privacy back!
-
They're talking about what is being recorded while the user is using the tools (your prompts, RAG data, etc.)
Does that include generated responses?
-
Are there tutorials on how to do this? Should it be set up on a server on my local network??? How hard is it to set up? I have so many questions.
-
Is there a way to fake all the data they try to collect?
Pretty sure this is what they scrape from your device if you install their app. I don't know how else they would get access to contacts and location and stuff. So yeah, you can just run it on a virtual Android device and feed it garbage data, but I assume the app or their backend will detect that and throw out your data.
-
Pretty sure this is what they scrape from your device if you install their app. I don't know how else they would get access to contacts and location and stuff. So yeah, you can just run it on a virtual Android device and feed it garbage data, but I assume the app or their backend will detect that and throw out your data.
Root your phone and install XPrivacy (or XPrivacyLua if your phone isn't 10 years old).
-
Does that include generated responses?
Nobody knows! There's no specific disclosure that I'm aware of (in the US at least), and even if there was I wouldn't trust any of these guys to tell the truth about it anyway.
As always, don't do anything on the Internet that you wouldn't want the rest of the world to find out about.
-
If by more learning you mean learning
ollama run deepseek-r1:7b
Then yeah, it's a pretty steep curve!
If you're a developer, you can also search "$MyFavDevEnv use local ai ollama" to find setup guides. I'm using the Continue extension for VS Codium (or Code), but there are easy-to-use modules for Vim and Emacs and probably everything else as well.
The main problem is leveling your expectations. The full DeepSeek is a 671b model (that's billions of parameters), and the model weights (the thing you download when you pull an AI) are 404 GB in size. You need roughly that much RAM available to run one of those.
They make distilled models though, which are much smaller but still useful. The 14b is 9 GB and runs fine with only 16 GB of RAM. They obviously aren't as impressive as the cloud-hosted big versions though.
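A rough sketch of that workflow; the model tags are examples and exact sizes depend on the quantization you pull:

# pull a distilled model sized for a 16 GB machine and chat with it
ollama pull deepseek-r1:14b
ollama run deepseek-r1:14b

# see what you have locally and how big each model is
ollama list

# rough rule of thumb: a 4-bit quantized model needs about
# 0.5 bytes per parameter plus a few GB of overhead, so
# 14b -> roughly 8-10 GB of RAM, 70b -> 40+ GB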
-
Only if my hardware could support it...
I can actually run some smaller models locally on my 2017 laptop (though I have increased the RAM to 16 GB).
You'd be surprised how much can be done with how little.
-
Are there tutorials on how to do this? Should it be set up on a server on my local network??? How hard is it to set up? I have so many questions.
If you want to start playing around immediately, try Alpaca on Linux or LM Studio on Windows. See if it works for you, then move on from there.
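If you go the Alpaca route on Linux, it's distributed as a Flatpak; a quick sketch, assuming the Flathub ID is com.jeffser.Alpaca (double-check it on Flathub):

# install and launch Alpaca from Flathub
flatpak install flathub com.jeffser.Alpaca
flatpak run com.jeffser.Alpaca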
-
Pretty sure this is what they scrape from your device if you install their app. I don't know how else they would get access to contacts and location and stuff. So yeah, you can just run it on a virtual Android device and feed it garbage data, but I assume the app or their backend will detect that and throw out your data.
How about if I only use the web version?
-
Are there tutorials on how to do this? Should it be set up on a server on my local network??? How hard is it to set up? I have so many questions.
I used this a while back and it was pretty straightforward: https://github.com/nathanlesage/local-chat
-
DeepSeek at home: None
Doesn't the official local app still have telemetry? I might be remembering wrong
-
Doesn't the official local app still have telemetry? I might be remembering wrong
You just use the model in an open-source program, not theirs.
-
A chart titled "What Kind of Data Do AI Chatbots Collect?" lists and compares seven AI chatbots—Gemini, Claude, CoPilot, Deepseek, ChatGPT, Perplexity, and Grok—based on the types and number of data points they collect as of February 2025. The categories of data include: Contact Info, Location, Contacts, User Content, History, Identifiers, Diagnostics, Usage Data, Purchases, Other Data.
- Gemini: Collects all 10 data types; highest total at 22 data points
- Claude: Collects 7 types; 13 data points
- CoPilot: Collects 7 types; 12 data points
- Deepseek: Collects 6 types; 11 data points
- ChatGPT: Collects 6 types; 10 data points
- Perplexity: Collects 6 types; 10 data points
- Grok: Collects 4 types; 7 data points
I'm interested in seeing how this changes when using the DuckDuckGo front end at duck.ai.
There's no login, and history is stored locally (probably remotely too).
-
And what about goddamn Mistral?
It's French as far as I know, so at least it abides by the GDPR by default.
-
DeepSeek at home: None
How much VRAM does your machine have? Are you using Open WebUI?
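For anyone else reading who hasn't tried it: Open WebUI is a self-hosted chat frontend that talks to a local Ollama instance. A rough sketch of running it with Docker, roughly following its quickstart (the image tag and flags may have changed, so check the Open WebUI docs):

# run Open WebUI and let it reach Ollama on the host machine
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
# then open http://localhost:3000 in a browser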
-
It's French as far as I know, so at least it abides by the GDPR by default.
All of the services you see above are offered to EU citizens, which is why they also have to abide by the GDPR. The GDPR does not disallow the gathering of information. Google, for example, is GDPR compliant, yet they are number 1 on that list. That's why I would like to know whether European companies still try to build a business case on personal data or not.
-
A chart titled "What Kind of Data Do AI Chatbots Collect?" lists and compares seven AI chatbots—Gemini, Claude, CoPilot, Deepseek, ChatGPT, Perplexity, and Grok—based on the types and number of data points they collect as of February 2025. The categories of data include: Contact Info, Location, Contacts, User Content, History, Identifiers, Diagnostics, Usage Data, Purchases, Other Data.
- Gemini: Collects all 10 data types; highest total at 22 data points
- Claude: Collects 7 types; 13 data points
- CoPilot: Collects 7 types; 12 data points
- Deepseek: Collects 6 types; 11 data points
- ChatGPT: Collects 6 types; 10 data points
- Perplexity: Collects 6 types; 10 data points
- Grok: Collects 4 types; 7 data points
Who TF is using Grok?
-
If by more learning you mean learning
ollama run deepseek-r1:7b
Then yeah, it's a pretty steep curve!
If you're a developer, you can also search "$MyFavDevEnv use local ai ollama" to find setup guides. I'm using the Continue extension for VS Codium (or Code), but there are easy-to-use modules for Vim and Emacs and probably everything else as well.
The main problem is leveling your expectations. The full DeepSeek is a 671b model (that's billions of parameters), and the model weights (the thing you download when you pull an AI) are 404 GB in size. You need roughly that much RAM available to run one of those.
They make distilled models though, which are much smaller but still useful. The 14b is 9 GB and runs fine with only 16 GB of RAM. They obviously aren't as impressive as the cloud-hosted big versions though.
My assumption is always that the person I'm talking to is a normal Windows user who doesn't know what a terminal is. Most of them even freak out when they see "the black box of text". I guess on Lemmy the situation is better; it's just my bad habit.