
It's unnecessarily splitting hairs.

> interaction data—specifically inputs, outputs, code snippets, and associated context [...] will be used to train and improve our AI models

So using Copilot in a private repo, where lots of that repo will be used as context for Copilot, means GitHub will be using your private repo as training data when they were not before.



No it isn't. Most people don't use Copilot, so this term change won't affect most people. You can reasonably be unhappy about it anyway (or unreasonably still be using Copilot in 2026), but it's still ultra-useful information for them to add to the discussion.


Next step they'll rebrand search as "Copilot Search" or auto-enable pull-request AI reviews (unless you hear about it and turn each one off), and we'll all be "users".

Boiling the frog with a Venn diagram.


Copilot, or "chat with Copilot" is a button that is available on every page right next to the search bar.

I don't have to be a Copilot user to click on it.

This change is malicious, and it doesn't only affect Copilot users. It affects everyone on the platform!


Again, this collects usage data. If you click the button by accident and don’t interact, they get no data.


So? This feature is available to everyone and you have zero idea how many people actually use it.

If I go to one of your GPL projects and ask a simple question to find out what the project is about, will you be perfectly "ok" that this interaction (which includes most of the code required to answer my dumb question) will be used for training?

This is not ok.


Nobody in this subthread is saying if it's OK or not. We're just saying that it's very useful to know that this is what they're specifically collecting. Jiminy.


It's automatically enabled. For example, the other day I made a commit directly on GitHub and an AI-generated commit message popup appeared; it had to read the code to work.


> Most people don't use Copilot

So why do any of this at all? You're putting a large part of your customer base on edge in order to improve a service that "most people don't use." The erosion of trust this brings doesn't seem like a worthwhile or prudent sacrifice.


You're asking me to explain Microsoft's AI strategy? Your guess is as good as mine.


I don't use copilot, but somehow was subscribed... I probably clicked something long ago and it just remained active.


They "gift you" a free standard plan if you have above a certain (non-transparent) level of stars, I don't think you can even disable your "subscription" if you get it for free.


They're only training on interactions with Copilot, not with the full contents of repos that happen to be subscribed to Copilot.


Make it opt-in then.


Isn't this pretty standard, using your interaction data for training and making it opt-out? Claude Code, Codex, Antigravity etc. all do the same. A private repo doesn't make a difference, as they already have a copy to work from.



