
I have a file that was tracked during development and pushed to the remote upstream repo, so anyone who forks gets it.

Now this file is stable; it no longer needs to change during development, but each fork should still be able to make custom changes to it. Those custom changes should stay local to the fork and not pollute upstream anymore.

These were tried:

  1. An entry has already been added to .gitignore, but since the file was tracked before, this doesn't help.

  2. Tried git update-index --assume-unchanged file.xml; this works in my environment, but it does not automatically propagate to all the forks.

  3. Tried git rm --cached file.xml; this is bad because it removes the file from upstream. Although the local copy is not deleted, anyone who forks from now on won't get this file.

I would love to see some automatic way in git to achieve this. Otherwise I will have to go with 2. and ask everyone who forked to run that command, which is a real pain.

user1589188
    You need to decide what you want this file to be. Is it local? Is it published on the remote? I suspect that this file never should have been committed in the first place, but in the absence of more information I won't answer. – Tim Biegeleisen Jan 20 '17 at 05:38
  • What else is missing? I said it in the question already: it was needed during development, and now it is stable, but it is still needed in the project to keep custom local changes. – user1589188 Jan 20 '17 at 05:40

2 Answers


Option 2 is good enough, but you still need a way to enforce that this file is not modified upstream.
Due to the distributed nature of Git, that means having a pre-receive hook on the upstream repo that detects when that particular file is part of a push and rejects the push.

git diff --name-only $OLD..$NEW

See "git pre-receive (push) hook detect if a merge was done in a specific file" (the merge part is not releavant here)
See also "Git pre-receive hook" to list all files in all oldref newref received.

VonC
  • Thanks. I haven't looked into the details of this yet, but based on your description, does it mean that every time a fork tries to push this file, the push will be rejected? How does this help the fork know that an assume-unchanged command should be run to avoid being rejected? – user1589188 Jan 20 '17 at 05:44
  • @user1589188 the upstream repo does not know about an "assume-unchanged" status (which is purely local to a downstream repo). All it knows is that if it sees that particular file being pushed, someone is attempting to publish a modification to that file, and that must be prevented. This is a centralized enforcement of your new policy of "no new modifications to that file". – VonC Jan 20 '17 at 05:46
  • @user1589188 your solution 2 helps some users (those who take the time to configure their repo) remember that they *should* not add and commit a local modification to that file, but what I propose will help *all* users realize that they *must not* commit (and push) any modification to that file. – VonC Jan 20 '17 at 05:48
  • Thanks heaps, I understand that. But instead of this passive, defensive approach, I would prefer something more active, like a config in the git repo so that anyone who forks gets certain files untracked by default. – user1589188 Jan 20 '17 at 05:50
  • @user1589188 that is my point: "due to the distributed nature of Git" means that what I propose is the most active you will get. Add a README if you want, but a centralized policy enforcement is far easier to maintain than somehow distributing said policy. – VonC Jan 20 '17 at 05:50
  • That's OK, but I don't control the GitLab hosting, so I can't do this myself even though I own the upstream project. – user1589188 Jan 20 '17 at 05:54

I don't control the GitLab hosting

In that case, adding a centralized policy is harder (though it could be done with a webhook).

Here is another proposal:

  • rename file.xml to file.xml.do-not-modify
  • add a content filter driver (a smudge script) which, on checkout, automatically generates file.xml by copying file.xml.do-not-modify (see the sketch below)

(diagram: the smudge filter runs on checkout)

That file.xml will be private (it can be added to your .gitignore) and can be modified at will locally.

A content filter driver still needs to be activated by each user locally, but if they *need* file.xml, they will have to (as explained in the README of your project).
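
A minimal sketch of that setup, assuming file.xml sits at the repository root and that the filter script runs from the top of the working tree; the filter name xml-default and the script name smudge-default-xml.sh are made up for the example:

    # .gitattributes (committed): attach a filter to the tracked template
    file.xml.do-not-modify filter=xml-default

    # .gitignore (committed): the generated private copy stays untracked
    file.xml

    # smudge-default-xml.sh (committed, executable): the smudge script.
    # Git pipes the blob content on stdin; what the script prints on stdout
    # becomes the checked-out content of file.xml.do-not-modify; copying to
    # file.xml is a side effect that seeds the private working copy.
    tmp=$(mktemp)
    cat > "$tmp"          # capture the blob content
    cat "$tmp"            # hand it back to Git unchanged
    cp "$tmp" file.xml    # seed the private working copy
    rm -f "$tmp"

    # Each user still has to activate the filter locally (this cannot be pushed):
    git config filter.xml-default.smudge "$PWD/smudge-default-xml.sh"

After a clone or checkout, a user who activated the filter ends up with both file.xml.do-not-modify (tracked, frozen) and a local file.xml they can edit freely.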

VonC
  • Thanks again. When you say on checkout generate the file, would this be EACH checkout? Will that overwrite the local copy? – user1589188 Jan 20 '17 at 08:04
  • @user1589188 what it will do on each checkout is execute the smudge script that you have associated with `file.xml.do-not-modify` in your `.gitattributes`: it is trivial to add any intelligence you need to that script, for instance to *not* overwrite the generated file if it already exists. http://stackoverflow.com/a/638980/6309 – VonC Jan 20 '17 at 08:06
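
For instance, still using the hypothetical smudge-default-xml.sh from the sketch above, the unconditional copy could be replaced with:

    # seed file.xml only once; never clobber existing local customizations
    if [ ! -e file.xml ]; then
        cp "$tmp" file.xml
    fi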