I'm not sure if this is my stupidity or if it's a bug, but

git annex copy --force --to REMOTE . 

just zips through really quickly and doesn't actually force a copy to the remote location. This is a follow-up to the git-annex directory hashing problems on OS X. I want to do a forced copy of all my data to my portable disk to make sure the data is really there. I would similarly want to be able to force a

git annex copy --force --from REMOTE .

to pull down files from a remote.

How remote is REMOTE? If it's a directory on the same computer, then git-annex copy --to quickly checks that each file is present on the remote and, when it is, skips copying it again.

If the remote is ssh, git-annex copy talks to the remote to see if it has the file. This makes copy --to slow, as Rich complained before. :)

So, copy --to does not trust location tracking information (unless --fast is specified), which means that it should be doing exactly what you want in your situation -- transferring every file that is not already present in the destination repository.
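
For example, assuming REMOTE is the name of a configured remote, the two invocations below differ only in whether each file's presence on the remote is verified before transfer (a rough sketch of the behaviour described above):

git annex copy --to REMOTE .         # asks the remote about every file; copies only what is genuinely missing
git annex copy --fast --to REMOTE .  # trusts the location log instead of checking, so it is faster but may skip files the log wrongly says are present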

Neither does copy --from, by the way. It always checks if each file is present in the current repository's annex before trying to download it.
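
Likewise, a sketch of the pull direction (again, REMOTE is just a placeholder):

git annex copy --from REMOTE .       # checks the local annex for each file first and downloads only what is missing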

Comment by http://joey.kitenet.net/ Sun Apr 3 16:49:01 2011

Remote as in "another physical machine". I assumed that

git annex copy --force --to REMOTE .

would not have trusted the contents of the current directory (or of the remote being copied to) and would just go off and re-download/re-upload all the files, overwriting whatever is already there. I expected that the combination of --force and copy --to would not bother to check whether the files are there and would simply copy them regardless.

On second thought, maybe the current behaviour is better than what I am suggesting the force command should do. I guess it's better to be safe than sorry.