Hacker News

Brings into perspective how essential git is for my workflow - I am waiting for `scp` to transfer my files from my laptop to work computer and can't push anything into the CI/CD pipeline.


For what it's worth, the "D" in "Distributed Version Control System" is useful here. You can `git init --bare foo` on your work computer, `git remote add workcomp username@hostname:path/to/foo` on your laptop, and `git push workcomp master` to push everything over using pure Git. (And the first steps only have to happen once for these two machines.)

(This creates a bare repo on your work computer, meaning there's no associated working directory -- you'd probably want to add that same repo as a remote from whichever existing repository on your work computer you have. The bare repo, in this scenario, is just a means of passing commits from your laptop to your work computer in a Git semantically-meaningful way.)
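A runnable sketch of those steps, using local paths in place of the `username@hostname:path` ssh remote so it can be tried on one machine (all paths and the commit are illustrative):

```shell
# Start clean (illustrative throwaway paths).
rm -rf /tmp/foo.git /tmp/laptop-repo

# On the "work computer": create a bare repo to receive pushes.
# Over a network this would live at username@hostname:path/to/foo.
git init --bare /tmp/foo.git

# On the "laptop": an existing repo with at least one commit.
git init /tmp/laptop-repo
git -C /tmp/laptop-repo -c user.email=you@example.com -c user.name=you \
    commit --allow-empty -m "first"

# Add the bare repo as a remote and push the current branch to master.
git -C /tmp/laptop-repo remote add workcomp /tmp/foo.git
git -C /tmp/laptop-repo push workcomp HEAD:master
```

Pushing `HEAD:master` sidesteps the question of whether the local default branch is named `master` or `main`; the bare repo ends up with the commits either way.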


Your comment is probably going to inspire a lot of people to start self hosting remote repos during the downtime today.

As with most things, it'll start great, but a lot of those people will be in tears in a few weeks.

Great power; great responsibility.


True! It's best to consider that kind of repo as a downstream fork, with all the responsibilities of upstreaming changes once the canonical repository comes back up.


Having a bare repo is certainly one option. Some may find having two copies of a repo on one system too confusing.

Here's an alternate setup that doesn't use a bare repo. It does require some git hygiene/discipline though.

Setup, on the desktop:

  git config --local receive.denyCurrentBranch updateInstead

This lets the laptop push to the desktop, updating its files, as long as the desktop has nothing uncommitted (i.e. its working directory is clean). On the laptop: git remote add desktop...

Working with it: on the desktop, commit everything; on the laptop, commit everything, then

  git pull --rebase desktop
  git push desktop

(assuming there are no issues with the pull).
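An end-to-end sketch of this workflow, simulated locally with two directories standing in for the two machines (all paths, names, and commits are illustrative; `git init -b` needs Git 2.28+):

```shell
# Start clean (illustrative throwaway paths).
rm -rf /tmp/desktop /tmp/laptop

# The "desktop": a normal (non-bare) repo with a clean working tree.
git init -b master /tmp/desktop
git -C /tmp/desktop -c user.email=d@example.com -c user.name=d \
    commit --allow-empty -m "base"
# Allow pushes to the checked-out branch to update its working tree:
git -C /tmp/desktop config --local receive.denyCurrentBranch updateInstead

# The "laptop": clone, commit a change, then rebase-pull and push back.
git clone /tmp/desktop /tmp/laptop
git -C /tmp/laptop remote add desktop /tmp/desktop
echo hello > /tmp/laptop/notes.txt
git -C /tmp/laptop add notes.txt
git -C /tmp/laptop -c user.email=l@example.com -c user.name=l \
    commit -m "add notes"
git -C /tmp/laptop pull --rebase desktop master
git -C /tmp/laptop push desktop master
```

After the push, the desktop's working tree contains notes.txt, because updateInstead updates the checked-out branch's files when the working directory is clean.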

Not saying either workflow is better, merely providing an alternative.

imo the reason git took over has something to do with being unopinionated about workflow - there's some tooling, but whatever workflow is manageable with that tooling is "supported" - as long as the team can agree to use said workflow.


This is awesome, thank you!!


Assuming you're transferring a bunch of small files with `scp -r`? Try piping tar over ssh instead; it can be much faster for such use cases.

https://unix.stackexchange.com/questions/10026/how-can-i-bes...
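For what it's worth, the tar-over-ssh pattern looks roughly like this; the ssh form is shown as a comment (host and paths are illustrative), and the same pipe is demonstrated locally:

```shell
# Over the network (illustrative host and paths):
#   tar -cf - mydir | ssh user@host 'tar -xf - -C /dest/path'
# Add -z on both ends for gzip compression on slow links.
# A single tar stream avoids scp's per-file overhead, which is why it
# is much faster for many small files. Same pipe, local demonstration:
rm -rf /tmp/src /tmp/dst
mkdir -p /tmp/src/sub /tmp/dst
echo one > /tmp/src/a.txt
echo two > /tmp/src/sub/b.txt
tar -C /tmp -cf - src | tar -C /tmp/dst -xf -
```

The receiving `tar -x` recreates the directory tree under the `-C` destination, so /tmp/dst ends up containing src/a.txt and src/sub/b.txt.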


This was a great thing to learn, as scp seems to perform poorly when transferring many small files. Thank you!





