
Latency is bounded below by the speed at which energy can be transferred (i.e. the speed of light, even in an ideal superconducting medium). Even if bandwidth is a big problem (and it is), latency will always be a sticking point for distributed systems. As someone else pointed out, we need to start analyzing distributed algorithms in terms of a) geographical locality and b) complexity in round trips, each of which carries a constant cost determined by latency.
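
A back-of-the-envelope sketch of that constant per-round-trip cost (the fiber refractive index of ~1.47 and the use of great-circle distance are my assumptions, not from the thread):

    # Rough lower bound on round-trip time imposed by signal propagation.
    # Assumes light in optical fiber travels at roughly c / 1.47 and that
    # the path length equals the great-circle distance.

    C_VACUUM_KM_S = 299_792          # speed of light in vacuum, km/s
    FIBER_FACTOR = 1 / 1.47          # typical refractive index of silica fiber

    def min_round_trip_ms(distance_km: float) -> float:
        """Physical floor on RTT over fiber for a given one-way distance."""
        one_way_s = distance_km / (C_VACUUM_KM_S * FIBER_FACTOR)
        return 2 * one_way_s * 1000  # round trip, in milliseconds

    # Example: New York <-> London is roughly 5,570 km great-circle.
    print(f"{min_round_trip_ms(5570):.1f} ms")  # ~54.6 ms, before any queuing

No amount of extra bandwidth removes that floor; an algorithm that needs N sequential round trips pays it N times.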


The speed of light determines the lower bound of round-trip latency, but the sticking point is the upper bound, and that depends on how congested networks are.

If sensors produce more data and data centers can process more data, but network capacity can't keep up, then the analysis you are suggesting will show that we need more decentralised processing, not more data centers. See the sketch below.
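
A toy comparison of the two options (all figures are hypothetical, chosen only to illustrate the trade-off, not taken from the thread):

    # Toy comparison: ship raw sensor data to a central data center vs.
    # process it near the sensors and ship only a reduced summary.

    RAW_RATE_GBIT_S = 40        # assumed aggregate sensor output
    LINK_GBIT_S = 10            # assumed backhaul capacity to the data center
    REDUCTION = 100             # assumed local processing shrinks data 100x

    def backlog_after(hours: float) -> float:
        """Unsent data (Gbit) piling up when raw output exceeds link capacity."""
        return max(0.0, (RAW_RATE_GBIT_S - LINK_GBIT_S) * hours * 3600)

    def reduced_rate() -> float:
        """Required link rate if summaries, not raw data, cross the network."""
        return RAW_RATE_GBIT_S / REDUCTION

    print(f"Centralised: {backlog_after(1):,.0f} Gbit of backlog after one hour")
    print(f"Decentralised: {reduced_rate():.1f} Gbit/s fits the {LINK_GBIT_S} Gbit/s link")

Once the raw rate exceeds the link rate, the centralised backlog grows without bound, so the only levers are a fatter network or processing closer to the sensors.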


Ahh, I see.

I don't think we were disagreeing; I was just pointing out that data centers scale very well for what they were built for. It's not that surprising that we are outgrowing them.



