Can you cite that number? It really doesn't pass the sniff test for me, unless the word "most" is doing some pretty heavy lifting.
A quick search suggests that a 5G tower operating only on low/mid-band spectrum (in other words, below peak speed, but at higher range) has a range of roughly 1 to 3 miles. [1] We'll say 2. That's an area of pi*2^2 = ~12.5 square miles, we'll say 13. The area of the US is 3.8 million square miles. So your number would provide coverage for (10,000 * 13) / 3.8 million = 3.4% of the US. That may be enough to cover the most exceptionally dense urban locations, but you're missing a lot of people there.
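For anyone who wants to check that arithmetic, here's the same back-of-envelope in Python. Every input is just the assumed number from above (a 2-mile range, 10,000 towers), nothing authoritative:

```python
import math

towers = 10_000          # hypothetical tower count from the parent comment
range_miles = 2          # assumed low/mid-band range, middle of the 1-3 mile estimate
us_area_sq_mi = 3.8e6    # approximate area of the US

area_per_tower = math.pi * range_miles ** 2          # ~12.6 sq mi, rounded to 13 above
coverage_fraction = towers * area_per_tower / us_area_sq_mi

print(f"Area per tower: {area_per_tower:.1f} sq mi")
print(f"US coverage:    {coverage_fraction:.1%}")    # ~3.3%, i.e. the ~3.4% above
```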
And, again, this is just for the low/mid-band stuff. And then you need to regularly maintain those towers. Meanwhile, you could get global coverage with relatively few satellites that can be trivially launched and decommissioned remotely. A quick search there [2] turns up a current practical (not peak/theoretical) bandwidth for Starlink in the 100+ Mbps range with ~50ms latency. I have difficulty seeing a logical argument for ground-based telecom, besides as a hedge against WW3, when probably the first thing that will happen is a huge chunk of all satellites going poof.
Your first link says "On average, the maximum usable range of a cell tower is 25 miles," which is relevant because we aren't trying to provide service just for high-density areas.
Before you object that 10,000 * 2,000 = 20 million square miles seems high: handoffs require you to be in range of multiple towers, so there's a lot of overlap, and hills, ocean, etc. reduce useful range.
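Same sanity check with the 25-mile figure (illustrative only; real coverage would be well below this raw footprint for exactly the overlap/terrain reasons just mentioned):

```python
import math

towers = 10_000
range_miles = 25                                  # "maximum usable range" quoted from [1]
raw_area = towers * math.pi * range_miles ** 2    # pi*25^2 ~ 1,963 sq mi/tower, ~2,000 above

print(f"Raw footprint: {raw_area / 1e6:.1f}M sq mi")   # ~19.6M, i.e. the ~20M above
```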
Ultimately there are 142,100 cell towers in the US, but that includes density from urban areas and redundancy from multiple cell networks. Anyway, the point I was making was that Starlink is targeting low-density areas by necessity; they simply can't target NYC density with any reasonable constellation size. However, if you're a cellphone company already covering anywhere in the US with 50+ people per square mile, then extending that to anywhere with 5+ or even 0.5+ people per square mile, and killing Starlink, just doesn't take that many towers.
"Usable" is going to mean at the max possible wavelength. The problem with telecoms is that there's a physics imposed inverse relationship between frequency (speed) and wavelength (penetration/distance). So it's not like computing where we basically have gotten a free lunch with stuff that goes faster, runs cooler, and takes up less space.
Each upgrade in telecoms entails a sacrifice. You can have really fast signals that can't go far and have difficulty penetrating obstacles like walls/buildings/hills/etc., or you can have really far-reaching, high-penetration signals that can't go fast. For instance, Verizon's max-speed towers can only reach 1,500 feet [1], so I think my estimate of ~2 miles was a pretty reasonable meet-in-the-middle.
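To put that tradeoff in numbers, here's a quick sketch comparing per-tower coverage area at the three ranges mentioned in this thread (1,500 feet for mm-wave, my ~2-mile compromise, and the 25-mile low-band maximum). The ranges are the thread's figures, not measured data:

```python
import math

ranges_miles = {
    "mm-wave (1500 ft)": 1500 / 5280,   # ~0.28 mi
    "mid estimate":      2.0,
    "low-band max":      25.0,
}

# The spread between mm-wave and low-band coverage area is nearly
# four orders of magnitude (~0.25 vs ~1,963 sq mi per tower).
for name, r in ranges_miles.items():
    print(f"{name:>18}: {math.pi * r * r:8.2f} sq mi per tower")
```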
All that said, I agree with you in principle. Obviously space-based telecoms are much better for less populated areas than heavily populated ones, but I'd argue that space-based can scale much more easily. The ground-based telecoms aren't just those 140k towers, but also the other 450k nodes on top. And that's to cover a pretty small geographic area. And each of those nodes not only needs land and construction permits, but also needs to be regularly maintained, and so on. It's a pretty big deal. For space-based coverage, you can just launch your satellites from Texas and have them providing coverage on the other side of the world in a matter of minutes.
Put another way: imagine we were creating a civilization from scratch and these technologies were all 'unlocked.' I don't think we'd be using ground-based infrastructure much at all. In the present, where the infrastructure already exists, there's no reason not to take advantage of it, but in general it just doesn't scale as well.
Don't forget those towers cover the vast majority of the US population with high-speed connectivity, whereas Starlink only has ~1 million US customers, roughly 1/300th of the population. Those ratios aren't that far off in terms of customers per unit, but the problem with scaling satellites is they don't stay in one location.
You can't just put 50 satellites next to each other over a suburb and call it a day; you need ring(s) of satellites circling the entire globe to reach whatever your target density is along their full orbit. Unfortunately, most land has really low density: North Dakota only averages 11 people per square mile, while Florida, a mostly empty state, sits at 422.
Target 10 people per square mile (adjusting for household size and the percentage of people signing up) and just about all your satellites are useful across the entire US.
But pick 100 people per square mile and 90% of your time over North Dakota is wasted. Worse, large chunks of Florida are also nearly empty, as most of its population is along the coastline in places like Sweetwater, where 8,800 people per square mile live. So your wasteful 100-people-per-square-mile constellation in ND still only covers a small fraction of the population in Florida.
Cellular is the reverse: the first 10k towers are largely "dead weight" that cover few people per tower, but the remaining 130k are really useful because you optimize locations for density. Swap that to satellites and initially the constellation has very high utilization, but the ratio keeps getting worse as you add more satellites.
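Here's a toy model of that utilization argument. The region shares and densities are made up purely to illustrate the shape of the effect: the higher the density you need to justify a satellite's capacity, the smaller the share of its orbit that earns anything:

```python
# Assumes satellites spend time over each region in proportion to its area,
# and a satellite's time is "useful" only where local density meets the
# service threshold. All numbers below are invented for illustration.
regions = [  # (name, share of overflight time, people per sq mi)
    ("ocean",    0.50,    0),
    ("rural",    0.30,    8),
    ("exurban",  0.12,   40),
    ("suburban", 0.06,  300),
    ("urban",    0.02, 3000),
]

def useful_time(threshold):
    return sum(share for _, share, density in regions if density >= threshold)

for threshold in (1, 10, 100):
    print(f"threshold {threshold:>3}/sq mi -> "
          f"{useful_time(threshold):.0%} of satellite time useful")
# threshold   1 -> 50%, threshold 10 -> 20%, threshold 100 -> 8%
```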
PS: Starlink could try to vary speeds or prices more based on density, but people really want predictable results for their money.
This is what I was trying to say above, but much more detailed and eloquent. There are a lot of dislikes in this thread but not many folks addressing the points.
The thing about high-density places with a ton of infrastructure is that wired will always be the best, because you have close access to infrastructure and it's likely already built in. For rural or even suburban areas, the equation starts to tip toward orbital wireless for the reasons you state above, and also because geographic realities make ground-based wireless unreliable in places with mountains, valleys, canyons, etc.
There's a 4-orders-of-magnitude difference between high-density areas and low-density ones. So no, you don't need millimeter wave everywhere. You can increase bandwidth per tower, but you can also increase the number of cell sites.
Further, every frequency you add removes users from other frequencies. I.e.: at 10 miles you can use a subset of frequencies, but those frequencies don't need to serve people 100m from the cell tower, because those users are on 5G.
Thus doubling the number of cell sites means there's an extra circle of people on mm-wave frequencies around each new tower, so you more than double effective bandwidth in low-density areas when you double the number of towers.
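Here's a minimal sketch of that claim, assuming each tower has a fixed-size mm-wave hotspot plus wide low-band coverage and users are spread uniformly. All areas and capacities are invented; the point is only that offloading nearby users to mm-wave makes the remaining low-band users' rates grow faster than the tower count:

```python
# Toy model: fixed service area, uniform users, each tower = one low-band cell
# plus one small mm-wave hotspot. Numbers are illustrative, not real spectrum math.
AREA = 1_000.0     # sq mi of service area
USERS = 10_000     # users, uniformly distributed
HOTSPOT = 0.25     # sq mi of mm-wave coverage per tower
C_LOW = 1_000.0    # Mbps of low-band capacity per tower

def low_band_rate(towers):
    mm_fraction = towers * HOTSPOT / AREA   # share of users inside some hotspot
    low_users = USERS * (1 - mm_fraction)   # everyone else shares the low band
    return towers * C_LOW / low_users       # Mbps per low-band user

for n in (100, 200):
    print(f"{n} towers -> {low_band_rate(n):.2f} Mbps per low-band user")
# 100 towers -> 10.26 Mbps; 200 towers -> 21.05 Mbps (a 2.05x gain from 2x towers)
```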
Meanwhile, the reverse happens with satellites. For a given number of satellites there are some areas where you have sufficient capacity for the density in those areas. Suppose you have enough satellites for ships and aircraft over the ocean; add new satellites to handle higher density on land, and the time those satellites spend over the ocean isn't getting you new customers. I.e., the percentage of time the average satellite is at 90+% capacity drops as you add more satellites.
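And a toy model of the satellite side, assuming overflight time splits by area and each region's demand caps how much satellite capacity is usable there (all figures invented for illustration):

```python
# Once a region's demand is saturated, extra overflight time there is wasted,
# so average utilization falls as the constellation grows.
regions = [  # (name, share of overflight time, demand in satellite-equivalents)
    ("ocean", 0.70,  20),   # ships/aircraft saturate quickly
    ("land",  0.30, 400),
]

def utilization(n_sats):
    used = sum(min(n_sats * share, demand) for _, share, demand in regions)
    return used / n_sats

for n in (25, 100, 400, 1600):
    print(f"{n:>4} satellites -> average utilization {utilization(n):.0%}")
# 25 -> 100%, 100 -> 50%, 400 -> 35%, 1600 -> 26%
```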
Building things deployed on land, in real life, basically sucks. Building out a tower requires buying the land (or even getting involved in extremely dirty eminent domain lawsuits), getting countless building permits/inspections, architecting your structure in accordance with local regulations and any geographic peculiarities, organizing a construction team, [finally] building it, and then maintaining the structure itself as well as the various regulatory, tax, and other requirements that come with it. And that's for exactly one tower! You really cannot overstate how big an ordeal this is. If you think NIMBYism is bad for housing, think about how people feel about phallic, energy-emitting towers reaching hundreds of feet into the air around them.
By contrast, SpaceX: build satellites, launch satellites, done. They can launch tens (and soon hundreds if not thousands) from their base in Texas with a single launch. There's still some bureaucracy to deal with, but overall this is a difference of many orders of magnitude in scalability and ease. And when satellites start hitting end-of-life? No problem: just deorbit them and continue expanding the swarm.
[1] - https://dgtlinfra.com/cell-tower-range-how-far-reach/
[2] - https://www.pcmag.com/news/starlink-speed-tests-2023-vs-2022