This still smells like the kind of paper that a think tank would fund to justify their billionaire-backed policy that social security should be abandoned and the retirement age moved to 75.
How do I use DNS on my home network to set up my home router? It's the same problem as TLS certificates on web interfaces for infrastructure.
And the commercial solution is going to be "pay us a subscription fee so your home device can get an Internet management interface on top of all the egregious data collection".
> I think they are not afraid, they just see 0 reasons to
This is a big part of it. Apart from extra addresses, it offers remarkably little operational benefit. It sounds like it should be better when you look at the feature list, but in actual operation the features don't amount to much.
Further, there's the general problem that for some reason the network equipment manufacturers seem to think that because you no longer need NAT, you also don't need a stateful firewall enabled by default on a network edge device.
Plus there's the general confusion among tech neophytes that NAT itself offers actual security features, so that a plain stateful firewall looks like a downgrade. This is such a fundamental misunderstanding that you can't even communicate with a person who believes it. I fear this confusion will remain with us for decades. I'm sure me even mentioning it will spawn a whole thread of people vehemently disagreeing, because there is always at least one.
This is coupled with the fact that the addresses are just ugly. Like, I'm sorry, but unless you happen to be an electrical engineer, the IPv6 addressing scheme is difficult to remember. IPv4 has the same problem -- the magic numbers are only easy to remember if you've memorized the binary values too -- but it's only a handful of things to remember by comparison. Hex values are just not as easy to read or remember as decimal numbers. So even though IPv6 isn't harder to use, it feels much harder to use.
Yeah, there are alfalfa fields in central Arizona. Alfalfa basically turns water and sunlight into cellulose about as quickly as plants can.
Worse, the owners of those fields are often foreign companies. That means they use tremendous amounts of water in one of the driest regions on Earth, in the middle of a multi-decade drought, and the wealth these farms generate disappears overseas.
Part of the issue is not systematically using a pricing structure that charges disproportionately more for usage above high thresholds.
The 101-level "solution" is to just raise the price to account for demand. The problem with that is that it treats all usage the same, whether it's a residence's first gallon or an alfalfa field's last gallon. But the former is something we need to protect.
It makes sense to price water, and electricity, in a fashion where the first X costs a certain amount, and the next X has a higher rate, and above some percentile of usage it has a much higher rate, and at some percentile of usage, customers should be very nearly paying for new required utility infrastructure themselves. That allows using pricing to solve supply problems, without penalizing normal levels of usage.
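A minimal sketch of such a block-rate schedule in Python. The tier boundaries and per-gallon rates here are invented for illustration, not any real utility's tariff:

```python
def tiered_bill(gallons, tiers=((5_000, 0.003),
                                (15_000, 0.006),
                                (50_000, 0.02),
                                (float("inf"), 0.10))):
    """Block-rate bill: each tier is (upper_bound_in_gallons, rate_per_gallon).
    Usage falling within a tier is billed at that tier's rate, so a heavy
    user pays the low rate on their first gallons and steeply more on the
    rest. All thresholds and rates are illustrative assumptions."""
    bill = 0.0
    lower = 0
    for upper, rate in tiers:
        if gallons > lower:
            bill += (min(gallons, upper) - lower) * rate
        lower = upper
    return bill

# A typical household stays in the cheap tiers; a huge user funds the system:
household = tiered_bill(8_000)    # first 5,000 cheap, next 3,000 a bit more
heavy_user = tiered_bill(100_000) # pays the top rate on the last 50,000 gallons
```

The key property is that the marginal rate only ever rises, so normal usage is never penalized by a shortage-driven price at the top tier.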
Some utilities already do this. But if there are actual issues with having enough supply for both datacenters/farms/smelters/etc and residential usage, then they're not doing this well enough, or don't have the pricing correct.
This causes major market distortions and worse outcomes than the econ 101 solution.
The problem is that water isn't traded on a normal market at all. Lots of people have historical water rights and pay nearly nothing for their water use. There's byzantine regulation, and many have the right to use water for some purpose on their land but not to resell it, so the market cannot allocate it to more efficient uses.
If you just let the 101-level solution actually work, water prices will rise until inefficient uses like water-intensive agriculture (not even all crops!) are pushed out. Urban users easily outbid almost all agricultural use, even at prices any person would consider dirt cheap. For example, desalinated water, which is considered expensive for agriculture, can run 40 cents per cubic meter, and a cubic meter is a lot of water. Usually the last mile of urban water delivery costs more than that.
The amount required to satisfy all urban use, including water hungry lawns etc, and datacenters, corresponds to a very minor reduction in agriculture. Perhaps even just changing which crop is grown or switching irrigation techniques.
Charging more to higher users, price discrimination, causes several problems. First, it creates an incentive to cheat: I'm not using all this water myself, it's for this whole group of people who "live" here. Don't allow this kind of spreading (somehow...)? Now you actually screw any business or institution that serves a lot of people. A farm produces food for thousands: do they count as one user? A park uses much more water than a garden but serves many more people. Whatever framework you create will require another bureaucracy to run, and lobbyists will find or insert loopholes for their friends.
The heavy users actually improve the system robustness, in both electricity and water. Their higher demand pays for more supply infrastructure, which itself often benefits from economies of scale, and in a shortage they may even be more responsive to price increases due to their high use.
The 101 level solution means that Native Americans who were granted water rights by the Spanish, and guaranteed those rights by US treaties, would have to outbid urban users in order to grow subsistence crops.
The heavy users have more influence over the laws which govern the infrastructure, as the history of water rights in the West clearly shows. We see it now when secretive organizations negotiate with water companies under NDA to get water for new data centers - something a smaller water user couldn't do.
The riparian doctrine of the East, with its high rainfall, doesn't work so well in the dry West, which is why the West generally uses the prior appropriation doctrine instead. Water management there was traditionally under a communal system. Some of these systems still exist as acequia associations, which weigh equity and fairness in their decisions and don't follow the prior appropriation doctrine.
If the Native Americans have water rights, they can also sell them. They can choose to use it inefficiently on subsistence farming, or they could sell at the going rate. A normal market itself doesn't imply any particular allocation of water rights, just that they should be as fungible and transferable as possible.
Why are the laws that govern the infrastructure particularly important? It only matters now because its a tangle of regulation. Yes, big users can often get bulk discounts or other special arrangements by committing to use. This happens in many areas.
There's no law governing what products my grocery store must carry. Yet, I can still choose a store with many things I like, at affordable prices. My store may (and frequently does) exclude all products containing some chemical considered harmful even if it isn't banned. Of course, water has more of a natural monopoly problem, but that's more for last mile infrastructure and not broader supply.
I don't understand the details of the riparian vs prior appropriation doctrine. How does this create an issue? If the water rights are defined somehow, in a usage-independent way, only in terms of the net water removal, to account for runoff from local use, and the water from them can be traded, then a market can work regardless of the specific nature of the right.
Any association holding the rights could allocate its water internally as it sees fit, just like any other asset. Or it could decide to sell the water and distribute the money instead, which might be even better for fairness to its members!
> If the Native Americans have water rights, they can also sell them.
You've just described the standard practice for taking over Native American lands by economic coercion instead of direct force. Take away land and water using market forces, and a culture based on land and water shatters.
That's precisely why the Native Americans protected their rights by treaty, not market forces.
Econ 101 was created to justify British colonial expansionism. Econ 101 justifies indentured servitude. Econ 101 justifies vote selling. Econ 101 justifies rule by the rich.
We've collectively decided that some parts of life are off-limits to Econ 101.
Water is not simply a commodity. Water is life. Water is culture.
> How does this create an issue?
Water rights in the West are at least an Econ 400 level course, if not graduate school.
The land and any associated rights were taken from the Native Americans by coercive force, not a market. Not that they have any particular claim to it; states own land, not ethnicities, and the particular state that controls land sometimes changes. This has little to do with any discussion of water rights.
I am suggesting expanding their water right: instead of only the right to do X, Y, Z with the water, take whatever right to the water they do have, in terms of amount, and say "you can do whatever you want with this much water". How resource rents are allocated doesn't have much effect on the market structure itself.
A lot of vitriol against supply and demand without any evidence.
Food is life. Food is culture. Just as much as water. Which countries have had famine: those that allocate via some system of food rights, or those that had a free market in food? The largest examples of famine I am aware of, in the USSR and Maoist China, were driven by central allocation of food rather than a market. Not a good record.
One of the great features of markets is that things don't need to be decided collectively. Perhaps 90% of people want to wear blue T shirts, but I want a red one. If we collectively decide, I get a blue T shirt. In the market, I buy a red T shirt- perhaps at a very slightly higher price due to less economy of scale.
We certainly know of areas where vanilla markets can fail- externalities etc, but these do not apply to the situation here. The existing system of water rights doesn't feel like a collective decision, but rather entrenched special interests and lobbyists.
I can see you don't recognize native sovereignty. Tribal nations are domestic dependent nations, and count as a "state". The water rights were not taken by coercive force but remain with the nation, and at least nominally protected by treaty rights.
It's very odd that you talk about a decentralized market when water allocation in the US Southwest was decided by the Colorado River Compact in 1922. This is central allocation and, famously, based on over-estimated flow numbers.
Most large famines were caused by flooding or drought. More people died in the Chinese famine of 1906–1907 than the Russian famine of 1921–1922 and the Soviet famine of 1932–1933 combined.
> The existing system of water rights doesn't feel like a collective decision, but rather entrenched special interests and lobbyists.
That is absolutely correct, and would be covered in the first week of any water rights class. In the West, "water flows uphill toward money", as I learned from reading "Cadillac Desert".
Or if you want a novel, read "The Milagro Beanfield War." The small farmers (mostly Hispanic) were drafted for WWII and couldn't farm, so they lost their water rights, while the large farmers (mostly Anglo) could hire help.
"The Santa Fe Ring was an informal group of powerful politicians, attorneys, and land speculators in territorial New Mexico from 1865 until 1912. The Ring was composed of newly-arrived Anglo Americans and opportunistic Hispanics from long-resident and prominent families in New Mexico. Acquiring wealth, both groups realized, lay in owning or controlling the millions of acres of land which the Spanish and Mexican governments of New Mexico had granted to individuals and communities. The acquisition of grant lands by members of the Santa Fe Ring was facilitated by U.S. courts who had no allegiance to Mexican claims and land practices which featured allocating most of the land in grants to the common ownership of the first settlers and their descendants vs. legal private ownership."
What exists is a patchwork of economic systems; there is no cohesive whole, and you clearly prefer the US one, which prioritizes the private ownership model of Econ 101.
That's why you can't view it simply through an Econ 101 lens.
> Part of the issue is not systematically using a pricing structure that charges disproportionately more for usage above high thresholds.
We don't do this for gasoline (in most countries), even though it is also vital for life. And yet people can still drive, afford to eat food grown with fertilizers, use plastic, and so on.
Turns out markets are pretty good when you leave them alone. But when they're not left alone (as is the case with water today!!) you get some weird shit.
Gasoline is absolutely rationed when it becomes scarce after having been plentiful.
When hurricanes come to South Florida, the well off migrate North to wait out the storm while the poor suffer the dangerous conditions. Part of this is due to the price spikes of gasoline in the local market as supplies dwindle due to fewer truck shipments and refineries shutting down for the storm.
Water is similar. Both water rights and water utilities are gamed by people who have resources. The people that are hurt are usually poor utilities bill payers, rural residents who are the first to lose service when wells dry up, and anyone who thinks they have water rights until an upstream user exhausts their expected supply.
The “markets work” heuristic is frequently wrong if you don’t gloss over the very many counterexamples.
Yeah, but that response is stupid and irrational: it makes shortages more likely and discourages people from taking action when they need to do something different right now. In an emergency, people who can provide more of something that is in desperately short supply should be paid more. People consistently adopt a strategy of trying not to pay them more, and it's one of those really annoying cases where people's instincts prime them to band together and do something predictably foolish.
Rationing is an inevitable response. But to say that is like saying witch hunts are inevitable - they are. They're still bad ideas. People who can maintain access to their higher reasoning should resist them.
> Gasoline is absolutely rationed when it becomes scarce after having been plentiful.
Sure, but OP is advocating that we should "systematically [use] a pricing structure that charges disproportionately more for usage above high thresholds." They're not arguing that this is something to be applied only in emergencies.
Similarly in your post, you use the need to ration gas after a hurricane to argue that we should ration water all the time. This does not follow.
> Both water rights and water utilities are gamed by people who have resources. The people that are hurt are usually poor utilities bill payers, rural residents who are the first to lose service when wells dry up, and anyone who thinks they have water rights until an upstream user exhausts their expected supply.
The logical extension of your argument here is that the world would be better if we subsidized gasoline for "poor utilities bill payers" and "rural residents".
But why gasoline and water specifically? Why not also healthcare, food, childcare, and other necessities?
Then consider, if we have a budget of $X per family to subsidize necessities, surely the government is not best suited to decide how to split up those dollars between water, gas, healthcare, food, and childcare? There's no right answer universally, some people need food more than they need gas, and vice versa. Surely an individual family would be better equipped to decide for themselves?
We have now invented "giving money to poor people instead of subsidizing demand", which I wholeheartedly support.
200 miles will easily get you out of the path of a hurricane. 200 back home. 400 miles at 20mpg is 20 gallons of gas. Even if gas doubles from $4 to $8, that’s only an extra $80, likely less than the cost of that one night of motel, and certainly less than the economic costs of actually being hit by a hurricane.
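The back-of-the-envelope math in that comment, spelled out (all distances, mileage, and prices are the comment's illustrative figures, not data):

```python
# Evacuation fuel math: does a gas price spike actually block evacuation?
miles_round_trip = 400          # 200 miles out of the storm path, 200 back
mpg = 20                        # assumed fuel economy
gallons = miles_round_trip / mpg

normal_price = 4.00             # $/gallon before the spike
spiked_price = 8.00             # $/gallon after a doubling
extra_cost = gallons * (spiked_price - normal_price)
```

Even under a full price doubling, the marginal fuel cost of evacuating comes to $80, which is the comparison the comment is making against a motel night or storm damage.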
As with many things, markets do work, but people don’t make rational choices for their well-being.
No, but commercial trucks use diesel, which carries about 25% higher taxes per gallon. And vehicle registration on semi-trailer trucks is significantly higher as well. They pay, on average, between $25,000 and $30,000 in taxes and fees each year.
> Turns out markets are pretty good when you leave them alone.
No, they aren't. They're ridiculously bad when you leave them alone because someone captures the market, ramps up anti-competitive practices, and immediately begins rent-seeking as hard as possible.
Free markets are pretty good at finding good prices. Markets that are left alone do not remain free. That lauded "self-interest" encourages businesses that have reached nearly 100% market share to increase profit in other ways.
That's a bad argument. There are gasoline trucks with a GVWR of ~20,000 pounds and diesel cars that weigh less than a Honda Accord. If you actually wanted to do that then you'd instead do something like tax based on axle weight and miles traveled, e.g. by reading the odometer during inspections.
The better argument is that diesel is worse for air quality, and then it's a Pigouvian tax in proportion to how much you burn.
The realpolitik argument is that fewer people have diesel vehicles and democracy is two wolves and a sheep voting on what's for dinner. But taxing commercial trucks is also a pretty sneaky way of taxing ~everything while pretending to not, so it's also the principal/agent problem. Legislators want to spend money while pretending not to take it from you.
> diesel cars that weigh less than a Honda Accord.
It is taxed less than gas in much of Europe, where diesel is more common. You also need to factor in fuel economy versus gas, which is higher for diesel, so more miles of road wear per gallon. CO2 was part of the debate in Europe: even though diesel has longer carbon chains, and so a worse CO2 ratio per calorie, the engines are more efficient. Diesel is worse for local air, better for long-term CO2.
There are a mixture of factors and lobbying behind the differences, road wear is one. Farm fuel with no road wear isn't taxed much at all in lots of places and is more often diesel.
In theory diesel hybrids would be even more efficient, but diesel engines and hybrid transmissions both add up-front cost, and further efficiency improvements have diminishing returns: cutting a $100 fuel bill by 30% saves $30, but cutting the resulting $70 bill by another 30% saves only $21.
> There are a mixture of factors and lobbying behind the differences, road wear is one.
Road wear is the irrelevant one in terms of fuel. Because of the fourth power law, essentially all road wear is from full-size buses and semi trucks. The contribution from passenger cars and even the likes of diesel pickup trucks rounds to zero. Meanwhile the largest vehicles use a minority of the fuel because there are several times more passenger cars than semi trucks.
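A rough sketch of the fourth-power rule that comment invokes. The 18,000 lb reference axle is the standard ESAL baseline; the per-vehicle axle loads below are illustrative guesses, not measurements:

```python
def relative_road_wear(axle_load_lbs, reference_load_lbs=18_000):
    """Fourth-power rule of thumb: pavement damage per axle pass scales
    roughly as (axle load / reference axle load)**4. The reference is the
    standard 18,000 lb equivalent single axle load (ESAL)."""
    return (axle_load_lbs / reference_load_lbs) ** 4

# Illustrative axle loads: a passenger car axle (~2,000 lbs) vs a
# fully loaded semi-truck axle (~17,000 lbs).
car_wear = relative_road_wear(2_000)
semi_wear = relative_road_wear(17_000)
wear_ratio = semi_wear / car_wear   # thousands of times more wear per axle
```

Under these assumed loads one semi axle pass does on the order of 5,000 times the damage of one car axle pass, which is why the comment says passenger-car road wear rounds to zero.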
"Someone captures the market" is the thing that happens when the government micromanages them. Laws that charge more per unit to high users aren't anti-trust laws. A farm doesn't have higher market share in food than Google has in a tech market just because it uses more water.
I am not saying that there should be no regulations on monopolies. We are discussing a very specific market intervention, namely the proposal to
> systematically [use] a pricing structure that charges disproportionately more for usage above high thresholds.
This is what I'm arguing is a bad idea, by using gasoline as an example.
If you want to argue that imposing this pricing structure systematically is good because it would help prevent a bad monopoly like Standard Oil, you'd need to explain (a) how this market intervention would prevent monopolies and (b) how it's a "better" way (according to however we decide to measure "better") to prevent monopolies than the alternatives. I don't see how this is true, though.
> Turns out markets are pretty good when you leave them alone. But when they're not left alone (as is the case with water today!!) you get some weird shit.
excuse me? leave the markets alone? to do what? continue screwing people over with the cost of living? at some point the government needs to step in when greed outstrips the ability of the consumer to meet the demand. capitalism on its own will demand ever increasing profits and that is simply unsustainable for any civilisation
I disagree. A large part of the cost of a utility is fixed per customer. Or any product really; that's how bulk purchasing makes sense. I can get 4x the product at a bulk store for 2x the price. Instead of being prejudicial about the use case, let's just charge what the utility actually costs. Include capital, operation, and decommissioning costs. That way, if you get a sudden spike in demand, you have the cash flow to issue a bond and scale up.
This would be an extremely regressive pricing structure that still has the same punchline: somehow residential users pay more to still not have any water.
1. You're thirsty and need a sip of water? That should be free
2. You're a household using water? That should cost progressively more the more you use beyond what's typically needed
3. Your business model requires you to evaporate every last drop of water in a desert region? That should be so prohibitively expensive that your business model does not work
This is basically just a low free-use threshold plus a steeply rising rate above it; you just need to pick the exponent.
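A minimal sketch of that three-tier idea as a single rate function. The free allowance, "typical household" figure, and exponent are all made-up assumptions chosen to show the shape, not anyone's actual tariff:

```python
def marginal_rate(gallons, base_rate=0.005, free_allowance=500,
                  typical_use=8_000, exponent=3):
    """Price per gallon as a function of cumulative monthly use.
    Below free_allowance it's free (the sip of water). At or below
    typical household use it's flat at base_rate. Above that, the rate
    grows as a power of the ratio to typical use, so desert-scale
    evaporation becomes prohibitively expensive. All numbers are
    illustrative assumptions."""
    if gallons <= free_allowance:
        return 0.0
    return base_rate * max(1.0, gallons / typical_use) ** exponent
```

With exponent 3, a user at twice typical consumption already pays 8x the base marginal rate, and at ten times typical consumption pays 1000x, which is the "your business model does not work" regime.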
My understanding is that, at least in California, the alfalfa itself is often also exported to Gulf countries that are too dry for pasture or growing feed but want to raise cattle. Alfalfa really is almost dehydrated water, the way aluminum is solid electricity.
Of course, the farmers pay almost nothing for the absolutely gigantic amounts of water, and meanwhile they pester me to use a low flow shower head and charge me $400/month for a few gallons.
Until people/animals eat it, or it decomposes. Not saying this to suggest we should ignore the CO2 impact from data centers, but biomass is a pretty poor CO2 absorber unless it's cyanobacteria that fall to the ocean floor before decomposing.
Well, if you want to think about it that way (perfectly reasonable), you'd also want to consider the production of new alfalfa. Figure that at any given time, the world contains X amount of alfalfa, and that amount determines how much carbon is absorbed by the alfalfa industry.
I'm not sure any carbon is absorbed even by this metric. Unless we're growing alfalfa and sequestering it below ground.
You should probably also consider inputs to growing that alfalfa too. Even single order inputs like transportation, fertilizer, water, etc would likely have more carbon release than the carbon mass of the alfalfa.
Is alfalfa even one of the plants that will nitrogen fix from the air? Or is it all pulled from the growing medium?
Goes into a cow, comes out as methane. Cow dies/meat --> CO2. All the fossil-fuel transportation from alfalfa to cow to brisket --> CO2. A lot more CO2 generated than absorbed.
I think it’s a good thing when non-americans want to invest in businesses that operate in america. They’ll use that investment to hire americans and produce goods that are sold to americans. I’m sure american capitalists would love to be the only ones allowed to invest here, as they would be able to get more equity for their capital, but it’s not good for anyone else and not really even good for them long term. Smart countries try to attract foreign investment, not scare it away
It's quite regulated in the western US, but usually in the direction of guaranteeing water to incumbent landowners. Some people end up with really strong water rights, and they can be wasteful if the law helps them do so.
A big celebrity, I think one of the Kardashians, was fined a couple of years ago and forced to make changes when the city found that the big fountain in front of the home had no recirculation or such; it was effectively just an open faucet, because I guess keeping it algae-free was proving a hassle.
Regulation is not necessarily the same as protection; as other commenters note, the specific regulations around agricultural water use in the drier western United States often encourage wasteful agricultural uses of water.
The driest places tend to have the most tightly-regulated water.
And the wettest places tend to have the least-regulated water.
(Nobody talks about it because shortages make bigger headlines than surpluses do, but there's a ton of agricultural areas in the US that have too much water and where providing drainage for farm fields is much more commonplace than irrigating them is.
It doesn't really matter in this context, though, because folks hate datacenters in these water-rich areas just the same as they do everywhere else.)
I don't know the exact situation described above, but water rights are often linked to property rights, and those are regularly treated as sacred. It doesn't matter if the owners are foreigners and the law is outdated. And those with land often have more money and power than the small government with jurisdiction, assuming the lobbyists haven't taken control of the latter.
They indeed are treated as sacred; it's enshrined in the Takings Clause of the US Constitution. The big problem in the American West is that the model of property rights in water sources makes it very difficult, as a technical matter, to put a price on a specific claim and to adjudicate disputes without triggering a cascade of pricing and rights dilemmas upstream and downstream (figuratively and literally).

Western states could in theory exercise eminent domain to take back water rights, and I think they occasionally do, but it's very fraught from countless legal angles even before getting into the politics, which compound the headaches a hundredfold (partly because of the interdependent nature of everybody's rights). Most of the time Western states try to hack around the issues with complicated regulatory and taxing schemes to claw back some semblance of control over water resources. But it's very inefficient and ineffective.

Property rights are useful because you don't need to centralize all pricing and usage decisions, or when you do (e.g. regulation, taxation, eminent domain) the mechanisms for applying those decisions are simpler and more mechanical; but Western water rights are just a different kind of beast. What's needed is comprehensive reform that shifts the American West to a better water rights model, specifically a better model for how property rights inhere in water resources, to drastically improve transactional efficiency from both a legal and a market perspective. But there's no simple way, and in particular no cheap way from a budgetary perspective, to get there even if the motivation existed to get around the monumental collective action problem, which it doesn't.
> But there's no simple way, and in particular no cheap way from a budgetary perspective, to get there even if the motivation existed to get around the monumental collective action problem, which it doesn't.
It seems like maybe there is though.
The first problem is the "use it or lose it" provisions where someone has the rights to use water but not sell it, thereby encouraging waste. That one has a solid solution: If they have the right to use it, they get the right to sell it. Make sale inalienable from use. Then you don't have to pay them anything because you're giving them something instead of taking it. But you get higher water availability as now all these people wasting "free" water start selling it because the opportunity cost of not selling it is now worth more than the wasteful use. The only "problem" here is that they get a windfall, but we can solve that in the same way as the second "problem".
Which is the takings clause. The purpose of that is to prevent unequal takings. If the government needs your land to build a railroad, they have to pay you for it, because they're taking yours but not anyone else's. Whereas when they take everyone's property at the same rate it's called property tax, and that's allowed. So if you just got a windfall of water rights in a dry place, congrats, you now have a valuable property right which is subject to property tax. Not using the water and don't want to pay the tax? Then sell the water. Since the buyer values it at more than you do, and the tax is less than 100% of the value, everyone comes out ahead compared to the status quo. The previous inefficient user gets $100 in money instead of $10 worth of inefficient use, the government gets some proportion of that in new tax revenue (variously property tax on the rights and income tax on the sale), the buyer gets water it values at >$100.
Can you explain the issue from a more basic level for people who don’t know? what i’m imagining is that, like, an aquifer might connect over a very large area and every property owner in the area has the right to extract as much water as they want from it? Leading to a tragedy of the commons situation that states are unable to regulate for some reason?
The problem is that vendors and developers have repeatedly shown that if you give them an inch, they take a mile. Look at exactly what happened with BlueHammer this month. The security researcher went full disclosure because Microsoft didn't listen to their reports.
Disclosure is vital. It's essential. Because the truth is, if a security researcher has found it, it's extremely likely that it's already been found by either black hats or by state actors. Ignorance is not actually protection from exploitation.
The security researcher also has a responsibility to the general public that is still actively using vulnerable software in ignorance. They need to be protected from vendor and developer negligence as well as from exploits. And the only way to protect yourself from an exploit that hasn't yet been patched is to know that it is there.
The situation with e.g. BlueHammer is fundamentally different: there, the only party that could act on it (Microsoft) ignored them. In this case, the parties that could act on it weren't notified at all.
I'm also not proposing delaying the disclosure to the general public at all. They already waited 30 days for that, which is fine. Just look a bit further than your checklist of only contacting upstream, and send a mail to the distributions if they haven't picked the fix up a week or two before disclosure.
Downstream vulnerability disclosure is a negotiation between the downstreams and the upstreams. It is not the job of a vulnerability researcher to map this out perfectly (or at all).
Yes and that's why the current system where security researchers are expected to reach out to the distro mailing list is flawed and instead there should be a defined pipeline for the kernel security team to give a heads up.
"Prior to Project Zero our researchers had tried a number of different disclosure policies, such as coordinated vulnerability disclosure. [...] We used this model of disclosure for over a decade, and the results weren’t particularly compelling. Many fixes took over six months to be released, while some of our vulnerability reports went unfixed entirely! We were optimistic that vendors could do better, but we weren’t seeing the improvements to internal triage, patch development, testing, and release processes that we knew would provide the most benefit to users.
[...]
While every vulnerability disclosure policy has certain pros and cons, Project Zero has concluded that a 90-day disclosure deadline policy is currently the best option available for user security. Based on our experiences with using this policy for multiple years across thousands of vulnerability reports, we can say that we’re very satisfied with the results.
[...]
For example, we observed a 40% faster response time from one software vendor when comparing bugs reported against the same target over a 7-year period, while another software vendor doubled the regularity of their security updates in response to our policy."
>Linux distros (specifically) act in this way
carving out special exceptions based on nebulous criteria is a bad idea. 90+30 is what has been settled on, and mostly works.
Because a situation where the development team fails to appreciate the severity of a security vulnerability, and where the established procedure requires the researcher rather than the kernel team to communicate with downstream users, is already a major failure of process. Security is not just patching the vulnerability, and it seems that the Linux kernel developers, or at least the Linux kernel security team, do not understand that.
This is the result of that failure.
If this were any other software, we'd be here with pitchforks and torches. The researcher gave the developers timed disclosure, and even waited until after the developers had patched the issue. And... it's still a problem.
No, this was already timed disclosure. This is very common and widely accepted. 90+30 is what Google Project Zero uses, for example. The security researcher has met their ethical requirements already. This is entirely on the kernel's security team for failure to communicate downstream. That is their responsibility.
The thing is, malicious actors are already monitoring most major projects, doing source or binary analysis to figure out whether changes were made to patch a vulnerability. So, as soon as you actually patch, you really need to disclose, because all you're doing by not disclosing the vulnerability is handing the bad actors a free go. The black hats already know. You need to tell the white hats, too, so they can patch.
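This patch-watching is cheap to do. As a toy illustration (the commit messages below are invented, and real attackers diff the code itself, not just messages), even a naive keyword scan over a release changelog flags likely silent security fixes:

```python
import re

# Invented commit messages, standing in for `git log --oneline` output
# between two stable releases. These are hypothetical, not real kernel history.
commits = [
    "a1b2c3d net: fix use-after-free in socket teardown",
    "d4e5f6a docs: update maintainer entry",
    "b7c8d9e fs: prevent out-of-bounds read in path lookup",
]

# Memory-safety vocabulary that often marks a silently patched bug.
SECURITY_HINTS = re.compile(r"use-after-free|out-of-bounds|overflow|double free", re.I)

suspicious = [c for c in commits if SECURITY_HINTS.search(c)]
print(len(suspicious))  # 2: both memory-safety fixes; the docs commit is skipped
```

Anyone can run this over every stable release the moment it lands, which is exactly why a shipped patch is effectively a disclosure.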
I'm not advocating for delaying the disclosure at all; my point is, if you see your initial disclosure to the kernel didn't go anywhere, to be responsible is to put in a little extra effort to ensure the fix is picked up before you disclose.
"Didn't go anywhere"? The kernel devs patched it! They patched it weeks ago! The kernel security team needs to communicate security problems in their own releases, because that is where the distros are already looking.
Requiring the security researcher to do it is insane. Should a security researcher that identifies a vulnerability in electron.js need to identify every possible project using electron.js to communicate with them the vulnerability exists? No. That's absurd.
> The kernel devs patched it! They patched it weeks ago
FTFA:
> I see that on the 11th of April 6.19.12 & 6.18.22 were released with the fix backported.
> Longterm 6.12, 6.6, 6.1, 5.15, 5.10 have not received the fix and I don't see anything in the upstream stable queues yet as I write.
I wouldn't go so far as to call this "the kernel devs patched it". Virtually none of the kernels that distros are actually using today have received a fix. This looks like an extremely lackluster response from the kernel security team.
Pretty much the only non-rolling distros that are shipping a fixed kernel are Fedora 44 and Ubuntu 26.04, both released in the last few weeks. Their previous releases both shipped with Linux 6.17, which is still vulnerable today!
None of this impacts disclosure norms. One important reason the clock starts ticking faster once any patch lands is that for serious attackers, the patch discloses the vulnerability. That's quadruply so in 2026, when many orgs are automatically pumping Linux patches through LLM pipelines to qualify them for exploitability.
But it's been at least 15 years since "reversing means patches are effectively disclosures legible mostly to attackers" became a norm in software security. And that was for closed-source software (most notably Windows). The norms are even laxer for open source.
I don't know if you are or you aren't, but that's the overall topic of the thread, and I'm just clarifying that the details you're adding don't change any of the norms of disclosure.
I'm on Fedora 43 and tried to hack myself with the python script. It didn't work on kernel 6.19.12-200.fc43.x86_64 which has a build date of April 12, 2026
> Should a security researcher that identifies a vulnerability in electron.js need to identify _every_ possible project using electron.js to communicate with them the vulnerability exists? No. That's absurd.
But this is a false comparison, right? The scopes of "Linux distributions" and "electron apps" are orders of magnitude different. If the reporter had spot-checked one or two of the most popular distributions to see if fixes had been adopted, that seems like a nice extra level of diligence before publicizing the details.
It doesn't seem "insane" as much as "not the most efficient path" as has already been well argued. But it also doesn't seem unreasonable to think in a project of the scope of the Linux kernel, with the potential impact of fairly effective(?) privilege escalation, some extra consideration is reasonable--certainly not "insane" at the very least?
They embargoed their vulnerability for 30 days after Linux landed a kernel patch. They did their part. You will always be able to come up with other things they could do for you, and they will always at first blush sound reasonable because of how big and important Linux is, but none of those things will be responsibilities of the vulnerability researcher. Their job is to bring information to light, not to manage downstreams.
About half the thread we're on reads as if the commenters believe Xint made this vulnerability. They did not: they alerted you to it. It was already there.
I realize you've been championing this idea in the thread, and I admire it because I also recognize the misdirected blame. Please understand I do not harbor "blame" for the researchers.
> Their job is to bring information to light, not to manage downstreams.
The researchers are also members of a community in which their actions may deal more harm than necessary. Nuance must exist in evaluating "reasonable" and "responsible" in the context of those actions.
I strongly disagree. I want the information. I don't want to wait longer to find out about critical vulnerabilities so that researchers can fully genuflect to whatever Linux distribution norms people on message boards have. Their "actions" were to disclose a vulnerability that already existed and was putting people at risk. It's an absolute good.
If it helps you out any, even though my logic was absolutely the same and just as categorical in 2012 as it is today: there are now multiple automated projects that run every merged Linux commit through frontier models to scope them (the status quo ante of the patch) out for exploitability, and then add them to libraries of automatically-exploitable bugs.
People here are just mad that they heard about the bug. Serious attackers had this the moment it hit the kernel. This whole debate is kind of farcical. It's about a "real time" response this week to a disaster that struck a month ago.
I do get that; in this era of automation, things are too responsive not to go public to provoke action. I think I might just be wistful for an era in which the alternate path might have made a difference. Sorry to pile on.
The only word doing any work at all in that definition is "artifacts", and the problem is that the methodology that is actually foundational to engineering need not be applied to physical objects. Further, it's not clear that this methodology shouldn't be rigorously applied to non-"artifacts" that can cause equal or greater harm when created negligently.
The definition I always saw used was this one, I think:
> Engineering is the profession in which a knowledge of the mathematical and natural sciences gained by study, experience, and practice is applied with judgment to develop ways to utilize, economically, the materials and forces of nature for the benefit of mankind.
This sounds like it should exclude software design and development. Except it doesn't need to, and it's not really useful to exclude it simply because the definition isn't broad enough. The definition isn't engineering. The definition is trying to describe and encapsulate the reality of engineering. Nuclear and modern electrical engineers frequently never create anything physical in their careers whatsoever. Nuclear engineers manage power generation at facilities that others designed and built, while electrical engineers are frequently just dealing with signal processing. They are not less rigorous in their methodology.
The reality is that engineering is the methodical application of constraints to solve a problem. And it is the methodology that is the valuable aspect. The knowledge is necessary for each discipline, but it is itself fundamentally a prerequisite. There is a reason engineering is a single school of many disciplines.
Meanwhile, the reason that software engineering looks like half-art and half-guess has a lot more to do with software, as a non-theoretical field of study, only being about 60 years old in practical terms. The fundamental works of the field, like The Art of Computer Programming, haven't even been finished yet.
Whatever happens to software development and operational systems administration in the next 50 years, however, both roles would almost certainly benefit society by becoming actual professions. Their responsibility to society as a whole has been allowed to be understated, and we're well past the days when a computer bug causing the kinds of deaths and damages we'd see from a civic-works failure or automotive design flaw sounds unreasonable. Indeed, that actually sounds fortunate given some of the software catastrophes that have occurred.
>The only word doing any work at all in that definition is "artifacts"
That's the subject, the only word that is NOT doing any work there (since both regular and software engineering produce artifacts).
Words that do the heavy work in that phrase are:
- structured,
- mature,
- legally enforced,
- standards-based approach

for

- repeatable,
- reliable,
- verifiable

artifacts, under stable external constraints.
Software can sometimes appear to touch those.
E.g. there are "standards", like HTML or ARIA, so it's "standards-based" too! But those standards are loosely enforced, usually not mandated, loosely defined, and implemented ad hoc with all kinds of deviations.
Or e.g. software can sometimes be repeatable, e.g. reproducible builds (to touch on one aspect). But that's again left to the implementor and seldom followed (almost never for most software work, only in niche industries).
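As a minimal sketch of what "repeatable" means here (the build functions are stand-ins, not any real toolchain): a reproducible build yields a bit-identical artifact from identical inputs, and something as small as an embedded timestamp breaks that property.

```python
import hashlib
import time

def build_deterministic(src: bytes) -> bytes:
    # Output depends only on the input: build twice, get the same artifact.
    return b"ELF" + hashlib.sha256(src).digest()

def build_with_timestamp(src: bytes) -> bytes:
    # A classic reproducibility bug: embedding the build time in the artifact.
    return b"ELF" + hashlib.sha256(src).digest() + repr(time.time()).encode()

src = b"int main(void) { return 0; }"

reproducible = build_deterministic(src) == build_deterministic(src)

first = build_with_timestamp(src)
time.sleep(0.01)
second = build_with_timestamp(src)

print(reproducible, first == second)  # True False
```

This is the kind of property real engineering disciplines mandate and verify; in software it's opt-in.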
In general, software is not engineering (in the strict sense) because it's anything goes, all the above conditions can or cannot be handled (in any random set), the final work is a moving target, and verification is fuzzy, if it even happens.
>The reality is that engineering is the methodical application of constraints to solve a problem.
In that case, following specific constraints to solve a math problem, or to draw an artwork (e.g. using perspective), is also "engineering". That's too loose a term to be of any use.
Even accepting that, the degree of "methodological" in software "engineering" versus e.g. civil or aviation engineering is orders of magnitude less.
> In that case, following specific constraints to solve a math problem, or to draw an artwork (e.g. using perspective), is also "engineering". That's too loose a term to be of any use.
You're being deliberately obtuse and consistently choosing semantic equivocation, so I'm not really going to engage with you any further, but the point is that there is a specific methodology that is unique to engineering. You've heard of the Scientific Method. There is also the Engineering Method.
Ultimately, though, the problem is you're arguing that the map is the territory: that the arbitrary linguistics we have chosen in the past must always and forever be the same. And that's just not a useful model of reality.
The Welsh or Icelandic "ll" is not quite the same. That's a "voiceless lateral fricative", lacking the alveolar break that earned it the "t" in "tl" for the Latinized spelling. It's much closer than most languages get, but it is a different sound.
The Nahuatl consonant is a "voiceless alveolar lateral affricate". It is a single consonant, represented with [tɬ] or, more correctly, with a tie bar between those two glyphs: [t͡ɬ].
I stand corrected; you are right, there is no isolated use of [ɬ] in Nahuatl as a phoneme. It is used only in the context of the affricate /t͡ɬ/.
I got ahead of myself in trying to isolate the sound [ɬ] for untrained ears.
To get back to the original point, though: if I'm not mistaken again, in standard Mexican Spanish /ʃ/ as a phoneme is lost entirely and only appears in the affricate /t͡ʃ/? So in all likelihood the original /ʃ/ in axolotl would be pronounced by way of habit as [t͡ʃ] if you try to "correct" Mexican Spanish speakers (unless again you have, say, an Argentinian dialect where e.g. "ll" (/ʝ/) in llamar is pronounced as [ʃ]).
It's not just DLL hell. Cygwin was also notorious for being really out of date. Security vulnerabilities and missing features were both very common at one point.
USB-C is rated for 10,000 connections, while Lightning is rated for 40,000. Except if you disconnect and reconnect your phone 4 times a day every day of every year you own it, 10,000 is enough for just under 7 years. And Lightning was introduced in 2012, while USB-C was 2014. In those days, the average lifespan of a smartphone was 2.5 years. Even today, the software is only supported for 7 years at most. You don't need a connector that's going to last nearly 30 years.
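The arithmetic above is easy to check, assuming four plug/unplug cycles per day, every day:

```python
# Rated connection cycles (the figures cited above) divided by an assumed
# four plug/unplug cycles per day, every day of the year.
CYCLES_PER_YEAR = 4 * 365  # 1460

usb_c_years = 10_000 / CYCLES_PER_YEAR
lightning_years = 40_000 / CYCLES_PER_YEAR

print(round(usb_c_years, 1), round(lightning_years, 1))  # 6.8 27.4
```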
And the additional durability of Lightning is itself not free. It's not cheaper than USB-C. Quite the opposite. That additional cost means that it either uses more resources to manufacture, or more resources to make the tools to manufacture. So, it's just wasteful. Lightning is "physically superior" but USB-C is better engineering.
Apple knows that. So Apple chose to go with Lightning because it was theirs, not because it was better. Because it's not really better. Not better for the customer. Or really better for business. Apple chose vendor lock-in.
Worse than that, Apple's connectors are higher durability, but their cabling itself is awful. I work at a K-12 and we were in an iPad and Chromebook pilot back in the mid 2010s that ran about 4-5 years. We had a fleet of 3500 of each. The iPads saw less than half the usage hours as the Chromebooks, but had something like triple the incidence of cable replacement. The cable insulation splits. The plasticizers degrade, the cables get really sticky or oily, and then they split and expose the braided grounding sheath. That braided cable will shock you. That was true for both student and staff devices. So they had these wonderful connectors, but the cables still failed at effectively five or six times the rate of the alternative. And since they were proprietary, you couldn't just buy a better cable made by someone else! You had to buy the same cable that you knew was going to fail!
> And since they were proprietary, you couldn't just buy a better cable made by someone else! You had to buy the same cable that you knew was going to fail!
Codswallop! Aftermarket Lightning cables were readily available shortly after Apple first used the port.
Agreed though, their own Apple branded cables that came with the device are terrible, and I always just threw them straight in the bin.
And connection cycles is the wrong metric for USB-C vs Lightning. The correct metric is how many and how much side-force removals can the port withstand.
My experience shows that for USB-C the answer is wildly insufficient, whereas for Lightning it's sufficiently high that it won't be a concern.
IMX, the third-party cables are fine... if you're only ever interested in slow charging, with about half of them. They were real bad when we tried them.