Hacker News

an automated system that could check if a plane is about to land on a runway and show some kind of alert or red light is hardly a stretch of the imagination


That’s such a great idea that it already exists and is deployed at La Guardia.

https://www.faa.gov/air_traffic/technology/rwsl


Thank you for providing your aviation knowledge to this discussion. What a classic example of tech people thinking that because they're smart, every other industry must be dumb and they can just jump in and fix it.


I also do not like this persistent tone of “everyone else is stupid; software would easily fix it” that pops up so often. Not all problems are easy to fix with some code.

To be clear, though, I don’t even have significant aviation knowledge. But this isn’t hard to learn about. That’s part of what irks me so much about this tone. It’s not just “I’m so smart”; it’s “I’m so confident that you’re dumb that I don’t need to know anything about the domain you’re working in to know better than you”. Someone could ask ChatGPT why airports don’t have stoplights to stop traffic from crossing the runway and it would reveal the existence of this system.


Yes, in fact I had considered adding your same thought to my initial comment. It's not impossible that a smart tech person might be able to improve the existing systems. The problem is the arrogance of not even checking what existing systems there might be, as if obviously they'd be too backwards to have any.


> "I don’t need to know anything about the domain you’re working in to know better than you"

This frustrates me to no end. Is it just an example of the Dunning–Kruger effect?


Something like that. It feels a bit different because it’s less about overestimating one’s knowledge/ability and more about underestimating the complexity of domains outside one’s expertise. But yeah. Very similar.


Me too, but I don’t like referring to Dunning-Kruger ever for multiple reasons. There are perfectly good labels like cockiness, arrogance, ignorance, presumptuousness, and wrongheaded. ;)

There are many issues with DK, and the paper’s widely misunderstood. For one, the primary figure demonstrates a positive correlation between confidence and competence, so according to DK’s own paper, high confidence is not an indicator of incompetence, contrary to popular belief.

The paper also measured things in a very funny way (by having participants rank themselves against other people of unknown skill), it measured only very simple things (like basic grammar, and ability to get a joke), it polled only Cornell undergrads (no truly incompetent people), and there were a tiny number of participants receiving extra credit (which might exclude the As and Fs in the class).

Many smart people have come to the conclusion that DK is a statistical artifact of the way they did their experiment, not a real cognitive bias. Some smart people have pointed out that DK is probably popular because it’s really tempting to believe: we like the idea of arrogant people getting justice. The paper also primes the reader, telling them what to believe even though the title isn’t truly supported by the data. It’s an interesting read that I think would not pass today’s publication criteria.

Anyway, sorry, slash rant.


Agreed, but I see this in every industry. And though it's certainly arrogant on some level, I think of it in a more positive light: people are generally optimistic and want to solve problems.

My grandfather had a rule at his business for 55-ish years: we welcome your ideas and suggestions, but not for the first year. You spend that time learning our processes, decisions behind them, pain points, areas that need improvement, etc. You also spend that time doing the work and hearing from your colleagues. Then you can (hopefully) make informed suggestions. That's not possible in every situation, but I like the intent.


> people are generally optimistic and want to solve problems.

This is an amazingly positive spin on the behavior.


I meant something in-vehicle for ground vehicles: an extremely simple extrapolation of the vehicle's current velocity against the extremely predictable trajectory of a landing plane, instead of depending on going back and forth over radio with a very busy, fallible human. But sure.

even my cheap car has geofencing and automatic braking

I've worked on avionics professionally and I haven't crashed any of my planes yet...
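For what it's worth, the "simple extrapolation" being proposed can be sketched in a few lines. This is a toy illustration only, not anything resembling a certified avionics design: all names, the linear constant-speed assumption, and the 15-second conflict window are made up for the example, and it ignores everything that makes the real problem hard (sensor noise, braking distance, intent, false-alarm rates, certification).

```python
from dataclasses import dataclass

@dataclass
class Track:
    position_m: float  # distance remaining to the runway crossing point, metres
    speed_mps: float   # current speed toward the crossing, metres/second

def time_to_crossing(track: Track) -> float:
    """Linear extrapolation: seconds until the track reaches the crossing.

    A stopped or receding track never arrives, so return infinity.
    """
    if track.speed_mps <= 0:
        return float("inf")
    return track.position_m / track.speed_mps

def incursion_alert(vehicle: Track, plane: Track, window_s: float = 15.0) -> bool:
    """Alert if vehicle and plane reach the crossing within window_s of each other."""
    return abs(time_to_crossing(vehicle) - time_to_crossing(plane)) < window_s

# Vehicle 100 m out at 10 m/s arrives in 10 s; plane 3000 m out at 75 m/s
# arrives in 40 s -- 30 s apart, no alert.
print(incursion_alert(Track(100, 10), Track(3000, 75)))   # False
# Same plane, vehicle 300 m out: both arrive within 10 s of each other.
print(incursion_alert(Track(300, 10), Track(3000, 75)))   # True
```

The point of the sketch is that the extrapolation itself is trivial; the hard parts are everything around it, which is presumably why deployed systems like RWSL took years to field.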


“These lights … turn red in response to traffic, providing direct, immediate alerts without the need for input from controllers”.

It will be interesting to see what the report says. Did the light system not function? Did they override it? Do they ignore it consistently?

> geofencing and automatic braking

I’m not at all sure I want emergency vehicles to be blocked like this. And if they can override it, then it’s no different. They didn’t roll onto the runway by accident.

> I've worked on avionics professionally and I haven't crashed any of my planes yet...

Is this relevant somehow?


The habit where HN commenters greenfield solutions that are slightly worse versions of the ones experts already have in place is unmatched.




