> Enterprise knowledge has always been as much a human problem as a technology one. Nobody wants to do the structuring work, and every prior architecture demanded that somebody do the structuring work rather than their actual job
This is correct and very agreeable to everyone, but then, after some waffle, they write this:
> Structure, for the first time, can be produced from content instead of demanded from people
These quotes are very much at odds. Where is this structure and content supposed to come from if you just said that nobody makes it? Nowhere in that waffle is it explained clearly how this is really supposed to work. If you want to sell AI and not just grift, this is the part people are hung up on. Elsewhere in the article are stats on hallucination rates of the bigger offerings, and yet there's nothing to convince anyone this will do better other than a pinky promise.
I think the explanation comes later in the article:
"It is graph-native - not a vector database with graph features bolted on, not a document store with a graph view, but a graph at it's core - because the multi-hop question intelligent systems actually have to answer cannot be answered by cosine similarity over chunked text, no matter how much AI you paste on top."
And
"It has a deterministic harness around its stochastic components. The language model proposes but the scaffolding verifies. Every inference, every tool call, every state change is captured in an immutable ledger as first-class data and this is what makes non-deterministic components safe to deploy where determinism is required."
I also see a ton of this here on HN as the political topics have ramped up.
Not enough people are flagging those when it aligns with their bias. It's even less likely to get flagged when it's a double whammy of politics and AI. Loosely being about AI should not give it a free pass.
The problem is, the other side definitely does not play by the rules and has no intention to. Why should we "police our side" and weaken ourselves? The endless purity testing on the progressive side of politics is a large part of why the far-right is so damn powerful worldwide.
Ironically purity testing is something we should police.
We police ourselves because it makes us look better to moderates and defectors as the other side fails to fulfill its promises.
If the other side doesn’t police themselves, it’s their loss, because they’ll alienate fence-sitters and (mispredicting due to groupthink) fail even harder.
That's exactly what it says. There is a big fat ✗ in that row, as if Tailwind were not Free Software because it is not copyleft. If he wanted to point that out he could have used a separate row and written "(not copyleft though)" or something like that.
You're right, open source and free software are not the same thing, but software licenced under the MIT licence is still free software. Even the FSF describes the MIT licence as a free software licence (see my other reply in this thread).
Yeah I'm still not following the loaded premise of this question. It's just a table telling people what the project is about.
An MIT-licensed project trying to not scare people away might have the same comparison table in their readme. They'd just flip around the green checkmark and red X.
> Here is an idea: UI mostly server-driven, server-side rendered with the use of HTMX
Yeah it's more HTMX evangelism.
I agree with you, but I can sort of see their point. I have successfully advocated for and implemented very barebones static HTML/CSS pages for some clients in the past, but that's rarely the right choice. They wanted deep control of the exact markup and style because they cared the most about SEO, WCAG compliance, responsive mobile design, and legacy browser support. They did not have much interactivity apart from a couple of form tags. They even worked flawlessly without javascript enabled.
I know that sounds appealing to the naive and stubborn types who hate all other web dev, but there's a catch. The "simpler" a web page is, the more testing there will need to be. You're not reducing the complexity, but just moving it somewhere else. These pages were micromanaged into oblivion with frequent audits by large teams each with their own specific concerns. The majority of my time was coordinating a circus, not writing code. In that kind of situation, there is no other choice. It was an absurd amount of testing to make that work. All that for pages that barely did anything, and questionable business value. If you want your "bullshit job", look no further than that type of web dev.
The misunderstanding I see over and over is not realizing how broad web development is. I wouldn't be doing my job if I didn't optimize for maximum flexibility of the most actively developed implementation details the client cares about. The higher value web dev is interactive functionality, not bullshit marketing pages.
All I can really say is HTMX is not a tool I see myself ever reaching for any serious production use case. It will never beat plain static pages on flexibility, speed, or scalability. It will never beat serious web app frameworks like React or Vue on developer ergonomics and tools. More generally, server-side rendering is shooting yourself in the foot the moment you need to host off a CDN or migrate. The list of downsides is endless.
What makes you think a sustainable negative social/political trend laser focused on AI is even possible?
Statistical approaches were already extremely unpopular socially and politically long before AI came around. Have you considered that it just doesn't work?
Whole fruit also has a lower glycemic index due to the fiber. This slow release of sugar helps reduce insulin resistance and balance out hormone response in general.
Hormonal imbalance is severely underrated as a root cause of common mental health issues like anxiety, depression, etc.
Having fruit in the morning is a little boost without the guilt. Adding in some light exercise, like walking, also helps prime the day. It even gets easier to wake up early for all this the more regularly it's done. It's one big reinforcement cycle for healthy habits.
You know, I was actually hoping for a good listicle of things to watch out for in meetings. The author should take their own advice. Assuming bad faith immediately kills all productivity, so there's no point in finishing reading this.
I agree with the general notion that there are often knowledge gaps getting in the way of better planning and execution. I was hoping for techniques to overcome them, but (sigh) I guess that's just more "engineering" getting in the way.
I've been doing this for long enough to realize there's no substitute for experience. It's basically the opposite of all the popular advice. If you're serious about any successful long-term career, you can't avoid looking foolish and having lots of difficult discussions. There are no shortcuts. There is no "higher path" you're missing out on. If you're going to grind it out, at least save face by working at the "shitty places" with bad reviews on glassdoor where you can safely fail without damage to your ego or reputation. When you finally get hired somewhere nicer mid-career, you can just bury all that in your mind and pretend it never happened. Nobody cares anyway.
If we're going to be judgy, I gotta say some of the worst people I've ever worked with never got out of that phase. It's that simple.
> Assuming bad faith immediately kills all productivity, so there's no point in finishing reading this.
First, the author is not assuming bad faith. They are saying that judging people is a common pitfall. And "hating or dismissing people for misunderstanding the thing you documented badly" is something I have seen done so many times that, yep, it exists.
But the second, unrelated thing is: sometimes there really is bad faith. Refusing to accept that a bad-faith situation can happen just makes the issue massively harder to solve. It empowers the person acting in bad faith.
Judging people doesn’t imply bad faith. We are all judging people all the time. It takes constant effort to stay aware of it and to compensate for it.
Oh the author continued by saying: "Stop assuming they are bad at their job or their lives".
This was too absurd and hostile for me to continue listening.
I asked myself whether I thought the author was bad at writing, and realized I fell into their trap.
I asked myself how lost and angry someone has to be to write crap like this, and realized I did it again.
Some people have a real knack for being so defensive and insecure that they invite their own pain. They unwittingly coerce people who meant them no harm into doing so. Everyone is a victim for trying to take this blog post seriously.
I take it seriously and don’t feel like a victim at all. Maybe you never think that someone is bad at their job or their life, but many people do, myself included. It’s neither absurd nor hostile when someone points it out. The article isn’t about morals; it’s about what a constructive way to get useful results looks like.
Was it really the iMac that did it? I don't think I remember anyone saying that until recent years. Around the time of the first iMac, just about every home PC already had a pair of USB 1.1 ports because of Windows 98 and a lot more plug and play support.
What I recall being sold for Mac were FireWire peripherals back in the late 90s and most of the 2000s. By 2000, USB 2.0 was too good to ignore and addressed all the pain points manufacturers had with USB 1.1 being too slow. That's when I remember USB drives finally being practical and mainstream.
IIRC, the Mac magazines I read at the time were making this claim.
The magazines may have been wrong and their claims turned into an urban legend in the meantime, but it's part of the general sense of what I recall from, ugh, nearly 30 years back now.
The iMac certainly accelerated the adoption curve. There were USB ports on other PCs, but since they also had normal (at the time) ports, no PC users were going out and buying all-new USB peripherals.
Apple's decision to leave out all the other ports meant that a bunch of folks were forced to buy new USB peripherals (and/or adapters), and gave peripheral manufacturers a dedicated market for USB.
I don't buy this. The first 2 generations of iPod didn't even have USB connectors, only FireWire, which was a PITA: most PCs had a USB connection by that time, but FireWire wasn't common outside of Macs.
I don't think that runs contrary to my point in any way?
By the time the iPod came around, Apple had adopted FireWire to handle devices that USB's then-limited bandwidth couldn't really support. USB peripherals like mouse/keyboards were already pretty widespread by then.
Sure. The point I was making was not that Apple was first, nor that early USB was good, though I don't speak for swiftcoder.
When Firewire was introduced, it wasn't ever popular enough to get the self-sustaining popularity loop of "all the machines have it" <-> "all the peripheral makers support it".
Apple made that happen for USB. Not because USB was amazing in 1997, but because it was the only thing on what was then the cheapest new Mac.
Yup, I do think that's true for Mac users, and questioning Apple fans' just-so stories is usually worthwhile to anyone curious. They sure do keep history alive and well. I mean, hey, The Beatles made an entire career out of doing that with their fans!
I just realized this month was Apple's 50th anniversary, so that's likely part of why this is making the rounds. I guess I have my answer.
Indeed. So bad that no one apart from Apple would have tried to go all-in on it. I doubt things like USB mice and keyboards would ever have happened if Apple hadn't given it a kick in the behind.
Firewire was indeed a nice addition when that came along, but it always remained the domain of pricey high-bandwidth devices.
I did some more digging to find out that the infamous BSOD demo with Bill Gates on stage was meant to show off plug and play on Windows 98. It was caused by a USB scanner.
This happened 4 months before the release date of the first iMac.
Yeah, I mean, I don't disagree with any of this. I just don't think any vendor in the PC world was prepared to rip the PS/2 and serial ports off their motherboards and tell all the peripheral manufacturers to go take a hike.
I do get that it's stressful to raise a family. You're being held accountable for many things you don't have much control over, but I don't think this is a big deal either way. This is a false dichotomy like all the other nonsense aimed at parents.
It misses the point entirely to seek control over whether your kids are "free range" enough. That style of parenting worked so well in the past (it didn't really, but I digress) because they left well enough alone. They weren't trying to contrive anything. Your kids absorb everything from you. Don't let your insecurities be part of it.
What I would argue is much more important is keeping things fresh with new opportunities. That's your main job as a parent. Keep them thinking and engaged with their mostly self-directed path in life. The goal is to open their eyes and help them understand the world. Respect their intelligence and let them decide things on their own.
Many of those so called free range childhoods of the past were actually just empty and boring. That's when they got into trouble. That's not something to be nostalgic about. When I hear about trends like this I have to wonder if some parents are just looking for excuses to be lazy.