Would be nice, but you know they'll carve out exceptions for themselves or use "unauthorized" messaging channels regardless with no consequences. It is _always_ "rules for thee, not for me" with politicians.
Should companies be forced to retain talent of a certain age group? Should they be forced to retain less competent people? How do you expect this to work?
In Sweden, the Employment Protection Act (LAS) mandates 'last in, first out': if there are layoffs due to over-capacity, people with seniority (years of employment) take priority for the remaining positions. This is partitioned by profession group, so yes, you can fire nurses but keep doctors, or the other way around. (It's been a while since I looked into it, but that's the rough gist of it.)
Yes, and that makes working for a Swedish company so much better. You know you can’t just be shown the door at any moment after years of service and you get a lot of peace of mind which is worth more than the inflated salaries in the US. There is still a way to get rid of people, of course, but that goes a little like the Japanese do: just don’t give any important work to the person, or give them a bad performance review. People quickly understand they need to move on and they can do it with dignity.
That also means that a) it's harder for younger people to get a stable job b) the bare minimum of work not to get fired decreases over time, which is bad for productivity.
If a significant share of your employees optimise for doing the least work possible without getting fired, you have a huge problem anyway. Usually, given the right conditions, people have an intrinsic interest in doing a good job. Even if their motivation is more of the extrinsic type, there is more to it than getting paid.
I have worked a fair share of those kinds of jobs in the past. The colleagues on my level who cared about more than being paid and not getting fired were the absolute majority. People want to belong. They want to work. The ones who are the exception to the rule can be weeded out pretty quickly. You do not work for an organisation for 10+ years, wake up one day, and switch to pure opportunism.
As for incompetent management, that problem cannot be solved by churning workers. It can only be solved by better career paths and selection processes for management roles. The most intelligent people in an organisation are often more interested in getting things done than in getting more power.
Yes, it only works in a high trust society where there’s plenty of jobs and people actually care about doing a good job (any company will have incentives, people can’t just sit around and do nothing, lots of social pressure too if you’re a slacker). But hey, that’s been mostly true (until recently, I hear immigrant unemployment is really high, while “local” unemployment is close to zero, but the official statistics sit in the middle at around 7% I think, much higher for the youth).
In my experience those people get juicy positions doing nothing useful, their competence having long since atrophied due to zero pressure to keep their knowledge up to date. Of course companies now hire "consultants" to work around the issue, so those get fired on a week's notice when money is tight. The warm bodies remain in their chairs until retirement. Inefficiency remains a huge problem in the Swedish economy, but no one dares to touch these archaic rules (BTW, no minimum wage in a European country, WTF?) for political reasons, so the immigrants get the blame for everything instead.
It's a choice: work hard with minimal security, get a better salary. Heck, one can do that in many EU places by working self-employed on contract (where legal) and being paid just for billed days, with no vacations or sick days. It's actually a pretty good career path at the beginning of one's career in software development: get more money and, e.g., invest in a property. Then get a more secure permanent position, coast more, and enjoy and appreciate those stability benefits.
But high economic performance this isn't. Adaptability to an ever-changing market it certainly isn't either. Europe is getting hammered by this and things will get much, much worse in the upcoming years. We will have to revisit our comfy, lazy attitude towards work, or end up being a stagnant place with 3rd-world salaries and corresponding QoL.
Switzerland is doing things much better; it's sort of in between both extremes and its economy reflects this very well. But EU leaders' egos will sooner accept poverty than admit that somebody figured things out better than them.
The Netherlands recognized the problems with the last-in-first-out system and requires that after a reorganization the statistical distribution remains the same. How well that works is hard to say, because the level of unemployment in The Netherlands has been quite low for many years.
What I hear is that Switzerland is a bad example. Many people there struggle to make a living.
The poverty line is derived from the guidelines of the Conference for Social Welfare (SKOS). In 2024, it was on average CHF 2388 per month for a single person and CHF 4159 for two adults with two children.
I live in Zurich (by far the most expensive city) and while 2388 (or 4159) would be tight (depending on housing) it would still afford you a fairly comfortable life with access to top quality healthcare and public transport. Life quality wise one could argue that poverty in CH is a better option than a middle income in a lot of European countries.
Outside of Zurich rentals are not even that bad. You can easily get a nice apartment for 1500.- or even less. If one is struggling financially, rents are lower e.g. in the Aarau district, starting from around 1000, and you can commute from there. Spending 1000 when the median salary is around 7000 is really not that bad. Low inflation in Switzerland means other European locations are now at the Swiss level or sometimes even above it.
Yeah Switzerland has rather few poor people and very strong middle class. And poor ain't some US version of homeless/trailer park living, just lower income, less fancy clothes, shopping in cheaper supermarkets, less/no vacations abroad.
Just a note for others: this was already covered in an article, so it came up quickly, but it was about fines, not taxes.
"In 2024, the total income tax paid by all publicly listed European internet companies combined was approximately €3.2 billion. This total, which includes firms like SAP, Adyen, Spotify, and Zalando, was notably lower than the €3.8 billion in fines the EU collected from US tech giants in the same year"
China is hiring engineering talent. The US is firing. Nobody forces anybody to do anything; I'm just pointing out the current state of affairs in the long life cycle of empires. As Ray Dalio says, the US is very late-stage declining "financial capitalism", while China is early-stage aspiring "production capitalism". It is not like the late-stage declining USSR needed as many engineers as it did when it wasn't collapsing. The USA is a collapsing empire. China is growing.
TBH this is a pretty good way of looking at it. Yeah we're seeing an explosion of vulnerabilities being found right now, but that (hopefully) means those vulnerabilities are all being cleaned up and we're entering a more hardened era of software. Minus the software packages that are being intentionally put out as exploits, of course. Maybe some might say it's too optimistic and naive, but I think you have a good point.
This is one force that operates. Another is that, in an effort to avoid depending on such a big attack surface, people are increasingly rolling their own code (with or without AI help) where they might previously have turned to an open source library.
I think the effect will generally be an increase in vulnerabilities, since the hand-rolled code hasn't had the same amount of time soaking in the real world as the equivalent OS library; there's no reason to assume the average author would magically create fewer bugs than the original OS library authors initially did. But the vulnerabilities will have much narrower scope: if you successfully exploit an OS library, you can hack a large fraction of all the code that uses it, while if you successfully exploit FooCorp's hand-rolled implementation, you can only hack FooCorp. This changes the economic incentive of finding vulnerabilities to exploit, though less now than in the past, when you couldn't just point an LLM at your target and tell it "plz hack".
I’m seeing a lot of similar things during code reviews of substantially LLM-produced codebases now. Half-baked bad idea that probably leaked from training sets.
Typically when hand-rolling code you implement only what you require for your use case, while a library will be more general-purpose. As a consequence of doing more, it has more code and more bugs.
Also, even seemingly trivial libraries can have bugs. The infamous leftpad library didn't handle certain edge cases properly.
For supply chain security and bug count, I'll take a focused custom implementation of specific features over a library full of generalized functionality.
Yes, a lot hinges on how little you can get away with implementing for your use case. If you have an XML config file with 3 settings in it, you probably won't need to implement handling of external entities the way a full XML parsing library would, which will close off an entire class of attendant vulnerabilities.
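To make that concrete, here's a minimal sketch (the tag names `host`, `port`, and `debug` are hypothetical) of pulling three known settings out of a trivial XML config without a general-purpose parser. Because it never processes DTDs or entity declarations, the entire XXE class of bugs simply has no code path here:

```javascript
// Extract three known, flat settings from a trivial XML config.
// Deliberately NOT a real XML parser: no entities, no DTDs, no nesting.
function readConfig(text) {
  const get = (tag) => {
    // Match <tag>...</tag> with no nested markup inside.
    const m = text.match(new RegExp(`<${tag}>([^<]*)</${tag}>`));
    return m ? m[1] : null;
  };
  return { host: get('host'), port: get('port'), debug: get('debug') };
}

const cfg = readConfig(
  '<config><host>db1</host><port>5432</port><debug>false</debug></config>'
);
// cfg.host === 'db1', cfg.port === '5432', cfg.debug === 'false'
```

The obvious trade-off is that this silently ignores anything it doesn't recognize, which is exactly the point being made: less generality, less attack surface.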
> Also, even seemingly trivial libraries can have bugs. The infamous leftpad library didn't handle certain edge cases properly.
This isn't really an argument in favour of having the average programmer reimplement stuff, though. For it to be, you'd have to argue that the leftpad author was unusually sloppy. That may be true in this specific case, but in general, I'm not persuaded that the average OSS author is worse than the average programmer overall. IMHO, contributing your work to an OSS ecosystem is already a mild signal of competence.
On the wider topic of reimplementation: Recently there was an article here about how the latest Ubuntu includes a bunch of coreutils binaries that have been rewritten in Rust. It turns out that, while this presumably reduced the number of memory corruption bugs (there was still one, somehow; I didn't dig into it), it introduced a bunch of new vulnerabilities, mostly caused by creating race conditions between checking a filesystem path and using the path for something.
ETA: I'm not saying it has to, I'm saying it's possible to imagine reasons that would justify this decision in some cases.
Because it might grow in the future and you want to allow flexibility for that; because it might be the input to or output from some external system that requires XML; because your team might have standardised on always using XML config files; because introducing yet another custom plain-text file format just creates unnecessary cognitive load for everyone who has to use it. Those are real-world reasons I can think of.
But really I was just looking for a concrete example where I know the complexity of the implementation has definitely caused vulnerabilities, whether or not the choice to use it to solve the problem at hand was sensible. I have zero love for XML.
I’m not aware of any memory corruption bugs, but there were some weird cases where Linux, stuck with legacy 8-bit character handling for filenames and paths, led to undesirable behavior with Rust’s native Unicode strings.
The race conditions were indeed TOCTOU bugs. In a sense, the bugs were a result of incorrectly handling shared mutable data, though in this case the mutations were external to Rust.
leftpad was a focused custom implementation of a specific feature, instead of a library full of generalized functionality. At the time it was pulled, the leftpad code (JavaScript, Node, NPM) was:
module.exports = leftpad;
function leftpad (str, len, ch) {
  str = String(str);
  var i = -1;
  ch || (ch = ' ');
  len = len - str.length;
  while (++i < len) {
    str = ch + str;
  }
  return str;
}
Both old and new versions return a string longer than `len` if the padding char is multiple characters, e.g. leftpad('a', 3, '&&&&') will be longer than 3. That feels like it shouldn't happen.
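To make the overshoot concrete, here's the quoted function again with a call that triggers the edge case (modern `String.prototype.padStart`, by contrast, truncates the pad to the target length):

```javascript
function leftpad(str, len, ch) {
  str = String(str);
  var i = -1;
  ch || (ch = ' ');
  len = len - str.length;
  while (++i < len) {
    // Each iteration prepends the WHOLE pad string, not one character,
    // so a multi-character ch overshoots the requested length.
    str = ch + str;
  }
  return str;
}

leftpad('a', 3, '&&&&'); // '&&&&&&&&a': length 9, not 3
```

With a target length of 3 and a one-character input, `len` becomes 2, so the loop runs twice and prepends 4 characters each time.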
I realize I may have made it seem like I was saying leftpad was a general-purpose library. My aside about it was to note that even widely used libraries can still have bugs. That’s orthogonal to their scope.
That's almost literally the first string exercise you'll do with "The C Programming Language, 2nd ed."; one of the most trivial cases, alongside writing a word/space/tab-counting program (wc under Unix).
While agreeing, it also changes the mathematics of it: if a bad actor wants to hack me specifically now they have to write custom code that targets my software after figuring out what it _is_. This swaps the asymmetry around: instead of one bad actor writing an exploit for all the world (and those exploits being even harder to find), you have to hate me specifically.
Admittedly, not hard to do, but it could save some other folks.
Do you have a specific library in mind? I think it would have to be an ancient, unmaintained C library.
But I think most OSS code isn't like this -- even C code born long ago, if it's still in wide use, has been hardened by now. Examples: Linux kernel, GNU userland, PostgreSQL, Python.
> even C code born long ago, if it's still in wide use, has been hardened by now. Examples: Linux kernel
There have been two LPE vulnerabilities and exploits in the Linux kernel announced today, after the one announced just last week. I don't think as much of the C code born long ago has been as carefully hardened as you think.
(Copy Fail 2 and Dirty Frag today, and Copy Fail last week)
Sure, I didn't mean to say that these examples are guaranteed 100% safe, just that I trust them to be enormously more safe than software that accomplishes the same task hand-written by either a human or an LLM last week.
Are you sure? I'd really like that to be true, I felt bad finishing up work on Friday evening having applied the Dirty Frag mitigation to all our instances, but knowing (thinking?) the Copy Fail 2 vulnerability was still exploitable.
Technically there are two things that need to be fixed in the kernel indeed (and one of them was fixed already), but they're both under the "Dirty Frag" umbrella and the proposed mitigation to not allow the affected modules to load applies to them both.
The future may be distributed quite unevenly here, as they say, with a divergence between a small amount of "responsible" code in systems which leverage AI defensively, and a larger amount of vibe-coded / prompt-engineered code in systems which don't go through the extra trouble, and in fact create additional risk by cutting corners on human review. I personally know a lot of people using AI to create software faster, but none of them have created special security harnesses a la Mozilla (https://arstechnica.com/information-technology/2026/05/mozil...).
Are you intentionally avoiding saying 'thanks to LLMs', or is it implicit? All these recent mega-bugs surface through lots of fuzzing and agentic bashing, right?
Indeed, yet more proof that there's a part of the HN crowd which is passive-aggressive, dismissive, and dishonest in the most scientific sense possible. It won't make my day harder than it is, but it is a very weak signal.
If I'm to be offended by a single thing in your post, it is being called names: 'AI bro'. That was undeserved, and cannot be farther from the truth. Not to mention that your comment is entirely off topic; perhaps you see AI bros everywhere now.
I don't disagree with most of your statement, but Valve has and continues to make lots of money from loot boxes in both CS and TF2. Just want to point out that they do do stuff like that too.
Valve is not literally the first but they played a big part in normalizing both lootboxes and micro transactions. Don't rewrite history just because you are a fan.
Not to mention their role in you not owning your games.
> Not to mention their role in you not owning your games.
I do use Steam to "purchase" games, and it irks me that they're still allowed to show "Buy" when in reality you're essentially leasing/renting the game. I can't believe it's still legal for them (and others) to trick people like this.
A Steam purchase I have more confidence in than a physical game copy to survive. I trust Steam to honor its agreement with me more than I trust in myself and my feline overlords to keep a game CD alive.
In a previous timeline, this led to me going on eBay to find CDs of a long-lost game (EarthSiege 2), which I promptly uploaded to the Internet Archive, since the one distributed by the then-current license-holder was an older, unstable version with bugs and, more importantly, no audio, and my own original copy got damaged to hell and beyond...
Sure, I agree with all of those things, but the fact still stands, Steam is actively lying to customers as the store pages say "Buy" and "Purchase", not "Rent" or "Lease", which are more accurate. You don't actually own the product.
Don't get me wrong, as mentioned, I use Steam and like Steam/Valve, but that move is a bit shitty regardless.
How could it be confusing when that's actually what happens? Imagine HN showed "Delete comment" instead of "Reply" under the comment input; don't you agree that would be misleading?
There is a product in the cart, which is a game, and the button says "Purchase"; nowhere does it say that it's a license (although that's obvious), nor that I don't actually "own" this game after I "purchase" it.
Sure, minor detail perhaps, but I'd still argue that something Steam could do better, and since the industry is lacking self-regulation about this, I'd argue more regulations are needed for this even.
It's on the cart page, before you check out. And I do agree that it should be much more front-and-center, rather than the sort of fine-print thing they have now.
What? Valve basically invented making money with skins and lootboxes, it started with TF2 hats. There is an insane amount of money in the CS2 skin market.
I didn't say Valve is perfect. But they're definitely worth the money I spend there. Great service, proper support, regional pricing, and the list goes on. Everything works today. The work they've put on Proton/Linux gaming easily wins my support.
Did they screw up sometimes? Sure. And I'm from the days when Steam didn't exist. I remember the NoSTEAM game versions in shady sites, including Half-Life 2. Steam was hated with a passion back then. They won by ultimately providing great value and service.
I had a rough time with Proton a few years ago and ended up setting up my most recent gaming rig as a Windows 11 machine. In retrospect it was probably unfair to judge it on dime-a-dozen Humble Bundle leftovers from a decade ago when most of the effort is spent on supporting new releases.
But yeah... just this week I was traveling for work and my kid reached out wanting to play a little Deep Rock Galactic with me. I couldn't believe how easy everything was from my Ubuntu 24.04 laptop. Steam, proton, Discord, all of it just worked and I wouldn't even have realised it wasn't running natively if I hadn't noticed the extra proton download in the Steam client.
> The work they've put on Proton/Linux gaming easily wins my support.
Let's not be naive here: this is about the money they are saving on Windows licenses for the Steam Deck, and having their own store instead of the Windows Store/Xbox PC App.
Yet they are doing zero to foster native Linux games.
There isn't much they can do to foster native Linux support beyond trying to increase the number of people gaming on Linux. It's a chicken-and-egg problem, and you need to make the platform desirable to developers before they will start developing for it.
This is the chicken-and-egg problem though. If you don't get the Linux/Steam Deck audience large enough first, then that tradeoff won't be worth it to developers.
Valve have the money to pay developers to make a Linux port or ensure a game works with Proton (maybe they already do in some cases?), if they really wanted to put the heat on Microsoft. Well, any game not owned or published by Microsoft, I suppose.
Valve seem happy to let things happen more organically however.
> Yet they are doing zero to foster native Linux games.
"zero" might be a bit harsh, considering that they do some things at least, compared to others who literally do nothing. Steam the platform has native Linux support, what games are natively available is visible on Store listings, and a bunch of the SDKs (all of them even maybe?) are available natively on Linux too. The situation could have been a lot worse.
It will get worse: with Proton there is no value in e.g. using Vulkan. Just use DirectX and the convenience of modern GPU programming tooling in Visual Studio, HLSL code completion with Copilot, the PIX debugger, and then let Valve worry about running it on Linux.
> with Proton there is no value in e.g. using Vulkan
Valve themselves seems to disagree with you here, considering they still have Linux native SDKs available for integration, and are releasing their own games with native Linux support.
I'm guessing if what you say is true, Valve would be the first to move towards that reality you paint, but we haven't seen that yet, I'm doubting we'll ever see that, but the ones who live will see I suppose :)
Valve will get their OS/2-and-netbooks moment if they don't foster a proper native Linux games ecosystem, but yeah, let's cheer for Windows game translation on Linux while it lasts.
I think there's a reasonable argument that the most stable Linux gaming API surface is actually Proton.
None of this is really going to change until we end up with a situation like the EA/Apple Store conflict: a major player unable to sell a game on Windows for some reason.
Also, it's something of a pragmatic choice -- Valve did put major effort into native Linux games around 2013, but the effort fell flat for a number of reasons.
Proton is them trying a different path towards severing or lessening the Windows dependence, in my opinion.
Totally agree with you there, as much as I love to hate non-transferability, revokable licenses, permanent VAC bans on accounts that got hacked, I still find Steam the most convenient path to "owning" games in one place.
The Linux work done for the Steam Deck is fantastic, and I do credit their efforts with inspiring others to work on similar projects that extend and complement what Valve achieved. Much of the hard effort went into Windows games on Linux before Valve looked at it: everything the WINE project and CodeWeavers did, and gaming via Lutris since 2009. However, Valve have definitely been a force multiplier.
Trust is earned and I think Valve are doing pretty well on that front, especially when you look at the differences to other PC stores, Ubisoft, EA, and to some extent Epic. GOG and Itch are very different beasts.
To some extent I miss the time where Steam was totally curated, you had to make an impact to get your game on the platform, back before it was a free-for-all of shovelware and low-effort slop. Occasional controversies aside, at least on Steam the tools / marketing funnel are there to keep the popular games at the forefront of the store whilst also being fairly open to allow devs to publish without being the chosen one.
Is there a danger of doing to games what Spotify has done to music? Maybe, but I reckon the super deep-discount sales have calmed somewhat and are happening later in a game's long-tail part of the lifecycle, or are used as promo for sequels.
There are plenty of publishers that choose to mainly avoid going that route, often the traditional established publishers with console outlets they don't want to cannibalise, for example Sony and Konami.
> Is there a danger of doing to games what Spotify has done to music?
I think such a business model ultimately doesn't scale well for games (multi-million-dollar production budgets sharing minuscule pieces of a ~$20 all-you-can-eat subscription pie).
Microsoft always knew this, they didn't try to win the market, they tried to subvert the business model, probably expecting the industry as a whole moving towards it -- which didn't happen at all, at least not yet.
Simple math proves this. There's no way acquiring half the good studios in the world and having them release flop after flop was a break-even operation. It's several orders of magnitude behind.
Most of the market talks Nintendo, Sony, XBox, Apple Arcade, Android.
Exactly because they acquired half the good studios, they happen to be one of the biggest publishers. People forget that some of those studios keep using their own branding instead of anything Microsoft, and it would hurt Steam if Microsoft decided all those studios should pull out of it.
Microsoft moved to a subscription service because they botched the launch of the Xbox One, with users accumulating digital libraries on the PlayStation, and that failure is something that has continued to drag them further and further down.
I agree that turning CS into a casino wasn't a tasteful choice on Valve's part, but as someone who has played CS at least once a week for decades, I can understand that they needed to find a way to cover server costs somehow. I paid $15 for CS:GO and have clocked 4,500 hours in the game. I don't gamble, but I'd rather have those who choose to gamble fund the server costs than have Valve charge a monthly subscription to everyone. Skin sales alone would have accomplished this without loot boxes and keys, and that's where I think Valve went overboard with it. Also, for a game that provides so much revenue I expect better anti-cheat and more VAC bans, which are rare.
They didn't need to cover server costs for CS 1.6. I wonder why that is? Hint: CS 1.6 wasn't designed from the ground up as a microtransaction vehicle, so it could have servers run by the community, unlike CS:GO, where centrally run servers are needed to make microtransactions work, not the other way around.
The lootboxes drop as a normal course of gameplay, you buy keys to open them. People still play TF2 so presumably some still open boxes. It's also the base unit of trading for high value items.
CS lootboxes are the least shitty ones in the entire industry. There is 0% pay to win, if anything the skins are a disadvantage because they usually stand out.
I didn't say that lootboxes were pay to win and most lootboxes in games are not. That doesn't mean it's not still profiting from and enabling gambling and addiction.
There's a lot of evidence showing that gambling as a child leads to gambling problems as an adult, and loot boxes are just gambling aimed to a large degree at children.
Valve games are even worse for this because Steam trading allows 3rd party sites to sell cosmetics directly for cash, and some of these cosmetics are worth tens of thousands of dollars. It's just children gambling money but with a thin veneer of video game over the top.
And I see no problem with that. I have never bought a single skin for Counter-Strike or Team Fortress 2, and yet I have a bunch. Well, I used to; then CS2 came out, and all of a sudden my skins and unopened boxes were valued at hundreds of euros, so I sold them on Steam.
If you think enabling childhood gambling addictions and unregulated gambling systems aren't a problem then I don't know what else to say to you. Lootboxes are gambling, plain and simple.
Gamblers will always find a way to gamble. I like getting cosmetics for free even if they are randomized. I don’t think I have ever bought keys to loot boxes, I just don’t see the point.
Americans would rather mention TF2, a game with less than 10 thousand concurrent players and probably making a modicum of money, than ever pretend that game exists or has influenced other games.
Regardless of how many concurrent players it has now, TF2 was massively influential to other FPS games, and it's still held with high regard by the community. It was also one of the first major games to introduce loot boxes.
The title says "quality" but the summary seems to say it only measures the "strength" and "darkness of roast". It certainly won't measure how good it tastes. Given these are the two properties purportedly measured, I imagine you'd get the same results regardless of tastiness and age of the coffee or beans.
> Given these are the two properties purportedly measured, I imagine you'd get the same results regardless of tastiness and age of the coffee or beans.
Right, but another way of putting it, it might provide useful signal if you hold "age of the coffee or beans" and other such factors constant :).
In this day and age where every company is playing _very_ fast and loose with their LPR/citizen employees' lives and livelihoods, yes, I think PERM should be a very strict and easily lost privilege across the board for the whole company, not a right. If we had sane employee protections in this country maybe my opinion would be different.
I have been both on the visa side of things and the LPR/citizen side of things. I don't understand this mindset of lpr|citizen >> perm|visa, as if the perm/visa individual has no life and can easily pack a handbag and leave tomorrow. Not defending any processes here, just pointing out that people (perm or not) buy homes and cars, have kids in schools, etc., and by the time they get to the PERM process they are pretty much ingrained here. The viewpoint of 'discarding' batches of perms sounds very hypocritical.
It's not about you, you're just an unfortunate individual caught in the crossfire, and in some ways I am sorry about that. However I think it's important for countries to look after their own citizens first before foreign peoples who want/have a job here. Yes, you may have made many efforts to integrate permanently, but if you're not on a permanent status yet, then those are choices you always made knowing you're still on a temporary status. It's not hypocritical at all.
Edit: I want to say that I am not saying this from a place of no compassion, however harsh my opinions may seem. I have multiple close friends that are not LPRs/citizens yet and have been the shoulder to cry on when things go sideways. I empathise, I do, but my opinion remains the same that countries should look after their LPRs/citizens strongly first.
One of the problems is that it's hard to tell at first that it's AI music. Probably still hard to figure it out by ear after you've been told. But I think not nearly as many people would choose to listen to AI songs if they knew they were AI.
There's a reason it can succeed as it is now. Making music that is catchy to our ears is fairly formulaic, so it's easy for AI to do the same. But if they start labeling which music is AI and which isn't, it probably won't succeed as well.
I was pretty pissed and considered canceling my Spotify Premium after the first time I realized I'd been duped by AI songs. I just report them any time I see them now. If they gave me a settings option to block all AI music, I'd be fine.
I'm put in mind of the Merchandise Marks Act 1887 (https://en.wikipedia.org/wiki/Made_in_Germany#History), which ultimately did the opposite of what it was expected to do. There is a real chance here that people just want to listen to something that sounds nice and aren't that fussed about whether a human is involved.
Besides, people seem to go in pretty strongly with computers to tune the sound already. It wouldn't be that shocking if people were already listening to works that can only be made with the aid of a computer.
Why does it matter to you if it's AI or not? If you enjoy a song, you shouldn't resent it just because it's AI-generated. Personally, there are many AI songs that I like and enjoy listening to.
Because I care about art being a human endeavor. AI doesn't create art; it regurgitates an unidentifiable goop churned together in its stomach from all the crap it's eaten. There is no thought. There is no feeling. There is no meaning. If you only care about the sound, that's cool, enjoy it, but I don't.