> It's not a "property" it's an attribute/field/member/key/column/variable/getter/function/procedure.
For what it's worth, to a researcher in the field of programming languages (like the author of the post), these all have distinct unambiguous meanings. At least as far as PL goes, almost every term has a well-defined meaning, but as those terms were adopted into less-than-academic contexts, the meanings have diluted.
"Property" is such a term in the context of programming languages research, and, in particular, it is a very specifically defined term in the realm of property-based testing (no surprise).
> Even the constants are variables from the viewpoint of the CPU that has to load it in its registers.
No; this is not what "variable" means. Registers are properties of the processor, i.e., they are implementation details; variables are an abstract concept from the domain of the formal language specification.
> Sometime along the way we decided that "syntax sugar" means "it means the same thing as" but except for (<cast OtherType>obj).foo(), which means that the semantics of "syntax sugar" don't mean it's simpler than the phrase it was supposed to replace.
No; this is not what "syntax sugar" means. If a language defines some syntax f and it "expands to" some other syntax g, then f is syntax sugar for g. This is well defined in Felleisen's "On the Expressive Power of Programming Languages" [0]. For example, Python's addition operator `+` is implemented in terms of a method `__add__`; therefore, `a + b` is syntax sugar for `a.__add__(b)`, because the former syntax is built on top of the latter.
Notably, syntax sugar has nothing to do with casts; casts are semantic, not syntactic. There are also no promises about whether syntax sugar makes something "easier"; it's simply the ability to syntactically express something in multiple ways.
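A quick sketch of the Python example above (the `Meters` class is just for illustration, and this ignores `__radd__` and the other fallback machinery the real operator also consults):

```python
class Meters:
    def __init__(self, value):
        self.value = value
    def __add__(self, other):
        return Meters(self.value + other.value)

a, b = Meters(2), Meters(3)

# The operator form and the method form it desugars to give the same result.
assert (a + b).value == 5
assert a.__add__(b).value == 5
```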
I'd also like to add that, since immediate-operand instructions exist, constants are absolutely not the same as variables even at the machine level: an immediate typically never occupies a register at all (with exceptions, of course; "move immediate" obviously stores one, and I'm sure there are architectures that use an internal/hidden register populated during instruction decode).
Also, in Harvard-architecture systems, the constants, being part of the instruction itself, might not even be in the same memory or even address space as variables ([EEP]ROM/Flash vs RAM).
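For what it's worth, CPython draws an analogous distinction at the bytecode level. This is only a loose analogy to machine immediates, but it shows the compiler treating constants and variables differently: constants are baked into the code object itself rather than read from a variable slot:

```python
import dis

def f(x):
    return x + 3  # 3 is a constant; x is a variable

# The compiler emits different opcodes for each: LOAD_FAST reads the
# variable slot, LOAD_CONST pulls from the code object's constant pool.
instructions = [i.opname for i in dis.get_instructions(f)]
print(instructions)
print(f.__code__.co_consts)  # the constant 3 lives here
```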
The problem is that the same word is used for different things.
The comment you are responding to was correct in what "property" means in some settings.
The article itself says:
> A property is a universally quantified computation that must hold for all possible inputs.
But, as you say,
> but as those terms were adopted into less-than-academic contexts, the meanings have diluted.
And, in fact, this meaning has been diluted; it is simply wrong from the perspective of what the term originally meant in math.
You are right that a CPU register is a property of the CPU. But the mathematical term for what the article is discussing is invariant, not property.
Feel free to call invariants properties; idgaf. But don't shit all over somebody by claiming to have the intellectual high ground, because there's always a higher ground. And... you're not standing on it.
My point was not that there exists some supreme truth about what words mean and that either you use words "correctly" or you're an idiot.
Yes, words have different meanings in different settings, but that's not the dilution I was referring to. It's absolutely fine that a word can be used differently in different places.
The "problem", such as it is, is that there are people who use terms from programming languages research to discuss programming languages and they use these terms inaccurately for their context, leading to a dilution in common understanding. For example, there is a definitive difference between a "function" and a "method", and so it is inaccurate to refer to functions generally as "methods". However, I see people gripe about interactions where these things are treated separately, and that is what I am addressing.
The parent comment to mine tried to offer some examples of such terms within the context of programming languages, so my corrections were constrained to that context. But your correction of my point is, I think, incorrect, because the meaning you are trying to use against me is one from a different context than the one we're all talking about.
There's no intellectual high ground here; my point was not to elevate myself above the parent comment. My point was to explain to them that they were, from the point of view of people like the author of the post (I assume), simply incorrect. There's nothing wrong with being wrong from time to time.
[0] direct PDF: https://www2.ccs.neu.edu/racket/pubs/scp91-felleisen.pdf