May 16

On Wednesday, 1 May 2024 at 10:01:29 UTC, Nick Treleaven wrote:
> On Tuesday, 30 April 2024 at 17:01:42 UTC, Basile B. wrote:
>> Generally there is a strong correlation between default initialization and boolean evaluation to false.
>
> But the meaning of boolean evaluation of a number is to check if it is non-zero. That is well established from C.

Without looking it up, if `x` is `-0.0`, does `!x` evaluate to `true` or `false`?

Hint: Negative zero compares equal to zero (`x == 0.0`), but it’s not zero: `x !is 0.0`.

Possibly after looking it up, does the answer make sense to you?

Even if you’re 100% sure, would you bet most D programmers get it right?

May 16

On Thursday, 16 May 2024 at 18:03:30 UTC, Quirin Schroll wrote:
> On Wednesday, 1 May 2024 at 10:01:29 UTC, Nick Treleaven wrote:
>> But the meaning of boolean evaluation of a number is to check if it is non-zero. That is well established from C.
>
> Without looking it up, if `x` is `-0.0`, does `!x` evaluate to `true` or `false`?
>
> Hint: Negative zero compares equal to zero (`x == 0.0`), but it’s not zero: `x !is 0.0`.
>
> Possibly after looking it up, does the answer make sense to you?
>
> Even if you’re 100% sure, would you bet most D programmers get it right?

`-0.0` would convert to integer 0, which in turn is false. When I said non-zero, that is well defined for integers. So I'm not sure why you think it's surprising that `!-0.0` is true.

May 16
On Thursday, 16 May 2024 at 20:36:17 UTC, Nick Treleaven wrote:
> On Thursday, 16 May 2024 at 18:03:30 UTC, Quirin Schroll wrote:
>> On Wednesday, 1 May 2024 at 10:01:29 UTC, Nick Treleaven wrote:
>>> But the meaning of boolean evaluation of a number is to check if it is non-zero. That is well established from C.
>>
>> Without looking it up, if `x` is `-0.0`, does `!x` evaluate to `true` or `false`?
>>
>> Hint: Negative zero compares equal to zero (`x == 0.0`), but it’s not zero: `x !is 0.0`.
>>
>> Possibly after looking it up, does the answer make sense to you?
>>
>> Even if you’re 100% sure, would you bet most D programmers get it right?
>
> `-0.0` would convert to integer 0, which in turn is false. When I said non-zero, that is well defined for integers. So I'm not sure why you think it's surprising that `!-0.0` is true.

Because it has a non-zero bit pattern. It does something rather nontrivial.