Imaginary: implement a pure Imaginary type, let im = Imaginary(1)
#5313
Conversation
Seems pretty good; I worry about having lots of extra cases to handle (e.g. all those new …). It really feels like a bug in the universe that …
Also -- how did the tests break? Glancing at the diff it's not obvious how existing functions of … would be affected.
I'm not sure about the tests. I decided to just push it and investigate later.
@@ -455,7 +455,7 @@ cf2 = complex(f2)
 @check_bit_operation (+) Matrix{Complex{Uint8}} (b1, cu2)
 @check_bit_operation (-) Matrix{Complex{Uint8}} (b1, cu2)
 @check_bit_operation (.*) Matrix{Complex{Uint8}} (b1, cu2)
-@check_bit_operation (./) Matrix{Complex64} (b1, cu2)
+@check_bit_operation (./) Matrix{Complex128} (b1, cu2)
I wonder if this really should have been a Complex64 result in the first place, but I'm not sure.
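For reference, Complex64/Complex128 were the era's names for today's ComplexF32/ComplexF64. The expected type changes because elementwise complex division promotes to a float complex type, unlike +, - and .*. A small sketch in current Julia, with a hypothetical Complex{Int} matrix standing in for the test's Complex{Uint8} data:

```julia
# Hypothetical small stand-ins for the test's b1 (a BitArray) and cu2 matrices.
b1 = trues(2, 2)              # Bool matrix
c2 = fill(3 + 4im, 2, 2)      # Complex{Int} matrix
@assert eltype(b1 .+ c2) == Complex{Int}  # +, -, .* keep the integer type
@assert eltype(b1 ./ c2) == ComplexF64    # ./ promotes to float complex
```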
The test breakage was because I got a little overzealous and actually changed how real/complex division was done.
@test isequal(a + b*im, Complex(a,b))
    end
end
This stuff is a big part of the point of this change. The fact that all these expressions are just correct is much better than the previous state of affairs. The fact that this approach is what Kahan favors is what really convinced me to do this, although I had suspected we should for quite some time.
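For comparison, a quick check in present-day Julia (where im is a Complex{Bool} constant rather than a pure-imaginary type) shows the identity failing for signed zeros, which is part of what this change fixes:

```julia
# Stock Julia: im is Complex{Bool}, so b*im carries a concrete real zero,
# and adding it to a can flip the sign of a's zero.
a, b = -0.0, 1.0
lhs = a + b*im        # real part: -0.0 + 0.0 == +0.0
rhs = Complex(a, b)   # real part stays -0.0
@show isequal(lhs, rhs)   # false: real parts are +0.0 vs -0.0
```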
One thing I just noticed about this branch is that it violates the identity x^2 == x*x (the results differ in type):

julia> (4im)^2
-16 + 0im

julia> (4im)*(4im)
-16

This is reminiscent of the issue with wanting …
Even though …
Violating:

julia> 1/(4im)^2   # Approaches negative real axis from below, so should carry -0.0im
-0.0625 - 0.0im

(Btw, …
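The signed-zero distinction is observable on branch cuts even in stock Julia: for example sqrt, whose cut runs along the negative real axis, returns different values depending on the sign of a zero imaginary part:

```julia
# The sign of the zero imaginary part picks the side of sqrt's branch cut:
@show sqrt(complex(-1.0,  0.0))   # 0.0 + 1.0im (approach from above the cut)
@show sqrt(complex(-1.0, -0.0))   # 0.0 - 1.0im (approach from below the cut)
```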
isequal(z::Complex, w::Complex) = isequal(real(z),real(w)) & isequal(imag(z),imag(w))

hash(i::Imaginary) = hash(imag(i))
Probably need something more here.
Any thoughts on what? Should they hash the same as the corresponding complex number? Something like
hash(i::Imaginary) = bitmix(hash(zero(imag(i))),hash(imag(i)))
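That era's bitmix(a, b) roughly corresponds to seeding today's two-argument hash(x, h::UInt). A hedged sketch of the idea in current syntax, with Imag as a hypothetical stand-in for the PR's Imaginary type, hashing a pure-imaginary value the same as the corresponding complex number so isequal/hash stay consistent:

```julia
# Hypothetical minimal stand-in for the PR's Imaginary type (not its code).
struct Imag{T<:Real} <: Number
    im::T
end
Base.imag(z::Imag) = z.im
Base.real(z::Imag{T}) where {T} = zero(T)
Base.Complex(z::Imag) = Complex(real(z), imag(z))

# Hash like the equivalent complex number, mixing in the seed h.
Base.hash(z::Imag, h::UInt) = hash(Complex(z), h)

@assert hash(Imag(3.0)) == hash(Complex(0.0, 3.0))
```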
Are there still any broken tests I should look over?
No, I fixed that. It was because I inadvertently changed the definition of real/complex division.
But looking it over wouldn't be a bad thing – even better would be trying it out and seeing how you feel about it.
I'd like to merge this soon if there aren't any objections because a) I think it's better, and b) I think we won't figure out if it's better or not unless we try it out for a while. There are practical benefits to being able to work with pure imaginary arrays.
At build time, I get a whole bunch of warnings of the form:

Warning: New definition
    ^ could not show value of type Tuple at complex.jl:510
is ambiguous with:
    ^ could not show value of type Tuple at complex.jl:509.
To fix, define
    ^ could not show value of type Tuple
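For context, these are method-ambiguity warnings: two ^ definitions overlap with no most-specific method. In recent Julia the ambiguity surfaces as a MethodError at the call site rather than a definition-time warning, but the fix is the same. A generic illustration with a hypothetical function g:

```julia
# Two overlapping signatures: g(1, 1) would match both ambiguously.
g(x::Number,  y::Integer) = :first
g(x::Integer, y::Number)  = :second
# The standard fix is to define the intersection explicitly:
g(x::Integer, y::Integer) = :both

@assert g(1, 1) == :both
@assert g(1.5, 1) == :first
@assert g(1, 1.5) == :second
```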
This branch could also use some way to generate imaginary Gaussians:

julia> convert(Matrix{Imaginary{Float64}}, im*randn(5,5))
ERROR: no method convert(Type{Imaginary{Float64}}, Complex{Float64})
 in copy! at abstractarray.jl:175
 in convert at array.jl:252

julia> randn(5,5)
5x5 Array{Float64,2}:
  0.889074  -0.33771   -0.155899  -1.86357     1.42763
 -1.43924   -0.593955   0.455287   0.00607604 -1.41403
 -0.385647   0.13192    1.21841    1.87917     0.997503
 -0.118906  -2.10375    0.79757    1.00138    -3.19422
 -1.22316   -1.47053   -0.505155  -0.627551   -0.446204

julia> randn(5,5)*im
5x5 Array{Complex{Float64},2}:
 0.0+1.23825im   0.0-0.756854im  0.0+0.463509im   0.0+0.960726im   0.0+1.16782im
 0.0-0.286364im  0.0+0.294821im  0.0-0.0878349im  0.0+1.57061im    0.0+1.5298im
 0.0-1.07506im   0.0-0.713591im  0.0-0.592155im   0.0-1.51654im    0.0-1.44391im
 0.0+0.780505im  0.0-0.892068im  0.0-0.370607im   0.0+0.0833817im  0.0+1.24178im
 0.0+0.103169im  0.0+1.81969im   0.0-0.38732im    0.0-2.16382im    0.0-0.704337im
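The missing method the error points at could be a conversion that accepts a complex value only when its real part is zero. A hedged sketch in current syntax, again with Imag as a hypothetical stand-in for the PR's Imaginary type:

```julia
# Hypothetical stand-in for the PR's Imaginary type (not its code).
struct Imag{T<:Real} <: Number
    im::T
end

# Allow Complex -> Imag only when the real part is (a signed) zero.
function Base.convert(::Type{Imag{T}}, z::Complex) where {T<:Real}
    iszero(real(z)) || throw(InexactError(:convert, Imag{T}, z))
    Imag{T}(imag(z))
end

@assert convert(Imag{Float64}, complex(0.0, 2.5)).im == 2.5
```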
I don't think the branch needs to add features; it just needs to come to parity with the current state of affairs. Then, once it's merged, we can start adding features like making …
In other words, you want me to do the work, don't you, you sly fox.
Hahaha. Busted. But yes, I do :-| |
bump
Done.
Ok, this is ready to merge now. @JeffBezanson, @ViralBShah – you guys ok with giving this a shot?
If promote_rule(T,S) and promote_rule(S,T) disagree, call promote_type again on the results to get a single common answer. This can resolve cases where there is a conflict between two such rules, making the promotion system a bit more robust.
Note that this now includes the changes for #5466 since they were necessary to get some of the promotion stuff here to work correctly.
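This is still how promote_type behaves in current Julia: it consults promote_rule in both argument orders and, if the answers differ, promotes those two results again to reach a single type. A small illustration with hypothetical types A and B whose rules deliberately conflict:

```julia
# Hypothetical types with conflicting promotion rules.
struct A end
struct B end
Base.promote_rule(::Type{A}, ::Type{B}) = Float64
Base.promote_rule(::Type{B}, ::Type{A}) = Int

# promote_type reconciles the conflict by promoting the two answers again,
# i.e. promote_type(Float64, Int) == Float64, regardless of argument order:
@assert promote_type(A, B) == Float64
@assert promote_type(B, A) == Float64
```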