BUG: reset_index of level on a MultiIndex with NaT converts to np.nan #11479
Comments
A lot of these are actually fixed by #11343, as we will be catching these errors in putmask. All of that said, you generally do not want NaNs of any kind in the MultiIndex: it makes your index non-performant and simply hard to work with w.r.t. indexing. What are you actually trying to do?
Hi, thanks for picking this up! Yeah, I understand it's not ideal to have NaN in the index. However, I wrote a class (actually a number of classes) to do my data evaluation, which needs a DataFrame of a defined structure in its .data property. Datapoints always have an index and mostly a timestamp as well. The index is unique; the timestamp is not always, but mostly. Therefore, I figured it would be generally good to have them both in a MultiIndex (because if there is timestamp information, I usually use this to select data).
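The structure described above can be sketched as follows. Column and level names here are assumptions for illustration; the original classes are not shown in the thread:

```python
import pandas as pd

# Sketch of the described layout (names are assumptions): a unique
# datapoint index paired with a mostly-unique timestamp level, where
# a missing timestamp becomes NaT in the MultiIndex.
data = pd.DataFrame(
    {
        "index": [0, 1, 2],
        "timestamp": pd.to_datetime(["2015-11-01", "2015-11-01", None]),
        "value": [1.5, 2.5, 3.5],
    }
).set_index(["index", "timestamp"])

# Selecting data by timestamp via the MultiIndex level, as described:
subset = data.xs(pd.Timestamp("2015-11-01"), level="timestamp")
print(len(subset))  # 2
```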
These examples all work on master. Could use tests:
Actually the last cases are not fixed yet as the
The last case appears to work now
Not sure if it's known already, but I couldn't find any open issues.
It's similar to this closed one:
#10388
Using pandas 0.17 and numpy 1.10.1.
Code to reproduce the issues:
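The original snippet was not preserved in this capture, so the exact cases a–d are unknown. A minimal sketch exercising the same path (resetting a MultiIndex level that contains NaT) might look like this; variable names are assumptions:

```python
import pandas as pd

# Hypothetical reproduction: a MultiIndex whose datetime level contains NaT.
idx = pd.MultiIndex.from_arrays(
    [[0, 1, 2], pd.to_datetime(["2015-01-01", "2015-01-02", None])],
    names=["id", "ts"],
)
df = pd.DataFrame({"value": [1.0, 2.0, 3.0]}, index=idx)

# On pandas 0.17 this path could raise (TypeError/ValueError, as in the
# output below) or silently convert NaT to np.nan; on current pandas the
# datetime64[ns] dtype and the NaT value are preserved.
out = df.reset_index("ts")
print(out["ts"].dtype)
```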
Output:
without timezone:
a works
b works
c works
d fails: ValueError: Could not convert object to NumPy datetime
with timezone:
a fails: TypeError: data type not understood
b fails: TypeError: data type not understood
c fails: TypeError: data type not understood
d fails: ValueError: Could not convert object to NumPy datetime
Can you suggest any workaround?