
I have the following df:

Timestamp                A      B      C    ...
2014-11-09 00:00:00      NaN    1      NaN  NaN
2014-11-09 00:00:00      2      NaN    NaN  NaN
2014-11-09 00:00:00      NaN    NaN    3    NaN
2014-11-09 08:24:00      NaN    NaN    1    NaN
2014-11-09 08:24:00      105    NaN    NaN  NaN
2014-11-09 09:19:00      NaN    NaN    23   NaN

And I would like to make the following:

Timestamp                A      B      C    ...
2014-11-09 00:00:00      2      1      3    NaN
2014-11-09 00:01:00      NaN    NaN    NaN  NaN
2014-11-09 00:02:00      NaN    NaN    NaN  NaN
...
2014-11-09 08:23:00      NaN    NaN    NaN  NaN
2014-11-09 08:24:00      105    NaN    1    NaN
2014-11-09 08:25:00      NaN    NaN    NaN  NaN
2014-11-09 08:26:00      NaN    NaN    NaN  NaN
2014-11-09 08:27:00      NaN    NaN    NaN  NaN
...
2014-11-09 09:18:00      NaN    NaN    NaN  NaN
2014-11-09 09:19:00      NaN    NaN    23   NaN

That is: I would like to merge the rows that share the same Timestamp (I have 17 columns), resample at 1-minute granularity, and keep NaN for the columns with no values.

I started in the following two ways:

df.groupby('Timestamp').sum()

and

df = df.resample('1Min', how='max')

but I obtained the following error:

ValueError: cannot reindex from a duplicate axis

How can I solve this problem? I'm just learning Python, so I don't have much experience yet.

Thank you!


1 Answer


Assuming you have Timestamp as the index to begin with, you need to resample first, then reset_index before doing the groupby. Here's a working sample:

import pandas as pd

df
                       A   B   C  ...
Timestamp                            
2014-11-09 00:00:00  NaN   1 NaN  NaN
2014-11-09 00:00:00    2 NaN NaN  NaN
2014-11-09 00:00:00  NaN NaN   3  NaN
2014-11-09 08:24:00  NaN NaN   1  NaN
2014-11-09 08:24:00  105 NaN NaN  NaN
2014-11-09 09:19:00  NaN NaN  23  NaN

df.resample('1Min', how='max').reset_index().groupby('Timestamp').sum()

                      A   B   C  ...
Timestamp                           
2014-11-09 00:00:00   2   1   3  NaN
2014-11-09 00:01:00 NaN NaN NaN  NaN
2014-11-09 00:02:00 NaN NaN NaN  NaN
2014-11-09 00:03:00 NaN NaN NaN  NaN
2014-11-09 00:04:00 NaN NaN NaN  NaN
...
2014-11-09 09:17:00 NaN NaN NaN  NaN
2014-11-09 09:18:00 NaN NaN NaN  NaN
2014-11-09 09:19:00 NaN NaN  23  NaN

Hope this helps.
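(Note for readers on newer pandas: the `how=` keyword of `resample` was later removed, and `resample(...).max()` on its own already collapses rows that share a timestamp, so the groupby step can be skipped. A minimal sketch, using hypothetical data mirroring the question:)

```python
import pandas as pd
import numpy as np

# Hypothetical data mirroring the question: duplicate timestamps,
# one value per row, everything else NaN
idx = pd.to_datetime([
    "2014-11-09 00:00:00", "2014-11-09 00:00:00", "2014-11-09 00:00:00",
    "2014-11-09 08:24:00", "2014-11-09 08:24:00", "2014-11-09 09:19:00",
])
df = pd.DataFrame(
    {"A": [np.nan, 2, np.nan, np.nan, 105, np.nan],
     "B": [1, np.nan, np.nan, np.nan, np.nan, np.nan],
     "C": [np.nan, np.nan, 3, 1, np.nan, 23]},
    index=idx,
)
df.index.name = "Timestamp"

# .max() skips NaN, so the duplicate rows collapse into one per minute;
# resample fills in the full 1-minute grid, leaving NaN where there is no data
out = df.resample("1Min").max()
```

Here `out` runs from 00:00 to 09:19 in 1-minute steps, with the three 00:00 rows merged into a single row `A=2, B=1, C=3`.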

Updated:

As said in the comments, your 'Timestamp' isn't a datetime; it's probably stored as a string, so you cannot resample on it (resampling requires a DatetimeIndex). Just reset_index and convert it, something like this:

df = df.reset_index()
df['ts'] = pd.to_datetime(df['Timestamp'])
# 'ts' is now the datetime version of 'Timestamp'; set it as the index
df = df.set_index('ts')
...

Now just run the previous code again but replace 'Timestamp' with 'ts' and you should be OK.
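(Putting the update together: a small end-to-end sketch with a hypothetical frame whose 'Timestamp' column holds plain strings, as in the asker's situation:)

```python
import pandas as pd
import numpy as np

# Hypothetical frame where 'Timestamp' is a plain string column, not datetime
df = pd.DataFrame({
    "Timestamp": ["2014-11-09 00:00:00", "2014-11-09 00:00:00",
                  "2014-11-09 08:24:00"],
    "A": [np.nan, 2, 105],
    "B": [1, np.nan, np.nan],
})

# Convert the strings to real datetimes, then use them as the index
df["ts"] = pd.to_datetime(df["Timestamp"])
df = df.set_index("ts").drop(columns="Timestamp")

# Resampling works now that the index is a DatetimeIndex
out = df.resample("1Min").max()
```

Without the `to_datetime` conversion, `resample` raises the `TypeError: Only valid with DatetimeIndex or PeriodIndex` seen in the comments.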

  • I still have the same problem and I don't understand where this comes from. :( – Marzia Dec 31 '14 at 06:25
  • 2
    @Marzia the error you have means you're trying to index on an existing index. try: `df = df.reset_index()` then run above code and see if it helps – Anzel Dec 31 '14 at 18:25
  • I tried to do that before and when I resample with df.resample('1Min', how='max'), I get the following: TypeError: Only valid with DatetimeIndex or PeriodIndex and I don't know how to go about this. – Marzia Jan 01 '15 at 01:41
  • @Marzia, I guess your df['Timestamp'] is a string or object rather than a DatetimeIndex. You need to convert your data; I'll update my answer – Anzel Jan 01 '15 at 01:59
  • thank you for the update. Unfortunately, it didn't help: I still get the same error. I don't know what to do. :( – Marzia Jan 02 '15 at 03:58