[SOLVED] How can I speed up a pandas search with a condition and location?


I have a DataFrame named df_trade_prices_up_to_snap that has about 295k rows and looks something like this:

[Sample data screenshot omitted]

For each ticker in the DataFrame, I need to get the last trade price and append it to a new DataFrame. The DataFrame is already ordered chronologically.

I wrote a little routine that works:

df_trade_prices_at_snap = pd.DataFrame()
ticker_list = list(df_trade_prices_up_to_snap.ticker.unique())
for ticker in ticker_list:
    # Note: DataFrame.append was deprecated in pandas 1.4 and removed in 2.0
    df_trade_prices_at_snap = df_trade_prices_at_snap.append(
        df_trade_prices_up_to_snap[df_trade_prices_up_to_snap.ticker == ticker].tail(1)
    )
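On pandas 2.x the loop above no longer runs, because `DataFrame.append` has been removed. A minimal sketch of the same per-ticker loop using `pd.concat` instead, on a small made-up dataset (the column names and values here are assumptions standing in for the real data):

```python
import pandas as pd

# Hypothetical sample standing in for df_trade_prices_up_to_snap,
# already in chronological order
df_trade_prices_up_to_snap = pd.DataFrame({
    "ticker": ["AAPL", "AAPL", "MSFT", "MSFT", "AAPL"],
    "trade_price": [150.0, 151.0, 300.0, 301.0, 152.0],
})

# Collect the last row per ticker, then concatenate once at the end
# rather than calling append inside the loop
pieces = []
for ticker in df_trade_prices_up_to_snap.ticker.unique():
    pieces.append(
        df_trade_prices_up_to_snap[df_trade_prices_up_to_snap.ticker == ticker].tail(1)
    )
df_trade_prices_at_snap = pd.concat(pieces)
```

Concatenating once avoids rebuilding the accumulator DataFrame on every iteration, though the per-ticker boolean filter still makes this approach slow on large inputs.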

That loop takes about six seconds to run, which is too long for my needs. Can someone suggest a much faster way to get the resulting DataFrame?


If the prices are ordered chronologically, you can use groupby followed by last:

df_trade_prices_at_snap = df_trade_prices_up_to_snap.groupby('ticker')['trade_price'].last()
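Note that this returns a Series indexed by ticker, not a DataFrame. A small sketch on made-up data (column names and values are assumptions), also showing `drop_duplicates(keep='last')` as an alternative when you want to keep every column of the final row:

```python
import pandas as pd

# Hypothetical sample standing in for df_trade_prices_up_to_snap,
# in chronological order
df = pd.DataFrame({
    "ticker": ["AAPL", "MSFT", "AAPL", "MSFT"],
    "trade_price": [150.0, 300.0, 152.0, 301.0],
})

# Series of last trade prices, indexed by ticker
last_prices = df.groupby("ticker")["trade_price"].last()

# Alternative keeping the whole last row per ticker as a DataFrame
last_rows = df.drop_duplicates(subset="ticker", keep="last")
```

Both are vectorized single passes over the data, so either should be far faster than the Python-level loop over tickers.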

Answered By – enke

Answer Checked By – Cary Denson (BugsFixing Admin)
