Slow Postgres performance with simple queries



  • After a lot of reading I found multiple sources saying Postgres should be able to handle ~100 million rows.

    I set up a simple table with:

    CREATE TABLE trade
    (
        id INT PRIMARY KEY NOT NULL,
        symbol VARCHAR,
        open_time TIMESTAMP,
        end_price INT
    );
    

    I have 12 million of these records.

    My simple query is:

    SELECT * FROM trade WHERE symbol = x AND open_time > start_time AND open_time < end_time

    This query always returns fewer than 1,000 rows, yet it takes about 1100 ms. That seems like a lot for a simple table with 10x fewer rows than Postgres should be able to handle.

    I don't have any indexes yet, though, because I don't know what the best indexes for this table would be.

    Is it possible to get this down to 100 ms?

    Any help writing a more performant query or database would be appreciated.
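
    For anyone who wants to reproduce this, a table like the one above can be filled with synthetic data using `generate_series`. The symbol names, timestamps, and price range here are made up, just to get ~12 million rows of a similar shape:

    -- Sketch: populate the trade table with 12 million synthetic rows.
    -- The three symbols, the start timestamp, and the price range are
    -- arbitrary placeholders, not values from the original setup.
    INSERT INTO trade (id, symbol, open_time, end_price)
    SELECT g,
           (ARRAY['BTCUSD', 'ETHUSD', 'XRPUSD'])[1 + g % 3],  -- cycle through 3 symbols
           TIMESTAMP '2020-01-01' + g * INTERVAL '1 second',  -- monotonically increasing times
           (random() * 10000)::INT                            -- random "price"
    FROM generate_series(1, 12000000) AS g;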

    EDIT 1:

    After reading the comments, I put the following indexes on the table:

    CREATE INDEX open_time ON trade(open_time);
    CREATE INDEX symbol ON trade(symbol);
    CREATE INDEX end_price ON trade(end_price);
    

    After adding these, the query time is 240 ms. Is this the best it can get?

    I have also noticed that when querying the first 40k rows, the query time drops to 60 ms;
    after that it rises back to 240 ms. What is causing this?



  • The perfect index for this query is a two-column index, with the column that is compared with = first:

    CREATE INDEX ON trade (symbol, open_time);
    

    If that is still slow, the EXPLAIN (ANALYZE, BUFFERS) output for the query should tell us why.
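
    To collect that output, run the query prefixed with `EXPLAIN (ANALYZE, BUFFERS)`. The symbol and time range below are hypothetical placeholders; substitute your actual values:

    -- ANALYZE executes the query and reports actual row counts and timings;
    -- BUFFERS adds how many pages were read from cache vs. disk, which would
    -- also explain the 60 ms vs. 240 ms difference between cached and
    -- uncached ranges.
    EXPLAIN (ANALYZE, BUFFERS)
    SELECT *
    FROM trade
    WHERE symbol = 'BTCUSD'
      AND open_time > TIMESTAMP '2021-01-01'
      AND open_time < TIMESTAMP '2021-02-01';

    With the two-column index in place, the plan should show an Index Scan on trade rather than a Seq Scan.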



