August 12th, 2009 by depesz | 2 comments

Robert Haas wrote, and Tom Lane committed (on 2nd of August) patch which adds interesting capability:

Log Message:
-----------
Add ALTER TABLE ... ALTER COLUMN ... SET STATISTICS DISTINCT
 
Robert Haas

The log message is rather terse, so what exactly does it do?

To plan a query for execution, PostgreSQL needs some basic statistics about the table. These are updated whenever you run ANALYZE on the table, and look like this:

# create table test (i int4, j int4);
# insert into test (i, j) select cast(random() * 10 as int4), cast(random() * 1000000 as int4) from generate_series(1,10000);
# select a.attname, s.* from pg_attribute a join pg_statistic s on a.attnum = s.staattnum and a.attrelid = s.starelid where a.attrelid = 'test'::regclass;
-[ RECORD 1 ]--------------------------------------------------------------------------
attname | i
starelid | 17567
staattnum | 1
stanullfrac | 0
stawidth | 4
stadistinct | 11
stakind1 | 1
stakind2 | 2
stakind3 | 3
stakind4 | 0
staop1 | 96
staop2 | 97
staop3 | 97
staop4 | 0
stanumbers1 | {0.106333,0.104,0.100667}
stanumbers2 | [null]
stanumbers3 | {0.114939}
stanumbers4 | [null]
stavalues1 | {5,9,6}
stavalues2 | {0,1,2,3,4,7,8,10}
stavalues3 | [null]
stavalues4 | [null]
-[ RECORD 2 ]--------------------------------------------------------------------------
attname | j
starelid | 17567
staattnum | 2
stanullfrac | 0
stawidth | 4
stadistinct | -0.9944
stakind1 | 1
stakind2 | 2
stakind3 | 3
stakind4 | 0
staop1 | 96
staop2 | 97
staop3 | 97
staop4 | 0
stanumbers1 | {0.000666667,0.000666667,0.000666667}
stanumbers2 | [null]
stanumbers3 | {0.00116388}
stanumbers4 | [null]
stavalues1 | {145623,493985,667920}
stavalues2 | {64,92162,194071,294372,395093,490197,590029,689411,785237,889573,999954}
stavalues3 | [null]
stavalues4 | [null]

These values are later used when planning queries (notice the rows= values for HashAggregate):

# explain select distinct i from test;
QUERY PLAN
----------------------------------------------------------------
HashAggregate (cost=170.00..170.11 rows=11 width=4)
-> Seq Scan on test (cost=0.00..145.00 rows=10000 width=4)
(2 rows)
 
# explain select distinct j from test;
QUERY PLAN
----------------------------------------------------------------
HashAggregate (cost=170.00..269.44 rows=9944 width=4)
-> Seq Scan on test (cost=0.00..145.00 rows=10000 width=4)
(2 rows)
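For reference, stadistinct encodes the estimate in two ways: a positive value is an absolute count of distinct values, while a negative value is minus the fraction of rows that are distinct. A minimal sketch (my own helper function, not actual PostgreSQL code) of how the stadistinct values above map to the rows= estimates in these plans:

```python
# Sketch of how stadistinct translates into the planner's distinct-row
# estimate. Function name and rounding are mine; the real logic lives
# inside PostgreSQL's planner.

def estimated_distinct(stadistinct, total_rows):
    """Positive stadistinct is an absolute count; negative stadistinct
    is minus the fraction of rows that are distinct."""
    if stadistinct >= 0:
        return stadistinct
    return round(-stadistinct * total_rows)

print(estimated_distinct(11, 10000))       # 11   - column i
print(estimated_distinct(-0.9944, 10000))  # 9944 - column j
```

This matches the rows=11 and rows=9944 in the HashAggregate nodes above.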

Now, ANALYZE is not really deterministic. It analyzes a random sample of the
table, so it might get the stats wrong. What's also important: with a
nonstandard distribution of values, the number of distinct values in the
statistics can differ greatly from reality.

So, here comes “ALTER TABLE … SET STATISTICS DISTINCT”. Basically, it forces the number of distinct values in a given column. The logic is like this:

  • if the set value is 0, the system will use the value from system statistics (computed by ANALYZE and stored in the pg_statistic table)
  • if the value is > 0, it is taken as the absolute number of distinct values
  • if the value is < 0, the number of distinct values is calculated (at query planning time) as: estimated_row_count * -1 * value_set_by_user_with_alter_table
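The three rules above can be sketched like this (function name and signature are mine, for illustration only, not PostgreSQL internals):

```python
# Sketch of the override rules for a forced "statistics distinct" value.
# All names here are hypothetical; this only mirrors the three bullets above.

def effective_ndistinct(forced, analyzed, estimated_row_count):
    if forced == 0:
        return analyzed        # fall back to what ANALYZE put in pg_statistic
    if forced > 0:
        return forced          # absolute number of distinct values
    # negative: treated as minus the fraction of rows that are distinct
    return round(-forced * estimated_row_count)

print(effective_ndistinct(0, 11, 10000))      # 11   (system statistics)
print(effective_ndistinct(5, 11, 10000))      # 5    (absolute override)
print(effective_ndistinct(-0.75, 11, 10000))  # 7500 (fraction of rows)
```

The three calls correspond to the three experiments shown below: no override, a positive override of 5, and a negative override of -0.75.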

Let's see how it works.

First, this is how the explain output looks before forcing any number of distinct values:

# explain select distinct i from test;
QUERY PLAN
----------------------------------------------------------------
HashAggregate (cost=170.00..170.11 rows=11 width=4)
-> Seq Scan on test (cost=0.00..145.00 rows=10000 width=4)
(2 rows)

Now, let's play with it:

# alter table test alter column i set STATISTICS DISTINCT 5;
ALTER TABLE
 
# analyze test;
ANALYZE
 
# explain select distinct i from test;
QUERY PLAN
----------------------------------------------------------------
HashAggregate (cost=170.00..170.05 rows=5 width=4)
-> Seq Scan on test (cost=0.00..145.00 rows=10000 width=4)
(2 rows)

and let's test negative values:

# alter table test alter column i set STATISTICS DISTINCT -0.75;
ALTER TABLE
 
# analyze test;
ANALYZE
 
# explain select distinct i from test;
QUERY PLAN
----------------------------------------------------------------
HashAggregate (cost=170.00..245.00 rows=7500 width=4)
-> Seq Scan on test (cost=0.00..145.00 rows=10000 width=4)
(2 rows)

When confronted with nonsense values, the system either rejects them:

# alter table test alter column i set STATISTICS DISTINCT -2;
ERROR: number of distinct values -2 is too low

or accepts them, but adjusts the estimate to reality:

# alter table test alter column i set STATISTICS DISTINCT 1000000000000;
ALTER TABLE
 
# analyze test;
ANALYZE
 
# explain select distinct i from test;
QUERY PLAN
----------------------------------------------------------------
HashAggregate (cost=170.00..270.00 rows=10000 width=4)
-> Seq Scan on test (cost=0.00..145.00 rows=10000 width=4)
(2 rows)
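The adjustment seen here can be sketched as a simple clamp (a simplification with a hypothetical helper name): the planner never estimates more distinct values than the estimated total row count.

```python
# Sketch of the clamping behavior: a forced distinct count larger than
# the table's estimated row count is capped at the row count.

def clamped_ndistinct(forced, estimated_row_count):
    return min(forced, estimated_row_count)

print(clamped_ndistinct(1000000000000, 10000))  # 10000
```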

And of course we can go back to the system-computed number:

# alter table test alter column i set STATISTICS DISTINCT 0;
ALTER TABLE
 
# analyze test;
ANALYZE
 
# explain select distinct i from test;
QUERY PLAN
----------------------------------------------------------------
HashAggregate (cost=170.00..170.11 rows=11 width=4)
-> Seq Scan on test (cost=0.00..145.00 rows=10000 width=4)
(2 rows)

2 comments

  • Vao Tsun
    Aug 9, 2017

    It never made it into an official release – right?.. You can only set the statistics target with `SET STATISTICS`.

  • Aug 10, 2017

    @Vao Tsun:

    Yeah, looks like it got reverted later on 🙁
