Why does estimated memory have a three-phase shape?



  • The graph in this answer (https://dba.stackexchange.com/a/307388/36809) is for a 1-million-row table. When drawn for a 10,000-row table the graph has a rather different and more interesting shape.

    [Graph: estimated memory vs. column length for a 10,000-row table]

    There's a plateau when the column length is between 524 and 903.

    This shape is seen on both my work and home laptops (running SQL Server 2017 CU27, and 2017 CU22 plus 2019 CU8, respectively; both boxes have 16GB RAM).

    The slopes vary slightly between versions. [Graph: slope comparison across SQL Server versions]

    This makes me think it's a genuine design choice that's been tweaked rather than an artefact.

    Setting BATCH_MODE_ON_ROWSTORE = OFF for SQL Server 2019 makes barely any difference. Desired memory is the same at all sizes except t800, which is 6688kB with it OFF and 6656kB with it ON.

    How come this graph has this shape? What considerations make a three-region solution optimal at moderate row counts (10,000 rows) but not at larger row counts (1 million rows)?

    I understand no one outside the SQL Server development team will be able to conclusively state why this particular graph has this shape. But generally speaking, from academic and industry experience, if I were to start writing a memory management module for a hypothetical DBMS, what sort of considerations would lead to this sort of outcome?



  • A memory grant is an estimate based largely on expected cardinality and average row size, plus overheads due to implementation details.

    With smaller total grants, one can afford a bit more 'wiggle room' around the estimates (say a factor of 1.4). With larger grants, one might choose a lower fudge factor (say 1.25).

    There might also be fixed overheads associated with the additional memory needed when a sort exceeds its granted memory and spills to disk. Efficient processing of spilled data might need some kind of 'big output buffer'. One might choose to account for this need, or not, via a trace flag like 7470.

    For a range of common, moderately-sized memory grants, it might make sense to use a fixed additional grant (say 6MB) rather than mess about with small, detailed calculations that are unlikely to be exactly correct anyway. This would come on top of whatever fixed memory size (say 512KB) was needed as the absolute minimum for the sort's in-memory work.

    In this sort of arrangement, you would see one part of the line with a slope around 1.4, a flat section in the middle where the fixed additional grant is selected, and another part with a slope around 1.25. A sketch of such a policy follows below.
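    To make that concrete, here is a minimal Python sketch of one way such a three-region policy could look. Every name and constant here (the 512KB floor, the 6MB top-up, the 1.4 and 1.25 factors, and the region boundaries) is a hypothetical illustration drawn from the reasoning above, not SQL Server's actual values or internals.

        KB = 1024
        MB = 1024 * KB

        MIN_SORT_MEMORY = 512 * KB    # hypothetical absolute floor for the sort's in-memory needs
        FIXED_EXTRA_GRANT = 6 * MB    # hypothetical fixed additional grant for moderate sizes
        SMALL_FUDGE = 1.4             # generous wiggle room around small estimates
        LARGE_FUDGE = 1.25            # tighter fudge factor once grants get large

        # The flat region's value: the base minimum plus the fixed additional grant.
        PLATEAU = MIN_SORT_MEMORY + FIXED_EXTRA_GRANT

        # Region boundaries chosen here purely so the function stays continuous;
        # a real implementation could place them anywhere.
        SMALL_CUTOFF = PLATEAU / SMALL_FUDGE
        LARGE_CUTOFF = PLATEAU / LARGE_FUDGE

        def estimated_grant(rows: int, avg_row_bytes: float) -> float:
            """Estimate a sort memory grant from cardinality and average row size."""
            raw = rows * avg_row_bytes
            if raw <= SMALL_CUTOFF:
                # Small grants: raw estimate with the generous fudge factor.
                return max(MIN_SORT_MEMORY, raw * SMALL_FUDGE)
            if raw <= LARGE_CUTOFF:
                # Moderate grants: skip the detailed calculation, use a fixed amount.
                return PLATEAU
            # Large grants: raw estimate with the tighter fudge factor.
            return raw * LARGE_FUDGE

        # Plotting estimated_grant(10_000, n) against n gives the question's shape:
        # slope 1.4, then a plateau, then slope 1.25.
        for row_bytes in (100, 450, 490, 530, 700, 2000):
            grant_kb = estimated_grant(10_000, row_bytes) / KB
            print(f"{row_bytes:>5} bytes/row -> {grant_kb:,.0f} KB")

    Note that the flat region's value works out to 512KB + 6MB = 6656KB, which happens to match the 6656kB desired-memory figure quoted for t800 in the question. The boundaries in this sketch are picked only to keep the function continuous, so its plateau is narrower than the one observed (column lengths 524 to 903).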



