Database archiving strategy



  • I work on a payment system (a somewhat regulated area, with requirements to store very old data for audits & customer requests).

    We have SQL Server 2016 (but will move to 2019 soon).

    The system is 16 years old, and old data is not needed for its functioning (but it is needed for occasional analytics queries done by close-to-business analysts on a read-only replica).

    When I first thought about some kind of cold storage for old data, I thought about backups. But backups have two drawbacks:

    1. No single backup contains all the data (so if I need data that is 3 years old and 10 years old, I would probably have to restore two backups).
    2. They are hard to use, since they must be restored first, which takes time.

    Our business people are okay with removing old data from the main DB server, but they really want old data to remain easily accessible with a single SQL query (e.g. a slow machine holding all data for all years).

    Is this even possible? Some kind of replication that ignores huge data deletions (not all DELETE operations, but the specific DELETEs that clean up old data), or any other approach that would be easily maintainable in the long run?

    Thanks!

    P.S. We already use SSDs, and legally we can't use Stretch Database / the public cloud 😞



  • "Some kind of a replication that ignores huge data deletions".

    You could use transactional replication and modify the replication properties of specific articles (tables) to ignore deletes; that way you can replicate everything to a reporting or archive database and keep it there.

    There are lots of other things to consider with replication, of course, but by design it can do that.
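    As a sketch of what that article configuration looks like: `sp_addarticle` accepts a `@del_cmd` parameter, and setting it to `NONE` stops DELETEs on the publisher from being propagated to the subscriber. The publication, database, and table names below are placeholders, not anything from your system:

    ```sql
    -- Assumes transactional replication is already set up on the publisher.
    -- 'ArchivePub' and dbo.Transactions are hypothetical names.

    -- When adding a new article, tell replication not to propagate DELETEs:
    EXEC sp_addarticle
        @publication   = N'ArchivePub',
        @article       = N'Transactions',
        @source_owner  = N'dbo',
        @source_object = N'Transactions',
        @del_cmd       = N'NONE';  -- DELETEs at the publisher are not replicated

    -- For an article that already exists, change the property instead
    -- (this may invalidate the current snapshot, hence the flag):
    EXEC sp_changearticle
        @publication = N'ArchivePub',
        @article     = N'Transactions',
        @property    = N'del_cmd',
        @value       = N'NONE',
        @force_invalidate_snapshot = 1;
    ```

    With `@del_cmd = N'NONE'`, purging old rows on the publisher leaves them intact on the subscriber, while inserts and updates still flow normally — which matches the "slow machine with all data for all years" the business asked for. Be careful that ad-hoc DELETEs (mistakes, data fixes) are also not replicated, so the archive can drift from the source in ways you didn't intend.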



