Archive to the Lake

  • Comments posted to this topic are about the item Archive to the Lake

  • Very informative and helpful article.

  • We considered using the Stretch Database feature a while back, but the service was too costly. This isn't documented by Microsoft, but Stretch Database (I have deduced) actually uses Azure Synapse Analytics (SQL Data Warehouse) to store the data on the backend, and that's why the service is so expensive.

    Another option, which we have not tried but which from what I've read could work, is to create an archive database (e.g., SalesArchive) and mount its data files in Azure Blob Storage. The tables can be compressed (possibly columnstore?). The advantage of this is that the data remains seamlessly integrated into SQL Server.

    This is possible, but if anyone has actually implemented this, I'd like to hear how well it works in terms of technical snags and usability.

    CREATE CREDENTIAL [https://<storage account>.blob.core.windows.net/<container>]
    WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
    SECRET = '<your SAS key>';

    CREATE DATABASE SalesArchive
    ON
    ( NAME = salesarchive_dat,
      FILENAME = 'https://<storage account>.blob.core.windows.net/<container>/salesarchive.mdf' )
    LOG ON
    ( NAME = salesarchive_log,
      FILENAME = 'https://<storage account>.blob.core.windows.net/<container>/salesarchive.ldf' );




    "Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho

  • Stretch was crazy expensive. Not remotely viable, and deprecated now.

    I think the lake is different, where you use cheap text (blob) storage.

  • I have only a little experience with these matters, but I am very wary of solutions that take well-structured data in relational models and throw it into files for processing.  I don't see how that can be worked with more efficiently (and thus more cheaply).  It seems the answer is always just to throw more compute power at it, which ends up costing far more than was originally estimated when the solution was designed.

    [font="Tahoma"]Personal blog relating fishing to database administration:[/font]


