• In the past, I maintained a database containing TB-sized tables and billions of rows where occasionally (once every week or two) the data analysts needed to perform mass updates on subsets of data, typically tens of millions of rows. What I had was a table containing something like the following columns and a simple in-house developed app where users could submit requests, which I would then use to review and approve each request for deployment. For auditing and production control reasons, the users were still required to submit their ad-hoc DML script requests through the support desk system and enter the change order number.

    [ChangeOrderID] (links back to the IT ServiceNow support desk system)
    [SubmissionTime] (when the user submitted the request)
    [RequestTime] (when the user initially requested the deployment to occur)
    [ApprovedTime] (when I reviewed and approved the deployment)
    [ScheduledTime] (when I scheduled the deployment to occur)
    [StartedTime] (when the deployment actually started)
    [CompletionTime] (when the deployment actually completed)
    [StatusCode] (Pending, Processing, Success, Failed, Denied, etc.)
    [SQLText] (actual text of the T-SQL batch)
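
    For illustration only, a minimal sketch of how such a table might be declared; the column names come from the list above, but the data types, the surrogate key, and the defaults are assumptions:

    -- Hypothetical DDL sketch; only the column names are from the original setup.
    CREATE TABLE dbo.AdHocRequest
    (
        AdHocRequestID  int IDENTITY(1,1) NOT NULL PRIMARY KEY,     -- assumed surrogate key
        ChangeOrderID   varchar(20)   NOT NULL,                     -- ServiceNow change order number
        SubmissionTime  datetime      NOT NULL DEFAULT (GETDATE()),
        RequestTime     datetime      NOT NULL,                     -- when the user wants it deployed
        ApprovedTime    datetime      NULL,
        ScheduledTime   datetime      NULL,
        StartedTime     datetime      NULL,
        CompletionTime  datetime      NULL,
        StatusCode      varchar(20)   NOT NULL DEFAULT ('Pending'), -- Pending/Processing/Success/Failed/Denied
        SQLText         nvarchar(max) NOT NULL                      -- the T-SQL batch to run
    );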

    I had a job polling the AdHocRequest table every minute, executing the TOP 1 request where StatusCode = Pending and ScheduledTime <= GETDATE(). The job also updates the StartedTime, CompletionTime, and StatusCode columns at start and at completion. It's important to set StatusCode = Processing in the first step, so the same request doesn't get executed repeatedly. So you're essentially looking at a single table, an app that any developer could throw together in a day or two, and a scheduled job (sketched below). It's nothing fancy, but it covered all the bases and was used extensively for a long time.
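
    Purely as a sketch of the idea rather than the original code, the job step might look something like this; the UPDLOCK/READPAST hints and the TRY/CATCH handling are assumptions added to keep the single-row claim safe if job runs ever overlap:

    DECLARE @RequestID int, @SQLText nvarchar(max);

    -- Claim the next due request inside a short transaction; UPDLOCK holds the
    -- row until COMMIT and READPAST skips rows another session already claimed.
    BEGIN TRAN;

    SELECT TOP (1)
           @RequestID = AdHocRequestID,
           @SQLText   = SQLText
    FROM dbo.AdHocRequest WITH (UPDLOCK, READPAST)
    WHERE StatusCode = 'Pending'
      AND ScheduledTime <= GETDATE()
    ORDER BY ScheduledTime;

    IF @RequestID IS NOT NULL
        UPDATE dbo.AdHocRequest
           SET StatusCode  = 'Processing',
               StartedTime = GETDATE()
         WHERE AdHocRequestID = @RequestID;

    COMMIT;

    -- Run the submitted batch and record the outcome.
    IF @RequestID IS NOT NULL
    BEGIN
        BEGIN TRY
            EXEC (@SQLText);

            UPDATE dbo.AdHocRequest
               SET StatusCode     = 'Success',
                   CompletionTime = GETDATE()
             WHERE AdHocRequestID = @RequestID;
        END TRY
        BEGIN CATCH
            UPDATE dbo.AdHocRequest
               SET StatusCode     = 'Failed',
                   CompletionTime = GETDATE()
             WHERE AdHocRequestID = @RequestID;
        END CATCH
    END

    Committing before the EXEC keeps the claim transaction short: the row is already marked Processing, so the analyst's batch can run as long as it needs without holding those locks open.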

    "Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho