Schema Security


  • I had always stuck to [dbo] until I started architecting our data warehouse/BI strategy, and I am now using schemas to manage the project boundaries between the various ETL jobs that pull from different parts of the business. Putting all the relevant objects in the [mfg], [bud], [inv], [sls], etc. schemas (with shared objects/functions in [ref] or [dbo]) really helps keep developers from stepping on too many toes, and allows shorter names because the schema helps define the object.

    I'm a convert.
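    A minimal sketch of that layout (the schema and table names here are illustrative, not taken from the post):

    ```sql
    -- Per-business-area schemas, plus a shared [ref] schema for common objects.
    CREATE SCHEMA mfg AUTHORIZATION dbo;
    CREATE SCHEMA sls AUTHORIZATION dbo;
    CREATE SCHEMA ref AUTHORIZATION dbo;
    GO

    -- Each ETL project owns the objects inside its own schema...
    CREATE TABLE mfg.WorkOrder (WorkOrderID int PRIMARY KEY, DueDate date);
    CREATE TABLE sls.Invoice   (InvoiceID   int PRIMARY KEY, InvoiceDate date);

    -- ...while shared lookups live in [ref], referenced by everyone.
    CREATE TABLE ref.Calendar  (CalendarDate date PRIMARY KEY, FiscalYear int);
    ```

    Because the schema carries the business-area context, the object names themselves can stay short ([mfg].[WorkOrder] rather than a WorkOrder_Mfg-style prefix).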

  • Great practical topic. Thanks Steve.
    Using schemas is not common in my company, but I personally take a not-dbo-only approach, using schemas to group objects and to implement granular security controls.

  • ahooper - Friday, March 1, 2019 6:13 AM

    I had always stuck to [dbo] until I started architecting our data warehouse/BI strategy, and I am now using schemas to manage the project boundaries between the various ETL jobs that pull from different parts of the business. Putting all the relevant objects in the [mfg], [bud], [inv], [sls], etc. schemas (with shared objects/functions in [ref] or [dbo]) really helps keep developers from stepping on too many toes, and allows shorter names because the schema helps define the object.

    I'm a convert.

    This is pretty much my story, too, although I use the schemas to identify the ultimate source of the data.  For things that are created in the data warehouse to support the cubes, the schema is datawarehouse.  I rarely assign anything to dbo now.

  • I'm a proponent of schema for both security and organization.
    Our design for data access and manipulation is fairly straightforward: the R part of CRUD is done via views, and the CUD part is done through stored procedures.

    Schemas facilitate this nicely; our tables sit in different schemas than our views, procs, and functions.
    We create tables in data schemas (typically named [data], [config], [audit], plus others relevant to the solution such as [customer], [engineering], etc.), then we create views, procs, synonyms, etc. in "role" schemas such as [app], [service], [bi], [security], etc. We don't assign any permissions to the data schemas; rather, using AD groups, we grant SELECT, EXECUTE, and VIEW DEFINITION (as needed) on the appropriate schema(s).

    This has been very effective and it's encouraging that you're seeing more usage in the schema direction.
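    The permission model described above can be sketched in T-SQL like this (the schema, role, and table names are illustrative):

    ```sql
    -- Tables live in [data]; the views and procs that expose them live in [app].
    CREATE SCHEMA data AUTHORIZATION dbo;
    CREATE SCHEMA app  AUTHORIZATION dbo;
    GO

    CREATE TABLE data.Customer (CustomerID int PRIMARY KEY, Name nvarchar(100));
    GO

    CREATE VIEW app.Customer AS
        SELECT CustomerID, Name FROM data.Customer;
    GO

    -- Nothing is granted on [data]; all access goes through the role schema.
    -- In practice the role would be mapped to an AD group login.
    CREATE ROLE app_users;
    GRANT SELECT, EXECUTE, VIEW DEFINITION ON SCHEMA::app TO app_users;
    ```

    Because both schemas are owned by the same principal (dbo), ownership chaining lets app.Customer read data.Customer without granting anything on the [data] schema itself.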


  • I am in the early phases of building the data warehouse.  I put all my translation tables into the etl schema and kept the facts and dimensions in dbo.

  • I put all my translation tables into the etl schema and kept the facts and dimensions in dbo.

    Consider using a more elaborate schema for the facts and dimensions.  One suggestion above was by functional area.  I identify via schema the source database for the tables.  As I use views for all the dim and fact tables (a suggestion from Kimball's 3rd edition book) the schema for those identifies the supported cube.  Works great, although sometimes the DSV is a little fiddly when working with views versus tables.

  • We have several internal applications that make use of the same database. We use schemas to separate data and code specific to one application or another into their own schemas. This makes it much easier to organize the associated objects. It didn't take long to get the development team used to the idea.

  • RonKyle - Friday, March 1, 2019 8:18 AM

    I put all my translation tables into the etl schema and kept the facts and dimensions in dbo.

    Consider using a more elaborate schema for the facts and dimensions.  One suggestion above was by functional area.  I identify via schema the source database for the tables.  As I use views for all the dim and fact tables (a suggestion from Kimball's 3rd edition book) the schema for those identifies the supported cube.  Works great, although sometimes the DSV is a little fiddly when working with views versus tables.

    Thanks, Ron.  Along that line, I was thinking of putting aggregate tables or a presentation layer into another schema.  My tables have multiple sources, such as all customers into one customer dimension.

  • nguntert - Friday, March 1, 2019 8:27 AM

    RonKyle - Friday, March 1, 2019 8:18 AM

    I put all my translation tables into the etl schema and kept the facts and dimensions in dbo.

    Consider using a more elaborate schema for the facts and dimensions.  One suggestion above was by functional area.  I identify via schema the source database for the tables.  As I use views for all the dim and fact tables (a suggestion from Kimball's 3rd edition book) the schema for those identifies the supported cube.  Works great, although sometimes the DSV is a little fiddly when working with views versus tables.

    Thanks, Ron.  Along that line, I was thinking of putting aggregate tables or a presentation layer into another schema.  My tables have multiple sources, such as all customers into one customer dimension.

    I have some tables with multiple sources too, but there's usually a primary source that provides most of the data.  You might consider "common" or some such.  I'm just suggesting minimizing the use of the dbo schema.

  • Yep, schemas are a great way to segregate data, and we use them extensively.  Oh, you need access to all the data related to X? Great, you have access to X's schema; there you go.

  • It is a common practice here to use schemas. At first I balked at it, but I have come to realize that using schemas is a good practice.

    Kindest Regards, Rod

  • We mostly use schemas for organization, but we do have some per-schema permissions set. This is partly because the DB started in MySQL, and was only recently migrated to SQL Server, but schemas are also a basic structure / organizational tool in the larger database world -- the hierarchy, slightly simplified, is database -> schema -> table -> row.  I wish SSMS had an option to display schemas as a tree level in Object Explorer, but in the meantime, SSMS Schema Folders is a very useful tool:
    https://github.com/nicholas-ross/SSMS-Schema-Folders

  • Schemas can be a great tool, as everyone has previously stated. But for the love of all humanity, do not use schemas such as 'DEV', 'QA', and 'PROD' to identify the DB environment. The situation I walked into, courtesy of previous contractors and consultants: the development database objects are all in the schema 'DEV', the QA database objects are in the schema 'QA', and yep, you guessed it, the production database objects are all in the schema 'PROD'. For reference, all of these database environments are separate: separate servers, separate databases, separate security and permissions. But every time we migrate a stored procedure, someone has to edit the code for the correct schema before deploying it. And when I copy down a production database for development or QA (yeah, get over it, you all do it), I have to alter the schema on all the objects AND drop and recreate all the triggers (again, get over it), stored procedures, and functions to point at the correct schema for that environment. It is freaking hell.

    For new development I am forcing the developers to use schemas correctly. But the legacy stuff is just going to live on forever with this arrangement, because no one has the time or the will to change it all.

  • I've used schemas since around 1999, when I started using Oracle. The fact that SQL Server didn't use them has always been strange to me, and I was pretty happy when they arrived in SQL Server and I even got a nice DBA who let me use them.

    When designing the DWH solution architecture, you need to map the physical levels in the database (server, instance, database, schema, table) to the logical layers in your architecture (such as staging, persistent storage, data delivery, and security management), and having schemas fills a big gap there.

    Personally I have used schemas either to map to the logical layers (staging etc.) or, more recently, to map to the data delivery sources. I use these straight through all logical layers (mapped to databases) up until persistent storage (or what others might call the DWH layer), and then in the data delivery layer people will get multiple schemas from me:
    - one standard schema with the "as is" data (in views with triggers, or, if performance is an issue, with semi-materialized views)
    - one schema with just synonyms
    - one schema with historical views
    - other schemas with different ways of looking at the data, at will
    These are usually distinguished by postfixes. The advantage is that I can fix things without breaking anything else I don't want to touch. Automated rollout will take care of issues that need to be fixed for everyone.

    But as long as people don't dump stuff in [dbo] I'm pretty much okay with anything. My DBA introduced a trigger to prevent people from deploying into [dbo] (or default storage spaces for that matter) and I think that was a pretty good idea.
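    A database-level DDL trigger along those lines might look like the following (a sketch; the exact check the DBA used isn't shown in the post, and the trigger and error text here are illustrative):

    ```sql
    CREATE TRIGGER trg_BlockDboDeploys
    ON DATABASE
    FOR CREATE_TABLE, CREATE_VIEW, CREATE_PROCEDURE, CREATE_FUNCTION
    AS
    BEGIN
        -- EVENTDATA() returns XML describing the DDL event, including
        -- the schema of the object being created.
        DECLARE @event  xml = EVENTDATA();
        DECLARE @schema sysname =
            @event.value('(/EVENT_INSTANCE/SchemaName)[1]', 'sysname');

        IF @schema = N'dbo'
        BEGIN
            RAISERROR(N'Objects may not be created in the [dbo] schema.', 16, 1);
            ROLLBACK;  -- undo the offending CREATE statement
        END
    END;
    ```

    The FOR list can be extended with other DDL events (CREATE_SYNONYM, CREATE_TYPE, etc.) to cover whatever object kinds the team deploys.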

