SQLServerCentral is supported by Redgate

Business Intelligence/Data Warehouse Assessment

When I am tasked to do a business intelligence or data warehouse assessment, the steps I take depend on the amount of time and the number of people I have.  The result of the assessment will be a plan to build a new data warehouse or business intelligence solution, along with a proposed architecture for the new solution (i.e. a “Data Warehouse Architecture Blueprint”).  And obviously, the bigger the solution, the more time should be spent on the assessment.

Here are those steps based on a quick, medium, or long approach:

Quick approach (2-4 weeks, 1 person):

Interview business units to find gaps, needs, and pain points; diagram the existing environment; create an assessment document; create a solution document of 7-10 pages (with goals and technologies to prove out); create a proposed architecture presentation.

Medium approach (4-6 weeks, 2 people):

The solution document will contain more details when using the medium approach.

The solution document could take the form of a data warehouse architecture blueprint (30-40 pages) that outlines the enterprise information architecture, the data warehouse solution, and the supporting infrastructure and environment needs for the data warehouse solution.  Its TOC may look like:

  1. Executive Summary
  2. Data Warehouse Architecture Vision
    1. Current Environment
    2. Data Warehouse Vision
  3. Data and Integration Architecture
    1. Data Extraction and Staging
    2. Master and Operational Data
    3. Data Warehouse Data
    4. Business Area Cubes
    5. ETL, Integration, and Auditing
  4. BI Architecture
    1. Overall Architecture
    2. BI Governance and Compliance
      1. Governance Risks
      2. BI Governance Plan
      3. SharePoint BI Center Design
  5. Security and Infrastructure Architecture
    1. Security Architecture
    2. Infrastructure

It is also useful to create a data warehouse roadmap (30-40 pages) that identifies the timeline, resources, and approach to implement a corporate data warehouse.  Its TOC may look like:

  1. Executive Summary
  2. Data Warehouse Roadmap Vision
    1. Corporate Support Objectives
      1. Success Factors
      2. Current Architecture and Drawbacks
      3. Architectural Vision
    2. Business Value
    3. Project Planning
      1. Architecture Foundation (Platform and Infrastructure)
      2. Initial EDW and BI Iteration Planning
      3. Master Data Management
  3. Resource Planning
    1. Team Roles and Human Resources
      1. Team Roles
      2. Human Resources
      3. Team Skills Development
    2. Services, Software, and Systems
      1. Services
      2. Software
        1. Platform Toolset
        2. Client Software
        3. Developer and Management Tools
      3. Production Hardware Environment

Another layout that combines the above two documents could look like this:

  • Project description: Overview, scope, high-level requirements
  • Current technical architecture: Overview, logical architecture, system inventory, data architecture
  • Current capabilities: Overview, data management, analysis, information delivery
  • Target architecture and components: Overview, target data architecture diagram, technical architecture key decisions, data management, data integration, MDM, OLAP, presentation, metadata management, people, processes, roles and responsibilities, tools and technologies, data governance, target capabilities, target constraints, gap analysis
  • Implementation approach: Overview, key decisions, description of phases, roadmap

Long approach (10-25 weeks, 3 people):

Assess your current environment, identify current and future requirements, prioritize requirements using a business valuation framework, and thereby develop a roadmap.  The method is inherently iterative but will be completed in four phases:

Phase 1 - Assessment and Discovery

  1. Situational Assessment: History of BI and DW, priorities for future growth
  2. Architecture Discovery and Assessment: Gain a broad and rich understanding of the current architecture from multiple perspectives, including principles, systems, data, tools, technology, and infrastructure
  3. Organizational Assessment: Review roles and responsibilities, evaluate skillsets of team members, and identify gaps in skills and proficiency
  4. Data Quality Assessment: Profile data
  5. Process Assessment: Current methodologies used within the organization around all aspects of delivering analytical solutions and their conformance to best practices
  6. Governance Assessment: Review the data governance practices
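The Data Quality Assessment step above typically starts with basic column profiling: row counts, null counts, and distinct-value counts per column. A minimal sketch of that pass, using only the standard library; the column names and sample rows here are hypothetical, standing in for an extract from a real source system:

```python
# Minimal column-profiling pass for a data quality assessment.
# The CSV extract below is made up purely for illustration.
import csv
import io
from collections import Counter

SAMPLE_CSV = """customer_id,region,revenue
1001,East,2500.00
1002,West,
1003,East,1200.50
1004,,980.00
1003,East,1200.50
"""

def profile(rows, columns):
    """Return (row count, per-column null and distinct-value counts)."""
    stats = {c: {"nulls": 0, "distinct": Counter()} for c in columns}
    total = 0
    for row in rows:
        total += 1
        for c in columns:
            value = (row.get(c) or "").strip()
            if value == "":
                stats[c]["nulls"] += 1      # treat empty strings as nulls
            else:
                stats[c]["distinct"][value] += 1
    return total, stats

reader = csv.DictReader(io.StringIO(SAMPLE_CSV))
total, stats = profile(reader, ["customer_id", "region", "revenue"])

print(f"rows: {total}")
for col, s in stats.items():
    print(f"{col}: nulls={s['nulls']}, distinct={len(s['distinct'])}")
```

Running a pass like this against each candidate source surfaces the duplicate keys, missing attributes, and inconsistent domains that feed directly into the Data Quality Assessment findings.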

Phase 2 - Opportunity Assessment and Architecture Definition

  1. Opportunity Identification: Reveal high level expectations for the future of business intelligence
  2. Needs Definition: Document and rank opportunities

Phase 3 - Future State Architecture Definition

  1. Architecture Workshops: Builds on the Architecture Discovery sessions to obtain and define more detail regarding the desired system, data, and technical architecture revisions required in the migration plan.  The workshops also drive the pilot exercises by producing requirements for technology and functionality; the results of each pilot exercise then inform updates to the architecture recommendations and generate requirements for the next pilot exercise
  2. Optimal Component Scheme: Specify the classes of components required to implement the architecture, along with their functionality

Phase 4 - Targeting and Migration Plan

  1. Opportunity Targeting: Synthesizes the results of the Needs Definition and the Architecture Workshops to define incremental work efforts that can be bundled into discrete delivery projects for the rollout
  2. Incremental Investment Plan: Combines the proposed project specifications and the architecture specifications to create an overall plan covering a minimum of 12-18 months

James Serra's Blog

James is a big data and data warehousing technology specialist at Microsoft. He is a thought leader in the use and application of Big Data technologies, including MPP solutions involving hybrid technologies of relational data, Hadoop, and private and public cloud. Previously he was an independent consultant working as a Data Warehouse/Business Intelligence architect and developer. He is a prior SQL Server MVP with over 30 years of IT experience. James is a popular blogger (JamesSerra.com) and speaker, having presented at dozens of PASS events including the PASS Business Analytics conference and the PASS Summit. He is the author of the book “Reporting with Microsoft SQL Server 2012”. He received a Bachelor of Science degree in Computer Engineering from the University of Nevada-Las Vegas.


Leave a comment on the original post at www.jamesserra.com.
