ETL Framework for Data Warehouse Environments
The Non-Functional ETL Requirements
This course provides a high-level approach to implementing an ETL framework in a typical Data Warehouse environment. The practical approaches apply both to new applications that need a highly reusable ETL solution, with different data loading strategies, error/exception handling, audit balance and control handling, basic job scheduling and restartability features, and to existing ETL implementations. For an existing implementation, the framework must be embedded into the current environment, jobs and business requirements, which may extend to redesigning the mappings/mapplets and workflows (ETL jobs) from scratch. That is usually a worthwhile decision, considering the benefits of high reusability and improved design standards.
This course combines standard and practical approaches to designing and implementing a complete ETL solution, detailing the guidelines, standards, developer/architect checklists and the benefits of reusable code. It also teaches the best practices and standards to follow when implementing an ETL solution.
Though this course covers ETL design principles and solutions based on Informatica 10.x and Oracle 11g, the same concepts can be applied with any of the ETL tools in the market, such as IBM DataStage, Pentaho, Talend and Ab Initio.
Multiple reusable code bundles from the marketplace, checklists, and the material required to get started with basic UNIX commands and Shell Scripting will be provided.
What are the requirements?
- Basic understanding of Data Warehouse Concepts
- Basic understanding of ETL Concepts
- Basic understanding of SQL commands and RDBMS concepts
- Basic understanding of UNIX commands
- Basic understanding of Shell/BAT Scripting
What am I going to get from this course?
- This course provides a detailed approach to implementing an ETL framework in typical Data Warehouse environments. The approach can be used for a new application that needs a highly reusable ETL solution with data loading, error handling, audit handling, job scheduling and restartability features. The framework helps reduce delivery time and increase quality through high reusability and consistent design standards.
- Metadata Categories: learn the types of metadata commonly used in a real project and how they differ from the business and technical viewpoints.
- ETL Framework process flow: the process flow and the activities to take care of during the ETL framework implementation, from file (source data) validations through exception handling and audit control.
- Data Sourcing: the different types of data sourcing possible in a Data Warehouse environment, the mechanisms through which data sourcing can happen, such as scheduled events, Change Data Capture, Pub-Sub and web services/API connectivity, and their classification.
- Commonly required/used scripts for Data Sourcing: the validations to perform during data sourcing and the functionality to include in the scripts (shell/bat).
- File Validation process, post-file-validation steps and file validation failure notifications.
- Staging Layer: the need for a staging layer, reference data, audit columns for staging and reference tables, data retention in the staging layer, partitions and DB standards.
- Business Validation Layer: the different situations possible during data processing, the concurrent workflow process, and partitions in the staging and business validation layers.
- Data Warehouse Layer: dimension load, fact load types/process, fact partitions, fact summary load and source file management/archival.
- Exception Handling/Error Handling: the data model for exception handling, error category, error code and the different possible solutions for exception handling.
- Sample Project Setup: steps to download the project setup, executing the DDLs for metadata, the project walkthrough and importing the code base into Informatica.
- Extending the Operational Metadata’s Data Model for exception handling with additional supporting tables.
- Error Handling Data Model, the framework for the data model design.
- Using PMREP tables for exception handling.
- Audit, Balance and Control, the need, different technology components involved, table structure and data model, workflow example.
- Configuration Management: software change management, covering the identification, tracking and management of all the assets/objects of a project. This is one of the standard project management processes, the formal way of managing software changes and deploying code from development through testing to production.
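To give a flavour of the file-validation topic covered above, here is a minimal shell sketch of the kind of checks a data-sourcing script performs: existence, zero-byte, and record-count reconciliation against a control file. The file names and the control-file convention (a single line holding the expected row count) are illustrative assumptions, not the course's actual code.

```shell
#!/bin/sh
# Minimal file-validation sketch: checks that a data file exists, is non-empty,
# and that its record count matches the count declared in a companion control
# file. File names and control-file format are hypothetical conventions.

DATA_FILE="customer_20240101.dat"
CTL_FILE="customer_20240101.ctl"

# Create sample input so the sketch is runnable end to end.
printf 'row1\nrow2\nrow3\n' > "$DATA_FILE"
printf '3\n' > "$CTL_FILE"

validate_file() {
    data="$1"; ctl="$2"

    # 1. Existence check
    [ -f "$data" ] || { echo "FAIL: $data missing"; return 1; }

    # 2. Zero-byte check
    [ -s "$data" ] || { echo "FAIL: $data is empty"; return 1; }

    # 3. Record-count check against the control file
    actual=$(wc -l < "$data" | tr -d ' ')
    expected=$(tr -d ' \n' < "$ctl")
    if [ "$actual" -ne "$expected" ]; then
        echo "FAIL: expected $expected records, found $actual"
        return 1
    fi

    echo "PASS: $data validated ($actual records)"
}

validate_file "$DATA_FILE" "$CTL_FILE"
```

A failing check would typically trigger the failure-notification step described in the course rather than just printing a message.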
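The audit, balance and control topic can likewise be sketched in a few lines of shell: a wrapper records job start/end events and reconciles rows read against rows written. A real implementation would write to audit tables in the database; the flat-file log format, job name and dummy workload here are assumptions for illustration only.

```shell
#!/bin/sh
# Minimal audit-balance-and-control sketch: records job start, end, status and
# row counts to a flat audit log. Log format and job names are illustrative.

AUDIT_LOG="etl_audit.log"
: > "$AUDIT_LOG"   # start with an empty log

audit() {
    # audit <job_name> <event> <detail>
    echo "$(date '+%Y-%m-%d %H:%M:%S')|$1|$2|$3" >> "$AUDIT_LOG"
}

run_job() {
    job="$1"
    audit "$job" "START" "-"

    # Dummy workload standing in for the real ETL step: count source rows.
    printf 'a\nb\nc\nd\n' > source.tmp
    rows=$(wc -l < source.tmp | tr -d ' ')

    # Balance check: rows read must equal rows written (trivially true here).
    audit "$job" "END" "rows_read=$rows rows_written=$rows status=SUCCESS"
    rm -f source.tmp
}

run_job "load_customer_stg"
cut -d'|' -f2-4 "$AUDIT_LOG"
```

In practice the END event's detail would drive balance reports and alert on any read/write mismatch.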
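Finally, the restartability feature mentioned above follows a common checkpointing pattern, sketched below: each step records a checkpoint once complete, and on rerun already-completed steps are skipped. The checkpoint file and step names are hypothetical; Informatica workflows achieve the same effect through their own recovery mechanisms.

```shell
#!/bin/sh
# Minimal restartability sketch: steps record a checkpoint when they finish,
# and a rerun skips any step already checkpointed. Names are illustrative.

CHECKPOINT="job.chk"

step_done() { grep -qx "$1" "$CHECKPOINT" 2>/dev/null; }
mark_done() { echo "$1" >> "$CHECKPOINT"; }

run_step() {
    name="$1"
    if step_done "$name"; then
        echo "SKIP $name (already completed)"
    else
        echo "RUN $name"
        mark_done "$name"
    fi
}

rm -f "$CHECKPOINT"

# First run: all steps execute.
run_step "extract"
run_step "stage_load"

# Simulated restart after a failure: the completed step is skipped and the
# job resumes from where it left off.
run_step "extract"
run_step "warehouse_load"
```

The same idea scales up when the checkpoint lives in an audit/control table rather than a local file, so restarts work across servers.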
What is the target audience?
- ETL Developers/Administrators
- ETL Testing Professionals
- Data Architects and Data Modelers
- Data Scientists and Big Data Experts who want to understand the practical Data Warehouse Process
- Database Administrators who want to explore the DWH/ETL/BI areas
- BI/ETL/DW Technology experts and Team Leaders
- Software Engineers who are already part of any Data Warehouse and Business Intelligence Projects
- Software Engineers from different technology background who want to explore the Data Warehouse and Business Intelligence development process
- Mainframe developers who want to switch their careers into the Data Warehouse stream
- Freshers/Engineering Graduates who are looking for placements
- Non-IT professionals who would like to learn how data is handled in enterprises
Data Management and Business Intelligence Consultant/Trainer with 16+ years of extensive experience across various client engagements. Has delivered many large Data Management projects (Data Integration, Data Quality, Data Governance, Metadata Management, Master Data Management, Data Security, Data Catalog, etc.) and trained numerous professionals on various tools and technologies. Has worked extensively on all facets of Data Management, including requirement gathering, gap analysis, database design, data integration, data modeling, enterprise reporting, data analytics, data quality, data visualization and OLAP.
Has worked in a broad range of business verticals and holds exceptional expertise in various ETL tools such as Informatica PowerCenter, SSIS, ODI and IDQ, as well as Data Virtualization, DVO and MDM.