ZAHID MEHMOOD, Senior Data Architect

Saudi Business Machines

Location
Saudi Arabia
Education
Master's degree, Computer Science
Experience
12 years, 2 months


Work Experience

Total years of experience: 12 years, 2 months

Senior Data Architect at Saudi Business Machines
  • Saudi Arabia - Riyadh
  • Current role since April 2014

Summary
• Over 10 years of experience in Data Analysis, Data Migration, Data Modeling, Data Integration, Data Warehousing, Database Design/Administration, and ETL, with a total of 18 years of IT experience.
• Db2/Oracle/MSSQL administration, performance tuning, and data migration for batch processing.
• Db2/Oracle/MSSQL backup & restore, access control, and security design.
• Netezza Administration using nzadmin and NPS (Netezza Performance Server).
• Netezza data migration of history and daily loads using nzload, nzmigrate, and External Tables methods.
• Netezza backup using nzhostbackup, nzbackup -users, nzbackup -db.
• Netezza Administration, Access Control and Security design.
• Hands-on experience in handling and processing unstructured data.
• End-to-end successful implementation of 10+ projects; experienced working through the full SDLC of a project implementation.
• Experienced in Dimensional and ER data modeling concepts including Conceptual, Logical & Physical data models.
• Understanding of the Ralph Kimball and Bill Inmon models, data warehousing, data marts, and star and snowflake schema concepts.
• Worked with Waterfall and Agile methodologies for data warehouse development; provided 24/7 production support on a rotational basis.
• Excellent communication and interpersonal skills; versatile team player with proven problem-solving skills.
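As a hedged illustration of the nzload and nzbackup usage the bullets above describe (the database, table, and path names here are hypothetical, not taken from any actual project), a small wrapper can assemble a typical pipe-delimited staging load command:

```shell
#!/bin/sh
# Illustrative sketch only: SALESDW, STG_ORDERS, and the file paths are made up.
# Builds a typical nzload command line for a pipe-delimited staging load.
build_nzload_cmd() {
  db="$1"; table="$2"; datafile="$3"
  # -delim sets the field delimiter; -maxErrors aborts the load after too many rejects
  echo "nzload -db $db -t $table -df $datafile -delim '|' -maxErrors 10"
}

build_nzload_cmd SALESDW STG_ORDERS /data/orders.dat

# Corresponding backup forms named above (not executed here):
#   nzbackup -db SALESDW -dir /backup/salesdw   # full database backup
#   nzbackup -users -dir /backup/users          # users, groups, permissions
```

Wrapping the command construction this way keeps the delimiter and reject-threshold conventions in one place across daily and history loads.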

-------------
King Saud University/SBM, Riyadh, KSA (Apr 2014 to Present)
Senior Information Management Consultant/Architect

SBM provided a data warehouse and eQMS solution integrating roughly eighty data sources to fulfill KSU's reporting, querying, and accreditation requirements through an efficient, customer-centric decision support system. KSU must continually demonstrate compliance with regional and programmatic accreditation standards, an effort supported by the eQMS and DWH solution. The source systems are integrated into the DWH using an efficient, standardized staging area and ETL techniques.
Responsibilities:
• Design data architectures; define and build relational and dimensional databases.
• Develop strategies for data acquisition, archive recovery, and database implementation.
• Design data quality processes and methodologies.
• Implement DataStage QualityStage rules using the Investigate, Standardize, Match Frequency, and Survive stages.
• Provide direction, instruction, and guidance to team members to achieve major project milestones on time.
• Design and implement ETL processes and the approach to stage, transform, and load data into the target Netezza data warehouse.
• Prepare ETL standards and performance-tuning documentation.
• InfoSphere server hardware and software setup, installation, and support.
• Configure multiple environments (dev, UAT, and production) in DataStage, the file server, and Netezza.
• Design Netezza databases for the development and production environments.
• Extract and analyze data from eighty sources across various databases and data formats, including XML files, spreadsheets, and databases, to load into the data warehouse and data marts.
• Prepare user training materials and conduct hands-on training sessions.
• Mentor and guide the ETL team and KSU users, following best-practice standards.
• Write crontab scheduler scripts using dsjob to capture return codes for error handling and notification.
• Support production issues during the monthly and daily production batches.
• Use ETL and SQL tuning techniques to achieve performance benchmarks.
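The crontab/dsjob error handling mentioned above might be sketched as follows. This is a minimal, assumed illustration (the project and job names `KSU_DWH` and `LoadStaging` are hypothetical, and dsjob itself is not invoked here); with `-jobstatus`, dsjob waits for the job and exits with a code reflecting the job status (1 = finished OK, 2 = finished with warnings, other values indicate failure):

```shell
#!/bin/sh
# Hedged sketch of a crontab wrapper around dsjob (names are illustrative).
# Maps a dsjob -run -jobstatus exit code to a notification message.
notify() {
  case "$1" in
    1) echo "OK: job finished" ;;
    2) echo "WARN: job finished with warnings" ;;
    *) echo "FAIL: job aborted (rc=$1)" ;;
  esac
}

# In the real script this would be:
#   dsjob -run -jobstatus KSU_DWH LoadStaging; rc=$?
# and the message would be mailed to the support list.
rc=1
notify "$rc"
```

Capturing the return code this way lets the same crontab entry drive both error handling and success/failure notification.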

Environment
IBM InfoSphere DataStage 11.3/9.1, Netezza 6, DB2 9.x, SQL Server 2008, Oracle 11g, crontab, MS Visio, Erwin Data Modeler, Cognos

Senior Data Warehouse Consultant at TD Group
  • Canada
  • October 2013 to March 2014

Responsibilities:
• Worked on reject handling and reconciliation of data from upstream flows.
• Developed complex jobs to transform canonical XML files into the MDM XML format.
• Loaded MDM XML files by calling web services to load into the MDM Server.
• Wrote AIX shell scripts for data handling and cleansing.
• Ran daily batches to perform unit testing and bug fixes.
• Promoted code to the PAT, UAT, and production environments.
• Used the XML Input/Output/Transformer, Web Services, CDC, Pivot, ODBC/Connector, Surrogate Key, Data Set, File Set, Aggregator, and Sequential File stages.
• Developed best-practice jobs and common routines.

Senior Data Warehouse Consultant at Canadian National Railway
  • Canada
  • March 2012 to September 2013

Responsibilities:
• Involved in the various stages of data warehousing life cycle development, physical design of database, ETL Designing and coding using Agile methodology.
• Used Datastage Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into data warehouse.
• Worked closely with BI Practice team to develop Common Routines and Framework for Reject Handling, Audit Control and Logging.
• Developed complex jobs to load Dimension tables, Base Fact, Aggregate Fact and Derived Facts in Netezza target data warehouse.
• Wrote SQL queries to extract data from various source systems, including XML, Mainframe (CFF), SAP, Oracle, MSSQL, and Netezza, into the staging and dimensional areas.
• Created UNIX shell scripts for data cleansing, pre-/post-processing, and job scheduling.
• Wrote unit test cases, prepared test data, and executed the test cases.
• Wrote integration test cases, prepared test data by writing data-copy scripts from production, and executed and validated the test cases.
• Involved in the change management process: created change requests and attended CAB and TAB meetings.
• Promoted code to the UIT, UAT, and production environments using ISPW Eclipse.
• Developed complex jobs using the Transformer, CDC, Pivot, ODBC/Connector, Oracle, Surrogate Key, Slowly Changing Dimension, File Set, Aggregator, and Sequential File stages.
• Used the Netezza nzload and external table (ET) methods to load data into the Netezza data warehouse.
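The UNIX cleansing scripts mentioned above might look like this minimal, assumed sketch (the file name and content are hypothetical): trim stray whitespace and drop blank lines from an extract before load:

```shell
#!/bin/sh
# Hedged illustration of a pre-processing cleansing step (sample data is made up):
# trim leading/trailing whitespace and drop empty lines before loading.
clean_extract() {
  sed -e 's/^[[:space:]]*//' -e 's/[[:space:]]*$//' "$1" | grep -v '^$'
}

# Create a small sample extract and clean it.
printf '  a  \n\n b\n' > /tmp/sample_extract.dat
clean_extract /tmp/sample_extract.dat
```

A step like this typically runs as the "pre" half of the pre-/post-processing wrapper around the ETL batch, so rejected rows reflect real data issues rather than formatting noise.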

Education

Master's degree, Computer Science
  • at Islamia University
  • December 1997

Specialties & Skills

Database
Data Warehousing
DataStage
Oracle / DB2 / MSSQL / Teradata
Netezza

Languages

English
Expert