Data Integration Center of Excellence

Our Data Integration CoE offers a comprehensive range of services that provide our clients with the information they need, in the appropriate format, so that their time is spent effectively on making accurate and timely business decisions.

Our Data Integration services are based on industry best practices, methodologies, rich domain expertise and experience across similar engagements.

Our Data Integration solutions are delivered by a dedicated pool of data integration specialists, domain experts, business analysts and technical architects, backed by our time-tested CoE processes.

  • Over 500 Data Integration service professionals in our CoE provide high-end design, architecture and technical leadership
  • Proven solution accelerators and frameworks increase productivity and decrease total cost for EIM implementations and improve time to market
  • Reusable components that accelerate project delivery
  • Substantial cost reduction realized through our global delivery model
  • Right-sized, high-touch, client-centric relationship management, providing a superior customer experience

Our tools and accelerators

  • Data model accelerators: Domain-focused data model accelerators, developed to the logical data model level, with provision to add data elements that may be required later
  • ETL accelerators:
    • Top N time consuming job monitor: Automated mechanism to identify resource-consuming ETL jobs and initiate performance tuning efforts
    • Job compare tool: Automates tracking of version changes for ETL jobs; compares versions of a job to enable quick rollout / rollback of jobs resulting from a change request
    • Impact analysis tool: Development aid providing impact analysis, search utilities, parameter listing and job complexity metrics
    • ETL test automation tool: Automated way to compare and perform metadata validation; an automated verification and validation process ensures 100% accuracy in less time
    • ETL jobs / load statistics tool: Automates the monitoring of DataStage jobs through a UNIX scripting solution that collects job statistics once the daily load completes
    • DB rejects capture tool: Captures records that were missed during loading because of database constraints, identifies incomplete data loads at the table / row level and provides a summary of why there is a data reconciliation mismatch
    • Attributes reconciliation: Provides an automated, on-demand flash report of actual attribute counts across projects; a rule-engine-based workflow with configurable counting algorithms saves reconciliation time
  • ETL code review tool:
    • Enables standardization which ensures code quality and effective job implementation
    • Scalable and configurable solution that can be extended across projects and scenarios
    • Batch processing enables review of multiple Talend objects in one shot
    • Provides accurate error details on the failed objects in the code review report
    • Eight-fold increase in productivity
    • Ability to review changes that impact multiple jobs
  • Data validation test tool:
    • Data validation covers verifying that all records, all fields and complete data for each field are loaded, and checks source and target data for counts and completeness
    • Automated way to compare and perform metadata validation
  • Delivery assurance tools, templates:
    • EIM Design review checklist
    • INFA, Oracle, Teradata, Talend, Ab Initio, DataStage - development standards and best practices
    • Development and deployment checklists
  • ETL migration framework:
    • Common ETL migration framework that can be extended across tools: SSIS to Informatica and Ab Initio to Talend, etc.
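To illustrate the kind of automation behind the Top N time consuming job monitor described above, here is a minimal Python sketch. It assumes job run times have already been parsed from the ETL engine's run logs; the job names and durations are hypothetical, not actual client data.

```python
import heapq

def top_n_jobs(job_stats, n=5):
    """Return the n most time-consuming jobs, longest first.

    job_stats: iterable of (job_name, elapsed_seconds) pairs,
    e.g. parsed from ETL engine run logs (illustrative format).
    """
    return heapq.nlargest(n, job_stats, key=lambda item: item[1])

# Hypothetical run statistics for one daily batch
stats = [("load_customers", 420), ("load_orders", 1810),
         ("build_aggregates", 950), ("publish_marts", 300)]

# The jobs flagged here would then be queued for performance tuning.
print(top_n_jobs(stats, n=2))
```

In practice the monitor would run on a schedule after each batch window and feed the flagged jobs into the performance tuning backlog.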
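The count-and-completeness checks performed by the data validation test tool above could be approximated as follows. This is a simplified sketch, not the actual tool: the row representation (dicts) and the shape of the findings report are assumptions made for illustration.

```python
def validate_load(source_rows, target_rows, required_fields):
    """Compare source vs. target for row counts and field completeness.

    Rows are dicts; every required field must be present and non-empty
    in each target row. Returns a findings report (illustrative layout).
    """
    findings = {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
        "count_match": len(source_rows) == len(target_rows),
        "incomplete_rows": [],   # (row index, missing fields)
    }
    for i, row in enumerate(target_rows):
        missing = [f for f in required_fields if row.get(f) in (None, "")]
        if missing:
            findings["incomplete_rows"].append((i, missing))
    return findings
```

A real implementation would compare counts per table directly against the source and target databases rather than materializing rows in memory.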
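Similarly, the summary produced by the DB rejects capture tool could be sketched as a grouping of rejected rows by the database constraint that caused the rejection. The record fields (`table`, `row_id`, `constraint`) are illustrative assumptions about how a reject file might be structured.

```python
from collections import Counter

def summarize_rejects(reject_records):
    """Summarize rows rejected during loading, grouped by the
    database constraint that caused the rejection (illustrative)."""
    by_constraint = Counter(r["constraint"] for r in reject_records)
    return {
        "total_rejected": len(reject_records),
        "by_constraint": dict(by_constraint),
    }

# Hypothetical reject file contents for one table load
rejects = [
    {"table": "orders", "row_id": 101, "constraint": "FK_customer"},
    {"table": "orders", "row_id": 205, "constraint": "NOT_NULL_order_date"},
    {"table": "orders", "row_id": 318, "constraint": "FK_customer"},
]
```

Such a summary is what lets the tool explain a reconciliation mismatch at the table / row level rather than reporting only a bare count difference.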