Virtusa uses Artificial Intelligence (AI) technology to conduct candidate pre-employment assessments. This AI may include both text-based and voice-based functionality. The AI technology will have access to the data that you upload or provide during your interaction with the AI bot and will use that data solely to perform assessments in accordance with Virtusa policies and practices. The AI is not used to substantially assist or replace discretionary employment decisions. Read our full employee privacy policy here.
This role focuses on ingesting operational data from source systems (primarily PostgreSQL), transforming and modeling that data in the data warehouse (currently Snowflake), and preparing reporting-ready datasets and optimized queries for analytics and reporting services.
The engineer partners closely with product and software engineering teams to ensure data is reliable, performant, and easy to consume - whether through optimized SQL queries or analytics-friendly data models.
This role combines deep technical expertise in data engineering with ownership of data architecture, a version-controlled database codebase, and support for multiple environments.
Responsibilities:
● Design, develop, and maintain a scalable data warehouse architecture to support analytics and reporting needs.
● Build and manage data ingestion pipelines that move operational data (from PostgreSQL) into the data warehouse (Snowflake) with high reliability and data quality.
● Transform and model raw data into reporting-friendly schemas (e.g., dimensional models, denormalized datasets, or analytics-optimized structures).
● Collaborate with stakeholders to translate business requirements into scalable data solutions.
● Establish and promote data engineering best practices, including naming conventions, documentation, testing, and performance optimization.
● Monitor data pipelines and warehouse performance, proactively identifying and resolving data quality, latency, or scalability issues.
● Contribute to architectural decisions around data modeling, ingestion patterns, and warehouse optimization.
● Participate in agile development processes and perform additional tasks within the department as assigned by management.
Qualifications:
● 4–8 years of professional experience, with strong knowledge of data modeling concepts (e.g., dimensional modeling, star/snowflake schemas, reporting-optimized structures).
● Familiarity with ELT/ETL concepts, data pipelines, and orchestration practices.
● Experience ingesting and transforming data from relational databases, particularly PostgreSQL.
● Solid experience with Git-based version control for database code and data transformations.
● Experience supporting multiple environments (development, staging, production) with controlled deployment processes.
● Strong problem-solving skills.
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 30,000 people globally that cares about your growth — one that seeks to provide you with exciting projects and opportunities to work with state-of-the-art technologies throughout your career with us.
Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.
Virtusa is an Equal Opportunity Employer. All applicants will receive fair and impartial treatment without regard to race, color, religion, sex, national origin, ancestry, age, legally protected physical or mental disability, protected veteran status, status in the U.S. uniformed services, sexual orientation, gender identity or expression, marital status, genetic information or on any other basis which is protected under applicable federal, state or local law.
Applicants may be required to attend interviews in person or by video conference. In addition, candidates may be required to present their current state or government-issued ID during each interview. All candidates must be authorized to work in the USA.
Have any questions?
To join our bright team of professionals, you can apply directly to our website under the Careers tab and search all open jobs. https://www.virtusa.com/careers
Yes, you can. Virtusa gives you the flexibility to apply for multiple open positions that excite you and align with your experience and career goals.
Yes, you can. Virtusa is a global company, and we serve our clients through our global delivery model.
Our dedicated recruitment team will review your online application and match it to all our open jobs. We update our open jobs daily and encourage you to check back often.
Our team of recruiters will review your application, relevant job experience, and skills to align it to our open jobs. From there, the recruitment team will contact qualified candidates to start the interview process.
Want to explore the ways you can engineer your career in technology? Our thought leaders share key career insights for candidates from entry-level job seekers to senior technologists.