Consultant

Hyderabad, AP, IN

Consultant - (CREQ138158)

Description

Snowflake data engineers will be responsible for architecting and implementing very large-scale data intelligence solutions around the Snowflake Data Warehouse.
Experience with design patterns for data lakes and data warehouses on Snowflake, as well as experience with real-time API solutioning, is expected.
Solid experience with, and an understanding of, architecting, designing, and operationalizing large-scale data and analytics solutions on the Snowflake Cloud Data Warehouse is a must.
Professional knowledge of Teradata is required. Responsibilities include:
- Developing ETL pipelines in and out of the data warehouse using a combination of Python and Snowflake's SnowSQL
- Writing SQL queries and stored procedures against Snowflake
- Developing scripts (Unix, Python, etc.) to extract, load, and transform data
- Applying working knowledge of Teradata and MS SQL Server
- Providing production support for data warehouse issues such as data load, transformation, and translation problems
- Translating BI and reporting requirements into database and reporting designs
- Understanding data transformation and translation requirements and which tools to leverage to get the job done
- Understanding data pipelines and modern, cloud-based approaches to automating them
- Testing and clearly documenting implementations so others can easily understand the requirements, implementation, and test conditions
Basic Qualifications
- Minimum 2 years of designing and implementing a fully operational, production-grade, large-scale data solution on the Snowflake Data Warehouse
- 3 to 4 years of hands-on experience building productionized data ingestion and processing pipelines using Java, Spark, Scala, Python, and AWS services
- 2 years of hands-on experience designing and implementing production-grade data warehousing solutions on large-scale data technologies such as Teradata, MS SQL Server, or DB2
- Expertise in and an excellent understanding of Snowflake internals and the integration of Snowflake with other data processing and reporting technologies
- Excellent presentation and communication skills, both written and verbal
- Ability to problem-solve and architect in an environment with unclear requirements

Required Skills
- Minimum 4 years of experience architecting large-scale data solutions: performing architectural assessments, crafting architectural options and analysis, and finalizing the preferred solution alternative in collaboration with IT and business stakeholders
- Experience building data ingestion pipelines using data streaming technologies such as Kafka and Informatica
- Experience working with AWS, Azure, and Google data services

Primary Location: IN-AP-Hyderabad

Schedule: Full Time

Job Type: Experienced

Travel: No

Job Posting: 27/07/2022, 11:12:54 AM