Hyderabad, AP, IN
Consultant - (CREQ130113)
Description

Your key responsibilities
You'll spend most of your time working with a wide variety of clients, using the latest big data technologies and practices to design, build, and maintain scalable, robust solutions that unify, enrich, and analyze data from multiple sources.
Skills and attributes for success
Designing, architecting, and developing solutions leveraging big data technology (open source, AWS, or Microsoft) to ingest, process, and analyze large, disparate data sets to exceed business requirements
Unifying, enriching, and analyzing customer data to derive insights and opportunities
Leveraging in-house data platforms as needed, and recommending and building new data platforms and data applications based on solution patterns as required to exceed business requirements
Clearly communicating findings, recommendations, and opportunities to improve data systems and solutions
Demonstrating a deep understanding of big data technology, concepts, tools, features, functions, and the benefits of different approaches
Seeking out information to learn about emerging methodologies and technologies
Clarifying problems by driving to understand the true issue
Looking for opportunities for improving methods and outcomes
Applying a data-driven approach to tying technology solutions to specific business outcomes
Collaborating, influencing, and building consensus through constructive relationships and effective listening
Solving problems by incorporating data into decision making
To qualify for the role, you must have
A bachelor's degree and approximately five years of related work experience, or a master's degree and approximately two years of related work experience
At least five years of hands-on experience with data integration development and big data technologies in either the Adobe or Microsoft technology stack
Data Pipeline/ETL development expertise using Azure Data Factory or SSIS
Experience with the Azure cloud stack, including but not limited to Azure Functions, Azure SQL Server, Azure Blob Storage, and Azure Cosmos DB
Hadoop, Spark, NoSQL, Streaming, Atlas, Sqoop, Hive
Cassandra, MongoDB, Redshift, Kafka, Spark
Proficiency coding in Python, Java, and C#
Experience with streaming platforms (Kafka, Spark, StreamSets, etc.) and complex event processing
Experience organizing, aggregating, querying, and analyzing large data sets
Strong communication skills: the ability to listen to and understand the question, then develop and deliver clear insights
Outstanding team player.
Independence and the ability to manage and prioritize workload
The ability to adapt quickly and positively to change
A valid driver's license in the US; willingness and ability to travel to meet client needs
Ideally, you'll also have
A bachelor's degree or above in mathematics, information systems, statistics, computer science, or a related discipline
Adobe platform experience, including Adobe Experience Platform (nice to have)
SAFe Agile methodology
Primary Location: IN-AP-Hyderabad
Schedule: Full Time
Job Type: Experienced
Job Posting: 27/07/2022, 4:27:23 PM