Description & Requirements
Essential Skills & Experience:
- Demonstrable experience in:
- Cloud platforms, specifically AWS and its services: EMR, S3, SQS, Lambda, EC2/SSM; plus Java.
- Data integration tools, specifically Apache Spark.
- Database programming languages, specifically SQL.
- Additionally, an understanding of Scala, Maven, and Git.
- Working knowledge of the SDLC.
- Ability to work closely within an onshore/offshore delivery model.
- Experience as part of an Agile delivery team (e.g. Scrum/Jira).
- Experience in data modelling.
- Strong communication and interpersonal skills.
- AWS Step Functions/Data Pipeline
- Hive/Presto/Athena
- Talend (optional)
- Experience in at least one ETL tool such as SAP BODS, SSIS, ODI, or Informatica
- AWS SSM (Systems Manager)
- Implementing IAM policies for various services at a granular level
- S3 lifecycle management
- CI/CD (Jenkins, Terraform, CodeDeploy, CodePipeline, etc.)
Desirable Skills & Experience:
- Experience in all things big data including some of the following: Zookeeper, HBase, HDFS, Yarn, Kafka, Spark, Cassandra, Elastic Stack, Graph Databases, Airflow.
- An understanding of the concepts and strategies of a Data Lake in the Cloud.
- Experience of ETL techniques and tools.
- Knowledge in multiple programming languages including Java, Scala, Python and Shell Scripting.
- Experience of building unit tests, integration tests, system tests and acceptance tests.
- Business Objects
- Qlik
Experience: 8+ years (relevant experience)
Bachelor's/Master's degree in IT, Computer Science/Engineering, Mathematics, or Data Analytics