Supercharge Your Digital Future
Find your fit and supercharge progress with our exciting career opportunities
Big Data with Kafka
Roles and Responsibilities
- Develop and deploy big data pipelines in a cloud environment using the Snowflake cloud data warehouse; design and develop ETL, and migrate existing on-prem ETL routines to the cloud service
- Interact with senior leaders, understand their business goals, and contribute to the delivery of workstreams
- Design and optimize model code for faster execution
- Be willing to travel for consulting engagements
- Communicate very well and be able to interact with customers directly
- Demonstrate good troubleshooting and analytical skills
- Conceptualize new products, features, or components, and test new technologies or features through PoCs
- Contribute to practice growth initiatives: interviews, training, mentoring, operational activities, etc.
- Create documentation of operational tasks, procedures and automated processes.
- Fix faults and restore operability, or carry out suitable measures (e.g., creating a workaround)
Required Technical and Professional Expertise
- 5+ years of industry experience in the Big Data / Hadoop field
- Proven track record in big data architecture, solutioning, and consulting
- Technology stack: Spark, Kafka, Cassandra, HBase, Hadoop, HDFS, MapReduce
- Programming skills: experience with Python, Scala, and Java
- Experience with CI/CD, JIRA, and automation testing is preferable
- Strong thought leadership capabilities, such as blogs, open-source contributions, whitepapers, research paper publications, and forum participation
- Proven track record in competency development, innovation, and value addition
- Prior experience working within a CoE is preferable.
Apply Now
Discover Opportunities
Couldn't find the right role?
Leave your resume with us and we'll get back when a suitable role opens up.
Click Here
Are you a recent engineering graduate?
Our fresher hiring team is looking for talented engineers just like you.
Click Here