Scala-Spark Developer | HCLTech

Scala-Spark Developer

Roles and Responsibilities:

  • Excellent Scala/Spark development skills.
  • Experience with the Hadoop ecosystem (HDFS, Hive, Impala) and file formats such as Avro and Parquet.
  • Hands-on experience with SQL using both traditional and in-memory databases (e.g. Hazelcast).
  • Good understanding of functional programming paradigms.
  • Experience with OOP Design Patterns and best practices.
  • Experience with version control solutions such as Git, SVN, or Bitbucket.
  • Experience with real-time analytics applications, microservices, and ETL development.

Nice to have:

  • Experience with Google Cloud services such as BigQuery, Dataproc, and Dataflow.
  • Experience with Spark performance tuning.
  • Understanding of Oozie and the Cloudera stack.
  • Good knowledge of shell/Bash scripting.
  • Knowledge of Kafka.
  • Streaming experience: Apache Beam, Spark Streaming, Kafka Streams, Flink.
  • Practical experience with monitoring, logging, and operations tooling (e.g., Splunk, Prometheus, Grafana, ELK, or the GCP stack for SRE).
  • Experience working with Agile/Scrum methodologies.

Preferred Qualifications:

  • Bachelor's degree in Computer Science or a related field.

Required Skills:

  • Extensive experience developing Big Data applications in an enterprise environment.
  • 2+ years of relevant software development experience in Scala/Spark or Java.
  • Excellent communication skills and strong written and spoken English.
  • Strong teamwork, with a collaborative, innovative, and intelligent approach to problem solving.
  • Able to work under pressure while remaining delivery-focused and pragmatic.
  • Experience in financial services.

Key Skills Required:

  • Scala + Spark.
