
Sr. Data Engineer, Mexico


Responsibilities:

  • Building pipelines using Apache Beam or Spark
  • Familiarity with core provider services from AWS, Azure, or GCP, preferably having supported deployments on one or more of these platforms
  • Mentor junior Data Engineers on the team

Required:

  • 5+ years of experience with Hadoop (Cloudera) or Cloud Technologies
  • Experience with all aspects of DevOps (source control, continuous integration, deployments, etc.)
  • Experience with containerization and related technologies (e.g. Docker, Kubernetes)
  • Experience with other open-source technologies such as Druid, Elasticsearch, and Logstash is a plus
  • Advanced knowledge of the Hadoop ecosystem and Big Data technologies
  • Hands-on experience with the Hadoop ecosystem (HDFS, MapReduce, Hive, Pig, Impala, Spark, Kafka, Kudu, Solr)
  • Knowledge of agile (Scrum) development methodology is a plus
  • Strong development/automation skills
  • Proficient in programming in Java or Python; prior Apache Beam/Spark experience is a plus
  • System-level understanding of data structures, algorithms, and distributed storage & compute
  • Can-do attitude toward solving complex business problems; good interpersonal and teamwork skills

Preferred:

  • Angular 4 and React development expertise in an up-to-date Java development environment with cloud technologies
  • Exposure to and/or development experience with microservices architecture best practices, the Java Spring Boot Framework (preferred), Docker, and Kubernetes
  • Experience with REST APIs, services, and API authentication schemes
  • Knowledge in RDBMS and NoSQL technologies
  • Exposure to multiple programming languages
  • Knowledge of modern CI/CD, TDD, and frequent-release technologies and processes (Docker, Kubernetes, Jenkins)
  • Exposure to mobile programming is a plus

Education:

Bachelor’s degree in Computer Science or similar field

To apply for this job email your details to careers.mx@worldlink.mx