Core Responsibilities

Responsibilities include designing, building, and maintaining a robust, self-service, scalable, and secure data platform, along with end-to-end data pipelines that enable strategic decision-making by Data Analysts and Data Scientists. This involves creating high-performance data platforms, developing data models, managing the full code development lifecycle, and promoting a data-driven culture.

Requirements

Candidates must have 7 or more years of experience in Data Engineering, specializing in Big Data technologies such as Spark and Hadoop, and in multiple programming languages including Python, with proven experience using Airflow, AWS, and Databricks. A strong foundation in software engineering principles, familiarity with data lakehouse concepts, and proficiency in English are essential.

Additional Information

Experience Level

10+ years

Job Language

Spanish

Work Mode

Hybrid