Minimum Experience and Key Competencies:
– 4+ years of professional experience in data engineering and data platform architecture, or a related role, preferably within the financial services or technology sectors
– Advanced proficiency in Python, SQL, and Scala; deep expertise in Apache Spark, Kafka, Airflow, and dbt
– Proven track record of designing and implementing large-scale data pipelines that process high-volume datasets
– Hands-on experience running production data workloads on cloud platforms (AWS, GCP, or Azure)
– Practical experience with data cataloging tools, metadata management, data quality frameworks, encryption, and compliance automation