Bachelor’s degree in Computer Science, Engineering, or a related field; Master’s degree preferred.
5+ years of experience in the software development industry, preferably with data engineering, data warehousing, or data analytics companies and teams.
2+ years of experience with the Databricks ecosystem.
Experienced in designing and implementing complex, scalable data pipelines/ETL processes using Databricks.
Skilled in cloud-based data storage and processing technologies, particularly AWS services such as S3, Step Functions, and Lambda, as well as Airflow.
Familiar with CI/CD practices, version control (Git), automated testing, and Agile environments.
Experience with the Agile development process in a distributed engineering team.
Ability to articulate ideas clearly, present findings persuasively, and build rapport with clients and team members.
Experience working in US-led high-tech companies and startups.
Good communication and leadership skills.
Nice to Have
Databricks certifications
AWS DevOps Engineer, Developer, or Solutions Architect certifications