Posted: 15 hours ago
Job Description
<h3>Job Description</h3><p>Hybrid onsite 2 days/week or 8 days/month<br />1 interview will be onsite (two-interview process)<br /><strong>Databricks is a must</strong><br />AWS or Azure<br />Must be able to do code reviews of Python code<br />Create designs and validate code<br />"Own" the architecture<br /></p><p>As the Data Engineering Manager, you will be responsible for architecting, implementing, and optimizing end-to-end data solutions on Databricks while integrating with core AWS services. You will lead a technical team of data engineers, ensuring best practices in performance, security, and scalability. This role requires a deep, hands-on understanding of Databricks internals and a track record of delivering large-scale data platforms in a cloud environment.</p><ul><li><p>Lead a team of data engineers in the architecture and maintenance of the Databricks Lakehouse platform, ensuring optimal platform performance and efficient data versioning using Delta Lake</p></li><li><p>Manage and optimize Databricks infrastructure, including cluster lifecycle, cost optimization, and integration with AWS services (S3, Glue, Lambda)</p></li><li><p>Design and implement scalable ETL/ELT frameworks and data pipelines using Spark (Python/Scala), incorporating streaming capabilities where needed</p></li><li><p>Drive technical excellence through advanced performance tuning of Spark jobs, cluster configurations, and I/O optimization for large-scale data processing</p></li><li><p>Implement robust security and governance frameworks using Unity Catalog, ensuring compliance with industry standards and internal policies</p></li><li><p>Lead and mentor data engineering teams, conduct code reviews, and champion Agile development practices while serving as technical liaison across departments</p></li><li><p>Establish and maintain comprehensive monitoring solutions for data pipeline reliability, including SLAs, KPIs, and alerting mechanisms</p></li><li><p>Configure and manage 
end-to-end CI/CD workflows using source control, automated testing, and automated deployments</p></li></ul><strong>Qualifications</strong><p><strong>Your Role:</strong></p><ul><li><p>You have a Bachelor’s Degree in Engineering, Computer Science, or equivalent.</p></li><li><p>5+ years of hands-on experience with Databricks and Apache Spark, demonstrating expertise in building and maintaining production-grade data pipelines</p></li><li><p>Proven experience leading and mentoring data engineering teams in complex, fast-paced environments</p></li><li><p>Extensive experience with AWS cloud services (S3, EC2, Glue, EMR, Lambda, Step Functions)</p></li><li><p>Strong programming proficiency in Python (PySpark) or Scala, and advanced SQL skills for analytics and data modeling</p></li><li><p>Demonstrated expertise in infrastructure as code using Terraform or AWS CloudFormation for cloud resource management</p></li><li><p>Strong background in data warehousing concepts, dimensional modeling, and experience with RDBMS systems (e.g., Postgres, Redshift)</p></li><li><p>Proficiency with version control systems (Git) and CI/CD pipelines, including automated testing and deployment workflows</p></li><li><p>Excellent communication and stakeholder management skills, with a demonstrated ability to translate complex technical concepts into business terms</p></li><li><p>Demonstrated use of AI in the development lifecycle</p></li><li><p>Some travel to the US may be required</p></li><li><p>Knowledge of the financial industry is preferred</p></li></ul>