Databricks Architect


Job Details

Hiring on behalf of a client

Databricks Architect. Full-time hire; requires travel to the client's office in Dublin, OH up to 2 days per week.

Candidates local to the Dublin, OH area, or willing to commute to that area up to 2 times per week, are preferable.

Role & Responsibilities Overview:

- Lead and/or assist in designing and developing data systems, tailoring solutions to meet client-specific requirements
- Design and implement Databricks-based solutions with a focus on distributed data processing, data partitioning, and optimization for parallelism
- Engage with the client to evaluate current and future needs, crafting bespoke solution architectures and providing strategic recommendations
- Develop comprehensive architecture solution roadmaps integrating client business processes and technologies
- Define and enforce coding standards for ETL processes, ensuring maintainability, reusability, and adherence to best practices
- Architect and implement CI/CD pipelines for Databricks notebooks and jobs, ensuring testing, versioning, and deployment
- Define disaster recovery strategies for Databricks environments, ensuring data resilience and minimal downtime in case of failure
- Innovate and expand solution offerings to address data challenges
- Advise stakeholders on data cloud platform architecture optimization, focusing on performance
- Apply Scrum and Agile methodologies to coordinate global delivery teams, run Scrum ceremonies, manage backlog items, and handle escalations
- Integrate data across different systems and platforms
- Use strong verbal and written communication skills to manage client discussions

Candidate Profile:

- 5+ years of experience in architecture, design, and implementation using Databricks
- Experience in designing and implementing scalable, fault-tolerant systems
- Deep understanding of one or more big data compute technologies, such as Databricks or Snowflake
- Demonstrated experience deploying Databricks on cloud platforms, including advanced configurations
- In-depth knowledge of Spark internals, Catalyst optimization, and the Databricks runtime environment
- Must have experience implementing solutions using Databricks
- Experience in Insurance (P&C) is good to have
- Programming languages: SQL, Python
- Technologies: Databricks, Delta Lake storage, Spark (PySpark, Spark SQL)
  - Good to have: Airflow, Splunk, Kubernetes, Power BI, Git, Azure DevOps
- Project management using Agile, Scrum
- B.S. degree in a data-centric field (Mathematics, Economics, Computer Science, Engineering, or other science field), Information Systems, Information Processing, or Engineering
- Excellent communication and leadership skills, with the ability to lead and motivate team members
- Ability to work independently with some level of ambiguity and juggle multiple demands







 04/24/2024

Dublin, OH