Senior Databricks Developer

Experience: 8+ years
Type: Hybrid
Location: Bangalore and Hyderabad
Notice-period: Immediate/15 days
Technology: Data Engineering & Big Data Solutions

Job Overview:
We are looking for an experienced and dynamic Databricks Developer with strong expertise in designing and implementing scalable big data solutions on cloud platforms. The ideal candidate will take ownership of the data engineering pipeline, provide technical leadership, and collaborate with cross-functional teams to drive data strategy and architecture using the Databricks Lakehouse platform.

Key Responsibilities:
Technical Leadership: Mentor and guide team members on Databricks best practices, solution design, and development workflows.
Architecture & Design: Design and build scalable, fault-tolerant data pipelines and solutions using Databricks and Delta Lake architecture.
Data Engineering: Lead ingestion, transformation, and processing of batch and streaming data at scale.
Cloud Integration: Deploy and manage Databricks environments on Azure, AWS, or GCP.
Performance Optimization: Monitor workloads, identify bottlenecks, and optimize Apache Spark jobs for cost and efficiency.
Monitoring & Automation: Set up workflow automation, CI/CD, and monitoring tools to ensure operational efficiency.
Security & Compliance: Enforce best practices in data governance, access control, and regulatory compliance (including Unity Catalog).
Stakeholder Collaboration: Work closely with data analysts, engineers, and business stakeholders to translate business needs into technical solutions.

Required Skills & Qualifications:
8+ years of total IT experience, including 6+ years with Databricks and Apache Spark.
Proficient in Python, Scala, and SQL for data processing.
Strong experience with Delta Lake, Unity Catalog, and MLflow.
Hands-on experience with ETL/ELT processes, data modeling, and data quality frameworks.
Deep understanding of big data frameworks, Spark internals, and distributed systems.
Experience working in cloud environments (Azure, AWS, or GCP) for data platform implementations.
Strong analytical and problem-solving abilities, with a focus on performance and scalability.
Excellent communication skills and the ability to lead technical discussions.

Preferred Qualifications:
Databricks Certified Developer or Architect certification
Experience in implementing machine learning workflows using MLflow
Familiarity with orchestration and integration tools such as Airflow, Azure Data Factory, and AWS Glue
Exposure to Agile/Scrum methodologies

Why Join Us?
Work on cutting-edge data engineering projects across cloud platforms
Collaborate with a high-performing team in a fast-paced, innovation-driven environment
Competitive compensation and strong career advancement opportunities

Apply for this position

Allowed Type(s): .pdf, .doc, .docx