GCP / Databricks Platform Engineer 🚀
Experience: 6–8 years
Are you passionate about building scalable data infrastructure and driving innovation in cloud-native platforms? Join our team as a Data Platform Engineer and play a key role in designing, managing, and optimizing enterprise-scale data platforms.
🔹 Responsibilities include:
Administration & cluster management with Databricks and Unity Catalog on GCP
Expertise in BigQuery, Cloud Storage, Cloud Functions, Cloud Run, Pub/Sub, and IAM
Infrastructure as Code using Terraform and automated deployment pipelines
Containerization (Docker, Kubernetes, GKE) and microservices architecture
CI/CD (GitHub) with automated build scripts for release management
Data pipeline orchestration with Airflow
Strong programming skills in Python and PySpark
Optimizing platform performance, writing runbooks, and implementing operational best practices
🔹 What we’re looking for:
6–8 years of total experience
2–3 years of hands-on experience in GCP and/or Databricks
2–3 years of coding experience in Python & PySpark
Experience in automating and monitoring large-scale distributed systems
If you’re ready to make an impact by improving data platforms and mentoring teams, we’d love to hear from you!