Bitsight is seeking a highly skilled Databricks Expert Data Engineer to support the integration of Bitsight data feeds into a key partner's internal databases and platforms. The role involves building data pipelines in Databricks and collaborating with stakeholders to deliver reliable, production-ready data solutions.
Responsibilities:
- Build, optimize, and maintain end-to-end data pipelines in Databricks using PySpark, Delta Lake, SQL, and related tooling
- Ingest, transform, and operationalize Bitsight data feeds into partner-owned data environments (databases, analytics platforms, BI dashboards, automated workflows)
- Design scalable data models and schemas aligned to partner requirements and use-case needs
- Implement robust data quality, monitoring, and validation processes
- Collaborate closely with partner engineering, BI, and product teams to understand requirements and translate them into reliable, production-ready data solutions
- Ensure integrations adhere to security, governance, compliance, and performance best practices
- Troubleshoot and resolve issues across the ingestion, processing, and delivery layers
Requirements:
- 5+ years of hands-on experience with Databricks (direct development of pipelines and transformations, not just administration)
- Expert-level knowledge of PySpark, Spark SQL, Delta Lake, and Databricks Workflows
- Proven experience integrating external data feeds into enterprise data platforms
- Strong understanding of data modeling, ETL/ELT design, and distributed computing principles
- Ability to work independently with minimal guidance in a fast-paced consulting environment
- Strong communication skills for working with both technical and non-technical stakeholders
- Experience with cloud platforms (AWS, Azure, or GCP) and associated data services
- Familiarity with BI / visualization tools (Tableau, Power BI, Looker, etc.)
- Experience supporting security-focused or risk-oriented data products is a plus