Bankrate is looking for a Staff Data Engineer to shape the next generation of their data platform—powering analytics and insights that help millions of people make smarter financial decisions. The role involves designing and scaling data systems, mentoring engineers, and delivering reliable data solutions.
Responsibilities:
- Design, build, and optimize data pipelines and infrastructure in Databricks and AWS (e.g., S3, EC2, Lambda)
- Partner with analytics and product teams to translate business needs into scalable data models and solutions
- Develop automated workflows for data ingestion, transformation, and validation
- Implement and promote best practices in data architecture, security, observability, and governance
- Collaborate with data stakeholders on privacy-safe data sharing solutions, including clean-room environments and other secure data exchange methods
- Lead design and code reviews, mentor engineers, and raise technical standards across teams
- Drive platform improvements that enhance performance, reliability, and cost efficiency
- Contribute to long-term data strategy, evaluating new tools and technologies to advance our data ecosystem
Requirements:
- Must be able to work Eastern Time (ET) hours and be based in the United States
- This role is not open to visa sponsorship or transfer of sponsorship, including for candidates on OPT or the STEM OPT extension, nor is it available on a corp-to-corp basis
- 7+ years of data-engineering experience designing and deploying large-scale data solutions
- Deep expertise in SQL and strong software engineering skills in a modern language such as Python or Scala, with an emphasis on building reliable, performant data pipelines
- Hands-on experience with Databricks, Spark, and AWS cloud services
- Strong understanding of data modeling, ELT/ETL design, and distributed data systems
- Familiarity with CI/CD, infrastructure-as-code, and orchestration tools (e.g., GitHub Actions, Terraform, Airflow)
- Experience supporting BI and analytics platforms such as Looker
- Ability to balance technical depth with business context, clearly communicating complex ideas to diverse audiences
- Track record of mentorship, collaboration, and continuous improvement
- No formal degree required—relevant professional experience and technical excellence are what matter most
- Experience modernizing or migrating large-scale data environments to Databricks or AWS
- Familiarity with data governance, cataloging, and access-control frameworks
- Exposure to financial, consumer, or content data domains
- Proven success influencing technical direction and driving adoption of best practices across teams