Design, build, and scale cloud-based data platforms and pipelines (batch and streaming) to support analytics, product, and operational use cases.
Own critical data workflows, ensuring reliability, performance, and quality across the data ecosystem.
Implement modern data solutions using tools such as dbt and Databricks, ELT frameworks, and cloud platforms (AWS, Snowflake).
Apply best practices in CI/CD, Git, testing, monitoring, and infrastructure-as-code.
Collaborate effectively with cross-functional teams, balancing technical rigor with strong interpersonal skills to translate business needs into scalable, accessible data solutions.
Lead technical initiatives with empathy and influence, mentoring team members, fostering trust, and promoting shared standards and engineering excellence.
Requirements
6+ years building and scaling cloud-based data platforms and pipelines.
Strong SQL and Python skills, with a focus on automation, performance, and reliability.
Experience with large-scale datasets, both structured and unstructured.
Hands-on experience with modern data stack tools (dbt, Airflow, Fivetran, Databricks) and streaming/event-driven systems (Kafka, Kinesis).
Solid experience with AWS data services and ELT/ETL workflows.
Proficiency in CI/CD, Git, and infrastructure-as-code.
Proven ability to lead projects, influence stakeholders, and mentor others.
Tech Stack
Airflow
AWS
Cloud
ETL
Kafka
Python
SQL
Benefits
Comprehensive medical, dental, and vision insurance
Basic Life Insurance at no cost to the employee
Company-paid short-term and long-term disability
12 weeks of 100% paid parental leave
Health Savings Account (HSA)
Flexible Spending Accounts (FSA)
Retirement savings plan
Personal Paid Time Off
Paid holidays and company-wide Wellness Day off
Paid time off to volunteer at nonprofit organizations