Responsibilities
Work closely with business product owners, data scientists, analysts, and cross-functional stakeholders to understand the business's data needs.
Build technical solutions and adhere to technical architecture standards.
Create efficient and effective data solutions in collaboration with contractors.
Utilize Azure Data Factory, Databricks, Python, Spark, and other tools to develop and deploy robust data pipelines, data models, and dynamic data solutions.
Develop high-quality code to process and move vast amounts of data and integrate systems across diverse technical platforms.
Implement data improvements and optimization techniques to enhance the data products.
Build and maintain data pipelines to integrate data from various source systems.
Optimize data pipelines for performance, reliability, and cost-effectiveness.
Collaborate with data governance stewards and business stakeholders to enforce data quality rules and support data cataloging activities.
Develop and manage business intelligence solutions to transform data into insights.
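The pipeline work described above follows a standard extract-transform-load shape. A minimal sketch in pandas (one of the tools listed below) illustrates the pattern; the column names, sample data, and cleaning rules are illustrative assumptions, not part of any actual system:

```python
import pandas as pd

def extract() -> pd.DataFrame:
    # Extract: stand-in for reading from a source system (database, API, files)
    return pd.DataFrame({
        "order_id": [1, 2, 2, 3],
        "amount": [100.0, 250.0, 250.0, None],
    })

def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Transform: deduplicate on the key and drop rows missing required fields
    return df.drop_duplicates(subset="order_id").dropna(subset=["amount"])

def load(df: pd.DataFrame) -> pd.DataFrame:
    # Load: stand-in for writing to a warehouse table; here we just return the frame
    return df.reset_index(drop=True)

clean = load(transform(extract()))
```

In production, the extract and load steps would target real source systems and a warehouse or lakehouse, typically orchestrated by a tool such as Azure Data Factory.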
Requirements
Bachelor’s degree in Computer Science, Information Systems, or a related field with 3–5 years of experience, or a Master’s/PhD with 1–3 years of experience.
Hands-on experience designing and delivering data solutions such as data warehouses, data lakes, or lakehouse architectures.
2–3 years of experience working with cloud data platforms (e.g., Microsoft Azure, AWS, or similar).
Strong experience building, deploying, and supporting end-to-end data pipelines, including ETL/ELT processes.
Proficiency in SQL, Python, and Spark (including PySpark and Pandas) for data processing and transformation.
Working knowledge of modern cloud data tools such as Azure Data Factory, Databricks, Delta Lake, or equivalent technologies.
Solid understanding of data modeling, dimensional modeling, performance optimization, and medallion/lakehouse architectures (Bronze, Silver, Gold).
Experience developing and maintaining BI reports and dashboards using tools such as Power BI, Tableau, or similar platforms.
Familiarity with data security and governance concepts, including row-level security, data masking, and access controls.
Ability to collaborate effectively with senior engineers, architects, and business stakeholders to deliver scalable, reliable data solutions.
Strong problem-solving, communication, and documentation skills, with the ability to translate business needs into technical solutions.
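As one illustration of the medallion (Bronze, Silver, Gold) pattern named in the requirements, here is a minimal pandas sketch; the table contents and cleaning rules are assumptions, and in practice the layers would live in Spark/Delta Lake rather than in-memory frames:

```python
import pandas as pd

# Bronze: raw ingested records, kept as-is (duplicates, nulls, untyped strings)
bronze = pd.DataFrame({
    "customer": ["a", "a", "b", None],
    "spend": ["10", "10", "20", "5"],
})

# Silver: cleaned and conformed — bad rows dropped, duplicates removed, columns typed
silver = (
    bronze.dropna(subset=["customer"])
          .drop_duplicates()
          .assign(spend=lambda d: d["spend"].astype(float))
)

# Gold: business-level aggregate ready for BI consumption (e.g., a Power BI dataset)
gold = silver.groupby("customer", as_index=False)["spend"].sum()
```

The same layering applies unchanged in PySpark: each stage reads the previous layer's table and writes its own, so quality improves monotonically from Bronze to Gold.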
Tech Stack
AWS
Azure
Cloud
ETL
Pandas
PySpark
Python
Spark
SQL
Tableau
Benefits
Robust health plans
A market-leading 401(k) program with a company match
Flexible time off benefits (including half-day summer Fridays depending on location)