Celero Commerce is building smarter, data-driven solutions to power the future of payments and is seeking a talented Data Engineer to join its expanding analytics team. The role involves designing, building, and maintaining modern data infrastructure, developing reliable data pipelines, and enabling self-service analytics to support decision-making across the organization.
Responsibilities:
- Design and develop scalable data pipelines using Microsoft Fabric Lakehouse, Pipelines, and Dataflows
- Build and maintain ETL/ELT processes using Python, PySpark, and Fabric Notebooks
- Ingest and transform data from diverse sources (APIs, SQL databases, flat files) into OneLake and curate clean, analytics-ready tables
- Develop and maintain semantic models and Power BI datasets to support reporting and analytics
- Partner closely with analysts, data scientists, and business stakeholders to translate requirements into high-quality technical solutions
- Implement data quality checks, monitoring, and validation to ensure reliable and observable pipelines
- Optimize performance of transformations and queries within Fabric and Power BI
- Document data models, pipeline logic, and operational processes to ensure maintainability
Requirements:
- 3+ years of experience in a data engineering or data integration role
- Hands-on experience with Microsoft Fabric components (Lakehouse, Pipelines, Notebooks, or OneLake), or equivalent data engineering tools and a willingness to ramp up on Fabric quickly
- Proficient in Python for data transformation, automation, and scripting
- Solid experience with Power BI, including dataset modeling, DAX, and performance tuning
- Strong SQL skills, including the ability to write complex queries for reporting, debugging, and identifying data discrepancies; experience with data modeling (dimensional/star schema) and data warehousing principles
- Experience with Spark or PySpark in big data contexts
- Understanding of cloud platforms (preferably Azure) and data storage formats (Parquet, Delta, JSON, etc.)
- Experience in the FinTech and/or Merchant Services industry
- Exposure to CI/CD practices for data pipelines and version control (Git)
- Familiarity with data governance, security, and compliance frameworks (e.g., PCI, SAC)
- Experience with orchestration tools such as Fabric pipelines, ADF, or Airflow