Responsibilities
Design, develop, and maintain scalable ETL/ELT pipelines using Azure Data Factory (ADF)
Configure and manage Linked Services, Datasets, and Pipelines in ADF
Develop and optimize data transformation workflows using Azure Databricks (PySpark)
Work across Lakehouse architecture layers (Bronze/Silver/Gold), storage accounts, Unity Catalog, and support metadata‑driven design (control tables, mappings)
Apply strong technical expertise across Power BI, SQL, Power Platform tools, and Azure services
Build and manage dbt models for data transformation, testing, and documentation
Implement real-time data ingestion using Azure Event Hub
Integrate data from multiple sources (databases, APIs, cloud storage, on-prem systems)
Ensure data quality by implementing dbt tests and validations
Monitor, troubleshoot, and optimize data workflows
Collaborate with cross-functional teams to understand business data requirements
Maintain data governance, security, and performance standards
Apply a strong understanding of data warehousing concepts and ELT methodologies
Work with Azure Data Lake Storage (ADLS) or similar storage solutions
Use version control (Git) and follow CI/CD processes
Provide production support as needed
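For context on the ADF items above, Linked Services, Datasets, and Pipelines are defined as JSON resources. A minimal sketch of a Copy-activity pipeline definition (all names here are hypothetical, not from this posting):

```json
{
  "name": "pl_ingest_sales",
  "properties": {
    "activities": [
      {
        "name": "CopySalesToBronze",
        "type": "Copy",
        "inputs":  [ { "referenceName": "ds_sql_sales",   "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "ds_adls_bronze", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "AzureSqlSource" },
          "sink":   { "type": "ParquetSink" }
        }
      }
    ]
  }
}
```

The referenced Datasets would in turn point at Linked Services (the SQL database and the ADLS storage account) in their own definitions.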
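"Metadata-driven design" above means pipeline behavior is parameterized by a control table rather than hard-coded per source. A minimal plain-Python sketch of the idea (table contents, paths, and column names are hypothetical):

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ControlRow:
    source_table: str
    target_path: str
    load_type: str                       # "full" or "incremental"
    watermark_col: Optional[str] = None  # column driving incremental loads

# Hypothetical control table; in practice this would live in a SQL or Delta table.
CONTROL: List[ControlRow] = [
    ControlRow("sales.orders",    "bronze/orders",    "incremental", "modified_at"),
    ControlRow("sales.customers", "bronze/customers", "full"),
]

def build_extract_query(row: ControlRow, last_watermark: Optional[str] = None) -> str:
    """Generate the extract query a pipeline run would execute for one control row."""
    if row.load_type == "incremental" and last_watermark:
        return (f"SELECT * FROM {row.source_table} "
                f"WHERE {row.watermark_col} > '{last_watermark}'")
    return f"SELECT * FROM {row.source_table}"
```

A single generic pipeline can then loop over `CONTROL`, so onboarding a new source is a row insert, not a code change.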
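The dbt tests mentioned above are typically declared in a model's YAML (most commonly `not_null` and `unique` on key columns). A plain-Python illustration of what those two checks assert, using hypothetical sample rows:

```python
def check_not_null(rows, column):
    """Return the rows that violate a not_null test on `column`."""
    return [r for r in rows if r.get(column) is None]

def check_unique(rows, column):
    """Return the values that violate a unique test on `column`."""
    seen, dupes = set(), []
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.append(v)
        seen.add(v)
    return dupes

# Hypothetical sample: one duplicate key and one null key.
rows = [{"order_id": 1}, {"order_id": 2}, {"order_id": 2}, {"order_id": None}]
```

In dbt itself these run as generated SQL against the warehouse; the point here is only what "passing" means: both functions returning empty lists.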
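For the Event Hub ingestion bullet, the consuming side usually decodes each event payload into a record before landing it. A sketch separating the pure parsing step (testable) from the client wiring (commented, requires the `azure-eventhub` package and a real connection string; all names are placeholders):

```python
import json

def parse_event(body: bytes) -> dict:
    """Decode a JSON event payload as it might arrive from Event Hub."""
    return json.loads(body.decode("utf-8"))

# Hypothetical wiring (not run here; needs azure-eventhub and a live hub):
# from azure.eventhub import EventHubConsumerClient
# client = EventHubConsumerClient.from_connection_string(
#     conn_str="<connection-string>",
#     consumer_group="$Default",
#     eventhub_name="<hub-name>",
# )
# def on_event(partition_context, event):
#     record = parse_event(event.body_as_str().encode("utf-8"))
#     # ... write `record` to the Bronze layer ...
#     partition_context.update_checkpoint(event)
# with client:
#     client.receive(on_event=on_event, starting_position="-1")
```

Keeping the parse/validate logic as a pure function makes it unit-testable without a live hub.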
Requirements
Hands-on experience building ETL/ELT pipelines with Azure Data Factory (ADF), including Linked Services, Datasets, and Pipelines
Experience developing and optimizing data transformations with Azure Databricks (PySpark)
Familiarity with Lakehouse architecture (Bronze/Silver/Gold layers), Unity Catalog, and metadata-driven design (control tables, mappings)
Strong technical expertise across Power BI, SQL, Power Platform tools, and Azure services
Experience building dbt models for transformation, testing, and documentation
Experience with real-time ingestion using Azure Event Hub and integrating data from databases, APIs, cloud storage, and on-prem systems
Strong attention to data quality, governance, security, and performance, and ability to collaborate with cross-functional teams
Strong understanding of data warehousing concepts and ELT methodologies
Experience working with Azure Data Lake Storage (ADLS) or similar storage solutions
Knowledge of version control (Git) and CI/CD processes