Dexian is a leading provider of staffing, IT, and workforce solutions. It is seeking an Azure Data Engineer to develop and manage data pipelines, work with large datasets, and implement data governance frameworks in cloud environments.
Requirements:
- 8+ years of experience in ETL / data engineering
- 8+ years of experience with programming using Python
- 8+ years of experience working in Unix/Linux environments
- 8+ years of experience writing Shell scripts
- 6+ years of experience with the Databricks ecosystem, including Lakehouse, Delta Lake, Workflows, Medallion Architecture, Apache Spark, PySpark, Unity Catalog, Delta Sharing, Notebooks, SQL, and Git (a minimal PySpark/Delta Lake sketch follows this list)
- 6+ years of experience with ADF
- 6+ years of experience working with large enterprise datasets
- Experience with Snowflake
- Strong analytical and troubleshooting skills
- Excellent communication and collaboration abilities
- Ability to work independently and mentor junior analysts
- Strong documentation and design skills
- Strong SQL skills
- Experience implementing governance using Unity Catalog
- Experience working with Apache Iceberg or other open table formats
- Experience working with Azure Data Lake Storage (ADLS) or AWS S3
- Understanding of cloud data lake architecture
- Hands-on experience with Apache Airflow (see the DAG sketch after this list)
- Experience developing pipelines for Snowflake
- Strong understanding of SAS programming, including the SAS DATA step, SAS macros, and PROC SQL
- Experience migrating SAS ETL pipelines to Spark and Databricks (see the migration sketch after this list)
- Knowledge of data governance frameworks
- Healthcare experience
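A few of the requirements above lend themselves to short illustrations. First, a minimal sketch of a medallion-style (bronze/silver/gold) flow with PySpark and Delta Lake, assuming a Databricks runtime or a local Spark session with the delta-spark package configured; the paths and columns (claim_id, claim_amount, provider_id) are hypothetical, not taken from the posting.

```python
# Minimal medallion-architecture sketch with PySpark + Delta Lake.
# Assumes the "delta" format is available (Databricks, or delta-spark
# configured locally). All paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: land raw CSV files as-is in a Delta table.
raw = spark.read.option("header", True).csv("/mnt/landing/claims/")
raw.write.format("delta").mode("append").save("/mnt/bronze/claims")

# Silver: deduplicate and apply basic quality rules.
bronze = spark.read.format("delta").load("/mnt/bronze/claims")
silver = (
    bronze.dropDuplicates(["claim_id"])
          .filter(F.col("claim_amount").isNotNull())
)
silver.write.format("delta").mode("overwrite").save("/mnt/silver/claims")

# Gold: aggregate into a reporting-ready table.
gold = silver.groupBy("provider_id").agg(
    F.sum("claim_amount").alias("total_claims")
)
gold.write.format("delta").mode("overwrite").save("/mnt/gold/claims_by_provider")
```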
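For the Airflow requirement, here is a minimal Airflow 2.x (2.4 or later, for the `schedule` argument) TaskFlow DAG sketch; the DAG name, schedule, and task bodies are placeholders rather than a prescribed design.

```python
# Minimal Airflow TaskFlow sketch: a daily extract -> transform -> load
# chain. Task bodies are placeholders; names are hypothetical.
from datetime import datetime

from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def claims_pipeline():
    @task
    def extract() -> str:
        # e.g. pull a file from ADLS or S3 and return its local path
        return "/tmp/claims.csv"

    @task
    def transform(path: str) -> str:
        # e.g. clean and reshape the data with PySpark or pandas
        return path

    @task
    def load(path: str) -> None:
        # e.g. COPY the result into a Snowflake or Delta table
        pass

    load(transform(extract()))

claims_pipeline()
```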
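Finally, for the SAS migration requirement, a hypothetical before/after: a PROC SQL aggregation followed by a DATA step IF/ELSE, rewritten as PySpark transformations. The source table, path, and columns are invented for illustration.

```python
# Hypothetical SAS-to-PySpark migration sketch.
# Original SAS, for reference:
#   PROC SQL;
#     CREATE TABLE work.summary AS
#     SELECT region, SUM(sales) AS total_sales
#     FROM work.orders GROUP BY region;
#   QUIT;
#   DATA work.flagged;
#     SET work.summary;
#     IF total_sales > 10000 THEN high_volume = 1;
#     ELSE high_volume = 0;
#   RUN;
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sas-migration-sketch").getOrCreate()
orders = spark.read.format("delta").load("/mnt/silver/orders")  # hypothetical source

# PROC SQL's GROUP BY becomes groupBy/agg.
summary = orders.groupBy("region").agg(F.sum("sales").alias("total_sales"))

# The DATA step's IF/ELSE becomes a when/otherwise column expression.
flagged = summary.withColumn(
    "high_volume", F.when(F.col("total_sales") > 10000, 1).otherwise(0)
)
flagged.write.format("delta").mode("overwrite").save("/mnt/gold/flagged_regions")
```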