Fractal Analytics is a strategic AI partner to Fortune 500 companies. The Azure Data Engineer role involves translating complex analytical requirements into technical designs, architecting solutions on the Azure platform, and delivering large-scale data management initiatives.
Responsibilities:
- Translate functional requirements into technical design
- Interact with clients and internal stakeholders to understand data and platform requirements in detail, and determine the core Azure services needed to fulfil the technical design
- Architect solutions on the Azure platform to meet client-specific functional and non-functional requirements
- Evaluate the current technology landscape and recommend a forward-looking short- and long-term strategic technology vision
- Participate in the creation and sharing of best practices, technical content, and new reference architectures
- Prepare and answer RFPs, RFIs, and RFQs
- Oversee design and solution architecture to ensure standards are followed and the codebase is modular and scalable
- Estimate effort using a parametric estimation model and bring all stakeholders on board with the overall schedule and dependencies
- Design, develop, and deliver data integration interfaces in ADF and Azure Databricks (see the sketch after this list)
- Design, develop, and deliver data provisioning interfaces to fulfil consumption needs
- Deliver data models on the Azure platform, whether on Azure Cosmos DB, Snowflake, SQL DW / Synapse, or Azure SQL
- Advise clients on ML engineering and on deploying MLOps at scale on AKS
- Automate core activities to minimize delivery lead times and improve overall quality
- Optimize platform cost by selecting the right platform services and architecting the solution in a cost-effective manner
- Deploy Azure DevOps and CI/CD processes
- Deploy logging and monitoring across the different integration points for critical alerts
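As a flavor of the ADF and Azure Databricks integration work described above, here is a minimal, hypothetical PySpark sketch of an ingestion interface: it reads raw CSV files from ADLS Gen2 and lands them as a Delta table. The storage account, container, path, and table names are placeholders, not actual project conventions.

```python
# Minimal Databricks ingestion sketch: read raw CSVs from ADLS Gen2 and
# persist them as a Delta table for downstream provisioning. All names
# below (storage account, container, table) are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided implicitly on Databricks

# Hypothetical ADLS Gen2 path; a real job would take this from an ADF
# pipeline parameter or a Databricks job widget.
raw_path = "abfss://raw@examplestorage.dfs.core.windows.net/sales/"

df = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv(raw_path)
)

# Light standardization before exposing the data for consumption.
cleaned = df.dropDuplicates().withColumn("ingested_at", F.current_timestamp())

# Write as a managed Delta table that downstream data models can build on.
cleaned.write.format("delta").mode("overwrite").saveAsTable("bronze.sales_raw")
```

In practice, ADF would orchestrate a notebook or job running code like this, with the CI/CD and monitoring responsibilities above wrapped around it.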
Requirements:
- 6-10 years of software development experience
- Experience translating complex analytical requirements into technical designs, including data models, ETL pipelines, and dashboards/reports
- Experience deploying dashboards and self-service analytics solutions on both relational and non-relational databases
- Experience with dbt Core or dbt Cloud
- Data Vault 2.0 implementation experience using the automate_dv package
- Experience with different database computing paradigms such as in-memory, distributed, and massively parallel processing (MPP), as well as Databricks Unity Catalog
- Successfully delivered large-scale data management initiatives covering the Plan, Design, Build, and Deploy phases, leveraging delivery methodologies including Agile
- Strong knowledge of continuous integration, static code analysis, and test-driven development (a brief illustration follows this list)
- Experience delivering projects in a highly collaborative delivery model with onsite and offshore teams
- Excellent analytical and problem-solving skills
- Delivered change management initiatives focused on driving data platform adoption across the enterprise
- Experience working with APIs
- Strong verbal and written communication skills are a must, as well as the ability to work effectively across internal and external organizations
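To illustrate the test-driven development expectation above, here is a small, hypothetical pytest sketch in the style common on data pipelines: a pure transformation function with a test written directly against it. The function name and path layout are invented for this example.

```python
# Hypothetical TDD illustration: a small, pure transformation helper plus a
# pytest test. Run with `pytest` against this file.
from datetime import date


def parse_partition_date(path: str) -> date:
    """Extract a partition date from a Hive-style path such as
    'raw/sales/year=2024/month=06/day=01/'. (Invented example helper.)"""
    parts = dict(
        segment.split("=", 1)
        for segment in path.strip("/").split("/")
        if "=" in segment
    )
    return date(int(parts["year"]), int(parts["month"]), int(parts["day"]))


def test_parse_partition_date():
    assert parse_partition_date("raw/sales/year=2024/month=06/day=01/") == date(2024, 6, 1)
```

Keeping transformation logic in small pure functions like this makes it straightforward to cover in CI alongside static code analysis.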