Fractal Analytics is a strategic AI partner to Fortune 500 companies with a vision to power every human decision in the enterprise. They are seeking a Senior AI & Data Platform Engineer who excels at AI-assisted development, data engineering, and platform modernization, and who will play a key role in accelerating engineering productivity and delivering scalable AI and data solutions.
Responsibilities:
- Leverage AI-assisted development tools (Copilots, LLMs, prompt-driven workflows) to accelerate development cycles
- Apply spec-driven development to generate, refine, and validate production-quality code
- Improve developer velocity while maintaining strong standards in code quality, testing, and architecture
- Lead the migration of data and AI workloads from Databricks to Microsoft Fabric
- Refactor PySpark, SQL, and orchestration pipelines into Fabric-native solutions
- Optimize workloads using OneLake, Lakehouse, Warehouse, Notebooks, and Dataflows Gen2
- Ensure performance, scalability, and cost efficiency across the platform
- Build and maintain scalable data pipelines and analytical workloads
- Develop AI-enabled services and data products integrated with enterprise systems
- Implement robust testing, validation, and deployment practices
- Translate business and technical requirements into clear, structured implementations
- Follow contract-first / spec-driven engineering practices (schemas, interfaces, SLAs)
- Maintain full traceability from requirements → implementation → testing → deployment
- Work across the full SDLC (design, build, test, deploy, operate)
- Partner with architects, product teams, and stakeholders globally
- Mentor engineers on modern AI engineering and platform best practices
Requirements:
- Bachelor's or Master's degree in Computer Science, Engineering, or related field
- 5+ years of experience in software, data, or AI engineering
- Strong SQL skills (advanced querying, optimization)
- Strong data modeling skills (relational, dimensional, analytical)
- Solid understanding of distributed data processing concepts
- Hands-on experience with AI-assisted development / Copilot-style workflows
- Experience working with spec-driven engineering approaches
- Strong foundation in clean code, modular design, and automated testing
- Experience with Microsoft Fabric: Lakehouse, Warehouse, Notebooks, Dataflows Gen2
- Understanding of OneLake architecture
- Prior experience with Databricks (Spark, pipelines, notebooks)
- Prior experience with Azure services (ADLS, Azure SQL, Synapse, Azure Functions)
- Ability to translate Databricks workloads into Fabric-native solutions
- Self-starter with a forward-thinking mindset
- Strong problem-solving and systems thinking abilities
- Comfortable working in fast-paced, high-ownership environments
- Passion for building end-to-end, production-grade AI and data platforms