RedCloud Consulting is a global business and technology consulting firm serving clients across multiple regions. The firm is seeking a Data Automation Architect & Analytics Engineer to modernize and automate an existing analytics and reporting framework, with a focus on end-to-end ETL process automation and improved data reliability and scalability.
Responsibilities:
- Automate an existing ETL workflow by understanding the current process, connecting it to new pipelines, performing required transformations, and loading curated data into the presentation layer
- Ingest and integrate data from a broad set of internal sources that may include SQL queries, BI reports, and data cubes, and ensure the data is usable after ingestion through appropriate transformations and validation
- Design and build the automation infrastructure primarily on a modern analytics platform such as Microsoft Fabric, continuing to use SQL where needed
- Deliver post-transformation data to reporting and downstream consumers, including a Power BI reporting layer and external blob storage destinations used by other teams and partners
- Evaluate existing pipelines, queries, endpoints, and refresh processes to identify inefficiencies and redesign for scale, resilience, and improved refresh cadence
- Create parameterized automation patterns that allow business users to control configuration inputs such as filters, with governance considerations such as change tracking and auditability
- Explore a lightweight configuration interface, using tools such as Power Apps, to enable non-engineering users to adjust parameters without code changes
- Operate with a self-serve working style, independently seeking access, documentation, and cross-team help wherever possible and escalating only after other paths have been exhausted
- Provide frequent progress visibility to stakeholders through demos, working sessions, and iterative deliverables from day one
- Contribute to a longer-term reference architecture and a repeatable model that other teams can adopt
Requirements:
- Strong SQL expertise
- Hands-on experience with Power BI, including supporting or evolving reporting solutions connected to automated pipelines
- Experience with Microsoft Fabric, or the ability to ramp up quickly on key capabilities such as lakehouse patterns, pipelines, and dataflows
- Understanding of cloud-based data architectures and analytics services such as data lakes, Synapse, and Kusto, or equivalent technologies
- Demonstrated experience automating analytics workflows and reducing manual operational overhead through repeatable engineering patterns
- Ability to independently perform discovery by reading pipelines, reviewing queries, interpreting endpoints, and synthesizing findings into clear recommendations
- Strong analytical mindset with the ability to question existing logic, identify better sources, and surface improvement opportunities
- Proven ability to execute proactively with a continuous delivery approach and to produce tangible deliverables every 1 to 2 weeks
- Experience implementing self-service configuration patterns for business users, including configurable inputs and change tracking
- Experience designing data delivery paths that support both reporting consumers and downstream operational consumers, including file or blob based distribution
- Strong communication skills for collaborating effectively with both technical and non-technical stakeholders