Farmers Insurance is a company known for its commitment to making a real difference in people's lives. The Software Developer - Analytics Engineering role is crucial for building custom solutions that enable the deployment of predictive models and the transformation of raw data into structured datasets for analysis.
Responsibilities:
- Architects and builds scalable data pipelines using modern ELT (Extract, Load, Transform) tools and frameworks such as dbt (data build tool), Apache Airflow, or similar
- Automates data ingestion processes from various sources including databases, APIs, and third-party services
- Designs and implements data warehousing solutions using platforms like Snowflake, Redshift, or BigQuery
- Optimizes storage solutions for performance, cost efficiency, and scalability
- Develops and maintains logical and physical data models to support business analytics
- Creates and manages dimensional models, star/snowflake schemas, and other data structures
- Transforms raw data into clean, organized, and analytics-ready datasets using SQL, Python, or other relevant languages
- Implements data transformation workflows to handle data cleansing, normalization, and enrichment
- Conducts data validation and consistency checks to ensure the accuracy and reliability of data
- Implements data quality monitoring and alerting mechanisms
- Works closely with data analysts, data scientists, and business stakeholders to gather requirements and understand their data needs
- Acts as a liaison between technical teams and business units to translate business requirements into technical specifications
- Clearly communicates complex technical concepts and data insights to non-technical stakeholders
- Provides training and support to team members on data tools, best practices, and methodologies
- Implements and enforces data governance policies to ensure data privacy, security, and compliance with relevant regulations
- Defines and manages data access, controls, permissions, and audit trails
- Monitors and enforces data security measures to protect sensitive information from unauthorized access and breaches
- Ensures compliance with industry standards and regulations such as GDPR, CCPA, and HIPAA, as applicable
- Utilizes modern data tools and technologies such as SQL, Python, dbt, Airflow, and cloud platforms like AWS, Azure, or GCP
- Evaluates and integrates new tools and technologies to improve data infrastructure and processes
- Stays updated with the latest trends, best practices, and advancements in data engineering and analytics
- Participates in professional development opportunities to enhance technical and analytical skills
- Delivers code that is ready for hardening and operationalization by technology teams, with coaching, guidance, and feedback
- Performs other duties as assigned
Requirements:
- High School Diploma or equivalent required
- 3-5 years of related work experience required
- Experience with cloud-based data platforms (AWS, Azure, GCP)
- Competent data infrastructure development (pipeline design and development; data storage and management) with limited coaching and guidance
- Competent data modeling and transformation (data modeling; data transformation; data quality assurance) with limited coaching and guidance
- Competent collaboration and stakeholder management (cross-functional collaboration; technical communication) with limited coaching and guidance
- Competent data governance and security knowledge (governance policies; security measures) with limited coaching and guidance
- Competent tool and technology utilization (technology stack; continuous learning) with limited coaching and guidance
- Delivers code that is ready for hardening and operationalization by technology teams, with coaching, guidance, and feedback
- Bachelor's degree in computer science, data science, engineering, or a related field preferred
- Experience with C#, Python, JavaScript, Blazor, SQL, and Snowflake highly preferred