Verisk is a leading data analytics and technology partner to the global insurance industry, dedicated to empowering communities and businesses. The Analytics Engineer will transform raw data into structured datasets for analysis and machine learning, collaborating with teams across the company to ensure data integrity and accessibility.
Responsibilities:
- Design, develop, and maintain scalable data pipelines to process raw data from various sources
- Clean, transform, and enrich data to create high-quality datasets suitable for analysis and machine learning
- Work closely with product teams, software developers, data scientists, and analysts to understand data needs and deliver innovative solutions
- Ensure data accuracy, consistency, and reliability across all datasets
- Optimize data processes for performance and scalability
- Maintain comprehensive documentation of data pipelines, processes, and schemas
Requirements:
- Bachelor's degree in Computer Science, Data Engineering, or a related field
- 3+ years of experience as a Data Engineer or in a similar role
- Proficiency in Python and SQL, and familiarity with languages such as C# or Java
- Experience with ETL tools and frameworks (e.g., Apache Airflow, Luigi, dbt)
- Hands-on experience with big data technologies such as Hadoop, Spark, and Kafka
- Strong knowledge of relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra)
- Experience with cloud services (e.g., AWS, Google Cloud, Azure) and their data processing tools (e.g., AWS Glue, Google BigQuery)
- Familiarity with, and enthusiasm for, emerging analytical tools such as large language models and prompt engineering
- Knowledge of data warehousing concepts and solutions (e.g., Redshift, Snowflake)
- Proficient with version control systems (e.g., Git)
- Understanding of machine learning concepts and experience working with data for ML model training