Northwestern Mutual, a leading life insurance company, is seeking a Senior Data Engineer to design, develop, and maintain data products and platform infrastructure. The role involves collaborating with cross-functional teams to ensure the efficient operation of data systems and implementing advanced monitoring and performance tuning.
Responsibilities:
- Design, develop, and maintain data products and data platform infrastructure
- Work with cross-functional teams to ensure efficient and reliable operation of data systems and to develop frameworks that support data engineers, scientists, and analysts
- Work with site reliability engineers and operational support teams to implement auto-scaling policies and advanced monitoring/alerting systems, and to tune application performance configurations
- Utilize Data Lake platforms, including Databricks, AWS Lake Formation, and Snowflake
- Use tooling that supports logging, metrics, and telemetry (including Prometheus, Grafana, CloudWatch, OpenSearch, and Dynatrace)
- Lead risk assessment efforts when new technology is introduced and implement process improvement opportunities and automation
- Improve reliability, quality, and time-to-market for the enterprise data platform
- Measure and optimize system performance, including automation and self-healing capabilities
- Work with modern CI/CD pipeline technologies involving git repositories, static code analysis, and test-driven development
- Partner with development teams to improve automated delivery through rigorous testing and release procedures
- Maintain and update Terraform modules for infrastructure deployment
- Build and configure infrastructure resources on the AWS cloud platform, such as Virtual Private Clouds (VPCs), public and private subnets, security groups, route tables, Elastic Load Balancers (ELB), EC2, and RDS
- Deploy and maintain all applications end to end, performing testing and security engineering assessments
- Interact with application vendors to get updates on product features and issues
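As a flavor of the infrastructure-as-code work described above (Terraform modules provisioning a VPC, subnets, and security groups on AWS), here is a minimal hedged sketch; all resource names, CIDR ranges, and tags are illustrative assumptions, not Northwestern Mutual's actual configuration:

```hcl
# Illustrative Terraform module fragment for a VPC with a private subnet
# and a security group allowing HTTPS within the VPC. Names and CIDR
# blocks are placeholder assumptions.

resource "aws_vpc" "main" {
  cidr_block = "10.0.0.0/16"
  tags       = { Name = "data-platform-vpc" }
}

resource "aws_subnet" "private" {
  vpc_id     = aws_vpc.main.id
  cidr_block = "10.0.1.0/24"
}

resource "aws_security_group" "app" {
  vpc_id = aws_vpc.main.id

  ingress {
    from_port   = 443
    to_port     = 443
    protocol    = "tcp"
    cidr_blocks = [aws_vpc.main.cidr_block]
  }
}
```

In practice a module like this would be parameterized with input variables (CIDR ranges, environment tags) and versioned in a Git repository alongside the CI/CD pipeline the role maintains.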
Requirements:
- Bachelor's degree in Computer Information Sciences or a related field
- 3 years of experience as a data engineer or related occupation
- 3 years of experience with cloud services, architecture of cloud resources, system architecture, and networking components of AWS services
- 3 years of experience with requirement gathering and analysis
- 3 years of experience defining and implementing observability frameworks
- 3 years of experience with Data Lake platforms, including Databricks, AWS Lake Formation, and Snowflake
- 3 years of experience working with distributed data systems and applications
- 3 years of experience with tooling to support logging, metrics, and telemetry (including Prometheus, Grafana, CloudWatch, OpenSearch, and Dynatrace)
- 3 years of experience with CI/CD pipeline technologies involving git repositories, static code analysis, and test-driven development