SS&C Technologies is a leading financial services and healthcare technology company headquartered in Windsor, Connecticut. As a Lead DevOps Data Engineer at DST Health Solutions, you will design, build, and maintain data infrastructure and pipelines while leading a team to drive best practices in DevOps methodologies for data platforms.
Responsibilities:
- Lead the design, development, and implementation of robust, scalable, and secure data pipelines and data warehouses using cloud-native services and PostgreSQL (a minimal pipeline sketch follows this list)
- Drive the adoption and implementation of DevOps principles and practices (CI/CD, infrastructure as code, automated testing, monitoring) across data engineering initiatives
- Mentor and guide a team of data engineers, fostering their technical growth and ensuring adherence to engineering best practices and architectural standards
- Collaborate with data scientists, analysts, and other engineering teams to understand data requirements and translate them into efficient and effective data solutions
- Help teams optimize and maintain the SQL queries that power data pipelines, reporting, and analytics (see the query-plan sketch after this list)
- Troubleshoot and resolve complex data-related issues, optimizing performance and ensuring data integrity
- Evaluate and recommend new technologies and tools to enhance our data platform capabilities and operational efficiency
- Develop and maintain comprehensive documentation for data architecture, pipelines, and operational procedures
- Participate in on-call rotations as needed to support critical data infrastructure
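
For illustration only, here is a minimal sketch of the kind of incremental PostgreSQL load this role would own: an idempotent upsert from a staging table into a warehouse table using psycopg2. Every identifier here (the `claims_staging` source, the `claims_fact` target, the connection string) is a hypothetical example, not an SS&C system.

```python
import psycopg2

# Hypothetical connection string and table names -- illustrative only.
DSN = "postgresql://etl_user@localhost:5432/warehouse"

UPSERT_SQL = """
    INSERT INTO claims_fact (claim_id, member_id, amount, updated_at)
    SELECT claim_id, member_id, amount, updated_at
    FROM claims_staging
    WHERE updated_at > %(watermark)s
    ON CONFLICT (claim_id) DO UPDATE
    SET member_id  = EXCLUDED.member_id,
        amount     = EXCLUDED.amount,
        updated_at = EXCLUDED.updated_at;
"""

def run_incremental_load(watermark):
    """Upsert rows changed since the last watermark; commits atomically."""
    with psycopg2.connect(DSN) as conn:  # connection context commits on success
        with conn.cursor() as cur:
            cur.execute(UPSERT_SQL, {"watermark": watermark})
            return cur.rowcount  # rows inserted or updated
```

The `ON CONFLICT ... DO UPDATE` form makes reruns safe, which is what keeps a scheduled pipeline idempotent.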
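For the query-maintenance responsibility, a sketch of how a pipeline query's plan might be checked programmatically with PostgreSQL's `EXPLAIN (ANALYZE, BUFFERS)`. The query, table, and "sequential scan" heuristic are assumptions for illustration; note that `ANALYZE` actually executes the query.

```python
import psycopg2

DSN = "postgresql://etl_user@localhost:5432/warehouse"  # hypothetical

def explain_query(sql, params=None):
    """Return PostgreSQL's execution plan (with timings) for a query."""
    with psycopg2.connect(DSN) as conn:
        with conn.cursor() as cur:
            # psycopg2 interpolates params client-side, so EXPLAIN works here.
            cur.execute("EXPLAIN (ANALYZE, BUFFERS) " + sql, params)
            return "\n".join(row[0] for row in cur.fetchall())

# Example: flag a table scan where we expect an index lookup.
plan = explain_query("SELECT * FROM claims_fact WHERE member_id = %s", (42,))
if "Seq Scan on claims_fact" in plan:
    print("warning: claims_fact lookup is not using an index")
```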
Requirements:
- Bachelor's degree in Computer Science, Engineering, Information Systems, or a related technical field; Master's degree preferred
- 7+ years of experience in data engineering, with at least 3 years in a lead or senior role focusing on DevOps practices
- Proven expertise in designing and building large-scale data pipelines with ETL/ELT tools and techniques on PostgreSQL
- Proficiency in programming languages such as Python or Java, plus strong SQL skills
- Solid experience with Infrastructure as Code (IaC) tools like Terraform
- Demonstrated experience with CI/CD pipelines (e.g., GitHub Actions) and data orchestration and processing tools (e.g., Airflow, PySpark) for data solutions (a minimal DAG sketch follows this list)
- Experience with containerization technologies (Docker, Kubernetes)
- Strong understanding of data warehousing concepts, dimensional modeling, and data lake architectures
- Experience with columnar/tabular storage for PostgreSQL (e.g., Hydra columnar tables)
- Excellent leadership, communication, and interpersonal skills with the ability to mentor and guide technical teams
- Ability to work effectively in a fast-paced, collaborative, and agile environment
- Experience with real-time data processing frameworks (e.g., Kafka, Spark Streaming) is a plus (a minimal consumer sketch follows)
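
To illustrate the orchestration side of the CI/CD requirement, here is a minimal Airflow DAG sketch for a daily load, assuming Airflow 2.4+ import paths and the `schedule` argument. The DAG id, task id, and the load step are hypothetical placeholders (the callable could be the psycopg2 upsert sketched above).

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

def run_incremental_load(**context):
    # Placeholder for the real load step (e.g., the psycopg2 upsert above);
    # Airflow 2.x passes the run context, including the logical date, as kwargs.
    print("loading rows changed since", context["ds"])

with DAG(
    dag_id="daily_claims_load",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    load = PythonOperator(
        task_id="incremental_load",
        python_callable=run_incremental_load,
    )
```

Keeping the DAG file itself thin and retry-aware like this is what lets it live in version control and flow through the same CI/CD checks as any other code.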
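And for the nice-to-have streaming item, a minimal consumer sketch using the kafka-python client. The topic name, broker address, and group id are assumptions for illustration, as is the JSON event shape.

```python
import json

from kafka import KafkaConsumer  # kafka-python client

# Hypothetical topic, broker, and group id; in practice these come from config.
consumer = KafkaConsumer(
    "claims-events",
    bootstrap_servers=["localhost:9092"],
    group_id="claims-pipeline",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    # A real pipeline would validate the event and upsert it into
    # PostgreSQL rather than just printing it.
    print(event["claim_id"], event.get("amount"))
```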