Charter Global is a staffing company seeking a Data Engineer to design and maintain scalable ETL/ELT pipelines and manage data models. The role involves implementing cloud data architectures and ensuring the performance and reliability of data pipelines.
Responsibilities:
- Design and maintain scalable ETL/ELT pipelines using Python and AWS-native services
- Develop, optimize, and manage Postgres and Snowflake data models to support analytics and operational workloads
- Implement and maintain graph database solutions using Amazon Neptune or Neo4j
- Build secure, high-performance cloud data architectures leveraging AWS (e.g., Lambda, Glue, S3, IAM, Step Functions)
- Monitor, troubleshoot, and improve data pipeline performance, reliability, and quality across environments
Requirements:
- 5+ years of experience designing and developing data pipelines using Python
- Strong hands-on experience with Postgres, including schema design, query tuning, and performance optimization
- Experience with Snowflake (warehousing concepts, virtual warehouses, SnowSQL, data sharing)
- Working knowledge of graph technologies (Amazon Neptune or Neo4j), including modeling and query languages (Gremlin, Cypher)
- Proven experience building and deploying solutions in AWS, including best practices for security, scalability, and cost optimization
- Familiarity with AI tools and techniques