KPG99 INC is seeking a Snowflake Data Engineer with expertise in Python, Apache Airflow, and AWS. The role involves implementing advanced Snowflake capabilities and leading migration initiatives from legacy data warehouses to Snowflake.
Responsibilities:
- Implement advanced Snowflake capabilities (Streams, Tasks, Snowpipe, data sharing) for real-time and batch processing
- Lead migration initiatives from legacy data warehouses to Snowflake with minimal disruption
- Design and develop data applications and solutions on Snowflake, including Streamlit apps and Snowflake Native Apps
- Architect and build solutions on AWS using services such as S3, Lambda, API Gateway, and RDS
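The Streams/Tasks/Snowpipe work above is typically driven from Python via the Snowflake connector. Below is a minimal sketch that assembles the relevant DDL as strings; all object and warehouse names (RAW_EVENTS, EVENTS_STREAM, ETL_WH, etc.) are hypothetical, and the commented-out connector call marks where a real pipeline would execute them.

```python
# Sketch: building Snowflake Stream/Task DDL from Python.
# All object and warehouse names here are hypothetical examples.

def stream_ddl(stream: str, table: str) -> str:
    """CDC stream that tracks changes on a source table."""
    return f"CREATE OR REPLACE STREAM {stream} ON TABLE {table};"

def task_ddl(task: str, warehouse: str, stream: str, target: str) -> str:
    """Task that drains the stream into a target table once per minute,
    but only when the stream actually has new rows."""
    return (
        f"CREATE OR REPLACE TASK {task} "
        f"WAREHOUSE = {warehouse} SCHEDULE = '1 MINUTE' "
        f"WHEN SYSTEM$STREAM_HAS_DATA('{stream}') "
        f"AS INSERT INTO {target} SELECT * FROM {stream};"
    )

stmts = [
    stream_ddl("EVENTS_STREAM", "RAW_EVENTS"),
    task_ddl("LOAD_EVENTS", "ETL_WH", "EVENTS_STREAM", "CURATED_EVENTS"),
    "ALTER TASK LOAD_EVENTS RESUME;",  # tasks are created suspended
]

# With real credentials these would run via the official connector:
# import snowflake.connector
# with snowflake.connector.connect(...) as conn:
#     for s in stmts:
#         conn.cursor().execute(s)

for s in stmts:
    print(s)
```

Gating the task on `SYSTEM$STREAM_HAS_DATA` keeps the warehouse from spinning up when there is nothing to load.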
Requirements:
- Python (expert level, 10/10)
- Snowflake
- Apache Airflow: an open-source platform for programmatically authoring, scheduling, and monitoring complex data pipelines and workflows in Python
- AWS (S3 and Lambda)
- Snowflake UI
- Extract structured and unstructured data from data stores and load it into AWS using Python; build ETL pipelines for reporting, landing data in tables that downstream applications can consume
- Candidates preferred on W2 or 1099
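The extract-and-load requirement above, pulling both structured and unstructured data and landing it in tables for downstream consumers, might look like the following stdlib-only Python sketch. The record shapes and the `load_rows` sink are hypothetical; in practice the sink would be S3 (via boto3) or a Snowflake stage.

```python
import json
from typing import Iterator

# Hypothetical inputs: structured JSON lines plus free-text log lines.
STRUCTURED = '{"id": 1, "amount": 9.5}\n{"id": 2, "amount": 3.25}'
UNSTRUCTURED = "2024-05-01 login user=alice\n2024-05-01 purchase user=bob"

def extract_structured(payload: str) -> Iterator[dict]:
    """Parse JSON-lines input into row dicts."""
    for line in payload.splitlines():
        yield json.loads(line)

def extract_unstructured(payload: str) -> Iterator[dict]:
    """Coerce free-text log lines into a uniform row shape."""
    for line in payload.splitlines():
        date, event, *rest = line.split()
        yield {"date": date, "event": event,
               "attrs": dict(p.split("=") for p in rest)}

def load_rows(table: str, rows: list[dict]) -> None:
    """Hypothetical sink; a real pipeline would write to S3 or a stage."""
    print(f"loaded {len(rows)} rows into {table}")

load_rows("FINANCE.TRANSACTIONS", list(extract_structured(STRUCTURED)))
load_rows("OPS.EVENT_LOG", list(extract_unstructured(UNSTRUCTURED)))
```

Normalizing unstructured input into the same row shape as structured input is what lets downstream applications consume both from ordinary tables.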