Cloud Data Engineer – AWS, Terraform, Python at Kolomolo | JobVerse
Cloud Data Engineer – AWS, Terraform, Python
Kolomolo
Remote · Bulgaria · Full Time · Posted 5 hours ago · No H1B
Key skills
Amazon Redshift, AWS, Cloud, Docker, ETL, ELT, Grafana, Kubernetes, Neo4j, PostgreSQL, Prometheus, Python, SQL, Terraform, Data Lake, Snowflake, Databricks, ECS, EKS, Lambda, S3, CloudWatch, Glue, Agile, Scrum, CI/CD, Communication
About this role
You will play a key role in designing and deploying the data infrastructure supporting Kolomolo’s modernization stack. Your work will focus on building Terraform-based AWS deployments and on ensuring smooth data ingestion, transformation, enrichment, and validation.

Responsibilities
Designing and implementing Infrastructure as Code (IaC) using Terraform for AWS services
Building data processing pipelines and automating workflows for ingestion, transformation, and validation
Handling data enrichment and data integrity checks across multiple formats (e.g., Parquet, JSON, CSV)
Integrating data intake from relational SQL databases (e.g., PostgreSQL) and graph-based data sources (e.g., Neo4j, Amazon Neptune)
Developing and maintaining Python-based data orchestration tools and scripts
Working with containerized environments (Docker, Kubernetes) to enable scalable deployments
Supporting continuous improvement and optimization of cloud infrastructure performance and cost
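To give a concrete flavor of the ingest-and-validate work described above, here is a minimal sketch in Python using only the standard library. The field names ("id", "value") and the required-field rule are hypothetical illustrations, not part of the posting; a real pipeline would also cover Parquet (e.g., via pyarrow), which is omitted here to keep the sketch dependency-free.

```python
# Illustrative sketch: normalize JSON and CSV input into a common record
# shape, then drop records that fail a simple integrity check.
# Field names and the validation rule are hypothetical.
import csv
import io
import json

REQUIRED_FIELDS = {"id", "value"}

def parse_records(raw: str, fmt: str) -> list[dict]:
    """Normalize raw JSON or CSV text into a list of dict records."""
    if fmt == "json":
        return json.loads(raw)
    if fmt == "csv":
        return list(csv.DictReader(io.StringIO(raw)))
    raise ValueError(f"unsupported format: {fmt}")

def validate(records: list[dict]) -> list[dict]:
    """Keep only records where every required field is present and non-empty."""
    return [
        r for r in records
        if REQUIRED_FIELDS <= r.keys() and all(r[f] for f in REQUIRED_FIELDS)
    ]

json_raw = '[{"id": "1", "value": "a"}, {"id": "2"}]'
csv_raw = "id,value\n3,c\n4,\n"

clean = validate(parse_records(json_raw, "json")) + validate(parse_records(csv_raw, "csv"))
print([r["id"] for r in clean])  # → ['1', '3']
```

Records 2 (missing "value") and 4 (empty "value") are dropped, which is the kind of cross-format integrity check the role calls for.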
Requirements
Proven experience as a Data Engineer or Cloud Engineer in AWS environments
Strong knowledge of Terraform and IaC practices
Hands-on experience with AWS core services (S3, Lambda, Glue, ECS/EKS, Redshift, etc.)
Proficiency in Python for automation, scripting, and data workflows
Understanding of ETL/ELT processes, data enrichment, and validation frameworks
Experience working with containerization tools (Docker, Kubernetes)
Familiarity with multiple data formats, including Parquet
Exposure to both relational and graph data models
Good communication skills and experience collaborating in Agile/Scrum teams
Experience with Databricks, Snowflake, or data lake architectures
Knowledge of monitoring and observability tools (e.g., CloudWatch, Grafana, Prometheus)
Experience with CI/CD pipelines for data workflows
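The AWS-side workflow these requirements imply (S3 objects triggering a Lambda processing step) might be sketched as follows. This is purely an assumption about the architecture, not Kolomolo’s actual pipeline: the event shape follows the standard S3 notification format, but the routing-by-extension logic and key names are illustrative.

```python
# Hypothetical sketch of an S3-triggered Lambda entry point that routes
# incoming object keys by file format. Bucket layout and key names are
# illustrative, not taken from the posting.
def handler(event, context):
    """Split S3 object keys from a notification event by file extension."""
    parquet_keys, other_keys = [], []
    for record in event.get("Records", []):
        key = record["s3"]["object"]["key"]
        (parquet_keys if key.endswith(".parquet") else other_keys).append(key)
    return {"parquet": parquet_keys, "other": other_keys}

# Local smoke test with a minimal S3-style notification event
event = {"Records": [
    {"s3": {"object": {"key": "raw/2024/data.parquet"}}},
    {"s3": {"object": {"key": "raw/2024/data.csv"}}},
]}
print(handler(event, None))
```

Keeping the handler a pure function of the event makes it testable locally before it is packaged and deployed via Terraform, which fits the CI/CD expectation above.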
Benefits
Competitive salary and benefits
Career development opportunities in a growing tech company
Continuous learning culture: mentorship, internal training, and certifications
Flexible, agile work environment (remote, hybrid, or on-site in Kraków)
Office perks: great coffee, tea, fresh fruit, snacks, and a fun atmosphere
Flat management structure, where your voice matters
Regular team events and a social, supportive work culture
B2B contract or Contract of Mandate (Umowa Zlecenie)