Tags: AWS, Cloud, Distributed Systems, DynamoDB, EC2, JavaScript, Kafka, Kubernetes, Microservices, Node.js, Python, Spark, Terraform, TypeScript, Serverless, EKS, Lambda, S3, RDS, Glue, CI/CD, Remote Work
Role Overview
Design, build, and scale new features for REST APIs and large-scale data processing pipelines that handle high-volume datasets across distributed systems.
Architect and optimize backend services for high throughput and low-latency performance.
Develop data-intensive and event-driven applications using Python, TypeScript, Spark, and AWS-native services.
Work with Spark, EMR, Glue, Kafka, or similar frameworks to process and transform very large datasets.
Improve system performance, reliability, and scalability across microservices and cloud infrastructure.
Partner with senior engineers, architects, DevOps, and QA throughout the full development lifecycle.
Mentor developers, guide code reviews, and raise engineering quality standards.
Automate deployments and CI/CD using Terraform, Serverless Framework, and Kubernetes-based workflows.
Requirements
7+ years of backend or full-stack engineering experience with a strong backend focus.
7+ years of hands-on Python experience (APIs, automation, large-scale data pipelines).