Blue Coding is a company that specializes in hiring excellent developers from Latin America and other parts of the world. They are seeking a Senior Data Engineer to lead and execute large-scale data migration and modernization initiatives, with a focus on building robust data pipelines and ensuring data integrity across enterprise data platforms.
Responsibilities:
- Design, develop, and maintain data pipelines and data stores that support enterprise analytics, data science, and operational workloads
- Lead and support large-scale database migration initiatives, including on-premises to cloud migrations
- Monitor, analyze, and optimize the performance and stability of data layer services and platforms
- Ensure data integrity, quality, and compliance across pipelines and datasets
- Collaborate closely with peers across engineering, analytics, and technology teams
- Guide, coach, and mentor data engineers, BI developers, and analysts
- Design and implement enterprise-scale data solutions with long-term business impact
- Build and maintain data processing solutions using Python and/or Scala
- Work with a variety of data ingestion patterns, including SFTP, APIs, streaming, and batch processing
- Design and support database models optimized for analytical and reporting use cases
- Implement monitoring, alerting, and observability for data pipelines and infrastructure
- Maintain clear and comprehensive documentation of data architectures, pipelines, and processes
- Work within an Agile environment, collaborating through tools such as Jira and Git
Requirements:
- 5+ years of experience working in data engineering or data-centric technical roles
- Strong experience designing and maintaining enterprise-scale data pipelines and platforms
- Solid understanding of data warehousing concepts, modeling strategies, and analytical datasets
- Proven experience with SQL-based data environments and database performance tuning
- Hands-on experience with cloud-based data platforms, particularly on AWS
- Experience working with cloud-native data technologies such as S3, EMR/Spark, Lambda, Kinesis, Firehose, and Glue
- Strong programming skills in Python and/or Scala
- Experience with Infrastructure as Code concepts; Terraform experience is a strong plus
- Experience supporting analytics and data science consumers, including BI tools such as Tableau or Power BI
- Strong communication skills and the ability to collaborate effectively across teams
- Ability to work autonomously with minimal direction while owning complex data initiatives end-to-end
- Experience with Redshift or other analytical data stores
- Familiarity with NoSQL data models and unstructured data processing
- Experience with cloud monitoring and alerting frameworks
- Exposure to feature stores or advanced analytics platforms
- Prior experience mentoring or leading other data engineers