DTN is a global data and technology company helping operational leaders in energy, agriculture, and weather-driven industries make faster, smarter decisions. The Senior Data Engineer will lead the design and delivery of trusted, scalable data solutions that power DTN’s products and customer outcomes.
Responsibilities:
- Architect and deliver trusted data sets and services from complex data sources to support DTN partners and customers
- Design and implement modern data and analytics architectures, establishing best practices for data ingestion, integration, and pipeline development
- Lead initiatives to enhance data governance, security, quality, and reliability across the organization
- Drive continuous improvement of data engineering standards, patterns, and documentation to enable enterprise-wide consistency
- Provide expert-level support to internal teams and customers, resolving complex data infrastructure challenges
- Mentor junior data engineers and provide technical leadership across cross-functional teams
- Identify opportunities to modernize and optimize internal processes through automation and innovative data solutions
Requirements:
- 7+ years of experience in data engineering, including pipelines, ingestion, and ETL/ELT processes
- 5+ years of experience designing and deploying solutions within AWS cloud environments
- Strong expertise in modern data tooling, including Apache Airflow (or similar workflow tools), PySpark, EMR, and lakehouse architectures leveraging S3, Lake Formation, and Apache Iceberg
- Experience with containerization and infrastructure-as-code tools such as Docker, Kubernetes, Terraform, and CI/CD pipelines
- Advanced SQL and database expertise, including relational (e.g., Postgres, Oracle) and non-relational database technologies
- Proficiency in Python for data engineering, API integration (REST and GraphQL), and automation
- Experience implementing data quality frameworks and governance practices
- Strong communication skills with the ability to influence stakeholders and lead technical discussions
- Experience with geospatial data processing tools such as PostGIS or Apache Sedona
- Deep expertise in efficiently deploying and managing large-scale datasets
- Proven track record implementing enterprise data engineering best practices
- Strong customer focus and understanding of downstream data consumption and business value creation