StackAdapt is a leading technology company that empowers marketers to reach, engage, and convert audiences with precision. The Senior Big Data Engineer & Database Administrator will own operational database administration, data engineering, and data operations across the Data Lake and Enterprise Data Warehouse ecosystem, keeping the data environment healthy and reliable while designing high-quality data artifacts.
Responsibilities:
- Take the lead on our daily database administration, ensuring our data environment stays healthy and reliable
- Partner closely with our Staff EDW Architect to bring new visions to life, designing high-quality data artifacts that follow industry best practices
- Turn business needs into smart, reusable ETL solutions that grow alongside our company
- Design end-to-end automated ingestion and transformation pipelines, applying your expertise with data models and architecture diagrams to keep our data moving on schedule
- Analyze and produce artifacts such as Source-to-Target Mapping documents
- Build pipelines that bring together data from everywhere, whether it’s RDBMS, APIs, JSON, or flat files, creating a seamless flow of information
- Define the 'Source of Truth' using tools like Python and ETL software to clean, integrate, and transform data into a single, reliable resource the whole company can trust
- Provide day-to-day operational support for all the data pipelines by monitoring and investigating alerts, while troubleshooting and remediating production issues in a timely manner
- Act as a player-coach for junior and intermediate data engineers and ETL developers as the EDO team grows
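The multi-source ingestion and "Source of Truth" work described above can be illustrated with a minimal sketch: merging a JSON feed (standing in for an API response) and a flat file into one consolidated, de-duplicated dataset, using only the Python standard library. The field names and sample data are hypothetical.

```python
import csv
import io
import json


def extract_json(raw: str) -> list[dict]:
    """Parse records from a JSON source (e.g., an API response)."""
    return json.loads(raw)


def extract_csv(raw: str) -> list[dict]:
    """Parse records from a flat-file (CSV) source."""
    return list(csv.DictReader(io.StringIO(raw)))


def transform(records: list[dict]) -> dict[str, dict]:
    """Normalize types and key records by customer_id, building one
    consolidated view in which later sources win on conflicts."""
    merged: dict[str, dict] = {}
    for rec in records:
        key = str(rec["customer_id"])
        merged[key] = {"customer_id": key, "spend": float(rec["spend"])}
    return merged


# Hypothetical inputs standing in for an API feed and a flat file.
api_payload = '[{"customer_id": 1, "spend": 10.5}]'
flat_file = "customer_id,spend\n1,12.0\n2,3.25\n"

source_of_truth = transform(extract_json(api_payload) + extract_csv(flat_file))
```

In practice the same extract/transform/load shape would be expressed in an ETL tool or Spark job, with the "later sources win" rule replaced by whatever precedence the business defines.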
Requirements:
- Deep experience performing ETL design and development via custom coding (SQL, Python, Spark, Java, etc.) as well as with ETL tools (e.g., Coalesce, Informatica, DataStage, Talend) and related tooling such as data quality tools and metadata managers
- Extensive hands-on professional experience working with Snowflake as a Database Administrator (DBA), with a clear understanding of Snowflake's Access Control framework
- Comfortable with cloud-hosted solutions, especially AWS, with experience deploying Secrets Manager, KMS, S3, EC2, Linux, cross-account access, etc. in a scaled environment
- Innate curiosity to fundamentally understand and solve problems. You're not satisfied until you can clearly explain how and why an error occurred, and you are driven to ensure it never happens again!
- Understanding of data warehousing architecture fundamentals (e.g., Kimball vs. Inmon, Medallion Architecture, 3NF data models, reference data models, dimensional models, conformed dimensions, SCDs, etc.)
- Experience orchestrating data operations via tools such as Apache Airflow, cron, Astronomer, etc.
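Orchestrators like Airflow model a pipeline as a DAG of tasks and run each task only after its upstream dependencies succeed. A toy, dependency-ordered runner sketched in plain Python (the three task names are hypothetical):

```python
from graphlib import TopologicalSorter  # stdlib topological ordering (Python 3.9+)


def run_pipeline(dependencies: dict[str, set[str]], tasks: dict) -> list[str]:
    """Execute tasks in an order that respects their dependencies and
    return that order -- the scheduling guarantee a DAG orchestrator provides."""
    order = list(TopologicalSorter(dependencies).static_order())
    for name in order:
        tasks[name]()
    return order


# Hypothetical three-step pipeline: extract -> transform -> load.
log: list[str] = []
tasks = {name: (lambda n=name: log.append(n)) for name in ("extract", "transform", "load")}
deps = {"transform": {"extract"}, "load": {"transform"}}

execution_order = run_pipeline(deps, tasks)
```

Real orchestrators add scheduling, retries, alerting, and backfills on top of this ordering guarantee, which is where the monitoring and remediation duties above come in.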