IBM is a leading technology company seeking a skilled Consultant Data Engineer to join its expanding team. The role focuses on the design and development of Snowflake Data Cloud solutions, including constructing data ingestion pipelines and implementing data governance protocols.
Responsibilities:
- Constructing data ingestion pipelines
- Establishing sound data architecture
- Implementing stringent data governance and security protocols
- Collaborating closely with database architects, data analysts, and data scientists
- Ensuring a consistent and optimal data delivery architecture across ongoing customer projects
- Navigating the diverse data needs of multiple teams, systems, and products
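To illustrate the first responsibility above, a data ingestion pipeline typically follows an extract-transform-load pattern. The sketch below is a hypothetical, minimal in-memory example (not IBM's actual stack); the record fields and the list standing in for a warehouse table are assumptions for illustration:

```python
import json
from datetime import datetime, timezone

def extract(raw_records):
    """Extract: parse raw JSON lines from a source system (here, an in-memory list)."""
    return [json.loads(line) for line in raw_records]

def transform(records):
    """Transform: normalize field names, cast types, and stamp ingestion time."""
    now = datetime.now(timezone.utc).isoformat()
    return [
        {"customer_id": r["id"],
         "amount_usd": round(float(r["amount"]), 2),
         "ingested_at": now}
        for r in records
        if r.get("amount") is not None  # basic data-quality gate
    ]

def load(rows, target):
    """Load: append cleaned rows to a target table (a list stands in for a warehouse table)."""
    target.extend(rows)
    return len(rows)

raw = ['{"id": 1, "amount": "19.99"}',
       '{"id": 2, "amount": "5.5"}',
       '{"id": 3, "amount": null}']
warehouse_table = []
loaded = load(transform(extract(raw)), warehouse_table)
```

In a production pipeline the same three stages would pull from a real source (files, queues, APIs), apply governed transformations, and write to a warehouse such as Snowflake, but the structure is the same.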
Requirements:
- Bachelor's degree in engineering, computer science, or a related field
- 3+ years in related technical roles with experience in data management, database development, ETL, and/or data prep domains
- Experience developing data warehouses
- Experience building ETL / ELT ingestion pipelines
- Proficiency with cloud platform services for data engineering tasks, including managed database services (e.g., Snowflake and its trade-offs versus Redshift, BigQuery, etc.) and data processing services (AWS Glue, Azure Data Factory, Google Dataflow)
- Skills in designing and implementing scalable and cost-effective solutions using cloud services, with an understanding of best practices for security and compliance
- Knowledge of how to manipulate, process and extract value from large disconnected datasets
- SQL and Python scripting experience required
- Strong interpersonal skills including assertiveness and ability to build strong client relationships
- Strong project management and organizational skills
- Ability to support and work with cross-functional and agile teams in a dynamic environment
- Advanced English required
- Cloud Integration Knowledge: Exposure to integrating cloud computing concepts and technologies with Snowflake platforms, enhancing data and AI use case implementation
- Advanced Data Engineering: Experience working with data engineering principles and practices to deliver high-quality solutions on Snowflake platforms, leveraging expertise in Snowflake and cloud computing
- Technical Solution Optimization: Experience applying technical expertise to optimize solutions on Snowflake platforms, ensuring seamless integration and optimal performance for data and AI use cases
- AI Development Experience: Familiarity with leveraging AI-assisted development tools (e.g., GitHub Copilot, Cursor, or similar) to accelerate coding, debugging, and solution design within data engineering workflows
- Cloud experience (AWS, Azure or GCP) is a plus
- Knowledge of any of the following tools is also a plus: Snowflake, Matillion, Fivetran, or dbt
- Scala and/or JavaScript experience is a plus
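As a hypothetical illustration of the required SQL and Python scripting skills, the snippet below uses Python's built-in sqlite3 module as a stand-in for a cloud warehouse such as Snowflake (the table name and data are invented for the example; warehouse SQL dialects differ in detail):

```python
import sqlite3

# SQLite stands in here for a cloud warehouse; the orders table is hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer_id INTEGER, amount_usd REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(1, 19.99), (1, 5.50), (2, 42.00)],
)

# Aggregate revenue per customer -- the kind of transformation step
# an ELT job or dbt model would express in warehouse SQL.
rows = conn.execute(
    "SELECT customer_id, ROUND(SUM(amount_usd), 2) AS revenue "
    "FROM orders GROUP BY customer_id ORDER BY revenue DESC"
).fetchall()

for customer_id, revenue in rows:
    print(customer_id, revenue)
conn.close()
```

In practice the same query pattern would run against Snowflake via its Python connector or a dbt model, with the Python layer handling orchestration, parameterization, and data-quality checks.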