Effectual is a company that focuses on data architecture and engineering solutions. As a Data Engineer, you will be responsible for building and maintaining data pipelines, ensuring efficient data management, and collaborating with data scientists to meet complex analytics needs.
Responsibilities:
- Design and create the overall data architecture (databases, large scale processing systems)
- Define how data will be stored, consumed, integrated, and managed by different data entities and IT systems
- Ensure data strategies and architectures comply with regulatory requirements
- Work with business executives to ensure the data architecture aligns with business requirements
- Ensure that data flows efficiently through the pipelines, making it usable for further analysis and reporting
- Implement, maintain, and update data management tools and systems
- Optimize data retrieval, develop dashboards and reports, and perform database maintenance tasks
- Collaborate closely with data scientists, helping to ensure that the organization's data infrastructure meets the requirements of complex data analytics
Requirements:
- Bachelor's or master's degree in Computer Science, Engineering, or a related field
- 4-7 years of experience as a Data Engineer, preferably in a professional services or consulting environment
- Strong proficiency in programming languages such as Python, Java, or Scala, with expertise in data processing frameworks and libraries (e.g., Spark, Hadoop, SQL, etc.)
- In-depth knowledge of database systems (relational and NoSQL), data modeling, and data warehousing concepts
- Experience with cloud-based data platforms and services (e.g., AWS, Azure, Google Cloud) including familiarity with relevant tools and technologies (e.g., S3, Redshift, BigQuery, etc.)
- Experience with computer vision and/or intelligent document processing
- Proficiency in designing and implementing ETL processes and data integration workflows using tools like Apache Airflow, Informatica, or Talend
- Familiarity with data governance practices, data quality frameworks, and data security principles
- Strong analytical and problem-solving skills, with the ability to translate business requirements into technical solutions
- Excellent communication and collaboration skills, with the ability to work effectively with clients and cross-functional teams
- Self-motivated and proactive, with a passion for learning and staying updated with the latest trends and advancements in the field of data engineering
- Able to work with ambiguity and turn client wants and needs into user stories and epics that can be executed during a sprint; this requires a working knowledge of agile software delivery
- A firm understanding of the software development life cycle (SDLC)
- An understanding of object-oriented programming
- Works well with minimal direction
- AWS background, including AWS CloudFormation and AWS Database Migration Service (DMS)
- A solution-engineering mindset
- A curious nature and inquisitive attitude when approaching problems
- An attitude that 'good is not good enough' for our clients
- Snowflake or Databricks certifications and/or hands-on experience
Certifications:
- AWS Certified Solutions Architect – Associate
- AWS Certified Solutions Architect – Professional
- AWS Certified Developer – Associate
- AWS Certified Machine Learning – Specialty
- AWS Certified Data Engineer - Associate
- HashiCorp Certified: Terraform Associate
- Certified Associate in Python Programming (PCAP)
- Certified Entry-Level Data Analyst with Python (PCED)
- Snowflake SnowPro Core
- Databricks Certified Data Engineer Associate