Scalence L.L.C. is seeking a motivated and skilled data engineering professional to join its team. The role involves designing, building, and maintaining data platforms that support the company's projects, and collaborating with stakeholders to translate business requirements into technical specifications.
Responsibilities:
- Design, build, and maintain scalable data platforms and pipelines using Python, SQL, Airflow, and Spark
- Collaborate with stakeholders to understand and translate business requirements into technical specifications
- Develop and implement data models that support analytics and reporting needs
- Ensure data accuracy, consistency, and reliability by implementing robust data validation and quality checks (see the sketch after this list)
- Work with cross-functional teams to deliver high-quality data solutions
- Continuously monitor and optimize data pipelines for performance, scalability, and cost-efficiency
- Implement monitoring and observability for data pipelines to track data quality and detect anomalies
- Maintain clear and comprehensive documentation of data processes and communicate technical concepts effectively to non-technical stakeholders
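To illustrate the pipeline and data-quality work described above, here is a minimal sketch of an Airflow DAG with a simple validation gate. It assumes Airflow 2.4+; the DAG name, schedule, and sample records are hypothetical, not part of Scalence's actual stack or codebase.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder extract step: a real pipeline would pull from a
    # source system (API, database, object store). Sample rows only.
    rows = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": None}]
    context["ti"].xcom_push(key="rows", value=rows)


def validate(**context):
    # Data-quality gate: fail the task if a required field is missing,
    # so bad records never reach downstream consumers.
    rows = context["ti"].xcom_pull(task_ids="extract", key="rows")
    bad = [r for r in rows if r.get("amount") is None]
    if bad:
        raise ValueError(f"{len(bad)} row(s) failed validation: {bad}")


with DAG(
    dag_id="example_quality_checked_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    validate_task = PythonOperator(task_id="validate", python_callable=validate)
    extract_task >> validate_task
```

A failed validation task surfaces in Airflow's UI and alerting, which is one common way the monitoring responsibility above is put into practice.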
Requirements:
- Applicants must be able to work directly on a W2 basis
- Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field
- 2 years of experience in data engineering and infrastructure
- Proficiency in Python, SQL, Airflow, and Spark
- Strong experience in building and maintaining robust data pipelines and ETL processes
- Excellent verbal and written communication skills, with the ability to convey technical information to non-technical audiences
- Proven ability to work effectively in a collaborative, cross-functional environment
- Experience with cloud platforms such as AWS, GCP, or Azure
- Familiarity with data warehousing technologies (e.g., Delta Lake, Microsoft Fabric, Snowflake, Redshift, BigQuery)
- Knowledge of data governance and data security best practices