CopperPoint Insurance Companies is a leading provider of workers’ compensation and commercial insurance solutions. The company is seeking a Data Engineering Intern to support the design, development, and optimization of scalable data pipelines and analytics platforms, offering hands-on experience in a corporate environment.
Responsibilities:
- Assist in building and maintaining data ingestion and transformation pipelines
- Work with structured and semi-structured data
- Support ETL/ELT workflows using SQL and Python
- Participate in data quality checks and validation
- Document data pipelines and schemas
- Support performance tuning and optimization
- Collaborate with the team in Agile sprint execution
Requirements:
- Currently pursuing an undergraduate degree in Computer Science, Data Science, Information Systems, or a related field
- Proficient in SQL and comfortable writing queries to analyze and transform data
- Familiar with Python, including basic scripting and working with libraries such as Pandas
- Knowledgeable about ETL concepts and data pipeline workflows
- Experienced with version control tools, particularly Git
- Equipped with strong analytical and problem-solving skills
- Familiar with cloud platforms such as AWS, Azure, or GCP
- Experienced or eager to learn modern data platforms like Databricks and Snowflake
- Knowledgeable in distributed data processing using Spark or PySpark
- Exposed to workflow orchestration tools such as Airflow
- Comfortable working with and consuming REST APIs