NinjaOne is looking for a skilled Data Engineer to join their team and drive the future of their data infrastructure. The role involves building, maintaining, and scaling systems to ensure smooth data flow, accuracy, and security, while collaborating with cross-functional teams to leverage data for business decisions.
Responsibilities:
- Design and implement scalable data pipelines that move and transform large volumes of data from multiple sources into central data warehouses, enabling business reporting and advanced analytics
- Manage and optimize the performance of relational databases, ensuring data availability, reliability, and consistency
- Automate and optimize data workflows to reduce manual processes and improve efficiency in data collection, storage, and processing
- Ensure the integrity and security of data across systems, monitor performance, and troubleshoot any issues that arise within the data pipeline
- Build dashboards and reports in Tableau and Databricks to expose key data points and trends to business stakeholders
- Work closely with data scientists, analysts, and other teams to gather requirements, understand data needs, and provide solutions that support data-driven decision-making
- Other duties as needed
Requirements:
- Bachelor's degree in Computer Science, Computer Engineering, or Information Technology preferred, or equivalent work experience
- 3+ years of experience in software development, with a strong focus on data engineering and data science
- Experience in building data pipelines and managing large-scale data systems using technologies like SQL and Python
- Proficiency in cloud platforms such as AWS, GCP, or Azure, and experience with tools like Airflow, Kafka, or dbt for orchestrating data workflows
- Experience with both relational databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra)
- Experience with data warehousing concepts and tools such as Redshift, BigQuery, or Snowflake