L3Harris Technologies is a trusted disruptor in defense tech, dedicated to delivering end-to-end technology solutions for national security. They are seeking a Senior Specialist, Data Engineer to oversee data ETL/ELT pipelines, maintain data frameworks, and ensure seamless access to data across the organization.
Responsibilities:
- Design, build, and maintain robust data pipelines to ensure reliable data flow across the enterprise
- Maintain data pipeline schedules, orchestrate workflows, and monitor the overall health of data pipelines to ensure continuous data availability
- Create, update, and optimize data connections, datasets, and transformations to align with business needs
- Troubleshoot and resolve data sync issues, ensuring consistent and correct data flow from source systems
- Collaborate with cross-functional teams to uphold data quality standards and ensure accurate data is available for use
- Utilize Palantir Foundry to establish data connections to source applications, extract and load data, and design complex logical data models that meet functional and technical specifications
- Develop and manage data cleansing, consolidation, and integration mechanisms to support big data analytics at scale
- Build visualizations using Palantir Foundry tools and assist business users with testing, troubleshooting, and documentation creation, including data maintenance guides
Requirements:
- Bachelor's Degree and a minimum of 6 years of prior Palantir experience, or a Graduate Degree and a minimum of 4 years of prior Palantir experience. In lieu of a degree, a minimum of 10 years of prior Palantir experience
- 4+ years of experience with Data Pipeline development or ETL tools such as Palantir Foundry, Azure Data Factory, SSIS, or Python
- 4+ years of experience in Data Integration
- 4+ years of experience designing and developing data pipelines in Palantir Foundry (Pipeline Builder or Code Repositories) using PySpark and Spark SQL, including data build/sync schedule deployment in Palantir
- Understanding of BI (Business Intelligence) & DW (Data Warehouse) development methodologies
- Hands-on experience with the Snowflake cloud data platform
- Experience with Python, Pandas, Databricks, JavaScript, TypeScript, or other scripting languages and tools
- Experience with ETL tools such as Palantir Foundry, ADF (Azure Data Factory), SSIS, Informatica, or Talend is preferred
- Working knowledge of connecting to and extracting data from various ERP applications such as Oracle EBS, SAP ECC/S4, and Deltek Costpoint, as well as REST APIs and other sources
- Experience with AI tools such as OpenAI, Palantir AIP, Snowflake Cortex or similar