Informatica is reimagining the supply chain with an AI-powered platform for designing and automating business processes. The Principal Backend Software Development Engineer will architect and implement integrations, build scalable data export pipelines, and optimize ETL processes to support customer data synchronization.
Responsibilities:
- Architect, design, and deliver high-quality, scalable code for complex data integration and platform features
- Drive technical decision-making and project execution as the primary Individual Contributor for integration initiatives
- Conduct thorough code reviews and mentor other engineers on best practices for performance, security, and scalability
- Collaborate cross-functionally with Product Management and other engineering teams to define requirements and deliver solutions that meet business needs
- Set the technical direction and standards for all data integration and ETL processes within the platform
- Identify and mitigate architectural risks associated with scaling our data infrastructure to support exponential customer growth
- Drive continuous improvement in system performance, observability, and operational efficiency
Requirements:
- 8+ years of professional software development experience, with a significant focus on large-scale data systems, ETL, or integration platforms
- A degree in a related technical field is required
- Demonstrated expertise in designing, building, and maintaining production-grade data pipelines (ETL/ELT)
- Expert-level proficiency in at least one modern programming language (e.g., Java, Python, Go) suitable for platform development
- Proven experience designing and implementing robust, external-facing REST APIs and webhooks
- Deep practical knowledge of building and operating distributed systems, concurrency, and high-availability architectures
- Experience in a lead technical role (LMTS/PMTS level or equivalent), driving and owning complex projects as an Individual Contributor
- Excellent written and verbal communication skills
Preferred Qualifications:
- Hands-on experience with Salesforce integration technologies such as MuleSoft or Salesforce Data Cloud
- Experience working with large-scale event streaming platforms (e.g., Kafka, Kinesis)
- Familiarity with containerization and orchestration technologies (e.g., Docker, Kubernetes)
- Experience designing or working with GraphQL API implementations
- Advanced degree in Computer Science or a related technical field
- Published work, such as open-source contributions, patents, or papers
- Exposure to the supply chain, logistics, or manufacturing industry