PGC Digital (America) Inc. is a cutting-edge technology organization operating at the intersection of Private Cloud, Data, and AI. The company is seeking a Senior Data Engineer to lead enterprise-scale data migration and consolidation initiatives, working closely with global enterprises to deliver client-specific solutions.
Responsibilities:
- Lead enterprise-scale data migration and consolidation initiatives
- Partner directly with global enterprises and industry leaders navigating complex modernization journeys
- Translate advanced migration acceleration capabilities into production-ready, client-specific solutions
- Architect and execute large-scale migrations from legacy enterprise platforms to modern cloud data ecosystems
- Operate across cloud architecture, data engineering, ontology modeling, and executive stakeholder engagement
- Own end-to-end delivery of 12–18+ month transformation programs while serving as a trusted advisor to C-suite and enterprise architecture leaders
Requirements:
- 7–10+ years of experience in enterprise data engineering, system integration, or large-scale data migration initiatives
- 3–5+ years leading end-to-end enterprise data migration or multi-system consolidation programs with full technical ownership
- Proven experience delivering migrations involving 3+ heterogeneous systems and 100M+ records, including complex master data harmonization and phased cutovers
- Advanced proficiency in Python and SQL, with hands-on experience in PySpark and TypeScript/JavaScript
- Deep expertise in ETL/ELT and integration platforms (Informatica, Talend, Matillion, Fivetran, AWS Glue, Azure Data Factory)
- Experience building scalable, version-controlled data pipelines with error handling, incremental loading, and Change Data Capture (CDC)
- Strong working knowledge of at least one major cloud provider (AWS, Azure, or GCP), including infrastructure, managed data services, and security best practices
- Experience with modern data warehouse and lakehouse platforms (Snowflake, Databricks, BigQuery, Redshift, Synapse, Delta Lake)
- Hands-on experience integrating with enterprise systems such as SAP (RFC, IDoc, BAPI, OData) and Oracle (AQ, GoldenGate, APIs)
- Familiarity with semantic modeling, knowledge graphs, ontology frameworks (RDF, OWL), or platforms such as Neo4j or Stardog
- Experience integrating LLMs or AI-driven tooling into transformation or schema mapping workflows
- Client-facing experience advising C-level executives and enterprise stakeholders
- Industry depth in at least two sectors such as Healthcare, Financial Services, Manufacturing, Retail, Energy & Utilities, or Public Sector
- Certifications in AI, SAP, cloud architecture, or data governance (e.g., AWS Solutions Architect, Azure Data Engineer, CDMP, SAP Technology Associate)
- Production experience with real-time streaming platforms (Kafka, Kinesis, Event Hubs, Pub/Sub)
- Expertise with enterprise MDM platforms (Informatica MDM, SAP MDG, Profisee, Reltio)
- Experience building APIs, microservices, and implementing service mesh patterns
- Proficiency with CI/CD pipelines and infrastructure-as-code tools (Jenkins, GitLab CI, Azure DevOps, Terraform, ArgoCD)
- Strong understanding of data privacy and regulatory compliance frameworks (GDPR, HIPAA, SOC 2, CCPA, FedRAMP)
- Familiarity with process mining, data observability, and catalog/lineage platforms (Celonis, Monte Carlo, Collibra, Alation, Apache Atlas)
- Working knowledge of financial close, supply chain, procurement, or revenue cycle workflows
- Experience with industry-specific standards such as EDI, HL7, FHIR, SWIFT, or XBRL