Oracle, a leading company in AI and cloud solutions, is seeking a Senior Principal Software Development Engineer for its AI Data Platform. The role involves providing architectural guidance, designing and optimizing data platforms, and collaborating with cross-functional teams to drive adoption of Oracle's AI Data Platform and related technologies.
Responsibilities:
- Design, implement, and maintain scalable software components and services that support AI/ML workloads
- Build APIs, SDKs, and automation frameworks to streamline the adoption of Oracle AI Data Platform and Gen AI services
- Optimize performance, scalability, and reliability of distributed data/AI systems
- Collaborate with cross-functional teams (engineering, product, and field) to solve complex technical challenges
- Participate in code reviews, testing, and CI/CD to ensure high-quality deliverables
- Document technical designs and contribute to knowledge-sharing (e.g., blogs, internal docs, demos)
- Continuously explore new tools, frameworks, and best practices in AI, cloud, and data engineering
- Design and implement scalable, secure, and efficient architectures for complex data workloads
- Manage and optimize large-scale databases
Requirements:
- Strong data engineering, HPC, and data science experience with Spark, PySpark, Delta Lake, Parquet, feature extraction, MLOps, and Flink, plus a deep understanding of the distributed-systems techniques that make these services work
- Full-stack development experience, spanning web application development, RESTful APIs, and SSE through to deploying your solution
- Experience with LLMs and agentic frameworks (e.g., MCP, LangChain, CrewAI, Semantic Kernel)
- Knowledge of RAG pipelines and vector DBs (e.g., Oracle 26ai, FAISS, Pinecone, Weaviate)
- Familiarity with OCI Gen AI Services and model lifecycle workflows
- Solid Python and REST API skills
- Exposure to building autonomous agents and orchestration pipelines
- Experience with cloud platforms such as Oracle Cloud Infrastructure (OCI), including OCI Big Data Service (BDS) and Big Data Appliance (BDA)
- Proficiency in big data technologies such as Hadoop, Spark, Kafka, and NoSQL databases
- Solid understanding of networking concepts, sufficient to design and optimize data transmission networks, configure network settings, and diagnose and resolve network-related issues
- Troubleshooting and problem-solving skills
- Excellent communication and collaboration skills
- Commitment to continuous learning and staying up to date with the latest big data technologies and trends
- Experience working with multiple cloud platforms
- Certification in a data platform or a related field