Skills: AWS, Cloud, Distributed Systems, Docker, JavaScript, Kubernetes, Microservices, Python, Selenium, SQL, TypeScript, .NET, C#, C, PowerShell, AI, Playwright, TeamCity, SQL Server, Datadog, Git, CI/CD, Communication, Collaboration, Remote Work
About this role
Role Overview
Design, build, and maintain scalable automated test frameworks covering UI, API, integration, and performance layers across AWS-based .NET services
Expand and refactor the existing automated test suite to increase coverage, reliability, and execution speed
Leverage AI and agentic AI tools to generate, enhance, and maintain automated tests, systematically reducing manual regression testing
Integrate automated tests into CI/CD pipelines to enforce quality gates and support rapid, high-confidence releases
Analyze test results, investigate failures, and partner with engineers to diagnose and resolve defects across distributed systems
Collaborate closely with developers and DevOps engineers to ensure new features are designed with testability, observability, and automation in mind
Continuously improve test data management, environment configuration, and reproducibility of test runs
Identify automation gaps and propose improvements that increase release cadence and reduce risk
Actively participate in architectural discussions, sprint planning, backlog grooming, and milestone reviews; partner with Engineering Leads to define scope, technical approach, deliverables, and quality criteria for upcoming releases
Contribute to documentation, standards, and best practices that elevate quality engineering across the organization
Ensure audit readiness by maintaining complete, traceable documentation in alignment with SOPs and Quality Management System (QMS)
Collaborate with DevOps and Cloud teams to optimize infrastructure scalability, reliability, and performance within compliance constraints
Support defect triage, root cause analysis, CAPA processes, and continuous improvement initiatives
Other duties as assigned
Requirements
Bachelor’s Degree in Computer Science, Engineering, Data Science, or related discipline, or equivalent work experience preferred
10+ years of hands-on experience in enterprise software engineering and test automation (SDET), including at least 6 years designing and scaling automated test frameworks preferred
Proven experience driving automation strategy and expanding adoption of automated testing across engineering teams
Strong experience implementing automation across all layers: database validation (SQL), API/integration (REST, HTTP), and end-to-end UI testing
Hands-on expertise with C#/.NET platform, Selenium and Playwright, TypeScript/JavaScript, HTML, and CSS
Experience implementing AI-enabled monitoring and anomaly detection to proactively identify system degradation or performance drift
Deep understanding of distributed systems, microservices architecture, and service-to-service communication
Experience integrating automated tests into CI/CD pipelines and implementing quality gates
Hands-on experience with containerization (Docker) and Kubernetes-based test execution environments
Experience maintaining complete, traceable, audit-ready documentation in alignment with SOPs and a Quality Management System (QMS)
Experience designing and executing performance/load testing strategies
Strong collaboration skills; experience partnering with Development Leads and Product teams in planning, grooming, and milestone definition
Demonstrated ability to influence engineering standards, mentor SDETs, and drive quality engineering best practices
Experience generating and maintaining performance validation documentation, including test plans, test cases, traceability matrices, summary reports, and deviation documentation