TECHNICAL SKILLS
Must Have
- administration of Hadoop platforms and solutions
- advanced Python skills
- Alteryx
- API
- SQL
Location: San Antonio, TX - 4 days onsite, 1 day remote (Hybrid)
MINIMUM QUALIFICATIONS
- Bachelor’s degree in Computer Science, Engineering, or related field from an accredited university.
- Experience in a data integration role.
- Experience using Apache Spark, NiFi, and/or Kafka.
- Experience using Python.
- Experience integrating enterprise software using ETL modules.
- Knowledge of data architecture, structures and principles with the ability to critique data and system designs.
- Ability to design, create, and/or modify data processes that meet key timelines while conforming to predefined specifications, using the Informatica and/or MuleSoft platforms.
- Understanding of big data technologies and platforms (Hadoop, Spark, MapReduce, Hive, HBase, MongoDB).
- Ability to integrate data from web services using XML, JSON, SOAP, and flat-file formats.
- Knowledge of core concepts of RESTful API Modeling Language (RAML 1.0) and designing with MuleSoft solutions.
PREFERRED QUALIFICATIONS
- Relevant Certifications
- Experience in API Management
- Proficiency with the following databases/technologies: MuleSoft Anypoint Studio, Informatica PowerCenter, Oracle RDBMS, PL/SQL, MySQL
- Knowledge of Test Driven Development (TDD)
- Familiarity with cloud-based architecture
- Experience with data analysis and model prototyping using Spark/Python/SQL and common data science tools and libraries (e.g., NumPy, Pandas, scikit-learn, TensorFlow)
- Experience in a technology organization
POSITION SUMMARY
Develop and automate computing processes that detect, predict, and respond to opportunities in business operations. Work with a variety of disparate datasets spanning many disciplines and business units, including weather, transmission and distribution grid infrastructure, power generation, gas delivery, commercial market operations, safety and security, and customer engagement. Apply data integration best practices to achieve true business integration, merging and securing data in a way that reduces maintenance costs and increases the utilization of enterprise-wide data as an asset, and develop business intelligence.
TASKS AND RESPONSIBILITIES
- Design, develop, and unit test new or existing ETL/data integration solutions to meet business requirements.
- Provide daily production support for the Enterprise Data Warehouse, including jobs in Informatica/MuleSoft and Oracle PL/SQL, and be flexible in managing high-severity incident and problem resolution.
- Participate in troubleshooting and resolving data integration issues such as data quality.
- Deliver increased productivity and effectiveness through rapid delivery of high-quality applications.
- Provide work estimates and communicate status of assignments.
- Assist in QA efforts on tasks by providing input for test cases and supporting test case execution.
- Analyze transaction errors, troubleshoot issues in the software, develop bug fixes, and participate in performance tuning efforts.
- Develop Informatica/MuleSoft mappings and complex Oracle PL/SQL programs for the Data Warehouse.
- Select and use DevOps tools for continuous integration, builds, and monitoring of solutions.
- May provide input to area budget.
- Make some independent decisions and recommendations that affect the section, department, and/or division.
- Perform other duties as assigned.