Keller Postman LLC is a law firm representing clients in a wide range of legal matters. The Data Engineer will focus on designing, constructing, and maintaining scalable data management systems, using cloud technologies to enhance data processes and support analytics solutions.
Responsibilities:
- Develop, test, and maintain data architectures, including databases and large-scale processing systems
- Design, build, and optimize data pipelines and ETL/ELT processes leveraging Snowflake and Azure services
- Develop and maintain Snowflake data warehouses, ensuring efficient data modeling, partitioning, and performance tuning
- Implement data flow processes that automate and streamline data collection, processing, and analysis
- Ensure data governance, quality, and security best practices across all data platforms
- Collaborate with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision-making across the organization
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Provide operational support for existing data infrastructure and develop new solutions as needed
- Monitor, troubleshoot, and optimize system performance in Azure and Snowflake environments
- Support CI/CD pipelines and automation for data workflows and deployments
- Stay current with industry trends and innovations in data engineering and propose improvements to the existing data landscape
Requirements:
- Proficiency in Snowflake, Databricks, or similar tools, with experience in data warehousing
- Proficiency in SQL (complex queries, stored procedures, optimization) and data modeling; familiarity with Python for data engineering tasks
- Strong analytical skills, with the ability to collect, organize, analyze, and disseminate large volumes of information with accuracy and attention to detail
- Strong knowledge of ETL/ELT patterns, orchestration, and workflow automation
- Understanding of data governance, security, and compliance frameworks (e.g., GDPR, HIPAA)
- Adept at writing queries, building reports, and presenting findings
- Excellent problem-solving and troubleshooting skills
- Ability to work in a fast-paced environment and manage multiple projects simultaneously
- Strong communication skills, capable of conveying complex data issues to non-technical team members
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field
- A minimum of 5 years of experience in a data engineering role
- Experience working with Azure cloud services and data warehousing technologies
- Must be able to read, write, and speak fluent English
- Experience with Salesforce data integration is a plus
- Familiarity with Sigma Computing for reporting, data visualization, and business user self-service analytics
- Experience with streaming data technologies (Kafka, Event Hubs, or similar)
- Exposure to DevOps practices and Infrastructure as Code (e.g., Terraform, ARM templates)
- Relevant certifications in Azure or other cloud technologies are beneficial