Greystar is a leading global real estate platform offering expertise across services related to rental housing. They are seeking a Senior Data Engineer/Architect to join their D2AI team, focusing on leveraging Azure SQL, Cosmos DB, and Databricks to develop and optimize data capabilities for customer-facing applications.
Responsibilities:
- 100% hands-on development – Azure SQL, Cosmos DB, Databricks: develop and unit test database code, including but not limited to T-SQL, stored procedures, functions, and views
- Own and maintain the Databricks data ingestion and output for our Microsoft Customer Insights CDP platform
- Create and maintain database structures
- Participate in the design of databases, using first, second, or third normal form as needed to support business requirements
- Create, deploy, and maintain ADF pipelines, adhering to Greystar’s standards and documented best practices
- Monitor database performance, identify bottlenecks and implement improvements to ensure scalable and reliable database operations in a production environment
- Perform analysis of complex data and document findings
- Prepare data for prescriptive and predictive modeling
- Combine raw data from different external sources and build and support complex ingestions
- Collaborate with application developers/data analysts who will be consuming the data
- Play a direct role in the maintenance, technical support, documentation, and administration of databases
- Ensure standards are followed by participating in code reviews
- Act as a thought leader who brings forward new ideas and methods and delivers high-quality code
Requirements:
- 6+ years relevant and progressive data engineering experience
- Deep technical knowledge and experience in Microsoft Azure architecture, including Azure PaaS databases, Synapse, ADF pipelines, Azure Functions, Event Grid, etc.
- 3+ years of experience with Cosmos DB
- 3+ years of experience with Databricks
- Hands-on skills working with data pipelines using SQL and NoSQL databases
- Minimum of 1 year of relevant experience working with Azure Data Lake Storage Gen2
- Experience with Power Platform / Power BI
- Experience in engineering practices such as code refactoring, design patterns, CI/CD, and highly scalable data applications
- Experience developing batch ETL pipelines; real-time pipelines are a plus
- Knowledge of advanced data engineering concepts such as dimensional modeling, ETL, data governance, data warehousing, structured and unstructured data
- Good experience with Python, machine learning frameworks, and statistics
- Knowledge of Agile software development process
- Excellent problem-solving skills and experience
- Strong communication and collaboration skills
- "Self-starter" attitude and the ability to make decisions with minimal guidance from others
- Innovative and passionate about your work and the work of your teammates
- Ability to comprehend and analyze operational systems and ask appropriate questions to determine how to improve, migrate or modify the solution to meet business needs
- Bachelor's Degree in computer science, information technology, business management, information systems, or equivalent experience
- Experience / Familiarity with D365 Customer Insights platform / Dataverse is a plus
- Databricks or other certifications are nice to have but not required
- Advanced degree preferred