The Mortgage Office streamlines and automates loan servicing, origination, and fund management for financial institutions. The software offers features such as loan tracking, payment processing, escrow management, and investor reporting, enhancing efficiency and compliance in servicing private money loans. TMO is recognized for robust, user-friendly solutions that help clients manage their lending portfolios effectively. Founded in 1978, we are proud of the work we do, and our customers and employees feel like family.

Come join our rapidly growing team as a Mid-Level Data Engineer and work on mission-critical projects.

Position Summary:
As a Data Engineer, you will be responsible for developing, maintaining, and optimizing data pipelines and systems that support the acquisition, storage, transformation, and analysis of large volumes of data. You will collaborate with cross-functional teams, including data scientists, analysts, software engineers, and operations teams to ensure the availability, reliability, and integrity of data for various business needs. This role requires strong technical expertise in data engineering principles, database management, and programming skills.

Responsibilities:
• Design, develop, and maintain data pipelines in Azure for ingesting, transforming, and loading data from various sources into centralized Azure data lakes and data warehouses. Ensure data quality and integrity throughout the process.
• Implement efficient ELT/ETL processes to ensure data quality, consistency, and reliability. Develop transformation processes to clean, aggregate, and enrich raw data, ensuring it is in the appropriate format for downstream analysis and consumption. Integrate data from diverse sources to provide a unified view of information.
• Design and implement efficient data models and database schemas that support the storage and retrieval of structured and unstructured data. Optimize data storage and access for performance and scalability.
• Apply modern data processing principles to streamline data import and transformation processes. Leverage modern data pipeline tools to reduce manual intervention during the ETL process. Ensure the efficiency and reliability of data ingestion and processing.
• Work closely with cross-functional teams to understand data requirements and translate them into technical solutions.
• Monitor data pipelines, troubleshoot issues, and ensure data integrity and security.
• Implement data quality controls and validation processes to identify and rectify data anomalies, inconsistencies, and errors. Collaborate with stakeholders to define and enforce data governance standards and policies.
• Identify performance bottlenecks in data pipelines and database systems and optimize queries, data structures, and infrastructure configurations to improve overall system performance and scalability.
• Implement appropriate security measures to protect sensitive data and ensure compliance with data privacy regulations. Monitor and address data security vulnerabilities and risks.
• Collaborate with cross-functional teams, including data scientists, analysts, and software engineers, to understand data requirements and deliver effective data solutions. Document data engineering processes, data flows, and system configurations.
• Stay updated with the latest trends, tools, and technologies in the field of data engineering. Proactively identify opportunities to improve data engineering practices and contribute to the evolution of data infrastructure.
• Support the development team by managing multiple database and server instances, implementing and tuning complex queries, providing input on design decisions that impact data, and managing data infrastructure in Azure (Data Lake, Data Warehouse, and Synapse).

Qualifications:
• Bachelor’s degree in Computer Science, Engineering, or a related field.
• 5+ years of experience as a Data Engineer, with a focus on designing and implementing data solutions on the Azure platform.
• Hands-on experience with Azure services such as Azure Synapse Analytics, Azure Data Factory, Azure Databricks, Azure SQL Database, etc.
• Strong experience in data engineering, including data pipeline development, ETL/ELT processes, and data modeling.
• Strong proficiency in SQL and experience with programming languages such as C#, Python, Scala, or Java.
• In-depth knowledge of SQL and experience with relational and non-relational databases such as SQL Server (preferred), MySQL, and PostgreSQL.
• Knowledge of data warehousing concepts, dimensional modeling, and best practices.
• Experience with version control systems such as Git.
• Excellent problem-solving skills and ability to work independently as well as part of a team.
• Azure certifications such as Azure Data Engineer Associate or Azure Solutions Architect are a plus.
• Excellent communication skills and the ability to collaborate effectively with cross-functional teams.

What You’ll Get:
• Full medical benefits.
• 401K with matching.
• Paid vacation.
• Paid sick days.
• Paid holidays.
• Compensation commensurate with experience.
• Hybrid work schedule.

Note: The benefits listed are subject to company policies and may vary based on location and employment terms.
Please note this is a full-time W2 position based at our offices in Huntington Beach, CA; a hybrid schedule may be available.
