Bring your passion for data to help build a modern enterprise data platform for our client, a bank based in McKinney, TX. You will design, develop, and maintain data pipelines, integrating data from various core systems including banking, Salesforce CRM, and mortgage platforms.

Key Responsibilities

  • Design, develop, and maintain data pipelines to support enterprise data products in Azure and legacy systems.
  • Apply strong data modeling practices for raw, canonical, and semantic data zones.
  • Integrate data from core banking systems, Salesforce CRM, and mortgage origination and servicing platforms.
  • Utilize a mix of legacy and modern platforms, including MS-SQL, Snowflake, and Azure SQL DB.
  • Leverage ETL tools like SSIS, ADF, and DBT for data transformation.
  • Follow CI/CD practices using Azure DevOps (ADO).
  • Use container orchestration tools such as Azure Kubernetes Service (AKS) to manage the data lifecycle.

Qualifications

  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • 5+ years of experience in data engineering with SQL & Azure.
  • 3+ years of SQL programming experience.
  • 3+ years of programming experience in Python, Java, or C#.
  • Proficiency in MS-SQL, Snowflake, and Azure SQL DB.
  • Hands-on experience with ETL tools such as SSIS, ADF, and DBT.
  • Familiarity with CI/CD tools like Azure DevOps (ADO).
  • Experience with container tools, specifically AKS.
  • Experience with stream-processing technologies such as Kafka, Spark Streaming, and Azure Service Bus.
  • Willingness and ability to work onsite in McKinney or Plano, TX three days per week, with two days remote.

To apply for this job, email your details to thane@arlensa.com.