Cloud Data Platform Engineer

Locations: Southlake, Texas, United States; Austin, Texas, United States; Chicago, Illinois, United States
Requisition ID: 2026-120056
Category: Engineering & Software Development
Position Type: Regular
Pay Range: USD $135,000.00 - $160,000.00 / Year
Application Deadline: 2026-03-25

Your Opportunity


At Schwab, you’re empowered to make an impact on your career. Here, innovative thought meets creative problem solving, helping us “challenge the status quo” and transform the finance industry together.

We believe in the importance of in-office collaboration and fully intend for the selected candidate for this role to work on site in the specified location(s).

Schwab Asset Management (SAM) is a leading asset manager supporting mutual funds, ETFs, and managed account products governed under stringent regulatory and compliance requirements. SAM operates in a multi‑cloud, multi‑custodian, multi‑vendor ecosystem, relying on a diverse set of external platforms such as Vestmark, Aladdin, Eagle, and others to serve its investment, operational, and regulatory functions.


This role sits directly within the SAM Data team, which is responsible for designing, building, operating, and enhancing SAM Data products and the platform capabilities underpinning the SAM Data platform.

The Cloud Data Platform Engineer sits within the SAM Data Technology organization, with a strong emphasis on the Investment Data domain. The role owns end‑to‑end data capabilities—from ingestion and domain modeling to platform frameworks, Data APIs, and Python‑based visualization/UI layers—that power Schwab Asset Management’s investment, operational, analytical, and regulatory use cases.

This position is expected to operate as a senior technical contributor and platform leader, defining reusable frameworks, setting engineering standards, and driving consistency across the SAMDA ecosystem. The engineer partners closely with architects, product owners, and investment stakeholders to ensure scalable, governed, and business‑aligned data solutions.

Responsibilities:

Platform Frameworks & Engineering Standards

  • Design and build reusable data platform frameworks for ingestion, transformation, validation, and consumption.
  • Establish standardized patterns for data pipelines, APIs, and visualization layers across SAMDA.
  • Define best practices for schema evolution, versioning, error handling, and observability.
  • Influence platform roadmap through hands‑on engineering leadership.

Design & Build Advanced Data Pipelines

  • Architect and implement complex, cloud‑native ETL/ELT pipelines supporting investment and analytical data.
  • Build reliable workflows using GCS, Dataproc, Cloud Dataflow, Composer (Airflow), and Pub/Sub.
  • Implement scalable transformations and curated layers in Snowflake and cloud data warehouses.

Investment Data Domain Focus

This role has deep ownership within the Investment Data domain, including but not limited to:

  • Holdings, positions, transactions, cash flows, and security master data
  • Portfolio, account, and instrument hierarchies
  • Performance, risk, exposure, and attribution datasets
  • Reference data, taxonomies, and domain models that support downstream analytics and reporting

The engineer is responsible for designing domain‑driven data models and operational data stores that accurately represent investment concepts and scale across multiple SAM products and platforms.

Investment Data Modeling & Domain Engineering

  • Design enterprise‑grade investment data models using Kimball, relational, and domain‑driven design principles.
  • Create operational and analytical data stores from the ground up, including taxonomies and canonical models.
  • Ensure models support regulatory, performance, and investment analytics use cases.

Data APIs & Service Layer

  • Design and implement Data APIs using Python (FastAPI / Flask) to expose curated investment datasets.
  • Build scalable, secure RESTful services for analytical and operational consumers.
  • Apply governance, access control, and data protection standards aligned with regulated environments.

Python Dashboards & Data Visualization

  • Develop Python‑based dashboards and UI applications using Streamlit, Dash, Panel, or similar frameworks.
  • Create interactive visualizations using Plotly, Matplotlib, and Seaborn to support investment insights.
  • Translate complex investment data into intuitive, self‑service analytical experiences.

Advanced Cloud & Application Engineering

  • Build data and application services using Cloud Run, Cloud Functions, and Cloud SQL.
  • Apply distributed processing frameworks such as Apache Spark, Beam, and Flink.
  • Package and deploy data services, APIs, and UI components using Docker.

DevOps, CI/CD & Automation

  • Lead CI/CD design for pipelines, APIs, and visualization apps using Git, Bitbucket, Bamboo, Jenkins, and GitHub Actions.
  • Implement automated testing, deployment, and release management patterns.
  • Drive infrastructure automation using Terraform or Google Cloud Deployment Manager.

Data Quality, Reliability & Observability

  • Define and implement data quality frameworks, reconciliation checks, and monitoring standards.
  • Proactively identify and resolve complex data, platform, and application issues.

Technical Leadership & Collaboration

  • Act as a senior technical leader and mentor for data engineers across SAMDA.
  • Lead design reviews and influence cross‑team engineering decisions.
  • Communicate complex platform and investment data concepts to technical and business stakeholders.

What you have


Required Qualifications

  • Bachelor’s degree in Computer Science, Information Technology, or equivalent practical experience.
  • 6–8 years of experience building cloud‑based data platforms and enterprise data solutions.
  • Strong experience in the Investment or Asset Management data domain.
  • Hands‑on expertise with Snowflake and GCP services (GCS, Cloud Run, Cloud Functions, Pub/Sub, Composer, Cloud SQL).
  • Advanced proficiency in Python for data engineering, API development, and visualization.
  • Proven experience building REST APIs using Python frameworks (FastAPI, Flask, or equivalent).
  • Experience with Python visualization and UI frameworks (Streamlit, Dash, Panel, or similar).
  • Strong background with distributed processing frameworks (Spark, Beam, or Flink).
  • Expertise in CI/CD, containerization (Docker), and infrastructure as code (Terraform or GCP Deployment Manager).

Preferred Qualifications

  • Experience designing platform‑level frameworks adopted by multiple engineering teams.
  • Deep understanding of regulated data environments, governance, lineage, and auditability.
  • Strong grasp of modern data architecture and self‑service analytics patterns.
  • Ability to influence platform strategy and mentor senior engineers.
  • Excellent documentation and executive‑level communication skills.

In addition to the salary range, this role is also eligible for bonus or incentive opportunities.


What’s in it for you

At Schwab, you’re empowered to shape your future. We champion your growth through meaningful work, continuous learning, and a culture of trust and collaboration—so you can build the skills to make a lasting impact. Our Hybrid Work and Flexibility approach balances our ongoing commitment to workplace flexibility, serving our clients, and our strong belief in the value of being together in person on a regular basis.

We offer a competitive benefits package that takes care of the whole you – both today and in the future:

  • 401(k) with company match and Employee stock purchase plan
  • Paid time for vacation, volunteering, and 28-day sabbatical after every 5 years of service for eligible positions
  • Paid parental leave and family building benefits
  • Tuition reimbursement
  • Health, dental, and vision insurance