DATA ENGINEER
New York, NY

Max Goldstein

Data engineer with 3+ years building production pipelines in high-availability financial environments. Experienced across ETL/ELT, real-time streaming, Linux infrastructure, and cloud, with a hands-on approach to building and operating systems end to end.

01

Experience

Apr 2024 — Present
TP ICAP
Junior Data Engineer / Market Data Support Analyst
  • Developed and maintained ETL pipelines to extract, transform, and deliver broker trade data to external clients with SLA-bound accuracy and timeliness requirements
  • Produced and monitored real-time data streams across Bloomberg and LSEG market data terminals, ensuring consistent uptime and data integrity for downstream consumers
  • Supported application deployments and configuration changes across Linux and Windows environments; used ITRS Geneos for proactive monitoring and incident resolution
  • Liaised with front office stakeholders to support internal data workflows and ensure accurate delivery to external clients
Apr 2022 — Apr 2024
TP ICAP
Pricing Analyst
  • Maintained and quality-controlled data models across multiple asset classes including CDS, LATAM bonds, US Treasuries, and MBS — ensuring accurate real-time distribution to global clients
  • Contributed to pipeline development and go-live testing for new data product offerings; collaborated with engineering on CI/CD processes across multiple markets
  • Built automated pricing tools using Excel VBA macros and Bloomberg BQL — reducing manual effort for CAD bond and CDS pricing workflows
  • Replaced manual natural gas pricing procedures with scripted, repeatable workflows, improving consistency and speed
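The kind of repeatable pricing step described above can be sketched in a few lines. This is an illustration only, not the actual tooling from the role (which was built in Excel VBA and BQL): a minimal Python sketch of pricing a plain-vanilla fixed-coupon bond from yield, with hypothetical inputs.

```python
# Minimal sketch of a scripted pricing step: present value of a
# fixed-coupon bond given a yield. All inputs are hypothetical
# illustrations, not figures from the actual workflow.

def bond_price(face: float, coupon_rate: float, ytm: float, years: int,
               freq: int = 2) -> float:
    """Price a plain-vanilla bond by discounting coupons and principal."""
    periods = years * freq
    coupon = face * coupon_rate / freq
    rate = ytm / freq
    pv_coupons = sum(coupon / (1 + rate) ** t for t in range(1, periods + 1))
    pv_face = face / (1 + rate) ** periods
    return pv_coupons + pv_face

if __name__ == "__main__":
    # A bond yielding exactly its coupon rate prices at par.
    print(round(bond_price(100, 0.05, 0.05, 10), 2))  # 100.0
```

Moving a calculation like this out of a manual spreadsheet step and into a script is what makes the workflow repeatable and auditable.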
2024 — Present
Personal Project
Homelab Infrastructure
  • Built and operate a self-hosted server running Proxmox with multiple containerized services across Docker and LXC containers, learning hands-on infrastructure and systems administration along the way.
  • Set up a dedicated networking stack handling DHCP, DNS-level ad blocking, VPN access, and per-device traffic visibility across the whole home network.
  • Built a local data engineering environment running Airflow for pipeline orchestration, Postgres as the data warehouse, dbt for transformations, and Metabase for visualization.
  • Deployed infrastructure monitoring using Grafana and Prometheus with live dashboards tracking system health, storage capacity, and network activity across all hosts.
02

Certifications

2023
IBM / Coursera
IBM Data Engineering Professional Certificate
  • Built ETL pipelines in Python extracting from REST APIs and HTML sources, transforming with pandas/NumPy, loading into relational databases
  • Developed shell scripts and cron jobs to automate scheduled pipeline runs; wrote complex SQL across multi-table schemas
  • Hands-on work with Jupyter Notebooks, SQLite, PostgreSQL, and IBM Db2
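The extract-transform-load pattern from the coursework above can be sketched end to end. This minimal example stands in a hard-coded JSON payload for a live REST response and uses SQLite (which the coursework covered) as the target database; all field names and values are illustrative.

```python
import json
import sqlite3

# Minimal ETL sketch: extract from a JSON payload (standing in for a
# REST API response), transform with plain Python, load into SQLite.
# All field names and values are illustrative.

RAW = '[{"symbol": "abc", "price": "101.5"}, {"symbol": "xyz", "price": "99.25"}]'

def extract(payload: str) -> list[dict]:
    return json.loads(payload)

def transform(rows: list[dict]) -> list[tuple]:
    # Normalize symbols to uppercase and cast prices to float.
    return [(r["symbol"].upper(), float(r["price"])) for r in rows]

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    conn.execute("CREATE TABLE IF NOT EXISTS prices (symbol TEXT, price REAL)")
    conn.executemany("INSERT INTO prices VALUES (?, ?)", rows)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    load(transform(extract(RAW)), conn)
    print(conn.execute("SELECT COUNT(*) FROM prices").fetchone()[0])  # 2
```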
2024
Amazon Web Services
AWS Certified Cloud Practitioner
  • Foundational certification covering AWS core services, cloud architecture, security, and pricing models
  • Practical experience with S3, EC2, Glue, Athena, and IAM through project work and coursework
03

Skills

Infrastructure
  • Linux (RHEL / Ubuntu)
  • Windows Server
  • Docker / Proxmox
  • Git / GitLab / CI/CD
  • Cron / Bash
  • ITRS Geneos
  • ServiceNow
Cloud & Big Data
  • AWS S3 / Glue / Athena
  • AWS EC2
  • Databricks
  • PySpark
  • Apache Airflow
  • dbt Core
Languages & Data
  • Python (pandas, NumPy)
  • SQL (MySQL, PostgreSQL)
  • Bash / Shell
  • Excel / VBA / BQL
  • REST APIs
  • Bloomberg Terminal
Tools
  • Jupyter Notebooks
  • Jira / Confluence
  • Grafana / Prometheus
  • Metabase
  • Pi-hole / WireGuard
  • Home Assistant
04

About

I'm a data engineer based in New York, currently working at TP ICAP where I build and maintain production pipelines handling real-time financial market data. My background sits at the intersection of finance and engineering — I came up through pricing and market data operations before moving fully into the engineering side.

Outside of work I run a self-hosted homelab that doubles as a learning environment. It started as a media server and turned into a full infrastructure project — networking, monitoring, automation, and a local data engineering stack. Building things end-to-end, from raw hardware to working pipelines, is how I learn best.

I'm actively seeking data engineering roles where I can work closer to the pipeline and platform layer, particularly in environments that deal with real-time or high-volume data.

Location: New York, NY
Current role: Junior Data Engineer, TP ICAP
Education: B.S., University of Connecticut, 2020
Interests: Self-hosted infrastructure, real-time data, home automation
Currently building: Housing market data engineering pipeline (Zillow + Redfin)
Availability: Open to new opportunities
Let's Talk
[email protected]