Fundamental Investor
Augmented by AI

I build tools to enhance the investment process.

Public markets investor with a decade of experience across healthcare, SMID equities, technology, and multi-asset investing.

01

Research Tools

BIOTECH TRIAGE

PRIVATE
Summary

Systematically scores and prioritizes biotech names across quality and valuation dimensions to focus deep research where it is most likely to generate alpha.

Problem

With 400+ US-listed biotechs, deciding where to spend deep research time is itself a research problem. Without a structured framework, coverage tends to skew toward familiar names rather than the best opportunities.

Approach

Developed a two-axis scoring framework that evaluates every name on fundamental quality across six categories and valuation margin of safety using a sum-of-parts NPV model, sorting names into Priority Buy, Monitor, Avoid Trap, or Remove.
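The quadrant logic can be sketched as follows. Thresholds, field names, and the bucket boundaries are illustrative assumptions, not the framework's actual calibration:

```python
# Hypothetical sketch of the two-axis sort: a composite quality score
# (0-100, rolled up from the six categories) against valuation upside
# (NPV per share / price - 1). Bars are placeholder values.
def classify(quality_score: float, upside: float,
             quality_bar: float = 60.0, upside_bar: float = 0.30) -> str:
    if quality_score >= quality_bar and upside >= upside_bar:
        return "Priority Buy"
    if quality_score >= quality_bar:
        return "Monitor"      # good business, not enough margin of safety
    if upside >= upside_bar:
        return "Avoid Trap"   # screens cheap, fails the quality bar
    return "Remove"
```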

Why it matters

Makes idea prioritization systematic rather than intuitive, increasing the likelihood that research effort is concentrated where the risk/reward is most attractive.

Claude API MCP ReportLab


FORENSIC TRIAGE

PRIVATE
Summary

Screens the coverage universe for accounting quality concerns using sector-specific diagnostic frameworks built for healthcare services and medtech.

Problem

Accounting irregularities are often visible in filings before they affect the stock — but identifying them consistently across a large universe requires time and sector-specific knowledge that most workflows don't support.

Approach

Built a systematic screen that evaluates every name across eight flag families — including accruals quality, revenue recognition, balance sheet trends, and disclosure patterns — using rubrics calibrated for healthcare services and medtech business models.
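One flag family can be illustrated with a simple accruals-quality check in the spirit of the Sloan ratio. The threshold and function shape are illustrative, not the screen's actual rubric:

```python
# Hypothetical accruals-quality flag: fires when reported earnings run
# well ahead of operating cash flow, scaled by average total assets.
def accruals_flag(net_income: float, cfo: float, avg_assets: float,
                  threshold: float = 0.10) -> bool:
    ratio = (net_income - cfo) / avg_assets  # Sloan-style accruals ratio
    return ratio > threshold
```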

Why it matters

Surfaces names with elevated accounting risk before they become consensus concerns, supporting better short idea generation and reducing the risk of holding deteriorating longs.

Python edgartools Claude API

SIGMA ALERT

PUBLIC
Summary

Monitors the full coverage universe in real time and flags statistically abnormal price moves throughout the trading day.

Problem

Significant price moves across a large universe happen constantly. Most analysts catch them for their core names but miss the periphery — by the time a move is visible, the opportunity to understand the cause has often passed.

Approach

Built a statistical screener that computes z-scores for every ticker against its trailing return distribution and fires alerts three times per trading day — open, midday, and close — with company name, sector, and magnitude of the move.

Why it matters

Ensures no meaningful move across the coverage universe goes unnoticed, creating more opportunities to act on price dislocations before they are fully explained by consensus.

GitHub Actions NumPy yfinance

13F ANALYZER

PRIVATE
Summary

Tracks institutional positioning across 35 funds each quarter to surface emerging conviction and identify names with significant smart-money activity.

Problem

Institutional 13F filings are public and information-rich, but extracting meaningful signal across dozens of funds each quarter is time-consuming and difficult to do consistently.

Approach

Built an automated pipeline that pulls four quarters of holdings data for 35 funds across seven investment style categories, diffs positions quarter over quarter, reconstructs 10-year holding history, and generates AI-assisted investment thesis summaries for top positions.
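The quarter-over-quarter diff step can be sketched as a set comparison over holdings. Bucket names and the share-count representation are assumptions:

```python
# Hypothetical diff of two quarters of holdings (ticker -> share count),
# classifying each name as new, exited, added, or trimmed.
def diff_holdings(prev: dict[str, int], curr: dict[str, int]) -> dict[str, list[str]]:
    out: dict[str, list[str]] = {"new": [], "exited": [], "added": [], "trimmed": []}
    for t in sorted(set(prev) | set(curr)):
        p, c = prev.get(t, 0), curr.get(t, 0)
        if p == 0:
            out["new"].append(t)
        elif c == 0:
            out["exited"].append(t)
        elif c > p:
            out["added"].append(t)
        elif c < p:
            out["trimmed"].append(t)
    return out
```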

Why it matters

Converts a time-intensive manual process into a systematic quarterly input for idea generation, making institutional positioning a reliable part of the research process rather than an occasional reference.

yfinance ReportLab SEC Edgar

COVERAGE MANAGER

PRIVATE
Summary

Maintains a live investment universe of ~400 tickers with automated performance tracking and fundamentals enrichment across multiple data providers.

Problem

Managing a large coverage universe manually is operationally intensive — tickers go stale, sector tags drift, and performance data lives in disconnected spreadsheets. Most analysts either limit their universe or sacrifice data quality.

Approach

Built a weekly automated pipeline that validates, enriches, and segments the coverage universe, pulling from multiple fundamentals providers with automatic failover, and delivers performance reports by email and Slack every Friday morning.
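The automatic-failover piece can be sketched as trying providers in priority order and falling through on errors or empty payloads. The function shape is illustrative, not the pipeline's actual interface:

```python
# Hypothetical failover: `providers` is an ordered list of callables
# (e.g. Finnhub first, FMP second); the first usable payload wins.
def fetch_with_failover(ticker: str, providers: list) -> dict:
    last_err = None
    for fetch in providers:
        try:
            data = fetch(ticker)
            if data:              # empty/stale payloads also fail over
                return data
        except Exception as err:  # network or rate-limit failure
            last_err = err
    raise RuntimeError(f"all providers failed for {ticker}") from last_err
```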

Why it matters

Creates a reliable system of record for the coverage universe that feeds every downstream research workflow, freeing time for analysis rather than data maintenance.

Python Finnhub FMP

EARNINGS AGENT

PUBLIC
Summary

Automates earnings calendar maintenance across the full coverage universe, with consensus estimates surfaced directly in the calendar event.

Problem

Tracking earnings dates and consensus estimates across a large coverage list requires constant manual upkeep across multiple data sources, creating operational drag during the most information-intensive periods of the quarter.

Approach

Built a daily agent that pulls upcoming earnings dates from Finnhub for every name in the coverage universe and automatically creates Google Calendar events with timing, consensus EPS, and revenue estimates baked into the description.
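The calendar step comes down to shaping an all-day event body for the Google Calendar API. A minimal sketch of that payload, with the description format as an assumption:

```python
# Hypothetical event body for the Google Calendar API (v3 events.insert):
# an all-day event uses "date" rather than "dateTime" in start/end.
def build_earnings_event(ticker: str, date_iso: str,
                         eps_est: float, rev_est: str) -> dict:
    return {
        "summary": f"{ticker} earnings",
        "start": {"date": date_iso},
        "end": {"date": date_iso},
        "description": f"Consensus EPS: {eps_est} | Consensus revenue: {rev_est}",
    }
```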

Why it matters

Eliminates calendar maintenance entirely, ensures no earnings event is missed, and keeps consensus expectations visible without requiring a separate lookup during earnings season.

SQLite Google Calendar Finnhub

DAILY READS

PUBLIC
Summary

Streamlines information intake by aggregating and prioritizing high-signal content from curated sources, with selection criteria that sharpen over time based on feedback.

Problem

The volume of information relevant to investing has expanded dramatically, while most workflows for consuming it remain manual and fragmented. This creates a risk of missing important signals or spending time on low-value inputs.

Approach

Built a system that aggregates content from newsletters, curated web sources, and financial media, uses Claude to select four high-signal articles each morning, and refines its selection criteria over time based on explicit feedback.

Why it matters

Improves pattern recognition and allows more time to be spent on synthesis and judgment rather than information collection — and gets more useful the longer it runs.

GitHub Pages Gmail API Claude API
02

Design Philosophy

01_

Designed to reduce time spent on manual data work and increase time spent on analysis and judgment.

02_

Built to improve consistency across the research process — the same rigor applied to every name, not just the highest-conviction ones.

03_

Focused on surfacing signal, not generating conclusions — the tools inform judgment, they don't replace it.

04_

Structured to integrate into a fundamental, bottom-up workflow — outputs land in Slack, Gmail, Google Calendar, and PDF reports, not in a separate dashboard.

03

How They're Built

01_

Cron-driven, server-free.

Runs on GitHub Actions or Windows Task Scheduler. No servers to keep alive. The system exists only when it's executing.

02_

Secrets hygiene.

No keys in code. GitHub Actions secrets or local .env only. Security is architectural, not accidental.

03_

Idempotent reruns.

Re-trigger without duplicates or corruption. Scripts check state before mutation to ensure safety.

04_

Real-world data tolerance.

Holiday handling, stale-data guards, multi-currency, and robust API rate limit management.

05_

Composable, not monolithic.

Projects communicate through files (JSON/CSV), not live APIs. Failure in one node doesn't bring down the stack.

06_

Reasoning models + structured analysis.

Claude where reasoning matters, code where structure matters. Knowing where that line goes is the primary skill.

04

About

I am a public markets investor with a decade of experience investing across healthcare, SMID equities, and technology, with a foundation in multi-asset investing.

I have worked across long-only and long/short strategies within large, global investment platforms and on small, focused teams, contributing to differentiated investment outcomes over both short and long time horizons.

I build AI-native research workflows that enhance fundamental research and decision-making.

05

Contact