Building an AI Trading Agent with Claude and News Signals
Build an automated trading agent that extracts market signals from news using Claude Haiku, executes trades via Alpaca, and manages positions with trailing stops and sentiment monitoring.
What if you could have an AI agent that reads financial news 24/7, extracts trading signals, and executes trades automatically? In this tutorial, we’ll build exactly that—a complete trading agent powered by Claude Haiku that aggregates news from multiple sources, identifies market-moving events, and trades volatile stocks with intelligent position management.
What We’re Building
Our trading agent has four main components:
┌─────────────────────────────────────────────────────────────┐
│                 TRADING AGENT ARCHITECTURE                  │
├─────────────────────────────────────────────────────────────┤
│                                                             │
│  ┌──────────────┐    ┌──────────────┐    ┌──────────────┐   │
│  │     NEWS     │    │    SIGNAL    │    │   TRADING    │   │
│  │  AGGREGATOR  │───▶│  EXTRACTOR   │───▶│    ENGINE    │   │
│  │              │    │              │    │              │   │
│  │ • Reddit     │    │ • Claude     │    │ • Alpaca API │   │
│  │ • RSS Feeds  │    │   Haiku 4.5  │    │ • Position   │   │
│  │ • Google     │    │ • Structured │    │   Management │   │
│  │ • Finnhub    │    │   Output     │    │ • Risk Mgmt  │   │
│  └──────────────┘    └──────────────┘    └──────────────┘   │
│                                                             │
│  ┌───────────────────────────────────────────────────────┐  │
│  │                       SCHEDULER                       │  │
│  │ 6AM: News │ 9:30AM: Trade │ 12PM: Check │ 4PM: Close  │  │
│  └───────────────────────────────────────────────────────┘  │
└─────────────────────────────────────────────────────────────┘
Key features:
- Multi-source news aggregation (Reddit, RSS, Google News, Finnhub)
- AI-powered signal extraction with confidence scoring
- Smart position sizing for small accounts ($1,000)
- Trailing stops that lock in gains
- Sentiment-based exit signals
- Market regime detection for shorts
- Holiday-aware scheduling
Why News-Based Trading?
Traditional quantitative models try to predict price movements from historical data. The problem? Markets are largely efficient—by the time a pattern is detectable, it’s often already priced in.
News-based trading takes a different approach: instead of predicting what will happen, we react to what is happening. When a company announces earnings, when the Fed changes policy, when geopolitical events unfold—these are the moments that move markets.
Project Setup
Create the project structure:
mkdir -p trading-agent/{signals,scripts,logs,bot_output}
cd trading-agent
python -m venv venv
source venv/bin/activate
pip install boto3 feedparser requests yfinance alpaca-py
Set up your environment variables:
export AWS_DEFAULT_REGION=us-east-1
export ALPACA_API_KEY=your_paper_trading_key
export ALPACA_API_SECRET=your_paper_trading_secret
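Before wiring up the pipeline, it helps to confirm the credentials actually work. The snippet below is a minimal sanity check, not part of the agent itself; the filename scripts/check_env.py is just a suggestion, and it assumes your AWS credentials are already configured (profile or environment variables) alongside the Alpaca paper-trading keys above.

# scripts/check_env.py - quick credentials sanity check (hypothetical helper script)
import os

import boto3
from alpaca.trading.client import TradingClient

# Bedrock client construction fails fast if the AWS region/credentials are misconfigured
bedrock = boto3.client(
    "bedrock-runtime",
    region_name=os.environ.get("AWS_DEFAULT_REGION", "us-east-1"),
)
print("Bedrock client OK:", bedrock.meta.region_name)

# Paper-trading account check via alpaca-py
trading = TradingClient(
    os.environ["ALPACA_API_KEY"],
    os.environ["ALPACA_API_SECRET"],
    paper=True,
)
account = trading.get_account()
print("Alpaca paper account:", account.status, "cash:", account.cash)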
Part 1: News Aggregation
The first component collects news from multiple sources. Each source has different strengths:
| Source | Strengths | Latency |
|---|---|---|
| Reddit (WSB, stocks) | Retail sentiment, meme stocks | Real-time |
| RSS Feeds (CNBC, BBC) | Mainstream news, macro events | Minutes |
| Google News | Broad coverage, company-specific | Minutes |
| Finnhub | Structured data, earnings | Seconds |
# signals/news_aggregator.py
import hashlib
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List, Optional

import feedparser
import requests


@dataclass
class NewsItem:
    """Standardized news item from any source."""
    id: str
    source: str
    title: str
    content: str
    url: Optional[str]
    timestamp: str
    tickers: List[str]


class RedditNews:
    """Fetch from Reddit finance subreddits via RSS."""

    SUBREDDITS = [
        "wallstreetbets",
        "stocks",
        "investing",
        "options",
        "stockmarket",
    ]

    def fetch(self, since_hours: int = 24) -> List[NewsItem]:
        items = []
        cutoff = datetime.utcnow() - timedelta(hours=since_hours)
        for subreddit in self.SUBREDDITS:
            url = f"https://www.reddit.com/r/{subreddit}/hot.rss"
            feed = feedparser.parse(url)
            for entry in feed.entries[:20]:
                # Parse timestamp; fall back to updated_parsed when published_parsed is absent
                parsed = entry.get("published_parsed") or entry.get("updated_parsed")
                if not parsed:
                    continue
                published = datetime(*parsed[:6])
                if published < cutoff:
                    continue
                items.append(NewsItem(
                    id=hashlib.md5(entry.title.encode()).hexdigest()[:16],
                    source=f"reddit/{subreddit}",
                    title=entry.title,
                    content=f"{entry.title}. {entry.get('summary', '')}",
                    url=entry.link,
                    timestamp=published.isoformat(),
                    tickers=[]  # Will be extracted by the signal extractor
                ))
        return items


class RSSNews:
    """Fetch from financial news RSS feeds."""

    FEEDS = {
        "cnbc_top": "https://search.cnbc.com/rs/search/combinedcms/view.xml?partnerId=wrss01&id=100003114",
        "bbc_business": "http://feeds.bbci.co.uk/news/business/rss.xml",
        "nyt_business": "https://rss.nytimes.com/services/xml/rss/nyt/Business.xml",
    }

    def fetch(self, since_hours: int = 24) -> List[NewsItem]:
        items = []
        cutoff = datetime.utcnow() - timedelta(hours=since_hours)
        for source_name, url in self.FEEDS.items():
            try:
                feed = feedparser.parse(url)
                for entry in feed.entries[:15]:
                    items.append(NewsItem(
                        id=hashlib.md5(entry.title.encode()).hexdigest()[:16],
                        source=source_name,
                        title=entry.title,
                        content=f"{entry.title}. {entry.get('summary', '')}",
                        url=entry.get('link'),
                        timestamp=datetime.utcnow().isoformat(),
                        tickers=[]
                    ))
            except Exception as e:
                print(f"Error fetching {source_name}: {e}")
        return items


class NewsAggregator:
    """Aggregate news from all sources."""

    def __init__(self):
        # Google News and Finnhub sources plug in the same way; only two are shown here
        self.sources = [
            RedditNews(),
            RSSNews(),
        ]

    def fetch_all(self, since_hours: int = 24) -> List[NewsItem]:
        all_items = []
        for source in self.sources:
            items = source.fetch(since_hours)
            all_items.extend(items)
            print(f" Got {len(items)} items from {source.__class__.__name__}")
        # Deduplicate by ID
        seen = set()
        unique = []
        for item in all_items:
            if item.id not in seen:
                seen.add(item.id)
                unique.append(item)
        return sorted(unique, key=lambda x: x.timestamp, reverse=True)
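With the aggregator in place, a collection run is only a few lines. The sketch below shows how the 6 AM collect_news task might call it; it assumes the signals/ directory is importable as a package and that results land in bot_output/news_latest.json, which is just a filename chosen for illustration.

# Example: collect the last 12 hours of news and persist it for the signal extractor
import json
from dataclasses import asdict

from signals.news_aggregator import NewsAggregator

aggregator = NewsAggregator()
items = aggregator.fetch_all(since_hours=12)
print(f"Collected {len(items)} unique news items")

# Write to bot_output/ so the next pipeline stage can pick it up
with open("bot_output/news_latest.json", "w") as f:
    json.dump([asdict(item) for item in items], f, indent=2)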
Part 2: Signal Extraction with Claude
This is where the magic happens. We use Claude Haiku to analyze each news item and extract structured trading signals:
# signals/signal_extractor.py
import json
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional

import boto3


@dataclass
class TickerImpact:
    ticker: str
    company_name: str
    sentiment: str    # "bullish", "bearish", "neutral"
    magnitude: str    # "low", "medium", "high"
    confidence: float
    reasoning: str


@dataclass
class TradingSignal:
    timestamp: str
    source: str
    headline: str
    event_type: str
    is_market_relevant: bool
    primary_sentiment: str
    urgency: str      # "low", "medium", "high", "critical"
    confidence: float
    primary_tickers: List[TickerImpact]
    key_reasoning: List[str]


class SignalExtractor:
    """Extract trading signals using Claude Haiku."""

    MODEL_ID = "us.anthropic.claude-haiku-4-5-20251001-v1:0"

    SYSTEM_PROMPT = """You are a financial analyst AI that extracts trading signals from news.

For each news item, analyze and return a JSON object with:
{
  "is_market_relevant": boolean,
  "event_type": "earnings|merger|macro|geopolitical|product|legal|other",
  "primary_sentiment": "bullish|bearish|neutral",
  "urgency": "low|medium|high|critical",
  "confidence": 0.0-1.0,
  "primary_tickers": [
    {
      "ticker": "AAPL",
      "company_name": "Apple Inc",
      "sentiment": "bullish|bearish|neutral",
      "magnitude": "low|medium|high",
      "confidence": 0.0-1.0,
      "reasoning": "Brief explanation"
    }
  ],
  "key_reasoning": ["Point 1", "Point 2"]
}

Focus on:
- Direct company impacts (earnings, products, legal)
- Sector-wide effects (Fed policy, regulations)
- Second-order effects (oil prices → airlines)

Only include tickers with clear, actionable signals. Be conservative with confidence scores."""

    def __init__(self):
        self.client = boto3.client("bedrock-runtime", region_name="us-east-1")

    def extract_signal(self, news_text: str, source: str) -> Optional[TradingSignal]:
        """Extract a trading signal from a single news item."""
        prompt = f"""Analyze this financial news and extract trading signals:

Source: {source}
Content: {news_text}

Return only valid JSON, no markdown."""

        response = self.client.invoke_model(
            modelId=self.MODEL_ID,
            body=json.dumps({
                "anthropic_version": "bedrock-2023-05-31",
                "max_tokens": 1024,
                "system": self.SYSTEM_PROMPT,
                "messages": [{"role": "user", "content": prompt}]
            })
        )
        result = json.loads(response["body"].read())
        response_text = result["content"][0]["text"]

        # Parse JSON (handle markdown wrapping)
        response_text = response_text.strip()
        if response_text.startswith("```json"):
            response_text = response_text[7:]
        if response_text.startswith("```"):
            response_text = response_text[3:]
        if response_text.endswith("```"):
            response_text = response_text[:-3]

        try:
            data = json.loads(response_text.strip())
        except json.JSONDecodeError:
            return None  # Skip items the model couldn't structure

        # Convert to TradingSignal
        return TradingSignal(
            timestamp=datetime.utcnow().isoformat(),
            source=source,
            headline=news_text[:100],
            event_type=data.get("event_type", "other"),
            is_market_relevant=data.get("is_market_relevant", False),
            primary_sentiment=data.get("primary_sentiment", "neutral"),
            urgency=data.get("urgency", "low"),
            confidence=data.get("confidence", 0.5),
            primary_tickers=[
                TickerImpact(**t) for t in data.get("primary_tickers", [])
            ],
            key_reasoning=data.get("key_reasoning", [])
        )
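Here is one way the 6:30 AM extract_signals task might drive the extractor over the aggregated news and keep only actionable ticker-level signals. The 0.6 confidence cutoff is purely illustrative, and the imports assume signals/ is importable as a package.

# Example: turn collected news into ticker-level signals (threshold is illustrative)
from signals.news_aggregator import NewsAggregator
from signals.signal_extractor import SignalExtractor

extractor = SignalExtractor()
news = NewsAggregator().fetch_all(since_hours=12)

actionable = []
for item in news:
    signal = extractor.extract_signal(item.content, item.source)
    if signal is None or not signal.is_market_relevant:
        continue
    # Keep only confident, ticker-specific impacts
    for impact in signal.primary_tickers:
        if impact.confidence >= 0.6:
            actionable.append((impact.ticker, impact.sentiment, impact.confidence))

print(f"{len(actionable)} ticker-level signals extracted")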
Part 3: The Trading Engine
The trading engine executes signals with proper risk management:
# signals/trading_engine.py
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Dict, Optional


@dataclass
class TradingConfig:
    initial_capital: float = 1000.0
    paper_trading: bool = True

    # Position sizing
    max_position_pct: float = 0.10            # 10% max per position

    # Risk management
    max_positions: int = 10
    max_daily_loss_pct: float = 0.05          # Stop at 5% daily loss

    # Order management
    default_stop_loss_pct: float = 0.03       # 3% stop
    default_take_profit_pct: float = 0.06     # 6% target

    # Strategy
    long_only: bool = False                   # Allow shorts
    min_short_confidence: float = 0.80        # Higher bar for shorts
    require_weak_market_for_shorts: bool = True

    # Trailing stops
    use_trailing_stop: bool = True
    trailing_stop_activation_pct: float = 0.02   # Activate at 2% gain
    trailing_stop_distance_pct: float = 0.015    # Trail 1.5% behind


@dataclass
class Position:
    ticker: str
    side: str                  # "long" or "short"
    entry_price: float
    quantity: int
    entry_time: str
    stop_loss_price: float
    take_profit_price: float
    high_water_mark: float     # For trailing stops
    trailing_stop_active: bool = False


class TradingEngine:
    def __init__(self, config: Optional[TradingConfig] = None):
        self.config = config or TradingConfig()
        self.positions: Dict[str, Position] = {}
        self.cash = self.config.initial_capital
        self.daily_pnl = 0.0

    def execute_signal(self, signal: dict) -> Optional[str]:
        """Execute a trading signal."""
        ticker = signal["ticker"]
        action = signal["action"]
        confidence = signal["confidence"]

        # Check long-only mode
        if self.config.long_only and action == "SELL":
            return None

        # Higher confidence required for shorts
        if action == "SELL" and confidence < self.config.min_short_confidence:
            return None

        # Check market regime for shorts
        if action == "SELL" and self.config.require_weak_market_for_shorts:
            if not self._is_market_weak():
                return None

        # Get current price (_get_price wraps the Alpaca data API or a simulation; not shown here)
        price = self._get_price(ticker)
        if not price:
            return None

        # Calculate position size
        position_value = self.cash * self.config.max_position_pct
        quantity = int(position_value / price)
        if quantity == 0:
            return None

        # Calculate stops
        if action == "BUY":
            stop_loss = price * (1 - self.config.default_stop_loss_pct)
            take_profit = price * (1 + self.config.default_take_profit_pct)
            side = "long"
        else:
            stop_loss = price * (1 + self.config.default_stop_loss_pct)
            take_profit = price * (1 - self.config.default_take_profit_pct)
            side = "short"

        # Create position
        self.positions[ticker] = Position(
            ticker=ticker,
            side=side,
            entry_price=price,
            quantity=quantity,
            entry_time=datetime.now(timezone.utc).isoformat(),
            stop_loss_price=stop_loss,
            take_profit_price=take_profit,
            high_water_mark=price
        )
        self.cash -= price * quantity
        return f"ORDER-{ticker}-{datetime.now().strftime('%H%M%S')}"

    def update_trailing_stop(self, position: Position, current_price: float):
        """Update trailing stop for a long position."""
        if position.side != "long":
            return

        # Update high water mark
        if current_price > position.high_water_mark:
            position.high_water_mark = current_price

        # Check activation
        gain_pct = (position.high_water_mark - position.entry_price) / position.entry_price
        if gain_pct >= self.config.trailing_stop_activation_pct:
            position.trailing_stop_active = True

        # Update stop if active (the stop only ever moves up)
        if position.trailing_stop_active:
            new_stop = position.high_water_mark * (1 - self.config.trailing_stop_distance_pct)
            if new_stop > position.stop_loss_price:
                position.stop_loss_price = new_stop

    def _is_market_weak(self) -> bool:
        """Check if market is showing weakness (for shorts)."""
        try:
            import yfinance as yf
            spy = yf.download("SPY", period="1d", interval="1h", progress=False)
            if len(spy) < 2:
                return False
            open_price = float(spy['Open'].iloc[0])
            current = float(spy['Close'].iloc[-1])
            change = (current - open_price) / open_price
            return change < -0.003  # SPY down 0.3%+
        except Exception:
            return False
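Wiring the extractor's output into the engine is mostly a mapping exercise: bullish ticker impacts become BUY signals, bearish ones become SELL (short) signals, and execute_signal applies the risk checks. The sketch below uses hard-coded placeholder tuples where the real pipeline would pass the extractor's results.

# Example: route extracted signals into the trading engine
from signals.trading_engine import TradingConfig, TradingEngine

engine = TradingEngine(TradingConfig(initial_capital=1000.0, paper_trading=True))

# Placeholder for the extractor's output: (ticker, sentiment, confidence) tuples
actionable = [("RIOT", "bullish", 0.72), ("AMC", "bearish", 0.85)]

for ticker, sentiment, confidence in actionable:
    action = "BUY" if sentiment == "bullish" else "SELL" if sentiment == "bearish" else None
    if action is None:
        continue
    order_id = engine.execute_signal({
        "ticker": ticker,
        "action": action,
        "confidence": confidence,
    })
    if order_id:
        print(f"Placed {action} on {ticker}: {order_id}")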
Part 4: Volatile Stock Focus
A key insight from backtesting: with a small account ($1,000), expensive stocks like COST (~$960) or META (~$620) are hard to trade properly. At a 10% position cap ($100) you can't buy even a single share, so position sizing breaks down.
Cheap, volatile stocks are much better:
# Volatile, tradeable stocks watchlist
VOLATILE_WATCHLIST = [
    # Meme stocks / high retail interest
    "AMC", "GME", "PLTR", "SOFI", "RIVN", "LCID",
    # Crypto-adjacent
    "MARA", "RIOT", "CLSK",
    # Tech volatile
    "SNAP", "PINS", "HOOD", "RBLX",
    # Energy volatile
    "OXY", "DVN", "MRO",
    # Financials (tradeable price)
    "BAC", "WFC", "C",
    # Materials / Industrial
    "CLF", "X", "AA", "FCX",
    # Consumer discretionary
    "F", "GM", "AAL", "UAL", "DAL",
]
Why this matters:
| Stock | Price | Shares per $100 | 5% Move P&L |
|---|---|---|---|
| COST | $960 | 0 | Can’t trade |
| META | $620 | 0 | Can’t trade |
| RIOT | $19 | 5 | $4.75 |
| CLF | $14 | 7 | $4.90 |
| AMC | $1.60 | 62 | $4.96 |
With cheap stocks, you get proper position sizing AND amplified P&L from volatility.
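You can reproduce the table's logic yourself. The sketch below screens tickers by what a $100 position actually buys, pulling rough prices from yfinance, so the numbers will drift from the snapshot above.

# Sketch: screen tickers by what a $100 position can buy (prices via yfinance)
import yfinance as yf

POSITION_SIZE = 100.0  # 10% of a $1,000 account

for ticker in ["COST", "META", "RIOT", "CLF", "AMC"]:
    price = float(yf.Ticker(ticker).history(period="1d")["Close"].iloc[-1])
    shares = int(POSITION_SIZE // price)
    move_pnl = shares * price * 0.05  # P&L from a 5% move
    status = "tradeable" if shares > 0 else "can't trade at this size"
    print(f"{ticker:5s} ${price:8.2f}  shares={shares:3d}  5% move=${move_pnl:5.2f}  ({status})")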
Part 5: Automated Scheduling
The agent runs on a daily schedule, handling holidays automatically:
# signals/scheduler.py
from datetime import date

US_MARKET_HOLIDAYS_2026 = [
    date(2026, 1, 1),    # New Year's Day
    date(2026, 1, 19),   # MLK Day
    date(2026, 2, 16),   # Presidents Day
    date(2026, 4, 3),    # Good Friday
    date(2026, 5, 25),   # Memorial Day
    date(2026, 6, 19),   # Juneteenth
    date(2026, 7, 3),    # Independence Day (observed)
    date(2026, 9, 7),    # Labor Day
    date(2026, 11, 26),  # Thanksgiving
    date(2026, 12, 25),  # Christmas
]


def is_market_open_today() -> bool:
    today = date.today()
    if today.weekday() >= 5:  # Weekend
        return False
    if today in US_MARKET_HOLIDAYS_2026:
        return False
    return True
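The cron entries below invoke each stage by task name, so scheduler.py needs a small CLI dispatcher. One possible shape is sketched here; collect_news, extract_signals, market_open, midday_check, and daily_summary are assumed to be task functions defined elsewhere in the project.

# Sketch: CLI dispatcher at the bottom of signals/scheduler.py
# (the task functions referenced in TASKS are assumed to exist elsewhere)
import sys

TASKS = {
    "collect_news": collect_news,
    "extract_signals": extract_signals,
    "market_open": market_open,
    "midday_check": midday_check,
    "daily_summary": daily_summary,
}

if __name__ == "__main__":
    task_name = sys.argv[1] if len(sys.argv) > 1 else ""
    if task_name == "--check-market":
        print("open" if is_market_open_today() else "closed")
    elif task_name in TASKS:
        if is_market_open_today():
            TASKS[task_name]()
        else:
            print(f"Market closed today, skipping {task_name}")
    else:
        print(f"Unknown task: {task_name!r}. Options: {', '.join(TASKS)}")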
Cron schedule (comments show Eastern Time; the cron entries assume a server clock set to UTC, so shift them by an hour during daylight saving):
# 6:00 AM - Collect news
0 11 * * 1-5 /path/to/run_task.sh collect_news
# 6:30 AM - Extract signals
30 11 * * 1-5 /path/to/run_task.sh extract_signals
# 9:30 AM - Execute trades
30 14 * * 1-5 /path/to/run_task.sh market_open
# 12:00 PM - Midday check
0 17 * * 1-5 /path/to/run_task.sh midday_check
# 4:00 PM - Daily summary
0 21 * * 1-5 /path/to/run_task.sh daily_summary
Backtesting Results
We backtested the agent on the week of January 13-17, 2026:
STRATEGY COMPARISON - WEEK OF JAN 13-17, 2026

| Strategy | Weekly P&L | Win Rate | Notes |
|---|---|---|---|
| Original (expensive) | +$2.15 | 1/5 | COST, AMZN, META |
| With bad shorts | -$8.12 | 2/5 | Shorts crushed |
| Long-only tradeable | -$4.76 | 3/5 | CLF only |
| VOLATILE CHEAP | +$19.48 | 3/5 | RIOT, RBLX, AMC... |
The volatile cheap stock strategy delivered roughly 9x the weekly P&L of the original approach.
Daily breakdown:
| Day | P&L | Best Performer | Worst |
|---|---|---|---|
| Mon | +$20.18 | RIOT +7.5% | - |
| Tue | +$2.81 | RBLX +10.2% | AMC -6.9% |
| Wed | -$10.65 | CLSK +0.7% | RBLX -4.5% |
| Thu | -$3.96 | AMC +2.6% | CLSK -3.9% |
| Fri | +$11.10 | RIOT +8.6% | RIVN -3.1% |
Position Management
The agent uses several techniques to manage positions:
1. Trailing Stops
Once a position is up 2%, the trailing stop activates and follows 1.5% behind the high:
Entry: $19.00
Price hits $19.38 (2% gain) → Trailing stop activates
Price hits $20.00 → Stop moves to $19.70
Price drops to $19.80 → Stop stays at $19.70
Price hits $19.70 → Position closed, locked in $0.70 profit
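In practice this logic runs during the scheduled position checks. The method below is a sketch of what a midday exit pass could look like if added to TradingEngine; it reuses the _get_price and close_position helpers referenced elsewhere in the engine, neither of which is shown in full in this article.

# Sketch: a midday exit check built on the trailing-stop logic above
def check_exits(self):
    for ticker, position in list(self.positions.items()):
        price = self._get_price(ticker)
        if not price:
            continue
        if position.side == "long":
            # Ratchet the stop upward before testing exits
            self.update_trailing_stop(position, price)
            if price <= position.stop_loss_price:
                reason = "trailing_stop" if position.trailing_stop_active else "stop_loss"
                self.close_position(ticker, reason)
            elif price >= position.take_profit_price:
                self.close_position(ticker, "take_profit")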
2. Sentiment-Based Exits
If we’re long a stock and bearish news comes in with 65%+ confidence, we exit:
def check_positions_against_signals(self, signals):
    for signal in signals:
        if signal["ticker"] in self.positions:
            position = self.positions[signal["ticker"]]
            # Long position + bearish signal = exit
            if position.side == "long" and signal["action"] == "SELL":
                if signal["confidence"] >= 0.65:
                    self.close_position(signal["ticker"], "sentiment_reversal")
3. Market Regime Filter for Shorts
We only short when the market is weak (SPY down 0.3%+ intraday OR VIX elevated):
def _is_market_weak(self):
    spy_change = get_spy_intraday_change()
    vix_elevated = is_vix_above_average()
    return spy_change < -0.003 or vix_elevated
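The article doesn't show get_spy_intraday_change() or is_vix_above_average(); the sketch below is one possible yfinance-based implementation. Treating "elevated" as "above its 3-month average" is an assumption, not the agent's actual definition.

# Sketch: possible yfinance-based implementations of the helpers above
import yfinance as yf

def get_spy_intraday_change() -> float:
    """Fractional change of SPY from today's open to the latest intraday close."""
    spy = yf.Ticker("SPY").history(period="1d", interval="1h")
    if len(spy) < 2:
        return 0.0
    return float(spy["Close"].iloc[-1] / spy["Open"].iloc[0] - 1)

def is_vix_above_average() -> bool:
    """True when the latest VIX close sits above its 3-month average (one definition of 'elevated')."""
    vix = yf.Ticker("^VIX").history(period="3mo")["Close"]
    if len(vix) < 2:
        return False
    return float(vix.iloc[-1]) > float(vix.mean())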
Full Code
The complete trading agent code is available at:
Running the Agent
# Check market status
python scheduler.py --check-market
# Run manually
python scheduler.py collect_news
python scheduler.py extract_signals
python scheduler.py market_open
# Check status
./scripts/check_status.sh
# View logs
tail -f logs/cron.log
What’s Next
Ideas for extending the agent:
- Add more news sources - Twitter/X via RSS bridges, SEC filings
- Implement options trading - Higher leverage for strong signals
- Add technical filters - RSI, moving averages to confirm signals
- Multi-timeframe analysis - Combine daily and weekly signals
- Portfolio optimization - Sector balancing, correlation management
Key Takeaways
- News-based trading reacts to events, not predictions
- Claude Haiku is fast and cheap for signal extraction (~$0.001/item)
- Cheap volatile stocks work better for small accounts
- Trailing stops lock in gains automatically
- Market regime matters for shorts—don’t short in uptrends
- Automation removes emotion from trading decisions
The agent won’t make you rich overnight, but it demonstrates how AI can be applied to systematic trading. The real value is in the framework—you can extend it with better signals, more sophisticated risk management, and additional asset classes.
Happy trading, and remember: always paper trade first!