Doctrine

How we think about research.

These aren't aspirational statements. They're the rules we actually follow when doing research. We wrote them down so we'd stop arguing about process.

01

Try to kill your own ideas

Research is designed to reject hypotheses. If you can't specify what evidence would make you abandon a thesis, you don't have a thesis. You have a belief. We look for disconfirming evidence before accepting anything.

02

The data is the argument

A compelling narrative proves nothing by itself. If the regression doesn't hold out-of-sample, the story behind it is irrelevant. We've discarded ideas that sounded compelling because the numbers didn't back them up. That's the process working.
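The out-of-sample check this principle describes can be sketched in a few lines. Nothing below is from our stack; the function names and the 70/30 split are illustrative, and the fit is a deliberately simple univariate OLS in plain Python.

```python
import random

def fit_ols(xs, ys):
    """Least-squares slope and intercept for y ≈ a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var
    return a, my - a * mx

def out_of_sample_r2(xs, ys, split=0.7):
    """Fit on the first `split` fraction of the sample only,
    then score R^2 on the held-out remainder."""
    n = int(len(xs) * split)
    a, b = fit_ols(xs[:n], ys[:n])            # train window only
    hold_x, hold_y = xs[n:], ys[n:]
    mean_y = sum(hold_y) / len(hold_y)
    ss_res = sum((y - (a * x + b)) ** 2 for x, y in zip(hold_x, hold_y))
    ss_tot = sum((y - mean_y) ** 2 for y in hold_y)
    return 1.0 - ss_res / ss_tot

# A genuine linear relationship survives the holdout; pure noise does not,
# however good the story behind it sounds.
random.seed(0)
xs = [random.gauss(0, 1) for _ in range(500)]
real = [2 * x + 0.1 * random.gauss(0, 1) for x in xs]   # true signal
noise = [random.gauss(0, 1) for _ in range(500)]        # no relationship
```

The point of the holdout is exactly the doctrine: the narrative ("x drives y") gets no vote; only the held-out residuals do.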

03

If it can't be reproduced, it didn't happen

Every result must be reproducible from the same inputs and methodology. This is why Kernel hashes pipeline configs: if someone else can't get the same output from the same input, we treat the original result as suspect.
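Kernel's actual mechanism isn't shown anywhere in this document, so the following is only a minimal sketch of the general technique, assuming configs are JSON-serializable dicts: canonicalize, then hash, so the fingerprint depends on the parameters and nothing else.

```python
import hashlib
import json

def config_hash(config: dict) -> str:
    """Deterministic fingerprint of a pipeline config: canonical JSON
    (sorted keys, fixed separators) hashed with SHA-256."""
    canonical = json.dumps(config, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# The same logical config hashes identically regardless of key order...
a = config_hash({"universe": "us_equities", "lookback": 252})
b = config_hash({"lookback": 252, "universe": "us_equities"})
assert a == b
# ...and any parameter change produces a different fingerprint,
# so a result can be tied to exactly one config.
c = config_hash({"universe": "us_equities", "lookback": 253})
assert a != c
```

Storing the hash alongside every result makes the reproducibility test mechanical: rerun the config whose hash matches, and the outputs should match too.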

04

Risk first, always

The first question about any strategy or signal is: what can go wrong? How does it behave in a drawdown? What happens when correlations spike? We'd rather miss a good trade than take a bad one. This is a deliberate bias and we're comfortable with the opportunity cost.

05

Parameters are regime-dependent

A model calibrated on 2012–2019 low-vol data will blow up in a 2020-style event. We condition everything on detected regime state: exposure limits, rebalancing frequency, even which factors we trust. A single parameter set across all environments is a bug.
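A toy sketch of what "condition everything on regime state" can look like in code. The regime labels, the volatility threshold, and both parameter sets below are hypothetical, chosen only to make the shape of the idea concrete; real regime detection is a harder problem than one threshold.

```python
# Hypothetical parameter sets keyed by detected regime: exposure caps
# and rebalancing cadence both change with the environment.
REGIME_PARAMS = {
    "low_vol":  {"max_gross_exposure": 2.0, "rebalance_days": 5},
    "high_vol": {"max_gross_exposure": 0.5, "rebalance_days": 1},
}

def detect_regime(realized_vol: float, threshold: float = 0.20) -> str:
    """Toy classifier: annualized realized vol vs. a fixed threshold.
    (Threshold and labels are illustrative, not a production rule.)"""
    return "high_vol" if realized_vol > threshold else "low_vol"

def params_for(realized_vol: float) -> dict:
    """Look up the parameter set for the current detected regime."""
    return REGIME_PARAMS[detect_regime(realized_vol)]
```

The structural point is that there is no single default entry: every caller must go through the regime lookup, which is what makes a one-size-fits-all parameter set impossible by construction.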

06

Research ≠ deployment

A research finding that "factor X has historically generated alpha" is not the same as "we should allocate capital to factor X." The decision to deploy requires a separate evaluation: current regime, crowding, cost structure, personal risk tolerance. Research produces information. Deployment is a different process.