Kernel
Research infrastructure for quantitative finance. In place today: a security master and clean data. Planned: backtest engine, factor framework, risk and regime tooling. Open source when it's ready.
The problem
Platforms like QuantConnect and Zipline make assumptions about strategy structure, data access, and execution that get in the way of research. They're built to deploy strategies, not to study them. If you want to do something like run a cross-sectional momentum sort with regime-conditional rebalancing and custom transaction cost models, you end up fighting the framework.
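A study like that should be a few lines of array code. Here is a minimal sketch of just the cross-sectional momentum sort, on random data with illustrative parameters (no regime conditioning or cost model):

```python
import numpy as np

# Synthetic daily returns panel (dates x assets); real research would use
# point-in-time data from the security master.
rng = np.random.default_rng(0)
n_days, n_assets = 260, 50
returns = rng.normal(0.0005, 0.01, size=(n_days, n_assets))

# 12-1 style signal: trailing ~11-month cumulative return, skipping the
# most recent ~month (21 trading days) to avoid short-term reversal.
signal = returns[:-21][-231:].sum(axis=0)

# Sort into quintiles cross-sectionally; long top, short bottom, equal weight.
ranks = signal.argsort().argsort()   # 0 = worst momentum, n-1 = best
quintile = ranks * 5 // n_assets     # 0..4, ten names per bucket here
weights = np.where(quintile == 4, 1.0, np.where(quintile == 0, -1.0, 0.0))
weights /= np.abs(weights).sum()     # normalize gross exposure to 1
```

The point is not this particular signal; it's that every step (lookback, skip window, bucketing, weighting) is plain code you can inspect and swap out.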
Kernel is intended to be research-first: every component exposed and replaceable, no black boxes, no vendor lock-in. We're building toward that.
What's in place
Right now Kernel has a security master and clean data: the foundation for point-in-time, survivorship-aware research. Everything else builds on that.
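The core of point-in-time correctness is querying data as it was known on a given date, never as later revised or filed. A sketch of that pattern with a pandas as-of join; the tables and column names are illustrative, not Kernel's actual schema:

```python
import pandas as pd

# Fundamentals keyed by the date each value became *known* (filing date),
# not the fiscal period it describes -- the key to avoiding lookahead.
fundamentals = pd.DataFrame({
    "ticker": ["AAPL", "AAPL", "MSFT", "MSFT"],
    "known_date": pd.to_datetime(
        ["2024-02-01", "2024-05-02", "2024-01-30", "2024-04-25"]),
    "eps": [2.18, 1.53, 2.93, 2.94],
}).sort_values("known_date")

# Research query: "what did we know about each name on this date?"
queries = pd.DataFrame({
    "ticker": ["AAPL", "MSFT"],
    "asof_date": pd.to_datetime(["2024-03-15", "2024-03-15"]),
}).sort_values("asof_date")

# Backward as-of join: latest row with known_date <= asof_date, per ticker.
pit = pd.merge_asof(
    queries, fundamentals,
    left_on="asof_date", right_on="known_date",
    by="ticker", direction="backward",
)
# On 2024-03-15 the query sees AAPL's February filing, not the May one.
```

Survivorship-awareness is the same discipline applied to the universe itself: delisted names stay in the history, so sorts on past dates include the stocks that later disappeared.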
Future scope
- Backtest engine: vectorized backtesting with configurable cost models, designed for factor research.
- Factor framework: cross-sectional and time-series factor construction, sort/portfolio logic, winsorization, neutralization.
- Risk & regime: factor/idiosyncratic/regime decomposition; regime detection (e.g. an HMM on realized vol).
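Two of the factor-framework primitives above can be sketched in a few lines. Function names and defaults here are illustrative, not Kernel's API:

```python
import numpy as np

def winsorize(x, lower=0.01, upper=0.99):
    """Clip a cross-section to its empirical quantiles to tame outliers."""
    lo, hi = np.quantile(x, [lower, upper])
    return np.clip(x, lo, hi)

def neutralize(x, groups):
    """Demean a signal within each group (e.g. sector) so group bets net to zero."""
    out = x.astype(float).copy()
    for g in np.unique(groups):
        mask = groups == g
        out[mask] -= out[mask].mean()
    return out

# Toy cross-section: one wild outlier, five hypothetical sectors.
rng = np.random.default_rng(1)
raw = rng.normal(size=100)
raw[0] = 15.0
sectors = rng.integers(0, 5, size=100)

clean = neutralize(winsorize(raw), sectors)
```

In a real framework these compose with the point-in-time data layer, so the cross-section being winsorized is the one actually observable on the rebalance date.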