The ML conductor: combines all model predictions with dynamic weighting based on recent performance.
Ensemble Meta is the ML arena's meta-strategy: it combines predictions from ALL of the other machine learning models (XGBoost, LightGBM, CatBoost, Random Forest, LSTM, Transformer, DQN) into a single consensus signal. This is classic 'stacking': a second-level model that learns how to weight each base model according to market conditions. When the LSTM is more reliable (trending markets), its weight increases; when XGBoost excels (classic tabular regimes), it is favored instead.
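The stacking idea can be illustrated with a minimal sketch. Here a logistic regression is fit on synthetic stand-in probabilities from 7 base models (the real system would use out-of-sample predictions from the actual XGBoost, LSTM, etc.); all data and variable names are hypothetical, for illustration only.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Synthetic stand-in for the 7 base models' predictions: each column is
# one model's predicted probability that price goes up (hypothetical data).
n_samples, n_models = 500, 7
true_direction = rng.integers(0, 2, size=n_samples)  # 1 = up, 0 = down
noise = rng.normal(0.0, 0.25, size=(n_samples, n_models))
base_predictions = np.clip(true_direction[:, None] * 0.6 + 0.2 + noise, 0.0, 1.0)

# Second-level model: logistic regression learns how much to trust
# each base model from their past predictions vs. realized direction.
meta_model = LogisticRegression()
meta_model.fit(base_predictions, true_direction)

# The learned coefficients act as per-model weights; predict_proba
# gives the blended consensus probability for new predictions.
weights = meta_model.coef_[0]
consensus = meta_model.predict_proba(base_predictions)[:, 1]
```

A base model that adds no information beyond the others tends to receive a coefficient near zero, which is how stacking corrects individual weaknesses rather than averaging them in.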
Each ML model produces a prediction with a confidence score. A stacking model (logistic regression) learns optimal weights: weight_i = f(recent_performance_i, correlation_with_others, market_conditions). The final signal is the weighted sum of the 7 predictions, and a trade is signaled only when the consensus exceeds 60% in one direction. Weights are recalculated daily on the last 30 days of performance.
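The weighting-and-threshold step above can be sketched in a few lines. The hit-rates, predictions, and the performance-proportional weighting rule below are illustrative assumptions, not the strategy's actual numbers or its exact weight function.

```python
import numpy as np

# Hypothetical 30-day hit-rates (fraction of correct calls) for the
# 7 base models -- illustrative numbers, not real results.
recent_accuracy = np.array([0.58, 0.55, 0.57, 0.52, 0.61, 0.54, 0.50])

# Simple performance-proportional weighting: models at or below
# coin-flip accuracy get zero weight; the rest are normalized to sum to 1.
raw = np.clip(recent_accuracy - 0.5, 0.0, None)
weights = raw / raw.sum()

# Today's directional probabilities from the 7 models (hypothetical).
predictions = np.array([0.72, 0.65, 0.70, 0.55, 0.80, 0.60, 0.48])

# Final signal = weighted sum; trade only when consensus clears 60%
# in one direction (above 0.60 = long, below 0.40 = short).
consensus = float(weights @ predictions)
signal = "long" if consensus > 0.60 else ("short" if consensus < 0.40 else "flat")
# -> "long" with these illustrative numbers (consensus ~ 0.71)
```

Recomputing `recent_accuracy` each day over a rolling 30-day window is what makes the weights adaptive: a model that decays in the current regime is automatically de-emphasized at the next recalibration.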
Weighted consensus of 7 ML models. Dynamic weights based on recent performance. Stacking (2nd-level model). Approach diversity (boosting, forest, deep learning, RL). Aggregated >60% confidence required. Daily weight recalibration.
Low
Combines the strengths of 7 different ML approaches. Stacking corrects individual weaknesses. More robust than any single model. Adaptive weights based on market conditions. The 'dream team' of crypto machine learning.
Maximum complexity (7 models + 1 stacker). Consensus can dilute strong signals from individual models. High computational cost (7 models to maintain and retrain). If all ML models are wrong together, the ensemble is wrong too.
Explore all 74 trading strategies across 4 arenas
🏟️ View all strategies