Rice Analytics

Automated Reduced Error Predictive Analytics

Many Variables in Machine Learning
The "curse of dimensionality" is one of the basic problems that has plagued machine learning since its Perceptron origins. Reduced Error Logistic Regression (RELR) overcomes this problem in today's standard business applications. This is because RELR identifies the most important variables before fitting the model. Hence, it avoids modeling the vast majority of unimportant variables, yet returns a model equivalent in accuracy to one that had modeled all variables. Independent users have now run accurate RELR models with as many as 80,000 variables and interactions. These models discovered interaction variables that were hidden from other standard machine learning methods, and because of this the RELR models gave a substantial lift in performance. Please go to the Case Studies page to learn more details about these and other case examples.
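RELR's own variable-selection criterion is not described here, but the general idea the paragraph describes, namely ranking a large pool of candidate variables before any model is fit and then fitting a logistic regression only on the top-ranked subset, can be sketched as follows. This is a minimal illustration using a simple absolute-correlation screen, which is an assumption for demonstration purposes only, not the RELR method itself; the dataset and cutoff are likewise hypothetical.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Hypothetical wide dataset: 500 rows, 200 candidate variables,
# only 10 of which actually carry signal.
X, y = make_classification(n_samples=500, n_features=200,
                           n_informative=10, random_state=0)

# Screening step: rank every candidate variable by the absolute
# value of its correlation with the outcome, BEFORE fitting any model.
scores = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
top = np.argsort(scores)[::-1][:10]  # keep only the 10 strongest variables

# Fit logistic regression on the screened subset only, avoiding
# the other 190 unimportant variables entirely.
model = LogisticRegression(max_iter=1000).fit(X[:, top], y)
print(f"Training accuracy on screened variables: {model.score(X[:, top], y):.3f}")
```

The same pattern extends to interaction variables: pairwise products of the original columns can be added to the candidate pool and screened by the same criterion, which is how hidden interactions can surface without exhaustively modeling every combination.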

Reduced Error Logistic Regression is a modern type of "abacus" with major machine learning and business intelligence applications.
Applications: Machine Learning, Segmentation, Consumer Surveys, Predictive Modeling, Risk Management