Regression analysis is used to statistically establish the correlation between a problem and the user's pain point, based on the correlation coefficient. There are several types of correlation coefficients (e.g., Pearson, Kendall, Spearman), but the most commonly used is Pearson's correlation coefficient. This coefficient is a number between -1 and 1, with 1 being the strongest possible positive correlation and -1 being the strongest possible negative correlation. If there is no correlation, there cannot be a causal relationship either, so those problems can be eliminated from further analysis. If there is a strong correlation, however, a causal relationship may or may not exist, and it needs to be established through further experiments.
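As a minimal sketch of this first step, the following Python snippet computes Pearson's correlation coefficient between a candidate problem metric and a measure of user pain. The data and variable names are illustrative, not taken from the text.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical paired observations, e.g. page-load time (problem metric)
# and a survey-based frustration score (pain-point measure).
problem_metric = np.array([1.2, 2.5, 3.1, 4.0, 5.3, 6.8, 7.2, 8.5])
pain_score = np.array([2.0, 2.8, 3.5, 4.1, 5.0, 6.2, 6.9, 8.0])

r, p_value = pearsonr(problem_metric, pain_score)
print(f"Pearson r = {r:.2f}, p-value = {p_value:.3f}")
# r close to +1 or -1 suggests a strong positive or negative correlation;
# r near 0 suggests the problem can be dropped from further causal analysis.
```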
Regression analysis includes several variations, such as linear, multiple linear, and nonlinear. The most common models are simple linear and multiple linear. Nonlinear regression analysis is commonly used for more complicated data sets in which the dependent and independent variables show a nonlinear relationship.
Simple linear regression is a model that assesses the relationship between a dependent variable and an independent variable. The simple linear model is expressed using the following equation:
Y = a + bX + e
Y – Dependent variable
X – Independent (explanatory) variable
a – Intercept
b – Slope
e – Residual error
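The sketch below fits this simple linear model with ordinary least squares and recovers the intercept, slope, and residuals. It is an illustrative example with made-up data, not a prescribed implementation.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0]])  # independent variable
Y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])            # dependent variable

model = LinearRegression().fit(X, Y)
a = model.intercept_              # intercept (a)
b = model.coef_[0]                # slope (b)
e = Y - model.predict(X)          # residual errors (e)

print(f"Y = {a:.2f} + {b:.2f}X")
```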
Multiple linear regression analysis is essentially similar to the simple linear model, except that multiple independent variables are used. Because there are several independent variables, the model carries an additional condition: the independent variables should show minimal correlation with each other. If the independent variables are highly correlated, it becomes difficult to assess the true relationships between the dependent and independent variables. The mathematical representation of multiple linear regression is:
Y = a + bX1 + cX2 + dX3 + e
Y – Dependent variable
X1, X2, X3 – Independent (explanatory) variables
a – Intercept
b, c, d – Slopes
e – Residual error
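The following sketch fits this multiple linear model and also checks the pairwise correlations among the independent variables, which should stay low to avoid multicollinearity. All data and names are illustrative.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical observations: each row is [X1, X2, X3].
X = np.array([
    [1.0, 0.5, 2.0],
    [2.0, 1.5, 1.0],
    [3.0, 0.8, 3.5],
    [4.0, 2.2, 2.8],
    [5.0, 1.1, 4.0],
    [6.0, 3.0, 3.2],
])
Y = np.array([5.1, 6.8, 10.2, 12.5, 14.1, 17.0])

# Correlation matrix of the independent variables; values far from 0
# off the diagonal signal multicollinearity.
print(np.corrcoef(X, rowvar=False))

model = LinearRegression().fit(X, Y)
a = model.intercept_
b, c, d = model.coef_
print(f"Y = {a:.2f} + {b:.2f}X1 + {c:.2f}X2 + {d:.2f}X3")
```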