Category: Statistics
Maximum Likelihood Estimation (MLE) in Linear Regression
This tutorial explains what Maximum Likelihood Estimation (MLE) is and how it can be used in linear regression. Basics of Maximum Likelihood Estimation Before discussing linear regression, we need to have a basic...
Read Full Article →
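As a rough preview of the idea behind the tutorial, the sketch below fits a simple linear regression by maximizing the normal log-likelihood with optim() and compares the result with lm(); the simulated data and variable names are illustrative assumptions, not the tutorial's own example.

```r
# Minimal MLE sketch for simple linear regression on simulated data
set.seed(1)
n <- 100
x <- rnorm(n)
y <- 2 + 3 * x + rnorm(n, sd = 1.5)

# Negative log-likelihood assuming y ~ Normal(b0 + b1 * x, sigma^2)
neg_loglik <- function(par) {
  b0 <- par[1]; b1 <- par[2]; sigma <- exp(par[3])  # optimize log(sigma) to keep sigma > 0
  -sum(dnorm(y, mean = b0 + b1 * x, sd = sigma, log = TRUE))
}

fit <- optim(c(0, 0, 0), neg_loglik)
c(b0 = fit$par[1], b1 = fit$par[2], sigma = exp(fit$par[3]))

# For normal errors, the ML estimates of the coefficients match least squares
coef(lm(y ~ x))
```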
Experimental Design in Advertising Research
This tutorial explains the types of advertising experiment designs and the detailed steps of conducting experimental research for advertising campaigns. Types of Advertising Experiments The first is a simple version of the experimental design, which includes a control condition...
Read Full Article →
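Purely to give a feel for the simplest design mentioned above (one treatment condition plus one control condition), here is a hedged R sketch that simulates such a two-group experiment and compares the group means; the outcome measure, group sizes, and effect size are all invented for illustration.

```r
# Simulated two-condition advertising experiment: control vs. ad exposure
set.seed(42)
n_per_group <- 50
control   <- rnorm(n_per_group, mean = 3.0, sd = 1)   # e.g., purchase intention on a 1-7 scale
treatment <- rnorm(n_per_group, mean = 3.5, sd = 1)   # participants exposed to the ad

# Compare the two conditions with an independent-samples t-test
t.test(treatment, control)
```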
Linear Mixed Models in SPSS
This tutorial explains what a linear mixed model is and how to structure its statistical model, provides a data example, and walks through the steps for running linear mixed models in SPSS. Definition of Linear Mixed Models Linear mixed models (LMMs) are...
Read Full Article →
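The tutorial itself works through SPSS; purely as an illustration of the kind of model structure it describes (repeated observations nested within participants), here is a sketch of an analogous random-intercept model in R's lme4 package, with an invented data frame and variable names.

```r
# Analogous linear mixed model in R (the tutorial covers the SPSS steps)
# install.packages("lme4")  # if needed
library(lme4)

# Toy repeated-measures data: 20 participants, each measured at 4 time points
set.seed(7)
n_subj <- 20
d <- data.frame(
  participant = factor(rep(1:n_subj, each = 4)),
  time        = rep(0:3, times = n_subj)
)
subj_dev <- rnorm(n_subj, sd = 1)                        # participant-level deviations
d$score  <- 5 + 0.8 * d$time + subj_dev[rep(1:n_subj, each = 4)] + rnorm(nrow(d), sd = 0.5)

# Fixed effect of time, random intercept for each participant
m <- lmer(score ~ time + (1 | participant), data = d)
summary(m)
```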
Data Type and Data Summary
This tutorial explains what data types (including numerical data and categorical data) are and how to summarize different types of data. Data Type Broadly speaking, data can be categorized into two types: categorical and numerical. Categorical data refers to variables...
Read Full Article →
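As a quick illustration of the two kinds of summary (the toy data frame below is invented, not from the tutorial): categorical variables are usually summarized with counts or proportions, while numerical variables are summarized with statistics such as the mean and standard deviation.

```r
# Toy data with one categorical and one numerical variable
d <- data.frame(
  gender = c("F", "M", "F", "F", "M"),
  age    = c(23, 31, 27, 45, 38)
)

# Categorical variable: frequency counts and proportions
table(d$gender)
prop.table(table(d$gender))

# Numerical variable: descriptive statistics
mean(d$age)
sd(d$age)
summary(d$age)
```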
Linear Regression and Orthogonal Projection
This tutorial explains why and how linear regression can be viewed as an orthogonal projection in 2- and 3-dimensional spaces. Projection with 2 Dimensions Suppose that both X0 and Y have 2 dimensions (e.g., 2 observations from 2 participants). It...
Read Full Article →
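To make the projection interpretation concrete, here is a small numerical sketch (with made-up data) showing that the fitted values from lm() equal the orthogonal projection of \( \vec{y} \) onto the column space of the design matrix, computed via the hat matrix \( X(X^\top X)^{-1}X^\top \).

```r
# Fitted values from linear regression = orthogonal projection of y onto col(X)
set.seed(3)
n <- 10
x <- rnorm(n)
y <- 1 + 2 * x + rnorm(n)

X <- cbind(1, x)                         # design matrix with an intercept column
H <- X %*% solve(t(X) %*% X) %*% t(X)    # hat (projection) matrix
y_hat <- H %*% y                         # projection of y onto the column space of X

all.equal(as.vector(y_hat), unname(fitted(lm(y ~ x))))   # TRUE
```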
Mean as a Projection
This tutorial explains how the mean can be viewed as an orthogonal projection onto the subspace spanned by the all-1’s vector (i.e., the basis vector). Suppose that \( \vec{y} \in \mathbb{R}^n \) and \( L \subset \mathbb{R}^n\) is...
Read Full Article →
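A small numerical check of this claim (the vector below is invented): the projection coefficient of \( \vec{y} \) onto the all-1’s vector, \( (\vec{1}^\top \vec{y}) / (\vec{1}^\top \vec{1}) \), is exactly the sample mean.

```r
# The mean as a projection onto the span of the all-1's vector
y    <- c(2, 5, 7, 10)
ones <- rep(1, length(y))

coef_proj <- sum(ones * y) / sum(ones * ones)   # (1'y) / (1'1)
coef_proj        # 6
mean(y)          # 6, identical

# The projected vector is the mean repeated n times
coef_proj * ones
```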
Orthogonal Projection
This tutorial explains what an orthogonal projection is in linear algebra. Further, it provides a proof that the difference between a vector and its projection onto a subspace is orthogonal to that subspace. Let’s define two vectors, \(\vec{X} \) and \(\vec{Y} \), and we...
Read Full Article →
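A quick numerical illustration of that result (the vectors are chosen arbitrarily here): after projecting \( \vec{Y} \) onto \( \vec{X} \), the residual \( \vec{Y} - \hat{Y} \) has zero dot product with \( \vec{X} \).

```r
# The residual after projection is orthogonal to the vector projected onto
X <- c(1, 2, 2)
Y <- c(3, 1, 4)

proj_Y <- (sum(X * Y) / sum(X * X)) * X   # projection of Y onto span(X)
res    <- Y - proj_Y                      # residual vector

sum(res * X)   # 0 (up to floating-point error)
```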
Orthonormal Vectors: Definitions and Examples
Two Orthogonal Vectors Definition: Two vectors are orthogonal if they are perpendicular to each other. That is, the dot product of the two vectors is zero. The following is an example of two orthogonal vectors. \( \vec{V_1} =\left[\begin{array}{c}1\\0\\-1\end{array}\right]\), \( \vec{V_2}...
Read Full Article →
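To tie the definition to something runnable: the sketch below checks that the dot product is zero and then normalizes both vectors to unit length, which is what turns an orthogonal pair into an orthonormal one. \( \vec{V_1} \) is taken from the excerpt above; the second vector is chosen here for illustration, since the tutorial's \( \vec{V_2} \) is not shown.

```r
# Orthogonality check and normalization
v1 <- c(1, 0, -1)          # V1 from the excerpt above
v2 <- c(1, 0,  1)          # second vector, chosen here for illustration

sum(v1 * v2)               # 0 -> the vectors are orthogonal

# Dividing each vector by its length makes the pair orthonormal
u1 <- v1 / sqrt(sum(v1^2))
u2 <- v2 / sqrt(sum(v2^2))
sum(u1^2); sum(u2^2)       # both 1 (unit length)
sum(u1 * u2)               # still 0
```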
Interaction in Linear Regression
This tutorial focuses on the interaction between a categorical variable and a continuous variable in linear regression. Note that, in this tutorial, we limit the categorical variable to 2 levels. (For a categorical variable with 3 levels, please refer...
Read Full Article →
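As a sketch of the model form discussed (simulated data and invented variable names): with a 2-level categorical variable, the interaction term lets the slope of the continuous predictor differ between the two groups.

```r
# Interaction between a 2-level categorical variable and a continuous variable
set.seed(5)
n <- 120
group <- factor(rep(c("control", "treatment"), each = n / 2))
x     <- rnorm(n)
# Different slopes by group: 1.0 for control, 2.5 for treatment
y <- 0.5 + ifelse(group == "treatment", 2.5, 1.0) * x + rnorm(n, sd = 0.8)

m <- lm(y ~ x * group)     # expands to x + group + x:group
summary(m)

# The interaction coefficient estimates the slope difference (about 1.5 here)
coef(m)["x:grouptreatment"]
```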
Dummy and Contrast Codings in Linear Regression
This tutorial explains the differences between dummy coding and contrast coding in linear regression using R code examples. It is worth pointing out that this tutorial focuses on the case where the categorical independent variable has 3 levels. Short Note Note that, in...
Read Full Article →
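Since the tutorial compares the two coding schemes in R, here is a minimal sketch with simulated 3-level data showing dummy (treatment) coding versus one common contrast scheme, sum (deviation) coding, and how the switch changes what the coefficients mean; the data and group means are invented.

```r
# Dummy (treatment) coding vs. sum (deviation) coding for a 3-level factor
set.seed(9)
g <- factor(rep(c("a", "b", "c"), each = 30))
y <- rep(c(10, 12, 15), each = 30) + rnorm(90)

# Dummy coding (R default): intercept = mean of the reference level "a",
# remaining coefficients = differences from that reference level
contrasts(g) <- contr.treatment(3)
coef(lm(y ~ g))

# Sum coding: intercept = (unweighted) grand mean of the three group means,
# remaining coefficients = deviations of the first two groups from that grand mean
contrasts(g) <- contr.sum(3)
coef(lm(y ~ g))
```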