Classification and Logistic Regression

1. Classification Problem

  • Definition: Classification is a supervised learning task where the output variable y is discrete-valued rather than continuous.
  • In particular, consider binary classification, where y ∈ {0, 1} (e.g., spam detection: spam = 1, not spam = 0).
  • Each training example is a pair (x^{(i)}, y^{(i)}), where x^{(i)} ∈ ℝ^d is a feature vector and y^{(i)} is the label.

2. Why Not Use Linear Regression for Classification?

  • Linear regression predicts continuous values, which is problematic for classification because predictions can fall outside [0, 1].
  • For example, predicting y = 1.5 or y = −0.2 is meaningless when y is binary.
  • Instead, we want the output h_θ(x) to be interpreted as the probability that y = 1 given x.

3. Logistic Regression Model

Hypothesis:

h_θ(x) = g(θ^T x) = 1 / (1 + e^{−θ^T x}),

where:

  • g(z) = 1 / (1 + e^{−z}) is the sigmoid function, which maps any real value to the interval (0, 1).
  • θ ∈ ℝ^{d+1} are the parameters (including the intercept term).
  • h_θ(x) can be interpreted as the estimated probability P(y = 1 | x; θ).

Decision Boundary:

  • Predict y = 1 if h_θ(x) ≥ 0.5; otherwise, predict y = 0 (see the sketch below).
  • The decision boundary corresponds to θ^T x = 0, which is a linear boundary in input space.
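
A minimal sketch of the hypothesis and decision rule, assuming NumPy; the function names sigmoid and predict are illustrative:

```python
import numpy as np

def sigmoid(z):
    # g(z) = 1 / (1 + e^{-z}): maps any real value into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def predict(theta, X):
    # X: (n, d+1) design matrix whose first column is all ones (intercept term).
    # Predict y = 1 when h_theta(x) >= 0.5, i.e. exactly when theta^T x >= 0.
    return (sigmoid(X @ theta) >= 0.5).astype(int)
```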

4. Loss Function and Cost Function

Probability Model:

  • Logistic regression models conditional probability directly:

P(y = 1 | x; θ) = h_θ(x),    P(y = 0 | x; θ) = 1 − h_θ(x).

  • Equivalently, the likelihood of a single data point (x^{(i)}, y^{(i)}) is:

p(y^{(i)} | x^{(i)}; θ) = (h_θ(x^{(i)}))^{y^{(i)}} (1 − h_θ(x^{(i)}))^{1 − y^{(i)}}.

Cost (Loss) Function:

  • Use the negative log-likelihood (cross-entropy loss) as the per-example cost:

J^{(i)}(θ) = −[ y^{(i)} log h_θ(x^{(i)}) + (1 − y^{(i)}) log(1 − h_θ(x^{(i)})) ].

  • Overall cost function (average over n examples):

J(θ) = (1/n) Σ_{i=1}^{n} J^{(i)}(θ).

  • This loss is convex in θ, enabling efficient optimization.
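
Continuing the sketch above (sigmoid as defined earlier), the average cross-entropy cost might be computed as:

```python
def cross_entropy_loss(theta, X, y, eps=1e-12):
    # J(theta) = -(1/n) * sum_i [ y_i log h(x_i) + (1 - y_i) log(1 - h(x_i)) ]
    h = sigmoid(X @ theta)
    h = np.clip(h, eps, 1.0 - eps)  # clip to avoid log(0)
    return -np.mean(y * np.log(h) + (1 - y) * np.log(1 - h))
```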

5. Training Logistic Regression

  • Use methods such as gradient descent or more advanced optimization (Newton's method, quasi-Newton methods) to minimize the cost J(θ).

  • The gradient of the cost function is:

∇_θ J(θ) = (1/n) Σ_{i=1}^{n} (h_θ(x^{(i)}) − y^{(i)}) x^{(i)}.

  • Update rule in gradient descent:

θ := θ − α ∇_θ J(θ),

where α is the learning rate.
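
Combining the gradient and update rule above, a minimal batch gradient descent loop (a sketch; the function name and default hyperparameters are illustrative):

```python
def fit_logistic_regression(X, y, alpha=0.1, num_iters=1000):
    # X: (n, d+1) design matrix with intercept column; y: (n,) labels in {0, 1}
    n, d = X.shape
    theta = np.zeros(d)
    for _ in range(num_iters):
        # gradient: (1/n) * sum_i (h_theta(x_i) - y_i) * x_i
        grad = X.T @ (sigmoid(X @ theta) - y) / n
        theta -= alpha * grad  # theta := theta - alpha * grad
    return theta
```

In practice, α is tuned (or an adaptive method is used), and iteration typically stops when the change in J(θ) falls below a tolerance rather than after a fixed count.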


6. Multi-class Classification

  • When y ∈ {1, 2, ..., k} for k > 2, logistic regression generalizes to multinomial logistic regression, also called Softmax regression.

  • The model outputs a vector of scores h̄_θ(x) ∈ ℝ^k, called logits.

  • The Softmax function converts logits into probabilities:

P(y = j | x; θ) = exp(h̄_θ(x)_j) / Σ_{s=1}^{k} exp(h̄_θ(x)_s).

  • The loss for example (x^{(i)}, y^{(i)}) is the negative log-likelihood:

J^{(i)}(θ) = −log P(y^{(i)} | x^{(i)}; θ).
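
A small Softmax sketch; subtracting the maximum logit before exponentiating is a standard numerical-stability trick and does not change the result:

```python
def softmax(logits):
    # logits: length-k score vector h_bar_theta(x); returns probabilities summing to 1
    z = logits - np.max(logits)  # shift by the max logit to avoid overflow
    exp_z = np.exp(z)
    return exp_z / exp_z.sum()
```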


7. Discriminative vs. Generative Classification Algorithms

  • Discriminative algorithms (like logistic regression) model p(y | x) directly or learn a direct mapping from x to y.
  • Generative algorithms model the joint distribution p(x, y) = p(x | y) p(y).
  • Example: Gaussian Discriminant Analysis (GDA).
  • Logistic regression is an example of a discriminative approach, focusing purely on p(y | x).

8. Linear Hypothesis Class and Decision Boundaries

  • Logistic regression hypothesis class:

H = { h_θ : h_θ(x) = 1{θ^T x ≥ 0} },

which is the set of classifiers with linear decision boundaries.

  • More generally, hypothesis classes can be extended to neural networks or other complex architectures.

9. Perceptron Learning as Contrast to Logistic Regression

  • The perceptron also learns a linear classifier, but with a different loss and update rule: it updates θ only on misclassified examples (see the sketch below).
  • Logistic regression provides probabilistic outputs and optimizes a convex cost function, generally yielding better statistical properties.
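
A minimal perceptron training sketch for contrast (hard-threshold predictions instead of probabilities; the function name and defaults are illustrative):

```python
def perceptron_train(X, y, alpha=1.0, num_epochs=10):
    # Update rule: theta += alpha * (y_i - pred_i) * x_i, which is nonzero
    # only when example i is misclassified (pred_i uses a hard 0/1 threshold).
    n, d = X.shape
    theta = np.zeros(d)
    for _ in range(num_epochs):
        for i in range(n):
            pred = 1 if X[i] @ theta >= 0 else 0  # hard threshold, not sigmoid
            theta += alpha * (y[i] - pred) * X[i]
    return theta
```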


10. Practical Considerations

  • Feature scaling often improves numerical stability.
  • Regularization (e.g., an L2 penalty) is frequently added to the cost to prevent overfitting (see the sketch below).
  • Logistic regression handles input features linearly; non-linear boundaries require feature engineering or kernel methods.
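
As one example, an L2 penalty can be folded into the cost and gradient from earlier (a sketch; λ is the regularization strength, and the intercept θ_0 is conventionally left unpenalized):

```python
def l2_regularized_cost_and_grad(theta, X, y, lam=1.0):
    # J_reg(theta) = J(theta) + (lam / (2n)) * ||theta[1:]||^2
    n = X.shape[0]
    h = sigmoid(X @ theta)
    cost = cross_entropy_loss(theta, X, y) + (lam / (2 * n)) * np.sum(theta[1:] ** 2)
    grad = X.T @ (h - y) / n
    grad[1:] += (lam / n) * theta[1:]  # grad[0] (intercept) is not penalized
    return cost, grad
```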

Summary:

Logistic regression is a fundamental classification algorithm that models the conditional probability of the positive class as a sigmoid of a linear function of the input features. It is trained by maximizing the likelihood (equivalently, minimizing the cross-entropy loss) and extends naturally to multi-class problems via Softmax. It is a discriminative model, focusing directly on p(y | x) and yielding linear decision boundaries, in contrast with generative models, which approach classification indirectly through the joint distribution.

