Topics
Mathematical Logic
Matrices
Differentiation
- Derivatives of Composite Functions - Chain Rule
- Derivatives of Inverse Functions
- Derivatives of Logarithmic Functions
- Derivatives of Implicit Functions
- Derivatives of Parametric Functions
- Second Order Derivative
- Overview of Differentiation
Applications of Derivatives
Integration
Definite Integration
Applications of Definite Integration
- Standard Forms of Parabola and Their Shapes
- Standard Forms of Ellipse
- Area Under Simple Curves
- Overview of Application of Definite Integration
Differential Equation and Applications
- Differential Equations
- Order and Degree of a Differential Equation
- Formation of Differential Equation by Eliminating Arbitrary Constant
- Differential Equations with Variables Separable Method
- Homogeneous Differential Equations
- Linear Differential Equations
- Application of Differential Equations
- Overview of Differential Equations
Commission, Brokerage and Discount
- Commission and Brokerage Agent
- Concept of Discount
- Overview of Commission, Brokerage and Discount
Insurance and Annuity
- Insurance
- Types of Insurance
- Annuity
- Overview of Insurance and Annuity
Linear Regression
- Regression
- Types of Linear Regression
- Fitting Simple Linear Regression
- The Method of Least Squares
- Lines of Regression of X on Y and Y on X Or Equation of Line of Regression
- Properties of Regression Coefficients
- Overview of Linear Regression
Time Series
- Introduction to Time Series
- Uses of Time Series Analysis
- Components of a Time Series
- Mathematical Models
- Measurement of Secular Trend
- Overview of Time Series
Index Numbers
- Weighted Aggregate Method
- Cost of Living Index Number
- Method of Constructing Cost of Living Index Numbers - Aggregative Expenditure Method
- Overview of Index Numbers
- Method of Constructing Cost of Living Index Numbers - Family Budget Method
- Uses of Cost of Living Index Number
Linear Programming
- Introduction of Linear Programming
- Linear Programming Problem (L.P.P.)
- Mathematical Formulation of Linear Programming Problem
- Overview of Linear Programming
Assignment Problem and Sequencing
- Assignment Problem
- Hungarian Method of Solving Assignment Problem
- Special Cases of Assignment Problem
- Sequencing Problem
- Types of Sequencing Problem
- Finding an Optimal Sequence
- Overview of Assignment Problem and Sequencing
Probability Distributions
- Poisson Distribution
- Expected Value and Variance of a Random Variable
- Overview of Probability Distributions
- Overview of Binomial Distribution
Definition: Regression
A statistical method used to predict the value of one variable from the value of another.
Dependent Variable (Y)
Variable being predicted.
Independent Variable (X)
Variable used for prediction.
Regression Equations
A mathematical equation used for prediction.
Key Points: Types of Regression
Simple Linear Regression:
One independent variable.
Multiple Linear Regression:
Two or more independent variables.
Definition: Fitting Simple Linear Regression
Fitting Regression:
Finding the straight line that best represents the relationship between X and Y using the given sample data.
Scatter Diagram:
A graphical representation of paired data (X, Y).
Each pair is plotted as a point.
Formula: Method of Least Squares
Best-fit line is the one that minimises the sum of squares of residuals:
\[S^2=\sum(y_i-\hat{y}_i)^2\]
Residual: \[e_i=y_i-\hat{y}_i\]
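The least-squares criterion can be illustrated with a short sketch (the sample data here are assumed, chosen only for illustration): the fitted line's sum of squared residuals is smaller than that of any other candidate line.

```python
# Hypothetical sample data (assumed for illustration).
xs = [1, 2, 3, 4, 5]
ys = [2, 4, 5, 4, 5]

n = len(xs)
x_bar = sum(xs) / n
y_bar = sum(ys) / n

# Least-squares slope and intercept for the line y = a + b*x.
b = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
    sum((x - x_bar) ** 2 for x in xs)
a = y_bar - b * x_bar

def sum_sq_residuals(a, b):
    """S^2 = sum of (y_i - y_hat_i)^2 for the line y = a + b*x."""
    return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

s2_best = sum_sq_residuals(a, b)
# Any perturbed line gives a larger S^2 than the least-squares line.
s2_other = sum_sq_residuals(a + 0.5, b - 0.1)
print(s2_best, s2_other, s2_best < s2_other)
```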
Formula: Line of Regression of Y on X
Y = a + bX
where b = bYX = regression coefficient of Y on X
\[b_{YX}=\frac{\operatorname{cov}(X,Y)}{\operatorname{var}(X)}\]
\[=\frac{\frac{\sum\left(x_i-\overline{x}\right)\left(y_i-\overline{y}\right)}{n}}{\frac{\sum\left(x_i-\overline{x}\right)^2}{n}}\]
\[=\frac{\sum x_iy_i-n\bar{x}\bar{y}}{\sum x_i^2-n\bar{x}^2}\]
\[a=\bar{y}-b\bar{x}\]
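As a numerical check on the formulas above, a minimal Python sketch (sample data assumed for illustration) computes bYX in both the covariance form and the computational form, then the intercept a:

```python
# Hypothetical paired sample (assumed for illustration).
xs = [1, 2, 3, 4, 5]
ys = [2, 4, 5, 4, 5]

n = len(xs)
x_bar, y_bar = sum(xs) / n, sum(ys) / n

# b_YX = cov(X, Y) / var(X), using the deviation form.
cov_xy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / n
var_x = sum((x - x_bar) ** 2 for x in xs) / n
b_yx = cov_xy / var_x

# Equivalent computational form: (sum x_i*y_i - n*x_bar*y_bar) / (sum x_i^2 - n*x_bar^2).
b_yx_alt = (sum(x * y for x, y in zip(xs, ys)) - n * x_bar * y_bar) / \
           (sum(x * x for x in xs) - n * x_bar ** 2)

a = y_bar - b_yx * x_bar
print(f"Y = {a:.2f} + {b_yx:.2f}X")   # line of regression of Y on X
```

Both forms give the same coefficient; the second avoids computing deviations explicitly.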
Formula: Line of Regression of X on Y
\[X=a^{\prime}+b^{\prime}Y\]
where b' = bXY = regression coefficient of X on Y
\[b_{XY}=\frac{\operatorname{cov}(X,Y)}{\operatorname{var}(Y)}\]
\[=\frac{\frac{\sum\left(x_i-\overline{x}\right)\left(y_i-\overline{y}\right)}{n}}{\frac{\sum\left(y_i-\overline{y}\right)^2}{n}}\]
\[=\frac{\sum x_iy_i-n\bar{x}\bar{y}}{\sum y_i^2-n\bar{y}^2}\]
\[a^{\prime}=\bar{x}-b^{\prime}\bar{y}\]
Key Points: Properties of Regression Coefficients
1. \[b_{XY}\cdot b_{YX}=r^{2}\]
2. If bYX > 1, then bXY < 1.
3. \[\left|\frac{b_{YX}+b_{XY}}{2}\right|\geq|r|\]
4. Regression coefficients are independent of a change of origin but are affected by a change of scale.
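All four properties can be verified numerically. A minimal sketch (sample data assumed for illustration) checks the product property, the arithmetic-mean property, and the effect of a change of origin versus a change of scale:

```python
import math

# Hypothetical sample (assumed for illustration).
xs = [1, 2, 3, 4, 5]
ys = [2, 4, 5, 4, 5]

def b_coef(us, vs):
    """Regression coefficient of V on U: cov(U, V) / var(U)."""
    n = len(us)
    u_bar, v_bar = sum(us) / n, sum(vs) / n
    cov = sum((u - u_bar) * (v - v_bar) for u, v in zip(us, vs)) / n
    var_u = sum((u - u_bar) ** 2 for u in us) / n
    return cov / var_u

n = len(xs)
x_bar, y_bar = sum(xs) / n, sum(ys) / n
cov = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / n
r = cov / (math.sqrt(sum((x - x_bar) ** 2 for x in xs) / n)
           * math.sqrt(sum((y - y_bar) ** 2 for y in ys) / n))

b_yx = b_coef(xs, ys)   # coefficient of Y on X
b_xy = b_coef(ys, xs)   # coefficient of X on Y

# Property 1: the product of the regression coefficients equals r^2.
assert math.isclose(b_xy * b_yx, r ** 2)

# Property 3: the mean of the coefficients is at least |r| in magnitude.
assert abs((b_yx + b_xy) / 2) >= abs(r)

# Property 4: a change of origin leaves the coefficients unchanged,
# but a change of scale does not.
assert math.isclose(b_coef([x + 10 for x in xs], ys), b_yx)
assert math.isclose(b_coef([2 * x for x in xs], ys), b_yx / 2)
print("all properties verified")
```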
