• Machine Learning – Approximate Nearest Neighbors (ANNOY)

    ML – Approximate Nearest Neighbors. Table Of Contents: What Is The ANNOY Algorithm ? Use Case Examples. How ANNOY Works Step By Step. Trade-Off Controls. Advantages Of ANNOY. Limitations Of ANNOY. (1) What Is The ANNOY Algorithm ? (2) Example Use Cases. (3) How ANNOY Works – Step By Step: Step-1: Input Data Preparation. Step-2: Build Trees Using Random Projections. Step-3: Save The Trees To Disk. Step-4: Querying / Searching. (4) Trade-Off Controls. (5) Advantages Of ANNOY. (6) Limitations …
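
    A minimal sketch of the build / save / query flow described above, using the open-source annoy Python package; the dimensionality, tree count, and file name below are illustrative assumptions, not values from the original post.

    ```python
    import numpy as np
    from annoy import AnnoyIndex

    f = 40                            # vector dimensionality (assumed for this sketch)
    index = AnnoyIndex(f, 'angular')  # 'angular' ~ cosine-style distance

    # Step-1 / Step-2: add the item vectors, then build the forest of
    # random-projection trees (more trees -> better recall, bigger index).
    rng = np.random.default_rng(42)
    for i in range(1000):
        index.add_item(i, rng.normal(size=f))
    index.build(10)

    # Step-3: save the trees to disk; the file can later be memory-mapped.
    index.save('annoy_index.ann')

    # Step-4: query - approximate 10 nearest neighbours of item 0.
    loaded = AnnoyIndex(f, 'angular')
    loaded.load('annoy_index.ann')
    print(loaded.get_nns_by_item(0, 10))
    ```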

    Read More

  • Linear Regression – Interview Questions & Answers !

    Linear Regression – Interview Q & A. Table Of Contents: Beginner-Level (Fundamentals) What is Linear Regression? What is the equation of a simple linear regression model? What are the assumptions of linear regression? What is the difference between simple and multiple linear regression? What do the coefficients in a linear regression model represent? How do you interpret the intercept and slope in a regression line? What is the cost function used in linear regression? What is the difference between correlation and regression? What is Mean Squared Error (MSE)? How is R-squared interpreted? What does an R-squared of 0.85 mean? What …
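
    Since several of these questions concern the cost function, MSE, and R-squared, here is a small hedged illustration using scikit-learn on synthetic data; the dataset and numbers are made up purely for demonstration.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import mean_squared_error, r2_score

    # Synthetic data: y = 3*x + 5 + noise
    rng = np.random.default_rng(0)
    X = rng.uniform(0, 10, size=(200, 1))
    y = 3.0 * X[:, 0] + 5.0 + rng.normal(0, 2, size=200)

    model = LinearRegression().fit(X, y)
    pred = model.predict(X)

    print("intercept:", model.intercept_)       # estimated intercept
    print("slope:", model.coef_[0])             # estimated slope (coefficient)
    print("MSE:", mean_squared_error(y, pred))  # mean squared error (cost)
    print("R^2:", r2_score(y, pred))            # proportion of variance explained
    ```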

    Read More

  • Machine Learning – L1 & L2 Regularization.

    Machine Learning – L1 & L2 Regularization. Table Of Contents: What Is L1 & L2 Regularization ? How Does Controlling The Magnitude Of The Model’s Coefficients Overcome Overfitting ? Why Are Too-Large Coefficients More Likely To Fit Random Noise In The Training Set ? What Is Sparsity In The Model ? How Does L2 Regularization Handle Larger Weights ? Explain With A Mathematical Example How Weights Become Zero In L1 Regularization. Why Can’t Weights Become Zero In L2 Regularization ? Explain With An Example. (1) What Is L1 & L2 Regularization ? (2) How Does Controlling The Magnitude Of …
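
    As a quick hedged illustration of the sparsity point, the sketch below fits Lasso (L1) and Ridge (L2) on synthetic data where only the first three features matter; the alpha values and data are arbitrary choices for demonstration.

    ```python
    import numpy as np
    from sklearn.linear_model import Lasso, Ridge

    rng = np.random.default_rng(1)
    X = rng.normal(size=(300, 10))
    true_w = np.array([5.0, -3.0, 2.0, 0, 0, 0, 0, 0, 0, 0])  # only 3 informative features
    y = X @ true_w + rng.normal(0, 1.0, size=300)

    lasso = Lasso(alpha=0.5).fit(X, y)   # L1 penalty
    ridge = Ridge(alpha=10.0).fit(X, y)  # L2 penalty

    # L1 drives several coefficients exactly to zero (sparsity);
    # L2 shrinks them toward zero but rarely makes them exactly zero.
    print("Lasso coefficients:", np.round(lasso.coef_, 2))
    print("Ridge coefficients:", np.round(ridge.coef_, 2))
    ```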

    Read More

  • NLP – BERT Architecture

    NLP – BERT Architecture Table Of Contents: Introduction To BERT. BERT Architecture. Input Representation. Pretraining Objectives. Fine-Tuning BERT. Variants Of BERT. BERT Evaluation And Benchmarks. Advanced Concepts. Implementation With Libraries. Limitations And Challenges. Applications Of BERT. (1) Introduction To BERT. (2) BERT – Questions What is BERT and the transformer, and why do I need to understand it? Models like BERT are already massively impacting academia and business, so we’ll outline some of the ways these models are used, and clarify some of the terminology around them. What did we do before these models? To understand these models, it’s important to look …
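
    As a small hedged sketch (not code from the original post), a pre-trained BERT can be loaded and inspected with the Hugging Face transformers library; the bert-base-uncased checkpoint and the example sentence are assumptions for illustration.

    ```python
    import torch
    from transformers import BertTokenizer, BertModel

    # Downloads the public bert-base-uncased weights on first run.
    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    model = BertModel.from_pretrained("bert-base-uncased")

    inputs = tokenizer("BERT produces contextual token embeddings.", return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)

    print(outputs.last_hidden_state.shape)  # (batch, tokens, 768) contextual embeddings
    print(outputs.pooler_output.shape)      # (batch, 768) pooled [CLS] representation
    ```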

    Read More

  • Linear Regression – Assumption – 6 (How To Detect & Avoid Endogeneity ?)

  • Linear Regression – Assumption – 5 (How To Detect & Avoid Autocorrelation In Regression ?)

    How To Detect & Avoid Autocorrelation In Regression ? Table Of Contents: Methods To Detect Autocorrelation In The Error Term ? Methods To Avoid Autocorrelation In The Error Term. (1) Methods To Detect Autocorrelation In The Error Term ? Residual Plot (vs. time or observation order), Durbin-Watson Test, Autocorrelation Function (ACF) Plot, Ljung-Box Test (for multiple lags). (1.1) Residual Plot To Detect Autocorrelation In The Error Term: import numpy as np import pandas as pd import matplotlib.pyplot as plt import statsmodels.api as sm # Simulate ordered data (e.g., time series) np.random.seed(42) n = 100 advertising = np.random.normal(1000, 200, n) # Introduce autocorrelation in error terms …
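
    Alongside the residual plot, the Durbin-Watson test listed above can be sketched as follows on simulated data; the AR(1) coefficient of 0.8 and the regression numbers are illustrative assumptions, not values from the original post.

    ```python
    import numpy as np
    import statsmodels.api as sm
    from statsmodels.stats.stattools import durbin_watson

    np.random.seed(42)
    n = 100
    advertising = np.random.normal(1000, 200, n)

    # Build autocorrelated errors: e_t = 0.8 * e_{t-1} + noise
    e = np.zeros(n)
    for t in range(1, n):
        e[t] = 0.8 * e[t - 1] + np.random.normal(0, 50)
    sales = 2.5 * advertising + 300 + e

    X = sm.add_constant(advertising)
    resid = sm.OLS(sales, X).fit().resid

    # A statistic near 2 suggests no autocorrelation; values well below 2
    # point to positive autocorrelation in the error term.
    print("Durbin-Watson statistic:", durbin_watson(resid))
    ```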

    Read More

  • Linear Regression – Assumption – 3 (How To Detect & Avoid Non-Normal Distribution Of Error Term ?)

  • Linear Regression – Assumption – 2 (How To Detect & Avoid Multicollinearity ?)

    Linear Regression – Assumption – 2 (How To Detect & Avoid Multicollinearity ?) Table Of Contents: How To Detect Multicollinearity In The Dataset ? Correlation Matrix. Variance Inflation Factor. Model Behavior Observation. How To Avoid Multicollinearity In The Dataset ? Remove One Of The Correlated Variables. Use Principal Component Analysis (PCA). Use Regularization Techniques (Ridge/Lasso). (1) How To Detect Multicollinearity In The Dataset ? Method – 1: Correlation Matrix (Pearson correlation). We will use the Pearson ‘r’ correlation coefficient to find the correlation between two variables. import seaborn as sns import matplotlib.pyplot as plt # Load dataset tips = sns.load_dataset("tips") # Compute the …
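
    For the Variance Inflation Factor method mentioned above, a hedged sketch on the same seaborn tips dataset might look like this; the choice of the two numeric predictors is just for illustration.

    ```python
    import pandas as pd
    import seaborn as sns
    from statsmodels.stats.outliers_influence import variance_inflation_factor
    from statsmodels.tools.tools import add_constant

    tips = sns.load_dataset("tips")
    X = add_constant(tips[["total_bill", "size"]])  # numeric predictors + intercept

    # A VIF above roughly 5-10 for a feature is a common rule of thumb
    # for flagging multicollinearity.
    vif = pd.DataFrame({
        "feature": X.columns,
        "VIF": [variance_inflation_factor(X.values, i) for i in range(X.shape[1])],
    })
    print(vif)
    ```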

    Read More

  • Linear Regression – Assumption – 2 (No Multicollinearity)

  • Linear Regression – Assumption – 1 (Linear Relationship)

    Linear Regression – Assumption – 1 (Linear Relationship) Table Of Contents: What Is The Linear Relationship Assumption ? Why Is The Linear Relationship Assumption Important ? How To Check Linearity Between The Dependent & Independent Variables ? What Can The Residuals Say About Linearity ? What To Do If Non-Linearity Is Present In The Data ? (1) What Is The Linear Relationship Assumption ? (2) Why Is The Linear Relationship Assumption Important ? (3) How To Check Linearity Between The Dependent & Independent Variables ? (1) Using A Scatter Plot: import seaborn as sns import matplotlib.pyplot as plt # Load a real dataset tips = …
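
    To show the scatter-plot check, here is a hedged sketch on the same seaborn tips dataset; the pairing of total_bill vs. tip and the red fitted line are illustrative choices.

    ```python
    import seaborn as sns
    import matplotlib.pyplot as plt

    tips = sns.load_dataset("tips")

    # Scatter plot of the independent vs. dependent variable with a fitted line:
    # a roughly straight point cloud supports the linear-relationship assumption.
    sns.regplot(x="total_bill", y="tip", data=tips, line_kws={"color": "red"})
    plt.title("Linearity check: total_bill vs. tip")
    plt.show()
    ```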

    Read More