Topic 1: Introduction to Probabilistic Models

Probabilistic models are foundational in understanding the uncertainties in data...

Key Concepts

  • Random Variables
  • Joint Probability
  • Bayes' Theorem
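As a quick illustration of Bayes' theorem, P(A|B) = P(B|A)·P(A) / P(B), here is a minimal sketch; the medical-test probabilities below are made up purely for illustration:

```python
def bayes(p_b_given_a, p_a, p_b):
    """Posterior P(A|B) = P(B|A) * P(A) / P(B)."""
    return p_b_given_a * p_a / p_b

# Hypothetical numbers: 1% disease prevalence, a test that is
# 99% sensitive with a 5% false-positive rate.
p_disease = 0.01
p_pos_given_disease = 0.99
# Total probability of a positive test (law of total probability):
p_pos = p_pos_given_disease * p_disease + 0.05 * (1 - p_disease)

posterior = bayes(p_pos_given_disease, p_disease, p_pos)
print(round(posterior, 3))  # → 0.167
```

Even with an accurate test, the posterior probability of disease given a positive result is only about 17%, because the prior is so low: a classic example of why Bayesian reasoning matters for ML model uncertainty.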

Applications

  • ML model uncertainty
  • Bayesian Inference

BERT, an encoder-only Transformer, stands for Bidirectional Encoder Representations from Transformers. It is a machine learning framework developed by Google for Natural Language Processing (NLP). It is designed to help computers understand the meaning of ambiguous language in text by analyzing surrounding words to establish context. This is achieved through its bidirectional nature: it reads text both left-to-right and right-to-left, providing a richer understanding of word relationships than previous models.

The first task such an encoder Transformer performs is word embedding.

Word Embedding: This converts words, pieces of words, and characters, collectively called tokens, into numbers. We convert them to numbers because a neural network can only operate on numbers. One easy way to convert words into numbers is to randomly assign each word a number.

Example: "I love Coding"
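The random-assignment scheme above can be sketched as follows; the whitespace tokenization and the IDs produced are illustrative only and have nothing to do with BERT's actual WordPiece vocabulary:

```python
import random

def build_vocab(tokens, seed=0):
    """Randomly assign each unique token an integer ID."""
    rng = random.Random(seed)
    unique = sorted(set(tokens))
    ids = list(range(len(unique)))
    rng.shuffle(ids)          # random assignment of numbers to tokens
    return dict(zip(unique, ids))

tokens = "I love Coding".split()  # naive whitespace tokenization
vocab = build_vocab(tokens)
print(vocab)  # e.g. {'Coding': ..., 'I': ..., 'love': ...}
print([vocab[t] for t in tokens])  # the sentence as numbers
```

A fixed seed makes the "random" mapping reproducible; real tokenizers instead learn a subword vocabulary from a corpus, and the IDs are then mapped to dense learned vectors rather than used directly.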