Machine Learning
Attention Mechanisms - Part 1: The Core Idea, by Daniel Sobrado (05 Jun)
LSTM - Learning Long-Term Dependencies, by Daniel Sobrado (22 May)
Layer Normalization - Keeping Activations Stable, by Daniel Sobrado (15 May)
Conv + ReLU - The Building Block of CNNs, by Daniel Sobrado (08 May)
2D Convolutions - How Neural Networks See Images, by Daniel Sobrado (01 May)
Cross-Entropy Loss - The Classification Loss Function, by Daniel Sobrado (24 Apr)