Akshay (@akshay_pachaar)
2024-06-12 | ❤️ 1020 | 🔁 177
Self-attention as a directed graph!
Self-attention is at the heart of transformers, the architecture that led to the LLM revolution we see today.
In this post, I'll clearly explain self-attention & how it can be thought of as a directed graph.
Read more… 👇 https://x.com/akshay_pachaar/status/1800868205029941338/photo/1
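To make the graph view concrete, here's a minimal NumPy sketch (not from the original thread; the token labels, weight matrices, and all numbers are illustrative): each token is a node, and the softmaxed attention matrix acts as the adjacency matrix of a directed graph, where the weight on the edge from token j to token i is how much token i attends to token j.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # Single-head self-attention.
    # X: (n_tokens, d_model); Wq/Wk/Wv: (d_model, d_head).
    # Returns (output, attn), where attn[i, j] is the weight
    # token i places on token j.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])  # pairwise scores, shape (n, n)
    attn = softmax(scores, axis=-1)          # each row sums to 1
    return attn @ V, attn

# Toy setup (shapes and random weights are illustrative, not from the post).
rng = np.random.default_rng(0)
tokens = ["The", "cat", "sat", "down"]
n, d_model, d_head = len(tokens), 8, 8
X = rng.normal(size=(n, d_model))
Wq, Wk, Wv = (rng.normal(size=(d_model, d_head)) for _ in range(3))

out, attn = self_attention(X, Wq, Wk, Wv)

# Read attn as a fully connected directed graph over the tokens:
# an edge from token j to token i with weight attn[i, j] means
# "information flows from j into i's updated representation".
for i, dst in enumerate(tokens):
    for j, src in enumerate(tokens):
        print(f"{src:>5} -> {dst:<5} weight = {attn[i, j]:.2f}")
```

In an encoder, every node has an edge to every other node; with a causal (decoder) mask, edges only run from earlier tokens to later ones, so the graph becomes a DAG.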
