📚 Sehyeon's Vault

๐ŸŒ ๋„๋ฉ”์ธ

  • ๐Ÿ”ฎ3D-Vision
  • ๐ŸŽจRendering
  • ๐Ÿค–Robotics
  • ๐Ÿง LLM
  • ๐Ÿ‘๏ธVLM
  • ๐ŸŽฌGenAI
  • ๐ŸฅฝXR
  • ๐ŸŽฎSimulation
  • ๐Ÿ› ๏ธDev-Tools
  • ๐Ÿ’ฐCrypto
  • ๐Ÿ“ˆFinance
  • ๐Ÿ“‹Productivity
  • ๐Ÿ“ฆ๊ธฐํƒ€

๐Ÿ“„ Papers

  • ๐Ÿ“š์ „์ฒด ๋…ผ๋ฌธ172
Home

โฏ

bookmarks

โฏ

self attention as a directed graph self attention is at the

self-attention-as-a-directed-graph-self-attention-is-at-the

June 12, 2024 · 1 min read

  • XR
  • training

Akshay 🚀 (@akshay_pachaar)

2024-06-12 | โค๏ธ 1020 | ๐Ÿ” 177


Self-attention as a directed graph!

Self-attention is at the heart of transformers, the architecture that led to the LLM revolution that we see today.

In this post, I'll clearly explain self-attention & how it can be thought of as a directed graph.

Read more…👇 https://x.com/akshay_pachaar/status/1800868205029941338/photo/1
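The directed-graph view from the post can be sketched in a few lines: the row-wise softmax attention matrix is exactly a weighted adjacency matrix, where entry (i, j) is the weight of the edge from token i to token j and each node's outgoing weights sum to 1. A minimal NumPy sketch (the 4-token example, dimensions, and random weights are illustrative assumptions, not from the original post):

```python
import numpy as np

def self_attention_graph(X, Wq, Wk):
    """Single-head attention matrix A as a directed-graph adjacency matrix.

    Row i of A holds the weights of the directed edges token_i -> token_j;
    each row sums to 1, so A is the weighted adjacency matrix of a
    complete directed graph over the tokens.
    """
    d_k = Wk.shape[1]
    Q, K = X @ Wq, X @ Wk
    scores = Q @ K.T / np.sqrt(d_k)              # pairwise edge logits
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    exp = np.exp(scores)
    return exp / exp.sum(axis=1, keepdims=True)  # row-wise softmax

# Toy example: 4 tokens, embedding dim 8, head dim 4 (all hypothetical).
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))    # token embeddings
Wq = rng.normal(size=(8, 4))   # query projection
Wk = rng.normal(size=(8, 4))   # key projection

A = self_attention_graph(X, Wq, Wk)
# Enumerate the directed edges with their weights.
edges = [(i, j, A[i, j]) for i in range(4) for j in range(4)]
```

A causal (decoder-style) variant would simply mask the upper triangle of `scores` before the softmax, turning the complete graph into a DAG where edges only point to earlier tokens.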

Media

photo


Tags

domain-ai-ml domain-xr



Created with Quartz v4.5.2 © 2026

  • GitHub
  • Sehyeon Park