albertgt 9 hours ago

Word/token meanings are embedded in high-dimensional space by different embedding models; by projecting them down to 3D, we can get a sense of how they are stored.
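
For a rough idea, here is a minimal sketch of that projection step in Python, using PCA via scikit-learn (the actual demo may use a different reduction such as UMAP or t-SNE, and the random array below just stands in for real model embeddings):

    # Sketch: project high-dimensional embeddings down to 3D.
    # The random array is a stand-in for real embedding output.
    import numpy as np
    from sklearn.decomposition import PCA

    # Pretend these are 768-dim embeddings for 1,000 text chunks.
    embeddings = np.random.rand(1000, 768)

    # Reduce to 3 dimensions so each chunk becomes a point in space.
    pca = PCA(n_components=3)
    points_3d = pca.fit_transform(embeddings)  # shape: (1000, 3)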

Fly through a 3D visualization of open-source texts stored in a vector database.

I used this to demonstrate a portion of the RAG process (after chunking and embedding, and before relevant chunks are retrieved).
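
Here is a sketch of that portion of the pipeline, assuming Chroma as the vector database and a naive fixed-size chunker (the video's actual store and chunking strategy may differ; book.txt is a placeholder filename):

    # Sketch of the chunk -> embed -> store stage, assuming Chroma.
    import chromadb

    client = chromadb.Client()
    collection = client.create_collection(name="open_source_texts")

    # Naive fixed-size chunking, purely for illustration.
    text = open("book.txt").read()
    chunks = [text[i:i + 500] for i in range(0, len(text), 500)]

    # Chroma embeds each chunk with its default embedding model and
    # stores the vectors; retrieval happens later via collection.query().
    collection.add(
        documents=chunks,
        ids=[f"chunk-{i}" for i in range(len(chunks))],
    )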

https://youtu.be/S9InoEjxly8