• 17 Posts
  • 3 Comments
Joined 1Y ago
Cake day: Aug 05, 2023

Interesting. I’m just thinking aloud to understand this.

In this case, the models look at a short sequence of bytes in their context and predict the next byte(s) with good accuracy, which enables efficient encoding. Most of our memories are associative, i.e., we associate them with some concept/name/idea. So do you mean our brain uses the concept to predict a token, which then gets decoded in the form of a memory?
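
To make the prediction-to-compression link concrete, here is a minimal sketch (my own illustration, not from the thread): an entropy coder such as an arithmetic coder spends roughly -log2(p) bits on a symbol the model assigns probability p, so a better predictor directly yields a shorter encoding.

```python
import math

def code_length_bits(p: float) -> float:
    # An entropy coder (e.g., arithmetic coding) spends about
    # -log2(p) bits to encode a symbol the model assigns probability p.
    return -math.log2(p)

# Hypothetical next-byte probabilities for the same byte under two models:
weak_model_p = 0.05    # barely better than uniform over 256 byte values
strong_model_p = 0.90  # model has effectively "memorized" the continuation

print(f"weak model:   {code_length_bits(weak_model_p):.2f} bits")   # ~4.32 bits
print(f"strong model: {code_length_bits(strong_model_p):.2f} bits") # ~0.15 bits
```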




I have the same problem. The things I want to read and write about are piling up faster than I can tackle them :)


Inside CPython's Clever Use of Bloom Filters for Efficient String Processing
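
As a rough illustration of the idea in this post (my own sketch in Python, not CPython's actual C code): CPython's Unicode implementation uses a one-word "bloom mask" to cheaply test whether a character might belong to a set, such as the argument to strip(). False positives are possible; false negatives are not, so a negative answer lets it skip the expensive set lookup.

```python
MASK_BITS = 64  # CPython uses one machine word; 64 bits assumed here

def bloom_add(mask: int, ch: str) -> int:
    # Set the bit selected by folding the code point into the word.
    return mask | (1 << (ord(ch) & (MASK_BITS - 1)))

def bloom_maybe_contains(mask: int, ch: str) -> bool:
    # True means "maybe in the set" (false positives possible);
    # False means "definitely not in the set".
    return bool(mask & (1 << (ord(ch) & (MASK_BITS - 1))))

sep_mask = 0
for c in " \t\n":  # characters we want to strip
    sep_mask = bloom_add(sep_mask, c)

print(bloom_maybe_contains(sep_mask, " "))  # True (really present)
print(bloom_maybe_contains(sep_mask, "x"))  # False: skip the slow check
```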

An illustrated introduction to bloom filters: learn their implementation and applications. Also, explore the Counting Bloom Filter extension!
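
For the classic multi-hash structure this post covers, here is a minimal sketch (my own, following the usual textbook definition; sizes and hash counts are illustrative): k hash functions set k positions on insert, and a query reports "maybe present" only if all k positions are set. The counting variant swaps bits for small counters, which is what makes deletion possible.

```python
import hashlib

class CountingBloomFilter:
    """Textbook counting bloom filter; parameters are illustrative."""

    def __init__(self, size: int = 1024, num_hashes: int = 3):
        self.size = size
        self.num_hashes = num_hashes
        self.counters = [0] * size

    def _indexes(self, item: str):
        # Derive num_hashes indexes from independently seeded hashes.
        for seed in range(self.num_hashes):
            digest = hashlib.sha256(f"{seed}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.size

    def add(self, item: str) -> None:
        for i in self._indexes(item):
            self.counters[i] += 1

    def remove(self, item: str) -> None:
        # Only safe for items previously added; plain bloom filters
        # cannot support deletion at all.
        for i in self._indexes(item):
            self.counters[i] -= 1

    def __contains__(self, item: str) -> bool:
        # "Maybe present" (false positives possible) vs. "definitely absent".
        return all(self.counters[i] > 0 for i in self._indexes(item))

bf = CountingBloomFilter()
bf.add("python")
print("python" in bf)  # True
bf.remove("python")
print("python" in bf)  # False (counters back to zero)
```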

A detailed examination of Python 3.12's internal changes, featuring the concept of 'immortal' objects for performance enhancements
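
A quick way to observe the change (my own demo; the exact pinned refcount value is an implementation detail): on CPython 3.12+, creating new references to None no longer bumps its reference count, because immortal objects keep a fixed sentinel refcount.

```python
import sys

# On CPython 3.12+, None is "immortal": its refcount is pinned and never
# changes. On 3.11 and earlier, each new reference bumps the count.
before = sys.getrefcount(None)
refs = [None] * 100_000  # create many new references to None
after = sys.getrefcount(None)

print(before, after)
print("immortal" if before == after else "mortal")
```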

An Extensive Walkthrough of Python’s Primary Memory Management Technique, Reference Counting
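
A small demonstration of the mechanism this walkthrough covers (my own example): sys.getrefcount reports an object's reference count, and binding or deleting names changes it. The reported value is one higher than you might expect, because the call itself holds a temporary reference to the argument.

```python
import sys

obj = object()
print(sys.getrefcount(obj))  # 2: `obj` plus getrefcount's own argument

alias = obj                  # new reference -> count goes up
print(sys.getrefcount(obj))  # 3

del alias                    # reference dropped -> count goes down
print(sys.getrefcount(obj))  # 2
# When the count reaches zero, CPython frees the object immediately.
```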

LZ77 Is All You Need? Why Gzip + KNN Works for Text Classification
Decoding the Success of Gzip + KNN: The Central Role of LZ77
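
The method from the paper this post discusses can be sketched in a few lines (my reconstruction of the published approach, with a toy dataset of my own): compute a normalized compression distance between documents using gzip, whose deflate stage is built on LZ77, then classify with k-nearest neighbors.

```python
import gzip

def clen(s: str) -> int:
    # Length of the gzip-compressed text; LZ77 backreferences make
    # redundant text compress well.
    return len(gzip.compress(s.encode()))

def ncd(a: str, b: str) -> float:
    # Normalized Compression Distance: small when compressing the
    # concatenation saves a lot, i.e. when a and b share structure.
    ca, cb, cab = clen(a), clen(b), clen(a + " " + b)
    return (cab - min(ca, cb)) / max(ca, cb)

# Toy training set (illustrative, not from the paper's benchmarks).
train = [
    ("the team won the match in overtime", "sports"),
    ("striker scores twice as club tops league", "sports"),
    ("new chip doubles neural network throughput", "tech"),
    ("startup releases open source database engine", "tech"),
]

def classify(text: str, k: int = 3) -> str:
    # k nearest training documents by NCD, then majority vote.
    neighbors = sorted(train, key=lambda tl: ncd(text, tl[0]))[:k]
    labels = [label for _, label in neighbors]
    return max(set(labels), key=labels.count)

print(classify("player injured before the final match"))
```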