

You just got DOOKed!
@sabrinah thinks your content is the shit.
They have 40/200 DOOK left to drop today.
Learn all about this shit in the toilet paper! 💩

It is the ChatGPT tokenizer, but it wasn't created in there. It was something else that I posted. I can do a few of them per day, I think.

You're correct about the tokenizer thing. When AI content generators first came out, they didn't use tokens; it was just a monthly subscription for unlimited use. But as the service became popular, the owner started using tokens. I was frustrated at the time, but hearing you talk about it recently made me remember and understand why the owners did it. We do need something similar for INLEO.

#freecompliments !DOOK


You just got DOOKed!
@sabrinah thinks your content is the shit.
They have 41/200 DOOK left to drop today.
Learn all about this shit in the toilet paper! 💩

Tokens are the base unit.

If Leo is going to be an #ai platform, it needs a couple of things:

  • billions of tokens to train on
  • billions of tokens processed
  • billions of tokens of synthetic data output
  • billions of tokens of both human and synthetic data fed back into the model
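To make "token" concrete: a tokenizer chops text into small units that the model actually reads and produces, and those are what gets counted in the billions. Here's a minimal sketch with a toy word-level tokenizer; real tokenizers like ChatGPT's use sub-word BPE pieces, so this is illustrative only.

```python
import re

def toy_tokenize(text):
    # Toy stand-in for a real tokenizer: split into words and punctuation.
    # Actual BPE tokenizers split into sub-word pieces instead.
    return re.findall(r"\w+|[^\w\s]", text)

tokens = toy_tokenize("Tokens are the base unit.")
print(tokens)       # ['Tokens', 'are', 'the', 'base', 'unit', '.']
print(len(tokens))  # 6
```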

Wait a minute, I think I confused tokens with credits 🤦. I had to do a quick Google search on the difference.

#freecompliments !DOOK


You just got DOOKed!
@sabrinah thinks your content is the shit.
They have 42/200 DOOK left to drop today.
Learn all about this shit in the toilet paper! 💩

They are not the same thing; they're vastly different. Tokens represent data, while credits are just a billing unit layered on top.
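The distinction can be sketched in a few lines: tokens measure how much data a model processes, and credits are a payment abstraction charged against that usage. The rate and function names below are hypothetical, not any real platform's pricing.

```python
# Hypothetical exchange rate between usage (tokens) and billing (credits).
CREDITS_PER_1K_TOKENS = 2

def credits_for(prompt_tokens: int, output_tokens: int) -> float:
    """Convert a token count (data measurement) into credits (billing amount)."""
    total_tokens = prompt_tokens + output_tokens
    return total_tokens / 1000 * CREDITS_PER_1K_TOKENS

# 2,000 tokens of data cost 4.0 credits at this made-up rate.
print(credits_for(1500, 500))  # 4.0
```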