Almost all ML- and AI-related embeddings are vector embeddings. The process of turning something into a vector embedding is called vectorization.
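A toy sketch of the idea (not a real embedding model): each text is mapped to a vector of token counts over a shared vocabulary, so texts become points in the same vector space. All names here are illustrative.

```python
from collections import Counter

def tokenize(text):
    # Naive lowercase whitespace tokenizer, just for illustration.
    return text.lower().split()

def vectorize(texts):
    # Build a shared vocabulary, then represent each text as a
    # count vector over that vocabulary (bag-of-words).
    vocab = sorted({tok for t in texts for tok in tokenize(t)})
    vectors = [
        [Counter(tokenize(t)).get(tok, 0) for tok in vocab]
        for t in texts
    ]
    return vocab, vectors

vocab, vecs = vectorize(["the cat sat", "the dog sat"])
# Every text gets a vector of the same dimensionality (len(vocab)).
```

Real embedding models produce dense, learned vectors rather than sparse counts, but the input-to-vector mapping is the same shape of operation.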

For the mathematical foundations of vectors, see: Vectors.

Vectorizing text

As a first step, text is usually split into tokens using a Tokenizer.
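A minimal sketch of that first step, assuming a simple regex-based word tokenizer (real embedding models typically use learned subword tokenizers such as BPE or WordPiece instead):

```python
import re

def tokenize(text):
    # Split into word tokens and standalone punctuation marks.
    # This is a toy stand-in for a real subword tokenizer.
    return re.findall(r"\w+|[^\w\s]", text.lower())

tokens = tokenize("Embeddings are useful, right?")
```

The resulting token sequence is what gets mapped to IDs and fed into an embedding model.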

For more, see everything related to: NLP, or the dedicated note on text embedding models and algorithms: Text embedding models

Resources