Word Embedding (word2vec)

https://d2l.ai/chapter_natural-language-processing-pretraining/word2vec.html

I couldn't quite understand the roles of the two submodels in word2vec. When training a word2vec model, where each word is mapped to a real vector, should we use both submodels, or just pick one of them?

We usually just use one of them, either CBOW or skip-gram. Each is a complete standalone model: after training whichever one you picked, its learned embedding table gives you the word vectors.
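
For concreteness, here is a minimal sketch of the skip-gram side in PyTorch (the vocabulary size, embedding dimension, and toy indices below are made up for illustration). It shows that a single submodel already contains everything needed: a center-word embedding table (the `v_i` vectors in the chapter's notation) and a context-word table (`u_i`), scored against each other with a dot product.

```python
import torch
from torch import nn

# Hypothetical sizes for illustration.
vocab_size, embed_dim = 10000, 100

center_embed = nn.Embedding(vocab_size, embed_dim)   # v_i: center-word vectors
context_embed = nn.Embedding(vocab_size, embed_dim)  # u_i: context-word vectors

def skip_gram_scores(center, contexts):
    """Dot products between center word vectors and candidate context vectors."""
    v = center_embed(center)    # (batch, 1, embed_dim)
    u = context_embed(contexts) # (batch, num_contexts, embed_dim)
    return torch.bmm(v, u.transpose(1, 2)).squeeze(1)  # (batch, num_contexts)

# Toy forward pass: one center word scored against two candidate context words.
scores = skip_gram_scores(torch.tensor([[1]]), torch.tensor([[2, 3]]))
print(scores.shape)  # torch.Size([1, 2])
```

After training (e.g., with negative sampling and a binary cross-entropy loss, as in the chapter), `center_embed.weight` is typically taken as the final word vectors. A CBOW model would be the mirror image: average the context-word vectors and score them against a center word, with no need for the skip-gram model at all.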