Word Embeddings
Reframe word meaning based on context using word embeddings, a natural language processing technique that maps words to vectors so their relationships to one another can be measured.
Key Concepts
Review core concepts you need to learn to master this subject
Vectors in NLP
Word Embeddings
Distance Between Vectors
Word Contexts in Vector Space
Word2Vec Algorithm
Gensim Embeddings Creation
Vectors in NLP
In natural language processing, vectors are essential: they contain numerical information that indicates the magnitude of the pieces of data being represented.
The dimension, or size, of a vector can be increased to allow more data to be stored in it; for example, a one-element vector holds a single number, while a five-element vector holds five.
In Python, you can represent a vector with a NumPy array. The following creates a vector of even numbers from 2 to 10.
import numpy as np

even_nums = np.array([2, 4, 6, 8, 10])
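One of the key concepts listed above is the distance between vectors: word vectors that point in similar directions are treated as having related meanings. As a minimal sketch (not part of the original lesson), the snippet below uses SciPy's cosine distance on two made-up vectors; the vector values are illustrative only.

import numpy as np
from scipy.spatial.distance import cosine

# Two hypothetical vectors; vec_b points in the same direction as vec_a
vec_a = np.array([1.0, 2.0, 3.0])
vec_b = np.array([2.0, 4.0, 6.0])

# Cosine distance is 0 for vectors pointing the same way and grows as they diverge
print(cosine(vec_a, vec_b))  # approximately 0.0, since vec_b is a scaled copy of vec_a

A distance near 0 suggests the two vectors, and for word embeddings the words they represent, are closely related.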
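The Word2Vec algorithm and gensim, also listed among the key concepts, learn embeddings from word contexts so that words used in similar contexts end up close together in vector space. The sketch below is an illustration under assumptions rather than the lesson's own code: it trains a tiny Word2Vec model with gensim on a two-sentence toy corpus, and the parameter values are arbitrary choices.

from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences (hypothetical data)
corpus = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
]

# Train a small Word2Vec model; vector_size, window, and min_count are illustrative settings
model = Word2Vec(sentences=corpus, vector_size=50, window=2, min_count=1)

# Look up the embedding for a word and find the words closest to it in vector space
cat_vector = model.wv["cat"]
print(model.wv.most_similar("cat", topn=3))

In practice, a much larger corpus is needed before the nearest neighbors become meaningful.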