For this week's question, we'll be looking into word embeddings. Word embeddings encode the semantics of words. A toy example appears below, where the word embeddings are stored in rows and hypothetical semantic dimensions are stored in columns. Word embeddings allow us to capture semantic relationships between words; e.g. queen - woman + man = king.

| Word  | Monarch | Male | Female |
|-------|---------|------|--------|
| queen | 1       | 0    | 1      |
| woman | 0       | 0    | 1      |
| king  | 1       | 1    | 0      |
| man   | 0       | 1    | 0      |

Suppose we have a dictionary named Word2Index that stores our vocabulary and a matrix named Embedding that stores our word embeddings, as shown below. What would the embedded sequence look like for the input sequence shown below?

```python
import numpy as np

Word2Index = {
    'zero': 0, 'one': 1, 'two': 2, 'three': 3, 'four': 4,
    'five': 5, 'six': 6, 'seven': 7, 'eight': 8, 'nine': 9
}

Embedding = np.array([[ 0.006,  0.04 , -0.012],
                      [-0.008, -0.039,  0.013],
                      [-0.002,  0.033, -0.046],
                      [ 0.023,  0.024,  0.005],
                      [-0.006,  0.029,  0.007],
                      [-0.01 ,  0.047,  0.018],
                      [ 0.029, -0.004,  0.03 ],
                      [-0.027,  0.012, -0.013],
                      [ 0.023,  0.002, -0.03 ],
                      [ 0.038, -0.028, -0.024]])

input = 'eight five six three six'
```

Three possible methods for answering this question appear below.

a) We could one-hot encode the sequence as shown below, then compute the product of the one-hot matrix and the Embedding matrix:

```python
Identity = np.identity(len(Word2Index))
OneHotIndex = [ Identity[Word2Index[word]] for word in input.split(' ') ]
np.dot(OneHotIndex, Embedding)
```

b) Alternatively, we could index directly into the embedding matrix:

```python
Embedding[[ Word2Index[word] for word in input.split(' ') ]]
```

c) And, of course, we can use TensorFlow as well ...

```python
from tensorflow.keras import initializers, layers, models

model = models.Sequential([
    layers.Embedding(10, 3,
                     embeddings_initializer = initializers.Constant(Embedding))
])
X = np.array([ Word2Index[word] for word in input.split(' ') ],
             dtype = 'int32').reshape(1, 5)
model.predict(X)
```

Here's the embedded sequence ...
```
array([[[ 0.023,  0.002, -0.03 ],
        [-0.01 ,  0.047,  0.018],
        [ 0.029, -0.004,  0.03 ],
        [ 0.023,  0.024,  0.005],
        [ 0.029, -0.004,  0.03 ]]], dtype=float32)
```
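As a quick sanity check (not part of the original answer), methods (a) and (b) can be verified to produce the same matrix without needing TensorFlow. The sketch below uses `sentence` in place of `input` to avoid shadowing the Python builtin; everything else matches the setup above.

```python
import numpy as np

Word2Index = {
    'zero': 0, 'one': 1, 'two': 2, 'three': 3, 'four': 4,
    'five': 5, 'six': 6, 'seven': 7, 'eight': 8, 'nine': 9
}

Embedding = np.array([[ 0.006,  0.04 , -0.012],
                      [-0.008, -0.039,  0.013],
                      [-0.002,  0.033, -0.046],
                      [ 0.023,  0.024,  0.005],
                      [-0.006,  0.029,  0.007],
                      [-0.01 ,  0.047,  0.018],
                      [ 0.029, -0.004,  0.03 ],
                      [-0.027,  0.012, -0.013],
                      [ 0.023,  0.002, -0.03 ],
                      [ 0.038, -0.028, -0.024]])

sentence = 'eight five six three six'

# Method (a): one-hot encode each word, then multiply by the embedding matrix.
Identity = np.identity(len(Word2Index))
one_hot = np.array([Identity[Word2Index[word]] for word in sentence.split(' ')])
via_matmul = np.dot(one_hot, Embedding)

# Method (b): fancy-index the embedding matrix with the word indices directly.
via_index = Embedding[[Word2Index[word] for word in sentence.split(' ')]]

# Both approaches select the same rows, so the results are identical.
assert np.allclose(via_matmul, via_index)
print(via_index.shape)  # (5, 3): five words, three embedding dimensions
```

Multiplying by a one-hot matrix is mathematically equivalent to row selection, which is why an embedding layer is typically implemented as a lookup rather than a matrix product.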