Week 4 Comprehensive >> Introduction to Machine Learning
TOTAL POINTS 10
1. What is meant by “word vector”?
1 point
Assigning a corresponding number to each word.
A vector consisting of all words in a vocabulary.
The latitude and longitude of the place a word originated.
2. Which word is a synonym for “word vector”?
1 point
Embedding
Stack
Array
Norm
3. What is the term for a set of vectors, with one vector for each word in the vocabulary?
1 point
Space
Array
Embedding
Codebook
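Questions 1 through 3 all turn on the idea of an embedding: a set of vectors stored as a matrix with one row per word in the vocabulary. Below is a minimal sketch in NumPy, using a made-up three-word vocabulary and arbitrary 4-dimensional vectors purely for illustration; none of these values come from the course materials.

```python
import numpy as np

# Hypothetical toy vocabulary; the 4-dimensional vectors are made-up values,
# chosen only to illustrate the lookup, not learned from any corpus.
vocab = ["king", "queen", "apple"]
word_to_index = {word: i for i, word in enumerate(vocab)}

# The embedding: one row (one vector) per word in the vocabulary.
embedding_matrix = np.array([
    [0.8, 0.1, 0.5, 0.3],   # "king"
    [0.7, 0.2, 0.6, 0.3],   # "queen"
    [0.1, 0.9, 0.0, 0.4],   # "apple"
])

def word_vector(word):
    """Look up the vector for a single word."""
    return embedding_matrix[word_to_index[word]]

print(word_vector("queen"))   # -> [0.7 0.2 0.6 0.3]
```

Looking up a word's row in such a matrix is what “word vector” refers to in these questions; the full matrix is the embedding.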
4. What is natural language processing?
1 point
Translating human-readable code to machine-readable instructions.
Translating natural text characters to Unicode representations.
Taking natural text and making inferences and predictions.
Making natural text conform to formal language standards.
5. What is the goal of learning word vectors?
1 point
Determine the vocabulary in the codebook.
Given a word, predict which words are in its vicinity.
Find the hidden or latent features in a text.
Labeling a text corpus so a human doesn’t have to do it.
6. What function is the generalization of the logistic function to multiple dimensions?
1 point
Softmax function
Exponential log likelihood
Squash function
Hyperbolic tangent function
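For question 6, a short worked example (assuming NumPy) shows the softmax function and why it generalizes the logistic function: with two classes scored [x, 0], the softmax probability of the first class equals the logistic of x.

```python
import numpy as np

def softmax(z):
    """Softmax over a vector of scores; subtracting the max improves numerical stability."""
    e = np.exp(z - np.max(z))
    return e / e.sum()

def logistic(x):
    """The standard logistic (sigmoid) function."""
    return 1.0 / (1.0 + np.exp(-x))

# With two classes scored [x, 0], softmax on the first class reduces to the logistic of x.
x = 1.7
print(softmax(np.array([x, 0.0]))[0])   # ~0.8455
print(logistic(x))                      # ~0.8455
```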
7. What is the continuous bag of words (CBOW) approach?
1 point
Word n is used to predict the words in the neighborhood of word n.
Word n is learned from a large corpus of words, which a human has labeled.
The code for word n is fed through a CNN and categorized with a softmax.
Vectors for the neighborhood of words are averaged and used to predict word n.
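For question 7, a minimal CBOW-style sketch: the vectors of the context (neighborhood) words are averaged, and that average is scored against every vocabulary word to predict the center word n. The tiny vocabulary and random vectors below are hypothetical placeholders, chosen only to make the flow of the computation concrete.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - np.max(z))
    return e / e.sum()

# Hypothetical toy vocabulary with randomly initialized (untrained) vectors.
vocab = ["the", "cat", "sat", "on", "mat"]
E = np.random.default_rng(0).normal(size=(len(vocab), 4))   # input (context) vectors
W = np.random.default_rng(1).normal(size=(len(vocab), 4))   # output (prediction) vectors

def cbow_predict(context_indices):
    """Average the context word vectors, then score every word as the possible center word."""
    h = E[context_indices].mean(axis=0)   # averaged context representation
    return softmax(W @ h)                 # probability over the vocabulary

# Context words "the" and "sat" predicting the missing center word.
probs = cbow_predict([vocab.index("the"), vocab.index("sat")])
print(dict(zip(vocab, probs.round(3))))
```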
8. What is the Skip-Gram approach?
1 point
Vectors for the neighborhood of words are averaged and used to predict word n.
The code for word n is fed through a CNN and categorized with a softmax.
Word n is used to predict the words in the neighborhood of word n.
Word n is learned from a large corpus of words, which a human has labeled.
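For question 8, the mirror-image Skip-Gram sketch: the vector of the center word n alone is used to score every vocabulary word as a possible neighbor. The vocabulary and vectors are again hypothetical placeholders, matching the CBOW sketch above.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - np.max(z))
    return e / e.sum()

# Same hypothetical toy vocabulary and untrained random vectors as the CBOW sketch.
vocab = ["the", "cat", "sat", "on", "mat"]
E = np.random.default_rng(0).normal(size=(len(vocab), 4))   # center-word vectors
W = np.random.default_rng(1).normal(size=(len(vocab), 4))   # context-word vectors

def skipgram_predict(center_index):
    """Use the center word's vector to score every vocabulary word as a possible neighbor."""
    h = E[center_index]
    return softmax(W @ h)   # probability of each word appearing in the neighborhood

probs = skipgram_predict(vocab.index("cat"))
print(dict(zip(vocab, probs.round(3))))
```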
9. What is the goal of the recurrent neural network?
1 point
Synthesize a sequence of words.
Classify an unlabeled image.
Learn a series of images that form a video.
Predict words more efficiently than Skip-Gram.
10. Which model is the state of the art for text synthesis?
1 point
CNN
CBOW
Multilayer perceptron
Long short-term memory
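For questions 9 and 10, a minimal character-level synthesis sketch (assuming PyTorch; the tiny vocabulary, layer sizes, and untrained weights are placeholders, not the course's model) shows how a recurrent network such as an LSTM synthesizes a sequence one token at a time, feeding each sampled token back in as the next input.

```python
import torch
import torch.nn as nn

# Hypothetical character-level setup; the vocabulary and sizes are illustrative only.
chars = list("abcdefghijklmnopqrstuvwxyz ")
vocab_size, embed_dim, hidden_dim = len(chars), 16, 32

class CharLSTM(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens, state=None):
        h, state = self.lstm(self.embed(tokens), state)
        return self.out(h), state

def synthesize(model, start_index, length=20):
    """Generate a sequence one token at a time, feeding each sample back as the next input."""
    token = torch.tensor([[start_index]])
    state, generated = None, [start_index]
    for _ in range(length):
        logits, state = model(token, state)
        probs = torch.softmax(logits[0, -1], dim=-1)
        token = torch.multinomial(probs, 1).view(1, 1)
        generated.append(token.item())
    return "".join(chars[i] for i in generated)

# An untrained model emits gibberish; after training on a corpus, the same
# sampling loop produces synthesized text.
print(synthesize(CharLSTM(), chars.index("t")))
```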