**Q.1** What is the output of `print(np.array([1,2,3]) + 1)`?

**A.** [11 21 31]

**B.** [1 1 2 3]

**C.** [1 2 3 1]

**D.** [2 3 4]

**Ans:** [2 3 4]
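A quick sanity check of this scalar broadcast (a minimal NumPy sketch):

```python
import numpy as np

# Adding a scalar to an array broadcasts the scalar across every element.
result = np.array([1, 2, 3]) + 1
print(result)  # [2 3 4]
```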

**Q.2** Which of the formulas below is used to update weights when performing gradient descent?

**A.** `dw - learning_rate*w`

**B.** `w - learning_rate*dw`

**C.** `w / learning_rate*dw`

**D.** `w + learning_rate*dw`

**Ans:** `w - learning_rate*dw`
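A minimal sketch of the update rule with made-up values (`w`, `dw`, and `learning_rate` here are illustrative placeholders, not from any particular model):

```python
import numpy as np

learning_rate = 0.1
w = np.array([0.5, -0.3])    # current weights (illustrative)
dw = np.array([0.2, -0.1])   # gradient of the cost with respect to w

# Gradient descent steps against the gradient: w := w - learning_rate * dw
w = w - learning_rate * dw
print(w)
```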

**Q.3** The cost is equal to the average of the sum of the losses.

**A.** True

**B.** False

**Ans:** True
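In code, with per-sample losses (illustrative numbers), the cost is the summed loss divided by the number of samples:

```python
import numpy as np

losses = np.array([0.2, 0.5, 0.3])     # per-sample losses (illustrative)
cost = np.sum(losses) / losses.size    # average of the sum of losses

# Equivalent shortcut: the mean of the losses.
assert np.isclose(cost, np.mean(losses))
```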

**Q.4** In a dot product, the number of rows in the first matrix must be equal to the number of columns in the second.

**A.** True

**B.** False

**Ans:** False

**Q.5** What is the output of `print(np.dot([1,2,3],[[1],[2],[3]]))`?

**A.** [[1 2 3] [2 4 6] [3 6 9]]

**B.** throws error

**C.** [14]

**D.** [[14]]

**Ans:** [14]
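This can be verified directly; `np.dot` of a (3,) vector with a (3,1) matrix returns an array of shape (1,), not a bare scalar:

```python
import numpy as np

# (3,) dot (3,1): a true matrix product, giving 1*1 + 2*2 + 3*3 = 14.
out = np.dot([1, 2, 3], [[1], [2], [3]])
print(out)        # [14]
print(out.shape)  # (1,)
```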

**Q.6** What does it mean if the derivative of the cost with respect to a parameter is negative?

**A.** Current parameter value must be reduced

**B.** The cost function has reached its minimum

**C.** Current parameter value must be increased

**D.** None of the options

**Ans:** Current parameter value must be increased

**Q.7** You are building a binary classifier to separate output (y=1) from output (y=0). Which of these activation functions would you recommend for the output layer?

**A.** tanh

**B.** relu

**C.** sigmoid

**D.** leaky-relu

**Ans:** sigmoid
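Sigmoid squashes any real input into the open interval (0, 1), so the output can be read as P(y=1). A minimal sketch:

```python
import numpy as np

def sigmoid(z):
    # Maps any real value into the open interval (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

p = sigmoid(np.array([-2.0, 0.0, 2.0]))
# All outputs lie strictly between 0 and 1; sigmoid(0) is exactly 0.5.
```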

**Q.8** What is the equation for the linear output of a hidden layer in a shallow neural network, if X is of shape (num_features, num_samples) and W is of shape (num_neurons, num_features)?

**A.** Z = W.X+b

**B.** Z = transpose(W).X +b

**C.** Z = W.transpose(X) + b

**D.** Z = X.W + b

**Ans:** Z = W.X + b
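A shape walk-through of Z = W.X + b (a sketch with made-up dimensions: 3 features, 5 neurons, 10 samples):

```python
import numpy as np

num_features, num_neurons, num_samples = 3, 5, 10
rng = np.random.default_rng(0)

X = rng.standard_normal((num_features, num_samples))  # (3, 10)
W = rng.standard_normal((num_neurons, num_features))  # (5, 3)
b = np.zeros((num_neurons, 1))                        # (5, 1), broadcast over samples

Z = W @ X + b   # (5, 3) @ (3, 10) + (5, 1) -> (5, 10)
```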

**Q.9** If a shallow neural network has five hidden neurons and three input features, what would be the dimension of the bias matrix of the hidden layer?

**A.** (1,1)

**B.** (5,3)

**C.** (1,5)

**D.** (5,1)

**Ans:** (5,1)

**Q.10** Hidden layers should use an activation function with a larger derivative.

**A.** True

**B.** False

**Ans:** True

**Q.11** In a shallow neural network, the number of rows in the weight matrix for the hidden layer is equal to the number of nodes (neurons) in the hidden layer.

**A.** True

**B.** False

**Ans:** True

**Q.12** A vector of size (n,1) is called a row vector.

**A.** True

**B.** False

**Ans:** False

**Q.13** If a shallow neural network has five hidden neurons with three input features, what would be the dimension of weight matrix of hidden layer?

**A.** (5,5)

**B.** (3,3)

**C.** (5,3)

**D.** (3,5)

**Ans:** (5,3)

**Q.14** If a shallow neural network has five hidden neurons and three input features, what would be the dimension of the bias matrix of the hidden layer?

**A.** (5,3)

**B.** (5,1)

**C.** (1,1)

**D.** (1,5)

**Ans:** (5,1)

**Q.15** For a single-neuron network, if the number of features is 5, then what would be the dimension of the bias vector?

**A.** (5,5)

**B.** (1,1)

**C.** (1,5)

**D.** (5,1)

**Ans:** (1,1)

**Q.16** How many hidden layers are present if layer_dims = [3,9,9,1]?

**A.** 1

**B.** 2

**C.** 3

**D.** 4

**Ans:** 2

**Q.17** If layer_dims = [3,9,9,1], then the shape of the weight matrix for the third layer is _____________.

**A.** (9,3)

**B.** (9,9)

**C.** (1,9)

**D.** (9,1)

**Ans:** (1,9)
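These dimensions can be computed from layer_dims directly, using the convention from Q.8 that the weight matrix for layer l has shape (units in layer l, units in layer l-1):

```python
layer_dims = [3, 9, 9, 1]   # input layer, two hidden layers, output layer

# The hidden-layer count excludes the input and output entries.
num_hidden = len(layer_dims) - 2

# W for layer l has shape (layer_dims[l], layer_dims[l - 1]).
weight_shapes = [(layer_dims[l], layer_dims[l - 1])
                 for l in range(1, len(layer_dims))]

print(num_hidden)     # 2
print(weight_shapes)  # [(9, 3), (9, 9), (1, 9)]
```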

**Q.18** What is the output of `print(np.array([1,2,3]) * np.array([1,2,3]))`?

**A.** [14 14 14]

**B.** [14]

**C.** [1 4 9]

**D.** [1 2 3 1 2 3]

**Ans:** [1 4 9]
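`*` on NumPy arrays is elementwise multiplication, not a dot product:

```python
import numpy as np

# Elementwise product: [1*1, 2*2, 3*3].
prod = np.array([1, 2, 3]) * np.array([1, 2, 3])
print(prod)  # [1 4 9]

# The dot product, for contrast, sums those terms into a scalar.
print(np.dot([1, 2, 3], [1, 2, 3]))  # 14
```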

**Q.19** In a dot product, the number of rows in the first matrix must be equal to the number of columns in the second.

**A.** True

**B.** False

**Ans:** False

**Q.20** If layer_dims = [3,9,9,1], then the shape of the weight matrix for the third layer is _____________.

**A.** (9,3)

**B.** (9,1)

**C.** (9,9)

**D.** (1,9)

**Ans:** (1,9)

**Q.21** Hidden layers should use an activation function with a larger derivative.

**A.** True

**B.** False

**Ans:** True

**Q.22** Broadcasting in NumPy throws an error when you try to add two arrays of shape (1,5) and (1,6).

**A.** True

**B.** False

**Ans:** True
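A minimal demonstration: the trailing dimensions 5 and 6 are unequal and neither is 1, so broadcasting fails:

```python
import numpy as np

a = np.ones((1, 5))
b = np.ones((1, 6))

try:
    a + b
    broadcast_failed = False
except ValueError:
    # NumPy raises "operands could not be broadcast together ..."
    broadcast_failed = True
```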

**Q.23** In a DNN, the weight matrix for each layer must always be initialized to zero before training the network.

**A.** True

**B.** False

**Ans:** False
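A sketch of why: random (not zero) weight initialization breaks the symmetry between neurons, while biases can safely start at zero. The shapes below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)

# All-zero weights would make every neuron in a layer compute the same
# output and receive identical gradients, so the neurons would never
# learn different features. Small random values break this symmetry.
W1 = rng.standard_normal((9, 3)) * 0.01
b1 = np.zeros((9, 1))   # biases may start at zero without causing symmetry
```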

**Q.24** In a shallow neural network, the number of rows in the weight matrix for the hidden layer is equal to the number of nodes (neurons) in the hidden layer.

**A.** True

**B.** False

**Ans:** True