▸ Principal Component Analysis :

1. Consider the following 2D dataset:

Which of the following figures correspond to possible values that PCA may return for $u^{(1)}$ (the first eigenvector / first principal component)? Check all that apply (you may have to check more than one figure).

• Figure 1:
The maximal variance is along the y = x line, so this option is correct.

• Figure 2:
The maximal variance is along the y = x line, so the negative vector along that line is also a valid first principal component (the sign of $u^{(1)}$ is arbitrary).

• Figure 3:

• Figure 4:

2. Which of the following is a reasonable way to select the number of principal components k?
(Recall that n is the dimensionality of the input data and m is the number of input examples.)
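The rule taught in the course is to choose the smallest k such that at least 99% of the variance is retained. A sketch of that criterion (assuming numpy; the helper name `choose_k` and the data are my own illustration):

```python
import numpy as np

def choose_k(X, retain=0.99):
    """Smallest k such that the first k principal components keep
    at least `retain` of the total variance (the '99% rule')."""
    Xc = X - X.mean(axis=0)
    _, S, _ = np.linalg.svd(Xc, full_matrices=False)
    var = S ** 2                            # variance along each component
    ratio = np.cumsum(var) / var.sum()      # cumulative variance retained
    return int(np.searchsorted(ratio, retain) + 1)

# Hypothetical data: 3 strong directions plus tiny noise in 10 dimensions.
rng = np.random.default_rng(1)
X = np.zeros((500, 10))
X[:, :3] = rng.normal(size=(500, 3)) * np.array([3.0, 2.0, 1.0])
X += 0.01 * rng.normal(size=(500, 10))
print(choose_k(X))  # → 3
```

Because k = 1 and k = 2 each miss a substantial share of the variance here, the rule settles on k = 3.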

3. Which of the following statements are true? Check all that apply.


4. Which of the following are recommended applications of PCA? Select all that apply.

• To get more features to feed into a learning algorithm.

• Data compression: Reduce the dimension of your data, so that it takes up less memory / disk space.
If memory or disk space is limited, PCA allows you to save space in exchange for losing a little of the data’s information. This can be a reasonable tradeoff.
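The compression tradeoff can be sketched as follows (assuming numpy; the data and names like `U_reduce` are my own illustration): keep k of the n dimensions, then reconstruct an approximation of the original.

```python
import numpy as np

# Hypothetical 3D data that is (almost) rank 2, so k = 2 loses very little.
rng = np.random.default_rng(3)
X = rng.normal(size=(100, 2)) @ np.array([[1.0, 1.0, 0.0],
                                          [0.0, 0.1, 1.0]])  # 100 x 3

mu = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
k = 2
U_reduce = Vt[:k].T                 # n x k projection matrix (3 x 2)

Z = (X - mu) @ U_reduce             # compressed: 100 x 2 instead of 100 x 3
X_approx = Z @ U_reduce.T + mu      # reconstruction: close to X
print(np.abs(X - X_approx).max())   # small reconstruction error
```

Storing Z (plus the small matrix `U_reduce` and the mean) takes less space than X, at the cost of the discarded variance.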

• Preventing overfitting: Reduce the number of features (in a supervised learning problem), so that there are fewer parameters to learn.

• Data visualization: Reduce data to 2D (or 3D) so that it can be plotted.
This is a good use of PCA, as it can give you intuition about your data that would otherwise be impossible to see.

• Data compression: Reduce the dimension of your input data $x^{(i)}$, which will be used in a supervised learning algorithm (i.e., use PCA so that your supervised learning algorithm runs faster).
If your learning algorithm is too slow because the input dimension is too high, then using PCA to speed it up is a reasonable choice.
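One caveat worth showing in code: the PCA mapping should be fit on the training set only, and the same mapping reused for new examples. A minimal sketch (assuming numpy; the shapes and variable names are my own illustration):

```python
import numpy as np

# Hypothetical train/test split with 50 input features.
rng = np.random.default_rng(2)
X_train = rng.normal(size=(200, 50))
X_test = rng.normal(size=(40, 50))

k = 10
mu = X_train.mean(axis=0)                       # fit on TRAINING data only
_, _, Vt = np.linalg.svd(X_train - mu, full_matrices=False)
U_reduce = Vt[:k].T                             # 50 x 10 projection matrix

Z_train = (X_train - mu) @ U_reduce             # 200 x 10: feed to the learner
Z_test = (X_test - mu) @ U_reduce               # reuse the train mean/components
print(Z_train.shape, Z_test.shape)
```

The supervised algorithm then trains on the 10-dimensional `Z_train` instead of the 50-dimensional `X_train`, which is where the speed-up comes from.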

• As a replacement for (or alternative to) linear regression: For most learning applications, PCA and linear regression give substantially similar results.

• Data visualization: To take 2D data, and find a different way of plotting it in 2D (using k=2)


Feel free to ask doubts in the comment section. I will try my best to answer them.
If you find this helpful, please like, comment, and share the post.
This is the simplest way to encourage me to keep doing such work.

Thanks & Regards,
- APDaga DumpBox