Data/Machine learning
-
Export the environment specification with Anaconda (environment.yml) | 2022. 8. 11. 18:21
Similar to "requirements.txt", Anaconda provides an exporting option for its environment (i.e., installed libraries) as "environment.yml". You can simply run conda env create -f environment.yml Then, you'd get something like: Reference: https://www.anaconda.com/blog/moving-conda-environments
-
Recall and Precision | 2022. 1. 10. 18:16
Example: https://developers.google.com/machine-learning/crash-course/classification/precision-and-recall (Classification: Precision and Recall | Machine Learning Crash Course | Google Developers) Precision attempts to answer the follo..
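As a quick illustration of the two definitions, a minimal sketch in plain Python (the function name is mine, not from the linked page):

```python
def precision_recall(y_true, y_pred):
    """Precision = TP / (TP + FP); Recall = TP / (TP + FN), with 1 as the positive class."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# Example: 2 true positives, 1 false positive, 1 false negative
print(precision_recall([1, 0, 1, 1, 0], [1, 1, 1, 0, 0]))  # (0.666..., 0.666...)
```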
-
Vision Transformer (ViT) Study Material | 2021. 12. 5. 18:48
1. https://youtu.be/j6kuz_NqkG0 2. https://youtu.be/TrdevFK_am4 3. What is the Class Token? One of the interesting things about the Vision Transformer is that the architecture uses Class Tokens. These Class Tokens are randomly initialized tokens that are prepended to the beginning of your input sequence. What is the reason for this Class Token and what does it do? Note that the Class Token is ra..
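A minimal sketch of how such a class token is typically handled (assuming PyTorch; module and variable names are mine, not the original ViT code): a learnable vector is prepended to the patch embeddings, and its final hidden state is what the classification head reads.

```python
import torch
import torch.nn as nn

class PrependClassToken(nn.Module):
    """Prepend a learnable [class] token to a sequence of patch embeddings."""
    def __init__(self, num_patches: int, embed_dim: int):
        super().__init__()
        # randomly initialized, then learned jointly with the rest of the network
        self.cls_token = nn.Parameter(torch.randn(1, 1, embed_dim))
        # positional embedding covers the class token plus all patches
        self.pos_embed = nn.Parameter(torch.randn(1, num_patches + 1, embed_dim))

    def forward(self, patches: torch.Tensor) -> torch.Tensor:
        # patches: (batch, num_patches, embed_dim)
        cls = self.cls_token.expand(patches.shape[0], -1, -1)  # one copy per example
        return torch.cat([cls, patches], dim=1) + self.pos_embed

# After the transformer encoder, x[:, 0] (the class-token position) feeds the classifier head.
```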
-
Transformer Study Materials | 2021. 12. 4. 16:24
1. https://youtu.be/z1xs9jdZnuY 2. https://youtu.be/4Bdc55j80l8 About Positional Encoding https://kazemnejad.com/blog/transformer_architecture_positional_encoding/ Transformer Architecture: The Positional Encoding - Amirhossein Kazemnejad's Blog Transformer architecture was introduced as a novel pure attention-only sequence-to-sequence architecture by Vaswani et al. Its ability for parallelizabl..
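As a companion to the positional-encoding post, here is a minimal NumPy sketch of the sinusoidal encoding from Vaswani et al. (the function name is mine; it assumes an even model dimension):

```python
import numpy as np

def sinusoidal_positional_encoding(max_len: int, d_model: int) -> np.ndarray:
    """PE[pos, 2i] = sin(pos / 10000^(2i/d_model)), PE[pos, 2i+1] = cos(pos / 10000^(2i/d_model))."""
    positions = np.arange(max_len)[:, None]                         # (max_len, 1)
    freqs = np.exp(-np.log(10000.0) * np.arange(0, d_model, 2) / d_model)
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(positions * freqs)                         # even dimensions
    pe[:, 1::2] = np.cos(positions * freqs)                         # odd dimensions
    return pe

# The encoding is added (not concatenated) to the token embeddings before the first encoder layer.
```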
-
Evidence Lower Bound (ELBO) | 2021. 8. 24. 18:21
In the variational autoencoder (VAE) (my blog posting link), the probability of observing $z$ given $x$ can be presented as follows using Bayes' theorem: $$ p(z|x) = \frac{p(x|z)\,p(z)}{p(x)} $$ where $p(z|x)$ is the posterior. However, this posterior is difficult to compute due to its marginal likelihood $p(x)$. $p(x)$ can be expanded as below using the law of total probability: $$ p(x) = \int p(x|z)\,p(z)\,dz $$
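For reference, the derivation typically continues by introducing an approximate posterior $q(z|x)$; a standard sketch of the resulting bound (notation mine) is:

$$ \log p(x) = \mathbb{E}_{q(z|x)}\!\left[\log \frac{p(x,z)}{q(z|x)}\right] + D_{\mathrm{KL}}\big(q(z|x)\,\|\,p(z|x)\big) \;\ge\; \underbrace{\mathbb{E}_{q(z|x)}\big[\log p(x|z)\big] - D_{\mathrm{KL}}\big(q(z|x)\,\|\,p(z)\big)}_{\text{ELBO}} $$

Since the KL term is non-negative, maximizing the ELBO raises a lower bound on $\log p(x)$ while implicitly pushing $q(z|x)$ toward the intractable posterior $p(z|x)$.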