My goal is to become a ranked competitor in the Kaggle community, and I want to be something like a running mate for readers of my blog. So I am planning to write posts like this one that record how I am studying.
As of 2024.10.25, I have some background knowledge in data science because I took some undergraduate courses and studied on my own for a month or two. So I decided to start competing on Kaggle right away instead of taking online lectures first.
I will separate my posts into two categories: 1. Kaggle Study and 2. Kaggle Extra Study.
[Kaggle Study] posts cover what I study while following the curriculum documented by Yuhan Lee, a Korean Kaggle Grandmaster. He picked kernels from popular competitions that Kaggle beginners should study, organized by each field of ML. I will post summaries and further studies as I work through it.
[Kaggle Extra Study] posts cover what I study along the way while competing in Kaggle competitions.
The posts so far are summarized below:
[Kaggle Study]
- Loss function + Gradient Descent
- Backpropagation
- More about the relationship between loss functions and gradient descent
[Kaggle Extra Study]
- Supervised learning vs. unsupervised learning
- AutoEncoder
- Time-series data
- Curse of dimensionality
- Cross validation
- Ensembling
“Courage is the commitment to begin without any guarantee of success” ― Johann Wolfgang von Goethe