My goal is to become a ranked competitor in the Kaggle community, and I want to be something like a running mate for the readers of my blog. So I am planning to write posts like this one that record how I am studying.
As of 2024.10.25, I have some background knowledge in data science: I took a few undergraduate courses and studied on my own for about one or two months. So I decided to start competing on Kaggle right away instead of taking online lectures or the like.
I will separate my posts into two categories: 1. Kaggle Study and 2. Kaggle Extra Study.
[Kaggle Study] posts follow the curriculum documented by Yuhan Lee, a Korean Kaggle Grandmaster, who picked the kernels that Kaggle beginners should study for popular competitions in each field of ML. I will post summaries and further study notes as I work through it.
[Kaggle Extra Study] posts cover what I study on the side while competing in Kaggle competitions.
The posts so far are summarized below:
[Kaggle Study]
- Loss Function + Gradient Descent
- Backpropagation
- More on the relationship between the loss function and gradient descent
[Kaggle Extra Study]
- Supervised Learning vs. Unsupervised Learning
- AutoEncoder
- Time-Series Data
- Curse of Dimensionality
- Cross Validation
- Ensembling
“Courage is the commitment to begin without any guarantee of success”
― Johann Wolfgang von Goethe