Week 1 Report
Lecture review
ai math (posts 1~11)
https://velog.io/@naem1023/series/ai-math
python (posts 1~2)
https://velog.io/@naem1023/series/python
Assignment process / deliverables
Optional assignment 1 was the key challenge. Implementing gradient descent with vector operations was manageable since it was covered in class, but I unexpectedly got stuck implementing gradient descent for a linear model y = mx + c, following the example linked below.
https://towardsdatascience.com/linear-regression-using-gradient-descent-97a6c8700931
I had previously studied and organized gradient descent in Notion, but I keep forgetting, so I referenced the link above. I’ll probably need to reference it again in the future.
The key point, as the link describes, is to define the loss function (mean squared error) and use its partial derivatives with respect to m and c as the gradient vector. With the prediction ŷ = mx + c, differentiating the loss with respect to m and c gives D_m = (-2/n) Σ x(y − ŷ) and D_c = (-2/n) Σ (y − ŷ), and the parameters are then updated with the familiar rule m ← m − L·D_m, c ← c − L·D_c, where L is the learning rate.
I solved it by directly translating this process into code using numpy.
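That translation can be sketched roughly as follows. This is a minimal illustration of the update loop, not the assignment's actual code: the toy data, learning rate, and iteration count are made up for demonstration.

```python
import numpy as np

# Toy data from a known line (m = 2, c = 1), so convergence is easy to check.
x = np.linspace(0.0, 1.0, 100)
y = 2.0 * x + 1.0

m, c = 0.0, 0.0   # initial parameters
L = 0.1           # learning rate (illustrative value)
n = len(x)

for _ in range(5000):
    y_pred = m * x + c
    # Gradients of the MSE loss E = (1/n) * sum((y - y_pred)**2)
    D_m = (-2.0 / n) * np.sum(x * (y - y_pred))
    D_c = (-2.0 / n) * np.sum(y - y_pred)
    # Parameter update: step against the gradient
    m -= L * D_m
    c -= L * D_c

print(m, c)  # should approach m ≈ 2, c ≈ 1
```

Because the data here is noiseless, m and c recover the true line almost exactly; on real data they would settle at the least-squares fit instead.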
Peer session summary
Professor Im Sung-bin did a great job organizing the content that came up repeatedly during peer sessions. Since these were topics we struggled with due to ambiguity or missing information, I'm summarizing them here in the peer session section.
https://naem1023.notion.site/4b3c83b157ca43a8b6d1ef706084a1fb
I organized this via Notion.
Study retrospective
https://naem1023.notion.site/ML-68740e6ac0db42e9a01b17c9ab093606
The first week revisited content I had gradually organized in the link above throughout my university years. Still, everything felt new. I took that as a sign that even my fundamentals weren't solid enough.
I hope the study content I organize on velog will accumulate well.