The Gradient Descent algorithm is used to find the optimal value of the weight "w". Simply put, this algorithm finds the optimal solution for us automatically. It is used not only for finding the value of w, but also later when working with the cost function (which we will study in upcoming lessons).
MATHEMATICAL FORMULA (FOR THE LOSS)
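The heading above announces a formula, but none survived in the text; the following is a sketch of the loss and its derivative as implied by the code below, for a single sample $(x, y)$ with prediction $\hat{y} = x \cdot w$ and learning rate $\alpha$:

```latex
% model prediction
\hat{y} = x \cdot w
% squared-error loss for one training sample
\mathcal{L}(w) = (\hat{y} - y)^2 = (x \cdot w - y)^2
% derivative of the loss with respect to w (the gradient used in the code)
\frac{\partial \mathcal{L}}{\partial w} = 2x\,(x \cdot w - y)
% gradient-descent update rule
w \leftarrow w - \alpha \, \frac{\partial \mathcal{L}}{\partial w}
```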
# Training data
x_soat = [1.0, 2.0, 3.0]
y_baho = [2.0, 4.0, 6.0]

w = 1.0  # initial guess for w

# (Our model) function for the forward computation
def forward(x):
    return x * w

# Loss function
def loss(x, y):
    y_pred = forward(x)
    return (y_pred - y) * (y_pred - y)

# Gradient function
def gradient(x, y):  # d_loss/d_w
    return 2 * x * (x * w - y)

# Before training
print("Prediction (before training)", "after studying 4 hours:", forward(4))

# Training loop
learning_rate = 0.01
for epoch in range(10):
    for x_hb_qiym, y_hb_qiym in zip(x_soat, y_baho):
        # compute the derivative,
        # update the value of w,
        # then compute the loss and print the progress
        grad = gradient(x_hb_qiym, y_hb_qiym)
        w = w - learning_rate * grad
        print("\tgrad: ", x_hb_qiym, y_hb_qiym, round(grad, 2))
        l = loss(x_hb_qiym, y_hb_qiym)
    print("progress:", epoch, "w=", round(w, 2), "loss=", round(l, 2))

# After training
print("Prediction (after training)", "after studying 4 hours:", forward(4))
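A quick way to convince yourself that the gradient formula `2 * x * (x * w - y)` is correct is to compare it against a finite-difference estimate of the derivative. The sketch below is my addition, not part of the lesson code; the helper names (`analytic_grad`, `numeric_grad`) are hypothetical:

```python
def sample_loss(w, x, y):
    # squared error for a single sample, same as the lesson's loss function
    return (x * w - y) ** 2

def analytic_grad(w, x, y):
    # d(loss)/dw = 2 * x * (x * w - y), as used in the training loop
    return 2 * x * (x * w - y)

def numeric_grad(w, x, y, eps=1e-6):
    # central finite-difference approximation of d(loss)/dw
    return (sample_loss(w + eps, x, y) - sample_loss(w - eps, x, y)) / (2 * eps)

w = 1.0
for x, y in zip([1.0, 2.0, 3.0], [2.0, 4.0, 6.0]):
    a = analytic_grad(w, x, y)
    n = numeric_grad(w, x, y)
    print(f"x={x}: analytic={a:.4f}, numeric={n:.4f}")
```

If the two columns agree to several decimal places for every sample, the hand-derived gradient is almost certainly right; this check is a standard habit before trusting any gradient-descent loop.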