<04> BACK-PROPAGATION & AUTOGRAD

BACKWARD COMPUTATION IN PYTORCH

NOTE: If the mathematical calculations in this lesson feel a bit difficult, there is no reason to worry. It is enough to understand the main idea and to learn thoroughly how to use the PyTorch library.

BACKPROPAGATION

As we emphasized in the video lesson, minimizing the model's error (loss) by finding the optimal value of w is one of the most fundamental and inseparable parts of DL and ML. The main goal of backpropagation is to compute, via the chain rule, the gradient backward from the loss to w so that w's value can be updated. The result of backpropagation is then used by the gradient descent algorithm to update w.
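As a small sketch of this idea (variable values chosen for illustration), for the linear model with loss = (x·w − y)², the chain rule gives dl/dw = 2x(xw − y). We can check that autograd computes exactly this:

```python
import torch

x, y = 2.0, 4.0
w = torch.tensor([1.0], requires_grad=True)

# Forward pass: PyTorch records the computational graph
loss = (x * w - y) ** 2

# Backward pass: autograd applies the chain rule from loss back to w
loss.backward()

# Manual chain rule: dl/dw = 2 * x * (x*w - y) = 2 * 2 * (2 - 4) = -8
manual_grad = 2 * x * (x * 1.0 - y)

print(w.grad.item())   # -8.0
print(manual_grad)     # -8.0
```

Both values agree, which is the whole point of autograd: it performs the same chain-rule computation automatically, for arbitrarily deep graphs.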

In previous lessons we looked at both manual and automatic differentiation; this time we work through the automatic computation entirely in PyTorch.

# Import the required libraries
import torch

x_soat = [1.0, 2.0, 3.0]
y_baho = [2.0, 4.0, 6.0]

w = torch.tensor([1.0], requires_grad=True)  # Initial guess

# Forward pass function (our model)
def forward(x):
    return x * w


# Loss function
def loss(y_pred, y_val):
    return (y_pred - y_val) ** 2          

# Before training
print("Prediction (before training)", "after 4 hours of study:", forward(4).item())

# Training loop
learning_rate = 0.01
for epoch in range(10):
    for x_hb_qiym, y_hb_qiym in zip(x_soat, y_baho):
        y_pred = forward(x_hb_qiym)  # 1) Forward pass
        l = loss(y_pred, y_hb_qiym)  # 2) Compute the loss
        l.backward()                 # 3) Backward pass
        print("\tgrad: ", x_hb_qiym, y_hb_qiym, '{:.3f}'.format(w.grad.item()))
        w.data = w.data - learning_rate * w.grad.item()  # Update the value of w

        # After updating w, reset the gradient to zero
        w.grad.data.zero_()

    print(f"Epoch: {epoch} | Loss: {l.item()}")

# After training
print("Prediction (after training)", "after 4 hours of study:", forward(4).item())

With x=2, y=4, and an initial value of w equal to 1, what does the computation look like as a computational graph?
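Tracing one node of the graph at a time with the values from the question (a minimal sketch, assuming a single training sample), the forward pass produces y_pred = x·w = 2 and loss = (2 − 4)² = 4; the backward pass then multiplies local derivatives along the graph:

```python
import torch

w = torch.tensor([1.0], requires_grad=True)
x, y = 2.0, 4.0

y_pred = x * w           # forward: y_pred = 2.0
l = (y_pred - y) ** 2    # forward: loss = (2 - 4)^2 = 4.0

l.backward()             # backward: chain rule through the graph

# dl/dy_pred = 2 * (y_pred - y) = -4
# dy_pred/dw = x                =  2
# dl/dw      = -4 * 2           = -8
print(l.item())          # 4.0
print(w.grad.item())     # -8.0
```

Each intermediate node stores only its local derivative; autograd chains them together, which is why the single call `l.backward()` is enough to fill in `w.grad`.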
