optimizer: the associated optimizer; the scheduler adjusts the learning rate stored inside it
last_epoch: records the epoch count (the scheduler is stepped once per epoch)
base_lrs: records the initial learning rate(s)
scheduler_lr = optim.lr_scheduler.StepLR(optimizer, step_size=50, gamma=0.1)
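A minimal sketch tying the common attributes above to StepLR (the Linear model and the initial lr of 0.1 are assumptions): the lr is multiplied by gamma once every step_size epochs.

import torch
import torch.optim as optim

model = torch.nn.Linear(10, 2)                      # placeholder model
optimizer = optim.SGD(model.parameters(), lr=0.1)   # assumed initial lr
scheduler_lr = optim.lr_scheduler.StepLR(optimizer, step_size=50, gamma=0.1)

print(scheduler_lr.base_lrs)      # [0.1] -- one entry per param group
print(scheduler_lr.last_epoch)    # 0 after construction, +1 per step()

lrs = []
for epoch in range(150):
    lrs.append(optimizer.param_groups[0]['lr'])
    optimizer.step()              # actual training step omitted
    scheduler_lr.step()
print(lrs[0], lrs[50], lrs[100])  # 0.1, 0.01, 0.001 (up to float rounding)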
milestones = [50, 125, 160]
scheduler_lr = optim.lr_scheduler.MultiStepLR(optimizer, milestones=milestones, gamma=0.1)
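Same idea as StepLR, but the decay fires at the listed epochs instead of at a fixed interval. A quick check (optimizer setup is assumed, as before):

import torch
import torch.optim as optim

optimizer = optim.SGD(torch.nn.Linear(10, 2).parameters(), lr=0.1)  # assumed setup
milestones = [50, 125, 160]
scheduler_lr = optim.lr_scheduler.MultiStepLR(optimizer, milestones=milestones, gamma=0.1)
for epoch in range(200):
    optimizer.step()
    scheduler_lr.step()
    if epoch + 1 in milestones:
        print(epoch + 1, optimizer.param_groups[0]['lr'])   # 0.01, 0.001, 0.0001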
gamma: the base of the exponential decay; usually a value close to 1 (e.g., 0.95)
gamma = 0.95
scheduler_lr = optim.lr_scheduler.ExponentialLR(optimizer, gamma=gamma)
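Here the lr decays every epoch as base_lr * gamma**epoch. A quick check (setup assumed):

import torch
import torch.optim as optim

optimizer = optim.SGD(torch.nn.Linear(10, 2).parameters(), lr=0.1)  # assumed setup
scheduler_lr = optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.95)
for epoch in range(10):
    optimizer.step()
    scheduler_lr.step()
print(optimizer.param_groups[0]['lr'])   # 0.1 * 0.95**10 ≈ 0.0599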
T_max: the decay period of the cosine, i.e. the number of epochs for the lr to fall from its initial value to eta_min
eta_min: the lower bound on the learning rate
t_max = 50
scheduler_lr = optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=t_max, eta_min=0.)
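The schedule follows the closed form lr = eta_min + (base_lr - eta_min) * (1 + cos(pi * epoch / T_max)) / 2, so the lr reaches eta_min after T_max epochs. A sketch (setup assumed):

import torch
import torch.optim as optim

optimizer = optim.SGD(torch.nn.Linear(10, 2).parameters(), lr=0.1)  # assumed setup
scheduler_lr = optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=50, eta_min=0.)
for epoch in range(50):
    optimizer.step()
    scheduler_lr.step()
print(optimizer.param_groups[0]['lr'])   # ~0.0: annealed down to eta_min after T_max epochs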
factor: the multiplicative factor applied to the lr on each reduction
mode: 'min' or 'max'; use 'min' for a metric that should decrease (e.g., loss), 'max' for one that should increase (e.g., accuracy)
patience: how many epochs the monitored metric may fail to improve before the lr is reduced
cooldown: number of epochs to wait after a reduction before monitoring resumes
min_lr: the lower bound on the learning rate
verbose: whether to print a message when the lr is updated
scheduler_lr = optim.lr_scheduler.ReduceLROnPlateau(optimizer, factor=factor, mode=mode,
    patience=patience, cooldown=cooldown, min_lr=min_lr, verbose=verbose)
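Unlike the schedulers above, this one is stepped with the monitored metric. A sketch (setup and the constant validation loss are fabricated, so a plateau is guaranteed):

import torch
import torch.optim as optim

optimizer = optim.SGD(torch.nn.Linear(10, 2).parameters(), lr=0.1)  # assumed setup
scheduler_lr = optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode='min', factor=0.1, patience=5, cooldown=3, min_lr=1e-6)
for epoch in range(30):
    optimizer.step()
    val_loss = 1.0                    # fabricated flat loss
    scheduler_lr.step(val_loss)       # note: the metric is passed to step()
print(optimizer.param_groups[0]['lr'])  # reduced below the initial 0.1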
lr_lambda: a function, or a list of functions (one per param group), mapping the epoch index to a multiplicative factor on the base lr
scheduler_lr = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=[lambda1, lambda2])
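Passing a list like [lambda1, lambda2] requires one function per param group. A sketch (the two groups and both lambdas are made-up examples):

import torch
import torch.optim as optim

net1, net2 = torch.nn.Linear(10, 2), torch.nn.Linear(2, 1)    # two hypothetical groups
optimizer = optim.SGD([{'params': net1.parameters()},
                       {'params': net2.parameters()}], lr=0.1)
lambda1 = lambda epoch: 0.1 ** (epoch // 20)    # step-like decay for group 0
lambda2 = lambda epoch: 0.95 ** epoch           # exponential decay for group 1
scheduler_lr = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=[lambda1, lambda2])
for epoch in range(40):
    optimizer.step()
    scheduler_lr.step()
print([g['lr'] for g in optimizer.param_groups])   # [0.1 * lambda1(40), 0.1 * lambda2(40)]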
Inside the epoch loop:
scheduler_lr.step()
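Putting it together, a runnable sketch of the full loop (the model, loss, and random batch are placeholders); since PyTorch 1.1, optimizer.step() should be called before scheduler_lr.step():

import torch
import torch.optim as optim

model = torch.nn.Linear(10, 2)                       # placeholder model
criterion = torch.nn.MSELoss()
optimizer = optim.SGD(model.parameters(), lr=0.1)
scheduler_lr = optim.lr_scheduler.StepLR(optimizer, step_size=50, gamma=0.1)

for epoch in range(200):
    inputs, targets = torch.randn(8, 10), torch.randn(8, 2)   # fabricated batch
    optimizer.zero_grad()
    loss = criterion(model(inputs), targets)
    loss.backward()
    optimizer.step()       # update the weights first
    scheduler_lr.step()    # then advance the schedule, once per epoch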