Linear Regression
2022/4/2 23:23:44
This article introduces linear regression fitted by gradient descent, with a complete worked example that readers can adapt to their own projects.
```python
import numpy as np

# Fit the line y = w*x + b by gradient descent on the mean squared error.

def compute_error_for_line_given_points(b, w, points):
    """Mean squared error of the line y = w*x + b over all points."""
    totalError = 0
    for i in range(0, len(points)):
        x = points[i, 0]
        y = points[i, 1]
        totalError += (y - (w * x + b)) ** 2
    return totalError / float(len(points))

def step_gradient(b_current, w_current, points, learningRate):
    """One gradient-descent step on the MSE loss."""
    b_gradient = 0
    w_gradient = 0
    N = float(len(points))
    for i in range(0, len(points)):
        x = points[i, 0]
        y = points[i, 1]
        # Partial derivative of MSE w.r.t. b: -(2/N) * (y - (w*x + b))
        b_gradient += -(2 / N) * (y - ((w_current * x) + b_current))
        # Partial derivative of MSE w.r.t. w: -(2/N) * x * (y - (w*x + b))
        w_gradient += -(2 / N) * x * (y - ((w_current * x) + b_current))
    new_b = b_current - (learningRate * b_gradient)
    new_w = w_current - (learningRate * w_gradient)
    return [new_b, new_w]

def gradient_descent_runner(points, starting_b, starting_w, learning_rate, num_iterations):
    b = starting_b
    w = starting_w
    for i in range(num_iterations):
        b, w = step_gradient(b, w, np.array(points), learning_rate)
    return [b, w]

def run():
    points = np.genfromtxt("data.csv", delimiter=',')
    learning_rate = 0.0001
    initial_b = 0
    initial_w = 0
    num_iterations = 1000
    print("Starting gradient descent at b={0}, w={1}, error={2}"
          .format(initial_b, initial_w,
                  compute_error_for_line_given_points(initial_b, initial_w, points)))
    print('Running.....')
    [b, w] = gradient_descent_runner(points, initial_b, initial_w, learning_rate, num_iterations)
    print('After {0} iterations b={1}, w={2}, error={3}'
          .format(num_iterations, b, w,
                  compute_error_for_line_given_points(b, w, points)))

if __name__ == '__main__':
    run()
```
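Since the article's `data.csv` is not included, here is a minimal self-contained sketch of the same gradient-descent update, vectorized with NumPy instead of a per-row Python loop. The data file, the true line `y = 1.5*x + 0.5`, and the learning rate below are all assumptions chosen for illustration, not values from the article.

```python
import numpy as np

# Synthetic stand-in for the article's data.csv (not provided here):
# 100 points scattered around a hypothetical line y = 1.5*x + 0.5.
rng = np.random.default_rng(0)
xs = rng.uniform(0, 10, 100)
ys = 1.5 * xs + 0.5 + rng.normal(0, 0.5, 100)
points = np.column_stack([xs, ys])

def mse(b, w, pts):
    """Mean squared error of the line y = w*x + b over all points."""
    return float(np.mean((pts[:, 1] - (w * pts[:, 0] + b)) ** 2))

# Vectorized gradient step: each gradient is one NumPy expression
# over all points instead of an accumulating loop.
b, w = 0.0, 0.0
learning_rate = 0.001
initial_error = mse(b, w, points)
for _ in range(5000):
    residual = points[:, 1] - (w * points[:, 0] + b)
    b -= learning_rate * (-2.0 * residual.mean())
    w -= learning_rate * (-2.0 * (points[:, 0] * residual).mean())

print(f"w={w:.3f}, b={b:.3f}, mse={mse(b, w, points):.4f}")
```

With these settings the fitted slope lands close to the generating slope of 1.5; the intercept converges more slowly because its gradient is not scaled by `x`, which is one reason practitioners often normalize features before gradient descent.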
This concludes the article on Linear Regression. We hope it is helpful, and thank you for supporting 为之网!